
Will generative AI replace data analysts?

Generative AI isn't going to replace data analysts. It can help analysts be more effective, but GenAI lacks the human insight and contextual knowledge needed to do the job properly.

Generative AI will not replace data analyst jobs. Artificial intelligence can't replace people in many fields, especially ones that require human empathy and insight.

AI can process vast amounts of data and provide quantitative analysis. It can't understand the subtleties of human behavior, cultural nuances or the complexities of human motivations and desires the way human analysts can.

Data analysis might seem like a technical role, but the job is nuanced; it involves more than crunching numbers. Successful data analysis requires an understanding of the human elements behind data, whether it's analyzing customer behavior or detecting fraudulent activity. A human analyst's ability to empathize with and understand the motivations, fears, ambitions and interests of others can lead to compelling insights that go beyond what is immediately apparent in the raw data. Gaining insights requires an element of human judgment and understanding that artificial intelligence currently lacks.

Generative AI tools such as ChatGPT and Gemini can generate text to a human-like standard, and they have the potential to automate some of the tasks that data analysts currently perform. But generative AI also has limitations: It can't understand the full context of data. Data analysts still have to interpret the results from generative AI and make decisions based on the data.

Limitations of generative AI

Generative AI cannot perform the nuanced work of analysts. Data analysis requires the synthesis of visual, numerical and tacit knowledge, which cannot be conveyed through text alone. The text a generative AI model can produce is limited by its training data: GenAI can't analyze raw data or produce original visualizations, and any insights it offers come from language patterns in that training data.

Another concern with generative AI models is accuracy. Without human oversight, AI-generated text can contain logical gaps, biased perspectives and factual errors inherited from the training data. Accuracy depends on the quality and diversity of that data; biased or inaccurate training data produces biased or inaccurate outputs.

Artificial intelligence models struggle to keep up with the real world: Training or retraining a model takes a great deal of computing power, time and money. As the world changes, the AI model falls behind until it's retrained. Depending on the last time a model was retrained, it could be several months behind on data, resulting in a potentially significant knowledge gap.

Generative AI models also lack the critical thinking skills and insight to question the validity or relevance of their source material, which is an essential skill for a data analyst. A core component of data literacy is checking the quality of data and identifying potential biases.

As a result of its limitations, GenAI is not a substitute for human analysts. Instead, the models are a tool to help analysts generate text, identify patterns and explore data. With human oversight, generative AI models can be an asset. Without human engagement, they mostly churn out repetitive, formulaic summaries of their existing knowledge.

How data analysts can use artificial intelligence

Forecasting future events is always risky, but it's safe to say that AI's limitations won't be solved for some time. Still, human data scientists and analysts can use AI as a valuable assistant in their work right now.


Generative AI can suggest code to extract, clean and analyze data, which helps automate some repetitive tasks. It lacks the deep understanding of context, business goals and interdependencies required to design complex, scalable and maintainable code architectures. But AI can help an analyst who needs to work across multiple languages or unfamiliar architectures by generating candidate code for quick review.
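For instance, here is a minimal sketch of the kind of cleanup code a GenAI assistant might propose in Python. The file name, column names and rules are hypothetical assumptions for illustration only; an analyst would still review and adapt every line.

import pandas as pd

# Extract: load an assumed raw transaction file
df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Clean: drop duplicates and rows missing key fields
df = df.drop_duplicates()
df = df.dropna(subset=["customer_id", "order_total"])

# Standardize a text column and discard obviously invalid totals
df["region"] = df["region"].str.strip().str.title()
df = df[df["order_total"] > 0]

# Analyze: a quick monthly revenue summary for review
monthly = df.groupby(df["order_date"].dt.to_period("M"))["order_total"].sum()
print(monthly)

Even a snippet this small needs human review: Only the analyst can decide whether dropping incomplete rows or zero-value orders is appropriate for the business question at hand.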

Given the right information, AI can also propose data structures, such as the tables of an analytic schema like a star or snowflake. Although AI can identify patterns within data and suggest tables, the task of defining efficient and effective data structures still needs human intervention. AI often struggles to "get it right" the first time because it does not have the same understanding of the data as a human analyst. Human analysts often are wrong on the first iteration too, but they start with a richer understanding of the problem. Describing every necessary detail to the AI might be more work than it is worth, but the human analyst can apply what they already know and spot other potential use cases.
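As an illustration, the following sketch shows the kind of star schema an AI assistant might propose for retail sales data, expressed here with Python's built-in sqlite3 module. The table names, columns and grain are assumptions a human analyst would need to validate against the real workload.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables describe the who, what and when
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    product_name TEXT,
    category TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,  -- e.g., 20240131
    calendar_date TEXT,
    month INTEGER,
    year INTEGER
);

-- Fact table at the grain of one order line
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    product_key INTEGER REFERENCES dim_product (product_key),
    date_key INTEGER REFERENCES dim_date (date_key),
    quantity INTEGER,
    sales_amount REAL
);
""")
conn.close()

Whether this grain, these keys or the choice of a star over a snowflake actually fits the organization's data and queries is exactly the judgment call the analyst still has to make.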

One interesting use of AI is to recommend analytical methods. Analysts must still validate that a suggested method suits the problem and account for business needs, data constraints and possibly even budget constraints for compute and storage.

For example, suppose an AI system is analyzing customer purchase data to increase sales. It sifts through massive data sets and identifies a pattern: Customers who buy a laptop often also buy a wireless mouse. Consequently, the AI suggests that bundling the two products in a promotional offer could increase sales.
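The pattern itself is easy to verify. Below is a minimal co-occurrence check in Python that mirrors what the AI has spotted; the product names and tiny sample data set are purely hypothetical.

import pandas as pd

# Hypothetical order lines
orders = pd.DataFrame({
    "order_id": [1, 1, 2, 3, 3, 4],
    "product": ["laptop", "wireless mouse", "laptop",
                "laptop", "wireless mouse", "keyboard"],
})

# Orders containing a laptop, and those that also contain a wireless mouse
laptop_orders = set(orders.loc[orders["product"] == "laptop", "order_id"])
mouse_orders = set(orders.loc[orders["product"] == "wireless mouse", "order_id"])

attach_rate = len(laptop_orders & mouse_orders) / len(laptop_orders)
print(f"Mouse attach rate on laptop orders: {attach_rate:.0%}")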

The human data analyst -- using specific business knowledge and experience -- can complement the AI-generated insight. They know that the laptop has a high profit margin and the mouse has a low one. A bundle could increase sales but might dent overall profit. So they might suggest a tweak to the AI's strategy: Instead of a bundle, offer the mouse at a discount, perhaps with a coupon, only after the customer has bought a laptop. The proposal maintains the laptop's profitability, and overall sales might still increase because of the perceived deal. The human analyst can also provide context about supply chain constraints, seasonal trends or upcoming marketing campaigns the AI might not be aware of. With this new insight, analysts can prompt the AI again to see whether it returns more, or similar, recommendations.
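A rough margin check shows why the analyst's tweak matters. The sketch below compares the two promotions; every price, margin, discount and attach rate is a made-up assumption for illustration, and real figures would come from the business.

# All numbers below are hypothetical assumptions
laptop_price, laptop_margin = 1000.0, 0.25   # assumed high-margin product
mouse_price, mouse_margin = 40.0, 0.10       # assumed low-margin product

def profit_per_laptop_sale(laptop_discount, mouse_discount, mouse_attach_rate):
    """Expected profit from one laptop sale plus any attached mouse sale."""
    laptop_profit = laptop_price * (laptop_margin - laptop_discount)
    mouse_profit = mouse_price * (mouse_margin - mouse_discount)
    return laptop_profit + mouse_attach_rate * mouse_profit

# Option A: bundle discount applies to both items, eating into the laptop margin
bundle = profit_per_laptop_sale(laptop_discount=0.05, mouse_discount=0.05,
                                mouse_attach_rate=0.80)

# Option B: coupon discounts only the mouse, after the laptop is bought
coupon = profit_per_laptop_sale(laptop_discount=0.0, mouse_discount=0.20,
                                mouse_attach_rate=0.70)

print(f"Bundle profit per laptop sold: {bundle:.2f}")
print(f"Coupon profit per laptop sold: {coupon:.2f}")

With these made-up numbers, the post-purchase coupon protects the laptop's margin while the bundle gives part of it away, which is the kind of trade-off the AI has no basis to weigh on its own.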

Will AI replace data analysts?

AI can enhance, rather than replace, the role of data analysts. For example, using AI to automate routine tasks enables analysts to dedicate more time to strategic work. But AI is not accountable for its own errors: Responsibility and blame still rest with humans.

Human judgment -- coupled with a healthy dose of skepticism and business acumen -- continues to be an indispensable asset that AI cannot replace. Smart analysts use AI as a tool to augment their abilities rather than perceive it as a threat to their roles.

Today, AI can automate repetitive tasks, provide insights into large data sets, help draft initial reports, write code snippets and propose potential routes for analysis. As AI advances, the industry can look forward to more sophisticated assistance in data analysis, such as suggesting potential data sources, generating effective test data, or driving operational and tactical decisions.

Even if generative AI reduces the number of analysts an organization requires, the key role of the human analyst remains. Their knowledge of the specific context, ability to apply critical thinking and deep understanding of human needs remain valuable, even as the technology advances. The human analyst's role is not obsolete; it is more essential than ever in ensuring their organization harnesses generative AI's potential effectively and responsibly.

Donald Farmer is a data strategist with 30+ years of experience, including as a product team leader at Microsoft and Qlik. He advises global clients on data, analytics, AI and innovation strategy, with expertise spanning from tech giants to startups. He lives in an experimental woodland home near Seattle.
