10 Reasons Why Analytics & Data Science Projects Fail

Only 20% of the data science and analytics models that get built actually get implemented.

Notwithstanding how hot Analytics and Data Science are right now, the reality is that these projects face as many challenges as any other high-tech project (see The Chaos Report from The Standish Group, which tracks IT project failures to meet scope, schedule, quality, and budget objectives; the numbers are not favorable). Sometimes Analytics and Data Science projects fail for reasons similar to IT projects, and sometimes they fail for very different, more domain-specific reasons.

Based on my decades of experience as a practitioner, program manager, and executive leader of analytics organizations, here are my top 10 reasons why analytics and data science projects fail along with my advice on how to avoid these detrimental missteps.

#1 Insufficient, Incorrect, or Conflicting Data

Having multiple (or zero) versions of the truth is one of the most common problems that organizations face. It is not that organizations don’t have data, but rather that they don’t properly marshal data into an environment where it can be analyzed and modeled, for instance a Data Warehouse or Data Lake. Due to a lack of data governance, data quality and integrity issues often inhibit analytics project success.

#2 Failure to Understand the Real Business Problem

Failure to understand the real business problem is usually due to a lack of, or poor, communication between data scientists and business stakeholders. Failure on the part of the data scientist to really understand the problem, decision, question, or opportunity at hand will often result in the right model and the right answer to the wrong issue.

#3 Misapplication of the Analytics | Data Science Model

This is usually due to insufficient skills resulting from a lack of proper education, training, expertise, and experience in applying mostly predictive and prescriptive analytics, but also diagnostic analytics. For example, I witnessed a Marketing person apply A|B Testing to predict revenue without an experimental design that was statistically valid and sufficiently “blocked” for several confounding factors.
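To make the “blocking” point concrete, here is a minimal sketch (with simulated data and a hypothetical customer-segment confounder, neither from the original example) contrasting a naive pooled A|B comparison with one that blocks on the confounder by testing the lift within each segment separately:

```python
# Hypothetical sketch: blocked vs. pooled A|B revenue comparison.
# "segment" is an assumed confounder; all data here is simulated.
import random
from statistics import mean, stdev

random.seed(42)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

# Simulated revenue per user, grouped by a confounding segment:
# returning customers spend far more than new ones in either arm.
blocks = {
    "new_customers": {
        "A": [random.gauss(20, 5) for _ in range(200)],
        "B": [random.gauss(22, 5) for _ in range(200)],
    },
    "returning_customers": {
        "A": [random.gauss(50, 8) for _ in range(200)],
        "B": [random.gauss(53, 8) for _ in range(200)],
    },
}

# Naive pooled comparison mixes the segments together, so the
# between-segment variance swamps the treatment effect...
pooled_a = blocks["new_customers"]["A"] + blocks["returning_customers"]["A"]
pooled_b = blocks["new_customers"]["B"] + blocks["returning_customers"]["B"]
print(f"pooled t = {welch_t(pooled_b, pooled_a):.2f}")

# ...while a blocked analysis measures the lift within each segment.
for segment, arms in blocks.items():
    print(f"{segment}: t = {welch_t(arms['B'], arms['A']):.2f}")
```

The blocked tests isolate the variant effect from the segment effect; the pooled test cannot tell them apart, which is exactly the trap described above.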

#4 Solving a Problem No One Cares About

Let the business drive analytics projects, not the other way around. Businesses have plenty of critically important problems to solve, decisions to make, and opportunities to analyze and evaluate, and they need our help. If you are applying methods and technology to a problem that no one in the business domain cares about, or you’re measuring by or aiming at the wrong KPI, or even worse no KPI, then you are simply wasting the company’s time and resources.

#5 Poor Communication | Business Interpretation of Results

Communication in business is difficult enough before you start layering on mathematics, statistics, computer science, and AI/ML. At all times, avoid talking over the head of your business partner, especially if they aren’t technical. Every conversation should be tied to business-relevant topics, like modeling assumptions, data interpretation, model validation, or interpreting model results in the form of KPIs in business terminology. Endeavor to be oriented toward economic results and driven by business outcomes.

#6 Change is Disruptive & Not Handled Well

Analytics is disruptive, especially when it is done right. When the “data speaks” and you start streamlining, automating, and optimizing archaic, inefficient business processes, or start uncovering evidence that long-held beliefs and operating assumptions are flat out wrong, you can bet that not everyone is going to be ecstatic. Quite the opposite. They may be resistant, and even belligerent, in their veiled, or even blatant, attempts to protect their turf (e.g., headcount and budget) and defend the status quo.

Projects get killed because someone with sufficient clout in the leadership hierarchy doesn’t want the project to succeed. Analytics moves cheese and uncovers inconvenient truths.

#7 Unrealistic Expectations

It is as easy for business folks as it is for Analytics and Data Science folks to get caught up in the euphoria of a project. Everyone wants a successful outcome, and everyone wants to make a BIG impact. Sometimes the expectations for the business value impact are beyond what is realistically achievable.

My advice is always to be conservative, start small, remain focused, and don’t set the bar too high. If you promise a 1-5% KPI improvement and deliver 10%, then you are a hero. If you do the reverse, then your credibility suffers immeasurably.

#8 Poor Project Management

Scope, timing, budget, and quality are critical components of any project. Failure to meet one or more of these measures is why the majority of IT projects are challenged or fail outright. Why do we think Analytics | Data Science projects are any different? Both involve “writing code,” and our projects are experimental by nature, involving trial and error. Our projects are not exempt from these four pressures.

If you use Agile in model development (Kanban works well in my experience), then keep sprints to 2 weeks (time-box) and get to MVP (Minimum Viable Product, something that works and shows a result) ASAP. This helps to keep everyone focused and on track.

#9 Excessive Focus on the Model, Technique, or Technology

This is when “the model goes in search of the problem.” If you are experimenting with methods and technologies that you think are “really cool,” and in search of a problem to apply them to, then you are simply wasting the company’s time and resources. See #4.

Don’t be the person with some newfangled ML tool, ensemble model, or technology in search of a problem that fits or a business team that cares. Trust me, they don’t care and never will—they are far too busy trying to solve the myriad problems that they do care about. Start there instead.

#10 Lack of Empathy

At the end of the day, we are all on the same team—business people, data engineers, data scientists, IT people. We are all striving for the same goals and outcomes. There is a great deal of interdependence among all of the various skill groups required for success. No one of us is as smart as all of us. Don’t let the healthy tension of analytical experimentation lead to unhealthy conflict between constituencies.

Along with credibility comes humility, and like most people who have worked diligently for decades to successfully apply Analytics or Data Science in Digital Transformation, I have personally made most of these mistakes and experienced the fallout. Fortunately, having learned my lessons the hard way, I am usually able to more clearly see and avoid these missteps.

Hopefully, you can learn from my experience. Do better. Keep going.

This post originally appeared on LinkedIn.