
Six Reasons You’re Not Ready for Analytics, Part 2

In my last post, I began laying out reasons companies waste money by not understanding the fundamentals of how analytics should be implemented in an enterprise. I have three more to cover today, and they get a little more tactical.

Reason #4

You haven’t articulated your business questions and issues well enough

One of my favorite Clients is a veritable fire hose of questions he’d like his company’s data science teams to be able to answer. His passion for driving the business forward with data is inspirational, but his example is rare. Much more common is that enterprises know they need to get more value from their data but haven’t hit upon the right framework to do so. As a result, Client analytics teams (if they exist at all) are very often relegated to simply reporting data rather than analyzing it.

In a nutshell, most businesses haven’t articulated their business questions and issues clearly enough (or at all) for an analytics or data science team to begin answering them. Without a clear “data mission statement”, you’re likely to leave a lot of value on the table as analytics teams work on reporting, or at best, produce findings that aren’t aligned with the truly valuable business questions.

Reason #5

You’re not setting expectations clearly with your (analytics) customers

This reason is articulated from the point of view of an analytics team lead. Many times, we’ve seen Clients assume that an analytics or data science project will proceed as other projects (especially software development projects) do, with well-defined phase gates, an ability to schedule a set of deliverables up front, and relatively firm estimates from developers. The nature of an analytics or data science project, however, is different.

One of my colleagues wrote a Trexin Insight Paper called Agile Practices for Data Analytics. In it, he explains: “More than application development, data analytics is an exploration, or testing of hypotheses, with few requirements determined up front. Rather, the course of a data analytics project must respond quickly to stakeholder feedback from discoveries or findings in the data.”

For analytics efforts to be perceived as “successful”, it’s important to define for your customers what that means. Some key expectations to set up front:

  • The process starts with an EDA Phase. At the beginning of an analytics project using new data, there is a phase called “Exploratory Data Analysis” that’s critical to all subsequent phases. In this phase, the team explores the data to form a detailed picture of each table and each field in the data set, and writes code to prepare it for analysis (a minimal sketch of what this can look like follows below). From the outside, it looks like nothing is happening, which can be frustrating for people who expect “insights” to flow out of the project from the start.
  • The process is iterative. All scientific explorations are non-linear, including data science and analytics projects. An iterative process, as detailed in the TIP mentioned above, will help keep the exploration on track.
  • There’s no guarantee of insights. You must clearly communicate the difference between findings and insights. While it’s normal and expected that software development will yield working software, there’s no guarantee that a data science or analytics project will produce meaningful insights that can be used to answer a business question or test a hypothesis. As the eminent statistician John Tukey said,

“The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data.”
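To make the EDA phase a little more concrete, here is a minimal sketch of the kind of profiling and preparation work that happens before any “insights” appear. It assumes a Python/pandas workflow and uses a hypothetical file, claims.csv, with made-up column names; your data set and tooling will differ.

```python
# Minimal EDA sketch (illustrative only): profile a new data set before analysis.
# Assumes a pandas workflow and a hypothetical file, claims.csv; column names are made up.
import pandas as pd

df = pd.read_csv("claims.csv")

# Shape and schema: how many rows, which fields, what types?
print(df.shape)
print(df.dtypes)

# Completeness: what fraction of each field is missing?
print(df.isna().mean().sort_values(ascending=False))

# Distributions: summary statistics for numeric fields, cardinality for categoricals.
print(df.describe())
print(df.select_dtypes(include="object").nunique())

# Simple preparation steps: parse dates and drop exact duplicate records.
df["service_date"] = pd.to_datetime(df["service_date"], errors="coerce")
df = df.drop_duplicates()
```

None of this produces an answer to a business question yet; it builds the understanding of the data that every later finding depends on, which is exactly why this phase can look like “nothing is happening” from the outside.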

What to do about all of this? Rather than promising insights after two weeks (or ever), you can promise a timeboxed deliverable that lays out two weeks’ worth of analysis and findings every two weeks. An (often visual) summary of progress expressed in business terms rather than statistical terms can help others understand that progress is being made. Reviews by business stakeholders every two weeks ensure that the analytics and business teams are aligned and pursuing the right questions.

Reason #6

You’re not going the last mile

For analytics to provide value, you of course need “actionable analytics”, but much more importantly, you need a plan to tie those analytics to the actions your results suggest. This often involves both the business and analytics groups.

Another favorite Client, a state insurance plan, had clearly articulated a burning business question: what was driving their costs up? To address this question, we first developed an analytical framework. The framework was certainly valuable by itself, and many analytics vendors would stop there, but the real magic, and tremendous cost savings, happened when our clinical analytics specialists, who had both business analytics and significant clinical backgrounds, worked directly with the Client’s medical management team.

The combined “last mile” team was responsible for creating and executing targeted care plans for outreach to and intervention on behalf of their members by applying their clinical expertise to the analytics findings. The result was dramatic: millions of dollars in savings for the company, and healthier members. A true “win-win”.

This success story is, unfortunately, not typical. The handoff of analytical results from the analytics team to whichever group will act on them, whether that means taking specific actions like those in the story above or making leadership decisions to guide the business, is critical to producing value from analytics. If the handoff isn’t made well, all of the hard analytics work and potentially business-changing results can simply be lost in the shuffle of day-to-day operations. These cases remind me of the ending of the first Indiana Jones movie: an incredibly valuable treasure is crated up and stored in a vast warehouse, apparently destined to lie in deep storage forever.

Conclusion

There are, of course, many more reasons enterprises don’t get all the value they could out of their analytics programs: politics, human psychology, shifting leadership priorities, budget constraints, miscommunication, lack of analytics staff, and many others. I chose to write just about the reasons above because they’re the most common ones we’ve seen in our years of working with a variety of enterprises on their analytics programs. Trexin would be happy to help you think through all the ways to set yourself up for success, and execute that strategy, rather than fall down one of the expensive rabbit holes I’ve covered in these two posts.

My next post, in two weeks or so, will be a recap of the O’Reilly Artificial Intelligence Conference, which I’m attending next week.

Until then, please let me know what you think in the comments.

