Real ROI is complicated: musing on where to spend your next analytics dollar (part 2/3)

[Image: the second Google Image search result for "cold water", from parenting.firstcry.com]

See part 1 here: Value of new effort should be defined as the delta compared to the value of your current solution

Value is 100% dependent on what people do after you deploy a solution

As much as we think about designing and building solutions -- what problem we are solving and how expertly we use the data and technology we have to address it -- ending our thinking there puts the full ROI of the effort at risk.

The process to create an analytics solution typically looks something like this:

  • Input: problem/business process to improve identified
  • [many steps to design, build, and test solution]
  • Output: deployed solution

Once the solution is deployed, everybody exhales and feels good about the work they put into it. That is the end of the project for most, but no value has been realized yet. In fact, you are in the red due to the cost of building the solution. You have created potential value, and your plan for what to do next determines how much of that potential value you reap. As somebody who spends much of his time designing and building analytic solutions, this was a sobering insight.

Now I think about value like this:

    value of analytic solution ($) = potential value of analytic solution ($) * adoption of solution (%)
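
To make the formula concrete, here is a minimal worked example in Python; every number is hypothetical.

    # Value formula from above; all figures are made-up illustrations.
    potential_value = 500_000  # estimated annual benefit if every target user adopted ($)
    adoption = 0.30            # share of target users who actually change behavior

    realized_value = potential_value * adoption
    print(f"Realized value: ${realized_value:,.0f}")  # Realized value: $150,000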

Because people (users) must behave differently than they would without the solution, we must extend the solution process to include evangelizing, educating, listening, and monitoring.

  • What do the users know about the solution?
  • Where do they find it, how do they use it?
  • How should it change their behavior?
  • Where is the data coming from, how fresh will it be, how will they know it is trustworthy?
  • What if they have questions or feedback?
  • How will you know people are using it? (see the sketch after this list)
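
The last question is the most mechanical one: it usually means instrumenting the solution and deciding what counts as "using it." Below is a minimal sketch in Python, assuming usage can be exported as (user, day) events; the log structure and the two-session threshold are assumptions for illustration, not a prescription.

    from collections import defaultdict
    from datetime import date

    # Hypothetical usage log: one (user, day) row per session in the solution.
    usage_log = [
        ("ana", date(2021, 3, 1)),
        ("ana", date(2021, 3, 8)),
        ("bob", date(2021, 3, 2)),
    ]
    target_users = {"ana", "bob", "carol", "dee"}  # everyone who *should* adopt

    sessions_per_user = defaultdict(int)
    for user, _day in usage_log:
        sessions_per_user[user] += 1

    # Count a user as adopted after 2+ sessions (the threshold is an assumption).
    adopted = {u for u, n in sessions_per_user.items() if n >= 2}
    adoption_rate = len(adopted) / len(target_users)
    print(f"Adoption: {adoption_rate:.0%}")  # Adoption: 25%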

Adoption is behavior change, which is by nature non-technical, but it can be assisted by technology, which leads to cross-functional teams. When technical resources discuss user adoption, a common fallacy is that the people in the room can build a mousetrap so good that the adoption problem goes away on its own. And it's hard for the business alone to drive adoption without the automation and data that technical resources provide (e.g., analyzing usage, pushing content to users, answering questions and feedback about the solution), to say nothing of establishing user trust in the data being produced.

Walking from cubicle to cubicle is romantic but not scalable for enterprise solutions, or in a world where most people are working remotely. I'm not an expert on wide-scale implementation of enablement strategy, but I'm fortunate at Axis to work with a thought leader in this area, Jerry DiMaso.

Some nuance to the value formula

I find it helpful to separate potential value and adoption in the formula above because they divide naturally along tasks, timelines, and responsibilities. However, you can influence adoption while you build the solution by creating something that is both effective (it addresses the business challenge and works in conjunction with the processes and actions users might take) and efficient (responsive, intuitive, easy to find what you need in little time). So even though you are still 100% dependent on what actions people take to realize a benefit, design and technical expertise both stack the value deck in your favor. It's not either/or.

Part 3 will be: Generating options for where to spend your next analytic dollar
