February 1, 2022
An Analytics Enablement Stack: The combination of Tech, Processes, and People
The ingredients you need to make material progress.
7 min read
“We need the software to have more bells and whistles. And we need the dashboards to be more futuristic to catch the attention of our executives.” I was about three months into a 12+ month client engagement to deliver an entire suite of analytics, and this was the general feedback and guidance. This deployment of our platform would house new analytics and integrations across governmental departments such as Finance, Transportation, and Education. The use case was perfect for demonstrating the value of a single, consistent portal for analytics and collaboration across disparate data. So I took the feedback seriously. Still, I had a sinking gut feeling, informed by years of analytics work, that while good UX absolutely matters, futuristic features were an unlikely “silver bullet” must-have.
I shifted several analytics development blocks into the evenings to free up office hours for some sleuthing to validate (or challenge) the recommendation. Afterwards, I was confident that our software would do what we said it would do.
But, holding all else equal, that alone was not going to deliver the final result. The client was missing the “full stack,” technology and otherwise.
Technology is a key accelerant, but mix it with stale ingredients and the finished product will likely be mediocre. So what to do? Considering people, process, and technology as your “full stack,” here are the things I think you must have for analytics enablement to flourish anywhere. Some are less obvious than others, based on our experience.
Hub + Spoke BI Team: Centralized vs. decentralized is a perennial debate in BI/analytics. I prefer the “hub + spoke” model, which gives you flexibility between both. As with good product-building, there is a lot of heavy lifting that is executed best by a core team all moving in the same direction. At the same time, functional experts who understand specific business use cases and work “at the fringes” to advance your data enablement can change the game. Do not confuse this with “owning” more and more data analysts. As the work scales, it should get easier and more automated. And with the right technology, those on the “spokes” may be external stakeholders/partners who can scale development at far better value.
Built-In Leadership: Leaders gotta lead. Your “stack” of data enablement is not complete if leaders aren’t asking questions of the data and of their team, and publicly showing that they are making decisions with the analytics available to them. Some of this can be automated, or the “office of” the leader can do the work while said leader stays in the loop. This is not the same as the leader commenting in a quarterly email or all-hands, or giving the kickoff keynote. Make sure the leader is visibly interacting in the tools, collaborating, asking tough questions, and applauding the effort and energy when it’s present. It’s the best tactic for making positive behavior snowball.
Seamless investigative “flow”: Nothing kills the potential of good data and engaged, skilled people like a terrible, fragmented process for discovery, documentation, and action. Enabling flow enables more value, faster.
During a pregnancy visit with my wife, I watched the ultrasound technician work the machine. Her engagement and skill were perfectly complemented by the machine’s ergonomic, thoughtful setup. She moved around, zoomed in, changed settings, made comparisons, took notes, and grabbed snapshots as easily as breathing. Make data-driven decision making that easy…and fun! Users stare at a dashboard…but then what? How do they share what they’re seeing or compare it with other related facts? Can they quickly form a hypothesis and build on it without leaving their current workflow, or do they have to juggle too many different systems, or wait for the next chance to show someone what they’re finding? Can they easily pick up where they left off, or is it left to memory? Leverage technology and user feedback to make the process as easy as breathing.
An Upskill + Maintenance Routine: Data literacy is usually cited as essential to forming a data-driven culture. I think it’s too lofty the way it’s usually presented: all “go get ‘em,” no substance. If the rest of your enablement waits to “build data literacy into the culture,” it will be too late. We believe in building the plane, and making it fly higher, while flying it. Building education into analytics enablement through your platform is how you do that. People learn on the job, while they’re working with data and making decisions. They can upskill and retrain as they get things done. We like Data Coach and the team behind it, but there are options. Bake this into your analytics enablement, not adjacent to it as another LMS box to tick.
Built-in, automated feedback: Timely, verbal feedback is great. It’s also less realistic in an increasingly (and beneficially) asynchronous world. We embraced this long before it was mainstream and still believe in it. You see this pattern in many of the SaaS applications you use today, mostly optional: rate this, leave feedback on your way out, a “quick temperature check” score, etc. Borrow these concepts in your analytics enablement “stack” to learn, iterate, and solve problems for your customers.
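As a rough sketch of what that “quick temperature check” pattern can look like in practice, here is a minimal, hypothetical feedback event and aggregation in Python. The names (`FeedbackEvent`, `average_rating`, the 1–5 scale) are illustrative assumptions, not a real product’s API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class FeedbackEvent:
    """One optional, lightweight rating captured inside the analytics
    workflow -- the 'quick temperature check' pattern (hypothetical schema)."""
    dashboard_id: str
    rating: int                      # 1-5 temperature-check score
    comment: str = ""
    ts: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self):
        # Keep the signal clean: reject out-of-range scores at capture time.
        if not 1 <= self.rating <= 5:
            raise ValueError("rating must be between 1 and 5")

def average_rating(events: List[FeedbackEvent]) -> float:
    """Aggregate temperature checks so dashboard owners can spot friction early."""
    return sum(e.rating for e in events) / len(events)

# Example: two quick ratings left on a (hypothetical) budget dashboard
events = [
    FeedbackEvent("budget-overview", rating=4, comment="useful trend view"),
    FeedbackEvent("budget-overview", rating=2, comment="filters are confusing"),
]
print(average_rating(events))  # 3.0
```

The point is not the code itself but the placement: the rating is captured where the user already is, it is optional, and it rolls up automatically so the team can learn and iterate without scheduling a feedback meeting.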
Simplified (but visible) Governance: Governance has gotten too complex for most users. To your standard end-user it is incredibly boring, but necessary. Let end-users see where the data came from, and explain any material manipulation done to produce the metrics or analysis being provided. Do this in a readable, easy-to-consume way that doesn’t require separate systems, logins, etc. Provide an easy path to dig deeper with the right resources if they need to, but no one asked you for a dissertation on the first pass. I’m pleasantly surprised by how much time-to-value decreases with this simplified approach.
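To make “readable, easy-to-consume” provenance concrete, here is a minimal sketch of a plain-language provenance note attached to a metric. Everything here (`ProvenanceNote`, the field names, the example metric) is a hypothetical illustration under the assumptions above, not a reference to any governance product:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProvenanceNote:
    """A short, plain-language provenance summary attached to a metric
    (hypothetical structure): sources plus any material manipulation."""
    metric: str
    sources: List[str]
    transformations: List[str] = field(default_factory=list)

    def summary(self) -> str:
        # One readable paragraph, not a dissertation.
        parts = [f"'{self.metric}' is built from: {', '.join(self.sources)}."]
        if self.transformations:
            parts.append("Material changes: " + "; ".join(self.transformations) + ".")
        return " ".join(parts)

# Example: a note a dashboard could surface inline, no separate system or login
note = ProvenanceNote(
    metric="On-Time Project Completion %",
    sources=["ERP project ledger", "PM timesheets"],
    transformations=[
        "excluded projects cancelled before kickoff",
        "dates normalized to the fiscal calendar",
    ],
)
print(note.summary())
```

A short summary like this answers the first-pass questions (where did this come from, what was changed materially) inline, and the “dig deeper” path can link out from it for the few users who need the full lineage.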
Accessible Data: You already know this. Some data is sensitive, but much of it is not as sensitive as portrayed. And whether internal or external, people sometimes have access to data you don’t, though not all of that is sensitive either. Make it easy and fast for “safe” slices to be shared, consumed, and folded into highly shareable analytics. This is a huge unlock we are working on right now at scale.
(At least one) self-service analytics tool: There are use cases for extremely specific custom analytics. But in complex organizations, or in firms scaling a specific solution, the leading self-service analytics tools allow a sliding scale between centralized, decentralized, and outsourced BI. It’s fine to keep a neat toolbox of multiple tools as their influence and capabilities shift, but anchoring on at least one gives you staying power for your longer-cycle enablement strategy.
Consistent UI: Per the above, I like self-service analytics tools for flexibility. Let people choose how and where they build, but don’t let that affect the consistency of consumption for end-users. More technical personas tend to experiment and change their build process often (a good thing). Produce a more consistent user experience through a “surface layer” where decision makers and end-users do and share their best thinking. It separates these very different user personas, their problems, their objectives, and the rate of change forced upon them.
The opinions expressed in this blog are those of the individual authors and do not represent the opinions of BRG or its other employees and affiliates. The information provided in this blog is not intended to and does not render legal, accounting, tax, or other professional advice or services, and no client relationship is established with BRG by making any information available in this publication, or from you transmitting an email or other message to us. None of the information contained herein should be used as a substitute for consultation with competent advisors.