Starting an analytics project is hard at the best of times, but even harder if it's your first time. Having to understand the technology, the language and machinery of analytics can feel, well, overwhelming.
Or perhaps, like many before you, you have tried and failed to apply an analytics process to your business. Maybe you didn't select the right levers, or lost stakeholder confidence through an inability to measure impact correctly. Alternatively, maybe there was too much focus on the technology, or the analytics algorithms were overcomplicated?
This post aims to support product owners, strategy and transformation leads, and stakeholders by introducing four key steps in the analytical use case development process. Since there are plenty of technical articles on data and analytics processing, I have focused on the how, not the what.
Analytics is becoming key to many businesses' survival as markets become ever more challenging. Companies like Google, Apple and Amazon place analytics at the core of all decision making, driving product investment decisions, improved customer experiences, operational efficiencies and more. Many other industries are following their direction to great success.
“Forty-seven percent say that data and analytics have significantly or fundamentally changed the nature of competition in their industries in the past three years.”
Pivoting an organisation to be analytically driven can be a significant challenge when the organisation’s data and analytics capabilities are based upon legacy skills and processes. Furthermore, issues such as a lack of data accessibility, quality and integrity, and of modern tools and processing environments, hinder the ability to leverage data assets through modern analytics processing.
Fixing these types of issues typically requires a lengthy diagnostic analysis, strategy definition and solution roadmap before any tangible environment changes take place. It is therefore imperative to work within the environment constraints and introduce new methods and tools incrementally as you develop impact across the organisation.
The following four steps are, in my view, fundamental to becoming an analytics-driven organisation.
Step 1. Ideate and Define
The first step is to bring key stakeholders, subject matter experts and end users together to brainstorm ideas.
Gaining stakeholders' mindshare early is critical when developing a first set of use cases. The key is to involve them: learn about their critical needs, explore innovative solutions together, and establish how they could contribute to the development process. Consider hosting a series of “lunch and learn” workshops to canvass ideas across stakeholders while, in return, sharing knowledge and experience of applicable new analytics methods and processes. It is important to note that these are blue-sky thinking sessions, held regardless of practical or existing environment constraints; the ideation process should be collaborative and open to all constructive views.
Choosing the right use case to develop first is key to gaining organisational support and momentum. Mitigate this risk by selecting a use case against three strict criteria:
Innovative enough to spark stakeholder engagement
Highly feasible to be a success
Clear recognisable business impact
Falling into the trap of creating something so innovative that it is hard to explain, or something so simple that it's not novel, will put stakeholder engagement and the overall programme objective at risk. Consider applying the Objectives and Key Results (OKR) framework to each use case, clearly describing an inspiring objective along with a top-down, tangible set of measurable key results.
“Then come the four OKR “superpowers”: focus, align, track, and stretch.”
You will want to show demonstrable business impact, i.e. how you have moved the dial, so defining measurable KPIs is critical. When defining KPIs, use quantitative measurements that reflect the current situation and try to use leading rather than lagging indicators. See A Guide to OKRs for more information on how best to define OKRs.
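As a concrete illustration, an objective with baseline and target values for each key result can be captured in a small structure, which also makes "moving the dial" directly computable. The use case, names and numbers below are entirely hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    """A measurable key result with a pre-deployment baseline and a target."""
    name: str
    baseline: float   # value measured before the use case is deployed
    target: float     # value committed to in the OKR
    current: float    # latest measured value

    def progress(self) -> float:
        """Fraction of the baseline-to-target gap closed so far."""
        gap = self.target - self.baseline
        return 0.0 if gap == 0 else (self.current - self.baseline) / gap

@dataclass
class Objective:
    statement: str
    key_results: list = field(default_factory=list)

# Hypothetical example: a churn-reduction use case
obj = Objective(
    statement="Reduce customer churn through targeted retention offers",
    key_results=[
        KeyResult("Monthly churn rate (%)", baseline=5.0, target=3.0, current=4.0),
        KeyResult("Offer acceptance rate (%)", baseline=10.0, target=25.0, current=13.0),
    ],
)

for kr in obj.key_results:
    print(f"{kr.name}: {kr.progress():.0%} of target gap closed")
```

Expressing each key result against its baseline keeps the measurement quantitative and makes regressions as visible as progress.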
Step 2. Prioritise and Develop
Once you have your set of impact-based use cases, the next stage is to prioritise, select the most feasible use case and start development.
Prioritising use cases at the beginning of a transformation should be based upon high implementation feasibility and clear, tangible business impact. This builds delivery credibility with stakeholders while establishing development processes and giving an early view of capability gaps. To assess use case feasibility, consider the following areas:
Technical skill availability (Data engineering, data science and IT)
Data availability, quality and timeliness
Analytic modelling is understandable and computable
Simple deployment process
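One lightweight way to compare candidates against these areas is a simple weighted score. The weights, candidate names and ratings below are purely illustrative, not a prescribed methodology:

```python
# Score each candidate use case 1-5 against the feasibility areas above.
WEIGHTS = {
    "skills": 0.3,       # technical skill availability
    "data": 0.3,         # data availability, quality and timeliness
    "modelling": 0.2,    # model is understandable and computable
    "deployment": 0.2,   # simplicity of the deployment process
}

def feasibility_score(ratings: dict) -> float:
    """Weighted average of 1-5 ratings; higher means more feasible."""
    return sum(WEIGHTS[area] * ratings[area] for area in WEIGHTS)

candidates = {
    "churn prediction": {"skills": 4, "data": 5, "modelling": 4, "deployment": 3},
    "dynamic pricing":  {"skills": 3, "data": 2, "modelling": 2, "deployment": 2},
}

ranked = sorted(candidates, key=lambda c: feasibility_score(candidates[c]), reverse=True)
print(ranked[0])  # the most feasible candidate to develop first
```

Even a rough score like this makes the prioritisation conversation explicit: stakeholders can argue over the ratings and weights rather than over gut feel.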
Assuming you are following an agile methodology, define with the development team a set of high-level user stories that are realistic and achievable. User stories can be refined over time; without prior experience, it is a common pitfall to over-promise what might be practically achievable. Initially, direction of travel and execution matter more than getting user stories perfect.
“Execution is so critical. Sometimes you just need to try something, see what works and move forward.”
Meenal Balar, Former Director of Emerging Market Growth at Facebook
As you progress, be transparent and engage stakeholders whenever possible. Consider using planning, standup or retrospective meetings to share progress and raise issues. Involving stakeholders early in key problem-solving sessions and leveraging their ability to unblock issues will not only keep the development cadence steady but will, as a side effect, encourage the buy-in that leads them to become your biggest sponsors.
However, there are times when issues cannot be easily fixed in the current environment for various technical or non-technical reasons. It is therefore critical that you recognise this early and take decisive action by parking the use case effort and progressing to the next use case. This not only protects the overall objective but also informs the selection of the next best use case to develop.
Step 3. Measure and Improve
Once the use case is deployed your next key step is to validate impact and improve the operating model.
A successful deployment is one that delivers recognisable business value. Validating impact depends upon baselining the OKR impact KPIs before deployment; this is critical when demonstrating how your developments have positively impacted the business. The timing of impact reporting is context dependent, varying with the sensitivity of the KPI measured and how the impact is recognised by the finance office and/or the business. Predefined reporting dashboards are becoming increasingly standard as businesses look to become more agile.
Example business KPIs
Revenue growth and margin protection
Net Promoter Score
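Validating impact against a pre-deployment baseline can be sketched in a few lines. The figures below are invented for illustration; in practice the measurement window and baseline should be agreed with finance and business stakeholders:

```python
def relative_lift(baseline: float, post_deployment: float) -> float:
    """Percentage change of a KPI against its pre-deployment baseline."""
    return (post_deployment - baseline) / baseline

# Hypothetical KPI readings before and after deployment
monthly_revenue = {"baseline": 1_200_000.0, "post_deployment": 1_260_000.0}
nps = {"baseline": 32.0, "post_deployment": 38.0}

print(f"Revenue lift: {relative_lift(**monthly_revenue):+.1%}")
print(f"NPS change:   {nps['post_deployment'] - nps['baseline']:+.0f} points")
```

Note that ratio-style KPIs (revenue) suit relative lift, while score-style KPIs (NPS) are usually reported as absolute point changes.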
Improving capabilities over time is also key to developing an organisation’s analytics muscle. Throughout the development process, establish the ability to continuously evaluate, measure and improve capabilities. Ambitiously driving the momentum of skills and knowledge improvements will support the wider objective of taking on riskier and more challenging analytical problems. However, be pragmatic about what you measure; there is no need to measure everything to the nth degree. Just ensure that what you measure is consistent over time and applicable to the use case.
Some key areas to consider include:
Analytical model viability (i.e. the impact of model drift with respect to predictions, attributes etc.)
Data availability, integrity and quality
Data engineering and analytical skills
Lifecycle management from pilot to production
“Every line is the perfect length if you don't measure it.”
Step 4. Manage and Iterate
Ensuring an analytics model delivers value over time is a hot topic that deserves a post of its own describing the monitoring and processes involved. In short, mathematical models require updating as the environment changes, or may simply need to be retired if the market changes direction.
One specific area to pay attention to is the tendency of analytical models to drift over time, which may reduce the desired impact or, even worse, invert the benefit. Monitoring model drift with respect to forecasts, predictions, attributes and environment changes is an area of focus for many companies that rely on models. Understanding model limitations, and having processes in place to alert, undeploy, or remodel and redeploy, is crucial to delivering continuous business impact. Failing to adopt this mindset could unwind all the hard work to date. For more details, this IBM post is a good starting point.
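As a sketch of what such monitoring can look like, the Population Stability Index (PSI) is one common way to quantify how far a model's score or input distribution has drifted from its training-time baseline. The thresholds, bin count and sample data here are illustrative, not the method any particular vendor prescribes:

```python
import math

def psi(expected: list, actual: list, bins: int = 10) -> float:
    """Population Stability Index between a baseline sample and a recent one.
    Illustrative rule of thumb: < 0.1 stable, 0.1-0.25 watch, > 0.25 drifted."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def distribution(sample):
        counts = [0] * bins
        for x in sample:
            i = min(max(int((x - lo) / width), 0), bins - 1)
            counts[i] += 1
        # small floor avoids log(0) for empty bins
        return [max(c / len(sample), 1e-6) for c in counts]

    e, a = distribution(expected), distribution(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline_scores = [i / 100 for i in range(100)]                   # training-time scores
shifted_scores = [min(i / 100 + 0.3, 0.99) for i in range(100)]   # recent, drifted scores

if psi(baseline_scores, shifted_scores) > 0.25:
    print("Drift detected: consider retraining or undeploying the model")
```

Running a check like this on a schedule, and wiring the threshold breach into an alert, is one simple way to realise the alert/undeploy/redeploy loop described above.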
Taking the first steps to becoming an analytics driven organisation can be hard but it doesn’t need to be. By defining clear objectives, taking calculated risks, model monitoring, and applying an incremental capability model, success can be delivered sustainably.
This post is the first in a series discussing how to build a portfolio of analytical use cases while incrementally developing long-lasting capabilities. To finish, I'd like to leave you with five key principles when starting out:
Identifying the right use cases and what is needed is important
Start with a small tangible use case to gain stakeholder engagement
Fail fast, learn and move on
Measure capability as well as objective results
Achieve long-lasting business impact by sustaining model performance