In my last rant, on business analytics and the pathetic state of dashboards, I pointed out significant flaws in the business intelligence software created by technology providers and in how it is deployed by business and IT. Now I want to follow up with some insight into the disconnects surrounding a critical asset that is essential to the success of business analytics: key performance indicators (KPIs), a term used in inaccurate ways that have diminished the value of the concept for business.
Let’s start with the definition; for practicality I will use Wikipedia, which says that a KPI is used “by an organization to evaluate its success or the success of a particular activity in which it is engaged.” The success being evaluated could be a goal, a target or something else that is important. To set a baseline, you calculate two measures and combine them to create a performance metric. For example, units sold and unit price are two separate measures that can be multiplied to produce a metric called sales. This metric can then be refined and compared against the sales quota or goal for a specific time period; that comparison creates a key performance indicator for the outcomes of sales and even marketing efforts.
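To make the measure-to-metric-to-KPI progression concrete, here is a minimal sketch in Python; the function names and numbers are my own illustration, not taken from any product or standard:

```python
# Illustrative only: two base measures combine into a metric, and
# comparing that metric against a goal for a period yields a KPI.

def sales_metric(units_sold: float, unit_price: float) -> float:
    """Metric: derived by combining two base measures."""
    return units_sold * unit_price

def kpi_vs_quota(metric_value: float, quota: float) -> float:
    """KPI: the metric expressed against the goal for the period."""
    return metric_value / quota

sales = sales_metric(units_sold=1200, unit_price=25.0)  # metric: 30000.0
attainment = kpi_vs_quota(sales, quota=40000.0)         # KPI: 0.75 of quota
```

The distinction the sketch tries to preserve is the one made above: `sales` by itself is only a metric; it becomes a key performance indicator only once it is expressed against an expectation such as a quota.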
In actual use, however, the KPI has been dumbed down in ways that diminish the quality of intelligence we gain from business analytics. The first problem is the vague and contradictory ways in which the term is applied by technology providers and practitioners. Over the last decade I have seen “KPI” used to describe what are actually metrics – the building blocks of KPIs – which are only sometimes performance-related. A metric like revenue or sales is not a KPI; neither are cost-specific metrics, throughput-related metrics based on quantity or processing, nor customer-related metrics like first-call resolution. Yet such metrics are commonly presented in dashboards through visualization and called KPIs. Today we seldom see scorecards that use business analytics, which once were the common way to present KPIs properly to business users. Maybe it is time to start using scorecards for managing performance and not just measuring it.
The second issue has to do with the performance part of KPI, which should show how an organization or any of its business processes measures up to expected outcomes. Ideally, within seconds of viewing performance-related metrics or indicators, an individual should be able to determine what action, if any, should be taken to improve performance, such as discovering what is contributing to the subpar performance or identifying opportunities for improvement. Getting to this root-cause level of action requires examining different classes of metrics related to performance, which can range from people and processes to customers and risk. Understanding the cause and effect among metrics requires knowing and presenting the processes and interconnections of how a business operates. Unfortunately, most business analytics software will just provide a table of data with no insight into which metric is contributing to the issue. By creating the right types of metrics underlying a KPI, we can reduce the time and resources spent on the communications (email, phone calls and meetings) that people normally use to investigate performance shortfalls. Getting to this point requires creating a library of measures, metrics and indicators that can cross a variety of situations and help inform action-taking and decision-making. Let’s drop the P and just say key indicators (KIs) to set a new context that focuses on the indicators and the types of metrics that support them. This could lead organizations to make substantive improvements.
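One way the library idea above could be sketched is a structure that links each key indicator to the classed metrics that drive it, so a shortfall can be traced to contributing metrics without a round of emails and meetings. Every name and number below is hypothetical, chosen only to illustrate the shape:

```python
# Hypothetical sketch of a metric library: a key indicator carries its
# underlying metrics (grouped by class such as people, process, customer
# or risk) so subpar performance points directly at candidate causes.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    category: str   # e.g. "people", "process", "customer", "risk"
    value: float

@dataclass
class KeyIndicator:
    name: str
    target: float
    actual: float
    drivers: list = field(default_factory=list)  # underlying Metric objects

    def shortfall_drivers(self) -> list:
        """Metrics worth examining first when the indicator misses target,
        lowest-scoring driver first; empty when the target is met."""
        if self.actual >= self.target:
            return []
        return sorted(self.drivers, key=lambda m: m.value)

ki = KeyIndicator(
    name="first-contact resolution",
    target=0.85,
    actual=0.78,
    drivers=[
        Metric("agent staffing level", "people", 0.72),
        Metric("knowledge-base coverage", "process", 0.64),
    ],
)
# ki.shortfall_drivers() surfaces "knowledge-base coverage" first.
```

The point of the sketch is the linkage, not the scoring: once the metrics underlying an indicator are cataloged with their classes, the investigation of a shortfall starts from the indicator itself instead of from a blank table of data.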
The third step is to make KPIs or KIs relevant to the particular roles and responsibilities of individuals. Company or divisional KPIs are interesting but provide only a general view of how an organization is performing. Where the rubber hits the road is the context of the indicators and metrics at the department, team and individual levels. We need to give individuals the ability to select their own focus within the scope of these facts and figures to determine how well their activities are contributing to the execution of business processes and outcomes. Here the role of business analytics is critical. To turn the buzzwords being pushed by IT analyst firms, self-service BI and agile BI, into reality, tools have to make analytics more intuitive for users. More tools for data discovery are not the answer, and making users select their scope every time they get an updated report or dashboard is a waste of time that decreases productivity and increases the cost of running an organization. Instead, let’s design a new generation of business analytics based on roles and individuals, developed through a profile; this could go a long way toward streamlining the focus of analysis and preparing individuals to quickly determine what action to take.
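The profile idea could be as simple as mapping indicators to the roles they serve, so an individual's view is scoped once rather than on every refreshed report. The roles and indicator names below are invented for illustration:

```python
# Hypothetical role-based profile: each indicator declares the roles it
# is relevant to, and a profile filters the shared set down accordingly.
INDICATORS = {
    "quota attainment": {"sales"},
    "campaign conversion": {"marketing"},
    "first-contact resolution": {"support", "operations"},
    "forecast accuracy": {"sales", "finance"},
}

def indicators_for(role: str) -> list:
    """Return only the indicators scoped to a role's profile."""
    return sorted(name for name, roles in INDICATORS.items() if role in roles)

# indicators_for("sales") yields "forecast accuracy" and "quota attainment".
```

A real deployment would of course layer individual responsibilities, teams and permissions on top of roles, but even this crude filter illustrates how a profile removes the repeated manual scoping described above.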
To erase the stupidity in how KPIs are spoken about, demonstrated and actually deployed, we need to advance our dialogue and education about which key indicators and which range of metrics are required to support particular deployments. I have already said that simply placing more charts in a dashboard, no matter how pretty and interactive they might be, will not support the actions and decisions that business analytics should enable. The effort to make KPIs more valuable begins with ensuring they are properly developed and represent performance in terms of the state of progress toward achieving the goal or target. Showing past performance is insufficient without knowing how well it met expectations. Presenting a KPI does not necessarily require a chart; it can be done equally well in text that places the indicator in the context of how the person or process is performing over time and where it stands in progress toward the expected target. These indications can be linked to additional facts with a directional arrow or other simple representations that make it easy to determine whether to take action. If your business intelligence software does not support a simpler way to communicate key indicators and metrics, maybe you have the wrong tool.
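The text-plus-arrow presentation described above can be sketched in a few lines; the exact wording and format here are my own illustration, not a prescription:

```python
# Minimal sketch: state an indicator against its target and show the
# direction of change since the prior period -- no chart required.
def render_indicator(name: str, actual: float, target: float, prior: float) -> str:
    if actual > prior:
        arrow = "\u2191"   # improving since last period
    elif actual < prior:
        arrow = "\u2193"   # declining since last period
    else:
        arrow = "\u2192"   # unchanged
    pct_of_target = actual / target * 100
    return f"{name}: {pct_of_target:.0f}% of target {arrow}"

line = render_indicator("Quarterly sales", actual=30000, target=40000, prior=28000)
# e.g. "Quarterly sales: 75% of target" with an up arrow
```

A single line like this answers both questions a KPI should answer, where we stand against the expectation and which way we are heading, in the seconds of attention a reader actually gives it.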
If we admit the flaws within our deployments and technologies and force ourselves to have more realistic conversations, we could advance the science of business analytics. Over the years we have made strides forward and then taken steps backward in trying to meet the needs of the lowest competency denominator. We need to aim higher and take steps to find out what should be done to produce full value from business analytics. Increasing the value of these investments can help an organization increase its efficiency and effectiveness. If you are not sure whether you are heading in the wrong direction with your metrics and indicators, just let me know; that is what I and others at our firm do for a living.
Regards,
Mark Smith - CEO & Chief Research Officer
Mark Smith is the Partner, Head of Software Research at ISG, leading the global market agenda as a subject matter expert in digital business and enterprise software. Mark is a digital technology enthusiast using market research and insights to educate and inspire enterprises, software and service providers.
Ventana Research’s Analyst Perspectives are fact-based analysis and guidance on business, industry and technology vendor trends.
Each is prepared and reviewed in accordance with Ventana Research’s strict standards for accuracy and objectivity to ensure it delivers reliable and actionable insights. It is reviewed and edited by research management and approved by the Chief Research Officer; no individual or organization outside of Ventana Research reviews any Analyst Perspective before it is published. If you have any issues with an Analyst Perspective, please email ChiefResearchOfficer@isg-research.net