ISG Software Research Analyst Perspectives

Analytic Ops: The Last Mile of Data Ops

Written by David Menninger | Nov 24, 2021 11:00:00 AM

Organizations have become more agile and responsive in part by making their information technology more agile. Adopting a DevOps approach to application deployment has allowed organizations to deploy new and revised applications more quickly. DataOps is enabling organizations to be more agile in their data processes. As organizations embrace artificial intelligence (AI) and machine learning (ML), they are recognizing the need to adopt MLOps. The same desire for agility suggests that organizations need to adopt AnalyticOps.

Our research shows the most common complaint organizations report about their analytics and business intelligence (BI) technology is that it is not flexible or adaptable to change. How can an organization’s business processes be agile if its analytics approach is not flexible and adaptable? It would mean the organization is flying blind until its analytics catch up with the changes in its business processes. AnalyticOps is about anticipating change and establishing processes to deal with those changes, in an automated way wherever possible.

Part of the problem is the disjointed approach to DataOps and AnalyticOps. Many organizations and vendors are talking about DataOps and attempting to address the requirements of a DataOps approach. Almost all AI/ML vendors are beginning to enhance their portfolios with MLOps capabilities. However, very few organizations or vendors are addressing the requirements for AnalyticOps. In much the same way that data governance and analytics governance need to come together, DataOps and AnalyticOps need to come together. They really shouldn’t be separate at all.

Extending DataOps to include AnalyticOps requires a number of processes, many of which can be supported by technology. The basic approach is to identify the elements of analytics that may change, anticipate those changes, and build processes to address them, starting with the move from development to test to production. Ideally, organizations should be able to continuously integrate and deploy analytics even as changes are being made. Unfortunately, this process is not supported very well in many of the products we evaluated in our Analytics and Data Value Index.
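As an illustration only, here is a minimal sketch of what a promotion gate in such a dev-to-test-to-prod pipeline might look like. It assumes a hypothetical dashboard definition stored as JSON and a hypothetical list of fields available in the target environment; none of these names come from any particular product.

```python
import json
from pathlib import Path

def load_definition(path: str) -> dict:
    """Load an analytics artifact (e.g., a dashboard definition) from a JSON file."""
    return json.loads(Path(path).read_text())

def validate(definition: dict, available_fields: set[str]) -> list[str]:
    """Return a list of problems that should block promotion to the next stage."""
    problems = []
    for viz in definition.get("visualizations", []):
        missing = set(viz.get("fields", [])) - available_fields
        if missing:
            problems.append(f"{viz['name']}: references missing fields {sorted(missing)}")
    return problems

def promote(path: str, available_fields: set[str], stage: str) -> bool:
    """Gate one step of a dev -> test -> prod promotion; True means it may proceed."""
    problems = validate(load_definition(path), available_fields)
    if problems:
        print(f"Promotion to {stage} blocked:")
        for problem in problems:
            print(" -", problem)
        return False
    print(f"Promotion to {stage} approved.")
    return True
```

Run automatically on every change, a gate like this is what lets analytics artifacts be continuously integrated and deployed rather than promoted by hand.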

Automated and repeatable data pipelines are critical to analytics processes just as they are to DataOps. Some of these pipelines may be the responsibility of analytics teams rather than data engineering teams because the pipelines are further refining or preparing data for analyses. As sources and targets change, the pipelines need to be modified to accommodate the changes. If the targets change, it may be a result of the analytical data model changing. If that is the case, there may be downstream impacts of those changes on various metrics and visualizations. These impacts need to be identified and resolved appropriately. Just as in DataOps, some of these revisions can be processed automatically. For instance, if metrics are dropped from the data model, they can be automatically dropped from visualizations to prevent failure. There can even be some clever rearranging of the screens where these metrics appeared, but it’s likely that some manual intervention will be needed to tweak the layout. In this case, technology can be used to identify the changes and the objects impacted, and to track whether the revisions have been addressed.
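To make the idea concrete, here is a minimal sketch, assuming dashboards and the data model are represented as simple Python dictionaries (hypothetical structures, not any vendor’s actual format), of how dropped metrics could be detected, removed from visualizations automatically, and flagged for manual layout review.

```python
def reconcile_dashboards(dashboards: list[dict], model_metrics: set[str]) -> list[str]:
    """Remove metrics no longer in the data model from each visualization and
    return a worklist of visualizations whose layout needs manual review."""
    needs_review = []
    for dashboard in dashboards:
        for viz in dashboard["visualizations"]:
            dropped = [m for m in viz["metrics"] if m not in model_metrics]
            if dropped:
                # Automatic revision: strip the stale metrics so the visualization doesn't fail.
                viz["metrics"] = [m for m in viz["metrics"] if m in model_metrics]
                # Manual step: layout tweaks still need a human, so track them.
                needs_review.append(f'{dashboard["name"]}/{viz["name"]}: dropped {dropped}')
    return needs_review

# Example usage with hypothetical objects
dashboards = [{"name": "Sales", "visualizations": [
    {"name": "Revenue trend", "metrics": ["revenue", "discount_rate"]}]}]
print(reconcile_dashboards(dashboards, model_metrics={"revenue"}))
```

The returned worklist is the piece that tracks whether the remaining manual revisions have been addressed.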

Automated analyses and insights using AI/ML are another way to support AnalyticOps. Many vendors have started providing this type of analysis, which looks at the data set as a whole, or a particular subset, and identifies the correlations in the data that have the biggest impact on key outcomes. Assuming the automated analyses are designed well, changes to data sources and data models should have little or no impact. These analyses are inherently flexible since their elements are not predefined.
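A simplified sketch of this kind of automated analysis, using pandas to rank every numeric column by the strength of its correlation with a chosen outcome column; the column names here are purely illustrative.

```python
import pandas as pd

def top_drivers(df: pd.DataFrame, outcome: str, n: int = 5) -> pd.Series:
    """Rank numeric columns by absolute correlation with the outcome column.
    Because nothing is predefined, the analysis adapts as columns change."""
    numeric = df.select_dtypes("number").drop(columns=[outcome])
    correlations = numeric.corrwith(df[outcome]).abs().sort_values(ascending=False)
    return correlations.head(n)

# Hypothetical example data
df = pd.DataFrame({
    "churn_rate": [0.1, 0.3, 0.2, 0.5],
    "support_tickets": [2, 9, 5, 14],
    "tenure_months": [36, 6, 18, 3],
})
print(top_drivers(df, outcome="churn_rate"))
```

Because the function simply inspects whatever numeric columns exist, adding or removing fields in the underlying data model does not break it; it just changes what gets ranked.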

Collaboration around decision-making processes is another way to support AnalyticOps. Facilitating discussion of changes through a collaborative environment will ensure that changes are communicated effectively and that the appropriate resources are involved in the revisions. The collaborative environment should also track the governance of these changes, informing approvers that changes have been made that require their attention and recording their approval, rejection, or delegation.
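A minimal sketch of how that governance trail might be represented, assuming a simple in-memory record of each change and each approver’s decision; the classes and example values are entirely hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum

class Decision(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    REJECTED = "rejected"
    DELEGATED = "delegated"

@dataclass
class ChangeRequest:
    """Tracks a proposed analytics change and each approver's decision."""
    description: str
    decisions: dict[str, Decision] = field(default_factory=dict)

    def notify(self, approver: str) -> None:
        # In practice this would post to the collaborative environment;
        # here it simply records that a decision is awaited.
        self.decisions[approver] = Decision.PENDING

    def record(self, approver: str, decision: Decision) -> None:
        self.decisions[approver] = decision

change = ChangeRequest("Rename 'bookings' metric to 'gross bookings'")
change.notify("data_steward")
change.record("data_steward", Decision.APPROVED)
print(change.decisions)
```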

We expect that by 2024, one-third of organizations will adopt an analytic operations approach similar to, and integrated with, their data operations processes to enhance responsiveness and agility. However, don’t anticipate that all aspects of AnalyticOps can be automated. Where automated revisions are not possible, automated notifications can be generated indicating that intervention is needed. The more automation that is put in place, and the more thorough an organization’s AnalyticOps approach is, the more agile and responsive it can be.

Regards,

David Menninger