ISG Software Research Analyst Perspectives

Data is the Strategic Raw Material of Finance Departments

Written by Robert Kugel | Jun 7, 2023 10:00:00 AM

The Office of Finance can be compared to a numbers factory where the main raw material, data, is transformed into financial statements, management accounting, analyses, forecasts, budgets, regulatory filings, tax returns and all kinds of reports. Data is the strategic raw material of the finance and accounting department. It is the key ingredient in every sale and purchase as well as every transaction of any description. Quality control is essential to achieving high standards of output in any factory, and finance is no exception. To that end, a great deal of effort goes into managing the department’s processes well. However, too little attention is paid to the quality of the raw material, the data itself, and how it is handled at every stage of a process.

Since the office lockdowns forced by the pandemic of 2020, there has been widespread agreement that the finance and accounting department needs to digitally transform to ensure continuity and resiliency under any circumstances. To improve their performance and that of the entire organization, finance department executives must adopt a total quality management (TQM) approach to managing data in their department.

Just as in a factory, people in the department work from blueprints. In this case, examples include accounting standards, forecasting models and transaction forms. These guide how the parts are to be pieced together and define the metaphorical “speeds and feeds” that govern how the process is performed. The internal audit team provides quality control, and there are final inspectors, the external auditors, who certify the soundness of the most important product, the financial statements. I think the factory analogy is useful because, too often, departments operate more like artisanal workshops, ill-suited to the demands of the 21st century. The heart of the matter is this: departments do not use technology effectively. Our Office of Finance Benchmark Research finds that when it comes to using technology effectively, 49% of organizations are laggards and just 12% are innovative. The data also shows that organizations that use technology well outperform those that do not.

It should be clear by now that technology is on the cusp of transforming how the Office of Finance operates more profoundly than at any time since the dawn of the information age in the 1950s. It’s almost trite to state that technologies such as artificial intelligence (AI), machine learning (ML), large language models, data integration, in-memory computing and more advanced databases (to name a handful) are coalescing to radically redefine how work is done and what work will be done in the department. We are now at the moment when digital transformation makes a TQM approach to data management both possible and necessary.

Manufacturing was transformed in the second half of the 20th century as the principles laid down by Dr. W. Edwards Deming were widely adopted, first by Japanese manufacturers looking to use TQM to shed a reputation for shoddy workmanship, and then by the rest of the world that competed with them. TQM is built on the proven idea that quality must be designed into manufacturing processes from beginning to end, because this approach is both more efficient and produces a better final product than relying solely on inspections at the end of the process. The TQM approach also stresses the need for continuous improvement in any process.

TQM is especially relevant for the Office of Finance. Years ago, I started using the term “continuous accounting” to describe a technology-supported approach to managing finance and accounting organizations that embraces TQM principles. Today, technology enables departments to shed their quaint artisanal practices and build quality into their processes. By automating data movement and process management, technology ensures quality in financial processes by maintaining data integrity from the beginning of a process to its end. Doing so eliminates the need to check and reconcile accounting numbers, which is required when data is rekeyed manually or moved and transformed using spreadsheets. In addition, AI will increasingly be used to highlight possible errors, omissions and inconsistencies. This will substantially improve the quality of financial statements and reports. It also will reduce the time spent identifying the source of discrepancies and correcting them.
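To make the idea concrete, here is a minimal sketch, in Python, of what building quality into the process can look like: an automated check that a journal entry balances and is complete before it is posted, rather than reconciling figures after the fact. The field names and tolerance are hypothetical placeholders for illustration, not a depiction of any particular product.

```python
# Minimal sketch: validate transaction data as it flows between systems,
# instead of reconciling at the end of the process.
# Field names and the tolerance are hypothetical, not from any specific product.
from dataclasses import dataclass

@dataclass
class JournalLine:
    account: str
    debit: float = 0.0
    credit: float = 0.0

def validate_journal_entry(lines: list[JournalLine], tolerance: float = 0.005) -> list[str]:
    """Return a list of issues found before the entry is posted."""
    issues = []
    total_debits = sum(l.debit for l in lines)
    total_credits = sum(l.credit for l in lines)
    if abs(total_debits - total_credits) > tolerance:
        issues.append(f"Entry out of balance: debits {total_debits:.2f} vs credits {total_credits:.2f}")
    for l in lines:
        if l.debit and l.credit:
            issues.append(f"Line for account {l.account} has both a debit and a credit")
        if not l.account:
            issues.append("Line is missing an account code")
    return issues

# An automated feed can reject or route entries for review up front,
# rather than relying on a manual reconciliation after the close.
entry = [JournalLine("1000", debit=500.0), JournalLine("4000", credit=450.0)]
for issue in validate_journal_entry(entry):
    print(issue)
```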

One area where TQM pays off is in handling data in finance processes. I’ve been using the term “data pantry,” somewhat tongue-in-cheek, to describe a system of data management that promotes operational efficiency while ensuring that the information used in processes is immediately accessible, accurate, timely and consistent. It’s a pantry because all the data ingredients needed to perform a task or process are within easy reach and carry labels that are easily read and understood. The data pantry addresses long-standing issues that routinely sap the productivity of finance and business analysts and other users of business data. Our Analytics and Data Benchmark Research finds that 69% of organizations say preparing data is one of the most time-consuming aspects of analyzing data, and 64% say the same of reviewing data for quality and consistency. These efforts consume so much time that just 27% can spend the bulk of their time focusing on how changes are affecting the business. Ventana Research asserts that by 2026, almost all vendors of Office of Finance software will offer a data pantry to facilitate the integration of operational and external data with financials to improve the speed and accuracy of forecasts and plans.

The data pantry is different in concept and construction from other data stores, most significantly in ensuring that users find it easy to access the information they need in a way that’s unambiguous. The dataset is curated for the users, domain and use cases, making it easier to navigate and therefore readily available for analysts and others to do useful work. So, it’s a pantry and not a warehouse, where people in search of data metaphorically wander aisles stocked floor to ceiling, trying to spot what they need while being forced to read inscrutable bar codes. Nor is it a data lake, where extracting the needed information can feel like boiling the ocean. Data warehouses and data lakes are useful, but they are usually not designed for specific users and use cases. And a data pantry isn’t like older, more narrowly defined data stores such as a data mart or financial data warehouse: first, because the scope of available data expands beyond financials to include operational and external information, and second, because the data moves directly from each authoritative source to the pantry without manual intervention.
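As a rough illustration of the concept, the sketch below shows a pantry-style refresh in which each clearly labeled dataset, financial, operational and external alike, is pulled directly from its authoritative source with no manual steps in between. The source names and loader functions are hypothetical placeholders, not the design of any specific vendor’s repository.

```python
# A minimal "data pantry" sketch: each dataset is pulled directly from its
# authoritative source by a registered loader and stored under a readable label,
# so analysts never rekey or hand-transform the data.
# The sources and loaders below are hypothetical placeholders.
from datetime import datetime, timezone

def load_gl_actuals():
    # In practice, this would call the ERP's API or a governed extract.
    return [{"account": "4000", "period": "2023-05", "amount": 125_000.0}]

def load_crm_pipeline():
    # Operational data from the CRM, e.g., open opportunities by stage.
    return [{"stage": "Commit", "period": "2023-06", "amount": 90_000.0}]

def load_fx_rates():
    # External data, e.g., month-end exchange rates from a market data feed.
    return [{"pair": "EURUSD", "period": "2023-05", "rate": 1.07}]

PANTRY_SOURCES = {
    "gl_actuals": load_gl_actuals,
    "crm_pipeline": load_crm_pipeline,
    "fx_rates": load_fx_rates,
}

def refresh_pantry() -> dict:
    """Refresh every labeled dataset straight from its authoritative source."""
    pantry = {}
    for label, loader in PANTRY_SOURCES.items():
        pantry[label] = {
            "rows": loader(),
            "refreshed_at": datetime.now(timezone.utc).isoformat(),
            "source": loader.__name__,
        }
    return pantry

# An analyst reaches for a labeled ingredient instead of hunting through a warehouse.
pantry = refresh_pantry()
print(len(pantry["gl_actuals"]["rows"]), "GL rows, refreshed", pantry["gl_actuals"]["refreshed_at"])
```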

Can you buy a data pantry off the shelf (so to speak)? Well, not exactly. However, vendors are increasingly configuring their offerings to include a designed-for-purpose data repository that accomplishes the same objective, even if they don’t call it by that name. A data pantry is useful for a wide variety of purposes, especially forecasting, planning, analytics and reporting. Vendors that offer AI-enabled business software will need a data pantry-like capability built into their applications to support ML.

The finance and accounting department is a numbers factory, and data is its strategic raw material. Technology can improve and ensure the quality of data used throughout finance and accounting. In particular, the right technology can ensure accuracy, timeliness and completeness of data while enhancing the productivity of the entire department. I recommend that buyers of business software focus on how well the vendor has facilitated the availability of data for users of that application.

Regards,

Robert Kugel