Informatica’s Intelligent Data Platform is built in three layers. The bottom layer is Informatica Vibe, the virtual data machine that I covered at its launch last year. Informatica Vibe won our Ventana Research 2013 Technology Innovation Award for information optimization. It virtualizes information management technology to operate on any platform, whether on-premises or in any form of cloud computing.
Above Informatica Vibe in the platform is a data infrastructure layer, which contains all the technologies that act upon data, from integration through archiving, masking, mastering, quality assurance, security, streaming and other tasks. At the core of this second layer is Informatica PowerCenter, which provides data integration and other capabilities central to processing data into information. PowerCenter provides parsing, profiling, joining and filtering, and it is also integral to data services through Informatica’s Data Integration Hub, which operates on a publish-and-subscribe model. The latest PowerCenter release, version 9.6, focuses on agility in development and offers a series of packaged editions with differing levels of functionality; users choose among them to fit their requirements. This developer support includes advances in test data management and data masking for enterprise-class needs. There are editions for Informatica Data Quality, too. The latest release of Informatica MDM, 9.7, improves the user experience for data stewards and enhances performance and governance. Not much was mentioned at the conference about Informatica’s Product Information Management (PIM) offering, which our most recent Value Index vendor and product assessment rated Hot.
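To illustrate the publish-and-subscribe pattern that a data hub of this kind relies on, here is a minimal sketch in Python. It is purely illustrative: the DataHub class, topic names and records are hypothetical and imply nothing about the actual Data Integration Hub or PowerCenter APIs.

```python
# Minimal illustration of a publish-and-subscribe data hub pattern.
# All names here are hypothetical; this is not Informatica's API.
from collections import defaultdict
from typing import Callable, Dict, List

class DataHub:
    """A toy hub: publishers push records to topics; subscribers receive them."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, record: dict) -> None:
        # Each subscriber consumes the record independently of the publisher.
        for handler in self._subscribers[topic]:
            handler(record)

hub = DataHub()
hub.subscribe("customers", lambda r: print("warehouse load:", r))
hub.subscribe("customers", lambda r: print("quality check:", r))
hub.publish("customers", {"id": 42, "name": "Acme Corp"})
```

The point of the pattern is that publishers and subscribers never reference each other directly, so new consumers of a data set can be added without changing the systems that produce it.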
The third layer is data intelligence. Here Informatica has added capabilities to organize, infer and recommend action from data and to provision and map data to business needs. In addition, Informatica’s Business Glossary and Metadata Manager help establish consistent definitions and use of data for operational or analytical tasks. Informatica RulePoint, a product that also was not mentioned much at the conference, processes events through workflow in a continuous, rule-based manner; depending on how processing occurs, it supports either complex event processing or event streaming.
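To make the idea of continuous, rule-based event processing concrete, the following sketch evaluates a small set of rules against each event as it arrives. The rule conditions, actions and event fields are invented for illustration and do not reflect RulePoint’s actual rule language.

```python
# Toy continuous rule evaluation over an event stream.
# Rule conditions, actions and event fields are hypothetical examples.
from typing import Callable, Iterable, List, Tuple

Rule = Tuple[Callable[[dict], bool], Callable[[dict], None]]

def process_stream(events: Iterable[dict], rules: List[Rule]) -> None:
    """Evaluate every rule against every event as it arrives."""
    for event in events:
        for condition, action in rules:
            if condition(event):
                action(event)

rules: List[Rule] = [
    (lambda e: e.get("type") == "trade" and e.get("amount", 0) > 1_000_000,
     lambda e: print("ALERT: large trade", e)),
    (lambda e: e.get("type") == "login_failure",
     lambda e: print("notify security:", e)),
]

stream = [
    {"type": "trade", "amount": 2_500_000, "symbol": "INFA"},
    {"type": "login_failure", "user": "jdoe"},
]
process_stream(stream, rules)
```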
On top of the Intelligent Data Platform, Informatica has added a couple of new innovations. Project Springbok, which is being developed in its Innovation division and is not yet released, is a tool for preparing data for analytics and operations. This new product will use Informatica’s expertise in providing access to and integration of data sources, which according to our information optimization benchmark research is the top analyst requirement in 39 percent of organizations. Despite data warehouse efforts, analysts and business users still have to access many data sources. Simplifying information is critical for nearly all organizations that have more than 16 data sources. Demonstrations showed that Springbok can dynamically create and automate the transformations that run in PowerCenter. It also offers access to a master reference to ensure that data is processed in a consistent manner. IT professionals gain visibility into what business units are doing, which shows how they can help in provisioning data. Even in beta release, Springbok has significant potential to address the range of data issues analysts face and reduce the time they spend on data-related tasks. Our research has shown for several years that this data challenge presses organizations to diversify the tools they use, and software vendors in this market have responded. Informatica will have to compete with more than a dozen others and demonstrate its superiority for integration. Our research finds that lines of business and IT now share responsibility for information availability in 42 percent of organizations. Informatica will have to demonstrate its value to line-of-business analysts who are evaluating a new generation of tools for data and analytics.
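The kind of preparation step described above can be pictured as standardizing incoming records against a master reference so that downstream processing sees consistent values. The sketch below is a deliberately simple illustration; the field names and the master mapping are invented and say nothing about how Springbok itself is implemented.

```python
# Toy data-preparation step: standardize records against a master reference.
# The master mapping and record fields are invented for illustration.
master_reference = {
    "intl business machines": "IBM",
    "i.b.m.": "IBM",
    "informatica corp": "Informatica",
}

def standardize(records, reference):
    """Normalize a name field using the master reference, flagging unknowns."""
    prepared = []
    for record in records:
        key = record["company"].strip().lower()
        record["company"] = reference.get(key, record["company"])
        record["matched"] = key in reference
        prepared.append(record)
    return prepared

raw = [{"company": "I.B.M.", "revenue": 100},
       {"company": "Informatica Corp", "revenue": 50},
       {"company": "Acme", "revenue": 10}]
print(standardize(raw, master_reference))
```

Unmatched records are flagged rather than silently passed through, which is the sort of visibility that lets IT see where the business needs help provisioning data.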
A second innovation is a new data security product called Secure@Source, also being developed in the Innovation unit, which is designed to protect data assets where they are stored and processed. This product moves Informatica into the information security market segment. Secure@Source helps users discover, detect, assess and protect data assets in their persistent locations and during consumption by applications or Internet services. The question is whether Informatica can convince current customers to examine it or will have to approach information security professionals who are not users of Informatica. Security of data is among the top five required data activities according to our research and a key part of the manageability requirements that organizations find important when considering products. Informatica has an opportunity to insert itself into the dialogue in this area if it properly presents the new product to IT and business people alike.
I believe that one of the highest-potential opportunities for Informatica is in the application architectures of organizations whose business processes have been distributed across a collection of cloud-based applications that lack interconnectivity and integration. For example, finance departments often have software from different providers for budgeting and planning, consolidation and reporting, accounting and payroll management. When these applications are spread across the cloud, connecting them is a real challenge, let alone trying to get information from sales force automation and customer service applications. The implications of this are shown in our finance analytics research: data-related tasks consume the most time and impede the efficiency of financial processes, as they do in all other lines of business that we have researched. Similar situations exist in customer-related areas (marketing, sales and customer service) and employee management processes (recruiting, onboarding, performance, compensation and learning). Informatica has made progress with Informatica Cloud Extend for interconnecting tasks across applications, which can help streamline processes. While perhaps not obvious to data integration specialists, this level of process automation and integration is essential to the future of cloud computing. Informatica also announced it will offer master data management in the cloud; this should help it not just place a data hub in the cloud but also help companies interoperate separate cloud applications more efficiently.
Overall the Informatica Intelligent Data Platform is a good reference model for tasks related to turning data into information assets. But it could be much more distinct in how its automation accelerates the processing of data and helps specific roles work faster and smarter. The platform also does not provide a context for enterprise architectures that are stretched between on-premises and various cloud deployments. Organizations will have to determine whether Informatica’s approach fits their future data and architectural needs. As Informatica pushes its platform approach, it has to ensure it is seen as a leader in big data integration, in helping business analysts with data, in supporting a larger number of application sources and in connecting cloud computing by unifying business applications. This won’t be easy to accomplish, as Informatica has not been especially progressive in its broader approach to big data and its use across operations and analytics.
Regards,
Mark Smith
CEO & Chief Research Officer