ISG Software Research Analyst Perspectives

Ahana Offers Managed-Services Approach to Simplify Presto Adoption

Written by Matt Aslett | Jun 29, 2022 10:00:00 AM

I previously described the concept of hydroanalytic data platforms, which combine the structured data processing and analytics acceleration capabilities associated with data warehousing with the low-cost and multi-structured data storage advantages of the data lake. One of the key enablers of this approach is interactive SQL query engine functionality, which facilitates the use of existing business intelligence (BI) and data science tools to analyze data in data lakes. Interactive SQL query engines have been in use for several years — many of the capabilities were initially used to accelerate analytics on Hadoop — but have evolved along with data lake initiatives to enable analysis of data in cloud object storage. The open source Presto project is one of the most prominent interactive SQL query engines and has been adopted by some of the largest digital-native organizations. Presto managed-services provider Ahana is on a mission to bring the advantages of Presto to the masses.
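To make the interactive SQL query engine concept concrete, the following is a minimal sketch of how an analyst or a BI/data science tool might issue a standard SQL query against files in a data lake through a Presto coordinator, using the open-source presto-python-client (prestodb) DB-API driver. The host name, catalog, schema and table names are illustrative assumptions, not details of any particular Ahana deployment.

```python
# Minimal sketch: interactive SQL over data-lake files via a Presto coordinator,
# using the presto-python-client (prestodb) DB-API driver.
# Host, catalog, schema and table names below are illustrative assumptions.
import prestodb

conn = prestodb.dbapi.connect(
    host="presto-coordinator.example.com",  # assumed coordinator endpoint
    port=8080,
    user="analyst",
    catalog="hive",      # Hive connector exposing files in cloud object storage
    schema="weblogs",    # assumed schema mapped to a bucket/prefix
)

cur = conn.cursor()
# Standard ANSI SQL over Parquet/ORC files in the lake, with no data movement
# into a separate warehouse.
cur.execute("""
    SELECT event_date, COUNT(*) AS page_views
    FROM page_events
    WHERE event_date >= DATE '2022-06-01'
    GROUP BY event_date
    ORDER BY event_date
""")
for row in cur.fetchall():
    print(row)
```

Because the driver follows the Python DB-API, the same pattern applies whether the query is issued from a notebook, a reporting tool or an application service.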

Ahana was founded to build a commercial business around Presto, with a particular focus on making it accessible to small and midsize organizations rather than the industry giants and digital natives that drove its early adoption. Presto was originally created at Facebook in 2012 and was made available under the open-source Apache license the following year. In 2019, Facebook teamed up with Uber, Twitter and Alibaba to found the Presto Foundation under the auspices of the Linux Foundation. Ahana became a member of the Presto Foundation the following year when it announced its launch, with seed funding from GV and Leslie Ventures, and introduced technical support for Presto deployments. The company rapidly turned its attention to the development of a Presto cloud-managed service, fueled by $20 million in Series A funding from Third Point Ventures, GV, Leslie Ventures and Lux Capital, as well as a more recent $7.2 million strategic investment from Liberty Global Ventures. The resulting Ahana Cloud became generally available in December 2020. Customers include ad tech company Carbon, e-commerce marketplace provider Carton, security information and event management vendor Securonix, and parking application provider Metropolis.

Adoption of Presto is not without its challenges, however, including configuration and management complexity. Ahana’s managed-cloud approach is designed to facilitate adoption and administration of Presto for organizations that lack the expertise to configure and manage Presto themselves, or the resources to hire a team of Presto experts. The cloud service is also well-aligned with the increased use of cloud object stores as a data storage layer for analytics initiatives. More than one-half (53%) of participants in Ventana Research’s Analytics and Data Benchmark Research are currently using object stores in their analytics efforts, and an additional 18% plan to do so within the next two years. SQL-based processing is key to generating business value from that data, and organizations are well positioned to take advantage of it: almost two-thirds (61%) of participants in the same research said their organization has SQL skills.

Ahana Cloud comprises two key components: the Ahana Compute Plane and the Ahana SaaS Console. The Ahana Compute Plane runs in a customer’s virtual private cloud and includes Presto running in containers on Amazon Elastic Kubernetes Service (EKS), as well as the Hive Metastore and Apache Superset for BI and reporting. Integration with third-party BI and data science tools, including Tableau, Qlik, Preset, Looker, Jupyter and Apache Zeppelin, is also supported. While Presto can be run on any cloud or on-premises infrastructure, Ahana Cloud is currently available only on Amazon Web Services (AWS); the company is not planning to address on-premises use cases but is working on support for other cloud providers. The Ahana Compute Plane is managed through the Ahana SaaS Console, which runs in Ahana’s virtual private cloud and enables users to create, deploy, resize and manage Presto clusters as well as connect to external data catalogs or database services. Ahana also offers Presto Query Analyzer by Ahana, a free tool that provides reports on Presto cluster workload metrics and query performance.

The primary use cases for Ahana include reporting and dashboards, federated query and SQL-based data science, and it is also increasingly used to drive customer-facing applications as well as SQL-based data transformations and data lake analytics. Ahana Cloud Community Edition is free to use, without support, but is limited to a cluster of five instances, with authentication and caching capabilities disabled. For the full-function Enterprise Edition, Ahana utilizes a pay-as-you-go model available via AWS Marketplace, with billing based on the concept of Ahana Cloud Credits at a very low price per Ahana Cloud Credit per hour. The approach is designed to appeal to small and midsize businesses without the skills or resources to manage Presto themselves. Ahana is targeting existing data warehouse users looking to augment or replace their data warehouse deployments, as well as companies that have not yet adopted data warehousing and could embrace the data lake as their primary analytic data platform.
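The federated query use case is worth illustrating. Presto can address multiple configured catalogs in a single statement, so lake-resident data and data in an operational database can be joined without first copying either into a warehouse. The sketch below assumes a cluster configured with a Hive catalog (files in S3) and a MySQL catalog; the endpoint, catalog, schema and table names are illustrative assumptions rather than Ahana-specific configuration.

```python
# Minimal sketch of a Presto federated query, assuming a cluster configured
# with a Hive catalog (data lake files in S3) and a MySQL catalog (operational
# database). Endpoint, catalog, schema and table names are assumptions.
import prestodb

conn = prestodb.dbapi.connect(
    host="presto-coordinator.example.com",  # assumed cluster endpoint
    port=8080,
    user="analyst",
    catalog="hive",
    schema="default",
)

cur = conn.cursor()
# Join lake-resident session data with customer records held in MySQL,
# addressed by fully qualified catalog.schema.table names in one query.
cur.execute("""
    SELECT c.customer_segment, COUNT(*) AS sessions
    FROM hive.weblogs.sessions AS s
    JOIN mysql.crm.customers AS c
      ON s.customer_id = c.customer_id
    GROUP BY c.customer_segment
""")
print(cur.fetchall())
```

The same mechanism underpins SQL-based data transformations, since the results of such queries can be written back to the lake with statements such as CREATE TABLE AS SELECT.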

The addition of interactive SQL query engine functionality to a data lake is one of the first steps toward creating a data lakehouse architecture to enable hydroanalytics. I assert that by 2024, more than three-quarters of current data lake adopters will be investing in data lakehouse technologies to improve the business value generated from the accumulated data. Organizations looking to generate greater value from their data lake investments should consider the importance of interactive SQL query engine functionality in general, and Presto in particular, while examining the potential advantages of Ahana’s managed-services approach.

Regards,

Matt Aslett