
Databricks launches Lakebase Postgres database for AI era
Databricks has launched Lakebase, a fully managed Postgres database designed specifically for artificial intelligence (AI) applications, and made it available in Public Preview.
Lakebase integrates an operational database layer into Databricks' Data Intelligence Platform, with the goal of enabling developers and enterprises to build data applications and AI agents more efficiently on a single multi-cloud environment.
Purpose-built for AI workloads
Operational databases, commonly known as Online Transaction Processing (OLTP) systems, are fundamental to application development across industries. The market for these databases is estimated at over USD $100 billion. However, many OLTP systems are based on architectures developed decades ago, which makes them challenging to manage, inflexible, and expensive. The shift towards AI-driven applications has introduced new technical requirements, including real-time data handling and architecture that can support AI workloads at speed and scale.
Lakebase, which leverages Neon technology, brings operational data into the lakehouse architecture, combining low-cost data storage with compute resources that automatically scale to meet workload requirements. This design allows operational and analytical systems to converge, reducing latency for AI processes and giving enterprises current data for real-time decision-making.

"We've spent the past few years helping enterprises build AI apps and agents that can reason on their proprietary data with the Databricks Data Intelligence Platform," said Ali Ghodsi, Co-founder and CEO of Databricks. "Now, with Lakebase, we're creating a new category in the database market: a modern Postgres database, deeply integrated with the lakehouse and today's development stacks. As AI agents reshape how businesses operate, Fortune 500 companies are ready to replace outdated systems. With Lakebase, we're giving them a database built for the demands of the AI era."
Key features
Lakebase separates compute and storage, supporting independent scaling for diverse workloads. Its cloud-native architecture offers low latency (under 10 milliseconds), high concurrency (over 10,000 queries per second), and is designed for high-availability transactional operations. The service is built on Postgres, an open source database engine widely used by developers and supported by a rich ecosystem.
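Because Lakebase runs the standard Postgres engine, existing drivers and tooling should connect in the usual way. As a minimal sketch (the endpoint, database, credentials, and table below are placeholders rather than documented Lakebase values), a Python application might connect with psycopg2:

```python
# Minimal sketch: connecting to a Lakebase instance with a standard Postgres
# driver. Host, database, credentials, and the orders table are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="my-lakebase-instance.example.com",  # placeholder endpoint
    port=5432,
    dbname="appdb",
    user="app_user",
    password="********",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # Parameterised query against an ordinary Postgres table
    cur.execute("SELECT order_id, status FROM orders WHERE status = %s", ("open",))
    for order_id, status in cur.fetchall():
        print(order_id, status)

conn.close()
```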
For AI workloads, Lakebase launches in under a second and operates on a consumption-based payment model, so users only pay for the resources they use. Branching capabilities allow developers to create copy-on-write database clones, supporting safe testing and experimentation by both humans and AI agents.
Lakebase automatically syncs data with lakehouse tables and provides an online feature store for machine learning model serving. It also integrates with other Databricks services, including Databricks Apps and Unity Catalog. The database is managed entirely by Databricks, with features such as encrypted data at rest, high availability, point-in-time recovery, and enterprise-grade compliance and security.
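The article does not detail how synced tables are exposed on the lakehouse side. Assuming they surface as Unity Catalog tables, a Databricks notebook (where `spark` is predefined) could query one roughly as follows; the three-level table name is hypothetical:

```python
# Sketch of reading a Lakebase-synced table on the lakehouse side of the sync.
# The catalog.schema.table name is hypothetical; `spark` is the SparkSession
# that Databricks notebooks provide automatically.
orders = spark.table("main.lakebase_sync.orders")

daily_totals = (
    orders.groupBy("order_date")
          .sum("amount")
          .withColumnRenamed("sum(amount)", "total_amount")
)

daily_totals.show()
```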
Market adoption and customer perspectives
According to the company, hundreds of enterprises participated in the Private Preview stage of Lakebase. Potential applications for the technology span sectors, from personalised product recommendations in retail to clinical trial workflow management in healthcare.
Jelle Van Etten, Head of Global Data Platform at Heineken, commented: "At Heineken, our goal is to become the best-connected brewer. To do that, we needed a way to unify all of our datasets to accelerate the path from data to value. Databricks has long been our foundation for analytics, creating insights such as product recommendations and supply chain enhancements. Our analytical data platform is now evolving to be an operational AI data platform and needs to deliver those insights to applications at low latency."
Anjan Kundavaram, Chief Product Officer at Fivetran, said: "Lakebase removes the operational burden of managing transactional databases. Our customers can focus on building applications instead of worrying about provisioning, tuning and scaling."
David Menninger, Executive Director at ISG Software Research, said: "Our research shows that the data and insights from analytical processes are the most critical data to enterprises' success. In order to act on that information, they must be able to incorporate it into operational processes via their business applications. These two worlds are no longer separate. By offering a Postgres-compatible, lakehouse-integrated system designed specifically for AI-native and analytical workloads, Databricks is giving customers a unified, developer-friendly stack that reduces complexity and accelerates innovation. This combination will help enterprises maximise the value they derive across their entire data estate — from storage to AI-enabled application deployment."
Integration and partner network
Lakebase is launching with support from a network of partners, including technology vendors and system integrators such as Accenture, Deloitte, Cloudflare, Informatica, Qlik, and Redis. These partnerships are designed to ease data integration, enhance business intelligence, and support governance for customers as they adopt Lakebase as part of their operational infrastructure.
Lakebase is now available in Public Preview with further enhancements planned in the coming months. Customers can access the preview directly through their Databricks workspace.
Related Articles


Fivetran awarded Databricks 2025 Data Integration Partner of the Year
Fivetran has been named the 2025 Databricks Data Integration Partner of the Year. The award recognises the collaboration between Fivetran and Databricks to provide data foundations for analytics and artificial intelligence to enterprise customers, and follows a 40 percent year-over-year increase in the number of joint customers using the two platforms to manage and analyse data.

Fivetran's solutions allow organisations to centralise data from a wide array of sources, such as SaaS applications, databases, files, and event streams, into the Databricks Data Intelligence Platform. By automating data movement and streamlining pipeline management, Fivetran aims to reduce the engineering resources its clients require while providing more reliable and faster access to data.

Growth and integration

The partnership between Fivetran and Databricks has expanded over the past year with the introduction of advanced integrations into Unity Catalog and Delta Lake. These integrations help customers meet governance requirements while making use of both structured and unstructured data. As more organisations look to refine their data operations, the combined capabilities are cited as reducing operational overhead, improving performance, and speeding the transformation of raw data into actionable insights.

"Databricks continues to be a strategic partner as more companies invest in modern data infrastructure," said Logan Welley, Vice President of Alliances at Fivetran. "This recognition speaks to the value we are delivering together for customers who need reliable, secure data pipelines to support production-grade AI and analytics. We are proud to help build the foundation for what comes next."

Launch partner initiatives

Fivetran has also been announced as a launch partner for Databricks Managed Iceberg Tables, a feature designed to give customers access to open, high-performance data formats optimised for large-scale analytics and AI. Through its integration with Unity Catalog, Fivetran seeks to offer enterprises a consistent approach to data governance and efficient data access as they scale their workloads and expand analytics and AI use cases.

The solution is used by organisations across a range of industries. National Australia Bank, for example, uses Fivetran's Hybrid Deployment model to operate data pipelines within its own cloud infrastructure while using Databricks for processing and analytics, allowing the bank to meet stringent compliance requirements while modernising its infrastructure and accelerating AI adoption. Other companies, including OpenAI, Pfizer, and Dropbox, use Fivetran to move data into Databricks for applications ranging from real-time analytics to machine learning in production.

Partner perspectives

"As enterprise demand for data intelligence grows, Fivetran has been an important partner for us in helping organisations move faster with data," said Roger Murff, Vice President of Technology Partners at Databricks. "Their focus on automation, scale, and governance aligns with what our customers need as they bring more data-driven AI applications from production to market."

Fivetran reports that its automated pipelines, security measures, and managed experience are intended to support compliance and facilitate AI-focused data infrastructure modernisation for its enterprise clients.


Asia Pacific enterprises shift to genAI spend amid AI cloud push
Forrester's latest research examines the status of artificial intelligence (AI) adoption across Asia Pacific and its implications for cloud strategy and enterprise innovation. Two recent Forrester reports, The State of AI, 2024 and Embrace the AI-Native Cloud Now, provide an in-depth look at how organisations in Asia Pacific and worldwide are approaching generative AI (genAI) investments, use cases, and cloud-native transformations.

Regional investment trends

Forrester's The State of AI, 2024 report shows more than half of enterprise AI decision-makers globally have allocated between USD $200,000 and USD $400,000 to genAI so far. These figures signal significant but selective engagement with AI, as adoption patterns differ according to regional objectives and regulatory environments.

Within Asia Pacific, one of the more distinct trends is a change in budget allocation: enterprises in the region are more likely to shift funding from predictive AI initiatives to genAI. According to Forrester, this indicates "a pragmatic, value-driven approach" to AI investment. Leaders across Asia Pacific continue to place a premium on employee productivity and customer experience (CX), two goals that mirror global priorities. However, Asia Pacific organisations stand out by leading slightly on data literacy efforts, with 50% of respondents indicating a focus on upskilling employees for more informed, data-driven decision making.

Adoption and key use cases

GenAI applications are gaining traction in both operational and development domains. While IT operations is a top AI use case globally, 43% of Asia Pacific respondents said their firms are deploying genAI to support software development, reflecting the region's intent to bolster engineering productivity and accelerate digital transformation.

Despite ongoing investment, barriers continue to influence adoption. Data privacy is the primary risk cited by firms in the region, and Asia Pacific organisations report a more acute shortage of specialised technical skills than their counterparts elsewhere, underlining the need for expanded workforce enablement and training. Expectations for return on investment (ROI) also vary: half of the organisations surveyed anticipate returns within one to three years, while 38% are targeting a three-to-five-year time frame, suggesting a diversity of approaches based on the scale and complexity of AI projects underway.

According to Frederic Giron, Vice President and Senior Research Director at Forrester, generative AI continues to spur conversation and investment across industries, albeit at varying levels depending on regional priorities and challenges. Companies that strategically align their AI efforts with measurable outcomes around customer experience and employee productivity are already reaping returns, but addressing barriers such as data privacy, governance, and skill shortages is critical to ensuring that AI investments deliver sustainable value.

Cloud strategies for the AI era

The Embrace the AI-Native Cloud Now report explores how public cloud platforms are evolving under widespread AI adoption. It asserts that as genAI capabilities advance, Asia Pacific enterprises and governments must pursue AI-native cloud strategies to remain competitive and operate efficiently at scale.
AI-native clouds are described as moving beyond conventional infrastructure services to offer intelligent, automated operations and application capabilities. Key features highlighted in the research include predictive operations, autoscaling, and automated handling of security updates and system patches. There is also a rapid expansion of AI-enabled development tools, such as TuringBots and low-code platforms, which streamline application design and deployment and aim to raise developer productivity by automating much of the coding and debugging process, freeing teams to focus on strategic initiatives. Embedded AI APIs are also increasingly underpinning both software-as-a-service (SaaS) solutions and bespoke applications, providing capabilities such as predictive analytics, personalisation, and agentic AI-driven user experiences.

The report notes that to fully benefit from the AI-native cloud, organisations should integrate technologies such as retrieval-augmented generation (RAG) and adopt composable architectures (a minimal sketch of the retrieval step in RAG appears after this article summary).

According to Charlie Dai, Vice President and Principal Analyst at Forrester, the AI-native cloud is not just the next iteration of cloud technology; it is the paradigm shift enterprises need for their cloud strategies to remain competitive in an AI-driven world. Leaders, he says, must prioritise AI-native cloud strategies to improve operational efficiency, advance development capabilities, accelerate business innovation, and differentiate their customer experiences.

The research indicates that as organisations in Asia Pacific move towards more data-driven operations and cloud-native models, the focus will increasingly fall on workforce competence, governance, and risk management to ensure AI and cloud investments translate into sustained value and competitive differentiation.
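For readers unfamiliar with the RAG pattern the report recommends, the sketch below illustrates only the retrieval step in plain Python. The documents, the toy hashing-based embedding, and the prompt format are illustrative stand-ins; a production system would use a real embedding model and a generative model.

```python
# Illustrative retrieval step for retrieval-augmented generation (RAG).
# Embedding is stubbed with a toy hashing scheme; generation is not shown.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in embedding: hash words into a small fixed-size vector."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

documents = [
    "Invoices are due within 30 days of issue.",
    "Support tickets are triaged within four business hours.",
    "Cloud spend is reviewed at the end of every quarter.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list:
    """Return the k documents most similar to the query."""
    scores = doc_vectors @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

question = "When are invoices due?"
context = "\n".join(retrieve(question))
prompt = "Answer using only this context:\n" + context + "\n\nQuestion: " + question
print(prompt)  # this prompt would then be passed to a generative model
```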


CData launches accelerator to simplify Databricks integration
CData Software has introduced a new integration accelerator designed to simplify and speed up enterprise data integration for organisations using Databricks environments. The CData Databricks Integration Accelerator aims to eliminate traditional data pipeline bottlenecks and shorten integration timelines, enabling companies to make more efficient use of their Databricks Lakehouse investments.

The Databricks Lakehouse Platform combines data warehousing and artificial intelligence in a unified system for real-time analytics and machine learning. However, integrating data from varied systems can be problematic for enterprises, particularly due to fragmented legacy ETL tools and hybrid environments spanning on-premises and multi-cloud infrastructure. The new accelerator addresses these issues by providing a no-code framework for building scalable ingestion pipelines, simplifying transformation tasks, and supporting compliance with data governance frameworks such as Unity Catalog.

"Integrating enterprise data into Databricks often requires extensive custom code and manual configuration, which can introduce delays and increase maintenance overhead," said Manish Patel, Chief Product Officer at CData. "Our Integration Accelerator eliminates those inefficiencies by providing prebuilt connectors, automated pipeline orchestration, and real-time data availability, enabling teams to operationalise data faster and focus on driving value through analytics and AI."

The CData Databricks Integration Accelerator is built around four toolkits, each designed to address specific integration challenges.

Delta Lake integration

The Delta Lake Integration Toolkit allows organisations to ingest data using a no-code approach and supports Change Data Capture (CDC) from over 270 sources. The toolkit provides live access to business systems, including sales, marketing and finance, via Databricks Lakehouse Federation, while supporting governance through Unity Catalog.

Delta Live Tables

The Delta Live Tables (DLT) Extension Toolkit expands connectivity to a broader range of business applications, creating a unified SQL data model via Databricks Spark for straightforward data integration. The toolkit also offers authentication and pagination support for any application programming interface (API), as well as server-side pushdown to improve speed and efficiency. (A plain DLT table definition is sketched after this article summary for context.)

Databricks-Microsoft connectivity

With the Databricks-Microsoft Connectivity Toolkit, users can establish standards-based connections between Databricks and the Microsoft software ecosystem. This supports direct integration with products such as SSAS, SSRS and SSIS, maintaining live data connections and Unity Catalog compatibility.

Agentic data pipelines

The Agentic Data Pipelines Toolkit focuses on automating data ingestion for agentic workloads, offering programmatic access to serverless PostgreSQL compatible with Databricks serverless deployments. The toolkit makes enterprise data instantly available to AI agents, provides programmatic orchestration of data pipelines, and processes real-time data from any source using CDC.

"Everyone needs a good way to load data into their data lake," said Eric Newcomer, CTO and Principal Analyst at Intellyx. "CData has been developing and delivering a broad and deep set of data connectors for more than a decade. Now, Databricks users can leverage not only the proven CData suite of connectors, but also the integration toolkits built on top of them, including SQL data models, Microsoft-specific connections, and MCP servers for AI agent automations."

Customer deployment and outcomes

NJM Insurance, a provider of personal and commercial insurance policies, has deployed CData's Databricks Integration Accelerator to overhaul its marketing analytics operations. The company reported a 90% reduction in integration build time and a 66% saving in project costs by replacing traditional, code-heavy ETL processes with CData's no-code solution, enabling quicker insight into customer acquisition and lifetime value.

"With CData, we were able to ingest our marketing data into Databricks 10 times faster, allowing us to make data-driven decisions more quickly," said Ameya Narvekar, Data Insights Supervisor at NJM Insurance.

The CData Databricks Integration Accelerator is now available for organisations seeking to reduce the time and complexity associated with enterprise data integration projects.
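The article does not show CData's toolkit code. For context on the pipeline framework the DLT Extension Toolkit builds on, a plain Delta Live Tables definition in a Databricks pipeline notebook looks roughly like the sketch below; the landing path and table names are hypothetical.

```python
# Sketch of a plain Delta Live Tables (DLT) pipeline notebook, for context on
# the framework the toolkit extends. Paths and table names are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested from cloud storage via Auto Loader")
def raw_orders():
    return (
        spark.readStream.format("cloudFiles")
             .option("cloudFiles.format", "json")
             .load("/Volumes/main/ingest/orders/")  # placeholder landing path
    )

@dlt.table(comment="Open orders only, for downstream analytics")
def open_orders():
    return dlt.read_stream("raw_orders").where(col("status") == "open")
```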