Spice AI Partners With Databricks to Extend Operational Data & AI Capabilities to Real-Time Applications and Agents


Business Wire | 4 days ago

SEATTLE--(BUSINESS WIRE)--Spice AI, a leader in enabling organizations to drive value from their operational data in AI applications and agent development, today announced the expansion of its partnership with Databricks, the Data and AI company, with the launch of Spice AI's new integration with the Databricks Data Intelligence Platform. The new integration increases developer productivity, accelerates the building of AI applications and agents, and enables organizations to drive more value from their data by leveraging the Databricks Data Intelligence Platform.
Organizations are under increasing pressure to drive top-line revenue growth and value from their data while reducing the cost and complexity of building applications and maintaining high standards of security and governance. With Spice AI and Databricks, customers enjoy native access to Unity Catalog, Databricks' open and unified governance solution, and Databricks SQL, the company's intelligent data warehouse. With enterprise-grade security via the Databricks Data Intelligence Platform, the partnership also includes a direct integration with Databricks Mosaic AI Gateway and Mosaic AI Model Serving.
Built on an open lakehouse architecture, Databricks' Data Intelligence Platform enables customers to democratize and scale data and AI across their organization. With Spice AI and Databricks, customers can reduce operational complexity, improve application performance, and accelerate time-to-market for their applications and AI agents.
Native Integrations
"The Databricks and Spice AI partnership bridges the gap between cloud-scale analytics and edge computing requirements, enabling organizations to build next-generation intelligent applications. By combining Databricks' Data Intelligence Platform with Spice AI's edge computing capabilities, customers can deploy sophisticated AI applications that operate across cloud and edge environments," said Luke Kim, Founder & CEO of Spice AI.
Key Spice AI capabilities available to Databricks customers include:
Native integration with Databricks SQL warehouse for high-performance SQL queries with Spice.ai acceleration.
Unified query across Databricks, on-premises, and edge data sources.
Native support for Databricks Mosaic AI model serving and embeddings.
Unity Catalog integration for governance and security including credential vendoring.
Apache Iceberg™ & Delta Lake support for query and management of open format tables via Unity Catalog.
Service Principal M2M & U2M OAuth authentication for enterprise-grade role-based security.
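Concretely, exposing a Databricks table to applications through the Spice runtime typically takes the form of a dataset entry in the runtime's configuration. The sketch below is illustrative only, assuming a spicepod.yaml-style config; the table path, endpoint, token reference, and field names are hypothetical placeholders, so consult the Spice and Databricks documentation for the exact schema:

```yaml
version: v1beta1
kind: Spicepod
name: databricks_quickstart

datasets:
  # Hypothetical Unity Catalog table path; replace with a real catalog.schema.table.
  - from: databricks:my_catalog.my_schema.my_table
    name: my_table
    params:
      # Workspace endpoint and token are illustrative placeholder values.
      databricks_endpoint: dbc-example.cloud.databricks.com
      databricks_token: ${secrets:DATABRICKS_TOKEN}
    # Local materialization of the remote table for low-latency SQL queries.
    acceleration:
      enabled: true
```

With a configuration along these lines, applications would query the accelerated dataset over the runtime's SQL interface rather than hitting the warehouse directly on every request.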
"Partnering with Spice AI has transformed how NRC Health delivers AI-driven insights. By unifying siloed data across systems, we accelerated AI feature development, reducing time-to-market from months to weeks, and sometimes days. More than boosting speed and efficiency, Spice has empowered our developers to build dependable, patient-centric tools, like AI-powered rounding summaries, that directly improve care. With predictable costs and faster innovation, Spice isn't just solving some of our data and AI challenges; it's helping us redefine personalized healthcare," said Tim Ottersburg, VP of Technology, NRC Health.
"Today, nearly every enterprise organization is looking to invest in domain-specific AI agents. Successfully leveraging operational data is a key success factor in building agents that make a real business impact," said Roger Murff, VP of Technology Partners at Databricks. "We are thrilled to partner with Spice AI to deliver a solution on the Databricks Data Intelligence Platform that enables organizations to extract maximum value from their data and deliver true data intelligence to our joint customers."
Spice AI's support for the Databricks Data Intelligence Platform is available today. To learn more about the partnership with Databricks, visit https://spice.ai/ or stop by Spice AI at Booth #D114 at Data + AI Summit in San Francisco, June 10-13.
About Spice AI
Spice AI is purpose-built to help enterprises ground AI in data. By unifying federated data query, retrieval, and AI inference into a single engine, Spice mitigates AI hallucinations, accelerates data access for mission-critical workloads, and makes it easy for developers to build fast, accurate, data-intensive applications across cloud, edge, and on-prem environments.
To read more about Spice AI's recent product announcement enabling a portable compute engine for data-grounded AI, please visit: https://spiceai.org/blog/announcing-1.0-stable
Spice AI is headquartered in Seattle, WA. For more information on how Spice AI helps customers drive value from their data and accelerate their AI application and agent development, please visit: https://spice.ai/.


Related Articles

PuppyGraph Announces New Native Integration to Support Databricks' Managed Iceberg Tables

Business Wire | a day ago

SAN FRANCISCO--(BUSINESS WIRE)--PuppyGraph, the first real-time, zero-ETL graph query engine, today announced native integration with Managed Iceberg Tables on the Databricks Data Intelligence Platform. This milestone allows organizations to run complex graph queries directly on Iceberg Tables governed by Unity Catalog, with no data movement and no ETL pipelines.

Databricks Managed Iceberg Tables, launching in Public Preview at this year's Data + AI Summit, offer full support for the Apache Iceberg™ REST Catalog API. This allows external engines, such as Apache Spark™, Apache Flink™, and Apache Kafka™, to interoperate seamlessly with tables governed by Unity Catalog. Managed Iceberg Tables provide automatic performance optimizations, which deliver cost-efficient storage and fast queries out of the box.

By combining PuppyGraph's in-place graph engine with the openness and scale of Managed Iceberg Tables, teams can now:
Query massive Iceberg datasets as a live graph, in real time.
Use graph traversal to detect fraud, lateral movement, and network paths.
Perform root cause analysis on telemetry data using service relationship graphs.
Eliminate the need for ETL into siloed graph databases.
Scale analytics across petabytes with minimal operational overhead.

Coinbase and CipherOwl are joint customers of Databricks and PuppyGraph. At the Data + AI Summit, both will share how graph analytics has powered their products and enabled real-time insights directly on managed lakehouses.

"This changes how graph analytics fits into the modern data stack," said Weimo Liu, CEO of PuppyGraph. "Databricks' new Iceberg capabilities provide a truly open, scalable foundation. With PuppyGraph, teams can ask complex relationship-driven questions without ever leaving their lakehouse."

To learn more about how PuppyGraph integrates with Apache Iceberg™ and the Databricks Data Intelligence Platform, see the joint talk with Coinbase at Data + AI Summit 2025.

About PuppyGraph: PuppyGraph is the first and only real-time, zero-ETL graph query engine on the market, empowering data teams to query existing relational data stores as a unified graph model deployed in under 10 minutes, bypassing traditional graph databases' cost, latency, and maintenance hurdles. Capable of scaling to petabytes of data and executing complex 10-hop queries in seconds, PuppyGraph supports use cases from enhancing LLMs with knowledge graphs to fraud detection, cybersecurity, and more. It is trusted by industry leaders including Coinbase, Netskope, CipherOwl, Prevalent AI, and Clarivate. Follow the company on LinkedIn, YouTube, and X.
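The release doesn't show PuppyGraph's query languages, but conceptually the multi-hop traversals it describes (fraud rings, lateral movement, network paths) amount to bounded reachability over edges stored as relational rows. A minimal Python sketch of that idea, using a hypothetical funds-transfer table:

```python
from collections import deque

def reachable_within(edges, start, max_hops):
    """Breadth-first traversal over an edge list, bounded by hop count.
    Returns the set of nodes reachable from `start` in at most max_hops edges."""
    adjacency = {}
    for src, dst in edges:
        adjacency.setdefault(src, []).append(dst)
    seen = {start: 0}  # node -> hop distance from start
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue  # hop budget exhausted along this path
        for neighbor in adjacency.get(node, []):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1
                queue.append(neighbor)
    return set(seen) - {start}

# Toy transfer graph: who sent funds to whom (hypothetical rows from a table).
transfers = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")]
print(reachable_within(transfers, "a", 2))  # {'b', 'c'}
```

A graph engine answers the same question declaratively and at scale; the point of the zero-ETL design is that the edge rows stay in the lakehouse instead of being copied into a separate graph database first.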

Striim Announces Neon Serverless Postgres Support to Broaden Agentic AI Use Cases with Databricks

Yahoo | 2 days ago

PALO ALTO, Calif., June 12, 2025 (GLOBE NEWSWIRE) -- Applications in the AI era depend on real-time data, but data ingestion and integration from legacy architectures often hold them back. Traditional ETL pipelines introduce latency, complexity, and stale intelligence, limiting the effectiveness of LLM-driven applications and Retrieval-Augmented Generation (RAG). For enterprises building on the Postgres stack, bridging the gap between operational data and real-time AI is critical.

Open-source Postgres is widely deployed by developers as a back-end database for operational workloads. Neon builds on this foundation with a new paradigm for the creation of databases by AI agents. Most recently, Databricks announced Lakebase, based on its acquisition of Neon, a fully managed Postgres database that is a popular choice for building AI applications. Now, Striim is announcing that it is expanding its Postgres offerings with high-throughput ingestion from Neon into Databricks for real-time analytics, as well as high-speed data delivery from legacy systems into Neon for platform and data modernization.

Striim's unified platform further allows vector embeddings to be built within the data pipeline while delivering real-time data into Neon and into Databricks for building agentic AI use cases. Using Striim, developers can seamlessly migrate, integrate, or replicate transactional and event data, along with in-flight vector embeddings, enriched context, and cleansed high-quality data, from multiple operational stores into Neon. This integration allows modern agentic applications to be rapidly built with Neon as the transactional backend.

With this added capability, organizations can:
Seamlessly replicate operational data in real time from traditional systems like Oracle, PostgreSQL, MySQL, SQL Server, and hundreds of other sources to Neon, with zero downtime and automated schema evolution.
Enable real-time ingestion and Change Data Capture (CDC) from Neon into Databricks, ensuring AI models and analytics workloads always operate on fresh data.
Fuel Retrieval-Augmented Generation (RAG) and generative AI use cases natively within Neon or Databricks with inline data enrichment and vector embeddings.
Stream event data from Apache Kafka into Neon in real time, eliminating the need for brittle batch-based integrations.
Maintain end-to-end data governance with in-flight AI-driven PII detection and resolution, encryption, and support for customer-managed keys.

"By extending our platform to support Neon and Databricks, we're giving Postgres-native teams the tools to build real-time, AI-native architectures without rethinking their stack," said Alok Pareek, co-founder and Executive Vice President of Engineering and Products at Striim. "Our mission is to help customers modernize from legacy platforms and legacy ETL to real-time, agent-incorporated intelligence, and Striim's Vector Agent and Neon CDC and delivery capabilities bring us one step closer to that future."

This expansion builds on Striim's momentum with Databricks, following the support for Databricks Delta Lake with open Delta table formats and the launch of SQL2Fabric-X, which unlocks real-time SQL Server data for both Microsoft Fabric and Azure Databricks. With Neon now part of the Striim ecosystem, Postgres users can join this wave of modernization: streaming operational data to fuel AI and analytics without sacrificing performance or reliability. To learn more about Striim's support for Neon and Databricks, contact the Striim team at sales@

ABOUT STRIIM, INC. Striim pioneers real-time intelligence for AI by unifying data across clouds, applications, and databases via a fully managed, SaaS-based platform. Striim's platform, optimized for modern cloud data warehouses, transforms relational and unstructured data into AI-ready insights instantly with advanced analytics and ML frameworks, enabling swift business action. Striim leverages its expertise in real-time data integration, streaming analytics, and database replication, including industry-leading Oracle, PostgreSQL, and MongoDB CDC technology, to achieve sub-second latency in processing over 100 billion daily events for ML analytics and proactive decision-making.

Media Contact: Dianna Spring, Vice President of Marketing at Striim. Phone: (650) 241-0680 ext. 354. Email: press@

Source: Striim, Inc.
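Striim's pipeline internals aren't shown in the release, but the Change Data Capture pattern it describes is easy to state: an ordered stream of insert/update/delete events from a source is applied to a target so the target converges on the source's current state. A minimal, tool-agnostic Python sketch of that apply step (event shape and field names are hypothetical):

```python
def apply_cdc_events(target, events):
    """Apply an ordered stream of change events to a keyed target table.
    Each event is (op, key, row); op is 'insert', 'update', or 'delete'."""
    for op, key, row in events:
        if op in ("insert", "update"):
            target[key] = row  # upsert: makes replayed events harmless
        elif op == "delete":
            target.pop(key, None)
    return target

# Toy change stream, as a CDC reader might emit it from a source database.
events = [
    ("insert", 1, {"name": "ada", "plan": "free"}),
    ("insert", 2, {"name": "bob", "plan": "free"}),
    ("update", 1, {"name": "ada", "plan": "pro"}),
    ("delete", 2, None),
]
print(apply_cdc_events({}, events))  # {1: {'name': 'ada', 'plan': 'pro'}}
```

Treating updates as upserts is what makes this style of replication tolerant of at-least-once delivery; a production pipeline like the ones described above adds schema evolution, ordering guarantees, and in-flight enrichment on top of this core loop.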

Databricks introduces Agent Bricks for AI agent development

Yahoo | 2 days ago

Databricks has launched Agent Bricks, an automated solution designed to facilitate the creation of AI agents tailored to specific business needs. This tool allows users to input a 'high-level' description of the desired task and connect their enterprise data, with Agent Bricks managing the subsequent processes. The service, now available in Beta, is optimised for various industry applications, including structured information extraction, knowledge assistance, text transformation, and multi-agent systems, the company said.

Agent Bricks employs advanced research methodologies from Mosaic AI Research to generate domain-specific synthetic data and task-aware benchmarks. This approach enables automatic optimisation for both cost and quality, streamlining the development process and enhancing production-level accuracy. The integration of governance and enterprise controls allows teams to transition from concept to production efficiently, eliminating the need for disparate tools.

The functionality of Agent Bricks includes automatic generation of task-specific evaluations and LLM judges, the creation of synthetic data that mirrors customer data, and a comprehensive search for optimisation techniques. Users can select the iteration that best balances quality and cost, resulting in a production-ready AI agent capable of delivering consistent output, the company's statement added.

Agent Bricks supports various customer use cases across multiple sectors. For instance, the Information Extraction Agent converts documents into structured data, while the Knowledge Assistant Agent provides accurate answers based on enterprise data. The Multi-Agent Supervisor facilitates the integration of multiple agents for complex tasks, and the Custom LLM Agent allows for tailored text transformations.

Databricks CEO and co-founder Ali Ghodsi said: 'For the first time, businesses can go from idea to production-grade AI on their own data with speed and confidence, with control over quality and cost tradeoffs. No manual tuning, no guesswork and all the security and governance Databricks has to offer. It's the breakthrough that finally makes enterprise AI agents both practical and powerful.'

In addition to Agent Bricks, Databricks has introduced several features at the Data + AI Summit, including support for serverless GPUs, enabling teams to fine-tune models and run workloads without managing GPU infrastructure. The release of MLflow 3.0, a platform for managing the AI lifecycle, allows users to monitor and optimise AI agents across various environments. In May 2025, Databricks announced the acquisition of Neon, a serverless Postgres database company.

"Databricks introduces Agent Bricks for AI agent development" was originally created and published by Verdict, a GlobalData owned brand.
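Agent Bricks' internal selection logic isn't public, but the "select the iteration that best balances quality and cost" step it describes can be illustrated with a simple rule: among candidate agent configurations that clear a quality bar, pick the cheapest. This Python sketch is a conceptual illustration only, not Agent Bricks' actual API; the candidate names and score fields are invented:

```python
def best_iteration(candidates, min_quality):
    """Pick the cheapest candidate meeting a quality bar, one simple way to
    trade off quality against cost when comparing tuned agent variants."""
    eligible = [c for c in candidates if c["quality"] >= min_quality]
    if not eligible:
        return None  # no variant clears the bar
    return min(eligible, key=lambda c: c["cost"])

# Hypothetical benchmark results for three tuning iterations.
runs = [
    {"name": "small-model", "quality": 0.78, "cost": 1.0},
    {"name": "tuned-small", "quality": 0.91, "cost": 1.4},
    {"name": "large-model", "quality": 0.93, "cost": 6.0},
]
print(best_iteration(runs, min_quality=0.90)["name"])  # tuned-small
```

In practice the quality scores would come from the task-specific evaluations and LLM judges the article mentions, and the selection rule could equally be a Pareto frontier rather than a single threshold.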
