
AWS And Boomi Announce Strategic AI And SAP Integration Partnership
Press Release – Boomi
Boomi™, the leader in AI-driven automation, today announced a multi-year Strategic Collaboration Agreement (SCA) with Amazon Web Services (AWS) to help customers build, manage, monitor, and govern generative artificial intelligence (AI) agents across enterprise operations. The SCA also aims to help customers accelerate SAP migrations from on-premises environments to AWS.
Enterprise organisations today struggle with fragmented AI agent management across multiple platforms and environments, which creates potential security risks and operational inefficiencies. As AI agent adoption accelerates, organisations need a trusted way to monitor, secure, and optimise their AI investments across their diverse technology landscape.
Amazon Bedrock is a fully managed service for building and scaling generative AI applications, offering access to the broadest selection of fully managed models from leading AI companies. The Boomi Agent Control Tower is a centralised management solution for deploying, monitoring, and governing AI agents across hybrid and multi-cloud environments. By integrating the two, customers can easily discover, build, and manage agents running in their AWS accounts while maintaining visibility and control over agents running in other cloud providers' or third-party environments. Through a single API, Amazon Bedrock provides a broad set of capabilities for building generative AI applications with security, privacy, and responsible AI in mind, including support for the Model Context Protocol (MCP), a new open standard that enables developers to build secure, two-way connections between their data and AI-powered tools. MCP enables agents to effectively interpret and work with ERP data while complying with data governance and security requirements.
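For orientation only, the sketch below shows what a direct call to a Bedrock-hosted model looks like through the AWS SDK for Python (boto3) Converse API. The model ID, region, and prompt are placeholders; Boomi's Agent Control Tower integration is not part of the public SDK and would layer governance and monitoring on top of calls like this rather than being shown here.

```python
# Minimal sketch: invoking a foundation model on Amazon Bedrock via boto3.
# The model ID, region, and prompt are illustrative placeholders; the Boomi
# Agent Control Tower integration described above is not shown.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # any available Bedrock model ID
    messages=[
        {"role": "user", "content": [{"text": "Summarise the open purchase orders in our ERP."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the assistant message as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```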
'Our Strategic Collaboration Agreement with AWS comes at a critical inflection point where enterprise AI adoption requires a delicate balance between innovation and governance,' said Steve Lucas, Chairman and CEO at Boomi. 'By integrating Amazon Bedrock's powerful generative AI capabilities with Boomi's Agent Control Tower, we're giving organisations unprecedented visibility and control across their entire AI ecosystem while simultaneously accelerating their critical SAP workload migrations to AWS. This partnership enables enterprises to confidently scale their AI initiatives with the security, compliance, and operational excellence their business demands.'
'As we've worked with thousands of customers adopting generative AI, one thing has become clear—enterprises need robust solutions to effectively deploy, monitor, and govern AI agents across their technology environments,' said Rahul Pathak, VP, Data & AI GTM at AWS. 'Amazon Bedrock provides customers with choice—the most comprehensive selection of foundation models and built-in governance capabilities. When you combine this with Boomi's Agent Control Tower providing centralised visibility and control, customers get what they need to innovate confidently while maintaining security and compliance. This collaboration brings together complementary strengths so customers can move quickly from proof-of-concept to production while meeting their business requirements.'
'As we expand AI-driven automation across our business, Boomi and AWS are essential partners enabling us to securely manage complexity and scale innovation,' said Amit Sinha, President & Co-Founder of WorkSpan. 'Utilising Boomi's integration capabilities combined with AWS infrastructure, we seamlessly connect diverse systems, accelerate SAP and AI initiatives, and maintain essential visibility and governance across our technology landscape—significantly enhancing our ability to innovate quickly and confidently.'
The collaboration will introduce several strategic joint initiatives, including:
Agent Control Tower – Integrated with Amazon Bedrock, Boomi's Agent Control Tower, part of Boomi Agentstudio (formerly Boomi AI Studio), provides holistic multi-cloud governance for AI agents. Organisations gain comprehensive visibility, proactive monitoring, and control of both Boomi-authored and third-party AI agents. This solution is critical for securely scaling AI initiatives across on-premises, public, or hybrid cloud environments, reducing complexity and ensuring enterprise-grade security and compliance.
Enhanced Agent Designer – Boomi's low-code Agent Designer, now integrated with Amazon Q index, allows rapid creation of AI agents with deep contextual understanding and smarter model selection. Enterprises can build, train, and deploy intelligent agents that leverage Amazon Q index for improved relevance and performance—and then manage them at scale via Boomi's multi-platform Agent Control Tower.
New Native AWS Connectors and Boomi for SAP – As part of its SCA with AWS, Boomi has introduced new native connectors for AWS Lambda, Amazon Bedrock, Amazon DynamoDB, and the Amazon Selling Partner Appstore. These connectors enable seamless integration across AWS services, supporting use cases from serverless computing to generative AI and e-commerce. Also part of the broader SCA collaboration, Boomi for SAP provides SAP-certified native integration that simplifies and accelerates connectivity between SAP and non-SAP systems, reduces integration time, and supports cloud migration with modern ELT capabilities powered by the Rivery acquisition. Organisations can efficiently move SAP data into any AWS-powered data warehouse or data lake to enable real-time analytics and AI. Together, Boomi and AWS empower organisations to connect SAP to non-SAP applications, move data seamlessly to the cloud, and fuel AI and analytics initiatives—maximising the value of their technology investments.
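As a rough illustration of the AWS services the new connectors target (this is not Boomi's connector API, which is configured inside the Boomi platform rather than written in code), the sketch below invokes an AWS Lambda function and writes the result to Amazon DynamoDB directly with boto3. The function name, table name, and payload are hypothetical.

```python
# Sketch of the raw AWS operations that Boomi's new native connectors abstract:
# invoking a Lambda function and persisting an item to DynamoDB with boto3.
# The function name, table name, and payload are hypothetical examples.
import json
import boto3

lambda_client = boto3.client("lambda", region_name="us-east-1")
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# Invoke a serverless function synchronously and read its JSON result
# (assumes the function returns a JSON object).
result = lambda_client.invoke(
    FunctionName="enrich-sap-order",           # hypothetical function
    InvocationType="RequestResponse",
    Payload=json.dumps({"order_id": "4711"}),
)
enriched = json.loads(result["Payload"].read())

# Persist the enriched record to a DynamoDB table (string/dict attributes only).
table = dynamodb.Table("sap_orders")            # hypothetical table
table.put_item(Item={"order_id": "4711", **enriched})
```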
Boomi Validated as AWS Generative AI Competency Partner
Boomi also announced it achieved the AWS Generative AI Competency – a specialisation that recognises Boomi as an AWS Partner that helps customers and the AWS Partner Network drive the advancement of services, tools, and infrastructure pivotal for implementing generative AI technologies. As of May 2025, this marks Boomi's fifth AWS Competency, demonstrating Boomi's leadership in building generative AI applications using AWS technologies such as Amazon Bedrock, Amazon Q, and Amazon SageMaker. This recognition reflects Boomi's proven field expertise and technical best practices, enabling customers to innovate confidently with next-generation AI solutions at scale.
About Boomi
Boomi, the leader in AI-driven automation, helps organisations around the world automate and streamline critical processes to achieve business outcomes faster. Harnessing advanced AI capabilities, the Boomi Enterprise Platform seamlessly connects systems and manages data flows with API management, integration, data management, and AI orchestration in one comprehensive solution. With over 23,000 customers globally and a network of 800+ partners, Boomi is revolutionising the way enterprises of all sizes achieve business agility and operational excellence. Discover more at boomi.com.
Related Articles


Scoop – 3 days ago
Snowflake Unveils Comprehensive Product Innovations To Empower Enterprises To Achieve Full Potential Through Data And AI
Snowflake Openflow simplifies the process of getting data from where it is created to where it can be used
Snowflake Standard Warehouse - Generation 2 and Snowflake Adaptive Compute deliver faster analytics performance to accelerate customer insights, without driving up costs
Snowflake Intelligence allows business users to harness AI data agents to analyse, understand, and act on structured and unstructured data
Snowflake Cortex AISQL embeds generative AI directly into customers' queries, empowering teams to analyse all types of data and build flexible AI pipelines with familiar SQL syntax
With Cortex Knowledge Extensions, enterprises can enrich their AI apps and agents with real-time news and content from trusted third-party providers

Snowflake (NYSE: SNOW), the AI Data Cloud company, today announced several product innovations at its annual user conference, Snowflake Summit 2025, designed to revolutionise how enterprises manage, analyse, and activate their data in the AI era. These announcements span data engineering, compute performance, analytics, and agentic AI capabilities, all aimed at helping organisations break down data silos and bridge the gap between enterprise data and business action — without sacrificing control, simplicity, or governance.

'Today's announcements underscore the rapid pace of innovation at Snowflake in our drive to empower every enterprise to unlock its full potential through data and AI,' said Theo Hourmouzis, Senior Vice President, ANZ and ASEAN, Snowflake. 'Organisations across A/NZ are looking to take their AI projects to the next level – from testing, to production, to ultimately providing business value. Today's innovations are focused on providing them with the easiest, most connected, and most trusted data platform to do so.'

Snowflake Openflow Unlocks Full Data Interoperability, Accelerating Data Movement for AI Innovation

Snowflake unveiled Snowflake Openflow, a multi-modal data ingestion service that allows users to connect to virtually any data source and drive value from any data architecture. Now generally available on AWS, Openflow eliminates fragmented data stacks and manual labor by unifying various types of data and formats, enabling customers to rapidly deploy AI-powered innovations. Snowflake Openflow embraces open standards, so organisations can bring data integrations into a single, unified platform without vendor lock-in and with full support for architecture interoperability. Powered by Apache NiFi™[1], an Apache Software Foundation project built to automate the flow of data between systems, Snowflake Openflow enables data engineers to build custom connectors in minutes and run them seamlessly on Snowflake's managed platform. With Snowflake Openflow, users can harness their data across the entire end-to-end data lifecycle, while adapting to evolving data standards and business demands. Hundreds of ready-to-use connectors and processors simplify and rapidly accelerate data integration from a broad range of data sources including Box, Google Ads, Microsoft Dataverse, Microsoft SharePoint, Oracle, Proofpoint, ServiceNow, Workday, Zendesk, and more, to a wide array of destinations including cloud object stores and messaging platforms, not just Snowflake.
Snowflake Unveils Next Wave of Compute Innovations For Faster, More Efficient Warehouses and AI-Driven Data Governance

Snowflake announced the next evolution of compute innovations that deliver faster performance, enhanced usability, and stronger price-performance value — raising the bar for modern data infrastructure. This includes Standard Warehouse – Generation 2 (Gen2) (now generally available), an enhanced version of Snowflake's virtual Standard Warehouse with next-generation hardware and additional enhancements to deliver 2.1x[2] faster analytics performance and 1.9x faster analytics performance than Managed Spark. Snowflake also introduced Snowflake Adaptive Compute (now in private preview), a new compute service that lowers the burden of resource management by maximising efficiency through automatic resource sizing and sharing. Warehouses created using Adaptive Compute, known as Adaptive Warehouses, accelerate performance for users without driving up costs, ultimately redefining data management in the evolving AI landscape.

Snowflake Intelligence and Data Science Agent Deliver The Next Frontier of Data Agents for Enterprise AI and ML

Snowflake announced Snowflake Intelligence (public preview soon), which enables technical and non-technical users alike to ask natural language questions and instantly uncover actionable insights from both structured tables and unstructured documents. Snowflake Intelligence is powered by state-of-the-art large language models from Anthropic and OpenAI, running inside the secure Snowflake perimeter, with Cortex Agents (public preview) under the hood — all delivered through an intuitive, no-code interface that helps provide transparency and explainability. Snowflake also unveiled Data Science Agent (private preview soon), an agentic companion that boosts data scientists' productivity by automating routine ML model development tasks. Data Science Agent uses Anthropic's Claude to break down problems associated with ML workflows into distinct steps, such as data analysis, data preparation, feature engineering, and training. Today, over 5,200[3] customers from companies like BlackRock, Luminate, and Penske Logistics are using Snowflake Cortex AI to transform their businesses.

Snowflake Introduces Cortex AISQL and SnowConvert AI: Analytics Rebuilt for the AI Era

Snowflake announced major innovations that expand on Snowflake Cortex AI, Snowflake's suite of enterprise-grade AI capabilities, empowering global organisations to modernise their data analytics for today's AI landscape. This includes SnowConvert AI, an agentic automation solution that accelerates migrations from legacy platforms to Snowflake. With SnowConvert AI, data professionals can modernise their data infrastructure faster, more cost-effectively, and with less manual effort. Once data lands in Snowflake, Cortex AISQL (now in public preview) then brings generative AI directly into customers' query engines, enabling teams to extract insights across multi-modal data and build flexible AI pipelines using SQL — all while providing best-in-class performance and cost efficiency.

Snowflake Marketplace Adds Agentic Products and AI-Ready Data from Leading News, Research, and Market Data Providers

Snowflake announced new agentic products on Snowflake Marketplace that accelerate agentic AI adoption across the enterprise.
This includes Cortex Knowledge Extensions (generally available soon) on Snowflake Marketplace, which enables enterprises to enrich their AI apps and agents with proprietary unstructured data from third-party providers — all while allowing providers to protect their intellectual property and ensure proper attribution. Users can tap into a selection of business articles and content from The Associated Press, helping them further enhance the usefulness of results in their AI systems. In addition, Snowflake unveiled sharing of Semantic Models (now in private preview), which allows users to easily integrate AI-ready structured data within their Snowflake Cortex AI apps and agents — both from internal teams and third-party providers like CARTO, CB Insights, Cotality™ powered by Bobsled, Deutsche Börse, IPinfo, and truestar.

Learn More: Check out all the innovations and announcements coming out of Snowflake Summit 2025 on Snowflake's Newsroom. Stay on top of the latest news and announcements from Snowflake on LinkedIn and X, and follow along at #SnowflakeSummit.

About Snowflake

Snowflake is the platform for the AI era, making it easy for enterprises to innovate faster and get more value from data. More than 11,000 companies around the globe, including hundreds of the world's largest, use Snowflake's AI Data Cloud to build, use, and share data, apps and AI. With Snowflake, data and AI are transformative for everyone. Learn more at snowflake.com (NYSE: SNOW).
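As a hedged illustration of the SQL-embedded generative AI described in the Snowflake piece above, the sketch below runs the generally available SNOWFLAKE.CORTEX.COMPLETE function through the Snowflake Python connector; the newer Cortex AISQL functions in public preview may expose different names. The account, credentials, model choice, and reviews table are placeholders.

```python
# Sketch: calling a generative AI function from plain SQL in Snowflake via the
# Python connector. SNOWFLAKE.CORTEX.COMPLETE stands in for the newer Cortex
# AISQL functions described above; connection details and the table are
# placeholders, and model availability varies by region and account.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
)

query = """
    SELECT review_id,
           SNOWFLAKE.CORTEX.COMPLETE(
               'mistral-large',
               'Summarise this customer review in one sentence: ' || review_text
           ) AS summary
    FROM product_reviews
    LIMIT 10
"""

for review_id, summary in conn.cursor().execute(query):
    print(review_id, summary)
```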


Techday NZ – 5 days ago
Elastic & AWS partner to enable secure generative AI apps
Elastic has entered into a five-year strategic collaboration agreement with Amazon Web Services (AWS) to support organisations in building secure, generative AI-powered applications with greater speed and reduced complexity.

The agreement is focused on joint product integrations and go-to-market initiatives that aim to enable customers to transition into AI-native enterprises more efficiently. It brings together Elastic's Search AI Platform and AWS services, with a particular emphasis on facilitating work in highly regulated sectors such as the public sector and financial services.

Under this agreement, the companies will invest in technical integrations, including support for Amazon Bedrock and Elastic Cloud Serverless, to help customers drive AI innovation. The collaboration is designed to allow customers to leverage generative AI features by making use of high-performing foundation models available through Amazon Bedrock. It also offers support for migrating Elasticsearch workloads from on-premises data centres to Elastic Cloud on AWS, ongoing cost efficiencies for users of Elastic Cloud Serverless, and enhanced agentic AI capabilities through work on Model Context Protocol (MCP) and agent-to-agent interoperability.

Commenting on the collaboration, Ash Kulkarni, Chief Executive Officer at Elastic, said: "As the speed of generative AI adoption accelerates, search has become increasingly relevant. Our collaboration with AWS and integration with Amazon Bedrock brings the power of search directly to generative AI for a host of use cases, including cybersecurity and observability. Together, we're enabling developers to build intelligent, context-aware applications that leverage their own data securely and at scale."

Ruba Borno, Vice President, Specialists and Partners at AWS, said: "Together with Elastic, we're helping customers transform how they leverage data and AI to drive innovation. This strategic collaboration delivers particular value for highly regulated industries requiring robust data protection, while our shared commitment to standards like Model Context Protocols enables seamless agent-to-agent interactions. Available through AWS Marketplace, customers will be able to quickly deploy solutions that combine Elastic's powerful search capabilities with Amazon Bedrock on the secure, global AWS infrastructure, helping them build compliant, intelligent applications that accelerate their AI journey."

The collaboration is already producing results for organisations such as Generis and BigID. Mariusz Pala, Chief Technology Officer at Generis, said: "The strength of the Elastic and AWS partnership has been fundamental to Generis's mission of delivering secure, compliant, and intelligent solutions for clients in highly regulated industries. By deploying Elastic on AWS, we've reduced average search times by 1000% and cut the time to produce complex, compliance-driven documents from two weeks to just two days, providing our clients real-time insights while upholding the highest standards of data integrity and control."

Avior Malkukian, Head of DevOps at BigID, said: "Leveraging Elastic Cloud on AWS has been transformative for BigID. We've achieved a 120x acceleration in query performance, enabling real-time data insights that were previously unattainable. The scalability and flexibility of Elastic Cloud on AWS allow us to efficiently manage vast and complex data landscapes, ensuring our customers can swiftly discover and protect their sensitive information. Elastic Cloud on AWS is a powerful combination that allows us to deliver innovative features, reduce operational costs, and maintain our leadership in data security and compliance."

The integration of Elastic's AI-powered solutions with AWS services includes features such as Elastic AI Assistant, Attack Discovery, Automatic Import, Automatic Migration, Automatic Troubleshoot, and AI Playground, all of which interact with Large Language Models through Amazon Bedrock. These integrations help customers conduct root cause analysis more quickly, synthesise complex data signals, automate data onboarding, and simplify the migration process. Natural language and retrieval-augmented generation (RAG)-powered workflows are designed to enable teams to interact with their data more intuitively and support faster decision-making.

Elastic's relationship with AWS has been recognised within the AWS Partner Network. In December 2024, Elastic was named AWS Global Generative AI Infrastructure and Data Partner of the Year, and it was among the first group of AWS software partners acknowledged with the AWS Generative AI Competency. The company also received AWS competency designations for the government and education sectors earlier this year.
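For context, the sketch below shows a bare-bones retrieval-augmented generation flow of the kind described above: retrieve documents from an Elasticsearch index with the official Python client, then pass them as context to a model on Amazon Bedrock. The endpoint, index, field names, and query are hypothetical, and this is not a reproduction of Elastic AI Assistant or Elastic's integration code.

```python
# Bare-bones RAG sketch: retrieve context from Elasticsearch, then ask a model
# on Amazon Bedrock to answer using that context. Endpoint, index, fields, and
# credentials are hypothetical; this does not reproduce Elastic AI Assistant.
import boto3
from elasticsearch import Elasticsearch

es = Elasticsearch("https://my-elasticsearch-host:9243", api_key="***")  # placeholders
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

question = "Which hosts showed anomalous login activity last night?"

# 1) Retrieve the most relevant documents as grounding context.
hits = es.search(index="security-logs", query={"match": {"message": question}}, size=3)
context = "\n".join(hit["_source"]["message"] for hit in hits["hits"]["hits"])

# 2) Ask the model to answer using only the retrieved context.
answer = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model ID
    messages=[{
        "role": "user",
        "content": [{"text": f"Context:\n{context}\n\nQuestion: {question}"}],
    }],
)
print(answer["output"]["message"]["content"][0]["text"])
```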


Techday NZ – 29-05-2025
Kurrent unveils open-source MCP Server for AI-driven databases
Kurrent has released its open-source MCP Server for KurrentDB, enabling developers to interact with data in the KurrentDB database using natural language and AI agents rather than traditional coding methods.

The Kurrent MCP Server offers new functionalities, allowing developers not only to query data but also to create, test, and debug projections directly through conversational commands. This feature is not available in other MCP server implementations, establishing a novel approach to database interaction by integrating AI-driven workflows into the database layer.

Central to this release is the introduction of a self-correcting engine, which assists in automatically identifying and fixing logic errors during the prototyping phase. This reduces the need for manual debugging loops, streamlining the development process significantly for users building or modifying projections.

The software is fully open-source and released under the MIT license, with documentation and a development roadmap available on GitHub. This permits both enterprise users and open-source contributors to adopt, customise, and improve the KurrentDB MCP Server without licensing restrictions.

Kurrent MCP Server supports natural language prompts for tasks such as reading streams, listing streams within the database, building and updating projections, writing events to streams, and retrieving projection status for debugging. These capabilities aim to make the visual and analytical exploration of data more accessible and conversational for users with varying levels of technical expertise.

The MCP Server is compatible with a broad range of frontier AI models, such as Claude, GPT-4, and Gemini. It can be integrated with popular IDEs and agent frameworks, including Cursor and Windsurf. This compatibility enables developers to leverage their preferred tools while reducing friction points typically associated with traditional database interactions.

Addressing the new approach, Kirk Dunn, CEO of Kurrent, said, "Our new MCP Server makes it possible to use the main features of the KurrentDB database, like reading and writing events to streams and using projections, in a way that's as simple as having a conversation. The system's ability to test and fix itself reduces the need for debugging and increases reliability. Copilots and AI assistants become productive database partners rather than just code generators, seamlessly interfacing with KurrentDB."

The server's key functions are designed to reduce development times for database tasks, enabling a focus on higher-value project work. Eight core capabilities are available: Read_stream, List_streams, Build_projection, Create_projection, Update_projection, Test_projection, Write_events_to_stream, and Get_projections_status. Each of these responds directly to natural language instructions provided by the developer or AI agent.

Kurrent has highlighted opportunities for the open source community to participate in the MCP Server's ongoing development. Developers can contribute code, report or tackle issues, and suggest new features through the project's GitHub repository and discussion forums. Comprehensive educational resources and installation guides are intended to help developers quickly integrate the MCP Server with KurrentDB for various use cases.

Lokhesh Ujhoodha, Lead Architect at Kurrent, commented, "Before, database interactions required developers to master complex query languages, understand intricate data structures, and spend significant time debugging projections and data flows. Now, everything agentic can interface with KurrentDB through this MCP Server. We're not just connecting to today's AI tools, but we're positioning for a future where AI agents autonomously manage data workflows, make analytical decisions and create business insights with minimal human intervention."

Kurrent emphasises that its MCP Server aims to remove barriers historically associated with database development by supporting conversational, agent-driven workflows. This aligns with broader trends towards AI-native infrastructure in enterprise environments, where human and algorithmic agents increasingly collaborate to deliver data-driven business outcomes.
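As a loose, hypothetical illustration of the pattern described above (not Kurrent's actual implementation, whose source lives in its GitHub repository), the sketch below uses the official MCP Python SDK to expose a read_stream-style tool that an AI agent or MCP-capable IDE could call through natural language. The KurrentDB access is a placeholder stub, not a real client API.

```python
# Hypothetical sketch of an MCP tool in the style of Kurrent's Read_stream
# capability, built with the official MCP Python SDK (FastMCP). The actual
# Kurrent MCP Server implementation is on GitHub; the stream data returned
# below is a placeholder stub rather than a real KurrentDB client call.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kurrentdb-sketch")

@mcp.tool()
def read_stream(stream_name: str, max_events: int = 20) -> list[dict]:
    """Return the most recent events from a KurrentDB stream."""
    # Placeholder: a real server would query KurrentDB here.
    return [
        {"stream": stream_name, "event_number": i, "data": {}}
        for i in range(max_events)
    ]

if __name__ == "__main__":
    # Serve over stdio so an MCP-capable assistant, IDE, or agent framework
    # can discover the tool and invoke it from a conversational prompt.
    mcp.run()
```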