
Elastic & AWS partner to enable secure generative AI apps
Elastic has entered into a five-year strategic collaboration agreement with Amazon Web Services (AWS) to support organisations in building secure, generative AI-powered applications with greater speed and reduced complexity.
The agreement is focused on joint product integrations and go-to-market initiatives that aim to enable customers to transition into AI-native enterprises more efficiently. It brings together Elastic's Search AI Platform and AWS services, with a particular emphasis on facilitating work in highly regulated sectors such as the public sector and financial services.
Under this agreement, the companies will invest in technical integrations, including support for Amazon Bedrock and Elastic Cloud Serverless, to help customers drive AI innovation. The collaboration is designed to let customers build generative AI features using high-performing foundation models available through Amazon Bedrock. It also includes support for migrating Elasticsearch workloads from on-premises data centres to Elastic Cloud on AWS, ongoing cost efficiencies for users of Elastic Cloud Serverless, and enhanced agentic AI capabilities through work on the Model Context Protocol (MCP) and agent-to-agent interoperability.
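To make the Bedrock integration concrete, here is a minimal sketch of how an application might invoke a foundation model through Amazon Bedrock's Converse API using boto3. This is not code from Elastic or AWS's announcement; the model ID, region, and inference settings are illustrative assumptions.

```python
# Sketch: single-turn call to an Amazon Bedrock foundation model via boto3.
# Model ID, region, and inference parameters are illustrative assumptions.

def build_converse_request(user_text: str) -> dict:
    """Build the message payload expected by the Bedrock Converse API."""
    return {
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }


def ask_bedrock(user_text: str,
                model_id: str = "anthropic.claude-3-haiku-20240307-v1:0") -> str:
    """Send a request to Bedrock and return the model's text reply.

    Requires AWS credentials with permission to invoke the chosen model.
    """
    import boto3  # imported lazily so the payload helper works without the SDK

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    request = build_converse_request(user_text)
    response = client.converse(modelId=model_id, **request)
    return response["output"]["message"]["content"][0]["text"]
```

In practice, an Elastic integration would route such calls through its own connectors rather than raw boto3, but the request/response shape above is the underlying Bedrock API surface.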
Commenting on the collaboration, Ash Kulkarni, Chief Executive Officer at Elastic, said: "As the speed of generative AI adoption accelerates, search has become increasingly relevant. Our collaboration with AWS and integration with Amazon Bedrock brings the power of search directly to generative AI for a host of use cases, including cybersecurity and observability. Together, we're enabling developers to build intelligent, context-aware applications that leverage their own data securely and at scale."
Ruba Borno, Vice President, Specialists and Partners at AWS, said: "Together with Elastic, we're helping customers transform how they leverage data and AI to drive innovation. This strategic collaboration delivers particular value for highly regulated industries requiring robust data protection, while our shared commitment to standards like Model Context Protocols enables seamless agent-to-agent interactions. Available through AWS Marketplace, customers will be able to quickly deploy solutions that combine Elastic's powerful search capabilities with Amazon Bedrock on the secure, global AWS infrastructure, helping them build compliant, intelligent applications that accelerate their AI journey."
The collaboration is already producing results for organisations such as Generis and BigID. Mariusz Pala, Chief Technology Officer at Generis, said: "The strength of the Elastic and AWS partnership has been fundamental to Generis's mission of delivering secure, compliant, and intelligent solutions for clients in highly regulated industries. By deploying Elastic on AWS, we've reduced average search times by 1000% and cut the time to produce complex, compliance-driven documents from two weeks to just two days, providing our clients real-time insights while upholding the highest standards of data integrity and control."
Avior Malkukian, Head of DevOps at BigID, said: "Leveraging Elastic Cloud on AWS has been transformative for BigID. We've achieved a 120x acceleration in query performance, enabling real-time data insights that were previously unattainable. The scalability and flexibility of Elastic Cloud on AWS allow us to efficiently manage vast and complex data landscapes, ensuring our customers can swiftly discover and protect their sensitive information. Elastic Cloud on AWS is a powerful combination that allows us to deliver innovative features, reduce operational costs, and maintain our leadership in data security and compliance."
The integration of Elastic's AI-powered solutions with AWS services includes features such as Elastic AI Assistant, Attack Discovery, Automatic Import, Automatic Migration, Automatic Troubleshoot, and AI Playground, all of which interact with Large Language Models through Amazon Bedrock. These integrations help customers to conduct root cause analysis more quickly, synthesise complex data signals, automate data onboarding, and simplify the migration process. Natural language and retrieval-augmented generation (RAG)-powered workflows are designed to enable teams to interact with their data more intuitively and support faster decision-making.
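As a rough illustration of the RAG-powered workflow pattern described above (not Elastic's actual implementation), the flow is: retrieve relevant documents from an Elasticsearch index, then ground the LLM prompt in that context before sending it to a model via Bedrock. Index and field names below are hypothetical.

```python
# Sketch of a retrieval-augmented generation (RAG) flow: retrieve context from
# Elasticsearch, then build a grounded prompt for an LLM. Index name, field
# names, and the cluster URL are placeholder assumptions.

def build_rag_prompt(question: str, hits: list) -> str:
    """Assemble a prompt that grounds the model in retrieved documents."""
    context = "\n\n".join(
        f"[{i + 1}] {hit['_source']['content']}" for i, hit in enumerate(hits)
    )
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )


def retrieve(question: str, index: str = "docs", size: int = 3) -> list:
    """Fetch the top matching documents from an Elasticsearch index.

    Requires a reachable cluster; the URL below is a placeholder.
    """
    from elasticsearch import Elasticsearch  # lazy import: sketch only

    es = Elasticsearch("https://localhost:9200")
    resp = es.search(index=index,
                     query={"match": {"content": question}},
                     size=size)
    return resp["hits"]["hits"]
```

The grounded prompt would then be passed to a Bedrock-hosted model; keeping retrieval and prompt assembly separate is what lets teams query their own data "securely and at scale", since only the retrieved snippets, not the whole corpus, reach the model.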
Elastic's relationship with AWS has been recognised within the AWS Partner Network. In December 2024, Elastic was named AWS Global Generative AI Infrastructure and Data Partner of the Year, and it was among the first group of AWS software partners to receive the AWS Generative AI Competency. Earlier this year, the company also received AWS competency designations for the government and education sectors.