
Capgemini expands partnership with Mistral AI and SAP for secure generative AI
Capgemini has announced the expansion of its strategic partnership with Mistral AI and SAP to provide secure generative AI-powered solutions for organisations in regulated industries.
The collaboration is intended to support sectors such as financial services, the public sector, aerospace and defence, and energy and utilities, all of which face especially stringent data requirements. By combining Mistral AI's generative AI models with SAP's Business Technology Platform (BTP), Capgemini plans to develop accessible business AI use cases designed to minimise environmental impact through a lower carbon footprint.
Custom business AI solutions will be made available within SAP through a library of more than 50 pre-built use cases. These use cases draw on models from Mistral AI and have been validated by SAP, covering a range of industry-specific and process-driven needs. The solutions are designed to comply with applicable regulations, incorporating responsible and ethical AI principles, built-in governance, and a focus on data security.
Examples of use cases provided by the collaboration include augmented field workers in aerospace and defence to help resolve non-conformities in operations, drone-based inspection for predictive maintenance in energy and utilities, and intelligent indirect purchasing processes applicable across a variety of industries to streamline supplier selection.
Capgemini stated that the collaboration aims to help organisations accelerate the deployment of customised generative AI solutions within SAP, particularly those that require secure environments to meet regulatory or privacy obligations.
Marjorie Janiewicz, Mistral AI Executive Board member and Global Head of Revenue, said, "This new collaboration between Capgemini, Mistral AI and SAP unlocks new high-value business use cases for organizations seeking to augment their operations with generative AI capabilities. By combining our frontier, multilingual and highly customizable AI models with Capgemini's expertise in delivering real world industry-specific generative AI solutions, and the assurance of SAP's robust technology platform, we are making the effective integration of AI more accessible for all organizations, including those in highly regulated industries."
Fernando Alvarez, Chief Strategy and Development Officer and Group Executive Board member at Capgemini, added, "Enterprises are increasingly turning to generative AI to enhance their resilience, streamline operations and accelerate time to value. As a trusted business and technology transformation partner to our clients, Capgemini is committed to helping them evolve their critical business processes through the secure and tailored application of AI. Together with Mistral AI and SAP, we can empower organizations to access a broad range of innovative and customized AI models, to drive significant business value and foster sustainable growth."
Thomas Saueressig, Member of the Executive Board of SAP SE, Customer Services and Delivery, commented, "The collaboration is a powerful example of how we are enabling enterprises to leverage the power of generative AI to address their most critical business challenges. With SAP Business Technology Platform as a secure and scalable foundation, we're enabling organizations, especially those in regulated industries, to adopt AI with confidence, trust, and speed in a way that delivers real business value."
Capgemini has also worked closely with SAP to expand its dedicated Global SAP Center of Excellence, which helps organisations tackle critical business challenges with generative AI. The partners have previously collaborated with Brose, an automotive supplier, to deliver an AI-powered assistant for suppliers known as SupplierGPT. This platform, built on SAP BTP, has enhanced efficiency within Brose's supplier network by improving onboarding and standardising processes.
Michael Seifert, Business Product Owner Brose Supplier Portal, Brose Fahrzeugteile SE & Co. KG, remarked, "Together with Capgemini, we were able to implement SupplierGPT, from idea to reality within a few weeks. This solution enables the seamless integration of new innovations and supports rapid go-to-market, thanks to the AI services in SAP BTP. This co-innovation model combines the expertise of Capgemini, Brose and SAP to allow joint pilots to be designed, implemented, and tested quickly."
Capgemini's efforts in this space have been recognised with the 2025 SAP Pinnacle Award for Business AI Innovation in the Customer AI use case category. The award highlights the performance and achievements of SAP's partners around the world who demonstrate capability in delivering AI-powered solutions.