
Red Hat & AMD extend AI & virtualisation for hybrid cloud
Red Hat and AMD have announced a strategic collaboration aimed at expanding customer options for artificial intelligence (AI) and virtualisation across hybrid cloud environments.
The partnership integrates Red Hat's open-source expertise with AMD's high-performance computing hardware. The companies state that the collaboration will allow organisations to deploy optimised, production-ready environments for AI-enabled workloads, while also providing tools to modernise traditional virtual machines (VMs) efficiently.
A core component of the collaboration involves full enablement of AMD Instinct GPUs on Red Hat OpenShift AI. This means that customers deploying AI across hybrid cloud environments can take advantage of AMD's processing power without requiring extensive resources.
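The announcement does not include deployment specifics, but as a minimal, hypothetical sketch of what consuming an AMD accelerator on OpenShift can look like: the AMD GPU device plugin typically exposes Instinct GPUs to pods as the extended resource amd.com/gpu, which a workload requests like any other resource. The container image, namespace, and pod name below are placeholders, not details from the announcement.

    # Hypothetical sketch: request a single AMD Instinct GPU for an inference pod
    # using the Kubernetes Python client. The amd.com/gpu resource name assumes
    # the AMD GPU device plugin/operator is installed; the image and namespace
    # are illustrative placeholders.
    from kubernetes import client, config

    config.load_kube_config()  # use load_incluster_config() when running in-cluster

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="mi300x-inference-demo"),
        spec=client.V1PodSpec(
            restart_policy="Never",
            containers=[
                client.V1Container(
                    name="inference",
                    image="quay.io/example/vllm-rocm:latest",  # placeholder image
                    resources=client.V1ResourceRequirements(
                        limits={"amd.com/gpu": "1"}  # one AMD Instinct GPU
                    ),
                )
            ],
        ),
    )

    client.CoreV1Api().create_namespaced_pod(namespace="demo", body=pod)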
AMD Instinct MI300X GPUs, paired with Red Hat Enterprise Linux AI, have demonstrated success in AI inferencing for both small and large language models (SLMs and LLMs). Testing conducted on Microsoft Azure ND MI300X v5 virtual machines showed that these models could be deployed across multiple GPUs within a single VM, an approach that may reduce the need to split deployments across multiple VMs and potentially lower the associated performance and cost overheads.
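As a rough illustration of the single-VM, multi-GPU pattern described above, a vLLM deployment can shard one large model across the GPUs visible inside that VM using tensor parallelism. The model name and GPU count below are assumptions for the sketch, not figures from the testing.

    # Minimal sketch: serve one large model across several GPUs inside a single VM
    # using vLLM tensor parallelism. Model and GPU count are illustrative only.
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="meta-llama/Llama-3.1-70B-Instruct",  # placeholder model
        tensor_parallel_size=8,  # shard weights across 8 GPUs in the same VM
    )

    params = SamplingParams(temperature=0.7, max_tokens=128)
    outputs = llm.generate(
        ["Summarise the benefits of single-VM multi-GPU inference."], params
    )
    print(outputs[0].outputs[0].text)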
Red Hat and AMD are also collaborating in the upstream vLLM community, an effort aimed at delivering increased inference performance and improved support for multi-GPU workloads on AMD hardware. The collaboration includes technical contributions such as upstreaming the AMD kernel library, optimising the Triton kernels, and adding FP8 support, allowing vLLM workloads to run faster and more efficiently on AMD Instinct MI300X GPUs.
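For context, FP8 is exposed in vLLM as a quantisation option. A minimal sketch follows; the model name is a placeholder, and whether FP8 is available for a given model and MI300X system depends on the installed vLLM and ROCm versions.

    # Sketch: enable FP8 quantisation in vLLM. Actual support depends on the
    # vLLM/ROCm build and the chosen model; the model name is a placeholder.
    from vllm import LLM

    llm = LLM(
        model="mistralai/Mistral-7B-Instruct-v0.3",  # placeholder model
        quantization="fp8",  # use FP8 kernels where available
    )
    print(llm.generate(["Hello from an FP8 sketch."])[0].outputs[0].text)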
Enhanced collective communication and workload optimisation for multi-GPU environments are also being targeted, which could offer better scalability and improved energy efficiency for AI deployments that depend on distributed computing. Collaborative efforts between Red Hat, AMD, and other industry contributors such as IBM are intended to accelerate development within the vLLM project and benefit users relying on AMD hardware for AI inference and training.
The companies indicate that AMD Instinct GPUs will provide out-of-the-box support for Red Hat AI Inference Server, a distribution of vLLM designed for enterprise-grade requirements. As the primary commercial contributor to the vLLM project, Red Hat emphasises its commitment to compatibility across various hardware platforms, including those from AMD.
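Because Red Hat AI Inference Server is described as a distribution of vLLM, and vLLM serves an OpenAI-compatible HTTP API, a client-side sketch might look like the following; the endpoint URL, API key, and model name are placeholders rather than values from the announcement.

    # Sketch: call a vLLM-based inference endpoint (e.g. Red Hat AI Inference
    # Server) through its OpenAI-compatible API. URL, key, and model name are
    # placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://inference.example.com/v1",  # placeholder endpoint
        api_key="placeholder-token",
    )

    resp = client.chat.completions.create(
        model="granite-3-8b-instruct",  # placeholder model name
        messages=[{"role": "user", "content": "What does tensor parallelism do?"}],
    )
    print(resp.choices[0].message.content)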
The collaboration also extends to AMD EPYC CPUs, which have been validated for Red Hat OpenShift Virtualization. This capability supports the running of VM workloads in an environment that aims to unify VMs and containerised applications, whether on-premises, in public clouds, or across a hybrid cloud setup.
Validation of Red Hat OpenShift Virtualization on AMD EPYC processors lets organisations leverage the CPUs' performance and power-efficiency characteristics, with support for leading server platforms including Dell PowerEdge, HPE ProLiant, and Lenovo ThinkSystem. The aim is to achieve higher infrastructure consolidation ratios, which could reduce the total cost of ownership (TCO) for hardware, software licensing, and energy.
Ashesh Badani, Senior Vice President and Chief Product Officer at Red Hat, said: "Fully realizing the benefits of AI means that organizations must have the choice and flexibility to optimize their IT footprint for the rigors of scaling demand. Our extended collaboration with AMD expands the spectrum of options for organizations seeking to ready their IT environments for an ever-evolving future, from modernizing existing investments on a high-performing CPU architecture and virtualization platform to preparing for production AI with next-generation hardware accelerators and open source AI technologies."
Philip Guido, Executive Vice President and Chief Commercial Officer at AMD, said: "As enterprise customer workloads grow more diverse and demanding, they require solutions that can scale. By combining Red Hat's industry-leading open source platforms with world-class AMD Instinct GPUs and AMD EPYC CPUs, we're delivering the performance and efficiency customers demand to accelerate AI, virtualization and hybrid-cloud innovation."
The companies are aiming to provide customers with a comprehensive platform capable of supporting both traditional workloads and demanding AI applications across the hybrid cloud.