
Amazon Unveils Quantum Computing Chip Ocelot
Amazon Web Services (AWS) has announced Ocelot, a new quantum computing chip that can reduce the costs of implementing quantum error correction by up to 90% compared to current approaches. Developed by the team at the AWS Center for Quantum Computing at the California Institute of Technology, Ocelot represents a breakthrough in the effort to build fault-tolerant quantum computers capable of solving problems of commercial and scientific importance that are beyond the reach of today's conventional computers.
AWS used a novel design for Ocelot's architecture, building error correction in from the ground up around the 'cat qubit'. Cat qubits, named after the famous Schrödinger's cat thought experiment, intrinsically suppress certain forms of errors, reducing the resources required for quantum error correction. Through this new approach with Ocelot, AWS researchers have, for the first time, combined cat qubit technology and additional quantum error correction components onto a microchip that can be manufactured in a scalable fashion using processes borrowed from the microelectronics industry.
History shows that important advancements in computing have been made by fundamentally rethinking hardware components, as this can have a significant impact on cost, performance, and even the feasibility of a new technology. The computer revolution truly took off when the transistor replaced the vacuum tube, enabling room-sized computers to be shrunk down into today's compact and much more powerful, reliable, and lower-cost laptops. Choosing the right building block to scale is critical, and today's announcement represents an important step in developing efficient means of scaling up to practical, fault-tolerant quantum computers.
'With the recent advancements in quantum research, it is no longer a matter of if, but when practical, fault-tolerant quantum computers will be available for real-world applications. Ocelot is an important step on that journey,' said Oskar Painter, AWS director of Quantum Hardware. 'In the future, quantum chips built according to the Ocelot architecture could cost as little as one-fifth of current approaches, due to the drastically reduced number of resources required for error correction. Concretely, we believe this will accelerate our timeline to a practical quantum computer by up to five years.'
AWS researchers have published their findings in a peer-reviewed research paper in Nature.
The major challenge with quantum computing:
One of the biggest challenges with quantum computers is that they're incredibly sensitive to the smallest changes, or 'noise', in their environment. Vibrations, heat, electromagnetic interference from cell phones and Wi-Fi networks, or even cosmic rays and radiation from outer space can all knock qubits out of their quantum state, causing errors in the quantum computation being performed. This has historically made it extremely challenging to build quantum computers that can perform reliable, error-free calculations of any significant complexity. 'The biggest challenge isn't just building more qubits,' said Painter. 'It's making them work reliably.'
To solve this problem, quantum computers rely on quantum error correction that uses special encodings of quantum information across multiple qubits—in the form of 'logical' qubits—to shield quantum information from the environment. This also enables the detection and correction of errors as they occur. Unfortunately, given the sheer number of qubits required to get accurate results, current approaches to quantum error correction have come at a huge, and therefore prohibitive, cost.
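To see why redundancy helps, consider the simplest classical analogue of error correction: a three-bit repetition code, in which one 'logical' bit is spread across three 'physical' bits and a majority vote removes any single flip. The Python sketch below only illustrates that redundancy-and-voting idea; it is not the code AWS uses, and real quantum codes must also handle phase errors and cannot simply copy quantum states.

```python
# Classical analogy for error correction: a 3-bit repetition code.
# One "logical" bit is encoded across three "physical" bits, and a
# majority vote detects and corrects any single bit flip caused by noise.
import random

def encode(logical_bit: int) -> list[int]:
    """Spread one logical bit across three physical bits."""
    return [logical_bit] * 3

def apply_noise(bits: list[int], flip_probability: float) -> list[int]:
    """Flip each physical bit independently with the given probability."""
    return [bit ^ (random.random() < flip_probability) for bit in bits]

def decode(bits: list[int]) -> int:
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return int(sum(bits) >= 2)

trials = 100_000
flip_probability = 0.05

uncorrected = sum(random.random() < flip_probability for _ in range(trials))
corrected = sum(decode(apply_noise(encode(0), flip_probability)) != 0 for _ in range(trials))

print(f"error rate without encoding: {uncorrected / trials:.4f}")
print(f"error rate with 3-bit code:  {corrected / trials:.4f}")
```

Running the sketch shows the encoded error rate dropping well below the raw flip probability, which is the same trade quantum error correction makes: more physical resources in exchange for a more reliable logical unit.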
A new approach to quantum error correction:
To address the current problems associated with quantum error correction, researchers at AWS developed Ocelot. Ocelot was designed from the ground up with error correction 'built in.' 'We looked at how others were approaching quantum error correction and decided to take a different path,' said Painter. 'We didn't take an existing architecture and then try to incorporate error correction afterwards. We selected our qubit and architecture with quantum error correction as the top requirement. We believe that if we're going to make practical quantum computers, quantum error correction needs to come first.' In fact, according to Painter, his team estimates that scaling Ocelot to a 'fully-fledged quantum computer capable of transformative societal impact would require as little as one-tenth of the resources associated with standard quantum error correcting approaches.'
One way to think about quantum error correction is in the context of quality control in manufacturing, and the difference between needing one inspection point to catch all defects instead of 10. In other words, it offers the same result, but with fewer resources and an overall improved manufacturing process. By reducing the resources needed through approaches such as Ocelot's, quantum computers can be built smaller, more reliably, and at lower cost. All of this accelerates the path to applying quantum computing to future real-world applications, such as faster drug discovery and development, the production of new materials, more accurate predictions about risk and investment strategies in financial markets, and many more.
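As a rough back-of-the-envelope illustration of what a 90% reduction in error-correction overhead means, the sketch below compares physical-qubit counts under an assumed baseline; the baseline overhead and machine size are illustrative assumptions for the arithmetic, not figures from AWS.

```python
# Illustrative arithmetic only: the 90% reduction comes from the
# announcement, but the baseline overhead and machine size are assumptions.
BASELINE_PHYSICAL_PER_LOGICAL = 1_000   # assumed overhead of a standard approach
REDUCTION = 0.90                        # "up to 90%" from the announcement
LOGICAL_QUBITS_NEEDED = 100             # assumed size of a useful machine

standard_total = LOGICAL_QUBITS_NEEDED * BASELINE_PHYSICAL_PER_LOGICAL
cat_qubit_total = int(standard_total * (1 - REDUCTION))

print(f"standard approach:  {standard_total:,} physical qubits")   # 100,000
print(f"cat-qubit approach: {cat_qubit_total:,} physical qubits")  # 10,000
```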
Making science fiction science fact:
While today's announcement is a promising start, Ocelot is still a prototype, and AWS is committed to continuing to invest in quantum research and refining its approach. Just as it took many years of development, and lessons learned from running x86 systems (a widely used computer architecture for central processing units) reliably and securely at scale, to build Graviton into one of the leading chips in the cloud, AWS is taking a similar long-term approach to quantum computing. 'We're just getting started and we believe we have several more stages of scaling to go through,' said Painter. 'It's a very tough problem to tackle, and we will need to continue investing in basic research, while staying connected to, and learning from, important work being done in academia. Right now, our task is to keep innovating across the quantum computing stack, to keep examining whether we're using the right architecture, and to incorporate these learnings into our engineering efforts. It's a flywheel of continuous improvement and scaling.'
How to get started with quantum computing:
Customers can get started exploring quantum computing today with Amazon Braket on AWS. Amazon Braket is a fully managed quantum computing service that allows scientists, developers, and students to work with a range of third-party quantum computing hardware, high-performance simulators, and a suite of software tools that make it easy to get started in quantum computing.
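As a starting point, the snippet below sketches a minimal first program with the Amazon Braket Python SDK, running a two-qubit Bell-state circuit on the free local simulator; targeting third-party hardware instead would additionally require an AWS account and the chosen device's ARN.

```python
# Minimal sketch of getting started with the Amazon Braket SDK
# (pip install amazon-braket-sdk). Runs on the free local simulator.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# Build a Bell state: Hadamard on qubit 0, then CNOT from qubit 0 to qubit 1.
bell = Circuit().h(0).cnot(0, 1)

device = LocalSimulator()
result = device.run(bell, shots=1000).result()

# Measurements should split roughly evenly between '00' and '11'.
print(result.measurement_counts)
```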
Ocelot: Fast facts
Ocelot is a prototype quantum computing chip, designed to test the effectiveness of AWS's quantum error correction architecture.
It consists of two integrated silicon microchips. Each chip has an area of roughly 1 cm². They are bonded one on top of the other in an electrically connected chip stack.
On the surface of each silicon microchip are thin layers of superconducting materials that form the quantum circuit elements.
The Ocelot chip is composed of 14 core components: five data qubits (the cat qubits), five 'buffer circuits' for stabilizing the data qubits, and four additional qubits for detecting errors on the data qubits.
The cat qubits store the quantum states used for computation. To do so, they rely on components called oscillators, which generate a repetitive electrical signal with steady timing.
Ocelot's high-quality oscillators are made from a thin film of a superconducting material called tantalum. AWS materials scientists have developed a specific way of processing tantalum on the silicon chip to boost oscillator performance.
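Taken together, the fast facts above describe a 14-component layout. The small sketch below simply restates those counts as a data structure for readers who prefer code; the class and field names are this article's own labels, not AWS's.

```python
# Summary of Ocelot's 14 core components as listed in the fast facts above.
# The dataclass and field names are illustrative labels, not AWS's naming.
from dataclasses import dataclass

@dataclass
class OcelotLayout:
    cat_data_qubits: int         # store the quantum states used for computation
    buffer_circuits: int         # stabilize the cat data qubits
    error_detection_qubits: int  # detect errors on the data qubits

    @property
    def total_components(self) -> int:
        return self.cat_data_qubits + self.buffer_circuits + self.error_detection_qubits

ocelot = OcelotLayout(cat_data_qubits=5, buffer_circuits=5, error_detection_qubits=4)
print(f"total core components: {ocelot.total_components}")  # 14
```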
How do quantum computers work?
Quantum computers have the potential to drive major advances in society and technology, from cryptography to engineering novel materials. The main difference between the conventional, or 'classical', computers we use today and quantum computers is that classical computers use bits, usually represented as a digital value of 1 or 0, as their most basic unit of information. Quantum computers instead use quantum bits, or 'qubits', usually elementary particles such as electrons or photons, to make calculations. Scientists can apply precisely timed and tuned electromagnetic pulses to manipulate what's called the 'quantum state' of the qubit, where it can be both 1 and 0 at the same time. This mind-bending behavior, when performed across many qubits, allows a quantum computer to solve some important problems exponentially faster than a classical computer ever could.
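For readers who want to see the 'both 1 and 0' idea concretely, the sketch below represents a single qubit's state as a two-amplitude vector and computes its measurement probabilities; it is a plain linear-algebra illustration, not a simulation of real quantum hardware.

```python
# A single qubit's "quantum state" as a two-amplitude vector. An equal
# superposition of 0 and 1 gives a 50/50 chance of either outcome when
# measured; probabilities are the squared magnitudes of the amplitudes.
import numpy as np

ket_0 = np.array([1.0, 0.0])   # state |0>
ket_1 = np.array([0.0, 1.0])   # state |1>

# Equal superposition (the state a Hadamard gate produces from |0>).
superposition = (ket_0 + ket_1) / np.sqrt(2)

probabilities = np.abs(superposition) ** 2
print(f"P(0) = {probabilities[0]:.2f}, P(1) = {probabilities[1]:.2f}")  # 0.50, 0.50
```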