Technical.ly

May 15, 2025


Quantum computing is still in its infancy, but researchers have high hopes

While AI has been getting most of the attention as a world-changing technology lately, there's another technology on the horizon with the potential to reshape the world: quantum computing.

Explaining what quantum computing is and what it will do is complicated, because to really understand it, you need to understand both how classical, or binary, computing works and what its limits are. Quantum physicist Shohini Ghose made this analogy: If binary computing is a candle, quantum computing is an electric light bulb; you can make the highest quality candle in the world, and it won't ever be able to do what a light bulb does.

'Quantum computing runs on linear algebra, just like quantum mechanics runs on linear algebra,' said Gushu Li of QUIEST — The Penn Center for Quantum Information, Engineering, Science and Technology — during his talk on quantum computing at the 2025 Developers Conference last week. 'In contrast, classical computing runs on Boolean algebra.'

In the simplest terms, quantum computing is computing in many dimensions at once instead of in just two. It's unimaginably fast: think about how quickly ChatGPT can create a spreadsheet, then imagine the power generative AI would have combined with a technology that can solve in five minutes a calculation that would take today's supercomputers literally millions of years. The science is so vast and hard to grasp that quantum mechanics has been cited in real-life scientific theories of multiple timelines.

The reality, Li said, is that while we won't be using quantum laptops any time soon, quantum computing will likely make breakthroughs in the not-so-distant future. Some of those breakthroughs may be in:

  • Medicine, as quantum computers help design new drugs faster
  • Materials, leading to better batteries or superconductors
  • Security, as today's encryption methods become obsolete

Here are five takeaways from Li's talk.

Quantum computing is rooted in the laws of physics, not classical logic

While classical computers rely on Boolean logic and binary bits (0s and 1s), quantum computing leverages the principles of quantum mechanics, particularly superposition and entanglement, to process information using qubits — the quantum analogue of the binary bit. Unlike classical bits, Li said, qubits can exist in multiple states simultaneously, creating a vast computational space.

The rise of quantum computing marks the second quantum revolution

The first quantum revolution, in the early 20th century, led to technologies like the transistor and integrated circuits, foundational to today's electronics. The second revolution, Li said, focuses on using quantum mechanics directly for computation, opening new possibilities in algorithm design and computational power.

Shor's algorithm sparked major interest in quantum potential

In 1994, MIT professor Peter Shor introduced an algorithm that could factor large numbers exponentially faster than classical algorithms. Li said this posed a serious challenge to the encryption methods of the time and demonstrated a concrete use case, spurring investment and interest in quantum computing.
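The power of Shor's algorithm comes down to a single step: finding the 'order' of a randomly chosen number modulo N, which a quantum computer can do exponentially faster than any known classical method. As a rough illustration only (not code from Li's talk, and with the quantum step replaced by a brute-force loop), the classical skeleton of Shor's reduction from factoring to order finding looks something like this in Python:

```python
import math
import random

def find_order(a, n):
    # Smallest r > 0 with a**r % n == 1. This brute-force loop is
    # the step Shor's algorithm replaces with an exponentially
    # faster quantum subroutine.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    # Classical skeleton of Shor's reduction for an odd composite n
    # (not a prime power): turn factoring into order finding.
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                 # lucky: a already shares a factor
        r = find_order(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            f = math.gcd(pow(a, r // 2, n) - 1, n)
            if 1 < f < n:
                return f             # otherwise retry with a new a

print(shor_factor(15))  # prints 3 or 5
```

Everything above runs in exponential time, because find_order tries each power in turn; Shor's insight was that the order can be found efficiently on quantum hardware, while the remaining gcd arithmetic is easy classically.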
Hardware and ecosystem development are rapid but fragmented

Multiple quantum technologies are being explored, including superconducting circuits, ion traps, which confine charged atoms in electromagnetic fields, and photonic systems, which manipulate light, with both startups and big tech companies like IBM, Google and Intel contributing to hardware development. Li said that IBM's release of a cloud-accessible quantum chip in 2016 was a pivotal moment, democratizing access to quantum hardware.

The quantum software stack and workflow are still in their infancy

Programming a quantum computer involves expressing algorithms as quantum circuits — a visual, linear algebra-based representation, Li said (a minimal code example appears at the end of this article). New tools are required for programming, compiling and interpreting quantum computing results. Unlike in classical systems, reading or measuring a qubit alters its state, making quantum computing inherently non-intuitive.

'Different technologies are trying to build better quantum devices,' Li said. 'Many companies are joining this field, and if you look at their roadmaps, they usually claim that by the end of 2030, they are expecting something around 1 million qubits.'
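To make the circuit picture concrete, here is a minimal sketch, assuming IBM's open-source Qiskit library (the Python toolkit that accompanied the cloud access mentioned above) rather than anything shown in Li's talk. It builds a two-qubit circuit using exactly the two ingredients from the first takeaway, superposition and entanglement:

```python
# Minimal two-qubit circuit in Qiskit (pip install qiskit).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard gate puts qubit 0 into superposition
qc.cx(0, 1)  # CNOT gate entangles qubit 1 with qubit 0

print(qc.draw())                         # ASCII picture of the circuit
print(Statevector.from_instruction(qc))  # amplitudes of the entangled state
```

Inspecting the statevector like this only works in simulation, and only because nothing was measured: adding a measurement would collapse the state to a single classical outcome, the non-intuitive behavior Li described.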
