Latest news with #LuigiGalvani

RTÉ News
07-08-2025
- Science
The rise, fall and renaissance of electrochemistry
Analysis: The once unfashionable science which began with a dissected frog is now behind a vast array of indispensable modern innovations

Once seen as a dusty branch of chemistry confined to old textbooks full of mathematical equations, electrochemistry is now at the forefront of some of the world's most exciting technologies. In research labs around the world, electrochemists are quietly driving some of the biggest scientific breakthroughs of our time. But despite its growing influence, most people still don't know what electrochemistry is, or why it's suddenly at the centre of clean-tech innovation that can help solve anything from clean water access to climate change.

So, what is electrochemistry exactly? As the name suggests, it's where chemical reactions and electricity meet. We can use electricity to drive chemical reactions or, conversely, we can use chemical reactions to produce electricity.

From RTÉ Brainstorm, what's going to happen to used electric car batteries?

It's the science behind the batteries that power your phone, watch or car. It's how gold gets plated onto your jewellery and how an ECG can measure your heart function. Even the electricity that powers your TV or computer is rooted in electrochemical processes, often driven by the combustion of fossil fuels. In other words, electrochemistry is everywhere; you just might not think about it.

It also has a rather odd origin story involving frogs. In the late 1700s, the Italian scientist Luigi Galvani was examining a dissected frog and noticed that its leg twitched whenever it came into contact with metal. He thought the frog had its own built-in electricity. It didn't, but the idea stuck, and Galvani had unknowingly stumbled across the principles of electrochemistry. A few years later, Alessandro Volta realised that it was the reaction between two different metals that caused the spark, and he went on to invent the first true battery in 1800. Within years, we'd learned how to split water into hydrogen and oxygen and how to coat metal surfaces in gold or nickel. By the mid-1800s, the first rechargeable battery was born, the same kind still found under millions of car bonnets today.

From History of Simple Things, how do rechargeable batteries work?

Electrochemistry subsequently went relatively quiet, overshadowed by other major scientific advancements during the 20th century. It never quite disappeared, though, and could be found quietly powering metal plating factories and corrosion testing labs. It took a backseat as more on-trend fields such as organic chemistry, nuclear physics and molecular biology took attention and resources. Electrochemistry was considered useful, though hardly exciting.

But it has re-emerged recently as central to some of the world's fastest-growing technologies, from electric vehicles to wastewater treatment. In med-tech, it underpins biosensors that allow people with diabetes to check their blood sugar in real time, along with wearable implants and patches that can monitor everything from heart rate to stress levels. These tiny devices can detect glucose, cholesterol and even early markers of cancer in blood, sweat or saliva. Electrochemistry is also being used in smart drug delivery systems that release medication in precise doses inside the body. As these technologies shrink and become more affordable, they're bringing personalised, preventative healthcare within reach for millions.
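As a concrete example of the biosensing described above, the classic first-generation glucose sensor couples an enzyme reaction to an electrode. The scheme below is the standard textbook glucose-oxidase chemistry, given as a simplified sketch rather than any particular device's design:

\[
\text{glucose} + \mathrm{O_2} \xrightarrow{\text{glucose oxidase}} \text{gluconolactone} + \mathrm{H_2O_2}
\]
\[
\mathrm{H_2O_2} \longrightarrow \mathrm{O_2} + 2\,\mathrm{H^+} + 2\,e^- \quad \text{(oxidation at the electrode)}
\]

The current carried by those liberated electrons is, to a good approximation, proportional to the glucose concentration, and that current is what the meter converts into a blood sugar reading.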
From Inside Science, new smart material could automatically deliver your medication inside your body without a reminder

As if powering a medical revolution wasn't enough, electrochemical technologies are also crucial to mitigating climate change. The concept of an "ecological footprint" measures how much nature we use compared to how much the planet can regenerate. Back in 2012, the WWF's Living Planet Report warned that we would need the resources of two planet Earths by 2030 if global consumption and emissions continued on a "business as usual" path. More than a decade on, this warning remains just as urgent: recent data show we are still on track for ecological overshoot, making technological intervention more critical than ever.

Right now, most of our energy still comes from fossil fuels, which release carbon dioxide when burned. Electrochemistry offers a cleaner alternative. We can now use renewable electricity from wind or solar to split water into hydrogen and oxygen, a process known as electrolysis. That hydrogen can then be used as a clean fuel, or stored and used later to generate electricity on demand to power cars, buses and trains.

Electrochemists are also developing new ways to deal with carbon dioxide itself: electrochemical systems that capture carbon dioxide directly from the air and convert it into something useful, such as fuels or chemicals. In theory, this means we could close the carbon loop, using electricity to turn a waste gas into a valuable commodity without relying on fossil fuels at all. It's early days, but the potential is enormous.

For years, electrochemistry lived somewhat in the shadow of other scientific disciplines, often seen as old-fashioned or too complex to be exciting. But as we face urgent global challenges, this "in-between" science is proving to be one of the most powerful tools we have. Whether it's producing clean energy, capturing carbon or monitoring our health in real time, electrochemistry is having its renaissance, and it won't be overshadowed any time soon.
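To put a number on the electrolysis route mentioned above: the textbook half-reactions for splitting water in acidic solution, and the thermodynamic minimum voltage they imply, are

\[
\text{cathode: } 4\,\mathrm{H^+} + 4\,e^- \rightarrow 2\,\mathrm{H_2} \qquad E^\circ = 0\,\mathrm{V}
\]
\[
\text{anode: } 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \qquad E^\circ = +1.23\,\mathrm{V}
\]
\[
\text{overall: } 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2}, \qquad E^\circ_{\text{cell}} = 1.23\,\mathrm{V}
\]

Real electrolysers have to run above this 1.23 V floor to overcome overpotentials at the electrodes, which is one reason so much current research focuses on better catalysts.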
Yahoo
11-04-2025
- Science
From brain Bluetooth to 'full RoboCop': where chip implants will be heading soon
In the 1987 classic film RoboCop, the deceased Detroit cop Alex Murphy is reborn as a cyborg. He has a robotic body and a full brain-computer interface that allows him to control his movements with his mind. He can access online information such as suspects' faces, uses artificial intelligence (AI) to help detect threats, and his human memories have been integrated with those from a machine.

It is remarkable to think that the movie's key mechanical robotic technologies have now almost been accomplished by the likes of Boston Dynamics' running, jumping Atlas and Kawasaki's new four-legged Corleo. Similarly, we are seeing robotic exoskeletons that enable paralysed patients to do things like walk and climb stairs by responding to their gestures.

Developers have lagged behind when it comes to building an interface through which the brain's electrical pulses can communicate with an external device. This too is changing, however. In the latest breakthrough, a research team based at the University of California has unveiled a brain implant that enabled a woman with paralysis to livestream her thoughts via AI into a synthetic voice with just a three-second delay.

The concept of an interface between neurons and machines goes back much further than RoboCop. In the 18th century, the Italian physician Luigi Galvani discovered that when electricity is passed through certain nerves in a frog's leg, it twitches. This paved the way for the whole study of electrophysiology, which looks at how electrical signals affect organisms.

The initial modern research on brain-computer interfaces started in the late 1960s, with the American neuroscientist Eberhard Fetz hooking up monkeys' brains to electrodes and showing that they could move a meter needle. Yet if this demonstrated some exciting potential, the human brain proved too complex for this field to advance quickly.

The brain is continually thinking, learning, memorising, recognising patterns and decoding sensory signals, not to mention coordinating and moving our bodies. It runs on about 86 billion neurons with trillions of connections, which process, adapt and evolve continuously in what is called neuroplasticity. In other words, there's a great deal to figure out.

Much of the recent progress has been based on advances in our ability to map the brain, identifying its various regions and their activities. A range of technologies can produce insightful images of the brain (including functional magnetic resonance imaging (fMRI) and positron emission tomography (PET)), while others monitor certain kinds of activity (including electroencephalography (EEG) and the more invasive electrocorticography (ECoG)).

These techniques have helped researchers to build some incredible devices, including wheelchairs and prosthetics that can be controlled by the mind. But whereas these are typically controlled with an external interface like an EEG headset, chip implants are very much the new frontier. They have been enabled by advances in AI chips and microelectrodes, as well as the deep learning neural networks that power today's AI technology. This allows for faster data analysis and pattern recognition, which, together with the more precise brain signals that can be acquired using implants, has made it possible to create applications that run virtually in real time.
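To give a sense of what this kind of data analysis and pattern recognition involves at its simplest, here is a minimal Python sketch of the band-power approach an entry-level EEG reader might use. The sampling rate, frequency bands, threshold and synthetic test signal are all illustrative assumptions, not the code behind any device described in this article.

```python
# Minimal, illustrative sketch of a yes/no brainwave classifier based on
# band power. All parameters and the synthetic data are assumptions; this
# is not the implementation of any real device mentioned in the article.
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 256  # assumed sampling rate in Hz

def bandpass(signal, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def band_power(signal, low, high, fs=FS):
    """Average power in the [low, high] Hz band via Welch's periodogram."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def classify_yes_no(eeg, threshold=1.0):
    """Toy rule: strong alpha (8-12 Hz) relative to beta (13-30 Hz) is read
    as 'yes'; anything else as 'no'. The threshold is an illustrative
    assumption, not a calibrated value."""
    filtered = bandpass(eeg, 1, 40)
    ratio = band_power(filtered, 8, 12) / band_power(filtered, 13, 30)
    return "yes" if ratio > threshold else "no"

if __name__ == "__main__":
    t = np.arange(0, 4, 1 / FS)                  # 4-second synthetic trial
    alpha_burst = np.sin(2 * np.pi * 10 * t)     # 10 Hz "relaxed" rhythm
    noise = 0.5 * np.random.randn(t.size)
    print(classify_yes_no(alpha_burst + noise))  # expected output: "yes"
```

Real systems replace the hand-tuned threshold with trained classifiers or deep neural networks, and genuine scalp or cortical recordings are far noisier than this synthetic trial.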
For instance, the new University of California implant relies on ECoG, a technique developed in the early 2000s that captures patterns from a thin sheet of electrodes placed directly on the cortical surface of someone's brain. In this case, the complex patterns picked up by the implant's 253 high-density electrodes are processed using deep learning to produce a matrix of data from which it's possible to decode whatever words the user is thinking. This improves on previous models that could only create synthetic speech after the user had finished a sentence. Elon Musk's Neuralink has been able to get patients to control a computer cursor using similar techniques.

However, it's also worth emphasising that deep learning neural networks are enabling more sophisticated devices that rely on other forms of brain monitoring. Our research team at Nottingham Trent University has developed an affordable brainwave reader, built from off-the-shelf parts, that enables patients suffering from conditions like completely locked-in syndrome (CLIS) or motor neurone disease (MND) to answer 'yes' or 'no' to questions. There is also the potential to control a computer mouse using the same technology.

The progress in AI, chip fabrication and biomedical tech that enabled these developments is expected to continue in the coming years, which should mean that brain-computer interfaces keep improving.

In the next ten years, we can expect more technologies that provide disabled people with independence by helping them to move and communicate more easily. This entails improved versions of the technologies that are already emerging, including exoskeletons, mind-controlled prosthetics and implants that move from controlling cursors to fully controlling computers or other machines. In all cases, it will be a question of balancing our increasing ability to interpret high-quality brain data against invasiveness, safety and costs.

It is still more in the medium to long term that I would expect to see many of the capabilities of a RoboCop, including planted memories and built-in trained skills supported by internet connectivity. We can also expect high-speed communication between people via 'brain Bluetooth'. It should similarly be possible to create a Six Million Dollar Man, with enhanced vision, hearing and strength, by implanting the right sensors and linking the right components to convert neuron signals into action (actuators). No doubt applications that haven't been thought of yet will also emerge as our understanding of brain functionality increases.

Clearly, it will soon become impossible to keep deferring ethical considerations. Could our brains be hacked, and memories be planted or deleted? Could our emotions be controlled? Will the day come when we need to update our brain software and press restart? With every step forward, questions like these become ever more pressing. The major technological obstacles have essentially been cleared out of the way. It's time to start thinking about the extent to which we want to integrate these technologies into society, and the sooner the better.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Amin Al-Habaibeh receives funding from Innovate UK, the British Council, the Royal Academy of Engineering, EPSRC, AHRC, and the European Commission.



