Latest news with #MarioLanza


Forbes
20-04-2025
- Health
- Forbes
The Meshing Of Minds And Machines Has Arrived
Examining the mesh between humans and machines provides insight into the future. Science is already making significant progress in developing brain-computer interface (BCI) technologies, including brain mapping and neuromorphic circuits. A brain-computer interface is a system that connects the brain directly to an external device. These technologies gather brain signals using sensors built into assistive devices, then use those signals to drive external equipment, which means brain signals are converted into actions or even commands without requiring any physical movement. BCIs rely on brain activity that is recorded by a sensor and typically converted into digital form so that devices can interpret it.

The goal of neuromorphic computing with BCI is to mimic the brain's energy efficiency and processing capacity. Achieving this requires redesigning system architecture to allow for in-memory computing (IMC) and creating electronic devices that simulate the behaviour of synapses and neurons.

Neuromorphic Development

BCIs have more than a hundred years of history. Hans Berger discovered the brain's electrical activity in 1924; his investigations, which used electrodes to record electrical activity from the human scalp, produced the first EEG recordings of brain waves. Researchers achieved the first non-invasive BCI-assisted control of a robot in 1988, and Cyberkinetics' BrainGate project successfully controlled a prosthetic hand in 2005. A comprehensive timeline can be found in "The History of Brain-Computer Interfaces (BCIs)" from RoboticsBiz. In 2018, research funded by the Defense Advanced Research Projects Agency (DARPA) demonstrated that a person with a brain chip could pilot a swarm of drones using brain signals. Various studies and experiments have followed since then, and science combining neural networks and artificial intelligence is on a path to enhance, and even upgrade, human cognitive capabilities. In the future, we could implant nanochips into our brains to enhance our cognitive abilities and enable intelligent data uploads.

Advances in brain-computer interface technologies are progressing rapidly in 2025, and one breakthrough is directly shaping the meshing of mind and machine. According to research from the National University of Singapore (NUS), a single conventional silicon transistor, when used unconventionally, can simulate both a biological neuron and a synapse. The study, led by Associate Professor Mario Lanza of the Department of Materials Science and Engineering in NUS's College of Design and Engineering, points toward scalable, energy-efficient hardware for artificial neural networks (ANNs). "We need hardware that is both scalable and energy-efficient to enable true neuromorphic computing, where microchips behave like biological neurons and synapses," Professor Lanza stated.

Elon Musk has also been a pioneer in this space. Neuralink, which he founded in 2016, is the company that develops his brain-computer interface (BCI) technology.
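The signal chain just described, recording brain activity, digitising it, and decoding it into a command, can be pictured with a short sketch before turning to Neuralink's work in more detail. Everything in it (the sampling rate, the frequency band, the threshold, and the synthetic signal) is an illustrative assumption rather than a description of any particular BCI product.

```python
# Minimal sketch of a BCI signal chain: digitised EEG in, device command out.
# The sampling rate, frequency band, threshold and synthetic signal are all
# assumptions made for illustration only.
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 250  # assumed sampling rate in Hz

def bandpass(signal, low, high, fs=FS, order=4):
    """Isolate one EEG frequency band (here the 8-12 Hz mu rhythm)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def band_power(signal, fs=FS):
    """Average spectral power of a signal segment."""
    _, psd = welch(signal, fs=fs, nperseg=len(signal))
    return psd.mean()

def decode_command(eeg_segment, threshold=0.05):
    """Toy decoder: suppressed mu-band power is read as an intent to move."""
    mu = bandpass(eeg_segment, 8.0, 12.0)
    return "move_cursor" if band_power(mu) < threshold else "rest"

# A one-second synthetic segment stands in for a digitised sensor reading.
rng = np.random.default_rng(0)
segment = rng.normal(scale=1.0, size=FS)
print(decode_command(segment))  # prints "move_cursor" or "rest"
```

Real systems replace the toy threshold with trained decoders and closed-loop calibration, but the record-digitise-decode-act structure is the same.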
To improve human potential and restore independence for people with disabilities, Neuralink is developing implanted brain-computer interfaces (BCIs) that allow direct brain-to-computer communication. To help people with paralysis, the technique uses a surgical robot to implant devices in the brain, and the procedure allows users to operate computers or other equipment with their thoughts. Noland Arbaugh, a 30-year-old man from Arizona, USA, became the first person to receive a brain chip implant from Neuralink, a significant milestone in neurotechnology. After a diving accident in 2016, Arbaugh was left paralysed below the shoulders. The BBC reported that since he received the chip in January 2024, the outcomes have been nothing short of remarkable: Arbaugh can now use the BCI to operate a computer with his thoughts alone. Recalling his early battles with paralysis, he remarked, "You just have no control, no privacy, and it's hard." After the surgery, however, he was able to control a computer cursor simply by thinking about moving his fingers.

An article published by Frontiers, the result of cooperation between scientists, institutes, and academics, further highlights the promise of the human-computer interface. "We can imagine the possibilities of what may come next with the human brain machine interface," the conclusion reads. Neural nanorobotics-based human brain-computer interface systems could boost human intelligence and learning by giving people quick access to all the knowledge available in the cloud. They could also elevate fully immersive virtual and augmented reality to previously unheard-of heights, allowing users to express themselves more completely and have more meaningful experiences. By addressing new challenges for the human species, these improvements may help humanity adjust to emerging artificial intelligence systems and human augmentation technologies (see Frontiers, "Interface between Human Brain and Cloud").

There is also hope for a quantum brain made of intelligent material that changes physically as it learns. Physicists pursuing this "quantum brain" have made significant progress: they have shown that they can pattern and link a network of individual atoms and replicate the independent behaviour of the brain's neurons and synapses (see "The Initial Steps Toward a Quantum Brain: An Intelligent Substance That Acquires Knowledge by Changing Itself Physically").

Future applications of brain-computer interfaces may enable instant communication, thought transfers, dream recording, and AI-consciousness integration. While these advances hold potential for human augmentation, they also raise significant ethical concerns related to cyborg rights and the regulation of super AI. Cybersecurity and privacy issues are critical as well, since BCIs interact directly with brain signals and could be susceptible to misuse or compromise. As the technology becomes more widespread, protecting user data and ensuring ethical usage will become increasingly imperative.

Human-machine interaction is here, despite technological, security, and ethical challenges. It will shape our future and could define the Fifth Industrial Revolution. The key will be steering its applications with a focus on a positive impact that enhances lives.
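The kind of mind-to-cursor control described in Arbaugh's case can be pictured with a toy sketch: a decoder emits intent labels, and software turns each label into a small cursor displacement. The labels, the gain and the update loop are assumptions for illustration; they do not describe Neuralink's actual decoder or software.

```python
# Illustrative only: mapping a stream of decoded motor intentions to cursor
# motion. Intent labels, gain and loop are assumptions for this example.
from dataclasses import dataclass

@dataclass
class Cursor:
    x: float = 0.0
    y: float = 0.0

    def apply(self, intent: str, gain: float = 5.0) -> None:
        """Translate a decoded intent label into a small cursor displacement."""
        moves = {
            "imagine_right": (gain, 0.0),
            "imagine_left": (-gain, 0.0),
            "imagine_up": (0.0, gain),
            "imagine_down": (0.0, -gain),
            "rest": (0.0, 0.0),
        }
        dx, dy = moves.get(intent, (0.0, 0.0))
        self.x += dx
        self.y += dy

cursor = Cursor()
for decoded in ["imagine_right", "imagine_right", "imagine_up", "rest"]:
    cursor.apply(decoded)
print(cursor)  # Cursor(x=10.0, y=5.0)
```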
Yahoo
05-04-2025
- Science
- Yahoo
Redefining the transistor: The ideal building block for artificial intelligence
SINGAPORE, March 28, 2025 /PRNewswire/ -- The team led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering in the College of Design and Engineering at the National University of Singapore has just revolutionised the field of neuromorphic computing by inventing a new super-efficient computing cell that can mimic the behaviour of both electronic neurons and synapses. The work, titled "Synaptic and neural behaviours in a standard silicon transistor", was published in the scientific journal Nature on 26 March 2025 and is already attracting interest from leading companies in the semiconductor field.

Electronic neurons and synapses are the two fundamental building blocks of next-generation artificial neural networks. Unlike traditional computers, these systems process and store data in the same place, eliminating the need to waste time and energy transferring data from memory to the processing unit (CPU). The problem is that implementing electronic neurons and synapses with traditional silicon transistors requires interconnecting multiple devices: at least 18 transistors per neuron and 6 per synapse. This makes them significantly larger and more expensive than a single transistor.

The team led by Professor Lanza has found an ingenious way to reproduce the electronic behaviours characteristic of neurons and synapses in a single conventional silicon transistor. The key lies in setting the resistance of the bulk terminal to a specific value to produce a physical phenomenon called "impact ionisation", which generates a current spike very similar to what happens when an electronic neuron is activated. Additionally, by setting the bulk resistance to other specific values, the transistor can store charge in the gate oxide, causing the resistance of the transistor to persist over time and mimicking the behaviour of an electronic synapse. Making the transistor operate as a neuron or a synapse is as simple as selecting the appropriate resistance for the bulk terminal.

The physical phenomenon of "impact ionisation" had traditionally been considered a failure mechanism in silicon transistors, but Professor Lanza's team has managed to control it and turn it into a highly valuable application for the industry. This discovery is revolutionary because it allows the size of electronic neurons to be reduced by a factor of 18 and that of synapses by a factor of 6. Considering that each artificial neural network contains millions of electronic neurons and synapses, this could represent a huge leap forward in computing systems capable of processing much more information while consuming far less energy.

Furthermore, the team has designed a two-transistor cell, called Neuro-Synaptic Random Access Memory (NSRAM), that allows switching between operating modes (neuron or synapse), offering great versatility in manufacturing since both functions can be reproduced using a single block, without the need to dope the silicon to achieve specific substrate resistance values.

The transistors used by Professor Lanza's team to implement these advanced neurons and synapses are not cutting-edge transistors like those manufactured in Taiwan or Korea, but traditional 180-nanometer node transistors, which can be produced by Singapore-based companies. According to Professor Lanza, "once the operating mechanism is discovered, it's now more a matter of microelectronic design".
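To make the dual behaviour easier to picture, here is a purely conceptual software sketch: one cell object acts as a spiking neuron or as a weight-storing synapse depending on a mode chosen up front, much as the real device's role is chosen by the bulk-terminal resistance. The class, thresholds and update rules are illustrative assumptions; the actual mechanism reported in the Nature paper is analogue device physics (impact ionisation and charge stored in the gate oxide), not this model.

```python
# Conceptual behavioural sketch of the dual role described above: the same
# cell acts as a spiking neuron or a weight-storing synapse depending on a
# mode flag, standing in for the bulk-terminal resistance chosen in hardware.
# All values and update rules here are illustrative assumptions.

class SingleTransistorCell:
    def __init__(self, mode: str):
        assert mode in ("neuron", "synapse")
        self.mode = mode          # in hardware: set by the bulk resistance
        self.potential = 0.0      # neuron mode: accumulated input
        self.weight = 0.5         # synapse mode: stored conductance state

    def step(self, stimulus: float) -> float:
        if self.mode == "neuron":
            # Integrate input; past a threshold, emit a spike and reset,
            # loosely mirroring the impact-ionisation current spike.
            self.potential += stimulus
            if self.potential >= 1.0:
                self.potential = 0.0
                return 1.0        # spike
            return 0.0
        # Synapse mode: nudge the stored state and keep it bounded,
        # loosely mirroring charge retained in the gate oxide.
        self.weight = min(1.0, max(0.0, self.weight + 0.1 * stimulus))
        return self.weight

neuron = SingleTransistorCell("neuron")
synapse = SingleTransistorCell("synapse")
print([neuron.step(0.4) for _ in range(4)])                # [0.0, 0.0, 1.0, 0.0]
print([round(synapse.step(1.0), 2) for _ in range(3)])     # [0.6, 0.7, 0.8]
```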
The first author of the paper, Dr Sebastián Pazos, who is from King Abdullah University of Science and Technology, commented, "Traditionally, the race for supremacy in semiconductors and artificial intelligence has been a matter of brute force, seeing who could manufacture smaller transistors and bear the production costs that come with it. Our work proposes a radically different approach based on exploiting a computing paradigm using highly efficient electronic neurons and synapses. This discovery is a way to democratise nanoelectronics and enable everyone to contribute to the development of advanced computing systems, even without access to cutting-edge transistor fabrication processes."

SOURCE National University of Singapore

