
Latest news with #Stentrode

Apple Wants To Get Into Your Head, Literally.

Forbes

15-05-2025

  • Health
  • Forbes


In the book Infinite Education I wrote about the advancing front of brain-computer interfaces (BCIs). When exploring AI-powered implants that detect and record electrical signals from the brain, it's not difficult to imagine an approaching world where thought could replace touch. Apple has now entered the ring.

Apple's move into BCIs through a partnership with Synchron is more than a headline. It's a glimpse into the future of how we interact with machines. The company that made the smartphone mainstream is now validating the idea that the brain itself can drive our devices. The significance lies in scale, signal and simplicity. When Apple shifts, the world takes notice. Their support for Synchron's BCI technology could move the entire conversation forward on the use of such devices.

Synchron's Stentrode device is the first BCI implant that doesn't require open-brain surgery. That fact alone will bring relief to a lot of people. The device is inserted via the jugular vein and interfaces with the motor cortex. It reads neural activity and converts it into control signals. It's already helping people with paralysis send texts, browse the web and interact with software using thought alone. While Elon Musk's Neuralink has demonstrated amazing results, it still requires invasive open-brain surgery, which presents significant barriers to widespread adoption.

What makes this moment historic is Apple's introduction of a new software protocol called BCI HID (Human Interface Device). This is Apple's way of telling developers and device makers: brain input is no longer fringe. It's officially part of the Apple ecosystem, the same ecosystem that powers iPhones, iPads and the new Vision Pro headset. Is brain activity now joining voice, touch and gesture as a recognized input method for devices? This would open the door to software that responds to thought, hardware that adapts to neural patterns and user experiences that center around intent rather than action. It could make accessibility smarter. Could it even make device interaction more human?

Still, context matters. Apple has been slow to enter the AI race. While companies like OpenAI and Google rapidly released generative AI tools, Apple has remained measured, if not hesitant. Their public AI strategy has lacked the urgency or visibility of competitors. The Vision Pro headset, a major foray into spatial computing, has received mixed reviews. Critics argue it lacks compelling use cases. Sales have not met expectations. Some see it as a flop. Others view it as a necessary step toward a larger ecosystem.

For users with ALS, spinal cord injuries or locked-in syndrome, the implications are life-changing. A person who cannot move or speak might now be able to control a digital environment through seamless native tools built by Apple. This matters. It's not a technology searching for a use case; it's a technology that could change many lives.

But it goes deeper. Apple is known for shaping culture. The iPhone didn't just succeed because it was smart. It succeeded because it redefined what phones could be. The Apple Watch didn't just track steps. It made wearable tech feel essential. If they use the same playbook, Apple could be signalling to the world that brain-based computing is not just possible, but desirable. This normalization is the most powerful part. It takes BCIs out of the niche and into the mainstream. Not just for medical use, but for everyone.
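To make that idea concrete, here is a deliberately hypothetical Swift sketch of what "brain input as just another input method" could look like from an app developer's point of view. Apple has not published the details of the BCI HID protocol, so the type and event names below are invented for illustration; the point is only that an app could respond to a decoded intent the same way it already responds to a tap or a key press.

```swift
import Foundation

// Hypothetical sketch only: Apple has not published the BCI HID protocol,
// so these intent names and types are invented for illustration.
enum BrainIntent {
    case select                        // analogous to a tap
    case moveFocus(dx: Int, dy: Int)   // analogous to arrow keys or a gaze shift
    case dismiss                       // analogous to a back gesture
}

protocol IntentDrivenView {
    func activateFocusedItem()
    func shiftFocus(dx: Int, dy: Int)
    func goBack()
}

// The promise of a HID-style standard: the app reacts to intents exactly as it
// would to touch or keyboard input, without caring that the source is an implant.
func handle(_ intent: BrainIntent, in view: IntentDrivenView) {
    switch intent {
    case .select:                    view.activateFocusedItem()
    case .moveFocus(let dx, let dy): view.shiftFocus(dx: dx, dy: dy)
    case .dismiss:                   view.goBack()
    }
}
```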
This partnership could lead to apps where students think their notes into existence, professionals control presentations with their minds or artists draw with neural commands. Creativity without friction? Expression without constraint?

There are obviously some serious ethical questions. As BCIs evolve, we will need rigorous safeguards. Neural data is the most intimate form of information we can generate. It must never be exploited or used without transparent consent. Apple's history of prioritizing user privacy gives some reassurance, but it will be critical to watch how this evolves. Regulators, ethicists and technologists must collaborate to write new rules for a new reality.

In education, the implications are profound. Students with learning differences could gain new forms of input. Those who struggle with motor control could gain a direct link to learning platforms. Teachers might one day read engagement not just by facial expression but by neural signals. The classroom itself could adapt to cognitive states, adjusting pace and content in real time.

For entrepreneurial parents and innovative educators, this may be a frontier worth exploring. The tools our children will use in ten years may not be bound by keyboards or touchscreens. Is the new frontier of education to build learning systems that are ready for this level of interface? Systems that are ethical, inclusive and meaningful.

In Infinite Education, I warned that education systems stuck in the finite game would miss the transformation. This could be one of those moments. The shift around us is happening. The arrival of Apple into this space is a signal that the age of the interface could be ending. The age of integration has begun. Our tools are becoming extensions of our minds. Not just in metaphor, but in fact. If Apple pulls this off, it will not be a small step; it will be a paradigm shift. One that will demand new thinking. The courage to rethink what it means to connect. What it means to learn. What it means to be human in a world where your thoughts can shape reality.

Apple's mind-control tech to let users control iPhones with their thoughts

Hindustan Times

15-05-2025

  • Health
  • Hindustan Times


Apple is working on a groundbreaking technology that would allow users to control devices like iPhones, iPads and its Vision Pro headset using only their thoughts. The system, still in early development, is aimed at significantly advancing accessibility and assistive technology for users with physical disabilities.

According to a report from The Wall Street Journal, Apple has partnered with Synchron, a neurotechnology company known for its innovative brain-computer interface (BCI) called the Stentrode. Unlike more invasive brain implants, the Stentrode is inserted into a blood vessel near the motor cortex, the part of the brain responsible for movement, where it can detect neural signals. These signals are then converted into digital commands, enabling users to perform tasks such as opening apps, navigating menus or interacting with content, all without needing to touch the screen.

This isn't just a futuristic concept. Last year, a patient with ALS (amyotrophic lateral sclerosis) successfully used the Stentrode to control an Apple Vision Pro headset, even exploring a virtual simulation of the Swiss Alps. Synchron is also reportedly looking at how this brain-control system can integrate with AI tools like ChatGPT for enhanced interaction. Apple is believed to be developing a software framework that would allow third-party developers to build compatible applications using this technology. While there's no official launch date, the company may unveil further details later this year.

Experts caution that this type of brain-interface control is still years away from mainstream use. Beyond the obvious regulatory and ethical hurdles, the technology remains expensive and requires specialist installation. Still, accessibility advocates are encouraged. Bob Farrell, who works on digital accessibility at Applause, welcomed the news but noted that present-day accessibility remains just as important. "It's great to see innovation like this," he said, "but we still need to make sure everyday devices are easy to use for people with disabilities now."

With tech giants like Apple entering the neurotechnology space, BCIs are edging closer to everyday reality. While mind-controlled iPhones may sound like science fiction today, the foundations are clearly being laid for a more accessible digital future.
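As a rough illustration of the "neural signal to digital command" step described above, the following Swift sketch maps a classified motor attempt to a discrete device action. The classification names and the specific mapping are assumptions made for this example; Synchron's actual decoding pipeline is not public.

```swift
import Foundation

// Illustrative sketch only: the real Synchron/Apple decoding pipeline is not
// public, and these classification and command names are invented here.
enum MotorClassification {
    case attemptedHandSqueeze
    case attemptedFootTap
    case rest
}

enum DeviceCommand {
    case selectFocusedItem   // e.g. open the highlighted app
    case advanceFocus        // e.g. move to the next menu item
    case none
}

// A minimal mapping from a classified motor attempt to a discrete device
// command: the step the article describes between a detected neural signal
// and actions like opening apps or navigating menus.
func command(for classification: MotorClassification) -> DeviceCommand {
    switch classification {
    case .attemptedHandSqueeze: return .selectFocusedItem
    case .attemptedFootTap:     return .advanceFocus
    case .rest:                 return .none
    }
}
```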

Apple to develop brain-computer interface, following path laid by Elon Musk's Neuralink

Times of India

14-05-2025

  • Business
  • Times of India


Apple is stepping into the brain-computer interface (BCI) space with a major announcement. The tech giant is developing a new technology that will eventually allow users to operate Apple devices like iPhones and iPads using brain signals, without needing to touch or physically interact with the device. This move is part of Apple's broader commitment to accessibility, with a focus on supporting individuals with significant motor impairments, such as those affected by spinal cord injuries or conditions like amyotrophic lateral sclerosis (ALS), according to The Wall Street Journal.

"At Apple, accessibility is part of our DNA," said Apple CEO Tim Cook in a press release. "Making technology for everyone is a priority for all of us, and we're proud of the innovations we're sharing this year. That includes tools to help people access crucial information, explore the world around them, and do what they love."

Apple vs Musk's Neuralink

To drive its BCI efforts, Apple is collaborating with a startup named Synchron. The company has developed a device called the Stentrode, a tiny, stent-like electrode implant positioned within a vein near the brain's motor cortex. It features 16 electrodes that detect brain activity.

By contrast, Elon Musk's Neuralink is working on a more invasive approach. Its brain implant, the N1, is inserted directly into brain tissue and contains over 1,000 electrodes, allowing it to capture far more detailed neural data.

This difference in design leads to noticeably different user experiences. Mark Jackson, one of the first individuals to test the Stentrode, shared that the system doesn't yet allow him to simulate using a mouse or touchscreen. As a result, navigating digital interfaces is significantly slower than with conventional input methods. Neuralink, on the other hand, demonstrated a more advanced level of interaction in March last year: in a livestream, Noland Arbaugh, the company's first implanted patient, was seen playing an online chess game, controlling the cursor purely through his thoughts.

While Synchron's technology may be at an earlier stage, it has already delivered some remarkable moments. According to The Wall Street Journal, Jackson, though unable to stand and not physically present in Switzerland, used an Apple VR headset connected to his brain implant to virtually gaze off a mountaintop in the Swiss Alps, and was even overcome by the sensation of his legs trembling. "More is possible with a standard built specifically for these implants," said Synchron CEO Tom Oxley. Apple is expected to unveil this new standard later this year, making it available to developers across the platform.

Apple's New Tech May Soon Let People Control iPhones With Their Brain

NDTV

14-05-2025

  • Health
  • NDTV


Washington: Apple is making strides towards a future where people control their iPhones with thoughts, using brain implants, according to a report in a leading American daily. This is similar to the vision of Elon Musk's Neuralink. The tech giant is reportedly working closely with Synchron, a brain-interface company, which has developed a stent-like device called the Stentrode, The Wall Street Journal reported. This implant is inserted into a blood vessel near the brain's motor cortex and reads brain signals to control digital devices. Such technology could be life-changing for people with severe spinal cord injuries, ALS (amyotrophic lateral sclerosis), or those recovering from strokes.

These implants work by picking up brain activity and turning it into digital commands. A brain-computer interface (BCI), as it is called, allows the brain to communicate directly with a computer or device. When a person thinks, the brain sends out signals. BCIs capture those signals through sensors and convert them into actions, like moving a cursor, typing, or opening apps, without needing to touch the screen. The Stentrode works with Apple's built-in feature called Switch Control, which lets users change how they interact with their devices, switching from a joystick to a brain signal.

"Today, brain-computer interface companies have to trick computers into thinking the signals coming from their implants are coming from a mouse," Synchron's CEO Tom Oxley told WSJ. He said Apple's new standard, expected to be released later this year, will make it easier for developers to connect implants directly with devices.

One early user, Mark Jackson, who has ALS, uses the Synchron device to operate Apple's Vision Pro headset and his iPhone from home. He can't travel or stand, but the brain implant lets him access Apple's devices in a new way. He told WSJ that through the headset, he was "able to peer over the ledge of a mountain in the Swiss Alps and feel my legs shake."

Elon Musk's Neuralink has already implanted its device, the N1, in a human. It has over 1,000 electrodes placed inside the brain, capturing far more data than Synchron's 16 electrodes that sit on top. Neuralink's first user can move a cursor with thoughts faster than some people using a mouse. Musk has said that such implants could one day boost brain power and help people compete with advanced artificial intelligence.

Morgan Stanley estimates around 150,000 Americans with serious upper-limb impairments could be early users of brain-implant tech. They predict the first commercial approval could come by 2030, but Synchron's CEO believes it could happen sooner.
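Oxley's point about "tricking computers" can be made concrete with a small macOS sketch: today's workaround is to replay decoded brain signals as ordinary pointer events, which the operating system cannot distinguish from a real mouse. The snippet below uses Apple's public CGEvent API to synthesize a move and a click at a position a decoder might have produced; the decoder itself and the coordinates are stand-ins, and posting synthetic events on macOS also requires the user to grant accessibility permission.

```swift
import CoreGraphics

// Illustrative macOS sketch of the current workaround: decoded brain signals
// are replayed as ordinary mouse events so the OS treats them like any mouse.
// The decoding stage is out of scope; `decodedCursorTarget` is a stand-in value.
let decodedCursorTarget = CGPoint(x: 640, y: 400)

// Synthesize a mouse move followed by a click at the decoded position.
if let move = CGEvent(mouseEventSource: nil, mouseType: .mouseMoved,
                      mouseCursorPosition: decodedCursorTarget, mouseButton: .left),
   let down = CGEvent(mouseEventSource: nil, mouseType: .leftMouseDown,
                      mouseCursorPosition: decodedCursorTarget, mouseButton: .left),
   let up = CGEvent(mouseEventSource: nil, mouseType: .leftMouseUp,
                    mouseCursorPosition: decodedCursorTarget, mouseButton: .left) {
    move.post(tap: .cghidEventTap)
    down.post(tap: .cghidEventTap)
    up.post(tap: .cghidEventTap)
}
```

A native standard of the kind Apple is reportedly preparing would remove the need for this indirection, letting implants report intents directly instead of impersonating a pointing device.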

Apple Bets On Brain Signals To Boost Accessibility

Yahoo

13-05-2025

  • Business
  • Yahoo


Apple (NASDAQ:AAPL) is adding a suite of new accessibility tools: Eye Tracking, Head Tracking, Switch Control for Brain Computer Interfaces, Assistive Access, Music Haptics, Name Recognition in Sound Recognition, expanded Voice Control and Large Text CarPlay support, all designed to let users navigate iOS and Apple Vision Pro hands-free. Eye and Head Tracking let you select and type by gazing or nodding, while the new Switch Control partners with Synchron's implantable Stentrode electrodes to read neural signals and operate your device. Tim Cook says "accessibility is part of our DNA," and this neural-interface leap could transform life for users with paralysis or ALS by unlocking tens of thousands of potential new users.

Apple's partnership with Synchron taps into cutting-edge BCI work, akin to Elon Musk's Neuralink, which plans to implant its Blindsight device in 20-30 patients later this year, positioning Apple at the forefront of consumer-grade neural tech. Beyond BCIs, Apple is simplifying TV navigation with Assistive Access and giving deaf and hard-of-hearing users Name Recognition in Sound Recognition, while Music Haptics and expanded Voice Control support widen its reach. These updates follow reports that the Wall Street Journal first broke about Switch Control's neural feature and come as Apple aims to cement its promise of technology for everyone.

Why it matters: By weaving brain signals into iPhone control, Apple isn't just expanding its accessibility toolkit; it's betting neural interfaces are the next consumer frontier. Investors will watch adoption metrics and any FDA feedback on the Stentrode partnership when Apple reports Q3 results later this summer.

Apple's current share price of $211.67 sits well above most traditional valuation benchmarks, signaling that investors are pricing in robust growth expectations. The GuruFocus GF Value estimate of $197.36 comes closest to today's levels, suggesting only a slim margin of safety, while the earnings-based DCF at $168.27 and the FCF-based DCF at $132.68 offer more conservative anchors. Metrics rooted in historical performance, like the Median PS Value of $144.82 and Peter Lynch's $96.57 figure, imply that the stock is trading at a substantial premium to its past sales and earnings power. Measures that focus on balance-sheet strength, such as the Tangible Book at $4.47 and the negative Net Current Asset Value of $9.76, barely register against Apple's real-world market heft. Even the Graham Number of $26.71 and the Earnings Power Value of $63.11 point to a valuation far below where the market has placed the stock. In sum, unless Apple delivers outsized growth or margin expansion, today's price appears to rest on optimistic forecasts rather than the support of most conservative valuation frameworks. This article first appeared on GuruFocus.
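For readers unfamiliar with one of the metrics above, the Graham Number is simply the square root of 22.5 times earnings per share times book value per share. A quick sketch, using the article's $4.47 tangible book figure and an assumed trailing EPS of about $7.10 (not stated in the article), reproduces a value close to the $26.71 quoted:

```swift
import Foundation

// Worked example of one metric cited above.
// Graham Number = sqrt(22.5 x EPS x book value per share).
// Tangible book of $4.47 comes from the article; the trailing EPS of roughly
// $7.10 is an assumption supplied here for illustration only.
let bookValuePerShare = 4.47
let assumedTrailingEPS = 7.10

let grahamNumber = (22.5 * assumedTrailingEPS * bookValuePerShare).squareRoot()
print(String(format: "Graham Number ≈ $%.2f", grahamNumber))
// Prints roughly $26.72, in line with the $26.71 quoted above and far below
// the $211.67 share price, which is the article's point about premium pricing.
```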
