Latest news with #fNIRS


Time of India
14-07-2025
- Science
Shubhanshu Shukla, Uznanski complete 1st brain-to-comp comms test in space, here's why it's important
As Group Captain Shubhanshu Shukla and the rest of the Ax-4 crew leave the International Space Station for their return to Earth, one of the lesser-known but groundbreaking experiments they carried out was 'PhotonGrav' by Polish neurotechnology firm Cortivision. In simple terms, it tested whether humans can communicate directly with a computer using nothing but their brain signals, for the first time ever in space. In an exclusive interview with TOI, Wojciech Broniatowski, CEO & COO of Cortivision, spoke about the experiment, the challenges of brain monitoring in orbit, and why this matters for the future of spaceflight and life on Earth. Excerpts:

Q: Can you explain the core objective of the 'PhotonGrav' experiment on Axiom-4, and how the microgravity environment supports or enhances your research?

Allow me to start with a different question: can people communicate with a computer using only their brain? Would it work in space? It turns out, nobody had tried it. The core of our experiment was to prove the effectiveness of our technology in space by communicating with a machine without any muscle engagement. I know it sounds a bit complicated, even though the process is very simple. During the experiment, astronauts Shubhanshu Shukla and Sławosz Uznański performed mental calculation tasks. This means they put themselves in a deep focus state. Our fNIRS system detected changes in their brain activity and, using our AI, correctly identified whether they were focused or relaxed. On the surface, this seems simple. But if we look closer, the astronauts communicated simple signals, just by thinking, and the computer decoded their message correctly. This was the first time in history that humans communicated directly from their brain with a computer in space. Why microgravity? At Cortivision, we build products specifically designed for extreme environments.
Microgravity is unique and very challenging, so if our technology works reliably there, it can handle any challenges on Earth. That makes microgravity testing crucial and a glimpse into the future of brain monitoring technologies. The International Space Station was the perfect place to prove that this fNIRS-based communication works in real space conditions, not just in a lab. And it worked perfectly. We hope this success will open the door for future collaborations with Indian space researchers, universities, and companies.

Shux and Dariusz Zapała, PhD, from Cortivision during training

Q: Why did Cortivision decide to conduct this specific experiment in space?

In the autumn of 2023, Poland launched a call for projects to be carried out in orbit within the IGNIS mission, giving us a unique chance to push boundaries. At that time, we had already participated in two previous space missions (Ax-2 and Ax-3). We gained experience and understood the challenges astronauts face. So we seized this opportunity to demonstrate brain-computer communication in microgravity and continue developing technology for extreme environments. This mission was special for us because it involved a second Pole in space. You might think that science should be subject-agnostic, but in fact, it really matters who you work with on pioneering projects. Sławosz and Shux turned out to be perfect participants to carry out such an ambitious experiment.

Q: Of the four astronauts on Ax-4, Sławosz was a natural choice. But how and why was Shubhanshu Shukla chosen?

Each mission includes a scientific programme. During training, astronauts meet with researchers who present various project proposals. The astronauts then sign up for experiments that interest them or that they feel comfortable with; this is how they express their willingness to participate. Our experiment seemed very appealing, but believe me, there were also projects that required more effort or caused discomfort.
So the choice wasn't always straightforward. We were very keen to include as many astronauts as possible in our study. We're also hopeful to involve more crew members in future missions, since our equipment will remain on the ISS even after the current crew returns to Earth.

The device Shukla and Uznanski used on the ISS

Q: Now that the Ax-4 crew has completed two weeks, what specific insights have you gained so far? How many trials did Shux and Sławosz run on the ISS? What did these trials entail? Please also elaborate on the findings.

We can't reveal all the details yet, as the data is protected by personal privacy regulations and will be fully analysed before scientific publications are released. However, we can share that both Shubhanshu and Sławosz successfully completed the experiment three times in orbit. They achieved very strong results in detecting whether they were focused or relaxed. These mental states may sound abstract when we speak about communication, but if you think about the early days of computing, where communication was based on binary 'zeros' and 'ones,' it's actually a very similar principle. The astronauts, by changing their mental state, were sending a simple, clear signal that the system could recognise. What's important is that this was far from obvious. In microgravity, body fluids shift differently; this is why astronauts' faces often look swollen in space photos. These fluid shifts also affect the brain's blood flow, which could influence how well brain activation signals can be detected. That's why, before flying to the ISS, the astronauts carried out baseline measurements in Warsaw and Houston. After their return, they will do the same measurements again. This way, we'll be able to compare how the brain behaves on Earth and in orbit, and precisely assess the differences in our brain-computer interface communication performance.
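The binary principle Broniatowski describes can be made concrete with a toy decoder. The sketch below is purely illustrative: it assumes a simple amplitude threshold on simulated oxygenated-haemoglobin (HbO) values, whereas Cortivision's actual system uses a trained AI classifier whose details are not public.

```python
# Toy decoder: label each trial window as "focused" (1) or "relaxed" (0)
# from simulated fNIRS HbO amplitudes, then read the labels as bits.
# Illustrative only -- the real system relies on a trained AI classifier.

def classify_window(hbo_samples, threshold=0.5):
    """Label a window 1 (focused) if mean HbO exceeds the threshold, else 0."""
    mean_hbo = sum(hbo_samples) / len(hbo_samples)
    return 1 if mean_hbo > threshold else 0

def decode_message(windows, threshold=0.5):
    """Map a sequence of trial windows to a bit string."""
    return "".join(str(classify_window(w, threshold)) for w in windows)

# Simulated trials: mental arithmetic raises HbO, resting lowers it.
trials = [
    [0.9, 1.1, 1.0],   # deep focus  -> 1
    [0.1, 0.2, 0.1],   # relaxed     -> 0
    [0.8, 0.9, 1.2],   # deep focus  -> 1
]
print(decode_message(trials))  # prints 101
```

Even with nothing more than two distinguishable mental states, arbitrary messages can in principle be spelled out, exactly as early computers did with zeros and ones.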
Q: Were there any unexpected results or anomalies observed during the experiment aboard the ISS that could shape future neurosensory or cognitive research in microgravity? Interestingly, after almost two years of preparation, the biggest surprise had nothing to do with the experiment itself—it was a simple technical glitch with the laptop on the ISS. The most advanced part—the brain-reading system—worked exactly as planned, while a regular piece of equipment gave us a little headache. It's a good reminder that sometimes the most complicated technologies run smoothly because they've been tested a thousand times, and it's the ordinary, everyday things that can surprise you. This experience shows why careful preparation is key, and why testing even the simplest parts of the setup is just as important as preparing the core science. Q: Have the results from Axiom-4 validated or challenged any of your pre-launch hypotheses, and how do you plan to apply this data in terrestrial applications or future missions? The results from the Ax-4 mission gave us a green light and confirmed what we had hoped for. Before the mission, we believed that fNIRS technology could be the best fit for space—more mobile and practical than fMRI, but giving deeper, more precise data than EEG. This mission proved that idea right. This means we now consider fNIRS ready for more advanced use in space—not only for brain-computer communication but also for monitoring astronauts' mental focus, improving their cognitive training, and supporting them in extreme isolation and stress. In the future, we may even be able to create personalised 'attention patterns' for each astronaut. By comparing their real-time focus to this personal pattern, we could detect when their attention drifts and help prevent mistakes during critical mission phases. And no, this has nothing to do with 'reading the mind'. We would only track physiological changes, so astronauts' dreams and thoughts are safe. 
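The 'personal attention pattern' idea mentioned above can be sketched as a baseline comparison: record an individual's typical focus level on the ground, then flag sustained drops below it in flight. All names, numbers, and thresholds below are hypothetical.

```python
# Hypothetical attention-drift monitor: compare live focus scores against a
# personal baseline recorded during ground calibration, and flag sustained drops.

def personal_baseline(calibration_scores):
    """Mean focus score from calibration sessions (the 'attention pattern')."""
    return sum(calibration_scores) / len(calibration_scores)

def detect_drift(live_scores, baseline, tolerance=0.3, run_length=3):
    """Return indices where focus stays below (baseline - tolerance) for
    run_length consecutive samples, suggesting attention has drifted."""
    alerts, run = [], 0
    for i, score in enumerate(live_scores):
        run = run + 1 if score < baseline - tolerance else 0
        if run >= run_length:
            alerts.append(i)
    return alerts

baseline = personal_baseline([0.8, 0.85, 0.9])   # ground calibration
live = [0.9, 0.8, 0.4, 0.35, 0.3, 0.85]          # in-flight readings
print(detect_drift(live, baseline))  # -> [4]
```

Note that this tracks only a physiological summary statistic, consistent with the interview's point that no thoughts or dreams are being read.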
On Earth, this technology has huge potential as well. We can use it for training pilots, surgeons, or anyone who needs to perform complex tasks under pressure. It could also be applied in neurorehabilitation or mental health monitoring. If our technology works in space—the harshest environment possible—it's certainly reliable in the demanding environments on Earth. Q: Explain how your device is different from the technologies already available on Earth. First, our technology wasn't built only for space. In fact, about 90 percent of what we do at Cortivision happens here on Earth. Universities, research centres, and clinical teams already use our fNIRS systems every day to study cognitive brain activity. Scientists don't have to fly to space to use it—our mission is to make advanced brain monitoring accessible in real-world conditions, whether it's a hospital, a lab, or a sports field. What makes our technology special is that it works in tough environments where traditional tools don't. EEG is lightweight but often noisy and uncomfortable in movement. fMRI gives great data but is huge and stationary. Our fNIRS system is the best of both worlds: it's portable and gives deep insights into how the brain uses oxygen during mental tasks—whether that's focus, stress, or fatigue. After three space missions, our device has been fully certified for safety by Nasa and remains on the ISS for future astronauts and scientists. Shubhanshu Shukla and Slawosz Uznański have already made history with it—but the device is still in orbit, ready for the next mission. Who will be next?


Time of India
30-06-2025
- Health
Shux, crew study microalgae, cancer, cognition & more
The view was still extraordinary, but the work even more so for the Axiom-4 (Ax-4) crew. With Earth turning silently below them, the crew, including India's Group Captain Shubhanshu Shukla (Shux), spent their third day aboard the International Space Station (ISS) studying cancer, cognition, microalgae, the building blocks of long-term spaceflight, and more. The experiments being carried out by Shux and other crew members are part of the more than 60 scientific investigations from 31 countries, including India, the US, Poland, Hungary, Brazil, Nigeria, and Saudi Arabia. After all, Ax-4 is the most research-intensive private mission to the ISS to date. Shux, who had kicked off his research activity with a muscle loss study the previous day, was hard at work on a project that's microscopic in size but potentially massive in impact. He deployed sample bags and imaging tools for a space microalgae experiment, studying how these tiny organisms fare in orbit. Rich in nutrients and capable of recycling carbon dioxide, microalgae could become a key part of long-duration space missions: a source of food, oxygen and life support rolled into one. Along with the other Ax-4 crew, he was also part of the 'Neuro Motion VR' study, which uses virtual reality headsets and brain-monitoring technology (fNIRS) to observe how astronauts' cognitive and motor functions respond to spaceflight. 'Alongside that, they collected data for 'Telemetric Health AI', a project that merges biometric tracking with AI analytics to better understand how spaceflight affects the cardiovascular and balance systems. These studies aren't just academic. They're building the foundations for safe, extended human missions into deep space,' Axiom Space said. Commander Peggy Whitson, a seasoned astronaut on her fourth spaceflight, devoted much of the day to the Cancer in LEO (low-Earth orbit) investigation.
Working with the Sanford Stem Cell Institute, she captured imaging samples of cancer cells exposed to microgravity. The stress of space may change the way these cells behave, potentially offering clues to tackling aggressive, metastatic cancers back on Earth. Her camera didn't stop there. Whitson also photographed student-designed science experiments and artwork for the Saudi Space Agency's Microgravity Challenge, a competition that drew over 80,000 entries from young innovators across the Arab world. With submissions ranging from sustainable farming ideas to space-inspired art, it's an example of how missions like Ax-4 can inspire the next generation of scientists, engineers and dreamers. Mission specialist Suave (Slawosz Uznanski) focused his attention on the microfluidic design experiment to observe how fluids behave in low gravity. The end goal? Developing microfluidic devices that test drug stability and quality, a vital step toward bringing autonomous health care to future space travellers. And yet, research isn't the only focus. The crew also reached out to Earth. Tibor Kapu shared a live conversation with Hungarian PM Viktor Orbán, while Whitson and Suave spoke with Axiom Space Chief Scientist Dr Lucie Low about protecting astronauts from space radiation, a challenge no future mission can ignore.
Yahoo
24-06-2025
- Health
Your Brain Is Glowing Right Now. Literally.
Here's what you'll learn when you read this story: The human brain actually lights up with signals known as ultra weak photon emissions (UPEs), which are a byproduct of metabolic processes. Researchers have now been able to detect UPEs and determine what they reveal about brain function. A new imaging technique called photoencephalography could someday harness UPE signals as a diagnostic tool to supplement PET scans and MRIs. From bioluminescent mushrooms in the undergrowth of a rainforest to alien sea creatures eerily glowing in the abyssal depths, glowing organisms light up some of the darkest places on Earth. But humans aren't among them; or so we thought. As a team of researchers, led by Haley Casey from Algoma University in Ontario, Canada, found out, the human brain can actually luminesce. They called these glimpses of light ultra weak photon emissions (UPEs), and they are a result of metabolic energy flow. As electrons degrade during a process known as oxidation, they lose energy and release photons with it. Our brains emit them in visible light, meaning that if we had the X-ray vision to see through each other's skulls in total darkness, we might be able to make out a faint glow. This is not technically bioluminescence; bioluminescent organisms rely on chemicals such as luciferin for their eerie light. It also isn't phosphorescence, which is absorbed energy released in the form of light. It isn't even thermal radiation, which can be seen in infrared and is emitted by anything above absolute zero. UPEs are their own phenomenon, and can be detected from the outside. They can also be indicators of what is going on in the brain. '[UPEs] predict oxidative stress, aging, and neurodegeneration,' Casey and her research team said in their study, recently published in the journal Current Biology.
'UPEs are triggered by neurotransmitters and biophysical stimuli, but they are also generated by cells at rest and can be passively recorded using modern photodetectors in dark environments.' Previous studies found that the human body is capable of glowing, but Casey's team specifically zeroed in on the brain and what these emissions could tell us about brain activity and health. They also were trying to prove that UPE signals from the brain could be distinguished from background photon noise. Subjects wore an EEG cap that had electrodes attached, along with photomultiplier tubes, to monitor brain activity. Photomultiplier tubes are so hypersensitive that they can pick up even the faintest trace of light. What the researchers were testing out was a new technique they devised (still in development) called photoencephalography. There are two major advantages of using photoencephalography over other methods (like PET scans and even less invasive fNIRS and fMRI scans): it is non-invasive, and it is less likely that results will be confused by the test itself. Other methods can either spark neural activity or suppress it, but photoencephalography does neither. As a result, passive measurement of brain activity is undisturbed and allows for detection of electromagnetic stimuli in the surrounding environment. Searching for UPE signals, the researchers focused on the left occipital lobe of the brain (which specializes in visual processing) and the right temporal lobe (which is instrumental to learning and remembering nonverbal information such as music). They were curious as to whether UPE signals from either lobe would show up as distinct from background noise, and when compared to background photons, the signals from the brain did in fact stand out as a result of their unique frequency. In the dark, subjects were also given sound-based tasks to accomplish without needing to see what they were doing. 
They were told to open and close their eyes before and after listening to music. UPEs were logged during tasks done with open and closed eyes, both of which have distinct brain signatures. There were variations in UPE output depending on the task being performed, and the activity detected by the EEG cap was also highly correlated with UPE signals. UPEs could possibly help with diagnosing neurological conditions in the future. 'Because UPEs are related to oxidative metabolism, the most immediately relevant applications might include the detection of budding brain tumors, excitotoxic lesions, mild traumatic injuries, and neurotoxic insults,' Casey said. Photoencephalography won't be replacing MRIs just yet, but it may someday shine a light on what we couldn't see before.
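The core statistical idea, that genuine UPE signals should co-vary with brain activity while background photon noise should not, can be illustrated with a simple correlation check. The numbers below are simulated for illustration and do not come from the study.

```python
# Illustration of the signal-vs-noise test: photon counts that track EEG
# activity correlate strongly with it, while background counts do not.
# All data here are simulated; the study used photomultiplier recordings.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

eeg_power = [1.0, 2.0, 1.5, 3.0, 2.5, 1.2]   # simulated EEG activity
upe_counts = [11, 19, 16, 28, 24, 13]         # photon counts tracking EEG
background = [14, 15, 16, 14, 16, 15]         # uncorrelated background noise

print(pearson_r(eeg_power, upe_counts))   # strongly positive
print(pearson_r(eeg_power, background))   # near zero
```

A high correlation in one channel and none in the other is the kind of evidence that lets a weak optical signal be attributed to brain activity rather than chance.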
Yahoo
23-06-2025
- Health
Can a Laser Replace MRI Scans? One Bold Experiment Says Yes
In a discovery that could reshape how we view and understand the human brain, researchers have successfully passed a laser beam through an entire human head. While it may sound like science fiction, this quiet milestone could pave the way for faster, cheaper, and noninvasive brain scans. The work comes from a team at the University of Glasgow, who set out to push the limits of functional near-infrared spectroscopy (fNIRS). This technology already offers a portable, low-cost way to monitor brain activity, but until now, it's only been able to peek a few centimeters beneath the skull. For anything deeper, expensive MRI machines have been the standard. That may be about to change. By boosting the power of the laser (within safe limits) and improving the light-collection setup, scientists managed to transmit photons from one side of the skull to the other. It worked on just one out of eight participants—a bald man with fair skin—but it proved something previously thought impossible: a beam of light can travel through the entire human head. The implications are huge. If refined, this technology could close the gap between inexpensive tools like EEG and high-resolution, high-cost MRI scans. In the future, diagnosing strokes, brain injuries, or tumors might not require hospital-grade equipment—just a small, light-based scanner. What made this breakthrough even more compelling was the way the light traveled. Instead of bouncing randomly through the skull, photons followed predictable paths, especially through more transparent areas like cerebrospinal fluid. That opens the door to more targeted imaging, where specific brain regions could be scanned with precision. To validate the experiment, researchers used 3D head models to predict photon movement and then compared the results to actual light data. The results aligned, adding credibility to what may one day be a revolutionary shift in brain imaging. It's early days. 
Scanning took 30 minutes, and the conditions were highly specific. But it's a proof of concept with massive potential. For now, it's just one beam of light. In the future, it could be the foundation of a global leap in how doctors see the brain. 'Can a Laser Replace MRI Scans? One Bold Experiment Says Yes' first appeared on Men's Journal on Jun 22, 2025.
Yahoo
22-06-2025
- Health
Scientists Beamed Light Right Through a Man's Head For The First Time
Scientists have developed a new technique for non-invasive brain imaging, and it involves shining light all the way through the head, from one side to the other. Currently the best portable, low-cost method for monitoring the brain is functional near-infrared spectroscopy (fNIRS). Unfortunately, this can only penetrate a few centimeters down, meaning bigger, bulkier MRI machines are needed to probe deeper layers of the brain. A new method, developed by a team from the University of Glasgow in Scotland, expands the sensitivity of fNIRS to shine light all the way through the complex combinations of bone, neurons, and tissue that make up our heads. Doing so required a few tweaks: the researchers increased the strength of the near-infrared laser (within safe boundaries, of course), while also putting in place a more comprehensive collection setup. Even with these adjustments, only a small trickle of photons made it from one side of the head to the other during experiments. However, it's a promising start for portable imaging methods that go deeper, giving us crucial insight into what's happening inside our skulls without opening them up. "These findings uncover the potential to extend non-invasive light-based brain imaging technologies to the tomography of critical biomarkers deep in the adult human head," write the researchers in their published paper. There are quite a number of caveats to mention here. The process was only successful with one out of eight study participants: a man with fair skin and no hair on his head. It needs a very specific setup, and an extended scanning time of around 30 minutes. Those limitations are all acknowledged by the researchers, but they sacrificed certain variables (such as speed) to try and prove that it was possible to get light all the way through a human head via fNIRS, and they succeeded. Computer models based on detailed 3D head scans were used to predict the movement of photons through the skull.
These matched up closely with the actual light collected, adding further credibility to the results. What's more, the research also found that light didn't scatter at random through the head, but rather followed preferred paths, including through parts that were more transparent, like those filled with cerebrospinal fluid. That knowledge could help brain scans be better targeted in the future. "Different source positions on the head can then selectively isolate and probe deep regions of the brain," write the researchers. The advantages of fNIRS are that it's a relatively inexpensive and compact technology. Imagine scans for strokes, brain injuries, and tumors that are more accessible for a wider range of people. As future imaging devices are developed, this research should prove useful for techniques that go deeper into the brain, even if it might be a while before we can get light through the entire head in a timeframe that's practically useful. We know that brain scans have tremendous value in everything from understanding adolescence in youngsters to treating disease towards the end of our lives, so there's a huge amount of potential here. "Optical modalities for noninvasive imaging of the human brain hold promise to fill the technology gap between cheap and portable devices such as electroencephalography (EEG) and expensive high-resolution instruments such as functional magnetic resonance imaging (fMRI)," write the researchers. The research has been published in Neurophotonics.
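The photon-transport modelling described here can be illustrated with a toy Monte Carlo sketch: launch many photons into a simplified layered 'head' and count how many survive absorption all the way through. The layer thicknesses and per-millimetre absorption probabilities below are invented for illustration; the actual study used detailed 3D head scans and measured optical coefficients.

```python
import random

# Toy 1D Monte Carlo photon transport through a layered "head" model.
# A photon survives a layer if it is not absorbed in any 1 mm step.
# All numbers here are invented for illustration only.

LAYERS = [
    ("scalp", 5, 0.10),   # (name, thickness in mm, absorption prob per mm)
    ("skull", 7, 0.08),
    ("CSF",   3, 0.01),   # cerebrospinal fluid is relatively transparent
    ("brain", 80, 0.05),
    ("skull", 7, 0.08),
    ("scalp", 5, 0.10),
]

def transmit(n_photons, layers, rng):
    """Count photons that cross every layer without being absorbed."""
    survived = 0
    for _ in range(n_photons):
        alive = True
        for _name, thickness_mm, absorb_p in layers:
            for _ in range(thickness_mm):
                if rng.random() < absorb_p:
                    alive = False
                    break
            if not alive:
                break
        if alive:
            survived += 1
    return survived

# With these made-up coefficients only a tiny fraction gets through,
# echoing the "small trickle of photons" the Glasgow team measured.
print(transmit(100_000, LAYERS, random.Random(42)))
```

Lowering the absorption of a layer (as with the near-transparent CSF above) raises the transmitted count sharply, which mirrors the finding that photons favour the more transparent routes through the head.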