The next wearable tech could be a face 'tattoo' that measures mental stress
Researchers are developing a digital "tattoo" that measures "mental workload."
The "e-tattoo" is meant for workers with high-risk jobs like air traffic controllers.
It's thinner and lighter than older EEG models.
Tattoos can reveal a lot about a person.
One day, they could even reveal a person's brain waves.
At least, that's the goal for researchers at the University of Texas at Austin.
"This tattoo is not like a normal tattoo," Nanshu Lu, a professor in the university's Department of Aerospace Engineering and Engineering Mechanics, told Business Insider.
Lu worked alongside engineering professor Luis Sentis and others to develop an ultra-thin, wireless wearable device that she compared to a temporary tattoo sticker. The non-invasive device, applied to the skin with an adhesive, measures brain activity and eye movement to gauge an individual's mental workload.
Lu said the device is intended for people working in high-stakes or high-demand jobs, such as aircraft pilots, air traffic controllers, drone operators, and robot teleoperators. These jobs can be considered high-stress because they demand sustained focus and quick thinking, and leave little margin for error.
"Technology is developing much faster than human evolution, so it is very hard to keep up with the technological demand in modern jobs," she said. "Therefore, it's very important not to overload the operator because not only would that jeopardize the outcome of the mission, it would also harm their health."
Although the idea of an electronic tattoo might sound like something out of a sci-fi novel, humans and technology have been melding for a while.
Nearly 40 years ago, for example, the first fully digital hearing aid became available to the public, according to the Hearing Health Foundation. And now, Elon Musk is embedding computer chips into people's brains through his company, Neuralink.
Recent wearable tech inventions include smartwatches, Bluetooth headphones, VR headsets, and fitness trackers, to name a few. Some health-conscious folks invest in wearable technology like the Oura Ring, which collects data on everything from sleep activity to body temperature.
However, those devices don't analyze brain activity and eye movement.
"Over the years, we developed a series of non-invasive skin conformable e-tattoos to measure cardiovascular health, then measure the mental stress from the palm sweating," Lu said of previous devices she helped develop. "Now, finally, we move on to the brainwave."
Lu said the device certainly isn't the first or only EEG sensor on the market, but it's smaller and lighter than previous models. Conducting an EEG test typically requires attaching electrodes to an individual's full scalp with a gel or paste to collect data, which can be time-consuming.
Researchers at the University of Texas at Austin are attempting to streamline that process by showing that mental workload can be measured from the forehead alone. During lab tests, participants completed memorization drills involving a screen with flashing boxes.
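The signal processing behind a forehead-only workload estimate is conceptually simple. One widely used (and here purely illustrative — the article does not describe the team's actual pipeline) proxy is the ratio of frontal theta-band (4–8 Hz) to alpha-band (8–12 Hz) EEG power, since frontal theta tends to rise with mental load. A minimal sketch with NumPy and SciPy, assuming a single forehead channel sampled at 250 Hz:

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Integrated power of `signal` in the [low, high) Hz band via Welch PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs < high)
    return np.trapz(psd[mask], freqs[mask])

def workload_index(eeg, fs=250):
    """Theta/alpha power ratio -- a common, simplified workload proxy.

    `eeg` is a 1-D array from a single forehead channel; relatively more
    frontal theta is often associated with higher mental workload.
    """
    theta = band_power(eeg, fs, 4.0, 8.0)
    alpha = band_power(eeg, fs, 8.0, 12.0)
    return theta / alpha

# Synthetic demo: a noisy theta-dominant (6 Hz) trace scores higher than
# a noisy alpha-dominant (10 Hz) one.
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
theta_heavy = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)
alpha_heavy = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(workload_index(theta_heavy, fs) > workload_index(alpha_heavy, fs))  # True
```

Real systems layer artifact rejection (eye blinks, muscle noise) and a trained classifier on top of such band-power features, but the core computation is this small.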
"In the past, there was no way to objectively assess mental workload. The subjects have to finish the test and then a questionnaire," Lu said. "But in the future, if we can implement some lightweight machine learning model directly on a microprocessor in the Bluetooth chip on e-tattoo, then yes, our goal is to one day be able to do it in real time."
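Running a model "directly on a microprocessor in the Bluetooth chip" is plausible because inference for a small classifier reduces to a handful of multiply-adds. A hypothetical sketch (the weights and the two-feature design are invented for illustration, not taken from the researchers' work):

```python
import math

# Hypothetical trained weights for a tiny logistic-regression classifier
# mapping two band-power features (theta, alpha) to P(high workload).
# A model this size costs a few multiply-adds per prediction, so it runs
# easily in real time on a BLE microcontroller.
WEIGHTS = [1.8, -1.2]   # invented for illustration
BIAS = -0.3

def p_high_workload(theta_power, alpha_power):
    """Probability of high mental workload from two band-power features."""
    z = WEIGHTS[0] * theta_power + WEIGHTS[1] * alpha_power + BIAS
    return 1.0 / (1.0 + math.exp(-z))
```

The heavy lifting (training the weights) happens offline; only this cheap inference step needs to live on the wearable, which is what makes real-time, on-device scoring realistic.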
A report by Grand View Research said that the global wearable technology market was worth $84.2 billion in 2024 and is expected to reach $186.14 billion by 2030, underscoring consumers' willingness to integrate technology into their everyday lives.
However, electronic tattoos won't be commercially available anytime soon. Lu and her fellow researchers are still testing and refining the technology.
Still, she can imagine a world where the e-tattoo is used by people not employed in high-stakes jobs.
It could be used by "people who would like to focus as well as people who want to meditate to see if they are truly relaxed," Lu said.
Read the original article on Business Insider