
Latest news with #EmteqLabs

I let smart glasses read my emotions and watch what I eat — and now I can't unsee the future

Tom's Guide · Health · 2 days ago

I believe the next great fitness wearable will not be a smartwatch or smart ring — it will be glasses. I saw this for myself when trying a prototype of the new eyewear Emteq Labs is keen to launch next year. Sporting sensors all around the rims, it can detect the subtlest of changes in your facial expressions (even those you aren't consciously aware of making). Paired with AI, this data can turn the glasses into a personalized life coach for your fitness, your diet and even your emotional health. I put this to the test in my time talking to Emteq CEO Steen Strand, to see what the glasses can truly bring to the table for the average user and what the future holds.

At the core of Emteq's glasses is a series of nine sensors that can identify facial movements to a near-microscopic degree. They're dotted across the bottom of the lenses in these prototypes, and paired with AI they deliver a personalized set of specs that can sense you. Of course, there are plenty of fascinating use cases for these, such as using your face to interact with a computer, or adding more true-to-life emotion to your in-game character. But the one that jumped out at me is health — not just physical health but emotional health.

Currently, health tracking via consumer tech is limited to your fitness routines — filling in Apple Watch rings and checking your sleep. That's all well and good, but as I've learned in my journey of losing 20 pounds, good nutrition is just as important. And while there are apps like MyFitnessPal that can deliver effective nutritional information, none come close to Emteq's prototype setup for ease of use and actionable detail.

Using GPT-4o, the on-board camera takes a snap of what you're eating and breaks it down into total calories and detailed macros. On top of that, it will even give you a chewing score… yep, you read that correctly. Digestive issues and impacts on metabolic health can creep up if you chew too fast, so it's important to take your time. The sensors on your glasses track biting and chewing speeds to ensure you don't become too much of a food hoover.

'We can use AI to give you custom personalized guidance — some of that actually in real time,' Steen added. 'We have high-fidelity information about how you're eating and what you're eating, and are already using haptic feedback for in-the-moment notifications.'

And with their ability to track activity too — recognizing different exercises such as walking, running and even star jumps — this can all come together with the AI infusion to give you a far better understanding of your fitness levels.

Then there's the emotion-sensing piece of the puzzle. Up until this point, it's all been very surface level — prompts to fill in a journal, heart rate tracking to detect stress, or practicing deep breathing exercises. All nice-to-haves, but beyond the big issue that people can simply lie to their phones, nothing has really gone deeper. Beyond accurately assessing eating behaviors throughout the day, other data points can be used to assess emotional context, such as mood detection and posture analysis.
While I'm able to fake a smile, the upper section of my face and forehead gave me away in the moment. And when you tap into the ever-growing popularity of people using ChatGPT for emotional support and therapy, you're surely going to get a more personalized, more frank conversation when this data is added in there too.

'We believe that understanding emotions is a force multiplier for AI, in terms of it being effective for you in the context of wearing glasses all day,' Steen commented. 'If you want AI to be really effective for you, it's critical that it understands how you're feeling in real time, in response to different things that are happening around you.'

It sounds creepy on paper, and it kind of is when you think about it. But it's certainly a gateway into real emotional honesty that you may not get by rationalizing with yourself in a journal app and possibly glossing over any cracks in your mental health when filling out that survey for the day.

Now this may all sound fascinating (I think it is too), but I'm not ignorant of the key questions that come with strapping a bunch of sensors to your face: the privacy concerns around a device grabbing so much data, and whether we really want to be judged on our chewing. Privacy is a fair question to ask of any device that collects this much information, and the chewing question gets asked of every big step forward like this. But the end result is something so much more advanced than a smart ring, and much more proactive.

Here at Augmented World Expo (AWE), I found a breadcrumb trail of things that could lead to the smart glasses of the future that everyone will wear. Emteq is probably the biggest crumb of them all, because while AI is definitely the key to unlocking XR, personalizing it is the real challenge. Sensors and real-time data collection like this, helping guide you toward a better life, are the clearest step towards tackling that challenge.
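To make that snap-a-photo flow concrete, here is a minimal sketch of how a meal photo could be turned into a calorie and macro estimate with GPT-4o. This is my own illustration, not Emteq's implementation: the prompt wording, the JSON field names, and the `estimate_macros` helper are all assumptions, and it uses the public OpenAI Python SDK rather than whatever actually runs on the glasses.

```python
# Illustrative sketch only (not Emteq's code). Assumes the OpenAI Python SDK
# (pip install openai) and an OPENAI_API_KEY set in the environment.
import base64
import json

from openai import OpenAI

client = OpenAI()

def estimate_macros(image_path: str) -> dict:
    """Send a meal photo to GPT-4o and ask for a rough calorie/macro estimate."""
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": ("Estimate this meal's total calories and macros. "
                          "Reply as JSON with keys: calories, protein_g, "
                          "carbs_g, fat_g.")},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
        # Ask the model to return a single JSON object we can parse directly.
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)

# Hypothetical usage: print(estimate_macros("lunch.jpg"))
```

Treat the numbers such a call returns as rough estimates: vision models infer portion size from a single photo, which is exactly why Emteq pairs the camera with its chewing and bite-count sensors for the behavioral side of the picture.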

Beyond Ray-Ban Meta: Smartglasses may be the future of health tracking

Yahoo · Health · 10-03-2025

'Existing wearables like Fitbit give us continuous daily data about how active we are, and we have the same thing for sleep. The next pillars of wearable health tracking are emotional and dietary health. We believe there's a huge opportunity to deliver a lot of benefits for people through these.'

That's how CEO Steen Strand introduced Emteq Labs, a company taking a refreshingly different approach to smart eyewear. But how do smartglasses know how you're feeling and what you're eating? I found out, and was amazed by the accuracy of the tech, and excited by the potential.

'Health applications drove adoption of the Apple Watch,' Strand reminded me as we talked over Zoom, 'and we want to move smart eyewear beyond what we can already do on a smartphone.' Emteq isn't building a Ray-Ban Meta clone; instead, when developing its Sense smartglasses, it asked the question: what can these glasses do that no other product can?

Strand was joined on our Zoom call by Emteq Labs' founder, Dr. Charles Nduka, who gave some insight into how the company came about. 'I'm a facial reconstructive surgeon,' Nduka explained, 'and it was trying to solve the problem of how to understand how people were expressing emotion between clinical appointments that started this journey, culminating in developing smart eyewear that measures expressions. Initially a research and development project, it gained this tremendous momentum, so we're now looking at fundamental issues around daily life, mental health, metabolic disease, and dietary function.'

Smartglasses like the Ray-Ban Meta and the Solos AirGo Vision have outward-facing cameras, and normal corrective spectacles simply improve our vision. Emteq Labs' platform looks inward, examining our facial movements using sensors to give us incredible insights normally unavailable outside of a laboratory environment. The two areas demonstrated to me were mood and eating habits.

'We know people who are depressed move with lower amplitude, have a head-down posture, don't smile as much, and tend to make more negative expressions,' Nduka explained. He said there's only one muscle that you use to move your eyebrows upward, and only one muscle group that moves your cheeks, and to 'see' them Emteq Labs' Sense smartglasses don't use a camera, but lensless sensors that measure the movement and texture of your skin and those muscles. It was uncannily accurate in the demonstration I watched: the platform replicated Strand's facial expressions in real time while he wore the smartglasses, all without a camera.

Anyone who has worn a smartwatch like the Apple Watch or a smart ring like the Oura Ring will have seen stress measurements already. Strand explained why your face is a far better indicator of the stress you're under than sensors on your wrist or finger. 'Not only are you not getting feedback, but the data is a little misleading,' Strand said. 'It says I'm stressed because my heart rate is elevated, but I may just be exercising. Without seeing the face, it's really hard to tell why someone is stressed. You need additional insight to know what's going on with someone emotionally that you just can't get from the finger, wrist, or phone. These are ways to collect a great quantity of data, but not great quality data. With our smart glasses, we think we can get both.'

Emteq Labs' software interpreted the data in a far more extensive and informative way than the Oura Ring, for example.
The Oura Ring will often inform you if the day has brought more stress than usual, but it's unable to interpret why, basing its information only on data collected by its heart rate, body temperature, and other sensors. The Emteq Labs software shows when you were stressed or upset using your expressions, and charts it through your day. Historical data will show mood improvements or deteriorations, and how much of your day was spent feeling positive or not. It's fascinating data, and the potential for use in mental health diagnosis and treatment is obvious.

The sensors in Emteq Labs' smartglasses can also be used to help understand eating habits, and potentially help people change their diets and even lose weight. 'The temple sensors measure jaw movement in real time,' Nduka told us. 'We can measure chewing rate, which is a whole new metric that hasn't been possible before, and just that metric itself will inform about calorie ingestion. This can help predict weight gain, because if you eat calorie-dense foods quickly, you have to do more to burn them off.'

Strand built on why this is important. 'It's a small piece of data,' he said, 'but from it we know when you ate, how long you ate for, whether you had snacks or three square meals. We know if you ate quickly or slowly, and how many bites you took. There's decades of research showing the relationship between all these points and your overall metabolic health.' This has the potential to help people identify where they're going wrong with diet and eating, and provides actionable data to help change habits.

The platform can distinguish with 95% accuracy whether you're eating, or laughing, talking, or singing. It means the glasses will understand the difference between chewing food and the flow of a conversation over dinner. The breadth and granularity of the data shown is astonishing. The temple sensors chart movement on an X and Y axis, and even show how the kind of food you're eating affects the way you chew; as the food is chewed down, the graph changes.

Nduka explained why this matters. 'Dietitians know if they can get people to slow down eating, it means the stomach doesn't stretch so much, they won't need to eat so much next time to feel full, and the brain will better understand fullness. It means people will lose weight.'

While the Sense smartglasses collect data using sensors, there is a small camera on the outside, and the software can prompt you to take a photo of your food for the AI to assess its calorie count, removing the friction of using a food tracking app on your phone. Research showed Emteq Labs that a precise calorie count isn't as important when you also have data on what, when, and how people are eating. Finally, the glasses can also track activity, which adds an additional data point for those looking to improve their overall health, as you'll get a tick in the box marked 'take an after-dinner walk.'

'We've got this unique data set that's continuous, in real time, across your entire day,' Strand emphasized. 'Right now this data is really only available in a lab. If you want an understanding of someone's emotions or their eating habits, you need a picture of them across the whole day, and that's what we're unlocking here.'
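As a rough illustration of how a chewing-rate metric could be derived from a jaw-movement trace like the one those temple sensors produce, here is a small sketch using simple peak detection. Everything in it is assumed for the example (the sampling rate, the peak thresholds, the synthetic signal, and the `chews_per_minute` helper); it is not based on Emteq's actual pipeline.

```python
# Illustrative sketch only; parameters and signal are assumed, not Emteq's.
import numpy as np
from scipy.signal import find_peaks

def chews_per_minute(jaw_signal: np.ndarray, sample_rate_hz: float) -> float:
    """Estimate chewing rate by counting peaks in a jaw-movement trace.

    Each chew shows up as one open/close cycle, i.e. one peak in the signal.
    """
    # Require peaks to stand out from noise and be at least 0.2 s apart
    # (a plausible upper bound of ~5 chews per second).
    peaks, _ = find_peaks(
        jaw_signal,
        prominence=0.5 * np.std(jaw_signal),
        distance=int(0.2 * sample_rate_hz),
    )
    duration_min = len(jaw_signal) / sample_rate_hz / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0

# Toy example: a synthetic 1 Hz "chewing" oscillation sampled at 50 Hz.
fs = 50
t = np.linspace(0, 60, 60 * fs)
signal = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.randn(t.size)
print(f"~{chews_per_minute(signal, fs):.0f} chews/min")  # prints roughly 60
```

The real system would also need the 95%-accurate eating-versus-talking classification described above before counting peaks, so that conversation over dinner doesn't register as chewing.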
The hardware Emteq Labs demonstrated uses a camera and two small sensors on the arms for emotional tracking, and Strand estimated the company needs about another two years to get that hardware down to the size of the Ray-Ban Meta smart glasses. The sensors and technology required to track eating and diet, however, already fit in smart glasses sized like the Ray-Ban Meta today.

'What we have now is a closed developer kit,' Strand explained, 'and we're working with outside partners in diet and emotional health to see how we can supply the data they need. In some research applications it doesn't matter what the glasses look like, but when it's in real life, wearability and looks matter a lot. For those cases we're exploring partnerships to build hardware together. We see Sense as a platform that has many different applications. Even for sentiment measurement in situations like people viewing pre-release movies. We're trying to stay at a platform level, but we're excited about the diet aspect, and we're weighing up different opportunities around that.'

Whatever the next step is for Emteq Labs' intriguing, unique, mood-and-food tracking smart eyewear platform, it's fantastic to see a company explore opportunities beyond making another smartphone alternative for your face.
