Smartwatches may be getting your stress levels wrong, study finds

The Star | 3 days ago
A new study suggests that when it comes to gauging your psychological state, your wearable might be getting it wrong. — Pixabay
For many health-conscious users, a smartwatch is more than just a fitness tracker – it's a daily companion for monitoring everything from heart rate to sleep patterns and stress levels. But a new study suggests that when it comes to gauging your psychological state, your wearable might be getting it wrong.
Published in the Journal of Psychopathology and Clinical Science, the study examined nearly 800 university students wearing the Garmin Vivosmart 4. Participants regularly reported their own emotional states, which were then compared with the stress data recorded by their devices. The result was a striking lack of alignment.
"For the majority of individuals in our sample, we found that self-report and physiological measures of stress show very weak to no associations," the authors wrote. "These results raise several questions about differences between data sources and potential measurement issues."
Garmin promotes the Vivosmart 4's stress-tracking feature, which uses heart rate (HR) and heart rate variability (HRV) data to produce a score from 0 to 100. But the company acknowledges that interpreting stress isn't straightforward.
On its website, Garmin notes that both public speaking and running up stairs can raise heart rate, but for very different reasons. It recommends wearing the device consistently, particularly during sleep, to improve accuracy.
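Garmin does not publish the formula behind its 0-to-100 score, so any reconstruction is speculative. Purely as an illustration of one common HRV-based approach – not Garmin's actual algorithm – the sketch below computes RMSSD from beat-to-beat (RR) intervals and maps it against a personal resting baseline, reading lower-than-usual HRV as higher stress; the baseline, floor and scaling values are assumptions for demonstration.

```python
import numpy as np

# Hypothetical illustration only: Garmin's real algorithm is proprietary.
# One common wearable heuristic: compute heart rate variability (RMSSD) from
# beat-to-beat (RR) intervals and map it, relative to a personal baseline,
# onto a 0-100 scale, with lower-than-usual HRV read as higher stress.

def rmssd(rr_intervals_ms: np.ndarray) -> float:
    """Root mean square of successive differences of RR intervals (ms)."""
    diffs = np.diff(rr_intervals_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

def stress_score(rr_intervals_ms: np.ndarray,
                 baseline_rmssd_ms: float,
                 floor_ms: float = 10.0) -> float:
    """Map current RMSSD onto 0-100: at/above baseline -> ~0, near the floor -> ~100."""
    current = rmssd(rr_intervals_ms)
    span = max(baseline_rmssd_ms - floor_ms, 1e-6)   # avoid division by zero
    score = 100.0 * (baseline_rmssd_ms - current) / span
    return float(np.clip(score, 0.0, 100.0))

# Toy example: RR intervals near 800 ms with little beat-to-beat variability,
# against an assumed resting baseline RMSSD of 45 ms, yields a high score.
rr = np.array([802, 805, 800, 798, 803, 801, 799, 804], dtype=float)
print(stress_score(rr, baseline_rmssd_ms=45.0))
```

Because both a tense meeting and a flight of stairs push a rule like this upward, the quality of the personal baseline does much of the work – which is one reason for the advice to wear the device consistently, including overnight.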
The study adds to a growing body of research questioning the reliability of wearable stress metrics. While the Garmin device underperformed on stress measurement, researchers found it performed well in other areas. Sleep tracking proved highly accurate, although its link to self-reported tiredness was weaker.
For now, the findings suggest that while a smartwatch can be a handy tool for tracking fitness or sleep, its "stress" alerts may need to be taken with a healthy dose of scepticism. – Khaleej Times, Dubai, United Arab Emirates/Tribune News Service

Related Articles

Opinion: Jim Acosta's AI interview raises deeper questions about human connection

The Star | 2 days ago

Grief is different for everyone, and how people grieve is evolving along with technology. — Pixabay
From Facebook to FaceTime, it is now easier than ever to stay connected with friends and family members. Thanks to technology, I can FaceTime with my parents, send TikTok videos and share photos of my dog with friends with a few clicks. But what happens when we use technology to virtually resurrect the dead and allow an avatar to speak on behalf of the deceased in a video interview sharing a political viewpoint?
That question came to the forefront when former CNN White House correspondent Jim Acosta 'interviewed' an artificial intelligence avatar of Joaquin Oliver, a teenager killed in the 2018 Parkland high school shooting. In the video, the avatar used a chatbot to generate answers in a voice that supposedly sounded like the boy. Acosta said the boy's father had approached him to do the piece as a way of keeping Joaquin's memory alive.
The interview sparked backlash and raised ethical concerns over technology's potential role in tarnishing the memory of the dead or changing their viewpoint. In this case, the Joaquin avatar advocated for 'stronger gun control laws, mental health support and community engagement.' Acosta's interview also raises a larger question: Is AI helping us connect or just simulating human connection while we become more disconnected?
Grief is different for everyone, and how people grieve is evolving along with technology. Four years ago, I read about Joshua Barbeau, a man who lost his fiancée to a rare liver disease. He used Project December – a chat website that simulates a text-based conversation with anyone, including someone who is dead – to communicate via chatbot with an AI version of his deceased fiancée.
Traditionally, people processed grief through therapy or with the support of trusted friends or family members. Today, programs like ChatGPT are being used as therapists, for friendships and, in some cases, as romantic partners. As Derek Thompson wrote in The Atlantic in February, 'Americans are spending less time with other people than in any other period for which we have trustworthy data, going back to 1965.'
Isolation isn't accidental. Many people keep their phones on silent, prefer texting to calling and spend hours doomscrolling. Now, AI can simulate people with avatars. So when grief feels heavy, there is a program to help.
When my grandmother died, the grief felt unbearable. Fifteen years later, when I talk about her, the loss still tugs at my heart. There isn't a day that I don't wish she was still here. I keep her memory alive without using AI, but I don't judge how others grieve.
Grief is intimate – and that's what makes Joaquin Oliver's AI interview so eerie. It raises emotional questions. Is AI helping a family grieve? Was it a family's attempt to give their son a voice in a country that failed to protect him from gun violence? Maybe it's both.
Human emotions can be messy. But what makes it all bearable is human connection – holding space for others in tough times – something technology can't replace. A chatbot can't actually resurrect the dead. It's a mirror of memories, reflecting our own words and thoughts.
In the end, as the San Francisco Chronicle reported, Barbeau – the man who lost his fiancée – 'felt like the chatbot had given him permission to move on with his life in small ways, simply by urging him to take care of himself.' Perhaps that's the lesson.
Technology can offer us tools for processing grief and maintaining memories, and maybe even give us permission to move on. But AI can't hug you or laugh at inside jokes. It won't sit next to you in silence when the world feels heavy. The loss of my grandmother still hurts. No amount of technology can bring her back. That's part of life's beauty – to love something death can touch. We carry those we've lost not through digital simulations, but by sharing memories and stories with others. The danger isn't just that AI will replace human connection – it's that we may settle for it. – Miami Herald/Tribune News Service

US city officials to use AI cameras to check recycling bins. Here's why

The Star | 5 days ago

The cameras are trained to identify items like certain plastic bags that can't be recycled but over time will learn to identify new contaminants, according to a news release from the city. — Pixabay
Tacoma officials will use an artificial intelligence-powered camera in a new pilot program that will identify contaminated items in the city's curbside recycling program to educate residents about what can and can't be recycled.
The two-year program is funded with a US$1.8mil (RM7.62mil) grant from the Environmental Protection Agency's Recycling Education and Outreach Grant Program, part of the city's effort to reduce contamination in Tacoma's residential recycling.
The cameras are trained to identify items like certain plastic bags that can't be recycled but over time will learn to identify new contaminants, according to a news release from the city.
Preston Peck, a sustainability analyst with the city, said that when the cameras identify contaminated items in recycling bins, residents at the location will receive postcards educating them about the incorrectly placed item. The postcards will include images of the contaminated material.
'A few (postcards) will need to be more generic when a specific recycling bin cannot be confirmed to be associated with a specific customer during the review process,' Peck told The News Tribune in an email.
The technology comes from Prairie Robotics, a Canadian company that has implemented such technology 'across North America,' according to its website.
Peck said the cameras will focus on documenting the items collected in the truck and not people or private property. The data will be stored 'securely and safely' in the United States and will not be sold or shared with third parties, he said.
'Any images that inadvertently include faces or license plates are automatically blurred before the image is uploaded to protect privacy,' Peck said. 'Residents will only see items identified as contamination in their recycling bin on their postcards and everything else will be blurred.'
Peck said only one truck is currently using the cameras, covering recycling pick-up routes across the city. The program will be rolled out in phases over the next year to seven recycling trucks and will continue until the grant's two years are up, around June 2027. – The Peninsula Gateway (Gig Harbor, Wash.)/Tribune News Service
