
AI tech detects hidden heart disease doctors often miss
Now, a new artificial intelligence tool called EchoNext is changing the game. It can flag hidden heart problems that even trained cardiologists miss, just by analyzing a standard electrocardiogram (ECG). That's right. A routine, five-minute heart test you've probably already had could now unlock life-saving information if AI is watching.
Sign up for my FREE CyberGuy Report: Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you'll get instant access to my Ultimate Scam Survival Guide — free when you join my CYBERGUY.COM/NEWSLETTER.
Structural heart disease (SHD) refers to defects in the heart's walls, valves or chambers. Some are present at birth. Others develop slowly over time. Either way, they often go unnoticed until something major happens, like a heart attack or stroke. That's why experts sometimes call SHD the "hidden" heart disease. Here's the problem. Even the ECGs doctors routinely use to spot heart issues often can't detect SHD on their own. And that's where EchoNext comes in.
EchoNext was created by researchers at Columbia University and NewYork-Presbyterian. The tool was trained on over 1.2 million ECG and echocardiogram pairs from more than 230,000 patients. Its job? Figure out when a patient's ECG suggests a deeper look is needed: specifically, an echocardiogram, the ultrasound that gives a detailed view of heart structure.
"EchoNext basically uses the cheaper test to figure out who needs the more expensive one," explained Dr. Pierre Elias, who led the study.
And the results? Stunning.
In tests, EchoNext correctly flagged 77% of structural heart disease cases from ECGs. Cardiologists? Just 64%. Even more impressive, when tested on nearly 85,000 people, EchoNext identified over 7,500 at high risk for undiagnosed SHD. A year later, researchers found that 73% of those who followed up with echocardiograms were indeed diagnosed with SHD, a rate far above average. These groundbreaking results were published in Nature, one of the world's most respected scientific journals. That's not just a better test. That's a potential lifesaver.
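As a rough back-of-the-envelope check, the headline numbers above can be restated in a few lines of code (the counts are rounded figures from this article, not the study's raw data):

```python
# Illustrative arithmetic only, using the rounded figures quoted above.
echonext_detection = 0.77      # share of SHD cases flagged from ECGs
cardiologist_detection = 0.64  # share flagged by cardiologists on the same test

extra_caught = (echonext_detection - cardiologist_detection) * 1000
print(f"Extra SHD cases caught per 1,000 true cases: {extra_caught:.0f}")

flagged = 7500                 # high-risk flags out of roughly 85,000 ECGs screened
screened = 85_000
confirmed_rate = 0.73          # of those who got the follow-up echocardiogram
print(f"Flag rate: {flagged / screened:.1%}; confirmed SHD among follow-ups: {confirmed_rate:.0%}")
```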
Millions walk around every day with SHD and have no idea. They skip heart screenings because nothing feels wrong. Even when they do get an ECG, subtle warning signs can slip by unnoticed. EchoNext catches far more of them. And it doesn't get tired or distracted. This isn't about replacing doctors. It's about giving them a powerful new tool to catch what humans alone can't.
You don't need to wait for symptoms to take your heart health seriously. If you've ever had an ECG, or you're getting one soon, AI could now help spot hidden risks your doctor might miss. Tools like EchoNext make it easier to catch heart problems early, even if you feel fine.
This means fewer surprises, fewer missed diagnoses and a better shot at treatment before it's too late. It's not about replacing doctors. It's about giving them, and you, a better shot at catching problems early. Ask your doctor if AI tools are being used to review your heart tests. You deserve every advantage. Right now, EchoNext is being used in research settings, but tools like it are quickly moving toward wider use in hospitals and clinics.
AI is no longer the future of medicine. It's happening now. EchoNext proves that machine learning can radically improve how we detect silent killers like SHD. If a simple ECG plus AI could save thousands of lives, what are we waiting for?
Would you trust a machine to catch what your cardiologist might overlook? Let us know by writing us at Cyberguy.com/Contact
Copyright 2025 CyberGuy.com. All rights reserved.
Related Articles


Fast Company
Computers were designed for humans, robots should be too
In 1982, personal computers were beige, boxy, and built for engineers. They were powerful, but uninviting. Few people knew what they were for, or why they might need one. It took more than just better processors to turn computers into objects of mass adoption; it took design. Design transformed the computer from an obscure tool into an essential companion. It gave form to possibility. It helped people trust, understand, and eventually fall in love with machines. Apple ushered in a new era of human-centered computing when it released the Macintosh in 1984.

Design is the next frontier in robotics

Jacob Hennessey-Rubin, executive director of NY Robotics, a non-profit that serves as the hub for robotics innovation in the New York and Tri-State region, recently told me that robotics is having its 1982 moment. Like the early personal computer, robots have the processing power but not the consumer-friendly design.

The technology is here. Robots can move, perceive, and make decisions in complex environments. AI enables them to generalize across tasks, understand natural language, and collaborate with humans. The building blocks are ready, but the experience is not.

Most robots still resemble their industrial ancestors: articulated arms from factory floors or the humanoid silhouettes of science fiction. These platforms have enabled impressive mechanical advances and were created to operate effectively in human-built environments. They've also helped us imagine what robots could be, and they remain valuable in many contexts. But as robotics expands into new domains, from home to healthcare to creative expression, we have the opportunity to grow our design vocabulary and reimagine how these systems take shape across new environments and experiences.

Robotics has long been seen as an engineering challenge. But the next breakthrough won't be technical; it will be experiential. To unlock the next wave of adoption, robots must be designed not just to work, but to live alongside us, in our homes, hospitals, restaurants, and stores. Our approach must be mindful of how their presence shapes the wind-down routine, the medical procedure, the restaurant kitchen, or the customer interaction, and intentional in shaping both their actions and their stillness. If robots are to be part of the messy, meaningful, deeply human moments that define our days, they must earn their place through thoughtful, human-centered design. Design is how we help robots fit into life. We believe these five design principles will define robotics' next era.

Design for context

Robotic forms should emerge from context, not from the machine. Their expression, movement, and interaction style should reflect and shape the space and culture. In practice, they might borrow cues from furniture, chef's tools, classic cars, or other familiar elements—not to mimic them, but to evoke emotion, convey meaning, and set the emotional tone. In a social setting, a robot might use animation techniques to express its state gently and intuitively; in a clinic, it might signal clarity and precision through forms reminiscent of medical instruments.

Match capability with expression

When a robot looks too human, we expect too much. And when it looks too mechanical, we trust too little. The sweet spot? Forms that are honest, clear expressions of their true capabilities. Trust begins with legible forms, which lead to more open and meaningful engagement.

Design for natural interaction

Great interaction goes both ways. Robots shouldn't just perform tasks; they should communicate intent and respond to ours with clarity. Movements, gestures, lights, and sounds should feel intuitive and appropriate, helping people understand what a robot is doing and how to engage with it. At the same time, today's robots are better equipped than ever to understand us: they recognize our actions, focus, and even unspoken cues. This opens the door to more natural, multimodal interaction, where people can use voice, touch, gesture, or even demonstration, depending on what feels most intuitive. Instead of rigid commands, we can teach robots by showing them how we would do it ourselves. When communication flows both ways, robots feel less like machines and more like capable collaborators.

Design for collaboration

Collaboration begins with coexistence. As robots become part of our environments, we must design them to move with our rhythms, respond to our cues, and respect their context. The most impactful robots will work with us, even when they replace aspects of what we do. We must also consider how they collaborate among themselves, adding to the ecosystem and accomplishing more together rather than competing with it.

Automate the drudgery, not the joy

As author Joanna Maciejewska quipped: "I want my AI to do my laundry and dishes so I can do art and writing, not the other way around." Let's preserve the things that make us feel human and automate the things that make us feel like machines.

From machines to cohabitants

When robots stop being tools and start becoming cohabitants, everything changes. We start to ask new questions: How might the presence of robots reshape our sense of space and privacy? What rituals are worth preserving as appliances become more intelligent? What new roles can robots play beyond utility and across care and companionship? What design languages do robots demand, in form, tone, and gesture?

The answers won't come from code alone. They'll come from design. Just as Apple redefined computing through design, the teams that rethink the why of robotics, not just the how, will lead the next wave of human-centered robotics, made for everyday life—not in the distant future, but right now. We're already living with robots. It's time we start designing like it.
Yahoo
JPMorgan, Coinbase Agree to Link Bank, Crypto Accounts
JPMorgan Chase and Coinbase Global signed an agreement to directly link customers' bank accounts to their cryptocurrency wallets. Paige Smith reports on "Bloomberg Markets."
Yahoo
GE HealthCare projects reduced tariff expense
This story was originally published on MedTech Dive. To receive daily news and insights, subscribe to our free daily MedTech Dive newsletter.

By the numbers:
Q2 revenue: $5 billion, up 3% year over year
Net income: $486 million, up nearly 14% year over year

GE HealthCare lowered the expected impact from tariffs on its financial outlook Wednesday and said customer demand for capital equipment remained healthy in the second quarter. The medical imaging company boosted its forecast for organic revenue growth in 2025 to 3%, up from a range of 2% to 3%. However, shares in the company fell nearly 8% to end at $71.64 on the New York Stock Exchange, as investors focused on slower-than-expected order growth and a delayed market recovery in China.

J.P. Morgan analysts, in a Wednesday note to investors, said second-quarter order growth of about 3% year over year was below the 5% that investors wanted, and the outlook for China was disappointing. GE HealthCare CEO Peter Arduini told analysts on the earnings call that the situation in China continues to evolve. 'I'd say China, we're seeing activity pick up, but the market recovery is taking a little bit longer,' said Arduini. 'We think the longer-term outlook will be positive just based on the size of the country.'

Still, with greater clarity from deals struck between the U.S. and global trade partners, GE HealthCare now expects a 45-cent tariff impact to adjusted earnings per share for the full year, roughly half of the 85-cent hit it projected last quarter. The reduction is similar to revised outlooks from other large medical device makers, such as Boston Scientific and Johnson & Johnson, which also halved their tariff impact forecasts. GE HealthCare raised its full-year outlook for adjusted EPS to a range of $4.43 to $4.63, from the previous forecast of $3.90 to $4.10. The improved outlook follows the company's steep earnings guidance cut in April.

For 2026, tariff expenses are expected to be below the 45 cents forecast for 2025 due to mitigation efforts that include supply chain restructuring actions and selective price increases. CFO Jay Saccaro, on the earnings call, said the company is working on longer-term changes to the supply chain, such as investing in more local manufacturing and shifting capacity within supplier networks to more tariff-friendly locations. 'As we've seen these trade deals shape up, we're now in a position to begin to execute on some of these, which we'll do in the second half of the year, and then those will benefit 2026,' Saccaro said.

Capital equipment investment

Republicans' 'Big Beautiful Bill' has sparked concerns among healthcare investors that patient volumes could soften if people lose insurance coverage, with the trend in turn pressuring hospital budgets. GE HealthCare, however, is seeing a 'robust' capital environment and healthy procedure growth, said Saccaro, with customers continuing to invest in innovative imaging equipment. In the U.S., hospitals are replacing an aging installed base of equipment, after the replacement cycle was paused during the COVID-19 pandemic, said Arduini. Further driving demand is a need for imaging to support treatment advances in areas such as electrophysiology and pharmaceuticals that require more follow-up. Hospitals also want equipment that helps increase productivity. 'It's difficult for U.S. hospitals to get staffed, and so equipment that moves the patient swiftly through the institution with a high-quality diagnosis is a very important asset,' said the CEO.
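As a quick, illustrative back-of-the-envelope check using only the per-share figures quoted above, the tariff relief accounts for most, but not all, of the raised guidance:

```python
# Rough sanity check on the guidance math, using only figures from the story.
old_tariff_hit, new_tariff_hit = 0.85, 0.45   # expected EPS impact, dollars per share
old_guidance = (3.90, 4.10)                   # prior adjusted-EPS range
new_guidance = (4.43, 4.63)                   # raised adjusted-EPS range

tariff_relief = old_tariff_hit - new_tariff_hit
midpoint_raise = sum(new_guidance) / 2 - sum(old_guidance) / 2

print(f"Tariff relief: ${tariff_relief:.2f} per share")
print(f"Guidance midpoint raise: ${midpoint_raise:.2f} per share")
print(f"Implied improvement beyond tariffs: ${midpoint_raise - tariff_relief:.2f} per share")
```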