Latest news with #RaymondMak
Yahoo
23-05-2025
- Health
- Yahoo
What If a Selfie Could Predict Your Life Expectancy?
What if a photo could tell you more about your health than your last check-up? A groundbreaking study from Mass General Brigham, published in the journal The Lancet Digital Health, recently introduced FaceAge, an AI tool that estimates a person's biological age from a simple photograph.

'Doctors, myself included, still rely on the eyeball test—a split-second judgment of whether a patient looks robust or frail,' said Dr. Raymond Mak, a radiation oncologist and the director of clinical innovation for his department at the Dana-Farber Cancer Center, the faculty leader in AI implementation for the artificial intelligence in medicine (AIM) program at Mass General Brigham and an associate professor at Harvard Medical School, as well as a co-senior author of the study.

'That snap impression is subjective, yet it influences treatment decision making every day,' he told Flow Space.

So researchers were curious just how well AI could help doctors make that call. What they found was that, in patients with cancer, looking biologically older than your chronological age was linked with worse survival outcomes (on average, the FaceAge of cancer patients was about five years older than their chronological age). According to the study, FaceAge not only revealed aging patterns invisible to the naked eye but also outperformed doctors in predicting short-term life expectancy for patients receiving palliative care.

'Our goal was to improve that judgment from a subjective glance to a reproducible, data-driven metric by developing an artificial intelligence algorithm called FaceAge,' Mak explained. 'Such a tool gives doctors the ability to assess patient health at low cost and repeatedly over time with just a simple face photograph.'

An AI algorithm like FaceAge works by taking an image of a patient and then analyzing that image against a database of images of healthy individuals and those with cancer. Mak and his team recently expanded their datasets to include millions of healthy individuals and over 20,000 cancer patients to develop an even more accurate FaceAge algorithm and to test AI performance across a larger and more diverse group of patients.

'Also, we are doing some technical work to understand how the algorithm performs over different conditions including things like varying skin tone, impact of cosmetic surgery, use of make-up or different lighting conditions and facial expression… like whether someone is smiling or sad,' he added.

From there, every image produces a quantitative biological age estimate that is generated the same way every time, regardless of a clinician's experience level, fatigue or unconscious assumptions.

'Selfies for health!' exclaimed Mak.

'When trained on a large and demographically varied set of face photos, the algorithm applies a consistent rule-set to every image, reducing the variability that creeps into one-to-one visual assessments,' he added. 'It does not replace the physician's judgement, but it does support that judgement with an objective reference point and flags when a patient's biological age appears discordant with their stated age.'
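The passage above describes the pipeline only in broad strokes: a photograph goes in, a model trained on many age-labelled faces maps it to an estimated biological age, and the same procedure runs identically for every image. As a rough, hypothetical sketch of what such a pipeline could look like in code (this is not the authors' released model; the generic ImageNet weights used here have never been trained for age estimation, so the output is illustrative only), a convolutional backbone can be paired with a single-number regression head:

```python
# Minimal sketch of a FaceAge-style estimator, NOT the published model.
# A convolutional backbone turns a face photo into features, and a single
# regression head maps those features to an estimated age in years.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),          # assumes the face is roughly centered
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
backbone.fc = nn.Linear(backbone.fc.in_features, 1)   # regression head: one number = estimated age

def estimate_face_age(photo_path: str) -> float:
    """Return a (purely illustrative) biological-age estimate for one face photo."""
    image = Image.open(photo_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)             # shape: (1, 3, 224, 224)
    backbone.eval()
    with torch.no_grad():
        return backbone(batch).item()

# Example comparison against a stated chronological age (hypothetical file name):
# face_age = estimate_face_age("patient_photo.jpg")
# gap = face_age - 75   # positive gap = looks biologically older than stated age
```

In a real system the regression head (and usually the backbone) would be fine-tuned on tens of thousands of age-labelled face photos, which is essentially the training process the study describes.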
While the study did have limitations and biases—with further validation in larger, ethnically diverse and younger cohorts necessary before clinical adoption—FaceAge offers valuable prognostic insights independent of conventional clinical factors, with statistically significant results even after adjusting for chronological age, sex and cancer type. Mak added that doctors with access to FaceAge information have improved performance and reduced variability in predicting outcomes.

'By flagging people who are biologically older than their years, the technology could help us spot elevated risk for age-related conditions such as cancer and cardiovascular disease,' he said.

For midlife women—who are most commonly diagnosed with breast, lung and gastrointestinal cancers—AI and FaceAge could have life-changing implications.

'The new AI models are a different breed than the AI of the early 2000s, now with an ability to learn and evolve,' Dr. Katerina Dodelzon, a radiologist specializing in breast imaging and an associate professor of radiology at Weill Cornell Medicine, told Flow Space. 'New advances in AI include subsets termed machine learning, which is an AI that can learn to make predictions or decisions, and its subset of deep learning, which uses artificial neural networks. The more data a machine learning model is exposed to, the better it performs over time.'

She says it can also help with:

Earlier and More Accurate Detection

Midlife women benefit greatly from early cancer detection, which improves survival rates for:
- Breast Cancer: AI can detect subtle changes in mammograms up to two years earlier than radiologists.
- Lung Cancer: AI can flag early-stage nodules in low-dose CT scans.
- Colorectal Cancer: AI-assisted colonoscopy improves adenoma detection rate.

Personalized Treatment Plans

AI helps oncologists tailor therapies based on a patient's unique profile:
- Genomic Data Analysis: AI can interpret massive genomic datasets to find actionable mutations. For example, in breast cancer, it helps identify candidates for hormone therapy, HER2-targeted therapy or immunotherapy, says Dodelzon.
- Treatment Optimization: AI evaluates past patient responses to suggest optimal chemotherapy regimens and dosages, and to predict side effects.

Management

- Remote Monitoring Tools: Wearables and AI apps can track vital signs, symptoms and treatment side effects. This supports real-time intervention and minimizes doctor visits.
- AI Chatbots & Virtual Health Assistants: These can answer questions, schedule appointments and provide appointment and medication reminders.

Equity and Access

Many midlife women face healthcare disparities based on race, income or geography. And for women living in healthcare deserts, where access to care is limited, AI can:
- Improve Access to Expertise: AI tools bring expert-level diagnostic and treatment planning to underserved or rural areas via telehealth.
- Language and Literacy Support: AI-powered translation and plain-language medical explanations empower patients to understand and make informed choices.

For Mak and his team, harnessing AI to save more lives is the ultimate goal. They are currently developing new facial health recognition algorithms that can predict survival directly or detect other health conditions, in addition to conducting genetic analyses on a larger group of patients and opening two prospective studies. 'One is a clinical trial in cancer patients where we will compare FaceAge against conventional assessments of frailty in elderly patients,' said Mak.
'Second, we are about to open a healthy volunteer portal where people in the public can upload photos and get their own FaceAge estimate—and their photos will help us develop improved algorithms.'

And the future of AI in healthcare is set to be transformative, shifting the industry from reactive to highly proactive, personalized and precise. Dodelzon says rather than replacing doctors, AI will augment their capabilities. This support will help catch conditions earlier, reduce diagnostic errors and streamline clinical decision-making. By leveraging vast datasets, AI will recommend treatment options tailored not only to clinical guidelines but also to a patient's unique biology and preferences. Moreover, AI will take over many of the time-consuming administrative tasks that burden healthcare professionals, such as documentation, billing and charting, which allows for more meaningful patient interaction and personalized care.

'I think the current advances and the future development and promise of these tools is very exciting, with the potential to augment many of the routine detection and characterization tasks, and even more exciting to me, the potential to provide more prognostic in addition to diagnostic information,' Dodelzon said. 'But that is what they are—"tools" in our "doctor's bag" that allow us to do more for our patients.'
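Mak's description of the planned healthy-volunteer portal (upload a photo, receive a FaceAge estimate, contribute the image to further research) maps onto a very simple web service. The sketch below is purely hypothetical: it is not Mass General Brigham's portal, the endpoint name and the estimate_face_age() helper are invented for illustration, and a real deployment would need consent handling, secure storage and a trained model.

```python
# Hypothetical sketch of an "upload a photo, get a FaceAge estimate" service.
# NOT the actual portal; names and parameters are illustrative assumptions.
import io

from fastapi import FastAPI, UploadFile
from PIL import Image

app = FastAPI()

def estimate_face_age(image: Image.Image) -> float:
    """Placeholder for a trained biological-age model (see the earlier sketch)."""
    return 0.0  # a real service would run the photo through the trained model here

@app.post("/faceage")
async def faceage(photo: UploadFile, stated_age: float):
    data = await photo.read()                      # raw bytes of the uploaded photo
    image = Image.open(io.BytesIO(data)).convert("RGB")
    estimate = estimate_face_age(image)
    return {
        "face_age_years": estimate,
        "gap_vs_stated_age": estimate - stated_age,  # positive = looks biologically older
    }

# Run locally (assuming this file is saved as portal_sketch.py):
#   uvicorn portal_sketch:app --reload
```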


Indian Express
15-05-2025
- Health
- Indian Express
What is FaceAge, the AI tool that can tell how healthy you are from a selfie?
A new AI tool promises to give doctors a clearer picture of a patient's health by analysing their face. Known as FaceAge, it is modelled after what physicians call 'the eyeball test,' a quick visual assessment made by doctors to gauge a patient's overall condition at a glance.

The AI tool has been developed by researchers at Mass General Brigham, a non-profit, integrated healthcare system in Boston, United States. Their research paper on the deep learning system was published in The Lancet Digital Health on May 8, 2025.

The developers of the AI tool have said that they expect to conduct a pilot study with about 50 patients starting next week. This means that FaceAge has yet to undergo the testing required before it can be deployed in hospitals for routine use by doctors.

FaceAge is essentially powered by a deep learning algorithm that has been trained to estimate patients' biological age from a selfie. The tool is designed to provide a patient's age in health (biological age) rather than in years (chronological age). A person's biological age is considered important because it could help doctors determine the most appropriate treatment for them. For example, doctors could prescribe a more aggressive treatment for a cancer patient if their biological age indicates that they are healthy enough to tolerate it.

'We found that doctors on average can predict life expectancy with an accuracy that's only a little better than a coin flip when using a photo alone for their analysis,' Dr Raymond Mak, a radiation oncologist at Mass General Brigham and one of the co-authors of the study, was quoted as saying by The Washington Post.

'Some doctors would hesitate to offer cancer treatment to someone in their late 80s or 90s with the rationale that the patient may die of other causes before the cancer progresses and becomes life-threatening,' Dr Mak added.

At a press conference held last week, he recalled the case of an 86-year-old man with terminal lung cancer. 'But he looked younger than 86 to me, and based on the eyeball test and a host of other factors, I decided to treat him with aggressive radiation therapy,' he said. Four years later, Dr Mak said he used FaceAge to analyse the lung cancer patient's face. 'We found he's more than 10 years younger than his chronological age. The patient is now 90 and still doing great,' he said.

Mass General Brigham researchers said that FaceAge's training datasets comprised 9,000 photographs of people aged 60 and older who were presumed to be healthy. A majority of the photos were downloaded from Wikipedia and IMDb, the internet movie database. The AI system was also trained using a large-scale dataset sourced from UTKFace, which comprised pictures of people between one and 116 years old.

'It is important to know that the algorithm looks at age differently than humans do. So, for example, being bald or not, or being grey is less important in the algorithm than we actually initially thought,' Hugo Aerts, one of the co-authors of the study, said.

The study noted that no face photographs of patients or other clinical datasets were used to train the AI tool. Researchers have emphasised that FaceAge is not meant to replace but to enhance a doctor's visual assessment of a patient, otherwise known as the 'eyeball test'.

The deep learning system has also undergone some testing. FaceAge was tested on photographs of over 6,200 cancer patients, captured before they underwent radiotherapy treatment.
The AI algorithm determined that the patients' biological age was on average five years older than their chronological age. The survival outlook FaceAge provided for these patients also depended on how old their faces looked.

In another experiment, the researchers asked eight doctors to predict whether patients who had terminal cancer would be alive in six months. When doctors relied only on a patient's photograph to make their prediction, they were right 61 per cent of the time. That figure rose to 73 per cent when doctors relied on the photograph as well as clinical information. The doctors reached an even higher accuracy of 80 per cent when using FaceAge along with information from medical charts.

The study also noted that an older-looking face does not necessarily lead the AI tool to predict a poor health outcome. After analysing photos of actors Paul Rudd and Wilford Brimley (when both were aged 50), FaceAge determined that Rudd's biological age was 43 and Brimley's was 69, as per the study. However, Brimley died in August 2020, at the age of 85.

The team behind FaceAge has acknowledged that there is a long way to go before the AI tool is deployed in a real-world clinical setting, as there are several risks that need to be effectively addressed. For instance, privacy has always been a long-standing concern when it comes to AI systems that gather facial data. However, the study noted, 'Our model is configured for the task of age estimation, which, in our opinion, has less embedded societal bias than the task of face recognition.'

Researchers also said that they sought to address potential racial or ethnic bias in the AI tool by quantifying 'model age predictions across different ethnic groups drawn from the UTK validation dataset.' 'The UTK is one of the most ethnically diverse age-labelled face image databases available publicly and, therefore, appropriate for assessing model performance in this regard, with non-White individuals comprising approximately 55% of the database,' it said. The study also noted that FaceAge is minimally affected by ethnicity as the researchers adjusted for 'ethnicity as a covariate […] in the multivariable analysis of the Harvard clinical datasets.'

Still, the developers of FaceAge have said that strong regulatory oversight and further assessment of bias in FaceAge's performance across different populations are essential. 'This technology can do a lot of good, but it could also potentially do some harm,' said Hugo Aerts, director of the Artificial Intelligence in Medicine program at Mass General Brigham and another co-author of the study.
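The 61, 73 and 80 per cent figures above are simple prediction accuracies: each doctor makes a yes/no call on whether a patient will be alive at six months, and the calls are scored against what actually happened. A small sketch of that scoring, using synthetic stand-in data rather than the study's, might look like this:

```python
# Hedged sketch of the evaluation described above: compare binary
# "alive at six months" predictions against outcomes and report accuracy.
# The arrays are synthetic stand-ins tuned to mimic the reported pattern.
import numpy as np

rng = np.random.default_rng(1)
actual = rng.integers(0, 2, 100)        # 1 = patient actually alive at six months

def accuracy(predicted, actual):
    return float(np.mean(predicted == actual))

# Each setting is right with a different probability (better inputs -> better calls).
photo_only     = np.where(rng.random(100) < 0.61, actual, 1 - actual)
photo_clinical = np.where(rng.random(100) < 0.73, actual, 1 - actual)
with_faceage   = np.where(rng.random(100) < 0.80, actual, 1 - actual)

for name, preds in [("photo only", photo_only),
                    ("photo + clinical info", photo_clinical),
                    ("photo + clinical + FaceAge", with_faceage)]:
    print(f"{name:<28} accuracy = {accuracy(preds, actual):.2f}")
```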


Japan Today
12-05-2025
- Health
- Japan Today
AI tool uses selfies to predict biological age and cancer survival
Photo: Three pedestrians take a selfie on the picturesque alleyway at the end of Rue de l'Universite, Paris

By Issam AHMED

Doctors often start exams with the so-called "eyeball test" -- a snap judgment about whether the patient appears older or younger than their age, which can influence key medical decisions. That intuitive assessment may soon get an AI upgrade.

FaceAge, a deep learning algorithm described in The Lancet Digital Health, converts a simple headshot into a number that more accurately reflects a person's biological age rather than the birthday on their chart. Trained on tens of thousands of photographs, it pegged cancer patients on average as biologically five years older than healthy peers. The study's authors say it could help doctors decide who can safely tolerate punishing treatments, and who might fare better with a gentler approach.

"We hypothesize that FaceAge could be used as a biomarker in cancer care to quantify a patient's biological age and help a doctor make these tough decisions," said co-senior author Raymond Mak, an oncologist at Mass General Brigham, a Harvard-affiliated health system in Boston.

Consider two hypothetical patients: a spry 75-year-old whose biological age clocks in at 65, and a frail 60-year-old whose biology reads 70. Aggressive radiation might be appropriate for the former but risky for the latter. The same logic could help guide decisions about heart surgery, hip replacements or end-of-life care.

Growing evidence shows humans age at different rates, shaped by genes, stress, exercise, and habits like smoking or drinking. While pricey genetic tests can reveal how DNA wears over time, FaceAge promises insight using only a selfie.

The model was trained on 58,851 portraits of presumed-healthy adults over 60, culled from public datasets. It was then tested on 6,196 cancer patients treated in the United States and the Netherlands, using photos snapped just before radiotherapy. Patients with malignancies looked on average 4.79 years older biologically than their chronological age.

Among cancer patients, a higher FaceAge score strongly predicted worse survival -- even after accounting for actual age, sex, and tumor type -- and the hazard rose steeply for anyone whose biological reading tipped past 85 (a sketch of this kind of adjusted survival analysis follows after this article).

Intriguingly, FaceAge appears to weigh the signs of aging differently than humans do. For example, being gray-haired or balding matters less than subtle changes in facial muscle tone.

FaceAge boosted doctors' accuracy, too. Eight physicians were asked to examine headshots of terminal cancer patients and guess who would die within six months. Their success rate barely beat chance; with FaceAge data in hand, predictions improved sharply. The model even affirmed a favorite internet meme, estimating actor Paul Rudd's biological age as 43 in a photo taken when he was 50.

AI tools have faced scrutiny for under-serving non-white people. Mak said preliminary checks revealed no significant racial bias in FaceAge's predictions, but the group is training a second-generation model on 20,000 patients. They're also probing how factors like makeup, cosmetic surgery or room lighting variations could fool the system.

Ethics debates loom large. An AI that can read biological age from a selfie could prove a boon for clinicians, but also tempting for life insurers or employers seeking to gauge risk.
"It is for sure something that needs attention, to assure that these technologies are used only in the benefit for the patient," said Hugo Aerts, the study's co-lead who directs MGB's AI in medicine program. Another dilemma: What happens when the mirror talks back? Learning that your body is biologically older than you thought may spur healthy changes -- or sow anxiety. The researchers are planning to open a public-facing FaceAge portal where people can upload their own pictures to enroll in a research study to further validate the algorithm. Commercial versions aimed at clinicians may follow, but only after more validation. © 2025 AFP


Malay Mail
10-05-2025
- Health
- Malay Mail
How old are you, really? AI uses headshots to predict biological age and health risks
WASHINGTON, May 10 — Doctors often start exams with the so-called 'eyeball test' – a snap judgment about whether the patient appears older or younger than their age, which can influence key medical decisions. That intuitive assessment may soon get an AI upgrade.

FaceAge, a deep learning algorithm described Thursday in The Lancet Digital Health, converts a simple headshot into a number that more accurately reflects a person's biological age rather than the birthday on their chart. Trained on tens of thousands of photographs, it pegged cancer patients on average as biologically five years older than healthy peers. The study's authors say it could help doctors decide who can safely tolerate punishing treatments, and who might fare better with a gentler approach.

'We hypothesize that FaceAge could be used as a biomarker in cancer care to quantify a patient's biological age and help a doctor make these tough decisions,' said co-senior author Raymond Mak, an oncologist at Mass General Brigham, a Harvard-affiliated health system in Boston.

Consider two hypothetical patients: a spry 75-year-old whose biological age clocks in at 65, and a frail 60-year-old whose biology reads 70. Aggressive radiation might be appropriate for the former but risky for the latter. The same logic could help guide decisions about heart surgery, hip replacements or end-of-life care.

Sharper lens on frailty

Growing evidence shows humans age at different rates, shaped by genes, stress, exercise, and habits like smoking or drinking. While pricey genetic tests can reveal how DNA wears over time, FaceAge promises insight using only a selfie.

The model was trained on 58,851 portraits of presumed-healthy adults over 60, culled from public datasets. It was then tested on 6,196 cancer patients treated in the United States and the Netherlands, using photos snapped just before radiotherapy. Patients with malignancies looked on average 4.79 years older biologically than their chronological age.

Among cancer patients, a higher FaceAge score strongly predicted worse survival – even after accounting for actual age, sex, and tumor type – and the hazard rose steeply for anyone whose biological reading tipped past 85.

Intriguingly, FaceAge appears to weigh the signs of aging differently than humans do. For example, being gray-haired or balding matters less than subtle changes in facial muscle tone.

FaceAge boosted doctors' accuracy, too. Eight physicians were asked to examine headshots of terminal cancer patients and guess who would die within six months. Their success rate barely beat chance; with FaceAge data in hand, predictions improved sharply. The model even affirmed a favorite internet meme, estimating actor Paul Rudd's biological age as 43 in a photo taken when he was 50.

Bias and ethics guardrails

AI tools have faced scrutiny for under-serving non-white people. Mak said preliminary checks revealed no significant racial bias in FaceAge's predictions, but the group is training a second-generation model on 20,000 patients. They're also probing how factors like makeup, cosmetic surgery or room lighting variations could fool the system.

Ethics debates loom large. An AI that can read biological age from a selfie could prove a boon for clinicians, but also tempting for life insurers or employers seeking to gauge risk. 'It is for sure something that needs attention, to assure that these technologies are used only in the benefit for the patient,' said Hugo Aerts, the study's co-lead who directs MGB's AI in medicine program.
Another dilemma: What happens when the mirror talks back? Learning that your body is biologically older than you thought may spur healthy changes – or sow anxiety.

The researchers are planning to open a public-facing FaceAge portal where people can upload their own pictures to enrol in a research study to further validate the algorithm. Commercial versions aimed at clinicians may follow, but only after more validation. — AFP


Malay Mail
09-05-2025
- Health
- Malay Mail
Snap a selfie, spot your real age: How new AI could transform the way doctors treat cancer
- A new AI tool called FaceAge can estimate a person's biological age from a selfie, offering doctors a sharper lens to assess frailty and guide treatment plans.
- Cancer patients identified as biologically older by FaceAge had poorer survival outcomes, making it a potential tool for personalising care.
- While promising, the technology raises ethical concerns around data use, bias, and how knowledge of one's biological age could affect mental health.

WASHINGTON, May 10 — Doctors often start exams with the so-called 'eyeball test' — a snap judgment about whether the patient appears older or younger than their age, which can influence key medical decisions. That intuitive assessment may soon get an AI upgrade.

FaceAge, a deep learning algorithm described Thursday in The Lancet Digital Health, converts a simple headshot into a number that more accurately reflects a person's biological age rather than the birthday on their chart. Trained on tens of thousands of photographs, it pegged cancer patients on average as biologically five years older than healthy peers. The study's authors say it could help doctors decide who can safely tolerate punishing treatments, and who might fare better with a gentler approach.

'We hypothesise that FaceAge could be used as a biomarker in cancer care to quantify a patient's biological age and help a doctor make these tough decisions,' said co-senior author Raymond Mak, an oncologist at Mass General Brigham, a Harvard-affiliated health system in Boston.

Consider two hypothetical patients: a spry 75-year-old whose biological age clocks in at 65, and a frail 60-year-old whose biology reads 70. Aggressive radiation might be appropriate for the former but risky for the latter. The same logic could help guide decisions about heart surgery, hip replacements or end-of-life care.

Sharper lens on frailty

Growing evidence shows humans age at different rates, shaped by genes, stress, exercise, and habits like smoking or drinking. While pricey genetic tests can reveal how DNA wears over time, FaceAge promises insight using only a selfie.

The model was trained on 58,851 portraits of presumed-healthy adults over 60, drawn from public datasets. It was then tested on 6,196 cancer patients treated in the United States and the Netherlands, using photos taken just before radiotherapy. Patients with malignancies looked on average 4.79 years older biologically than their chronological age.

Among cancer patients, a higher FaceAge score strongly predicted worse survival — even after accounting for actual age, sex, and tumour type — and the hazard rose steeply for anyone whose biological reading tipped past 85.

Intriguingly, FaceAge appears to weigh the signs of ageing differently than humans do. For example, being grey-haired or balding matters less than subtle changes in facial muscle tone.

FaceAge boosted doctors' accuracy, too. Eight physicians were asked to examine headshots of terminal cancer patients and guess who would die within six months. Their success rate barely beat chance; with FaceAge data in hand, predictions improved sharply. The model even affirmed a favourite internet meme, estimating actor Paul Rudd's biological age as 43 in a photo taken when he was 50.

Bias and ethics guardrails

AI tools have faced scrutiny for underserving non-white people. Mak said preliminary checks revealed no significant racial bias in FaceAge's predictions, but the group is training a second-generation model on 20,000 patients.
They are also probing how factors like makeup, cosmetic surgery or room lighting variations could fool the system.

Ethics debates loom large. An AI that can read biological age from a selfie could prove a boon for clinicians, but also tempting for life insurers or employers seeking to gauge risk. 'It is for sure something that needs attention, to assure that these technologies are used only in the benefit for the patient,' said Hugo Aerts, the study's co-lead who directs MGB's AI in medicine programme.

Another dilemma: what happens when the mirror talks back? Learning that your body is biologically older than you thought may spur healthy changes — or sow anxiety.

The researchers are planning to open a public-facing FaceAge portal where people can upload their own pictures to enrol in a research study to further validate the algorithm. Commercial versions aimed at clinicians may follow, but only after more validation. — AFP