
Latest news with #patientprivacy

Michigan Medicine mails postcards without envelopes, potentially exposing data of 1,000+ patients

CBS News · Health · 5 days ago

Michigan Medicine says the health information of 1,015 patients was potentially exposed when it mailed out postcards without envelopes.

According to Michigan Medicine, a research study postcard was sent to prospective participants on June 27, 2025. The postcards were mailed without envelopes and contained a "limited amount" of protected health information that could have been seen by anyone who handled them.

An investigation by the health system found that the University of Michigan's Institutional Review Board, which is responsible for the oversight of human subject research, mistakenly approved the use of the postcards. The board is taking steps to ensure that a similar incident does not happen again, Michigan Medicine said in a statement.

"We take patient privacy very seriously, and we regret this incident. Whenever situations like this occur, we immediately take steps to investigate," said Jeanne Strickland, Michigan Medicine chief compliance officer, in a statement. "We will analyze this incident and review our safeguards and make changes if needed to protect those we care for."

Michigan Medicine says it mailed notices to patients affected by the incident on Aug. 14. Those who are concerned about the breach and did not receive a notice letter can call the Michigan Medicine Assistance Line at 1-833-353-4105, Monday through Friday, from 9 a.m. to 9 p.m.

Although Michigan Medicine believes the risk of medical or identity theft from this incident is low, it encourages affected patients to monitor their health insurance statements for any evidence of fraudulent transactions.

GPs are using artificial intelligence to record patient consultations but how safe is your personal data?

ABC News · Health · 06-08-2025

For the last 12 months, Dr Grant Blashki has used what he calls a "medical intern" in every appointment. His intern is completely digital: an artificial intelligence scribe that listens to every word his patients say.

"It's mostly surprisingly accurate," the GP told 7.30. "Occasionally it will mishear the name of something. Occasionally it will mishear a diagnosis."

He says patient consent when using AI scribes in a clinical setting is essential, but that most people don't have an issue. "I do ask for consent. Occasionally people don't want me to use it, which is absolutely fine, but almost everyone is comfortable with it and it just streamlines the work," Dr Blashki said. "It's good for the patient because I'm concentrating on them."

Dr Blashki says he has become so reliant on the scribe that he would struggle to conduct appointments without it. "I use it almost in every consultation," he said.

As patients reveal intimate details about their medical history, he says, the scribe is constantly collecting sensitive data. "So I make sure that at the end of each consultation I actually delete all the transcriptions off my software."

Dr Blashki uses software from Melbourne-based company Heidi Health, one of the main AI scribe tools used by clinicians in Australia. Heidi Health declined 7.30's request for an interview, but its CEO and co-founder, Dr Thomas Kelly, provided a written response to questions about patient privacy.

"Heidi now supports almost two million visits a week, and that's around the world from Australia, New Zealand, Canada, to the US and the UK," Dr Kelly said. "In each region, data is stored in compliance with the healthcare regulations and privacy policies of the region. Here it's the Australian Privacy Principles (APP), in the EU that would be GDPR, in the US that would be HIPAA. All data is protected according to ISO 27K and SOC 2 requirements, which are the highest enterprise standards that exist. We get audited by third parties to protect our data and ensure the security that we have."

Lyrebird Health is another AI scribe software company based in Melbourne. It is used by GPs, surgeons, psychiatrists and paediatricians, and the company says the software was used in "200,000 consults" in Australia last week.

"All data is stored 100 per cent in Australian sovereign databases if you're an Australian customer — it's different obviously if you're overseas," Lyrebird Health CEO Kai Van Lieshout told 7.30.

Patient notes are deleted automatically from Lyrebird Health's system after seven days (doctors need to back up the notes if they want to keep them), but users do have the option to manually extend this period to six months.

"For us it is definitely really gone," Mr Van Lieshout said. "I know that because we've had doctors that have needed something that we've had ... that don't realise that it's deleted after seven days and there's nothing we can do."

John Lalor, an assistant professor of IT, analytics and operations at the University of Notre Dame, warns there is always an element of risk when storing digital data. "A lot of those models, they're very data-driven, so the more data they have, usually the better they get," Mr Lalor told 7.30. "So on the one hand, if it has a lot more data from patients, that can typically improve the models, but on the other hand, there's the privacy risk of the data being exposed if it's leaked or hacked."

He says patients and doctors should be making sure AI scribe companies are transparent about how they store and use data. "Making sure that the firms are clear with how exactly the data is being used, because if there's ambiguity in what they say, then there could be ambiguity in the interpretation as well," he said. "With individuals, if they're uncomfortable with using something like that, they could speak with their physician to see if it's optional or see if they could get more information about what exactly is being done when the data is taken into the scribe system."

To show how Heidi Health's AI scribe works, Dr Blashki took 7.30 through a mock appointment about a headache. We discuss how the headache has been "on and off pretty much every morning for the last month" and that there's no history of migraines. Heidi Health then processes the conversation, in a step it calls "Making Magic", and produces consultation notes. The software also suggests "differential diagnoses", including a "tension-type headache" and a "cervicogenic headache".

"We're seeing some of the medical softwares and some of the AI generally come up with differential diagnoses, make suggestions, and ... the doctor really needs to turn their mind to it and look at them more as suggestions than the answer," Dr Blashki said.

Dr Kelly said the software "aims to be more than a literal summary and is able to identify the clinical rationale underpinning a line of questioning". In response to 7.30's mock consultation, Dr Kelly said: "We summarise the clinical encounter reflecting their lines of questioning and using appropriate clinical terminology to describe them."

Mr Van Lieshout said Lyrebird Health doesn't produce potential diagnoses after a consultation. "We won't try to tell the clinician what to do, if that makes sense," he said. "It's subjective to: what did the patient describe? Did I do any forms of examination? What was their blood pressure assessment? What's my diagnosis or assessment of the situation? Then plan what's the next steps. We will break up that conversation into those categories."

Dr Blashki said about 50 per cent of the doctors at the GP clinic he works at in Melbourne use AI scribe software for every consultation. He says he has also received referral letters from specialists that look like they've been created by AI. "I have had one letter where I think, 'Oh, I don't think they've checked this properly. They've clearly got one of the diagnoses not quite right'," he said.

Former Victorian chief health officer Brett Sutton believes AI scribes have become "indispensable", although he concedes that protecting patient data is the greatest concern for the industry. "I think the regulators need to make sure that it's safe," Dr Sutton said. "Obviously the clinicians who are using it have a responsibility for sensitive health information to be properly recorded and stored and made safe, so that it's treated in exactly the same way as any other clinical notes would be treated historically."

Watch 7.30, Mondays to Thursdays, 7:30pm on ABC iview and ABC TV.

Do you know more about this story? Get in touch with 7.30 here.

Thai hospital fined US$37,000 after 1,000 pages of patient records used as street food wrappers

South China Morning Post · Health · 06-08-2025

A Thai hospital has been fined 1.21 million baht (US$37,000) after the confidential medical records of its patients were discovered to have been used as wrappers for street food.

The case was exposed by an online influencer who posted images of the medical documents being used as bags for a type of crispy crepe known in Thailand as khanom Tokyo. The hospital at fault was a private medical facility in Ubon Ratchathani province in northeastern Thailand. Its name has not been revealed.

According to the influencer, whose name translates as Doctor Lab Panda, patient details were visible on the wrappers. One showed clearly that it belonged to a man infected with the hepatitis B virus. The influencer asked: 'Should I continue eating it, or is this enough?'

The hospital came under fire after the post, made in May 2024, went viral, attracting 33,000 reactions and 1,700 comments.

On August 1, the government's Personal Data Protection Committee (PDPC) reported that it had imposed a penalty of 1.21 million baht on the hospital for violating data laws.


'We don't want to feel like Big Brother is watching us': the NHS staff being filmed by patients

Telegraph · Health · 13-07-2025

Katie Thompson has grown accustomed to friends seeing her on the internet – not that she enjoys being filmed. A sonographer at the Great Western Hospital in Swindon, she says she is frequently spotted on social media platforms by her nearest and dearest, having been covertly filmed while attending to patients.

'You don't want to find yourself, without being asked, suddenly appearing on someone's TikTok or Facebook,' says Thompson, who warns this is happening on a 'daily basis'. 'I choose what I do and don't put on social media, but if someone else is taking pictures of you or recording you [and uploading that content online], you have no control over who is seeing that.'

Thompson is far from alone in her discomfort. In fact, she is one of a growing number of NHS professionals reporting that they are regularly being recorded, overtly or covertly, by their patients or their patients' friends and family. The problem has become so widespread that senior medics were last month forced to intervene, with the Society of Radiographers (SoR) publicly expressing its concerns.

On one occasion, the union said, a member reported being filmed by the 19-year-old daughter of a cancer patient who was having a cannula inserted. 'She wanted to record the cannulation because she thought it would be entertaining on social media. But she didn't ask permission,' the staff member said. 'I spent the weekend afterwards worrying: did I do my job properly? I know I did, but no one's perfect all the time and this was recorded. I don't think I slept for the whole weekend.'

The group warned that those filming inside hospitals and other clinical environments risk 'publicising other patients' medical information, and compromising their own treatment'. It is now calling for the introduction of clear policies to prevent patients from photographing or recording clinical procedures without express permission.
Lives as 'content'

Some of the clips posted online and seen by The Telegraph are seemingly the product of individuals keen to document their own health journeys, or to raise awareness of specific conditions. Others, however, appear to follow the broader trend of people treating their entire lives as 'content' for social media.

Whatever the rationale of those behind the videos, the SoR warns that they are making staff in the health service – the vast majority of whom wear identity badges – 'uncomfortable' and 'anxious'.

Hospital trusts across the UK have their own policies when it comes to filming, with many clearly stipulating that 'no patients or staff are to be filmed without consent'. But some warn current measures do not go far enough.

'As healthcare professionals, we need to think: does that recording breach the confidentiality of other patients? Does it breach our ability to deliver care?' Dean Rogers, director of strategy at the SoR, told the BBC last month. 'There are hospital trusts that have very good policies around patients taking photos and filming procedures, but this is something all trusts need to have in place.'

Thompson, whose own ward has a sign explicitly stating that patients must not record or take pictures, agrees. 'I think there should be policies [to prevent this],' she says. 'People aren't asking permission to do this, and it's causing anxiety among the staff. We don't want to feel that we're being watched all the time… like Big Brother's watching us.'

Erosion of privacy

Many medics are despairing over the current state of affairs, pointing out the litany of issues that come with filming inside clinical settings. Other patients who may appear in the background of such footage risk being exposed, for one, while NHS staff going about their duties could have their privacy eroded too.
'God forbid, it could be somebody [who is filmed] who's not actually told a relative they're going to hospital or got anything wrong with them,' says Rachel Nolan, the SoR's vice-president. 'Then they see it [the clip] on somebody's TikTok or Instagram, and think, "That's my relative in the background, I wonder what they're doing there?"'

In some instances, people may also be unwittingly leaking their own personal and highly sensitive information. For Thompson, who routinely carries out pregnancy scans, all sorts of data risk being compromised when her patients film their treatment, potentially putting them 'at risk'. (Screens used as part of the procedure typically show the individual's name, date of birth and hospital number, among other personal information.)

Moreover, patients filming and uploading footage of their time in hospital may distract staff attempting to carry out complicated medical procedures. 'It's mentally taxing enough making decisions that are going to affect people's health without thinking at the back of your mind that you're being filmed,' says Dave Pilborough, a therapeutic radiographer at the Royal Derby Hospital and a past president of the SoR.

Surprising culprits

A trawl of social media sites such as TikTok and Instagram reveals a glut of videos of this kind, captioned with messages such as 'come with me to the hospital' or 'spend the day with me in A and E'. Such clips often attract hundreds of views and likes, generating thousands of followers for the accounts that air them.

The Telegraph found multiple examples of videos featuring footage of staff who are seemingly unaware they are being captured on camera. There is also an abundance of content posted online in which other patients appear to have been filmed without their knowledge or consent. Some of these clips show individuals lying on hospital beds, capturing their ordeal in intimate and occasionally graphic detail. Others are filmed inside busy waiting rooms.
But it is not always the patients themselves who are behind the camera. In fact, Thompson says, the most common culprits are friends and family accompanying women coming in for a scan. She explains that in many instances, it is 'the person sitting with [the patient] who will have their phone in a very unnatural position on their lap and be filming the screen, or filming you doing the scan'.

Sometimes, they appear to be trying to capture the moment covertly. 'It's the ones that have got the phones really low on their laps. They're the ones that know they shouldn't be doing it,' she says. When she spots that they are recording, Thompson asks them to stop – at which point they can get 'irate', she says. 'I think they feel guilty. They've been caught, but they've been doing it in such a covert way that they know they're not supposed to be doing it.'

Worsening problem

The SoR says the problem has worsened significantly in recent years. 'It's pretty trendy to take pictures of just about everything that happens to you and then stick it on social media,' says Richard Evans, the organisation's chief executive.

'It's almost the norm that people put everything on TikTok and Instagram,' says Nolan. 'They document their whole lives, what they're having for tea, and say, "Oh, I've been in an operation today, and I'll put that on [the internet] as well." I just think it's the availability of being able to record everything and document absolutely everything in their lives.'

Thompson, meanwhile, speculates that influencers who have been given explicit permission to film their treatment (as part of documenting their own health journeys) are unwittingly inspiring copycats. And as the trend grows, there are mounting worries that there could be a more sinister side to some of the filming too, with some patients thought to be seeking to 'expose' the NHS, or catch out doctors they deem complicit in a broken system.
'There will always be people who want to exploit systems and be sensational,' says Evans. 'Maybe they even see themselves taking part in revealing the truth about the health service or something like that. You can imagine a range of motivations, can't you?'

He adds that tighter policy cannot come soon enough, arguing that all NHS trusts need clear rules in place to prevent patients from filming inside their facilities unless they have been granted permission to do so.

'For the vast majority of people, I suspect this is just unintentional, and they are not really thinking that what they're doing could be a problem,' he says. But ultimately, Evans concludes, 'some clear policy will be helpful'.
