Karnataka seeks central probe into clinical trial lapses at HCG hospitals

India Today · a day ago
The Karnataka government has written to the Central Drugs Standard Control Organisation (CDSCO), requesting an investigation into alleged irregularities in clinical trials conducted at the Bengaluru-based Healthcare Global Enterprises Ltd (HCG) hospitals. The move follows concerns raised by Justice P Krishna Bhat, a retired judge and former chairperson of the hospital's Institutional Ethics Committee (IEC), regarding patient safety and procedural violations during trial approvals.

Speaking to the media, Karnataka Health Minister Dinesh Gundu Rao said the decision to seek an inquiry stemmed from troubling reports. 'This is based on certain information we saw and received, including some news articles, and then we found out there were some issues. This is a very serious issue, and it has been raised by the ethics committee of HCG Cancer Hospital itself by their own committee, chaired by Justice Krishna Bhat,' he said.
He added that the Health Commissioner has already written to the Drug Controller General of India. 'We have asked them to look into the issue and investigate the matter, because it has serious implications regarding clinical trials and related concerns. I do not know what the truth is behind the whole thing, but it must be investigated by a responsible agency, and that is the CDSCO,' Rao said.

Responding to the allegations, Dr BS Ajai Kumar, Founder and Chairman of HCG hospitals, issued a statement reiterating the hospital's adherence to all regulatory frameworks. 'We have noticed some unverified information about HCG, a pioneer in cancer care in India and Africa. We assure you that we strictly adhere to all guidelines set by regulatory authorities, including the Drug Controller General of India (DCGI) and the Indian Council of Medical Research (ICMR). Currently, we are successfully conducting a significant number of trials with utmost transparency, prioritising patient safety, approved by our Ethics Committee. Our commitment to delivering exceptional care remains unwavering,' Dr Ajai Kumar said on behalf of Healthcare Global Enterprises Limited.

Details of the concerns were outlined in a letter dated June 30, 2025, by Health and Family Welfare Department Commissioner Sivakumar K B, who highlighted issues raised by Justice Bhat. These included unchecked conflicts of interest and irregularities in patient enrolment during trials, as reported by South First.

The letter, addressed to the Drugs Controller General of India, stated: 'These concerns have been flagged by none other than the chairperson of the institutional ethics committee, who has subsequently resigned.' It added: 'These lapses, if proven, will undermine the strict ethical principles laid down by the CDSCO, Department of Health Research, Indian Council of Medical Research, and global regulatory bodies like the World Health Organisation, which mandate the highest standards of patient safety and ethical conduct in clinical trials.'

The Commissioner described the matter as 'of serious concern' and called for a thorough probe into the allegations of 'unfair clinical trials being conducted at Bengaluru's HCG.'

Justice Bhat had raised multiple concerns with then Chief Executive Officer Raj Gore and former Medical Director Dr Harish Reddy in a March 5, 2025, letter, accessed by South First, following discussions in several ethics committee meetings.

One of the most serious issues was a potential conflict of interest involving Dr Sathish, who allegedly served both as principal investigator and in a supervisory role as Director of the Ethics Committee. Justice Bhat wrote that this dual role posed ethical risks, including compromised patient safety, dilution of inclusion criteria, and resistance to procedural reforms.

The letter noted that although there is no formal post of 'Director of Clinical Trials' within ethics committees, institutions may appoint someone as 'Director of Clinical Trial Development' at the corporate level, typically endorsed by top leadership.
Justice Bhat stated that during the 18 committee meetings he attended, the individual never clarified that he was not serving in such a dual capacity.

Additional concerns listed in the letter included rushed presentations, bypassing of informed review processes, an excessive number of poorly explained trial proposals, and direct communication between the investigator and sponsors, which could open the door to commercial bias and protocol manipulation.

- Ends

Related Articles

Review held to enhance administrative efficiency at Kurnool GGH

Hans India · an hour ago

Kurnool: A comprehensive review meeting was held at the Kurnool Government General Hospital on Thursday under the supervision of Hospital Superintendent Dr K Venkateswarlu. The session focused on the performance and responsibilities of the ministerial staff and hospital personnel. Key topics included punctuality, section-wise performance assessments, adherence to the Face Recognizing System (FRS), and the delivery of timely administrative and patient-related services.

Dr Venkateswarlu conducted an in-depth evaluation of the ministerial staff's functioning, analysing their roles across various sections. He also scrutinised the duties of fourth-class employees and voiced displeasure regarding lapses in their responsibilities. Strict instructions were issued mandating the use of ID cards, compliance with dress code regulations, proper FRS-based attendance, and strict punctuality. The Superintendent warned that any violations in these areas would result in departmental disciplinary action.

Further, the Medical Records and Transcription (MRT) section was directed to ensure the prompt issuance of reports, death certificates, and birth certificates to avoid inconvenience to patients. Emphasis was placed on attending to public grievances without delay and avoiding any negligence in service delivery. All hospital staff were instructed to remain accessible during their designated duty hours and to maintain proper documentation of files and records.

The Superintendent also reviewed seat allotments within the ministerial staff and announced changes aimed at improving workflow. Deputy CS RMO Dr Padmaja, Hospital Administrator Sindhu Subrahmanyam, Administrative Officer Srinivasulu, and other hospital personnel attended the meeting.

Content moderators for Big Tech unite to tackle mental trauma

Time of India · an hour ago

Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online. The people tasked with removing harmful content from tech giants like Meta Platforms or TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts.

"Before I would sleep seven hours," said one Filipino content moderator who asked to remain anonymous to avoid problems with their employer. "Now I only sleep around four hours."

Workers are gagged by non-disclosure agreements with the tech platforms or the companies that do the outsourced work, meaning they cannot discuss exact details of the content they are seeing. But videos of people being burned alive by the Islamic State, babies dying in Gaza and gruesome pictures from the Air India crash in June were given as examples by moderators who spoke to the Thomson Reuters Foundation.

Social media companies, which often outsource content moderation to third parties, are facing increasing pressure to address the emotional toll of moderation. Meta, which owns Facebook, WhatsApp and Instagram, has already been hit with workers' rights lawsuits in Kenya and Ghana, and in 2020 the firm paid a $52 million settlement to American content moderators suffering long-term mental health issues.

The Global Trade Union Alliance of Content Moderators was launched in Nairobi in April to establish worker protections for what they dub "a 21st century hazardous job", similar to the work of emergency responders. Their first demand is for tech companies to adopt mental health protocols, such as exposure limits and trauma training, in their supply chains. "They say we're the ones protecting the internet, keeping kids safe online," the Filipino worker said. "But we are not protected enough."

Scrolling trauma

Globally, tens of thousands of content moderators spend up to 10 hours a day scrolling through social media posts to remove harmful content - and the mental toll is well-documented. "I've had bad dreams because of the graphic content, and I'm smoking more, losing focus," said Berfin Sirin Tunc, a content moderator for TikTok in Turkey employed via Canadian-based tech company Telus, which also does work for Meta. In a video call with the Thomson Reuters Foundation, she said the first time she saw graphic content as part of her job she had to leave the room and go home.

While some employers do provide psychological support, some workers say it is just for show - with advice to count numbers or do breathing exercises. Therapy is limited to either group sessions or a recommendation to switch off for a certain number of "wellness break" minutes. But taking them is another thing. "If you don't go back to the computer, your team leader will ask where are you and (say) that the queue of videos is growing," said Tunc. "Bosses see us just as machines."

In emailed statements to the Thomson Reuters Foundation, Telus and Meta said the well-being of their employees is a top priority and that employees should have access to 24/7 healthcare support.

Rising pressure

Moderators have seen an uptick in violent videos. A report by Meta for the first quarter of 2025 showed a rise in the sharing of violent content on Facebook, after the company changed its content moderation policies in a commitment to "free expression". However, Telus said in its emailed response that internal estimates show distressing material represents less than 5% of the total content reviewed.

Adding to the pressure on moderators is the fear of losing jobs as companies shift towards artificial intelligence-powered moderation. Meta, which invested billions and hired thousands of content moderators globally over the years to police extreme content, scrapped its US fact-checking programme in January, following the election of Donald Trump. In April, 2,000 Barcelona-based workers were sent home after Meta severed a contract with Telus. A Meta spokesperson said the company has moved the services that were being performed from Barcelona to other locations.

"I'm waiting for Telus to fire me," said Tunc, "because they fired my friends from our union." Fifteen workers in Turkey are suing the company after being dismissed, they say, for organising a union and attending protests this year. A spokesperson for Telus said in an emailed response that the company "respects the rights of workers to organise". Telus said a May report by Turkey's Ministry of Labour found contract terminations were based on performance and it could not be concluded that the terminations were union-related. The Labour Ministry did not immediately respond to a request for comment.

Protection protocols

Moderators in low-income countries say that low wages, productivity pressure and inadequate mental health support can be remedied if companies sign up to the Global Alliance's eight protocols. These include limiting exposure time, setting realistic quotas and providing 24/7 counselling, as well as living wages, mental health training and the right to join a union. Telus said in its statement that it was already in compliance with the demands, and Meta said it conducts audits to check that companies are providing required on-site support.

New European Union rules - such as the Digital Services Act, the AI Act and supply chain regulations which demand tech companies address risks to workers - should give stronger legal grounds to protect content moderators' rights, according to labour experts.

"Bad things are happening in the world. Someone has to do this job and protect social media," said Tunc. "With better conditions, we can do this better. If you feel like a human, you can work like a human."

Content moderators for Big Tech unite to tackle mental trauma

The Hindu · 2 hours ago
