
Three men found guilty of accessory to murder of Swedish rapper C.Gambino, court rules
STOCKHOLM, July 16 (Reuters) - A Swedish court found three men guilty of accessory to the murder last year of the award-winning hip-hop artist known by the alias C.Gambino, it said on Wednesday.
The Swedish artist, who kept his identity hidden and wore a mask in public, was shot dead in a parking garage in June 2024 in a suspected gang-related attack in Gothenburg on the Nordic country's west coast.
The men - Hassan Rabeie, 22, Vide Atterstam, 20, and Fatjam Vardari, 21 - have denied the charges against them, according to Swedish media.
Rabeie and Vardari had been charged with murder, or alternatively accessory to murder, but the Gothenburg District Court said in a statement that the investigation had not established beyond reasonable doubt that they were the ones who actually shot the rapper.
As a result, all three were convicted of accessory to murder.
The court sentenced Rabeie to life imprisonment. The other two were sentenced to 15 years and six months, and 12 years and six months, respectively.
"The shooting has had the character of a pure execution and has entailed severe suffering for the victim," the court said. "The crime has its background in a conflict between criminal networks."
The lawyer representing Atterstam told Reuters the verdict was disappointing and declined to comment further. Lawyers for Rabeie and Vardari were not immediately available for comment.
The late rapper is not to be confused with American artist Childish Gambino. (Reporting by Stine Jacobsen, editing by Anna Ringstrom)

Related Articles

The Hindu
AI-powered 'nudify' apps fuel deadly wave of digital blackmail
After a Kentucky teenager died by suicide this year, his parents discovered he had received threatening texts demanding $3,000 to suppress an AI-generated nude image of him. The tragedy underscores how so-called sextortion scams targeting children are growing around the world, particularly with the rapid proliferation of "nudify" apps: AI tools that digitally strip off clothing or generate sexualised imagery.
Elijah Heacock, 16, was just one of thousands of American minors targeted by such digital blackmail, which has spurred calls for more action from tech platforms and regulators. His parents told U.S. media that the text messages ordered him to pay up or an apparently AI-generated nude photo would be sent to his family and friends.
"The people that are after our children are well organised," John Burnett, the boy's father, said in a CBS News interview. "They are well financed, and they are relentless. They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child."
U.S. investigators were looking into the case, which comes as nudify apps, which rose to prominence targeting celebrities, are being increasingly weaponised against children.
The FBI has reported a "horrific increase" in sextortion cases targeting U.S. minors, with victims typically males between the ages of 14 and 17. The threat has led to an "alarming number of suicides," the agency warned.
In a recent survey, Thorn, a non-profit focused on preventing online child exploitation, found that six percent of American teens have been a direct victim of deepfake nudes.
"Reports of fakes and deepfakes - many of which are generated using these 'nudifying' services - seem to be closely linked with reports of financial sextortion, or blackmail with sexually explicit images," the British watchdog Internet Watch Foundation (IWF) said in a report last year. "Perpetrators no longer need to source intimate images from children because images that are convincing enough to be harmful - maybe even as harmful as real images in some cases - can be produced using generative AI."
The IWF identified one "pedophile guide" developed by predators that explicitly encouraged perpetrators to use nudifying tools to generate material to blackmail children. The author of the guide claimed to have successfully blackmailed some 13-year-old girls.
The tools are a lucrative business. A new analysis of 85 websites selling nudify services found they may be collectively worth up to $36 million a year. The analysis from Indicator, a U.S. publication investigating digital deception, estimates that 18 of the sites made between $2.6 million and $18.4 million over the six months to May. Most of the sites rely on tech infrastructure from Google, Amazon, and Cloudflare to operate, and remain profitable despite crackdowns by platforms and regulators, Indicator said.
The proliferation of AI tools has led to new forms of abuse impacting children, including pornography scandals at universities and schools worldwide, where teenagers created sexualised images of their own classmates.
A recent Save the Children survey found that one in five young people in Spain have been victims of deepfake nudes, with those images shared online without their consent. Earlier this year, Spanish prosecutors said they were investigating three minors in the town of Puertollano for allegedly targeting their classmates and teachers with AI-generated pornographic content and distributing it in their school.
In the United Kingdom, the government this year made creating sexually explicit deepfakes a criminal offense, with perpetrators facing up to two years in jail. And in May, U.S. President Donald Trump signed the bipartisan "Take It Down Act," which criminalises the non-consensual publication of intimate images, while also mandating their removal from online platforms.
Meta also recently announced it was filing a lawsuit against a Hong Kong company behind a nudify app called Crush AI, which it said repeatedly circumvented the tech giant's rules to post ads on its platforms.
But despite such measures, researchers say AI nudifying sites remain resilient. "To date, the fight against AI nudifiers has been a game of whack-a-mole," Indicator said, calling the apps and sites "persistent and malicious adversaries."
Those in distress or having suicidal tendencies can seek help and counselling by calling these helplines.


Time of India
Indian woman caught shoplifting: US embassy reacts after viral video; issues fresh visa warning
(Source: YouTube/Body Cam Edition)
A day after a viral video showed an Indian tourist being arrested for allegedly shoplifting nearly $1,000 worth of items from a Target store in the United States, the US Embassy in India has issued a pointed warning to visa holders: breaking American laws could result in visa revocation and a permanent bar from future entry.
In a statement posted on X, the US Embassy said, 'Committing assault, theft, or burglary in the United States won't just cause you legal issues – it could lead to your visa being revoked and make you ineligible for future US visas. The United States values law and order and expects foreign visitors to follow all US laws.'
Just a day earlier, bodycam footage surfaced showing a woman identified as Avlani, an Indian tourist, being arrested for felony theft. The video, shared by YouTube channel @BodyCamEdition, shows her pleading with officers to allow her to pay for the items. 'Why can't I just pay for it?' she asks, but an officer replies, 'We're way past that. You committed a felony.'
US immigration lawyer Alen Takhsh said that even without a conviction, an arrest for shoplifting can carry severe immigration consequences. 'This is a Crime Involving Moral Turpitude, one that involves dishonesty and could have serious consequences,' he said earlier.
The advisory comes amid a broader clampdown under President Trump's immigration policy. According to the UN, more than 1.42 lakh (142,000) individuals have been deported from the US since January. The US has also tightened social media vetting for visa applicants and introduced a sweeping travel ban on several countries, though India remains unaffected by those measures.