
Latest news with #digitalrights

Survivors of online harms not getting the support they need: SG Her Empowerment survey

CNA

5 days ago

  • General
  • CNA


A new study by non-profit organisation SG Her Empowerment (SHE), which surveyed 25 survivors of online harms, has found that they are not getting the support they need. The survivors cited complex legal systems, delayed platform responses and societal stigma as barriers to getting help. SHE is calling for a system that allows survivors to report to a central agency, and for harmful content to be taken down quickly. Kate Low reports.

Why a new anti-revenge porn law has free speech experts alarmed

TechCrunch

24-05-2025

  • Politics
  • TechCrunch


Privacy and digital rights advocates are raising alarms over a law that many would expect them to cheer: a federal crackdown on revenge porn and AI-generated deepfakes.

The newly signed Take It Down Act makes it illegal to publish nonconsensual explicit images, real or AI-generated, and gives platforms just 48 hours to comply with a victim's takedown request or face liability. While widely praised as a long-overdue win for victims, experts have also warned its vague language, lax standards for verifying claims, and tight compliance window could pave the way for overreach, censorship of legitimate content, and even surveillance.

'Content moderation at scale is widely problematic and always ends up with important and necessary speech being censored,' India McKinney, director of federal affairs at Electronic Frontier Foundation, a digital rights organization, told TechCrunch.

Online platforms have one year to establish a process for removing nonconsensual intimate imagery (NCII). While the law requires takedown requests come from victims or their representatives, it only asks for a physical or electronic signature; no photo ID or other form of verification is needed. That likely aims to reduce barriers for victims, but it could create an opportunity for abuse.

'I really want to be wrong about this, but I think there are going to be more requests to take down images depicting queer and trans people in relationships, and even more than that, I think it's gonna be consensual porn,' McKinney said.

Senator Marsha Blackburn (R-TN), a co-sponsor of the Take It Down Act, also sponsored the Kids Online Safety Act, which puts the onus on platforms to protect children from harmful content online. Blackburn has said she believes content related to transgender people is harmful to kids. Similarly, the Heritage Foundation, the conservative think tank behind Project 2025, has also said that 'keeping trans content away from children is protecting kids.'

Because of the liability that platforms face if they don't take down an image within 48 hours of receiving a request, 'the default is going to be that they just take it down without doing any investigation to see if this actually is NCII or if it's another type of protected speech, or if it's even relevant to the person who's making the request,' said McKinney.

Snapchat and Meta have both said they are supportive of the law, but neither responded to TechCrunch's requests for more information about how they'll verify whether the person requesting a takedown is a victim.

Mastodon, a decentralized platform that hosts its own flagship server that others can join, told TechCrunch it would lean towards removal if it was too difficult to verify the victim.

Mastodon and other decentralized platforms like Bluesky or Pixelfed may be especially vulnerable to the chilling effect of the 48-hour takedown rule. These networks rely on independently operated servers, often run by nonprofits or individuals.
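The compliance mechanics described above (a signed request, no identity verification, and a 48-hour clock) can be made concrete with a rough sketch. The code below is illustrative only; the class, field, and function names are hypothetical and not drawn from any platform's actual implementation, but it shows why critics expect takedown-by-default: the cheapest way to be sure the deadline is met is to remove first and review later, if at all.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=48)  # compliance window described in the Take It Down Act


@dataclass
class TakedownRequest:
    """Hypothetical model of a takedown request as the law describes it."""
    content_url: str
    requester_name: str
    signature: str  # a physical or electronic signature is all the statute requires
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def deadline(self) -> datetime:
        return self.received_at + TAKEDOWN_WINDOW


def handle_request(req: TakedownRequest) -> str:
    """Sketch of the incentive critics describe: remove first, investigate later (or never)."""
    if not req.signature:
        return "rejected: signature missing"
    # No photo ID or other proof of identity is required by the statute, so the
    # platform cannot easily tell a victim from a bad-faith requester. Missing the
    # deadline risks liability, so the safe default is scheduling removal immediately.
    return f"content at {req.content_url} scheduled for removal before {req.deadline().isoformat()}"


if __name__ == "__main__":
    req = TakedownRequest("https://example.com/post/123", "Jane Doe", "/s/ Jane Doe")
    print(handle_request(req))
```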
Under the law, the FTC can treat any platform that doesn't 'reasonably comply' with takedown demands as committing an 'unfair or deceptive act or practice' – even if the host isn't a commercial entity. 'This is troubling on its face, but it is particularly so at a moment when the chair of the FTC has taken unprecedented steps to politicize the agency and has explicitly promised to use the power of the agency to punish platforms and services on an ideological, as opposed to principled, basis,' the Cyber Civil Rights Initiative, a nonprofit dedicated to ending revenge porn, said in a statement.

Proactive monitoring

McKinney predicts that platforms will start moderating content before it's disseminated so they have fewer problematic posts to take down in the future. Platforms are already using AI to monitor for harmful content.

Kevin Guo, CEO and co-founder of AI-generated content detection startup Hive, said his company works with online platforms to detect deepfakes and child sexual abuse material (CSAM). Some of Hive's customers include Reddit, Giphy, Vevo, Bluesky, and BeReal.

'We were actually one of the tech companies that endorsed that bill,' Guo told TechCrunch. 'It'll help solve some pretty important problems and compel these platforms to adopt solutions more proactively.'

Hive's model is software-as-a-service, so the startup doesn't control how platforms use its product to flag or remove content. But Guo said many clients insert Hive's API at the point of upload to monitor content before anything is sent out to the community.

A Reddit spokesperson told TechCrunch the platform uses 'sophisticated internal tools, processes, and teams to address and remove' NCII. Reddit also partners with the nonprofit SWGfL to deploy its StopNCII tool, which scans live traffic for matches against a database of known NCII and removes accurate matches. The company did not share how it would ensure the person requesting the takedown is the victim.

McKinney warns this kind of monitoring could extend into encrypted messages in the future. While the law focuses on public or semi-public dissemination, it also requires platforms to 'remove and make reasonable efforts to prevent the reupload' of nonconsensual intimate images. She argues this could incentivize proactive scanning of all content, even in encrypted spaces. The law doesn't include any carve-outs for end-to-end encrypted messaging services like WhatsApp, Signal, or iMessage. Meta, Signal, and Apple have not responded to TechCrunch's request for more information on their plans for encrypted messaging.
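Both approaches described above, a classifier called at the point of upload (as Guo describes for Hive's clients) and matching new uploads against a database of known NCII (as Reddit does via StopNCII), follow the same general pattern: intercept content before it is published and check it against a detector or a hash list. The sketch below shows that pattern in generic form, assuming hypothetical function names; the exact-hash matching is a simplification (real deployments typically rely on perceptual hashes that survive resizing and re-encoding), and nothing here reflects Hive's or Reddit's actual code.

```python
import hashlib

# Hypothetical fingerprint list of known NCII, e.g. populated from victim-submitted hashes.
KNOWN_NCII_HASHES: set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    # Simplification: an exact SHA-256 digest. Production systems generally use
    # perceptual hashing so crops, resizes and re-encodes of an image still match.
    return hashlib.sha256(image_bytes).hexdigest()


def classify_with_model(image_bytes: bytes) -> bool:
    # Placeholder for a third-party moderation classifier (a Hive-style API call
    # would go here). Stubbed out; returns True when content should be flagged.
    return False


def moderate_before_publish(image_bytes: bytes) -> str:
    """Generic pre-upload moderation hook: block content before it reaches the community."""
    if fingerprint(image_bytes) in KNOWN_NCII_HASHES:
        return "blocked: matches known NCII fingerprint"
    if classify_with_model(image_bytes):
        return "held for review: flagged by classifier"
    return "published"
```

McKinney's worry, put in these terms, is that the same hook could eventually be pushed beyond public uploads into client-side scanning of end-to-end encrypted messages, which the law does not explicitly exempt.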
Broader free speech implications

On March 4, Trump delivered a joint address to Congress in which he praised the Take It Down Act and said he looked forward to signing it into law. 'And I'm going to use that bill for myself, too, if you don't mind,' he added. 'There's nobody who gets treated worse than I do online.'

While the audience laughed at the comment, not everyone took it as a joke. Trump hasn't been shy about suppressing or retaliating against unfavorable speech, whether that's labeling mainstream media outlets 'enemies of the people,' barring The Associated Press from the Oval Office despite a court order, or pulling funding from NPR and PBS.

On Thursday, the Trump administration barred Harvard University from accepting foreign student admissions, escalating a conflict that began after Harvard refused to adhere to Trump's demands that it make changes to its curriculum and eliminate DEI-related content, among other things. In retaliation, Trump has frozen federal funding to Harvard and threatened to revoke the university's tax-exempt status.

'At a time when we're already seeing school boards try to ban books and we're seeing certain politicians be very explicit about the types of content they don't want people to ever see, whether it's critical race theory or abortion information or information about climate change… it is deeply uncomfortable for us with our past work on content moderation to see members of both parties openly advocating for content moderation at this scale,' McKinney said.

U.S. tech firms earn dismal grades on human rights report card

Fast Company

19-05-2025

  • Business
  • Fast Company


This story originally appeared in Global Voices.

A decade after the first assessment, the 2025 Ranking Digital Rights Index: Big Tech Edition reveals a landscape of paradox. While some of the world's most influential digital platforms demonstrate incremental improvements in transparency, particularly in governance disclosures from Chinese companies like Alibaba, Baidu, and Tencent, the overall picture suggests a concerning inertia. In a world grappling with rising authoritarianism, the use of AI tools, and ongoing global conflicts, the report shows that many Big Tech companies are largely continuing with 'business as usual,' failing to address critical issues.

The concentration of power within Big Tech remains a central concern. The report highlights how companies like Alphabet, Amazon, Apple, Meta, and Microsoft have aggressively acquired competitors, consolidating their dominance in the digital landscape. This market concentration, where Alphabet, Meta, and Amazon capture two-thirds of online advertising revenue, grants them power over online access and information flows. Despite increasing scrutiny from legal systems, evidenced by rulings against Google for illegal monopolies in search and advertising, the political influence of Big Tech appears to have increased. The symbolic image of US Big Tech CEOs in the front row of the presidential inauguration underscores their deep connections with government bodies, potentially hindering much-needed oversight at a time when human rights and democratic structures face unprecedented challenges globally.

This dominance is further exacerbated in a context of conflict. 'Alphabet, Amazon, and Microsoft have all developed tools meant for war and integration with lethal weapons. Their cloud infrastructure has powered military campaigns,' reveals the report. Ranking Digital Rights also calls attention to propaganda, especially on X and platforms owned by Meta.

Lack of transparency

While the report highlights pockets of progress, particularly among Chinese companies (Alibaba, Tencent, and Baidu), showing increased transparency in governance, patterns have been spotted throughout the analysis that raise concerns. Though Meta has shown improvements in disclosing how its algorithms curate content and has enhanced security with default end-to-end encryption on some messaging services, significant shortcomings persist across the industry. A common issue is the widespread lack of transparency in how companies handle private requests for user data or content restrictions, with Samsung notably disclosing no information in this area.

The very engines of Big Tech's profit, algorithms and targeted advertising, remain largely opaque. Despite the known risks for democracies linked to disinformation and election interference, none of the assessed companies achieved even half the possible score in this area. Alphabet and Meta even showed slight declines in transparency related to their targeted advertising practices. Most companies fail to disclose information about advertisements removed for violating their policies or provide evidence of enforcing their ad targeting rules.

X declined significantly more than other companies analyzed. 'The company's transformation from the publicly listed Twitter to the privately held X Corp. and the elimination of its human rights team coincided with a significant drop in transparency across its governance, freedom of expression, and privacy practices,' the report emphasized.

X failed to publish a transparency report in both 2022 and 2023. While a report finally surfaced in September 2024, it fell outside the assessment's cutoff. Even more troubling is the reported removal of years' worth of transparency reports dating back to 2011.

Finally, the report points to a troubling pattern of policy evolution. Companies like Meta and YouTube have been revising their content policies in ways that have sparked widespread concern, such as Meta dismantling its third-party fact-checking program in the US and YouTube removing 'gender identity' from its hate speech policy. Global Voices covered the consequences of this policy in Africa, and also how fact-checking practices are needed amidst digital authoritarianism, especially during elections, such as the case of Indonesia. This suggests a potential shift towards justifying existing behaviors rather than upholding previously embraced principles.

The 2025 RDR Index demonstrates stagnation at a critical time. While acknowledging some positive developments, the report also calls for a renewed effort from different stakeholders, especially civil society, investors, and policymakers.

Africa Daily: Why are so many countries saying yes to Starlink?

BBC News

13-05-2025

  • Business
  • BBC News


The number of African countries now allowing Elon Musk's company Starlink to provide internet services has been growing rapidly - six have granted permission in 2025 alone. And there are reports that Uganda might be next. Starlink can be cheaper than some traditional internet providers and has been seen as a way to provide internet access to communities that are hard to reach. But does it come at a cost to governments, who might have less control over internet access? And what does it mean for local economies if a big, international company has access to the market? Alan Kasujja speaks to Temidayo Onionsun, a Nigerian space scientist, and Juliet Nanfuka, a digital rights activist and member of the African Digital Rights Network.

Presenter whose intimate pics were leaked fights back

Yahoo

09-05-2025

  • Entertainment
  • Yahoo


TV presenter Jess Davies was just 15 years old when images of her in her underwear were shared around her town. She had exchanged photos with a boy she fancied, and he had forwarded them on to others without her consent.

She was in art class when her phone started buzzing with messages from older boys. "Nice pictures," read one. "I didn't think you were that type of girl," came another.

"It turns out my images had been Bluetoothed around the whole sixth form centre, which quickly got shared around my school, then around my hometown and eventually ended up on the phones of the men's football team in the town," said Jess.

Warning: Contains sexually explicit language and themes

"It's a small town so people knew who I was and knew I was underage and yet still flashed my images around to people that were in their 20s or 30s," Jess said. Eventually news of the images reached her grandmother, who told her parents.

This was to be the first of several incidents Jess experienced in her teens and 20s that would later inform her women's rights campaigning. Her 2022 BBC documentary Deepfake Porn: Could You Be Next? was used to lobby the UK government to criminalise sexually explicit deepfakes in the Online Safety Act.

Now she has written a book, No One Wants To See Your D*ck: A Handbook for Survival in the Digital World, for which she has had to explore everything from sexual harassment to cyber-flashing and catfishing, and tells of men on well-known, easy-to-access forums requesting explicit deepfakes of their mothers and teachers. Others are uploading explicit photos of women they know and asking other men to write rape fantasies about them, Jess said.

"These aren't some weirdos in their mum's basement who are chronically online, never leave their homes and don't have a social life, no, these are people's friends and people's husbands," said Jess, who lives in Penarth, Vale of Glamorgan. "There's a generation that's growing up online and it's a generation who don't see women as whole humans who have rights. It's a pandemic of misogyny that is unfolding online and isn't being taken seriously."

Jess, who grew up in the seaside town of Aberystwyth in Ceredigion, said she had dealt with unwanted male attention since she was a child. "I developed my body when I was really young and started wearing a bra when I was in year four so by the time I was in year six I would get comments from grown adult men about me being jailbait," she said. "It's never really spoken about how girls who develop early are just treated so differently, all of a sudden it's like you're seen as 'you're mature now'."

When her photos were leaked at 15 her parents were supportive, but Jess said she got her first taste of victim-blaming from others around her. "So much shame is put on the victim. It's like, 'why did you take that? Why did you share that?' I'm like, 'why did someone share that without my consent? And why are grown adult men passing it around?'."

Three years later Jess was a glamour model. She said it was an attempt to "reclaim a bit of power back". "You've all seen my images, you all have this idea of me, so why not make some money out of this and make a career out of it?" she explained.

When she started out modelling she decided she was only going to do lingerie and swimwear shoots - but she said this was also taken out of her control. She said she agreed to pose in a mesh swimsuit on the agreement her nipples would be edited out. A couple of months later a man messaged her on social media to compliment her on the images.
She searched for them online and discovered the agreement had been broken and her nipples were on show. "From then on it just kind of spiralled really," she said. "You're trying to grapple with holding on to some kind of power and holding on to some kind of boundaries but other people keep taking them from you."

Jess said before she was even 20 she had "just kind of accepted that this is just how it is". But then she was let down by someone she had hoped she could trust.

"I really liked this guy," she recalled. "He'd made a few comments about telling his friends that I was 'Jess from Nuts magazine' and you think 'okay, you see me through that lens', but you brush it off because you like them."

One morning after staying over at his she woke with a weird sense that something was not right. While he was in the shower she decided to check his phone.

"It opened on to a group chat and there was an image of me totally naked in his bed and asleep - he'd sent that in the group chat," she said. "He had a single bed, so I was like, 'you would have had to stand up to take that, it's such a conscious decision'."

She quickly deleted the image from his phone, knowing those who had received it may have already saved it and forwarded it on. What do you say to someone who has done that to you?

"When he came back I still didn't say anything because I was just so ashamed and embarrassed, I didn't like confrontation and I didn't want to argue," said Jess.

It was while studying sociology at university that Jess made her first foray into feminism and began really questioning her experiences. "I had this sense of anger, I just felt it wasn't fair that women were being treated this way and women losing total control over their images online," she said.

It was only once she started speaking to other women and experts in the field while making her documentary that she started to let go of the shame and blame she had carried for years. "That was really life-changing for me," she said. "[I realised] there was something here that we can fight for and try and change things, which is what I've been doing ever since."

Jess, 32, said one of the many reasons she wrote her book was to call out victim-blaming, which she said remained "rife". "You shouldn't have gone to that house party, you shouldn't have sent that photo, or don't wear that short skirt because you're going to get attention," she said, were remarks that only further solidified sexist attitudes and removed the blame from the perpetrator.

She said she was glad when Netflix's Adolescence started a conversation on the so-called manosphere but wants the conversation to go further. "What we're missing is the teenage girls who are actually being affected by this," she said. "We're expecting them to be able to navigate this male entitlement at a very young age, where they're being pressured to send images and being turned into explicit deepfakes. We're putting all our attention on giving these workshops to teenage boys and talking about saving them from being radicalised, which is important, but no-one's talking about what to do for teenage girls."

Jess wants to see more money put into educating young people about navigating digital spaces, arguing the occasional workshop was not enough to counteract the thousands of hours teens spend online potentially being exposed to misogynistic content.
Men also need to stop being so defensive and call out bad behaviour in other men, she said. "I always get in my [social media] comments, 'not all men'. Of course, not all men - but you're just shutting down the conversation," she said. "Instead of being defensive, actually listen to women... read books by women, listen to podcasts that are presented by women, watch documentaries that are presented by women."

She said parents also needed to be more "switched on". "If Adolescence shocked you that's shocking to me, because that's basic, basic level stuff," she said. "That is skimming the surface of what's happening in these spaces."
