Latest news with #AgeCheckCertificationScheme


The Guardian
29-05-2025
- Health
Tech trial for Australia's social media ban 'broadly on track' amid concerns under 16s could circumvent systems
The technology trial for Australia's social media ban is 'broadly on track', the government says, despite a month-long delay to a key report on the best ways to keep under 16s off the platforms.

It comes as the company behind the age assurance trial has revealed only one type of technology has been tested on children so far, and as internal stakeholders have raised concerns about how young people may circumvent the age ban systems.

The federal government has also been sitting on a separate report, costing more than $275,000, that it commissioned last year on Australians' attitudes to age assurance technology. It was delivered to the government on 2 January but has not yet been released.

The UK-based company recruited to run the trial, Age Check Certification Scheme (ACCS), was due to publish its age assurance report in June. The report will focus on what technology could be used to prevent under 16s gaining access to social media and under 18s accessing adult websites.

The federal communications department has confirmed the ACCS report would now be delivered in July, and the minister would decide when to publish it, a spokesperson said. 'The independent trial of age assurance technologies remains broadly on track, in line with project delivery timeframes,' the spokesperson said.

Briefing documents from Senate estimates in February, released under freedom of information laws, stated the final report 'is due in June 2025'. ACCS had previously stated the report was due 'at the end of June', and that it would publish it independently.

One of the first tasks for the new communications minister, Labor's Anika Wells, will be to assess the outcome of the trial, to decide which technologies are applicable and to which platforms they will apply.

Affected platforms must have age assurance systems in place by December. Wells must be satisfied that the platforms – expected to include Facebook, Instagram, TikTok and Snapchat – are taking reasonable steps to stop under 16s accessing their services.

In an update on the age assurance trial last week, ACCS said the only type of technology trialled so far is facial age estimation, which examines a photo or video of a user and attempts to estimate their age from their facial features. A total of 1,580 tests have been conducted on 485 students in years 7 to 12.

Aside from this testing, further work has been limited to interviews with dozens of potential vendors, and statements outlining how their age ban enforcement technology could work. The trial will try to confirm those claims through 'a combination of practical testing and a vendor interview'. About half of the interviews have been completed.

Technologies deemed sufficiently mature to include in the final report will be tested by another company – the Australian-owned KJR – or through schools testing or mystery-shopper-style testing. Mystery-shopper testing takes place in a 'real-world environment, where users will have a variety of equipment, light conditions and access to required resources, be that an ID document or a bank account'. ACCS said there will only be 'enough testing' to confirm claims made by vendors, 'and that may be achieved with a relatively modest level of practical experimentation'.

The March meeting minutes for the stakeholder advisory board overseeing the trial reported stakeholders had raised concerns about gaps in the testing, particularly around how children may circumvent the age ban systems.
A spokesperson for the department said a preliminary report, provided in April but not released publicly, gave the government 'anticipated findings in relation to age verification, age estimation, age inference, successive validation, parental control and parental consent methods'.

Sources close to the trial told Guardian Australia they believed it was unlikely the report on the trial would be finalised by the due date – or that, if it was, it would not be adequate to inform government decision-making on the best technology to use.

One concern raised was that other countries, including New Zealand and the United States, are looking to Australia's trial to guide their own plans. Those who supported the policy wanted it implemented correctly, rather than rushed through with technology that could later present privacy or other issues.

The Social Research Centre was commissioned in August, and paid $278,000, to research attitudes to age assurance. This included an online survey of 3,140 adults and 870 people aged 8 to 17. A spokesperson for the department said it was a matter for the minister when that report, delivered to government in January, would be released.

A spokesperson for the Albanese government did not directly respond to questions on the timing of the tech trial report or the release of the Social Research Centre report. 'The government looks forward to receiving the age assurance report and progressing our reforms to protect children from social media harms,' the spokesperson said.


The Guardian
04-04-2025
- Politics
Australia's social media ban is attracting global praise – but we're no closer to knowing how it would work
The smash-hit Netflix show Adolescence, which explores a teenage murder fuelled by social media and toxic masculinity, has renewed calls for social media bans in some countries. One of the show's stars said this week that the UK should follow Australia's lead in banning children aged under 16 from social media platforms.

The ban has been praised in the US and UK, and is described as 'world-leading' by the Australian government. Time magazine this week praised the prime minister, Anthony Albanese, for a 'remarkable' policy that was 'politically uncontroversial' on the basis that both major parties supported it.

Left unsaid was the criticism raised by mental health groups, LGBTQ+ groups and other campaigners during the rushed process to pass the bill in parliament last year. The committee reviewing the bill scrutinised the legislation for only a single day, despite receiving more than 15,000 submissions.

Author Jonathan Haidt, who reportedly lobbied politicians in Australia to push the policy before it was adopted and privately dismissed critics of his approach, told the New York Times this week that 'it's going to work. It doesn't have to be perfect at first, but within a few years it will be very good'. If it worked in Australia, it was going to go global 'very quickly', he said.

But nine months out from the policy coming into effect, Australians are still in the dark about how our ban – which was passed by parliament in November 2024 – will work. And that is likely to remain the case up to the federal election on 3 May.

A trial of age assurance technology is under way, with schoolchildren still being recruited to participate just weeks before the first report is due. The under-16s social media ban is due to come into effect in December, but the government faces a number of hurdles before then, including figuring out what technology to use, and whether the platforms – emboldened by the apparent backing of Donald Trump – will comply.

The Age Check Certification Scheme (ACCS), a UK-based company recruited by the Albanese government to assess the technology used to determine whether people are the age they say they are when accessing social media, is due to provide a preliminary report to the government by the end of April.

While the report is said to be on track to be delivered this month, Guardian Australia has confirmed this preliminary report will not be released publicly by the company. A spokesperson for the communications department said the report was never intended for public release and is designed 'to afford procedural fairness to trial participants' on any changes that need to be made.

The final report is due in June, just two months after the preliminary report, before the communications minister – whoever that is after the federal election – gets to decide which platforms the ban applies to, and what technology is appropriate.

ACCS has begun recruiting school-age children to test the various technologies, but an education and consent process is still under way. From there, schoolchildren will test age estimation (where technology estimates how old a user is), age assurance (where a parent or guardian confirms an age) and age verification (using some form of identity document) technologies. The children will act as 'mystery shoppers' and attempt to access a purpose-built online platform through the various age assurance methods, documents released by ACCS state.
This process leaves just weeks for the trials to be conducted and analysed, and a final report prepared for government.

The ban is not a major focus of the federal election campaign – it had bipartisan support, after the Coalition pushed for it for months until the Labor government relented. However, there are still major concerns over how the ban will work, and who is included.

TikTok and Meta, for example, are angry over the carve-out YouTube received. The government's messaging on why this exemption was allowed has been mixed. The communications minister, Michelle Rowland, said last year that YouTube would be included in a range of services exempt from the ban on health and education grounds. But in the draft wording of the document that sets out which services are exempt, YouTube is granted an exemption on its own, while health and education services are a separate carve-out.

In response to questions in Senate estimates from the Greens senator Sarah Hanson-Young last month, the department said the exclusion was 'consistent with broad community sentiment, which highlights the value of YouTube as a tool for education and learning'.

Evidence shows most children under 13 accessing social media are accessing YouTube. A report from the eSafety commissioner last month stated more than 80% of children aged between eight and 12 are accessing social media, despite the current minimum age requirement being 13. However, this figure was largely skewed by children accessing YouTube, either by watching while logged out or by using a parent or carer's account. When YouTube is excluded, the figure is closer to 44%. TikTok and Snapchat are second and third behind YouTube (68%), on 31% and 19% respectively.

It is also worth noting that the stakeholder advisory board overseeing the trial includes some members who have long called for bans or restrictions on online pornography, and who have called for online censorship. But missing from the board are digital rights and privacy groups. Those organisations have since been invited to apply to join the stakeholder advisory board, after inquiries from Guardian Australia, but do not yet appear to have been included.

Whether all this results in a report the government can rely upon and implement before the end of this year remains to be seen, as does whether the social media companies will be willing players. The inconsistent treatment of some platforms over others might lead companies such as Meta – which has already approached the Trump administration over its treatment by the Australian government – to press the US government to push back on the ban before it comes into effect. But this week Albanese, and the opposition leader, Peter Dutton, said the ban was not up for negotiation.


The Independent
30-03-2025
- Politics
As UK families grieve, can one determined country stop social media harming children?
Ellen Roome has said more than once that if her son had been hit by a car, his death would have at least made some kind of sense. But after finding 14-year-old Jools dead in his room on a night in April 2022, she is still searching for answers.

'Not one person in Jools' life thought there was a problem. Not one teacher, not one adult, not one child,' Ms Roome says nearly three years later.

Her crusade is now squarely aimed at social media, and after finding out about the deaths of other British teenagers in similar circumstances, she has joined a group of parents suing TikTok over a dangerous online 'blackout' challenge they believe their children took part in.

Ms Roome has tried to access her son's social media accounts to see the content he was looking at before his death, but says she has been blocked by the platforms. 'I thought, well, we're responsible for a minor. Why on earth can't we see what he's looking at?'

In the past week, the grief of another family involved in the action against TikTok was made plain before a coroner, who is investigating the death of Maia Walsh, a 13-year-old girl found dead in her Hertford bedroom in October 2022 after seeing concerning content on the platform. Months before, she had commented: 'I don't think I'll live past 14.'

Harrowing tales like these have sparked a debate over the best ways to protect children from social media harms. The government is already facing criticism that new laws now in force ordering tech companies to remove dangerous content are not robust enough, while prime minister Sir Keir Starmer batted away a Conservative push for a blanket phone ban in schools as 'wasting time' and 'completely unnecessary'. Labour backbencher Josh MacAlister's fight to place age restrictions on Facebook, TikTok and similar platforms was shot down by technology minister Peter Kyle.

But in Australia, parents' anxiety over their children's exposure to an unsupervised online world has shaped concrete government action: a ban on teenagers under the age of 16 from accessing social media.

The new laws, which have been given a year to take effect, are a litmus test for a society growing increasingly fearful of the harms children face on their smartphones, including violent radicalisation, misogyny, eating disorders and bullying.

'We know social media is doing social harm,' Australian prime minister Anthony Albanese said upon introducing the legislation in November. 'We want Australian children to have a childhood, and we want parents to know the government is in their corner.'

But alongside the question of whether the government should bar children from the platforms is the question of whether it actually can, as doubts are raised over the effectiveness of systems designed to restrict the ages of their users.

Age Check Certification Scheme founder Tony Allen believes a ban is absolutely possible. The UK-based company has been tasked by the Australian government with undertaking a trial of age assurance technology – so far involving 55 participants and 62 different systems – that will underpin the success of the scheme.

Mr Allen says age assurance is split into three categories: age verification, linked to proving someone's date of birth; age estimation, which analyses a person's biometric data such as their pulse and facial features; and age inference, which assumes someone's age based on a particular qualifier, like owning a credit card.
'You have to be over 18 to be able to be issued with a credit card… so the reasonableness of the inference is the law requires you to be over 18. You're therefore likely to be over 18,' he says.

However, he qualifies that whatever system the government chooses will involve a never-ending game of catch-up to fend off those finding new ways to get around it. 'There's a lot of work going on on how you detect deep-fakes and injection attacks,' he says, explaining that the latter 'injects code right behind the camera, right and then tricks the system into thinking it's looking at you, and it's not'.

Another pitfall is the tendency of some artificial intelligence to discriminate against people of colour by assuming they are younger than they actually are, according to Professor Toby Walsh, chief scientist at the University of New South Wales's AI Institute.

But Prof Walsh, who is independently overseeing the trial, is broadly optimistic. He has likened the ban to age restrictions on smoking and drinking in that, while it is unlikely to be flawless, it could be a major driver in forcing cultural change.

'You go behind the bicycle sheds, maybe at school, you will find people smoking cigarettes. Young people will find ways to access alcohol. But we have made it difficult, and we have made it illegal to provide tobacco and alcohol to people underage, and that has changed the conversation around those things,' he says.

Despite the legislation passing in November with opposition support, the approach has been sharply criticised by independent MPs and the Greens, as well as human rights organisations, who have warned it will leave marginalised teenagers, such as those in the LGBTQ+ community, without a place to interact.

Contributing to the criticism is Andy Burrows, CEO of the Molly Rose Foundation, a suicide prevention charity set up following the death of British teenager Molly Russell, who took her own life after viewing toxic content online.

'Banning under-16s from social media is a backwards step that would push risks and bad actors onto gaming and messaging services and leave young people at a cliff edge of harm when they turn 16,' Mr Burrows says. 'Children should not be punished for the failures of tech platforms nor the delayed response from successive governments. Our young people's safety deserves strong, effective solutions to complex problems.'

Unsurprisingly, the social media giants targeted by the law are also opposed to what they claim is a rushed bill that will fail to achieve its goals. Meta, the parent company of Facebook and Instagram, argued before an Australian parliamentary inquiry into the legislation that the evidence did not support a blanket ban and that it was unclear what 'reasonable steps' companies needed to take to bar children from their platforms in order to avoid penalties of nearly $50 million (£24.4 million). 'This ambiguity is problematic as understanding a person's real age on the internet is a complex challenge,' the company's submission reads.

However, Australia's eSafety commissioner and former industry insider, Julie Inman Grant, says she has already spent years calling on tech companies to be more proactive in addressing the harms on their platforms. 'It's not as though they haven't been given the chance,' she says. 'But age assurance in isolation is not enough. We also need to keep the pressure on the tech industry to ensure their services are safer, and our systemic transparency powers and codes and standards are already having an effect in this area.'
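For readers curious how the three categories Allen describes might fit together in practice, the following is a minimal, hypothetical sketch of that decision logic in Python. Every name, threshold and safety margin here is invented for illustration; it does not represent the ACCS trial's actual systems or any vendor's product.

```python
# Hypothetical sketch of the three age-assurance categories Tony Allen describes:
# verification (proven date of birth), estimation (biometric estimate) and
# inference (a qualifier that legally implies a minimum age, such as a credit card).
# All names and numbers are assumptions made for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignal:
    verified_dob_year: Optional[int] = None   # age verification: a proven birth year
    estimated_age: Optional[float] = None     # age estimation: e.g. facial-analysis output
    has_credit_card: bool = False             # age inference: holder must be over 18

def meets_minimum_age(signal: AgeSignal, minimum: int, current_year: int = 2025) -> bool:
    # 1. Verification: the strongest signal, a proven date of birth.
    if signal.verified_dob_year is not None:
        return (current_year - signal.verified_dob_year) >= minimum
    # 2. Estimation: biometric estimators can be off by a few years,
    #    so apply a safety margin (the 2-year margin is an assumption).
    if signal.estimated_age is not None:
        return signal.estimated_age >= minimum + 2
    # 3. Inference: owning a credit card implies the user is over 18,
    #    which covers any threshold of 18 or below.
    if signal.has_credit_card and minimum <= 18:
        return True
    return False

# A facial estimate of 17.5 does not clear a 16+ threshold once the margin is applied.
print(meets_minimum_age(AgeSignal(estimated_age=17.5), minimum=16))   # False
print(meets_minimum_age(AgeSignal(has_credit_card=True), minimum=16)) # True
```

The sketch also hints at why Allen calls this a catch-up game: each branch relies on an input (a document, a camera feed, a bank relationship) that a determined under-16 user could borrow, spoof or inject.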
The outspoken Ms Inman Grant, who last month described Elon Musk as an 'unelected bureaucrat', was involved in a high-profile court dispute last year with X over the proliferation of a video on the platform that showed the stabbing of a controversial Sydney preacher. It later surfaced that Southport killer Axel Rudakubana had viewed the video before carrying out his notorious attack.

Prof Walsh concedes he is concerned about the willingness of American tech giants to comply with the new laws amid the shifting political climate in the US. 'The US-centric policies coming out of North America these days are certainly troubling,' he says, before turning to a precautionary principle enshrined in European law to support Australia's trajectory. 'We have been running a very interesting, but somewhat concerning, experiment on human society, especially the young people in human society... we are obliged to take a precautionary approach to the potential harmful effects.'

Despite the unknowns, Ellen Roome is supportive of what Australia is trying to pull off – in fact, she says it doesn't go far enough. 'Just get rid of it. It's not fit for children. It should be 18, in my opinion,' she says.

Earlier this month the UK's own online safety legislation came into force. Its primary goal is to make social media companies prevent and remove harmful content, such as extremist and child-abuse material, from being published on their platforms. The Department for Science, Innovation and Technology says the bill will make the UK 'the safest place in the world to be a child online'.

However, during that same time, provisions in a Labour backbencher's private member's bill to force social media companies to exclude teenagers under 16 from their algorithms were watered down to a commitment to researching the issue. But Ms Roome says that in that time children will continue to access harmful material: 'How much more research do you need?'

If you are experiencing feelings of distress, or are struggling to cope, you can speak to the Samaritans, in confidence, on 116 123 (UK and ROI), email jo@ or visit the Samaritans website to find details of your nearest branch.