Transgender 'pro-Nazi' wanted to carry out mass school shooting

Telegraph, 21-02-2025

A transgender teenager obsessed with murder wanted to carry out a mass shooting at their school, a court has heard.
The trans boy 'idolised' the killers behind the 1999 Columbine High School massacre in Colorado in the United States, in which 12 students and a teacher were gunned down.
The accused repeatedly spoke about doing the same at his Edinburgh secondary school, describing a 'Doomsday' on which he would 'clear it out'.
A large-scale police investigation was opened in the summer of 2023 after a social media photo of him at school in full combat gear, carrying an imitation gun, caused panic among pupils and parents.
The teenager had already been referred to a UK-wide programme designed to stop people becoming terrorists or supporting terrorism. The accused also held racist and pro-Nazi views.
The now 17-year-old, who cannot be identified for legal reasons and is understood to have been born a girl, appeared in the dock at the High Court in Glasgow.
The teenager pleaded guilty to a breach of the peace and a charge under the Terrorism Act. The crimes were committed between June 2022 and July 2023. Bail was revoked by the judge, Lord Arthurson, pending sentencing next month.
'Idolised school shooters in America'
Prosecutor Greg Farrell told how, on June 20 2023, the accused had turned up at school wearing boots and cargo trousers, and had brought a military tactical vest and helmet.
Mr Farrell said: 'He was later seen at the school carrying an imitation firearm while wearing the vest and helmet. A photograph was circulated on social media. It was taken and published without his knowledge.
'The image provoked a considerable degree of fear and alarm among pupils and parents. Police were advised by a parent who saw the image.'
Officers went on to discover that the accused had a TikTok account which had footage of the teenager in black combat clothes as well as a skeleton mask.
Mr Farrell said: 'One piece of commentary referenced school shootings.' The teenager was immediately suspended.
It emerged the vest and helmet were part of a costume for a short film the accused had been involved in for a drama class. The teenager had been the 'kidnapper' and had used a 'prop gun'.
Police took statements from other pupils who knew the accused. Mr Farrell said: 'They provided information that the boy had exhibited a variety of alarming behaviours over a period of time.
'The greatest concern was a suggestion he had divulged to various people a desire to carry out a school shooting similar to that which had taken place in 1999 at Columbine High School in Colorado in the USA.'
Classmates recalled how the accused 'spoke excitedly and with considerable enthusiasm' when they talked about Columbine and other school shootings.
The teenager 'sympathised' with the pair behind it – Dylan Klebold and Eric Harris – and would copy how they had dressed.
One girl said the accused 'idolised school shooters in America'.
'Place a bomb in every second classroom'
Mr Farrell said: 'In November 2022, he told her how he would go about carrying out a school attack.
'He explained that he would start on the second floor and that he would 'clear it out' using guns. He would then move downstairs continuing to shoot until police arrived, at which point he would turn the gun on himself.'
The accused described a possible mass shooting at the school as 'Doomsday' and said it would occur on April Fool's Day or the last day of their sixth year.
Mr Farrell said: 'He spoke about setting up trip wires at fire exits and of placing 'pipe bombs' at the school.'
The teenager claimed they would buy a 3D printer to help construct a firearm. The accused was said to be so 'interested' in Columbine that they said they wanted to change their name 'in an act of homage' to Klebold.
Mr Farrell said: 'One pupil told police that the boy wore the same black trousers, trench coat, cap worn backwards and circular glasses as favoured by one of the Columbine pair.'
The accused told another classmate that they would 'place a bomb in every second classroom'. They would then shoot people as they fled the building.
The accused openly chatted about the making of what were described as deadly 'pressure cooker bombs'.
'Can't wait to hold my gun again'
The teenager was stopped by police under the Terrorism Act as they returned from holiday with their family on July 9 2023, and their electronic devices were seized.
The court heard the accused had 65 videos of Columbine and had added music which appeared to 'glamorise' the mass killing.
Police also seized a journal in which the accused had made various sinister remarks. One stated: 'I can't wait to hold my gun again in my gear. Hoping I'll get a bomb... kill this time. Will be unstoppable.'
The hearing was told the accused had previously been referred to the Prevent counter-terrorism programme due to concerns.
Shelagh McCall KC, defending, had asked for the teenager to remain on bail pending sentencing. She described the accused as 'vulnerable' with ongoing issues.
However, Lord Arthurson did not continue bail. The teenager is instead expected to be sent to a secure unit for young people or what was described as a 'place of safety'. Sentencing was deferred for reports.


Related Articles

Terrified woman slams disgusting act on the 'worst street in Melbourne'

Daily Mail

A young woman has recalled the terrifying moment an elderly man exposed himself to her on Chapel Street as she shopped for a wedding dress. The bride-to-be was on the busy street - regarded as one of the city's most upmarket destinations for fashion - when the 'worst thing ever' happened last month. She was heading back to her car when a man approached her, undid his pants and exposed himself while she was waiting to cross the road.

'Today, I went to Chapel Street for the first time in a really long time and it has become so scary down there,' she said in a TikTok following the incident.

'I had to cross the road and I pressed on the traffic lights. It felt really eerie so I took a step back and put my back towards a shop front and I made a conscious effort to not be on my phone so I could like look at the surroundings.

'I am so glad I did because as I was waiting, about maybe two or three metres from me, a man stopped. He started playing with his fly... He then proceeded to undo his belt, his pants and I'm sure you can imagine what happened next.'

The woman said she 'freaked out' and ran into a nearby shop for safety, with two workers quickly locking the door behind her. She waited for the man to leave before going straight to her car and driving home.

The woman called police once she got home and provided them with specific details of what the man looked like and when and where the incident took place. She said the officer she spoke to asked her to come to the station to write a statement, which she was happy to do. But the woman said she became uneasy when she learned her full name would have to be included in the official report.

'I asked if this man was going to know my details because I have to put all of that in the statement, they said 'yes, he will be provided with your name and everything',' she continued in the TikTok.

The woman told police she did not want to provide her name but still wanted to report the incident to ensure it didn't happen to anyone else. She asked if officers could look up the CCTV in the area as she had provided specific details of the location but claims police simply said 'no'.

'They can't go looking into things unless I put my name to it, which will then go to this freak. They won't do anything,' the woman said.

Daily Mail Australia has contacted Victoria Police about the incident.

Social media users praised the woman for sharing her experience. One woman, who worked near Chapel Street, claimed to know the man and said he had behaved in the same way to 'a lot of young women'.

'Same thing happened when my coworker was attacked at work, they wouldn't do anything because they wouldn't give their personal details even with CCTV footage from the street and our workplace,' she wrote.

'Why are they giving out private information to the offender? Sounds like the cop didn't want to assist & knew you would back down,' another wrote.

A third chimed: 'And this, this is how women are not protected in the slightest. This is how women get hurt. I had a similar situation'.

Others encouraged the woman to report the incident anonymously via Crime Stoppers. 'Your name goes into the report but it doesn't go to the man! Call the cop shop and speak to someone different,' one person wrote.

The misogyny of the metaverse: is Mark Zuckerberg's dream world a no-go area for women?

The Guardian

Everybody knows that young women are not safe. They are not safe in the street, where 86% of those aged 18 to 24 have experienced sexual harassment. They are not safe at school, where 79% of young people told Ofsted that sexual assault was common in their friendship groups and almost a third of 16- to 18-year-old girls report experiencing 'unwanted sexual touching'. They are not safe in swimming pools or parks, or at the beach. They are not even safe online, with the children's safety charity the NSPCC reporting that social media sites are 'failing to protect girls from harm at every stage'.

This will come as no surprise to any woman who has ever used social media. But it is particularly relevant as Meta, the operator of some of the biggest social platforms on the internet, is busily engaged in constructing a whole new world. The company is pumping billions of dollars a year into building its metaverse, a virtual world that it hopes will become the future not just of socialising, but of education, business, shopping and live events. This raises a simple question: if Meta has utterly failed to keep women and girls safe in its existing online spaces, why should we trust it with the future?

Mark Zuckerberg has grandly promised: 'In the metaverse, you'll be able to do almost anything you can imagine.' It's the sort of promise that might sound intensely appealing to some men and terrifying to most women. Indeed, the deeply immersive nature of the metaverse will make the harassment and abuse so many of us endure daily in text-based form on social media feel 100 times more real and will simultaneously make moderation 100 times more difficult. The result is a perfect storm. And I am speaking from experience, not idly speculating: I spent days in the metaverse researching my book, The New Age of Sexism.

There is no single definition of the metaverse, but most people use the term to describe a shared world in which virtual and augmented technologies allow users (represented by avatars) to interact with people, objects and environments. Most of Meta's virtual world is accessible only to those who pay for the company's Quest headsets, but a limited number of metaverse spaces can be accessed by any device connected to the internet. Advanced technology such as 3D positional audio, hand tracking and haptic feedback (when controllers use various vibrations to coincide with actions you take) combine to make virtual worlds feel real. Your avatar moves, speaks and gestures when you do, allowing users to interact verbally and physically.

Less than two hours after I first entered the metaverse, I saw a woman's avatar being sexually assaulted. When I approached her to ask her about the experience, she confirmed: 'He came up to me and grabbed my ass.' 'Does that happen a lot?' I asked. 'All the time,' she replied, wearily.

I used my haptic controller to 'pick up' a bright-yellow marker and moved towards a giant blackboard. 'HAVE YOU BEEN ASSAULTED IN THE METAVERSE?' I wrote. The response was near instantaneous. 'Yeah, many times,' someone shouted. 'I think everybody's been assaulted in the damn metaverse,' one woman replied immediately, in a US accent. 'Unfortunately, it is too common,' a British woman added, nodding. Both women told me they had been assaulted multiple times. During my time in the metaverse, sexual harassment and unwanted sexual comments were almost constant.

I heard one player shout: 'I'm dragging my balls all over your mother's face,' to another and witnessed male players making claims about 'beating off', as well as comments about 'gang bangs'. My virtual breasts were commented on repeatedly. I did not witness any action taken in response – whether by a moderator or by another player. A damning TechCrunch report from 2022 found that human moderators were available only in the main plaza of Meta's metaverse game Horizon Worlds – and that they seemed more engaged in giving information on how to take a selfie than moderating user behaviour.

More worryingly still, I visited worlds where I saw what appeared to be young children frequently experiencing attention from adult men they did not know. In one virtual karaoke-style club, the bodies of the singers on stage were those of young women in their early 20s. But based on their voices, I would estimate that many of the girls behind the avatars were perhaps nine or 10 years old. Conversely, the voices of the men commenting on them from the audience, shouting out to them and following them offstage were often unmistakably those of adults.

It is particularly incumbent on Meta to solve this problem. Of course, there are other companies, from Roblox to Microsoft, building user-generated virtual-reality gaming platforms and virtual co-working spaces. But, according to NSPCC research, while 150 apps, games and websites were used to groom children online between 2017 and 2023, where the means of communication was known, 47% of online grooming offences took place on products owned by Meta.

These are not isolated incidents or cherry-picked horror stories. Research by the Center for Countering Digital Hate (CCDH) found that users were exposed to abusive behaviour every seven minutes in the metaverse. During 11 and a half hours recording user behaviour, the report identified 100 potential violations of Meta's policies. This included graphic sexual content, bullying, abuse, grooming and threats of violence. In a separate report, the CCDH found repeated instances of children being subjected to sexually explicit abuse and harassment, including an adult asking a young user: 'Do you have a cock in your mouth?' and another adult shouting: 'I don't want to cum on you,' to a group of underage girls who explicitly told him they were minors.

Since its inception, Meta's virtual world has been plagued with reports of abuse. Users have reported being virtually groped, assaulted and raped. Researchers have also described being virtually stalked in the metaverse by other players, who tail them insistently, refuse to leave them alone and even follow them into different rooms or worlds. In December 2021, a beta tester of the metaverse wrote in the official Facebook group of the Horizon platform: 'Not only was I groped last night, but there were other people there who supported this behaviour.'

What was even more revealing than the virtual assault itself was Meta's response. Vivek Sharma, then vice-president of Horizon at Meta, responded to the incident by telling the Verge it was 'absolutely unfortunate'. After Meta reviewed the incident, he claimed, it determined that the beta tester didn't use the safety features built into Horizon Worlds, including the ability to block someone from interacting with you. 'That's good feedback still for us because I want to make [the blocking feature] trivially easy and findable,' he continued.

This response was revealing. First, the euphemistic description of the event as 'unfortunate', which made it sound on a par with poor sound quality. Second, the immediate shifting of the blame and responsibility on to the person who experienced the abuse – 'she should have been using certain tools to prevent it' – rather than an acknowledgment that it should have been prevented from happening in the first place. And, finally, most importantly, the description of a woman being abused online as 'good feedback'.

Much subsequent discourse has focused on the question of whether or not a sexual assault or rape carried out in virtual reality should be described as such; whether it might have an impact on the victims similar to a real-life assault. But this misses the point. First, it is worth noting that the experience of being sexually harassed, assaulted or raped in the metaverse has had a profound and distressing impact on many victims. When it was revealed in 2024 that British police were investigating the virtual gang-rape of a girl below the age of 16 in the metaverse, a senior officer familiar with the case told the media: 'This child experienced psychological trauma similar to that of someone who has been physically raped'.

Second, technology to make the metaverse feel physically real is developing at pace. You can already buy full-body suits that promise to 'enhance your VR experience with elaborate haptic sensations'. They have sleeves, gloves and vests with dozens of different feedback points. Wearable haptic technology will bring the experience of being virtually assaulted much closer to the physical sensation of real-life victimisation. All the more reason to tackle it now, regardless of how 'realistic' it is or isn't, rather than waiting for things to get worse.

But most importantly, regardless of how similar to or different from physical offline harms these forms of abuse are, what matters is that they are abusive, distressing, intimidating, degrading and offensive and that they negatively affect victims. And, as we have already seen with social media, the proliferation of such abuse will prevent women and girls from being able to fully use and benefit from new forms of technology. If Zuckerberg's vision comes to fruition and the boardrooms, classrooms, operating theatres, lecture halls and meeting spaces of tomorrow exist in virtual reality, then closing those spaces off from women, girls and other marginalised groups, because of the tolerance of various forms of prejudice and abuse in the metaverse, will be devastating. If we allow this now, when the metaverse is (relatively speaking) in its infancy, we are baking inequality into the building blocks of this new world.

At the time of the aforementioned virtual-reality rape of an underage girl, Meta said in a statement: 'The kind of behaviour described has no place on our platform, which is why for all users we have an automatic protection called personal boundary, which keeps people you don't know a few feet away from you.' In another incident, when a researcher experienced a virtual assault, Meta's comment to the press was: 'We want everyone using our services to have a good experience and easily find the tools that can help prevent situations like these and so we can investigate and take action.' The focus always seems to be on users finding and switching on tools to prevent harassment or reporting abuse when it does happen. It is not on preventing abuse and taking serious action against abusers.

But in the CCDH research that identified 100 potential violations of Meta's VR policies, just 51 of the incidents could be reported to Meta using a web form created by the platform for this purpose, because the platform refuses to examine policy violations if it cannot match them to a predefined category or username in its database. Worse, not one of those 51 reports of policy violation (including sexual harassment and grooming of minors) was acknowledged by Meta and as a result no action was taken. It's not much good pointing to your complaints system as the solution to abuse if you don't respond to complaints.

Meta's safety features will no doubt continue to evolve and adapt – but, once again, in a repeat of what we have already seen happen on social media, women and girls will be the canaries in the coalmines, their abuse and suffering providing companies with useful data points with which to tweak their products and increase their profits. Teenage girls' trauma: a convenient building material.

There is something incredibly depressing about all this. If we are really talking about reinventing the world here, couldn't we push the boat out a little? Couldn't we dare to dream of a virtual world in which those who so often face abuse are safe by design – with the prevention and eradication of abuse built in – instead of being tasked with the responsibility of protecting themselves when the abuse inevitably arises?

None of this is whining or asking too much. Don't be fooled into thinking that we are all lucky to be using Meta's tools for nothing. We are paying for them in the tracking and harvesting of our data, our content, our photographs, our ideas and, as the metaverse develops, our hand and even eye movements. All of it can be scraped and used to train enormously powerful AI tools and predictive behavioural algorithms, access to which can then be sold to companies at gargantuan prices to help them forecast how we as consumers behave. It is not an exaggeration to say that we already pay Meta a very high price for using its platforms. And if the metaverse really does become as widely adopted and as ubiquitous in the fundamental operation of our day-to-day lives as Zuckerberg hopes, there won't be an easy way to opt out.

We can't let tech companies off the hook because they claim the problem is too big or too unwieldy to tackle. We wouldn't accept similar excuses for dodging regulation from international food companies, or real-life venues. And the government should be prepared to act in similar ways here, introducing regulation to require proved safety standards at the design stage, before products are rolled out to the public.

'Hold on, just building the future here,' Horizon Worlds tells me as I wait to access the metaverse. As we battle to eradicate the endemic harassment and abuse that women and girls face in real-world settings, the metaverse presents a risk of slipping backwards. We are sleepwalking into virtual spaces where men's entitlement to women's bodies is once again widespread and normalised with near total impunity.

The Guardian invited Meta to reply to this article, but the company did not respond.

The New Age of Sexism: How the AI Revolution Is Reinventing Misogyny by Laura Bates is published by Simon & Schuster (£20). To support the Guardian, buy a copy at Delivery charges may apply

Legal aid lawyers face chaos following cyber attack, says representative body

The Independent

Legal aid lawyers are facing 'administrative chaos' in the aftermath of a cyber attack amid fears more providers will leave the sector, a representative body has said.

The Legal Aid Agency's (LAA) digital services, which are used by legal aid providers to log their work and get paid by the Government, were taken offline following the data breach in April. Payments for the publicly funded work initially stopped and legal aid applications are not able to be formally granted at the moment. This has left legal aid law firms, often small businesses, to decide whether to take on the risk of cases and hope they will be approved and paid retrospectively.

Chris Minnoch, chief executive of Legal Aid Practitioners Group (LAPG), said that lawyers have called the representative body 'in tears' having 'sleepless nights' waiting on news and payments coming through from legal aid because of the cyber attack.

'They've been on the verge of collapse because they hadn't had payments for one or two weeks,' he told the PA news agency.

'What a cyber attack like this brings to the front of your mind is that the legal aid scheme is teetering on the precipice of collapse.

'If it goes and if lawyers just say, 'I can't do this anymore', it has profound consequences for almost every aspect of society, because we are talking about the law of everyday life.'

Those eligible to apply for legal aid include domestic violence and modern slavery victims, people involved in care proceedings or at risk of homelessness, as well as people accused of criminal offences.

The Ministry of Justice (MoJ) previously said a 'significant amount of personal data' of people who applied to the LAA since 2010, including criminal records, was accessed and downloaded in the cyber attack. The Government became aware of the hack on April 23, but realised on May 16 that it was more extensive than originally thought, and took the system offline.

Mr Minnoch said it was 'unforgivable' for an organisation of that size to not have a business continuity plan that anticipated this sort of fallout.

'You'll have a number of providers that, by the end of this digital disaster we're dealing with, won't be there anymore because they just couldn't afford to, and a number of others that just say, 'I just I can't take this anymore',' he said.

'Every time there is some sort of crisis that befalls the legal aid scheme, you end up with fewer lawyers willing to do it at the end of it.'

An MoJ source had put the breach down to the 'neglect and mismanagement' of the previous government, saying vulnerabilities in the LAA's systems have been known for many years. The attack happened as the MoJ was working on replacing the internal system with a new version hoped to be up and running in the coming weeks.

At the end of May, the LAA brought in a scheme for civil legal aid providers to be able to receive payments based on their average billing until the system is back online, when lawyers will then submit specific bills and applications. On June 4, payments were sent to more than 1,700 solicitors and barristers who took up the offer. Payments for criminal legal aid cases separately have also resumed, the MoJ said.

Mr Minnoch added that while the civil legal aid payments should have been done two weeks earlier, they were an important step forward, effectively loaning providers cash while they cannot claim money owed to them.

'I think one of the ways they need to mitigate this is by being as flexible as possible with providers, once the situation is restored to some sort of business as usual, because they're going to be punch drunk by the end of this, they're already punch drunk,' he said.

There are also concerns the move will create a backlog of providers trying to push through all their cases that need to be claimed in a short period of time when the system is back up and running, to effectively gain the income to repay the loan as quickly as possible.

Jenny Beck KC, co-chair of the LAPG, said: 'There's administrative chaos in an already beleaguered and fragile supplier base.

'Nobody does it because it's a very sensible business proposition. People do it because they genuinely care about vulnerable people and they want to help them.'

Ms Beck, who runs a family law practice, said her firm's workload is 60% legal aid work helping extremely vulnerable people such as domestic abuse survivors who need protection orders. She told PA a 'good proportion' of clients are reliant on legal aid to access their rights which is a 'massive access to justice issue'.

Of the legal aid work, Ms Beck, who is also a member of the Law Society's access to justice committee, added: 'We'll have to supplement it with private income sources, because it's anxiety provoking to be working at risk on cases that we haven't an absolute guarantee that we're going to be paid.'

'I am going to continue to work at risk because I cannot leave my clients vulnerable.

'But I know that many other firms are very concerned about taking that risk.'

Nicola Jones-King, Law Society council member for legal aid, said that while taking the risk for certain types of cases is 'okay' when it is known legal aid will be granted, there is more concern for means and merit tested applications for legal aid such as in housing or cases against police action.

'If it's a slightly more challenging or unusual case, or it's a case where you need to assess merits and means, that's more problematic obviously for the solicitor to take that on that risk,' she said.

The risk also applies in cases that already have legal aid granted, but then need further approval to continue the funding at different stages in the proceedings.

Stuart Nolan, managing director at DPP Law Ltd and chairman of the Law Society's criminal law committee, said his firm's criminal law department is made up of about 95% of legal aid cases and could sustain the current situation 'at a cost' for a couple of months before feeling the pinch.

'The impact is so far reaching, clogged criminal justice system, this is prospect of more delays,' he said. 'The quicker it's sorted out, the better for everyone.'

A Ministry of Justice spokesperson said: 'We understand the challenges this situation presents for legal aid providers – we are working as fast as possible to restore our online systems and have put in place contingencies to allow legal aid work to continue safely with confidence.

'These measures include setting up an average payment scheme for civil legal aid cases, resuming payments on criminal legal aid cases, putting in place processes for urgent civil application approvals and confirming that criminal applications made in this time will be backdated.'
