
Dramatic CCTV footage shows synagogue arson in Melbourne
Counter-terrorism authorities have released dramatic CCTV footage and images of suspects believed to be responsible for the politically motivated arson attack on the Adass Israel Synagogue in Melbourne late last year.
The Joint Counter Terrorism Team (JCTT) is appealing for public assistance to identify the individuals involved or anyone connected to a blue 2020 VW Golf sedan believed to have been used in the attack.
The footage, captured on December 6, 2024, shows the blue Golf driving past the Elsternwick synagogue multiple times before parking outside the main entrance.
Three masked individuals wearing hoodies are then seen exiting the vehicle, using an axe to break into the synagogue.
At least two of the offenders are captured on CCTV pouring the contents of red jerry cans inside the entrance and returning to the car to retrieve more.
The flammable liquid was then ignited, and the group fled in the Golf, heading towards Melbourne's western suburbs.
The ultra-Orthodox temple was firebombed and significantly damaged at about 4am on Friday, December 6.
Two worshippers were in the temple at the time.
Millions of dollars of holy texts, handwritten Torah scrolls, artefacts and furniture were destroyed or badly damaged.
The vehicle, which was stolen and bore cloned number plates, has since been recovered by police.
Investigators say it was also used in other serious crimes across the city, including the Lux nightclub arson in South Yarra in November 2024, and a shooting and arson in Bundoora on the same night as the synagogue attack.
While the other incidents are being investigated by Victoria Police and are not considered politically motivated, the synagogue fire remains a terrorism matter.
AFP Assistant Commissioner for Counter Terrorism and Special Investigations Command Stephen Nutt said officers had examined CCTV from more than 1400 locations to track the alleged offenders.
'We believe there are multiple offenders directly and indirectly linked to the synagogue arson, and our terrorism investigation into their actions continues,' he said.
'I remind those involved that the penalty for terrorism is life imprisonment. It is just a matter of time before police knock on your door. It is in your interest to come forward now.
'This is no normal crime and that is why it is being investigated by the JCTT with the full force and capability of Victoria Police, the AFP and ASIO. Do not stay silent, come forward.
'Based on the information we have, we suspect some of the individuals involved are extremely violent.'
Assistant Commissioner Nutt said investigators were pursuing all leads and thanked both the Jewish and wider Victorian communities for their support and assistance.
Victoria Police Assistant Commissioner for Counter Terrorism Command Tess Walsh said the investigation had been a top priority over the past five months.
'This was an attack that impacted Victorians' feeling of safety and left people feeling deeply shocked, saddened and rightly concerned,' she said.
She said both the JCTT and specialist detectives, including the Arson and Explosives Squad, had been working to uncover not only who carried out the attack but who planned it and why.
'Today has provided us with a breakthrough in the Lux nightclub arson investigation and we are now in a position to publicly confirm that we believe the vehicle used in that incident is the same as the synagogue fire, as well as many other incidents of significant criminality such as shootings and aggravated burglaries across Melbourne,' she said.
Assistant Commissioner Walsh said the blue VW Golf sedan was a crucial link in the investigation and urged the public to help identify who had used it and what they were involved in.
'We need assistance from the public as we attempt to identify those who have been using this vehicle and what they have been involved in, and we know there are people out there who can supply this information. Any small detail could be crucial.'
She said it was time for people with information to come forward and warned of the potentially deadly consequences of such reckless behaviour.
'We have said many times before; it is only luck that stands between a fire that damages a property and a fire that kills dozens of people. Fire is absolutely uncontrollable, and the sheer recklessness of this offending cannot be tolerated.'
She reassured the Jewish community that police remained committed to tracking down those responsible.
'I understand that it can be difficult when these matters take time, but again, make no mistake that this investigation remains a key priority for Victoria Police and the AFP and we are dedicated to bringing it to a successful conclusion as soon as possible.'
The public is urged to contact Crime Stoppers on 1800 333 000 or the National Security Hotline on 1800 123 400 if they recognise anyone in the CCTV footage or have information about the vehicle or the individuals involved. Reports can also be made confidentially online.

Related Articles


7NEWS
2 hours ago
Aussie traveller faces decade in jail over find in Esky en route to Christmas Island
Methamphetamine has been found stuffed into a bag of raw mince stopped at the border.

The drugs were in an Esky full of meat which arrived at Christmas Island Airport by freight late last year. Three men now face drug trafficking charges over the discovery, and could spend a collective 30 years in jail if found guilty.

The 28g bag of meth, sealed in a snap-lock bag and then hidden in a freezer bag full of mince, was tested to confirm the substance and flagged with the Australian Federal Police on November 15.

A West Australian man, 50, will face court on Monday over the find. It comes after two Christmas Island men, aged 24 and 30, were charged in February over the Esky discovery. Each of those men faces one count of attempting to traffic a controlled drug, namely methamphetamine. They remain before the courts.

'Police allege the Christmas Island locals paid the WA man to purchase and send them the drugs,' AFP said.

The 50-year-old WA man was stopped at Perth Airport as he returned home from regional WA on May 9, and search warrants were executed both there and later at his Cannington home.

'A mobile phone, clothing and other items were seized,' police said.

The man was charged with one count of trafficking a controlled drug, namely methamphetamine. Both he and the Christmas Island locals each face a decade behind bars, the maximum penalty if found guilty of their charges.

AFP Inspector Dave Reis said methamphetamine was 'a devastating drug that causes immeasurable harm to users and their families, especially in smaller communities'.

'The criminals who push these substances are driven by greed and profit and have zero regard for the lives they ruin.'

ABF Superintendent Shaun Senior added that the border force was vigilant at air freight facilities at airports, 'regardless of the size of the facility or how the packages are concealed'.

The Age
8 hours ago
Police searched a man's laptop for malware. What they found is becoming all too common
When police searched the computer of 29-year-old IT worker Aaron Pennesi in March, they were looking for the malware he used to steal personal information from his colleagues at The Forest High School on Sydney's northern beaches. That wasn't all they found.

In an all-too-common turn of events, police stumbled upon child sexual abuse material on a laptop seized for another reason. But something was different about this content. The scenes depicted weren't real.

Instead, Pennesi had used a popular AI-generation website to create the child abuse material using search prompts that are too grotesque to publish.

In an even more severe case, a Melbourne man was sentenced to 13 months in prison in July last year for offences including using an artificial-intelligence program to produce child abuse images. Police found the man had used an AI image-generation program and inputted text and images to create 793 realistic images.

As cases involving the commercial generation of AI child abuse material that is completely original and sometimes indistinguishable from the real thing become increasingly common, one expert says the phenomenon has opened a 'vortex of doom' in law enforcement's efforts to stamp out the content online.

Naive misconceptions

As the tug of war over the future of AI oscillates in the court of public opinion, one of the more terrifying realities that suggests it could do more harm than good is the ease with which it enables offenders to produce and possess child sexual abuse material.

The widespread adoption of image-generation models has been a boon for paedophiles seeking to access or profit from the content online. Interpol's immediate past director of cybercrime, Craig Jones, says the use of AI in child sexual abuse material online has 'skyrocketed' in the past 12 to 18 months.

'Anybody is able to use an online tool [to access child sexual abuse content], and with the advent of AI, those tools are a lot stronger.
It allows offenders to do more,' Jones said.

The AFP-led Australian Centre to Counter Child Exploitation, or ACCCE, received 63,547 reports of online child exploitation from July 2024 to April 2025. That's a 30 per cent increase on the previous financial year, with two months remaining.

'We're seeing quite a significant increase in what's occurring online,' AFP Acting Commander Ben Moses says, noting that those statistics don't differentiate between synthetic and real child abuse content.

That's in line with the legal treatment of the issue; possessing or creating the content in either form is punishable under the same offences. But a common misconception is that AI-generated material shouldn't be taken as seriously or is not as harmful as the traditional type because no child is abused in the creation of the material.

Moses says that while identifying real victims will always be the ACCCE's priority, AI-generated content is being weaponised against real children.

'It can still be very harmful and horrific. [It] can include the ability … to generate abuse in relation to people they know. For those victims, it has significant consequences.'

In 2024, a British man was jailed for 18 years for turning photographs of real children, some younger than 13, into images to sell to other paedophiles online. The sentencing judge called the images 'chilling'.

In another British example, a BBC report in 2024 found evidence that an adults-only VR sex simulator game was being used to create child models for use in explicit sex scenes, and that models had been based on photos taken of real girls in public places.

'The other aspect of it, and what may not be well known, is cases where innocent images of children have been edited to appear sexually explicit, and those photos are then used to blackmail children into providing other intimate content,' Moses says.
Moses says this new 'abhorrent' form of sextortion, and how it opens up new ways for offenders to victimise minors, is of great concern to the ACCCE.

Professor Michael Salter, the director of Childlight UNSW, the Australasian branch of the Global Child Safety Institute, calls the misconception that AI-generated abuse material is less harmful 'really naive'.

'The forensic evidence says that it is a serious risk to children.'

Salter says the demand for synthetic material primarily comes from serious offenders and that, generally, they also possess actual child sexual abuse content.

'It's also important to understand that a lot of the material that they're creating is extremely egregious because they can create whatever they want,' he said. 'The sort of material they're creating is extremely violent, it's extremely sadistic, and it can include imagery of actual children they want to abuse.'

Tech-savvy paedophiles

AI child sexual abuse material first crossed Salter's desk around five years ago. In that time, he's witnessed how offenders adapt to new technologies. As AI advanced, so did the opportunities for paedophiles.

He explains that AI was first used to sharpen older material and later to create new images of existing victims. It has now proliferated into offenders training their own engines or using commercially available image-generation sites to create brand-new material. This can include deepfake videos featuring real people.

But Salter says what is more common is still-image generation that is frighteningly readily available.
'We have commercial image generation sites that you can go to right now, and you don't even have to look for child sexual abuse material because the generation of [it] is so popular that these sites often have trending pages, and I've seen sections where the keyword is 'pre-teen', or 'tween', or 'very young'.'

In a 2024 report, the Internet Watch Foundation (IWF) found a 380 per cent increase in reported cases of AI-generated child sexual abuse content online, noting that the material was becoming 'significantly more realistic' and that perpetrators were finding 'more success generating complex 'hardcore' scenarios' involving penetrative sexual activity, bestiality or sadism.

'One user shared an anonymous webpage containing links to fine-tuned models for 128 different named victims of child sexual abuse,' the IWF's July 2024 report on AI child sexual abuse material found.

The IWF found evidence that AI models that depict known child abuse victims and famous children were being created and shared online. In some of the most perverse cases, this could include the re-victimisation of 'popular' real-life child abuse victims, with AI models allowing perpetrators to generate new images of an abused minor.

The report acknowledged that the usage of these fine-tuned models, known as LoRAs, was likely to go much deeper than the IWF could assess, thanks to end-to-end encrypted peer-to-peer networks that were essentially inaccessible.

Moreover, Australia's eSafety Commission warns that child sexual abuse material produced by AI is 'highly scalable'. '[It requires] little effort to reproduce en masse once a model is capable of generating illegal imagery,' a spokesperson said.

Commercial interests

The rapid escalation of the amount of content available online is partially attributed to how AI has enabled the commercialisation of child sexual abuse material.
'Offenders who are quite adept at creating material are essentially taking orders to produce content, and this material is increasingly realistic,' Salter says.

Jones says that in the span of his career, he's seen the provision of child sexual abuse content go from physical photocopies being shared in small groups to it being available online in a couple of clicks.

'Unfortunately, there is a particular audience for child sexual abuse material, and what AI can do is generate that content, so [commercialisation] is serving a demand that is out there.'

In one of the biggest stings involving an AI child abuse enterprise, Danish police, in conjunction with Europol, uncovered a subscription service that commercialised access to the content. The global operation saw two Australian men charged, and 23 others apprehended around the world.

'There were over 237 subscribers to that one matter,' Moses says of Operation Cumberland. 'When we talk about proliferation and people profiting from this type of activity, this is of great concern to us.'

Swamped by the growing sea of content, officers now face the difficulty of identifying which situations depict real children being abused, as opposed to an AI-generated child who doesn't exist.

'It also means that police have to spend quite a lot of time looking at material to determine whether it's real or not, which is quite a serious trauma risk for police as well,' Salter says.

Moses from the ACCCE agrees that it's 'very difficult work' for officers. 'Whilst it is very confronting material, it doesn't compare to the trauma that child victims endure, and there's very much a focus on identifying victims.'

The influx of AI-generated content has complicated its mission in many ways, Moses says, including by robbing crucial resources from the ACCCE's primary goal of rescuing children who are being abused.
'It takes a lot of time to identify real victims, and the concern for us … is that the [AI-generated content] is becoming increasingly harder [to detect], and it takes time away from our people who are trying to identify real victims.'

Law enforcement 'overwhelmed'

While prosecutions for offences involving fake abuse material have increased, the rate hasn't kept up with the pace of the increase in the amount of content found online. Salter says resourcing is one of the biggest challenges facing law enforcement.

'Law enforcement is so overwhelmed with really egregious online sexual exploitation cases … their primary priority is to prevent and rescue the abuse of actual kids.'

He says it's a struggle he's heard across all jurisdictions. 'They're really struggling in terms of people power, in terms of access to the technology that they need to conduct these investigations and to store that amount of material,' Salter says. 'There needs to be a huge uplift right across the law enforcement space.'

Additionally, AI-generated child sexual abuse content requires a complete reset of the way the content is detected. Older machine methods of detecting the content online involved scraping for verified abuse content, meaning material had to have already been assessed by a human as illegal before it could be detected.

'The obvious challenge we see with AI-generated material is that it's all new, and so it's very unlikely, through current detection technologies, that we can proactively screen it,' Salter says.

Unregulated threat let loose

It's a global issue that crosses jurisdictions and exists on the internet's severely under-regulated new frontier. But that hasn't deterred Australia's eSafety commissioner, Julie Inman Grant, from introducing world-first industry standards to hold tech companies to account for the content they platform.
The standards came into force in December 2024 and require storage services such as Apple's iCloud and Google Drive, messaging services, and online marketplaces that offer generative AI models to prevent their products from being misused to store or distribute child sexual abuse material and pro-terror content.

'We have engaged with both AI purveyors and the platforms and libraries that host them to ensure they are aware of their obligations under the standards,' an eSafety commission spokesperson said. 'We believe the standards are a significant step in regulating unlawful and seriously harmful content and align with our broader efforts to ensure that AI tools, such as those used to create deepfakes, are held to the highest safety standards.'

The recent passage of the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 also expanded the available criminal offences relating to non-consensual, sexually explicit AI-generated material.

While international companies can face multimillion-dollar penalties for breaches of the eSafety Commission's standards in Australia, major tech players such as Meta are increasingly adopting end-to-end encryption, which means even the companies themselves can't see what content they're hosting, let alone law enforcement.

Interpol works at the forefront of these issues, often acting as a bridge between authorities and the private sector. Jones observes that while interventions such as Australia's new standards play an important role in setting high standards for tech companies, encryption and other privacy policies make it 'very hard for law enforcement to get those data sets'.

International co-operation is crucial for successfully prosecuting commercial child sexual abuse content cases, and Jones says that in best practice examples, when a global chain is identified, the tech industry is brought in as part of the investigation.

'I'm seeing more of an involvement in the tech sector around supporting law enforcement. But that's sometimes at odds with encryption and things like that,' Jones says.

'I think the tech industry has a duty of care to the communities that they serve. So I don't think it's good enough to say, 'Oh, well, it's encrypted. We don't know what's there.''

Salter takes a more pessimistic view of the tech industry's actions, arguing that most companies are moving away from, not towards, proactively monitoring the presence of child sexual abuse content.

'The emergence of AI has been something of a vortex of doom in the online child protection space,' Salter says. Online child protection efforts were already overwhelmed, he says, before the tech sector 'created a new threat to children' and 'released [it] into the wild with no child protection safeguards'.

'And that's very typical behaviour.'


West Australian
12 hours ago
King's Birthday Honours: Robin Cohen recognised for service to Perth's Jewish community
Many people are completely unaware of the work done behind the scenes by Perth's Community Services Group, which is dedicated to protecting the Jewish community. And director Robin Cohen would rather keep it that way, even though it is his efforts to nurture the CSG for the past 30 years which have thrust it into the spotlight.

Mr Cohen has been recognised on this year's King's Birthday Honours list with a Medal of the Order of Australia for his service to the Jewish community of Perth.

News of his nomination came as such a surprise that he almost deleted the email, assuming it was spam. 'I was quite surprised, stunned when I got the email from the Governor-General's department,' he said.

The CSG provides a security service to members of the Jewish community in Perth, including security guards at venues and functions. While it was unfortunate the community had to resort to setting up its own security systems, Mr Cohen said it provided a measure of comfort.

'We know our community, and we have a vested interest in looking after it,' he said. 'And that's not taking anything away from WA Police, who we have an awesome working relationship with.'

Over the years, the CSG has become more than just a security organisation. During COVID-19 lockdowns it organised volunteers to help older people who could not leave their homes by delivering packages or doing their shopping.

Mr Cohen said demand for its security services had leapt in recent months, coinciding with a massive surge in anti-Semitic incidents. Figures from the latest report by the Executive Council of Australian Jewry show that WA reported 116 instances of anti-Semitism between October 1, 2023, and September 30, 2024, up from just 25 the previous year.

Perth incidents reported during that time included the physical assault and verbal abuse of an identifiably Jewish boy who was called a 'dirty Jew' and slapped in the face by a group of youths in Osborne Park.
Other instances included verbal abuse and graffiti of swastikas and offensive statements.

Mr Cohen said that since he first joined the CSG as a volunteer in 1992, before taking over its operations in 1996, it had grown from a small group to a large, professional operation. 'It constantly amazes me how it's developed,' he said.

While the group has a number of paid employees as well as volunteers, Mr Cohen's time is 100 per cent donated, on top of his day job as a finance broker. He has also served as a member of the Jewish Community Council of WA and chaired the Perth Jewish Community Critical Incident Response Initiative.

'It's just the way I was brought up,' he said of his voluntary service. 'If it wasn't this it would have been something else — I'm very community-focused.'