Police investigate anti-Israel video threatening Lovitt Technologies
A disturbing video circulating on social media appears to threaten violence against Australians working at aerospace company Lovitt Technologies.
The unverified video shows a person speaking in a digitally altered voice and dressed completely in black.
The person calls the message 'an anonymous communique' from the 'cell' that torched and vandalised three cars on the Lovitt lot in Melbourne on July 5.
CCTV of the incident shows five hooded offenders entering the business just before 4am and setting fire to the cars.
In the video, the person says the vandalism was not 'an accident'.
'This is a clear and serious threat,' the person says.
'If you continue making weapons or components of any kind there will be consequences. Consider this a warning.'
The person then reveals anti-Israel, anti-American and anti-Australian sentiment underpins the group's actions.
'After 21 months of an accelerated genocide against the Palestinian people by the illegitimate Zionist entity, eight decades of American warmongering and imperialism, 2½ centuries of the most violent colonial oppression, ethnic cleansing and murder of Aboriginal peoples across so-called Australia, Lovitt Technologies has chosen its place at the intersection of these catastrophes,' the person says.
The group is targeting Lovitt because it supplies components to defence and aerospace companies Lockheed Martin, BAE and Boeing, the person says.
The person then threatens violence against the company's workers and suggests the group has been 'watching' Lovitt's employees and collecting personal information on them.
'Every worker in this supply chain is complicit,' the person says.
'You have had years to contemplate the consequences of your actions. We will decide your fate as you have decided the fate of millions.
'For the past few months, we have been closely watching you. We have your addresses. All the information we have about you will be distributed to our underground networks. Stop arming Israel or else.'
The person ends the video by saying 'every colony will burn'.
'Death to Israel. Death to Australia. Death to America. From the river to the sea, Palestine will be free,' they say.
Victoria Police has confirmed it is investigating the July 5 attack and now the video.
'The matter is now being investigated by the Victorian Joint Counter Terrorism Team, which includes personnel from Victoria Police, the Australian Federal Police and the Australian Security Intelligence Organisation,' a Victoria Police spokeswoman told NewsWire on Monday.
'Investigators are aware of a video which has been circulating where a group has claimed responsibility for the incident.
'This video is being reviewed as part of the ongoing investigation.
'Police have already released CCTV of five people they would like to speak to in relation to the incident. Each person was dressed in black hooded jumpers, backpacks and gloves.'
Police said there were as yet no established links between the July 5 act of vandalism at Lovitt and other criminal acts that hit Melbourne over that weekend, including the arson attack on the East Melbourne Hebrew Congregation synagogue.
Executive Council of Australian Jewry co-chief executive Alex Ryvchin said the group's video and message resembled 'an al-Qaeda terror cell'.
'It doesn't matter that they think they're doing something just and righteous – Islamist terrorists and neo-Nazis think that too,' he told NewsWire.
'What matters is that we remain a country of laws and not allow bands of zealots to decide what is a legitimate target for violence and criminal acts.
'Today it is a business they oppose and tomorrow it will be individuals, politicians, journalists or religious institutions they deem impure.
'We expect this incident to be investigated and for those responsible to be met with the law.'
NewsWire contacted Lovitt, but the company declined to comment.
Originally published as Video shows underground anti-Israel group threatening violence against Lovitt Technologies
Related Articles

9 News
Why you should never remove chalk marks from your parked car
A criminal lawyer is warning Australians about the potential consequences of removing chalk marks from their tyres before a parking inspection is complete.

A recent TikTok video depicting an individual removing chalk marks from multiple car tyres has gone viral and sparked online conversation. But Avinash Singh, from Astor Legal, has warned of the potential consequences of the act.

"Removing chalk off a tyre could be seen as attempting to pervert the course of justice. This is because the removal of chalk would hinder a parking ranger from carrying out their duties and prevent them from issuing a fine," he said.

While Singh criticised the video, the comment section was flooded with people showing support for the stunt.

In NSW, section 319 of the Crimes Act 1900 makes it an offence to do any act, or make any omission, intending in any way to pervert the course of justice.

South Australia is the only state with a specific law addressing the practice, introduced after it became widespread. Section 174AB of the Road Traffic Act 1961 makes it an offence to remove a parking inspector's chalk from a vehicle, carrying a maximum fine of $750.

"If a driver finds that a fine has been issued, they can contest the fine and ask for evidence that they were timed correctly. This is usually in the form of timestamped photos that a parking ranger has taken," Singh said.


The Advertiser
'Why do we need that?': Push to ban AI nudity apps
Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures.

Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws.

"Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday.

One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study.

Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers.

"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday.

The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue.

There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying.

MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare.
"When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly. "My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly." International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm. "Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said. Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation. "Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP. Lifeline 13 11 14 Kids Helpline 1800 55 1800 (for people aged 5 to 25) 1800 RESPECT (1800 737 732) National Sexual Abuse and Redress Support Service 1800 211 028 Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures. Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws. "Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday. 
One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study. Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers. "This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday. The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue. There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying. MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare. "When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly. "My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly." 
International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm. "Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said. Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation. "Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP. Lifeline 13 11 14 Kids Helpline 1800 55 1800 (for people aged 5 to 25) 1800 RESPECT (1800 737 732) National Sexual Abuse and Redress Support Service 1800 211 028 Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures. Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws. "Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday. One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study. Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers. 
"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday. The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue. There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying. MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare. "When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly. "My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly." International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm. "Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said. Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation. 
"Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP. Lifeline 13 11 14 Kids Helpline 1800 55 1800 (for people aged 5 to 25) 1800 RESPECT (1800 737 732) National Sexual Abuse and Redress Support Service 1800 211 028 Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures. Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws. "Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday. One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study. Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers. "This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday. The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue. 
There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying. MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare. "When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly. "My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly." International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm. "Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said. Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation. "Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP. Lifeline 13 11 14 Kids Helpline 1800 55 1800 (for people aged 5 to 25) 1800 RESPECT (1800 737 732) National Sexual Abuse and Redress Support Service 1800 211 028

