
'Why do we need that?': Push to ban AI nudity apps
Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures.
Possessing nudify apps - digital platforms that let users upload a person's photos and use generative artificial intelligence to sexualise them - would become a criminal offence carrying up to 15 years in jail under proposed laws.
"Why do we need that in an Australian community?" the International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday.
One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study.
Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence.
The proposed laws contain small carve-outs for law enforcement and researchers.
"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday.
The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action.
Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue.
There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying.
MP Zali Steggall, who seconded Ms Chaney's bill, branded the technology every parent's worst nightmare.
"When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said.
"We need these guardrails with urgency, we need the government to show it can act quickly.
"My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly."
International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm.
"Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said.
Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation.
"Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP.
Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028