Latest news with #InternationalCentreforMissingandExploitedChildren


The Advertiser
2 days ago
- Politics
- The Advertiser
'Why do we need that?': Push to ban AI nudity apps
Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures.

Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws.

"Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday.

One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study.

Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers.

"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday.

The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue.

There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying.

MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare.
"When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly.

"My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly."

International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm.

"Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said.

Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation.

"Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP.

- Lifeline 13 11 14
- Kids Helpline 1800 55 1800 (for people aged 5 to 25)
- 1800 RESPECT (1800 737 732)
- National Sexual Abuse and Redress Support Service 1800 211 028


Perth Now
2 days ago
- Politics
- Perth Now
'Why do we need that?': Push to ban AI nudity apps
Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures.

Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws.

"Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday.

One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study.

Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers.

"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday.

The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue.

There was a consensus that AI was being weaponised to harm children, from creating deepfakes - which digitally manipulate images and video to superimpose someone's face or voice - to generating child abuse material, creating the potential for exploitation, blackmail and bullying.

MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare.

"When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly.

"My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly."

International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm.

"Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said.

Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation.

"Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP.

- Lifeline 13 11 14
- Kids Helpline 1800 55 1800 (for people aged 5 to 25)
- 1800 RESPECT (1800 737 732)
- National Sexual Abuse and Redress Support Service 1800 211 028


The Advertiser
17-07-2025
- The Advertiser
AI turbocharges child abuse as image-creation made easy
Australia needs to re-examine how it tackles child sexual exploitation as experts warn rapid development in artificial intelligence is widening gaps exploited by perpetrators.

International Centre for Missing and Exploited Children chief executive Colm Gannon said the organisation had received a 1325 per cent spike in reports involving AI-generated child sexual abuse material in a year. The centre received more than 67,000 reports on the matter in 2024.

Experts and government officials convened at Parliament House for a round table to address the increasing use of AI in the sexual exploitation of children. Child safety advocates called for the explicit criminalisation of the use and possession of software designed to generate child sexual exploitation material.

"I have been involved in investigations where there is active trading and profiteering from using these models, it's a pay-as-you-use design that's happening within child sexual offender communities," Mr Gannon told reporters in Canberra on Thursday. "There is no social-positive reason why people are going to be in possession of this software except to generate child sexual abuse material."

A 10-year government plan released in 2021 to address child protection needed to be updated to capture new technology as it didn't mention AI and associated harms, he said.

Child abuse survivor Grace Tame said there needed to be a broader review into tackling child sexual exploitation, with a royal commission into institutional child sexual abuse more than a decade ago failing to examine key issues.

"It was very specifically focused on institutional child sexual abuse and the responses of institutions," the former Australian of the Year said. "Incest accounts for the overwhelming majority of all child sexual abuse.

"A lot of this is taking place in the home, a lot of the online content that we're seeing is often filmed by parents and distributed by parents and there's no institution involved in that."
Jon Rouse, who worked in law enforcement for nearly four decades and tackled online child exploitation material, called for authorities to be given greater resources and new tools to quickly identify victims and combat the crime.

"The tragedy about that is that if we don't find them quickly, they get buried in a landslide of new content," he said of child abuse content.

Mr Rouse also demanded risk assessments for new technology as social media algorithms pushed users toward disturbing and harmful content.

"The tragedy is we're at a point now where we're having to ban our kids from social media, because we can't rely on any sector of the industry to protect our kids, which is pretty sad," he said.

One social media app kept suggesting AI-generated content of scantily clad mothers with young children, he said, showing reporters a series of photos. "They're not sexually explicit but they are telling you something about the people that created them," Mr Rouse said.

There also needed to be community-wide education on how to spot problem behaviours and precipitating actions from offenders, Ms Tame said. "We've been talking about early childhood education - these kids are pre-verbal, so they're even more vulnerable," she said.

- Lifeline 13 11 14
- Kids Helpline 1800 55 1800 (for people aged 5 to 25)
- 1800 RESPECT (1800 737 732)
- National Sexual Abuse and Redress Support Service 1800 211 028


7NEWS
17-07-2025
- 7NEWS
Child abuse images a click away as experts warn predators are using AI
Australia needs to re-examine how it tackles child sexual exploitation as experts warn rapid development in artificial intelligence is widening gaps exploited by perpetrators. International Centre for Missing and Exploited Children chief executive Colm Gannon said the organisation had received a 1325 per cent spike in reports involving AI-generated child sexual abuse material in a year. The centre received more than 67,000 reports on the matter in 2024. Experts and government officials convened at Parliament House for a round table to address the increasing use of AI in the sexual exploitation of children. Child safety advocates called for the explicit criminalisation of the use and possession of software designed to generate child sexual exploitation material. 'I have been involved in investigations where there is active trading and profiteering from using these models, it's a pay-as-you-use design that's happening within child sexual offender communities,' Gannon told reporters in Canberra on Thursday. 'There is no social-positive reason why people are going to be in possession of this software except to generate child sexual abuse material.' A 10-year government plan released in 2021 to address child protection needed to be updated to capture new technology as it didn't mention AI and associated harms, he said. Child abuse survivor Grace Tame said there needed to be a broader review into tackling child sexual exploitation with a royal commission into institutional child sexual abuse more than a decade ago failing to examine key issues. 'It was very specifically focused on institutional child sexual abuse and the responses of institutions,' the former Australian of the Year said. 'Incest accounts for the overwhelming majority of all child sexual abuse. 'A lot of this is taking place in home, a lot of the online content that we're seeing is often filmed by parents and distributed by parents and there's no institution involved in that.' 
Jon Rouse, who worked in law enforcement for nearly four decades tackling online child exploitation material, called for authorities to be given greater resources and new tools to quickly identify victims and combat the crime.

'The tragedy about that is that if we don't find them quickly, they get buried in a landslide of new content,' he said of child abuse material.

Rouse also called for risk assessments of new technology, saying social media algorithms pushed users toward disturbing and harmful content.

'The tragedy is we're at a point now where we're having to ban our kids from social media, because we can't rely on any sector of the industry to protect our kids, which is pretty sad,' he said.

One social media app kept suggesting AI-generated content of scantily clad mothers with young children, he said, showing reporters a series of photos.

'They're not sexually explicit but they are telling you something about the people that created them,' Rouse said.

There also needed to be community-wide education on how to spot problem behaviours and precipitating actions from offenders, Tame said.

'We've been talking about early childhood education - these kids are pre-verbal, so they're even more vulnerable,' she said.


Perth Now
17-07-2025
- Perth Now
AI turbocharges child abuse as image-creation made easy
Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028