Latest news with #Children'sCode
Yahoo
17-05-2025
- Politics
- Yahoo
Louisiana AG wants juvenile ‘change of venue' bill, Caddo DA and judges do not
SHREVEPORT, La. (KTAL/KMSS) – Caddo Parish District Attorney James E. Stewart, Sr., is asking lawmakers to reconsider a bill that would allow district attorneys' offices around the state to send juvenile proceedings to district court judges for trials and hearings. Senate Bill 74, sponsored by Senator Alan Seabaugh and heavily supported by the Louisiana Attorney General's Office, would allow a venue change if prosecutors believe that the juvenile courts cannot properly prosecute certain youth offenses. DA Stewart said that, in his opinion, the state is trying to legislate a chronic issue in New Orleans through an unnecessary statewide measure. In March, voters rejected a similar ballot initiative that called for expanding the offenses for which a juvenile could be charged as an adult. During a Senate Criminal Justice Committee hearing, Seabaugh was adamant that SB 74 is not the same legislation as the failed amendment, saying the bill is strictly about venue. 'This bill supposedly is going to change what we call 'venue'. And it would allow certain juvenile cases to be heard in district court using the Children's Code rules, but by district court judges. It's a solution to a problem that's not there,' Stewart said. Proponents of the bill say it provides more discretion over where a juvenile trial can occur, but the district attorney already has some discretion over 15-year-old offenders. 'We can transfer that jurisdiction from juvenile court to adult court so they are treated like adults,' Stewart explained. The list of transferable offenses includes rape, homicide, carjacking, armed robbery, and other violent felonies. SB 74 would add further offenses, including burglary and battery, to those eligible for transfer to district court. 
Seabaugh's assertion that the bill is strictly about venue may be true, but the game changes when youth offenders enter the court system. Changing the venue of juvenile hearings would change how the district court operates when the Louisiana Children's Code applies. Juvenile court cases are not public like adult criminal trials: no spectators or media are allowed, and only necessary parties may be in the courtroom. Youth prosecutions are also expedited, requiring law enforcement, crime labs, court dockets, and other involved stakeholders to move quickly without violating the code. 'As of right now, the district court judges don't have to use the Children's Code. They use the Code of Criminal Procedure, which is for adults. They would have to figure out the nuances of the Children's Code for the few times that you switch from one situation to the other,' Stewart said. The most significant difference between criminal district courts and juvenile courts is that the juvenile justice process is meant to be restorative, while criminal court is intended to be punitive. Another is the privacy of juvenile proceedings, which is primarily intended to protect young people who made poor decisions but can still turn their lives around through the interventions the juvenile system provides. 'What people really don't understand is that the support services, juvenile services, the Office of Juvenile Justice, truancy, Volunteers for Youth Justice, the drug court, and the mental health courts are all set up in juvenile court. And how are you going to use them in two different places?' Stewart said. 'There are just a lot of little, small things that make juvenile court work that we would have problems with if you attempted to move venue.' 
Deputy AG Larry Freiman said the bill is needed because the increase in violent offenses committed by juveniles is exhausting a system that was created to handle fights, family trouble, truancy, and other non-violent matters. The number of youths involved in shootouts, carjackings, robberies, and homicides is growing in a way the current system is ill-equipped to handle. 'If the DA feels that their juvenile judges are doing a great job, then they don't have to move it. But if they think a case warrants it, they can,' Freiman said during the hearing. 'You know, our CDC (criminal district court) is full of dockets. We're trying to move as many cases as fast as possible, and changing gears is not well thought of. Fortunately, in Caddo, we have a separate juvenile court that can deal with all juvenile matters, and they do a good job of dealing with it. So it's really not necessary to transfer those cases for venue purposes and create a whole hybrid system here when you have too much work already to be done.' On Monday, SB 74 was brought before the Senate Finance Committee, which explored the bill's cost, which Senator Seabaugh said was negligible at best. He testified that the bill would have no state general fund impact. He did note that some local areas may see a slight increase while others may see a decrease, 'But neither is determinable. The fiscal note would be zero or indeterminable.' A representative of the Baton Rouge Public Defenders' Office testified during the hearing, explaining to lawmakers that the bill would strain her office's personnel and finances. Her testimony sparked a discussion that ultimately led to the bill's deferment. Caddo Parish juvenile court judges testified that Caddo District Court is already 'begging for judges' and that the bill would require a separate section and probation staff. 
One retired Caddo juvenile judge explained that criminal district court judges know nothing about the Children's Code and that getting them up to speed on the differences would involve training costs. He also warned of 'the harm to come' while that real-time training is happening. Ultimately, Seabaugh voluntarily deferred the bill while the fiscal note is amended, which he said he doesn't anticipate will change much, despite testimony to the contrary. DA Stewart said he is still willing to work with lawmakers and other stakeholders on real solutions and interventions to keep young people from committing crimes, rather than knee-jerk reactions that produce legislation that will stress an already stressed system. 'I know some people are interested in it, but then you get sidetracked with bills like this, where you're dealing with other issues. I mean, really, the problem is the juvenile court or truancy, so you need some kind of support for choices. Then you have delinquents; they come out of the Department of Juvenile Justice, and they don't have the support systems to get them back in school, to teach them or treat them or make them successful young people, before they ever get into the adult system,' Stewart said. The bill will be returned to the Senate Finance Committee on Monday, May 19. Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
Yahoo
28-04-2025
- Politics
- Yahoo
Call for ban on AI apps creating naked images of children
The children's commissioner for England is calling on the government to ban apps which use artificial intelligence (AI) to create sexually explicit images of children. Dame Rachel de Souza said a total ban was needed on apps which allow "nudification" - where photos of real people are edited by AI to make them appear naked - or can be used to create sexually explicit deepfake images of children. She said the government was allowing such apps to "go unchecked with extreme real-world consequences". A government spokesperson said child sexual abuse material was illegal and that there were plans for further offences for creating, possessing or distributing AI tools designed to create such content. Deepfakes are videos, pictures or audio clips made with AI to look or sound real. In a report published on Monday, Dame Rachel said the technology was disproportionately targeting girls and young women with many bespoke apps appearing to work only on female bodies. Girls are actively avoiding posting images or engaging online to reduce the risk of being targeted, according to the report, "in the same way that girls follow other rules to keep themselves safe in the offline world - like not walking home alone at night". Children feared "a stranger, a classmate, or even a friend" could target them using technologies which could be found on popular search and social media platforms. Dame Rachel said: "The evolution of these tools is happening at such scale and speed that it can be overwhelming to try and get a grip on the danger they present. "We cannot sit back and allow these bespoke AI apps to have such a dangerous hold over children's lives." 
Dame Rachel also called for the government to:
- impose legal obligations on developers of generative AI tools to identify and address the risks their products pose to children, and take action to mitigate those risks
- set up a systemic process to remove sexually explicit deepfake images of children from the internet
- recognise deepfake sexual abuse as a form of violence against women and girls
Paul Whiteman, general secretary of school leaders' union NAHT, said members shared the commissioner's concerns. He said: "This is an area that urgently needs to be reviewed as the technology risks outpacing the law and education around it." It is illegal in England and Wales under the Online Safety Act to share or threaten to share explicit deepfake images. The government announced in February laws to tackle the threat of child sexual abuse images being generated by AI, which include making it illegal to possess, create, or distribute AI tools designed to create such material. It said at the time that the Internet Watch Foundation - a UK-based charity partly funded by tech firms - had confirmed 245 reports of AI-generated child sexual abuse in 2024, compared with 51 in 2023, a 380% increase. Media regulator Ofcom published the final version of its Children's Code on Friday, which puts legal requirements on platforms hosting pornography, and content encouraging self-harm, suicide or eating disorders, to take more action to prevent access by children. Websites must introduce beefed-up age checks or face big fines, the regulator said. Dame Rachel has criticised the code, saying it prioritises the "business interests of technology companies over children's safety". A government spokesperson said creating, possessing or distributing child sexual abuse material, including AI-generated images, is "abhorrent and illegal". "Under the Online Safety Act platforms of all sizes now have to remove this kind of content, or they could face significant fines," they added. 
"The UK is the first country in the world to introduce further AI child sexual abuse offences - making it illegal to possess, create or distribute AI tools designed to generate heinous child sex abuse material."


BBC News
28-04-2025
- Politics
- BBC News
Ban AI apps creating naked images of children, says children's commissioner
The children's commissioner for England is calling on the government to ban apps which use artificial intelligence (AI) to create sexually explicit images of children. Dame Rachel de Souza said a total ban was needed on apps which allow "nudification" – where photos of real people are edited by AI to make them appear naked – or can be used to create sexually explicit deepfake images of children. She said the government was allowing such apps to "go unchecked with extreme real-world consequences". A government spokesperson said child sexual abuse material was illegal and that it had introduced further offences for creating, possessing or distributing AI tools designed to create such content. Deepfakes are videos, pictures or audio clips made with AI to look or sound real. In a report published on Monday, Dame Rachel said the technology is disproportionately targeting girls and young women, with many bespoke apps appearing to work only on female bodies. She said she also found children were changing their behaviour online to avoid becoming a victim of nudification apps. "They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," she said. 
"Girls have told me they now actively avoid posting images or engaging online to reduce the risk of being targeted by this technology. We cannot sit back and allow these bespoke AI apps to have such a dangerous hold over children's lives." Dame Rachel also called for the government to:
- impose legal obligations on developers of generative AI tools to identify and address the risks their products pose to children, and take action to mitigate those risks
- set up a systemic process to remove sexually explicit deepfake images of children from the internet
- recognise deepfake sexual abuse as a form of violence against women and girls
Paul Whiteman, general secretary of school leaders' union NAHT, said members shared the commissioner's concerns. "This is an area that urgently needs to be reviewed as the technology risks outpacing the law and education around it," he said. It is illegal in England and Wales under the Online Safety Act to share or threaten to share explicit deepfake images. The government announced in February laws to tackle the threat of child sexual abuse images being generated by AI, which include making it illegal to possess, create, or distribute AI tools designed to create such material. It said at the time that the Internet Watch Foundation – a UK-based charity partly funded by tech firms – had confirmed 245 reports of AI-generated child sexual abuse in 2024, compared with 51 in 2023 – a 380% increase. Media regulator Ofcom published the final version of its Children's Code on Friday, which puts legal requirements on platforms hosting pornography, and content encouraging self-harm, suicide or eating disorders, to take more action to prevent access by children. 
Websites must introduce beefed-up age checks or face big fines, the regulator said. Dame Rachel has criticised the code, saying it prioritises the "business interests of technology companies over children's safety". A government spokesperson said creating, possessing or distributing child sexual abuse material, including AI-generated images, is "abhorrent and illegal". "Under the Online Safety Act platforms of all sizes now have to remove this kind of content, or they could face significant fines," they added. "The UK is the first country in the world to introduce further AI child sexual abuse offences - making it illegal to possess, create or distribute AI tools designed to generate heinous child sex abuse material."


Observer
24-04-2025
- Politics
- Observer
UK minister: Social media curfews could be imposed on children
Social media curfews for children are among a range of plans being considered by the UK government, Technology Secretary Peter Kyle has revealed. Kyle told the Daily Telegraph he was "watching very carefully" the introduction of TikTok's 10pm curfew for users under 16 and examining tools for parents to switch off access at set times. "These are things I am looking at," he said. "I'm not going to act on something that will have a profound impact on every single child in the country without making sure that the evidence supports it." The proposal comes amid concerns about how the "addictive" nature of social media is interrupting sleep schedules and disrupting schooling and family life. Kyle said he was considering enforcement options under the Online Safety Act following regulator Ofcom's publication of the Children's Code. He described the new rules as a "sea change" under which parents can expect their child's social media experience to "look and feel different". Kyle said he would not be "short of encouraging Ofcom to use its powers to the full" to fine social media companies and imprison offenders.


The Independent
24-04-2025
- The Independent
Social media curfews for children could become law, Labour minister says
A social media curfew that would see children made to stop using apps like TikTok, Instagram and Snapchat after 10pm could be made law in Britain, the technology secretary has revealed. Peter Kyle said he is 'watching very carefully' TikTok's move to limit usage of its app for users under 16 after 10pm, and is examining tools parents could use to switch off access at set times. 'These are things I am looking at,' he told the Daily Telegraph, adding: 'I'm not going to act on something that will have a profound impact on every single child in the country without making sure that the evidence supports it.' There is increased pressure on ministers to look at how teens use social media amid expert concerns about 'addiction', and about apps interrupting sleep schedules and disrupting schooling and family life. Mr Kyle said he was considering enforcement options under the Online Safety Act following regulator Ofcom's publication of the Children's Code. He described the new rules as a 'sea change' under which parents can expect their child's social media experience to 'look and feel different'. Mr Kyle said he would not be 'short of encouraging Ofcom to use its powers to the full' to fine social media companies and imprison offenders. The Online Safety Act began coming into effect last month and requires platforms to follow new codes of practice set by the regulator Ofcom in order to keep users safe online. It comes after the Internet Watch Foundation (IWF), which finds and helps remove abuse imagery online, said it received 291,273 reports of child sexual abuse imagery in 2024. In its annual report, the organisation said rising numbers of cases were being driven by threats including AI-generated sexual abuse content, sextortion and the malicious sharing of nude or sexual imagery. It said under-18s were now facing a crisis of sexual exploitation and risk online. 
In response, the IWF announced it was making a new safety tool available to smaller websites for free, to help them spot and prevent the spread of abuse material on their platforms. The tool, known as Image Intercept, can spot and block images in the IWF's database of more than 2.8 million items that have been digitally marked as criminal imagery. The IWF said it will give wide swathes of the internet new, 24-hour protection and help smaller firms comply with the Online Safety Act. Derek Ray-Hill, interim chief executive at the IWF, said: 'Young people are facing rising threats online where they risk sexual exploitation, and where images and videos of that exploitation can spread like wildfire. 'New threats like AI and sexually coerced extortion are only making things more dangerous. 'Many well-intentioned and responsible platforms do not have the resources to protect their sites against people who deliberately upload child sexual abuse material. 'That is why we have taken the initiative to help these operators create safer online spaces by providing a free-of-charge hash-checking service that will identify known criminal content.'