Latest news with #CanadianCentreforChildProtection


Hamilton Spectator
02-05-2025
- Hamilton Spectator
AI is fuelling an alarming surge of deepfake child sexual abuse material in Canada: What parents need to know
Canada could be facing a disturbing new frontier in digital child sexual abuse. New data from the Canadian Centre for Child Protection shows a chilling surge in AI-generated child sexual abuse material (CSAM) in Canada. The number of sexually explicit deepfakes of children that the tip line has processed more than doubled in the past year, director David Rabsch said in an interview with Metroland Media.

Deepfakes are hyper-realistic, AI-generated synthetic media, produced by feeding AI software photos, images or audio of real people to create fake material. According to Rabsch, who leads the team of analysts at Cybertip — Canada's national tip line for reporting online sexual exploitation of children — the number of these materials processed shot up from 4,000 in 2023 to 9,000 in 2024. The figures for early 2025 paint an even grimmer picture: 4,000 sexually explicit deepfakes were processed in the first quarter of 2025 alone.

'We have seen cases where offenders have taken images of known children and used AI to create intimate images,' Det. Staff Sgt. Mike Krisko of the Waterloo Region Police's cybercrime unit said in a phone interview. Offenders will sometimes find photos of people that are publicly available on social media and run them through an AI tool, he explained, adding that the cybercrime unit started seeing such cases in Waterloo Region late last year.

In cases involving sextortion, extortionists will send the image to the victim and threaten to distribute the deepfakes to friends and family unless the victim pays or gives in to more demands, Krisko said. With AI, predators can now skip the initial process of luring a child into sending explicit photos. That 'fast tracks' their process: they pick up typically harmless images from a child's Snapchat, TikTok or Instagram, then use a generative AI tool to create sexually explicit deepfakes they can use for sextortion, add to their own collections, or sell and trade in underground pedophile forums. 'These are minors, and adult content is being made of them nonconsensually, then also being posted on social media,' says Bellingcat senior researcher Kolina Koltai.

While many of these images are accumulated by child predators into massive personal collections saved on their computers, others are traded and shared, and many are sold for profit on the dark web. Rabsch said some criminals share the materials and ask for 'donations' in cryptocurrency. Transactions are done in crypto, the currency of the dark web, since it is an anonymous way of trading value, he explained.

In other cases, these child predators will go on dark web forums looking for like-minded individuals. These are places on the dark web where people discuss child abuse and advertise their massive collections. Advertisements luring other predators to get access are posted on the dark web, but the content itself may be kept in a compressed archive on the clear web or surface web, the publicly accessible layer of the internet. These archives hold thousands of child sexual abuse photos and videos and are hosted on the clear web by file hosting services that do not proactively scan for harmful material, Rabsch explained. Offenders then set up premium file hosting services that charge pedophiles a subscription or one-time fee to download the content.
'Now you've got a profit model where you can have a ring of sites on the dark web that are all funneling this traffic and potentially money into these premium file hosters,' he said.

A Pulitzer investigation published early this month found that sites like Instagram are also being used to advertise child sexual exploitation material and to direct child predators willing to pay to the content subscription platforms Patreon and Fanvue, two platforms sometimes used by digital child exploitation rings to keep large collections of AI-generated child sexual exploitation material. According to the investigation, the Patreon and Fanvue accounts offered 'exclusive subscription content' and production of 'special media.'

Patreon is a content hosting and sharing site where fans can support content creators by paying a monthly subscription to access videos, images and blogs. A BBC investigation found that some Patreon accounts were used to host and sell 'AI-generated, photorealistic obscene images of children,' including material depicting the sexual abuse of babies and toddlers, with pricing levels varying by the type of material predators want to access.

Krisko advises parents to make sure profiles are 'locked down with the tightest security settings on them.' Parents should also keep in mind that any photo they post publicly can be taken by online predators and altered. 'You only want to share images with people that you really know,' he added. This means parents should make sure everyone on their contact list, and anyone who may have access to their children's photos, is someone they know and trust. 'You shouldn't have strangers or people you've never met on your contact list.'

For those with children who go online, Rabsch says the best thing they can do is talk to their children: explain the dangers lurking online and 'teach them how to spot and avoid these kinds of scenarios,' he said. 'We have content … to help parents at least understand what the scope of the problem is,' he shared.

Parents who discover sexually explicit material of their children being shared online can report it; NeedHelpNow can guide parents through the process of reporting and removing the harmful images or videos of their children online. Parents of victims can also reach out to the Canadian Centre for Child Protection, which can help determine and navigate next steps, assist victims of online sexual abuse with impact statements for criminal proceedings, connect victims and their families to support services, and help disrupt the availability of the harmful material online.

While Rabsch acknowledges the role parents must play in protecting their children, he thinks the emphasis should instead be on pressuring governments, regulators and the industry to create more comprehensive standards for the development, deployment and use of AI technologies. The bigger problem, he said, is that this new technology has been released for the public to use freely without systems of control in place. 'It's being developed at a ridiculously rapid pace. There's no oversight, there's no regulation … And here we see the fruits of that reckless behaviour,' he said.
Rabsch says it doesn't matter how the imagery is created. 'Whether it was created the way it's always been created or whether it's produced through deepfake technology, the harm or trauma inflicted on the victim is the same,' he said. 'We have this real issue now where child sexual abuse material can be freely created by basically anyone and there's nothing that the industry, government and regulators are really doing,' Rabsch said, adding that governments should apply pressure and regulation so the AI industry has to conform. 'Until that is really addressed, this problem is just going to continue as it is right now.'


Ottawa Citizen
02-05-2025
- Health
- Ottawa Citizen
Aaron and Munter: Let's take the pledge to give our kids a smartphone-free childhood
Every generation of parents has faced new challenges. One of the biggest right now is how to prevent young kids from being groomed by tech platforms that want to turn them into lifelong customers.

Just like our parents understood the addictive potential and health risks of tobacco, today's parents and caregivers are realizing how social media algorithms are affecting kids' health. And a group of us here in Ottawa is banding together to try to do something about it.

The evidence is clear: more social media equals less mental health. For example, recent CHEO research demonstrated the link between social media and worsening rates of anxiety and depression. The U.S. Surgeon General called for a health warning on social media last year. Jonathan Haidt documents the problem in his best-selling book, 'The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness.'

Tech billionaires are using the most sophisticated technology to get into our children's brains, deploying algorithms designed to get kids hooked as quickly as possible. Research in the United Kingdom has demonstrated how algorithms feed young girls imagery of eating disorders and self-harm, while young boys are pushed pornographic and misogynistic content.

The Canadian Centre for Child Protection now offers services to families affected by online sextortion scams. UNICEF Canada reports that 44 per cent of youth have experienced cyberbullying or negative interactions online that have affected their mental health.

So what to do? Many people believe that government regulation to deal with online harms is critical. This view is backed by many children's health, medical and other organizations, including Unplugged Canada.

We're not powerless as parents to take action right away, though. One thing we can do is delay giving our kids a smartphone until they are in at least Grade 9 and better equipped to navigate the online world.

Both of us have a child in Grade 1. And we've both fielded our first requests for a cellphone. We expect those requests will accelerate as peers get phones.

But what if peers didn't? The Mutchmor School Council has recently endorsed the Unplugged Canada campaign, a voluntary effort by parents and caregivers to commit to delaying smartphones until at least the age of 14.

This is something we can do to shift the norm around when smartphones and social media are introduced in our schools and communities. It won't just protect kids from online harm; it will also create space for more play, more social connection in the real world, and more face-to-face time with friends and family.


CBC
19-04-2025
- CBC
Court must consider showing child porn sample at sentencing hearing: Manitoba judge
A Manitoba judge says the court must consider showing a sample of child pornography during a May sentencing hearing for a man who has pleaded guilty to possessing the material.

"The proper administration of justice requires the court to consider the representative sample," provincial court Judge Geoffrey H. Bayly wrote in an April 9 decision delivered from court in the Interlake community of Ashern.

Rodney Yankie's defence had made a motion to prevent the court from seeing the material. The defence had argued "there is no necessity" for the court to view the child sex abuse material, suggesting that doing so would "create a prejudice to the accused that far outweighs the probative value of the sample," according to Bayly's decision. Yankie's lawyer also argued that showing the material could be "adding to the trauma and victimization of the children depicted in the sample." The defence suggested that because Yankie is not contesting the facts in this case, a written description of the material would suffice.

However, the Crown said the sample should be reviewed alongside a written description of the material at the sentencing hearing, referencing a 2002 Alberta appeal court decision that stated "the photographs do not depict the crime — they are the crime."

Bayly denied the application, ruling that the evidence the defence wanted to exclude accurately demonstrates the seriousness of the crime and Yankie's culpability, and suggesting the presiding judge should consider showing a sample at the sentencing hearing next month.

Differing views on benefits, harm of showing material

Monique St. Germain, general counsel for the Winnipeg-based Canadian Centre for Child Protection, told CBC News that generally speaking, judges may decide to review child sexual abuse material to get an accurate understanding of the crime, as opposed to "a sanitized version."

"Seeing an image is never going to be replaced by words. That's just not possible," St. Germain said, adding that a written or verbal description "diminishes the victim's experience in a considerable way."

"In the context of sentencing, it is viewed because that is the crime that was committed — either the making of it, or the possessing of it, or the distributing it. Understanding what [child sexual abuse material] was possessed or distributed is part of the sentencing process."

In his decision, Bayly wrote that the sentencing judge serves a "gatekeeper function," adding that the decision to review the material is made on a case-by-case basis, depending on how the judge weighs the value of the evidence to the case at hand. Showing the child pornography could revictimize the children depicted in it, Bayly wrote, but his decision also said a judge shouldn't avoid viewing it on the grounds that it is "distasteful or upsetting," if safeguards are put in place to protect the children's dignity. However, no one but the court — the witness, judge and counsel — should view the material, Bayly wrote. The public and Yankie should not see it.

Brandon Trask, a University of Manitoba associate law professor who himself developed post-traumatic stress disorder while prosecuting a number of child sex abuse cases as a Crown attorney in Nova Scotia, says viewing these types of images and videos in a court setting is "definitely an unsettled issue," with judges ruling both for and against. "Every time somebody views these materials, arguably that has a very negative impact on the victim — wherever they are located," Trask told CBC News.
"Unless it's absolutely necessary for the truth-finding function of the criminal justice system, I'm certainly of the perspective that we should not be encouraging the actual submission of the actual materials where there is agreement on, through a verbal description, of what the materials depict," he said. "We should be looking to protect people as much as possible — victims and everybody else involved in the criminal justice system." Trask also said descriptions should be the "default" approach for courts, with materials provided only in "exceptional situations." Yankie's sentencing hearing is scheduled for May 12.


Yahoo
05-03-2025
- Yahoo
Perspective: Apple's new protections for kids don't go far enough
After years of applied pressure and even begging from parents, advocates and lawmakers, Apple has suddenly decided to fix failures in its child safety features. Why now? Simple. Utah and other states are moving to enact legislation that requires age verification and parental consent for all app downloads and purchases. What we call the App Store Accountability Act appears poised to become law in the Beehive State, and other states will likely follow. Apple is paying close attention.

The company has released a set of reforms it headlined 'Helping Protect Kids Online,' and its strategy of 'shock and awe' seems to be working. As word of these updates tears through state capitals across the country, lawmakers are rightly wondering: do these updates answer the issues that our legislation identifies and seeks to fix?

After getting the cold shoulder from Apple for years, we admit that we are pleased by some of these updates. But given what's at stake, they are not enough. Here's why.

Apple's eight-page announcement outlines a number of updates to be rolled out by year's end. Promised features include making it easier to set up child accounts; a new 'age range' application programming interface (API) that allows app stores and app developers to share age-category data to better ensure age-appropriate experiences; more granular app age ratings; better app descriptions; and removing apps that exceed a child's age range from the App Store.

The 'age range API' seems particularly well done. According to TechCrunch: 'Instead of asking kids to input their birthdays, as many social apps do today, developers will have the option to use a new Declared Age Range API that allows them to access the age range information the parent input during the child account setup. (Parents can also correct this information at any time if it was originally entered incorrectly.)'

These updates are needed. But why in the world has it taken so long? For years, shareholders and child-safety advocates have been asking the company to better protect kids online. In 2018, almost 11 years after the iPhone's release, Apple shareholders wrote a letter to the board of directors demanding that the company give parents more resources and tools to protect children. A 2022 Canadian Centre for Child Protection report detailed Apple's failure to enforce app age ratings for younger users and labeled its parental controls 'inadequate' and 'unusable.' A Screen Time parental-control bug that causes protective settings to disengage on child accounts has plagued iPhones and iPads for more than two years, unsolved. And a scathing Wall Street Journal investigation published in December exposed Apple for misrepresenting up to a quarter of its apps as 'safe for kids' when many were rife with sexual exploitation and bullying.

Between December and now, Apple's technological capabilities did not change. To put it positively: Apple already possessed the technological and financial capacity to institute these changes. Modern APIs have been around for years. And Apple has every financial resource it needs to develop and effectively implement these safeguards, raking in $26 billion in revenue from its app store alone during FY 2023. These are good changes, but from a moral, financial and technical standpoint, there is no reason why it should have taken this long.

But are Apple's new features enough? No; deep issues remain unaddressed.
A few years ago, the young son of one of the authors of this column was served ads for sexual role-play apps — including a graphic strip show — and ads for apps focused on gambling and marijuana cultivation, all while playing a cartoon game that the app store rated safe for children. His mother had stepped away for only a few minutes to fold laundry, assuming that the age rating accurately represented what her son would experience while using the app.

Such anecdotes are not the exception; they are the rule. There is a systemic failure that leaves parents and children who access the internet through the app store totally vulnerable, because Apple allows developers to operate on an honor system, without any meaningful enforcement. Apple's recent safety update doesn't change this: its app store will still rely on the honor system, allowing developers to self-report content with little oversight and few consequences for misrepresenting age ratings or content warnings.

As mentioned, Apple's announcement comes just days before Utah is expected to become the first state to pass the App Store Accountability Act. The bill would require Apple, and other app store providers like Google, to perform age verification and get parental consent before minors can download apps, purchase apps or make in-app purchases. Using the account holder's age-category status and a parental-consent status collected by the app store, app developers would be required to enforce any developer-created age-related restrictions. Additionally, the legislation would require all minors to link to a parent's account before using the app store in the first place.

Apple's proposed changes will not solve the core issue, which is that minors need parental consent when it comes to entering binding terms-of-service agreements. Currently, the app store routinely allows known minors to download apps, accept complex terms of service and make in-app purchases without any parental consent. This loophole exposes children to privacy risks, financial harm and dangerous digital environments. Only in app stores do we allow minors to enter into terms-of-service agreements with trillion-dollar companies that determine how their personal data can be used, often giving such companies full access to extremely sensitive information like photographs and videos of children, their exact location and their contact lists. Our legislative model ends that practice.

Apple's new updates, by contrast, will not stop any of this. Apple will still treat teens as digital adults, allowing minors to agree to complex terms-of-service contracts without parental consent, despite the fact that in the real world, one has to be 18 to enter into a binding contract. Furthermore, Apple will only enforce the proposed app store protections for parents who figure out how to enable content controls, leaving parents without any meaningful backstop when their kids try to circumvent the controls. (And they will try.)

The stakes are high. If Apple can convince lawmakers that its updates are adequate to the task, then it will continue to prioritize its profit over protecting kids online, without real consequences. We welcome better labels and controls, but parents and their kids need much more than that. They need a bill that provides app store accountability. And they need it now.

Melissa McKay is the chairman of the Digital Childhood Alliance. Chris McKenna is the founder and CEO of Protect Young Eyes.
Michael Toscano is the executive director of the Institute for Family Studies and director of the Family First Technology Initiative. Jared Hayden is a policy analyst with the Family First Technology Initiative at the Institute for Family Studies.