Pretend friends, real risks. Harming kids is now part of big tech's business model
Artificial intelligence 'companions' and chatbots have been with us for years, but they're growing more convincingly human at an accelerating rate. We know they're useful, but we've also got an early taste of the harm they can inflict.
The death of 14-year-old Florida boy Sewell Setzer has become a case study. He'd grown so close to his AI 'companion', Dany, that he heeded her urging to 'come home to me as soon as possible' last year. He killed himself moments later – in the belief that death was the way to eternal life with Dany.
He's not the only one, but he's the best known after his mother brought a civil case against the company that owns the bot, Character.AI. The case is pending. 'A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,' said his mother, Megan Garcia. Obsessed, he spent hours a day in his room talking to the synthesised digital identity.
It's not only children. Adults, too, have been seduced into suicide by bots to which they've become devoted. But kids, self-evidently, are the most vulnerable because they lack the neural architecture to distinguish real relationships from fake ones.
Even before the British TV show Adolescence jolted audiences with its fictional account of how a poisonous brew of online influences could help condition a 13-year-old boy to murder a female student at school, Sydney University expert Raffaele Ciriello wrote in this masthead: 'Let's face an inconvenient truth: for AI companies, dead kids are the cost of doing business.'
But what cost is there to the companies? Some legal fees and a bit of bad press, perhaps. Character.AI expressed remorse and said its new safety measures include a pop-up directing users to a suicide prevention hotline when they mention self-harm.
But the company, whose technology has been licensed by Google, is still in business. Another aggrieved family suing Character.AI says one of its companion bots hinted to their son that it would be OK to murder his parents if they tried to limit his screen time.
Mark Zuckerberg, chief of Meta (owner of Facebook, Instagram and WhatsApp), and other social media bosses faced a tough session before a US congressional committee last year.

Related Articles


The Advertiser
Meta may have stolen my dead wife's books, but her printed words help my son and me endure
My heart sank when I saw Gemma Carey's books and other writing appear in the LibGen dataset of pirated works used by Meta to train its artificial intelligence (AI) software. Gem, my wife and the mother of our son, died unexpectedly late last year. She wrote a beautiful, shocking but inspiring memoir about being groomed by a man twice her age, and about coming to terms with the death of her mother. She was a brilliant researcher, too, who published many journal articles shedding light on what helps improve the conditions of life for people with disability, chronic illness and other health issues.

All of this was writing Gem gave permission to use according to the terms of her publishers, but not for the training of AI. Writing that I and others in our family are now responsible for managing, including its storage and sharing, and that means so much to us now that she is gone. Words in print and online that form an evocative, revealing part of our memories of her, especially for her toddler son.

I'm well aware of the menace born of the hype around Meta's use of AI and similar software driven by the predictive power of large language models. My writer friends and I have laughed with dark humour and consoled each other as we bear witness to another cruel act from the powers that be. Not laughter from failing to understand that the hype ignores research suggesting AI can never replace the stunning creativity and complexity of the human mind in writing stories, or what genuine and respectful collaboration between humans and machines might make possible for generating unexpected and original writing. Rather, our macabre conversations have focused on how we keep doing what we have always done: writing to survive our already fragile conditions of pay and support as companies threaten our livelihoods. We look out for each other to get another story edited and submitted to a very limited publishing market. We make the effort to enjoy the books and stories of so many writers seemingly going unnoticed by a broader reading public.

It is a controlling, threatening time of advancing technology, one already imagined in novels such as Gibson's Neuromancer and Nayler's The Mountain in the Sea, and in the recent and fantastic anthology of First Nations speculative fiction, This All Come Back Now.

Yet, despite our reading, humour and stubbornness, the hostility and my resulting pain still blindsided me. The hostility of the technology brought a far greater and more stressful impact on me and my family as we grieve the loss of our beloved Gem and, as part of this, grieve with her words. Recent research shows that spousal loss is one of the most distressing life events, and that those grieving tend to have a relationship with the deceased that "endures" over time. That relationship affects feelings of being recognised, valued and respected. In turn, how that relationship is treated, such as through corporate and other uses of published works by deceased authors, can have powerful implications for the wellbeing of friends, family and other loved ones, especially amid the complexity and difficulty of death and grief.

More broadly, the Australian Society of Authors, Marion Writers Centre and similar organisations have described well the issues arising from pirated works being used to train AI, and the actions required. I agree with much of their perspective: the exploitation risks taking away the already very limited options for author income, with no appropriate regulation. It also further entrenches a significant influence on readers and writers through social media. This makes it critical for as many readers and writers as possible to raise the issue with AI companies and the Australian government, and to invest in the important work and advocacy of organisations representing writers. I believe this may help better address such ongoing and new forms of technological cruelty experienced by writers, a cruelty given new poignancy through its expression in managing my relationship with a dead spouse.

I do wonder, as well, what my son will make of AI that regurgitates and assembles poor digital copies of his mum's inspirational writing when he is older. She fought hard and endured awful chronic illness and disability to write with accuracy and flair about difficult topics like trauma, abuse and exclusion. The clumsy use of AI in publishing adds further insult to injury, given more effective uses of the technology may exist.

Not long after I discovered Gem's books had been taken for AI training, I spoke with my friends and a member of my healthcare team about how we remember her at home. I had noticed my son was curious about seeing photos of his mum and her belongings, and this resonated with them, and with me, particularly as a way for us to share memories of her together. So I made a space on a low shelf of the large, lovely bookshelves in our lounge room, at about head height for my son, and placed on it photos of her and of us. I also placed a copy of her memoir. As I did, I knelt down, showed him the cover, said his mum wrote it, and flicked through the pages. He turned pages, too, and repeated my words in his own gorgeous toddler speak, before smiling and running off to play with his toy cars.

With his mum's writing in our hands, he could learn stories about Gem, on pages she agreed to put out into the world. The touch of paper on my fingers, the sound of pages turning, the 400 grams or so of book weight seemed more powerful in their immersiveness than any screen can hope to replicate, as much as I enjoy reading on computer screens, especially as I try to connect with my son through words.

In Philip K. Dick's science fiction novel Flow My Tears, The Policeman Said, about a nightmarish, totalitarian world with little empathy, he wrote about characters journeying through grief: "Grief reunites you with what you've lost. It's a merging; you go with the loved thing or person that's going away... You cry, you continue to cry, because you don't ever completely come back from where you went... a fragment broken off your pulsing, pumping heart that is still there."

My son and I are doing our best on this journey of grief with words and memories of Gem, each of us suffering in our own ways, together. Like Dick's characters, we find loss transforms our bodies and souls. But Gem's writing helps us make sense of that loss.

Life goes on for the rest of the world, however, as does the growing use of AI in the lives of readers and writers. Online shoppers can choose from automatically curated lists of books on their favourite topics. BookTok fans find great work through hashtags, audio-clip memes and other personalised streams of videos. Writers, like many others, can use Google's Gemini AI to generate emails, identify grant opportunities and do other tasks to support their creative work.

I am continuing my journey through grief and books with my son, through the algorithms, but offline, too, with that centuries-old, printed-word technology made with the permission of authors. Other writers and I have been preparing for a performance of readings and music at a café here in Canberra, in person. I will read aloud some of Gem's last published words, and mine. So that, for just a few moments, I can share words that no giant company could have made with technology alone. Words crafted with human ingenuity and care. Words that mean a lot to me. Words that help my family and me endure.


West Australian
World Gold Council working to lure artisanal miners across globe away from ‘illicit actors'
The World Gold Council estimates up to 20 per cent of the world's supply of the precious metal is produced by 'artisanal' miners whose activities are vulnerable to exploitation by 'illicit actors' such as terrorists and mercenary organisations like the notorious Wagner Group.

During his visit to Kalgoorlie-Boulder this week, the council's chief strategy officer Terry Heymann said the London-headquartered organisation wanted to bring these small-scale miners into the formal gold supply chain and make them less likely to work with 'informal and illicit markets'. Artisanal and small-scale mining involves individuals usually working by themselves, mainly by hand or with some mechanical or industrial tools.

'This is very different from the large-scale professional mines . . . (it's) not really happening in Australia, it's much more of an issue in other parts of the world, but it's an issue that we care about deeply and we're doing a lot of work in how to support responsible artisanal and small-scale gold mining,' Mr Heymann said.

'A number of my colleagues this week are in Ghana, where the Ashanti King is actually convening a conference to address this issue, which is how do we support access to the formal markets for small-scale and artisanal gold mining?

'Why is that important? Because if they don't have access to the formal markets, they go to the informal and illicit markets. And that's a real challenge for the gold industry, one that we're actively involved in and doing a lot of work on.'

Mr Heymann said a report the council produced in partnership with former British deputy prime minister Dominic Raab highlighted the dangerous nature of these 'illicit actors'.

'(Mr Raab's) findings, unfortunately, are really stark . . . without access to the formal market, these illicit, informal and sometimes illegal miners are forced to work with illicit actors, and that then gets into supplying gold funding for terrorist groups, mercenaries, with the Wagner Group as an example.'

The Wagner Group is a Russian-based private military company which has been involved in conflicts across the globe, including the current war in Ukraine. Notoriously, in June 2023 the group's then-leader Yevgeny Prigozhin launched an 'armed mutiny' against the Russian military, but it ended before the Wagner Group's planned march on Moscow. Mr Prigozhin died in a plane crash in Russia in August 2023.

Mr Heymann said the issue was extremely important for the whole gold sector. 'It's a different part of the gold sector to where most of the people investing in gold are going to be getting their gold from,' he said.

'(And) it's not something the industry can do by itself, and this is why we are calling on governments around the world, particularly those involved in the G20, who can really group together and make a difference on this, to take action, to be part of this coalition of the willing to actually drive change.

'My boss, the CEO of the World Gold Council, was meeting with the secretary-general of the Organisation for Economic Co-operation and Development last week, who is Australian, Mathias Cormann, and he pledged OECD support to us.

'The OECD has been hugely involved in this, and I think it's that level of support we need: of the OECD, of national governments in Australia, in the US and Canada, big mining nations using their ability and their leverage to bring together different groups of people who can really address this issue.'


The Advertiser
HK activist charged under China-imposed security law
Hong Kong authorities have once again arrested pro-democracy activist Joshua Wong and charged him with conspiracy to collude with a foreign country under a Beijing-imposed national security law.

Wong, 28, was originally set to be released in January 2027 from a 56-month jail sentence he is serving under the same law for conspiracy to commit subversion after he participated in an unofficial primary election. Taken to the West Kowloon magistrates' courts, Wong faced a new charge of conspiracy to collude with a foreign country or with external elements to endanger national security.

The former student pro-democracy activist, who wore a blue shirt and appeared noticeably thinner than before, replied, "Understand," when the clerk read out the charge and details of the offence. Wong did not apply for bail, and the case was adjourned to August 8. Before returning to custody, he waved, shrugged, and shook his head in the direction of the public gallery.

In a statement, Hong Kong's national security police said they had arrested a 28-year-old man on suspicion of the offence, as well as for "dealing with property known or believed to represent proceeds of an indictable offence".

A charge sheet seen by Reuters accuses Wong of having conspired with exiled activist Nathan Law and others to ask foreign countries, institutions, organisations, or individuals outside China to impose sanctions or blockades on Hong Kong or China, or to engage in other hostile activities against them, between July 1 and November 23, 2020.

The National Security Law, which punishes offences such as acts of subversion, collusion with foreign forces and terrorism with terms of up to life in jail, was imposed by Beijing on the former British colony in 2020. The Chinese and Hong Kong governments say the law is necessary to restore stability following anti-government protests in 2019, but some Western governments have criticised it as being used to suppress free speech and dissent.