Meta may have stolen my dead wife's books, but her printed words help my son and me endure

My heart sank when I saw Gemma Carey's books and other writing appear in the LibGen dataset of pirated works used by Meta to train its artificial intelligence (AI) software.
Gem, my wife, and mother to our son, died unexpectedly late last year. She wrote a beautiful, shocking but inspiring memoir about being groomed by a man twice her age, and coming to terms with the death of her mother. She was a brilliant researcher, too, who published many journal articles that shed light on what helps improve the conditions of life for those with disability, or suffering from chronic illness and other health issues.
All writing that Gem gave permission to use under the terms of her publishers, but never for the training of AI. Writing that others in our family and I are now responsible for managing, including its storage and sharing, and that means so much to us now that she is gone. Words in print and online that form an evocative, revealing part of our memories of her, especially for her toddler son.
I'm well aware of the menace born of the hype around Meta's use of AI and similar software driven by the predictive power of large language models. My writer friends and I have laughed with dark humour and consoled each other as we bear witness to another cruel act from the powers that be. Not laughter born of the understanding that the hype fails to recognise research suggesting AI can never replace the stunning creativity and complexity of the human mind for writing stories, or what is possible from genuine and respectful collaboration between humans and machines in generating unexpected and original writing.
Rather, our macabre conversations have focused on how we continue to do what we have always done to write and survive on already fragile pay and support as companies threaten our livelihoods. We look out for each other to get another story edited and submitted to a very limited publishing market. We make the effort to enjoy the books and stories of so many writers seemingly going unnoticed by a broader reading public.
It is a controlling, threatening time of advancing technology that was already imagined in novels such as Gibson's Neuromancer and Nayler's The Mountain In The Sea, and the recent and fantastic anthology of First Nations speculative fiction, This All Come Back Now. Yet, despite our reading, humour and stubbornness, the hostility and my resulting pain still blindsided me. The hostility of technology brought with it a far greater and stressful impact on me and my family as we grieve for the loss of our beloved Gem, and as part of this, grieve with her words.
Recent research shows that spousal loss is one of the most distressing life events, and that those grieving tend to have a relationship with the deceased that "endures" over time. That relationship shapes feelings of being recognised, valued and respected. In turn, how that relationship is treated, including through corporate and other social uses of published works by deceased authors, can have powerful implications for the wellbeing of friends, family and other loved ones, especially amid the complexity and difficulties of death and grief.
More broadly, the Australian Society of Authors, Marion Writers Centre and similar organisations have clearly described the issues raised by pirated works being used to train AI, and the actions required. I agree with much of their perspective: without appropriate regulation, the exploitation risks taking away the already very limited options for author income. It also further entrenches a significant influence over readers and writers through social media.
This makes it critical for as many readers and writers as possible to raise the issue with AI companies and the Australian government, and to invest in the important work and advocacy of organisations representing writers. I believe this may help better address such ongoing and new forms of technological cruelty experienced by writers, a cruelty given fresh poignancy for me through managing my relationship with a dead spouse.
I do wonder as well what my son will make of AI that regurgitates and assembles poor digital copies of his mum's inspirational writing when he is older. She fought hard and endured through awful chronic illness and disability to write with accuracy and flair about difficult topics like trauma, abuse and exclusion. The clumsy use of AI in publishing adds further insult to injury given more effective uses of the technology may exist.
Not long after I discovered Gem's books had been stolen for AI training, I spoke with my friends and a member of my healthcare team about how we remember her at home. I had noticed my son was curious about seeing photos of his mum and her belongings, and this resonated with them, and with me. Particularly as a way for us to share memories of her together.
So I made a space on a low shelf of the large, lovely bookshelves in our lounge room, at about head height for my son, and placed photos of her and us onto it. I also placed a copy of her memoir. As I did, I knelt down, showed him the cover, said his mum wrote it, and flicked through the pages. He turned pages, too, and repeated my words, in his own gorgeous toddler speak, before smiling, then running off to play with his toy cars.
With his mum's writing in our hands, he could learn stories about Gem. On pages she agreed to put out into the world. The touch of paper on my fingers, the sound of pages turning, the 400 grams or so of book weight, seemed more powerful to me in its immersiveness than any screen can hope to replicate, as much as I enjoy reading on computer screens. Especially as I try to connect with my son through words.
In his science fiction novel Flow My Tears, The Policeman Said, about a nightmarish, totalitarian world with little empathy, Philip K. Dick wrote about characters journeying through grief:
"Grief reunites you with what you've lost. It's a merging; you go with the loved thing or person that's going away... You cry, you continue to cry, because you don't ever completely come back from where you went... a fragment broken off your pulsing, pumping heart that is still there."
My son and I are doing our best on this journey of grief with words and memories of Gem, each of us suffering in our own ways, together. Like Dick's characters we find loss transforms our bodies and souls. But Gem's writing helps us make sense of that loss.
Life goes on for the rest of the world, however, as does the growing use of AI in the lives of readers and writers. Online shoppers can choose from automatically curated lists of books on their favourite topics. BookTok fans find great work through hashtags, audio-clip memes and other personalised streams of videos. Writers, like many others, can use Google's Gemini AI to generate emails, identify grant opportunities and do other tasks to support their creative work.
I am continuing my journey through grief and books with my son, through the algorithms, but offline, too, with that centuries-old, printed-word technology, made with the permission of authors. Other writers and I have been preparing for a performance of readings and music at a café here in Canberra, in person. I will read aloud some of Gem's last published words, and mine.
So that, for just a few moments, I can share words that no giant company could have made with technology alone. Words crafted with human ingenuity and care. Words that mean a lot to me. Words that help my family and me endure.

The Advertiser

timea day ago

  • The Advertiser

Meta may have stolen my dead wife's books, but her printed words help my son and me endure

My heart sank when I saw Gemma Carey's books and other writing appear in the LibGen dataset of pirated works used by Meta to train its artificial intelligence (AI) software. Gem, my wife, and mother to our son, died unexpectedly late last year. She wrote a beautiful, shocking but inspiring memoir about being groomed by a man twice her age, and coming to terms with the death of her mother. She was a brilliant researcher, too, who published many journal articles that shed light on what helps improve the conditions of life for those with disability, or suffering from chronic illness and other health issues. All forms of writing that Gem gave permission for use according to the terms of their publishers, but not for training of AI. Writing that me and others in our family are now responsible for managing, including its storage and sharing, and that means so much to us now that she is gone. Words in print and online that form an evocative, revealing part of our memories of her, especially for her toddler son. I'm well aware of the menace borne from hype generated by Meta's use of AI and similar software algorithms driven by the predictive power of large language models. My writer friends and I have laughed with dark humour and consoled each other as we bear witness to another cruel act from the powers that be. Not laughter from understanding that the hype fails to recognise research suggesting AI can never replace the stunning creativity and complexity of the human mind for writing stories. Or what is possible from genuine and respectful collaboration between humans and machines for generating unexpected and original writing. Rather, our macabre conversations focused on how we are continuing to do what we have always done to write and survive our already fragile conditions of pay and support as companies threaten our livelihoods. We look out for each other to get another story edited and submitted to a very limited publishing market. 
We make the effort to enjoy the books and stories of so many writers seemingly going unnoticed by a broader reading public.

It is a controlling, threatening time of advancing technology, one already imagined in novels such as Gibson's Neuromancer and Nayler's The Mountain in the Sea, and in the recent and fantastic anthology of First Nations speculative fiction, This All Come Back Now.

Yet, despite our reading, humour and stubbornness, the hostility, and the pain it caused, still blindsided me. The hostility of technology brought with it a far greater and more stressful impact on me and my family as we grieve for the loss of our beloved Gem and, as part of this, grieve with her words.

Recent research shows that spousal loss is one of the most distressing life events, and that those grieving tend to have a relationship with the deceased that "endures" over time. That relationship has an impact on feelings of being recognised, valued and respected. In turn, how that relationship is treated, such as through corporate and other social uses of published works by deceased authors, can have powerful implications for the wellbeing of friends, family and other loved ones, especially amid the complexity and difficulties of death and grief.

More broadly, the Australian Society of Authors, Marion Writers Centre and similar organisations have described well the issues arising from pirated works used to train AI, and the actions required. I agree with much of their perspective: the exploitation risks taking away the already very limited options for author income, with no appropriate regulation. It also further entrenches a significant influence on readers and writers through social media. This makes it critical for as many readers and writers as possible to communicate on the issue to AI companies and the Australian government, and to invest in the important work and advocacy of organisations representing writers.
I believe this may help better address such ongoing and new forms of technological cruelty experienced by writers, a cruelty given new poignancy through its expression in managing my relationship with a dead spouse.

I do wonder, as well, what my son will make of AI that regurgitates and assembles poor digital copies of his mum's inspirational writing when he is older. She fought hard and endured awful chronic illness and disability to write with accuracy and flair about difficult topics like trauma, abuse and exclusion. The clumsy use of AI in publishing adds further insult to injury, given more effective uses of the technology may exist.

Not long after I discovered Gem's books had been stolen for use in AI training, I spoke with my friends and a member of my healthcare team about how we remember her at home. I noticed my son was curious about seeing photos of his mum and her belongings, and this resonated with them, and with me, particularly as a way for us to share memories of her together.

So I made a space on a low shelf of the large, lovely bookshelves in our lounge room, at about head height for my son, and placed on the shelf photos of her and us. I also placed a copy of her memoir. As I did, I knelt down, showed him the cover, said his mum wrote it, and flicked through the pages. He turned pages, too, and repeated my words in his own gorgeous toddler speak, before smiling, then running off to play with his toy cars.

With his mum's writing in our hands, he could learn stories about Gem. On pages she agreed to put out into the world. The touch of paper on my fingers, the sound of pages turning, the 400 grams or so of book weight, seemed more powerful and immersive to me than any screen can hope to replicate, as much as I enjoy reading on computer screens. Especially as I try to connect with my son through words.

In Philip K. Dick's science fiction novel Flow My Tears, The Policeman Said, about a nightmarish, totalitarian world with little empathy, he wrote about characters journeying through grief: "Grief reunites you with what you've lost. It's a merging; you go with the loved thing or person that's going away... You cry, you continue to cry, because you don't ever completely come back from where you went... a fragment broken off your pulsing, pumping heart that is still there."

My son and I are doing our best on this journey of grief with words and memories of Gem, each of us suffering in our own ways, together. Like Dick's characters, we find loss transforms our bodies and souls. But Gem's writing helps us make sense of that loss.

Life goes on for the rest of the world, however, as does the growing use of AI in the lives of readers and writers. Online shoppers can choose from automatically curated lists of books on their favourite topics. BookTok fans find great work through hashtags, audio-clip memes and other personalised streams of videos. Writers, like many others, can use Google's Gemini AI to generate emails, identify grant opportunities and do other tasks to support their creative work.

I am continuing my journey through grief and books with my son, through the algorithms, but offline, too, with that centuries-old, printed-word technology, made with the permission of authors. Other writers and I have been preparing for a performance of readings and music at a café here in Canberra, in person. I will read aloud some of Gem's last published words, and mine. So that, for just a few moments, I can share words that no giant company could have made with technology alone. Words crafted with human ingenuity and care. Words that mean a lot to me. Words that help my family and me endure.

YouTube, Meta, TikTok reveal misinformation tidal wave

The Advertiser

3 days ago



Thousands of misleading videos, scam ads and fake profiles made in Australia have been wiped from online platforms over the past year to address a growing wave of misinformation.

More than 25,000 videos deemed to feature "harmful" fake claims were removed from TikTok and YouTube, reports showed, while unverified and misleading election ads ranked among the most commonly removed content by Meta and Google.

Eight technology companies outlined their actions in transparency reports published on Thursday in accordance with the voluntary Australian Code of Practice on Disinformation and Misinformation. Several tech firms declined to detail their efforts to tackle fraudulent content in Australia, including social media platforms X and Snapchat.

The statistics follow heightened concern about misinformation online after the emergence of generative artificial intelligence tools, and warnings they may be used to create convincing deepfakes and political ads.

US firms including Google, Meta, Twitch, Apple and Microsoft released transparency reports under the industry code, and addressed issues including the identification of misleading claims, safeguards for users, and content removal.

TikTok revealed it removed more than 8.4 million videos from its Australian platform during 2024, including more than 148,000 videos deemed to be inauthentic. Almost 21,000 of the videos violated the company's "harmful misinformation policies" during the year, the report said, and 80 per cent, on average, were removed before users could view them.

Google removed more than 5100 YouTube videos from Australia identified as misleading, its report said, out of more than 748,000 misleading videos removed worldwide. Election advertising also raised red flags for tech platforms in Australia, with Google rejecting more than 42,000 political ads from unverified advertisers and Meta removing more than 95,000 ads for failing to comply with its social issues, elections and politics policies.
Meta purged more than 14,000 ads in Australia for violating misinformation rules, took down 350 posts on Facebook and Instagram for misinformation, and showed warnings on 6.9 million posts based on articles from fact-checking partners. In January, the tech giant announced plans to end fact-checking in the US, and its report said it would "continue to evaluate the applicability of these practices" in Australia.

Striking a balance between allowing content to be shared online and ensuring it would not harm others was a "difficult job", Digital Industry Group code reviewer Shaun Davies said, and the reports showed some companies were using AI tools to flag potential violations.

"I was struck in this year's reports by examples of how generative AI is being leveraged for both the creation and detection of (misinformation) and disinformation," he said. "I'm also heartened that multiple initiatives that make the provenance of AI-generated content more visible to users are starting to bear fruit."

In its report, Microsoft also revealed it had removed more than 1200 users from LinkedIn for sharing misinformation, while Apple identified 2700 valid complaints against 1300 news articles.
The statistics follow heightened concern about misinformation online after the emergence of generative artificial intelligence tools, and warnings they may be used to create convincing deepfakes and political ads. US firms including Google, Meta, Twitch, Apple and Microsoft released transparency reports under the industry code, and addressed issues including the identification of misleading claims, safeguards for users, and content removal. TikTok revealed it removed more than 8.4 million videos from its Australian platform during 2024, including more than 148,000 videos deemed to be inauthentic. Almost 21,000 of the videos violated the company's "harmful misinformation policies" during the year, the report said, and 80 per cent, on average, were removed before users could view them. Google removed more than 5100 YouTube videos from Australia identified as misleading, its report said, out of more than 748,000 misleading videos removed worldwide. Election advertising also raised red flags for tech platforms in Australia, with Google rejecting more than 42,000 political ads from unverified advertisers and Meta removing more than 95,000 ads for failing to comply with its social issues, elections and politics policies. Meta purged more than 14,000 ads in Australia for violating misinformation rules, took down 350 posts on Facebook and Instagram for misinformation, and showed warnings on 6.9 million posts based on articles from fact-checking partners. In January, the tech giant announced plans to end fact-checking in the US and its report said it would "continue to evaluate the applicability of these practices" in Australia. Striking a balance between allowing content to be shared online and ensuring it would not harm others was a "difficult job," Digital Industry Group code reviewer Shaun Davies said, and the reports showed some companies were using AI tools to flag potential violations. 
"I was struck in this year's reports by examples of how generative AI is being leveraged for both the creation and detection of (misinformation) and disinformation," he said. "I'm also heartened that multiple initiatives that make the provenance of AI-generated content more visible to users are starting to bear fruit." In its report, Microsoft also revealed it had removed more than 1200 users from LinkedIn for sharing misinformation, while Apple identified 2700 valid complaints against 1300 news articles. Thousands of misleading videos, scam ads and fake profiles made in Australia have been wiped from online platforms over the past year to address a growing wave of misinformation. More than 25,000 videos deemed to feature "harmful" fake claims were removed from TikTok and YouTube, reports showed, while unverified and misleading election ads ranked among the most commonly removed content by Meta and Google. Eight technology companies outlined their actions in transparency reports published on Thursday in accordance with the voluntary Australian Code of Practice on Disinformation and Misinformation. Several tech firms declined to detail their efforts to tackle fraudulent content in Australia, including social media platforms X and Snapchat. The statistics follow heightened concern about misinformation online after the emergence of generative artificial intelligence tools, and warnings they may be used to create convincing deepfakes and political ads. US firms including Google, Meta, Twitch, Apple and Microsoft released transparency reports under the industry code, and addressed issues including the identification of misleading claims, safeguards for users, and content removal. TikTok revealed it removed more than 8.4 million videos from its Australian platform during 2024, including more than 148,000 videos deemed to be inauthentic. 
Almost 21,000 of the videos violated the company's "harmful misinformation policies" during the year, the report said, and 80 per cent, on average, were removed before users could view them. Google removed more than 5100 YouTube videos from Australia identified as misleading, its report said, out of more than 748,000 misleading videos removed worldwide. Election advertising also raised red flags for tech platforms in Australia, with Google rejecting more than 42,000 political ads from unverified advertisers and Meta removing more than 95,000 ads for failing to comply with its social issues, elections and politics policies. Meta purged more than 14,000 ads in Australia for violating misinformation rules, took down 350 posts on Facebook and Instagram for misinformation, and showed warnings on 6.9 million posts based on articles from fact-checking partners. In January, the tech giant announced plans to end fact-checking in the US and its report said it would "continue to evaluate the applicability of these practices" in Australia. Striking a balance between allowing content to be shared online and ensuring it would not harm others was a "difficult job," Digital Industry Group code reviewer Shaun Davies said, and the reports showed some companies were using AI tools to flag potential violations. "I was struck in this year's reports by examples of how generative AI is being leveraged for both the creation and detection of (misinformation) and disinformation," he said. "I'm also heartened that multiple initiatives that make the provenance of AI-generated content more visible to users are starting to bear fruit." In its report, Microsoft also revealed it had removed more than 1200 users from LinkedIn for sharing misinformation, while Apple identified 2700 valid complaints against 1300 news articles. 
Thousands of misleading videos, scam ads and fake profiles made in Australia have been wiped from online platforms over the past year to address a growing wave of misinformation. More than 25,000 videos deemed to feature "harmful" fake claims were removed from TikTok and YouTube, reports showed, while unverified and misleading election ads ranked among the most commonly removed content by Meta and Google. Eight technology companies outlined their actions in transparency reports published on Thursday in accordance with the voluntary Australian Code of Practice on Disinformation and Misinformation. Several tech firms declined to detail their efforts to tackle fraudulent content in Australia, including social media platforms X and Snapchat. The statistics follow heightened concern about misinformation online after the emergence of generative artificial intelligence tools, and warnings they may be used to create convincing deepfakes and political ads. US firms including Google, Meta, Twitch, Apple and Microsoft released transparency reports under the industry code, and addressed issues including the identification of misleading claims, safeguards for users, and content removal. TikTok revealed it removed more than 8.4 million videos from its Australian platform during 2024, including more than 148,000 videos deemed to be inauthentic. Almost 21,000 of the videos violated the company's "harmful misinformation policies" during the year, the report said, and 80 per cent, on average, were removed before users could view them. Google removed more than 5100 YouTube videos from Australia identified as misleading, its report said, out of more than 748,000 misleading videos removed worldwide. Election advertising also raised red flags for tech platforms in Australia, with Google rejecting more than 42,000 political ads from unverified advertisers and Meta removing more than 95,000 ads for failing to comply with its social issues, elections and politics policies. 
Meta purged more than 14,000 ads in Australia for violating misinformation rules, took down 350 posts on Facebook and Instagram for misinformation, and showed warnings on 6.9 million posts based on articles from fact-checking partners. In January, the tech giant announced plans to end fact-checking in the US and its report said it would "continue to evaluate the applicability of these practices" in Australia. Striking a balance between allowing content to be shared online and ensuring it would not harm others was a "difficult job," Digital Industry Group code reviewer Shaun Davies said, and the reports showed some companies were using AI tools to flag potential violations. "I was struck in this year's reports by examples of how generative AI is being leveraged for both the creation and detection of (misinformation) and disinformation," he said. "I'm also heartened that multiple initiatives that make the provenance of AI-generated content more visible to users are starting to bear fruit." In its report, Microsoft also revealed it had removed more than 1200 users from LinkedIn for sharing misinformation, while Apple identified 2700 valid complaints against 1300 news articles.

YouTube, Meta, TikTok reveal misinformation tidal wave

West Australian

3 days ago



Thousands of misleading videos, scam ads and fake profiles made in Australia have been wiped from online platforms over the past year to address a growing wave of misinformation.

More than 25,000 videos deemed to feature "harmful" fake claims were removed from TikTok and YouTube, reports showed, while unverified and misleading election ads ranked among the most commonly removed content by Meta and Google.

Eight technology companies outlined their actions in transparency reports published on Thursday in accordance with the voluntary Australian Code of Practice on Disinformation and Misinformation. Several tech firms declined to detail their efforts to tackle fraudulent content in Australia, including social media platforms X and Snapchat.

The statistics follow heightened concern about misinformation online after the emergence of generative artificial intelligence tools, and warnings they may be used to create convincing deepfakes and political ads.

US firms including Google, Meta, Twitch, Apple and Microsoft released transparency reports under the industry code, addressing issues including the identification of misleading claims, safeguards for users, and content removal.

TikTok revealed it removed more than 8.4 million videos from its Australian platform during 2024, including more than 148,000 videos deemed to be inauthentic. Almost 21,000 of the videos violated the company's "harmful misinformation policies" during the year, the report said, and 80 per cent, on average, were removed before users could view them.

Google removed more than 5100 YouTube videos from Australia identified as misleading, its report said, out of more than 748,000 misleading videos removed worldwide.

Election advertising also raised red flags for tech platforms in Australia, with Google rejecting more than 42,000 political ads from unverified advertisers and Meta removing more than 95,000 ads for failing to comply with its social issues, elections and politics policies.

Meta purged more than 14,000 ads in Australia for violating misinformation rules, took down 350 posts on Facebook and Instagram for misinformation, and showed warnings on 6.9 million posts based on articles from fact-checking partners. In January, the tech giant announced plans to end fact-checking in the US, and its report said it would "continue to evaluate the applicability of these practices" in Australia.

Striking a balance between allowing content to be shared online and ensuring it would not harm others was a "difficult job," Digital Industry Group code reviewer Shaun Davies said, and the reports showed some companies were using AI tools to flag potential violations.

"I was struck in this year's reports by examples of how generative AI is being leveraged for both the creation and detection of (misinformation) and disinformation," he said. "I'm also heartened that multiple initiatives that make the provenance of AI-generated content more visible to users are starting to bear fruit."

In its report, Microsoft also revealed it had removed more than 1200 users from LinkedIn for sharing misinformation, while Apple identified 2700 valid complaints against 1300 news articles.
