
Why are we giving AI the best jobs?
Did you hear about the Sydney radio station that used an AI-generated bot to host a show for six months?
That's not a set-up for a joke, but I wish it was.
'If your day is looking a bit bleh, let Thy and CADA be the energy and vibe to get your mood lifted,' radio station CADA urged listeners. Turns out Thy was ONLY 'energy and vibe' because she doesn't actually exist in the physical sense. Well, she kind of exists. The station cloned the voice of a woman who works on their finance team. The real Thy is sitting at a desk, doing the accounts. Her robot clone is spinning tracks. Surely there is something very wrong when we assign the robots the fun jobs.
AI is going to change the world. We can't stop it now — it would be like putting the toothpaste back in the tube. But before things go too much further, perhaps we need to have a bit of a think about which jobs we want it to take over. While we humans are still in charge, surely it's not too late to steer things in the right direction.
Because radio host is a cool job. Lots of people want that gig. Why let a robot do it? We need to give the robots jobs that no one wants. Here are some roles in society I think AI could take over and leave humanity the richer for it.
Royalty: There isn't much good about being a royal. You are either destined to live a life of stoic misery in a gilded cage or you attempt to break free and suffer an outpouring of hatred towards your family. We need to get rid of the royals. Not with a guillotine. Just stick them in a semi-detached somewhere in middle England and let them get jobs at Tesco, solve murders in quaint villages, dance around the maypole or whatever it is people do over there. We can fill the seemingly insatiable need for royal family gossip with an AI-generated monarchy. A kindly king who keeps bees. A beautiful princess with an expensively dull wardrobe. A prince who has weird but endearing hobbies. Of course, to keep things juicy we will need to generate a low-level scandal every six to 12 months. But really, an AI-generated royal family seems much more humane than what we currently have.
Declutterers: My apologies to anyone who makes their living out of going to people's houses and asking them if all five spatulas in the second drawer down spark joy, but I really feel this is an opening for AI. Because when you are a robot, nothing sparks joy! They can chuck it all out. Forget Swedish Death Cleaning; this is Not Even Truly Alive Minimalism. And when they are done, they can program your phone to block Temu.
Child stars: If you have watched the recent exploits of former child stars JoJo Siwa and Justin Bieber, it's difficult to avoid the sense you are witnessing a slowly unfolding car crash. People just shouldn't be that famous before their prefrontal cortex has fully developed. Next time the world needs a precocious nine-year-old pop star, just get AI on the job. That way they can stay nine forever. They can pump out cutesy pop hits for several generations of fans, without ever having to grow up into slightly damaged adults.
Used correctly, AI could do great things. Just leave the energy and vibes to those of us with a pulse.
