Latest news with #NickClegg
Yahoo
3 days ago
- Entertainment
- Yahoo
The AI copyright standoff continues - with no solution in sight
The fierce battle over artificial intelligence (AI) and copyright - which pits the government against some of the biggest names in the creative industry - returns to the House of Lords on Monday with little sign of a solution in sight. A huge row has kicked off between ministers and peers who back the artists, and it shows no sign of abating. It might be about AI, but at its heart are very human issues: jobs and creativity.

It's highly unusual that neither side has backed down by now or shown any sign of compromise; if anything, support for those opposing the government is growing rather than tailing off. This is "unchartered territory", one source in the peers' camp told me.

The argument is over how best to balance the demands of two huge industries: the tech and creative sectors. More specifically, it's about the fairest way to allow AI developers access to creative content in order to make better AI tools - without undermining the livelihoods of the people who make that content in the first place.

What's sparked it is the uninspiringly titled Data (Use and Access) Bill. This proposed legislation was broadly expected to finish its long journey through parliament this week and sail off into the law books. Instead, it is currently stuck in limbo, ping-ponging between the House of Lords and the House of Commons.

The bill states that AI developers should have access to all content unless its individual owners choose to opt out. Nearly 300 members of the House of Lords disagree. They think AI firms should be forced to disclose which copyrighted material they use to train their tools, with a view to licensing it.

Sir Nick Clegg, former president of global affairs at Meta, is among those broadly supportive of the bill, arguing that asking permission from all copyright holders would "kill the AI industry in this country".

Those against include Baroness Beeban Kidron, a crossbench peer and former film director, best known for making films such as Bridget Jones: The Edge of Reason. She says ministers would be "knowingly throwing UK designers, artists, authors, musicians, media and nascent AI companies under the bus" if they don't move to protect their output from what she describes as "state sanctioned theft" from a UK industry worth £124bn. She is asking for an amendment to the bill under which Technology Secretary Peter Kyle would report to the House of Commons on the impact of the new law on the creative industries, three months after it comes into force, if the bill doesn't change.

Mr Kyle also appears to have changed his views about UK copyright law. He said copyright law was once "very certain", but is now "not fit for purpose". Perhaps to an extent both those things are true. The Department for Science, Innovation and Technology says it is carrying out a wider consultation on these issues and will not consider changes to the bill unless it is completely satisfied that they work for creators.

If the "ping pong" between the two Houses continues, there's a small chance the entire bill could be shelved; I'm told it's unlikely but not impossible. If it is, some other important elements would go along with it, simply because they are part of the same bill. It also includes proposed rules on the rights of bereaved parents to access their children's data if they die, changes to allow NHS trusts to share patient data more easily, and even a 3D underground map of the UK's pipes and cables, aimed at improving the efficiency of roadworks (I told you it was a big bill).
There is no easy answer. Here's how it all started.

Before AI exploded into our lives, AI developers scraped enormous quantities of content from the internet, arguing that it was in the public domain already and therefore freely available. We are talking about big, mainly US, tech firms doing the scraping, and not paying for anything they hoovered up. They then used that data to train the same AI tools now used by millions to write copy and create pictures and videos in seconds.

These tools can also mimic popular musicians, writers and artists. For example, a recent viral trend saw people merrily sharing AI images generated in the style of the Japanese animation firm Studio Ghibli. The founder of that studio, meanwhile, had once described the use of AI in animation as "an insult to life itself". Needless to say, he was not a fan.

There has been a massive backlash from many content creators and owners, including household names like Sir Elton John, Sir Paul McCartney and Dua Lipa. They have argued that taking their work in this way, without consent, credit or payment, amounts to theft, and that artists are now losing work because AI tools can churn out similar content freely and quickly instead. Sir Elton John didn't hold back in a recent interview with the BBC's Laura Kuenssberg. He argued that the government was on course to "rob young people of their legacy and their income", and described the current administration as "absolute losers".

Others, though, point out that material made by the likes of Sir Elton is available worldwide, and that if you make it too hard for AI companies to access it in the UK, they'll simply do it elsewhere instead, taking much needed investment and job opportunities with them.

Two opposing positions, no obvious compromise.


Spectator
29-05-2025
- Politics
- Spectator
National Liberal Club distances itself from Farage
Egad! Uproar in clubland. The reason? Nigel Farage. Yes, it seems that the veteran Brexiteer is still capable of causing a fuss among t'great and t'good – even when he is pledging to, er, lift the two-child benefit cap. The Reform UK leader gave a big speech on Tuesday in Whitehall, talking about his party's plans for welfare reform. His choice of venue was the Royal Horseguards Hotel – the construction of which, in the 1880s, involved an elaborate pyramid scheme of fraud. Insert your own jokes here…

Unfortunately, the hotel's site also encompasses the National Liberal Club, the haunt of choice for that rarest of all breeds: senior Liberal Democrats. The likes of Nick Clegg and Tim Farron have made speeches here, while Charles Kennedy's infamous Newsnight interview was filmed in the club smoking room. At least 25 MPs have snaffled free memberships since the last election too. And it seems that the right-on Lib Dems are not too happy at lengthy journalistic reports linking


Int'l Business Times
28-05-2025
- Entertainment
- Int'l Business Times
Former Meta Executive Declares Asking Artists for Permission to Train AI Would 'Kill' the Industry Almost Immediately
Meta's former president of global affairs has said that having to ask artists for permission to use their content to train AI is "implausible" and would be detrimental to the industry. Nick Clegg, who worked at Meta for almost seven years, was asked for his views on copyright law and artificial intelligence while speaking to members of parliament on Thursday.

"I think the creative community wants to go a step further," Clegg said, according to The Times. "Quite a lot of voices say, 'You can only train on my content, [if you] first ask'. And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."

"I just don't know how you go around, asking everyone first. I just don't see how that would work," Clegg said. "And by the way, if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight."

Clegg made these statements amid discussion of a potential amendment to the Data (Use and Access) Bill. If passed, the amendment would require technology companies to disclose the copyrighted works they used to train their AI.

Earlier this month, hundreds of creatives, including Paul McCartney, Dua Lipa, Ian McKellen and Elton John, signed an open letter supporting the amendment to the Data Bill and urging the government to ensure that AI companies credit the copyrighted work they use. "We will lose an immense growth opportunity if we give our work away at the behest of a handful of powerful overseas tech companies, and with it our future income, the UK's position as a creative powerhouse, and any hope that the technology of daily life will embody the values and laws of the United Kingdom," the letter read, according to The Guardian.

"I think people should have clear, easy to use ways of saying, no, I don't. I want out of this. But I think expecting the industry, technologically or otherwise, to preemptively ask before they even start training — I just don't see. I'm afraid that just collides with the physics of the technology itself," said Clegg.

Originally published on Latin Times


Phone Arena
28-05-2025
- Business
- Phone Arena
The UK is about to choose between AI and copyright — and the stakes are massive
AI is on every tech company's mind right now. It's becoming bigger and bigger, and with that growth comes criticism. The UK Parliament is grappling with AI and the use of customer data and copyright, while an ex-Meta executive claims that if AI companies had to ask permission to use copyrighted material, the technology would become unworkable.

The UK Parliament is debating the Data (Use and Access) Bill, legislation to regulate access to user and customer data. As you can imagine, the bill could have a huge impact on the technology sector, and more specifically on AI companies, which need to collect vast amounts of human-made content to train chatbots.

Former UK Deputy Prime Minister Nick Clegg now says that AI companies shouldn't need to seek permission to use copyright-protected data. It's important to note that Clegg previously served as a Meta executive. During an event to promote his book "How to Save the Internet", he took the side of the AI industry on the issue. Clegg said that forcing tech firms to comply with copyright laws and notify rights holders when protected content is used to train AI would be the end of the UK's AI industry. He argues that the content is already publicly available, and that AI systems need a great deal of data to improve their reasoning.

According to him, copyright laws are incompatible with AI. If companies were forced to seek permission every time they trained a model, he claims, the entire technology would become unworkable. He believes that artists and rights holders should be able to opt out of data scraping for AI, but seeking consent up front isn't, to him, a viable solution. People, the former Meta executive said, should have a clear and easy way of saying they don't want to be part of AI training.

Film director Beeban Kidron is leading a coalition of artists and authors pushing to amend the law so that AI companies must disclose the data they use for their models... but Parliament rejected the proposal. Kidron has accused the UK government of approving a plan to facilitate mass cultural theft: the accusation is that UK authorities are letting AI companies use copyrighted materials freely. She also underlines that opting out would be impossible without actual, proper transparency, a point I personally agree with.

The draft is expected to return to the House of Lords for a new vote on June 2.


Yahoo
27-05-2025
- Business
- Yahoo
Legendary Facebook Exec Scoffs, Says AI Could Never Be Profitable If Tech Companies Had to Ask for Artists' Consent to Ingest Their Work
Fresh on the heels from his exit from Meta, former Facebook executive Nick Clegg is defending artificial intelligence against copyright holders who want to hold the industry accountable. As the Times of London reports, Clegg insisted during an arts festival last weekend that it's "implausible" to ask tech companies to ask for consent from creators before using their work to train their AI models. During a speech at the Charleston Festival in East Sussex — which was, ironically enough, meant to promote his new book titled "How To Save The Internet" — Meta's former global affairs vice president initially said that it was "not unreasonable" that artists may want to "opt out of having their creativity, their products, what they've worked on indefinitely modeled." But he then went on to suggest that those same artists are getting greedy. "I think the creative community wants to go a step further," Clegg then charged. "Quite a lot of voices say 'you can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data." "I just don't know how you go around, asking everyone first," Clegg said during a speech to promote his new book, ironically titled "How to Save The Internet," that took place at this year's Charleston Festival in East Sussex, England. "I just don't see how that would work." The former deputy prime minister then added that if AI companies were required only in Britain to gain permission to use copyright holders' works, "you would basically kill the AI industry in this country overnight." Clegg's comments came amid a fiery debate in England about AI and copyright, spurred on by a recent Parliament vote on an amendment to the UK government's data bill, which would have required companies to tell copyright holders when their work was used had it not been struck down in the House of Commons last week. His stance also put him in opposition to Paul McCartney, Elton John, Dua Lipa, and hundreds of other artists who called on the British government to "protect copyright in the age of AI," as Sir Elton put it in an Instagram post. Unfortunately, it seems that Parliament's lower house agreed with Clegg's sentiments and not the artists' — but history will show who was on which side of the AI wars. More on AI and copyright: Meta Says It's Okay to Feed Copyrighted Books Into Its AI Model Because They Have No "Economic Value"