
Devon artists concerned by planned change to AI laws
Artists in Devon say they are concerned that government proposals to change copyright law could make it easier for artificial intelligence (AI) companies to use their work for free.

The government has proposed changes that would allow AI developers to use creators' online content to help develop their models, unless the rights holders elected to "opt out".

The Devon Artist Network said it was "very worried" about the proposals, and Devon illustrator Sarah McIntyre said it would be "disastrous" for her.

A consultation on the plans by the Department for Science, Innovation and Technology (DSIT) has closed, and a spokesperson said no decisions had been taken yet.
Destroying careers
Ms McIntyre, from Bovey Tracey, said she would be affected if the change were to happen.

She said: "We've always owned our work, that's just a part of British law.

"I made this, it's mine and I can earn money from it and if someone else copies it then that's against the law.

"But now they are saying everything we've ever created, all our artwork, we have to go back and opt out of it being used to train AI."

Ms McIntyre has written to Mel Stride, Conservative MP for Central Devon, to ask for help.

Stride said that although AI presented "significant opportunities for innovation and economic growth", the government was "putting the creative industries at risk".

"Labour must press pause on its rushed consultation and rethink its approach to harness the benefits of AI without compromising the success of our creative industries," he said.
Avenda Burnell Walsh, from the Devon Artist Network, said the group was also opposed to the plans.

She said: "My car is parked on the road all night and day in the public domain but I wouldn't expect to have to say to somebody legally you can't have this car it's mine.

"You shouldn't have to say that about your art either, should you?"
However, some argue artists might benefit from the potential changes.

Mike Phillips, a professor of interdisciplinary arts at the University of Plymouth, said artists could use AI to track down copyright abuses.

He said: "It would be nice if some of the effort put into ripping stuff off was put into tracking stuff down.

"That is something AI is good at, identifying things and recognising patterns in things and so maybe that would allow artists to use AI and seek the benefits from it."
DSIT said it would consider the submissions the creative industries made during the consultation.

It said the UK's "current regime for copyright and AI is holding back the creative industries, media and AI sector from realising their full potential - and that cannot continue".

"That's why we have been consulting on a new approach."

It added: "No decisions have been taken."
Related Articles


The Herald Scotland
an hour ago
Immigration warning over 'less than welcoming' statements
The tone of Sir Keir's remarks on May 12 was, as observed by Mr Sheerin and many others, surely something of a surprise. And it was unexpected even with an awareness - having covered this key issue closely over months and years - of Labour's developing and lamentable stance on immigration.

The Prime Minister declared: 'Nations depend on rules – fair rules. Sometimes they're written down, often they're not, but either way, they give shape to our values. They guide us towards our rights, of course, but also our responsibilities, the obligations we owe to one another. Now, in a diverse nation like ours, and I celebrate that, these rules become even more important. Without them, we risk becoming an island of strangers, not a nation that walks forward together.'

The 'island of strangers' was a striking turn of phrase. Sir Keir went on: 'So when you have an immigration system that seems almost designed to permit abuse, that encourages some businesses to bring in lower-paid workers rather than invest in our young people, or simply one that is sold by politicians to the British people on an entirely false premise, then you're not championing growth, you're not championing justice, or however else people defend the status quo. You're actually contributing to the forces that are slowly pulling our country apart.'

Maybe with the benefit of hindsight the Prime Minister's remarks, even though they could have been uttered just as easily by the Tory Brexiters, should not have been quite so much of a shock as they were. After all, Labour has embraced the key elements of the Conservatives' hard Brexit: loss of free movement of people between the UK and European Economic Area nations and the ending of the frictionless trade from which the country previously benefited enormously when it was part of the single market.

Nevertheless, Sir Keir's tone was surely surprisingly dismal, even given all of this. Not only did we have the reference to 'an island of strangers' but also this declaration: 'This strategy will finally take back control of our borders and close the book on a squalid chapter for our politics, our economy, and our country.'

What seemed clear from Sir Keir's utterings was that populism most certainly did not end with the exit of Boris Johnson or Rishi Sunak from the post of prime minister. Sir Keir's tone contrasted so starkly with Mr Sheerin's reasoned appraisal of the Prime Minister's remarks and Labour's plans on immigration.

We had this from Sir Keir: 'We do have to ask why parts of our economy seem almost addicted to importing cheap labour rather than investing in the skills of people who are here and want a good job in their community. Sectors like engineering, where visas have rocketed while apprenticeships have plummeted.'

You would imagine Mr Sheerin, as a veteran of the engineering sector, knows a lot more about the specifics than Sir Keir. And it is worth observing the Scottish Engineering chief executive is passionate about people in Scotland and elsewhere in the UK being trained as engineers. He would love to see the skills shortages which are posing such a challenge to member companies of Scottish Engineering and others in the sector solved. Mr Sheerin is not a politician - just someone with deep knowledge of the Scottish engineering sector.

So what did the Scottish Engineering chief have to say in his quarterly report published on Friday?
He declared that he found the UK Government's 'latest pronouncements on immigration disappointing', highlighting the detrimental impact on companies of 'statements that feel less than welcoming'.

Mr Sheerin hammered home his view that raising minimum qualification levels from Higher equivalents to degree level would 'leave out the skilled trades and crafts roles where we are already in shortest supply: welders, fabricators, electricians, pipefitters, CNC (computer numerical control) machinists to name a few'. That is surely a crucial point. And it is worth emphasising Mr Sheerin's observation that people skilled in these roles are 'already in shortest supply'.

Mr Sheerin also noted: 'The shortening of the graduate visa scheme reducing the right to work from two years to 18 months after graduating will not only hit our education sector but also reduce the attractiveness of the scheme for companies who will have a shorter timeline to decide whether to invest in the process to extend the visa of the employee.' This is another good point.

And the Scottish Engineering chief executive declared: 'Whilst I recognise that this [immigration] is a contentious political issue across the UK for a whole range of reasons, in engineering and manufacturing in Scotland the reality is that immigration is a vital source of skills and experience that cannot be replaced overnight. These skills levels take years to build - and we should be building them - but closing off the supply before putting in place the actions to do that is another example of an action that will challenge the stated ambition of growing our economy.'

The time horizon with regard to building skills levels is important. It might not chime with that of politicians such as Sir Keir, who seems at pains to bang the drum on immigration as Nigel Farage's Reform UK makes a big noise on this front. However, it is a simple factual point that engineering skills do take years to build.

Mr Sheerin declared that a frustration for him in Labour's immigration pronouncements was that 'whereas there is considerable detail on how we plan to restrict and close this supply of skills, on the laudable stated aim that we will replace the loss with trained or upskilled UK-born workers, the detail is missing on how that will be achieved'. He added: 'And there is no detail that recognises that engineering skills take between four and six years to get to a starting level of competency. It does not seem an unreasonable request for the get-well plan to carry at least the same level of detail as the take-it-away plan.'

This seems like an absolutely fair summation of the problems with Labour's populist immigration proposals. If you were asked to choose whether you think it is Sir Keir or Mr Sheerin who is on the money in relation to immigration policy and its effect on engineering and the broader economy, it would surely be the easiest of questions to answer, any day of the week.


The Herald Scotland
an hour ago
New risks emerge as America becomes less attractive
For decades, international investors have treated US government bonds as the safest place for their money. A long bull market in shares has been supported by American bonds and a sound US dollar. Since the global financial crisis of 2008, the underpinning of a 'safe haven' has helped stock markets to cope with other uncertainties.

Now, investors are demanding much higher returns to lend money to the US government long-term. America is becoming less attractive to global investors at a time when its government needs them for finance more than ever.

There is plenty to be nervous about. The US government is spending far more than it takes in, with the deficit up this year. Trump's spending and tax cut plans are likely to add to the US national debt over the next decade. And the US dollar has fallen to its lowest level in almost three years.

US business confidence is weak, with the full impact of the supply turmoil yet to bite. Many manufacturers had stockpiled goods and components ahead of Trump's tariffs and import controls, but this buffer will soon be exhausted.

May's stock market rally might seem reassuring on the surface. Major US technology companies like Apple, Microsoft, Amazon, and Nvidia delivered strong earnings and drove most of the market gains. The biggest seven tech companies alone were responsible for more than half of the US stock market's rise in May. But these trading results do not yet reflect the full impact of the trade war and supply changes. Analysts expect slower earnings growth for these businesses over the next year. Trump still plans further action, and the tariffs to date will produce significant adverse effects: higher consumer prices, lower business investment and lower economic growth.

Perhaps most worrying for investors is the inflation risk building up worldwide. As global tensions rise, governments will spend more on defence, with limited scope for tax increases. Business costs will also increase, as trade disputes continue to disrupt how goods move around the world. Global borrowing costs could force central banks to keep interest rates higher for longer. The Governor of the Bank of England has warned that interest rate cuts are now more uncertain.

There are signs that the tension between governments that want to spend more and nervous international lenders is also playing out in the UK, EU and Japan. British government bonds – gilts – are already seeing pressure as investors become more choosy about lending to governments anywhere. The OECD report this month warned that weak consumer confidence and fragile public finances leave the UK vulnerable to shocks. Appeasing lenders by cutting spending or raising taxes would hit economic growth.

The end of US exceptionalism, linked to the declining role of the US dollar as a reserve currency, may be a gradual process as it was for the UK. There is still growth in many major US businesses and the US stock market is by far the most liquid globally. Shares have a record of coping better than bonds with rising inflation and there is value in stock markets outside the US.

But we may be seeing the end of an era when investors could pay less attention to currency movements. And, although government bonds have a role in diversifying portfolios along with a spread of investments internationally, it is harder now to escape geopolitical risks. The recent stock market rebound may give an opportunity to rebalance portfolios.

Colin McLean is a director of Barnton Capital


Daily Mail
4 hours ago
Lawyers warned to stop using ChatGPT to argue lawsuits after AI programs 'made up fictitious cases'
Lawyers in England and Wales have been warned they could face 'severe sanctions', including potential criminal prosecution, if they present false material generated by AI in court.

The ruling, by one of Britain's most senior judges, comes on the back of a string of cases in which artificial intelligence software has produced fictitious legal cases and completely invented quotes.

The first case saw AI fabricate 'inaccurate and fictitious' material in a lawsuit brought against two banks, The New York Times reported. Meanwhile, the second involved a lawyer for a man suing his local council who was unable to explain the origin of the nonexistent precedents in his legal argument.

While large language models (LLMs) like OpenAI's ChatGPT and Google's Gemini are capable of producing long accurate-sounding texts, they are technically only focused on producing a 'statistically plausible' reply. The programs are also prone to what researchers call 'hallucinations' - outputs that are misleading or lack any factual basis.

AI Agent and Assistance platform Vectera has monitored the accuracy of AI chatbots since 2023 and found that the top programs hallucinate between 0.7 per cent and 2.2 per cent of the time - with others dramatically higher. However, those figures become astronomically higher when the chatbots are prompted to produce longer texts from scratch, with market leader OpenAI recently acknowledging that its flagship ChatGPT system hallucinates between 51 per cent and 79 per cent of the time if asked open-ended questions.

Dame Victoria Sharp, president of the King's Bench Division of the High Court, and Justice Jeremy Johnson KC, authored the new ruling.

In it they say: 'The referrals arise out of the actual or suspected use by lawyers of generative artificial intelligence tools to produce written legal arguments or witness statements which are not then checked, so that false information (typically a fake citation or quotation) is put before the court.

'The facts of these cases raise concerns about the competence and conduct of the individual lawyers who have been referred to this court.

'They raise broader areas of concern however as to the adequacy of the training, supervision and regulation of those who practice before the courts, and as to the practical steps taken by those with responsibilities in those areas to ensure that lawyers who conduct litigation understand and comply with their professional and ethical responsibilities and their duties to the court.'

The pair argued that existing guidance around AI was 'insufficient to address the misuse of artificial intelligence'.

Judge Sharp wrote: 'There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused.' While acknowledging that AI remained a 'powerful technology' with legitimate use cases, she nevertheless reiterated that the technology brought 'risks as well as opportunities'.

In the first case cited in the judgment, a British man sought millions in damages from two banks. The court discovered that 18 out of 45 citations included in the legal arguments featured past cases that simply did not exist.
Even in instances in which the cases did exist, often the quotations were inaccurate or did not support the legal argument being presented.

The second case, which dates to May 2023, involved a man who was turned down for emergency accommodation by the local authority and ultimately became homeless. His legal team cited five past cases, which the opposing lawyers discovered simply did not exist - tipped off by the US spellings and formulaic prose style.

Rapid improvements in AI systems mean their use is becoming a global issue in the field of law, as the judicial sector figures out how to incorporate artificial intelligence into what is frequently a very traditional, rules-bound work environment.

Earlier this year a New York lawyer faced disciplinary proceedings after being caught using ChatGPT for research and citing a non-existent case in a medical malpractice lawsuit. Attorney Jae Lee was referred to the grievance panel of the 2nd U.S. Circuit Court of Appeals in February 2025 after she cited a fabricated case about a Queens doctor botching an abortion in an appeal to revive her client's lawsuit. The case did not exist, having been conjured up by OpenAI's ChatGPT, and the appeal was dismissed.

The court ordered Lee to submit a copy of the cited decision after it was not able to find the case. She responded that she was 'unable to furnish a copy of the decision.' Lee said she had included a case 'suggested' by ChatGPT but that there was 'no bad faith, willfulness, or prejudice towards the opposing party or the judicial system' in doing so. The conduct 'falls well below the basic obligations of counsel,' a three-judge panel for the Manhattan-based appeals court wrote.

In June two New York lawyers were fined $5,000 after they relied on fake research created by ChatGPT for a submission in an injury claim against Avianca airline. Judge Kevin Castel said attorneys Steven Schwartz and Peter LoDuca acted in bad faith by using the AI bot's submissions - some of which contained 'gibberish' - even after judicial orders questioned their authenticity.