The gulf is not the place to build the world's AI infrastructure

Washington Post · 3 days ago
Christopher S. Chivvis is director of the American Statecraft Program at the Carnegie Endowment for International Peace. Sam Winter-Levy is a fellow in the Technology and International Affairs Program at the Carnegie Endowment for International Peace.
In May, the Trump administration green-lit one of the most consequential technology investments of the decade: the construction of massive artificial intelligence data centers in the United Arab Emirates and Saudi Arabia. These facilities, funded by gulf sovereign wealth and built with U.S. technology, are projected to host some of the largest and most powerful computing clusters in the world — critical infrastructure for training and deploying advanced AI models.

Related Articles

PM signals UK will help Gaza aid airdrop amid calls for Palestinian statehood

Yahoo · 26 minutes ago

Sir Keir Starmer has suggested the UK will play a role in dropping aid into Gaza by air, as he faces calls from 221 cross-party MPs to recognise a Palestinian state.

Israel said on Friday it will allow airdrops of aid by foreign countries into Gaza to alleviate starvation in the Palestinian territory. The Prime Minister said the UK will 'do everything we can to get aid in via this route'.

Sir Keir meanwhile faces growing calls to recognise a Palestinian state immediately, amid mounting global anger over the starving population in Gaza. Some 221 MPs from Labour, the Conservatives, Liberal Democrats, SNP, Greens, Plaid Cymru, SDLP and independents have signed a letter calling on the Government to take the step at a UN meeting next week.

France's president Emmanuel Macron announced his nation would formally recognise Palestine at the UN General Assembly in September, leading UK politicians to question whether the British Government would follow suit. US President Donald Trump suggested Mr Macron's announcement 'doesn't matter' as he left America for a visit to Scotland.

But Sarah Champion, the senior Labour MP who organised the letter by parliamentarians, said recognition 'would send a powerful symbolic message that we support the rights of the Palestinian people'.

Other senior Commons figures who signed the letter include Labour select committee chairs Liam Byrne, Dame Emily Thornberry and Ruth Cadbury. Lib Dem leader Sir Ed Davey, as well as Tory former minister Kit Malthouse and Sir Edward Leigh – Parliament's longest-serving MP – also signed it. The majority of those who have signed, 131, are Labour MPs.

In a video statement released on Friday, Sir Keir made plain his desire for a ceasefire in the war. He said: 'I know the British people are sickened by what is happening. The images of starvation and desperation are utterly horrifying.

'The denial of aid to children and babies is completely unjustifiable, just as the continued captivity of hostages is completely unjustifiable.'

Signalling the UK is willing to help get aid into Gaza via air, the Prime Minister added: 'News that Israel will allow countries to airdrop aid into Gaza has come far too late, but we will do everything we can to get aid in via this route.

'We are already working urgently with the Jordanian authorities to get British aid on to planes and into Gaza.'

Children who need specialist medical treatment will be evacuated from Gaza to the UK, Sir Keir added. The Prime Minister also called for an international coalition to 'end the suffering' in Gaza, similar to the coalition of the willing aimed at helping Ukraine.

Sir Keir had earlier responded to calls for the recognition of a Palestinian state, insisting such a move needed to be part of the 'pathway' to peace in the Middle East, which he and allies are working towards. He added: 'Recognition of a Palestinian state has to be one of those steps. I am unequivocal about that. But it must be part of a wider plan which ultimately results in a two-state solution and lasting security for Palestinians and Israelis.'

In a statement released on Friday alongside the leaders of France and Germany, the Prime Minister urged Israel to stop restricting the flow of aid into Gaza. Charities operating in Gaza have said Israel's blockade and ongoing military offensive are pushing people there towards starvation, warning that they are seeing their own workers and Palestinians 'waste away'.

The Prime Minister will meet the US president during his trip to Scotland, where he arrived on Friday evening. US-led peace talks in Qatar were cut short on Thursday, with Washington's special envoy Steve Witkoff accusing Hamas of a 'lack of desire to reach a ceasefire'.

The deal under discussion is expected to include a 60-day ceasefire in which Hamas would release 10 living hostages and the remains of 18 others in phases in exchange for Palestinians imprisoned by Israel. Aid supplies would be ramped up and the two sides would hold negotiations on a lasting truce.

Think your ChatGPT therapy sessions are private? Think again.

Fast Company · 27 minutes ago

If you've been confessing your deepest secrets to an AI chatbot, it might be time to reevaluate. With more people turning to AI for instant life coaching, tools like ChatGPT are sucking up massive amounts of personal information on their users. While that data stays private under ideal circumstances, it could be dredged up in court – a scenario that OpenAI CEO Sam Altman warned users about in an appearance on Theo Von's popular podcast this week.

'One example that we've been thinking about a lot… people talk about the most personal shit in their lives to ChatGPT,' Altman said. 'Young people especially use it as a therapist, as a life coach: 'I'm having these relationship problems, what should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it, there's doctor-patient confidentiality, there's legal confidentiality.'

Altman said that as a society we 'haven't figured that out yet' for ChatGPT. He called for a policy framework for AI, though in reality OpenAI and its peers have lobbied for a light regulatory touch.

'If you go talk to ChatGPT about your most sensitive stuff and then there's a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up,' Altman told Von, arguing that AI conversations should be treated with the same level of privacy as a chat with a therapist.

While interactions with doctors and therapists are protected by federal privacy laws in the U.S., exceptions exist for instances in which someone is a threat to themselves or others. And even with those strong privacy protections, relevant medical information can be surfaced by court order, subpoena or warrant.

Altman's argument seems to be that, from a regulatory perspective, ChatGPT shares more in common with licensed, trained specialists than it does with a search engine. 'I think we should have the same concept of privacy for your conversations with AI that we do with a therapist,' he said.

Altman also expressed concerns about how AI will adversely impact mental health, even as people seek its advice in lieu of the real thing.

'Another thing I'm afraid of… is just what this is going to mean for users' mental health. There's a lot of people that talk to ChatGPT all day long,' Altman said. 'There are these new AI companions that people talk to like they would a girlfriend or boyfriend.

'I don't think we know yet the ways in which [AI] is going to have those negative impacts, but I feel for sure it's going to have some, and we'll have to, I hope, we can learn to mitigate it quickly.'

Even OpenAI's CEO Says Be Careful What You Share With ChatGPT

CNET · 27 minutes ago

Maybe don't spill your deepest, darkest secrets to an AI chatbot. You don't have to take my word for it. Take it from the guy behind the most popular generative AI model on the market.

Sam Altman, the CEO of ChatGPT maker OpenAI, raised the issue this week in an interview with host Theo Von on the This Past Weekend podcast. He suggested that your conversations with AI should have similar protections as those you have with your doctor or lawyer. At one point, Von said one reason he was hesitant to use some AI tools was that he "didn't know who's going to have" his personal information. "I think that makes sense," Altman said, "to really want the privacy clarity before you use it a lot, the legal clarity."

More and more AI users are treating chatbots like their therapists, doctors or lawyers, and that's created a serious privacy problem for them. There are no confidentiality rules, and the actual mechanics of what happens to those conversations are startlingly unclear. Of course, there are other problems with using AI as a therapist or confidant, like how bots can give terrible advice or how they can reinforce stereotypes or stigma. (My colleague Nelson Aguilar has compiled a list of the 11 things you should never do with ChatGPT and why.)

Altman's clearly aware of the issues here, and seems at least a bit troubled by them. "People use it, young people especially, use it as a therapist, a life coach, I'm having these relationship problems, what should I do?" he said. "Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it."

The question came up during a part of the conversation about whether there should be more rules or regulations around AI. Rules that stifle AI companies and the tech's development are unlikely to gain favor in Washington these days, as President Donald Trump's AI Action Plan released this week expressed a desire to regulate the technology less, not more. But rules to protect users might find favor.

Read more: AI Essentials: 29 Ways You Can Make Gen AI Work for You, According to Our Experts

Altman seemed most worried about a lack of legal protections for companies like his to keep them from being forced to turn over private conversations in lawsuits. OpenAI has objected to requests to retain user conversations during a lawsuit with the New York Times over copyright infringement and intellectual property issues. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

"If you go talk to ChatGPT about the most sensitive stuff and then there's a lawsuit or whatever, we could be required to produce that," Altman said. "I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that you do with your therapist or whatever."

Be careful what you tell AI about yourself

For you, the issue isn't so much that OpenAI might have to turn your conversations over in a lawsuit. It's a question of whom you trust with your secrets.

William Agnew, a researcher at Carnegie Mellon University who was part of a team that evaluated chatbots on their performance dealing with therapy-like questions, told me recently that privacy is a paramount issue when confiding in AI tools. The uncertainty around how models work -- and how your conversations are kept from appearing in other people's chats -- is reason enough to be hesitant.

"Even if these companies are trying to be careful with your data, these models are well known to regurgitate information," Agnew said. If ChatGPT or another tool regurgitates information from your therapy session or from medical questions you asked, that could appear if your insurance company or someone else with an interest in your personal life asks the same tool about you.

"People should really think about privacy more and just know that almost everything they tell these chatbots is not private," Agnew said. "It will be used in all sorts of ways."
