
Latest news with #Where'sDaddy?

Brian Eno Challenges Microsoft on Israel Partnership, Pledges Windows 95 Chime Fee to Gaza Victims

Yahoo

21-05-2025

  • Politics
  • Yahoo


Brian Eno is leveraging his historical ties to Microsoft to urge the company to cease providing AI technology and cloud services to the Israeli Ministry of Defense. In a significant further step, he has also pledged to donate the fees he received for composing the Windows 95 startup sound to victims of Israel's attacks on Gaza.

"Today, I'm compelled to speak, not as a composer this time, but as a citizen alarmed by the role Microsoft is playing in a very different kind of composition: one that leads to surveillance, violence, and destruction in Palestine," Eno wrote in an open letter posted to Instagram on Wednesday, May 21st. "In a blog post dated May 15, 2025, Microsoft acknowledged that it provides Israel's Ministry of Defense with 'software, professional services, Azure cloud services and Azure AI services, including language translation.'"

Eno continued, "These 'services' support a regime that is engaged in actions described by leading legal scholars and human rights organisations, United Nations experts, and increasing numbers of governments from around the world, as genocidal. The collaboration between Microsoft and the Israeli government and army is no secret and involves the company's software being used in lethal technologies with 'funny' names like 'Where's Daddy?' (guidance systems for tracking Palestinians in order to blow them up in their homes)."

"Selling and facilitating advanced AI and cloud services to a government engaged in systematic ethnic cleansing is not 'business as usual.' It is complicity," he added, pointing out that corporations like Microsoft "often command more influence" than governments. "I believe that with such a power comes an absolute ethical responsibility. Accordingly, I call on Microsoft to suspend all services that support any operations that contribute to violations of international law."

Turning his attention to the "brave" Microsoft workers who have "refused to stay silent" about the partnership, Eno praised them for "risking their livelihoods for people who have lost and will continue to lose their lives. I invite artists, technologists, musicians, and all people of conscience to join me in this call."

After pledging the fee he originally received for composing the Windows 95 chime "towards helping the victims of the attacks on Gaza," Eno ended his open letter by proclaiming, "If a sound can signal a real change then let it be this one."

Microsoft acknowledged its partnership with the Israeli military several months after The Associated Press published a report detailing how the company's Azure cloud servers and AI technology were being used to select targets in Israel's attacks on Gaza and Lebanon. In an unsigned blog post published on May 15th, Microsoft stated it had found "no evidence to date" that its services have been used to "target or harm people in the conflict in Gaza."

In early April, Eno's Windows 95 chime was inducted into the Library of Congress' National Recording Registry as a "culturally, historically, or aesthetically significant" piece of recorded music. Now, it has potentially taken on even more profound historical significance.

AI-fuelled warfare is the terrifying reality that the tech sector refuses to discuss

Middle East Eye

10-02-2025

  • Politics
  • Middle East Eye


As global leaders, policymakers and tech innovators convene at the AI Action Summit in Paris on Monday, a glaring omission in its agenda raises concerns: the lack of any meaningful dialogue on the militarisation of artificial intelligence (AI). This oversight is particularly alarming given recent revelations about the involvement of both Microsoft and Google in supplying the Israeli military with AI technology, which has concerning implications for human rights and international law.

Reports from DropSite News, +972 Magazine and The Guardian reveal that Microsoft's Azure platform has been used extensively by Israeli intelligence units to power surveillance systems, contributing to systematic human rights abuses. Recent revelations have also highlighted Google's deep involvement in supplying advanced AI tools to the Israeli military as part of the $1.2bn Project Nimbus contract. During the October 2023 Gaza offensive, Google's Vertex AI platform was reportedly deployed to process vast datasets for "predictions", where algorithms analyse behavioural patterns and metadata to identify potential threats.

Proprietary Israeli military systems such as Lavender, The Gospel, and Where's Daddy? also played a central role in the Gaza war. Lavender, an AI-powered database, reportedly flagged more than 37,000 individuals as potential assassination targets during the first weeks of the war, with operators spending as little as 20 seconds reviewing each case. Where's Daddy? tracked individuals via their smartphones, enabling precise air strikes that often targeted entire families. Such tools demonstrate how AI is being weaponised against civilian populations, raising urgent concerns about accountability and compliance with international humanitarian law.

Mission undermined

The Paris AI Action Summit, organised under the banner of ethical AI innovation, appears disconnected from the realities of how AI technologies are being weaponised. Civil society groups, particularly those from the Global South, have struggled to gain access to this event for various reasons, including financial constraints and a lack of clarity on how to secure an invitation. Several organisations, including my own, report that they were not informed about the event or the criteria for participation, leading to confusion and frustration. Moreover, the high costs of attending the summit, including travel and accommodation, are prohibitive for many NGOs, particularly those operating in the Global South.

The result is a further marginalisation of voices that could highlight the devastating human costs of militarised AI. Key actors, especially those who work directly with communities impacted by AI-powered warfare, have been effectively shut out, as noted in a statement signed by more than 100 civil society organisations calling for human rights to be placed at the heart of AI regulation. Such exclusion undermines the summit's stated mission to ensure that AI benefits all populations. Summit organisers did not reply to our emails, even those requesting visa support or an official invitation.

Civil society groups play a crucial role in challenging the militarisation of AI and advocating for international legal frameworks that prioritise human rights. Without their voices, the summit risks reinforcing a narrow, top-down view of AI development that overlooks the potential human costs.

The militarisation of AI is not a distant issue confined to conflict zones. Many of the tools used in Gaza, such as biometric identification systems, were developed in the West and continue to be used around the world for "security purposes". Concerns are also being raised globally about how AI technologies, such as facial recognition and surveillance systems, are disproportionately used to target vulnerable groups, violating privacy and exacerbating existing biases. These tools often lead to discriminatory racial profiling, further marginalising individuals who are already at risk.

Civil liberties

Investigate Europe has also highlighted the potential for the new European Union AI Act to infringe on civil liberties and human rights, after some governments lobbied for exceptions that would allow AI-powered surveillance by police and border authorities, posing a particular risk to groups such as migrants and asylum seekers. This could exacerbate existing biases and discriminatory practices, including predictive policing and the use of biometric systems for real-time surveillance in public spaces. Such practices raise alarms about the increasing erosion of privacy and rights, especially for marginalised groups.

As Europe races to compete with US investments in AI infrastructure, it must prioritise ethical guidelines over unchecked innovation. Ignoring such concerns risks normalising technologies that reduce human oversight and potentially violate international humanitarian law.

To address these critical challenges, a comprehensive and multi-stakeholder approach is essential. Policymakers must prioritise the integration of discussions on militarised AI into global governance agendas, ensuring meaningful participation from civil society groups, particularly those representing Global South countries. We need binding international standards to prevent tech companies from enabling human rights abuses through military contracts. Transparency must become a fundamental principle, with mandatory disclosure requirements for companies engaging in military partnerships. Moreover, future AI summits should create dedicated spaces for critical dialogue, moving beyond technological innovation to examine the profound ethical implications of AI in warfare.

France often portrays itself as the land of human rights. To truly uphold this legacy, it must take a leading role in regulating AI technologies, ensuring they are used responsibly and not as instruments of oppression.

The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Eye.
