
Group that opposed OpenAI's restructuring raises concerns about new revamp plan
A group that opposed OpenAI's restructuring wrote in a letter this week that the startup's new organisational plan still does not go far enough to prevent the ChatGPT creator from developing dangerous artificial intelligence technology.
In the letter dated May 12, submitted to the California and Delaware attorneys-general, members of the 'Not For Private Gain' group argued that while OpenAI's announcement earlier this month to dial back some of its restructuring "might be a step in the right direction," it still does not adequately prevent OpenAI from straying from its original mission to ensure that artificial intelligence is developed for the benefit of humanity.
The larger group, comprising former OpenAI employees and AI experts such as Geoffrey Hinton, had written an initial letter in April opposing OpenAI's plan at the time to restructure in a way that would remove control from its nonprofit parent entity.
The letter was part of a firestorm of criticism and legal challenges, including a high-profile lawsuit filed by rival and co-founder Elon Musk, that prompted OpenAI to dial back its restructuring plan.
OpenAI, in which Microsoft has invested more than $13 billion, now plans to convert its for-profit arm into a public benefit corporation (PBC), with the nonprofit parent controlling the PBC and becoming a "big shareholder" in it. OpenAI says this arrangement will allow it to raise more capital to keep pace in the expensive AI race.
A PBC is a structure designed to balance shareholder returns with social goals, unlike nonprofits, which are solely focused on public good.
But Monday's letter says OpenAI's new plan significantly diminishes the nonprofit's existing authority. First, OpenAI's current for-profit entity is required to advance its mission and charter above any investor interests, while the proposed PBC is not required to do so, it said.
Second, OpenAI's nonprofit, as the sole manager, has 100% control over its for-profit entity today, granting it day-to-day operational power such as the ability to fire executives. In the proposed restructuring, the nonprofit would not have comprehensive control over the PBC, which the group said is concerning because the attorneys-general's enforcement powers are derived solely from the nonprofit's authority.

Related Articles


News18 · an hour ago
Energy And Electrons: Could They Become Currency In A Decade? What Nikhil Kamath Says
Zerodha co-founder Nikhil Kamath has shared a thought-provoking idea, based on research, that energy and electrons might become the currency of trade within a decade. In a series of tweets, he shared infographics explaining the research, which highlights the growing financial footprint of data centers and artificial intelligence.

The electricity consumption of data centers is staggering: one new data center can consume more electricity in a year than 4 lakh electric vehicles. According to the research, electricity alone eats up 65 per cent of a data center's costs, primarily for computing and cooling.

Data centers are large facilities that store, process, and manage digital data. They are the backbone of the internet and digital services: every time you use a website, stream a video, store files in the cloud, or make a bank transaction, chances are it passes through a data center.

The US leads with the most data centers (3,680), followed by Germany (424) and the UK (418). India ranks seventh with 262 data centers. The more servers a data center has, the more energy it requires.

Earlier, OpenAI founder Sam Altman explained how words like 'please' and 'thank you' cost the company tens of millions of dollars. The research adds that one ChatGPT query uses 10x the electricity of a regular Google search.

Data Centers To Consume 10% Of Global Energy By 2030

The research shows that data centers' energy consumption is expected to grow to 10 per cent of global energy by 2030, up from about 1.5 per cent today. 'Just 5% of global internet searches using AI could consume enough energy to power over 1 million Indian homes for a year.'

Energy To Become An Asset

Kamath's idea is rooted in an emerging economic paradigm: as the demand for electricity grows, energy becomes a critical and tradable asset, much like money. If energy is as essential as cash in powering AI, streaming services, financial systems, and cloud operations, pricing and trading electricity akin to currency could be the next step.

This concept isn't just futuristic; it carries real-world relevance. Companies might begin to hedge not only currency risk but also energy risk. Imagine supermarkets trading kilowatt-hours, or data centers using energy derivatives much as firms trade forex. Over time, energy tokens or blockchain-based credits for electrons could emerge, giving rise to a sophisticated energy-backed financial ecosystem.

If this shift happens, economies would have to rethink everything from inflation metrics to banking, as 'energy currency' could fundamentally alter how value is transferred and stored, making energy both a means and a measure of wealth.
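The search-energy comparison above lends itself to a quick back-of-the-envelope check. Here is a minimal sketch in Python; every input (per-query energy, global search volume, household consumption) is an assumption chosen for illustration, not a figure from the research:

    # Rough check: could 5% of searches, done with AI, power ~1 million Indian homes?
    # All inputs are illustrative assumptions.
    google_wh = 0.3               # assumed energy per ordinary search (Wh)
    ai_wh = 10 * google_wh        # the article's 10x figure for a ChatGPT query
    searches_per_year = 5e12      # assumed ~5 trillion global searches a year
    ai_share = 0.05               # the article's "just 5%" scenario
    home_kwh = 750                # assumed annual electricity use of an Indian home (kWh)

    energy_kwh = searches_per_year * ai_share * ai_wh / 1000   # Wh -> kWh
    homes = energy_kwh / home_kwh
    print(f"{energy_kwh / 1e9:.2f} billion kWh/yr -> ~{homes / 1e6:.1f} million homes")

With these assumed inputs the scenario works out to roughly 0.75 billion kWh a year, or about a million homes, the same order of magnitude as the claim; the result scales linearly with each assumption.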


The Hindu · an hour ago
Physics changed AI in the 20th century. Is AI returning the favour now?
Artificial intelligence (AI) is booming. AI algorithms are used in many scientific domains: to predict the structure of proteins, search for materials with particular properties, and interpret medical data to provide a diagnosis. People use tools like ChatGPT, Claude, NotebookLM, DALL-E, Gemini, and Midjourney to generate images and videos from text prompts, write text, and search the web. The question arises in the same vein: can they prove useful in studies of the fundamental properties of nature, or is there a gap between human and artificial scientists that needs to be bridged first?

There is certainly some gap. Many current applications of AI in scientific research use AI models as a black box: the models are trained on some data and produce an output, but the relationship between the inputs and the output is not clear. Much of the scientific community considers this unacceptable. Last year, for example, DeepMind faced pressure from the life-sciences community to release an inspectable version of its AlphaFold model, which predicts protein structures. The black-box nature presents a similar concern in the physical sciences, where the steps leading up to a solution are as important as the solution itself.

Yet this hasn't dissuaded scientists from trying. In fact, they started early: since the mid-1980s, they have integrated AI-based tools into the study of complex systems. In 1990, high-energy physics joined the fold.

Astro- and high-energy physics

In astronomy and astrophysics, scientists study the structure and dynamics of celestial objects. Big-data analytics and image enhancement are two major tasks for researchers in this field. AI-based algorithms help with the first by looking for patterns, anomalies, and correlations. Indeed, AI has revolutionised astrophysical observations by automating tasks like capturing images and tracking distant stars and galaxies. AI algorithms can compensate for the earth's rotation and atmospheric disturbances, producing better observations in a shorter span. They can also 'automate' telescopes that look for very short-lived events in the sky and record important information in real time.

Experimental high-energy physicists often deal with very large datasets. For example, the Large Hadron Collider in Europe generates more than 30 petabytes of data every year, and one of its detectors, the Compact Muon Solenoid, alone captures 40 million 3D images of particle collisions every second. It is very difficult for physicists to analyse such data volumes rapidly enough to track subatomic events of interest. In one effort, researchers at the collider adopted an AI model able to accurately identify a particle of interest in very noisy data; such a model helped in the discovery of the Higgs boson over a decade ago.

AI in statistical physics

Statistical mechanics is the study of how a group of particles behaves together, rather than individually. It is used to understand macroscopic properties like temperature and pressure. For example, Ernst Ising developed a statistical model for magnetism in the 1920s, focusing on the collective behaviour of atomic spins interacting with their neighbours. In this model, the system has higher- and lower-energy states, and the material is more likely to exist in the lowest-energy state. The Boltzmann distribution is an important concept in statistical mechanics, used to predict, say, the precise conditions in which ice will turn to water.
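For reference, the Boltzmann distribution the article invokes (a standard result, stated here for completeness) assigns a state s with energy E(s), at temperature T, the probability

    P(s) = exp(-E(s) / kT) / Z,   where Z = sum over all states s' of exp(-E(s') / kT)

and k is the Boltzmann constant. Lower-energy states are exponentially more likely, which is why both the magnet and the image-completion problem discussed next reduce to finding least-energy configurations.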
Using this distribution, in the 1920s, Ising and Wilhelm Lenz predicted the temperature at which a material changes from magnetic to non-magnetic.

Last year's physics Nobel laureates, John Hopfield and Geoffrey Hinton, developed a theory of neural networks in the same way, based on ideas from statistical mechanics. A neural network (NN) is a type of model in which nodes that receive data and perform computations on them are linked to one another in different ways. Broadly, NNs process information the way animal brains do.

For example, imagine an image made up of pixels, where some are visible and the rest are hidden. To determine what the image is, physicists have to consider all the ways the hidden pixels could fit together with the visible ones. The statistical-mechanics idea of most-likely states can help in this scenario. Hopfield and Hinton developed a theory for NNs that treated the pixels as collectively interacting neurons, just as Lenz and Ising had treated atomic spins. A Hopfield network evaluates an image by finding the least-energy arrangement of its hidden pixels, in direct analogy with statistical physics (a short code sketch near the end of this article makes this concrete).

AI tools apparently returned the favour by helping make advances in the study of Bose-Einstein condensates (BECs). A BEC is a peculiar state of matter that collections of certain atomic or subatomic particles enter at very low temperatures; scientists have been creating them in the lab since the mid-1990s. In 2016, scientists at the Australian National University used an AI tool to help create the right conditions for a BEC to form, and it passed with flying colours. The tool was even able to help keep the conditions stable, allowing the BEC to last longer. 'I didn't expect the machine could learn to do the experiment itself, from scratch, in under an hour,' the paper's coauthor Paul Wigley said in a statement. 'A simple computer program would have taken longer than the age of the universe to run through all the combinations and work this out.'

Bringing AI to the quantum

In a 2022 paper, scientists from Australia, Canada, and Germany reported a simpler method to entangle two subatomic particles using AI. Quantum computing and quantum technologies are of great research and practical interest today, with governments, including India's, investing millions of dollars in developing these futuristic technologies. A big part of their revolutionary power comes from achieving quantum entanglement. Quantum computers rely, for example, on a process called entanglement swapping, in which two particles that have never interacted become entangled via intermediate entangled particles.

In the 2022 paper, the scientists reported a tool called PyTheus, 'a highly-efficient, open-source digital discovery framework … which can employ a wide range of experimental devices from modern quantum labs', to better achieve entanglement in quantum-optic experiments. Among other results, scientists have used PyTheus to make a breakthrough with implications for the quantum networks used to transmit messages securely, making these technologies more feasible. More work, including basic research, remains to be done, but tools like PyTheus have demonstrated the potential to make it more efficient.

From this vantage point, it seems like every subfield of physics will soon use AI and ML to help solve its toughest problems. The end goal is to make it easier to come up with the right questions, test hypotheses faster, and understand results more fully.
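To make the Hopfield picture above concrete, here is a minimal sketch in Python. It is an illustration under simple assumptions (one stored pattern, random pixels), not code from any paper discussed here: it stores a binary pattern with the Hebbian rule, hides a quarter of the pixels by flipping them, and lets the network settle into its least-energy state.

    import numpy as np

    rng = np.random.default_rng(0)

    # One stored binary pattern of N "pixels", each +1 or -1 (illustrative size).
    N = 64
    pattern = rng.choice([-1, 1], size=N)

    # Hebbian learning: weights reward configurations that agree with the pattern.
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0.0)  # no self-connections

    def energy(state):
        # Hopfield energy, the quantity the network minimises: E = -1/2 s^T W s
        return -0.5 * state @ W @ state

    # Corrupt the pattern: flip ("hide") a quarter of the pixels.
    state = pattern.copy()
    state[rng.choice(N, size=N // 4, replace=False)] *= -1

    # Asynchronous updates: each pixel aligns with the field from the others.
    # Each step can only lower (or keep) the energy, so the state rolls downhill.
    for _ in range(5):
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("recovered pattern:", np.array_equal(state, pattern))
    print("final energy:", energy(state))

With a single stored pattern and 25% of the pixels flipped, the state settles back onto the stored pattern within a sweep or two; this downhill-in-energy dynamic is the sense in which Hopfield networks and the Ising magnet share the same mathematics.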
The next groundbreaking discovery may well come from collaborations between human creativity and machine power.

Shamim Haque Mondal is a researcher in the Physics Division, State Forensic Science Laboratory, Kolkata.


Time of India · an hour ago
Trump-Musk rift rattles Wall Street; Tesla share slide exposes market fragility; major indexes take a hit

The public feud between US President Donald Trump and Tesla CEO Elon Musk has turned into both a political and a Wall Street drama, raising investor concerns and exposing the vulnerability of stock markets to sharp moves in major companies.

The clash, which played out mostly on social media, triggered a 14% drop in Tesla shares on Thursday after Trump threatened to cut off government contracts to Musk's companies. Thursday's decline reduced Tesla's market value by approximately $150 billion; its weight in the S&P 500 and Nasdaq 100 stood at 1.6% and 2.6%, respectively. Tesla shares recovered partially on Friday, rising about 5% by midday to a market value of around $970 billion. By comparison, Microsoft and Nvidia, both valued above $3 trillion, carried weights of 6.9% and 6.8% in the S&P 500 as of Thursday.

Despite the slight recovery on Friday, Thursday's sharp fall weighed heavily on major US indexes: Tesla alone accounted for nearly half the day's losses in both the S&P 500 and the Nasdaq 100, which fell 0.5% and 0.8% respectively. The S&P 500 is widely seen as the key benchmark for the US stock market, while the tech-focused Nasdaq 100 underpins the popular Invesco QQQ ETF.

"It's a widely held stock," Robert Pavlik, senior portfolio manager at Dakota Wealth, told Reuters. "When this big-name company that represents a sizable portion of the index sells off, it has an overall effect on the index, but it also has a psychological effect on investors," Pavlik added.

The situation highlights long-standing concerns about index concentration in a small number of large-capitalisation stocks. The "Magnificent Seven", including Apple, Microsoft and Nvidia, collectively represented nearly one-third of the S&P 500's total weight as of Thursday's close. Though Tesla is the smallest among these tech and growth giants, it played a major role in driving index gains in 2023 and 2024. While 2025 started off uncertain, recent trends suggest signs of recovery: Tesla shares have dropped around 37% since mid-December, while the S&P 500 has fallen just 1% in the same period, reducing Tesla's overall influence on the index.

Tesla is included in about 10% of the roughly 4,200 ETFs, giving it wide market exposure, according to Todd Sohn, ETF and technical strategist at Strategas. Major funds affected include the Consumer Discretionary Select Sector SPDR Fund, which fell 2.5% on Thursday, and the Roundhill Magnificent Seven ETF, which declined 2.6%. "It's very important to know holistically what is in all your ETFs, because a lot of them are overlapping," Sohn noted.
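The claim that Tesla alone produced nearly half of Thursday's index decline follows from simple weight arithmetic. The sketch below is a back-of-the-envelope check using only the figures quoted above, ignoring intraday weight drift:

    # A stock with index weight w falling x% drags a cap-weighted index
    # down by roughly w * x percentage points.
    tesla_drop = 0.14       # Tesla's Thursday fall (14%)
    cases = [
        ("S&P 500",    0.016, 0.005),   # weight 1.6%, index fell 0.5%
        ("Nasdaq 100", 0.026, 0.008),   # weight 2.6%, index fell 0.8%
    ]
    for name, weight, index_fall in cases:
        contribution = weight * tesla_drop      # index value lost to Tesla alone
        share = contribution / index_fall       # fraction of the day's decline
        print(f"{name}: ~{contribution:.2%} of index value, "
              f"~{share:.0%} of the decline")

Both ratios come out near 45%, consistent with the article's "nearly half".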
The public feud between US President Donald Trump and Tesla CEO Elon Musk has turned into both a political and a Wall Street drama, raising investor concerns and exposing the vulnerability of stock markets to sharp moves in major companies. The clash, which played out mostly on social media, triggered a 14% drop in Tesla shares on Thursday, after Trump threatened to cut off government contracts to Musk's companies. Thursday's decline reduced Tesla's market value by approximately $150 billion, with its weight in the S&P 500 and Nasdaq 100 at 1.6% and 2.6%, respectively. Tesla shares recovered partially on Friday, increasing about 5% by mid-day, reaching a market value of around $970 billion. Microsoft and Nvidia, both valued above $3 trillion, maintained weights of 6.9% and 6.8% in the S&P 500 as of Thursday. Despite a slight recovery on Friday, Thursday's sharp fall weighed heavily on major US indexes, with Tesla alone accounting for nearly half the day's declines. The company's decline made up nearly half of the day's losses for both the S&P 500 and the Nasdaq 100, which fell 0.5% and 0.8% respectively. by Taboola by Taboola Sponsored Links Sponsored Links Promoted Links Promoted Links You May Like Giao dịch vàng CFDs với mức chênh lệch giá thấp nhất IC Markets Đăng ký Undo The S&P 500 is widely seen as the key benchmark for the US stock market, while the tech-focused Nasdaq 100 underpins the popular Invesco QQQ ETF. "It's a widely held stock," said Robert Pavlik, senior portfolio manager at Dakota Wealth told Reuters. "When this big-name company that represents a sizable portion of the index sells off, it has an overall effect on the index, but it also has a psychological effect on investors," Pavlk added. The situation highlights long-standing concerns about index concentration in a small number of large-capitalisation stocks. The "Magnificent Seven", including Apple, Microsoft and Nvidia, collectively represented nearly one-third of the S&P 500's total weight as of Thursday's close. Though Tesla is the smallest among these tech and growth giants, it played a major role in driving index gains in 2023 and 2024. While 2025 started off uncertain, recent trends suggest signs of recovery. Tesla shares have dropped around 37% since mid-December, while the S&P 500 has fallen just 1% in the same period—reducing Tesla's overall influence on the index. Tesla is included in about 10% of the roughly 4,200 ETFs, giving it wide market exposure, according to Todd Sohn, ETF and technical strategist at Strategas. Some major funds affected include the Consumer Discretionary Select Sector SPDR Fund, which fell 2.5% on Thursday, and the Roundhill Magnificent Seven ETF, which declined 2.6%. "It's very important to know holistically what is in all your ETFs, because a lot of them are overlapping," the analyst noted. Stay informed with the latest business news, updates on bank holidays and public holidays . AI Masterclass for Students. Upskill Young Ones Today!– Join Now