
Tesla-Samsung $16.5 billion supply deal may spur chipmaker's US contract business
Tesla CEO Elon Musk said late on Sunday that Samsung's new chip factory in Taylor, Texas would make Tesla's next-generation AI6 chip. This could re-energize Samsung's project, which has faced long delays because the company had trouble retaining and attracting major clients.
Samsung shares on Monday closed up 6.8 per cent on hopes that this deal would help the world's top memory chip maker in the race to produce artificial intelligence chips, where it trails rivals such as TSMC.
With production still years away, the deal is unlikely to help Tesla address immediate challenges, including ongoing declines in its EV sales and the difficulty of scaling its nascent robotaxi service. Tesla shares still rose 4.2 per cent on Monday.
Musk has said that future AI inference chips, including AI6, would be deployed in self-driving vehicles and its Optimus humanoid robots, though he has noted the substantial computing power could enable broader AI applications. Inference chips are used to run AI models and make real-time decisions.
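To make the term concrete, here is a minimal, hypothetical sketch of what "running inference" looks like in software, written in PyTorch; the toy model, its sensor inputs and its three driving actions are illustrative stand-ins, not anything drawn from Tesla's actual FSD stack. Chips such as the AI series are built to execute this kind of forward pass at low latency, many times per second, inside a vehicle or robot.

```python
# Illustrative only: a tiny neural network making a real-time decision from
# sensor-like inputs. A dedicated inference chip runs this forward pass
# (no training, fixed weights) at very low latency.
import torch
import torch.nn as nn

# Hypothetical stand-in model: maps 8 sensor features to 3 driving actions.
model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 3),   # e.g. scores for "brake", "hold speed", "accelerate"
)
model.eval()  # inference mode: weights are fixed, no gradients needed

with torch.no_grad():
    sensor_frame = torch.randn(1, 8)        # one frame of (fake) sensor data
    scores = model(sensor_frame)            # the inference step itself
    action = scores.argmax(dim=1).item()    # pick the highest-scoring action
    print(f"chosen action index: {action}")
```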
"Samsung agreed to allow Tesla to assist in maximizing manufacturing efficiency. This is a critical point, as I will walk the line personally to accelerate the pace of progress. And the fab is conveniently located not far from my house," Musk said in a post on X on Monday.
"The $16.5B number is just the bare minimum. Actual output is likely to be several times higher," Musk said in another post.
It was unclear whether the deal was related to ongoing trade talks between South Korea and the U.S. Seoul is seeking U.S. partnerships in chips and shipbuilding amid last-ditch efforts to reach a trade agreement that would eliminate or reduce potential 25 per cent U.S. tariffs.
A South Korean trade ministry official told Reuters he had not heard that the specific deal was part of the trade negotiations.
Ryu Young-ho, a senior analyst at NH Investment & Securities, said Samsung's Taylor factory "so far had virtually no customers, so this order is quite meaningful", although the deal is likely to account for only a small portion of Samsung's annual logic chip revenue.
In October, Reuters reported that Samsung had postponed taking deliveries of ASML chipmaking equipment for its Texas factory as it had not yet won any major customers for the project. It has already delayed the plant's operational start to 2026.
PRODUCTION TIMELINE
While no timeline was given for AI6 chip production, Musk has previously said that next-generation AI5 chips would be produced at the end of 2026, suggesting AI6 would follow. During Tesla's earnings call last week, Musk confirmed that AI5 chips would enter volume production by the end of next year.
Lee Dong-ju, an analyst at SK Securities, expects production in 2027 or 2028, but Tesla has a history of missing its targets.
Samsung currently makes Tesla's AI4 chips, which power its Full Self-Driving (FSD) driver-assistance system, while TSMC is slated to make the AI5, initially in Taiwan and then in Arizona, Musk has said.
Samsung also produces logic chips designed by customers through its foundry business. The Texas project is central to Samsung Chairman Jay Y. Lee's strategy to expand beyond the company's bread-and-butter memory chips into contract chip manufacturing.
Samsung holds just 8 per cent of the global foundry market, far behind TSMC's 67 per cent share, data from market researcher TrendForce show.
Pak Yuak, an analyst at Kiwoom Securities, said the deal would help reduce losses at Samsung's foundry business, which he estimates exceeded 5 trillion won ($3.6 billion) in the first half of the year.
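As a back-of-the-envelope check on that won-to-dollar conversion, the snippet below redoes the arithmetic; the exchange rate of roughly 1,390 won per US dollar is an assumption for illustration, not a figure from the article.

```python
# Rough check of the "5 trillion won (~$3.6 billion)" conversion cited above.
# The exchange rate is an assumed value (~1,390 KRW per USD), not from the article.
loss_krw = 5_000_000_000_000        # "exceeded 5 trillion won"
assumed_krw_per_usd = 1_390

loss_usd_billions = loss_krw / assumed_krw_per_usd / 1e9
print(f"~${loss_usd_billions:.1f} billion")   # prints ~$3.6 billion
```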
Related Articles


CNA
Exclusive: LG Energy Solution, Tesla sign $4.3 billion battery supply deal, source says
South Korean battery maker LG Energy Solution (LGES) has signed a $4.3 billion contract to supply Tesla with lithium iron phosphate (LFP) batteries for energy storage systems, a person familiar with the matter said on Wednesday. The batteries will be supplied from LGES' U.S. factory, the person said on condition of anonymity because the details were not public. LGES said earlier on Wednesday that it had signed a $4.3 billion contract to supply LFP batteries over three years globally, without identifying the customer. The announcement by the company, whose major customers include Tesla and General Motors, did not say whether the LFP batteries will be used in vehicles or energy storage systems.


CNA
LG Energy Solution signs $4.3 billion battery supply contract
SEOUL: South Korean battery maker LG Energy Solution (LGES) said on Wednesday it had signed a $4.3 billion contract to supply lithium iron phosphate (LFP) batteries over three years globally. LGES did not name the counterparty in the regulatory filing, but said it would supply batteries to locations globally, without specifying them. The company, whose major customers include Tesla and General Motors, did not say whether the LFP batteries will be used in vehicles or energy storage systems.
Business Times
Trivial AI use can have devastating climate consequences
THE future of humanity, we are told, will be irrevocably shaped by artificial intelligence (AI). That same future is also under existential threat from global warming and its disastrous consequences – a threat that is being exacerbated by the unthinking use of AI.
Of course, there are many powerful and life-changing uses for AI, from cancer research to predictive data analytics. Perhaps the most widespread use today, however, is arguably the most trivial: generative AI (GenAI), not least in the form of large language models (LLMs) that spit out text based on spotting linguistic patterns rather than engaging with meaning.
Even if one sets aside the legal and ethical issues surrounding GenAI models and their use of copyrighted material, their meteoric rise has had regrettable practical consequences. Some have decried how LLMs encourage intellectual laziness, with students or entry-level employees outsourcing what used to be fundamental work, and thus never developing the required skills: understanding and summarising information, or turning rough ideas into polished text. Due to the looseness of the term "AI", LLMs are also drawing attention – and perhaps resources – away from more useful forms.
At an SG60-themed conference on Tuesday (Jul 29), Prime Minister Lawrence Wong noted that not all AI applications are equally useful, saying: "Most of us use AI the way we use Google – that is not exactly the best way to use AI or to harness the potential of AI." He stressed that LLMs are just one small part of AI, with other areas having far more potential. Even as Singapore encourages broad-based adoption of AI, the country must "think equally hard about applying technologies like AI in a meaningful and deliberate manner that creates jobs for Singaporeans", he said.
The climate cost
For those with no attachment to the old-fashioned notion of thinking for oneself, the widespread use of GenAI might not be an issue. The true problem is that trivial GenAI use exacts a real environmental cost, for often dubious gains.
Take Grok, the GenAI model built by Elon Musk's xAI and deployed on X, formerly known as Twitter. Since late 2024, the data centre powering this LLM has made headlines for guzzling electricity and water, polluting nearby waterways and emitting greenhouse gases. And this is just one of many resource-hungry data centres that are crucial for powering AI models. Such facilities consume large amounts of water and electricity to keep their servers cool and running.
In the current climate crisis, the proliferation of data centres might seem almost profligate. Even as nations try to cut emissions and energy use, every ChatGPT query adds to the carbon burden.
Of course, as with anything else, AI use should be subject to cost-benefit analysis. Many data centres may be supporting truly meaningful work: AI applications that reap concrete gains, for companies or for the government. When smart factories use data analytics to reduce energy consumption and cut waste, for instance, the gains should more than offset the cost of powering such AI tools.
In contrast, it is depressing to consider the carbon cost of the trivial GenAI queries being made en masse each day. And even when AI use is purported to improve productivity, we should question the real savings being achieved. Is the time saved in getting ChatGPT to produce a corporate-speak e-mail, for instance, really worth the resources burned?