LG Electronics Shares Jump on Report of AI Chip Gear Development
Bloomberg · 14-07-2025
LG Electronics Inc. shares advanced in Seoul after a local media report that the company is developing cutting-edge tools for making the memory chips that work alongside AI processors designed by Nvidia Corp. and others.
The Korean company is targeting mass production of hybrid bonders for high-bandwidth memory chips in 2028, Seoul Economic Daily reported, citing unidentified sources. LG Electronics said it is conducting technical research on hybrid bonders for HBM, but the specific timing of mass production has not been confirmed.

Related Articles

Google's Smart Home Devices Are Finally Getting Gemini's AI Skills

CNET



Gemini for Home will replace the Google Assistant in the company's line of smart displays and speakers, a step up in capabilities for its smart home devices, Google said Wednesday. Gemini for Home will use AI models tuned for home tasks and will work with any member of the household, including guests, the company said in a press release alongside its Made By Google event in New York.

It'll respond to the same "hey Google" activation phrase, but Gemini for Home can better understand context, letting users ask more complex questions. Nest smart displays, speakers, thermostats and smart lights can interact with Gemini for Home. For example, users can have Gemini for Home dim the lights and set the thermostat to 72 degrees in a single command. Google's blog post also says commands like "turn off the lights everywhere except my bedroom" will work.

Because Gemini for Home taps into the power of large language models, you can tell a Nest Hub smart display what ingredients are in your fridge and it can work out which recipes to consider. Gemini for Home also links to Google Search, meaning it can find and use up-to-date information. It could also create bespoke content, like a bedtime story.

It's unclear for now exactly which devices will support Gemini for Home. The current line of Nest devices likely will, but Google's blog post makes no mention of older Google smart speakers or the Pixel Tablet. Google didn't immediately respond to a request for comment.

Google's AI push is spreading the company's generative technology across all of its devices and services. From Google Search and Gmail to Pixel phones and now smart home devices, Gemini is in everything. Unlike OpenAI, creator of ChatGPT, Google can spread its AI tech across a wide product portfolio.
Google's early investments in AI mean the company is ahead of Apple, which has been struggling to bake AI into its suite of devices. The rollout of Apple Intelligence on the iPhone, made in partnership with OpenAI, has been slow compared to Google and other smartphone makers. The AI-ification of Google seems to be working: the company reported a 14% increase in sales this past quarter, thanks to increased Google Search usage and higher cloud sales.

The AI Battle's Newest Warrior Strikes a Major Blow to Big Tech

Gizmodo



The ongoing slugfest between tech players racing to build the most intuitive and powerful AI may have just seen a knockout punch land. The blow? A new version of DeepSeek's increasingly impressive V3.1, a whopping 685-billion-parameter system that can complete a coding task for about $1.01, compared with starting prices around $70 for traditional proprietary systems.

"🚨 BREAKING: DeepSeek V3.1 is Here! 🚨 The AI giant drops its latest upgrade — and it's BIG: ⚡685B parameters 🧠Longer context window 📂Multiple tensor formats (BF16, F8_E4M3, F32) 💻Downloadable now on Hugging Face 📉Still awaiting API/inference launch. The AI race just got…" — Commentary DeepSeek News (@deepsseek), August 19, 2025

DeepSeek is no stranger to wowing the world. Its R1 model rolled out last year and immediately astonished AI watchers with its speed and accuracy compared to its Western competitors, and it looks like V3.1 may follow suit. That price point and level of capability are a direct challenge to bigger, recent frontier systems from OpenAI and Anthropic, both of which are based in the U.S.

A face-off between Chinese and American tech systems has been playing out for years, but such a formidable entrant from a much smaller company may usher in a new era of challenges. Alibaba Group Holding Ltd. and Moonshot have also released AI models that challenge American tech. 'While many recognize DeepSeek's achievements, this represents just the beginning of China's AI innovation wave,' Louis Liang, an AI sector investor with Ameba Capital, told Bloomberg. 'We are witnessing the advent of AI mass adoption; this goes beyond national competition.'

DeepSeek's entire approach to how AI can work differs from the way most American tech companies have been tackling the idea. That could shift the focus of the global competition from raw power to accessibility, VentureBeat reports.
It is also challenging giants like Meta and Alphabet by processing a much larger amount of data, which gives it a bigger 'context window,' the amount of text a model can consider when answering a query. That matters to users because it boosts the model's ability to stay coherent in long conversations, draw on memory to complete complicated tasks it has handled before, and comprehend how different parts of a text relate to one another. More importantly, users are loving it.

"Deepseek V3.1 is already 4th trending on HF with a silent release without model card 😅😅😅 The power of 80,000 followers on @huggingface (first org with 100k when?)!" — clem 🤗 (@ClementDelangue), August 19, 2025

Another major accolade? DeepSeek's V3.1 notched a 71.6% score on the Aider coding benchmark, a major win considering it had only just debuted on the popular AI model hub Hugging Face the night before, and it pretty much instantly blew away rivals like OpenAI's GPT-4.5 model, which scored a paltry 40%. 'Deepseek v3.1 scores 71.6% on aider—non-reasoning SOTA,' tweeted AI researcher Andrew Christianson, adding that it is '1% more than Claude Opus 4 while being 68 times cheaper.' The achievement places DeepSeek in rarefied company, matching performance levels previously reserved for the most expensive proprietary systems.

Oracle Will Reportedly Spend $1 Billion a Year on a Gas-Powered Data Center

Gizmodo



Oracle is going all-in on its AI-focused cloud business, pouring billions into building massive new data centers. One site under construction in West Texas will reportedly cost the company around $1 billion a year just to keep the lights on.

Bloomberg reported, citing unnamed sources familiar with the plans, that Oracle intends to spend more than $1 billion a year to run a new West Texas megasite on gas generators rather than wait for a utility hookup. It can take years to get the approval and infrastructure needed to pull the kind of electricity these massive data centers require from local grids. Oracle's workaround shows just how much it's willing to spend, how fast it wants these sites running, and how little it cares about any environmental consequences.

Oracle, founded all the way back in 1977 and best known for its business database software, has reinvented itself in the last decade or so as a player in cloud services, after years of initially dismissing the idea. Its big bet now is on AI-focused cloud computing. In recent years, Oracle has landed big deals with major tech firms and quickly built a reputation for handling AI projects. A big draw is the company's 'bare metal' approach, which gives clients their own dedicated servers, as opposed to sharing them with other clients. This makes the servers faster and more secure.

Oracle has now become the backbone for many AI companies. Things really took off when TikTok came aboard, and by 2022, all U.S. user traffic on the app was flowing through Oracle servers. That deal quickly generated over $1 billion in annual revenue for the company. Today, Oracle is helping power Elon Musk's xAI from a Utah data center and building a cluster of tens of thousands of AI chips for Nvidia. All these deals have helped make Oracle founder Larry Ellison the second-richest man in the world, just behind Musk.
Oracle also recently signed what Bloomberg calls the largest single cloud deal ever with OpenAI for its Stargate project, an ambitious joint venture announced at the White House back in January. The AI company has agreed to develop about 4.5 gigawatts of data center power with Oracle; for context, one gigawatt is enough to power about 750,000 homes. Still, we've seen little movement in the real world to indicate the project is on track.

Oracle's new gas-powered data center is going up in Shackelford County, Texas, not far from another of its data centers in Abilene. Developed by Vantage Data Centers, the site will have a massive 1.4-gigawatt capacity, making it one of the largest data centers in the world. Oracle did not immediately respond to a request for comment from Gizmodo.
