
Samsung is giving its TVs a major AI upgrade — here's what you can do now
Samsung announced the update in a newsroom blog post on its Korean site. It focuses on enhancing the Bixby voice assistant with improved natural language processing (NLP) while you watch cable TV, Samsung TV Plus and terrestrial broadcasts.
The new update will only apply to 2025 TVs, specifically its more premium sets, including Neo QLED, OLED, QLED and The Frame models. It will first roll out in Korea, then expand to other regions in the coming months.
It's no secret that Bixby is a bit of a headache to use on most Samsung sets, but the new Click to Search update should make it a far better experience. Samsung is equipping the Bixby assistant with smarter AI, specifically highlighting that users will feel "as if they were talking to their TV."
Newer models among the best Samsung TVs have a dedicated AI button, and its capabilities will now broaden depending on the situation. When pressed, Bixby can relay pertinent information about the on-screen content, or you can simply ask it a question about what's on screen.
That's because Samsung has enhanced Bixby with natural language processing (NLP), a subfield of AI focused on bridging human language and computer interpretation. In short, it should make Bixby far easier to use and more interactive.
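To give a rough sense of what NLP does here, the sketch below shows how a spoken-style question about on-screen content might be mapped to an intent that a TV assistant could act on. This is a minimal, purely hypothetical illustration; it is not Samsung's Bixby implementation, and every name and rule in it is made up for the example.

```python
# Hypothetical illustration only -- not Samsung's Bixby code.
# Maps a viewer's natural-language query about on-screen content
# to a simple intent the assistant could act on.

from dataclasses import dataclass


@dataclass
class TVIntent:
    action: str   # e.g. "identify", "recommend", "search"
    subject: str  # what the viewer is asking about


def parse_query(query: str) -> TVIntent:
    """Very rough keyword-based intent detection for TV voice queries."""
    q = query.lower()
    if "who is" in q or "actor" in q:
        subject = q.split("who is")[-1].strip(" ?") or "on-screen person"
        return TVIntent("identify", subject)
    if "what is this" in q or "what am i watching" in q:
        return TVIntent("identify", "current program")
    if "similar" in q or "like this" in q:
        return TVIntent("recommend", "shows similar to current program")
    return TVIntent("search", q)


if __name__ == "__main__":
    for q in ["Who is the actor on screen?",
              "What am I watching?",
              "Show me something similar to this"]:
        print(q, "->", parse_query(q))
```

A real assistant would use trained language models rather than keyword rules, but the basic idea is the same: turn free-form speech into a structured request the TV can answer.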
This new Click to Search functionality will only be available in certain applications, like Samsung TV Plus, terrestrial broadcasting and cable TV. Samsung claims it's "considering expanding to OTT channels in the future," but gives no word on whether it will be available with some of the best streaming services.
Samsung says the new update applies only to 2025 sets. These include its Neo QLED models, like the Samsung QN990F 8K TV; OLED TVs, like the Samsung S95F; QLEDs; and even The Frame. It will first launch in Korea, then expand to other parts of the world later this year.
It's unclear if this update will be retroactively added to older models through Samsung's seven-year Tizen OS update plan, but it seems unlikely. Samsung promoted AI as the main backbone of its 2025 TV lineup, and the hardware on older sets might not be as capable, not to mention that the AI button is only available on newer models.