Mouser Electronics Inspires Innovation at COMPUTEX 2025

Yahoo | May 16, 2025

Showcasing the Future of AI with Industry Leaders
TAIPEI, May 16, 2025 /PRNewswire/ -- Mouser Electronics, Inc., the industry's leading New Product Introduction (NPI) distributor with the widest selection of semiconductors and electronic components™, announces its debut at COMPUTEX 2025. The event will take place from May 20 to 23 at the Taipei Nangang Exhibition Center, Hall 1, Booth #I0102. Mouser will exhibit alongside industry-leading partners NexCOBOT, Renesas Electronics, and TAIYO YUDEN, spotlighting the transformative power of AI and the latest trends shaping the industry.
"As digital transformation accelerates, AI is revolutionizing industries like never before," said Daphne Tien, Vice President of Marketing and Business Development at Mouser APAC. "COMPUTEX is a leading global platform for AIoT and breakthrough technologies, and this year's 'AI Next' theme highlights the latest AI innovations and trends. We're thrilled to exhibit in Taiwan for the first time and invite everyone to visit our booth to see how AI is shaping the future."
NexCOBOT's AI-driven robotic systems
At COMPUTEX 2025, Mouser will feature the latest in semiconductor innovations and AI-powered solutions designed to spark creativity and accelerate development for engineers. Attendees will have the opportunity to see NexCOBOT's AI-driven robotic systems, including its Functional Safety Controller, GRC Robot Controller, and Robotic Actuator. These modular solutions are designed for industrial use, offering certifiable safety functions like position and speed monitoring, and are built to comply with international safety standards such as ISO 10218 — helping engineers build smarter and safer robots with ease.
Renesas Electronics power and sensing solutions
Visitors to Mouser's stand can explore Renesas' cutting-edge power and sensing solutions, including high-efficiency GaN technology for over 1kW power conversion, USB PD fast charging designs (240W and 65W), and edge AI sensor systems powered by Reality AI Tools. These technologies enable faster, smarter, and more energy-efficient devices for both industrial and consumer applications.
TAIYO YUDEN Small High-End MLCC products
Attendees at Mouser's stand can see TAIYO YUDEN's Small High-End MLCC products, ideally suited for AI servers and laptops. Engineered to deliver high capacitance and reliable performance up to 125℃, these components meet the growing demands of compact, high-performance computing systems.


Related Articles

Is Nvidia stock a massive bargain — or a massive value trap?

Yahoo | 6 minutes ago

AI has transformed demand for computer chips, and the most obvious beneficiary has been Nvidia (NASDAQ: NVDA). With a stock market capitalization of $3.4trn, Nvidia might not seem like an obvious bargain. But what if it is really worth that much – or potentially a lot more? I have been keen to add some Nvidia stock to my portfolio, but I do not want to overpay. After all, Nvidia has shot up 1,499% in five years! So, here is what I am doing.

For some companies in which I have invested in the past, from Reckitt to Burberry, I have benefited as an investor from a market being mature. Sales of detergent or pricy trenchcoats may grow over time, but they are unlikely to shoot up year after year. That is because those firms operate in mature markets. On top of that, as they are large and long-established, it is hard for them to grow by gaining substantial market share. So, market maturity has helped me as an investor because it has made it easier to judge what the total size of a market for a product or service may be – and how much of it the company in question looks likely to have in future.

Chips, by contrast, are different. Even before AI, this was a fast-growing industry – and AI has added fuel to that fire. On top of that, Nvidia is something of a rarity. It is already a large company and generated $130bn in revenues last year. But it is not mature – rather, it continues to grow at a breathtaking pace. Its first-quarter revenue was 69% higher than in the same three months of last year.

Those factors mean it is hard to tell what Nvidia is worth. Clearly that is not only my opinion: the fact that Nvidia stock is 47% higher than in April suggests the wider market is wrestling with the same problem.

Could it be a value trap? It is possible. For example, chip demand could fall after the surge of recent years and settle at a much lower level. A lower-cost rival could eat badly into Nvidia's market share. Trade disputes could see sales volumes fall. With a price-to-earnings (P/E) ratio of 46, just a few things like that going wrong could mean today's Nvidia stock price ends up looking like a value trap.

On the other hand, think about those first-quarter growth rates. If Nvidia keeps doing as well, let alone better, its earnings could soar. In that case, the prospective P/E ratio based on today's share price could be low and the current share price a long-term bargain. I see multiple possible drivers for such an increase, such as more widespread adoption of AI and Nvidia launching even more advanced proprietary chip designs.

So, I reckon the company could turn out to be either a massive bargain at today's price or a massive value trap. The price does not offer me enough margin of safety for my comfort if the stock is indeed a value trap. So, I will wait for a more attractive valuation before buying.

C Ruane has no position in any of the shares mentioned. The Motley Fool UK has recommended Burberry Group Plc, Nvidia, and Reckitt Benckiser Group Plc. Views expressed on the companies mentioned in this article are those of the writer and therefore may differ from the official recommendations we make in our subscription services such as Share Advisor, Hidden Winners and Pro. Here at The Motley Fool we believe that considering a diverse range of insights makes us better investors. Motley Fool UK 2025
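The prospective-P/E reasoning above is simple arithmetic, and can be sketched as follows. This is a hypothetical illustration: only the 46x trailing P/E and the 69% quarterly growth figure come from the article, and the 30% scenario is an assumption for comparison. Holding the share price constant, the prospective multiple is the current multiple divided by one plus earnings growth.

```python
def prospective_pe(current_pe: float, earnings_growth: float) -> float:
    """Prospective P/E at today's share price after earnings grow.

    With the price held constant, the multiple compresses as earnings rise:
    new P/E = current P/E / (1 + growth).
    """
    return current_pe / (1.0 + earnings_growth)

pe_today = 46.0                   # trailing P/E cited in the article
for growth in (0.0, 0.30, 0.69):  # 69% mirrors the quarterly growth rate mentioned
    print(f"earnings growth {growth:.0%} -> prospective P/E "
          f"{prospective_pe(pe_today, growth):.1f}")
```

If earnings did grow 69% while the price stood still, the multiple would compress from 46 to roughly 27 — the sense in which the prospective P/E "could be low" at today's share price.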

We're offloading mental tasks to AI. It could be making us stupid

Yahoo | 7 minutes ago

Koen Van Belle, a test automation engineer who codes for a living, had been using the artificial intelligence large language model Copilot for about six months when one day the internet went down. Forced to return to his traditional means of work using his memory and what he had decades of experience doing, he struggled to remember some of the syntax he coded with. 'I couldn't remember how it works,' Van Belle, who manages a computer programming business in Belgium, told Salon in a video call. 'I became way too reliant on AI … so I had to turn it off and re-learn some skills.' As a manager in his company, Van Belle oversees the work of a handful of interns each year. Because their company has limits on the use of AI, the interns had to curb their use as well, he said. But afterward, the amount and quality of their coding was drastically reduced, Van Belle said. 'They are able to explain to ChatGPT what they want, it generates something and they hope it works,' Van Belle said. 'When they get into the real world and have to build a new project, they will fail.' Since AI models like Copilot and ChatGPT came online in 2022, they have exploded in popularity, with one survey conducted in January estimating that more than half of Americans have used Copilot, ChatGPT, Gemini or Claude. Research examining how these programs affect users is limited because they are so new, but some early studies suggest they are already impacting our brains. 'In some sense, these models are like brain control interfaces or implants — they're that powerful,' said Kanaka Rajan, a computational neuroscientist and founding faculty member at the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard University. 'In some sense, they're changing the input streams to the networks that live in our brains.' 
In a February study conducted by researchers from Microsoft and Carnegie Mellon University, groups of people working with data worked more efficiently with the use of generative AI tools like ChatGPT — but used less critical thinking than a comparator group of workers who didn't use these tools. In fact, the more that workers reported trusting AI's ability to perform tasks for them, the more their critical thinking was reduced. Another study, published in 2024, reported that the reduction in critical thinking stemmed from relying on AI to perform a greater proportion of the brain work necessary to complete tasks, a process called cognitive offloading. Cognitive offloading is something we do every day when we write a shopping list, make an event on the calendar or use a calculator. To reduce our brain's workload, we can 'offload' some of its tasks to technology, which can help us perform more complex tasks. However, cognitive offloading has also been linked in other research to outcomes like worse memory. As a review published in March concluded: 'Although laboratory studies have demonstrated that cognitive offloading has benefits for task performance, it is not without costs.' It's handy, for example, to be able to rely on your brain to remember the grocery list in case it gets lost. So how much cognitive offloading is good for us — and how is AI accelerating those costs? This concept is not new: The Greek philosopher Socrates was afraid that the invention of writing would make humans dumber because we wouldn't exercise our memory as much. He famously never wrote anything down, though his student, Plato, did. Some argue Socrates was right and the trend is escalating: with each major technological advancement, we increasingly rely on tools outside of ourselves to perform tasks we once accomplished in-house.
Many people may not perform routine calculations in their head anymore due to the invention of the calculator, and most people use a GPS instead of pulling out a physical map or going off physical markers to guide them to their destination. There is no doubt these inventions have made us more efficient, but the concern lies in what happens when we stop flexing the parts of the brain that are responsible for these tasks. And over time, some argue, we might lose those abilities. There is an old ethos of 'use it or lose it' that may apply to cognitive tasks as well. Despite concerns that calculators would destroy our ability to do math, research has generally shown that there is little difference in performance when calculators are used and when they are not. Some have even criticized the school system for still spending so much time teaching students foundational techniques like the multiplication tables when students can now solve those sorts of problems at the touch of a button, said Matthew Fisher, a researcher at Southern Methodist University. On the other hand, others argue that this part of the curriculum is important because it provides the foundational mathematical building blocks from which students learn other parts of math and science, he explained. As Fisher told Salon in a phone interview: 'If we just totally get rid of that mathematical foundation, our intuition for later mathematical study, as well as just for living in the world and understanding basic relationships, is going to be off.' Other studies suggest relying on newer forms of technology does influence our brain activity. Research, for example, has found that students' brains were more active when they handwrote information rather than typing it on a keyboard, and when using a pen and paper versus a stylus and a tablet. Research also shows that 'use it or lose it' is somewhat true in the context of the skills we learn.
New neurons are produced in the hippocampus, the part of the brain responsible for learning. However, most of these new cells will die off unless the brain puts effort and focus into learning over a period of time. People can certainly learn from artificial intelligence, but the danger lies in forgoing the learning process to simply regurgitate information that it feeds us. In 2008, after about two decades of the public internet, The Atlantic published a cover story asking "Is Google making us stupid?" Since then, and with the emergence of smart phones and social media, research has shown that too much time on the internet can lower our ability to concentrate, make us feel isolated and lower our self-esteem. One 2011 review found that people increasingly turn to the internet for difficult questions and are less able to recall the information that they found on the internet when using it to answer those questions. Instead, participants had an enhanced ability to recall where they found it. 'The internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves,' the authors concluded. In 2021, Fisher co-authored research that also found people who used internet searches more had an inflated sense of their own knowledge, reporting exaggerated claims about things they read on the internet compared to a control group who learned things without it. He termed this phenomenon the 'Google effect.' 'What we seem to have a hard time doing is differentiating where our internally mastered knowledge stops and where the knowledge we can just look up but feels a lot like our knowledge begins,' Fisher said. Many argue that AI takes this even further and cuts out a critical part of our imaginative process. In an opinion piece for Inside Higher Education, John Warner wrote that overrelying on ChatGPT for written tasks 'risks derailing the important exploration of an idea that happens when we write.' 
'This is particularly true in school contexts, when the learning that happens inside the student is far more important than the finished product they produce on a given assignment,' Warner wrote. Much of the energy dedicated to understanding how AI affects our brains has been focused on adolescents because younger generations use these tools more and may also be more vulnerable to changes that occur because their brains are still developing. One 2023 study, for example, found junior high school students who used AI more had less of an ability to adapt to new social situations. Another 2023 paper also found that students who more heavily relied on AI to answer multiple choice questions summarizing a reading excerpt scored lower than those who relied on their memory alone, said study author Qirui Ju, a researcher at Duke University. 'Writing things down is helping you to really understand the material,' Ju told Salon in a phone interview. 'But if you replace that process with AI, even if you write higher quality stuff with less typos and more coherent sentences, it replaces the learning process so that the learning quality is lower.' To get a better idea of what is happening with people's brains when using large language models, researchers at the Massachusetts Institute of Technology connected 32-channel electroencephalograms to three groups of college-age students who were all answering the same writing prompts: One group used ChatGPT, another used Google and the third group simply used their own brains. Although the study was small, with just 55 participants, its results suggest large language models could affect our memory, attention and creativity, said Nataliya Kos'myna, the leader of the 'Your Brain on LLM' project, and a research scientist at the MIT Media Lab. 
After writing the essay, 85% of the group using Google and the group using their brains could recall a quote from their writing, compared to only 20% of those who used large language models, Kos'myna said. Furthermore, 16% of people using AI said they didn't even recognize their essay as their own after completing it, compared to 0% of students in the other group, she added. Overall, there was less brain activity and interconnectivity in the group that used ChatGPT compared to the groups that used Google or their brains only. Specifically, activity in the regions of the brain corresponding to language processing, imagination and creative writing in students using large language models were reduced compared to students in other groups, Kos'myna said. The research team also performed another analysis in which students first used their brains for the tasks before switching to performing the same task with the large language models, and vice versa. Those who used their brains first and then went on to try their hand at the task with the assistance of AI appeared to perform better and had the aforementioned areas of their brains activated. But the same was not true for the group that used AI first and then went on to try it with just their brains, Kos'myna said. 'It looks like the large language models did not necessarily help you and provide any additional interconnectivity in the brain,' Kos'myna told Salon in a video call. 'However, there is potential … that if you actually use your brain and then rework the task when being exposed to the tool, it might be beneficial.' Whether AI hinders or promotes our capacity for learning may depend more on how we use it than whether we use it. In other words, it is not AI that is the problem, but our overreliance on it. 
Van Belle, in Belgium, now uses large language models to write social media posts for his company because he doesn't feel like that is where his skills are most refined and the process can be very time-consuming otherwise. 'I would like to think that I would be able to make a fairly decent LinkedIn post by myself, but it would take me an extra amount of time,' he said. 'That is time that I don't want to waste on something I don't really care about.' These days, he sees AI as a tool, which it can be — as long as we don't offload too much of our brain power on it. 'We've been on this steady march now for thousands of years and it feels like we are at the culmination of deciding what is left for us to know and for us to do,' Fisher said. 'It raises real questions about how best to balance technology and get the most out of it without sacrificing these essentially human things.'

At WWDC 25, Apple should make amends with developers after AI shortfalls and lawsuits

TechCrunch | 31 minutes ago

There was palpable excitement around Apple's Worldwide Developer Conference (WWDC) last year. The company was about to unveil its AI capabilities, with the tech world expecting the company to unveil an AI platform capable of competing with Google and OpenAI. The demos Apple showed off at the time were compelling, but the follow-through has been underwhelming, leaving both developers and consumers wanting more. Apple's broader struggles with AI have become clearer over the past year. Its ambitions around personalized intelligence have faced delays, and its rollout of new tools has been inconsistent. The vision that Apple sold in 2024 — a seamless blend of on-device AI, revamped Siri interactions, and powerful new developer capabilities — has yet to materialize in full. Apple Intelligence features saw a staggered rollout that came with several hiccups. The personalized version of Siri that was showcased last year has been delayed, which matters because Apple framed the new Siri as a cornerstone of its AI strategy — a context-aware assistant that could understand user behavior across apps. Without it, the company's AI value proposition looks surprisingly thin. This also meant that developers couldn't take full advantage of the new AI-powered Siri, and users couldn't rely on the assistant to perform in-app actions as promised. For developers, that's a lost opportunity to build more interactive, intelligent app experiences. For consumers, it's another promise unfulfilled. And for Apple, it raises concerns about how competitive its AI stack really is compared to its increasingly powerful rivals like OpenAI, Google, and Microsoft. With WWDC 2025 now just around the corner, expectations for consumer-facing Apple Intelligence features are more cautious than last year. Most developers and analysts are now hoping for incremental improvements: smoother integration of AI into native apps, and tools that empower developers to actually use the AI that Apple is building. 
(No one is expecting much on the Siri front.) One of Apple's best opportunities lies in enabling AI-assisted app development. The rise of tools like Cursor and Replit has made code generation a whole lot easier, helping developers, and even non-developers, bring products to life faster. AI-powered apps have found the web an effective distribution platform. ChatGPT, for instance, gained massive traction on the web before launching native apps for iOS and Android. At the same time, tools like WordPress, Hostinger, Canva, and Figma now let non-technical users create simple apps using natural language prompts. Apple needs to modernize here, too. Ideally, new AI tooling should allow more developers to create apps and post them on the App Store. That would enrich the iOS app ecosystem and open up new revenue opportunities for Apple, which is even more critical now that some of its App Store income is under threat. Apple has made some announcements, but many have yet to materialize. Swift Assist, a coding assistant for Xcode, was shown off last year but hasn't seen wide release. Apple is also reportedly working on an Anthropic-powered AI coding tool and plans to open access to its own AI models for developers. The goal is to lower the barrier for building iOS apps, both for pros and newcomers.
However, there are two things to consider: the web's dominance as an application distribution platform and new regulations that bar Apple from charging fees in the U.S. for payments outside the app. The second part is a particularly big deal. In April, Judge Yvonne Gonzalez Rogers asked Apple to remove restrictions around linking to outside payment methods for digital purchases in apps for the U.S. App Store. More importantly, the ruling also barred Apple from charging any fees for these kinds of payments. On Wednesday, a U.S. court rejected Apple's appeal to put a stay on the ruling. This means developers will encourage customers to purchase subscriptions and add-ons outside the App Store, also possibly at a discounted rate compared with their App Store prices. This ruling could also spur other regulators to put similar pressure on Apple and cull App Store fees for third-party payments. Earlier this week, Apple reported that it generated $1.3 trillion in billings and sales in 2024, with 90% of that value generation happening without paying Apple a commission. But even some percentage of the remaining $130 billion means many billions in revenue for the company. Amid all this, Apple needs to reassert the value of its ecosystem. It's not clear if Apple will cut its commissions, but it will be interesting to see what kind of App Store features the company launches to make native iOS apps a more lucrative avenue for developers. As WWDC 2025 approaches, Apple is in the unusual position of having to share a better story. Its AI ambitions are being challenged not only by faster-moving competitors but also by changing legal and economic realities. To succeed, Apple has to demonstrate that it can deliver on AI, for end users and the developers who power its ecosystem. Especially in a world where AI accelerates everything, Apple can't afford to lag behind.
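The billings figures above reduce to back-of-the-envelope arithmetic, sketched below. Only the $1.3 trillion total and the 90% commission-free share come from the article; the 15% and 30% commission rates are assumptions for illustration, based on Apple's published small-business and standard tiers.

```python
def commissioned_revenue(total_billings: float, commission_free_share: float,
                         commission_rate: float) -> float:
    """Revenue Apple earns on the slice of billings that pays a commission."""
    commissioned = total_billings * (1.0 - commission_free_share)
    return commissioned * commission_rate

total = 1.3e12             # $1.3 trillion in 2024 billings and sales
free_share = 0.90          # 90% generated without paying Apple a commission
for rate in (0.15, 0.30):  # assumed commission tiers, for illustration
    revenue = commissioned_revenue(total, free_share, rate)
    print(f"at {rate:.0%} commission: ${revenue / 1e9:.1f}B")
```

Even at the lower assumed tier, the commissioned $130 billion slice would yield revenue in the tens of billions — which is why the third-party-payments ruling matters so much to Apple.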
