
20 Tech Experts On Emerging Hardware Trends Businesses Must Watch
In an era shaped by AI adoption, rising end-user expectations and tightening privacy regulations, IT leaders are reevaluating not only what hardware their organizations need, but also where it should reside and how it should be deployed. Below, members of Forbes Technology Council share key hardware strategies designed to deliver the flexibility, security and cost efficiency modern enterprises require.
1. AI-Embedded Hardware Security At The Edge
AI-embedded hardware security at the edge is becoming essential. By integrating intelligent processing directly into devices—servers, endpoints and storage—companies can achieve real-time, autonomous security; reduce latency; and protect privacy without cloud dependence. This hardware-native AI trend will be critical for secure, scalable operations in the near term. - Camellia Chan, Flexxon
2. Inference-Optimized Hardware
We're seeing a shift toward inference-optimized hardware—systems designed specifically for running, not training, AI models. As model deployment scales, general-purpose GPUs waste energy and rack space. Purpose-built accelerators with high memory bandwidth utilization will be essential for cost-effective, real-time AI. - Thomas Sohmers, Positron AI
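To make the memory-bandwidth point concrete, here is a rough back-of-envelope sketch of why single-stream decode throughput is bounded by how fast weights can be streamed from memory rather than by raw compute. The parameter count, weight precision and bandwidth figure are illustrative assumptions, not benchmarks of any particular accelerator.

```python
# Rough, illustrative arithmetic (not a benchmark): single-stream LLM decode
# is usually bounded by memory bandwidth, because every generated token needs
# the model's weights streamed through the memory system.

def tokens_per_second_ceiling(params_billions: float,
                              bytes_per_param: float,
                              bandwidth_gb_per_s: float) -> float:
    """Upper bound on decode throughput for one sequence, ignoring the KV
    cache, batching and any compute/transfer overlap."""
    bytes_per_token = params_billions * 1e9 * bytes_per_param
    return (bandwidth_gb_per_s * 1e9) / bytes_per_token

# Example: a 70B-parameter model with 8-bit weights on hardware offering
# ~3 TB/s of memory bandwidth (figures are assumptions, not vendor specs).
print(f"{tokens_per_second_ceiling(70, 1.0, 3000):.0f} tokens/s ceiling")
```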
3. Disaggregated Infrastructure
Disaggregated infrastructure is rising fast—separating compute, storage and memory lets companies scale AI workloads efficiently. Paired with smart NICs and GPUs, it's the backbone for low-latency, high-throughput architectures in tomorrow's data centers. - Sai Krishna Manohar Cheemakurthi, U.S. Bank
4. Field-Deployed Edge AI Accelerators
Edge AI accelerators are gaining traction—particularly in the insurance industry, where devices that can analyze claims locally are deployed in field adjusters' kits. These devices slash cloud costs while preserving privacy. Key benefits include TOPS/watt efficiency, hardware-encrypted data pipelines, and precertification for IEC 62304 medical-grade reliability. The future is distributed intelligence. - Srinath Chandramohan, EY
5. Hybrid Cloud Infrastructure
Hybrid cloud—combining locally hosted servers and public cloud providers—is a trending configuration. This strategy can help businesses reduce costs while maintaining the flexibility to scale when needed. - Anto Joseph, Eigen Labs
6. Edge-Enabled Safety Infrastructure
As a public safety company, we're seeing increasing demand for edge-enabled safety infrastructure—devices like smart panic buttons, mobile gateways and compact edge processors that can locally process video, audio or wellness data before syncing with cloud-based platforms. This reduces latency, enables real-time decision-making in emergencies, and enhances privacy by limiting data exposure. - Kevin Mullins, SaferMobility
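As a loose illustration of the "process locally, sync selectively" pattern described above, the sketch below flags an anomalous reading on the device and uploads only that window. The endpoint URL, threshold and heart-rate samples are hypothetical placeholders, not SaferMobility's implementation.

```python
import json
import statistics
import urllib.request

ALERT_ENDPOINT = "https://example.com/alerts"  # hypothetical cloud endpoint

def is_anomalous(samples: list[float], z_threshold: float = 3.0) -> bool:
    """Flag the newest reading if it sits far outside the recent baseline."""
    if len(samples) < 10:
        return False
    baseline = samples[:-1]
    mean = statistics.mean(baseline)
    spread = statistics.stdev(baseline) or 1e-9  # avoid division by zero
    return abs(samples[-1] - mean) / spread > z_threshold

def sync_event(samples: list[float]) -> None:
    """Upload only the flagged window; the raw stream never leaves the device."""
    payload = json.dumps({"event": "anomaly", "window": samples[-10:]}).encode()
    request = urllib.request.Request(
        ALERT_ENDPOINT, data=payload,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(request, timeout=5)

# On the edge device: process every reading locally, sync only on an event.
heart_rate = [72, 71, 73, 70, 72, 71, 74, 72, 73, 140]
if is_anomalous(heart_rate):
    sync_event(heart_rate)
```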
7. AI-Enabled Edge Computing
A key trend is integrating edge computing with AI, enabling real-time data processing near the source. This reduces latency and bandwidth, which is crucial for IoT and smart systems. Advances in processors and AI chips facilitate local data analysis, enhancing decision-making and efficiency. Adopting this trend can help companies in data-driven industries cut costs, improve performance and stay competitive. - Gautam Nadkarni, Wipro
8. Localized Foundation Model Deployment
Enabling hardware to run localized foundation models is key. Current foundation models and large language models require significant infrastructure and pose security risks due to generalization and centralized data processing. New approaches deploy personalized models on small devices like watches or phones, enabling secure local processing and paving the way for personalized AI assistants. - Abhijeet Mukkawar, Siemens Digital Industries Software
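A minimal sketch of what on-device inference can look like, assuming a small model has already been quantized and exported to ONNX and that the onnxruntime package is available. The model file name and input shape are placeholders, not any vendor's deployment pipeline.

```python
import numpy as np
import onnxruntime as ort

# Load the quantized model with a local execution provider; inference happens
# entirely on the device, so personal data never leaves local storage.
session = ort.InferenceSession("personal_model.onnx",          # placeholder file
                               providers=["CPUExecutionProvider"])

input_name = session.get_inputs()[0].name
local_features = np.random.rand(1, 128).astype(np.float32)     # stand-in for on-device data

outputs = session.run(None, {input_name: local_features})
print("local prediction:", outputs[0])
```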
9. On-Site Edge Computing In Factories And Telecom Sites
A growing trend is building edge computing setups equipped with GPUs or AI chips close to where data is generated. The advantages include reduced delays, because information is processed locally; bandwidth savings; and scalability. These reliable, flexible and modular hardware configurations allow factories and telecom sites to run AI-powered tasks on site, enabling faster responses, stronger data protection and more efficient workload management. - Maman Ibrahim, EugeneZonda Cyber Consulting Services
10. Data Center Layouts Built For AI
Local compute is making a comeback. Everyone chased the cloud—until inference costs punched them in the face. We used to fight over RAM; now it's NVMe lanes, PCIe bandwidth and power delivery. Welcome to the AI hardware wars! But AI-native workloads demand rack design, not just chip choice. If your data center layout hasn't changed since 2015, you're not ready. - Mirror Tang, ZEROBASE
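For a sense of scale, the rough calculation below uses nominal PCIe 5.0 link rates, ignoring protocol overhead beyond line encoding, to show how few Gen5 NVMe drives it takes to saturate a single x16 slot; real-world throughput will be lower.

```python
# Back-of-envelope illustration of why bus and rack design now matter as much
# as chip choice: a handful of modern NVMe drives can saturate a host's link.

PCIE5_GT_PER_LANE = 32          # GT/s per lane, PCIe 5.0
ENCODING = 128 / 130            # 128b/130b line encoding

def pcie_gbytes_per_s(lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe 5.0 link."""
    return PCIE5_GT_PER_LANE * ENCODING / 8 * lanes

host_link = pcie_gbytes_per_s(16)   # a typical x16 slot
drive = pcie_gbytes_per_s(4)        # a Gen5 x4 NVMe drive at line rate

print(f"x16 link : ~{host_link:.0f} GB/s per direction")
print(f"x4 drive : ~{drive:.0f} GB/s")
print(f"drives to saturate the link: ~{host_link / drive:.0f}")
```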
11. Energy-Optimized Hardware
We're in the early stages of a shift toward energy-optimized hardware. Organizations are investing in renewable-powered data centers to meet ESG goals and reduce their carbon footprints. - Ohm Kundurthy, Santander Bank
12. Accelerated Compute AI Clusters
I'm seeing accelerated compute AI clusters doing double duty for both training and inference. The push toward agentic and multimodal AI requires significant processing power to solve complex problems and advance AI autonomy. - Steven Carlini, Schneider Electric
13. Modular Hardware Setups
One significant trend is the shift to modular hardware setups, such as servers and storage, which allow for scaling up or down as needed. This lets companies add power or space without a full rebuild, making it easier to keep up with changing needs and control costs. It's a flexible approach that's quickly becoming standard. - Ganesh Ariyur, Gainwell Technologies
14. Edge Computing With NPUs And Specialized Chips
One essential hardware trend is AI-accelerated edge computing. By processing data closer to its source with neural processing units and specialized chips, companies reduce latency, improve privacy and enable real-time decision-making. As AI becomes core to operations, edge intelligence will be critical for speed, scalability and resilience. - Rishit Lakhani, Nile
15. Heterogeneous Hardware Compatibility
Adopting heterogeneous hardware compatibility and mixed-hardware serving is essential. This enables the flexible use of diverse hardware types—GPUs, CPUs and ASICs—across generations and vendors, boosting capacity utilization and cutting costs. It supports scalable AI workloads by running models efficiently on mixed hardware fleets, increasing agility and sustainability. - Pooja Jain, Meta (Facebook)
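The sketch below is one hypothetical way to express mixed-hardware serving: each request goes to the least expensive pool that still has headroom and meets its latency budget. Pool names, latencies and capacities are illustrative assumptions, not a description of Meta's serving stack.

```python
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    latency_ms: float      # typical per-request latency on this hardware
    capacity: int          # concurrent requests the pool can absorb
    in_flight: int = 0

    def has_headroom(self) -> bool:
        return self.in_flight < self.capacity

# Ordered from most to least expensive hardware (illustrative numbers).
POOLS = [
    Pool("gpu-current-gen", latency_ms=20, capacity=64),
    Pool("gpu-prev-gen", latency_ms=45, capacity=128),
    Pool("cpu-fleet", latency_ms=180, capacity=512),
]

def route(latency_budget_ms: float) -> Pool | None:
    """Pick the cheapest pool (later in the list) that still meets the budget."""
    for pool in reversed(POOLS):
        if pool.has_headroom() and pool.latency_ms <= latency_budget_ms:
            pool.in_flight += 1
            return pool
    return None  # queue or shed load when nothing fits

print(route(latency_budget_ms=200).name)   # -> cpu-fleet
print(route(latency_budget_ms=50).name)    # -> gpu-prev-gen
```

Routing batch or latency-tolerant traffic to older or cheaper hardware is what lifts fleet-wide utilization; the newest accelerators are reserved for the requests that actually need them.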
16. Privacy-Driven Edge Computing
A growing hardware trend is edge computing—processing data closer to the user instead of relying entirely on the cloud. It's becoming essential for real-time decision-making in privacy-sensitive environments. For example, in AdTech, edge setups enable brands to deliver faster, more compliant, personalized ads without sacrificing speed or data security. - Ivan Guzenko, SmartyAds Inc.
17. Hybrid CPU-GPU Architectures
A clear short-term trend is the adoption of hybrid CPU-GPU architectures optimized for AI and data analytics workloads. It's important to understand that AI is no longer optional—it must be integrated into workflows. These architectures improve performance without requiring full infrastructure replacement, helping companies balance cost and efficiency. - David Barberá Costarrosa, Beeping Fulfilment
18. Chip-Level Security Integration
A key hardware trend is the integration of security at the silicon level—such as trusted platform modules, secure enclaves and hardware-based authentication. With rising cyberthreats and remote workforces, companies must adopt hardware that enforces zero-trust principles from the chip up to protect sensitive data and systems. - Raj Jhaveri, Greenlane™ Infrastructure
19. On-Device NPUs
One essential hardware trend is the adoption of neural processing units on personal devices. Newer PCs and devices come equipped with NPUs to handle AI workloads efficiently—on the device. As AI becomes integral to everyday workflows, devices without these chips risk falling behind in performance and capability. - Tarun Eldho Alias, Neem Inc.
20. Heterogeneous Compute
One key trend is the move to heterogeneous compute—combining CPUs, GPUs and AI accelerators—to handle growing machine learning workloads. Traditional CPUs can't keep up with large models. Adopting specialized hardware like H100s, faster interconnects and memory-rich nodes is essential for faster training, cost efficiency and staying competitive in the AI era. - Karan Alang, Versa Networks Inc.
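A minimal sketch of writing workload code that runs across heterogeneous hardware, assuming PyTorch is installed: prefer an available accelerator and fall back to CPU. The linear layer stands in for a real model.

```python
import torch

def pick_device() -> torch.device:
    """Prefer a CUDA GPU, then Apple's MPS backend, then fall back to CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
model = torch.nn.Linear(1024, 1024).to(device)      # stand-in for a real model
batch = torch.randn(32, 1024, device=device)
with torch.no_grad():
    out = model(batch)
print(f"ran on {device}, output shape {tuple(out.shape)}")
```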
Goldie Chan knows how to put herself out there. Chan, the founder and head of content for branding agency Warm Robots, is also an author, a frequent keynote speaker and a LinkedIn Top Voice. Still, she describes herself as an introvert at heart — which presented a challenge in growing her career. "I think a lot of people underestimate introverts in the workplace," she says. According to Chan, building a personal brand was key to her success. Earlier in her career, Chan worked "completely behind the scenes" in various marketing and social media roles, she says. In 2017, she began posting short videos about pop culture branding and marketing on LinkedIn, which was beta-testing its video feature at the time. Becoming a content creator wasn't initially her plan: "I really thought, I'm just going to do this until I get my next full-time job," Chan recalls. While in between roles, she posted over 800 consecutive daily videos — a pace she doesn't necessarily recommend "unless you never want to sleep again" — and quickly attracted a large audience. Today, she has over 100,000 followers on LinkedIn, and that platform has helped her land new roles, obtain a book deal and share her expert advice as a Forbes contributor. "I grew my own personal brand through a lot of consistency and hard work," she says. Chan writes about how introverts can level up their careers in her upcoming book "Personal Branding for Introverts," which debuts in October. Whether you know it or not, "everyone has a personal brand," Chan says. If the idea of a personal brand sounds intimidating, just think of it as "something in your career that people know you for," whether that's your penchant for public speaking, affinity for colorful blazers or your humorous LinkedIn posts. Developing a strong personal brand can help introverts bring their talents to the forefront and boost their careers, she says: "It allows you to shape the story that other people are telling about you." Still sound scary? Chan hears that a lot. "I've had so many introverts who have come up to me and said, 'I'm so terrified of growing my personal brand because I don't want to put myself out there,'" she says. Chan herself used to feel the same way. To combat her social anxiety, she challenged herself to speak with a new person at her local coffee shop each day for a month. Just one small, consistent action can help you break out of your comfort zone, she says. If you're nervous about reaching out to professional connections, commit to sending one networking email a month, Chan says. Trying to grow an online presence? Post one update a week. "The more you do it regularly, the easier it is to keep going," she says. Keep in mind that your personal brand doesn't just exist online. According to Chan, participating in social events like hobby groups or networking meetups can also contribute to building your brand. Your unique interests help set you apart, she says, and you might be surprised at the opportunities that arise. "A personal brand is never created in isolation," Chan says. "Being involved in groups that interest you, even if they're not directly related to your career, can actually help with your career." She describes personal brands as the "hub in the center of spokes." "All these different spokes are all the different things that you do, and they all connect back to a central hub — so whatever small things you do are always going to help your overall personal brand," she says.