
Latest news with #IoTAnalytics

AI & edge technologies top Hannover Messe 2025 trends

Techday NZ

2 days ago


AI & edge technologies top Hannover Messe 2025 trends

New research from IoT Analytics highlights the top ten industrial technology trends observed at Hannover Messe 2025. The 111-page report, based on comprehensive analysis at the event, offers in-depth insights into the state and future direction of industrial automation, driven extensively by the integration of artificial intelligence and advanced digital operations.

The prominence of generative artificial intelligence within industrial software was a notable theme, with businesses embedding these capabilities into their digital platforms and manufacturing workflows. The report notes that agentic AI, technology capable of taking actions autonomously, has begun to emerge, though it remains at an early stage of development and adoption.

Edge technology also featured strongly among the key trends. Developments in edge-native design are enabling faster, more efficient data processing outside centralised cloud environments, contributing to greater responsiveness and improved operational security within industrial settings.

IoT Analytics identifies DataOps platforms as seeing increasing demand, with their role expanding up the technology stack. DataOps helps industrial companies better manage, integrate and operationalise their rapidly growing data resources, aligning collection, processing and insight generation with business objectives.

Another important development is the use of digital threads, connected digital records of a product's lifecycle, augmented by artificial intelligence. These are transforming design and engineering practices, allowing greater traceability and optimised decision-making throughout asset and production cycles.

Sensors and predictive maintenance are now more tightly integrated. The report observes a shift toward sensor-centric predictive maintenance systems, which are expanding into previously overlooked equipment asset classes, aiming to minimise downtime and enhance reliability.

The demand for private 5G networks across industry has risen, driven by the need for secure, scalable and reliable connectivity. However, IoT Analytics notes that integration remains a significant barrier for many organisations, slowing widespread adoption despite clear interest and value.

Sustainability efforts are evolving as companies introduce AI frameworks and platforms to track, manage and reduce environmental impact. Artificial intelligence is enabling more targeted approaches and generating actionable sustainability insights for manufacturers and industrial operators.

The concept of cognition in robotics is also advancing, with digital twins evolving from static virtual replicas to more dynamic, real-time industrial copilots. These AI-powered tools facilitate adaptive simulations, process planning and augmented production support on the factory floor.

"Industrial AI, edge-native architectures, and data-centric operations are the defining industrial technology trends in 2025. In our conversations with dozens of industrial tech vendors, most notably at Hannover Messe 2025, it has become clear that AI now dominates the industrial agenda. Vendors are racing to show meaningful progress, with many already moving beyond generative AI toward agentic workflows and more autonomous systems," said Knud Lasse Lueth, Chief Executive Officer at IoT Analytics.
"AI is transforming every layer of the industrial technology stack — spanning edge, dataops, software solutions, and cloud. It was clear from Hannover Messe 2025 that every major industrial player is placing bets on AI—not just as an enhancement but as the engine room of their manufacturing strategy. This included applications ranging from copilots and industrial foundation models to AI-native edge stacks and digital twins. However, to unlock the next wave of value from AI, industrial companies must move beyond basic AI assistants and focus towards deeper integration across the technology stack and develop industry-focused AI solutions," Harsha Anand, Senior Analyst at IoT Analytics, stated, expanding on the role of artificial intelligence throughout the sector. IoT Analytics reported an increased presence of edge technologies, DataOps applications, and AI-centred solutions at Hannover Messe, with many organisations demonstrating their current solutions in these areas for the first time. The research also highlights increasing collaboration within the technology ecosystem, as companies seek to accelerate industrial automation progress through partnership and knowledge sharing. The IoT Analytics team, comprised of 20 analysts onsite, collected data and conducted interviews to assess these trends and benchmark them across vendors and sectors. Their event report includes 34 detailed insights and 118 topic or vendor-specific examples, as observed during the event. The identified trends point towards a growing reliance on artificial intelligence as a fundamental operational component as manufacturers and industrial technology providers navigate new strategies for production, asset management, and process optimisation.

Neuromorphic Technology Poised for Hyper-Growth as Market Surges Over 45x by 2030

Yahoo

18-04-2025


Neuromorphic Technology Poised for Hyper-Growth as Market Surges Over 45x by 2030

Strategic Investments and R&D Fuel the Next Wave of Growth in Neuromorphic Computing

Dublin, April 18, 2025 (GLOBE NEWSWIRE) -- The "Neuromorphic Computing Market by Offering (Processor, Sensor, Memory, Software), Deployment (Edge, Cloud), Application (Image & Video Processing, Natural Language Processing (NLP), Sensor Fusion, Reinforcement Learning) - Global Forecast to 2030" report has been added to ResearchAndMarkets.com's offering. The neuromorphic computing market was worth approximately USD 28.5 million in 2024 and is estimated to reach USD 1.32 billion by 2030, growing at a CAGR of 89.7% between 2024 and 2030.

The demand for real-time data processing and decision-making capabilities in edge computing drives the adoption of neuromorphic computing. Applications in industrial automation, autonomous driving and monitoring increasingly need to process massive volumes of data in real time and make instantaneous decisions. Moreover, the semiconductor industry faces challenges in continuing to double the transistor count on ICs: miniaturization runs into current leakage, overheating and other quantum mechanical effects, driving the urgent need for alternative approaches such as neuromorphic technology to enhance computational power.
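As a quick sanity check on those headline figures, the implied growth multiple and compound annual growth rate follow directly from the two market-size estimates. A minimal calculation (ours, not the report's):

```python
# Sanity check of the report's headline growth figures (our calculation).
start_2024 = 28.5e6   # estimated market value, USD, 2024
end_2030 = 1.32e9     # forecast market value, USD, 2030
years = 6             # 2024 -> 2030

multiple = end_2030 / start_2024      # ~46.3x, consistent with "over 45x"
cagr = multiple ** (1 / years) - 1    # ~89.5%, in line with the reported 89.7%

print(f"growth multiple: {multiple:.1f}x")
print(f"implied CAGR:    {cagr:.1%}")
```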
Software segment to exhibit the highest CAGR during forecast period

The software segment is anticipated to record the highest CAGR in the neuromorphic computing market, as software allows live data streaming rather than static data, making it an attractive market for deep learning. Neuromorphic computing compares and analyses data and generates similar results if a new pattern matches existing patterns. Similarly, neuromorphic computing has an advantage in biometric pattern recognition, as it delivers real-time computation of patterns with high speed, accuracy and low power consumption.

The growing demand for edge devices and IoT sensors underscores the importance of energy efficiency in computing. These applications often involve large numbers of sensors and devices that must operate efficiently with minimal energy consumption, owing to their limited power resources and the need for prolonged battery life. According to analysis by IoT Analytics, the number of IoT connections could exceed 29 billion by 2027, driven by the growing dependence of various sectors on interconnected devices. Neuromorphic computing meets these requirements by minimizing the energy-intensive data movement between processing and memory that limits traditional von Neumann architectures, fuelling demand for neuromorphic computing.

Edge segment expected to have the highest share during the forecast period

The edge segment is expected to hold the highest share during the forecast period. Neuromorphic computing at the edge suits various applications. For instance, IoT devices that connect to the Internet can benefit from running code on the device itself rather than in the cloud for more efficient user interactions. Similarly, autonomous vehicles that need to react in real time, without waiting for instructions from a server, can benefit from neuromorphic computing at the edge. Medical monitoring devices that must respond in real time without waiting to hear from a cloud server would also benefit from its rapid response times. The increasing demand for real-time processing, low-latency responses and energy-efficient solutions across industries such as IoT, autonomous vehicles and medical devices will therefore drive the edge segment to dominate the neuromorphic computing market during the forecast period.

Image and video processing/computer vision segment to hold the largest share during the forecast period

Image and video processing/computer vision holds a major share of the neuromorphic computing market. The rise of smart cities is propelling the deployment of surveillance systems, increasing the need for real-time image analysis. According to the World Economic Forum, 1.3 million people move to cities every week around the globe, and by 2040, 65% of the world's population will live in cities. Today, 60% of the world's GDP comes from the 600 largest cities, and these figures can be expected to expand as those cities grow and thrive. Up to 80% of further growth in developing regions is projected to take place in urban centers. This rapid urbanization will drive demand for neuromorphic image and video processing, as cities require sophisticated applications to handle large amounts of visual data for surveillance, traffic management and infrastructure monitoring in crowded and complex urban environments.

Consumer electronics segment projected to hold the largest share during the forecast period

Consumer electronics will account for a higher share during the forecast period because of the high demand for smart, efficient, high-performance devices. Neuromorphic computing offers ultra-low power consumption and exceptional processing capabilities, critical to powering the next generation of consumer electronics. AI-driven features are increasingly being incorporated into consumer electronics; the most effective of these are image and speech recognition tools. Neuromorphic chips enable devices to process complex tasks locally, reducing reliance on cloud computing and enhancing user privacy and real-time performance. For example, services such as Alexa and Siri, which currently rely heavily on cloud computing, would benefit directly from the deployment of neuromorphic chips, lowering latency and making these AI assistants far more responsive.

North American market expected to hold the largest share during the forecast period

North America will occupy the largest share during the forecast period, as the presence of prominent technology providers, such as IBM (US), Intel Corporation (US), Qualcomm Technologies, Inc. (US), Advanced Micro Devices, Inc. (US), Hewlett Packard Enterprise Development LP (US) and OMNIVISION (US), contributes to the market's growth in the region. These firms are researching and developing neuromorphic chips and AI solutions, keeping the region at the innovation front in technology. Increased government spending over the years to address concerns over the security of critical infrastructure and sensitive data has driven the adoption of neuromorphic chipsets in security applications.
High consumerization of personal care products, routine-checkup medical tools and wearable devices is boosting the adoption of neuromorphic computing devices in North America, thereby driving the growth of this market.

Report Coverage

This research report categorizes the neuromorphic computing market based on offering, deployment, application, vertical and region. It describes the major drivers, restraints, challenges and opportunities pertaining to the neuromorphic computing market and forecasts them to 2030. The report also includes leadership mapping and analysis of all the companies covered in the neuromorphic computing market report.

The report provides insights on the following pointers:

  • Analysis of key drivers (expanding cyber threats; the surge in data generation necessitating robust and scalable security solutions capable of handling large volumes of sensitive information) influencing the growth of the neuromorphic computing market.
  • Product Development/Innovation: Detailed insights on upcoming technologies, research & development activities, and new product & service launches in the neuromorphic computing market.
  • Market Development: Comprehensive information about lucrative markets; the report analyses the neuromorphic computing market across varied regions.
  • Market Diversification: Exhaustive information about new products & services, untapped geographies, recent developments, and investments in the neuromorphic computing market.
  • Competitive Assessment: In-depth assessment of market shares, growth strategies, and service offerings of leading players like Intel Corporation (US), IBM (US), Qualcomm Technologies, Inc. (US), Samsung Electronics Co., Ltd. (South Korea), Sony Corporation (Japan), among others in the neuromorphic computing market.

Key Attributes

  • No. of Pages: 259
  • Forecast Period: 2024-2030
  • Estimated Market Value (USD) in 2024: $28.5 Million
  • Forecasted Market Value (USD) by 2030: $1.32 Billion
  • Compound Annual Growth Rate: 89.7%
  • Regions Covered: Global

Market Dynamics

Drivers:
  • Rising Adoption of Neuromorphic Hardware
  • Need for Alternative Approaches to Enhance Computational Power
  • Growing Application of AI and ML
  • Increasing Demand for Real-Time Data Processing and Decision-Making Capabilities

Restraints:
  • Lack of R&D Investments
  • Complexity of Algorithms
  • Shortage of Educational Resources and Training Opportunities

Opportunities:
  • Ability to Withstand Harsh Conditions of Space
  • Increasing Adoption in Healthcare Sector
  • Ability to Automate Complex Decision-Making Processes in Cybersecurity Operations
  • Integration of Neuroplasticity into Neuromorphic Computing

Challenges:
  • Complications Associated with Software Development
  • Complexities Linked with Developing Computational Models

Case Study Analysis

  • Intel Labs offered the Lava neuromorphic framework to Concordia University, which optimized hyperparameters for large-scale problems.
  • Intel Labs and Cornell University collaborated to train Intel's Loihi neuromorphic chip to identify hazardous chemicals based on their scents.
  • TU/e and Northwestern University implemented neuromorphic biosensors capable of on-chip learning that improved efficiency and accuracy.

Additional Insights Covered

  • Trends/Disruptions Impacting Customer Business
  • Pricing Analysis
  • Value Chain Analysis
  • Ecosystem Analysis
  • Investment and Funding Scenario
  • Technology Analysis
  • Patent Analysis
  • Trade Analysis
  • Key Conferences and Events, 2025-2026
  • Regulatory Landscape
  • Key Stakeholders and Buying Criteria

Companies Featured

  • Intel Corporation
  • IBM
  • Qualcomm Technologies, Inc.
  • Samsung Electronics Co., Ltd.
  • Sony Corporation
  • BrainChip, Inc.
  • SynSense
  • MediaTek Inc.
  • NXP Semiconductors
  • Advanced Micro Devices, Inc.
  • Hewlett Packard Enterprise Development LP
  • OMNIVISION
  • Innatera Nanosystems B.V.
  • General Vision Inc.
  • Applied Brain Research, Inc.
  • Numenta
  • Aspinity
  • Natural Intelligence
  • GrAI Matter Labs
  • Microchip Technology Inc.
  • MemComputing, Inc.
  • Cognixion
  • Neuropixels
  • SpiNNcloud Systems
  • Polyn Technology

For more information about this report visit ResearchAndMarkets.com.

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

CONTACT: Laura Wood, Senior Press Manager, press@
For E.S.T Office Hours Call 1-917-300-0470
For U.S./CAN Toll Free Call 1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

Enterprise IT's Inflection Point: How IT And IoT Are Shaping A New Era

Forbes

14-04-2025


Enterprise IT's Inflection Point: How IT And IoT Are Shaping A New Era

Charles Yeomans is the chairman and founder of Atombeam.

Change is a constant in IT: change driven by new innovations, new use cases, new problems and new solutions. Over time, advancements from mainframe computers to the evolution of PCs to the rise of virtualization and the supremacy of the cloud, and even the move from desktops to mobile devices, fundamentally altered the very notion of enterprise IT. Most CIOs are accustomed to this ever-changing reality, with many having cut their teeth when businesses looked to IT to oversee internal systems and a corporate, on-premises data center. With the hub-and-spoke model, responsibilities increased. Then the cloud changed everything.

The cloud's secret weapon transformed enterprise IT: its almost magical ability to automate previously weighty tasks, from adding compute power to spinning up additional storage capacity, was a game-changer. Not everything got easier, though. The distributed organizations the cloud helped make possible called for inherently more complex IT infrastructure. And it didn't take long for organizations to realize they still needed to keep some data under lock and key within their own walls. Organizations that previously looked to get out of the "data center business" altogether responded with hybrid approaches and single-tenant private clouds, even as many oversaw the building of new in-house data centers and the provisioning of the mission-critical systems within them. But even as many IT leaders again looked inward, the edge of the network continued to expand.

This brings us to today, when enterprise networks are more complex than ever and IT teams, even those armed with greater visibility and control, have more to do than ever before. And that is before taking into account two truly transformative computing trends: AI and the burgeoning Internet of Things. With these technologies, we have entered a new era, one that will impact enterprise IT like never before.

Enterprise IT teams today must oversee a highly distributed computing environment marked by unprecedented complexity. Not only is the edge of the network no longer bound by the constraints of a wired network, but the Internet of Everything is growing as satellite and mobile networks connect a dizzying array of devices, from autonomous vehicles and drones to appliances and sensors of every kind. Machine-generated data is increasing exponentially, with IoT Analytics estimating there are now 18.8 billion connected IoT devices globally. And that is not even factoring in what is arguably the most transformative computing trend of our time: generative AI, which has upended entire industries while forcing everyone to consider its disruptive potential in less than three years of public use.

Both trends pose immediate challenges to enterprise IT teams, who must account for, plan for and address their significant impact on the most fundamental computing tasks. Heavy, data-intensive AI workloads are dramatically straining networks and using an unprecedented amount of power, a fact prompting hyperscalers to look to nuclear reactors to power their data centers. Notably, power consumption is particularly high in use cases that require low latency and near-real-time communications. AI workloads are also overwhelming the pipes, satellite and cellular networks that enterprise IT relies on. In contrast, the machine-generated data shared by sensors and other small devices that comprise the IoT is typically lightweight but often burdens infrastructure with the continual ping of shared information. In both cases, networks, including the data centers within them, the pipes that feed them and the connections between machines, are increasingly bogged down.
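That continual ping is partly a question of how each message is encoded: for the lightweight, high-frequency telemetry typical of IoT sensors, verbose text formats can make the field names, rather than the data, the bulk of the traffic. A minimal sketch of the difference (our illustration; the record layout and values are hypothetical):

```python
import json
import struct

# Hypothetical telemetry record from a low-power sensor (illustrative only).
reading = {"device_id": 4211, "temp_c": 21.4, "humidity": 0.52, "ts": 1744600000}

# Verbose text encoding: repeats every field name in every message.
as_json = json.dumps(reading).encode()

# Fixed-layout binary encoding: the schema is agreed in advance, so only
# the values travel (unsigned int, float, float, unsigned int).
as_binary = struct.pack("!IffI", reading["device_id"], reading["temp_c"],
                        reading["humidity"], reading["ts"])

print(len(as_json), len(as_binary))  # roughly 70 bytes vs 16 bytes per message
```

Multiplied across billions of devices reporting many times a minute, encoding choices of this kind are one lever for relieving bogged-down networks.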
The data deluge that made companies like EMC some of the most profitable in history ultimately drove many enterprises to the cloud, in part for its elasticity. Massive AI datasets upend that dynamic even in hyperscale environments, where data center operators are running out of capacity in their facilities and across their clouds. At the edge of the IoT, many endpoints—98% of them, according to Palo Alto Networks—are unprotected, because devices like low-power sensors lack the computational power and memory that encryption algorithms require, leaving bad actors with an open door.

In light of these factors, the responsibilities that have for decades defined IT's estate—to provide sufficient compute power, networking speed and effectiveness, storage capacity and data protection—are increasingly untenable using traditional approaches and means. For years, we have turned to more powerful hardware to provide and manage the IT infrastructure needed to handle the ever-growing amount of data we create. Those efforts will, by necessity, continue as we work to address the demands of AI workloads and the dramatic growth of machine-generated data. In response, leading innovators are creating faster chips and more powerful processors, even as quantum computing emerges. Simultaneously, a satellite network and data center building boom is underway, even as the aforementioned turn to nuclear reactors reveals very real concerns about how data centers will be powered.

Time, however, may be the greatest challenge: in one projection for data growth, McKinsey sees demand potentially reaching 298 gigawatts by 2030, a reality it notes will require "twice the data center capacity created since 2000 to be achieved in one quarter of the time."

Given this, one can argue that no single approach or response will effectively address the reality that, with the growth of AI and machine-generated data, workloads have outpaced the infrastructure resources they require. We are at an inflection point that requires us to accept that the need for faster data processing and storage is not going away. For that reason, it is imperative that we take a step back and think not only about the additional infrastructure capacity we can create, but also about what we can do differently, from bringing information closer to the edge to changing the very notion of how data is configured and how it can be moved, stored and protected.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.
