
Latest news with #AIworkloads

AKAM, Aptum Team Up to Streamline Cloud Adoption: Stock to Gain?

Yahoo

2 days ago

  • Business
  • Yahoo

AKAM, Aptum Team Up to Streamline Cloud Adoption: Stock to Gain?

Akamai Technologies, Inc. AKAM recently announced that it has joined forces with Aptum, a leading managed hybrid cloud services provider. Aptum offers tailored infrastructure design and implementation, consulting and managed services that help organizations solve critical IT challenges. The partnership between AKAM and Aptum will primarily focus on mitigating complexities in cloud transition processes, expediting cloud-based application development and ensuring cost efficiency, as organizations place greater emphasis on cloud repatriation and sustainable cloud operations.

Businesses are increasing their investment in cloud platforms that can support generative AI workloads. Hybrid cloud systems are gaining traction, and edge computing and applications with low latency requirements are also driving demand for the cloud. Per Precedence Research, the cloud computing market is projected to grow from $912.77 billion in 2025 to $5,150.92 billion in 2034, a compound annual growth rate of 21.2%. With a combination of innovation and strategic collaboration, Akamai is steadily expanding its portfolio to capitalize on this emerging market trend.

Akamai's cloud optimization solutions help organizations improve performance, increase availability and enhance the security of applications and key web assets delivered from data centers to the end user. The Akamai Connected Cloud is also witnessing healthy demand trends. The company is expanding the Akamai Partner Program to extend support to its existing client base, and Aptum has recently opted to join the program.

Growing demand for cloud infrastructure services is driving net sales in the Compute segment. In the second quarter of 2025, Akamai registered $171.4 million in segment revenues, up from $151.5 million in the prior-year quarter. Per our estimate, the company is projected to generate $725 million in segment revenues in 2025, indicating year-over-year growth of 15%.

Stocks to Consider

Ubiquiti Inc. UI delivered an earnings surprise of 61.29% in the last reported quarter. Ubiquiti spends significantly on research and development activities, developing innovative products and state-of-the-art technology to expand its addressable market and remain at the cutting edge of networking technology. The company believes its new product pipeline will help increase average selling prices for high-performance, best-value products, thus raising the top line. Ubiquiti is witnessing healthy traction in the Enterprise Technology segment.

Jabil, Inc. JBL delivered an earnings surprise of 9.44% in the last reported quarter. Jabil's focus on end-market and product diversification is a key catalyst. The company's target that no product or product family should account for more than 5% of operating income or cash flows in any fiscal year is commendable. This initiative should position Jabil well on its growth trajectory.

Motorola Solutions, Inc. MSI delivered an earnings surprise of 6.8% in the trailing four quarters. The company expects to record strong demand across video security and services, land mobile radio products and related software while benefiting from a solid foundation. MSI intends to boost its position in the public safety domain by entering into strategic alliances with other players in the ecosystem. Motorola's VB400 body-worn cameras are increasingly being deployed across the globe to boost the security of police officers.

Want the latest recommendations from Zacks Investment Research? Today, you can download 7 Best Stocks for the Next 30 Days. Click to get this free report.

Akamai Technologies, Inc. (AKAM) : Free Stock Analysis Report
Jabil, Inc. (JBL) : Free Stock Analysis Report
Motorola Solutions, Inc. (MSI) : Free Stock Analysis Report
Ubiquiti Inc. (UI) : Free Stock Analysis Report

This article originally published on Zacks Investment Research.
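As a quick sanity check on the growth figures quoted above, the forecast and segment numbers can be reproduced with a few lines of Python; every input below is taken from the article, and only the round-trip arithmetic is added here:

```python
# Precedence Research forecast: $912.77B (2025) growing to $5,150.92B (2034).
start, end, years = 912.77, 5150.92, 2034 - 2025
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~21.2%, matching the quoted rate

# Akamai Compute segment: $171.4M in Q2 2025 vs. $151.5M a year earlier.
q2_growth = 171.4 / 151.5 - 1
print(f"Q2 segment growth: {q2_growth:.1%}")  # ~13.1% year over year
```

Both results line up with the article's stated 21.2% CAGR and the double-digit segment growth driving the Compute story.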

MSI Showcases DC-MHS and MGX Server Platforms for Cloud-Scale and AI Infrastructure at OCP APAC 2025

Yahoo

05-08-2025

  • Business
  • Yahoo

MSI Showcases DC-MHS and MGX Server Platforms for Cloud-Scale and AI Infrastructure at OCP APAC 2025

TAIPEI, Aug. 5, 2025 /PRNewswire/ -- At OCP APAC 2025 (Booth S04), MSI, a global leader in high-performance server solutions, presents modular server platforms for modern data center needs. The lineup includes AMD DC-MHS servers for 21" ORv3 and 19" EIA racks, built for scalable and energy-efficient cloud infrastructure, and an NVIDIA MGX-based GPU server optimized for high-density AI workloads such as LLM training and inference. These platforms demonstrate how MSI is powering what's next in computing with open, modular, and workload-optimized infrastructure.

"Open and modular infrastructure is shaping the future of compute. With OCP-aligned and MGX-based platforms, MSI helps customers reduce complexity, accelerate scale-out, and prepare for the demands of cloud-native and AI-driven environments," said Danny Hsu, General Manager of MSI's Enterprise Platform Solutions.

AMD DC-MHS Platforms for Cloud Infrastructure

Powered by a single AMD EPYC™ 9005 processor and up to 12 DDR5 DIMM slots per node, MSI's DC-MHS open compute and core compute platforms deliver strong compute performance and high memory bandwidth to meet the demands of data-intensive and parallel workloads. Built on the modular OCP DC-MHS architecture and equipped with DC-SCM2 management modules, these systems offer cross-vendor interoperability, streamlined integration, and easier serviceability, making them ideal for modern, scalable infrastructure in hyperscale and cloud environments.

The CD281-S4051-X2 targets 21" ORv3 rack deployments with 48Vdc power, featuring a 2OU 2-node design and EVAC cooling that supports up to 500W TDP per node. With 12 E3.S PCIe 5.0 NVMe bays per node, it offers high-density, front-access storage for throughput-heavy applications.

The CD270-S4051-X4 fits into a standard 2U 4-node 19" EIA chassis, maximizing compute density for environments with limited rack space. Supporting up to 400W air-cooled or 500W liquid-cooled CPUs, and equipped with front-access U.2 NVMe bays, it's built for flexible deployment across general-purpose and scale-out workloads.

NVIDIA MGX AI Server for Scalable AI Workloads

Built on the NVIDIA MGX modular architecture, the CG480-S5063 is optimized for large-scale AI workloads with a 2:8:5 CPU:GPU:NIC topology. It supports dual Intel® Xeon® 6 processors and up to eight 600W FHFL dual-width GPUs, including the NVIDIA H200 NVL and RTX PRO 6000 Blackwell Server Edition. With 32 DDR5 DIMMs and 20 PCIe 5.0 E1.S bays, it delivers high compute density, fast storage, and modular scalability for next-gen AI infrastructure.

SOURCE MSI

SNIA Announces Storage.AI

Associated Press

04-08-2025

  • Business
  • Associated Press

SNIA Announces Storage.AI

SANTA CLARA, Calif.--(BUSINESS WIRE)--Aug 4, 2025-- SNIA, a not-for-profit global organization for technologies related to handling and optimizing data, today announced Storage.AI, an open standards project for efficient data services related to AI workloads. Storage.AI will focus on industry-standard, non-proprietary, and neutral approaches to solving AI-related data problems to optimize the performance, efficiency, and cost-effectiveness of AI workloads.

This press release features multimedia: 'Improving storage technologies for all AI workloads.'

Initial industry leaders who have signed on to Storage.AI include AMD, Cisco, DDN, Dell, IBM, Intel, KIOXIA, Microchip, Micron, NetApp, Pure Storage, Samsung, Seagate, Solidigm, and WEKA. In addition to SNIA's substantial membership, the project will work to build broad ecosystem support with SNIA's partners, including UEC, NVM Express®, OCP, OFA, DMTF, SPEC, and others.

AI workloads are extraordinarily complex and constrained by issues related to latency, space, power and cooling, memory, and cost. Addressing these problems through an open industry initiative is the fastest path to optimization and adoption. SNIA's 25+ year track record of developing industry standards, together with its technical accomplishments to accelerate, store, unify, and optimize the compute, movement, and performance of data, makes it uniquely positioned to lead the Storage.AI project.

'The unprecedented demands of AI require a holistic view of the data pipeline, from storage and memory to networking and processing,' said Dr. J Metz, SNIA Chair. 'No single company can solve these challenges alone. SNIA's Storage.AI provides the essential, vendor-neutral framework for the industry to coordinate a wide range of data services, building the efficient, non-proprietary solutions needed to accelerate AI for everyone.'
The project will create an open ecosystem for efficient data services to address the most difficult challenges related to AI workloads, closing current gaps in processing and accessing data. The founding members share common goals.

Industry Response

AMD: 'The enterprise technologies powering today's most demanding AI workloads are only as effective as the data they can access,' said Robert Hormuth, corporate vice president, Data Center Solutions Group, AMD. 'As a member of Storage.AI, we're excited to collaborate with other leaders to give the industry a neutral, open path to fix data bottlenecks and unlock true AI performance.'

Cisco: 'Storage networking has long been a foundational pillar of our customers' digital infrastructure. As AI continues to expand across the enterprise, integrating robust storage solutions with AI workloads has become increasingly important. In today's AI-driven landscape, data sets our customers apart, and the performance and reliability of that data are essential to realizing the full benefits of AI,' said Jeremy Foster, SVP and GM, Cisco Compute.

DDN: 'As AI continues to reshape every industry, the demand for open, scalable, and intelligent data infrastructure has never been greater,' said Santosh Erram, VP of Public Cloud Alliances at DDN. 'At DDN, we're proud to support SNIA's Storage.AI initiative, an important step toward fostering collaboration, driving interoperability, and setting open standards that enable the AI ecosystem to thrive. By working together, we can unlock faster innovation and more meaningful outcomes for customers and partners across the AI landscape.'

Dell: 'To fully realize the potential of AI, enterprises must transform raw data into structured, governed knowledge. Storage and data engines play a pivotal role in this journey, powering data pipelines from training and continuous learning to agentic outcomes.
These engines must evolve rapidly, embracing innovation across flash media, drives, networking, compute and foundational software. Dell Technologies is proud to collaborate with SNIA to advance the technologies and standards that make this transformation possible,' said Rajesh Rajaraman, CTO and VP, Dell Storage, Data and Cyber Resilience.

IBM: 'Effective data management is fundamental to the performance, quality, and cost-efficiency of AI projects. AI-optimized storage plays a critical role in enabling this, and the industry urgently needs a consistent methodology and interface for consuming AI-optimized storage. IBM is proud to collaborate with SNIA in advancing these standards and solutions,' said Vincent Hsu, IBM Fellow, CTO and VP for IBM Storage.

Intel: 'AI workloads are redefining the boundaries of compute and data infrastructure. Storage.AI offers a critical, vendor-neutral foundation that unites innovation across the ecosystem, from storage architecture to memory hierarchy, helping to ensure scalable, efficient solutions purpose-built for the AI era,' said Ronak Singhal, Intel Senior Fellow, Xeon Products.

Microchip: 'Microchip is pleased to support Storage.AI, the new open standard project under SNIA focused on optimizing AI data solutions,' said Brian McCarson, corporate vice president of Microchip's data center solutions business unit. 'Advancing features and capabilities in both our storage and flash controllers is key to providing solutions for complex AI-related data workloads across both large language model and inference-specific hardware architectures.'

Micron: 'AI workloads demand unprecedented levels of data throughput, efficiency and scalability,' said Karthik Ganesan, Fellow, Storage Solutions Architecture at Micron. 'As the leaders in memory and storage products for AI workloads, enabling standards to ensure interoperability is critical to our ability to stay on the leading edge of delivering new solutions like our PCIe Gen6 SSDs.
By building together using an open approach, we can speed up innovation and create the data infrastructure AI needs to thrive.'

NetApp: 'AI workloads demand unprecedented efficiency, scalability, and performance from storage solutions,' said Ed Fiore, Vice President and Fellow, Chief Systems Architect at NetApp. 'NetApp is committed to delivering innovative storage technologies that optimize AI data pipelines, ensuring seamless data access and management. By joining SNIA's Storage.AI initiative, we are excited to collaborate with industry leaders to develop open standards that drive interoperability and unlock the full potential of AI. Together, we will accelerate AI advancements and create a robust ecosystem that benefits everyone.'

Pure Storage: 'Storage.AI highlights the role of data infrastructure and management as organizations evolve, interoperate, and scale for an AI future,' said Rob Lee, Pure Storage CTO. 'Pure Storage has long led the storage industry in championing open standards and deep ecosystem integration. As the industry tackles AI-related data challenges to drive real outcomes, Pure Storage sees the foundation in seamless data movement across environments, policy-aware storage orchestration, and a clear focus on ensuring data sovereignty is never a strategic landmine. Artificial intelligence is only as powerful as the data storage platform beneath it, and the future of AI demands modern, scalable infrastructure that is built for flexibility, performance, and governance.'

Samsung: 'As AI workloads require faster and more efficient data processing, a collaborative approach is required to address the storage challenges that come with it,' said Leno Park, Vice President of Storage Solutions Product Planning, Samsung Electronics. 'We are excited to join the SNIA Storage.AI initiative and contribute our expertise in memory and storage solutions to help develop industry-standard, non-proprietary approaches that optimize AI performance and efficiency.'
Seagate: 'Hard drive storage is critical to unlocking the full value of data in AI workloads, delivering scalable, efficient and cost-effective solutions that support global innovation,' said Jason Feist, Senior Vice President of Cloud Marketing at Seagate Technology. 'By advancing open standards through Storage.AI, Seagate is helping to enable infrastructure growth that meets the increased demand for performance required by the next-gen technology ecosystem.'

Solidigm: 'Solidigm is excited to be a founding supporter of the next big data project from SNIA,' said Greg Matson, Senior Vice President and Head of Products and Marketing, Solidigm. 'Data is the fuel of the AI engine, and we have found, working with our customers, that SSDs are ideal for the AI pipeline, helping reduce power consumption and physical footprint with greater capacity. We look forward to working with SNIA and our industry partners to help advance AI and storage together in this evolving ecosystem.'

WEKA: 'AI is pushing infrastructure to its limits, creating efficiency, power, and scale challenges that require industry-wide solutions. Building on SNIA's legacy of advancing storage standards, WEKA is excited to join the Storage.AI initiative to help develop open standards that will benefit the entire AI ecosystem,' said Ajay Singh, chief product officer at WEKA. 'We've long believed that AI needs a new class of data architecture, one that scales with the workload, without compromising performance or resiliency. By working with stakeholders across the entire infrastructure stack, we can better address these core technical challenges and unlock AI's potential to transform the future of innovation.'

Join Storage.AI

The power and impact of industry cooperation in developing open standards are proven. Technical work is beginning now.
Companies and industry organizations interested in participating in the groundbreaking project should contact [email protected].

About SNIA

SNIA is a not-for-profit global organization made up of corporations, universities, startups, and individuals. The members collaborate to develop and promote vendor-neutral architectures as well as international standards and specifications. SNIA promotes technologies related to the storage, transport, optimization of infrastructure, acceleration, format, and protection of data.

Inquiries: [email protected]

SOURCE: SNIA Copyright Business Wire 2025. PUB: 08/04/2025 11:30 AM/DISC: 08/04/2025 11:29 AM

CoreWeave Q2 Earnings Preview: What to Expect From Upcoming Report

Yahoo

04-08-2025

  • Business
  • Yahoo

CoreWeave Q2 Earnings Preview: What to Expect From Upcoming Report

Aug 4 - CoreWeave (NASDAQ:CRWV) will release its second-quarter 2025 earnings after the market closes on Tuesday, Aug. 12. The report could highlight how the GPU-focused cloud provider is navigating rapid growth and market volatility.

Analysts forecast a loss of $0.20 per share, with revenue expected to come in around $1.08 billion. The company has become a key infrastructure partner for artificial intelligence workloads, serving clients such as Microsoft (NASDAQ:MSFT), Meta Platforms (NASDAQ:META), International Business Machines (IBM), and OpenAI.

Since its March IPO at $40 per share, CoreWeave stock climbed to $187 before sliding about 39% amid market caution, analyst downgrades, and its proposed acquisition of Core Scientific (CORZ). Despite the pullback, the stock still trades at roughly 19.6 times forward sales.

CoreWeave posted $981.6 million in first-quarter revenue, up 420% year-over-year, with non-GAAP operating income of $162.6 million. Net loss widened to $314.6 million, or $1.49 per share, as heavy infrastructure spending offset revenue gains. For Q2, the company projects revenue of $1.06 billion to $1.10 billion and adjusted operating income of $140 million to $170 million. Full-year revenue guidance now stands at $4.9 billion to $5.1 billion, signaling continued momentum in AI-driven cloud services despite ongoing losses.

This article first appeared on GuruFocus.
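The figures in the article are internally consistent, which a short Python sketch makes explicit; the inputs are all quoted above, and the derived values (share count, post-pullback price, market cap) are implied estimates, not reported numbers:

```python
# Cross-check the CoreWeave figures quoted in the article.

# A net loss of $314.6M at $1.49 per share implies the share count used.
implied_shares_m = 314.6 / 1.49
print(f"Implied shares outstanding: ~{implied_shares_m:.0f}M")  # ~211M

# A ~39% slide from the $187 peak implies the recent price level.
implied_price = 187 * (1 - 0.39)
print(f"Implied post-pullback price: ~${implied_price:.0f}")  # ~$114

# 19.6x forward sales on the ~$5.0B guidance midpoint implies market cap.
implied_mcap_b = 19.6 * (4.9 + 5.1) / 2
print(f"Implied market cap: ~${implied_mcap_b:.0f}B")  # ~$98B
```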

Synapse Power Opens Contributor Access to Build the Future of Decentralized AI Compute

Associated Press

02-08-2025

  • Business
  • Associated Press

Synapse Power Opens Contributor Access to Build the Future of Decentralized AI Compute

Dubai, United Arab Emirates, August 2, 2025 -- Synapse Power invites global participants to contribute computing power to its GPU network, powering the next phase of AI infrastructure with real-world utility and transparent participation.

Synapse Power, a UK- and UAE-based AI infrastructure provider, has announced the opening of contributor access to its global GPU network, inviting individuals and organizations to support and participate in the growing demand for decentralized compute power. As artificial intelligence continues to evolve at record speed, the need for scalable, secure, and performance-audited infrastructure has never been more urgent. Synapse Power is addressing this need through its high-performance compute platform, deployed across Tier-3+ data centers in the US and EU. Now, it's extending an opportunity for others to contribute computing resources to that network.

'AI infrastructure shouldn't be limited to hyperscalers or closed ecosystems,' said a Synapse Power spokesperson. 'We're building a performance-first platform where contributors can play a real role in powering the next wave of intelligent systems—transparently, sustainably, and with purpose.'

A New Way to Contribute Value

Participants in the contributor program can allocate their GPU power or support node infrastructure expansion, becoming part of the engine behind AI workloads ranging from scientific research to enterprise LLM training. This is not a token sale or speculative campaign; it's a real, infrastructure-backed initiative. Whether you're a GPU owner, infrastructure provider, or a tech-forward partner seeking to engage with future AI infrastructure, Synapse Power is offering a structured way to plug in and contribute to meaningful compute cycles.

Built for Scalability, Designed for Trust

Unlike traditional cloud platforms or unproven blockchain miners, Synapse Power operates on audited infrastructure and performance data, not allocations or idle speculation.
Every node is verified. Every session is metered. Every contributor can see how their resources are used in real time. And as Synapse expands into new markets, including Germany, Singapore, LATAM, and beyond, the contributor model will form a core part of how global compute access becomes decentralized and democratized.

Get Involved

This program marks the beginning of Synapse Power's larger vision to redefine how computing power is sourced, distributed, and rewarded. Those interested in contributing to this foundational layer of AI infrastructure can register to learn more.

About Synapse Power

Synapse Power AI LLC is a next-generation AI infrastructure company delivering performance-based GPU computing to power AI, ML, and real-time data operations. Through its global network and sustainability-driven architecture, Synapse enables scalable and verifiable compute access for startups, researchers, and enterprises worldwide.

Contact Info:
Name: Media Team
Organization: SYNAPSE POWER AI LLC.
Release ID: 89166254
