Latest news with #datacompression


Forbes
23-06-2025
- Science
- Forbes
Six Ways To Advance Modern Architecture For AI Systems
View of the clouds reflected in the curved glass of an office building (3D rendering).

These days, many engineering teams are coming up against a common problem – basically, the models are too big. The problem comes in various forms, but there's often a connecting thread to the challenges. Projects are running up against memory constraints. As parameter counts range into the billions and trillions, data centers have to keep up. Stakeholders have to watch for thresholds in vendor services. Cost is generally an issue. However, there are new technologies on the horizon that can take that memory footprint and compute burden and reduce them to something more manageable. How are today's innovators doing this? Let's take a look.

Input and Data Compression

First of all, there is the compression of inputs. You can design a lossy compression algorithm to shrink the model, and even run a compressed model against the full one to compare results; compression methodologies are saving space when it comes to specialized neural network functions. Here's a snippet from a paper posted at Apple's Machine Learning Research resource:

'Recently, several works have shown significant success in training-free and data-free compression (pruning and quantization) of LLMs achieving 50-60% sparsity and reducing the bit-width down to 3 or 4 bits per weight, with negligible perplexity degradation over the uncompressed baseline.'

That's one example of how this can work (a minimal sketch of the underlying quantization mechanics appears at the end of this article). This Microsoft document looks at prompt compression, another way of shrinking or reducing the data that moves through these systems.

The Sparsity Approach: Focus and Variation

Sometimes you can carve away part of the system design in order to save resources. Think about a model where all of the attention areas work the same way. But maybe some of the input is basically white space, while the rest of it is complex and relevant. Should the model's coverage be homogeneous or one-size-fits-all? You're spending the same amount of compute on high- and low-attention areas. Alternatively, the people engineering these systems can remove the tokens that don't get a lot of attention, based on what's important and what's not (a toy sketch of this kind of token pruning appears a little later in this article).

In this part of the effort, you're seeing hardware advances as well. More specialized GPUs and multicore processors can have an advantage when it comes to this kind of differentiation, so take a look at everything that makers are doing to usher in a whole new class of GPU gear.

Changing Context Strings

Another major problem with network size is related to the context windows that systems use. For a typical large language system operating on a sequence, the length of that sequence matters. More context enables certain kinds of functionality, but it also requires more resources. By changing the context, you change the 'appetite' of the system. Here's a bit from the above resource on prompt compression:

'While longer prompts hold considerable potential, they also introduce a host of issues, such as the need to exceed the chat window's maximum limit, a reduced capacity for retaining contextual information, and an increase in API costs, both in monetary terms and computational resources.'

Directly after that, the authors go into solutions that might, in theory, have broad application to different kinds of fixes.
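The resource quoted above describes prompt compression only at a high level. As a purely illustrative sketch, not the method from that document or any production tool, here is a tiny Python heuristic that trims a prompt toward a token budget by dropping common filler words and keeping the most recent context; the filler list and the budget are arbitrary assumptions.

```python
# Toy prompt "compression": drop common filler words and trim to a token budget.
# A naive illustration only; real prompt-compression methods are far more careful
# about which tokens can be removed without changing the meaning.

FILLER = {"the", "a", "an", "of", "to", "and", "or", "that", "very", "really",
          "basically", "just", "in", "on", "for", "with"}

def compress_prompt(prompt: str, max_tokens: int = 64) -> str:
    tokens = prompt.split()
    # Keep tokens that carry content; fall back to the full prompt if everything is filler.
    kept = [t for t in tokens if t.lower().strip(".,:;") not in FILLER] or tokens
    # If still over budget, keep the most recent context (the end of the prompt).
    if len(kept) > max_tokens:
        kept = kept[-max_tokens:]
    return " ".join(kept)

if __name__ == "__main__":
    long_prompt = "Please summarize the following report in a very concise way. " * 20
    short_prompt = compress_prompt(long_prompt, max_tokens=32)
    print(len(long_prompt.split()), "->", len(short_prompt.split()), "tokens")
```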
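And to make the earlier sparsity discussion concrete: below is a toy sketch of attention-based token pruning, where tokens that receive little attention are dropped before further processing. The shapes, the scoring rule, and the keep ratio are illustrative assumptions, not a description of any specific production system or GPU feature.

```python
import numpy as np

# Toy attention-based token pruning: drop tokens that receive little attention.
# Purely illustrative; shapes and the keep ratio are arbitrary assumptions.

def prune_tokens(hidden, attn, keep_ratio=0.5):
    """hidden: (seq_len, dim) token embeddings; attn: (seq_len, seq_len) attention weights."""
    # Score each token by how much attention it receives, averaged over all queries.
    scores = attn.mean(axis=0)                      # (seq_len,)
    k = max(1, int(len(scores) * keep_ratio))
    keep = np.sort(np.argsort(scores)[-k:])         # indices of the top-k tokens, in order
    return hidden[keep], keep

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq_len, dim = 16, 8
    hidden = rng.normal(size=(seq_len, dim))
    logits = rng.normal(size=(seq_len, seq_len))
    attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)  # row-wise softmax
    pruned, kept = prune_tokens(hidden, attn, keep_ratio=0.5)
    print(f"kept {pruned.shape[0]} of {seq_len} tokens:", kept)
```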
Dynamic Models and Strong Inference

Here are two more big trends right now. One is the emergence of strong inference systems, where the machine teaches itself what to do over time based on its past experience. Another is dynamic systems, where the input weights and everything else change over time rather than remaining the same. Both of these show some promise for helping to match the design and engineering needs that people have when they're building these systems.

There's also the diffusion model, where you add noise, analyze it, and remove that noise to come up with a new generative result. We talked about this last week in a post about the best ways to pursue AI. Last, but not least, we can evaluate traditional approaches such as digital twinning. Twinning is great for precise simulations, but it takes a lot of resources – if there's a better way to do something, you might be able to save a lot of compute that way.

These are just some of the solutions that we've been hearing about, and they dovetail with the idea of edge computing, where you're doing more on an endpoint device at the edge of a network. Microcontrollers and small components can be a new way to crunch data without sending it through the cloud to some centralized location. Think about all of these advances as we watch more of what people are doing these days with AI.
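One more note on the compression theme from earlier in the article: the Apple excerpt mentions cutting bit-width down to 3 or 4 bits per weight. As a rough, generic illustration of the mechanics, not the specific training-free methods from that research, here is a minimal Python sketch of symmetric 4-bit post-training quantization with per-row scales; the scaling scheme and matrix sizes are simplifying assumptions.

```python
import numpy as np

# Minimal sketch of symmetric 4-bit weight quantization (illustrative only).
# Each row gets its own scale so large-magnitude and small-magnitude rows
# are handled separately.

def quantize_int4(weights: np.ndarray):
    """Quantize a float weight matrix to integer values in [-8, 7] with per-row scales."""
    scales = np.abs(weights).max(axis=1, keepdims=True) / 7.0   # per-row scale factor
    scales[scales == 0] = 1.0                                   # avoid divide-by-zero
    q = np.clip(np.round(weights / scales), -8, 7).astype(np.int8)
    return q, scales

def dequantize(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
    return q.astype(np.float32) * scales

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.02, size=(256, 512)).astype(np.float32)
    q, s = quantize_int4(w)
    w_hat = dequantize(q, s)
    print(f"mean absolute quantization error: {np.abs(w - w_hat).mean():.6f}")
    # 4 bits per weight vs. 32 bits is roughly an 8x reduction in weight storage,
    # ignoring the small overhead of the per-row scales.
```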

Associated Press
11-06-2025
- Business
- Associated Press
Initial Order Received from Vietnamese Maritime Security and Defense Services for Advanced Video Compression Solution: RMX (Reticulate Micro, Inc.) Symbol: RMXI
$RMXI Proprietary Platform Delivers High-Impact Results in Ultra-Low-Bandwidth Environments, Enabling Real-Time Video Where Traditional Solutions Fail

RMX (Reticulate Micro, Inc.) is a technology company specializing in advanced data compression and video optimization. Leveraging proprietary, field-validated technology that has demonstrated exceptional performance in the most demanding environments, RMX is transforming the way organizations capture, transmit, store, and share visual data. Originally developed for mission-critical military applications, RMX's platform reduces video bandwidth, storage needs, and power consumption by up to 50% — all without compromising quality, and it functions across any network or hardware infrastructure. As data becomes a foundational asset across Defense, AI, Cloud, and Enterprise Ecosystems, RMX is uniquely positioned to lead the next generation of intelligent, efficient data compression solutions in a rapidly digitizing world.

The Company's 2024 was defined by rapid technological advancements and rigorous market validation of its flagship VAST platform. Originally developed for government and defense applications, RMXI VAST delivered high-impact results in ultra-low-bandwidth environments, enabling real-time video where traditional solutions fail. Multiple military exercises and test events (over 20 in 2024) have proven RMXI VAST's ability to stream video on any communication band, including narrow-band SATCOM, L-Band and even legacy HF networks where streaming video was thought to be unviable.

Building on its success in tactical video solutions for defense applications, RMXI is expanding its enterprise-level offerings for government and commercial customers. RMXI plans to release significant platform enhancements in early 2025, including advanced dynamic bandwidth management and adaptive encoding capabilities. Additionally, through its commercial joint venture launching in Q1 2025, RMXI is positioned to bring its revolutionary technology to global commercial markets. The RMXI 2025 strategy focuses on a platform-first roadmap designed to drive innovation, optimize performance, and scale our compression technologies across critical data environments.

Government Sector Partnership & OEM Integration

Strategic Partner Program: RMXI is launching a comprehensive integration program enabling partners and system integrators to embed VAST technology directly into their product lines and existing government contract vehicles.

Major Exercises & Evaluations: VAST has already been validated within U.S. Special Operations, multiple U.S. Army and Naval initiatives, and a range of international defense programs. This year's schedule includes high-profile exercises with U.S. SOCOM, Army, and Navy, as well as several invitations specifically created for VAST testing and evaluation.

Commercial Scale via Joint Venture (RMX and CRISP)

Global Managed Services: RMXI and K2 Endeavor DMCC formed RMX as a joint venture to commercialize and scale VAST through a newly developed solution, CRISP (Compressed Rate Intelligent Streaming Protocol). Purpose-built for commercial and enterprise sectors from telecommunications to AI, CRISP delivers next-generation data compression to meet the demands of high-performance digital infrastructure.

Recurring Revenue Model: RMX's global managed service structure is designed to offer predictable, long-term revenue while solving critical video-delivery challenges for enterprise customers.
Expanded Product Suite

VAST Video Encoder: The core software-based video encoder product will see the addition of key features including audio, KLV metadata support, video tele-conferencing, and region-of-interest encoding.

VAST Vue: Cross-platform (Windows, Linux, Mac OS, Android, iOS) video player for seamless end-to-end experiences, including support for tactical metadata.

VAST SDK: Development tools and APIs for advanced integration of core VAST capabilities into partner products and solutions.

VAST Cloud: VAST-as-a-Service cloud offering for enterprise solutions, including live streaming, transcoding and VOD (video on demand) services.

RMXI to Deliver VAST's Video Compression Technology to Strategic U.S. Partner

On June 4th, RMXI announced it received an initial order from its Asia-Pacific partner TEKSEA Technology Joint Stock Company for VAST™ to supply a critical operational need of the Vietnamese maritime security and defense services. TEKSEA is engaged with the Vietnamese government to design and deploy a satellite-based camera surveillance system, which includes equipping vessels with video encoding and compression systems to enable real-time transmission of surveillance footage from vessels to a command center via satellite connectivity. The ability to stream real-time video and images from cameras and sensors installed onboard ships to command centers on shore is critical to improving situational awareness and responding quickly to threats at sea. RMXI VAST is a software-based video encoder capable of streaming high-definition video at extremely low bitrates – HD at just 200 Kbps and SD at under 50 Kbps. It can run on virtually any computing hardware without requiring specialized equipment. VAST's video compression and low SWaP-C make it ideal for solutions like TEKSEA's satellite-based camera surveillance system.

GSA Schedule Placement, Enhancing Government Market Access

On May 12th, RMXI announced its VAST™ product line is now available to government agencies through a partnership with Dfuse Technologies, Inc. under Dfuse's GSA Multiple Award Schedule (MAS). This strategic collaboration enables federal, state, and local government entities to procure RMXI VAST™ solutions at pre-negotiated pricing and terms, streamlining the acquisition process and accelerating deployment timelines. Being on the GSA Schedule offers companies a significant advantage when doing business with the U.S. government. It streamlines the procurement process by providing a pre-approved contracting vehicle, allowing federal agencies to purchase products and services more quickly and efficiently without going through lengthy bid procedures. This not only reduces administrative burden but also increases a company's visibility and credibility among government buyers.

Acquisition of Remaining RMX Industries Inc. Shares to Consolidate Operations and Expand Leadership Team

On April 23rd, RMXI announced that it has acquired the remaining shares of RMX Industries Inc., the Company's 50/50 joint venture with K2 Endeavor DMCC, through a stock exchange transaction, making it a wholly owned subsidiary. The move consolidates operations and aligns resources to accelerate commercial business opportunities across key sectors. As part of this strategic consolidation, RMXI also announced important changes to its governance and management teams:
Kirchof has been appointed as an independent member of the RMXI board of directors. Mr. Kirchof has over 20 years of experience building entrepreneurial companies and driving innovation in healthcare technology. Mr. Kirchof is the CEO of CureGrail, Inc., a healthcare technology company engaging and empowering patients to own and manage their disease, and founder and CEO of RxPath, LLC, a healthcare transaction company. Mr. Kirchof previously co-founded Matrix Oncology and served in executive, sales management, marketing, and strategy leadership positions at iKnowMed, Inc., MedStat Group, and IBM Healthcare.

Karl Kit has been appointed as RMXI Chief Executive Officer, President, and a member of the board of directors. Mr. Kit is a seasoned entrepreneur with over 40 years of international business experience across advertising, communications, mobile data services, and financial technology. Mr. Kit previously headed K2 Endeavor DMCC, a UAE-based strategic investment group.

Andrew Sheppard has been appointed as President of RMX Government. Mr. Sheppard will lead RMXI's efforts in the defense and government sectors, focusing on deploying its cutting-edge video and data compression technologies to meet the mission-critical needs of military and public safety customers. Mr. Sheppard previously served as the Company's Chief Executive Officer and President.

Maxwell Kit has been appointed as RMXI Chief Marketing Officer. Mr. Kit has extensive experience in global brand strategy, digital engagement, and go-to-market execution. As CMO, he will lead RMX's marketing and communications efforts, refine the RMXI brand identity, and expand market visibility. Mr. Kit will oversee the launch of integrated campaigns that highlight the transformative power of RMXI's proprietary compression technology across defense, AI, telecommunications, and enterprise sectors.

DISCLAIMER: Disclosure listed on the CorporateAds website

Media Contact
Company Name: Reticulate Micro, Inc.
Contact Person: Reticulate Micro Media Relations
Phone: 866-70 MICRO
Address: 4220 Duncan Ave, Suite 201
City: St. Louis
State: Missouri 63110
Country: United States
Source: CAP, LLC