Plainsight unveils OpenFilter to simplify vision AI pipelines

Techday NZ | 22-05-2025

Plainsight has launched OpenFilter, an open source project designed to simplify and accelerate the development, deployment, and scaling of production-grade computer vision applications.
OpenFilter is available under the Apache 2.0 licence and is designed to help enterprises build, deploy, and manage vision AI pipelines using modular, reusable components, referred to as "filters". These filters combine code and AI models into building blocks for assembling custom vision pipelines.
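To make the "filter" idea concrete, the minimal Python sketch below shows how such modular, chainable steps might look. It illustrates the pattern the article describes and is not OpenFilter's actual API; the Filter base class, the Grayscale example step, and the run_pipeline helper are hypothetical stand-ins.

```python
# Illustrative sketch of the "filter" pattern: small, reusable steps chained
# into a pipeline. These classes are hypothetical, not OpenFilter's API.
from abc import ABC, abstractmethod
from typing import Any, Dict, Iterable, List

import cv2


class Filter(ABC):
    """One reusable processing step, optionally wrapping an AI model."""

    @abstractmethod
    def process(self, frame: Any, meta: Dict[str, Any]) -> Dict[str, Any]:
        """Consume a frame and return (possibly enriched) metadata."""


class Grayscale(Filter):
    """Example step: attach a grayscale copy of the frame to the metadata."""

    def process(self, frame, meta):
        meta["gray"] = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return meta


def run_pipeline(frames: Iterable[Any], filters: List[Filter]):
    """Push each frame through the chain of filters, in order."""
    for frame in frames:
        meta: Dict[str, Any] = {}
        for step in filters:
            meta = step.process(frame, meta)
        yield frame, meta
```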
The project aims to address key challenges that organisations face when implementing AI-driven computer vision in production environments, such as cost, scalability, and the complexity of infrastructure integration.
Priyanshu Sharma, Senior Data Engineer at BrickRed Systems, explained the practical benefits seen in manufacturing and logistics implementations. "OpenFilter has revolutionised how we deploy vision AI for our manufacturing and logistics clients. With its modular filter architecture, we can quickly build and customise pipelines for tasks like automated quality inspection and real-time inventory tracking, without having to rewrite core infrastructure. This flexibility has enabled us to deliver robust, scalable solutions that meet our clients' evolving needs, while dramatically reducing development time and operational complexity," Sharma said.
Plainsight claims that OpenFilter's features - including frame deduplication and priority scheduling - lower GPU inference costs, while its abstractions are intended to shorten deployment timelines from weeks to days. The system's extensible architecture is designed to future-proof investments, supporting computer vision today while allowing adaptable extensions for audio, text, and multimodal AI use cases.
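Frame deduplication is one common way such savings are achieved: near-identical consecutive frames are dropped before they reach a GPU model. The sketch below shows a simple difference-based version of the technique in Python; it is a generic illustration, not necessarily how OpenFilter implements it, and the threshold value is arbitrary.

```python
# Minimal sketch of frame deduplication: skip near-identical consecutive
# frames before expensive inference. Generic technique; threshold is arbitrary.
import cv2
import numpy as np


def dedup_frames(frames, threshold: float = 2.0):
    """Yield only frames that differ meaningfully from the previous frame."""
    prev_gray = None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            # Mean absolute pixel difference as a cheap change detector.
            diff = float(np.mean(cv2.absdiff(gray, prev_gray)))
            if diff < threshold:
                prev_gray = gray
                continue  # near-duplicate: skip it
        prev_gray = gray
        yield frame
```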
OpenFilter aims to bridge a common gap in computer vision adoption, where projects can stall due to fragmented tooling and difficulties in scaling from prototype to production. The platform includes several features: a core runtime available as open source, pre-built filters for tasks such as object tracking and image segmentation, and a pipeline management system that can handle various video inputs like RTSP streams, webcams, and image files. It enables routing of processed data to destinations including databases, MQTT brokers, or APIs.
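As a rough illustration of that input-to-output flow, the snippet below reads frames from an RTSP stream (or a webcam) with OpenCV and publishes a per-frame result to an MQTT broker. The stream URL, broker address, and topic are placeholders, the analysis step is a stub, and it assumes opencv-python and paho-mqtt 2.x are installed; it is not OpenFilter code.

```python
# Sketch of input/output routing: read frames from an RTSP stream or webcam
# and publish per-frame results to MQTT. URLs, broker, and topic are placeholders.
import json

import cv2
import paho.mqtt.client as mqtt

SOURCE = "rtsp://camera.local/stream"   # or 0 for the default webcam
BROKER, TOPIC = "localhost", "vision/results"

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0
client.connect(BROKER, 1883)

cap = cv2.VideoCapture(SOURCE)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Placeholder analysis step; a real pipeline would run its filters here.
    result = {"mean_brightness": float(frame.mean())}
    client.publish(TOPIC, json.dumps(result))

cap.release()
client.disconnect()
```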
The system is designed to support deployment across a wide range of hardware, from CPUs and GPUs to edge devices, allowing for resource optimisation in different environments. OpenFilter supports broad model integration, letting users deploy models built with frameworks such as PyTorch and OpenCV, or custom models based on architectures like YOLO, without vendor lock-in.
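As an example of that kind of framework-level integration, the hedged sketch below loads a YOLOv5 model through PyTorch Hub and runs it on a single image read with OpenCV. It shows generic PyTorch and OpenCV usage rather than any OpenFilter-specific API, and assumes the torch, opencv-python, and pandas packages plus network access for the first model download.

```python
# Generic model integration sketch: a YOLOv5 model via PyTorch Hub applied to
# an OpenCV image. Not OpenFilter-specific; file path is a placeholder.
import cv2
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

frame = cv2.imread("example.jpg")                    # any BGR image on disk
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)         # YOLOv5 expects RGB
results = model(rgb)
detections = results.pandas().xyxy[0]                # boxes, scores, labels
print(detections[["name", "confidence"]])
```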
Typical use cases for OpenFilter span a variety of sectors. In manufacturing, the platform can be used for automated quality inspection, defect detection, and fill-level monitoring. Retailers and food service operations may use it for drive-through analytics or inventory tracking, while logistics operators could automate vehicle tracking or workflow processes. Additional applications include precision agriculture, surveillance, people counting, and event detection for IoT and edge environments.
Andrew Smith, CTO of Plainsight, commented on the broader aim for OpenFilter's architecture. "Filters are the building blocks for operationalising vision AI," Smith said. "Instead of wrestling with brittle pipelines and bespoke infrastructure, developers can snap together reusable components that scale from prototypes to production. It's how we make computer vision feel more like software engineering - and less like science experiments."
Chris Aniszczyk, CTO of CNCF, endorsed the open source nature of OpenFilter, saying, "OpenFilter is a leap forward for open source, giving developers and data scientists a powerful, collaborative platform to build and scale computer vision AI. Its modular design and permissive Apache 2.0 license make it easy to adapt solutions for everything from agriculture and manufacturing to retail and logistics, helping organisations of all types and sizes unlock the value of vision-based AI."
Kit Merker, CEO of Plainsight, described the broader ambition for OpenFilter in the industry. "OpenFilter is the abstraction the AI industry has been waiting for. We're making it possible for anyone - not just experts - to turn camera data into real business value, faster and at lower cost," Merker said.
"By treating vision workloads as modular filters, we give developers the power to build, scale, and update applications with the same ease and flexibility as modern cloud software. This isn't just about productivity, it's about democratising computer vision, unlocking new use cases, and making AI accessible and sustainable for every organisation. We believe this is the foundation for the next wave of AI-powered transformation."
Plainsight has made OpenFilter available to the public under the Apache 2.0 licence and offers an Early Access Programme for enterprises interested in a commercial version of the platform.

Related Articles


In the AI gold rush, data centre infrastructure vendors are selling shovels

Techday NZ | 22-05-2025

As Computex in Taiwan wraps, the prominent presence of infrastructure vendors on the floors is notable, with a preponderance of liquid cooling technology on display. It is, of course, one thing to demonstrate a PC or even a rack chilled by neon-lit plumbing, and quite another to reticulate water supply and piping throughout a data centre. Seeing the hardware necessary for the latter is fascinating, and it raises questions about the challenges involved in, among other things, retrofitting existing facilities with cooling systems capable of handling the growing heat generation associated with the high-performance computing sitting behind AI.

Schneider Electric Secure Power Division VP Pankaj Sharma sat down with Techday and explained that yes, the mechanical components of a Cooling Distribution Unit (that's the pumps, heat exchangers, and other physical, mechanically engineered bits) are relatively straightforward to build. At scale is where the magic happens. "Precision in managing temperature variability through firmware and software is what sets high-quality systems apart," he said. "And in AI data centres, where racks loaded with for example NVIDIA GPUs can cost millions, even slight temperature deviations can damage hardware or reduce efficiency, leading to significant financial losses."

He explained that dynamically controlling coolant temperature and flow to maintain optimal conditions for high-density compute is paramount. "This requires firmware to monitor and adjust parameters in real-time, ensuring stability and efficiency. For example, maintaining water temperatures within a tight range (as little as ±1°C) is critical for liquid-cooled systems supporting AI workloads."

Now, data centres were always expressly designed to keep water out, because water doesn't mix well with computers. That simple first principle has been turned on its head as increased heat generation leaves no choice but to pipe it in, along with additives like ethylene glycol, good old antifreeze.

Again, Sharma summoned the challenge of scale, saying that delivering at scale is what sets Schneider Electric apart from the multitude of cooling systems prominently in neon at Computex. "Many companies can produce mechanical cooling systems, but few have the expertise to integrate advanced software/firmware for precise thermal management," he said. "This is where companies like Motivair, with a decade of experience in supercomputer data centres, stand out."

While it had progressed some work on its own liquid cooling solutions, Sharma said Schneider Electric's recent acquisition of Motivair is an effective leapfrog over competitors, as the company comes with decades of experience in delivering air and liquid cooling in demanding data centre environments. Cray Supercomputer was mentioned – now there's a blast from the past. Sharma explained that Motivair's chops extend to material science for pipelines and connectors capable of resisting corrosion, leaks, and degradation over time, any of which could knock out data centre infrastructure. "Scaling liquid cooling across diverse global environments (different climates, power grids, and facility designs) demands expertise and most competitors lack this depth of experience," he said.

In fact, as AI data centres flourish like mushrooms on a global scale, Sharma agreed that a high-profile mechanical – or hydraulic – failure owing to insufficient heat management is quite likely. That's because the challenges are expressly mechanical, even to the extent of floor loading: redesigned brownfield data centres supporting higher-density computing must also support literally denser racks. Water is heavy, necessitating more robust racks and even stronger floors.

What is certain is that in the AI gold rush, and the emergence of the data centre as the 'AI factory', there will be winners and losers. That's a consistent feature of every stampede in the direction of certain riches. On the Witwatersrand of AI, companies like Schneider Electric might not be digging out the lumps of yellow stuff directly, but without their expertise and tooling, nobody else can either.

Donovan Jackson is attending Computex as the guest of Schneider Electric. AI must live somewhere, and among other things, Schneider Electric makes data centre infrastructure.

Hot chips, cool solutions: powering the AI revolution

Techday NZ | 20-05-2025

Strap yourselves in, fellow travellers, for today we are talking data centre infrastructure. Over at Computex in Taipei, Schneider Electric, which makes the physical stuff where AI lives (power, cabinets, UPS, cooling systems), delivered the lowdown on the role and challenges facing what is probably the most fundamental component of the revolution sweeping through, well, everything today. Those challenges largely revolve around increasing compute density and the necessity for appropriate cooling and housing. And in turn, that means out with the bulky finned heatsinks consuming excessive rackspace, and in with liquid cooling.

Now, it is hard to get very excited about data centre infrastructure, in much the same way as being enthralled with the foundations of Taipei 101 is an unlikely prospect. But that doesn't make those foundations any less crucial, for without them the tower wouldn't stand.

At a swanky press conference, Schneider Electric Secure Power VP Pankaj Sharma contextualised the challenge by noting the astonishingly rapid adoption of AI: 100 million lemmings consigned free thought to the past in just two months, according to his figures. In what may be a case of a false analogy, Sharma noted this sort of adoption took seven years for that other transformative technology we like to call 'the internet' (those of a certain age will remember being puzzled about the utility of email and downloading fuzzy images over the course of half-hours, right up until we tried it).

Since forever, compute has generated heat, and generating heat eats electricity. Sharma noted that demand for AI (and the other now-boring stuff data centres do, like storage or application hosting) has spiked electricity demand, straining power grids and challenging net zero goals. It is, of course, only going to get worse as that demand keeps ramping up. In much the same way that one's spouse insists that more shopping for stuff on sale delivers greater savings, Sharma said AI itself will cleverly help ameliorate demand. NVIDIA, with which Schneider Electric has forged a partnership, fielded its head of data centre product marketing, Harris, who conceded that this is some potentially Inception-level circular logic, so it really sort of boils down to 'trust me bro'.

Pankaj's colleague and Schneider Electric International Secure Power VP Nirupa Chander emphasised the unique needs of AI data centres, noting ultra-high power densities and the necessity of future-proof designs. She explained Schneider Electric's collaboration with NVIDIA aims to streamline power integration from grids to data centres, addressing challenges like unstable power supply and high energy costs. There was some insider banter too: from chip to chiller, and if you haven't already, then you heard it here first. Harris riffed on NVIDIA supremo Jensen Huang's celebrated keynote and noted AI's growth driven by diverse models and use cases, transforming data centres into AI factories capable, one imagines, of Incepting us all.

And then, in a highlight of the afternoon, Trent McCarley from Schneider Electric's recently acquired cooling pioneer Motivair went into some detail on the emergence of liquid as a crucial component in creating the infrastructure capable of handling AI-driven heat loads. For those who are into motorcycles, an easy analogy emerged. Back in the day, most engines were air cooled, with effective 'heat sinks' on the engine evidenced by fins. As compression ratios increased in the search for more power, the heat dissipation of those fins proved inadequate, and so radiators, water pumps, ethylene glycol, and a bit of plumbing became part and parcel of the package. Not that dissimilar from the trajectory on which AI data centres find themselves, but of course, at considerably expanded scale.

Donovan Jackson is attending Computex as the guest of Schneider Electric. AI must live somewhere, and among other things, Schneider Electric makes data centre infrastructure.
