
Red Hat CEO Defines Hybrid Today, Hybrid Tomorrow
Hybrid today means any application, any container, any cloud abstraction; hybrid tomorrow means any AI model, any accelerator and any cloud.
IT stacks morph. Enterprise software stacks tend to move, blend, shift and drift over time as software developers and data scientists skew them one way or another, depending on the need for new applications, the use of different toolsets and the adoption of different development methodologies. Stacks in this state often fall behind on patches for new features and security fixes. So-called 'system drift' can move an organization's IT deployment so far from its original build that troubleshooting becomes extremely complicated.
It's always been this way, which is a large part of why the more componentized worlds of cloud computing and composable containerization technologies have been so eminently justified. Hybrid harmony in this realm is a positive thing, i.e. building technology out of intelligently disaggregated and distributed resources and components means we can shift and lift with system drift, without short shrift.
Enterprise open source software company Red Hat says it has rolled out the latest version of its core platform with a view to catering for the reality of this movable feast. It has also detailed its wider position on enterprise IT software development in terms of where we are with hybrid interoperability today… and what it will mean in our immediate tomorrow.
Red Hat Enterprise Linux 10 has been designed to help enterprise IT departments handle the challenges of hybrid environments and the imperative to integrate AI workloads with a durable operating system.
According to a Red Hat-sponsored IDC study, 'Organisations [are] struggling to hire the Linux skill sets they need to operate and support their expanding fleet of distributions, which opens them up to further risk around security, compliance and application downtime as technology demands continue to evolve and necessitate the use of more of these deployment scenarios and mechanisms.'
Aiming to address this skills gap in Linux administration, Red Hat Enterprise Linux 10 introduces Red Hat Enterprise Linux Lightspeed. The team behind this development explains that integrating generative AI directly within the platform helps provide context-aware guidance and recommendations through a natural language interface. Drawing on Red Hat's enterprise Linux-specific knowledge, it assists with tasks ranging from troubleshooting common issues to applying best practices for managing complex IT estates.
'Red Hat Enterprise Linux 10 is engineered to empower enterprise IT and developers to not just manage the present, but to architect the future. With intelligent features using generative AI, unified hybrid cloud management through image mode and a proactive approach to security with post-quantum cryptography, Red Hat Enterprise Linux 10 provides the robust and innovative foundation needed to thrive in the era of hybrid cloud and AI,' said Gunnar Hellekson, vice president and general manager, Red Hat Enterprise Linux, Red Hat.
According to Red Hat, its platform is also prepared for the long-term security implications of quantum computing. Red Hat Enterprise Linux 10 now complies with National Institute of Standards and Technology (NIST) standards for post-quantum cryptography. This approach is designed to give IT teams the ability to defend against 'harvest now, decrypt later' attacks and meet evolving regulatory requirements. It includes quantum-resistant key-exchange algorithms, to mitigate the risk of data harvested today being decrypted in future, as well as post-quantum signature schemes.
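For readers who want to see the 'harvest now, decrypt later' point made concrete, the short sketch below shows the general idea behind hybrid key exchange: a classical shared secret and a post-quantum shared secret are combined before deriving the session key, so recorded traffic stays protected even if the classical half is eventually broken. This is an illustration of the concept only, not RHEL 10's implementation; the two secret-generating functions are hypothetical stand-ins (a real deployment would use an ECDH exchange and an ML-KEM encapsulation).

```python
import hashlib
import hmac
import os

# Hypothetical stand-ins for real primitives: in practice the classical secret
# would come from ECDH and the post-quantum secret from an ML-KEM (Kyber)
# encapsulation; os.urandom is used here only to keep the sketch runnable.
def classical_shared_secret() -> bytes:
    return os.urandom(32)

def post_quantum_shared_secret() -> bytes:
    return os.urandom(32)

def hkdf_extract_expand(secret: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) with SHA-256: extract, then a single expand block."""
    prk = hmac.new(b"\x00" * 32, secret, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

# Hybrid derivation: even if the classical half is broken by a future quantum
# computer, the recorded traffic remains protected by the post-quantum half.
session_key = hkdf_extract_expand(
    classical_shared_secret() + post_quantum_shared_secret(),
    info=b"hybrid-key-exchange-example",
)
print(session_key.hex())
```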
The above-referenced enterprise operating system 'image mode' is a container-native approach that unifies the build, deployment and management of both the operating system and applications within a single, streamlined workflow. Users can now manage their IT landscape, from containerized applications to the underlying platform, with the same consistent tools and techniques. In answer to our question on managing configuration drift, Red Hat says these mechanisms enable IT teams to proactively prevent unexpected patch deviations and establish a unified set of practices for application developers and IT operations.
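As a rough illustration of that unified workflow, the sketch below simply wraps the standard container tooling image mode relies on: the operating system is built from a Containerfile like any other container image, pushed to a registry and then a host is switched over to it with bootc. The image and registry names are assumptions for the example, not Red Hat defaults, and a real pipeline would add testing and rollback steps around these calls.

```python
import subprocess

# Illustrative only: the image name and registry below are assumptions.
IMAGE = "registry.example.com/acme/rhel-bootc-custom:10.0"

def run(cmd: list[str]) -> None:
    """Run a command and fail loudly, as a CI pipeline step would."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Build the OS image from a Containerfile with familiar container tooling,
# push it to a registry, then point a host at the new image.
run(["podman", "build", "-t", IMAGE, "."])
run(["podman", "push", IMAGE])
run(["bootc", "switch", IMAGE])  # run on the target host; the next boot uses this image
```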
But where does all this get us in terms of the wider direction for Red Hat and the opinions and dreams harbored by its C-suite across the CEO and CTO roles?
'Open source removed the barriers that defined proprietary software. The challenges it created [to the established norm] were initially feared, ridiculed and attacked. But look at where we are today i.e. last year there were five billion contributions made to open source code repositories globally,' said Red Hat CEO Matt Hicks, speaking to press and analysts in person this month. 'Open source defined my career; where others saw fear, I saw potential… and that's exactly where we are right now with AI. This might be because it can still appear scary; if we think about open source AI (in areas like intellectual property, for example), if AI is trained on open source software and trained on how to reproduce it, how do we establish who owns the original codebase?'
Hicks says these are aspects of technical disruption of huge magnitude at work right now. He agrees that the forces now in play are so massive that we will inevitably find disagreement between individuals. But, he insists, AI can help unlock barriers, so we need to harness it for the greater good and help it 'amplify the human creativity' that we ourselves have, even when it is sometimes locked within us.
'Enterprises have to balance the 'now' (running hybrid cloud and all the other elements of composable computing) and the 'next' factor too,' said Hicks. 'The next element of the modern IT stack is of course AI-focused (and making that run and operate at scale) as the IT department must also lean forward towards quantum and more. Consequently then, we're focused on enabling the now with any workload on any cloud in any environment and – we realize that to be a platform company, we need to be able to focus on enabling enterprises for what comes next in the future. Hybrid today means any app, any container and any virtual machine. Hybrid tomorrow will mean any AI model, any [hardware] accelerator and any cloud… really, any environment.'
According to Red Hat CTO Chris Wright, Red Hat's approach to AI is to provide both the tools for development and, in equal measure, the platform for deployment. He says the company is now working to apply the orchestration benefits of Kubernetes to AI… and he also notes that reasoning models are going to be a big challenge for datacenter infrastructures because they work with so many tokens (an AI token being akin to a unit of text, i.e. a word or part of a word, that language models use to process and generate language). Red Hat wants to enable distributed inference for production workloads. With the company's new llm-d open source project, a single inference request can be served by multiple accelerators distributed across a datacenter.
Inference-time scaling means enabling a model to 'think' for longer, which generally produces better results. But this involves increased token processing, so Red Hat is looking at inference at scale, driven by Kubernetes architecture and llm-d-based distributed inference. By distributing and routing workloads to the appropriate area of cloud infrastructure (where compute processing engines are underutilized and ripest for taking on jobs), Red Hat hopes to provide a more cost-effective route to functional AI.
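As a loose illustration of that routing idea, the toy scheduler below sends each incoming request to the least-utilized accelerator pool. It is a deliberate simplification: llm-d's actual routing is considerably more sophisticated (it is cache-aware and can split inference phases across machines), and the class and function names here are assumptions for the example, not part of the project's API.

```python
from dataclasses import dataclass

# Illustrative scheduler only; names are assumptions, not llm-d's API.
@dataclass
class AcceleratorPool:
    name: str
    capacity_tokens_per_s: int
    queued_tokens: int = 0

    @property
    def utilization(self) -> float:
        # Fraction of the pool's token-processing capacity already spoken for.
        return self.queued_tokens / self.capacity_tokens_per_s

def route(request_tokens: int, pools: list[AcceleratorPool]) -> AcceleratorPool:
    """Send the request to the least-utilized pool, i.e. the 'ripest' compute."""
    target = min(pools, key=lambda p: p.utilization)
    target.queued_tokens += request_tokens
    return target

pools = [
    AcceleratorPool("gpu-pool-a", capacity_tokens_per_s=20_000, queued_tokens=15_000),
    AcceleratorPool("gpu-pool-b", capacity_tokens_per_s=20_000, queued_tokens=4_000),
]
print(route(request_tokens=2_000, pools=pools).name)  # -> gpu-pool-b
```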
'There's a huge amount of architectural management and orchestration occurring across the entire firmament of cloud, data, analytics and open source right now. With many of the workloads across these arenas now being impacted by the rise of AI services, we're seeing new intelligence functions move from experimentation to real enterprise adoption… and that shift demands the right infrastructure,' advises Tina Tarquinio, IBM chief product officer for IBM Z and LinuxONE, the mainframe hardware and enterprise-grade Linux server division of IBM that, sitting inside Red Hat's parent company, dovetails logically with Red Hat's platform.
'In light of these realities, we've focused on building systems that help software engineering and data science teams put AI into practice with trust and speed. This has been a foundational year for enterprise infrastructure and we have introduced products and services designed to meet the evolving demands of AI and hybrid cloud environments,' added Tarquinio. 'We need to enable technology teams to run mission-critical workloads with precision, reliability and clear control of energy efficiency inside every deployment. With modern systems now capable of processing up to 450 billion 'inference operations' per day [the computational steps an inference engine carries out as a machine learning model detects patterns and wider data relationships], the latest systems-level development must support a broader and progressive platform engineering strategy in line with workloads, user wellbeing and sustainability.'
Analyst house IDC also has opinions on where Red Hat sits now. IDC research manager Ryan Caskey reminds us that, due to evolving operational requirements, organizations today very often accumulate diverse Linux distributions. He underlines that consistently maintaining these heterogeneous environments can become increasingly challenging.
'Large-scale and intricate infrastructures, as well as teams that are routinely understaffed, underskilled and underfunded, present problems in need of solutions. Red Hat Enterprise Linux aims to establish a robust, foundational layer for both current and future IT strategic initiatives,' said Caskey, going on the record for commentary on recent platform updates.
Red Hat has been on a growth and development trajectory that is similar to, and yet different from, those of other enterprise open source platforms. Like Java, like Suse, like Git and like Apache, Red Hat has grown from relative anonymity to widespread business-level recognition. Where Java passed to Oracle through its acquisition of Sun, Red Hat was similarly acquired by IBM, but the parallel stops there i.e. the Sun Microsystems branding is long gone from Java, whereas Red Hat has, under IBM, stayed very much Red Hat. For further comparative confusion, GitHub, the code-hosting platform built around the open source Git version control system, was acquired by Microsoft in 2018 and still retains its brand. There is no standard template in this segment of the software market, it seems.
Today, with its increased enterprise penetration, Red Hat's OpenShift is recognized as the market-leading technology (in terms of deployment and revenue) for enterprise Kubernetes and hybrid cloud orchestration. Microsoft would argue that its Azure Red Hat OpenShift is a more comprehensive way of consuming this technology, primarily because it gives customers Red Hat OpenShift for cloud container management, with automation and networking controls, plus the additional benefit of Microsoft Azure cloud infrastructure services. While Microsoft Azure Red Hat OpenShift is a fully managed service, OpenShift is also available as a managed application platform on cloud services from AWS, Google Cloud and IBM.
Generally working in competition as much as collaborative 'co-opetition', Red Hat will often find itself deployed in cloud proximity to managed Linux and Kubernetes services from the likes of AWS, Microsoft Azure and Google Cloud. The three major cloud hyperscalers (with Oracle, Alibaba and others playing catch up) are often able to reduce the need for operating system licensing by keeping customers more natively aligned to their cloud platform.
Like Stella Artois lager, its breadth might make Red Hat reassuringly expensive for some. Red Hat Enterprise Linux carries maintenance, support and subscription costs that sit higher than those of services from Canonical's Ubuntu or Suse. For smaller operations willing to adopt so-called open source clones (there are many, but names here include Rocky Linux and AlmaLinux), alternatives always exist, such is the nature of open source. For additional balance, let's note that total cost of ownership and return on investment are much harder to quantify in the rogue/clone Linux market. Red Hat offers a more measured and quantifiable approach, hence it's not free.
Looking further afield, the ServiceNow AI Platform has been updated to allow users to put any AI, any agent and any model to work across the enterprise. The company says this introduces deeper integrations with partners including Nvidia, Microsoft, Google and Oracle to accelerate enterprise-wide orchestration. The agents follow the introduction of the company's agentic AI suite earlier this year. Also of note, Salesforce has introduced its Agentic Maturity Model, a framework for organizations to understand and navigate the different stages of adopting and scaling AI agents within their environments.
CEO Hicks spoke to the competition factor Red Hat faces in light of the AI development models that are currently coming out of ServiceNow and Salesforce.
'ServiceNow and Salesforce also have impressive capabilities in building AI models, but they will naturally be focused around ticketing and customer relationship management (respectively) in the first instance. But there is a wealth of information in every company that makes them unique and no single one of those tools is going to unlock the breadth of data that organizations need to take advantage of AI in the next age of business. Red Hat has always been focused on working with open source and with an open approach to enterprise data in a broader way that differentiates us,' said Hicks.
For what might be the most pertinent example of pernicious proximity in this space, Red Hat competes with Terraform in the infrastructure-as-code software market… and Terraform was created by HashiCorp, which was finally acquired by IBM late last year. All of that said, Red Hat appears to have very few qualms about its closeness to the brands it works with in competitive allegiance and, as we have stated, it still remains Red Hat, not Red Hat by IBM… because that will never happen.
Apple hasn't given the smart home much love on stage at WWDC 2025 this week, but if you dig a little deeper into the dev sessions you'll find a feature that could actually make a dent in your energy bill. Dubbed EnergyKit, it's coming as part of iOS 26, which will arrive later this year. It's all a bit technical right now, but it points toward eventually turning your Apple Home system into a money-saving energy manager for your house. I say system, rather than app, because it sounds as if Apple is going to allow app developers to bake this tech into their own apps, even if those device types aren't currently supported. EnergyKit is a developer framework lets apps tap into Apple's Home energy data, and things like your rate plan and a forecast of when the grid is running cleaner or cheaper, which it will use to shift when your devices draw power. For example, it could allow your EV charger to schedule itself to run when grid rates are low or solar energy is peaking, or have your smart thermostat pre-cool your house before prices spike. If you're hooked up to PG&E (the first and only energy provider supported so far), your Apple Home app can already show this kind of info, but EnergyKit will supercharge things and open it up to developers to build smarter automations on top. Apple says the framework is aimed at residential use, for things like HVAC systems and EV charging. In a video introducing the new tools, we're told that EnergyKit can provide personalized guidance for when to use electricity based on environmental impact and cost. Apple is actually pretty late to the smart energy party, with platforms like SmartThings and Homey pushing energy optimization for a while now; and devices from the likes of Ecobee, Eve and Tado already doing this kind of thing on their own. But this feels like Apple finally putting down a foundation to make its Home app more than just a pretty interface for turning off your lights. The Cupertino tech giant doesn't actually support energy monitoring or EV chargers natively at the moment, but obviously Matter makes this sort of thing easier. If you're a dev, you can read more technical info on EnergyKit over on the Apple Developer website.