
Red Hat announced a number of improvements in its core enterprise Linux product, including better security, better support for containers, and better support for edge devices. But the one topic that dominated the conversation was AI.
Companies that don’t want to be locked into a single cloud provider or a single AI vendor, but are instead beginning to operationalize AI across all their environments, need a platform to do it on, and there’s a window of opportunity right now for that platform to emerge and become, in effect, the operating system of enterprise AI.
This week, Red Hat threw its considerable weight into the ring to be that platform. At least, that was the big takeaway for Forrester analyst Devin Dickerson.
“Red Hat continues to position itself as an enterprise-grade alternative for deploying modern workloads—now including AI—in environments that extend far beyond the public cloud,” he said.
Generative AI and agentic AI were front and center in Red Hat’s messaging this week. “Red Hat is offering a path to do AI in an open, portable, and secure way — on your terms,” he said. This is especially important in regulated industries with high compliance demands, Dickerson added.
“Red Hat is now positioning AI as a key pillar of its value proposition, alongside Linux, OpenShift and Ansible,” said Dimitris Mavrakis, senior research director at ABI Research.
For example, Red Hat announced that RHEL 10 is now an “AI native” operating system, he said: AI payloads are now treated as a core workload class rather than as an add-on.
“It is also a clear signal for enterprises that want to adopt AI for on-premise implementation that Red Hat is becoming a key enabler for this journey,” Mavrakis added.
Some enterprises opt to let vendors handle their inferencing, via API calls to OpenAI’s ChatGPT or Anthropic’s Claude. It’s an easier and quicker approach, but it can get expensive at scale. Plus, for security or compliance reasons, companies might want to run their AI on their own infrastructure.
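In practice, the hosted approach is little more than a network call. The following sketch, which assumes the OpenAI Python SDK and uses an illustrative model name and prompt, shows roughly what that looks like from the application side:

```python
# Hosted inference: the vendor runs the model; the application just calls an API.
# Requires `pip install openai` and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative hosted model
    messages=[{"role": "user", "content": "Summarize this incident report in two sentences."}],
)
print(response.choices[0].message.content)
```

The convenience comes at a price: every token in and out is billed by the vendor, and the data leaves the company’s own infrastructure.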
“Many enterprises — especially ones that deal in business or mission-critical verticals — require AI to be deployed on-premise or in very tightly controlled environments,” said ABI’s Mavrakis. “However, the implementation of these systems has been lacking.”
“Moreover, a large part of the market was previously served by VMware, which has alienated a large part of the market,” he added.
Broadcom acquired VMware in 2023 and, since then, there have been some unpopular changes to license terms, pricing, and support. “These companies are looking at alternatives and Red Hat is a viable contender,” Mavrakis said.
Red Hat isn’t a generative AI company. Its parent, IBM, does have its own family of LLMs, but Red Hat is focusing on managing AI rather than building it itself.
To do this, Red Hat has launched its AI Inference Server, a platform that supports a wide range of hardware accelerators and models. That includes chips from Nvidia, AMD, Google, AWS, Intel and IBM, and models from IBM, Google, Mistral, Microsoft, Qwen and DeepSeek, among others.
Plus, Red Hat now has an AI model library of more than three dozen validated models. Hosted on Hugging Face, it includes several flavors of Llama, Mistral, Granite and other leading-edge open source models.
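To give a rough sense of what self-hosted inference looks like, here is a minimal sketch using the open source vLLM project, the serving engine that Red Hat’s Inference Server builds on; the model name, port, and prompt are illustrative, and the packaged Red Hat product will differ in its specifics:

```python
# Self-hosted inference sketch: vLLM exposes an OpenAI-compatible endpoint on local hardware.
# Start the server first (model name is illustrative):
#   vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000
# The client side only needs `pip install openai`.
from openai import OpenAI

# Point the standard OpenAI client at the local endpoint instead of a vendor's cloud.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="local-key-not-checked")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Classify this support ticket by urgency."}],
)
print(response.choices[0].message.content)
```

Because the local endpoint speaks the same API as the hosted services, an application can be pointed at either one, which is essentially the portability argument Red Hat is making.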
“The buzzy aspects of AI are giving way to more practical conversations about where and how it can deliver value,” said Eric Hanselman, an analyst at S&P Global Market Intelligence. That leads to a focus on inferencing capabilities and scaling, he said.
AI models can be used to answer questions — but they can also be used to power agents. Agentic AI gives enterprises the power to automate entire business processes. But deploying an agentic system is extremely complicated. In addition to the models that power the agents, enterprises also need to add layers of security, orchestration, and monitoring.
In addition, agents need to be able to access tools, data, and other resources.
To make that happen, Red Hat is betting on a couple of new technologies, both released by commercial vendors in the latter half of last year, but both open source and with the potential to become industry standards.
One was an obvious choice: MCP, or the Model Context Protocol, which allows AI systems and agents to communicate with resources, tools, and data sources.
Since it was open-sourced by Anthropic late last year, it has been adopted by a number of big tech vendors, including Google and Microsoft, as well as by Anthropic’s biggest rival, ChatGPT maker OpenAI.
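On the developer side, an MCP integration can be quite small. The sketch below uses the official MCP Python SDK (the mcp package on PyPI) to expose a single tool that an MCP-aware agent or assistant could discover and call; the server name, tool, and inventory data are hypothetical:

```python
# A toy MCP server: it advertises one tool over the Model Context Protocol.
# Requires `pip install mcp`.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("inventory-demo")  # hypothetical server name

@mcp.tool()
def check_stock(sku: str) -> str:
    """Return the stock level for a product SKU (hard-coded for this sketch)."""
    fake_inventory = {"RH-001": 42, "RH-002": 0}
    count = fake_inventory.get(sku)
    return f"{sku}: {count} units in stock" if count is not None else f"{sku}: unknown SKU"

if __name__ == "__main__":
    mcp.run()  # speaks MCP over stdio so a local client or agent can attach
```

An agent framework that understands MCP can list this server’s tools and call check_stock on its own, which is exactly the kind of access to tools and data the protocol is meant to standardize.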
The other one wasn’t as obvious. Llama Stack, released by Meta in September of 2024, is a set of tools for building, scaling and deploying AI applications and agentic systems.
A more popular alternative, the open source LangChain project, dates back to 2022. As of last October, LangChain reported more than 130 million downloads and 132,000 apps built using the technology, but it doesn’t exactly have the weight of a giant like Meta behind it. It’s too early to call MCP and Llama Stack the new standards for agentic AI. But, said Forrester’s Dickerson, “Red Hat’s support along with others does lend credibility and momentum.”
In addition to helping enterprises deploy their own AI, Red Hat is also integrating AI into its own products. Big announcements in this area included AI assistants and AI-driven automation for setting up IT environments inside the Linux operating system, said Walid Negm, US software-defined vehicle CTO at Deloitte Consulting.
Outside AI, the Red Hat news he was most excited about was the certification of the Red Hat In-Vehicle Operating System.
“Linux appears ready to reimagine the automotive industry, as developers may now be able to simplify processes along the vehicle development lifecycle both in the cloud and in the car itself,” he said.
For S&P’s Hanselman, the biggest surprise was the formal integration of HashiCorp and Red Hat. There’s a bit of overlap in their offerings, since HashiCorp’s Terraform automates cloud management, and so does Red Hat’s Ansible.
“There are natural connections across IBM’s portfolio, but there are fundamental differences across the automation products, Terraform and Ansible,” he said.
But HashiCorp also has its Vault product, used for identity-based authentication. “Tying Vault into the Red Hat world, given the challenges many face with secrets management, is a much more natural union,” Hanselman said.
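As a small illustration of that union, the sketch below pulls a database credential out of Vault with the open source hvac Python client rather than keeping it in an application’s own configuration; the Vault address, token, and secret path are placeholders:

```python
# Fetch a secret from HashiCorp Vault instead of storing it in config files.
# Requires `pip install hvac` and a reachable Vault server.
import hvac

client = hvac.Client(url="http://127.0.0.1:8200", token="dev-only-token")  # placeholders

# Read from the KV v2 secrets engine; the path is hypothetical.
secret = client.secrets.kv.v2.read_secret_version(path="apps/billing/db")
db_password = secret["data"]["data"]["password"]

print("Fetched billing database credential; length:", len(db_password))
```

Wiring that kind of lookup into Ansible playbooks and OpenShift workloads is the sort of integration Hanselman is pointing to.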
Source: Network World