
IBM has rolled out a family of Power11 servers and chips aimed at helping enterprise customers grow infrastructure for secure AI, hybrid cloud and edge-driven computing.
The IBM Power11 family, which will be available July 25, features a variety of new models, from the high-end flagship IBM Power E1180 system to the entry-level Power S1122. All models are designed to help enterprise customers grow AI infrastructure, enable LLM-driven apps, and extend IBM’s hybrid cloud ecosystem through tighter integration with Red Hat and watsonx. The Power11 family of servers also promises guaranteed uptime and security, according to Tom McPherson, general manager of IBM Power Systems.
Enhancing the servers’ role as AI development platforms will be support for Big Blue’s Spyre AI Accelerator, available in Q4 2025, which IBM says will significantly boost the speed and accuracy of AI processing across the Power server family. By offloading AI tasks to the accelerator, Power11 servers can improve performance, leading to faster response times and better overall system efficiency.
Once it’s available, IBM will offer the Spyre Accelerator across its core infrastructure servers, including alongside the Telum II processor in its new z17 and LinuxONE mainframes. The Spyre Accelerator contains 1TB of memory and 32 AI accelerator cores that share a similar architecture with the AI accelerator integrated into the Telum II chip, according to IBM. Each Spyre is mounted on a PCIe card, and multiple Spyre Accelerators can be connected to substantially increase the amount of available acceleration, IBM says.
The Spyre Accelerator is an enterprise-grade accelerator designed for AI inferencing tasks with high efficiency and scalability, particularly for complex models and generative AI, wrote Chris Drake, a senior research director with IDC’s worldwide infrastructure research organization, in a report on the new servers. “This will enable seamless AI integration into existing applications and workflows running on Power and deploy a broader range of AI use cases. IBM is also looking to use AI to modernize Power applications with the upcoming IBM watsonx Code Assistant for i,” Drake wrote. “It is expected to accelerate RPG code modernization tasks for IBM i applications with AI-powered capabilities made available directly in the integrated development environment.”
As for specifics, the Power E1180 server features include:
- 10-, 12-, or 16-core processors operating at up to 4.4 GHz per core.
- Up to 256 Power11 processor cores across one to four system nodes; up to 64 TB of 4000 MHz DDR5 DRAM memory; and six PCIe Gen4 x16 slots per node, four of which can operate as PCIe Gen5 x8, plus two PCIe Gen5 x8 slots.
- 19-inch PCIe Gen4 4U I/O expansion drawer and PCIe fan-out modules, supporting a maximum of 192 PCIe slots and four I/O expansion drawers per node.
- PCIe Gen1, Gen2, Gen3, Gen4, and Gen5 adapter cards supported in the system node, and PCIe Gen1, Gen2, Gen3, and Gen4 adapter cards supported in the I/O expansion drawer.
- Dynamic LPAR support for adjusting workload placement of processor and memory resources (see the sketch after this list).
- Active Memory Expansion (AME), optimized on the processor chip.
- Power Enterprise Pools that support unsurpassed enterprise flexibility for workload balancing and system maintenance.
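Dynamic LPAR changes are typically driven through the Hardware Management Console described later in this article. The sketch below shows, roughly, what that looks like when scripted in Python, assuming an HMC reachable over SSH and its `chhwres` command; the HMC address, managed-system name, and partition name are hypothetical placeholders rather than values from IBM’s announcement.

```python
# Minimal sketch: dynamic LPAR (DLPAR) resource changes driven from a script.
# Assumes an HMC reachable over SSH with key-based authentication; the host,
# managed system ("E1180-node1"), and partition ("prod-db-lpar") are hypothetical.
import subprocess

HMC_HOST = "hscroot@hmc.example.com"  # hypothetical HMC address

def hmc(command: str) -> str:
    """Run an HMC CLI command over SSH and return its output."""
    result = subprocess.run(
        ["ssh", HMC_HOST, command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Add 8 GB of memory (quantity in MB) to a running partition.
hmc("chhwres -r mem -m E1180-node1 -o a -p prod-db-lpar -q 8192")

# Add one dedicated processor core to the same partition.
hmc("chhwres -r proc -m E1180-node1 -o a -p prod-db-lpar --procs 1")
```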
The new midrange server, the 4U rack-mounted IBM Power E1150, supports 32 to 120 processor cores; 256 GB to 16 TB of high-performance DDR4 OMI memory running at up to 4000 MHz; and Elastic, Mobile, and Shared Utility Capacity options, meaning the Power E1150 lets clients pay for use of processor, memory, and supported operating systems by the day, across a collection of Power E1150 systems, IBM stated.
The S1122 is a 2U rack-mounted, entry-level Power11 server that can support either two Entry Single-Chip Module (eSCM) or Dual-Chip Module (DCM) processors per server, depending on customer needs. Each new Power11 processor single-chip module (SCM) contains two memory controllers. In the high-end Power E1180, four 10-core 3.90–4.20 GHz (max), four 12-core 3.90–4.40 GHz (max), or four 16-core 3.80–4.30 GHz (max) SCMs are used in each system node, providing 40 cores per node for up to a 160-core system, 48 cores for up to a 192-core system, or 64 cores for up to a 256-core system, according to IBM.
“The new Power11 platform has been specifically designed to help organizations harness opportunities in the era of AI and hybrid, multicloud operations,” wrote Drake. “First, Power11 addresses both performance and compliance requirements of AI, including acceleration capabilities and the ability to access, transform, and manage enterprise data at scale, and support the seamless integration of generative AI (genAI) capabilities with mission-critical enterprise processes,” Drake wrote.
Second, the platform supports new automation capabilities, bringing a range of benefits to Power platform customers that span management, reliability, security, and sustainability, Drake wrote. “Finally, IBM’s Power11 is designed to support a distributed and flexible hybrid infrastructure with business-driven workload placement across on premises, public cloud, or private cloud based on performance, cost, and compliance needs.”
All of the servers include an AI-based Matrix Math Accelerator that helps perform in-core AI inferencing and machine learning where the data resides, IBM stated. IBM says there can be zero planned downtime for system maintenance: features such as autonomous patching, automated workload movement, and planned system maintenance events can occur without ever taking critical applications offline, according to Bargav Balakrishnan, vice president of product management with IBM. This can free IT professionals from spending time planning, testing, and executing upgrades to their systems, Balakrishnan said. The Power servers also support IBM’s Power Cyber Vault, which uses AI/ML and the NIST cybersecurity framework to help identify, protect, detect, and automatically respond to cyber threats, Balakrishnan said.
“Cyber Vault provides protection against cyberattacks such as data corruption and encryption with proactive immutable snapshots that are automatically captured, stored, and tested on a custom-defined schedule. Power11 also uses NIST-approved, built-in quantum-safe cryptography designed to help protect systems from harvest-now, decrypt-later attacks as well as firmware integrity attacks,” Balakrishnan said.
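The quantum-safe protections Balakrishnan describes are built into Power11 firmware, but the underlying idea, key encapsulation with a NIST-standardized post-quantum algorithm such as ML-KEM, can be illustrated at the application level. The following is a minimal sketch using the open-source liboqs-python bindings; the library is an assumption chosen for illustration, not part of IBM’s stack, and the algorithm name it accepts depends on the installed liboqs version.

```python
# Illustration of post-quantum key encapsulation (ML-KEM, formerly Kyber), the
# class of NIST-approved algorithm used to counter harvest-now, decrypt-later
# attacks. Uses the open-source liboqs-python bindings -- an assumption for
# illustration only, not a component of IBM's Power11 firmware.
import oqs

ALG = "ML-KEM-768"  # exact name depends on the installed liboqs version

receiver = oqs.KeyEncapsulation(ALG)
public_key = receiver.generate_keypair()   # receiver publishes this key

sender = oqs.KeyEncapsulation(ALG)
ciphertext, sender_secret = sender.encap_secret(public_key)   # sender derives a shared secret

receiver_secret = receiver.decap_secret(ciphertext)           # receiver recovers the same secret
assert sender_secret == receiver_secret
```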
Other shared server features include:
- IBM PowerVC: A virtualization and cloud management software solution designed for IBM Power Systems. It is based on OpenStack and simplifies the management of AIX, IBM i, and Linux virtual machines running on Power Systems (see the sketch after this list).
- The IBM Cloud Management Console (CMC): A cloud-based service that provides a centralized platform for monitoring and insights into IBM Power Systems infrastructure.
- The Hardware Management Console (HMC): An appliance used to configure and manage Power systems. It facilitates Power server hardware management, Power hypervisor and virtualization management, service and update management, and monitoring, and serves as an integration point for other Power solutions.
- To further modernize application development, IBM watsonx Code Assistant for i will help developers extend RPG applications for greater ease and productivity. IBM will also make watsonx.data, its hybrid, open data lakehouse, available on Power11 by the end of 2025.
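Because PowerVC is built on OpenStack, the virtual machines it manages can also be reached with standard OpenStack tooling. The sketch below uses the openstacksdk Python client to list VMs through PowerVC’s OpenStack-compatible API; the endpoint, project, and credentials are hypothetical placeholders, and details such as certificate verification will vary by installation.

```python
# Minimal sketch: listing the AIX, IBM i, and Linux VMs managed by PowerVC
# through its OpenStack-compatible API, using the standard openstacksdk client.
# The endpoint, project, and credentials below are hypothetical placeholders.
import openstack

conn = openstack.connect(
    auth_url="https://powervc.example.com:5000/v3",  # hypothetical PowerVC endpoint
    project_name="ibm-default",
    username="admin",
    password="changeme",
    user_domain_name="Default",
    project_domain_name="Default",
    verify=False,  # many PowerVC installations use self-signed certificates
)

# Enumerate the virtual machines (Nova "servers") that PowerVC manages.
for server in conn.compute.servers():
    print(f"{server.name:30} {server.status}")
```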
“IBM is well placed to provide customers with a flexible, cost-effective, and operationally efficient AI ecosystem that supports AI development and deployment across diverse cloud, distributed, and sovereign IT environments,” Drake wrote. “By integrating AI-driven automation into IT, cybersecurity, and data management, IBM will be a driver of AI adoption across a range of regulated industries—including financial services and healthcare—where trust and governance are essential,” Drake wrote.
IBM’s biggest challenge will be demonstrating that it can scale its AI solutions and customer base beyond the company’s traditional enterprise base, Drake wrote.
“Other leading cloud providers and hyperscale digital service providers are also deploying and aggressively promoting AI-native cloud services that offer seamless integration with broader ecosystems. IBM must therefore ensure that it is seen as a core enabler of AI across hybrid, interoperable environments,” Drake wrote.
Source: Network World