Major de-booking proves that suppliers are in the driver’s seat: Analyst

There was one particularly juicy tidbit during Hewlett Packard Enterprise’s (HPE’s) fourth-quarter earnings call with financial analysts last week. The company announced it had “de-booked” a $700 million order for AI equipment due to what CEO Antonio Neri described as a “concern with a specific customer.”

During the call on Thursday, chief financial officer Marie Myers said that, while revenue from AI systems orders was in line with expectations of an estimated $1.2 billion, “we had an order de-book in Q4, leaving our net orders for the quarter at approximately $500 million. Subsequent to the end of the quarter, we have received orders that bring our current backlog to over $3.5 billion. As we have mentioned before, AI systems orders can be lumpy [defined as revenues that come in at sporadic intervals] and this is an example of that.”

In a statement to Network World on Monday in response to questions about the de-booking, HPE explained, “we have a strong controls environment, and we continue to be vigilant on engaging with sound customers, managing risk, and ensuring that we have a diversified order book.”

John Annand, practice lead at Info-Tech Research Group, said Monday the move was “something we have not heard for a long time,” and described Neri as being “decently candid. HPE had lost faith the client was going to be able to honor the contractual terms, and so HPE managed that risk appropriately. One data point does not make a trend, but it does lead to some interesting speculation about what this might mean for the broader market.”

It also proved, he said, that “cash is king. There is an old saying — if you owe the bank $10,000, then you’ve got a problem. If you owe the bank $10 million, then they’ve got a problem. The demonstrable, defensible ROI for genAI technologies has been shaky at best. Sequoia Capital reportedly estimated that the AI industry spent $50 billion on Nvidia chips last year, but only realized some $3 billion in revenue.”

Financial due diligence, said Annand, has “not always been central to big tech deal-making, but rather than risk another dot-com bubble, the supply chain is signaling they want no part in evaluating their customer’s business model and future profitability. Discounting or creative financing are no longer necessary parts of a successful product strategy.”

And when it comes to the purchasing of equipment by CIOs or data center managers, he said, suppliers are currently in the driver’s seat: “When demand outstrips supply, it’s the customer who bears the brunt of the impact. HPE is in a position to prioritize certain market segments and product bundles as it allocates its limited supply of AI infrastructure.”

Annand added, “we’ve been given no indication this is what is happening, but it’s conceivable that higher-margin and stickier customer deals, like those featuring HPE GreenLake, would take precedence over commodity sales of Gen11 HPE ProLiant servers with Nvidia GPUs. We saw what COVID-19 did five years ago with exponential demand and a constrained supply chain. It’s not unreasonable to expect we’ll see some of the same behaviors from suppliers and customers again.”

That said, he noted, “deals can be made. The dark horse hypothesis involves knowing when the inflection point of supply outstripping demand will occur. It takes a comparatively long time to stuff the supply chain with product; if the perception of genAI ROI remains soft amongst the CFOs of the Fortune 2000, we could see a significant swing in vendor/customer dynamics.” 

Fast followers, he said, “will lose a first-mover market advantage to be sure, but in the long run, they might have lower TCO and thus higher ROI; they don’t call it the ‘bleeding edge’ of the technology adoption curve for nothing.”

According to Annand, “without a magic eight ball, it’s impossible to know how this will all unfold. Moore’s law used to predict that a pilot using $1 million worth of CPU capacity could be scaled up 800% and be put into production four years later for that same $1 million.”
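As a rough, back-of-the-envelope illustration of the budget arithmetic Annand alludes to (the doubling periods below are assumptions for illustration, not figures from Annand or the article), the sketch computes how much more capacity a fixed $1 million buys after four years under different assumed price-performance doubling cadences; a cadence of roughly 16 months is what would yield an eightfold (800%) scale-up for the same spend.

```python
# Hypothetical Moore's-law budget arithmetic (assumed doubling periods, not from the article).
def capacity_multiple(years: float, doubling_months: float) -> float:
    """How many times more compute a fixed budget buys after `years`,
    assuming price-performance doubles every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

if __name__ == "__main__":
    for months in (16, 18, 24):
        mult = capacity_multiple(4, months)
        print(f"doubling every {months} months -> {mult:.1f}x the capacity "
              f"for the same $1M after four years")
```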

Smart CIOs, he said, “could plan product offerings around this. Transistor miniaturization is hitting the limits of thermodynamics and theoretical physics, and the smart CIO of today is going to have to plot the increasing performance of niche and increasingly refined AI models if they’re going to hit that sweet spot on the adoption curve.”

Source: Network World