The real KPI forgotten in the AI race: the datacenter utilization rate

While investment in artificial intelligence is reaching historic levels, a fundamental indicator remains largely off the radar: the utilization rate of datacenters. As tech hubs around the world compete for infrastructure supremacy, few actors question these facilities' real capacity to generate value.

Blind expansion

From Microsoft to SoftBank, via Alibaba and Amazon, tech giants are investing tens of billions in physical infrastructure intended to support generative AI models. Alibaba has announced a $52 billion plan over three years to strengthen its cloud capacity. Server farms are springing up in India, Malaysia, the Emirates and the American Midwest. At first glance, the dynamic seems rational: more models, more servers, more power.

But as Joe Tsai, chairman of Alibaba Group, recently pointed out, this investment frenzy often proceeds without identified customers, without contractual commitments on usage, and above all, without public indicators of how much of the installed capacity is actually occupied.

An indicator absent from quarterly reports

With the exception of rare transparency initiatives (such as Microsoft Azure usage reports in certain regions), most hyperscalers avoid disclosing the actual occupancy rate of their datacenters. Yet this indicator, essential in the hotel and logistics industries, is absent from tech leaders' financial presentations.

The result: it is impossible to assess the actual profitability of these investments. An underused datacenter generates little revenue but carries significant fixed costs (real estate, energy, maintenance). At a macroeconomic scale, this opacity feeds a potential infrastructure bubble, in which early AI value is used to justify largely oversized capacity.
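
To make the stakes concrete, here is a minimal sketch with entirely hypothetical figures, showing how fixed costs inflate the effective cost of each billable GPU-hour when utilization is low:

```python
# Minimal sketch with hypothetical figures: how low utilization inflates
# the effective cost of each billable GPU-hour when fixed costs dominate.

ANNUAL_FIXED_COSTS = 100_000_000   # real estate, energy, maintenance (hypothetical, in $)
GPU_COUNT = 10_000                 # installed accelerators (hypothetical)
HOURS_PER_YEAR = 8_760

def cost_per_billable_gpu_hour(utilization: float) -> float:
    """Fixed costs spread over the GPU-hours actually sold at a given utilization rate."""
    billable_hours = GPU_COUNT * HOURS_PER_YEAR * utilization
    return ANNUAL_FIXED_COSTS / billable_hours

for utilization in (0.70, 0.20):
    print(f"{utilization:.0%} utilization -> ${cost_per_billable_gpu_hour(utilization):.2f} per billable GPU-hour")
```

Under these assumptions, the same facility costs roughly three and a half times more per GPU-hour sold at 20% utilization than at 70%.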

Build First, Monetize Later

The current logic recalls that of the commercial real estate sector before 2008: build first and hope for usage later. AI is perceived as an inevitable wave of transformation, which seemingly legitimizes the accumulation of physical capital. But this approach rests on two fragile assumptions: that AI usage will grow linearly, and that future models will always require more raw power.

Neither is guaranteed. The emergence of frugal models calls into question exclusive reliance on expensive architectures. Meanwhile, use-case maturity remains limited in many sectors: few companies outside tech have integrated generative AI models into their business processes at industrial scale.

Towards a measure of productive sobriety

The utilization rate could become the reference indicator for distinguishing projects that actually create value from speculative construction. A datacenter whose GPUs run at 70% of capacity on average over the year does not have the same risk profile as a facility running at 20% and dependent on a single pilot customer.
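
As an illustration, here is a minimal sketch of how such a KPI could be computed, with hypothetical facilities and figures (the article does not prescribe any particular method):

```python
# Minimal sketch, with hypothetical inputs: the utilization KPI as the ratio of
# GPU-hours actually consumed to GPU-hours theoretically available over a year.

HOURS_PER_YEAR = 8_760

def annual_utilization(gpu_hours_consumed: float, installed_gpus: int) -> float:
    """Fraction of installed GPU capacity actually used over the year."""
    available_gpu_hours = installed_gpus * HOURS_PER_YEAR
    return gpu_hours_consumed / available_gpu_hours

# Two hypothetical facilities with the same installed capacity but very different risk profiles.
facility_a = annual_utilization(gpu_hours_consumed=61_000_000, installed_gpus=10_000)  # ~70%
facility_b = annual_utilization(gpu_hours_consumed=17_500_000, installed_gpus=10_000)  # ~20%
print(f"Facility A: {facility_a:.0%}  |  Facility B: {facility_b:.0%}")
```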

This KPI would also make it possible to assess the productive sobriety of AI: how much energy, how much infrastructure, for what concrete results? In a context of growing pressure on energy supplies, this measure could become a strategic decision criterion for investors and regulators alike.

Conclusion: the right KPI at the right time

AI will not develop sustainably without rational performance indicators. The datacenter utilization rate must establish itself as a central criterion, alongside ARR or cost per request. Ignoring this signal would amount to piloting an industrial revolution blind.