CES 2026: AI enters its industrial age, an economic reading from Jensen Huang (NVIDIA)

Jensen Huang’s keynotes are reminiscent of Steve Jobs’. Beyond the “supershow” dimension, the speech delivered last night at the opening of CES 2026 goes far beyond presenting a new generation of NVIDIA technologies. To those who take the time to listen, he delivers above all a reading of the changes under way.

So, beyond raw performance or the roadmap of NVIDIA’s engineers, Jensen Huang asks a more fundamental question: what type of industry is artificial intelligence creating, and under what economic rules will it operate?

As a preamble, his remarks revolve around a structural change: AI is no longer a software asset with low marginal costs, but a productive infrastructure with high fixed costs, where capital, energy and returns to scale become decisive. This inflection changes the very nature of the current technological cycle.

From lightweight software to heavy infrastructure

When Jensen Huang speaks of a platform shift, he is referring to well-identified breaks: the personal computer in the 1980s, the Internet in the 1990s, then the cloud in the 2010s. What these platforms had in common was hiding infrastructure, abstracting it away in favor of software and usage. AI makes the opposite move, putting infrastructure back at the center of the game.

According to him, almost $10 trillion in legacy IT assets are now entering a phase of modernization towards AI. Not because they have become technically obsolete, but because AI imposes new constraints on compute, networking, memory and energy. The obsolescence is less functional than economic: for equivalent performance, these infrastructures consume too much and deliver too little output. Where the cloud pooled resources, AI requires re-specializing and rebuilding them.

This rupture is first of all physical: training a new-generation model mobilizes tens of thousands of GPUs for several weeks, in data centers consuming several hundred megawatts. At the scale of a hyperscaler, an AI campus quickly exceeds a gigawatt, the consumption of a large city. Infrastructure thus becomes the limiting factor, both economically and in energy terms.
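To give an order of magnitude, a quick back-of-the-envelope estimate helps; the GPU count, per-accelerator draw and overhead factor below are illustrative assumptions, not figures from the keynote.

```python
# Rough order-of-magnitude estimate of a training cluster's power draw.
# All figures are illustrative assumptions, not numbers from the keynote.

gpus = 50_000              # assumed size of a frontier-scale training cluster
watts_per_gpu = 1_000      # assumed draw per accelerator, board power included
overhead = 1.5             # assumed multiplier for cooling, networking, hosts (PUE-like)

cluster_mw = gpus * watts_per_gpu * overhead / 1_000_000
print(f"Estimated cluster draw: {cluster_mw:.0f} MW")  # ~75 MW under these assumptions

# A hyperscale AI campus hosting several such clusters plausibly reaches
# hundreds of megawatts, and a gigawatt at the scale of a large city.
```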

In this context, AI acts as a vector for redeploying existing capital. IT budgets, R&D spending and industrial investments are converging towards infrastructures capable of supporting AI workloads in production.

Through this observation, Jensen Huang seeks to deconstruct the idea of a speculative bubble. We are in a cycle where trillions of dollars are gradually changing destination to finance a profound reconfiguration of the global digital base.

Why AI keeps getting more expensive… even as it progresses

Another key point of his presentation is to refute the idea that AI will mechanically become cheaper. The cost per token is indeed falling, but compute volume is growing faster than the efficiency gains.

Three dynamics overlap: pre-training, with models going from hundreds of billions to several trillion parameters; post-training, via reinforcement learning and alignment; and finally test-time scaling, where models reason longer and generate more tokens.

The result: inference becomes a structural cost. An AI agent that reasons for a few seconds consumes more resources than a model that answers instantly. At scale, AI becomes a machine that consumes compute continuously.
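A simple illustration of that asymmetry, with deliberately hypothetical prices and volumes: even when the price per token falls sharply, the total inference bill grows as soon as reasoning agents multiply the tokens generated per request.

```python
# Illustrative arithmetic only: all prices and volumes below are hypothetical,
# chosen to show why falling per-token costs don't imply falling total costs.

# Year 1: a chatbot-style deployment
price_per_mtok_y1 = 10.0        # $ per million tokens (assumed)
tokens_per_request_y1 = 1_000   # short, single-pass answers (assumed)
requests_y1 = 100_000_000

# Year 2: cheaper tokens, but reasoning agents generate far more of them
price_per_mtok_y2 = 2.0         # 5x cheaper per token (assumed)
tokens_per_request_y2 = 20_000  # multi-step reasoning and tool calls (assumed)
requests_y2 = 300_000_000       # broader adoption (assumed)

cost_y1 = requests_y1 * tokens_per_request_y1 / 1e6 * price_per_mtok_y1
cost_y2 = requests_y2 * tokens_per_request_y2 / 1e6 * price_per_mtok_y2
print(f"Year 1 inference bill: ${cost_y1:,.0f}")   # $1,000,000
print(f"Year 2 inference bill: ${cost_y2:,.0f}")   # $12,000,000
```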

The agent, economic unit of the new software generation

The notion of agent is central to the demonstration. An agent is a system capable of chaining actions: querying a database, calling an API, writing code, triggering a business process. Agentic AI is no longer a tool of assistance but of productivity. A developer can thus save 20 to 30% of their time using a code agent.
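As a minimal sketch of what “chaining actions” can look like in practice; the tool functions and the call_model helper are hypothetical placeholders, not an NVIDIA or vendor API.

```python
# Minimal conceptual sketch of an agent loop: the model picks a tool, observes
# the result, and repeats until it can answer. All names are hypothetical stubs.

def query_database(sql: str) -> str:
    return f"[stub] rows for: {sql}"

def call_api(endpoint: str, payload: dict) -> str:
    return f"[stub] response from {endpoint}"

def write_code(spec: str) -> str:
    return f"[stub] code implementing: {spec}"

TOOLS = {"query_database": query_database, "call_api": call_api, "write_code": write_code}

def run_agent(task: str, call_model, max_steps: int = 10) -> str:
    history = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        step = call_model(history, tools=list(TOOLS))      # hypothetical LLM call
        if step["type"] == "final_answer":
            return step["content"]
        result = TOOLS[step["tool"]](**step["arguments"])  # execute the requested tool
        history.append({"role": "tool", "name": step["tool"], "content": result})
    return "stopped: step budget exhausted"
```

Every iteration of the loop is another model call, which is exactly why agentic workloads multiply inference volume.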

This explains the rapid adoption of agents in software development, data or finance, where the cost of human resources is high and the return on investment is immediate.

Without open source, no global distribution

Jensen Huang insisted on open models, not for ideological reasons but for economic ones. An infrastructure only becomes dominant if it is massively adopted, like Ethernet or Linux. Open source lowers the cost of entry, activates millions of developers and prevents innovation from being choked off. Even if open models lag slightly behind proprietary ones, their pace of progress is enough to structure the ecosystem.

Physical AI, when the cost of error becomes critical

Unlike text or code, the real world does not tolerate approximation. A hallucination is benign in a summary, but potentially fatal in an industrial or driving maneuver. Hence the importance of the long tail: reaching 99% performance is attainable; handling the last percent, that of rare or dangerous situations, is exponentially more expensive.
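One way to make that last percent concrete, under purely illustrative rarity assumptions (none of these numbers come from the keynote): if a critical driving scenario occurs only once per million kilometres, merely observing it often enough to validate against it demands enormous real-world mileage.

```python
# Illustrative only: assumed rarity figures, not data from the keynote.
# Rare failure cases dominate validation cost because you must encounter
# them many times before you can claim the system handles them.

km_per_occurrence = 1_000_000       # assumed: the scenario shows up once per million km
occurrences_needed = 100            # assumed: samples required for meaningful validation
km_per_vehicle_year = 20_000        # assumed annual mileage per test vehicle

total_km = km_per_occurrence * occurrences_needed
vehicle_years = total_km / km_per_vehicle_year
print(f"Real-world mileage needed: {total_km:,} km, about {vehicle_years:,.0f} vehicle-years")
# 100,000,000 km, about 5,000 vehicle-years -- one reason simulation and
# synthetic data become the practical route to covering the long tail.
```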

Industrial AI is thus closer to aeronautics or nuclear power: system redundancy, complete certification, software and hardware traceability. The partnership with Mercedes-Benz illustrates this logic.

The data does not exist: it is manufactured

While the Internet served as a ready-made corpus for language models, no such wealth exists for the physical world: it has to be constructed. Simulation, digital twins and synthetic data become central. Computing becomes a data factory, with computing power serving both to generate learning situations and to learn from reality.
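A minimal sketch of the “data factory” idea, assuming a generic simulator: compute is spent generating randomized scenes and their ground-truth labels, which then feed training. The simulate_scene function here is a hypothetical placeholder, not a specific NVIDIA (Omniverse or other) API.

```python
# Conceptual sketch of synthetic data generation for physical AI.
# `simulate_scene` stands in for any physics/rendering simulator; it is a
# hypothetical placeholder, not a real library call.
import random

def simulate_scene(lighting: float, friction: float, obstacle_count: int) -> dict:
    # A real pipeline would run a physics and rendering engine and return
    # sensor data plus ground-truth labels. Here: a stub.
    return {"sensors": f"frame(light={lighting:.2f}, mu={friction:.2f})",
            "labels": {"obstacles": obstacle_count}}

def generate_dataset(n_scenes: int) -> list[dict]:
    dataset = []
    for _ in range(n_scenes):
        # Domain randomization: vary conditions so that rare, long-tail cases
        # are represented far more often than they occur in reality.
        dataset.append(simulate_scene(lighting=random.uniform(0.05, 1.0),
                                      friction=random.uniform(0.2, 1.0),
                                      obstacle_count=random.randint(0, 12)))
    return dataset

training_data = generate_dataset(10_000)  # compute in, labeled data out
```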

NVIDIA, architect of a complete economic system

In conclusion, NVIDIA no longer positions itself as a mere chip supplier, but as an industrial systems architect, orchestrating silicon, networking, compute, simulation and models.

AI has entered the logic of heavy industry. It has become a condition of competitiveness, and like any critical infrastructure, it will redistribute value not to those who innovate the fastest, but to those who can invest the most sustainably.