Digital fog: how AI redefines the fog of war

Ever since the Prussian general Carl von Clausewitz formulated the concept of the fog of war, armies have tried to pierce the opacity inherent in conflict: disinformation, uncertainty, the chaos of perceptions. Artificial intelligence, in theory, should dispel this fog by analyzing massive volumes of data in real time to offer a clear, actionable picture of the situation. In reality, it changes the fog's nature, sometimes thinning it, often thickening it.

Fog, that permanent strategic enemy

The fog of war is the impossibility of knowing the operational situation perfectly: enemy intentions, movements, the reliability of information, the real effects of one's own actions. This uncertainty creates a permanent information asymmetry that the belligerents try to exploit. Modern armies invest massively to reduce this blind spot through imagery, signals intelligence (SIGINT), and massive data processing, now enriched by AI.

But as sensors multiply and data flows explode, the challenge is no longer collecting information but understanding it. AI thus becomes an essential tool for filtering, prioritizing, contextualizing… and sometimes misleading.

AI as a hyperperception tool

In military staffs, AI turns intelligence into a capacity for anticipation. Analysis of satellite imagery, detection of weak signals, modeling of enemy behavior: machines can cross-reference heterogeneous data (visual, thermal, acoustic, textual) and generate dynamic situation maps.

Objective: to reduce the time between detection, understanding and action.
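To make the idea concrete, here is a minimal, purely illustrative Python sketch (the sensor names, grid size, detection probabilities and alert threshold are all invented for the example) of how heterogeneous detections could be fused into a single probabilistic situation map using a simple log-odds update:

```python
import numpy as np

GRID = (20, 20)  # hypothetical area of interest divided into 20x20 cells
PRIOR = 0.05     # assumed prior probability that any given cell is occupied

def log_odds(p):
    """Convert a probability into log-odds, the natural scale for additive fusion."""
    return np.log(p / (1.0 - p))

# Each sensor reports (cell, estimated probability that something is present there).
reports = {
    "visual":   [((4, 7), 0.80), ((12, 3), 0.55)],
    "thermal":  [((4, 7), 0.70)],
    "acoustic": [((4, 8), 0.60), ((12, 3), 0.65)],
}

fused = np.full(GRID, log_odds(PRIOR))
for sensor, detections in reports.items():
    for cell, p in detections:
        # Independent-evidence assumption: each report adds its log-odds contribution.
        fused[cell] += log_odds(p) - log_odds(PRIOR)

situation_map = 1.0 / (1.0 + np.exp(-fused))   # back to probabilities
alerts = np.argwhere(situation_map > 0.9)      # cells flagged for a human analyst

print("Fused probability at (4, 7):", round(float(situation_map[4, 7]), 3))
print("Cells flagged for review:", [tuple(int(i) for i in c) for c in alerts])
```

The point is not the arithmetic but the principle: independent, imperfect sensors reinforce each other where they agree, and that is exactly where both the value and the fragility of the fused picture lie.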

The conflict in Ukraine offers a striking example. Armies now use AI to process drone video feeds in a matter of seconds, identify anomalies, predict trajectories, and assess troop concentrations. This rapid processing gives a decisive tactical advantage: see faster, understand earlier, act before the enemy.
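What such processing can look like in principle, reduced to a self-contained Python sketch: the synthetic frames, the arbitrary change threshold, and the naive constant-velocity model below are assumptions standing in for real pipelines, which are far more sophisticated and not public.

```python
import numpy as np

rng = np.random.default_rng(0)

def anomaly_score(prev_frame, frame, threshold=0.2):
    """Fraction of pixels whose intensity changed markedly between two frames."""
    return float(np.mean(np.abs(frame - prev_frame) > threshold))

def predict_positions(track, horizon=3):
    """Naive constant-velocity extrapolation from the last two observed fixes."""
    track = np.asarray(track, dtype=float)
    velocity = track[-1] - track[-2]
    return [tuple(track[-1] + (k + 1) * velocity) for k in range(horizon)]

# Two synthetic 64x64 grayscale frames; the second contains a new bright blob.
prev_frame = rng.random((64, 64)) * 0.1
frame = prev_frame.copy()
frame[30:34, 40:44] += 0.8   # simulated new heat or contrast source

print("Anomaly score:", round(anomaly_score(prev_frame, frame), 4))

# A short track of (row, col) positions observed on earlier frames.
track = [(10, 10), (12, 13), (14, 16)]
print("Predicted next positions:", predict_positions(track))
```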

But one fog can hide another

This promise of clarity is fragile. AI can generate a fog of a new kind: algorithmic, opaque, confusing. Three major risks emerge:

  1. Overload of poorly prioritized information: AI produces correlations, but not necessarily strategic conclusions. Too much poorly structured data can paralyze action rather than inform it.
  2. False signals and digital hallucinations: an AI trained on biased or corrupted data can produce erroneous detections. Targeting an area on the basis of a statistical probability of presence, rather than a confirmation, constitutes a radical paradigm shift that several military experts have denounced.
  3. Manipulation of the fog: AI is not only used to see. It can also be used to mislead. Creating fictitious thermal signatures, distorting images, injecting noise into opposing systems: cognitive warfare and electronic warfare operations now incorporate offensive AI capable of thickening the fog on the opposing side (a minimal sketch of the principle follows this list).
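
This last point can be illustrated with a deliberately toy example: an invented linear "thermal detector" and a fast-gradient-style perturbation (the weights, the patch and the epsilon budget are all assumptions), showing how little well-aimed noise it takes to flip an automated decision.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy linear detector: a positive score means "vehicle present" in a 16x16 thermal patch.
weights = rng.normal(size=(16, 16))
bias = -0.5

def detector_score(patch):
    return float(np.sum(weights * patch) + bias)

clean_patch = np.zeros((16, 16))   # benign patch: the score equals the bias, i.e. "no vehicle"
print("Score before attack:", round(detector_score(clean_patch), 3))

# Fast-gradient-style perturbation: for a linear model, the gradient of the score
# with respect to the input is simply the weight matrix itself.
epsilon = 0.1   # assumed per-pixel noise budget, visually negligible
adversarial_patch = clean_patch + epsilon * np.sign(weights)

# The score rises by epsilon * sum(|weights|), more than enough here to flip the decision.
print("Score after attack:", round(detector_score(adversarial_patch), 3))
```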

When clarity becomes vulnerability

The more a military system relies on an algorithmic perception of the battlefield, the more vulnerable it becomes to the falsification of that perception. This is the great asymmetry revealed by recent conflicts: the fog can be weaponized by low-tech actors, at low cost, to saturate or disorient complex systems.

A simple swarm of drones, a false radar signal, or a targeted disinformation campaign may be enough to trigger automatic reactions or to divert a defense system's attention.

The fog, once endured, has today become a deliberately constructed tactical weapon.

AI can see, but does not understand

The fantasy of a "total vision" of the battlefield through AI runs up against an ontological limit: the machine can detect, model, and project, but it does not grasp the political, human, or moral meaning of an action. It does not read intention. It does not sense ambiguity. Yet it is often in these gray areas that strategic reversals play out.

The ability to discern what is "true" in a war does not rest solely on processing speed, but on the capacity to interpret a context, to weigh levels of meaning, to exercise judgment. The fog does not disappear. It shifts from raw information to the layer of interpretation.

War remains a human affair

Artificial intelligence transforms the fog of war but does not eliminate it. It reduces certain technical aspects of it, but introduces new, more insidious ones. In a world saturated with signals, confusion no longer stems from a lack of information but from its excess, its falsification, its automation.

Mastering AI in war does not mean believing in its transparency; it means understanding its blind spots. And never forgetting that in the digital fog, it is not the one who sees the most who wins, but the one who understands what he sees, and knows when not to trust the machine.