In contemporary criminal cases, the amount of digital evidence far exceeds the capacity for human analysis. Seized phones, email exports, call recordings, footage from fixed and mobile cameras, thousands of pages of reports: the volume grows with each investigation, while procedural time limits do not stretch to match. Criminal justice has never been so rich in data, nor so vulnerable to error by saturation.
The promise of the American startup Longeye is to help legal professionals identify the truly decisive elements more quickly within masses of information that have become unmanageable. Longeye is aimed primarily at law enforcement, but its existence has not escaped the notice of defense lawyers, who see in it both a potential threat and a new tool for counter-investigation.
The founder, Guillaume Delepine, already knows the public-safety ecosystem: he previously helped launch drone programs used by hundreds of U.S. agencies. Observing the sector, he noticed a paradox: the more technology police departments adopt, the more the volume of collected information explodes, and the more manual analysis becomes an operational sinkhole. Longeye was born from this gap between collection and exploitation, around a central idea: if AI can already sort through millions of items of content in the private sector, why couldn't it help the justice system surface essential signals amid the digital noise?
The platform functions as a secure workspace into which investigators deposit evidence already obtained legally: audio recordings, images, documents, exports from phones or social networks. The AI analyzes this data according to the parameters of the case (people mentioned, places, events, objects) and ranks the contents by presumed relevance. Where an investigator might spend weeks listening to calls or poring over PDFs, Longeye promises results in hours.
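Longeye's internals are not public, so the following is only a minimal sketch of what relevance-based triage could look like in principle. All names, fields, and the keyword-counting heuristic are hypothetical; a production system would rely on far richer semantic models.

```python
from dataclasses import dataclass

@dataclass
class EvidenceItem:
    """One piece of ingested evidence (a call transcript, a document page, ...)."""
    item_id: str
    text: str
    score: float = 0.0

def prioritize(items, case_terms):
    """Rank evidence by how often it mentions the case parameters
    (people, places, events). Keyword counting merely stands in for
    the idea of automated relevance triage."""
    for item in items:
        item.score = sum(item.text.lower().count(t.lower()) for t in case_terms)
    return sorted(items, key=lambda it: it.score, reverse=True)

# Hypothetical case parameters and evidence
terms = ["Dupont", "warehouse", "March 3"]
evidence = [
    EvidenceItem("call_017", "Nothing of note, a routine family call."),
    EvidenceItem("call_212", "Dupont said to meet at the warehouse on March 3."),
]
ranked = prioritize(evidence, terms)
print(ranked[0].item_id)  # the call mentioning all three terms surfaces first
```

The point of the sketch is the workflow, not the scoring: evidence goes in, case parameters drive the ranking, and the investigator's attention is directed to the top of the list.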
The most notable element, and the one of particular interest to lawyers, lies in how the tool constrains its own use of AI. Each generated summary links automatically back to the primary evidence: no summary is displayed alone, no correlation is presented without its original extract. Longeye thereby seeks to avoid gray areas, sourceless automatic interpretations, and above all the "hallucinations" that so weaken language models. This lets the defense verify, challenge, or recontextualize the tool's conclusions, a marked departure from the usual dynamics of police technology.
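The "no summary without its source" rule can be expressed as a design constraint: make it structurally impossible to create a summary object that does not cite primary evidence. The sketch below illustrates that idea under assumed, hypothetical types; it is not Longeye's actual data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SourceExtract:
    """Pointer back to primary evidence: which file, and where in it."""
    file_id: str
    start: float  # e.g. seconds into a recording
    end: float

@dataclass(frozen=True)
class LinkedSummary:
    """An AI-generated summary that cannot exist without at least one
    citation to the underlying evidence."""
    text: str
    sources: tuple  # tuple of SourceExtract

    def __post_init__(self):
        if not self.sources:
            raise ValueError("a summary must cite its primary evidence")

# A well-formed summary carries its extract along with it.
s = LinkedSummary(
    text="Suspect arranges a meeting at the warehouse.",
    sources=(SourceExtract("call_212", start=34.0, end=58.5),),
)

# An unsourced summary is rejected at construction time.
try:
    LinkedSummary(text="Unsourced claim.", sources=())
except ValueError:
    print("rejected: no source extract")
```

Enforcing the citation in the constructor, rather than in the display layer, is what makes the guarantee auditable: any summary the defense sees necessarily carries a pointer it can follow back to the original recording or document.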
In one criminal investigation, for example, an incarcerated suspect made nearly five hundred phone calls. Investigators ended up listening to everything manually, discovering compromising statements along the way. When the same files were fed into Longeye, the tool isolated the key passages within hours. For lawyers, this kind of situation raises as many questions as it answers: if technology speeds up the investigators' work, can it be made available to the defense to identify inconsistencies, omissions, or exculpatory elements lost in the shuffle? And what becomes of equality of arms if only some actors have this analytical capacity?
On this point, Longeye takes an unusual position: it intends to offer certain modules free of charge to court-appointed lawyers. The argument is as strategic as it is ethical. A justice system that accelerates analysis only for the prosecution automatically creates blind spots for the defense. Conversely, a tool accessible to both sides could strengthen the quality of adversarial debate and shorten investigations, provided, of course, that standards of transparency and auditability are respected.
Longeye arrives at a pivotal moment. In the United States, several states are already debating standards to govern the use of AI in investigations. In Europe, the AI Act is set to classify certain technologies as "high risk," imposing obligations of traceability, documentation, and oversight.
For lawyers, the question is no longer theoretical. If the police equip themselves with tools that can analyze colossal case files in a few hours, can the defense keep working without equivalent instruments? Can a rigorous counter-investigation be conducted without the ability to explore large bodies of data? And how can we ensure that these tools remain a support for adversarial scrutiny rather than a unilateral technological advantage?
Longeye is still early in its deployment, but its philosophy of AI designed to be verifiable and contestable could well become a model for the next generation of investigative technologies.