CMS uses AI to fully reconstruct LHC particle collisions for first time
Summary
CMS uses a new machine-learning algorithm to reconstruct LHC particle collisions faster and more precisely than traditional methods, improving data analysis.
CMS uses AI to reconstruct particle collisions
The CMS Collaboration has demonstrated, for the first time, that machine learning can be used to fully reconstruct particle collisions at the Large Hadron Collider. The new method is faster and more precise than the traditional algorithm the experiment has used for over a decade. The findings are detailed in a paper submitted to the European Physical Journal C and available on the arXiv preprint server.
Each proton collision at the LHC produces a complex spray of particles. For years, CMS has used a hand-crafted particle-flow algorithm to identify them. The new machine-learning-based particle-flow (MLPF) algorithm replaces that rigid logic with a single model trained on simulated collisions.
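To make the idea concrete, the sketch below shows the general shape of such an approach in PyTorch: a network takes detector elements (tracks and calorimeter clusters) as input and predicts, for each element, a particle type and a momentum. Every name, layer size, and feature count here is invented for illustration; the actual CMS model is far larger and uses a more sophisticated architecture.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: each detector element (track or calorimeter
# cluster) is a feature vector; the model predicts a particle class and
# momentum per element. This is only the core idea, not the CMS model.
N_FEATURES = 16   # input features per detector element (assumed)
N_CLASSES = 6     # e.g. none, charged hadron, neutral hadron, photon, e, mu

class TinyParticleFlowNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(N_FEATURES, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        self.classify = nn.Linear(64, N_CLASSES)  # particle type per element
        self.regress = nn.Linear(64, 3)           # e.g. pT, eta, phi

    def forward(self, elements):                  # (n_elements, N_FEATURES)
        h = self.encoder(elements)
        return self.classify(h), self.regress(h)

model = TinyParticleFlowNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on random stand-in data. In practice the targets
# come from simulation truth for labeled collision events.
elements = torch.randn(200, N_FEATURES)
true_cls = torch.randint(0, N_CLASSES, (200,))
true_mom = torch.randn(200, 3)

opt.zero_grad()
logits, momenta = model(elements)
loss = nn.functional.cross_entropy(logits, true_cls) \
     + nn.functional.mse_loss(momenta, true_mom)
loss.backward()
opt.step()
```

Because the targets come from simulation truth rather than hand-written rules, the model learns what particles "look like" in the detector, which is exactly the face-recognition analogy above.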
"Instead of being told how to reconstruct particles, the algorithm learns how particles look in the detectors," said Joosep Pata, the lead developer. "It's like how humans learn to recognize faces without memorizing explicit rules."
The new algorithm outperforms the old one
When tested on simulated data mimicking the current LHC run, the MLPF algorithm matched the traditional method's performance. In key tests, it performed even better.
For simulated events that produce top quarks, the new algorithm improved the precision of reconstructing particle sprays, known as jets, by 10% to 20% in important momentum ranges. This directly enhances the quality of the data physicists can analyze.
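As a rough illustration of how such a gain is quantified, jet precision is usually expressed as the width of the "response" distribution, the ratio of reconstructed to true jet momentum. The sketch below uses entirely synthetic numbers (the smearing widths are invented, not CMS results) just to show the comparison in outline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for reconstructed vs. true jet transverse momenta.
# In a real study these come from matching reconstructed jets to
# generator-level jets in simulated top-quark events.
true_pt = rng.uniform(30.0, 300.0, size=10_000)
reco_pt_baseline = true_pt * rng.normal(1.0, 0.12, size=true_pt.size)
reco_pt_mlpf = true_pt * rng.normal(1.0, 0.10, size=true_pt.size)

def resolution(reco, true):
    """Width of the jet response distribution (reco pT / true pT)."""
    response = reco / true
    # Robust width estimate: half the 16th-84th percentile spread.
    q16, q84 = np.percentile(response, [16, 84])
    return 0.5 * (q84 - q16)

r_base = resolution(reco_pt_baseline, true_pt)
r_mlpf = resolution(reco_pt_mlpf, true_pt)
print(f"baseline resolution:  {r_base:.3f}")
print(f"MLPF-like resolution: {r_mlpf:.3f}")
print(f"relative improvement: {(r_base - r_mlpf) / r_base:.1%}")
```

A narrower response distribution means individual jet momenta are measured more sharply, which is what the 10-20% figure refers to.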
"New uses of machine learning could make data reconstruction more accurate and directly benefit CMS measurements," Pata said. The improvements span from precision tests of the Standard Model to searches for new, undiscovered particles.
Reconstruction is now significantly faster
A major advantage of the MLPF algorithm is its speed. It can run efficiently on modern graphics processing units (GPUs), which are often much faster for this kind of task than the central processing units (CPUs) the traditional algorithm runs on.
This allows a full collision to be reconstructed far more quickly. The shift to GPU-optimized code is a critical step in handling ever-increasing amounts of data.
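The pattern that makes GPUs effective here is batched inference: processing many detector elements in parallel. The timing sketch below is purely illustrative, using a toy network rather than the CMS code, and the sizes are invented:

```python
import time
import torch
import torch.nn as nn

# A toy network standing in for the reconstruction model; sizes are
# invented. The point is only the batched-inference pattern.
model = nn.Sequential(nn.Linear(256, 1024), nn.ReLU(), nn.Linear(1024, 256))
batch = torch.randn(4096, 256)  # many detector elements at once

def time_inference(device):
    m = model.to(device)
    x = batch.to(device)
    with torch.no_grad():
        m(x)  # warm-up pass
        if device == "cuda":
            torch.cuda.synchronize()
        t0 = time.perf_counter()
        for _ in range(50):
            m(x)
        if device == "cuda":
            torch.cuda.synchronize()  # wait for queued GPU work to finish
    return (time.perf_counter() - t0) / 50

print(f"CPU: {time_inference('cpu') * 1e3:.2f} ms/batch")
if torch.cuda.is_available():
    print(f"GPU: {time_inference('cuda') * 1e3:.2f} ms/batch")
```

On typical hardware the GPU path wins precisely because a single collision produces thousands of detector elements that can be evaluated in one batched pass.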
The core benefits of the new system are clear:
- Matches or exceeds the precision of the traditional algorithm.
- Improves jet reconstruction precision by 10-20% in key tests.
- Runs far more quickly on modern GPU hardware.
Preparing for the High-Luminosity LHC
While the algorithm has so far been tested under current running conditions, it is expected to become even more important in the future. The High-Luminosity LHC upgrade, scheduled to start in 2030, will deliver about five times more particle collisions.
This deluge of data poses a significant challenge to the experiments' computing systems. The faster, more efficient MLPF algorithm will be essential for managing that workload.
By teaching reconstruction algorithms to learn directly from data, physicists are not just improving performance. They are fundamentally redefining what is possible in experimental particle physics for the next decade and beyond.