PACMAN LHC

Particle Accelerators and Machine Learning

Started: January 2, 2019
Status: In Progress

Abstract

Particle accelerator facilities have a wide range of operational needs when it comes to tuning, optimisation, and control. At the Large Hadron Collider (LHC) at CERN, reducing the risks related to the high beam power by reducing beam losses will lead to an increase in particle collision rates and a deeper understanding of the physics mechanisms. To meet these demands, particle accelerators rely on interactions with control systems, on fine-tuning of machine settings by operators, on online optimisation routines, and on databases of previous settings known to be optimal for a desired operating condition. We aim to bring Machine Learning (ML) to particle accelerator operation in order to increase performance. Each of these operational needs has corresponding ML-based approaches that could supplement the existing workflows. In addition, new HL-LHC and FCC designs will be proposed based on the LHC findings, preparing the ground for more effective operation of the future FCC.

People

Collaborators

SDSC Team:

  • Ekaterina Krymova
  • Guillaume Obozinski

PI | Partners:

Particle Accelerator Physics Laboratory:

  • Dr. Tatiana Pieloni
  • Dr. Michael Schenk
  • Loic Coyle

Description

Goal:

Minimise beam losses, achieve better control of accelerator parameters, and prevent unnecessary machine interruptions.

Impact:

We aim to implement the paradigm of digital twins, i.e. a virtual representation of the real-world accelerator. At the same time, this could open up new virtual and augmented reality opportunities, which are expected to be a major theme in the implementation of the future FCC.
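To illustrate the digital-twin idea, the minimal sketch below fits a surrogate model on stand-ins for logged machine settings and a beam-loss observable, and then queries it offline for a "what-if" setting. The setting names, ranges, and the toy relation between settings and losses are placeholder assumptions, not LHC data.

```python
# Minimal sketch of the digital-twin idea: fit a surrogate model on logged
# machine settings and a beam observable, then query it offline.
# All settings, ranges and the toy settings-to-loss relation below are
# placeholder assumptions, not real LHC signals.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical logged data: three machine settings (e.g. a tune, a crossing
# angle in microrad, a beta* in m) and a beam-loss proxy per fill.
n_fills = 2000
settings = rng.uniform(low=[0.30, 120.0, 0.25],
                       high=[0.32, 160.0, 0.40],
                       size=(n_fills, 3))
loss_proxy = (
    50.0 * (settings[:, 0] - 0.31) ** 2      # toy tune sensitivity
    + 0.001 * settings[:, 1]                 # toy crossing-angle contribution
    + 0.5 / settings[:, 2]                   # toy beta* contribution
    + rng.normal(scale=0.01, size=n_fills)   # measurement noise
)

X_train, X_test, y_train, y_test = train_test_split(
    settings, loss_proxy, random_state=0)
twin = GradientBoostingRegressor().fit(X_train, y_train)
print("held-out R^2 of the surrogate:", twin.score(X_test, y_test))

# "What-if" query: predict the loss proxy for a candidate setting without
# touching the real machine.
candidate = np.array([[0.310, 140.0, 0.30]])
print("predicted loss proxy:", twin.predict(candidate)[0])
```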

Proposed approach:

We propose to gather a massive amount of accelerator data in collaboration with the LHC Operation groups, to evaluate automatic and semi-automatic ways of optimising and steering the overall collider set-up, and to define the strategy for the operational aspects of future projects (i.e. HL-LHC and FCC). In parallel to the operational data accumulated during the physics runs, time will be devoted to machine development studies to test the robustness of the developed models used for an automated optimisation of the collider performance. In dedicated experiments we will ask the trained model to predict and set new parameters to improve the beam lifetimes in the LHC. Depending on the results obtained, extending the study and the models to other accelerators of the CERN complex and to future machines (HL-LHC) will be a natural path for a continuation of the collaboration.
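Such dedicated experiments could, for instance, use a trained surrogate inside a bounded optimisation step to propose new settings for validation on the machine. The sketch below uses a toy objective as a stand-in for the surrogate's prediction; the setting names, bounds, and starting point are illustrative assumptions, not operational values.

```python
# Sketch of the proposed optimisation step: search a bounded box of machine
# settings for the point minimising a predicted loss proxy, then validate the
# proposal in a dedicated machine-development experiment. The objective below
# is a toy stand-in for the trained surrogate's predict(); bounds, setting
# names and the starting point are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

bounds = [(0.30, 0.32), (120.0, 160.0), (0.25, 0.40)]  # tune, crossing angle, beta* (toy ranges)
x0 = np.array([0.315, 150.0, 0.30])                    # current operating point (placeholder)

def predicted_loss(x):
    # Stand-in for twin.predict(x.reshape(1, -1)) from the sketch above.
    tune, angle, beta_star = x
    return 50.0 * (tune - 0.31) ** 2 + 0.001 * angle + 0.5 / beta_star

result = minimize(predicted_loss, x0, method="L-BFGS-B", bounds=bounds)
print("proposed settings:", result.x)                  # candidate to test on the machine
print("predicted loss proxy at proposal:", result.fun)
# In the proposed workflow, the measured lifetimes from such an experiment would
# be fed back into the training data, closing the loop between model and machine.
```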

Gallery

Figure 1: LHC Fill, Beam Modes (from Wyszkowski, Przemysław Michał. ESB application for effective synchronization of large volume measurements data. Diss. AGH-UST, Cracow, 2011).
Figure 2: Schematic view of the LHC with two-beam design (from Brüning O, Burkhardt H, Myers S. The large hadron collider. Progress in Particle and Nuclear Physics. 2012 Jul 1;67(3):705-34).

Annexe

Additional resources

Bibliography

  1. G. Apollinari et al. (including T. Pieloni), “High-Luminosity Large Hadron Collider (HL-LHC): Preliminary Design Report – Chapter 2: Machine Layout and Performances”, Preliminary Design Report.
  2. L. Coyle, “Machine learning applications for hadron colliders: LHC lifetime optimization and designing Future Circular Colliders”, presented at the Annual Meeting of the Swiss Physical Society, EPFL, 2018.
