Increasing particle accelerator efficiency with AI, ML, and automation

As particle accelerator technology enters the high-luminosity era, extreme precision and unprecedented collision energies are becoming increasingly important. Given the Laboratory’s goal of lowering energy consumption and costs, the design and operation of CERN’s accelerators must be continuously improved to be as efficient as possible.

In response, the Efficient Particle Accelerators project (EPA) was formed, bringing together experts from various accelerator, equipment, and control groups at CERN to enhance accelerator efficiency.

Following a 2022 workshop to plan upgrades for the High-Luminosity LHC (HL-LHC), a think tank was established; it produced seven efficiency recommendations for the EPA to consider.

According to Alex Huschauer, an EPA member and the engineer in charge of the CERN PS, the goal was to examine efficiency from the widest possible perspective: the team wanted a framework that could be applied to every machine in the accelerator complex.

To accomplish this, the team developed nine work packages on efficiency that will be implemented over the years leading up to the start of the HL-LHC run.

According to Verena Kain, the EPA project leader, the efficiency think-tank discussions revealed that automation is the way forward. This entails both conventional automation and automation through AI and machine learning.

For example, artificial intelligence can help physicists overcome accelerator magnet hysteresis. This occurs when the field of iron-dominated accelerator magnets cannot be described by a simple mapping from the current in the electromagnet to the field it produces.

If this is not taken into account, it can lead to inconsistencies between programmed and actual fields, as well as degraded beam quality, such as reduced trajectory stability and precision. Today, these field errors are corrected by manual tuning, which takes time and effort.

“Hysteresis occurs because the actual magnetic field is defined by both the current in the power supply and the magnet’s history,” explains Kain. “What makes it difficult is that we can’t model it analytically: we can’t calculate exactly how much current is required to create the proper field for the beam in the accelerator magnet, at least not with the precision required. However, AI can learn from the magnet’s historical data and develop a precise model.”

In the coming years, the team plans to train the AI on all of CERN’s accelerator magnets, having already conducted preliminary tests with magnets in the SPS.
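To illustrate why history matters, consider the following toy sketch. It is not CERN’s actual method; it uses an invented, highly simplified "magnet" whose field trails its past currents, and shows that a fit using only the present current misses the memory effect, while a fit that also sees recent set-points captures it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "magnet": the field depends on the present current plus a memory
# term that trails past currents -- a crude stand-in for hysteresis,
# purely for illustration (real magnetic hysteresis is far more complex).
def toy_field(currents, lag=0.5, gain=0.3):
    mem, fields = 0.0, []
    for i in currents:
        mem = lag * mem + (1 - lag) * i   # exponentially fading memory
        fields.append(i + gain * mem)     # field = current + history term
    return np.array(fields)

# A random current programme standing in for a magnet cycling history.
currents = np.cumsum(rng.normal(0.0, 1.0, 500))
fields = toy_field(currents)

# Naive model: predict the field from the present current alone.
naive_coef, *_ = np.linalg.lstsq(currents[:, None], fields, rcond=None)
naive_rmse = np.sqrt(np.mean((currents * naive_coef[0] - fields) ** 2))

# History-aware model: features are the present current plus the last
# few set-points, so the fit can capture the memory effect.
window = 8
N = len(currents)
X = np.column_stack([currents[window - 1 - k: N - k] for k in range(window)])
y = fields[window - 1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
history_rmse = np.sqrt(np.mean((X @ coef - y) ** 2))

print(f"RMSE, current only:    {naive_rmse:.4f}")
print(f"RMSE, current+history: {history_rmse:.4f}")
```

In this toy setting a linear fit over a short current history is enough; the AI models mentioned in the article serve the same role of learning the field-from-history mapping where no analytical model is precise enough.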

Although experiments throughout the CERN accelerator complex already use automation, artificial intelligence and machine learning to help with data collection, a large portion of beam and accelerator control has so far been done manually.

“The majority of the lower-energy machines, such as the PS, were built at a time when modern automation was simply not feasible,” says Kain.

Scheduling is another area where automation has the potential to transform efficiency.

According to Kain, the accelerator complex produces its various beams sequentially, which requires coordination so that the beam can be extracted from one machine and injected into the next at the right time. The schedule sometimes has to be changed 20–40 times a day, and each change can take five minutes. Much of this work is currently done manually by personnel in the control center.

If this process is automated, control center operators will be able to spend more of their time on the beams and less on scheduling.
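The coordination problem can be pictured with a small sketch. This is a hypothetical, greatly simplified model (not CERN’s scheduling software): each beam passes through a chain of machines in order, no machine can handle two beams at once, and a greedy scheduler assigns start times:

```python
from dataclasses import dataclass

@dataclass
class Beam:
    name: str
    stages: list  # ordered (machine, duration) pairs, in arbitrary time units

def schedule(beams):
    """Greedy scheduler: start each stage as soon as the beam's previous
    stage has finished AND the machine in question is free."""
    machine_free = {}   # machine -> time at which it becomes free
    timeline = []       # (start, end, machine, beam name)
    for beam in beams:
        t = 0
        for machine, dur in beam.stages:
            start = max(t, machine_free.get(machine, 0))
            machine_free[machine] = start + dur
            timeline.append((start, start + dur, machine, beam.name))
            t = start + dur   # next stage cannot begin before this one ends
    return timeline

# Illustrative beams only; durations and machine chains are invented.
beams = [
    Beam("LHC filling", [("PSB", 1), ("PS", 2), ("SPS", 3)]),
    Beam("fixed target", [("PSB", 1), ("PS", 1), ("SPS", 4)]),
]
for start, end, machine, name in schedule(beams):
    print(f"{machine:>3}: {name} from t={start} to t={end}")
```

Automating rescheduling means recomputing such a timeline in seconds whenever a request changes, rather than reworking it by hand each of the 20–40 times a day the schedule shifts.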

The EPA also focuses on autopilots, automated fault recovery and prevention, automated testing and sequencing, automated parameter control and optimization, and automated LHC filling. Over the next five years, the team plans to carry out tests during LHC Run 3 and Long Shutdown 3.

Huschauer adds: “We will be using AI and automation for the accelerators on a large scale for the first time thanks to the EPA project. Better beam quality will allow us to run the complex for shorter periods of time, which will improve the quality of the physics data and use less energy overall.”
