Optimizing nuclear physics with machine learning

To help save time and money, scientists have started using new tools provided by machine learning. Nuclear physics has seen a flurry of machine learning initiatives launch over the past few years, and numerous papers have been published on the topic. This burst of work using artificial intelligence is now summarized by 18 authors from 11 institutions in “Machine Learning in Nuclear Physics,” an article recently published in Reviews of Modern Physics.

According to Amber Boehnlein, the paper’s lead author and the associate director for computational science and technology at the U.S. Department of Energy’s Jefferson Lab, it was important to document the work that has been done. She said the team wants to raise the profile of machine learning in nuclear physics and help people see the breadth of the activities.

Boehnlein hopes the article will serve both as a roadmap for future projects and as an educational resource for readers interested in the topic, since it compiles and analyzes the significant work done in the field so far.

People can use it as a benchmark as they move on to the next step, she said.

An advancement in machine learning

In March 2020, Boehnlein and two of her co-authors, Witold Nazarewicz and Michelle Kuchera, attended a workshop at Jefferson Lab that explored artificial intelligence. After writing a follow-up report, they were motivated to take things a step further. They decided to survey the current machine learning efforts in nuclear physics, together with 15 other colleagues representing all of the subfields of nuclear physics.

They started at the beginning. As the authors note, the first substantial work using machine learning in nuclear physics came in 1992, when computer experiments were used to examine nuclear properties such as atomic masses. Although this work suggested the potential of machine learning, its use in the field remained limited for more than two decades. That has changed in recent years.

Machine learning involves building models that let computers carry out specific tasks, such as complicated computations, without explicit instructions. Recent improvements in computing power have made it easier for physicists to apply machine learning in their research.
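To make that idea concrete, here is a minimal, purely illustrative sketch (not drawn from the review itself): a small neural-network regressor is trained to reproduce nuclear binding energies across part of the nuclear chart, with synthetic values from the semi-empirical mass formula standing in for measured data. The constants, grid, and model choices are assumptions made only for the example.

```python
# Illustrative sketch: fit a neural network to nuclear binding energies.
# Synthetic "data" come from the semi-empirical mass formula; real studies
# would instead use evaluated experimental masses.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def semf_binding_energy(Z, N):
    """Bethe-Weizsaecker binding energy in MeV (pairing term omitted)."""
    A = Z + N
    return (15.8 * A                                  # volume
            - 18.3 * A ** (2 / 3)                     # surface
            - 0.714 * Z * (Z - 1) / A ** (1 / 3)      # Coulomb
            - 23.2 * (N - Z) ** 2 / A)                # asymmetry

# Build a training set over a patch of the nuclear chart.
Z, N = np.meshgrid(np.arange(20, 90), np.arange(20, 130))
mask = (N >= Z) & (N <= 1.6 * Z)                      # keep roughly realistic nuclei
X = np.column_stack([Z[mask], N[mask]])
y = np.array([semf_binding_energy(z, n) for z, n in X])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0),
)
model.fit(X, y)

# Predict the binding energy of a nucleus outside the training grid (Z=92, N=146).
print(model.predict([[92, 146]]))                     # MeV, rough extrapolation
```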

Because use of these techniques has grown so quickly, Boehnlein noted, the study would have been less interesting in 2019, when there would not have been enough work to catalog; now there is significant work to cite.

Today, machine learning is used in research across all scales and energies, from studies of the building blocks of matter to studies of the life cycles of stars. It also appears across the field’s major areas of activity: theory, experiment, accelerator science and operations, and data science.

Co-author Kuchera, an associate professor of physics and computer science at Davidson College, said the team attempted to assemble a comprehensive, community resource that bridges the work across the subfields, which she hopes will generate lively discussion and innovation across nuclear physics.

Machine learning models can benefit nuclear physics experiments in both their planning and execution. They can also help analyze the data from these experiments, which often amounts to petabytes.

Kuchera predicted that machine learning would be integrated into data collection and processing.

Machine learning will speed up these procedures, which could reduce beam time, computing use, and other experimental costs.

Bringing theory and experiment together

However, nuclear theory has so far seen the strongest growth in machine learning. Nazarewicz, a nuclear theorist and chief scientist at Michigan State University’s Facility for Rare Isotope Beams, is particularly interested in this topic. He says machine learning can help theorists make predictions, improve and simplify models, and quantify the uncertainty associated with those predictions. It can also be used to analyze phenomena, such as neutron stars and supernova explosions, that cannot be recreated in experiments.

According to Nazarewicz, neutron stars are not very user-friendly.

He uses machine learning to analyze superheavy elements and hyperheavy nuclei, which contain so many protons and neutrons that they cannot be observed experimentally.
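As a hedged illustration of the uncertainty quantification mentioned above (a generic sketch, not a method reported in the review), the snippet below fits a Gaussian process to a handful of invented theory-versus-experiment residuals along an isotopic chain and returns a prediction with an error bar for an isotope outside the fitted range.

```python
# Illustrative sketch: Gaussian-process emulation of theory residuals,
# giving a prediction with an uncertainty band. All data values are invented.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical residuals (experiment minus model, in MeV) along an isotopic chain.
neutron_number = np.array([[50], [52], [54], [56], [58], [60]])
residual = np.array([0.41, 0.35, 0.22, 0.10, -0.05, -0.18])

kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=0)
gp.fit(neutron_number, residual)

# Extrapolate the correction to an unmeasured isotope with a 1-sigma uncertainty.
mean, std = gp.predict([[64]], return_std=True)
print(f"predicted residual at N=64: {mean[0]:.2f} +/- {std[0]:.2f} MeV")
```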

Boehnlein said the results she found most striking were in the theory community; the low-energy theory community, which Nazarewicz is part of, has embraced these techniques.

According to Boehnlein, theorists at Jefferson Lab have begun to use these tools to explore the structure of protons and neutrons. For example, quantum chromodynamics, the theory describing the interactions of the quarks and gluons that make up protons and neutrons, is complex, and machine learning can help extract information from it.

The authors predict that applying machine learning to both theory and experiment will accelerate each area individually and strengthen the connections between them, speeding up the entire scientific process.

The faster the field can cycle between experiment and theory, the faster it will arrive at discoveries and applications, Nazarewicz said. Nuclear physics drives discoveries that deepen our understanding of the cosmos, and it is also applied for societal benefit.

The authors anticipate further advances and broader applications of machine learning as the field continues to flourish.

According to Boehnlein, the use of machine learning in nuclear physics is still developing. The paper will serve as a resource along the way, even for its own authors.

Her research focuses on machine learning methods, so she will certainly use the paper as a window into the current state of machine learning across nuclear physics, Kuchera said. She hopes it will serve as a resource for understanding the current state of machine learning research, allowing the community to build on these efforts.
