William & Mary was well represented at the recent Town Hall Meeting on Hot & Cold Quantum Chromodynamics, part of the decadal long range planning process for the field of nuclear physics, hosted by the Massachusetts Institute of Technology.
“The meeting was part of a process that will take the better part of the year,” Jozef Dudek explained. “It’s supposed to outline the major thrusts and programs that the field of nuclear physics writ large will consider over the next five to 10 years.”
Dudek, the Sallie Gertrude Smoot Spears Associate Professor of Physics, was an invited speaker at the Hot & Cold QCD gathering, as were Justin Stevens, associate professor of physics, and Cristiano Fanelli, an assistant professor of data science who holds a bridge appointment between William & Mary and Jefferson Lab.
“We were asked as invited speakers to speak on behalf of the community,” Stevens explained. “So we’re not necessarily just representing ourselves, but providing some kind of context about the field for the rest of the community to understand where we are today and where the field is headed.”
Together, the three represented the theoretical, experimental and computational aspects of QCD, the theory of the strong force: the interaction that governs the quarks and gluons that make up the protons and neutrons in the nucleus of the atom.
“There’re a lot of universities that have a strong theory component for the field of hadron spectroscopy, and lots of universities that have strong experimental groups that work on this,” Stevens said. “But William & Mary is a little unique in really having pretty strong components of both of those, and more recently, a growing initiative in data science. And so it’s not unjustified that we were there speaking for all aspects of our part of the QCD field.”
“My talk was about the theoretical side,” Dudek said. “And really, I was trying to convince the audience that the field has undergone — to use that horrible phrase, ‘a paradigm shift.’ Terrible phrase, but people love it.”
He went on to explain that over the past decade the field has evolved to the point where theorists can use a mathematical tool known as lattice QCD to make predictions for the quantities that experiments actually measure.
“And then on the other side, we have the theoretical tools for analyzing the experimental data in a sufficiently rigorous way that the two sides of the ledger can actually be connected,” Dudek explained.
Stevens, the experimentalist, told the participants at the meeting about his work on the GlueX experiment at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility (Jefferson Lab) in Newport News, Virginia.
“There, we use a photon beam to produce interesting new particles,” he said. “We made connections to previous measurements that give us our first best evidence of something called an exotic hybrid meson, which has both a quark and an antiquark, but also constituent gluon contributions to its wave function.”
He added that he and the rest of the nuclear physics community expect the discovery of an entire family of similar particles in the foreseeable future. Stevens also spoke about the latest news in the search for other kinds of novel particles.
“There’s been production of other unexpected exotic states that aren’t just made out of a quark and an antiquark, like a meson, or three quarks, like a baryon. Rather, we’ve seen novel particles that are made out of four quarks or five quarks, called tetraquarks or pentaquarks,” he said. “And it turns out, there have been many of those observed in various experiments.”
But Stevens said that, so far, those observations are not consistent across the different ways such exotic states of matter are produced. He added that he expects the consistency problems to be resolved through experiments conducted with higher-energy photon beams, possibly through an energy upgrade to the Jefferson Lab accelerator or at the proposed Electron-Ion Collider at the DOE’s Brookhaven National Laboratory.
Fanelli gave an overview of applications of artificial intelligence and machine learning in quantum chromodynamics experiments. He describes himself as both a physicist and a data scientist, and points out that QCD experiments have been generating ever-larger data sets.
“The data we have from these experiments may be high-dimensional and may have important hidden correlations,” he said. “So we can use cutting-edge tools — data science tools — to get insight into the physics.”
Fanelli presented that overview as an invited plenary talk at the Town Hall.
“Artificial intelligence and machine learning techniques will be applied in nearly every system of the next QCD frontier experiments, like the Electron-Ion Collider,” he said. “AI/ML, supported by the growth of computational power, is enabling research in directions previously unexplored due to the complexity of the problems.”
All three are eminent practitioners and respected contributors to the field, as their invitations to the Town Hall meeting indicate. Dudek was recently invited to serve on the joint Department of Energy/National Science Foundation Nuclear Science Advisory Committee, which is charged with recommending a long-range plan for the advancement of the U.S. nuclear science research program, the culmination of the process that began with the Town Hall at MIT.
And Fanelli organized the second workshop on Artificial Intelligence for the Electron-Ion Collider, held in October at William & Mary. The event drew a large turnout of AI/ML experts from academia, industry and the national labs.
“The Electron-Ion Collider community is pretty active in AI,” Fanelli said. “The EIC will be commissioned in 2028 and this is unanimously deemed to be an opportune time to discuss how to leverage AI/ML in all phases of the EIC schedule.”
Joseph McClain, Research Writer