Alumni

NPA Seminar, Flavio Cavanna, Fermilab and University of L’Aquila, "The path of the DUNE Experiment at a turning point."

Getting there! DUNE, with two 17 kt LArTPC Far Detector modules (FD1 and FD2), a Near Detector Complex, and a 1.2 MW neutrino beam, is well on its way to starting physics in 2028 at SURF (SD). Mass Ordering and sensitivity to Maximal CPV - the initial goals of the flagship Long-Baseline (LBL) Neutrino Program - are within reach.
The time has come to define a strategy to achieve the ambitious ultimate precision in the LBL physics goals and possibly further expand the DUNE science scope into the low-energy domain of rare underground physics and BSM searches.

Dissertation Defense: Hannah Bossi, Yale University, "Novel Uses of Machine Learning for Differential Jet Quenching Measurements at the LHC"

At sufficiently high temperatures and pressures, QCD matter becomes a hot, dense, deconfined medium known as the Quark-Gluon Plasma (QGP). Collisions of relativistic heavy ions are used to recreate the QGP, providing a rich laboratory for exploring the mysteries of the strong interaction. The intrinsic and dynamic properties of the QGP are probed with jets: narrow cones of particles resulting from the scattering of quarks and gluons at high momentum transfer.

Dissertation Defense: London Cooper-Troendle, Yale University, "First Measurement of Inclusive Muon Neutrino Charged Current Triple Differential Cross Section on Argon"

The field of accelerator neutrino experiments is entering an era of precision oscillation measurements in which the remaining unknown oscillation parameters will be determined. The upcoming DUNE and Hyper-K experiments aim to determine the neutrino mass hierarchy and the degree of Charge-Parity (CP) violation in the neutrino sector, providing potential insight into the matter-antimatter imbalance observed in the universe. However, these experiments require highly accurate measurements, and neutrino cross section modeling uncertainties may limit their capabilities.

Inference Project Virtual Talk: Inference in a Nonconceptual World

Classical models of inference, such as those based on logic, take inference to be *conceptual* – i.e., to involve representations formed of terms, predicates, relation symbols, and the like. Conceptual representation of this sort is assumed to reflect the structure of the world: objects of various types, exemplifying properties, standing in relations, grouped together in sets, etc. These paired, roughly algebraic assumptions (one epistemic, the other ontological) form the basis of classical logic and traditional AI (GOFAI).
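As a minimal sketch of what such a conceptual representation looks like (the predicate and constant names here are invented for illustration, not drawn from the talk), a paradigm classical inference can be written in first-order logic as:

```latex
% A classical, conceptual inference: a universally quantified conditional over
% a predicate, instantiated to an individual term. The conclusion follows by
% universal instantiation and modus ponens.
\forall x \,\big(\mathrm{Raven}(x) \rightarrow \mathrm{Black}(x)\big),\qquad
\mathrm{Raven}(a) \;\vdash\; \mathrm{Black}(a)
```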

Dissertation Defense: Kaicheng Li, Yale University, "Searching for the Electron Neutrino Anomaly with the MicroBooNE Experiment Using Wire-Cell Reconstruction"

The Micro Booster Neutrino Experiment (MicroBooNE) is a leading large-scale Liquid Argon Time Projection Chamber (LArTPC) experiment designed for precision neutrino physics. The main scientific objectives of MicroBooNE include the investigation of the Low Energy Excess (LEE) observed by the MiniBooNE Experiment between 2002 and 2019 in the Booster Neutrino Beam (BNB) at Fermilab, measurements of neutrino-argon interactions, and the research and development of LArTPC technology. This thesis focuses on understanding the MiniBooNE LEE through charged-current electron neutrino interactions.

Inference Project Virtual Talk and Conversation, Inference: A Logical-Philosophical Perspective

In this talk, Professor Paseau will describe some of his work on inference within mathematics and more generally. Inferences can be usefully divided into deductive and non-deductive. Formal logic studies deductive inference, the obvious question here being: which formal logic correctly captures it? His view, defended in his recent monograph One True Logic (Oxford UP, co-authored with Owen Griffiths), is that any such logic must be highly infinite. In this Inference Project event, he will explain what this means and sketch some arguments for it.

WIDG Seminar: Zoltan Varga, Wigner Research Centre for Physics, "Investigating the role of the underlying event in the charm-baryon enhancement"

The factorization hypothesis states that the production cross section of heavy-flavor hadrons can be calculated as the convolution of three independent terms: the parton distribution functions of the colliding hadrons, the production cross section of the heavy quarks in the hard partonic process, and the fragmentation functions of the heavy quarks into the given heavy-flavor hadron species. The fragmentation function has traditionally been treated as universal, i.e., independent of the collision system.
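Schematically, with notation assumed here for illustration rather than taken from the talk (f the parton distribution functions of the colliding hadrons A and B, σ̂ the hard partonic cross section for heavy-quark production, and D the fragmentation function into the heavy-flavor hadron H), the factorized form reads:

```latex
% Factorization ansatz for heavy-flavor hadron production (illustrative notation):
% PDFs of the colliding hadrons, convolved with the hard partonic cross section
% for heavy-quark (Q) production, convolved with the fragmentation function of
% Q into the heavy-flavor hadron H.
\sigma_{AB \to H\,X} \;=\;
  f_{a/A}(x_a,\mu_F) \otimes f_{b/B}(x_b,\mu_F)
  \otimes \hat{\sigma}_{ab \to Q\bar{Q}}(x_a,x_b,\mu_F,\mu_R)
  \otimes D_{Q \to H}(z,\mu_F)
```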
