Dr. Susanne Westhoff
Assistant Professor of High Energy Physics
This is our bi-weekly seminar on particle physics at the intersection of theory and experiment. It takes place on Tuesdays from 16:00 - 17:00 in the Hilbert room, HG 02.802.
We discuss topics related to our research in a relaxed setting, with lots of time for discussion. If you would like to give a talk, please contact me: susanne.westhoff@ru.nl.
To uncover the sources of ultra-high-energy cosmic rays, three main observables are measured at the Pierre Auger Observatory: the cosmic-ray energies, the depths of maximum air shower development, and the arrival directions. At energies E > 8 EeV, the arrival directions exhibit a dipolar structure pointing away from the center of our Galaxy, indicating an extragalactic origin of cosmic rays at these energies. At the highest energies, E > 40 EeV, anisotropies at intermediate angular scales arise that correlate with the directions of nearby source candidates such as Centaurus A or a catalog of starburst galaxies. Combining these observations of anisotropies in the cosmic-ray arrival directions with the energy spectrum and shower depth distributions in a global fit allows for further interpretation: conclusions can be drawn on the contributions of individual source candidates, the density of sources, and the influence of extragalactic and Galactic magnetic fields.
The Standard Model Effective Field Theory (SMEFT) is an essential tool for probing physics beyond the Standard Model. With New Physics signals remaining elusive, deriving constraints on SMEFT Wilson coefficients is increasingly important in order to pinpoint the low-energy effects of New Physics. This talk presents comprehensive global fits of the SMEFT under the Minimal Flavour Violation (MFV) hypothesis. We establish global limits on Wilson coefficients using both leading-order and next-to-leading-order SMEFT predictions for various observables. Our findings highlight significant interplay among different observables, emphasizing the necessity of integrating diverse data from multiple energy scales in global SMEFT analyses. Even within this flavour-symmetric framework, where Flavour Changing Neutral Currents (FCNC) cannot be generated at tree level, they contribute significantly via Renormalization Group Evolution (RGE) effects. More generally, a consistent treatment of the RGE in global analyses is an often-overlooked aspect that proves crucial for properly analysing datasets spanning various energy scales.
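For illustration (standard SMEFT notation, not taken from the talk): the RGE mixing mentioned above arises because the one-loop running of the Wilson coefficients is governed by an anomalous-dimension matrix with off-diagonal entries, so a flavour-diagonal coefficient at the high scale can feed into FCNC operators at low energies:

```latex
% Schematic one-loop RGE for SMEFT Wilson coefficients C_i at scale mu;
% gamma_ij is the anomalous-dimension matrix.
\frac{\mathrm{d}\,C_i(\mu)}{\mathrm{d}\ln\mu} \;=\; \frac{1}{16\pi^2}\,\gamma_{ij}\,C_j(\mu)
% Off-diagonal gamma_ij mix operators under running: coefficients that are
% flavour-diagonal at the matching scale generate FCNC structures below it.
```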
Searches for physics beyond the Standard Model (BSM) are among the most vital tasks of particle physics today. Observational hints of BSM physics, however, are scarce, and so far BSM particles have eluded direct observation. This might be due to their high mass or their low interaction probability with known particles. BSM physics might therefore reveal itself first in minuscule deviations from Standard Model predictions. One hypothetical deviation caused by BSM physics is Lepton Flavour Violation in the charged lepton sector, i.e. transitions of electrons, muons or taus from one flavour to another, a process that is forbidden in the Standard Model. Its observation would therefore be an unambiguous sign of BSM physics. In this talk, I will present searches for lepton flavour violation at the Large Hadron Collider as well as at the dedicated muon experiment Mu3e.
In recent years, end-to-end machine learning algorithms have drawn much attention and have been shown to outperform previous machine learning approaches that rely on feature engineering. Do we still need feature engineering in HEP? In this talk, I will share my personal experience in the area of flavour tagging and tell the story of how we developed, validated, and finally deployed the new graph neural network-based tagger. We will see how feature engineering still plays a critical, if less glamorous, role. I will also discuss a recent phenomenology work on semi-visible jets in which a newly handcrafted variable proves very powerful. Fast-advancing AI technologies have opened up many opportunities, but classic approaches still hold power.
Hidden particles can help explain many hints for new physics, but the large variety of viable hidden sector models is a challenge for model-independent interpretations of new particle searches. Effective field theories (EFTs) that add higher-dimensional operators to the Standard Model (SM) describe the impact of heavy new particles in a model-independent way, but do not capture light ones. Adapting the EFT approach, we propose a method to streamline the computation of light hidden particle production rates by factorizing them into i) a model-independent SM contribution and ii) an observable-independent hidden sector contribution. This factorization uses portal effective theories (PETs) that list all operators coupling SM fields to generic hidden particles. We have developed a framework to construct such PETs and used it to derive portal chiral perturbation theories that couple the light pseudoscalar mesons to a single generic spin 0, 1/2, or 1 particle.
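Schematically (in illustrative notation of my own, not the speaker's), the factorization described above splits a hidden-particle production rate into a piece computed once within the SM and a piece that carries all hidden-sector model dependence:

```latex
% X -> Y + h: production of a generic hidden particle h in an SM process.
% c_i^hidden encodes the hidden-sector couplings (observable-independent);
% M_i^SM is the SM matrix element for operator i (model-independent).
\Gamma(X \to Y + h) \;\sim\; \sum_i \big|c_i^{\,\mathrm{hidden}}\big|^2
  \times \big|\mathcal{M}_i^{\,\mathrm{SM}}(X \to Y)\big|^2
% Each PET operator i contributes one such factorized term.
```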
I will discuss the use of Unruh detectors, a very simple theoretical model for particle detectors, as a simplified model for the decay of radioactive nuclei. Through this model, the effects of acceleration on the lifetime of unstable nuclei can be calculated, and the lifetime turns out to depend non-trivially on the acceleration. This acceleration-dependent decay width could potentially serve as an avenue for experimental verification of the Unruh effect, though some open questions remain to be answered to make the model better match reality.
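For context, the Unruh effect predicts that a uniformly accelerated detector perceives the Minkowski vacuum as a thermal bath at a temperature proportional to its proper acceleration:

```latex
% Unruh temperature for proper acceleration a:
T_U = \frac{\hbar\, a}{2\pi c\, k_B}
% Numerically T_U is about 4 x 10^{-21} K per m/s^2 of acceleration,
% which is why enormous accelerations are needed for observable effects.
```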
The quest for lepton number violation, represented primarily by searches for neutrinoless double beta decay (0vbb), is a prominent probe of physics beyond the Standard Model. The new particle physics underlying any potential lepton-number-violating signal can be parametrized within the framework of effective field theory in terms of a set of higher-dimensional operators triggering a variety of distinct mechanisms. While it seems challenging to unravel the dominant source from the observation of 0vbb itself, the (non-)conservation of lepton number can also be tested in a variety of other experiments. After going over the phenomenological consequences of double-beta-decay searches and their limitations, I will follow our recent analysis and discuss the complementary probes of lepton number violation and their interplay with 0vbb.
Dark Matter in the form of a new elementary particle is still the best explanation we have for the many gravitational effects observed in the universe that cannot be explained by visible matter. Direct detection experiments aim to directly observe rare interactions between Dark Matter particles and ordinary matter. These experiments involve sensitive detectors located deep underground to shield them from cosmic rays. While no definitive detection has been made yet, these experiments have placed increasingly stringent constraints on the properties of Dark Matter particles, ruling out certain theoretical models and narrowing the search space. This seminar will explore current technologies, focussing on the detection of weakly interacting massive particles (WIMPs), and discuss where the field can go with the next detector generation.
-- seminar starts at 13:45 --
In this talk we will discuss non-perturbative production of fermionic dark matter in the early universe, during the transition between inflation and radiation domination known as (p)reheating. Specifically, I will focus on the gravitational production mechanism accompanied by the coupling of fermions to the background inflaton field. The latter leads to a variation of the effective fermion mass during preheating and makes the resulting spectrum and abundance sensitive to its parameters. Assuming fast preheating that completes in less than the inflationary Hubble time and no oscillations of the inflaton field after inflation, we find an abundant production of particles with energies ranging from the inflationary Hubble rate to the inverse duration of preheating. The produced fermions can account for all observed dark matter in a broad range of parameters. As an application, we discuss the non-perturbative production of heavy Majorana neutrinos in the model of Palatini Higgs inflation.
The Higgs boson discovery in 2012 during Run 1 of the Large Hadron Collider (LHC) kickstarted a rich exploration of fundamental physics. Ten years later and with five times more data, LHC Run 2 has enabled us to study this central constituent of the Standard Model (SM) in great detail. The large dataset enables us to take a generic approach to uncover signatures of physics beyond the SM. The Standard Model Effective Field Theory (SMEFT) allows us to look for all possible new interactions mediated by particles too heavy to be directly produced at the LHC. I will present the latest Higgs boson measurements from the ATLAS experiment and their role in constraining the SMEFT parameter space. I will also discuss a global SMEFT interpretation within the ATLAS experiment covering the electroweak and Higgs sectors. These results demonstrate that a precision era at the high-energy frontier of particle physics is genuinely underway.
Measurement of D meson production is one of the most fundamental measurements in heavy flavour physics. D meson production is a crucial background in searches for new physics phenomena, and its measured cross section provides a direct constraint on perturbative quantum chromodynamics (QCD). Such measurements are in particular important for searches for new physics appearing in heavy flavour decays, such as lepton-flavour-violating tau decays at the LHC, which use the D meson as a source of taus. In this seminar, the current analysis performed using the ATLAS detector will be presented; prospects for the HL-LHC will also be discussed.
In the absence of any clear signs of new physics, the Standard Model Effective Field Theory (SMEFT) provides a model-independent picture by combining the largest possible datasets at the LHC and beyond with state-of-the-art theory predictions parameterized in terms of Wilson coefficients. In this talk, I will present recent work in this direction, focusing especially on the impact of the FCC-ee and HL-LHC on the combined top, Higgs and electroweak sectors. I will also demonstrate how Machine Learning (ML) assisted optimal observables help increase our sensitivity to new physics. Finally, I will briefly cover the automated matching of UV models and the related bounds obtained from a global fit.
We have explored the parameter space of the phenomenological minimal supersymmetric standard model (PMSSM), with a specific focus on the region with light neutralino dark matter (with mass less than half the mass of the Higgs boson) that is consistent with current collider and astrophysical constraints. We show that the latest results from LHC searches for sparticles, together with direct detection constraints from XENON and LUX-ZEPLIN, essentially rule out the entire region for positive sign of the Higgsino mass parameter mu, whereas for the negative sign only a very narrow region with light electroweakinos remains allowed. We further show that it should be possible to explore this region conclusively in Run 3 of the LHC. We have also studied the impact of a possible light stau on our results. Time permitting, I shall make some preliminary statements about the next-to-minimal extension, viz. the NMSSM, as well.
One of the cornerstones of the Standard Model is the breaking of the $SU(2)_L \times U(1)_Y$ symmetry to $U(1)_{EM}$ via the Higgs mechanism. This mechanism, spontaneous symmetry breaking (SSB), is traditionally discussed and taught in the Lagrangian formalism. In this talk, however, SSB and all its consequences will be derived in a purely diagrammatic manner, without resorting to Lagrangians or Lie groups.
FASER, an experiment at the LHC, was designed to explore the existence of light, weakly interacting particles that are generated in proton-proton collisions at the ATLAS interaction point and travel in the far-forward direction. The initial data analysis focused on two searches: the decay of dark photons into an electron-positron pair, and the charged-current interaction of muon neutrinos leading to the production of muons. The outcomes of these searches, based on a dataset collected during LHC Run 3 in 2022, will be presented.
The Standard Model of particle physics (SM) is a very successful theory. However, it fails to explain several phenomena observed in the Universe, such as the origin of Dark Matter and the matter-antimatter asymmetry. Experiments at the Large Hadron Collider at CERN look for Physics Beyond the Standard Model (BSM) that could explain these. One such experiment, LHCb, searches for decays that the SM predicts to be very rare but that are significantly enhanced in several BSM theories. An example of such decays are b → sl+l- transitions, which will be the focus of my talk. I will discuss several ways in which these are studied, as well as possible BSM implications and the latest results from LHCb.
The unique design of the LHCb detector, with its flexible trigger and precision vertex tracker, offers the possibility to search for long-lived particles (LLPs) with low masses and short lifetimes, in complementarity with the general-purpose detectors at the LHC. In this talk, I will discuss the searches that have been performed at LHCb and their future prospects. Although searches for low-mass LLPs decaying hadronically are particularly challenging, they can benefit from the upgrade of the LHCb online trigger. I will also discuss suggested phenomenological models for these searches, as well as the opportunities available during LHC Run 3.
B meson decays are important players in the search for physics beyond the Standard Model (SM) of particle physics. Because of the link to the matter-antimatter asymmetry in the universe, a key interest is also to improve the understanding of the mechanism of CP violation within the SM. The large amount of data gathered by the B factories and LHCb allows testing the SM with unprecedented precision, probing scales much higher than the reach of direct searches at the LHC. In this sense, B mesons serve as a telescope in the search for new interactions and particles. Doing so requires precise and reliable theoretical predictions. In this talk, I will present some of the challenges and new ideas to push the theoretical precision further. Specifically, I will focus on the determination of the CKM elements Vcb and Vub, and on how to handle non-leptonic decays.
Since the discovery of the Higgs boson in 2012 at CERN, particle physicists have studied its properties with ever higher precision. Any significant deviation from the Standard Model predictions in the experimental data would be an important hint of new physics. To perform these studies, we use the Higgs decay into two photons, which, despite its very low branching ratio, represents one of the cleanest channels experimentally. In this talk, I will present the most recent measurements, and their interpretation, of the Higgs production cross sections in the diphoton decay channel using proton-proton collisions recorded by the ATLAS detector at the LHC.
In many extensions of the Standard Model, the electroweak phase transition is first order. Such a phase transition proceeds via the formation and collision of bubbles. The bubble collisions can source a stochastic gravitational wave background signal with characteristic frequency right in the sensitivity band of LISA. We can thus use data from gravitational wave experiments to probe physics beyond the Standard Model. In this talk, I will focus on the contribution to the gravitational wave signal from the sound waves that form in the interactions between the plasma and the bubble walls. I will discuss the parameters that describe the phase transition and argue that the speed of sound plays a significant role. To quantify the importance of this effect, I will discuss the use of effective field theories for an accurate description of the phase transition parameters. I will then present the results of a recent accurate computation of the sound speed in a representative model.
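For reference (a standard result, not specific to the talk): the sound speed follows from the plasma's equation of state, and the frequently used bag model fixes it to a constant value, so deviations from that value in realistic models are precisely what modifies the predicted signal:

```latex
% Sound speed from the equation of state, with pressure p(T) and
% energy density e(T) of the plasma:
c_s^2 = \frac{\mathrm{d}p/\mathrm{d}T}{\mathrm{d}e/\mathrm{d}T}
% In the bag model (ultra-relativistic plasma), c_s^2 = 1/3 exactly.
```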
In PTOLEMY, we aim to develop a detector for the Cosmic Neutrino Background. This is done by measuring the endpoint of the electron energy spectrum of tritium decay in great detail. Current experiments are reaching the limits of energy resolution for measurements with molecular tritium; to improve the resolution further, we want to work with atomic tritium, which we will attach to a graphene base. We are currently testing the hydrogenation of graphene samples to gain understanding of this process before using tritium. In this talk I will introduce the PTOLEMY experiment and show the contribution of our group at Radboud.
It is becoming apparent that physicists can use Large Language Models (like the model underlying ChatGPT) for a variety of tasks, such as helping with coding simulations, accurately summarising papers, helping to frame experimental designs, aiding hypothesis generation, or simply answering information retrieval questions. In our project we are evaluating the capabilities and limitations of these large models in the context of how they might increase or extend scientific understanding. This includes questions such as how to conceptualise and test possible discoveries made by these models, what are good ways to use these tools to increase the efficiency and effectiveness of scientific research, and whether these models might at some point contain scientific understanding in a meaningful way, such that they could transfer this understanding to human scientists. We are currently creating a framework for generating understanding tests, which can be used to measure the degree of scientific understanding of agents. This can serve as a benchmark for different models, but it can also serve to establish whether a human can increase their understanding after interacting with these models.
In our efforts to ensure that no hint of physics beyond the Standard Model (BSM) eludes us, the possibility and exotic signatures of long-lived BSM particles (LLPs) have recently attracted the attention of the HEP community. We discuss our studies exploring and addressing the various challenges faced, from the triggering stage all the way to dedicated search analyses. We also plan to discuss the multitude of new proposals and experiments coming up to explore the lifetime frontier.
Neutrino telescope experiments are rapidly becoming more competitive in indirect detection searches for dark matter. Neutrino signals arising from dark matter annihilations are typically assumed to originate from the hadronisation and decay of Standard Model particles. I will discuss a supersymmetric model, the BLSSMIS, that can simultaneously obey current experimental limits while still providing a potentially observable non-standard neutrino spectrum from dark matter annihilation.
The (phenomenological) minimal supersymmetric SM predicts the existence of a supersymmetric partner particle to the tau: the supersymmetric tau particle, or stau for short. In my master’s internship, I simulate the ATLAS detector at CERN to find possible signatures of stau pair production. In particular, I make the case for a lower-luminosity LHC run to better observe or exclude the existence of staus in a certain mass range.
Despite ongoing experimental and theoretical efforts, the nature of dark matter (DM) remains elusive. Historically, most attention has gone to weakly interacting massive particles (WIMPs), whose parameter space is increasingly constrained. This has led the community to consider alternatives to the standard DM paradigm; one such alternative is strongly interacting massive particles (SIMPs). Moreover, collisionless DM is in tension with cosmological observations, known in the literature as the cusp-core problem. A velocity-dependent self-interaction in the dark sector alleviates this problem. In this talk I will show that scattering via a massive resonant particle can provide the correct velocity dependence, and apply this to a dark sector consisting of dark pions and dark photons. I will then show the requirements for reproducing the correct relic density in a SIMP setup, as well as constraints on the kinetic mixing between the dark photon and the Standard Model photon. I will conclude with a discussion of the validity of the model.
Machine Learning (ML) techniques are widely used for the analysis of collider events, especially for jet tagging (distinguishing jets originating from gluons, t-quarks, b-quarks, light quarks, W bosons, etc.). ML is also used for event classification between selected models (typically the SM versus a single BSM model). For years we have been working to extend the usage of ML techniques to a more general case, in which the signal could be a mix of various parameters of a single model, a mix of models, or even unknown models. Ultimately, we hope to find general methods to detect any anomalies in real experimental data.
The dark photon is a well-motivated new particle, arising from a renormalizable interaction with the photon field. This so-called vector portal can lead to a dark sector, which can contain candidates for dark matter. For suitable values of the parameters, the dark photon can be long-lived and therefore escape searches using prompt signals. We propose a different strategy, namely a displaced-vertex search, that is viable at e^+ e^- colliders such as Belle II and can probe these long-lived particles. We find that Belle II has excellent sensitivity to such dark photon signals and can probe an unexplored region of the parameter space using already collected data. For those who have heard my talk before: to give you something new, I will go into more detail on how I model the background for the search.