Large Hadron Collider and Dark Matter Bring Gregory Peim to Northeastern University
by Jessica Driscoll
Gregory Peim, 26, is a graduate student pursuing a doctoral degree in physics, conducting research under the direction of Professor Pran Nath. Originally from New Jersey, he expects to graduate this year.
Peim says he started to conduct his present research in January 2010 and chose his focus before coming to Northeastern University.
“I applied to the graduate program at Northeastern to work with Professor Pran Nath because of his reputation as a researcher,” he says. “I was interested in working on supergravity and the ability to test such models experimentally, e.g. at the Large Hadron Collider and in dark matter experiments.”
Peim says during the initial data taking stage of the Large Hadron Collider (LHC), his research focused on the potential to discover new physics in the early runs.
“Such studies included the crucial understanding and proper simulation of background processes. In the same work, these Standard Model processes were then used to investigate the LHC’s discovery reach in supergravity parameter space, a projection that is in excellent agreement with the experiments’ present reach,” he says.
“Parameter points with varying sparticle spectra were analyzed to find the most encouraging modes of discovery. These parameter points were also investigated in indirect and direct detection dark matter experiments and were found to be testable in multiple experiments. We later explored parameter points where the mass of the lightest neutralino was roughly half the mass of the light CP-even Higgs, i.e. Higgs pole models. Such parameter points have very predictable spectra and give distinct signatures in several experiments. We successfully showed how, using potential results from a variety of experiments, one can reconstruct the gaugino sector of the model by measuring the peak of the effective mass distribution, the edge of the dilepton invariant mass distribution, and the spin-independent neutralino-proton cross section.”
Peim says that in early 2011, the CMS and ATLAS Collaborations started to release their first 7 TeV analyses of physics beyond the Standard Model.
“The limits they found in the mSUGRA/CMSSM parameter space surpassed those from the Tevatron,” he says. “My colleagues and I began to study how their results could be extended to other regions of the parameter space. We also compared the reach of CMS and ATLAS with the indirect SUSY constraints (e.g. from flavor physics). Specifically, we found that a significant portion of the parameter space excluded by the LHC was essentially already excluded by the indirect constraints, and that the majority of the parameter space was yet to be probed.”
Additionally, Peim said, his group explored the implications that these results had on direct detection dark matter experiments.
“It was found that within supergravity models the LHC had excluded a large region of the signature space at direct detection experiments. The analysis was then extended to supergravity models with non-universal soft breaking in the gaugino sector. In this case, we found that part of the excluded dark matter region became repopulated, thus providing a signature for observing nonuniversality.”
Peim said he and his collaborators also investigated the Hyperbolic Branch in great detail and showed that it consists of three regions: the Focal Point, Focal Curves, and Focal Surfaces.
“These focal regions allow for a small Higgs mixing parameter while the scalar masses become large, of order a TeV or larger,” he said. “Applying the LHC’s experimental SUSY results to constrain these regions, we found that in the case of universal soft breaking the Focal Point region was depleted, while regions on Focal Curves and Focal Surfaces remained largely intact.”
Following the first evidence for the Standard Model Higgs boson at the LHC in December 2011, which indicated a signal in the range 115 GeV to 131 GeV, Peim said he and the group worked out the implications of this result within supersymmetry.
“The analysis, done in the framework of SUSY with gravity mediation, showed that one needed a large scalar mass and a large trilinear coupling, with a sizable ratio between them, in order to generate a loop correction that could boost the Higgs boson mass into the allowed range,” he says. “Once the LHC’s data confirmed the discovery of the (Higgs) boson with a mass near 125 GeV, we carried out a Bayesian analysis to identify more precisely the regions of the parameter space consistent with the measured boson mass. Our findings showed that the universal gaugino mass could be in the sub-TeV region, the scalar mass was typically a TeV or larger, and the ratio of the scalar mass to the trilinear coupling was confined to a narrow strip less than or equal to 1. Further, we used our Bayesian analysis to set 95-percent confidence level lower bounds on sparticle masses. Additionally, we observed that the spin-independent neutralino-proton cross section lies just beyond the current experimental sensitivity.”
Peim said the majority of current studies only consider the case where one fundamental particle contributes to cold dark matter, but there is no overriding principle that requires such a restriction.
“Dark matter may in fact be composed of several components,” he says. “A branch of my research has been to investigate such a possibility. In one paper, my collaborators and I proposed extending the Minimal Supersymmetric Standard Model by a hidden sector field that includes both fermionic and baryonic stable particles as dark matter candidates, the first paper in the literature to do so. These models were required to be consistent with the cold dark matter relic density observed by WMAP and were found to successfully explain the excess seen in the PAMELA experiment while remaining consistent with direct detection experiments. More specifically, an exploration of the case where the dark matter consists of Dirac and Majorana particles showed that the Dirac component can fit the PAMELA data via its annihilation close to a Breit-Wigner pole, while the Majorana component remains the dominant component and can be detected in direct detection dark matter experiments. Additionally, we showed that in the multi-component picture it is possible to generate events that can be tested by XENON-100 and other ongoing direct detection experiments. Further, allowing a leptophilic gauge symmetry in the model can produce a discoverable Z′ vector boson with signatures in the leptonic final states.”
Peim says he has also investigated whether there is some underlying principle behind the ratio of the dark matter relic density to the baryonic relic density being about 5, the so-called cosmic coincidence.
“We extended the Standard Model as well as the Minimal Supersymmetric Standard Model using the Stueckelberg mechanism to explain the cosmic coincidence,” says Peim. “We discussed several candidate models for asymmetric dark matter using a variety of operators constructed from Standard Model fields, which transfer the asymmetry to the dark matter sector in thermal equilibrium in the early universe. The Stueckelberg extension provides a mechanism to deplete the symmetric component of dark matter produced by thermal processes. In the Minimal Supersymmetric Standard Model extension, the model has two dark matter candidates, the additional one being the stable neutralino. For a broad class of supersymmetric models we found that the neutralino could be the subdominant component of dark matter and still be consistent with experimental SUSY constraints (including the recent Higgs result). Additionally, these models are accessible at future direct detection experiments and can produce clear excesses at a muon collider.”
Peim has published 13 papers in Physical Review D, Physics Letters B, and Modern Physics Letters A.
“My papers center on a common theme: physics beyond the Standard Model and its experimental verification,” he says. “My future goal is to continue on my current research path and to help discover what lies beyond the Standard Model.”
Peim has a Hirsch index (h-index) of 11, a metric of a researcher’s activity and consistency: the largest number h such that h of the researcher’s papers each have at least h citations. Of his 13 papers, 11 have 11 or more citations. His most cited paper (so far) has 73 citations.
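The h-index can be computed directly from a list of citation counts. Here is a minimal sketch in Python; the citation counts are hypothetical, chosen only to be consistent with the figures quoted above (13 papers, 11 of them with 11 or more citations, and a maximum of 73):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break  # counts are sorted, so no later paper can qualify
    return h

# Hypothetical citation counts for illustration only
print(h_index([73, 40, 30, 25, 20, 18, 15, 14, 12, 11, 11, 5, 2]))  # prints 11
```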
Peim has presented at the Phenomenology 2011 Symposium, the Theoretical Advanced Study Institute in Elementary Particle Physics, and the Phenomenology 2012 Symposium. This year, he expects to give talks at the Phenomenology 2013 Symposium and the Brookhaven Forum 2013.
“Greg has worked on two very different areas of fundamental physics — the search for new physics at the LHC and the physics of the early universe,” says Professor Nath. “In each area, he has made outstanding contributions. He has published 13 papers, has over 350 citations and a Hirsch index of 11, which are significantly more than any other theory graduate student in recent memory.”