February 7th, 2019:
Met with Prof. Jabeen last week to discuss potential research options. For now, I am on the dark matter and higher dimensions pathway.
Began reading a paper from Prof. Jabeen's CMS logbook from last year.
February 12th, 2019:
Switched research groups to focus on theory rather than experimentation. Began reviewing the lecture notes.
Lecture notes (summarized by Dr. Agashe):
(1). The SM itself, including the idea of Feynman diagrams for the forces (EM, strong, and weak)
between matter particles (quarks and leptons) arising from the exchange of gauge bosons (photon, gluon, and W/Z),
plus the Higgs mechanism for giving mass, etc.
(2). Planck-weak hierarchy (and briefly dark matter) problems of the SM and their resolution by
adding new, ~TeV mass particles which are related to the SM by some "symmetry".
(3). What a particle propagating in a compact (for now generic) extra dimension looks like from the viewpoint of
the usual infinite 4D space-time, i.e., a KK tower corresponding to each SM particle
(based on a rough analogy with solving the Schroedinger equation for a particle in a 1D box).
(4). Specifically, a warped/curved extra dimension can explain why the ~TeV KK scale (as needed to address the problems of
the SM) is much smaller than the Planck scale. I gave the analogy with the expanding universe here, i.e., just as
3D (flat) space expands with time, here 4D (flat) space-time expands/contracts as we move along the
extra dimension (leading to an effective 4D mass scale that varies with location in the extra dimension).
So, the idea is that gravitational physics resides near one end (the UV/Planck brane), while the Higgs sector/weak scale originates
near the other, so the associated mass scales can be vastly different (due to the exponential warp/curvature factor); see the one-line summary after these notes.
(5). Outline of the signal that they will analyze, i.e., KK W -> radion (the particle parametrizing
fluctuations of the size of the extra dimension) + SM W, followed by radion -> 2 W's (in turn, each W -> quarks or leptons).
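A one-line way to remember item (4) (my own addition, not part of Dr. Agashe's notes): in a Randall-Sundrum-type warped geometry the 4D part of the metric is rescaled by an exponential warp factor, so a Planck-scale mass set near one brane shows up as a TeV-scale mass near the other. Sketch in equations (conventions and factors approximate):
    % Warped metric (Randall-Sundrum-type); y is the extra-dimensional coordinate,
    % k the curvature scale, r_c the size of the extra dimension
    % (sign of the dy^2 term depends on the metric convention).
    ds^2 = e^{-2k|y|}\,\eta_{\mu\nu}\,dx^{\mu}dx^{\nu} + dy^2
    % A mass scale of order M_Pl set at the UV/Planck brane (y = 0) is warped
    % down at the other end (y = \pi r_c); k r_c of roughly 11-12 already gives
    % the full Planck-weak hierarchy:
    M_{\mathrm{eff}} \sim M_{\mathrm{Pl}}\, e^{-k\pi r_c} \sim \mathrm{TeV}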
February 19th, 2019:
Went into more depth on Madgraph and the different variables and settings.
The model being used can be changed from the Standard Model to new theoretical models. Simulations run under a new model show what its new particles would look like in the data; finding those signatures in real data would be evidence of physics beyond the Standard Model.
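To get a feel for how switching models might be scripted, here is a rough Python driver sketch (my own, untested against our setup): it assumes a local MadGraph5_aMC@NLO install and uses the standard import model / generate / output / launch commands; the model name, process, and paths are placeholders, not our actual files.
    # Sketch: drive MadGraph in batch mode by writing its commands to a file
    # and passing that file to the mg5_aMC executable.
    # Model name, process, and install path below are placeholders.
    import subprocess

    MG5_BIN = "./MG5_aMC/bin/mg5_aMC"      # assumed install location

    commands = "\n".join([
        "import model Wkk_Radion_model",   # hypothetical BSM model (UFO format) instead of the default SM
        "generate p p > w+ j j",           # placeholder process, not our actual signal
        "output wkk_test_run",             # directory for the generated process code
        "launch",                          # run event generation (Pythia/Delphes can be enabled at this step)
    ])

    with open("mg5_commands.txt", "w") as f:
        f.write(commands + "\n")

    subprocess.run([MG5_BIN, "mg5_commands.txt"], check=True)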
February 25th, 2019:
Finished setting up Madgraph and the cluster, as well as Pythia and Delphes, and learned how to use both programs from within Madgraph. Homework for this week is to understand the LHCO file output.
First 9 runs from Madgraph:
March 5th, 2019:
Index of the LHCO output columns (a short parsing sketch follows this list):
#: Index of this particle within the event. Lines with # = 0 are event headers and record the event number. The run above was generated with 10,000 events.
typ: Type of the particle. Each particle type has a specific number assigned to it; jets are marked with type 4.
eta: Pseudorapidity of the particle's momentum, which encodes the angle between the momentum and the beam axis. It is 0 for momentum in the transverse plane and grows without bound toward the beam axis (eta = -ln tan(theta/2), where theta is the polar angle from the beam axis); particles with abs(eta) > 3 are cut. It is related to the hyperbolic tangent through eta = artanh(pz/|p|).
phi: Angle of rotation of the momentum of the detected particles in the plane transverse to the beam axis. Measured from 0 to 2*pi.
pt: Transverse momentum (sqrt(px^2 + py^2)).
jmas: Mass of the jet
ntrk: (Do more research)
btag: (Do more research)
had/em: Ratio of hadronic to electromagnetic energy for jets (type 4), ranging from 0.000 to 999.9. Low values indicate a high proportion of electromagnetic energy, while high values indicate a high proportion of hadronic energy; 999.9 indicates a jet that is effectively entirely hadronic.
dum1: Dummy variable in case another variable type is needed.
dum2: Dummy variable in case another variable type is needed.
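To check my understanding of this layout, a short Python parsing sketch (my own, not code we ran); the file name is a placeholder and I am assuming the 11-column order listed above:
    # Sketch: read an LHCO file into a list of events and pick out the jets (typ == 4).
    # Assumed column order: #  typ  eta  phi  pt  jmas  ntrk  btag  had/em  dum1  dum2
    # Lines starting with '#' are comments; a line whose first column is 0 starts a new event.

    def read_lhco(path):
        events = []
        with open(path) as f:
            for line in f:
                fields = line.split()
                if not fields or fields[0].startswith("#"):
                    continue
                if int(float(fields[0])) == 0:      # event header: "0 <event number> <trigger>"
                    events.append([])
                    continue
                typ = int(float(fields[1]))
                eta, phi, pt, jmas = map(float, fields[2:6])
                events[-1].append({"typ": typ, "eta": eta, "phi": phi,
                                   "pt": pt, "jmas": jmas})
        return events

    events = read_lhco("tag_1_delphes_events.lhco")   # placeholder file name
    jets_per_event = [[p for p in ev if p["typ"] == 4] for ev in events]
    print("events read:", len(events))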
Board from today:
Four measured quantities are transverse momentum, mass, eta, and phi; these can be converted into a four-momentum.
The Pythagorean theorem can be applied to delta eta and delta phi to find delta R = sqrt((delta eta)^2 + (delta phi)^2), a quantity representing the angular distance between two jets. It should be roughly pi for two identical, back-to-back decay jets; however, when there are more than 2 decay products and the output particles are unbalanced, it will vary.
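A concrete version of these conversions (my own sketch, not the group's analysis code); the invariant-mass helper is what the Z-boson histogram below is showing for two jets, if I understand the plot correctly:
    # Convert a measured (pt, eta, phi, m) into a Cartesian four-momentum,
    # and compute delta R and the invariant mass of a pair of objects.
    import math

    def four_momentum(pt, eta, phi, m):
        px = pt * math.cos(phi)
        py = pt * math.sin(phi)
        pz = pt * math.sinh(eta)
        E  = math.sqrt(px**2 + py**2 + pz**2 + m**2)
        return E, px, py, pz

    def delta_r(eta1, phi1, eta2, phi2):
        deta = eta1 - eta2
        dphi = abs(phi1 - phi2)
        if dphi > math.pi:               # wrap the azimuthal difference into [0, pi]
            dphi = 2 * math.pi - dphi
        return math.sqrt(deta**2 + dphi**2)

    def invariant_mass(p1, p2):
        E, px, py, pz = (a + b for a, b in zip(p1, p2))
        return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))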
Cells and histograms:
Invariant mass of a Z boson from two jets:
Transverse momentum of the first jet from this run:
Delta R for the two jets:
March 11th, 2019:
Downloaded Anaconda, a Python distribution that bundles almost 500 data-analysis packages, and began working through the list to determine which packages were most relevant. We had discussed converting the data-analysis code from Mathematica to Python; I am getting a head start on that so we will be able to move quickly once we have enough information.
March 12th, 2019:
Met with Peizhi and discussed the implementation of the Wkk and radion decay pathways. Implemented the new model on the cluster to allow Madgraph to run non-SM simulations. Awaiting an updated Mathematica notebook from Peizhi to run the new calculations, since the first one he sent us for the Wkk had a few bugs.
March 26th, 2019:
Discussed cuts in further depth. We ran a background comparison against the data to filter out some of the noise around the signal we were searching for. We also discussed other possible backgrounds to run against, as well as the option of projecting the current analysis onto detector and collider capabilities expected 30 or so years from now, in terms of raising the maximum collision energy and the luminosity.
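As a concrete (made-up) illustration of what a cut looks like in code, reusing the event dictionaries from the parsing sketch above; the thresholds are placeholders, not the values we actually used:
    # Sketch: a simple event-level cut on the jets from one LHCO event.
    # Thresholds are illustrative placeholders.
    def passes_cuts(event, pt_min=100.0, eta_max=2.5, n_jets_min=2):
        jets = [p for p in event if p["typ"] == 4
                and p["pt"] > pt_min and abs(p["eta"]) < eta_max]
        return len(jets) >= n_jets_min

    # Counting how many signal vs. background events survive the same cuts is
    # what tells us whether the cut is actually separating signal from noise.
    # n_signal_pass = sum(passes_cuts(ev) for ev in signal_events)
    # n_background_pass = sum(passes_cuts(ev) for ev in background_events)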
April 2nd, 2019:
Went over how to calculate the significance to determine how useful a simulation is. For S real signal events and B real background events, the significance is S/sqrt(S+B). A significance of 2 or 3 usually indicates that further research into the area could bear fruit; a significance above 5 indicates a discovery. By applying cuts that eliminate background noise, the significance can be improved and potentially lead to a discovery.
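A quick numerical check of the formula with made-up numbers (a cut that keeps most of the signal while removing background raises the significance):
    import math

    def significance(s, b):
        """Approximate significance S / sqrt(S + B) for S signal and B background events."""
        return s / math.sqrt(s + b)

    print(significance(50, 1000))   # before cuts: ~1.5
    print(significance(40, 200))    # after cuts:  ~2.6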
April 9th, 2019:
I was sick today, so I missed our weekly meeting. The others continued to adjust the different variables involved in the cuts and further refine the signal. Nick, John, and Peizhi managed to get a simulation with a significance over 3 by raising the collision energy from 13 TeV to 14 TeV and increasing the luminosity from 300 fb-1 to 3000 fb-1. In addition, Nick continued his work on converting the Mathematica analysis file to a ROOT file that could be run within the cluster.
April 16th, 2019:
Finished our final versions of our graphs. My Madgraph has been having issues since last week that I have not been able to resolve thus far; however, Nick and John were able to get successful final graphs through Python and Mathematica respectively. Their logbooks are located here and here, again respectively. We have begun to move into discussions about the final paper and presentation.
April 23rd, 2019:
Finalized discussions about the project. The meeting was essentially a Q&A session between Peizhi and us to clarify any remaining questions.