NOTE to future readers: The first section of these notes is somewhat disorganized and cluttered with research mistakes and false starts. If you turn to them for advice, read all of the parts so that you don't act on something I later realize was a mistake. A general summary of the research process can be found in the last entry (posted 5/14/17), which may be a good place to start.
2/14/2017
I met with Mr. Hong today (alongside Austin, John, David, and two graduate students) to discuss the plans for this semester's research.
After explaining the basics of the Standard Model, Mr. Hong outlined the purpose of MadGraph. He explained that the probability of a collision between two particles relates to their cross-sectional area, and that the outcome of the collision depends on the cosine of the angle of impact. The probability distributions for the particles created by the collision also depend on the matrix element M, which is determined by the theory of particle interactions. In this case, we will be using MadGraph to calculate distributions based on theories going beyond the Standard Model. We will also use Mathematica or another language to graph these distributions.
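For reference, the textbook relation for a generic 2 -> 2 scattering process in the center-of-mass frame is, schematically (conventions vary),

\[ \frac{d\sigma}{d\Omega} = \frac{1}{64\pi^{2}\, s}\,\frac{|\vec{p}_{f}|}{|\vec{p}_{i}|}\,|\mathcal{M}|^{2}, \]

so the scattering angle (through d\Omega) and the matrix element M together fix the distributions that MadGraph samples.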
Mr. Hong emailed us instructions on how to install MadGraph on our computers and the cluster, which I will do soon.
2/22/2017
We met with Mr. Hong again yesterday and learned some of the basics of using MadGraph. We were also introduced to Pythia, Delphes, and other particle simulation tools.
With the sage advice of John Martyn, I was able to install MadGraph on my cluster account. An outline of the process (as it should be done) follows:
1. Download CMSSW (https://github.com/cms-sw/cmssw), scp it to the cluster, and unzip it.
2. Download MadGraph (http://madgraph.phys.ucl.ac.be/), scp it to the cluster, and untar it.
3. Follow the steps here (https://sites.google.com/a/physics.umd.edu/umdt3/personalized-software/cmssw-environment) to set up the CMS environment.
The next log entry will focus on simulating a series of test events with MadGraph and Pythia.
2/28/2017
Today was another session with Mr. Hong (which will be outlined at the end of this entry).
It turns out that I had not installed MadGraph exactly as I should have: it's important to set it up in the "src" directory of the CMSSW folder and to run cmsenv before doing anything with MG.
For future reference, I use MadGraph 2.5.2, CMSSW 8.0.0, and SCRAM_ARCH slc6_amd64_gcc493.
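A rough sketch of that setup, assuming the standard CMS aliases (cmsrel, cmsenv) are available per the UMD instructions linked above; the MadGraph tarball name is illustrative:

setenv SCRAM_ARCH slc6_amd64_gcc493
cmsrel CMSSW_8_0_0
cd CMSSW_8_0_0/src
cmsenv
mv ~/MG5_aMC_v2.5.2.tar.gz .
tar -xzf MG5_aMC_v2.5.2.tar.gz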
To run MadGraph, I use "./bin/mg5_aMC" from the main directory of MG.
To handle hadronization and detector effects, I installed ExRootAnalysis, Pythia 6 (with PGS), and Delphes via the commands "install ExRootAnalysis", "install pythia-pgs", and "install Delphes" from the MG command prompt.
Mr. Hong outlined two ways to specify processes through MG. One used the MG command prompt, while another used the MG process card structure. I'll outline both.
In the first way, I import the model I'm using via "import model <NAME>" and then use "generate <PROCESS> @1". I can add more processes via "add process <PROCESS> @2/3/4/...". When satisfied, I type "output <DIRECTORY>" to place the results in a new directory.
In the second way, I copy "proc_card.dat" from the main MG folder. Then I edit the pre-set commands (which are analogous to the ones above) and run "./bin/mg5_aMC ./<NEW FILE>". This generates a new folder (whose name begins with "proc"), which I rename.
Once either of these is done, I move to the "Cards" subdirectory of the new directory and edit "run_card.dat" to update the parameters of the run (e.g. setting the number of events, nevents, to 1000 for a test). To simulate the process, I move to the main directory for the process and use "./bin/generate_events". Then I follow the on-screen instructions, turning Pythia and Delphes on as desired. The results are output as a set of gzipped files (of varieties including LHE, LHCO, and ROOT) which I can unzip to examine. At this stage I was only generating basic test processes; none of the events generated here are used later in this research.
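Condensed, the first (command prompt) workflow looks like this; the indented lines are typed at the MG5 prompt, and the names in angle brackets are placeholders:

./bin/mg5_aMC
    import model <MODEL NAME>
    generate <PROCESS> @1
    add process <ANOTHER PROCESS> @2
    output <DIRECTORY>
    exit
cd <DIRECTORY>
./bin/generate_events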
Now I will talk about what Mr. Hong told us today.
We learned that we will be using Condor for further simulations and that we can use the utility "root2lhco" (located within the Delphes folder in MadGraph) to convert the ROOT files output by the simulation software into easy-to-use LHCO files. To execute it, I run "Delphes/root2lhco <ROOT file> <NEW LHCO file>" from the main MadGraph directory.
Our goal will be to investigate the kinematic variables of the events (e.g. center-of-mass energy) to determine how we can tell "signal" events from "background" events. Mr. Hong compared it to the film 300: we are searching for a small but powerful group of events within a much larger group of background events.
3/11/2017
Austin and I got spaces in the "data" section of the cluster, so I set up CMSSW there. I'm using the same version as before.
Today Austin showed me how to set up an alias (which involves editing ~/.cshrc with the necessary command), so I made an alias called "gotomg" that sets up my CMS environment and moves me to the source folder containing MadGraph.
The code for it is:
"alias gotomg 'cd /data/users/hon-nolan/ && setenv SCRAM_ARCH slc_amg64_gcc493 && cd CMSSW_8_0_0/src && cmsenv'"
Since my last post, I have used Condor to submit some MadGraph jobs, and I've viewed the results in Root.
Mr. Hong gave us the Condor files we need (one JDL file and one SH file). To run Condor, I copy the JDL and SH files into the directory for the process I'll be running, edit the directories listed in their source as needed, and use condor_submit followed by the name of the JDL file. This outputs the results into the Events/0 directory. Then I use root2lhco and zcat to convert the results to LHCO and LHE files as necessary. I can also use a TBrowser in Root to view the plots in each ROOT file produced.
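For reference, a generic HTCondor submit file has roughly this shape; this is not the exact JDL Mr. Hong provided, and the executable and log names are placeholders:

universe   = vanilla
executable = run_events.sh
output     = condor_$(Cluster).out
error      = condor_$(Cluster).err
log        = condor_$(Cluster).log
queue

The accompanying .sh script is presumably what sets up the environment and runs ./bin/generate_events on the worker node.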
On Tuesday, Mr. Hong sent us a set of Mathematica functions that none of us were sure how to use. We met up today but didn't make much progress. I will try to figure out what I can over the rest of the weekend.
3/30/2017
I did not make much progress with Mathematica last time, and then Spring Break happened, so it was just last Tuesday (3/28) that Mr. Hong gave us Mathematica files that should be easier to understand. He will meet with us next Tuesday to walk us through using them.
4/6/2017
Last week, Mr. Hong gave us copies of the Radion_GKK_2 and Radion_GKK_3j_2 models for use in simulations. The former is designed for proton-proton collisions producing two photons and one jet; the latter is similar, but leads to three jets. "GKK" stands for the Kaluza-Klein gluon, which appears in the signal processes we are simulating. (I focused on the former process.)
Both models came in .tar.gz forms - to use them, I had to run "tar -xzvf" on each. I then moved them to the "models" subdirectory of my MadGraph installation, returned to the main folder, and ran MadGraph via "./bin/mg5_aMC".
To generate processes for each, I import the specific model I'm using via "import model <NAME>" (executed within MadGraph). Then I type the command "generate <PROCESS> @1" - the two processes I used were "p p > j j j" and "p p > a a j". I generated each process in separate runs of MadGraph, with only the necessary model used for each. (NOTE: these generated processes don't ensure that KK gluons appear - this will be addressed in the next entry.)
To set up Pythia and Delphes on both processes, I just ran a test run with both enabled. (I think there's another way that involves moving around the cards, but I haven't done that yet.) I submitted both jobs through Condor, obtained results, and used root2lhco and zcat to change the Delphes and Pythia files respectively into readable forms.
After that, I used WinSCP (https://winscp.net/eng/download.php) to connect to the cluster and download the result files to my personal desktop, where I have Mathematica installed. I will use the Mathematica code Mr. Hong gave us to analyze them.
4/13/2017
I met with John and Austin last Saturday to discuss our analysis of the data. It turns out I hadn't generated good signal events, so I ran MadGraph (as before) with the newer, better processes "p p > gkk > j j j" and "p p > gkk > a a j". These processes force the events to include an intermediate Kaluza-Klein excited gluon, making them part of the signal; otherwise, they are as before. The processes I mentioned earlier include the background as well as the signal, which makes it next to impossible to isolate the signal events. I pulled the data from the cluster to my home computer for analysis.
I haven't fully adapted the Mathematica code yet, though Austin has apparently gotten a working version. I will get my code in shape over the weekend so I can compare it with others and determine how to analyze the data best. I will post another log here once I've done that.
4/23/2017
I have (mostly) working Mathematica code that takes in the SM / KK events, applies cuts, and returns the results in the form of text files. The code, adapted from code by Mr. Hong for his own investigations, follows:
(*eventchar = SG / BK*)
SelectionCutAAJ[event_, eventchar_, xsecmultiplier_] :=
Module[{jetfind, phfind, METfind, visiblefind, Alljets, onejet,
twoph,(*\[Rule]Pre-cuts*)ptj, ptph, etaj, etaph, drjj,(*<-Pre-
cuts*)wgt,(*\[Rule]Observables*)PTj1, ETAj1, PTph1, PTph2, ETAph1,
ETAph2, Mjet, MAll, Met,(*<-Observables*)jx, phx, i},
(*Preselection cuts*)
wgt = 0;
ptj = 20;(*Min Subscript[Pt, j] in GeV*) ptph = 20;
etaj = 3;(*Max Subscript[\[Eta], j]*) etaph = 3;
drjj = 0.4;(*Min Subscript[\[CapitalDelta]R, jj]*)
DeleteFile["1_" <> ToString[eventchar] <> "_info_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_ptj1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_ptph1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_ptph2_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_etaj1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_etaph1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_etaph2_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_Mjet_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_MAll_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_MET_Presel.txt"];
For[(*For1*)i = 1, i < Length[event] + 1, i++,
jetfind = {}; phfind = {}; METfind = {}; Alljets = {};
visiblefind = {};
jetfind = FindParticles[event[[i]], {21}];
phfind = FindParticles[event[[i]], {22}];
METfind = FindParticles[event[[i]], {12}];
visiblefind = Table[v, {v, 1, Length[event[[i]]]}];
If[(*If0-visible-particle*)Length[METfind] > 0,
visiblefind = RemoveFrom[visiblefind, METfind], None(*If0-
visible-particle*)];
If[(*If1*)Length[jetfind] >= 1 && Length[phfind] >= 2 ,
onejet = {};
onejet = ptHighOrder2[event[[i]], jetfind, 1];
jx = {onejet[[1]][[
1]]}; (*This is one j for the current event*)
twoph = {};
twoph = ptHighOrder2[event[[i]], phfind, 2];
phx = {twoph[[1]][[1]], twoph[[2]][[1]]};
PTj1 = ptOf[event[[i]][[jx[[1]]]]];
ETAj1 = etaOf[event[[i]][[jx[[1]]]]];
PTph1 = ptOf[event[[i]][[phx[[1]]]]];
ETAph1 = etaOf[event[[i]][[phx[[1]]]]];
PTph2 = ptOf[event[[i]][[phx[[2]]]]];
ETAph2 = etaOf[event[[i]][[phx[[2]]]]];
(* DRjj=DeltaR[event[[i]][[jx[[1]]]],event[[i]][[jx[[2]]]]]; *)
If[(*If2*)(*example cut using the preselection thresholds defined above; replace with the cut under study*)
Abs[ETAj1] < etaj && Abs[ETAph1] < etaph && Abs[ETAph2] < etaph &&
PTj1 > ptj && PTph1 > ptph && PTph2 > ptph,
(*Compute observables*)
Mjet = Abs[InvMassSq[event[[i]], jx]]^(1/2);
MAll = Abs[InvMassSq[event[[i]], visiblefind]]^(1/2);
Met =
ptOf[event[[i]][[
METfind[[1]]]]]; (*There is only one MET in PGS lhco file*)
wgt = xsecmultiplier;
(*Saving information into text file*)
(*(0) Event Info*)
PutAppend[{i(*event#*), wgt(*weight*), jx(*j-pair*), phx(*ph-
pair*)}, "1_" <> ToString[eventchar] <> "_info_Presel.txt"];
(*(1) pT*)
PutAppend[{PTj1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_ptj1_Presel.txt"];
PutAppend[{PTph1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_ptph1_Presel.txt"];
PutAppend[{PTph2, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_ptph2_Presel.txt"];
(*(2) ETA*)
PutAppend[{ETAj1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_etaj1_Presel.txt"];
PutAppend[{ETAph1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_etaph1_Presel.txt"];
PutAppend[{ETAph2, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_etaph2_Presel.txt"];
(*(4) Subscript[M, jet]*)
PutAppend[{Mjet, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_Mjet_Presel.txt"];
PutAppend[{MAll, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_MAll_Presel.txt"];
PutAppend[{Met, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_MET_Presel.txt"];
, None(*If2*)]
, None
(*If1*)]
(*For1*)];
ClearMemory
(*Module*)];
This code produces good results for the 50K SM events (p p > a a j, generated without importing any BSM theories into MadGraph) and the 10K signal events (p p > gkk > a a j) I have generated. It outputs information on the transverse momenta, pseudorapidities, and masses of particles involved.
To run the code, I point the Mathematica notebook containing it at the folder holding the event files via SetDirectory[datadir], where datadir is the absolute path of that folder. Then I import the simulated events:
KKpgs = ReadPGSEventsAsMGME["KK_delphes_events.lhco"];
SMpgs = ReadPGSEventsAsMGME["SM_delphes_events.lhco"];
and set the weighting variables:
KKsigmafb = 10^3 * 0.0001324; (* taken from crossx.html file *)
SMsigmafb = 10^3 * 99.31; (* taken from crossx.html file *)
Lumifb = 300; (* 300 fb^-1 *)
XsecwgtKK = KKsigmafb*Lumifb/Length[KKpgs];
XsecwgtSM = SMsigmafb*Lumifb/Length[SMpgs];
NOTE: The sigmafb variables are supposed to be obtained from the crossx.html files and should be updated whenever new events are generated. The values shown here are the ones I had as of this logbook entry; updates will be mentioned later.
Then I create and move to a new "analysis" directory and run "SelectionCutAAJ[KKpgs, KK, XsecwgtKK]; SelectionCutAAJ[SMpgs, SM, XsecwgtSM];". After this I can plot the results with the following commands:
{KKPTph, KKPTphlog} =
ObsHistogramNormtoOne[
"1_KK_ptph1_Presel.txt", {0, 1600, 40}, {PlotRange -> All,
Frame -> True, PlotStyle -> {Directive[Blue, Thickness[0.005]]},
Filling -> Axis,
FrameLabel -> {"\!\(\*SubscriptBox[\(PT\), \(a\)]\)",
"Unit-normalized events"},
LabelStyle -> Directive[Bold, Black, 20],
FrameTicksStyle -> Directive[Bold, 20], Frame -> True,
FrameStyle -> Thickness[0.002], Background -> White(*,
PlotLabel\[Rule]Style["Dilepton: SG",25]*)}]; {SMPTph, SMPTphlog} =
ObsHistogramNormtoOne[
"1_SM_ptph1_Presel.txt", {0, 1600, 40}, {PlotRange -> All,
Frame -> True,
PlotStyle -> {Directive[Green, Dotted, Thickness[0.005]]},
Filling -> Axis,
FrameLabel -> {"\!\(\*SubscriptBox[\(PT\), \(a\)]\)",
"Unit-normalized events"},
LabelStyle -> Directive[Bold, Black, 20],
FrameTicksStyle -> Directive[Bold, 20], Frame -> True,
FrameStyle -> Thickness[0.002], Background -> White(*,
PlotLabel\[Rule]Style["Dilepton: SG",25]*)}];
InvMassSq and other functions for calculating kinematic variables came with the original code by Mr. Hong and are based on general, adaptable physical principles.
4/26/2017
We met with Mr. Hong on Tuesday and he informed me that most of my cutting should be done on the invariant mass of the particles in question. As such, I added functionality to deal with invariant masses to my code, which will be re-posted later. I have performed invariant mass cuts, and they are more effective than cuts on other data elements.
In addition, I discovered during my initial investigations that the cross-section for my background was so high that I could not generate enough events to represent the background properly in Mathematica (each simulated event would have stood in for many real events). To fix this, I generated 50,000 events with cuts imposed from within MadGraph, which makes the cross-section smaller and the events better suited to my research. In particular, I required that every event have a photon with pT > 300 GeV by editing run_card.dat and setting "xpta = 300" and "cut_decays = True".
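The relevant run_card.dat lines look roughly like this (the run card uses a "value = name ! comment" layout; the exact comments in the file may differ):

  True = cut_decays  ! apply cuts to decay products
  300  = xpta        ! minimum pT for at least one photon (GeV)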
The cutting's been going well so far - I'm currently at sigma values between 4.7 and 4.9, so a little more tuning of the cuts should be enough to push me past 5 sigma and into the realm of New Physics.
5/9/2017
It's been a while since my last entry - my apologies for that. A few days after my last post, I was able to get sigma ~= 5.1 at a luminosity of 300 fb^-1, which would be a strong indicator of New Physics (if experimentally validated). Other than that, it's been mostly paper / presentation / poster work - I've spent a lot of time getting the paper well-formatted in LaTeX. We've also met with Mr. Hong and Dr. Jabeen a few times to get their opinions on how to improve the paper and poster.
Now I will describe how I was able to get 5.1 sigma on the processes I've been working with.
One of the most important parts of the process is generating the right events - one must make sure the cross-sections are small enough for this analysis to work (i.e. small enough that each simulated event carries a weight of at most one). To do that, I cut the events within MadGraph as described above. I did this for both the signal and background events, to prevent any false differences from appearing between the two. I also imposed the same cuts within the Mathematica analysis in case any events slipped past through detector smearing.
As before, the luminosity was at 300 fb^-1. The cross-sections I used were:
KKsigmafb = 10^3 * 0.000132; (* taken from crossx.html file *)
SMsigmafb = 10^3 * .1052; (* taken from crossx.html file *)
Speaking of the Mathematica code, I have updated it so that I can plot data and calculate sigma values for earlier stages in the cutting. The updated main code is shown below:
SelectionCutAAJ[event_, eventchar_, xsecmultiplier_] :=
Module[{jetfind, afind, METfind, visiblefind, onejet,
twoph,(*\[Rule]Pre-cuts*)ptj, pta, etaj, etaa, draa,(*<-Pre-cuts*)
wgt,(*\[Rule]Observables*)PTj1, PTa1, PTa2, ETAj1, ETAa1, ETAa2,
DRaa, Mjet, Maa, Maaj, MAll, Met,(*<-Observables*)jx, ax, i},
(*Preselection cuts*)
wgt = 0;
ptj = 20;(*Min Subscript[Pt, j] in GeV*) pta = 300;
etaj = 3;(*Max Subscript[\[Eta], j]*) etaa = 3;
draa = 0.4;(*Min Subscript[\[CapitalDelta]R, jj]*)
DeleteFile["1_" <> ToString[eventchar] <> "_info_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_ptj1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_pta1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_pta2_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_etaj1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_etaa1_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_etaa2_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_DRaa_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_Mjet_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_Maa_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_Maaj_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_MAll_Presel.txt"];
DeleteFile["1_" <> ToString[eventchar] <> "_MET_Presel.txt"];
For[(*For1*)i = 1, i < Length[event] + 1, i++,
jetfind = {}; afind = {}; METfind = {}; Alljets = {};
visiblefind = {};
jetfind = FindParticles[event[[i]], {21}];
afind = FindParticles[event[[i]], {22}];
METfind = FindParticles[event[[i]], {12}];
visiblefind = Table[v, {v, 1, Length[event[[i]]]}];
If[(*If0-visible-particle*)Length[METfind] > 0,
visiblefind = RemoveFrom[visiblefind, METfind], None(*If0-
visible-particle*)];
If[(*If1*)Length[jetfind] >= 1 && Length[afind] >= 2 ,
onejet = {};
onejet = ptHighOrder2[event[[i]], jetfind, 1];
jx = {onejet[[1]][[
1]]}; (*This is one j for the current event*)
twoph = {};
twoph = ptHighOrder2[event[[i]], afind, 2];
ax = {twoph[[1]][[1]], twoph[[2]][[1]]};
PTj1 = ptOf[event[[i]][[jx[[1]]]]];
PTa1 = ptOf[event[[i]][[ax[[1]]]]];
PTa2 = ptOf[event[[i]][[ax[[2]]]]];
ETAj1 = etaOf[event[[i]][[jx[[1]]]]];
ETAa1 = etaOf[event[[i]][[ax[[1]]]]];
ETAa2 = etaOf[event[[i]][[ax[[2]]]]];
DRaa = DeltaR[event[[i]][[ax[[1]]]], event[[i]][[ax[[2]]]]];
Mjet = Abs[InvMassSq[event[[i]], jx]]^(1/2);
Maa = Abs[InvMassSq[event[[i]], ax]]^(1/2);
Maaj = Abs[InvMassSq[event[[i]], Join[ax, jx]]]^(1/2);
MAll = Abs[InvMassSq[event[[i]], visiblefind]]^(1/2);
Met =
ptOf[event[[i]][[
METfind[[1]]]]]; (*There is only one MET in PGS lhco file*)
If[(*If2*) (*pseudorapidity cuts*)
Abs[ETAj1] < etaj && Abs[ETAa1] < etaa &&
Abs[ETAa2] < etaa && (*pT cuts*) PTa1 > pta,
(*Compute observables*)
wgt = xsecmultiplier;
(*Saving information into text file*)
(*(0) Event Info*)
PutAppend[{i(*event#*), wgt(*weight*), jx(*j-pair*), ax(*ph-
pair*)}, "1_" <> ToString[eventchar] <> "_info_Presel.txt"];
(*(1) pT*)
PutAppend[{PTj1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_ptj1_Presel.txt"];
PutAppend[{PTa1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_pta1_Presel.txt"];
PutAppend[{PTa2, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_pta2_Presel.txt"];
(*(2) ETA*)
PutAppend[{ETAj1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_etaj1_Presel.txt"];
PutAppend[{ETAa1, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_etaa1_Presel.txt"];
PutAppend[{ETAa2, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_etaa2_Presel.txt"];
(*(3) DRphph*)
PutAppend[{DRaa, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_DRaa_Presel.txt"];
(*(4) InvMass*)
PutAppend[{Mjet, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_Mjet_Presel.txt"];
PutAppend[{Maa, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_Maa_Presel.txt"];
PutAppend[{Maaj, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_Maaj_Presel.txt"];
PutAppend[{MAll, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_MAll_Presel.txt"];
PutAppend[{Met, wgt(*weight*)},
"1_" <> ToString[eventchar] <> "_MET_Presel.txt"];
, None(*If2*)]
, None
(*If1*)]
(*For1*)];
ClearMemory
(*Module*)];
This is then fed through a series of smaller cuts, an example of which is shown below:
Maacut[event_, infotxt_, {maamin_, maamax_}, eventchar_, No_] :=
Module[(*Module*)
{infolist, ex, jetfind, afind, METfind, visiblefind,
wgt,(*\[Rule]Observables*)PTj1, PTa1, PTa2, ETAj1, ETAa1, ETAa2,
DRaa, Mjet, Maa, Maaj, MAll, Met,(*<-Observables*)jx, ax, i, j},
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_info_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_ptj1_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_pta1_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_pta2_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_etaj1_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_etaa1_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_etaa2_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_DRaa_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_Mjet_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_Maa_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_Maaj_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_MAll_Maacut.txt"];
DeleteFile[
ToString[No] <> "_" <> ToString[eventchar] <> "_MET_Maacut.txt"];
infolist = {};
infolist =
SplitBy[ToExpression@
StringSplit[Import[infotxt, "List"], {",", "}", "{"}], First];
(*infolist={event#,wgt,{j},{a,a}}*)
(*one row = {{2,0.00396,Null,3,Null,Null,1,2}}*)
(*SplitBy[list,First] =>
Split the list into sublists if the first element is the same =>
when multiple combinations of particle subset in a given event are \
considered*)
(*So, one element is a list with the same event #*)
For[(*For1*)i = 1, i < Length[infolist] + 1, i++,
ex = 0; METfind = {}; visiblefind = {}; jx = {}; ax = {};
ex = infolist[[i]][[1]][[1]];(*event position*)
jx = {infolist[[i]][[1]][[4]]};
ax = {infolist[[i]][[1]][[7]], infolist[[i]][[1]][[8]]};
METfind = FindParticles[event[[ex]], {12}];
visiblefind = Table[v, {v, 1, Length[event[[ex]]]}];
If[(*If0-visible-particle*)Length[METfind] > 0,
visiblefind = RemoveFrom[visiblefind, METfind], None
(*If0-visible-
particle*)]; (*Note that there is ONLY one MET in lhco file*)
Maa = Abs[InvMassSq[event[[ex]], ax]]^(1/2);
If[(*if1*) maamin <= Maa <= maamax,
For[(*For2*)j = 1, j < Length[infolist[[i]]] + 1, j++,
(*Compute observables*)
PTj1 = ptOf[event[[ex]][[jx[[1]]]]];
PTa1 = ptOf[event[[ex]][[ax[[1]]]]];
PTa2 = ptOf[event[[ex]][[ax[[2]]]]];
ETAj1 = etaOf[event[[ex]][[jx[[1]]]]];
ETAa1 = etaOf[event[[ex]][[ax[[1]]]]];
ETAa2 = etaOf[event[[ex]][[ax[[2]]]]];
DRaa = DeltaR[event[[ex]][[ax[[1]]]], event[[ex]][[ax[[2]]]]];
Mjet = Abs[InvMassSq[event[[ex]], jx]]^(1/2);
Maaj = Abs[InvMassSq[event[[ex]], Join[ax, jx]]]^(1/2);
MAll = Abs[InvMassSq[event[[ex]], visiblefind]]^(1/2);
Met = ptOf[event[[ex]][[METfind[[1]]]]];(*MET of the event, computed as in SelectionCutAAJ; needed for the MET output below*)
wgt = infolist[[i]][[j]][[2]]; (*event weight*)
(*****
Saving surviving event information into text \
file*****)(*Output = {event#, wgt}*)
(*Saving information into text file*)
(*(0) Event Info*)
PutAppend[{ex(*event#*), wgt(*weight*), jx(*j-pair*), ax(*ph-
pair*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_info_Maacut.txt"];
(*(1) pT*)
PutAppend[{PTj1, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_ptj1_Maacut.txt"];
PutAppend[{PTa1, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_pta1_Maacut.txt"];
PutAppend[{PTa2, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_pta2_Maacut.txt"];
(*(2) ETA*)
PutAppend[{ETAj1, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_etaj1_Maacut.txt"];
PutAppend[{ETAa1, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_etaa1_Maacut.txt"];
PutAppend[{ETAa2, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_etaa2_Maacut.txt"];
(*(3) DRphph*)
PutAppend[{DRaa, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_DRaa_Maacut.txt"];
(*(4) InvMass*)
PutAppend[{Mjet, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_Mjet_Maacut.txt"];
PutAppend[{Maa, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_Maa_Maacut.txt"];
PutAppend[{Maaj, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_Maaj_Maacut.txt"];
PutAppend[{MAll, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_MAll_Maacut.txt"];
PutAppend[{Met, wgt(*weight*)},
ToString[No] <> "_" <> ToString[eventchar] <>
"_MET_Maacut.txt"];
(*For2*)]
, None(*if1*)]
(*For1*)];
ClearMemory
(*Module*)];
This is how Mr. Hong's original code performed the analysis, and it enables me to demonstrate (in the paper) which cuts are most effective.
The cuts that got me my high sigma value were: the preselection cuts (at least one jet and two photons, with |eta| < 3 for the leading jet and both leading photons, and leading-photon pT > 300 GeV); 1425 GeV < diphoton invariant mass < 1575 GeV; invariant mass of the two photons plus the jet > 2300 GeV; invariant mass of all visible particles > 2700 GeV; and diphoton delta-R < 3.5. Most of these cuts were chosen by examining graphs of the data, applying candidate cuts in Mathematica, and checking the resulting increase in sigma.
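As a usage sketch (the stage number and info-file names follow the conventions in the code above; my actual notebook may order the stages slightly differently), the diphoton-mass step of that chain looks like:

Maacut[KKpgs, "1_KK_info_Presel.txt", {1425, 1575}, KK, 2];
Maacut[SMpgs, "1_SM_info_Presel.txt", {1425, 1575}, SM, 2];

Each stage reads the "_info" text file written by the previous one, so the surviving events and their weights are carried through the whole chain.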
Here are some graphs of the particle invariant masses (the first after preselection, the second midway through the cutting process). These can also be seen in our paper, which should appear in the class journal.
My apologies for not having graphs from the earlier (failed) stages of this project; saving them and posting them here had not occurred to me when they were still accessible.
5/14/2017
As stated before, the research end of this project is done. The paper may need another revision (pending feedback from Mr. Hong), but there's not much to talk about with respect to that. I'm going to edit the past entries to include a bit more info to make this easier to replicate. This post will be a general summary of how the research was performed (written out in handy step-by-step format!). Someone following these steps should be able to repeat this research rather quickly; much of my time was spent troubleshooting, waiting between meetings, and finding good cuts.
1. Get onto the cluster.
Of course, you must have an account to do this. I use an Ubuntu VirtualBox VM and type "ssh <MY ACCOUNT>@hepcms.umd.edu" into the terminal, followed by my password.
2. Install CMSSW, MadGraph, Pythia, and Delphes in your "data" section on the cluster, and install Mathematica on your side.
This requires downloading CMSSW and MadGraph from the sources linked above. Installing CMSSW is best done by following the instructions at https://sites.google.com/a/physics.umd.edu/umdt3/personalized-software/cmssw-environment. Once that's done, move the tarred MadGraph file to the "src" directory within CMSSW and untar it. Then move into MadGraph and run it via "./bin/mg5_aMC". Typing "install pythia-pgs" and "install Delphes" at the prompt will install the desired packages. Mathematica (on your side) should be accessible from TERPware; its installer explains itself.
3. Generate Signal and Background events.
To generate the signal, restart MadGraph and type "import model Radion_GKK_2" followed by "generate p p > gkk > a a j @1" and "output GKK2". This will create a new folder "GKK2" for the event channel within your MadGraph directory. Quit MadGraph and navigate to "GKK2/Cards". There should be a "run_card.dat" file there; open it and set the number next to nevents to 1000. Then go back to GKK2 and execute "./bin/generate_events", enabling Pythia (for showering and hadronization) and Delphes (for detector simulation) when prompted. This gets you the cards necessary to submit your job to Condor and gives you a small test run. Return to Cards and change run_card.dat to the true number of events you want to generate (next to nevents, as before). Now get two Condor files (a .sh and a .jdl), move them to the GKK2 folder, and modify the paths referenced within them to reflect where your files are. Finally, run condor_submit on the JDL file and wait for the cluster to generate your events.
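Condensed, the interactive MadGraph part of this step is just (indented lines are typed at the MG5 prompt):

./bin/mg5_aMC
    import model Radion_GKK_2
    generate p p > gkk > a a j @1
    output GKK2
    exit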
Generating the background works much the same way, but you don't need to import any model; use "generate p p > a a j @1" and a different folder name. You should restart MadGraph before doing this (so that the BSM model isn't still loaded).
4. Bring the Delphes files for the events back to your side.
I use WinSCP for this, and it explains itself pretty well.
5. Get Mathematica code that analyzes the collision the way you want it to.
My group got code from Mr. Hong which calculated kinematic variables, performed cuts, and made graphs. The catch, of course, was that it was designed for a different project. The kinematic variable functions remained untouched, but the cutting functions were changed to fit the diphoton + jet final state and to cut on a few different variables. Each stage of cutting produces a group of output text files containing kinematic variables and weights for the surviving events. After much debugging, I was able to run my code successfully. It may be advisable to test your code on only ten events or so first, since writing out the full results can take hours if your computer is slow or acting up.
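One simple way to do such a test with the functions above (Take is just the built-in list truncation):

(*quick sanity check on the first ten events of each sample*)
SelectionCutAAJ[Take[KKpgs, 10], KK, XsecwgtKK];
SelectionCutAAJ[Take[SMpgs, 10], SM, XsecwgtSM];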
6. Analyze your events using said Mathematica code.
One important part of this is including the correct cross-sections in Mathematica, which can be obtained from the "crossx.html" files that MadGraph generates. These are given in picobarns; our Mathematica analysis works in femtobarns, so I had to multiply the cross-sections by 1000 (hence the 10^3 factors in the sigmafb variables above). Set these up, then import your Delphes files and run them through your analysis pipeline. After each stage of the pipeline, calculate S/sqrt(S+B) and make some graphs. I plotted the diphoton invariant mass (= radion mass), the highest photon pT, and the invariant mass of the two photons plus the jet (= KK gluon mass). These graphs show you how to optimize your cuts to remove background while retaining signal.
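A minimal sketch of that significance calculation, assuming the weighted-event text files written by the cut functions above (the file names are examples for one stage of the pipeline):

(*each line of an _info file is {event#, weight, ...}; the weights already include xsec * lumi / Nevents*)
sigS = Total[ReadList["2_KK_info_Maacut.txt"][[All, 2]]]; (*expected signal yield S*)
sigB = Total[ReadList["2_SM_info_Maacut.txt"][[All, 2]]]; (*expected background yield B*)
significance = sigS/Sqrt[sigS + sigB]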
7. Repeat steps 3-6 as necessary for good statistics and good results.
If your cross-sections are large enough that the weight for your events is greater than one (i.e. one simulated event corresponds to multiple real events), you should impose cuts within MadGraph and generate new events. To do this, edit run_card.dat, set "cut_decays" to "True", and impose the most effective cuts you can think of. (I found that setting "xpta" to 300 was useful in reducing the background cross-section.) Then use condor_submit on your JDL file to generate the new events. Remember to impose the same cuts on both signal and background (so you must re-generate both), and to impose them again in your Mathematica analysis in case any unwanted events slip through via Delphes's detector smearing.
Getting S/sqrt(S+B) > 5 is the goal here, and it should be doable for these processes. Note that, no matter how many background events you cut, you need at least 25 expected signal events remaining at the end: even with the background cut to zero, S/sqrt(S+B) reduces to sqrt(S), which only reaches 5 when S >= 25.
8. Exult!
Party as you like, but be moderate; you will have to type up your findings soon.