Feb 24, 2016
MadGraph (MG) was installed along with its sub-program packages ('pythia-pgs', 'Delphes', 'ExRootAnalysis'). These packages extend the functionality of MadGraph, adding hadronization and fragmentation (Pythia), fast detector simulation (Delphes), and data analysis and visualization (ROOT). To install these packages, we need an available Fortran 77 compiler, which fortunately is included in the cluster software (via the 'cmsenv' command).
Upon installation of MadGraph, we completed the in-software tutorial to familiarize ourselves with some of the commands. Prof. Belloni provided us with a Process Card (a script used by MG) and a Run Card. These cards were used to generate W(lv)γ events from pp collisions and store them in an output file. The output file could then be read with the Run Card to produce an HTML GUI, where we could read off the cross-section of those events and extract an LHEF file.
The LHEF file can then be converted into a .root file, which can be read to produce the histograms for data analysis.
Mar 1, 2016
In the directory, 3_1_16, we have the generated MC simulation of Wγ events.
Mar 6, 2016
We can open the Delphes data as a .root file in ROOT. Here we can use commands such as:
Delphes->Print()
Delphes->Draw("")
The first prints out the leaves of the Delphes tree so that we may inspect the properties of the data we are using. The second draws histograms, using the leaves of the Delphes tree as parameters on which various operations can be performed (e.g. "Particle.Phi - Track.Phi"). Arguments can be added to make cuts on these values.
Mar 8, 2016
For ROOT to work with the Delphes tree, ROOT must import a library with Delphes definitions. The command to do that is as follows:
gSystem->Load("~/MS4BSM/MG5_aMC_v2_3_3/Delphes/libDelphes")
I encountered a strange problem: after loading the library, the name Delphes referred to both a class and the tree. Since this is a problem, we had to give the Delphes tree a different name in order to work with it. The command we used was as follows:
TTree* myTree = (TTree*)_file0->Get("Delphes")
Once we had loaded the library, we could make a selector file. The selector consists of a .C macro and a .h header file. The .C macro can be executed to fill histograms over all particle-track combinations, since we did not know which truth particle went with which reconstructed particle. The command to make the selector is as follows:
Delphes->MakeSelector("delphes_selec")
Note: some important edits you must make to the header file:
You must declare each and every histogram that you plan to make. Declare them in the following form:
TH1F *deltaPhi, *deltaEta, *deltaPt, *deltaK, *deltaR, *muonPT, *phoPT, *missET, *MT, *muMult, *METMult;
Fit functions must also be declared. They can be declared in a similar fashion:
TF1 *myFitFunc;
Histograms and fit functions do not necessarily have to be defined here, they just have to be declared.
We decided to comment out the TRef branches since they were causing errors when running our macro. You can comment these out if they are causing problems without doing any harm to the analysis.
In the .C macro, here are some of the standard notations that one would need to create a macro to make histograms:
Histogram definition is as follows (the first argument is the histogram's name, the second its title):
hist_obj = new TH1F("name", "title", nbins, xmin, xmax);
Call methods on histograms through the pointer ( -> ).
Use a special line, fChain->GetEntry(entry); , in the Process method.
Fill histograms via hist_obj->Fill(data_value); in the Process method.
Write histograms via hist_obj->Write(); in the Terminate method.
Mar 22, 2016
In the macro, we decided to fill histograms over all particle-track pairs with the Delphes data that we simulated. This ensures that every truth particle is paired with its reconstructed track, at the cost of an extremely large combinatorial background. We wrote a doubly nested loop over every Particle_ entry and every Track_ entry. Certain operations had to be performed to correct some values into the quantities that we wanted, for example the delta_phi values. Since we were taking the difference of Particle_Phi and Track_Phi, which both range from -π to π, we would get results ranging from -2π to 2π. To correct these, we used ternary statements to map any value with |delta_phi| > π to the equivalent value satisfying |delta_phi| <= π. The loop and correction lines are listed below:
for (int i=0;i<Particle_;++i) {
//Data Pre-Cuts
if (fabs(Particle_Eta[i])>5) continue;
for (int j=0;j<Track_;++j) {
//Entry creation
float delta_eta = Particle_Eta[i]-Track_Eta[j];
float delta_phi = Particle_Phi[i]-Track_Phi[j];
float delta_Pt = Particle_PT[i]-Track_PT[j];
float delta_K = 1./Particle_PT[i]-1./Track_PT[j];
//Entry modification
delta_phi = (delta_phi>TMath::Pi())?-2*TMath::Pi()+delta_phi:delta_phi;
delta_phi = (delta_phi<-TMath::Pi())?2*TMath::Pi()+delta_phi:delta_phi;
float delta_R = sqrt(delta_phi*delta_phi+delta_eta*delta_eta);
//Fill Histograms
deltaPhi->Fill(delta_phi);
deltaEta->Fill(delta_eta);
deltaPt ->Fill(delta_Pt);
deltaR ->Fill(delta_R);
if(Particle_PT[i]>0.5 && Track_PT[j]>0.5)
deltaK->Fill(delta_K);
}
}
Mar 29, 2016
*DO NOT USE TMUX. PEOPLE WILL GET VERY ANGRY AT YOU FOR USING UP CLUSTER RESOURCES ON AN INTERACTIVE NODE. USE CONDOR INSTEAD*
Link to HTCondor tutorial:
https://research.cs.wisc.edu/htcondor/tutorials/
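For reference, a minimal HTCondor submit description looks something like the following; the executable and file names here are hypothetical and depend on your own setup:

```
# minimal_condor.sub -- illustrative names, adjust to your setup
universe   = vanilla
executable = run_macro.sh
arguments  = delphes_output.root
output     = job.out
error      = job.err
log        = job.log
queue 1
```

Submit with `condor_submit minimal_condor.sub` and check job status with `condor_q`.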
Apr 5, 2016
Isolation is useful because you want to remove the average background energy in order to detect interesting spikes in your data. In the case of muons, they can either be associated with jets or be isolated. Isolated muons are what we want, hence we cut on the number of muons in an event (one in this case) to get a better picture of what the MC simulation should look like, which should be reflected in actual CMS data. In short, isolation is comparable to calibrating your average energy to zero and detecting the spikes and fluctuations in the data about zero.
Apr 12, 2016
In the directory, 4_2_16, we have the generated MC simulations of tt~, Zj, Wj, and jj events. These are background simulations that we used in the absence of actual CMS data.
NOTE: the sizes of these folders are on the order of GB, hence they will be added later or are available upon request.
On a side note, here is a link to some emacs shortcuts, which could be useful. I personally like to use X11 forwarding and the pretty (or not so pretty) GUI for emacs, but to each their own.
https://www.gnu.org/software/emacs/refcards/pdf/refcard.pdf
File also included below
To enable X11 forwarding when logging in, use:
ssh -Y user@hepcms.umd.edu
Apr 19, 2016
This particular week was a slow week, and I started to get overwhelmed with other classes. At this point, Avi Kahn took on most of the work with the efficiency calculations and fiducial cuts. His log would be much more comprehensive and constructive than mine, so I refer you there, where credit is due. Hopefully this log has been useful to anybody wishing to expand upon the research done by Prof. Alberto Belloni, Avi Kahn, and myself.