Spring 2016 Work
4/14 Meeting #6
looked at a really cool 3d version of the cosmic microwave background
http://thecmb.org/
a Poisson distribution looks linear when plotted on the log-log scale like we have done
find the fraction of events needed for a specific sigma confidence level
trying to find how significant the hottest spot is
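One way to read the log-scale note above (my own toy check, not from the meeting): for background-only p-values, which are uniform on [0,1], the histogram of -log(p) falls off like exp(-x), so it looks like a straight line once the count axis is logarithmic, just like our pval_hist plot.

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.uniform(0.0, 1.0, 100000)          # background-only p-values are uniform
counts, edges = np.histogram(-np.log(p), bins=20, range=(0, 5))
centres = 0.5 * (edges[:-1] + edges[1:])
# log(counts) vs -log(p) should be a straight line with slope about -1
slope = np.polyfit(centres, np.log(counts), 1)[0]
print(slope)
```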
ToDo
-use the whole stripe, with the number of bins in each stripe, to find the p-value for the pixel
-plot the hottest number from each scramble, then compare to the real map
-sum the expectation values over the whole sky; it should be about the same as the number of points in the sample (it is, in fact, about the same)
-******make histograms, for every bin, of the expected number (Poisson mean) and the count (scrambles should have the same means)
- a heat map of events minus expectation should have a lot of zeroes
-try injecting a point source, adding events by hand (in one single pixel), to see how many events we need to see a point source: ~3 sigma in 50% of scrambles
-try at different points (try 5) and see how many events are required based on the background at those points
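To make the "how many events for ~3 sigma" item concrete, here is a sketch (helper names are my own) using scipy: convert a one-sided p-value to sigma with the normal survival function, and scan the Poisson tail to find the smallest count that clears a given significance over a background mean.

```python
import numpy as np
from scipy.stats import norm, poisson

def p_to_sigma(p):
    """One-sided significance in sigma for a given p-value."""
    return norm.isf(p)

def counts_for_sigma(lamb, nsigma=3.0):
    """Smallest observed count giving >= nsigma over a Poisson mean lamb."""
    p_target = norm.sf(nsigma)                 # one-sided p-value for nsigma
    k = int(np.ceil(lamb))
    while poisson.sf(k - 1, lamb) > p_target:  # poisson.sf(k-1) = P(N >= k)
        k += 1
    return k

print(p_to_sigma(0.00135))    # about 3
print(counts_for_sigma(5.0))  # counts needed in a pixel with mean 5
```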
4/12 Working on P-values
import numpy as np
import matplotlib.pyplot as plt
import healpy as hp
#import matplotlib.backends.backend_agg
import scipy
from scipy.stats import poisson
#importing libraries
NSIDE=32
res=1
nbins=100
data=np.load("./IC86_exp_all.npy")
#setting up the histogram bins
n, bins, patches=plt.hist(data["dec"], bins=nbins)
bins= bins +np.pi/2
#normalizing bins
for i in np.arange(nbins):
    n[i] = n[i]*2/hp.nside2npix(NSIDE)/(np.cos(bins[i])-np.cos(bins[i+1]))
#populating sky map
skymap=np.zeros(hp.nside2npix(NSIDE))
for ev in data:
    if np.abs(ev["dec"]) < 85*np.pi/180:
        binnum = hp.ang2pix(NSIDE, ev["dec"]+np.pi/2, ev["ra"])
        skymap[binnum] += 1
#calculating p-value
pvalmap = np.ones(hp.nside2npix(NSIDE))
for pix in np.arange(np.size(skymap)):
    coords = hp.pix2ang(NSIDE, pix)
    if np.abs(coords[0]-np.pi/2) < 85*np.pi/180:
        # sum counts and expectations over the pixel and its neighbors
        neighbors = hp.get_all_neighbours(NSIDE, pix)
        neighborhood = np.append(neighbors, pix)
        value = 0.0
        lamb = 0.0
        for i in neighborhood:
            if i < 0:
                continue  # some pixels have fewer than 8 neighbors
            coords_i = hp.pix2ang(NSIDE, i)
            value += skymap[i]
            lamb += n[int(np.floor_divide(coords_i[0], np.pi/nbins))]
        pvalue = 1.0 - poisson.cdf(value, lamb)
        pvalmap[pix] = pvalue
plt.figure()
h2=plt.hist(-np.log(pvalmap),bins=50,log=True)
plt.savefig("pval_hist.png")
plt.show()
print(np.mean(skymap))
hp.mollview(skymap, title="I3 IC86 2011 skymap")
plt.savefig('skymap_pval.png')
I guess I'm not taking into account the poles, so it's dividing by zero, which is bad.
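A minimal sketch of one fix for that divide-by-zero (toy numbers, my own): mask out bins whose expected background is zero before taking the Poisson p-value, and leave the masked pixels at p = 1.

```python
import numpy as np
from scipy.stats import poisson

counts = np.array([3.0, 7.0, 0.0, 12.0, 5.0])      # toy per-pixel event counts
expectation = np.array([4.0, 4.0, 0.0, 4.0, 4.0])  # toy Poisson means; the 0 mimics a polar bin

pvals = np.ones_like(counts)        # masked pixels keep p = 1 (not significant)
good = expectation > 0              # only evaluate where the background is defined
pvals[good] = 1.0 - poisson.cdf(counts[good], expectation[good])
print(pvals)
```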
4/7 Meeting
-Lines from nearest neighbors result in the weird line in the middle, due to the changing background
-sum the expectation for the nine bins instead of multiplying by 9, to get rid of the lines
-convert p-values to probabilities (# of the p-value / trials)
-uses OpenGL to map neutrinos
To Do
-getting the probabilities for p-values
-summing expectations
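The "sum the expectation for the nine bins" fix can be sketched on a toy flat grid (not HEALPix; in the real code healpy's get_all_neighbours plays the role of the index arithmetic here, and the background numbers are made up):

```python
import numpy as np

# toy background that varies nonlinearly across the map, standing in for the
# declination-dependent expectation
expect = np.exp(np.linspace(0.0, 2.0, 25)).reshape(5, 5)

def nine_bin_expectation(exp_map, i, j):
    """Sum the expectation over a pixel and its neighbours (edges clipped)."""
    lo_i, hi_i = max(i - 1, 0), min(i + 2, exp_map.shape[0])
    lo_j, hi_j = max(j - 1, 0), min(j + 2, exp_map.shape[1])
    return exp_map[lo_i:hi_i, lo_j:hi_j].sum()

# where the background changes quickly, the summed expectation differs
# noticeably from 9 * (centre bin), which is what caused the lines
print(nine_bin_expectation(expect, 2, 2), 9 * expect[2, 2])
```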
3/24 Meeting
Erik was out of town for a meeting. We will continue to work on the P-value calculations.
3/24 Meeting
The expected value: scramble dec to see what the average number of events is, then calculate the p-value
then use 0.1 deg by 0.1 deg binning and compare against the entire sky ---unbinned method
multiply the event density by all the bins you wish to consider
using a bin density is fine; multiply by neighbors, use get_all_neighbours - the same event can impact a couple of bins
TODO
use get_all_neighbours
scramble Right Ascension - possibly use time - slalib
when we compare to scrambles, do a trials-corrected p-value
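The scramble + trials-corrected p-value idea, as a toy sketch (my own made-up event sample, 1D in RA only): randomize RA to build the background-only distribution of the hottest bin, then the post-trial p-value is the fraction of scrambles whose hottest bin beats the real map's.

```python
import numpy as np

rng = np.random.default_rng(0)
nbins, nev = 36, 1000
ra = rng.uniform(0, 2 * np.pi, nev)        # toy event right ascensions

def hottest(ra_vals):
    """Count in the hottest RA bin."""
    counts, _ = np.histogram(ra_vals, bins=nbins, range=(0, 2 * np.pi))
    return counts.max()

real_max = hottest(ra)
# each scramble redraws RA uniformly, keeping everything else fixed
scrambled_max = np.array([hottest(rng.uniform(0, 2 * np.pi, nev))
                          for _ in range(200)])
post_trial_p = np.mean(scrambled_max >= real_max)
print(real_max, post_trial_p)
```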
3/22 Making Graphs
sudo apt-get install ipython
sudo apt-get install python-numpy python-scipy
sudo apt-get install python-matplotlib
http://matplotlib.org/1.2.1/examples/pylab_examples/histogram_demo.html
making histogram
run "ipython -pylab"
import matplotlib.pyplot as plt
import numpy as np
data = np.load("./IC86_exp_all.npy")
h1 = plt.hist(data["zenith"],bins=100)
h1 = plt.hist(np.cos(data["zenith"]),bins=100) # changes from radians to a number between -1 and 1
The dip in the data is where the cuts change from analyzing particles coming straight up through the Earth (denoted as -1, when you are standing at the South Pole) to particles coming down from the sky (denoted by +1). The two regimes have very different backgrounds, and the analysis switches from one background to the other at the dip.
h1 = plt.hist2d(np.cos(data["zenith"]),data["azimuth"],bins=100) # plots the zenith angle (whether you are higher or lower than directly overhead) vs the azimuth angle (how far left or right you are)
The downgoing side is pretty much the same as the upgoing side no matter the azimuth angle. However, there are 6 spikes on the downgoing side that seem to depend on the azimuth angle.
h1 = plt.hist(data["azimuth"],bins=100)
Hey! Look! We found those six spikes! Dr. Blaufuss explained that the spikes are from particles going down a row of DOMs, which increases the detector's sensitivity.
Next up is Right Ascension; it's really nice because it stays constant. Azimuth changes as the detector rotates, but Right Ascension stays fixed with respect to the stars.
h1 = plt.hist(data["ra"],bins=100)
Yay! It looks nice and constant, just like it's supposed to!
now let's plot zenith vs ra
h1 = plt.hist2d(np.cos(data["zenith"]),data["ra"],bins=100)
This looks just like we would expect! Now we get to do a Mollweide projection. We have to use healpy since it's good at making each bin cover the same area of the sky.
this is handy
https://healpy.readthedocs.org/en/latest/tutorial.html#creating-and-manipulating-maps
import numpy as np
import healpy as hp
import matplotlib.pyplot as plt
NSIDE = 32
m = np.arange(hp.nside2npix(NSIDE))
hp.mollview(m, title="Mollview image RING")
plt.show()
Yay! This is the tutorial image let's try with right ascension and declination
import healpy as hp
import numpy as np
import matplotlib.pyplot as plt
NSIDE =32
data=np.load("./IC86_exp_all.npy")
skymap=np.zeros(hp.nside2npix(NSIDE))
for ev in data:
    if np.abs(ev["dec"]) < 85*np.pi/180:
        binnum = hp.ang2pix(NSIDE, ev["dec"]+np.pi/2, ev["ra"])
        skymap[binnum] += 1
hp.mollview(skymap, title="I3 IC86 skymap")
plt.show()
3/10 IceCube Meeting
All sky telescopes do similar things
sin(dec) makes area equal
use Poisson probability when trying to figure out p-values for each bin
nside="how hard you cut up your sphere" powers of 2
HealPy -----download -optimized for astronomical sky plots
sudo apt-get install python-pip
sudo pip install healpy
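A quick numeric check (my own) of the "sin(dec) makes area equal" note: equal-width bins in sin(dec) cover equal solid angle on the sphere, which is why we plot sin(dec) instead of dec.

```python
import numpy as np

edges = np.linspace(-1.0, 1.0, 11)   # equal-width bins in sin(dec)
dec_edges = np.arcsin(edges)         # the corresponding declination edges
# solid angle of a band between dec1 and dec2 is 2*pi*(sin(dec2) - sin(dec1))
areas = 2 * np.pi * np.diff(np.sin(dec_edges))
print(areas)                         # every band is the same size
```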
3/7
I used code from
http://matplotlib.org/examples/mplot3d/2dcollections3d_demo.html
to actually show a graph!!!!
this might actually be relevant, graphing color
http://matplotlib.org/examples/axes_grid/demo_axes_rgb.html
3/3
Virtual Machine SetUp
Download https://help.ubuntu.com/community/Lubuntu/GetLubuntu/LTS
https://www.virtualbox.org/wiki/Downloads
I have 16gb of RAM on my computer and I wanted to be a little conservative so I gave the VM 4gbs.
okay, I was running a 32-bit version of VirtualBox, and apparently if you give it more than 3 gigs of RAM it freaks out. I will download the 64-bit versions of both VirtualBox and Lubuntu.
The previous steps remain the same.
---- okay, so it still wouldn't give me a 64-bit option, so I'm just going to go with 2 gigs of RAM
-----so I had to go into the BIOS settings on my computer and turn on Virtualization Technology. Then I could do the 64-bit version with 8 GB of RAM.
3/3
Meeting - Virtual Machine Rabbit Hole
goal plot RA and Dec, in 2D
(plot sin(dec) to normalize the unequal area in declination)
-get astronomy plotting packages for matplotlib
(Also make sure you change settings so computer doesn't go to sleep when screen is off)
event number -8 bit
ra -radians
download https://help.ubuntu.com/community/Lubuntu/GetLubuntu/LTS
https://www.virtualbox.org/wiki/Downloads
install ipython matplotlib numpy
sudo apt-get install ipython
sudo apt-get install python-matplotlib
sudo apt-get install python-numpy
create virtual hard disk
give it RAM, like 4 or 8 GB
memory like 8 GB
dynamically size
use VDI
start
optical disk is the iso (install)
erase disk during Lubuntu install is okay
Erik uses VMwarefusion
matplotlib examples toolkits
(-pylab does this)
import matplotlib.pyplot as plt
run "ipython -pylab" before using
opening a terminal
start
systemtools
lx terminal
right click then add to desktop
data["zenith"] = one-dimensional array of all the zenith data
len(data["zenith"])
123059
h1=plt.hist(data["zenith"]) # it's in radians
h1=plt.hist(data["zenith"],bins=100)
h1=plt.hist(data["zenith"]*180/np.pi,bins=100) # now it's in degrees
# the dip is from the cuts changing from downgoing to upgoing events
3/3
We opened the data!! We used Anaconda
and the commands
> import numpy as np
> datas = np.load("./IC86_exp_all.npy")
> datas
In the second line there were problems finding the exact data path for where I saved the file, but we got it working.
3/2
Let's Get Graphing... Maybe
Alright so we'll have to download some extra libraries. Dr. Erik Blaufuss recommended the MatPlotLib library found here
http://matplotlib.org/
This page was also helpful
http://matplotlib.org/users/installing.html
to download it I ran
apt-get install python-setuptools
apt-get install python-numpy
apt-get install python-dateutil
apt-get install python-pytz
apt-get install python-pyparsing
I also needed packages called cycler and matplotlib.
This is when I realized MobaXterm uses its own plugins to add the other packages and these weren't included so I would have to use something else.
I had previously installed Python 3.5 from python.org, so the matplotlib website recommended downloading the Visual C++ Redistributable for Visual Studio 2015,
since I use Windows. However, I seriously considered finally breaking down and making my laptop dual boot with Ubuntu or something, but it was kinda late, so
that's a job for another day. The Linux directions looked a lot easier, though.
okay so that didn't work either. The matplotlib website and Paul said to download Anaconda
https://www.continuum.io/downloads
so I did. It's taking forever to download but hopefully that's a good sign. It seems like it has its own editor with it and a bunch of the packages I need so hopefully that will be handy.
It has its own command line! Yay! Yay? Great, it uses the Windows command line and I only know Linux commands. I'm thinking about just making a Live USB at this point.
I started making a Live USB
2/26
I got Python working on my Laptop!!!
My group will be using Python during our project so it will be important to work with it without using the cluster.
Codecademy.com is a great resource for learning Python. (Plus there are fun Monty Python references)
I had to run a local terminal in order to not use Cern's system.
Then I ran
apt-get install emacs
This allowed me to open emacs without using the cluster
Next
apt-get install python
This allows me to write and run files in python!
All python files end in .py
to run a file, run
python [filenamehere].py
2/25
IceCube Meeting
We'll mostly be looking at the unbinned method of sorting data,
using a test statistic. This test statistic is calculated for every point of each of the 10,000 random maps.
for making projections
Python is supposed to be easier
http://matplotlib.org/
https://pypi.python.org/pypi/tables
2/24
Ice Cube Reading
Possible Point Sources
-Super Nova Remnant Shocks (SNR)
-Active Galactic Nuclei jets (AGN)
-Starburst Galaxies
-Gamma Ray Bursts
From April 2008 to May 2012
-37 observed Astrophysical neutrinos
-All sky surveys show that extended regions, not just point sources can produce astrophysical neutrinos.
-Triggered by 4 DOMs detecting light within 5 microseconds
-conducts linear fit, then more complicated fits
-Point sources are found by clusters of events that are more frequent than the background (atmospheric muons and neutrinos) would merit
-Selected 44 possible point sources
-None of the points were consistent with discovery but upper limits can now be placed.
-sources of error include imperfect knowledge of the optics of the surrounding ice and the absolute detector efficiency
Time-Dependent Point Source Search Methods in High Energy Neutrino Astronomy
-AGN bursts would last from hours to weeks
-GRBs would be from milliseconds to minutes
2/17
Ice Cube Meeting
Papers about Point Sources
"Searches for Extended and Point-like Neutrino Sources with Four Years of IceCube Data"
http://arxiv.org/pdf/1406.6757v1.pdf
"Time-Dependent Point Source Search Methods in High Energy Neutrino Astronomy"
http://arxiv.org/pdf/0912.1572v1.pdf
IceCube did not observe any neutrinos during the recent gravitational-wave event
http://icecube.wisc.edu/news/view/398
2/15
Geant
http://geant4.web.cern.ch/geant4/
git clone https://github.com/saraheno/honrgeant.git
cd honrgeant
source G4_Setup.sh # make sure it's a capital S in Setup; G4_setup.sh does not exist
mkdir B4-build
cd B4-build
cmake -DGEANT4_DIR=$MYGEANT2 ../B4
make
./exampleB4a
/tracking/verbose 1
/run/beamOn 1
exit
2/2
Ice Cube
The main website for ice cube
https://icecube.wisc.edu/
http://www.livescience.com/51924-icecube-observatory-photos.html
http://phys.org/news/2013-11-world-largest-particle-detector-icecube.html
"The IceCube Neutrino Observatory, a particle detector buried in the Antarctic ice, is a demonstration of the power of the human passion for discovery, where scientific ingenuity meets technological innovation. Today, nearly 25 years after the pioneering idea of detecting neutrinos in ice, the IceCube Collaboration announces the observation of 28 very high-energy particle events that constitute the first solid evidence for astrophysical neutrinos from cosmic accelerators."
http://www.symmetrymagazine.org/article/august-2015/icecube-sees-highest-energy-neutrino-ever-found
"In 2013, the IceCube neutrino experiment at the South Pole reported the observation of two ultra-high-energy neutrino events, which they named after Sesame Street characters Bert and Ernie. Later, they found one more."
http://physicsworld.com/cws/article/news/2015/may/11/icecube-neutrinos-do-come-in-three-flavours-after-all
"High-energy neutrinos detected by the IceCube experiment in Antarctica are equally distributed among the three possible neutrino flavours, according to two independent teams of physicists. Their analyses overturn a preliminary study of data, which suggested that the majority of the particles detected were electron neutrinos. The latest result is in line with our current understanding of neutrinos, and appears to dash hopes that early IceCube data point to "exotic physics" beyond the Standard Model."
HW1
Clone https://github.com/saraheno/honrpythia.git and then do the following.
git clone https://github.com/saraheno/honrpythia.git
cd honrpythia
source PYTHIA_setup.csh
make pythiaTree
./pythiaTree >& output.txt
TODO: alter pythiaTree.cc so that the particle's mass is stored in the tree.
Float_t mass[5000];//created a variable to hold mass where the other variables are defined; it's an array since it will hold many
//Book tree
t1->Branch("mass",mass,"mass/F"); //created a branch that will hold the mass array
if(pythia.event[i].isFinal()){
mass[nTot-1]=pythia.event[i].m();
//rest here
}
TODO: alter it so that it does 10000 events instead of 100
graph of 100
for(int iEvent =0; iEvent<10000; ++iEvent)//changed the original 100 to 10000; also changed the 5000 cutoff to 10000
if(!pythia.next()) continue;
"Let's make a Z' which decays to electron pairs and has a mass of 1500 GeV. In order to do this, remove the lines that tell it to make QCD events and add instead"
Pythia pythia;
//pythia.readString("HardQCD:all = on");
//pythia.readString("PhaseSpace:pTHatMin = 20.");
//pythia.readString("Beams:eCM = 14000.");
pythia.readString("NewGaugeBoson:ffbar2gmZZprime = on");
pythia.readString("32:m0 = 1500.");
pythia.readString("32:mMin =900.");
pythia.readString("32:mMax =3000.");
pythia.readString("PhaseSpace:mHatMin = 20.");
pythia.readString("Zprime:gmZmode = 3");
pythia.readString("Beams:eCM = 14000.");
pythia.readString("32:onMode = off");
pythia.readString("32:onIfAny = 11");
TODO: if you did this right, if you plot the invariant mass of the two highest pT electrons in the event, you should see a nice bump at 1500 GeV. Please make this plot
//declare
double pT1,pT2,mass1,mass2;
inside for (int iEvent = 0; iEvent < 10000; ++iEvent) {
//reset for each event
nTot=0;
pT1=0;
pT2=0;
mass1=0;
mass2=0;
after Apz[nTot-1]=pythia.event[i].pz();
put
double particlePT=pythia.event[i].pT();
add the if statement
if(pythia.event[i].id()==11 || pythia.event[i].id()==-11)//checks if it's an electron
{
cout<<"Electron id "<<i<<" with invariant mass "<<pythia.event[i].m()<<endl;
if(particlePT > pT1)//if the new pT is bigger than the old biggest pT, replace it and make the old biggest the second biggest
{
pT2=pT1;
mass2=mass1;
pT1=particlePT;
mass1=pythia.event[i].m();
}
else if(particlePT>pT2)//bigger than the second biggest but not the biggest
{
pT2=particlePT;
mass2=pythia.event[i].m();
}
}
cout<<"Mass 1: "<<mass1<<" Mass 2: "<<mass2<<endl;
mass->Fill(pT1+pT2);
cout<<"Z boson mass: "<<pT1+pT2<<endl;
TODO: in a run of 20 fb-1, how many Zprime will be made? How many DY? What is the ratio of signal over background?
Z-Prime = 73.3 fb
DY = 2.605e6 fb
Zprime: 73.3 fb * 20 fb-1 = 1466 events
Drell-Yan: 2.605e6 fb * 20 fb-1 = 5.21e7 events
Zprime/DY
Ratio: 2.814e-5
N_selected =( N_generated_passing_cuts/N_generated) * (sigma * Lum)
Zprime_cuts
(1000/5104) * 73.3 fb * 20 fb-1 = 287.2 events
DY_cuts
(1000/2663) * 2.605e6 fb * 20 fb-1 = 1.956e7 events
DY_cuts/Zprime_cuts = 6.81e4
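Redoing the 20 fb-1 event-count arithmetic above in code (cross sections and generated-event counts copied from the notes; "eff" is the fraction of generated events passing cuts):

```python
sigma_zprime = 73.3     # fb, Z' cross section from the notes
sigma_dy = 2.605e6      # fb, Drell-Yan cross section from the notes
lumi = 20.0             # fb^-1

# expected events before cuts: N = sigma * luminosity
n_zprime = sigma_zprime * lumi
n_dy = sigma_dy * lumi
print(n_zprime, n_dy, sigma_zprime / sigma_dy)

# expected events after cuts: N_selected = (passing/generated) * sigma * lumi
eff_zprime = 1000 / 5104
eff_dy = 1000 / 2663
n_zprime_cuts = eff_zprime * n_zprime
n_dy_cuts = eff_dy * n_dy
print(n_zprime_cuts, n_dy_cuts, n_dy_cuts / n_zprime_cuts)
```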
graphs and rest on pdf