IceCube
-------
1/25/2018
Confirmed final groups for upcoming research projects
Started using the logbook
Found a few helpful links that introduce the IceCube project that I will likely compile into a handy references section in the near future
http://icecube.umd.edu/Home.html (the UMD IceCube group home page)
http://icecube.wisc.edu/ (the general IceCube website available to the public)
https://arxiv.org/pdf/1612.05093.pdf (information about IceCube's design and production)
1/29/2018
Met with Dr. Erik Blaufuss to discuss ideas for research plans for the semester
He recommended a few papers we could read to develop some background
On neutrino astronomy
https://arxiv.org/abs/1007.1247
https://arxiv.org/abs/1701.03731
On IceCube systems
https://arxiv.org/abs/1612.05093
On point source results and methodology
https://arxiv.org/abs/1406.6757
https://arxiv.org/abs/1609.04981
https://arxiv.org/abs/0912.1572
2/9/2018
Did not meet with Dr. Blaufuss this week, but continued reading the literature and took some notes
Having some issues with the VM
2/17/2018
Discussed questions we had regarding the literature and event histograms from data we received from Dr. Blaufuss
Stacking searches can reveal detectable source populations that are not otherwise discoverable individually
We will start to look into analyzing data using the stacking search method to stack bins with "interesting objects" and compare with other bins
Candidates for "interesting objects" can be chosen by looking through catalogs of astronomical objects with bright fluxes, such as quasars, blazars, and gamma-ray sources
Learned more about celestial coordinates and how they relate to astronomical objects
right ascension runs longitudinally while declination runs latitudinally
2/19/2018
Continuing to look for candidates from numerous catalogs; we are particularly interested in the LAT 4-year Point Source Catalog and its listing of pulsars
Pulsar Map
A simple Mollweide projection of pulsar-like objects pulled from the catalog's interactive table
Finally downloaded VirtualBox, installed Lubuntu, and installed the Anaconda distribution of Python; having a bit of trouble browsing the packages and handling the "IC86_exp_all.npy" numpy file
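Once the VM cooperates, peeking at the file should just be a numpy call. A minimal sketch (assuming the file holds a structured numpy array; we haven't confirmed its field names yet):

import numpy as np

# Load the experimental event file from Dr. Blaufuss (assumed to be a structured array)
events = np.load("IC86_exp_all.npy")

# Inspect what we actually have before assuming any column names
print(events.dtype.names)   # record fields, if it is a structured array
print(events.shape)         # number of events
print(events[:5])           # first few rows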
3/6/2018
Have been busy here and there with exams, so I did not get much done by this week's meeting
Discussed an IceCube wiki page about the stacking method with Jake and Dr. Blaufuss; we won't look at single point source searches, but rather a binned contribution, plus "some fine tuning," as Dr. Blaufuss calls it
Did some more soul-searching for a good catalog to start with (The WISE catalog is just too big to work with right now)
http://www.astro.gsu.edu/AGNmass/ (details some well-noted AGN black holes along with a few of their properties)
AGN Black Hole Database
The front page of the AGN Black Hole Mass Database web interface. The main component of the front page is the object table with the black hole masses. Also included are options for the user to change the adopted value of ⟨f⟩ in the mass determination, an option to create and download a file with either comma-separated or tab-separated values for the object table, and links to the detailed information for each object in the far left column of the object table. There are also two anchors that will take the user to the bottom of the page, where s/he can find information on the black hole mass calculations and how to acknowledge use of the database.
3/12/2018
Considered possibilities for what we can do with the AGN database:
We've got 62 AGNs, which is a good number because there aren't too many :)
Each object entry lists:
object name
black hole mass
right ascension and declination
redshift
It would be nice if we could plot each data point onto a Mollweide projection
Other parameters such as redshift and estimated black hole mass would be helpful in determining different weights for the stacking analysis
3/28/2018:
We looked at Paul's code, which randomizes the data to simulate background; it produced some pretty histograms of p-values (quantities that tell us how significant a certain event is) and a Mollweide projection of the 2011 data in terms of these p-values
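Roughly, the scrambling idea looks something like this. This is a sketch of the concept, not Paul's actual code, and it assumes the events come as a structured array with an "ra" field in radians:

import numpy as np

def scramble_ra(events, rng=None):
    # Return a copy of the events with right ascension randomized.
    # Scrambling RA keeps the detector's declination-dependent response
    # but destroys any real clustering, which makes it a background estimate.
    # Assumes a structured array with an "ra" field in radians.
    rng = np.random.default_rng() if rng is None else rng
    scrambled = events.copy()
    scrambled["ra"] = rng.uniform(0.0, 2.0 * np.pi, size=len(events))
    return scrambled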
4/1/2018:
We're considering creating a skymap of point sources from the AGN (active galactic nuclei) catalog published by Georgia State University
We downloaded astropy, which will likely make it a little easier to tinker with units and whatnot; it's an astronomer's package
We had problems creating a table from the mbh.csv file that we downloaded from their website
We ended up deleting the first two rows to clear up formatting, which fixed our ascii.read() call
The cleaned-up mbh.csv file is at the bottom for reference
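For reference, the read itself comes down to one call once the formatting is cleaned up (a sketch; the exact column names depend on the downloaded mbh.csv):

from astropy.io import ascii

# Read the cleaned-up catalog (first two header rows already deleted by hand)
agn_table = ascii.read("mbh.csv", format="csv")

print(agn_table.colnames)   # check which columns we actually have
print(len(agn_table))       # should be 62 objects
print(agn_table[:5])        # first few rows as an astropy Table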
Our initial csv file on Excel:
Converted to a neat-looking table using astropy
After getting this nice table, we wanted to visualize our objects on a 2D surface to get a sense of where they sit on the celestial sphere, so we used matplotlib to plot them onto a Mollweide projection
We had some trouble with this because the right ascension and declination were in sexagesimal units: the right ascension was in hours:minutes:seconds and the declination was in degrees:minutes:seconds; once we got the units figured out, we were able to plot them
Our Mollweide projection map code is also listed on the bottom
Our Mollweide projection of the AGN objects:
We eyeballed a few of the catalog objects from their right ascension and declination and they seem to be accurate
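For reference here, a stripped-down sketch of the plotting step; the column names "RA" and "Dec" are placeholders for whatever the table actually calls them:

import matplotlib.pyplot as plt
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.io import ascii

agn_table = ascii.read("mbh.csv", format="csv")   # cleaned-up catalog from above

# Sexagesimal strings: RA in hours:minutes:seconds, Dec in degrees:minutes:seconds
coords = SkyCoord(ra=agn_table["RA"], dec=agn_table["Dec"],
                  unit=(u.hourangle, u.deg))

# matplotlib's Mollweide projection wants radians, with RA wrapped to [-pi, pi]
ra_rad = coords.ra.wrap_at(180 * u.deg).radian
dec_rad = coords.dec.radian

fig = plt.figure(figsize=(8, 4))
ax = fig.add_subplot(111, projection="mollweide")
ax.scatter(ra_rad, dec_rad, s=10)
ax.grid(True)
plt.show()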
4/9/2018:
I forgot to mention that last Monday, Jake managed to derive a conversion of redshift to proper distance
https://en.wikipedia.org/wiki/Hubble%27s_law is where a good chunk of his proof and insight came from
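A rough version of the low-redshift approximation that falls out of Hubble's law (a sketch; the value of H0 below is just an assumed number, and Jake's full derivation treats the cosmology more carefully):

from astropy import units as u
from astropy.constants import c

H0 = 70 * u.km / u.s / u.Mpc   # assumed value of the Hubble constant

def proper_distance(z):
    # Low-redshift approximation from Hubble's law: v = H0 * d and v ~ c * z,
    # so d ~ c * z / H0 (only valid for small z)
    return (c * z / H0).to(u.Mpc)

print(proper_distance(0.02))   # a hypothetical nearby AGN at z = 0.02, roughly 86 Mpc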
We also received an email from Dr. Blaufuss with a bunch of flux graphs and detector efficiencies at various declinations
I'm not too sure what they mean, but I think a detector weighting can be incorporated into our stacking equation
We ended up downloading the data and tried graphing some of these plots to get an idea of what we're working with
We're looking to incorporate these two things (redshift and detector weight) as weights, which are essentially factors that can impact the detected flux of neutrinos
For example, very high-energy neutrinos that have to pass through the Earth to reach IceCube are less likely to be detected, while downward-going events are more likely to include atmospheric muon background; therefore the number of neutrino counts depends on declination
We've got this lovely looking equation (https://wiki.icecube.wisc.edu/index.php/WHSP_Blazar_Stacking/Blazar_Stacking_Method) that mathematically represents the stacking that we're doing
W^k is a theoretical weight affecting neutrino flux and is primarily based on the object
We consider mass and redshift as viable weights
R^k(delta, gamma) represents our detector efficiency, which is a better-defined quantity
It depends on parameters that affect the detector's ability to count neutrinos, like declination (and the assumed spectral index gamma), as in the example above
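In code, combining the two kinds of weights might look roughly like this. This is a sketch of the idea rather than the wiki's exact formula; source_weight and detector_eff stand in for W^k and R^k:

import numpy as np

def stacking_weights(source_weight, detector_eff):
    # Relative weight of source k: W^k * R^k(delta, gamma), normalized to sum to 1
    w = np.asarray(source_weight, dtype=float) * np.asarray(detector_eff, dtype=float)
    return w / w.sum()

# Hypothetical example: three sources with theoretical weights (e.g. from redshift
# or black hole mass) and detector efficiencies at each source's declination
W = [1.0, 0.5, 2.0]
R = [0.8, 0.3, 0.6]
print(stacking_weights(W, R))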
Here are some of the graphs we obtained from Dr. Blaufuss's code describing the effectiveness of the detector at certain declinations:
4/23/2018:
Not a whole lot to say here except that we're continuing to work on and combine our weighting code, although we're having trouble with array sizes from convolve
4/26/2018:
We talked about some of the ideas behind the HEALPix pixelization algorithm
We're dealing with about 12,000 pixels, each of which represents a bin of the sky
We worked on a bit of the code and ended up matching the catalog objects to pixels in the skymap of events from the IceCube data
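The matching itself comes down to one healpy call. A sketch, assuming nside = 32 (which gives 12288 pixels, the "about 12,000" mentioned above); our actual map resolution may differ:

import numpy as np
import healpy as hp

nside = 32   # assumed resolution: 12 * 32**2 = 12288 pixels over the whole sky

def radec_to_pixel(ra_deg, dec_deg, nside=nside):
    # Map RA/Dec in degrees to the HEALPix pixel index containing that direction
    theta = np.radians(90.0 - np.asarray(dec_deg))   # colatitude
    phi = np.radians(np.asarray(ra_deg))
    return hp.ang2pix(nside, theta, phi)

# Example: a catalog object at RA = 150 deg, Dec = +2 deg
print(radec_to_pixel(150.0, 2.0))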
4/29/2018:
We added neighboring pixels so that we account for neutrino flux from more than just one bin
We also hard-coded the declination for our detector weight, which was given to us by Dr. Blaufuss; we were pressed for time with presentations and whatnot, so this expedited the weighting code a bit, and we should be able to produce a test statistic shortly while we work out some kinks in the statistical analysis
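Grabbing the neighboring pixels is also a single healpy call (a sketch following the same assumed nside as the pixel-matching code above):

import healpy as hp

nside = 32   # same assumed map resolution as in the pixel-matching sketch

def pixel_with_neighbors(pix):
    # Return the pixel itself plus its (up to 8) HEALPix neighbors
    neighbors = hp.get_all_neighbours(nside, pix)
    neighbors = neighbors[neighbors >= 0]   # -1 marks a missing neighbor
    return [pix] + list(neighbors)

print(pixel_with_neighbors(1234))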
4/30/2018:
We got the code working after some issues with expected and actual events, and Jake ended up running a few trials
We got a test statistic that essentially told us that our candidates are not significant sources of neutrinos, which is to be expected because as of yet, several years of IceCube data have not found a significant source of neutrinos
AFTERNOON:
So our statistic ended up being wrong
Dr. Blaufuss said the analysis didn't look right and told us to fix our expected background distributions
We scramble the data and compare it to the background to get a test statistic (a pseudo p-value, as Dr. Blaufuss calls it), and this is repeated for many trials
The actual data is compared to the expected background to get another test statistic; if this test statistic is more than a few standard deviations above the mean of the background distribution, we can usually say there is some significance and reject the null hypothesis (that there is no significant source); my understanding is not the greatest, so take this with a grain of salt
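Here's the shape of the procedure as I understand it, written as a sketch with made-up numbers; the Poisson background below is purely illustrative and not our real background distribution:

import numpy as np

def pseudo_p_value(observed_count, scrambled_counts):
    # Fraction of scrambled (background-like) trials at least as extreme as the data:
    #   observed_count   - stacked event count from the real data
    #   scrambled_counts - stacked counts, one per RA-scrambled trial
    scrambled_counts = np.asarray(scrambled_counts)
    return np.mean(scrambled_counts >= observed_count)

# Made-up numbers for illustration: pretend the real data gives 530 events in the
# stacked bins and the scrambled trials fluctuate around 500
rng = np.random.default_rng(0)
background_trials = rng.poisson(500, size=10000)
print(pseudo_p_value(530, background_trials))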
We've got presentations tomorrow so we'll run with it for now and hopefully get meaningful results later during the week
5/4/2018:
We ended up getting a correct test statistic this time!
Once again, the test statistic suggested that there is no significance
We'll look to incorporate this onto our poster once Jake runs a couple thousand trials
I included our code for the final stacking
We didn't manage to fit in all of our weighting parameters, like mass, and didn't get too far in our statistical analysis, like injection signals and linear extrapolations (whatever those mean), but I think we did a fair job completing our goal of at least determining whether our sources are significant
All in all, this was a nice teaching exercise in which I learned a lot about neutrino astrophysics, Python, and some statistics