Week One: Introduction
For the first week, I got into contact with my partner for the semester, Matthew Kirby. We then contacted our instructor, Erik Blaufuss. We will be conducting research with the neutrino detector called IceCube, which is located at the South Pole. He gave us some scientific papers to read up on the subject, which are linked below:
For some background, here are some papers:
* Background on neutrino astronomy
https://arxiv.org/abs/1007.1247
https://arxiv.org/abs/1701.03731
* Background on IceCube systems, etc:
https://arxiv.org/abs/1612.05093
* IceCube Point source results.
Most recent IceCube point source papers are here:
http://arxiv.org/abs/1406.6757
https://arxiv.org/abs/1609.04981
It references a good methods paper that describes how things work:
http://arxiv.org/abs/0912.1572
**FYI: Throughout this semester, most of my work was done on macOS, so the instructions below are written for a Unix-like environment. They should work much the same in Ubuntu, but will most likely be different on Windows.
Week Two: More Reading and Python.
For this semester, we will be writing most of our code in Python, using the libraries NumPy, HealPy, and Matplotlib. Instructions on getting these set up are below. This week, our instructor sent us a data file containing a year's worth of data collected from IceCube. It contained information such as: run #, event #, RA, Dec, azimuth, zenith, energy (log10), angular error, and time (fractional MJD). To fully understand this data, one should get familiar with astronomical coordinate systems. Trust me, it's pretty important.
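As a concrete sketch of what reading such a file might look like with NumPy (the file name, column order, and degree units here are my assumptions, not the actual format of the file Erik sent):

```python
import numpy as np

# Build a small toy file standing in for the real IceCube event file.
# Column layout assumed from the description above: run, event, RA, Dec,
# azimuth, zenith, log10(energy), angular error, time (fractional MJD).
rng = np.random.default_rng(1)
n = 100
toy = np.column_stack([
    np.full(n, 118175.0),                          # run number (made up)
    np.arange(n, dtype=float),                     # event number
    rng.uniform(0.0, 360.0, n),                    # RA, assumed degrees
    np.degrees(np.arcsin(rng.uniform(-1, 1, n))),  # Dec, assumed degrees
    rng.uniform(0.0, 360.0, n),                    # azimuth
    rng.uniform(0.0, 180.0, n),                    # zenith
    rng.uniform(2.0, 7.0, n),                      # log10(energy)
    rng.uniform(0.2, 2.0, n),                      # angular error
    rng.uniform(55694.0, 56062.0, n),              # time (fractional MJD)
])
np.savetxt("toy_events.txt", toy)

# Reading it back the same way one would read the real file:
data = np.loadtxt("toy_events.txt")
ra, dec = data[:, 2], data[:, 3]
print(data.shape)  # (100, 9)
```

Once the columns are pulled out like this, plotting something like a declination histogram with Matplotlib is one line away.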
Setting up Python Libraries:
Typically, Macs come preinstalled with Python 2.7. This is perfect for the analysis we do throughout this semester. The Python libraries can then be installed as follows:
pip install numpy
pip install scipy
pip install healpy
pip install matplotlib
If these commands do not work, you can visit the libraries' main websites to figure out what extra command is needed to install them. Other than that, it should be fairly simple to install these libraries.
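One quick way to confirm the installs worked is to try importing each library and printing its version:

```python
import importlib

# Try each library used this semester and report its version,
# or a warning if the import fails.
for name in ["numpy", "scipy", "healpy", "matplotlib"]:
    try:
        module = importlib.import_module(name)
        print(name, getattr(module, "__version__", "version unknown"))
    except ImportError:
        print(name, "is NOT installed")
```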
Week Three: Data Manipulation
For the past week or so, we have been going over the logbooks of previous students, looking at what they did with the data last semester, and trying to replicate some of their results. Below you will find some graphs from Alison Duck's logbook from the 2015 semester. I ran through her logbook to see what steps she had taken, and replicated the graphs using code I had written, following her steps:
Week 4:
This week we read papers on IceCube and on types of point source searches: the same papers Erik gave us, listed under Week One above.
Week 5:
This week we focused on the types of point source searches, and looked more in depth at the p-value code written by last semester's students.
Week 7:
So this week I tried to get Python working on my desktop at home, so that I can run these trials while I am at school instead of using my laptop, since they can sometimes take a few hours at a time. (WARNING: THIS FIRST PART WAS A TOTAL WASTE OF TIME. HEALPY DOES NOT SUPPORT WINDOWS, THEREFORE SETTING UP THIS CODE TO WORK ON WINDOWS WAS A WASTE OF TIME.) Anyhow, this is how I began setting it up on Windows:
1. Go to Python's main website and download version 2.7. There are newer versions, but the code was written for 2.7, and running it with newer versions of Python could cause errors. The main website is here: https://www.python.org/
2. To install the packages needed, you can follow the instructions in this video: https://www.youtube.com/watch?v=-llHYUMH9Dg
3. At this point I realized that Healpy was not supported for windows, therefore, this time was wasted... :(
After I realized it was not going to work, my plan B was to use a virtual machine to do this work in Ubuntu. This actually worked, and below are the steps:
1. Go to terpware.umd.edu, then download and set up VMware. VMware is the virtual machine program that will run Ubuntu, which in turn will run the code.
2. After the VM-Ware is setup, you can download the newest version of Ubuntu from: https://www.ubuntu.com/download/desktop
Once it is downloaded, instead of installing it right away, open the VMware software and access the download through it. This way, when Ubuntu begins installation, it will install inside the virtual machine.
3. Now that that is done, you can proceed with installing the Python libraries. Python should already be installed, which you can double-check by typing python into the terminal; the Python interpreter should load.
4. Installing the libraries is the easy part:
pip install numpy
pip install healpy
pip install matplotlib
pip install scipy
Those are the commands used to install the libraries. If they do not work, some packages may need the "sudo apt install ___" command instead. Other than that, it should be fairly simple in Ubuntu.
5. Now Python is ready to go, and the code should be able to run.
Week 10:
This week, we ran the main code a few times to get a sense of how it works, using the 2011 IceCube data set that Erik gave us. Linked at the bottom you will find the code written by last semester's students. This code takes the data from IceCube, scrambles it, and records the highest p-value on the sky map. It repeats this process for the number of trials you set (typically somewhere between 1000 and 2000). Running this many trials is what can take a few hours, and when it is done, it gives you a graph of the highest p-values recorded. This way, if we see any especially high p-values that seem to come from the same point in the sky, we can look closer at that point to see whether or not it is a point source.
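The scrambling-trials idea can be sketched in a few lines. This is not the students' actual code: it uses made-up events and a coarse RA/Dec histogram in place of the real HEALPix sky map, and it records the hottest bin count rather than a full p-value map, but the trial loop has the same shape:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for one year of IceCube events: RA in [0, 360), Dec in [-90, 90].
n_events = 5000
ra = rng.uniform(0.0, 360.0, n_events)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n_events)))

def max_bin_count(ra_deg, dec_deg, n_ra=36, n_dec=18):
    """Bin events on a coarse RA/Dec grid and return the hottest bin count.
    (A simplified stand-in for the HEALPix sky map used in the real code.)"""
    counts, _, _ = np.histogram2d(
        ra_deg, dec_deg, bins=[n_ra, n_dec], range=[[0, 360], [-90, 90]]
    )
    return counts.max()

observed_max = max_bin_count(ra, dec)

# Scrambling: randomize RA while keeping Dec fixed, once per trial.
n_trials = 200
trial_maxima = np.array([
    max_bin_count(rng.uniform(0.0, 360.0, n_events), dec)
    for _ in range(n_trials)
])

# Post-trial p-value: fraction of scrambled skies at least as hot as observed.
p_value = np.mean(trial_maxima >= observed_max)
print("observed max bin count:", observed_max)
print("p-value:", p_value)
```

Plotting a histogram of `trial_maxima` gives the kind of highest-value-per-trial graph described above.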
WEEK 12:
Around this time we began to talk about HAWC (the High Altitude Water Cherenkov Observatory). HAWC reportedly found about 40 gamma-ray point sources, and gamma rays tend to be correlated with neutrinos. Therefore, our goal from now on would be to compare IceCube's data with the gamma-ray point sources found by HAWC, to see if there is any correlation and whether these point sources can also be considered neutrino point sources.
The way we plan to approach this is to alter the p-value code given to us so that it only looks at the point sources found by HAWC. This will save a lot of time, since the code no longer has to look at every point in the sky, only at about 40 points.
WEEK 13:
Around this time we were given multiple years of data collected by IceCube, which we were going to run through the code given last year as well as through the code we modified this semester. To combine the files into one, you can run this code:
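The original combining code isn't reproduced in this log, but a minimal sketch, assuming the yearly files are plain-text tables with identical columns, looks like this (the file names here are toy placeholders that the snippet creates itself):

```python
import numpy as np

# Create two tiny toy "yearly" files so the example is self-contained;
# in the real analysis these would be the yearly IceCube data files.
np.savetxt("year1.txt", np.array([[1.0, 2.0], [3.0, 4.0]]))
np.savetxt("year2.txt", np.array([[5.0, 6.0]]))

# Load each year's table (ndmin=2 keeps single-row files 2-D) and stack rows.
combined = np.vstack([np.loadtxt(f, ndmin=2) for f in ["year1.txt", "year2.txt"]])
np.savetxt("combined.txt", combined)
print(combined.shape)  # (3, 2)
```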
Then you can adjust the preexisting code to read this newly created data file, which should contain all the data in one place.
So Matthew was able to write code that populates a sky map full of zeros, and then puts a value of 1 in the bin that matches the RA (right ascension) and declination of each point source. If the bin already has a 1 in it, nothing more is done to that bin. The sky map code looks like this:
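Matthew's code itself isn't shown in this log, but the logic described above can be sketched as follows, using a plain 1-degree RA/Dec grid instead of the HEALPix map the real code likely used, and made-up source positions:

```python
import numpy as np

# Hypothetical point-source positions (RA, Dec in degrees);
# note one deliberate duplicate to show the already-filled-bin case.
sources = [(83.6, 22.0), (98.1, 17.4), (83.6, 22.0)]

# Sky map of zeros: one bin per degree of RA and Dec.
n_ra, n_dec = 360, 180
skymap = np.zeros((n_ra, n_dec))

for ra, dec in sources:
    i = int(ra) % n_ra        # RA bin
    j = int(dec + 90.0)       # Dec bin, shifted so -90 deg maps to bin 0
    skymap[i, j] = 1.0        # a bin that already holds a 1 simply stays 1

print(int(skymap.sum()))  # 2 -- the duplicate source lands in the same bin
```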
Once that code was done being written, we let it run a few times and got graphs like this:
Don't worry about the line through the middle or the axis titles, as those were added in later to make the graph look nicer and presentable during the final stages of our research. Anyway, from the data shown in this graph, we can conclude with some certainty that the HAWC data points don't show any real significance as neutrino point sources.
WEEK 15:
Around this time we began wrapping up our research and creating a presentation, paper, and poster all at the same time. Below you will find our poster, paper, and final presentation attached to look through.