For 30 years, I (Craig McNeile) worked in the field of lattice QCD. During that time, I worked on many projects that used high-performance computing and computational science techniques. I have been a lecturer in Theoretical Physics at the University of Plymouth since 2013, and I am currently the program manager for the Data Science and Business Analytics MSc at the University of Plymouth.
This site gives an overview of my skills. I have a short CV for computational science.
I have recently been focusing on using Python for data analysis. The physics analysis code I use is here, but I also use standard Python libraries such as pandas, SciPy, and Keras.
C++ / C
Fortran 90/95
Scripting: Perl, UNIX command line tools, awk, sed, bash programming.
Mathematical libraries and systems: R, Octave, gsl, maple, Matlab
Version control systems: CVS and Git (with GitHub).
Parallel computing with MPI.
SQL and the design of relational databases. I teach the MSc module Software Development and Databases at the University of Plymouth. See the GitHub repository of exercises for the module.
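A minimal sketch of the kind of relational design covered in the module, using Python's built-in sqlite3 module. The table and column names here are hypothetical examples, not taken from the module materials: a two-table schema with a foreign key, queried with a JOIN.

```python
import sqlite3

# Hypothetical two-table relational design (not from the actual module):
# a student table and an enrolment table linked by a foreign key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("PRAGMA foreign_keys = ON")
cur.execute("""CREATE TABLE student (
                   id   INTEGER PRIMARY KEY,
                   name TEXT NOT NULL)""")
cur.execute("""CREATE TABLE enrolment (
                   student_id INTEGER REFERENCES student(id),
                   module     TEXT NOT NULL)""")
cur.execute("INSERT INTO student VALUES (1, 'Alice')")
cur.execute("INSERT INTO enrolment VALUES (1, 'Software Development and Databases')")

# A JOIN recombines the normalised tables for reporting.
rows = cur.execute("""SELECT s.name, e.module
                      FROM student AS s
                      JOIN enrolment AS e ON s.id = e.student_id""").fetchall()
print(rows)
conn.close()
```

Splitting students and enrolments into separate tables (rather than one wide table) is the usual normalisation step: each fact is stored once, and the JOIN reconstructs the combined view on demand.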
List of computational science skills
Numerical linear algebra (dense and sparse)
Monte Carlo techniques
Nonlinear regression
Scientific plotting
Jackknife and bootstrap data analysis
Bayesian analysis
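As an illustration of the jackknife and bootstrap items above, here is a minimal sketch (using synthetic data, not any real measurement) of estimating the standard error of a sample mean both ways with NumPy:

```python
import numpy as np

# Synthetic "measurements"; in a real analysis these would be correlator data.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.5, size=200)
n = data.size

# Bootstrap: resample with replacement many times, take the spread of the means.
nboot = 1000
boot_means = np.array([rng.choice(data, size=n, replace=True).mean()
                       for _ in range(nboot)])
boot_err = boot_means.std(ddof=1)

# Jackknife: leave-one-out means, with the (n-1) variance inflation factor.
jack_means = np.array([np.delete(data, i).mean() for i in range(n)])
jack_err = np.sqrt((n - 1) * np.mean((jack_means - jack_means.mean()) ** 2))

print(f"mean = {data.mean():.3f}")
print(f"bootstrap error = {boot_err:.4f}, jackknife error = {jack_err:.4f}")
```

For a simple mean the two estimates agree closely; the resampling methods earn their keep on derived quantities (fit parameters, ratios) where no simple error formula exists.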
I have contributed to a number of open source codes for lattice QCD research, such as Chroma, and the MILC code.
I added code to Chroma that implemented HEX smearing (this essentially sets up a sparse matrix). There are also a couple of routines that integrate the HEX smearing into the matrix-times-vector routines.
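The general pattern is sketched below in Python with SciPy's sparse matrices (this is an illustration of the idea, not Chroma code, and the simple nearest-neighbour stencil is a stand-in for the actual smeared-link structure): build the sparse matrix once, then reuse it inside a matrix-times-vector routine.

```python
import numpy as np
import scipy.sparse as sp

# Stand-in smearing matrix: a nearest-neighbour stencil on a 1-D periodic
# lattice of 8 sites (the real HEX-smearing matrix acts on gauge links).
n = 8
main = 0.8 * np.ones(n)
off = 0.1 * np.ones(n - 1)
S = sp.diags([off, main, off], offsets=[-1, 0, 1],
             shape=(n, n), format="lil")
S[0, n - 1] = 0.1   # periodic wrap-around
S[n - 1, 0] = 0.1
S = S.tocsr()       # CSR format for fast matrix-times-vector products

def smear(v, niter=3):
    """Apply the sparse smearing matrix niter times to a vector."""
    for _ in range(niter):
        v = S @ v
    return v

v = np.zeros(n)
v[0] = 1.0          # point source
print(smear(v))     # the source is spread over neighbouring sites
```

Since each row (and column) of this stencil sums to one, the smearing conserves the total weight of the source while spreading it out, which is the qualitative effect of link smearing.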
Chroma is an open-source C++ code for lattice QCD calculations. The sloccount utility reports that the code base contains 825,185 lines of code. I have not written the bulk of the code, but I have contributed routines. Chroma uses design patterns and template techniques for performance. The main format for input and output is XML.
Below is Python code to do a statistical analysis of correlators. It uses array data structures from NumPy, chi**2 minimization routines from SciPy, and plotting from pylab.
See some basic python analysis code in my github.
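The core of such an analysis can be sketched as follows (a minimal illustration on synthetic data with a single-exponential model, not the actual repository code; plotting is omitted here):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, A, m):
    """Single-state correlator model C(t) = A * exp(-m * t)."""
    return A * np.exp(-m * t)

# Synthetic correlator data with 2% errors (stand-in for real measurements).
rng = np.random.default_rng(1)
t = np.arange(1, 16)
A_true, m_true = 2.0, 0.5
err = 0.02 * model(t, A_true, m_true)
C = model(t, A_true, m_true) + rng.normal(0.0, err)

# curve_fit with sigma and absolute_sigma performs the chi**2 minimization.
popt, pcov = curve_fit(model, t, C, p0=[1.0, 1.0],
                       sigma=err, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))
chi2 = np.sum(((C - model(t, *popt)) / err) ** 2)

print(f"A = {popt[0]:.3f} +/- {perr[0]:.3f}")
print(f"m = {popt[1]:.3f} +/- {perr[1]:.3f}")
print(f"chi**2/dof = {chi2 / (len(t) - 2):.2f}")
```

In a full analysis the fitted mass would then be resampled with the jackknife or bootstrap to propagate the statistical errors.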
When I was in the European Twisted Mass Collaboration I started using R for curve fitting, because R was proposed as the framework for statistical analysis in that collaboration.
ROOT is a data analysis framework from CERN. I am using it to do the final data analysis. It provides easy access to the MINUIT minimizer, and although the base language is C++, it uses an interpreter.
go_root.sh (the top-level bash script that calls the ROOT routines)
loop_boot_myfit_3param_2anal_B.C This is analysis code.
While at the University of Liverpool we needed a tool to help with a testing framework for the Chroma library. The xmldiff tool is used to compare XML output up to rounding errors. The code was written by people at EPCC, but I was involved with the requirements analysis, and I did the first simple integration into the Chroma library.
To perform data analysis of lattice QCD calculations I have written a C++ program. I started work on it around 1994, initially using the low-level routines from the Numerical Recipes book (this seemed like a good idea when I started the code). When I add new library calls to the code I now mostly use routines from the GNU Scientific Library. The sloccount utility reports that the code is 25,000 lines. I use array data structures from Blitz++ for part of the code (although today I would use data structures from Boost).
Please see here for contact information.