Building a deep learning research machine

4/21/19

Lessons learned in building my first PC.

Throughout my career I have been tinkering with software in various ways. I mostly write code to solve scientific problems, ranging from what causes extreme waves in the North Sea to who won the CrossFit Open in Miami. I recently wanted to challenge myself, tinker with hardware, and build my own PC. I was motivated by a couple of blog posts [1, 2] about building a PC for deep learning instead of paying for cloud usage. The machine will also give me a platform to improve my knowledge of AI/ML, and I simply needed a new computer, as my laptop is on its last legs.

Choosing the parts

I gave myself a $1k budget and used pcpartpicker.com. pcpartpicker is great at ensuring all the parts are compatible, e.g. that this CPU works with this motherboard (although I still managed to mess this up; more on that later...). You can see my part list here.

If you can, buy from Amazon: it makes returns very easy.

PC parts

Building the machine

The first time I tried to build the PC, I put everything in the case and then (after a few hours) hit the power switch, hoping to fire it up. However, nothing happened. I then had to take it apart and rebuild it in stages: does the motherboard work? (Yep.) Does the CPU work? (Nope.) The kind people at r/buildapc noticed I had bought an older-generation CPU for my motherboard, so I had to return the Intel i5-7600K and exchange it for the Intel i5-8600K. Thankfully there was no difference in price, and outletpc.com accepted my opened (though still working) CPU.

Building the machine part 2

While getting the wrong CPU was frustrating, the pedagogy is in the mistakes. I got to build the machine again and learn about the different generations of CPUs. This time I made sure to check everything from the start, and my build was much cleaner and quicker.

She's alive!

Installing software

I opted to install Ubuntu. It's free, and most scientific applications run on Linux. I was slightly crazy and went for the latest version, 19.04 (Disco Dingo), instead of 18.04 LTS (Bionic Beaver, Long Term Support). The newest release, which was one day old when I installed it, doesn't have much user support yet, which made installing software more difficult. Nonetheless, I got to learn how to build various software packages from scratch [3] and to test bleeding-edge software versions.

Powered by a six-core CPU and a CUDA-capable GPU, I can now play with various deep learning libraries such as TensorFlow, Keras, PyTorch, Caffe, CNTK, and Theano, and make use of GPU acceleration [4].
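As a quick sanity check that a framework can actually see the GPU, here is a minimal sketch using TensorFlow's device listing. It relies on the internal device_lib module present in the 1.x releases current at the time of writing; whether your particular install exposes it is an assumption.

```python
# Sketch: list the compute devices TensorFlow can see.
# If CUDA is set up correctly, a "/device:GPU:0" entry should
# appear alongside the CPU.
import tensorflow as tf
from tensorflow.python.client import device_lib

for device in device_lib.list_local_devices():
    print(device.name, device.device_type)
```

If no GPU entry shows up, the usual suspects are a mismatched CUDA/driver version or a CPU-only TensorFlow build.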

The images below show that things are in working order when testing TensorFlow with GPU acceleration. More fun stuff to come soon!

CUDA sample executables

No errors on the test of 2D convolution using FFT

Running a Docker image of the latest GPU TensorFlow build.

tf.reduce_sum(tf.random_normal([1000, 1000])) yields 1516.9102
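That one-liner is the sanity check from the TensorFlow Docker quickstart. Expanded into a small script, it looks like the sketch below; I've written it against the tf.compat.v1 / tf.random.normal spelling, which assumes a TensorFlow version of 1.13 or later. The exact number varies per run: it is the sum of a million standard-normal samples, so it should land within a few thousand of zero, and a value like 1516.9 is entirely plausible.

```python
import tensorflow as tf

# Enable eager execution so the tensor evaluates immediately
# (a no-op on builds where eager is already the default).
tf.compat.v1.enable_eager_execution()

# Sum 10^6 standard-normal samples. The sum is itself normally
# distributed with standard deviation sqrt(10^6) = 1000.
result = tf.reduce_sum(tf.random.normal([1000, 1000]))
print(result.numpy())
```

Seeing any finite number here (rather than a CUDA initialization error) is the point of the test: it confirms the framework, driver, and GPU runtime are all talking to each other.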