This is a project I have been building, on and off. The idea started as a Perceptron I coded while taking Andrew Ng's Machine Learning course on Coursera. The project then hibernated for many months before waking up as a simple Multi-Layer Perceptron, and later a set of Fully Connected layers for classifying datasets from the UCI Machine Learning Repository. One dataset I had in my cross-hairs was the Leeds Butterfly dataset.
I then took the project more seriously and added other types of layers, one at a time.
The project then evolved to include a Network Description Language: layers can now be constructed by reading a network specification from a text file. It also supports multiple error functions, gradient descent algorithms, and activation functions, along with a GPGPU implementation.
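To give a feel for what a network description might look like, here is a hypothetical specification for a small MNIST-style classifier. The keywords and layout below are illustrative only; the project's actual NDL syntax may differ.

```
# Hypothetical network description (actual NDL syntax may differ)
input     28 28 1
conv      filters=8 kernel=5 activation=relu
maxpool   size=2
fc        units=128 activation=relu
fc        units=10 activation=softmax
error     cross_entropy
optimizer sgd lr=0.01
```

A parser for such a file can build the layer stack line by line, which is what makes it easy to experiment with different architectures without recompiling.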
This project achieves 98%+ classification accuracy on MNIST and about 67% on CIFAR-10.
CUDA in Neural Networks Project
My attempt at this project was largely driven by a desire to understand and implement the mathematics of the backpropagation algorithm in C/C++. However, I also wanted to bring in GPGPU, if for nothing else than to see the training epochs fly. I have implemented the Fully-Connected and Convolution layers in CUDA, with more to follow, including the plumbing to connect these layers.
Overall architecture of Neural Networks Project
Running of the code