myCNN

myCNN (short for "my Convolutional Neural Network") is a MATLAB implementation of convolutional neural networks (CNNs).

The first CNN appeared in the work of Fukushima in 1980 and was called Neocognitron. The basic architectural ideas behind the CNN (local receptive fields, shared weights, and spatial or temporal subsampling) allow such networks to achieve some degree of shift and deformation invariance and at the same time reduce the number of training parameters.
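These architectural ideas can be illustrated with a minimal sketch (in Python/NumPy rather than MATLAB, and independent of the myCNN code): one small shared kernel is scanned over the whole image, so the number of trainable weights stays fixed no matter how large the input is, and a subsampling step adds a degree of shift invariance.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D correlation with a single shared kernel: every output
    neuron sees only a local patch (local receptive field), and all of
    them reuse the same kernel weights (weight sharing)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            # each output value depends only on a kh-by-kw patch
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

def subsample2x2(fmap):
    """2x2 average subsampling: halves the spatial resolution, which is
    what makes the representation tolerant to small shifts."""
    h, w = fmap.shape
    f = fmap[:h // 2 * 2, :w // 2 * 2]          # drop odd row/column, if any
    return f.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
```

For a 28×28 input and a 5×5 kernel, the feature map is 24×24, and subsampling reduces it to 12×12, while the layer still has only 25 shared weights.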

Since 1989, Yann LeCun and co-workers have introduced a series of CNNs under the general name LeNet, which, in contrast to the Neocognitron, use supervised training. The major advantage of this approach is that the whole network is optimized for the given task, making it usable for real-world applications.

LeNet has been successfully applied to character recognition, generic object recognition, face detection and pose estimation, obstacle avoidance in an autonomous robot, and more.

The myCNN class allows you to create, train, and test classic convolutional networks (e.g., LeNet) as well as more general networks with the following features:

    • any directed acyclic graph can be used for connecting the layers of the network;

    • the network can have any number of arbitrarily sized input and output layers;

    • the neuron’s receptive field (RF) can have an arbitrary stride (the step of local RF tiling), which means that RFs can overlap in the S-layer, and the stride can differ from 1 in the C-layer;

    • any layer or feature map of the network can be switched between trainable and nontrainable modes, even during training;

    • a new layer type: the softmax-like M-layer.
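The arbitrary RF stride above determines the feature-map size: with input size n, RF size k, and stride s, each dimension of the output has floor((n − k) / s) + 1 units under "valid" tiling. A brief sketch of this formula together with the softmax normalization an M-layer is built around (Python/NumPy, hypothetical helper names, not the myCNN API):

```python
import numpy as np

def fmap_size(n, rf, stride):
    """Output size per dimension when an RF of size `rf` is tiled with
    the given stride over an input of size n ('valid' tiling)."""
    return (n - rf) // stride + 1

def softmax(x, axis=-1):
    """Numerically stable softmax along one axis: the kind of
    normalization a softmax-like layer applies to its inputs."""
    z = x - np.max(x, axis=axis, keepdims=True)   # shift for stability
    e = np.exp(z)
    return e / np.sum(e, axis=axis, keepdims=True)
```

For example, a 5×5 RF with stride 1 over a 32×32 input gives a 28×28 map (a classic C-layer); a 2×2 RF with stride 2 over 28×28 gives 14×14 (a classic non-overlapping S-layer); while a 3×3 RF with stride 2 gives 13×13 with overlapping RFs.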

The archive contains the myCNN class source (with comments) and a few simple examples of myCNN class usage. For proper training, some of the examples need the MNIST dataset, which can be downloaded directly from Yann LeCun's website or, in MATLAB format, from here (training data) and here (test data).

Changelog:

2013-09-27-11-33 myCNN-0.08

    • the "quick" implementations of the squashing functions (qsquash(), qdsquash(), and qsquash_and_dsquash()) for variables of type single were removed, as they are actually much slower than the conventional versions

    • thanks to Aniket Vartak, a bug was found in the qsquash() and qsquash_and_dsquash() functions

2009-09-30-14-33 myCNN-0.07

    • a graphical demo (demo_myCNN) added (thanks to Mikhail Sitotenko)

    • an example of a pretrained network (myLeNet5-example.mat) with the new M-layer added

    • the myCNN constructor was fixed (myCNN objects can now be properly loaded from a MAT file)

2009-09-28-14-30 myCNN-0.06

    • the first attempt to make the M-layer trainable

    • two new private functions, unfold.m and unfold2.m, added

    • soft_max now supports n-dimensional arrays (not only 2D and 3D)

    • a little clean-up of the code

    • more elaborate myCNN object description (myCNN-description.pdf)

2009-09-23-11-38 myCNN-0.05

    • bug in init_net (weight initialization) fixed

    • example2.m added

    • doc folder added

    • first attempt to document the myCNN object (doc/myCNN-description.pdf)

2009-09-09-09-09 myCNN-0.04

    • ChangeLog added

2009-09-08-09-59 myCNN-0.03

    • content.m added

    • read_idx_data.m fixed

2009-09-07-22-54 myCNN-0.02

    • comments added to the example

2009-09-07-04-11 myCNN-0.01

    • the first release, submitted to MathWorks File Exchange