Assignment 2 and update on grading rules

Post date: Oct 16, 2013 9:20:59 PM

Hi All!

First of all, I have posted Assignment 2. It's due next Wednesday. Tomorrow I should post lecture notes covering the basic equations for signal propagation through the network.

The assignment is best implemented in a language with good matrix support. If you use something other than Matlab or Python+numpy, find a matrix library: for Java this can be the Efficient Java Matrix Library, and for C/C++ something based on a good BLAS implementation, such as ATLAS or MKL (these state-of-the-art numerical libraries also underlie the matrix support in Matlab and numpy).
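If you go the Python+numpy route, here is a minimal sketch of the kind of matrix-based computation involved: propagating a batch of inputs through a single fully connected layer in one matrix product. The layer sizes and the sigmoid activation are illustrative assumptions on my part, not the assignment's actual setup.

    import numpy as np

    def sigmoid(z):
        # Elementwise logistic activation.
        return 1.0 / (1.0 + np.exp(-z))

    # Hypothetical sizes: 4 inputs, 3 units, a batch of 5 examples.
    np.random.seed(0)
    W = 0.1 * np.random.randn(3, 4)   # weight matrix
    b = np.zeros(3)                   # bias vector
    X = np.random.randn(5, 4)         # one example per row

    # One layer of forward propagation: the whole batch in one call,
    # handled by BLAS under the hood instead of explicit Python loops.
    A = sigmoid(np.dot(X, W.T) + b)   # activations, shape (5, 3)
    print(A.shape)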

Some of you didn't submit anything for the first assignment, so I decided to clarify the grading rules:

1. Assignments will be posted during the first weeks of the course, before we start working on the projects. There will be 2-3 more assignments.

2. The assignments will contribute 30% towards the lab grade, the project will contribute the remaining 70%.

3. You get 5 late days for the assignments; I will not grade submissions once you are out of late days. This means that those of you who didn't submit anything today can submit your solutions on Monday at the latest. That will use up all 5 of your late days, and all subsequent submissions will need to be on time.

The next lecture will focus on more practical aspects of training and validation, and we will briefly introduce learning theory, i.e. how to tell whether our classifier will be able to generalize to cases not seen during training. I will introduce some project ideas in two to three weeks, but PLEASE START THINKING ABOUT YOUR OWN. I'm sure you will have lots of great ideas for the projects, and I will be more than happy to help you with them!

I have posted two links to papers that I consider important:

    • The paper that (re)introduced backpropagation: http://www.cs.toronto.edu/~hinton/absps/naturebp.pdf

    • The best reference on backpropagation training of networks: http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf

Yann LeCun's paper in particular is one of the best hands-on discussions of the tricks that make backpropagation work. If the name tells you nothing: he's the one behind the digit recognizer I showed in the first lecture (the one that worked on smudged/crossed-out/moving digits).

If you have trouble with the exercises or want some clarification, please email me! I need your feedback to make the course better.