

This is the main website for the Fall 2015 course on Neural Networks. Please come back for announcements.

Course rules: course_rules.pdf

Office hours: We 13-15, 203

Exam: 5.2.2016, 14:15 - 17:00
Makeup Exam: 18.2.2016, 9:15 - 12:00 in 141

Please remember to email us at:
jch+nn, alan+nn, patrykfi+nn AT cs.uni.wroc.pl

Grade thresholds for the exercises (total number of points was 104):
50: 3.0    60: 3.5    70: 4    80: 4.5    90: 5

Grade thresholds for the exam (total number of points on the exam was 75 + 10% of points from assignments):
35: 3.0    44: 3.5    53: 4.0    62: 4.5    70: 5.0
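The exam rule above (exam points plus 10% of assignment points, mapped through the thresholds) can be sketched as a small helper. This is an illustration of the stated arithmetic, not an official grading script; the function name is my own.

```python
# Hypothetical helper: compute the exam total as
# exam points + 10% of assignment points, then map it
# to a grade using the published thresholds.

def exam_grade(exam_points, assignment_points):
    total = exam_points + 0.10 * assignment_points
    thresholds = [(70, 5.0), (62, 4.5), (53, 4.0), (44, 3.5), (35, 3.0)]
    for limit, grade in thresholds:
        if total >= limit:
            return grade
    return 2.0  # below the lowest passing threshold

# e.g. 40 exam points and 80 assignment points give 40 + 8 = 48 -> 3.5
print(exam_grade(40, 80))  # 3.5
```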

Materials for lecture 15

posted Feb 1, 2016, 8:07 AM by Jan Chorowski   [ updated Feb 1, 2016, 8:07 AM ]


The project details and deadline extension

posted Jan 31, 2016, 3:55 AM by Patryk Filipiak   [ updated Jan 31, 2016, 3:55 AM by Jan Chorowski ]

The deadline for submitting your projects has been extended to Saturday, February 20th.

I would like to remind you that the minimum requirement for completing the assignment and obtaining the grade '3' is to implement a classifier that reaches at least 75% accuracy on the CIFAR-10 test data. A proper solution must consist of the following three parts:

* a piece of software (in the form of a source code),

* test results obtained with the test batch described above,

* a short description of the proposed algorithm, at least 500 words in English (the quality of the English will not be assessed).

The test results stated above must contain a full listing of the following pairs (one line per item):

<nn_classification_result> <true_image_label>
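As a sketch of the required listing (not the official grading script), the following writes one "<nn_classification_result> <true_image_label>" pair per line and computes the test accuracy; `predictions` and `true_labels` are assumed to be lists of CIFAR-10 class indices from your classifier.

```python
# Illustrative only: dummy values stand in for real model output.
predictions = [3, 8, 8, 0]
true_labels = [3, 8, 1, 0]

# Write the required listing: one "<predicted> <true>" pair per line.
with open("results.txt", "w") as f:
    for pred, true in zip(predictions, true_labels):
        f.write("%d %d\n" % (pred, true))

# Accuracy over the test batch.
correct = sum(p == t for p, t in zip(predictions, true_labels))
accuracy = correct / float(len(true_labels))
print("accuracy: %.2f" % accuracy)  # 0.75 on these dummy values
```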

Keep in mind that the project is focused on scientific work rather than software engineering. Grades higher than '3' will be given to students who achieve a significant improvement over the minimum 75% accuracy level.

Your endeavors must be described in detail on your blogs, with convincing justifications, e.g.:

* If you apply algorithm A and/or B, explain why you think choosing them was reasonable.

* If you select a certain value of a given parameter, justify your choice by comparing the results of other variants or by referring to a scientific publication that advocates this choice in a similar case.

Note that scientific work does not aim exclusively at reaching the highest possible scores. It also tries to answer questions like:

* What is the area of applicability of the proposed approach?

* What is the cost of using it? What has to be assumed to apply it? How computationally efficient is it?

* What are the main weaknesses of the proposed approach?

* Where is it, and where is it not, reasonable to apply it?

* What are the possible further improvements of the approach? What do you think is worth trying further, and what is not?

* Can your results be measured in other ways? Using different metrics? In different contexts?

Answers to the above questions (and similar ones) will carry great weight when we assess your projects.

Nice tutorial with rules of thumb for convolutional layer sizes

posted Jan 29, 2016, 4:24 AM by Jan Chorowski   [ updated Jan 29, 2016, 4:24 AM ]

See https://www.kaggle.com/c/datasciencebowl/forums/t/13166/happy-lantern-festival-report-and-code/69196 and especially https://kaggle2.blob.core.windows.net/forum-message-attachments/69182/2287/A%20practical%20theory%20for%20designing%20very%20deep%20convolutional%20neural%20networks.pdf?sv=2012-02-12&se=2016-02-01T12%3A11%3A48Z&sr=b&sp=r&sig=0BGPePJgeX146I1zlltk3mVfYyb%2ByEiDNq7PA7rfk%2Bw%3D
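When sizing convolutional layers, one standard piece of arithmetic is the spatial output size floor((n - k + 2p)/s) + 1 for input size n, kernel k, padding p, and stride s. A quick helper (my own illustration, not taken from the linked report):

```python
# Spatial output size of a convolution or pooling layer:
# floor((n - k + 2*p) / s) + 1
def conv_out_size(n, k, p=0, s=1):
    return (n - k + 2 * p) // s + 1

# CIFAR-10 images are 32x32; a 3x3 conv with padding 1 keeps the size:
print(conv_out_size(32, 3, p=1))  # 32
# while a 2x2 max-pool with stride 2 halves it:
print(conv_out_size(32, 2, s=2))  # 16
```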

Materials for lecture 14

posted Jan 29, 2016, 4:08 AM by Jan Chorowski   [ updated Jan 29, 2016, 4:08 AM ]

Materials for lecture 13

posted Jan 29, 2016, 4:07 AM by Jan Chorowski   [ updated Jan 29, 2016, 4:07 AM ]

I have also uploaded the RNN demo with a preview of the LSTM gates.

Theano sessions

posted Jan 13, 2016, 2:56 AM by Jan Chorowski   [ updated Jan 14, 2016, 7:30 AM ]

This and next week (13-20.1.2016), during lab sessions, I will offer Theano tutorials. We will follow my materials prepared for the "Matlab, R, Python" course: https://sites.google.com/a/cs.uni.wroc.pl/jch/teaching/teaching/marpy/materialyzwykladow14i15/wyklad14.ipynb

I attach my .theanorc configuration file. Please copy it to your home directory. In principle, on all computers in lab 110 you should be able to execute code on both the CPU and the GPU. You control the execution device by setting the THEANO_FLAGS=device=cpu or THEANO_FLAGS=device=gpu environment variable.
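For instance, the setup and device selection might look like this from a shell (here `my_script.py` is a placeholder for your own Theano script):

```shell
# Copy the attached Theano config to your home directory:
cp .theanorc ~/

# Run the same script on the CPU:
THEANO_FLAGS=device=cpu python my_script.py

# ...or on the GPU:
THEANO_FLAGS=device=gpu python my_script.py
```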

Please come. For more materials, please see:
  1. Tutorials by Alec Radford: https://github.com/Newmu/Theano-Tutorials
  2. Deep learning tutorials: http://deeplearning.net/tutorial/
  3. Theano tutorials: http://deeplearning.net/software/theano/tutorial/
  4. My notebook from the Matlab, R, and Python class: http://nbviewer.ipython.org/urls/sites.google.com/a/cs.uni.wroc.pl/jch/teaching/teaching/marpy/materialyzwykladow14i15/wyklad14.ipynb

There are many neural network libraries built on top of Theano:

  1. a lightweight one, nice for learning during this course: https://github.com/Lasagne/Lasagne
  2. a more heavyweight one, meant for research: https://github.com/mila-udem/blocks

Materials for lecture 12

posted Jan 13, 2016, 2:53 AM by Jan Chorowski   [ updated Jan 15, 2016, 3:17 PM ]

We have talked about recurrent networks and LSTMs.
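To recall the LSTM equations from the lecture, here is a minimal single-step LSTM cell in NumPy. This is an illustrative sketch only; the weight layout and names are my own, not taken from the course notebooks.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    # W stacks the four gate weight matrices: shape (input+hidden, 4*hidden)
    z = np.concatenate([x, h_prev]) @ W + b
    H = h_prev.shape[0]
    i = sigmoid(z[0*H:1*H])   # input gate
    f = sigmoid(z[1*H:2*H])   # forget gate
    o = sigmoid(z[2*H:3*H])   # output gate
    g = np.tanh(z[3*H:4*H])   # candidate cell update
    c = f * c_prev + i * g    # new cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c

rng = np.random.RandomState(0)
X, H = 4, 3                   # input and hidden sizes
W = rng.randn(X + H, 4 * H) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step(rng.randn(X), np.zeros(H), np.zeros(H), W, b)
print(h.shape, c.shape)       # (3,) (3,)
```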

Important: to run the notebooks, please copy the attached .theanorc file to your home directory. Also use the latest assignment starter code (it includes Theano and points to a simple .theanorc file).

Materials for lecture 11

posted Jan 13, 2016, 2:50 AM by Jan Chorowski   [ updated Jan 13, 2016, 2:50 AM ]

We have talked about PAC learning and we had a guest lecture by Lukasz Kaiser.

Materials for lectures 7 & 10

posted Dec 31, 2015, 2:21 AM by Jan Chorowski   [ updated Dec 31, 2015, 2:21 AM ]

Materials for lectures 8 & 9

posted Dec 14, 2015, 12:54 PM by Adrian Łańcucki   [ updated Dec 31, 2015, 2:15 AM by Jan Chorowski ]

Lectures on SVM have been largely based on Andrew Ng's notes for the Stanford CS229 course. The following excellent resources should help you review the material. Note that Andrew Ng introduces the functional margin, which we have omitted for simplicity and fixed at 1 right from the start.
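Concretely, fixing the functional margin at 1 gives the hard-margin primal problem in the form we used in lecture (standard formulation, written here for reference):

```latex
\min_{w, b} \quad \frac{1}{2} \|w\|^2
\qquad \text{subject to} \qquad
y^{(i)} \left( w^\top x^{(i)} + b \right) \ge 1, \quad i = 1, \dots, m.
```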
