DeSR now has a new classifier based on a multi-layer perceptron (MLP), built from code contributed by Joseph Turian of the University of Montreal.
The original code was written in Python using Theano, a sophisticated library for the manipulation and evaluation of mathematical expressions.
Theano lets one write symbolic expressions in Python syntax that can be transformed automatically, for instance to compute the derivatives and gradients used in the backpropagation algorithm, and that are compiled into C code before evaluation.
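To illustrate what Theano automates, here is a hedged sketch in plain NumPy (not Theano, and not the DeSR code): for a tiny expression y = w*x + b under a squared loss, the gradients that backpropagation needs follow from the chain rule, and can be checked against a numerical derivative.

```python
import numpy as np

# Illustrative only: Theano derives these gradients symbolically;
# here they are written out by hand for a one-parameter example.
def loss(w, b, x, t):
    y = w * x + b
    return 0.5 * (y - t) ** 2

def grads(w, b, x, t):
    y = w * x + b
    dy = y - t           # dL/dy
    return dy * x, dy    # dL/dw, dL/db by the chain rule

# Sanity check against a central-difference numerical derivative.
w, b, x, t = 0.7, -0.2, 1.5, 1.0
gw, gb = grads(w, b, x, t)
eps = 1e-6
num_gw = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
assert abs(gw - num_gw) < 1e-6
```

Theano performs this differentiation symbolically for arbitrarily large expression graphs, which is what makes writing the MLP training loop so compact.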
I translated the code to C++, obtaining substantial speed improvements, though of course losing some flexibility.
Training a model on the full CoNLL English corpus now takes 6 hours instead of 4 days and achieves our best score so far for a single model:
Labeled Attachment Score: 87.03 % (4354/5003)
Unlabeled Attachment Score: 88.17 % (4411/5003)
This beats DeSR's previous best at CoNLL 2007, obtained with a second-order Averaged Perceptron:
Labeled Attachment Score: 85.85 % (4295/5003)
Unlabeled Attachment Score: 86.99 % (4352/5003)
The MLP was built with a hidden layer of 320 units.
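As a minimal sketch of such an architecture, the following NumPy forward pass uses a single hidden layer of 320 units as described above; the input and output sizes here are hypothetical placeholders, not DeSR's actual feature or action counts.

```python
import numpy as np

rng = np.random.default_rng(0)
# n_hidden matches the 320 units mentioned above;
# n_in and n_out are illustrative assumptions.
n_in, n_hidden, n_out = 100, 320, 40

W1 = rng.normal(scale=0.01, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.01, size=(n_hidden, n_out))
b2 = np.zeros(n_out)

def forward(x):
    h = np.tanh(x @ W1 + b1)           # hidden layer, 320 units
    scores = h @ W2 + b2               # one score per parser action
    e = np.exp(scores - scores.max())  # numerically stable softmax
    return e / e.sum()

p = forward(rng.normal(size=n_in))
assert p.shape == (n_out,) and abs(p.sum() - 1.0) < 1e-9
```

In a transition-based parser such as DeSR, the softmax output would be read as a distribution over the next parsing action given the current configuration's features.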
Different settings could be explored to see if accuracy can improve further.
I plan to train models for all other languages and make them available.
The models are in fact quite reasonable in size (57 MB for English), and the program consumes little memory when running (61 MB).
The code is available for download at SourceForge.