There are many libraries for Machine Learning these days. Some are powerful, some are modular and some are just different. One of those is Theano. Instead of providing implementations of common algorithms, the library allows you to optimize arbitrary cost functions with gradient descent. Combined with its ability to perform symbolic, automatic differentiation, this lets the programmer focus on the problem at hand instead of deriving and manually checking all those complex gradient expressions required to optimize a given cost function. Plus, the library can optimize the underlying computation graph to make the numerical optimization more stable and efficient.
The library is our first choice if we want to evaluate a new algorithm, either one published in some paper or one based on our own ideas. Since the library is pretty low-level, you need some code to handle common boilerplate tasks, but once you have that, it is straightforward to evaluate new cost functions. Especially in the case of neural networks, the 'fprop' part, that is, the forward propagation of some input through a trained network to obtain an output (which can be a label, a tag, or even a real-valued vector), can be implemented very efficiently.
This post is just a reminder of how cool and versatile Theano is, and if you haven't heard of it, today is a good day to give it a try. Plus, a new version, 0.7 (a release candidate actually, but it makes a pretty good impression), was released on March 15. In a nutshell, Theano is a pythonic library that allows you to optimize user-defined cost functions, like Neural Networks, very efficiently, if you are not afraid to get your hands a little dirty.