Nearest Neighbor in Space – Part I

When we trained our first Siamese network, we used black-and-white images of handwritten digits to make evaluation more efficient, and we were a little disappointed with the performance. The 2D embedding of the model output showed that the classes were separated, but only to some degree. The reason we bring this up again is that we plan to evaluate a new model that combines a learned embedding, trained with autoencoders or RBMs, with k-nearest neighbors (kNN).
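To make the idea concrete, here is a minimal sketch of the combined model: map the raw images into the learned embedding space, then run an off-the-shelf kNN classifier there. The `embed` function is a hypothetical stand-in for any trained mapping (Siamese network, autoencoder, or RBM features) and is not from the original post; the toy data merely makes the sketch self-contained.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def embed(images: np.ndarray) -> np.ndarray:
    # Placeholder for a trained model that maps raw pixels to a
    # low-dimensional embedding; here we simply flatten the images
    # so the sketch runs end to end.
    return images.reshape(len(images), -1).astype(np.float32)

# Toy data standing in for b/w digit images (e.g. 28x28 pixels).
rng = np.random.default_rng(0)
X_train = rng.random((1000, 28, 28))
y_train = rng.integers(0, 10, size=1000)
X_test = rng.random((100, 28, 28))

# The key point: kNN operates in the learned embedding space,
# not on the raw pixel values.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(embed(X_train), y_train)
predictions = knn.predict(embed(X_test))
```

With a good embedding, distances in that space reflect class similarity, which is exactly what kNN needs to work well.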

Again, we will start with digits, because the limited number of classes allows an easier interpretation of the results. But this time we decided to start with a larger set of digits, and indeed the learned embedding (we used a Siamese neural network as a first test) showed much better results. Now the classes are more clearly separated, and there is a visibly larger margin between them. Lessons learned: there can never be enough data, and sometimes early stopping can hurt the performance of a model.
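For completeness, here is a small sketch of how one might eyeball class separation in a 2D embedding, as described above. The names `embedding_2d` and `labels` are hypothetical; any (N, 2) array of embedded points with integer class labels works, and the toy data below only serves to make the snippet runnable.

```python
import matplotlib.pyplot as plt
import numpy as np

# Toy 2D points clustered per class, standing in for real model output.
rng = np.random.default_rng(1)
labels = rng.integers(0, 10, size=500)
embedding_2d = rng.normal(size=(500, 2)) + 3 * np.stack(
    [np.cos(labels), np.sin(labels)], axis=1
)

# Color each embedded point by its class to inspect separation and margins.
plt.scatter(embedding_2d[:, 0], embedding_2d[:, 1], c=labels, cmap="tab10", s=8)
plt.colorbar(label="digit class")
plt.title("2D embedding colored by class")
plt.show()
```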

In the next post, we will discuss the new kNN model in more detail.
