Pattern Recognition with Slow Feature Analysis

Berkes, Pietro (2005) Pattern Recognition with Slow Feature Analysis. [Preprint]

Full text available as:

PDF (1016Kb)
Postscript (1584Kb)

Abstract

Slow feature analysis (SFA) is a new unsupervised algorithm that learns nonlinear functions extracting slowly varying signals from the input data. In this paper we describe its application to pattern recognition. In this context, in order to be slowly varying, the functions learned by SFA need to respond similarly to patterns belonging to the same class. We prove that, given input patterns belonging to C non-overlapping classes and a large enough function space, the optimal solution consists of C-1 output signals that are constant for each individual class. As a consequence, their output provides a feature space suitable for classification with simple methods, such as Gaussian classifiers. As an example, we then apply SFA to the MNIST handwritten digits database; its performance is comparable to that of other established algorithms. Finally, we suggest some possible extensions to the proposed method. Our approach is particularly attractive because, for a given input signal and a fixed function space, it has no parameters, it is easy to implement and apply, and it has low memory requirements and high speed during recognition. SFA finds the global solution (within the considered function space) in a single iteration without convergence issues. Moreover, the proposed method is completely problem-independent.
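As a concrete illustration of the procedure the abstract describes, the sketch below adapts SFA's slowness objective to labeled patterns: the nonlinearly expanded inputs are whitened, "slowness" is measured as the average squared output difference over pairs of patterns from the same class, and the slowest C-1 directions give the (approximately) class-constant features on which a simple Gaussian classifier can be trained. This is a minimal NumPy sketch based only on the abstract, not the author's implementation; the quadratic expansion, the function names, and the rank-deficiency threshold are our assumptions.

    import numpy as np

    def quadratic_expansion(X):
        # All monomials of degree <= 2: one common choice of SFA function space.
        iu, ju = np.triu_indices(X.shape[1])
        return np.hstack([X, X[:, iu] * X[:, ju]])

    def fit_sfa_features(X, y, n_classes):
        # Nonlinearly expand, center, and whiten the training patterns.
        H = quadratic_expansion(X)
        mean = H.mean(axis=0)
        Hc = H - mean
        evals, evecs = np.linalg.eigh(Hc.T @ Hc / len(Hc))
        keep = evals > 1e-10                      # drop near-singular directions
        W = evecs[:, keep] / np.sqrt(evals[keep])
        Z = Hc @ W
        # "Slowness" for patterns: average squared difference over all pairs
        # from the same class.  Per class, sum_{i<j} (z_i - z_j)(z_i - z_j)^T
        # = n_c * Zc^T Zc - s s^T with s = sum_i z_i, so no explicit pair loop.
        D = np.zeros((Z.shape[1], Z.shape[1]))
        n_pairs = 0
        for c in range(n_classes):
            Zc = Z[y == c]
            s = Zc.sum(axis=0)
            D += len(Zc) * (Zc.T @ Zc) - np.outer(s, s)
            n_pairs += len(Zc) * (len(Zc) - 1) // 2
        D /= n_pairs
        # The slowest directions are the eigenvectors of D with the smallest
        # eigenvalues; C-1 of them span the class-constant feature space.
        _, dvecs = np.linalg.eigh(D)
        P = W @ dvecs[:, :n_classes - 1]
        return mean, P

    # Usage: map patterns into the (C-1)-dimensional feature space and fit
    # a Gaussian classifier (e.g. one Gaussian per class) on top.
    # mean, P = fit_sfa_features(X_train, y_train, n_classes=10)
    # F_train = (quadratic_expansion(X_train) - mean) @ P

Note that the closed-form per-class accumulation of the difference covariance avoids looping over all same-class pairs, which matters for a dataset the size of MNIST.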

Item Type: Preprint
Keywords: slow feature analysis, pattern recognition, digit recognition, unsupervised feature extraction
Subjects: Neuroscience > Computational Neuroscience; Computer Science > Machine Learning; Computer Science > Neural Nets
ID Code: 4104
Deposited By: Berkes, Pietro
Deposited On: 16 Feb 2005
Last Modified: 11 Mar 2011 08:55

References in Article


Bishop, C. M., 1995. Neural Networks for Pattern Recognition. Oxford University Press.

Bray, A., Martinez, D., 2002. Kernel-based extraction of Slow Features: Complex cells learn disparity and translation invariance from natural images. In: NIPS 2002 proceedings.

Burges, C. J. C., 1998. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery 2 (2), 121-167.

Gantmacher, F. R., 1959. Matrix Theory. Vol. 1. AMS Chelsea Publishing.

Hashimoto, W., 2003. Quadratic forms in natural images. Network: Computation in Neural Systems 14 (4), 765-788.

LeCun, Y., Bottou, L., Bengio, Y., Haffner, P., 1998. Gradient-based learning applied to document recognition. Proceedings of the IEEE 86 (11), 2278-2324.

Müller, K.-R., Mika, S., Rätsch, G., Tsuda, K., Schölkopf, B., 2001. An introduction to kernel-based learning algorithms. IEEE Transactions on Neural Networks 12 (2), 181-202.

Simard, P., LeCun, Y., Denker, J., 1993. Efficient pattern recognition using a new transformation distance. In: Hanson, S., Cowan, J., Giles, L. (Eds.), Advances in Neural Information Processing Systems. Vol. 5. Morgan Kaufmann.

Simard, P. Y., LeCun, Y., Denker, J. S., Victorri, B., 2000. Transformation invariance in pattern recognition - tangent distance and tangent propagation. International Journal of Imaging Systems and Technology 11 (3).

Wiskott, L., 1998. Learning invariance manifolds. In: Niklasson, L., Bodén, M., Ziemke, T. (Eds.), Proc. Intl. Conf. on Artificial Neural Networks, ICANN 98, Skövde. Perspectives in Neural Computing. Springer, pp. 555-560.

Wiskott, L., Sejnowski, T., 2002. Slow feature analysis: Unsupervised learning of invariances. Neural Computation 14 (4), 715-770.
