Sprekeler, Henning and Wiskott, Laurenz (2008) Understanding Slow Feature Analysis: A Mathematical Framework. [Preprint]
Full text available as: PDF (Submitted Version, 285 KB)
Abstract
Slow feature analysis is an algorithm for unsupervised learning of invariant representations from data with temporal correlations. Here, we present a mathematical analysis of slow feature analysis for the case where the input-output functions are not restricted in complexity. We show that the optimal functions obey a partial differential eigenvalue problem of a type that is common in theoretical physics. This analogy allows the transfer of mathematical techniques and intuitions from physics to concrete applications of slow feature analysis, thereby providing the means for analytical predictions and a better understanding of simulation results. We put particular emphasis on the situation where the input data are generated from a set of statistically independent sources. The dependence of the optimal functions on the sources is calculated analytically for the cases where the sources have a Gaussian or a uniform distribution.
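As a point of reference for readers unfamiliar with the algorithm, the sketch below shows a minimal linear version of slow feature analysis in NumPy: whiten the input, approximate time derivatives by finite differences, and take the directions of slowest variation as the eigenvectors of the derivative covariance with the smallest eigenvalues. This is only an illustrative assumption-laden toy (the function `linear_sfa` and the toy signal are ours, not the paper's), and it covers only the linear case, whereas the preprint analyses the unrestricted function space.

```python
# Minimal sketch of *linear* slow feature analysis (SFA), for illustration only.
# The paper treats the unrestricted (nonlinear) case; this code does not.
import numpy as np

def linear_sfa(x, n_features=1):
    """Extract the slowest linear features from a signal x of shape (T, D)."""
    x = x - x.mean(axis=0)                      # center the data
    cov = np.cov(x, rowvar=False)               # input covariance
    eigval, eigvec = np.linalg.eigh(cov)
    whitener = eigvec / np.sqrt(eigval)         # whitening matrix (assumes full rank)
    z = x @ whitener                            # whitened signal, unit covariance
    dz = np.diff(z, axis=0)                     # finite-difference time derivative
    dcov = np.cov(dz, rowvar=False)             # covariance of the derivative
    dval, dvec = np.linalg.eigh(dcov)           # eigenvalues in ascending order
    w = dvec[:, :n_features]                    # slowest directions
    return z @ w, whitener @ w                  # slow features and weight vectors

# Toy usage: a slow sine hidden in a rotated mixture with a faster sine.
t = np.linspace(0, 2 * np.pi, 1000)
sources = np.column_stack([np.sin(t), np.sin(11 * t)])
mixed = sources @ np.random.default_rng(0).normal(size=(2, 2))
slow, weights = linear_sfa(mixed, n_features=1)  # should recover the slow sine
```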
| Item Type: | Preprint |
|---|---|
| Keywords: | slow feature analysis, unsupervised learning, invariant representations, statistically independent sources, theoretical analysis |
| Subjects: | Neuroscience > Computational Neuroscience; Computer Science > Machine Learning |
| ID Code: | 6223 |
| Deposited By: | Sprekeler, Henning |
| Deposited On: | 16 Oct 2008 13:47 |
| Last Modified: | 11 Mar 2011 08:57 |