creators_name: Sprekeler, Henning
creators_name: Wiskott, Laurenz
creators_id: henning.sprekeler@epfl.ch
creators_id: l.wiskott@biologie.hu-berlin.de
type: preprint
datestamp: 2008-10-16 13:47:35
lastmod: 2011-03-11 08:57:12
metadata_visibility: show
title: Understanding Slow Feature Analysis: A Mathematical Framework
subjects: comp-neuro-sci
subjects: comp-sci-mach-learn
full_text_status: public
keywords: slow feature analysis, unsupervised learning, invariant representations, statistically independent sources, theoretical analysis
abstract: Slow feature analysis is an algorithm for unsupervised learning of invariant representations from data with temporal correlations. Here, we present a mathematical analysis of slow feature analysis for the case where the input-output functions are not restricted in complexity. We show that the optimal functions obey a partial differential eigenvalue problem of a type that is common in theoretical physics. This analogy allows the transfer of mathematical techniques and intuitions from physics to concrete applications of slow feature analysis, thereby providing the means for analytical predictions and a better understanding of simulation results. We put particular emphasis on the situation where the input data are generated from a set of statistically independent sources. The dependence of the optimal functions on the sources is calculated analytically for the cases where the sources have a Gaussian or a uniform distribution.
date: 2008-08-19
date_type: submitted
refereed: FALSE
citation: Sprekeler, Henning and Wiskott, Laurenz (2008) Understanding Slow Feature Analysis: A Mathematical Framework. [Preprint]
document_url: http://cogprints.org/6223/2/SprekelerWiskott08.pdf
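Note: the abstract refers to the "optimal functions" of slow feature analysis. As a point of orientation only (not part of the original record), the standard SFA optimization problem from the literature can be sketched as below; the symbols x(t), g_j, and y_j follow the usual convention and are introduced here purely for illustration.

% Sketch of the standard SFA optimization problem, as commonly stated in
% the SFA literature; included for orientation, not quoted from this record.
% Given a time-dependent input signal x(t), find functions g_j whose output
% signals y_j(t) = g_j(x(t)) vary as slowly as possible over time:
\begin{align}
  \text{minimize}\quad   & \Delta(y_j) = \langle \dot{y}_j^{\,2} \rangle_t \\
  \text{subject to}\quad & \langle y_j \rangle_t = 0                 && \text{(zero mean)} \\
                         & \langle y_j^{2} \rangle_t = 1             && \text{(unit variance)} \\
                         & \langle y_i\, y_j \rangle_t = 0,\quad i<j && \text{(decorrelation and order)}
\end{align}

The preprint analyzes the case where the functions g_j are not restricted in complexity, which is what leads to the partial differential eigenvalue problem mentioned in the abstract.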