Cogprints

Neural networks as a model for visual perception: what is lacking?

Würtz, Rolf P. (1998) Neural networks as a model for visual perception: what is lacking? [Preprint]

Full text available as:

Postscript (189 KB)

Abstract

A central mystery of visual perception is the classical problem of invariant object recognition: different appearances of an object can be perceived as ``the same'', despite, e.g., changes in position or illumination, distortions, or partial occlusion by other objects. This article reports on a recent email discussion of the question whether a neural network can learn the simplest of these invariances, i.e., generalize over the position of a pattern on the input layer, and includes the author's view on what ``learning shift-invariance'' could mean. Under that definition, the problem remains unsolved. A related problem is that of learning to detect symmetries present in an input pattern; it has been solved by a standard neural network, but only after some 70,000 input examples. Both results leave some doubt whether backpropagation learning is a realistic model of perceptual processes. Abandoning the view that a stimulus-response system showing the desired behavior must be learned from scratch yields a radically different solution. Perception can be seen as an active process that rapidly converges from some initial state to an ordered state, which itself codes for a percept. As an example, I will present a solution to the visual correspondence problem, which greatly alleviates both problems mentioned above.
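To make concrete what ``learning shift-invariance'' could mean operationally, the following sketch (not from the paper; the architecture, sizes, and training regime are assumptions chosen purely for illustration) trains a small backpropagation network to detect a fixed pattern at a subset of positions on a one-dimensional input layer, then tests it at positions never seen during training. Failure to generalize to the unseen positions is the kind of negative result the discussion revolves around.

# Minimal sketch: probe whether a plain backprop MLP generalizes over pattern position.
# All numbers below are illustrative; they do not reproduce any experiment in the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 16                                      # length of the 1-D "retina"
pattern = np.array([1.0, -1.0, 1.0])        # fixed 3-pixel pattern to be detected

def make_example(pos, present):
    x = rng.normal(0.0, 0.2, N)             # background noise
    if present:
        x[pos:pos + 3] += pattern           # embed the pattern at position 'pos'
    return x

def batch(positions, n):
    X, y = [], []
    for _ in range(n):
        pos = rng.choice(positions)
        present = rng.integers(0, 2)        # pattern present or absent
        X.append(make_example(pos, present))
        y.append(present)
    return np.array(X), np.array(y, dtype=float)

# One hidden layer of logistic units, trained by plain backpropagation
# on the cross-entropy loss.
H = 20
W1 = rng.normal(0, 0.3, (N, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.3, H);      b2 = 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

train_pos = np.arange(0, 13, 2)             # even positions: seen during training
test_pos  = np.arange(1, 13, 2)             # odd positions: never seen
lr = 0.1
for step in range(5000):
    X, y = batch(train_pos, 32)
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    d_out = (p - y) / len(y)                # gradient of mean cross-entropy at output
    d_h = np.outer(d_out, W2) * h * (1 - h) # backpropagated to hidden layer
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum()
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

def accuracy(positions):
    X, y = batch(positions, 2000)
    p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return np.mean((p > 0.5) == y)

print("accuracy at trained positions:", accuracy(train_pos))
print("accuracy at unseen positions :", accuracy(test_pos))

A network with built-in weight sharing (a convolutional architecture) would generalize across position by construction; the point at issue in the discussion is whether a network without such built-in structure can acquire the invariance from examples alone.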

Item Type: Preprint
Subjects: Biology > Animal Cognition
Psychology > Cognitive Psychology
Computer Science > Complexity Theory
Computer Science > Machine Vision
Computer Science > Neural Nets
Neuroscience > Neural Modelling
ID Code: 507
Deposited By: Würtz, Rolf P.
Deposited On: 05 Aug 1998
Last Modified: 11 Mar 2011 08:54
