title: Pre-integration lateral inhibition enhances unsupervised learning
creator: Spratling, M. W.
creator: Johnson, M. H.
subject: Neural Modelling
subject: Computational Neuroscience
subject: Neural Nets
description: A large and influential class of neural network architectures uses post-integration lateral inhibition as a mechanism for competition. We argue that these algorithms are computationally deficient in that they fail to generate, or learn, appropriate perceptual representations under certain circumstances. An alternative neural network architecture is presented in which nodes compete for the right to receive inputs rather than for the right to generate outputs. This form of competition, implemented through pre-integration lateral inhibition, does provide appropriate coding properties and can be used to learn such representations efficiently. Furthermore, this architecture is consistent with both neuro-anatomical and neuro-physiological data. We thus argue that pre-integration lateral inhibition has computational advantages over conventional neural network architectures while remaining equally biologically plausible.
date: 2002
type: Journal (Paginated)
type: PeerReviewed
format: application/pdf
identifier: http://cogprints.org/2380/1/neurocomp.pdf
identifier: Spratling, M. W. and Johnson, M. H. (2002) Pre-integration lateral inhibition enhances unsupervised learning. [Journal (Paginated)]
relation: http://cogprints.org/2380/
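The contrast drawn in the abstract, competition applied after inputs are integrated versus competition applied to the inputs themselves before integration, can be illustrated with a minimal sketch. The code below is not the authors' update rule (the paper's actual equations are not reproduced here); the function names, the softmax-style output competition, the max-based per-input "claim" by rival nodes, and the alpha parameter are all illustrative assumptions.

```python
import numpy as np

def post_integration_inhibition(x, W, alpha=1.0):
    """Conventional scheme: each node first integrates its full input,
    then the integrated activations compete with one another
    (here via an illustrative softmax-style winner-take-most stage)."""
    a = W @ x                           # integrate inputs first
    e = np.exp(alpha * (a - a.max()))
    return a * (e / e.sum())            # competition acts on the outputs

def pre_integration_inhibition(x, W, y_prev, alpha=1.0):
    """Competition for inputs: before node j sums its inputs, each input
    x_i is attenuated according to how strongly other, already-active
    nodes claim that same input (illustrative rule, not the paper's)."""
    n_nodes = W.shape[0]
    y = np.empty(n_nodes)
    for j in range(n_nodes):
        rivals = np.delete(np.arange(n_nodes), j)
        # per-input claim exerted by the strongest rival on each input line
        claim = (W[rivals].T * y_prev[rivals]).max(axis=1)
        gated = x * np.clip(1.0 - alpha * claim, 0.0, None)  # inhibit inputs
        y[j] = W[j] @ gated                                   # integrate afterwards
    return np.clip(y, 0.0, None)

# Example: iterate the pre-integration rule toward a steady response
rng = np.random.default_rng(0)
W = rng.random((3, 5))      # 3 nodes, 5 inputs (arbitrary example sizes)
x = rng.random(5)
y = np.zeros(3)
for _ in range(10):
    y = pre_integration_inhibition(x, W, y)
```

In this sketch the essential difference is only where the nonlinearity sits: post-integration inhibition modulates each node's summed activation, whereas pre-integration inhibition gates individual input lines so that nodes effectively compete for the right to receive each input, which is the property the abstract attributes to the proposed architecture.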