MacLennan, Bruce J. (1998) Mixing Memory and Desire: Want and Will in Neural Modeling. [Conference Paper]
Full text available as: Postscript (1060Kb)
Abstract
Values are critical for intelligent behavior, since values determine interests, and interests determine relevance. Therefore we address relevance and its role in intelligent behavior in animals and machines. Animals avoid exhaustive enumeration of possibilities by focusing on relevant aspects of the environment, which emerge into the (cognitive) foreground, while suppressing irrelevant aspects, which submerge into the background. Nevertheless, the background is not invisible, and aspects of it can pop into the foreground if background processing deems them potentially relevant. Essential to these ideas are questions of how contexts are switched, which defines cognitive/behavioral episodes, and how new contexts are created, which allows the efficiency of foreground/background processing to be extended to new behaviors and cognitive domains. Next we consider mathematical characterizations of the foreground/background distinction, which we treat as a dynamic separation of the concrete space into (approximately) orthogonal subspaces, which are processed differently. Background processing is characterized by large receptive fields which project into a space of relatively low dimension to accomplish rough categorization of a novel stimulus and its approximate location. Such background processing is partly innate and partly learned, and we discuss possible correlational (Hebbian) learning mechanisms. Foreground processing is characterized by small receptive fields which project into a space of comparatively high dimension to accomplish precise categorization and localization of the stimuli relevant to the context. We also consider mathematical models of valences and affordances, which are an aspect of the foreground. Cells processing foreground information have no fixed meaning (i.e., their meaning is contextual), so it is necessary to explain how the processing accomplished by foreground neurons can be made relative to the context. Thus we consider the properties of several simple mathematical models of how the contextual representation controls foreground processing. We show how simple correlational processes accomplish the contextual separation of foreground from background on the basis of differential reinforcement. That is, these processes account for the contextual separation of the concrete space into disjoint subspaces corresponding to the foreground and background. Since an episode may comprise the activation of several contexts (at varying levels of activity), we consider models, suggested by quantum mechanics, of foreground processing in superposition. That is, the contextual state may be a weighted superposition of several pure contexts, with a corresponding superposition of the foreground representations and the processes operating on them. This leads us to a consideration of the nature and origin of contexts. Although some contexts are innate, many are learned. We discuss a mathematical model of contexts which allows a context to split into several contexts, to agglutinate from several contexts, or to constellate out of relatively acontextual processing. Finally, we consider the acontextual processing which occurs when the current context is no longer relevant and which may trigger the switch to another context or the formation of a new context. We relate this to the situation known as "breakdown" in phenomenology.
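The superposition idea in the abstract lends itself to a small worked sketch. The NumPy fragment below is not taken from the paper; every name and dimension in it (`n_inputs`, `n_foreground`, `P`, `w`, `eta`, the reinforcement scalar) is an illustrative assumption. It only shows how a contextual state expressed as a weighted superposition of pure contexts could yield a corresponding superposition of foreground projections, and how a reinforcement-modulated Hebbian (correlational) update might bias the projection toward reinforced input dimensions.

```python
# A minimal sketch (not the paper's model) of two ideas in the abstract:
# (1) a contextual state as a weighted superposition of pure contexts, and
# (2) a reinforcement-modulated Hebbian (correlational) update of the
#     foreground projection. All names and sizes are assumptions.

import numpy as np

rng = np.random.default_rng(0)

n_inputs = 8        # dimension of the concrete (input) space
n_foreground = 3    # dimension of the foreground subspace
n_contexts = 2      # number of pure contexts

# Each pure context k supplies a projection P[k] from the concrete space
# onto its own foreground subspace (here: random orthonormal rows).
P = np.stack([
    np.linalg.qr(rng.standard_normal((n_inputs, n_foreground)))[0].T
    for _ in range(n_contexts)
])

# Contextual state: a normalized weighted superposition of pure contexts.
w = np.array([0.7, 0.3])
w = w / w.sum()

x = rng.standard_normal(n_inputs)      # a concrete stimulus

# Foreground representation under the superposed context: apply the
# weighted superposition of the pure-context operators to the stimulus.
P_mixed = np.tensordot(w, P, axes=1)   # weighted sum of projection operators
foreground = P_mixed @ x

# Reinforcement-modulated Hebbian update: input dimensions that correlate
# with reinforced foreground activity gain weight, sketching how
# differential reinforcement could separate foreground from background.
eta = 0.05                              # learning rate (assumed)
reinforcement = 1.0                     # scalar reward signal (assumed)
P_mixed = P_mixed + eta * reinforcement * np.outer(foreground, x)

print("foreground representation:", foreground)
```

By linearity, mixing the operators before applying them is equivalent to mixing the separately computed foreground representations, which is the sense in which the sketch processes the foreground "in superposition."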
| Item Type: | Conference Paper |
| --- | --- |
| Keywords: | context, contextual understanding, breakdown, intelligence, values, relevance, foreground, background, combinatorial explosion, phenomenology, Dreyfus, intentionality, attention, episode, correlation, convolution, affordance, valence, hippocampus, hippocampal place cell, Hebbian learning, novelty, reinforcement |
| Subjects: | Neuroscience > Behavioral Neuroscience; Biology > Animal Cognition; Biology > Theoretical Biology; Psychology > Cognitive Psychology; Neuroscience > Computational Neuroscience; Computer Science > Artificial Intelligence; Computer Science > Neural Nets; Computer Science > Robotics; Neuroscience > Neural Modelling; Philosophy > Epistemology |
| ID Code: | 84 |
| Deposited By: | MacLennan, Bruce |
| Deposited On: | 08 May 1999 |
| Last Modified: | 11 Mar 2011 08:53 |