Rachkovskij, Dmitri A. and Kussul, Ernst M. (1999) Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning. [Preprint] (Unpublished)
Full text available as: Postscript (775Kb)
Abstract
Distributed representations have often been criticized as inappropriate for encoding data with a complex structure. However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider the Context-Dependent Thinning procedures, which were developed for the representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures provide binding of items represented by sparse binary codevectors (with a low probability of 1s). Such an encoding is biologically plausible and achieves a high information capacity of the distributed associative memory in which the codevectors may be stored. Unlike known binding procedures, Context-Dependent Thinning maintains the same low density (or sparseness) of the bound codevector for a varied number of constituent codevectors. Moreover, a bound codevector is not only similar to other bound codevectors with similar constituents (as in other schemes), but is also similar to the constituent codevectors themselves. This allows structure similarity to be estimated simply from the overlap of codevectors, without retrieving the constituent codevectors, and it also makes retrieval of the constituents easy. Examples of algorithmic and neural-network implementations of the thinning procedures are considered. We also present examples of representing various types of nested structured data (propositions using role-filler and predicate-argument representations, trees, directed acyclic graphs) with sparse codevectors of fixed dimension. Such representations may provide a fruitful alternative to the symbolic representations of traditional AI, as well as to localist and microfeature-based connectionist representations.
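To make the binding idea in the abstract concrete, the following is a minimal sketch of a thinning-style binding of sparse binary codevectors: constituents are superimposed by elementwise OR, and the superposition is then thinned by ORing together conjunctions of itself with fixed random permutations of itself. The paper describes several algorithmic and neural-network variants of Context-Dependent Thinning; this sketch is only an illustration of the general idea, and the function names, dimension, density, and number of permutations are assumed placeholders rather than the authors' implementation.

```python
import numpy as np

def random_sparse_vector(n, p, rng):
    """Random binary codevector of dimension n with probability p of 1s."""
    return (rng.random(n) < p).astype(np.uint8)

def context_dependent_thinning(codevectors, n_permutations, seed=0):
    """Bind sparse binary codevectors by superposition followed by thinning.

    Illustrative CDT-style procedure (assumed variant): superimpose the
    constituents by elementwise OR, then thin the superposition by ORing
    together its conjunctions with fixed random permutations of itself.
    The number of permutations controls the density of the result and can
    be chosen so that it roughly matches that of a single constituent;
    the result overlaps both with the constituents and with bindings of
    similar constituents.
    """
    rng = np.random.default_rng(seed)   # fixed seed -> deterministic binding
    z = np.zeros_like(codevectors[0])
    for v in codevectors:
        z |= v                          # superposition (elementwise OR)
    thinned = np.zeros_like(z)
    for _ in range(n_permutations):
        perm = rng.permutation(z.size)  # fixed random permutation of positions
        thinned |= z & z[perm]          # conjunctive thinning of the superposition
    return thinned

# Example (hypothetical parameters): bind three sparse codevectors and check
# that the result stays sparse and overlaps a constituent.
rng = np.random.default_rng(1)
a, b, c = (random_sparse_vector(10000, 0.01, rng) for _ in range(3))
bound = context_dependent_thinning([a, b, c], n_permutations=12)
print(bound.mean(), (bound & a).sum() / a.sum())
```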
| Item Type: | Preprint |
|---|---|
| Keywords: | distributed representation, sparse coding, binary coding, binding, variable binding, thinning, representation of structure, structured representation, recursive representation, nested representation, compositional representation, connectionist symbol processing |
| Subjects: | Computer Science > Artificial Intelligence; Computer Science > Neural Nets |
| ID Code: | 537 |
| Deposited By: | Rachkovskij, Dmitri |
| Deposited On: | 30 Apr 1999 |
| Last Modified: | 11 Mar 2011 08:54 |