TY - GEN
ID - cogprints2249
UR - http://cogprints.org/2249/
A1 - Lavigne, Frédéric
A1 - Denis, Sylvain
Y1 - 2001///
N2 - Why are attentional processes important in driving anticipations? Anticipatory processes are fundamental cognitive abilities of living systems, enabling them to rapidly and accurately perceive new events in the environment and to trigger behaviors adapted to the newly perceived events. To generate anticipations adapted to sequences of various events in complex environments, the cognitive system must be able to run specific anticipations on the basis of selected relevant events. More attention must therefore be given to events potentially relevant to the living system than to less important events. What are useful attentional factors in anticipatory processes? The relevance of events in the environment depends on the effects they can have on the survival of the living system. The cognitive system must then be able to detect relevant events in order to drive anticipations and trigger adapted behaviors. The attention given to an event depends on i) its external physical relevance in the environment, such as time duration and visual quality, and ii) its internal semantic relevance in memory, such as knowledge about the event (semantic field in memory) and anticipatory power (associative strength to anticipated associates). How can we model interactions between attentional and semantic anticipations? Specific types of distributed recurrent neural networks are able to code temporal sequences of events as associated attractors in memory. A particular learning protocol and spike-rate transmission through synaptic associations allow the model presented to vary attentionally the amount of activation of anticipations (by activation or inhibition processes) as a function of the external and internal relevance of the perceived events. This type of model offers a unique opportunity to account for both anticipations and attention in unified terms of neural dynamics in a recurrent network.
KW - Semantic processing - attention - context effects - neural networks
TI - Attentional and Semantic Anticipations
SP - 74
AV - public
EP - 95
ER -