<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Authors: Frederic Lavigne</title>
    <link>http://popups.lib.uliege.be/1373-5411/index.php?id=459</link>
    <description>Publications of author Frederic Lavigne</description>
    <language>fr</language>
    <ttl>0</ttl>
    <item>
      <title>Anticipatory Semantic Processes</title>
      <link>http://popups.lib.uliege.be/1373-5411/index.php?id=3556</link>
      <description>Why do anticipatory processes correspond to cognitive abilities of living systems? To be adapted to an environment, behaviours need at least i) internal representations of events occurring in the external environment; and ii) internal anticipations of possible events to occur in the external environment. Interactions of these two opposite but complementary cognitive properties lead to various patterns of experimental data on semantic processing. How can dynamic semantic processes be investigated? Experimental studies in cognitive psychology offer several advantages, such as: i) control of the semantic environment, such as words embedded in sentences; ii) methodological tools allowing the observation of anticipations and of adapted oculomotor behaviour during reading; and iii) the analysis of different anticipatory processes within the theoretical framework of semantic processing. What are the different types of semantic anticipations? Experimental data show that semantic anticipatory processes involve i) the coding in memory of sequences of words occurring in textual environments; ii) the anticipation of possible future words from currently perceived words; and iii) the selection of anticipated words as a function of the sequences of perceived words, achieved by anticipatory activations and inhibitory selection processes. How can anticipatory semantic processes be modelled? Localist or distributed neural network models can account for some types of semantic processes, anticipatory or not. Attractor neural networks coding temporal sequences are presented as good candidates for modelling anticipatory semantic processes, according to specific properties of the human brain such as i) auto-associative memory; ii) learning and memorization of sequences of patterns; and iii) anticipation of memorized patterns from previously perceived patterns.</description>
      <pubDate>Thu, 26 Sep 2024 09:35:52 +0200</pubDate>
      <lastBuildDate>Tue, 08 Oct 2024 17:22:39 +0200</lastBuildDate>
      <guid isPermaLink="true">http://popups.lib.uliege.be/1373-5411/index.php?id=3556</guid>
    </item>
    <item>
      <title>AIM Networks: AutoIncursive Memory Networks for Anticipation Toward Learned Goals</title>
      <link>http://popups.lib.uliege.be/1373-5411/index.php?id=2622</link>
      <description>The ability to anticipate future states is a key adaptive property of living systems (Glenberg, 1997). Robert Rosen (1985) suggested that an anticipatory system is characterized by finality, and &quot;is a system containing a predictive model of itself and/or of its environment, which allows it to change state at an instant in accord with the model's predictions pertaining to a later instant&quot;. Daniel Dubois (Dubois &amp; Resconi, 1992; Dubois, 1998a, 2000) defined the concept of incursive and hyperincursive anticipatory systems, able to generate respectively one or several anticipations influencing the computation of the next state of the system. In this article, the concept of autoincursion is proposed as the ability of a system to compute its successive internal states as a function of its past, present and anticipated states, to select among several anticipated states, and to autonomously change its own equation parameters by learning. Some fundamental properties of a neural network architecture and dynamics are proposed to define AutoIncursive Memory Networks. AIM Networks can learn and activate multiple attractors simultaneously, exhibiting synergic dynamics of attractors encoding external inputs. This allows them (1) to compute their successive states as a function of past, present, and multiple anticipated states, (2) to change the way they compute their successive states through symmetric or asymmetric modification of the synaptic structure during autonomous learning, and (3) to select sequences of anticipations oriented toward learned goals.</description>
      <pubDate>Thu, 29 Aug 2024 15:16:38 +0200</pubDate>
      <lastBuildDate>Thu, 29 Aug 2024 15:16:50 +0200</lastBuildDate>
      <guid isPermaLink="true">http://popups.lib.uliege.be/1373-5411/index.php?id=2622</guid>
    </item>
    <item>
      <title>Neural Network Modeling of Learning of Contextual Constraints on Adaptive Anticipations</title>
      <link>http://popups.lib.uliege.be/1373-5411/index.php?id=1767</link>
      <description>Anticipatory processes take into account the contextual events occurring in the environment to anticipate probable upcoming events and to select the best behavioral responses. The knowledge necessary for prediction of events adapted to the context can be learned by classical associative conditioning, which allows associations between events occurring close together in a sequence. Context can then correspond to events perceived in the environment as well as to the reinforcing valence of the event eliciting emotional states in the system, both orienting anticipations in memory. Knowledge for anticipation of behaviors adapted to the context can be learned by operant reinforced conditioning, which allows associations between behaviors and reinforcing events in the environment, as a function of the reinforcing valence of the event (positive or negative). In this case the processing of a contextual event can select behavioral responses orienting the system toward positive reinforcers rather than negative reinforcers. An attractor neural network model is proposed to account for the different types of anticipatory processes presented, as well as for the learning principles of conditioning allowing adapted anticipations.</description>
      <pubDate>Tue, 16 Jul 2024 15:30:43 +0200</pubDate>
      <lastBuildDate>Tue, 16 Jul 2024 15:30:58 +0200</lastBuildDate>
      <guid isPermaLink="true">http://popups.lib.uliege.be/1373-5411/index.php?id=1767</guid>
    </item>
    <item>
      <title>Attentional and Semantic Anticipations in Recurrent Neural Networks</title>
      <link>http://popups.lib.uliege.be/1373-5411/index.php?id=452</link>
      <description>Why are attentional processes important in the driving of anticipations? Anticipatory processes are fundamental cognitive abilities of living systems, allowing them to rapidly and accurately perceive new events in the environment and to trigger behaviors adapted to the newly perceived events. To process anticipations adapted to sequences of various events in complex environments, the cognitive system must be able to run specific anticipations on the basis of selected relevant events. More attention must therefore be given to events potentially relevant for the living system than to less important events. What are useful attentional factors in anticipatory processes? The relevance of events in the environment depends on the effects they can have on the survival of the living system. The cognitive system must then be able to detect relevant events to drive anticipations and to trigger adapted behaviors. The attention given to an event depends on i) its external physical relevance in the environment, such as time duration and visual quality, and ii) its internal semantic relevance in memory, such as knowledge about the event (semantic field in memory) and anticipatory power (associative strength to anticipated associates). How can we model interactions between attentional and semantic anticipations? Specific types of distributed recurrent neural networks are able to code temporal sequences of events as associated attractors in memory. A particular learning protocol and spike-rate transmission through synaptic associations allow the presented model to vary attentionally the amount of activation of anticipations (by activation or inhibition processes) as a function of the external and internal relevance of the perceived events. This type of model offers a unique opportunity to account for both anticipations and attention in unified terms of neural dynamics in a recurrent network.</description>
      <pubDate>Thu, 27 Jun 2024 11:06:40 +0200</pubDate>
      <lastBuildDate>Mon, 07 Oct 2024 12:54:33 +0200</lastBuildDate>
      <guid isPermaLink="true">http://popups.lib.uliege.be/1373-5411/index.php?id=452</guid>
    </item>
  </channel>
</rss>