<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>channel</title>
    <link>http://popups.lib.uliege.be/1373-5411/index.php?id=1902</link>
    <description>Index terms</description>
    <language>fr</language>
    <ttl>0</ttl>
    <item>
      <title>Generalized Formula of Physical Channel Capacities</title>
      <link>http://popups.lib.uliege.be/1373-5411/index.php?id=1900</link>
      <description>We consider a process of information transmission in an environment described by the canonical distribution of noise, treated as the transmission channel in the sense of information theory. We derive the generalized form of the noise entropy of this environment. We then derive the generalized information capacity formula for the geometric distribution of the output variable, under the condition that the mathematical expectation of the input variable is greater than or equal to its minimal value. We also state the hypothesis that when this input parameter is less than or equal to that minimal (critical, extreme) value, the capacity is given by the lower capacity estimate, namely the capacity for the limit distribution with this minimal input parameter. This paper generalizes the paper (Hejna &amp; Vajda, 1999) presented at CASYS'98.</description>
      <pubDate>Wed, 17 Jul 2024 13:04:30 +0200</pubDate>
      <lastBuildDate>Wed, 17 Jul 2024 13:04:36 +0200</lastBuildDate>
      <guid isPermaLink="true">http://popups.lib.uliege.be/1373-5411/index.php?id=1900</guid>
    </item>
  </channel>
</rss>