<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>pseudo information divergence</title>
    <link>http://popups.lib.uliege.be/1373-5411/index.php?id=2064</link>
    <description>Index terms</description>
    <language>en</language>
    <ttl>0</ttl>
    <item>
      <title>Pseudo Information Divergences Defined on the Family of Specific Probability Distributions</title>
      <link>http://popups.lib.uliege.be/1373-5411/index.php?id=2063</link>
      <description>Several information measures have been used as criteria in information theory, statistics, and various fields of engineering. In particular, information divergences are widely used to measure the difference between two probability distributions. In this paper, we propose the pseudo information divergence, which behaves like an ordinary information divergence when the two measured probability distributions belong to a specific family of distributions. We present an example of a pseudo information divergence and apply it to the problem of training multi-layer perceptrons on data corrupted by gross error noise.</description>
      <pubDate>Mon, 29 Jul 2024 10:47:59 +0200</pubDate>
      <lastBuildDate>Mon, 29 Jul 2024 10:48:09 +0200</lastBuildDate>
      <guid isPermaLink="true">http://popups.lib.uliege.be/1373-5411/index.php?id=2063</guid>
    </item>
  </channel>
</rss>