Classification with finite memory

Abstract

Consider the following situation. A device called a classifier observes a probability law P on l-vectors from an alphabet of size A. Its task is to observe a second probability law Q and decide whether P ≡ Q or whether P and Q are sufficiently different according to some appropriate criterion. If the classifier has available an unlimited memory (so that it can remember P(z) exactly for all z), this is a simple matter. In fact, for most differentness criteria, a finite memory of 2^{(log A)l + o(l)} bits will suffice (for large l), i.e., store a finite approximation of P(z) for each of the A^l z's. In a sense made precise in this paper, it is shown that a memory of only about 2^{Rl} bits is required, where the quantity R < log A and is closely related to the entropy of P. Further, it is shown that if, instead of being given P(z) for all z, the classifier is given a training sequence drawn with probability law P that can be stored using about 2^{Rl} bits, then correct classification is also possible.
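
To make the memory comparison concrete, here is a minimal sketch in Python of the kind of empirical-distribution classifier the abstract alludes to; it is an illustration, not the construction from the paper. The functions block_law and classify, the total-variation criterion, and all numeric choices (alphabet size A = 2, block length l = 10, threshold 0.1) are assumptions made for this example only. The classifier estimates the law of l-blocks from a training sequence, does the same for the test data, and declares the sources different when the two empirical laws disagree by more than the threshold.

from collections import Counter
import random

def block_law(seq, l):
    """Empirical probability law of the l-blocks occurring in seq."""
    windows = [tuple(seq[i:i + l]) for i in range(len(seq) - l + 1)]
    n = len(windows)
    return {z: c / n for z, c in Counter(windows).items()}

def total_variation(p, q):
    """Total variation distance between two block laws."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(z, 0.0) - q.get(z, 0.0)) for z in support)

def classify(train_seq, test_seq, l, threshold):
    """Declare the sources the same iff their empirical l-block laws are close.

    block_law(train_seq, l) stands in for the classifier's stored description
    of P. A naive table over all A^l blocks costs about 2^{(log A)l} bits at
    fixed precision; the paper shows roughly 2^{Rl} bits suffice, with
    R < log A closely related to the entropy of P.
    """
    dist = total_variation(block_law(train_seq, l), block_law(test_seq, l))
    return "same" if dist <= threshold else "different"

if __name__ == "__main__":
    random.seed(0)
    # Illustrative binary sources (A = 2, l = 10): a biased coin with entropy
    # H(0.1) ≈ 0.47 bits/symbol versus a fair coin with 1 bit/symbol.
    p_src = [random.random() < 0.1 for _ in range(200_000)]
    q_src = [random.random() < 0.5 for _ in range(100_000)]
    print(classify(p_src[:100_000], p_src[100_000:], l=10, threshold=0.1))  # same
    print(classify(p_src[:100_000], q_src, l=10, threshold=0.1))           # different

Note that the dictionary-based estimate allocates counters only for blocks that actually occur; for a biased source these are roughly the 2^{H(P)l} typical blocks rather than all A^l of them, which is the informal intuition behind a rate R tied to the entropy of P.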

Information

Published In

IEEE Transactions on Information Theory, Volume 42, Issue 2
March 1996
347 pages

Publisher

IEEE Press

Publication History

Published: 01 September 2006

Qualifiers

  • Research-article
