Entropy and information rates for hidden Markov models

Hanseok Ko, R. H. Baran

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected per-step log-probability equals the negative of the entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding this entropy rate is presented. The rate-distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.
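
To make the abstract's central quantity concrete: for a discrete-observation HMM, the scaled forward algorithm yields log P(y_1..T | model), and by the Shannon-McMillan-Breiman theorem the average of -(1/T) log P over long simulated sequences approximates the output entropy rate. The sketch below is illustrative only; the two-state model (pi, A, B) is invented here for the example, and the paper's own procedure is analytic rather than Monte Carlo.

    import numpy as np

    def hmm_log_likelihood(obs, pi, A, B):
        # Scaled forward algorithm: returns log P(y_1..T | pi, A, B).
        # Rescaling alpha at every step avoids underflow for large T.
        alpha = pi * B[:, obs[0]]
        log_p = 0.0
        for t in range(len(obs)):
            if t > 0:
                alpha = (alpha @ A) * B[:, obs[t]]  # predict, then weight by emission
            c = alpha.sum()
            log_p += np.log(c)
            alpha /= c
        return log_p

    def sample_hmm(T, pi, A, B, rng):
        # Draw one T-long observation sequence from the HMM.
        s = rng.choice(len(pi), p=pi)
        obs = np.empty(T, dtype=int)
        for t in range(T):
            obs[t] = rng.choice(B.shape[1], p=B[s])  # emit symbol from current state
            s = rng.choice(A.shape[1], p=A[s])       # then transition
        return obs

    # Hypothetical two-state, binary-output HMM (not from the paper).
    rng = np.random.default_rng(0)
    pi = np.array([0.5, 0.5])
    A = np.array([[0.9, 0.1], [0.2, 0.8]])   # state transition matrix
    B = np.array([[0.7, 0.3], [0.4, 0.6]])   # emission probabilities
    T = 5000
    rates = [-hmm_log_likelihood(sample_hmm(T, pi, A, B, rng), pi, A, B) / T
             for _ in range(20)]
    print(f"entropy rate estimate: {np.mean(rates):.4f} nats/step")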

Original language: English
Title of host publication: IEEE International Symposium on Information Theory - Proceedings
DOI: 10.1109/ISIT.1998.708979
Publication status: Published - 1998 Dec 1
Event: 1998 IEEE International Symposium on Information Theory, ISIT 1998 - Cambridge, MA, United States
Duration: 1998 Aug 16 - 1998 Aug 21

ASJC Scopus subject areas

  • Applied Mathematics
  • Modelling and Simulation
  • Theoretical Computer Science
  • Information Systems

Cite this

Ko, H., & Baran, R. H. (1998). Entropy and information rates for hidden Markov models. In IEEE International Symposium on Information Theory - Proceedings [708979]. https://doi.org/10.1109/ISIT.1998.708979