Entropy and information rates for hidden Markov models

Hanseok Ko, R. H. Baran

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected value of the per-step log-probability is minus one times the mean entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.
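For orientation, the limiting relation invoked in the abstract can be stated in standard Shannon-theoretic notation (a sketch using conventional symbols, not reproduced from the paper; here Y_1, ..., Y_T denotes the observed output sequence and \lambda the HMM parameters):

\[
  \lim_{T \to \infty} \frac{1}{T}\, \mathbb{E}\!\left[\log P(Y_1, \dots, Y_T \mid \lambda)\right]
  \;=\; -\,\bar{H},
  \qquad
  \bar{H} \;=\; \lim_{T \to \infty} \frac{1}{T}\, H(Y_1, \dots, Y_T),
\]

where \bar{H} is the entropy rate of the observation (channel-output) process. In practice, the log-probability on the left-hand side is the quantity computed by the forward algorithm for a given observation sequence.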

Original language: English
Title of host publication: IEEE International Symposium on Information Theory - Proceedings
DOI: https://doi.org/10.1109/ISIT.1998.708979
ISBN (Print): 0780350006, 9780780350007
Publication status: Published - 1998 Dec 1
Event: 1998 IEEE International Symposium on Information Theory, ISIT 1998 - Cambridge, MA, United States
Duration: 1998 Aug 16 - 1998 Aug 21



ASJC Scopus subject areas

  • Applied Mathematics
  • Modelling and Simulation
  • Theoretical Computer Science
  • Information Systems

Cite this

Ko, H., & Baran, R. H. (1998). Entropy and information rates for hidden Markov models. In IEEE International Symposium on Information Theory - Proceedings (Art. no. 708979). https://doi.org/10.1109/ISIT.1998.708979
