### Abstract

A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected value of the per step log-probability is minus one times the mean entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.
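The link stated in the abstract — that for large T the expected per-step log-probability of the observed sequence equals minus the output entropy rate — suggests a simple Monte Carlo check: simulate a long output sequence from an HMM, evaluate its log-probability with the scaled forward algorithm, and divide by T. The sketch below is illustrative only; the transition, emission, and initial parameters are invented for the example and are not taken from the paper.

```python
import math
import random

def simulate_hmm(A, B, pi, T, rng):
    """Sample T observations from an HMM with transition matrix A,
    emission matrix B, and initial distribution pi."""
    def draw(p):
        u, c = rng.random(), 0.0
        for i, pi_i in enumerate(p):
            c += pi_i
            if u < c:
                return i
        return len(p) - 1
    s = draw(pi)
    obs = []
    for _ in range(T):
        obs.append(draw(B[s]))   # emit from current state
        s = draw(A[s])           # then transition
    return obs

def log_prob(A, B, pi, obs):
    """Forward algorithm with per-step scaling; returns log P(obs | model)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    c = sum(alpha)
    ll = math.log(c)
    alpha = [a / c for a in alpha]
    for y in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][y]
                 for i in range(n)]
        c = sum(alpha)           # scaling factor; its log accumulates into ll
        ll += math.log(c)
        alpha = [a / c for a in alpha]
    return ll

# Illustrative 2-state, binary-output HMM (hypothetical parameters)
A = [[0.9, 0.1], [0.2, 0.8]]
B = [[0.8, 0.2], [0.3, 0.7]]
pi = [0.5, 0.5]

rng = random.Random(0)
T = 20000
obs = simulate_hmm(A, B, pi, T, rng)
h_hat = -log_prob(A, B, pi, obs) / T   # entropy-rate estimate, nats/step
print(round(h_hat, 3))
```

For a binary output alphabet the estimate should fall between 0 and ln 2 ≈ 0.693 nats per step; this empirical average converges to the entropy rate only as T grows, which is exactly the large-T limit the abstract invokes.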

| Original language | English |
|---|---|
| Title of host publication | IEEE International Symposium on Information Theory - Proceedings |
| DOIs | https://doi.org/10.1109/ISIT.1998.708979 |
| Publication status | Published - 1998 Dec 1 |
| Event | 1998 IEEE International Symposium on Information Theory, ISIT 1998 - Cambridge, MA, United States. Duration: 1998 Aug 16 → 1998 Aug 21 |

### Other

| Other | 1998 IEEE International Symposium on Information Theory, ISIT 1998 |
|---|---|
| Country | United States |
| City | Cambridge, MA |
| Period | 98/8/16 → 98/8/21 |

### ASJC Scopus subject areas

- Applied Mathematics
- Modelling and Simulation
- Theoretical Computer Science
- Information Systems

### Cite this

Ko, Hanseok; Baran, R. H. **Entropy and information rates for hidden Markov models.** In *IEEE International Symposium on Information Theory - Proceedings*, Article 708979. 1998 IEEE International Symposium on Information Theory, ISIT 1998, Cambridge, MA, United States, 98/8/16. https://doi.org/10.1109/ISIT.1998.708979

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Entropy and information rates for hidden Markov models

AU - Ko, Hanseok

AU - Baran, R. H.

PY - 1998/12/1

Y1 - 1998/12/1

N2 - A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected value of the per step log-probability is minus one times the mean entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.

AB - A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected value of the per step log-probability is minus one times the mean entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.

UR - http://www.scopus.com/inward/record.url?scp=84890370594&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84890370594&partnerID=8YFLogxK

U2 - 10.1109/ISIT.1998.708979

DO - 10.1109/ISIT.1998.708979

M3 - Conference contribution

SN - 0780350006

SN - 9780780350007

BT - IEEE International Symposium on Information Theory - Proceedings

ER -