Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics

Stefan Liehr, Klaus Pawelzik, Jens Kohlmorgen, Steven Lemm, Klaus Muller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

The prediction of non-stationary dynamical systems may be performed by identifying appropriate sub-dynamics and detecting mode changes early. In this paper, we present a framework which unifies the mixtures of experts approach and a generalized hidden Markov model with an input-dependent transition matrix: the Hidden Markov Mixtures of Experts (HMME). The gating procedure incorporates state memory, information about the current location in phase space, and the previous prediction performance. The experts and the hidden Markov gating model are simultaneously trained by an EM algorithm that maximizes the likelihood during an annealing procedure. The HMME architecture allows for fast on-line detection of mode changes: change points are detected as soon as the incoming input data stream contains sufficient information to indicate a change in the dynamics.
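To make the gating idea more concrete, the following is a minimal, hypothetical Python/NumPy sketch of an HMME-style on-line prediction step. It is not the authors' implementation: the names (ToyHMME, step, transition) and the specific modelling choices (linear experts, a softmax transition matrix conditioned on the input, a Gaussian likelihood for the previous prediction error) are illustrative assumptions that merely mirror the three ingredients named in the abstract: state memory, the current location in phase space, and the previous prediction performance.

import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class ToyHMME:
    # Hypothetical sketch: K linear experts gated by a hidden Markov chain
    # whose transition matrix depends on the current input (phase-space location).
    def __init__(self, K, d, sigma=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.sigma = sigma
        self.W = rng.normal(scale=0.5, size=(K, d))      # expert weights
        self.b = np.zeros(K)                             # expert biases
        self.A = rng.normal(scale=0.1, size=(K, K, d))   # input-dependent transition scores
        self.p = np.full(K, 1.0 / K)                     # state belief (memory)

    def transition(self, x):
        # Row-stochastic transition matrix conditioned on the input x.
        return softmax(self.A @ x, axis=1)

    def step(self, x, x_prev=None, y_prev=None):
        # Reweight the belief by each expert's likelihood of the previous
        # observation (previous prediction performance) ...
        if x_prev is not None and y_prev is not None:
            err = y_prev - (self.W @ x_prev + self.b)
            self.p = self.p * np.exp(-0.5 * (err / self.sigma) ** 2)
            self.p = self.p / self.p.sum()
        # ... propagate it through the input-dependent transition
        # (state memory plus the current location in phase space) ...
        self.p = self.p @ self.transition(x)
        # ... and mix the expert predictions with the resulting weights.
        y_hat = self.W @ x + self.b
        return float(self.p @ y_hat), self.p.copy()

# Example: predict a scalar y from 3-dimensional delay vectors x.
model = ToyHMME(K=2, d=3)
y_pred, belief = model.step(x=np.array([0.1, -0.2, 0.4]))

In the full HMME of the paper, the expert parameters and the gating model are fitted jointly by the EM/annealing procedure described above, and a mode change is flagged on-line once the state belief shifts decisively toward a different expert.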

Original language: English
Title of host publication: Neural Networks for Signal Processing - Proceedings of the IEEE Workshop
Place of Publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 195-204
Number of pages: 10
Publication status: Published - 1999 Dec 1
Externally published: Yes
Event: Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99) - Madison, WI, USA
Duration: 1999 Aug 23 – 1999 Aug 25

Other

Other: Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99)
City: Madison, WI, USA
Period: 99/8/23 – 99/8/25

Fingerprint

Hidden Markov models
Dynamical systems
Annealing
Data storage equipment

ASJC Scopus subject areas

  • Signal Processing
  • Software
  • Electrical and Electronic Engineering

Cite this

Liehr, S., Pawelzik, K., Kohlmorgen, J., Lemm, S., & Muller, K. (1999). Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics. In Neural Networks for Signal Processing - Proceedings of the IEEE Workshop (pp. 195-204). Piscataway, NJ, United States: IEEE.

Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics. / Liehr, Stefan; Pawelzik, Klaus; Kohlmorgen, Jens; Lemm, Steven; Muller, Klaus.

Neural Networks for Signal Processing - Proceedings of the IEEE Workshop. Piscataway, NJ, United States : IEEE, 1999. p. 195-204.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Liehr, S, Pawelzik, K, Kohlmorgen, J, Lemm, S & Muller, K 1999, Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics. in Neural Networks for Signal Processing - Proceedings of the IEEE Workshop. IEEE, Piscataway, NJ, United States, pp. 195-204, Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99), Madison, WI, USA, 99/8/23.
Liehr S, Pawelzik K, Kohlmorgen J, Lemm S, Muller K. Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics. In Neural Networks for Signal Processing - Proceedings of the IEEE Workshop. Piscataway, NJ, United States: IEEE. 1999. p. 195-204
Liehr, Stefan ; Pawelzik, Klaus ; Kohlmorgen, Jens ; Lemm, Steven ; Muller, Klaus. / Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics. Neural Networks for Signal Processing - Proceedings of the IEEE Workshop. Piscataway, NJ, United States : IEEE, 1999. pp. 195-204
@inproceedings{e2e60db185604679b13bb732bc4276d0,
title = "Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics",
abstract = "The prediction of non-stationary dynamical systems may be performed by identifying appropriate sub-dynamics and an early detection of mode changes. In this paper, we present a framework which unifies the mixtures of experts approach and a generalized hidden Markov model with an input-dependent transition matrix: the Hidden Markov Mixtures of Experts (HMME). The gating procedure incorporates state memory, information about the current location in phase space, and the previous prediction performance. The experts and the hidden Markov gating model are simultaneously trained by an EM algorithm that maximizes the likelihood during an annealing procedure. The HMME architecture allows for a fast on-line detection of mode changes: change points are detected as soon as the incoming input data stream contains sufficient information to indicate a change in the dynamics.",
author = "Stefan Liehr and Klaus Pawelzik and Jens Kohlmorgen and Steven Lemm and Klaus Muller",
year = "1999",
month = "12",
day = "1",
language = "English",
pages = "195--204",
booktitle = "Neural Networks for Signal Processing - Proceedings of the IEEE Workshop",
publisher = "IEEE",

}

TY - GEN

T1 - Hidden Markov Mixtures of Experts for prediction of non-stationary dynamics

AU - Liehr, Stefan

AU - Pawelzik, Klaus

AU - Kohlmorgen, Jens

AU - Lemm, Steven

AU - Muller, Klaus

PY - 1999/12/1

Y1 - 1999/12/1

N2 - The prediction of non-stationary dynamical systems may be performed by identifying appropriate sub-dynamics and an early detection of mode changes. In this paper, we present a framework which unifies the mixtures of experts approach and a generalized hidden Markov model with an input-dependent transition matrix: the Hidden Markov Mixtures of Experts (HMME). The gating procedure incorporates state memory, information about the current location in phase space, and the previous prediction performance. The experts and the hidden Markov gating model are simultaneously trained by an EM algorithm that maximizes the likelihood during an annealing procedure. The HMME architecture allows for a fast on-line detection of mode changes: change points are detected as soon as the incoming input data stream contains sufficient information to indicate a change in the dynamics.

AB - The prediction of non-stationary dynamical systems may be performed by identifying appropriate sub-dynamics and an early detection of mode changes. In this paper, we present a framework which unifies the mixtures of experts approach and a generalized hidden Markov model with an input-dependent transition matrix: the Hidden Markov Mixtures of Experts (HMME). The gating procedure incorporates state memory, information about the current location in phase space, and the previous prediction performance. The experts and the hidden Markov gating model are simultaneously trained by an EM algorithm that maximizes the likelihood during an annealing procedure. The HMME architecture allows for a fast on-line detection of mode changes: change points are detected as soon as the incoming input data stream contains sufficient information to indicate a change in the dynamics.

UR - http://www.scopus.com/inward/record.url?scp=0033344927&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033344927&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0033344927

SP - 195

EP - 204

BT - Neural Networks for Signal Processing - Proceedings of the IEEE Workshop

PB - IEEE

CY - Piscataway, NJ, United States

ER -