### Abstract

We propose a novel algebraic algorithmic framework for dealing with probability distributions represented by their cumulants such as the mean and covariance matrix. As an example, we consider the unsupervised learning problem of finding the subspace on which several probability distributions agree. Instead of minimizing an objective function involving the estimated cumulants, we show that by treating the cumulants as elements of the polynomial ring we can directly solve the problem, at a lower computational cost and with higher accuracy. Moreover, the algebraic viewpoint on probability distributions allows us to invoke the theory of algebraic geometry, which we demonstrate in a compact proof for an identifiability criterion.
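The problem of finding a subspace on which several distributions agree (in their first two cumulants) can be sketched with plain linear algebra: directions v satisfying (μᵢ − μ₁)ᵀv = 0 and (Σᵢ − Σ₁)v = 0 project all distributions onto identical means and variances. The sketch below is a simplified linear-algebra proxy, not the paper's algebraic-geometry algorithm; the function name `common_subspace` and the toy data are illustrative assumptions.

```python
import numpy as np

def common_subspace(means, covs, dim, tol=1e-8):
    """Estimate `dim` directions on which all distributions share the same
    first two cumulants. Stacks mean and covariance differences against
    the first distribution and returns the right singular vectors with
    the smallest singular values (an approximate common null space)."""
    d = means[0].shape[0]
    rows = []
    for mu, S in zip(means[1:], covs[1:]):
        rows.append((mu - means[0]).reshape(1, d))  # v must kill mean shifts
        rows.append(S - covs[0])                    # and covariance shifts
    M = np.vstack(rows)
    # SVD returns singular values in descending order, so the last
    # `dim` rows of Vt span the (approximate) common null space
    _, _, Vt = np.linalg.svd(M)
    return Vt[-dim:]

# toy example: two 3-D Gaussians agreeing only on the first coordinate
mu1, mu2 = np.zeros(3), np.array([0.0, 1.0, -1.0])
S1 = np.eye(3)
S2 = np.diag([1.0, 2.0, 3.0])
V = common_subspace([mu1, mu2], [S1, S2], dim=1)
print(np.abs(V))  # the shared direction, approximately [[1, 0, 0]]
```

Note that (Σᵢ − Σ₁)v = 0 is sufficient but not necessary for vᵀΣᵢv = vᵀΣ₁v; the paper's contribution is to treat these quadratic and linear conditions exactly, as polynomials, rather than via such a relaxation or via numerical minimization.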

| Original language | English |
|---|---|
| Pages (from-to) | 855-903 |
| Number of pages | 49 |
| Journal | Journal of Machine Learning Research |
| Volume | 13 |
| Publication status | Published - 2012 Mar 1 |

### Keywords

- Approximate algebra
- Computational algebraic geometry
- Unsupervised learning

### ASJC Scopus subject areas

- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability

### Cite this

Király, F. J., von Bünau, P., Meinecke, F. C., Blythe, D. A. J., & Müller, K. (2012). **Algebraic geometric comparison of probability distributions.** *Journal of Machine Learning Research*, *13*, 855-903.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Algebraic geometric comparison of probability distributions

AU - Király, Franz J.

AU - von Bünau, Paul

AU - Meinecke, Frank C.

AU - Blythe, Duncan A. J.

AU - Müller, Klaus

PY - 2012/3/1

Y1 - 2012/3/1

N2 - We propose a novel algebraic algorithmic framework for dealing with probability distributions represented by their cumulants such as the mean and covariance matrix. As an example, we consider the unsupervised learning problem of finding the subspace on which several probability distributions agree. Instead of minimizing an objective function involving the estimated cumulants, we show that by treating the cumulants as elements of the polynomial ring we can directly solve the problem, at a lower computational cost and with higher accuracy. Moreover, the algebraic viewpoint on probability distributions allows us to invoke the theory of algebraic geometry, which we demonstrate in a compact proof for an identifiability criterion.

AB - We propose a novel algebraic algorithmic framework for dealing with probability distributions represented by their cumulants such as the mean and covariance matrix. As an example, we consider the unsupervised learning problem of finding the subspace on which several probability distributions agree. Instead of minimizing an objective function involving the estimated cumulants, we show that by treating the cumulants as elements of the polynomial ring we can directly solve the problem, at a lower computational cost and with higher accuracy. Moreover, the algebraic viewpoint on probability distributions allows us to invoke the theory of algebraic geometry, which we demonstrate in a compact proof for an identifiability criterion.

KW - Approximate algebra

KW - Computational algebraic geometry

KW - Unsupervised learning

UR - http://www.scopus.com/inward/record.url?scp=84859464784&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84859464784&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:84859464784

VL - 13

SP - 855

EP - 903

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -