### Abstract

We propose a novel algebraic algorithmic framework for working with probability distributions represented by their cumulants, such as the mean and covariance matrix. As an example, we consider the unsupervised learning problem of finding the subspace on which several probability distributions agree. Instead of minimizing an objective function involving the estimated cumulants, we show that by treating the cumulants as elements of a polynomial ring we can solve the problem directly, at lower computational cost and with higher accuracy. Moreover, the algebraic viewpoint on probability distributions allows us to invoke the theory of algebraic geometry, which we demonstrate in a compact proof of an identifiability criterion.
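To make the problem statement concrete, the following sketch (an illustration of the setup, not the paper's algorithm; all names and the synthetic distributions are invented for this example) builds two Gaussians that agree on a known subspace and checks, via their first two cumulants, that a direction lies in the agreement subspace exactly when the linear polynomial $v^\top(\mu_1-\mu_2)$ and the quadratic form $v^\top(\Sigma_1-\Sigma_2)v$ both vanish — the polynomial-ring viewpoint mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4, 2  # ambient dimension, dimension of the agreement subspace

# Two Gaussians engineered to agree on the span of the first k coordinates.
mu1 = np.array([1.0, -2.0, 0.5, 3.0])
mu2 = np.array([1.0, -2.0, -1.5, 0.0])   # means differ only outside the subspace

A = rng.normal(size=(d, d))
Sigma1 = A @ A.T + np.eye(d)             # a generic positive-definite covariance
C = np.vstack([np.zeros((k, d)), rng.normal(size=(d - k, d))])
Sigma2 = Sigma1 + C @ C.T                # perturbation vanishing on the top-left k-by-k block

# Projection onto the agreement subspace: projected cumulants coincide.
P = np.hstack([np.eye(k), np.zeros((k, d - k))])
assert np.allclose(P @ mu1, P @ mu2)
assert np.allclose(P @ Sigma1 @ P.T, P @ Sigma2 @ P.T)

# Polynomial viewpoint: the 1-d marginals along a direction v agree iff the
# cumulant differences, read as polynomials in v, vanish at v.
def vanishes(v):
    return np.isclose(v @ (mu1 - mu2), 0) and np.isclose(v @ (Sigma1 - Sigma2) @ v, 0)

e1 = np.eye(d)[0]   # a direction inside the agreement subspace
e3 = np.eye(d)[2]   # a direction outside it
print(vanishes(e1), vanishes(e3))  # True False
```

The cumulant differences thus generate an ideal whose vanishing set is the sought subspace, which is what lets algebraic-geometric tools replace numerical objective minimization.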

| Original language | English |
|---|---|
| Pages (from-to) | 855-903 |
| Number of pages | 49 |
| Journal | Journal of Machine Learning Research |
| Volume | 13 |
| Publication status | Published - Mar 1, 2012 |

### Keywords

- Approximate algebra
- Computational algebraic geometry
- Unsupervised learning

### ASJC Scopus subject areas

- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability

## Cite this

*Algebraic geometric comparison of probability distributions*. *Journal of Machine Learning Research*, *13*, 855-903.