### Abstract

We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.
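The SSA example in the abstract rests on the observation that stationarity constraints are polynomial in the projection: requiring a direction `w` to see the same covariance in every epoch, `wᵀS_i w = wᵀS_j w`, is a set of quadratic equations in the entries of `w`. The following is a toy sketch of that idea only, not the paper's ideal-regression algorithm; the data and the eigenvector-based solve are illustrative assumptions.

```python
import numpy as np

# Toy illustration (not the paper's algorithm): the SSA condition that a
# projection w has identical covariance in two epochs, w^T S1 w = w^T S2 w,
# is a quadratic (polynomial) equation in the entries of w.
rng = np.random.default_rng(0)

d = 3
# Two epoch covariances that agree on the first coordinate (the
# "stationary" direction) and differ on the remaining ones.
A = rng.standard_normal((d, d))
S1 = A @ A.T
S2 = S1.copy()
S2[1:, 1:] += 2.0 * np.eye(d - 1)  # non-stationarity confined to coords 1..2

# The constraint w^T (S1 - S2) w = 0 is solved by the eigenvector of the
# symmetric difference D = S1 - S2 whose eigenvalue is closest to zero.
D = S1 - S2
eigvals, eigvecs = np.linalg.eigh(D)
w = eigvecs[:, np.argmin(np.abs(eigvals))]

residual = abs(w @ D @ w)
print(residual)  # ≈ 0: w satisfies the polynomial stationarity constraint
```

Here the system happens to be solvable exactly by linear algebra; the point of ideal regression is to handle general systems of this kind approximately, without numerical optimization.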

| Original language | English |
|---|---|
| Pages (from-to) | 628-637 |
| Number of pages | 10 |
| Journal | Journal of Machine Learning Research |
| Volume | 22 |
| Publication status | Published - 2012 |
| Externally published | Yes |

### ASJC Scopus subject areas

- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability

### Cite this

Király, F. J., Von Bünau, P., Müller, J. S., Blythe, D. A. J., Meinecke, F. C., & Müller, K. (2012). **Regression for sets of polynomial equations.** *Journal of Machine Learning Research*, *22*, 628-637.

Research output: Contribution to journal › Article

```
TY  - JOUR
T1  - Regression for sets of polynomial equations
AU  - Király, Franz J.
AU  - Von Bünau, Paul
AU  - Müller, Jan S.
AU  - Blythe, Duncan A. J.
AU  - Meinecke, Frank C.
AU  - Müller, Klaus
PY  - 2012
Y1  - 2012
N2  - We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.
UR  - http://www.scopus.com/inward/record.url?scp=84918835489&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84918835489&partnerID=8YFLogxK
M3  - Article
AN  - SCOPUS:84918835489
VL  - 22
SP  - 628
EP  - 637
JO  - Journal of Machine Learning Research
JF  - Journal of Machine Learning Research
SN  - 1532-4435
ER  -
```