Regression for sets of polynomial equations

Franz J. Király, Paul Von Bünau, Jan S. Müller, Duncan A J Blythe, Frank C. Meinecke, Klaus Muller

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.
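The abstract refers to techniques from approximate computational algebraic geometry for recovering polynomial equations from data. As a hedged illustration only (this is not the paper's ideal-regression algorithm, and every name and parameter below is an assumption for the example), one standard building block is to find a polynomial that approximately vanishes on sample points by taking the smallest singular vector of a monomial evaluation matrix:

```python
import numpy as np

def monomials_deg2(x, y):
    # Evaluate the degree-<=2 monomials at each point.
    # Columns: 1, x, y, x^2, x*y, y^2
    return np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])

rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
x, y = np.cos(t), np.sin(t)          # sample points on the unit circle
M = monomials_deg2(x, y)

# The right singular vector with the smallest singular value gives the
# coefficients of the polynomial that best (approximately) vanishes on
# the data; for the unit circle this is x^2 + y^2 - 1 = 0, up to scale.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
c = Vt[-1]
c = c / c[3]                         # normalize so the x^2 coefficient is 1
print(np.round(c, 3))                # ≈ coefficients of x^2 + y^2 - 1
```

Here the near-null space of M is one-dimensional because only one degree-2 polynomial (up to scale) vanishes on the circle; with noisy data the smallest singular value is merely small rather than zero, which is the "approximate" part of the setting.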

Original language: English
Pages (from-to): 628-637
Number of pages: 10
Journal: Journal of Machine Learning Research
Volume: 22
Publication status: Published - 2012
Externally published: Yes


ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Cite this

Király, F. J., Von Bünau, P., Müller, J. S., Blythe, D. A. J., Meinecke, F. C., & Muller, K. (2012). Regression for sets of polynomial equations. Journal of Machine Learning Research, 22, 628-637.

@article{8f8ba9ffe65e403c824c130d633a953b,
title = "Regression for sets of polynomial equations",
abstract = "We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.",
author = "Kir{\'a}ly, {Franz J.} and {Von B{\"u}nau}, Paul and M{\"u}ller, {Jan S.} and Blythe, {Duncan A J} and Meinecke, {Frank C.} and Klaus Muller",
year = "2012",
language = "English",
volume = "22",
pages = "628--637",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
}
