Regularizing AdaBoost

Gunnar Rätsch, Takashi Onoda, Klaus Müller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

26 Citations (Scopus)

Abstract

Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low-noise cases. For noisy data, however, boosting will still try to enforce a hard margin and thereby give too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow for soft-margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft-margin classifier: the support vector machine.
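The core idea above, damping the hard-margin weight update for examples that keep accumulating weight, can be sketched compactly. The Python below is a rough illustration of an AdaBoost_reg-style soft-margin update, not the paper's exact formulation: the penalty C * mu on each example's accumulated influence, the decision-stump base learner, and all function and variable names are assumptions made for this sketch.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_reg(X, y, C=0.1, n_rounds=50):
    """Soft-margin AdaBoost sketch (assumed AdaBoost_reg-style update).

    Labels y must be in {-1, +1}. The hard-margin update
    w_i ~ exp(-y_i F(x_i)) is damped by C times each example's
    accumulated influence mu_i, so that persistent outliers cannot
    monopolize the sample distribution.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)   # sample distribution over training examples
    F = np.zeros(n)           # ensemble score sum_t alpha_t * h_t(x_i)
    mu = np.zeros(n)          # accumulated influence sum_t alpha_t * w_t(x_i)
    alphas, stumps = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = float(np.sum(w[pred != y]))
        if err <= 0.0 or err >= 0.5:  # stop if the stump is perfect or too weak
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        mu += alpha * w               # influence accrued under this round's weights
        F += alpha * pred
        alphas.append(alpha)
        stumps.append(stump)
        z = -(y * F + C * mu)         # soft margin: C = 0 recovers plain AdaBoost
        w = np.exp(z - z.max())       # shift before exp for numerical stability
        w /= w.sum()
    return alphas, stumps

# Usage sketch: classify by the sign of the weighted vote.
# alphas, stumps = adaboost_reg(X_train, y_train, C=0.2)
# score = sum(a * s.predict(X_test) for a, s in zip(alphas, stumps))
# y_pred = np.sign(score)

With C = 0 the update reduces to standard AdaBoost; increasing C trades training margin on hard examples for a smoother fit, which is the soft-margin trade-off the abstract describes.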

Original language: English
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural information processing systems foundation
Pages: 564-570
Number of pages: 7
ISBN (Print): 0262112450, 9780262112451
Publication status: Published - 1999 Jan 1
Externally published: Yes
Event: 12th Annual Conference on Neural Information Processing Systems, NIPS 1998 - Denver, CO, United States
Duration: 1998 Nov 30 - 1998 Dec 5

Other

Other: 12th Annual Conference on Neural Information Processing Systems, NIPS 1998
Country: United States
City: Denver, CO
Period: 98/11/30 - 98/12/5

Fingerprint

  • Adaptive boosting
  • Quadratic programming
  • Support vector machines
  • Classifiers
  • Experiments

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Rätsch, G., Onoda, T., & Müller, K. (1999). Regularizing AdaBoost. In Advances in Neural Information Processing Systems (pp. 564-570). Neural information processing systems foundation.

BibTeX

@inproceedings{a69086517e254fe7b324eb26d9df37b6,
title = "Regularizing AdaBoost",
abstract = "Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low-noise cases. For noisy data, however, boosting will still try to enforce a hard margin and thereby give too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow for soft-margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft-margin classifier: the support vector machine.",
author = "Gunnar R{\"a}tsch and Takashi Onoda and Klaus M{\"u}ller",
year = "1999",
month = "1",
day = "1",
language = "English",
isbn = "0262112450",
pages = "564--570",
booktitle = "Advances in Neural Information Processing Systems",
publisher = "Neural information processing systems foundation",

}

RIS

TY - GEN

T1 - Regularizing AdaBoost

AU - Rätsch, Gunnar

AU - Onoda, Takashi

AU - Müller, Klaus

PY - 1999/1/1

Y1 - 1999/1/1

N2 - Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low-noise cases. For noisy data, however, boosting will still try to enforce a hard margin and thereby give too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow for soft-margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft-margin classifier: the support vector machine.

AB - Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low-noise cases. For noisy data, however, boosting will still try to enforce a hard margin and thereby give too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow for soft-margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft-margin classifier: the support vector machine.

UR - http://www.scopus.com/inward/record.url?scp=0001102148&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0001102148&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0001102148

SN - 0262112450

SN - 9780262112451

SP - 564

EP - 570

BT - Advances in Neural Information Processing Systems

PB - Neural information processing systems foundation

ER -