ν-Arc: Ensemble learning in the presence of outliers

G. Rätsch, B. Schölkopf, A. Smola, K. Müller, T. Onoda, S. Mika

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

AdaBoost and other ensemble methods have successfully been applied to a number of classification tasks, seemingly defying problems of overfitting. AdaBoost performs gradient descent in an error function with respect to the margin, asymptotically concentrating on the patterns which are hardest to learn. For very noisy problems, however, this can be disadvantageous. Indeed, theoretical analysis has shown that the margin distribution, as opposed to just the minimal margin, plays a crucial role in understanding this phenomenon. Loosely speaking, some outliers should be tolerated if this has the benefit of substantially increasing the margin on the remaining points. We propose a new boosting algorithm which allows for the possibility of a pre-specified fraction of points to lie in the margin area or even on the wrong side of the decision boundary.
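
Illustrative note: the sketch below is not the authors' ν-Arc implementation. It is a minimal NumPy illustration, on assumed toy data with assumed parameters, of the two ideas the abstract relies on: plain AdaBoost concentrating weight on the hardest points, and the ν-fraction of smallest normalized margins that a soft-margin variant would be allowed to tolerate.

```python
# Illustrative sketch only -- NOT the nu-Arc algorithm from the paper.
# Plain AdaBoost with decision stumps on noisy toy data, followed by a
# look at the normalized margin distribution and the nu-fraction of
# points with the smallest margins.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data with 5% label noise playing the role of outliers.
n = 200
X = rng.uniform(-1.0, 1.0, size=n)
y = np.where(X > 0, 1, -1)
flip = rng.choice(n, size=10, replace=False)
y[flip] *= -1

def stump_predict(X, thresh, sign):
    return sign * np.where(X > thresh, 1, -1)

def fit_stump(X, y, w):
    # Exhaustive search for the weighted-error-minimizing stump.
    best_err, best_thresh, best_sign = np.inf, 0.0, 1
    for thresh in np.unique(X):
        for sign in (1, -1):
            err = np.sum(w * (stump_predict(X, thresh, sign) != y))
            if err < best_err:
                best_err, best_thresh, best_sign = err, thresh, sign
    return best_err, best_thresh, best_sign

# AdaBoost: reweighting concentrates mass on hard-to-learn points.
T = 50
w = np.full(n, 1.0 / n)
stumps, alphas = [], []
for _ in range(T):
    err, thresh, sign = fit_stump(X, y, w)
    err = np.clip(err, 1e-12, 1 - 1e-12)
    alpha = 0.5 * np.log((1 - err) / err)
    w *= np.exp(-alpha * y * stump_predict(X, thresh, sign))
    w /= w.sum()
    stumps.append((thresh, sign))
    alphas.append(alpha)

# Normalized margins y * f(x) / sum_t alpha_t lie in [-1, 1].
f = sum(a * stump_predict(X, t, s) for a, (t, s) in zip(alphas, stumps))
margins = y * f / np.sum(alphas)

# A nu-style method tolerates the nu smallest margins instead of
# maximizing the minimal margin over every point, outliers included.
nu = 0.05
rho = np.quantile(margins, nu)
print(f"soft-margin level rho at nu={nu}: {rho:.3f}")
print(f"points with margin <= rho: {np.sum(margins <= rho)} of {n}")
```

In the paper's algorithm the tolerated fraction is pre-specified and enforced during training; the quantile computed after the fact here only visualizes the role that fraction plays.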

Original language: English
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural information processing systems foundation
Pages: 561-567
Number of pages: 7
ISBN (Print): 0262194503, 9780262194501
Publication status: Published - 2000 Jan 1
Externally published: Yes
Event: 13th Annual Neural Information Processing Systems Conference, NIPS 1999 - Denver, CO, United States
Duration: 1999 Nov 29 – 1999 Dec 4

Other

Other: 13th Annual Neural Information Processing Systems Conference, NIPS 1999
Country: United States
City: Denver, CO
Period: 99/11/29 – 99/12/4

Fingerprint

Adaptive boosting

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Rätsch, G., Schölkopf, B., Smola, A., Müller, K., Onoda, T., & Mika, S. (2000). ν-Arc: Ensemble learning in the presence of outliers. In Advances in Neural Information Processing Systems (pp. 561-567). Neural information processing systems foundation.

ν-Arc: Ensemble learning in the presence of outliers. / Rätsch, G.; Schölkopf, B.; Smola, A.; Müller, K.; Onoda, T.; Mika, S.

Advances in Neural Information Processing Systems. Neural information processing systems foundation, 2000. p. 561-567.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Rätsch, G, Schölkopf, B, Smola, A, Müller, K, Onoda, T & Mika, S 2000, ν-Arc: Ensemble learning in the presence of outliers. in Advances in Neural Information Processing Systems. Neural information processing systems foundation, pp. 561-567, 13th Annual Neural Information Processing Systems Conference, NIPS 1999, Denver, CO, United States, 99/11/29.
Rätsch G, Schölkopf B, Smola A, Müller K, Onoda T, Mika S. ν-Arc: Ensemble learning in the presence of outliers. In Advances in Neural Information Processing Systems. Neural information processing systems foundation. 2000. p. 561-567
Rätsch, G.; Schölkopf, B.; Smola, A.; Müller, K.; Onoda, T.; Mika, S. / ν-Arc: Ensemble learning in the presence of outliers. Advances in Neural Information Processing Systems. Neural information processing systems foundation, 2000. pp. 561-567
@inproceedings{b87f1f86d43043c18550650cbb79622e,
title = "ν-Arc: Ensemble learning in the presence of outliers",
abstract = "AdaBoost and other ensemble methods have successfully been applied to a number of classification tasks, seemingly defying problems of overfitting. AdaBoost performs gradient descent in an error function with respect to the margin, asymptotically concentrating on the patterns which are hardest to learn. For very noisy problems, however, this can be disadvantageous. Indeed, theoretical analysis has shown that the margin distribution, as opposed to just the minimal margin, plays a crucial role in understanding this phenomenon. Loosely speaking, some outliers should be tolerated if this has the benefit of substantially increasing the margin on the remaining points. We propose a new boosting algorithm which allows for the possibility of a pre-specified fraction of points to lie in the margin area or even on the wrong side of the decision boundary.",
author = "G. R{\"a}tsch and B. Sch{\"o}lkopf and A. Smola and K. M{\"u}ller and T. Onoda and S. Mika",
year = "2000",
month = "1",
day = "1",
language = "English",
isbn = "0262194503",
pages = "561--567",
booktitle = "Advances in Neural Information Processing Systems",
publisher = "Neural information processing systems foundation",

}

TY - GEN

T1 - ν-Arc

T2 - Ensemble learning in the presence of outliers

AU - Rätsch, G.

AU - Schölkopf, B.

AU - Smola, A.

AU - Müller, K.

AU - Onoda, T.

AU - Mika, S.

PY - 2000/1/1

Y1 - 2000/1/1

N2 - AdaBoost and other ensemble methods have successfully been applied to a number of classification tasks, seemingly defying problems of overfitting. AdaBoost performs gradient descent in an error function with respect to the margin, asymptotically concentrating on the patterns which are hardest to learn. For very noisy problems, however, this can be disadvantageous. Indeed, theoretical analysis has shown that the margin distribution, as opposed to just the minimal margin, plays a crucial role in understanding this phenomenon. Loosely speaking, some outliers should be tolerated if this has the benefit of substantially increasing the margin on the remaining points. We propose a new boosting algorithm which allows for the possibility of a pre-specified fraction of points to lie in the margin area or even on the wrong side of the decision boundary.

AB - AdaBoost and other ensemble methods have successfully been applied to a number of classification tasks, seemingly defying problems of overfitting. AdaBoost performs gradient descent in an error function with respect to the margin, asymptotically concentrating on the patterns which are hardest to learn. For very noisy problems, however, this can be disadvantageous. Indeed, theoretical analysis has shown that the margin distribution, as opposed to just the minimal margin, plays a crucial role in understanding this phenomenon. Loosely speaking, some outliers should be tolerated if this has the benefit of substantially increasing the margin on the remaining points. We propose a new boosting algorithm which allows for the possibility of a pre-specified fraction of points to lie in the margin area or even on the wrong side of the decision boundary.

UR - http://www.scopus.com/inward/record.url?scp=0008161172&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0008161172&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0008161172

SN - 0262194503

SN - 9780262194501

SP - 561

EP - 567

BT - Advances in Neural Information Processing Systems

PB - Neural information processing systems foundation

ER -