Abstract
This paper provides an introduction to support vector machines (SVMs), kernel Fisher discriminant analysis, and kernel principal component analysis (PCA), as examples of successful kernel-based learning methods. We first give a short background on Vapnik-Chervonenkis (VC) theory and kernel feature spaces, and then proceed to kernel-based learning in supervised and unsupervised scenarios, including practical and algorithmic considerations. Finally, we illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition (OCR) and DNA analysis.
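The abstract lists kernel PCA among the surveyed methods. As a hedged illustration only (this is not code from the paper), the following NumPy sketch computes an RBF kernel matrix, centers it in feature space, and projects the data onto the leading kernel principal components; the function names `rbf_kernel` and `kernel_pca` and the choice of `gamma` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix in feature space
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition (eigh returns eigenvalues in ascending order)
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
    # Projections of the training points onto the kernel principal components
    return eigvecs * np.sqrt(np.maximum(eigvals, 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    Z = kernel_pca(X, n_components=2, gamma=0.5)
    print(Z.shape)  # (100, 2)
```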
Original language | English |
---|---|
Pages (from-to) | 181-201 |
Number of pages | 21 |
Journal | IEEE Transactions on Neural Networks |
Volume | 12 |
Issue number | 2 |
DOIs | 10.1109/72.914517 |
Publication status | Published - 2001 Mar 1 |
Externally published | Yes |
Keywords
- Boosting
- Fisher's discriminant
- Kernel methods
- Kernel PCA
- Mathematical programming machines
- Mercer kernels
- Principal component analysis (PCA)
- Single-class classification
- Support vector machines (SVMs)
ASJC Scopus subject areas
- Control and Systems Engineering
- Theoretical Computer Science
- Electrical and Electronic Engineering
- Artificial Intelligence
- Computational Theory and Mathematics
- Hardware and Architecture
Cite this
An introduction to kernel-based learning algorithms. / Müller, Klaus-Robert; Mika, Sebastian; Rätsch, Gunnar; Tsuda, Koji; Schölkopf, Bernhard.
In: IEEE Transactions on Neural Networks, Vol. 12, No. 2, 01.03.2001, p. 181-201.
Research output: Contribution to journal › Article
TY - JOUR
T1 - An introduction to kernel-based learning algorithms
AU - Müller, Klaus-Robert
AU - Mika, Sebastian
AU - Rätsch, Gunnar
AU - Tsuda, Koji
AU - Schölkopf, Bernhard
PY - 2001/3/1
Y1 - 2001/3/1
N2 - This paper provides an introduction to support vector machines (SVMs), kernel Fisher discriminant analysis, and kernel principal component analysis (PCA), as examples of successful kernel-based learning methods. We first give a short background on Vapnik-Chervonenkis (VC) theory and kernel feature spaces, and then proceed to kernel-based learning in supervised and unsupervised scenarios, including practical and algorithmic considerations. Finally, we illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition (OCR) and DNA analysis.
AB - This paper provides an introduction to support vector machines (SVMs), kernel Fisher discriminant analysis, and kernel principal component analysis (PCA), as examples of successful kernel-based learning methods. We first give a short background on Vapnik-Chervonenkis (VC) theory and kernel feature spaces, and then proceed to kernel-based learning in supervised and unsupervised scenarios, including practical and algorithmic considerations. Finally, we illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition (OCR) and DNA analysis.
KW - Boosting
KW - Fisher's discriminant
KW - Kernel methods
KW - Kernel PCA
KW - Mathematical programming machines
KW - Mercer kernels
KW - Principal component analysis (PCA)
KW - Single-class classification
KW - Support vector machines (SVMs)
UR - http://www.scopus.com/inward/record.url?scp=0035272287&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0035272287&partnerID=8YFLogxK
U2 - 10.1109/72.914517
DO - 10.1109/72.914517
M3 - Article
C2 - 18244377
AN - SCOPUS:0035272287
VL - 12
SP - 181
EP - 201
JO - IEEE Transactions on Neural Networks
JF - IEEE Transactions on Neural Networks
SN - 1045-9227
IS - 2
ER -