An efficient radius-incorporated MKL algorithm for Alzheimer's disease prediction

Xinwang Liu, Luping Zhou, Lei Wang, Jian Zhang, Jianping Yin, Dinggang Shen

Research output: Contribution to journal › Article

9 Citations (Scopus)

Abstract

Integrating multi-source information has recently shown promising performance in predicting Alzheimer's disease (AD). Multiple kernel learning (MKL) plays an important role in this regard by learning the combination weights of a set of base kernels via the principle of margin maximisation. The latest research on MKL further incorporates the radius of the minimum enclosing ball (MEB) of the training data to improve the kernel learning performance. However, we observe that directly applying these radius-incorporated MKL algorithms to AD prediction tasks does not necessarily improve, and can even degrade, the prediction accuracy. In this paper, we propose an improved radius-incorporated MKL algorithm for AD prediction. First, we redesign the objective function by approximating the radius of the MEB with its upper bound, a linear function of the kernel weights. This approximation makes the resulting optimisation problem convex and globally solvable. Second, instead of using cross-validation, we model the regularisation parameter C of the SVM classifier as an extra kernel weight and automatically tune it in MKL. Third, we theoretically show that our algorithm can be reformulated into a form similar to the SimpleMKL algorithm and conveniently solved with off-the-shelf packages. We discuss the factors that contribute to the improved performance and apply our algorithm to discriminate between different clinical groups in the benchmark ADNI data set. As experimentally demonstrated, our algorithm better utilises the radius information and achieves higher prediction accuracy than comparable MKL methods in the literature. In addition, our algorithm demonstrates the highest computational efficiency among all the compared methods.
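
To make the three technical steps above more concrete, the LaTeX sketch below spells out one plausible reading of them. It is an illustration based only on the abstract, not the paper's exact formulation; the base kernels K_m, their MEB radii R_m, the kernel weights \mu_m, the number of base kernels M, and the feature map \phi_\mu are symbols introduced here for that purpose.

% Combined kernel with non-negative weights summing to one
% (all symbols below are illustrative, not taken from the paper):
K(\mu) = \sum_{m=1}^{M} \mu_m K_m, \qquad \mu_m \ge 0, \quad \sum_{m=1}^{M} \mu_m = 1

% The squared MEB radius of the combined kernel admits a linear upper bound
% in the kernel weights, where R_m is the MEB radius of base kernel K_m alone;
% this is the "upper bound, a linear function of the kernel weights":
R^2\bigl(K(\mu)\bigr) \le \sum_{m=1}^{M} \mu_m R_m^2

% A radius-margin style objective with the bound substituted for R^2
% (hard-margin form; the abstract states that this substitution makes the
% resulting problem convex and globally solvable):
\min_{\mu}\ \min_{w,\, b}\ \frac{1}{2} \Bigl( \sum_{m=1}^{M} \mu_m R_m^2 \Bigr) \lVert w \rVert^2
\quad \text{s.t.} \quad y_i \bigl( w^\top \phi_\mu(x_i) + b \bigr) \ge 1, \quad i = 1, \dots, n

% One standard way to treat C as an extra kernel weight: the 2-norm
% soft-margin SVM with parameter C is equivalent to a hard-margin SVM on
% K + (1/C) I, so appending an identity base kernel K_{M+1} = I and learning
% its weight folds the tuning of C into the MKL problem itself.

Written this way, the radius term is linear in \mu, which is consistent with the abstract's claims that the resulting problem is convex and can be rewritten in a SimpleMKL-style form, i.e. an outer update of \mu wrapped around a standard SVM solve.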

Original language: English
Pages (from-to): 2141-2150
Number of pages: 10
Journal: Pattern Recognition
Volume: 48
Issue number: 7
ISSN: 0031-3203
Publisher: Elsevier Limited
DOI: 10.1016/j.patcog.2014.12.007
Publication status: Published - 2015 Jul 1

Fingerprint

  • Learning algorithms
  • Computational efficiency
  • Classifiers

Keywords

  • Alzheimer's disease
  • Multiple kernel learning
  • Neuroimaging
  • Radius-margin bound
  • Support vector machines

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing

Cite this

An efficient radius-incorporated MKL algorithm for Alzheimer's disease prediction. / Liu, Xinwang; Zhou, Luping; Wang, Lei; Zhang, Jian; Yin, Jianping; Shen, Dinggang.

In: Pattern Recognition, Vol. 48, No. 7, 01.07.2015, p. 2141-2150.
