Dimensionality reduction based on ICA for regression problems

Nojun Kwak, Chunghoon Kim, Hwangnam Kim

Research output: Contribution to journal › Article

17 Citations (Scopus)

Abstract

In manipulating data, such as in supervised learning, we often extract new features from the original input variables for the purpose of reducing the dimensions of the input space and achieving better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become applicable to the task of dimensionality reduction for regression problems by maximizing the joint mutual information between the target variable and the new attributes. We applied the proposed method to a couple of real-world regression problems as well as some artificial problems and compared its performance with that of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimension of the input space without degrading the regression performance.
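The abstract describes the approach only at a high level. The snippet below is a minimal, numpy-only sketch of the general pipeline it implies (whiten the inputs, run FastICA, then keep the components most related to the target). It is not the authors' algorithm: the paper maximizes the joint mutual information between target and attributes during extraction, whereas here a simple absolute-correlation score is used as a crude stand-in, and all names (`fastica_deflation`, the toy data) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 5 mixed inputs, target driven by one hidden source.
n, d = 1000, 5
S = rng.laplace(size=(n, d))                 # independent non-Gaussian sources
A = rng.normal(size=(d, d))                  # random mixing matrix
X = S @ A.T                                  # observed input variables
y = 2.0 * S[:, 0] + 0.1 * rng.normal(size=n)  # target depends on source 0

# 1) Center and whiten X (standard ICA preprocessing).
Xc = X - X.mean(axis=0)
eigval, eigvec = np.linalg.eigh(Xc.T @ Xc / n)
Z = Xc @ (eigvec @ np.diag(eigval ** -0.5) @ eigvec.T)

# 2) FastICA with deflation (tanh nonlinearity) to estimate unmixing rows.
def fastica_deflation(Z, n_comp, n_iter=200):
    dim = Z.shape[1]
    W = np.zeros((n_comp, dim))
    for i in range(n_comp):
        w = rng.normal(size=dim)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wz = Z @ w
            g, g_prime = np.tanh(wz), 1.0 - np.tanh(wz) ** 2
            w_new = (Z * g[:, None]).mean(axis=0) - g_prime.mean() * w
            # Deflate: remove projections onto previously found components.
            w_new -= W[:i].T @ (W[:i] @ w_new)
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-8
            w = w_new
            if converged:
                break
        W[i] = w
    return W

W = fastica_deflation(Z, n_comp=d)
components = Z @ W.T                         # estimated independent components

# 3) Rank components by |correlation| with the target -- a crude proxy for
#    the mutual-information criterion used in the paper -- and keep the top 2.
scores = np.abs([np.corrcoef(components[:, j], y)[0, 1] for j in range(d)])
X_reduced = components[:, np.argsort(scores)[::-1][:2]]
print(X_reduced.shape)                       # prints (1000, 2)
```

The reduced matrix `X_reduced` would then feed any standard regressor; the paper's contribution is a principled, mutual-information-based criterion for this selection rather than the correlation heuristic shown here.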

Original language: English
Pages (from-to): 2596-2603
Number of pages: 8
Journal: Neurocomputing
Volume: 71
Issue number: 13-15
DOI: 10.1016/j.neucom.2007.11.036
Publication status: Published - 2008 Aug 1


Keywords

  • Dimensionality reduction
  • Feature extraction
  • ICA
  • Regression

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Cognitive Neuroscience

Cite this

Dimensionality reduction based on ICA for regression problems. / Kwak, Nojun; Kim, Chunghoon; Kim, Hwangnam.

In: Neurocomputing, Vol. 71, No. 13-15, 01.08.2008, p. 2596-2603.


Kwak, Nojun ; Kim, Chunghoon ; Kim, Hwangnam. / Dimensionality reduction based on ICA for regression problems. In: Neurocomputing. 2008 ; Vol. 71, No. 13-15. pp. 2596-2603.
@article{87cae3543d1147299250e15292228992,
title = "Dimensionality reduction based on ICA for regression problems",
abstract = "In manipulating data such as in supervised learning, we often extract new features from the original input variables for the purpose of reducing the dimensions of input space and achieving better performances. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become available to a task of dimensionality reduction for regression problems by maximizing the joint mutual information between target variable and new attributes. We applied the proposed method to a couple of real world regression problems as well as some artificial problems and compared the performances with those of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimension of input space without degrading the regression performance.",
keywords = "Dimensionality reduction, Feature extraction, ICA, Regression",
author = "Nojun Kwak and Chunghoon Kim and Hwangnam Kim",
year = "2008",
month = "8",
day = "1",
doi = "10.1016/j.neucom.2007.11.036",
language = "English",
volume = "71",
pages = "2596--2603",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier",
number = "13-15",

}

TY - JOUR

T1 - Dimensionality reduction based on ICA for regression problems

AU - Kwak, Nojun

AU - Kim, Chunghoon

AU - Kim, Hwangnam

PY - 2008/8/1

Y1 - 2008/8/1

N2 - In manipulating data such as in supervised learning, we often extract new features from the original input variables for the purpose of reducing the dimensions of input space and achieving better performances. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become available to a task of dimensionality reduction for regression problems by maximizing the joint mutual information between target variable and new attributes. We applied the proposed method to a couple of real world regression problems as well as some artificial problems and compared the performances with those of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimension of input space without degrading the regression performance.

AB - In manipulating data such as in supervised learning, we often extract new features from the original input variables for the purpose of reducing the dimensions of input space and achieving better performances. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become available to a task of dimensionality reduction for regression problems by maximizing the joint mutual information between target variable and new attributes. We applied the proposed method to a couple of real world regression problems as well as some artificial problems and compared the performances with those of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimension of input space without degrading the regression performance.

KW - Dimensionality reduction

KW - Feature extraction

KW - ICA

KW - Regression

UR - http://www.scopus.com/inward/record.url?scp=56449100423&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=56449100423&partnerID=8YFLogxK

U2 - 10.1016/j.neucom.2007.11.036

DO - 10.1016/j.neucom.2007.11.036

M3 - Article

AN - SCOPUS:56449100423

VL - 71

SP - 2596

EP - 2603

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

IS - 13-15

ER -