Decision tree decomposition-based complex feature selection for text chunking

Young Sook Hwang, Hae-Chang Rim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Incorporating feature selection into a classification model often provides a number of advantages. In this paper we propose a new feature selection method that takes a discriminative perspective, aiming to improve classification accuracy. The method is developed for a classification model for text chunking. For effective feature selection, we utilize a decision tree as an intermediate feature space inducer. To select a more compact feature set with less computational load, we organize a partially ordered feature space according to the information gain ratio (IGR) distribution of the features. Experimental results show that: (1) the computational complexity of working in a high-dimensional feature space can be reduced by selecting features based on the decision tree decomposition; and (2) the text chunking system using the proposed feature selection significantly improves performance compared with a decision tree classifier.
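The core idea described in the abstract (rank candidate features by information gain ratio, grow a decision tree over them, and read conjunctive "complex features" off root-to-leaf paths) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the toy chunking data, the min_igr threshold, and helper names such as info_gain_ratio and decompose are assumptions made for this example.

# A minimal sketch of IGR-ranked feature selection via decision tree decomposition.
# Features are scored by information gain ratio, a small tree is grown with the
# highest-IGR feature at each split, and each root-to-leaf path is read off as a
# conjunctive "complex feature". Data and thresholds are illustrative only.
from collections import Counter
from math import log2


def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())


def info_gain_ratio(rows, labels, feature):
    """Information gain ratio of a nominal feature over (row, label) pairs."""
    n = len(rows)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[feature], []).append(y)
    remainder = sum(len(part) / n * entropy(part) for part in partitions.values())
    gain = entropy(labels) - remainder
    split_info = entropy([row[feature] for row in rows])
    return gain / split_info if split_info > 0 else 0.0


def decompose(rows, labels, features, path=(), min_igr=0.01):
    """Grow a tiny IGR-based tree; yield each root-to-leaf path as a complex feature."""
    scored = [(info_gain_ratio(rows, labels, f), f) for f in features]
    scored = [(s, f) for s, f in scored if s >= min_igr]
    if not scored or len(set(labels)) == 1:
        yield path, Counter(labels).most_common(1)[0][0]
        return
    best = max(scored)[1]  # split on the highest-IGR feature
    rest = [f for f in features if f != best]
    for value in sorted({row[best] for row in rows}):
        sub = [(r, y) for r, y in zip(rows, labels) if r[best] == value]
        sub_rows, sub_labels = zip(*sub)
        yield from decompose(list(sub_rows), list(sub_labels), rest,
                             path + ((best, value),), min_igr)


# Toy chunking-style examples: each row maps context features to values,
# the label is the IOB-style chunk tag of the focus word.
rows = [
    {"pos_0": "DT", "pos_-1": "VB", "word_0": "the"},
    {"pos_0": "NN", "pos_-1": "DT", "word_0": "dog"},
    {"pos_0": "VB", "pos_-1": "NN", "word_0": "runs"},
    {"pos_0": "NN", "pos_-1": "JJ", "word_0": "park"},
]
labels = ["B-NP", "I-NP", "B-VP", "I-NP"]

for conj, tag in decompose(rows, labels, ["pos_0", "pos_-1", "word_0"]):
    print(" AND ".join(f"{f}={v}" for f, v in conj) or "(root)", "->", tag)

In the paper's setting, such conjunctions would presumably be passed on as features of the downstream text chunking classifier rather than used as a classifier on their own; only the decomposition step is illustrated here.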

Original language: English
Title of host publication: ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2217-2222
Number of pages: 6
Volume: 5
ISBN (Electronic): 9810475241, 9789810475246
DOI: 10.1109/ICONIP.2002.1201887
Publication status: Published - 2002
Event: 9th International Conference on Neural Information Processing, ICONIP 2002 - Singapore, Singapore
Duration: 2002 Nov 18 - 2002 Nov 22

Other

Other: 9th International Conference on Neural Information Processing, ICONIP 2002
Country: Singapore
City: Singapore
Period: 02/11/18 - 02/11/22

Fingerprint

  • Decision trees
  • Feature extraction
  • Decomposition
  • Computational complexity
  • Classifiers

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Hwang, Y. S., & Rim, H-C. (2002). Decision tree decomposition-based complex feature selection for text chunking. In ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age (Vol. 5, pp. 2217-2222). [1201887] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICONIP.2002.1201887
