LVQ combined with simulated annealing for optimal design of large-set reference models

Hee Heon Song, Seong Whan Lee

Research output: Contribution to journal › Article

23 Citations (Scopus)

Abstract

Learning Vector Quantization (LVQ) has been studied intensively since 1986 as a means of generating good reference models for pattern recognition, and it has some attractive theoretical properties. However, the design of reference models based on LVQ suffers from several major drawbacks in the recognition of large-set patterns, where good reference models play an important role in achieving high performance. These drawbacks stem largely from the following facts: (1) LVQ may not generate good reference models if the initial values of the reference models lie outside the convex hull of the input data; (2) it cannot guarantee optimal reference models because of the strategy by which new reference models are accepted at each iteration step; and (3) it is prone to overtraining. In this paper, we first discuss the impact of these problems. We then propose a new method for the optimal design of large-set reference models that combines an improved LVQ3 with Simulated Annealing, a technique proven useful in many areas of optimization. Experimental results on large-set handwritten characters show that the proposed method is superior to the conventional averaging-based method and to other LVQ-based methods.
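The abstract's drawback (2) concerns LVQ's greedy acceptance of every codebook update, which simulated annealing addresses by occasionally accepting worse codebooks. The following is an illustrative sketch only, not the authors' improved-LVQ3 algorithm: it pairs a basic LVQ1 update with a standard simulated-annealing acceptance rule over whole-epoch codebook candidates. All function names, parameters, and the cooling schedule are assumptions for illustration.

```python
import math
import random

def nearest(prototypes, x):
    # Index of the prototype (weight, label) closest to x in Euclidean distance.
    return min(range(len(prototypes)),
               key=lambda i: sum((p - q) ** 2 for p, q in zip(prototypes[i][0], x)))

def lvq1_step(prototypes, x, label, lr):
    # Basic LVQ1 rule: move the winner toward x if labels match, away otherwise.
    i = nearest(prototypes, x)
    w, c = prototypes[i]
    sign = 1.0 if c == label else -1.0
    prototypes[i] = (tuple(p + sign * lr * (q - p) for p, q in zip(w, x)), c)

def error_rate(prototypes, data):
    wrong = sum(1 for x, y in data if prototypes[nearest(prototypes, x)][1] != y)
    return wrong / len(data)

def lvq_sa(data, prototypes, t0=1.0, cooling=0.9, epochs=30, lr=0.1, seed=0):
    # One epoch of LVQ1 updates produces a candidate codebook; a simulated-
    # annealing rule decides whether to accept it, so a worse codebook can
    # still be accepted early on (high temperature) to escape local optima.
    rng = random.Random(seed)
    current = list(prototypes)
    cur_err = error_rate(current, data)
    best = (list(current), cur_err)
    t = t0
    for _ in range(epochs):
        candidate = list(current)
        for x, y in data:
            lvq1_step(candidate, x, y, lr)
        new_err = error_rate(candidate, data)
        # Accept improvements always; accept a worse codebook with
        # probability exp(-(increase in error) / temperature).
        if new_err <= cur_err or rng.random() < math.exp(-(new_err - cur_err) / t):
            current, cur_err = candidate, new_err
        if cur_err < best[1]:
            best = (list(current), cur_err)
        t *= cooling  # geometric cooling schedule
    return best
```

Because the best codebook seen so far is always retained, the returned error rate can never exceed that of the initial codebook, which is the property the acceptance strategy of plain LVQ does not provide.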

Original language: English
Pages (from-to): 329-336
Number of pages: 8
Journal: Neural Networks
Volume: 9
Issue number: 2
DOI: 10.1016/0893-6080(95)00022-4
Publication status: Published - 1996 Mar 1

Keywords

  • large-set pattern recognition
  • learning vector quantization
  • optimal design of large-set reference models
  • simulated annealing
  • vector quantization

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience (all)

Cite this

LVQ combined with simulated annealing for optimal design of large-set reference models. / Song, Hee Heon; Lee, Seong Whan.

In: Neural Networks, Vol. 9, No. 2, 01.03.1996, p. 329-336.

Research output: Contribution to journal › Article

@article{13a17ba0ced445c7a453508241634e99,
title = "LVQ combined with simulated annealing for optimal design of large-set reference models",
abstract = "Learning Vector Quantization (LVQ) has been intensively studied to generate good reference models in pattern recognition since 1986, and it has some nice theoretical properties. However, the design of reference models based on LVQ suffers from several major drawbacks for the recognition of large-set patterns, in which good reference models play an important role in achieving high performance. They are due in large part to the following facts. (1) It may not generate good reference models, if the initial values of the reference models are outside the convex hull of the input data, (2) it cannot guarantee optimal reference models due to the strategy to accept new reference models in each iteration step, and (3) it is apt to get stuck at overtraining phenomenon. In this paper, we first discuss the impact of these problems. And then, to cope with these, we propose a new method for the optimal design of large-set reference models using an improved LVQ3 combined with Simulated Annealing which has been proven to be a useful technique in many areas of optimization problems. Experimental results with large-set handwritten characters reveal that the proposed method is superior to the conventional method based on averaging and other LVQ-based methods.",
keywords = "large-set pattern recognition, learning vector quantization, optimal design of large-set reference models, simulated annealing, vector quantization",
author = "Song, {Hee Heon} and Lee, {Seong Whan}",
year = "1996",
month = "3",
day = "1",
doi = "10.1016/0893-6080(95)00022-4",
language = "English",
volume = "9",
pages = "329--336",
journal = "Neural Networks",
issn = "0893-6080",
publisher = "Elsevier Limited",
number = "2",

}

TY - JOUR

T1 - LVQ combined with simulated annealing for optimal design of large-set reference models

AU - Song, Hee Heon

AU - Lee, Seong Whan

PY - 1996/3/1

Y1 - 1996/3/1

N2 - Learning Vector Quantization (LVQ) has been intensively studied to generate good reference models in pattern recognition since 1986, and it has some nice theoretical properties. However, the design of reference models based on LVQ suffers from several major drawbacks for the recognition of large-set patterns, in which good reference models play an important role in achieving high performance. They are due in large part to the following facts. (1) It may not generate good reference models, if the initial values of the reference models are outside the convex hull of the input data, (2) it cannot guarantee optimal reference models due to the strategy to accept new reference models in each iteration step, and (3) it is apt to get stuck at overtraining phenomenon. In this paper, we first discuss the impact of these problems. And then, to cope with these, we propose a new method for the optimal design of large-set reference models using an improved LVQ3 combined with Simulated Annealing which has been proven to be a useful technique in many areas of optimization problems. Experimental results with large-set handwritten characters reveal that the proposed method is superior to the conventional method based on averaging and other LVQ-based methods.

AB - Learning Vector Quantization (LVQ) has been intensively studied to generate good reference models in pattern recognition since 1986, and it has some nice theoretical properties. However, the design of reference models based on LVQ suffers from several major drawbacks for the recognition of large-set patterns, in which good reference models play an important role in achieving high performance. They are due in large part to the following facts. (1) It may not generate good reference models, if the initial values of the reference models are outside the convex hull of the input data, (2) it cannot guarantee optimal reference models due to the strategy to accept new reference models in each iteration step, and (3) it is apt to get stuck at overtraining phenomenon. In this paper, we first discuss the impact of these problems. And then, to cope with these, we propose a new method for the optimal design of large-set reference models using an improved LVQ3 combined with Simulated Annealing which has been proven to be a useful technique in many areas of optimization problems. Experimental results with large-set handwritten characters reveal that the proposed method is superior to the conventional method based on averaging and other LVQ-based methods.

KW - large-set pattern recognition

KW - learning vector quantization

KW - optimal design of large-set reference models

KW - simulated annealing

KW - vector quantization

UR - http://www.scopus.com/inward/record.url?scp=0030111094&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030111094&partnerID=8YFLogxK

U2 - 10.1016/0893-6080(95)00022-4

DO - 10.1016/0893-6080(95)00022-4

M3 - Article

AN - SCOPUS:0030111094

VL - 9

SP - 329

EP - 336

JO - Neural Networks

JF - Neural Networks

SN - 0893-6080

IS - 2

ER -