TY - GEN

T1 - Hierarchical linear discriminant analysis for beamforming

AU - Choo, Jaegul

AU - Drake, Barry L.

AU - Park, Haesun

N1 - Copyright:
Copyright 2010 Elsevier B.V., All rights reserved.

PY - 2009

Y1 - 2009

N2 - This paper demonstrates the applicability of the recently proposed supervised dimension reduction method, hierarchical linear discriminant analysis (h-LDA), to a well-known spatial localization technique in signal processing, beamforming. The main motivation of h-LDA is to overcome the drawback of LDA that each cluster is modeled as a unimodal Gaussian distribution. For this purpose, h-LDA extends the variance decomposition in LDA to the subcluster level and modifies the definition of the within-cluster scatter matrix. In this paper, we present an efficient h-LDA algorithm for oversampled data, where the number of data samples is larger than the data dimension. The new algorithm utilizes the Cholesky decomposition based on a generalized singular value decomposition framework. Furthermore, we analyze the data model of h-LDA by relating it to the two-way multivariate analysis of variance (MANOVA), which fits well within the context of beamforming applications. Although beamforming has generally been treated as a regression problem, we propose a novel way of viewing beamforming as a classification problem and apply supervised dimension reduction, which allows the classifier to achieve better accuracy. Our experimental results show that h-LDA outperforms several dimension reduction methods, such as LDA and kernel discriminant analysis, as well as regression approaches, such as regularized least squares and kernelized support vector regression.

UR - http://www.scopus.com/inward/record.url?scp=72749123797&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=72749123797&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:72749123797

SN - 9781615671090

T3 - Society for Industrial and Applied Mathematics - 9th SIAM International Conference on Data Mining 2009, Proceedings in Applied Mathematics

SP - 889

EP - 900

BT - Society for Industrial and Applied Mathematics - 9th SIAM International Conference on Data Mining 2009, Proceedings in Applied Mathematics 133

T2 - 9th SIAM International Conference on Data Mining 2009, SDM 2009

Y2 - 30 April 2009 through 2 May 2009

ER -