### Abstract

Canonical correlation analysis (CCA) is a widely used statistical technique to capture correlations between two sets of multivariate random variables and has found a multitude of applications in computer vision, medical imaging, and machine learning. The classical formulation assumes that the data live in a pair of vector spaces, which makes its use in certain important scientific domains problematic. For instance, symmetric positive definite (SPD) matrices, rotations, and probability distributions all belong to certain curved Riemannian manifolds where vector-space operations are in general not applicable. Analyzing such data via the classical versions of inference models is suboptimal. Using the space of SPD matrices as a concrete example, we present a principled generalization of the well-known CCA to the Riemannian setting. Our CCA algorithm operates on the product Riemannian manifold representing SPD matrix-valued fields to identify meaningful correlations. As a proof of principle, we present experimental results on a neuroimaging data set to show the applicability of these ideas.
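The abstract's premise can be illustrated with a common tangent-space (log-Euclidean) baseline: map each SPD matrix through the matrix logarithm into a flat vector space, then run classical CCA there. This is *not* the chapter's Riemannian algorithm, only a minimal sketch of the idea it builds on; all function names and the synthetic data are illustrative.

```python
import numpy as np

def random_spd(n, rng):
    # A A^T + n I is symmetric positive definite by construction
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

def spd_log(S):
    # matrix logarithm of an SPD matrix via eigendecomposition
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def vec_sym(M):
    # vectorize the upper triangle (a chart for symmetric matrices;
    # CCA is invariant to invertible linear maps, so scaling is immaterial)
    i, j = np.triu_indices(M.shape[0])
    return M[i, j]

def cca(X, Y, k=2, reg=1e-6):
    # classical CCA: canonical correlations are the singular values of
    # Cxx^{-1/2} Cxy Cyy^{-1/2} computed on centered data
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Cxx = Xc.T @ Xc / len(X) + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / len(Y) + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / len(X)
    def invsqrt(C):
        w, V = np.linalg.eigh(C)
        return (V / np.sqrt(w)) @ V.T
    s = np.linalg.svd(invsqrt(Cxx) @ Cxy @ invsqrt(Cyy), compute_uv=False)
    return s[:k]

rng = np.random.default_rng(0)
n, N = 3, 200
# two "views": log-mapped SPD samples, and a noisy linear transform of them
X = np.stack([vec_sym(spd_log(random_spd(n, rng))) for _ in range(N)])
Y = X @ rng.standard_normal((X.shape[1], X.shape[1])) + 0.1 * rng.standard_normal(X.shape)
corrs = cca(X, Y)
print(corrs)  # leading canonical correlation should be close to 1 here
```

The log map flattens the curvature at a single reference point, which is exactly the approximation the chapter's manifold-native formulation avoids.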

Original language | English
---|---
Title of host publication | Riemannian Computing in Computer Vision
Publisher | Springer International Publishing
Pages | 69-100
Number of pages | 32
ISBN (Electronic) | 9783319229577
ISBN (Print) | 9783319229560
DOIs | https://doi.org/10.1007/978-3-319-22957-7_4
Publication status | Published - 2015 Jan 1
Externally published | Yes

### ASJC Scopus subject areas

- Engineering(all)
- Computer Science(all)
- Mathematics(all)

### Cite this

Kim, H. W., Adluru, N., Bendlin, B. B., Johnson, S. C., Singh, V., & Vemuri, B. C. (2015). Canonical correlation analysis on SPD(n) manifolds. In *Riemannian Computing in Computer Vision* (pp. 69-100). Springer International Publishing. https://doi.org/10.1007/978-3-319-22957-7_4

Research output: Chapter in Book/Report/Conference proceeding › Chapter

TY - CHAP

T1 - Canonical correlation analysis on spd(n) manifolds

AU - Kim, Hyun Woo

AU - Adluru, Nagesh

AU - Bendlin, Barbara B.

AU - Johnson, Sterling C.

AU - Singh, Vikas

AU - Vemuri, Baba C.

PY - 2015/1/1

Y1 - 2015/1/1

AB - Canonical correlation analysis (CCA) is a widely used statistical technique to capture correlations between two sets of multivariate random variables and has found a multitude of applications in computer vision, medical imaging, and machine learning. The classical formulation assumes that the data live in a pair of vector spaces which makes its use in certain important scientific domains problematic. For instance, the set of symmetric positive definite matrices (SPD), rotations, and probability distributions all belong to certain curved Riemannian manifolds where vector-space operations are in general not applicable. Analyzing the space of such data via the classical versions of inference models is suboptimal. Using the space of SPD matrices as a concrete example, we present a principled generalization of the well known CCA to the Riemannian setting. Our CCA algorithm operates on the product Riemannian manifold representing SPD matrix-valued fields to identify meaningful correlations. As a proof of principle, we present experimental results on a neuroimaging data set to show the applicability of these ideas.

UR - http://www.scopus.com/inward/record.url?scp=84957014689&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84957014689&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-22957-7_4

DO - 10.1007/978-3-319-22957-7_4

M3 - Chapter

AN - SCOPUS:84957014689

SN - 9783319229560

SP - 69

EP - 100

BT - Riemannian Computing in Computer Vision

PB - Springer International Publishing

ER -