### Abstract

Canonical correlation analysis (CCA) is a well-established linear subspace method for extracting hidden sources common to two or more datasets. The benefits of CCA are documented in applications such as dimensionality reduction, blind source separation, classification, and data fusion. However, standard CCA does not exploit the geometry of the common sources, which may be deduced from (cross-)correlations or inferred from the data. In this context, the prior information provided by the common source is encoded here through a graph and employed as a CCA regularizer. This leads to what is termed here graph CCA (gCCA), which accounts for the graph-induced knowledge of the common sources while maximizing the linear correlation between the canonical variables. For settings where the dimensionality of the data vectors is high relative to their number, a dual formulation of the novel gCCA is also developed. Tests on two real datasets for facial image classification showcase the merits of the proposed approaches relative to competing alternatives.
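The abstract does not spell out the paper's exact formulation, but the core idea it describes (two-view CCA solved as a generalized eigenvalue problem, with a graph Laplacian encoding prior knowledge of the common source acting as a regularizer) can be sketched as follows. This is a minimal illustration under assumed choices: the Laplacian penalty is folded Tikhonov-style into the view autocovariances, and the names `graph_cca`, `gamma`, and `reg` are hypothetical, not from the paper.

```python
import numpy as np
from scipy.linalg import eigh

def graph_cca(X, Y, L, gamma=0.1, reg=1e-3):
    """Sketch of graph-regularized two-view CCA (not the paper's exact algorithm).

    X : (dx, n) and Y : (dy, n) data matrices with samples as columns.
    L : (n, n) graph Laplacian encoding prior similarity among samples.
    gamma : weight of the graph penalty (assumed hyperparameter).
    reg : small ridge term keeping the autocovariances positive definite.
    """
    n = X.shape[1]
    # Center each view.
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    # Cross-covariance and graph-regularized autocovariances.
    Sxy = X @ Y.T / n
    Sxx = X @ X.T / n + gamma * (X @ L @ X.T) / n + reg * np.eye(X.shape[0])
    Syy = Y @ Y.T / n + gamma * (Y @ L @ Y.T) / n + reg * np.eye(Y.shape[0])
    # Generalized eigenvalue problem:
    #   [0 Sxy; Syx 0] [wx; wy] = rho [Sxx 0; 0 Syy] [wx; wy]
    dx, dy = X.shape[0], Y.shape[0]
    A = np.zeros((dx + dy, dx + dy))
    A[:dx, dx:] = Sxy
    A[dx:, :dx] = Sxy.T
    B = np.zeros_like(A)
    B[:dx, :dx] = Sxx
    B[dx:, dx:] = Syy
    vals, vecs = eigh(A, B)  # ascending eigenvalues; last is the top pair
    wx, wy = vecs[:dx, -1], vecs[dx:, -1]
    return wx, wy, vals[-1]
```

With `gamma = 0`, this reduces to ridge-regularized standard CCA; the graph term shrinks canonical directions that conflict with the graph-encoded structure of the common source.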

| Original language | English (US) |
|---|---|
| Title of host publication | 2018 IEEE Statistical Signal Processing Workshop, SSP 2018 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 463-467 |
| Number of pages | 5 |
| ISBN (Print) | 9781538615706 |
| DOIs | 10.1109/SSP.2018.8450749 |
| State | Published - Aug 29 2018 |
| Event | 20th IEEE Statistical Signal Processing Workshop, SSP 2018 - Freiburg im Breisgau, Germany. Duration: Jun 10 2018 → Jun 13 2018 |

### Publication series

| Name | 2018 IEEE Statistical Signal Processing Workshop, SSP 2018 |
|---|---|

### Other

| Other | 20th IEEE Statistical Signal Processing Workshop, SSP 2018 |
|---|---|
| Country | Germany |
| City | Freiburg im Breisgau |
| Period | 6/10/18 → 6/13/18 |

### Keywords

- Canonical correlations
- dimensionality reduction
- generalized eigenvalue
- signal processing over graphs

### Cite this

Chen, J., Wang, G., Shen, Y., & Giannakis, G. B. (2018). Canonical Correlation Analysis with Common Graph Priors. In *2018 IEEE Statistical Signal Processing Workshop, SSP 2018* (pp. 463-467). [8450749]. Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SSP.2018.8450749

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Canonical Correlation Analysis with Common Graph Priors

AU - Chen, Jia

AU - Wang, Gang

AU - Shen, Yanning

AU - Giannakis, Georgios B.

PY - 2018/8/29

Y1 - 2018/8/29

AB - Canonical correlation analysis (CCA) is a well-appreciated linear subspace method to leverage hidden sources common to two or more datasets. CCA benefits are documented in various applications, such as dimensionality reduction, blind source separation, classification, and data fusion. However, the standard CCA does not exploit the geometry of common sources, which may be deduced from (cross-) correlations, or, inferred from the data. In this context, the prior information provided by the common source is encoded here through a graph, and is employed as a CCA regularizer. This leads to what is termed here as graph CCA (gCCA), which accounts for the graph-induced knowledge of common sources, while maximizing the linear correlation between the canonical variables. When the dimensionality of data vectors is high relative to the number of vectors, the dual formulation of the novel gCCA is also developed. Tests on two real datasets for facial image classification showcase the merits of the proposed approaches relative to their competing alternatives.

KW - Canonical correlations

KW - dimensionality reduction

KW - generalized eigenvalue

KW - signal processing over graphs

UR - http://www.scopus.com/inward/record.url?scp=85051170977&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85051170977&partnerID=8YFLogxK

U2 - 10.1109/SSP.2018.8450749

DO - 10.1109/SSP.2018.8450749

M3 - Conference contribution

AN - SCOPUS:85051170977

SN - 9781538615706

T3 - 2018 IEEE Statistical Signal Processing Workshop, SSP 2018

SP - 463

EP - 467

BT - 2018 IEEE Statistical Signal Processing Workshop, SSP 2018

PB - Institute of Electrical and Electronics Engineers Inc.

ER -