### Abstract

We introduce Probabilistic Matrix Addition (PMA) for modeling real-valued data matrices by simultaneously capturing covariance structure among rows and among columns. PMA additively combines two latent matrices drawn from two Gaussian processes, one over rows and one over columns. The resulting joint distribution over the observed matrix does not factorize over entries, rows, or columns, and can thus capture intricate dependencies in the matrix. Exact inference in PMA is possible, but it requires inverting large matrices and can be computationally prohibitive. Efficient approximate inference is possible due to the sparse dependency structure among the latent variables. We propose two families of approximate inference algorithms for PMA, based on Gibbs sampling and MAP inference, and demonstrate the effectiveness of PMA on missing-value prediction and multi-label classification problems.
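The generative process described in the abstract can be sketched in a few lines of NumPy: each column of one latent matrix is drawn from a GP over rows, each row of the other from a GP over columns, and the observed matrix is their sum plus noise. This is a minimal illustrative sketch, not the paper's implementation; the RBF kernels over integer row/column positions, the noise level, and the function names are assumptions made here for concreteness.

```python
import numpy as np

def rbf_kernel(n, length_scale=2.0, jitter=1e-8):
    # Hypothetical choice of kernel: RBF over integer positions 0..n-1,
    # with a small jitter term added for numerical stability.
    idx = np.arange(n)
    d = idx[:, None] - idx[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2) + jitter * np.eye(n)

def sample_pma(n_rows, n_cols, noise=0.1, seed=0):
    """Draw one n_rows x n_cols matrix from the PMA generative process."""
    rng = np.random.default_rng(seed)
    K_row = rbf_kernel(n_rows)  # covariance among rows
    K_col = rbf_kernel(n_cols)  # covariance among columns
    # Each column of A is an i.i.d. draw from a GP over the rows.
    A = rng.multivariate_normal(np.zeros(n_rows), K_row, size=n_cols).T
    # Each row of B is an i.i.d. draw from a GP over the columns.
    B = rng.multivariate_normal(np.zeros(n_cols), K_col, size=n_rows)
    # Observed matrix: additive combination of the two latent matrices
    # plus i.i.d. Gaussian observation noise.
    return A + B + noise * rng.standard_normal((n_rows, n_cols))

Y = sample_pma(20, 15)
```

Because `Y` mixes row-wise and column-wise GP draws, its entries are all mutually dependent: the implied covariance of `vec(Y)` is a sum of two Kronecker products, which is why exact inference requires inverting a large matrix and motivates the approximate schemes the abstract mentions.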

Original language | English (US)
---|---
Title of host publication | Proceedings of the 28th International Conference on Machine Learning, ICML 2011
Pages | 1025-1032
Number of pages | 8
State | Published - Oct 7 2011
Event | 28th International Conference on Machine Learning, ICML 2011 - Bellevue, WA, United States; Jun 28 2011 → Jul 2 2011

### Publication series

Name | Proceedings of the 28th International Conference on Machine Learning, ICML 2011
---|---

### Other

Other | 28th International Conference on Machine Learning, ICML 2011
---|---
Country | United States
City | Bellevue, WA
Period | 6/28/11 → 7/2/11

### Cite this

*Proceedings of the 28th International Conference on Machine Learning, ICML 2011* (pp. 1025-1032). (Proceedings of the 28th International Conference on Machine Learning, ICML 2011).