We consider the problem of approximating a (nonnegative definite) covariance matrix by the sum of two structured covariances: one diagonal and one of low rank. Such an additive decomposition follows the dictum of factor analysis, where linear relations are sought between variables corrupted by independent measurement noise. As distance we use the Wasserstein metric between the respective distributions (assumed Gaussian), which induces a metric between nonnegative definite matrices in general. The rank constraint renders the optimization non-convex. We propose alternating between optimization with respect to each of the two summands. We analyze properties of these optimization problems and the performance of the approach.
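The two ingredients above can be sketched numerically. The following toy Python code (not from the paper) computes the Wasserstein-2 (Bures) distance between zero-mean Gaussians, d^2(A,B) = tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2}, and runs a crude alternating scheme for the diagonal-plus-low-rank split; the alternating updates here use Frobenius-norm surrogates (eigen-truncation and diagonal clipping) in place of the paper's Wasserstein-metric steps, and all function names are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Wasserstein-2 distance between N(0, A) and N(0, B):
    d^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})."""
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(np.real(d2), 0.0))  # clip tiny negative round-off

def diag_plus_lowrank(Sigma, rank, iters=50):
    """Alternate between the two summands (Frobenius surrogate):
    fix D, take the best rank-r PSD part of Sigma - D by eigenvalue
    truncation; fix L, set D to the clipped diagonal of Sigma - L."""
    D = np.diag(np.diag(Sigma))
    L = np.zeros_like(Sigma)
    for _ in range(iters):
        w, V = np.linalg.eigh(Sigma - D)
        w = np.clip(w, 0.0, None)          # keep the PSD part
        idx = np.argsort(w)[::-1][:rank]   # top-r eigenpairs
        L = (V[:, idx] * w[idx]) @ V[:, idx].T
        D = np.diag(np.clip(np.diag(Sigma - L), 0.0, None))
    return D, L
```

On a covariance that is exactly diagonal plus rank-r, the alternation recovers a far better fit (in the Bures distance) than the diagonal term alone, illustrating the benefit of the additive structure.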