Singularities in second-order statistics reveal linear dependencies among correlated random variables. This fact underlies techniques ranging from Gauss' least squares to modern subspace methods in system identification. In particular, the latter exploit a decomposition of covariance data according to the hypothesis that the stochastic process breaks up into the sum of a deterministic component plus white noise. In the present paper we continue our earlier studies on the correspondence between state-covariances and input power spectra in dynamical systems. We characterize state-covariances that correspond to deterministic inputs and develop formulae for the input spectrum of a singular state-covariance. We show that a multivariable decomposition of a state-covariance in accordance with a "deterministic component + white noise" hypothesis for the input does not exist in general, and we study an alternative in which the "white noise" is replaced by a general "moving-average process" having a short-range correlation structure. This decomposition of covariances according to the range of their time-domain correlations is an alternative to the well-known (Carathéodory-Fejér-)Pisarenko decomposition of Toeplitz matrices, is of potentially great practical significance, and can be determined via convex optimization.
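
The classical (Carathéodory-Fejér-)Pisarenko decomposition mentioned above can be sketched numerically. The following is a minimal illustration, not taken from the paper: for a scalar process made of sinusoids in white noise, the Toeplitz covariance splits as T = T_det + σ²I, where T_det is singular; the noise power σ² is the smallest eigenvalue of T, and subtracting it leaves the rank-deficient deterministic part. The frequencies, amplitudes, and noise level below are arbitrary choices for the sketch.

```python
import numpy as np

n = 8                       # number of covariance lags 0..n-1
freqs = [0.12, 0.31]        # two sinusoidal components (assumed)
amps = [1.0, 0.7]
sigma2 = 0.3                # white-noise variance (assumed)

# Covariance sequence of random-phase sinusoids: r(k) = sum (a^2/2) cos(2 pi f k)
lags = np.arange(n)
r = sum(0.5 * a**2 * np.cos(2 * np.pi * f * lags) for a, f in zip(amps, freqs))
r[0] += sigma2              # white noise contributes only at lag 0

# Toeplitz covariance matrix built from the covariance sequence
T = np.array([[r[abs(i - j)] for j in range(n)] for i in range(n)])

# Pisarenko step: smallest eigenvalue recovers the noise power,
# and the residual T_det = T - lam_min * I is singular.
lam_min = np.linalg.eigvalsh(T)[0]
T_det = T - lam_min * np.eye(n)

print(lam_min)                                  # close to sigma2
print(np.linalg.matrix_rank(T_det, tol=1e-8))   # two real sinusoids -> rank 4
```

The rank of the deterministic part equals twice the number of real sinusoids, which is the linear dependence that the singularity of T_det reveals; the paper's contribution concerns the multivariable state-covariance analogue of this scalar construction.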