Algorithms that extract the principal or minor components of a signal are widely used in signal processing and control applications. This paper develops new frameworks for deriving learning rules that iteratively compute the principal and minor components (or subspaces) of a given matrix. A stability analysis based on Liapunov theory and the LaSalle invariance principle is provided to determine the regions of attraction of these learning rules. Among other results, it is shown that Oja's rule and many of its variations are globally asymptotically stable. Liapunov stability theory is also applied to weighted learning rules. Essential features of the proposed MCA/PCA learning rules are that they are self-normalizing and applicable to non-symmetric matrices. Exact solutions for some nonlinear dynamical systems are also provided.
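To make the setting concrete, the following is a minimal sketch of the classical Oja learning rule for extracting the principal eigenvector of a symmetric matrix, in its simplest discretized form w ← w + η(Cw − (wᵀCw)w); the step size, iteration count, and test matrix are illustrative assumptions, and the paper's frameworks generalize well beyond this basic update.

```python
import numpy as np

def oja_principal_component(C, eta=0.01, iters=5000, seed=0):
    """Illustrative Oja iteration (assumed parameters, not the paper's exact scheme)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(C.shape[0])
    w /= np.linalg.norm(w)  # start on the unit sphere
    for _ in range(iters):
        Cw = C @ w
        # Oja's update: the term (w^T C w) w self-normalizes the weight vector,
        # keeping ||w|| near 1 without an explicit renormalization step.
        w = w + eta * (Cw - (w @ Cw) * w)
    return w

# Example: a small symmetric matrix whose dominant eigenvector the rule recovers.
C = np.array([[4.0, 1.0],
              [1.0, 3.0]])
w = oja_principal_component(C)
```

The quadratic penalty term is what makes the rule self-normalizing: the fixed points of the update are exactly the unit eigenvectors of C, with the principal one asymptotically stable.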