

2.3 Mixture of local PCA

A mixture of local PCA combines vector quantization with PCA. Code-book vectors are replaced by local PCA units. Each unit has a center, and the PCA is computed relative to this center. The principal components point in the directions of major variance of the local distribution of assigned patterns. Unlike vector quantization, no general cost function exists here. Some algorithms (Hinton et al., 1997; Kambhatla and Leen, 1997) minimize a global reconstruction error, which is the sum of the reconstruction errors (2.3) over all units. Other algorithms model the density of the training data with a mixture of Gaussian functions and therefore choose the parameters such that the likelihood of the data is maximized (Bishop, 1995).
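As a minimal sketch of the quantities involved, the per-unit reconstruction error (2.3) can be taken as the squared distance between a pattern and its projection onto the unit's principal subspace, measured relative to the unit's center; the function name and the two-dimensional example below are illustrative choices, not part of the cited algorithms.

```python
import numpy as np

def reconstruction_error(x, center, W):
    """Squared distance between pattern x and its projection onto the
    unit's principal subspace (columns of W are orthonormal principal
    components), measured relative to the unit's center."""
    d = x - center           # pattern relative to the unit's center
    proj = W @ (W.T @ d)     # orthogonal projection onto the subspace
    return float(np.sum((d - proj) ** 2))

# Illustrative unit: centered at the origin, one principal component
# along the x-axis. A pattern offset only along that component
# reconstructs perfectly; an offset orthogonal to it does not.
W = np.array([[1.0], [0.0]])
print(reconstruction_error(np.array([3.0, 0.0]), np.zeros(2), W))  # 0.0
print(reconstruction_error(np.array([0.0, 2.0]), np.zeros(2), W))  # 4.0
```

The first example also illustrates the point made below: a pattern far from the center but lying along the principal component still yields zero error.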

The first group of algorithms assigns a pattern to the unit that reconstructs it with minimal error (2.3). Thus, even distant patterns are assigned to a unit as long as they lie along its principal components. For modeling non-linear manifolds, however, this is a disadvantage because the units are not locally confined and protrude out of the manifold (Möller and Hoffmann, 2004; Tipping and Bishop, 1999). Density models, on the other hand, are locally confined; they are discussed in the following.
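The alternating scheme used by this first group of algorithms can be sketched as follows: assign each pattern to the unit with minimal reconstruction error, then recompute each unit's center and principal components by a local PCA over its assigned patterns. This is only an assumed skeleton in the spirit of Kambhatla and Leen (1997); the function name, parameter defaults, and initialization are illustrative.

```python
import numpy as np

def fit_local_pca(X, n_units=2, n_components=1, n_iter=10, seed=0):
    """Alternate between (1) assigning each pattern to the unit that
    reconstructs it with minimal error and (2) refitting each unit's
    center and principal components by local PCA."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_units, replace=False)]
    Ws = [np.zeros((X.shape[1], n_components)) for _ in range(n_units)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: minimal reconstruction error per unit.
        # With W = 0 (first pass) this reduces to distance to the center.
        errs = np.stack(
            [np.sum(((X - c) - (X - c) @ W @ W.T) ** 2, axis=1)
             for c, W in zip(centers, Ws)], axis=1)
        labels = np.argmin(errs, axis=1)
        # Update step: local PCA on each unit's assigned patterns.
        for k in range(n_units):
            Xk = X[labels == k]
            if len(Xk) == 0:
                continue  # keep an empty unit unchanged
            centers[k] = Xk.mean(axis=0)
            _, _, Vt = np.linalg.svd(Xk - centers[k], full_matrices=False)
            Ws[k] = Vt[:n_components].T  # leading principal directions
    return centers, Ws, labels
```

The assignment step makes the protrusion problem visible: the error is independent of how far along the principal directions a pattern lies, so a unit can capture patterns arbitrarily far from its center.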



Heiko Hoffmann
2005-03-22