

E. Notation and Symbols

The following mathematical notation is used throughout this book:

$\mathbf{x}$
a vector
$\mathbf{A}$
a matrix
$x_i$
component of the vector $\mathbf{x}$
$a_{ij}$
component of the matrix $\mathbf{A}$
$\mathbf{a}^T \mathbf{b}$
scalar product of the vectors $\mathbf{a}$ and $\mathbf{b}$
$\mathbf{a} \mathbf{b}^T$
matrix with components $a_i b_j$ (direct product)
$\langle x \rangle$
expectation value of a random variable $x$
$\{\mathbf{x}_i\}$
set of vectors with index $i$
$p(\mathbf{x} \,|\, j)$
probability of $\mathbf{x}$ given the condition $j$ (conditional probability)
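To make the vector notation concrete, here is a minimal NumPy sketch (not part of the dissertation) contrasting the scalar product $\mathbf{a}^T \mathbf{b}$ with the direct product $\mathbf{a} \mathbf{b}^T$; the example vectors are arbitrary:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# scalar product a^T b: a single number
scalar = a @ b          # 1*4 + 2*5 + 3*6 = 32

# direct product a b^T: a matrix whose (i, j) component is a_i * b_j
outer = np.outer(a, b)  # shape (3, 3)
```

The scalar product collapses two vectors to one number, while the direct product expands them into a rank-one matrix.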



The meaning of frequently used symbols:

$t$
time (discrete)
$S_t$
sensory state at time $t$
$M_t$
motor command at time $t$
$\mathbb{R}$
set of all real numbers
$n$
number of training patterns
$d$
dimension of training patterns
$m$
number of units in a mixture or, for kernel PCA, the number of points in a reduced set
$q$
number of principal components
$\mathbf{c}_j$
code-book vector or center of the unit $j$
$\mathbf{C}$
covariance matrix of a data distribution
$\mathbf{W}$
$d \times q$ matrix containing the principal components as columns
$\mathbf{w}$
a principal component
$\lambda^l$
eigenvalue belonging to the principal component $l$
$\sigma^2$
residual variance per dimension; $\sigma$ is also used as the width of a Gaussian function
$\mathbf{K}$
kernel matrix
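As an illustration of how $\mathbf{C}$, $\mathbf{W}$, $\lambda^l$, and $\sigma^2$ relate in standard PCA, the following sketch (not from the dissertation; random data, chosen $d = 3$, $q = 2$) extracts the principal components from a covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # n = 100 training patterns of dimension d = 3
q = 2                          # number of principal components to keep

# covariance matrix C of the data distribution (d x d)
C = np.cov(X, rowvar=False)

# eigenvalues lambda^l and eigenvectors of C, sorted in descending order
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# W: d x q matrix containing the principal components as columns
W = eigvec[:, :q]

# residual variance per dimension sigma^2: mean of the discarded eigenvalues
sigma2 = eigval[q:].mean()
```

The columns of $\mathbf{W}$ are orthonormal, and $\sigma^2$ summarizes the variance left in the $d - q$ discarded directions.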



In this book, the following abbreviations appear:

PCA
principal component analysis (or analyzer)
MLP
multi-layer perceptron
RNN
recurrent neural network
SOM
self-organizing map
PSOM
parametrized self-organizing map
NGPCA
neural gas extended to principal component analysis
MPPCA
mixture of probabilistic principal component analyzers
RRLSA
robust recursive least square algorithm


Heiko Hoffmann
2005-03-22