The generalized linear array model (GLAM) was introduced in 2006 by Currie, Durban and Eilers. Such models provide a structure and a computational procedure for fitting generalized linear models (GLMs) whose model matrix can be written as a Kronecker product and whose data can be written as an array. For large GLMs of this form, the GLAM approach gives very substantial savings in both storage and computational time over the usual GLM algorithm.
Suppose that the data $Y$ is arranged in a $d$-dimensional array with size $n_1 \times n_2 \times \cdots \times n_d$; thus, the corresponding data vector $\mathbf{y} = \operatorname{vec}(Y)$ has size $n_1 n_2 \cdots n_d$. Suppose also that the design matrix is of the form
$$X = X_d \otimes X_{d-1} \otimes \cdots \otimes X_1.$$
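To make the sizes concrete, a minimal NumPy sketch (illustrative dimensions and random marginal matrices, not taken from the source) builds a 3-dimensional data array, its vectorized form and the Kronecker-structured design matrix, and shows how much larger the full model matrix is than its marginal factors:

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(0)

# Marginal sizes of the data array and of the marginal design matrices (illustrative).
n = [10, 12, 15]          # array dimensions n1, n2, n3
c = [4, 5, 6]             # columns of X1, X2, X3

Y = rng.normal(size=n)                       # d-dimensional data array
y = Y.reshape(-1, order="F")                 # data vector vec(Y), column-major

X_marginal = [rng.normal(size=(ni, ci)) for ni, ci in zip(n, c)]

# Full design matrix X = X3 (x) X2 (x) X1 (Kronecker product).
X_full = reduce(np.kron, X_marginal[::-1])

print(Y.shape, y.shape)       # (10, 12, 15) (1800,)
print(X_full.shape)           # (1800, 120): grows as prod(n) by prod(c)
print(sum(Xi.size for Xi in X_marginal), "vs", X_full.size)   # 190 vs 216000
```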
The standard analysis of a GLM with data vector $\mathbf{y}$ and design matrix $X$ proceeds by repeated evaluation of the scoring algorithm
$$X'\tilde{W}_\delta X \hat{\boldsymbol{\theta}} = X'\tilde{W}_\delta \tilde{\mathbf{z}},$$
where $\tilde{\boldsymbol{\theta}}$ represents the approximate solution of $\boldsymbol{\theta}$, and $\hat{\boldsymbol{\theta}}$ is the improved value of it; $W_\delta$ is the diagonal weight matrix with elements
$$w_{ii}^{-1} = \left(\frac{\partial\eta_i}{\partial\mu_i}\right)^{2} \operatorname{var}(y_i),$$
and
$$\mathbf{z} = \boldsymbol{\eta} + W_\delta^{-1}(\mathbf{y} - \boldsymbol{\mu})$$
is the working variable; the tilde denotes evaluation at the approximate solution $\tilde{\boldsymbol{\theta}}$.
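The scoring iteration above is the usual iteratively reweighted least squares step of GLM fitting. A minimal sketch for a Poisson GLM with log link, written directly in vector-matrix form without the GLAM array shortcuts (the data and function names are illustrative assumptions, not from the source):

```python
import numpy as np

def glm_scoring(X, y, n_iter=25):
    """Fisher scoring for a Poisson GLM with log link.

    Repeatedly solves X' W X theta_hat = X' W z, where W is the diagonal
    weight matrix and z is the working variable.
    """
    theta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ theta                  # linear predictor
        mu = np.exp(eta)                 # inverse link
        # Poisson/log link: d(eta)/d(mu) = 1/mu and var(y) = mu, so w_ii = mu_i.
        w = mu
        z = eta + (y - mu) / mu          # working variable z = eta + W^{-1}(y - mu)
        XtW = X.T * w                    # X' W without forming diag(w)
        theta = np.linalg.solve(XtW @ X, XtW @ z)
    return theta

# Illustrative use with simulated data.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.3, -0.2])))
print(glm_scoring(X, y))
```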
Computationally, GLAM provides array algorithms to calculate the linear predictor,
$$\boldsymbol{\eta} = X\boldsymbol{\theta},$$
and the weighted inner product
$$X'\tilde{W}_\delta X,$$
without evaluation of the model matrix $X$.
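For instance, anticipating the two-dimensional example below, the linear predictor can be computed from the marginal matrices alone; a small NumPy check, assuming column-major vectorization and illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2, c1, c2 = 8, 9, 3, 4
X1 = rng.normal(size=(n1, c1))
X2 = rng.normal(size=(n2, c2))
Theta = rng.normal(size=(c1, c2))            # matrix of coefficients

# GLAM: linear predictor as an n1 x n2 array, without forming X = X2 (x) X1.
Eta = X1 @ Theta @ X2.T

# Direct GLM computation for comparison: eta = X theta with theta = vec(Theta).
X = np.kron(X2, X1)
eta = X @ Theta.reshape(-1, order="F")

assert np.allclose(eta, Eta.reshape(-1, order="F"))
```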
Example
In 2 dimensions, let $X = X_2 \otimes X_1$; then the linear predictor is written $X_1 \Theta X_2'$, where $\Theta$ is the matrix of coefficients; the weighted inner product is obtained from $G(X_1)' W G(X_2)$, and $W$ is the matrix of weights; here $G(M)$ is the row tensor function of the $r \times c$ matrix $M$, given by
$$G(M) = (M \otimes \mathbf{1}') \circ (\mathbf{1}' \otimes M),$$
where $\circ$ means element by element multiplication and $\mathbf{1}$ is a vector of 1's of length $c$.
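A sketch of the row tensor function and of recovering the weighted inner product from it is given below; the final index permutation is one way to carry out the rearrangement under a column-major vectorization convention, and is an assumption of this sketch rather than notation from the source:

```python
import numpy as np

def row_tensor(M):
    """Row tensor G(M) = (M kron 1') * (1' kron M): an n x c^2 matrix whose
    i-th row is the Kronecker product of row i of M with itself."""
    n, c = M.shape
    ones = np.ones((1, c))
    return np.kron(M, ones) * np.kron(ones, M)

rng = np.random.default_rng(3)
n1, n2, c1, c2 = 8, 9, 3, 4
X1 = rng.normal(size=(n1, c1))
X2 = rng.normal(size=(n2, c2))
W = rng.uniform(0.5, 2.0, size=(n1, n2))     # n1 x n2 matrix of weights

# GLAM: c1^2 x c2^2 array holding the entries of X' W_delta X.
A = row_tensor(X1).T @ W @ row_tensor(X2)

# Rearrange into the usual c1*c2 x c1*c2 inner product (column-major vec).
B = A.reshape(c1, c1, c2, c2).transpose(2, 0, 3, 1).reshape(c1 * c2, c1 * c2)

# Direct GLM computation for comparison, with W_delta = diag(vec(W)).
X = np.kron(X2, X1)
B_direct = X.T @ (W.reshape(-1, order="F")[:, None] * X)

assert np.allclose(B, B_direct)
```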
On the other hand, the row tensor function $G(M)$ of the matrix $M$ is an example of the face-splitting product of matrices, which was proposed by Vadym Slyusar in 1996:
$$M \bullet M = (M \otimes \mathbf{1}') \circ (\mathbf{1}' \otimes M),$$
where $\bullet$ denotes the face-splitting product.
These low-storage, high-speed formulae extend to $d$ dimensions.
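One way to realize the $d$-dimensional extension of the linear predictor is by successive mode-$k$ products of the coefficient array with the marginal matrices; the helper below is an illustrative implementation, not the notation of the original algorithm, and again assumes column-major vectorization:

```python
import numpy as np
from functools import reduce

def mode_product(T, A, k):
    """Multiply array T by matrix A along axis k (mode-k product)."""
    out = np.tensordot(A, T, axes=(1, k))    # contracted axis appears first
    return np.moveaxis(out, 0, k)

rng = np.random.default_rng(4)
n = [5, 6, 7]
c = [2, 3, 4]
X_marginal = [rng.normal(size=(ni, ci)) for ni, ci in zip(n, c)]
Theta = rng.normal(size=c)                   # coefficient array

# GLAM-style linear predictor: apply each marginal matrix along its own axis.
Eta = Theta
for k, Xk in enumerate(X_marginal):
    Eta = mode_product(Eta, Xk, k)

# Direct computation with the full Kronecker model matrix, for comparison.
X_full = reduce(np.kron, X_marginal[::-1])   # X = Xd (x) ... (x) X1
eta = X_full @ Theta.reshape(-1, order="F")

assert np.allclose(eta, Eta.reshape(-1, order="F"))
```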
Applications
GLAM is designed to be used in $d$-dimensional smoothing problems where the data are arranged in an array and the smoothing matrix is constructed as a Kronecker product of one-dimensional smoothing matrices.
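A sketch of such a setup follows; the random stand-ins for the one-dimensional bases, the second-order difference penalty and the smoothing parameters are illustrative assumptions, not prescriptions from the source:

```python
import numpy as np

rng = np.random.default_rng(5)
n1, n2, c1, c2 = 20, 25, 6, 7

# One-dimensional smoothing (basis) matrices for each dimension; in practice
# these would be, e.g., B-spline bases evaluated at the marginal grid points.
B1 = rng.normal(size=(n1, c1))
B2 = rng.normal(size=(n2, c2))

# The two-dimensional smoothing matrix is their Kronecker product,
# B = np.kron(B2, B1), which GLAM never forms explicitly.

# A standard Kronecker-structured roughness penalty on vec(Theta), built from
# one-dimensional second-order difference matrices (illustrative choice).
D1 = np.diff(np.eye(c1), n=2, axis=0)
D2 = np.diff(np.eye(c2), n=2, axis=0)
lam1, lam2 = 1.0, 1.0
P = lam1 * np.kron(np.eye(c2), D1.T @ D1) + lam2 * np.kron(D2.T @ D2, np.eye(c1))

print(P.shape)    # (c1*c2, c1*c2): penalty added to the weighted inner product
```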