In mathematics, a moment matrix is a special symmetric square matrix whose rows and columns are indexed by monomials. The entries of the matrix depend only on the product of the indexing monomials (cf. Hankel matrix).
Moment matrices play an important role in polynomial fitting, polynomial optimization (since positive semidefinite moment matrices correspond to polynomials which are sums of squares)[1] and econometrics.[2]
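In the univariate case, the moment matrix of a sequence of moments $m_0, m_1, \dots, m_{2d}$ has entries $M_{ij} = m_{i+j}$, so it is a Hankel matrix. A minimal numerical sketch (the particular moment sequence is illustrative: $m_j = 1/(j+1)$, the moments of the uniform measure on $[0,1]$):

```python
import numpy as np

# Moments m_j = integral of x^j over [0, 1] = 1/(j + 1).
d = 2
m = [1.0 / (j + 1) for j in range(2 * d + 1)]

# Moment matrix: rows/columns indexed by the monomials 1, x, ..., x^d;
# the entry (i, j) depends only on the product x^i * x^j, i.e. on i + j.
M = np.array([[m[i + j] for j in range(d + 1)] for i in range(d + 1)])

# The moment matrix of a positive measure is positive semidefinite.
print(M)
print(np.linalg.eigvalsh(M).min() > 0)  # True
```

Here `M` comes out as the 3×3 Hilbert matrix, and its smallest eigenvalue is positive, consistent with the sums-of-squares connection mentioned above.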
Application in regression
A multiple linear regression model can be written as

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u$$

where $y$ is the explained variable, $x_1, x_2, \dots, x_k$ are the explanatory variables, $u$ is the error, and $\beta_0, \beta_1, \dots, \beta_k$ are unknown coefficients to be estimated. Given $n$ observations $\{y_i, x_{i1}, x_{i2}, \dots, x_{ik}\}_{i=1}^{n}$, we have a system of $n$ linear equations that can be expressed in matrix notation.[3]

$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_{11} & x_{12} & \dots & x_{1k} \\ 1 & x_{21} & x_{22} & \dots & x_{2k} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n1} & x_{n2} & \dots & x_{nk} \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}$$
or

$$\mathbf{y} = \mathbf{X} \boldsymbol{\beta} + \mathbf{u}$$

where $\mathbf{y}$ and $\mathbf{u}$ are each a vector of dimension $n \times 1$, $\mathbf{X}$ is the design matrix of order $n \times (k+1)$, and $\boldsymbol{\beta}$ is a vector of dimension $(k+1) \times 1$. Under the Gauss–Markov assumptions, the best linear unbiased estimator of $\boldsymbol{\beta}$ is the linear least squares estimator $\mathbf{b} = (\mathbf{X}^{\mathsf{T}}\mathbf{X})^{-1}\mathbf{X}^{\mathsf{T}}\mathbf{y}$, involving the two moment matrices $\mathbf{X}^{\mathsf{T}}\mathbf{X}$ and $\mathbf{X}^{\mathsf{T}}\mathbf{y}$ defined as

$$\mathbf{X}^{\mathsf{T}}\mathbf{X} = \begin{bmatrix} n & \sum x_{i1} & \sum x_{i2} & \dots & \sum x_{ik} \\ \sum x_{i1} & \sum x_{i1}^2 & \sum x_{i1}x_{i2} & \dots & \sum x_{i1}x_{ik} \\ \sum x_{i2} & \sum x_{i1}x_{i2} & \sum x_{i2}^2 & \dots & \sum x_{i2}x_{ik} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sum x_{ik} & \sum x_{i1}x_{ik} & \sum x_{i2}x_{ik} & \dots & \sum x_{ik}^2 \end{bmatrix}$$

and

$$\mathbf{X}^{\mathsf{T}}\mathbf{y} = \begin{bmatrix} \sum y_i \\ \sum x_{i1} y_i \\ \vdots \\ \sum x_{ik} y_i \end{bmatrix}$$

where $\mathbf{X}^{\mathsf{T}}\mathbf{X}$ is a square normal matrix of dimension $(k+1) \times (k+1)$, and $\mathbf{X}^{\mathsf{T}}\mathbf{y}$ is a vector of dimension $(k+1) \times 1$.
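As a numerical sketch with made-up data (the coefficients and noise level below are illustrative, and the names `X`, `XtX`, `Xty`, `b` follow the notation above), the two moment matrices and the resulting least squares estimate can be computed directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2

# Made-up data generated from y = 1 + 2*x1 - 3*x2 + u.
x = rng.normal(size=(n, k))
u = 0.1 * rng.normal(size=n)
y = 1.0 + 2.0 * x[:, 0] - 3.0 * x[:, 1] + u

# Design matrix of order n x (k+1): a column of ones, then the regressors.
X = np.column_stack([np.ones(n), x])

# The two moment matrices: X'X is (k+1) x (k+1) and symmetric,
# with X'X[0, 0] = n; X'y is a vector of dimension (k+1) x 1.
XtX = X.T @ X
Xty = X.T @ y

# Least squares estimator b = (X'X)^{-1} X'y
# (solve the normal equations rather than inverting X'X).
b = np.linalg.solve(XtX, Xty)
print(b)  # close to [1, 2, -3]
```

In practice one solves the normal equations $\mathbf{X}^{\mathsf{T}}\mathbf{X}\,\mathbf{b} = \mathbf{X}^{\mathsf{T}}\mathbf{y}$ (or uses a QR/least-squares routine) rather than forming the explicit inverse, which is cheaper and numerically more stable.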