Found In Space – a.k.

a.k. from thus spake a.k.

Some time ago we saw how Newton's method used the derivative of a univariate scalar valued function to guide the search for an argument at which it took a specific value. A related problem is finding a vector at which a multivariate vector valued function takes a given value, or at least comes as close to it as possible. In particular, we should often like to fit an arbitrary parametrically defined scalar valued functional form to a set of points with possibly noisy values, much as we did when using linear regression to find the best fitting weighted sum of a given set of functions, and in this post we shall see how we can generalise Newton's method to solve such problems.
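
To make the idea concrete, here is a minimal Gauss-Newton iteration in Python, one standard way of generalising Newton's method to least squares fitting; note that this is a sketch for this summary rather than the scheme developed in the post itself, and the exponential model and its Jacobian are made up purely for the example.

    import numpy as np

    # a sketch of Gauss-Newton least squares fitting: repeatedly linearise
    # the model about the current parameters p and solve the normal
    # equations for the parameter update
    def gauss_newton(f, jac, p, x, y, steps=20):
        for _ in range(steps):
            r = y - f(x, p)                           # residuals
            J = jac(x, p)                             # Jacobian of the model
            p = p + np.linalg.solve(J.T @ J, J.T @ r)
        return p

    # hypothetical model y = a * exp(b * x) and its Jacobian
    f = lambda x, p: p[0] * np.exp(p[1] * x)
    jac = lambda x, p: np.column_stack([np.exp(p[1] * x),
                                        p[0] * x * np.exp(p[1] * x)])

    rng = np.random.default_rng(42)
    x = np.linspace(0.0, 1.0, 50)
    y = f(x, np.array([2.0, -1.5])) + rng.normal(0.0, 0.01, x.size)
    print(gauss_newton(f, jac, np.array([1.0, -1.0]), x, y))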

The Spectral Apparition – a.k.

a.k. from thus spake a.k.

Over the last few months we have seen how we can efficiently implement the Householder transformations and shifted Givens rotations used by Francis's algorithm to diagonalise a real symmetric matrix M, yielding its eigensystem in a matrix V whose columns are its eigenvectors and a diagonal matrix Λ whose diagonal elements are their associated eigenvalues, which satisfy

    M = V × Λ × Vᵀ

and together are known as the spectral decomposition of M.
In this post, we shall add it to the ak library using the householder and givens functions that we have put so much effort into optimising.
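
For readers who would like to experiment before that implementation lands, a rough equivalent of the result can be had from numpy, whose eigh function also diagonalises real symmetric matrices; this is merely a stand-in for the ak library's householder and givens based version.

    import numpy as np

    # numpy's eigh as a stand-in for the ak spectral decomposition: it
    # returns the eigenvalues and unit eigenvectors of a symmetric matrix
    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4))
    M = 0.5 * (A + A.T)                     # a random symmetric matrix

    lam, V = np.linalg.eigh(M)
    L = np.diag(lam)                        # the diagonal matrix Λ

    print(np.allclose(M, V @ L @ V.T))      # M = V × Λ × Vᵀ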

Funky Givens – a.k.

a.k. from thus spake a.k.

We have recently been looking at how we can use a special case of Francis's QR transformation to reduce a real symmetric matrix M to a diagonal matrix Λ by first applying Householder transformations to put it in tridiagonal form and then using shifted Givens rotations to zero out the off diagonal elements.
The columns of the matrix of transformations V and the elements on the leading diagonal of Λ are the unit eigenvectors and eigenvalues of M respectively and they consequently satisfy

    M × V = V × Λ

and, since the product of V and its transpose is the identity matrix

    M = V × Λ × Vᵀ

which is known as the spectral decomposition of M.
Last time we saw how we could efficiently apply the Householder transformations in-place, replacing the elements of M with those of the matrix of accumulated transformations Q and creating a pair of arrays to represent the leading and off diagonal elements of the tridiagonal matrix. This time we shall see how we can similarly improve the implementation of the Givens rotations.
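
As a reminder of the building block involved, the following sketch constructs a single Givens rotation that zeroes one sub diagonal element; it is deliberately simple and ignores the in-place bookkeeping and shifts that the post is concerned with.

    import numpy as np

    # choose c and s so that [[c, s], [-s, c]] rotates (a, b) onto (r, 0)
    def givens(a, b):
        r = np.hypot(a, b)
        return (1.0, 0.0) if r == 0.0 else (a / r, b / r)

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

    # embed the rotation in the plane of rows 0 and 1 and apply it
    c, s = givens(A[0, 0], A[1, 0])
    G = np.array([[  c,   s, 0.0],
                  [ -s,   c, 0.0],
                  [0.0, 0.0, 1.0]])
    print(G @ A)                            # the (1, 0) element is now zero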

A Well Managed Household – a.k.

a.k. from thus spake a.k.

Over the last few months we have seen how we can use a sequence of Householder transformations followed by a sequence of shifted Givens rotations to efficiently find the spectral decomposition of a symmetric real matrix M, formed from a matrix V and a diagonal matrix Λ satisfying

    M × V = V × Λ

implying that the columns of V are the unit eigenvectors of M and their associated elements on the diagonal of Λ are their eigenvalues. Since the unit eigenvectors of a symmetric matrix are orthonormal, we also have

    V × Vᵀ = I

where I is the identity matrix, and therefore

    M = V × Λ × Vᵀ

From a mathematical perspective the combination of Householder transformations and shifted Givens rotations is particularly appealing, converging on the spectral decomposition after relatively few matrix multiplications. From an implementation perspective, however, using ak.matrix multiplication operations is less than satisfactory, since it wastefully creates new ak.matrix objects at each step, and so in this post we shall start to see how we can do better.
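
By way of illustration, the following sketch builds a single Householder reflection explicitly; it is exactly the kind of wasteful construct-and-multiply approach described above, shown here only to make the transformation itself concrete.

    import numpy as np

    # a Householder reflection H = I - 2vvᵀ/(vᵀv) chosen to map x onto a
    # multiple of the first axis; building H explicitly like this is the
    # wasteful approach that an in-place scheme avoids
    def householder(x):
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])   # avoid cancellation
        return np.eye(x.size) - 2.0 * np.outer(v, v) / (v @ v)

    x = np.array([3.0, 1.0, 2.0])
    H = householder(x)
    print(np.round(H @ x, 6))               # only the first element survives
    print(np.allclose(H @ H.T, np.eye(3)))  # H is orthogonal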

Spryer Francis – a.k.

a.k. from thus spake a.k.

Last time we saw how we could use a sequence of Householder transformations to reduce a symmetric real matrix M to a symmetric tridiagonal matrix, having zeros everywhere other than upon the leading, upper and lower diagonals, which we could then further reduce to a diagonal matrix Λ using a sequence of Givens rotations to iteratively transform the elements upon the upper and lower diagonals to zero. The columns of the accumulated transformations V were then the unit eigenvectors of M and the elements on the leading diagonal of the result were their associated eigenvalues, satisfying

    M × V = V × Λ

and, since the transpose of V is its inverse

    M = V × Λ × Vᵀ

which is known as the spectral decomposition of M.
Unfortunately, the way that we used Givens rotations to diagonalise tridiagonal symmetric matrices wasn't particularly efficient and I concluded by stating that it could be significantly improved with a relatively minor change. In this post we shall see what it is and why it works.
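
To give a flavour of why shifting matters, here is a sketch of explicitly shifted QR iteration on a symmetric matrix, using the trailing diagonal element as a simple choice of shift; the Givens based scheme described in these posts is rather more refined, so treat this only as an illustration of the principle.

    import numpy as np

    # explicitly shifted QR iteration: subtract a shift mu, take a QR
    # step, add mu back; each step is a similarity transformation of M
    def shifted_qr(M, steps=50):
        T, V, I = M.copy(), np.eye(M.shape[0]), np.eye(M.shape[0])
        for _ in range(steps):
            mu = T[-1, -1]                  # a simple choice of shift
            Q, R = np.linalg.qr(T - mu * I)
            T = R @ Q + mu * I
            V = V @ Q                       # accumulate the transformations
        return V, T

    M = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    V, T = shifted_qr(M)
    print(np.round(T, 6))                   # approximately diagonal
    print(np.allclose(M @ V, V @ np.diag(np.diag(T))))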

FAO The Householder – a.k.

a.k. from thus spake a.k.

Some years ago we saw how we could use the Jacobi algorithm to find the eigensystem of a real valued symmetric matrix M, which is defined as the set of pairs of non-zero vectors vᵢ and scalars λᵢ that satisfy

    M × vᵢ = λᵢ × vᵢ

known as the eigenvectors and the eigenvalues respectively, with the vectors typically restricted to those of unit length, in which case we can define its spectral decomposition as the product

    M = V × Λ × Vᵀ

where the columns of V are the unit eigenvectors, Λ is a diagonal matrix whose iᵗʰ diagonal element is the eigenvalue associated with the iᵗʰ column of V and the T superscript denotes the transpose, in which the rows and columns of the matrix are swapped.
You may recall that this is a particularly convenient representation of the matrix since we can use it to generalise any scalar function to it with

    f(M) = V × f(Λ) × Vᵀ

where f(Λ) is the diagonal matrix whose iᵗʰ diagonal element is the result of applying f to the iᵗʰ diagonal element of Λ.
You may also recall that I suggested that there's a more efficient way to find eigensystems and I think that it's high time that we took a look at it.
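
As a small demonstration of that generalisation, the following sketch uses numpy's eigh, rather than the more efficient method hinted at here, to apply the square root to a symmetric positive definite matrix through its spectral decomposition.

    import numpy as np

    # f(M) = V × f(Λ) × Vᵀ with f the square root: apply f to the
    # eigenvalues alone and reassemble the matrix
    M = np.array([[2.0, 1.0],
                  [1.0, 2.0]])             # symmetric positive definite

    lam, V = np.linalg.eigh(M)
    sqrtM = V @ np.diag(np.sqrt(lam)) @ V.T

    print(np.allclose(sqrtM @ sqrtM, M))   # squaring f(M) recovers M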