NLA MasterMath Weekly Schedule 2018

Week 13
After explaining some features of the Second Matlab Assessment, we concentrated on harmonic Rayleigh-Ritz, which is nothing else than orthogonalizing eigenpair residuals to W = AV instead of to V. Interestingly, this has an interpretation as Rayleigh-Ritz applied to inv(A) with search space W.
Then we went on to study the Jacobi-Davidson method, which is an attempt to improve upon the IRA method. Instead of expanding the current search space in each step by "the" residual (which happens in Lanczos/Arnoldi, where all the residuals are linearly dependent), it was proposed to precondition the residual first, i.e., to expand with inv(K)r. The system Kq = r was derived from an old idea by Jacobi and aims to approximate the correction q orthogonal to the current eigenvector approximation v, such that v + q is a next and better eigenvector approximation. Instead of accepting v + q as the next eigenvector approximation, q is added to the search space and the Ritz-Galerkin approach is applied with that search space, so that an even better approximate eigenvector than v + q can be determined. The derivation of the orthogonal correction equation (a generalized algebraic Riccati equation) was left as an exercise, as was its direct approximation in a subspace by another Ritz-Galerkin method, which turned out to be a small eigenproblem.

Week 12
Extended due to its success: last week's Test 3, now with extra assumptions! Do all the exercises yourself (with the book, notes, and internet at hand, but without consulting other people, in particular fellow students) and hand them in tomorrow at the lecture. Your final grade for Test 3 will then be the average (rounded to halves) of this week's and last week's scores.
We treated the Implicitly Restarted Arnoldi (IRA) method. This is a way to change the start vector v of your Arnoldi factorization into (A - mu I)v without using the matrix A again, but at the cost of a new Arnoldi factorization of one dimension smaller.
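As an illustration of this restart (my own NumPy sketch, not course material; the course itself uses Matlab): one implicit single-shift restart is a QR step on the small Hessenberg matrix. With Hm - mu*I = QR, the rotated basis V*Q carries an Arnoldi factorization of dimension one smaller whose start vector is a multiple of (A - mu I)v; the product with A below is only used to verify that claim.

```python
import numpy as np

def arnoldi(A, b, m):
    """m steps of Arnoldi: A @ V[:, :m] == V @ H with H upper Hessenberg."""
    n = b.size
    V = np.zeros((n, m + 1)); H = np.zeros((m + 1, m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

rng = np.random.default_rng(0)
n, m = 12, 6
A = rng.standard_normal((n, n))
V, H = arnoldi(A, rng.standard_normal(n), m)
Hm = H[:m, :m]

mu = 0.5                                 # shift (an unwanted Ritz value, say)
Q, R = np.linalg.qr(Hm - mu * np.eye(m))
V_new = V[:, :m] @ Q                     # rotated basis; keep m-1 columns
H_new = Q.T @ Hm @ Q                     # upper Hessenberg again (up to rounding)

# The new start vector is a multiple of (A - mu I) v, obtained without a new
# product with A; A appears here only to check this:
print(np.allclose((A - mu * np.eye(n)) @ V[:, 0], R[0, 0] * V_new[:, 0]))  # True
```

In practice one applies several shifts (the unwanted Ritz values) and keeps the leading part of the rotated factorization.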
This filtering technique can be used to remove unwanted eigenvector components from v, or in other words, to enrich v in the wanted components.

Week 11
This week, we started with the last of the three small tests, about the material of Weeks 08, 09, and 10. See below for details. Then we paid attention to the following results in Chapter 4:
Chapter 6 deals with Krylov subspace methods and repeats many well-known concepts from the linear-systems setting. We looked at the Hermitian Lanczos method and the Arnoldi method, and so far avoided the non-Hermitian Lanczos method.
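A minimal sketch of the Arnoldi method in NumPy (my own illustration, not from the course notes; the course uses Matlab): it builds an orthonormal basis of the Krylov subspace by modified Gram-Schmidt, producing the relation A V_k = V_{k+1} H_k with H upper Hessenberg.

```python
import numpy as np

def arnoldi(A, b, k):
    """k steps of Arnoldi: returns V (n x k+1) with orthonormal columns and
    H ((k+1) x k) upper Hessenberg such that A @ V[:, :k] == V @ H."""
    n = b.size
    V = np.zeros((n, k + 1)); H = np.zeros((k + 1, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):                  # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)         # breakdown if this is 0
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
b = rng.standard_normal(8)
V, H = arnoldi(A, b, 4)
print(np.allclose(A @ V[:, :4], V @ H))   # Arnoldi relation: True
print(np.allclose(V.T @ V, np.eye(5)))    # orthonormal basis: True
```

For Hermitian A the projected matrix becomes tridiagonal, and the inner loop reduces to the three-term Lanczos recurrence.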
Week 10
This week we looked at deflation. Standard direct deflation is infeasible because it is computationally much too expensive. Instead, we consider
We introduced and investigated
Some exercises from Saad to be tried:
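This week's deflation theme can be illustrated with Wielandt (Hotelling) deflation, which I am assuming is among the variants discussed above; a NumPy sketch of my own (the course uses Matlab): once a dominant eigenpair (lam1, v1) of a Hermitian A is found, power iteration on A - lam1 v1 v1^H converges to the next pair.

```python
import numpy as np

def power_iteration(A, x, iters=500):
    """Basic power method; returns eigenvector estimate and Rayleigh quotient."""
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)
    return x, x @ A @ x

A = np.diag([5.0, 3.0, 1.0])                 # Hermitian toy matrix
rng = np.random.default_rng(1)
x0 = rng.standard_normal(3)

v1, lam1 = power_iteration(A, x0)            # dominant pair, lam1 ~ 5
A_defl = A - lam1 * np.outer(v1, v1)         # Wielandt deflation: 5 -> 0
v2, lam2 = power_iteration(A_defl, x0)       # next pair, lam2 ~ 3
print(round(lam1, 6), round(lam2, 6))        # 5.0 3.0
```

For a large sparse A one would of course apply the rank-one update implicitly inside the matrix-vector product rather than forming A_defl, since the update destroys sparsity.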
Week 9
We reviewed simple vector iterations (Section 4.1) to approximate a single eigenpair of a matrix A, which we assumed to be Hermitian. We looked at
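One vector iteration beyond the basic power method is Rayleigh quotient iteration, sketched below in NumPy (my own illustration; I am assuming it is among the iterations of Section 4.1). Each step solves with the shifted matrix A - rho I, where rho is the current Rayleigh quotient; for Hermitian A the convergence is cubic.

```python
import numpy as np

def rayleigh_quotient_iteration(A, x, iters=10):
    """RQI for Hermitian A: solve with the shift rho = x' A x each step."""
    x = x / np.linalg.norm(x)
    for _ in range(iters):
        rho = x @ A @ x
        try:
            y = np.linalg.solve(A - rho * np.eye(A.shape[0]), x)
        except np.linalg.LinAlgError:
            break                   # shift hit an eigenvalue exactly: converged
        x = y / np.linalg.norm(y)
    return x, x @ A @ x

A = np.diag([4.0, 2.0, -1.0])
x0 = np.array([0.4, 0.9, 0.2])      # closest to the eigenvector for 2
v, lam = rayleigh_quotient_iteration(A, x0)
print(round(lam, 8))                # 2.0
```

Note that near convergence the shifted system is almost singular; that is harmless here (the large solution component is exactly the wanted eigenvector direction), and an exactly singular solve is treated as convergence.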
Week 8
We will start with a short test that will count towards your final grade for the course. The test is about the covered material of Chapter 6, which builds upon the general theory in Chapter 5. Then we move on to eigenvalues, using the other book by Saad. After repeating some well-known facts about eigenvalue problems (see Chapter 1), we discussed
We already encountered most of Sections 3.1.1 and 3.1.2 in the Linear Systems book; we briefly commented on Sections 3.1.3 and 3.1.4, and mentioned the example finishing Section 3.1.5. More explicitly, we covered:
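For reference in the exercise below, the Bauer-Fike theorem (in my formulation of the standard result) reads: if A is diagonalizable, A = X \Lambda X^{-1}, and \mu is an eigenvalue of the perturbed matrix A + E, then for any p-norm

\[
\min_{\lambda \in \sigma(A)} |\mu - \lambda| \;\le\; \kappa_p(X)\,\|E\|_p,
\qquad \kappa_p(X) = \|X\|_p\,\|X^{-1}\|_p .
\]

So a well-conditioned eigenvector basis implies well-conditioned eigenvalues.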
As an exercise, show that Bauer-Fike is sharp for the matrices in Example 3.2. Perturb A with eH for small e. Do this exactly for the 2x2 case, and use Matlab to experiment with the nxn case with n = 4, 5, 6, ...

Week 7
It is possible to create biorthogonal bases for a pair of Krylov spaces, for A and A*, using short recurrences only. This leads to a new class of methods in which an approximation from K(A,b) is sought whose residual is orthogonal to K(A*,c). Mathematically it is much harder to prove anything about such methods, even though in practice they can be quite effective. We covered:
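The two-sided Lanczos process behind these methods can be sketched as follows (my own NumPy illustration for real A, following the usual textbook formulation and ignoring the serious-breakdown case): short recurrences build bases V and W with W^T V = I.

```python
import numpy as np

def two_sided_lanczos(A, v, w, m):
    """Nonsymmetric (two-sided) Lanczos for real A: biorthogonal bases
    V of K_m(A, v) and W of K_m(A.T, w), i.e. W.T @ V == I, built with
    three-term recurrences only (no look-ahead: we assume vh @ wh != 0)."""
    n = A.shape[0]
    V = np.zeros((n, m)); W = np.zeros((n, m))
    V[:, 0] = v / (w @ v)                    # scale so that w' v = 1
    W[:, 0] = w
    beta = delta = 0.0
    for j in range(m - 1):
        alpha = W[:, j] @ A @ V[:, j]
        vh = A @ V[:, j] - alpha * V[:, j] - (beta * V[:, j - 1] if j else 0.0)
        wh = A.T @ W[:, j] - alpha * W[:, j] - (delta * W[:, j - 1] if j else 0.0)
        s = vh @ wh
        delta = np.sqrt(abs(s)); beta = s / delta
        V[:, j + 1] = vh / delta
        W[:, j + 1] = wh / beta
    return V, W

rng = np.random.default_rng(2)
A = rng.standard_normal((10, 10))
V, W = two_sided_lanczos(A, rng.standard_normal(10), rng.standard_normal(10), 5)
print(np.max(np.abs(W.T @ V - np.eye(5))))   # deviation from biorthogonality
```

Note that only matrix-vector products with A and A.T are needed, and only the two most recent vectors of each sequence are kept; this is exactly what makes methods like BiCG cheap per step.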
Week 6
Not only symmetric/Hermitian matrices yield short recurrences for the orthonormal basis of the Krylov space. The Faber-Manteuffel theorem shows which other matrices admit short recurrences that are optimal in some sense. We covered:
Week 5
This week we looked at the consequences of symmetry of the system matrix A. Symmetry implies symmetry of the upper Hessenberg matrix in the Arnoldi method and leads to three-term recurrence relations. We covered:
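The resulting three-term recurrence can be sketched as follows (my own NumPy illustration with an arbitrary symmetric test matrix; the course uses Matlab): each new basis vector needs to be orthogonalized only against the two previous ones, and the projection V^T A V is tridiagonal.

```python
import numpy as np

def lanczos(A, b, m):
    """Symmetric Lanczos: three-term recurrence, orthogonalizing only against
    the two most recent basis vectors. Returns V and the entries of the
    tridiagonal matrix T = V.T @ A @ V."""
    n = b.size
    V = np.zeros((n, m))
    alpha = np.zeros(m); beta = np.zeros(m - 1)
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]       # only two previous vectors
        if j < m - 1:
            beta[j] = np.linalg.norm(w)          # breakdown if 0
            V[:, j + 1] = w / beta[j]
    return V, alpha, beta

rng = np.random.default_rng(3)
M = rng.standard_normal((12, 12))
A = M + M.T                                      # symmetric test matrix
V, alpha, beta = lanczos(A, rng.standard_normal(12), 6)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
print(np.allclose(V.T @ A @ V, T, atol=1e-8))    # tridiagonal projection: True
```

In exact arithmetic V stays orthonormal automatically; in floating point, orthogonality slowly deteriorates, which is why practical codes reorthogonalize.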
Week 4
We will start with a short test that will count towards your final grade for the course. We will then cover:
Week 3
We covered
Week 2
We covered
1) Give necessary and sufficient conditions for Jacobi and Gauss-Seidel to converge for an arbitrary invertible 2x2 matrix.
2) Apply the Jacobi method to an upper triangular system.
3) Implement the random minimal residual method (which selects a random update direction of the current approximation and minimizes the residual along that direction).
4) With A = [1 p q ; 1 1 r ; 0 1 1], give necessary and sufficient conditions for Gauss-Seidel to converge.

Week 1
We covered
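Returning to exercise 3 in the list above, here is a minimal Python sketch of the random minimal residual method (my own illustration; the test matrix is an arbitrary well-conditioned choice): along a random direction p, the residual norm ||b - A(x + a p)|| is minimized by a = (r, Ap)/(Ap, Ap).

```python
import numpy as np

def random_minres_step(A, b, x, rng):
    """Pick a random direction p and minimize ||b - A(x + a*p)|| over a."""
    p = rng.standard_normal(x.size)
    r = b - A @ x
    Ap = A @ p
    a = (r @ Ap) / (Ap @ Ap)        # optimal step length along p
    return x + a * p

rng = np.random.default_rng(4)
A = rng.standard_normal((20, 20)) + 20 * np.eye(20)   # well-conditioned
b = rng.standard_normal(20)
x = np.zeros(20)
for _ in range(2000):
    x = random_minres_step(A, b, x, rng)
print(np.linalg.norm(b - A @ x) < 1e-6)               # True
```

The residual norm never increases (a = 0 is always admissible), and for an invertible, well-conditioned A the random directions make it decrease in expectation by a fixed factor per step, so the iteration converges, if slowly compared to Krylov methods.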
