NLA MasterMath - Weekly Schedule 2018

Week 5

This week we looked at the consequences of symmetry of the system matrix A: the upper Hessenberg matrix in the Arnoldi method then becomes symmetric, hence tridiagonal, which leads to three-term recurrence relations.
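The resulting short recurrence is the Lanczos process. A minimal sketch (the function name and tolerance are illustrative, not from the lecture notes), assuming numpy and a symmetric A:

```python
import numpy as np

def lanczos(A, v, m):
    """Lanczos process for symmetric A: builds an orthonormal basis V of
    the Krylov subspace and a symmetric tridiagonal T = V^T A V using
    only a three-term recurrence (two previous basis vectors)."""
    n = len(v)
    V = np.zeros((n, m + 1))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]   # three-term recurrence: only
        alpha[j] = V[:, j] @ w               # the two previous vectors
        w -= alpha[j] * V[:, j]              # are needed explicitly
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-14:                  # (lucky) breakdown
            break
        V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    return V[:, :m], T
```

In exact arithmetic the basis stays fully orthogonal even though each step only orthogonalizes against two vectors; in floating point this orthogonality is gradually lost.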

We covered:

  • Chapter 6: Krylov Subspace Methods
  • Section 6.6 (without 6.6.2)
  • Section 6.7 (without 6.7.3)
  • Section 6.8
Suggested exercises are:
  • 6.20, 6.21, 6.23 (only for GMRES and FOM)

Week 4

We will start with a short test that will count towards your final grade for the course.

We will then cover:

  • Chapter 6: Krylov Subspace Methods
  • Sections 6.1, 6.2, 6.3
  • 6.4 (without 6.4.2)
  • 6.5 (without 6.5.6 and 6.5.8 and 6.5.9)
In particular, we derived in detail:
  • The Full Orthogonalization Method
  • The Generalized Minimal Residual Method
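Both FOM and GMRES are built on the Arnoldi process, which constructs an orthonormal basis of the Krylov subspace together with an upper Hessenberg matrix. A minimal sketch (names and tolerance are illustrative), assuming numpy:

```python
import numpy as np

def arnoldi(A, v, m):
    """Arnoldi process: orthonormal basis V of K_m(A, v) and an
    (m+1) x m upper Hessenberg H with A V_m = V_{m+1} H."""
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):               # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:              # breakdown: K_m is invariant
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H
```

FOM then solves the small square system H_m y = beta e_1, while GMRES solves the (m+1) x m least-squares problem min ||beta e_1 - H y||_2.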
Suggested exercises are:
  • 6.8, 6.9, 6.10, 6.13

Week 3

We covered:
  • Chapter 5: Projection Methods
  • Excluding Sect 5.4
The following concepts should be understood:
  • Subspace methods with L=K and L=AK
  • A-inner products and norms
  • A*A-inner product and norms
  • Algorithm Minimal Residuals Sect 5.3.2
  • Algorithm Steepest Descent Sect 5.3.1
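Both one-dimensional projection algorithms use the residual r = b - A x as search direction and differ only in the step length. A minimal sketch of Sect 5.3.1 and 5.3.2 (function names and tolerances are illustrative), assuming numpy:

```python
import numpy as np

def steepest_descent(A, b, x, tol=1e-10, maxit=500):
    """Steepest descent (L = K, A SPD): step alpha = (r, r) / (r, Ar)
    minimizes the A-norm of the error along the residual direction."""
    for _ in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r
        x = x + ((r @ r) / (r @ Ar)) * r
    return x

def minimal_residual(A, b, x, tol=1e-10, maxit=500):
    """Minimal residual iteration (L = AK): same direction r, but the
    step alpha = (Ar, r) / (Ar, Ar) minimizes ||b - A x_new||_2."""
    for _ in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r
        x = x + ((Ar @ r) / (Ar @ Ar)) * r
    return x
```

Steepest descent requires A to be SPD, while minimal residual only needs the symmetric part of A to be positive definite.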
Suggested exercises are:
  • 5.1
  • 5.4, 5.8 (related to Ths 5.7 and 5.10)
  • 5.12 (random search direction)
  • 5.10 and 5.11
  • 5.16 and 5.17

Week 2

We covered:
  • Chapter 4: Basic Iterative Methods
  • Excluding: 4.1.1: overlapping block methods
  • Excluding: 4.2.5, 4.3
The following concepts should be understood:
  • splittings of a matrix
  • fixed point iterations for affine transformations
  • convergent if and only if the spectral radius of the iteration matrix is smaller than one
  • errors versus residuals: both are powers of the iteration matrix applied to the initial error / residual
  • preconditioning: either to improve condition number or to reduce the spectral radius of iteration matrix
  • sufficient conditions for convergent iterations:
    • regular splittings for inverse nonnegative matrices
    • matrix A strictly diagonally dominant
    • A is SPD
  • basic methods as update & correction algorithms
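As an illustration of these concepts, here is a sketch of the Jacobi method written in update & correction form, together with its iteration matrix (function names are illustrative, not from the lecture notes; assumes numpy):

```python
import numpy as np

def jacobi(A, b, x, iters=100):
    """Jacobi iteration from the splitting A = D - (D - A), in
    update & correction form: x <- x + D^{-1} (b - A x)."""
    d = np.diag(A)
    for _ in range(iters):
        x = x + (b - A @ x) / d
    return x

def spectral_radius_jacobi(A):
    """Spectral radius of the Jacobi iteration matrix G = I - D^{-1} A;
    the iteration converges for every x0 iff this is smaller than one."""
    d = np.diag(A)
    G = np.eye(len(d)) - A / d[:, None]
    return max(abs(np.linalg.eigvals(G)))
```

For a strictly diagonally dominant A the row sums of |G| are below one, so the spectral radius is below one and the iteration converges, matching the sufficient condition above.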
Suggested exercises are:
  • 4.1, 4.4
  • 4.7 (but 'consistently ordered matrices' will not be tested)
Further exercises were given on the blackboard:

1) give necessary and sufficient (N&S) conditions for the Jacobi and Gauss-Seidel methods to converge for an arbitrary invertible 2x2 matrix

2) apply the Jacobi method to an upper triangular system

3) implement the random minimal residual method (which selects a random update direction of current approximation and minimizes the residual along that direction)
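For exercise 3, one possible sketch of the random minimal residual method (function name, seed handling and iteration count are illustrative; assumes numpy):

```python
import numpy as np

def random_minimal_residual(A, b, x, iters, rng):
    """Random minimal residual: pick a random direction p, then choose
    alpha minimizing ||b - A (x + alpha p)||_2 over alpha, which gives
    alpha = (r, Ap) / (Ap, Ap). The residual norm never increases."""
    for _ in range(iters):
        r = b - A @ x
        p = rng.standard_normal(len(b))      # random update direction
        Ap = A @ p
        x = x + ((r @ Ap) / (Ap @ Ap)) * p   # minimize along p
    return x
```

Each step projects the residual onto the complement of span{Ap}, so the residual norm is monotonically non-increasing, but convergence is typically slow compared to choosing p = r.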

4) With A = [1 p q ; -1 1 r ; 0 -1 1] give N&S conditions for Gauss-Seidel to converge.

Week 1

We covered:
  • Chapter 3: Sparse Matrices
  • Sections 3.1, 3.2, 3.3 only
The following concepts should be understood:
  • adjacency graph
  • row- and column permutations of matrices
  • symmetric permutation of a matrix
  • reordering of equations and unknowns in a linear system
  • the LU and Cholesky decompositions depend on reorderings
  • Cuthill-McKee and Reverse Cuthill-McKee (RCM)
  • reorderings based on independent sets and multi-colorings
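The Cuthill-McKee ordering can be sketched as a breadth-first search on the adjacency graph that visits neighbours in order of increasing degree; reversing the result gives RCM. A minimal sketch (function name and graph representation are illustrative):

```python
from collections import deque

def cuthill_mckee(adj):
    """Cuthill-McKee ordering of an undirected graph given as a dict
    mapping each node to the set of its neighbours. BFS starts from a
    lowest-degree node and visits neighbours by increasing degree;
    reversed(order) is the Reverse Cuthill-McKee (RCM) ordering."""
    order = []
    visited = set()
    for start in sorted(adj, key=lambda v: len(adj[v])):
        if start in visited:
            continue                          # new connected component
        visited.add(start)
        queue = deque([start])
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in sorted(adj[v] - visited, key=lambda u: len(adj[u])):
                visited.add(w)
                queue.append(w)
    return order
```

Applying the resulting symmetric permutation to the matrix tends to cluster the nonzeros near the diagonal, reducing the bandwidth and hence the fill-in in LU or Cholesky factorization.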
Suggested exercises are:
  • 3.4, 3.5, 3.6, 3.7, 3.8, 3.10, 3.11, 3.12
See also the pdf section for Lecture Notes of:
  • Linear Algebra 2
  • Introduction to Numerical Mathematics: SVD
which are (part of) two first-year courses.