NLA exercises 2017

Week 12: Lecture 30
We considered the Jacobi method for approximating eigenvalues, and the bisection method. Furthermore, and to be continued, we sketched the idea of Divide-and-Conquer.
Exercises: 30.1, 30.4, 30.6; Exam 2 from 2014, Exercise 4 (interlacing singular values?). And two QR-related exercises: Exam 2 from 2014, Exercise 2 (the ``nonsymmetric QR-lemma''); Exam 2 from 2015, Exercise 3 (shifting with an eigenvalue).
Handing in to be graded: Exam 2014 Ex 4 and Exam 2015 Ex 3.

Week 11: Lecture 29
This week we consider the full QR algorithm, including shifts. We prove an elegant lemma with a rather surprising interpretation. As a consequence, we consider the RQI shift and the Wilkinson shift.
Exercises: Third Matlab Exercise = T&B Exercise 29.1:
Please send this file by Sunday November 26, 23:59 to Jan. In your email, include the discussion as asked for in part (e).
To better understand the mentioned lemma,

Week 10: Lectures 27, 28
After finishing Lecture 27 we consider the QR iteration as introduced in Lecture 28. We restrict ourselves to real symmetric matrices. It is shown that it is mathematically equivalent to Simultaneous Iteration (also called the Block Power Method) and we study its convergence.
Exercises: 27.2 (I do not know an elementary proof of 27.2(a)!), 28.1, 28.2, 28.3; Exercise 1 from the Tweede Deeltentamen (Second Midterm) 2015 (on Rayleigh Quotient Iteration).
Handing in to be graded: 27.2(b) and Exercise 1 (a,b,c,d) from the Tweede Deeltentamen 2015.

Week 09: Lectures 24, 25, 26, 27
Lecture 24 reviews all basics about eigenvalues and eigenvectors, including diagonalizability and the Schur decomposition. In Lectures 25-26, the two-phase structure present in a number of algorithms is explained. Lecture 27 contains the first algorithms: the Power Method, Inverse Iteration, and Rayleigh Quotient Iteration. So far we have only seen the Power Method.
The following exercises are, in fact, important perturbation theorems:
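As a numerical illustration of such perturbation results, here is a small sketch of my own (in Python/NumPy, though the course itself uses Matlab): for a symmetric matrix A and a symmetric perturbation E, Weyl's inequality says that each (sorted) eigenvalue of A + E differs from the corresponding eigenvalue of A by at most ||E||_2.

```python
import numpy as np

# Hedged sketch (not from the course notes): Weyl's inequality for a
# symmetric matrix A under a symmetric perturbation E. Test data is random.
rng = np.random.default_rng(0)

n = 8
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric A
E = 1e-6 * rng.standard_normal((n, n)); E = (E + E.T) / 2

lam = np.linalg.eigvalsh(A)             # eigenvalues of A, sorted
lam_pert = np.linalg.eigvalsh(A + E)    # eigenvalues of A + E, sorted

shift = np.max(np.abs(lam_pert - lam))  # largest eigenvalue movement
bound = np.linalg.norm(E, 2)            # ||E||_2
print(shift <= bound)                   # expected: True
```

The experiment only verifies the bound on one random example; the exercises ask for the actual proofs.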
Handing in to be graded: 25.1, 25.3, 26.1

Week 08: Lectures 20, 21, 22, 23
In these four lectures we look at Gaussian elimination from the computational point of view. We look at the algorithm, at so-called pivoting to enhance stability, and we re-examine our definition of stability based on a gruelling example. As a special case, we look (again) at Cholesky factorization.
Exercises: 20.1, 20.2, 20.3, 21.1, 22.1, 23.1.
Handing in to be graded: 21.1, 22.1, 23.1

Week 06: Lects 16, 17
In this week's lectures we study the stability of two algorithms:
(1) Householder reflection QR-factorization
The first will be done on a global level, the second in detail.
Try exercises 16.1, 17.1, 17.2, Exam 2013 Ex 3 (c)(d).
Handing in to be graded: Exam 2013 Ex 3 (c)(d).

Week 05: Lects 12, 14, 15
We encountered conditioning of problems, and stability of algorithms to solve these problems. A backward stable algorithm applied to a well-conditioned problem results in an accurately computed solution (see Th 15.1). Backward stability is well illustrated at the beginning of Lect 15: subtracting two numbers is backward stable; bad results in case they are almost equal are caused by ill-conditioning of the mathematical problem of subtraction, and not by the algorithm.
Try exercises 12.1, 14.1, 14.2, 15.1, 15.2. Apart from that, Ex. 3(a)(b) from Exam 1, 2013 (next week you continue with (c) and (d)) and Ex. 2 from Exam 1, 2015.
Handing in to be graded: Ex. 3(a)(b) from Exam 1, 2013.

Week 04: Lects 11-12
We summarized our analysis of Least-Squares problems and algorithms to solve them:
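One of those least-squares algorithms, solution via the reduced QR factorization (Lecture 11), can be sketched in a few lines. The sketch below is in Python/NumPy rather than the Matlab used in the course, and the random test data is my own choice:

```python
import numpy as np

# Sketch: solve min ||Ax - b||_2 via the reduced QR factorization A = QR,
# as summarized in Lecture 11. Test data is arbitrary, for illustration only.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))   # tall matrix, full rank with probability 1
b = rng.standard_normal(20)

Q, R = np.linalg.qr(A)             # reduced QR: Q is 20x5, R is 5x5
x = np.linalg.solve(R, Q.T @ b)    # solve the triangular system R x = Q^T b

# Agrees with NumPy's built-in least-squares solver:
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref))       # expected: True
```

This is the QR route; the same problem can of course be attacked via the normal equations or the SVD, with the stability trade-offs discussed in these lectures.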
Then we moved to the concept of conditioning of a problem. We defined absolute and relative condition numbers and gave intuition behind their definition. We showed that catastrophic cancellation is not (just) due to finite precision arithmetic, but due to inherent ill-conditioning of subtraction. We finished by considering the conditioning of roots of polynomials in terms of perturbations of their coefficients. Multiple roots cause serious problems, but single roots may also be very sensitive to perturbations.
Because of Matlab Exercise 1, there will be no additional exercises.

Week 03: Lects 9-11, and 31, 23 superficially
We recap the triangular orthogonalization of Lecture 8, to contrast it with the orthogonal triangularization of Lecture 10. Specifically, we define Householder reflections, and the QR-decomposition using these. Also, we use Givens Rotations (Ex. 10.4) to do the same. We also show that both these orthogonal transformations can be employed to factorize a given matrix A as A = UBV where U and V are unitary, and B is bidiagonal (see also Lecture 31). Finally we go through Lecture 11 on Least Squares Problems. We will briefly comment on the Cholesky decomposition, which is mentioned in Lecture 11 and treated in detail in Lecture 23.
It would be good to try the following old exam questions, being:
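The Givens Rotations mentioned above (Ex. 10.4) are easy to sketch. Below is a small illustration of my own in Python/NumPy (the course uses Matlab): choose c and s so that the 2x2 rotation zeroes the second component of a vector.

```python
import numpy as np

# Sketch of a single Givens rotation (cf. Ex. 10.4): pick c, s so that
# [[c, s], [-s, c]] maps (a, b) to (r, 0) with r = sqrt(a^2 + b^2).
def givens(a, b):
    if b == 0.0:
        return 1.0, 0.0
    r = np.hypot(a, b)
    return a / r, b / r

a, b = 3.0, 4.0
c, s = givens(a, b)
G = np.array([[c, s], [-s, c]])     # orthogonal: G.T @ G = I
v = G @ np.array([a, b])
print(np.allclose(v, [5.0, 0.0]))   # expected: True
```

Applying such rotations repeatedly, one entry at a time, gives the QR-decomposition, and with rotations from both sides the bidiagonal factorization A = UBV mentioned above.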
Please also look at Exercises 10.1 and 11.1 from T&B.
Handing in to be graded: Exercise 1 (a)-(f) from the Exam of 2015. Deadline: the beginning of the lecture of October 4.

Week 02: Lects 13, 6-8
We consider some aspects of Finite Precision Arithmetic (FPA). Try to make exercises 13.1, 13.2, 13.3. Exercise 13.3 shows that the order of computations matters! In this light we continued by studying the difference between the Classical Gram-Schmidt (CGS) algorithm and its modified version (MGS). In exact arithmetic, they are the same.
Lecture 6 on projections should cause no difficulties. Exercises 6.1 to 6.4 are easy. Try also the more difficult 6.5. Lecture 7 discusses CGS, and the QR-decomposition (unitary times upper triangular) of an arbitrary matrix. Two related examples discuss CGS in a space of polynomials, and solving Ax=b using the QR-decomposition. Try Exercises 7.1 and 7.5. Lecture 8 introduces MGS and the interpretation of both CGS and MGS as triangular orthogonalization (to be discussed next week).
Additional exercises:
(1)
(2)
Handing in to be graded: Exercises 6.5 and 13.2. Deadline: the beginning of next lecture (not after!)

Week 01: Lects 1-5
Please carefully read the rather basic Lectures 1-5 of Trefethen and Bau (T&B). Then, look at the exercises belonging to them, and concentrate on the ones which seem difficult (there should not be too many!) Try as many of these difficult ones as you can in the time you have, while also starting up Matlab and playing around with it a bit to renew your feel for it. Matlab may also help you to solve some of the problems!
Since the first week's exercises only review prerequisites, you cannot score homework points for them. It is, however, possible to hand in problematic ones in order to receive feedback.
See also the pdf section for Lecture Notes of:

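Finally, the CGS/MGS contrast from Week 02 can be observed numerically. Below is a sketch of my own in Python/NumPy (the course uses Matlab), with an ill-conditioned Vandermonde test matrix of my own choosing; in floating point, MGS typically keeps Q far closer to orthogonal than CGS does.

```python
import numpy as np

# Sketch (not course code): Classical vs Modified Gram-Schmidt. In exact
# arithmetic both compute the same reduced QR; in floating point the loss
# of orthogonality ||Q^T Q - I|| is typically far worse for CGS.
def cgs(A):
    m, n = A.shape
    Q, R = np.zeros((m, n)), np.zeros((n, n))
    for j in range(n):
        R[:j, j] = Q[:, :j].T @ A[:, j]       # coefficients against original a_j
        v = A[:, j] - Q[:, :j] @ R[:j, j]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

def mgs(A):
    m, n = A.shape
    Q, R = A.astype(float).copy(), np.zeros((n, n))
    for j in range(n):
        R[j, j] = np.linalg.norm(Q[:, j])
        Q[:, j] /= R[j, j]
        R[j, j+1:] = Q[:, j] @ Q[:, j+1:]     # project the remaining columns
        Q[:, j+1:] -= np.outer(Q[:, j], R[j, j+1:])
    return Q, R

# An ill-conditioned test matrix (a Vandermonde matrix; my choice):
A = np.vander(np.linspace(0.0, 1.0, 40), 8, increasing=True)
for name, alg in (("CGS", cgs), ("MGS", mgs)):
    Q, R = alg(A)
    print(name, np.linalg.norm(Q.T @ Q - np.eye(8)))
```

Both variants still satisfy QR ≈ A to machine precision; it is only the orthogonality of Q that degrades, which is exactly the point of the Week 02 discussion.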