Just a short post today to recommend this article on knee strengthening exercises by Martin Koban. Obviously, exercises to strengthen and stabilize the knee are of great interest to me at the moment as I recover from my knee surgery. I’d say that I was probably already familiar with about half of these exercises, and I look forward to working up to doing them all. Hopefully, consistent practice of these exercises will stave off future knee issues.
The topic of this third part of MA398 Matrix Analysis and Algorithms is the study of the (usually inevitable) differences between computed and mathematically exact results, particularly in our three prototypical problems of simultaneous linear equations (SLE), least squares (LSQ), and eigenvalue/eigenvector problems (EVP). The first key point is that we split the error analysis into two parts: conditioning, which is a property of the problem alone, and stability, which is a property of the algorithm used to (approximately) solve the problem.
A useful point of view is that of backward error analysis: the computed solution to the problem is viewed as the exact solution to a perturbed problem. If that perturbation in “problem space” is small then the algorithm is called backward stable, and unstable otherwise. Once the original and perturbed problems have been identified, we wish to understand the difference between their respective exact solutions — this is the issue of conditioning. The three prototypical problems can all be viewed as solving an equation of the form G(y, w) = 0, where w denotes the data that define the problem and y the solution.
| | SLE | LSQ | EVP |
|---|---|---|---|
| Solution, y | x | x | (x, λ) |
| Data, w | (A, b) | (A, b) | A |
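To make the common framework concrete, here is one possible choice of G for each of the three problems (the LSQ and EVP forms below are my own illustrative choices, via the normal equations and a normalization constraint respectively; other formulations are equally valid):

$$
\begin{aligned}
\text{SLE:} \quad & G(x, (A, b)) = Ax - b, \\
\text{LSQ:} \quad & G(x, (A, b)) = A^{T}(Ax - b), \\
\text{EVP:} \quad & G((x, \lambda), A) = \begin{pmatrix} Ax - \lambda x \\ \|x\|_2^2 - 1 \end{pmatrix}.
\end{aligned}
$$

In each case the exact solution y is characterized by G(y, w) = 0, which is what lets us treat conditioning for all three problems in a uniform way.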
In the backward error analysis point of view, the computed solution ŷ solves G(ŷ, ŵ) = 0. Conditioning concerns the estimation of Δy ≔ y−ŷ in terms of Δw ≔ w−ŵ.
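As a small numerical illustration of these ideas for the SLE case, the sketch below (my own example, not from the course notes) solves Ax = b, estimates a normwise backward error η via the residual in the spirit of the Rigal–Gaches formula, and compares the forward error ‖Δy‖/‖y‖ against the rule of thumb that it is roughly bounded by κ(A) · η. The specific matrix and the use of Frobenius/2-norms are assumptions made for the demo:

```python
import numpy as np

# A moderately ill-conditioned 5x5 system A x = b (SLE),
# built so that the exact solution is known.
A = np.vander(np.linspace(1.0, 2.0, 5))
x_exact = np.ones(5)
b = A @ x_exact

# Computed solution y_hat (LU-based solve).
y_hat = np.linalg.solve(A, b)

# Normwise backward error: the computed y_hat exactly solves a
# nearby system (A + dA) y_hat = b + db; eta estimates how near.
r = b - A @ y_hat
eta = np.linalg.norm(r) / (
    np.linalg.norm(A) * np.linalg.norm(y_hat) + np.linalg.norm(b)
)

# Forward (relative) error and the condition number kappa(A).
fwd = np.linalg.norm(y_hat - x_exact) / np.linalg.norm(x_exact)
kappa = np.linalg.cond(A)

# Rule of thumb: forward error is at most about kappa(A) * eta.
print(f"backward error eta ~ {eta:.1e}")
print(f"kappa(A)           ~ {kappa:.1e}")
print(f"forward error      ~ {fwd:.1e}")
```

The point of running this is to see the split in action: η is tiny (the solver is backward stable), while the forward error is inflated by the conditioning of the problem, not by any defect of the algorithm.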