problem exactly in one pass; instead, solve it approximately, then iterate. Multigrid methods,
perhaps the most important development in numerical computation in the past twenty years,
are based on a recursive application of this idea.
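As a purely illustrative sketch of that recursion (none of the names or parameter choices below come from the text), here is a toy multigrid V-cycle for the one-dimensional Poisson equation -u'' = f with zero boundary values: smooth the error a little, transfer the residual to a coarser grid, solve there by calling the same routine recursively, and interpolate the correction back.

```python
import numpy as np

def apply_A(u, h2):
    """Matrix-free discrete Laplacian: (-u'')_i = (2u_i - u_{i-1} - u_{i+1}) / h^2."""
    up = np.pad(u, 1)  # zero Dirichlet boundary values
    return (2 * up[1:-1] - up[:-2] - up[2:]) / h2

def vcycle(u, f):
    """One V-cycle for -u'' = f on n = 2^k - 1 interior points:
    smooth, restrict the residual, recurse, interpolate the correction, smooth."""
    n = len(f)
    h2 = 1.0 / (n + 1) ** 2
    if n == 1:                            # coarsest grid: solve exactly
        return f * h2 / 2.0
    for _ in range(3):                    # pre-smoothing: weighted Jacobi
        u = u + (2.0 / 3.0) * (h2 / 2.0) * (f - apply_A(u, h2))
    r = f - apply_A(u, h2)
    rc = 0.25 * r[:-2:2] + 0.5 * r[1::2] + 0.25 * r[2::2]  # full-weighting restriction
    ec = vcycle(np.zeros_like(rc), rc)    # recursive coarse-grid correction
    e = np.zeros(n)
    e[1::2] = ec                          # interpolate the correction linearly
    ecp = np.pad(ec, 1)                   # back to the fine grid
    e[0::2] = 0.5 * (ecp[:-1] + ecp[1:])
    u = u + e
    for _ in range(3):                    # post-smoothing
        u = u + (2.0 / 3.0) * (h2 / 2.0) * (f - apply_A(u, h2))
    return u
```

Each cycle is approximate, yet a handful of cycles drives the residual down by many orders of magnitude at O(n) cost per cycle -- exactness abandoned in favor of cheap, repeatable improvement.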
Even direct algorithms have been affected by the new manner of computing. Thanks to the
work of Skeel and others, it has been noticed that the expense of making a direct method
stable---say, of pivoting in Gaussian elimination---may in certain contexts be cost-ineffective.
Instead, skip that step---solve the problem directly but unstably, then do one or two steps of
iterative refinement. ``Exact'' Gaussian elimination becomes just another preconditioner!
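The recipe can be sketched in a few lines of NumPy; the function names and structure below are my own illustration, not a library interface. Factor A once without pivoting (cheap, possibly unstable), then reuse the factors to correct the solution from its own residual:

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization with no pivoting: cheap, but potentially unstable."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

def solve_refined(A, b, steps=2):
    """Solve Ax = b unstably, then do a few steps of iterative refinement."""
    L, U = lu_no_pivot(A)
    lu_solve = lambda r: np.linalg.solve(U, np.linalg.solve(L, r))
    x = lu_solve(b)
    for _ in range(steps):
        r = b - A @ x        # residual of the current approximate solution
        x = x + lu_solve(r)  # reuse the cheap factorization to correct x
    return x
```

One or two refinement steps typically recover the accuracy that pivoting would have bought, while the O(n^3) factorization work is done only once: the exact solver has been demoted to a preconditioner inside an iteration.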
Other problems besides Ax = b have undergone analogous changes, and the most famous example
is linear programming. Linear programming problems are mathematically finite, and for
decades, people solved them by a finite algorithm: the simplex method. Then Karmarkar
announced in 1984 that iterative, infinite algorithms are sometimes better. The result has
been controversy, intellectual excitement, and a perceptible shift of the entire field of linear
programming away from the rather anomalous position it has traditionally occupied towards
the mainstream of numerical computation.
I believe that the existence of finite algorithms for certain problems, together with other
historical forces, has distracted us for decades from a balanced view of numerical analysis.
Rounding errors and instability are important, and numerical analysts will always be the
experts in these subjects and at pains to ensure that the unwary are not tripped up by
them. But our central mission is to compute quantities that are typically uncomputable,