I had thought the matrix inversion lemma was difficult to prove, but it is in fact not so tricky!
The lemma states that if A and C are square invertible matrices (and B, D are matrices such that A and BCD have the same dimensions), then

$$(A + BCD)^{-1} = A^{-1} - A^{-1}B(C^{-1} + DA^{-1}B)^{-1}DA^{-1}. \qquad (*)$$
Thanks to the derivation below, adapted from Boyd and Vandenberghe's Convex Optimization (Appendix C.4.3), it is now easier for me to derive this formula than to remember it, the way much of mathematics should be. Other ways of arriving at the formula are by matrix blockwise elimination or inversion. See the Wikipedia entry on the Woodbury matrix identity (another name for the lemma) for more information.
1. Start with the equation $(A + BD)x = b$. We find $x$ in terms of $b$ either as $x = (A + BD)^{-1}b$ or as follows.
2. Let $y = Dx$, giving us the two equations
$$Ax + By = b, \qquad (1)$$
$$y = Dx. \qquad (2)$$
3. From (1) we obtain
$$x = A^{-1}(b - By). \qquad (3)$$
4. Substituting (3) into (2) gives $y = DA^{-1}(b - By)$, and rearranging this gives
$$y = (I + DA^{-1}B)^{-1}DA^{-1}b. \qquad (4)$$
5. From (3) and (4) we end up with
$$x = A^{-1}b - A^{-1}B(I + DA^{-1}B)^{-1}DA^{-1}b.$$
6. Since $b$ was arbitrary, from step 1 we conclude that
$$(A + BD)^{-1} = A^{-1} - A^{-1}B(I + DA^{-1}B)^{-1}DA^{-1}.$$
7. To arrive at the slightly more complicated form (*) we replace $B$ with $BC$ and note that
$$C(I + DA^{-1}BC)^{-1} = C\left[(C^{-1} + DA^{-1}B)\,C\right]^{-1} = (C^{-1} + DA^{-1}B)^{-1}$$
(using the result $(XY)^{-1} = Y^{-1}X^{-1}$).
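The identity derived above can also be checked numerically. Here is a minimal sanity check using NumPy; the matrix sizes, the random seed, and the diagonal shift used to keep A and C well conditioned are illustrative choices of mine, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# A is n x n, C is k x k; B is n x k and D is k x n, so BCD is n x n.
n, k = 5, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)  # shifted to keep A invertible
C = rng.standard_normal((k, k)) + k * np.eye(k)  # shifted to keep C invertible
B = rng.standard_normal((n, k))
D = rng.standard_normal((k, n))

Ainv = np.linalg.inv(A)

# Left-hand side of (*): direct inversion of A + BCD.
lhs = np.linalg.inv(A + B @ C @ D)

# Right-hand side of (*): the matrix inversion lemma.
rhs = Ainv - Ainv @ B @ np.linalg.inv(np.linalg.inv(C) + D @ Ainv @ B) @ D @ Ainv

print(np.allclose(lhs, rhs))  # True
```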
The matrix inversion lemma is especially useful when it is easy to invert A and C, e.g. if they are diagonal or have small dimension. The latter may occur in recursive formulas such as recursive least squares or the Kalman filter. The lemma is actually a special case of the Binomial inverse theorem, which applies even when C is not invertible:
$$(A + BCD)^{-1} = A^{-1} - A^{-1}BC\,(C + CDA^{-1}BC)^{-1}\,CDA^{-1}.$$
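To illustrate the "easy to invert A and C" case: when A is diagonal and C is a small k x k matrix, the lemma replaces an n x n inversion with a k x k one plus a few matrix products. A rough sketch, assuming NumPy (dimensions and seed are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 300, 3

# A diagonal (trivial to invert), C small (k x k).
a = rng.uniform(1.0, 2.0, size=n)   # diagonal entries of A
Ainv_diag = 1.0 / a                 # A^{-1} is also diagonal
B = rng.standard_normal((n, k))
C = np.eye(k)
D = rng.standard_normal((k, n))

# (A + BCD)^{-1} via the lemma: only a k x k matrix is inverted.
AinvB = Ainv_diag[:, None] * B      # A^{-1} B via row scaling
DAinv = D * Ainv_diag[None, :]      # D A^{-1} via column scaling
small = np.linalg.inv(np.linalg.inv(C) + D @ AinvB)
lemma_inv = np.diag(Ainv_diag) - AinvB @ small @ DAinv

# Compare against brute-force inversion of the full n x n matrix.
direct_inv = np.linalg.inv(np.diag(a) + B @ C @ D)
print(np.allclose(lemma_inv, direct_inv))  # True
```

The same structure underlies the recursive least-squares and Kalman-filter updates mentioned above, where the low-rank correction arrives one observation at a time.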
A couple more special cases of the matrix inversion lemma follow:
1. Sherman-Morrison formula (B replaced by a column vector u, D by $v^{\top}$ for a column vector v, and C by the identity):
$$(A + uv^{\top})^{-1} = A^{-1} - \frac{A^{-1}uv^{\top}A^{-1}}{1 + v^{\top}A^{-1}u}.$$
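In this rank-one case the small inverse collapses to a scalar division, so an existing $A^{-1}$ can be updated with no further matrix inversion at all. A quick check, assuming NumPy (sizes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)  # shifted to keep A invertible
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

Ainv = np.linalg.inv(A)

# Sherman-Morrison: the (C^{-1} + D A^{-1} B) term is now the scalar
# 1 + v^T A^{-1} u, so the update is just outer products and a division.
denom = 1.0 + (v.T @ Ainv @ u).item()
sm = Ainv - (Ainv @ u @ v.T @ Ainv) / denom

direct = np.linalg.inv(A + u @ v.T)
print(np.allclose(sm, direct))  # True
```

Note that the update is only valid when the denominator $1 + v^{\top}A^{-1}u$ is nonzero, which is exactly the condition for $A + uv^{\top}$ to be invertible.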
 S. Boyd and L. Vandenberghe, Convex Optimization (Appendix C.4.3), Cambridge University Press, 2004