In [[mathematics]], the '''Binomial Inverse Theorem''' is useful for expressing [[matrix (mathematics)|matrix]] inverses in different ways.

If '''A''', '''U''', '''B''', '''V''' are matrices of sizes ''p''×''p'', ''p''×''q'', ''q''×''q'', ''q''×''p'', respectively, then

:<math>
\left(\mathbf{A}+\mathbf{UBV}\right)^{-1}=
\mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{UB}\left(\mathbf{B}+\mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}\mathbf{BVA}^{-1}
</math>

provided '''A''' and '''B''' + '''BVA'''<sup>−1</sup>'''UB''' are nonsingular. Note that if '''B''' is invertible, the two '''B''' factors flanking the inverted quantity on the right-hand side can be written as ('''B'''<sup>−1</sup>)<sup>−1</sup> and absorbed into that inverse, which results in

:<math>
\left(\mathbf{A}+\mathbf{UBV}\right)^{-1}=
\mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{U}\left(\mathbf{B}^{-1}+\mathbf{VA}^{-1}\mathbf{U}\right)^{-1}\mathbf{VA}^{-1}.
</math>

This is the [[matrix inversion lemma]], which can also be derived using [[Invertible matrix#Blockwise inversion|matrix blockwise inversion]].
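
As a quick sanity check of this last formula, consider the scalar case ''p'' = ''q'' = 1, writing the entries as ''a'', ''u'', ''b'', ''v'' with ''a'', ''b'', and ''a'' + ''ubv'' all nonzero. The right-hand side becomes

:<math>
\frac{1}{a} - \frac{1}{a}\,u \left(\frac{1}{b} + \frac{v u}{a}\right)^{-1} v\,\frac{1}{a}
= \frac{1}{a} - \frac{u b v}{a\left(a + u b v\right)}
= \frac{a + u b v - u b v}{a\left(a + u b v\right)}
= \frac{1}{a + u b v},
</math>

which is indeed (''a'' + ''ubv'')<sup>−1</sup>.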

==Verification==
First notice that

:<math>\left(\mathbf{A} + \mathbf{UBV}\right) \mathbf{A}^{-1}\mathbf{UB} = \mathbf{UB} + \mathbf{UBVA}^{-1}\mathbf{UB} = \mathbf{U} \left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right).</math>

Now multiply the matrix we wish to invert by its claimed inverse:

:<math>\left(\mathbf{A} + \mathbf{UBV}\right) \left( \mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{UB}\left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}\mathbf{BVA}^{-1} \right)</math>
:<math>= \mathbf{I}_p + \mathbf{UBVA}^{-1} - \mathbf{U} \left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right) \left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}\mathbf{BVA}^{-1}</math>
:<math>= \mathbf{I}_p + \mathbf{UBVA}^{-1} - \mathbf{UBVA}^{-1} = \mathbf{I}_p,</math>

which verifies that it is the inverse.

Thus, if '''A'''<sup>−1</sup> and <math>\left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}</math> exist, then <math>\left(\mathbf{A} + \mathbf{UBV}\right)^{-1}</math> exists and is given by the formula of the theorem.<ref name="strang">{{cite book | author = Gilbert Strang | title = Introduction to Linear Algebra | edition = 3rd | year = 2003 | publisher = Wellesley-Cambridge Press | location = Wellesley, MA | isbn = 0-9614088-9-8}}</ref>
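
As a purely numerical illustration of this statement, the formula can be spot-checked on random matrices. The NumPy sketch below is not taken from the cited reference; the sizes ''p'' = 5 and ''q'' = 3, the random seed, and the diagonal shifts (which keep '''A''' and the inner matrix well conditioned) are arbitrary choices.

<syntaxhighlight lang="python">
import numpy as np

# Spot check of the binomial inverse theorem for one random instance.
p, q = 5, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((p, p)) + p * np.eye(p)   # diagonal shift keeps A well conditioned
U = rng.standard_normal((p, q))
B = rng.standard_normal((q, q)) + q * np.eye(q)   # likewise for B
V = rng.standard_normal((q, p))

A_inv = np.linalg.inv(A)
M = B + B @ V @ A_inv @ U @ B                     # B + B V A^{-1} U B
rhs = A_inv - A_inv @ U @ B @ np.linalg.inv(M) @ B @ V @ A_inv
lhs = np.linalg.inv(A + U @ B @ V)

print(np.allclose(lhs, rhs))                      # expected: True
</syntaxhighlight>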

==Special cases==
If ''p'' = ''q'' and '''U''' = '''V''' = '''I'''<sub>''p''</sub> is the identity matrix, then

:<math>
\left(\mathbf{A}+\mathbf{B}\right)^{-1} = \mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{B}\left(\mathbf{B}+\mathbf{BA}^{-1}\mathbf{B}\right)^{-1}\mathbf{BA}^{-1}.
</math>

Recalling the identity

:<math>
\left(\mathbf{A} \mathbf{B}\right)^{-1} = \mathbf{B}^{-1} \mathbf{A}^{-1},
</math>

and writing '''B''' + '''BA'''<sup>−1</sup>'''B''' = ('''I''' + '''BA'''<sup>−1</sup>)'''B''', we can express the previous equation in the simpler form

:<math>
\left(\mathbf{A}+\mathbf{B}\right)^{-1} = \mathbf{A}^{-1} - \mathbf{A}^{-1}\left(\mathbf{I}+\mathbf{B}\mathbf{A}^{-1}\right)^{-1}\mathbf{B}\mathbf{A}^{-1}.
</math>

If '''B''' = '''I'''<sub>''q''</sub> is the identity matrix and ''q'' = 1, then '''U''' is a column vector, written '''u''', and '''V''' is a row vector, written '''v'''<sup>T</sup>. The theorem then gives the [[Sherman-Morrison formula]]:

:<math>
\left(\mathbf{A}+\mathbf{uv}^\mathrm{T}\right)^{-1} = \mathbf{A}^{-1} - \frac{\mathbf{A}^{-1}\mathbf{uv}^\mathrm{T}\mathbf{A}^{-1}}{1+\mathbf{v}^\mathrm{T}\mathbf{A}^{-1}\mathbf{u}}.
</math>

This is useful if one already knows '''A'''<sup>−1</sup> and needs to invert matrices of the form '''A''' + '''uv'''<sup>T</sup> quickly.
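
For illustration, the following is a minimal NumPy sketch of such a rank-one update; the function name <code>rank_one_update_inverse</code> and the test matrices are ad hoc choices, not part of any standard library.

<syntaxhighlight lang="python">
import numpy as np

def rank_one_update_inverse(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1}, using the rank-one formula above.

    Assumes the scalar 1 + v^T A^{-1} u is nonzero; otherwise A + u v^T is singular.
    """
    A_inv_u = A_inv @ u                  # A^{-1} u
    vT_A_inv = v @ A_inv                 # v^T A^{-1}
    denom = 1.0 + v @ A_inv_u            # 1 + v^T A^{-1} u
    return A_inv - np.outer(A_inv_u, vT_A_inv) / denom

# Usage: compare the update against re-inverting from scratch.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)
u, v = rng.standard_normal(4), rng.standard_normal(4)
A_inv = np.linalg.inv(A)
print(np.allclose(rank_one_update_inverse(A_inv, u, v),
                  np.linalg.inv(A + np.outer(u, v))))   # expected: True
</syntaxhighlight>

The update costs only a few matrix–vector products and one outer product (O(''n''<sup>2</sup>) work) instead of a full O(''n''<sup>3</sup>) re-inversion.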

If we set '''A''' = '''I'''<sub>''p''</sub> and '''B''' = '''I'''<sub>''q''</sub>, we get

:<math>\left(\mathbf{I}_p + \mathbf{UV}\right)^{-1} = \mathbf{I}_p - \mathbf{U}\left(\mathbf{I}_q + \mathbf{VU}\right)^{-1}\mathbf{V}.</math>

In particular, if ''q'' = 1, then

:<math>\left(\mathbf{I}+\mathbf{uv}^\mathrm{T}\right)^{-1} = \mathbf{I} - \frac{\mathbf{uv}^\mathrm{T}}{1+\mathbf{v}^\mathrm{T}\mathbf{u}}.</math>

==See also==
*[[Woodbury matrix identity]]
*[[Sherman-Morrison formula]]
*[[Invertible matrix]]
*[[Matrix determinant lemma]]
*For certain cases where ''A'' is singular, and for the [[Moore-Penrose pseudoinverse]], see Kurt S. Riedel, "A Sherman–Morrison–Woodbury Identity for Rank Augmenting Matrices with Application to Centering", ''SIAM Journal on Matrix Analysis and Applications'', 13 (1992), 659–662, {{doi|10.1137/0613040}}, [http://math.nyu.edu/mfdd/riedel/ranksiam.ps preprint], {{MR|1152773}}
*[[Moore-Penrose pseudoinverse#Updating the pseudoinverse]]

==References==
<references/>

[[Category:Linear algebra]]
[[Category:Matrix theory]]
[[Category:Theorems in algebra]]