{{Refimprove|date=December 2009}}
In [[mathematics]], a '''block matrix''' or a '''partitioned matrix''' is a [[matrix (mathematics)|matrix]] which is ''interpreted'' as having been broken into sections called '''blocks''' or '''submatrices'''.<ref>{{cite book |last=Eves |first=Howard |authorlink=Howard Eves |title=Elementary Matrix Theory |year=1980 |publisher=Dover |location=New York |isbn=0-486-63946-0 |page=37 |url=http://books.google.com/books?id=ayVxeUNbZRAC&lpg=PA40&dq=block%20multiplication&pg=PA37#v=onepage&q&f=false |edition=reprint |accessdate=24 April 2013 |quote=We shall find that it is sometimes convenient to subdivide a matrix into rectangular blocks of elements. This leads us to consider so-called ''partitioned'', or ''block'', ''matrices''.}}</ref> Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines that break it up, or [[Partition of a set|partition]] it, into a collection of smaller matrices.<ref>{{cite book |last=Anton |first=Howard |title=Elementary Linear Algebra |year=1994 |publisher=John Wiley |location=New York |isbn=0-471-58742-7 |page=30 |edition=7th |quote=A matrix can be subdivided or '''''partitioned''''' into smaller matrices by inserting horizontal and vertical rules between selected rows and columns.}}</ref> Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned.
 
This notion can be made more precise for an <math>n</math> by <math>m</math> matrix <math>M</math> by partitioning <math>n</math> into a collection <math>rowgroups</math>, and then partitioning <math>m</math> into a collection <math>colgroups</math>. The original matrix is then considered as the "total" of these groups, in the sense that the <math>(i,j)</math> entry of the original matrix corresponds in a [[Bijection|1-to-1 and onto]] way to some <math>(s,t)</math> [[Offset (computer science)|offset]] entry of some <math>(x,y)</math>, where <math>x \in rowgroups</math> and <math>y \in colgroups</math>.
 
==Example==
[[File:BlockMatrix168square.png|thumb|A 168×168 element block matrix with 12×12, 12×24, and 24×24 sub-matrices. Non-zero elements are in blue, zero elements are grayed.]]
 
The matrix
 
:<math>\mathbf{P} = \begin{bmatrix}
1 & 1 & 2 & 2\\
1 & 1 & 2 & 2\\
3 & 3 & 4 & 4\\
3 & 3 & 4 & 4\end{bmatrix}</math>
 
can be partitioned into four 2×2 blocks
 
:<math>\mathbf{P}_{11} = \begin{bmatrix}
1 & 1 \\
1 & 1 \end{bmatrix},  \mathbf{P}_{12} = \begin{bmatrix}
2 & 2\\
2 & 2\end{bmatrix},  \mathbf{P}_{21} = \begin{bmatrix}
3 & 3 \\
3 & 3 \end{bmatrix},  \mathbf{P}_{22} = \begin{bmatrix}
4 & 4\\
4 & 4\end{bmatrix}.</math>
 
The partitioned matrix can then be written as
 
:<math>\mathbf{P} = \begin{bmatrix}
\mathbf{P}_{11} & \mathbf{P}_{12}\\
\mathbf{P}_{21} & \mathbf{P}_{22}\end{bmatrix}.</math>
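
Numerically, the same partition can be assembled back into '''P''' from its four blocks. The following is a minimal sketch using NumPy's <code>numpy.block</code>, which builds a matrix from a nested list of blocks.

<syntaxhighlight lang="python">
import numpy as np

# The four 2×2 blocks of P from the example above
P11 = np.array([[1, 1], [1, 1]])
P12 = np.array([[2, 2], [2, 2]])
P21 = np.array([[3, 3], [3, 3]])
P22 = np.array([[4, 4], [4, 4]])

# Reassemble the partitioned matrix from its blocks
P = np.block([[P11, P12],
              [P21, P22]])
print(P)
</syntaxhighlight>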
 
==Block matrix multiplication==
A block partitioned matrix product can sometimes be used that involves only algebra on submatrices of the factors. The partitioning of the factors is not arbitrary, however, and requires "conformable partitions"<ref>{{cite book |last=Eves |first=Howard |authorlink=Howard Eves |title=Elementary Matrix Theory |year=1980 |publisher=Dover |location=New York |isbn=0-486-63946-0 |page=37 |url=http://books.google.com/books?id=ayVxeUNbZRAC&lpg=PA40&dq=block%20multiplication&pg=PA39#v=onepage&q&f=false |edition=reprint |accessdate=24 April 2013 |quote=A partitioning as in Theorem 1.9.4 is called a ''conformable partition'' of ''A'' and ''B''.}}</ref> between two matrices <math>A</math> and <math>B</math> such that all submatrix products that will be used are defined.<ref>{{cite book |last=Anton |first=Howard |title=Elementary Linear Algebra |year=1994 |publisher=John Wiley |location=New York |isbn=0-471-58742-7 |page=36 |edition=7th |quote=...provided the sizes of the submatrices of A and B are such that the indicated operations can be performed.}}</ref> Given an <math>(m \times p)</math> matrix <math>\mathbf{A}</math> with <math>q</math> row partitions and <math>s</math> column partitions
 
:<math>
\mathbf{A} = \begin{bmatrix}
\mathbf{A}_{11} & \mathbf{A}_{12} & \cdots &\mathbf{A}_{1s}\\
\mathbf{A}_{21} & \mathbf{A}_{22} & \cdots &\mathbf{A}_{2s}\\
\vdots          & \vdots          & \ddots &\vdots \\
\mathbf{A}_{q1} & \mathbf{A}_{q2} & \cdots &\mathbf{A}_{qs}\end{bmatrix}</math>
 
and a <math>(p\times n)</math> matrix <math>\mathbf{B}</math> with <math>s</math> row partitions and <math>r</math> column partitions
 
:<math>
\mathbf{B} = \begin{bmatrix}
\mathbf{B}_{11} & \mathbf{B}_{12} & \cdots &\mathbf{B}_{1r}\\
\mathbf{B}_{21} & \mathbf{B}_{22} & \cdots &\mathbf{B}_{2r}\\
\vdots          & \vdots          & \ddots &\vdots \\
\mathbf{B}_{s1} & \mathbf{B}_{s2} & \cdots &\mathbf{B}_{sr}\end{bmatrix},</math>
that are compatible with the partitions of <math>A</math>, the matrix product
 
:<math>
\mathbf{C}=\mathbf{A}\mathbf{B}
</math>
 
can be formed blockwise, yielding <math>\mathbf{C}</math> as an <math>(m\times n)</math> matrix with <math>q</math> row partitions and <math>r</math> column partitions. The blocks of the product <math>\mathbf{C}</math> are calculated by multiplying and summing the corresponding blocks of <math>\mathbf{A}</math> and <math>\mathbf{B}</math>:
 
:<math>
\mathbf{C}_{\alpha \beta} = \sum^s_{\gamma=1}\mathbf{A}_{\alpha \gamma}\mathbf{B}_{\gamma \beta}.
</math>
 
Or, using the [[Einstein notation]] that implicitly sums over repeated indices:
 
:<math>
\mathbf{C}_{\alpha \beta} = \mathbf{A}_{\alpha \gamma}\mathbf{B}_{\gamma \beta}.  
</math>
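
As an illustration, the following sketch forms the product blockwise via the sum above and checks it against the ordinary matrix product. The block sizes and the <code>split</code> helper are arbitrary choices made for this example, not part of any standard interface.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Conformable partitions: row blocks of A, shared inner blocks, column blocks of B
row_sizes = [2, 3]        # q = 2 row partitions of A (m = 5)
inner_sizes = [1, 2, 2]   # s = 3 shared partitions (p = 5)
col_sizes = [3, 1]        # r = 2 column partitions of B (n = 4)

A = rng.integers(0, 5, (sum(row_sizes), sum(inner_sizes)))
B = rng.integers(0, 5, (sum(inner_sizes), sum(col_sizes)))

def split(M, rows, cols):
    """Cut M into a nested list of blocks with the given row/column sizes."""
    row_edges = np.cumsum([0] + rows)
    col_edges = np.cumsum([0] + cols)
    return [[M[row_edges[i]:row_edges[i + 1], col_edges[j]:col_edges[j + 1]]
             for j in range(len(cols))] for i in range(len(rows))]

Ab = split(A, row_sizes, inner_sizes)
Bb = split(B, inner_sizes, col_sizes)

# C_{alpha beta} = sum over gamma of A_{alpha gamma} B_{gamma beta}
Cb = [[sum(Ab[a][g] @ Bb[g][b] for g in range(len(inner_sizes)))
       for b in range(len(col_sizes))] for a in range(len(row_sizes))]

assert np.array_equal(np.block(Cb), A @ B)
</syntaxhighlight>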
 
==Block diagonal matrices {{anchor|Block diagonal matrix}} ==
A '''block diagonal matrix''' is a block matrix that is a [[square matrix]] whose [[main diagonal]] blocks are square matrices and whose off-diagonal blocks are zero matrices.  A block diagonal matrix '''A''' has the form
 
:<math>
\mathbf{A} = \begin{bmatrix}
\mathbf{A}_{1} & 0 & \cdots & 0 \\ 0 & \mathbf{A}_{2} & \cdots &  0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \mathbf{A}_{n}
\end{bmatrix}
</math>
 
where '''A'''<sub>''k''</sub> is a square matrix; in other words, it is the [[Direct sum of matrices|direct sum]] of '''A'''<sub>1</sub>, …, '''A'''<sub>''n''</sub>. It can also be indicated as '''A'''<sub>1</sub>&nbsp;<math>\oplus</math>&nbsp;'''A'''<sub>2</sub>&nbsp;<math>\oplus\,\ldots\,\oplus</math>&nbsp;'''A'''<sub>''n''</sub> or diag('''A'''<sub>1</sub>, '''A'''<sub>2</sub>, <math>\ldots</math>, '''A'''<sub>''n''</sub>) (the latter being the same formalism used for a [[diagonal matrix]]).
Any square matrix can trivially be considered a block diagonal matrix with only one block.
 
For the [[determinant]] and [[trace (linear algebra)|trace]], the following properties hold
:<math> \operatorname{det} \mathbf{A} = \operatorname{det} \mathbf{A}_1 \times \ldots \times \operatorname{det} \mathbf{A}_n</math>,
:<math> \operatorname{tr} \mathbf{A} = \operatorname{tr} \mathbf{A}_1 +\cdots +\operatorname{tr} \mathbf{A}_n.</math>
 
The inverse of a block diagonal matrix is another block diagonal matrix, composed of the inverse of each block, as follows:
:<math>\begin{pmatrix}
\mathbf{A}_{1} & 0 & \cdots & 0 \\
0 & \mathbf{A}_{2} & \cdots &  0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \mathbf{A}_{n}
\end{pmatrix}^{-1} = \begin{pmatrix} \mathbf{A}_{1}^{-1} & 0 & \cdots & 0 \\
0 & \mathbf{A}_{2}^{-1} & \cdots &  0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \mathbf{A}_{n}^{-1}
\end{pmatrix}.
</math>
 
The eigenvalues and eigenvectors of <math>\mathbf{A}</math> are simply those of the blocks <math>\mathbf{A}_{1}, \mathbf{A}_{2}, \ldots, \mathbf{A}_{n}</math> combined.
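
These properties are easy to verify numerically. The following is a minimal sketch using NumPy and SciPy, where <code>scipy.linalg.block_diag</code> assembles a block diagonal matrix from its diagonal blocks; the particular blocks chosen here are arbitrary.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import block_diag

A1 = np.array([[2.0, 1.0],
               [0.0, 3.0]])
A2 = np.array([[4.0]])
A = block_diag(A1, A2)          # the block diagonal matrix diag(A1, A2)

# The determinant factors over the blocks, the trace sums over them
assert np.isclose(np.linalg.det(A), np.linalg.det(A1) * np.linalg.det(A2))
assert np.isclose(np.trace(A), np.trace(A1) + np.trace(A2))

# The inverse is block diagonal with the inverses of the blocks
assert np.allclose(np.linalg.inv(A),
                   block_diag(np.linalg.inv(A1), np.linalg.inv(A2)))

# The eigenvalues are those of the blocks combined
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.concatenate([np.linalg.eigvals(A1),
                                           np.linalg.eigvals(A2)])))
</syntaxhighlight>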
 
==Block tridiagonal matrices==
A '''block tridiagonal matrix''' is another special block matrix, which, like the block diagonal matrix, is a [[square matrix]], having square matrices (blocks) on the lower diagonal, [[main diagonal]] and upper diagonal, with all other blocks being zero matrices.
It is essentially a [[tridiagonal matrix]] but has submatrices in place of scalars. A block tridiagonal matrix '''A''' has the form
 
:<math>
\mathbf{A} = \begin{bmatrix}
\mathbf{B}_{1}  & \mathbf{C}_{1}  &        &        & \cdots  &        & 0 \\
\mathbf{A}_{2}  & \mathbf{B}_{2}  & \mathbf{C}_{2}  &        &        &        & \\
      & \ddots & \ddots  & \ddots  &        &        & \vdots \\
      &        & \mathbf{A}_{k}  & \mathbf{B}_{k}  & \mathbf{C}_{k}  &        & \\
\vdots &        &        & \ddots  & \ddots  & \ddots  & \\
      &        &        &        & \mathbf{A}_{n-1} & \mathbf{B}_{n-1} & \mathbf{C}_{n-1}  \\
0      &        & \cdots  &        &        & \mathbf{A}_{n}  & \mathbf{B}_{n}
\end{bmatrix}
</math>
 
where '''A'''<sub>''k''</sub>, '''B'''<sub>''k''</sub> and '''C'''<sub>''k''</sub> are square sub-matrices of the lower, main and upper diagonal respectively.
 
Block tridiagonal matrices are often encountered in numerical solutions of engineering problems (e.g., [[computational fluid dynamics]]). Optimized numerical methods for [[LU factorization]] are available, and hence there are efficient solution algorithms for equation systems whose coefficient matrix is block tridiagonal. The [[Thomas algorithm]], used for the efficient solution of equation systems involving a [[tridiagonal matrix]], can also be applied using matrix operations to block tridiagonal matrices (see also [[Block LU decomposition]]), as sketched below.
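
In the block version of the Thomas algorithm, every scalar division of the tridiagonal case is replaced by the solution of a small linear system. The following is a minimal sketch; the function name, the block layout of the inputs, and the use of an explicit inverse in the elimination step are illustrative choices, not taken from any particular library.

<syntaxhighlight lang="python">
import numpy as np

def block_thomas(A, B, C, d):
    """Solve a block tridiagonal system by blockwise forward elimination
    and back substitution.

    A[i], B[i], C[i] are the k-by-k sub-diagonal, diagonal and
    super-diagonal blocks of block row i (A[0] and C[-1] are unused),
    and d[i] is the length-k right-hand side of block row i.
    """
    n = len(B)
    Bp = [B[0].copy()]
    dp = [d[0].copy()]
    # Forward elimination: remove the sub-diagonal blocks
    for i in range(1, n):
        m = A[i] @ np.linalg.inv(Bp[i - 1])
        Bp.append(B[i] - m @ C[i - 1])
        dp.append(d[i] - m @ dp[i - 1])
    # Back substitution, one block at a time
    x = [None] * n
    x[n - 1] = np.linalg.solve(Bp[n - 1], dp[n - 1])
    for i in range(n - 2, -1, -1):
        x[i] = np.linalg.solve(Bp[i], dp[i] - C[i] @ x[i + 1])
    return np.concatenate(x)
</syntaxhighlight>

This is only a sketch of the idea; a production solver would avoid forming explicit inverses and would address conditioning and pivoting across block boundaries.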
 
==Block Toeplitz matrices==
A '''block Toeplitz matrix''' is another special block matrix, which contains blocks that are repeated down the diagonals of the matrix, just as a [[Toeplitz matrix]] has elements repeated down its diagonals. Equivalently, the block in block position (''i'',&nbsp;''j'') depends only on the difference ''i''&nbsp;−&nbsp;''j''; the individual blocks need not themselves be Toeplitz matrices.
 
A block Toeplitz matrix '''A''' has the form
 
:<math>
\mathbf{A} = \begin{bmatrix}
\mathbf{A}_{(1,1)}  & \mathbf{A}_{(1,2)}  &        &        & \cdots  &    \mathbf{A}_{(1,n-1)}    & \mathbf{A}_{(1,n)} \\
\mathbf{A}_{(2,1)}  & \mathbf{A}_{(1,1)}  & \mathbf{A}_{(1,2)}  &        &        &        & \mathbf{A}_{(1,n-1)} \\
      & \ddots & \ddots  & \ddots  &        &        & \vdots \\
      &        & \mathbf{A}_{(2,1)}  & \mathbf{A}_{(1,1)}  & \mathbf{A}_{(1,2)}  &        & \\
\vdots &        &        & \ddots  & \ddots  & \ddots  & \\
\mathbf{A}_{(n-1,1)}      &        &        &        & \mathbf{A}_{(2,1)} & \mathbf{A}_{(1,1)} & \mathbf{A}_{(1,2)}  \\
\mathbf{A}_{(n,1)}      & \mathbf{A}_{(n-1,1)}      & \cdots  &        &        & \mathbf{A}_{(2,1)}  & \mathbf{A}_{(1,1)}
\end{bmatrix}.
</math>
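
For illustration, a block Toeplitz matrix can be assembled from a single sequence of blocks indexed by the difference of the block row and column indices. A minimal NumPy sketch (the helper name and the example blocks are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def block_toeplitz(blocks, n):
    """Build an n-by-n block Toeplitz matrix.

    blocks maps the block-index difference i - j (in -(n-1)..n-1)
    to a k-by-k block; missing differences default to the zero block.
    """
    k = next(iter(blocks.values())).shape[0]
    zero = np.zeros((k, k))
    return np.block([[blocks.get(i - j, zero) for j in range(n)]
                     for i in range(n)])

# Blocks repeat down the block diagonals, as in the display above
T = block_toeplitz({0: np.eye(2), 1: 2 * np.eye(2), -1: 3 * np.eye(2)}, n=4)
print(T)
</syntaxhighlight>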
 
==Direct sum==
For arbitrary matrices '''A''' (of size ''m''&nbsp;×&nbsp;''n'') and '''B''' (of size ''p''&nbsp;×&nbsp;''q''), the '''direct sum''' of '''A''' and '''B''', denoted by '''A'''&nbsp;<math>\oplus</math>&nbsp;'''B''', is defined as
:<math>
  \mathbf{A} \oplus \mathbf{B} =
  \begin{bmatrix}
    a_{11} & \cdots & a_{1n} &      0 & \cdots &      0 \\
    \vdots & \cdots & \vdots & \vdots & \cdots & \vdots \\
    a_{m 1} & \cdots & a_{mn} &      0 & \cdots &      0 \\
          0 & \cdots &      0 & b_{11} & \cdots &  b_{1q} \\
    \vdots & \cdots & \vdots & \vdots & \cdots & \vdots \\
          0 & \cdots &      0 & b_{p1} & \cdots &  b_{pq}
  \end{bmatrix}.
</math>
 
For instance,
 
:<math>
  \begin{bmatrix}
    1 & 3 & 2 \\
    2 & 3 & 1
  \end{bmatrix}
\oplus
  \begin{bmatrix}
    1 & 6 \\
    0 & 1
  \end{bmatrix}
=
  \begin{bmatrix}
    1 & 3 & 2 & 0 & 0 \\
    2 & 3 & 1 & 0 & 0 \\
    0 & 0 & 0 & 1 & 6 \\
    0 & 0 & 0 & 0 & 1
  \end{bmatrix}.
</math>
 
This operation generalizes naturally to arbitrary dimensioned arrays (provided that '''A''' and '''B''' have the same number of dimensions).
 
Note that any element in the [[direct sum of vector spaces|direct sum]] of two [[vector space]]s of matrices can be represented as a direct sum of two matrices.
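
Numerically, the direct sum of the example above can be formed by placing the two matrices on the diagonal and padding with zero blocks; a minimal NumPy sketch:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1, 3, 2],
              [2, 3, 1]])
B = np.array([[1, 6],
              [0, 1]])

# Direct sum: A and B on the diagonal, zero blocks elsewhere
direct_sum = np.block([
    [A, np.zeros((A.shape[0], B.shape[1]), dtype=A.dtype)],
    [np.zeros((B.shape[0], A.shape[1]), dtype=A.dtype), B],
])
print(direct_sum)
</syntaxhighlight>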
 
==Direct product==
{{main|Kronecker product}}
 
==Application==
In [[linear algebra]] terms, the use of a block matrix corresponds to having a [[linear mapping]] thought of in terms of corresponding 'bunches' of [[basis vector]]s. That again matches the idea of having distinguished direct sum decompositions of the [[domain (mathematics)|domain]] and [[range (mathematics)|range]]. It is particularly significant if a block is the zero matrix; that carries the information that a summand maps into a sub-sum.
 
Given the interpretation ''via'' linear mappings and direct sums, there is a special type of block matrix that occurs for square matrices (the case ''m'' = ''n''). For those we can assume an interpretation as an [[endomorphism]] of an ''n''-dimensional space ''V''; the block structure in which the bunching of rows and columns is the same is of importance because it corresponds to having a single direct sum decomposition on ''V'' (rather than two). In that case, for example, the [[diagonal]] blocks in the obvious sense are all square. This type of structure is required to describe the [[Jordan normal form]].
 
Block matrix techniques are used to reduce the work in matrix calculations, in column-row expansions, and in many [[computer science]] applications, including [[VLSI]] chip design. Examples are the [[Strassen algorithm]] for fast [[matrix multiplication]] and the [[Hamming(7,4)]] code for error detection and correction in data transmission.
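
As an illustration of the blockwise viewpoint, the Strassen algorithm multiplies two square matrices using seven (rather than eight) recursive products of their 2×2 block partition. The following is a minimal sketch restricted to matrices whose dimension is a power of two; a practical implementation would pad or switch to ordinary multiplication below some cutoff size.

<syntaxhighlight lang="python">
import numpy as np

def strassen(A, B):
    """Multiply two n-by-n matrices (n a power of two) with Strassen's
    seven block products."""
    n = A.shape[0]
    if n == 1:
        return A * B
    k = n // 2
    # 2x2 block partition of each factor
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]
    # Seven recursive block products
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # Reassemble the blocks of the product
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])

# Check against the ordinary product on an 8x8 integer example
X = np.random.default_rng(1).integers(0, 10, (8, 8))
Y = np.random.default_rng(2).integers(0, 10, (8, 8))
assert np.array_equal(strassen(X, Y), X @ Y)
</syntaxhighlight>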
 
==References==
{{Reflist}}
*{{Cite web |last=Strang |first=Gilbert |authorlink=Gilbert Strang |url=http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/lecture-3-multiplication-and-inverse-matrices |title=Lecture 3: Multiplication and inverse matrices |publisher=MIT OpenCourseWare |at=18:30–21:10 |date=1999}}
 
{{Linear algebra}}
 
{{DEFAULTSORT:Block Matrix}}
[[Category:Matrices]]
[[Category:Sparse matrices]]
