{{for|a result in enumerative combinatorics|MacMahon Master theorem}}
In the [[analysis of algorithms]], the '''master theorem''' provides a cookbook solution in [[asymptotic]] terms (using [[Big O notation]]) for [[recurrence relation]]s of types that occur in the [[Analysis of algorithms|analysis]] of many [[divide and conquer algorithm]]s. It was popularized by the canonical algorithms textbook ''[[Introduction to Algorithms]]'' by [[Thomas H. Cormen|Cormen]], [[Charles E. Leiserson|Leiserson]], [[Ron Rivest|Rivest]], and [[Clifford Stein|Stein]], in which it is both introduced and proved. Not all recurrence relations can be solved with the use of the master theorem; its generalizations include the [[Akra–Bazzi method]].
== Introduction ==
Consider a problem that can be solved using a [[recursive algorithm]] such as the following:

<code>
'''procedure''' T( n : size of problem ) '''defined as:'''
   '''if''' n < 1 '''then exit'''

   Do work of amount f(n)

   T(n/b)
   T(n/b)
   ...repeat for a total of ''a'' times...
   T(n/b)
'''end procedure'''
</code>
In the above algorithm we are dividing the problem into a number of subproblems recursively, each subproblem being of size ''n''/''b''. This can be visualized as building a call tree with each node of the tree as an instance of one recursive call and its child nodes being instances of subsequent calls. In the above example, each node would have ''a'' child nodes. Each node does an amount of work that corresponds to the size of the subproblem ''n'' passed to that instance of the recursive call, given by <math>f(n)</math>. The total amount of work done by the entire tree is the sum of the work performed by all the nodes in the tree.

Algorithms such as the above can be represented as the recurrence relation <math>T(n) = a \; T\left(\frac{n}{b}\right) + f(n)</math>. This recurrence can be successively substituted into itself and expanded to obtain an expression for the total amount of work done.<ref>
Duke University,
"Big-Oh for Recursive Functions: Recurrence Relations",
http://www.cs.duke.edu/~ola/ap/recurrence.html
</ref>
The master theorem allows us to calculate the running time of such a recursive algorithm in [[Big O notation|Θ-notation]] without expanding the recurrence.
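The expansion described above can also be carried out numerically. The following is a minimal sketch (not part of the original article) that evaluates the recurrence directly, assuming a base case of 1 unit of work for problems smaller than ''b'':

```python
# Expand T(n) = a*T(n/b) + f(n) directly to see the total work done
# by the call tree, with T(n) = 1 for problems smaller than b.
def recurrence_total(n, a, b, f):
    if n < b:
        return 1
    # 'a' recursive calls on subproblems of size n/b, plus local work f(n)
    return a * recurrence_total(n // b, a, b, f) + f(n)

# Merge-sort-like parameters: a = 2, b = 2, f(n) = n
print(recurrence_total(1024, 2, 2, lambda n: n))  # 11264 = n*log2(n) + n
```

For these parameters the total matches the closed form ''n'' log<sub>2</sub> ''n'' + ''n'', consistent with case 2 below.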
== Generic form ==
The master theorem concerns recurrence relations of the form:

:<math>T(n) = a \; T\!\left(\frac{n}{b}\right) + f(n) \;\;\;\; \mbox{where} \;\; a \geq 1 \mbox{, } b > 1</math>

In the application to the analysis of a recursive algorithm, the constants and function take on the following significance:
*''n'' is the size of the problem.
*''a'' is the number of subproblems in the recursion.
*''n''/''b'' is the size of each subproblem. (Here it is assumed that all subproblems are essentially the same size.)
*''f''(''n'') is the cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions to the subproblems.
An asymptotically tight bound can be determined in the following three cases:
=== Case 1 ===
==== Generic form ====
If <math>f(n) = O\left( n^{c} \right)</math> where <math>c < \log_b a</math> (using [[Big O notation]])
then:
:<math>T(n) = \Theta\left( n^{\log_b a} \right)</math>

==== Example ====
:<math>T(n) = 8 T\left(\frac{n}{2}\right) + 1000n^2</math>
As one can see from the formula above:
:<math>a = 8, \, b = 2, \, f(n) = 1000n^2</math>, so
:<math>f(n) = O\left(n^c\right)</math>, where <math>c = 2</math>
Next, we check whether the case 1 condition is satisfied:
:<math>\log_b a = \log_2 8 = 3 > c</math>.
It follows from the first case of the master theorem that
:<math>T(n) = \Theta\left( n^{\log_b a} \right) = \Theta\left( n^{3} \right)</math>
(indeed, the exact solution of the recurrence relation is <math>T(n) = 1001 n^3 - 1000 n^2</math>, assuming <math>T(1) = 1</math>).
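The stated exact solution can be checked numerically; a quick sketch (not part of the article), restricted to powers of two and assuming <math>T(1) = 1</math>:

```python
# Verify T(n) = 8*T(n/2) + 1000*n^2 against the closed form
# 1001*n^3 - 1000*n^2 for powers of two, with T(1) = 1.
def T(n):
    if n == 1:
        return 1
    return 8 * T(n // 2) + 1000 * n * n

for k in range(11):
    n = 2 ** k
    assert T(n) == 1001 * n ** 3 - 1000 * n ** 2
print("closed form matches up to n = 1024")
```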
=== Case 2 ===
==== Generic form ====
If it is true, for some constant ''k'' ≥ 0, that:
:<math>f(n) = \Theta\left( n^{c} \log^{k} n \right)</math> where <math>c = \log_b a</math>
then:
:<math>T(n) = \Theta\left( n^{c} \log^{k+1} n \right)</math>

==== Example ====
:<math>T(n) = 2 T\left(\frac{n}{2}\right) + 10n</math>
As we can see in the formula above, the variables get the following values:
:<math>a = 2, \, b = 2, \, c = 1, \, f(n) = 10n</math>
:<math>f(n) = \Theta\left(n^{c} \log^{k} n\right)</math> where <math>c = 1, k = 0</math>
Next, we check whether the case 2 condition is satisfied:
:<math>\log_b a = \log_2 2 = 1</math>, and therefore, yes, <math>c = \log_b a</math>.
So it follows from the second case of the master theorem:
:<math>T(n) = \Theta\left( n^{\log_b a} \log^{k+1} n\right) = \Theta\left( n^{1} \log^{1} n\right) = \Theta\left(n \log n\right)</math>
Thus the given recurrence relation ''T''(''n'') is in Θ(''n'' log ''n'').
(This result is confirmed by the exact solution of the recurrence relation, which is <math>T(n) = n + 10 n\log_2 n</math>, assuming <math>T(1) = 1</math>.)
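As with case 1, the exact solution can be confirmed numerically; a short sketch (not part of the article), again for powers of two with <math>T(1) = 1</math>:

```python
# Verify T(n) = 2*T(n/2) + 10*n against the closed form
# n + 10*n*log2(n) for powers of two, with T(1) = 1.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + 10 * n

for k in range(11):
    n = 2 ** k
    assert T(n) == n + 10 * n * k  # here k = log2(n)
print("closed form matches up to n = 1024")
```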
=== Case 3 ===
==== Generic form ====
If it is true that:
:<math>f(n) = \Theta\left( n^{c} \right)</math> where <math>c > \log_b a</math>
then:
:<math>T\left(n \right) = \Theta\left(f(n) \right)</math>
In the more general form of case 3, where only <math>f(n) = \Omega\left( n^{c} \right)</math> is assumed, ''f'' must additionally satisfy the ''regularity condition'' <math>a f\!\left(\frac{n}{b}\right) \leq k f(n)</math> for some constant <math>k < 1</math> and all sufficiently large <math>n</math>.

==== Example ====
:<math>T(n) = 2 T\left(\frac{n}{2}\right) + n^2</math>
As we can see in the formula above, the variables get the following values:
:<math>a = 2, \, b = 2, \, f(n) = n^2</math>
:<math>f(n) = \Theta\left(n^c\right)</math>, where <math>c = 2</math>
Next, we check whether the case 3 condition is satisfied:
:<math>\log_b a = \log_2 2 = 1</math>, and therefore, yes, <math>c > \log_b a</math>.
So it follows from the third case of the master theorem:
:<math>T \left(n \right) = \Theta\left(f(n)\right) = \Theta \left(n^2 \right).</math>
Thus the given recurrence relation ''T''(''n'') is in Θ(''n''<sup>2</sup>), which matches the ''f''(''n'') of the original formula.
(This result is confirmed by the exact solution of the recurrence relation, which is <math>T(n) = 2 n^2 - n</math>, assuming <math>T(1) = 1</math>.)
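This exact solution can be checked in the same way as the previous two; a sketch (not part of the article), for powers of two with <math>T(1) = 1</math>:

```python
# Verify T(n) = 2*T(n/2) + n^2 against the closed form 2*n^2 - n
# for powers of two, with T(1) = 1.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + n * n

for k in range(11):
    n = 2 ** k
    assert T(n) == 2 * n * n - n
print("closed form matches up to n = 1024")
```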
== Inadmissible equations ==
The following equations cannot be solved using the master theorem:<ref>
Massachusetts Institute of Technology (MIT),
"Master Theorem: Practice Problems and Solutions",
http://www.csail.mit.edu/~thies/6.046-web/master.pdf
</ref>
*<math>T(n) = 2^nT\left (\frac{n}{2}\right )+n^n</math>
*:''a'' is not a constant; the number of subproblems must be fixed
*<math>T(n) = 2T\left (\frac{n}{2}\right )+\frac{n}{\log n}</math>
*:the difference between <math>f(n)</math> and <math>n^{\log_b a}</math> is not polynomial (see below)
*<math>T(n) = 0.5T\left (\frac{n}{2}\right )+n</math>
*:''a'' < 1: there cannot be fewer than one subproblem
*<math>T(n) = 64T\left (\frac{n}{8}\right )-n^2\log n</math>
*:''f''(''n''), the combination time, is not positive
*<math>T(n) = T\left (\frac{n}{2}\right )+n(2-\cos n)</math>
*:case 3 applies, but the regularity condition is violated
In the second inadmissible example above, the gap between <math>f(n)</math> and <math>n^{\log_b a}</math> can be expressed with the ratio <math>\frac{f(n)}{n^{\log_b a}} = \frac{n/\log n}{n^{\log_2 2}} = \frac{n}{n \log n} = \frac{1}{\log n}</math>. For case 1 to apply, this ratio would have to be bounded above by <math>n^{-\epsilon}</math> for some constant <math>\epsilon > 0</math>, but <math>\frac{1}{\log n}</math> asymptotically exceeds <math>n^{-\epsilon}</math> for every such <math>\epsilon</math>. The gap is therefore smaller than any polynomial, and the master theorem does not apply.
== See also ==
* [[Akra–Bazzi method]]
== Application to common algorithms ==
{| class="wikitable"
|-
! Algorithm
! Recurrence relationship
! Run time
! Comment
|-
| [[Binary search]]
| <math>T(n) = T\left(\frac{n}{2}\right) + O(1)</math>
| <math>O(\log n)</math>
| Apply master theorem case <math>c = \log_b a</math>, where <math>a = 1, b = 2, c = 0, k = 0</math><ref name="dartmouth">
Dartmouth College,
http://www.math.dartmouth.edu/archive/m19w03/public_html/Section5-2.pdf
</ref>
|-
| Binary tree traversal
| <math>T(n) = 2 T\left(\frac{n}{2}\right) + O(1)</math>
| <math>O(n)</math>
| Apply master theorem case <math>c < \log_b a</math>, where <math>a = 2, b = 2, c = 0</math><ref name="dartmouth" />
|-
| Optimal sorted matrix search
| <math>T(n) = 2 T\left(\frac{n}{2}\right) + O(\log n)</math>
| <math>O(n)</math>
| Apply the [[Akra–Bazzi method]] for <math>p=1</math> and <math>g(u)=\log u</math> to get <math>\Theta(2n - \log n)</math>
|-
| [[Merge sort]]
| <math>T(n) = 2 T\left(\frac{n}{2}\right) + O(n)</math>
| <math>O(n \log n)</math>
| Apply master theorem case <math>c = \log_b a</math>, where <math>a = 2, b = 2, c = 1, k = 0</math>
|}
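The master-theorem classifications in the table can also be reproduced mechanically. Below is a hypothetical helper (not part of the article) applying the simplified master theorem to recurrences of the form <math>T(n) = a \, T(n/b) + \Theta(n^c \log^k n)</math>; the function name and string output format are illustrative choices:

```python
import math

# Classify T(n) = a*T(n/b) + Θ(n^c * log^k n) using the simplified
# master theorem. Returns the Θ-bound as a string.
def master_theorem(a, b, c, k=0):
    crit = math.log(a, b)  # critical exponent log_b(a)
    if abs(c - crit) < 1e-9:                                 # case 2
        return f"Θ(n^{c:g} log^{k + 1} n)"
    if c < crit:                                             # case 1
        return f"Θ(n^{crit:g})"
    return f"Θ(n^{c:g} log^{k} n)" if k else f"Θ(n^{c:g})"   # case 3

print(master_theorem(1, 2, 0))  # binary search: Θ(n^0 log^1 n), i.e. Θ(log n)
print(master_theorem(2, 2, 0))  # binary tree traversal: Θ(n^1)
print(master_theorem(2, 2, 1))  # merge sort: Θ(n^1 log^1 n)
```

Note that this sketch does not check the regularity condition needed for the general form of case 3, so it only covers ''f''(''n'') of the polylogarithmic shape above.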
== Notes ==
{{reflist}}

== References ==
* [[Thomas H. Cormen]], [[Charles E. Leiserson]], [[Ronald L. Rivest]], and [[Clifford Stein]]. ''[[Introduction to Algorithms]]'', Second Edition. MIT Press and McGraw-Hill, 2001. ISBN 0-262-03293-7. Sections 4.3 (The master method) and 4.4 (Proof of the master theorem), pp. 73–90.
* [[Michael T. Goodrich]] and [[Roberto Tamassia]]. ''Algorithm Design: Foundation, Analysis, and Internet Examples''. Wiley, 2002. ISBN 0-471-38365-1. The master theorem (including the version of Case 2 included here, which is stronger than the one from CLRS) is on pp. 268–270.

{{DEFAULTSORT:Master Theorem}}
[[Category:Asymptotic analysis]]
[[Category:Theorems in computational complexity theory]]
[[Category:Recurrence relations]]
[[Category:Analysis of algorithms]]
Revision as of 13:21, 24 August 2013