Randomized algorithms as zero-sum games

[[Randomized algorithms]] are algorithms that employ a degree of randomness as part of their logic. They can give good [[average case complexity|average-case]] results (complexity-wise) for problems that are hard to solve deterministically or that have poor [[worst case]] complexity. An algorithmic [[game theory|game-theoretic]] approach helps explain why, in the average case, randomized algorithms may work better than deterministic algorithms.


==Formalizing the game==
Consider a [[zero-sum game]] between player A, whose [[strategy (game theory)|strategies]] are deterministic algorithms, and player B, whose strategies are inputs for A's algorithms. The cost of a strategy profile is the running time of A's chosen algorithm on B's chosen input. Therefore, player A tries to minimize the cost, and player B tries to maximize it. In the world of pure strategies, for every algorithm that A chooses, B may choose the most costly input – this is the worst-case scenario, and it can be found using standard [[complexity analysis]].
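As a toy illustration, the pure-strategy view can be made concrete with a small cost matrix (the numbers below are made up for the example, not measured from any real algorithm):

<syntaxhighlight lang="python">
# Toy zero-sum game: rows are player A's deterministic algorithms,
# columns are player B's inputs; entries are hypothetical running times.
COST = [
    [1, 3],  # algorithm 0 on input 0 and input 1
    [4, 2],  # algorithm 1 on input 0 and input 1
]

# B answers any fixed algorithm with its most costly input, so each
# algorithm is judged by the worst entry in its row.
worst_case = [max(row) for row in COST]   # [3, 4]

# The best A can guarantee with a pure strategy is the min-max cost.
print(min(worst_case))                    # 3
</syntaxhighlight>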
 
But in the real world, inputs are normally not selected by an 'evil opponent' – rather, they come from some distribution over inputs. Since this is the case, if we allow the algorithms to also be drawn from some distribution, we may look at the game as one that allows [[mixed strategy|mixed strategies]]. That is, each player chooses a distribution over its strategies.
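Continuing the toy cost matrix from above, the cost of a mixed-strategy profile is the running time averaged over both draws (the particular distributions below are arbitrary choices for illustration):

<syntaxhighlight lang="python">
COST = [[1, 3], [4, 2]]

R = [0.5, 0.5]    # player A's distribution over the two algorithms
D = [0.25, 0.75]  # player B's distribution over the two inputs

# Expected cost: weight each (algorithm, input) pair by the probability
# that this pair is drawn.
expected_cost = sum(R[a] * D[x] * COST[a][x]
                    for a in range(2) for x in range(2))
print(expected_cost)  # 2.5 for these particular distributions
</syntaxhighlight>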
 
==Analysis==
Incorporating mixed strategies into the game allows us to use [[John von Neumann|von Neumann's]] [[minimax]] theorem:
 
:<math> \min_R \max_D T(R,D) = \max_D \min_A T(A,D) \, </math>
 
where ''R'' is a distribution over the algorithms, ''D'' is a distribution over inputs, ''A'' is a single deterministic algorithm, ''T''(''A'',&nbsp;''D'') is the average running time of algorithm ''A'' on an input drawn from ''D'', and ''T''(''R'',&nbsp;''D'') additionally averages over the algorithm drawn from ''R''. More specifically:
 
:<math> T(A,D) = \,\underset{X \sim D}{\operatorname{E}}[T(A,X)]. \, </math>
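Both sides of the identity can be checked numerically on the toy 2×2 cost matrix used above, by a grid search over the one-parameter distributions (a rough sketch; an exact solver would use linear programming):

<syntaxhighlight lang="python">
COST = [[1, 3], [4, 2]]
GRID = [i / 1000 for i in range(1001)]  # candidate probabilities

def T(R, D):
    """Expected running time when the algorithm is drawn from R and the input from D."""
    return sum(R[a] * D[x] * COST[a][x] for a in range(2) for x in range(2))

# Left side: A commits to R = (p, 1-p) first; B's best reply is a pure
# input, so the inner maximum only needs the two input columns.
lhs = min(max(T([p, 1 - p], [1, 0]), T([p, 1 - p], [0, 1])) for p in GRID)

# Right side: B commits to D = (q, 1-q) first; A's best reply is a single
# deterministic algorithm, so the inner minimum only needs the two rows.
rhs = max(min(T([1, 0], [q, 1 - q]), T([0, 1], [q, 1 - q])) for q in GRID)

print(lhs, rhs)  # both 2.5, as the minimax theorem predicts
</syntaxhighlight>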
 
If we limit the set of algorithms to a specific family (for instance, all deterministic choices for pivots in the [[quick sort]] algorithm), choosing an algorithm ''A'' from ''R'' is equivalent to running a randomized algorithm (for instance, running quick sort and randomly choosing the pivots at each step), as sketched below.
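A minimal sketch of that example, with the pivot drawn uniformly at random at each step (one of several equivalent ways to randomize quick sort):

<syntaxhighlight lang="python">
import random

def quicksort(xs):
    """Quick sort whose pivot is chosen uniformly at random at each step."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
</syntaxhighlight>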
 
This gives us insight into [[Yao's principle]], which states that the [[expected value|expected]] cost of any [[randomized algorithm]] on the worst-case input for that algorithm can be no better than the expected cost, for a worst-case [[probability distribution]] on the inputs, of the [[deterministic algorithm]] that performs best against that distribution.
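In the notation above, Yao's principle is the inequality obtained by fixing one side of the identity at a time: for every distribution ''R'' over algorithms and every distribution ''D'' over inputs,

:<math> \max_x \underset{A \sim R}{\operatorname{E}}[T(A,x)] \ge \min_A \underset{X \sim D}{\operatorname{E}}[T(A,X)], \, </math>

so the worst-case expected running time of any randomized algorithm is bounded below by the best deterministic algorithm's expected running time against any fixed input distribution.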
 
[[Category:Non-cooperative games]]
[[Category:Probabilistic complexity theory]]
