The '''Frank–Wolfe algorithm''' is a simple [[iterative method|iterative]] [[First-order approximation|first-order]] [[Mathematical optimization|optimization]] [[algorithm]] for [[constrained optimization|constrained]] [[convex optimization]]. Also known as the '''conditional gradient method''',<ref>{{Cite doi|10.1016/0041-5553(66)90114-5|noedit}}</ref> '''reduced gradient algorithm''' and the '''convex combination algorithm''', the method was originally proposed by [[Marguerite Frank]] and [[Philip Wolfe (mathematician)|Philip Wolfe]] in&nbsp;1956.<ref>{{cite doi|10.1002/nav.3800030109|noedit}}</ref> In each iteration, the Frank–Wolfe algorithm considers a [[linear approximation]] of the objective function, and moves slightly towards a minimizer of this linear function (taken over the same domain).
 
==Problem statement==
 
:Minimize <math> f(\mathbf{x})</math>
:subject to <math> \mathbf{x} \in \mathcal{D}</math>.
Here the function <math> f</math> is [[Convex function|convex]] and [[differentiable function|differentiable]], and the domain / feasible set <math>\mathcal{D}</math> is a compact [[Convex set|convex]] set in some [[vector space]].
 
==Algorithm==
[[File:Frank-Wolfe-Algorithm.png|thumbnail|right|A step of the Frank-Wolfe algorithm]]
 
:''Initialization:'' Let <math>k \leftarrow 0</math>, and let <math>\mathbf{x}_0 \!</math> be any point in <math>\mathcal{D}</math>.
 
:'''Step 1.'''  ''Direction-finding subproblem:'' Find <math>\mathbf{s}_k</math> solving
::Minimize <math>  \mathbf{s}^T \nabla f(\mathbf{x}_k)</math>
::Subject to <math>\mathbf{s} \in \mathcal{D}</math>
:''(Interpretation: Minimize the linear approximation of the problem given by the first-order [[Taylor series|Taylor approximation]] of <math>f</math> around <math>\mathbf{x}_k \!</math>.)''
 
:'''Step 2.'''  ''Step size determination:'' Set <math>\gamma \leftarrow \frac{2}{k+2}</math>, or alternatively find <math>\gamma</math> that minimizes <math> f(\mathbf{x}_k+\gamma(\mathbf{s}_k -\mathbf{x}_k))</math> subject to <math>0 \le \gamma \le 1</math>.
 
:'''Step 3.'''  ''Update:''  Let <math>\mathbf{x}_{k+1}\leftarrow \mathbf{x}_k+\gamma(\mathbf{s}_k-\mathbf{x}_k)</math>, let <math>k \leftarrow k+1</math> and go to Step 1.
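The three steps above can be sketched in Python. The quadratic objective, the probability-simplex feasible set, and all names below are illustrative choices for the demo, not part of the algorithm statement; over the simplex, the direction-finding subproblem is solved by picking the vertex (standard basis vector) with the smallest gradient coordinate.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters):
    """Frank-Wolfe with the default step size gamma = 2 / (k + 2).

    grad -- returns the gradient of f at x
    lmo  -- linear minimization oracle: returns argmin_{s in D} s^T g
    x0   -- starting point in the feasible set D
    """
    x = x0.astype(float)
    for k in range(num_iters):
        s = lmo(grad(x))              # Step 1: direction-finding subproblem
        gamma = 2.0 / (k + 2.0)       # Step 2: step size
        x = x + gamma * (s - x)       # Step 3: convex-combination update
    return x

# Illustrative problem: minimize f(x) = ||x - b||^2 over the probability
# simplex. The LMO over the simplex returns the standard basis vector
# of the smallest gradient coordinate.
b = np.array([0.1, 0.5, 0.4])
grad = lambda x: 2.0 * (x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]

x = frank_wolfe(grad, lmo, np.array([1.0, 0.0, 0.0]), num_iters=500)
# x stays in the simplex by construction and approaches the minimizer b
```

Because each update is a convex combination of the current iterate and a point of <math>\mathcal{D}</math>, no projection is ever needed.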
 
==Properties==
While competing methods such as [[gradient descent]] for constrained optimization require a [[Projection (mathematics)|projection step]] back to the feasible set in each iteration, the Frank–Wolfe algorithm only needs the solution of a linear problem over the same set in each iteration, and automatically stays in the feasible set.
 
The convergence of the Frank–Wolfe algorithm is sublinear in general: the error in the objective, <math>f(\mathbf{x}_k) - f(\mathbf{x}^*)</math>, is <math>O(1/k)</math> after <math>k</math> iterations. The same convergence rate can also be shown if the sub-problems are only solved approximately.<ref>{{cite doi|10.1016/0022-247X(78)90137-3|noedit}}</ref>
 
The iterates of the algorithm can always be represented as a sparse convex combination of the extreme points of the feasible set, which has contributed to the popularity of the algorithm for sparse greedy optimization in [[machine learning]] and [[signal processing]] problems,<ref>{{cite doi|10.1145/1824777.1824783|noedit}}</ref> as well as, for example, the optimization of [[flow network|minimum–cost flow]]s in [[transportation network]]s.<ref>{{cite doi|10.1016/0191-2615(84)90029-8|noedit}}</ref>
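This sparsity is easy to see over the probability simplex, where each oracle call returns a single vertex, so an iterate started from a vertex is a convex combination of at most <math>k+1</math> vertices after <math>k</math> iterations. The dimensions and data below are made up for illustration:

```python
import numpy as np

# Sparsity of Frank-Wolfe iterates over the probability simplex: each
# iteration adds at most one new vertex (standard basis vector) to the
# convex combination, so x_k has at most k + 1 nonzero entries.
rng = np.random.default_rng(0)
n = 1000
b = rng.random(n)
b /= b.sum()                              # made-up target inside the simplex
grad = lambda x: 2.0 * (x - b)            # gradient of ||x - b||^2

x = np.zeros(n)
x[0] = 1.0                                # start at a vertex
for k in range(20):
    s_idx = np.argmin(grad(x))            # LMO over the simplex: one vertex
    gamma = 2.0 / (k + 2.0)
    x *= 1.0 - gamma                      # shrink the old convex combination
    x[s_idx] += gamma                     # mix in the new vertex

sparsity = np.count_nonzero(x)            # at most 21 of the 1000 coordinates
```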
 
If the feasible set is given by a set of linear constraints, then the subproblem to be solved in each iteration becomes a [[linear programming|linear program]].
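As a sketch, the direction-finding subproblem can then be handed to any LP solver, for example `scipy.optimize.linprog`; the polytope and the gradient vector below are made-up illustrations:

```python
import numpy as np
from scipy.optimize import linprog

def lp_oracle(g, A_ub, b_ub):
    """Direction-finding subproblem over D = {s : A_ub @ s <= b_ub, s >= 0}:
    minimize s^T g, which is a linear program in s."""
    res = linprog(c=g, A_ub=A_ub, b_ub=b_ub)   # default bounds keep s >= 0
    if not res.success:
        raise RuntimeError(res.message)
    return res.x

# Made-up feasible set {s in R^2 : s1 + s2 <= 1, s >= 0} and gradient.
A_ub = np.array([[1.0, 1.0]])
b_ub = np.array([1.0])
s = lp_oracle(np.array([-1.0, 2.0]), A_ub, b_ub)
# A linear function attains its minimum over a polytope at a vertex: s = (1, 0)
```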
 
While the worst-case convergence rate of <math>O(1/k)</math> cannot be improved in general, faster convergence can be obtained for special problem classes, such as some strongly convex problems.<ref>{{Cite book|title=Nonlinear Programming|first= Dimitri |last=Bertsekas|year= 2003|page= 222|publisher=  Athena Scientific| isbn =1-886529-00-0}}</ref>
 
==Lower bounds on the solution value, and primal-dual analysis==
 
Since <math>f</math> is convex, <math>f(\mathbf{y})</math> always lies above the [[Tangent|tangent plane]] of <math>f</math> at any point <math>\mathbf{x} \in \mathcal{D}</math>, i.e. for all <math>\mathbf{y} \in \mathcal{D}</math>:
 
:<math>
  f(\mathbf{y}) \geq f(\mathbf{x}) +  (\mathbf{y} - \mathbf{x})^T \nabla f(\mathbf{x})
</math>
 
This holds in particular for the (unknown) optimal solution <math>\mathbf{x}^*</math>. The best lower bound with respect to a given point <math>\mathbf{x}</math> is given by
 
:<math>
  f(\mathbf{x}^*) \geq \min_{\mathbf{y} \in \mathcal{D}} f(\mathbf{x}) +  (\mathbf{y} - \mathbf{x})^T \nabla f(\mathbf{x}) = f(\mathbf{x}) - \mathbf{x}^T \nabla f(\mathbf{x}) + \min_{\mathbf{y} \in \mathcal{D}} \mathbf{y}^T \nabla f(\mathbf{x})
</math>
 
The latter optimization problem is solved in every iteration of the Frank–Wolfe algorithm; therefore, the solution <math>\mathbf{s}_k</math> of the direction-finding subproblem in the <math>k</math>-th iteration can be used to determine increasing lower bounds <math>l_k</math> by setting <math>l_0 = - \infty</math> and
 
:<math>
  l_k := \max (l_{k - 1}, f(\mathbf{x}_k) +  (\mathbf{s}_k - \mathbf{x}_k)^T \nabla f(\mathbf{x}_k))
</math>
Such lower bounds on the unknown optimal value are important in practice because they can be used as a stopping criterion, and give an efficient certificate of the approximation quality in every iteration, since always <math>l_k \leq f(\mathbf{x}^*) \leq f(\mathbf{x}_k)</math>.
 
It has been shown that the corresponding [[duality gap]], that is, the difference between <math>f(\mathbf{x}_k)</math> and the lower bound <math>l_k</math>, decreases with the same convergence rate, i.e.
<math>
  f(\mathbf{x}_k) - l_k = O(1/k) .
</math>
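A minimal sketch of this stopping criterion, using an illustrative quadratic objective over the probability simplex (the data and tolerance are made up for the demo):

```python
import numpy as np

b = np.array([0.1, 0.5, 0.4])
f = lambda x: np.sum((x - b) ** 2)            # illustrative objective
grad = lambda x: 2.0 * (x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]  # LMO over the probability simplex

x = np.array([1.0, 0.0, 0.0])
lower = -np.inf                               # l_0 = -infinity
for k in range(100_000):
    g = grad(x)
    s = lmo(g)
    # lower bound from the linearization at x_k, evaluated at its minimizer s_k
    lower = max(lower, f(x) + (s - x) @ g)
    gap = f(x) - lower                        # certifies l_k <= f(x*) <= f(x_k)
    if gap < 1e-3:                            # stopping criterion
        break
    x = x + 2.0 / (k + 2.0) * (s - x)
```

At termination, `gap` bounds the suboptimality <math>f(\mathbf{x}_k) - f(\mathbf{x}^*)</math> from above without knowing the optimum.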
 
==Notes==
{{Reflist}}
 
==Bibliography==
*{{cite journal|last=Jaggi|first=Martin|title=Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization|journal=Journal of Machine Learning Research: Workshop and Conference Proceedings |volume=28|issue=1|pages=427–435|year= 2013 |url=http://jmlr.csail.mit.edu/proceedings/papers/v28/jaggi13.html}} (Overview paper)
*[http://www.math.chalmers.se/Math/Grundutb/CTH/tma946/0203/fw_eng.pdf The Frank-Wolfe algorithm] description
 
== See also ==
* [[Proximal Gradient Methods]]
 
{{Optimization algorithms|convex}}
 
{{DEFAULTSORT:Frank-Wolfe algorithm}}
[[Category:Optimization algorithms and methods]]
[[Category:Iterative methods]]
[[Category:First order methods]]
[[Category:Gradient methods]]

Revision as of 23:49, 9 January 2014
