Local unimodal sampling

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.

Local unimodal sampling (LUS) is a method for numerical optimization which does not require the gradient of the problem to be optimized; LUS can hence be used on functions that are not continuous or differentiable. Such optimization methods are also known as direct-search, derivative-free, or black-box methods.

LUS is attributed to Pedersen and works by maintaining a single position in the search-space and moving to a new position in case of improvement to the fitness or cost function. New positions are sampled from the neighbourhood of the current position using a uniform distribution. The sampling-range is initially the full search-space and decreases exponentially during optimization.

Motivation

[Figures: two single-dimensional illustrations. When the current position x is far from the optimum, the probability of finding an improvement through uniform random sampling is 1/2. As the optimum is approached, the probability of finding further improvements through uniform sampling decreases towards zero if the sampling-range d is kept fixed.]

When a fixed sampling-range is used for randomly sampling the search-space of a unimodal function, the probability of finding improved positions decreases as the optimum is approached, because a shrinking portion of the sampling-range yields improved fitness (see the figures above for the single-dimensional case). Hence, the sampling-range must be decreased somehow. Pedersen noted that pattern search works well for optimizing unimodal functions and translated its halving of the sampling-range into a formula that has a similar effect when the sampling uses a uniform distribution.

Algorithm

Let f: ℝⁿ → ℝ be the fitness or cost function which must be minimized. Let x ∈ ℝⁿ designate a position or candidate solution in the search-space. The LUS algorithm can then be described as follows (a Python sketch is given after the list):

  • Initialize x ~ U(b_lo, b_up) with a random uniform position in the search-space, where b_lo and b_up are the lower and upper boundaries, respectively.
  • Set the initial sampling range to cover the entire search-space: d = b_up − b_lo
  • Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
    • Pick a random vector a ~ U(−d, d)
    • Add this to the current position x to create the new potential position y = x + a
    • If f(y) < f(x) then move to the new position by setting x = y; otherwise decrease the sampling-range by multiplying with the factor q (see below): d = q d
  • Now x holds the best-found position.
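
A minimal Python sketch of this procedure, assuming NumPy; the function name lus, the fixed-iteration termination criterion max_evals, and the omission of boundary handling are choices made for this illustration rather than prescribed parts of the method:

    import numpy as np

    def lus(f, b_lo, b_up, alpha=1/3, max_evals=1000, seed=None):
        """Local unimodal sampling: minimize f over the box [b_lo, b_up]."""
        rng = np.random.default_rng(seed)
        b_lo = np.asarray(b_lo, dtype=float)
        b_up = np.asarray(b_up, dtype=float)
        n = b_lo.size                      # dimensionality of the search-space
        q = 2.0 ** (-alpha / n)            # sampling-range decrease factor (next section)

        x = rng.uniform(b_lo, b_up)        # initial position x ~ U(b_lo, b_up)
        fx = f(x)
        d = b_up - b_lo                    # initial sampling range covers the whole space

        for _ in range(max_evals):         # termination: fixed number of iterations
            a = rng.uniform(-d, d)         # random step a ~ U(-d, d)
            y = x + a                      # potential new position
            fy = f(y)
            if fy < fx:                    # greedy update: accept only improvements
                x, fx = y, fy
            else:
                d = q * d                  # otherwise shrink the sampling range
        return x, fx

For example, minimizing the 5-dimensional sphere function:

    best_x, best_f = lus(lambda v: float(np.sum(v ** 2)), [-5.0] * 5, [5.0] * 5, seed=0)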

Sampling-range decrease factor

The factor q for exponentially decreasing the sampling-range is defined by q = 2^(−α/n), where n is the dimensionality of the search-space and α is a user-adjustable parameter. Setting 0 < α < 1 causes slower decrease of the sampling-range, and setting α > 1 causes more rapid decrease of the sampling-range. Typically it is set to α = 1/3.

Note that decreasing the sampling-range n times by the factor q results in an overall decrease of q^n = 2^(−α), and for α = 1 this would mean a halving of the sampling-range.
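
As an illustrative numerical check (the concrete values n = 2 and α = 1/3 are examples, not from the source):

    import math

    n, alpha = 2, 1 / 3                        # example dimensionality; alpha = 1/3 is the typical setting
    q = 2 ** (-alpha / n)                      # ~0.8909: each decrease keeps about 89% of the range
    print(math.isclose(q ** n, 2 ** -alpha))   # True: n decreases give 2^(-alpha) ~0.7937 overall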

The main differences between LUS and the Luus–Jaakola (LJ) method are that the decrease-factor q depends on the dimensionality n of the search-space, whereas LJ typically just sets q = 0.95, and that LUS starts out by sampling the entire search-space, whereas LJ starts out by sampling only a fraction of the search-space.

Usage and criticism

LUS is used in meta-optimization for tuning the behavioural parameters of another optimization method (see e.g. Pedersen) because it is simple and usually finds satisfactory solutions using few iterations, which is necessary for such computationally expensive optimization problems. The rapidity of LUS comes from its exponential decrease of the sampling-range, which works well for unimodal problems but is a weakness for multi-modal problems, as the sampling-range is sometimes decreased too rapidly before the basin holding the global minimum is found. This means LUS must typically be run several times to locate the minimum of multi-modal problems, and it sometimes fails completely.
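
A hedged sketch of this meta-optimization idea, reusing the lus function and numpy import from the sketch above; the inner hill-climber, its tunable step_size parameter, and the test problem are all invented for this illustration and are not taken from Pedersen's work:

    def meta_fitness(params):
        """Average final result of a crude inner hill-climber whose step
        size is being tuned; smaller is better.  Each call replays the
        same inner runs (fixed seed) to reduce noise in the meta-fitness."""
        step_size = max(abs(params[0]), 1e-6)  # guard: the lus sketch does no boundary clipping
        rng = np.random.default_rng(0)
        total = 0.0
        for _ in range(10):                    # average over several inner runs
            x = rng.uniform(-5.0, 5.0, size=5) # inner problem: 5-D sphere function
            fx = np.sum(x ** 2)
            for _ in range(100):               # the inner optimizer being tuned
                y = x + rng.normal(0.0, step_size, size=5)
                fy = np.sum(y ** 2)
                if fy < fx:
                    x, fx = y, fy
            total += fx
        return total / 10

    # Tune the inner step size with LUS itself; few meta-iterations suffice.
    best_step, _ = lus(meta_fitness, b_lo=[1e-6], b_up=[2.0], max_evals=50, seed=1)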

LUS has also been criticized informally amongst researchers for using a greedy update rule, and it has been suggested to use a stochastic update rule as in the Metropolis–Hastings algorithm or simulated annealing. However, it was noted in section 2.3.3 of Pedersen's thesis that such stochastic update rules are not useful when sampling real-valued search-spaces: they merely cause the optimizer to move back and forth towards the optimum, with a still-low probability of escaping local optima and a now lowered probability of converging to any optimum.
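
For concreteness, the kind of stochastic update rule being argued against would replace the greedy acceptance test f(y) < f(x) with something like the following Metropolis-style rule; the temperature parameter T is an assumption of this illustration and is not part of LUS:

    import numpy as np

    def metropolis_accept(fx, fy, T, rng):
        """Accept improvements always, and worse positions with probability
        exp((fx - fy) / T) -- the stochastic alternative the text argues against."""
        return fy < fx or rng.random() < np.exp((fx - fy) / T)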

History

The abbreviation LUS is the word for louse in the Danish language. According to Pedersen, the name was chosen as a humorous tribute to the Marx Brothers, who have several louse-puns in their films, and not because the method has any resemblance to the movement of a real louse. The LUS method was developed in Hundested and Copenhagen, Denmark.

References

  1. Pedersen, M.E.H. (2010). Tuning & Simplifying Heuristical Optimization (PhD thesis). University of Southampton, School of Engineering Sciences, Computational Engineering and Design Group.
  2. Luus, R.; Jaakola, T.H.I. (1973). "Optimization by direct search and systematic reduction of the size of search region". AIChE Journal. 19 (4): 760–766.
  3. Pedersen, M.E.H.; Chipperfield, A.J. (2010). "Simplifying particle swarm optimization". Applied Soft Computing. 10: 618–628.