
Line search






From Wikipedia, the free encyclopedia
 


In optimization, line search is a basic iterative approach to find a local minimum of an objective function f : ℝⁿ → ℝ. It first finds a descent direction along which the objective function will be reduced, and then computes a step size that determines how far x should move along that direction. The descent direction can be computed by various methods, such as gradient descent or the quasi-Newton method. The step size can be determined either exactly or inexactly.

One-dimensional line search

Suppose f is a one-dimensional function, f : ℝ → ℝ, and assume that it is unimodal, that is, contains exactly one local minimum x* in a given interval [a,z]. This means that f is strictly decreasing in [a,x*] and strictly increasing in [x*,z]. There are several ways to find an (approximate) minimum point in this case.[1]: sec.5

Zero-order methods


Zero-order methods use only function evaluations (i.e., a value oracle), not derivatives; typical examples are ternary search, Fibonacci search, and golden-section search.[1]: sec.5

Zero-order methods are very general: they do not assume differentiability or even continuity.
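
As an illustration, here is a minimal sketch in Python of one zero-order method, golden-section search, for a unimodal function on a bracketing interval [a, z]; the function name and tolerance below are illustrative assumptions rather than part of the source.

import math

def golden_section_search(f, a, z, tol=1e-8):
    """Minimize a unimodal function f on [a, z] using only function values."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    # Two interior probe points dividing [a, z] in the golden ratio.
    c = z - invphi * (z - a)
    d = a + invphi * (z - a)
    fc, fd = f(c), f(d)
    while z - a > tol:
        if fc < fd:
            # The minimum lies in [a, d]; the old lower probe becomes the new upper probe.
            z, d, fd = d, c, fc
            c = z - invphi * (z - a)
            fc = f(c)
        else:
            # The minimum lies in [c, z]; the old upper probe becomes the new lower probe.
            a, c, fc = c, d, fd
            d = a + invphi * (z - a)
            fd = f(d)
    return (a + z) / 2

# Example: the minimum of (x - 2)^2 on [0, 5] is at x = 2.
x_star = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)

Each iteration shrinks the bracket by a constant factor and reuses one of the two previous probe values, so only one new function evaluation is needed per step.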

First-order methods


First-order methods assume that f is continuously differentiable, and that we can evaluate not only f but also its derivative.[1]: sec.5 
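
As an illustration, here is a minimal sketch in Python of one such first-order method: bisection on the sign of the derivative. It assumes a callable df for the derivative f′ and that the minimum is bracketed by [a, z] with df(a) ≤ 0 ≤ df(z); these names and the tolerance are illustrative assumptions.

def bisection_line_search(df, a, z, tol=1e-8):
    """Locate the minimum of a unimodal, continuously differentiable f on [a, z]
    by bisecting on the sign of its derivative df."""
    while z - a > tol:
        mid = (a + z) / 2
        if df(mid) > 0:
            z = mid  # f is already increasing at mid, so the minimum lies to the left
        else:
            a = mid  # f is still decreasing at mid, so the minimum lies to the right
    return (a + z) / 2

# Example: f(x) = (x - 2)^2 has derivative 2*(x - 2); the minimum is at x = 2.
x_star = bisection_line_search(lambda x: 2 * (x - 2), 0.0, 5.0)

Each iteration halves the bracket, giving linear convergence, at the cost of one derivative evaluation per step instead of a function value.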

Curve-fitting methods


Curve-fitting methods try to attain superlinear convergence by assuming that f has some analytic form, e.g. a polynomial of finite degree. At each iteration, there is a set of "working points" in which we know the value of f (and possibly also its derivative). Based on these points, we can compute a polynomial that fits the known values, and find its minimum analytically. The minimum point becomes a new working point, and we proceed to the next iteration.[1]: sec.5

Curve-fitting methods have superlinear convergence when started close enough to the local minimum, but might diverge otherwise. Safeguarded curve-fitting methods run a linearly-convergent method in parallel with the curve-fitting method. At each iteration they check whether the point found by the curve-fitting method is close enough to the interval maintained by the safeguard method; if it is not, the safeguard method is used to compute the next iterate.[1]: 5.2.3.4
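
One classical instance of a curve-fitting method is successive parabolic interpolation: fit a quadratic through the three current working points and take its vertex as the next working point. Below is a minimal, unsafeguarded sketch in Python; the function names are illustrative assumptions, and in practice the step would be combined with a safeguard as described above.

def parabolic_interpolation_step(f, x0, x1, x2):
    """One step of successive parabolic interpolation: return the abscissa of the
    vertex of the parabola through (x0, f(x0)), (x1, f(x1)), (x2, f(x2))."""
    f0, f1, f2 = f(x0), f(x1), f(x2)
    num = (x1 - x0) ** 2 * (f1 - f2) - (x1 - x2) ** 2 * (f1 - f0)
    den = (x1 - x0) * (f1 - f2) - (x1 - x2) * (f1 - f0)
    # den is zero when the three points are collinear; a safeguard would handle that case.
    return x1 - 0.5 * num / den

# For an exactly quadratic f the vertex is found in one step:
x_new = parabolic_interpolation_step(lambda x: (x - 2) ** 2, 0.0, 1.0, 3.0)  # -> 2.0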

Multi-dimensional line search

In general, we have a multi-dimensional objective function f : ℝⁿ → ℝ. The line-search method first finds a descent direction along which the objective function will be reduced, and then computes a step size that determines how far x should move along that direction. The descent direction can be computed by various methods, such as gradient descent or the quasi-Newton method. The step size can be determined either exactly or inexactly. Here is an example gradient method that uses a line search in step 2.3:

  1. Set iteration counter k = 0 and make an initial guess x_0 for the minimum. Pick a tolerance ε > 0.
  2. Loop:
    1. Compute a descent direction p_k.
    2. Define a one-dimensional function h(α) = f(x_k + α p_k), representing the function value on the descent direction given the step-size.
    3. Find an α_k that minimizes h(α) over α ∈ ℝ_+.
    4. Update x_{k+1} = x_k + α_k p_k and k = k + 1.
  3. Until ‖∇f(x_k)‖ < ε.

At the line search step (2.3), the algorithm may minimize h exactly, by solving h′(α_k) = 0, or approximately, by using one of the one-dimensional line-search methods mentioned above. It can also be solved loosely, by asking only for a sufficient decrease in h that does not necessarily approximate the optimum. One example of the former is the conjugate gradient method. The latter is called inexact line search and may be performed in a number of ways, such as a backtracking line search or using the Wolfe conditions.
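
As a concrete sketch of this scheme, the following Python code uses the steepest-descent direction p_k = −∇f(x_k) together with a backtracking (Armijo) line search; the function names, the quadratic test objective, and the parameter values (alpha0, rho, c) are illustrative assumptions rather than prescribed choices.

import numpy as np

def backtracking_gradient_descent(f, grad_f, x0, tol=1e-6, max_iter=1000,
                                  alpha0=1.0, rho=0.5, c=1e-4):
    """Gradient descent with an inexact (backtracking/Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:      # step 3: stopping test
            break
        p = -g                           # step 2.1: descent direction
        alpha = alpha0                   # step 2.3: backtrack until sufficient decrease
        while f(x + alpha * p) > f(x) + c * alpha * g.dot(p):
            alpha *= rho
        x = x + alpha * p                # step 2.4: update the iterate
    return x

# Example on the quadratic f(x, y) = x^2 + 10*y^2, whose minimum is the origin.
f = lambda v: v[0] ** 2 + 10 * v[1] ** 2
grad_f = lambda v: np.array([2 * v[0], 20 * v[1]])
x_star = backtracking_gradient_descent(f, grad_f, np.array([3.0, -2.0]))

Shrinking the trial step by the factor rho until the Armijo condition holds guarantees a sufficient decrease at every iteration without requiring an exact one-dimensional minimization.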

Overcoming local minima


Like other optimization methods, line search may be combined with simulated annealing to allow it to jump over some local minima.

See also


References

  1. Nemirovsky and Ben-Tal (2023). "Optimization III: Convex Optimization" (PDF).

Further reading
