Descent Methods

Different algorithms differ with respect to the choice of the descent direction and the step size rule.

Descent direction:

    • Gradient Method: the direction is the negative gradient −∇f(x), which is the steepest descent direction in the first-order approximation.

    • Steepest Descent Method: by choosing a different norm P we can obtain different descent directions.

    • Newton Method: the direction is the Newton step −[∇²f(x)]⁻¹ ∇f(x), which is the steepest descent direction in the second-order approximation.

Step size:

    • Fixed Step Size

    • Exact Line Search

    • Backtracking Line Search (a gradient-descent sketch using this rule follows this list)
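
As a hedged illustration of how these choices combine, here is a minimal sketch of the gradient method with backtracking line search. The function names, the sufficient-decrease parameters alpha and beta, and the quadratic test problem are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, direction, alpha=0.3, beta=0.8):
    """Shrink the step size t until the sufficient-decrease (Armijo) condition holds."""
    t = 1.0
    fx, g = f(x), grad_f(x)
    while f(x + t * direction) > fx + alpha * t * g @ direction:
        t *= beta
    return t

def gradient_descent(f, grad_f, x0, tol=1e-8, max_iter=1000):
    """Gradient method: direction is -grad f(x); step size chosen by backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is (nearly) zero
            break
        direction = -g
        t = backtracking_line_search(f, grad_f, x, direction)
        x = x + t * direction
    return x

# Illustrative convex quadratic: f(x) = 0.5 x^T A x - b^T x, whose minimizer is A^{-1} b.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b
print(gradient_descent(f, grad_f, x0=[0.0, 0.0]))
```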

References: EE230 Convex Optimization Lecture Slide Topic 6: Unconstrained Optimization; Convex Optimization notes provided by Dr. S. Boyd.

Newton's Method

Application in Inverse Kinematics:
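
The notes for this part are sparse, so the following is only a hedged sketch. Newton-Raphson root finding iterates x_{k+1} = x_k − f(x_k)/f'(x_k); applied to inverse kinematics (in the spirit of the coordinate-vector formulation in Modern Robotics, Chapter 6), the "root" is a joint vector θ at which the task-space error x_desired − f(θ) vanishes, with the scalar derivative replaced by the manipulator Jacobian and its pseudo-inverse. The planar 2R arm, its link lengths, and all function names below are illustrative assumptions, not taken from the textbook.

```python
import numpy as np

# Hypothetical planar 2R arm; link lengths are illustrative assumptions.
L1, L2 = 1.0, 1.0

def forward_kinematics(theta):
    """End-effector (x, y) position of the planar 2R arm."""
    t1, t2 = theta
    return np.array([L1 * np.cos(t1) + L2 * np.cos(t1 + t2),
                     L1 * np.sin(t1) + L2 * np.sin(t1 + t2)])

def jacobian(theta):
    """Analytic Jacobian of the forward kinematics."""
    t1, t2 = theta
    return np.array([[-L1 * np.sin(t1) - L2 * np.sin(t1 + t2), -L2 * np.sin(t1 + t2)],
                     [ L1 * np.cos(t1) + L2 * np.cos(t1 + t2),  L2 * np.cos(t1 + t2)]])

def ik_newton(x_desired, theta0, tol=1e-6, max_iter=100):
    """Newton-Raphson IK: theta <- theta + pinv(J(theta)) @ (x_desired - f(theta))."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        error = x_desired - forward_kinematics(theta)
        if np.linalg.norm(error) < tol:
            break
        theta = theta + np.linalg.pinv(jacobian(theta)) @ error
    return theta

# Solve for joint angles that place the end effector at a reachable target point.
print(ik_newton(x_desired=np.array([1.2, 0.8]), theta0=[0.3, 0.3]))
```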

References: https://en.wikipedia.org/wiki/Newton%27s_method; https://brilliant.org/wiki/newton-raphson-method/; Chapter 6 in Modern Robotics textbook

Newton's Method in Optimization

Putting everything together, Newton's method for minimizing f performs the iteration x_{k+1} = x_k − [∇²f(x_k)]⁻¹ ∇f(x_k).
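
A minimal sketch of this iteration, assuming the gradient and Hessian are available in closed form; the Rosenbrock test function, the starting point, and the tolerances are illustrative choices, not from the source. In practice a line search or damping is usually added to make the pure iteration more robust.

```python
import numpy as np

def newton_minimize(grad_f, hess_f, x0, tol=1e-10, max_iter=50):
    """Newton's method for minimization: x <- x - [hess f(x)]^{-1} grad f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(hess_f(x), g)   # solve H step = g instead of inverting H
        x = x - step
    return x

# Illustrative test problem: the Rosenbrock function, whose minimum is at (1, 1).
def grad_f(x):
    return np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                     200 * (x[1] - x[0]**2)])

def hess_f(x):
    return np.array([[1200 * x[0]**2 - 400 * x[1] + 2, -400 * x[0]],
                     [-400 * x[0], 200.0]])

print(newton_minimize(grad_f, hess_f, x0=[-1.2, 1.0]))  # should approach [1, 1]
```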

Reference: https://en.wikipedia.org/wiki/Newton%27s_method_in_optimization

Gauss-Newton Algorithm

The Gauss–Newton algorithm is used to solve non-linear least squares problems. It is a modification of Newton's method for finding a minimum of a function.

Unlike Newton's method, the Gauss–Newton algorithm can only be used to minimize a sum of squared function values, but it has the advantage that second derivatives, which can be challenging to compute, are not required.
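
Concretely, with r(β) the vector of residuals and J its Jacobian, Gauss–Newton repeatedly solves the linearized normal equations (JᵀJ) δ = −Jᵀr and updates β ← β + δ, so only first derivatives are needed. Below is a hedged sketch that fits an exponential model to synthetic data; the model, the data, and the starting point are illustrative assumptions, not taken from the reference.

```python
import numpy as np

def gauss_newton(residual, jac, beta0, tol=1e-10, max_iter=100):
    """Gauss-Newton for min ||r(beta)||^2: solve (J^T J) delta = -J^T r each step."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = residual(beta)
        J = jac(beta)
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        beta = beta + delta
        if np.linalg.norm(delta) < tol:
            break
    return beta

# Illustrative problem: fit y = a * exp(b * t) to noiseless synthetic data (a=2, b=0.5),
# starting from a reasonable initial guess.
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.5 * t)

def residual(beta):
    a, b = beta
    return a * np.exp(b * t) - y

def jac(beta):
    a, b = beta
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

print(gauss_newton(residual, jac, beta0=[1.5, 0.3]))  # should approach [2.0, 0.5]
```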

Reference: https://en.wikipedia.org/wiki/Gauss%E2%80%93Newton_algorithm

Levenberg–Marquardt Algorithm

The Levenberg–Marquardt algorithm (LMA), also known as the damped least-squares (DLS) method, is used to solve non-linear least squares problems. The LMA interpolates between the Gauss–Newton algorithm (GNA) and the method of gradient descent.
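
As a hedged sketch of that interpolation: LM solves the damped normal equations (JᵀJ + λI) δ = −Jᵀr, increasing the damping λ when a step fails to reduce the squared error (small, gradient-descent-like steps) and decreasing it when a step succeeds (Gauss–Newton-like steps). The update factors for λ, the exponential model, and the starting point below are illustrative assumptions, not a definitive implementation.

```python
import numpy as np

def levenberg_marquardt(residual, jac, beta0, lam=1e-2, tol=1e-10, max_iter=200):
    """LM: solve (J^T J + lam*I) delta = -J^T r, adapting lam to each step's success."""
    beta = np.asarray(beta0, dtype=float)
    cost = np.sum(residual(beta) ** 2)
    for _ in range(max_iter):
        r = residual(beta)
        J = jac(beta)
        delta = np.linalg.solve(J.T @ J + lam * np.eye(J.shape[1]), -J.T @ r)
        new_beta = beta + delta
        new_cost = np.sum(residual(new_beta) ** 2)
        if new_cost < cost:          # step accepted: behave more like Gauss-Newton
            beta, cost = new_beta, new_cost
            lam /= 10.0
        else:                        # step rejected: behave more like gradient descent
            lam *= 10.0
        if np.linalg.norm(delta) < tol:
            break
    return beta

# Same illustrative exponential-fit problem as in the Gauss-Newton sketch above,
# but started farther from the solution, where an undamped Gauss-Newton step can overshoot.
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.5 * t)
residual = lambda beta: beta[0] * np.exp(beta[1] * t) - y
jac = lambda beta: np.column_stack([np.exp(beta[1] * t),
                                    beta[0] * t * np.exp(beta[1] * t)])
print(levenberg_marquardt(residual, jac, beta0=[0.5, 2.0]))  # should approach [2.0, 0.5]
```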

Conclusions:

  • When the function can be well approximated by a second-order expansion, we use the Gauss–Newton method; when it cannot (for example, far from the solution), the damping in the LMA makes it behave more like gradient descent, which is more robust.

References: https://en.wikipedia.org/wiki/Gauss%E2%80%93Newton_algorithm; https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm
