LDA++

GradientDescent< ProblemType, ParameterType > Class Template Reference
#include <GradientDescent.hpp>
Public Types

    typedef ParameterType::Scalar Scalar

Public Member Functions

    GradientDescent (std::shared_ptr< LineSearch< ProblemType, ParameterType > > line_search, std::function< bool(Scalar, Scalar, size_t)> progress)
    void minimize (const ProblemType &problem, Eigen::Ref< ParameterType > x0)
Detailed Description

A very simple implementation of batch gradient descent.

Given a problem, a line search and a starting point it performs the following simple iteration:

    d_t     = -∇f(x_t)
    x_{t+1} = x_t + η_t d_t

where η_t is the step size chosen by the line search. Convergence is decided by another part of the program through injection of a callback.

TODO: Maybe bind the problem type through an interface.
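In plain code, the iteration and the callback-driven stopping rule look roughly like the following. This is a library-independent illustration in C++ with Eigen, not code from LDA++; the fixed step size stands in for whatever value the line search would actually compute, and the quadratic objective is only an example.

    #include <cstddef>
    #include <functional>
    #include <Eigen/Core>

    // Example objective: f(x) = 0.5 * ||x||^2, with gradient ∇f(x) = x.
    static double value(const Eigen::VectorXd &x) { return 0.5 * x.squaredNorm(); }
    static Eigen::VectorXd gradient(const Eigen::VectorXd &x) { return x; }

    // Sketch of the iteration described above; the progress callback decides when to stop.
    void gradient_descent_sketch(Eigen::VectorXd &x,
                                 std::function<bool(double, double, std::size_t)> progress) {
        const double eta = 0.1;  // stands in for the step size a line search would pick
        std::size_t t = 0;
        do {
            Eigen::VectorXd d = -gradient(x);  // d_t = -∇f(x_t)
            x += eta * d;                      // x_{t+1} = x_t + η_t d_t
            ++t;
        } while (progress(value(x), gradient(x).norm(), t));
    }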
Constructor & Destructor Documentation

GradientDescent (std::shared_ptr< LineSearch< ProblemType, ParameterType > > line_search, std::function< bool(Scalar, Scalar, size_t)> progress)   [inline]

Parameters
    line_search    A line search method
    progress       A callback that decides when the optimization has ended; it can also be used to monitor the progress of the optimization
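As an illustration of the progress callback's shape, the lambda below matches the std::function< bool(Scalar, Scalar, size_t) > signature (with Scalar taken to be double). This page does not spell out what the two Scalar arguments carry or what the boolean return value means, so the interpretation in the comments (objective value, gradient norm, and "return true to keep iterating") is an assumption to be checked against GradientDescent.hpp.

    #include <cstddef>
    #include <iostream>

    // Assumed meaning of the arguments: current objective value, current gradient
    // norm, and iteration count. Returning true is assumed to mean "keep iterating".
    auto progress = [](double value, double grad_norm, std::size_t iteration) -> bool {
        std::cout << "iteration " << iteration
                  << "  f = " << value
                  << "  |grad| = " << grad_norm << std::endl;
        return grad_norm > 1e-6 && iteration < 1000;  // stop on a small gradient or after 1000 iterations
    };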
Member Function Documentation

void minimize (const ProblemType &problem, Eigen::Ref< ParameterType > x0)   [inline]

Minimize the function defined in the 'problem' argument.

The 'problem' argument should implement the functions value() and gradient() in order to be used with the GradientDescent class.

Parameters
    problem    The function being minimized
    x0         The initial position during our minimization (it will be overwritten with the optimal position)
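For a rough end-to-end sketch (not taken from the library's examples): the struct below provides the value() and gradient() members that minimize() expects, with signatures that are assumptions rather than documented facts, and SomeLineSearch stands for a concrete LineSearch implementation whose name and construction are hypothetical here. Namespace qualifiers are also omitted. Check the headers before relying on any of these details.

    #include <cstddef>
    #include <memory>
    #include <Eigen/Core>
    #include <GradientDescent.hpp>

    // A problem exposing value() and gradient(); the exact signatures are assumed.
    struct QuadraticProblem {
        double value(const Eigen::VectorXd &x) const {
            return 0.5 * x.squaredNorm();            // f(x) = 0.5 * ||x||^2
        }
        void gradient(const Eigen::VectorXd &x, Eigen::Ref<Eigen::VectorXd> grad) const {
            grad = x;                                // ∇f(x) = x
        }
    };

    int main() {
        // SomeLineSearch is a hypothetical placeholder; substitute a concrete
        // LineSearch< QuadraticProblem, Eigen::VectorXd > implementation.
        auto line_search =
            std::make_shared< SomeLineSearch< QuadraticProblem, Eigen::VectorXd > >();

        GradientDescent< QuadraticProblem, Eigen::VectorXd > optimizer(
            line_search,
            [](double value, double grad_norm, std::size_t iteration) {
                return grad_norm > 1e-6 && iteration < 1000;  // assumed: true means keep iterating
            });

        QuadraticProblem problem;
        Eigen::VectorXd x0 = Eigen::VectorXd::Random(10);  // overwritten with the minimizer
        optimizer.minimize(problem, x0);

        return 0;
    }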