Package org.ddogleg.optimization
Class Descriptions:

- Lambda intended to be used to adjust the value of an array.
- Configuration for GaussNewtonBase_F64.
- Configuration for built-in Loss Functions.
- General configuration for unconstrained non-linear least squares solvers.
- Convergence parameters for UnconstrainedMinimization and UnconstrainedLeastSquares.
- Used to validate an algebraic Jacobian numerically.
- Factory for creating different LossFunction and LossFunctionGradient implementations.
- Functions for creating numerical derivatives.
- Creates optimization algorithms using easy-to-use interfaces.
- Factory for sparse optimization algorithms.
- Base class for Gauss-Newton based approaches for unconstrained optimization.
- Optimization mode.
- Interface for iterative optimization classes.
- LeastSquaresSwitcher<S extends DMatrix>: Class that lets you easily switch between different GaussNewtonBase_F64 solvers and matrix formats.
- Abstracts writing to the Jacobian so that LeastSquaresSwitcher.ProcessJacobianOut doesn't need to know the format.
- High-level interface for implementing the Jacobian.
- Line search for nonlinear optimization.
- OptimizationDerivative<State>: Interface for computing the gradient of a set of functions given a set of model parameters.
- Exception thrown if something goes wrong while optimizing that would make the results invalid.
- UnconstrainedLeastSquares<S extends DMatrix>: Non-linear least squares problems have a special structure which can be taken advantage of for optimization.
- Common base for implementations of UnconstrainedLeastSquares that differ in the Jacobian function.
- UnconstrainedLeastSquaresSchur<S extends DMatrix>: A variant on UnconstrainedLeastSquares for solving large scale systems which can be simplified using the Schur Complement.
- Optimization algorithm which seeks to minimize F(X) ∈ ℜ where X ∈ ℜ^N.
- Performs common optimization tasks.
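To show how the pieces in this package fit together, here is a minimal sketch of solving an unconstrained least-squares problem. It assumes DDogleg's FunctionNtoM residual interface, a FactoryOptimization.levenbergMarquardt(config, robust) factory method, and UtilOptimize.process(); method names and signatures vary between DDogleg versions, so verify against the Javadoc of the version you are using:

```java
import org.ddogleg.optimization.FactoryOptimization;
import org.ddogleg.optimization.UnconstrainedLeastSquares;
import org.ddogleg.optimization.UtilOptimize;
import org.ddogleg.optimization.functions.FunctionNtoM;
import org.ejml.data.DMatrixRMaj;

public class LeastSquaresSketch {
    /** Fits y = a*x + b to three points; the residuals are zero at a=2, b=1 */
    public static double[] fitLine() {
        FunctionNtoM residuals = new FunctionNtoM() {
            final double[] x = {0, 1, 2};
            final double[] y = {1, 3, 5};
            @Override public int getNumOfInputsN() { return 2; }   // parameters (a,b)
            @Override public int getNumOfOutputsM() { return 3; }  // one residual per point
            @Override public void process(double[] p, double[] out) {
                for (int i = 0; i < x.length; i++)
                    out[i] = (p[0]*x[i] + p[1]) - y[i];
            }
        };

        // Levenberg-Marquardt; passing a null Jacobian below selects a numerical derivative
        UnconstrainedLeastSquares<DMatrixRMaj> lm =
                FactoryOptimization.levenbergMarquardt(null, false);
        lm.setFunction(residuals, null);
        lm.initialize(new double[]{0, 0}, 1e-12, 1e-12);

        // Iterate until convergence or the iteration limit is hit
        UtilOptimize.process(lm, 500);
        return lm.getParameters();
    }

    public static void main(String[] args) {
        double[] found = fitLine();
        System.out.printf("a=%.3f b=%.3f%n", found[0], found[1]);
    }
}
```

Because the residuals here are linear in the parameters, Levenberg-Marquardt converges to the exact fit in a few iterations.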
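The UnconstrainedMinimization interface follows the same pattern for general scalar objectives F(X). This sketch assumes DDogleg's FunctionNtoS cost interface and a FactoryOptimization.quasiNewtonBfgs(config) factory method; again, check the exact signatures against your version's Javadoc:

```java
import org.ddogleg.optimization.FactoryOptimization;
import org.ddogleg.optimization.UnconstrainedMinimization;
import org.ddogleg.optimization.UtilOptimize;
import org.ddogleg.optimization.functions.FunctionNtoS;

public class MinimizationSketch {
    /** Minimizes f(x) = (x0-3)^2 + (x1+1)^2, whose minimum is at (3,-1) */
    public static double[] minimize() {
        FunctionNtoS cost = new FunctionNtoS() {
            @Override public int getNumOfInputsN() { return 2; }
            @Override public double process(double[] p) {
                double dx = p[0] - 3, dy = p[1] + 1;
                return dx*dx + dy*dy;
            }
        };

        UnconstrainedMinimization qn = FactoryOptimization.quasiNewtonBfgs(null);
        // null gradient -> computed numerically; 0 is the smallest possible function value
        qn.setFunction(cost, null, 0);
        qn.initialize(new double[]{0, 0}, 1e-12, 1e-12);
        UtilOptimize.process(qn, 500);
        return qn.getParameters();
    }

    public static void main(String[] args) {
        double[] found = minimize();
        System.out.printf("x=%.3f y=%.3f%n", found[0], found[1]);
    }
}
```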
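The class described above as validating an algebraic Jacobian numerically can be used to catch mistakes in a hand-derived Jacobian before optimizing. This sketch assumes it is DerivativeChecker with a static jacobian() method comparing an analytic FunctionNtoMxN against finite differences; the declareMatrixMxN() method and the exact tolerance semantics are assumptions to confirm in the Javadoc:

```java
import org.ddogleg.optimization.DerivativeChecker;
import org.ddogleg.optimization.functions.FunctionNtoM;
import org.ddogleg.optimization.functions.FunctionNtoMxN;
import org.ejml.data.DMatrixRMaj;

public class JacobianCheckSketch {
    // f(p) = [ p0*p1 , p0^2 ]
    static final FunctionNtoM func = new FunctionNtoM() {
        @Override public int getNumOfInputsN() { return 2; }
        @Override public int getNumOfOutputsM() { return 2; }
        @Override public void process(double[] p, double[] out) {
            out[0] = p[0]*p[1];
            out[1] = p[0]*p[0];
        }
    };

    // Analytic Jacobian of f: [[p1, p0], [2*p0, 0]]
    static final FunctionNtoMxN<DMatrixRMaj> jacobian = new FunctionNtoMxN<DMatrixRMaj>() {
        @Override public int getNumOfInputsN() { return 2; }
        @Override public int getNumOfOutputsM() { return 2; }
        @Override public DMatrixRMaj declareMatrixMxN() { return new DMatrixRMaj(2, 2); }
        @Override public void process(double[] p, DMatrixRMaj J) {
            J.set(0, 0, p[1]);   J.set(0, 1, p[0]);
            J.set(1, 0, 2*p[0]); J.set(1, 1, 0);
        }
    };

    /** Compares the analytic Jacobian against a numerical one at a test point */
    public static boolean check() {
        return DerivativeChecker.jacobian(func, jacobian, new double[]{1.5, -2.0}, 1e-5);
    }

    public static void main(String[] args) {
        System.out.println("Jacobian matches: " + check());
    }
}
```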