Class UnconMinTrustRegionBFGS_F64
java.lang.Object
org.ddogleg.optimization.GaussNewtonBase_F64<ConfigTrustRegion,HM>
org.ddogleg.optimization.trustregion.TrustRegionBase_F64<DMatrixRMaj,HessianBFGS>
org.ddogleg.optimization.trustregion.UnconMinTrustRegionBFGS_F64
- All Implemented Interfaces:
Serializable, IterativeOptimization, UnconstrainedMinimization, VerbosePrint
public class UnconMinTrustRegionBFGS_F64
extends TrustRegionBase_F64<DMatrixRMaj,HessianBFGS>
implements UnconstrainedMinimization
Implementation of TrustRegionUpdateCauchy_F64 for UnconstrainedMinimization. The Hessian is approximated using the BFGS method. This method exhibits poor convergence, probably due to the Hessian being estimated poorly at times.
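The following is a minimal usage sketch, not copied from the DDogleg documentation. It assumes TrustRegionUpdateCauchy_F64 can be constructed with no arguments, that a dense HessianBFGS implementation named HessianBFGS_DDRM exists with a no-argument constructor, and that FunctionNtoS declares getNumOfInputsN() and process(double[]); verify these names against the actual API before relying on them.

import org.ddogleg.optimization.functions.FunctionNtoS;
import org.ddogleg.optimization.math.HessianBFGS_DDRM;                 // class name and package assumed
import org.ddogleg.optimization.trustregion.TrustRegionUpdateCauchy_F64;
import org.ddogleg.optimization.trustregion.UnconMinTrustRegionBFGS_F64;

public class UnconMinBFGSExample {
    public static void main( String[] args ) {
        // Cost function f(x) = (x0-1)^2 + 10*(x1+2)^2, minimum at (1,-2).
        // Method names on FunctionNtoS are assumptions; check the DDogleg javadoc.
        FunctionNtoS cost = new FunctionNtoS() {
            @Override public int getNumOfInputsN() { return 2; }
            @Override public double process( double[] x ) {
                return (x[0]-1)*(x[0]-1) + 10*(x[1]+2)*(x[1]+2);
            }
        };

        UnconMinTrustRegionBFGS_F64 minimizer = new UnconMinTrustRegionBFGS_F64(
                new TrustRegionUpdateCauchy_F64(), new HessianBFGS_DDRM()); // constructor arguments assumed

        // Passing null for the gradient makes the optimizer compute a numerical gradient.
        minimizer.setFunction(cost, null, 0);
        minimizer.initialize(new double[]{0, 0}, 1e-12, 1e-12);

        // iterate() returns true once the optimizer has terminated.
        for (int i = 0; i < 500 && !minimizer.iterate(); i++) {}

        double[] found = minimizer.getParameters();
        System.out.println("converged = " + minimizer.isConverged()
                + "  f = " + minimizer.getFunctionValue()
                + "  x = [" + found[0] + ", " + found[1] + "]");
    }
}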
-
Nested Class Summary
Nested classes/interfaces inherited from class org.ddogleg.optimization.trustregion.TrustRegionBase_F64
TrustRegionBase_F64.ParameterUpdate<S extends DMatrix>
Nested classes/interfaces inherited from class org.ddogleg.optimization.GaussNewtonBase_F64
GaussNewtonBase_F64.Mode
-
Field Summary
Fields
protected double f_prev
Fields inherited from class org.ddogleg.optimization.trustregion.TrustRegionBase_F64
gradientNorm, parameterUpdate, tmp_p
Fields inherited from class org.ddogleg.optimization.GaussNewtonBase_F64
config, ftest_val, fx, gradient, gtest_val, hessian, hessianScaling, mode, p, postUpdateAdjuster, sameStateAsCost, totalFullSteps, totalSelectSteps, verbose, verboseLevel, x, x_next
-
Constructor Summary
Constructors
UnconMinTrustRegionBFGS_F64(TrustRegionBase_F64.ParameterUpdate<DMatrixRMaj> parameterUpdate, HessianBFGS hessian)
-
Method Summary
protected double cost(DMatrixRMaj x)
    Computes the function's value at x
protected void functionGradientHessian(DMatrixRMaj x, boolean sameStateAsCost, DMatrixRMaj gradient, HessianBFGS hessian)
    Computes the gradient and Hessian at 'x'.
double getFunctionValue()
    Returns the value of the objective function evaluated at the current parameters.
double[] getParameters()
    After each iteration this function can be called to get the current best set of parameters.
void initialize(double[] initial, double ftol, double gtol)
    Specify the initial set of parameters to start from.
void initialize(double[] initial, int numberOfParameters, double minimumFunctionValue)
    Override parent to initialize matrices
boolean isConverged()
    Indicates if iteration stopped due to convergence or not.
boolean isUpdated()
    True if the parameter(s) being optimized have been updated
void setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
    Specifies the function being optimized.
protected boolean wolfeCondition(DMatrixRMaj s, DMatrixRMaj y, DMatrixRMaj g_k)
    Indicates if there's sufficient decrease and curvature.

Methods inherited from class org.ddogleg.optimization.trustregion.TrustRegionBase_F64
acceptNewState, checkConvergenceFTest, computeCostAtInitialization, computeStep, configure, considerCandidate, getConfig, setVerbose, solveCauchyStepLength, updateDerivates

Methods inherited from class org.ddogleg.optimization.GaussNewtonBase_F64
applyHessianScaling, checkConvergenceGTest, computeHessianScaling, computeHessianScaling, computePredictedReduction, initialize, iterate, mode, setVerbose, undoHessianScalingOnParameters

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.ddogleg.optimization.IterativeOptimization
iterate, setVerbose

Methods inherited from interface org.ddogleg.struct.VerbosePrint
setVerbose
-
Field Details
-
f_prev
protected double f_prev
-
-
Constructor Details
-
UnconMinTrustRegionBFGS_F64
public UnconMinTrustRegionBFGS_F64(TrustRegionBase_F64.ParameterUpdate<DMatrixRMaj> parameterUpdate, HessianBFGS hessian)
-
-
Method Details
-
setFunction
public void setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
Description copied from interface: UnconstrainedMinimization
Specifies the function being optimized. A numerical Jacobian will be computed if null is passed in.
Specified by:
setFunction in interface UnconstrainedMinimization
Parameters:
function - Function being optimized.
gradient - Partial derivative for each input in the function. If null a numerical gradient will be computed.
minFunctionValue - Minimum possible value that 'function' can have. E.g. for least squares problems this value should be set to zero.
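Continuing the usage sketch in the class description, an analytic gradient can be supplied instead of null. The FunctionNtoN method names getN() and process(double[], double[]) follow the DDogleg naming pattern but are assumptions; check the interface before using this.

import org.ddogleg.optimization.functions.FunctionNtoN;

// Analytic gradient of f(x) = (x0-1)^2 + 10*(x1+2)^2 from the class-level sketch.
// Method names on FunctionNtoN are assumptions.
FunctionNtoN gradient = new FunctionNtoN() {
    @Override public int getN() { return 2; }
    @Override public void process( double[] x, double[] output ) {
        output[0] = 2*(x[0] - 1);   // df/dx0
        output[1] = 20*(x[1] + 2);  // df/dx1
    }
};

// minFunctionValue = 0 because f(x) >= 0 everywhere.
minimizer.setFunction(cost, gradient, 0);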
-
initialize
public void initialize(double[] initial, double ftol, double gtol)
Description copied from interface: UnconstrainedMinimization
Specify the initial set of parameters to start from. Call after UnconstrainedMinimization.setFunction(org.ddogleg.optimization.functions.FunctionNtoS, org.ddogleg.optimization.functions.FunctionNtoN, double) has been called.
Specified by:
initialize in interface UnconstrainedMinimization
Parameters:
initial - Initial parameters or guess.
ftol - Relative convergence test based on function value. 0 disables the test. 0 ≤ ftol < 1. Try 1e-12.
gtol - Absolute convergence test based on the gradient. 0 disables the test. 0 ≤ gtol. Try 1e-12.
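For intuition, the two tolerances control the standard relative F-test and absolute G-test sketched below; this is a generic formulation, not necessarily the exact expression DDogleg evaluates.

import org.ejml.data.DMatrixRMaj;
import org.ejml.dense.row.CommonOps_DDRM;

/** Generic sketch of the stopping tests the tolerances control. */
static boolean converged( double ftol, double gtol, double fPrev, double fCurr, DMatrixRMaj gradient ) {
    boolean ftest = ftol * Math.abs(fCurr) >= Math.abs(fPrev - fCurr); // relative change in cost
    boolean gtest = gtol >= CommonOps_DDRM.elementMaxAbs(gradient);    // largest gradient component
    return ftest || gtest;
}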
-
initialize
public void initialize(double[] initial, int numberOfParameters, double minimumFunctionValue)
Override parent to initialize matrices
Overrides:
initialize in class TrustRegionBase_F64<DMatrixRMaj,HessianBFGS>
Parameters:
initial - Initial parameter state
numberOfParameters - Number of parameters being optimized.
minimumFunctionValue - The minimum possible value that the function can output
-
cost
protected double cost(DMatrixRMaj x)
Description copied from class: TrustRegionBase_F64
Computes the function's value at x
Specified by:
cost in class TrustRegionBase_F64<DMatrixRMaj,HessianBFGS>
Parameters:
x - parameters
Returns:
function value
-
functionGradientHessian
protected void functionGradientHessian(DMatrixRMaj x, boolean sameStateAsCost, DMatrixRMaj gradient, HessianBFGS hessian)
Description copied from class: GaussNewtonBase_F64
Computes the gradient and Hessian at 'x'. If sameStateAsCost is true then it can be assumed that 'x' has not changed since the cost was last computed.
Specified by:
functionGradientHessian in class GaussNewtonBase_F64<ConfigTrustRegion,HessianBFGS>
Parameters:
x - (Input) parameters
sameStateAsCost - (Input) If true then when the cost (or residuals) was last called it had the same value of x
gradient - (Output) gradient
hessian - (Output) hessian
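In this class the Hessian argument is a BFGS approximation rather than an exact second derivative. For reference, the classic BFGS rank-two update has the form sketched below using EJML; this is a generic illustration, not the library's actual HessianBFGS code, and it omits the curvature safeguard (y^T s > 0) that wolfeCondition(DMatrixRMaj, DMatrixRMaj, DMatrixRMaj) provides.

import org.ejml.data.DMatrixRMaj;
import org.ejml.dense.row.CommonOps_DDRM;

// B <- B - (B s)(B s)^T / (s^T B s) + y y^T / (y^T s)
// where s = x_new - x_old and y = g_new - g_old.
static void bfgsUpdate( DMatrixRMaj B, DMatrixRMaj s, DMatrixRMaj y ) {
    DMatrixRMaj Bs = new DMatrixRMaj(B.numRows, 1);
    CommonOps_DDRM.mult(B, s, Bs);
    double sBs = CommonOps_DDRM.dot(s, Bs);
    double ys  = CommonOps_DDRM.dot(y, s);
    CommonOps_DDRM.multAddTransB(-1.0/sBs, Bs, Bs, B); // subtract (B s)(B s)^T / (s^T B s)
    CommonOps_DDRM.multAddTransB( 1.0/ys,  y,  y,  B); // add y y^T / (y^T s)
}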
-
wolfeCondition
protected boolean wolfeCondition(DMatrixRMaj s, DMatrixRMaj y, DMatrixRMaj g_k)
Indicates if there's sufficient decrease and curvature. If the Wolfe condition is met then the Hessian will be positive definite.
Parameters:
s - change in state (new - old)
y - change in gradient (new - old)
g_k - Gradient at step k.
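For context, the standard Wolfe conditions can be written purely in terms of s, y, g_k and the old and new function values (the f_prev field suggests the class tracks the latter internally). The sketch below is a generic formulation with conventional constants, not this class's actual implementation. With 0 < c1 < c2 < 1, the curvature test forces y^T s > 0, which is what keeps the BFGS Hessian update positive definite.

import org.ejml.data.DMatrixRMaj;
import org.ejml.dense.row.CommonOps_DDRM;

static boolean wolfe( DMatrixRMaj s, DMatrixRMaj y, DMatrixRMaj g_k, double fOld, double fNew ) {
    double c1 = 1e-4, c2 = 0.9;                            // conventional choices, assumed
    double gs = CommonOps_DDRM.dot(g_k, s);                // directional derivative g_k^T s
    boolean sufficientDecrease = fNew - fOld <= c1*gs;     // Armijo / sufficient decrease
    // Curvature: g_new^T s >= c2 * g_k^T s, rewritten using y = g_new - g_k
    boolean curvature = CommonOps_DDRM.dot(y, s) >= (c2 - 1.0)*gs;
    return sufficientDecrease && curvature;
}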
-
getParameters
public double[] getParameters()
Description copied from interface: UnconstrainedMinimization
After each iteration this function can be called to get the current best set of parameters.
Specified by:
getParameters in interface UnconstrainedMinimization
-
getFunctionValue
public double getFunctionValue()
Description copied from interface: UnconstrainedMinimization
Returns the value of the objective function evaluated at the current parameters. If not supported then an exception is thrown.
Specified by:
getFunctionValue in interface UnconstrainedMinimization
Returns:
Objective function's value.
-
isUpdated
public boolean isUpdated()
Description copied from interface: IterativeOptimization
True if the parameter(s) being optimized have been updated
Specified by:
isUpdated in interface IterativeOptimization
Returns:
True if parameters have been updated
-
isConverged
public boolean isConverged()
Description copied from interface: IterativeOptimization
Indicates if iteration stopped due to convergence or not.
Specified by:
isConverged in interface IterativeOptimization
Returns:
True if iteration stopped because it converged.
-