Uses of Interface org.ddogleg.optimization.functions.FunctionNtoN

Packages that use FunctionNtoN
  org.ddogleg.optimization
  org.ddogleg.optimization.derivative
  org.ddogleg.optimization.trustregion
  org.ddogleg.optimization.wrap
Uses of FunctionNtoN in org.ddogleg.optimization

Methods in org.ddogleg.optimization that return FunctionNtoN
  static FunctionNtoN  FactoryNumericalDerivative.gradientForwards(FunctionNtoS func)

Methods in org.ddogleg.optimization with parameters of type FunctionNtoN
  static boolean  DerivativeChecker.gradient(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol)
      Compares the passed in gradient function to a numerical calculation.
  static boolean  DerivativeChecker.gradient(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol, double differenceScale)
  static boolean  DerivativeChecker.gradientR(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol)
      Compares the passed in gradient function to a numerical calculation.
  static boolean  DerivativeChecker.gradientR(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol, double differenceScale)
  void  UnconstrainedMinimization.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
      Specifies the function being optimized.
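The methods above pair a scalar objective (FunctionNtoS) with its gradient (FunctionNtoN), either checking a hand-written gradient or generating a numerical one. A minimal sketch of both uses follows; it assumes FunctionNtoS exposes getNumOfInputsN()/process(double[]) and FunctionNtoN exposes getN()/process(double[], double[]), method names taken from the wider DDogleg API rather than this page.

import org.ddogleg.optimization.DerivativeChecker;
import org.ddogleg.optimization.FactoryNumericalDerivative;
import org.ddogleg.optimization.functions.FunctionNtoN;
import org.ddogleg.optimization.functions.FunctionNtoS;

public class GradientCheckSketch {
    public static void main(String[] args) {
        // f(x,y) = x^2 + 3*y^2, a scalar function of two variables
        FunctionNtoS func = new FunctionNtoS() {
            @Override public int getNumOfInputsN() { return 2; }
            @Override public double process(double[] p) { return p[0]*p[0] + 3*p[1]*p[1]; }
        };

        // Analytic gradient: (df/dx, df/dy) = (2x, 6y)
        FunctionNtoN gradient = new FunctionNtoN() {
            @Override public int getN() { return 2; }
            @Override public void process(double[] p, double[] out) {
                out[0] = 2*p[0];
                out[1] = 6*p[1];
            }
        };

        // Compare the analytic gradient against a numerical one at (1,-2)
        double[] param = {1, -2};
        System.out.println("gradient matches? " + DerivativeChecker.gradient(func, gradient, param, 1e-6));

        // Or let the library supply a forward-difference gradient instead
        FunctionNtoN numerical = FactoryNumericalDerivative.gradientForwards(func);
        double[] g = new double[2];
        numerical.process(param, g);
        System.out.println("numerical gradient = " + g[0] + ", " + g[1]);
    }
}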
Uses of FunctionNtoN in org.ddogleg.optimization.derivative

Classes in org.ddogleg.optimization.derivative that implement FunctionNtoN
  class
      Finite difference numerical gradient calculation using the forward+backwards equation.
  class
      Finite difference numerical gradient calculation using forward equation.
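Both classes build the gradient from function evaluations alone. The forward-difference rule they are based on, g_i(x) ~ (f(x + h*e_i) - f(x)) / h, can be hand-rolled as below. This is an illustrative sketch with a fixed step size h, not the library implementation, and the interface method names are again assumptions from the wider API; in practice FactoryNumericalDerivative.gradientForwards() returns a ready-made instance.

import org.ddogleg.optimization.functions.FunctionNtoN;
import org.ddogleg.optimization.functions.FunctionNtoS;

// Hand-rolled forward-difference gradient: perturb one variable at a time
public class ForwardDifferenceGradient implements FunctionNtoN {
    final FunctionNtoS func; // scalar function being differentiated
    final double h = 1e-8;   // fixed step size (assumption; the library selects its own)

    public ForwardDifferenceGradient(FunctionNtoS func) { this.func = func; }

    @Override public int getN() { return func.getNumOfInputsN(); }

    @Override public void process(double[] input, double[] output) {
        double f0 = func.process(input);
        for (int i = 0; i < input.length; i++) {
            double original = input[i];
            input[i] = original + h;                    // forward step along e_i
            output[i] = (func.process(input) - f0) / h; // (f(x + h*e_i) - f(x)) / h
            input[i] = original;                        // restore the input
        }
    }
}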
Uses of FunctionNtoN in org.ddogleg.optimization.trustregion

Methods in org.ddogleg.optimization.trustregion with parameters of type FunctionNtoN
  void  UnconMinTrustRegionBFGS_F64.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
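UnconMinTrustRegionBFGS_F64 receives the gradient through the same setFunction signature as the UnconstrainedMinimization interface above. The sketch below shows that hand-off; createTrustRegionBfgs() is a hypothetical placeholder for however the application obtains the concrete minimizer, and initialize()/iterate()/getParameters() are assumed from the broader DDogleg unconstrained-minimization API rather than documented on this page.

import org.ddogleg.optimization.FactoryNumericalDerivative;
import org.ddogleg.optimization.UnconstrainedMinimization;
import org.ddogleg.optimization.functions.FunctionNtoN;
import org.ddogleg.optimization.functions.FunctionNtoS;

public class TrustRegionBfgsSketch {
    public static void main(String[] args) {
        // Rosenbrock function: minimum value 0 at (1,1)
        FunctionNtoS func = new FunctionNtoS() {
            @Override public int getNumOfInputsN() { return 2; }
            @Override public double process(double[] p) {
                double a = 1 - p[0];
                double b = p[1] - p[0]*p[0];
                return a*a + 100*b*b;
            }
        };

        // Numerical gradient; an analytic FunctionNtoN could be passed instead
        FunctionNtoN gradient = FactoryNumericalDerivative.gradientForwards(func);

        UnconstrainedMinimization minimizer = createTrustRegionBfgs();

        // minFunctionValue is the smallest value the objective can reach, 0 here
        minimizer.setFunction(func, gradient, 0);
        minimizer.initialize(new double[]{-1.2, 1}, 1e-12, 1e-12);
        for (int i = 0; i < 500 && !minimizer.iterate(); i++) {
            // iterate() is assumed to return true once the optimizer has converged
        }
        System.out.println(java.util.Arrays.toString(minimizer.getParameters()));
    }

    // Hypothetical placeholder: constructing the trust region minimizer is outside this page's scope
    static UnconstrainedMinimization createTrustRegionBfgs() {
        throw new UnsupportedOperationException("obtain the minimizer from the library's factory in real code");
    }
}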
Uses of FunctionNtoN in org.ddogleg.optimization.wrap

Classes in org.ddogleg.optimization.wrap that implement FunctionNtoN
  class  LsToNonLinearDeriv<S extends DMatrix>
      Convert the Jacobian of a least squares function into a nonlinear optimization gradient.

Fields in org.ddogleg.optimization.wrap declared as FunctionNtoN
  protected FunctionNtoN  CachedGradientLineFunction.gradient
  protected FunctionNtoN  CachedNumericalGradientLineFunction.gradient

Methods in org.ddogleg.optimization.wrap with parameters of type FunctionNtoN
  void  QuasiNewtonBFGS_to_UnconstrainedMinimization.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)

Constructors in org.ddogleg.optimization.wrap with parameters of type FunctionNtoN
  CachedGradientLineFunction(FunctionNtoS function, FunctionNtoN gradient)
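LsToNonLinearDeriv is the bridge from least-squares derivatives to a general gradient: for an objective of the usual least-squares form f(x) = 0.5 * sum_i r_i(x)^2, the gradient is g(x) = J(x)^T r(x), where J is the Jacobian of the residuals. The sketch below shows that conversion with plain arrays; the Residuals and Jacobian interfaces are hypothetical stand-ins for the library's least-squares function types and DMatrix, and the FunctionNtoN method names are assumed as before.

import org.ddogleg.optimization.functions.FunctionNtoN;

// Illustration of the Jacobian-to-gradient conversion, g = J^T r
public class JacobianToGradientSketch implements FunctionNtoN {
    // Hypothetical stand-ins for the least-squares residual and Jacobian functions
    public interface Residuals { void process(double[] x, double[] r); }   // r(x), length M
    public interface Jacobian  { void process(double[] x, double[][] J); } // J(x), M rows by N columns

    final Residuals residuals;
    final Jacobian jacobian;
    final int M, N;
    final double[] r;
    final double[][] J;

    public JacobianToGradientSketch(Residuals residuals, Jacobian jacobian, int M, int N) {
        this.residuals = residuals;
        this.jacobian = jacobian;
        this.M = M;
        this.N = N;
        this.r = new double[M];
        this.J = new double[M][N];
    }

    @Override public int getN() { return N; }

    @Override public void process(double[] input, double[] output) {
        residuals.process(input, r);
        jacobian.process(input, J);
        // g_j = sum_i J[i][j] * r[i], i.e. g = J^T r
        for (int j = 0; j < N; j++) {
            double sum = 0;
            for (int i = 0; i < M; i++)
                sum += J[i][j] * r[i];
            output[j] = sum;
        }
    }
}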