Uses of Interface
org.ddogleg.optimization.functions.FunctionNtoN
Packages that use FunctionNtoN

- org.ddogleg.optimization
- org.ddogleg.optimization.derivative
- org.ddogleg.optimization.loss
- org.ddogleg.optimization.trustregion
- org.ddogleg.optimization.wrap
Uses of FunctionNtoN in org.ddogleg.optimization
Methods in org.ddogleg.optimization that return FunctionNtoN

- static FunctionNtoN FactoryNumericalDerivative.gradientForwards(FunctionNtoS func)

Methods in org.ddogleg.optimization with parameters of type FunctionNtoN

- static boolean DerivativeChecker.gradient(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol)
  Compares the passed in gradient function to a numerical calculation.
- static boolean DerivativeChecker.gradient(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol, double differenceScale)
- static boolean DerivativeChecker.gradientR(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol)
  Compares the passed in gradient function to a numerical calculation.
- static boolean DerivativeChecker.gradientR(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol, double differenceScale)
- void UnconstrainedMinimization.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
  Specifies the function being optimized.
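As a usage illustration, the sketch below checks a hand-written analytical gradient against a numerical one via DerivativeChecker. The DerivativeChecker.gradient signature is taken from the listing above; the FunctionNtoS/FunctionNtoN method names (getNumOfInputsN, getN, process) are assumed from DDogleg's interface contracts, so verify them against your version.

    import org.ddogleg.optimization.DerivativeChecker;
    import org.ddogleg.optimization.functions.FunctionNtoN;
    import org.ddogleg.optimization.functions.FunctionNtoS;

    public class GradientCheckExample {
        public static void main(String[] args) {
            // f(x) = x0^2 + 3*x1^2
            FunctionNtoS func = new FunctionNtoS() {
                @Override public int getNumOfInputsN() { return 2; }
                @Override public double process(double[] x) {
                    return x[0]*x[0] + 3.0*x[1]*x[1];
                }
            };

            // Analytical gradient: (2*x0, 6*x1)
            FunctionNtoN gradient = new FunctionNtoN() {
                @Override public int getN() { return 2; }
                @Override public void process(double[] x, double[] out) {
                    out[0] = 2.0*x[0];
                    out[1] = 6.0*x[1];
                }
            };

            // Compare the analytical gradient to a numerical calculation at (1, 2)
            boolean ok = DerivativeChecker.gradient(func, gradient, new double[]{1, 2}, 1e-6);
            System.out.println("Gradient matches numerical check: " + ok);
        }
    }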
Uses of FunctionNtoN in org.ddogleg.optimization.derivative
Classes in org.ddogleg.optimization.derivative that implement FunctionNtoN

- class: Finite difference numerical gradient calculation using the forward+backwards equation.
- class: Finite difference numerical gradient calculation using the forward equation.
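A numerical gradient of this kind is normally obtained through FactoryNumericalDerivative.gradientForwards, whose signature appears in the org.ddogleg.optimization section above. A minimal sketch, again assuming the interface method names used earlier:

    import org.ddogleg.optimization.FactoryNumericalDerivative;
    import org.ddogleg.optimization.functions.FunctionNtoN;
    import org.ddogleg.optimization.functions.FunctionNtoS;

    public class NumericalGradientExample {
        public static void main(String[] args) {
            // Rosenbrock function of two variables
            FunctionNtoS func = new FunctionNtoS() {
                @Override public int getNumOfInputsN() { return 2; }
                @Override public double process(double[] x) {
                    double a = 1.0 - x[0];
                    double b = x[1] - x[0]*x[0];
                    return a*a + 100.0*b*b;
                }
            };

            // Forward-difference numerical gradient, returned as a FunctionNtoN
            FunctionNtoN gradient = FactoryNumericalDerivative.gradientForwards(func);

            double[] out = new double[2];
            gradient.process(new double[]{-1.2, 1.0}, out);
            System.out.printf("numerical gradient = (%.4f, %.4f)%n", out[0], out[1]);
        }
    }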
Uses of FunctionNtoN in org.ddogleg.optimization.loss
Subinterfaces of FunctionNtoN in org.ddogleg.optimization.loss

- interface: Analytical gradient for a function that implements LossFunction.

Classes in org.ddogleg.optimization.loss that implement FunctionNtoN

- static class: Implementation of the smooth Cauchy loss gradient
- static class: Implementation of the Huber loss gradient
- static class: Implementation of the smooth Huber loss gradient
- static class
- class: Iteratively Reweighted Least-Squares (IRLS) allows the weights to be recomputed every iteration.
- static class
- static class: Implementation of the Tukey loss gradient
- class: A weighted least squares cost function.
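To illustrate how a robust-loss gradient can implement FunctionNtoN, here is a hypothetical Huber-style gradient over a residual vector. It is an illustrative sketch of the standard Huber derivative, not the DDogleg class referenced above, and the class name and constructor are invented for the example.

    import org.ddogleg.optimization.functions.FunctionNtoN;

    /** Illustrative Huber-style loss gradient; not the DDogleg implementation. */
    public class HuberGradientSketch implements FunctionNtoN {
        final int n;
        final double delta; // threshold between quadratic and linear regions

        public HuberGradientSketch(int n, double delta) {
            this.n = n;
            this.delta = delta;
        }

        @Override public int getN() { return n; }

        // input = residuals r, output = dL/dr for each element
        @Override public void process(double[] r, double[] out) {
            for (int i = 0; i < n; i++) {
                if (Math.abs(r[i]) <= delta)
                    out[i] = r[i];                      // quadratic region: gradient is r
                else
                    out[i] = delta*Math.signum(r[i]);   // linear region: clipped gradient
            }
        }
    }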
Uses of FunctionNtoN in org.ddogleg.optimization.trustregion
Methods in org.ddogleg.optimization.trustregion with parameters of type FunctionNtoN

- void UnconMinTrustRegionBFGS_F64.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
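The setFunction call above is how an unconstrained minimizer receives the FunctionNtoN gradient. A minimal end-to-end sketch follows; the FactoryOptimization.quasiNewtonBfgs call and the initialize(initial, ftol, gtol) signature are assumptions about the broader DDogleg API that this page does not show, so substitute whatever UnconstrainedMinimization implementation and setup your version provides.

    import org.ddogleg.optimization.FactoryOptimization;
    import org.ddogleg.optimization.UnconstrainedMinimization;
    import org.ddogleg.optimization.UtilOptimize;
    import org.ddogleg.optimization.functions.FunctionNtoN;
    import org.ddogleg.optimization.functions.FunctionNtoS;

    public class MinimizationExample {
        public static void main(String[] args) {
            // f(x) = x0^2 + 3*x1^2, minimized at the origin
            FunctionNtoS func = new FunctionNtoS() {
                @Override public int getNumOfInputsN() { return 2; }
                @Override public double process(double[] x) {
                    return x[0]*x[0] + 3.0*x[1]*x[1];
                }
            };
            FunctionNtoN gradient = new FunctionNtoN() {
                @Override public int getN() { return 2; }
                @Override public void process(double[] x, double[] out) {
                    out[0] = 2.0*x[0];
                    out[1] = 6.0*x[1];
                }
            };

            // Assumed factory call; any UnconstrainedMinimization works here
            UnconstrainedMinimization minimizer = FactoryOptimization.quasiNewtonBfgs(null);

            // minFunctionValue = 0 since the function is non-negative
            minimizer.setFunction(func, gradient, 0);
            minimizer.initialize(new double[]{5, -3}, 1e-12, 1e-12);

            UtilOptimize.process(minimizer, 500);
            double[] found = minimizer.getParameters();
            System.out.printf("minimum found at (%.6f, %.6f)%n", found[0], found[1]);
        }
    }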
Uses of FunctionNtoN in org.ddogleg.optimization.wrap
Classes in org.ddogleg.optimization.wrap that implement FunctionNtoN

- class LsToNonLinearDeriv<S extends DMatrix>
  Convert the Jacobian of a least squares function into a nonlinear optimization gradient.

Fields in org.ddogleg.optimization.wrap declared as FunctionNtoN

- protected FunctionNtoN CachedGradientLineFunction.gradient
- protected FunctionNtoN CachedNumericalGradientLineFunction.gradient

Methods in org.ddogleg.optimization.wrap with parameters of type FunctionNtoN

- void QuasiNewtonBFGS_to_UnconstrainedMinimization.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)

Constructors in org.ddogleg.optimization.wrap with parameters of type FunctionNtoN

- CachedGradientLineFunction(FunctionNtoS function, FunctionNtoN gradient)
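The conversion LsToNonLinearDeriv performs can be understood from the identity grad F(x) = J(x)^T f(x) for the cost F(x) = 0.5*||f(x)||^2. The hand-rolled sketch below computes that product as a FunctionNtoN; it uses plain arrays and java.util.function types instead of DDogleg's matrix interfaces, and the class is purely illustrative, not the library's implementation.

    import java.util.function.Function;
    import org.ddogleg.optimization.functions.FunctionNtoN;

    /** Illustrative J^T * f gradient for F(x) = 0.5*||f(x)||^2. */
    public class JacobianToGradientSketch implements FunctionNtoN {
        final int n, m;
        final Function<double[], double[]> residuals;   // f : R^n -> R^m
        final Function<double[], double[][]> jacobian;  // J : R^n -> R^(m x n)

        public JacobianToGradientSketch(int n, int m,
                                        Function<double[], double[]> residuals,
                                        Function<double[], double[][]> jacobian) {
            this.n = n; this.m = m;
            this.residuals = residuals;
            this.jacobian = jacobian;
        }

        @Override public int getN() { return n; }

        @Override public void process(double[] x, double[] out) {
            double[] f = residuals.apply(x);
            double[][] J = jacobian.apply(x);
            // gradient = J^T * f
            for (int j = 0; j < n; j++) {
                double sum = 0;
                for (int i = 0; i < m; i++)
                    sum += J[i][j]*f[i];
                out[j] = sum;
            }
        }
    }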