Uses of Interface
org.ddogleg.optimization.functions.FunctionNtoS
Packages that use FunctionNtoS

  org.ddogleg.optimization
  org.ddogleg.optimization.derivative
  org.ddogleg.optimization.loss
  org.ddogleg.optimization.trustregion
  org.ddogleg.optimization.wrap
Uses of FunctionNtoS in org.ddogleg.optimization
Methods in org.ddogleg.optimization with parameters of type FunctionNtoS

  static boolean DerivativeChecker.gradient(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol)
      Compares the passed in gradient function to a numerical calculation.
  static boolean DerivativeChecker.gradient(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol, double differenceScale)
  static FunctionNtoN FactoryNumericalDerivative.gradientForwards(FunctionNtoS func)
  static boolean DerivativeChecker.gradientR(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol)
      Compares the passed in gradient function to a numerical calculation.
  static boolean DerivativeChecker.gradientR(FunctionNtoS func, FunctionNtoN gradient, double[] param, double tol, double differenceScale)
  void UnconstrainedMinimization.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
      Specifies the function being optimized.
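A minimal sketch of how DerivativeChecker.gradient can validate an analytic gradient against the checker's internal numerical approximation. The test function f(x,y) = x^2 + 3y^2 and its gradient are illustrative, not part of the library, and the FunctionNtoS/FunctionNtoN method names (getNumOfInputsN, getN, process) are assumed from the standard DDogleg interfaces:

```java
import org.ddogleg.optimization.DerivativeChecker;
import org.ddogleg.optimization.functions.FunctionNtoN;
import org.ddogleg.optimization.functions.FunctionNtoS;

public class GradientCheckExample {
    // f(x,y) = x^2 + 3y^2, an illustrative scalar function of two inputs
    static final FunctionNtoS FUNC = new FunctionNtoS() {
        @Override public int getNumOfInputsN() { return 2; }
        @Override public double process(double[] input) {
            return input[0]*input[0] + 3.0*input[1]*input[1];
        }
    };

    // Its analytic gradient: (2x, 6y)
    static final FunctionNtoN GRADIENT = new FunctionNtoN() {
        @Override public int getN() { return 2; }
        @Override public void process(double[] input, double[] output) {
            output[0] = 2.0*input[0];
            output[1] = 6.0*input[1];
        }
    };

    /** Returns true if the analytic gradient agrees with a numerical calculation. */
    public static boolean check() {
        return DerivativeChecker.gradient(FUNC, GRADIENT, new double[]{1.5, -0.7}, 1e-4);
    }

    public static void main(String[] args) {
        System.out.println("gradient matches: " + check());
    }
}
```

The gradientR variants listed above follow the same calling pattern; per the description they differ only in how the comparison is performed.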
Uses of FunctionNtoS in org.ddogleg.optimization.derivative
Constructors in org.ddogleg.optimization.derivative with parameters of type FunctionNtoS

  NumericalGradientFB(FunctionNtoS function)
  NumericalGradientFB(FunctionNtoS function, double differenceScale)
  NumericalGradientForward(FunctionNtoS function)
  NumericalGradientForward(FunctionNtoS function, double differenceScale)
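These constructors wrap a scalar function so its gradient can be approximated numerically when no analytic gradient is available. A sketch using the single-argument NumericalGradientForward constructor listed above; the sample function is illustrative and the FunctionNtoS/FunctionNtoN method names are assumed from the standard DDogleg interfaces:

```java
import org.ddogleg.optimization.derivative.NumericalGradientForward;
import org.ddogleg.optimization.functions.FunctionNtoN;
import org.ddogleg.optimization.functions.FunctionNtoS;

public class NumericalGradientExample {
    /** Forward-difference gradient of f(x,y) = sin(x) + y^2 at the given point. */
    public static double[] gradientAt(double x, double y) {
        FunctionNtoS func = new FunctionNtoS() {
            @Override public int getNumOfInputsN() { return 2; }
            @Override public double process(double[] input) {
                return Math.sin(input[0]) + input[1]*input[1];
            }
        };
        // Wrap the scalar function; the default difference scale is used
        FunctionNtoN gradient = new NumericalGradientForward(func);
        double[] g = new double[2];
        gradient.process(new double[]{x, y}, g);
        return g;
    }

    public static void main(String[] args) {
        double[] g = gradientAt(0.3, 2.0);
        // Analytically the gradient is (cos(0.3), 4.0), roughly (0.9553, 4.0)
        System.out.printf("grad ~ (%.4f, %.4f)%n", g[0], g[1]);
    }
}
```

NumericalGradientFB takes the same arguments but uses a forward-backward (central) difference, trading extra function evaluations for accuracy.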
Uses of FunctionNtoS in org.ddogleg.optimization.loss
Subinterfaces of FunctionNtoS in org.ddogleg.optimization.loss

  interface
      Residual loss function for regression.

Classes in org.ddogleg.optimization.loss that implement FunctionNtoS

  static class
      Implementation of the smooth Cauchy loss function
  static class
      Implementation of the Huber loss function
  static class
      Implementation of the smooth Huber loss function
  static class
  class
      Iteratively Reweighted Least-Squares (IRLS) allows the weights to be recomputed every iteration.
  static class
  static class
      Implementation of the Tukey loss function
  class
      A weighted least squares cost function.
Uses of FunctionNtoS in org.ddogleg.optimization.trustregion
Methods in org.ddogleg.optimization.trustregion with parameters of type FunctionNtoS

  void UnconMinTrustRegionBFGS_F64.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)
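In typical use a minimizer such as UnconMinTrustRegionBFGS_F64 is obtained through a factory and driven through the UnconstrainedMinimization interface rather than constructed directly. The sketch below assumes FactoryOptimization.quasiNewtonBfgs(null) returns an UnconstrainedMinimization and that UtilOptimize.process runs the iteration loop; the quadratic test function is illustrative:

```java
import org.ddogleg.optimization.FactoryOptimization;
import org.ddogleg.optimization.UnconstrainedMinimization;
import org.ddogleg.optimization.UtilOptimize;
import org.ddogleg.optimization.derivative.NumericalGradientForward;
import org.ddogleg.optimization.functions.FunctionNtoS;

public class MinimizeExample {
    /** Minimizes f(x,y) = (x-1)^2 + (y+2)^2; the true minimum is at (1, -2). */
    public static double[] solve() {
        FunctionNtoS func = new FunctionNtoS() {
            @Override public int getNumOfInputsN() { return 2; }
            @Override public double process(double[] p) {
                double a = p[0] - 1.0, b = p[1] + 2.0;
                return a*a + b*b;
            }
        };
        // Assumed factory call; a configuration object may be passed instead of null
        UnconstrainedMinimization minimizer = FactoryOptimization.quasiNewtonBfgs(null);
        // Third argument of setFunction is a known lower bound on the function's value
        minimizer.setFunction(func, new NumericalGradientForward(func), 0);
        minimizer.initialize(new double[]{5, 5}, 1e-12, 1e-12);
        UtilOptimize.process(minimizer, 500);  // iterate until converged or 500 iterations
        return minimizer.getParameters();
    }

    public static void main(String[] args) {
        double[] found = solve();
        System.out.printf("minimum at (%.3f, %.3f)%n", found[0], found[1]);
    }
}
```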
Uses of FunctionNtoS in org.ddogleg.optimization.wrap
Classes in org.ddogleg.optimization.wrap that implement FunctionNtoS

  class
      Converts a least squares function into a nonlinear optimization function.

Fields in org.ddogleg.optimization.wrap declared as FunctionNtoS

  protected FunctionNtoS CachedGradientLineFunction.function
  protected FunctionNtoS CachedNumericalGradientLineFunction.function

Methods in org.ddogleg.optimization.wrap with parameters of type FunctionNtoS

  void QuasiNewtonBFGS_to_UnconstrainedMinimization.setFunction(FunctionNtoS function, FunctionNtoN gradient, double minFunctionValue)

Constructors in org.ddogleg.optimization.wrap with parameters of type FunctionNtoS

  CachedGradientLineFunction(FunctionNtoS function, FunctionNtoN gradient)