Package org.ddogleg.optimization.loss
Interface LossFunction
- All Superinterfaces:
FunctionNtoS
- All Known Implementing Classes:
LossCauchy.Function, LossHuber.Function, LossHuberSmooth.Function, LossIdentity.Function, LossIRLS, LossSquared.Function, LossTukey.Function, LossWeighted
Residual loss function for regression. The standard squared loss can be sensitive to outliers. This interface allows robust loss functions, which are less sensitive to outliers, to be used instead. Most implementations will attempt to behave like squared error when an observation is not an outlier. Therefore, the following should be true: f(0) = 0.
All implementations should return a scalar which is ≥ 0. A value of zero would indicate no errors.
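
As a minimal sketch only (not part of the library documentation), the hypothetical class below shows how the contract above could be satisfied. It assumes that process(double[]), inherited from FunctionNtoS, receives the residual array and returns the scalar loss, and that fixate and getNumOfInputsN keep their default implementations. The class name PseudoHuberLoss and its delta parameter are illustrative assumptions.

import org.ddogleg.optimization.loss.LossFunction;

/**
 * Hypothetical pseudo-Huber style loss. Behaves roughly like squared error for
 * small residuals and grows roughly linearly for large ones, damping outliers.
 */
public class PseudoHuberLoss implements LossFunction {
	// Transition point between quadratic and linear behavior (assumed parameter)
	private final double delta;
	// Number of residual elements, set by the optimizer
	private int numberOfFunctions;

	public PseudoHuberLoss( double delta ) {
		this.delta = delta;
	}

	@Override public void setNumberOfFunctions( int value ) {
		this.numberOfFunctions = value;
	}

	@Override public int getNumberOfFunctions() {
		return numberOfFunctions;
	}

	/** Sums the per-residual loss. Result is always >= 0 and is 0 when every residual is 0. */
	@Override public double process( double[] residuals ) {
		double sum = 0;
		for (int i = 0; i < numberOfFunctions; i++) {
			double r = residuals[i]/delta;
			// delta^2*(sqrt(1 + (residual/delta)^2) - 1), which is >= 0 and is 0 at residual = 0
			sum += delta*delta*(Math.sqrt(1.0 + r*r) - 1.0);
		}
		return sum;
	}
}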
Method Summary
Modifier and Type   Method                            Description
default boolean     fixate(double[] residuals)        Passes in the current residuals at the start of an iteration.
int                 getNumberOfFunctions()            Number of elements in the residual.
default int         getNumOfInputsN()                 The number of inputs.
void                setNumberOfFunctions(int value)

Methods inherited from interface org.ddogleg.optimization.functions.FunctionNtoS
process
Method Details
fixate

default boolean fixate(double[] residuals)

Passes in the current residuals at the start of an iteration. If the loss function is computed dynamically and depends on the residuals, this is where that should be done.
- Returns:
- true if the loss function has changed and the cost needs to be recomputed.
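
To illustrate when overriding fixate matters, here is a sketch of a loss whose internal scale is re-estimated from the residuals at the start of each iteration, in the spirit of dynamically computed losses such as LossIRLS. The class name AdaptiveScaleLoss and the scale-estimation rule are hypothetical assumptions, not part of the library.

import org.ddogleg.optimization.loss.LossFunction;

/**
 * Hypothetical loss that re-estimates its scale from the residuals each iteration,
 * which is why it overrides fixate().
 */
public class AdaptiveScaleLoss implements LossFunction {
	private int numberOfFunctions;
	// Scale estimated from the residuals. Updated in fixate().
	private double scale = 1.0;

	@Override public void setNumberOfFunctions( int value ) {
		this.numberOfFunctions = value;
	}

	@Override public int getNumberOfFunctions() {
		return numberOfFunctions;
	}

	/** Re-estimates the scale as the mean absolute residual. */
	@Override public boolean fixate( double[] residuals ) {
		double sum = 0;
		for (int i = 0; i < numberOfFunctions; i++) {
			sum += Math.abs(residuals[i]);
		}
		double updated = Math.max(1e-12, sum/Math.max(1, numberOfFunctions));
		boolean changed = updated != scale;
		scale = updated;
		// Tell the optimizer that the cost must be recomputed if the loss changed
		return changed;
	}

	/** Squared error normalized by the current scale. Equals 0 when all residuals are 0. */
	@Override public double process( double[] residuals ) {
		double sum = 0;
		for (int i = 0; i < numberOfFunctions; i++) {
			double r = residuals[i]/scale;
			sum += 0.5*r*r;
		}
		return sum;
	}
}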
getNumberOfFunctions

int getNumberOfFunctions()

Number of elements in the residual.

setNumberOfFunctions

void setNumberOfFunctions(int value)
getNumOfInputsN

default int getNumOfInputsN()

Description copied from interface: FunctionNtoS
The number of inputs. Typically the parameters you are optimizing.
- Specified by:
- getNumOfInputsN in interface FunctionNtoS
- Returns:
- Number of inputs.