# getml.feature_learning.loss_functions

Loss functions used by the feature learning algorithms.

The getML Python API contains two different loss functions. We recommend using `SQUARELOSS` for regression problems and `CROSSENTROPYLOSS` for classification problems.

Please note that these loss functions are only used by the feature learning algorithms, not by the `predictors`.

## CrossEntropyLossType `module-attribute`

```
CrossEntropyLossType = Literal['CrossEntropyLoss']
```

Type of the cross entropy loss function.

## CROSSENTROPYLOSS `module-attribute`

```
CROSSENTROPYLOSS: Final[CrossEntropyLossType] = (
"CrossEntropyLoss"
)
```

The cross entropy between two probability distributions \(p(x)\) and \(q(x)\) is a combination of the information contained in \(p(x)\) and the additional information stored in \(q(x)\) with respect to \(p(x)\). In technical terms: it is the entropy of \(p(x)\) plus the Kullback-Leibler divergence - a measure of dissimilarity between probability distributions - from \(q(x)\) to \(p(x)\).

For discrete probability distributions the cross entropy loss can be calculated by

\[ H(p, q) = - \sum_{x \in X} p(x) \log q(x) \]

and for continuous probability distributions by

\[ H(p, q) = - \int_X p(x) \log q(x) \, dx \]

with \(X\) being the support of the samples and \(p(x)\) and \(q(x)\) being two discrete or continuous probability distributions over \(X\).
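As a minimal illustration of the discrete case, the following sketch (not part of the getML API) computes the cross entropy numerically and verifies that it decomposes into the entropy of \(p\) plus the Kullback-Leibler divergence:

```python
import math

def cross_entropy(p, q):
    """Discrete cross entropy: H(p, q) = -sum_x p(x) * log(q(x))."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    """Shannon entropy: H(p) = -sum_x p(x) * log(p(x))."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]  # true distribution
q = [0.9, 0.1]  # approximating distribution

# H(p, q) = H(p) + KL(p || q)
assert math.isclose(cross_entropy(p, q), entropy(p) + kl_divergence(p, q))
```

Note that the cross entropy is minimized when \(q(x)\) equals \(p(x)\), in which case it reduces to the entropy of \(p(x)\).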

## Note

Recommended loss function for classification problems.

## SquareLossType `module-attribute`

```
SquareLossType = Literal['SquareLoss']
```

Type of the square loss function.

## SQUARELOSS `module-attribute`

```
SQUARELOSS: Final[SquareLossType] = 'SquareLoss'
```

The square loss, also known as the mean squared error (MSE) or mean squared deviation (MSD), measures the loss by calculating the average of all squared deviations of the predictions \(\hat{y}\) from the observed (given) outcomes \(y\):

\[ L(y, \hat{y}) = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 \]

with \(n\) being the number of samples, \(y\) the observed outcome, and \(\hat{y}\) the estimate.
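The formula above can be sketched in plain Python (this helper is for illustration only and is not part of the getML API):

```python
def square_loss(y, y_hat):
    """Mean squared error: the average squared deviation of the
    predictions y_hat from the observed outcomes y."""
    n = len(y)
    return sum((yi - yhi) ** 2 for yi, yhi in zip(y, y_hat)) / n

# Perfect predictions give zero loss; deviations are penalized quadratically.
assert square_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) == 0.0
```

Because the deviations are squared, large errors dominate the loss, which makes the square loss sensitive to outliers in the target.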

## Note

Recommended loss function for regression problems.