vathos.model.loss¶
The various loss functions that can be used with the model.
Note
All the loss functions take logits as input; internally they apply a sigmoid to the input, perform their operations, and then return the mean error.
Segmentation Loss¶
-
class
DiceLoss
[source]¶ Criterion that computes Sørensen-Dice Coefficient loss.
According to [1], we compute the Sørensen-Dice Coefficient as follows:
\[\text{Dice}(x, class) = \frac{2 |X \cap Y|}{|X| + |Y|}\]where:
- \(X\) expects to be the scores of each class.
- \(Y\) expects to be the one-hot tensor with the class labels.
The loss is finally computed as:
\[\text{loss}(x, class) = 1 - \text{Dice}(x, class)\][1] https://en.wikipedia.org/wiki/S%C3%B8rensen%E2%80%93Dice_coefficient
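Following the note above and the formula, the criterion can be sketched as a "soft" Dice loss computed on sigmoided logits. The function name and the `eps` smoothing term below are illustrative assumptions, not the library's actual implementation:

```python
import torch


def dice_loss(logits: torch.Tensor, target: torch.Tensor,
              eps: float = 1e-6) -> torch.Tensor:
    """Soft Dice loss; `target` is the binary / one-hot mask."""
    probs = torch.sigmoid(logits)              # logits -> probabilities
    intersection = (probs * target).sum()      # soft |X ∩ Y|
    cardinality = probs.sum() + target.sum()   # |X| + |Y|
    dice = (2.0 * intersection + eps) / (cardinality + eps)
    return 1.0 - dice
```

A perfect prediction (large positive logits where the mask is 1) drives the loss toward 0, while a completely wrong prediction drives it toward 1.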
-
class
BCEDiceLoss
[source]¶ Performs BCE and Dice Loss and adds them both
loss = bce_loss + 2 * dice_loss
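A minimal sketch of that combination, assuming a soft Dice term like the one described above (the function name and smoothing constant are illustrative):

```python
import torch
import torch.nn.functional as F


def bce_dice_loss(logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # BCE is computed directly on logits for numerical stability.
    bce = F.binary_cross_entropy_with_logits(logits, target)
    probs = torch.sigmoid(logits)
    intersection = (probs * target).sum()
    dice = (2.0 * intersection + 1e-6) / (probs.sum() + target.sum() + 1e-6)
    # loss = bce_loss + 2 * dice_loss
    return bce + 2.0 * (1.0 - dice)
```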
-
class
TverskyLoss
(alpha: float, beta: float)[source]¶ Performs Tversky Loss on Logits
According to [1], we compute the Tversky Coefficient as follows:
\[\text{S}(P, G; \alpha, \beta) = \frac{|PG|}{|PG| + \alpha |P \setminus G| + \beta |G \setminus P|}\]where:
- \(P\) and \(G\) are the predicted and ground-truth binary labels.
- \(\alpha\) and \(\beta\) control the magnitude of the penalties for false positives and false negatives, respectively.
Notes
- \(\alpha = \beta = 0.5\) => Dice coefficient
- \(\alpha = \beta = 1\) => Tanimoto coefficient
- \(\alpha + \beta = 1\) => \(F_\beta\) coefficient
- Reference:
- [1] https://kornia.readthedocs.io/en/latest/losses.html
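In terms of confusion-matrix counts, \(|PG|\) corresponds to true positives, \(|P \setminus G|\) to false positives, and \(|G \setminus P|\) to false negatives. A sketch under those definitions (names and the `eps` term are illustrative assumptions):

```python
import torch


def tversky_loss(logits: torch.Tensor, target: torch.Tensor,
                 alpha: float, beta: float, eps: float = 1e-6) -> torch.Tensor:
    probs = torch.sigmoid(logits)
    tp = (probs * target).sum()          # |PG|
    fp = (probs * (1 - target)).sum()    # |P \ G|, weighted by alpha
    fn = ((1 - probs) * target).sum()    # |G \ P|, weighted by beta
    tversky = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return 1.0 - tversky
```

Raising `alpha` penalizes false positives more heavily, raising `beta` penalizes false negatives, consistent with the notes above.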
Depth Loss¶
-
class
RMSELoss
(eps=1e-06)[source]¶ Performs RMSE Loss
The input is passed through a sigmoid, then through nn.MSELoss, and finally torch.sqrt is applied to the result.
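That description maps onto a small module like the following sketch (the exact structure is an assumption; `eps` guards the square root at zero error, matching the documented default of 1e-06):

```python
import torch
import torch.nn as nn


class RMSELoss(nn.Module):
    """RMSE on sigmoided logits: sqrt(MSE(sigmoid(x), y) + eps)."""

    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.mse = nn.MSELoss()
        self.eps = eps

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return torch.sqrt(self.mse(torch.sigmoid(logits), target) + self.eps)
```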
-
class
BerHuLoss
(threshold: float = 0.2)[source]¶ Implementation of the BerHu Loss from [1]
\[ \begin{align}\begin{aligned}B(y, y') &= \frac{1}{n} |y' - y| && \text{if } |y' - y| \le c\\B(y, y') &= \frac{1}{n} \cdot \frac{(y' - y)^2 + c^2}{2c} && \text{otherwise}\\c &= \frac{1}{5} \max(|y' - y|)\end{aligned}\end{align} \]
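The `threshold` parameter appears to play the role of the \(1/5\) factor, i.e. \(c = \text{threshold} \cdot \max|y' - y|\). A sketch under that assumption, operating on predictions already in the target's range (per the module note, the raw logits would be sigmoided first):

```python
import torch


def berhu_loss(pred: torch.Tensor, target: torch.Tensor,
               threshold: float = 0.2) -> torch.Tensor:
    """Reverse Huber (BerHu): L1 below c, scaled L2 above it."""
    diff = torch.abs(pred - target)
    c = threshold * diff.max()                 # c = (1/5)·max|y'-y| at default
    l1 = diff                                  # linear region, |y'-y| <= c
    l2 = (diff ** 2 + c ** 2) / (2.0 * c)      # quadratic region, |y'-y| > c
    return torch.where(diff <= c, l1, l2).mean()
```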
-
class
GradLoss
[source]¶ Performs Gradient Loss
The image XY gradients are computed for the input and the target, and the mean L1 loss between these gradients is returned.
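One way to realize this is with forward finite differences along the height and width axes; the actual implementation may use a different gradient operator (e.g. Sobel), so the sketch below is an assumption:

```python
import torch
import torch.nn.functional as F


def image_gradients(img: torch.Tensor):
    """Forward differences along W (x) and H (y) for an (N, C, H, W) tensor."""
    gx = img[..., :, 1:] - img[..., :, :-1]
    gy = img[..., 1:, :] - img[..., :-1, :]
    return gx, gy


def grad_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    px, py = image_gradients(pred)
    tx, ty = image_gradients(target)
    # mean L1 distance between the two gradient fields
    return F.l1_loss(px, tx) + F.l1_loss(py, ty)
```

Identical images produce zero loss, and the loss grows with differences in edge structure rather than absolute intensity.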