Loss Functions Explained
Kullback-Leibler Divergence (KLD)
KLD measures how one probability distribution diverges from a second, reference distribution. It is used in tasks such as training variational autoencoders.
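A minimal NumPy sketch of discrete KL divergence; the function name and the eps smoothing term are illustrative choices, not part of the card:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Compute KL(P || Q) for two discrete distributions that each sum to 1."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # eps guards against log(0) and division by zero
    return np.sum(p * np.log((p + eps) / (q + eps)))

# Example: divergence of one categorical distribution from another
print(kl_divergence([0.7, 0.2, 0.1], [0.5, 0.3, 0.2]))  # ~0.09
```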
Cross-Entropy Loss
Cross-Entropy quantifies the difference between two probability distributions for a given random variable or set of events, widely used in classification tasks.
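A minimal sketch for a single one-hot example, assuming the predictions are already probabilities (e.g., softmax outputs); the clipping constant is an illustrative safeguard:

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross-entropy between a one-hot target and predicted class probabilities."""
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0)  # avoid log(0)
    return -np.sum(np.asarray(y_true) * np.log(y_pred))

# Example: true class is index 0
print(cross_entropy([1, 0, 0], [0.8, 0.1, 0.1]))  # ~0.22
```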
Mean Squared Error (MSE)
MSE measures the average squared difference between the estimated values and the actual values, and is commonly used in regression problems.
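A one-line NumPy sketch of the same formula (function name is illustrative):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean of the squared residuals between targets and predictions."""
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return np.mean(err ** 2)

print(mse([3.0, 5.0, 2.5], [2.5, 5.0, 3.0]))  # ~0.167
```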
Huber Loss
Huber Loss is less sensitive to outliers in data than MSE and combines the properties of MSE and MAE: it is quadratic for small errors and linear for large errors.
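A minimal sketch of the piecewise definition; delta=1.0 is a common default, not something the card specifies:

```python
import numpy as np

def huber(y_true, y_pred, delta=1.0):
    """Quadratic for |error| <= delta, linear beyond it."""
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    quadratic = 0.5 * err ** 2                       # MSE-like region
    linear = delta * (np.abs(err) - 0.5 * delta)     # MAE-like region
    return np.mean(np.where(np.abs(err) <= delta, quadratic, linear))
```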
Poisson Loss
Poisson Loss is used when modeling count data in regression problems. It assumes the target values, typically counts, are Poisson distributed, and the loss is the negative log-likelihood of that distribution.
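A minimal sketch of that negative log-likelihood, dropping the constant log(y!) term as common library implementations do; the eps term is an illustrative safeguard:

```python
import numpy as np

def poisson_loss(y_true, y_pred, eps=1e-12):
    """Mean of y_pred - y_true * log(y_pred): the Poisson NLL up to a constant."""
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(y_pred - np.asarray(y_true) * np.log(y_pred + eps))
```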
Softmax Loss
Softmax Loss, or Multinomial Logistic Loss, extends Logistic Loss to multi-class classification problems. It applies the softmax function to the raw output scores, which are then used in the Cross-Entropy Loss.
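A minimal sketch of the two steps the card describes, for one example with an integer class label (function names are illustrative):

```python
import numpy as np

def softmax(z):
    z = np.asarray(z, dtype=float)
    z = z - np.max(z)          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def softmax_loss(logits, true_class):
    """Cross-entropy of the softmax probabilities against the true class index."""
    probs = softmax(logits)
    return -np.log(probs[true_class])

print(softmax_loss([2.0, 1.0, 0.1], 0))  # ~0.42
```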
Hinge Loss
Hinge Loss is often used for binary classification tasks, most notably with Support Vector Machines. It is defined as the maximum of zero and one minus the product of the true label and the predicted score.
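A minimal sketch of that max(0, 1 - y * score) formula, assuming labels in {-1, +1}:

```python
import numpy as np

def hinge_loss(y_true, scores):
    """Mean of max(0, 1 - y * score); labels y are expected in {-1, +1}."""
    y = np.asarray(y_true, dtype=float)
    s = np.asarray(scores, dtype=float)
    return np.mean(np.maximum(0.0, 1.0 - y * s))

print(hinge_loss([1, -1, 1], [0.8, -2.0, -0.5]))  # (0.2 + 0 + 1.5) / 3
```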
Logistic Loss
Also known as Binary Cross-Entropy Loss, it is used for binary classification problems. It measures the loss between predicted probabilities and the actual binary labels.
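A minimal sketch with 0/1 labels and predicted probabilities; the clipping constant is an illustrative safeguard against log(0):

```python
import numpy as np

def binary_cross_entropy(y_true, y_prob, eps=1e-12):
    """Mean negative log-likelihood of Bernoulli labels under predicted probabilities."""
    p = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```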
Mean Absolute Error (MAE)
MAE measures the average of the absolute errors (i.e., the absolute differences between the predictions and the actual observations), commonly used in regression.
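A one-line NumPy sketch of the same formula (function name is illustrative):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean of the absolute residuals between targets and predictions."""
    return np.mean(np.abs(np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)))

print(mae([3.0, 5.0, 2.5], [2.5, 5.0, 3.0]))  # ~0.333
```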
Focal Loss
Focal Loss modifies Cross-Entropy Loss to focus more on hard-to-classify examples; it is particularly useful for imbalanced datasets, as in object detection.
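A minimal sketch of the binary form; gamma=2.0 and alpha=0.25 are the commonly cited defaults rather than values from the card:

```python
import numpy as np

def focal_loss(y_true, y_prob, gamma=2.0, alpha=0.25, eps=1e-12):
    """Binary focal loss: the (1 - p_t)^gamma factor down-weights easy examples."""
    p = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    p_t = np.where(y == 1, p, 1 - p)               # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)   # class-balancing weight
    return np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t))
```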