Common Loss Functions

10 flashcards

Negative Log-Likelihood Loss

NLL = -\log(\hat{p}_{y_i})
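
A minimal NumPy sketch of the per-sample negative log-likelihood, assuming probs holds predicted class probabilities (one row per sample) and y_true holds integer class indices; both names are illustrative placeholders.

import numpy as np

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])  # predicted class probabilities per sample
y_true = np.array([0, 1])            # true class indices

# probability assigned to each sample's true class, then take -log
nll = -np.log(probs[np.arange(len(y_true)), y_true])
print(nll)         # per-sample NLL
print(nll.mean())  # batch average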

Categorical Cross-Entropy Loss

CCE = -\sum_{i=1}^N \sum_{c=1}^M y_{ic} \log(p_{ic})
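
A sketch of categorical cross-entropy over a small batch, assuming one-hot targets Y and predicted probabilities P (hypothetical names); the double sum over samples and classes mirrors the formula above.

import numpy as np

Y = np.array([[1, 0, 0],
              [0, 1, 0]])        # one-hot targets y_ic
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])  # predicted probabilities p_ic

eps = 1e-12  # guard against log(0)
cce = -np.sum(Y * np.log(P + eps))
print(cce)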

Mean Squared Error (MSE)

MSE = \frac{1}{N} \sum_{i=1}^N (y_i - \hat{y}_i)^2
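
A minimal sketch of MSE, assuming y_true and y_pred are placeholder arrays of targets and predictions.

import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

mse = np.mean((y_true - y_pred) ** 2)  # average of squared errors
print(mse)  # 0.375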

Mean Absolute Error (MAE)

MAE = \frac{1}{N} \sum_{i=1}^N |y_i - \hat{y}_i|
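
The same placeholder arrays illustrate MAE; only the squared error is swapped for the absolute error.

import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

mae = np.mean(np.abs(y_true - y_pred))  # average of absolute errors
print(mae)  # 0.5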

Huber Loss

L_{\delta}(y, \hat{y}) = \begin{cases} \frac{1}{2}(y - \hat{y})^2 & \text{for } |y - \hat{y}| \le \delta, \\ \delta |y - \hat{y}| - \frac{1}{2}\delta^2 & \text{otherwise} \end{cases}
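
A sketch of the piecewise Huber loss using a hypothetical helper huber(); errors within delta are penalised quadratically, larger ones linearly.

import numpy as np

def huber(y_true, y_pred, delta=1.0):
    err = np.abs(y_true - y_pred)
    quadratic = 0.5 * err ** 2               # |y - y_hat| <= delta branch
    linear = delta * err - 0.5 * delta ** 2  # otherwise branch
    return np.where(err <= delta, quadratic, linear)

y_true = np.array([1.0, 2.0, 5.0])
y_pred = np.array([1.2, 2.0, 2.0])
print(huber(y_true, y_pred))  # [0.02, 0.0, 2.5]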

Hinge Loss

L(y) = \max(0, 1 - y_i \cdot f(x_i))
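
A sketch of hinge loss, assuming labels y in {-1, +1} and raw decision values scores standing in for f(x_i); both names are placeholders.

import numpy as np

y = np.array([1, -1, 1])             # labels in {-1, +1}
scores = np.array([0.8, 0.3, -0.5])  # decision values f(x_i)

hinge = np.maximum(0.0, 1.0 - y * scores)
print(hinge)         # per-sample hinge loss
print(hinge.mean())  # batch average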

Cross-Entropy Loss

H(y, p) = -\sum_{i} y_i \log(p_i)
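
A sketch of cross-entropy between a target distribution y and a predicted distribution p (placeholder names), matching the single-sum form above.

import numpy as np

y = np.array([0.0, 1.0, 0.0])  # target distribution (one-hot here)
p = np.array([0.1, 0.7, 0.2])  # predicted distribution

eps = 1e-12  # guard against log(0)
H = -np.sum(y * np.log(p + eps))
print(H)  # -log(0.7), about 0.357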

Kullback-Leibler Divergence Loss

D_{KL}(P \parallel Q) = \sum_{x} p(x) \log\left(\frac{p(x)}{q(x)}\right)
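
A sketch of the KL divergence between two discrete distributions, with p and q as placeholder probability vectors for P and Q.

import numpy as np

p = np.array([0.4, 0.6])  # reference distribution P
q = np.array([0.5, 0.5])  # approximating distribution Q

eps = 1e-12  # guard against division by zero and log(0)
kl = np.sum(p * np.log((p + eps) / (q + eps)))
print(kl)  # about 0.020; zero only when P and Q match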

Logistic Loss

\text{logistic} = \log(1 + e^{-y_i f(x_i)})
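
A sketch of the logistic loss, again assuming labels y in {-1, +1} and raw outputs scores for f(x_i); np.log1p computes log(1 + x) accurately when the exponential term is small.

import numpy as np

y = np.array([1, -1, 1])              # labels in {-1, +1}
scores = np.array([2.0, -1.5, -0.3])  # raw model outputs f(x_i)

logistic = np.log1p(np.exp(-y * scores))
print(logistic)  # per-sample logistic loss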

Binary Cross-Entropy Loss

BCE = -\frac{1}{N} \sum_{i=1}^N \left[ y_i \log(p_i) + (1 - y_i) \log(1 - p_i) \right]
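
A sketch of binary cross-entropy averaged over a batch, assuming binary labels y and predicted positive-class probabilities p (placeholder names).

import numpy as np

y = np.array([1, 0, 1, 1])           # binary labels
p = np.array([0.9, 0.2, 0.6, 0.95])  # predicted P(class = 1)

eps = 1e-12  # avoid log(0)
bce = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
print(bce)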
