Binary Cross Entropy Loss
Interactive Binary Cross Entropy Loss Demo
See how BCE loss "punishes" wrong predictions through both visual intuition and mathematical precision. The red "glow" shows punishment intensity!
Mathematical Foundation: Binary Cross Entropy Loss
1. Binary Cross Entropy Formula:
BCE(y, p) = -[y × log(p) + (1-y) × log(1-p)]
Where y = true label (0 or 1), p = predicted probability (0 to 1)
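The formula translates directly into a few lines of code. Below is a minimal NumPy sketch; the eps clipping is a common numerical guard against log(0), not part of the formula itself.

```python
import numpy as np

def bce(y, p, eps=1e-12):
    """Binary cross entropy for a true label y (0 or 1) and predicted probability p."""
    p = np.clip(p, eps, 1 - eps)  # guard against log(0) at p = 0 or p = 1
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))
```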
2. Why This Works:
• Perfect prediction: BCE = 0 when p=1 and y=1, or p=0 and y=0
• Confident wrong prediction: BCE → ∞ as p→0 when y=1 (or p→1 when y=0)
• Uncertainty: BCE = log(2) ≈ 0.693 when p=0.5, regardless of y
The logarithmic penalty severely punishes confident wrong predictions, creating a strong learning signal.
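To make the three regimes concrete, here is a quick numeric check using the sketch above (self-contained; values are exact up to rounding):

```python
import numpy as np

def bce(y, p, eps=1e-12):
    p = np.clip(p, eps, 1 - eps)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

print(bce(1, 0.99))  # near-perfect prediction: ~0.010
print(bce(1, 0.01))  # confident wrong prediction: ~4.605, unbounded as p -> 0
print(bce(1, 0.50))  # maximum uncertainty: log(2) ~ 0.693, same value for y = 0
```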
Step-by-Step: How Data Transforms into Loss
1. Raw Data: sites with known liquefaction outcomes
2. Model Predictions: probability of liquefaction (0 to 1)
3. BCE Loss: punishment for wrong predictions
4. Learning Signal: gradient updates to improve the model
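These four stages can be wired together in a few lines. The sketch below uses a synthetic one-feature dataset as a stand-in for real site data (the feature and labels are invented for illustration) and a logistic model, so the BCE gradient reduces to the familiar p - y learning signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Raw data (hypothetical stand-in): one feature per site, label 1 = liquefied
x = rng.normal(size=100)
y = (x + 0.3 * rng.normal(size=100) > 0).astype(float)

w, b = 0.0, 0.0   # logistic model parameters
lr = 0.1          # learning rate
eps = 1e-12

for step in range(200):
    # 2. Model predictions: probabilities in (0, 1)
    p = 1 / (1 + np.exp(-(w * x + b)))
    # 3. BCE loss: punishment for wrong predictions
    loss = -np.mean(y * np.log(np.clip(p, eps, 1)) +
                    (1 - y) * np.log(np.clip(1 - p, eps, 1)))
    # 4. Learning signal: for sigmoid + BCE, dLoss/dlogit = p - y
    grad = p - y
    w -= lr * np.mean(grad * x)
    b -= lr * np.mean(grad)

print(f"final mean BCE: {loss:.3f}")
```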
The interactive demo presents these stages in four linked panels:
1. Training Data (Ground Truth)
2. Model Predictions + Decision Boundary
3. Loss "Punishment" (Visual)
4. Loss Curves (Mathematical View)
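The fourth panel's mathematical view is easy to reproduce. A sketch, assuming matplotlib is available: it plots -log(p) (the loss when y = 1) and -log(1 - p) (the loss when y = 0), and marks the log(2) crossover at p = 0.5.

```python
import numpy as np
import matplotlib.pyplot as plt

p = np.linspace(0.01, 0.99, 200)
plt.plot(p, -np.log(p), label="y = 1: -log(p)")
plt.plot(p, -np.log(1 - p), label="y = 0: -log(1 - p)")
plt.axhline(np.log(2), linestyle="--", label="log(2) at p = 0.5")
plt.xlabel("predicted probability p")
plt.ylabel("BCE loss")
plt.legend()
plt.show()
```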