ReLU and the importance of non-linear transformation
Interactive Transformation Demo
Adjust the sliders to see how a linear transformation (rotation + scaling) followed by a non-linear one (ReLU) can make the data linearly separable, or press "Solve" to see a working solution.
Mathematical Transformations
1. Linear Transformation:
$Y = W^T X + b$
Where $W$ is the transformation matrix (rotation + scaling) and $b$ is the bias (set to 0 here)
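In code, this step is a single matrix product. A minimal NumPy sketch, with a hand-picked angle and scale factors standing in for the demo's sliders:

```python
import numpy as np

# W composed as a rotation followed by per-axis scaling (placeholder values).
theta = np.pi / 4                          # rotation angle (assumed value)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
S = np.diag([2.0, 0.5])                    # scale factors (assumed values)
W = R @ S
b = np.zeros(2)                            # bias, set to 0 as in the demo

X = np.random.randn(2, 200)                # 200 random 2-D points, one per column
Y = W.T @ X + b[:, None]                   # Y = W^T X + b
```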
2. Non-linear Transformation (Leaky ReLU):
$Z = f(Y) = \max(\alpha Y, Y)$
Where $\alpha$ is the negative slope parameter, a small constant such as $0.01$ (setting $\alpha = 0$ recovers the standard ReLU)
Applied element-wise: $f(y) = y$ if $y > 0$, else $f(y) = \alpha y$
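Element-wise, the whole non-linearity is one call to np.maximum. A minimal sketch, assuming the illustrative $\alpha = 0.01$ from above:

```python
import numpy as np

def leaky_relu(y, alpha=0.01):
    # max(alpha * y, y) element-wise: y where y > 0, alpha * y otherwise
    # (valid for alpha < 1); alpha = 0 gives the standard ReLU.
    return np.maximum(alpha * y, y)

Z = leaky_relu(np.array([-2.0, -0.5, 0.0, 1.5]))  # -> [-0.02, -0.005, 0.0, 1.5]
```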
The demo renders three panels, one per stage: 1. Original Data (Input Space $X$); 2. After Linear Transform ($Y = W^T X$); 3. After Non-linearity ($Z = f(Y)$).
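Putting the three stages together: a self-contained sketch with made-up XOR-style data and a hand-picked $W$ (the demo's actual dataset and slider values are not reproduced here), showing that the non-linearity is what makes a separating line possible:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # points per cluster

# Stage 1: XOR-style toy data (placeholder, not the demo's dataset).
# Class A clusters at +/-(1, 1), class B at +/-(1, -1): no single line separates them.
A = np.concatenate([rng.normal(( 1,  1), 0.1, (n, 2)),
                    rng.normal((-1, -1), 0.1, (n, 2))]).T   # shape (2, 2n)
B = np.concatenate([rng.normal(( 1, -1), 0.1, (n, 2)),
                    rng.normal((-1,  1), 0.1, (n, 2))]).T

# Stage 2: Y = W^T X (b = 0). This hand-picked symmetric W stretches the
# (1, -1) diagonal by a factor of 5 and leaves the (1, 1) diagonal unchanged.
W = np.array([[ 3.0, -2.0],
              [-2.0,  3.0]])
Y_A, Y_B = W.T @ A, W.T @ B

# Stage 3: Z = f(Y), leaky ReLU with alpha = 0.01.
alpha = 0.01
Z_A = np.maximum(alpha * Y_A, Y_A)
Z_B = np.maximum(alpha * Y_B, Y_B)

# After the fold, class A sits near the origin and near (1, 1), while both
# class B clusters land near (5, 0) or (0, 5): the line z1 + z2 = 3 separates them.
print(np.all(Z_A.sum(axis=0) < 3), np.all(Z_B.sum(axis=0) > 3))  # True True
```

The key step is the fold: the linear map alone keeps the classes interleaved, but the non-linearity collapses the two class-B clusters onto the same side of a line, which no purely linear transformation of this data could do.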