L2 or Ridge Regression
Though the name suggests Regression, it is a Regularization method for preventing Overfitting: it adds a penalty on the squared weights to the loss
[!def] L2 or Ridge Regression
$$
\text{Loss} = \text{Loss}_{\text{prev}} + \lambda \sum_i w_i^2
$$
- The penalty shrinks the weights toward zero, but they never reach exactly 0
- Why? The gradient of $w^2$ is $2w$, which fades as $w \to 0$, so the pull toward zero weakens near zero; read L1 vs. L2 Regression
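A minimal sketch of the definition above, using the closed-form ridge solution $w = (X^\top X + \lambda I)^{-1} X^\top y$ (the helper name `ridge_fit` and the toy data are illustrative, not from any particular library):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w_small_lam = ridge_fit(X, y, lam=0.01)   # nearly ordinary least squares
w_large_lam = ridge_fit(X, y, lam=100.0)  # heavy penalty: weights shrink
```

Increasing `lam` shrinks the fitted weights toward zero, but none of them is driven to exactly 0, which is the behavior the bullets above describe.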