Instruction: Describe each regularization technique and how it affects the model differently.
Context: This question tests the candidate's understanding of regularization techniques and their impact on model complexity and feature selection.
The way I'd explain it in an interview is this: L1 and L2 regularization both penalize model complexity, but they push the parameters in different ways. L1 tends to drive some coefficients all the way to zero, which makes it useful when you want sparsity...
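The sparsity contrast described above can be sketched numerically. Below is a minimal illustration (not part of the original answer) that fits the same synthetic regression twice with plain gradient descent: once with an L2 penalty folded into the gradient, and once with an L1 penalty applied as a soft-thresholding (proximal) step. Only three of the ten features actually matter, so the L1 fit should zero out most of the irrelevant coefficients exactly, while the L2 fit merely shrinks them toward zero. All names, the penalty strength `lam`, and the step counts are illustrative choices, not a prescribed recipe.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))

# Ground truth uses only the first 3 of 10 features.
w_true = np.zeros(d)
w_true[:3] = [3.0, -2.0, 1.5]
y = X @ w_true + 0.1 * rng.normal(size=n)

def fit(X, y, penalty, lam=1.0, lr=1e-3, steps=5000):
    """Least-squares fit with an L1 or L2 penalty via (proximal) gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        if penalty == "l2":
            # L2: the penalty gradient lam*w shrinks every weight smoothly,
            # but never makes any of them exactly zero.
            w -= lr * (grad + lam * w)
        else:
            # L1: gradient step on the loss, then soft-thresholding;
            # weights whose pull is weaker than the threshold snap to 0.
            w -= lr * grad
            w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

w_l2 = fit(X, y, "l2")
w_l1 = fit(X, y, "l1")

print("exact zeros with L2:", int(np.sum(np.abs(w_l2) < 1e-8)))
print("exact zeros with L1:", int(np.sum(np.abs(w_l1) < 1e-8)))
```

Running this, the L1 fit typically zeroes out the irrelevant coefficients exactly while the L2 fit leaves them small but nonzero, which is the interview point in one picture: L1 performs feature selection, L2 only shrinks.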