Correct answer is (b) Overload noise
Easy explanation: In the statistical approach, the quantization error is modeled as random noise added to the original (unquantized) signal. As long as the input stays within the quantizer's range, this error is bounded by half a quantization step. If the input analog signal falls outside the range of the quantizer (clipping), the error e_q(n) becomes unbounded, and the result is overload noise.
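The distinction can be illustrated with a short numerical sketch (a hypothetical uniform midrise quantizer; the function and parameter names are our own, not from the question): inside the quantizer's range the error stays bounded by half a step, while for clipped inputs it grows with the input amplitude.

```python
import numpy as np

def uniform_quantize(x, n_bits=8, x_max=1.0):
    """Illustrative midrise uniform quantizer clipped to [-x_max, x_max)."""
    delta = 2 * x_max / (2 ** n_bits)            # quantization step size
    x_clipped = np.clip(x, -x_max, x_max - delta)
    # map each sample to the midpoint of its quantization cell
    return delta * np.floor(x_clipped / delta) + delta / 2

delta = 2 * 1.0 / 256                            # step for 8 bits, x_max = 1.0

# Granular region: inputs within range, error bounded by delta/2
x_in = np.array([0.1, -0.5, 0.73])
err = x_in - uniform_quantize(x_in)

# Overload region: inputs exceed the range, so the quantizer clips
# and the error grows with the input amplitude -- overload noise
x_over = np.array([2.0, 10.0, 100.0])
err_over = x_over - uniform_quantize(x_over)
```

Here `err` never exceeds half a quantization step, whereas `err_over` is roughly the amount by which each sample overshoots the range, with no upper bound.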