The right option is (b): at least one root in (a, b) and at least one root in (b, c).
The best explanation: Here f(x), being a polynomial, is continuous and differentiable for all real values of x.
We also have f(a) = f(b) = f(c), with a < b < c. Applying Rolle's theorem to f(x) on [a, b] and on [b, c], we conclude that f'(x) = 0 has at least one root in (a, b) and at least one root in (b, c).
But f'(x) is a polynomial of degree two, so f'(x) = 0 cannot have more than two roots. It follows that exactly one root of f'(x) = 0 lies in (a, b) and exactly one root of f'(x) = 0 lies in (b, c).
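As a quick sanity check, here is a small Python sketch with a concrete cubic chosen for illustration (the question does not specify f explicitly): f(x) = x^3 - 6x^2 + 11x satisfies f(1) = f(2) = f(3) = 6, so with a = 1, b = 2, c = 3 Rolle's theorem places one root of f'(x) = 0 in each of (1, 2) and (2, 3). Bisection locates them numerically.

```python
# Illustrative cubic: f(1) = f(2) = f(3) = 6, so Rolle's theorem
# guarantees a root of f'(x) = 0 in (1, 2) and another in (2, 3).
def f(x):
    return x**3 - 6*x**2 + 11*x

def fprime(x):
    return 3*x**2 - 12*x + 11

def bisect_root(g, lo, hi, tol=1e-12):
    """Bisection on a bracket: g(lo) and g(hi) must have opposite signs."""
    assert g(lo) * g(hi) < 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

assert f(1) == f(2) == f(3) == 6
r1 = bisect_root(fprime, 1, 2)   # exact value: 2 - 1/sqrt(3)
r2 = bisect_root(fprime, 2, 3)   # exact value: 2 + 1/sqrt(3)
```

Solving 3x^2 - 12x + 11 = 0 by the quadratic formula gives x = 2 ± 1/√3 ≈ 1.423 and 2.577, matching the bisection output: one root in each interval, and no more, since f' is quadratic.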
Let y = f(x) be a polynomial function of degree n. If f(x) = 0 has only real roots, then f'(x) = 0, f''(x) = 0, …, f^(n-1)(x) = 0 will also have only real roots. This is in fact the general version of the application above: if all roots of f(x) = 0 are real, then between any two consecutive roots of f(x) = 0 there lies exactly one root of f'(x) = 0.
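The general claim can also be checked numerically. The sketch below uses a hypothetical example, f(x) = (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6, whose roots are all real; it repeatedly differentiates the coefficient list and verifies that each derivative changes sign between consecutive roots of the previous polynomial, so the intermediate value theorem places a real root there.

```python
# Check: for f(x) = (x-1)(x-2)(x-3), every derivative up to f^(n-1)
# has a real root between each pair of consecutive roots of the
# previous polynomial (sign change => root, by the IVT).
def poly_eval(coeffs, x):
    """Horner evaluation; coeffs are [a_n, ..., a_0]."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

def poly_deriv(coeffs):
    """Derivative coefficients, same [a_n, ..., a_0] convention."""
    n = len(coeffs) - 1
    return [c * (n - i) for i, c in enumerate(coeffs[:-1])]

def bisect_root(g, lo, hi, tol=1e-12):
    """Bisection on a bracket where g changes sign."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

coeffs = [1, -6, 11, -6]   # x^3 - 6x^2 + 11x - 6
roots = [1.0, 2.0, 3.0]    # all real

while len(coeffs) > 2:     # differentiate down to the linear f^(n-1)
    coeffs = poly_deriv(coeffs)
    g = lambda x, c=coeffs: poly_eval(c, x)
    new_roots = []
    for lo, hi in zip(roots, roots[1:]):
        assert g(lo) * g(hi) < 0   # sign change between consecutive roots
        new_roots.append(bisect_root(g, lo, hi))
    roots = new_roots

# roots now holds the single real root of f''(x) = 6x - 12, i.e. x = 2
```

Each pass shrinks the list of roots by one, mirroring the degree drop under differentiation: three roots of f bracket two roots of f', which in turn bracket the one root of f''.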