# Let f(x) = ax² + bx + c, where a, b, c ∈ ℝ and a ≠ 0. If f(x) = x has non-real roots, show that the equation f(f(x)) = x also has all non-real roots.

Jitender Singh IIT Delhi
8 years ago
Ans:
Hello Student,

f(x) = x means ax² + bx + c = x, i.e.

ax² + (b − 1)x + c = 0.

This equation has non-real roots, so its discriminant is negative:

(b − 1)² < 4ac.

Now write f(f(x)) − x = [f(f(x)) − f(x)] + [f(x) − x]. Since f(f(x)) − f(x) = a(f(x)² − x²) + b(f(x) − x) = (f(x) − x)(a(f(x) + x) + b), this factors as

f(f(x)) − x = (f(x) − x)(a·f(x) + ax + b + 1).

The first factor is ax² + (b − 1)x + c, which has non-real roots by assumption. The second factor expands to

a²x² + a(b + 1)x + (ac + b + 1),

whose discriminant is a²(b + 1)² − 4a²(ac + b + 1) = a²[(b − 1)² − 4ac − 4] < 0 by the inequality above.

So this equation also has non-real roots, and hence all roots of f(f(x)) = x are non-real.
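
As a concrete sanity check (the specific coefficients below are my own illustration, not part of the original answer): take a = 1, b = 0, c = 1, so f(x) = x² + 1.

```latex
% Worked check with f(x) = x^2 + 1, i.e. a = 1, b = 0, c = 1.
% f(x) = x has non-real roots:
%   x^2 + 1 = x  =>  x^2 - x + 1 = 0,
%   discriminant (b-1)^2 - 4ac = 1 - 4 = -3 < 0.
\[
f(f(x)) - x = (x^2 + 1)^2 + 1 - x = x^4 + 2x^2 - x + 2
            = (x^2 - x + 1)(x^2 + x + 2).
\]
% Both quadratic factors have negative discriminant:
%   x^2 - x + 1 :  1 - 4 = -3 < 0,
%   x^2 + x + 2 :  1 - 8 = -7 < 0,
% so all four roots of f(f(x)) = x are non-real, as claimed.
% Note x^2 + x + 2 matches the general second factor
% a^2 x^2 + a(b+1)x + (ac + b + 1) with a = 1, b = 0, c = 1.
```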
mycroft holmes
272 Points
8 years ago
Without loss of generality we can assume that a > 0 (if a < 0, the same argument goes through with all inequalities reversed).

Then f(x) = x having no real roots implies that f(x) > x for all real x.

This in turn implies that f(f(x)) > x for all real x.

This means f(f(x)) – x = 0 has no real roots, i.e. all its roots are non-real.
mycroft holmes
272 Points
8 years ago
I need to add an explanation:

f(x) > x for all x implies f(f(x)) > f(x) > x (apply the inequality first at the point f(x), then at x itself). That is how we get the inequality f(f(x)) > x for all x.
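
A compact way to write this sign argument out in full (my own summary of the above, assuming a > 0 as in the answer):

```latex
% Let g(x) = f(x) - x. Since f(x) = x has no real roots, g never vanishes on R.
% g is continuous, so it keeps one sign; with a > 0, g(x) -> +infinity as |x| -> infinity,
% hence g(x) > 0, i.e. f(x) > x for every real x.
\[
g(x) := f(x) - x > 0 \quad \text{for all } x \in \mathbb{R}.
\]
% Apply this twice, first at the point f(x) and then at x:
\[
f(f(x)) > f(x) > x \quad \Longrightarrow \quad f(f(x)) - x > 0 \ \text{for all } x \in \mathbb{R},
\]
% so f(f(x)) = x has no real solutions: all roots of f(f(x)) = x are non-real.
```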