Let f(x) = ax² + bx + c, where a, b, c are real numbers and a ≠ 0. If f(x) = x has non-real roots, show that the equation f(f(x)) = x has only non-real roots.
Tushar Watts, 15 Years ago
Grade 12
3 Answers
Jitender Singh
Last Activity: 10 Years ago
Ans: Hello student, please find the answer to your question below.
The equation f(x) = x, i.e. ax² + (b − 1)x + c = 0, has non-real roots, so its discriminant is negative: (b − 1)² < 4ac. Under this condition the equation f(f(x)) = x also has only non-real roots.
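Spelling out that discriminant step (this restatement is added here only for clarity; the notation is standard):

\[
f(x) = x \;\Longleftrightarrow\; ax^{2} + (b-1)x + c = 0,
\qquad
\text{non-real roots} \;\Longleftrightarrow\; (b-1)^{2} - 4ac < 0 .
\]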
mycroft holmes
Last Activity: 10 Years ago
Without loss of generality we can assume that a > 0 (if a < 0, the same argument works with every inequality reversed).
Then f(x) = x having no real roots implies that f(x) > x for all real x.
This in turn implies that f(f(x)) > x for all real x.
Hence f(f(x)) − x = 0 has no real roots, i.e. all of its roots are non-real.
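A sketch of the step from "no real roots" to the inequality f(x) > x (the name g(x) below is introduced here only for convenience; it does not appear in the original answer):

\[
g(x) := f(x) - x = ax^{2} + (b-1)x + c,
\qquad (b-1)^{2} - 4ac < 0,\; a > 0,
\]
\[
g(x) = a\left(x + \frac{b-1}{2a}\right)^{2} + \frac{4ac - (b-1)^{2}}{4a} > 0
\quad \text{for every real } x,
\]

so f(x) > x for every real x, which is exactly the inequality used above.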
mycroft holmes
Last Activity: 10 Years ago
I need to add an explanation:
f(x) > x for all x implies f(f(x)) > f(x) > x (apply the inequality at the point f(x)). That is how we get the inequality f(f(x)) > x for all x.
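For readers who want to see this concretely, here is a quick numerical sanity check, not part of the proof itself (a sketch only: the coefficients a = b = c = 1 are chosen just for illustration, and SymPy is assumed to be available):

import sympy as sp

x = sp.symbols('x')
a, b, c = 1, 1, 1                  # example coefficients: (b - 1)^2 = 0 < 4 = 4ac

f = a*x**2 + b*x + c               # f(x)
ff = f.subs(x, f)                  # f(f(x))

roots = sp.solve(sp.Eq(ff, x), x)  # roots of the quartic f(f(x)) = x
print(roots)                       # four roots, each with nonzero imaginary part
print(all(not r.is_real for r in roots))   # expected: True

As a cross-check, choosing coefficients with (b − 1)² ≥ 4ac should produce real roots, since any real solution of f(x) = x is automatically a solution of f(f(x)) = x.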