`Let f(x) = ax^2 + bx + c, where a, b, c ∈ R and a ≠ 0. If f(x) = x has non-real roots, then show that the equation f(f(x)) = x has all non-real roots.`
Jitender Singh, IIT Delhi (158 Points) — 10 years ago
```
Ans: Hello Student, please find the answer to your question below.

f(x) = x is the quadratic ax^2 + (b - 1)x + c = 0. Since it has non-real roots, its discriminant is negative:

(b - 1)^2 < 4ac

So f(x) - x is a quadratic with no real zero, and therefore keeps one fixed sign for all real x. For any real x, f(x) is also real, so f(f(x)) - f(x) has that same sign. Adding the two,

f(f(x)) - x = [f(f(x)) - f(x)] + [f(x) - x]

is never zero for real x, so f(f(x)) = x also has only non-real roots.
```
5 years ago
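The discriminant condition in the answer above can be checked with a small numerical sketch (my own illustration, not part of the original answer); the helper name and the sample coefficients a = b = c = 1 are assumptions for the example:

```python
# Hypothetical helper illustrating the condition from the answer above:
# f(x) = x, i.e. ax^2 + (b - 1)x + c = 0, has non-real roots exactly
# when its discriminant (b - 1)^2 - 4ac is negative.
def fixed_point_discriminant(a, b, c):
    """Discriminant of f(x) - x = a*x^2 + (b - 1)*x + c."""
    return (b - 1.0) ** 2 - 4.0 * a * c

# Example: f(x) = x^2 + x + 1 gives (1 - 1)^2 - 4*1*1 = -4 < 0,
# so f(x) = x has no real solution.
print(fixed_point_discriminant(1.0, 1.0, 1.0))  # → -4.0
```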
```
Without loss of generality we can assume a > 0. Then f(x) = x having no real roots implies f(x) > x for all real x, since f(x) - x is an upward-opening parabola with no real zero. This in turn gives f(f(x)) > x for all real x, so f(f(x)) - x = 0 has no real roots.
```
5 years ago
```
I need to add an explanation: f(x) > x for all x implies f(f(x)) > f(x) > x, by applying the inequality at the point f(x). That is how we get f(f(x)) > x for all x.
```
5 years ago
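As a numerical sanity check of the argument (an illustration I am adding, not part of any answer), one can compute the roots of f(f(x)) = x for the assumed sample polynomial f(x) = x^2 + x + 1 and confirm that all of them are non-real:

```python
import numpy as np

# Sanity check (an illustration, not a proof), using the assumed
# example f(x) = x^2 + x + 1, i.e. a = b = c = 1.
# Here f(x) = x reduces to x^2 + 1 = 0, which has non-real roots.
f = np.array([1.0, 1.0, 1.0])  # coefficients of f, highest degree first

# Build f(f(x)) = f(x)^2 + f(x) + 1 with polynomial arithmetic
ff = np.polyadd(np.polymul(f, f), np.polyadd(f, np.array([1.0])))

# Roots of f(f(x)) - x
roots = np.roots(np.polysub(ff, np.array([1.0, 0.0])))
print(roots)

# Every root should be non-real, as the argument above predicts
assert all(abs(r.imag) > 1e-9 for r in roots)
```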