Let f(x) = ax^2 + bx + c, where a, b, c are real and a ≠ 0. If the equation f(x) = x has non-real roots, show that the equation f(f(x)) = x has all non-real roots.
8 years ago

Jitender Singh
IIT Delhi
158 Points
Ans: Hello Student, please find the answer to your question below.

The equation f(x) = x, i.e. ax^2 + (b − 1)x + c = 0, has non-real roots, so its discriminant is negative: (b − 1)^2 < 4ac. Since f(f(x)) − x has f(x) − x as a factor, and the remaining quadratic factor also has negative discriminant under this condition, the equation f(f(x)) = x likewise has non-real roots.
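To make the discriminant argument explicit (a sketch not in the original answer): f(f(x)) − x always has f(x) − x as a polynomial factor, and SymPy can confirm this symbolically along with the form of the quotient.

```python
import sympy as sp

x, a, b, c = sp.symbols('x a b c')
f = a*x**2 + b*x + c
ff = f.subs(x, f)  # f(f(x))

# Divide f(f(x)) - x by f(x) - x as polynomials in x.
q, r = sp.div(sp.expand(ff - x), sp.expand(f - x), x)

print(sp.simplify(r))                       # 0: the division is exact
print(sp.simplify(q - (a*f + a*x + b + 1)))  # 0: quotient equals a*f(x) + a*x + b + 1
```

Expanded, the quotient is a^2 x^2 + a(b + 1)x + (ac + b + 1), whose discriminant works out to a^2[(b − 1)^2 − 4ac − 4]; this is negative whenever (b − 1)^2 < 4ac, so both factors of f(f(x)) − x have non-real roots.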
3 years ago
mycroft holmes
271 Points
Without loss of generality we can assume a > 0 (if a < 0, the same argument runs with the inequalities reversed). Then f(x) = x having no real roots implies that f(x) > x for all real x. This in turn implies f(f(x)) > x for all real x, which means f(f(x)) − x = 0 has no real roots, i.e. all its roots are non-real.
3 years ago
mycroft holmes
271 Points
I need to add an explanation: f(x) > x for all x implies f(f(x)) > f(x) > x. That is how we get the inequality f(f(x)) > x for all x.
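As a quick numerical sanity check of this argument (the specific quadratic below is an illustration, not from the thread): take f(x) = x^2 + x + 1, so f(x) = x becomes x^2 + 1 = 0, which has non-real roots; all four roots of f(f(x)) = x then come out non-real.

```python
import numpy as np

# Illustrative choice: f(x) = x^2 + x + 1, i.e. a = b = c = 1.
# f(x) = x  <=>  x^2 + 1 = 0, whose discriminant (b-1)^2 - 4ac = -4 is negative.
a, b, c = 1.0, 1.0, 1.0

fx = np.array([a, b, c])        # coefficients of f(x), highest degree first
ffx = a * np.polymul(fx, fx)    # a * f(x)^2  (degree-4 polynomial)
ffx[2:] += b * fx               # + b * f(x)
ffx[4] += c                     # + c          -> coefficients of f(f(x))
ffx[3] -= 1.0                   # - x          -> coefficients of f(f(x)) - x

roots = np.roots(ffx)
print(roots)
print(all(abs(r.imag) > 1e-9 for r in roots))  # True: every root is non-real
```

Here f(f(x)) − x = x^4 + 2x^3 + 4x^2 + 2x + 3 = (x^2 + 1)(x^2 + 2x + 3), and both quadratic factors have negative discriminant, matching the general argument.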
3 years ago