Guest



Hi!
If I have to measure the temperature of a hot cup of tea, I insert a thermometer into it. The initial temperature of the thermometer bulb is 0 °C. When it is in contact with the tea, the temperature of the bulb will increase and that of the tea will decrease until they attain thermal equilibrium. This means the thermometer heats up a little and the tea cools down a little. Once they reach thermal equilibrium, the thermometer measures its own temperature (which is now equal to the tea's temperature). The problem here is: to attain equilibrium, the tea had to cool down a little, so what we measure is not the original temperature of the tea but the temperature after the tea has cooled slightly.
If my line of reasoning is correct, then it means we all measure the wrong temperature.
Any views on this? (Consider an isolated system.)
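The size of this effect can be estimated with a simple heat balance. The sketch below uses assumed, hypothetical values (200 g of tea treated as water, a 5 g glass bulb starting at 0 °C) just to illustrate the reasoning in the question:

```python
# Heat-balance sketch of the thermometer-in-tea scenario.
# All masses and specific heats below are assumed example values.

C_TEA = 0.200 * 4186   # heat capacity of 200 g of tea (as water), J/K
C_BULB = 0.005 * 840   # heat capacity of a 5 g glass bulb, J/K

T_tea = 90.0           # initial tea temperature, deg C (assumed)
T_bulb = 0.0           # initial bulb temperature, deg C

# In an isolated system, heat lost by the tea equals heat gained by
# the bulb:  C_TEA * (T_tea - T_eq) = C_BULB * (T_eq - T_bulb)
T_eq = (C_TEA * T_tea + C_BULB * T_bulb) / (C_TEA + C_BULB)

print(f"Equilibrium temperature: {T_eq:.2f} deg C")
print(f"Tea cooled by:           {T_tea - T_eq:.2f} deg C")
```

With these numbers the tea cools by well under a degree, and the perturbation shrinks as the bulb's heat capacity gets smaller relative to the tea's, which is why thermometer bulbs are made small.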


Grade: 9

0 Answers
