A thermometer showing a room temperature of 80° is placed on a block of ice at a temperature of 30°. After one minute the thermometer shows an intermediate reading. How long will it take for the thermometer to reach a specified final temperature?
This problem is solved using Newton's Law of Cooling, which states that the rate of change of the temperature of an object is proportional to the difference between its own temperature and the ambient (surrounding) temperature. We use a first-order differential equation to model this relationship and solve for the unknown time.
According to Newton's Law of Cooling:
\begin{align*} \frac{d T}{d t} & =-k\left(T-T_{s}\right) \end{align*}
Where:
- $T$ is the temperature of the thermometer at time $t$ (measured in minutes),
- $T_{s}$ is the surrounding (ambient) temperature,
- $k>0$ is the constant of proportionality.

Given the surrounding temperature $T_{s}=30$ (the temperature of the ice):
\begin{align*} \frac{d T}{d t} & =-k(T-30) \end{align*}
Separating variables and integrating both sides to find the general solution:
\begin{align*} \int \frac{d T}{T-30} & =-k \int d t \\ \ln (T-30) & =-k t+A \tag{1} \end{align*}
where $A$ is the constant of integration.
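Exponentiating both sides gives an equivalent form of the general solution; writing $C=e^{A}$ for the arbitrary positive constant,
\begin{align*} T(t) & =30+C e^{-k t} \end{align*}
so the reading decays exponentially toward the surrounding temperature $30$.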
We use the initial condition: at $t=0$ the thermometer is at room temperature, $T(0)=80$, so $A=\ln (80-30)=\ln 50$.
Substitute $A=\ln 50$ back into equation (1):
\begin{align*} \ln (T-30) & =-k t+\ln 50 \\ \ln (T-30)-\ln 50 & =-k t \\ \ln \left(\frac{T-30}{50}\right) & =-k t \tag{2} \end{align*}
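Equivalently, exponentiating equation (2) gives the particular solution in exponential form, which shows the reading starting at $T(0)=80$ and decaying toward the ice temperature $30$:
\begin{align*} T(t) & =30+50 e^{-k t} \end{align*}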
We are given the thermometer's reading after one minute, i.e. the value of $T$ at $t=1$; substituting these values into equation (2) determines the constant $k$.
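The one-minute reading itself is not reproduced above; denote it by $T_{1}$ (a placeholder symbol introduced here, not part of the original problem). Setting $t=1$ and $T=T_{1}$ in equation (2) gives
\begin{align*} \ln \left(\frac{T_{1}-30}{50}\right) & =-k \\ k & =\ln \left(\frac{50}{T_{1}-30}\right) \end{align*}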
Substituting this value of $k$ back into equation (2) yields equation (3), the particular relation between the reading $T$ and the elapsed time $t$.
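Because the numerical value of $k$ depends on the missing one-minute reading, equation (3) can be written here only in terms of the placeholder $T_{1}$; eliminating $k$ gives the compact form
\begin{align*} \ln \left(\frac{T-30}{50}\right) & =t \ln \left(\frac{T_{1}-30}{50}\right) \\ \frac{T-30}{50} & =\left(\frac{T_{1}-30}{50}\right)^{t} \end{align*}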
Now we find the time $t$ at which the thermometer reaches the required final temperature, again using equation (3).
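Writing the required final reading as $T_{f}$ (again a placeholder for the value missing above), solving equation (3) for $t$ gives, in minutes,
\begin{align*} t & =\frac{1}{k} \ln \left(\frac{50}{T_{f}-30}\right)=\frac{\ln \left(\frac{T_{f}-30}{50}\right)}{\ln \left(\frac{T_{1}-30}{50}\right)} \end{align*}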
To convert this time from minutes into seconds, multiply by 60.
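The whole computation is easy to check numerically. In the sketch below, the ambient temperature $30$ and initial reading $80$ come from the working above, while the one-minute reading ($50$) and target reading ($32$) are purely hypothetical stand-ins for the values missing from the problem statement; substitute the actual readings to obtain the answer in minutes and seconds.

```python
import math

def cooling_time(t_ambient, t_initial, t_one_minute, t_target):
    """Minutes needed for a Newton's-law-of-cooling thermometer to reach t_target,
    given its initial reading and its reading after one minute."""
    # The one-minute reading fixes k via ln((T - Ta)/(T0 - Ta)) = -k*t at t = 1.
    k = -math.log((t_one_minute - t_ambient) / (t_initial - t_ambient))
    # Solve ln((T_target - Ta)/(T0 - Ta)) = -k*t for t.
    return -math.log((t_target - t_ambient) / (t_initial - t_ambient)) / k

# 30 and 80 are taken from the working above; 50 and 32 are hypothetical placeholders.
t_minutes = cooling_time(t_ambient=30, t_initial=80, t_one_minute=50, t_target=32)
print(f"{t_minutes:.3f} minutes = {60 * t_minutes:.1f} seconds")
```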