Indeed it would, and in fact in the late 19th century this idea was used by William Thomson, Lord Kelvin, to estimate the age of the Earth. He measured the temperature as a function of depth down deep mines (it is indeed hotter in deep mines), used this to estimate the rate of heat transfer through the Earth, and then worked out how long the Earth would have taken to cool to its present temperature from an initially molten state. His answer was wrong, for two reasons.

One is that he assumed (for want of any better information) that the thermal conductivity of the Earth was constant throughout, which isn't true: the Earth consists of distinct layers with different chemical compositions and in different physical states (some solid, some molten), so the thermal conductivity varies with depth. This was not known when Kelvin made his estimate; it was understood much later through studies of earthquakes.

The other reason is that he – and you – neglected the heat generated by the decay of radioactive elements like uranium and thorium, which supply an additional source of heat energy: the Earth is not just cooling from a hot initial state. At the present time, radioactive heat and heat left over from the Earth's formation are about equally important contributions to the Earth's internal heat flow.
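For a rough sense of how such an estimate works, here is a minimal sketch of a Kelvin-style calculation: a uniform half-space, initially molten, cooling by conduction, with the age read off from the measured surface temperature gradient. The numbers used here (initial temperature `T0`, thermal diffusivity `kappa`, gradient `G`) are illustrative stand-ins, not Kelvin's actual inputs.

```python
import math

# Kelvin-style estimate: a uniform half-space at initial temperature T0 (relative to the
# surface), cooling by conduction. The surface gradient after time t is
#     G = T0 / sqrt(pi * kappa * t),
# so a measured gradient G implies a cooling age  t = T0**2 / (pi * kappa * G**2).

T0 = 4000.0      # assumed initial (molten) temperature excess over the surface, in kelvin
kappa = 1.2e-6   # assumed thermal diffusivity of rock, in m^2/s
G = 0.03         # geothermal gradient, in K/m (about 30 K per kilometre of depth)

age_seconds = T0**2 / (math.pi * kappa * G**2)
age_years = age_seconds / 3.156e7  # seconds in a year

print(f"Implied cooling age: {age_years:.1e} years")
# With these illustrative numbers the answer comes out of order 10^8 years: the same
# order as Kelvin's published estimates, and far short of the true ~4.5 billion years.
```

The point of the sketch is that the method is sound but the model is too simple; putting in layered conductivity and radioactive heating changes the answer enormously.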
It is true that the Earth *would* eventually run out of thermal energy: the heat of formation is not being replaced, and even the radioactive heating will eventually stop as all the radioactive elements decay. However, the timescale for this to happen is probably longer than the lifetime of the Sun.
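To put that timescale in perspective, here is a small sketch comparing the half-lives of the main radiogenic heat sources with a rough figure of about 5 billion years for the Sun's remaining main-sequence lifetime. The half-lives are standard values; treating each isotope's heat output as simply proportional to the atoms remaining is a simplification.

```python
# Half-lives of the main radiogenic heat sources in the Earth (standard values, in years),
# compared with a rough 5-billion-year figure for the Sun's remaining main-sequence lifetime.
half_lives = {
    "U-238": 4.47e9,
    "U-235": 7.04e8,
    "Th-232": 1.40e10,
    "K-40": 1.25e9,
}
sun_remaining_years = 5.0e9

for isotope, t_half in half_lives.items():
    # Fraction of today's atoms of this isotope (and, roughly, of its heat output)
    # still present when the Sun leaves the main sequence.
    fraction_left = 0.5 ** (sun_remaining_years / t_half)
    print(f"{isotope:>7s}: half-life {t_half:.2e} yr, about {fraction_left:.0%} left in 5 Gyr")
```

Thorium-232 and uranium-238 in particular decay so slowly that a large fraction of today's radiogenic heating will still be there when the Sun leaves the main sequence.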