How do you calculate the energy needed to heat water?

1 Answer
Dec 7, 2016


It depends on how much water you have and to what temperature you want to raise it.


The equation for the amount of thermal energy needed to produce a certain temperature change is as follows:

#q = cmDeltaT#


#q# is the amount of thermal energy
#c# is the specific heat capacity of water (#~~ 4.184 J/g^oC#)
#m# is the mass of the water
#DeltaT# is the change in temperature.

So, how much thermal energy you need depends on exactly how much you want to raise the water's temperature.
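For instance (the numbers here are chosen purely for illustration): to warm #100 g# of water by #25^oC#,

#q = 4.184 J/(g^oC) xx 100 g xx 25^oC = 10460 J ~~ 10.5 kJ#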

Also, note that the equation includes a mass. This is simply because the larger the sample of molecules you have, the more energy it will take to raise their average kinetic energy (i.e. temperature) to a certain level. This is why you may notice that heating a pan of water takes much less time than heating up your bathtub full of water, even if you supply heat at the same rate.

The specific heat capacity tells us how much energy it takes to raise the temperature of #1 g# of water by #1^oC#; the mass term in the equation then scales that up to however much water we actually have.

So to conclude, the amount of energy you'll need to heat water really depends on the temperature you want to heat it to, as well as the amount of water you have.
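If you like, the calculation above can be sketched as a tiny program. This is just an illustration of the #q = cmDeltaT# formula; the function name and the sample values (250 g heated from 20°C to 100°C) are mine, not part of the question.

```python
# Sketch of q = c * m * ΔT for liquid water.
SPECIFIC_HEAT_WATER = 4.184  # J/(g·°C), specific heat capacity of water

def heat_required(mass_g, delta_t_c, c=SPECIFIC_HEAT_WATER):
    """Return the thermal energy q (in joules) needed to change the
    temperature of mass_g grams of water by delta_t_c degrees Celsius."""
    return c * mass_g * delta_t_c

# Illustrative case: heating 250 g of water from 20 °C to 100 °C.
q = heat_required(250, 100 - 20)
print(q)          # 83680.0 J
print(q / 1000)   # 83.68 kJ
```

The same function works for any sample size or temperature change, which is exactly the point of the formula: energy scales linearly with both the mass and the temperature change.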

Hope that helped :)