Sigma heat

Sigma heat, denoted S, is a measure of the specific energy of humid air. It is used in the field of mining engineering for calculations relating to the temperature regulation of mine air. Sigma heat is sometimes called total heat, although total heat may instead mean enthalpy.

Definition

Sigma heat is the energy which would be extracted from a unit mass of humid air if it were cooled to a certain reference temperature under constant pressure while simultaneously removing any condensation formed during the process. Because sigma heat assumes that condensation will be removed, any energy which would be extracted by cooling the water vapor below its condensation point does not count towards sigma heat. The reference temperature is usually 0°F, although 32°F is sometimes used as well.

Assuming a reference temperature of 0°F, the following formula may be used under standard temperature ranges and pressure:[1]

S = 0.24\,\tfrac{\mathrm{BTU}}{\mathrm{lb}\,{}^\circ\mathrm{F}}\,t + W\left(0.45\,\tfrac{\mathrm{BTU}}{\mathrm{lb}\,{}^\circ\mathrm{F}}\,t + 1061\,\tfrac{\mathrm{BTU}}{\mathrm{lb}}\right)

where S is the sigma heat of the air (in BTU/lb), t is the dry-bulb temperature of the air (in °F), and W is the specific humidity of the air (unitless).
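
For concreteness, here is a minimal Python sketch of the US-unit formula above, assuming the 0°F reference temperature and standard temperature ranges and pressure; the function and variable names are illustrative, not from the source:

    def sigma_heat_us(t_dry_bulb_f, specific_humidity):
        """Sigma heat in BTU/lb, given the dry-bulb temperature in
        degrees F and the specific humidity (unitless)."""
        return 0.24 * t_dry_bulb_f + specific_humidity * (0.45 * t_dry_bulb_f + 1061.0)

    # Example: 80 degrees F air carrying 0.01 lb of water vapor per lb of air
    print(sigma_heat_us(80.0, 0.01))  # ~30.2 BTU/lb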

The equivalent metric formula:

S = 17.86\,\tfrac{\mathrm{kJ}}{\mathrm{kg}} + 1.005\,\tfrac{\mathrm{kJ}}{\mathrm{kg\,K}}\,t + W\left(2501\,\tfrac{\mathrm{kJ}}{\mathrm{kg}} + 1.884\,\tfrac{\mathrm{kJ}}{\mathrm{kg\,K}}\,t\right)

where S is the sigma heat of the air (in kJ/kg), t is the dry-bulb temperature of the air (in °C), and W is the specific humidity of the air (unitless, sometimes expressed as kg/kg).
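
The same sketch in metric units. Note that the formula still uses the 0°F reference temperature, which is the source of the constant 17.86 kJ/kg offset: 0°F is about −17.8°C, and 1.005 kJ/(kg·K) × 17.8 K ≈ 17.86 kJ/kg.

    def sigma_heat_metric(t_dry_bulb_c, specific_humidity):
        """Sigma heat in kJ/kg, given the dry-bulb temperature in
        degrees C and the specific humidity (kg/kg)."""
        return 17.86 + 1.005 * t_dry_bulb_c + specific_humidity * (2501.0 + 1.884 * t_dry_bulb_c)

    # Same air as the US-unit example: 26.7 degrees C (80 degrees F), W = 0.01
    print(sigma_heat_metric(26.7, 0.01))  # ~70.2 kJ/kg, i.e. ~30.2 BTU/lb x 2.326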

Comparison with enthalpy

Sigma heat is not the same as the enthalpy of the humid air above the reference temperature. (Enthalpy is sometimes called total heat or true total heat.) Unlike sigma heat, enthalpy does include the energy which would be extracted in cooling the condensed water vapor all the way to the reference temperature. Essentially, enthalpy assumes that all components of the system must be cooled during the cooling process, whereas sigma heat assumes that some of those components (liquid water) are removed partway through the process. Nevertheless, some writers mistakenly use the term enthalpy when they actually mean sigma heat, creating some confusion.

Assuming a reference temperature of 0°F, the relationship between enthalpy and sigma heat may be shown mathematically as:

h = S + 1\,\tfrac{\mathrm{BTU}}{\mathrm{lb}\,{}^\circ\mathrm{F}}\,W t'

where h is the specific enthalpy of the air above its reference temperature, S is the sigma heat of the air (in BTU/lb), W is the specific humidity of the air (unitless), and t' is the wet-bulb temperature (in °F).

(Standard temperature ranges are assumed.)
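
A sketch of this relationship, continuing the earlier US-unit example; the coefficient 1.0 is the specific heat of liquid water in BTU/(lb·°F), and the wet-bulb value chosen here is purely illustrative:

    def enthalpy_from_sigma(sigma_btu_lb, specific_humidity, t_wet_bulb_f):
        """Specific enthalpy above the 0 degrees F reference, in BTU/lb."""
        return sigma_btu_lb + 1.0 * specific_humidity * t_wet_bulb_f

    # S ~= 30.2 BTU/lb, W = 0.01, with an assumed wet-bulb temperature of 64 degrees F
    print(enthalpy_from_sigma(30.2, 0.01, 64.0))  # ~30.8 BTU/lb

Because W is small for ordinary mine air, the correction term is small, and h and S differ by well under a few percent in this example.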

Wet-bulb temperature vs. dry-bulb temperature

Assuming constant pressure, sigma heat is solely a function of the wet-bulb temperature of the air. For this reason, humidity need not be taken into account unless dry-bulb temperature measurements are used. Like sigma heat, the wet-bulb temperature is not directly affected by the temperature of any condensed water vapor (liquid water), and it varies only when there is a net energy change to the system. In contrast, the dry-bulb temperature can vary even for processes in which there is no such net energy change.

This difference may be understood by examining evaporative cooling. During evaporative cooling, all energy lost from air molecules as sensible heat is gained as latent heat by water molecules evaporating into that air. With no net energy gained or lost by the now more humid air, sigma heat remains unchanged. In keeping with this, the wet-bulb temperature also remains unchanged, since its reading already reflected the maximum possible amount of evaporative cooling. The dry-bulb temperature, however, does decrease during such evaporative cooling, even though sigma heat does not. This is why calculations of sigma heat that use dry-bulb temperatures must also take the humidity of the air into account.
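
A sketch of this idea: at the wet-bulb temperature the air is saturated, so at fixed pressure choosing t' fixes the specific humidity and hence the sigma heat. The saturation vapor pressure formula below is a standard Magnus-type approximation, an assumption not given in the source:

    import math

    def sigma_heat_from_wet_bulb(t_wet_bulb_c, pressure_hpa=1013.25):
        """Sigma heat in kJ/kg from the wet-bulb temperature alone,
        assuming standard pressure."""
        # Magnus-type saturation vapor pressure in hPa (an approximation,
        # not from the source)
        e_s = 6.112 * math.exp(17.62 * t_wet_bulb_c / (243.12 + t_wet_bulb_c))
        # Saturation specific humidity at the wet-bulb temperature
        w_s = 0.622 * e_s / (pressure_hpa - e_s)
        # Metric sigma-heat formula, evaluated at saturation
        return 17.86 + 1.005 * t_wet_bulb_c + w_s * (2501.0 + 1.884 * t_wet_bulb_c)

    # Air with a 20 degrees C wet-bulb temperature has the same sigma heat
    # before and after adiabatic evaporative cooling, because t' is
    # unchanged by the process:
    print(sigma_heat_from_wet_bulb(20.0))  # ~75 kJ/kg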

References

  1. Different temperature ranges or pressures will slightly alter the heat capacities of the water vapor and the air, reducing the accuracy of this formula.
