Temperature measurement (also known as thermometry) describes the process of measuring a current temperature for immediate or later evaluation. Datasets consisting of repeated standardized measurements can be used to assess temperature trends.
See also: Temperature, Temperature scale, Thermoscope, Thermometer and Pyrometer.

Attempts at standardized temperature measurement prior to the 17th century were crude at best. For instance, in 170 AD the physician Claudius Galenus[1] mixed equal portions of ice and boiling water to create a "neutral" temperature standard. The modern scientific field has its origins in the work of Florentine scientists of the 1600s, including Galileo, who constructed devices able to measure relative changes in temperature, though these were also confounded by changes in atmospheric pressure. These early devices were called thermoscopes. The first sealed thermometer was constructed in 1654 by the Grand Duke of Tuscany, Ferdinand II.[1] The development of today's thermometers and temperature scales began in the early 18th century, when Daniel Gabriel Fahrenheit produced a mercury thermometer and a temperature scale, adapting earlier designs by Ole Christensen Rømer. Fahrenheit's scale is still in use, alongside the Celsius and Kelvin scales.
Many methods have been developed for measuring temperature. Most of these rely on measuring some physical property of a working material that varies with temperature. One of the most common devices for measuring temperature is the glass thermometer. This consists of a glass tube filled with mercury or some other liquid, which acts as the working fluid. A temperature increase causes the fluid to expand, so the temperature can be determined by measuring the volume of the fluid. Such thermometers are usually calibrated so that one can read the temperature simply by observing the level of the fluid in the tube. Another type of thermometer, rarely used in practice but important from a theoretical standpoint, is the gas thermometer.
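Because the fluid's expansion is approximately linear over modest temperature ranges, such a calibration reduces to a two-point linear interpolation between fixed points such as the ice point and the steam point. The sketch below illustrates this under that linearity assumption; the column heights and the function names (calibrate, height_to_temperature) are hypothetical, chosen only for illustration.

```python
# Minimal sketch of a two-point liquid-in-glass thermometer calibration,
# assuming the working fluid expands linearly over the range of interest.
# The column heights below are illustrative values, not real measurements.

def calibrate(h_ice: float, h_steam: float, t_ice: float = 0.0, t_steam: float = 100.0):
    """Return a function mapping column height (mm) to temperature (deg C)."""
    slope = (t_steam - t_ice) / (h_steam - h_ice)  # degrees per mm of column

    def height_to_temperature(h: float) -> float:
        return t_ice + slope * (h - h_ice)

    return height_to_temperature

# Usage: calibrate against the ice point (20 mm) and steam point (220 mm),
# then read off an intermediate column height.
read_temp = calibrate(h_ice=20.0, h_steam=220.0)
print(read_temp(70.0))  # -> 25.0 deg C
```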
Other important devices for measuring temperature include thermocouples, thermistors, resistance temperature detectors (RTDs), and pyrometers.
One must be careful when measuring temperature to ensure that the measuring instrument (thermometer, thermocouple, etc.) is really at the same temperature as the material being measured. Under some conditions, heat flowing to or from the measuring instrument can cause a temperature gradient, so the measured temperature differs from the actual temperature of the system. In such a case the measured temperature will vary not only with the temperature of the system, but also with the heat transfer properties of the system.
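A simple way to see this effect is a lumped, steady-state model in which the sensor exchanges heat with the medium through one thermal conductance and leaks heat to the ambient environment (for example through its leads or stem) through another; the sensor then settles at a conductance-weighted average rather than the true medium temperature. The sketch below uses this model; the conductance values are illustrative assumptions, not data for any real probe.

```python
# Lumped steady-state model of thermometer stem/lead conduction error.
# At steady state, heat flowing into the sensor from the medium equals the
# heat leaking out through the leads, so the sensor settles at a
# conductance-weighted average of the two temperatures.

def sensor_reading(t_medium: float, t_ambient: float,
                   g_medium: float, g_leads: float) -> float:
    """Steady-state sensor temperature; conductances in W/K (assumed values)."""
    return (g_medium * t_medium + g_leads * t_ambient) / (g_medium + g_leads)

# A probe in a 100 deg C bath with 25 deg C ambient: when the coupling to the
# medium is only ~10x stronger than the leak through the leads, the sensor
# reads noticeably low.
print(sensor_reading(100.0, 25.0, g_medium=0.05, g_leads=0.005))  # ~93.2 deg C
```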
The thermal comfort that humans, animals and plants experience depends on more than the temperature shown on a glass thermometer. Relative humidity in the ambient air can induce more or less evaporative cooling; measurement of the wet-bulb temperature normalizes this humidity effect. Mean radiant temperature can also affect thermal comfort. The wind chill factor makes the weather feel colder under windy conditions than under calm conditions, even though a glass thermometer shows the same temperature: airflow increases the rate of heat transfer to or from the body, resulting in a greater heat loss (or gain) for the same ambient temperature.
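As a concrete illustration, the wind chill index used by the U.S. National Weather Service combines air temperature and wind speed into an equivalent "feels like" temperature. The sketch below implements that published empirical formula (temperature in deg F, wind speed in mph, valid roughly for temperatures at or below 50 deg F and winds of at least 3 mph).

```python
# North American wind chill index (NWS, 2001 formulation).
# Inputs: air temperature in deg F, wind speed in mph. The formula is
# empirical and intended for T <= 50 F and wind >= 3 mph.

def wind_chill(temp_f: float, wind_mph: float) -> float:
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

# At 20 F with a 15 mph wind, the air "feels" like roughly 6 F even though
# a glass thermometer still reads 20 F.
print(round(wind_chill(20.0, 15.0), 1))  # -> 6.2
```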
The theoretical basis for thermometers is the zeroth law of thermodynamics, which postulates that for three bodies A, B and C, if A and B are in thermal equilibrium and B and C are in thermal equilibrium, then A and C are in thermal equilibrium. B, of course, is the thermometer.
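Stated formally, writing A ∼ B for "A is in thermal equilibrium with B", the zeroth law says that ∼ is transitive; together with reflexivity and symmetry this makes thermal equilibrium an equivalence relation, and temperature can be viewed as a label for its equivalence classes. A minimal rendering in LaTeX notation:

```latex
% Zeroth law: thermal equilibrium (\sim) is transitive, so it partitions
% systems into equivalence classes, each labelled by a temperature.
A \sim B \;\wedge\; B \sim C \;\Longrightarrow\; A \sim C
```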
The practical basis of thermometry is the existence of triple point cells. Triple points are conditions of pressure and temperature at which three phases, for example solid, liquid and vapor, are simultaneously present. For a single component there are no degrees of freedom at a triple point, and any change in pressure or temperature results in one or more of the phases vanishing from the cell. Therefore, triple point cells can be used as universal references for temperature and pressure (see Gibbs' phase rule).
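The "no degrees of freedom" claim follows directly from Gibbs' phase rule, where F is the number of degrees of freedom, C the number of components and P the number of coexisting phases:

```latex
% Gibbs' phase rule: F degrees of freedom for C components and P phases.
F = C - P + 2
% At the triple point of a pure substance (C = 1, P = 3):
F = 1 - 3 + 2 = 0
```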
Under some conditions it is possible to measure temperature by direct use of Planck's law of black-body radiation. For example, the cosmic microwave background temperature has been measured from the spectrum of photons observed by satellites such as WMAP. In the study of the quark–gluon plasma produced in heavy-ion collisions, single-particle spectra sometimes serve as a thermometer.
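As a sketch of how a temperature falls out of a measured black-body spectrum, the code below evaluates Planck's law on a wavelength grid, locates the spectral peak, and inverts Wien's displacement law (lambda_max * T = b, with b about 2.898e-3 m K). A real analysis such as the CMB measurement fits the full spectrum, so this peak-finding shortcut is only illustrative.

```python
import numpy as np

# Physical constants (SI)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
kB = 1.380649e-23    # Boltzmann constant, J/K
b = 2.897771955e-3   # Wien displacement constant, m K

def planck(lam, T):
    """Spectral radiance of a black body at wavelength lam (m) and temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

# Simulate a spectrum at the CMB temperature, then recover the temperature
# from the location of its peak via Wien's displacement law.
T_true = 2.725                            # K
lam = np.linspace(0.1e-3, 5e-3, 20000)    # 0.1 mm to 5 mm
spectrum = planck(lam, T_true)
lam_peak = lam[np.argmax(spectrum)]       # peak near 1.06 mm for the CMB
print(b / lam_peak)                       # ~2.725 K
```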
During recent decades, many thermometric techniques have been developed. The most promising and widespread non-invasive thermometric techniques in a biotech context are based on the analysis of magnetic resonance images, computerized tomography images and echotomography. These techniques allow temperature to be monitored within tissues without introducing a sensing element.[2] In the field of reactive flows (e.g., combustion, plasmas), laser-induced fluorescence (LIF), coherent anti-Stokes Raman spectroscopy (CARS), and laser absorption spectroscopy have been exploited to measure temperature inside engines, gas turbines, shock tubes, synthesis reactors,[3] etc. The capabilities of such optical techniques include rapid measurement (down to nanosecond timescales) as well as the ability not to perturb the subject of measurement (e.g., a flame or shock-heated gases).
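One common way such spectroscopic measurements yield a temperature is two-line (ratio) thermometry: the relative strength of two transitions with different lower-state energies follows a Boltzmann factor, so the ratio of their measured absorbances or fluorescence signals can be inverted for T. The sketch below assumes an idealized model with a hypothetical line pair and a known reference ratio at a reference temperature; real instruments must also account for partition-function and line-shape corrections.

```python
import math

C2 = 1.4387769  # second radiation constant hc/kB, in cm K

def two_line_temperature(R: float, R0: float, T0: float, dE_cm: float) -> float:
    """Invert the Boltzmann ratio of two spectral lines for temperature.

    R     : measured intensity ratio of the two lines
    R0    : known ratio at reference temperature T0 (K)
    dE_cm : difference in lower-state energies E1'' - E2'' (cm^-1)
    Idealized: ignores partition-function and line-shape corrections.
    """
    inv_T = 1.0 / T0 - math.log(R / R0) / (C2 * dE_cm)
    return 1.0 / inv_T

# Hypothetical line pair separated by 800 cm^-1, referenced at 296 K.
# A ratio well above R0 means the higher-energy line has grown, i.e.
# the gas is much hotter than the reference condition.
print(round(two_line_temperature(R=0.8, R0=0.05, T0=296.0, dE_cm=800.0), 1))
# -> ~1032 K, a plausible combustion-range temperature
```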
The American Society of Mechanical Engineers (ASME) has developed two separate and distinct standards on temperature measurement, B40.200 and PTC 19.3. B40.200 provides guidelines for bimetallic-actuated, filled-system, and liquid-in-glass thermometers; it also provides guidelines for thermowells. PTC 19.3 provides guidelines for temperature measurement related to Performance Test Codes, with particular emphasis on basic sources of measurement error and techniques for coping with them.