Humidity
Humidity is the water vapor content of air or other gases. It is usually expressed in one of three ways: absolute humidity (the mass of water vapor per unit volume of air or gas), dew point (the temperature, at a given pressure, at which the water vapor in a gas begins to condense into liquid), and relative humidity, or RH (the ratio of the moisture content of air to the saturation moisture level at the same temperature and pressure).

Thermal conductivity humidity sensors, also known as absolute humidity sensors, measure absolute humidity using two thermistors in a bridge circuit and can operate even at high temperatures or in polluted environments. Since the early 1960s, chilled-mirror instruments have been used to measure dew point, but the development of thin-film capacitive sensors now allows measurement of dew points as low as –40°F at far lower cost and with greater accuracy. Relative humidity was once determined by measuring the change in moisture absorption of silk, human hair, and later nylon and other synthetics; mechanical methods for measuring RH were introduced in the 1940s. More recently, polymer-based resistive and capacitive sensors have been developed.
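The three quantities above are interrelated through the vapor pressure of water. As a minimal sketch (not from the source text), the conversions can be illustrated in Python using the Magnus approximation for saturation vapor pressure; the coefficients 6.112, 17.62, and 243.12 are one common parameterization, valid roughly from –45 to +60 °C:

```python
import math

def saturation_vapor_pressure(t_c):
    """Saturation vapor pressure over water in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def relative_humidity(vapor_pressure_hpa, t_c):
    """RH (%): actual vapor pressure relative to saturation at the same temperature."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(t_c)

def absolute_humidity(t_c, rh_percent):
    """Absolute humidity in g/m^3, from the ideal gas law for water vapor."""
    e = rh_percent / 100.0 * saturation_vapor_pressure(t_c)  # actual vapor pressure, hPa
    return 216.7 * e / (273.15 + t_c)

def dew_point(t_c, rh_percent):
    """Dew point (deg C): temperature at which the current vapor pressure saturates."""
    e = rh_percent / 100.0 * saturation_vapor_pressure(t_c)
    gamma = math.log(e / 6.112)  # inverse of the Magnus expression
    return 243.12 * gamma / (17.62 - gamma)
```

For example, air at 20 °C and 50% RH holds roughly 8.6 g/m³ of water vapor and has a dew point near 9.3 °C.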
Sensor Types and Technologies
Recent developments in semiconductor technology have made possible humidity sensors that are highly accurate, durable, and cost-effective. The most common humidity sensors are capacitive, resistive, and thermal conductivity types. The following sections discuss how each sensor type is constructed and used to measure humidity.
Capacitive RH Sensors
Capacitive RH sensors are widely used in industrial, commercial, and weather telemetry applications. They dominate both atmospheric and process measurements and are the only full-range RH devices capable of operating accurately down to 0% RH. Because of their low temperature effect, they are often used over wide temperature ranges without active temperature