Dew Point and Humidity



Relative Humidity (RH) is a ratio, expressed in percent, of the amount of atmospheric moisture present relative to the amount that would be present if the air were saturated. Since the latter amount depends on temperature, relative humidity is a function of both moisture content and temperature. Relative humidity is derived from the associated Temperature and Dew Point for the indicated hour.

Relative humidity is the term we normally use when discussing atmospheric moisture. It is a relative figure; that is, it tells the current humidity relative to the amount of moisture the atmosphere can support at a given temperature. If the relative humidity is 50%, the air is holding, at that temperature, half the amount of moisture it is capable of holding. Any additional moisture would condense out of the air as dew.

The dew point is the temperature at which dew will begin to drop out of the air. In effect, as an air mass cools it is able to hold less and less water; the temperature at which the first moisture begins to drop out as dew is the dew point. It is indeed related to the relative humidity of the air mass. You can picture this over a given evening: if the relative humidity were 50% and the temperature began dropping, the relative humidity would begin to increase, even without any additional moisture being added. This is simply because the atmosphere can hold less moisture at the lower temperature, so what it was holding, and still holds, represents a greater (higher) percentage of what it is capable of holding. I would agree that your way of looking at the relationship is a more easily understood interpretation. Any discussion should include dew point, relative humidity, and temperature; they are all interrelated.
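
To make that evening concrete, here is a minimal Python sketch (my illustration, not from the original text) that estimates relative humidity from temperature and dew point using the common Magnus approximation for saturation vapor pressure; the 10 C dew point and the list of cooling temperatures are made-up values.

    import math

    def saturation_vapor_pressure(t_c):
        """Saturation vapor pressure in hPa at temperature t_c (deg C), Magnus approximation."""
        return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

    def relative_humidity(t_c, td_c):
        """Relative humidity (%) from air temperature and dew point, both in deg C."""
        return 100.0 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

    # Moisture content held fixed (dew point 10 C) while the evening air cools:
    for t in (20, 16, 12, 10):
        print(f"T = {t:2d} C, Td = 10 C -> RH = {relative_humidity(t, 10):5.1f} %")
    # The printed RH climbs toward 100 % as the temperature falls to the dew point,
    # even though no moisture was added.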

Dew Point Temperature (Td) is a measure of atmospheric moisture. It is the temperature to which air must be cooled in order to reach saturation (assuming air pressure and moisture content are constant).

Dew Point is the temperature to which the air must be cooled before it becomes saturated. When it is saturated, it is holding all the water vapor it is able to hold. As you may know, warm air can hold more water than cold air; or, said another way, as air cools it loses its ability to hold water. To what temperature would we have to cool the air in your back yard before it is holding all the water it can hold? That's the dew point. If the air continues to cool, the water vapor in it has to condense (change into a liquid) or deposit (change directly into a solid). When it condenses it may form dew, clouds, fog, or rain; when it deposits it forms frost or snow. Even though we use the same degrees to describe temperature and dew point, they are really different variables used to describe the atmosphere.
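
Running the same Magnus approximation in reverse gives a rough answer to the back-yard question; the sketch below (again my illustration, not part of the original answer) computes the dew point from a temperature and a relative humidity.

    import math

    def dew_point(t_c, rh_pct):
        """Approximate dew point (deg C) from temperature (deg C) and relative
        humidity (%), by inverting the Magnus formula for vapor pressure."""
        e = rh_pct / 100.0 * 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))  # actual vapor pressure, hPa
        ln_ratio = math.log(e / 6.112)
        return 243.5 * ln_ratio / (17.67 - ln_ratio)

    print(dew_point(25.0, 50.0))  # about 14: air at 25 C and 50 % RH must cool roughly 11 degrees to saturate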

The dew point is the temperature to which the air must be COOLED in order to become saturated, that is, to reach 100 percent relative humidity. The dew point is an indication of how much water vapor is in the air: the more water vapor, the closer the dew point is to the temperature. When the air becomes saturated, the dew point and the temperature are the same. So it is really the moisture content (the humidity) that sets the dew point, rather than the temperature.
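
To put rough numbers on that (using the Magnus sketch above; the figures are illustrative): at 25 C the air can hold about 31.7 hPa of water vapor. With 8 hPa of vapor present the dew point is roughly 4 C, about 21 degrees below the temperature; with 24 hPa it is roughly 20 C, only about 5 degrees below; and with the full 31.7 hPa the dew point equals the temperature and the air is saturated.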

Dew point and humidity are directly related, and the dew point is used to compute the RELATIVE humidity. Some definitions are in order here for a clear understanding of the measurement of moisture (water vapor) in air. RELATIVE humidity is a measure of the actual amount of water vapor in the air compared to the total amount of vapor that can exist in the air at its current temperature, and is expressed as a percentage. Other ways to express the moisture in air are by its SPECIFIC humidity and its ABSOLUTE humidity. These last two measurements are usually of interest only to scientists.
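
For completeness, here is a minimal sketch of all three measures, assuming the Magnus approximation used above and standard textbook constants; the formulas for specific and absolute humidity below are standard ones that the original text only names, and the example temperature, dew point, and sea-level pressure are arbitrary.

    import math

    R_V = 461.5      # gas constant for water vapor, J/(kg K)
    EPSILON = 0.622  # ratio of molar masses, water vapor to dry air

    def vapor_pressure(td_c):
        """Actual vapor pressure (hPa) from the dew point (deg C), Magnus approximation."""
        return 6.112 * math.exp(17.67 * td_c / (td_c + 243.5))

    def relative_humidity(t_c, td_c):
        """RELATIVE humidity: actual vapor pressure as a percentage of saturation."""
        return 100.0 * vapor_pressure(td_c) / vapor_pressure(t_c)

    def absolute_humidity(t_c, td_c):
        """ABSOLUTE humidity: mass of water vapor per volume of air, in g/m^3."""
        e_pa = vapor_pressure(td_c) * 100.0  # hPa -> Pa
        return 1000.0 * e_pa / (R_V * (t_c + 273.15))

    def specific_humidity(td_c, p_hpa=1013.25):
        """SPECIFIC humidity: mass of water vapor per mass of moist air, in g/kg."""
        e = vapor_pressure(td_c)
        return 1000.0 * EPSILON * e / (p_hpa - (1.0 - EPSILON) * e)

    print(relative_humidity(20.0, 10.0))  # ~52 %
    print(absolute_humidity(20.0, 10.0))  # ~9 g/m^3
    print(specific_humidity(10.0))        # ~7.6 g/kg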
