How Humid Is It? Simple Conversion Between Relative Humidity and Dew Point in Moist Air
Meteorologists indicate the amount of moisture in the atmosphere in many different ways. Two of these are the relative humidity and the dewpoint temperature. People are generally most familiar with relative humidity, and know that when it is high, say 90%, the air can begin to feel uncomfortable, especially when it is hot. Meteorologists, on the other hand, tend to prefer the dewpoint temperature, which is a better indicator of phenomena such as comfort levels, the altitudes of cumulus cloud bases, and the potential effectiveness of evaporative coolers.
To convert between the dewpoint and relative humidity, several approximations have been proposed over the last 200 years, but they all require a calculator or mathematical tables, and most involve exponentials or logarithms. Now Mark Lawrence of the Max Planck Institute for Chemistry has proposed a very easy rule of thumb for this conversion. Simply put, for every one degree Celsius that the dewpoint temperature lies below the air temperature, the relative humidity decreases by 5%, starting from a relative humidity of 100% when the dewpoint equals the air temperature. This holds well for moist air, that is, as long as the relative humidity is above about 50%.
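The rule of thumb can be checked against a conventional exponential formula. The sketch below compares it with a Magnus-type approximation; the Magnus coefficients (17.625 and 243.04) are one common choice from the literature, not necessarily the exact formula Lawrence uses as his benchmark:

```python
import math

def rh_from_dewpoint_rule(t_air_c, t_dew_c):
    """Rule of thumb: RH drops about 5% for each degree Celsius
    the dewpoint lies below the air temperature (best for RH > 50%)."""
    return 100.0 - 5.0 * (t_air_c - t_dew_c)

def rh_from_dewpoint_magnus(t_air_c, t_dew_c, a=17.625, b=243.04):
    """Magnus-type approximation for comparison. The coefficients a, b
    are a widely used pair, assumed here for illustration."""
    e_sat = math.exp(a * t_air_c / (b + t_air_c))  # scaled saturation vapor pressure
    e_act = math.exp(a * t_dew_c / (b + t_dew_c))  # scaled actual vapor pressure
    return 100.0 * e_act / e_sat

# Compare the two at 25 C air temperature for a few dewpoints
for td in (25, 22, 20, 15):
    print(td, round(rh_from_dewpoint_rule(25, td), 1),
          round(rh_from_dewpoint_magnus(25, td), 1))
```

Running this shows the linear rule tracking the exponential formula to within a few percent while the relative humidity stays above roughly 50%, and drifting apart below that, as the article notes.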
Putting this to use, it's easy to figure out the dewpoint, and thus the expected comfort level, directly from the relative humidity and the temperature: for instance, if it is 30 °C outside and the relative humidity is 75%, then the dewpoint temperature will be about 25 °C. It's also easy to see how much could be gained from evaporative cooling: in this case, at most about 5 °C. "Further, by adjusting the relationship a little to account for the effects of temperature, it is also simple to use the relative humidity to compute the altitude of cumulus cloud bases without a calculator to a good approximation, usually within about 10%", says Lawrence.
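The worked example above can be sketched in a few lines. The dewpoint function simply inverts the rule of thumb; the cloud-base function uses the standard lifting-condensation-level estimate of roughly 125 m per degree of dewpoint depression, which is the textbook approximation and not necessarily Lawrence's exact temperature-adjusted formula:

```python
def dewpoint_from_rh(t_air_c, rh_percent):
    """Invert the rule of thumb: each 5% of relative humidity below
    100% lowers the dewpoint by about 1 C (valid for RH above ~50%)."""
    return t_air_c - (100.0 - rh_percent) / 5.0

def cumulus_base_m(t_air_c, rh_percent):
    """Rough cumulus cloud-base height using the common ~125 m per
    degree of dewpoint depression rule (assumed here; the article's
    adjusted relationship may differ in detail)."""
    return 125.0 * (t_air_c - dewpoint_from_rh(t_air_c, rh_percent))

print(dewpoint_from_rh(30, 75))  # 25.0 C, matching the worked example
print(cumulus_base_m(30, 75))    # 625.0 m above ground
```

The 5 °C dewpoint depression in the example thus corresponds to a cloud base somewhere around 600 m, under the stated assumptions.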
Lawrence gives a historical perspective with a few anecdotes of early research in this field. One anecdote in particular, based on the earliest dewpoint measurements made by John Dalton around 1800, makes a nice science experiment to demonstrate the principles of humidity: students can figure out the dewpoint by starting with a glass full of water at room temperature, slowly adding ice cubes until dew just begins to form on the outside, and measuring the temperature of the water at that point. By multiplying the difference between the dewpoint and the air temperature by 5 and subtracting the result from 100%, they get the relative humidity, which can then be compared with the reading on a hygrometer.
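The experiment's arithmetic is a one-liner. In this sketch, the room temperature and water temperature are made-up illustrative readings, not values from the article:

```python
def rh_from_ice_glass(t_air_c, t_water_at_dew_c):
    """Dalton-style ice-glass experiment: the water temperature at the
    moment dew first forms on the glass approximates the dewpoint, so
    RH ~ 100 - 5 * (air temperature - dewpoint)."""
    return 100.0 - 5.0 * (t_air_c - t_water_at_dew_c)

# Hypothetical readings: room at 22 C, dew first appears at 14 C
print(rh_from_ice_glass(22, 14))  # 60.0 (% relative humidity)
```

A student could then compare this estimate against a hygrometer reading, as the article suggests.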