The three wise men may only be celebrated once a year, but monitoring the environment of your data centre should be 24/7.
Environmental monitoring in data centres is often treated as synonymous with temperature monitoring. However, there are two equally important measurements: humidity and dew point. Each has its own qualities and importance when measured and understood.
Temperature: the degree or intensity of heat present in a substance or object, especially as expressed according to a comparative scale and shown by a thermometer or perceived by touch.
It is well known that the data centre, by nature of its design, houses the highest density of electrical equipment in an organisation. A consequence of this is that it is the one area with the potential to generate the most heat. Temperature is therefore by far the most commonly monitored environmental condition.
Monitoring temperature and measuring, in real time, the fluctuations associated with hot and cold spots means IT departments can take remedial action before a temperature-related incident causes problems.
Humidity: the amount of water vapour present in air, expressed as a percentage of the amount needed for saturation at the same temperature.
In simple terms, when humidity is too high, water condenses, leading to water damage and an unpleasant, sticky environment. When humidity is too low, static charge builds up in the environment and poses a risk to electronics.
Most of the confusion around humidity concerns how it is measured. Typically a data centre has at least one humidity sensor, but what it measures is “relative humidity”. The word “relative” refers to the water content of the air relative to the current temperature: hotter air is capable of holding more water. That is why it is important to measure temperature as well as humidity; one value without the other is meaningless, because the same air has a different relative humidity at different temperatures.
As hot air cools, its relative humidity increases until it reaches 100%. This is the point where water in the air will start condensing, also known as “dew point”.
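This effect can be sketched with the widely used Magnus-Tetens approximation for saturation vapour pressure. This is a minimal illustration, not how OmniWatch works internally; the function names and the example vapour pressure of 12 hPa are assumptions chosen for the demonstration:

```python
import math

def saturation_vapour_pressure(temp_c):
    """Saturation vapour pressure in hPa (Magnus-Tetens approximation,
    WMO-recommended coefficients over water)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(vapour_pressure_hpa, temp_c):
    """Relative humidity (%) of air holding the given water vapour pressure."""
    return 100.0 * vapour_pressure_hpa / saturation_vapour_pressure(temp_c)

# The same parcel of air (fixed water content, here ~12 hPa of vapour)
# reads a higher relative humidity the cooler it gets.
for temp in (27, 22, 18):
    print(f"{temp} °C -> {relative_humidity(12.0, temp):.0f}% RH")
```

Running the loop shows relative humidity climbing as the same air cools, which is exactly why a humidity figure on its own tells you very little.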
Dew point: the atmospheric temperature (varying according to pressure and humidity) below which water droplets begin to condense and dew can form.
In layman’s terms, the dew point is the temperature at which the air becomes saturated with water vapour. Further cooling will cause the water vapour to condense into liquid water. The dew point depends on both the humidity of the air and the temperature of the room; much like humidity, a higher value indicates more moisture in the air.
It’s unlikely, but if a data centre were cooled below the dew point temperature, it would get physically foggy!
Because dew point is an “absolute” measurement, it is arguably a better value than relative humidity for monitoring IT rooms.
As the temperature falls, the relative humidity will rise, because less water vapour is required to fully saturate the air. A high humidity reading implies that the dew point is close to the current temperature.
The dew point value will not exceed the temperature, as humidity cannot go higher than 100%, which indicates the air is fully saturated with water. If the moisture content of the air remains the same but temperature increases and humidity decreases then the dew point will remain constant.
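The relationship described above can be sketched with the same Magnus approximation. The coefficients 17.62 and 243.12 are the WMO-recommended values over water; the function names and example figures (22 °C at 50% RH, warmed to 27 °C) are illustrative assumptions, not Spook’s actual calculator:

```python
import math

# Magnus coefficients (WMO-recommended values over water).
B, C = 17.62, 243.12

def dew_point(temp_c, rh_percent):
    """Dew point in °C from air temperature and relative humidity."""
    gamma = math.log(rh_percent / 100.0) + B * temp_c / (C + temp_c)
    return C * gamma / (B - gamma)

def relative_humidity_at(temp_c, dew_point_c):
    """Relative humidity (%) of air at temp_c whose dew point is dew_point_c."""
    gamma = B * dew_point_c / (C + dew_point_c)
    return 100.0 * math.exp(gamma - B * temp_c / (C + temp_c))

# Air at 22 °C and 50% RH has a fixed dew point...
td = dew_point(22.0, 50.0)
# ...warm the same air to 27 °C: RH falls, but the dew point is unchanged.
rh_warmer = relative_humidity_at(27.0, td)
print(f"dew point: {td:.1f} °C, RH at 27 °C: {rh_warmer:.0f}%")
print(f"dew point recomputed at 27 °C: {dew_point(27.0, rh_warmer):.1f} °C")
```

Note that at 100% relative humidity the dew point equals the air temperature, matching the statement that the dew point can never exceed the temperature.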
Spook’s leading environmental monitoring service, OmniWatch, calculates all three measurements in real time. If you only know your temperature and humidity values, see our free dew point calculator.
Get in touch with Spook
Please contact us if you would like further information on how Spook can help with your environmental and power monitoring needs.