Data centre monitoring

'Over the weekend Spook alerted us to a series of temperature fluctuations in our data centre and because of OmniWatch we were able to deal with air conditioning issues long before they became a real problem.'

Henry Schein

A Fortune 500 Company and the only global dental medication distributor.

What is a data centre?

Data centres (also written 'datacentre', 'data center' or 'datacenter') are dedicated facilities, purpose-built to house computing equipment such as mainframes, servers, routers, switches and firewalls, as well as supporting components like backup equipment, fire suppression systems and air conditioning (AC) units.

Spook's experience of monitoring data centres

Spook's data centre customers range from legacy buildings to new builds. Their facilities are typically designed to a high standard and employ power and communications redundancy to ensure maximum uptime. Investment in equipment and building fabric is typically considerable.

Perpetual data collection

The majority of data centre IT equipment has some level of inherent monitoring. The problem is that each device has different functions and controls, which means data centre managers and IT teams need to learn different 'languages' for disparate equipment.

To complicate matters, any changes need to be micro-managed. Even a simple email address or phone number change can mean hundreds of updates need to be rolled out across the enterprise.

With Spook's monitoring service, OmniWatch, it's easy. Once the dynamic on-call rota is set up via the secure cloud-based console, any changes are applied centrally and can be amended at any time, with ease.

Monitor, measure and control

By monitoring key environmental conditions, power metrics and third-party equipment, a data centre can not only remedy any potentially threatening change in a monitored condition at the earliest stage, but also use the collected data intelligently for trend analysis, planned maintenance, load balancing and measuring key performance indicators (KPIs).

Warranty protection

There are many different types of equipment in a data centre, each with its own dependencies.

Manufacturers of IT equipment issue guidance on the optimum environmental operating conditions equipment must be kept within in order to maintain warranties. If these are not adhered to, warranties inevitably get questioned and in some circumstances are in danger of being voided.

Spook customers are able to easily stay on top of this and produce evidence based reports at the touch of a button.

Data Centre Infrastructure Management (DCIM)

DCIM started out as a component of building information modelling (BIM) software, which is used by facilities managers (FM) to create building digital schematic diagrams.

DCIM tools bring the same capabilities to data centres, allowing administrators to collate, store and analyse data related to power and cooling in real time.

The topic of DCIM touches many departments and often spans a data centre's availability and IT reliability requirements. It can identify and eliminate sources of risk to increase the availability of critical IT systems, as well as identifying interdependencies between the IT infrastructure teams and the FM department.

As Spook's OmniWatch service has evolved in line with customer requests, DCIM has grown organically as a natural by-product. OmniWatch presents a joined-up, easy-to-use DCIM solution that is modern and functionally rich.

View more about facilities monitoring

Data centre environmental monitoring

Ensuring the temperature of IT rooms and IT racks is at a level that will not compromise the efficient running of the equipment within them is invaluable. Humidity and dew point are important too, and even air-flow monitoring can be a key indicator that a room is behaving unusually.
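Dew point is not usually measured directly; it is derived from temperature and relative humidity. A minimal sketch using the widely used Magnus approximation (the coefficients and function name here are illustrative, not part of the OmniWatch service):

```python
import math

def dew_point(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (degrees C) via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients, valid roughly -45 to 60 C
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# e.g. a 25 C room at 60% RH has a dew point of roughly 16.7 C
```

A derived figure like this is what gets compared against equipment guidance such as the class limits below.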

Monitoring water ingress is vital, both to detect leaks from air conditioning units and to catch flooding from areas of the building that may not be IT related, such as toilet blocks, water points and pipework, as well as natural flooding.

Many potential water ingress points are hidden under raised floors, in false ceiling voids and in adjacent parts of the building, not necessarily directly associated with the data centre.

Class | Temperature (°C) | Humidity range                        | Max dew point (°C)
A1    | 15 to 30         | -12°C DP & 8% RH to 17°C DP & 80% RH  | 17
A2    | 10 to 35         | -12°C DP & 8% RH to 21°C DP & 80% RH  | 21
A3    | 5 to 40          | -12°C DP & 8% RH to 24°C DP & 85% RH  | 24
A4    | 5 to 45          | -12°C DP & 8% RH to 24°C DP & 90% RH  | 24
B     | 5 to 35          | 8°C to 28°C DP & 80% RH               | 28
C     | 5 to 40          | -8°C to 28°C DP & 80% RH              | 28

ASHRAE issued its first thermal guidelines for data centres in 2004, recommending that temperatures be maintained between 20 and 25°C. It has since stated that suitably rated equipment can operate across a much wider range of 5 to 45°C, reflecting a shift in emphasis from pure reliability and uptime towards a more energy-saving and green approach to equipment.

Which of the ASHRAE guidelines should be adhered to depends entirely on the rating of the equipment. Older and lower-rated equipment should always be considered the baseline for any targets, as aiming for the higher temperature allowance of new equipment risks damaging the older devices.
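Checking a live reading against a class envelope is straightforward; a minimal sketch using the simplified temperature ranges and maximum dew points from the table above (the data layout and function name are illustrative only):

```python
# Simplified envelopes from the ASHRAE class table:
# class -> (min temp C, max temp C, max dew point C)
ASHRAE_CLASSES = {
    "A1": (15, 30, 17),
    "A2": (10, 35, 21),
    "A3": (5, 40, 24),
    "A4": (5, 45, 24),
}

def within_class(cls: str, temp_c: float, dew_point_c: float) -> bool:
    """True if the reading sits inside the class's simplified envelope."""
    lo, hi, max_dp = ASHRAE_CLASSES[cls]
    return lo <= temp_c <= hi and dew_point_c <= max_dp
```

In line with the baseline advice above, a mixed estate would be checked against the strictest class present (e.g. A1), not the most permissive.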

Data Centre power monitoring

Monitoring and measuring power in a data centre has become fundamental. Whether the requirement is to monitor key power breakers to the data centre via appropriate metering (which provides real-time power load data), or rack-level monitoring of the power metrics of a power distribution unit (PDU monitoring), each gives valuable power metrics that can be taken into account when installing new equipment or building appropriate redundancy into the location.

Modern data centres invest in infrastructure to ensure seamless power provision to equipment should the utility provider cease to deliver mains power. This is known as 'power redundancy'.

They do this by investing in and deploying Uninterruptible Power Supplies (UPS monitoring) and installing back-up generators (generator monitoring).

Measuring key power metrics and monitoring equipment status levels means early warnings of power-related issues can be managed efficiently.
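At its simplest, an early-warning check compares live readings against a configured limit; a minimal sketch with hypothetical rack names and figures (the names, values and threshold are assumptions for illustration):

```python
# Hypothetical per-rack PDU load readings, in kW
readings = {"rack-01": 3.2, "rack-02": 5.9, "rack-03": 4.8}
limit_kw = 5.5  # assumed alert threshold for this sketch

def over_limit(readings: dict[str, float], limit: float) -> list[str]:
    """Return the racks whose measured load exceeds the limit."""
    return [rack for rack, kw in readings.items() if kw > limit]

# over_limit(readings, limit_kw) flags rack-02 only
```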

PUE | DCiE | Level of efficiency
3.0 | 33%  | Very inefficient
2.5 | 40%  | Inefficient
2.0 | 50%  | Average
1.5 | 67%  | Efficient
1.2 | 83%  | Very efficient

PUE and DCiE are efficiency benchmarks to help you understand and manage your IT load.

Data Centre power efficiency monitoring

The industry-standard measures of these efficiencies are Power Usage Effectiveness (PUE) and Data Centre Infrastructure Efficiency (DCiE).

Monitoring PUE and DCiE in real time ensures a business can measure, and create a clear audit trail of, efficiencies as they are gained. Spook's free PUE/DCiE calculator will give you an insight into how efficient your power usage is.

The objective is to ensure that the maximum possible percentage of power utilised in a data centre is actually used by IT equipment and the minimum by supporting equipment such as AC units.
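Both benchmarks are simple ratios of the same pair of meter readings: total facility power versus power delivered to IT equipment. A minimal sketch (variable names are illustrative):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE: total facility power divided by IT equipment power (lower is better)."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """DCiE: IT power as a percentage of total facility power (higher is better)."""
    return it_equipment_kw / total_facility_kw * 100.0

# e.g. a site drawing 500 kW overall with 250 kW reaching the IT load:
# PUE = 2.0, DCiE = 50% ("Average" in the table above)
```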

Clearly, factors such as antiquated or unreliable AC units will have an adverse effect on power efficiency.

However, knowing the power efficiency figures in real time and being able to review them over time allows customers to highlight areas for improvement and achieve a better overall power efficiency rating.

View more about power efficiency

Spook is an approved Endorser of the European Code of Conduct for Data Centres.

The Data Centres Energy Efficiency CoC has been established in response to increasing energy consumption in data centres and the need to reduce the related environmental, economic and energy supply security impacts.

The aim is to inform and stimulate data centre operators and owners to reduce energy consumption in a cost-effective manner without hampering the mission critical function of data centres.

The Code of Conduct aims to achieve this by improving understanding of energy demand within the data centre, raising awareness, and recommending energy efficient best practice and targets.

Read more: 2019 best practice guidelines