Power, Cooling, Security and Connectivity – Part I

Posted in Cloud technology, Enterprise business, Enterprise hosting

This is a guest blog by Stuart, Data Centre Project Manager

Having project managed our latest data centre build, I have seen first-hand the points that matter most when selecting a data centre. In this first instalment I will cover our approach to power and cooling.

Power
In this digital age, everything draws power: the mobile phone in your pocket, the screen you are reading this on, the internet connection you are viewing this website through, and the data centre in which the servers themselves are hosted.

While you can afford to let your laptop battery die, we cannot allow a power failure to affect our new data centre. For that reason, we have invested in three generators – the kind you would find bolted beneath an oil rig. We needed a design that would withstand any eventuality, and these generators fit the bill. If required, they will allow us to run for 48 hours with no incoming mains power, and to take one generator offline for refuelling without any loss of power to the data centre. Power is a serious consideration, and a heavy but essential investment.
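To illustrate the kind of calculation behind a 48-hour runtime claim, here is a minimal sketch. The tank size and burn rate below are hypothetical figures chosen purely for illustration, not the actual specification of our generators.

```python
# Hedged sketch: back-of-envelope generator runtime check.
# The tank size and burn rate are hypothetical illustrations,
# not the data centre's actual specification.

def runtime_hours(tank_litres: float, burn_rate_lph: float) -> float:
    """Hours a generator can run on a full tank at a given burn rate."""
    return tank_litres / burn_rate_lph

# Example: a 12,000-litre tank burning 250 litres/hour lasts 48 hours.
print(runtime_hours(12_000, 250))  # 48.0
```

In practice the burn rate varies with load, so a real runtime estimate would use the consumption curve at the site's actual draw rather than a single fixed figure.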

Our latest data centre has a 2 MVA capacity. This figure means little to most people, so let me put it into context. Using research from the University of Strathclyde, our new data centre – including its average daily electricity usage and its ‘contingency’ store – uses the energy equivalent of 63 family houses each year.
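The house-equivalence figure is simple arithmetic once you have an average draw. The sketch below shows the shape of the calculation; the average site draw and per-house consumption are assumptions for illustration only, not the measured values behind the 63-house figure above.

```python
# Hedged sketch: how an "energy equivalent of N family houses"
# figure can be derived. Both inputs are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365            # 8,760 hours
avg_draw_kw = 28.8                   # assumed average site draw (kW)
house_kwh_per_year = 4_000           # assumed household usage (kWh/yr)

annual_kwh = avg_draw_kw * HOURS_PER_YEAR
houses = annual_kwh / house_kwh_per_year
print(round(houses))  # 63
```

Note that average draw, not the 2 MVA nameplate capacity, drives annual consumption; capacity is sized for peak load plus headroom.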

This is a staggering figure that also generates an interesting set of challenges. Sourcing this level of power requires significant investment, and it can have a considerable environmental impact. To minimise that impact, we carefully monitor our emissions and processes in line with our ISO 14001 and PAS 2060 certifications, meaning that all clients hosting in our new data centre are carbon neutral.

Cooling
Cooling is almost as critical to the data centre as its power. As part of our build I consulted the leading suppliers of server equipment and based our temperature and humidity levels on their advised thresholds. The cooling system is carefully monitored and controlled to stay within those thresholds and maintain the recommended operating temperatures of the server equipment.

Different cooling methods can be used to keep servers cool, including warm aisle containment, free cooling and cold aisle containment, to name but a few! We currently use cold aisle containment. Refrigerated air is pumped under the floor and forced up into a contained corridor between the server rows; the servers draw this cold air through their chassis to cool themselves. The warm air exhausted from the servers is then re-cooled to start the process again. This approach works well for us and allows us to maintain optimal temperature and humidity within the data centre through our 19 CRAC (computer room air conditioning) units.
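A monitoring system enforces thresholds like these with a simple envelope check per sensor reading. The sketch below is illustrative: the 18–27 °C band follows the widely cited ASHRAE recommended envelope for data centres, while the humidity band is an assumed example, not our actual configured limits.

```python
# Hedged sketch: per-reading envelope check of the kind a CRAC
# monitoring system might run. Temperature band follows the ASHRAE
# recommended envelope; the humidity band is an assumed illustration.

def within_envelope(temp_c: float, rh_percent: float,
                    temp_range=(18.0, 27.0),
                    rh_range=(20.0, 60.0)) -> bool:
    """Return True if a reading falls inside the allowed envelope."""
    return (temp_range[0] <= temp_c <= temp_range[1]
            and rh_range[0] <= rh_percent <= rh_range[1])

print(within_envelope(22.5, 45.0))  # True  (in band)
print(within_envelope(30.0, 45.0))  # False (too warm)
```

A real system would also track trends and alert before a reading drifts out of band, rather than only flagging breaches after the fact.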

In my next instalment I’ll cover the importance of security and connectivity.

 
