The University of Birmingham was looking for successor technology to its iDataPlex deployment. With modern high-density solutions and limited air-cooling capacity per rack, the university would have been able to install only two chassis per rack. It turned to Lenovo for a warm Water Cooling Technology (WCT) solution that allows six chassis per rack, saving valuable data centre floor space and reducing the energy required to cool the systems, increasing the university's efficiency, from both an IT and a facilities perspective, by 20-25%. The project, the first of its kind in the UK, is projected to reduce cooling energy by up to 83% compared with air cooling alone, and adds only 4.5kW of heat per rack to the data centre.
The water cooling system replaces the typical system fans with internal and external manifolds. Water is delivered directly into the rear of the server to cool it, allowing a simple blind-dock, quick-release mechanism for removing or installing the servers. Internally, water is pumped through direct-attached heat sinks on the CPUs, dual in-line memory modules (DIMMs), on-board components and I/O, which transfer heat into the water before it is pumped away. Water typically enters the system at up to 45°C; the heat transfer from the system components raises the water temperature by approximately 10°C while cooling the system. Lenovo also partnered with Mellanox and OCF to deploy the project.
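As a rough illustration of the heat-transfer arithmetic behind these figures (the specific-heat constant, the 30 kW rack load taken from the quote below, and the function name are assumptions for this sketch, not Lenovo or university figures), the water flow needed to carry a given rack load at the ~10°C rise described above follows from Q = ṁ·c·ΔT:

```python
# Back-of-the-envelope sketch of the warm-water cooling arithmetic.
# Assumptions (illustrative only): a 30 kW rack load and the ~10 degC
# inlet-to-outlet water temperature rise described above.

SPECIFIC_HEAT_WATER = 4186.0  # J/(kg.K), approximate for liquid water

def required_flow_kg_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Mass flow needed to carry heat_load_w at a delta_t_k temperature rise,
    from Q = m_dot * c * delta_T."""
    return heat_load_w / (SPECIFIC_HEAT_WATER * delta_t_k)

heat_load_w = 30_000.0   # 30 kW rack, per the quote below
delta_t_k = 10.0         # ~10 degC water temperature rise

m_dot = required_flow_kg_per_s(heat_load_w, delta_t_k)
litres_per_min = m_dot * 60  # 1 kg of water is roughly 1 litre

print(f"{m_dot:.2f} kg/s (~{litres_per_min:.0f} L/min per rack)")
# -> 0.72 kg/s (~43 L/min per rack)
```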
The system took nine months to develop, from initial demonstration through system design and testing, to validate all hardware for use with warm Water Cooling Technology. The system will eventually be connected to the University’s central ‘BlueBear’ HPC service, and the technology will be used to power the University’s private research cloud deployment.
Simon Thompson, Research Computing Specialist at Birmingham, comments:
“We’re constantly experimenting with new technologies to improve the service we deliver to users. This project will allow us to manage 85 per cent of the heat recovery from a single 30 kW rack, leaving just 4.5 kW of unrecovered heat in the data centre. In addition, we are looking at adding a rear door heat exchanger to the system, which will capture this remaining heat and mean that we will need almost no air cooling for HPC research equipment. Although this has been a small deployment, we’ve been working with Lenovo and Mellanox to qualify new technology into the systems to meet our requirements.”
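For context, the figures in the quote are self-consistent with the opening paragraph: recovering 85 per cent of a 30 kW rack load leaves 30 × 0.15 = 4.5 kW of unrecovered heat. A minimal check of that arithmetic:

```python
# Sanity check of the heat-recovery figures quoted above.
rack_load_kw = 30.0        # single rack load, per the quote
recovery_fraction = 0.85   # 85 per cent heat recovery

unrecovered_kw = rack_load_kw * (1 - recovery_fraction)
print(f"Unrecovered heat: {unrecovered_kw:.1f} kW")  # -> 4.5 kW
```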