In a wide-ranging interview about social media site LinkedIn's data center infrastructure and computing processes, John Dix of Network World explored the thermal management requirements of the server farms LinkedIn operates in Texas, Virginia, and Oregon, as well as in Singapore.
LinkedIn uses a liquid cooling solution for its data centers. (Wikimedia Commons)
Dix spoke with Sonu Nayyar, vice president of operations and IT, and Zaid Ali Khan, senior director of infrastructure engineering, and the conversation revealed several interesting tidbits about the company's cooling methods.
Khan noted that the Oregon data center, the newest of the three LinkedIn uses in the U.S. and online since November, is denser than the previous two and draws 7-9 kW of power per server rack. To dissipate the large amount of heat that comes with high rack density, LinkedIn turned to rear-door heat exchangers.
Nayyar explained, "We're basically precooling water outside and circulating it through these rear-door heat exchangers, which neutralizes the hot air right at the rack so there is no cold air/hot air-aisle containment necessary."
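As a rough illustration of the physics behind this approach (the rack power comes from the article, but the temperature rise is an assumed value, not a LinkedIn figure), the water flow a rear-door heat exchanger needs to absorb one rack's heat load can be estimated from the energy balance Q = ṁ · c_p · ΔT:

```python
# Back-of-envelope estimate of the water flow needed to carry away
# one rack's heat load through a rear-door heat exchanger.
# Illustrative numbers only.

RACK_POWER_W = 9_000   # upper end of the 7-9 kW per rack cited in the article
CP_WATER = 4186        # specific heat of water, J/(kg*K)
DELTA_T_K = 10         # assumed water temperature rise across the door

# Energy balance: Q = m_dot * c_p * delta_T  ->  m_dot = Q / (c_p * delta_T)
flow_kg_per_s = RACK_POWER_W / (CP_WATER * DELTA_T_K)
flow_l_per_min = flow_kg_per_s * 60   # 1 kg of water is roughly 1 liter

print(f"~{flow_kg_per_s:.2f} kg/s (~{flow_l_per_min:.0f} L/min) per rack")
```

Even at the dense end of LinkedIn's racks, the required flow works out to only on the order of a dozen liters per minute per door, which helps explain why the approach scales across a data hall.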
Dix asked whether there was concern about pumping water into the data centers, but Nayyar called the designs "robust" and insisted there is plenty of monitoring in place to avoid a serious issue.
Nayyar added, "And it's worth mentioning that our corporate goal is to be using 100% sustainable energy in the future. We're not there yet, obviously, but we're working towards it, and that's part of the reason why we chose Infomart in Oregon, because they have direct access to renewable energy."
Another innovation from LinkedIn is its open-source hardware initiative, Open19, which seeks to create a highly adaptable, cost-effective rack-level server standard with fast integration times that can serve a large number of standard data centers.
Nayyar said, "The goal is to reduce common components by 50%. Everything in a rack requires power and network, so we are consolidating anything that's a common component inside the rack by 50%."
Read the rest of the interview with LinkedIn at http://www.networkworld.com/article/3161184/data-center/linkedin-pumps-water-down-to-its-server-racks-uses-an-interesting-spine-and-leaf-network-fabric.html?page=2.