Saturday, April 2, 2016

Facebook Data Center

This video discusses Facebook's data center. Handling the profiles of 1 in 7 people on earth requires a staggering scale of devices and technology. The video explains, in rich detail, the data center of this $104 billion company, which grows by roughly 100 million users every six months, and the challenges that come with it. Handling such a huge amount of data requires a monster of a data center; the first one, opened in Prineville, OR, is the size of three football stadiums put together (essentially a memory chip the size of 300,000 sq ft) and cost hundreds of millions of dollars. It moves data between your computer and the internet at the speed of light over 21 million feet of fiber-optic cable.
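As a rough back-of-envelope sketch (my own illustration, not from the video): light in glass fiber travels at roughly two-thirds of its speed in a vacuum, which is enough to see why these transfers take only milliseconds. The velocity factor and distances below are illustrative assumptions.

```python
# Sketch: why fiber-optic transfer happens "in milliseconds".
SPEED_OF_LIGHT_VACUUM_MPS = 299_792_458  # meters per second
FIBER_VELOCITY_FACTOR = 0.67             # typical assumption for glass fiber

def fiber_latency_ms(distance_m: float) -> float:
    """One-way propagation delay over a fiber run, in milliseconds."""
    speed_in_fiber = SPEED_OF_LIGHT_VACUUM_MPS * FIBER_VELOCITY_FACTOR
    return distance_m / speed_in_fiber * 1000

# 21 million feet of cable is about 6.4 million meters.
cable_m = 21_000_000 * 0.3048
print(f"Traversing all 21M ft of fiber: {fiber_latency_ms(cable_m):.1f} ms")

# A single rack-to-rack hop is far shorter, e.g. an assumed 100 m:
print(f"Typical 100 m hop: {fiber_latency_ms(100) * 1000:.2f} microseconds")
```

Even an end-to-end trip over every foot of that cable would take only about 32 ms, and any real path through the facility is a tiny fraction of that.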



Ken Patchett, GM of the data center, clearly explains how the facility works in this video. The data center ensures that data travels from the internet, through the data center, to your laptop in milliseconds. The video offers a fine illustration of the seemingly never-ending rows of server racks. The facility has 30 megawatts of electricity on tap so that it never runs out of power. Huge diesel generators serve as backup; they kick in immediately whenever utility power is lost, preventing outages and the data loss they would cause. A massive, state-of-the-art seven-room rooftop air-handling system (a chiller-less design) cools the heat generated by the servers using natural air conditioning, acting as a heat sink. Cool air from the high plains of Oregon is drawn in and mixed with warm return air to regulate the temperature of the server area, keeping the servers from overheating. On hot days, the Prineville data center is designed to use evaporative cooling instead of a chiller system.
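The air-mixing idea can be sketched in a few lines. The following Python is a minimal, hypothetical model of a chiller-less plant's decision: the target temperature, the evaporative-cooling cutover point, and the linear mixing model are my assumptions, not Facebook's actual control parameters.

```python
# Hypothetical thresholds for illustration only.
TARGET_SUPPLY_TEMP_C = 24.0    # assumed target server-intake air temperature
EVAP_COOLING_CUTOVER_C = 29.0  # assumed outdoor temp above which misting starts

def choose_cooling_mode(outdoor_temp_c: float) -> str:
    """Pick a cooling strategy the way a chiller-less plant might."""
    if outdoor_temp_c >= EVAP_COOLING_CUTOVER_C:
        # Too hot outside: evaporate water into the airstream
        # instead of running a chiller.
        return "evaporative"
    if outdoor_temp_c < TARGET_SUPPLY_TEMP_C:
        # Cool high-plains air: blend in warm return air from the
        # servers until the mix reaches the target temperature.
        return "mix_outdoor_with_return"
    return "outdoor_only"

def mixing_fraction(outdoor_temp_c: float, return_air_temp_c: float,
                    target_c: float = TARGET_SUPPLY_TEMP_C) -> float:
    """Fraction of warm return air to blend in (simple linear mix model)."""
    if return_air_temp_c <= outdoor_temp_c:
        return 0.0
    frac = (target_c - outdoor_temp_c) / (return_air_temp_c - outdoor_temp_c)
    return max(0.0, min(1.0, frac))

print(choose_cooling_mode(10.0))                # mix_outdoor_with_return
print(f"{mixing_fraction(10.0, 35.0):.0%} return air in the blend")  # 56%
```

On a 10 C Oregon morning with 35 C server exhaust, the model blends 56% warm return air into the supply, which captures the heat-sink behavior described above.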



To keep up with the growth in users and data, thousands of servers and memory units are brought in daily. Facebook uses both Intel and AMD chips with custom-made motherboards and chassis, allowing the use of larger heat sinks and fans to improve cooling efficiency. When a server fails, technicians receive a notice identifying which one failed, but finding it in such a massive place is still difficult. Facebook has spent more than $1 billion on the infrastructure and is still running out of space to accommodate its growth. At this rate of growth in online activity, it will need to expand even faster.
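As a hedged illustration of that failure-notification idea: each server could report a heartbeat, and a monitor maps any silent server back to a physical rack location so technicians know where to walk. Every name, coordinate, and timeout below is a hypothetical assumption, not Facebook's actual tooling.

```python
import time
from dataclasses import dataclass

HEARTBEAT_TIMEOUT_S = 30  # assumed: a server is "failed" after 30s of silence

@dataclass
class Server:
    server_id: str
    building: str
    row: int
    rack: int
    slot: int
    last_heartbeat: float  # unix timestamp of the last heartbeat received

def find_failed_servers(fleet: list[Server], now: float) -> list[str]:
    """Return human-readable locations of servers that stopped reporting."""
    notices = []
    for s in fleet:
        if now - s.last_heartbeat > HEARTBEAT_TIMEOUT_S:
            notices.append(
                f"{s.server_id} down -> building {s.building}, "
                f"row {s.row}, rack {s.rack}, slot {s.slot}"
            )
    return notices

now = time.time()
fleet = [
    Server("web-0001", "PRN1", row=12, rack=4, slot=17, last_heartbeat=now - 5),
    Server("web-0002", "PRN1", row=33, rack=9, slot=2, last_heartbeat=now - 120),
]
for notice in find_failed_servers(fleet, now):
    print(notice)  # only web-0002 is flagged
```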

Reference: https://www.youtube.com/watch?v=Y8Rgje94iI0
