Transcript for:
Facebook's Infrastructure and User Management

Coming up on How Do They Do It, the $100 billion website. How does Facebook store the profiles of more than one in seven people on Earth? Facebook is nothing short of a phenomenon. The world's largest social networking site, it was valued at over $100 billion and boasts over a billion users.

But handling the profiles, photos, and messages of more than one in seven people on the planet calls for technology on a simply staggering scale. So, how do they do it? Northern California, home of Facebook.

Invented by Harvard students in 2004, this social networking site lets your friends know what you're doing at the click of a mouse. Eight years after this company's birth, it floated on the stock market with a value of an incredible... 104 billion dollars. You can tell it was set up by students because in here they do things a bit differently. Graffiti and touchscreen technology vie for wall space.

Vending machines serve up boxes of tech instead of cans of soda. And open bars and video games entertain a staff with an average age of just 26. But this quirky environment seems to work. Their website has clicked with the world and has been growing at a rate of 100 million new users every six months.

Handling the personal details of that many people is a massive challenge. We have one engineer here for every 1 million users on the site. We operate at just an unprecedented scale.

There's no user guide for how this works because no website has ever handled this many visitors before. When you've got more users than there are cars in the world, one of the biggest problems is storage. The storage in your laptop could fit in your hand. Here, they need something bigger. In Prineville, Oregon, the landscape is dominated by a monster data center of 300,000 square feet.

It's like having a memory chip the size of three football fields. And it costs hundreds of millions of dollars to build. This is where your information is stored.

In cutting-edge servers and massive memory banks, with data flying between them at the speed of light through over 21 million feet of fiber optic cable. Ken Patchett is the General Manager. When you type in Facebook.com, your request goes to the open internet, and that request lands right here. And from right here, we request from one of the Facebook servers your profile and all the information associated with it.

Our data centers work and compile all that information and then send it back to you right across the open internet again, and all of that happens in milliseconds. If you've ever wanted to visualize the internet, then these never-ending rows of servers are a fine illustration.
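
As a rough sketch of the round trip Patchett describes, the exchange looks something like the toy model below. The profile data, function names, and timing are invented for illustration, not Facebook's actual systems; in reality the request and response cross the open internet and pass through many machines on the way.

```python
# A toy model of the round trip described above. The profile data, the
# lookup, and the timing are invented for illustration; the real request
# crosses the open internet and is answered by one of Facebook's servers.
import time

# Stand-in for the information a Facebook server holds about a profile.
PROFILE_STORE = {
    "alice": {"name": "Alice", "friends": 312, "photos": 1045},
}

def handle_request(username: str) -> dict:
    """Data-center side: look up the requested profile and everything tied to it."""
    return PROFILE_STORE.get(username, {})

def fetch_profile(username: str) -> tuple[dict, float]:
    """Browser side: send the request and time the round trip in milliseconds."""
    start = time.perf_counter()
    profile = handle_request(username)   # in reality this hop goes out over the open internet
    elapsed_ms = (time.perf_counter() - start) * 1000
    return profile, elapsed_ms

profile, ms = fetch_profile("alice")
print(f"Profile came back in {ms:.3f} ms: {profile}")
```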

Some people consider the internet a cloud, as if it's floating around in the sky, but it's not. It's a real physical thing. The internet is a physical building, just like this, interconnected through miles and miles of fiber and cable all throughout the world. And all of these buildings can talk to each other and share data back and forth.

This place makes a supercomputer look like a pocket calculator. They've got 30 megawatts of electricity on tap, so they should never run out of power. But just like when you forget to back up work on your PC, a power failure could be disastrous.

So just in case, huge diesel generators stand by as a backup. Because a world without social networks would be unimaginable to millions of teenagers. In the event we lose main power to the building, these generators kick in right away.

So we've got to keep a real close eye on these guys. They generate up to 3 megawatts worth of power each, and we've got 14 of them on this site.
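
Taking the quoted figures at face value, the standby fleet more than covers the site's supply. A quick sketch of the arithmetic; the actual load and redundancy scheme aren't detailed in the programme.

```python
# Quick arithmetic on the quoted figures: 14 generators at up to 3 MW each
# against the 30 MW the site has on tap. The real load and redundancy
# scheme aren't detailed in the programme; this is just the headline math.
generators = 14
mw_per_generator = 3
site_supply_mw = 30

backup_capacity_mw = generators * mw_per_generator      # 42 MW of standby power
headroom_mw = backup_capacity_mw - site_supply_mw       # 12 MW of margin

print(f"{backup_capacity_mw} MW of backup vs {site_supply_mw} MW of supply "
      f"-> {headroom_mw} MW of headroom")
```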

All this technology generates an incredible amount of heat. Without constant cooling, these servers would quickly burn out. Your computer is cooled by a heat sink not much bigger than a matchbox. The Facebook computers have this. This massive seven-room rooftop is a state-of-the-art natural air conditioning system. Cool air from the high plains of Oregon is sucked in, filtered, and mixed with warm air to regulate the temperature.

A fine mist controls the humidity, and the chilled air then sinks to the back of the servers, stopping them from overheating. Finally, any excess warm air is pushed out by monster fans hundreds of times bigger than the ones you'll find in your lounge. More fans will soon be needed because social networking is heating up.
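
The mixing step described above can be pictured as a simple energy balance: blend cool outside air with warm air returning from the servers until the supply hits a target temperature. The temperatures and the 24 °C target in the sketch below are assumed values for illustration, and the model ignores the humidity control handled by the misting stage.

```python
# A simplified picture of the mixing step: blend cool outside air with warm
# return air from the servers until the supply hits a target temperature.
# The temperatures and the 24 °C target below are assumed for illustration,
# and the model ignores the humidity control handled by the misting stage.
def outside_air_fraction(t_outside: float, t_return: float, t_target: float) -> float:
    """Fraction of outside air needed so the mixed air lands on t_target."""
    if t_outside == t_return:
        return 1.0
    fraction = (t_return - t_target) / (t_return - t_outside)
    return min(max(fraction, 0.0), 1.0)   # clamp to a physically possible mix

# Example: 7 °C high-plains air, 35 °C air coming back off the servers.
f = outside_air_fraction(t_outside=7.0, t_return=35.0, t_target=24.0)
print(f"Blend about {f:.0%} outside air with {1 - f:.0%} recirculated warm air")
```

On a colder day the fraction of outside air drops and more warm air is recirculated; on a mild day the system runs almost entirely on outside air, which is the appeal of building on the high plains in the first place.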

Almost 600 million online addicts log in every day. That's almost twice the population of the United States. And the site is still growing.

To keep up, thousands of new servers arrive here daily. Tom Furlong is in charge of the data centers. When I started four and a half years ago, we had 27 million users, a few thousand servers.

These days we receive thousands of servers on a given day and I, you know, barely note the event. These trucks aren't delivering food. They're bringing in more and more computer memory.

Most of us are familiar with gigabytes or even terabytes. Here, they have petabytes. We have over 100 petabytes of photos and videos that are growing every single day, and that is a fantastic amount of information.

That's 100,000 times as much data as the hard drive of a high-end PC. Every server rack contains 500 terabytes. That's over 130 billion times more memory than the first Apple personal computer.
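
Both comparisons check out on a back-of-the-envelope basis if you assume a 1-terabyte drive for the high-end PC of the time and the 4 kilobytes of standard memory in the original Apple I.

```python
# Back-of-the-envelope check of the two comparisons, assuming a 1 TB drive
# for the "high-end PC" and the 4 KB of standard memory in the original
# Apple I. Binary units are used throughout.
KB = 1024
TB = 1024**4
PB = 1024**5

photo_store = 100 * PB        # "over 100 petabytes of photos and videos"
pc_drive = 1 * TB             # assumed high-end PC hard drive of the time
rack = 500 * TB               # "every server rack contains 500 terabytes"
apple_i_memory = 4 * KB       # standard memory in the first Apple computer

print(photo_store // pc_drive)     # 102,400 -> roughly 100,000 PC drives
print(rack // apple_i_memory)      # 134,217,728,000 -> over 130 billion times the Apple I
```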

And when a single server goes wrong, the job of finding a flickering needle in this digital haystack falls to technicians like David Gaylord. One hard drive has failed, and he's sent off to find rack B25 in a labyrinth of humming servers. Once he's tracked it down, David can replace an entire circuit board in the time it takes to update your status. But David and the rest of the technicians have got their work cut out for them.

Nearly 2.5 billion people have internet access worldwide. They spend around 20% of their online time on social networks and, between them, upload hundreds of millions of photos, messages, and updates every day. With all that activity, even this gigantic data center is running out of space. So construction crews are already working hard to increase capacity. But with online activity on this scale, they'd better get a move on.