Behind the ephemeral "cloud" of cloud computing, the network we use for everything from checking our email to streamlining our health care system, lies a very tangible and very large computer infrastructure.
But aside from a glimpse at some of the hardware in 2009, little information has been available about Google's data centers, the warehoused collections of servers that form the foundation of the company's vast Internet operations.
Today, the company is throwing open the gates to the world — digitally, of course. It has released a site featuring photos of facilities from Belgium to Finland to Iowa and launched a guided Street View tour of one in Lenoir, N.C.
It's the same facility the company revealed in an exclusive tour to Wired senior writer Steven Levy, whose story on Google's infrastructure appears in the magazine's November issue. In an interview with Morning Edition's Steve Inskeep, Levy described going where no Google outsider has gone before:
"What strikes you immediately is the scale of things. The room is so huge you can almost see the curvature of Earth on the end. And wall to wall are racks and racks and racks of servers with blinking blue lights and each one is many, many times more powerful and with more capacity than my laptop. And you're in the throbbing heart of the Internet. You really feel it."
Google has a lot of servers — Levy reports 49,923 in Lenoir alone. The total worldwide number eludes even Google's executives, but there have been at least 1 million cumulatively, according to a plaque Levy found on the premises.
Google is not the only company with massive computer networks. In 2006, Microsoft built a giant data center on 75 acres of bean fields in Quincy, Wash. Yahoo and Dell, among other companies, have also set up data centers in Quincy.
But Google's data center technology is unique, Levy says. That uniqueness is partly responsible for the company's success — and a reason for the secrecy surrounding it.
"One technique that Google really pioneered was keeping things hotter than has been traditionally expected in a data center," Levy says. "In old data centers, you would put on a sweater before you went in there. Google felt that you could run the general facility somewhat warmer than even normal room temperature. When I walked into Lenoir I think it was 77 degrees."
The trick to keeping the heat under control, Levy writes, is a "hot aisle," an enclosed space that captures the hot exhaust air from the servers. Water-filled coils absorb that heat, the warmed water is piped out of the building to cool, and it is then circulated back inside. This is a dramatic departure from traditional centers, which spend large amounts of energy on air conditioning.
Levy says that the specific technology varies from center to center and that Google has considered the local resources and geography in each design. That doesn't necessarily mean, however, that the centers don't have an environmental cost.
"There's no way around it. These things burn a lot of energy, and a lot of the energy in a data center is done to cool it down so the computers don't melt. Data centers in general consume 1.5 percent roughly of all the world's electricity," Levy says.
Google's servers have been getting progressively faster and cheaper, and the company now has plans to fundamentally change the system in places like Lenoir. But the specifics of those changes will remain secret — at least for now.
"Google may be dedicated to providing access to all the world's data, but some information it's still keeping to itself," Levy writes.