The next steel shipping container you see being hauled by a truck or train might not stow the usual mass of lumber, textiles or foodstuffs. It might hold 10 tons of finely interlaced computer servers, ready to be deposited in a parking lot to serve 10,000 employees at a corporate headquarters—or 10,000 people on the Internet. Sun Microsystems has just started delivering these data-centers-to-go, taking the concept of portable computing to a whole new level.

True, the Project Blackbox system is portable only in the industrial sense that it is integrated into a standard 20-foot shipping container. But once delivered to a site, it is almost as self-contained as any laptop. All the system requires is a power cable and an Internet connection—plus a water supply and an external chiller for cooling. As many as 250 servers inside provide up to seven terabytes of active memory and more than two petabytes of disk storage. Perhaps most critically, says Greg Papadopoulos, Sun’s chief technology officer in Menlo Park, Calif., Project Blackbox will deliver that functionality in about one-tenth the time and at one-hundredth the cost of building a traditional computer room of equal prowess.
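Dividing those headline figures evenly gives a rough sense of the hardware inside. This is an illustration only; a real Blackbox would mix machine types rather than use identical servers:

```python
# Back-of-envelope per-server averages for the stated Blackbox capacity.
# Assumes the headline totals are spread evenly across 250 servers; the
# actual configurations would of course vary.

SERVERS = 250
TOTAL_RAM_BYTES = 7e12    # 7 terabytes of active memory
TOTAL_DISK_BYTES = 2e15   # 2 petabytes of disk storage

ram_per_server_gb = TOTAL_RAM_BYTES / SERVERS / 1e9     # -> 28.0 GB
disk_per_server_tb = TOTAL_DISK_BYTES / SERVERS / 1e12  # -> 8.0 TB

print(f"~{ram_per_server_gb:.0f} GB RAM and ~{disk_per_server_tb:.0f} TB disk per server")
```

About 28 gigabytes of memory and 8 terabytes of disk per machine: respectable numbers for a single server of the era, multiplied 250-fold in one container.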

That prospect means such boxed data centers could not only replace the corporate data center but also transform the computing experience for all of us. “Project Blackbox symbolizes a big bet we’re making as a company,” Papadopoulos explains. “It’s a bet that the billions and billions of client machines we’ll have in the future—desktops, handhelds, iPods, whatever—will spend most of their time interacting with the network.” These devices will have little need to store and run common software applications the way most computers do today. Instead they will simply access programs online that enable word processing, spreadsheets, and so on.

This transition is already well under way, under names such as grid, utility or cloud computing. More and more people use Internet services for e-mail (such as Hotmail), blogging (Blogger), social networking (MySpace), mapping (Google Earth) and other tasks. They do not host the software on their own machines; they just link to it when they need it. Papadopoulos compares the movement to what happened with electricity a century ago: very few of us keep a generator in the basement anymore; we just plug into the power grid and consume electricity as needed.

Powering the Cloud

If cloud computing is the future, the Net will have to get a lot bigger, fast, and Project Blackbox could play a crucial role. Of course, Papadopoulos says, the Internet already has many computational power plants, in the form of big, institutional data centers crammed with hundreds of computers in floor-to-ceiling racks.

The problem is that the generating capacity of these plants is slipping further and further behind the growth in demand. Each new data center has to be custom-designed and specially installed, computer by computer, in a process that can take years and cost tens of millions of dollars. Once up and running, the centers cost a fortune to operate, both because the computer rooms tend to be carved out of high-rent office space and because they require massive air-conditioning around the clock to cool the sea of power-hungry microprocessors. As Jonathan Schwartz, Sun’s chief executive officer, put it at the Project Blackbox debut ceremony, “Just about every chief information officer and start-up I meet says they’re crippled by data-center energy and space constraints.”

Sun’s goal is to offer a way out. The big companies such as Qwest, Level 3, Akamai and Google that erect the huge server farms that support the Internet’s ever-skyrocketing traffic could add capacity much faster by linking together the prefab containers, saving millions of dollars in the process. They and other firms could also sprinkle Blackboxes around in numerous other places to create nodes in the expanding grid.

At the same time, companies that must expand in-house computing would have new options. A firm in New York City, say, could augment its downtown data center with containers on a rooftop or in a parking garage—or in a low-rent warehouse in New Jersey. An oil company could mount a Blackbox on an offshore oil rig to run on-site operations as well as seismic monitoring. A humanitarian organization could set up emergency data centers to coordinate relief efforts in a disaster zone. The Pentagon could deploy mobile data centers to support combat operations.

Of course, the real-world viability of these installations is only beginning to be tested. Zoning and building codes could make siting tricky in some places. And few parking lots or rooftops come ready-equipped with an industrial 600-amp power conduit, an ultrahigh-bandwidth network connection, a hefty 60-gallon-per-minute water pipe, and a large chiller to cool the water. Security fences, cameras or guards would almost certainly be desired.

Nevertheless, early industry reaction has been favorable. InfoWorld magazine listed Project Blackbox as one of “12 crackpot tech ideas that just might work.” David Patterson, a noted computer scientist at the University of California, Berkeley, who sits on Sun’s technical advisory board, adds that Project Blackbox would allow companies “to put data centers closer to inexpensive or environmentally friendly sources of electricity, like a hydroelectric dam or a wind turbine.” And the spread of Blackboxes, he notes, “could significantly reduce the cost of utility computing—this notion that, in the future, an iPhone or whatever will be the only thing we carry with us, and most of what we do will be an online service.”

An Extended Conversation

Project Blackbox was inspired by a casual discussion two years ago between Papadopoulos and computer inventor Danny Hillis—although in truth, Hillis notes, that chat was only the latest round of a conversation that had gone on for more than a decade. The notion, he says, dates back to when he was chief scientist at Thinking Machines Corporation, a supercomputer maker in Cambridge, Mass., and Papadopoulos was an engineer he had just hired from the Massachusetts Institute of Technology.

“A bunch of us there liked to fantasize about what the world would be like when computers were really small and cheap,” Hillis says. “We loved the idea that you’d have a very simple machine on your desk and a lot of the work would be done back at some really big buildings full of computers.”

Thinking Machines closed shop in 1994, but the conversation continued. A decade later Papadopoulos, who had gone to Sun when that firm bought Thinking Machines’ technology, dropped by to visit Hillis at his new consulting company, Applied Minds, in Glendale, Calif. As the two men were puzzling over how to make the smallest and most energy-efficient computers possible, Hillis turned the question inside out: What was the biggest computer that could be built?

In practice, Papadopoulos reasoned, the biggest computer would be defined by the largest box that could be shipped around. As both men recall, that notion quickly put them onto the idea of building a system inside a shipping container. The container would be the computer.

This idea was not original. Brewster Kahle, another Thinking Machines alumnus, was already trying to supply developing nations with computer systems built into shipping containers. And the U.S. military had experimented with transportable data centers in trucks for field operations. But those designs had been for ad hoc, one-of-a-kind products. No one had done the serious engineering required to design a mobile data center as a mass-producible commodity.

Heat Dump

Intrigued, Papadopoulos asked Hillis and his colleagues at Applied Minds to design and build a prototype for Sun. The challenge was trickier than it might seem. Hillis could not just throw a bunch of servers into a container willy-nilly; they would fry themselves. A standard rack of modern servers consumes about 25 kilowatts of power, almost all of which ends up as heat. Conventional data centers are therefore built with plenty of space between the racks to allow for air cooling—a primary reason why data centers tend to consume so much floor space. Inside a sealed container, however, the heat would have no place to go. “That was the number-one technical challenge by far,” Papadopoulos says.

After much trial and error, they found an elegantly simple solution. A standard shipping container is eight feet wide, eight feet high and 20 feet long. Up to 38 servers are stacked like pizza boxes on a frame that looks like a deep, stand-alone bookcase with no back panel. Four of these “bookcases” are spaced out along one wall [see box on opposite page]. A tall, water-cooled heat exchanger is sandwiched between each pair of cases. Another set of four cases and coolers stands along the opposite wall.

In this configuration, hot air from the first case of servers is vented out its back side right into the adjacent heat exchanger. The exchanger draws out the heat and sends cool air into the front of the next case. Heat exits the back of that case into the next heat exchanger, and so on, in a loop that travels down along one side of the container and back around the other side. Indeed, once the door is closed the air circulates in a continuous cycle.

The catch, of course, is that the warmed water’s heat must still go somewhere: the system must either discharge the water outside the container to the environment or send it through an external chiller to cool and recirculate it. Either way, because the heat exchangers must absorb large amounts of heat quickly, water must flow through them at 60 gallons a minute, requiring a substantial supply pipe.
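The figures above permit a quick sanity check. Assuming all eight racks draw the full 25 kilowatts cited for a standard rack (a worst-case simplification), the water loop must carry away roughly 200 kilowatts, and the 60-gallon-per-minute flow rate tells us how much the water warms on each pass:

```python
# Rough heat balance for the Blackbox cooling loop. Worst-case sketch:
# assumes all eight racks run at the full 25 kW of a standard server rack.

RACKS = 8
KW_PER_RACK = 25.0
heat_watts = RACKS * KW_PER_RACK * 1000   # ~200 kW to remove

GALLON_LITERS = 3.785
flow_kg_per_s = 60 * GALLON_LITERS / 60   # 60 gal/min of water ~ 3.79 kg/s
C_WATER = 4186                            # specific heat of water, J/(kg*K)

delta_t = heat_watts / (flow_kg_per_s * C_WATER)  # temperature rise per pass
print(f"Heat load: {heat_watts/1000:.0f} kW; water warms ~{delta_t:.1f} C per pass")
```

A rise of only about 12 to 13 degrees Celsius per pass is comfortably within what an external chiller can handle, which is why a single hefty water pipe suffices for a container full of servers.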

Another practical matter is bandwidth. If 10,000 company employees or Internet users are tapping into a Blackbox’s terabytes of memory, its owner will have to run more than a little phone line to it. Sun recommends a dedicated fiber-optic cable. That arrangement is not too taxing for an on-site parking garage, but a downtown company that places boxes in a suburban warehouse or next to a low-cost, rural power plant will have to lease such lines from a regional telecommunications firm. Of course, an owner would have to program the system as well, but most companies would choose to do this work themselves so they can customize a data center to their needs.
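A back-of-envelope estimate shows why an ordinary line will not do. The per-user rate here is purely an assumption for illustration; real demand depends entirely on the applications being served:

```python
# Back-of-envelope bandwidth estimate for a fully loaded Blackbox.
# The 1 Mbit/s-per-user figure is an assumed average for illustration only.

USERS = 10_000
MBIT_PER_USER = 1.0  # assumed average per concurrent user

aggregate_gbit = USERS * MBIT_PER_USER / 1000  # -> 10.0 Gbit/s
print(f"Aggregate demand: ~{aggregate_gbit:.0f} Gbit/s")
```

Ten gigabits per second is far beyond what a leased copper line of the day could carry, hence the recommendation for dedicated fiber.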

Hillis’s group had its prototype working well enough by spring 2006 that Sun decided to develop the system for market. Papadopoulos tapped yet another Thinking Machines alumnus to lead the development team: David Douglas, Sun’s vice president of advanced technology.

“In the prototype, they got everything about 80 percent right,” Douglas says. “But now my job was to figure out how we were really going to manufacture these things. How do you take empty shipping containers and stamp out data centers in high volume? How do you drive the cost down and get the reliability up?”

Engineering a Real Product

The result was an endless amount of fine-tuning, he says. Figuring out how to run the plumbing for the cooling system. Installing sensors for standing water so that a leak could quickly be detected. Providing escape routes for those times when people had to be inside a container. Finding a way to run the data cables that interconnected the servers so that someone could still pull out a unit for repair. Putting shock absorbers under each rack of computers so that they could survive the rough landings they would inevitably encounter during transport.

Finally, in October 2006, Sun unveiled the product. (The bright white container was hastily painted black for the occasion.) Papadopoulos says potential customers were intrigued—though not exactly for the reasons he initially expected.

“We were originally excited about making these systems really energy-efficient and inexpensive to operate,” he says. “But the ‘early adopters’ we talk to are much more taken with the notion that Project Blackbox is prefab and fast. People are saying, ‘I need this in 30, 60, 90 days. And I need it to work.’”

Sun will not discuss how many of these early adopters have actually placed orders. “Suffice it to say we have a very robust pipeline of interest,” Papadopoulos notes. “These are guys who are close to meltdown,” meaning they cannot add faster computers to their current systems because they have no way to exhaust the increased heat. “We expected skepticism, and we got it,” he adds. “This is a radical concept in the data-center world. What we didn’t expect, however, was to give a presentation and have people come up afterward saying, ‘I need 10 of these tomorrow.’”

More to Explore
The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger. Mark Levinson. Princeton University Press, 2006.

What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software. Tim O’Reilly. September 30, 2005. Available at

Sun Microsystems’s official Project Blackbox Web page: