In 1977, 22-year-old Steve Jobs introduced the world to one of the first self-contained personal computers, the Apple II. The machine was a bold departure from previous products built to perform specific tasks: turn it on, and there was only a blinking cursor awaiting further instruction. Some owners were inspired to program the machines themselves; others could load up software written and shared, or sold, by those more skilled or inspired.
Later, when Apple’s early lead in the industry gave way to IBM, Jobs and company fought back with the now classic Super Bowl advertisement promising a break from the alleged Orwellian ubiquity of Big Blue. “Unless Apple does it, no one will be able to innovate except IBM,” said Jobs’s handpicked CEO John Sculley.
In 1984 Jobs delivered the Macintosh. The blinking cursor was gone. Unlike prior PCs, the Mac was useful even without adding software. Turn it on, and the first thing it did, literally, was smile.
Under this friendly exterior, the Mac retained the essence of the Apple II and the IBM PC: outside developers could write software and share it directly with users.
The rise of the Internet brought a new dimension to this openness. Users could run new code within seconds of encountering it online. This was deeply empowering but also profoundly dangerous. The cacophony of available code began to include viruses and spyware that could ruin a PC—or make the experience of using one so miserable that alternatives seemed attractive.
Jobs’s third big new product introduction came 30 years after his first. It paid homage to both fashion and fear. The iPhone, unveiled in 2007, did for mobile phones what the Mac did for PCs and the iPod did for MP3 players, setting a new standard for ease of use, elegance and cool. But the iPhone dropped the fundamental feature of openness. Outsiders could not program it. “We define everything that is on the phone,” Jobs said. “You don’t want your phone to be like a PC. The last thing you want is to have loaded three apps on your phone, and then you go to make a call and it doesn’t work anymore.”
Being closed to outsiders made the iPhone reliable and predictable. In that first year those who dared hack the phone to add features or to make it compatible with providers other than AT&T risked having it “bricked”—completely and permanently disabled—on the next automatic update from Apple. It was a far cry from the Apple II’s ethos, and it raised objections.
Jobs answered his critics with the App Store in 2008. Outside coders were welcomed back, and thousands of apps followed. But new software has to go through Apple, which takes a 30 percent cut, along with 30 percent of new content sales such as magazine subscriptions. Apple reserves the right to kill any app or content it doesn’t like. No more surprises.
As goes the iPhone, so perhaps goes the world. The nerds of today are coding for cool but tethered gizmos, like the iPhone, and Web 2.0 platforms, like Facebook and Google Apps—attractive all, but controlled by their makers in a way even the famously proprietary Bill Gates never achieved with Windows. Thanks to iCloud and other services, the choice of a phone or tablet today may lock a consumer into a branded silo, making it hard for him or her to do what Apple long importuned potential customers to do: switch.
Such walled gardens can eliminate what we now take for granted and what Jobs originally represented: a world in which mainstream technology can be influenced, even revolutionized, out of left field and without intermediation. Today control increasingly rests with the legislators and judges who discipline platform makers. Enterprising law-enforcement officers with a warrant can flick a distant switch and turn a standard mobile phone into a roving mic or eavesdrop on occupants of cars equipped with travel assistance systems. These opportunities are arising not only in places under the rule of law but also in authoritarian states.
Curtailing abuse will require borrowing and adapting some of the tools of the hidebound, consumer-centric culture that many who love the Internet seek to supplant. A free Net may depend on some wisely developed and implemented locks and a community ethos that secures the keys to those locks among groups with shared norms and a sense of public purpose rather than in the hands of one gatekeeper.
In time, the brand names may change; Android may tighten up its control of outside code, and Apple could ease up a little. Yet the core battle between the freedom of openness and the safety of the walled garden will remain. It will be fought through information appliances that are not just products but also services, updated through a network by the constant dictates of their makers. Jobs, it seems, left his mark on both sides of the tug-of-war over Internet openness.