FACING THE FEAR: So many people think the future is something that is set. They say, "You're a futurist, make a prediction," Johnson says, adding, "The future is much more complicated than that. The future is completely in motion, it isn't this fixed point out there that we're all sort of running for and can't do anything about." Image: Courtesy of Intel
Much of Intel's success as a microprocessor-maker over the past four decades has come from the company's ability to anticipate the future of technology. Since company co-founder Gordon Moore's famous assertion in 1965 that the number of transistors that can be placed on an integrated circuit would roughly double every two years, Intel's microprocessors have grown steadily smaller, faster and cheaper, helping to give birth to personal computing and mobile devices that once existed only in the realm of science fiction.
So it comes as no surprise that science fiction serves as a key inspiration for the man whose job it is to envisage Intel's future and, to a large degree, the future of computing itself.
Brian David Johnson is hardly the world's first futurist; the vocation of prognosticating scientists and social scientists dates back to the likes of Jules Verne and H. G. Wells. But he is the first to hold that title at Intel. Far from just imagining a future whose fate depends largely on the actions of others, Johnson has the resources at his disposal to transform his future-casting into reality.
For example, Johnson worked directly with Intel's software, hardware and silicon architects on the company's Atom-based system-on-a-chip (SoC) designs for processors used in next-generation compact and mobile devices. The company's software and hardware engineers likewise have consulted the research Johnson presented in his 2010 book Screen Future: The Future of Entertainment, Computing and the Devices We Love (Intel Press) to, as he puts it, "help envision a world of multiple devices and form factors that are all connected together." Johnson is currently working on the design for Intel's CPU circa 2019.
Last month Johnson was in Manhattan at the pop-culture convention New York Comic Con to promote Intel's Tomorrow Project, which engages the public in discussions about the future of computing as well as its impact on society. As part of the Tomorrow Project, Intel also publishes annual science fiction anthologies featuring short stories that Johnson, himself a sci-fi writer who has worked at Intel for the past decade, says emphasize the "science" side of the genre and are intended to convey the message that humanity ultimately still controls its own destiny.
Soon after the convention, Scientific American spoke with Johnson about future-casting microprocessors, what scares people most about technology, how we can learn about the future from the past and what it takes to become a futurist—nature, nurture or a little of both?
[An edited transcript of the interview follows.]
How can science fiction influence real-world research and development?
There's a great symbiotic history between science fiction and science fact: fiction informs fact. I go out and I do a lot of lectures on AI [artificial intelligence] and robotics, and I talk about inspiration and how we can use science fiction to play around with these ideas. And every time, people come to me, pull me aside and say, "You do know the reason why I got into robotics was C-3PO, right?" I've become a confessor to some people. I just take their hand and say, "You are not alone. It's okay."
And it's true, science fiction inspires people to imagine what they could do. It captures their imagination, which is incredibly important for developing better technology. For instance: I'm going to write this story based on research from these artificial intelligence and robotics guys so that they have a better image of what they can do with that technology.
Which science fiction authors have inspired you the most?
So there's what inspired me as a kid: the Asimov, the Bradbury, the Heinlein. That forms the core of science fiction. As I got a little older and a little more sophisticated, it was people like Philip K. Dick, J. G. Ballard, and even more recently people like Vernor Vinge and Cory Doctorow and Charlie Stross, those types of guys. Now most of the stuff I'm inspired by is the near future that is very much based on science fact.
How does the past help inform your decisions now and your thinking about the future?
For me, it's all about models. Everything I do is models based on computer science, social science, statistics, economics; that all goes together. What I tell a lot of people is that 80 percent of what I do is actually history, because that's where some of the best models are. Not that you can do a copy/repeat, but you can look at what's happened. And not just what happened and where we went. A lot of economists, for example, will tell you that you can't say, "This is the way that it happened in the past, so this is the way it will happen in the future." But you can say, "This is the way people thought things were going to go." So I do a lot of reading about history. It's not that I want to say, "That guy got it wrong." I don't care about that. Ultimately, it's my job, because I'm an incredibly pragmatic futurist, to come up with a vision we can build.
How does your role future-casting for Intel fit in with what the company is doing as a maker of microprocessors?
I sit in front of the company's development road map. For the folks that do the chip manufacturing and chip design, it's my job to literally get out in front of that. So I work with a lot of the chip designers in Israel and elsewhere. And every year they remind me that I need to be thinking about, for example, 2020. I need to get out and inform Intel's movement toward that year. My day job is to create the models that feed the chip designers. I create models of what the experience will be like, so what it will feel like to use a computer in 2020. Intel is an engineering company, so I turn that into requirements and capabilities for our chips. I'm working on 2019 right now.