So you're usually looking nine or 10 years ahead?
About 10 years is the cadence I try to keep to. It varies. A lot of what I do is also called "backcasting." I'll work with the people who are designing ultrabooks [extremely thin, light laptops], and they'll ask, "What should we do for 2015?" And I'll say, I've got this body of data, so let's look at what the future of ultrabooks looks like by starting at 2020 and working back five years, instead of starting at 2011 and looking ahead a few years.
How do you ensure that the ideas you have for Intel's future are compatible with the directions that hardware-makers (Apple, Dell, etcetera), who use Intel chips in their PCs and mobile devices, want to go with their products?
The first step in my process is social science. We have ethnographers and anthropologists studying people first and foremost. So all of the futurecasting work I do starts with a rich understanding of humans, who are, after all, going to use the technology. Then we get into the computer science. Then I do the statistical modeling. Then I start developing models of what the future is going to look like. Then I hit the road.
A huge part of our work is getting out and talking not just to our customers but the broader ecosystem of government, the military and universities. I ask them, "Where do you see things going? And what will it be like for a person to experience this future?" It is such an important part of my work to get their input.
Can you give us an example of some research with interesting implications for the future of technology?
I've been doing a lot of work with a synthetic biologist named Andrew Hessel of the Pink Army Cooperative, the folks doing cancer research. He's studying the design of viruses as well as DNA. Think of the DNA as the software and an organism--a bacterium or a virus--as the hardware. You stick the software in, and the organism actually becomes a computational device. Consider this: you take a GPS app and put it on your cell phone, and your cell phone becomes a GPS. But what's really awesome about synthetic biology is that you go to sleep with one organism, and when you wake up in the morning there are two, and then there are four. They become self-replicating computational devices. I'm just starting to look into that.
What are some of the most important issues that you're talking to people about now when you're out on the road?
There are three main themes--one is called the secret life of data, the second is the ghost of computing and the third is the future of fear.
Those sound like book titles. How can data have a secret life?
The secret life of data is thinking about what it will be like to live in a world of big data. Consumers already know about big data. They already know about cloud computing, for example. What will it feel like when we're creating so much data about ourselves through sensors and other technology that the data begins to take on a life of its own? It's already starting to happen, and it's only going to get bigger. You have algorithms talking to algorithms, machines talking to machines. What does it feel like to be in that world, number one, and number two, how do we make sure that when that data comes back to us, it's meaningful? It's not just synthesizing massive amounts of financial data and spitting a credit rating back at me. We've moved beyond that.
What do you mean when you talk about the "ghost of computing"?
Look at the microprocessor: it keeps getting smaller and smaller and smaller--it's crazy how small it gets. If it keeps shrinking, what happens when that unit of compute gets so small that it disappears? We've been talking about that world for a while, but as you get out 10 or 15 years, we're getting closer and closer to it. What happens when computing is in the walls or in a table? So that's one side of it: what does the world look like when we're surrounded by intelligence?
There's another ghost of computing that doesn't look like this invisible specter that's all around us. It looks more like the ghost of [Jacob] Marley, dragging behind him the chains linked to all the cash boxes. We're dragging computer legacy systems behind us. I can go online and book a flight with Orbitz, but Orbitz still needs to talk to the Sabre Global Distribution System, the old mainframe-era system that all of the airlines use. In other words, Orbitz still has to speak with an antiquated piece of software. We can't forget that. New technologies and older technologies aren't mutually exclusive. They're going to have to work together to some extent, and we at Intel need to recognize that.
And the future of fear?
The reason I like talking about fear is that it's a human experience. We know that security is important, and it's only going to get more important. So as we look 10 to 15 years out, what I want to do is think: what do we really need to be afraid of? I'm on sort of a personal campaign against fear. When we talk about what it means to live in a safe and secure world, there's a lot of misinformation and a lack of information out there. Because of that, people are creating bogeymen. We're creating these irrational things, and that's very dangerous--especially when we're making decisions, whether it's hardware design or something else. We need to take a fact-based approach to what we should be afraid of and what we shouldn't. The stuff we shouldn't be afraid of, we need to push aside. The stuff we should be afraid of, we really need to dig into.
What's frustrating is that fear is not usually a technology question; it's a cultural conversation. When I'm out teaching or lecturing, 50 percent of the questions I'm asked have to do with fear, something that someone is worried about. Let's find out what people are afraid of and attack it. I'm an incredibly optimistic person. The problem with fear is that fear sells. It even has policy implications. I want to pull people away from fear, because otherwise they will gravitate toward it. Very few innovations have come out of being fearful.
What are people afraid of--technology in general or something more specific?
Well, there are some specific fears, such as identity theft and online banking. That interests me, but I want to go deeper into the stack. People think about security and privacy as if they were things, elements. We have carbon, we have sodium, we have security. That's not true. Security is a social construct. So you have to ask people: when they talk about security, what are they actually talking about? For example, security and privacy in the United States look very different than they do in the E.U. or in China. What does it mean to be secure? What is the DNA of security and privacy?