Rugged individualists aside, many people find themselves increasingly connected not just to one another but also to the devices that make those connections possible. It’s clear that dependence on smartphones, tablets and other gadgets will only strengthen as broadband access, wireless connectivity and content grow. Less obvious is the impact this human–machine bond will have on our lives.
Cultural anthropologist Genevieve Bell leads a group at Intel Labs—Interaction and Experience Research—that aims to understand what people want from their technology and what might happen if they get it. The group also studies how people use technology, what motivates this use and what frustrates them, all in an effort to design microprocessors that help meet those demands.
A second-generation anthropologist, Bell grew up at her mother’s field sites in central and northern Australia in the 1970s and ’80s. Scientific American recently spoke with Bell—who joined Intel in 1998 as one of the company’s first social scientists—about her role there, our evolving attachment to our gadgets and making “magic” from silicon and circuits.
[An edited transcript of the interview follows.]
What is a cultural anthropologist doing at the world’s largest chipmaker?
Anthropology is a classically well-designed discipline for making sense of what people want. To make those insights about what people care about legible and intelligible to an engineering-oriented organization, you have to do a bit of translation.
How do you translate ideas from the social sciences to the technology world?
You have to say, here are the things we have seen in the field, and here are some consequences of those insights. Then you present ways to turn those ideas into prototypes. Some of those ideas are smoke and mirrors, some are sketch-board prototypes and some are fully fledged working things that make you ask: What would it take to actually [manufacture] this?
What happens to the smoke-and-mirror ideas?
With those, we realize that if we want to create a product out of an idea, we’re going to have to invent new technology to make that possible or hack the hell out of something else to get us close. Sometimes our scientists start by going, there’s this piece of technology and everyone’s using it for this thing, but if we do this other thing with it, oh my God, it would be totally cool. You come at a technology from different angles.
Customization has come a long way, with services such as Amazon and Netflix trying to anticipate our needs and make recommendations based on our behavior on those sites. How will our interactions with technology change as it becomes more personalized?
At the moment those recommendation algorithms sit in a number of different places in our lives, and there’s a little bit of bleed in between them. But we are getting to a point where recommendations won’t just come from services [like Amazon and Netflix]. They’ll come from our devices as well. Google+ and [Apple’s] Siri have learning algorithms that respond to your voice. Now imagine a world where our devices know our bodies. Apple’s new iPhone fingerprint sensor is a lovely example of that. Devices start by recognizing your thumb or your voice; then they could learn to recognize your friends’ voices, recognize the way you walk. Imagine if those devices put that information together with information about your location and the appointments on your calendar. That device gets to know you as a human being.
Why is it important that your devices get to know the real you?
This is about moving from human–computer interactions to human–computer relationships. The moment this really crystallized for me, and it’s a silly thing really, was when I saw a YouTube video of a Furby talking to Siri. And it was a few seconds of splendor where the little Furby waves its ears and [bats] its little eyelashes and [makes noises]. [Editor’s note: To Furby’s nonsensical sounds, Siri responds, “I don’t see ‘Killher’ in your address book, should I look for businesses by that name?” Later in the video, Siri announces it is searching for Shell.com in response to more Furby gibberish.]
I was utterly mesmerized by the video for a really long time, and I couldn’t work out why. Then I realized it was a genealogy of talking things, a classic kinship diagram—a granddaddy thing talking to a grandbaby thing. It was a thing that talked to a thing that listened. Siri promises to listen to you. There’s a notion of reciprocity with Siri. Once things listen, there is an implicit transformation: it is no longer you telling something what to do; there is relationship building.
What are the benefits of devices being able to add context to the information they collect about their owners?
A device might be able to give you directions home, including the best ways to avoid traffic. It starts to make recommendations. The fascinating thing is that we are moving closer to a world where the technology in our lives—partly because of the devices themselves and partly because of the services that sit on those devices—has the capacity to know us and to start acting in concert with us, on our behalf, without our needing to tell it what to do all the time.