Scientific American editor in chief, John Rennie, discusses the future of privacy and security, the subject of the September single-topic issue of Scientific American magazine. Plus, we'll test your knowledge of some recent science in the news. Web sites mentioned in this episode include www.SciAm.com/sciammag; www.snipurl.com/sciamfootball
Podcast Transcription
Steve: Welcome to Science Talk, the weekly podcast of Scientific American for the seven days starting September 3rd, 2008. I'm Steve Mirsky, and I am not watching you, but somebody could be watching you or listening to your phone calls or going through your personal records. The future of privacy is the subject of the special single-topic September issue of Scientific American magazine. I spoke with editor in chief John Rennie about the issue this past Friday.
Steve: Let's first talk about where the idea to do this as our single-topic issue came from.
Rennie: Well, the idea has been around for a while because it ties in so well with both technological issues and certain very important policy ones. You know, given that we live in the information age, huge parts of our economy and our lives otherwise consist of shuttling information around in different ways, and we have more and more technologies that facilitate that. What we can sometimes forget is that that information is our information. It's information about us, about things that we care about. And although the expansion, through all these technologies, of ways of exchanging information increases the different kinds of services and goods that can be available to us, at the same time it does open up lots of potential liabilities, because you never know whether somebody is going to be looking at your information when you wouldn't want them to. So we had felt Scientific American had a very good reason to look at this kind of issue just from the technological standpoint, but also, of course, there are so many political considerations that have come up and bear a lot on what people have been making of matters relating to privacy and security. Obviously, ever since 9/11, people have had very strong security concerns, and that has increased because of what seem to be extraordinary times. There have been more and more calls within the government for us to be expanding the range of surveillance that we would make, both of our physical beings in public spaces but also just of the information that we have. All of that together suggested this was a very, very rich topic for us to look into, and I think the future-of-privacy issue we put together bears that out.
Steve: One of the interesting things you bring up is this idea that, more and more, we're really in some ways exposing ourselves on the Internet, but that seems to be almost of no concern to people under a certain age.
Rennie: Right. One of the articles in the issue—actually, the final one—talks about the end of privacy, and it talks about, sort of, a generational shift in the idea of it; so many people are completely comfortable using social networking sites like Facebook and MySpace and so forth, to really live their lives under a kind of public scrutiny that once would have just been inconceivable. In many ways we don't really know what all the consequences of that are going to be. There may be a lot of things that people are putting up online these days that they will come to regret down the line, but it may also suggest that we are really looking at a change in the way that we view privacy. But something that a number of the authors point out is that our modern notions of privacy are, sort of, a historical artifact anyway. One point is that, you know, at a time when we all lived in small villages, frankly everybody knew everybody else's business. In a lot of ways, it's really only been in the past couple of hundred years that it's been possible for people to achieve something like the sort of privacy that we have taken for granted. So this may be, sort of, a regression to the norm that way.
Steve: That's an interesting point—you can have privacy by being a hermit up in the mountains or by living in a hugely dense city, but you really can't have privacy in a small town.
Rennie: Right. Well, you know, that's the nature of society. We deal with other people, and when we deal with other people, we make things about ourselves available to them. They get to know us, and in that process we lose a little bit of control over some information that we have about ourselves. The real question is whether or not the spate of new technologies that are now coming out—and new ones that will continue to come out—do anything to help us regain some control over that access to our information, or whether they just increase the rate at which that information slips through our fingers. The answer is that it is actually some of both, and a lot will depend on the kind of regulatory structures and other decisions we make about implementing these technologies to determine just where that balance lies.
Steve: Another issue is, you know, you might be putting things up there that expose you to embarrassment; and that's one thing if 20 years from now you are embarrassed because there is a video of you doing something stupid on the Web, but then there is this stuff that is really important where somebody can clean out your bank account, which goes way beyond the embarrassment.
Rennie: Right. Well, you know, that's an interesting notion. Esther Dyson, who is a writer and well-known member of the digerati, wrote the keynote article talking about the notion of Privacy 2.0, which goes to this way in which we are rethinking what privacy really is. And she made what I think is probably a very good argument in there, that a lot of the time when we talk about privacy, we tend to rope in a lot of concerns that maybe shouldn't be seen as ones relating directly to privacy itself. They're security concerns. They can actually be separated from that. An example she gives—and in fact we have another article about this—is genetic privacy. There are lots of concerns that people have about electronic medical record keeping, that health information about ourselves could maybe slip out to other people or to agencies that we wouldn't like to know it. If we have certain genetic traits, do we necessarily want insurance companies to know about that if that becomes the basis for them to decide that they are not going to extend medical insurance to us anymore?
Steve: So if there is some kind of predisposition to a disease that you don't even have, but they decide we are not going to take the chance on you because you have a 12 percent chance of getting this disease versus somebody else who has a 2 percent chance.
Rennie: Right. And of course, as it becomes more and more routine for people to be getting some kinds of genetic profiles made of themselves, and as we understand what the genes do better and better, those kinds of connections will start to come out. So this is one concern that often gets raised about our privacy. But a point that Esther Dyson makes is that that goes to questions about health insurance itself. If we wanted to live in a society in which everybody was guaranteed health insurance, we could decide that was the case. It's not really a matter of privacy as such.
Steve: It's privacy within that matrix of the existing insurance establishment.
Rennie: Exactly, and so there is a whole other set of decisions we could make that have nothing to do with privacy but that could provide some kind of security for us. Similarly, there are lots of issues people worry about, like somebody getting hold of, say, your social security number, or your losing control of your social security number. Well, there's a question of whether what we need to do is not worry about the privacy of that social security number—because, let's face it, whether we want to or not, we often give out our social security numbers—but instead try to make it difficult for people who are not us to use that social security number to get into our bank accounts or other valuable records. So she makes a number of different points about the changing state of privacy this way, and that in the future, what is really going to be important is that most of us would want to have better ways of regulating the degree of access that we grant to certain kinds of critical information. There are some kinds of information I don't care if the world knows. There may be a subset of related pieces of information that I only want my doctors to know, or that I want only my wife to know, and so forth. So it will be interesting to see whether we can develop ways of regulating access to the information, even though, in a sense, the information is no longer private in a deep, dark, secret way. And she also makes very important points about the fact that you really want to maintain people's control over information when you're getting into situations in which governments or businesses have a, kind of, asymmetrical ability to demand information or to get information about you that you can't reciprocate.
She makes the argument that we should have a certain level of safeguards for the privacy of our information; but, quite frankly, governments and businesses largely should be much more open, because it is very important for us to be able to monitor them and blow the whistle on any wrongdoing, so we can safeguard against any kinds of abuses.
Steve: Which is exactly the opposite of the situation that currently exists, for the most part.
Rennie: That's right. In fact that's a concern that Whitfield Diffie and Susan Landau—who are other authors—get into because they talk about the expanding brave new world of wiretapping. Wiretapping is not new; people have actually been tapping communications a lot longer than people might have thought.
Steve: Since, I think, the telegraph.
Rennie: Oh! That's right.
Steve: Oh! I mean, before that you can go back thousands of years to intercepting messengers—but in terms of electronics ...
Rennie: Right, basically.
Steve: ... you could tap a telegraph message.
Rennie: Yeah, as long as there have been telecommunications, there have been forms of wiretapping, and the laws regulating wiretapping have sometimes had to evolve with the changes in the technology, not too surprisingly. There was a certain widening of the scope of that back in the '90s, because suddenly you had so many different devices—answering machines and so forth—and when the government needed to go and get a wiretap, it had to be in a position to say that it could get access to all of those sorts of devices. But really, just since 2004 has the federal government been pushing—through the Department of Justice and the FBI—to argue that the government also needs to be able to get into Internet communications the same way. Technically, though, this is a much more complicated problem, because you're dealing with packet switching. It's not as though you have just one wire on which your message is being transmitted. So technically there are a lot of issues about how you're going to do that, and Whitfield Diffie and Susan Landau talk a little bit about what's involved in trying to accomplish that. The argument they make is that it's possible to do this, but it may not be advisable: first, because all the techniques that would make it possible for the government to tap into, say, the voice-over-Internet telephone calls of drug dealers or suspected terrorists could also be abused to tap into all kinds of other information, in ways that would really escape a lot of the usual safeguards we have to try to prevent that; and second, because it would open some kinds of backdoors that hackers and foreign powers could use to get into our own telecommunications. So their argument was that it is probably not a good idea to continue to expand the government's ability to look into our communications that way.
Steve: But what about, you know, security? Why shouldn't the government be allowed to monitor all telecommunication to try to find terrorist chatter when, you know, we're not doing anything wrong anyway? Unless you are David Duchovny, you don't really care if anybody hears what you're talking about on the phone, right?
Rennie: (laughs) Well, you know, that goes into obviously a much larger political discussion. We don't go too much into that debate in this issue, although it obviously is reflected in some of the kinds of positions and backgrounds that the authors bring; but right, there's always a question in a free society—or what we want to have, a near-free society—about what level of authority you give to the government to be able to act on our behalf and provide greater security. A lot of people do look at these issues of privacy and security as though they're two goods that are sort of antagonistic to one another. But I think Esther Dyson and a lot of other people would make the argument that that's probably a false dichotomy; that in fact you don't really have to sacrifice privacy in the interest of greater security.
Steve: Let's talk a little bit about some of the technology. I know you have an article on RFID tags, and I was at a privacy conference—four or five years ago at this point—and I remember one of the speakers was really in a lather about RFID tags and the ability of companies to track your every move because you bought a shirt.
Rennie: Right. RFIDs are radio-frequency identification tags. They can be really quite small, and the size associated with them gets smaller all the time. Lots of different businesses are interested in trying to incorporate RFIDs into products more and more commonly because they are great for tracking inventory, for example, and for preventing shoplifting and lots of problems like that. But as has been pointed out, when you're leaving the store, if that RFID hasn't been deactivated in some way, theoretically somebody else can be identifying the combination of RFIDs that are, you know, in the clothes that you're wearing or the products that you're carrying, and that could be used as a way of, sort of, tracking you. You get into various debates about how technologically feasible that is—about whether or not readers have to be very, very close to pick up those signals or whether it can be done at more of a distance—but it has been demonstrated a number of times, and this is something the article we have on RFIDs gets into, that sometimes it can be done at quite a considerable distance. And in many cases, even though most RFIDs don't carry any real information that you'd think of as private about you, they can nonetheless be used in various ways to start to develop a profile of you; they would connect you to other information about yourself, or at least to a set of habits that somebody might want to take advantage of, whether they were thieves or anyone else.
Steve: Or if they just want to figure out what they can sell you next.
Rennie: Well, that's right, you know, I mean this, sort of, goes to the promise or the peril of what you can do with RFIDs. Under one sort of speculative vision of this, you know, if I wear lots of clothes that have all these different kinds of chips embedded in them, and they happen to identify the different types of clothes and models and styles of things that I like, then in theory, as I walk through the entranceway of a store, computers could quickly get a profile of what I like, what styles I like, what things I might be looking for, and immediately some salesperson could come over and tailor their pitch to me. Now that may sound heavenly or hellish to you depending on what you think about it; but that's really both the tragedy and the triumph of what you can get with some of these kinds of technologies. And it really goes to, you know, all of these issues—there are lots of debates that surround them about how we want to use them. What are the prices we're willing to pay for maybe being able to let some kinds of information out very, very freely, and what are the downsides? What are the prices we are willing to pay, and what are the benefits that we could also accrue?
Steve: Another aspect of the technology is biometrics, and biometrics as identification. And I know we have an article about the idea that—forget about knowing all these passwords and carrying a million keys and key cards—just your fingerprints, your cornea, or whatever else you've got, your DNA in a swab, can actually open doors for you.
Rennie: Right. Well, of course, everybody is very familiar with fingerprints, which go far back in terms of being able to identify people as a kind of biological signature that could be associated with just them. A lot of these kinds of things you're talking about, you know, like voice prints or iris scanning and so forth—the technologies aren't necessarily brand new, but the amount of processing power that it took to implement them used to be forbidding. You couldn't dream of trying to put it into something like a laptop computer. These days, because computing power has become so cheap, and because we can sometimes link devices—wirelessly or otherwise—to other kinds of databases of information very, very easily, it's certainly possible for this to be very easy. So, for example, there are brand-new lines of laptops coming out where the camera built into the laptop will automatically do face recognition. So when I sit down to use my laptop, that laptop is keyed in to let me use it, but not if you sat down and tried to present yourself the same way. So these technologies work, and especially if you use them in combination, there are ways for them to be much more secure than something like a password you might forget. The catch is that the error rates of a lot of these technologies are still a little bit higher than you would like, and you would like to avoid having to use multiple biometric signatures to get into a laptop or anything else. So, you know, the technology is still evolving, but the point is that's probably closer to where a lot of the future on this lies than us having to try to remember, you know, the million and one passwords that we're all supposed to have.
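The trade-off Rennie describes—combining biometric checks lowers the odds of an impostor getting in, but raises the odds of locking out the legitimate user—can be sketched with a back-of-the-envelope calculation. The error rates below are made up for illustration, not taken from the article, and the independence of the two checks is an assumption:

```python
# Illustrative sketch: why requiring TWO biometric factors to match
# makes impostor break-ins rarer but lockouts of the real user more
# common. Error rates here are hypothetical, and the two checks are
# assumed to fail independently of each other.

def combined_and(fa_a: float, fa_b: float, fr_a: float, fr_b: float):
    """Combine two biometric checks, requiring both to match.

    fa_* = false-accept rate (impostor wrongly accepted)
    fr_* = false-reject rate (real user wrongly rejected)
    """
    # An impostor must fool both checks, so the rates multiply.
    fa = fa_a * fa_b
    # The real user is rejected if EITHER check misses them.
    fr = 1 - (1 - fr_a) * (1 - fr_b)
    return fa, fr

# Two hypothetical sensors, each accepting impostors 1% of the time
# and rejecting the real user 3% of the time:
fa, fr = combined_and(0.01, 0.01, 0.03, 0.03)
print(f"false accept: {fa:.4%}, false reject: {fr:.2%}")
# prints: false accept: 0.0100%, false reject: 5.91%
```

The numbers show the shape of the problem: stacking factors makes break-ins a hundred times rarer, but nearly doubles how often the legitimate user gets bounced, which is exactly why Rennie says you would rather not need multiple biometric signatures just to open a laptop.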
Steve: Really how many times each week do we say, "Yeah, I forgot my password, send it to me"; unless you use the same password for all the things that you try to access, and that's dangerous too.
Rennie: Yeah, you know, that goes back to basic user behavior. One of the other things that we have in the issue is a round table of representatives of different companies from the data security industry, talking a lot about the state of online security and what could be done to conceivably improve it. It's a very interesting discussion. We have part of it in the September issue itself, but if people go online to ScientificAmerican.com, they can find the full version, because we spoke for a good hour and a half. But again, one of the points they made was that a bad user experience—bad consideration for what it's like to be on the user end—is itself a kind of inducement to security problems. Because the fact is, the rules for having a really good password are that you don't reuse your password and that you use a password that is, for all intents and purposes, a random gibberish string; the reality is most people don't do anything like that. People have certain passwords that they use over and over again, and the reason it's often so easy for hackers to get into people's accounts or files of whatever sort is that the passwords they use tend to be ridiculously easy to guess. So it's, you know, things like pet names or names of mythological characters—and even, apparently, the number of people who use "password" itself as their password is still insanely high.
Steve: The password is...
Rennie: Security breach. So really, you know, the question is, where are a lot of these things going? And one of the points they made is that when you're trying to devise a more secure system online, or for any kind of networked system at all, you need to be very much aware of what the user experience is. Because if you make it hard for people to have, sort of, good security hygiene, they will work around it. They will do things like—if you make it very, very hard for them to transfer a file, say, from one location to another in a secure way—they will just, you know, call up their Gmail account and they'll mail it to themselves.
Steve: Right.
Rennie: Then you've completely violated all the protocols. So, anyway, a very interesting article that way, getting, sort of, the insights from the security side about what some of the problems are that face online security.
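The point about guessable passwords can be made concrete. The sketch below—not from the episode, with a hypothetical five-entry word list standing in for the multimillion-entry lists real attackers use—shows why a common password like "password" falls on the first try of a dictionary attack, while the hashing makes clear that hiding the stored value is no protection if the password itself is predictable:

```python
# Illustrative sketch: a toy "dictionary attack" against a stored
# password hash. The word list is hypothetical and tiny; real attack
# lists contain millions of entries harvested from past breaches.
import hashlib
from typing import Optional

COMMON_PASSWORDS = ["password", "123456", "qwerty", "letmein", "fluffy"]

def sha256_hex(pw: str) -> str:
    """Hash a password with SHA-256. (Illustration only: real systems
    should use a slow, salted scheme such as bcrypt or scrypt.)"""
    return hashlib.sha256(pw.encode("utf-8")).hexdigest()

def dictionary_attack(stored_hash: str) -> Optional[str]:
    """Try each common password; return the match, or None."""
    for guess in COMMON_PASSWORDS:
        if sha256_hex(guess) == stored_hash:
            return guess
    return None

# A user who chose "password" is cracked immediately, even though
# only the hash was stored:
leaked_hash = sha256_hex("password")
print(dictionary_attack(leaked_hash))  # prints: password
```

A random gibberish string, by contrast, simply never appears in the attacker's list, which is the whole argument behind the "rules for a really good password" the round table describes—and behind the usability complaint that those rules are too hard for people to follow.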
Steve: Let's talk just a bit; there's a fun piece in the magazine on spy gadgetry.
Rennie: That's right. Yeah, we have a simple sort of pictorial piece that takes a look at a lot of the current and cutting-edge kinds of new surveillance equipment. Cameras, of course, get smaller and smaller all the time. Basically it's easy for anybody to have a camera, anyplace, anytime, anywhere, and you wouldn't necessarily know it. But the world of surveillance is constantly expanding. So, for example—and some people may be aware of this—it is possible, if you and I are speaking in this room and there are windows over there, for someone across the street to train a laser on the window panes of the room, measure the minute vibrations of the window glass with the laser, and reconstruct what we're saying as a result.
Steve: Wow. So the window glass acts as a speaker.
Rennie: Right, that's right. And you know, thanks to other kinds of amazing improvements in robotics, in principle we're getting closer and closer to the day when you could build a, sort of, little spybot—something that could literally look a little bit like a good-sized insect—and it could crawl through a heating vent into the room and position itself, or it might actually be able to fly around. It might be something that looks like a little dragonfly. So, you know, that's also the sort of new era of surveillance that we're in. The reality is that these days you can almost always be under surveillance. It is almost impossible for you to be absolutely sure, no matter where you are, that somebody couldn't be watching you if they wanted to.
Steve: So the special issue, the future of privacy, really sets up the terms of what's going on right now rather than coming to any vast conclusions.
Rennie: That's right. This is not an issue that leads up to a particular argument, trying to say that the changes in privacy are a good thing or a bad thing, or that we need to do something to, you know, curtail certain technologies, or that we're 100 percent behind certain technologies. In a sense, this is what should happen with any kind of new technology or set of technologies. We have to debate how we're going to use them. We have to feel our way through that. And the reality is, there are pluses and minuses to any of this. The world is always changing, and our conceptions of what we want out of the world are always changing. So what's smart is for us to learn as much as possible about the technologies involved and the potential issues, and then make good, prudent choices along the way.
Steve: The September single topic privacy issue of Scientific American is available in digital form at www.SciAm.com/SciAmmag.
(music)
Steve: Now it's time to play TOTALL....... Y BOGUS. Here are four science stories; only three are true. See if you know which story is TOTALL....... Y BOGUS.
Story number 1: Chemists have created a new type of cinnamon-based paper packaging that could keep baked goods fresher for an extra 10 days.
Story number 2: If you want to build a successful galaxy, you better have a minimum mass of at least 10 million times that of our sun.
Story number 3: Children burn more than four times as many calories per minute playing an active video game than a game they sit to play.
And story number 4: One good thing about a sleepless night: It can depress an overactive immune system.
We will be back with the answer after this.
(whistling sound)
Male voice: Hello sports fans. The football season starts this week. To get ready don't miss Scientific American's special look at the science of football. Learn why coaches should go for it on fourth down more, the mysteries of turf toe and how former NFL draft pick Leland Melvin became an astronaut. It's all in SciAm's special football package at the SciAm Web site, www.SciAm.com and at www.snipurl.com/SciAmfootball.
Steve: Time is up.
Story number 1 is true. Spanish researchers have developed a new type of paper packaging made with cinnamon oil, which kills microbes. The packaging appears to keep bread and other baked goods fresh for up to an extra 10 days. The report appears in the Journal of Agricultural and Food Chemistry.
Story number 2 is true. The minimum mass required for a galaxy seems to be about 10 million times that of our sun. That's according to research in the journal Nature. Measurements of the smallest known galaxies found them all to have about that mass, which appears to be what you need to get stars that form within the region to clump together.
And story number 3 is true. Active video games get children to burn more than four times as many calories as sedentary video games do. That research appears in the Archives of Pediatrics and Adolescent Medicine, and the accompanying editorial is called "Active Gaming May Be Part of the Solution to the Obesity Crisis".
All of which means that story number 4, about a sleepless night depressing immunity, is TOTALL....... Y BOGUS. Because what is true is that even a single night's loss of sleep can rev up immunity against healthy tissues, increasing inflammation. The study is in the journal Biological Psychiatry. So a good night's sleep can lower the risk of heart disease and autoimmune disorders like rheumatoid arthritis.
Well that's it for this edition of the weekly SciAm podcast. Visit http://www.SciAm.com for all the latest science news and to check out the special report on football. For Science Talk, the weekly podcast of Scientific American, I'm Steve Mirsky. Thanks for clicking on us.
Science Talk is a weekly podcast, subscribe here: RSS | iTunes