WHO IS RESPONSIBLE?
The panelists agreed on certain priorities for maintaining or strengthening data security. Some of these were technological or related to users’ experience of various systems, but regulatory and legal frameworks were also crucial.
DIFFIE: I think that probably the root cause of the insecurities that already plague us is the terrific ability of the information security industry to get itself out from under liability. If we want a secure internet, the right thing to do is set a deadline. Basically say, “In 10 years we’re going to have strict liability in software security. And that means you better develop the technology so that you can answer to that responsibility.” It wouldn’t do any good to insist on doing it overnight. It would just bankrupt Microsoft and probably the rest of us. But I believe it is achievable as a 10-year national goal. I proposed this to the National Academies in 2002. We’re now six years into my 10-year proposal and it hasn’t happened yet. [LAUGHTER.]
SADLER: You think that’s a national goal rather than an international goal?
DIFFIE: Yes, it’s an international goal. For the U.S. to make it a national goal would go a long way toward making it an international goal.
The foremost influence on these things in the next decade is going to be web services, and what I call digital outsourcing. Right now our business religion in the U.S. is that you outsource everything that isn’t one of your core capabilities. We’re going into a world where there will be a million computational services that somebody else can do for you better than you can do for yourselves.
What we see today with Google is just a camel’s nose under the tent. Every organization in the U.S.--even ones that are draconian about watching their employees’ e-mail, etc.--lets people query Google as a research tool. Which means that the people with access to the Google query stream--who on the face of it are just Google themselves, but who knows--know what every development group in the country is doing. What every legal group in the country is doing. What every marketing group in the country is doing.
Ten years from now, you’ll look around and see that what we call secure computing today will not exist. That is, we say now that you’ve computed something securely if you did it on your own machines and you protected them adequately. Ten years from now, every major business program will constantly be reaching outside in-house systems to the rest of the Internet.
So what is going to be needed is a legal framework that obliges contractors to protect the security of the information. But they cannot respond to the obligation unless the technical machinery can be developed to allow them to protect that information.
GILLILAND: Yes, but if you look at how customers are actually implementing technology today, they’re already far behind what it can do. That’s not to say that this isn’t the direction in which we should be heading as a country and as an industry, but that’s not necessarily the problem now. The problem is how to make this technology practical, so that customers can actually address their own privacy issues and auditing processes and manage the protection of their data to current standards. For the most part, they’re not doing that today.
LIPNER: I think there are two components. One is to get the underlying pieces of the infrastructures robust enough so that it’s hard to do the sorts of attacks that Whit was alluding to. And then the second is to provide the infrastructure so that you know, both as a practical matter and with legal assurance, whom you’re dealing with and what kinds of assurances you have about your interactions with them.
For the business customers, you want the sort of things that Art and Whit are talking about: assurance about what will be done with your data, ways to describe the restrictions on it and so on. For the consumer, you want an environment that they trust and that just works--because a lot of the growth of the Internet and Internet business is based on consumer confidence. We need to increase that confidence and ensure that it’s justified.
GILLILAND: The interesting balance that we have to figure out is, how do you enable businesses to continue to share information as rapidly as possible so they can make good decisions, and yet make that sharing simple? Content filtering and other practices can be invisible to the end user yet controllable by an administrator, so that businesses can share information faster but still feel confident that it is secure.
DIFFIE: How can content filtering really be invisible? If I send an email that violates the standard, it gets censored on the way to somebody. Clearly I’m going to notice.
GILLILAND: Yes, but you should make the classification of that data invisible to the end user. You’re not asking them to say, “Is this an important document?” You should ask, rather, “Is this something that shouldn’t actually be sent out?” Try to prevent people from accidentally sharing data that shouldn’t be shared.
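[The automatic, user-invisible classification Gilliland describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s product: the pattern names and the block-on-any-match policy are assumptions, and a real data-loss-prevention system would use far richer classifiers than regular expressions.]

```python
import re

# Hypothetical patterns an administrator might configure; real systems
# would add document fingerprinting, labels, or learned classifiers.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "internal_marker": re.compile(r"\bCONFIDENTIAL\b", re.IGNORECASE),
}

def classify(message: str) -> list[str]:
    """Label an outgoing message without asking the sender anything."""
    return [label for label, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(message)]

def should_block(message: str) -> bool:
    """Administrator-set policy (assumed here): block on any match."""
    return bool(classify(message))
```

The sender never judges importance themselves; the filter only intervenes when a message matches a pattern that shouldn’t leave the organization.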