THE DANGEROUS HUMAN ELEMENT
Users themselves can be the Achilles' heel of security systems because of their propensities for error and their tendency (however unwittingly) to trade data safety for ease of use. As such, it falls to technology to compensate for the potential failings of users.
HEIM: We should not underestimate the human element. There is a tendency among technologists in the U.S. to see technology as the solution. Nowadays we see that neglect and poor maintenance of systems, which lead to failures in security, also have a broad impact.
I liken it to driving. The reason we have controls in place such as driver’s licenses is so that people at least have a basic understanding of the rules of the road and how to operate a vehicle safely, so that we can minimize those risks. I don’t think there’s been enough educational outreach to end-users on how to use their systems safely. I’m not necessarily proposing there needs to be a “cyber driver’s license,” but you know, that probably wouldn’t be a bad idea because we see that many, many of the observed problems are behavioral in nature.
DIFFIE: See, that’s exactly what would be an utterly monstrous idea. Cyberspace is the world of the future. If you don’t have a right to be there, you don’t have a free society.
LANDWEHR: I have a story that illustrates this kind of human element. It was my first exposure to identity theft, in some respects. Back in 1992, I applied for a credit card (I probably wanted to get more points or a free toaster or something) and was denied. I requested a copy of my credit report, and it had an item on it from a collection agency. I called them up and asked, "What's this? I have no idea what it is." (And of course, calling a collection agency and saying "I have no idea what this is and I don't owe you money" is going to be an immediate uphill battle for anyone.) They said that a patient had gone to a doctor in Florida for $75.00 worth of medical services and hadn't paid his bill.
What happened was that somebody in the clinic had written down that patient’s social security number, which was the same as mine except for one digit. On the medical record, there may have been a handwritten six or eight or nine in which a loop was left open, or something like that. Then that error propagated through the whole system. So, not only did that billing get put on my credit reports, but that person’s name ended up on my credit report, too.
I then called the credit reporting agencies and said, "I'm not this person. It's pretty easy for me to prove this is not my name." They said, "Well, is this your social security number?" And I said, "Yes, but that's not my name, and I've never seen this doctor before. I've never been to a doctor in Florida." The credit agencies then took it off; the collection agency put it back on. It was kind of a ping-pong game that went on for several months. Finally I had to get a law firm to send the credit reporting agency and the collection agency a letter saying, "We're prepared to go into court. We will walk our client into the courtroom to prove that he is not the patient in question."
The story gets even better. After the attorneys sent out this letter, I got a nice letter back in the mail saying “We’re sorry. There was a mistake somewhere down the line where this person” and they spelled out his actual name, “had a social security number typo. Their social security number is this; your social security number is that, and you could see how this could happen.” They actually printed the other guy’s name and social security number in my letter!
So, even upstream of the technologies that you’re talking about, the human elements definitely apply. That’s why the education needs to be there. Throughout the whole process, you need to be able to look at the system and say, “When something goes wrong, how do you prove who you say you are? And how do you prevent somebody else from claiming that they’re you?”
DIFFIE: But the fault in that story is one of liability. The point is, the main thing people use power for is to negotiate their way out of liability. That is exactly what the credit collection industry has done. If it bore strict liability for its errors, and so was obliged to pay you back for the money and time that it had cost you, there would be far fewer of these errors. And that’s never going to happen because that industry, like the rest of our industries, has tremendous power to say, “You will cripple us and damage society if you make us live up to these standards.”
ABHYANKAR: The human element is something that we can't ignore. We recently celebrated the 30th anniversary of spam. Email continues to be something that gets exploited. There is a dark underbelly to technology, and the bad guys' rate of innovation, along with their techniques of social engineering to steal your data, is that much further ahead of the good guys' rate of innovation. That's something that technology alone is not going to solve.
GILLILAND: If you look at the research that we've been doing, around 98 percent of data loss comes from human error and process breakdown. Being in the security industry, we're always going to be fighting the bad guys. But the bad guys are the lesser part of the data-loss problem. The reality is that even after 30 years of spam, the bad guys are going to continue to invest in innovation, just as we invest in innovation, because they make money sending spam. Being able to steal information is always going to be a business for somebody, and you can't ever fight all of them 100 percent. But we can stop the large percentage that is human and process error.
HEIM: Beyond the behavioral, there are also structural challenges with individuals. We see this on a day-to-day basis: if the technology organization itself can't anticipate the needs of individuals, they will find their own ways to get their jobs done, in many cases using consumer-grade technologies.
SHERSTOBITOFF: Right. We can’t keep your information secure if you’re going to email it to yourself over Gmail so that you can work from home.
HEIM: Sure, if individuals are not enabled through secure technology, they will compensate using consumer technologies, such as putting in a wireless access router or copying data to a USB drive. So there are technological challenges, but there are challenges on the economics, too. What does it take to do information technology right? To do it securely and in a manner such that people can get their jobs done and they don’t have to backdoor the process?
DIFFIE: In short, a lack of features is frequently a security problem. If the system doesn't offer you the ability to do what you need to do securely, you will do what you need to do anyway. This problem has been known in the military since the First World War.
GILLILAND: And that’s the problem of enablement versus protection. How do you get businesses to effectively work with technology that may or may not have the functionality or features that allow it in a fast, safe, seamless way?
SHERSTOBITOFF: Another thing about process breakdown is that it creates prime breeding grounds for cybercriminals wherever configuration and change management are not up to standard. It's a lot easier to get information out of an organization that does not have strict processes to control or protect it. The hackers begin to understand: "Hey, you know what? This isn't up to standard, so it's a lot easier to attack."
So, it's kind of like burglary. I'm not going to attack the house with 20 alarms and surveillance cameras. I'm going to the location that has easy access, where the locks are easily picked, where there are no surveillance cameras and it's dark. And if people start sending data out through a backdoor channel, that opens the door to interception and man-in-the-middle attacks.