DAVID A. FISHER: PROGRAMMING PIONEER
As one of the primary lines of defense against hackers, cyberterrorists and other online malefactors, the CERT Coordination Center at Carnegie Mellon University is a natural target. So like many high-profile organizations, it beefed up its security measures after September's audacious terrorist attacks. Before I can enter the glass and steel building, I have to state my business to an intercom and smile for the camera at the front door. Then I must sign my name in front of two uniformed guards and wait for an escort who can swipe her scan card through a reader (surveilled by another camera) to admit me to the "classified" area. But these barriers--just like the patting down I endured at the airport and like the series of passwords I must type to boot up my laptop--create more of an illusion of security than actual security. In an open society, after all, perfect security is an impossible dream.
That is particularly true of computer systems, which are rapidly growing more complicated, interdependent, indispensable--and easier to hack. The tapestries of machines that control transportation, banking, the power grid and virtually anything connected to the Internet are all unbounded systems, observes CERT researcher David A. Fisher: "No one, not even the owner, has complete and precise knowledge of the topology or state of the system. Central control is nonexistent or ineffective."
This article was originally published with the title "Survival in an Insecure World."