The Dilemma of Cybersecurity

Shortly before the 1990–1991 Persian Gulf War, also known as Operation Desert Storm, two teenagers from the Netherlands hacked into the United States Department of Defense’s (DOD) new logistics system and gained control over it, according to Rebecca M. Slayton, Science and Technology Studies. “They might have stopped or diverted shipments of weapons or other critical supplies, and that might have had a devastating effect on military operations,” she says.

United States–led coalition forces achieved their military objectives in the Gulf War in a matter of weeks. But Slayton points out that things could have been quite different if the two hackers had had their way. The teens had offered to manipulate the DOD’s system for Iraqi President Saddam Hussein in exchange for one million dollars. “The Iraqi government declined,” Slayton says. “If it had not, we might think of the war in a completely different way today.”

Cybersecurity at the Department of Defense

Slayton is currently working on a paper about the history of cybersecurity expertise in the DOD. The story about the hackers and the Gulf War is an example of events that brought the DOD to the growing realization that the information technology it increasingly relied on for military strength was also a vulnerability. “There’s no way to make these systems invulnerable to hacking, and that can have really big military consequences,” Slayton says.

Even today, though, Slayton points out, the DOD is not nearly as good at defending its own cyber systems as it is at attacking other systems. “U.S. Cyber Command (part of the DOD) is the most capable cyber attacker in the world,” she says. “So why are they so good at offense and not at defense? In general, defense is harder because the goals are more complex. You’re trying to keep a very complex information network running properly, without any malicious activity. If you’re an attacker, on the other hand, you might have a relatively simple goal, such as compromising one computer.”

In her current paper, Slayton takes things a step further, arguing that the DOD’s problem is more than just the complexity of defense. “The DOD’s information technology procurement and management has historically been very decentralized, which makes the job of cyber defense very difficult,” she says. “Additionally, war fighting is the military’s top priority, and cybersecurity often seems more like tedious technology management than combat. This leads to cultural problems, where some parts of the military relax cybersecurity to achieve goals that seem more urgent. Good security practice is not always convenient.”

How Do You Prove Cybersecurity Expertise?

Slayton’s research into the history of the DOD’s cyber defense is part of a larger book project, Shadowing Cybersecurity, in which she looks at the rise of cybersecurity expertise through time and across different organizations. “Expertise is really about trust,” she says. “It’s not enough to have knowledge or skills; you have to convince others you have them in order to be effective. So how does that work in the context of cybersecurity? Everyone who works in the field will acknowledge that they can’t give you perfect security, that if someone really wants to break into your system, then given enough time and resources, they will.”

Since they can’t guarantee system security, cybersecurity experts often seek to demonstrate that they’re good at their job by hacking the system. “They break into it to prove they know it well enough to defend it,” Slayton says. “That’s an unusual way to demonstrate expertise. Doctors don’t break your arm to show that they’re good doctors.”

These problems make identifying expertise difficult for organizations looking to hire a security expert, and while there are some professional certifications, they only go so far. “Just having a credential doesn’t necessarily make you competent to do the particular job that an organization needs you to do,” Slayton says.

Securing Industrial-Control Systems

The needs of cybersecurity can run up against the needs of the system as well, especially when it comes to industrial-control system computers that run infrastructures 24/7. “You can’t shut down the electrical power grid for an hour to update security,” Slayton says. “And yet security often needs updating. Also, technology used in the electrical grid and other industrial-control systems has traditionally been purchased with the expectation that it will last 20 or 30 years, but computers have a very different timescale in terms of how quickly they need to be updated or replaced.”

These competing demands make it difficult to decide how a system should be secured, Slayton says. As an example, she points to issues with protecting the United States power grid. The Federal Energy Regulatory Commission authorized an industry group, the North American Electric Reliability Corporation, to make and enforce Critical Infrastructure Protection (CIP) standards for the generation and transmission of electricity. “But what effect have these standards had on the grid?” Slayton asks. “Do they actually improve security?”

Slayton worked with Aaron Clark-Ginsburg, at the time a postdoctoral researcher and now at the RAND Corporation, to investigate these questions. The researchers found that the CIP regulations actually had a leveling effect, causing some companies to improve their security and others to lower theirs. For example, one of the requirements is that an energy supplier must both have a security policy and enforce it. This results in some companies with high security standards—say, requiring computer updates every month—being penalized when they miss their own update deadline due to an unforeseen problem such as an electrical outage.

“The supplier may have actually enforced the minimum federal standard, but they got dinged because they didn’t enforce their own, higher standard,” Slayton explains. “That ends up causing the supplier to lower their standards to a new minimum because they don’t want to get in trouble for not enforcing a better policy. The regulations set up a sort of perverse incentive.”

Retooling a Career

Slayton conducted her doctoral research in physical chemistry but found herself drawn to history and the social sciences. She retooled her career to focus on the history of science and technology in an effort to better understand the authority of science. “I have mixed feelings about the authority of science and technology,” she says. “Science and technology can be powerful forces for good, but they have also been developed and used in ways that are oppressive. Studying the history of science gives us insight into these processes. It shows that what we accept as true is always influenced to some extent by culture and by society, and it changes over time.”
