What’s the real reason behind the biggest information security failures? Ask most security professionals and they’ll generally answer the same way, with varying degrees of politeness: “It’s a chair-keyboard interface error. The root cause is the meat puppet between the chair and the keyboard.”
We spend obscene amounts of money on security software, hardware and training, but our user community still represents the soft underbelly of every organization. It’s not even a dirty little secret anymore. Countless studies and surveys have been devoted to the topic, and entire sectors of the security industry are dedicated to improving user security awareness.
In his 2002 book, “The Art of Deception,” former hacker Kevin Mitnick writes about obtaining passwords and other sensitive information from companies by pretending to be someone else and simply asking for it. Things haven’t changed much. The colossal 2011 compromise of RSA began with a spear-phishing email carrying a malware-laden document. The Syrian Electronic Army’s defacements of the New York Times and Washington Post websites succeeded because of phishing scams aimed at staff of content partner Outbrain.
More than likely, users at these organizations went through yearly security awareness training. So why did they still click on that link?
We think about how easy security would be if users were as smart as we are. They must be stupid, because we give them policies and online training, but they keep making the same mistakes. Here’s a radical notion: maybe the problem isn’t the user. As Microsoft researcher Cormac Herley wrote in a research paper on security training, users “are offered long, complex and growing sets of advice, mandates, policy updates and tips. These sometimes carry vague and tentative suggestions of reduced risk, never security.”
As human beings, we’re wired to be efficient, rapid thinkers. Gerd Gigerenzer, a leader in the field of bounded rationality, proposes that the human brain tends to create heuristics -- frugal decision-making methods that let us arrive quickly at conclusions that are “good enough” under uncertainty. In his book “Gut Feelings,” he discusses one such example: catching a fly ball. Studies have found that people use a common heuristic -- keeping the angle of gaze to the ball constant -- in order to catch it. This is one of many shortcuts we humans use to optimize our time, because effort is never free.
Life moves rapidly around us. As emotional beings, we’re motivated by the desire to maximize reward while minimizing danger or punishment. It’s what has kept us alive for thousands of years. So is it really surprising that a user would click a link promising a free tablet or suggestive photos of a favorite celebrity when so much is competing for our attention? It seems to be the most economical choice. Some scientists even propose that technology is hijacking the brain’s reward system, driving us to seek the stimulating jolt of dopamine, a neurotransmitter linked to addiction.
Ultimately, the embarrassing cause of the security industry’s ongoing failure is a rejection of our own humanity. Leadership guru Simon Sinek said it best: “If you don’t understand people, you don’t understand business.” Maybe we should add, “Then you don’t understand security.”
This means we need to stop trying to use technology to solve the user problem; it’s really a protocol mismatch. Let’s talk to users, carbon unit to carbon unit. Then we can try to understand their needs while communicating ours, and find real solutions in the process.