From The Wall Street Journal
As technical defenses against cyberattacks have improved, attackers have adapted by zeroing in on the weakest link: people. And too many companies are making it easy for the attackers to succeed.
An analogy that I often use is this: You can get a stronger lock for your door, but if you are still leaving the key under your mat, are you really any more secure?
It isn’t as if people aren’t aware of the weapons hackers are using. For instance, most people have heard of, and probably experienced, phishing—emails or messages asking you to take some action. (“We are your IT dept. and want to help you protect your computer. Click on this link for more information.”) Although crude, these tactics still achieve a 1% to 3% success rate.
Then there are the deadlier, personalized “spearphishing” attacks. One example is an email, apparently sent from a CEO to the CFO, that opens by mentioning things they discussed at dinner last week and requests an immediate money transfer for a new high-priority project. These attacks are increasingly popular because their success rate is high.
The common element of all these kinds of attacks: They rely on people falling for them.
Too much information
Such gullibility, unfortunately, is a self-inflicted wound, the product of a cyberculture in which people willingly share all kinds of information and constantly try new things. There is much that is good about that, but also much that is dangerous. Now is the time for companies and institutions to change that culture. It won’t be easy, and it will take time, but it’s crucial if we want our companies and information to be safe from cybertheft. We have to start now, and we have to do it right.
Unfortunately, most of the things companies are doing simply don’t work. In our studies, we have found that such typical initiatives as distributing fliers about cybersecurity, sending people to a one-time training class, or asking them to view a 30-minute video are pretty much worthless. People do them, but they don’t retain enough information and they don’t change their behaviors.
To understand what does work, it’s helpful to learn from prior successful efforts to change our culture. A good example is smoking. When the U.S. Surgeon General’s 1964 report outlined the dangers of smoking, things didn’t change immediately. Photos of lungs blackened by smoking, though disturbing, had little impact, prompting responses like, “It’s my body; leave me alone.” What had more impact over time was society’s reaction: smoking is not just about you, but about the others your actions affect, including the damage it does to your family.
Out of curiosity
To better understand how far we have to go in creating a cybersafe culture, consider this: If you were touring a nuclear plant and saw a big red valve with a sign that said “Do not touch,” would you turn it? I would guess not. But in a phishing test conducted at a major financial-services firm, one of the test emails actually said: “This is a Phishing Test. Clicking the link below will cause harm to your computer.” At least one executive clicked it. When asked why, he said, “I was curious to see what it would do.”
Read the full post at The Wall Street Journal.
Stuart Madnick is the John Norris Maguire Professor of Information Technologies at the MIT Sloan School of Management, a Professor of Engineering Systems at the MIT School of Engineering, and the Founding Director of Cybersecurity at MIT Sloan: the Interdisciplinary Consortium for Improving Critical Infrastructure Cybersecurity.
MIT Sloan Executive Education has launched a new program, “Cybersecurity Leadership for Non-Technical Executives.”