by Bruce Schneier
Chief Security Technology Officer, BT
Mr. Schneier is a world-renowned security guru. He sends out “good reads” 10 or 12 times a year. You should subscribe.
Conficker’s April Fool’s joke — the huge, menacing build-up and then
nothing — is a good case study in how we think about risks, one whose
lessons are applicable far outside computer security. Generally, our
brains aren’t very good at probability and risk analysis. We tend to use
cognitive shortcuts instead of thoughtful analysis. This worked fine for
the simple risks we encountered for most of our species’ existence, but
it’s less effective against the complex risks society forces us to face
today.
We tend to judge the probability of something happening by how easily we
can bring examples to mind. It’s why people tend to buy earthquake
insurance after an earthquake, when the risk is lowest. It’s why those
of us who have been the victims of a crime tend to fear crime more than
those who haven’t. And it’s why we fear a repeat of 9/11 more than other
types of terrorism.
We fear being murdered, kidnapped, raped and assaulted by strangers,
when friends and relatives are far more likely to do those things to us.
We worry about plane crashes instead of car crashes, which are far more
common. We tend to exaggerate spectacular, strange, and rare events, and
downplay more ordinary, familiar, and common ones.
We also respond more to stories than to data. If I show you statistics
on crime in New York, you’ll probably shrug and continue your vacation
planning. But if a close friend gets mugged there, you’re more likely to
cancel your trip.
And specific stories are more convincing than general ones. That is why
we buy more insurance against plane accidents than against travel
accidents, or accidents in general. Or why, when surveyed, we are
willing to pay more for air travel insurance covering “terrorist acts”
than “all possible causes”. That is why, in experiments, people judge
specific scenarios more likely than more general ones, even if the
general ones include the specific.
Conficker’s 1 April deadline was precisely the sort of event humans tend
to overreact to. It’s a specific threat, which convinces us that it’s
credible. It’s a specific date, which focuses our fear. Our natural
tendency to exaggerate makes it more spectacular, which further
increases our fear. Its repetition by the media makes it even easier to
bring to mind. As the story becomes more vivid, it becomes more convincing.
The New York Times called it an “unthinkable disaster”, the television
news show 60 Minutes said it could “disrupt the entire internet” and we
at the Guardian warned that it might be a “deadly threat”. Naysayers
were few, and drowned out.
The first of April passed without incident, but Conficker is no less
dangerous today. About 2.2m computers worldwide are still infected with
Conficker.A and B, and about 1.3m more are infected with the nastier
Conficker.C. It’s true that on 1 April Conficker.C tried a new trick to
update itself, generating a fresh list of tens of thousands of candidate
rendezvous domains each day, but its authors could have updated the worm
using another mechanism any day. In fact, they updated it on 8 April, and
can do so again.
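To make that update channel concrete, here is a minimal sketch of date-seeded domain generation, the general technique Conficker.C used to rendezvous with its authors. It is an illustration in Python, not Conficker’s actual algorithm; the SHA-256 hash, the domain count, the candidate_domains name, and the .example suffix are all assumptions chosen for readability:

    import hashlib
    from datetime import date

    def candidate_domains(day, count=10):
        """Derive the day's candidate rendezvous domains from the date.

        Both the worm and its authors can compute the same list
        offline, so no server address is hard-coded in the binary.
        The hash, count, and .example suffix are illustrative only.
        """
        domains = []
        for i in range(count):
            seed = f"{day.isoformat()}:{i}".encode()
            label = hashlib.sha256(seed).hexdigest()[:12]
            domains.append(label + ".example")
        return domains

    if __name__ == "__main__":
        # An infected machine would try each domain and pull an update
        # from whichever one the authors registered that day.
        for d in candidate_domains(date.today()):
            print(d)

Because the worm and its authors derive the same list independently, defenders who want to cut off updates must pre-register or block every candidate domain. Widening the daily list, as Conficker.C did, makes that countermeasure dramatically harder.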
And Conficker is just one of many dangerous worms run by criminal
organizations. It came with a date and got a lot of press (that 1 April
deadline was more hype than reality), but it’s not particularly special.
Criminal organizations all over the internet use worms and other forms
of malware to infect computers.
They then use those computers to send spam, commit fraud, and infect
more computers. The risks are real and serious. Luckily, keeping your
anti-virus software up-to-date and not clicking on strange attachments
can keep you pretty secure. Conficker spreads through a Windows
vulnerability (MS08-067) that was patched in October 2008. You do have
automatic updates turned on, right?
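If you would rather verify that specific patch than take it on faith, here is a minimal Python sketch. It assumes a Windows machine where the wmic command-line tool is available, and that KB958644, Microsoft’s security update for MS08-067, is the hotfix to look for:

    import subprocess

    # KB958644 is Microsoft's security update for MS08-067, the
    # vulnerability Conficker exploits to spread.
    MS08_067_KB = "KB958644"

    def is_patched(kb=MS08_067_KB):
        """Return True if the hotfix appears in the installed list."""
        output = subprocess.run(
            ["wmic", "qfe", "get", "HotFixID"],
            capture_output=True, text=True, check=True,
        ).stdout
        return any(line.strip() == kb for line in output.splitlines())

    if __name__ == "__main__":
        if is_patched():
            print("MS08-067 patch found; that infection vector is closed.")
        else:
            print("Patch not found; turn on automatic updates.")

Of course, with automatic updates turned on you never need to run a check like this, which is rather the point.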
But people being people, it takes a specific story for us to protect
ourselves.