This article was interesting to read. It posits that it isn’t enough for science to learn to predict catastrophic events; science also has to learn how to get that information to people in such a way that they actually do something about it.
This is a fair example of a term that we see pop up from time to time but no one ever really explains – normalcy bias. Wikipedia, naturally, defines it but the short version is this: you’re used to planes NOT hitting skyscrapers, you’re used to people NOT running into a bar and shooting the place up with AKs, you’re used to the earth NOT suddenly shaking and toppling buildings…and when an event like that does finally happen, many people’s brains, not being used to thinking about such things, kinda freeze up. That’s normalcy bias. That’s also why we wargame stuff…fire drills, for example.
Another buzzword that became popular and still pops up is ‘Black Swan’, as in “a Black Swan event”. Again, wikipedia helps us out. A Black Swan event is something so statistically rare that you can’t really predict its occurrence. It is important to note that a Black Swan event is NOT always negative…the development of the internet, for example. The criteria for something being a Black Swan event, as stated by the guy who brought the term into popular use, are that it is a complete surprise, it has a huge impact, and it is Monday Morning Quarterbacked afterwards. It is also subjective…what’s a Black Swan to you may not be a Black Swan to someone else. An excellent example is this idiot. A power outage occurs, they suffer, and then they rationalize it…and add a dose of normalcy bias to justify their failure.
I was watching a really bad disaster show on SyFy last night. Typical midwestern farm town, huge black clouds ominously roll across the sky and the townspeople…walk out of their shops and stand in the middle of the street to watch. And they continue to stand there as it approaches, hurling debris and automobiles in their general direction. Why? Because they’re slack-jawed at never having seen such things before. You’d think someone would have the brains to turn around, run back in the building, get on the phone, warn their family, and then hustle to the basement. Normalcy bias.
While the Black Swan event is, by definition, unpredictable, we can still prepare for the consequences even if we can’t predict the event. We don’t know if it’s going to be a flood, tornado, earthquake, the return of Xenu, planet X, or Red Dawn…but we can be pretty sure of what the consequences will be – electrical failure, infrastructure issues, communication issues, etc, etc. And we can prepare for those. I think most of us are a bit past normalcy bias because we tend to think a good deal about ‘what if’ situations.
Preparing for the ‘Black Swan’ type of event reminds me of the ‘Tenth Man’ response. In fact, go watch the video…the Mossad guy gives some very good examples of normalcy bias.
Anyway, ‘normalcy bias’ and ‘Black Swan’ are two buzzwords that seem to appear more and more in various articles on the subject of preparedness, and I thought it might be a good idea to explore them.