Responses to winter storm Juno seem to show that you cannot please the public when it comes to preparedness. In this article Geary Sikich asks whether business continuity and emergency planners are missing something when it comes to communicating preparedness with the public.
I was supposed to be in Boston presenting at ‘The Disaster Conferences’ on 28 January 2015. Well, the weather has pushed that out to 19 March 2015, the date of the now rescheduled Boston conference. I imagine they are still feeling the effects of this week’s blizzard, now named ‘Juno’, which left Boston with over 24 inches of snow. According to the Weather Channel, Winter Storm Juno pounded locations from Long Island to New England with heavy snow, high winds and coastal flooding late Monday into Tuesday. The storm is now winding down, and the National Weather Service has dropped all winter storm and blizzard warnings for Juno.
Snow amounts in New York have ranged from 9.8 inches at Central Park in New York City to 30 inches on Long Island. Snippets from the Weather Channel and other news sources barrage us with the details of this latest storm:
- In Massachusetts, up to 36 inches of snow has been measured in Lunenburg, while Boston has seen 24.4 inches. Juno was a record snowstorm for Worcester, Massachusetts (34.5 inches). Incredibly, 31.9 inches fell in Worcester on Jan. 27 alone!
- Thundersnow was reported in coastal portions of Rhode Island and Massachusetts late Monday night and early Tuesday.
We want you to be prepared; but do not inconvenience me in the process!
There have been some amazing comments from the public regarding the preparedness measures taken by authorities in New York City (NYC), Philadelphia, etc. It seems that you cannot please the public when it comes to preparedness. I find this interesting in terms of the public psyche: the attitude amounts to something to the effect of "We want you to be prepared; but do not inconvenience me in the process!"
I speak at an average of 15 – 20 conferences nationally and internationally on an annual basis. I have conducted multi-day workshops on disaster preparedness, crisis management, risk management, etc. I have attended conferences put on by the American Red Cross, The International Emergency Management Society, Harvard, Continuity Insights, Disaster Recovery Journal, Business Resumption Planners Association, World Conference on Disaster Management, etc. It is interesting that in all my years in this business I have not heard one speaker talk about the apathy of the public when it comes to disaster preparedness.
We now have people from the National Weather Service apologizing for their reports and projections on Winter Storm Juno. We have people in city, state and national government apologizing for taking action to prepare their communities for the possibility of a disastrous situation. What are we missing when it comes to communicating with the public? Have we cried wolf too often? Does the public really not care about preparedness and personal safety?
What would have happened if the authorities ignored the warnings and allowed the public to fend for themselves? Would we be hearing the raging voices of public outcry as officials scrambled to respond to the event? Of course we would; the public would be outraged! They would demand that we be better prepared!
Yet, when public officials take the initiative, act early and prepare for an event, they are criticized for their efforts.
A colleague, Dennis Wenk, offered the following insight (edited for those of a sensitive nature):
“The hope that s**t won't happen in the future is three times greater than the reality of spending any real money today. I call it the Love-2-b**ch, hate-2-spend ratio”.
Public perception is ‘reality’
The public has a problem, and it is a simple one to define: short-term memory loss and long-term memory incongruence, causing a divergence in retained memories and a discounting of what was experienced.
How can we overcome this dichotomy, this irreconcilable clash between public perception and reality? We teach crisis communications courses, prepare spokespersons, and provide fantastic graphics and media to support and emphasize the message that we need to be better prepared. Yet, for all these efforts, the public continues to respond in the same manner – with apathy, distrust and wrath over the outcome, whether positive or negative.
Predicting the future is an integral part of business continuity planning and of human cognition. Yet, as Nassim Taleb argues in his latest book ‘Antifragile’, prediction is not achievable, due to what he calls opacity. Add complexity and the unseen effects of cascading events to that opacity and you have a recipe for second-guessing what business continuity planners are attempting to do when they conduct a business impact assessment/analysis (BIA) and develop business continuity plans, contingency plans, emergency plans, disaster recovery plans, etc. Cascading events can be defined as secondary adverse events, caused by an initial event, that lead to further subsequent events. They can cause immense and unexpected human, environmental, economic, social and political damage. According to C. Delvosalle, a domino accident is “a cascade of events in which the consequences of a previous accident are increased both spatially and temporally by the following ones, thus leading to major accident.”
Natural disasters may be followed by other natural or technological disasters, creating a chain of disasters coupled with cascading effects. For example, a storm can provoke a mudslide that is itself the cause of a transport accident. Understanding how the population behaves and reacts during such chained (cascading) disasters is essential for predicting future reactions and for acting to thwart the domino effect.
We design these programs (continuity, etc.) to help our organizations, the public, etc., with the expectation of some form of reciprocity: that they will help us in return by taking preparedness seriously and being able to adjust to future situations, with all the unpredictability that accompanies them.
Should we be considering greater public education, outreach and communication?
Accurate prediction (an oxymoron if you agree with Taleb) of events requires an array of skills. Content knowledge, analytics, anticipated impacts, etc. all amount to a best guess at what may transpire if an event occurs. There are too many variables to address to arrive at an accurate prediction. So, if the best we can do is provide a best guess at anticipated events and our reaction to them, how do we get public buy-in and support for our programs?
We need to educate the public, government, management, etc. on the impact of situational variables and the need to aggressively develop risk buffering strategies that will allow flexibility and rapid response to ever changing situations. This is not an easy process. Public perception of risk is not the same as that of government and the private sector. The public sees risk as a reality; not in the complex way that planners, risk managers, etc. define risk.
Dr. Peter M. Sandman defines public perception of risk as: ‘Risk = Hazard + Outrage’. Sandman further gives us five steps to manage outrage; as he indicates, the guidelines for mitigating and preventing outrage are deceptively obvious:
- Stake out the middle, not the extreme. In a fight between ‘terribly dangerous’ and ‘perfectly safe,’ the winner will be ‘terribly dangerous’. But ‘modestly dangerous’ is a contender. If you deserve a B-, activists can get away with giving you an F instead, but you won’t be able to get away with giving yourself an A.
- Acknowledge prior misbehavior. The prerogative of deciding when you can put your mistakes behind you belongs to your stakeholders, not to you. The more often and apologetically you acknowledge the sins of the past, the more quickly others decide it’s time to move on.
- Acknowledge current problems. Omissions, distortions, and ‘spin control’ damage credibility nearly as much as outright lies. The only way to build credibility is to acknowledge problems: before you solve them, before you know if you will be able to solve them; going beyond mere honesty to ‘transparency’.
- Discuss achievements with humility. Odds are you resisted change until regulators or activists forced your hand. Now have the grace to say so. Attributing your good behavior to your own natural goodness triggers skepticism. Attributing it to pressure greatly increases the likelihood that we’ll believe you actually did it.
- Share control and be accountable. The higher the outrage, the less willing people are to leave the control in your hands. Look for ways to put the control elsewhere (or to show that it is already elsewhere). Let others – regulators, neighbors, activists – keep you honest and certify your good performance.
Emphasizing the ability to transform in response to stresses and shocks seems to me a better approach to explaining the need for preparedness to the public.
Change the paradigm: think risk parity instead of resilience, antifragility and recovery
Risk parity is an approach that focuses on the allocation of risk, usually defined by exposure, velocity and volatility rather than allocation of assets to the risk. The risk parity approach asserts that when asset allocations are adjusted (leveraged or deleveraged) to the same risk level, risk parity is created resulting in more resistance to discontinuity events. The principles of risk parity will be applied differently according to the risk appetite, goals and objectives of the organization and can yield different results for each organization over time.
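The core mechanic of risk parity – sizing positions so that each contributes the same amount of risk, rather than the same amount of capital – can be illustrated with a toy calculation. The sketch below is not from the article; it is a minimal, hypothetical example that assumes each asset's risk can be summarized by a single volatility number and uses naive inverse-volatility weighting, a common simplification of full risk parity:

```python
def risk_parity_weights(volatilities):
    """Naive inverse-volatility weighting: each asset's weight is
    proportional to 1 / volatility, so every asset contributes the
    same standalone risk (weight * volatility is equal for all)."""
    inverse = [1.0 / v for v in volatilities]
    total = sum(inverse)
    return [x / total for x in inverse]

# Hypothetical two-asset example: equities are four times as volatile
# as bonds, so they receive a quarter of the bond allocation.
vols = [0.20, 0.05]                  # equities, bonds (annualized vol)
weights = risk_parity_weights(vols)
print(weights)                       # -> [0.2, 0.8]
```

A real risk parity program would also account for correlations between assets, leverage and the organization's risk appetite, as the paragraph above notes; this sketch conveys only the basic 'equalize the risk, not the dollars' idea.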
The greatest failure of most business continuity and enterprise risk management programs is that they cannot de-center. That is, they cannot see the risk from different perspectives internally or externally. Poor or no situation awareness generates a lack of expectancies, resulting in inadequate preparation for the future – until after the fact.
Being able to answer the ‘why’ question
As my colleague John Stagl often points out, senior management is focused on getting the answer to two questions in a crisis or a situation of discontinuity. These are:
Why didn’t the plan work? – In Croatian there is a phrase, ‘Ni luk jeo, ni luk mirisao’. It basically means to deny one's involvement in something; to insist you have nothing to do with the situation. The literal translation is ‘I haven't eaten the onion nor smelled it’ (I haven't been involved in this at all).
Question: Business continuity planner, you have generated a BIA and a BCP that gives us a lot of data, do you think that is appropriate in the current crisis?
Answer: Ni luk jeo, ni luk mirisao - I haven't been involved in this at all. It is also a poor excuse to offer that your program is not as mature as it should be or to state that the plan was based on ‘best practices’ (copying ‘best practices’ from other companies is more dangerous than helpful).
Why did the plan work? – While you may have dodged a bullet, taking credit for why the plan worked can be a double-edged sword. Yes, it worked, but generally speaking, not as it was written or intended. How many times have you seen someone in a crisis stop to read their plan? Heck, just getting some organizations to bring their plans to a drill is a major accomplishment. The plan worked because you were able to take action and get results.
Business continuity has its roots in the year 1790, when the US Coast Guard was established under the Treasury Department: continuity of waterways. The Coast Guard mission was (and still is) to protect US waters, keep goods and services flowing, regulate US waterway vessels and respond to incidents: business continuity at its core. But to conclude this article I will go back a bit further, to The Ingenious Gentleman, Don Quixote of La Mancha. We must ask ourselves, “Are we tilting at windmills?” Are we attacking imaginary enemies, engaged in after-the-fact explanations to the public? The phrase ‘tilting at windmills’ is sometimes used to describe confrontations in which adversaries are incorrectly perceived, or courses of action based on misinterpreted or misapplied heroic, romantic, or idealistic justifications. It may also connote an inopportune, unfounded and vain effort against confabulated adversaries.
Ultimately, building a culture of preparedness within the public arena, and getting the public to accept that 100 percent accurate prediction is not possible, can be the intangible asset that reduces public angst over things that did not happen as predicted.
Geary Sikich, entrepreneur, consultant, author and business lecturer, is a seasoned risk management professional who advises private and public sector executives to develop risk buffering strategies to protect their asset base. With a M.Ed. in Counseling and Guidance, Geary's focus is human capital: what people think, who they are, what they need and how they communicate. With over 25 years in management consulting as a trusted advisor, crisis manager, senior executive and educator, Geary brings unprecedented value to clients worldwide. Geary is well-versed in contingency planning, risk management, human resource development, ‘war gaming,’ as well as competitive intelligence, issues analysis, global strategy and identification of transparent vulnerabilities.
A well-known author, Geary's books and articles are readily available on Amazon, Barnes & Noble and the Internet.
Contact: G.Sikich@att.net or firstname.lastname@example.org
Apgar, David, Risk Intelligence: Learning to Manage What We Don’t Know, Harvard Business School Press, 2006.
Delvosalle, C., Domino Effects Phenomena: Definition, Overview and Classification, First European Seminar on Domino Effects, Leuven, 1996.
Freitag, Robert (Bob), “The time has come to mitigate and adapt”, invited comment, Natural Hazards Observer, September 2013.
Kami, Michael J., Trigger Points: How to Make Decisions Three Times Faster, McGraw-Hill, 1988, ISBN 0-07-033219-3.
Klein, Gary, Sources of Power: How People Make Decisions, MIT Press, 1998, ISBN 978-0-262-11227-7.
Odiorne, George S., Management and the Activity Trap, William Heinemann Ltd, 1975, ISBN 978-0434914302.
Sandman, Peter M., “Risk = Hazard + Outrage: Coping with Controversy about Utility Risks”, Engineering News-Record, October 4, 1999, pp. A19–A23.
Sandman, Peter M., “Managing Outrage: A Primer”, sidebar article in Engineering News-Record, October 4, 1999, pp. A19–A23.
Sikich, Geary W., “Managing Crisis at the Speed of Light”, Disaster Recovery Journal Conference, 1999.
Sikich, Geary W., “What is there to know about a crisis”, John Liner Review, Volume 14, No. 4, 2001.
Sikich, Geary W., “Graceful Degradation and Agile Restoration Synopsis”, Disaster Resource Guide, 2002.
Sikich, Geary W., Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty, PennWell Publishing, 2003.
Sikich, Geary W., It Can’t Happen Here: All Hazards Crisis Management Planning, PennWell Publishing, 1993.
Sikich, Geary W., The Emergency Management Planning Handbook, McGraw-Hill, 1995.
Sikich, Geary W., “In a World of ‘Black Swans’, How Do You Know Which One to Worry About”, http://www.academia.edu/6184928/In_a_World_of_Black_Swans_..
Stagl, John M., “It’s Time to Change Your Mindset”, Continuity Central, 2013.
Tainter, Joseph, The Collapse of Complex Societies, Cambridge University Press, 1990, ISBN 978-0521386739.
Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Random House, 2007, ISBN 978-1-4000-6351-2.
Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Second Edition, Random House, 2010, ISBN 978-0-8129-7381-5.
Taleb, Nassim Nicholas, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, updated edition, Random House, 2008, ISBN 978-1400067930.
Taleb, Nassim Nicholas, “Common Errors in Interpreting the Ideas of The Black Swan and Associated Papers”, NYU Poly Institute, October 18, 2009.
Taleb, Nassim Nicholas, Antifragile: Things That Gain from Disorder, Random House, 2012, ISBN 978-1-4000-6782-4.
This article is Copyright© Geary W. Sikich 2015. World rights reserved.
Published with permission of the author.