The focus of this article by Geary Sikich is on how the application of guidance (ISO 31000, FFIEC, etc.) often produces the mere appearance of compliance, a checkbox exercise, rather than organizations actually and actively identifying and managing risk.
In Risk Management: History, Definition and Critique by Georges Dionne (March 2013, CIRRELT-2013-17), the opening statement of the Abstract is revealing:
“Risk management has long been associated with the use of market insurance to protect individuals and companies from various losses associated with accidents. Other forms of risk management, alternatives to market insurance, surfaced during the 1950s when market insurance was perceived as very costly and incomplete for protection against pure risk. The use of derivatives as risk management instruments arose during the 1970s, and expanded rapidly during the 1980s, as companies intensified their financial risk management. International risk regulation began in the 1990s, and financial firms developed internal risk management models and capital calculation formulas to hedge against unanticipated risks and reduce regulatory capital. Concomitantly, governance of risk management became essential, integrated risk management was introduced and the first corporate risk officer positions were created. Nonetheless, these regulations, governance rules and risk management methods failed to prevent the financial crisis that began in 2007.”
We see, all too often, organizations complying with conflicting regulatory guidance in order to preclude fines and maintain the appearance of compliance regardless of the cost (lost opportunities, minimized financial gain, fear of fines, etc.). Yet few of these organizations actually consider the cost of poor risk management practices.
If you carry out a Google search on crisis and operational risk loss, you will note that many of the top results focus mainly on the 2008 financial crisis. Could one assume that only the financial industry is concerned about risk management, or that other industry segments do not suffer operational risk losses? Of course it would be ludicrous to make either assumption. Perhaps what we are seeing is a lack of differentiation when it comes to operational risk losses. That is to say, we may need to think of operational risk in terms of a hierarchy instead of a horizontal delineation. I would suggest that organizations create three tiers of operational risk: strategic, operational and tactical. One could then create matrices, expandable over time, that provide a linkage between and among the three tiers.
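The tier hierarchy and linkage matrices described above could be sketched, purely for illustration, along the following lines. The tier names come from the article, but the example risks, link weights and threshold are hypothetical assumptions, not drawn from any standard:

```python
# Hypothetical sketch of a three-tier risk hierarchy with a linkage
# matrix connecting higher-tier risks to lower-tier risks.
TIERS = ("strategic", "operational", "tactical")

# Example risks registered at each tier (illustrative only).
risks = {
    "strategic": ["market entry failure"],
    "operational": ["supplier disruption"],
    "tactical": ["shipment delay"],
}

# Linkage matrix: maps a (higher-tier risk, lower-tier risk) pair to an
# assumed connection strength between 0.0 and 1.0.
linkage = {
    ("market entry failure", "supplier disruption"): 0.7,
    ("supplier disruption", "shipment delay"): 0.9,
}

def linked_risks(risk, threshold=0.5):
    """Return lower-tier risks linked to `risk` at or above `threshold`."""
    return [low for (high, low), weight in linkage.items()
            if high == risk and weight >= threshold]

print(linked_risks("market entry failure"))  # ['supplier disruption']
```

Expanding the `linkage` matrix over time is what provides the cross-tier traceability the article suggests: a tactical shipment delay can be walked back up to the strategic exposure it feeds.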
Before we go further, perhaps some definitions, that may or may not clarify things, should be presented:
Strategic risk: a possible source of loss that might arise from the pursuit of an unsuccessful business plan. For example, strategic risk might arise from making poor business decisions, from the substandard execution of decisions, from inadequate resource allocation, or from a failure to respond well to changes in the business environment (1).
Operational risk: the Basel II Committee defines operational risk as "the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events." According to Simplicable, operational risk is the chance of a loss due to the day-to-day operations of an organization. Every activity entails some risk; even highly optimized processes generate risk. Operational risk can also result from a breakdown of processes or from the management of exceptions that aren't handled by standard processes. It should be noted that some definitions of operational risk suggest that it is the result of insufficient or failed processes. However, risk can also result from processes that are deemed sufficient and successful. In a practical sense, organizations choose to take on a certain amount of risk with every process they establish.
Tactical risk: tactical risk is the chance of loss due to changes in business conditions on a real-time basis. Tactics differ from strategy in that they handle real-time conditions. In other words, a strategy is a plan for the future, while a tactic is a plan to handle real-world conditions as they unfold. As such, tactical risk is associated with present threats rather than long-term conditions. Tactics and strategy are both military terms. Military organizations primarily view tactical risk in terms of conditions on a battlefield. An army may identify strategic risks before a battle, but tactical risks can only be identified as they unfold.
Confused by the definitions? Don’t worry; you are not alone. It seems that few can agree on standard definitions and examples that are directly related to the level of risk (strategic, operational or tactical) being discussed, assessed, defined, etc. It may seem as if there are as many risk definitions and examples as there are risks, not to mention the 7 billion+ people who inhabit our globe.
According to Risk Management: History, Definition and Critique by Georges Dionne, there are five main risks:
1. Pure risk: (insurable or not, and not necessarily exogenous in the presence of moral hazard);
2. Market risk: (variation in prices of commodities, exchange rates, asset returns);
3. Default risk: (probability of default, recovery rate, exposure at default);
4. Operational risk: (employee errors, fraud, IT system breakdown);
5. Liquidity risk: risk of not possessing sufficient funds to meet short-term financial obligations without affecting prices. May degenerate into default risk.
ISO 31000:2009 lists the following options for dealing with risk:
- Avoiding the risk by deciding not to start or continue with the activity that gives rise to the risk;
- Accepting or increasing the risk in order to pursue an opportunity;
- Removing the risk source;
- Changing the likelihood;
- Changing the consequences;
- Sharing the risk with another party or parties (including contracts and risk financing);
- Retaining the risk by informed decision.
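The options above can be encoded as a small enum, sketched here for illustration only; the `suggest` triage rule below is an invented example of how an organization might map likelihood and impact scores to a treatment, and is not part of ISO 31000:

```python
# Illustrative sketch: the ISO 31000:2009 treatment options as an enum,
# plus an assumed (non-standard) triage rule for choosing among them.
from enum import Enum

class RiskTreatment(Enum):
    AVOID = "avoid the activity that gives rise to the risk"
    ACCEPT_OR_INCREASE = "accept or increase the risk to pursue an opportunity"
    REMOVE_SOURCE = "remove the risk source"
    CHANGE_LIKELIHOOD = "change the likelihood"
    CHANGE_CONSEQUENCES = "change the consequences"
    SHARE = "share the risk with another party (contracts, risk financing)"
    RETAIN = "retain the risk by informed decision"

def suggest(likelihood, impact):
    """Toy triage rule (assumed): avoid high/high risks, share
    high-impact ones, retain low/low, otherwise reduce likelihood."""
    if likelihood > 0.7 and impact > 0.7:
        return RiskTreatment.AVOID
    if impact > 0.7:
        return RiskTreatment.SHARE
    if likelihood < 0.3 and impact < 0.3:
        return RiskTreatment.RETAIN
    return RiskTreatment.CHANGE_LIKELIHOOD

print(suggest(0.9, 0.9).name)  # AVOID
```

Any real mapping would, of course, be situational; the point of the sketch is only that the standard's options form a closed menu from which a treatment must be chosen and justified.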
Risks, black swans, chaos and confusion – what’s a person to do?
Recognizing risk and putting it into a context that is meaningful to your organization may be the first step in finding some focus regarding risk and risk management activities. While we have to live with the regulations and the guidance, we do not have to blankly embrace them. We can approach the risk spectrum (from identification to ranking) with a broader and perhaps more rational focus.
ISO 31000:2009’s first bullet point above says that we can “avoid the risk by deciding not to start or continue with the activity that gives rise to the risk”. Yet avoidance can be a risk in and of itself: deciding not to pursue “the activity that gives rise to risk” can put us in a quandary of decision paralysis, which itself creates risk. The issue with this guidance, then, is that it appears overly conservative. Wouldn’t it be better to buffer against the risk as you pursue the activity?
And, what about ‘black swan’ events? As defined by Nassim Taleb, author of the book The Black Swan: The Impact of the Highly Improbable:
“A black swan is a highly improbable event with three principal characteristics: it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was.”
There is a general lack of knowledge when it comes to rare events with serious consequences. This is due to the rarity of the occurrence of such events. In his book, Taleb states that “the effect of a single observation, event or element plays a disproportionate role in decision-making creating estimation errors when projecting the severity of the consequences of the event. The depth of consequence and the breadth of consequence are underestimated resulting in surprise at the impact of the event.”
ISO 31000:2009, COSO, and other guidance do not appear to address this issue at all, although ISO 31000:2009 does state that “changing the likelihood” and “changing the consequences” are two ways to address risk. Since a black swan is a rare event, changing the likelihood is going to be difficult due to its unpredictable nature. Changing the consequences is a valid goal, but this requires a business continuity plan that is not incident specific, focusing instead on an ‘all hazards’ strategy. All-hazards strategies are philosophically good; however, the reality is that we do not have a clear picture of all the hazards that we face as an organization (i.e., unknown unknowns).
It is therefore prudent to constantly assess the risk landscape and to be able to link apparently non-related aspects to create a mosaic of risk complexity that can be addressed at multiple levels. In probability theory and mathematical physics, a random matrix (sometimes stochastic matrix) is a matrix-valued random variable — that is, a matrix some or all of whose elements are random variables.
The power of infinite random matrix theory comes from being able to systematically identify and work with non-crossing partitions. Crossing partitions become important when trying to understand the higher-order terms that infinite random matrix theory cannot predict. (Figures by Prof. Alan Edelman, MIT.)
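As a minimal illustration of the definition above (a matrix whose elements are themselves random variables), one might sketch the following in plain Python; the dimensions and the standard-normal distribution are arbitrary choices for the example:

```python
# Illustrative sketch only: a random matrix is a matrix-valued random
# variable, i.e. each element is itself a random variable.
import random

def random_matrix(n, m, seed=None):
    """Return an n x m matrix of independent standard-normal draws."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(m)] for _ in range(n)]

# Each call produces one realization of the matrix-valued random
# variable; the same seed reproduces the same realization.
A = random_matrix(3, 3, seed=42)
B = random_matrix(3, 3, seed=42)
print(A == B)  # True
```

In a risk context, one could imagine the rows and columns indexing risks across the tiers discussed earlier, with the random elements standing for uncertain dependencies between them; the theory's machinery then studies the behavior of such matrices in aggregate rather than element by element.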
It would be wise to begin to consider random matrix theory with respect to risk identification and assessment. The complexity we face today in a globalized society is that risks are shared more and more, even as we have less and less awareness of the manner in which they are shared. A good example of this is the international supply chain system. While each organization has its own supply chain, the combination of all organizations’ supply chains creates an entirely different risk exposure. Just think in terms of movement in the supply chain. Shippers (air, rail, ship, overland, etc.) are all trying to maximize their resources from an efficiency perspective. This has led the shipping industry to build mega-container ships, which require different port and logistics capabilities. Shippers are handling multiple organizations’ supply chains, moving products to a wide audience of customers. While efficiency is increased, risk is also greater due to the potential for a ‘single point of failure’ resulting in the loss of the ship and its cargoes.
Risk questions that the board should ask
Let’s turn our attention to the board of directors. NACD, Deloitte and other organizations are producing a lot of literature and are providing a lot of advice to boards of directors. Recently, I received an e-mail that posited questions that the Board should be asking. Below are the posited questions and my responses to each question. We have to start looking at our flawed mental models and their effect on our risk management focus. Cognitive biases affecting risk assessment, discounting risks, over emphasizing risks, not seeing the interconnectivity of risk, etc., clearly will affect how risk is perceived and presented to the board of directors.
So, here are the questions the board should be asking (really):
Are we clear about the company’s risk appetite and is it communicated effectively?
This question is quite ambiguous and falls into the cognitive bias trap. Who would answer that their organization is not clear about risk appetite and that the organization suffers from an inability to communicate effectively?
What risks does our current corporate culture create for the organization?
Culture, cognitive biases, groupthink, etc. all will create risks for an organization. But, what about the cultures of other organizations? Suppliers, customers, vendors, government all impact your corporate culture and you need to be aware of their culture in respect to your risks.
How do we acknowledge and live our stated corporate values when addressing and resolving risk dilemmas?
Ethics, morals and shared values are in a state of deterioration, not just in the USA but worldwide. A 60 Minutes segment (6 November 2016) featured a professional pollster who expressed his concern that people do not want to listen and learn; they want to push their perspective and vent their frustrations. Organizations really need to take a hard look at their values and assess how well those values are internalized.
How do we actively seek out information on risk events and near misses – both ours and those of others – and ensure key lessons are learnt?
Let’s face it, no one wants to hear bad news. As a result, a lot of ‘near miss’ situations go unreported. More importantly, many near miss situations go unrecognized by organizations and individuals. We too often fail to capture lessons learned and, when we do, we all too often fail to communicate their importance.
How do we reward and encourage appropriate risk taking behaviors and challenge unbalanced risk behaviors (either overly risk averse or risk seeking)?
First, what are appropriate risk-taking behaviors? This is situational and cannot be standardized, much as we would like it to be. If I take a risk today and then take the same risk tomorrow, the risk will have changed, and my behavior, while acceptable in the first instance, may be totally unacceptable in the second. What is balanced and unbalanced risk behavior? This, again, is situational and selective, subject to the cognitive biases of both the observer and the person taking the risk.
I will not comment on the following FRC Risk Management Guidance Document points as I think that you can get the flavor of where I am going based on the above comments:
- The board must determine its willingness to take on risk, and the desired culture within the company;
- Risk management and internal control should be incorporated within the company’s normal management and governance processes, not treated as a separate compliance exercise;
- The board must make a robust assessment of the principal risks to the company’s business model and ability to deliver its strategy, including solvency and liquidity risks. In making that assessment the board should consider the likelihood and impact of these risks materializing in the short and longer term;
- Once those risks have been identified, the board should agree how they will be managed and mitigated, and keep the company’s risk profile under review. It should satisfy itself that management’s systems include appropriate controls, and that it has adequate sources of assurance;
- The assessment and management of the principal risks, and monitoring and review of the associated systems, should be carried out as an on-going process, not seen as an annual one-off exercise; and
- This process should inform a number of different disclosures in the annual report: the description of the principal risks and uncertainties facing the company; the disclosures on the going concern basis of accounting and material uncertainties thereto; and the report on the review of the risk management and internal control systems.
Needless to say, we are faced with a lot of quandaries when addressing risk in accordance with current guidance and regulatory requirements, and new regulations and guidance are being promulgated at a quickening pace. Staying abreast of all this is challenging and presents a risk in and of itself. You can readily comply with one regulation, only to find that you are in conflict with another.
I will conclude by offering the following seven points:
Recognize that your risks are not unique
Risks are shared by every organization, regardless of whether they are directly or indirectly affected. To treat your risks as unique is to separate your organization from the interconnected world we live in. If you have a risk, you can be assured that the same risk is shared by your ‘value chain’ and by organizations outside your value chain.
Whatever you do to buffer the risk has a cascading effect (internally and externally)
Your organization needs to be in a position of constantly scanning for changes in the risk environment. When you buffer a risk, you create an altered risk. By altering the risk exposure, your network (i.e., value chain) now has to address the cascade effects. The same is true for your organization; as the value chain buffers risk it is altered and you are faced with a different risk exposure.
Risk changes as it is being buffered by you and others
As mentioned above, risk changes, it alters, it morphs into a different risk exposure when you or others do something to buffer against the risk being realized. Your challenge is to recognize the altered form of risk and determine how to change your risk buffering actions to protect against the risk being realized and your organization not having the right risk treatment in place.
Risk is not static
If we look at commodities trading, we begin to understand the speed at which risk can move through an organization. Commodities are complexity personified: markets are global in scale and trading is nearly constant. This makes identifying and managing risk a challenge that can be daunting to many. In many conversations with commodity traders, I came to the conclusion that their ability to see risk as a continuum, constantly changing, opaque and rapid in its impact, creates a mindset of constant vigilance and offsetting of risks.
Risk is in the future not the past
During the Cold War between the United States and the former Soviet Union, thousands of nuclear warheads were targeted at the antagonists and their allies. The result was the concept of mutually assured destruction, a term used to convey the idea that neither side could win an all-out war; both sides would destroy each other. The risks were high, and there was a constant effort to ensure that ‘noise’ was not mistaken for ‘signal’, triggering an escalation of fear that could lead to a reactive response and devastation. Those tense times have largely subsided; however, we now find ourselves in the midst of global competition and the need to ensure effective resilience in the face of uncertainty.
We are faced with a new risk paradigm: efficient or effective? Efficiency is making us rigid in our thinking; we mistake being efficient for being effective. Efficiency can lead to action for the sake of accomplishment with no visible end in mind. We often respond very efficiently to the symptoms rather than the overriding issues that result in our next crisis. Uncertainty in a certainty seeking world offers surprises to many people and, to a very select few, confirmation of the need for optionality.
It’s all about targeted flexibility, the art of being prepared, rather than preparing for specific events. Being able to respond, rather than being able to forecast, facilitates early warning and proactive response to unknown unknowns.
I think that Jeffrey Cooper offers some perspective: "The problem of the Wrong Puzzle. You rarely find what you are not looking for, and you usually do find what you are looking for." In many cases the result is irrelevant information.
Horst Rittel and Melvin Webber would define this as a systemic operational design (SOD) problem: a ‘wicked problem’ is a social problem that is difficult and confusing, whereas a ‘tame problem’ is not trivial but is sufficiently understood that it lends itself to established methods and solutions. I think that we have a wicked problem.
As Milo Jones and Philippe Silberzahn write in their book Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001, “Gresham's Law of Advice comes to mind: bad advice drives out good advice precisely because it offers certainty where reality holds none” (page 249).
The questions that must be asked should form a hypothesis that can direct efforts at analysis. We currently have a ‘threat’ but it is a very ill defined threat that leads to potentially flawed threat assessment; leading to the expending of effort (manpower), money and equipment resources that might be better employed elsewhere. It is a complicated problem that requires a lot of knowledge to solve and it also requires a social change regarding acceptability.
Experience, it is said, is a great teacher. However, experience may date you to the point of insignificance. Experience is static. You need to ask the question, “What is the relevance of this experience to my situation now?”
The world is full of risk: diversify
When it comes to building your risk and/or business continuity program, focusing on survivability is the right approach, provided you have thoroughly done your homework and understand what survivability means to the organization. The risks to your organization today are as numerous as they are acute. Overconcentration in any one area can result in complete devastation.
Recognize opacity and nonlinearity
The application, or misapplication, of the concept of a ‘near miss’ event has gained increasing popularity and more importance than it should have. A risk manager's priorities should be based on recognizing the potential consequences of a near miss event, not on determining its cause. While causality is important, given today’s complexity and the nonlinearity of events, determining causality can become an exercise in frustration. Instead we need to focus on consequence analysis and recognize that as risks evolve, change begins to occur, collateral factors come into play, and uniqueness is created in the way the risk evolves. The nonlinear evolution of risks combines with reactions to events to transform potential risk consequences.
I don't disagree that analysis of near miss events can benefit the organization and facilitate progress in reducing risk exposures in the future. However, investigating near misses is often hampered by the nonlinearity and opaque nature of the event itself, rendering any lessons learned less than helpful in reducing risk exposure and, more importantly, risk realization consequences. A near miss event will not have exactly the same chain of causality as a risk event that actually materializes into an impact with unforeseen consequences.
Recognizing and analyzing all near miss events isn't a realistic option. This is due in part to the fact that we do not experience events uniformly. A near miss to you might be a non-event to someone who deals with similar situations regularly as the event becomes transparent to them.
About the author
Geary Sikich is a seasoned risk management professional who advises private and public sector executives to develop risk buffering strategies to protect their asset base. With a M.Ed. in Counseling and Guidance, Geary's focus is human capital: what people think, who they are, what they need and how they communicate. With over 25 years in management consulting as a trusted advisor, crisis manager, senior executive and educator, Geary brings unprecedented value to clients worldwide.
Geary is well-versed in contingency planning, risk management, human resource development, ‘war gaming’, as well as competitive intelligence, issues analysis, global strategy and identification of transparent vulnerabilities. Geary has developed more than 4,000 plans and conducted over 4,500 simulations from tabletops to full scale integrated exercises.
Geary began his career as an officer in the US Army after completing his BS in Criminology.
As a thought leader, Geary leverages his skills in client attraction and the tools of LinkedIn, social media and publishing to help executives in decision analysis, strategy development and risk buffering. A well-known author, his books and articles are readily available on Amazon, Barnes & Noble and the Internet.
References
- Apgar, David, Risk Intelligence: Learning to Manage What We Don't Know, Harvard Business School Press, 2006.
- Davis, Stanley M. and Meyer, Christopher, Blur: The Speed of Change in the Connected Economy, 1998.
- Dionne, Georges, Risk Management: History, Definition and Critique, March 2013, CIRRELT-2013-17.
- Edelman, Alan, Infinite Random Matrix Theory (crossing partition figure), MIT Course 18.338J / 16.394J.
- Jones, Milo and Silberzahn, Philippe, Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001, Stanford Security Studies, 2013, ISBN-13: 978-0804785808.
- Kami, Michael J., Trigger Points: How to Make Decisions Three Times Faster, McGraw-Hill, 1988, ISBN 0-07-033219-3.
- Klein, Gary, Sources of Power: How People Make Decisions, MIT Press, 1998, ISBN-13: 978-0-262-11227-7.
- Marks, Norman, "Time for a leap change in risk management guidance," November 5, 2016.
- Sikich, Geary W., "Graceful Degradation and Agile Restoration Synopsis," Disaster Resource Guide, 2002.
- Sikich, Geary W., Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty, PennWell Publishing, 2003.
- Sikich, Geary W., "Risk and Compliance: Are You Driving the Car While Looking in the Rearview Mirror?" 2013.
- Sikich, Geary W., "'Transparent Vulnerabilities': How We Overlook the Obvious Because It Is Too Clear That It Is There," 2008.
- Sikich, Geary W., "Risk and the Limitations of Knowledge," 2014.
- Tainter, Joseph, The Collapse of Complex Societies, Cambridge University Press, 1990, ISBN-13: 978-0521386739.
- Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Random House, 2007, ISBN 978-1-4000-6351-2; 2nd edition 2010, ISBN 978-0-8129-7381-5.
- Taleb, Nassim Nicholas, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, Random House, updated edition, 2008, ISBN-13: 978-1400067930.
- Taleb, Nassim Nicholas, "Common Errors in Interpreting the Ideas of The Black Swan and Associated Papers," NYU Poly Institute, October 18, 2009.
- Taleb, Nassim Nicholas, Antifragile: Things That Gain from Disorder, Random House, 2012, ISBN 978-1-4000-6782-4.
- "Threatened by Dislocation": https://www.youtube.com/watch?v=eWKDGamtoIE