Michael Bret Hood (21puzzles@gmail.com), a former senior-level FBI agent, is a professional trainer on financial crime, money laundering, ethics, and executive leadership development, based in Raleigh-Durham, North Carolina, USA.
The belief, after the fact, that you should have foreseen what happened is often referred to as hindsight bias. In the compliance world, this bias often surfaces after a corporate scandal or serious compliance violation. When such an offense occurs, executives and compliance officials look for answers to determine how the system failed and why the particular incident was not foreseen. In reality, noncompliance is not predictable. Human beings are far too diverse in their behaviors to accurately predict when someone in an organization will deviate from established norms. Todd Haugh, assistant professor of business law and ethics at the Indiana University Kelley School of Business, writes, “Accurately predicting the probability and scope of compliance failures is more difficult than currently understood.”[1]
How the unconscious brain affects compliance
When compliance failures occur, the typical response is to “plug the holes,” or, in other words, create new policies/procedures to ensure that the failure does not repeat itself. Although this step is certainly necessary to protect organizational interests, it is not the be-all and end-all solution that most people perceive: “Companies believe bad employee conduct will transpire in their organizations in a manner that conforms to a recognizable, and ultimately manageable, pattern. This approach, it is further believed, will foster a positive corporate culture, thereby improving corporate compliance en masse.”[2] What executives and compliance leaders fail to take into account is that our ethical breakdowns and rule/procedure deviations are often committed without conscious knowledge.
In his landmark book, Thinking, Fast and Slow, author Daniel Kahneman described two types of thinking, which he labeled System 1 and System 2. In System 1, your brain reacts instantly to stimuli and makes decisions using previous experiences, belief systems, culture, and desires. System 2, on the other hand, is the more rational part of your brain; in System 2, you think before you act.[3] While most people believe they give careful thought and analysis to important decisions, you may be surprised to learn that almost 95% of your daily decisions are made using System 1, including whether or not to follow established norms.[4] Typical compliance frameworks assume that employees will properly assess a situation and make a correct and ethical choice. This, however, is rarely the case.
Noncompliant offenders rarely consider the consequences of their behavior before acting because of the way System 1 operates. On most occasions, the dimensions of the decision have been altered or transformed in such a way that the decision-maker disregards, or is completely unaware of, the ethical ramifications of his or her decision. “In many cases, wrongdoers have trouble recognizing their own unethicality, meaning that they act wrongly not because they are willing to pay some external or internal price, but rather because they have a biased assessment of what it is they are doing.”[5] These System 1 manifestations can lead you and your colleagues to, among other things, ignore the ethical dimensions of decisions, refrain from speaking up about known compliance violations, and accede to the adopted rules of the informal culture, even when doing so violates established laws, policies, and procedures.
What Challenger can teach us about compliance
NASA’s space shuttle Challenger exploded shortly after takeoff on January 28, 1986. The subsequent investigation revealed that O-rings manufactured by Morton-Thiokol had failed due to abnormally cold weather in Florida at the time of launch. Notably, many of the engineers had expected the O-rings to fail and had originally voted to delay the launch, fearing that the low temperatures would lead to “catastrophic” results. NASA, however, had openly committed to an ambitious launch schedule and was eager to launch the Challenger in spite of the low temperatures. NASA launch officials, knowing they were required to get the engineers’ approval to launch, went back to Morton-Thiokol’s general manager to ask whether the engineers were absolutely sure that the O-rings would fail at those temperatures. Knowing that NASA was one of Morton-Thiokol’s biggest clients, the general manager asked his three senior managers to “take off their engineering hat and put on their management hat” before deciding whether to approve the launch. Shifting the senior managers’ perspective from thinking like engineers to thinking like managers produced the approval NASA needed for launch, and the resulting disaster.[6]
The Challenger explosion is an example of how even a small shift in perspective can change the outcome of an important decision. Morton-Thiokol managers certainly did not intend to make a decision that would directly cause the death of the Challenger astronauts, but the instruction to think like a manager instead of an engineer allowed unconscious biases to creep into the decision-making process.[7] By referencing the “manager’s hat,” the ethical dimensions of the decision were apparently removed from consideration.
What we should do doesn’t always equate to what we want to do
Harvard Professor Max H. Bazerman and Notre Dame Professor Ann E. Tenbrunsel refer to the “want self” and the “should self” in situations where you face inner conflicts between what you desire and what you perceive to be the right thing to do.[8] Normally, you believe you will recognize the implications of various compliance decisions and make the proper choices; this is the should self at work. What you fail to realize, however, is that stress and pressure can affect your decision-making processes, giving rise to the want self. “When it comes time to make a decision, our thoughts are dominated by thoughts of how we want to behave, thoughts of how we should behave disappear.”[9] In the Challenger case, an implicit desire to satisfy NASA and maintain the business relationship enabled Morton-Thiokol’s managers to overcome the engineers’ reluctance and approve the launch despite the significant odds of a negative outcome.
Likewise, the should self and the want self can also affect your compliance program. Many organizations attempt to address the should and want selves by continually reminding employees to “do the right thing.” However, such a directive leaves plenty of leeway for the want self to justify the right thing as that which meets your desires as opposed to that which is best for your organization. In the case of business travel, is the right thing to select the lowest cost airfare, or is the right thing to select first-class airfare so that you can arrive refreshed? As you can see, an argument can be made to justify both decisions depending on what you and others desire and perceive. Without clear direction, you can deviate from compliance directives without even realizing you have made a bad decision.
How bounded ethicality can destroy compliance
What is ironic in many of these situations is that even after making incorrect decisions, we are commonly unaware of our transgressions. Bounded ethicality refers to the systematic and predictable ways people engage in unethical acts without their own awareness that they are doing anything wrong. To the chagrin of compliance departments, bounded ethicality occurs with a greater frequency than you would like: “Since people typically value a positive self-image, they will tend to distort ethical deliberations in a way that presents themselves in a positive light.”[10] With little or no effort, we will find ways to rationalize our behavior and make it seem that not only was the noncompliance reasonable, but it was also the right thing to do in that particular situation.
Prior to the 2008 financial collapse, ratings agencies granted AAA ratings to collateralized mortgage securities that consisted of low-quality assets. Although the credit-rating agencies were in the business of providing accurate assessments of creditworthiness for investors, their funding came from the very companies and organizations that were offering these collateralized mortgage securities to the public.[11] What would have happened if the credit-rating agencies had given these investment vehicles low ratings? Would the issuers have continued to fund the agencies that downgraded their products? While the answers may be obvious to an outsider, bounded ethicality interferes with the decision-making process and leads you to a decision you can justify, but one that may be clearly incorrect when viewed by someone who does not share your self-interests.
‘Tone in the middle’ and informal culture
In your organization, how amenable are executives and leaders to not only hearing, but actually listening to, negative information? Much has been written about “tone at the top” and how executives’ behavior can influence others throughout an organization. Although that research is accurate, compliance officials should also be aware that the “tone in the middle” can have an equally powerful effect on employees, especially when it comes to compliance. Employees constantly look to others to determine what behavior is accepted, and in most organizations they have far more frequent contact with direct and mid-level managers than with executive management. Whether those managers realize it or not, their conduct signals what is acceptable in the organization.
Additionally, every organization has not only a formal culture replete with rules, policies, and procedures, but also an informal culture that is sometimes more powerful. The informal culture is driven by what you perceive as accepted among the masses. Robert Cialdini, author of Influence: The Psychology of Persuasion, refers to the psychological process of copying the actions and behaviors of others in certain situations as social proof. “The principle of social proof operates most powerfully when we are observing the behavior of people just like us. It is the conduct of such people that gives us the greatest insight into what constitutes correct behavior for ourselves.”[12] If employees perceive that people in managerial, leadership, and other influential positions, including compliance personnel, deviate from established rules, policies, and procedures, it only follows that employees will perceive that behavior to be accepted, even if it goes against established norms.
Psychological safety is necessary for compliance programs
Psychological safety is the perceived ability to share information within a group, team, or organization without fear of retribution, even if the information reflects negatively on the group, team goals, or the organization. Google studied team performance and found that the number one factor for success was psychological safety.[13] If you were to assess your organization for psychological safety, how would it rate? “The brain processes a provocation by a boss, competitive coworker, or dismissive subordinate as a life-or-death threat. The amygdala, the alarm bell in the brain, ignites the fight-or-flight response, hijacking higher brain centers. This ‘act first, think later’ brain structure shuts down perspective and analytical reasoning.”[14] When this happens, the safest choice is usually the status quo. Without psychological safety, perceptions of authority and the informal culture can cause compliance efforts to go awry.
From an early age, we are taught the benefits of listening to authority figures. In an organization, senior leaders, senior managers, middle managers, and other influential people in the organization set the tone for both existing and new employees and the informal culture of the organization. “We rarely agonize to such a degree over the pros and cons of authority’s demands. In fact, our obedience frequently takes place in a click, whirr fashion, with little or no conscious deliberation. Information from an established authority can provide us a valuable shortcut for deciding how to act in a situation.”[15]
Toshiba was a model corporation on the surface, but despite their existing compliance program, the informal culture at Toshiba taught employees to avoid disagreement with supervisors.[16] In this case, the informal culture violated the tenets of psychological safety and employees perceived that it was necessary to follow the lead of authority figures, even when it went against their basic ethical principles. The Japanese principle of makoto places harmony and obedience in the workplace over everything else, including honesty and truthfulness. “In a workplace setting, makoto can create a leadership command chain similar to a military hierarchy: the top executives give the orders and all lower level employees are expected to obediently follow them.”[17]
In a compliance program, blind adherence to positional power and influence can lead to transgressions, as we naturally succumb to authority figures who may not accurately perceive the implications of their own decisions. The Toshiba fraud was estimated at approximately $1.2 billion, and it certainly won’t be the last corporate scandal we see. Given how our brains work, are there actions you and your compliance team can take to mitigate the effects of bounded ethicality and blind adherence to authority?
Improving your compliance program
Embedding psychological safety into the formal and informal organizational culture will be a laborious process, but also one that will pay significant dividends to compliance efforts. Empowering others to alert you to potential compliance transgressions can counteract your brain’s efforts to transform and alter situations in order to justify your decision-making. Establishing a formal and informal culture where employees are expected to respectfully challenge others can mitigate the effects of bounded ethicality, but only if employees on the lower rungs of the organizational chart perceive that they are safe from retribution for doing so.
In addition, recent research has examined a technique referred to as the sentinel events approach, which was designed to help explain how biases interfere with decision-making. Although the study references law enforcement practices, the principles are easily applied to compliance departments. “Most analyses of investigative failure have focused on errors of procedure, or malfeasance on the part of authorities, which covers a broad list of ills.”[18] When assessing why noncompliance occurs, the sentinel events process requires going beyond casting blame. “These mistakes usually have their roots in, or are compounded by, the psychology of those who are leading the investigation.”[19]
Researchers hypothesize that a sentinel review is a more holistic, enterprise-level risk analysis involving compliance personnel, investigators, supervisors, and executives. Instead of superficially looking for the failure, the sentinel events approach recommends investigating further to determine the root cause of the compliance deviation. By pursuing such a method, you aspire to create a culture of critical thinking, supported by diverse opinions, which can alleviate some of our natural psychological rigidity and result in a more complete assessment of what went wrong and what went right in the compliance process.
If you want your compliance efforts to be effective, understanding how the human brain works can assist you in creating a compliance program that more effectively addresses human weaknesses. Traditional deterrents such as penalties, discipline, and even dismissal are not nearly as effective when your brain has failed to consciously register the fact that you are about to engage in behavior that will violate organizational or societal rules and laws. In order to be compliant, you, as well as the rest of your colleagues, need assistance in recognizing when you are about to deviate from your normal ethical baseline. Creating a culture where people feel free to respectfully offer contradictory opinions and where the root causes of compliance failures are dissected using diverse opinions and perspectives will energize your compliance program. Instead of perceiving compliance efforts as a burden, employees will understand that everyone has blind spots, and with someone else’s help, we can all be the person we each strive to be.
Takeaways
- When compliance failures occur, the natural response is to plug the gaps, but is that the best way forward for compliance programs?
- Ethical and compliance violations are normally not committed after careful analysis, but rather because the unconscious brain does not recognize the potential violation.
- Bounded ethicality refers to the systematic and predictable ways people engage in unethical acts without their own awareness that they are doing anything wrong.
- Psychological safety allows people to voice their true thoughts and feelings without fear of retribution or ridicule by team members, managers, and executives.
- The sentinel events approach attempts to diagnose compliance deviations at a root-cause level so that future changes can better address unpredictable human behavior.