Have you ever made an immediate connection with someone you just met, only to spend another 10 minutes with them and wonder what you were thinking? What about the opposite? Initial impressions are part of being human. Because of how your brain works, you can’t always stop these impressions and feelings from affecting your decision-making.
Stereotypes—also known as mental schemas or mental shortcuts—have been essential tools that have enabled humans to survive. When you smell smoke emanating from a room, your brain senses the danger even before you confirm any threat. Things such as how you were raised, your belief systems, your culture, and other personal experiences have created a unique set of file folders in your brain, which filter and process the information and stimuli around you. However, the brain’s perception and processes aren’t always as accurate as you wish.
Drifting toward stereotypes
Hiring is an important function for any organization. Most compliance leaders strive to find a diverse cadre of qualified and talented employees, but unconscious biases can inadvertently interfere with that process. Let’s do a quick exercise. Read the following words and register the first image that pops into your brain.
- Nurse
- Computer programmer
- Librarian
- Welder
Did images of women flow into your brain when you read nurse and librarian? Did images of men dominate your thoughts when you read computer programmer and welder? If so, you are not alone, as people have been conditioned to associate these professions with particular genders. In the case of nurses, the stereotype could be justified, as almost 9 out of 10 registered nurses, on average, are female.[1] In contrast, women make up approximately 5% of welders in the workforce.[2] When you make hiring decisions, these expectations and social constructs can threaten impartiality by causing you to unconsciously dismiss a qualified candidate simply because they did not fit your preconceived notions of who might be best for the position.
In a study performed by University of Pennsylvania Wharton School professors Judd Kessler and Corinne Low and doctoral student Colin D. Sullivan, women or minority candidates applying for a science, technology, engineering, and mathematics (STEM) position with a 4.0 grade point average (GPA) were treated similarly to white male candidates who had obtained a 3.75 GPA.[3] It’s not that individuals purposely seek to discount qualified candidates, but rather, it is difficult to recognize when your unconscious brain interferes. “One reason for continued lack of diversity is that even if equally qualified candidates from diverse backgrounds apply for job openings, recruiters, because of implicit biases, gravitate toward candidates with identities that fit a stereotype.”[4] While some people actively discriminate against certain classes of individuals, many people fall victim to unconscious rationalizations such as describing the chosen candidate as being a better “culture fit” over a comparably qualified candidate who didn’t meet stereotypical associations.
The brain’s two systems
In his landmark book Thinking, Fast and Slow, author Daniel Kahneman broadly described the brain by identifying two thinking systems referred to as System 1 and System 2.[5] The former refers to your quick, automatic brain processes, while the latter refers to your more deliberate thinking mechanisms. Driving a car is an example of how your brain can operate using both System 1 and System 2. If you are driving on a highway with no traffic in safe conditions, you can probably recall times when your brain wandered to things that did not relate to driving. Because you have driven a car for a long time, you did not need your full attention to successfully operate your vehicle. Contrast that with driving in heavy traffic. In those conditions, you are less likely to allow your mind to wander because driving successfully requires more focus and attention. Recall the last time you interviewed a current or potential compliance team member. Did your mind wander during that meeting to things other than what the interviewee was presenting? When you are not mentally present for an interview, your mental schemas have free rein to take over your thinking and rely on stereotypical associations.
Focus and attention are among your body’s scarcest resources. The compliance function in an organization can be very taxing mentally. Think about the last time you labored on a report, audit, or project. Were you mentally exhausted? Were you physically tired? System 2 is similar to a cell phone in that the longer you use it, the quicker your battery dies. When System 2 becomes exhausted, System 1 takes over decision-making without you consciously recognizing that the transition has occurred. Unfortunately, this is when many people succumb to unconscious biases. If you can carve out a few minutes for a short walk, a brief respite from your task, or even a glance out of a window, you can recharge System 2 and improve your decision-making.
Affinity bias
Read the following names and imagine these were the people who had applied for a position within your compliance team:
- Michelle Smith
- Saad Mann
- Mircle Century
- Abcde Grisham
While you may believe that resume qualification and job requirements will drive your decision-making, a study by Qi Ge, an economist at Vassar College, and Stephen Wu, an economist at Hamilton College, found otherwise. Ge and Wu found that people with difficult-to-pronounce names were not only less likely to be hired but were also significantly less likely to receive callbacks for job interviews.[6] Names aren’t the only candidate trait that impacts hiring, however.
Affinity bias (the tendency to prefer individuals like yourself) can lead you to seek out people with similar perspectives, ideas, and viewpoints. For compliance teams, this can lead to disastrous results. When you conduct risk assessments as a compliance officer, you must consider the possibility of fraud and other ethical deviations. Your organization is filled with capable people who are smart and innovative, and skirting rules and circumventing controls occur in every organization. The best way to combat these issues and maintain a robust compliance ecosystem is to build a creative and diverse team, full of different perspectives and ideas, that can match the creativity of those deviating from established policies and procedures.
Overcoming hiring bias
There have been many suggestions for combatting unconscious bias related to teams and hiring. Organizations have revised job descriptions, standardized interviews, and removed names from resumes in attempts to mitigate the effects of unconscious bias. Researchers are continuously studying methods to improve the hiring process. One such group, led by Zhiyu Feng, Yukun Liu, Zhen Wang, and Krishna Savani, separated job candidates into different groupings, delineated by variables such as university, culture, and race. They found that when candidates were interspersed together, those selected for interviews were more homogenous than when candidates were grouped by any of those variables.[7] Evidently, grouping people by attributes like gender, race, or university caused System 2 thinking to activate and openly consider the need for diversity to improve team performance.
How bias affects teams
Hiring a team member is only part of the objective for compliance teams. Integrating these new members into the group is also vital if you want to take advantage of the different perspectives and ideas a new team member can offer. People are generally hesitant to contradict people in positions of authority. In addition, group dynamics lead people to follow the ideas and behaviors of others, even when they don’t necessarily believe in the action or decision.
Illusory superiority is a cognitive bias in which you tend to overestimate your own knowledge, skills, and abilities. When you serve as the leader of a compliance team, there is a natural tendency to prefer your own ideas and to gravitate toward ideas that support the decision you think should be made. This leads to groupthink, a psychological phenomenon in which a group of people becomes so harmonious and conforming that they fail to see the faults in their decision.
President John F. Kennedy fell victim to illusory superiority and groupthink when he approved the failed Bay of Pigs invasion.[8] Upon reflection, President Kennedy realized that no group member had offered a contradictory assessment of the potential outcomes. To protect himself in the future, Kennedy directed his brother, Robert, the attorney general of the United States, to ascertain the president’s opinion on a given subject and then argue the exact opposite position, even if Robert did not believe his own argument. By doing so, President Kennedy took active steps to ensure that alternative ideas and opinions were being considered, hoping to mitigate some of his inherent unconscious biases.
Most compliance team leaders actively seek different opinions, but how they do so could indirectly hinder open conversation. For example, compliance leaders frequently bring team members together to brainstorm different ideas, especially after a problem has arisen. While this can be a fruitful exercise, chances are the conversations are dictated by either the leader or team members with the strongest personalities. Once an idea is offered, the team can then be struck by the anchoring bias, your tendency to rely on the first bits of information as the reference point for subsequent discussion and decision-making. In addition, group dynamics and your natural desire to fit in with other group members can lead to the Abilene paradox, the tendency for a group to go along with an idea because they believe everyone else is for it when, in reality, no one thinks the idea is appropriate.
Overcoming team bias
Deploying techniques to minimize these decision-making biases is essential to overcoming anchoring bias and the Abilene paradox. One such antidote is psychological safety, the perception that a team member can offer a contradictory idea or opinion without fear of retribution or ridicule. By creating a culture where respectful disagreement is not only allowed but also encouraged, compliance team leaders ensure that issues are examined from multiple viewpoints, which leads to deeper discussion and analysis.
Think, for a moment, about your current compliance team. Do you believe you would have the psychological safety you need to offer your true opinion about an issue at hand? Do you believe your colleagues feel similarly to you? If not, then your team lacks psychological safety and is more likely to succumb to decision-making biases. However, creating psychological safety may not be as easy as you think.
A board of directors for a company experiencing financial hardship decided to bring in an experienced chief executive officer (CEO) from outside the organization due to this individual’s unique experience with turnarounds. When doing so, board members expressed high confidence in the current executive staff and informed them that the new CEO was hired to help them grow. In their first meeting, the newly appointed CEO queried his fellow executives to assess the organizational issues objectively. None of the existing executive staff provided an impactful answer. Sensing nervousness, the CEO took an empty notebook, pushed it to the center of the conference room table, and asked his fellow executives to write down their thoughts. The CEO then left the room. A few hours later, he had a notebook full of useful analysis and praised his team for being forthright. Over the next few months, the CEO acted on many of their ideas.
Compliance team leaders frequently seek out the opinions and ideas of others, but group dynamics, conformity, and other biases can undermine psychological safety. People with dominant personalities tend to offer their ideas first, and this can inadvertently trigger the anchoring bias. However, there are potential adaptations for compliance leaders to use. Instead of having compliance team members offer their ideas in an open forum with other team members present, the compliance team leader can send individual emails to team members asking for their input on the topic. Once team members reply, the leader can summarize the ideas and present them to the team in an open meeting. This ensures that unfiltered perspectives are given the chance to be discussed and judged on their merits, as opposed to on who originated the idea.
Conclusion
Unconscious bias usually evokes a negative connotation, but in reality, you and everyone else in the compliance field allow bias to creep into your decision-making. By becoming aware of different decision-making biases and how they affect people, you can take active steps to try and mitigate their effects. Chances are you have made bad decisions in your past, both in the workplace and your personal life, and will continue to do so in the future despite your best efforts. However, if you were about to make a bad decision, would you want someone to speak up before you acted?
Takeaways
- Most of your daily decisions are made without conscious thought. Pausing before making a decision can increase the chances that you put more thought into the decision at hand.
- When hiring, grouping applicants in different ways can cause you to see the potential hires much differently than if you were to place them in a single group.
- Taking even a momentary break from an arduous task can increase conscious consideration of decisions.
- Assigning compliance team members to actively take the devil’s advocate role can increase the number of perspectives considered before making a decision.
- Getting team members’ ideas about a topic prior to a group meeting can temper the effects of anchoring, group dynamics, and the Abilene paradox.