Is emotional intelligence one of the most potent weapons for preventing compliance issues? I think it might be.
In 2011, as an investigator who felt undertrained, I enrolled in an intensive course titled “Evaluating Truthfulness and Credibility.” It was the first stop in a journey that will end only when I do. After years of study and practical application, I emerged as a behavioral analyst and trainer in the dark arts of deception detection and investigative interviewing.
I apply the lessons daily, but investigations are an increasingly small component of my work. Looking for liars made me challenge most of the assumptions I’d made about risk management. I had to face facts. Or, more accurately, recognize that emotions, not facts, drive decisions and actions.
Let’s start from the beginning—deception detection. When we lie, there is a cognitive process I’d summarize as:
Lying about thinking: We relay a different version of events from what is in our heads, or we withhold the truth (another form of deception). The simplest example occurs in almost all relationships: when we are asked, “What are you thinking about?” and we reply, “Nothing,” to avoid explaining something we’d rather not.
Thinking about lying: You can (usually) see the cogs whirring when a small child lies. As adults, we get better at concocting lies on the fly. That is, until the stakes get high. At that point, the cognitive load increases, and we might spin our mental wheels.
Lying about feelings: “I’m fine,” said with a snarl, will be a familiar cliché to most of you. As an investigator, this is a goldmine. We’re not all born Oscar winners, and it can be challenging to match our words with our emotions as they leak across multiple channels* of communication.
Feeling about lying: In some cases, it’s a game, and you might see “duping delight” when the liar feels they’re getting away with it. In others, the toll of deception is much heavier.
*The channels: facial expressions, body language, psychophysiology, and the voice (what’s said, how it’s said, and pitch and tone).
When I first started these studies, I had a strong bias toward what was said. My mentors and instructors intervened, enabling me to recognize and learn new behavioral languages. As I started to do this, it helped in three unexpected areas.
Proper risk assessments
It helps to know what risks people really think might happen. When I asked someone working in a treasury team about interactions with state bank officials and saw palpable fear (but no words), I knew our work was just beginning. It’s important to note that people have an obvious right to privacy of thoughts and emotions. My role is (typically) not to cajole and extract; it’s to react appropriately.
In some instances, you may need to challenge gently. For example, on more than one occasion, I’ve seen contempt smirks as people discuss the impacts of risks. A bit of probing usually reveals that they don’t feel the issue is serious (e.g., they perceive the chances of detection as low). It is important to understand these perspectives.
Training getting straining
“More compliance training,” said no one, ever. I don’t subscribe to that entirely—stories, scenarios, interactivity, and simulated crises (using no-win dilemmas) usually (and somewhat amazingly) result in people asking for more. However, good training (where we assess comprehension) helps us know what people truly think.
Are they confused, confounded, disengaged, or in disagreement? You will already be spotting much of this. Categorizing multichannel communication helps. You save time. And you reduce risks.
(Don’t) tell us what you really think
The interaction doesn’t always need to be in person. I use various tools to gather data remotely, including Menti, Miro, ScoreApp, and the Ethics Insight platform. The applications span risk assessment, training, implementation support, and surveys. In the latter category, we’re often focused on assessing the culture of integrity—how we respond, not just what we choose, really matters.
For instance, in a recent survey, when asked to agree or disagree with the statement, “Our leaders always adhere to our business integrity standards,” just over 1% of people dropped out of the survey. Those who answered took (on average) 20 seconds. The entire 20-question survey took around three minutes (on average). Twenty seconds is a long time to hover. That tells me we need to ask more questions about walking the talk.
When using engagement tools, I’m also watching which words people choose and whether they hesitate to go first (even in anonymous polling) before grouping like lemmings.
Technology won’t replace the behavioral information transfer we get when physically together. But, used wisely, it can get you 80% of the way there in 20% of the time. Then it’s time to earn your money, lasering in on that remaining 20% of unknowns.