By Vera Cherepanova, FCCA, CIA, MSc
Independent Ethics and Compliance Consultant based in Milan, Italy
Behavioral compliance is a relatively new way of thinking about how to tackle the ethics-driven transformation of corporate culture. Researchers at several American universities, as well as at public institutions including the Federal Reserve Bank of New York, are looking beyond the traditional toolkit to understand what motivates individuals to behave unethically.
To raise awareness among the compliance community of new ways of leveraging behavioral science to address corporate misconduct, the topic has appeared on the agenda of prominent compliance events held recently. The OECD Global Anti-Corruption and Integrity Forum ran a session on behavioral insights as a valuable input to the process of rethinking integrity policies. The panel of experts agreed that integrating the human behavior factor into traditional compliance-based approaches will foster their effectiveness. In the same vein, at 'Compliance Week 2018', the annual conference for compliance and risk professionals, keynote speaker Christopher Adkins shared his views on new approaches to designing high-impact C&E programs based on insights from brain science and social psychology. Special attention was given to redesigning the training and communications component of C&E programs.
Yet there is still an obvious disconnect between the accumulated scientific knowledge and actual Compliance & Ethics (C&E) programs. With increasing pressure to build and improve ethical cultures inside organizations, working in silos is no longer an option for compliance officers. To broaden practitioners' focus, this article covers the unconscious biases relevant to C&E training.
We are all biased
Recent research in behavioral ethics has challenged the assumption that (un)ethical decision-making is a purely cognitive process. Academic progress in this field is dramatically expanding our view of corporate ethics and compliance. Experiments demonstrate that social and psychological factors influence (un)ethical decisions in business.
Unconscious or implicit biases are one category within a range of unconscious drivers of human action; they affect our behavior without our awareness. Having these biases is neither good nor bad, but acknowledging that we all have them is a first step toward the desired ethical standard. Integrating the results of behavioral ethics experiments into C&E training and communications will help create awareness of the gap between how ethical we think we are and how we actually behave. Moreover, this additional perspective will make the compliance message more engaging and memorable, thereby improving the effectiveness of training efforts.
When sanctions don’t work
Imposing sanctions can sometimes have the opposite effect: instead of encouraging ethical behavior, it can encourage unethical behavior. In his book "A Theory of Psychological Reactance" (1966), J.W. Brehm analyzes a common human tendency to rebel against limitations on one's freedom. Reactance emerges when we experience a threat to, or a loss of, our free behaviors. When a C&E program is perceived as a system of sanctions, employees feel overly controlled. Because of the reactance phenomenon, they will be motivated to devote extra effort to regaining their threatened freedom. This can cause the program to backfire and ultimately fail.
Self-serving biases driving our decisions
Self-serving biases occur at several stages of the decision-making process. First, we make over-optimistic predictions about our future behavior. Consider an experiment with 247 female college students conducted by two psychologists, J. Woodzicka and M. LaFrance. One group of women was asked to imagine how they would react to inappropriate questions during a job interview ("Do you have a boyfriend?" "Do people find you desirable?" "Do you think it is appropriate for women to wear bras to work?"). 62% of respondents said they would tell the interviewer that the questions were inappropriate, and 68% said they would refuse to answer them. A separate group of women was actually asked these questions during an interview, and none of the participants refused to answer.
These results confirm a well-documented human tendency to make inaccurate predictions about our own behavior. Why? Social scientists say that different things drive our behavior in each case: when we imagine a future event, we think about high-level principles and attitudes; when we face the actual situation, however, we focus on details and pragmatic considerations. Our behavior becomes automatic as we switch to a fight-or-flight mode of quick, intuitive responses. The ethical dimension of the decision fades away.
After an unethical choice has been made, post-decision self-serving biases make sure we are not upset by the discrepancy between our actual behavior and our ethical beliefs about ourselves. The human brain is very good at finding plausible explanations for its automatic responses. We deflect blame ("I only followed orders"), rationalize unethical behavior ("Everyone is doing it"), forget memories that do not support our self-image, or adjust our ethical standards to our own advantage.
Looking for an equilibrium
According to behavioral ethics research, each of us maintains a moral identity that we keep in equilibrium by engaging in ethical or unethical behaviors. We constantly keep score by comparing our self-image as a good person with what we actually do. When we engage in an immoral act, our moral identity is threatened, and we actively look for an opportunity to do something moral and regain equilibrium. In other words, we compensate for the deficit on the good side of the scorecard. Conversely, after we have behaved ethically, we may license ourselves to act unethically to restore the balance between the two sides.
The bystander effect
The social psychological phenomenon called the "bystander effect" was first demonstrated by J. M. Darley and B. Latane in 1968. In their experiment, volunteers isolated in booths overheard an epileptic seizure. When a volunteer believed they were the only observer of the emergency, 85% reported the incident. However, when volunteers thought one or several others knew what was happening, only one third did something about it. The experiment was later replicated in different formats and variations, but the original thesis stayed intact: the more people witness an event, the fewer will intervene. When a number of witnesses are aware of each other but not in direct communication, people tend to assume that others will take action, and so they do nothing. Factors such as diffusion of responsibility, fear of judgment, conformity bias, and ambiguity were found to contribute to the bystander effect.
Physical distance and exposure to money create risks
In 2013, researchers examined the likelihood of unethical behavior following exposure to money. Several studies analyzed the impact of money on participants' unethical intentions and behavior. The results demonstrated that individuals primed with money (e.g. by reading descriptions related to money) were more likely to exhibit unethical intentions and behavior than a control group. In addition, exposure to money made participants more likely to adopt a business frame of mind, which in turn further increased the likelihood of unethical intentions and behavior.
Similarly, physical distance can increase the risk of unethical behavior. The exponential development of communication technologies has made it possible to expand businesses across the globe. With the Internet, email, web conferencing, and social networks, we no longer have to sit in the same room. However, distance still matters. Technology can maintain relationships, but it can't build them. In a variation on the bystander experiments, J. Darley placed pairs of participants either face to face or sitting back to back. After hearing the sounds of an accident, 80% of those sitting face to face reacted, compared to only 20% of those sitting back to back. Face-to-face communication and relationships can change our behavior. That is why face-to-face C&E training is crucial for people in critical positions or working in high-risk markets. Its effectiveness in changing employees' attitudes and behaviors is recognized by both regulators and NGOs.
How to integrate behavioral research into C&E training & communications
Training and communications is clearly an area where behavioral ethics has a lot to offer. The experiments cited above can be quite helpful when it comes to making C&E training and communications more effective. It's not that behavioral studies tell us something entirely different about C&E program design; rather, they offer an additional perspective that can make the compliance message a lot more compelling. Here are some practical ideas for enhancing training and communications using insights from behavioral ethics.
Promote ethics as a decision-making framework
A strong sanctioning system may exacerbate unethical behavior instead of decreasing it. Compliance officers should guard against the trap of 'imposing' ethics as a compliance requirement through constant surveillance and tight controls. Employees should be encouraged to consider the ethical implications of a decision when faced with a moral dilemma. Ethics should therefore be communicated and promoted as a decision-making framework rather than a fixed reference point.
Change the ‘we are already ethical’ attitude
Ethics can often be taken for granted. Without a pressing need to tackle compliance issues, both employees and leaders tend to think that no special focus on ethics is necessary because it's already there. Behavioral ethics demonstrates why this assumption is unreasonable. Our natural inclination to sometimes give ourselves permission to depart from our usual ethical standards should be accounted for. Our unconscious biases do not make us bad people, but they are well worth discussing in a training session. Creating awareness is the first step toward reflecting realistically on our own behavior.
Integrate risks into the training program
These behavioral findings shed additional light on how to create a comprehensive risk-based training program. The experiments suggest focusing face-to-face training efforts on internal functions that deal with money (e.g. sales, treasury). The same approach is preferable for isolated offices and staff separated by physical distance: an easy five-question online test would probably be insufficient for a sales department in a remote location. Moreover, the impact that priming with money has on individuals supports the idea of reducing the emphasis on profit-maximizing imperatives in internal communications.
There are many more touchpoints between behavioral ethics and C&E programs that could be considered. Training and communications is an important step toward an ethics-driven cultural transformation, but it is just one aspect of a C&E program. Other behavioral studies, too many for a short blog post, have particular relevance for ethical leadership, risk assessments, and C&E program evaluations. The experiments cited here suggest that the 'we are already ethical' notion may, in reality, lack plausibility: we are all imperfect; that's human nature. The mere acknowledgment of this fact helps make a strong case for an effective C&E program.