When we believe we know something to be objectively true, our immediate reaction to news suggesting the opposite is to conclude that something must be wrong with the source.
Francis Bacon described this phenomenon, now called confirmation bias, almost four hundred years ago: “The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great predetermination the authority of its former conclusions may remain inviolate.”
Drew Westen and his team at Emory University conducted a study in which they scanned the brains of “strong Republicans” and “strong Democrats” while showing them self-contradictory statements made by the two candidates during the 2004 presidential campaign. The Democrats found ways to reconcile the inconsistencies of the Democratic candidate and emerged even more strongly Democratic, while the Republicans had no difficulty explaining away the self-contradictions of the Republican candidate and became more fervently Republican.
The researchers also found that while participants were considering the inconsistent statements, the part of the brain associated with reasoning showed no signs of activity at all. Instead, a network of emotion circuits lit up, including circuits involved in regulating emotion and in resolving conflicts.
Even more interestingly, once participants had found a way to interpret the contradictory statements as supporting their original position, the part of the brain involved in reward and pleasure became active. Their conclusion was massively reinforced by the elimination of negative emotional states and the activation of positive ones. Their emotional reaction, not their thinking mind, was making them even more passionately attached to their original beliefs. Their brains were giving them a psychic reward for having stuck to their original position.
Confirmation bias helps explain why the traditional approach of trying to persuade people by giving them reasons to change is a poor strategy when the audience is skeptical or hostile. If a leader offers reasons at the outset of a communication to such an audience, the maneuver will likely activate confirmation bias, and the reasons for change will be reinterpreted as reasons not to change. This occurs without the thinking part of the brain being engaged: the audience digs even more deeply into its current contrary position. Reasons don’t work at the outset, because the audience is neither listening nor thinking.
Skepticism and cynicism are contagious and can quickly turn into epidemics. As instances of rebellious, antisocial behavior, they spread the way hooliganism or teenage smoking does: being a skeptic can quickly become the cool thing to be, licensing others to follow suit. The same applies to political opinions or reactions to a simple presentation. If a cool guy declares that a presentation was total BS, chances are high that others will go along with that opinion, especially if they themselves found it confusing. Although we might imagine that a presentation discussing and analyzing problems and reaching rational conclusions in favor of change can’t do any harm, we need to think again. A lecture full of abstract reasons arguing for change can quickly turn an audience into an army of cynics.
The language capable of generating enduring enthusiasm for change depends on the tiniest details: the words used, narrative intelligence, appropriate body language, and an understanding of the audience’s story. The leader has to show a truthful commitment to a clear, inspiring change idea. Successful leaders also don’t stop with a one-time communication. As implementation proceeds, leaders and their followers stay in communication and co-create the future by continuing the conversation.
If leaders are unable to lead clever people, it means they don’t know how to be good leaders. Clever people object to being led badly. They do not like to be managed, commanded, controlled, or manipulated by people who lack knowledge of the area in which they work, or who pursue goals that don’t make sense. “Benevolent guardianship” (suggesting the opposite of what you really want) may fare less poorly with clever people than traditional command-and-control management, but it is still a suboptimal solution. Reverse psychology works on clever people only when they have experienced bad decision making and have therefore learned to pay attention to the opposite of what management says. What clever people want, like all people, is good leaders: people who can inspire enduring enthusiasm for a worthwhile cause.