The Backfire Effect
I have long observed that so-called incontrovertible evidence doesn't change the views of hardened partisans. In fact, evidence contradicting their views only further cements their most fervent beliefs. Why? It's called the Backfire Effect. Someone with strongly held views, when confronted with evidence that severely undercuts those views, goes through a period of acute cognitive stress and is forced to reconcile their pre-existing worldview with the new information. What usually happens is that the new information loses that reconciliation, ironically reinforcing the existing preconceptions.
I first learned in 2011, via this article, that this is how we deal with cognitive dissonance on important issues. Since then, I have consistently tried to notice where I am employing the Backfire Effect to protect my ego and sense of self, so that I can produce more nuanced, self-aware, and forward-looking analysis.
I shared this with a friend this morning. He sent me back a good 2015 article from the FT by John Kay. The biggest takeaway from the article, for me, was this:
It is generally possible to predict what people will think about abortion from what they think about climate change, and vice versa; and those who are concerned about wealth inequality tend to favour gun control, while those who are not, do not. Why, since these seem wholly unrelated issues, should this be so? Opinions seem to be based more and more on what team you belong to and less and less on your assessment of facts.