Sunday, January 13, 2013

Michael Shermer — The Mind’s Compartments Create Conflicting Beliefs

How our modular brains lead us to deny and distort evidence
If you have pondered how intelligent and educated people can, in the face of overwhelming contradictory evidence, believe that evolution is a myth, that global warming is a hoax, that vaccines cause autism and asthma, that 9/11 was orchestrated by the Bush administration, conjecture no more. The explanation is in what I call logic-tight compartments—modules in the brain analogous to watertight compartments in a ship....
Scientific American
The Mind’s Compartments Create Conflicting Beliefs
By Michael Shermer | Publisher of Skeptic magazine
(h/t Mark Thoma at Economist's View)

A very Humean analysis, epistemologically. Knowledge is compartmentalized, and there is no unified self to provide consistency of thought and belief in regulating judgment. Wackiness is a by-product of the design, a design flaw, so to speak. Cognitive bias is built in.

Not encouraging for early adoption and fast proliferation of state-of-the-art knowledge, such as in monetary economics, meaning we are so screwed. As Shermer notes, "Contradictory scientific statements were processed more slowly and less accurately, suggesting that 'naive theories survive the acquisition of a mutually incompatible scientific theory, coexisting with that theory for many years to follow.'" Old ideas die hard. "Neoliberal tendencies," as Bill Mitchell likes to say when readers of his blog, many of whom know better, fall into this trap.

More fuel for the fire: "In the 2010 article 'When in Doubt, Shout!' in Psychological Science, Northwestern University researchers David Gal and Derek Rucker found that when subjects' closely held beliefs were shaken, they 'engaged in more advocacy of their beliefs ... than did people whose confidence was not undermined.' Further, they concluded that enthusiastic evangelists of a belief may in fact be 'boiling over with doubt,' and thus their persistent proselytizing may be a signal that the belief warrants skepticism." It's called "doubling down" on error, often to save face and preserve an investment. In the financial world, it's called "throwing good money after bad."

One lesson: Don't reinforce bad framing by repeating the frame. That just feeds it.


1 comment:

Ryan Harris said...

"To avoid making people more familiar with misinformation..., emphasize the facts you wish to communicate rather than the myth"

Sage advice for MMT.


Though the article itself cherry-picks anecdotes.

Apply this concept to the European Monetary Union. Perhaps skepticism and questioning of "overwhelming" evidence for new ideas might help protect the group from the intentional blindness that comes with any hysteria.