"Facts don't necessarily have the power to change our minds. In fact, quite the opposite."
Before reading this article to find out where that conclusion came from, ponder that quote for a moment. Can you imagine a more frightful statement? It means that what isn't true - i.e., what isn't reality - matters more to us when determining our choices and actions than what reality is.
The potential implications are enormous. It suggests that people think reason is impotent; that irrationality takes priority over rationality; that we judge things for what they are not rather than what they are; that we prefer fantasy to reality; that we believe evading truth leads to happiness. If such a philosophy were taken to the extreme, we would expect quite a bit of destruction going on around the world. Funnily enough...
There is a psychological term for the phenomenon in which people don't know something but are too incompetent to realize it: the Dunning-Kruger effect. Generally, even when you provide such people with the correct information, it doesn't make any difference. That would seem like an important problem to address. But this article also discusses how even the politically sophisticated still get significant facts totally wrong, with highly destructive results (remind us again, please, about those WMDs that the Bush administration claimed were in Iraq).
What is the basis of self-deception? Why are there so many false beliefs? Where is political ignorance leading us? Do you think we're going in the right direction? Do you find people's political opinions stronger or weaker the more misinformed they are? How much do you think your own political beliefs are based upon verifiable facts and non-contradictory logic? How confident are you that you're making the "right" political decisions? How often do you dismiss facts because they aren't in line with your Weltanschauung? What can we do to guard against believing things that aren't true?
It's one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. "Whenever the people are well-informed, they can be trusted with their own government," Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it's an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.
In the end, truth will out. Won't it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It's this: Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters - the people making decisions about how the country runs - aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper...
These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right, and even less likely to listen to any new information. And then we vote.
Source: "How facts backfire: Researchers discover a surprising threat to democracy - our brains"
DISCUSS!
Original posting by Braincrave Second Life staff on Jul 8, 2011 at http://www.braincrave.com/viewblog.php?id=596