The science of not believing in science.


BoogieMonster (April 29, 2011, 10:37:27 AM):
A brief excerpt:

an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. "They retrieve thoughts that are consistent with their previous beliefs," says Taber, "and that will lead them to build an argument and challenge what they're hearing."

In other words, when we think we're reasoning, we may instead be rationalizing.

...


That's not to suggest that we aren't also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It's just that we have other important goals besides accuracy—including identity affirmation and protecting one's sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should. ...

...

Modern science originated from an attempt to weed out such subjective lapses (...). Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation.


This article makes one wonder how rational one's own beliefs are, or at least, it makes me want to be more vigilant about my own biases, or more scientific. However, the article suggests that our biases may affect the outcome of our reasoning no matter how hard we try to look... Food for thought.
GCG (April 29, 2011, 10:59:09 AM):
brief

lol >:D
BoogieMonster (April 29, 2011, 11:04:59 AM):
That WAS brief; the article is 4 pages long. Well, on the internet that's relative, but it was much, much longer than my "brief". >:D
cyghost (April 29, 2011, 11:06:31 AM):
If I look at the issues mentioned, BoogieM, I wouldn't be too concerned.

But being more vigilant and aware, and revisiting our preconceptions and biases, isn't a bad thing either.
Mefiante (April 29, 2011, 11:08:49 AM):
This is hardly news, so I’m a little taken aback that it seems to be reported as such. Neuroscientists and psychologists have been aware for many years already of the role of these and other factors that significantly shape our thinking.

Be that as it may, they are the reasons why peer review and expert consensus are of such cardinal importance in science. For the layperson, this translates into reading far and wide on any given topic if you wish to be sure that your thinking isn’t skewed by those factors.

'Luthon64
