The research on belief vs. fact described in a recent news release from the University at Buffalo (New York, USA) might be useful, though it doesn't deal specifically with EBM -- the researchers used a political topic to explore "motivated reasoning" -- but I feel this key statement from the news release applies equally well to the application of EBM:
 
>>
"Our data shows substantial support for a cognitive theory known as 'motivated reasoning,' which suggests that rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.

"In fact," he says, "for the most part people completely ignore contrary information."
<<
 
Full text of the news release, with URL to the original, appears below.
 
*****
STUDY DEMONSTRATES HOW WE SUPPORT OUR FALSE BELIEFS

A belief in a link between Saddam Hussein and 9/11 is a case in point; false beliefs stirred by the current health care debate may be another


Release date: Friday, August 21, 2009
Contact: Patricia Donovan, [log in to unmask]
Phone: 716-645-4602
Fax: 716-645-3765

BUFFALO, N.Y. -- In a study published in the most recent issue of the journal Sociological Inquiry, sociologists from four major research institutions focus on one of the most curious aspects of the 2004 presidential election: the strength and resilience of the belief among many Americans that Saddam Hussein was linked to the terrorist attacks of 9/11.

Although this belief influenced the 2004 election, they claim it did not result from pro-Bush propaganda, but from an urgent need by many Americans to seek justification for a war already in progress.

The findings may illuminate reasons why some people form false beliefs about the pros and cons of health-care reform or regarding President Obama's citizenship, for example.

The study, "There Must Be a Reason: Osama, Saddam and Inferred Justification," calls such unsubstantiated beliefs "a serious challenge to democratic theory and practice" and considers how and why this belief was maintained by so many voters for so long in the absence of supporting evidence.

Co-author Steven Hoffman, Ph.D., visiting assistant professor of sociology at the University at Buffalo, says, "Our data shows substantial support for a cognitive theory known as 'motivated reasoning,' which suggests that rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.

"In fact," he says, "for the most part people completely ignore contrary information."

"The study demonstrates voters' ability to develop elaborate rationalizations based on faulty information," he explains.

While numerous scholars have blamed a campaign of false information and innuendo from the Bush administration, this study argues that the primary cause of misperception in the 9/11-Saddam Hussein case was not the presence or absence of accurate data but a respondent's desire to believe in particular kinds of information.

"The argument here is that people get deeply attached to their beliefs," Hoffman says.

"We form emotional attachments that get wrapped up in our personal identity and sense of morality, irrespective of the facts of the matter. The problem is that this notion of 'motivated reasoning' has only been supported with experimental results in artificial settings. We decided it was time to see if it held up when you talk to actual voters in their homes, workplaces, restaurants, offices and other deliberative settings."

The survey- and interview-based study was conducted by Hoffman; Monica Prasad, Ph.D., assistant professor of sociology at Northwestern University; Northwestern graduate students Kieren Bezila and Kate Kindleberger; Andrew Perrin, Ph.D., associate professor of sociology, University of North Carolina, Chapel Hill; and UNC graduate students Kim Manturuk, Andrew R. Payton and Ashleigh Smith Powers (now an assistant professor of political science and psychology at Millsaps College).

The study addresses what it refers to as a "serious challenge to democratic theory and practice that results when citizens with incorrect information cannot form appropriate preferences or evaluate the preferences of others."

One of the most curious "false beliefs" of the 2004 presidential election, they say, was a strong and resilient belief among many Americans that Saddam Hussein was linked to the terrorist attacks of Sept. 11, 2001.

Hoffman says that over the course of the 2004 presidential campaign, several polls showed that majorities of respondents believed that Saddam Hussein was either partly or largely responsible for the 9/11 attacks, a percentage that declined very slowly, dipping below 50 percent only in late 2003.

"This misperception that Hussein was responsible for the Twin Tower terrorist attacks was very persistent, despite all the evidence suggesting that no link existed," Hoffman says.

The study team employed a technique called "challenge interviews" on a sample of voters who reported believing in a link between Saddam and 9/11. The researchers presented the available evidence of the link, along with the evidence that there was no link, and then pushed respondents to justify their opinion on the matter. For all but one respondent, the overwhelming evidence that there was no link had no impact on their arguments in support of the link.

One unexpected pattern that emerged from the different justifications that subjects offered for continuing to believe in the link was that the link helped citizens make sense of the Bush Administration's decision to go to war against Iraq.

"We refer to this as 'inferred justification,'" says Hoffman, "because for these voters, the sheer fact that we were engaged in war led to a post-hoc search for a justification for that war.

"People were basically making up justifications for the fact that we were at war," he says.

"One of the things that is really interesting about this, from both the perspective of voting patterns but also for democratic theory more generally," Hoffman says, "is that we did not find that people were being duped by a campaign of innuendo so much as they were actively constructing links and justifications that did not exist."

"They wanted to believe in the link," he says, "because it helped them make sense of a current reality. So voters' ability to develop elaborate rationalizations based on faulty information, whether we think that is good or bad for democratic practice, does at least demonstrate an impressive form of creativity."

The University at Buffalo is a premier research-intensive public university, a flagship institution in the State University of New York system and its largest and most comprehensive campus. UB's more than 28,000 students pursue their academic interests through more than 300 undergraduate, graduate and professional degree programs. Founded in 1846, the University at Buffalo is a member of the Association of American Universities.


See this article online at: http://www.buffalo.edu/news/10364

------------------------------------------------------------------------------
For more information please see the UB News Services web site at http://www.buffalo.edu/news

******
[end of copied news release]
 
Peggy Noonan
www.pjnoonan.com
@pjnoonan

In a message dated 12/4/2009 11:05:33 A.M. Mountain Standard Time, [log in to unmask] writes:

I'm a freelance journalist, and I'm writing a piece examining pushback against evidence-based medicine (for instance, see the latest mammogram controversy in the U.S.).

In particular, I'm exploring the role of belief in the uptake of EBM. In my reporting, I've found that new scientific evidence is often rejected when it contradicts strongly held (but erroneous) beliefs. It's not that people don't see the evidence; it's that they don't believe it (or they don't believe that it applies to them).

I'm looking for research on the role of belief systems in the uptake of EBM. Has anyone studied ways to defeat scientifically wrong but strongly held beliefs via narrative? The idea being that in some cases it's not the evidence itself that convinces, but instead the story or narrative constructed from the evidence. How can new evidence be more effectively communicated when it contradicts established practice?


cheers,
Christie

Christie Aschwanden
Freelance writer
[log in to unmask]
www.christieaschwanden.com