The dynamics behind influence programs are examined in a piece based largely on research by University of Michigan social psychologist Norbert Schwarz.
The main practical difficulty of conducting counter-propaganda is clearly stated.
The psychological insights yielded by the research, which has been confirmed in a number of peer-reviewed laboratory experiments, have broad implications for public policy. The conventional response to myths and urban legends is to counter bad information with accurate information. But the new psychological studies show that denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.
This phenomenon may help explain why large numbers of Americans incorrectly think that Saddam Hussein was directly involved in planning the Sept. 11, 2001, terrorist attacks, and that most of the Sept. 11 hijackers were Iraqi. While these beliefs likely arose because Bush administration officials repeatedly tried to connect Iraq with Sept. 11, the experiments suggest that intelligence reports and other efforts to debunk this account may in fact help keep it alive. ...
The research does not absolve those who are responsible for promoting myths in the first place. What the psychological studies highlight, however, is the potential paradox in trying to fight bad information with good information.
Schwarz's study was published this year in the journal Advances in Experimental Social Psychology, but the roots of the research go back decades. As early as 1945, psychologists Floyd Allport and Milton Lepkin found that the more often people heard false wartime rumors, the more likely they were to believe them.
The research is painting a broad new understanding of how the mind works. Contrary to the conventional notion that people absorb information in a deliberate manner, the studies show that the brain uses subconscious "rules of thumb" that can bias it into thinking that false information is true. Clever manipulators can take advantage of this tendency.
The experiments also highlight the difference between asking people whether they still believe a falsehood immediately after giving them the correct information, and asking them a few days later. Long-term memories matter most in public health and political campaigns, and they are also the most susceptible to the bias of treating well-recalled false information as true.
The experiments do not show that denials are completely useless; if that were true, everyone would believe the myths. But the mind's bias does affect many people, especially those who want to believe the myth for their own reasons, or those who are only peripherally interested and are less likely to invest the time and effort needed to firmly grasp the facts.
The research also highlights the disturbing reality that once an idea has been implanted in people's minds, it can be difficult to dislodge. Denials inherently require repeating the bad information, which may be one reason they can paradoxically reinforce it.
Indeed, repetition seems to be a key culprit. Things that are repeated often become more accessible in memory, and one of the brain's subconscious rules of thumb is that easily recalled things are true.
The long-recognized utility of such repetition is precisely why, nearly every time President Bush makes a public appearance, we are treated to backdrops festooned with the slogan du jour, reiterated almost to the point of distraction.
Furthermore, a new experiment by Kimberlee Weaver at Virginia Polytechnic Institute and others shows that hearing the same thing over and over again from one source can have the same effect as hearing that thing from many different people -- the brain gets tricked into thinking it has heard a piece of information from multiple, independent sources, even when it has not. Weaver's study was published this year in the Journal of Personality and Social Psychology.
The experiments by Weaver, Schwarz and others illustrate another basic property of the mind -- it is not good at remembering when and where a person first learned something. People are not good at keeping track of which information came from credible sources and which came from less trustworthy ones, or even remembering that some information came from the same untrustworthy source over and over again. Even if a person recognizes which sources are credible and which are not, repeated assertions and denials can have the effect of making the information more accessible in memory and thereby making it feel true, said Schwarz.
Experiments by Ruth Mayo, a cognitive social psychologist at the Hebrew University of Jerusalem, also found that for a substantial share of people, the "negation tag" of a denial falls off with time. Mayo's findings were published in the Journal of Experimental Social Psychology in 2004.