The elusive backfire effects
Ten years ago, scholars and practitioners were concerned that corrections might “backfire”; that is, ironically strengthen misconceptions rather than reduce them. Recent research has allayed those concerns: backfire effects occur only occasionally, and the risk of occurrence is lower in most situations than once thought.
Do not refrain from attempting to debunk or correct misinformation out of fear that doing so will backfire or increase belief in false information 66, 67, 68.
Backfire Effect: A backfire effect occurs when a correction inadvertently increases belief in, or reliance on, misinformation relative to a pre-correction or no-correction baseline.
Familiarity backfire effect
Repetition makes information more familiar, and familiar information is generally perceived to be more truthful than novel information (the aforementioned illusory-truth effect). Because a myth is necessarily repeated when it is debunked, the risk arises that debunking may backfire by making a myth more familiar (see figure below). Early evidence was supportive of this idea, but more recently, exhaustive experimental attempts to induce a backfire effect through familiarity alone have come up empty 69, 70. Thus, while repeating misinformation generally increases familiarity and truth ratings, repeating a myth while refuting it has been found to be safe in many circumstances, and can even make the correction more salient and effective 71.
Overkill backfire effect
This effect refers to the idea that providing “too many” counterarguments against a false claim might produce unintended effects or even backfire. The only study to directly examine this notion, however, found no evidence for this effect and instead concluded that a greater number of relevant counterarguments generally leads to greater reduction of misconceptions 69.
Worldview backfire effect
The worldview backfire effect is presumed to occur when a correction that challenges people’s worldview increases belief in the misinformation. While there was initially some evidence for the worldview backfire effect 72, recent research indicates that it is not a pervasive and robust empirical phenomenon.
Several studies have failed to obtain a backfire effect even in theoretically favourable circumstances 22, 23, 67, 73, 74. Thus, while there are reports of worldview backfire effects emerging under specific conditions (e.g., when Republicans are presented with information concerning climate mitigation measures 75), concern about the worldview backfire effect has been disproportionate.
Personal experience vs. evidence
Although communicators may observe backfire effects in their everyday lives, many experiments have shown that such responses are, in fact, unusual. Social scientists are still figuring out why some people “backfire” but not others, and why those effects occur on some occasions but not others. However, the accumulated evidence to date is clear that the worldview backfire effect is not a sufficient reason to avoid debunking and fact-checking.
Role of worldview in belief confirmation
Even if worldview backfire effects are infrequent, there are other ways that worldview can affect debunking.
Worldview can affect what content people choose to consume 76, 77, 78. This process of selective exposure may mean that people are more likely to be exposed to worldview-consonant false or misleading claims in the first place, and by implication, less likely to be exposed to corrective information about such claims after exposure. To illustrate, one analysis showed that 62% of visits to fake news websites came from the 20% of Americans with the most conservative information diet 77.
The efficacy of corrections depends in part on the recipient’s willingness to believe the statement. Activating group identities likely constrains how people think about an issue—depending on the identity and the issue, this may ameliorate or exacerbate misperceptions, and it may affect whom a person will believe. This highlights the importance of using inclusive language and avoiding the stigmatization of groups for holding inaccurate beliefs. Stigmatizing language is likely to polarize rather than generate the desired belief updating.
Recent research suggests that although (mis-)information diets may differ across the political spectrum, some of the motivated reasoning processes just described may be symmetric for liberals and conservatives 79.
To learn more, continue with Debunk often and do it properly