
Can psychology save the world?

Scott Lilienfeld: 'The most important psychology experiment that’s never been done would determine whether psychology can save the world.'

26 September 2007

Yes, that statement is admittedly more than a bit hyperbolic. And this experiment will probably never be conducted, at least not in our lifetimes or even our great-grandchildren's lifetimes. But it is at least worth pondering as a Gedanken experiment. This experiment rests on three premises for which, I contend, there is substantial, although not yet definitive, support.

Premise #1: The greatest threat to the world is ideological fanaticism. By ideological fanaticism, I mean the unshakeable conviction that one's belief system and that of other in-group members are always right and righteous, and that others' belief systems are always wrong and wrongheaded – even to the point that others who hold them must be eliminated. Contra Hitchens (2007), religion per se is not a threat to the world, although certain religious beliefs can provide the scaffolding for ideological fanaticism, as we can see in the contemporary wave of Islamic extremism. As many historians have observed, the three most deadly political movements of the 20th century – Hitler's Nazism, Mao Tse-Tung's Cultural Revolution, and Pol Pot's Khmer Rouge – were largely or entirely secular. What unites all of these movements, including Islamic extremism, is the deeply entrenched belief that one's enemies are not merely misguided, but so profoundly misguided that they are wicked and must be liquidated.

Premise #2: Biased thinking is a necessary, although not sufficient, condition for ideological fanaticism. Among the most malignant biases, and those most relevant to ideological fanaticism, are: (1) Naïve realism: the erroneous belief that the world is precisely as we see it (Ross & Ward, 1996). Naïve realism in turn often leads to the assumption that "because I perceive reality objectively, others who disagree with me must be foolish, irrational, or evil" (see Pronin, Puccio, & Ross, 2002); (2) Bias blind spot ("not me" bias): the erroneous belief that we are not biased, although others are (Pronin, Gilovich, & Ross, 2004); and (3) Confirmation bias: the tendency to selectively seek out information consistent with one's beliefs and to ignore, minimize, or distort information that is not (Nickerson, 1998).

Premise #3: Critical thinking is the most effective (partial) antidote to ideological fanaticism. By critical thinking, I mean thinking designed to overcome one's biases, especially the three aforementioned biases.

Regrettably, malignant biases in thinking are virtually never addressed explicitly or even implicitly in educational curricula, which is troubling given that so much of everyday life – left-wing political blogs, right-wing political talk radio, political book-buying habits (Krebs, 2007), ad infinitum – reinforces them. Moreover, our selection of friends can generate not only communal reinforcement for our biases (Carroll, 2003), but the erroneous belief that our views are shared by most or all other reasonable people (i.e., a false consensus effect; Ross, Greene, & House, 1977). In some Islamic countries, of course, much of the educational curriculum comprises indoctrination into a cultural and religious worldview that implies that one's enemies are mistaken, blasphemous, and despicable. In the United States, some social critics (e.g., Bloom, 1987; Horowitz, 2007) have charged that the higher educational system typically engenders an insidious indoctrination into left-wing ideology. The merits of these arguments aside, it is undeniable that even among highly educated individuals (a group that includes many or most terrorists; Sageman, 2004), the capacity to appreciate views other than one's own is hardly normative.

So, the most important psychological experiment never done would (1) begin with the construction of a comprehensive evidence-based educational programme for debiasing children and adolescents in multiple countries against malignant biases, (2) randomly assign some students to receive this programme and others to receive standard educational curricula, and (3) measure the long-term effects of this debiasing programme on well-validated attitudinal and behavioural measures of ideological fanaticism. To some extent, the goal of this programme would be to inculcate not merely knowledge but wisdom (Sternberg, 2001), particularly aspects of wisdom that necessitate an awareness of one's biases and limitations, and the capacity to recognize the merits of differing viewpoints (e.g., Meacham, 1990, pp. 181–211).
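For readers who think in terms of study designs, the logic of steps (2) and (3) can be rendered as a toy simulation. Everything in the sketch below is an invented placeholder – the sample size, the assumed five-point debiasing effect, and the "fanaticism score" outcome measure are illustrative assumptions, not claims about what the real experiment would find.

```python
import random
import statistics

# Hypothetical illustration of the proposed design. All numbers here
# (population mean, SD, effect size, sample size) are invented for the
# sketch and are not drawn from the article or any real study.

random.seed(42)

def simulate_student(debiased: bool) -> float:
    """Return a simulated score on a (hypothetical) validated measure
    of ideological fanaticism; lower scores indicate less fanaticism."""
    baseline = random.gauss(50, 10)   # assumed population: mean 50, SD 10
    effect = -5 if debiased else 0    # assumed benefit of the curriculum
    return baseline + effect

# Step (2) of the design: randomly assign students to conditions.
students = list(range(1000))
random.shuffle(students)
treatment, control = students[:500], students[500:]

# Step (3): measure long-term outcomes in both arms and compare.
treated_scores = [simulate_student(debiased=True) for _ in treatment]
control_scores = [simulate_student(debiased=False) for _ in control]

print(f"Debiasing curriculum: mean score = {statistics.mean(treated_scores):.1f}")
print(f"Standard curriculum:  mean score = {statistics.mean(control_scores):.1f}")
```

The random assignment is what would allow any observed difference between the two curricula to be attributed to the debiasing programme rather than to pre-existing differences between students; a real trial would of course also require validated outcome measures and follow-up over years, not a single simulated score.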

The greatest obstacle to conducting this experiment, aside from the sheer pragmatic difficulty of administering a large-scale curriculum across multiple countries, is the surprising paucity of research on effective debiasing strategies. Nevertheless, at least some controlled research suggests that encouraging individuals to seriously entertain viewpoints other than their own (e.g., "considering the opposite") can partly immunize them against confirmation bias and related biases (Kray & Galinsky, 2003; Wilson, Centerbar, & Brekke, 2002). Whether such educational debiasing efforts, implemented on a massive scale, would help to inoculate future generations against ideological fanaticism is unknown. But launching such an endeavour by conducting small-scale pilot studies would seem to be a worthwhile starting point.

Dr Scott O Lilienfeld is Professor of Psychology, Emory University, Atlanta.