Two areas of research share clear similarities but are rarely brought together: ordinary cognitive biases and schizophrenic cognition. This review explores the intersection between normal biased thinking and schizophrenic delusional processes, in essence asking where, if at all, ordinary and schizophrenic cognition overlap. With the rise of conspiracy thinking and polarization in the current political climate, this seems a pressing issue. There is also the problem of stigma around schizophrenia, despite the fact that ordinary people may share many of the same biases. First, we explore the definition of delusion and the aspects of schizophrenia relating to cognitive biases, with a focus on confirmation biases. Then normal biased thinking is explored, showing how motivated reasoning, and ultimately confirmation bias, can be facilitated by belief-related aversion, particularly cognitive dissonance. The overlap between normal and schizophrenic cognition with respect to biases appears to involve stress driving both patterns. Evidence reveals enhanced aversion-related experience in schizophrenic subjects, which may underlie their tendency toward enhanced cognitive biases. Lastly, possible solutions are explored for reducing biases in both clinical and non-clinical populations.
Delusions are erroneous beliefs that have historically been associated with schizophrenia as well as other mental health conditions. According to the DSM-5 (APA 2013), a delusion is:
A false belief based on incorrect inference about external reality that is firmly held despite what almost everyone else believes and despite what constitutes incontrovertible and obvious proof or evidence to the contrary. The belief is not ordinarily accepted by other members of the person’s culture or subculture (i.e., it is not an article of religious faith). When a false belief involves a value judgment, it is regarded as a delusion only when the judgment is so extreme as to defy credibility. Delusional conviction can sometimes be inferred from an overvalued idea (in which case the individual has an unreasonable belief or idea but does not hold it as firmly as is the case with a delusion).
This definition includes a few elements, some of which can be viewed as problematic. Most important are the “wrong belief” element and the “not accepted by one’s culture” element. It is the cultural element that poses the most issues. One 2020 literature review argues that this conformity element is problematic (McKay & Ross 2020). Yet without it, a very large share of the non-clinical population would fit the criteria. This aspect of the definition seems designed to exclude those individuals; otherwise the concept of delusion would have little use in a diagnostic context. McKay and Ross even note that researchers have argued that delusion and religion share components of ordinary cognition. The distinction between conformist and nonconformist “incorrect” thought is useful but problematic: incorrect thinking alone is hardly abnormal, while nonconforming thought is inherently unusual, almost by definition. To conform is, in some sense, to be common. This matters because when we try to correct or even medicate away delusional thinking, it does not seem that we are giving patients drugs that induce conformity, though that is conceivable. The problem with the conformity element of the concept of delusion will become apparent throughout this essay.
There are a few biases that have been associated with delusional thinking. Certain confirmation biases, such as the “bias against disconfirmatory evidence” (BADE) and the “bias against confirmatory evidence” (BACE), are often observed in relation to delusional thought processes. Other biases, like “jumping to conclusions” (JTC) and “liberal acceptance” (LA), have also been related to delusional thinking (McLean, Mattiske, & Balzan 2017). BADE and BACE involve rejecting information that goes against one’s own hypothesis or that confirms a hypothesis one rejects, respectively. JTC involves quickly forming conclusions from insufficient evidence. LA is a bias in which people are too willing to accept absurd conclusions.
These biases seem common in the context of religious thinking, where challenging the beliefs of religious individuals can stimulate the use of BADE and BACE, leading some religious people to, for instance, reject scientific information. Holding beliefs that are not evidence-based has been correlated with BADE, BACE, and LA (Prike, Arnold, & Williamson 2018). Though religious beliefs are usually not considered evidence-based, the definition of delusion excludes religious believers, and it isn’t clear that culturally sanctioned and non-culturally sanctioned absurd beliefs should be pathologized differently on the basis of conformity alone. To include religious thinking in the definition of delusion would lead us to conclude that delusions are ordinary. To exclude religion, on the other hand, is essentially to suggest that the major pathological component of delusions is not wrongful thinking but dissent from the culture, or heresy. We exclude “correct” nonconformist thinkers (though not always: see Galileo, or the rejection of science by some religious people) because such thinking is the basis for intellectual progress and novel insight. One might wonder whether the difference between madness and genius is simply level of education and academic integration, although it is likely more complex than that.
Before we explore the intersection of ordinary and delusional thinking, it is important to consider that only some people are considered qualified to form their own original thoughts, namely experts or authorities. We do not expect uneducated people to form conclusions about how the world works; instead, we expect them to defer to what trained experts think about the world. Most people tend to stick to what authorities think, and to form conclusions on one’s own, without the guidance of education or experts, is to risk being badly misguided, ultimately forming uninformed conclusions. If someone resists the experts, we might label that person delusional.
Cognitive biases are a normal part of human thinking. Confirmation bias is one common bias in which a person treats evidence unfairly, in support of their own hypotheses. Researchers have found that giving people a motivation, whether through reward or aversion, tends to induce what is known as motivated reasoning, which in turn prompts the use of confirmation bias. During motivated reasoning, the brain regions that activate differ from those active during cold reasoning or emotion regulation (Taber & Lodge 2006), suggesting that motivated reasoning is distinct from general unmotivated reasoning. Cognitive dissonance is an aversive experience that occurs when one is presented with contradictory beliefs; it is thought to motivate the use of confirmation bias as a strategy for reducing the dissonance (Harmon-Jones & Mills 2019), although one can also change one’s belief in response to the aversive experience. There is also evidence of a prior attitude effect, in which people tend to stick to the beliefs and hypotheses they already hold (Westen et al. 2006). This suggests that people who have already formed beliefs may be more prone to confirmation bias. These biases seem to go hand in hand: someone presented with information that contradicts their prior attitude will experience cognitive dissonance, which motivates reasoning via a sense of stress or aversion, leading to a tendency toward confirmation bias. Notably, most research on human biases is performed on healthy rather than clinical subjects. This leads us to the intersection of these biases and the clinical population.
What is ordinary about delusions?
Individuals with schizophrenia are prone to delusional thinking. As mentioned before, BADE and BACE are, in fact, two specific kinds of confirmation bias. More broadly, schizophrenia has been argued to involve frequent use of confirmation biases in a model called the hypersalient evidence hypothesis (Balzan et al. 2013). Since cognitive dissonance and aversion motivate reasoning, one might wonder whether they play a role in the biases observed in delusional processes associated with schizophrenia. There is some evidence for the involvement of stress and aversion in schizophrenia and delusional processes. The diathesis-stress model holds that a genetic predisposition to stress reactivity, combined with environmental stressors, produces mental illness symptoms (Pruessner et al. 2017). Some research suggests that BADE and JTC are associated with stress in those low in schizotypy, whereas in high-schizotypy individuals BADE and JTC seem to occur even at low levels of stress and do not change in response to stress (Le et al. 2019). The authors argue that prefrontal cortex dysfunction may explain both stress-induced and schizotypy-related cognitive bias, but that high-schizotypy individuals may not be as impacted by stress, though it isn’t clear why. This introduces unresolved complexity into the issue.
Other research has shown that stress seems to promote delusional processes. One experiment used social stress to induce paranoia (Saalfeld et al. 2018). Another study found that social stress induced paranoia associated with biases like belief inflexibility, external attribution bias, and data-gathering bias (Pot-Kolder et al. 2018); the authors note that attention-to-threat bias and external attribution bias moderated paranoid ideation, while data-gathering bias and belief inflexibility did not. Social stress has also led schizophrenics to be more certain of their paranoid ideations (Urbańska, Moritz, & Gawęda 2019). Together these findings suggest that stress could promote certain kinds of biases, which might in turn promote paranoid and delusional symptoms.
The JTC bias has also been observed in schizophrenia and appears enhanced under stressful conditions. When delusion-prone individuals felt rushed, the JTC bias increased (Keefe & Warman 2011), suggesting that pressure to conclude enhances JTC. Another study found that delusion-prone and non-prone individuals performed similarly when no stress induction was present, but the delusion-prone individuals showed JTC under stress conditions (White & Mansell 2009). The stress that induces JTC need not be time pressure; even factors like noisy environments can induce it (Moritz et al. 2015). This could mean those prone to delusions respond differently to stress in general.
In particular, an enhanced proneness to stress and aversion may exist in those with schizophrenia, as the diathesis-stress model would suggest. There is evidence that individuals with schizophrenia show overactivation of fear-related brain regions when presented with neutral faces (Hall et al. 2008). Another study found that schizophrenics express increased aversion to angry faces (Evans et al. 2011). Schizophrenics have also been observed to have increased aversion to risk, possibly indicating less motivation or more distress over possible negative consequences (Reddy et al. 2014). It may be that schizophrenics experience stronger effects from stress, which eventually produce a tendency toward delusion-proneness. Some researchers argue that delusions may arise from the need to reduce uncertainty (Feeney et al. 2017). Those who experience more discomfort with uncertainty may jump to conclusions in order to resolve this kind of dissonance.
Some studies indicate less bias in schizophrenic subjects, further complicating the picture. In one study, schizophrenics were found to exhibit less JTC (Moritz et al. 2020); the study supported only the LA bias. Since these were individuals with schizophrenia, medication may be a factor, one that wouldn’t apply to the delusion-prone or high-schizotypy subjects in the other studies. Antipsychotics are known to reduce motivation (Kirsch et al. 2007), so it could be that they also reduce motivated reasoning, or even that this plays a role in treating the delusional state. Notably, ambivalence seems to reduce the use of biases (Schneider et al. 2020), and since antipsychotics are often said to be apathy-inducing, this may partially explain the reduction in JTC observed in the schizophrenic subjects of that study. Schizophrenia has been associated with ambivalence, but the link seems not to be genetic, as it did not appear in relatives of schizophrenics (Docherty et al. 2014). This implies it may be an environmental factor, such as medication.
Memory: A possible distinction or a general mechanism?
What may be unique to schizophrenic-type delusions is an element of memory impairment. Our conclusions about the present moment and about reality are based on memories. The way we view reality depends on our awareness of how we learned the world works, like knowing that the sky is blue but isn’t made of water. Forgetting elements of what we know about the sky being blue may allow for a more open interpretation. Without access to certain memories, we may form beliefs that contradict them, since those memories are no longer guiding belief formation. Thus, the inability to access certain memories may allow one to form beliefs one wouldn’t normally hold. This is an important consideration because amnesiac drugs (such as anesthetics) and brain injury can both lead to delusional beliefs (Aljadeed & Perona 2020; Corlett, Honey, & Fletcher 2007; Darby et al. 2017), and the mechanism in both cases may be a disruption of memory function.
Some research has suggested that persecutory delusions rely on episodic memory biases driven by emotional states (Lepage et al. 2007). Lepage and colleagues note that research has shown such episodic memory biases in depressive disorders, revealing that memory for negative experiences is enhanced while other memories are diminished. In persecutory states, the authors argue, memories for threatening experiences are enhanced. In the case of drug use (particularly stimulants), the user’s emotions may be altered so that such memory biases are elicited. In peak experiences of threat, these biases may become so pronounced that there is an amnesiac effect on other memories, thus shifting the conclusions and beliefs one holds to align with the memories most dominant in one’s mind.
This memory bias could partially explain the hypersalient evidence model of delusional processes, which suggests that confirmation biases are more prevalent in delusion-prone individuals and that these biases underlie delusional processes (Balzan et al. 2013). It may be that emotion sculpts which memories and information we are aware of, leading to confirmation biases. This could be true for nonclinical populations as well, as in the case of motivated reasoning and cognitive dissonance.
Besides memory loss, there is also the possibility that a person never formed such a memory to begin with: in other words, uneducation or miseducation. This is essentially cultural transmission of delusions, though, by the definition provided in the DSM-5, these would not be considered delusions. The cultural transmission of delusions is likely highly complicated, involving both miseducation and the exploitation of cognitive dissonance, as well as other mechanisms.
Can Delusions Be Induced?
The obvious case is the induction of delusions through drugs, but this is already well known and frequently discussed in the literature. Instead, I raise the possibility of inducing delusions by manipulating other elements of cognition, such as cognitive biases and memory biases. Keep in mind, this is speculative. If stress promotes biases, then stress could be used to experimentally induce those biases. Though it seems there are other factors at play, and the schizotypy study suggests it isn’t so simple.
First, one could prompt an individual to make a choice in their belief, to commit to it. Making a choice or commitment to a belief has been observed to enhance confirmation bias when evidence is presented after the choice is made (Talluri et al. 2018). Next, one could induce self-affirmation in the individual, which has also been observed to induce confirmation bias and seems to enhance attentional bias toward threatening components of a persuasive message (Munro & Stansbury 2009). Boosting one’s confidence also seems to drive confirmation bias (Rollwage et al. 2020). These steps may prepare a subject to engage in confirmation biases.
Once prepared, the subject may be prompted into bias through threats and dissonance. In some sense, this should reduce the individual’s ambivalence, further promoting the use of biases. Of course, doing something like this ethically may be difficult. An insult could be provided, giving a motivation to sink further into one’s beliefs. For example, if you tell someone that their beliefs are stupid, the individual has a choice: accept that they are wrong, or prove that they aren’t. The motivation to prove that they aren’t is expected to be stronger in the face of the insult, because admitting fault is now more costly. That doesn’t necessarily mean that admitting fault is more costly than being biased; a person who knows about these biases may be self-conscious about being called out on them, which might provoke further insults.
After these steps, one could provide disconfirmatory evidence against the belief that was labeled “stupid” and see whether the subject defends it. If the subject does, a debate could take place. Each counterargument might be expected to be met with contrarianism. With each point made, the individual could be pushed further into a corner, where they must either express and argue for a delusional position or admit fault and suffer an extremely aversive outcome. Those most anxious about the aversive outcome may be most prone to this form of experimental delusion induction. To reiterate, it isn’t clear that this could be done ethically; and if trauma or abuse underlie naturally occurring psychotic delusions, then we likely couldn’t simulate those in a lab ethically either. This may be outside the realm of ethical testability, although perhaps someone besides me could design an ethical implementation.
Can Delusions Be Reduced?
Reducing delusions is far more ethical. Part of this process may simply be a reversal of the steps above: reduce the sense of choice, reduce self-affirmation, reduce confidence in the belief, increase ambivalence, and avoid insults. This would mean approaching delusional individuals carefully, treating them with more respect and caution than we might normally expect. Currently, many delusional beliefs are met with criticism, laughter, or even, in some cases, persecution (consider flat-earthers). Political polarization seems to involve both sides engaging in delusion-inducing behaviors, such as throwing insults and dehumanizing the opposition for their beliefs. In order to reduce delusions in clinical subjects (or even more broadly), we may need to take the opposite approach: reduce emotional investment and reduce the cost of being wrong (i.e., embarrassment).
We can also focus on reducing people’s memory biases and teaching them about normal biases. In the case of memory biases, reminding a patient of memories outside their current emotional state could shift their mood and return them to a more comfortable, less threatened mindset. Reminiscing on good memories may prove a valuable tool for these kinds of delusions. In the case of other biases, we can teach people how they work. This could mean building awareness of these biases at an earlier age in the general population, through the education system or through parenting. Normalizing discussion of how these biases work may improve social relations throughout society, so it seems worthwhile.
It would appear that delusional processes are far more complex than any single account suggests, and that pinpointing any particular mechanism will remain difficult. Various disruptions of cognition could skew thinking patterns and produce delusional conclusions. This may occur in relation to motivation, promoting motivated reasoning; as a result of memory disruption, leading to beliefs uninformed by one’s most basic knowledge; or through cultural transmission, merely learning misinformed conclusions. These biases may be only one route to delusional thinking.
It appears that stress-proneness could underlie delusion-proneness. Although, perhaps we shouldn’t jump to conclusions.
. . .
Special thanks to the 14 patrons: Idan Solon, David Chang, Jan Rzymkowski, Jack Wang, Richard Kemp, Milan Griffes, Alex W, Sarah Gehrke, Melissa Bradley, Morgan Catha, Niklas Kokkola, Abhishaike Mahajan, Riley Fitzpatrick, and Charles Wright! Abhi is also the artist who created the cover image for Most Relevant; please support him on Instagram, he is an amazing artist! I’d also like to thank Alexey Guzey, Annie Vu, Chris Byrd, and Kettner Griswold for their kindness and for making these projects and the podcast possible through their donations.
Aljadeed, R., & Perona, S. (2020). Transient amnesia following prehospital low-dose ketamine administration. The American journal of emergency medicine, 38(7), 1544-e5.
American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). American Psychiatric Association.
Balzan, R., Delfabbro, P., Galletly, C., & Woodward, T. (2013). Confirmation biases across the psychosis continuum: The contribution of hypersalient evidence‐hypothesis matches. British Journal of Clinical Psychology, 52(1), 53-69.
Corlett, P. R., Honey, G. D., & Fletcher, P. C. (2007). From prediction error to psychosis: ketamine as a pharmacological model of delusions. Journal of psychopharmacology, 21(3), 238-252.
Darby, R. R., Laganiere, S., Pascual-Leone, A., Prasad, S., & Fox, M. D. (2017). Finding the imposter: brain connectivity of lesions causing delusional misidentifications. Brain, 140(2), 497-507.
Docherty, A. R., Sponheim, S. R., & Kerns, J. G. (2014). Further examination of ambivalence in relation to the schizophrenia spectrum. Schizophrenia research, 158(1-3), 261-263.
Evans, S., Shergill, S. S., Chouhan, V., Collier, T., & Averbeck, B. B. (2011). Patients with schizophrenia show increased aversion to angry faces in an associative learning task. Psychological medicine, 41(7), 1471.
Feeney, E. J., Groman, S. M., Taylor, J. R., & Corlett, P. R. (2017). Explaining delusions: reducing uncertainty through basic and computational neuroscience. Schizophrenia bulletin, 43(2), 263-272.
Hall, J., Whalley, H. C., McKirdy, J. W., Romaniuk, L., McGonigle, D., McIntosh, A. M., … & Lawrie, S. M. (2008). Overactivation of fear systems to neutral faces in schizophrenia. Biological psychiatry, 64(1), 70-73.
Harmon-Jones, E., & Mills, J. (2019). An introduction to cognitive dissonance theory and an overview of current perspectives on the theory. In E. Harmon-Jones (Ed.), Cognitive dissonance: Reexamining a classic theory in psychology (2nd ed.). American Psychological Association.
Keefe, K. M., & Warman, D. M. (2011). Reasoning, delusion proneness and stress: An experimental investigation. Clinical psychology & psychotherapy, 18(2), 138-147.
Kirsch, P., Ronshausen, S., Mier, D., & Gallhofer, B. (2007). The influence of antipsychotic treatment on brain reward system reactivity in schizophrenia patients. Pharmacopsychiatry, 40(05), 196-198.
Le, T. P., Fedechko, T. L., Cohen, A. S., Allred, S., Pham, C., Lewis, S., & Barkus, E. (2019). Stress and cognitive biases in schizotypy: A two-site study of bias against disconfirmatory evidence and jumping to conclusions. European Psychiatry, 62, 20-27.
Lepage, M., Sergerie, K., Pelletier, M., & Harvey, P. O. (2007). Episodic memory bias and the symptoms of schizophrenia. The Canadian Journal of Psychiatry, 52(11), 702-709.
McKay, R. T., & Ross, R. M. (2020). Religion and delusion. Current Opinion in Psychology.
McLean, B. F., Mattiske, J. K., & Balzan, R. P. (2017). Association of the jumping to conclusions and evidence integration biases with delusions in psychosis: a detailed meta-analysis. Schizophrenia bulletin, 43(2), 344-354.
Moritz, S., Köther, U., Hartmann, M., & Lincoln, T. M. (2015). Stress is a bad advisor. Stress primes poor decision making in deluded psychotic patients. European archives of psychiatry and clinical neuroscience, 265(6), 461-469.
Moritz, S., Scheunemann, J., Lüdtke, T., Westermann, S., Pfuhl, G., Balzan, R. P., & Andreou, C. (2020). Prolonged rather than hasty decision-making in schizophrenia using the box task. Must we rethink the jumping to conclusions account of paranoia?. Schizophrenia Research, 222, 202-208.
Munro, G. D., & Stansbury, J. A. (2009). The dark side of self-affirmation: Confirmation bias and illusory correlation in response to threatening information. Personality and Social Psychology Bulletin, 35(9), 1143-1153.
Pot-Kolder, R., Veling, W., Counotte, J., & Van Der Gaag, M. (2018). Self-reported cognitive biases moderate the associations between social stress and paranoid ideation in a virtual reality experimental study. Schizophrenia bulletin, 44(4), 749-756.
Prike, T., Arnold, M. M., & Williamson, P. (2018). The relationship between anomalistic belief and biases of evidence integration and jumping to conclusions. Acta psychologica, 190, 217-227.
Pruessner, M., Cullen, A. E., Aas, M., & Walker, E. F. (2017). The neural diathesis-stress model of schizophrenia revisited: An update on recent findings considering illness stage and neurobiological and methodological complexities. Neuroscience & Biobehavioral Reviews, 73, 191-218.
Reddy, L. F., Lee, J., Davis, M. C., Altshuler, L., Glahn, D. C., Miklowitz, D. J., & Green, M. F. (2014). Impulsivity and risk taking in bipolar disorder and schizophrenia. Neuropsychopharmacology, 39(2), 456-463.
Rollwage, M., Loosen, A., Hauser, T. U., Moran, R., Dolan, R. J., & Fleming, S. M. (2020). Confidence drives a neural confirmation bias. Nature communications, 11(1), 1-11.
Saalfeld, V., Ramadan, Z., Bell, V., & Raihani, N. J. (2018). Experimentally induced social threat increases paranoid thinking. Royal Society open science, 5(8), 180569.
Schneider, I. K., Novin, S., van Harreveld, F., & Genschow, O. (2020). Benefits of being ambivalent: The relationship between trait ambivalence and attribution biases. British Journal of Social Psychology.
Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American journal of political science, 50(3), 755-769.
Talluri, B. C., Urai, A. E., Tsetsos, K., Usher, M., & Donner, T. H. (2018). Confirmation bias through selective overweighting of choice-consistent evidence. Current Biology, 28(19), 3128-3135.
Urbańska, D., Moritz, S., & Gawęda, Ł. (2019). The impact of social and sensory stress on cognitive biases and delusions in schizophrenia. Cognitive neuropsychiatry, 24(3), 217-232.
Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 US presidential election. Journal of cognitive neuroscience, 18(11), 1947-1958.
White, L. O., & Mansell, W. (2009). Failing to ponder? Delusion‐prone individuals rush to conclusions. Clinical Psychology & Psychotherapy: An International Journal of Theory & Practice, 16(2), 111-124.