Ulterior Motives in Truth-Seeking

Most people are aware that biases are a major obstacle to doing science effectively. Essentially, biases are ulterior motives that compete with truth-seeking. Here we explore why we might become motivated to support falsehoods. The fix sounds simple: eliminate the motivating factors. Yet many scientists and science-interested people likely promote these motivations, often unknowingly. We will then cover the motivations we have for instilling falsehood-supporting motivations in others. Last, we look at solutions.

Our motivation to support hypotheses can stem from multiple sources. One motivation is good: discovering some valuable practical truth. Those whose lives are directly affected by the truths their science uncovers are in the best position; their future depends on the truth itself rather than on merely convincing other people of the results or hypothesis. Many motivations are counterproductive, such as the urge to socially display our ability and worthiness as a thinker or scientist. More obviously, money is a corrupting motivator for truth-seeking. Most tragically, so is survival: our need to protect ourselves from financial ruin may motivate us to essentially manipulate those who control our financial situation. Creating systems that rely on these corrupting motivations is part of the problem.

As a culture, we often reward people who find truth and punish those who hold falsehoods. Rewarding people for finding the truth creates an incentive beyond the truth being rewarding for some direct practical reason, like the utility of creating new technology or curing a disease that a loved one has. If we purely seek the reward inherent to the truth, then we have no reason to support anything besides its discovery. Denying the truth would only make the situation worse: if the only reward comes from the implications of the truth being discovered or explored, investing resources in a falsehood buys nothing but loss.

On the other hand, we often have social motivations to cling to a belief or hypothesis. A scientist might be rewarded with fame or money for discovering some important truth with great practical utility, again, like treating a disease. This creates a scenario in which falsehoods are actually incentivized. To acquire money or fame, one does not necessarily need a true idea; one merely needs to convince other people that it is true or worth investing in. People will hand over their money or attention as long as they do not realize it is false.

Another very important social motivator is the simple social reward people get for being “right” about an idea. People often seem to play a game of one-upping each other intellectually. This involves subjecting a perceived opponent to intense scrutiny as a way of displaying one’s own “intellectual power”, which can earn social status. I encounter this particularly frequently in social media spaces, especially public ones such as Facebook or Twitter. To stop this, we must not be impressed with people or reward them for their ability to be right. We cannot have a culture that rewards smart people for being smart. Being smart for the sake of its social effects should not guide truth-seeking; only the practical utility of truthful ideas should.

We also punish being wrong about an idea: we shame conspiracy theorists and pseudoscientists. This creates new motivators that are distinct from the direct utility of the truth. Punishing falsehoods creates an incentive to hide being wrong, which can actually involve spreading the falsehood and converting others as a protective mechanism. Clearly, if you convince other people that your wrong belief is correct, you escape this kind of punishment. This means truth-seeking can easily lose priority to protecting social status. It’s as if we created a culture that pays people in social credit for being smart, and many people have been corrupted by these social-“financial” biases. If you are labeled intelligent by others, you are secretly being funded to appear smart by your mom, friends, employers, or the public.

Rather than punishing people for holding false beliefs, such as conspiracy theories, we should be punishing those who punish people for holding false beliefs. These people who punish others for holding wrong beliefs are fucking idiots. Yes, feel that rage burn inside of you. Now you have become highly motivated to defend your behavior and prove that you are not an idiot. That motivation is not the sudden emergence of truth-seeking but rather the sudden urgency to defend your social standing and seek my approval. It is your need to protect yourself from the fear that others will disapprove of you as well. Perhaps you will find friends to give you that approval, increasing your sense of reward and diminishing this pain. This reaction is exactly why we should not punish false beliefs. Do not worry, you are not an idiot; this was purely for demonstration purposes.

Why would anyone want to seek truth purely for its own sake? I’ve explored this issue in Knowledge Isn’t Valuable. Truth is only useful insofar as it gives us the ability to control reality and ultimately reach some kind of reward or avoid a negative consequence. Being biased toward controlling reality by using the truth is how you correctly seek the truth. Seeking truth “for the sake of it” is something I find absurd; it is actually a mask for something else, like the urge to hold the social identity of an intellectual or a truth-seeker. “Science” or truth-seeking can easily be hijacked as an identity to reach social goals. Then falsehoods begin to have utility as social power and the situation becomes problematic.

Truths are frequently arbitrary to our innate goals. Consider a coin flip: which side landed face up is certainly a truth to be discovered, but given the choice between knowing it and gaining money or social status, we will quickly favor the money or status because they have more concrete implications for our lives. The truth often does have concrete implications, and I am arguing that those are the cases in which a person is most likely to seek actual truths.

Truth-seeking is not the only thing with utility in our world, which means many other goals can take priority over it. Falsehoods can have utility in altering the behavior of other people, essentially becoming a tool for reaching other goals like securing funding or selling a product.

If science is the goal, creating a scenario that incentivizes deception or biases us against the truth is obviously counterproductive. The truth has its own utility, and we should narrow the rewards people gain from discovering new truths down to just the reward that stems from their practical implications. We should have poor scientists who are not rewarded with fame. If you want to cure some disease, you should give the disease to the person you love most so that you are forced into maximum motivation to solve that problem. The truth, the cure, is the only way to save the one you love.

In my own life, having schizophrenic-type symptoms drove me to explore the literature to further understand and hopefully control this about my mind. Along the way, I developed many other personal motivations like trying to control the brain in general ways, through pharmacology, electricity, nutrition, and various other things.

I don’t currently think that humans are inherently biased; instead, we create biases and incentives to stray from the truth. Some of these problems could be fixed in the realm of science, at least theoretically. There should be nothing to gain from doing science other than what comes from the direct consequences of the work itself, such as its practical implications (disease cures, new technology that the scientists want themselves, etc.).

  1. Scientists could be banned from fame. This may mean scientists are not allowed to publicly reveal their identities.
  2. Financial biases could be addressed by ensuring that money cannot be a motivator for their scientific work.
  3. Less realistically: Again, if you want to cure some disease, you should give the disease to the person you love most so that you are forced into maximum motivation to solve the problem. The truth is the only way to save the one you love.
  4. Someone could give themselves schizophrenic symptoms so they are more motivated to solve their own problems. Being biased here won’t yield functional solutions, so there is great pressure to figure things out correctly. In my own experience, this has accidentally been the case: using cannabis too much led to symptoms like formication, which led me to write about what I learned while trying to resolve it in Illusory Bugs.
  5. You could imagine more. The list may be updated with Redditors’ suggestions.

. . .

If you found this enjoyable, consider joining the Patreon! I’ve been posting detailed experience reports on my adventures with prescription ketamine. Someone also sent me an EEG device to collect data on ketamine-induced brainwave changes, which I’ve started posting there too. I also post secret mini podcasts. You can find the publicly available podcasts here, by the way!

Special thanks to the 16 patrons: Nick S, Alex W, Elin Ahlstrand, Morgan Hough, Buttercup, Dan Elton, Idan Solon, David Chang, Jack Wang, Richard Kemp, Sarah Gehrke, Melissa Bradley, Morgan Catha, Niklas Kokkola, Riley Fitzpatrick, and Charles Wright! Abhi is also the artist who created the cover image for Most Relevant. Please support him on Instagram; he is an amazing artist! I’d also like to thank Alexey Guzey, Annie Vu, Chris Byrd, and Kettner Griswold for your kindness and for making these projects and the podcast possible through your donations.

If you liked this, follow me on Twitter.
