From Facebook to the Moon Mission Race, Lost Arguments Are Doing Science
In a recent study published in Science Signaling, Jason Tappey, an author of the paper and a former postdoctoral researcher at the Laboratory for Sensorive Bioanalytical Studies, examined how people can be divided into subgroups according to their beliefs across a wide variety of sets of information, and what it means for those beliefs to be true, partial, or unrepentant. He found that the people likeliest to be labeled true believers are those who hold uniquely and persistently to a set of beliefs about the universe, even when that belief carries no significant explanatory power, as with strong faith. Such adherents are also the most likely to align their beliefs with philosophical and scientific pursuits.
Those changes are attributed to a loss of confidence in theories about the universe that are at odds with the beliefs held by true believers.
In their study, the researchers collected responses from 3,083 people across 14 countries. Participants were asked, among other things, which subjects they thought were better (10 percent) and which stories were better (38 percent) than expected (37 percent).
True believers were also more likely to allow their beliefs to change. This can be seen in a popular meme that recently resurfaced, showing an average American saying he will tell the truth only 90 percent of the time. True believers, it turns out, are less likely to say they tell the truth only 90 percent of the time, whether because they change their minds or because they believe in believing more than they say.
In turn, true believers tend to be more pessimistic about the future: 40 percent believe we are doomed to end up living in a wheelbarrow in Paris. Participants who rated themselves true believers were 1.28 times more likely than non-believers to say they were looking at the future the wrong way.
What about simulation believers? The idea that computers are fast, efficient, and can run our lives just fine may draw them toward certain theories about AI. However, the study's results should not be read as unequivocal, notes Vaidhyanathan. He points out that some virtual work environments, such as Dreamwork Q, run by Andreessen Horowitz, have encouraged full participation from self-described AI “fools.” In that respect, the study carries a fair bit of significance: it highlights “intentional cognitive compartmentalization,” though Vaidhyanathan insists he is not condoning it. He stressed that the core “big story” of the study is that “true believers are not based on any solid material that we can explain, but have an intrinsic, insatiable, and eternal intellectual life that might be taken to the extreme.” That tension, he said, ultimately amounts to “balancing the abundance of material in nature with the deeper part of a person’s being,” which is centered on a set of thoughts.
Note: A previous version of this article was written by Jiho Lee of the Science Signaling news service.