Suppose you are on a trial jury trying to decide whether the defendant is guilty. You are discussing the case with your fellow jurors who you know have exactly the same evidence as you, and are just as good at assessing the evidence. You think the defendant is guilty, while your peers think he is innocent. After lengthy discussion, you still disagree. What is the rational response to this disagreement? Is there a logical way out of such an impasse?
This is a common situation, but it is deeply puzzling. To answer the question of what the rational response to disagreements is, we should distinguish the psychological question of what people do from the philosophical question of what people ought to do.
The issues with what people do are well known. Benjamin Franklin, one of the founding fathers of the US, wrote: “Most men … think themselves in possession of all truth, and that wherever others differ from them, it is so far error.”
Research backs this up. People tend to ignore evidence that contradicts their beliefs, regardless of whether they are right. They are even biased in their detection of bias – they find it in other people, but not in themselves.
It is when we consider what we should do that we realise such responses are irrational – they are often based on emotions rather than logic. Those who hold that their opinions are right and everyone else is mistaken are guilty of being arbitrary. They have most likely acted impulsively, failing to make a rational assessment of the argument.
To avoid being arbitrary, you should be humble and conciliate: move your opinion towards the other person’s, just as they should move theirs towards yours, until you both become agnostic. We are talking here about disagreement between peers: people who are equally intelligent and equally well informed. For most of us, this is nevertheless extremely counter-intuitive.
But conciliating has its successes. The “wisdom of the crowd” is the well-known phenomenon that groups can produce remarkably accurate opinions. The idea can be traced back to the ancient philosopher Aristotle, and it was popularised when the English scientist Francis Galton noticed that the average of around 800 guesses of the weight of an ox was within 1% of the true weight.
But this common-sense view has some disturbing results if we follow it through, as it becomes impossible to maintain any opinion in the face of peers who disagree with you. When faced with a single disagreeing juror, you should already suspend your belief. And when 11 other jurors disagree with your view that the defendant is guilty, it is more likely that you made a mistake than that all of them did, so you should change your mind and conclude that the defendant is innocent.
Holding almost any controversial opinion becomes irrational. You probably have at least one strong political opinion on which intelligent people disagree with you. According to conciliationism, holding it is irrational. The only rational position becomes a radical agnosticism: refraining from any strong opinions that are not shared by most other people. It is irrational to disagree with the crowd.
But the philosopher Adam Elga sees this as “spineless” and argues that we don’t always have to become agnostic when the crowd disagrees. Consider people with political views radically different from yours. Such views must be based on a radically different worldview. If you think that worldview is radically wrong, perhaps you should decide that these people are not your peers after all and discount their opinions.
But I argue that this gets things backwards. If you initially thought they were as intelligent and knowledgeable as you, you should take their views as evidence that your entire worldview is wrong. There are limits, of course. We should only worry about the beliefs of those who are at least as well informed and as good at assessing the evidence as we are. Still, many people find this approach uncomfortable.
But what benefits could radical agnosticism have in society? Politicians often have to make decisions in the face of expert disagreement, as we are now seeing in responses to the coronavirus. When there is a choice between incompatible paths, it might be best to take the one that seems most promising, despite having low confidence that it is the right one. In such cases, however, new evidence, which might still be inconclusive, could show that the best course is to switch from one path to another. So making a U-turn, politically embarrassing as it is, can actually be more rational than sticking with a faulty initial approach.
When it comes to science, revolutions have come from people who completely disagreed with their peers. Scientists might rationally work on something which they and others think will fail, as the benefits of being right make it worthwhile, despite the low probability of success. But if they genuinely believe they are right, they should be able to convince others. Einstein discovered a revolutionary and deeply counter-intuitive theory of gravity, but it did not take very long for other scientists to adopt it.
Strangely, those with strong beliefs tend to be admired. The human mind hates uncertainty, so it is comforting to be told what to think, and to form settled opinions. But it is not rational. As the philosopher Bertrand Russell wrote: “The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.”
It’s an insightful comment that we should all ponder. Whether we are able to fully embrace radical agnosticism or not, chances are the world would be a better place if we started questioning our own beliefs a bit more.
Darren Bradley, Associate Professor, University of Leeds