The 7 Habits of Highly Effective People, Stephen R. Covey
It is increasingly common to regard people with different views as “not listening to reason”. When such an encounter happens, it is increasingly acceptable to dismiss the other side and not bother discussing at all. For example, people who voted for Brexit or Trump may simply be written off as “not knowing what they did”, without any attempt to understand their position. True, the fact that the future US president does not believe in global warming and the future US vice-president believes in creationism makes it off-putting to engage in intelligent conversation, but that is no excuse for not even trying.
I think we are failing to see the greater problem hiding behind these small disagreements:
It has become increasingly difficult to independently verify findings.
Let me give you an example. Assume you wanted to convince me that the Earth is round. How would you do that if I did not trust you? You could show me pictures from outer space, but then I could argue that you faked them. You could call me on Skype from the US to tell me that it is daylight there while it is night in Sweden, but I would need to trust that you did not put a poster on your window. We could perform the Bedford Level experiment, but even that is tricky to get right.
It is also tempting to just claim that people “did not do their research”. After all, it should be easy to find a report that exhaustively lists the arguments showing that the Earth is round. Unfortunately, truth fabrication has become such a thriving enterprise that reports can nowadays be found for any viewpoint. Climate change is fake? There are documentaries on that! Climate change is real? There is a report on that! And the list goes on.
“Well, you simply need to check if the authors are credible.” Now you have two choices: (a) you trust the authors – you’re done; (b) you check the references and data used in the report. The latter path is recursive, terminating either when (i) you find some authors that you trust, or (ii) you reach some axioms that you agree are true and/or make the measurements yourself. Depending on your level of distrust – in other words, how many levels of recursion you go down – you might end up spending a lot of time. Essentially, you might end up redoing all the science that generations of researchers have done, which is impractical for mortals.
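The recursion above can be sketched in a few lines of code. Everything here (the report structure, the sets of trusted authors and axioms, the depth cutoff) is a hypothetical illustration of the argument, not a real fact-checking tool.

```python
# Hypothetical sketch: a "report" has authors, a claim, and references.
# We accept it if we trust its authors, or if every reference bottoms
# out in a claim we accept as an axiom - within a finite recursion depth,
# because mortals give up eventually.

def verifiable(report, trusted_authors, axioms, depth=0, max_depth=5):
    """Return True if the report can be traced back to our trust base."""
    if depth > max_depth:
        return False  # too many levels of distrust: verification is impractical
    if any(a in trusted_authors for a in report["authors"]):
        return True  # choice (a) / termination (i): we trust the authors
    refs = report.get("references", [])
    if not refs:
        # a leaf claim: accept it only if it is one of our axioms (ii)
        return report["claim"] in axioms
    # choice (b): recursively check every reference
    return all(verifiable(r, trusted_authors, axioms, depth + 1, max_depth)
               for r in refs)

# Example: a chain of reports ending in one trusted author and one axiom.
leaf = {"authors": [], "claim": "light travels in straight lines",
        "references": []}
mid = {"authors": ["someone unknown"], "claim": "ships vanish hull-first",
       "references": [leaf]}
top = {"authors": ["a physicist I trust"], "claim": "the Earth is round",
       "references": [mid]}

print(verifiable(top, {"a physicist I trust"},
                 {"light travels in straight lines"}))  # True
```

Note that two people running the exact same procedure with different `trusted_authors` and `axioms` sets can reach opposite conclusions about the same report, which is precisely the point about diverging “trust bases” below.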
What I am trying to argue is that you need a “trust base”: a set of axioms and authors that you trust, which makes it practical for you to verify facts within your lifetime. But what if my “trust base” is completely different from yours, not because of ignorance, but because we both independently tried to build a system for quickly verifying facts? Worse still, both our “trust bases” might have been reinforced by past experience. Your “trust base” allowed you to understand your jet lag, whereas mine allowed me to get through difficult situations in life. We end up speaking the same math but assuming completely different axioms. It is like Euclid trying to teach geometry to a non-Euclidean geometer.
I hope I have managed to convince you that agreeing with a viewpoint is not only a matter of reason, but also a matter of trust. Sadly, so many topics have suffered trust problems in the past (GMOs, vaccines, global warming, food), whether for profit or for fame, that it no longer surprises me when people simply shut down and prefer to pick facts (or “axioms”) based on feelings or intuition.
But let us not stop at complaining; let us get actionable about it. Here are some items I would start with:
“Seek First to Understand, Then to Be Understood”: Did you understand the position of the other side, their “trust base”, their motives and their “axioms”? I might not be racist; I might simply never have met people of other races. We tend to fear what we do not know. And yes, we should probably all read the Bible and the Quran: they are the “axioms” of many people.
Frame the logic from their point of view: Instead of “the Bible is nonsense”, a better start could be “I read the Bible and I did not find anything arguing against X. My interpretation of the tale of Y is that one should not do Z.”
Maintain a neutral point of view: Wikipedia articles excel at this. Instead of “coffee is bad”, you could say “some research shows that coffee adversely affects people’s health”. Alternatively, you may explicitly signal that you speak from your own point of view: “I think that coffee is bad”. (Can anybody contradict you and say “no, you do not think coffee is bad”?)
Do not go to extremes: It is common to take one extreme to compensate for another. If the other side is against treating animals as a commodity, you may be tempted to take the position “people should be free to do whatever they want”. You could instead take a middle ground: “Incentivising people to consume less meat is good, but I would not forbid eating it.”
Avoid arrogance: I find it very off-putting when people start with “I worked with X for 20 years”, generally used as a synonym for “I surely know better”. Remember, you not only need to show you are knowledgeable, but also to create trust.
Learn instead of convincing: In the end, one can have a civilised discussion without agreeing. The goal of a discussion does not have to be to convince the other side, but rather to learn more about their position and about yourself. What are your own biases? What are the ideas you stand for?