
Certainty and ignorance


[Image: Statue of John Knox]

Although I write about and am engaged with politics these days, I haven’t always held much in the way of political opinions.  My interest began after one conversation too many in which I was left silent with no opinion to give.  As it seemed to me – naïvely, I now know – the people whose opinions I heard tended to carry their views with a certainty and conviction I rarely had about anything.  Not knowing the details of political circumstances myself, I concluded that these people were better informed than I was, so I generally felt no reason to interject or disagree with them.

Not being able to contribute to these discussions was tedious, however.  At the age of 21, having some time on my hands, I decided I’d read some political theory and try to develop a basic understanding of parties, systems, issues, and so on, so that I’d at least be able to follow the conversations being had and ask a few relevant questions rather than feeling wholly ignorant.  As it turns out, and as the reader might predict, only a modicum of study was required to reveal that the people whose views I had respected as well-informed (and apparently deserving of certainty) were decidedly and emphatically not so.  The degree to which this was the case was startling; I hadn’t anticipated it at all.

In retrospect, it’s clear to me what the source of my folly was: I had assumed that other people probably thought and acted in much the same way as I did.  Maybe in most cases this is true, but in this instance it definitely wasn’t, and it afforded me a valuable life lesson – ever since, I’ve noticed the occasions when I’m confronted with it again.  Whereas my ignorance was the cause of my hesitation, in others it appeared to be the cause of their certainty.  I immediately wondered to what degree I too was guilty of the same.

Everyone feels him- or herself to be rational,1 so I won’t try to argue that I’m a particularly rational person myself, but the idea that a person should increase their conviction in line with their understanding was implicit for me – inasmuch as it formed the basis of my unexamined assumption that it would be the same for everyone.  I had also assumed that people, in general, had a relatively accurate appreciation of their understanding or ignorance in a given area – this, too, turns out to be unequivocally wrong.2

I felt the reason I didn’t engage much in political discussions was that I understood myself to be largely ignorant about the subject – the fact that others had not only opinions, but strong ones held with certainty, indicated to me that they were not as ignorant as I was.  This, of course, doesn’t follow; all that follows is that these people likely believed themselves to be informed, which is not the same thing.  The takeaway for me was the following: even when people think they know what they’re talking about, there’s still a good chance they don’t.

Being right versus feeling right

In all honesty, this outcome is one that I probably should have anticipated (the people to whom I’ve previously expressed these thoughts gave me the impression they found it self-evident).  It wasn’t self-evident to me, perhaps because I’ve been overly trusting, but I had certainly been struck by the peculiar nature of politics.  What seemed (and still seems) curious to me is how politicians can disagree so fundamentally about so many things, even when operating on the same information.  They aren’t shaken at all by the fact that swathes of other politicians hold contradictory beliefs with equal certainty.  It has always seemed alien to me, as I think it should to anyone with a genuine interest in being right – how can you be so sure of yourself when the evidence and circumstances are so clearly disputable?

There are, I think, two answers to this question.  The first is that political beliefs are based as much on emotion as on fact (perhaps more so).  Facts alone aren’t enough to decide what should be done – you must also be informed by your values.3 It’s natural, then, that two people, even when agreeing perfectly on the nature of the world, will not necessarily agree on how the world ought to be.

The second, more interesting answer involves human psychology.  Often, it seems, evidence is a secondary consideration; it’s used to justify views already held rather than to lead to new ones.4 Despite humanity’s claim to be concerned with truth, every day we act contrary to it by preferring the beliefs we already hold over the beliefs we don’t.  We’re incredibly biased towards what we want to be true, and it takes a good deal of discipline to challenge those beliefs.

The fact of the matter is that, no matter what we say, we generally care more about feeling right than actually being right.  This is evident not only in politics but almost everywhere.  I have never, for example, witnessed a debate in which one of the debaters became persuaded by the argument of their opponent.5 If we were rational, we would admit that at least one of the debaters’ convictions must be misplaced, and given that we don’t know which of the two it is, we should expect each to allow, with some modesty, that it might be them.  Of course, this usually isn’t the case – probably the reason these people are debaters in the first place is that they are so sure of the reasonableness of their beliefs that they believe them impervious to reason itself.

It seems that, a lot of the time, people just aren’t open to the possibility that they might be wrong – it takes an overwhelming amount of contrary evidence to overturn a long-held belief, a bar that’s usually too high to meet, especially when the person is intent on looking in the other direction.  This applies to me as well, of course.  I notice quite regularly that in my mind I’m unduly dismissive of arguments that happen to run contrary to my current opinions – but, thank goodness, I’m at least to a small degree cognisant of it, so I can force myself to give the competing arguments a proper hearing.  If I’m genuinely interested in truth, it won’t do to let myself get away with being so biased.  Consequently, I’ve challenged and changed a few of my beliefs over time.6

Correcting for certainty

Given our propensity to skew the facts to fit our view of the world, if we decide that we do care about being right (and not just about feeling right), then we ought to expect to have to employ a certain amount of intellectual rigour in our pursuit of the truth. At any given time, we simply don’t know whether we are correct or not. The reasons why we believe what we believe are often partially if not completely obscured from us,7 and all that is truly clear to us is our certainty in our belief. Our ability to reconstruct the reasons for our beliefs is thus imperative in ensuring they are still valid.

I recently heard a metaphor, originally applied to martial arts,8 which I think applies equally well here. The acquisition of knowledge isn’t like climbing a mountain – it’s more like mowing the lawn. You can’t just keep reading new books and expect everything you’ve already read to remain accurate; there is a certain degree of maintenance involved in ensuring your beliefs’ accuracy. Knowledge, being indistinguishable from the inside from false beliefs perceived to be true, must be periodically examined if inconsistencies are to be resolved. Without doing so, a person is in effect leaving the accuracy of their beliefs to the luck of the draw – to the quality of whatever information they happen to be exposed to. Not good if they’re interested in the truth!

This intellectual rigour requires a willingness to accept even the most unpalatable truths.9 It requires the humility to admit that a significant proportion of our beliefs are likely to be wrong. Who knows how many incorrect beliefs are lurking around in my brain! The thought of it is disconcerting, but it encourages me to want to fish them out and correct them. Potentially any of my beliefs could turn out to be wrong, and the dearer they are to me, the more important it is that they are correct – in other words, the more I value a particular belief, the more I should scrutinise it.

The origins of confirmation bias

The finding that we have such an apparent proclivity to favour present beliefs over prospective ones would seem to hint that there is a reason behind it. I’m not aware of any theories on the subject (though I expect I’m not the first to consider the question), so in the absence of knowing the truth I’d like to put forward a hypothesis. That this bias appears universal suggests a genetic disposition as opposed to an environmental one – and if so, it is an evolved characteristic of humans. This would imply that there is an evolutionary benefit to favouring existing beliefs over new ones – specifically, that the behaviour improves the propagatory potential of the genes that promote it.

My guess would be that the length of time a belief has been held serves as a heuristic for validating its utility. It is the utility of the belief that matters, after all – a belief’s truth value only matters insofar as accurate beliefs are typically more useful than inaccurate ones; in terms of serving the replication of genes, it is always utility that counts. The life of a person up until a given point is, in effect, a stamp of approval for the beliefs that he or she happens to hold: the beliefs have been useful enough – or at least harmless enough – to keep the person from being killed.

The same cannot be said of an arbitrary new belief y that comes along. Assuming total ignorance of which beliefs are true, if prospective belief y conflicts with already-held belief x, all we know of the situation is that x has not yet caused the host to die – it has, to a degree, been vetted. New belief y has not been vetted; given that the genes have no way to know which belief is true, in their blindness they must favour familiar x over unknown y. Such a hypothesis would predict that the longer a person has held a belief, the safer it is statistically likely to be, and so the less willing the host should be to part with it.

It is also much more important to avoid dangerous beliefs than to avoid safe but false ones. Life has to win every time; death only has to win once. A conservative strategy for the adoption of beliefs would mitigate risk to the host at the expense of superstitions that are, from the genes’ perspective, inconsequential or only mildly costly.

This is best illustrated with an example. Consider a proto-human who has gone through life believing wild cats to be dangerous. He does not go near them because he believes they will eat him. He has no evidence that they will, but if he changes his mind he is liable to die, so he would do well to change it only if confronted with something incontrovertible. He benefits significantly from the belief, since wild cats are dangerous, and believing so helps to keep him alive. Consider instead that he had gone through life believing a certain type of tree to be dangerous. In this scenario he gains nothing, but he does not lose much either: he is inconvenienced, and occasionally a tree might stand between him and food or some similar benefit.

In both cases, at any time at which he might be confronted with a competing prospective belief, he will necessarily be alive and therefore will not have been killed by his beliefs. His beliefs, correct or not, have served him in surviving.

Finally, it is not the case that useful and useless beliefs exist in equal numbers. The set of all possible beliefs is much larger than the subset of beliefs that happen to be true or useful. From a purely statistical view, therefore, if a held belief is competing against a random10 new belief, the proven-useful incumbent is likely to be more useful than the unknown contender.

However, there is a complication: harmful beliefs are probably less numerous than harmless ones. Most things that the proto-human could in principle believe to be dangerous are in fact not dangerous, yet believing them dangerous would still incur an opportunity cost11 in terms of lost benefits he would otherwise have had. At some point, then, an equilibrium will be reached between the need to avoid dangerous beliefs and the need to avoid accumulating too many safe but inconvenient ones. Where that balance lies, I can’t predict.
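
To make the statistical shape of this hypothesis concrete, here is a minimal Monte Carlo sketch in Python. It is an illustration only – the 10% proportion of dangerous beliefs, the danger distribution and the 20-period lifespan are assumptions invented for the example, not empirical figures.

    import random

    # Toy model: each belief has a hidden per-period "danger" d - the chance
    # that acting on it gets the host killed in a given period. Per the
    # argument above, dangerous beliefs are assumed to be a minority (10%)
    # of the space of possible beliefs; the rest are harmless.

    def random_belief():
        # 10% of candidate beliefs are dangerous (d drawn from [0.2, 0.8]);
        # the remaining 90% are safe (d = 0). Illustrative assumption.
        return random.uniform(0.2, 0.8) if random.random() < 0.1 else 0.0

    def survives(danger, periods=20):
        # The host is still alive only if the belief never killed it in
        # any of the periods for which it has been held.
        return all(random.random() > danger for _ in range(periods))

    # Beliefs a living host still holds have passed the survival filter;
    # prospective new beliefs have not been vetted at all.
    held = [d for d in (random_belief() for _ in range(100_000)) if survives(d)]
    new = [random_belief() for _ in range(100_000)]

    print(f"mean danger of held (vetted) beliefs: {sum(held) / len(held):.4f}")
    print(f"mean danger of new (unvetted) beliefs: {sum(new) / len(new):.4f}")

Held beliefs come out markedly safer on average than freshly drawn ones, purely because persistence has filtered out the beliefs that killed their hosts – the crude vetting that, on this hypothesis, the genes exploit. Lengthening the holding period strengthens the filter, matching the prediction that older beliefs should be harder to dislodge.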

In summary

Whatever the origins of our unjustified certainty, we should take care to correct for it. There’s a certain humility in accepting how much we might be wrong. Like it or not, we’re hugely ignorant about most things. We don’t know when or where we might be wrong next, and our adamance is usually inappropriate and an impediment to accuracy. I’ve seen it written that Bertrand Russell was once asked whether he would die for his beliefs; his response, or so the story goes, was that he would not, because he might be wrong. This stance seems to me eminently judicious – or, at least, if you are to commit your life to a cause, you had better make certain it’s the right one.

Crucially, the ego is not a good source of knowledge, so we should be careful when listening to it. Humility is wise and arrogance is foolish. If we care about intellectual honesty, we should endeavour to remain broad-minded, judging things on their merit and not on how sweetly our ego speaks of them. It is easier said than done, but, like many worthwhile things, success is reserved for those who persist.

In terms of the martial Ways, nobody likes the beginner who believes themselves more knowledgeable than the teacher, and these people never seem to manage to stick it out. Humility is vital in the pursuit of self-improvement, since a person cannot improve if they cannot honestly appraise where their failings lie. Martial arts are, in effect, the process of replacing wrong with right, so if you cannot put up with being confronted with how continually wrong you are, you will find your Way a source of perpetual tribulation. Humility, then, if not present at the start of one’s journey, should certainly be present mid-way12 – and, more generally, it should be an active concern of anyone who favours truth and wisdom over comfort and fiction.


  1. I’m reminded of the following quote from Bertrand Russell, characteristically entertaining and pithy:

    Man is a rational animal — so at least I have been told.  Throughout a long life, I have looked diligently for evidence in favor of this statement, but so far I have not had the good fortune to come across it, though I have searched in many countries spread over three continents.

    An Outline of Intellectual Rubbish, Unpopular Essays

  2. See, for example, the Dunning-Kruger effect.

  3. In other words, you can’t get an ‘ought’ from an ‘is’.

  4. This is known as confirmation bias.  It apparently extends even to people’s ability to do maths, where numeracy skills are affected by whether the solution to a numerical task lends itself to supporting a conclusion contrary to the calculator’s politics. 

  5. I saw Sam Harris make this point just yesterday:

    In recent years, I have spent so much time debating scientists, philosophers, and other scholars that I’ve begun to doubt whether any smart person retains the ability to change his mind. This is one of the great scandals of intellectual life: The virtues of rational discourse are everywhere espoused, and yet witnessing someone relinquish a cherished opinion in real time is about as common as seeing a supernova explode overhead. The perpetual stalemate one encounters in public debates is annoying because it is so clearly the product of motivated reasoning, self-deception, and other failures of rationality—and yet we’ve grown to expect it on every topic, no matter how intelligent and well-intentioned the participants.

  6. For example, de-converting from Christianity and becoming vegetarian.

  7. It’s comfortable to believe that we know the reasons why we believe what we believe, but this is regularly found to be false. Psychology has uncovered innumerable biases and environmental factors influencing our behaviour – factors which seemingly apply to all of us and of which we are usually entirely unaware. 

  8. The original metaphor was that learning a martial art isn’t like climbing a mountain, it’s more like mowing the lawn because it is, in essence, a process of refinement rather than linear succession. I like it because it also speaks to the mundane nature of the hard work in learning a martial art in contrast to the perceived excitement of climbing mountains. 

  9. I think a good field for encouraging this is economics, given the abundance of disagreeable findings it seems to throw up. For example, the legalisation of abortion following Roe v. Wade in 1973 has been argued to have caused a significant reduction in violent crime in the US in the ’90s and 2000s. Naturally, the finding was not well received by the general public.

  10. In practice, competing beliefs will probably not be completely random – the degree that they are random is completely unknown to me, but there will probably be some reason (good or bad) for the prospective belief to be encountered and considered. 

  11. Opportunity cost is a term used in economics to describe the resources (e.g., time, money and opportunities) that are forfeited by making a particular choice. For example, if I choose to go on holiday to Tenerife, for that time I would not be able to attend my aikido classes back home. 

  12. I say this, yet there are still many egos in the martial arts. I should qualify my statement to say that, in my view, if they have developed their physical technique but failed to develop their character, they are not in any meaningful sense ‘mid-way’. 


