Chapter 13 – The Will to Believe – and to Disbelieve

It is very easy to imagine that you are somewhere dramatically different from where you are at the moment. But it is highly unlikely that you could make yourself believe that you are actually there now merely by an act of will. And yet the following discussion is an attempt to convince you that most people continually will themselves into believing strange and unlikely things. This applies to everyday affairs and to religious beliefs.

This may seem absurd, but what you think is absurd depends on what you already believe. Some atheists think it is absurd to believe in the existence of God, and some believers in God think it is absurd to believe that there is no god. Some ideas are held to be self-evident throughout entire cultures, yet are widely dismissed in others. What individual people believe or disbelieve is not a matter of how intelligent or naive they are: you cannot tell what things a person believes just from knowing their IQ or apparent gullibility. The will to believe has sneaky ways of inducing people to believe things that others would never accept.

Before this is discussed, the will to believe should be put into perspective among the other ways in which people adopt beliefs.

(In Chapter 5, Free Will, Determinism and Morality, I discussed whether there is such a thing as free will. The arguments made in this chapter apply irrespective of whether the will to believe is free or completely determined.)

 

Gate-Keeping

Think about what happens when we come across some new piece of information. Do we accept it, wonder about it, or discount it? There seems to be some gate-keeping process, i.e., a process by which we let in or keep out prospective new beliefs. One way is that any new idea we accept has to be supported, directly or indirectly, by one or more of our existing beliefs. Another way is that we accept any new idea unless it is incompatible with existing beliefs. (Here I regard beliefs as both the things we are confident we know and the things we presently accept as true but would readily change.)

These processes could be likened to seeking membership of a club. In the first, no new member is admitted unless nominated by an existing member. In the second, anyone can join unless blackballed by an existing member. I think we try to use one or other of these whenever new information or a new idea comes up. However, it may take some time to decide whether to “nominate” or “blackball”, and some applicants may intuitively be judged at once to be highly desirable or highly undesirable.

 

Sources of Beliefs

This idea of gate-keeping presupposes the prior existence of some beliefs. So where do these come from? I suggest they come from four sources:

  • personal evidence;
  • indoctrination;
  • innate drives;
  • and the will to believe.

These usually act in combination with each other.

It might be tempting to add intuition, reason and imagination to these four. But I think intuition and reason are part of the gate-keeping process. I suggest that imagination (and many other sources) might provide ideas, but these become beliefs only by satisfying the acceptance criteria.

Personal Evidence is one’s interpretation of any type of experience, ranging from the observation of everyday events, to contemplation, to mystic experiences, to dreams, hallucinations and visions. The interpretation of an experience will depend on past experiences, on existing beliefs and on innate personality characteristics.

Indoctrination might be regarded as part of personal evidence or experience, but I think it has some special characteristics. It is largely indirect evidence. It occurs continually from birth, as we are socialised into our culture and instructed in the teachings of our culture, religion, profession or areas of interest. This is by no means a purely passive process; it includes all the answers to the interminable questions asked by children – and by adults. Indoctrination is the main process in the persistence of belief systems, as in religions and cultures, including science and other areas of learning.

Among the many beliefs that are acquired through indoctrination are those about what is enjoyable and unenjoyable. Examples are what is thought to be funny, aesthetically pleasing or in good taste. In each of these there are marked differences between cultures. Within cultures, continual changes occur in accordance with fashion. Citizens of other countries, even those who share our language, tend to have a poorer sense of humour than we do. And with music, what is favourite or traditional in one culture is often positively unpleasant to another.

Innate Drives are motivational urges that come with being human. For example, we all have some innate sense of fairness, of attachment to certain other people, of revenge, and so on, as well as of what is physiologically pleasant and unpleasant, as in taste and smell. Experience and indoctrination shape these into beliefs, such as beliefs about what is proper behaviour.

Right from birth we are all trying to understand what is going on around us, and, either unconsciously or consciously, we feel an urge to develop explanations of what we experience. When there is no obvious explanation it is comforting to invent one. We also have an innate tendency to become emotionally attached to ideas we develop or discover ourselves. This makes personal religious and moral beliefs persist, and become embroidered and altered, hence the proliferation of different sects within all religions.

Another kind of innately motivated belief comprises those that depend on personality traits, for example, a love of the bizarre or the straightforward, the mysterious or the explainable, or a predisposition towards either reason or emotion.

 

Reconstructions

One component of belief is memory of what we have experienced. But memories are not like photographs or videos. Photos and memories may both fade, but photos do not spontaneously rearrange their details or add new material, whereas there is ample evidence that all of our memories are reconstructed, not necessarily faithfully, every time we access them.

A reconstruction of a memory may be made when the details are hard to make sense of, are too vague, or are inconsistent with each other and/or with known facts. If it is important for the particular memory to be complete and consistent, the mind will try to produce the “correct” or consistent memory by working out what “must have happened”. Once a satisfactory story is produced, it is believed to be the valid memory, and often a fictitious scene of it is visualised. This is not exactly a case of willing to believe, but once the memory is reconstructed the new version is believed to be true.

 

Reasoning, and its absence

As discussed earlier, any new ideas that come up through the urges of our innate nature, from experience, from indoctrination or from the imagination usually have to go through the gate-keeping process before they are accepted. This is largely an intuitive process, and might not be very thorough. Often we are confronted with complex, unusual or controversial ideas that are not easily accepted or rejected. What are we to believe about strange scientific concepts, such as those in Einstein’s theories of relativity, or quantum theory? What about tales of strange goings-on in other parts of the world, or in sub-cultures of our own society? What should we believe about the morality of various biological and genetic technologies? In each case we may try to reason out what we believe about the new issue, and may reconsider any existing beliefs that are contrary to it.

Sometimes we may just let the matter hang, without coming to a belief. Or we may act on gut feeling, without checking or realising whether it is consistent with or contrary to any of our other beliefs. Or we make an emotional decision without thinking the matter through. But later, the belief may need support when the matter is discussed with other people. And then, if we really want to keep that belief, and there is no simple rational support for it, a rationalisation will be necessary. We have then exercised the will to believe.

When we contemplate an issue, looking at evidence and comparing it with other evidence, it might seem after a bit of thinking that something commonly believed to be true may well be untrue, even though we might previously have accepted the common wisdom. If we find no fault with our evidence and reasoning after going through it again, we have acquired a new belief. There might have been something wrong with the evidence used, or with the reasoning, but we will now defend the new belief.

 

Rationalisations

In effect, reasoning helps the gatekeeper. A related process, rationalisation, attempts to mislead the gatekeeper. Rationalisation starts from the assumption that a belief is true, and then works backwards to derive an explanation of how it could be true. Once a plausible explanation is obtained, the belief appears to be proven – no matter how bizarre it may seem to other people. This process should be distinguished from using critical, logical reasoning to test whether a possible new belief is actually true.

A contrary form of rationalisation is one that contrives an argument to oppose belief in what, on the surface, may seem plainly evident. We can rationalise to disbelieve as well as to believe. Again, this is not the same as disinterestedly and rationally testing a belief, although it may have some aspects in common with it. This is the distinction between sceptics and deniers: sceptics are trying to test whether there are grounds for accepting or rejecting something, while deniers are trying to find a rationale to justify rejection. However, contrary to stereotypes, some scientists, who should always be sceptical, can sometimes be deniers, and some believers in religions can have truly sceptical attitudes to some of the teachings of their religion.

Sometimes rules of conduct are, or seem to be, ambiguous. This enables people to invent rationalisations to justify interpretations they prefer. Sometimes these interpretations are a long way from the intent and/or the more obvious connotation of the particular rule. The accepted practices of many religions contain examples, some being unusually strict and punitive, and some providing ways to avoid the rule.

 

Rationalisations are not necessarily unsound. They can turn out to be logically valid, and therefore be genuine justifications of the belief or disbelief: the fact that we ardently want something to be true does not mean that it must be untrue. But many scientifically tested findings, such as the effectiveness of vaccination against disease and the global warming effect of human-induced emissions of greenhouse gases, are denied by people who have relied on evidence and reasoning that leads them to contrary conclusions. People on both sides of such arguments dismiss their opponents’ evidence and reasoning. Someone who is genuinely seeking the truth must consider all available evidence. Someone who is rationalising will look for ways to dismiss or discount their opponents’ evidence without applying the same scrutiny to their own.

The more intelligent people are, the better able they are to concoct a plausible rationalisation to justify something they want to believe. So it is not valid to assume that everyone who believes something patently false must have low intelligence.

Rationalisations can lead people to adopt and maintain a belief in atheism, agnosticism or supernaturalism. Some chapters of this book have examples.

 

Typical cases of rationalisations

Typical feelings that prepare people to will themselves to believe something are that:

  • there must be some continuation of life after death – either for this dear person, or for me;
  • this person really does love me (despite signs to the contrary);
  • I have a right to have or do some particular thing – like own a firearm or thrash my children;
  • there is an ulterior motive that makes these people say that;
  • this particular car salesman or politician – whom I need to trust – is honest.

Others are statements with beginnings such as ‘There’s sure to be…’.

And again, believing something just because you want to does not make the belief false: the car salesman may be scrupulously honest.

Wanting to believe something is usually not enough if it is self-evidently untrue or impossible. Nonetheless, after being involved in a tragedy, people often go into denial – they can’t accept the truth of what has happened. While it is highly unlikely that anyone would believe that the car they have just crashed will somehow spring back to its original condition, no matter how much they would like that to happen, there is still plenty of room for ready self-deception.

The motivation behind a particular act of willing a belief may be one or more of the following:

  • support for idealistic or romantic views of life;
  • support or rationalisation for other strongly-held beliefs;
  • justification of selfish acts;
  • justification of dubious acts in support of particular causes (the end justifying the means);
  • emotional inability to bear the truth.

One set of examples, cases of selfish acts – or, if you are charitable, human weaknesses – is known as the seven deadly sins.

 

The Seven Deadly Sins

These are – or, perhaps, were – pride, envy, lust, greed, gluttony, anger and sloth. They weren’t so terrible in themselves, but were said to lead to more serious sins, such as theft, murder, or even worse.

Some people nowadays apply the appellation of “deadly” sins to cases where indulgence in the sin is likely to lead people into making unwise decisions. Such decisions arise from believing something that would look crazy to anyone who was not tempted by the particular sin.

Taking greed as an example, a common case is where there are offers or opportunities that should seem too good to be true. In times of financial optimism it is tempting to put savings into the scheme offering the highest rate of return. Caution about high returns being linked to high risks is overcome by wishful thinking, followed by rationalisations about the other investors and the prosperous appearance of a prospectus or business headquarters. Business managers continually allow themselves to believe that it is safe to commit their companies to ventures that in less buoyant times would clearly be regarded as risky, or even suicidal. Pride, another deadly sin, can push people to will themselves into believing in their unfailing cleverness in avoiding pitfalls.

And what do you believe about tax avoidance? It depends on who is doing the avoiding, you or someone else. Your belief here may depend on greed, or on pride or on envy.

What do you believe about taking a “sickie” to have a day off work? Does that attitude depend on your position in the organisation you work for?

 

Personality

Personality determines which of the seven deadly sins you are prone to indulge in. Personality, coupled with experience, affects what people want to believe in. Those who are attracted by romantic ideas will willingly embrace different things from those who prefer hard rationality. The New-Ager and the Sceptic typically have different personalities. For example, despite the many fascinating revelations of science, its rigorous and intricate explanations have little appeal, and hence little credibility, for many people. Of course, the condescending or authoritarian attitudes of some practitioners of “orthodox” science or technology can incline someone to believe in some “alternative” system, almost as a pay-back reaction. But personality has more aspects than just the range between rationality and romanticism. Everything that goes into personality, including all the deadly sins, affects what we will ourselves into believing.

The same applies to disbelieving. Faith in their own infallibility can make politicians refuse to believe they are responsible for the patently obvious consequences of their bad decisions. Many people like to feel that they know something special that is different from what the deluded majority has been tricked into believing. They often form groups who share a particular disbelief, such as those who “know” that vaccination or fluoridation of water is harmful and has no beneficial effect. Such an attitude also feeds conspiracy theories.

Emotion

If an idea or theory is pleasing, it requires only the scantest evidence for its ready acceptance. Few ideas are more pleasing than those we invented ourselves. If you invent a new religion, or a new cure for one of the current plagues affecting humanity, it is easier to convince yourself that it is true than if someone else invented it. Hence the many cranks, who on other matters may be no less sane than anyone else.

Presenting concrete evidence that is contrary to a pleasing belief will not necessarily lead to discrediting the belief. It may lead to hostility to the messenger – particularly if the messenger is smugly irritating. A theory that satisfies feelings such as wonder or poetic justice, or is flattering, or puts down people or groups who might be seen as intimidating or unpleasant, is pleasing. A theory that is easy to grasp and explain to others is pleasing. People willingly believe simple and easy solutions to perceived problems – which partially explains mob action. A complicated explanation often generates hostility, even though the matter itself is very complex.

We are more ready to believe good things about people we like and bad things about people we don’t like. ‘There must have been a good reason,’ explains the unaccountable “bad” act of a “nice” person, who continues to be nice. And ‘there must have been an ulterior motive,’ accounts for the good deeds of someone “nasty”.

Anger, hate and other strong emotions can make people passionately believe what is obviously or officially wrong, and refuse to believe what is obviously right. So hate and resentment make some would-be suicide bombers believe the Koran says their act will give them immediate access to Heaven even if they kill non-combatants.

Similarly, negative beliefs can also result from acts of will – a will to disbelieve. It is hard to convince someone of something that they don’t want to believe. If you hope that something is not true you are more likely to believe that it is not. If you hope that something is true you are more likely to believe it is. The more you dislike the person telling you something, the less likely you are to believe them. Also, the more you like the person telling you, the more you are likely to believe them. (Love may be a different matter.)

How do you choose which radio/TV/newspaper commentators to ignore and which to pay attention to? How many people give equal credence to the claims of both sides in the Israeli/Palestinian, the US/Iraq/Afghanistan and similar issues?

How many readers of this book have become even more entrenched in their pre-existing opinions because they have accepted only those of my arguments that support them?

 

Preserving Existing Beliefs

We usually have an emotional attachment to our beliefs. They are part of our self-identity, and so we try to preserve them. So how do you react when you learn of scientific or other evidence for or against such things as the historical-type stories of the Old Testament, or evolution, or the appalling behaviour of your forebears a generation or so ago, or the abduction of people by aliens who have arrived in UFOs?

The scientific world is renowned for rejecting new findings. This goes far beyond the accepted need for independent corroboration. Hostility to new ideas or evidence can be so strong that, far from conducting independent investigations, the establishment refuses to consider any neutral discussion whatever. Well-known cases are the ideas that continents are continually moving around the surface of the earth and that the bacterium Helicobacter pylori is the principal cause of stomach ulcers. Scientific journals continually provide new cases. However, if further independent evidence supports a new theory or finding, almost all scientists within the relevant discipline will accept it. (Both of the cases quoted took a long time to be accepted into the canon of science.)

But many people have such an emotional attachment to some of their beliefs that they will reject all opposing evidence and deny the credibility of anyone who presents it. The particular belief has become too important a part of their self-image, and to lose it would be a loss to their personality. Examples of such types of belief where the opposing evidence is overwhelming are: conspiracy theories, the danger of drinking fluoridated water, the dangers of vaccination, and some religious teachings. People who cling to such beliefs are often very skilled at inventing rationalisations that discount all evidence that is presented to them. And their rationalisations further entrench the belief.

 

“Useful” Beliefs

Most people will believe new or unusual things if it serves their purpose. There is therefore a continuing process of adopting some kinds of beliefs without testing how sound they are. But having settled on a belief, it is usually possible to develop a rationale to support it. If a few other people can be found who take the rationale seriously, the belief can appear to have some claim to truth, even though it might otherwise seem wildly imaginative. The beliefs of some neo-Nazis and followers of obscure cult religions are examples. Thus, if the rest of your group profess to believe that the presence of a few “funny looking foreigners” is undermining the morals of your society or stopping you from getting a job, it is convenient to believe it. Statements of such beliefs may also purport to provide a legitimate reason for attitudes or actions that actually have other motivations. There will, of course, always be followers who naively but genuinely accept the arguments put forward in support of such beliefs and join the particular movement; they may well gain benefits, such as fellowship and a sense of identity, that were not available to them from other areas of society.

Adopting a new religion provides another illustration. A religion might be adopted because of one particular aspect of its teaching, for example as a result of a search for answers to questions about the true meaning of life, or the after-life, or wholesome ways of living and relating to other people, or personal self-fulfilment. Or a religion may be adopted for convenience, such as to get a job or to marry a believer. And then the intricacies of the doctrine are discovered and have to be swallowed. And very often they are swallowed passionately. There have been cases in history where whole countries embraced, and presumably believed in, a religion because the ruler had been persuaded, such as Catholicism in Austria and Islam in the Malay Peninsula. Certain practices or rules of life may then have to be accepted, irrespective of whether the accompanying rationalisation is understood.

 

Believing the Worst

Another phenomenon is the inclination to believe certain categories of things. Sometimes a belief is adopted in order to confirm an anxiety, or to avoid seeming unduly optimistic, or to reduce the disappointment if something unpleasant actually happens. Someone you are waiting for who is very late must have had an accident, or must be late deliberately to annoy you.

Situations that cause anxiety, such as having to deal with something unfamiliar, bring out this tendency to believe the worst. Or we don’t want to believe that people with unusual appearance or behaviour, or with certain types of disability, are to be treated like ourselves. So people of a different generation or culture, or with different styles of dress, are automatically believed to be immoral, lazy, violent or mentally deficient. After people have passed middle age, their life-long eccentricities are attributed to senility or its approach. People in wheelchairs are often spoken to as if they are mentally incompetent, or deaf, or both. This is usually done without malice or intention to belittle. There is usually no reason or evidence to support such behaviour: it is just that there is a common tendency to believe (or assume) the worst in such situations.

 

Conclusion

The following motivations have been suggested for “willing” ourselves into beliefs:

  • support for idealistic or romantic views of life;
  • justification of selfish acts;
  • justification of dubious acts in support of particular causes (the end justifying the means);
  • support or rationalisation for other strongly-held beliefs;
  • emotional inability to bear the truth.

The will to believe is one of four suggested ways of adopting a belief. The other three ways seem fairly rational, but the will to believe does not. So, do only irrational people succumb to this phenomenon? Do only irrational people espouse religion or believe every urban myth and conspiracy theory that is circulating?

To those who would answer “yes” I would point out that no one is rational all the time. If we think deeply, we can all dredge up something that we once believed or disbelieved purely because we wanted to – not to mention the things we still believe. Some psychologists argue that our minds have a built-in irrational processing system in addition to, and separate from, our rational system. Perhaps it operates instead of a rational system.

But I would also add that people believe many kinds of things for reasons other than the will to believe, and can have rationally considered (but not necessarily logically sound) reasons for believing in science, religion, urban myths, conspiracies and things that everyone else would think were weird.