You have heard it said that bias is bad. Media bias, “fake news,” and disinformation pollute the modern political landscape in word and connotation. Surely no sensible person would contend that bias is invariably good (I would certainly hope not). And yet, would it not be an oversimplification to assume that bias is invariably bad? That bias is unequivocally, irrefutably wrong?
Well, to make such an accusation would be to make an accusation against human nature. To suggest that we ought to leave no room for bias is to suggest that we ought to leave no room for belief, which would certainly throw something of a wet blanket over our attempts to find purpose and direction in life. We would be chaotic, confused, and necessarily agnostic on everything.
Now, bear with me for a moment as I talk about the most fundamental bias of all: consciousness. Consciousness is a very confusing thing, and one that is hardly understood. It’s defined in the Oxford English Dictionary as: “the fact of awareness by the mind of itself and the world.” But there exist all sorts of definitions for consciousness—both within the Oxford English Dictionary and elsewhere. It may well be understood as mere experience (by which I mean experience down to its most basic elements), and not necessarily human experience alone.
In her book Conscious: A Brief Guide to the Fundamental Mystery of the Mind (2019), Annaka Harris writes: “Consciousness is experience itself, and it is therefore easy to miss the profound question staring us in the face in each moment: Why would any collection of matter in the universe be conscious?” Her answer is that maybe at the very root of reality, intrinsic to matter and all things, is consciousness itself.
An interesting thought, though not entirely relevant to our discussion of bias. It’s not often we think about the conscious experience of our leftovers cooking in the microwave. The reason I bring this up, however, is to make clear the complexities of human consciousness. At its most basic level, you can imagine consciousness as pure experience. But what would that mean? Our level of experience is certainly not basic, impacted as it is by the unbelievable complexities of the human brain. Our brains store memories, signal pain, create emotions, and make predictions—all while working in coordination with cognitive and intellectual processes to navigate reality.
In imagining pure experience, you might take away all the previously mentioned mechanisms of the brain to do so. What would it be like to experience reality without memory, by which I mean not just faulty memory (e.g., Alzheimer’s), but literally no memory at all? What would it be like, moreover, to experience reality without any senses or emotions to regulate it? We can hardly imagine something like that, biased as we are in our human minds. Our cognitive processes are designed (among other things) for survival, which is only possible through feelings, judgements, and reactions that guide our decisions.
As human beings, we are the most biased of all living creatures. Our decision-making processes are fine-tuned to serve our purposes and, if we choose, help us put together the pieces of life’s great puzzle, the mystery of experience: Why are we here? Why did we wake up, and what did we wake from?
Like I said, consciousness is complex and difficult to understand. But even if everything is conscious, which is another way of saying that consciousness is fundamental, human consciousness sets us apart from everything else. Our advanced biases provide us with the necessary tools to navigate this strange world and find purpose within it, no matter what is or isn’t conscious. We need bias to make decisions, to pick one thing over another, to choose.
Doubtless, refusing to pick a path at a crossroads leaves you nowhere, and the path to truth requires making decisions—including some wrong ones—along the way.
And so I conclude from this that bias cannot be all that bad. Our ability to make decisions, judgements, and evaluations denotes a necessary kind of bias—one absolutely worthy of our appreciation. Nevertheless, it is clear that bias can quickly become a problem when left unchecked. Bias is a tool, not a master.
Discrimination, for example, is useful … in some circumstances. Every time that I choose not to eat candy I discriminate against it (though candy doesn’t seem to change its mind about me). Additionally, by adopting a new philosophy, worldview, or way of thinking, I discriminate against others that contradict it. I, for one, discriminate against atheism because I do believe in God. However, I think it is always a healthy exercise to question my own beliefs when confronted with different ones.
And yes, that means actually considering an atheistic argument and trying to get behind it, if I can, as a Christian. Intellectual honesty is important, though our reasons for believing things are more than intellectual (more on that in my next essay, titled “Unreasonable”).
Discrimination, as a product of bias, manifests at two critical and interconnected levels of humanity: the individual level (within ourselves) and the collective level (within society). Often, internal discrimination leads to external discrimination as our individual biases match and reinforce shared collective biases. This can go either way, however, meaning that proximity to collective biases can inform our individual biases as well. Choosing to conform to the biases of a “collective” can lead to groupthink (a way of thinking within a group that restricts thinking outside of it).
The danger of discrimination, as well as bias, results from forfeiting good, beneficial variables for maladaptive ones. For instance, most people would agree on an ethical basis that each person has inherent worth and inalienable rights. Unfortunately, history has shown no shortage of unethical discrimination that has completely disregarded human rights in favor of something else (be it money, resources, labor, etc.).
Further, while discrimination allows us to distinguish between things and make decisions, it’s important to recognize our fallibility—or tendency to be wrong about things—in decision-making. We are not always right and don’t always make the right distinctions. (It’s also important to realize that distinguishing variables does not always mean that one must be held in higher regard than another, which can cause serious problems [think races, sexes, nationalities]).
Therefore, we must embrace the opportunity for revision if we ever wish to correct our wrongs. Reconsideration must be an option we remain open to, willing as we are to question our beliefs and our choices. If we refuse to question ourselves, we risk reinforcing incorrect or harmful beliefs without even realizing it.
To put it more simply, we must be aware of our wrongs in order to fix them—for our own sake and for the sake of those our wrongs affect.
Assuming an innate (or built-in) correctness about things is a critical error: for one, we can never have total certainty; for two, it is a kind of intellectual Russian roulette. In the case of the latter, you had better hope that you aren’t completely mistaken, because that would mean you’re playing with a fully loaded cylinder.
All analogies and hypotheticals fall short in the end, but let’s try one. Say that you have ten options in life (pretending that these are the only options in the world for you to choose from). In this game, you are told that you can’t know any of them with absolute certainty, but you can use your critical thinking to eliminate those that are most reasonably false. For the sake of this hypothetical, let’s also pretend that only one contains the absolute truth. Also, if you pick the wrong one, you die.
So you set out to find the truth. You find three that are obviously wrong, and everyone else agrees. Then you come across another two that you think can’t be right. Again, most people agree with you, but this time not everyone. Still, you feel confident in moving forward. That leaves five.
At this point, it gets trickier. Two of the remaining options don’t seem right to you, though you can’t be sure. Over half of the other people continue to side with you, which is reassuring, but it’s not as large a majority as before. You decide to continue anyway.
Then, on the last three, it’s totally split. Whichever of the three you choose seems entirely arbitrary. Arguments develop for each option, and there is something reasonable in every one. Two of them seem slightly more plausible to you, however, so you go with them.
It’s fifty-fifty now, and after spending a lot of time thinking about it to no avail, you select the first one off the draw. You sigh, “Whew.” Better than nothing.
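The narrowing above is simple arithmetic at heart. Here is a small Python sketch of it; the round labels and consensus notes are my own stand-ins for the scenario, not anything rigorous:

```python
# A toy model of the elimination game: start with ten options and strike out
# those judged "reasonably false" round by round, noting how consensus thins
# as the field narrows.
rounds = [
    ("obviously wrong",         3, "everyone agrees"),
    ("can't be right",          2, "most people agree"),
    ("don't seem right",        2, "just over half agree"),
    ("slightly less plausible", 1, "opinion is totally split"),
    ("coin-flip runner-up",     1, "no consensus at all"),
]

remaining = 10
for label, eliminated, consensus in rounds:
    remaining -= eliminated
    print(f"Struck {eliminated} ({label}; {consensus}) -> {remaining} left")

# One option remains: your pick, held provisionally, not as absolute truth.
assert remaining == 1
```

Notice the pattern the sketch makes visible: each round of elimination costs more confidence than the last, so the final pick rests on the thinnest consensus of all.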
You have your selection, and you’re right to stick with it. Except now you’re told that you have a lifetime to determine whether or not you were right. If you were wrong, you can always change your mind along the way. In addition, more evidence will come out that makes certain options more or less plausible: in some cases it may even completely discredit some of the most plausible options, or completely validate some of the least plausible ones.
What do you do now? You have your initial answer. Good! Keep it and value it; but all the same, would it not be a mistake to consider it the absolute truth without question? I think it would be, especially given the possibility of new evidence coming out against it. Let’s say, also, that you discover a susceptibility to certain faults in your own reasoning due to built-in biases, as well as faults true of all human reasoning.
Would you proceed to turn a blind eye to all new evidence? All opposing arguments? Would you consider it in your best interest to surround yourself only with those who agree with your point of view? People that only consider evidence supporting your initial answer?
This is the big deal with echo chambers and groupthink. They do just that. They tell you what you want to hear, cherry-pick information, and blind you to opposing perspectives. And, according to the rules of our hypothetical, what happens if you prevent yourself from looking outside your own truth? Barring the off-chance that you had it right in the first place, you die—all because you restricted the scope of your understanding.
How confident are you in your bias to lead you to the right conclusion without further scrutiny?
Like I said earlier, however, more goes into our beliefs than reason alone. Reason and logic, even if they are useful tools, are faulty. But if not reason, what else? A different kind of knowledge, one that some would say “comes from the heart.” The concept I’m hinting at is faith.
Everyone has faith in something: science, philosophy, fate, power, the cosmos, God … even nothing. But nothing too must be something because, strangely enough, we wouldn’t have a word for it otherwise. Nihilism—the belief that life is meaningless—has its own form of faith, that is, in nothing.
Hence, what leads us to believe in something happens somewhere along the lines of faith and reason; presupposition and calculation; bias and fact.
What we would refer to as “subconscious” processes often guide us just as much as our conscious ones, though the difference between the two remains something of a mystery. If we think of biases as trajectories or underlying paths of direction, then we operate within those boundaries—that is, until our awareness of them makes it possible for us to explore uncharted territory outside of them. At that point, we become masters of our own fate, awakened from a slumber of ignorance, capable of dead reckoning our paths into a future of hope rather than desperation.
But we are desperate creatures and have been for a long time. We look for certainty, clarity, security: anything that we can hold onto in this seismic world we’re floating in. Things change, so we grasp for something to hold onto, whether it be money, knowledge, youth, or anything else that provides a sense of stability.
Mastering fate doesn’t mean controlling it, however, deceptive as that may seem. To master fate is to give it up. To die to it. And in its place, to posit something more meaningful, more profound, and more mysterious: the relentless pursuit of truth even when truth is not easily forthcoming. We trust that truth will reveal itself based on past revelations, and we must be courageous enough to attempt this eternally significant pursuit even contained as we are in mortal bodies.
Henry David Thoreau, the 19th-century American writer and philosopher, wrote: “Read not the Times. Read the Eternities.” It goes without saying that we should also live for the Eternities, not the Times.
As impossible as it is to be unbiased in a biased mind—a biased consciousness in a biased world—it’s best to live as though truth is attainable, and not as though you already have it. Furthermore, don’t trust one source of information. Don’t trust two. Don’t even trust all of them put together. Only trust a higher truth, one that their synthesis aims to find, a truth that is independent of one mind or community, one ideology or political party, one field or research finding. A truth that is independent of what reason and faith have to offer in isolation.
Whatever consciousness is, and however we as humans came to have it, reality is surely a great mystery that we are seeing through restricted lenses, lenses that must be regularly checked, cleaned, and adjusted to compensate for habitual dust and smudges, much like real glasses.
In his Pensées, Blaise Pascal states, “There are only two kinds of men: the righteous who think they are sinners and the sinners who think they are righteous.” The same applies to our thinking as well as our actions.
There are righteous thinkers who realize their shortcomings and mistaken thinkers who think they are righteous. The truth is that nobody has all the answers, so it’s a healthy discipline, a necessary one, to question our own beliefs. Only then—maybe one day—can we stand with all of consciousness in harmony, undaunted, unafraid, uninhibited … and unbiased.