Hi there. It’s been a while since I’ve done a philosophical post, but all the more reason to jump back in! After my debate with Manny, and subsequent reading of the book he recommended, Bart Ehrman’s How Jesus Became God, I’ve been on a bit of a theism vs atheism kick, and decided to read probably the most famous modern atheist book, The God Delusion by Richard Dawkins.
I’m planning to write a series of posts interacting with that book (with which I’m almost done), but before that, I thought I’d write on a topic that I’ve wanted to write about for a while: bias. As we all know, the context our brain starts with when approaching a topic has a huge impact on how we look at it, and that’s no less the case when I, as a Christian, approach something like The God Delusion.
I think it’s helpful to divide bias into two distinct, largely unrelated types, which I will refer to as Knowledge / Experience Bias and Willful Bias. The latter is very similar to (or maybe simply the same as) the well-known concept of confirmation bias, but I want to use my own term so that we can look at these two ideas fresh in this post. The reason it’s so helpful to divide the discussion of bias into these two types is that I think they affect a person in very different ways, and are fought in very different ways as well. Let’s take Knowledge / Experience Bias first.
Knowledge / Experience Bias
You are probably familiar with IBM Watson—the AI system that won big on Jeopardy!. Although it was completely disconnected from the internet during the game show, it had been trained on a huge amount of data to build up its knowledge base, and during the show it could query that knowledge base in an effort to answer the given question intelligently, often with success. Now let’s pretend we had dumped every bit of 9/11 “Truther” material (the conspiracy theories about 9/11) we could find on the internet into Watson, and nothing that conflicted with it. Then imagine that Alex Trebek asked Watson a question about 9/11. How do you think it would respond? With conspiracy theories, of course! But would you say in this case that Watson was acting illogically? No, not at all. It simply had bad data to work with. It was likely acting as rationally as possible with the data it had been given.
Now imagine that you also dump all the factual information about 9/11 into Watson and let it answer the question again—and let’s assume this data overwhelms any evidence for a conspiracy. Would Watson stick with its Truther ideas since that is what it started with? Of course not! It would calculate based on the scope and weight of the new data and presumably switch its answer. In other words, Watson would have no emotional attachment to its previous views, and no identity tied up in them, so it wouldn’t resist changing its mind the way people do.
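The Watson thought experiment can be made concrete with a toy Bayesian sketch (my own illustration—not how Watson actually works, and the numbers are made up): an agent that has seen only pro-conspiracy evidence rationally favors the conspiracy, and the very same agent, fed overwhelming counter-evidence, just as rationally flips its answer. No stubbornness is involved either way.

```python
from math import prod

def posterior_conspiracy(prior, likelihood_ratios):
    """Return P(conspiracy | evidence) after a sequence of Bayesian updates.

    Each likelihood ratio is P(evidence | conspiracy) / P(evidence | no conspiracy):
    values above 1 favor the conspiracy, values below 1 count against it.
    """
    odds = prior / (1 - prior)          # convert prior probability to odds
    odds *= prod(likelihood_ratios)     # Bayes' rule in odds form
    return odds / (1 + odds)            # convert back to probability

# Trained only on conspiracy material: every item seems to favor the conspiracy.
print(round(posterior_conspiracy(0.5, [3.0] * 5), 3))   # → 0.996

# Now dump in factual data that overwhelms the earlier evidence.
print(round(posterior_conspiracy(0.5, [3.0] * 5 + [0.1] * 10), 3))   # → 0.0
```

The point of the sketch is that both conclusions fall out of the same perfectly rational update rule; only the data differed.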
I hope the type of bias Watson was originally subject to is clear by now. It’s a purely experiential, knowledge-based form of bias. Watson was simply doing the best it could with the data it had.
Everyone starts out that way as a child; they trust their parents, they take that data, and they run with it—at least until they find hugely conflicting data. So the takeaway is that one can act as rationally as possible, and come to very wrong conclusions, depending on the data and experience at their disposal. And this has nothing to do with any sort of desire for a particular conclusion. That’s why I’m convinced that two equally open-minded, rational people can have opposite views about the existence of God. We all have our own unique sets of knowledge and experience we are working with when figuring this stuff out.
Willful Bias

Perhaps “willful” isn’t the best word for this, because there’s a way in which this form of bias can be unconscious. However, no matter how you look at it, it’s of a completely different kind than Knowledge / Experience Bias.
I think the best way to describe this bias is the knot in your stomach when you come into contact with an idea or argument that goes against a cherished belief. It’s the resistance you feel toward allowing your brain to “go there.” The lead character in the movie Inception talks about how the most dangerous thing in the world is an idea, and I tend to agree. It’s strange how we can keep certain ideas at arm’s length and intellectually “consider” them, but if we’re being honest, we don’t actually allow our brains to think, “What if this is actually true?” We don’t truly “try on” that mindset. We only consider it from afar—far enough away that it can’t really overtake our brains if it’s true.
Let me give an example. Growing up, basketball was my main sport, and I was decent at it; not great, but I had a pretty good shot and could keep up with the other kids. I even played JV basketball at my homeschool high school (it’s ok to laugh at that sentence). It was part of my identity that I was good at basketball. However, there were moments when that was questioned. And when that would happen, I would feel that knot in my stomach, that feeling of “no, I am good at basketball. That must be true.” It’s the same feeling a child feels when their parent is making them do something they don’t want to do. In this case, though, the parent is really reality itself, saying, “hey, what if you aren’t actually that great at basketball?”
I think we all recognize those things in our life, those things about our identity that are off limits to being questioned. This does not mean they are incorrect beliefs, but it clearly does mean we have a bias.
Once I started to see that mechanism active in me, and in particular once I got more and more philosophical about my own religious beliefs, I ended up crusading against every knot in my stomach I felt about anything in my life. I wanted to rid myself of all cherished beliefs (i.e. un-cherish them, but not necessarily discard them) as quickly as possible; questioning them felt horrible and I wanted to just get it the heck over with. So, literally, every time I would feel that knot (about an element of Christianity, or whether I was good at basketball, or piano, or whether I was smart, or arrogant, etc.), I would sit down and try my best to face the fear, to let my mind truly “go there”: to actually try on the worldview of “I’m not that great at basketball,” for instance. Like the character in Inception, I knew that once I really tried on that way of looking at things, if it were true, I wouldn’t be able to go back to fully resisting that belief; it would have permanently taken root in my brain.
In the case of basketball, after trying as hard as I could to look at my skill objectively, I decided, “eh, I’m decent. Not great, but I do have a pretty good shot—streaky though—and honestly I’m not a good ball handler.” Some of that was hard to face, because being good at basketball was part of my identity. But once I really faced it, the knot in my stomach melted. I had to exorcise part of it from my identity—because I wasn’t as good at basketball as I wanted to believe—but I was also able to affirm to myself that I was decent, so the fear about that went away too. There is an incredible freedom that comes after. I no longer had a cherished belief about my basketball abilities. I possessed my own view of my skill level, which might be more right or more wrong (I’m not claiming to have found absolute objective truth), but the key is that I didn’t hold that belief in a closed fist anymore. I held it in an open hand, making the best judgment I could, but now willing to let it go if I needed to. And I did indeed need to let go of part of it.
This sort of bias is, I think, what most people mean by the word “bias.” It’s a functional, behavioral form of bias. It’s what true closed-mindedness is: an ugly behavior that manifests when new, undesired information becomes available. In the previous example, Watson is decidedly not doing this. Watson is, ironically, an open-minded 9/11 Truther. Even though it believes in a conspiracy theory, Watson is still open-minded, because as new information comes in, it is immediately willing to change its conclusions if necessary. So it’s important to recognize that open-mindedness (and closed-mindedness) is a behavior, not a set of beliefs. It’s sometimes associated with certain beliefs, because it’s usually hard to maintain some particular beliefs without rejecting new information as it comes in; but ultimately it’s not a set of beliefs, it’s a behavior.
Let me clarify at this point that I don’t actually think I have truly removed all Willful biases from my life. But I have honestly tried my damnedest. There are very few things, if any, that give me that knot in my stomach anymore—that I’m aware of, anyway.
Secondly, I know that Willful bias isn’t always experienced as a knot in your stomach—some people seem to quite cheerfully resist allowing an idea into their brain. But I would submit that that happens only after years of desensitization. Children, at least, seem to naturally feel that emotional strain when something conflicts with their beliefs and they are resisting the idea.
So how do you fight the two biases? In very different ways. Knowledge / Experience Bias is clearly fought by reading far and wide, talking to people with different viewpoints, and so on. As you do this, the second type of bias (Willful) will almost certainly rear its particularly ugly head and must be fought on its own terms: the will. You have to strengthen the mental muscle of detecting that knot in your stomach, committing to face the fear, truly trying on a different mindset, and letting it sink in deep. And sometimes this has to happen gradually over time. The good news is, it should get easier the more you do it, because once you’ve faced a fear, you immediately have firmer ground to stand on (i.e. a piece of the truth—as far as you can reason to it), which gives you more confidence to face the next fear in your worldview.
This is exactly how it worked when I bored down deep into my childhood religious beliefs. Those first few deep steps were utterly excruciating. It was almost impossible to truly question some of my most cherished beliefs. At one point, when I questioned the hardest and deepest, I literally sat down in my room in my college apartment, put the Bible on my dresser in front of me, and tried to force myself to see it as a merely human book—no different from any other religious book in history. It honestly took a minute or two for my mind to truly go there, since up to that point in my life I had never truly tried on that idea. But, after a moment, it happened, and my viewpoint flipped (almost like seeing the optical-illusion box one way, and then having it flip and seeing it the other way). That moment came during my deepest period of doubt. And admittedly, I think it did truly change my view of the Bible, to a degree, forever.
(On a side note: I’m not recommending other Christians take my path. I think there’s a healthier and more useful way to fight biases, by focusing on a single small element of your worldview at a time, as I describe below.)
Dawkins, Bias, and Me
So, where does this position me, as a believing Christian, as I go into a book like The God Delusion?
I think two things primarily need to be said. First, I really feel like I’ve mostly already gone through the process, as I described above, of taking each of my beliefs one by one and questioning it to the core—as far as I’m aware. So although there’s little chance Dawkins is going to convert me to atheism, it’s not because I’m putting up a fence and not allowing my brain to go there. It’s more like my brain has already “gone there” pretty thoroughly, and I decided not to build a house and settle down. Now, if Dawkins brings up particularly compelling points I’ve never heard before, I guarantee you it will cause some real pondering (not of the faith-threatening kind, granted). That was certainly the case when I read the Bart Ehrman book I mentioned above; I still have notes of things to return to there.
That brings up the second point: the way I have learned to most effectively fight my Willful bias is to avoid questioning everything at once. Questioning your entire worldview is not only very challenging, it’s also needlessly painful (speaking from experience…). Instead it’s way easier and more helpful, in my view, to pick one relatively small piece of your worldview at a time and really dig deep there. This is how I learned to seriously question Biblical inerrancy, for example—I held it out separately from my Christian beliefs (i.e. I knew I could still be a Christian even if the Bible had an error), and then it gave me substantial freedom to truly open my mind to the idea that the Bible is not inerrant. Same for initially looking at Evolution when I was much younger and then later the divinity of Christ, the existence of miracles, etc.
Think of it this way: if you are politically a raging liberal and you’re trying to convert your conservative friend (or vice versa) to your side, do you try to do it all at once? That would be as foolish as it would be frustrating for both of you. Instead, if you pick just one issue to really dig into, you might actually make headway, assuming your friend is reasonable and you’re having productive discussions. It gives your friend a safe place—because they can remain a “conservative” (or a “liberal”) while they consider a different viewpoint on just this one particular issue.
In sum, my approach with a book like The God Delusion is to ruthlessly focus on individual pieces of evidence one at a time, trying to free each from the whole and really look at it. In my experience, this method really does allow me to see things more objectively—at least as evidenced in my own life by the fact that I truly have changed my mind on a lot of things I’ve analyzed this way. (There have been plenty of things I’ve mostly kept my original view on as well, but I came back afterwards feeling a peace about them and no longer holding them in a death grip.)
To bring this back to an earlier thought, this is similar to my questioning of my basketball skills. It would’ve been impossible for me to question my entire identity, in all its facets, at once. But I was able, with some grit, to question the basketball part of my identity and come to a more objective view. And then, with some solid ground gained on which to base my identity more firmly, I was able to move on to the next piece of my identity to question, and then either discard or affirm it, and so on.
Hopefully this post served as a nice intro to my dialogue with this Dawkins book, and also gave some helpful autobiography so you know where my head is at as I go into it.
I will close with one piece of completely unsolicited personal advice: focus ruthlessly on your Willful biases. Learn to detect that knot in your stomach, those places in your mind you immediately recoil from. Not only will unraveling those knots lead you closer to objective truth, but honestly, I believe those knots are the primary source of pain in all human relationships. The desire not to face ideas about ourselves prevents us from the growth that’s needed to sustain deep relationships with people. From my own personal experience, being laser-focused on detecting Willful bias has continually, progressively (and admittedly sometimes quite slowly) humbled me and carved away at the ugliest bits of my personality. I’m sure there are still plenty of ugly blind spots left for me to discover down the road (yay!). I look forward, in a horrified sort of way, to unraveling those knots when I (hopefully) discover them in the future.
Now let’s get on with taking a look at my Delusion!