"Not even close!" says Harvard professor Eric Mandelbaum. I just read an eye-opening article from an interdisciplinary philosophy journal. The article, "Thinking Is Believing," by Eric Mandelbaum, argues that whenever you think, hear, consider, or read something (e.g., "unicorns are pink"), your brain's default setting is to automatically believe it (so I guess now you believe unicorns are pink; sorry about that). In other words, everything is true unless you consciously reject it, override it, which takes extra effort and so rarely happens. What makes all this worse, he says, is that we are not even aware we are doing this; it all happens unconsciously, in the background processing of the brain (mostly perception and memory). In other words, this is an extreme thesis: we are not rational creatures, not by a long shot. Although he is arguing for a particular philosophy of mind, he is not just doing armchair philosophy here; he is drawing from a body of scientific studies (which all philosophers ought to do).
This big idea explains a number of things:
- evolutionarily speaking, it makes sense. A tiger is coming! You will be much safer if your brain simply believes that statement without having to think it over. But does the same principle apply when someone tells you the universe is composed of invisible harmonic strings? He thinks so.
- this explains why we have so many conflicting, contradictory beliefs. As Whitman said, "Do I contradict myself? Very well then I contradict myself, (I am large, I contain multitudes.)" Well, it's really because we believe every damn thing we hear! And we just file it away, unaware of how many other beliefs it conflicts with (until we consciously pull out all the files, sort them, deliberate, reject some, strengthen others...something that is incredibly hard to do because memory has its own problems...sometimes we can't even remember what we believe! Hard to believe, huh.)
- several psychological experiments seem to confirm this. I don't remember them exactly, but here's the gist: you can have test subjects memorize a bunch of random statements like "Jim likes ice cream," "the car is purple," "the equator is 57,000 miles long." Later, even if you tell them those statements were all false, they will remember them as being true.
- confirmation bias: this refers to the fact that we seek out things that support what we already believe. But it goes deeper. Not only do we seek out positive confirmation rather than debunking negative evidence, but we have trouble processing negative information to begin with. For example, "Jill is not male" is very hard for us to process, while "Jill is female" is extremely easy. Studies show that when we hear "Jill is not male," we unconsciously switch it to "Jill is female"...and file it away. Or we could fuck it up entirely and believe that Jill is male. This is why it's always a good idea to communicate in positive terms, so people will understand you better.
- everyone who watches Fox News believes it, even when it's not "news" or "factual." I added this one. Not to pick on conservatives; this really goes for all news. Why do most people think we live in a violent world, even though we don't? Because the news only reports violent news. And what about those nasty political attack ads? We all know these ads are probably not true, but heck do they work! It's like once an idea goes "mainstream," it becomes true by virtue of existing.
- This explains why parents are so important in the development of children's beliefs, and why kids are pretty much set on a particular path by the time they become adults. We are sponges, a blank slate as Locke said. This also makes you think really hard about what ideas you even want to expose your children to. Shit, forget children: yourself! If you flood the mind with tons of ideas, is that the best thing to do? Or does that make you the biggest hypocrite ever (according to this theory)?
But is this idea true? Do we believe everything we think?
Well, what a question! Notice that by posing that question (is this article true?) I have, in a way, debunked the very conclusion of the article. I do not have to accept the conclusion, even if I don't find evidence to the contrary. I read his ideas. I consider them. I accept, deny, or forget them. I was exposed to them but don't believe them. I do find them very interesting and somewhat convincing. Later, perhaps in a year, I will mention this article to someone at a party: "I read this article once that argues we believe everything we read. I'm not so sure about it." An agnostic approach, something he predicts can never happen.
Either way, I just don't see how this could be correct for the more intellectual beliefs that matter in our lives, the ones we actually think about and care about. For all those other beliefs, however, which the mind passively accepts for survival/evolutionary reasons, sure, I think he's probably right. Psychology is good at telling us trivial truths about how our minds act when we are not looking. Perhaps this is another example. Perhaps the author is a victim of his own belief; perhaps his underlying belief in the irrationality of human nature is being fueled and puffed up by a couple of psychology experiments he read about, which then made him write this article.