I'm reading a book right now called "How We Know What Isn't So" by Thomas Gilovich. It's about how we think, and where thought processes go wrong. When that happens, we either believe things that are not true (like astrology), or don't believe in things that are (like the Moon landings).
Gilovich has lots of interesting things to say, and so much of it is relevant to science and critical thinking! I have long said that it's okay to dismiss some notions because they are so wrong they don't deserve to be entertained. A lot of people call me dogmatic when I do that, but really, when someone tells me that an invisible giant planet will destroy the Earth next week, I think it's okay to dismiss it. Gilovich has this to say on the subject:
"People are inclined to see what they expect to see, and conclude what they expect to conclude... At first blush, such uneven treatment of new information strikes most people as completely unjustified and potentially pernicious... it brings to mind... groups blindly adhering to outmoded dogma... [but] it is also inappropriate and misguided to go through life weighing all facts equally and reconsidering one's beliefs anew each time an antagonistic fact is encountered."
In other words, some things really shouldn't bear equal treatment.
He uses the word "fact" in that last sentence. I think he should have used another word, maybe "idea" or "piece of evidence," because if an actual fact comes along that contradicts your belief, you do indeed need to rethink that belief. Even mere evidence needs to be weighed, but not all evidence has equal weight: evidence that is repeated and independent is pretty heavy, while an anecdote carries little or none.
Then he says something I find pretty funny:
"In evaluating more clear-cut information, our perceptions are rarely so distorted that information that completely contradicts our expectations is seen as supportive."
In other words, he's saying that when presented with strong evidence against our beliefs, we rarely manage to twist that evidence into support for them. To that I say, "Ha!"
He's never met Bart Sibrel, James McCanney, or Richard Hoagland. These guys constantly use evidence that directly contradicts what they are saying, yet claim it supports them. Whether it's the Moon Hoax, a physically impossible model of the solar system, or the face on Mars, direct evidence blowing these ideas away is trumpeted by these men as triumphs of their theories. It boggles the mind to think they actually believe what they say. Of course, if they don't, then maybe Gilovich is correct after all. I can't tell whether they believe it or not, but it doesn't matter: either way, what they are saying is wrong.
A final word: Gilovich goes on to say:
"We humans seem to be extremely good at generating ideas, theories and explanations that have the ring of plausibility. We may be relatively deficient, however, in evaluating and testing our ideas once they are formed."
On that he is exactly right. We can always come up with an explanation ("It was astrology! Planet X! Sunspots! Telekinesis! Homeopathy!") for some event, but as humans we're pretty poor, maybe even lazy, about evaluating those explanations once we've made them.