Tuesday, April 24, 2012

The Essentials of Epistemology

Epistemology is the study of how to discover the truth.

Now we See Through a Glass, Darkly
How can we tell the difference between truth and error, between fact and fiction, between conspiracy fact and conspiracy theory? Naturally, the best solution is to become an expert. But you can't even become an expert without first learning what information to trust and what information to ignore. Furthermore, it is impossible for everyone to be an expert on everything. So we all eventually have to trust the advice of others. But whose advice should we trust? These questions are becoming increasingly important in the world of the internet, where everyone has been given a printing press, and where false (and even dangerous) ideas spread like wildfire.

I do research in statistics, which is the study of how to determine truth from data and make rational decisions based upon that data. So I naturally have some opinions on the matter. But explaining these principles to people outside the field of statistics can be a challenge. And it is not reasonable to expect everyone to get a degree in statistics before they try to sort out the difference between truth and error.

So, how should the average person determine truth from error? I believe that there really are a few principles that (if followed) will lead you to the truth far more often than any other approach. They are: 

1. Simplicity: The simplest theory is to be preferred over a more complex one, all other things being equal (this has bearing on #4 as well).

2. Data over Dogmatism (be willing to change your mind): Let the data speak for itself as much as possible; don't assume that the answer must match some ideological or dogmatic assumption. Be as dispassionate and rational as possible. Allow your opinions to change when the data contradicts your initial opinions.

3. Avoid Confirmation Bias: People naturally tend to seek out data that confirms their initial opinions (this is called confirmation bias), while reacting to new data that contradicts their original belief with even more fervent adherence to that belief (the backfire effect). This means that whatever you happened to believe first has an unfair advantage. Therefore, be sure to actively look for data that contradicts your initial opinions, and do your very best, hard as it can be, to treat such new data fairly. (Conservatives should watch PBS, CNN, and BBC, while liberals should watch Fox News, and even *gasp* Mormons should read so-called "anti-Mormon" or atheist opinions on occasion.) It is only after you have read and understood the opinions of those who disagree with you that you can be confident that you are actually right.

4. Avoid Conspiracy Theories: Although real conspiracies do exist (usually very small ones), assuming that all data counter to your initial belief is due to a vast conspiracy to hide the truth is problematic. First, it vastly oversimplifies a reality in which each person has their own (often contradictory) goals and motivations, which generally limits real conspiracies in size and scope. Second, most people want to do good, and those who do evil usually do it because they have convinced themselves that it is the right thing to do. Third (and most important), it creates a situation where your opinion (right or wrong) can never be contradicted by evidence, no matter how strong or otherwise convincing. This is an especially dangerous flavor of confirmation bias (see #3 above).

5. Trust the Experts: Because we can't personally experience everything, we must trust the opinions of others who have experienced things that we have not. Assuming that white elephants don't exist because you haven't seen one is foolish if others have. Thus, a large percentage of our understanding of the world around us must be based upon the witness and opinions of others. Our task is often to determine which witnesses to believe and which opinions to trust. When deciding which experts to believe, always assign more weight to the opinions of people who know more about the subject than to the opinions of people who know less. This means that we should trust and respect the experts in their fields. Look for issues and ideas where there is a strong consensus among the experts in a given field. And be careful of internet sources. Some 60-year-old guy blogging in his underwear from his parents' basement does not a reliable expert make.
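The idea of assigning more weight to more knowledgeable sources can be sketched as a simple weighted average. This is only an illustration, not a rigorous method; every source name, probability, and weight below is invented:

```python
# A minimal sketch of weighting opinions by expertise.
# All sources, probabilities, and weights are hypothetical.
opinions = {                    # each source's probability that some claim is true
    "specialist": 0.90,
    "generalist": 0.70,
    "anonymous_blogger": 0.20,
}
weights = {                     # assumed relative expertise of each source
    "specialist": 10,
    "generalist": 3,
    "anonymous_blogger": 1,
}

# Pooled belief: a weighted average that lets experts dominate amateurs.
pooled = sum(opinions[s] * weights[s] for s in opinions) / sum(weights.values())
print(round(pooled, 3))  # lands near the specialist's 0.90, far from the blogger's 0.20
```

The exact weights are a judgment call, of course; the point is simply that the dissenting blogger's vote barely moves the pooled estimate when the weights reflect expertise.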

6. Look for General (Non-Unanimous) Consensus: Understand that there will always be a few dissenting voices on every issue. Don't assume that because you found a PhD physicist who thinks the earth is flat, there is "controversy" among the scientific community on the issue, or that there is no scientific consensus on the matter, or that the idea that the earth is a sphere is "only a theory", or that you should "teach the controversy". Instead, look for general agreement among some large majority of the experts.

7. Avoid Anecdotes and Emotional Stories: Anecdotes are almost entirely useless, since there will always be an anecdote or two that seems to support any position one could conceivably hold. Unfortunately, these anecdotal stories often have vast emotional impact, but that doesn't mean they are right. Instead, look for broad, statistically sound studies to find truth. They are less emotionally convincing, but they are far more likely to be right!
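The problem with anecdotes can be made concrete with a quick simulation (every number below is hypothetical): even when a treatment does nothing at all, a large population will generate plenty of vivid "it happened right after..." stories on both sides, purely by chance.

```python
import random

random.seed(42)

N = 100_000        # people in each group (made-up size, for illustration)
BASE_RATE = 0.001  # background rate of some scary illness (also made up)

# The "treatment" here is completely inert: illness strikes at the same
# background rate whether or not a person received it.
treated_ill = sum(random.random() < BASE_RATE for _ in range(N))
untreated_ill = sum(random.random() < BASE_RATE for _ in range(N))

# Each group produces roughly a hundred heartbreaking anecdotes by chance alone,
# even though the treatment caused none of them.
print("ill after treatment:  ", treated_ill)
print("ill without treatment:", untreated_ill)
```

Around a hundred families in each group could tell a gripping story, yet the aggregate numbers show no effect at all. That is exactly the information an anecdote hides and a broad study reveals.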

If you follow these 7 rules, you will undoubtedly be wrong on occasion. But you will be wrong less often than if you don't, and you will be willing to change your mind rapidly when new information showing that you were wrong becomes available. If you do this, you will be unlikely to be led astray, and you will be far more likely to discover things as they really are, really were, and really will be.

So, how does that play out?

In the vaccination debate, it means that we should pay more attention to reputable doctors than to Jenny McCarthy when deciding whether to vaccinate. It means that while the story of a child who died after being vaccinated may be more emotionally moving, it should not be as intellectually convincing as a broad-based statistical study. And it means that we shouldn't be surprised to find a few MDs in both camps. But we should look at the broad consensus and realize that the vast majority of reputable doctors favor vaccination. Therefore we should favor vaccination. There is no vast conspiracy to make vaccination look successful when it is not. Sure, the pharmaceutical companies have the motivation to do this, but the vast array of doctors who care about their patients don't. At least not in a way that could produce this level of near-universal support.

How would these principles play out in the global warming debate? I am sure you can immediately see the answer. How about evolution? Again, is the answer obvious? How about economics? There, things are a bit more muddled, but that is itself a conclusion: there is consensus on some important points, but disagreement on some others.

What about the truthers? The birthers? Opposition to GMOs? And the list goes on and on and on.

(For more thoughts on the 'wisdom of the crowd,' the 'marketplace of ideas,' and the problems the internet has produced for these things, see the paper Brent Allsop and I wrote: "Amplifying the Wisdom of the Crowd, Building and Measuring for Expert and Moral Consensus.")