The following sentence really stuck with me:
Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.
— Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), p. 81.
It stuck with me so much that by the time I finished the book I found myself wishing Kahneman had written more directly about the philosophy of science. We all know that the human mind tends to look for confirming data, and we often read, especially among Austrians, that evidence without theory is of little use. What I've heard less often is an emphasis on falsification.
Most people do know that in science a theory is never really proven. Instead, we accept it as useful until somebody comes along and falsifies it. But the way I (and, I'm sure, others) have interpreted this method is simply as a general rule, not one that the individual scientist necessarily needs to follow. In other words, I've never thought of a good scientist as someone who actively looks to disprove her own understanding. I consider falsification important, and I've made similar points on this blog, but I've never framed it as something of fundamental importance.
A great tragedy is that non-academic media, out of necessity, have to sacrifice some element of science. Not too long ago, I read a short 2003 paper on beauty, productivity, and discrimination in the classroom by Daniel Hamermesh and Amy Parker, who at the time was one of his undergraduate students. The results were generalized and republished in the New York Times by Hal R. Varian, whom we all recognize as one of the best-known microeconomists of our time. What stands out is that while Hamermesh and Parker are very careful about drawing conclusions, delineating where their results are tentative and where uncertainty remains, Varian's column is much more certain. Varian understands what it means to be a scientist better than most people, and he wasn't trying to mislead anybody. What happened is that stories about uncertain results are not popular, because they make the reader wonder why the news is even relevant. There is little room for the pedantic objectivity that complex scientific questions call for.
You often see the same thing in the blogosphere. How many times has Krugman written that the evidence is on his side? How many times have I posted graphs of very general data and suggested that they validate, at least within some limit, my beliefs? The answer is: very often. To some degree, it's justifiable. I like to draw attention to things I think others might miss. Krugman is interested in convincing people of points he thinks are important. What's more, Krugman is not completely averse to falsification; there are plenty of examples of him changing his views. Finally, that some body of evidence is not necessarily at odds with our worldview is worthy of consideration. But in outlets read by people who don't always operate with the understanding that we should challenge our beliefs, and that no evidence is ever really final, I feel that focusing too much on evidence of us being right can be misleading.
Scientists with wide readership should always make the point that there are limits to our understanding, and that the probability of being wrong is very high — whether these mistakes are major or minor. Inculcating methods of dealing with our cognitive limitations is an important step in making the world a more educated place. I’d say that it’s more important than strict schooling, because even strict schooling is a lost cause for those who aren’t interested in exploring the limits of their knowledge.