Skepticism and Data: When Should You Trust the Facts You Read?

Brian Tomek
Jan 13, 2014 at 10:57 AM

Picture this hypothetical scenario: a private company releases a new study that supposedly links the violence in video games, such as Call of Duty, to increased murder rates. The statistics show that 80% of gun-related murders are linked to first-person shooter games. Most people would read this data and see it as undeniable proof that first-person shooters should be banned. Pretty soon, news outlets and politicians would be citing the study to prove their points and advance their agendas, while the other side of the debate sits baffled, unable to disprove such an obvious correlation. This hypothetical seems all too real, because it is eerily similar to things that have actually happened.

So, the pro-gamer side has lost this one, right? Not necessarily. The problem with these supposedly conclusive studies is that most of them are unverified. That 80% is completely arbitrary and could very well be a false statistic built on biased data. Take the example of Dr. Andrew Wakefield, who was once a prominent figure in medical research. Wakefield devised an experiment to see whether there was a correlation between the MMR vaccine (a vaccine designed to grant immunity to measles, mumps, and rubella) and the development of autism in young children.

The experiment supposedly confirmed that this was, in fact, the case, and panic ensued among parents around the world. Little did they know how suspicious the experiment was. On closer analysis of the data, it turned out that only 12 children had taken part in the experiment, and that Wakefield was planning to sell his own vaccine to replace MMR. Dr. Wakefield was later discredited because of his scheme.

Now, back to the original scenario. There is no real way to verify the results of the study unless the actual sources for the data are supplied, and oftentimes the data itself is never provided when the results are presented. The study may have looked at a very small sample size, or only examined a particular type of murder, but that information may never reach you, whether because of deliberate obfuscation or simply because the journalists reporting on the findings don't have the expertise to know which details matter. If that's the case, how do you know what you can actually trust?
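To see just how misleading a small sample can be, here is a rough sketch of my own in Python (the 10% "true rate" and the study sizes are invented numbers, not anything from a real study), comparing estimates from a dozen subjects with estimates from a few thousand:

    import random

    TRUE_RATE = 0.10  # hypothetical "true" rate we are trying to measure

    def estimate_rate(sample_size):
        """Simulate one study: check each subject against the true rate
        and report the observed proportion."""
        hits = sum(1 for _ in range(sample_size) if random.random() < TRUE_RATE)
        return round(hits / sample_size, 2)

    # Five tiny studies (12 subjects each, like Wakefield's) vs. five big ones.
    print("n=12:   ", [estimate_rate(12) for _ in range(5)])
    print("n=5000: ", [estimate_rate(5000) for _ in range(5)])

The 12-subject estimates swing wildly from run to run, while the 5,000-subject estimates land close to 10% every time. A headline number means very little without knowing which kind of study produced it.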

As a general rule of thumb, always follow what I call the rule of three: a claim like this hypothetical one about murder rates needs to be confirmed by three separate studies before it should be treated as true. If you analyze the statistics and find that the gun-murder rates associated with FPS games in three different countries are 70%, 79%, and 80%, then perhaps there is some credence to the theory that FPS games lead to more murders. If, however, the rates in those three countries are 10%, 40%, and 80%, then there is probably something suspicious about the data behind the claimed correlation between murder rates and video games.
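For readers who like to tinker, here is a minimal Python sketch of that rule (the function name and the 15-point tolerance are my own arbitrary choices, not a formal statistical test):

    def rule_of_three(rates, max_spread=0.15):
        """Return True if at least three reported rates agree within
        `max_spread` (an arbitrary tolerance), False otherwise."""
        if len(rates) < 3:
            return False  # not enough independent confirmations yet
        return max(rates) - min(rates) <= max_spread

    print(rule_of_three([0.70, 0.79, 0.80]))  # True  -- the numbers roughly agree
    print(rule_of_three([0.10, 0.40, 0.80]))  # False -- something is off

A real analysis would use proper statistical methods, but even this crude check would flag the second set of numbers as needing a closer look.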

It's a basic concept in science: if an experiment can be duplicated with similar results by an outside source, it gains credibility. If it can't be, its conclusions shouldn't be trusted. Until data can be confirmed by three or more separate sources, it should be viewed with a degree of skepticism.

Do not trust the results of a study simply because it comes with statistics; those numbers can be arbitrary figures built on unsubstantiated claims. To confirm them, the same study should be duplicated by other sources to test the original results. Always be skeptical unless given a reason not to be.

Content published on the Young Americans for Liberty blog is only representative of the opinions and research of the individual authors. It does not necessarily reflect the views, goals, or membership of YAL.

Great article! Also keep in mind that correlation does not prove causation. A correlation between two things does not prove that one caused the other; there could be equally plausible alternative explanations for why the two occur together. For example, if there were a strong correlation between playing FPS games and gun violence, it would not necessarily mean that FPS games led to the violent behavior; it could be that people who are already prone to violent behavior tend to enjoy FPS games.
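To make that point concrete, here is a toy simulation in Python (all the numbers are invented) in which a hidden trait raises the odds of both gaming and violence, so the two end up correlated even though neither causes the other:

    import random

    def simulate_person():
        # Hidden confounder: some underlying trait (e.g. aggression level).
        aggression = random.random()
        # The trait independently raises the odds of BOTH behaviors;
        # neither behavior causes the other in this model.
        plays_fps = random.random() < 0.2 + 0.6 * aggression
        is_violent = random.random() < 0.05 + 0.3 * aggression
        return plays_fps, is_violent

    people = [simulate_person() for _ in range(100_000)]
    violent_gamers = sum(v for p, v in people if p) / sum(p for p, v in people)
    violent_non_gamers = sum(v for p, v in people if not p) / sum(not p for p, v in people)

    print(f"violence rate among gamers:     {violent_gamers:.2%}")
    print(f"violence rate among non-gamers: {violent_non_gamers:.2%}")
    # The gamers' rate comes out higher, yet by construction the games
    # cause nothing -- the shared trait explains the whole correlation.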
