Up Front | Apr 2022

Just the Facts

“Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one.”1

–Charles Mackay

Scottish author Charles Mackay (1814–1889) was an early observer of crowd psychology. In his book Memoirs of Extraordinary Popular Delusions, he outlined examples of peculiar phenomena among groups of people, including the witch-hunt trials of Western Europe and the 17th-century Dutch tulip mania, during which some tulip bulbs became the most expensive objects in the world.1 History is replete with examples of societies that became enthralled by a collective delusion. It’s easy to dismiss historical episodes of societal irrationality as products of their time and of an uninformed populace with little access to information and scientific discourse. Modern society, however, has demonstrated the same tendency.

Society today is characterized by nearly unlimited access to information and a broader scientific understanding of the universe than ever before. Nevertheless, the COVID-19 pandemic has demonstrated that people presented with the same information may interpret it differently depending on their worldview. Even when an unequivocal fact is presented, a spectrum of interpretations, some irrational, will emerge among the general population. An obvious explanation is that people have lost confidence in the validity of the facts they receive.

As I write, the world is gripped by the war in Ukraine. An onslaught of information about this current event is being shared with people everywhere. Further complicating our inevitably varied interpretations of that information is the fact that it can be manipulated. Imagine what a state-sponsored intelligence agency can do if an app on my phone can easily create a realistic but fake video of my daughter singing a Mariah Carey song that she has never even heard. In this context, a loss of confidence in facts is more understandable.

At least we medical professionals can count on our ability to examine the facts, devoid of agendas, financial bias, and outright fraud—right? Great strides have been made in requiring financial disclosures and rooting out conflicts of interest in scientific publications, but bias remains a significant problem.2-5 Putting aside the challenges of bias, how good are we at interpreting data rationally? Even when the facts are laid out in purely mathematical form, many of us are inept at data reasoning.

The modern understanding of rationality is based on the work of the 18th-century mathematician and minister Thomas Bayes. According to Bayesian reasoning, previous knowledge is not discarded when new facts are learned. Instead, the entirety of what is now known is appropriately updated with consideration for the reliability of the new facts.6
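
In its simplest form, Bayes’ rule expresses this update as P(H | E) = P(E | H) × P(H) / P(E): the probability of a hypothesis H after seeing evidence E combines the prior probability P(H), what was already known, with the likelihood P(E | H), which reflects the reliability of the new evidence.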

Steven Pinker illustrated how bad physicians are at Bayesian reasoning with an example that I will modify slightly.7 Imagine that the prevalence of glaucoma in the population is 1% and that an available test detects glaucoma 90% of the time. The test also has a 9% false-positive rate. If your patient tests positive, what is their true risk of glaucoma? According to Pinker, most doctors answer that the patient has slightly less than a 90% chance of having glaucoma. (The right answer, however, is that the patient has only about a 9% chance.) Of course, Pinker is referring to other doctors. All of us recall that the positive predictive value of a test is equal to the number of true positives divided by the sum of the true and false positives. Those other doctors essentially forgot what they knew originally: that 99% of the population does not have glaucoma. Instead, they focused too keenly on the test result for their patient.
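
To make the arithmetic concrete, here is a minimal sketch of the calculation in Python, using the figures from the example above (the variable names are mine):

```python
# Positive predictive value (PPV) for the glaucoma example above.
prevalence = 0.01            # P(glaucoma): 1% of the population
sensitivity = 0.90           # P(positive | glaucoma): test detects 90% of cases
false_positive_rate = 0.09   # P(positive | no glaucoma): 9% false-positive rate

# Expected fractions of the whole population that test positive:
true_positives = prevalence * sensitivity                  # 0.009
false_positives = (1 - prevalence) * false_positive_rate   # 0.0891

# PPV = true positives / all positives (Bayes' theorem in this setting)
ppv = true_positives / (true_positives + false_positives)
print(f"P(glaucoma | positive test) = {ppv:.1%}")          # about 9.2%
```

A positive result raises the patient’s risk roughly ninefold, from 1% to about 9%, but that is still a long way from 90%.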

We’ve seen that large swathes of society can adopt irrational beliefs. Unfortunately, there’s not much we can do to change that. We can, however, try to lead by example by reasoning in the Bayesian tradition. Start with the big picture of what you already know to be true, and carefully integrate new facts into your analysis. Finally, when society seems to be thinking in herds, remember the adage from medical school: “When you hear hoofbeats, think of a herd of horses, not zebras.” This is Bayesian reasoning at its finest.

Steven J. Dell, MD | Chief Medical Editor

1. Mackay C. Memoirs of Extraordinary Popular Delusions. Richard Bentley; 1841.

2. Grimes DR, Bauch CT, Ioannidis JPA. Modelling science trustworthiness under publish or perish pressure. R Soc Open Sci. 2018;5(1):171511.

3. Lexchin J, Bero LA, Djulbegovic B, Clark O. Pharmaceutical industry sponsorship and research outcome and quality: systematic review. BMJ. 2003;326(7400):1167-1170.

4. Sharma M, Vadhariya A, Johnson ML, Marcum ZA, Holmes HM. Association between industry payments and prescribing costly medications: an observational study using open payments and Medicare part D data. BMC Health Serv Res. 2018;18:236.

5. Perlis RH, Perlis CS. Physician payments from industry are associated with greater Medicare part D prescribing costs. PLoS One. 2016;11(5):e0155474.

6. Paulos JA. The mathematics of changing your mind. The New York Times. August 5, 2011. Accessed March 17, 2022. https://www.nytimes.com/2011/08/07/books/review/the-theory-that-would-not-die-by-sharon-bertsch-mcgrayne-book-review.html?_r=1&scp=1&sq=thomas%20bayes&st=cse

7. Rothman J. Why is it so hard to be rational? The New Yorker. August 21, 2021. Accessed March 17, 2022. https://www.newyorker.com/magazine/2021/08/23/why-is-it-so-hard-to-be-rational

