“Science Gone Wrong”?

By Guy Higgins

I’ve just finished reading a book by Dr. Paul Offit, Pandora’s Lab: Seven Stories of Science Gone Wrong. The seven stories (and they really are well-told stories) capture the histories of seven episodes involving science. Those episodes, in order, cover:

  • The centuries-long creation of the opioid epidemic
  • The decades-long creation of the trans fat health hazard
  • The two sides of the discovery of a way to make atmospheric nitrogen chemically useable
  • The lengthy history of how genetics begat eugenics (simplistically, the idea that “inferior” humans should not have children)
  • The decades-long rise and fall of surgical lobotomies
  • The forty-year creation of an epidemic that has killed millions of people
  • The continuing health risks created when genius colors outside the lines

As I was reading the book, it occurred to me that the title was, essentially, clickbait. Dr. Offit wasn’t telling stories about science going astray, but rather about how human cognitive biases can result in terrible decisions – decisions that cost human lives in the thousands and millions. I’m writing about this because the same biases that led people to do unconscionable harm in these seven cases are present among us and can influence the decisions that we, as leaders, make every day. There is nothing magic about you and me that eliminates the risk these biases pose. So, let’s look at those seven episodes and the biases that were present:

  • Opioids – Humans have known, for millennia, about opium and the addiction danger it poses. Starting in the late nineteenth century and continuing through the present, chemists and doctors have tried to discover a way to eliminate the addictive properties of opium, its derivatives, and its synthetic progeny. They have uniformly failed. The use of opioid painkillers continues to present a risk of addiction – even though chemists and doctors have repeatedly believed they had eliminated that risk. This is the problem of, “I know everyone else has failed, but I’m smart and I have done it. I have the data.” The biases present here are overconfidence and, often, confirmation bias. Overconfidence is the belief that “I’m smart enough to succeed even though everyone else has failed,” compounded by limited testing (“I know I succeeded, so I don’t need as many tests”). Confirmation bias is cherry-picking the data that support your foregone conclusion that you succeeded.
  • Trans fats – Studies of heart disease in the early and mid-twentieth century showed a correlation between diets high in fats (these were naturally occurring fats) and an increase in heart disease among the U.S. population. Mind you, correlation does not mean causation! Preceding these studies, we (smart humans that we are) had invented ways to modify naturally occurring fats to make them presumably less risky. Many nutritional scientists and doctors complained that the data were ambiguous and supported contradictory conclusions. Nonetheless, the U.S. Congress became interested and chartered a committee to explore the issue and recommend any needed government action. Here’s where things went kattywhumpus (a technical term). The Congressional staffer charged with pulling information together knew precisely zero point nothing about biology, chemistry, or diet, so he went to a prestigious university to talk to a (as in one and only one) highly regarded scientist – who just happened to be a zealous advocate of low-fat diets. That scientist provided selected data and highly technical explanations of why diets high in fats were causing the increase in heart disease. This became the starting position for the Congressional committee. The committee, responsibly, heard from both advocates and opponents, but decided that high-fat diets were bad, leading the government to indict some fats (like butter) as unhealthy and to recommend that they be replaced with supposedly more heart-healthy fats (such as oleo). The problem these pronouncements created was that not all fats are equal – some fats are relatively benign and some are seriously unhealthy. The unhealthy fats are trans fats, and these are not present in naturally occurring fats – like butter – but are created by the partial hydrogenation of fats to make things like oleo. Trans fats are seriously bad, but it took another fifteen or so years for studies to tease that data out and present it in a compelling fashion.
It took the government much longer still to change its outlook on things like butter and oleo. The biases here were, again, overconfidence (we understand the data better than you challengers), confirmation bias (our studies show…), and a new one – the status quo bias. The status quo bias describes our preference for not changing the current situation – this was a major factor in the government’s desultory approach to revising its dietary recommendations.
  • “Fixing” atmospheric nitrogen – The bottleneck in agriculture isn’t arable land, or water, or temperate climate – it’s the amount of nitrogen (as nitrates) available to plant life. Elemental nitrogen reacts chemically only with great difficulty. For all practical purposes, the gazillions of tons of atmospheric nitrogen are completely unavailable for agricultural (or other) purposes. Over a hundred years ago, a German chemist and his industrial-engineer partner discovered a practical way to “fix” nitrogen – to make it react with hydrogen to create ammonia, which can then be made into fertilizer. This is seriously important – without this discovery, about half of the people alive today would not be alive, because there would not have been enough food to support the population growth of the last century. The problem is that this process made these fertilizers so cheap and so available that they have been over-used and are creating biohazards as they wash into rivers, lakes, and the ocean. Dr. Offit asserts that this is, in fact, an existential problem. The bias (and it’s hard to blame people a century ago for not foreseeing the issues of today) is what Daniel Kahneman calls “What You See Is All There Is” (WYSIATI). No one even tried to think through what the profligate use of these fertilizers would mean.
  • Eugenics – This is a case of “no idea is immune to being misunderstood and misapplied.” The science of genetics was started by Gregor Mendel’s famous pea experiments and then found its way into Charles Darwin’s Theory of Evolution. That much is solid science. The problem came about when people started to associate behaviors and pathologies with genetics and then extended those thoughts to the idea that, “If we can breed better horses, we can breed better humans.” It doesn’t take much imagination to see how that thought gets you to a policy of mandatory sterilization for people deemed “inferior breeding stock.” The problem, of course, is that behavior is a “nurture” phenomenon – not a “nature” one. Again, the bias involved in this chain of reasoning is almost completely the confirmation bias.
  • Lobotomies – A lobotomy is a “surgical” procedure that is, for all practical purposes, identical to serious brain trauma. It involved, basically, uninformed cutting or disruption of the cerebrum. The problem with the conclusion that such a procedure can cure neurological disorders is akin to concluding that you can fix an antique watch with a chain saw. The biases involved in the spread of the idea that lobotomies were an acceptable practice would seem to be a complex amalgam of confirmation bias, overconfidence, and WYSIATI, with a large contribution from the champion bias. The champion bias comes into play when a well-respected individual strongly proposes a concept. As humans, we are inclined to accept the champion’s proposal because of his respected position and the strength of his support.
  • The malaria epidemic – In the forty years since dichlorodiphenyltrichloroethane (DDT) was banned, approximately 50,000,000 people have died from malaria. In the thirty years before the ban, cases of and deaths from malaria plummeted globally. In fact, malaria was eliminated, entirely, from eleven nations. So why was it banned? It was banned because one person, an extremely accomplished writer and well-respected scientist, took the position that DDT was destroying the Eden-like environment and championed its banning exceptionally strongly and effectively. Today, we understand that the very limited data used to support that position were carefully chosen to present a picture that was false. Nonetheless, the strength of the champion and the eloquence with which the picture was presented convinced the press, the public, and the government. DDT was banned and remains, today, anathema. The biases involved were the champion bias and groupthink. Groupthink occurs when people are reluctant to challenge an idea because the benefit of belonging to the group is greater than the benefit to be gained by successfully opposing the idea. In mass movements, it can be extremely powerful.
  • Vitamin C – Like all vitamins, vitamin C is critical to human health. In 1966, the only man ever to win two individual Nobel Prizes was told that taking enormous doses of vitamin C would improve his health and extend his life. This enormously intelligent and disciplined scientist accepted that assertion at face value – even though the person who made it had neither credentials nor supporting data. The Nobel Laureate subsequently wrote books championing megadoses of vitamin C. Because of his international standing as a reputable scientist, he garnered enormous publicity and acceptance of his ideas. The problem, of course, is that the data emphatically do not support any benefit from taking high doses of any vitamins – in fact, there are significant data indicating increased risk of various cancers associated with such large doses. Two biases dominate in this case: the champion bias and the halo bias. The halo bias exists when we attribute expertise to a person in some area simply because they are a recognized expert in another area.

None of these biases – champion, confirmation, overconfidence, groupthink, WYSIATI, or halo – is unique to these seven episodes or to science in general. They are common biases that are present all the time. As leaders, we need to remain aware of the potential problems that a failure to recognize and mitigate these biases can cause. Nobel Laureate Daniel Kahneman believes that the strongest hope for recognizing and overcoming these and other cognitive biases lies in our human ability to work together as a team, leveraging our different perspectives, approaches to problem solving, and techniques for making predictions. We, as leaders, need to step up and foster the group dynamics that will enable our teams to overcome these biases and make better decisions.

Thoughts?
