The perpetuation of biased beliefs
I have written repeatedly about the fact that while homebirth advocates claim to be educated, most of what they "know" about childbirth is factually false. They are easily duped because they lack the most basic knowledge about science, statistics and childbirth itself. However, even when they come into contact with accurate information, they tend not to change their beliefs. That's because people routinely act in ways that perpetuate their own biased beliefs.

An article in the Economic Journal, The Self-Perpetuation of Biased Beliefs, by Wing Suen, lays out the problem:
A basic tenet of science is that the accumulation of evidence will eradicate false beliefs... Alas, this is too optimistic a view for the progress of economics and other social sciences... Why is it that mistaken beliefs seem to have a life of their own, refusing to disappear in the face of accumulated data?

It is well known that people with biased beliefs are more likely to choose sources of information that are biased. That is certainly the case for homebirth advocates. According to the author, if groups of people with differing beliefs are exposed to the same data, both groups will revise their previous views by incorporating the new data. However, when the price of access to the original data is "too high" (if, for example, it requires a level of expertise that is simply not available to the average person), the average person must rely upon an expert to present the information. Moreover, that expert must present it in a way that the average person can understand. Of necessity, that will mean simplifying the information in such a way that much of its original meaning may be lost. Different experts may simplify the same information in different ways, depending on what outcome they wish to ensure.
The failure of data to resolve differences in prior beliefs is not confined to the realm of science. Indeed the failure is so much more severe in other areas of life that many conflicting beliefs seem to be incapable of ever having an empirical resolution. The focus of this paper is not on why people have different beliefs... The real puzzle is why these conflicting beliefs can persist [in the face of new evidence].
For example, homebirth advocates claim that the most recent statistics on US maternal mortality indicate that maternal mortality is rising. Many physicians and scientists (myself included) believe that the most recent data does NOT indicate that maternal mortality is rising. Both groups take the actual data as their starting point, and, indeed, the total maternal mortality rate did rise from one year to the next. However, homebirth advocates, simplifying the data for the layperson, have left out several critical facts. First, birth certificates have been revised to pick up additional causes of maternal death remote from the actual birth. Second, the risk profile of pregnant women has been changing, with more pregnancies to older women and more multiple births. If you have access to the actual data (both because you actually read it and because you are able to interpret it without an intermediary), you will draw one conclusion. If the data is simplified and important factors are left out, the layperson may draw the opposite conclusion.
Suen argues that the desire for experts who will simplify the data to support the preferred bias has a rational (as well as a self-justifying) basis. Of course, people prefer information that reinforces existing beliefs. However, if they are going to receive information that does not confirm existing beliefs, it is more effective to get it from experts that they trust. People reason that if their chosen expert advocates accepting information that subverts existing beliefs, that information must certainly be true.
Suen offers an example:
Consider a person who is predisposed to voting for the Conservative Party. He is not interested in consulting a very left-wing newspaper even if he knows that the newspaper editors possess private information about the candidates. The voter figures that this newspaper will endorse a Labour candidate based on very weak evidence ... Thus the leftist newspaper’s recommendation is of no value to this voter. However this Conservative voter can gain by reading a rightist newspaper. In the (unlikely) event that the rightist newspaper endorses Labour, the voter infers that the newspaper must have received a very strong signal that the Labour candidate is indeed superior. Such information is useful because it will change his vote. The leftist and rightist newspapers may have access to the same information but, because they process the information differently, the Conservative voter prefers learning from the rightist newspaper. Such a rational demand for information from like-minded sources is the key to the theory of self-perpetuating bias.

Similarly, when I point out the blindingly obvious fact that direct entry midwives are undereducated and undertrained, homebirth advocates dispute it. However, when a direct entry midwife, like Kneelingwoman or Navelgazing Midwife, makes the same observation, homebirth advocates pay attention, and often agree.
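The inference in that example is just Bayes' rule applied to an endorsement that was not expected to happen. A toy calculation makes the asymmetry concrete; this is my own illustrative sketch, not Suen's model, and the normal signal distributions, thresholds and prior below are assumptions chosen only to show the direction of the effect.

```python
# Toy Bayesian sketch of the newspaper example (illustrative assumptions only).
# Each paper sees a noisy signal of candidate quality and endorses Labour
# only if that signal clears its own editorial threshold.
from statistics import NormalDist

def posterior_labour_better(prior, threshold, noise_sd=1.0):
    """P(Labour candidate is better | this paper endorsed Labour).

    Assumed signal model: signal ~ Normal(+1, noise_sd) if Labour is better,
    Normal(-1, noise_sd) if the Conservative is better. The paper endorses
    Labour whenever its private signal exceeds `threshold`.
    """
    p_endorse_given_labour_better = 1 - NormalDist(+1, noise_sd).cdf(threshold)
    p_endorse_given_tory_better = 1 - NormalDist(-1, noise_sd).cdf(threshold)
    numerator = prior * p_endorse_given_labour_better
    return numerator / (numerator + (1 - prior) * p_endorse_given_tory_better)

prior = 0.2  # the Conservative voter's prior belief that Labour is better

# Leftist paper endorses Labour even on weak evidence (low threshold):
print(posterior_labour_better(prior, threshold=-1.0))  # ~0.33, not enough to change his vote

# Rightist paper endorses Labour only on strong evidence (high threshold):
print(posterior_labour_better(prior, threshold=+1.0))  # ~0.85, enough to change his vote
```

The same endorsement moves the voter far more when it comes from the paper that was expected to say the opposite, which is exactly why bad news is most persuasive when it arrives from a source biased toward the listener.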
The bottom line is that biased experts can transmit bad news to believers more efficiently than experts who are perceived to be unbiased or biased in the wrong direction. Unfortunately, such occurrences are rare. What does this mean for the average homebirth advocate? It means that when a preferred expert simplifies new information in a way that challenges your existing beliefs, you can assume that the information is correct. However, if the preferred expert simplifies new information in a way that confirms your existing biases, you have no way of knowing if the interpretation is actually correct. In other words, choosing a biased expert may be more efficient in the short run, but is generally inaccurate in the long run.