Vaccination in today’s world can evoke deeply felt, often contentious emotional responses. Conflicts over vaccine regimens, whether mandatory or recommended, have divided relatives and friends, caused families to be removed from clinic rolls, and inflamed legislative debates. In the United States, meetings of federal advisory committees charged with developing policies for vaccine use have become occasions for theatrical and highly emotive displays of dissent (Wadman, 2019). Public health professionals, concerned about herd immunity and seemingly vulnerable childhood vaccination programs, have joined with vaccine-promoting parents, scientists, scholars, and clinicians to address vaccine-dissenting views that spread across social media and through other social networks. Misinformation has emerged as the primary frame for both research on the topic and op-ed articles addressing the problem of vaccine skepticism and refusal (e.g., Armstrong & Naylor, 2019; Carroll, 2019; Chou et al., 2018; Merchant & Asch, 2018; Perakslis & Califf, 2019; Wadman, 2020).
As a framework for understanding how dissenting ideas spread and gain influence, misinformation has some advantages: it suggests broad dissemination of nonfactual ideas, and it produces a corrective measure – education – to address people’s ill-advised acceptance of such ideas. A deficit in science education, in particular, is often singled out as the one needing to be rectified – vaccine dissenters are not scientifically literate, it is argued, and must be taught to understand vaccination recommendations and the evidentiary bases for their safety and efficacy (Poland & Jacobson, 2011). Such an approach to vaccine dissent proffers a solution based in science’s social and epistemological authority, one reflected in the authoritative information flows that characterize vaccine promotion.
In addition, in its common usage against vaccine dissent, misinformation often implies disinformation, a systematic effort to misinform an unsuspecting public for nefarious ends. Through its prevalent usage to refer to vaccine-dissenting ideas, misinformation as a term has come to consolidate and simplify varied dissenting perspectives and the ideas on which they are based. In this way, misinformation has become a metaphor for all vaccine dissent, regardless of the heterogeneity of what vaccine dissenters actually espouse and whether some of their resistance is based on factual, more widely accepted ideas.
This latter problem illuminates the weaknesses of the misinformation frame for understanding vaccine dissent and, more significantly, for understanding how to address challenges to vaccination programs and the spread of vaccine-preventable diseases (VPDs). As long as the misinformation frame dominates news reporting, public health approaches, and television punditry, vaccine dissent will be singled out as the primary challenge to the success of vaccination programs and the primary cause of outbreaks of VPDs. Misinformed parents will be seen as the main obstacle to high vaccination rates, obscuring other contributors such as limited access to health care, poverty, and discrimination. The misinformation frame also assumes that alternative ideas about the body, health, immunity, and illness – often framed in nonwestern or at least nonallopathic medical traditions – are simply wrong, rather than part of a different world view. The misinformation frame construes vaccine dissenters as either vicious superspreaders of wrong information that is dangerous for society or the gullible consumers of such information. In this view, education as a primary goal of vaccine programs supersedes any attempt to identify and address dissenters’ alienation from mainstream public health recommendations and laws mandating immunization.
Further, the misinformation frame is based on the idea that science is the sole domain of truth. In this framing, science guides human behavior toward triumphant achievements over nature and tradition, especially myths. It is the latest twist in an old story – the Enlightenment triumph of reason over religion and belief. In the current version, the Internet and, especially, social media, amplify this persistent problem – the tendency of people to believe things that are patently false and bad for them. The way these bad ideas circulate – and the impossibility of controlling that circulation or banning the ideas – strikes many as dangerous to the very foundations of modern societies: belief in reason as the basis for sound democratic governments and strong civil societies. Roiling conspiracy theories and patently false ideas masquerading as truths undermine social functioning, public health, and the cohesiveness of even radically pluralistic societies.
What if this typical framing of misinformation is wrong? If the frame misrepresents the problem, then it also leads to ineffective solutions. I have come across two approaches that offer viable alternatives to the dominant treatment of misinformation today and that address weaknesses in the typical misinformation frame.
The first is explicated by Cailin O’Connor and James Owen Weatherall in The Misinformation Age: How False Beliefs Spread (2019). The main thrust of the book is that what the authors call “false beliefs” are often produced when people seek good information (“true beliefs”). Even scientists create and propagate false beliefs in the ordinary course of experimentation and the dissemination of results within science. Coherence around “true beliefs” (those that are supported by evidence) depends on trusting colleagues and others in the social networks within which information circulates. Sometimes by accident, and sometimes by propagandistic design, wrong results seem right and false beliefs proliferate. Sometimes the problem is too many researchers (with small pots of data); sometimes the problem is too few researchers and inadequate research design.
As a result, people behaving perfectly rationally can create and propagate nontruths. How far they get disseminated depends in part on trust. Polarization – of scientific belief and in civil society – can become established as groups gradually move apart in their interpretation of evidence and their trust in others. The erosion of trust is particularly caustic – it makes building consensus nearly impossible. O’Connor and Weatherall (2019) discuss two ways polarization can occur: through conformity bias and through mistrust. Conformity bias names a tendency to align with existing ideas and practices – it is the pull of inertia dampening new ideas. It can be challenged by breaking up social networks and infusing new people into the conversation. Countering mistrust, by contrast, means finding shared language or beliefs to establish common ground, from which persuasion at least has some chance.
In their model-driven analysis, O’Connor and Weatherall (2019) show that misinformation can be a typical outcome of good-faith information-seeking practices if social networks do not – for a variety of reasons – support consensus around the truth. Their solution to this problem in science is to avoid the most egregious bias-producing activities – money from industry, for example. In civil society, they suggest that when news organizations augment reporting on issues and incidents with fact checking and the refutation of false claims, they can inadvertently amplify misinformation, just as the Internet seems to amplify false truths. O’Connor and Weatherall’s (2019) remedy is to split those functions and have news organizations recalibrate to report on “real stories of independent interest” (p. 169) and leave the fact checking to other organizations. The point here is that activities based in truth-seeking can inadvertently support the dissemination of nontruths, even when reasonable efforts are made to be clear and factual. Continually exhorting people to avoid and repudiate misinformation is difficult when such information circulates all the time from disparate points of view. In addition, when influence and funding are measured by clicks in addition to paid subscriptions, when venerable news sources expand into life advice and product reviews, and when cable news outlets routinely mix anchors’ viewpoints with news reporting, readers have a hard time distinguishing factual reporting from the expression of opinions, making the idea of a “true belief” harder to define.
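To make the mechanism concrete, consider a minimal sketch in Python. It is a deliberately simplified toy, not O’Connor and Weatherall’s actual models; the agent count, trial numbers, success rates, and mistrust parameter are illustrative assumptions. Agents test a new practice, share their results, and discount evidence from others in proportion to how far those others’ beliefs sit from their own.

```python
# Toy illustration (an assumed simplification, not the authors' models) of how
# distance-based mistrust can block consensus among rational belief-updaters.
import random

N_AGENTS = 20               # assumed population size
TRIALS_PER_ROUND = 10       # assumed experiments per believer per round
P_GOOD, P_BAD = 0.55, 0.45  # assumed success rates under the rival hypotheses
ROUNDS = 200                # assumed number of rounds


def bayes_update(credence, successes, trials):
    """Bayesian update of the credence that the new practice is the better one."""
    like_good = (P_GOOD ** successes) * ((1 - P_GOOD) ** (trials - successes))
    like_bad = (P_BAD ** successes) * ((1 - P_BAD) ** (trials - successes))
    return credence * like_good / (credence * like_good + (1 - credence) * like_bad)


def simulate(mistrust=2.0, seed=0):
    """Higher `mistrust` makes trust fall off faster as credences diverge."""
    random.seed(seed)
    credences = [random.random() for _ in range(N_AGENTS)]
    for _ in range(ROUNDS):
        # Only agents who already lean toward the new practice try it and report results.
        reports = []
        for c in credences:
            if c > 0.5:
                wins = sum(random.random() < P_GOOD for _ in range(TRIALS_PER_ROUND))
                reports.append((c, wins))
        # Everyone updates on the shared reports, but evidence from agents whose
        # beliefs are far from one's own is partly or wholly discounted.
        updated = []
        for c in credences:
            for reporter_credence, wins in reports:
                trust = max(0.0, 1.0 - mistrust * abs(c - reporter_credence))
                c = trust * bayes_update(c, wins, TRIALS_PER_ROUND) + (1.0 - trust) * c
            updated.append(c)
        credences = updated
    return credences


if __name__ == "__main__":
    # With strong distance-based mistrust, some agents end up ignoring the
    # experimenters' evidence entirely and never converge; with mistrust at 0,
    # everyone tends to converge on the better-supported view.
    print(sorted(round(c, 2) for c in simulate(mistrust=2.0)))
    print(sorted(round(c, 2) for c in simulate(mistrust=0.0)))
```

The specific numbers matter less than the structural point the sketch is meant to display: every agent updates rationally on the evidence it accepts, yet when trust decays with disagreement, stable opposed camps can form; when it does not, the group converges.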
O’Connor and Weatherall (2019) argue that there has always been “fake news” – what distinguishes our present time is its capacity to be disseminated quickly and widely via the Internet. But the Internet is only an amplifier of other phenomena. This leads me to the second approach: how a history of reading, and of the technologies that attend it, reveals an elite desire to control knowledge-seeking and knowledge dissemination.
The history of reading, exemplified in Frank Furedi’s The Power of Reading: From Socrates to Twitter (2015), alerts us to the persistent concern of elites that new technologies – the printing press, paperback books, the Internet – strain existing mechanisms for controlling nonelites and their access to information. In this framing, misinformation is the name given to beliefs and ideas that contradict those of the educated and governing elites, those technocratic authorities who run governments, write policy reports, direct public health, and lead universities. Identifying something as misinformation becomes a way of identifying it as a wrong belief that goes against the status quo. Educating people against these wrong beliefs is a way of maintaining control over their behavior and their compliance with government mandates or advice. And concerns about how nonelites circulate information outside of authorized and surveilled contexts are just that – concerns about an emerging lack of elite control over what constitutes truth, or facts, or evidence.
The misinformation frame clearly represents the old epistemological system pushing against a newly emerging and dynamic information space. As such, it offers little purchase on how to address ideas that circulate outside the recognized verifications of peer review and professional credibility.
If false beliefs or misinformation can be created by the ordinary processes of science or the reasonable information-seeking behaviors of rational adults, then we cannot use misinformation as a catchall term for why some people are duped into believing things. Indeed, such an approach suggests that our worlds are awash in true and false beliefs and that we are all engaged in efforts to identify which is which. That we disagree about what we believe is a natural and expected feature of the influence of our social networks on our views. What is distinct now is a growing intolerance for nonstandard views, especially from the frustrated viewpoints of elites who are tired of policing the deluge of information that constitutes the Internet for those wrong beliefs that they believe are dangerous to humankind.
That such an effort is futile should be clear by now. There is only so much handwringing one can do about “those people” who will not listen to reason. In the U.S., an extremely polarized society, there are increasing calls to force people to vaccinate (and to remove so-called misinformation from social media), because dissenters are perceived to be “immune to reason” (Allen, 2008). This is how the misinformation frame, typically understood, leads to what dissenters experience as violence against their bodies.
If the reframing of misinformation that I have explored here has purchase in the education domain, it is in the cultivation of empathy as a goal. O’Connor and Weatherall (2019) follow a Burkean model of persuasion through identification, which is a form of empathy (Burke, 1950/1969). The history of reading and of anxieties around developing technologies reveals that elites are interested in the control of information, which suggests a need for greater empathy toward the views and experiences of nonelites. Both perspectives underscore the importance of understanding different points of view and different experiences – in other words, empathy. One place to start is to emphasize learning about others and the plurality of human beliefs and practices. Perhaps combating creeping tribalism and its attendant narrowing of social networks offers a way to embark on this project. In this sense, the problem is less misinformation than our own narrow social connections, our growing mistrust of those different from ourselves, and the increasing inequality that further differentiates professional and educated elites from others.
All of which is to say that those fields that contribute toward empathetic understanding of others – sociology, anthropology, and the study of literature, for example – provide epistemological grounding for revising our current educational emphases on STEM fields. In an important sense, it is not more scientists who will fix the world but more humanists, given that social relations and the conflicts in belief and practice that flow from them are some of the biggest challenges that we face in the coming decades. As the response to Covid-19 has shown us, a crucial part of the public health response to pandemic infectious disease is managing the disruption to social relations and economic activity caused by physical distancing measures. While the world waits for a vaccine (a technical solution that may or may not be successful in the short run), it is worth considering how understanding and changing human interaction patterns demands deep and abiding knowledge of culture, behavior, and belief.
Empathy, based on greater knowledge about the lives of others and their intrinsic humanity, is needed in other areas as well. Recent protests in the U.S. against police brutality and for African American lives have added to the sense that many of us have been caught unawares about the very nature of our social existence and the differences in basic experiences and expectations that people of color have been trying to raise to consciousness for decades. White Americans have had a significant and serious reckoning with privilege that has registered more widely and more deeply than anything since the Civil Rights movement of the 1950s and 1960s. We have learned that we have been misinformed (and have been spreading misinformation) about pretty much everything having to do with our own experience. The portrayal of the ordinary activities of “living while black” has revealed life itself to be intrinsically dangerous for African Americans. We simply “did not know,” although we stand rightly accused of purposefully turning away from the truth. As we unpeel the layers of not-knowing that have characterized white American experience, we can think about all of those “others” whom we understand to be misinformed, dangerous to “our” existence, and easily swayed by charlatans and fakes, and be more humble about our own grasp of the truth.
References
Allen, A. (2008, September 1). Immune to reason: Are vaccine skeptics putting your kids at risk? The Free Library. https://www.thefreelibrary.com/Immune+to+reason%3a+are+vaccine+skeptics+putting+your+kids+at+risk%3f-a0188063361
Armstrong, P. W., & Naylor, C. D. (2019). Counteracting health misinformation: A role for medical journals? JAMA, 321(19), 1863–1864. https://doi.org/10.1001/jama.2019.5168
Burke, K. (1969). A rhetoric of motives. University of California Press. (Original work published 1950)
Carroll, A. (2019, July 22). Health facts aren’t enough. Should persuasion become a priority? New York Times. https://www.nytimes.com/2019/07/22/upshot/health-facts-importance-persuasion.html
Chou, W. S., Oh, A., & Klein, W. M. P. (2018). Addressing health-related misinformation on social media. JAMA, 320(23), 2417–2418. https://doi.org/10.1001/jama.2018.16865
Furedi, F. (2015). The power of reading: From Socrates to Twitter. Bloomsbury.
Merchant, R. M., & Asch, D. A. (2018). Protecting the value of medical science in the age of social media and “fake news.” JAMA, 320(23), 2415–2416. https://doi.org/10.1001/jama.2018.18416
O’Connor, C., & Weatherall, J. O. (2019). The misinformation age: How false beliefs spread. Yale University Press.
Perakslis, E., & Califf, R. M. (2019). Employ cybersecurity techniques against the threat of medical misinformation. JAMA, 322(3), 207–208. https://doi.org/10.1001/jama.2019.6857
Poland, G. A., & Jacobson, R. M. (2011). The age-old struggle against the antivaccinationists. The New England Journal of Medicine, 364(2), 97–99. https://doi.org/10.1056/NEJMp1010594
Wadman, M. (2019). Vaccination opponents target CDC panel. Science, 363(6431), 1024. https://doi.org/10.1126/science.363.6431.1024
Wadman, M. (2020). Antivaccine forces gaining online. Science, 368(6492), 699. https://doi.org/10.1126/science.368.6492.699
Recommended Citation
Hausman, B. L. (2020). Against misinformation. On Education. Journal for Research and Debate, 3(8). https://doi.org/10.17899/on_ed.2020.8.3