
Vaccine Hesitancy Is a 21st-Century Phenomenon

Why Moving from “Prevention” to “Eradication” Changes the Scale of the Anti-Vaccination Problem

Gareth Millward | Apr 16, 2021

In May 2019, the British Health Secretary controversially announced that he was considering mandatory measles vaccination for all children. Two years later, there is widespread debate about COVID-19 “vaccine passports” that would effectively make it compulsory to get vaccinated before going to the pub. Proposals like these are very sensitive. Vaccination has been entirely voluntary in the UK since the 1940s, and compulsion was effectively removed by the early 1900s in response to waves of opposition in the 19th century. And you just don’t get between a Briton and the pub.

This 1963 poster featured the Well Bee, the CDC’s symbol for public health, encouraging Americans to take the oral polio vaccine. Public Health Image Library, public domain.

So why propose such controversial policies? It’s because public health authorities are pushing for universal acceptance of vaccines. Governments, businesses, and the public no longer want merely to control infectious disease. They have become eradicationists. Protecting the people now means eliminating vaccine-preventable diseases. Anything less is failure.

Despite measles’ return to the UK in 2019, vaccine acceptance is, historically speaking, remarkable. In England, 94.5 percent of children received at least one dose before their fifth birthday in 2018–19. But the context has also changed: it is now easier to trace unvaccinated people, the measles vaccine requires a much higher rate of acceptance to achieve herd immunity, and vaccines have a longer record of efficacy and safety. High-profile incidents in the United States, such as the 2015 Disneyland outbreak, have shown that vaccination rates below 95 percent can be deadly. COVID-19 is producing similar anxieties.

In 1945, UK officials were delighted when local authorities achieved a goal of 75 percent diphtheria immunization of preschoolers. But “success” in the 1940s meant reducing diphtheria. Eradication would have been a very welcome side effect; however, it was not the be-all and end-all.

Historians and public health officials can spend all day discussing why certain populations do not vaccinate, the role of partisan politics, sincerely held moral objections, online disinformation, and other factors. But the focus on “why” obscures a deeper phenomenon: vaccine refusal is a bigger problem for public health professionals today than it was just 30 years ago.

We no longer want to control infectious diseases—we want to eradicate them through vaccines.

How can that be? By all measures—at least in England—vaccination rates have increased substantially. But we are no longer asking healthcare workers to vaccinate as many people as possible to lower infection rates. We are asking them to immunize more than 95 percent of the population to eradicate diseases including measles, poliomyelitis, and now COVID-19. In this environment, any individual or small cluster of people refusing vaccination can create a significant barrier to meeting such targets.
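A rough back-of-the-envelope illustration, using the commonly cited basic reproduction number for measles of roughly 12 to 18 (an assumed figure, not one given in this article), shows why the target sits so close to universal coverage. The standard simple formula for the herd immunity threshold is:

\[
\text{herd immunity threshold} = 1 - \frac{1}{R_0}, \qquad R_0 \approx 15 \;\Longrightarrow\; 1 - \frac{1}{15} \approx 0.93.
\]

Toward the upper end of that range of R_0, the threshold approaches 95 percent of the population, which leaves almost no room for refusal, missed appointments, or vaccine failure.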

Try getting 19 out of 20 people to agree on anything. If just 1 in 50 people are “anti-vaxxers,” “covid deniers,” “vaccine hesitant,” or plain scared, public health officials have very little margin for error. Add in groups who are less likely to get vaccines (people who move addresses regularly, do not speak the dominant language, mistrust medical authority, or live in otherwise marginalized communities) and 95 percent becomes an exceedingly difficult target to reach, especially when public health infrastructure has taken a fiscal hammering from the Great Recession and subsequent public expenditure choices.

This is not to say that Jonas Salk, Albert Sabin, and others weren’t dreaming of eradicating polio when they developed their vaccines in the 1950s. Rather, officials’ immediate metrics of success were different. Since the 1980s, however, World Health Organization targets and public pressure on governments to protect people from infection have meant that any cases of vaccine-preventable disease are a source of embarrassment.

Ironically, the public health profession has become a victim of its own success. Medical science has demonstrated that vaccines work and are remarkably safe. Officials have convinced the majority of people that vaccines should be available quickly and to as broad a swath of the public as possible. As sociologist Jacob Heller has explained, publics in the Global North see vaccination as a marker of modernity and advancement, something that separates them from “poor” developing countries, and a rite of passage for young children. Yet a small percentage of citizens has become somewhat complacent about the dangers of certain infectious diseases, having never really experienced them (individually or en masse). For these people, the extremely low risk of vaccine damage looms larger than the risk of the diseases vaccines prevent.


So, at precisely the time that public health officials demand near-universal coverage to achieve eradication, a small but significant chunk of the public requires much more convincing that vaccination is worthwhile. The media often portray the failure to reach this minority as a crisis: the narrative typically focuses on how the continued existence of these diseases means preventable deaths, disability, pain, or temporary economic inactivity, and insists that time is of the essence. It is certainly a failure to meet clearly defined goals, though it may not be a failure of public health.

This was not historically inevitable. It is a result of public health proving its worth to modern nation-states and economies. There are, therefore, two notes of caution to end on. First, the existence of a minority of “hard-to-reach” populations might be used as an excuse for even greater surveillance and policing of their behavior. Historically, such policies have been counterproductive in the long term and do nothing to build trust between health officials and the hesitant public. And, in the case of bad-faith anti-vaxxers, such actions could actually elevate a small minority to a high-profile bogeyman status they do not deserve.

Second, reducing infectious disease without eradicating it is not an absolute failure. It is a noble goal to eradicate these diseases. But letting the perfect be the enemy of the good obscures the progress that can be made against hesitancy by building vaccination and wider public health infrastructure. In the 1940s, for example, British health authorities had far more success making vaccination easier to obtain in poorer neighborhoods and making clinic hours more convenient than they did by scolding working-class mothers. This is a much more difficult, long-term project, and it cannot be left to public health alone. But it is a platform upon which “eradicationists” can build.

Vaccine hesitancy is therefore a 21st-century problem. When judging the breadth and speed of any vaccination program, history shows us that protecting the public’s health is a spectrum, not an absolute.


Gareth Millward is a research fellow at the University of Warwick. He tweets @MillieQED.


Tags: Current Events in Historical Context, Perspectives Daily, History of STEM


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Attribution must provide author name, article title, Perspectives on History, date of publication, and a link to this page. This license applies only to the article, not to text or images used here by permission.
