Since the birth of the first inoculation programmes, vaccines have been vehicles of disease prevention viewed with intense suspicion. When Edward Jenner proposed exposure to the cowpox pathogen (a DNA orthopoxvirus) as a means of developing immunity against smallpox (a highly virulent disease caused by a different orthopoxvirus, accounting for between six and ten percent of burials in nineteenth-century Britain), newspapers repeatedly published satirical cartoons depicting half-human, half-cow chimaeras wreaking havoc upon society, alongside reports claiming that vaccinated individuals had developed horns and other bovine characteristics. Public attitudes towards vaccination mellowed throughout the second half of the twentieth century (a thawing likely linked to the total eradication of smallpox via widespread vaccination, announced by the World Health Organisation in 1980), but the 1998 publication of 'The Wakefield Study' in the respected medical journal The Lancet breathed new life into the 'anti-vax' movement. Andrew Wakefield, since struck off the medical register, used the journal as a platform to broadcast his false claim that the MMR vaccine was causally linked to the development of autism. The results of this controversial study have since been disproven, and the paper was fully retracted by The Lancet in 2010, with Wakefield found to have engaged in several ethical violations and data manipulation; yet the concern about the safety of childhood vaccination his research generated remains pervasive. When vaccines for the coronavirus were produced in late 2020, one would not be remiss in comparing the contemporary public response to that witnessed by Edward Jenner in 1796. The popular pro-choice slogan "my body, my choice" was co-opted by Republican pundits in response to increasingly pro-vaccination campaigns funded by the Biden administration, and certain politicians began to question, or outright spurn, the necessity of mask and vaccination mandates.
Additionally, questions were raised about the speed with which the COVID-19 vaccines were developed, with around a year between the sequencing of the pathogen's genome and the first large-scale public vaccination efforts, which began in the United Kingdom. Vaccine development is typically expected to take far longer, with substantial time required for pathogen isolation, genome sequencing, safety testing in animal models, testing in humans, and larger clinical trials. The smallpox vaccine, whose development was a watershed moment in medical history, took several decades to reach widespread use, with Edward Jenner's eighteenth-century model being refined throughout the nineteenth century.
However, the difference in development speed is not as concerning as certain media outlets might suggest. For one thing, the initial phases of vaccine development were accelerated by the rapid sequencing of the SARS-CoV-2 genome by Chinese scientists just a month after the first documented outbreak in Wuhan. Furthermore, outbreaks of SARS and MERS earlier in the twenty-first century had produced a vast breadth of data on similar pathogens (e.g. SARS-CoV-1, which caused a comparatively brief epidemic in 2003). In response to the rapid intercontinental spread of the coronavirus, the U.S. government approved Operation Warp Speed in May 2020, an initiative whose primary aim was to develop a safe and effective vaccine and make it available for use as soon as possible. Operation Warp Speed encouraged pharmaceutical companies to merge the phases of clinical testing, allowing the effectiveness, potential side effects and optimum dosage of a vaccine to be explored simultaneously. Vaccine and drug development trials are often delayed by difficulties in recruiting volunteers; however, as the coronavirus spread rapidly across the globe, leaving no substantially inhabited country untouched, these companies had no shortage of individuals on whom the vaccine could be tested.
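The time saved by merging trial phases can be illustrated with a back-of-the-envelope calculation. The phase durations and the two-month stagger below are hypothetical round numbers for illustration, not figures from any actual COVID-19 trial:

```python
# Hypothetical trial phase durations in months (illustrative only).
phases = [("phase I", 4), ("phase II", 8), ("phase III", 12)]

# Traditional model: each phase starts only after the previous one ends.
sequential_total = sum(duration for _, duration in phases)

# Merged model: each phase begins a fixed interval after the previous one
# starts, once early safety data look acceptable (hypothetical 2-month lag).
lag = 2
starts = [i * lag for i in range(len(phases))]
merged_total = max(start + duration
                   for start, (_, duration) in zip(starts, phases))

print(f"sequential trials: {sequential_total} months")   # 24 months
print(f"overlapping trials: {merged_total} months")      # 16 months
```

Even in this toy model, overlapping the phases cuts a third off the timeline; the trade-off is that later phases begin before earlier ones have fully reported, which is why regulators monitored the merged trials so closely.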
The rapid expansion of the biotech industry also shifted the goalposts of the vaccine development timeline. Where vaccines had previously relied upon killed or live attenuated versions of a pathogen, pharmaceutical companies such as Moderna had been pioneering the development of mRNA vaccines, which can be designed rapidly once the viral genome has been sequenced. The mechanism of an mRNA vaccine is complex but well suited to patient safety; although direct comparisons with live attenuated vaccines vary, mRNA vaccines tend to have fewer severe side effects. Nucleoside-modified mRNA encapsulated in lipid nanoparticles is delivered via intramuscular injection, whereupon the mRNA is translated by the ribosomes of the recipient's cells into a harmless, stabilised copy of a viral surface protein (in the case of SARS-CoV-2, the full-length spike protein used by the virus to bind to ACE2 receptors on human cells). This protein, in turn, stimulates an immune response, allowing active immunity to develop against the pathogen in question.
The 100 Days Mission, as outlined by the Coalition for Epidemic Preparedness Innovations (CEPI), proposes that had the timeline of coronavirus vaccine production been shortened from over three hundred days to just one hundred, tens of millions of infections, and possibly hundreds of thousands of deaths, could have been prevented. Given that SARS-CoV-2 vaccine development under Operation Warp Speed already saw international bodies facilitating collaborative research and development with the assistance of regulators such as the FDA and EMA, the continued rapid growth of the biotech industry could render the notion of a hundred-day vaccine more realistic than ever before.
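The intuition behind CEPI's claim is that infections compound exponentially, so every day shaved off the vaccine timeline averts disproportionately many cases. The sketch below is not CEPI's model: the growth rate, seed count, and the crude assumption that vaccination halts transmission outright are all hypothetical, chosen only to show the order-of-magnitude effect:

```python
# Toy model: cumulative infections under unchecked exponential growth,
# halted when a vaccine arrives. All parameters are hypothetical.
def cumulative_infections(vaccine_day, daily_growth=1.03, seed_cases=100):
    """Sum daily new infections up to vaccine_day, assuming (crudely)
    that transmission stops entirely once a vaccine is deployed."""
    total, cases = 0.0, float(seed_cases)
    for _ in range(vaccine_day):
        total += cases
        cases *= daily_growth  # 3% more new infections each day
    return int(total)

slow = cumulative_infections(vaccine_day=300)
fast = cumulative_infections(vaccine_day=100)
print(f"vaccine at day 300: {slow:,} cumulative infections")
print(f"vaccine at day 100: {fast:,} cumulative infections")
print(f"averted by the faster timeline: {slow - fast:,}")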
The development of mRNA vaccine technology has significantly altered the pharmaceutical landscape, allowing novel vaccines to be created using pre-authorised platform technologies and effectively sidestepping what might otherwise be a prolonged regulatory delay in the production process. Furthermore, the cooperation of international regulatory bodies in allowing 'rolling reviews' of data from the various phases of clinical trials enables pharmaceutical companies to produce large quantities of a vaccine prior to full approval (a practice referred to as 'at-risk manufacturing') with reasonable assurance that the vaccine is likely to be approved for distribution. As the faltering vaccination rollout suffered by South Africa in 2021 demonstrated, flaws in the medical supply chain can cost an untold number of lives. For this reason, any organisation pursuing a 100-day vaccine must invest in a pre-existing and reliable supply chain that allows for the rapid international distribution of vaccines in the event of a pandemic. Through international cooperation and the establishment of vaccine manufacturing hubs or advance purchase agreements, a robust supply chain could allow vaccines to be distributed globally within days of approval. Policymaking and social initiatives also play a significant role in the success and speed of vaccine rollouts: public information campaigns that emphasise the safety of vaccine technologies, simplify scientific terminology for the general public, and provide translations for those who cannot read or understand English have been documented as increasing people's willingness to be vaccinated.
Additionally, one of the most significant factors in enabling the sequencing of a pathogen's genome during the early stages of a pandemic, and subsequently producing an mRNA vaccine in under 100 days, is the presence of early warning infrastructure. As mentioned in previous discussions, AI-powered early warning systems such as BlueDot were able to notify clients of the incoming coronavirus pandemic days before the World Health Organization (WHO) made its official announcement. Pathogen-agnostic technologies like metagenomic sequencing, as seen in the development of systems like NAO's wastewater analysis technology, have already far outstripped those available just four years ago. In the next few years, international investment and domestic resource allocation towards consistent and reliable early warning systems, alongside research into biotechnology capable of detecting a pandemic pathogen before widespread infection, could prove invaluable in preparing for future existential biological risks, regardless of whether those risks emerge naturally (e.g. from a poorly managed wet market in Wuhan) or from a deliberately disseminated bioweapon.
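The core idea of such early warning systems can be sketched as anomaly detection on a surveillance signal. Real systems such as BlueDot fuse many data streams (news reports, flight data, official bulletins) with proprietary models; the single daily series and the threshold rule below are invented purely for illustration:

```python
# Minimal sketch of an early-warning trigger on one syndromic signal.
# The data series and all parameters are hypothetical.
from statistics import mean, stdev

def first_alert_day(daily_counts, baseline_window=14, threshold_sigma=3.0):
    """Return the first day whose count exceeds the baseline mean plus
    threshold_sigma standard deviations (floored at 1.0 so near-constant
    baselines do not fire on trivial noise), or None if no day qualifies."""
    for day in range(baseline_window, len(daily_counts)):
        window = daily_counts[day - baseline_window : day]
        mu, sigma = mean(window), stdev(window)
        if daily_counts[day] > mu + threshold_sigma * max(sigma, 1.0):
            return day
    return None

# Hypothetical daily case reports: stable noise, then exponential growth.
signal = [5, 6, 4, 5, 7, 5, 6, 4, 5, 6, 5, 4, 6, 5, 8, 12, 18, 27, 41, 62]
print(first_alert_day(signal))  # flags day 15, early in the growth phase
```

Even this crude rule fires within a couple of days of the growth phase beginning; the engineering challenge in practice is sustaining clean, timely input data, which is why the prose above emphasises investment in consistent surveillance infrastructure.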