Dr. Julia Cummiskey, assistant professor of history, is currently completing a post-doctoral fellowship in the Johns Hopkins University School of Medicine Department of the History of Medicine.
Dr. Cummiskey authored the following article as part of a series on “A Historical Guide to Pandemic Responses,” published by the JHU Department of the History of Medicine.
How did we get here, and what could we have done differently? It’s a question many of us are asking in the face of the coronavirus pandemic. Among the answers is a clear lesson from history that we seem to need to learn time after time: the importance of investing in public health agencies so they are equipped to detect, track, and respond to disease. Historian James Colgrove and a team of scientists based in Uganda and Atlanta have highlighted the effects of government investment in public health on the capacity of societies to prevent and respond to health crises.
In Epidemic City, Colgrove describes how choices made by administrators and politicians contributed to the genesis and spread of multidrug-resistant tuberculosis in New York City. In the 1950s and 1960s, before the emergence of the drug-resistant disease, increased access to antibiotic therapies and comprehensive case detection and management led to a decline in tuberculosis cases. As cases declined, federal and state funding for tuberculosis control was cut drastically: public health staff were reassigned from tuberculosis care, clinics were closed, and laboratory testing capacity was diminished. These cuts were not limited to tuberculosis care; they extended to other public health efforts as well, including cervical cancer screening and mental health treatment programs.
These cuts set the stage for a deadly drug-resistant tuberculosis outbreak in the early 1990s.
Compounded by high levels of homelessness and the HIV/AIDS epidemic, tuberculosis went from an afterthought to a crisis. Colgrove argues that the outbreak was a predictable outcome of the cuts to the public health systems designed to detect and prevent transmission of the disease.
In contrast, Shoemaker and his colleagues highlight Uganda’s infrastructure for detecting and containing hemorrhagic fevers as an example of how investments in public health institutions and personnel can avert infectious disease crises. In 2010, the Uganda Ministry of Health launched the Uganda Virus Research Institute’s surveillance and laboratory program in collaboration with the U.S. Centers for Disease Control and Prevention. Through this collaboration, Uganda has reduced the threat that diseases such as Ebola, Marburg virus disease, Crimean-Congo hemorrhagic fever, and Rift Valley fever pose to its citizens. Not only has this program saved lives in Uganda, but it has prevented small outbreaks from escalating into international epidemics.
As we attempt to translate our experiences with COVID-19 into lessons for the future, it is worth considering how cuts to public health institutions such as the CDC, the White House National Security Council Directorate for Global Health Security and Biodefense, and the National Institutes of Health undermine the country’s ability to respond to pandemic threats. Of course, when these systems work best, we are largely unaware of their efficacy—it is difficult to measure lives not lost because of successful prevention. So, for the benefit of our globalized society, it is always time to invest in public health infrastructure.
Sources:
James Colgrove, “Chronicle of an Epidemic Foretold,” in Epidemic City: The Politics of Public Health in New York (New York: Russell Sage Foundation, 2011), 180-212.
Trevor R. Shoemaker, Stephen Balinandi, Alex Tumusiime, et al., “Impact of Enhanced Viral Haemorrhagic Fever Surveillance on Outbreak Detection and Response in Uganda,” The Lancet Infectious Diseases 18, no. 4 (2018): 373-375.