You’ve probably heard of The Cat in the Hat, but you may not have heard of “Bloodthirsty Ann.” Ann, another character created by Ted Geisel (aka Dr. Seuss), was a malaria-transmitting mosquito who first appeared in a 1943 booklet for American troops during World War II.
By the time Bloodthirsty Ann was introduced to readers, malaria had long been present in the southeastern United States. Cases of the deadly disease rose during the Great Depression and began to wane in the early 1940s. As the country mobilized for World War II, officials grew concerned about preventing the disease’s spread in military training camps, many of which were in the southern states and in overseas territories.
In 1942, the U.S. Public Health Service established the Office of Malaria Control in War Areas to address the issue. The Atlanta-based office was the forerunner of the CDC, which opened in 1946 as the Communicable Disease Center (now the Centers for Disease Control and Prevention). The office’s efforts may have helped end malaria transmission in the United States by the early 1950s—though modern scholars have questioned whether demographic and socioeconomic changes played a larger role in the disease’s decline.
War Highlights Malaria in the U.S.
Historically, disease has been a major cause of death in wars. During the U.S. Civil War, tens of thousands of soldiers died from diseases like typhoid, pneumonia, measles and malaria. Scholars have also estimated that during World War I, more soldiers died from influenza than from combat.
Disease control was a huge concern for the United States when it entered World War II. To prevent deaths from bacterial infections, the country mobilized to produce doses of penicillin. To avoid the kind of flu pandemic seen during World War I, the United States funded research into the world’s first flu vaccines. In addition, the country developed programs to combat malaria, a deadly disease caused by parasites and spread via the Anopheles mosquito.
“These efforts during the war were actually quite successful on a number of fronts,” says Leo B. Slater, a former historian at the National Science Foundation and author of War and Disease: Biomedical Research on Malaria in the Twentieth Century. “World War II was the first major conflict that the country engaged in where casualties caused by disease were lower than casualties caused by combat.”
Captain Theodor Seuss Geisel’s “Bloodthirsty Ann” booklet was part of the U.S. Army’s malaria response. The booklet educated soldiers about malaria and how to avoid mosquito bites using bed netting and insect repellent. In addition to the Army’s efforts, the U.S. Public Health Service opened the Office of Malaria Control in War Areas, the predecessor to the CDC.
The Office of Malaria Control in War Areas (MCWA) opened in 1942 in Atlanta, Georgia. The new office focused on draining and destroying mosquito breeding sites and spraying insecticide, and it taught state and local health departments how to use these methods. Around 1943, it began applying a newer insecticide called DDT inside people’s homes to keep the dwellings mosquito-free (the United States banned DDT in 1972 because of its long-term environmental effects).
Focus Broadens to Become CDC After War
Like many wartime offices, the Office of Malaria Control in War Areas was set to close down when World War II ended. However, a physician named Joseph Mountin stepped in to expand the office into a center that focused on multiple diseases.
Mountin, who worked for the Bureau of State Services within the Public Health Service at the time, “decided that MCWA should do more than just malaria,” says Judy Gantt, director of the David J. Sencer CDC Museum in Atlanta. “And so in 1946, it became the Communicable Disease Center.”
The CDC continued some of the anti-malaria efforts of its forerunner while also tackling other diseases like typhus and hookworm. Starting in 1947, the CDC’s National Malaria Eradication Program worked in collaboration with state and local health departments to continue destroying mosquito breeding sites and spraying insecticide.
Malaria Transmission Ends in the U.S.
By 1951, the CDC considered malaria transmission eliminated in the United States. However, it’s difficult to say how large a role the Office of Malaria Control in War Areas and the CDC played in that elimination. Medical historian Margaret Ellen Humphreys has argued that demographic and socioeconomic changes in the first half of the 20th century played a major role in malaria’s decline in the South.
“In America malaria thrived where impoverished, malnourished people lived in porous housing near anopheles breeding grounds, and they contracted a disease for which they could not, by and large, acquire effective medication for suppression or cure,” Humphreys writes in Malaria: Poverty, Race, and Public Health in the United States.
Access to better housing and medicine contributed to malaria’s decline in the United States, as did the migration of poor and working-class people out of rural areas where their risk of contracting malaria was higher. This migration was so critical in malaria’s decline, Humphreys writes, that it “is hard to know whether the American DDT spraying campaign would have been successful without it.”
Although malaria transmission is no longer a danger in the United States, the disease remains a major health risk in parts of the Americas, Asia and Africa. The World Health Organization estimates that in 2020, there were 241 million cases of malaria and 627,000 deaths from the disease.