A review of Contagious: Cultures, Carriers, and the Outbreak Narrative, by Priscilla Wald, Duke University Press, 2008.
We think containing the spread of infectious diseases is all about science. In fact, more than we care to admit, our perception of disease contagion is shaped by fictions: blockbuster movies, popular novels, newspaper headlines, and magazine articles. These fictions frame our understanding of emerging viruses and our response to global health crises. Call it the outbreak narrative. It follows a formulaic plot that moves through roughly the same steps: emergence in nature or in labs, human infection, transnational contagion, widespread prevalence, medical identification of the virus, epidemiological containment, and final eradication. It features familiar characters: the healthy human carrier, the superspreader, the virus detective, the microbe hunter. It summons mythological figures or supervillains from history: the poisonous Typhoid Mary of the early twentieth century, the elusive Patient Zero of the HIV/AIDS crisis. Through these fictions, new terms and metaphors have entered our vocabulary: immunodeficiency, false negative, reproductive rate, incubation period, herd immunity, “flattening the curve.” We don’t know the science behind the concepts, but we easily get the picture. Outbreak narratives have consequences: they shape how leaders and the public react to a health crisis, they affect survival rates and contagion routes, they promote or mitigate the stigmatizing of individuals and groups, and they change moral and political economies. It is therefore important to understand the appeal and persistence of the outbreak narrative in order to design more effective and humane responses to the global health crises that lie ahead of us.
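For readers curious about the science those terms gesture at, here is a minimal, purely illustrative sketch (in Python; it is not drawn from Wald's book, and all parameter values are hypothetical) of the standard SIR epidemic model behind "reproductive rate," "herd immunity," and "flattening the curve":

# Illustrative SIR epidemic sketch; not from Wald's book.
# All parameter values below are hypothetical examples.

def simulate_sir(beta, gamma, days=200, n=1_000_000, infected0=10):
    """Discrete-time SIR model; returns the daily count of infectious people."""
    s, i, r = n - infected0, infected0, 0
    curve = []
    for _ in range(days):
        new_infections = beta * s * i / n   # contacts between susceptible and infectious
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        curve.append(i)
    return curve

gamma = 0.1                      # roughly a ten-day infectious period
for beta in (0.3, 0.15):         # unmitigated vs. reduced-contact scenario
    r0 = beta / gamma            # basic reproductive number
    herd_threshold = 1 - 1 / r0  # classic herd-immunity threshold
    peak = max(simulate_sir(beta, gamma))
    print(f"R0={r0:.1f}  herd immunity at {herd_threshold:.0%}  "
          f"peak infectious: {peak:,.0f}")

Halving the contact rate lowers R0, which both reduces the herd-immunity threshold and shrinks the epidemic's peak: that is the quantitative content behind the slogan "flattening the curve."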
The outbreak narrative
Another consequence of living immersed in fiction is that you usually remember only the last episode of the whole drama series. Published in 2008, Priscilla Wald’s book begins with a reference to “the first novel infectious disease epidemic of the 21st century, caused by a brand-new coronavirus.” That epidemic was of course SARS, not Covid-19, and the “brand-new” coronavirus of the early 2000s was named SARS-CoV-1, as opposed to the more recent SARS-CoV-2. But it is difficult not to read Contagious in light of the ongoing Covid-19 pandemic, and not to apply its narrative logic to our recent predicament. Covid-19 rewrote the script of past epidemic outbreaks but didn’t change it completely. It built on past experience, both real and imagined or reflected through fiction. The scenario of disease emergence was already familiar to the public, and it shaped the way countries responded to the epidemiological crisis. The pandemic also demonstrated that living in fiction leaves us fully unprepared to face the real thing: the countries that achieved early success in containing the virus were those most affected by past outbreaks, especially SARS, which mainly spread in East Asia. By contrast, the United States is the country from which most fictions originate, but its response to the Covid-19 outbreak was disorganized and weak. We need more than fiction to prepare us for the health crises of the future; we also need better fictions than the conventional outbreak narrative, which casts the blame on villains and invests hope in heroes to provide salvation.
As Priscilla Wald reminds us, there was an earlier wave of fictional scenarios in the 1990s that popularized the outbreak narrative in its present form. Blockbuster movies, medical thrillers, and nonfiction books reached a wide public and dramatized the research results that infectious disease specialists were discussing at the time in their scientific conferences and publications. They include the novels Carriers (Patrick Lynch, 1995), Contagion (Robin Cook, 1995), and The Blood Artists (Chuck Hogan, 1998), the movies Twelve Monkeys (dir. Terry Gilliam, 1995), The Stand (dir. Mick Garris, 1994), and Outbreak (dir. Wolfgang Petersen, 1995), and the nonfiction bestsellers The Hot Zone (Richard Preston, 1994), The Coming Plague (Laurie Garrett, 1994), and Guns, Germs, and Steel (Jared Diamond, 1997). Priscilla Wald uses the movie Outbreak, starring Dustin Hoffman and Morgan Freeman, as particularly representative of the genre that came to shape the global imaginary of disease emergence. The opening scene of a desolate African camp decimated by an unknown hemorrhagic virus, as seen through the protection mask of an American epidemiologist, sets the stage for subsequent narratives. The story casts Africa as an “epidemiological ground zero,” a continental Petri dish out of which “virtually anything might arise.” It dramatizes human responsibility in bringing microbes and animals into close contact with (American) human beings and in spreading the disease out of its “natural” environment through the illicit traffic of a monkey that finds its way to a California pet store. It gives the US Army a key role in maintaining public order and makes US soldiers shoot their countrymen who attempt to violate the quarantine. Outbreak fictions often cast the military officer as the villain, sometimes in cahoots with private corporations to engineer bioweapons, and the public scientist as the ultimate savior who substitutes a medical cure for a military solution. Helped by visual technologies such as epidemiological maps, electron microscopes, and close-ups of the virus, experts engage in a race against time to identify the source of the disease and then determine how to eradicate it. That effort constitutes the plot and storyline of the film: the outbreak narrative.
Healthy carriers and social reformers
The outbreak narrative as it emerged in the mid-1990s builds on earlier attempts to storify disease emergence and contagion. Much as the blockbuster movies and popular novels of the 1990s relied on the work of scientists writing and debating about emerging infections, discussions about disease and contagion in the early twentieth century were shaped by new and controversial research showing that an apparently healthy person could transmit a communicable disease. The idea of a healthy human carrier was one of the most publicized and transformative discoveries of bacteriology. It signified that one person could fuel an epidemic without knowing it or being detected, and it required the curtailment of personal liberties to identify, isolate, and treat or eliminate such a vector of contagion. For the popular press in the English-speaking world, the healthy and deadly carrier took the figure of “Typhoid Mary,” an Irish immigrant who worked as a cook and left a trail of contaminations in the families that employed her. She was reluctant to submit to containment or incarceration in a hospital facility and repeatedly escaped the surveillance of public-health officials, assuming a false name and identity to disappear and cause new cases of contagion. Typhoid fever at the time was a “national disgrace” associated with dirtiness and filth. It resulted from the ingestion of fecal matter, as many authors liked to explain, and could be combatted by personal hygiene and proper sanitation of homes and urban space. Typhoid Mary’s refusal to cooperate with public health authorities created a moral panic that combined the perceived threat of immigration, prejudices against Irish female servants, fallen-woman narratives, and violation of the sanctity of the family. In response, the Home Economics Movement emphasized “how carefully we should select our cooks” and made familial and national health a central occupation of the professional housewife.
Communicable disease and the figure of the healthy carrier influenced changing ideas about urban space and social interactions. Focusing on poverty, city life, urban slums, marginal men, migration, deviance, and crime, the Chicago School was one of the first and most influential centers of sociological research in North America. Like other sociologists of his generation, Robert Park began his career as a muckraking journalist and social reformer. While investigating a diphtheria epidemic in downtown Chicago, he plotted the distribution of cases along an open sewer that he identified as the source of the infection. This led him to use the concept of contagion as a metaphor for social interactions and cultural transmission. It wasn’t the first time biology provided models for the nascent discipline of sociology. In the view of early commentators, microbes did not just represent social bonds; they created and enforced them, acting as a great “social leveller” unifying the social body. In France, Gabriel Tarde and Émile Durkheim argued about the role of contagion and imitation in explaining social phenomena such as suicide and crime. Communicable disease in particular vividly depicted the connection between impoverished urban spaces and the broader social environment. Calling the city a “laboratory or clinic in which human nature and social processes may be conveniently and profitably studied,” Park and his colleagues from the Chicago School concentrated their analysis on social interactions in urban formations such as the tenement or slum dwelling, the ethnic enclave or the ghetto, as well as nodes of communication such as points of entry, train stations, and quarantine spaces. The particular association of those spaces with immigrants in the United States intensified nativism and anti-Semitism, as preventive measures disproportionately and inequitably targeted Eastern European Jews. The theories and models of the urban sociologists conceptualized a spatialization of the social and the pathological that would play a great role in the outbreak narrative.
Cold War stories
The outbreak narrative is also heir to the stories of viral invasion, threats to the national body, and monstrous creatures from outer space that shaped the imaginaries of the Cold War. The insights of virology were central to those stories. New technologies of visualization implanted in the public mind the image of a virus attacking a healthy cell and destroying the host through a weakening of the immune system. Viruses unsettled traditional definitions of life and human existence. Unlike parasites, they did not simply gain nutrients from host cells but actually harnessed the cell’s apparatus to duplicate themselves. Neither living nor dead, they offered a convenient trope for science-fiction horror stories envisioning the invasion of the earth by “body snatchers” that transformed their human hosts into insentient walking dead. These stories were suffused with the anxieties of the times: the inflated threat of Communism, the paranoia fueled by McCarthyism, research into biological warfare and mind control, the atomization of society, emerging visions of an ecological catastrophe, as well as the unsettling of racial and gender boundaries. Americans were inundated with stories and images of a cunning enemy waiting to infiltrate the deepest recesses of their being. Conceptual changes in science and politics commingled, and narrative fictions in turn influenced the new discipline of virology, marking the conjunction of art and science. Priscilla Wald describes these changes through an analysis of the avant-garde work of William S. Burroughs, who developed a fascination with virology, as well as popular fictions such as Jack Finney’s bestselling 1955 novel The Body Snatchers and its cinematic adaptations.
The metamorphosis of infected people into superspreaders is a convention of the outbreak narrative. In the case of HIV/AIDS, epidemiology mixed with moral judgments and social conventions to shape popular perceptions and influence scientific hypotheses. Medical doctors, journalists, and the general public found the sexuality of the early AIDS patients too compelling to ignore. In 1987, Randy Shilts’s controversial bestseller And the Band Played On brought the story of the early years of the HIV/AIDS epidemic to a mainstream audience and contributed significantly to an emerging narrative of HIV/AIDS. Particularly contentious was the story of the French Canadian airline steward Gaétan Dugas, launched into notoriety as “Patient Zero,” who reported hundreds of sexual partners per year. In retrospect, Shilts regretted that “630 pages of serious AIDS policy reporting” were reduced to the most sensational aspects of the epidemic, and he offered an apology for the harm he may have done. Considering the lack of scientific validity of the “Patient Zero” hypothesis, it is difficult not to see the identification of this epidemiological index case and its transformation into a story character as primarily a narrative device. The earliest narratives of any new disease always reflect assumptions about the location, population, and circumstances in which it is first identified. In the case of HIV/AIDS, the early focus on homosexuals, as well as on Haitians, intravenous drug users, and hemophiliacs, was an integral part of the viral equation, while origin theories associating the virus with the primordial spaces of African rainforests reproduced earlier tropes of Africa as a continent of evil and darkness. Modern stories of “supergerms” developing antibiotic resistance in the unregulated spaces of the Third World and threatening to turn Western hospitals into nineteenth-century hotbeds of nosocomial infection feed on the same anxieties.
The narrative bias
The outbreak narrative introduces several biases in our treatment of global health crises, a lesson made only too obvious by the international response to Covid-19. It focuses on the emergence of the disease, often bringing scientific expertise into view; but it treats the widespread diffusion of the virus along conventional lines, and it has almost nothing to say about the closure or endgame of the epidemic. It is cast in distinctly national terms, and it only envisages national responses to a global threat. It presents public health as first and foremost a national responsibility, and treats international cooperation as secondary or even nefarious. As countries engage in a “war of narratives,” the reality of global interdependence is made into a threat, not a solution. The exclusive focus on discourse and narratives overlooks the importance of social processes and material outcomes. Priscilla Wald’s book reflects many of the biases she otherwise denounces. It is America-centric and focuses solely on fictions produced in the United States. It exhibits a narrative bias that is shared by politicians and journalists who think problems can be solved by addressing them at the discursive level. It neglects the material artifacts that play a key role in the spread and containment of infectious diseases: the protection mask, the test kit, the hospital ventilator, and the vaccine shot are as much part of the Covid-19 story as debates about the outbreak and zoonotic origins of the disease. Priscilla Wald’s Contagious concludes with a vigorous plea to “revise the outbreak narrative, to tell the story of disease emergence and human connection in the language of social justice rather than of susceptibility.” But fictions alone cannot solve the problem of modern epidemics. In times like ours, leaders are tested not by the stories they tell but by the actions they take and the results they achieve.

A large literature exists on United States intervention in Latin America. Much has been written about the CIA’s role in fomenting coups, influencing election results, and plotting to assassinate popular figures. Well-documented cases of abuse include the overthrow of the popularly elected president of Guatemala in 1954 and the attempts to assassinate Rafael Trujillo in the Dominican Republic and Fidel Castro in Cuba. Books about the CIA make for compelling stories and sensationalist titles: The Ghosts of Langley, The Devil’s Chessboard, Killing Hope, Legacy of Ashes, Deadly Deceits. They are usually written from the perspective of the agency’s headquarters—which moved to Langley, Virginia, only in 1961—and they concentrate on the CIA leadership or on the wider foreign policy community in Washington—The Power Elite, The Wise Men, The Georgetown Set. Rarely do they reflect the perspective of agents in the field: the station chiefs, the case officers, the special agents charged with gathering intelligence and monitoring operations on the ground. Such narratives require a more fine-grained approach, less spectacular than the journalistic accounts of grand spying schemes but truer to the everyday work of intelligence officers based in US diplomatic missions abroad. Fortunately, sources are available: a trove of declassified intelligence documents has been made available to the public through the online CREST database under the 25-year program of automatic declassification. In The CIA in Ecuador, Marc Becker exploits this archive to document the history of the Communist Party of Ecuador as seen through the surveillance and reporting activities of the CIA station in Quito during the first decade of the Cold War.
Capacity building is the holy grail of development cooperation. It refers to the process by which individuals, organizations, and nations obtain, improve, and retain the skills, knowledge, tools, equipment, and other resources needed to achieve development. Like scaffolding, official development assistance is only a temporary fixture; it pursues the goal of making itself irrelevant. The partner country, the doctrine insists, needs to be placed in the driver’s seat and implement its domestically designed policies on its own terms. Once capacity is built and the development infrastructure is in place, technical assistance is no longer needed. National programs, funded by fiscal resources and private capital, can pursue the task of development and pick up where foreign experts and ODA projects left off. And yet, in most cases, building capacity proves elusive. The landscape of development cooperation is filled with failed projects, broken-down equipment, useless consultant reports, and empty promises. Developing countries are playing catch-up with an ever-receding target. As local experts master skills and as technologies are transferred, new technologies emerge and disrupt existing practices. Creative destruction wreaks havoc on fixed capacity and accumulated capital. Development can even be destructive and nefarious. The site where the book opens, the commune of Ngagne Diaw near Senegal’s capital city Dakar, has been made toxic by the poisonous effluents of used lead-acid car batteries that inhabitants process to recycle heavy metals and scrape a living. Other locations in rural areas are contaminated with stockpiles of pesticides that have leaked into soil and water ecosystems.
I close my eyes and I can hear Billie Holiday’s black voice filling the room. Her voice, described as “a unique blend of vulnerability, innocence, and sexuality,” speaks of a life marked by abandonment, drug abuse, romantic turmoil, and premature death. Hearing Billie Holiday sing the blues also summons her black ancestors’ history of enslavement, hard labor, racial segregation, and disenfranchisement. I can imagine the black singer, cigarette in hand, eyes closed, bearing the sorrow of shattered hopes and broken dreams. But wait. I open my eyes and what I see on the screen is a seven-year-old Norwegian named Angelina Jordan performing on the variety show Norway’s Got Talent. Her imitation of Billie Holiday is almost perfect: pitch, rhythm, intonation, and vocal range correspond to her model down to the smallest detail. Here is a combination of a child’s frail body and the sound of an iconic singer whom we usually hear through the narrative of her unfortunate life and perceived ethnicity. Impersonations of African-American singers can be problematic: as Nina Eidsheim notes, they bring to mind a past of blackface minstrelsy and racist exploitation, and a present still marked by cultural misappropriation and racial stereotypes. But Eidsheim’s point lies elsewhere: by assigning a race or ethnicity to the sound of a voice, we commit a common fallacy that helps reproduce and essentialize the notion of race. We hear race where, in fact, it isn’t.
On March 3, 2021, Byun Hui-su, South Korea’s first transgender soldier, who had been discharged from the military the year before for undergoing gender reassignment surgery, was found dead in her home. Her apparent suicide drew media attention to transphobia and homophobia in the army and in South Korean society at large. According to Todd Henry, who edited the volume Queer Korea published by Duke University Press in 2020, “LGBTI South Koreans face innumerable obstacles in a society in which homophobia, transphobia, toxic masculinity, misogyny, and other marginalizing pressures cause an alarmingly high number of queers (and other alienated subjects) to commit suicide or inflict self-harm.” In recent years, people and organizations claiming LGBT identity and rights have gained increased visibility. The city of Seoul has held a Gay Pride parade since 2000, and in 2014 its mayor Park Won-soon suggested that South Korea become the first country in Asia to legalize gay marriage—but conservative politicians as well as some so-called progressives blocked the move, and the mayor committed suicide in 2020 amid a #MeToo scandal. Short of same-sex unions, most laws and judicial decisions protecting LGBT rights are already on the books or in jurisprudence, and society has moved toward a more tolerant attitude on the issue. Nonetheless, gay and lesbian Koreans still face numerous difficulties at home and at work, and many prefer not to reveal their sexual orientation to family, friends, or co-workers. Opposition to LGBT rights comes mostly from Christian sectors of the country, especially Protestants, who regularly stage counter-protests to pride parades, carrying signs urging LGBT people to “repent from their sins.” In these conditions, some sexually non-normative subjects eschew visibility and remain closeted, or even give up sexuality and retreat from same-sex communities as a survival strategy.
Nowadays young PhDs in the social sciences and the humanities often list an interest in sound studies when they enter the academic job market. Likewise, digital humanities is a booming field encompassing a wide range of theories and disciplines bound together by an interest in digital tools and technologies. There is a premium on listing these categories as fields of interest in one’s CV, even though the young scholar’s specialization may lie in more traditional disciplines such as English literature, modern history, or American studies. This is what economists call job market signaling: by associating themselves with “hot” topics, potential new hires put themselves in high demand and differentiate their profile from more standard competitors. And yet, digital humanities and sonic materials have so far had a limited impact on social science scholarship. The humanities remain text-centric and bound by technologies inherited from the printing press and the paper format. The reproduction of sound is ubiquitous, and digital technologies are everywhere but in the content of academic journals and university syllabuses. Student evaluation is still mostly based on silent modes of learning such as final essays, midterm exams, and reading responses. Sonic modes of participation such as asking questions, providing oral feedback, and exchanging ideas with peers during class discussions are given limited weight compared with evaluation metrics based on the written text.
In 1980, smallpox, also known as variola, became the only human infectious disease ever to be completely eradicated. Smallpox had plagued humanity since time immemorial. It is believed to have appeared around 10,000 BC, at the time of the first agricultural settlements. Traces of smallpox have been found on Egyptian mummies, in ancient Chinese tombs, and among the Roman legions. Long before germ theory was developed and bacteria or viruses could be observed, humanity was already familiar with ways to prevent the disease and to produce a remedy. The technique of variolation, or exposing patients to the disease so that they develop immunity, was already known in China by the fifteenth century and in India, the Ottoman Empire, and Europe by the eighteenth century. In 1796, Edward Jenner developed the first vaccine after noticing that milkmaids who had contracted cowpox never caught smallpox. Calves or children produced the cowpox lymph that was then inoculated into patients to vaccinate them against smallpox. Vaccination became widely accepted and gradually replaced the practice of variolation. By the end of the nineteenth century, Europeans vaccinated most of their children and brought the technique to the colonies, where it was nonetheless slow to take hold. In 1959, the World Health Organization initiated a plan to rid the world of smallpox. The concept of global health emerged from that enterprise and, as a result of these efforts, the World Health Assembly declared smallpox eradicated in 1980 and recommended that all countries cease routine smallpox vaccination.
There is a renewed interest in the United States for art-and-technology projects. Tech firms have money to spend on the arts to buttress their image of cool modernity; universities want to break the barriers between science and the humanities; and artists are looking for material opportunities to explore new modes of working. Recent initiatives mixing art, science, and technology include
Ten years have passed since the wave of protests that swept across North Africa and the Middle East. Time has not been kind to the hopes, dreams, and aspirations for change that were invested in these Arab uprisings. A whole generation is now looking back at its youthful idealism with nostalgia, disillusion, and bitterness. Revolutionary hope is always followed by political disenchantment: this has been the case for all revolutions that succeeded and for all attempts that failed. Fadi Bardawil even sees in this the expression of a more general law: “For as long as I can remember, I have witnessed intellectuals and critical theorists slide from critique to loss and melancholia after having witnessed a political defeat or experienced a regression in the state of affairs of the world.” These cycles of hope and disillusion are particularly acute in the Arab world, where each decade seems to bring its own political sequence of rising tide and receding ebb. Revolution and Disenchantment tells the story of a fringe political movement, Socialist Lebanon (1964–70), through the figures of three Marxist intellectuals who went through a cycle of revolutionary fervor, disenchantment, despair, and adjustment. Waddah Charara (1942–), Fawwaz Traboulsi (1941–), and Ahmad Beydoun (1942–) are completely unknown to most readers outside Lebanon, and their reputation in their own country may not have crossed the limits of narrow intellectual circles. They have now retired from academic careers in the humanities and social sciences, and few people remember their youthful engagement at the vanguard of the revolutionary Left. But their political itinerary has a lot to tell about the role of intellectuals, the relationship between theory and practice, and the waves of enthusiasm and disillusion that turn emancipatory enterprises into disenchanted projects.
“Inanimate objects, have you then a soul / that clings to our soul and forces it to love?,” wondered Alphonse de Lamartine in his poem “Milly or the Homeland.” In Animacies, Mel Chen answers the first part of this question in the affirmative, although the range of affects she considers is much broader than the loving attachments that connected the French poet to his home village. As she sees it, “matter that is considered insensate, immobile, deathly, or otherwise ‘wrong’ animates cultural life in important ways.” Anima, the Latin word from which animacy derives, is defined as air, breath, life, mind, or soul. Inanimate objects are supposed to be devoid of such characteristics. In De Anima, Aristotle granted a soul to animals and plants as well as to humans, but he denied that stones could have one. Modern thinkers have been more ready to take the plunge. As Chen notes, “Throughout the humanities and social sciences, scholars are working through posthumanist understandings of the significance of stuff, objects, commodities, and things.” Various concepts have been proposed to break the great divide between humans and nonhumans and between life and inanimate things, as the titles of recent essays indicate: “Vibrant Matter” (Jane Bennett), “Excitable Matter” (Natasha Myers), “Bodies That Matter” (Judith Butler), “The Social Life of Things” (Arjun Appadurai), “The Politics of Life Itself” (Nikolas Rose), “Parliament of Things” (Bruno Latour). Many argue that objects are imbued with agency, or at least an ability to evoke some sort of change or response in individual humans or in an entire society, though each scholar interprets differently the meaning of agency and the true capacity of material objects to have personalities of their own. In Animacies, Mel Chen makes her own contribution to this debate by pushing it in a radical direction: writing from the perspective of queer studies, she argues that degrees of animacy, the agency of life and things, cannot be dissociated from the parameters of sexuality and race, and that they are imbricated with health and disability issues as well as environmental and security concerns.