The problem with working out an incubation period is that we rarely see the actual moment of infection. We just see people showing up with symptoms later on. If we want to estimate the average incubation period, we therefore need to find people who could only have been infected during a specific period of time. For example, a businessman staying at the Metropole had overlapped with the Chinese doctor for a single day. He fell ill with SARS six days later, so this delay must have been the incubation period for his infection. Donnelly and her colleagues tried to gather together other examples like this, but there weren’t that many. Of the 1,400 SARS cases that had been reported in Hong Kong by the end of April, only 57 people had a clearly defined exposure to the virus. Put together, these examples suggested that SARS had an average incubation period of about 6.4 days. The same method has since been used to estimate the incubation period for other new infections, including pandemic flu in 2009 and Ebola in 2014.[57]
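To make the idea concrete, here is a minimal sketch in Python, using made-up numbers rather than data from the study. If a patient's exposure can be pinned down to a single day, the delay from exposure to symptom onset is the incubation period itself, and averaging those delays gives an estimate like the 6.4 days quoted above. (A real analysis, like Donnelly's, would fit a probability distribution to handle uncertainty and wider exposure windows.)

```python
from statistics import mean

# Hypothetical onset delays (in days) for patients whose exposure could be
# pinned down to a single day, like the Metropole businessman's. These
# numbers are invented for illustration, not taken from the Hong Kong data.
delays = [6, 4, 7, 5, 8, 6, 7]

# With a one-day exposure window, each delay from exposure to symptom
# onset is that patient's incubation period, so a simple average works.
print(f"Estimated mean incubation period: {mean(delays):.1f} days")
```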
Of course, there is another way to work out an incubation period: deliberately give someone the infection and see what happens. One of the most infamous examples of this approach occurred in New York City during the 1950s and 1960s. The Willowbrook State School, located on Staten Island, was home to over 6,000 children with intellectual disabilities. Overcrowded and filthy, the school had frequent outbreaks of hepatitis, which had led paediatrician Saul Krugman to set up a project to study the infection.[58] Working with collaborators Robert McCollum and Joan Giles, Krugman deliberately infected children with hepatitis to understand how the infection developed and spread. As well as measuring the incubation period, the team discovered they were actually dealing with two different types of hepatitis virus. One type, which we now call hepatitis A, spread from person to person, whereas hepatitis B was blood-borne.
The research brought controversy as well as discoveries. In the early 1970s, criticism of the work grew, and the experiments were eventually halted. The study team argued that the project had been ethically sound: it had approval from several medical ethics boards, they’d obtained consent from children’s parents, and the poor conditions in the school meant that many of the children would have got the disease at some point anyway. Critics responded that, among other things, the consent forms had brushed over the details of what was involved and that Krugman had overstated the chances the children would get infected naturally. ‘They were the most unethical medical experiments ever performed on children in the United States,’ claimed vaccine pioneer Maurice Hilleman.[59]
This raises the question of what to do with such knowledge once it’s been obtained. Research papers from the Willowbrook study have been cited hundreds of times, but not everyone agreed they should be acknowledged in this way. ‘Every new reference to the work of Krugman and Giles adds to its apparent ethical respectability, and in my view such references should stop, or at least be heavily qualified,’ wrote physician Stephen Goldby in a letter to The Lancet in 1971.[60]
There are many other examples of medical knowledge that has uncomfortable origins. In early nineteenth-century Britain, the growing number of medical schools created a massive demand for cadavers for use in anatomy classes. Faced with a limited legal supply, the criminal market stepped in; bodies were increasingly snatched from graveyards and sold to lecturers.[61] Yet it is experiments on the living that have proved the most shocking. During the Second World War, Nazi doctors deliberately infected patients at Auschwitz with diseases including typhus and cholera, to measure things like the incubation period.[62] After the war, the medical community created the Nuremberg Code, outlining a set of principles for ethical studies. Even so, the controversies would continue. Much of our understanding of typhoid comes from studies involving US prisoners in the 1950s and 1960s.[63] Then, of course, there was Willowbrook, which transformed our knowledge of hepatitis.
Despite the sometimes horrific history of human experiments, studies involving deliberate infections are on the rise.[64] Around the world, volunteers are signing up for research involving malaria, influenza, dengue fever, and other diseases. In 2019, there were dozens of such studies underway. Although some pathogens are simply too dangerous – Ebola is clearly out of the question – there are situations in which the social and scientific benefits of an infection experiment can outweigh a small risk to participants. Modern infection experiments have much stricter ethical guidelines, particularly when giving participants information and asking for their consent, but they must still strike this balance between benefit and risk. It’s a balancing act that is becoming increasingly prominent in other areas of life as well.
8
A spot of trouble
Grenville Clark had just about settled into his position as conference chair when someone handed him a folded note.[1] A lawyer by training, Clark had organised the conference to discuss the future of the newly formed United Nations and what it would mean for world peace. Sixty delegates had already arrived at the Princeton University venue, but there was one more person who wanted to join. The note in Clark’s hands came from Albert Einstein, who was based at the adjacent Institute for Advanced Study.
It was January 1946, and many in the physics community were haunted by their role in the recent atomic bombings of Hiroshima and Nagasaki.[2] Although Einstein was a long-time pacifist – and had opposed the bombings – his letter to President Roosevelt in 1939, warning of the potential for a Nazi atom bomb, had triggered the US nuclear programme.[3] During the Princeton conference, one attendee asked Einstein about humanity’s