was long and arduous work.” They were calculating part 1 of a two-part problem, trying to see if a fission trigger heating a specified amount of deuterium and tritium would start a thermonuclear reaction. Teller's group had calculated a cruder version of part 1 between 1944 and 1946. By February 1950 Ulam saw that the amount of tritium Teller had estimated earlier was not nearly sufficient. “The result of the calculations,” Ulam reported, “seems to be that the model considered is a fizzle.” He increased the estimated tritium volume and began again. Even with more tritium Teller's classical Super looked distinctly unpromising. Late in April Ulam went off to Princeton to discuss his pessimistic results with von Neumann and Fermi. The three men talked with Oppenheimer as well; Ulam noticed that he “seemed rather glad to learn of the difficulties.”

Ulam returned to Los Alamos and broke the news to Teller. “He was pale with fury yesterday literally,” Ulam reported to von Neumann — “but I think is calmed down today.” Teller at first refused to believe the calculations. He also questioned Ulam's motives for performing them; according to the official AEC history it was necessary for von Neumann to offer Teller “reassurances that the motives behind the changes [in tritium estimates] were constructive.” Ironically, Ulam had favored building the Super from the beginning.

The Super problem went on the ENIAC on schedule in June. The evolving results confirmed Ulam's and Everett's findings. “In the course of the calculation,” Ulam recalls, “in spite of an initial, hopeful-looking ‘flare up,’ the whole assembly started slowly to cool down. Every few days Johnny [von Neumann] would call in some results. ‘Icicles are forming,’ he would say dejectedly.” At the same time Ulam and Fermi, who was visiting Los Alamos for the summer, began hand-calculating the next phase of the Super problem, which concerned the propagation of the initial thermonuclear reaction. That work, says Ulam, “turned out to be basic to the technology of thermonuclear explosions.” It also predicted that Teller's Super would fizzle. The Super was simply a bad design, Hans Bethe explains, a dead end:

That Ulam's calculations had to be done at all was proof that the H-bomb project was not ready for a “crash” program when Teller first advocated such a program in the Fall of 1949. Nobody will blame Teller because the calculations of 1946 were wrong, especially because adequate computing machines were not then available. But he was blamed at Los Alamos for leading the Laboratory, and indeed the whole country, into an adventurous program on the basis of calculations which he himself must have known to have been very incomplete. The technical skepticism of the GAC on the other hand had turned out to be far more justified than the GAC itself had dreamed in October 1949.

Between October 1950 and January 1951, Bethe goes on, Teller “was desperate… He proposed a number of complicated schemes to save [the classical Super], none of which seemed to show much promise. It was evident that he did not know of any solution.” He was nevertheless unwilling to retrench. He wanted most of the laboratory's time for at least another year and a half. He did not know how to make a thermonuclear, he told the GAC at an October meeting, but he was convinced it could be done. He insisted that the bottleneck was a lack of theoreticians at Los Alamos and a lack of imagination. If the Greenhouse tests of thermonuclear feasibility scheduled for the spring of 1951 at Eniwetok atoll in the Marshall Islands proved a hydrogen bomb impossible, he concluded, Los Alamos might be strong enough to continue; if the tests proved a bomb possible, the laboratory might not be strong enough to follow through. Teller's assessment won him few friends at Los Alamos.

Severe stress can be creative. So can long familiarity with a problem. By February 1951 Ulam was angry with Teller and Teller was angry with everyone. The result was a novel, entirely unexpected invention. Not even Teller had anticipated it. Bethe supplies a context for lay assessment: “The new concept was to me, who had been rather closely associated with the program, about as surprising as the discovery of fission had been to physicists in 1939.” The concept has come to be called the Teller-Ulam configuration.

Afterward Teller would variously deny, acknowledge and claim credit for Ulam's contribution. Ulam would consistently acknowledge Teller's part but quietly insist upon his own. Others — Lothar Nordheim of the Theoretical Division, Herbert York — confirm, as Nordheim wrote in 1954 to the New York Times, that “a general principle was formulated by Dr. Stanislaw Ulam in collaboration with Teller, who shortly afterward gave it its technically practical form.” Teller's most nearly generous acknowledgment appears in his 1955 essay “The Work of Many People”: “Two signs of hope came within a few weeks: one sign was an imaginative suggestion by Ulam; the other sign was a fine calculation by [physicist Frederick] de Hoffmann.” His unwillingness consistently to acknowledge Ulam's contribution, in contradiction of scientific ethics, suggests the importance he attached to historic rank in the matter. He came to dislike being called “the father of the H-bomb,” but asserted his paternity in 1954 with a curiously explicit allegory of his on-again, off-again relationship with Los Alamos:

It is true that I am the father in [the] biological sense that I performed a necessary function and let nature take its course. After that a child had to be born. It might be robust or it might be stillborn, but something had to be born. The process of conception was by no means a pleasure; it was filled with difficulty and anxiety for both parties. My act… aroused the emotions associated with such behavior.

Bethe sifts the evidence the other way, drolly tongue-in-cheek: “I used to say that Ulam was the father of the hydrogen bomb and Edward was the mother, because he carried the baby for quite a while.”

The mechanism of the Teller-Ulam H-bomb was revealed in general terms in an official Los Alamos publication in 1983, on the occasion of the fortieth anniversary of the laboratory's founding:

The first megaton-yield explosives (hydrogen bombs) were based on the application of X-rays produced by a primary nuclear device to compress and ignite a physically distinct secondary nuclear assembly. The process by which the time-varying radiation source is coupled to the secondary is referred to as radiation transport.

Stanislaw Ulam's basic contribution appears to have emerged from a closer look at the early development of the fission fireball, which initially radiates most of its energy as X rays. These, traveling at the speed of light, advance outward ahead of any shock wave. The classical Super and other previous designs presumably tried to pack the entire mass of thermonuclear material into the evolving fission explosion to heat it hydrodynamically — more spheres within spheres, a fatter and unworkable Fat Man. Those designs always promised to blow apart before thermonuclear burning could make much headway. Ulam suddenly realized, it seems, that if the thermonuclear materials were physically separated from the fission primary, the enormous flux of X rays coming off the primary might be applied somehow to start thermonuclear burning in the brief fraction of a second before the slower shock wave caught up and blew everything apart.
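To put rough numbers on that head start (a back-of-the-envelope sketch, assuming a casing on the order of a meter long and a hydrodynamic expansion speed of a few hundred kilometers per second; both figures are illustrative and are not drawn from the text):

\[
t_{\text{X-ray}} \sim \frac{L}{c} \approx \frac{1\ \mathrm{m}}{3\times 10^{8}\ \mathrm{m/s}} \approx 3\ \mathrm{ns},
\qquad
t_{\text{shock}} \sim \frac{L}{v_{s}} \approx \frac{1\ \mathrm{m}}{3\times 10^{5}\ \mathrm{m/s}} \approx 3\ \mu\mathrm{s}.
\]

On those assumed figures the radiation crosses the casing in nanoseconds while the shock takes microseconds, a margin of roughly a thousandfold; that is the "brief fraction of a second" available to start thermonuclear burning.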

Ulam and Teller proceeded to develop Ulam's idea. The X rays from the primary might heat the thermonuclear secondary directly (as microwaves heat food in a microwave oven) but they could not squeeze it efficiently to the greater density that would promote fusion. Some other material would need to intervene. It turned out that ordinary plastic would serve. Dump so large a flux of X rays into a layer of dense plastic foam wrapped around a cylindrical stick of thermonuclear materials and the plastic would heat instantaneously to a plasma — a hot, ionized gas — expanding explosively at pressures thousands of times more intense than the pressures high explosives can generate. So a fission primary — a little Fat Man, no larger in today's efficient weapons than a soccer ball — might occupy one end of an evacuated cylindrical casing. Farther along the casing a layer of plastic might wrap a cylindrical arrangement of thermonuclear material. Fire the primary and the X-ray flux would radiate into the plastic at the speed of light, much faster than the expanding fission shock wave coming up behind. Configuring the plastic would be much simpler than configuring high-explosive lenses; the light-swift X rays would irradiate it simultaneously along its entire length and the resulting implosion would be beautifully symmetrical.

That, to the extent that continuing secrecy allows its reconstruction, is probably what Ulam first conceived and Teller made practical. Though it was the necessary breakthrough, it was not the end of invention. Even with the greater heat and pressure of irradiated plastic implosion the design apparently would not evolve sufficient heat long enough to kindle a full-scale thermonuclear reaction. Such reactions depend on heating light atoms such as deuterium and tritium — increasing their velocity of motion — sufficiently to force them through the electrical barrier of the nucleus so that they can fuse to helium. The process requires heat and pressure but no critical mass. Once fusion has begun, the binding energy released in the reaction (for deuterium and tritium, 17.6 MeV) promotes further burning. A fusion weapon can therefore be made arbitrarily large, as a fire can be made arbitrarily large, by piling on more fuel. But first it must be well started, and the arrangement Ulam and Teller initially proposed
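For reference, the 17.6 MeV the text cites is released in the deuterium-tritium reaction roughly as follows; the division of energy between the neutron and the helium nucleus is standard nuclear data rather than something stated in the excerpt:

\[
\mathrm{D} + \mathrm{T} \;\longrightarrow\; {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + n\,(14.1\ \mathrm{MeV}),
\qquad Q \approx 17.6\ \mathrm{MeV}.
\]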
