British, French, Dutch, and American empires that the most famous of Wilson’s Fourteen Points—“self-determination for all peoples”—applied only to the defeated Austro-Hungarian and Ottoman empires, and even there only to white people. Self-determination was not being offered to the peoples of British India, or French Indochina, or the Netherlands East Indies, or the Philippines. On board Wilson’s ship bound for Europe, Secretary of State Lansing had written in his diary, “The more I think about the president’s declaration of the right of self-determination the more convinced I am that it is bound to be the basis of impossible demands on the peace conference—what misery it will cause.”17 Much of the rest of the twentieth century would be devoted to efforts by colonized peoples to achieve, through rebellion, urban insurrection, and guerrilla warfare, what Wilson had denied them in the treaty ending World War I.
These displays of hubris and naivete ended in personal tragedy for Wilson. On his arrival in Paris for the peace negotiations, he had declared, “We have just concluded the war to end all wars.” The League of Nations that he intended to create would, he believed, prevent future wars by acting against aggressors. But on November 19, 1919, and again on March 19, 1920, the U.S. Senate, led by Henry Cabot Lodge, declined to ratify the Treaty of Versailles, viewing it as an encroachment on American sovereignty, and the United States itself never became a member of the League of Nations. Even Secretary of State Lansing had opposed the treaty, and Wilson, now semiparalyzed by a stroke, asked for his resignation. The Republicans returned to power in November 1920, and the new president, Warren G. Harding, quickly concluded a separate peace with Germany. At the end of 1920, Wilson was finally awarded the Nobel Peace Prize, but it was—even more than usual—a meaningless gesture. Marshal Ferdinand Foch of France, supreme commander of all Allied forces at war’s end, remarked of “Wilson’s” peace at Versailles, “This is not a peace treaty, it’s a twenty years armistice.”18 Foch did not live to see how precisely his prediction would be fulfilled.
With Woodrow Wilson, the intellectual foundations of American imperialism were set in place. Theodore Roosevelt and Elihu Root had represented a European-derived, militaristic vision of imperialism backed by nothing more substantial than the notion that the manifest destiny of the United States was to govern racially inferior Latin Americans and East Asians. Wilson overlaid that vision with his own hyperidealistic, sentimental, and ahistorical idea that what should be sought was a world democracy based on the American example and led by the United States. It was a political project no less ambitious and no less passionately held than the vision of world Communism launched at almost the same time by the leaders of the Bolshevik Revolution. As international-relations critic William Pfaff puts it, “[The United States was] still in the intellectual thrall of the megalomaniacal and self-righteous clergyman-president who gave to the American nation the blasphemous conviction that it, like he himself, had been created by God ‘to show the way to the nations of the world how they shall walk in the paths of liberty.’”19
If World War I generated the ideological basis for American imperialism, World War II unleashed its growing militarism. It was then, as retired Marine Colonel James Donovan has written, that the “American martial spirit grew to prominence.”20 The wars with Germany and Japan were popular, the public and the members of the armed forces knew why they were fighting, and there was comparatively little dissent over war aims. Even so, the government carefully managed the news to sustain a warlike mood. No photos of dead American soldiers were allowed to be printed in newspapers or magazines until 1943, and the Pentagon gave journalists extensive guidance on how to report the war.21
World War II saw the nation’s highest military participation ratio (MPR)—that is, percentage of people under arms—of any of America’s wars. With some 16,353,700 men and women out of a total population of 133.5 million serving in the armed forces, World War II produced an MPR of 12.2 percent. Only the MPR of the Confederate side in the Civil War was higher, at 13.1 percent, but the overall ratio for both sides in the Civil War was 11.1 percent. The lowest MPRs, both 0.4 percent, were in the Mexican (1846-48) and Spanish-American Wars, followed by the Persian Gulf War of 1991 at 1.1 percent.22 (This latter figure is, however, unreliable since a significant portion of the forces “under arms” at the time of the Gulf War were not engaged in combat or even located in the Gulf region but were manning the United States’ many garrisons and ships around the world.)
World War II produced a nation of veterans, proud of what they had achieved, respectful but not totally trusting of their military leaders, and almost uniformly supportive of the use of the atomic bombs that had brought the war to a rapid close. President Franklin Roosevelt played the role of supreme commander as no other president before or since. He once sent a memo to Secretary of State Cordell Hull saying, “Please try to address me as Commander-in-Chief, not as president.”23 Congress did not impose a Joint Committee on the Conduct of the War on Roosevelt, as it had on President Lincoln during the Civil War, and military institutions like the Joint Chiefs of Staff were still informal and unsupervised organizations created by and entirely responsible to the executive branch. As Colonel Donovan has observed, “With an agreed policy of unlimited war, Congress was also satisfied to abdicate its responsibilities of controlling the military establishment.... Some military leaders believed civilian control of the military was a relic of the past, with no place in the future.”24
The most illustrious of World War II’s American militarists, General Douglas MacArthur, challenged the constitutional authority of President Harry Truman during the Korean War, writing that it was “a new and heretofore unknown and dangerous concept that the members of our armed forces owe primary allegiance or loyalty to those who temporarily exercise the authority of the Executive Branch of the Government rather than to the country and its Constitution which they are sworn to defend. No proposition could be more dangerous.”25 On April 11, 1951, Truman charged MacArthur with insubordination, relieved him of his command, and forced him to retire. Truman’s action was probably the last classic assertion of the constitutional principle that the president and the civilians appointed by him control the military. During the presidencies of John F. Kennedy and Bill Clinton, in particular, the high command would often be publicly restive about the qualities of the commander in chief and come close to crossing the line of constitutional legality without actually doing so. As we shall see, during the Kennedy administration the Joint Chiefs of Staff proposed that the military secretly carry out terrorist incidents in the United States and use them as a pretext for war with Castro’s Cuba, and President Clinton was never able to regain full authority over the high command after the firestorm at the beginning of his administration over gays in the military.
After World War II, high-ranking military officers, including Generals Marshall and Eisenhower, moved into key positions in the civilian hierarchy of political power in a way unprecedented since the Civil War. George C. Marshall, the wartime chief of staff, became the country’s first secretary of state drawn from the ranks of the professional military. (There have been only two others since: General Alexander Haig in the Reagan administration and General Colin Powell in the George W. Bush administration.) Paradoxically, General Marshall left his name on what is probably the country’s single greatest foreign policy failure, the 1946 Marshall Mission to China, which attempted to mediate between the Communists and the Nationalists in the Chinese civil war, and its single greatest success, the 1947 Marshall Plan, which helped rebuild postwar Europe economically.
But World War II, although a popular war, did not create American militarism, and had the Cold War not ensued, it is reasonable to assume that traditional American opposition to standing armies and foreign wars would have forcefully reasserted itself. If there has been a growing trend toward militarism, there also remains a vein of deep suspicion of armies. The military almost totally demobilized in the years immediately after 1945, even though the draft remained in place until 1973, when an all-volunteer military came into being following almost a decade of protests against the war in Vietnam. On a pragmatic level, the public has proved ambivalent about wars because of the casualties they produce. And World War II produced the second-largest number of casualties of all America’s wars.
The Civil War, by far the bloodiest war in our history, had profoundly affected popular attitudes and generated a deep resistance on the part of the American people to sending their sons and daughters into battle. The number of combat deaths for both sides in the Civil War was 184,594, considerably fewer than the 292,131 American deaths in