At the beginning of the twentieth century, soon after the conclusion of the Spanish-American War, the United States was an imperial nation, maintaining (often with the assistance of military force) a far-flung and growing empire. After a long period of collective national amnesia regarding American colonialism, in the Philippines and elsewhere, scholars have resurrected "empire" as a powerful lens for interpreting American history and culture...