The history of the American city is, in many ways, the history of the United States. Although rural traditions have also left their mark on the country, cities and urban living have been vital components of America for centuries, and an understanding of the urban experience is essential to comprehending America's past. America's Urban History is an engaging and accessible overview of the life of American cities, from Native American settlements...