In the years following the Civil War, women in the United States took up many new roles, and their impact on the nation became ever more visible. As new territories were settled and the country began to heal its wounds, rapid industrial expansion brought changes in women's occupations, education, and activities. The sharecroppers who labored in the fields of the South, migrants who put down roots in the Great Plains, immigrants who sought opportunities...