The United States of the 1940s marked the beginning of significant social and political change. Men were shipped off to fight in World War II, and women entered the workforce in larger numbers than ever before to hold down the home front, earning a taste of what it meant to be independent. African Americans fought beside their white counterparts in the war and returned home unwilling to accept the inequality under which they'd lived for so long...