The Handmaid's Tale depicts a dystopian society in which a religious dictatorship assumes control of the United States, turning the country into the Republic of Gilead. In this new society, women are stripped of autonomy and relegated to roles such as domestic servitude or childbearing...