In the half century after the Civil War, evangelical southerners turned increasingly to Sunday schools as a means of rejuvenating their destitute region and adjusting to an ever-modernizing world. They felt certain that by educating children -- and later adults -- in Sunday school and exposing them to Christian teachings, biblical truths, and exemplary behavior, a better world would emerge and cast aside the death and destruction wrought by...