This book is not only a groundbreaking study of the role of women in the American medical profession but also a fascinating glimpse into how medicine was taught and practiced in the last century.
Proceeding from the colonial period, when women participated in healing as nurses, midwives, and practitioners of folk medicine, to their 19th-century struggle to enter medical schools, the book charts the emergence of women in our own time as full-fledged...