In the 1960s, feminists voiced their outrage at a health care system in the United States that routinely discriminated against women and, in so doing, jeopardized their health and well-being. More than a decade later, women's health advocates still stressed the need to reform this male-dominated institution because of the ongoing threat it posed to the health of American women. In the 1990s, nearly 40 years after women began their fight for...