For much of Western history, caring for the sick and dying has been women’s work. Women were respected healers in ancient times, but their standing declined in Europe during the Middle Ages and early modern period as university-trained male physicians gradually turned medicine into an elite occupation among the wealthy.
Doing Harm: The Truth About How Bad Medicine and Lazy Science Leave Women Dismissed, Misdiagnosed, and Sick