For much of Western history, caring for the sick and dying has been women's work. Women were respected healers in ancient times, but their standing declined in Europe during the Middle Ages and early modern period, as university-trained male physicians gradually turned medicine into an elite profession of the wealthy.