
When I was learning to be a doctor, the focus was on making the sick well, but a huge piece was missing from my medical education. Nobody ever taught me that, as doctors, our job is to help people become not just well, but whole.
After all, our job is to help people heal - and "to heal" means "to become whole." But what does that mean? Here's what it means to me.
Published on September 15, 2011 22:00