What is clear is that they were afforded a freedom within the church that was denied to them in the pagan world. They could take on leadership roles as deaconesses, and some of the wealthier women sponsored house churches. So, contrary to popular belief, the rise of Christianity had a very positive effect on the place of women in society.