It's the 21st century, and the fact that we still need to debate whether women have value in society and are fully equal to men is reprehensible. Of course women should be educated. Of course we should have full autonomy over our bodies in medical decisions. Of course we should make our contraception decisions without government interference. Of course we should be paid equally. When will the patriarchy that fears us fall?