I am SO tired of these right-wing white males who still have the mentality that women should be barefoot and pregnant attempting to decide what women should be able to do with their bodies! Maybe women need to start a few bills of their own: make castration mandatory if a man is found guilty of rape or of cheating on his spouse. Or insurance can no longer cover Viagra or any similar drug (since they also don't want insurance to cover birth control). Or men should have to wear something similar to a chastity belt until married. Or maybe mandatory male birth control that causes impotency until they are married. Should I go on? STOP this crap and LEAVE WOMEN'S HEALTH ALONE!