I am getting to the point where I think some of the societal standards against women really need to be altered or changed. People need to wrap their minds around a more liberal approach to life. In our society, women are constantly mistreated, objectified, or exploited. A man and a woman could do the exact same things, and the woman will be called a hoe and so many other demeaning and derogatory names, while the man will be given a high five or glorified for what he does.
I think a lot of women have gotten to the point where they believe that if men can do it, then women can do it as well. Why must women always be on the receiving end of everything?