*in before Reincar*
What happened? Read this:
The Feminization of American Culture - Social Anxiety Forum
Note the date the thread was started.
Now think about everything that has happened since then...
Women have all the rights men do, and in some cases more (in the Western world, anyway). I really don't understand what they're talking about.
Feminism has destroyed the fabric of Western society.
Men didn't abandon their role as men. We were pushed/forced out of our role. Feminism has brought forth a demonic disease, a matriarchal society, and society as we know it is decaying.
What the fukk are you talking about? I know I can walk down any street without fear of being raped or kidnapped. Also, when it comes to jobs, men can demand more, and our skills are appreciated more than our looks. That is changing, but very slowly.
I can't comprehend.