I can't think of a single rule or law in religion that actually helps women.
They are told to wear shyt they don't like
They are basically property to men
They can't hold positions of power in their religions
so what the fukk do women get from it that makes them believe in it more than men?
I just recently learned this and was confused. Women should be the FIRST people to say fukk it and ditch religion.
for fukk's sake, according to the Torah and the Bible, all of humanity's sins exist because a woman ate a god damn apple. they really blamed the ORIGINAL SIN on a woman
