Black women. They've also been shown to target Black boys in school more than any other teacher. It's clear a lot of Black women have very negative, stereotypical views of Black men and boys based on their Black feminist theories, and it extends to their sons. That's why we'll never have a real conversation about this; it just makes people uncomfortable.
Black women in America commit more statutory rape in schools?
Black women in America commit more infanticide?
Black women in America commit more child trafficking crimes?
Black women in America commit more acts of paedophilia?
Not disputing you, as I live in Great Britain; I would just like to know the source of your facts, please, so I can gain a better perspective.
My observation is that Black women in general physically beat their children more, which is definitely a form of abuse; I don't whoop my children. But I would not have thought Black women in America were the forerunners in the other forms of child abuse.