I'm so tired of these kinds of movies. We get a few slavery biopics and "we shall overcome" films every year. I prefer something fresh like Get Out or The Last Black Man in San Francisco. Let's get some new stories.
I'd rather see documentaries or read a book about stories like this, because these movies either follow the same Hollywood tropes or b*stardize the story until it's borderline fiction. They'll throw a white savior or magical negro in there somewhere. Look at the Harriet Tubman movie, where they threw in a white love interest and a black villain, and still crammed in some magic.
I'm not looking to Hollywood to learn about history
You saw Harriet?