If you're relying on the American movie industry to teach you about other cultures, you'll be wildly ignorant while being all arrogant about it.
Hollywood and the whole TV industry are the reason Americans can sound so ignorant in some aspects. They see these things on TV, assume them to be accurate, and next thing you know they're running their mouths like they know the culture better than you, who's from it and lived it. This white dude once came up to me and was telling me how life was for Haitians under Jean-Claude Duvalier, and I'm looking at this dude running his mouth like
"breh, you know i lived there during that time, right?".
They fail to remind themselves that Hollywood is trying to make money and couldn't give a fukk about accuracy, spreading ignorance if that's what it takes to sell the story. Like they say,
"Why let the truth get in the way of a good story?" . shyt like Rambo taking on entire soviet batallions on his own. All the dumb shyt mentioned about Haiti in "Serpent in the rainbow". etc etc. Some are cool and will ask questions and reference something they saw on TV and ask if it's accurate. But many will straight up try to tell you shyt like they been there. God forbid they're among their friends speaking on the topic not knowing someone from said culture is present to fact-check them. shyt can get wild.