I watch the BBC a lot and I've noticed a trend: they make documentaries about the US all the time. Smack in Suburbia, Louis Theroux on Scientology, gang violence, etc.
The BBC shyts on America's problems all the time. Why? What's their agenda?
As an American, I kinda find this offensive.
Doesn't the U.K. have the same problems? Why are they so fascinated with the US? It's like they're trying to make the US look bad. fukk that shyt...
White people tend to use the mainstream media as propaganda, painting themselves in the best light possible. But when you're on the outside looking in, the glaring flaws in this country aren't hard to see.