I watch the BBC a lot and I've noticed a trend. They make documentaries about the US all the time. Smack in Suburbia, Louis Theroux on Scientology, gang violence, etc.
The BBC shyts on America's problems all the time. Why? What's their agenda?
As an American, I kinda find this offensive.
Doesn't the U.K. have the same problems? Why are they so fascinated with the US? It's like they're trying to make the US look bad. fukk that shyt...