I recently watched the movie "Wild" with Reese Witherspoon (good movie, by the way), and it led to an interesting discussion with some friends about why there don't "seem" to be many of us who enjoy nature. I personally love nature and being outside. I believe the notion of black people not liking nature plays into old stereotypes about us being easily spooked and cowardly. I've heard a lot of us reinforce these stereotypes by saying "it's what white people do" and "black people don't do shyt like that," as if we lack the desire for adventure that is innately present in every human being.
I believe many of us have a hidden desire to be adventurous and get out into nature but are held back by what we've come to accept as cultural norms. We as a people have historically lived off the land, so the idea that we have some natural aversion to nature is nonsense. As I said, I truly believe that because of the current cultural norms and stereotypes we've opted into, we've come to see embracing nature as something we aren't supposed to do.
What do you guys think?