To be honest, living in the United States is very depressing.
The individualistic culture makes most people isolated and reclusive. There's no sense of community, no warmth. Very cold culture.
The "work work work till you die" mantra leads to feeling dead inside. We're just surviving to survive. Existing not even living. On top of that it's a dog eat dog society where everything is a competition. It's exhausting.
FEAR is the main emotion promoted here. People have a chip on their shoulder; it's like everybody is just scared. The media leans on fear-mongering tactics. Of course it does: fear drives traffic, and traffic drives revenue.
The dating scene is a disaster. Entitlement, narcissism, and passive aggression... a toxic cocktail. There's no passion, no camaraderie; everything is soulless. Women don't really seem "mentally healthy," and men go on rage attacks.
Making friends here is hit or miss. There's a lack of warmth, and most people don't trust each other.
Fake outrage and political correctness everywhere.
Fast food is cheap; healthy food is expensive. Almost every food in the supermarket has added sugar.
Though this highly depends on the region, culturally the U.S. is declining.
If you ever get the chance to travel or land a remote job outside the States -- take it!
Thanks for listening to my rant. Hope you enjoyed it; this is just how I feel.