It was after WW2. The US had spent the century before that rapidly building up the country through the Industrial Revolution and massive manufacturing output: steel, oil, and other raw materials sold on international markets, plus tons of finished products (fridges, radios, cars, etc.) for the American population to consume (the beginning of the consumerism we see today).
Prior to entering the Second World War, America was very much an isolationist state, similar to China during much of the 20th century. They did partake in South American/North American affairs, but that was because of the Monroe Doctrine and really just the beginning of exercising their military might in their own sphere of the world. When Japan attacked America and America entered the war, they were ready to flex the military and economic power they had built up over the past century. When they dropped the atomic bombs on Japan to ultimately end the war, that was, IMO, the TRUE beginning of America's world dominance.
Right after this, they went into a lengthy Cold War against the Soviets, a battle between communism and capitalism over which way of life would serve the world better and over opening up countries and "markets" all around the world. They won this too by the end of the 1980s with the dissolution of the Soviet Union. I glossed over a lot, but to answer your question, it was after the war that this American arrogance began. Especially with the Marshall Plan and the rebuilding of Europe after it was destroyed by the war. Not to mention their technological advances throughout the '40s-'60s, especially during the Space Race. All of this added to the arrogance and the idea of "American exceptionalism".