You gotta drop this fantasy that black people in the US aren't Americans.
We're here.
And it's our duty to steer this ship. We have just as much of a vested interest in Westernization as anyone else. We have a unique history, and thus a unique future.
I'm not a second-class citizen, so stop talking like I am.
When has this country ever acknowledged black Americans as Americans?
They ask us to assimilate...but they don't let us assimilate.
Black people in the US have been at the mercy of this country since the transatlantic slave trade.
When the Emancipation Proclamation was announced and slaves were freed, methods were employed to keep the system going and to keep black men and women as a source of free labor: black codes, the KKK, the runaway slave patrols (which would eventually become the police), Jim Crow laws, segregation, the Rockefeller drug laws, and gentrification, all of it keeping black people in America as a proverbial underclass.
Black men and women fought for this country in World War I, only to be killed in race riots by whites when they came home. And the same thing happened after World War II.
During WWII, white soldiers had Thanksgiving dinners with Nazi soldiers while black soldiers were excluded. Period.
Whites did everything they could to shut black people out of creating any sort of wealth: from FDR's New Deal social services and welfare programs, which blacks were shut out of, to the creation of the first suburban neighborhoods, where white soldiers were given tax incentives and deals on houses while black people were not allowed even to buy a house and were denied any chance to rent.
So you want to steer this ship...but they won't let you.