So I take it every other country on earth is a racial paradise?
That a white person in Africa or China wouldn't be seen or treated as the other? How many movies coming out of Asia or Africa mandate that whites appear in them, and in non-stereotypical roles (i.e. any movie set after colonialism that isn't a war movie)?
Ever notice how frequently and quickly you deflect by invoking random countries? Not sure what Africa (Africa is a continent, mate, not a country...) or China has to do with America.
No, it doesn't. It just fights racism with more racism.
The real way to fix minority representation in companies and higher education is EDUCATION: more school funding and better schools.
But aren't white American women the biggest beneficiaries of Affirmative Action in your country?
Better education doesn't seem likely to happen in your country with that Betsy DeVos lady in charge of it all. Honestly, I feel like America secretly hates children lol.