The White Center
Nearly everything in the United States is centered on white people and whiteness. Because white people occupy that center, white norms are treated as the default, while other norms, such as African American norms, are frowned upon. Everything is measured against whiteness, which sits at the core of both racism and American society as a whole. White people have established that being white is good while being black is not, both consciously …