White Supremacy and Western Hegemony in the United States

White supremacy refers to the belief that white people are innately superior to people of other races or ethnicities and are therefore uniquely suited to rule society and establish its norms. This belief system can manifest as both personal sentiment and structural policy.