How White Feminism Betrays Women of Color
White feminism describes the dominant strain of feminism in the United States: one centered on the experiences and needs of white women, which often ignores or marginalizes the experiences of women of color.