I've thought about this for years, but I've never felt like I had a safe place to discuss it.

I'm white and went to a majority-black high school and university. Most of my friends are black, and a good number of them are very Afrocentric. They like to post things on Facebook about how white people keep them down, hold them back, etc. Recently a friend posted this article, which I had seen a year or so ago. I commented that we shouldn't judge people by their skin color, and that spreading toxic, ignorant articles like that around just makes things worse for everyone. Someone else said that all white women are weak and obey whatever they're told. I said she was being hateful, and she said it was her opinion and if I didn't like it, oh well. I said of course I don't like it, it's racist!

I just don't get why it's okay to say shit like that. I'll never claim to know what it's like to be black, but I do know what it's like to be a minority (here in CR, and while in high school and college) and to be judged by the way you look. It's so frustrating to be in this situation.

I guess I get the idea that black women need to be proud of how they look and who they are. All of society tells them that being lighter is beautiful, that straight, flowing hair is better, etc. But you don't need to put other people down just to feel better about yourself! You can love yourself and love everyone else. There's no need for hate.

This was a rant, I guess. I'm not really looking for replies; I just needed to let it out.