I recently came across this article about certain celebrity women who are afraid to call themselves feminists. I found it very interesting that most of them declared themselves in support of gender equality but "not feminists." Hmmm. I respect their judgments and opinions, but it makes me wonder: do we really know what feminism means?
The actual definition of feminism is "the advocacy of women's rights on the grounds of political, social, and economic equality to men." I won't lie and say there isn't a negative connotation to the word nowadays, but there is a real need for people (especially women) to look past this silly association. I'm guessing the stereotype arose when women were even more headstrong about their rights than they are today, and, admittedly out of bias, I'd also guess it arose because people (men) felt threatened. In reality, all feminists want is the same rights that men have. Because let's face it, we are all human beings.
The article puts it perfectly:
“You don’t have to be some angry, man-hating termagant. In fact, if that’s your idea of what a feminist is, you may be basing your image on what the most loathsome trolls on the Internet call us.” Amen, sister.