Feminism: what do you think it is?
As a woman and a feminist, I say that feminism is about fighting for my rights rather than competing with men. But most men think women only want to dominate them. What are your opinions?
It is just a word with a defined meaning:
Feminism is a range of social movements, political movements, and ideologies that share a common goal: to define, establish, and achieve political, economic, personal, and social equality of the sexes. This includes seeking to establish educational and professional opportunities for women that are equal to those for men. (From Wikipedia)
Hardly anything to debate.
I think feminism has a context, and I'm not sure whether we should focus on the context first. I personally haven't seen women being discriminated against in the workplace or being paid less than men for the same job, so it's difficult for me to comment on it.