Definition

Feminism

Feminism is the advocacy of the political, economic, and social equality of the sexes.


Feminism is also a social movement, or a viewpoint, committed to the removal of prejudice against women and of differential treatment of men and women, and to the advancement of women's interests in general.
