Feminism

Feminism: Belief in, and advocacy for, the liberation of women to a social status equal to that of men.