Let me begin by defining feminism according to Merriam-Webster: "the theory of the political, economic, and social equality of the sexes" and "organized activity on behalf of women's...