Many women and girls think feminism means hating men, and that misconception has genuinely harmed society, all in the name of equality with men.
Feminism is essentially an ideology that challenges patriarchy and advocates for the equality of men and women. I took a course on feminism in school, so I know a thing or two about it and what it aims to achieve. Because of the male-centered society we live in today, especially in Africa, many women feel intimidated by men and believe the only way to be free from male domination is to cling to feminism, calling themselves feminists in a bid to seek protection under the umbrella of the movement.
Having said that, hating men is not actually the purpose of feminism. There are, however, several types of feminism, one of which is radical feminism, and I think that is where most of this outright hatred of men falls. More people need to be educated about the fact that they do not have to be radical in their quest for equality, and with that education this issue can be properly addressed.