In many societies, women believe they have been oppressed, hence the need for the feminist movement. Yet it is often argued that women also benefit from the very patriarchal society they claim to be fighting against (e.g., women expecting the man to be the breadwinner), even though feminism stands for the equality of both sexes.
I would like to know your thoughts on this.
It's not about the benefits women can receive. The point is freedom and equality, and I think those two concepts matter much more than any benefits.
After all, these are the values of modern Western society, and they apply equally to men and women.