In many societies, women believe they are oppressed, hence the push for the feminist movement. Yet it is often argued that women also benefit from the very patriarchal norms they claim to be fighting against (e.g., expecting the man to be the breadwinner), even though feminism is supposed to stand for the equality of both sexes.
I would like to know your thoughts on this.
Everything depends on the society. In my country, women can get an education, work, and run businesses the same way men can. Equal rights and opportunities for all, without exception: that's what we need to achieve.
Because feminism is trendy right now, some women abuse it. The fact that you are a woman (or a man) does not mean you should "play that card" all the time.