My parents were involved in groups like the National Organization for Women (NOW) back when women's rights around abortion, equal pay, etc. were being attacked under the Reagan administration in the 80s. So I grew up with strong feminist role models all around me. I was taught to treat everyone as an equal and to speak respectfully, using terms like police officer instead of policeman, firefighter instead of fireman, and so on. Think how quaint that sounds today!
I'm appalled that pretty much all of that seems to have died since the 90s. Grown women on reality TV call themselves girls. Traditional gender roles seem to be cemented in place by marketing. I see the most chauvinistic, repulsive men rewarded for being "alphas".
So I dunno, to me it feels like it's over. I don't see a way back to progress when half the population subscribes to gender stereotypes. I feel like I was prepared for an egalitarian world that never came to be.
And it's not just feminism. All of the social justice causes I'm most passionate about, starting with ending wealth inequality, seem to have fallen by the wayside. It's just bad news all the time, on every front. Maybe the silver lining is that it's all fake. Maybe we can shift out of this false reality and manifest a better one. That thought is about all that's keeping me going anymore.