If anything, the right has largely shifted to the center over the course of my life. For example: they, along with Democrats for quite a while, were opposed to gay marriage. While I'm sure there's still a small contingent that would roll that back if they could, it's very much not an issue for the party anymore.
The left, or at least a good portion of the Democratic Party? They've gone off the rails in the past 10+ years. Everything is now about race or sex. Everything. It's 100% OK to discriminate against white men. We should have open borders. Abortion should be legal up until birth. Prepubescent children should be allowed to go on hormone blockers and get surgery if they think they're trans. Trans women should be able to compete against women in sports. The Kavanaugh confirmation was a disgusting point in US history. Let's ban guns. The list could go on.
From what I see, Republicans largely want things to stay the same or be rolled back by a decade or so.