I'm agnostic, but I was raised Christian in different communities on different continents, and I was never taught that Jesus's return is anything but a peaceful end-time where the just go to heaven (and yes, the wicked to hell). Zero mention of Jesus waging war; if anything, it's taught as the end of war and strife.
Not a single person I know associates Jesus with violence. I'm sure someone out there believes this, but calling it "mainstream" is bad faith. What sort of "mainstream" theology is it if neither I nor anyone I know has ever been taught it?