AC used to let you step up voltage easily (just a transformer), and switch without as much arcing (the current crosses zero twice per cycle, which helps quench the arc), and higher voltage means less current, and therefore less resistive loss, when sending power over long wires.
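To put rough numbers on the long-wire point, here's a quick sketch (the power level and wire resistance are made up for illustration):

```python
# Rough arithmetic for why higher voltage helps over long wires.
# Numbers are illustrative, not from any real installation.
P = 10_000.0   # power to deliver, watts
R_line = 1.0   # total round-trip wire resistance, ohms

for V in (120.0, 240.0, 2400.0):
    I = P / V              # current needed at this voltage
    loss = I**2 * R_line   # I^2 * R loss in the wire
    print(f"{V:>7.0f} V -> {I:7.2f} A, {loss:8.1f} W lost in the wire")
```

At 120 V you'd lose almost 7 kW of the 10 kW in the wire itself; at 2400 V the same wire loses about 17 W.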
We also rate AC voltage by RMS, which (for resistive loads) is the DC voltage that would deliver the same power. But the actual peak is higher, √2 times the RMS for a sine wave, so parts need a higher voltage rating even though we only touch that peak for an instant each half-cycle.
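Here's a quick numeric sanity check of the peak/RMS relationship (assuming a clean sine, and using 230 V mains as the example):

```python
import math

# Numeric check that a sine's RMS is peak / sqrt(2).
# 325 V peak is roughly what a nominal 230 V RMS mains sine swings to.
V_peak = 325.0
N = 100_000
samples = [V_peak * math.sin(2 * math.pi * k / N) for k in range(N)]
v_rms = math.sqrt(sum(v * v for v in samples) / N)

print(f"peak: {V_peak:.1f} V, RMS: {v_rms:.1f} V, "
      f"peak/sqrt(2): {V_peak / math.sqrt(2):.1f} V")
```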
And then there are issues with power factor. Modern gadgets don't draw current in a nice sine wave unless you add power factor correction (PFC) circuitry. That circuitry is common now, but it wouldn't be needed with DC. Without PFC, you push more RMS current through the wiring for the same real power, so you get more line resistance loss than you want.
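To see why a poor power factor costs you in the wiring, some back-of-envelope arithmetic (the wiring resistance is an arbitrary illustrative value):

```python
# Same real power delivered, different power factors.
# Line loss scales with the square of RMS current, so a poor power
# factor (distorted or phase-shifted current) wastes more in the wires.
P_real = 1000.0   # watts actually consumed by the load
V_rms = 230.0     # mains RMS voltage
R_line = 0.5      # assumed wiring resistance, ohms (illustrative)

for pf in (1.0, 0.9, 0.6):
    I_rms = P_real / (V_rms * pf)   # higher current for the same real power
    loss = I_rms**2 * R_line
    print(f"PF {pf:.1f}: {I_rms:5.2f} A RMS, {loss:6.2f} W lost in wiring")
```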
But now we have solid-state switching, and it's cheaper to convert DC with a switch-mode power supply (SMPS) than with a bulky line-frequency transformer.
We also don't commonly use long series circuits in anything I know of, except Christmas lights, which are mostly LED now and work fine as they are (as long as nobody skips the smoothing capacitor that gets rid of that horrid flicker the cheap ones have).
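For the curious, a back-of-envelope way to size that smoothing capacitor, assuming full-wave rectification and a roughly constant string current (both assumptions, and all numbers illustrative):

```python
# Rough sizing for a smoothing cap on a rectified LED string.
# Assumes full-wave rectification (cap recharges at 2x line frequency)
# and a roughly constant string current between recharge peaks.
I_led = 0.02      # string current, amps (20 mA)
f_line = 50.0     # line frequency, Hz
dt = 1.0 / (2 * f_line)   # time between recharge peaks, full-wave
dV_max = 2.0      # ripple we'll tolerate, volts

C = I_led * dt / dV_max   # from dV = I * dt / C
print(f"Need at least {C * 1e6:.0f} uF to hold ripple under {dV_max} V")
```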
Series circuits have the same current everywhere, but modern stuff constantly changes the current it draws to save power. That would be a problem if one device wants to be in sleep mode while another wants full power.
I suppose we could do series circuits with constant-current power supplies: the supply pushes a constant 1 amp, and the devices alter their voltage drop to draw more or less power, rather than altering their current draw.
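As a thought experiment, here's what that constant-current loop would look like on paper (device names and power levels are invented for illustration):

```python
# A 1 A constant-current series loop where each device picks its own
# voltage drop to get the power it wants; the supply's job is to hold
# the current steady and provide whatever total voltage that requires.
I_loop = 1.0  # amps, fixed by the supply

devices = {
    "sensor (sleeping)": 0.05,   # wants 0.05 W
    "router (active)":   12.0,   # wants 12 W
    "lamp (dimmed)":      4.0,   # wants 4 W
}

total_v = 0.0
for name, p_want in devices.items():
    v_drop = p_want / I_loop   # P = V * I, and I is fixed, so V = P / I
    total_v += v_drop
    print(f"{name:<18} drops {v_drop:6.2f} V for {p_want:5.2f} W")

print(f"supply must provide {total_v:.2f} V at {I_loop:.1f} A")
```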
AC seems to be more suited for higher voltages (where arcing is a problem) and analog stuff (if you don't want to use SMPSes and you have resistive loads to drive).