Much of the reason RDP and WebGL work so well is that they send the raw command stream/assets to the rendering side rather than continuously sending a stream of compressed framebuffer data. RDP, for example, can redirect the raw video stream being played (or recompress it and send that) rather than rendering it to the screen and then recompressing the resulting output, which is what you get with something like VNC. That's why it's completely possible to stream video over RDP on links that choke up pretty much everything else.
For an incredibly wide range of things, high-level GUI command streams are going to be significantly less data-intensive than the resulting rendered and compressed video stream. AKA the GUI command stream is itself an application-specific compression algorithm. A draw-line command can affect tens of thousands of pixels around the line due to antialiasing, and that will never compress as well as draw_line(x, y, brush). So while sending a bunch of textures, shaders, and the like might be a huge initial overhead, it's going to quickly pay for itself over the lifetime of something like a game, particularly at high resolutions and refresh rates.
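A rough back-of-the-envelope sketch of that point (the command format, canvas size, and antialiasing here are all made up for illustration, not taken from any real protocol): serialize a hypothetical draw-line command, then render the same line into a raw 8-bit canvas and compress it, and compare the byte counts.

```python
# Hypothetical sketch: compare the wire cost of a "draw line" command
# against the cost of shipping the pixels it touches, even after compression.
import struct
import zlib

# A made-up command encoding: opcode, two endpoints, brush id (10 bytes total).
command = struct.pack("<BhhhhB", 0x01, 10, 10, 500, 300, 3)

# Render the same line, crudely antialiased, into a 512x320 8-bit canvas.
W, H = 512, 320
canvas = bytearray(W * H)
x0, y0, x1, y1 = 10, 10, 500, 300
steps = max(abs(x1 - x0), abs(y1 - y0))
for i in range(steps + 1):
    t = i / steps
    x = x0 + (x1 - x0) * t
    y = y0 + (y1 - y0) * t
    xi, yi = int(x), int(y)
    fx, fy = x - xi, y - yi
    # Spread coverage over the four neighbouring pixels (naive antialiasing),
    # so the line touches a band of pixels rather than a single-pixel stroke.
    for dx, dy, w in ((0, 0, (1 - fx) * (1 - fy)), (1, 0, fx * (1 - fy)),
                      (0, 1, (1 - fx) * fy), (1, 1, fx * fy)):
        px, py = xi + dx, yi + dy
        if 0 <= px < W and 0 <= py < H:
            canvas[py * W + px] = min(255, canvas[py * W + px] + int(255 * w))

compressed = zlib.compress(bytes(canvas), 9)
print(len(command))     # the command itself: 10 bytes
print(len(compressed))  # the compressed pixels: orders of magnitude more
```

And this canvas is nearly all zeros, which is the best possible case for the pixel-side compressor; a real desktop full of text and gradients only widens the gap.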
Never mind, of course, the overhead of doing a bunch of local rendering, compressing it, sending it over the wire, and doing video playback. If it weren't for hardware video encoding/decoding, it would essentially be a case of pegged CPUs on both sides just doing graphical updates.
This may just be one of the issues with Wayland compositing; over time, I've become more convinced that giving up standard GUI widgets and a serializable drawing stream might have been a huge mistake.