A deceptively hard question to answer!
Why is it hard?
Because both JPEG and WebP (which is really just a VP8 intra frame) can represent images with a variable number of bits.
But it gets even trickier! How do you measure "savings"? Certainly you can produce two images with the same (or similar) bitrates in both WebP and JPEG, but how do you say one is better than the other? How do you lower WebP's bitrate until its "quality" matches JPEG's?
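To make the bitrate side concrete, here's a minimal sketch (Python with Pillow; "input.png" is a placeholder for whatever source image you test with) that encodes the same pixels as JPEG and WebP at a few quality settings and prints the encoded sizes. It also illustrates the catch: quality=75 in JPEG and quality=75 in WebP are not the same subjective quality, because the scales are format-specific.

```python
import io

from PIL import Image

# "input.png" is a placeholder for any source image.
img = Image.open("input.png").convert("RGB")

# Encode the same pixels in each format at a few quality settings and
# compare encoded sizes. The quality knobs are NOT comparable across
# formats, which is exactly why "savings" is hard to measure.
for fmt in ("JPEG", "WEBP"):
    for q in (50, 75, 90):
        buf = io.BytesIO()
        img.save(buf, format=fmt, quality=q)
        print(f"{fmt} quality={q}: {buf.tell()} bytes")
```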
Metrics such as PSNR, SSIM, and VMAF all exist to try to quantify "quality." However, they each have flaws that let an image compression format produce worse subjective quality while improving its objective score. (For example, codecs that optimize for PSNR tend to be blurrier than codecs that target SSIM; grass ends up looking like big blobs of green.)
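For a feel for what those metrics actually compute, here's a rough sketch of PSNR by hand and SSIM via scikit-image. The filenames are hypothetical stand-ins for the pristine original and a lossy round-trip, assumed to be the same dimensions:

```python
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

# Placeholder filenames: the original and a decoded lossy copy.
ref = np.asarray(Image.open("ref.png").convert("L"), dtype=np.float64)
test = np.asarray(Image.open("test.png").convert("L"), dtype=np.float64)

# PSNR is just log-scaled mean squared error: higher is "better", but it
# says nothing about WHERE the error is, which is why a PSNR-optimized
# codec can smear fine detail (blurry grass) and still score well.
mse = np.mean((ref - test) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)

# SSIM compares local structure (luminance/contrast/covariance over
# windows) rather than raw pixel error, so it penalizes blur more.
ssim = structural_similarity(ref, test, data_range=255.0)

print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")
```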
In fact, because x264 was often being beaten by other codecs on those metrics, its developers went out of their way to add "cheat" modes to the encoder! You can tell x264 to target PSNR or SSIM :D. Neither is the default.
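(The knobs I mean are x264's --tune psnr and --tune ssim, which turn off its psychovisual optimizations so the encoder chases the metric instead of looking good. A hedged sketch of invoking them through ffmpeg's libx264 wrapper, with placeholder filenames:)

```python
import subprocess

# Placeholder filenames; -tune psnr / -tune ssim are libx264's
# metric-chasing modes (they disable psy-rd and friends).
for tune in ("psnr", "ssim"):
    subprocess.run(["ffmpeg", "-i", "input.mp4",
                    "-c:v", "libx264", "-tune", tune,
                    f"out_{tune}.mp4"], check=True)
```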
Just some fun thoughts. Subjectively, I'd say WebP and the newer AVIF or HEIF do a better job than JPEG (to my eyes). However, I can see why others might disagree.