To be honest, you don't have to use Google Website Optimizer to notice that.
Example: I'm releasing my Rails A/B testing framework this Sunday, with a case-study writeup taken from my site. My site currently lets people defer signup for the trial via a guest login. That sort of option is popular here; I'm told it increases usability and user engagement, right?
Without ruining the surprise: that is a testable hypothesis. ;)
I stopped reading at the first misuse of statistics, but by that point they had already stripped informative links from their website and turned it into a landing page. Frankly, a 5-point change in conversion rate is in the noise compared to the effect of targeted advertising (and I have no faith that these sloppy reporters actually ran a properly controlled experiment).
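To make "in the noise" concrete: the article doesn't report sample sizes, so here's a rough sketch assuming a hypothetical 500 visitors. At that traffic level, the error bar on a single measured rate is nearly as big as the change itself:

    import math

    # Hypothetical sample size -- the article doesn't report one.
    visitors = 500
    p = 0.244
    se = math.sqrt(p * (1 - p) / visitors)   # binomial standard error
    print(f"measured rate: 24.4% +/- {1.96 * se:.1%} at 95% confidence")
    # -> roughly +/- 3.8 points, so a 5.2-point swing is barely outside it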
Edit: Downmodded for asking for real math over marketing math?
The old conversion rate was 24.4%. The new one was 29.6%.
(29.6 - 24.4) / 24.4 = 0.213, i.e. a 21.3% relative improvement.
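Spelled out as a quick sanity check, those are two different numbers being conflated:

    old, new = 0.244, 0.296
    print(f"absolute lift: {(new - old) * 100:.1f} percentage points")  # 5.2
    print(f"relative lift: {(new - old) / old:.1%}")                    # 21.3%, the headline number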
Is it misleading because their number of conversions is so low? (Not trying to be snarky, I have no idea)
Reporting "a percent of a percent" is downright misleading because it is entirely dependent on the starting baseline and also because conversion rates have high natural variance. Imagine an improvement from 2% conversion (for a truly awful site) to 2.6% (for a site just as awful). That "30% improvement" just isn't. If they were intellectually honest and called it a "0.6% improvement," anyone would be able to see that the claimed improvement is well within stddev.