If you do a small-scale measurement, say you get a result of 5 g with a standard deviation of 0.2 g. That means the variance is 0.04 g^2.
If you then scale the setup up by a factor of 1000 (so the expected value becomes 5 kg), the variance scales by 1000^2: 1000^2 * 0.04 g^2 = 40000 g^2.
BUT the standard deviation is only 200 g (= sqrt(40000 g^2)), i.e. it scaled by 1000, not 1000^2. The relative uncertainty (200 g / 5000 g = 4%, same as 0.2 g / 5 g) is NOT increasing quadratically; it hasn't changed at all!
(Another sanity check: if you merely change units by a factor of 1000, say from kg to g, the relative uncertainty must not change, even though the numerical value of the variance grows by 1000^2.)
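A quick numerical sketch of this (assuming a normally distributed measurement, just for illustration): scaling a random variable by k multiplies its standard deviation by k and its variance by k^2, while the relative uncertainty stays put.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small-scale measurement: mean 5 g, standard deviation 0.2 g
small = rng.normal(loc=5.0, scale=0.2, size=1_000_000)

# Scaling the setup by 1000 multiplies every measurement by 1000
large = 1000 * small

print(small.std(), small.var())  # ~0.2 (g), ~0.04 (g^2)
print(large.std(), large.var())  # ~200 (g), ~40000 (g^2)

# Relative uncertainty is unchanged: ~0.04 in both cases
print(small.std() / small.mean(), large.std() / large.mean())
```

The standard deviation carries the same units as the measurement, which is why it (and not the variance) is the right quantity to compare across scales.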
But maybe I misunderstood your point?