Yeah, that's a commonly mentioned problem with citation counting. Much like linkbait on the internet, a poor-quality paper taking an inflammatory position can rack up citations from people debunking it. Another problem is throwaway citations: a paper gets cited as a generic example rather than because it contributes anything the citing paper actually draws on.
Unfortunately, it's much harder to come up with better measures. Given a smallish corpus of a few hundred papers, humans could read through them and annotate each citation with labels like "cited to debunk", "cited to distinguish related work", "cited for general background", "cited in passing", "cited for result", etc. But computers aren't yet very good at doing that automatically, so large-scale citation analysis falls back on dumb citation counting.
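Just to make the annotation task concrete, here's a toy sketch of how a crude automatic labeler might look. All the cue phrases and the dictionary are illustrative assumptions, not a real system; anything serious would need a trained model over the sentence surrounding each citation:

```python
# Toy citation-intent labeler: scan the sentence around a citation for cue
# phrases and pick a label. Labels follow the annotation scheme mentioned
# above; the cue phrases are purely hypothetical examples.
INTENT_CUES = {
    "cited to debunk": ["refute", "debunk", "contrary to", "fails to"],
    "cited to distinguish related work": ["unlike", "in contrast to", "differs from"],
    "cited for result": ["showed that", "reported", "achieved", "demonstrated"],
    "cited for general background": ["for a survey", "has been studied", "e.g.,"],
}

def classify_citation(context: str) -> str:
    """Return the first intent label whose cue phrase appears in the context."""
    text = context.lower()
    for label, cues in INTENT_CUES.items():
        if any(cue in text for cue in cues):
            return label
    # No cue matched: fall back to the weakest label.
    return "cited in passing"

print(classify_citation("Unlike Smith et al. [3], we model the full graph."))
print(classify_citation("Several systems exist in this space [1, 2, 4]."))
```

Even this trivial heuristic shows why the problem is hard: cue phrases are ambiguous and the default bucket swallows everything else, which is roughly why the field still counts raw citations.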