Right, so I'm asserting that the parameter usually used in big-O analysis of hash table operations is the wrong one: it measures lookup complexity, not hashing complexity, which is usually O(n). That's what produces near-meaningless complexity statements like "insert is O(1) on average but O(n) (or Θ(n)) worst case".
This analysis uses the hash table length as the parameter under consideration, but that's silly, because most of the cost of a hash table operation is in the hashing itself, which (depending on the hash function) usually has a known complexity of O(n), where n is the length of the input string. That parameter matters far more than table size, because the hashing step dominates the runtime.
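To make that concrete, here's a minimal sketch using FNV-1a as one illustrative string hash (simplified to code points; real FNV works on bytes): the loop touches every character of the key exactly once, so the hash alone costs O(n) in key length, regardless of how big the table is.

```python
def fnv1a_64(key: str) -> int:
    """64-bit FNV-1a, simplified: one pass over the key, so O(n) in len(key).
    (Illustrative sketch; the real algorithm hashes bytes, not code points.)"""
    h = 0xcbf29ce484222325                            # FNV offset basis
    for ch in key:                                    # n iterations for an n-char key
        h ^= ord(ch)
        h = (h * 0x100000001b3) & 0xFFFFFFFFFFFFFFFF  # FNV prime, mod 2^64
    return h
```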
You can also do multi-parameter big-O analysis: O(mn) is a thing, for example, with two differently sized parameters, both of which contribute to the complexity of the algorithm and can't be dismissed.
So charitably, if you need to give a big-O for, say, hash table inserts, it's reasonable to say O(mn), where m is the length of the table and n is the length of the input string. But it's not necessary, since table length usually contributes little to the complexity of the operation; hence people keep saying inserts are O(1) on average, because table length isn't much of a determinant in hash table operation cost. Just as big-O hides constants, we can hide parameters that are dominated by another. O(1) does this backwards: it keeps the negligible parameter and hides the dominant one.
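A toy chained table makes the split visible (this is a hypothetical sketch, not any real implementation): each insert pays O(n) to hash the key, plus an expected O(1) to place it in a bucket. The key length n is what you actually feel; the table length m barely enters into it.

```python
class TinyTable:
    """Toy chained hash table (illustrative only). Per-insert work splits into:
    hashing the key, O(n) in key length, plus bucket placement, expected O(1)
    at a sane load factor. Table size m mostly just picks the bucket."""

    def __init__(self, nbuckets: int = 8):
        self.buckets = [[] for _ in range(nbuckets)]

    def _hash(self, key: str) -> int:
        h = 0
        for ch in key:                      # the O(n) part: every character
            h = (h * 31 + ord(ch)) & 0xFFFFFFFF
        return h

    def insert(self, key: str, value) -> None:
        b = self.buckets[self._hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(b):      # expected O(1) chain scan
            if k == key:
                b[i] = (key, value)         # overwrite existing key
                return
        b.append((key, value))

    def get(self, key: str):
        b = self.buckets[self._hash(key) % len(self.buckets)]
        for k, v in b:
            if k == key:
                return v
        raise KeyError(key)
```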
My guess is that O(1) is some weird cargo-culted remnant of early work on hash tables as a generalization of arrays, likely from when hash functions were just modulus operations on integer keys or some other simple, fixed-length hashing method that was easy to ignore (like the universal hash in Cormen's "Introduction to Algorithms"). But modern hash tables can't make these assumptions, and I, along with quite a few other folks, think this needs to be rethought.
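For contrast, a universal hash on integer keys, in the style of the textbook scheme, really is O(1): a constant number of arithmetic operations per key, precisely because the key is a bounded integer rather than a variable-length string. A rough paraphrase (the prime and random a, b follow the textbook construction; this is my sketch, not its code):

```python
import random

def make_universal_hash(m: int):
    """Universal hashing on integer keys: h(k) = ((a*k + b) mod p) mod m.
    A constant number of arithmetic ops per key, i.e. O(1), but only
    because k is a bounded integer, not an arbitrary-length string."""
    p = 2_147_483_647                  # a prime (2^31 - 1), larger than any allowed key
    a = random.randrange(1, p)         # a, b chosen at random, per the scheme
    b = random.randrange(0, p)
    return lambda k: ((a * k + b) % p) % m
```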
Some examples (I don't agree with all of these, but I think they make the point):
https://stackoverflow.com/questions/2771368/can-hash-tables-...
http://lemire.me/blog/archives/2009/08/18/do-hash-tables-wor...
> Sound like a analytic club that's rarely accurate to real-world performance? Yup, that's big-O
If your complexity analysis isn't borne out by real-world performance, it's likely that you aren't analyzing the right parameters.