A good analogy might be found in some software I'm working on now. Part of it is a specialized serialization scheme designed to produce compact serialized forms of large numbers of similar tree graphs. It is easy to quantify the size of the serialized form of N million trees, and it is easy to measure the CPU/IO time required to produce this form. However, when making design decisions, I only use these metrics as one input into my thought process. If I do X, how easy will it be for others to maintain after I leave (I'm a contractor)? Does a proposed design decision limit future opportunities for this module's use? Does "brittleness" increase? Does this decision move the design further away from using Mongo/HBase/whatever in the future, or does it lessen the current impedance mismatch? The easy metrics are time and space, but they might not be the most important ones.
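To illustrate what "easy to quantify" means here, a minimal sketch in Python (assuming, purely for illustration, a pickle-based serializer and a toy tree shape; the real scheme is a custom format):

```python
import pickle
import time

def make_tree(depth, label):
    # Small recursive tree; "similar" trees differ only in their labels.
    if depth == 0:
        return (label,)
    return (label, make_tree(depth - 1, label), make_tree(depth - 1, label))

# Build many similar trees, then measure the two easy metrics:
# serialized size in bytes, and wall-clock time to produce it.
trees = [make_tree(6, i % 10) for i in range(10_000)]

start = time.perf_counter()
blob = pickle.dumps(trees)
elapsed = time.perf_counter() - start

print(f"{len(blob)} bytes for {len(trees)} trees in {elapsed:.3f}s")
```

Both numbers come out of a few lines of code; maintainability, brittleness, and future storage-backend fit have no such one-liner, which is exactly the point.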