do you want to call them all 'computers' now?
When the arithmetic circuits (what von Neumann called the "central arithmetical part") are coupled with a "central control part" (again von Neumann's term), i.e. with a sequencer connected in a feedback loop with the arithmetic part so that computation results can modify the sequence of computations, then the device must be called a "computer", regardless of whether the computations are done with analog circuits or with digital circuits.
What defines a computer (according to the definition already given by von Neumann, which in my opinion is the right one) is closing the feedback loop between the arithmetic part and the control part, which raises the order of the system above that of a simple finite state automaton. It is not how those parts are implemented.
The control part must be discrete, i.e. digital, but the arithmetic part can be completely analog. Closing the feedback loop, i.e. executing the conditional jumps of the control part, can be done with analog comparators that provide the predicates tested by those jumps. The state of an analog arithmetic part is stored in capacitors, inductors or analog integrators, instead of in digital registers.
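To make the structure concrete, here is a toy simulation of that arrangement (my own sketch, not a model of any real machine; all names and constants are mine): the "analog" state is an integrator voltage advanced by Euler steps, a comparator turns it into a boolean predicate, and a discrete sequencer tests that predicate for its conditional jump.

```python
# Toy model: analog arithmetic part (an integrator, standing in for a
# capacitor voltage) closed in a feedback loop with a digital sequencer
# via an analog comparator. Names and values are illustrative only.
def run(threshold=1.0, rate=0.3, dt=0.1):
    v = 0.0       # analog state: integrator/capacitor voltage
    step = 0      # digital state: the sequencer's program counter
    while True:
        v += rate * dt              # analog integration step
        predicate = v > threshold   # analog comparator output
        step += 1
        if predicate:               # conditional jump closes the loop
            return step, v

steps, v = run()
print(steps, v)
```

The point of the sketch is only that the branching decision (the comparator plus the `if`) lives in the discrete control part, while the quantity being computed stays continuous.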
Several decades ago, I had to debug an analog computer during its installation, before it ran for the first time. That was in a metallurgical plant, and the analog computer produced outputs that controlled the torques of a group of multi-megawatt DC electric motors. The formulae used in the analog computations were very complex, with a large number of adders, multipliers, integrators, square-root circuits and so on, combining inputs from many sensors.
That analog computer (built with op amps) performed a sequence of computations much more complex than the algorithms executed on an Intel 8080, which controlled the various on-off actuators of the system, like relays, hydraulic valves and the induction motors that powered some pumps.
The main reason why such analog computers have become obsolete is the difficulty of ensuring that the accuracy of their computations will not drift due to aging and temperature variations. Making analog computers that are insensitive to aging and temperature raises their cost far above that of modern digital microcontrollers.
you can even include multiplexors in your analog 'computer' with only adders, multipliers, and constants; x · (1 + -1 · y) + z · y interpolates between x and z under the control of y, so its output is conditionally either x or z (or some intermediate value). but once you start adding feedback to push y out of that intermediate zone, you've built a flip-flop, and you're well on your way to building a digital control unit (one you could probably build more easily out of transistors than op-amps). and surely before long you can call it a digital computer, though one that is controlling precision linear analog circuitry
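a quick numeric check of that formula (plain python; the function name is mine):

```python
def analog_mux(x, z, y):
    # x*(1 + -1*y) + z*y, built only from adders, multipliers, and constants;
    # algebraically this is the linear interpolation x*(1-y) + z*y
    return x * (1 + -1 * y) + z * y

print(analog_mux(3.0, 7.0, 0.0))  # -> 3.0  (y=0 selects x)
print(analog_mux(3.0, 7.0, 1.0))  # -> 7.0  (y=1 selects z)
print(analog_mux(3.0, 7.0, 0.5))  # -> 5.0  (intermediate y blends them)
```

with y held at exactly 0 or 1 it behaves as a 2:1 mux; the flip-flop feedback mentioned above is what would keep y pinned to those endpoints.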
it is very commonly the case that analog computation is much, much faster than digital computation; even today, with microprocessors a hundred thousand times faster than an 8080 and fpgas that are faster still, if you're doing submillimeter-wave work you're going to have to do your front-end filtering, upconversion or downconversion, and probably even detection in the analog domain
By doing so, you get to make a point—perhaps via analogy, perhaps via precision, perhaps via pedantry—which is illuminating for you but now confusing for your reader. And to explain yourself, you must swim upstream and redefine a term while simultaneously making a different point altogether.
Much has been written about jargon, but a primary benefit of jargon is the chance to create a domain-specific meaning without the baggage of dictionary-correct associations. It’s also why geeks can be bores at dinner parties.
By analogy to HCI: words are affordances. Affordances exist because of familiarity. Don’t make a doorknob that you push on, and expect people not to write in telling you to use a door-bar on that door instead.
Unilaterally changing language is not forbidden, but if The Culture Wars™ have taught us anything, it is that people are allergic to what they see as mandated changes to their language, even when the change is reasonable and you can explain it.
Colour me stoked, but you could still just do it unilaterally and wait till somebody notices.
However, my caveat about viewing everything as computation is that you fall into the same trap people in the ~1850s fell into when they wanted to describe everything in the world with complex mechanical devices, because that was the bleeding edge back then. Not everything is an intricate system of pulleys and levers, it turned out, even if in theory a sufficiently complex such system could mimic anything.