Nowadays a web application, for example, is more heavily tiered. Different frameworks are used at the different tiers: Bootstrap, Spring, Hibernate, etc. Each is its own ecosystem and is built on top of other libraries. It's also very common to make web-service calls outside your own network, and you quickly find that "standards" are interpreted differently by different library authors.
UIs are no longer an afterthought; they affect how successful your application is. (My observation is that a well-designed UI can cut user errors and training time by two thirds compared with a merely functional one.)
I'm keeping the example simple by not mentioning middle-tier components that are necessary today but that we didn't use 20+ years ago. Back then we also didn't have to worry about clustered environments, asynchronicity, or concurrency.
Not knowing the application you needed or how the coding team did its analysis, it's hard to say whether some of their "slowness" was getting to know the problem AND coming to understand how extensible, performant, and reliable you wanted it to be. My own approach is usually to solve the "happy path" first and then start surrounding it with "what if's" - e.g. what if a null is passed into the function? Over time I refactor and build in reliability and extensibility. The coding team you referred to may have taken a different approach, trying to abstract the use cases and build an error-handling model before solving the "happy path".
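To make the happy-path-first idea concrete, here's a minimal sketch (the class and method names are hypothetical, just for illustration): the first method is the naive happy path, and the second is what it might look like after a later refactoring pass surrounds it with "what if" guards such as a null check.

```java
import java.util.Locale;

public class PriceFormatter {

    // Happy path first: assumes a well-formed, non-null input.
    static String formatHappyPath(Double price) {
        return String.format(Locale.US, "$%.2f", price);
    }

    // Later refactor: the same logic surrounded with "what if" checks.
    static String formatDefensive(Double price) {
        if (price == null) {
            return "N/A";   // what if a null is passed into the function?
        }
        if (price < 0) {    // what if the caller sends a nonsense value?
            throw new IllegalArgumentException("price must be non-negative: " + price);
        }
        return String.format(Locale.US, "$%.2f", price);
    }

    public static void main(String[] args) {
        System.out.println(formatDefensive(19.5));  // $19.50
        System.out.println(formatDefensive(null));  // N/A
    }
}
```

The point isn't the formatting logic itself; it's that the defensive version is a second pass over working code, not the starting point.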
Your "tl;dr" is spot on. But I'd like to raise a cautionary flag about judging modern development through a 25-year-old lens. The game has changed.