"The amount of APIs in the world that require concurrency within a single request scope to meet latency needs approaches zero."
I've never worked on a non-Rails API where this was true. The Ruby community keeps telling me this, but in languages where concurrency is well supported, it gets used extensively, everywhere. I obviously don't have hard data to back this up, but your claim seems pretty far-fetched, to be honest.
"In practice, you don’t make that db call until the auth request is done and the user is verified"
Of course you optimistically run any idempotent DB operations while waiting for auth to succeed if you care about latency, and simply discard the result if auth fails.
"you don’t make the outgoing api call until you already have the results of the db call, because you need your data to form the outgoing request"
These dependencies of course exist, but so do parts of the graph where they do not. You might make 5 outgoing calls based on your DB query and only have to wait for 2 of those 5 before making another DB query. This is so common that there are libraries like https://github.com/uber-go/cff that exist to explicitly model those dependency graphs, visualize them, and resolve them at runtime.
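A rough sketch of that shape with plain goroutines (all function names here are hypothetical placeholders, not cff's API): one DB query fans out into five calls, and a follow-up query waits only on the first two while the other three keep running in parallel.

```go
package main

import (
	"fmt"
	"sync"
)

// Hypothetical stand-ins for the stages of one request.
func dbQuery() []string           { return []string{"a", "b", "c", "d", "e"} }
func callService(t string) string { return "resp-" + t }
func followUp(x, y string) string { return "joined(" + x + "," + y + ")" }

func handle() ([]string, string) {
	targets := dbQuery() // root node of the dependency graph

	results := make([]string, len(targets))
	var all sync.WaitGroup     // gates the whole fan-out
	var firstTwo sync.WaitGroup // gates only the follow-up query
	firstTwo.Add(2)

	for i, t := range targets {
		all.Add(1)
		go func(i int, t string) {
			defer all.Done()
			results[i] = callService(t)
			if i < 2 {
				firstTwo.Done()
			}
		}(i, t)
	}

	// The follow-up query depends only on calls 0 and 1, so it starts
	// as soon as they finish, while calls 2-4 are still in flight.
	firstTwo.Wait()
	joined := followUp(results[0], results[1])

	all.Wait() // remaining calls complete in parallel with the follow-up
	return results, joined
}

func main() {
	res, joined := handle()
	fmt.Println(res, joined)
}
```

Libraries like cff exist precisely because hand-rolling this with WaitGroups gets unwieldy once the graph has more than a few nodes.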
My theory is that system designs like this are just impractical to implement inside Rails today, which leads heavily Rails-biased engineers to not even consider them, which leads Rails-experienced developers to never have seen them, which in turn fuels the sentiment that they are rare. I'm not saying that you fall into this category, but in my experience many engineers who have only ever done Rails in their career do.