I'm not specifically familiar with C# or TypeScript's async keyword implementations, but this... shouldn't be true?
If you can't launch multiple requests and then gather their results, either by aggregating them into a single future or by awaiting them individually (in which case you'll start acting on results somewhere between min(t1, t2, ..., tn) and max(t1, t2, ..., tn) time), while all of them proceed just as well as they would with callbacks, then that's not a very good implementation of async/await...
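To illustrate the pattern I mean, here's a minimal TypeScript sketch. It uses a timer-based `delay` helper in place of real network requests (the helper and all names are illustrative, not from any real API): the key point is that the promises are started *before* anything is awaited, so the total wait is roughly max(t1, t2, t3), not the sum.

```typescript
// delay() stands in for an async request that takes `ms` milliseconds.
const delay = (ms: number, value: string): Promise<string> =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

async function gathered(): Promise<string[]> {
  // Start all three immediately; they run concurrently from here on.
  const a = delay(100, "a");
  const b = delay(150, "b");
  const c = delay(50, "c");
  // Aggregate into a single future: total wait ~max(100, 150, 50),
  // not ~300ms. Awaiting a, b, c individually would overlap just the same.
  return Promise.all([a, b, c]);
}
```

Awaiting each promise one at a time (`await a; await b; await c;`) after launching all three gives the same overlap; the sequential-looking code is only slow if you put the `await` directly on each call site, e.g. `await delay(100, "a")` before even starting the next one.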
Does anyone know if there are warnings or linters that will catch this in C#, at least if done within one function?
That just doesn't seem like good language design. I totally get the need for concurrency, but the feature shouldn't have such an invasive impact on the code base. Go and Erlang manage to provide good concurrency support without the tax.
[0] https://stackoverflow.com/questions/9343594/how-to-call-asyn...
Is this news for anybody here?
http://webcache.googleusercontent.com/search?q=cache:D8y6DGJ...
I'm hopeful that someday the overhead will be made negligible, but until then I'd suggest profiling the trade-offs.
Meanwhile, data frameworks like Relay offer incremental flushing with much more flexibility, all over a single request.
This is pretty interesting. Do you have links to, or knowledge of, why this is the case? Is it simply JIT overhead? What could be taking O(ms) of CPU time?
Fetching all resources in a single HTTP request (or all data in a single GraphQL query) avoids a lot of issues and workaround effort caused by non-deterministic completion order of multiple small requests.