I really doubt that. Autodetecting and configuring a newly-connected display device (projector) is fundamentally a software problem, not a hardware one: the port you plug the display into is electrically capable of talking to it (maybe not at the ideal resolution, depending on specs, but whatever), but it's the OS that has to detect the new connection, read the display's advertised capabilities, and configure it properly.
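For the curious, the hardware side of that handshake is basically just the monitor handing the machine an EDID blob over the cable; everything after that (parsing it, picking a mode, setting up the framebuffer) is OS software. Here's a rough sketch of pulling the preferred mode out of an EDID base block, with byte offsets per the EDID 1.3 layout (error handling omitted, not production code):

```python
EDID_MAGIC = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid_preferred_mode(edid: bytes) -> tuple[int, int, int]:
    """Return (h_active, v_active, pixel_clock_khz) from an EDID base block."""
    # The base block is 128 bytes, starts with a fixed 8-byte magic,
    # and its bytes must sum to 0 mod 256 (last byte is a checksum).
    assert edid[:8] == EDID_MAGIC, "not an EDID base block"
    assert sum(edid[:128]) % 256 == 0, "bad EDID checksum"

    # Bytes 54-71 hold the first detailed timing descriptor, which by
    # convention describes the display's preferred (native) mode.
    d = edid[54:72]
    pixel_clock_khz = int.from_bytes(d[0:2], "little") * 10  # stored in 10 kHz units
    # Active pixel counts are split: low 8 bits in one byte, high 4 bits
    # packed into the upper nibble of a shared byte.
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_khz
```

On Linux the kernel reads that blob over DDC the moment you plug the cable in and exposes it raw at `/sys/class/drm/<connector>/edid`; whether the desktop then does something sensible with it is exactly the software gap I'm talking about.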
The same is true for Zoom: hardware acceleration for video encoding and decoding is supported by virtually every laptop GPU, but software support for it is spotty in some apps and OSes. Things like screen sharing are 100% software-side.
While I'd love to use a Linux workstation (and often do), it's simply not there yet in those areas. That has nothing to do with "high-end hardware" and everything to do with less robust software support than many alternatives offer, including macOS.
I guess you could make the case that because macOS has to support far fewer hardware configurations, Apple can spend more time making its software support robust, but I'm neither convinced by that argument nor convinced that's what you meant by your post.