Meanwhile random FOSS projects be like "please sudo curl bash to install the prebuilt binaries".
But this is true of lots of code. We have this notion of "it works, therefore there's no problem", which is just bad engineering. Just because you don't know there's a problem doesn't mean there isn't one. Just because it passes the tests doesn't mean you have test coverage.
`curl -L "foo.sh" -o foo.sh && bash foo.sh`
is just more characters. But you should do it anyway, because a poorly written bash script can mess you up when streamed: if the connection drops partway through, bash happily executes the truncated script, potentially stopping mid-command.
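If you want to go a step further, verify the download before running it. A minimal sketch, assuming the project publishes a SHA-256 checksum alongside the script (the URLs here are placeholders, not anything from the parent comment):

```sh
# Placeholder URLs -- substitute the project's real script and checksum file.
curl -fsSL "https://example.com/foo.sh" -o foo.sh
curl -fsSL "https://example.com/foo.sh.sha256" -o foo.sh.sha256

# Refuse to continue if the script doesn't match the published checksum.
sha256sum -c foo.sh.sha256 || exit 1

# Actually read it, then run it.
less foo.sh
bash foo.sh
```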
Why sudo, though?

I honestly think it's stupidity. Most people genuinely don't know you can install programs for just your own user and don't need system privileges. Everyone is so used to installing from package managers and doing `sudo make install` that they forget programs only need to be somewhere in $PATH, not in /usr/bin specifically.
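To make that concrete, a per-user install needs no sudo at all. A sketch assuming an autotools-style project; `~/.local` is just the common convention, not anything the thread prescribes:

```sh
# Configure the build to install under the user's home directory.
./configure --prefix="$HOME/.local"
make
make install        # writes to ~/.local/bin, ~/.local/lib, etc. -- no sudo

# Make the binaries findable; add this line to ~/.bashrc to persist it.
export PATH="$HOME/.local/bin:$PATH"
```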
Of course, one advantage of having source is that it's easier to run SAST tools against it, but how many people actually do that in practice? How well is that integrated with package systems? And even when package maintainers provide hashes of what they ostensibly checked, you still need to trust them.
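For what it's worth, running a checker is cheap once the source (or even just the installer) is sitting in front of you. The tool choices below are mine, not the parent's, and both assume the tools are installed:

```sh
# Lint a shell installer before executing it.
shellcheck foo.sh

# Or point a general SAST tool at a cloned source tree.
semgrep scan --config auto
```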
So we need a combination of static analysis tools integrated properly enough to produce trusted binaries, and we need earned trust and authority. Hyperindividualist self-reliance is, at the very minimum, impractical. And with authority, we know whose job it is to care for the quality of software, and therefore whom to hang.
However, commits tend to be much easier to trace after the fact than arbitrary binaries, so attackers will be less inclined to go that route. Once something is committed it's there forever, unless you can somehow get everyone to scrub it from their own copies for some unrelated reason. Consider that the xz compromise kept its key malicious build logic out of the repository entirely, shipping it only in the release tarballs.
My policy is to either obtain binaries from a major distro or to build from a clean commit in a network-isolated environment. If I can't go one of those routes, it's almost always a hard pass for me.
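One way to sketch that second route; the repo URL and commit are placeholders, and using a container's `--network none` as the isolation mechanism is my assumption, not necessarily the parent's setup:

```sh
# Placeholder repo -- clone, then pin the exact commit you reviewed.
git clone https://example.com/project.git
cd project
git checkout <reviewed-commit-hash>

# Build inside a container with networking disabled, so the build
# can't quietly fetch code you never looked at.
docker run --rm --network none -v "$PWD:/src" -w /src gcc:13 make
```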