It does seem like something of an interesting and worthwhile challenge to conduct a project like this while performance is being lifted ever higher in step with Moore's law, or, if it is not sustained, whatever curve follows it.
With the widespread rollout of fiber-optic cable (and the faster internet speeds it brings), I see this being a very doable form of computing in the near future. We might as well push the Moore's-law mentality a bit further.
The goal is to get people thinking outside of the box. Especially those in leadership positions.
Your proposal is to formalize folding@home and similar efforts into a national distributed general-purpose computing environment. This kind of system works fine when the problem can be decomposed into small, independent, computation-bound pieces, but many problems don't fit that framework.
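The "small independent pieces" constraint is the embarrassingly parallel case. As a rough sketch (the work-unit function here is hypothetical, just a stand-in for a compute-bound simulation step), this is the shape of decomposition that projects like folding@home depend on:

```python
from multiprocessing import Pool

def score_work_unit(start):
    """Hypothetical compute-bound work unit. Each unit depends only
    on its own input, so it could run on any volunteer machine and
    the results merged afterward in any order."""
    # Stand-in for real work, e.g. one simulation trajectory.
    return sum(i * i for i in range(start, start + 1000))

if __name__ == "__main__":
    # Split the job into independent pieces up front...
    units = list(range(0, 10_000, 1000))
    # ...then farm them out; Pool stands in for a network of volunteers.
    with Pool() as pool:
        results = pool.map(score_work_unit, units)
    print(len(results))  # one result per work unit
```

The catch is the step this sketch skips: problems with tight coupling between pieces (e.g. each timestep needs its neighbors' results) would spend their time waiting on consumer-grade network latency rather than computing, which is why supercomputers with fast interconnects still exist.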
You write: "To get a program working on a super computer, one must pay highly specialized developers in order to create a program that might only be used once. It also might take a long time to build considering the missing tools which are available to use when targeting the consumer side of the market."
I don't see any reason why this would not hold in any other computing environment, and tbh I'd expect it to be even worse. We already have computational power on desktops that would have been unimaginable not that long ago, not to mention GPUs, but look how shoddy the scientific computing ecosystem is (there are exceptions, but in general it's "not good").
Plus America is in a huge amount of debt.
Not saying we couldn't do both ideas though. I wonder which would be more useful.