Just want to preface by saying that the discussion in this thread is overall pretty high quality, I imagine because people here are a bit more experienced and less constrained. A few more thoughts...
I think the NIH is very important, but I also think the institutes are too important right now, because they almost monopolize research funding. This means there is little incentive to create career scientist positions or to raise pay.
This is not factually true anymore: industry funding now accounts for about half of the total R&D budget in the US.
I think limiting the number of spots and creating more hierarchy, as mentioned previously, should be done after more competition is introduced.
There's going to be a limit to this. No matter how much competition you introduce, subject matter still matters. If you are a humanities professor, you just won't get many relevant industry opportunities unless you're in a specific niche area (international affairs, etc.). Similarly, while CS profs overall do well in industry, if you specialize in computational complexity theory your options outside academia are rather limited. So to make this work, you would have to micromanage the labor market at a very precise, subject-specific level. In general, this fails miserably, mostly because of competing interests: typically the same central agency is responsible for both the training pathway, where it favors increasing spots to get more cheap labor, and professionalization, where it favors limits on licensure. Recent examples of such regulatory failures within medicine include pathology, radiation oncology, and, until recently, radiology.

I can think of two specific instances where this micromanagement works: dermatology and economics. In both cases there is a nationalized matching process and a fairly informed, flexible central committee that matches training spots with professionalization. What also saves those two is rising private demand. Invariably, when the central committee favors professionalization, the professionals in the field eventually benefit from the resulting monopoly. This applies in EVERY SINGLE scenario: when labor is more powerful, the laborers get more out of the system. Ultimately, though, the final arbiter is rising private demand. One might argue the only difference between derm and rad onc is that the former has it.
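For concreteness, the "nationalized matching process" referred to here is, in the residency match's case, built on applicant-proposing deferred acceptance (the Gale-Shapley family of algorithms, which the NRMP's Roth-Peranson design extends). A minimal sketch, with entirely made-up applicants, programs, and preference lists:

```python
# Illustrative sketch only: a minimal applicant-proposing deferred-acceptance
# match (Gale-Shapley). All names and preference lists are hypothetical; the
# real NRMP algorithm handles couples, partial lists, and other complications.

def deferred_acceptance(applicant_prefs, program_prefs, capacities):
    """applicant_prefs: {applicant: [programs, most preferred first]}
       program_prefs:  {program: [applicants, most preferred first]}
       capacities:     {program: number of training spots}"""
    # rank[p][a] = how highly program p ranks applicant a (lower = better)
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    matched = {p: [] for p in program_prefs}       # tentative holds per program
    next_choice = {a: 0 for a in applicant_prefs}  # next program to propose to
    free = list(applicant_prefs)                   # applicants still proposing
    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue  # applicant exhausted their list: goes unmatched
        p = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)  # program did not rank this applicant at all
            continue
        matched[p].append(a)
        matched[p].sort(key=lambda x: rank[p][x])
        if len(matched[p]) > capacities[p]:
            bumped = matched[p].pop()  # drop the least-preferred tentative hold
            free.append(bumped)        # bumped applicant proposes further down
    return matched
```

The policy-relevant knob is `capacities`: the central committee sets the number of training spots, and the algorithm just allocates within that constraint, which is why the spot count, not the match itself, determines professionalization.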
In the case of NIH-funded research, there is no rising demand, and during the NIH budget doubling, training was decoupled from professionalization, causing a severe bottleneck. The rise of the K award process introduces some elements of "matching", but it's cumbersome: it requires a tedious, multi-year application process evaluated by multiple committees, and it has limited geographic flexibility, which causes problems for women and "the poor".
What's also really funny when you think about "industry" competition is that it already exists for physician scientists: it's called being a clinician. Lots of MD/PhDs drop out before the K award stage (and after) because clinical work pays more and is often less demanding. The federal government creates a pathway but underfunds it, causing a severe salary disparity. This leads to a worsening labor mismatch, where highly qualified candidates pursue different careers solely for lack of salary support, while the people who stay in the track often have outside financial support. This is really a microcosm of what happened in Soviet Russia.
Rather than a block grant system, the government could just arbitrarily limit the number of grantees based on budgets, especially for training grants; institutions would then be responsible for acquiring the training grants and demonstrating that more spots are needed. In the current scenario, though, the large grant budgets are just too big for this issue to matter (i.e. for R01s, you'd probably prefer to just hire students, who are higher quality to begin with, over random RAs from the community). Most PhDs from lower-tier programs go into industry anyway, though admittedly many of the skills they learn during the PhD may be useful (e.g. writing and presentation). It would be impossible to create a match program for PhDs overall, as they often take jobs they are over-qualified for, and there is a sharp difference between the top PhD programs and the lower tier. The situation is closer to law school than medical school.
I like the idea of an institute or specific directors funding a position to do research on topic "X" for the sake of certainty, but I think that also stifles innovation. Maybe take the same approach to research as clinical enterprises do, with centers of excellence in a specific topic or field. I don't know, though; the system is imperfect, and government oversight, or the lack thereof, enhances that imperfection.
There are disagreements (as shown in the present discussion) about whether more or less oversight is the right next move. Whenever this happens, I think the right answer is that it depends: some kinds of research benefit from more oversight (especially large multi-lab collaborations), others don't. Creativity might come out of competition, but it might not. What counts as creative is extremely subjective. A lot of work done by mid-range, mid-career people is not "creative", but 1) it provides necessary foundations, and 2) I would hate to see those people lose their jobs. Now, I'm a medical doctor, so I don't care as much personally, but it sucks a lot to be a PhD with previous R01s who ends up in a teaching position in his/her 50s. This happens not infrequently. It's also an issue people aren't used to thinking about. To be fair, the situation only got this bad about 10-20 years ago, so the effects are still trickling through.