I think you're missing some important things when you talk about efficiency in this context.
Switching from SAS to R is not as simple as installing R on your computers. You're throwing decades of accumulated analyst software experience out the window to save a relatively small (from the standpoint of a university) amount of money. Your analysts will take a significant productivity hit while they learn R; the break-even point could be calculated if you knew all the inputs, but my guess is it would take quite a while for the savings on licenses to cover the hours lost while people get up to speed. On top of the up-front financial penalty, your research will be held up while you wait for that to happen. You could get scooped, miss deadlines, and so on.

Analysts also accumulate stored bits of code they recycle to speed things up (for example, routines for calculating comorbidity scores), so you'd be throwing all of that in the trash too and making them recreate those procedures in a new environment.
Further, from an institutional standpoint there's something to be said for paid software that comes with guaranteed technical support and data security. And as @dempty mentioned, when you're installing random packages for R, you don't have anything like the SAS Institute standing behind them to verify their accuracy and functionality.
Again, I think this is likely to change over the coming years as the data analysts currently in practice are replaced by younger graduates who have been exposed to R (though even now I'd guess the majority of courses in public health schools are taught in SAS). But I don't think it follows that institutions would benefit from mandating a switch to R for their existing employees.
As an aside on your last point: assuming you can do the stats accurately, it's unlikely anyone will care what software you used to do them.