On Feb. 7, the National Institutes of Health (NIH) issued guidance limiting how much of the “indirect costs” (IDC) of scientific research it would pay. While the topic is arcane, it has massive implications for the future of scientific research in the U.S.
In the specialized jargon of big science, “direct costs” contribute unambiguously to a specific grant-funded project: the salaries of personnel who work on the project and project-specific equipment, for instance. Indirect costs can cover everything from the ventilation in the lab where the grant-funded project takes place to the staff accountants who help create and monitor budgets.
Before the NIH directive, universities negotiated what percentage of the indirect costs incurred by scientific research the federal government would cover. The negotiated rates reflected differences in the costs confronting institutions. Because lab space in New York City, for example, costs more than in Lincoln, Nebraska, Columbia University has a higher IDC rate (64.5%) than the University of Nebraska (55.5%).
The new NIH guidance justified reducing the IDC it would pay as a commonsense market reform: since private foundations pay a lower percentage of universities’ IDC, the federal rate must be padded. Meanwhile, Project 2025, the Heritage Foundation’s manifesto for a second Trump administration, argued that IDC actually paid for “Diversity, Equity, and Inclusion (DEI) efforts” on university campuses.
But both claims ignore the real reasons the federal government embraced this complex method of funding scientific research during the Cold War, and they are oblivious to how the system has evolved since. Paying for IDC was a way to build and maintain a free and uniquely American way of doing science, and it has proved deeply successful for three-quarters of a century.
Before World War II, private donations and industry awards funded most scientific research. Agriculture was the exception. At land grant…

