Notes on the decline of unfettered research

In his 1995 essay “The Decline of Unfettered Research,” Andrew Odlyzko defines unfettered research as basically “work on whatever you want.”

What do I mean by unfettered research? In the discussions of federal science policy it has also occasionally been called “curiosity-driven.” It is exemplified by the following reminiscences of Henry Ehrenreich [Ehr].

When I arrived at the General Electric Research Laboratory at the beginning of 1956, fresh from a PhD at Cornell, I was greeted by my supervisor, Leroy Apker, who looked after the semiconductor section of the general physics department. I asked him to suggest some research topics that might be germane to the interests of the section. He said that what I did was entirely up to me. After recovering from my surprise, I asked, “Well, how are you going to judge my performance at the end of the year?” He replied, “Oh, I’ll just call up the people at Bell and ask them how they think you are doing.”

In this style of work, the researcher is allowed, and even required, to select problems for investigation, without having to justify their relevance for the institution, and without negotiating a set of objectives with management. The value of the research is determined by other scientists, again without looking for its immediate effect on the bottom line of the employer. The assumption that justifies such a policy is that “scientific progress on a broad front results from the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity.” (This quote is from the famous report of Vannevar Bush [Bush] that formed the cornerstone of U.S. federal funding for research after World War II.)

Note that this is different from the Basic/Applied (pipeline) research paradigm, but also subtly separate from the Quadrant Model of Scientific Research. Both the pipeline model and the quadrant model focus on the mindset and goals of the researcher, while the fetters model focuses on whether there are external factors shaping that research. He equates fettered/unfettered with the more common “mission-driven/curiosity-driven.”

When he wrote the piece in 1995, Andrew was working at a Bell Labs that was arguably past its heyday and no longer had a unique research model. The main focus of the paper is on corporate R&D in the US, but it touches on academia and Japan as well. It’s fascinating that 1) basically everything he describes in 1995 seems to hold true today (which may be a greater argument for stagnation than any number) and 2) unlike most papers in this vein, he does not default to the conclusion that unfettered research is the engine of progress and lament its decline.

This essay is not meant to be a balanced account of research policies. It is intended to explain the reasons for the decline in unfettered research, and the turn towards more directed work. Therefore it emphasizes the negatives. In context, “the negatives” are the negatives of unfettered research from the perspective of the people who previously enabled it.

tl;dr: unfettered research has declined because research has become a commodity. Furthermore, the commoditization of research is tightly coupled to a changing public perception of how invention and discovery work - specifically, that it has shifted from discrete discoveries to continuous advancement.

The paper is full of delicious examples so there will be many block quotes.

At first, the statement that research has become a commodity is a bit shocking. “Research is the farthest thing from a commodity - it’s all handcrafted and depends on individual sparks of genius!” At the same time, everybody knows about the phenomenon of near-simultaneous invention, which Odlyzko argues has been becoming more widespread.

Today such opportunities (to take advantage of a technology that nobody else will invent for a decade like xerography and the semiconductor) are extremely rare. For example, when Bednorz and Mueller announced their discovery of high-temperature superconductivity at the IBM Zurich lab in 1987, it took only a few weeks for groups at University of Houston, University of Alabama, Bell Labs, and other places to make important further discoveries. Thus even if high-temperature superconductivity had developed into a commercially significant field, IBM would have had to share the financial benefits with others who held patents that would have been crucial to developments of products.^1

It’s clearly anecdotal but this squares with my experience: whenever I see a super cool technology that is at a point where it’s ready to be commercialized, there are at least a few other companies doing the same thing.

Another part of the commoditization argument is just that there are so many more researchers now than there were at the beginning of the 20th century.
When we talk of the decline in unfettered research, we should remember that there would be no difficulty in providing an unfettered research environment for Einstein today, were he still alive. The difficulty is that there are now thousands of theoretical physicists who would like to be treated like Einstein.

The number of scientists has been increasing at an exponential (in the strict mathematical sense of the word) rate for centuries. The number of scientific papers published annually has been doubling every 10-15 years for the last two centuries [Price]. With the spurt in funding after World War II, the rate of increase rose.
While the increase in papers could partly be attributed to the metricization of science, other sources corroborate that the number of scientists itself is increasing (Are Ideas Getting Harder to Find?). It’s surprising that the doubling trend goes back much farther than my intuition suggested (I had assumed it started after WWII). It might be unrelated, but it feels similar to the arguments that world GDP was on a doubling trend long before the industrial revolution (Modeling Human History). It makes me idly wonder how much the periods we perceive as discontinuities, like the industrial revolution or WWII, are actually just narrative scaffolding we put around exponential growth so that our human brains can make sense of it.
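
As a sanity check on what that doubling claim implies, here is some back-of-the-envelope arithmetic of my own (not from the paper): a doubling time of T years corresponds to a continuous growth rate of ln 2 / T, and compounds to 2^{100/T} over a century.

\[
r = \frac{\ln 2}{T} \approx \frac{0.693}{10\text{–}15\ \text{yr}} \approx 4.6\%\text{–}6.9\%\ \text{per year},
\qquad
2^{100/T} \approx 2^{6.7}\text{–}2^{10} \approx 100\times\text{–}1000\times\ \text{per century.}
\]

At that rate, two centuries compounds to roughly 10^4–10^6, which makes the “thousands of theoretical physicists who would like to be treated like Einstein” line above feel less like hyperbole.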

Intense competition is intimately tied to commodities. The more interchangeable the products are, the lower the switching cost for a customer, so commodity producers compete almost purely on price. That’s why the margins are tiny in commodity businesses. Before reading this paper, I intellectually knew that competition was intense in science, but I had never connected it to commoditization.

People also perceive commodity products differently. “One lb grade A rice” just doesn’t have the same ring as “iPhone XL” or “Air Jordans.”
The growth and increasing competitiveness of any field can easily affect the public perception of that field. In sports, for example, it is common for commentators to talk of how some famous figure of old, such as Babe Ruth, was the greatest player of all time. Such assertions are made only about sports such as baseball or boxing, where teams or individuals compete against each other, and there is no objective measurement that can be used to compare performance over time. In sports such as swimming or running, where the clock determines the winner, such assertions are never made, since the evidence there is clear, that the performance of the top athletes has been steadily improving
It’s a sobering thought that science is now a relative rather than an absolute game (Relative vs absolute games). I wonder how much that’s the case. It certainly is in terms of funding and tenure. And the impact of most research is so dispersed that it is hard to measure its outcomes, so we do depend on opinions (Science is getting less Bang for its Buck). Through the growth-economics lens (Are Ideas Getting Harder to Find?), science is also a relative game in a way, because the null hypothesis is that each idea increases GDP by a constant proportion even as GDP grows (I sketch this after the quote below). I need to dig more into this question, but regardless, this statement was pretty truthy:
if we could resurrect Einstein and clone 100 copies of him, the public would not treat each of these individuals with the same respect they accorded the original. At the end of the day, attention is the only finite resource.
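
Here is that growth-economics sketch: my rough gloss of the accounting identity in Are Ideas Getting Harder to Find?, not something from Odlyzko’s paper. It splits TFP growth into research productivity times the number of effective researchers:

\[
\underbrace{\frac{\dot A_t}{A_t}}_{\text{TFP growth}}
\;=\;
\underbrace{\alpha_t}_{\text{research productivity}}
\times
\underbrace{S_t}_{\text{effective researchers}}
\quad\Longrightarrow\quad
\alpha_t = \frac{\dot A_t / A_t}{S_t}.
\]

If the growth rate on the left stays roughly constant while the researcher count on the right keeps growing exponentially, research productivity \(\alpha_t\) has to fall: each individual researcher accounts for a shrinking slice of a fixed proportional improvement, which is one way science ends up being graded on a relative curve.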

It’s off the main argument about commoditization, but it’s an important point that many ‘big science’ projects like the LHC inherently put fetters on researchers. I find this especially concerning because I’m pretty sure Discoveries happen when people have the ability to go “oh that’s weird”.

^1: A sobering speculation: what if the reason high-temperature superconductivity didn’t develop into a commercially significant field is that other groups caught up so quickly that none of them were incentivized to do the work to make it commercially viable?
