I’m sure that Matt Yglesias has forgotten more economics in the past hour than I will ever know. And yet, he believes that “if spending on military robotics declines then our most talented roboticists will focus more of their time and attention on civilian applications.” Really? Military spending doesn’t affect the overall demand for engineers and scientists? It’s just as likely that if spending on military robotics declines our most talented roboticists will leave robotics and science altogether. If Lockheed Martin, Raytheon, etc. weren’t hiring, many of my friends would be out of a job, not making snazzy commercial gadgets.
Several liberal bloggers protested the Times’ suggestion that cutting the Defense budget will reduce innovation. While some of their points are well-taken (the DOD budget is almost certainly bloated and wasteful), they all unfortunately make two big mistakes: they equate defense research with weapons research, and they neglect the role of deployment in bringing technology to scale.
Here is Robert Wright’s flawed analysis, typical among the group:
Defense department research, in contrast, focuses on services that people are more ambivalent about–like getting blown up. If more benign services get developed in the process–like if blowing people up involves technologies that help them play digital music–that’s a happy accident.
Wright’s simplistic link between DOD research and weapons ignores the synergies between civilian and military technologies. At some point the DARPA-funded optical interconnects that my girlfriend studies may improve weapons. But in the short run, they have a much better chance of reducing energy use.
At least in universities, DOD complements rather than competes with civilian agencies, and they all fund similar work. Everyone in my lab did the same sort of space physics research. Some of us were funded by the Air Force, some by the Office of Naval Research, and some by NSF. While the emphases may have differed slightly, there was a lot of overlap. That’s why we all had the same advisor. I’m sure there’s a similar dynamic in quantum computing funded by DARPA, the NSF, and DOE.
The existence of multiple funding agencies is one of the main strengths of U.S. science. They foster diverse approaches and ensure that a single paradigm doesn’t dominate. It wouldn’t necessarily be a good thing if all of DOD quantum computing money were transferred to the NSF. We want many groups attacking the same problem and we should be happy DOD is part of the mix.
Now if all we care about is research production, we may be fine with just two or three agencies funding science. Especially if DOD is as inefficient as these critics suggest, we may be better off transferring half the DOD research budget to NSF and DOE.
But we don’t care about research for the sake of research. We want to drive innovation, which depends on much more than government funding. Which brings me to the second mistake Wright et al. make: ignoring the importance of deployment.
As David Roberts noted, technology deployment is itself a form of research. It’s one thing to make a neat device in your lab. It’s quite another to scale the product, align it with customer needs, navigate regulatory hurdles, and market it successfully. I can’t tell you how routine it is for a company to fail for these reasons even when it has the science locked down. As great as NSF research is, it’s only a small part of the picture.
Computers are commonplace not only because smart physicists figured out quantum mechanics. It’s also because we learned how to make lots of computer chips cheaply and quickly. The DOD role in this development has been crucial. Precisely because it is so massive and relatively price-insensitive, DOD enabled large-scale deployment and the learning that goes along with it.
Cliff Bob shows that he doesn’t understand any of this:
Nowhere in the article is there anything but assumption that only the military, as some kind of beneficent and far-seeing midwife of invention, could have fostered these and other innovations. Nowhere are there convincing arguments that most if not all of these developments wouldn’t have been made either through some other government R & D agency or through the market itself.
Nowhere in Bob’s article is there anything but the wrong assumption that “these developments” occurred primarily because of an R&D agency rather than procurement and deployment. In some cases DOD was the only market in existence because no one else could afford the technology. Only after DOD brought down the price of semiconductors did we all benefit.
DOD may very well be wasteful and inefficient. Maybe in 2012 it’s not the best way to drive innovation, and perhaps the negatives now outweigh the positives. Those are fair arguments. But to debate the point intelligently, we have to first rid ourselves of the myopic view that money is all that matters. DOD funding is associated with scale and deployment, key components of innovation and commercialization. (See Roger for more along these lines.)
The most depressing part about all this is how otherwise brilliant writers make bafflingly simplistic arguments when it comes to innovation policy. Is it really so hard to understand that innovation requires more than government funding of R&D?
Those involved in science policy sometimes seem to me to be sleep-walking through the greatest crisis to afflict the West since the Second World War. True, from the point of view of the scientist at the bench, grants continue to flow and results continue to be published. Perhaps this is why wider discourse about science’s role in society has hardly budged an inch.
For the past three years, I have grown steadily more impatient with this ‘business as usual’ approach. Whenever an academy president or research chief stands up to speak in public, I have been waiting for them to explain how they will do things differently. They never do.
Macilwain doesn’t seem to understand that scientists are already dealing with a crisis. From their perspective, less science funding is the crisis to be dealt with. Why should scientists meekly agree to change their ways when everyone else is trying to maintain business as usual? Scientists see a shrinking pie and want their portion to stay the same. It’s self-preservation, and there’s nothing wrong with that.
Scientists genuinely believe more science funding serves the common good and addresses the economic crisis, just as the Chamber of Commerce genuinely believes the same about lower corporate taxes. Scientists do in fact care about basic research. Asking scientists not to lobby for what they care about is asking them to abdicate their democratic responsibilities. It’s not a fair request.
Going forward, a better approach may be to stop narrowly equating science with academic basic research (something I’m guilty of in this very post), and instead try to direct funding to different kinds of science. Academics will always study the kind of research that the Macilwains and Stilgoes of the world find unsatisfying. So rather than attacking this type of research, Macilwain et al. should do their own political lobbying for the type of science they want. A world in crisis demands it.
James Kalb has recently interviewed mathematician and architectural theorist Nikos Salingaros. As someone who has always been irritated when people mistakenly think it the job of “science” to invent more and shinier consumer crap, I particularly appreciate the following remarks of Professor Salingaros:
Our educated world remains ignorant about the distinction between science and technology, unfortunately. Science helps us understand the universe and ourselves. Technology applies scientific results to master processes that we can manipulate so as to better our lives. It is also applied to kill people directly, destroy nature, and threaten our own survival. Or to save us from our stupidity. Tools can be used for either good or evil.
The actual history of science and technology is a lot fuzzier than the stark picture painted here. There’s often a fair amount of overlap between the two, and it’s often not straightforward to see where one begins and the other ends.
To give just a few examples, much of basic materials science depended on preceding technological developments in metallurgy. My own (former!) field of space physics developed primarily because engineers working on improved telephone communication needed better data on the Earth’s atmosphere. The interplay between science and technology in thermodynamics is legendary. In this case, some of the most fundamental scientific laws owe their discovery as much to the steam engine as they do to some abstract quest for understanding. To this day, college sophomores learn the science of thermodynamics by studying the technology of steam engines. Technology itself (in the form of better instrumentation) is often created to help us “understand the universe and ourselves” while science qua science is sometimes the tool responsible for the bad stuff.
None of this negates the idea that there are differences between what we call science and what we call technology. Or that there are questions we can and should ask about technology as distinct from science. There surely are such questions, and on some level we can distinguish between the two. But as I’ve noted before, presenting diverse, varied concepts as simple monoliths often obscures more than it illuminates.
David Bornstein reports on a promising approach to increase the number of drugs produced by medical research:
Researchers in the foundation’s four partner labs say that the relationship hasn’t constrained their science; it’s improved it. “We are certainly looking for targets to enhance the reparative potential of the central nervous system,” explained Brian Popko, the Associate Chair for Research in the Department of Neurology at the University of Chicago. “But we weren’t encouraged to develop therapeutic approaches before we understood the system we were trying to repair.”
Johnson has also assembled some of the country’s most accomplished neurologists and immunologists, as well as industry experts to sit on M.R.F.’s three advisory boards, which focus, respectively, on basic science, drug discovery and clinical issues.
To date, researchers have identified a number of targets, of which five to seven are strong candidates that M.R.F. is actively pursuing. They are in close contact with industry researchers and will be farming out the “translational” research needed to attract pharmaceutical interest. They control 50 percent of the licensing of patents for research they fund and, as a nonprofit, can be very flexible in negotiating fees. Within five years, Johnson envisions partnerships with four or five companies, each of which will be moving a couple of targets forward. “We hope to have our first clinical trial for myelin repair by 2014,” he says. “To get to a clinical trial within ten years in an area of research that until recently no one thought was therapeutically relevant would be hugely significant.”
If the M.R.F., and like-minded foundations, succeed, their models will likely influence the bigger actors in medical research. Francis S. Collins, the head of the N.I.H., is pushing to create a new drug development center to be named the National Center for Advancing Translational Sciences.
In an admission that I know some people will appreciate, Stanford professor Ben Barres admits that academia incentivizes basic rather than “useful” research:
“Pure science is what you’re rewarded for,” notes Dr. Barres. “That’s what you get promoted for. That’s what they give the Nobel Prizes for. And yet developing a drug is a hundred times harder than getting a Nobel Prize. We really have to have the very best scientists engaged in this. For a long time this hasn’t been the case. Until five or ten years ago, working on disease was kind of shunned.”
My last post discussed Lehrer’s column on the increased difficulty of making scientific discoveries. Lehrer should have stuck with that topic alone instead of pivoting off Tyler Cowen’s The Great Stagnation to ponder the relationship between scientific discoveries and standard of living:
I think it’s also worth contemplating the disturbing possibility that our cresting living standards might ultimately be rooted in the difficulty of making new scientific discoveries. After all, at a certain point the pursuit of reality is subject to diminishing returns – our asteroids will get so small that we’ll stop searching for them.
Coming from someone who often paints wonderfully nuanced pictures of science, this passage confuses me. Living standards never have been and never will be “rooted” in new science. If they are rooted in anything, it is productivity increases related to innovation. The rule of law, tax structure, monetary policies, and capital investment all play a pretty big role here.
In fact, only since WWII has science been more than a minor player. Industrial revolution technologies were not strongly linked to the scientific revolution that preceded them, and may have depended more on a robust patent system than heroic scientists. Even if we grant that science has recently become critical, it may be more so in America than elsewhere. The Japanese economic miracle occurred despite a paltry level of science funding, and was spurred by careful industrial planning. Germany seems to have escaped the worst of the Great Recession despite spending relatively little on R&D. And while our standard of living may be “cresting”, the developing world is, well, developing quite well. So again, there is no straightforward link between science, innovation, and living standards.
The U.S. may indeed have a problem with economic stagnation, and it’s important to understand what exactly is going on. But casually assigning too much credit or blame to discovery is, for lack of a better term, pretty unscientific and doesn’t help.
Jonah Lehrer has again written a provocative piece, arguing that growing collaboration and teamwork among scientists is a response to “all the low-hanging facts [having] been found.” Today’s science is simply much too hard and complex to tackle alone. Lehrer quotes Samuel Arbesman:
If you look back on history, you get the sense that scientific discovery used to be easy. Galileo rolled objects down slopes. Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge to ask the right questions, but the experiments themselves could be almost trivial.
Today, if you want to make a discovery in physics, it helps to be part of a 10,000 member team that runs a multibillion dollar atom smasher. It takes ever more money, more effort, and more people to find out new things.
Given that I wrote something similar just a month ago, I of course liked this passage. A few responders at Andrew Sullivan do critique the notions that science was ever easy or that we’ve reached the “end of discovery.” On the latter point, I agree that the meme is a bit overwrought. Over 1 million papers get published every year, and presumably most of them make a discovery of some form. Some might later be proven wrong, and some may be meaningless (Lehrer claims one-third of papers never get cited), but they are discoveries nonetheless.
Perhaps a better phrase is the slowdown of meaningful discovery. The fact that all the low-hanging fruit have been found doesn’t end the pursuit of knowledge. It just means we have to focus on smaller, more specialized problems that have less payoff (hence all those uncited papers), or focus on really, really hard problems that can’t be solved (hence all those corrections in medical research). The productivity slowdown in pharmaceutical R&D is one notable example of the former phenomenon. The graph is particularly striking (h/t Roger Pielke Jr.).
The rate of discovery ultimately matters because we think it affects our standard of living, a point Lehrer addresses towards the end of his post. Unfortunately, he really muddles the relationship between discovery and innovation. I’ll try to address this in my next post.
If you’re wondering why I keep harping on basic research, it’s because I want to write an essay on the topic and I’m using this venue to hash out my ideas. So here we go again! Here’s David Bruggeman commenting on my exhortation for policy analysts to just say that scientists should care more about need-driven research:
Yes, the government supports research that addresses specific national needs. But who gets tenure for conducting research that addresses specific national needs? There’s not necessarily a correlation between cutting-edge and targeted to national needs.
There are a few issues mixed in here. First, it’s very easy to falsely homogenize all academic research. It’s undoubtedly true that the string theorists and particle physicists aren’t addressing a specific national need (outside of maintaining the U.S. lead in basic science). Theoretical physics, however, does not represent all of academia. Engineering schools routinely focus on practical problems. Even in physics many study “useful” topics like fuel cells and alternative energy. On top of that, programs like Stanford’s Interdisciplinary Program in Environment and Resources are becoming increasingly common.
It’s also true that no one gets tenure for addressing a national need. But I’ll go against the zeitgeist here and say that’s a good thing! Researchers should be evaluated on their research quality, teaching, outreach, administrative tasks, etc. And while I would love to see a much bigger focus on teaching and outreach, it’s a stretch to think that your typical tenure committee is qualified to evaluate how well a given research portfolio addresses national needs. As David knows as well as anyone, science is only one component.
To the extent we believe research helps solve problems, academics can best serve their role by doing good research. Sometimes doing good research entails direct engagement with a real-world problem. The global cookstoves initiative comes to mind here. But this type of situation is rare. Even on what appears to be a pressing issue like coral reef management, research is often fairly removed from immediate use. I’d like to think that Stanford’s Environmental Fluid Mechanics Lab will lead to better decisions, but the latest supercomputer simulations don’t offer much. And thus it doesn’t make much sense to include extra-scientific criteria in the evaluation.
The underlying problem here is that David is trying to make academia, and research more generally, something that it inherently is not. Academia, simply put, is not supposed to be directly useful. Academia does in fact contain many academics. Almost by definition, academics aren’t motivated by pressing, relevant problems. If they were, they wouldn’t be academics! By your 3rd year in grad school, you pretty much figure out that if you want to do work that connects to the real world, you leave academia. Now I may be biased because I was in a physics department, but I don’t think these sentiments are way off the mark. I suspect this dynamic is why the government does research in national labs as well as universities. It’s also why we have university-industry partnerships, and directly fund private companies to do work.
There’s nothing wrong with being an academic of course. The pursuit of knowledge is a worthy activity, and a big part of me would be happy doing that for the rest of my life.* But you’re only going to get so far asking academics to do something they’re not always well-suited to do.
To continue with the basic research meme, here’s Lewis Branscomb distinguishing the goals of research from how it’s conducted (emphasis added):
I believe it would be much easier to understand what is required if the agencies would define basic research not by the character of the benefits the public expects to gain (large but unpredictable and long-delayed benefits in the case of Newtonian research) but rather by the highly creative environment in which the best basic research is carried out…If we pursue this line of reasoning, we are immediately led to the realization that the goals to which Jeffersonian research is dedicated require progress in both scientific understanding and in new technological discoveries. Thus not only basic science but a broad range of basic technology research of great value to society is required. The key idea here is to separate in our policy thinking the motives for spending public money on research from the choice of environments in which to perform the work.
I have to wonder how much of that thinking is confused, not in the minds of politicians or policymakers, but in scientists and the public. The Air Force program managers who funded my graduate research knew that basic science conducted in the relative freedom of academia could also serve national needs. Along those lines, since the 1970s DARPA has explicitly tried to fund high-risk high-reward research. Much of the work has occurred in universities, and has spanned natural science, engineering and even (occasionally!) social science departments. So again, it appears that the decision-makers have known for a very long time that there is no real conflict between basic and applied research. Whatever scientists might believe and say in public, policymakers and politicians are smart enough to not listen to them. The NSF, the only agency that’s really dedicated to science for science’s sake, receives a paltry 4% of federal R&D spending. So scientists push a narrative that the people in power clearly, and thankfully, don’t believe.
Which leads me to say again, as I did in my last post, that efforts such as Branscomb’s are not really about changing funding patterns. They’re more about changing scientists’ priorities and the culture of academia. Those are worthwhile goals, and ones I support in some measure. I suspect these ideas would get a lot more traction if Branscomb simply said he wanted to make academia more welcoming to need-driven research. In my experience, young graduate students are yearning for that opportunity. Invoking rhetoric about Jeffersonian science just muddles the message.
The second important flaw in the usual antithesis is that these two widespread and ancient modes of thinking about science, pure versus applied, have tended to displace and derogate a third way that combines aspects of the two. This third mode now deserves the attention of researchers and policymakers.—Holton and Sonnert
These two sentences capture for me what is wrong with much writing on basic research. This long-suffering third mode, displaced and derogated by researchers and policymakers alike, was in fact advanced by Thomas Jefferson when he funded the Lewis and Clark expedition. Holton and Sonnert were surely aware of this fact because they discussed it shortly after the passage above. Their catchy phrase for such work–Jeffersonian science–is even in the essay’s title. Holton and Sonnert also acknowledge that defense and health science funding are “dramatic and successful examples” of Jeffersonian science. So what then are they saying? That even though the U.S. funded Jeffersonian science in the past, funds it today, and will fund it in the future…policymakers are still unaware of it?
And what did Lewis Branscomb mean when he insisted we should not “settle for a [basic/applied research] dichotomy” because Jeffersonian research is a “central element of a new model?” Branscomb too was surely aware that Jeffersonian research is not new and has existed since the times of, well, Jefferson. And how about Roger Pielke Jr.’s call for a “new understanding of how science serves national needs” and research that reflects the “broader scope of considerations relevant to practical problems.” Roger must also know that the U.S. has been doing just that for two hundred years now.
This lack of specificity carries over to their policy recommendations, and I was left wondering what exactly they wanted. Is it that we should spend less on basic and applied work but more on Jeffersonian? Or should we increase the amount spent on Jeffersonian while keeping the rest constant? Or perhaps basic and applied work should be mostly eliminated in favor of Jeffersonian?
I don’t mean to be too critical. I thoroughly enjoyed Lewis Branscomb’s talk at the 2010 AAAS annual meeting. I look forward to reading Holton’s “Thematic Origins of Scientific Thought.” And Roger Pielke has deeply influenced my thinking on science policy and politics. I continue to visit his blog every day. But I think in this case they weren’t as clear as they could have been.
From what I can tell, Branscomb et al. simply think Jeffersonian research is more meaningful and important than basic research and they want scientists to think so too. But for some reason they can’t just come out and say that. When it comes to this topic, being direct and to the point appears to be difficult. I think we can all learn from Jane Lubchenco’s 1998 address to AAAS: “Urgent and unprecedented environmental and social changes challenge scientists to define a new social contract. This contract represents a commitment on the part of all scientists to devote their energies and talents to the most pressing problems of the day, in proportion to their importance, in exchange for public funding.” While I may not agree with her, at least I know what she thinks.