Archive for the ‘Basic Research’ Category

More sloppy thinking on DOD-funded research

January 10, 2012

I’m sure that Matt Yglesias has forgotten more economics in the past hour than I will ever know. And yet, he believes that “if spending on military robotics declines then our most talented roboticists will focus more of their time and attention on civilian applications.” Really? Military spending doesn’t affect the overall demand for engineers and scientists? It’s just as likely that if spending on military robotics declines, our most talented roboticists will leave robotics and science altogether. If Lockheed Martin, Raytheon, etc. weren’t hiring, many of my friends would be out of a job, not making snazzy commercial gadgets.

Sloppy thinking on DOD-funded research

January 10, 2012

DOD research unrelated to weapons!

Several liberal bloggers protested the Times suggestion that cutting the Defense budget will reduce innovation. While some of their points are well-taken (the DOD budget is almost certainly bloated and wasteful), they all unfortunately make two big mistakes: they equate defense research with weapons research, and they neglect the role of deployment in bringing technology to scale.

Here is Robert Wright’s flawed analysis, typical among the group:

Defense department research, in contrast, focuses on services that people are more ambivalent about–like getting blown up. If more benign services get developed in the process–like if blowing people up involves technologies that help them play digital music–that’s a happy accident.

Wright’s simplistic link between DOD research and weapons ignores the synergies between civilian and military technologies. At some point the DARPA-funded optical interconnects that my girlfriend studies may improve weapons. But in the short run, they have a much better chance of reducing energy use.

At least in universities, DOD complements rather than competes with civilian agencies, and they all fund similar work. Everyone in my lab did the same sort of space physics research. Some of us were funded by the Air Force, some by the Office of Naval Research, and some by NSF. While the emphases may have differed slightly, there was a lot of overlap. That’s why we all had the same advisor. I’m sure there’s a similar dynamic in quantum computing funded by DARPA, the NSF, and DOE.

The existence of multiple funding agencies is one of the main strengths of U.S. science. They foster diverse approaches and ensure that a single paradigm doesn’t dominate. It wouldn’t necessarily be a good thing if all of DOD quantum computing money were transferred to the NSF. We want many groups attacking the same problem and we should be happy DOD is part of the mix.

Now if all we care about is research production, we may be fine with just two or three agencies funding science. Especially if DOD is as inefficient as these bloggers suggest, we may be better off transferring half the DOD research budget to NSF and DOE.

But we don’t care about research for the sake of research. We want to drive innovation, which depends on much more than government funding. Which brings me to the second mistake Wright et al. make: ignoring the importance of deployment.

As David Roberts noted, technology deployment is itself a form of research. It’s one thing to make a neat device in your lab. It’s quite another to scale the product, align it with customer needs, clear regulatory hurdles, and market it successfully. I can’t tell you how routinely companies fail for these reasons even when they have the science locked down. As great as NSF research is, it’s only a small part of the picture.

Computers are commonplace not only because smart physicists figured out quantum mechanics, but also because we learned how to make lots of computer chips cheaply and quickly. The DOD role in this development has been crucial. Precisely because DOD is so massive and relatively price-insensitive, it enabled large-scale deployment and the learning that goes along with it.

Cliff Bob shows that he doesn’t understand any of this:

Nowhere in the article is there anything but assumption that only the military, as some kind of beneficent and far-seeing midwife of invention, could have fostered these and other innovations.  Nowhere are there convincing arguments that most if not all of these developments wouldn’t have been made either through some other government R & D agency or through the market itself.

Nowhere in Bob’s article is there anything but the wrong assumption that “these developments” occurred primarily because of an R&D agency rather than procurement and deployment. In some cases DOD was the only market in existence because no one else could afford the technology. Only after DOD brought down the price of semiconductors did we all benefit.

DOD may very well be wasteful and inefficient. Maybe in 2012 it’s not the best way to drive innovation, and perhaps the negatives now outweigh the positives. Those are fair arguments. But to debate the point intelligently, we have to first rid ourselves of the myopic view that money is all that matters. DOD funding is associated with scale and deployment, key components of innovation and commercialization. (See Roger for more along these lines.)

The most depressing part about all this is how otherwise brilliant writers make bafflingly simplistic arguments when it comes to innovation policy. Is it really so hard to understand that innovation requires more than government funding of R&D?

Scientists and the economic crisis, ctd.

December 1, 2011

Via the Jack Stilgoe post I just discussed, Colin Macilwain exhorts scientists to deal with a world in crisis:

Those involved in science policy sometimes seem to me to be sleep-walking through the greatest crisis to afflict the West since the Second World War. True, from the point of view of the scientist at the bench, grants continue to flow and results continue to be published. Perhaps this is why wider discourse about science’s role in society has hardly budged an inch.

For the past three years, I have grown steadily more impatient with this ‘business as usual’ approach. Whenever an academy president or research chief stands up to speak in public, I have been waiting for them to explain how they will do things differently. They never do.

Macilwain doesn’t seem to understand that scientists are already dealing with a crisis. From their perspective, less science funding is the crisis to be dealt with. Why should scientists meekly accept that they must change their ways when everyone else is trying to maintain business as usual? Scientists see a shrinking pie and want their portion to stay the same. It’s self-preservation, and there’s nothing wrong with that.

Scientists genuinely believe more science funding serves the common good and addresses the economic crisis, just as the Chamber of Commerce genuinely believes the same about lower corporate taxes. Scientists do in fact care about basic research. Asking scientists not to lobby for what they care about is asking them to abdicate their democratic responsibilities. It’s not a fair request.

Going forward, a better approach may be to stop narrowly equating science with academic basic research (something I’m guilty of in this very post), and instead try to direct funding to different kinds of science. Academics will always gravitate toward the kind of research that leaves the Macilwains and Stilgoes of the world unsatisfied. So rather than attacking this type of research, Macilwain et al. should do their own political lobbying for the type of science they want. A world in crisis demands it.

The fuzzy boundary between science and technology

September 9, 2011

Over at Front Porch Republic, Nikos Salingaros (via Jerry Salyer) draws an unnecessarily sharp boundary between science and technology:

James Kalb has recently interviewed mathematician and architectural theorist Nikos Salingaros.  As someone who has always been irritated when people mistakenly think it the job of “science” to invent more and shinier consumer crap, I particularly appreciate the following remarks of Professor Salingaros:

Our educated world remains ignorant about the distinction between science and technology, unfortunately.  Science helps us understand the universe and ourselves.  Technology applies scientific results to master processes that we can manipulate so as to better our lives.  It is also applied to kill people directly, destroy nature, and threaten our own survival.  Or to save us from our stupidity.  Tools can be used for either good or evil.

The actual history of science and technology is a lot fuzzier than the stark picture painted here. There’s often a fair amount of overlap between the two, and it’s often not straightforward to see where one begins and the other ends.

To give just a few examples, much of basic materials science depended on preceding technological developments in metallurgy. My own (former!) field of space physics developed primarily because engineers working on improved telephone communication needed better data on the Earth’s atmosphere. The interplay between science and technology in thermodynamics is legendary. In this case, some of the most fundamental scientific laws owe their discovery as much to the steam engine as they do to some abstract quest for understanding. To this day, college sophomores learn the science of thermodynamics by studying the technology of steam engines. Technology itself (in the form of better instrumentation) is often created to help us “understand the universe and ourselves” while science qua science is sometimes the tool responsible for the bad stuff.

None of this negates the idea that there are differences between what we call science and what we call technology. Or that there are questions we can and should ask about technology as distinct from science. There surely are such questions, and on some level we can distinguish between the two. But as I’ve noted before, presenting diverse, varied concepts as simple monoliths often obscures more than it illuminates.

Categories: Basic Research

Unclogging the drug pipeline

May 11, 2011

David Bornstein reports on a promising approach to increasing the number of drugs produced by medical research:

Researchers in the foundation’s four partner labs say that the relationship hasn’t constrained their science; it’s improved it. “We are certainly looking for targets to enhance the reparative potential of the central nervous system,” explained Brian Popko, the Associate Chair for Research in the Department of Neurology at the University of Chicago. “But we weren’t encouraged to develop therapeutic approaches before we understood the system we were trying to repair.”

Johnson has also assembled some of the country’s most accomplished neurologists and immunologists, as well as industry experts to sit on M.R.F.’s three advisory boards, which focus, respectively, on basic science, drug discovery and clinical issues.

To date, researchers have identified a number of targets, of which five to seven are strong candidates that M.R.F. is actively pursuing. They are in close contact with industry researchers and will be farming out the “translational” research needed to attract pharmaceutical interest. They control 50 percent of the licensing of patents for research they fund and, as a nonprofit, can be very flexible in negotiating fees. Within five years, Johnson envisions partnerships with four or five companies, each of which will be moving a couple of targets forward. “We hope to have our first clinical trial for myelin repair by 2014,” he says. “To get to a clinical trial within ten years in an area of research that until recently no one thought was therapeutically relevant would be hugely significant.”

If the M.R.F., and like-minded foundations, succeed, their models will likely influence the bigger actors in medical research. Francis S. Collins, the head of the N.I.H., is pushing to create a new drug development center to be named the National Center for Advancing Translational Sciences.

In an admission that I know some people will appreciate, Stanford professor Ben Barres acknowledges that academia incentivizes basic rather than “useful” research:

“Pure science is what you’re rewarded for,” notes Dr. Barres. “That’s what you get promoted for. That’s what they give the Nobel Prizes for. And yet developing a drug is a hundred times harder than getting a Nobel Prize. We really have to have the very best scientists engaged in this. For a long time this hasn’t been the case. Until five or ten years ago, working on disease was kind of shunned.”

Categories: Basic Research

Discovery is not a synonym for innovation

February 12, 2011

My last post discussed Lehrer’s column on the increased difficulty of making scientific discoveries. Lehrer should have stuck with that topic alone instead of pivoting off Tyler Cowen’s The Great Stagnation to ponder the relationship between scientific discoveries and standard of living:

I think it’s also worth contemplating the disturbing possibility that our cresting living standards might ultimately be rooted in the difficulty of making new scientific discoveries.  After all, at a certain point the pursuit of reality is subject to diminishing returns – our asteroids will get so small that we’ll stop searching for them.

Lehrer often paints wonderfully nuanced pictures of science, so I’m a bit confused to see him write this. Living standards never have been and never will be “rooted” in new science. If they are rooted in anything, it is productivity increases related to innovation. The rule of law, tax structure, monetary policies, and capital investment all play a pretty big role here.

In fact, only since WWII has science been more than a minor player. Industrial revolution technologies were not strongly linked to the scientific revolution that preceded them, and may have depended more on a robust patent system than on heroic scientists. Even if we grant that science has recently become critical, it may be more so in America than elsewhere. The Japanese economic miracle occurred despite a paltry level of science funding, and was spurred by careful industrial planning. Germany seems to have escaped the worst of the Great Recession despite spending relatively little on R&D. And while our standard of living may be “cresting”, the developing world is, well, developing quite well. So again, there is no straightforward link between science, innovation, and living standards.

The U.S. may indeed have a problem with economic stagnation, and it’s important to understand what exactly is going on. But casually assigning too much credit or blame to discovery is, for lack of a better term, pretty unscientific and doesn’t help.

Categories: Basic Research, Economics

The slowdown of meaningful discovery

February 11, 2011

Jonah Lehrer has again written a provocative piece, arguing that growing collaboration and teamwork among scientists is a response to “all the low-hanging facts [having] been found.” Today’s science is simply much too hard and complex to tackle alone. Lehrer quotes Samuel Arbesman:

If you look back on history, you get the sense that scientific discovery used to be easy. Galileo rolled objects down slopes. Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge to ask the right questions, but the experiments themselves could be almost trivial.

Today, if you want to make a discovery in physics, it helps to be part of a 10,000 member team that runs a multibillion dollar atom smasher. It takes ever more money, more effort, and more people to find out new things.

Given that I wrote something similar just a month ago, I of course liked this passage. A few responders at Andrew Sullivan do critique the notions that science was ever easy or that we’ve reached the “end of discovery.” On the latter point, I agree that the meme is a bit overwrought. Over 1 million papers get published every year, and presumably most of them make a discovery of some form. Some might later be proven wrong, and some may be meaningless (Lehrer claims one-third of papers never get cited), but they are discoveries nonetheless.

Perhaps a better phrase is the slowdown of meaningful discovery. The fact that all the low-hanging fruit have been found doesn’t end the pursuit of knowledge. It just means we have to focus on smaller, more specialized problems that have less payoff (hence all those uncited papers), or focus on really, really hard problems that can’t be solved (hence all those corrections in medical research). The productivity slowdown in pharmaceutical R&D is one notable example of the former phenomenon. The graph is particularly striking (h/t Roger Pielke Jr.).

The rate of discovery ultimately matters because we think it affects our standard of living, a point Lehrer addresses towards the end of his post. Unfortunately, he really muddles the relationship between discovery and innovation. I’ll try to address this in my next post.

Categories: Basic Research, Economics