Not just a hodge-podge, but a miscellaneous hodge-podge. Here’s what I’ve been reading:
But humans keep experiencing suffering and death. Why? What explains the tremendous mismatch between expectation and reality? Are the cures really coming, just more slowly than expected? Or have scientists fundamentally misled us, and themselves, about the potential of new medical technologies?
4. Barbara Herrnstein Smith’s Natural Reflections: Human Cognition at the Nexus of Science and Religion. I just started this book, which has been on my list since reading this Stanley Fish post. The introduction has already blown my mind and I look forward to reading more. Money quote from a follow-up response Smith wrote to Fish’s post:
Finally and most seriously, I think that the idea of science and religion as counterpoised monoliths deepens prevailing misunderstandings of both. As I emphasize throughout the book, the kinds of things that can be assembled under the term “religion” are exceptionally diverse. They range from personal experiences and popular beliefs to formal doctrines, priestly institutions, ritual practices and devotional icons — Neanderthal burial rites to Vatican encyclicals. The same can be said of “science,” a term that embraces a wide range of quite different kinds of things — general pursuits and specialized practices, findings and theories, instruments and techniques, ideals and institutions (not to mention a share of devotional icons and ritual practices).
I’ve been speaking about the disunity of science for a while now, and it’s refreshing to see it approached from a different perspective.
Although the conclusions reached in this post are initially counter-intuitive, we here explain why ethical arguments are in some ways much stronger arguments than self-interest based arguments and the failure to look at climate change policies through an ethical lens has practical consequences…In fact, ClimateEthics believes that an appeal to self-interest alone on climate change, a tactic followed both by the Clinton and Obama administrations for understandable reasons, has been at least partially responsible for the failure of the United States to take climate change seriously.
Read the whole thing. While his writing could have been clearer, his post connects nicely with my last one. Perhaps a general reluctance to engage with the ethics of climate change has consequences beyond the corrosion of public discourse.
Let me expand on an idea I started a couple of posts ago, namely that the mundane is always sacrificed for the sexy. Put another way, all discussions about science inevitably require grand claims. Consider the delicious symmetry between the science studies crowd and the natural and physical scientists. The latter group insists that biochemistry and supercomputers are the only ways to solve cancer and global warming. And how do the social scientists respond? By stressing that these very approaches prevent us from solving cancer and global warming!
Don’t get me wrong. These problems need to be solved, and I’m glad everyone devotes so much energy to them. But it’s quite strange that all of our arguments have to be framed by such superlatives. There’s no space to make a simpler point that in the end is probably much closer to the truth. Physicists can’t admit that we focus on numerical solutions not because we believe it’s the best or only path forward, but because we enjoy that type of work. Conversely, the science studies crowd does not (as far as I know) argue that the (mild!) hypocrisy of scientists’ exaggerations is intrinsically wrong. Rather, they string together a series of tendentious links that are very hard to prove: if the benefits of basic research weren’t exaggerated, then (maybe) we’d spend more on socially relevant research, and then (again maybe) we’d make better progress on solving cancer and global warming. There are too many hypotheticals here for my comfort, and this argument is no less convoluted than the ones often made in defense of basic research.
We really shouldn’t have to justify everything in terms of majestic solutions to big problems. Many scientists simply like basic research, and they should be allowed to say so. And we can disagree with scientists’ exaggerations on the rather boring grounds that the arguments are bad on their own terms and undermine honest debate. Cancer shouldn’t have to be part of the picture.
It is true that personal motives alone cannot compel public action. We’re understandably wary of basing policy on something as whimsical as subjective preference. But my very basic reading in political theory tells me that, at its most fundamental level, democratic politics was designed to accommodate the empirical fact that people simply care about different things. And thus we shouldn’t have to justify everything we care about entirely in terms of the common good. Public discourse suffers if everyone tries too hard to cloak the true reasons for their actions. These reasons, however mundane and ordinary, must be part of the picture.
* In my never-ending desire to prove my lack of originality, I’d like to credit Kammen and Dove’s wonderful article for inspiring the title of this post.
David Bruggeman had some problems with my last post:
You’re coming across as glib here, even suggesting some of the same carelessness you’re upset about in others. Deranged? What’s your evidence? Simplistic, sure. Incomplete, right there with you. Deranged? Can’t agree. The interwar period in scientific research and technological development could lead some to assume basic research led directly to the technological advancements of the 40s and 50s.
I’m also not persuaded by assertions that scientists’ promises of their research are intentionally overblown. If there’s proof, then they need to be called on it. But if they’re making an estimate of when they think certain things will happen, it needs to be treated as a prediction rather than a promise. If they can’t predict well, then they shouldn’t be funded, but I wouldn’t consider that to be fraudulent behavior.
My use of the word deranged was wrong, and I apologize. David is correct when he writes that the interwar period “could lead some to assume basic research led directly to the technological advancements of the 40s and 50s.” But scientists are supposed to do more than merely assume such links. Rather, we must examine evidence and draw reasonable conclusions. Simply following our own protocols would have alerted us to the fact that at least since 1982 there have been doubts about a straightforward link between basic research and innovation. More recent work has not clarified the situation, as David himself has discussed several times. There would be no confusion if scientists were as rigorous here as we are with our own research.
Which is why I described our behavior as careless and hypocritical. These sins are less serious than “intentionally overblown” and “fraudulent,” which I don’t believe scientists are guilty of. I’m even willing to accept our actions as relatively mild and ultimately unimportant. But it’s still hypocrisy.
I think it’s perfectly understandable and reasonable for scientists to [exaggerate our benefits]. Every other special interest group does so. There’s nothing wrong with that of course. Some would even say that healthy democracy requires special interest groups.
In the back and forth that followed, Ryan criticized this attitude as blithely accepting an outcome with potentially harsh consequences. More money for genetic maps and global climate models may mean less for research on racial health disparities and adaptation. The mundane gets sacrificed for the sexy while the public continues to believe in utopian promises, a meme that the science studies folks have pretty much beaten to death.
All of this is of course true. Scientists do unreasonably favor basic research over short-term needs, and surely many important questions remain unanswered as a result. I should not have so readily acquiesced to the current, imperfect state of affairs. Lives are at stake after all. And so we must push against scientists’ false promises, and we must highlight the opportunity costs of basic research.
But in making such arguments, the science studies community ironically neglects more mundane and less sexy ones. Yes, it’s true that more biochemistry won’t necessarily lead to better health. But it’s also true that those making this claim forgo logic and evidence for anecdotes and sloppy reasoning. What most bothers me is not our ineptitude in solving cancer and global warming. It’s that the lobbying done in my name brings with it distortions and half-truths that contradict the very qualities I allegedly embody. How can we honestly advocate for better science education and data-driven decisions while also promoting the deranged belief that basic research is the source of technology?
Such carelessness is unbecoming of people who supposedly value reason and evidence. In the end, potential negative consequences shouldn’t be the only reason to resist scientists’ exaggerations. Protesting blatant hypocrisy is also a good reason. In my view it’s reason enough.
This past week I’ve been at the Gordon Research Conference on Science and Technology Policy. It’s really a great conference with a nice mix of academics and practitioners. I’d discuss more here, but it’s unfortunately off-the-record. I could try to make the case that, given my underwhelming readership, my blog should qualify as off-the-record, but I don’t think it’d fly. Oh well, such is life.
I will say that I had some very fruitful discussions on Rethinking Expertise, which I recently gave a glowing review. They definitely gave me some perspective on how the work is viewed by people in the field.
My earlier post neglected to mention another reason I liked the book. On page 51, CE noted that the failure to wrestle properly with expertise gives a “misleading picture of the power of logical thought and experimental genius.” In light of my views on scientific thinking, this message had special appeal. Ultimately, content knowledge and specific expertise matter much more than an amorphous, poorly defined method of thinking.
Now, resolving the problems of expertise won’t necessarily make contentious debates any easier. Climate change and genetically modified crops are contentious for reasons deeper than a misunderstanding of expertise. But addressing the misconceptions might be a useful place to start.
So I just plowed through Harry Collins and Robert Evans’ [henceforth CE] Rethinking Expertise. Though it’s a bit dense, you’ll find it insightful if you can get past the STS jargon. I’ll give a brief review now, and try to expand more in future posts.
Very early on, the authors insist that expertise does matter and that, all else being equal, we should prefer experts’ judgment on technical matters. This attitude contrasts with some of the more egalitarian (and, in both my view and the authors’, misinformed) approaches that deprivilege science completely. While science studies has performed admirably in deconstructing science and removing its mystique, it can go too far. There are actual facts about the physical world, and oftentimes these do matter. Neither democracy nor scientific expertise should dominate a decision, and we must welcome but limit public involvement. Ultimately, CE aimed to provide a vocabulary and a way of thinking about expertise to help us negotiate this terrain. They succeeded in the latter goal, but their clumsy and inelegant terminology will hinder the former.
CE spend quite some time developing a taxonomy, diagrammed in a rather unhelpful “Periodic Table of Expertise.” While the figure itself left much to be desired, the corresponding discussion was excellent. What I found most interesting was their distinction between interactional and contributory experts. The latter group consists of scientists publishing in a fairly constrained field, such as (to use their example) gravitational wave physics. Interactional experts can talk-the-talk but aren’t qualified to publish. Their arguments were bolstered by what appeared to be carefully run experiments. Such experimentation and data analysis are rarities in science studies, and I greatly appreciated their presence here.
After reading their work, I realize my insistence that only climate scientists be allowed to speak on global warming is a bit restrictive. There are people–namely interactional experts–who can speak on aspects of the problem without actually being a part of the IPCC. Joe Romm and Roger Pielke Jr both probably qualify, though I’m sure they’d both hate to be lumped together! Interactional experts can even be non-scientists, as the sheep farmers in Brian Wynne’s famous study showed. The fact that their framework coherently incorporates both scientists and non-scientists makes it even more impressive.
I wish they had applied their model to a contemporary science controversy; the lack of an in-depth case study is the book’s only major omission. It will hopefully be corrected in future work.
Via Ezra Klein, I came across this excellent but dated Newsweek article cleverly titled “We Fought Cancer…And Cancer Won.” It nicely discusses, among other things, the disconnect between basic research and medical advances, something I touched on recently. I strongly recommend the entire article. Here’s the money quote:
It made for a lot of elegant science and important research papers. But it “all seemed to have little or no impact on the methods used by clinicians to diagnose and treat cancers,” wrote Varmus. Basic-science studies of the mechanisms leading to cancer and efforts to control cancer, he observed, “often seemed to inhabit separate worlds.” Indeed, it is possible (and common) for cancer researchers to achieve extraordinary acclaim and success, measured by grants, awards, professorships and papers in leading journals, without ever helping a single patient gain a single extra day of life. There is no pressure within science to make that happen. [emphasis added--PK]
I’m sure somebody will really appreciate this quote! Beyond this issue, the author also briefly but effectively discusses prioritizing prevention over cure, connecting bench scientists with medical decision-making, funding riskier research, and ensuring that the benefits of science are distributed equally across all groups. I could probably write at least five more posts on this one article, but I’ll (probably!) spare you.
On an unrelated note, also check out this Times piece on the increasing presence of university presidents on corporate boards. There are probably upsides and downsides to the practice, which the article does discuss. John Gillespie’s stance is quite hilarious:
Academics may be trained to ask tough questions in their own fields, but when confronted with tricky business issues far above their level of expertise they “often become as meek as church mice,” he says.