Archive for April, 2011

The future of the PhD

April 23, 2011

Nature has a special feature on the future of the PhD. Mark Taylor, professor of religion at Columbia, has some especially strong words:

Universities face growing financial challenges. Most in the United States, for example, have not recovered from losses incurred on investments during the financial fiasco of 2008, and they probably never will. State and federal support is also collapsing, so institutions cannot afford to support as many programmes. There could be an upside to these unfortunate developments: growing competition for dwindling public and private resources might force universities to change their approach to PhD education, even if they do not want to.

There are two responsible courses of action: either radically reform doctoral programmes or shut them down.

Eh. I’m instinctively skeptical of suggestions that the only “responsible” action is radical change. The PhD isn’t going anywhere for a very long time, and any changes that do occur will surely be small and exploratory. Statements like that make me roll my eyes.

That said, graduate education does need to be reexamined. It’d be nice if we could get a group like the National Academies to write a report that will later be forgotten. I’m pretty sure no one remembers this one from 1995!

Categories: Academia

Blog Dump

April 18, 2011

Here goes… my cop-out technique to avoid writing a real post when I’m too busy or tired.

1. Over at TNC, they’re discussing why academic debates don’t enter public discourse. In my view, we can’t discount the fact that most academic debates just aren’t that interesting! It’s telling that the phrase “It’s academic” generally refers to questions that ultimately don’t have much relevance to things we actually care about.

2. Scott Adams suggests that practical knowledge should trump learning for learning’s sake:

Some of you will argue that learning history is important on a number of levels, including creating a shared culture, understanding other countries, and avoiding the mistakes of the past. I agree. And if the question was teaching history versus teaching nothing, history would be the best choice every time. But if you compare teaching history with, for example, teaching a kid how to compare complicated financial alternatives, I’d always choose the skill that has the most practical value. You get all the benefit of generic mental training plus some real world benefits if any of it is retained.

3. Via David Bruggeman, we learn that robots are getting really f***ing smart:

After an update to its software, a robot scientist has recycled its previous research to make a new biological discovery.

Named Adam, the van-sized robot came to scientific fame after autonomously investigating gene function in yeast. Those findings anticipated an era when computers wouldn’t just be research tools, but researchers.

Wow!

Categories: Academia, Blog Dump

Ideas are overrated

April 13, 2011

Light blogging this week. But check out Farhad Manjoo, who insists that Mark Zuckerberg, and no one else, invented Facebook and deserves the credit. I’m not too familiar with the details, but this passage caught my eye (emphasis added):

I suspect we’re mainly interested in how Facebook got started because we want to know whom to credit for coming up with a brilliant idea. In America, we root for the guy with the great idea over the guy who didn’t sleep for a year making it happen. If Zuckerberg really did come up with the idea for a campuswide social network, he deserves all the billions that are coming to him. But if he stole the idea, why should he profit from something that someone else thought up first?

Easy answer: because Zuckerberg did it better. If you look at the early history of Facebook, you’ll see that almost nothing about it was a new idea. Even if it’s true that the Winklevosses came up with a plan for a Harvard social network first, they were obviously inspired by other sites. Social-networking sites—even ones focused on college students—had existed long before Facebook. The real value of Facebook wasn’t that it did something new, but that it did something old better—faster, prettier, more useful, and more addictive. This is a story we’ve heard before in the likes of the iPod, the iPhone, the iPad, Windows, and Google. None of these were new ideas, but we shouldn’t think any less of them because of it. Ideas are overrated. In technology, what really matters is execution.

Categories: Economics

The Chinese (and Indians) are taking over!

April 8, 2011

Here’s David Rothkopf at Foreign Policy parroting the standard (and careless) meme that there is a simple, straightforward link between scientific production and economic growth, while also making some dubious historical claims:

The report reveals that whereas in 1996, the U.S. produced approximately 290,000 scientific papers and China produced just over 25,000, by 2008, the United States had crept forward to just over 316,000 whereas China had increased to about 184,000. While estimates as to the speed China is catching up vary, the report concludes that a simple straight-line projection would put the Chinese ahead of the United States … and every other country in the world … in output by 2013.

How did China do it? Simple: They made it a priority. They increased research and development spending 20 percent a year or more every year since 1999 and now invest over $100 billion annually on scientific innovation. It is estimated that five years ago, the Chinese were already producing over 1.5 million new science and engineering graduates a year.

This data resonates on many levels. It suggests a profound shift in the world’s intellectual balance of power. This shift is one that is historically linked to the economic vitality and consequent political and military clout of the countries that lead. It suggests a much better future for the people of the world’s most populous country and knock-on benefits for their neighbors and trading partners. It suggests a relative decline in influence for the U.S. And, for the people of the Arab world, currently struggling with their own revolutions, it suggests the only true path to real reform, opportunity and empowerment.

It is an axiom of history that the silent revolutions — like those that periodically come in science and technology — are far more important than the noisier, bloodier and more publicized political kinds. That’s why these subtle indicators of their progress can be even more momentous than the round-the-clock coverage of upheaval that seems to be dominating our attentions at the moment.

How on Earth can you possibly specify the relative importance of the American and French revolutions over some (unnamed) silent revolutions? What does “more important” even mean? At any rate, we’ve also known for at least five years that total graduates is a poor metric for both China and India. More recently, this article in the Wall Street Journal notes the generally poor quality of many Indian college graduates. There are broad trends in global science, and they must be carefully examined and understood. This piece doesn’t help.
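For what it’s worth, even the “simple straight-line projection” depends entirely on which years you fit. Here’s a quick back-of-the-envelope check using just the two endpoints quoted above (my own naive linear fit, not the report’s actual method; the report presumably fits more recent, steeper trend data):

\[
\text{US rate} \approx \frac{316{,}000 - 290{,}000}{2008 - 1996} \approx 2{,}200 \text{ papers/year}, \qquad
\text{China rate} \approx \frac{184{,}000 - 25{,}000}{12} \approx 13{,}300 \text{ papers/year}
\]

\[
\text{crossover} \approx 2008 + \frac{316{,}000 - 184{,}000}{13{,}300 - 2{,}200} \approx 2008 + 12 = 2020
\]

So the quoted endpoints alone put the crossover closer to 2020 than 2013; the headline year requires extrapolating China’s recent acceleration. All the more reason not to treat these projections as destiny.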

David Bruggeman’s measured response to the growing Iranian (!) power in science should be heeded:

In any event, these numbers-focused discussions have a bad habit of devolving into narrow zero-sum game conversations focused on whether leading countries are losing or not. Because collaboration is an important part of science, and it tends to resist international tensions, having more publications, more scientists, and more quality researchers will help everyone.

Categories: Economics

Public literacy and self-awareness

April 5, 2011

Via Chris Mooney and The Bubble Chamber, here’s sociologist Barry Glassner calling for more public literacy among scientists:

Scientists and their advocates need to become more knowledgeable about how people come to their beliefs — who they rely on for scientific information, what they hear, and through which filters they hear it….

Nor are pleas for rationality and greater respect for science likely to win the day. Were hard data and cold logic all that mattered, any number of common personal behaviors would be long gone by now, from smoking to overeating. As any skilled public relations practitioner will attest, successful communication meets people on their own turf — by means that address emotions, fears and values.

While I’m deeply sympathetic to this argument, it only goes halfway. If scientists truly want to understand the public, they first have to accept that they are part of it, that we too need our emotions, fears and values addressed. For me at least, building self-awareness and humility has been a continuous, painful process, and one that I’m far from completing. Confronting and interrogating your most deeply held beliefs and concluding that you do not, and may never, really understand them is not easy. I now look back with more than a little embarrassment at my Richard Dawkins-worshipping days and wonder how I could have been so stupid.

As I’ve done so often, here’s Barbara Herrnstein-Smith articulating my thoughts better than I can in the context of (yet again!) science and religion:

Scientists share cognitive tendencies, achievements, and limits with nonscientists; religious believers share them with nonbelievers. Although each may put the world together and conduct his or her life in ways that are at odds with or opaque to the other, the cosmology and way of life of each deserves minimally respectful acknowledgement from the other. Such acknowledgment would not mean accepting ideas one finds fantastic or claims one knows are false. And of course it would not mean approving practices that one knows are confining, maiming, or murderous to oneself or to others. What it would mean is recognizing, as parallel to one’s own, the process by which those cosmologies and ways of life came to be formed.

Not me, says the self-vaunting evangelical atheist. Tu quoque–you too–says the defensive, resentful theist. Et ego–I, too–says the reflexive, reflective naturalist.

The two best closing paragraphs to a non-fiction book I’ve ever read.

Categories: Public Discourse

On astrology and demarcation

April 1, 2011

I’ve been very busy applying to jobs and figuring out my life, so I’ll just leave you with a few posts all related to astrology. I meant to link to these a while back:

1. Start with Paul Newall for a somewhat academic summary of astrology, Kuhn, Popper and Feyerabend. It will be most relevant if you are at least somewhat familiar with the names. I liked the closing paragraph:

In summary, the philosophical problem for astrology is thus not that it can always explain failures (Popper) or that it does not attempt to solve problems (Kuhn) but instead that it has stagnated (Feyerabend) – assuming that this progression in criticisms is fair, of course. Notice that, for both Kuhn and Feyerabend, this is not a final verdict: if astrology can become problem solving or – better – if it can strengthen its arguments while proliferating alternative theories, it might be possible to eventually reassess it. However, excoriating astrologers or calling their discipline “rubbish” is perhaps unlikely to encourage such an improvement in matters.

Also read through the comment exchange.

2. Paul links to historian Rebekah Higgitt, whom you should also read. Some fascinating history that I was completely unaware of:

When science happens there is always a reason: astronomy developed because, broadly, it served three masters: navigation, timekeeping and astrology. These were, all three, supremely important in ensuring development of accurate positional astronomy, because all were things for which people were willing to pay.

Although the words astronomy and astrology were often used interchangeably, I think it can be helpful to think about astronomy as the means by which data was generated (observations taken, mathematics applied, models created and tables drawn up) and the others as uses made of that data. The need for all three applications drove astronomy. Good, accurate astronomy would ensure good, reliable and accurate time-telling and navigation, and the best possible basis for astrological interpretation to take place. There are clear historical examples of astrology rather than the others being the impetus behind particular instances of patronage of astronomers or mathematicians to undertake observations or produce tables. This was the case up until the late 17th century.

3. Finally, here’s practicing astrologer (!) Deborah Houlding insisting that astrologers do in fact know their stuff:

All informed astrologers recognise the definition of the constellational groupings of stars, and draw meaning from planetary relationships to the prominent fixed stars. All informed astrologers also know that the Sun does not cross the equator at exactly the same point of the ecliptic year after year, but that the phenomenon of precession brings a 50 second shift between the first point of the tropical zodiac and the background constellations, which accrues a disparity of about 1 degree every 72 years (see here for explanation).
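Her precession arithmetic checks out, for what it’s worth. A quick sanity check, reading the quoted “50 second shift” as roughly 50 arcseconds of precession per year:

\[
\frac{3600 \text{ arcsec/degree}}{50 \text{ arcsec/year}} = 72 \text{ years per degree of drift}, \qquad
360 \times 72 = 25{,}920 \text{ years}
\]

for a full precessional lap of the zodiac, in line with the roughly 26,000-year astronomical cycle.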

I will leave you with yet another promise to expand on my thoughts at a later date! My only (brief) comment connects to Higgitt’s passage that I quoted above. If astrology was once a part of science, why do some scientists seem to get so angry about it now? It seems to me that astrology is mainly a playful, harmless distraction that had distinct value at one point in history. So while it may be unscientific…why get so worked up about it?

Categories: Philosophy