That's the question behind a paper (abstract available here) released Monday by the National Bureau of Economic Research. The paper -- by Benjamin Jones, associate professor of management at Northwestern University -- argues that science has changed in key ways. Specifically, it argues that the age at which researchers are able to make breakthroughs has risen, and that scientists are part of increasingly large teams, which encourages narrow specialization. Yet, he argues, much of science policy continues to assume the possibility, if not the desirability, of breakthroughs by a lone young investigator.

Recently, I was telling a student considering making the jump from English into Math and Astronomy that they shouldn't worry about how old they are, since the average graduate student is now somewhere around 30. I forget where I read that, and now I wish I could find it again. It means most people start their professorial careers in academia, or move to commercial labs, in their late 30s, if not their 40s. The man who headed my research group at my last job typified this, having gone back to college in his late 30s. Granted, a large chunk of his 20s was taken up by Vietnam.
Just to be contrarian (a healthy thing in science), I'm not sure that what the world needs is more collaborative research. Sure, it will let us tackle bigger and bigger questions, but those big questions are increasingly narrow. Further, these massive collaborations feel like a kludge for many of the problems: instead of clever experimental design, we're substituting brute force.
Sometimes I think the solution is to make people in both academia and industry take a mandatory stint at the Patent Office.