Wednesday, October 15, 2014

When do we REALLY have too many PhDs, and what then?

Recently, NPR ran a suite of stories (sample1, sample2) on biomedical PhDs either leaving academia or not having academic jobs available to them. I teach a class to entering PhD students, and many of them expressed concern both about the general fear these stories propagate and about their own prospects post-PhD.

Meanwhile, everyone with or pursuing a PhD has an opinion on the value of a PhD, whether there really is a crisis of too many PhDs, etc. Some describe a virtual pyramid/Ponzi scheme that's existed for decades, wherein growth in the number of scientists in general, or PhDs in particular, is unsustainable. Others persistently argue that "The skills ... are useful in many professions, and our society needs to be more, not less, scientifically and technically literate." Still others say that the system should continue to take in many PhDs since we never know which will be the most successful. (I suppose there's an implicit "... and to hell with the losers who don't get the jobs they anticipated" in this last point of view.)

Elements of truth exist in each, so let's start with facts, then statements of the problem, and finally possible paths forward.

1) FACT-- right now, most science PhDs will not get tenure-track university jobs (for whatever reason). The figure I often see for the biomedical sciences is ~15% nationwide. It's easy for elite universities to assume that THEIR graduates will fall into that fortunate 15%. We recently collected data on Biology PhDs from Duke University, specifically those who were no longer in postdoctoral stints. For our program, indeed, the fraction of non-postdocs in tenure-track jobs was around 50% over the last decade. Of the rest, a subset were in non-tenure-track academic jobs, and the remainder had diverse careers-- research scientists, editors, writers, scientific sales, etc. Hence, even from elite private Research I universities, a large fraction of science PhDs will not get tenure-track university jobs (for whatever reason).

2) FACT-- research faculty have an intrinsic conflict of interest in considering whether there are "too many" PhDs. We faculty are judged by our productivity, which is partially (wholly?) associated with the talented trainees we bring into our labs. Turning down the tap of incoming PhD students could reduce our potential productivity, and thus our research programs and career-advancement prospects.

3) ARGUMENT-- there must be SOME hypothetical "maximum" number of PhDs, after which point there are substantially more PhDs than necessary to fill jobs requiring that level of training. One of the points used against this argument is a general "more education is good." Sure, if time and money were no object, then perhaps everyone could benefit from a PhD. Similarly, some sell the skills one gains in a PhD as "useful in almost any profession." This is clearly true in a trivial sense-- pursuing a PhD is more valuable than sitting at home and watching Netflix for 6 years. But for someone who'll end up a base-level staff scientist or general administrator (or perhaps leave science entirely), does it make more sense to get a PhD, or to spend 2 years on a master's degree and 4 years advancing their experience (and stature) within a chosen career that does not require a PhD?

So, with the above points in mind, I ask the first question. When do we REALLY have too many PhDs? Have we already passed that point? Are we coming close?

Here's some data (albeit crude) from a few years ago, separated by country:
http://www.nature.com/news/2011/110420/full/472276a.html
Note, for the US, they mention specifically "Figures suggest that more doctorates are taking jobs that do not require a PhD."

Some faculty toss out the word "industry" as a solution to the small number of tenure-track positions available, as though companies are struggling to find PhDs and these jobs are easy to snag (and as though these faculty even know what "industry" means). While some areas may be booming, many biomedical industry jobs are also quite competitive, and some PhDs who TRY to go into industry have difficulty getting in the door.

With that in mind, I ask the second question-- what do we do when we really do have too many PhDs? Turn down the pipeline at that point? It would already be too late for those who went through the pipeline-- the very people we purported to be mentoring. Some have argued we should reduce support for postdocs, but I think that is foolish, since it "strands" people who have already invested in a PhD.

Here are my thoughts on solutions, and I argue that these need to happen now, not later:

A) It's nothing short of criminal for graduate programs and advisers to fail to prepare students for diverse career possibilities. Some trainees may even prefer (gasp!) a non-academic route, not because it'd be any less competitive but because they prefer the work. How do we prepare such students? Some skills overlap between academic and non-academic positions: project management, thinking creatively about science, scientific rigor, some hands-on techniques, etc. But other aspects are not emphasized in PhD programs because they're less critical for securing research faculty positions than for other careers-- developing a portfolio of writing aimed at the general public, interfacing and networking with industry representatives, getting "real" teaching experience (a TA-ship is not "teaching"), etc.

We must engage students early in their PhD training to think seriously about their directions and advise them on how best to prepare for their chosen routes. Implementing only a "one-day workshop on diverse careers" is a pathetic solution to a real problem. The myIDP questionnaire is a good starting point. Longer-term programmatic solutions need to be in place, as well as individual faculty efforts. For example, I now insist that all of my PhD students get trained in at least rudimentary computer programming/bioinformatics-- this is useful to them in many careers and certainly broadens the careers from which they can choose post-PhD (a small example of what I mean by "rudimentary" is sketched below).
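To make "rudimentary" concrete, here is a minimal sketch of the sort of first exercise I have in mind-- parsing a FASTA file of DNA sequences and reporting each sequence's GC content. The filename and details here are hypothetical illustrations, not a prescribed curriculum:

```python
# A toy "first bioinformatics exercise": read a FASTA file and report
# the GC content of each sequence. (The filename below is hypothetical.)

def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, chunks = None, []
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(chunks)
                header, chunks = line[1:], []
            elif line:
                chunks.append(line)
    if header is not None:
        yield header, "".join(chunks)

def gc_content(seq):
    """Return the fraction of bases that are G or C."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq) if seq else 0.0

if __name__ == "__main__":
    for header, seq in read_fasta("example_sequences.fasta"):
        print(f"{header}: GC = {gc_content(seq):.1%}")
```

Even an exercise this small touches file handling, string manipulation, functions, and output formatting-- exactly the kind of transferable skill that serves students in academia, industry, and data-oriented careers alike.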

What I say above is necessary but perhaps not sufficient-- it assumes there are enough non-academic (or non-tenure-track) positions for all PhDs.

B) We must invest more in master's programs. I'm a big advocate for the value of a PhD, but a research-thesis MS also has value, brings more breadth to one's education and job prospects, and requires far less time.

The last is potentially the most controversial:

C) PhD-granting institutions may need to begin a serious discussion about scaling back the number of PhD students they admit. I stress that the elite research universities should take the lead in this and not presume that Northeastern West Virginia State University (a fictional school) should scale back its PhD program before, e.g., Yale University does. I'm not positive that the community is at the point where we must scale back PhD student numbers, but dismissing this option without serious discussion and exploration means we're waiting for the catastrophe to happen before we're even willing to assess it. Let me reiterate point #2 above-- we faculty at research universities have an intrinsic conflict of interest, so we should be all the more vigilant to make sure we're not sacrificing those we claim to be mentoring.

Finally, let me close with a note to current PhD-seekers and postdocs. Yes, a lot of your mentors (including me) had it easier than you do. But don't despair prematurely or give up. There are still jobs out there, both academic and non-academic. Pick the direction you want to go (academic or non-academic), find out what you need to do both to secure and to succeed in that direction, and pursue it wholeheartedly. Get advice from both formal and informal advisers-- take the initiative in asking for this advice. Don't stop pursuing your dream because NPR interviewed someone who gave up a similar dream and is doing something different. At the same time, diversify your portfolio. Build some skills that may open what you perceive as an appealing "plan B." The reason isn't that you should assume you won't get your plan A job; it's that having options is always better than not having them. And don't overly fear being actually "unemployed"-- while clearly a handful of PhDs have struggled to get "any" job, the unemployment statistics for PhD-holders are not so bad-- certainly far better than for those with only a college degree. It's almost unequivocal that you got (or will get) something for that time investment in a PhD. In the meantime, if you hear people who've held faculty positions for a decade or longer saying nothing's really different, just know that your successes will have been harder fought than theirs.


Friday, January 3, 2014

Putting College Under the MOOC Microscope

Another year draws to an end, but not before yet another "MOOCs aren't as good as college" story slips into the media (NPR, in this case). Amazing insights are present there, like that MOOCs don't provide as much personal, face-to-face interaction as one can potentially get in a college class. Wow, no one could ever have figured that out. Also, only a very small fraction of people who sign up for a class (requiring, in some cases, literally one button-click on a website) view all the lectures or complete all the assessments. Well, blow me down. And the conclusion in the article? "We have a lousy product."

Lousy??? I'm really fed up with the anti-MOOC movement, especially when it comes from within academia. Despite my snide sarcasm above, I do appreciate that much of this continued MOOC pushback is a response to the MOOC overhype that both preceded and overlapped it. What many MOOC dissenters seem to miss is that most MOOC advocates (including me) never argued that MOOCs are a "replacement" for a college education and experience. No way-- not even close. The media and a very few zealots played that line up, and they were wrong from day 1.

But let's turn the tables a bit. Let's put "in-person" college experiences under the microscope used for MOOCs. Before that, we must realize that we cannot fairly compare completion rates between a college class and a free online product that provides no credentialing. Especially for introductory-level science courses (the kind I teach, in genetics and evolution), the vast majority of college students attend classes for the credential rather than to satisfy a keen interest in the specific topics. A few months ago, I asked a room full of college students in a workshop, "How many of you look forward to 2 or more of your classes most weeks?" The answer-- one. Keep in mind that all of the students there take 4-5 classes at a time, so the vast majority do not look forward to even half of what they're signed up for. Again, they are signed up for most classes because the classes are "required," either directly or to fulfill some distribution or credit requirement. If students fail to complete an "in-person" college class, not only do they fail to fulfill the requirement and fail to get the credit, but they often have the black mark of an "F" or a "W" on their permanent record. None of that is true for MOOCs-- if you dislike a MOOC, you simply stop watching, without consequence.

How can we compare these experiences fairly, then? MOOCs are analogous to material students treat as "extra," with no consequence for failing to complete it. As a comparison, I looked up some statistics from my on-campus class last spring. Every week, I provided online resources (often podcasts or PDFs) that were truly "extra"-- the resources were available on the same webpage as the required materials for each week, and they complemented what was discussed in the lectures. There were 452 students enrolled. The very first such resource was viewed 100 times. How does this (100/452, or roughly 22%) compare to the MOOC criticism that "About half who registered for a class ever viewed a lecture"? Again, these were students already in a college class on this subject, and the material was pre-identified for them as relevant. If you look at the supplements from the end of the semester, the views are in the low single digits (potentially just reflecting the times I'd open the files to confirm they uploaded). How does that compare with the MOOC criticism that "completion rates averaged just 4%"?
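For the quantitatively inclined, the comparison is simple enough to check in a few lines. This is just the arithmetic on the numbers already quoted above; the ~50% figure is paraphrased from the MOOC criticism, not a precise statistic:

```python
# Back-of-the-envelope comparison of "extra material" uptake,
# using the numbers quoted in the paragraph above.

on_campus_enrolled = 452  # students enrolled in my on-campus class
first_extra_views = 100   # views of the first truly "extra" resource

on_campus_rate = first_extra_views / on_campus_enrolled
print(f"On-campus 'extra' uptake: {on_campus_rate:.0%}")  # ~22%

# versus the quoted MOOC criticism that "About half who registered
# for a class ever viewed a lecture" (~50%)
```

In other words, even students sitting in the class, with the material flagged as relevant, engaged with truly optional content at less than half the rate for which MOOCs are pilloried.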

I don't blame these on-campus students for the low uptake at all. They have career aspirations (in my case, mostly pre-med), and frankly, we've placed them into a situation where their grades matter more than what they care to learn about. If they spend time viewing my supplementary materials, that time is not spent studying for organic chemistry or physics. For every B or lower grade they get, their choices of medical schools become more limited, so they need to triage. And maybe they don't even really care about my topics, but they're forced to take my class by major requirements. None of this is true for MOOCs. Further, as I've argued previously, many college classes effectively focus on stratifying students (the essence of a "curve") and far too little on ensuring that all students who want to be engaged and learn are successful in doing so. MOOCs don't concern themselves with stratification at all-- it's all about engaging and learning for an interested audience. I wonder if college was once that way, centuries ago.

Back to MOOCs: let's drop the percentages and look at just the final numbers. I'll use mine as an example, but I suspect you'd get similar numbers from any of them. My MOOC ran twice. Even if we pretend that the only students who reaped any benefit were those who completed every assessment and earned a passing grade at the end, that number still comes to ~4000 students. 4000 people from around the world quantifiably learned about genetics and evolution as a result of this MOOC. Presumably there are other students who didn't complete it but found some part of the experience personally rewarding or engaging, and who now have a greater appreciation for the topic. And best of all, none, NOT ONE, of those 4000+ "had" to do it-- this was quenching a thirst for knowledge, not jumping through a pre-MCAT or biology-major hoop. I'd like to see more "lousy products" like that in the world. How many of those enrolled students would have gone to a local college instead to satisfy this particular thirst? My guess is fewer than 1%, if any. Finally, I like the thought experiment of what would happen if I told my on-campus class on day 1, "You'll all get A's no matter what" (obviating the credentialing)-- how many would still be in my classroom three months later? How many years would I have to run my on-campus class under that condition for 4000 students to have continued to month 3?

Yes, MOOCs were overhyped. They are no panacea. They don't offer face-to-face interactions with knowledgeable faculty and other able students. They don't invalidate college or provide a serious alternative to it. They don't provide "education for all"-- most of the enrolled students already have higher education, so MOOCs' contributions to equalizing opportunity are limited (if for no other reason than variable internet access). And they are misused by some reckless college administrations. But before we cast any more stones at MOOCs for what they "aren't," let's have colleges take a serious look in the mirror at what they've become, and see how badly their faces have broken out.

For me personally, MOOCs have helped expose deficits in the standard on-campus college experience. I think the overall college experience needs to be rethought in a big way. It's NOT that I think MOOCs are better or are replacing college, but they highlight college's obsessions with course requirements, with grades, with credentialing, and with hoops of various sorts. Unlike on-campus college classes, MOOCs are hoop-free and purely educational: people enroll because they think they want to learn the subject being taught, and they continue in the class if and when they stay engaged in the material and choose to commit their time to it. What a concept that would be for an on-campus introductory science classroom.