This was supposed to be a post on the difficulties of securing tenure. A tenure-track position would be a godsend (even for the atheists out there). I spent hours typing and ranting ferociously. After this invigorating, yet ultimately futile, exercise, I realised no one, including me, would want another reminder of how difficult life is going to be post-postdoc…or…post-post-postdoc. Instead, this post was written to encourage discussion on tackling the present tenure shortage.
Two solutions have been proposed for the academic job shortage: one, reduce PhD positions, and two, encourage non-academic careers.
Reducing the number of PhDs is, I feel, short-sighted. We need a large number of skilled researchers – the PhD students and postdocs – to actually carry out scientific work. Lab heads and principal investigators (PIs) have little time and cannot be expected to pull their research projects through all on their own. Moreover, reducing PhD positions, and thereby the number of qualified scientists, will lower our overall scientific output (unless you believe Braess’s paradox applies to academia). A related solution is to increase funding. While increased funding is desirable in and of itself, I am unsure that it alone would solve the tenure shortage. Injecting more research funds will create more tenure-track positions, but it will also increase the number of graduates, as newly created labs will hire their own trainees.
Encouraging non-academic careers is pragmatic, but such encouragement may initially sound disappointing. Generally, those who sign up for a PhD do so because they enjoy learning and intend to pursue a scientific career. Outside of academia, staying at university beyond one’s undergraduate degree does not always increase salary. Why, then, would anyone spend four to ten years training for a degree that hampers their earning potential, unless it were with the explicit intention of pursuing an academic career?
One alternative could be to create permanent research staff positions. Having permanent research jobs might alleviate our anxieties. Moreover, not everyone wants to be a principal investigator, whose unenviable responsibilities include continually writing grants, prioritising tedious administrative work, and stretching one’s body clock from 24 to 48 hours (what exactly is sleep?). So why not have permanent researchers, whose responsibilities mirror a postdoc’s but whose position is not time-bound? Comparable jobs do exist, e.g., the research associate. However, a research associateship is usually an extension of a postdoc term. These positions are based on short-term contracts and can be lost the moment grant funding runs out. A permanent researcher position would need to circumvent these problems. This might work if the position were created and funded at the university level, not by an individual lab. Each institute would then employ a pool of permanent researchers to assist multiple labs. Ideally, such positions would be recognised across institutes, so that it is easy to switch from one to another. Ease of and flexibility in job switching also matter, because the loss of one position should not lead to short-term unemployment.
Alongside creating permanent researcher positions, reducing the non-scientific workload of faculty may help. This would allow PIs to devote more time to research, reduce the number of trainees needed, and eventually improve the ratio of tenured jobs to graduates. Lowering non-scientific workloads, however, depends on someone else stepping in to do that work. This could be tackled by creating positions to manage administrative tasks. Taking this a step further, what if we created specialised positions for individual research tasks: teams for grant writing, ethics management, research design, experimentation, and publication/communication? Would such a division of labour keep up productivity while simultaneously reducing the tenure shortage?
2018 Lindau Alumna Arunima Roy with moderator Adam Smith (left) and Nobel Laureate Peter C. Doherty during the panel discussion ‘Science in a Post-Factual World’ at the 68th Lindau Nobel Laureate Meeting. © Julia Nimke/Lindau Nobel Laureate Meetings
Now, this is where I would have stopped writing and thanked you for reading while you are understandably busy pipetting with one arm and typing your next conference abstract with the other. Unfortunately for you, while writing this, I happened to watch the recent Netflix documentary on flat-earth theorists. That got me thinking about the utility of a PhD… which unfortunately means there are more depressing reminders of the tenure shortage below. So sit back while that centrifuge is whirring and enjoy.
Watching people contest the fact that the earth is a globe made me wonder whether my own beliefs go unquestioned. Am I really capable of critical thought? I guess not, but it is fair to say that my capabilities have improved since my PhD. I suspect the same is true for everyone who goes through one.
It has been argued that we vote not logically but emotionally. As humans, we will keep giving weight to our emotions when making decisions, and having a PhD will not make one devoid of emotions (though that is an ideal state to be in when your study gets scooped). That said, a doctoral degree can improve our ability to define problems, examine available information, and weigh potential outcomes. It also grants the unique superpower of scepticism. Scientists first criticise their own work. When there are no holes left to pick, they pass it on to reviewers, who happily begin the process all over again. That is perhaps all we need to make better decisions: knowing how information can be used and understanding that the decision made may not be optimal.
Perhaps, then, it is good to have PhDs outside of academia too. “Aha!”, you say, “You have come around to the idea of alternative careers.” I believe so, but I think we need a massive cultural restructuring to actually pull this off. Non-academic careers are seen as a fallback option, and there is some stigma attached to moving to other jobs. Moreover, as long as a PhD is considered unusual, it will confer minimal advantage outside academia. However, if it helps us make better decisions, graduate school should be the norm, not the exception. To reach that point, we require a considerable shift in mindset, along with acceptance of prolonged training durations.
Credential inflation is a problem: we have many overqualified graduates scrambling for too few jobs. Given this, asking for doctoral studies to become the norm would only make things worse. As a counter, I can only offer a personal anecdote: my great-grandma could never go to school, but my grandma did. My grandma, in turn, could follow political events better than her mother could. My mother, who was more fortunate still, went to university, and her understanding of the world is more nuanced, which I think helps her make better decisions. Historically, graduating from high school, let alone from university, was not necessary for employment. But the overall gains in literacy made from the normalisation of post-high-school education have had a positive impact on society. Similarly, would a PhD make better thinkers of us and produce long-term societal benefits?
Lastly, as to the title: Sisyphus is that wretched soul cursed for eternity to push a boulder up a hill, only for it to roll back down each time. The path to tenure can feel like that. Tenured jobs are disappearing, replaced by adjunct positions, if they are replaced at all. The handful of remaining permanent faculty positions may not necessarily go to those most deserving. With a professorship becoming enriched unobtainium, you could say that we are worse off than Sisyphus. Let us not be disheartened, though: our scientific skills are definitely stronger than Sisyphus’s understanding of entropy. While we may pursue occupations outside academia, our training should enable us to improve human lives and happiness in myriad ways. I do believe that we can live our dream of making a positive contribution to humanity.