Published 31 January 2025 by LINO News
How to Create Bias-Free Review Procedures for Fellowships and Research Grants
During a workshop at the 73rd Lindau Nobel Laureate Meeting, Young Scientists discussed the Lindau Guidelines, focusing on how to improve review procedures for fellowships and research grants. Moderated by Council Member Pernilla Wittung-Stafshede, the session highlighted key action items and demands, summarized below. All contributors are listed at the end.
How to Apply – Applications
- Replacing conventional full-length CVs with a narrative CV focused on an applicant’s scientific contributions, training of early-career researchers, collaborative experience, and clarity of communication may help evaluations weigh scientific ideas and abilities with less emphasis on demographic factors. An example: Résumé for Research and Innovation (R4RI) template – UKRI
- Adjust the order of evaluation so that potentially biasing information is revealed only after the reviewers have formed an opinion of the candidate’s research ideas and ability. For example, remove the affiliation from the CV, and use initials for first names so that the gender of the candidate is not revealed.
- Dividing the application into two parts, where the first part is a short (perhaps one-page) summary that is anonymized so that reviewers cannot infer the identity or affiliation of the applicant. This first part can still contain key information about the applicant’s merits in anonymized form.
- Introducing an anonymous selection process up to the interview stage. Investigation of “accountability” as well as “feasibility” could be done later, e.g., after the interview.
- Requiring succinct research proposals that omit extensive background, literature analysis, and speculation on the project’s broader impact, i.e., shorter proposals.
- An effective first screening procedure using an anonymous narrative CV and a short one- to two-page pitch of the research proposal.
- A two-step evaluation system that initially separates the assessment of scientific merit from the candidate’s personal and professional information. In the first step, reviewers assess the scientific merit of a short (1–2 page) proposal; only afterwards are the CV and personal information reviewed, and proposals scoring above a certain threshold advance to the second stage. The second step addresses the administrative, technical, and detailed elements of the proposal, including all project management details and a more detailed description (3–4 pages) of the scientific project. A minimal sketch of the first-stage gate follows this list.
- Ask for the results of the targets set in previous research proposals to keep new proposals realistic.
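As a rough illustration of the two-step gate described above, the sketch below filters on anonymized merit scores alone; the threshold, scoring scale, and data layout are assumptions made for this example, not part of the workshop’s recommendation.

```python
# Minimal sketch of the first-stage gate in a two-step evaluation.
# Threshold and the 1-5 scale are illustrative assumptions.
MERIT_THRESHOLD = 3.5

def advances_to_step_two(merit_scores: list[float]) -> bool:
    """Step one: reviewers score only the anonymized 1-2 page proposal.
    The CV and personal information are opened only for proposals that pass."""
    return sum(merit_scores) / len(merit_scores) >= MERIT_THRESHOLD

# Example: three reviewers score an anonymized short proposal.
print(advances_to_step_two([4.0, 3.5, 4.5]))  # True -> CV reviewed in step two
```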
Selecting and Training Reviewers
- Provide training sessions for reviewers on recognizing and mitigating unconscious bias. Nudge reviewers with reminders of common biases placed next to the marking grids.
- Explicitly outline the priorities and expected outcomes of the funding call to reviewers, and describe the profiles of successful target candidates.
- Avoid reviewers who have direct or indirect connections with the applicants. Recruit international reviewers to broaden perspectives and help ensure proposals are evaluated strictly on scientific merit.
- Increase the number of potential reviewers invited so that each proposal can be matched with suitable reviewers. To reduce the workload of senior experts and provide a broader perspective, include early-career expert reviewers on review panels.
- Strive for diversity (not only gender) among reviewers and review panel members.
How to Review
- Reviewers’ comments must be sufficiently precise that the applicant can address them, and the reasons for a low score should be given.
- Require review panel members to document their individual scores privately before the group discussion, to prevent a single member from dominating. Scores given by all panel members should be transparent and justified with a short explanation, supported by a well-designed scoring grid that specifies how and what to assess at each stage of the selection.
- Reviewers should be instructed not to assign undue importance to language errors in applications. Establish systematic, clear evaluation standards that give the applicant’s capacity for research, relevance, and originality precedence over language accuracy.
- Providing an opportunity for factual clarification or context from applicants could help ensure reviews are based on an accurate understanding of the proposed work.
- Create a channel of communication (anonymous or open) between applicant and reviewers: the applicant should be able to respond to criticism, and, based on that response, the reviewer should have the opportunity to revise the review.
- As it is practically impossible to distinguish fundable projects within a small window around a hard cutoff, a certain percentage of the applications below and above the cutoff point that fall within this uncertainty window could be chosen randomly (a partial lottery; a minimal sketch follows this list).
- Use RRR (Review the Reviewers’ Review), a feedback mechanism designed to enhance quality and fairness by rating the reviewers’ assessments. Applicants who participate in RRR review the evaluations that reviewers provided on other proposals. These “applicant reviewers” assess the clarity, depth, constructiveness, and absence of bias in the original reviews. They cannot rate evaluations of their own proposals and will typically review proposals outside their area of expertise or with a time delay. Ratings given by applicant reviewers are recorded and monitored by the granting agency for accountability and improvement; a standardized questionnaire could be used to collect this feedback. A toy sketch of the conflict rule follows this list.
- Reviewers should be selected, assessed, and rewarded according to the quality of their expert opinions, and their contributions to science should be properly acknowledged.
- Funding agencies should systematically collect data on misconduct regarding conflicts of interest, reviewers’ performance, etc., and set up metrics to track the effectiveness of newly implemented changes in procedure.
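A minimal sketch of the partial lottery mentioned above, assuming scores on a common scale and a fixed uncertainty window; the window width, cutoff, and data layout are illustrative assumptions, not a recommended parameterization.

```python
# Partial lottery around a funding cutoff: fund clear winners outright,
# draw randomly within the band where scores cannot be reliably ranked.
import random

def partial_lottery(scores: dict[str, float], cutoff: float,
                    window: float, n_lottery: int, seed: int = 0) -> list[str]:
    clear_winners = [a for a, s in scores.items() if s > cutoff + window]
    band = [a for a, s in scores.items() if cutoff - window <= s <= cutoff + window]
    rng = random.Random(seed)  # seeded so the draw can be audited
    return clear_winners + rng.sample(band, min(n_lottery, len(band)))

# Example: cutoff 4.0 with +/-0.2 resolution; two awards drawn from the band.
scores = {"A": 4.5, "B": 4.1, "C": 4.0, "D": 3.9, "E": 3.2}
print(partial_lottery(scores, cutoff=4.0, window=0.2, n_lottery=2))
```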
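And a toy sketch of the RRR conflict rule, assuming each applicant has exactly one proposal and that reviews are keyed by proposal; the data layout and the number of reviews rated per person are assumptions for illustration only.

```python
# Toy RRR assignment: applicant reviewers rate reviews of other proposals,
# never the evaluations of their own proposal.
import random

def assign_rrr(reviews_by_proposal: dict[str, list[str]],
               own_proposal: dict[str, str],
               per_person: int = 2, seed: int = 0) -> dict[str, list[str]]:
    rng = random.Random(seed)
    assignments = {}
    for applicant, own in own_proposal.items():
        # Exclude every review written about the applicant's own proposal.
        eligible = [r for p, rs in reviews_by_proposal.items() if p != own for r in rs]
        assignments[applicant] = rng.sample(eligible, min(per_person, len(eligible)))
    return assignments

# Example: two proposals with two reviews each; neither applicant sees their own.
reviews = {"P1": ["rev-P1-a", "rev-P1-b"], "P2": ["rev-P2-a", "rev-P2-b"]}
print(assign_rrr(reviews, {"alice": "P1", "bob": "P2"}, per_person=1))
```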
Some Relevant References
- Peer Review Bias in the Funding Process: Main Themes and Interventions (hw.ac.uk)
- What Works for Peer Review and Decision-Making in Research Funding: A Realist Synthesis (biomedcentral.com)
- An Experimental Test of the Effects of Redacting Grant Applicant Identifiers on Peer Review Outcomes (elifesciences.org)
- Blinding Reduces Institutional Prestige Bias During Initial Review of Applications for a Young Investigator Award (elifesciences.org)
- Measuring Bias, Burden, and Conservatism in Research (f1000research.com)
- The Acceptability of Using a Lottery to Allocate Research Funding: A Survey of Applicants (biomedcentral.com)
- Blinding Applicants in a First-Stage Peer-Review Process of Biomedical Research Grants: An Observational Study (oup.com)
Contributors
Shada Abuhattum, Petra Albertova, Farah Atour, Vasilis Belis, Wesley Cota, Luka Hansen, Mahmoud Saad Afify Ali Ibrahim, Samuel Jacob, Xinyu Jia, Yusuf Karli, Seung Ju Kim, Suraya Kazi, Ewelina Lipiec, Vasudev Mittal, Emma Campillo Muñoz, Mats Persson, Mirko Rossini, Alexei Sytov, Duhan Toparlak, Federico Vismarra, Shunmin Zhu