Nihar B. Shah

Carnegie Mellon University (CMU)


Subjectivity and Bias in Peer Review


Statistics Seminar


24th July 2023, 2:00 pm – 3:00 pm
Fry Building, G.13


Peer review is the backbone of scientific research: it is used to evaluate millions of papers and to allocate billions of dollars in grants annually. We will discuss two challenges in peer review.

1) Subjectivity: We will discuss a common form of reviewer subjectivity, called "commensuration bias", in which different reviewers place differing emphases on the various criteria for judging papers, leading to arbitrariness in the review process. We will present an algorithm to address this problem; a simplified sketch of the underlying idea appears after this list. From a theoretical standpoint, we show that it is the only choice satisfying three natural "axioms". From a practical standpoint, the algorithm has been used in the reviewing of tens of thousands of papers.

2) Bias: Many peer-review venues are debating policies of hiding author identities from reviewers, since revealing these identities could bias reviews. We will first describe a noteworthy controlled experiment by Tomkins, Heavlin and Zhang at the WSDM conference to test for such biases in peer review. Using this as a case study, we will illustrate how various aspects of the peer-review process (e.g., non-random reviewer assignments) can break standard experimental procedures. We will then present an experimental design and analysis procedure with strong theoretical guarantees that hold even under the vagaries of real-world peer review; a toy version of the kind of test at issue appears after this list.
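
As a concrete, deliberately simplified illustration of the commensuration-bias setting in item 1, here is a minimal Python sketch: it learns one shared mapping from criteria scores to overall scores by minimizing L1 (absolute) loss across all reviews, and then re-scores every paper with that mapping. The restriction to linear, nonnegative weights and all the data below are our own illustrative assumptions; the actual algorithm presented in the talk, and its axiomatic characterization, are more general and are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def fit_shared_mapping(X, y):
    """Fit nonnegative weights w minimizing sum_i |y_i - X_i . w| (L1 loss).

    X: (n_reviews, n_criteria) criteria scores given by reviewers.
    y: (n_reviews,) overall scores given by the same reviewers.

    L1 regression is cast as a linear program with slack variables e_i:
        minimize sum(e)  subject to  -e_i <= y_i - X_i . w <= e_i,
        with w >= 0 and e >= 0 (linprog's default bounds).
    """
    n, d = X.shape
    # Objective: zero cost on the weights w, unit cost on each slack e_i.
    c = np.concatenate([np.zeros(d), np.ones(n)])
    # Constraints:  X.w - e <= y   and   -X.w - e <= -y
    A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub)
    return res.x[:d]

# Hypothetical data: 6 reviews, 3 criteria (say novelty, rigor, clarity).
X = np.array([[4, 3, 5], [2, 5, 3], [5, 4, 4],
              [3, 2, 2], [4, 5, 5], [1, 3, 4]], dtype=float)
y = np.array([4, 3, 5, 2, 5, 2], dtype=float)

w = fit_shared_mapping(X, y)
print("shared criteria weights:", np.round(w, 2))
print("recomputed overall scores:", np.round(X @ w, 2))
```

The design choice that matters here is the L1 loss: it is this choice (rather than, say, least squares) that the axiomatic characterization in the talk singles out.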
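Similarly, for item 2, here is a toy two-sample permutation test of the kind one might naively run on data from a single-blind vs. double-blind experiment: it asks whether the gap between single-blind and double-blind scores depends on an author attribute. Everything below (the data, the attribute, the test itself) is a hypothetical illustration; as the talk discusses, naive tests like this one can break under non-random reviewer assignment, which is precisely what the presented design and analysis procedure guards against.

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(delta, attr, n_perm=10_000):
    """Two-sample permutation test for reviewer bias.

    delta: per-paper score gap (single-blind score minus double-blind score).
    attr:  binary author attribute (e.g., 1 = authors from a famous institution).

    Null hypothesis: delta is independent of attr, i.e., seeing author
    identities shifts scores the same way regardless of the attribute.
    """
    observed = delta[attr == 1].mean() - delta[attr == 0].mean()
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(attr)  # break any delta-attr association
        stat = delta[perm == 1].mean() - delta[perm == 0].mean()
        count += abs(stat) >= abs(observed)
    return observed, count / n_perm

# Hypothetical data: 8 papers, each scored once single-blind and once
# double-blind; attr marks (say) papers with famous authors.
single_blind = np.array([4.0, 3.5, 5.0, 2.0, 4.5, 3.0, 4.0, 2.5])
double_blind = np.array([3.5, 3.5, 4.0, 2.5, 4.0, 3.0, 3.5, 3.0])
attr = np.array([1, 0, 1, 0, 1, 0, 1, 0])

obs, pval = permutation_test(single_blind - double_blind, attr)
print(f"observed gap difference: {obs:.2f}, permutation p-value: {pval:.3f}")
```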

Time permitting, we will also briefly discuss other challenges, experiments, and computational tools in peer review, based on this survey article: https://www.cs.cmu.edu/~nihars/preprints/SurveyPeerReview.pdf

You are very welcome to join us for a reception in the main staff room following this seminar.





Biography:

Nihar B. Shah is an Associate Professor in the Machine Learning and Computer Science departments at Carnegie Mellon University (CMU). His research broadly lies in the fields of statistics, machine learning, information theory, and game theory. The recent focus of his research is on addressing systemic issues in peer review, spanning algorithm design, theoretical guarantees, experiments, and deployments. His algorithms for peer review have been used in the review of tens of thousands of submissions, and his experiments have informed evidence-based policy design in many peer-review venues. He is a recipient of a JP Morgan Faculty Research Award, a Google Research Scholar Award, an NSF CAREER Award, the David J. Sakrison Memorial Prize from UC Berkeley EECS for a "truly outstanding and innovative PhD thesis", and several paper awards.

Organiser: Juliette Unwin
