The scientific and academic communities are founded on the general principle of peer review. You would think, therefore, that scientists would be good at gauging the value and merit of one another's work. A new paper suggests, however, that this is far from the case.
The researchers looked at three distinct methods for judging scientific work:
- Peer review: subjective post-publication peer review where other scientists give their opinion of a published work;
- Number of citations: the number of times a paper is referenced as a recognised source of information in another publication;
- Impact factor: a measure of a journal’s importance, determined by the average number of times papers in a journal are cited by other scientific papers.
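The impact factor described in the last bullet is, at heart, a simple ratio. As a hedged illustration (the standard two-year formulation, with made-up numbers), it can be sketched like this:

```python
def impact_factor(citations_this_year: int, citable_items: int) -> float:
    """Two-year impact factor: citations received this year to papers the
    journal published in the previous two years, divided by the number of
    citable items it published in those two years."""
    return citations_this_year / citable_items

# Hypothetical journal: 600 citations this year to 200 papers
# published over the previous two years.
print(impact_factor(600, 200))  # 3.0
```

Note that this is a property of the journal as a whole, not of any individual paper in it, which is one reason it is a noisy proxy for the merit of a single publication.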
One would think that their high level of subject-matter expertise would make academics good peer reviewers of their colleagues' work. The findings, however, revealed that academics are unreliable judges of the importance of their peers' publications. For instance, reviewers rarely agreed on the importance of a paper, while the journal a paper appeared in exerted an outsized influence on their opinions.
“The three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased and expensive method by which to assess merit. While the impact factor may be the most satisfactory of the methods considered, since it is a form of prepublication review, it is likely to be a poor measure of merit, since it depends on subjective assessment,” the researchers said.
The paper concludes with a warning regarding the validity of these methods of assessing scientific output, but the research should surely be sending similar warning signals as to the fallibility of appraisal processes in the commercial world. If experts are ill placed to judge the merit of academic work, why do we think that managers are well placed to judge the merits of professional work?
In the academic world, the citation model is used to work around the inadequacies of individual appraisals: it allows a wisdom-of-crowds approach to aggregate away the biases in individual opinions. Peer appraisal software, such as that provided by Work.com, does the same in the workplace.
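The aggregation idea above can be sketched in a few lines. This is a minimal illustration, not how Work.com or any particular tool actually works: each person receives scores from several independent reviewers, and the scores are averaged so that individual biases tend to cancel out.

```python
from statistics import mean

def aggregate_scores(reviews: dict[str, list[int]]) -> dict[str, float]:
    """Average each person's scores across independent reviewers.
    With enough reviewers, individual over- or under-rating
    tends to wash out of the aggregate."""
    return {person: mean(scores) for person, scores in reviews.items()}

# Hypothetical appraisal scores (1-5) from four reviewers each.
reviews = {
    "alice": [4, 5, 3, 4],
    "bob":   [2, 3, 3, 4],
}
print(aggregate_scores(reviews))  # {'alice': 4.0, 'bob': 3.0}
```

The same logic underlies citation counts: each citation is one independent vote, and merit is inferred from the aggregate rather than from any single judge.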