What's new in this update
Search strategy peer review, within the evidence synthesis context, is the process by which the search strategies for a Health Technology Assessment (HTA) or systematic review are designed, ideally by an Information Specialist, and then reviewed by another Information Specialist before the searches are run. The goal of peer review of search strategies is to detect errors in a timely fashion (that is, before the searches are run), to improve quality, and to reduce not only the risk of missing relevant studies but also the risk of retrieving unnecessarily large numbers of irrelevant records.
As the search strategy is the cornerstone of a well-conducted HTA or systematic review, its quality can affect the results of the final review. A 2006 study by Sampson and McGowan found that errors in search strategies were common, the principal mistakes being spelling errors, missed spelling variants, truncation errors, logical (Boolean) operator errors, use of wrong line numbers, missed or incorrect use of subject heading index terms (e.g. MeSH), and search strategies not being tailored for use in other databases (1).
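As a hypothetical illustration (invented for this summary, not drawn from the study itself), the following Ovid MEDLINE-style fragment contains several of these error types, with a corrected version beneath:

```
Draft strategy (with errors):
  1  exp Myocardial Infarction/
  2  myocardail infarction.ti,ab.    <- spelling error
  3  heart attack.ti,ab.             <- missed truncation: excludes "heart attacks"
  4  1 and 2 and 3                   <- operator error: synonyms combined with AND
  5  thrombolysis.ti,ab.
  6  4 and 4                         <- wrong line number: should be 4 and 5

Corrected:
  1  exp Myocardial Infarction/
  2  myocardial infarction.ti,ab.
  3  heart attack*.ti,ab.
  4  1 or 2 or 3
  5  thrombolys*.ti,ab.
  6  4 and 5
```

A peer reviewer working through the strategy line by line would be expected to catch each of these before the search was run, and before the strategy was translated for other databases.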
A study by Franco et al (2), published in 2018, assessed a random sample of 70 Cochrane systematic reviews of interventions published in 2015, evaluating the design and reporting of their search strategies against the recommendations of the then current Cochrane Handbook for Systematic Reviews of Interventions (2011 version) (3), the then current Methodological Expectations of Cochrane Intervention Reviews (MECIR standards, 2013 version) (4) and the then current Peer Review of Electronic Search Strategies (PRESS) evidence-based guideline (5, 6). They found problems in the design of the search strategies in 73% of the reviews (95% CI, 60-84%), and 53% of the reviews (95% CI, 38-69%) contained problems that could limit both the sensitivity and the precision of the search strategies.
A study by Salvador-Olivan et al (7), published in 2019, found that 92.7% of their 137 included systematic reviews, published in January 2018, contained some type of error in the MEDLINE/PubMed search strategy, and that 78.1% of these errors affected recall/sensitivity.
A study by Masterson et al (33), published in 2022, used the 2015 PRESS Checklist (10) to assess the search strategies in systematic reviews of dentistry published in 2015 by authors with a Brazilian affiliation. They found that the search strategies of 45 of the 57 systematic reviews assessed required revision, whilst revision was suggested for the remaining 12 reviews.
A preprint by Price et al (34), originally posted in 2018 and updated in 2022, assessed the quality of the search strategies in 65 Cochrane systematic reviews and systematic review protocols published between 1998 and 15 September 2016, of which only 55 contained at least one documented search strategy. For each of the six main elements of the PRESS Checklist, up to 50% of the search strategies failed to meet the PRESS guidance.
A study by Ramirez et al (35), published in 2022, assessed the quality of the search strategies published by Campbell's Education Coordinating Group in 14 Campbell systematic review protocols and 13 matching Campbell reviews, the protocols having been published between October 2014 and January 2019. They assessed the search strategies against the Methodological Expectations of Campbell Collaboration Intervention Reviews (MECCIR) (36, 37), the conduct and reporting standards for all Campbell reviews (based on the Cochrane MECIR standards (38)), and against the Peer Review of Electronic Search Strategies (PRESS) checklist (10). With respect to PRESS, they found an improvement between protocol and review searches: seven searches in the protocols were identified as requiring revisions, compared with just three in the full reviews. Comparing the protocols and reviews that reported librarian involvement with those that did not, there was a significant difference (p = .006, one-tailed t-test) in the percentage of reports assessed as requiring no revisions across the PRESS criteria.
A study by Gorring et al (39), published in 2022, assessed various aspects of the COVID-19 Health Education England (HEE) search bank, which shares peer reviewed search strategies and results on the Knowledge for Healthcare website. Structured interviews were conducted with the peer reviewers (n = 10), who worked in pairs to peer review submitted search strategies. Those interviewed felt that peer review benefited from a ‘buddy’ approach among expert searchers and that agreement about the feedback provided to contributors was needed. They felt that peer review could be challenging and would benefit from a more formal structure than they had adopted, but that it was professionally rewarding. Librarians using the search bank were asked whether they would be interested in becoming a peer reviewer and what training they would require. Of the 20 respondents (54.1%) who would be interested, the main development needs were knowing what to look for in a search and how to provide feedback. Respondents stated that they would welcome working with another individual when peer reviewing and that feedback templates would help the process. The peer reviewers also identified providing effective feedback as one of their training and development needs for fulfilling this role.
How is peer review of search strategies performed?
Peer review of search strategies has been performed informally since searching for studies for HTA and systematic reviews began. HTA Information Specialists who are part of information teams have always been able to check colleagues’ search strategies for mistakes. The search strategy peer reviewer and the Information Specialist who designed the search strategy have been able to meet face-to-face (or more recently virtually) to discuss errors and revisions. Not all Information Specialists, however, are based in teams and so may be unable to call on colleagues to peer review their search strategies. A forum has been established to enable Information Specialists to submit their searches for peer review by a fellow Information Specialist, on a reciprocal basis (http://pressforum.pbworks.com).
In addition to the forum mentioned above, a tool has been developed that enables Information Specialists to check search strategies in a more formal, structured way. The PRESS (Peer Review of Electronic Search Strategies) Evidence-Based Checklist summarizes the main potential mistakes made in search strategies. The checklist can help Information Specialists to improve and assure the quality of their own search strategies and those of their colleagues, and it provides clear guidance for peer reviewers to follow. It can also help non-searchers understand how search strategies have been constructed and what they have been designed to retrieve. Full details about the original PRESS project can be found in the original funder’s report (8) and the accompanying journal article (5). Further information, including the original PRESS Evidence-Based Checklist (now superseded by PRESS 2015 (9, 10)), can be found elsewhere (6). An update of the PRESS processes was published in 2016 (9); this involved an updated systematic review, a web-based survey of experts and a consensus meeting to update the PRESS tools. The 2015 Guideline Explanation and Elaboration (PRESS E&E) incorporates four components:
six PRESS 2015 recommendations for librarian practice
four PRESS 2015 implementation strategies
an updated PRESS 2015 Evidence-Based Checklist
an updated PRESS 2015 assessment form.
The six main domains of the updated PRESS 2015 Evidence-Based Checklist are:
translation of the research question
Boolean and proximity operators
subject headings (controlled vocabulary)
text word searching (free text)
spelling, syntax and line numbers
limits and filters
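As a hypothetical illustration of how a reviewer might apply one of these domains (the topic and syntax below are invented for the example), consider a draft line that combines spelling variants with the wrong Boolean operator:

```
Draft line:          diabet*.ti,ab. AND glycemic.ti,ab. AND glycaemic.ti,ab.
Domain flagged:      Boolean and proximity operators
Problem:             the synonyms "glycemic" and "glycaemic" are combined with
                     AND, so only records containing both spellings are retrieved
Suggested revision:  diabet*.ti,ab. AND (glycemic or glycaemic).ti,ab.
                     (or an optional-character wildcard such as glyc?emic.ti,ab.,
                     in interfaces that support one)
```

Similar line-by-line checks under the other domains (for example, verifying that the relevant subject headings have been included, or that a limit has not been applied inadvertently) make up a full PRESS review.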
A recent article by Terzi and Kitis (40), promoting peer review of search strategies in general and use of the PRESS Checklist in particular, included a translation of Table 2, 'Guideline Recommendations for librarian practice', from the PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement (10), to encourage wider use of the PRESS guidelines in Turkey.
It is recommended that the peer review of search strategies be undertaken at the research protocol phase. As noted above, peer review of the search strategy should be performed before the searches are conducted, the results are downloaded and researchers start the selection of studies. The latest version of the Cochrane Handbook chapter on searching for and selecting studies recommends peer review of search strategies at the protocol stage (11), whilst the PRISMA 2020 and PRISMA-S checklists both include an item explicitly for peer review of search strategies (12, 13, 14). PRISMA 2020 stipulates that, if the search strategy was peer reviewed, the peer review process used should be reported and the tool used, such as the Peer Review of Electronic Search Strategies (PRESS) Checklist (10), should be specified.
The example for the Search Strategy section in PRISMA 2020 states “The strategy was developed by an information specialist and the final strategies were peer reviewed by an experienced information specialist within our team.” PRISMA-S stipulates that authors should consider using the Peer Review of Electronic Search Strategies (PRESS) Guideline statement, a practice guideline for literature search peer review outlining the major components important to review and the benefits of peer reviewing searches (10). Authors should strongly consider having the search strategy peer reviewed by an experienced searcher, information specialist, or librarian. Furthermore, it suggests that the use of peer review should be described in the Methods section of the article/report. The example for the Peer Review section in PRISMA-S states “The strategies were peer reviewed by another senior information specialist prior to execution using the PRESS Checklist (10).”
In addition to the above, the Cochrane Handbook (11) suggests acknowledging search strategy peer reviewers: with their permission, the names, credentials and institutions of the peer reviewers of the search strategies should be noted in the Acknowledgements section of the review.
Is there any evidence of the value of the peer review of search strategies?
The Agency for Healthcare Research and Quality (AHRQ) has conducted a study assessing use of the PRESS checklist and found that it “seems to cut down the time needed to do the review, increase response, and do a better job of identifying actual errors in search strategies” (15). The time burden of the review process using the PRESS checklist was less than two hours.
A more recent study by Neilson (16) was able to identify, among the thousands of knowledge synthesis studies published each year, only 415 knowledge synthesis articles in Scopus over the period 2009 to 2018 that reported peer reviewing their searches. These 415 studies, however, indicated an overall upward trend in the number of protocols and completed reviews that reported incorporating peer review of their searches. Peer reviewers using the PRESS Evidence-Based Checklist were acknowledged by name in 124 articles (29%), and 111 of these 124 articles (89.5%) also listed an information professional amongst the authors, or otherwise indicated that an information professional was involved in the search strategy development.
We have not been able to identify any research evidence on whether this tool affects the final quality of systematic reviews or their economic cost. CADTH, however, conducted an internal investigation into whether peer review of search strategies had an effect on the number and quality of articles included in CADTH Rapid Response reports (17, 18, 19) and found that both the number and the quality of relevant articles retrieved were improved. We have also found increased reporting both of the peer reviewing of search strategies and of the use of the PRESS checklist (although without accompanying evidence of effectiveness). Folb and colleagues evaluated workshops they were running for librarians on systematic reviews and found that, pre-class, only 9% of librarians had ever provided peer review of search strategies, but at 6-month post-class follow-up this had risen to 17% (20). With respect to seeking peer review of their own searches, they found that, pre-class, only 36% of librarians had ever sought peer review of search strategies, but at 6-month post-class follow-up this had risen to 48% (20).
It is worth noting that there is increasing interest, at least within the librarian and information specialist community, in librarian and information specialist involvement in peer reviewing search strategies at the manuscript submission for publication stage. For example, a recent online survey of medical librarians and information specialists conducted by Grossetta Nardini and colleagues found that only 22% (63/291) of respondents had ever been invited to peer review a systematic review or meta-analysis journal manuscript (21). The recent launch of the Librarian Peer Reviewer Database (https://sites.google.com/view/mlprdatabase/home/about-this-site), which serves to connect librarians who have expertise in searching for evidence syntheses with journal editors who need peer reviewers with expertise in this area, should go some way towards remedying this situation. In April 2020, four of the major international library associations (the Canadian Health Libraries Association (CHLA/ABSC), the European Association for Health Information and Libraries (EAHIL), Health Libraries Australia (HLA-ALIA) and the US Medical Library Association (MLA)) submitted a joint letter to the International Committee of Medical Journal Editors (ICMJE) urging journal editors to actively seek Information Specialists as peer reviewers for knowledge synthesis publications and to advocate for the recognition of their methodological expertise. This letter has also been published in the journals of the respective library associations (22-25). Strictly speaking, peer review of searches at the pre-publication stage is beyond the scope of this summary, which focusses on peer review of search strategies prior to the searches being run, but it does indicate increasing awareness of this related topic.
Which organizations / guidance documents advocate peer review of searches?
The Agency for Healthcare Research and Quality (AHRQ) in the U.S. (26)
The Centre for Reviews and Dissemination in the U.K. (27)
Cochrane (11, 41)
The European Network for Health Technology Assessment (EUnetHTA) (28)
The Institute for Quality and Efficiency in Health Care (IQWiG) in Germany (29)
The Institute of Medicine in the U.S. (30)
The National Institute for Health and Care Excellence (NICE) in the U.K. (31)
The Preferred Reporting Items for Systematic reviews and Meta-Analyses - Extension for Searches (PRISMA-S Extension) (14)
The PRISMA 2020 statement and explanation and elaboration documents (12,13)
The above have been discussed in more detail elsewhere (32).