Peer reviewing search strategies

Authors

Steven Duffy
Carol Lefebvre

Last updated: 31 March 2023

What's new in this update

Introduction

 

Search strategy peer review, within the evidence synthesis context, is a process by which the search strategies for a Health Technology Assessment (HTA) or systematic review, ideally designed by an Information Specialist, are reviewed by another Information Specialist, prior to the searches being run. The goal of peer review of search strategies is to detect errors in a timely fashion (that is, before the searches are run), to improve quality, and to reduce not only the risk of missing relevant studies but also the risk of identifying unnecessarily large numbers of irrelevant records.

 

As the search strategy is the cornerstone of a well-conducted HTA or systematic review, its quality could affect the results of the final review. A study by Sampson and McGowan, published in 2006, found that errors in search strategies were common: the principal mistakes were spelling errors, missed spelling variants, truncation errors, logical operator errors, use of wrong line numbers, missed or incorrect use of medical subject heading index terms (e.g. MeSH), and search strategies not being tailored for use in other databases (1).
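To make these error types more concrete, the short sketch below is a purely hypothetical illustration (it is not part of PRESS or of any published tool): it shows how a few of the mistakes listed above, such as references to non-existent line numbers, unbalanced parentheses and potentially unintended use of NOT, might be flagged in an invented Ovid-style strategy, whilst errors such as misspellings and missed spelling variants still require human judgement.

```python
# Hypothetical illustration only - not part of PRESS or any published tool.
# It flags a few of the mechanical error types described by Sampson and McGowan
# (wrong line numbers, unbalanced parentheses, possibly unintended NOT operators)
# in a made-up Ovid-style search strategy; spelling errors and missed spelling
# variants still need a human peer reviewer.

import re

# Deliberately flawed example strategy, used purely for illustration.
STRATEGY = [
    "exp Diabetes Mellitus/",
    "diabet$.ti,ab.",
    "randomized controlled trial.pt.",
    "1 or 2",
    "3 and 5",                # error: refers to line 5, which does not exist yet
    "(placebo.ti,ab. not 4",  # error: unbalanced parenthesis; NOT may be unintended
]


def check_strategy(lines):
    """Return (line_number, message) warnings for a few common slips."""
    warnings = []
    for i, line in enumerate(lines, start=1):
        # Line-number references must point to lines defined earlier in the strategy.
        for ref in re.findall(r"\b\d+\b", line):
            if int(ref) >= i:
                warnings.append((i, f"refers to line {ref}, which is not defined above"))
        # NOT often excludes relevant records unintentionally - worth a second look.
        if re.search(r"\bnot\b", line, flags=re.IGNORECASE):
            warnings.append((i, "uses NOT - check it does not exclude relevant records"))
        # Unbalanced parentheses usually indicate a syntax error.
        if line.count("(") != line.count(")"):
            warnings.append((i, "unbalanced parentheses"))
    return warnings


if __name__ == "__main__":
    for line_no, message in check_strategy(STRATEGY):
        print(f"line {line_no}: {message}")
```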


A study by Franco et al (2), published in 2018, assessed a random sample of 70 Cochrane systematic reviews of interventions published in 2015, evaluating the design and reporting of their search strategies against the recommendations of the then current Cochrane Handbook for Systematic Reviews of Interventions (2011 version) (3), the then current Methodological Expectations of Cochrane Intervention Reviews (MECIR standards - 2013 version) (4) and the then current Peer Review of Electronic Search Strategies (PRESS) evidence-based guideline (5, 6). They found problems in the design of the search strategies in 73% of the reviews (95% CI, 60-84%), and 53% of these (95% CI, 38-69%) contained problems that could limit both the sensitivity and precision of the search strategies.


A study by Salvador-Olivan et al (7), published in 2019, found that 92.7% of their 137 included systematic reviews, published in January 2018, contained some type of error in the MEDLINE/PubMed search strategy, and that 78.1% of these errors affected recall / sensitivity.


A study by Masterson and Martinez-Silveira (33), published in 2022, used the 2015 PRESS Checklist (10) to assess the search strategies of systematic reviews in dentistry published in 2015 by authors with a Brazilian affiliation. They found that the search strategies for 45 of the 57 systematic reviews assessed required revision, whilst revisions were suggested for the remaining 12 reviews.


A preprint by Lyon et al (34), originally posted in 2018 and updated in 2022, assessed the quality of the search strategies in 65 Cochrane systematic reviews and systematic review protocols published between 1998 and 15 September 2016, of which only 55 contained at least one documented search strategy. For each of the six main elements of the PRESS Checklist, up to 50% of the search strategies failed to meet the PRESS guidance.


A study by Ramirez et al (35), published in 2022, assessed the quality of the search strategies published by Campbell's Education Coordinating Group in 14 Campbell systematic review protocols and 13 matching Campbell reviews, where the protocols were published between October 2014 and January 2019. They assessed the search strategies against the Methodological Expectations of Campbell Collaboration Intervention Reviews (MECCIR) (36, 37) (the conduct and reporting standards for all Campbell reviews, which are based on the Cochrane MECIR standards (38)) and against the Peer Review of Electronic Search Strategies (PRESS) checklist (10). With respect to PRESS, they found an improvement between protocol and review searches: seven searches requiring revisions were identified in protocols, compared with just three in the full reviews. Comparing the protocols and reviews that reported librarian involvement with those that did not, there was a significant difference (p = .006, one-tailed t-test) in the percentage of reports assessed as requiring no revisions across the PRESS criteria.


A study by Gorring et al (39), published in 2022, assessed various aspects of the COVID-19 Health Education England (HEE) search bank, which shares peer reviewed search strategies and results on the Knowledge for Healthcare website. Structured interviews were conducted with the peer reviewers (n = 10), who worked in pairs to peer review submitted search strategies. Those interviewed felt that peer review benefited from a ‘buddy’ approach among expert searchers and that agreement was needed about the feedback provided to contributors. They felt that peer review could be challenging and would benefit from a more formal structure than they had adopted, but that it was professionally rewarding. Librarians using the search bank were asked whether they would be interested in becoming a peer reviewer and what training they would require. Of the 20 respondents (54.1%) who would be interested, the main development needs were knowing what to look for in a search and how to provide feedback. Respondents stated that they would welcome working with another individual when peer reviewing and that feedback templates would help the process. The peer reviewers also identified providing effective feedback as one of their training and development needs for fulfilling this role.

 

How is peer review of search strategies performed?

 

Peer review of search strategies has been performed informally since searching for studies for HTA and systematic reviews began. HTA Information Specialists who are part of information teams have always been able to check colleagues’ search strategies for mistakes. The search strategy peer reviewer and the Information Specialist who designed the search strategy have been able to meet face-to-face (or more recently virtually) to discuss errors and revisions. Not all Information Specialists, however, are based in teams and so may be unable to call on colleagues to peer review their search strategies. A forum has been established to enable Information Specialists to submit their searches for peer review by a fellow Information Specialist, on a reciprocal basis (http://pressforum.pbworks.com).


In addition to the forum mentioned above, a tool has been developed which enables Information Specialists to check search strategies in a more formal, structured way. The PRESS (Peer Review of Electronic Search Strategies) Evidence-Based Checklist summarizes the main potential mistakes made in search strategies. This checklist can help Information Specialists to improve and assure the quality of their own search strategies and those of their colleagues, and it provides clear guidance for peer reviewers to follow. It can also help non-searchers to understand how search strategies have been constructed and what they have been designed to retrieve. Full details about the original PRESS project can be found in the original funder’s report (8) and the accompanying journal article (5). Further information, including the original PRESS Evidence-Based Checklist (now superseded by PRESS 2015 (9, 10)), can be found elsewhere (6). An update of the PRESS processes was published in 2016 (9). This involved an updated systematic review, a web-based survey of experts and a consensus meeting to update the PRESS tools. The resulting 2015 Guideline Explanation and Elaboration (PRESS E&E) incorporates four components, which are described in detail in the CADTH report (9).

 


The six main domains of the updated PRESS 2015 Evidence-Based Checklist are: translation of the research question; Boolean and proximity operators; subject headings; text word searching (free text); spelling, syntax and line numbers; and limits and filters.
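As a purely illustrative aid (an assumption for illustration, not an official PRESS artefact), the sketch below shows how a peer reviewer's feedback against these six domains might be captured as a simple structured record; the domain names follow the PRESS 2015 checklist, whereas the class, field names and summary wording are invented for the example.

```python
# Hypothetical sketch only: recording PRESS-style peer review feedback as data.
# The six domain names follow the PRESS 2015 Evidence-Based Checklist; the class,
# field names and summary wording are invented for illustration.

from dataclasses import dataclass, field
from typing import Dict

PRESS_DOMAINS = (
    "Translation of the research question",
    "Boolean and proximity operators",
    "Subject headings",
    "Text word searching (free text)",
    "Spelling, syntax and line numbers",
    "Limits and filters",
)


@dataclass
class PressFeedback:
    reviewer: str
    strategy_id: str
    # One free-text comment per domain; an empty string means no issues were noted.
    comments: Dict[str, str] = field(
        default_factory=lambda: {domain: "" for domain in PRESS_DOMAINS}
    )

    def summary(self) -> str:
        """Summarize the feedback (illustrative wording, not official PRESS categories)."""
        flagged = [domain for domain, comment in self.comments.items() if comment]
        if not flagged:
            return "No revisions suggested"
        return "Revisions suggested for: " + "; ".join(flagged)


if __name__ == "__main__":
    feedback = PressFeedback(reviewer="Information Specialist B",
                             strategy_id="MEDLINE (Ovid) strategy, draft 1")
    feedback.comments["Spelling, syntax and line numbers"] = "Line 12 refers to line 14."
    print(feedback.summary())
```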

 

 

A recent article by Terzi and Kitis (40), promoting peer review of search strategies in general and use of the PRESS Checklist in particular, included a Turkish translation of Table 2, 'Guideline Recommendations for librarian practice', from the PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement (10), to encourage wider use of the PRESS guidelines in Turkey.


It is recommended that the peer review of search strategies be undertaken at the research protocol stage. As noted above, peer review of the search strategy should be performed before the searches are conducted, the results are downloaded and researchers start the selection of studies. The latest version of the Cochrane Handbook chapter on searching for and selecting studies recommends peer review of search strategies at the protocol stage (11), whilst the PRISMA 2020 and PRISMA-S checklists both include an item explicitly for peer review of search strategies (12, 13, 14). PRISMA 2020 stipulates that, if the search strategy was peer reviewed, the peer review process used should be reported and the tool used, such as the Peer Review of Electronic Search Strategies (PRESS) Checklist (10), should be specified.


The example for the Search Strategy section in PRISMA 2020 states “The strategy was developed by an information specialist and the final strategies were peer reviewed by an experienced information specialist within our team.” PRISMA-S stipulates that authors should consider using the Peer Review of Electronic Search Strategies (PRESS) Guideline statement, a practice guideline for literature search peer review outlining the major components important to review and the benefits of peer reviewing searches (10). Authors should strongly consider having the search strategy peer reviewed by an experienced searcher, information specialist, or librarian. Furthermore, it suggests that the use of peer review should be described in the Methods section of the article / report. The example for the Peer Review section in PRISMA-S states “The strategies were peer reviewed by another senior information specialist prior to execution using the PRESS Checklist (10).”

 

In addition to the above, the Cochrane Handbook (11) recommends acknowledging search strategy peer reviewers: for example, the names, credentials and institutions of the peer reviewers of the search strategies should be noted (with their permission) in the Acknowledgements section of the review.

 

Is there any evidence of the value of the peer review of search strategies?

 

The Agency for Healthcare Research and Quality (AHRQ) conducted a study assessing use of the PRESS checklist and found that it “seems to cut down the time needed to do the review, increase response, and do a better job of identifying actual errors in search strategies” (15). The time burden of the review process using the PRESS checklist was less than two hours.

 

A more recent study by Neilson (16) identified only 415 knowledge synthesis articles in Scopus, published between 2009 and 2018, that reported peer review of their searches, out of the thousands of knowledge synthesis studies published each year. These 415 studies, however, indicated an overall upward trend in the number of protocols and completed reviews that reported incorporating peer review of their searches. Peer reviewers, using the PRESS Evidence-Based Checklist, were acknowledged by name in 124 articles (29%), and 111 of these 124 articles (89.5%) also listed an information professional amongst the authors, or otherwise indicated that an information professional was involved in the search strategy development.

 

We have not been able to identify any research evidence on whether this tool affects the final quality of systematic reviews or their economic cost. CADTH, however, conducted an internal investigation into whether peer review of search strategies had an effect on the number and quality of articles included in CADTH Rapid Response reports (17, 18, 19) and found that both the number and the quality of relevant articles retrieved were improved. We have also found increased reporting of both peer reviewing of search strategies and use of the PRESS checklist (although without evidence of their effectiveness being reported). Folb and colleagues evaluated workshops on systematic reviews that they were running for librarians and found that, pre-class, only 9% of librarians had ever provided peer review of search strategies, rising to 17% at 6-month post-class follow-up (20). With respect to seeking peer review of their own searches, they found that, pre-class, only 36% of librarians had ever sought peer review of search strategies, rising to 48% at 6-month post-class follow-up (20).

 

It is worth noting that there is increasing interest, at least within the librarian and information specialist community, in involving librarians and information specialists in peer reviewing search strategies at the stage when a manuscript is submitted for publication. For example, a recent online survey of medical librarians and information specialists conducted by Grossetta Nardini and colleagues found that only 22% (63/291) of respondents had ever been invited to peer review a systematic review or meta-analysis journal manuscript (21). The recent launch of the Librarian Peer Reviewer Database (https://sites.google.com/view/mlprdatabase/home/about-this-site), which connects librarians with expertise in searching for evidence syntheses with journal editors who need peer reviewers with expertise in this area, should go some way towards remedying this situation. In April 2020, four of the major international library associations (the Canadian Health Libraries Association (CHLA/ABSC), the European Association for Health Information and Libraries (EAHIL), Health Libraries Australia (HLA-ALIA) and the US Medical Library Association (MLA)) submitted a joint letter to the International Committee of Medical Journal Editors (ICMJE) urging journal editors to actively seek Information Specialists as peer reviewers for knowledge synthesis publications and to advocate for the recognition of their methodological expertise. This letter has also been published in the journals of the respective library associations (22-25). Strictly speaking, peer review of searches at the pre-publication stage is beyond the scope of this summary, which focusses on peer review of search strategies prior to the searches being run, but it does indicate increasing awareness of this related topic.

 

 

Which organizations / guidance documents advocate peer review of searches?

 


Peer review of search strategies is advocated or recommended in guidance from a number of organizations, including the Agency for Healthcare Research and Quality (AHRQ) (26), the Centre for Reviews and Dissemination (CRD) (27), EUnetHTA (28), the Institute for Quality and Efficiency in Health Care (IQWiG) (29), the Institute of Medicine (30), the National Institute for Health and Care Excellence (NICE) (31) and the Cochrane Diagnostic Test Accuracy Review Unit (41). These have been discussed in more detail elsewhere (32).

Reference list

(1) Sampson M, McGowan J. Errors in search strategies were identified by type and frequency. J Clin Epidemiol 2006;59(10):1057-63.  [Publication appraisal] 

(2) Franco JVA, Garrote VL, Escobar Liquitay CM, Vietto V. Identification of problems in search strategies in Cochrane Reviews. Res Synth Methods 2018;9(3):408-16.  [Publication appraisal] 

(3) Lefebvre C, Manheimer E, Glanville J. Chapter 6: Searching for studies. In: Higgins JPT, Green S (editors). Cochrane Handbook for Systematic Reviews of Interventions. Version 5.1.0 [updated March 2011]. The Cochrane Collaboration, 2011. Available from https://training.cochrane.org/cochrane-handbook-systematic-reviews-interventions#previous-versions. [Publication appraisal] 

(4) Chandler J, Churchill R, Higgins J, Lasserson T, Tovey D. Methodological Expectations of Cochrane Intervention Reviews (MECIR). Standards for the conduct and reporting of new Cochrane Intervention Reviews 2013. Version Dec 2013. Available from: https://community.cochrane.org/mecir-manual/key-points-and-introduction/versions-and-changes-mecir

(5) Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol 2009;62(9):944-52.  [Publication appraisal]

(6) McGowan J, Sampson M, Lefebvre C. An evidence based checklist for the Peer Review of Electronic Search Strategies (PRESS EBC). Evidence Based Library and Information Practice 2010;5(1):149-54. [Publication appraisal]

(7) Salvador-Olivan JA, Marco-Cuenca G, Arquero-Aviles R. Errors in search strategies used in systematic reviews and their effects on information retrieval. J Med Libr Assoc 2019;107(2):210-221.  [Publication appraisal] 

(8) Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw J. PRESS: Peer Review of Electronic Search Strategies. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2008. [Publication appraisal] 

(9) McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS – Peer Review of Electronic Search Strategies: 2015 Guideline Explanation and Elaboration (PRESS E&E). Ottawa: CADTH; 2016 Jan. [Publication appraisal]  

(10) McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement. J Clin Epidemiol 2016;75:40-6.  [Publication appraisal]

(11) Lefebvre C, Glanville J, Briscoe S, Featherstone R, Littlewood A, Marshall C, et al. Chapter 4: Searching for and selecting studies. 4.4.8 Peer review of search strategies. In: Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA (editors). Cochrane Handbook for Systematic Reviews of Interventions version 6.3 (updated February 2022). Cochrane, 2022. [Publication appraisal] 

(12) Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 2021;372:n71. [Publication appraisal]

(13) Page MJ, Moher D, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ 2021;372:n160. [Publication appraisal]

(14) Rethlefsen ML, Kirtley S, Waffenschmidt S, Ayala AP, Moher D, Page MJ, et al. PRISMA-S: an extension to the PRISMA Statement for Reporting Literature Searches in Systematic Reviews. Syst Rev 2021;10(1):39. [Publication appraisal]

(15) Relevo R, Paynter R. Peer Review of Search Strategies. Methods Research Report. (Prepared by the Oregon Evidence-based Practice Center under Contract No. 290-2007-100572.) AHRQ Publication No. 12-EHC068-EF. Rockville, MD: Agency for Healthcare Research and Quality. June 2012. [Publication appraisal] 

(16) Neilson CJ. Adoption of peer review of literature search strategies in knowledge synthesis from 2009 to 2018: An overview. Health Info Libr J 2021;38(3):160-71. [Publication appraisal]

(17) Spry C, Mierzwinski-Urban M, Rabb D. Peer review of literature search strategies: does it make a difference? Presented at Canadian Health Libraries Association (CHLA) Conference; 22-25 May 2013; Saskatoon, Saskatchewan: Canada.  [Publication appraisal] 

(18) Spry C, Mierzwinski-Urban M, Rabb D. Peer review of literature search strategies: does it make a difference? Presented at the 21st Cochrane Colloquium; 19-23 Sep 2013; Quebec: Canada.  [Publication appraisal] 

(19) Spry C, Mierzwinski-Urban M. The impact of the peer review of literature search strategies in support of rapid review reports. Res Synth Methods 2018;9(4):521-526. [Publication appraisal]

(20) Folb BL, Klem ML, Youk AO, Dahm JJ, He M, Ketchum AM, et al. Continuing education for systematic reviews: a prospective longitudinal assessment of a workshop for librarians. J Med Libr Assoc 2020;108(1):36-46. [Publication appraisal] 

(21) Grossetta Nardini HK, Batten J, Funaro MC, Garcia-Milian R, Nyhan K, Spak JM, et al. Librarians as methodological peer reviewers for systematic reviews: results of an online survey. Res Integr Peer Rev 2019;4:23.  [Publication appraisal]

(22) Iverson S, Della Seta M, Lefebvre C, Ritchie M, Traditi L, Baliozian K. International health library associations urge the International Committee of Medical Journal Editors (ICMJE) to seek information specialists as peer reviewers for knowledge synthesis publications. JEAHIL 2020;16(2):58-61. [Publication appraisal]

(23) Iverson S, Della Seta M, Lefebvre C, Ritchie M, Traditi L, Baliozian K. International health library associations urge the International Committee of Medical Journal Editors (ICMJE) to seek information specialists as peer reviewers for knowledge synthesis publications. JCHLA / JABSC 2020;41(2):77-80. [Publication appraisal]

(24) Iverson S, Seta MD, Lefebvre C, Ritchie A, Traditi L, Baliozian K. International health library associations urge the ICMJE to seek information specialists as peer reviewers for knowledge synthesis publications. J Med Libr Assoc 2021;109(3):503-504.  [Publication appraisal]

(25) Iverson S, Della Seta M, Lefebvre C, Ritchie M, Traditi L, Baliozian K. International health library associations urge the International Committee of Medical Journal Editors (ICMJE) to seek information specialists as peer reviewers for knowledge synthesis publications. JoHILA 2020;1(3):9-12. [Publication appraisal]

(26) Agency for Healthcare Research and Quality (AHRQ). Methods guide for effectiveness and comparative effectiveness reviews. AHRQ publication No. 10(14)-EHC063-EF. Rockville (MD): Agency for Healthcare Research and Quality; 2014. [Publication appraisal] 

(27) Centre for Reviews and Dissemination. Systematic reviews: CRD's guidance for undertaking reviews in health care. York: University of York; 2009.  [Publication appraisal] 


(28) EUnetHTA JA3WP6B2-2 Authoring Team. Process of information retrieval for systematic reviews and health technology assessments on clinical effectiveness. Methodological guidelines. Diemen, The Netherlands: EUnetHTA; 2019. [Publication appraisal] 


(29) Institute for Quality and Efficiency in Health Care (IQWiG). IQWiG general methods: Version 6.1. Cologne: Institute for Quality and Efficiency in Health Care; 2022. [Publication appraisal] 


(30) Institute of Medicine. Finding what works in health care: Standards for systematic reviews. Washington (DC): The National Academies Press; 2011. [Publication appraisal]


(31) National Institute for Health and Care Excellence (NICE). Chapter 5: Identifying the evidence: literature searching and evidence submission. In: Developing NICE guidelines: The manual. Process and methods [PMG20]. London: NICE; 2022. [Publication appraisal] 


(32) Lefebvre C, Duffy S. Peer review of searches for studies for health technology assessments, systematic reviews, and other evidence syntheses. Int J Technol Assess Health Care 2021;37(1):e64. [Publication appraisal] 


(33) Masterson D, Martinez-Silveira MS. Aplicação do Peer Review of Electronic Search Strategies para avaliação da qualidade das estratégias de busca das revisões sistemáticas [Application of Peer Review of Electronic Search Strategies (PRESS) to assess the quality of systematic reviews search strategies]. Em Questão 2022;28(3):117865. [Publication appraisal]


(34) Lyon J, Price C, Saragossi J, Tran C. Evaluating the consistency and quality of search strategies and methodology in Cochrane Urology Group systematic reviews. OSF Preprints 2022. [Publication appraisal]


(35)  Ramirez D, Foster MJ, Kogut A, Xiao D. Adherence to systematic review standards: impact of librarian involvement in Campbell Collaboration's education reviews. Journal of Academic Librarianship 2022;48(5):102567. [Publication appraisal]


(36) The Methods Group of the Campbell Collaboration. Methodological expectations of Campbell Collaboration intervention reviews: Conduct standards. Campbell Policies and Guidelines Series No. 3. Campbell Collaboration, 2019. [Publication appraisal]


(37) The Methods Group of the Campbell Collaboration. Methodological expectations of Campbell Collaboration intervention reviews: Reporting standards. Campbell Policies and Guidelines Series No. 4. Campbell Collaboration, 2019. [Publication appraisal]


(38) Higgins JPT, Lasserson T, Chandler J, Tovey D, Thomas J, Flemyng E, et al. Methodological Expectations of Cochrane Intervention Reviews (MECIR). Standards for the conduct and reporting of new Cochrane Intervention Reviews, reporting of protocols and the planning, conduct and reporting of updates. Version February 2022. [Publication appraisal]


(39) Gorring H, Divall P, Gardner S, Gray A, McLaren A, Snell L, et al. NHS librarians collaborate to develop a search bank peer reviewing and sharing COVID-19 searches - an evaluation. Health Info Libr J 2022;09:09. [Publication appraisal]


(40) Terzi H, Kitis Y. Kapsamlı Literatür Taramasının Değerlendirilmesinde Yeni Bir Çerçeve: PRESS 2015 Rehberi [A new framework for the assessment of comprehensive literature search: the PRESS 2015 Guideline]. Turkish Journal of Family Medicine and Primary Care (TJFMPC) 2022;16(2):231-44. [Publication appraisal]


(41) Cochrane Diagnostic Test Accuracy Review Unit. The Cochrane DTA editorial process. Oxford: The Cochrane Collaboration; 2020. [Publication appraisal] 

How to cite this chapter:

Duffy S, Lefebvre C.  Peer reviewing search strategies. Last updated 31 March 2023. In: SuRe Info: Summarized Research in Information Retrieval for HTA. Available from: https://sites.google.com/york.ac.uk/sureinfo/home/peer-reviewing-search-strategies 

Copyright: the authors