What is the ISSG Search Filter Resource?

What are search filters?

Search filters are collections of search terms designed to retrieve selections of records. Search filters may be designed to retrieve records of research using a specific study design, on a particular topic, or with some other feature of the research question.

Filters may have a very specific focus or may be high-level and broad. Search filters may be designed to maximise sensitivity (or recall) or to maximise precision (reducing the number of irrelevant records that need to be assessed for relevance).

The methods used to compile search filters should be clearly described by their authors, and users should also assess filters critically. Search filters are not quality filters: all search results still require assessment for quality. All search filters and search strategies are compromises, and you should always assess how well a filter performs for your own research.
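
As a simple illustration of the trade-off between sensitivity and precision, sensitivity (recall) is the proportion of the relevant records in a test set that a filter retrieves, while precision is the proportion of the records retrieved by the filter that are relevant. The short Python sketch below uses hypothetical numbers (not taken from any published filter evaluation) to show how these figures are calculated when a filter is validated against a gold-standard set of known relevant records.

# Hypothetical validation figures for a study design filter run against a
# gold-standard test set; the numbers are illustrative only.
relevant_in_gold_standard = 120   # known relevant records in the test set
total_retrieved_by_filter = 4000  # records the filter retrieves from the database
relevant_retrieved = 108          # gold-standard records found by the filter

sensitivity = relevant_retrieved / relevant_in_gold_standard   # 0.90
precision = relevant_retrieved / total_retrieved_by_filter     # 0.027

print(f"Sensitivity (recall): {sensitivity:.1%}")  # 90.0%
print(f"Precision: {precision:.1%}")               # 2.7%

# A filter tuned for higher precision would usually retrieve fewer records
# overall, at the cost of missing some of the 120 known relevant records.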

What is the ISSG Search Filter Resource?

This resource lists known filters of interest to researchers producing technology assessments. Filters are usually published for a specific database interface, and are often 'translated' or adapted to run on other interfaces to the same database. Where these 'translations' are known, links are provided.

To avoid potential duplication of effort and to encourage testing and use of new filters, the site also offers details of research in progress and of filters that are known to be completed but not yet published. Other information retrieval research, such as reviews of search filters, is also included.

Publicising the ISSG Search Filter Resource

The editorial team have presented information about the ISSG Search Filter Resource at conferences and workshops.

How do we keep the ISSG Search Filter Resource up to date?

To keep the resource up to date, the ISSG Search Filter Resource editorial team search for new publications on a regular basis. We acknowledge past assistance from NHS Quality Improvement Scotland and the UK Cochrane Centre.

Our search process is described below.

We are currently searching the MEDLINE ALL database (via Ovid) on a monthly basis to identify new articles about search filters. We are using this search strategy, updated from December 2023 onwards:

1 "information storage and retrieval"/ or exp pubmed/ or medline/ 

2 ((search$ or methodological or qualitative or study design$) adj2 (filter$ or hedge$)).ti,ab,kf.

3 ((sensitivity or recall or specificity or precision or predictive or performance) adj4 (search$ or strateg$ or filter$ or hedge$)).ti,ab,kf.

4 or/1-3

5 ((expert or special) adj4 (search$ or hedge$ or filter$ or quer$)).ti,ab,kf.

6 (Ovid or Elsevier or Proquest or Ebsco or Wiley or Clarivate or Medline or PubMed or Embase or CINAHL or Web of Science or Scopus).ti,ab,kf.

7 5 and 6

8 ((search build$ or clinical quer$ or search filter$) adj4 (Ovid or Elsevier or Proquest or Ebsco or Wiley or Clarivate or Medline or PubMed or Embase or CINAHL or Web of Science or Scopus)).ti,ab,kf.

9 ((inbuilt or in built) adj4 (search$ or hedge$ or filter$ or quer$)).ti,ab,kf.

10 7 or 8 or 9

11 exp Artificial Intelligence/

12 ("AI" or comput$ Intelligence or comput$ reasoning or machine Intelligence).ti,ab.

13 exp Machine Learning/

14 ((machine or transfer or algorithmic) adj2 Learning).ti,ab.

15 ("neural networks" or "natural language processing" or 'llm*$ or large language model$ ).ti,ab.

16 11 or 12 or 13 or 14 or 15

17 ((expert or special) adj4 (search$ or hedge$ or filter$ or quer$)).ti,ab,kf.

18 (search build$ or clinical quer$ or search filter$).ti,ab,kf.

19 17 or 18

20 16 and 19

21 4 or 10 or 20
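
The strategy above is run in the Ovid interface. Purely as a hedged sketch of how a similar monthly currency check could be automated, and not a description of the editorial team's actual process, the Python fragment below queries PubMed's public E-utilities API with a much simplified set of title/abstract terms (a hypothetical fragment, not a translation of the full strategy) restricted to a single example month.

import json
import urllib.parse
import urllib.request

# Hypothetical, simplified fragment of a filter-literature search in PubMed
# syntax; it is NOT the full Ovid strategy shown above.
query = '"search filter"[tiab] OR "search filters"[tiab] OR "search hedge"[tiab] OR "search hedges"[tiab]'

params = {
    "db": "pubmed",
    "term": query,
    "datetype": "edat",       # restrict by Entrez date to mimic a monthly update
    "mindate": "2024/01/01",  # example month only
    "maxdate": "2024/01/31",
    "retmax": 200,
    "retmode": "json",
}
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urllib.parse.urlencode(params))

with urllib.request.urlopen(url) as response:
    result = json.load(response)["esearchresult"]

print(result["count"], "candidate records for the month")
print(result["idlist"][:10])  # first few PMIDs to screen manually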

In addition, we search the MEDLINE ALL database (via Ovid) for the most commonly published authors in the search filters literature.

We search Embase (via Ovid) monthly using the following search strategy, updated from December 2023 onwards:

1 *Information retrieval/

2 ((search$ or methodological or qualitative or study design$) adj2 (filter$ or hedge$)).ti,ab,kf.

3 ((sensitivity or recall or specificity or precision or predictive or performance) adj4 (search$ or strateg$ or filter$ or hedge$)).ti,ab,kf.

4 or/1-3

5 ((expert or special) adj4 (search$ or hedge$ or filter$ or quer$)).ti,ab,kf.

6 (Ovid or Elsevier or Proquest or Ebsco or Wiley or Clarivate or Medline or PubMed or Embase or CINAHL or Web of Science or Scopus).ti,ab,kf.

7 5 and 6

8 ((search build$ or clinical quer$ or search filter$) adj4 (Ovid or Elsevier or Proquest or Ebsco or Wiley or Clarivate or Medline or PubMed or Embase or CINAHL or Web of Science or Scopus)).ti,ab,kf.

9 ((inbuilt or in built) adj4 (search$ or hedge$ or filter$ or quer$)).ti,ab,kf.

10 7 or 8 or 9

11 exp Artificial Intelligence/

12 ("AI" or comput$ Intelligence or comput$ reasoning or machine Intelligence).ti,ab.

13 exp *Machine Learning/

14 ((machine or transfer or algorithmic) adj2 Learning).ti,ab.

15 ("neural networks" or "natural language processing" or 'llm$1 or large language model$).ti,ab.

16 11 or 12 or 13 or 14 or 15

17 ((expert or special) adj4 (search$ or hedge$ or filter$ or quer$)).ti,ab,kf.

18 (search build$ or clinical quer$ or search filter$).ti,ab,kf.

19 17 or 18

20 16 and 19

21 4 or 10 or 20

We use the focused subject heading *Information retrieval/ in line 1 as it reduces the number of irrelevant records without substantially affecting the recall of relevant records.


Since 2013, we have searched CINAHL (via EBSCO) on a monthly basis to identify new articles about search filters. We are using this search strategy, updated from January 2024 onwards:

S1 (MH "Information Retrieval") OR (MH "Information Storage") OR (MM "Reference Databases, Health+")

S2 TX (sensitivity OR recall OR specificity OR precision OR predictive OR performance) N4 (search* OR strateg* OR filter* OR hedge*)

S3 TX (search* OR methodological OR qualitative OR "study design") N4 (filter* OR hedge*)

S4 S1 OR S2 OR S3

S5 TX (expert OR special) N4 (search* OR hedge* OR filter* OR quer*)

S6 TX (Ovid OR Elsevier OR Proquest OR Ebsco OR Wiley OR PubMed OR Medline OR Embase OR CINAHL OR "Web of Science" OR Scopus)

S7 S5 AND S6

S8 TX ("search build$" OR "clinical quer$" OR "search filter$") N4 (Ovid OR Elsevier OR Proquest OR Ebsco OR Wiley OR PubMed OR Medline OR Embase OR CINAHL OR “Web of Science” OR Scopus)

S9 TX (inbuilt OR "in built") N4 (search* OR hedge* OR filter* OR quer*)

S10 S7 OR S8 OR S9

S11 (MH "Artificial Intelligence+")

S12 TX ("AI" OR "comput$ intelligence" OR "comput$ reasoning" OR "machine intelligence")

S13 TX (machine OR transfer OR algorithmic) N2 Learning

S14 TX ("neural networks" OR "natural language processing")

S15 S11 OR S12 OR S13 OR S14

S16 TX (expert OR special) N4 (search* OR hedge* OR filter* OR quer*)

S17 TX ("search build$" OR "clinical quer$" OR "search filter$")

S18 S16 OR S17

S19 S15 AND S18

S20 S4 OR S10 OR S19

We use the focused subject heading (MM "Reference Databases, Health+") in line S1 as it reduces the number of irrelevant records without substantially affecting the recall of relevant records.
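
A note on the proximity operators used throughout: the Ovid strategies use adjN and the CINAHL strategy uses EBSCO's Nn, both of which find two terms near each other in either order (for example, sensitivity within a few words of filter$). As a loose, illustrative approximation only, and not how either interface actually evaluates proximity, the Python sketch below shows the underlying idea of 'two terms within a few words of each other, in either order'; the window size and the prefix matching (mimicking truncation) are assumptions for illustration.

import re

def near(term_a, term_b, text, max_gap=3):
    # Loose approximation of a proximity test: True if a word starting with
    # term_a and a word starting with term_b occur in either order with at
    # most max_gap intervening words. Illustrative only.
    words = re.findall(r"[a-z0-9']+", text.lower())
    positions_a = [i for i, w in enumerate(words) if w.startswith(term_a)]
    positions_b = [i for i, w in enumerate(words) if w.startswith(term_b)]
    return any(abs(i - j) - 1 <= max_gap for i in positions_a for j in positions_b)

title = "Development and validation of a sensitive search filter for adverse effects"
print(near("sensitiv", "filter", title))   # True: 'sensitive ... filter' are close together
print(near("precision", "filter", title))  # False: 'precision' does not appear in the title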

Scanning the tables of contents of journals


We currently scan the following journals:

Current awareness lists


We scan the annual current awareness bulletin 'Important publications on the topic of Information Retrieval' from the German Institute for Quality and Efficiency in Health Care (IQWiG) (English version, from the 2021 edition onwards) and the monthly alerting services from the UK's National Institute for Health and Care Excellence (NICE).

How can I assess the quality of a search filter?

The ISSG has developed a quality checklist to support the assessment of published filters designed to retrieve records of research using a specific study design. This checklist has been published as:

Glanville J, Bayliss S, Booth A, Dundar Y, Fernandes H, Fleeman ND, Foster L, Fraser C, Fry-Smith A, Golder S, Lefebvre C, Miller C, Paisley S, Payne L, Price A, Welch K. So many filters, so little time: the development of a Search Filter Appraisal Checklist. Journal of the Medical Library Association. 2008;96(4):356-61.

Feedback

We welcome your comments on this resource. Provide feedback