BIR 2020
10th International Workshop on Bibliometric-enhanced Information Retrieval
Keynote
Dr. Georgios Tsatsaronis, VP Data Science at Elsevier
Title of the talk: Metrics and trends in assessing the scientific impact
You are invited to participate in the 10th International Workshop on Bibliometric-enhanced Information Retrieval (BIR 2020) to be held as part of the 42nd European Conference on Information Retrieval (ECIR 2020).
The workshop will run as an ONLINE workshop! If you would like to participate (via Zoom; participation is free), please send an email to Ingo Frommholz <ifrommholz@acm.org> to receive further details; please use "[BIR2020] Link" as the subject. Please find the schedule below.
Proceedings: http://ceur-ws.org/Vol-2591/
=== Important Dates ===
All dates are AoE (Anywhere on Earth)!
- Submissions: 3 February 2020 (extended)
- Notifications: 2 March 2020
- Camera Ready Contributions and Copyright Statements (see below): 30 March 2020
- Workshop: 14 April 2020 (originally planned in Lisbon, Portugal; now held online)
=== Keynote ===
Dr. Georgios Tsatsaronis, VP Data Science at Elsevier
Title of the talk: Metrics and trends in assessing the scientific impact
Abstract: The economy of science has traditionally been shaped around the design of metrics that attempt to capture several different facets of the impact of scientific works. Analytics and mining around (co-)citation and co-authorship graphs, also taking into account parameters such as time, scientific output per field, and active years, are often the fundamental pieces of information considered in the most widely adopted metrics. There are, however, many other aspects that can contribute further to the assessment of scientific impact, as well as to the evaluation of the performance of individuals and organisations, e.g., university departments and research centers. Such facets may cover, for example, the measurement of research funding raised, the impact of scientific works on patented ideas, or even the extent to which a scientific work constituted the basis for the birth of a new discipline or a new scientific (sub)area. In this talk we present an overview of the most recent trends in novel metrics for assessing scientific impact and performance, the technical challenges of integrating a plethora of heterogeneous data sources to shape the necessary views for these metrics, and the novel information extraction techniques employed to facilitate the process.
=== Program ===
The keynote is 45 min (incl. Q&A); papers are 20 min (incl. Q&A). Presenters are shown in bold.
All times are Western European Summer Time (WEST, Porto time, UTC+1).
=== Greeting Notes ===
- Andrea Scharnhorst (DANS, Netherlands) -- Video
- Dietmar Wolfram (University of Wisconsin-Milwaukee, USA) -- Video
- Suzan Verberne (Leiden University, Netherlands) -- Video
- Mike Thelwall (University of Wolverhampton, UK) -- Video
- Iana Atanassova (Université Bourgogne Franche-Comté, France) and Marc Bertin (University of Lyon, France) -- Video
- Henry Small (SciTech Strategies, USA) -- Video
- Staša Milojević (Indiana University, USA) -- Video
- Min-Yen Kan (National University of Singapore, Singapore) -- Video
- Ludo Waltman (Leiden University, The Netherlands) -- Video
- Aparna Basu (South Asian University, India) -- Video
- Wolfgang Glänzel (KU Leuven, Belgium) -- Video
- Akiko Aizawa (National Institute of Informatics, Japan) -- Video
- Muthu Kumar Chandrasekaran (Amazon, USA) -- Video
=== Overview Paper ===
Cabanac, G., Frommholz, I., & Mayr, P. (2020). Bibliometric-enhanced Information Retrieval 10th Anniversary Workshop Edition. Proceedings of ECIR 2020. https://link.springer.com/chapter/10.1007%2F978-3-030-45442-5_85
=== Accepted Papers ===
- Robin Brochier, Antoine Gourru, Adrien Guille and Julien Velcin. Scientific Expert Finding in Document Networks: an Evaluation Framework
- Michael Färber, Timo Klein and Joan Sigloch. Neural Citation Recommendation: A Reproducibility Study
- Juan Pablo Bascur, Suzan Verberne, Nees Jan Van Eck and Ludo Waltman. Browsing citation clusters for academic literature search: A simulation study with systematic reviews
- Timo Breuer, Philipp Schaer and Dirk Tunger. Relations Between Relevance Assessments, Bibliometrics and Altmetrics
- Christopher Michels, Mandy Neumann, Philipp Schaer and Ralf Schenkel. A Ranking Model for Urgent Conference Updates in Digital Libraries
- Daniel Kershaw, Benjamin Pettit, Maya Hristakeva and Kris Jack. Learning to Rank Research Articles: A case study of collaborative filtering and learning to rank in ScienceDirect
- Gineke Wiggers and Suzan Verberne. Usage and Citation Metrics for Ranking Algorithms in Legal Information Retrieval Systems
- Rodrigo Nogueira, Zhiying Jiang, Kyunghyun Cho and Jimmy Lin. Evaluating Pretrained Transformer Models for Citation Recommendation
=== tl;dr ===
The Bibliometric-enhanced Information Retrieval (BIR) workshop series at ECIR tackles issues related to academic search, at the crossroads between Information Retrieval and Bibliometrics. BIR is a hot topic investigated by both academia (e.g., ArnetMiner, CiteSeerX) and industry (e.g., Google Scholar, Microsoft Academic Search, Semantic Scholar). The one-day workshop, originally planned for ECIR 2020 in Lisbon, Portugal, will be held online.
Past BIR and BIRNDL proceedings are available online as open access: https://dblp.org/search?q=BIR.ECIR
=== Keywords ===
Academic Search • Information Retrieval • Digital Libraries • Bibliometrics • Scientometrics
=== Workshop Topics ===
We welcome submissions on (but not limited to) the three aspects of academic search below:
- Information seeking & searching with scientific information, such as:
- Finding relevant papers/authors for a literature review.
- Measuring the degree of plagiarism in a paper.
- Identifying expert reviewers for a given submission.
- Flagging predatory conferences and journals.
- Information seeking behaviour and human-computer interaction in academic search.
- Mining the scientific literature, such as:
- Information extraction, text mining and parsing of scholarly literature.
- Natural language processing (e.g., citation contexts).
- Discourse modelling and argument mining.
- Academic search/recommender systems, such as:
- Modelling the multifaceted nature of scientific information.
- Building test collections for reproducible BIR.
- System support for literature search and recommendation.
We especially invite descriptions of running projects and ongoing work, as well as contributions from industry. Papers that directly investigate multiple themes are particularly welcome.
=== Aim of the Workshop ===
Searching for scientific information is a long-lived information need. In the early 1960s, Salton (1963) was already striving to enhance information retrieval with clues inferred from bibliographic citations. The development of citation indexes pioneered by Garfield (1955) proved decisive for this research endeavour at the crossroads of the nascent fields of Information Retrieval (IR) and Bibliometrics [Bibliometrics refers to the statistical analysis of the academic literature (Pritchard, 1969) and plays a key role in scientometrics: the quantitative analysis of science and innovation (Leydesdorff & Milojević, 2015)]. The pioneers who established these fields in Information Science, such as Salton and Garfield, were followed by scientists who specialised in one of them (White & McCain, 1998), leading to the two loosely connected fields we know today.
The purpose of the BIR workshop series, founded in 2014, is to tighten the link between IR and Bibliometrics. We strive to bring together the 'retrievalists' and 'citationists' (White & McCain, 1998) active in both academia and industry who develop search engines and recommender systems such as ArnetMiner, CiteSeerX, Google Scholar, Microsoft Academic Search, and Semantic Scholar, to name a few.
These bibliometric-enhanced IR systems must deal with the multifaceted nature of scientific information by searching for or recommending academic papers, patents, venues (i.e., conferences or journals), authors, experts (e.g., peer reviewers), references (to be cited to support an argument), and datasets. The underlying models harness relevance signals from keywords provided by authors, topics extracted from the full texts, coauthorship networks, citation networks, and various classification schemes of science.
Bibliometric-enhanced IR is a hot topic whose recent developments made the news; see, for instance, the Initiative for Open Citations (Shotton, 2018) and Google Dataset Search (Castelvecchi, 2018), launched on September 4, 2018. We believe that BIR@ECIR is a much-needed scientific event for the 'retrievalists' and 'citationists' to meet and join forces to push the knowledge boundaries of IR applied to literature search and recommendation.
=== References ===
- Castelvecchi, D.: Google unveils search engine for open data [News & Comment]. Nature, (2018). doi:10.1038/d41586-018-06201-x
- Garfield, E.: Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122(3159), 108–111 (1955). doi:10.1126/science.122.3159.108
- Leydesdorff, L., Milojević, S.: Scientometrics. In: Wright, J.D. (ed.) International Encyclopedia of the Social & Behavioral Sciences, vol. 21, pp. 322–327. Elsevier, 2nd edn. (2015). doi:10.1016/b978-0-08-097086-8.85030-8
- Pritchard, A.: Statistical bibliography or bibliometrics? [Documentation notes]. Journal of Documentation, 25(4), 348–349 (1969). doi:10.1108/eb026482
- Salton, G.: Associative document retrieval techniques using bibliographic information. Journal of the ACM, 10(4), 440–457 (1963). doi:10.1145/321186.321188
- Shotton, D.: Funders should mandate open citations. Nature, 553(7687), 129 (2018). doi:10.1038/d41586-018-00104-7
- White, H.D., McCain, K.W.: Visualizing a discipline: An author co-citation analysis of Information Science, 1972–1995. Journal of the American Society for Information Science, 49(4), 327–355 (1998). doi:b57vc7
=== Submission and Camera-ready Details ===
All submissions must be written in English following the Springer LNCS author guidelines (6 to 12 pages; please see below) and should be submitted as PDF files via EasyChair. All submissions will be reviewed by at least two independent reviewers. Please note that at least one author per paper needs to register for and attend the workshop to present the work. In case of a no-show, the paper (even if accepted) will be removed from the proceedings AND from the program.
Springer LNCS: <http://www.springer.com/gp/computer-science/lncs/conference-proceedings-guidelines>
EasyChair: <https://easychair.org/conferences/?conf=bir2020>
Camera-ready Versions and Copyright Statements
Workshop proceedings will be deposited in the CEUR Workshop Proceedings publication service (ISSN 1613-0073). This way the proceedings will be permanently available and citable (digital persistent identifiers and long-term preservation).
Every author needs to sign the CEUR author agreement. We have prepared a special author agreement for BIR; please use this one. Please make sure that you:
- fill in all fields,
- print the author agreement,
- sign the printed agreement (electronic signatures are not accepted),
- scan the signed author agreement, and
- send it to Ingo Frommholz <ifrommholz@acm.org> by 30 March 2020.
By using the above form, you confirm that you did not use any copyrighted third-party material. If you did include copyrighted third-party material in your paper, you need to use an alternative copyright form, and you must attach a copy of the third party's permission to use the material to the signed author agreement.
You also need to add a copyright clause to your paper:
"Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0)."
A LaTeX example (.zip file) with the copyright statement can be found here.
Further information and background can be found on CEUR's How to Submit page.
=== Program Committee ===
Muhammad Kamran Abbasi, University of Sindh, Pakistan
Iana Atanassova, CRIT, Université de Franche-Comté, France
Sumit Bhatia, IBM Research, India
Joeran Beel, Trinity College Dublin, Ireland
Patrice Bellot, Aix-Marseille Université - CNRS (LSIS), France
Marc Bertin, Université Lyon 1, France
Jose Borbinha, IST / INESC-ID, Portugal
Peter Bruza, Queensland University of Technology, Australia
Cornelia Caragea, Kansas State University, USA
Zeljko Carevic, GESIS - Leibniz Institute for the Social Sciences, Germany
Muthu Kumar Chandrasekaran, Amazon, USA
Edward Fox, Virginia Polytechnic Institute and State University, USA
Norbert Fuhr, University of Duisburg-Essen, Germany
C. Lee Giles, The Pennsylvania State University, USA
Bela Gipp, Bergische University Wuppertal, Germany
Gilles Hubert, University of Toulouse, France
Kokil Jaidka, University of Pennsylvania, USA
Roman Kern, Know-Center GmbH, Germany
Petr Knoth, The Open University, UK
Marijn Koolen, Huygens Institute for the History of the Netherlands, Netherlands
Rob Koopman, OCLC, The Netherlands
Cyril Labbé, Grenoble University, France
Vincent Larivière, EBSI-UdeM, Canada
Haiming Liu, University of Bedfordshire, UK
Stasa Milojevic, Indiana University Bloomington, USA
Peter Mutschke, GESIS – Leibniz Institute for the Social Sciences, Germany
Rajesh Piryani, South Asian University, India
Horacio Saggion, Universitat Pompeu Fabra, Spain
Philipp Schaer, TH Cologne, Germany
Andrea Scharnhorst, DANS-KNAW, The Netherlands
Ralf Schenkel, University of Trier, Germany
Vivek Kumar Singh, Banaras Hindu University, India
Henry Small, SciTech Strategies, USA
Cassidy Sugimoto, Indiana University Bloomington, USA
Lynda Tamine, University of Toulouse, France
Ludovic Tanguy, University of Toulouse, France
Ulrich Thiel, Fraunhofer IPA-PAMB, Germany
Dietmar Wolfram, University of Wisconsin-Milwaukee, USA
Haozhen Zhao, Navigant, USA
=== Program Chairs ===
Guillaume Cabanac, University of Toulouse, France
Ingo Frommholz, University of Bedfordshire, UK
Philipp Mayr, GESIS - Leibniz Institute for the Social Sciences, Germany