ECIS 2024: CoLRev

Three tools to help you navigate and review IS literature

LitBaskets, PermuSearch & CoLRev

Gerit Wagner and Julian Prester

Before we start




Our background


Philosophy: Reliability, efficiency, and contribution

Reliability

  • Transparency and openness of data, and open-source code
  • Validation of data and code
  • Efficient undo operations

Efficiency

  • Tool testing and innovation
  • Team expansion, involving colleagues, student assistants, and crowds
  • Data reuse, e.g., updating prior reviews, importing samples from other review papers

Contribution

  • Tools must build on a nuanced understanding of review goals and types
  • Knowledge contributions primarily depend on the ingenuity of researchers (with some help from tools)

Literature reviews with CoLRev

  • Based on Git for data versioning and collaboration
  • An open and extensible environment supporting different types of reviews
  • Covers all steps of the process
  Step                           Operations
  Problem formulation            colrev init
  Metadata retrieval             colrev search, colrev load, colrev prep, colrev dedupe
  Metadata prescreen             colrev prescreen
  PDF retrieval                  colrev pdfs
  PDF screen                     colrev screen
  Data extraction and synthesis  colrev data
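As a sketch, the operations above can be run in sequence from a terminal; package selection and parameters vary by project, so this assumes a default setup:

```
# Problem formulation: initialize a new CoLRev project (a Git repository)
colrev init

# Metadata retrieval: query sources, load results, prepare and deduplicate records
colrev search
colrev load
colrev prep
colrev dedupe

# Screening and PDF handling
colrev prescreen
colrev pdfs
colrev screen

# Data extraction and synthesis
colrev data

# Check the project state at any point
colrev status
```

Because each operation commits its changes to Git, the project history shows exactly what was changed at every step.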

The CoLRev tutorial

  • Form small groups (2-3 people) and work together
  • Go to the Codespaces, read the worksheet, and enter the commands
  • You can run colrev status at any time
  • Consult the documentation when necessary

How to get involved



Open questions

Using AI in the search to reduce time

Cannot find the right information

  • Use LitBaskets for journal scope and PermuSearch for search terms
  • Iterate and extend the search based on keywords and terms from the retrieved papers
  • Consider the guidelines of Boell and Cecez-Kecmanovic (2014, Appendix A)
  • Once you have a working strategy, report and publish it (e.g., at searchRxiv)

Finding high-quality literature outside of IS


Speaker notes (TODO)

  • Add Guy Paré (HEC Montréal); short bios for Gerit Wagner and Julian Prester; overview of publications on literature reviews, tools, teaching (PhD, bachelor, master), editorial work, ...
  • Map our journey on the left (started in Regensburg; JP to UNSW; GW to Montreal and Bamberg; JP to the University of Sydney)
  • Illustrate our experience on the right as different "building blocks" with the CoLRev project on top (e.g., 12 review papers, 4 methods papers, 87 packages, 7 teaching offers, 4x service as editor/reviewer)
  • 3 methods papers in the Senior Scholars' Basket (of 11); over 50 PhD students
  • CoLRev project: set up in 2021; 3 years under development, 26 versions, 20 contributors, but still a lot to do
  • Data changes are manual and algorithmic, with transparency through Git (you can see exactly what was changed); not the most common approach in the context of literature reviews; a paradigm change: it no longer requires "blind trust" in algorithms or student assistants
  • Varying performance, availability, and cost of new algorithms and SOTA tools (reuse: one step further than reproducibility), student papers, etc.
  • Git-based: the full collaboration model
  • First slides: what do we mean with CoLRev / what is our focus? CoLRev: literature reviews in collaborative settings, as discussed earlier when announcing the workshop (record keeping; put users in a position to report a full standalone paper at all times)
  • Extensible approach: adapting the first steps with parameters and selecting different packages for data analysis/extraction/coding/synthesis/risk of bias (RoB)