EHC Component: EPC Project


Abstract – Final – Feb. 23, 2012

Abstrackr: Software for Semi-Automatic Citation Screening

Topic Abstract

Systematic reviews are the only practical way to provide comprehensive, minimally biased syntheses of the medical evidence base. While such summaries of the literature are vital components of modern evidence-based medicine (EBM), they are becoming increasingly onerous to produce because the volume of published biomedical literature is growing exponentially.

To streamline the citation screening step of systematic reviews, we are developing abstrackr: a free, open-source, web-based application that facilitates citation screening. The program comprises two components: a web-based annotation tool that allows participants in a review to collaboratively screen citations for relevance, and machine learning technologies that semi-automate this process. We describe these in turn.

The web-based annotation tool, currently in beta, allows project leads to import the citations to be screened for a review from either RefMan or PubMed. Participants can then join the project and begin screening; the tool maintains a digital paper trail of all screening decisions. We are adding functionality to facilitate the logistics of data management, including options to single- or double-screen citations and a reconciliation mode for reviewing citations whose relevance two reviewers disagreed on. Project leads can also monitor the progress of each participant and of the screening project overall.
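
As a rough illustration of the bookkeeping described above, the following Python sketch shows one possible data model for the screening paper trail and a simple check for double-screened citations whose reviewers disagreed. The class and function names are hypothetical and are not taken from the abstrackr code base.

    from dataclasses import dataclass, field
    from collections import defaultdict

    # Hypothetical data model illustrating double-screening and reconciliation;
    # names are illustrative, not from the actual abstrackr implementation.

    @dataclass
    class ScreeningDecision:
        citation_id: str      # e.g., a PubMed ID
        reviewer: str
        include: bool         # True = relevant, False = excluded

    @dataclass
    class ScreeningProject:
        decisions: list = field(default_factory=list)

        def record(self, citation_id, reviewer, include):
            """Append a decision, preserving the full paper trail."""
            self.decisions.append(ScreeningDecision(citation_id, reviewer, include))

        def conflicts(self):
            """Return citation IDs where double-screening reviewers disagreed."""
            votes = defaultdict(set)
            for d in self.decisions:
                votes[d.citation_id].add(d.include)
            return [cid for cid, labels in votes.items() if len(labels) > 1]

    # Example: two reviewers double-screen the same citation and disagree,
    # so it would be routed to the reconciliation step.
    project = ScreeningProject()
    project.record("PMID:12345", "reviewer_a", include=True)
    project.record("PMID:12345", "reviewer_b", include=False)
    print(project.conflicts())  # ['PMID:12345']

In a real system the decisions would be stored in a database rather than in memory, but the reconciliation logic follows the same idea.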

In addition, we have been working on machine learning technologies to automatically screen citations, much as email clients distinguish legitimate messages from spam. The methods are still in development, but we envision reviewers screening roughly half of the citations imported for a given review and then letting the software automatically exclude a (hopefully large) portion of the remaining citations; the reviewers would then only need to screen the articles that the software classifies as relevant. Eventually, this functionality will be integrated into the abstrackr tool.
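
To make the spam-filter analogy concrete, the sketch below trains a simple text classifier (TF-IDF features with logistic regression via scikit-learn) on citations that reviewers have already labeled and scores the unscreened remainder. The data, model choice, and exclusion rule are illustrative assumptions only; they are not the method used in abstrackr.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Toy example of semi-automatic screening: train on citations the reviewers
    # have already labeled, then score the unscreened remainder.

    screened_abstracts = [
        "randomized trial of statin therapy for cholesterol",
        "case report of a rare dermatologic condition",
        "meta-analysis of statins and cardiovascular outcomes",
        "survey of veterinary clinic staffing",
    ]
    labels = [1, 0, 1, 0]  # 1 = relevant to the review, 0 = excluded

    unscreened_abstracts = [
        "cohort study of statin use and heart disease risk",
        "opinion piece on hospital parking fees",
    ]

    vectorizer = TfidfVectorizer()
    X_train = vectorizer.fit_transform(screened_abstracts)
    X_new = vectorizer.transform(unscreened_abstracts)

    clf = LogisticRegression()
    clf.fit(X_train, labels)

    # Citations scored as unlikely to be relevant could be excluded
    # automatically; the rest go back to the human reviewers.
    for abstract, prob in zip(unscreened_abstracts, clf.predict_proba(X_new)[:, 1]):
        print(f"{prob:.2f}  {abstract}")

In practice, any exclusion threshold would presumably be set conservatively so that very few truly relevant citations are discarded without human review.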

The project website is at http://abstrackr.cebm.brown.edu. Please contact Tom Trikalinos (thomas_trikalinos@brown.edu) with any questions. This project is currently supported by funding from AHRQ, grant number R01HS018494.
