Improving Access to and Usability of Systematic Review Data for Health Systems Guidelines Development

Research Report Feb 28, 2019


This item is available in PDF only (Full Report [5 MB]). People using assistive technology may not be able to fully access information in these files. For additional assistance, please contact us.

This report is from AHRQ's series on Health Systems Partnership Pilot Project Reports. These reports describe the efforts of Evidence-based Practice Centers (EPCs) to work with health care decisionmakers and facilitate the use of information from AHRQ EPC evidence reports.

Purpose of Project

Identify and test interactive methods to make the large amount of data in an Evidence-based Practice Center (EPC) systematic review more accessible for decision makers at Oregon Health & Science University.

Key Messages

  • We identified two functionalities that existing software could address:
    • Ability to drill down from summaries to increasing levels of detail.
    • Ability to slice and dice the data into subgroups corresponding to specific interests.
  • To use such tools, the EPCs will need staff with informatics skills, and the Agency for Healthcare Research and Quality will need to verify Web strategies for public access.
  • Alternate formats may improve utility of EPC report data to guideline committees in learning health systems.
  • The recommended pilot extension could confirm the usefulness of the prototypes and the resources they would require.

Structured Abstract

Objectives. Evidence presented in systematic reviews informs the development of healthcare practice, guidelines, and policy. The inherent complexity and quantity of data in systematic reviews may impede their understanding and use in decision processes, but little evidence exists on transforming large volumes of these data into accessible formats for end users. The objectives of this Evidence-based Practice Center (EPC) pilot project were (1) to identify the information needs of health systems guideline/protocol developers; (2) to assess existing, off-the-shelf software or Web platforms that would allow creation of interactive presentations of systematic review data in formats that would address the identified needs; and (3) to test the ability of the selected software/platforms to make the large amount of data included in a recent systematic review of chronic pain management more accessible for decision makers at Oregon Health & Science University.

Methods. To develop and test alternative formats for dissemination, we assessed stakeholder needs through qualitative interviews with a department director and four health system content experts. We reviewed the interview notes, identified key themes through team discussion, and arrived at consensus. We then conducted a literature search on the core functionalities desired in evidence summaries and systematic reviews, as described by the content experts. Next, we compared the recommendations from the content experts and the literature search with several existing software tools in order to select two tools for the pilot test. We imported data from a recent systematic review on chronic pain into the selected tools to mock up example outputs. Finally, we solicited reactions from the department director and six health system content experts (four of whom were interviewed initially) on the accessibility and utility of the mocked-up report examples, and we based recommendations for next steps on these assessments and our experience.

Results. The key theme that emerged from the initial interviews with content experts was the need for two core functionalities: the ability to drill down from a general overview to more specific information and the ability to select subsets of evidence from a larger review. We identified two tools that provided these functions and that met our other criteria: MAGICapp, a platform for evidence summaries, and Tableau, a data management and visualization tool. MAGICapp required less time and skill to mock up, as the data were entered manually into the Web-based platform, while Tableau required more time and a staff member with informatics skills, such as the ability to set up the relational databases underlying the dashboard. MAGICapp's parameters required the data output to follow the structure of the pain review and allowed users to drill down to granular detail; Tableau allowed users to explore evidence without adhering to the organization of the review but could not provide the granularity found in MAGICapp. Neither of the two tools we tested fulfilled both core functionalities: drilling down to specific study data and reviewing subsets of evidence outside the confines of the organization of the pain review. The second round of health system content expert interviews provided positive feedback on the products, both aesthetically and for their potential functionality. Respondents perceived Tableau as ideal for content experts reviewing data, as its functionality allows users to query the data in multiple ways. Respondents perceived MAGICapp as the better choice for multidisciplinary groups or decision makers less familiar with the data, given the tool's organized structure and capacity for explanatory text.
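
The report does not include code, but the contrast between the two tools ultimately hinges on how the review data are structured. As a minimal, hypothetical sketch (the column names and values below are illustrative only and are not taken from the pain review or this report), the following Python example shows how study-level findings organized as a single flat table could support both core functionalities: aggregating rows for a drill-down overview and filtering rows for subgroup selection.

    # Hypothetical sketch: systematic-review findings as one flat table.
    # Column names and values are illustrative, not from the pain review.
    import pandas as pd

    evidence = pd.DataFrame([
        {"key_question": "KQ1", "intervention": "Exercise therapy",
         "comparator": "Usual care", "outcome": "Pain intensity",
         "timepoint": "Short term", "n_studies": 12,
         "effect_estimate": -0.5, "strength_of_evidence": "Moderate"},
        {"key_question": "KQ1", "intervention": "Exercise therapy",
         "comparator": "Usual care", "outcome": "Function",
         "timepoint": "Long term", "n_studies": 8,
         "effect_estimate": -0.3, "strength_of_evidence": "Low"},
    ])

    # Drill down: start from a high-level count of studies per comparison,
    # then expand individual rows for granular detail.
    overview = evidence.groupby(["key_question", "intervention"])["n_studies"].sum()
    print(overview)

    # Slice and dice: pull only the subset matching a guideline panel's
    # specific question.
    subset = evidence[(evidence["outcome"] == "Function")
                      & (evidence["timepoint"] == "Long term")]
    print(subset[["intervention", "comparator", "effect_estimate",
                  "strength_of_evidence"]])

In a dashboard tool such as Tableau, a comparable flat or relational structure would drive interactive filters rather than code, but the data preparation it implies is the kind of informatics work noted above.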

The two key themes from the second-round interviews and our evaluation were (1) the need for learning health system administrators to consider the level of expertise of the end users, as those more or less familiar with a set of data may require the granularity of MAGICapp or the freedom of Tableau, and (2) the need for EPCs to test one or both prototypes in an actual review from the beginning in order to accurately estimate what additional staff time and expertise are needed to prepare, import, and manage data beyond the traditional EPC report formats.

Conclusions. The results of this "proof-of-concept" prototype development demonstrate that existing tools could be used to make large systematic reviews more accessible and usable. However, an individual tool may not have the capacity to provide all desired functionalities, and each tool has differing requirements for time, data management, and staff expertise. To better understand the actual time required, the data storage needs, implications for EPCs and learning health systems, and issues related to Section 508 accessibility standards and government data rights, we recommend a follow-on pilot be conducted to allow systematic review teams to test these tools as integrated components of one or a small number of future reviews. This follow-on research would provide realistic data on the resources needed to generate systematic reviews in alternative formats and allow further assessment of whether these formats can increase uptake of EPC reports within learning health systems.

Journal Citation

Smith CJ, Jungbauer RM, Totten AM. Visual evidence increasing usability of systematic reviews in health systems guidelines development. Appl Clin Inform. 2019 Aug;10(4):743-50. doi: 10.1055/s-0039-1697595. PMID: 31578047.

Citation

Suggested citation: Totten AM, Smith C, Dunham K, Jungbauer RM, Graham E. Improving Access to and Usability of Systematic Review Data for Health Systems Guidelines Development. Methods Research Report. (Prepared by the Pacific Northwest Evidence-based Practice Center under Contract No. 290-2015-00009-I.) AHRQ Publication No. 19-EHC010-EF. Rockville, MD: Agency for Healthcare Research and Quality. February 2019. Posted final reports are located on the Effective Health Care Program search page.
DOI: https://doi.org/10.23970/AHRQEPCMETHENGAGEIMPROVING. 


Internet Citation: Research Report: Improving Access to and Usability of Systematic Review Data for Health Systems Guidelines Development. Content last reviewed March 2020. Effective Health Care Program, Agency for Healthcare Research and Quality, Rockville, MD.
https://effectivehealthcare.ahrq.gov/products/systematic-review-data/methods-report
