Improving access to and usability of systematic review data for health systems guidelines development

Bibliographic Details
Main Authors: Totten, Annette M.; Smith, Connor; Dunham, Kenneth; Jungbauer, Rebecca M.
Corporate Authors: United States Agency for Healthcare Research and Quality; Oregon Health & Science University, Pacific Northwest Evidence-based Practice Center
Format: eBook
Language: English
Published: Rockville, MD: Agency for Healthcare Research and Quality, U.S. Department of Health and Human Services, February 2019
Series: Methods research report
Collection: National Center for Biotechnology Information
Description
Summary: CONCLUSIONS: The results of this "proof-of-concept" prototype development demonstrate that existing tools could be used to make large systematic reviews more accessible and usable. However, an individual tool may not have the capacity to provide all desired functionalities, and each tool has differing requirements for time, data management, and staff expertise. To better understand the actual time required, the data storage needs, the implications for EPCs and learning health systems, and issues related to Section 508 accessibility standards and government data rights, we recommend that a follow-on pilot be conducted to allow systematic review teams to test these tools as integrated components of one or a small number of future reviews. This follow-on research would provide realistic data on the resources needed to generate systematic reviews in alternative formats and allow further assessment of whether these formats can increase uptake of EPC reports within learning health systems.
METHODS: To develop and test alternative formats for dissemination, we assessed stakeholder needs through qualitative interviews with a department director and four health system content experts. We reviewed interview notes, identified key themes through team discussion, and arrived at consensus. We then conducted a literature search on the core functionalities desired in evidence summaries and systematic reviews, as described by the content experts. Next, we compared recommendations from the content experts and the literature search against several existing software tools to select two tools for the pilot test. We imported data from a recent systematic review on chronic pain into the selected tools to mock up example outputs. Finally, we solicited reactions from the department director and six health system content experts (four of whom had been interviewed initially) to the mocked-up report examples in terms of accessibility and utility, and we based recommendations for next steps on these assessments and our experience.
RESULTS: The key theme that emerged from the initial interviews with content experts was the need for two core functionalities: the ability to drill down from a general overview to more specific information, and the ability to select subsets of evidence from a larger review. We identified two tools that provided these functions and met our other criteria: MAGICapp, a platform for evidence summaries, and Tableau, a data management and visualization tool.
Respondents perceived Tableau as ideal for content experts reviewing data, as its functionality allows users to query the data in multiple ways. Respondents perceived MAGICapp as the better choice for multidisciplinary groups or decision makers less familiar with the data, given the tool's organized structure and capacity for explanatory text. The two key themes from the second-round interviews and our evaluation were (1) the need for learning health system administrators to consider the level of expertise of the end users, as those with more or less familiarity with a set of data may require the granularity of MAGICapp or the freedom of Tableau, and (2) the need for EPCs to test one or both prototypes in an actual review from the beginning in order to accurately estimate the additional staff time and expertise needed to prepare, import, and manage data beyond the traditional EPC report formats.
MAGICapp required less time and skill to mock up, as the data were entered manually into the Web-based platform, while Tableau required more time and a staff member with informatics expertise, such as the ability to set up the relational databases for the dashboard. MAGICapp's parameters required the data output to follow the structure of the pain review and allowed users to drill down to granular detail; Tableau allowed users to explore evidence without adhering to the organization of the review, but could not provide the granularity found in MAGICapp. Neither of the two tools we tested was able to fulfill both core functionalities: drilling down to specific study data and reviewing subsets of evidence outside the confines of the organization of the pain review. The second round of health system content expert interviews provided positive feedback on the products, aesthetically as well as for their potential functionality.
OBJECTIVES: Evidence presented in systematic reviews informs the development of healthcare practice, guidelines, and policy. The inherent complexity and quantity of data in systematic reviews may impede understanding and use in decision processes, but little evidence exists on transforming large volumes of these data into accessible formats for end users. The objectives of this Evidence-based Practice Center (EPC) pilot project were (1) to identify the information needs of health systems guideline/protocol developers; (2) to assess existing, off-the-shelf software or Web platforms that would allow creation of interactive presentations of systematic review data in formats that address the identified needs; and (3) to test the ability of the selected software/platforms to make the large amount of data included in a recent systematic review of chronic pain management more accessible for decision makers at Oregon Health & Science University.
Physical Description: 1 PDF file (various pagings), illustrations