Keywords
Rapid review, systematic review, methodology, evidence synthesis, Priority setting partnership, PPI
Systematic reviews summarise existing research and use structured and ‘explicit methods to identify, select and critically appraise relevant studies, and collect and analyse data from these studies’1. Systematic reviews therefore provide a robust synthesis of existing research on a given topic and are typically seen as the cornerstone of research evidence for informing health policy and practice decision-making. However, systematic reviews require significant resources (time, expertise, funding) to conduct and can take up to two years to complete2. While full systematic reviews can be, and have been, completed rapidly, rapid reviews have emerged more commonly as a form of evidence synthesis in which certain steps of the systematic review process are omitted or simplified to produce information more efficiently, within limited resources, to inform healthcare decisions2,3.
Evidence suggests that rapid reviews are increasingly being commissioned, and their results used, to inform decision-making by policymakers and funders as a more resource-efficient alternative to conventional systematic reviews4,5. The coronavirus disease 2019 (COVID-19) pandemic has further increased the demand for rapid reviews due to the need to provide time-sensitive information to decision-makers6.
Although growing demand has led to a steady rise in rapid review publications since 20103,7, albeit ones that are often poorly reported8, there are challenges in identifying rapid reviews commissioned specifically by organisations, as these are often not published in peer-reviewed journals3. This may, in part, be because commissioned rapid reviews are considered more relevant to narrow questions than to broader questions of wider relevance, which may require the focus of a conventional systematic review5,9.
Despite an evident rise in the commissioning and conduct of rapid reviews, there is an absence of high-quality evidence underpinning some decisions about how they are planned, done and their findings shared. There is also limited evidence on the relative impact that the simplifications or omissions commonly used in rapid reviews, for example, restricting searches to published material or by date/language, or conducting single-reviewer screening or data extraction10,11, have on the validity and application of findings3. Furthermore, there is no universally accepted definition of a rapid review3,6,12,13, and debate exists regarding the use of the word ‘rapid’, with several different synonyms previously identified within the literature7,10. These debates have been further complicated in the era of the COVID-19 pandemic, where it has been highlighted that rapid reviews cannot be categorised simply by the time taken to complete a review, given that well-resourced conventional systematic reviews have been conducted to a high standard, rapidly.
Overall, rapid reviews are ‘poorly understood, ill-defined diverse methodologies supported by limited published evidence’3, with a need for further research to establish a robust methodological evidence base14. Without such an evidence base, the validity, appropriateness and usefulness of rapid reviews are undermined2.
Research prioritisation plays a key role in minimising research waste by ensuring that research resources are targeted towards the questions with the greatest potential benefit15. It involves identifying, prioritising and obtaining consensus on the research needs and questions that are most important to all relevant stakeholders for a given topic16. The James Lind Alliance (JLA) is a non-profit organisation that brings multiple stakeholders, including patients, carers and clinicians, together in an equal, transparent and evidence-based Priority Setting Partnership (PSP) to determine the most important evidence uncertainties or unanswered research questions. Although commonly focused on knowledge gaps surrounding the effects of treatments, the approach has been applied more broadly to areas such as diagnosis and prevention and, more recently, to identify methodological uncertainties in recruitment and retention within clinical trials17,18.
This Priority III PSP aims to identify the top 10 unanswered research questions about how we plan, do and share rapid reviews, as identified and prioritised by contributors drawn from key stakeholder groups, including patients, the public, reviewers, researchers, clinicians, policymakers and funders. Evidence Synthesis Ireland is conducting the Priority III PSP in collaboration with the JLA. Evidence Synthesis Ireland is an all-Ireland initiative funded by the Health Research Board and the Health and Social Care, Research and Development Division of the Public Health Agency in Northern Ireland. It aims to make evidence syntheses better designed, conducted, reported, and more usable within health care policy and clinical practice decision-making by patients, the public, health care institutions and policymakers, clinicians and researchers.
Ethical approval was granted by the National University of Ireland Galway Research Ethics Committee (reference: 20-Apr-02).
The study employs a PSP based on the methods of the JLA and will be reported following the REporting guideline for PRIority SEtting of health research (REPRISE) criteria16.
Following the JLA Guidebook for PSPs19, this PSP will take place in seven stages and build on modified JLA guidance17,18 and previous PSPs in recruitment and retention within clinical trials17,18.
The seven stages of the project are: establishing the Steering Group; identifying and inviting potential partners; gathering uncertainties; data processing and verifying uncertainties; interim priority setting; the final prioritisation workshop; and disseminating findings.
Establishing the Steering Group (6 months). The Priority III PSP will be led and managed by an international Steering Group who will coordinate, oversee and guide the PSP activities. The primary roles of the Steering Group will be to discuss and agree on the scope and remit of the project, enable access and reach to key stakeholder groups, contribute intellectually towards the study methods and interpretation, and ensure that all perspectives are captured and meaningfully included throughout.
The Steering Group will also make decisions about the processes used as the project progresses. Membership of the Steering Group will include individuals and representatives from organisations, including patients and the public, reviewers, researchers, clinicians, policymakers and funders. The Steering Group will comprise up to 25 members across stakeholder groups. Potential members will be approached to participate via email. Patient and public participants will be paid for their time spent on Steering Group activities. Other participants (researchers/reviewers, clinicians, policymakers, funders) will not be paid.
Identifying and inviting potential partners (7 months). Steering Group members will identify and engage additional appropriate partners through a process of peer knowledge and consultation, using the Steering Group members’ respective networks. Organisations and individuals who can reach and advocate for key stakeholder groups will be invited to participate in the PSP as partners. We have not set a minimum or maximum number of partners we would like to recruit. Partners will be organisations or groups representative of diverse stakeholder perspectives (patients and the public, reviewers, researchers, clinicians, policymakers and funders) who will commit to supporting, participating in and promoting the PSP among their stakeholder groups. As far as possible, the partners involved will seek to represent the interests of all stakeholders. All partners will be asked to confirm that they agree to support the PSP.
Gathering uncertainties (1 month: October 2020). The Rapid Review Methodology PSP will gather uncertainties from patients and the public, reviewers, researchers, clinicians, policymakers and funders. This will be undertaken by conducting an initial survey with all relevant stakeholders. The survey will identify unanswered questions or uncertainties about how we plan, do and share rapid reviews. The survey will be designed and piloted by the Steering Group members. The survey will be created using QuestionPro20 software and hosted on the Evidence Synthesis Ireland website. Participants will be asked to give their explicit consent to take part in the survey using yes/no questions. The survey will contain four open-ended questions. Three of the questions will focus on different stages of the rapid review process, and participants will be asked to answer as few or as many as they wish. The fourth question asks for any additional questions or comments on the rapid review process. The four questions are:
1. What questions or comments do you have about improving the process needed to plan a rapid review successfully?
2. What questions or comments do you have about improving how rapid reviews are carried out?
3. What questions or comments do you have about how the findings of rapid reviews are communicated to people?
4. Do you have any other questions or comments on how we plan, do and share the results of rapid reviews?
Participant demographic data will also be collected to monitor responses of different stakeholder groups and help refine and target the promotion of the survey towards under-represented groups if necessary. In line with previous methodology-focused PSPs17,18, the survey will be open for four weeks.
The survey will be advertised and distributed via the Steering Group and the PSP Partners. Specifically, members will distribute the survey via their networks, mailing lists, newsletters and social media. We do not have an a priori determined sample size and instead will distribute the survey widely with a view to reaching as wide an audience as possible. The survey and all other study materials can be found as Extended data21.
Data processing and verifying uncertainties (5 months). The initial survey consultation process is expected to produce a substantial number of ‘raw’ questions and comments indicating stakeholders’ areas of uncertainty. The survey data will be downloaded from QuestionPro20. These raw questions will be categorised and refined by the NUI Galway research team (CB, CH, DD) into summary questions that are clear, addressable by research, and understandable to all. Similar or duplicate responses will be combined where appropriate. Questions will be considered out of scope if they do not relate to planning, doing and sharing the results of rapid reviews within the healthcare setting, if they ask for information or advice, or if they are too broad, unclear, unrelated or off topic. Questions deemed to be out of scope will be compiled separately and made available for future use upon request.
The Steering Group will exercise oversight of data analysis and processing to ensure that the raw data are interpreted appropriately. The summary questions will be developed reflectively, in wording that is understandable to all audiences. The process will be conducted transparently so that the finalised questions can be traced back to the raw response data.
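To illustrate how this traceability could be maintained in practice, the sketch below shows one possible record structure linking each summary question to the identifiers of the raw survey responses from which it was derived. This is an illustrative assumption rather than part of the protocol; the class name, fields and example values are hypothetical.

```python
# Illustrative audit-trail structure only (not specified in the protocol):
# each summary question retains the IDs of the raw survey responses it was
# derived from, so finalised questions can be traced back to the raw data.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SummaryQuestion:
    question_id: str
    text: str
    source_response_ids: List[str] = field(default_factory=list)  # raw survey rows
    in_scope: bool = True
    out_of_scope_reason: Optional[str] = None  # recorded when in_scope is False


# Hypothetical example: two similar raw responses combined into one summary question.
sq = SummaryQuestion(
    question_id="SQ-001",
    text="Which search restrictions can be used in a rapid review without "
         "compromising the validity of its findings?",
    source_response_ids=["R-0042", "R-0117"],
)
print(sq)
```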
Each in-scope question will be checked against existing sources of evidence to determine which questions remain unanswered. A question will be verified as unanswered if a synthesis gap is apparent following a search of all relevant systematic reviews published within the past three years. We judged a review to be systematic when it involved explicit methods to search, select, critically appraise and synthesise individual studies. If a systematic review has been conducted to answer any of the questions completely, the quality of that systematic review will be appraised using the AMSTAR 222 tool to help inform decisions about the extent to which the question has been answered. Evidence checking will be completed by one researcher (CB) and verified by one other researcher (DD). Differences of opinion will be resolved by consultation with a third researcher if necessary.
Existing sources of evidence will be identified through a search of the PubMed bibliographic database for systematic reviews published from 2018 to the time of searching (2021) using a search strategy developed specifically for use in this project with the support of an experienced information specialist (see Table 1 for search strategy and limits applied).
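For illustration only, the sketch below shows how a date-limited PubMed search for systematic reviews could be run programmatically through the publicly available NCBI E-utilities esearch endpoint. The query terms are placeholders; the actual Priority III search strategy and limits are those given in Table 1.

```python
# Illustrative sketch only: the real search strategy and limits are in Table 1.
# Uses the NCBI E-utilities esearch endpoint; query terms below are placeholders.
import requests

EUTILS_ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def search_pubmed_systematic_reviews(terms, retmax=200):
    """Return PubMed IDs of systematic reviews matching `terms`, published 2018-2021."""
    params = {
        "db": "pubmed",
        "term": f"({terms}) AND systematic review[Publication Type]",
        "mindate": "2018",   # date limit described in the protocol
        "maxdate": "2021",
        "datetype": "pdat",  # restrict by publication date
        "retmode": "json",
        "retmax": retmax,
    }
    response = requests.get(EUTILS_ESEARCH, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["esearchresult"]["idlist"]


# Hypothetical example: checking one in-scope question for an existing synthesis.
pmids = search_pubmed_systematic_reviews("rapid review AND single reviewer screening")
print(f"{len(pmids)} candidate systematic reviews to check for this question")
```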
The JLA Question Verification Form, which describes the process used to verify question uncertainty, will be completed. In line with JLA guidance, the Question Verification Form includes details of the types and sources of evidence used to check uncertainty. This form will be published on the JLA website to enable stakeholders to understand how the PSP decided that these questions are unanswered and any limitations of this process.
The Steering Group will be asked to review and refine, as appropriate, the final list of summary questions for inclusion in the interim survey.
Interim priority setting (1 month: April 2021). Interim priority setting is where the long list of questions is reduced to a shorter list that can be taken to the final priority setting workshop. The interim prioritisation survey is aimed at a wide audience and will be administered electronically using QuestionPro software20, hosted on the Evidence Synthesis Ireland website. It will be designed and piloted by the Steering Group members. In this survey, stakeholders will be asked to prioritise the summary questions developed from the initial survey in order of importance. This interim priority setting process will help reduce the long list to a short, manageable set of approximately 20 indicative questions that are clear, addressable by research and understandable. The survey will be open for four weeks and, like the initial survey, will be advertised and distributed via the Steering Group and the PSP Partners. Participant demographic data will be collected to monitor response rates from different stakeholder groups and to help refine and target the promotion of the survey towards under-represented groups.
Questions prioritised in the interim survey will go forward for final prioritisation at a stakeholder workshop(s). Where the interim prioritisation does not produce a clear ranking or cut off point, the Steering Group will decide, and report, which questions are taken forward to the final prioritisation.
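How individual interim survey responses are combined into a shortlist will be decided and reported by the Steering Group. Purely for illustration, the sketch below applies one commonly used rank-based scoring rule (questions placed higher in a respondent's ordering earn more points) to hypothetical question IDs; it is an assumption, not the agreed Priority III method.

```python
# Illustrative rank-based aggregation only; the actual interim scoring rule is a
# Steering Group decision. A question ranked higher by a respondent earns more points.
from collections import defaultdict


def aggregate_rankings(responses, shortlist_size=20):
    """responses: one list per respondent, giving question IDs in order of importance."""
    scores = defaultdict(int)
    for ranking in responses:
        for position, question_id in enumerate(ranking):
            scores[question_id] += len(ranking) - position  # assumed points scheme
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered[:shortlist_size]


# Hypothetical example with three respondents and four summary questions.
shortlist = aggregate_rankings(
    [["Q07", "Q12", "Q03"], ["Q12", "Q07", "Q19"], ["Q03", "Q12", "Q07"]],
    shortlist_size=2,
)
print(shortlist)  # ['Q12', 'Q07']
```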
Final prioritisation workshop (2 days: May 2021). The final priority setting stage will involve two half-day virtual workshops facilitated by the JLA to ensure transparency, accountability, fairness and appropriate representation. Based on the methods of the JLA, these workshops would ordinarily take place in person but, because the Priority III study is being conducted under restrictions arising from the global COVID-19 pandemic, they will be held virtually. With input from the Steering Group, up to 24 contributors drawn from key stakeholder groups (patients, the public, reviewers, researchers, clinicians, policymakers and funders) will be recruited to participate in a day of group discussions, plenary sessions and ranking exercises to determine the top 10 questions for research. Four virtual breakout rooms will be used to facilitate smaller group discussions. All participants will declare their interests, enabling a diverse group of stakeholders to exchange knowledge, perspectives and experiences and to inform the decision-making process. A maximum of four Steering Group members will be invited to participate in the workshops, with a Steering Group member in each breakout room to take part and answer any questions about the Priority III process. Additional Steering Group members who wish to attend will do so in an observer capacity. The outcome of the workshop will be consensus on a top 10 list of unanswered research questions about rapid review methodology.
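Within such a workshop, rankings agreed in the breakout rooms are typically combined into a single aggregate ranking between discussion rounds. The sketch below assumes a simple mean-rank rule purely for illustration; the facilitated JLA process, not this code, governs how the final ranking is reached.

```python
# Illustrative aggregation of breakout-room rankings using mean rank (an assumed
# rule); the facilitated JLA workshop process determines the actual final ranking.
def combine_group_rankings(group_rankings):
    """Each inner list is one breakout room's ordering of the same question IDs."""
    question_ids = group_rankings[0]
    mean_rank = {
        q: sum(ranking.index(q) + 1 for ranking in group_rankings) / len(group_rankings)
        for q in question_ids
    }
    return sorted(question_ids, key=mean_rank.get)  # lowest mean rank first


# Hypothetical rankings from the four virtual breakout rooms.
rooms = [
    ["Q12", "Q07", "Q03", "Q19"],
    ["Q07", "Q12", "Q19", "Q03"],
    ["Q12", "Q03", "Q07", "Q19"],
    ["Q12", "Q07", "Q19", "Q03"],
]
print(combine_group_rankings(rooms))  # ['Q12', 'Q07', 'Q03', 'Q19']
```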
Disseminating findings. The results of the study will be disseminated through the JLA and Evidence Synthesis Ireland websites. The Steering Group will identify additional appropriate audiences to engage when sharing the results, such as researchers, funders, and patient and clinical communities. The Steering Group will also identify opportunities to collaborate and contribute evidence to answer the top 10 list of research priorities.
The Priority III study will identify a top 10 list of unanswered research questions regarding rapid review methodology, guided by an adaptation of the James Lind Alliance PSP process. The findings of the study will contribute to minimising research waste in rapid review methodology by ensuring that research resources are used to answer the questions on this topic that have been prioritised by key stakeholders internationally, including patients and the public, reviewers, researchers, clinicians, policymakers and funders.
At the time of submission, all stages up to the dissemination of findings had been completed. There are several factors that contributed to the submission of the protocol at this time:
1. The iterative nature of the PSP processes and the decisions that needed to be made by the Steering Group as the study progressed through to the later stages
2. Discussions with F1000/HRB Open Research on the affiliation of authors, and the time dedicated to these discussions
No results of the study have yet been disseminated.
Open Science Framework: Priority III. https://doi.org/10.17605/OSF.IO/R6VFX 21.
This project contains the following extended data:
Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).