database individually, including ALL your search terms, any
MeSH or other subject headings, truncation (like hemipleg*),
and/or wildcards (like sul*ur). Apply all your limits (such as
years of search, English language only, and so on). Once all
search terms have been combined and you have applied all
relevant limits, you should have a final number of records or
articles for each database. Enter this information in the top
left box of the PRISMA flow chart. You should add the total
number of combined results from all databases (including
duplicates) after the equal sign where it says Databases (n = ).
Many researchers also add notations in the box for the number
of results from each database search, for example, PubMed
(n = 335), Embase (n = 600), and so on. If you search trial
registers, you should enter that number after the equal sign
in Registers (n = ).
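As a rough illustration of this bookkeeping (the database names and counts below are made-up placeholders, not results from any real search), a few lines of Python can total the per-source numbers for these boxes:

```python
# Hypothetical per-source result counts, captured before any deduplication.
database_results = {"PubMed": 335, "Embase": 600, "CINAHL": 212}
register_results = {"Trial register A": 48, "Trial register B": 17}

databases_n = sum(database_results.values())   # value for "Databases (n = )"
registers_n = sum(register_results.values())   # value for "Registers (n = )"

print(f"Databases (n = {databases_n})")        # Databases (n = 1147)
for name, n in database_results.items():       # optional per-database notations
    print(f"  {name} (n = {n})")
print(f"Registers (n = {registers_n})")        # Registers (n = 65)
```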
NOTE: Some citation managers automatically remove duplicates
with each file you import. Be sure to capture the number of articles
from your database searches before any duplicates are removed.
To avoid reviewing duplicate articles,
you need to remove any articles that appear more than once in your
results. You may want to export the entire list of articles from each
database to a citation manager such as EndNote, Sciwheel, Zotero,
or Mendeley (including both citation and abstract in your file) and
remove the duplicates there. If you are using Covidence for your
review, you should also add the duplicate articles identified in
Covidence to the citation manager number. Enter the number of
records removed as duplicates in the second box on your PRISMA
template. If you are using automation tools to help evaluate the
relevance of citations in your results, you would also enter that
number here.
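As a minimal sketch of the arithmetic behind this box (all numbers are hypothetical), the snippet below shows how the duplicates-removed figure relates to the records identified and the records left to screen:

```python
# Hypothetical counts for the "records removed before screening" box.
records_identified = 1147        # total from all database searches, duplicates included
unique_records = 902             # left after citation manager (and Covidence) deduplication

duplicates_removed = records_identified - unique_records   # 245
removed_by_automation_tools = 0  # add any records excluded by automation tools here

records_screened = unique_records - removed_by_automation_tools
print(f"Duplicate records removed (n = {duplicates_removed})")
print(f"Records removed by automation tools (n = {removed_by_automation_tools})")
print(f"Records screened (n = {records_screened})")
```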
If you are using Covidence to screen your articles, you can
copy the numbers from the PRISMA diagram in your Covidence
review into the boxes mentioned below. Covidence does not include
the number of results from each database, so you will need to keep
track of that number yourself.
The final step is to subtract the number
of records excluded during the review of full-texts (Step 9)
from the total number of full-texts reviewed (Step 8). Enter
this number in the box labeled "Studies included in review,"
combining numbers with your grey literature search results in this
box if needed.
You have now completed your PRISMA flow diagram, unless you
have also performed searches in non-database sources or are
performing a search update. If so, complete those portions of the template as well.
Step 1: Preparation. Download the flow diagram template: version 1 (PRISMA 2020 flow diagram for new systematic reviews which included searches of databases, registers and other sources) or version 2 (PRISMA 2020 flow diagram for updated systematic reviews which included searches of databases, registers and other sources).
If you have identified articles through other sources than databases (such as manual searches through reference lists of articles you have found or search engines like Google Scholar), enter the total number of records from each source type in the box on the top right of the flow diagram.
This should be the total number of reports you obtain from each grey literature source.
List the number of documents for which you are unable to find the full text. Remember to use Find@UNC and to request items to see if we can order them from other libraries before automatically excluding them.
This should be the number of grey literature reports sought for retrieval (Step 2) minus the number of reports not retrieved (Step 3). Review the full text for these items to assess their eligibility for inclusion in your systematic review.
After reviewing all items in the full-text screening stage for eligibility, enter the total number of articles you exclude in the box titled "Reports Excluded," and then list your reasons for excluding the item as well as the number of items excluded for each reason. Examples include wrong setting, wrong patient population, wrong intervention, wrong dosage, etc. You should only count an excluded item once in your list even if it meets multiple exclusion criteria.
The final step is to subtract the number of excluded articles or records during the eligibility review of full-texts from the total number of articles reviewed for eligibility. Enter this number in the box labeled "Studies included in review," combining numbers with your database search results in this box if needed. You have now completed your PRISMA flow diagram, which you can include in the results section of your article or assignment.
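A minimal worked example of this arm of the diagram (all counts are invented for illustration) might look like the following, ending with the combined "Studies included in review" figure:

```python
# Hypothetical counts for the grey-literature ("other sources") arm.
reports_sought_for_retrieval = 40   # grey-literature reports you tried to obtain
reports_not_retrieved = 5           # full text could not be found or ordered
reports_assessed = reports_sought_for_retrieval - reports_not_retrieved   # 35
reports_excluded = 28               # ineligible after full-text review

studies_from_other_sources = reports_assessed - reports_excluded          # 7
studies_from_databases = 23         # hypothetical result of the database/register arm

studies_included_in_review = studies_from_databases + studies_from_other_sources
print(f"Studies included in review (n = {studies_included_in_review})")   # n = 30
```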
Step 1: Preparation. Download the flow diagram template: version 1 (PRISMA 2020 flow diagram for updated systematic reviews which included searches of databases and registers only) or version 2 (PRISMA 2020 flow diagram for updated systematic reviews which included searches of databases, registers and other sources).
In the Previous studies column, enter the number of studies (and reports of studies) included in the previous version of your review. At the bottom of the column, there will also be a box for the total number of studies included in your updated review.
For more information about updating your systematic review, see the box Updating Your Review? on the Step 3: Conduct Literature Searches page of the guide.
Scientific articles often follow the IMRaD format: Introduction, Methods, Results, and Discussion. You will also need a title and an abstract to summarize your research.
You can read more about scientific writing through the library guides below.
Systematic reviews follow the same structure as original research articles, but you will need to report on your search instead of on details like the participants or sampling. Sections of your manuscript are shown as bold headings in the PRISMA checklist.
| Section | Description |
|---|---|
| Title | Describe your manuscript and state whether it is a systematic review, meta-analysis, or both. |
| Abstract | Structure the abstract and include (as applicable): background, objectives, data sources, study eligibility criteria, participants, interventions, quality assessment and synthesis methods, results, limitations, conclusions, implications of key findings, and systematic review registration number. |
| Introduction | Describe the rationale for the review and provide a statement of questions being addressed. |
| Methods | Include details regarding the protocol, eligibility criteria, databases searched, full search strategy of at least one database (often reported in an appendix), and the study selection process. Describe how data were extracted and analyzed. If a librarian is part of your research team, that person may be best suited to write this section. |
| Results | Report the numbers of articles screened at each stage using a PRISMA diagram. Include information about included study characteristics, risk of bias (quality assessment) within studies, and results across studies. |
| Discussion | Summarize main findings, including the strength of evidence and limitations of the review. Provide a general interpretation of the results and implications for future research. |
| Funding | Describe any sources of funding for the systematic review. |
| Appendix | Include the entire search strategy for at least one database (include search strategies for all databases searched for more transparency). |
Refer to the PRISMA checklist for more information.
Consider including a Plain Language Summary (PLS) when you publish your systematic review. Like an abstract, a PLS gives an overview of your study, but is specifically written and formatted to be easy for non-experts to understand.
Tips for writing a PLS:
Learn more about Plain Language Summaries:
A literature review is an integrated analysis, not just a summary, of scholarly writings and other relevant evidence related directly to your research question. That is, it represents a synthesis of the evidence that provides background information on your topic and shows an association between the evidence and your research question.
A literature review may be a stand-alone work or the introduction to a larger research paper, depending on the assignment. Rely heavily on the guidelines your instructor has given you.
Why is it important?
A literature review is important because it:
Your literature review should be guided by your central research question. The literature represents background and research developments related to a specific research question, interpreted and analyzed by you in a synthesized way.
How many studies do you need to look at? How comprehensive should it be? How many years should it cover?
Make a list of the databases you will search.
Where to find databases:
Some questions to help you analyze the research:
Tips:
Steps in the literature review process.
Note: The first four steps are the best points at which to contact a librarian. Your librarian can help you determine the best databases to use for your topic, assess scope, and formulate a search strategy.
This 4.5 minute video from Academic Education Materials has a Creative Commons License and a British narrator.
3 straightforward steps (with examples) + free template.
By: Derek Jansen (MBA) | Expert Reviewed By: Dr. Eunice Rautenbach | October 2019
Quality research is about building onto the existing work of others, “standing on the shoulders of giants”, as Newton put it. The literature review chapter of your dissertation, thesis or research project is where you synthesise this prior work and lay the theoretical foundation for your own research.
Long story short, this chapter is a pretty big deal, which is why you want to make sure you get it right. In this post, I’ll show you exactly how to write a literature review in three straightforward steps, so you can conquer this vital chapter (the smart way).
Before we unpack how to write the literature review chapter, we’ve got to look at the why. To put it bluntly, if you don’t understand the function and purpose of the literature review process, there’s no way you can pull it off well. So, what exactly is the purpose of the literature review?
Well, there are (at least) four core functions:
Most students understand the first point but don’t give any thought to the rest. To get the most from the literature review process, you must keep all four points front of mind as you review the literature (more on this shortly), or you’ll land up with a wonky foundation.
Okay – with the why out the way, let’s move on to the how. As mentioned above, writing your literature review is a process, which I’ll break down into three steps:
Importantly, you must complete steps one and two before you start writing up your chapter. I know it’s very tempting, but don’t try to kill two birds with one stone and write as you read. You’ll invariably end up wasting huge amounts of time re-writing and re-shaping, or you’ll just land up with a disjointed, hard-to-digest mess. Instead, you need to read first and distil the information, then plan and execute the writing.
Naturally, the first step in the literature review journey is to hunt down the existing research that’s relevant to your topic. While you probably already have a decent base of this from your research proposal, you need to expand on this substantially in the dissertation or thesis itself.
Essentially, you need to be looking for any existing literature that potentially helps you answer your research question (or develop it, if that’s not yet pinned down). There are numerous ways to find relevant literature, but I’ll cover my top four tactics here. I’d suggest combining all four methods to ensure that nothing slips past you:
Google’s academic search engine, Google Scholar, is a great starting point as it provides a good high-level view of the relevant journal articles for whatever keyword you throw at it. Most valuably, it tells you how many times each article has been cited, which gives you an idea of how credible (or at least, popular) it is. Some articles will be free to access, while others will require an account, which brings us to the next method.
Generally, universities provide students with access to an online library, which provides access to many (but not all) of the major journals.
So, if you find an article using Google Scholar that requires paid access (which is quite likely), search for that article in your university’s database – if it’s listed there, you’ll have access. Note that, generally, the search engine capabilities of these databases are poor, so make sure you search for the exact article name, or you might not find it.
At the end of every academic journal article, you’ll find a list of references. As with any academic writing, these references are the building blocks of the article, so if the article is relevant to your topic, there’s a good chance a portion of the referenced works will be too. Do a quick scan of the titles and see what seems relevant, then search for the relevant ones in your university’s database.
Similar to Method 3 above, you can leverage other students’ dissertations. All you have to do is skim through literature review chapters of existing dissertations related to your topic and you’ll find a gold mine of potential literature. Usually, your university will provide you with access to previous students’ dissertations, but you can also find a much larger selection in the following databases:
Keep in mind that dissertations and theses are not as academically sound as published, peer-reviewed journal articles (because they’re written by students, not professionals), so be sure to check the credibility of any sources you find using this method. You can do this by assessing the citation count of any given article in Google Scholar. If you need help with assessing the credibility of any article, or with finding relevant research in general, you can chat with one of our Research Specialists.
Alright – with a good base of literature firmly under your belt, it’s time to move onto the next step.
Once you’ve built a little treasure trove of articles, it’s time to get reading and start digesting the information – what does it all mean?
While I present steps one and two (hunting and digesting) as sequential, in reality, it’s more of a back-and-forth tango – you’ll read a little, then have an idea, spot a new citation, or a new potential variable, and then go back to searching for articles. This is perfectly natural – through the reading process, your thoughts will develop, new avenues might crop up, and directional adjustments might arise. This is, after all, one of the main purposes of the literature review process (i.e. to familiarise yourself with the current state of research in your field).
As you’re working through your treasure chest, it’s essential that you simultaneously start organising the information. There are three aspects to this:
I’ll discuss each of these below:
As you read each article, you should add it to your reference management software. I usually recommend Mendeley for this purpose (see the Mendeley 101 video below), but you can use whichever software you’re comfortable with. Most importantly, make sure you load EVERY article you read into your reference manager, even if it doesn’t seem very relevant at the time.
In the beginning, you might feel confident that you can remember who said what, where, and what their main arguments were. Trust me, you won’t. If you do a thorough review of the relevant literature (as you must!), you’re going to read many, many articles, and it’s simply impossible to remember who said what, when, and in what context. Also, without the bird’s eye view that a catalogue provides, you’ll miss connections between various articles, and have no view of how the research developed over time. Simply put, it’s essential to build your own catalogue of the literature.
I would suggest using Excel to build your catalogue, as it allows you to run filters, colour code and sort – all very useful when your list grows large (which it will). How you lay your spreadsheet out is up to you, but I’d suggest you have the following columns (at minimum):
If you’d like, you can try out our free catalog template here (see screenshot below).
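As a minimal sketch of such a catalogue (the column names below are common suggestions rather than a prescribed list, and the row is a placeholder), you could start a spreadsheet-friendly CSV like this and open it in Excel:

```python
import csv

# Assumed catalogue columns - adjust to suit your own review.
columns = ["Author(s)", "Year", "Title", "Journal", "Key arguments/findings",
           "Methodology", "Relevance to my research question", "Notes"]

rows = [
    ["Smith & Lee", "2018", "Example study title", "Journal of Examples",
     "Placeholder summary of the main finding", "Survey (n = 200)",
     "Supports variable X", "Follow up the cited measurement scale"],
]

# Write the catalogue so it can be filtered, colour-coded, and sorted in Excel.
with open("literature_catalogue.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(columns)
    writer.writerows(rows)
```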
Most importantly, as you work through the literature and build your catalogue, you need to synthesise all the information in your own mind – how does it all fit together? Look for links between the various articles and try to develop a bigger picture view of the state of the research. Some important questions to ask yourself are:
To help you develop a big-picture view and synthesise all the information, you might find mind mapping software such as Freemind useful. Alternatively, if you’re a fan of physical note-taking, investing in a large whiteboard might work for you.
Once you’re satisfied that you have digested and distilled all the relevant literature in your mind, it’s time to put pen to paper (or rather, fingers to keyboard). There are two steps here – outlining and writing:
Having spent so much time reading, it might be tempting to just start writing up without a clear structure in mind. However, it’s critically important to decide on your structure and develop a detailed outline before you write anything. Your literature review chapter needs to present a clear, logical and easy-to-follow narrative – and that requires some planning. Don’t try to wing it!
Naturally, you won’t always follow the plan to the letter, but without a detailed outline, you’re more than likely going to end up with a disjointed pile of waffle, and then you’re going to spend a far greater amount of time re-writing, hacking and patching. The adage, “measure twice, cut once” is very suitable here.
In terms of structure, the first decision you’ll have to make is whether you’ll lay out your review thematically (into themes) or chronologically (by date/period). The right choice depends on your topic, research objectives and research questions, which we discuss in this article.
Once that’s decided, you need to draw up an outline of your entire chapter in bullet point format. Try to get as detailed as possible, so that you know exactly what you’ll cover where, how each section will connect to the next, and how your entire argument will develop throughout the chapter. Also, at this stage, it’s a good idea to allocate rough word count limits for each section, so that you can identify word count problems before you’ve spent weeks or months writing!
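As a toy illustration of budgeting word counts per section (the section names and limits are placeholders, not a recommended structure), a quick check like this flags problems before you start writing:

```python
# Hypothetical word-count budget for a literature review chapter.
chapter_limit = 8000
section_budgets = {
    "Introduction": 600,
    "Theme 1": 2200,
    "Theme 2": 2200,
    "Theme 3": 2000,
    "Conceptual framework and gaps": 800,
}

planned_total = sum(section_budgets.values())   # 7800
print(f"Planned total: {planned_total} / {chapter_limit} words")
if planned_total > chapter_limit:
    print("Over budget - trim sections before you start writing.")
```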
PS – check out our free literature review chapter template…
With a detailed outline at your side, it’s time to start writing up (finally!). At this stage, it’s common to feel a bit of writer’s block and find yourself procrastinating under the pressure of finally having to put something on paper. To help with this, remember that the objective of the first draft is not perfection – it’s simply to get your thoughts out of your head and onto paper, after which you can refine them. The structure might change a little, the word count allocations might shift and shuffle, and you might add or remove a section – that’s all okay. Don’t worry about all this on your first draft – just get your thoughts down on paper.
Once you’ve got a full first draft (however rough it may be), step away from it for a day or two (longer if you can) and then come back at it with fresh eyes. Pay particular attention to the flow and narrative – does it all fit together and flow from one section to another smoothly? Now’s the time to try to improve the linkage from each section to the next, tighten up the writing to be more concise, trim down word count and sand it down into a more digestible read.
Once you’ve done that, give your writing to a friend or colleague who is not a subject matter expert and ask them if they understand the overall discussion. The best way to assess this is to ask them to explain the chapter back to you. This technique will give you a strong indication of which points were clearly communicated and which weren’t. If you’re working with Grad Coach, this is a good time to have your Research Specialist review your chapter.
Finally, tighten it up and send it off to your supervisor for comment. Some might argue that you should be sending your work to your supervisor sooner than this (indeed your university might formally require this), but in my experience, supervisors are extremely short on time (and often patience), so, the more refined your chapter is, the less time they’ll waste on addressing basic issues (which you know about already) and the more time they’ll spend on valuable feedback that will increase your mark-earning potential.
In the video below, we unpack an actual literature review so that you can see how all the core components come together in reality.
In this post, we’ve covered how to research and write up a high-quality literature review chapter. Let’s do a quick recap of the key takeaways:
This post is an extract from our bestselling short course, Literature Review Bootcamp. If you want to work smart, you don't want to miss this.
PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews
The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews.
Systematic reviews serve many critical roles. They can provide syntheses of the state of knowledge in a field, from which future research priorities can be identified; they can address questions that otherwise could not be answered by individual studies; they can identify problems in primary research that should be rectified in future studies; and they can generate or evaluate theories about how or why phenomena occur. Systematic reviews therefore generate various types of knowledge for different users of reviews (such as patients, healthcare providers, researchers, and policy makers). 1 2 To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did (such as how studies were identified and selected) and what they found (such as characteristics of contributing studies and results of meta-analyses). Up-to-date reporting guidance facilitates authors achieving this. 3
The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement published in 2009 (hereafter referred to as PRISMA 2009) 4 5 6 7 8 9 10 is a reporting guideline designed to address poor reporting of systematic reviews. 11 The PRISMA 2009 statement comprised a checklist of 27 items recommended for reporting in systematic reviews and an “explanation and elaboration” paper 12 13 14 15 16 providing additional reporting guidance for each item, along with exemplars of reporting. The recommendations have been widely endorsed and adopted, as evidenced by its co-publication in multiple journals, citation in over 60 000 reports (Scopus, August 2020), endorsement from almost 200 journals and systematic review organisations, and adoption in various disciplines. Evidence from observational studies suggests that use of the PRISMA 2009 statement is associated with more complete reporting of systematic reviews, 17 18 19 20 although more could be done to improve adherence to the guideline. 21
Many innovations in the conduct of systematic reviews have occurred since publication of the PRISMA 2009 statement. For example, technological advances have enabled the use of natural language processing and machine learning to identify relevant evidence, 22 23 24 methods have been proposed to synthesise and present findings when meta-analysis is not possible or appropriate, 25 26 27 and new methods have been developed to assess the risk of bias in results of included studies. 28 29 Evidence on sources of bias in systematic reviews has accrued, culminating in the development of new tools to appraise the conduct of systematic reviews. 30 31 Terminology used to describe particular review processes has also evolved, as in the shift from assessing “quality” to assessing “certainty” in the body of evidence. 32 In addition, the publishing landscape has transformed, with multiple avenues now available for registering and disseminating systematic review protocols, 33 34 disseminating reports of systematic reviews, and sharing data and materials, such as preprint servers and publicly accessible repositories. To capture these advances in the reporting of systematic reviews necessitated an update to the PRISMA 2009 statement.
To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did, and what they found
The PRISMA 2020 statement provides updated reporting guidance for systematic reviews that reflects advances in methods to identify, select, appraise, and synthesise studies
The PRISMA 2020 statement consists of a 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and revised flow diagrams for original and updated reviews
We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders
A complete description of the methods used to develop PRISMA 2020 is available elsewhere. 35 We identified PRISMA 2009 items that were often reported incompletely by examining the results of studies investigating the transparency of reporting of published reviews. 17 21 36 37 We identified possible modifications to the PRISMA 2009 statement by reviewing 60 documents providing reporting guidance for systematic reviews (including reporting guidelines, handbooks, tools, and meta-research studies). 38 These reviews of the literature were used to inform the content of a survey with suggested possible modifications to the 27 items in PRISMA 2009 and possible additional items. Respondents were asked whether they believed we should keep each PRISMA 2009 item as is, modify it, or remove it, and whether we should add each additional item. Systematic review methodologists and journal editors were invited to complete the online survey (110 of 220 invited responded). We discussed proposed content and wording of the PRISMA 2020 statement, as informed by the review and survey results, at a 21-member, two-day, in-person meeting in September 2018 in Edinburgh, Scotland. Throughout 2019 and 2020, we circulated an initial draft and five revisions of the checklist and explanation and elaboration paper to co-authors for feedback. In April 2020, we invited 22 systematic reviewers who had expressed interest in providing feedback on the PRISMA 2020 checklist to share their views (via an online survey) on the layout and terminology used in a preliminary version of the checklist. Feedback was received from 15 individuals and considered by the first author, and any revisions deemed necessary were incorporated before the final version was approved and endorsed by all co-authors.
Scope of the guideline.
The PRISMA 2020 statement has been designed primarily for systematic reviews of studies that evaluate the effects of health interventions, irrespective of the design of the included studies. However, the checklist items are applicable to reports of systematic reviews evaluating other interventions (such as social or educational interventions), and many items are applicable to systematic reviews with objectives other than evaluating interventions (such as evaluating aetiology, prevalence, or prognosis). PRISMA 2020 is intended for use in systematic reviews that include synthesis (such as pairwise meta-analysis or other statistical synthesis methods) or do not include synthesis (for example, because only one eligible study is identified). The PRISMA 2020 items are relevant for mixed-methods systematic reviews (which include quantitative and qualitative studies), but reporting guidelines addressing the presentation and synthesis of qualitative data should also be consulted. 39 40 PRISMA 2020 can be used for original systematic reviews, updated systematic reviews, or continually updated (“living”) systematic reviews. However, for updated and living systematic reviews, there may be some additional considerations that need to be addressed. Where there is relevant content from other reporting guidelines, we reference these guidelines within the items in the explanation and elaboration paper 41 (such as PRISMA-Search 42 in items 6 and 7, Synthesis without meta-analysis (SWiM) reporting guideline 27 in item 13d). Box 1 includes a glossary of terms used throughout the PRISMA 2020 statement.
Systematic review —A review that uses explicit, systematic methods to collate and synthesise findings of studies that address a clearly formulated question 43
Statistical synthesis —The combination of quantitative results of two or more studies. This encompasses meta-analysis of effect estimates (described below) and other methods, such as combining P values, calculating the range and distribution of observed effects, and vote counting based on the direction of effect (see McKenzie and Brennan 25 for a description of each method)
Meta-analysis of effect estimates —A statistical technique used to synthesise results when study effect estimates and their variances are available, yielding a quantitative summary of results 25
Outcome —An event or measurement collected for participants in a study (such as quality of life, mortality)
Result —The combination of a point estimate (such as a mean difference, risk ratio, or proportion) and a measure of its precision (such as a confidence/credible interval) for a particular outcome
Report —A document (paper or electronic) supplying information about a particular study. It could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information
Record —The title or abstract (or both) of a report indexed in a database or website (such as a title or abstract for an article indexed in Medline). Records that refer to the same report (such as the same journal article) are “duplicates”; however, records that refer to reports that are merely similar (such as a similar abstract submitted to two different conferences) should be considered unique.
Study —An investigation, such as a clinical trial, that includes a defined group of participants and one or more interventions and outcomes. A “study” might have multiple reports. For example, reports could include the protocol, statistical analysis plan, baseline characteristics, results for the primary outcome, results for harms, results for secondary outcomes, and results for additional mediator and moderator analyses
PRISMA 2020 is not intended to guide systematic review conduct, for which comprehensive resources are available. 43 44 45 46 However, familiarity with PRISMA 2020 is useful when planning and conducting systematic reviews to ensure that all recommended information is captured. PRISMA 2020 should not be used to assess the conduct or methodological quality of systematic reviews; other tools exist for this purpose. 30 31 Furthermore, PRISMA 2020 is not intended to inform the reporting of systematic review protocols, for which a separate statement is available (PRISMA for Protocols (PRISMA-P) 2015 statement 47 48 ). Finally, extensions to the PRISMA 2009 statement have been developed to guide reporting of network meta-analyses, 49 meta-analyses of individual participant data, 50 systematic reviews of harms, 51 systematic reviews of diagnostic test accuracy studies, 52 and scoping reviews 53 ; for these types of reviews we recommend authors report their review in accordance with the recommendations in PRISMA 2020 along with the guidance specific to the extension.
The PRISMA 2020 statement (including the checklists, explanation and elaboration, and flow diagram) replaces the PRISMA 2009 statement, which should no longer be used. Box 2 summarises noteworthy changes from the PRISMA 2009 statement. The PRISMA 2020 checklist includes seven sections with 27 items, some of which include sub-items ( table 1 ). A checklist for journal and conference abstracts for systematic reviews is included in PRISMA 2020. This abstract checklist is an update of the 2013 PRISMA for Abstracts statement, 54 reflecting new and modified content in PRISMA 2020 ( table 2 ). A template PRISMA flow diagram is provided, which can be modified depending on whether the systematic review is original or updated ( fig 1 ).
Inclusion of the abstract reporting checklist within PRISMA 2020 (see item #2 and table 2 ).
Movement of the ‘Protocol and registration’ item from the start of the Methods section of the checklist to a new Other section, with addition of a sub-item recommending authors describe amendments to information provided at registration or in the protocol (see item #24a-24c).
Modification of the ‘Search’ item to recommend authors present full search strategies for all databases, registers and websites searched, not just at least one database (see item #7).
Modification of the ‘Study selection’ item in the Methods section to emphasise the reporting of how many reviewers screened each record and each report retrieved, whether they worked independently, and if applicable, details of automation tools used in the process (see item #8).
Addition of a sub-item to the ‘Data items’ item recommending authors report how outcomes were defined, which results were sought, and methods for selecting a subset of results from included studies (see item #10a).
Splitting of the ‘Synthesis of results’ item in the Methods section into six sub-items recommending authors describe: the processes used to decide which studies were eligible for each synthesis; any methods required to prepare the data for synthesis; any methods used to tabulate or visually display results of individual studies and syntheses; any methods used to synthesise results; any methods used to explore possible causes of heterogeneity among study results (such as subgroup analysis, meta-regression); and any sensitivity analyses used to assess robustness of the synthesised results (see item #13a-13f).
Addition of a sub-item to the ‘Study selection’ item in the Results section recommending authors cite studies that might appear to meet the inclusion criteria, but which were excluded, and explain why they were excluded (see item #16b).
Splitting of the ‘Synthesis of results’ item in the Results section into four sub-items recommending authors: briefly summarise the characteristics and risk of bias among studies contributing to the synthesis; present results of all statistical syntheses conducted; present results of any investigations of possible causes of heterogeneity among study results; and present results of any sensitivity analyses (see item #20a-20d).
Addition of new items recommending authors report methods for and results of an assessment of certainty (or confidence) in the body of evidence for an outcome (see items #15 and #22).
Addition of a new item recommending authors declare any competing interests (see item #26).
Addition of a new item recommending authors indicate whether data, analytic code and other materials used in the review are publicly available and if so, where they can be found (see item #27).
PRISMA 2020 item checklist
PRISMA 2020 for Abstracts checklist*
PRISMA 2020 flow diagram template for systematic reviews. The new design is adapted from flow diagrams proposed by Boers, 55 Mayo-Wilson et al. 56 and Stovold et al. 57 The boxes in grey should only be completed if applicable; otherwise they should be removed from the flow diagram. Note that a “report” could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report or any other document providing relevant information.
We recommend authors refer to PRISMA 2020 early in the writing process, because prospective consideration of the items may help to ensure that all the items are addressed. To help keep track of which items have been reported, the PRISMA statement website ( http://www.prisma-statement.org/ ) includes fillable templates of the checklists to download and complete (also available in the data supplement on bmj.com). We have also created a web application that allows users to complete the checklist via a user-friendly interface 58 (available at https://prisma.shinyapps.io/checklist/ and adapted from the Transparency Checklist app 59 ). The completed checklist can be exported to Word or PDF. Editable templates of the flow diagram can also be downloaded from the PRISMA statement website.
We have prepared an updated explanation and elaboration paper, in which we explain why reporting of each item is recommended and present bullet points that detail the reporting recommendations (which we refer to as elements). 41 The bullet-point structure is new to PRISMA 2020 and has been adopted to facilitate implementation of the guidance. 60 61 An expanded checklist, which comprises an abridged version of the elements presented in the explanation and elaboration paper, with references and some examples removed, is available in the data supplement on bmj.com. Consulting the explanation and elaboration paper is recommended if further clarity or information is required.
Journals and publishers might impose word and section limits, and limits on the number of tables and figures allowed in the main report. In such cases, if the relevant information for some items already appears in a publicly accessible review protocol, referring to the protocol may suffice. Alternatively, placing detailed descriptions of the methods used or additional results (such as for less critical outcomes) in supplementary files is recommended. Ideally, supplementary files should be deposited to a general-purpose or institutional open-access repository that provides free and permanent access to the material (such as Open Science Framework, Dryad, figshare). A reference or link to the additional information should be included in the main report. Finally, although PRISMA 2020 provides a template for where information might be located, the suggested location should not be seen as prescriptive; the guiding principle is to ensure the information is reported.
Use of PRISMA 2020 has the potential to benefit many stakeholders. Complete reporting allows readers to assess the appropriateness of the methods, and therefore the trustworthiness of the findings. Presenting and summarising characteristics of studies contributing to a synthesis allows healthcare providers and policy makers to evaluate the applicability of the findings to their setting. Describing the certainty in the body of evidence for an outcome and the implications of findings should help policy makers, managers, and other decision makers formulate appropriate recommendations for practice or policy. Complete reporting of all PRISMA 2020 items also facilitates replication and review updates, as well as inclusion of systematic reviews in overviews (of systematic reviews) and guidelines, so teams can leverage work that is already done and decrease research waste. 36 62 63
We updated the PRISMA 2009 statement by adapting the EQUATOR Network’s guidance for developing health research reporting guidelines. 64 We evaluated the reporting completeness of published systematic reviews, 17 21 36 37 reviewed the items included in other documents providing guidance for systematic reviews, 38 surveyed systematic review methodologists and journal editors for their views on how to revise the original PRISMA statement, 35 discussed the findings at an in-person meeting, and prepared this document through an iterative process. Our recommendations are informed by the reviews and survey conducted before the in-person meeting, theoretical considerations about which items facilitate replication and help users assess the risk of bias and applicability of systematic reviews, and co-authors’ experience with authoring and using systematic reviews.
Various strategies to increase the use of reporting guidelines and improve reporting have been proposed. They include educators introducing reporting guidelines into graduate curricula to promote good reporting habits of early career scientists 65 ; journal editors and regulators endorsing use of reporting guidelines 18 ; peer reviewers evaluating adherence to reporting guidelines 61 66 ; journals requiring authors to indicate where in their manuscript they have adhered to each reporting item 67 ; and authors using online writing tools that prompt complete reporting at the writing stage. 60 Multi-pronged interventions, where more than one of these strategies are combined, may be more effective (such as completion of checklists coupled with editorial checks). 68 However, of 31 interventions proposed to increase adherence to reporting guidelines, the effects of only 11 have been evaluated, mostly in observational studies at high risk of bias due to confounding. 69 It is therefore unclear which strategies should be used. Future research might explore barriers and facilitators to the use of PRISMA 2020 by authors, editors, and peer reviewers, designing interventions that address the identified barriers, and evaluating those interventions using randomised trials. To inform possible revisions to the guideline, it would also be valuable to conduct think-aloud studies 70 to understand how systematic reviewers interpret the items, and reliability studies to identify items where there is varied interpretation of the items.
We encourage readers to submit evidence that informs any of the recommendations in PRISMA 2020 (via the PRISMA statement website: http://www.prisma-statement.org/ ). To enhance accessibility of PRISMA 2020, several translations of the guideline are under way (see available translations at the PRISMA statement website). We encourage journal editors and publishers to raise awareness of PRISMA 2020 (for example, by referring to it in journal “Instructions to authors”), endorsing its use, advising editors and peer reviewers to evaluate submitted systematic reviews against the PRISMA 2020 checklists, and making changes to journal policies to accommodate the new reporting recommendations. We recommend existing PRISMA extensions 47 49 50 51 52 53 71 72 be updated to reflect PRISMA 2020 and advise developers of new PRISMA extensions to use PRISMA 2020 as the foundation document.
We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders. Ultimately, we hope that uptake of the guideline will lead to more transparent, complete, and accurate reporting of systematic reviews, thus facilitating evidence based decision making.
We dedicate this paper to the late Douglas G Altman and Alessandro Liberati, whose contributions were fundamental to the development and implementation of the original PRISMA statement.
We thank the following contributors who completed the survey to inform discussions at the development meeting: Xavier Armoiry, Edoardo Aromataris, Ana Patricia Ayala, Ethan M Balk, Virginia Barbour, Elaine Beller, Jesse A Berlin, Lisa Bero, Zhao-Xiang Bian, Jean Joel Bigna, Ferrán Catalá-López, Anna Chaimani, Mike Clarke, Tammy Clifford, Ioana A Cristea, Miranda Cumpston, Sofia Dias, Corinna Dressler, Ivan D Florez, Joel J Gagnier, Chantelle Garritty, Long Ge, Davina Ghersi, Sean Grant, Gordon Guyatt, Neal R Haddaway, Julian PT Higgins, Sally Hopewell, Brian Hutton, Jamie J Kirkham, Jos Kleijnen, Julia Koricheva, Joey SW Kwong, Toby J Lasserson, Julia H Littell, Yoon K Loke, Malcolm R Macleod, Chris G Maher, Ana Marušic, Dimitris Mavridis, Jessie McGowan, Matthew DF McInnes, Philippa Middleton, Karel G Moons, Zachary Munn, Jane Noyes, Barbara Nußbaumer-Streit, Donald L Patrick, Tatiana Pereira-Cenci, Ba’ Pham, Bob Phillips, Dawid Pieper, Michelle Pollock, Daniel S Quintana, Drummond Rennie, Melissa L Rethlefsen, Hannah R Rothstein, Maroeska M Rovers, Rebecca Ryan, Georgia Salanti, Ian J Saldanha, Margaret Sampson, Nancy Santesso, Rafael Sarkis-Onofre, Jelena Savović, Christopher H Schmid, Kenneth F Schulz, Guido Schwarzer, Beverley J Shea, Paul G Shekelle, Farhad Shokraneh, Mark Simmonds, Nicole Skoetz, Sharon E Straus, Anneliese Synnot, Emily E Tanner-Smith, Brett D Thombs, Hilary Thomson, Alexander Tsertsvadze, Peter Tugwell, Tari Turner, Lesley Uttley, Jeffrey C Valentine, Matt Vassar, Areti Angeliki Veroniki, Meera Viswanathan, Cole Wayant, Paul Whaley, and Kehu Yang. We thank the following contributors who provided feedback on a preliminary version of the PRISMA 2020 checklist: Jo Abbott, Fionn Büttner, Patricia Correia-Santos, Victoria Freeman, Emily A Hennessy, Rakibul Islam, Amalia (Emily) Karahalios, Kasper Krommes, Andreas Lundh, Dafne Port Nascimento, Davina Robson, Catherine Schenck-Yglesias, Mary M Scott, Sarah Tanveer and Pavel Zhelnov. We thank Abigail H Goben, Melissa L Rethlefsen, Tanja Rombey, Anna Scott, and Farhad Shokraneh for their helpful comments on the preprints of the PRISMA 2020 papers. We thank Edoardo Aromataris, Stephanie Chang, Toby Lasserson and David Schriger for their helpful peer review comments on the PRISMA 2020 papers.
Contributors: JEM and DM are joint senior authors. MJP, JEM, PMB, IB, TCH, CDM, LS, and DM conceived this paper and designed the literature review and survey conducted to inform the guideline content. MJP conducted the literature review, administered the survey and analysed the data for both. MJP prepared all materials for the development meeting. MJP and JEM presented proposals at the development meeting. All authors except for TCH, JMT, EAA, SEB, and LAM attended the development meeting. MJP and JEM took and consolidated notes from the development meeting. MJP and JEM led the drafting and editing of the article. JEM, PMB, IB, TCH, LS, JMT, EAA, SEB, RC, JG, AH, TL, EMW, SM, LAM, LAS, JT, ACT, PW, and DM drafted particular sections of the article. All authors were involved in revising the article critically for important intellectual content. All authors approved the final version of the article. MJP is the guarantor of this work. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.
Funding: There was no direct funding for this research. MJP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618) and was previously supported by an Australian National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088535) during the conduct of this research. JEM is supported by an Australian NHMRC Career Development Fellowship (1143429). TCH is supported by an Australian NHMRC Senior Research Fellowship (1154607). JMT is supported by Evidence Partners Inc. JMG is supported by a Tier 1 Canada Research Chair in Health Knowledge Transfer and Uptake. MML is supported by The Ottawa Hospital Anaesthesia Alternate Funds Association and a Faculty of Medicine Junior Research Chair. TL is supported by funding from the National Eye Institute (UG1EY020522), National Institutes of Health, United States. LAM is supported by a National Institute for Health Research Doctoral Research Fellowship (DRF-2018-11-ST2-048). ACT is supported by a Tier 2 Canada Research Chair in Knowledge Synthesis. DM is supported in part by a University Research Chair, University of Ottawa. The funders had no role in considering the study design or in the collection, analysis, interpretation of data, writing of the report, or decision to submit the article for publication.
Competing interests: All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/conflicts-of-interest/ and declare: EL is head of research for the BMJ; MJP is an editorial board member for PLOS Medicine; ACT is an associate editor and MJP, TL, EMW, and DM are editorial board members for the Journal of Clinical Epidemiology; DM and LAS were editors in chief, LS, JMT, and ACT are associate editors, and JG is an editorial board member for Systematic Reviews. None of these authors were involved in the peer review process or decision to publish. TCH has received personal fees from Elsevier outside the submitted work. EMW has received personal fees from the American Journal for Public Health, for which he is the editor for systematic reviews. VW is editor in chief of the Campbell Collaboration, which produces systematic reviews, and co-convenor of the Campbell and Cochrane equity methods group. DM is chair of the EQUATOR Network, IB is adjunct director of the French EQUATOR Centre and TCH is co-director of the Australasian EQUATOR Centre, which advocates for the use of reporting guidelines to improve the quality of reporting in research articles. JMT received salary from Evidence Partners, creator of DistillerSR software for systematic reviews; Evidence Partners was not involved in the design or outcomes of the statement, and the views expressed solely represent those of the author.
Provenance and peer review: Not commissioned; externally peer reviewed.
Patient and public involvement: Patients and the public were not involved in this methodological research. We plan to disseminate the research widely, including to community participants in evidence synthesis organisations.
This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) license, which permits others to distribute, remix, adapt and build upon this work, for commercial use, provided the original work is properly cited. See: http://creativecommons.org/licenses/by/4.0/ .
About Systematic Reviews
PRISMA was designed to guide the transparent reporting of systematic reviews. One of the ways it does this is through the PRISMA statement flow diagram, a 4-stage flowchart that maps out the flow of information through the research process. It illustrates how studies are identified, screened, and included or excluded for the review. With a PRISMA flow diagram, a reader can see how the number of articles first identified through database searches is narrowed down to the collection of studies that are ultimately reviewed. For example, an initial list of 500 records may be narrowed down to 50 titles or abstracts that go forward for review.
After understanding what a PRISMA flow diagram is, it is natural to wonder about its purpose. First, it enhances the value of reviews by guiding authors through the selection of relevant studies and helping them report that process more reliably and explicitly. The flowchart is particularly valued in healthcare systematic reviews, which synthesize the findings of multiple studies to summarize the best available research.
There are two versions of the PRISMA flow diagram, both of which have templates that you can download for easy use. Which one to choose depends on the type of systematic review (new or updated) and the sources used for study identification (ie, whether only databases and trial registers were searched, or gray literature was included as well). Generally, though, these diagrams contain the following parts:
“Records identified” refers to all the titles and abstracts found in your searches. Run searches including all of your terms, subject headings, etc, and apply all the relevant limits. The number of articles that come up for each will be your “records identified,” which you can also note as separate numbers per database.
Before you start screening your articles, you must remove any duplicates from the initial search. The number of removed duplicates is written in “records removed before screening.”
“Records screened” is the number of “records identified” minus the removed duplicates. This refers to all the titles and abstracts that you will screen for relevance to your research question.
After the screening, note the number of articles to be excluded (i.e., those found to be irrelevant to your systematic review) in “records excluded.” You can also record exclusion reasons here, but at this stage it is optional; reasons for exclusion are required later, at the full-text stage.
“Reports sought for retrieval” are those to be obtained in preparation for full-text screening. It is the difference between “records screened” and “records excluded.”
Once you begin retrieving full texts, you may find that some titles are inaccessible. If you have exhausted all tools and resources and are still unable to obtain the full text, add the number of unavailable articles to “reports not retrieved.”
The number of “reports assessed for eligibility” is the difference between “reports sought for retrieval” and “reports not retrieved.” These articles should be assessed to check their eligibility for inclusion in the systematic review.
“Reports excluded” tallies the number of articles that will be excluded after full-text screening due to their ineligibility for inclusion in the systematic review. Articles that are ineligible will meet exclusion criteria such as wrong setting, wrong population, etc. In this section, you must explain why you made the exclusion.
Finally, “studies included in review” is “reports assessed for eligibility” minus “reports excluded”; this is the number to enter in the final box.
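Because each box is derived from the ones above it by simple subtraction, it can help to sanity-check the numbers before drawing the diagram. The short R sketch below shows that arithmetic; all counts are hypothetical and not taken from any particular review.

```r
# Hypothetical counts, for illustration only
records_identified    <- 500  # all records returned by the database searches
duplicates_removed    <- 120  # records removed before screening

records_screened      <- records_identified - duplicates_removed  # 380
records_excluded      <- 300  # excluded at title/abstract screening

reports_sought        <- records_screened - records_excluded      # 80
reports_not_retrieved <- 5    # full texts that could not be obtained

reports_assessed      <- reports_sought - reports_not_retrieved   # 75
reports_excluded      <- 25   # excluded after full-text review, with reasons

studies_included      <- reports_assessed - reports_excluded      # 50
cat("Studies included in review:", studies_included, "\n")
```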
PRISMA makes systematic reviews more organized and transparent through guidance such as its 27-item checklist (check out our PRISMA 2020 Checklist example) and its flow diagram. PRISMA is an important guideline to follow when undertaking a systematic review, whether you’re doing the process manually or using software like DistillerSR to automate some steps.
Lara A. Kahale
1 Cochrane Central Executive, Cochrane, London, St Albans House, 57-59 Haymarket, London, SW1Y 4QX, UK
2 Infectious Disease Epidemiology Group, Weill Cornell Medicine-Qatar, Cornell University, Qatar Foundation - Education City, Doha, Qatar
3 World Health Organization Collaborating Centre for Disease Epidemiology Analytics on HIV/AIDS, Sexually Transmitted Infections, and Viral Hepatitis, Weill Cornell Medicine–Qatar, Cornell University, Qatar Foundation – Education City, Doha, Qatar
4 Clinical Research Institute, American University of Beirut, Beirut, Riad El Solh 1107 2020, Lebanon
5 CIBER Epidemiología y Salud Pública, Madrid, Av. de Monforte de Lemos, 5, 28029, Spain
6 Iberoamerican Cochrane Centre, Sant Pau Biomedical Research Institute, Barcelona, C / Sant Quintí, 77-79 08041, Spain
7 Wolfson Palliative Care Research Centre, Hull York Medical School, University of Hull, Hull, HU6 7RX, UK
8 Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, 1280 Main Street West 2C Area, Canada
9 Department of Medicine, McMaster University, Hamilton, Ontario, 1280 Main Street West 2C Area, Canada
10 Stockholm Environment Institute, Stockholm, Linnégatan, 87D, Sweden
11 Africa Centre for Evidence, University of Johannesburg, Johannesburg, South Africa
12 Leibniz Centre for Agricultural Landscape Research (ZALF), Eberswalder Str. 84, 15374, Müncheberg, Germany
13 Department of Internal Medicine, American University of Beirut, Beirut, Riad El Solh 1107 2020, Lebanon
Underlying data.
All data underlying the results are available as part of the article and no additional source data are required.
Revised. Amendments from version 2.
We have made the following edits from version 2 to version 3:
| Review date | Reviewer name(s) | Version reviewed | Review status |
|---|---|---|---|
| | Lex M. Bouter | | Approved |
| | Sonia Hines | | Approved |
| | Lex M. Bouter | | Approved with Reservations |
Background : While the PRISMA flow diagram is widely used for reporting standard systematic reviews (SRs), it was not designed for capturing the results of continual searches for studies in living systematic reviews (LSRs). The objectives of this study are (1) to assess how published LSRs report on the flow of studies through the different phases of the review for the different updates; (2) to propose an approach to reporting on that flow.
Methods : For objective 1, we identified all LSRs published up to April 2021. We abstracted information regarding their general characteristics and how they reported on search results. For objective 2, we based our proposal for tailored PRISMA approaches on the findings from objective 1, as well as on our experience with conducting Cochrane LSRs.
Results: We identified 279 living publications relating to 76 LSRs. Of the 279 publications, 11% were protocols, 23% were base versions (i.e., the first version), 50% were partial updates (i.e., not including all typical sections of an SR), and 16% were full updates (i.e., including all typical sections of an SR). We identified six ways of reporting the study flow: base separately, each update separately (38%); numbers not reported (32%); latest update separately, all previous versions combined (20%); base separately, all updates combined (7%); latest update version only (3%); all versions combined (0%). We propose recording the results of the searches in detail to keep track of all identified records. For structuring the flow diagram, we propose using one of four approaches.
Conclusion: We identified six ways for reporting the study flow through the different phases of the review for the different update versions. We propose to document in detail the study flow for the different search updates and select one of our four tailored PRISMA diagram approaches to present that study flow.
During the coronavirus disease 2019 (COVID-19) pandemic, health research has proliferated exponentially 1 . Systematic reviews are essential to synthesize the evidence and inform policy and practice. Given the pace of research publication, those reviews need to be kept up to date. Living systematic reviews (LSRs) are an emerging type of systematic review that involves the continual search of the literature and incorporation of relevant new evidence, soon after it becomes available 2 . While many evidence synthesis groups are engaged in conducting LSRs or living network meta-analyses, others have developed living databases or living maps, including resources specific for COVID-19 literature 3 – 17 .
An essential component of systematic reviews is to keep track of and report the number of records captured while searching the scientific literature and the details of the selection process 18 . The PRISMA statement recommends the use of the PRISMA flow diagram to depict the flow of studies through the different phases of the systematic review 19 . While the PRISMA flow diagram is a widely used tool for reporting original systematic reviews, it was not designed to capture the results of the continual searches typically used in LSRs. Hence, it is unclear how authors of LSRs address the issue of presenting the results of these continual searches.
The objectives of this study were (1) to assess how published LSRs report on the flow of studies through the different phases of the review for the different updates; and (2) to propose an approach to documenting and reporting on the flow of studies through the different phases of a LSR, for the different updates.
For objective 1, we collected relevant data as part of a larger methodological survey aiming to assess the methods of conduct and reporting of LSRs. We have described the details of that study in a previously published protocol 20 . Briefly, we identified all living reviews published up to April 2021 available from the following electronic databases: Medline, EMBASE, and the Cochrane Library (see extended data 21 of Khamis et al. 20 for the search strategy). An eligible living review was either (1) a protocol for an LSR, (2) a base version of an LSR, (3) a full update version of an LSR, (4) a partial update version of an LSR, or (5) a combination of any of these (e.g., one living review may consist of a protocol, a base version, and a full update version, while another may consist of only a subset of these). Box 1 gives the definition of each type of living review.
• LSR protocol: the protocol that describes the planned methods of the living review
• Base version: the first version of the review that follows a living approach
• Full update version: a subsequent version of the review that includes all the typical sections of a systematic review, including an introduction, methods, and results sections. Such a version could stand alone in terms of content.
• Partial update version: a subsequent version of the review that does not include all the typical sections of a systematic review, but instead refers to a previous version for complementary information. Such a version could not stand alone in terms of content.
For the current study, we abstracted information about the following features of LSRs:
For objective 2, we based our proposal for tailored PRISMA 2020 flow diagram approaches on the findings from objective 1, on our experience conducting Cochrane LSRs, and on our methodological work on designing and reporting living evidence. Since 2017, our group has been responsible for the first series of three Cochrane LSRs, all of which address anticoagulation in patients with cancer 22 – 24 . We conducted the base search in February 2016 and have been updating the search on a monthly basis since then. Through this experience, we have been able to apply and refine the guidance for conducting LSRs endorsed by the living evidence network group 25 . Specifically, we explored solutions for reporting the study flow that would address different scenarios. Our goal was not to be prescriptive and narrow, but rather to cover all possible resulting flows found in the LSRs identified under objective 1. Two authors developed a draft of the tailored approaches to presenting the study flow and then circulated it to the author team for review and suggestions for improvement.
We used REDCap to collect and manage the data abstraction process. All data were exported from REDCap and analyzed using Stata v. 13 26 , 27 .
Our search identified a total of 279 living publications relating to 76 LSRs. Table 1 shows their general characteristics. Of the 279 living publications, 11% were protocols, 23% were base versions, 50% were partial updates, and 16% were full updates. The median number of living publications per LSR was 2 (interquartile range 1–4). Of the 76 living reviews, 22% were published in the Cochrane library, 63% were related to COVID-19, and 25% had a base version published as a rapid review. The majority addressed clinical topics (70%).
| Characteristic | n (%) |
|---|---|
| Type of living publication (N = 279) | |
| • Protocol | 31 (11.1) |
| • Base version | 64 (22.9) |
| • Partial update version | 138 (49.5) |
| • Full update version | 46 (16.5) |
| Number of living publications per LSR, median (IQR) | 2 (1–4) |
| Published in the Cochrane library | 17 (22.4) |
| Topic | |
| • Clinical | 53 (69.7) |
| • Public health | 20 (26.3) |
| • Health system and policy | 3 (4.0) |
| Related to COVID-19 | 48 (63.2) |
| Base version published as a rapid review | 16 (25.0) |
Abbreviations: LSR: living systematic review; IQR: interquartile range
Table 2 shows the results for the reporting of the study flow. Most base versions and full updates used a flow diagram to report on the search results (96% and 93%, respectively), whereas only one partial update presented a flow diagram. In addition, none of the 279 living publications reported in their methods section how they planned to report on the study flow.
| | n (%) |
|---|---|
| Used a flow diagram to report the search results a | |
| • Base version (N = 64) | 62 (96.9) |
| • Partial update version (N = 138) | 1 (0.7) |
| • Full update version (N = 46) | 43 (93.5) |
| Approach to reporting the study flow among update versions (N = 184) | |
| • Base separately; each update separately | 39 (21.2) |
| • Base separately; all updates combined | 20 (10.9) |
| • Latest update separately; all previous versions combined (including the base) | 24 (13.0) |
| • All versions combined | 12 (6.5) |
| • Latest update version only | 47 (25.5) |
| • Numbers not reported | 42 (22.8) |
a When a flow diagram is not reported, the authors reported on the search results in a narrative format.
Among the 184 update versions ( Figure 1 ):
Using the approach described in the methods section, we developed four approaches that allow authors to document and report the study flow for the different review update versions of an LSR.
1. Documenting LSR study flow
Authors should record the results of the searches in detail to keep track of all identified records. We propose using one spreadsheet per LSR. The format we present consists of tabs for each of the respective search sources: bibliographic databases (e.g., MEDLINE, EMBASE, Cochrane databases); conference proceedings; ongoing studies as captured in ClinicalTrials.gov and the WHO International Clinical Trials Registry Platform (ICTRP); other tabs as needed; and a final ‘cumulative’ tab.
We show in Figure 2 a snapshot of the ‘cumulative’ tab of the spreadsheet that keeps track of all records. It shows the study flow for a hypothetical example for an LSR published first in January 2020 (i.e., base version) and updated on a monthly basis up to August 2020. Each row corresponds to a different update version. The columns present the following information for each update (columns B to E): the number of records received, deduplicated, included at title and abstract screening, and included at full-text screening (i.e., newly included reports). Additional columns (F to I) present the distribution of the newly included reports as relating to either: (1) new studies, (2) previously included studies, (3) ongoing (unpublished) studies, or (4) preprints.
After the information is entered manually in the first five tabs (corresponding to the different search sources), the totals are computed automatically in the ‘cumulative’ tab.
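As a minimal sketch of how such a ‘cumulative’ tab could be reproduced programmatically, per-source tables can be stacked and summed per update version, for example in R with dplyr. The source names, column names, and counts below are hypothetical illustrations, not the authors’ actual template.

```r
library(dplyr)

# Hypothetical per-source tabs: one row per update version
medline <- data.frame(update = c("2020-01 (base)", "2020-02"),
                      received = c(800, 60), deduplicated = c(650, 50),
                      included_title_abstract = c(90, 8), included_full_text = c(20, 2))
embase  <- data.frame(update = c("2020-01 (base)", "2020-02"),
                      received = c(700, 55), deduplicated = c(500, 40),
                      included_title_abstract = c(70, 6), included_full_text = c(15, 1))

# 'Cumulative' tab: totals across all search sources for each update version
cumulative <- bind_rows(medline, embase) |>
  group_by(update) |>
  summarise(across(c(received, deduplicated,
                     included_title_abstract, included_full_text), sum))

print(cumulative)
```

Columns tracking whether newly included reports relate to new studies, previously included studies, ongoing studies, or preprints could be added and aggregated in the same way.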
2. Reporting LSR study flow
The proposed spreadsheet can act as a basis for a tailored PRISMA flow diagram for LSRs. For structuring the flow diagram for LSR, one can select one out of four tailored PRISMA 2020 flow diagram approaches:
In our Cochrane reviews, we applied the second proposal where we present the results for the different searches combined.
This study found that authors of LSRs are not consistent in reporting on the flow of studies through the different phases of the review for the different update versions. Thus, we propose to document in detail the study flow for the different search updates and select one of four tailored PRISMA 2020 flow diagram approaches to present that study flow.
To our knowledge, this is the first methodological survey to assess how LSR authors report on the flow of studies through the different phases of the review for the different update versions of LSRs. In addition, the research expertise on our team covers both the living approach and the regular updating of traditional SRs. We believe that our assessment forms a vital baseline and allows us to propose best practices for visualization options to improve consistency while the production of LSRs is still at a relatively early stage. Indeed, this survey is part of a larger methodological survey aiming to assess the methods of conduct and reporting of LSRs 20 , which will allow us to update our findings in the future.
Authors tend to produce partial updates of LSRs rather than continually updating the full systematic review. This can be a pragmatic approach, particularly in a rapidly growing research field and when the methods do not change from one update to another. The heterogeneity observed in the ways LSR authors report on the study flow is likely explained by the lack of clear guidance on how to do so.
We built our proposal on the PRISMA 2020 flow diagram and provide four approaches tailored to the continual searches used in LSRs. The fourth approach is the closest to the current PRISMA 2020 flow diagram, as it presents the results of the latest update version separately and the results of all previous versions (including the base) combined.
In addition, we proposed three other approaches to give LSR authors and publishing journals options. Authors should choose an approach based on the number of new citations, presentation preferences, and which option they judge provides the greatest transparency in reporting. Whatever approach one decides to follow, for transparency purposes the systematic reviewers should ideally archive previous versions of the flow diagram (e.g., in an appendix). One major challenge will be accommodating a large number of updates in the same diagram; some approaches will work better than others in that case, and advanced information technology solutions may allow a large number of updates to be fitted. A web-based prototype is available that allows readers to explore different reporting options across these four approaches: an R package ( https://github.com/nealhaddaway/livingPRISMAflow ) and a web-based ShinyApp ( https://estech.shinyapps.io/livingprismaflow/ ) were developed that allow users to enter their own data (e.g., from the spreadsheet suggested above) to produce a bespoke flow diagram according to their desired approach, or to create an interactive diagram that lets readers toggle between different versions of the same data 28 .
Advanced information technology can also be used to simplify updating and to track changes in all LSR sections, including the PRISMA diagram. It would be optimal to develop the base version on a platform where all SR and LSR sections are reported as units (i.e., title, authors, background, objectives, inclusion criteria, effect estimate for outcome x). With each update and for every unit, the author can keep the same text (if nothing has changed) or edit it (if something has changed). Each unit can be updated at a different pace based on certain criteria. The edits could be highlighted to visualize the change. For a given section, one would easily have access to the entries in the previous versions and possibly visualize a trend across the different versions (i.e., a cross-sectional view of that specific item). For example, dynamic documents can be developed using ‘R Markdown’, a document preparation system, where static text can be combined with in-line code and ‘code chunks’ that produce instantly updatable documents given a modified input 29 .
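A minimal R Markdown sketch of such a dynamic document might look like the following. The counts and file layout are hypothetical; in practice the values could be read from the tracking spreadsheet described above, so that re-knitting the document after each search update refreshes the text and table automatically.

````markdown
---
title: "Hypothetical LSR update (illustrative only)"
output: html_document
---

```{r setup, include=FALSE}
# Hypothetical flow counts; re-knitting with new values updates the text and table
flow <- data.frame(version = c("Base (Jan 2020)", "Update (Aug 2020)"),
                   new_records = c(1500, 210),
                   new_included_studies = c(12, 3))
```

The latest search update retrieved `r flow$new_records[2]` new records,
of which `r flow$new_included_studies[2]` new studies were included.

```{r flow-table, echo=FALSE}
knitr::kable(flow, caption = "Study flow across update versions")
```
````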
This study is part of a bigger project aiming to develop an extension to the PRISMA 2020 statement for LSRs (see the registration form on the EQUATOR Network website: Equator Network. PRISMA for LSR – Extension of PRISMA 2020 for living systematic reviews. 2021; accessed from https://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-systematic-reviews/#LSR ). This project will pilot the proposed approaches for documenting the study flow and for structuring the living flow diagram. In addition, qualitative studies would be helpful to explore: (1) the feasibility and acceptability of the proposal among LSR authors, publishers, and users; and (2) what end-users would like to see in an LSR update.
LSR authors are not consistent in reporting the flow of studies through the different phases of the review for the different update versions. We propose documenting in detail the study flow for the different search updates. Authors can select one of our four tailored PRISMA 2020 flow diagram approaches to present that study flow until detailed guidance becomes available. Improving the reporting of study flow in LSR methodology is essential for incorporating living evidence when developing living guidance, particularly in the context of an urgent response 30 , 31 .
Acknowledgment.
We would like to acknowledge Dr. Mathew Page for his revision of the manuscript.
[version 3; peer review: 2 approved]
The author(s) declared that no grants were involved in supporting this work.
Lex M. Bouter
1 Department of Epidemiology and Data Science, Amsterdam Universities Medical Center, Amsterdam, The Netherlands
2 Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, The Netherlands
I applaud the authors for updating their survey and for more than doubling the sample size. They also made some other - rather marginal - changes and explain that their publication is only the first of a series based on an ongoing project.
Is the work clearly and accurately presented and does it cite the current literature?
If applicable, is the statistical analysis and its interpretation appropriate?
Are all the source data underlying the results available to ensure full reproducibility?
Is the study design appropriate and is the work technically sound?
Are the conclusions drawn adequately supported by the results?
Are sufficient details of methods and analysis provided to allow replication by others?
Reviewer Expertise:
Methodology, epidemiology, research integrity, open science, systematic review methods.
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.
Sonia Hines
1 College of Medicine and Public Health, Flinders University, Alice Springs, Northern Territory, Australia
This is an interesting proposal to solve the problem of study flow reporting in living systematic reviews (LSRs). As LSRs increase in number, the methodology and reporting requirements need to be well described and usable.
I am not sure these particular suggested approaches are the most practical. Spreadsheets are not easily inserted into publications, but it is a worthwhile question to be asking, and working from the existing publications is a good starting point.
The authors have done what they set out to do, but I suggest further work is needed before a recommended method of study flow reporting is settled upon.
Systematic reviews, evidence-based practice.
American University of Beirut, Lebanon
The Reviewers’ comments are in bold font and our replies in regular font. Extracts from the text are in italic fonts with changes underlined . We have indicated the sections where revisions have been made in our manuscript.
Reviewer 2: Sonia Hines
1. This is an interesting proposal to solve the problem of study flow reporting in living systematic reviews (LSRs). As LSRs increase in number, the methodology and reporting requirements need to be well described and usable. I am not sure these particular suggested approaches are the most practical. Spreadsheets are not easily inserted into publications, but it is a worthwhile question to be asking, and working from the existing publications is a good starting point.
2. The authors have done what they set out to do, but I suggest further work is needed before a recommended method of study flow reporting is settled upon.
Living Systematic Reviews (LSRs) are updated as new evidence becomes available and gained popularity during the COVID-19 pandemic. This manuscript describes the way PRISMA 2020 flow diagrams are handled in 32 LSRs with a view to recommending how this can best be done. The topic is relevant, albeit a bit narrow, and the manuscript is written clearly. However, there are some issues that should be solved in the next version of the manuscript.
Major issues
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.
Reviewer 1: Lex Bouter
1. It’s disappointing that no clear recommendation but four alternative recommendations are given without any guidance which one to select when. That sounds a bit like ‘anything goes’. The reduction from the six approaches found in the LSRs to date to the four recommended is not very impressive. I was also surprised that the recommendations were solely based on the experience of the authors. Why is no attempt made to consult survey methodologists and end-users of LSRs, e.g. by performing a Delphi study? Also Cochrane Methods Groups and the editors of the Cochrane Handbook seem not to have been approached with a request to state their view on the issue.
2. The findings presented are part of a larger project on the methods of LSRs about which near to nothing is said in the manuscript. That makes one wonder whether this is not too small a part of the harvest to be optimally useful. Please explain why this element on flow charts is separated from the rest.
3. The data set is quite small: 32 LSRs of which 8 are only available as study protocol, 12 have only one (base) version, and 12 have one or more updates. Why is no indication provided of the corresponding imprecision, e.g. by presenting 95% confidence intervals?
4. The bottom half of table 2 presents how the study flow is reported among the 12 LSRs that got at least one update. I recommend to do this for all 32 LSRs included, assuming that when no update is yet reported the envisioned handling of flow charts should be specified in either the review protocol or the base version of the review .
Systematic Reviews, volume 10, Article number 89 (2021)
An Editorial to this article was published on 19 April 2021
The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews. In order to encourage its wide dissemination this article is freely accessible on BMJ, PLOS Medicine, Journal of Clinical Epidemiology and International Journal of Surgery journal websites.
Systematic reviews serve many critical roles. They can provide syntheses of the state of knowledge in a field, from which future research priorities can be identified; they can address questions that otherwise could not be answered by individual studies; they can identify problems in primary research that should be rectified in future studies; and they can generate or evaluate theories about how or why phenomena occur. Systematic reviews therefore generate various types of knowledge for different users of reviews (such as patients, healthcare providers, researchers, and policy makers) [ 1 , 2 ]. To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did (such as how studies were identified and selected) and what they found (such as characteristics of contributing studies and results of meta-analyses). Up-to-date reporting guidance facilitates authors achieving this [ 3 ].
The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement published in 2009 (hereafter referred to as PRISMA 2009) [ 4 , 5 , 6 , 7 , 8 , 9 , 10 ] is a reporting guideline designed to address poor reporting of systematic reviews [ 11 ]. The PRISMA 2009 statement comprised a checklist of 27 items recommended for reporting in systematic reviews and an “explanation and elaboration” paper [ 12 , 13 , 14 , 15 , 16 ] providing additional reporting guidance for each item, along with exemplars of reporting. The recommendations have been widely endorsed and adopted, as evidenced by its co-publication in multiple journals, citation in over 60,000 reports (Scopus, August 2020), endorsement from almost 200 journals and systematic review organisations, and adoption in various disciplines. Evidence from observational studies suggests that use of the PRISMA 2009 statement is associated with more complete reporting of systematic reviews [ 17 , 18 , 19 , 20 ], although more could be done to improve adherence to the guideline [ 21 ].
Many innovations in the conduct of systematic reviews have occurred since publication of the PRISMA 2009 statement. For example, technological advances have enabled the use of natural language processing and machine learning to identify relevant evidence [ 22 , 23 , 24 ], methods have been proposed to synthesise and present findings when meta-analysis is not possible or appropriate [ 25 , 26 , 27 ], and new methods have been developed to assess the risk of bias in results of included studies [ 28 , 29 ]. Evidence on sources of bias in systematic reviews has accrued, culminating in the development of new tools to appraise the conduct of systematic reviews [ 30 , 31 ]. Terminology used to describe particular review processes has also evolved, as in the shift from assessing “quality” to assessing “certainty” in the body of evidence [ 32 ]. In addition, the publishing landscape has transformed, with multiple avenues now available for registering and disseminating systematic review protocols [ 33 , 34 ], disseminating reports of systematic reviews, and sharing data and materials, such as preprint servers and publicly accessible repositories. To capture these advances in the reporting of systematic reviews necessitated an update to the PRISMA 2009 statement.
• To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did, and what they found
• The PRISMA 2020 statement provides updated reporting guidance for systematic reviews that reflects advances in methods to identify, select, appraise, and synthesise studies
• The PRISMA 2020 statement consists of a 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and revised flow diagrams for original and updated reviews
• We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders
A complete description of the methods used to develop PRISMA 2020 is available elsewhere [ 35 ]. We identified PRISMA 2009 items that were often reported incompletely by examining the results of studies investigating the transparency of reporting of published reviews [ 17 , 21 , 36 , 37 ]. We identified possible modifications to the PRISMA 2009 statement by reviewing 60 documents providing reporting guidance for systematic reviews (including reporting guidelines, handbooks, tools, and meta-research studies) [ 38 ]. These reviews of the literature were used to inform the content of a survey with suggested possible modifications to the 27 items in PRISMA 2009 and possible additional items. Respondents were asked whether they believed we should keep each PRISMA 2009 item as is, modify it, or remove it, and whether we should add each additional item. Systematic review methodologists and journal editors were invited to complete the online survey (110 of 220 invited responded). We discussed proposed content and wording of the PRISMA 2020 statement, as informed by the review and survey results, at a 21-member, two-day, in-person meeting in September 2018 in Edinburgh, Scotland. Throughout 2019 and 2020, we circulated an initial draft and five revisions of the checklist and explanation and elaboration paper to co-authors for feedback. In April 2020, we invited 22 systematic reviewers who had expressed interest in providing feedback on the PRISMA 2020 checklist to share their views (via an online survey) on the layout and terminology used in a preliminary version of the checklist. Feedback was received from 15 individuals and considered by the first author, and any revisions deemed necessary were incorporated before the final version was approved and endorsed by all co-authors.
Scope of the guideline.
The PRISMA 2020 statement has been designed primarily for systematic reviews of studies that evaluate the effects of health interventions, irrespective of the design of the included studies. However, the checklist items are applicable to reports of systematic reviews evaluating other interventions (such as social or educational interventions), and many items are applicable to systematic reviews with objectives other than evaluating interventions (such as evaluating aetiology, prevalence, or prognosis). PRISMA 2020 is intended for use in systematic reviews that include synthesis (such as pairwise meta-analysis or other statistical synthesis methods) or do not include synthesis (for example, because only one eligible study is identified). The PRISMA 2020 items are relevant for mixed-methods systematic reviews (which include quantitative and qualitative studies), but reporting guidelines addressing the presentation and synthesis of qualitative data should also be consulted [ 39 , 40 ]. PRISMA 2020 can be used for original systematic reviews, updated systematic reviews, or continually updated (“living”) systematic reviews. However, for updated and living systematic reviews, there may be some additional considerations that need to be addressed. Where there is relevant content from other reporting guidelines, we reference these guidelines within the items in the explanation and elaboration paper [ 41 ] (such as PRISMA-Search [ 42 ] in items 6 and 7, Synthesis without meta-analysis (SWiM) reporting guideline [ 27 ] in item 13d). Box 1 includes a glossary of terms used throughout the PRISMA 2020 statement.
PRISMA 2020 is not intended to guide systematic review conduct, for which comprehensive resources are available [ 43 , 44 , 45 , 46 ]. However, familiarity with PRISMA 2020 is useful when planning and conducting systematic reviews to ensure that all recommended information is captured. PRISMA 2020 should not be used to assess the conduct or methodological quality of systematic reviews; other tools exist for this purpose [ 30 , 31 ]. Furthermore, PRISMA 2020 is not intended to inform the reporting of systematic review protocols, for which a separate statement is available (PRISMA for Protocols (PRISMA-P) 2015 statement [ 47 , 48 ]). Finally, extensions to the PRISMA 2009 statement have been developed to guide reporting of network meta-analyses [ 49 ], meta-analyses of individual participant data [ 50 ], systematic reviews of harms [ 51 ], systematic reviews of diagnostic test accuracy studies [ 52 ], and scoping reviews [ 53 ]; for these types of reviews we recommend authors report their review in accordance with the recommendations in PRISMA 2020 along with the guidance specific to the extension.
The PRISMA 2020 statement (including the checklists, explanation and elaboration, and flow diagram) replaces the PRISMA 2009 statement, which should no longer be used. Box 2 summarises noteworthy changes from the PRISMA 2009 statement. The PRISMA 2020 checklist includes seven sections with 27 items, some of which include sub-items (Table 1 ). A checklist for journal and conference abstracts for systematic reviews is included in PRISMA 2020. This abstract checklist is an update of the 2013 PRISMA for Abstracts statement [ 54 ], reflecting new and modified content in PRISMA 2020 (Table 2 ). A template PRISMA flow diagram is provided, which can be modified depending on whether the systematic review is original or updated (Fig. 1 ).
PRISMA 2020 flow diagram template for systematic reviews. The new design is adapted from flow diagrams proposed by Boers [ 55 ], Mayo-Wilson et al. [ 56 ] and Stovold et al. [ 57 ] The boxes in grey should only be completed if applicable; otherwise they should be removed from the flow diagram. Note that a “report” could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report or any other document providing relevant information
We recommend authors refer to PRISMA 2020 early in the writing process, because prospective consideration of the items may help to ensure that all the items are addressed. To help keep track of which items have been reported, the PRISMA statement website ( http://www.prisma-statement.org/ ) includes fillable templates of the checklists to download and complete (also available in Additional file 1 ). We have also created a web application that allows users to complete the checklist via a user-friendly interface [ 58 ] (available at https://prisma.shinyapps.io/checklist/ and adapted from the Transparency Checklist app [ 59 ]). The completed checklist can be exported to Word or PDF. Editable templates of the flow diagram can also be downloaded from the PRISMA statement website.
We have prepared an updated explanation and elaboration paper, in which we explain why reporting of each item is recommended and present bullet points that detail the reporting recommendations (which we refer to as elements) [ 41 ]. The bullet-point structure is new to PRISMA 2020 and has been adopted to facilitate implementation of the guidance [ 60 , 61 ]. An expanded checklist, which comprises an abridged version of the elements presented in the explanation and elaboration paper, with references and some examples removed, is available in Additional file 2 . Consulting the explanation and elaboration paper is recommended if further clarity or information is required.
Journals and publishers might impose word and section limits, and limits on the number of tables and figures allowed in the main report. In such cases, if the relevant information for some items already appears in a publicly accessible review protocol, referring to the protocol may suffice. Alternatively, placing detailed descriptions of the methods used or additional results (such as for less critical outcomes) in supplementary files is recommended. Ideally, supplementary files should be deposited to a general-purpose or institutional open-access repository that provides free and permanent access to the material (such as Open Science Framework, Dryad, figshare). A reference or link to the additional information should be included in the main report. Finally, although PRISMA 2020 provides a template for where information might be located, the suggested location should not be seen as prescriptive; the guiding principle is to ensure the information is reported.
Use of PRISMA 2020 has the potential to benefit many stakeholders. Complete reporting allows readers to assess the appropriateness of the methods, and therefore the trustworthiness of the findings. Presenting and summarising characteristics of studies contributing to a synthesis allows healthcare providers and policy makers to evaluate the applicability of the findings to their setting. Describing the certainty in the body of evidence for an outcome and the implications of findings should help policy makers, managers, and other decision makers formulate appropriate recommendations for practice or policy. Complete reporting of all PRISMA 2020 items also facilitates replication and review updates, as well as inclusion of systematic reviews in overviews (of systematic reviews) and guidelines, so teams can leverage work that is already done and decrease research waste [ 36 , 62 , 63 ].
We updated the PRISMA 2009 statement by adapting the EQUATOR Network’s guidance for developing health research reporting guidelines [ 64 ]. We evaluated the reporting completeness of published systematic reviews [ 17 , 21 , 36 , 37 ], reviewed the items included in other documents providing guidance for systematic reviews [ 38 ], surveyed systematic review methodologists and journal editors for their views on how to revise the original PRISMA statement [ 35 ], discussed the findings at an in-person meeting, and prepared this document through an iterative process. Our recommendations are informed by the reviews and survey conducted before the in-person meeting, theoretical considerations about which items facilitate replication and help users assess the risk of bias and applicability of systematic reviews, and co-authors’ experience with authoring and using systematic reviews.
Various strategies to increase the use of reporting guidelines and improve reporting have been proposed. They include educators introducing reporting guidelines into graduate curricula to promote good reporting habits of early career scientists [ 65 ]; journal editors and regulators endorsing use of reporting guidelines [ 18 ]; peer reviewers evaluating adherence to reporting guidelines [ 61 , 66 ]; journals requiring authors to indicate where in their manuscript they have adhered to each reporting item [ 67 ]; and authors using online writing tools that prompt complete reporting at the writing stage [ 60 ]. Multi-pronged interventions, where more than one of these strategies are combined, may be more effective (such as completion of checklists coupled with editorial checks) [ 68 ]. However, of 31 interventions proposed to increase adherence to reporting guidelines, the effects of only 11 have been evaluated, mostly in observational studies at high risk of bias due to confounding [ 69 ]. It is therefore unclear which strategies should be used. Future research might explore barriers and facilitators to the use of PRISMA 2020 by authors, editors, and peer reviewers, designing interventions that address the identified barriers, and evaluating those interventions using randomised trials. To inform possible revisions to the guideline, it would also be valuable to conduct think-aloud studies [ 70 ] to understand how systematic reviewers interpret the items, and reliability studies to identify items where there is varied interpretation of the items.
We encourage readers to submit evidence that informs any of the recommendations in PRISMA 2020 (via the PRISMA statement website: http://www.prisma-statement.org/ ). To enhance accessibility of PRISMA 2020, several translations of the guideline are under way (see available translations at the PRISMA statement website). We encourage journal editors and publishers to raise awareness of PRISMA 2020 (for example, by referring to it in journal “Instructions to authors”), endorsing its use, advising editors and peer reviewers to evaluate submitted systematic reviews against the PRISMA 2020 checklists, and making changes to journal policies to accommodate the new reporting recommendations. We recommend existing PRISMA extensions [ 47 , 49 , 50 , 51 , 52 , 53 , 71 , 72 ] be updated to reflect PRISMA 2020 and advise developers of new PRISMA extensions to use PRISMA 2020 as the foundation document.
We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders. Ultimately, we hope that uptake of the guideline will lead to more transparent, complete, and accurate reporting of systematic reviews, thus facilitating evidence based decision making.
Systematic review —A review that uses explicit, systematic methods to collate and synthesise findings of studies that address a clearly formulated question [ 43 ]
Statistical synthesis —The combination of quantitative results of two or more studies. This encompasses meta-analysis of effect estimates (described below) and other methods, such as combining P values, calculating the range and distribution of observed effects, and vote counting based on the direction of effect (see McKenzie and Brennan [ 25 ] for a description of each method)
Meta-analysis of effect estimates —A statistical technique used to synthesise results when study effect estimates and their variances are available, yielding a quantitative summary of results [ 25 ]
Outcome —An event or measurement collected for participants in a study (such as quality of life, mortality)
Result —The combination of a point estimate (such as a mean difference, risk ratio, or proportion) and a measure of its precision (such as a confidence/credible interval) for a particular outcome
Report —A document (paper or electronic) supplying information about a particular study. It could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information
Record —The title or abstract (or both) of a report indexed in a database or website (such as a title or abstract for an article indexed in Medline). Records that refer to the same report (such as the same journal article) are “duplicates”; however, records that refer to reports that are merely similar (such as a similar abstract submitted to two different conferences) should be considered unique.
Study —An investigation, such as a clinical trial, that includes a defined group of participants and one or more interventions and outcomes. A “study” might have multiple reports. For example, reports could include the protocol, statistical analysis plan, baseline characteristics, results for the primary outcome, results for harms, results for secondary outcomes, and results for additional mediator and moderator analyses
• Inclusion of the abstract reporting checklist within PRISMA 2020 (see item #2 and Box 2 ).
• Movement of the ‘Protocol and registration’ item from the start of the Methods section of the checklist to a new Other section, with addition of a sub-item recommending authors describe amendments to information provided at registration or in the protocol (see item #24a-24c).
• Modification of the ‘Search’ item to recommend authors present full search strategies for all databases, registers and websites searched, not just at least one database (see item #7).
• Modification of the ‘Study selection’ item in the Methods section to emphasise the reporting of how many reviewers screened each record and each report retrieved, whether they worked independently, and if applicable, details of automation tools used in the process (see item #8).
• Addition of a sub-item to the ‘Data items’ item recommending authors report how outcomes were defined, which results were sought, and methods for selecting a subset of results from included studies (see item #10a).
• Splitting of the ‘Synthesis of results’ item in the Methods section into six sub-items recommending authors describe: the processes used to decide which studies were eligible for each synthesis; any methods required to prepare the data for synthesis; any methods used to tabulate or visually display results of individual studies and syntheses; any methods used to synthesise results; any methods used to explore possible causes of heterogeneity among study results (such as subgroup analysis, meta-regression); and any sensitivity analyses used to assess robustness of the synthesised results (see item #13a-13f).
• Addition of a sub-item to the ‘Study selection’ item in the Results section recommending authors cite studies that might appear to meet the inclusion criteria, but which were excluded, and explain why they were excluded (see item #16b).
• Splitting of the ‘Synthesis of results’ item in the Results section into four sub-items recommending authors: briefly summarise the characteristics and risk of bias among studies contributing to the synthesis; present results of all statistical syntheses conducted; present results of any investigations of possible causes of heterogeneity among study results; and present results of any sensitivity analyses (see item #20a-20d).
• Addition of new items recommending authors report methods for and results of an assessment of certainty (or confidence) in the body of evidence for an outcome (see items #15 and #22).
• Addition of a new item recommending authors declare any competing interests (see item #26).
• Addition of a new item recommending authors indicate whether data, analytic code and other materials used in the review are publicly available and if so, where they can be found (see item #27).
Gurevitch J, Koricheva J, Nakagawa S, Stewart G. Meta-analysis and the science of research synthesis. Nature. 2018;555:175–82. https://doi.org/10.1038/nature25753 .
Gough D, Thomas J, Oliver S. Clarifying differences between reviews within evidence ecosystems. Syst Rev. 2019;8:170. https://doi.org/10.1186/s13643-019-1089-2 .
Moher D. Reporting guidelines: doing better for readers. BMC Med. 2018;16:233. https://doi.org/10.1186/s12916-018-1226-0 .
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151:264–9, W64. https://doi.org/10.7326/0003-4819-151-4-200908180-00135 .
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339:b2535. https://doi.org/10.1136/bmj.b2535 .
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6:e1000097. https://doi.org/10.1371/journal.pmed.1000097 .
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. J Clin Epidemiol. 2009;62:1006–12. https://doi.org/10.1016/j.jclinepi.2009.06.005 .
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Int J Surg. 2010;8:336–41. https://doi.org/10.1016/j.ijsu.2010.02.007 .
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Open Med. 2009;3:e123–30.
Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Reprint--preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Phys Ther. 2009;89:873–80. https://doi.org/10.1093/ptj/89.9.873 .
Moher D, Tetzlaff J, Tricco AC, Sampson M, Altman DG. Epidemiology and reporting characteristics of systematic reviews. PLoS Med. 2007;4:e78. https://doi.org/10.1371/journal.pmed.0040078 .
Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol. 2009;62:e1–34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .
Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ. 2009;339:b2700. https://doi.org/10.1136/bmj.b2700 .
Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med. 2009;151:W65–94. https://doi.org/10.7326/0003-4819-151-4-200908180-00136 .
Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6:e1000100. https://doi.org/10.1371/journal.pmed.1000100 .
Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6:e1000100. https://doi.org/10.1371/journal.pmed.1000100 .
Page MJ, Shamseer L, Altman DG, et al. Epidemiology and reporting characteristics of systematic reviews of biomedical research: a cross-sectional study. PLoS Med. 2016;13:e1002028. https://doi.org/10.1371/journal.pmed.1002028 .
Panic N, Leoncini E, de Belvis G, Ricciardi W, Boccia S. Evaluation of the endorsement of the preferred reporting items for systematic reviews and meta-analysis (PRISMA) statement on the quality of published systematic review and meta-analyses. PLoS One. 2013;8:e83138. https://doi.org/10.1371/journal.pone.0083138 .
Agha RA, Fowler AJ, Limb C, et al. Impact of the mandatory implementation of reporting guidelines on reporting quality in a surgical journal: a before and after study. Int J Surg. 2016;30:169–72. https://doi.org/10.1016/j.ijsu.2016.04.032 .
Leclercq V, Beaudart C, Ajamieh S, Rabenda V, Tirelli E, Bruyère O. Meta-analyses indexed in PsycINFO had a better completeness of reporting when they mention PRISMA. J Clin Epidemiol. 2019;115:46–54. https://doi.org/10.1016/j.jclinepi.2019.06.014 .
Page MJ, Moher D. Evaluations of the uptake and impact of the preferred reporting items for systematic reviews and meta-analyses (PRISMA) statement and extensions: a scoping review. Syst Rev. 2017;6:263. https://doi.org/10.1186/s13643-017-0663-8 .
O’Mara-Eves A, Thomas J, McNaught J, Miwa M, Ananiadou S. Using text mining for study identification in systematic reviews: a systematic review of current approaches. Syst Rev. 2015;4:5. https://doi.org/10.1186/2046-4053-4-5 .
Marshall IJ, Noel-Storr A, Kuiper J, Thomas J, Wallace BC. Machine learning for identifying randomized controlled trials: an evaluation and practitioner’s guide. Res Synth Methods. 2018;9:602–14. https://doi.org/10.1002/jrsm.1287 .
Marshall IJ, Wallace BC. Toward systematic review automation: a practical guide to using machine learning tools in research synthesis. Syst Rev. 2019;8:163. https://doi.org/10.1186/s13643-019-1074-9 .
McKenzie JE, Brennan SE. Synthesizing and presenting findings using other methods. In: Higgins JPT, Thomas J, Chandler J, et al., editors. Cochrane handbook for systematic reviews of interventions. London: Cochrane; 2019. https://doi.org/10.1002/9781119536604.ch12 .
Higgins JPT, López-López JA, Becker BJ, et al. Synthesising quantitative evidence in systematic reviews of complex health interventions. BMJ Glob Health. 2019;4(Suppl 1):e000858. https://doi.org/10.1136/bmjgh-2018-000858 .
Campbell M, McKenzie JE, Sowden A, et al. Synthesis without meta-analysis (SWiM) in systematic reviews: reporting guideline. BMJ. 2020;368:l6890. https://doi.org/10.1136/bmj.l6890 .
Sterne JAC, Savović J, Page MJ, et al. RoB 2: a revised tool for assessing risk of bias in randomised trials. BMJ. 2019;366:l4898. https://doi.org/10.1136/bmj.l4898 .
Sterne JA, Hernán MA, Reeves BC, et al. ROBINS-I: a tool for assessing risk of bias in non-randomised studies of interventions. BMJ. 2016;355:i4919. https://doi.org/10.1136/bmj.i4919 .
Whiting P, Savović J, Higgins JP, ROBIS group, et al. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016;69:225–34. https://doi.org/10.1016/j.jclinepi.2015.06.005 .
Shea BJ, Reeves BC, Wells G, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017;358:j4008. https://doi.org/10.1136/bmj.j4008 .
Hultcrantz M, Rind D, Akl EA, et al. The GRADE working group clarifies the construct of certainty of evidence. J Clin Epidemiol. 2017;87:4–13. https://doi.org/10.1016/j.jclinepi.2017.05.006 .
Booth A, Clarke M, Dooley G, et al. The nuts and bolts of PROSPERO: an international prospective register of systematic reviews. Syst Rev. 2012;1:2. https://doi.org/10.1186/2046-4053-1-2 .
Moher D, Stewart L, Shekelle P. Establishing a new journal for systematic review products. Syst Rev. 2012;1:1. https://doi.org/10.1186/2046-4053-1-1 .
Page MJ, McKenzie JE, Bossuyt PM, et al. Updating guidance for reporting systematic reviews: development of the PRISMA 2020 statement. J Clin Epidemiol. 2021;134:103–12. https://doi.org/10.1016/j.jclinepi.2021.02.003 .
Page MJ, Altman DG, Shamseer L, et al. Reproducible research practices are underused in systematic reviews of biomedical interventions. J Clin Epidemiol. 2018;94:8–18. https://doi.org/10.1016/j.jclinepi.2017.10.017 .
Page MJ, Altman DG, McKenzie JE, et al. Flaws in the application and interpretation of statistical analyses in systematic reviews of therapeutic interventions were common: a cross-sectional analysis. J Clin Epidemiol. 2018;95:7–18. https://doi.org/10.1016/j.jclinepi.2017.11.022 .
Page MJ, McKenzie JE, Bossuyt PM, et al. Mapping of reporting guidance for systematic reviews and meta-analyses generated a comprehensive item bank for future reporting guidelines. J Clin Epidemiol. 2020;118:60–8. https://doi.org/10.1016/j.jclinepi.2019.11.010 .
Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12:181. https://doi.org/10.1186/1471-2288-12-181 .
France EF, Cunningham M, Ring N, et al. Improving reporting of meta-ethnography: the eMERGe reporting guidance. BMC Med Res Methodol. 2019;19:25. https://doi.org/10.1186/s12874-018-0600-0 .
Page MJ, Moher D, Bossuyt PM, et al. PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ. 2021;372:n160. https://doi.org/10.1136/bmj.n160 .
Rethlefsen ML, Kirtley S, Waffenschmidt S, et al.; PRISMA-S Group. PRISMA-S: an extension to the PRISMA statement for reporting literature searches in systematic reviews. Syst Rev. 2021;10:39. https://doi.org/10.1186/s13643-020-01542-z .
Higgins JPT, Thomas J, Chandler J, et al. Cochrane handbook for systematic reviews of interventions: version 6.0. London: Cochrane; 2019. Available from https://training.cochrane.org/handbook
Dekkers OM, Vandenbroucke JP, Cevallos M, Renehan AG, Altman DG, Egger M. COSMOS-E: guidance on conducting systematic reviews and meta-analyses of observational studies of etiology. PLoS Med. 2019;16:e1002742. https://doi.org/10.1371/journal.pmed.1002742 .
Cooper H, Hedges LV, Valentine JV. The handbook of research synthesis and meta-analysis. New York: Russell Sage Foundation; 2019.
IOM (Institute of Medicine). Finding what works in health care: standards for systematic reviews. Washington, D.C.: The National Academies Press; 2011.
Moher D, Shamseer L, Clarke M, PRISMA-P Group, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1. https://doi.org/10.1186/2046-4053-4-1 .
Shamseer L, Moher D, Clarke M, PRISMA-P Group, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;350:g7647. https://doi.org/10.1136/bmj.g7647 .
Hutton B, Salanti G, Caldwell DM, et al. The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Ann Intern Med. 2015;162:777–84. https://doi.org/10.7326/M14-2385 .
Stewart LA, Clarke M, Rovers M, PRISMA-IPD Development Group, et al. Preferred reporting items for systematic review and meta-analyses of individual participant data: the PRISMA-IPD statement. JAMA. 2015;313:1657–65. https://doi.org/10.1001/jama.2015.3656 .
Zorzela L, Loke YK, Ioannidis JP, et al.; PRISMA Harms Group. PRISMA harms checklist: improving harms reporting in systematic reviews. BMJ. 2016;352:i157. https://doi.org/10.1136/bmj.i157 .
McInnes MDF, Moher D, Thombs BD, the PRISMA-DTA Group, et al. Preferred reporting items for a systematic review and meta-analysis of diagnostic test accuracy studies: the PRISMA-DTA statement. JAMA. 2018;319:388–96. https://doi.org/10.1001/jama.2017.19163 .
Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-SCR): checklist and explanation. Ann Intern Med. 2018;169:467–73. https://doi.org/10.7326/M18-0850 .
Beller EM, Glasziou PP, Altman DG, et al.; PRISMA for Abstracts Group. PRISMA for Abstracts: reporting systematic reviews in journal and conference abstracts. PLoS Med. 2013;10:e1001419. https://doi.org/10.1371/journal.pmed.1001419 .
Boers M. Graphics and statistics for cardiology: designing effective tables for presentation and publication. Heart. 2018;104:192–200. https://doi.org/10.1136/heartjnl-2017-311581 .
Mayo-Wilson E, Li T, Fusco N, Dickersin K, MUDS investigators. Practical guidance for using multiple data sources in systematic reviews and meta-analyses (with examples from the MUDS study). Res Synth Methods. 2018;9:2–12. https://doi.org/10.1002/jrsm.1277 .
Stovold E, Beecher D, Foxlee R, Noel-Storr A. Study flow diagrams in Cochrane systematic review updates: an adapted PRISMA flow diagram. Syst Rev. 2014;3:54. https://doi.org/10.1186/2046-4053-3-54 .
McGuinness LA. mcguinlu/PRISMA-Checklist: Initial release for manuscript submission (Version v1.0.0). Geneva: Zenodo; 2020. https://doi.org/10.5281/zenodo.3994319 .
Aczel B, Szaszi B, Sarafoglou A, et al. A consensus-based transparency checklist. Nat Hum Behav. 2020;4:4–6. https://doi.org/10.1038/s41562-019-0772-6 .
Barnes C, Boutron I, Giraudeau B, Porcher R, Altman DG, Ravaud P. Impact of an online writing aid tool for writing a randomized trial report: the COBWEB (Consort-based WEB tool) randomized controlled trial. BMC Med. 2015;13:221. https://doi.org/10.1186/s12916-015-0460-y .
Chauvin A, Ravaud P, Moher D, et al. Accuracy in detecting inadequate research reporting by early career peer reviewers using an online CONSORT-based peer-review tool (COBPeer) versus the usual peer-review process: a cross-sectional diagnostic study. BMC Med. 2019;17:205. https://doi.org/10.1186/s12916-019-1436-0 .
Wayant C, Page MJ, Vassar M. Evaluation of reproducible research practices in oncology systematic reviews with meta-analyses referenced by national comprehensive cancer network guidelines. JAMA Oncol. 2019;5:1550–5. https://doi.org/10.1001/jamaoncol.2019.2564 .
McKenzie JE, Brennan SE. Overviews of systematic reviews: great promise, greater challenge. Syst Rev. 2017;6:185. https://doi.org/10.1186/s13643-017-0582-8 .
Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7:e1000217. https://doi.org/10.1371/journal.pmed.1000217 .
Simera I, Moher D, Hirst A, Hoey J, Schulz KF, Altman DG. Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network. BMC Med. 2010;8:24. https://doi.org/10.1186/1741-7015-8-24 .
Speich B, Schroter S, Briel M, et al. Impact of a short version of the CONSORT checklist for peer reviewers to improve the reporting of randomised controlled trials published in biomedical journals: study protocol for a randomised controlled trial. BMJ Open. 2020;10:e035114. https://doi.org/10.1136/bmjopen-2019-035114 .
Stevens A, Shamseer L, Weinstein E, et al. Relation of completeness of reporting of health research to journals’ endorsement of reporting guidelines: systematic review. BMJ. 2014;348:g3804. https://doi.org/10.1136/bmj.g3804 .
Hair K, Macleod MR, Sena ES, IICARus Collaboration. A randomised controlled trial of an Intervention to Improve Compliance with the ARRIVE guidelines (IICARus). Res Integr Peer Rev. 2019;4:12. https://doi.org/10.1186/s41073-019-0069-3 .
Blanco D, Altman D, Moher D, Boutron I, Kirkham JJ, Cobo E. Scoping review on interventions to improve adherence to reporting guidelines in health research. BMJ Open. 2019;9:e026589. https://doi.org/10.1136/bmjopen-2018-026589 .
Charters E. The use of think-aloud methods in qualitative research: an introduction to think-aloud methods. Brock Educ J. 2003;12:68–82. https://doi.org/10.26522/brocked.v12i2.38 .
Welch V, Petticrew M, Tugwell P, PRISMA-Equity Bellagio group, et al. PRISMA-equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLoS Med. 2012;9:e1001333. https://doi.org/10.1371/journal.pmed.1001333 .
Wang X, Chen Y, Liu Y, et al. Reporting items for systematic reviews and meta-analyses of acupuncture: the PRISMA for acupuncture checklist. BMC Complement Altern Med. 2019;19:208. https://doi.org/10.1186/s12906-019-2624-3 .
Acknowledgements.
We dedicate this paper to the late Douglas G Altman and Alessandro Liberati, whose contributions were fundamental to the development and implementation of the original PRISMA statement.
We thank the following contributors who completed the survey to inform discussions at the development meeting: Xavier Armoiry, Edoardo Aromataris, Ana Patricia Ayala, Ethan M Balk, Virginia Barbour, Elaine Beller, Jesse A Berlin, Lisa Bero, Zhao-Xiang Bian, Jean Joel Bigna, Ferrán Catalá-López, Anna Chaimani, Mike Clarke, Tammy Clifford, Ioana A Cristea, Miranda Cumpston, Sofia Dias, Corinna Dressler, Ivan D Florez, Joel J Gagnier, Chantelle Garritty, Long Ge, Davina Ghersi, Sean Grant, Gordon Guyatt, Neal R Haddaway, Julian PT Higgins, Sally Hopewell, Brian Hutton, Jamie J Kirkham, Jos Kleijnen, Julia Koricheva, Joey SW Kwong, Toby J Lasserson, Julia H Littell, Yoon K Loke, Malcolm R Macleod, Chris G Maher, Ana Marušic, Dimitris Mavridis, Jessie McGowan, Matthew DF McInnes, Philippa Middleton, Karel G Moons, Zachary Munn, Jane Noyes, Barbara Nußbaumer-Streit, Donald L Patrick, Tatiana Pereira-Cenci, Ba′ Pham, Bob Phillips, Dawid Pieper, Michelle Pollock, Daniel S Quintana, Drummond Rennie, Melissa L Rethlefsen, Hannah R Rothstein, Maroeska M Rovers, Rebecca Ryan, Georgia Salanti, Ian J Saldanha, Margaret Sampson, Nancy Santesso, Rafael Sarkis-Onofre, Jelena Savović, Christopher H Schmid, Kenneth F Schulz, Guido Schwarzer, Beverley J Shea, Paul G Shekelle, Farhad Shokraneh, Mark Simmonds, Nicole Skoetz, Sharon E Straus, Anneliese Synnot, Emily E Tanner-Smith, Brett D Thombs, Hilary Thomson, Alexander Tsertsvadze, Peter Tugwell, Tari Turner, Lesley Uttley, Jeffrey C Valentine, Matt Vassar, Areti Angeliki Veroniki, Meera Viswanathan, Cole Wayant, Paul Whaley, and Kehu Yang. We thank the following contributors who provided feedback on a preliminary version of the PRISMA 2020 checklist: Jo Abbott, Fionn Büttner, Patricia Correia-Santos, Victoria Freeman, Emily A Hennessy, Rakibul Islam, Amalia (Emily) Karahalios, Kasper Krommes, Andreas Lundh, Dafne Port Nascimento, Davina Robson, Catherine Schenck-Yglesias, Mary M Scott, Sarah Tanveer and Pavel Zhelnov. We thank Abigail H Goben, Melissa L Rethlefsen, Tanja Rombey, Anna Scott, and Farhad Shokraneh for their helpful comments on the preprints of the PRISMA 2020 papers. We thank Edoardo Aromataris, Stephanie Chang, Toby Lasserson and David Schriger for their helpful peer review comments on the PRISMA 2020 papers.
Provenance and peer review.
Not commissioned; externally peer reviewed.
Patient and public involvement.
Patients and the public were not involved in this methodological research. We plan to disseminate the research widely, including to community participants in evidence synthesis organisations.
There was no direct funding for this research. MJP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618) and was previously supported by an Australian National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088535) during the conduct of this research. JEM is supported by an Australian NHMRC Career Development Fellowship (1143429). TCH is supported by an Australian NHMRC Senior Research Fellowship (1154607). JMT is supported by Evidence Partners Inc. JMG is supported by a Tier 1 Canada Research Chair in Health Knowledge Transfer and Uptake. MML is supported by The Ottawa Hospital Anaesthesia Alternate Funds Association and a Faculty of Medicine Junior Research Chair. TL is supported by funding from the National Eye Institute (UG1EY020522), National Institutes of Health, United States. LAM is supported by a National Institute for Health Research Doctoral Research Fellowship (DRF-2018-11-ST2–048). ACT is supported by a Tier 2 Canada Research Chair in Knowledge Synthesis. DM is supported in part by a University Research Chair, University of Ottawa. The funders had no role in considering the study design or in the collection, analysis, interpretation of data, writing of the report, or decision to submit the article for publication.
Authors and affiliations.
School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
Matthew J. Page, Joanne E. McKenzie, Sue E. Brennan & Steve McDonald
Department of Clinical Epidemiology, Biostatistics and Bioinformatics, Amsterdam University Medical Centres, University of Amsterdam, Amsterdam, Netherlands
Patrick M. Bossuyt
Université de Paris, Centre of Epidemiology and Statistics (CRESS), Inserm, F 75004, Paris, France
Isabelle Boutron
Institute for Evidence-Based Healthcare, Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Australia
Tammy C. Hoffmann
Annals of Internal Medicine, University of Texas Health Science Center at San Antonio, San Antonio, Texas, USA
Cynthia D. Mulrow
Knowledge Translation Program, Li Ka Shing Knowledge Institute, Toronto, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
Larissa Shamseer
Evidence Partners, Ottawa, Canada
Jennifer M. Tetzlaff
Clinical Research Institute, American University of Beirut, Beirut, Lebanon; Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
Elie A. Akl
Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University, Portland, OR, USA
Roger Chou
York Health Economics Consortium (YHEC Ltd), University of York, York, UK
Julie Glanville
Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; Department of Medicine, University of Ottawa, Ottawa, Canada
Jeremy M. Grimshaw
Centre for Evidence-Based Medicine Odense (CEBMO) and Cochrane Denmark, Department of Clinical Research, University of Southern Denmark, JB Winsløwsvej 9b, 3rd Floor, 5000 Odense, Denmark; Open Patient data Exploratory Network (OPEN), Odense University Hospital, Odense, Denmark
Asbjørn Hróbjartsson
Department of Anesthesiology and Pain Medicine, The Ottawa Hospital, Ottawa, Canada; Clinical Epidemiology Program, Blueprint Translational Research Group, Ottawa Hospital Research Institute, Ottawa, Canada; Regenerative Medicine Program, Ottawa Hospital Research Institute, Ottawa, Canada
Manoj M. Lalu
Department of Ophthalmology, School of Medicine, University of Colorado Denver, Denver, Colorado, United States; Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
Tianjing Li
Division of Headache, Department of Neurology, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts, USA; Head of Research, The BMJ, London, UK
Elizabeth W. Loder
Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, Indiana, USA
Evan Mayo-Wilson
Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
Luke A. McGuinness & Penny Whiting
Centre for Reviews and Dissemination, University of York, York, UK
Lesley A. Stewart
EPPI-Centre, UCL Social Research Institute, University College London, London, UK
James Thomas
Li Ka Shing Knowledge Institute of St. Michael’s Hospital, Unity Health Toronto, Toronto, Canada; Epidemiology Division of the Dalla Lana School of Public Health and the Institute of Health Management, Policy, and Evaluation, University of Toronto, Toronto, Canada; Queen’s Collaboration for Health Care Quality Joanna Briggs Institute Centre of Excellence, Queen’s University, Kingston, Canada
Andrea C. Tricco
Methods Centre, Bruyère Research Institute, Ottawa, Ontario, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
Vivian A. Welch
Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
David Moher
JEM and DM are joint senior authors. MJP, JEM, PMB, IB, TCH, CDM, LS, and DM conceived this paper and designed the literature review and survey conducted to inform the guideline content. MJP conducted the literature review, administered the survey and analysed the data for both. MJP prepared all materials for the development meeting. MJP and JEM presented proposals at the development meeting. All authors except for TCH, JMT, EAA, SEB, and LAM attended the development meeting. MJP and JEM took and consolidated notes from the development meeting. MJP and JEM led the drafting and editing of the article. JEM, PMB, IB, TCH, LS, JMT, EAA, SEB, RC, JG, AH, TL, EMW, SM, LAM, LAS, JT, ACT, PW, and DM drafted particular sections of the article. All authors were involved in revising the article critically for important intellectual content. All authors approved the final version of the article. MJP is the guarantor of this work. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.
Correspondence to Matthew J. Page.
Competing interests.
All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/conflicts-of-interest/ and declare: EL is head of research for the BMJ; MJP is an editorial board member for PLOS Medicine; ACT is an associate editor and MJP, TL, EMW, and DM are editorial board members for the Journal of Clinical Epidemiology; DM and LAS were editors in chief, LS, JMT, and ACT are associate editors, and JG is an editorial board member for Systematic Reviews. None of these authors were involved in the peer review process or decision to publish. TCH has received personal fees from Elsevier outside the submitted work. EMW has received personal fees from the American Journal for Public Health, for which he is the editor for systematic reviews. VW is editor in chief of the Campbell Collaboration, which produces systematic reviews, and co-convenor of the Campbell and Cochrane equity methods group. DM is chair of the EQUATOR Network, IB is adjunct director of the French EQUATOR Centre, and TCH is co-director of the Australasian EQUATOR Centre, which advocates for the use of reporting guidelines to improve the quality of reporting in research articles. JMT received salary from Evidence Partners, creator of DistillerSR software for systematic reviews; Evidence Partners was not involved in the design or outcomes of the statement, and the views expressed solely represent those of the author.
Publisher’s note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Additional file 1.
PRISMA 2020 checklist.
PRISMA 2020 expanded checklist.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Cite this article.
Page, M.J., McKenzie, J.E., Bossuyt, P.M. et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Syst Rev 10, 89 (2021). https://doi.org/10.1186/s13643-021-01626-4
Accepted: 04 January 2021
Published: 29 March 2021
DOI: https://doi.org/10.1186/s13643-021-01626-4
Feasible Solutions for Low-Carbon Thermal Electricity Generation and Utilization in Oil-Rich Developing Countries: A Literature Review

Article outline:
2. Study Selection Methodology
2.1. Identification Phase: Databases, Search Terms Definition
2.2. Screening Phase: Inclusion and Exclusion Criteria Definition, Article's Abstract Review
2.3. Eligibility and Inclusion Phase: Comprehensive Full-Text Review for Assessing Studies
2.4. Synthesis Phase: Bibliometric Analysis of Included Literature
3. Results and Discussions
3.1. Oil-Rich Developing Countries
3.2. Low-Carbon Technologies for Thermal Generation
3.2.1. Biomass Co-Firing and Its Integration with Existing Infrastructure
3.2.2. Hydrogen as a Fuel for High-Efficiency Gas Turbines
3.2.3. Geothermal Hybrid Systems for Base Load Power Generation
3.2.4. Advanced Thermal Energy Storage Solutions
3.3. Successful Case Studies Implementation Perspectives for Oil-Rich Regions
3.3.1. Lessons from the Middle East: Pioneering Low-Carbon Initiatives in Oil-Dependent Economies
3.3.2. Transitioning National Grids: The Role of Clean Thermal Power in Africa's Oil-Producing Nations
3.3.3. Public-Private Partnerships in Latin America: Leveraging Oil Revenues for Sustainable Energy
3.3.4. Overcoming Socioeconomic Barriers: Community Engagement and Low-Carbon Technology Adoption
3.3.5. Economic Barriers and Solutions for Community Engagement in Low-Carbon Technology Adoption
3.4. Current Regulatory Framework
3.4.1. Comparative Analysis of Carbon Pricing Mechanisms and Their Impact on Thermal Generation
3.4.2. Regulatory Innovation: Creating Incentives for Hybrid Thermal Systems
3.4.3. Evaluating the Effectiveness of Renewable Portfolio Standards in Promoting Low-Carbon Thermal Technologies
3.4.4. Future-Proofing Regulations: Adapting Policies for Emerging Low-Carbon Technologies
3.4.5. Economic Considerations for Future-Proofing Regulations in Emerging Low-Carbon Technologies
3.5. Technological Innovations and Trends
3.5.1. Digital Twins and Predictive Analytics in Thermal Power Plants
3.5.2. The Rise of AI-Driven Energy Management Systems for Thermal Generation
3.5.3. Advanced Materials for Enhanced Thermal Efficiency
3.5.4. Trends in Modular Thermal Generation Units for Decentralized Energy Systems
3.5.5. Financial Viability of Modular Thermal Generation Units in Decentralized Energy Systems
3.5.6. Barriers to Investment in Low-Carbon Thermal Technologies in ORDCs
4. Conclusions
Back matter: Author Contributions; Institutional Review Board Statement; Informed Consent Statement; Data Availability Statement; Acknowledgments; Conflicts of Interest
Appendix A. PRISMA 2020 Flowchart of the Literature Review Process
Database | Query String | N° of Returned Documents | Removal of Duplicates | Final Sample for Screening Phase
---|---|---|---|---
Scopus | TITLE-ABS-KEY (“low carbon” AND “electricity” AND (“biogas” OR “biofuels” OR “biomass” OR “hydrogen” OR “geothermal”)) AND PUBYEAR > 2013 AND PUBYEAR < 2025 AND (LIMIT-TO (DOCTYPE, “ar”)) | 935 | 19 | 916
Web of Science | ALL = (“low carbon” AND “electricity” AND “thermal” AND “renewable”); Publication Years: 2014–2024; Document Types: Article | 309 | 63 | 246
Total | | 1244 | 82 | 1162
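For readers who track these identification-phase numbers programmatically (for example, to populate the top boxes of a PRISMA 2020 flow diagram), the arithmetic in the table can be expressed in a few lines. The snippet below is an illustrative sketch only; the data structure and variable names are assumptions, not code published with the review.

```python
# Illustrative sketch only: rolling up per-database search results into the
# PRISMA identification-phase numbers reported in the table above.
# The list below and its field names are assumptions for demonstration.

searches = [
    {"database": "Scopus",         "returned": 935, "duplicates": 19},
    {"database": "Web of Science", "returned": 309, "duplicates": 63},
]

records_identified = sum(s["returned"] for s in searches)     # 935 + 309 = 1244
duplicates_removed = sum(s["duplicates"] for s in searches)   # 19 + 63 = 82
records_to_screen = records_identified - duplicates_removed   # 1244 - 82 = 1162

print(f"Records identified (databases): n = {records_identified}")
print(f"Duplicate records removed:      n = {duplicates_removed}")
print(f"Records screened:               n = {records_to_screen}")
```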
Inclusion Criteria | Exclusion Criteria
---|---
Articles from peer-reviewed journals | Conference proceedings, editorial notes, review papers, book chapters, theses, white papers, and other non-peer-reviewed materials
Publications in English | Publications in languages other than English
Research articles published within the timeframe of 2014 to 2024 | Research articles published before 2014
Articles accessible in full text through institutional subscriptions or as open access | Articles that lack full-text accessibility
Studies concentrating on low-carbon thermal electricity generation and utilization that are technologically advanced and economically feasible for oil-rich developing countries, particularly those discussing the integration of clean thermal technologies like biogas, biofuels, biomass, hydrogen, and geothermal energy | Studies that do not focus on low-carbon thermal electricity generation or fail to address technological and economic aspects relevant to oil-rich developing countries. Additionally, articles that solely explore non-thermal renewable energy technologies without discussing their integration into thermal systems are excluded.
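Screening decisions like those in the table above are often semi-automated. The sketch below is a hypothetical illustration of how the objective criteria (document type, language, publication window, full-text access) could be encoded as a filter; the record fields and helper function are assumed names, and the topical-relevance criterion in the last row still requires human judgement.

```python
# Hypothetical sketch: encoding the objective screening criteria from the
# table above as a filter. Field names and example records are assumptions;
# topical relevance (last row of the table) still needs human review.

def passes_screening(record: dict) -> bool:
    """Return True if a record meets the objective inclusion criteria."""
    return (
        record.get("source_type") == "peer-reviewed journal"  # excludes proceedings, theses, etc.
        and record.get("language") == "English"
        and 2014 <= record.get("year", 0) <= 2024
        and record.get("full_text_available", False)
    )

records = [
    {"title": "Biomass co-firing in oil-rich developing countries",
     "source_type": "peer-reviewed journal", "language": "English",
     "year": 2021, "full_text_available": True},
    {"title": "Hydrogen gas turbines: conference overview",
     "source_type": "conference proceedings", "language": "English",
     "year": 2019, "full_text_available": True},
]

shortlist = [r for r in records if passes_screening(r)]
print(f"{len(shortlist)} of {len(records)} records pass title/abstract screening")
```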
N° | Criterion | Description and Evaluation Metrics |
---|---|---|
1 | Relevance to Study Goals | How well the study addresses the integration of clean thermal technologies for low-carbon electricity generation in oil-rich developing countries. (1: Peripheral, 2: Related, 3: Highly Relevant) |
2 | Methodological Soundness | The appropriateness and robustness of the research methodology used. (1: Needs Improvement, 2: Acceptable, 3: Strong) |
3 | Originality and Contribution | The originality and significance of the study’s contributions to the field. (1: Minor, 2: Substantial, 3: Major) |
4 | Data Quality and Reliability | The quality and reliability of the data presented in the study. (1: Satisfactory, 2: Good, 3: Excellent) |
5 | Practical Applicability | The potential for practical application of the study’s findings in real-world scenarios. (1: Limited, 2: Useful, 3: Highly Applicable) |
6 | Technological Maturity and Economic Attractiveness for Oil-rich developing countries | The extent to which the study offers solutions that are technologically mature and economically viable for oil-rich developing countries. (1: Developing, 2: Promising, 3: Established) |
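One way to operationalise this rubric during full-text review is to sum the six 1–3 ratings into a single score per article. The sketch below assumes a simple additive score and an illustrative cut-off; the source article does not publish a numeric threshold, so both are assumptions.

```python
# Illustrative only: summing the six 1-3 rubric scores into a total for
# full-text eligibility decisions. The cut-off value is an assumption; the
# source article does not report one.

CRITERIA = [
    "relevance",      # 1. Relevance to Study Goals
    "methodology",    # 2. Methodological Soundness
    "originality",    # 3. Originality and Contribution
    "data_quality",   # 4. Data Quality and Reliability
    "applicability",  # 5. Practical Applicability
    "maturity",       # 6. Technological Maturity and Economic Attractiveness
]

def total_score(scores: dict) -> int:
    """Sum the 1-3 ratings across the six rubric criteria (possible range 6-18)."""
    return sum(scores[criterion] for criterion in CRITERIA)

candidate = {"relevance": 3, "methodology": 2, "originality": 2,
             "data_quality": 3, "applicability": 3, "maturity": 2}

THRESHOLD = 12  # assumed cut-off for carrying a study into the synthesis phase
score = total_score(candidate)
print(f"Total score: {score}/18 -> {'include' if score >= THRESHOLD else 'exclude'}")
```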
Low-Carbon Technology | Challenges | Opportunities | Policy Recommendations | References |
---|---|---|---|---|
Biomass Co-firing | High upfront costs, emissions control, infrastructure retrofitting | Utilizes agricultural waste, reduces fossil fuel dependence | Subsidies for retrofitting existing infrastructure; incentives for agricultural waste management | [ , , , , , ] |
Hydrogen for Gas Turbines | Requires infrastructure expansion, expensive hydrogen storage | Zero direct CO2 emissions, decarbonizes high-energy industries | Investments in hydrogen infrastructure and storage technologies, support for R&D | [ , , , , , , , , ]
Geothermal Hybrid Systems | High capital cost, site-specific resource dependency | Stable base load power, suitable for resource-rich regions | Geothermal exploration grants, financial incentives for infrastructure development | [ , , , , , , , , , ] |
Thermal Energy Storage (TES) | High implementation costs, integration challenges with renewables | Enhances flexibility, supports renewable energy integration | Policies promoting energy storage systems, tax breaks for TES projects | [ , , , , , , , , , ] |
Modular Thermal Generation Units | High initial capital investment, rural deployment challenges | Suitable for decentralized and off-grid areas, flexible | Support for decentralized systems, public-private partnerships, rural energy development programs | [ , , , , , , , ] |
Digital Twins and Predictive Analytics | High cost of digital infrastructure, requires technical expertise | Improves operational efficiency, reduces emissions | Grants for digital infrastructure in power plants, incentives for AI, and predictive analytics | [ , , , , ] |
AI-Driven Energy Management | Expensive implementation, need for technical expertise | Optimizes energy generation and integration with renewables | Research grants for AI-driven energy management systems, subsidies for implementation | [ , , , , , ] |
Advanced Materials for Thermal Efficiency | High production costs, complex integration with existing systems | Enhances efficiency, reduces emissions, and extends equipment lifespan | Financial support for adopting advanced materials, R&D tax credits | [ , , , , ] |
Technology | Market Size (2023) | Projected Market Size | CAGR | Year Range |
---|---|---|---|---|
Biogas | USD 89 billion | N/A | 4.2% | 2024–2032 |
Biofuels | USD 110 billion (2021) | N/A | 9.6% (Middle East and Africa), 4% (Europe and Latin America) | 2021–2030 |
Biomass Co-firing | USD 133.97 billion | N/A | 6.3% | 2024–2030 |
Hydrogen | USD 242.7 billion | USD 410.6 billion | 7.8% | 2023–2030 |
Geothermal Energy | USD 6.6 billion | USD 9.4 billion | 5.9% | 2022–2027 |
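These projections follow the standard compound-annual-growth-rate relationship, which can be used as a rough consistency check on the table. Taking the hydrogen row as a worked example (seven growth years from 2023 to 2030):

```latex
% Compound annual growth: FV = PV (1 + r)^n, with r the CAGR and n the years elapsed.
% Hydrogen row: PV = 242.7, r = 0.078, n = 7 (2023 to 2030).
\[
  \mathrm{FV} = \mathrm{PV}\,(1 + r)^{n}
              = 242.7 \times (1.078)^{7}
              \approx 242.7 \times 1.692
              \approx 410.6 \ \text{billion USD},
\]
```

which agrees with the projected market size of USD 410.6 billion shown above.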
Ochoa-Correa, D.; Arévalo, P.; Villa-Ávila, E.; Espinoza, J.L.; Jurado, F. Feasible Solutions for Low-Carbon Thermal Electricity Generation and Utilization in Oil-Rich Developing Countries: A Literature Review. Fire 2024, 7, 344. https://doi.org/10.3390/fire7100344