Systematic Review: Structure and Process

The term “systematic review” carries a specific meaning and refers to a specific process in academic writing. It is the standard for quality, legitimate review articles. Academics recognize that, for a study to be reviewed and published, it must follow certain universally agreed guidelines. Without these guidelines, scientists and publishers would not be able to communicate effectively and meaningfully about a manuscript.

Writers who are new to manuscript writing and academic publishing, such as graduate students, will benefit from careful study of systematic review guidelines and from the advice of experienced authors. The National Center for Biotechnology Information at the U.S. National Library of Medicine (a division of the U.S. National Institutes of Health) defines a systematic review in this way:

“A systematic review is a protocol driven comprehensive review and synthesis of data focusing on a topic or on related key questions. It is typically performed by experienced methodologists with the input of domain experts.”

There are three types of research:

  • Quantitative – results are based on numbers and calculations
  • Qualitative – results are based on observations
  • Meta-analysis – results are based on the review of many related studies

Regardless of these differences, a systematic review follows the same protocol in each case.

Overarching the systematic review is a superstructure, or framework. This framework comprises the following components, in order: Title, Abstract, Introduction, Methods, Results, Discussion, and References. The structure should be familiar to most manuscript writers in an academic or scientific setting, even if they have not yet written a review article.

Full Structure of a Systematic Review

A systematic review follows these steps:

  1. Designing the Question
  2. Analytic Framework
  3. Evidence Mapping
  4. Critical Analysis
  5. Evidence Synthesis

Designing the Question

Designing a research question requires careful selection of language that mirrors the intent of the research. Research questions are often devised using the PICO structure: Population, Intervention (or Exposure), Control/Comparator, and Outcome. The question will serve as a reference point throughout the research and discussion, so it needs to be explicit.
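
To make the PICO structure concrete, here is a minimal sketch in Python, intended purely as an illustration; the example population, intervention, comparator, and outcome are hypothetical and are not drawn from this article or any particular study.

    from dataclasses import dataclass

    @dataclass
    class PICOQuestion:
        population: str
        intervention: str   # or exposure
        comparator: str     # control or comparator
        outcome: str

        def as_question(self) -> str:
            # Render the four components as an explicit research question.
            return (f"In {self.population}, does {self.intervention}, "
                    f"compared with {self.comparator}, affect {self.outcome}?")

    # Hypothetical example, for illustration only
    question = PICOQuestion(
        population="adults with type 2 diabetes",
        intervention="a structured exercise programme",
        comparator="usual care",
        outcome="HbA1c at 12 months",
    )
    print(question.as_question())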

Analytic Framework

The analytic framework is a visual representation of the PICO structure, drawing connections among all four components; these connections are supported by the evidence gathered. The framework helps the reader quickly understand what is being presented in the review article.

Evidence Mapping

Evidence mapping is the process by which researchers select and reject studies related to the research question at hand. Criteria must be established at the outset for what constitutes a relevant study and what does not. In their article entitled “The Global Evidence Mapping Initiative: Scoping research in broad topic areas,” Peter Bragge et al. (2011) describe evidence mapping in this way:

“Evidence mapping describes the quantity, design and characteristics of research in broad topic areas…. The breadth of evidence mapping helps to identify evidence gaps, and may guide future research efforts.”
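
As a simple illustration of this selection process, the sketch below applies a set of agreed inclusion criteria to candidate studies and records the reason each excluded study fails. The criteria and study records are hypothetical, not taken from the article.

    # Candidate studies found by the search -- hypothetical records
    candidate_studies = [
        {"id": "S1", "design": "RCT",         "population": "adults",   "year": 2018},
        {"id": "S2", "design": "case report", "population": "adults",   "year": 2020},
        {"id": "S3", "design": "RCT",         "population": "children", "year": 2015},
    ]

    # Inclusion criteria agreed at the outset, each paired with the reason
    # recorded when a study fails that criterion.
    criteria = [
        (lambda s: s["design"] == "RCT",        "not a randomized controlled trial"),
        (lambda s: s["population"] == "adults", "population outside scope"),
        (lambda s: s["year"] >= 2010,           "published before the search window"),
    ]

    included, excluded = [], []
    for study in candidate_studies:
        reasons = [reason for check, reason in criteria if not check(study)]
        (excluded if reasons else included).append((study["id"], reasons))

    print("Included:", [study_id for study_id, _ in included])
    print("Excluded with reasons:", excluded)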


Critical Analysis

Fundamentally, the critical analysis determines whether the study is of high quality. Several criteria are employed to judge the study’s methodology, including concealment of random allocation, reporting of withdrawals, reporting accuracy, statistical analysis, and assessment of outcomes. Authors and reviewers look for potential bias and misuse of data (or observations) in every area of the study.

Evidence Synthesis

The results of the different studies are compared and contrasted, using the research question as a guide. The author provides an evaluation of, or answer to, the research question based on the studies included in the review. This synthesis of a variety of studies has come to be called “meta-analysis.”
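
When the synthesis is statistical, the simplest common approach is a fixed-effect, inverse-variance pooled estimate. The sketch below shows the basic arithmetic; the effect sizes and standard errors are made-up values used only for illustration.

    import math

    # (effect estimate, standard error) from each included study -- hypothetical values
    studies = [(0.42, 0.15), (0.31, 0.20), (0.55, 0.10)]

    # Inverse-variance weights: more precise studies count for more
    weights = [1.0 / se**2 for _, se in studies]
    pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    # 95% confidence interval for the pooled effect
    ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"Pooled effect: {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")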

Meta-Analysis

Meta-analysis involves the systematic review of a curated set of studies around a single topic. While a meta-analysis follows a protocol similar to that of a systematic review, it is research on existing research rather than the creation of new research, although one could argue that the synthesis of studies on a single topic does provide a new perspective, and new data, on that topic or research question.

In their article for the Journal of the Royal Society of Medicine (JRSM), “Five steps to conducting a systematic review,” authors Khan, Kunz, Kleijnen, and Antes (2003) describe the five steps necessary to conduct a meta-analysis that qualifies as a systematic review:

  1. Framing the Question
  2. Identifying Relevant Publications
  3. Assessing Study Quality
  4. Summarizing the Evidence
  5. Interpreting the Findings

Framing the Question

Essentially, framing the question for a meta-analysis follows the same protocol as designing the question for a systematic review. These authors focus on a variation of PICO: Populations, Interventions or Exposures, Outcomes, and Study Designs. Because a meta-analysis does not necessarily involve a control group or comparator, that element is left out and Study Designs is added. Reviewers are synthesizing multiple studies, so the design of those studies must be considered when framing the question.

Identifying Relevant Publications

The authors must seek out and identify relevant published works to inform the meta-analysis. Selected studies and publications must relate directly to the research question. In addition, authors should record the rationale behind which studies were selected and which were rejected.

Assessing Study Quality

This step corresponds to “Critical Analysis,” above. While “quality” may seem a subjective assessment, there are criteria commonly used in the academic community that can standardize such an assessment. These include evaluations of the study’s design, the type and amount of research available on a given topic, and the potential for bias in the results. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) is a reporting guideline, available online, whose checklist and flow diagram can assist with searching for and assessing quality studies.

Summarizing the Evidence

The objective here is to discuss the differences among the studies – in characteristics, quality, statistical results, and so on – as well as their similarities, particularly in outcomes. A meta-analysis is not just a comparison of studies; the research question remains the hub of these spokes, and the analysis should always point back toward the central topic in question.
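
One standard way to quantify how much study results differ is with Cochran’s Q and the I² statistic. The sketch below shows the calculation using the same hypothetical effect sizes and standard errors as in the earlier pooling example.

    # (effect estimate, standard error) from each included study -- hypothetical values
    studies = [(0.42, 0.15), (0.31, 0.20), (0.55, 0.10)]

    weights = [1.0 / se**2 for _, se in studies]
    pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)

    # Cochran's Q: weighted squared deviations of each study from the pooled effect
    q = sum(w * (est - pooled)**2 for (est, _), w in zip(studies, weights))
    df = len(studies) - 1
    i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

    print(f"Q = {q:.2f} on {df} degrees of freedom, I^2 = {i_squared:.0f}%")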

Interpreting the Findings

Bias should be explored when interpreting the findings (for example, favoring one journal or publisher, or favoring a particular author). The overall quality of the studies should be discussed, and “recommendations should be graded by reference to the strengths and weaknesses of the evidence.”

Final Word to Writers

A systematic review is a serious undertaking, usually attempted by a team of writers and researchers rather than by an individual. University libraries are a good place to start when looking for guidance; the University of Toronto, for example, has a set of guidelines on its website.

In addition, the Joanna Briggs Institute for Evidence-based Nursing and Midwifery has a PDF entitled “Appraising Systematic Reviews,” and BioMed Central has published a useful methodology article, “Methodology in conducting a systematic review of systematic reviews of healthcare interventions.” Being able to conduct this type of research is an important part of an academic career.
