Finding and selecting options to address a problem

This stage aims to find and select options that could address a problem (or help to reduce its impact). It is structured around four goals that may need to be achieved (A to D). In total, this stage includes 13 types of questions that may need to be answered to achieve these goals.

Open the goal that is most closely related to your specific query, and a list of types of questions will be displayed. More details about each type of question will be provided, including examples and the methodological approaches to address each question.

A. Finding and understanding potential options

What are the potential solutions?


This goal aims first to identify a list of potential options to address a given problem, and then to understand how and why they work.

Open the question that is most closely related to your specific query, and more details about the question, examples and methodological approaches will be displayed.

Question A1. Scoping a list of potential options

This type of question aims to create a list of potentially available interventions to address a given problem. This type of question commonly includes a judgement of what options are potentially suitable for a specific contextual reality.

Some examples of this type of question are:

  • What alternatives are available to treat patients with multi-drug resistant tuberculosis?
  • What population-wide interventions are available to reduce people’s sugar consumption?
  • What types of governance approaches exist to conduct surveillance functions in a given territory?
  • What system arrangements could be changed to reduce the surgical backlog created during the acute phase of the COVID-19 pandemic?
  • What interventions can be used to reduce police violence (e.g., monitoring cameras, independent audit mechanisms)?

Study designs to address question A1

  • Review to find options that have been used by other studies (e.g., scoping review)
  • Jurisdictional scan (comparative analysis) to understand what options have been implemented by other jurisdictions
  • Cross-sectional study (survey, point-in-time or snapshot study or analysis) of people’s opinions
  • Ecological study (population-based study, including spatial analysis)

Methodological approaches (forms of evidence) to address question A1

  • Behavioural/implementation research
    • Knowledge syntheses to identify potential options to address a problem.
    • Multidisciplinary expert group to identify potential options.
 
  • Evaluation
    • Outcome Harvesting
    • Outcome Mapping
    • Contribution analysis
This type of question would benefit from a participatory approach, which emphasizes stakeholder engagement in all of the stages of an evaluation design.
  • Evidence syntheses
    • Big picture reviews (e.g., scoping reviews) to identify potential options
Big picture reviews include scoping reviews, evidence maps, and evidence gap maps, which identify and map the breadth of evidence available on a particular issue.

 

Question A2. Understanding the way potential options and their components work

This type of question aims to describe what developing a given option entails, including its mechanism of action (or causal pathway, if applicable), and how and why it should work to address a problem.

An important distinction should be made for an option that has already been implemented, where someone may want to know why it has not had the results it should have had. That issue is addressed in Stage 4 (Monitoring implementation and evaluating impact).

Some examples of this type of question are:

  • How does metformin work to reduce blood glucose levels?
  • How could increasing the price of sugar-sweetened beverages reduce the prevalence of obesity in a given country?
  • How would school payment mechanisms benefit teachers?

Study designs to address question A2

  • Randomized-controlled study (randomized experiment or randomized trial) measuring intermediate outcomes
  • Review to identify existing frameworks (conceptual analysis) that explain how an intervention might work
  • Interrupted time-series analysis (including joinpoint regression) measuring intermediate outcomes
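
As a rough illustration of the interrupted time-series design listed above, the sketch below fits a simple segmented regression to a short monthly series of an intermediate outcome, estimating the level and slope change after an option is introduced. All numbers and variable names are hypothetical, and a real analysis would also address autocorrelation and seasonality.

```python
# Minimal segmented-regression sketch for an interrupted time series
# (hypothetical data; not a substitute for a full ITS analysis)
import numpy as np

# 12 pre- and 12 post-intervention monthly values of an intermediate outcome
outcome = np.array([8.1, 8.0, 8.2, 8.1, 8.3, 8.2, 8.1, 8.2, 8.0, 8.1, 8.2, 8.1,
                    7.9, 7.8, 7.6, 7.5, 7.4, 7.3, 7.2, 7.1, 7.0, 6.9, 6.9, 6.8])
n = len(outcome)
time = np.arange(n)                              # overall time trend
post = (time >= 12).astype(float)                # 1 once the option is in place
time_after = np.where(post == 1, time - 12, 0)   # time since introduction

# Design matrix: intercept, baseline trend, level change, slope change
X = np.column_stack([np.ones(n), time, post, time_after])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
intercept, baseline_trend, level_change, slope_change = coef

print(f"Baseline trend per month: {baseline_trend:.3f}")
print(f"Immediate level change:   {level_change:.3f}")
print(f"Change in trend:          {slope_change:.3f}")
```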

Methodological approaches (forms of evidence) to address question A2

  • Behavioural/implementation research
    • Optimization designs to understand how the components of an intervention might work to produce the desired outcomes.
 
  • Evaluation
    • Contribution analysis to test the theory of change of a given intervention using a stakeholder workshop/consultation.
    •  Experimental approaches (randomized-controlled trials)
This type of question would benefit from a participatory approach, which emphasizes stakeholder engagement in all of the stages of an evaluation design.
  • Evidence syntheses
    • Qualitative evidence syntheses (e.g., realist syntheses)
    • Complexity-oriented systematic reviews
 

 

B. Assessing the expected impacts of options

Is it feasible (can it work?), does it work, is it convenient, and is it equitable and acceptable?


This goal aims to assess the potential impact or success of options across different outcomes. The impact of a given option can be assessed at a single point in time, or over the short, medium and long term to better gauge the sustainability of the option.

The impact of an option can be assessed in different populations, so to formulate a question well in this stage, someone should have a clear population in mind in which the option is planned to be implemented.

Open the question that is most closely related to your specific query, and more details about the question, examples and methodological approaches will be displayed.

Question B1. Assessing the feasibility of an option

This type of question aims to assess whether a given option is feasible to implement in a given context or setting; feasibility can be split into different dimensions (e.g., operational feasibility, legal feasibility, etc.).

Some examples of this type of question are:

  • How feasible is it to implement curfews to reduce COVID-19 transmission?
  • What is the level of feasibility of reducing the number of hours of training for teachers?
  • How feasible is it for a patient to access an emergency room within 7 hours of having a stroke?

Study designs to address question B1

  • Delphi studies (to get consensus from experts)
  • Jurisdictional scan (comparative analysis) to understand the feasibility of the option elsewhere
  • Discrete choice experiment (stated preferences)
  • Modelling to predict whether the option will be feasible (e.g., system dynamics, ARIMA models, etc.)
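
As a minimal sketch of how one of the designs above (a Delphi study) might summarize a round of expert feasibility ratings, the example below uses entirely hypothetical ratings and an illustrative consensus rule.

```python
# Summarizing one Delphi round of feasibility ratings (1 = not feasible,
# 9 = highly feasible); the ratings and the consensus rule are hypothetical.
import numpy as np

ratings = np.array([7, 8, 6, 7, 9, 7, 8, 6, 7, 8])  # one rating per expert

median = np.median(ratings)
q1, q3 = np.percentile(ratings, [25, 75])
iqr = q3 - q1

# Example rule: consensus if the interquartile range is 2 points or less
consensus = iqr <= 2

print(f"Median rating: {median}, IQR: {iqr}")
print("Consensus reached" if consensus else "No consensus; run another round")
```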

Methodological approaches (forms of evidence) to address question B1

  • Behavioural/implementation research
    • Feasibility trial to understand if the intervention can be implemented as designed
In the APEASE (acceptability, practicality, effectiveness, affordability, side-effects, equity) criteria, this question would be measuring the practicality.
  • Evaluation
    • Contribution analysis to test the theory of change of a given intervention using a stakeholder workshop/consultation.

These approaches should follow ex-ante models, assuming that the intervention has not been implemented yet.

This type of question would benefit from a participatory approach, which emphasizes stakeholder engagement in all of the stages of an evaluation design.

  • Evidence syntheses
    • Reviews of interventions
    • Expert opinion/policy reviews
    • Complexity-oriented systematic reviews
    • Qualitative evidence syntheses
 

 

Question B2. Assessing the benefits and early-and-frequently occurring harms of an option

Efficacy, effectiveness

This type of question aims to assess the benefits and the early-occurring harms of an option to address a problem or its causes, by measuring outcomes, the effect size, and its variability.

Some examples of this type of question are:

  • What is the clinical efficacy of using remdesivir to treat COVID-19 patients?
  • What are the benefits of mask mandates to reduce COVID-19 transmission?
  • What are the benefits of the remote monitoring of chronic patients?

Study designs to address question B2

  • Randomized-controlled study (randomized experiment or randomized trial)
  • Controlled before-and-after study of aggregated data (including difference-in-differences study and non-equivalent control group designs)
  • Interrupted time-series analysis (including joinpoint regression)
  • Retrospective cohort study of individual-level data (retrospective or historical longitudinal, or panel study)
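
As a minimal sketch of the kind of estimate these designs produce, the example below computes a risk difference and its 95% confidence interval from a hypothetical two-arm comparison (all counts are made up).

```python
# Risk difference and 95% CI from a hypothetical two-arm comparison
# (e.g., a randomized trial); a normal approximation is used for brevity.
import math

events_intervention, n_intervention = 30, 200   # hypothetical counts
events_control, n_control = 50, 200

risk_i = events_intervention / n_intervention
risk_c = events_control / n_control
risk_difference = risk_i - risk_c

# Standard error of the risk difference (Wald approximation)
se = math.sqrt(risk_i * (1 - risk_i) / n_intervention +
               risk_c * (1 - risk_c) / n_control)
ci_low, ci_high = risk_difference - 1.96 * se, risk_difference + 1.96 * se

print(f"Risk difference: {risk_difference:.3f} "
      f"(95% CI {ci_low:.3f} to {ci_high:.3f})")
```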

Methodological approaches (forms of evidence) to address question B2

  • Behavioural/implementation research
    • Implementation trials to evaluate if the intervention is effective in achieving the desired outcomes

In the APEASE (acceptability, practicality, effectiveness, affordability, side-effects, equity) criteria, this question would be measuring the effectiveness.

In implementation sciences, it is critical to distinguish the efficacy (in controlled settings) from the effectiveness (in real-life settings) of an intervention.
  • Evaluation
    • Experimental (randomized-controlled trials) or quasi-experimental approaches
    • Contribution analysis to test the theory of change of a given intervention using a stakeholder workshop/consultation
These approaches should follow ex-ante models, assuming that the intervention has not been implemented yet.
  • Evidence syntheses
    • Reviews of interventions
 

 

Question B3. Assessing late-occurring harms and risks of an option

Benefit-risk balance

This type of question aims to identify late-occurring harms and assess their probability of occurrence (i.e., risk). If evidence on the benefits of an option is also available at the same time, this type of question might also entail assessing whether the benefits outweigh the harms (i.e., net benefit).

Some examples of this type of question are:

  • What are the late-occurring harms of using HEPA filters in a classroom? 
  • Do the benefits of using convalescent plasma outweigh the late-occurring harms in treating COVID-19 patients?
  • Does building homeless shelters create unintended effects, such as affecting the housing market?

Study designs to address question B3

  • Retrospective cohort study of individual-level data (retrospective or historical longitudinal, or panel study), including databases of adverse event reporting (e.g., pharmacovigilance)
  • Randomized-controlled study (randomized experiment or randomized trial)
  • Prospective cohort study of individual-level data (prospective longitudinal or panel study)
  • Case-control study (case-comparison study)
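
As a minimal sketch of how a case-control design from the list above might quantify a late-occurring harm, the example below computes an odds ratio and its 95% confidence interval from a hypothetical 2×2 table (all counts are made up).

```python
# Odds ratio for a late-occurring harm from a hypothetical case-control study
import math

# 2x2 table (made-up counts)
exposed_cases, unexposed_cases = 40, 60        # people with the harm
exposed_controls, unexposed_controls = 25, 75  # people without the harm

odds_ratio = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# 95% CI via the standard error of the log odds ratio (Woolf method)
se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases +
                      1 / exposed_controls + 1 / unexposed_controls)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"Odds ratio: {odds_ratio:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```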

Methodological approaches (forms of evidence) to address question B3

  • Behavioural/implementation research
    • Implementation trials to evaluate if the intervention is safe.

In the APEASE (acceptability, practicality, effectiveness, affordability, side-effects, equity) criteria, this question would be measuring side-effects.

In implementation sciences, it is critical to distinguish the efficacy (in controlled settings) from the effectiveness (in real-life settings) of an intervention.
  • Evaluation
    • Experimental (randomized-controlled trials) or quasi-experimental approaches
    • Contribution analysis to test the theory of change of a given intervention using a stakeholder workshop/consultation.
These approaches should follow ex-ante models, assuming that the intervention has not been implemented yet.
  • Evidence syntheses
    • Reviews of interventions
 

 

Question B4. Assessing the costs and resource use of an option

This type of question aims to assess the potential costs and resource use that implementing a given option would create, depending on the specific context and setting in which the option is planned to be implemented. Resources could be monetary (i.e., costs), but could also include human resources, technology, etc.

Some examples of this type of question are:

  • How much would a rare disease treatment cost for a family?
  • What are the costs of procuring masks for all hospitals?
  • What is the budget impact of implementing universal health insurance in a given country?

Study designs to address question B4

  • Modelling to estimate the cost of an option
  • Jurisdictional scan (comparative analysis) to understand the costs in other jurisdictions
  • Cross-sectional study (survey, point-in-time or snapshot study or analysis) of data collected for this purpose (i.e., primary data)
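
As a minimal sketch of the modelling approach listed above, the example below tallies the annual budget impact of an option from unit costs and expected resource use; every figure and resource name is hypothetical and would need to come from the specific context in which the option is planned.

```python
# Simple budget-impact sketch: units needed x unit cost, per resource type
# (all figures hypothetical)

resources = {
    # resource: (units needed per year, cost per unit)
    "masks": (1_200_000, 0.40),
    "training sessions for staff": (150, 500.00),
    "additional nurses (full-time equivalents)": (12, 65_000.00),
}

total = 0.0
for name, (units, unit_cost) in resources.items():
    cost = units * unit_cost
    total += cost
    print(f"{name}: {cost:,.0f}")

print(f"Estimated annual budget impact: {total:,.0f}")
```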

Methodological approaches (forms of evidence) to address question B4

  • Behavioural/implementation research
    • Implementation trials to evaluate if the intervention is affordable.

In the APEASE (acceptability, practicality, effectiveness, affordability, side-effects, equity) criteria, this question would be measuring the affordability.

In implementation sciences, it is critical to distinguish the efficacy (in controlled settings) from the effectiveness (in real-life settings) of an intervention.
  • Evidence syntheses
    • Cost/economic evaluation reviews
 

 

Question B5. Assessing the efficiency in the use of resources

Value for money, Return on Investment (ROI), Cost-effectiveness

This type of question aims to assess how efficient an investment of resources is, by comparing the costs of an option against the balance between its benefits and harms. In this context, credible evidence of the option's effectiveness is assumed to already be available.

Some examples of this type of question are:

  • What is the incremental cost-effectiveness ratio of covering trastuzumab to treat advanced-stage breast cancer?
  • How economically efficient is it to invest resources in preventing diabetes?
  • How economically efficient is it to procure laptops for all students to improve their learning outcomes?

Study designs to address question B5

  • Economic evaluations (cost-effectiveness, cost-utility, cost-benefit analyses)
  • Jurisdictional scan (comparative analysis) to understand whether the option was efficient in other jurisdictions 
  • Delphi studies (to get consensus from experts)
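
As a minimal sketch of the economic evaluations listed above, the example below computes an incremental cost-effectiveness ratio (ICER), i.e., the extra cost of an option divided by the extra health gain it produces relative to a comparator; all figures are hypothetical.

```python
# Incremental cost-effectiveness ratio (ICER) with hypothetical inputs
cost_new, effect_new = 48_000.0, 2.10   # cost and QALYs per patient, new option
cost_old, effect_old = 30_000.0, 1.75   # cost and QALYs per patient, comparator

icer = (cost_new - cost_old) / (effect_new - effect_old)
print(f"ICER: {icer:,.0f} per QALY gained")

# A decision-maker might compare the ICER against a willingness-to-pay threshold
threshold = 50_000.0  # hypothetical threshold per QALY
print("Below threshold" if icer <= threshold else "Above threshold")
```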

Methodological approaches (forms of evidence) to address question B5

  • Evaluation
    • Economic evaluations
 
  • Evidence syntheses
    • Cost/economic evaluation reviews
 

 

Question B6. Identifying equity, ethical, social and human rights impacts of an option

This type of question aims to understand the equity, ethical, social and human rights implications of a given option. On the one hand, implementing an option might have a differential impact (measured as any of the other outcomes included in this goal) on some population groups. On the other hand, some options could have ethical and human rights implications that should be anticipated when finding and selecting options.

Some examples of this type of question are:

  • What are the equity implications of lockdowns to prevent COVID-19 transmission?
  • How could human rights be affected by implementing identity controls to prevent crime?
  • What are the ethical implications of using plasma transfusion in a patient who is against the use of this technology due to religious beliefs?

Study designs to address question B6

  • Delphi studies (to get consensus from experts)
  • Cross-sectional study (survey, point-in-time or snapshot study or analysis) of people’s experiences (not asking about hypothetical scenarios)
  • Qualitative deductive (from general to particular, i.e., testing theory) methods to describe/critically analyze a phenomenon (e.g., qualitative case studies)
  • Qualitative inductive (from particular to general, i.e., generating theory) methods to describe a phenomenon (e.g., qualitative description, narrative approaches)

Methodological approaches (forms of evidence) to address question B6

  • Behavioural/implementation research
    • Knowledge syntheses to evaluate the equity implications of implementing an intervention.
In the APEASE (acceptability, practicality, effectiveness, affordability, side-effects, equity) criteria, this question would be measuring equity.
  • Evaluation
    • Qualitative comparative analysis
    • Process tracing and contribution analysis
 
  • Evidence syntheses
    • Systematic reviews with a focus on equity
 

 

Question B7. Assessing the acceptability of an option

This type of question aims to measure the level of acceptability that a given option would have in a concrete setting, and to what extent a given group is willing to receive an intervention.

Some examples of this type of question are:

  • "What are the knowledge, attitudes and preferences of Aboriginal adolescents about sexual and reproductive health services?
  • What data is there (if any) on the cultural acceptability of these (or other) approaches?

Study designs to address question B7

  • Discrete choice experiment (stated preferences)
  • Qualitative deductive (from general to particular, i.e., testing theory) methods to describe/critically analyze a phenomenon (e.g., qualitative case studies)
  • Qualitative inductive (from particular to general, i.e., generating theory) methods to describe a phenomenon (e.g., qualitative description, narrative approaches)
  • Cross-sectional study (survey, point-in-time or snapshot study or analysis) of people’s experiences (not asking about hypothetical scenarios)

Methodological approaches (forms of evidence) to address question B7

  • Evidence syntheses
    • Reviews of preferences and values
    • Qualitative evidence syntheses
    • Review of interventions
 

 

C. Maximizing the expected impact of options

How can we ensure success with these solutions?


This goal aims to maximize the expected impact by either adjusting some variables of an intervention, or by focusing the option on certain population groups or settings.

Open the question that is most closely related to your specific query, and more details about the question, examples and methodological approaches will be displayed.

Question C1. Adjusting options and enabling factors to maximize impact

Modifiers

This type of question aims to evaluate whether adjusting some variables (e.g., the deliverer, the intensity of the intervention, etc.) could modify the expected impact of an option (measured as any of the questions provided in Stage 2, Goal B).

Some examples of this type of question are:

  • How does the effect of corticosteroids for treating severe COVID-19 change with the dosage?
  • How does the COVID-19 transmission risk vary when changing the specific distance required for social distancing?
  • How do a company's profits change when its employees are paid by performance using process vs. result indicators?

Options could also be adjusted by considering specific implementation considerations related to an intervention (e.g., cognitive behavioural therapy would maximize its effects when implemented by a trained nurse). To further explore implementation issues, jump to the implementation stage.

Study designs to address question C1

Three clusters of study designs (in order of suitability) can be used:

  • First choice/higher rank: 
    • Randomized-controlled study (randomized experiment or randomized trial) to compare different forms of the same intervention
  • Second choice/middle rank:
    • Interrupted time-series analysis (including joinpoint regression)
    • Controlled before-and-after study of aggregated data (including difference-in-differences study and non-equivalent control group designs)
    • Instrumental variables study (two-stage least-squares study or regression)
    • Modelling to predict or estimate the impact of an intervention (e.g., system dynamics, ARIMA models, etc.)
    • Regression discontinuity study (regression kink study or analysis)
  • Bottom rank:
    • Prospective cohort study of individual-level data (prospective longitudinal or panel study)
    • Retrospective cohort study of individual-level data (retrospective or historical longitudinal, or panel study)
    • Case-control study (case-comparison study)
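
As a minimal sketch of how an intensity adjustment might be explored with one of the designs above (e.g., arms receiving different doses of the same intervention), the example below fits a simple linear dose-response trend to hypothetical group-level results.

```python
# Rough dose-response sketch: does the expected impact change with intensity?
# (hypothetical group-level data; a real analysis would use individual data,
# confidence intervals, and checks for non-linearity)
import numpy as np

dose = np.array([0, 5, 10, 20])                    # e.g., mg per day
mean_outcome = np.array([0.30, 0.26, 0.22, 0.17])  # e.g., mortality proportion

# Least-squares slope of outcome against dose
slope, intercept = np.polyfit(dose, mean_outcome, 1)
print(f"Estimated change in outcome per unit of dose: {slope:.4f}")
print(f"Predicted outcome at dose 15: {intercept + slope * 15:.3f}")
```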

Methodological approaches (forms of evidence) to address question C1

  • Behavioural/implementation research
    • Developing methods for optimizing implementation programs
One key part of implementation sciences is trying to understand the behaviour change mechanism behind the implementation of specific interventions.
  • Evaluation
    • Realist evaluation
    • Most significant change
    • Process tracing and contribution analysis
 
  • Evidence syntheses
    • Review of interventions (including meta-regression or complexity perspective)
 

 

Question C2. Finding population groups, settings and contexts to focus options

Positive deviance

This type of question aims to explore in what settings or socioecological contexts, and/or in what population groups, the intervention would produce the most impact (measured as any of the questions provided in Stage 2, Goal B, including in what populations the intervention would achieve the most equitable results).

Some examples of this type of question are:

  • For what type of breast cancer patients would radiotherapy be most effective?
  • What population group would benefit the most from a reform to police forces?
  • In what settings would promoting nurse practitioners to undertake some physician tasks produce the highest net benefit?

Study designs to address question C2

  • Case-control study (case-comparison study)
  • Prospective cohort study of individual-level data (prospective longitudinal or panel study)
  • Controlled before-and-after study of aggregated data (including difference-in-differences study and non-equivalent control group designs)
  • Randomized-controlled study (randomized experiment or randomized trial) using subgroup comparisons.

Methodological approaches (forms of evidence) to address question C2

  • Behavioural/implementation research
    • Developing methods for optimizing implementation programs
One key part of implementation sciences is trying to answer what population groups would be most advantaged and disadvantaged by a given intervention.
  • Evaluation
    • Experimental approaches (randomized-controlled trials)
    • Realist evaluation
 
  • Evidence syntheses
    • Review of interventions (including meta-regression or complexity perspective)
    • Systematic reviews with a focus on equity
 

 

D. Contributing to prioritize and select options

How to prioritize or combine solutions?


This goal aims to produce insights to select the best combination of options to address the problem or its causes, by creating packages of options or ranking them.

It is important to note that selecting which options to pursue is out of the scope of this list, since many other non-evidence-related factors could be considered when deciding what to implement; this goal concentrates on the insights that evidence can provide to these specific types of decisions.

Open the question that is most closely related to your specific query, and more details about the question, examples and methodological approaches will be displayed.

Question D1. Creating packages of options

This type of question aims to find the right combination of interventions that would maximize the expected impacts (using any or a combination of the impacts described in Stage 2 Goal B).

Some examples of this type of question are:

  • What are the best drug combinations to treat a patient with stage IV breast cancer?
  • What combination of public health policies is most effective to reduce infant mortality?
  • What interventions need to be combined to reduce the homeless population in a given city?

Study designs to address question D1

  • Evidence synthesis of studies evaluating the impact of single interventions to analyze the combined effect of packages. 
  • Randomized-controlled study (randomized experiment or randomized trial) to compare packages of interventions in different arms
  • Delphi study (to get consensus from experts)
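
As a minimal sketch of how estimates for single interventions (e.g., from the evidence synthesis listed above) might inform a package, the example below enumerates combinations of hypothetical interventions and keeps the package with the largest combined effect within a budget. It assumes, purely for illustration, that effects are additive, which real packages often are not.

```python
# Choosing a package of interventions under a budget constraint
# (hypothetical effects and costs; assumes additive effects for simplicity)
from itertools import combinations

interventions = {
    # name: (expected effect, cost)
    "emergency shelters": (0.20, 4.0),
    "rent subsidies": (0.35, 6.0),
    "mental health outreach": (0.15, 2.0),
    "job placement support": (0.25, 3.0),
}
budget = 9.0  # hypothetical budget, same units as the costs above

best_package, best_effect = (), 0.0
for size in range(1, len(interventions) + 1):
    for package in combinations(interventions, size):
        effect = sum(interventions[name][0] for name in package)
        cost = sum(interventions[name][1] for name in package)
        if cost <= budget and effect > best_effect:
            best_package, best_effect = package, effect

print(f"Best package within budget: {best_package} "
      f"(combined effect {best_effect:.2f})")
```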

Methodological approaches (forms of evidence) to address question D1

  • Behavioural/implementation research
    • Knowledge synthesis to define what care we should be providing
 
  • Evaluation
    • Realist evaluation
    • Most significant change
 
  • Evidence syntheses
    • Big picture reviews
    • Reviews of interventions
Big picture reviews include scoping reviews, evidence maps, and evidence gap maps, which identify and map the breadth of evidence available on a particular issue.

 

Question D2. Creating a ranking of options

This type of question aims to create or facilitate the process to create a ranking of interventions based on their expected impact (measured as any of the outcomes included in Stage 2, Goal B).

Some examples of this type of question are:

  • What are the five most effective treatment courses for patients with depression?
  • What are the five most effective strategies to reduce childhood obesity?
  • What are the five most effective strategies to reduce emergency room waiting times?

This type of question might also be part of a larger process (e.g., guideline development) that could include a number of different questions included in the tool.

Study designs to address question D2

  • Economic evaluations (cost-effectiveness, cost-utility, cost-benefit analyses) to create a ranked list of options
  • Ranking type Delphi study (to get consensus from experts)
  • Multi-criteria (objective) decision analysis to create a ranked list of options
  • Discrete choice experiment (stated preferences)
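
As a minimal sketch of the multi-criteria decision analysis listed above, the example below scores hypothetical options against weighted criteria and ranks them; the criteria, weights and scores are illustrative and would normally be elicited from stakeholders and from the evidence gathered under Goal B.

```python
# Simple weighted-sum multi-criteria ranking of options (all values hypothetical)
weights = {"effectiveness": 0.5, "cost": 0.3, "acceptability": 0.2}

# Scores on a 0-10 scale for each criterion (higher is better, so "cost"
# here means affordability rather than raw expenditure)
options = {
    "school-based counselling": {"effectiveness": 7, "cost": 5, "acceptability": 8},
    "online self-help program": {"effectiveness": 5, "cost": 9, "acceptability": 6},
    "group therapy": {"effectiveness": 8, "cost": 4, "acceptability": 7},
}

def weighted_score(scores):
    return sum(weights[criterion] * value for criterion, value in scores.items())

ranking = sorted(options, key=lambda name: weighted_score(options[name]), reverse=True)
for rank, name in enumerate(ranking, start=1):
    print(f"{rank}. {name}: {weighted_score(options[name]):.2f}")
```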

Methodological approaches (forms of evidence) to address question D2

  • Evaluation
    • Most significant change
    • Qualitative comparative analysis
 
  • Evidence syntheses
    • Reviews of interventions (with network meta-analysis)
 

 

Similar to problems, options can be interventions that exist in the present or the past, or interventions that are not necessarily available right now but could eventually become an option (future options). We did not create specific questions for these scenarios, but we acknowledge that the same types of questions included in this stage can be formulated for present or future options.

Take a look at the demand-driven approach used to create the Matching Q-M tool, as well as a list and explanation of all methodological approaches included.