ISPOR Publishes Report on Multiple Criteria Decision Analysis for Health Care Decision-Making

01/27/16

Much attention has been paid recently to the proper design and implementation of decision-making support tools in health care, such as clinical pathways. The lack of transparency and standardization in how clinical pathways are developed has drawn criticism from ASCO and others.

Recognizing a need for more guidance on the use of multiple criteria decision analysis (MCDA) in health care decision-making, the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) has published a Task Force report on the topic.

Health care decision-making is often a complex process that requires physicians to weigh conflicting objectives. A more structured approach that incorporates multiple criteria can aid those tasked with making decisions about patient care.

MCDA, a technique already widely used in other sectors, has recently begun to see broader application in health care. It has been used in a diverse range of health care scenarios to rank and value alternatives and to support decision-making. However, while a number of different MCDA methods are available, little guidance exists on choosing the most suitable technique for a specific problem.

In 2014, ISPOR formed the MCDA Emerging Good Practices Task Force to review the application of MCDA and develop good practice guidelines for its use in health care decision-making. In the first of two reports to be published by the task force, the authors describe different MCDA models and provide examples of their application in health care.

MCDA approaches can be broadly classified into three families: value measurement models assign scores to the alternatives to determine the preferred option; outranking methods rely on direct pairwise comparisons of alternatives; and reference-level models identify the alternative that comes closest to meeting predefined performance levels.
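
As a rough illustration of how the latter two families differ in spirit, the snippet below works through a toy two-alternative problem; all criteria, numbers, and thresholds are invented for this sketch, and a weighted-sum value measurement example appears after the step list further down.

```python
# Toy contrast of outranking and reference-level approaches.
# All criteria, alternatives, and numbers below are hypothetical.

alternatives = {
    "Option A": {"efficacy": 80, "safety": 60, "affordability": 40},
    "Option B": {"efficacy": 65, "safety": 85, "affordability": 70},
}
# Predefined performance targets used by the reference-level approach.
reference_levels = {"efficacy": 70, "safety": 70, "affordability": 60}

def outranking_count(a, b):
    """Outranking flavour: on how many criteria does alternative a beat alternative b?"""
    return sum(1 for c in alternatives[a] if alternatives[a][c] > alternatives[b][c])

def shortfall(name):
    """Reference-level flavour: total shortfall below the targets (smaller is better)."""
    return sum(max(0, reference_levels[c] - v) for c, v in alternatives[name].items())

print("Option A beats Option B on", outranking_count("Option A", "Option B"), "criteria")
print("Option B beats Option A on", outranking_count("Option B", "Option A"), "criteria")
for name in alternatives:
    print(name, "shortfall vs. reference levels:", shortfall(name))
```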

While the specific approaches vary, the authors of the report identified several core steps that should be followed:

  1. Defining the problem. The starting point for any MCDA is defining the problem and the corresponding goal. This includes identifying the appropriate stakeholders, who may differ across clinical scenarios.
  2. Selecting and structuring criteria. Once the problem has been identified, stakeholders must agree on the criteria against which all alternatives will be evaluated. Criteria can be broad or narrow depending on the goals of the analysis.
  3. Measuring performance. Data on each agreed criterion should be gathered and evaluated with specific measures.
  4. Scoring alternatives. Each alternative should be scored on the basis of how well it satisfies each criterion.
  5. Weighting criteria. Based on stakeholder preferences, weights should be assigned to the criteria, with higher weights given to criteria that matter more to stakeholders. Decision-makers may be involved in assigning weights, but often are not.
  6. Calculating aggregate scores. Each alternative receives a “total value,” obtained by multiplying its score on each criterion by that criterion's weight and summing across criteria (a minimal sketch follows this list). This helps stakeholders and decision-makers compare the options.
  7. Dealing with uncertainty. Statistical analysis can be applied to understand the impact uncertainty has on MCDA results and evaluate the robustness of decision outcomes.
  8. Interpreting and reporting the results. Depending on the methodology used, results can be presented in different forms for analysis. However, the authors note that MCDA is intended to aid those making decisions, not to make the decision for them.
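
To make the scoring, weighting, aggregation, and uncertainty steps concrete, here is a minimal weighted-sum sketch of the kind used in value measurement models. The treatments, criteria, scores, and weights are invented for illustration and are not taken from the task force report.

```python
# Minimal additive MCDA aggregation covering steps 4-7, with hypothetical data.

# Step 4: performance scores per alternative (0-100, higher is better on every criterion).
scores = {
    "Treatment A": {"efficacy": 80, "safety": 60, "affordability": 40},
    "Treatment B": {"efficacy": 65, "safety": 85, "affordability": 70},
}

# Step 5: stakeholder-elicited weights, normalized so they sum to 1.
raw_weights = {"efficacy": 5, "safety": 3, "affordability": 2}
weights = {c: w / sum(raw_weights.values()) for c, w in raw_weights.items()}

# Step 6: aggregate "total value" = sum of (score x weight) across criteria.
def total_value(alternative, w):
    return sum(scores[alternative][c] * w[c] for c in w)

for alt in scores:
    print(f"{alt}: total value = {total_value(alt, weights):.1f}")

# Step 7 (crude one-way sensitivity check): shift some weight between two criteria
# and see whether the preferred alternative changes.
shifted = dict(weights)
shifted["efficacy"] += 0.1
shifted["safety"] -= 0.1
best_base = max(scores, key=lambda a: total_value(a, weights))
best_shifted = max(scores, key=lambda a: total_value(a, shifted))
print("Preferred alternative:", best_base, "| after weight shift:", best_shifted)
```

In a real application, the scores would come from the measurement step, the weights from a structured preference-elicitation exercise, and the single weight shift shown here would be replaced by a fuller sensitivity or probabilistic analysis.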

To conclude, the authors wrote that MCDA can serve as a useful support tool for decision-making by improving the transparency and consistency of decisions. The task force's second report will provide emerging good practice recommendations for conducting MCDA.