Chapter 6: Evaluation and Reflection

Evaluating a health program seems to come at the end, when we want to show how the program worked. However, we need to anticipate the end from the beginning: ideas for evaluation should be included early in planning. Public health efforts are often required to demonstrate their effectiveness in order to qualify for renewed funding, which makes evaluation more important all the time.

Based on our goals and objectives, we should already know what we need to measure. Next, we need to plan how to collect this data. Too many programs are unable to perform an evaluation because they didn’t collect the needed baseline data at the beginning. 



Types of Evaluations

Evaluation falls into one of two broad categories: formative and summative (CDC, 2012).

The following chart shows several different types of evaluations and how they can be used (CDC, n.d.-a).

Formative Evaluation (including Evaluability Assessment and Needs Assessment)

When to use:
  • During the development of a new program.
  • When an existing program is being modified or is being used in a new setting or with a new population.

What it shows:
  • Whether the proposed program elements are likely to be needed, understood, and accepted by the population you want to reach.
  • The extent to which an evaluation is possible, based on the goals and objectives.

Why it is useful:
  • It allows for modifications to be made to the plan before full implementation begins.
  • It maximizes the likelihood that the program will succeed.

Process Evaluation (Program Monitoring)

When to use:
  • As soon as program implementation begins.
  • During operation of an existing program.

What it shows:
  • How well the program is working.
  • The extent to which the program is being implemented as designed.
  • Whether the program is accessible and acceptable to its target population.

Why it is useful:
  • It provides an early warning for any problems that may occur.
  • It allows programs to monitor how well their plans and activities are working.

Outcome Evaluation (Objectives-Based Evaluation)

When to use:
  • After the program has made contact with at least one person or group in the target population.

What it shows:
  • The degree to which the program is having an effect on the target population's behaviors.

Why it is useful:
  • It tells whether the program is being effective in meeting its objectives.

Economic Evaluation (Cost Analysis, Cost-Effectiveness Evaluation, Cost-Benefit Analysis, Cost-Utility Analysis)

When to use:
  • At the beginning of a program.
  • During the operation of an existing program.

What it shows:
  • What resources are being used in a program and their costs (direct and indirect), compared to outcomes.

Why it is useful:
  • It gives program managers and funders a way to assess costs relative to effects, that is, how effectively funds were used for the program (a hypothetical worked example follows this chart).

Impact Evaluation

When to use:
  • During the operation of an existing program, at appropriate intervals.
  • At the end of a program.

What it shows:
  • The degree to which the program meets its ultimate goal.

Why it is useful:
  • It provides evidence for use in policy and funding decisions.
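To make the economic evaluation row above more concrete, the following is a minimal, hypothetical sketch in Python of how cost-effectiveness and cost-benefit comparisons reduce costs and outcomes to simple ratios. All of the figures (program costs, cases averted, and the dollar value assigned to each averted case) are made up for illustration and do not come from the CDC sources cited in this chapter.

```python
# Illustrative sketch only: hypothetical figures for comparing two programs.

def cost_effectiveness_ratio(total_cost, outcome_units):
    """Cost per unit of health outcome (e.g., dollars per case averted)."""
    return total_cost / outcome_units

def benefit_cost_ratio(monetized_benefits, total_cost):
    """Dollars of benefit produced per dollar spent."""
    return monetized_benefits / total_cost

# Hypothetical Program A: $50,000 in direct and indirect costs, 25 cases averted,
# each averted case valued at $4,000 in avoided treatment costs.
cost_a, cases_a, value_per_case = 50_000, 25, 4_000

# Hypothetical Program B: $80,000 in costs, 32 cases averted.
cost_b, cases_b = 80_000, 32

print(cost_effectiveness_ratio(cost_a, cases_a))              # 2000.0 dollars per case averted
print(cost_effectiveness_ratio(cost_b, cases_b))              # 2500.0 dollars per case averted
print(benefit_cost_ratio(cases_a * value_per_case, cost_a))   # 2.0 dollars of benefit per dollar spent

# Incremental cost-effectiveness: the extra cost of each additional case averted
# if Program B were chosen instead of Program A.
icer = (cost_b - cost_a) / (cases_b - cases_a)
print(round(icer, 2))                                         # 4285.71
```

With these made-up numbers, Program A averts a case for about $2,000 versus $2,500 for Program B, returns about $2 in avoided treatment costs for every $1 spent, and each additional case averted by choosing Program B over Program A would cost roughly $4,286. This is the kind of comparison the chart's economic evaluations are meant to support.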



A Framework for Program Evaluation

The framework developed by the CDC offers steps to follow and standards to be achieved for an effective evaluation (CTB, n.d.; CDC, 2017).


The Framework for Program Evaluation is depicted as a large circle with four rings; see the appendix for a more in-depth description of the image.

 

Below are the six recommended steps (CDC, 2021a):

1. Engage Stakeholders

Stakeholders must be part of the evaluation to ensure that their unique perspectives are understood. Although all stakeholders are interested in your program’s success, they may come from varied backgrounds and have different perspectives on what they want evaluated.

 2. Describe the Program

Summarize the intervention being evaluated. Explain what the program is trying to accomplish. Illustrate the program's core components and elements, its capacity to bring about change, its stage of development, and how it fits into the larger organizational and community environment.

3. Focus the Evaluation Design

Depending on what you want to learn, some types of evaluation will be better suited than others, and funders may have specific evaluation requirements. Whatever design is chosen, each method has its own biases and limitations; using multiple evaluation methods, called the mixed methods approach, can give a better understanding.

4. Gather Credible Evidence

Having credible evidence strengthens the evaluation results as well as the recommendations that follow from them. When more stakeholders participate, they will be more likely to accept the evaluation's conclusions and to act on its recommendations.

5. Justify Conclusions

Evidence must be carefully considered from a number of different stakeholders' perspectives to reach conclusions that are substantiated. Conclusions are justified if they are linked to the evidence and judged against values set by the stakeholders. From the conclusions reached, the stakeholders will help you form recommendations about future actions: to continue or expand the program, or to try a different approach.

6. Ensure Use and Share Lessons Learned

Ideally, lessons learned in an evaluation will be used in decision making and future actions. This requires strategically watching for opportunities to communicate and influence. It can begin in the earliest stages of the process and continue throughout the evaluation.

Dissemination is the process of communicating the lessons learned from an evaluation to the right people in a timely fashion. The goal for dissemination is to achieve full disclosure and impartial reporting.

What reports should be disseminated?

These typically include specific data, annual reports, quarterly or monthly reports from the monitoring system, and anything else that is mutually agreed upon between the organization and the evaluation team.

 

Standards for Evaluation

The Joint Committee on Standards for Educational Evaluation developed "The Program Evaluation Standards" to ensure evaluations are well-designed and fair (CDC, 2021b). These standards offer principles to follow for interventions related to community health. They also help to guard against an imbalanced or impractical evaluation.


The 30 specific standards are grouped into four categories. The individual standards in each category are listed, with their letter codes, in the cross-reference chart under "Applying the Framework to Conduct Optimal Evaluations" below.


Utility Standards

Utility standards ensure that the evaluation is useful to all stakeholders and potential readers of the information in the future.


Feasibility Standards

The feasibility standards ensure that the evaluation makes sense: the planned steps are both viable and pragmatic. There are three feasibility standards.


Propriety Standards

The propriety standards ensure that the evaluation is ethical and conducted with regard for the rights and interests of those involved. There are eight propriety standards.


Accuracy Standards

The accuracy standards ensure that the evaluation findings are correct. There are 12 accuracy standards.


Applying the Framework to Conduct Optimal Evaluations

The six steps and 30 standards can be integrated and applied together, as illustrated in the chart below (CDC, n.d.-b):

Steps in Evaluation Practice and Relevant Standards (Group/Item)

Engaging stakeholders
  • Stakeholder identification (Utility/A)
  • Evaluator credibility (Utility/B)
  • Formal agreements (Propriety/B)
  • Rights of human subjects (Propriety/C)
  • Human interactions (Propriety/D)
  • Conflict of interest (Propriety/G)
  • Metaevaluation (Accuracy/L)

Describing the program
  • Complete and fair assessment (Propriety/E)
  • Program documentation (Accuracy/A)
  • Context analysis (Accuracy/B)
  • Metaevaluation (Accuracy/L)

Focusing the evaluation design
  • Evaluation impact (Utility/G)
  • Practical procedures (Feasibility/A)
  • Political viability (Feasibility/B)
  • Cost effectiveness (Feasibility/C)
  • Service orientation (Propriety/A)
  • Complete and fair assessment (Propriety/E)
  • Fiscal responsibility (Propriety/H)
  • Described purposes and procedures (Accuracy/C)
  • Metaevaluation (Accuracy/L)

Gathering credible evidence
  • Information scope and selection (Utility/C)
  • Defensible information sources (Accuracy/D)
  • Valid information (Accuracy/E)
  • Reliable information (Accuracy/F)
  • Systematic information (Accuracy/G)
  • Metaevaluation (Accuracy/L)

Justifying conclusions
  • Values identification (Utility/D)
  • Analysis of quantitative information (Accuracy/H)
  • Analysis of qualitative information (Accuracy/I)
  • Justified conclusions (Accuracy/J)
  • Metaevaluation (Accuracy/L)

Ensuring use and sharing lessons learned
  • Evaluator credibility (Utility/B)
  • Report clarity (Utility/E)
  • Report timeliness and dissemination (Utility/F)
  • Evaluation impact (Utility/G)
  • Disclosure of findings (Propriety/F)
  • Impartial reporting (Accuracy/K)
  • Metaevaluation (Accuracy/L)


Using this framework will help you find the best way to evaluate your program and to use the evaluation results to make it more effective. The framework encourages an evaluation approach designed to engage all interested stakeholders in a process that welcomes their participation.




References


CDC. (2012, May 11). Step 3: Focus the Evaluation Design. Centers for Disease Control and Prevention. https://www.cdc.gov/evaluation/guide/step3/index.htm

CDC. (2017, May 15). A Framework for Program Evaluation. Centers for Disease Control and Prevention. https://www.cdc.gov/evaluation/framework/index.htm

CDC. (2021a, April 9). Evaluation Steps. Centers for Disease Control and Prevention. https://www.cdc.gov/evaluation/steps/index.htm

CDC. (2021b, April 9). Evaluation Standards. Centers for Disease Control and Prevention. https://www.cdc.gov/evaluation/standards/index.htm

CDC. (n.d.-a). Types of Evaluation. Centers for Disease Control and Prevention. https://www.cdc.gov/std/Program/pupestd/Types%20of%20Evaluation.pdf

CDC. (n.d.-b). Cross-reference of steps and relevant standards. Centers for Disease Control and Prevention. https://www.cdc.gov/evaluation/standards/stepsandrelevantstandards.pdf

CTB. (n.d.). Chapter 36, Section 1: A Framework for Program Evaluation: A Gateway to Tools. Community Tool Box. https://ctb.ku.edu/en/table-of-contents/evaluate/evaluation/framework-for-evaluation/main
