Chapter 6: Ensure Use, Lessons Learned, and Final Program Evaluation Plan

Overview

Involving stakeholders in planning for use is often overlooked. Step 6 deals with planning for the use of evaluation results, sharing lessons learned, and communicating and disseminating results. Planning for how the evaluation results will be used should be part of the planning phase for the program. Including the Evaluation Stakeholder Workgroup (ESW) throughout plan development, and collaborating with it throughout the entire process, increases the chances that results will be used for program decision making. This step is directly tied to the utility standard in evaluation.

Use must be planned for, cultivated, and included in the evaluation plan from the very beginning.


Based on the uses for your evaluation, you will need to determine who should learn about the findings and how they should learn the information. It will be important to consider the needs of each audience. Remember that stakeholders will not suddenly become interested in your product just because you produced a report. You must prepare the market for the product and for use of the evaluation results (Patton, 2008). Writing a straightforward and comprehensive evaluation report can help ensure use.

Communication and Dissemination Plans

An effective communication and dissemination approach should be included in your evaluation plan. The communication and dissemination phase of the evaluation is a two-way process. To be effective, dissemination systems need to do the following (National Institute on Disability and Rehabilitation Research, 2001):

First, define your communication goals and objectives. Tailor objectives to each target audience. Consider with your ESW who the primary audience is (for example, the funding agency, the general public, or some other group). Some questions to ask about the potential audience include the following:

Once the goals, objectives, and target audiences of the communication plan are established, consider the best way to reach them, and which tools will best serve. Seek feedback from your ESW and reach out to target audiences to gather their preferences. 

Establish a timetable for sharing evaluation findings and lessons learned. Figure 5 can be useful in helping the program to chart the written communications plan.

Figure 5: Communication Plan Table

For a full description of the Communication Plan Table image, see the appendix.
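If your team keeps the communication plan in electronic form, the same columns used in Figure 5 and Worksheet 6A (audience, format, and channel), plus a target date, can be captured as simple structured data. The sketch below is purely illustrative: the field names and example rows are assumptions, not part of the CDC workbook.

```python
# Illustrative only: a communication plan kept as structured data.
# Fields mirror Worksheet 6A (audience, format, channel) plus a target date;
# the example rows are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class CommunicationItem:
    audience: str   # who should learn about the findings
    format: str     # e.g., full report, fact sheet, oral briefing
    channel: str    # e.g., email, website, community meeting
    due: date       # when this communication should go out

plan = [
    CommunicationItem("Funding agency", "Full evaluation report",
                      "Email and in-person briefing", date(2025, 9, 1)),
    CommunicationItem("General public", "One-page success story",
                      "Program website", date(2025, 10, 15)),
]

# Print the plan as a timetable, earliest deadline first.
for item in sorted(plan, key=lambda i: i.due):
    print(f"{item.due}  {item.audience}: {item.format} via {item.channel}")
```

A spreadsheet serves the same purpose; the point is simply that every audience gets an explicit format, channel, and deadline.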

You do not have to wait until the final evaluation report is written to share your evaluation results. Sharing interim results throughout the process will help with program course corrections. Additionally, stories that focus on successes will help with program visibility. A success story can show your program’s progress over time and demonstrate its impact, and it can serve as a vehicle for engaging potential participants, partners, and funders (Lavinghouze, Price, & Smith, 2007).


Ensuring Use

Communicating results is not enough to ensure use of evaluation results and lessons learned. Take action to encourage use and wide dissemination of the information gained. Steps to help ensure evaluation findings are used include the following:

One Last Note

The impact of the evaluation results can reach far beyond the evaluation report. If stakeholders are involved throughout the process, communication and participation may be enhanced, and positive changes in the program may come from the careful evaluation process (Patton, 2008). Use of evaluation results beyond the formal findings of the evaluation report starts with the planning process and a transparent evaluation plan.

Evaluation Plan Tips for Step 6

At This Point, Your Plan Should Have Done the Following:

Step 6: Ensure Use of Evaluation Findings and Share Lessons Learned 

The ultimate purpose of program evaluation is to use the information to improve programs. The purpose(s) you identified early in the evaluation process should guide the use of the evaluation results. The evaluation results can be used to demonstrate the effectiveness of your program, identify ways to improve your program, modify program planning, demonstrate accountability, and justify funding. 

Additional uses include the following: 

What’s involved in ensuring use and sharing lessons learned? Five elements are important in making sure that the findings from an evaluation are used: recommendations, preparation, feedback, follow-up, and dissemination. Each is discussed below.

Recommendations

Recommendations are actions to consider as a result of an evaluation. Recommendations can strengthen an evaluation when they anticipate and react to what users want to know. They may undermine an evaluation’s credibility, however, if they are not supported by sufficient evidence or are not in keeping with stakeholders’ values.

Your recommendations will depend on the audience and the purpose of the evaluation (see text box). Remember, you identified many or all of these key audiences in Step 1 and have engaged many of them throughout as stakeholders. You know the information your stakeholders want and what is important to them. Their feedback early on in the evaluation makes their eventual support of your recommendations more likely. 

Some Potential Audiences for Recommendations

  • Local programs.

  • The state health department. 

  • City councils.  

  • State legislators.  

  • Schools.  

  • Workplace owners.  

  • Parents.  

  • Police departments or enforcement agencies.  

  • Health care providers.  

  • Contractors.  

  • Health insurance agencies.

  • Advocacy groups.


Illustrations from Cases

Here are some examples, using the case illustrations, of recommendations tailored to different purposes and for different audiences: 

Audience: Local provider immunization program. 

Purpose of Evaluation: Improve program efforts. 

Recommendation: Thirty-five percent of providers in Region 2 recalled the content of the monthly provider newsletter. To meet the current objective of a 50% recall rate among this population group, we recommend varying the media messages by specialty and increasing the number of messages placed in journals for the targeted specialties.


Audience: Legislators. 

Purpose of Evaluation: Demonstrate effectiveness. 

Recommendation: Last year, a targeted education and media campaign about the need for private provider participation in adult immunization was conducted across the state. Eighty percent of providers were reached by the campaign and reported a change in attitudes towards adult immunization—a twofold increase from the year before. We recommend the campaign be continued and expanded to emphasize minimizing missed opportunities for providers to conduct adult immunizations.


Audience: County health commissioners. 

Purpose of Evaluation: Demonstrate effectiveness of childhood lead poisoning prevention (CLPP) efforts.

Recommendation: In this past year, county staff identified all homes with children with elevated blood lead levels (EBLLs) in targeted sections of the county. Data indicate that only 30% of these homes have been treated to eliminate the source of the lead poisoning. We recommend that you incorporate compliance checks for the lead ordinance into the county’s housing inspection process and apply penalties for noncompliance by private landlords.


Audience: Foundation funding source for affordable housing program. 

Purpose of Evaluation: Demonstrate fiscal accountability. 

Recommendation: For the past 5 years, the program has worked through local coalitions, educational campaigns, and media efforts to increase engagement of volunteers and sponsors and to match them with 300 needy families to build and sell a house. More than 90% of the families are still in their homes and making timely mortgage payments. However, while families report satisfaction with their new housing arrangement, we do not yet see evidence of changes in employment and school outcomes. We recommend continued support for the program, with expansion to include an emphasis on tutoring and life coaching by the volunteers.

Preparation

Preparation refers to the steps taken to eventually use the evaluation findings. Through preparation, stakeholders can do the following: 

Feedback

Feedback occurs among everyone involved in the evaluation. Feedback is necessary at all stages of the evaluation process, and it creates an atmosphere of trust among stakeholders. Early in an evaluation, giving and receiving feedback keeps an evaluation on track by keeping everyone informed about how the program is being implemented and how the evaluation is proceeding. As the evaluation progresses and preliminary results become available, feedback helps ensure that primary users and other stakeholders can comment on evaluation decisions. Valuable feedback can be obtained by holding discussions and routinely sharing interim findings, provisional interpretations, and draft reports.

Follow-up

Follow-up refers to the support that users need throughout the evaluation process. In this step it refers to the support users need after receiving evaluation results and beginning to reach and justify their conclusions. Active follow-up can achieve the following:

Dissemination: Sharing the Results and the Lessons Learned From Evaluation

Dissemination involves communicating evaluation procedures or lessons learned to relevant audiences in a timely, unbiased, and consistent manner. Regardless of how communications are structured, the goal for dissemination is to achieve full disclosure and impartial reporting. Planning effective communications requires the following:

Some methods of getting the information to your audience include the following:  

If a formal evaluation report is the chosen format, it must clearly, succinctly, and impartially communicate all parts of the evaluation (see text box). The report should be written so that it is easy to understand; it does not need to be lengthy or technical. You should also consider oral presentations tailored to various audiences. An outline for a traditional evaluation report might look like the following:

Tips for Writing Your Evaluation Report

  • Tailor the report to your audience; you may need a different version of your report for each segment of your audience.
  • Present clear and succinct results.
  • Summarize the stakeholder roles and involvement.
  • Explain the focus of the evaluation and its limitations.
  • Summarize the evaluation plan and procedures.
  • List the strengths and weaknesses of the evaluation.
  • List the advantages and disadvantages of the recommendations.
  • Verify that the report is unbiased and accurate.
  • Remove technical jargon.
  • Use examples, illustrations, graphics, and stories.
  • Prepare and distribute reports on time.
  • Distribute reports to as many stakeholders as possible.

Applying Standards

Utility, feasibility, propriety, and accuracy are the standards that most directly apply to Step 6: Ensure Use and Share Lessons Learned. The questions presented in Table 6.1 can help you to clarify and achieve these standards.

Table 6.1: Standards for Step 6: Ensure Use and Share Lessons Learned

Standard | Questions

Utility | Do reports clearly describe the program, including its context, and the evaluation’s purposes, procedures, and findings? Have you shared significant mid-course findings and reports with users so that the findings can be used in a timely fashion? Have you planned, conducted, and reported the evaluation in ways that encourage follow-through by stakeholders?

Feasibility | Is the format appropriate to your resources and to the time and resources of the audience?

Propriety | Have you ensured that the evaluation findings (including the limitations) are made accessible to everyone affected by the evaluation and others who have the right to receive the results?

Accuracy | Have you tried to avoid the distortions that can be caused by personal feelings and other biases? Do evaluation reports impartially and fairly reflect evaluation findings?


Evaluation is a practical tool that states can use to inform programs’ efforts and assess their impact. Program evaluation should be well integrated into the day-to-day planning, implementation, and management of public health programs. Program evaluation complements the CDC’s operating principles for public health, which include using science as a basis for decision making and action, expanding the quest for social equity, performing effectively as a service agency, and making efforts outcome-oriented. 

These principles highlight the need for programs to develop clear plans, inclusive partnerships, and feedback systems that support ongoing improvement. The CDC is committed to providing additional tools and technical assistance to states and partners to build and enhance their capacity for evaluation.

Checklist for Step 6: Ensuring That Evaluation Findings Are Used and Sharing Lessons Learned

  • Identify strategies to increase the likelihood that evaluation findings will be used.  
  • Identify strategies to reduce the likelihood that information will be misinterpreted.  
  • Provide continuous feedback to the program. 
  • Prepare stakeholders for the eventual use of evaluation findings. 
  • Identify training and technical assistance needs.
  • Use evaluation findings to support annual and long-range planning. 
  • Use evaluation findings to promote your program. 
  • Use evaluation findings to enhance the public image of your program.
  • Schedule follow-up meetings to facilitate the transfer of evaluation conclusions.
  • Disseminate procedures used and lessons learned to stakeholders.
  • Consider interim reports to key audiences.
  • Tailor evaluation reports to the audience(s).
  • Revisit the purpose(s) of the evaluation when preparing recommendations.
  • Present clear and succinct findings in a timely manner.
  • Avoid jargon when preparing or presenting information to stakeholders.
  • Disseminate evaluation findings in several ways.


Worksheet Step 6A: Communicating Results

Communicate to this audience | Most appropriate format | Most effective channel(s)

1. | |
2. | |
3. | |
4. | |
5. | |

Worksheet Step 6B: Ensuring Follow-up

Who will follow up with users on evaluation findings | How the follow-up will be done | Available support for follow-up

1. | |
2. | |
3. | |
4. | |
5. | |
Step 7: Final Program Evaluation Plan

Pulling It All Together

Thus far we have walked through the six steps of the CDC Framework for Program Evaluation in Public Health to facilitate programs and their evaluation workgroups as they think through the process of planning evaluation activities. We have described the components of an evaluation plan and details to consider while developing the plan in the context of the CDC Framework. In this section, we briefly recap information that you should consider when developing your evaluation plan.

Increasingly, a multi-year evaluation plan is required as part of applications for funding or as part of program work plans. An evaluation plan is more than a column added to the program work plan for indicators. These plans should be based on stated program objectives and include activities to assess progress on those objectives. Plans should include both process and outcome evaluation activities.

As previously discussed, an evaluation plan is a written document that describes how you will monitor and evaluate your program so that you can describe the What, the How, and the Why It Matters for your program. The What reflects the description and accomplishments of your program; your plan serves to clarify the program’s purpose, anticipated expectations, and outcomes. The How answers the question "How did you do it?" by assessing how the program is being implemented and whether it is operating with fidelity to the program protocol; it also informs course corrections that should be made during implementation. The Why It Matters represents how your program makes a difference and the impact it has on the public health issue being addressed. Being able to demonstrate that your program has made a difference can be critical to program sustainability.

An evaluation plan is similar to a program work plan in that it is a roadmap that guides the planning of activities used to assess the processes and outcomes of a program. An effective evaluation plan is a dynamic tool that can change over time, as needed, to reflect program changes and priorities, and it creates directions for accomplishing program goals and objectives by linking evaluation and program planning.

Ideally, program staff, evaluation staff, and the ESW will develop the evaluation plan while the program staff develops the program work plan. Developing the two documents simultaneously allows program staff and stakeholders to realistically think through the process and resources needed for the evaluation. It strengthens the link between program planning and evaluation and creates a feedback loop of evaluation information for program improvement and decision making.

Ideally, program staff and the ESW will develop the evaluation plan while developing the program work plan.

Often, programs have multiple funding sources and thus may have multiple evaluation plans. Ideally, your program will develop one overarching evaluation plan that consolidates all activities and provides an integrated view of program assessment. Then, as additional funding sources are sought and activities added, those evaluation activities can be enfolded into the larger logic model and evaluation scheme.

The basic elements of an evaluation plan are listed in Outline 7.1 below and described in the sections that follow.

However, your plan should be adapted to your specific evaluation needs and context. Additionally, it is important to remember that your evaluation plan is a living, dynamic document designed to adapt to the complexities of the environment within which your programs are implemented. The plan is a guide to facilitate intentional decisions. If changes are made, they are documented and done intentionally with a fully informed ESW.

Title page: This page provides an easily identifiable program name, the dates covered, and possibly the basic focus of the evaluation.

Question overview: In an evaluation plan, this is an overview of the evaluation questions for ease of reference, similar to the executive summary in a final evaluation report.

Intended use and users: This section fosters transparency about the purposes of the evaluation and who will have access to evaluation results. It is important to build a market for evaluation results from the beginning. This section identifies the primary intended users and the ESW and describes the purposes and intended uses of the evaluation.

Program description: This section provides a shared understanding of the description of your program and a basis for the evaluation questions and prioritization. This section will usually include a logic model and a description of the stage of development of the program in addition to a narrative description. This section can also facilitate completing the introduction section for a final report or publication from the evaluation. This section might also include a reference section or bibliography related to your program description.

Evaluation focus: There are never enough resources or time to answer every evaluation question. Prioritization must be collaboratively accomplished based on the logic model and program description, the stage of development of the program, program and stakeholder priorities, intended uses of the evaluation, and feasibility issues. This section will clearly delineate the criteria for evaluation prioritization and will include a discussion of feasibility and efficiency.

Methods: This section covers indicators and performance measures, data sources and selection of appropriate methods, roles and responsibilities, and credibility of evaluation information. This section will include a discussion about appropriate methods to fit the evaluation question. An evaluation plan methods grid is a useful tool for transparency and planning.
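As a thought experiment, the methods grid can be represented as one row per evaluation question, which makes missing cells easy to spot before the plan is finalized. This is a minimal sketch: the column names follow the sample grids later in this workbook, but the rows and the completeness check are assumptions for illustration only.

```python
# Illustrative evaluation plan methods grid: one dict per evaluation question.
# Column names follow the sample grids in this workbook; rows are hypothetical.
methods_grid = [
    {
        "evaluation_question": "Is the program reaching the intended providers?",
        "indicator": "% of providers recalling newsletter content",
        "method": "quantitative (survey)",
        "data_source": "annual provider survey",
        "frequency": "yearly",
        "responsibility": "evaluation staff",
    },
    {
        "evaluation_question": "How do participants experience the program?",
        "indicator": "themes from participant interviews",
        "method": "qualitative (interviews)",
        "data_source": "interview transcripts",
        "frequency": "midpoint and end of program",
        "responsibility": "",  # unassigned; flagged by the check below
    },
]

# Transparency check: flag any row with an empty cell so the ESW can resolve it.
for row in methods_grid:
    missing = [col for col, val in row.items() if not val]
    if missing:
        print(f"Incomplete row ({row['evaluation_question']}): missing {missing}")
```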

Analysis and interpretation plan: Who will get to see interim results? Will there be a stakeholder interpretation meeting or meetings? It is critical that your plans allow time for interpretation and review from stakeholders (including your critics) to increase transparency and validity of your process and conclusions. The emphasis here is on justifying conclusions, not just analyzing data. This is a step that deserves due diligence in the planning process. The propriety standard plays a role in guiding the evaluator’s decisions in how to analyze and interpret data to assure that all stakeholder values are respected in the process of drawing conclusions. A timeline that transparently demonstrates inclusion of stakeholders facilitates acceptance of evaluation results and use of information.

Use, dissemination, and sharing plan: Plans for use of evaluation results, communications, and dissemination methods should be discussed from the beginning. This is a critical but often neglected section of the evaluation plan. A communication plan that displays target audience, goals, tools, and a timeline is helpful for this section. The exercises, worksheets, and tools found in Part 2 of this workbook are to help you think through the concepts discussed in Part 1. These are only examples. Remember, your evaluation plan(s) will vary based on program and stakeholder priorities and context.

Outline 7.1: Basic Elements of an Evaluation Plan

  • Title page
  • Question overview
  • Intended use and users
  • Program description
  • Evaluation focus
  • Methods
  • Analysis and interpretation plan
  • Use, dissemination, and sharing plan

NOTE: The following section is a review of all the other sections. There’s no need to complete these, but they may be helpful for you to review all in one place.

Outline: 7.2 Evaluation Plan Sketchpad

Often, groups do not have the luxury of months to develop an evaluation plan. In many scenarios, a program team has only one opportunity to work with their ESW to develop the evaluation plan to submit with a funding proposal, and all of the work discussed in this workbook must be accomplished in a single workgroup meeting, retreat, or conference session. In this scenario, it is helpful to have an evaluation sketchpad to develop the backbone, or skeleton outline, of your evaluation plan. With the major components of your evaluation plan developed, you will have the elements necessary to submit a basic evaluation plan that can be further developed with your funder and future stakeholders. Even if you have time to fully develop a mature evaluation plan, this sketchpad exercise is often a great tool to use with an ESW in a retreat-type setting.

  1. First, brainstorm a list of stakeholders for your evaluation project.
Priority | Person/Group | Comments





2. Go back to your list and circle high-priority stakeholders or high-priority information needs.

From the list of high-priority stakeholders identified above, think about their information needs from the evaluation or about the program.

Primary Intended User | Information Needed
1.
2.
3.
4.
5.
6.
7.
8.
9.


Discuss the intended uses of the evaluation by primary intended users and program staff:

Primary Intended User/Program Staff | Intended Uses
1.
2.
3.
4.
5.
6.
7.
8.
9.


3. Discuss potential political agendas or environmental constraints (hidden agendas from stakeholders, team members, or the organization). What goals and objectives for the evaluation do stakeholders bring to the table before you even begin the evaluation? What is most important to each of the stakeholders at the table?


Stakeholder | Goals/Objectives





4. Briefly describe your program (in your plan, you will include your logic model(s), if you have them):

Description of Program:


5. Think back to the program description you just wrote. Where are you in your program's growth (beginning, middle, or mature)?

Stage of Growth:


6. Based on where you are in your program's growth, what does that tell you about what kinds of questions you can ask?

Stage of Growth | Questions
Beginning |
Middle |
Mature |


7. Based on your program's growth and your lists of high-priority stakeholders and high-priority information needs, what are your possible evaluation questions?

Your evaluation questions for the current evaluation are:






8. Now, take each question and think about ways you might answer it. Will your method be qualitative, quantitative, or both? Do you already have a data source? Will you have some success stories? How much will it cost? What resources do you have? Who needs to be involved to make the evaluation a success? How will you ensure use of lessons learned? How and when will you disseminate information? Below are two sample tables you can use to organize this information.

Evaluation Question | Indicator/Performance Measure | Method | Data Source | Frequency | Responsibility




Evaluation Question | Indicator/Performance Measure | Method | Data Source | Frequency | Responsibility | Cost Considerations




9. Now think about the different ways you might communicate information from the evaluation to stakeholders, using the table below. Communication may include information for stakeholders who are not on your ESW. You may want to provide preliminary results, success stories, and other interim products throughout the evaluation, and your ESW may assist in your communication efforts. What deadlines must be met, and what opportunities are lost if deadlines are not met? How will this affect the timetable you created in #8?

What do you want to communicate? | To whom do you want to communicate? | Format(s) | Channel(s)





References

Centers for Disease Control and Prevention. (2011). Developing an effective evaluation plan. https://www.cdc.gov/obesity/downloads/cdc-evaluation-workbook-508.pdf

U.S. Department of Health and Human Services, Centers for Disease Control and Prevention. (2011). Introduction to program evaluation for public health programs: A self-study guide. https://www.cdc.gov/evaluation/guide/CDCEvalManual.pdf

