
Evaluation Manual: Step 6 - Ensure Use of Evaluation Findings and Share Lessons Learned

Introduction to Program Evaluation for Public Health Programs


Evaluating Appropriate Antibiotic Use Programs


Released April 2006


The ultimate purpose of program evaluation is to use the information to improve programs. The purpose(s) you identified early in the evaluation process should guide the use of the evaluation results. The evaluation results can be used to demonstrate the effectiveness of your program, identify ways to improve your program, modify program planning, demonstrate accountability, and justify funding.

Additional uses include the following:

  • Demonstrate to legislators or other stakeholders that resources are being well spent and that the program is effective.
  • Aid in forming budgets and justify the allocation of resources.
  • Compare outcomes with those of previous years.
  • Compare actual outcomes with intended outcomes.
  • Suggest realistic intended outcomes.
  • Support annual and long-range planning.
  • Focus attention on issues important to your program.
  • Promote your program.
  • Identify partners for collaborations.
  • Enhance the image of your program.
  • Retain or increase funding.
  • Provide direction for program staff.
  • Identify training and technical assistance needs.

What’s involved in ensuring use and sharing lessons learned? Five elements are important in making sure that the findings from an evaluation are used:

  • Recommendations
  • Preparation
  • Feedback
  • Follow-up
  • Dissemination


Making Recommendations

Recommendations are actions to consider as a result of an evaluation. Recommendations can strengthen an evaluation when they anticipate and respond to what users want to know; they can undermine an evaluation's credibility when they are not supported by sufficient evidence or are not in keeping with stakeholders' values.

Your recommendations will depend on the audience and the purpose of the evaluation (see text box). Remember, you identified many or all of these key audiences in Step 1, and have engaged many of them throughout as stakeholders. Hence, you have maximized the chances that the recommendations that you eventually make are relevant and useful to them. You know the information your stakeholders want and what is important to them. Their feedback early on in the evaluation makes their eventual support of your recommendations more likely.

Some Potential Audiences for Recommendations

  • Local programs
  • The state health department
  • City councils
  • State legislators
  • Schools
  • Workplace owners
  • Parents
  • Police departments or enforcement agencies
  • Healthcare providers
  • Contractors
  • Health insurance agencies
  • Advocacy groups


Illustrations from Cases

Here are some examples, using the case illustrations, of recommendations tailored to different purposes and for different audiences:

Audience: Local provider immunization program.
Purpose of Evaluation: Improve program efforts.
Recommendation: Thirty-five percent of providers in Region 2 recalled the content of the monthly provider newsletter. To meet the current objective of a 50% recall rate among this population group, we recommend varying the media messages by specialty and increasing the number of messages placed in the journals of the targeted specialties.

Audience: Legislators.
Purpose of Evaluation: Demonstrate effectiveness.
Recommendation: Last year, a targeted education and media campaign about the need for private provider participation in adult immunization was conducted across the state. Eighty percent of providers were reached by the campaign and reported a change in attitudes toward adult immunization, a twofold increase from the year before. We recommend that the campaign be continued and expanded to emphasize minimizing providers' missed opportunities to immunize adults.

Audience: County health commissioners.
Purpose of Evaluation: Demonstrate effectiveness of childhood lead poisoning prevention (CLPP) efforts.
Recommendation: In this past year, county staff identified all homes of children with elevated blood lead levels (EBLLs) in targeted sections of the county. Data indicate that only 30% of these homes have been treated to eliminate the source of the lead poisoning. We recommend that you incorporate compliance checks for the lead ordinance into the county’s housing inspection process and apply penalties for noncompliance by private landlords.

Audience: Foundation funding source for affordable housing program.
Purpose of Evaluation: Demonstrate fiscal accountability.
Recommendation: For the past 5 years, the program has worked through local coalitions, educational campaigns, and media efforts to increase the engagement of volunteers and sponsors and to match them with 300 families in need to build and sell houses. More than 90% of the families are still in their homes and making timely mortgage payments. However, while families report satisfaction with their new housing arrangement, we do not yet see evidence of changes in employment and school outcomes. We recommend continued support for the program, expanded to include an emphasis on tutoring and life coaching by the volunteers.


Preparation

Preparation refers to the steps taken to get ready to eventually use the evaluation findings. Through preparation, stakeholders can:

  • Strengthen their ability to translate new knowledge into appropriate action.
  • Discuss how potential findings might affect decision-making.
  • Explore positive and negative implications of potential results and identify different options for program improvement.


Feedback

Feedback is the communication that occurs among everyone involved in the evaluation. Feedback, necessary at all stages of the evaluation process, creates an atmosphere of trust among stakeholders. Early in an evaluation, the process of giving and receiving feedback keeps an evaluation on track by keeping everyone informed about how the program is being implemented and how the evaluation is proceeding. As the evaluation progresses and preliminary results become available, feedback helps ensure that primary intended users and other stakeholders have opportunities to comment on evaluation decisions. Valuable feedback can be obtained by holding discussions and routinely sharing interim findings, provisional interpretations, and draft reports.


Follow-up

Follow-up refers to the support that many users need throughout the evaluation process; in this step in particular, it means the support needed after users receive evaluation results and begin to reach and justify their conclusions. Active follow-up can achieve the following:

  • Remind users of the intended uses of what has been learned.
  • Help to prevent misuse of results by ensuring that evidence is applied to the questions that were the evaluation’s central focus.
  • Prevent lessons learned from becoming lost or ignored in the process of making complex or political decisions.


Dissemination - Sharing the Results and the Lessons Learned from Evaluation

Dissemination is the process of communicating evaluation procedures or lessons learned to relevant audiences in a timely, unbiased, and consistent manner. Regardless of how communications are structured, the goal of dissemination is to achieve full disclosure and impartial reporting. Planning effective communications requires:

  • Advance discussion of the reporting strategy with intended users and other stakeholders.
  • Matching the timing, style, tone, message source, vehicle, and format of information products to the audience.

Some methods of getting the information to your audience include:

  • Mailings
  • Websites
  • Community forums
  • Media (television, radio, newspaper)
  • Personal contacts
  • Listservs
  • Organizational newsletters

If a formal evaluation report is the chosen format, it must clearly, succinctly, and impartially communicate all parts of the evaluation (see text box). The report should be written so that it is easy to understand; it need not be lengthy or technical. You should also consider oral presentations tailored to various audiences. An outline for a traditional evaluation report might look like this:

  • Executive Summary
  • Background and Purpose
    • Program background
    • Evaluation rationale
    • Stakeholder identification and engagement
    • Program description
    • Key evaluation questions/focus
  • Evaluation Methods
    • Design
    • Sampling procedures
    • Measures or indicators
    • Data collection procedures
    • Data processing procedures
    • Analysis
    • Limitations
  • Results
  • Discussion and Recommendations

Tips for Writing Your Evaluation Report

  • Tailor the report to your audience; you may need a different version of your report for each segment of your audience.
  • Present clear and succinct results.
  • Summarize the stakeholder roles and involvement.
  • Explain the focus of the evaluation and its limitations.
  • Summarize the evaluation plan and procedures.
  • List the strengths and weaknesses of the evaluation.
  • List the advantages and disadvantages of the recommendations.
  • Verify that the report is unbiased and accurate.
  • Remove technical jargon.
  • Use examples, illustrations, graphics, and stories.
  • Prepare and distribute reports on time.
  • Distribute reports to as many stakeholders as possible.


Applying Standards

The three standards that most directly apply to Step 6, Ensure Use and Share Lessons Learned, are utility, propriety, and accuracy. As you use your own evaluation results, the questions presented in Table 6.1 can help you clarify and achieve these standards; the table also includes a feasibility question about the format of your reports.

Table 6.1. Standards and Questions for Step 6

Utility
  • Do reports clearly describe the program, including its context, and the evaluation’s purposes, procedures, and findings?
  • Have you shared significant mid-course findings and reports with users so that the findings can be used in a timely fashion?
  • Have you planned, conducted, and reported the evaluation in ways that encourage follow-through by stakeholders?

Feasibility
  • Is the format appropriate to your resources and to the time and resources of the audience?

Propriety
  • Have you ensured that the evaluation findings (including the limitations) are made accessible to everyone affected by the evaluation and others who have the right to receive the results?

Accuracy
  • Have you tried to avoid the distortions that can be caused by personal feelings and other biases?
  • Do evaluation reports impartially and fairly reflect evaluation findings?

Evaluation is a practical tool that states can use to inform programs’ efforts and assess their impact. Program evaluation should be well integrated into the day-to-day planning, implementation, and management of public health programs. Program evaluation complements CDC’s operating principles for public health, which include using science as a basis for decision making and action, expanding the quest for social equity, performing effectively as a service agency, and making efforts outcome-oriented. These principles highlight the need for programs to develop clear plans, inclusive partnerships, and feedback systems that support ongoing improvement. CDC is committed to providing additional tools and technical assistance to states and partners to build and enhance their capacity for evaluation.


Checklist for Ensuring Use of Evaluation Findings and Sharing Lessons Learned

__ Identify strategies to increase the likelihood that evaluation findings will be used.

__ Identify strategies to reduce the likelihood that information will be misinterpreted.

__ Provide continuous feedback to the program.

__ Prepare stakeholders for the eventual use of evaluation findings.

__ Identify training and technical assistance needs.

__ Use evaluation findings to support annual and long-range planning.

__ Use evaluation findings to promote your program.

__ Use evaluation findings to enhance the public image of your program.

__ Schedule follow-up meetings to facilitate the transfer of evaluation conclusions.

__ Disseminate procedures used and lessons learned to stakeholders.

__ Consider interim reports to key audiences.

__ Tailor evaluation reports to audience(s).

__ Revisit the purpose(s) of the evaluation when preparing recommendations.

__ Present clear and succinct findings in a timely manner.

__ Avoid jargon when preparing or presenting information to stakeholders.

__ Disseminate evaluation findings in several ways.

Worksheet 6A - Communicating Results

I need to communicate to this audience | This format would be most appropriate | This channel(s) would be most effective
1. _______________ | _______________ | _______________
2. _______________ | _______________ | _______________
3. _______________ | _______________ | _______________
4. _______________ | _______________ | _______________
5. _______________ | _______________ | _______________
6. _______________ | _______________ | _______________

Worksheet 6B - Ensuring Follow-up

The following will follow up with users of the evaluation findings | In this manner | This support is available for follow-up
1. _______________ | _______________ | _______________
2. _______________ | _______________ | _______________
3. _______________ | _______________ | _______________
4. _______________ | _______________ | _______________
5. _______________ | _______________ | _______________
6. _______________ | _______________ | _______________


Ensure Use of Evaluation Findings and Share Lessons Learned

Evaluating Appropriate Antibiotic Use Programs

As discussed earlier, the way you use your evaluation findings and the recommendations you make will differ depending on your audience. Following are some hypothetical evaluation findings and potential recommendations that could be developed for the stakeholders listed.

  • Audience: State and local health department staff/health department administration.
    Purpose of evaluation: Demonstrate effectiveness and expand program to new audiences.
    Evaluation findings: Health education materials on appropriate antibiotic use were distributed to patients and providers at private practices in one major metropolitan area. Both consumers and providers reported high levels of knowledge and awareness of program messages following the intervention.
    Recommendation: We recommend using the state and local health department infrastructure to expand distribution of materials to both patients and providers at public health clinics throughout the state.
  • Audience: Managed care organizations.
    Purpose of evaluation: Demonstrate effectiveness; improve program efforts.
    Evaluation findings: Appropriate prescribing guidelines were distributed to all providers in a managed care organization for the past two years, but prescribing rates did not change.
    Recommendation: We recommend convening groups of providers and administrators from the managed care organization to discuss institutional barriers to changing prescribing practices (e.g., short visit times, formulary inventory) and suggestions for how to overcome these barriers (e.g., restructuring patient schedules, revisions to formulary).
  • Audience: Funding source.
    Purpose of evaluation: Improve program efforts and reach multiple audiences.
    Evaluation findings: For the past year, the state appropriate antibiotic use program has participated in an English-language media campaign that includes print, radio, and television ads in conjunction with CDC’s national media campaign. Half of the Caucasian population surveyed, but almost none of the American Indians, recalled the content of these ads.
    Recommendation: To expand the reach of this campaign to include the state’s large American Indian population, we recommend increased support to develop culturally and linguistically appropriate materials for this population.
  • Audience: Coalition members.
    Purpose of evaluation: Develop and implement sustainability plan.
    Evaluation findings: A coalition developed a presentation on antibiotic resistance and appropriate antibiotic use for community groups. Coalition members were trained to deliver the presentation and did so as their work schedules permitted. Within six months, the coalition had received over 100 requests for community presentations, and groups often had to wait several months before a speaker was available. Participant evaluations of the community presentations were overwhelmingly positive and showed increases in participants’ knowledge and awareness of appropriate antibiotic use messages following presentations.
    Recommendation: To continue providing community presentations, the coalition will need to identify additional speakers. We recommend recruiting and training graduate students (e.g., public health, medical, or pharmacy students) to deliver presentations as part of their field work or community service requirements.

