In our increasingly measurement-driven culture, gathering evidence of the difference an initiative makes is important, and gathering credible evidence of outcomes is an imperative within grant-funded work. All too often, evaluations of grant-funded initiatives focus on project activities (i.e., outputs) rather than meaningful project impact (i.e., outcomes). Averting this common error begins with the evaluation plan written for the grant proposal. Whether you need to write an evaluation plan or review one provided for your project, this webinar series will provide a formula for writing a program evaluation plan. Presented in three parts, the series will give participants an understanding of the elements of an evaluation plan and how to scale them for large and small grant proposals. Time will be allocated to address participants’ specific questions.
Part I: July 1, 2020 2:00p – 2:30p The Foundations: Theory of Change, Logic Models, and Evaluation Questions The first webinar of the three-part series begins with a brief review of the six key elements that should be included in every evaluation plan. The webinar will then dig into the foundational components of a program evaluation: theory of change, logic models, and evaluation questions.
Part II: July 15, 2020 2:00p – 2:30p The Methods: Evaluation Design and Data Gathering & Analytical Approaches Building on the program evaluation foundations laid in Part I, this webinar will focus on the methods of a program evaluation: design, data gathering strategies, and analytical approaches.
Part III: July 29, 2020 2:00p – 2:30p The Finale: Evaluation Use, Sharing Findings, and Evaluation Teams In Part III, the final key elements of a program evaluation will be discussed: using program evaluation for continuous improvement, reporting to funding agencies, sharing findings with different audiences, and involving the necessary personnel throughout the entire process. Most importantly, the final segment will pull together all three parts to show participants how to adjust the elements of an evaluation plan to fit the size of the program proposal.
About the Presenter: Lana Rucks, Ph.D., brings to her work nearly two decades of research and evaluation experience. Her extensive knowledge of the design and implementation of research and program evaluations is based on her leadership of dozens of evaluation initiatives. She has evaluated projects funded by federal agencies, including the Centers for Disease Control and Prevention, the U.S. Department of Labor, the National Institutes of Health, and the National Science Foundation, as well as projects funded by non-profit organizations such as the Bill and Melinda Gates Foundation and the Kresge Foundation.