Demystifying Evaluation Planning for Grant Proposals

Overview

In preparation for the upcoming grant writing season, we are offering an update on our formula for writing an evaluation plan. The evaluation section continues to be an important component of a grant proposal. As such, we have previously introduced a formula for how to write an effective evaluation plan for a grant proposal. In this webinar, we share our updated formula and the lessons we’ve gleaned over the past year of working with dozens of project teams, and we build on those lessons to offer insights on how to increase your potential to garner grant funding. In this 30-minute webinar we’ll:

  • Discuss what to review in a grant solicitation to aid in the development of the evaluation, making sure that all team members are aware of these key elements.
  • Suggest how to address unique situations and spacing issues, particularly for large grants with tight word counts for proposals.
  • Use funders’ reviews to inform the next draft of the proposal. (Resubmission is key; if you can resubmit, do it!)
  • Answer your questions.
  • For an optimal learning experience, we encourage you to review our Three-Part Formula for Writing a Grant Proposal Evaluation.

Transcript

Part 1

Welcome, everyone! I am so glad that you have joined us for the next installment of our Coffee Break Webinar series, Demystifying Evaluation Planning for Grant Proposals. Hopefully, you have a nice cup of coffee, or even a cold cup of coffee if the temperature in your neck of the woods is the way it is here. I also hope that you will learn a lot from our conversation today. Before I jump in, I do want to take a moment to reinforce that in our Coffee Break Webinars we really do encourage you to ask questions, so make sure to use the question function on your screen. We have with us Alyce Hopes, our outreach coordinator, who will help manage the Q&A portion of the webinar. She is also available if you are having any technical issues. And then, I am Lana Rucks, principal consultant with The Rucks Group. If you don’t know, The Rucks Group is a research and evaluation firm that gathers, analyzes, and interprets data to enable our clients to measure the impact of their work. We were formed in 2008, and over the past several years we have had the privilege of working primarily with higher education institutions and grants funded by federal agencies such as the National Science Foundation, the Department of Education, and the Department of Labor, as well as philanthropic organizations.

In thinking about the topic for today, this is really motivated by our own internal self-reflection on our work on evaluation plans over the last year. Over the last year, we have worked with approximately 50 different project teams and written about 47 project evaluation plans for grant proposals, which translates to roughly 90 million dollars of grant funds sought. In our self-reflection, we were also curious about what this means in terms of a win rate and award rate, and what benchmark we could measure against. So, when we look specifically at new submissions and the grants we know have been awarded so far, our win rate is about 53%, and we’re kind of proud of that because the benchmark win rate for grant writers is around 30% to 40%. In thinking about what got us there, we really think a lot of it comes back to that evaluation plan formula. With that, what we want to do in this time together is to 1.) quickly review the evaluation plan formula that we’ve introduced before, 2.) provide some updates learned over the last year of utilizing the formula and getting feedback from other individuals, and, of course, the whole purpose of these webinars, 3.) help reduce the evaluation angst. Let’s get started by reviewing the evaluation plan formula.

The evaluation plan formula is really based on three broad components. The first component is the foundations, where you want to make sure you’re including elements related to the theory of change and the evaluation questions. The theory of change really relates to “if we do this, then we get that,” while the evaluation questions relate to the idea that, just as research is driven by questions, evaluations are driven by questions as well. The next component is the methods, which covers two elements – the evaluation design (How are you going to design the evaluation? What’s going to be the context for interpreting the findings?) and data gathering & analysis (What data and analyses are you going to use to provide evidence of the impact or outcomes of the initiative you’re working with?). The final component is what we call the final, which is around using findings as well as the operational approach (Who will be using those findings? How will they be incorporated for continuous improvement purposes?). At the center of the formula is the idea that regardless of the type of grant you’re working on, whether it’s small, moderate, or large, you always want to have those elements included. The difference will be the scale at which you’re writing. So, for a small grant, the evaluation may have fewer words, whereas for a moderate grant you may have one to two pages to describe the evaluation section, and in that type of grant you may also have an explicit requirement for an external evaluator. For a large grant, where you may be required to have an external evaluator or even to name that evaluator, you may have between two to four pages to describe the evaluation plan. I went over the evaluation formula very quickly, so if you’re not familiar with it, I would encourage you to visit our YouTube page and review these components in the Three-Part Webinar for Writing a Grant Proposal Evaluation – that goes much more in-depth than I am able to right now. Another item that I would encourage you to review is the 5 Best Practices for Hiring the Right Evaluation Partner – particularly if you’re working on a moderate or large grant, you may want to review that for tips on how to hire and work with an evaluator.

With that as context, one of the things we learned over the year is that this formula is quite robust and can work in a variety of different situations. Let me try to demonstrate why I think that’s the case. I’m showing you a series of different, very pretty vases. Just look at each – the vase with the flowers, the empty vase, the blue vase, and this terra cotta vase. Now I am going to show you a picture, and very quickly, when you first looked at that picture, did a vase stand out for you? It may not have, because this is a pretty common ambiguous figure, but what I am essentially trying to do is to prime the idea of a vase. Very often when you are primed, or you see something of one type before you look at an ambiguous situation, that’s what you actually see in that picture. In many ways, the formula does this too, in terms of helping you digest requests for proposals that may be different or that you don’t normally work with. Let me give you an example of that. As I mentioned, we do a lot of work with NSF (National Science Foundation), DOE (Department of Education), and DOL (Department of Labor), but this last year we also worked on a couple of grants from other federal agencies, and each solicitation is set up slightly differently. In looking at this particular solicitation, there are a couple of things that stood out to me. 1.) When reading the evaluation and technical support components, where the solicitation asked to “describe the systems and processes that will support your organization’s collection of the performance measurement requirements for this program,” the first formula element that came to my mind was the operational approach. Similarly, when asked about “monitoring ongoing processes and progress toward meeting goals and objectives,” that same piece came to mind again. 2.) When looking at “including descriptions of inputs,” what came to mind was the theory of change. And not to belabor the point, but just to show how the formula maps onto a solicitation, 3.) when reading “describe the data collection strategy to accurately collect, manage, analyze, store, and track data,” data gathering and analysis came to mind for me. So, in essence, the real point is that the formula is robust, and it helps when you’re looking at a different situation because it lets you see something that may be “ambiguous” or “unclear” by using the formula as an organizing framework. Let me pause there and see if there are any questions. Alyce, are there any questions from anyone?

Q: Does it matter what order those elements are presented in?

A: In looking at how individuals have approached it, I don’t think the order really matters for most of the elements. Sometimes you may put the operational approach first, sometimes it may go last. The only piece, and I’m trying to think quickly here, that may make a difference is probably the theory of change, “if we do this, then we get that,” which you may want to come earlier. Something that we’ll get to later is the difference between what’s substantive and what’s stylistic. I think making sure you have those substantive elements included is the important piece. How you do that is a much more stylistic issue.

Part 2

Let’s go on and continue talking about some other updates and lessons learned. One question that has come up is in regard to what you should keep in mind from a practical standpoint. One issue is around spacing. If you have worked in grants, you know that space requirements are very important. We actually just finished a grant this week where we were trying to take it from 21 pages down to 15 pages. One of the things that has been illuminated this year is that when thinking about the different sizes of grants, from small, to moderate, to large, the way you approach the evaluation plan is by pulling out or pushing in elements. What I’ve learned is that with large grants, you may not necessarily get three or four pages. So now you have this larger grant with a lot of complexity, but you don’t necessarily have the space to handle that in the traditional way – so I had to make some adjustments, and I’ll share what I mean. When dealing with spacing issues, the areas of the evaluation plan that get most impacted are usually the evaluation questions and the data matrix, and again, let me demonstrate what I mean by that. At the heart of this formula is pushing in and pulling out different elements. So, when you have, for instance, a moderate-scale grant, you’re going to pull out in terms of the number of evaluation questions, say five questions, and then you create a data matrix associated with them. That’s very manageable within a two-page limit, but when you start thinking about a larger grant that has a lot more complexity to it, it’s very possible that you would end up with 20 or more questions, and I should mention that both of these examples come from the webinars from last year. So, for the large-scale grant, you may have a lot more questions, and for these purposes I won’t show all 20 questions, but one of the ways I approached this during the year was by thinking about what types of evaluation questions I would like to have asked and then creating overarching themes. So, if you look at these questions, there are some themes that emerge. Looking at the first three, the theme that emerges is really in regard to the number of participants. If you look at the next three questions, the theme that emerges is in regard to student success. The next three relate to employment, and the final one relates to wages. So, one of the ways you can handle not having a lot of space is by looking at what the themes are and then using those overarching themes when you’re writing out your evaluation questions and creating the data matrix – that will help you save some space. You can also note in the operational components that you’re going to fully distill out the evaluation questions when you get started (when the project is funded), as an opportunity to let reviewers know that those aren’t all the questions you’re going to address, but that you’re just approaching it by theme.

Something else in terms of practical considerations is saving space when developing the logic model. We usually develop the logic model from left to right; instead of writing it in this traditional framework, write it from top to bottom. So, here’s a logic model, which is basically a detailed theory of change, and this is the kind of template you see written in that traditional left-to-right layout. I was recently working on an evaluation plan and trying to develop the logic model – the one I’m about to present isn’t the one I was actually working on; it’s a hybrid of several different projects. As I was working on it, I could very easily get the inputs listed, but then I started trying to go through and write out the activities. It was a slightly complex project, and as I kept writing the activities, I realized I was running out of space and that perhaps this wasn’t going to be the most effective organization for communicating with the project team. It was actually in working with a grant writer that I saw, as she was developing the evaluation, that she simply went from top to bottom in a Word document to write up the logic model. As you can see, you can get all the activities listed on one page, which is a bit easier to do this way than if you are using the left-to-right framework. So, this is just something to help during the development phase, and if you actually have to submit the logic model with the grant, you can identify overarching themes, which will help to condense the logic model in the proposal as well. So this is a space saver for development and a potential space saver for the actual proposal. Let me pause there for a moment. Alyce, are there any additional questions at this point?

Q: Do you also cut back on the number of data points and data indicators because there are fewer questions?

A: That’s a really good question. Even though there are fewer questions because you are writing the questions at this overarching level, theoretically those data points should still fall within those questions and apply to them, because they will still link back to certain goals and objectives in the proposal. So I don’t necessarily cut back on the data points and data indicators. The only time I may try to condense things is if I’m still pressed with some spacing issues, and then I may try to come up with some additional creative overarching terms, but conceptually I’m not dropping or eliminating any data points.

Q: I could see how other areas of the evaluation plan could be impacted by spacing issues; do you see any other issues as well?

A: I do. I think, again, the evaluation questions and the data matrix are probably the two components most impacted, because if you’re trying to pull back on the amount of detail, that’s going to come from the evaluation questions and the data matrix, but I do also think the theory of change can be impacted as well. If you go back and look at the previous webinars, one of the things I suggest doing when you are trying to develop the theory of change is to create a sentence that says “if this, then that.” I also said that this can be a lengthier sentence, but still try to find a way to summarize the project within that if-then framework. If you have a very complicated project with a lot of moving parts, an if-then sentence can be challenging. What I did in a couple of situations this past year is I kept the “if,” but then instead of trying to create a single sentence, I sacrificed some space and made bullet points: if we do this, then we would get “a,” “b,” and “c,” and explained each. So, in essence, yes, I am in agreement with you that there are probably other areas that can be impacted as well.

Part 3

So the other item that I think we learned or had insights on is in regard to handling resubmission, and I would say definitely resubmit. I think that’s a really important piece – to get some feedback and to resubmit your proposal. I do think there are a couple of things to keep in mind when you resubmit. One is regarding the reviewers’ feedback. I mentioned this idea earlier, but when looking at the feedback, I conceptually think of it as being substantive or stylistic – this is just the gospel according to Lana. It’s kind of tricky because if you frame it in this way, it’s almost as though you’re saying that one is important and one is not important, but that’s not really the take-home message. For my role on projects, part of what I am trying to glean from feedback is what needs to be done not just for that particular project, but also what larger take-home messages I need to apply to other grants and other projects that we’re working on. So substantive issues to me are really the meat. If I take the example I provided before – making sure you have the data matrix, making sure you have the evaluation questions, outlining the operational approach – to me those are substantive issues. The stylistic issues, then, are how you put those together and what organization you associate with them. I’ve also had situations in which I received feedback that x, y, and z weren’t included when I thought they were – so I now make sure there are headers, and I literally change the style so that it’s understood what’s been included in the evaluation plan. The other piece is that if you are looking at the reviewers’ comments, you also want to make sure that you’re looking at the solicitation. This is particularly important if you have been working with NSF grants. This year a number of solicitations have changed; one is the S-STEM grant, where the solicitation changed the requirements around which “track” requires generation-of-knowledge research activities. So, you want to make sure that the reviewers’ feedback is still relevant to what is required in the current solicitation. Let me pause here to see if there are other questions.

Q: Are there any interesting trends emerging that you’ve seen?

A: That’s a good question. One trend I’ve seen over the last year is the extent to which industry partners and partnerships have emerged as being important. A lot of our initial work was with the NSF ATE community and technician education, and I think for years there was an understanding that industry partnerships and industry involvement were very critical to the work of ATE projects. What was very interesting in the last year was to start seeing, within undergraduate and graduate-level education, an emerging emphasis on industry partnerships – particularly within graduate education around STEM, and the attempt to transform graduate education with the realization that more individuals may go into industry versus the academy.

Part 4

If there are other questions, we can hold those, and I will try to make sure to follow up with you individually, but we do just have a couple of minutes left, so let me go over a couple of points quickly. Stay tuned for our next webinar, which will be this September. We’re really excited to be able to bring a couple of guest speakers in for that webinar. And as a final point, in thinking about some of these ideas – and again, these are just my perspectives based on my own experiences – I really don’t see a broad difference between awarded and non-awarded projects; the differences are usually relatively small items. I think applying these lessons learned and using the evaluation formula will help address some of those small items and increase the chance of your project actually being awarded.

So, with that, it’s been a really fast 30 minutes this afternoon, but thank you ever so much for joining us. If there are any outstanding questions that you haven’t asked, please don’t hesitate to reach out and we’ll make sure to answer them. Thank you and have a great rest of the day!