
Webinar

Presented by: The Rucks Group

Demystifying Evaluation Planning for Grant Proposals


June 17, 2021

Transcript

Introduction:


0m:0s

11m:30s

Let’s go on and continue talking about some other updates and lessons learned. One question that has come up is what you should keep in mind from a practical standpoint. One issue is spacing. If you have worked in grants, you know that space requirements are very important. We actually just finished up a grant this week where we were trying to take it from 21 pages down to 15 pages. One of the things that has been illuminated this year, in thinking about the different sizes of a grant, from small to moderate to large, is that the way you approach the evaluation plan is by pulling out or pushing in elements. What I’ve learned is that with large grants, you may not necessarily get three or four pages. So now you have this larger grant with a lot of complexity, but you don’t necessarily have the space to handle that in a traditional way, so I had to make some adjustments, and I’ll share what I mean.

When dealing with spacing issues, the areas of the evaluation plan that are most impacted are usually the evaluation questions and the data matrix, and again, let me demonstrate what I mean by that. At the heart of this formula is pushing in and pulling out different elements. When you have, for instance, a moderate-scale grant, you pull out in terms of the number of evaluation questions, say five questions, and then you create a data matrix associated with them. That’s very manageable within a two-page limit. But when you start thinking about a larger grant that has a lot more complexity to it, it’s very possible that you would end up with 20 or more questions, and I should mention that both of these examples come from the webinars from last year.
So, for the large-scale grant, you may have a lot more questions, and for these purposes I won’t show all 20 questions, but one of the ways I approached this year was by thinking about what types of evaluation questions I would like to ask and then creating overarching themes. If you look at these questions, some themes emerge. Looking at the first three, the theme that emerges is really the number of participants. If you look at the next three questions, the theme that emerges is student success. The next three relate to employment, and the final one relates to wages. So, one of the ways you can handle not having a lot of space is by looking at what the themes are and then writing your evaluation questions and building the data matrix around those overarching themes; that will help you save some space. You can also note in the operational components that you’re going to fully distill out the evaluation questions once the project is funded, as an opportunity to let reviewers know that those aren’t all the questions you’re going to address, just that you’re approaching it by theme.

Something else in terms of practical considerations is saving space when developing the logic model. We usually develop the logic model from left to right. Instead of writing it in that traditional framework, write it from top to bottom. Here’s a logic model, which is basically a detailed theory of change, and this is the kind of template you see when writing from that traditional left-to-right standpoint. I was recently working on an evaluation plan and trying to develop the logic model; the one I’m about to present isn’t the one I was actually working on, it’s a hybrid of several different projects. As I was working on it, I could very easily get the inputs listed, but then I started trying to write out the activities. It was a slightly complex project, and as I kept writing the activities, I realized I was running out of space and that perhaps this wasn’t going to be the most effective organization for communicating with the project team. It was actually in working with a grant writer that I saw, as she was developing the evaluation, that she simply went top to bottom in a Word document to write up the logic model. As you can see, you can get all the activities listed on one page, which is a bit easier this way than with the left-to-right framework. So, this is something to help in the development phase, and then if you actually have to submit the logic model in the grant, you can identify different themes, which will help condense the logic model in the proposal as well. This is a space saver for development and a potential space saver for the actual proposal. Let me pause there for a moment. Alyce, are there any additional questions at this point?

Q: Do you also cut back on the number of data points and data indicators because there are fewer questions?

A: That’s a really good question. Even though there are fewer questions, because you are writing the question at this overarching level, theoretically those data points should still sit within that question and apply to it, because it’s still going to link back to certain goals and objectives in the proposal. So I don’t necessarily cut back on what those data points and data indicators are. The only time I may try to condense things is if I’m still pressed with some spacing issues, and then I may try to come up with some additional creative overarching terms, but conceptually I’m not dropping or eliminating any types of data points.

Q: I could see how other areas of the evaluation plan could be impacted by spacing issues; do you see any other issues as well?

A: I do. I think, again, the evaluation questions and the data matrix are probably the two components most impacted, because if you’re trying to pull out the amount of detail, that’s going to come from the evaluation questions and the data matrix, but I do also think the theory of change can be impacted as well. If you go back and look at the previous webinars, one of the things I suggest doing when you are trying to develop the theory of change is to create a sentence that says “if this, then that.” I also said that this can be a lengthier sentence, but still try to find a way to summarize the project within that if-then framework. If you have a very complicated project with a lot of moving parts, an if-then sentence can be challenging. What I did in a couple of situations this past year is I kept the “if,” but then instead of trying to create a sentence, I sacrificed some space and made bullet points: if we do this, then we would get “a,” “b,” and “c,” and explained each. So, in essence, yes, I am in agreement with you that there are probably other areas that can be impacted as well.

22m:28s

So the other item that I think we learned or had insights on is handling resubmission, and I would say: definitely resubmit. I think that’s a really important piece, to be able to get some feedback and to resubmit your proposal. I do think there are a couple of things to keep in mind when you resubmit.

One is the reviewers’ feedback. I mentioned this idea earlier, but when looking at the feedback, I conceptually think of it as being substantive or stylistic; this is just the gospel according to Lana. It’s kind of tricky, because if you frame it this way, it’s almost as though you’re saying that one is important and one is not, but that’s not really the take-home message. For my role on projects, part of what I’m trying to glean from feedback is what needs to be done for that particular project, but I’m also trying to understand the larger take-home messages I need to apply to other grants and projects we’re working on. Substantive issues, to me, are really the meat. If I take the example I provided before, making sure you have the data matrix, making sure you have the evaluation questions, outlining the operational approach: those are substantive issues. The stylistic issues, then, are how you put those together and what organization you associate with them. I’ve also had situations in which I received feedback that x, y, and z weren’t included when I thought they were, so I now make sure there are headers, and I literally change the style so that it’s understood what’s been included in the evaluation plan.

The other piece is that if you are looking at the reviewers’ comments, you want to make sure you’re also looking at the solicitation. This is particularly important if you have been working with some NSF grants.
This year a number of solicitations have changed. One is the S-STEM grant, where the solicitation changed the requirements around which “track” required the generation-of-knowledge research activities. So, you want to make sure that the reviewers’ feedback is still relevant to what is required in the current solicitation. Let me pause here to see if there are other questions.

Q: Are there any interesting trends emerging that you’ve seen?

A: That’s a good question. One trend I’ve seen over the last year is the extent to which industry partners and partnerships have emerged as important. A lot of the initial space we worked in was the NSF ATE community and technician education, and I think for years there was an understanding that industry partnerships and industry involvement were very critical to the work of ATE projects. What was very interesting in the last year was starting to see, within undergraduate and graduate-level education, an emerging emphasis on industry partnerships, particularly within graduate education around STEM, and the attempt to transform graduate education with the realization that more individuals may go into industry versus the academy.

27m:53s

If there are other questions, we can hold those, and I will try to follow up with you individually, but we just have a couple of minutes left, so let me go over a couple of points quickly. Stay tuned for our next webinar, which will be this September; we’re really excited to bring in a couple of guest speakers. And as a final point, in thinking about some of these ideas, and again these are just my perspectives based on my own experiences, I really don’t see a broad difference between awarded and non-awarded projects; the differences are usually relatively small items. I think using some of these lessons learned and using the evaluation formula will help address those small items and actually increase the chance of your project being awarded.

So, with that, it’s been a really fast 30 minutes this afternoon, but thank you ever so much for joining us. If there are any outstanding questions that you haven’t asked, please don’t hesitate to reach out and we’ll make sure to answer them. Thank you and have a great rest of the day!
