Using Logic Models to Navigate Projects

Starting a new project is a lot like going on a road trip to a new place. We know what our destination is; it’s just actually getting there that’s a bit fuzzy. That’s why we use GPS (or maps, if you prefer the old-fashioned route!). I like to think of logic models as the GPS of a project because this tool provides a roadmap for achieving a project’s outcomes.

A logic model is a pictorial representation that conveys the relationship between inputs, activities, and anticipated outcomes. The logic model is a living document that changes throughout the life of the project as a deeper understanding of the connection between activities and outcomes emerges. As the figure below demonstrates, these components usually flow from left to right, with some sort of visual connectors, such as arrows, to direct readers through the model.

[Figure: Logic model template]

A strength of the logic model lies not solely in the document itself but also in the internal alignment that occurs through developing one. The process of building a logic model pushes everyone toward a more systematic, theory-driven understanding of how actions are connected to desired outcomes. Creating the logic model gives team members an opportunity to make assumptions explicit and reach consensus around project assumptions in a way that generally doesn’t occur naturally within the grant management process. During work sessions to develop a logic model, it is not unusual to hear phrases such as “I was envisioning this differently” or “I assumed that we meant…”

It is important to note that a logic model is only as good as the information provided: If the individuals involved in the development of the tool don’t feel comfortable sharing their thoughts and comments, then some of these benefits will not be realized. So creating a safe space for stakeholders to share is an important part of developing a logic model.

After a logic model has been articulated, it is a great resource throughout the project’s life cycle. Project teams can rely on their logic model as a way to keep the project on track, by using it as a periodic checkpoint. By continually referring back to the logic model, team members will be able to identify leading and lagging indicators. Additionally, the logic model has utility as a decision-making tool. Logic models can help predict otherwise unintended consequences, allowing for teams to make informed decisions.

On a practical note, there is only so much information that can and should be represented in a logic model. In other words, there needs to be a balance between simplicity and completeness in order to optimize the utility of a logic model.

Planning and executing a project is not an easy task. So if you’re struggling to find your way with a project, developing a logic model may help. Remember, you wouldn’t attempt a road trip without directions to your destination – so simplify your project by adding some project navigation.

Research and Evaluation: Presentation from AEA

If you’ve been working with grants for a while, you have probably noticed that the federal grant funding process is changing, particularly as it relates to evaluation. In November of last year, I had the pleasure of working with Dr. Kelly Ball-Stahl and Jeff Grebinoski of Northeast Wisconsin Technical College (NWTC) on a presentation at the annual American Evaluation Association meeting regarding our shared experience of navigating these changing requirements.

Over the past few years, we’ve noticed remarkable changes in evaluation expectations. For instance, a few years ago, examining outputs was enough, but now there is a much greater emphasis on outcomes. Similarly, there was a time when there was a large conceptual distinction between evaluation and research; those clear dividing lines are starting to blur through an increasing emphasis on evaluation rigor within the federal funding space, particularly from the Department of Education, the Department of Labor, and the National Science Foundation. It should be noted that this shift is not occurring simply to make evaluation and grant management more challenging. The underlying motivation is to ensure that the right interventions are being implemented to help the greatest number of individuals.

So, what are some consequences of these changes? Grant writers are increasingly involving evaluators in the evaluation planning of these types of grants because evaluators have the expertise to design experimental and rigorous quasi-experimental evaluations. For instance, at The Rucks Group we routinely work with grant writers to write evaluation plans for federal funding sources.


Another important consequence is that institutions have to strengthen their data collection systems. Data collection systems are all the entities within an organization that are involved in gathering facts, numbers, or statistics and effectively communicating this information to the right individuals. Building a fully functional data collection system is challenging: it requires the “system” to have a shared data language and an institutional research department with the resources to respond appropriately to data demands.

Although the changing requirements of federally funded grants do pose challenges for interested organizations, these changes also provide extraordinary opportunities. By increasing the rigor with which we evaluate projects, we are also creating a deeper understanding of what works. Through these efforts, our collective work will improve the lives of as many individuals as possible.

New Team, New Year

Although we say this every year, it is truly hard to believe that another year has passed. And while I’m still getting used to writing “2016,” I am particularly excited about this year. We will continue to work with many great clients and exciting projects, some in new domains.

To continue to realize our core values concerning the quality of our services and products, we have expanded our team to bring new talent and expertise to the firm. In October of last year, we welcomed two new employees: Jeremy Schwob and AnnaBeth Fish. Jeremy Schwob joins our team as a Research and Evaluation Associate. Jeremy is finishing his master’s in Clinical Psychology at the University of Dayton (UD), which he expects to earn in spring 2016. Jeremy completed his undergraduate degree at UD, where he received a BA in Psychology. Jeremy is analytical and really enjoys working with data.

AnnaBeth Fish is our Administrative Assistant. AnnaBeth has an MA in Organizational Communication from Ball State University, and a BA in Communication Studies from Hollins University. She helps the firm in many ways, including calendar management, maintaining our website, general office administration, and completing the notorious “other tasks not specified.”

In addition to welcoming new individuals, we’ve also expanded the roles of existing team members. Joe Williams joined The Rucks Group as an intern in May 2015. After completing his internship, he stayed on with us, working a few hours per week during fall 2015. He is graduating a semester early with a BA in Psychology and will work nearly full-time in 2016 until matriculating to graduate school in the fall.

Carla Clasen, who serves as a subcontractor, still works closely with the firm. In 2016, her work will begin to focus more on our projects within the public health area. As many of you may recall, Carla worked as an evaluator within public health for nearly 20 years.

If you’d like more information about any of our team members or the firm itself, be sure to check out the About page on our website.

As we move into this year, we will continue to develop and attract the best talent to be able to optimally provide research and evaluation expertise to our clients and projects.