
Are We Making Effective Grants? (Part 2)

October 8, 2014
Nick Randell, Program Officer

[This second (and final) installment takes a look at recent Tower grants and their impact on the target population, grantee organization, and the field at large.]

Philanthropy is increasingly interested in assessing the broader impact of grantmaking. On our lessons-learned forms, we also scored grants for impact ("significant," "some," or "no impact"). We looked at three kinds of impact:

  1. impact on target population (was a clinic able to see more young patients in a given month?);

  2. impact on organizational capacity (did an agency create new in-house trainers?);

  3. impact on the field (did the work help convince a state agency to expand reimbursements to a new service delivery model?). 

It is worth noting that we expected impact on the field to be fairly modest, simply because most of the grants we looked at did not have field-level objectives.

Here are three pie charts showing what we found for our 54 grants.



We tried to capture some key features of projects that demonstrated "significant" impact on the target population. Several made a point of engaging and educating referral networks, which resulted in more clients being served. Evidence-based practices gave projects a boost when the field widely recognized those practices as effective.



Grants were more likely to have significant impact on organizational capacity when grantees focused on staff buy-in from the start, developed internal training capacity, attended to supportive system changes and IT, invested in individual staff development (for example, supporting staff certification or other credentials), and planned for a degree of staff turnover.


I think we saw more field impact than we expected (61% of grants had at least some impact). Here are a few factors that seem to contribute to field-level impact.

  • Implementation of a particular program reinforces a larger, regional trend toward more evidence-based approaches (particularly if the provider is one of the largest in the area)
  • Program growth increases demand for job candidates with specific skill sets, particularly if this is communicated to community colleges/job training programs
  • Lessons learned from program implementation get fed back to the model developer/author

The way we look at grant impact is evolving at the Tower Foundation. 

With the adoption of Results Based Accountability in the near future (see Don Matteson's blog from July 2014), we will identify and track specific performance indicators at the organizational and community levels. These indicators, and the broader objectives they serve, will be shared with both grantees and the general public. But that's another blog.


Photo by Rick Bolin
Flickr: Archery World Cup /4929302647
Creative Commons 2.0 Licensed
