Blog Archives

outcomes eval – necessary, but worth it?

I’m on an evaluation kick – digging deeper into the recent literature in this area. Today I ran across the Annie E. Casey Foundation’s making connections project, which aims to “improve the outcomes for families and children in tough or isolated neighborhoods” (emphasis mine). According to the project description, Casey research shows that “children do better when their families are strong, and families do better when they live in communities that help them to succeed.”
What this has to do with libraries may be obvious. Libraries clearly fit into two of the core premises of the project: building close ties with civic groups and having reliable services close to home. The project supports a number of “local learning partners” at the making connections sites, and libraries are included in a number of the project materials I reviewed.

But the project’s focus on evaluating core results is what caught my eye. There are six results the program is aiming for: increased income and earnings for families; increased assets for families; healthy children who are ready to succeed in school; increased civic participation for families, youth, and neighborhoods; strong informal support and networks for families and neighborhoods; and access to quality services and supports that work for families. Under each result is a statement, “We’ll know we’re making a difference when …”, with two or three indicators identified for each. All results and indicators are backed by research. (The Foundation also has a national indicators database that was designed to help evaluate making connections projects. Anyone can use the database to identify survey-based outcome measures related to eight community- and/or family-related domains.)

When Steven and I asked you (in our online survey) to tell us about your specific community building projects, we wondered how the programs were evaluated – and what (exactly) indicated that they were successful. Many of you told us about numbers of participants and described anecdotal evidence that “people liked it” and “want to do it again.”

Durrance and Fisher talk about a number of libraries doing outcomes-based research. I’m wondering about our readers: Has anyone developed something like Casey’s “core results” statement for their library project? What changes/outcomes were you looking for? What were the related indicators, and how did you measure them?

And underneath those questions are more of them: Is robust thinking around outcomes worth the planning, collection, and analysis effort? Or does it just end up confirming what we already “know” from simply working and staying in tune day to day in our libraries? How much of our work is intuitive – and how much of it is based on facts?

substantive stories for sustainability

Libraries often operate under the auspices of public good and public will. We’ve all heard the stories from local benefactors or library supporters that begin “As a young child, I took refuge in the stacks of my local neighborhood library…”, “The library was the only place where I fit in…”, or “Without the library…” As powerful as stories and anecdotes can be to our partners, patrons, and holders of purse-strings, it’s not enough to simply argue that sustaining libraries is “the right thing to do.” In happy, bountiful times, when everyone’s feeling prosperous, even generous, perhaps “Libraries Change Lives” is good enough. Meanwhile, we’ve spent a lot of time counting – counting our resources, our processes, and our patrons’ use of the library. Take these last two points together. We take for granted that “Libraries Change Lives,” and believe, or perhaps hope, that the number of things we can count along the way will be enough to keep doors open when times are lean. We’re learning now: it won’t. We’ve got a long way to go before we can “Prove it!”

In their book “How Libraries & Librarians Help,” Fisher and Durrance (thanks, Patrick) identify an “urgent need to tell the library story more effectively.” Economic downturns, swings of the political pendulum, and the clear need for vital services like police and emergency response all call into question the services of the library, especially if we’ve not taken care to match them with community needs, or stayed in touch with our community about how we’re meeting them. Many libraries have found more robust “business reasoning” to be effective in determining and articulating value, because it requires that we look at a bottom line and determine whether what we’re doing makes sense based on needs, resources, and outcomes. The bottom line for libraries is not financial profit; it’s sustainability.

But to get to sustainability, we have to know where we are and where we have been. Telling our libraries’ stories, not only through traditional library metrics but through measurement and analysis of the impacts or outcomes of our services and programs, is one way to help ensure that we get there. When our stories are more substantive, we’re getting closer to “proving it.”

How does your library tell its story? to patrons? decision-makers? supporters? partners? If you have a good example – or a contact for someone who’s actively accountable through regular evaluation AND doing a great job getting the word out about it – let us know; I’d love to chat with them.

Oh yeah? Prove it!

The concept of evaluation is getting more and more attention these days – but is it a critical element of community building? I say: Yes!

The attention seems to have begun with the Government Performance and Results Act of 1993. Under the Act, agencies are required to develop multiyear strategic plans, annual performance plans, and annual reports. Because many libraries are affiliated with city, county, state, or other government entities, and are primarily supported by public funding, it was only a matter of time before the Act would affect library practice.

Next, the Institute of Museum and Library Services (IMLS) focused on working with its grantees to measure the impact of their work through “systematic evaluation of results – outcomes.” Otherwise known as Outcomes Based Evaluation (OBE), the process is defined by IMLS as “a systematic way to determine if a program has achieved its goals.” Although grantees are not required to conduct OBE on their projects funded through IMLS, the agency is now required to report to Congress in terms of measured outcomes. As outcome evaluation becomes mainstream for government and other organizations, IMLS is sure to require its grantees to evaluate their projects against their intended outcomes.

Finally, non-government benefactors, funders, and supporters of libraries are increasingly interested in their investments in libraries. They want to know if our spaces, buildings, programs, services, and projects are achieving desired results. Meanwhile, shifts in our cultural, political, information, and community landscapes all call into question: Does what we’re doing support our mission, provide value, and create change? A presentation introducing OBE (now available on the IMLS Web site) states: “Libraries Change Lives – Oh Yeah? Prove It!” We may still be far from what David White calls the “culture of assessment” (Matthews, Measuring for Results, 2004), but the culture of accountability, no matter what type of library we’re working in, is upon us.

So, we’ve got all of these factors drawing us to the research desk, but what does it all have to do with building community? Here’s what I’m thinking: Is your community served well by your library? Which library services meet which needs? Do your programs achieve their intended outcomes, and how well? How do you measure and track the success of your programs? And how do you communicate the analysis of these measures back to your community? Ultimately, these are questions about community connections. Libraries that build community are accountable – to their staff, patrons, partners, and funders. It’s just another way to keep communication open and flowing through the relationships we’re constantly creating and fostering.
I’m trying to take what some may feel is a dry, analytical process and infuse some community-focused energy into it. What do you think? Tenuous? Or common sense? And if you know of any work/projects connecting evaluation to outreach and/or partnerships – I’d love to hear about them.

-CRH
