By Mélanie L. Sisley, with special thanks to Karen Henchey, July 2018
"I feel happier" was just one of the many heartening comments patrons of the West Island Women’s Centre (WIWC) made about their experience. Seeing “happiness” emerge as one of the words used to describe the centre was like getting 100% plus bonus points on a chemistry exam. The WIWC professes its mission to be “improving the lives of women.” But how can we know for sure that these are not just lofty words that don't translate into reality? And how can you verify that lives have been improved? For a non-profit organization like the WIWC, showing concrete evidence of results is fundamental to funding. This is what prompted it to embark on the task of revamping its evaluation strategy so that the "real" impacts of the centre could speak for themselves.
Personally, I discovered the WIWC as a new mom in need of some time to myself (and, I’ll admit, advice and support with parenting). So I can say firsthand that the WIWC is indeed a tremendous resource for support and relief from life's many challenges. Later, as the (volunteer) Programming Director, I was offered the opportunity to participate in revising the evaluation strategy to articulate, in concrete actions and evidence, how the Centre is reaching its objective. A challenge I jumped at without hesitation.
Although I have built several evaluation strategies for corporate training, and of course developed many exams when I was a teacher, creating an evaluation strategy for a non-profit was a multi-layered process requiring something very different from a learning performance evaluation. So I went from master to student and participated in training given by the Centre for Community Organizations (COCo) along with the WIWC Executive Director, Karen Henchey, and Programming and Membership Coordinators, Wendy Wong and later Kristin Illiffe. I'm not going to lie to you: building and executing a meaningful evaluation strategy is not an easy process, and there is no cookie-cutter template. And if you find someone else's tool that looks easy and works well, it's probably because a lot of thinking, prototyping, and adjusting was invested in its creation. In other words, don't expect to get it right on the first try. That said, I can share a few lessons and revelations from the journey:
- Use a model to guide the process. This is not the kind of trip you want to take without a map. COCo introduced us to the logic model, which was a great framework to help us think this through. There are others, like the Kirkpatrick and CIPP evaluation models.
- Work backwards: Start with the desired outcomes, then identify the actions that contribute to those outcomes, then look for the performance indicators that show you your actions are aligned with your objective.
- Identify "natural" information sources. When we think of evaluations, we immediately think of questionnaires, but often the real meat is in informal impressions, hallway conversations, and bathroom side discussions. But how can we capture this? One group shared that their staff keep a journal of some of their daily impressions, good and bad. Keep in mind that interviews and focus groups are quite underused and often more reliable than questionnaires.
- Be critical of your data: If you collect responses from only 2 of 15 participants, they may not represent the situation adequately. If the data isn't representative, don't use it.
- Get curious about what is not there: If there were few respondents, try to figure out why. Was there a high drop-out rate for the activity? If so, why?
- Crunch the numbers. This is a science in its own right. I would stress, however, that qualitative analysis techniques can be a bit messy but often very meaningful. Have a way to streamline data and represent it visually. At the WIWC we used good old Excel and tag clouds. There are other tools to process your data, such as Google Forms and SurveyMonkey.
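To give a flavour of the "crunch the numbers" step: the raw material for a tag cloud is simply a count of how often each meaningful word appears across free-text responses. Here is a minimal Python sketch of that idea; the sample responses and the tiny stop-word list are invented for illustration, not real WIWC data.

```python
from collections import Counter
import re

# Hypothetical free-text survey responses, standing in for real comments.
responses = [
    "I feel happier and more supported",
    "The parenting workshops made me feel supported",
    "Happier, less isolated, and more confident",
]

# A tiny stop-word list; a real analysis would use a much fuller one.
STOP_WORDS = {"i", "and", "the", "me", "more", "less", "made"}

def term_frequencies(texts):
    """Count how often each meaningful word appears across all responses."""
    words = []
    for text in texts:
        # Lowercase, keep only alphabetic tokens, drop stop words.
        words.extend(w for w in re.findall(r"[a-z]+", text.lower())
                     if w not in STOP_WORDS)
    return Counter(words)

freqs = term_frequencies(responses)
# The most frequent terms become the biggest words in the tag cloud.
print(freqs.most_common(3))
```

A tag-cloud tool (or even Excel) then scales each word's display size by its count, which is how "happier" and "supported" would visually dominate.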
After all this you might be asking yourself: is it worth all the effort? Of course, my answer is YES! A good evaluation strategy not only gives you precious data to adjust and calibrate what you do, it also provides an organic balance sheet of your organization, and before you know it, you will be improving your performance. It won't be a 2+2 equation, but a good evaluation strategy can bring you closer to your goals in subtle but impactful ways.
Mélanie Sisley was an insightful and productive member of the team responsible for improving the tools our organization uses to evaluate our programmes. With her background in education, she played an important role in articulating the anticipated impact of our programmes. She asked a lot of good questions, provided context for our discussions, and helped us find the answers we needed. With her help, we were able to create effective evaluation tools to gauge the short, medium and long-term impact of our programmes. I very much appreciated her creativity, her quick mind, and her enthusiasm for this project. A very nice collaboration for all concerned.
Karen Henchey, Executive Director, West Island Women’s Centre 2018