We use the ideas of evaluation to focus our team and partners on what matters most.
Our reports focus more on what really matters than on what we are responsible for.
We ask questions to explore what counts rather than just collecting data.
We measure the changes in people more than we measure what we do.
Our reports are less about our outputs and more about our outcomes.
We do not use anecdotes as evidence of impact.
More of our evaluation questions ask how participants are doing rather than what they think of us (feedback).
We ask more questions about the impact that lasts after the program, not just the experience people have during it.
We use questions to explore where participants are struggling, not just where they are succeeding.
We make evaluation a part of our program, not simply an assessment after our program is done.
We debrief with participants about what the impact means, rather than just giving them a survey.
We design our questions as a learning opportunity for the participants, not simply a data collection opportunity for us.
We use guided interviews as sources of data.
We analyze our qualitative data for underlying themes instead of just using quotes.
We regularly use a mixed-methods approach to data collection.