Evaluation: Boring but Necessary
By Mary Ann Wojton, Ph.D.
Evaluation is a boring but necessary element of an organization. Programming is more fun, I know, because I developed and implemented programs for 25 years before I graduated with my Ph.D. As a program director, I didn’t always appreciate evaluation. It took me a while to realize that evaluation would help me understand whether my programs were effective. Evaluation quantified my feeling that a program worked.
Evaluation can identify minor modifications that increase effectiveness, and it can illustrate a program’s impact. Funders want data; programs that prove their effectiveness are rewarded. Evaluation findings can also be used in marketing pieces and to boost staff morale. Who doesn’t want to attend, or work with, a program where a high percentage of participants are successful? Finally, sharing evaluation results with other like-minded individuals advances the field—we can learn a lot from each other.
There are several elements to a successful evaluation. First, we identify whom we are working with and what we are trying to accomplish. Once we know what we want to measure, we develop a plan to collect the data. In addition to looking at the outputs (number of activities, number of individuals served, resources generated, etc.), we look at the outcomes: the knowledge, skills, and attitudes the project would like to change.
ECS uses valid, reliable, and usable instruments to measure outcomes. A valid instrument measures what it is supposed to measure. A reliable instrument gives the same results every time, which is especially important when working with diverse populations. And a usable instrument is easy to use: if it’s a questionnaire, it is written in the language of those who will be asked to complete it, at an appropriate reading level. Finally, we ensure that every piece of data collected is necessary to measure success, deleting unnecessary questions so participants don’t waste time providing information that won’t be used.
The ECS team has a lot of experience in the field, and while no instrument is perfect, we have gotten close by adapting instruments with proven validity, reliability, and usability. Building upon our experience, ECS staff comb the research to find valid, reliable, and usable questions that measure our clients’ outcomes, crediting those who developed them. We typically use questions developed by the Substance Abuse and Mental Health Services Administration (SAMHSA) for the core measures (e.g., 30-day use, perception of risk); additional risk questions may come from the Centers for Disease Control and Prevention’s (CDC) Youth Risk Behavior Surveillance System (YRBSS). Questions regarding positive youth development typically come from research led by the Search Institute, 4-H, and other youth-serving organizations.
We realize that a smart person does not necessarily know everything, but they do have a good idea of where to find the information needed to get started. Drawing on our experience and knowledge, ECS constructs evaluation protocols that support our partners as they think beyond tomorrow to build the future they desire.