At the conclusion of each course I take at Walden University, I am required to write a brief reflection on the course. This reflection was written about the “Program Evaluation” class I just completed.
Eight weeks ago, I started another course in the Master’s program I am pursuing. At first impression, I figured I would be learning some new jargon for old concepts I have employed for years. I was mistaken. I was also intrigued.
The characteristic that separates evaluation from other research disciplines is the assignment of value: the process of judging what is mediocre and what is worthwhile. Evaluation also distinguishes itself from other forms of research through its function of initiating change, not only through criticism, but even more through its effect of changing the way we think about change. The process of evaluation creates a dynamic mindset of continual improvement. Improvements frequently happen before an evaluation is even completed, simply because questions were asked that prompted a response. If evaluation’s goal were merely the gathering of information, this effect of observation changing what is observed might be considered a problem; but change is the purpose of evaluation (Fitzpatrick, Sanders, & Worthen, 2011).
The importance of evaluation as a means to effect change is also a liability. Because evaluations bring about change, they pose significant ethical challenges. Evaluations are conducted to meet the needs of stakeholders who may have varied and even conflicting interests in the outcome. A balanced view normally requires input from multiple perspectives, so evaluators need to be aware of all stakeholder perspectives, recognize their own biases as they design an evaluation, and disclose those biases when reporting their results. Since a balanced view requires multiple perspectives, multiple evaluations are often required (Fitzpatrick, Sanders, & Worthen, 2011).
My experience planning an evaluation started with the preparation of a concept map (pictured above). A concept map is a diagram similar to a mind map, but it illustrates relationships between ideas, particularly hierarchical ones. The hierarchy can be represented vertically or horizontally, but the map will generally show a progression of inputs, outputs, and outcomes. A concept map is useful for building an understanding of how an organization works, providing an intuitive view of the various stakeholders and their interests in the organization (Novak & Cañas, 2008).
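To make the idea concrete, here is a toy sketch of my own (the concepts and linking phrases are invented for illustration and are not taken from the map pictured above). A concept map can be read as a set of labeled links between concepts, each link forming a short proposition:

```python
# A toy concept map: (concept, linking phrase, concept) triples.
# The concepts below are invented examples, not my actual assignment map.
concept_map = [
    ("Program", "requires", "Inputs"),
    ("Inputs", "are transformed into", "Outputs"),
    ("Outputs", "lead to", "Outcomes"),
    ("Outcomes", "matter to", "Stakeholders"),
]

# Print each link as the proposition it represents.
for parent, link, child in concept_map:
    print(f"{parent} --[{link}]--> {child}")
```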
The next step of the evaluation plan was a program analysis report that described the program, its history and stakeholders, contextual factors related to those stakeholders, and potential ethical challenges an evaluation would face. A logic model was created to follow the activity of the program being evaluated. Like a concept map, a logic model considers the inputs, outputs, and outcomes of the program, and it provides a framework for defining the kinds of questions an evaluation must answer to judge the effectiveness of a program. In the course of doing this assignment, I found an excellent template for designing a logic model, which I downloaded from http://www.uwex.edu/ces/pdande/evaluation/docs/WorksheetExcel.xls
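As a rough illustration of how a logic model frames evaluation questions (this is my own sketch, assuming a hypothetical tutoring program; none of the entries come from the template or from the program I evaluated), the inputs-to-outcomes progression might be captured like this:

```python
# Illustrative only: a logic model for a hypothetical tutoring program.
logic_model = {
    "inputs": ["staff time", "grant funding", "training materials"],
    "outputs": ["tutoring sessions held", "students served"],
    "outcomes": {
        "short term": ["improved study habits"],
        "medium term": ["higher course completion rates"],
        "long term": ["increased graduation rates"],
    },
}

def evaluation_questions(model):
    """Turn each outcome into the kind of question an evaluation must answer."""
    for term, outcomes in model["outcomes"].items():
        for outcome in outcomes:
            yield f"To what extent did the program produce '{outcome}' ({term})?"

for question in evaluation_questions(logic_model):
    print(question)
```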
The wide variety of applications for evaluation has resulted in the development of a rich and varied set of evaluation models. Several categories of evaluation approaches were investigated, including expertise and consumer-oriented approaches, program-oriented approaches, decision-oriented approaches, and participant-oriented approaches. The strengths and weaknesses of each approach were reviewed, and a mixed approach was chosen that best matched the particular program characteristics and stakeholder interests with my own strengths and the limited resources available for conducting the evaluation (Fitzpatrick, Sanders, & Worthen, 2011). The results of this investigation were documented in an evaluation model table. An evaluation criteria report then refined the parameters of the evaluation plan, defining stakeholders and their interests, evaluation standards, and the questions the evaluation should seek to answer.
A data collection design and sampling strategy were developed as a PowerPoint presentation, which my daughter then helped me record as an audio presentation in an interview format. The presentation discussed how data would be collected, specifically through interviews with various stakeholders, with census data used to verify the validity of the information obtained through interviews. It discussed how the data collection design would address the evaluation questions, the need for random selection of stakeholder interviewees, and potential limitations of the data collection plan that would need attention.
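As a sketch of the random-selection step (my own illustration, not one of the course deliverables; the stakeholder groups, group sizes, and sample size are invented), interviewees could be drawn at random from each stakeholder group along these lines:

```python
import random

# Hypothetical stakeholder rosters; names and group sizes are invented.
stakeholders = {
    "staff": [f"staff_{i}" for i in range(1, 21)],
    "participants": [f"participant_{i}" for i in range(1, 101)],
    "funders": [f"funder_{i}" for i in range(1, 6)],
}

def select_interviewees(groups, per_group=3, seed=42):
    """Randomly select up to per_group interviewees from each stakeholder group.

    A fixed seed keeps the selection reproducible so the procedure can be
    documented in the evaluation plan.
    """
    rng = random.Random(seed)
    return {
        name: rng.sample(members, min(per_group, len(members)))
        for name, members in groups.items()
    }

print(select_interviewees(stakeholders))
```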
A reporting strategy table was used to plan reporting strategies appropriate to different kinds of stakeholders. In the reporting strategy, consideration was given to values and standards related to the fair treatment of stakeholders, with a discussion of ethical policy.
I leave this class with an odd mixture of relief and regret: relief that an intense period of learning has come to a conclusion (although personal circumstances may have influenced my perception of the intensity of the learning activities), and regret that the eight-week class format did not provide enough time to follow through and conduct the evaluations we planned.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson Education.
Novak, J. D., & Cañas, A. J. (2008). The theory underlying concept maps and how to construct them (Technical Report IHMC CmapTools 2006-01 Rev 01-2008). Florida Institute for Human and Machine Cognition. Retrieved from http://cmap.ihmc.us/Publications/ResearchPapers/TheoryUnderlyingConceptMaps.pdf