Evaluation Impact and Use
As explained in the Program Evaluation and Design section of this evaluation, a utilization-focused, process- and outcome-based evaluation approach will be used (Patton, 2014). The purpose of this evaluation is to maximize the usefulness of Freshgrade's e-portfolios in order to support our school's core value of deep student learning. Drawing on the data collected, the evaluator is now able to offer "micro-recommendations" (Stufflebeam, 2017), that is, advice based on the internal workings of the program. What are the benefits of using e-portfolios? Does the evaluation validate current practice for staff, parents, and students, helping to solidify a positive perception of the program? Has it helped to clarify goals and increase collaboration and interaction among teachers, students, and parents? Perhaps the evaluation process itself has been useful. Reflecting on current practice, involving primary intended users in the study, and keeping them informed throughout the process may have led to mutual learning (Taut, 2007). Over the course of the evaluation, primary intended users may have shown changes in their thinking and attitudes about Freshgrade as the process unfolded. As Taut explains, "process use and findings are highly interrelated" (2007, p. 4). Therefore, when the evaluator and primary intended users are exposed to the results of the evaluation along the way, both instrumental changes (concrete decisions about the program) and conceptual changes (shifts in thinking and understanding) may occur.
"Macro-recommendations" (Stufflebeam, 2017) may also occur. Usually this is articulated from an external evaluator, however. As mentioned in the data analysis, if an external evaluator was also able to analyze the data, and felt that findings could be generalized across other educational institutions, perhaps Freshgrade e-portfolios could be used at other schools. On the other hand, lessons learned, failures, and limitations must also be taken into account.
I would like to note that, due to the time constraints of this study, I did not use a comparison to validate findings. A further study comparing the school in this evaluation with a school of similar social, cultural, and economic background that does not use e-portfolios (a "critical competitor") would strengthen the evaluation findings (Stufflebeam, 2017).
The evaluator is also responsible for facilitating support following the evaluation (Stufflebeam, 2017). Along with explaining the significance of the evaluation report through a presentation or discussion, the evaluator should be available to answer questions, help primary intended users understand the evaluation findings, and make those findings useful for implementing change. This could be done at a staff meeting for teachers, a class discussion for students, and a meeting with parents. I recommend a PowerPoint presentation with visuals, including exemplar student e-portfolios, to show stakeholders the value of the program. Videos with interview data would also be effective in the presentation.