PC.AM.03 -- Practical Strategies for Designing and Conducting Program Evaluation


Educational developers and faculty members are increasingly expected to manage programs using evidence-informed practices. They must be ready to understand and communicate the impact of their educational programs in systematic, rigorous ways that go beyond simple metrics. Beyond satisfying accountability pressures, evaluation can help improve existing program models, develop new ones, determine the effectiveness and efficacy of programs, and demonstrate end-user impact. Program evaluation draws upon social scientific methods to determine the merit, worth, and significance of a program.

In this workshop, participants will work through a crash course on the principles and practices of designing and conducting program evaluation. Specifically, we will introduce the utilization-focused evaluation framework (Patton, 2008, 2012), a dominant approach to conducting program evaluation that ensures both the process and its outputs are useful and meaningful to those who have a vested interest in the program and the evaluation:

Utilization-Focused Evaluation (U-FE) begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use. Use concerns how real people in the real world apply evaluation findings and experience the evaluation process. Therefore, the focus in utilization-focused evaluation is on intended use by intended users. (Patton, 2008)

This workshop will cover the following topics:

  • What is program evaluation?
  • What are the five dominant purposes for evaluating programs?
  • How is evaluation different from research?
  • What does quality evaluation look like? What are the dimensions of quality in program evaluation?
  • Identifying stakeholders and focusing an evaluation.
  • Questioning a program's underlying logic, its theory of change, and its theory of action.
  • How do student-level and course-level evaluation data feed into larger program evaluation?
  • Crafting an evaluation design that balances concerns of utility, feasibility, propriety, accuracy, and evaluator accountability.

Using practical, hands-on learning, participants will assume the role of a novice evaluator and work through a case scenario. In so doing, participants will be able to:

  • Design and make informed decisions concerning the evaluation of programs and services
  • Apply evaluative thinking about programs to everyday decision-making
  • Craft a defensible argument for an evaluation design
  • Access the program evaluation literature, its major theoretical propositions, and its practical strategies

Participants are invited to bring questions and issues related to program evaluation that they have encountered in their own work contexts. Participants should have a working understanding of basic research methods (e.g., surveys, interviews, focus groups, and basic quantitative and qualitative analysis techniques).



Tuesday June 17, 2014 9:00am - 12:00pm EDT
A240 McArthur Hall
