Tutorial on Assessment Plans
In the context of evaluation, some scholarship providers want to gauge their success for continuous program improvement but are not yet at the developmental stage to engage in full-scale evaluation or empirical research. These providers may be more comfortable with the process of assessment. A simple summary of evaluative practices can be defined this way:
- Measurement – the act of gathering data resulting from use of a tool
- Assessment – cumulative collection of data and analysis for the purpose of program improvement
- Evaluation – judging worth or quality (perhaps also to improve or replicate programs)
- Research – describing, predicting, controlling…to test hypotheses, revise theories, etc.
Scholarship providers who want to consider assessment or evaluation should think about two major guiding questions:
- How can the provider measure outcomes in a meaningful way? (How are you doing?)
- How can providers share news about students and the program’s impact beyond academic reporting? (What do you say?)
A basic template for an assessment plan might include these components and guiding questions to be answered:
- Purpose & Scope: Why is the provider interested in assessment? What prompted the need? To what degree will the assessment measure internal or external processes, outputs, and early indicators of outcomes? Will the assessment include all programs or a select few that are of interest?
- Audience: Who will care about the results? Is the assessment for internal purposes, partner organizations, industry trade groups and associations, or current and potential funders?
- Capacity: Does the organization have the resources to gather data, disaggregate data, conduct statistical data analysis, ensure reliable and valid measures are being used, and manage the process? What human, technical, and financial resources are required?
- Methodology: To what degree will the organization use qualitative or quantitative methods to gather information and knowledge? What are the possible sources of data, in-house or external, and how will those data be acquired, assembled, and assessed? Are the indicators of progress or success valid and reliable? Do the tools, surveys, or protocols need to be tested first? What is the timeline for the assessment, and who is responsible for which roles? This section is the most detailed of the assessment plan.
- Analysis: Will the organization conduct statistical, contextual, historical, meta-, comparative, or another type of analysis? What is the process by which the provider will digest, calculate, and analyze the information? For example, a program administrator may want to conduct a statistical analysis but find that too many records are missing or incomplete. Conversely, the organization may want to conduct a comparative analysis of how its program fares against similar programs, but may not have an adequate system for gathering comparative data.
- Reporting: In what manner will the findings be disseminated and to whom? Will the organization keep the findings within the organization or issue a media release for public consumption? Will students be made aware of the results? What are the intended and unintended consequences of reporting decisions?
- Feedback: How will the program respond to kudos or criticism based on reactions to the findings? Is there a process through which staff can accept feedback and respond to it?
Scholarship providers that want to engage in program assessment can use the tips and questions above to guide them through the process. The result will be an organized, efficient framework for conducting the assessment, along with the evidence to guide decisions and subsequent actions. It may also help them prepare for a more formal evaluation or experimental research process.
Source: Stezala Consulting, LLC