Welty, Gordon – 1969
This article outlines the procedures followed in program evaluation in the Pittsburgh public schools. A program design is obtained by asking the field staff a series of specific questions. As the staff interact, problems with the program are resolved. The consensus achieved is the basis for standardizing activities in the field. The next step in…
Descriptors: Criteria, Evaluation Methods, Feedback, Input Output Analysis

Steinmetz, Andres – NCME Measurement in Education, 1976
The discrepancy evaluation model (DEM) specifies that evaluation consists of comparing performance with a standard, yielding discrepancy information. The DEM is applied to a program in order to improve it by making standards-performance-discrepancy cycles explicit and public. Action-oriented planning is involved in creating standards; a useful…
Descriptors: Data Collection, Differences, Evaluation Criteria, Evaluation Methods

Morra, Linda G. – 1978
This paper presents the Discrepancy Evaluation Model (DEM) as an overall strategy or framework for both improving and assessing the effectiveness of simulation/games. While applying the evaluation model to simulation/games rather than to educational programs requires modification of the model, its critical features remain. These include:…
Descriptors: Educational Assessment, Educational Change, Educational Games, Evaluation Criteria