Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 3
Descriptor
Evaluation Criteria: 3
Evaluation Methods: 3
Guidelines: 3
Program Evaluation: 3
Research Design: 3
Evaluation Research: 2
Check Lists: 1
Computer Science: 1
Computer Science Education: 1
Concept Formation: 1
Conflict: 1
Source
Journal of MultiDisciplinary Evaluation: 3
Author
Hendrix, Theron Dean: 1
Myneni, Lakshman Sundeep: 1
Nafziger, Dean N.: 1
Narayanan, N. Hari: 1
Ross, Margaret E.: 1
Sanders, James R.: 1
Stufflebeam, Daniel L.: 1
Publication Type
Journal Articles: 3
Reports - Evaluative: 2
Reports - Descriptive: 1
Education Level
Adult Education: 1
Higher Education: 1
Sanders, James R.; Nafziger, Dean N. – Journal of MultiDisciplinary Evaluation, 2011
The purpose of this paper is to provide a basis for judging the adequacy of evaluation plans or, as they are commonly called, evaluation designs. The authors assume that using the procedures suggested in this paper to determine the adequacy of evaluation designs in advance of actually conducting evaluations will lead to better evaluation designs,…
Descriptors: Check Lists, Program Evaluation, Research Design, Evaluation Methods
Ross, Margaret E.; Narayanan, N. Hari; Hendrix, Theron Dean; Myneni, Lakshman Sundeep – Journal of MultiDisciplinary Evaluation, 2011
Background: The philosophical underpinnings of evaluation guidelines set forth by a funding agency can sometimes seem inconsistent with those of the intervention. Purpose: Our purpose is to introduce questions pertaining to the contrast between the instructional program's underlying philosophical beliefs and assumptions and those underlying our…
Descriptors: Philanthropic Foundations, Grants, Financial Support, Computer Science Education
Stufflebeam, Daniel L. – Journal of MultiDisciplinary Evaluation, 2011
Good evaluation requires that evaluation efforts themselves be evaluated. Many things can and often do go wrong in evaluation work. Accordingly, it is necessary to check evaluations for problems such as bias, technical error, administrative difficulties, and misuse. Such checks are needed both to improve ongoing evaluation activities and to assess…
Descriptors: Program Evaluation, Evaluation Criteria, Evaluation Methods, Definitions