Publication Date
  In 2025: 1
  Since 2024: 1
  Since 2021 (last 5 years): 3
  Since 2016 (last 10 years): 8
  Since 2006 (last 20 years): 21
Descriptor
  Evaluation Methods: 50
  Research Methodology: 50
  Standards: 50
  Program Evaluation: 15
  Higher Education: 12
  Educational Research: 11
  Elementary Secondary Education: 11
  Measurement Techniques: 9
  Research Design: 9
  Educational Assessment: 8
  Foreign Countries: 8
Author
  Aas, Gro Hanne: 1
  Agrella, Robert F.: 1
  Ashby, Cornelia M.: 1
  Askling, Berit: 1
  August, Diane: 1
  Ben Kei Daniel: 1
  Berk, Richard A.: 1
  Biggs, John: 1
  Boruch, Robert: 1
  Caracelli, Valerie J.: 1
  Carlson, Robert V.: 1
Education Level
  Higher Education: 8
  Postsecondary Education: 6
  Elementary Secondary Education: 2
  Elementary Education: 1
  Grade 6: 1
  Intermediate Grades: 1
  Junior High Schools: 1
  Middle Schools: 1
  Secondary Education: 1
Audience
  Researchers: 6
  Practitioners: 2
Laws, Policies, & Programs
Assessments and Surveys
  National Assessment of…: 2
  Program for International…: 1
Edmonds, Bruce – International Journal of Social Research Methodology, 2023
This paper examines the tension between the desire to claim predictive ability for Agent-Based Models (ABMs) and the extreme difficulty of achieving it for social and ecological systems, suggesting that this tension is the main reason a rhetoric of prediction persists that is at odds with what is achievable. Following others, it recommends that it is better…
Descriptors: Models, Prediction, Evaluation Methods, Standards
Göran Lövestam; Susanne Bremer-Hoffmann; Koen Jonkers; Pieter van Nes – Research Ethics, 2025
The Joint Research Centre (JRC) is the European Commission's in-house science and knowledge service, employing a substantial staff of scientists devoted to conducting research to provide independent scientific advice for EU policy. Focussed on various research areas aligned with EU priorities, the JRC excels in delivering scientific evidence for…
Descriptors: Integrity, Ethics, Scientific Research, Scientists
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Margulieux, Lauren; Ketenci, Tuba Ayer; Decker, Adrienne – Computer Science Education, 2019
Background and context: The variables that researchers measure and how they measure them are central in any area of research, including computing education. Which research questions can be asked and how they are answered depend on measurement. Objective: To summarize the commonly used variables and measurements in computing education and to…
Descriptors: Measurement Techniques, Standards, Evaluation Methods, Computer Science Education
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) examines manuscripts (that is, journal articles, working papers, dissertations, or other publications or pieces of written research presented as complete sets of findings) to determine if they are eligible to be included in a specific review effort. The WWC first determines whether the manuscript contains a…
Descriptors: Program Effectiveness, Program Evaluation, Evaluation Methods, Educational Research
What Works Clearinghouse, 2017
Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal inferences about the effects of an intervention, but others have designs that do not permit these types of conclusions. To help policymakers, practitioners, and others make sense of study results, the…
Descriptors: Educational Research, Intervention, Program Evaluation, Program Effectiveness
Ben Kei Daniel – Qualitative Research Journal, 2018
Purpose: The purpose of this paper is to present a framework intended to guide students and novice researchers in learning about the necessary dimensions for assessing the rigour of qualitative research studies. The framework has four dimensions: (T)rustworthiness, (A)uditability, (C)redibility, and (T)ransferability. The development of TACT is…
Descriptors: Guidelines, Qualitative Research, Research Methodology, Credibility
Heyvaert, Mieke; Wendt, Oliver; Van den Noortgate, Wim; Onghena, Patrick – Journal of Special Education, 2015
Reporting standards and critical appraisal tools serve as beacons for researchers, reviewers, and research consumers. Parallel to existing guidelines for researchers to report and evaluate group-comparison studies, single-case experimental (SCE) researchers are in need of guidelines for reporting and evaluating SCE studies. A systematic search was…
Descriptors: Standards, Research Methodology, Comparative Analysis, Experiments
Vitinš, Maris; Rasnacs, Oskars – Informatics in Education, 2012
Information and communications technologies today are used in virtually every university course when students prepare their papers. ICT is also needed after people graduate from university and enter the job market. The author is an instructor in the field of informatics related to health care and social sciences at the Riga Stradins…
Descriptors: Educational Technology, Assignments, Information Science, Data Processing
Mayton, Michael R.; Wheeler, John J.; Menendez, Anthony L.; Zhang, Jie – Education and Training in Autism and Developmental Disabilities, 2010
Horner et al. (2005) present a review substantiating how single-subject research methodology can be used to determine whether interventions are evidence-based practices (EBPs). The current study used the Horner et al. review to: (a) systematically identify a set of quality standards for the evaluation of single-case research…
Descriptors: Evaluators, Autism, Research Methodology, Validity
Cooksy, Leslie J.; Caracelli, Valerie J. – Journal of MultiDisciplinary Evaluation, 2009
This paper examines the practice of metaevaluation, as set out in the Metaevaluation standard of the Program Evaluation Standards: the evaluation of a specific evaluation to inform stakeholders about that evaluation's strengths and weaknesses. The findings from an analysis of eighteen metaevaluations, including a description of the data…
Descriptors: Program Evaluation, Evaluation Criteria, Standards, Evaluation Research
Nunes, Miguel Baptista, Ed.; Isaias, Pedro, Ed. – International Association for Development of the Information Society, 2018
These proceedings contain the papers of the International Conference e-Learning 2018, which was organised by the International Association for Development of the Information Society, 17-19 July, 2018. This conference is part of the Multi Conference on Computer Science and Information Systems 2018, 17-20 July, which had a total of 617 submissions.…
Descriptors: Electronic Learning, Educational Technology, Online Courses, Educational Environment
Coyle, James P. – Collected Essays on Learning and Teaching, 2011
Evaluating higher education degree programs is an arduous task. This paper suggests innovative strategies for addressing four types of challenges that commonly occur during program evaluation: identifying theoretical models for evaluation, balancing potentially conflicting standards, accommodating faculty differences, and aligning courses.…
Descriptors: Undergraduate Study, College Programs, Program Evaluation, Evaluation Methods
What Works Clearinghouse, 2011
With its critical assessments of scientific evidence on the effectiveness of education programs, policies, and practices (referred to as "interventions"), and a range of products summarizing this evidence, the What Works Clearinghouse (WWC) is an important part of the Institute of Education Sciences' strategy to use rigorous and relevant…
Descriptors: Standards, Access to Information, Information Management, Guides
Hanssen, Carl E.; Lawrenz, Frances; Dunet, Diane O. – American Journal of Evaluation, 2008
Meta-evaluations reported in the literature, although rare, have often focused on retrospective assessment of completed evaluations. Conducting a meta-evaluation concurrently with the evaluation modifies this approach. This method gives the meta-evaluators the opportunity to advise the evaluators and provides the basis for a summative…
Descriptors: Evaluation Methods, Evaluators, Standards, Summative Evaluation