Publication Date
In 2025: 1
Since 2024: 22
Since 2021 (last 5 years): 41
Since 2016 (last 10 years): 81
Since 2006 (last 20 years): 302
Descriptor
Evaluation Methods: 1074
Research Design: 1074
Research Methodology: 420
Program Evaluation: 417
Research Problems: 166
Educational Research: 162
Data Collection: 158
Data Analysis: 140
Elementary Secondary Education: 140
Models: 139
Evaluation Criteria: 136
Author
Fink, Arlene: 7
Stufflebeam, Daniel L.: 6
Baker, Eva L.: 5
Smith, Nick L.: 5
Boser, Judith A.: 4
Clark, Sheldon B.: 4
Greene, Jennifer C.: 4
Rossi, Peter H.: 4
Simison, Diane: 4
Steiner, Peter M.: 4
Shapiro, Anna: 3
Audience
Researchers: 81
Practitioners: 37
Administrators: 23
Policymakers: 16
Students: 11
Teachers: 9
Parents: 3
Community: 1
Media Staff: 1
Support Staff: 1
Location
United Kingdom: 12
Australia: 10
Canada: 8
United Kingdom (England): 6
United States: 6
California: 5
New York: 5
Ohio: 5
District of Columbia: 4
Illinois: 4
Israel: 4
Nadira Dayo; Sameh Said Metwaly; Wim Van Den Noortgate – British Journal of Educational Technology, 2024
Single-case experimental designs (SCEDs) may offer a reliable and internally valid way to evaluate technology-enhanced learning (TEL). A systematic review was conducted to provide an overview of what, why and how SCEDs are used to evaluate TEL. Accordingly, 136 studies from nine databases fulfilling the inclusion criteria were included. The…
Descriptors: Technology Uses in Education, Technology Integration, Educational Research, Evaluation Methods
Qiong Wu; Liping Gu – Sociological Methods & Research, 2024
Family income questions in general purpose surveys are usually collected with either a single-question summary design or a multiple-question disaggregation design. It is unclear how estimates from the two approaches agree with each other. The current paper takes advantage of a large-scale survey that has collected family income with both methods.…
Descriptors: Foreign Countries, Family Income, Questionnaires, Research Design
Alexander D. Latham; David A. Klingbeil – Grantee Submission, 2024
The visual analysis of data presented in time-series graphs is common in single-case design (SCD) research and applied practice in school psychology. A growing body of research suggests that visual analysts' ratings are often influenced by construct-irrelevant features, including Y-axis truncation and compression of the number of data points per…
Descriptors: Intervention, School Psychologists, Graphs, Evaluation Methods
Stephen Gorard – Review of Education, 2024
This paper describes, and lays out an argument for, the use of a procedure to help groups of reviewers to judge the quality of prior research reports. It argues why such a procedure is needed, and how other existing approaches are only relevant to some kinds of research, meaning that a review or synthesis cannot successfully combine quality…
Descriptors: Credibility, Research Reports, Evaluation Methods, Research Design
What Works Clearinghouse, 2022
Education decision makers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Bephyer Parey; Elisabeth Kutscher – Journal of Mixed Methods Research, 2024
Rigor evaluation in mixed methods research is a growing need. Linking rigor to Hong and Pluye's (2019) concepts of methodological and reporting quality, the purpose of this article is to operationalize and expand Harrison et al.'s (2020) Rigorous Mixed Methods Framework. Drawing from a systematic methodological review of 66 inclusive education…
Descriptors: Mixed Methods Research, Difficulty Level, Evaluation Methods, Research Design
Ian Greener – International Journal of Social Research Methodology, 2024
This paper argues for three aspects of tolerance with respect to QCA research: tolerance with respect to different approaches to QCA; producing QCA research with tolerance (work that is resistant to criticism); and for QCA researchers to be clear about the tolerance of the solutions they present -- especially in terms of calibration and truth…
Descriptors: Qualitative Research, Research Methodology, Comparative Analysis, Research Design
Radhakrishna, Rama; Chaudhary, Anil Kumar; Tobin, Daniel – Journal of Extension, 2019
We present a framework to help those working in Extension connect program designs with appropriate evaluation designs to improve evaluation. The framework links four distinct Extension program domains--service, facilitation, content transformation, and transformative education--with three types of evaluation design--preexperimental,…
Descriptors: Extension Education, Program Design, Evaluation Methods, Research Design
Dan Reynolds; Courtney Hattan – Reading Teacher, 2024
The role of knowledge in reading comprehension has seen a recent explosion of attention from researchers, journalists, and policy advocates. Much of this discourse describes knowledge in neutral terms such as knowledge of "the world". That knowledge of the world, however, is woven into the fabric of the gendered world we live in and its…
Descriptors: Literacy Education, Literacy, Research Methodology, Curriculum
Jordan Tait – ProQuest LLC, 2022
Once research questions are posed, researchers must answer many a priori questions regarding research design before analysis can be performed and any conclusions can be made, including sample selection criteria, data collection method, model specification, and analysis and estimation technique. The choices made by researchers along this forking path…
Descriptors: Researchers, Research Design, Research Methodology, Models
What Works Clearinghouse, 2020
The What Works Clearinghouse (WWC) systematic review process is the basis of many of its products, enabling the WWC to use consistent, objective, and transparent standards and procedures in its reviews, while also ensuring comprehensive coverage of the relevant literature. The WWC systematic review process consists of five steps: (1) Developing…
Descriptors: Educational Research, Evaluation Methods, Research Reports, Standards
Corinne Huggins-Manley; Anthony W. Raborn; Peggy K. Jones; Ted Myers – Journal of Educational Measurement, 2024
The purpose of this study is to develop a nonparametric DIF method that (a) compares focal groups directly to the composite group that will be used to develop the reported test score scale, and (b) allows practitioners to explore for DIF related to focal groups stemming from multicategorical variables that constitute a small proportion of the…
Descriptors: Nonparametric Statistics, Test Bias, Scores, Statistical Significance
Gal Raz; Sabrina Piccolo; Janine Medrano; Shari Liu; Kirsten Lydic; Catherine Mei; Victoria Nguyen; Tianmin Shu; Rebecca Saxe – Developmental Psychology, 2024
The study of infant gaze has long been a key tool for understanding the developing mind. However, labor-intensive data collection and processing limit the speed at which this understanding can be advanced. Here, we demonstrate an asynchronous workflow for conducting violation-of-expectation (VoE) experiments, which is fully "hands-off"…
Descriptors: Infants, Eye Movements, Attention, Expectation
Huey T. Chen; Liliana Morosanu; Victor H. Chen – Asia Pacific Journal of Education, 2024
The Campbellian validity typology has been used as a foundation for outcome evaluation and for developing evidence-based interventions for decades. As such, randomized controlled trials were preferred for outcome evaluation. However, some evaluators disagree with the validity typology's argument that randomized controlled trials are the best design…
Descriptors: Evaluation Methods, Systems Approach, Intervention, Evidence Based Practice
Jorietha Hugo; Ronel Callaghan; Johannes Cronje – Electronic Journal of e-Learning, 2024
Emerging technologies are transforming educational practices, but successful integration requires improving the quality and efficiency of learning. New technology emerges in hype cycles but adoption and performance lag over time. A strategy development framework is needed for decision-makers to understand the complex interaction of all the factors…
Descriptors: Strategic Planning, Educational Technology, Research Design, Technology Uses in Education