Youmi Suk – Asia Pacific Education Review, 2024
Regression discontinuity (RD) designs have gained significant popularity as a quasi-experimental device for evaluating education programs and policies. In this paper, we present a comprehensive review of RD designs, focusing on the continuity-based framework, the most widely adopted RD framework. We first review the fundamental aspects of RD…
Descriptors: Educational Research, Preschool Education, Regression (Statistics), Test Validity
Benjawan Plengkham; Sonthaya Rattanasak; Patsawut Sukserm – Journal of Education and Learning, 2025
This academic article provides the essential steps for designing an effective English questionnaire in social science research, with a focus on ensuring clarity, cultural sensitivity, and ethical integrity. Developed from key insights in related studies, it outlines good practices in questionnaire design, item development, and the importance…
Descriptors: Guidelines, Test Construction, Questionnaires, Surveys
Andrew P. Jaciw – American Journal of Evaluation, 2025
By design, randomized experiments (XPs) rule out bias from confounded selection of participants into conditions. Quasi-experiments (QEs) are often considered second-best because they do not share this benefit. However, when results from XPs are used to generalize causal impacts, the benefit from unconfounded selection into conditions may be offset…
Descriptors: Elementary School Students, Elementary School Teachers, Generalization, Test Bias
Peterson, Christina Hamme; Peterson, N. Andrew; Powell, Kristen Gilmore – Measurement and Evaluation in Counseling and Development, 2017
Cognitive interviewing (CI) is a method to identify sources of confusion in assessment items and to assess validity evidence on the basis of content and response processes. We introduce readers to CI and describe a process for conducting such interviews and analyzing the results. Recommendations for best practice are provided.
Descriptors: Test Items, Test Construction, Interviews, Test Validity
Koretz, Daniel – Assessment in Education: Principles, Policy & Practice, 2016
Daniel Koretz is the Henry Lee Shattuck Professor of Education at the Harvard Graduate School of Education. His research focuses on educational assessment and policy, particularly the effects of high-stakes testing on educational practice and the validity of score gains. He is the author of "Measuring Up: What Educational Testing Really Tells…
Descriptors: Test Validity, Definitions, Evidence, Relevance (Education)
Slaney, Kathleen L. – Assessment in Education: Principles, Policy & Practice, 2016
Kathleen Slaney, associate professor in the History, Quantitative and Theoretical Psychology stream in the Department of Psychology at Simon Fraser University, comments on three issues she considers central to a fruitful discussion of how "validity" should be used in the context of testing.
Descriptors: Test Validity, Educational Practices, Praxis, Evaluation Criteria
Schmidt, Frank L. – Research Synthesis Methods, 2015
In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM…
Descriptors: Meta Analysis, Test Validity, Measurement, Predictive Validity
What Works Clearinghouse, 2015
The What Works Clearinghouse (WWC) Standards Briefs explain the rules the WWC uses to evaluate the quality of studies for practitioners, researchers, and policymakers. This brief explains what baseline equivalence is and why it matters. As part of the WWC review process for certain types of studies, reviewers assess whether the intervention group…
Descriptors: Control Groups, Participant Characteristics, Matched Groups, Research Methodology
Kozleski, Elizabeth B. – Research and Practice for Persons with Severe Disabilities, 2017
This article offers a rationale for the contributions of qualitative research to evidence-based practice in special education. In it, I make the argument that qualitative research encompasses the ability to study significant problems of practice, engage with practitioners in the conduct of research studies, learn and change processes during a…
Descriptors: Qualitative Research, Evidence Based Practice, Special Education, Research Methodology
Murawska, Jaclyn M.; Walker, David A. – Mid-Western Educational Researcher, 2017
In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…
Descriptors: Mixed Methods Research, Research Methodology, Visual Aids, Research Tools
Weaver-Hightower, Marcus B. – Journal of Mixed Methods Research, 2014
Fields from political science to critical education policy studies have long explored power relations in policy processes, showing who influences policy agendas, policy creation, and policy implementation. Yet showing particular actors' influence on specific points in a policy text remains a methodological challenge. This article presents a…
Descriptors: Mixed Methods Research, Public Policy, Influences, Politics of Education
Roth, Wolff-Michael – Journal of Research in Science Teaching, 2011
In the wake of an increasing political commitment to evidence-based decision making and evidence-based educational reform that emerged with the No Child Left Behind effort, the question of what counts as evidence has become increasingly important in the field of science education. In current public discussions, academics, politicians, and other…
Descriptors: Science Education, Educational Research, Evidence, Definitions
Wise, Vicki L.; Barham, Mary Ann – About Campus, 2012
The August 16, 2011, "Chronicle of Higher Education" article "Want Data? Ask Students. Again and Again" by Sara Lipka posits that in higher education there is a culture of oversurveying students and too often relying on surveys as the main, or only, way of assessing the impact of programs and services on student satisfaction and learning. Because…
Descriptors: Learner Engagement, Research Methodology, Test Validity, Response Style (Tests)
Cooksy, Leslie J.; Mark, Melvin M. – American Journal of Evaluation, 2012
Attention to evaluation quality is commonplace, even if sometimes implicit. Drawing on her 2010 Presidential Address to the American Evaluation Association, Leslie Cooksy suggests that evaluation quality depends, at least in part, on the intersection of three factors: (a) evaluator competency, (b) aspects of the evaluation environment or context,…
Descriptors: Competence, Context Effect, Educational Resources, Educational Quality
OECD Publishing, 2014
The "PISA 2012 Technical Report" describes the methodology underlying the PISA 2012 survey, which tested 15-year-olds' competencies in mathematics, reading and science and, in some countries, problem solving and financial literacy. It examines the design and implementation of the project at a level of detail that allows researchers to…
Descriptors: International Assessment, Secondary School Students, Foreign Countries, Achievement Tests