Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 1
Source
ETS Research Report Series: 3
Author
Ackerman, Debra J.: 1
Bonett, John: 1
Bridgeman, Brent: 1
Larkin, Kevin C.: 1
Lawless, René: 1
Livingston, Samuel A.: 1
Morley, Mary: 1
Yu, Lei: 1
Publication Type
Journal Articles: 3
Reports - Research: 3
Tests/Questionnaires: 2
Education Level
Higher Education: 2
Postsecondary Education: 2
Early Childhood Education: 1
Kindergarten: 1
Primary Education: 1
Location
Delaware: 1
Illinois: 1
Louisiana (New Orleans): 1
Maryland: 1
Michigan: 1
New Jersey: 1
North Carolina: 1
Ohio: 1
Oregon: 1
Pennsylvania: 1
Pennsylvania (Philadelphia): 1
Assessments and Surveys
Graduate Record Examinations: 1
Praxis Series: 1
Ackerman, Debra J. – ETS Research Report Series, 2018
Kindergarten entry assessments (KEAs) have increasingly been incorporated into state education policies over the past 5 years, with much of this interest stemming from Race to the Top--Early Learning Challenge (RTT-ELC) awards, Enhanced Assessment Grants, and nationwide efforts to develop common K-12 state learning standards. Drawing on…
Descriptors: Screening Tests, Kindergarten, Test Validity, Test Reliability
Morley, Mary; Bridgeman, Brent; Lawless, René – ETS Research Report Series, 2004
This study investigated the transfer of solution strategies between close variants of quantitative reasoning questions. Pre- and posttests were obtained from 406 college undergraduates, all of whom took the same posttest; pretests varied such that one group of participants saw close variants of one set of posttest items while other groups saw…
Descriptors: Test Items, Mathematics Tests, Problem Solving, Pretests Posttests
Yu, Lei; Livingston, Samuel A.; Larkin, Kevin C.; Bonett, John – ETS Research Report Series, 2004
This study compared essay scores from paper-based and computer-based versions of a writing test for prospective teachers. Scores for essays in the paper-based version averaged nearly half a standard deviation higher than those in the computer-based version, after applying a statistical control for demographic differences between the groups of…
Descriptors: Essays, Writing (Composition), Computer Assisted Testing, Technology Uses in Education