Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 4
Since 2016 (last 10 years): 6
Since 2006 (last 20 years): 6
Source
Applied Measurement in Education: 3
Educational Assessment: 1
Journal of Applied Testing Technology: 1
Journal of Research on Educational Effectiveness: 1
Author
Wise, Steven L.: 4
Kuhfeld, Megan: 2
Soland, James: 2
Cronin, John: 1
Dupray, Laurence M.: 1
Kingsbury, G. Gage: 1
Kuhfeld, Megan R.: 1
Wise, Steven: 1
Publication Type
Journal Articles: 6
Reports - Research: 6
Education Level
Elementary Education: 3
Secondary Education: 3
Elementary Secondary Education: 2
Junior High Schools: 2
Middle Schools: 2
Early Childhood Education: 1
Grade 2: 1
Grade 3: 1
Grade 4: 1
Grade 5: 1
Grade 6: 1
Assessments and Surveys
Measures of Academic Progress: 6
Wise, Steven; Kuhfeld, Megan – Applied Measurement in Education, 2021
Effort-moderated (E-M) scoring is intended to estimate how well a disengaged test taker would have performed had they been fully engaged. It accomplishes this adjustment by excluding disengaged responses from scoring and estimating performance from the remaining responses. The scoring method, however, assumes that the remaining responses are not…
Descriptors: Scoring, Achievement Tests, Identification, Validity
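The core adjustment described in this abstract — exclude disengaged responses, then estimate performance from the remaining responses — can be sketched as follows. This is an illustrative simplification (proportion correct on engaged responses, with a single fixed response-time threshold for flagging rapid guesses), not Wise's operational procedure; the 3-second threshold and the sample data are hypothetical.

```python
# Sketch of effort-moderated (E-M) scoring: responses faster than a
# rapid-guessing threshold are treated as disengaged and excluded before
# performance is estimated from the remaining (engaged) responses.
# The 3.0 s threshold is a hypothetical example; in practice thresholds
# are typically set per item.

RAPID_GUESS_THRESHOLD = 3.0  # seconds (hypothetical)

def effort_moderated_score(responses):
    """responses: list of (correct: bool, response_time_seconds: float).

    Returns the proportion correct among engaged responses, or None
    if no engaged responses remain to score.
    """
    engaged = [(c, t) for c, t in responses if t >= RAPID_GUESS_THRESHOLD]
    if not engaged:
        return None
    return sum(c for c, _ in engaged) / len(engaged)

responses = [
    (True, 12.4), (False, 1.1),   # the 1.1 s response is a rapid guess
    (True, 8.0), (True, 20.3),
    (False, 0.8), (False, 9.5),   # 0.8 s excluded; 9.5 s kept
    (True, 15.2),
]
print(effort_moderated_score(responses))  # → 0.8 (4 of 5 engaged correct)
```

Note that this sketch scores the engaged responses directly; the abstract's caveat — that the method assumes the remaining responses are unaffected by disengagement — applies to any variant of this approach.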
Wise, Steven L. – Applied Measurement in Education, 2020
In achievement testing there is typically a practical requirement that the set of items administered should be representative of some target content domain. This is accomplished by establishing test blueprints specifying the content constraints to be followed when selecting the items for a test. Sometimes, however, students give disengaged…
Descriptors: Test Items, Test Content, Achievement Tests, Guessing (Tests)
Wise, Steven L.; Soland, James; Dupray, Laurence M. – Journal of Applied Testing Technology, 2021
Technology-Enhanced Items (TEIs) have been purported to be more motivating and engaging to test takers than traditional multiple-choice items. The claim of enhanced engagement, however, has thus far received limited research attention. This study examined the rates of rapid-guessing behavior received by three types of items (multiple-choice,…
Descriptors: Test Items, Guessing (Tests), Multiple Choice Tests, Achievement Tests
Wise, Steven L.; Kuhfeld, Megan R.; Cronin, John – Educational Assessment, 2022
The arrival of the COVID-19 pandemic had a profound effect on K-12 education. Most schools transitioned to remote instruction, and some used remote testing to assess student learning. Remote testing, however, is less controlled than in-school testing, leading to concerns regarding test-taking engagement. This study compared the disengagement of…
Descriptors: Computer Assisted Testing, COVID-19, Pandemics, Learner Engagement
Kuhfeld, Megan; Soland, James – Journal of Research on Educational Effectiveness, 2020
Educational stakeholders have long known that students might not be fully engaged when taking an achievement test and that such disengagement could undermine the inferences drawn from observed scores. Thanks to the growing prevalence of computer-based tests and the new forms of metadata they produce, researchers have developed and validated…
Descriptors: Metadata, Computer Assisted Testing, Achievement Tests, Reaction Time
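The response-time metadata this abstract refers to is typically used to flag rapid guesses: a response is flagged when its time falls below some normative threshold derived from how long the item usually takes. The sketch below uses 10% of an item's median response time as that threshold; the fraction, field names, and data are hypothetical illustrations, not a specific published index.

```python
import statistics

# Sketch of flagging rapid guesses from response-time metadata on a
# computer-based test: a response is flagged when its time is below a
# normative threshold, here 10% of the median response time observed
# for that item (a hypothetical variant of normative-threshold methods).

def flag_rapid_guesses(times_by_item, fraction=0.10):
    """times_by_item: dict item_id -> list of response times (seconds).

    Returns dict item_id -> list of bools (True = flagged rapid guess).
    """
    flags = {}
    for item, times in times_by_item.items():
        threshold = fraction * statistics.median(times)
        flags[item] = [t < threshold for t in times]
    return flags

times = {"item1": [30.0, 2.0, 28.0, 35.0, 1.5, 40.0]}
print(flag_rapid_guesses(times))
# → {'item1': [False, True, False, False, True, False]}
```

Aggregating such flags across a test event yields the kind of engagement indices the studies above rely on, e.g. the proportion of a student's responses flagged as rapid guesses.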
Wise, Steven L.; Kingsbury, G. Gage – Applied Measurement in Education, 2022
In achievement testing we assume that students will demonstrate their maximum performance as they encounter test items. Sometimes, however, student performance can decline during a test event, which implies that the test score does not represent maximum performance. This study describes a method for identifying significant performance decline and…
Descriptors: Achievement Tests, Performance, Classification, Guessing (Tests)