| Publication Date | Count |
| --- | --- |
| In 2025 | 0 |
| Since 2024 | 0 |
| Since 2021 (last 5 years) | 0 |
| Since 2016 (last 10 years) | 5 |
| Since 2006 (last 20 years) | 11 |

| Descriptor | Count |
| --- | --- |
| Difficulty Level | 12 |
| Test Construction | 12 |
| Test Items | 10 |
| Statistical Analysis | 7 |
| College Entrance Examinations | 4 |
| Graduate Study | 4 |
| Item Analysis | 4 |
| Scores | 4 |
| Test Validity | 4 |
| Computer Software | 3 |
| Evidence | 3 |

| Source | Count |
| --- | --- |
| ETS Research Report Series | 12 |

| Author | Count |
| --- | --- |
| Graf, Edith Aurora | 3 |
| Sheehan, Kathleen M. | 3 |
| Bejar, Isaac I. | 2 |
| Deane, Paul | 2 |
| Flor, Michael | 2 |
| Futagi, Yoko | 2 |
| Lawless, René | 2 |
| Mikeska, Jamie N. | 2 |
| Arieli-Attali, Meirav | 1 |
| Attali, Yigal | 1 |
| Chen, Jing | 1 |

| Publication Type | Count |
| --- | --- |
| Journal Articles | 12 |
| Reports - Research | 12 |
| Tests/Questionnaires | 1 |

| Location | Count |
| --- | --- |
| Alabama | 1 |
| Arizona | 1 |
| Arkansas | 1 |
| California | 1 |
| Connecticut | 1 |
| Georgia | 1 |
| Idaho | 1 |
| Illinois | 1 |
| Indiana | 1 |
| Iowa | 1 |
| Kentucky | 1 |

| Assessments and Surveys | Count |
| --- | --- |
| Graduate Record Examinations | 4 |
| Praxis Series | 1 |

Guo, Hongwen; Zu, Jiyun; Kyllonen, Patrick – ETS Research Report Series, 2018
For a multiple-choice test under development or redesign, it is important to choose the optimal number of options per item so that the test possesses the desired psychometric properties. On the basis of available data for a multiple-choice assessment with 8 options, we evaluated the effects of changing the number of options on test properties…
Descriptors: Multiple Choice Tests, Test Items, Simulation, Test Construction
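
The Guo, Zu, and Kyllonen study above evaluates, via simulation, how changing the number of options affects test properties. As a rough illustration of the guessing mechanism such simulations must account for (a hypothetical sketch, not the authors' procedure; the `p_know` parameter and the knowledge-or-random-guess model are assumptions), consider:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def simulate_p_correct(p_know, n_options, n_examinees=100_000):
    # Hypothetical model: each examinee either knows the answer
    # (probability p_know) or guesses uniformly among n_options.
    knows = rng.random(n_examinees) < p_know
    lucky_guess = rng.random(n_examinees) < (1.0 / n_options)
    return float(np.mean(knows | lucky_guess))

# Proportion correct as options are reduced from 8 toward 3.
for k in (8, 5, 4, 3):
    print(f"{k} options: proportion correct ~ {simulate_p_correct(0.6, k):.3f}")
```

Under this toy model, dropping options makes items easier simply because lucky guesses become more likely; the report's simulations would also need to examine reliability and other psychometric properties.
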
van Rijn, Peter; Graf, Edith Aurora; Arieli-Attali, Meirav; Song, Yi – ETS Research Report Series, 2018
In this study, we explored the extent to which teachers agree on the ordering and separation of levels of two different learning progressions (LPs) in English language arts (ELA) and mathematics. In a panel meeting akin to a standard-setting procedure, we asked teachers to link the items and responses of summative educational assessments to LP…
Descriptors: Teacher Attitudes, Student Evaluation, Summative Evaluation, Language Arts
Mikeska, Jamie N.; Kurzum, Christopher; Steinberg, Jonathan H.; Xu, Jun – ETS Research Report Series, 2018
The purpose of this report is to examine the performance of assessment items designed to measure elementary teachers' content knowledge for teaching (CKT) science as part of the ETS® Educator Series. The Elementary Education: CKT Science assessment is one component of the licensure examination through the PRAXIS® assessments. The Elementary Education:…
Descriptors: Elementary School Teachers, Pedagogical Content Knowledge, Elementary School Science, Preservice Teachers
Mikeska, Jamie N.; Phelps, Geoffrey; Croft, Andrew J. – ETS Research Report Series, 2017
This report describes efforts by a group of science teachers, teacher educators, researchers, and content specialists to conceptualize, develop, and pilot practice-based assessment items designed to measure elementary science teachers' content knowledge for teaching (CKT). The report documents the framework used to specify the content-specific…
Descriptors: Elementary School Teachers, Science Teachers, Knowledge Base for Teaching, Test Items
Bejar, Isaac I.; Deane, Paul D.; Flor, Michael; Chen, Jing – ETS Research Report Series, 2017
The report is the first systematic evaluation of the sentence equivalence item type introduced by the "GRE"® revised General Test. We adopt a validity framework to guide our investigation based on Kane's approach to validation whereby a hierarchy of inferences that should be documented to support score meaning and interpretation is…
Descriptors: College Entrance Examinations, Graduate Study, Generalization, Inferences
Attali, Yigal – ETS Research Report Series, 2014
Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the "GRE"® revised General Test, it…
Descriptors: College Entrance Examinations, Graduate Study, Calculators, Test Items
Sheehan, Kathleen M. – ETS Research Report Series, 2015
The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers, curriculum specialists, textbook publishers, and test developers select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards.This paper documents the procedure used…
Descriptors: Scores, Common Core State Standards, Computer Software, Computational Linguistics
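
The report above concerns the TextEvaluator tool; its actual feature set and scoring procedure are not described in this snippet. Purely as a hypothetical illustration of automated text complexity measurement, a sketch might compute a few surface features that tools of this kind commonly combine into a complexity score (the feature names and sample text are assumptions):

```python
import re

def surface_complexity_features(text):
    # Toy surface features for illustration only; not TextEvaluator's
    # actual feature set or scoring procedure.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "mean_word_length": sum(len(w) for w in words) / max(len(words), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }

sample = "The cat sat on the mat. It purred quietly while the children read aloud."
print(surface_complexity_features(sample))
```
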
Sheehan, Kathleen M.; Flor, Michael; Napolitano, Diane; Ramineni, Chaitanya – ETS Research Report Series, 2015
This paper considers whether the sources of linguistic complexity presented within texts targeted at 1st-grade readers have increased, decreased, or held steady over the 52-year period from 1962 to 2013. A collection of more than 450 texts is examined. All texts were selected from Grade 1 textbooks published by Scott Foresman during the targeted…
Descriptors: Text Structure, Content Analysis, Grade 1, Elementary School Students
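
To make the trend question above concrete (increased, decreased, or held steady), a hypothetical sketch, not the authors' analysis, could aggregate a per-text complexity feature by publication year and fit a simple linear trend; the records below are invented for illustration:

```python
import numpy as np

# Invented per-text records: (publication year, mean sentence length in words).
records = np.array([(1962, 6.1), (1962, 5.8), (1985, 6.9), (1985, 7.2),
                    (2007, 7.8), (2013, 8.0), (2013, 7.6)])

years, msl = records[:, 0], records[:, 1]
slope, intercept = np.polyfit(years, msl, deg=1)
print(f"Estimated change per decade: {slope * 10:+.2f} words per sentence")
```
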
Deane, Paul; Lawless, René R.; Li, Chen; Sabatini, John; Bejar, Isaac I.; O'Reilly, Tenaha – ETS Research Report Series, 2014
We expect that word knowledge accumulates gradually. This article draws on earlier approaches to assessing depth, but focuses on one dimension: richness of semantic knowledge. We present results from a study in which three distinct item types were developed at three levels of depth: knowledge of common usage patterns, knowledge of broad topical…
Descriptors: Vocabulary, Test Items, Language Tests, Semantics
Sheehan, Kathleen M.; Kostin, Irene; Futagi, Yoko – ETS Research Report Series, 2007
This paper explores alternative approaches for facilitating efficient, evidence-centered item development for a new type of verbal reasoning item developed for use on the GRE® General Test. Results obtained in two separate studies are reported. The first study documented the development and validation of a fully automated approach for locating the…
Descriptors: College Entrance Examinations, Graduate Study, Test Items, Item Analysis
Deane, Paul; Graf, Edith Aurora; Higgins, Derrick; Futagi, Yoko; Lawless, René – ETS Research Report Series, 2006
This study focuses on the relationship between item modeling and evidence-centered design (ECD); it considers how an appropriately generalized item modeling software tool can support systematic identification and exploitation of task-model variables, and then examines the feasibility of this goal, using linear-equation items as a test case. The…
Descriptors: Test Items, Models, Computer Software, Equations (Mathematics)
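
The study above uses linear-equation items to test how an item modeling tool can exploit task-model variables. As a hypothetical sketch of that idea (not the ETS software; the variable names and ranges are assumptions), an item model might instantiate its variables within constrained ranges and derive the key from the same model:

```python
import random

def generate_linear_equation_item(rng):
    # Hypothetical item model of the form a*x + b = c, where the task-model
    # variables are the coefficient a, the constant b, and the keyed solution x.
    a = rng.randint(2, 9)
    x = rng.randint(1, 12)          # intended solution (the key)
    b = rng.randint(-10, 10)
    c = a * x + b
    sign, magnitude = ("+", b) if b >= 0 else ("-", -b)
    stem = f"Solve for x: {a}x {sign} {magnitude} = {c}"
    return {"stem": stem, "key": x, "variables": {"a": a, "b": b, "c": c}}

rng = random.Random(7)
for _ in range(3):
    print(generate_linear_equation_item(rng))
```
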
Graf, Edith Aurora; Peterson, Stephen; Steffen, Manfred; Lawless, René – ETS Research Report Series, 2005
We describe the item modeling development and evaluation process as applied to a quantitative assessment with high-stakes outcomes. In addition to expediting the item-creation process, a model-based approach may reduce pretesting costs if the difficulty and discrimination of model-generated items can be predicted to a predefined level of…
Descriptors: Psychometrics, Accuracy, Item Analysis, High Stakes Tests
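
Difficulty and discrimination in the abstract above are the usual item response theory parameters. As a brief, hedged illustration of what predicting them to a predefined level of accuracy could involve (the 2PL form is standard, but the numbers below are invented), a sketch might compare predicted parameters for model-generated items against later calibrated values:

```python
import numpy as np

def two_pl(theta, a, b):
    # Standard 2PL item response function: probability of a correct response
    # for ability theta, discrimination a, and difficulty b.
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Invented predicted vs. calibrated difficulties for three items
# generated from one item model (illustration only).
predicted_b  = np.array([-0.20, 0.10, 0.45])
calibrated_b = np.array([-0.35, 0.05, 0.60])

rmse = float(np.sqrt(np.mean((predicted_b - calibrated_b) ** 2)))
print(f"RMSE of predicted difficulty: {rmse:.3f}")
print(f"P(correct) at theta = 0 for item 1: {two_pl(0.0, a=1.2, b=predicted_b[0]):.3f}")
```
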