Showing all 6 results
Peer reviewed
Direct link
Ponce, Héctor R.; Mayer, Richard E.; Loyola, María Soledad – Journal of Educational Computing Research, 2021
One of the most common technology-enhanced items used in large-scale K-12 testing programs is the drag-and-drop response interaction. The main research questions in this study are: (a) Does adding a drag-and-drop interface to an online test affect the accuracy of student performance? (b) Does adding a drag-and-drop interface to an online test…
Descriptors: Computer Assisted Testing, Test Construction, Standardized Tests, Elementary School Students
Peer reviewed
Direct link
Goodwin, Amanda; Petscher, Yaacov; Tock, Jamie – Journal of Research in Reading, 2021
Background: Middle school students use the information conveyed by morphemes (i.e., units of meaning such as prefixes, root words and suffixes) in different ways to support their literacy endeavours, suggesting the likelihood that morphological knowledge is multidimensional. This has important implications for assessment. Methods: The current…
Descriptors: Middle School Students, Morphology (Languages), Metalinguistics, Student Evaluation
Goodwin, Amanda P.; Petscher, Yaacov; Tock, Jamie – Grantee Submission, 2021
Background: Middle school students use the information conveyed by morphemes (i.e., units of meaning such as prefixes, root words and suffixes) in different ways to support their literacy endeavours, suggesting the likelihood that morphological knowledge is multidimensional. This has important implications for assessment. Methods: The current…
Descriptors: Morphology (Languages), Morphemes, Middle School Students, Knowledge Level
Steedle, Jeffrey; McBride, Malena; Johnson, Marc; Keng, Leslie – Partnership for Assessment of Readiness for College and Careers, 2016
The first operational administration of the Partnership for Assessment of Readiness for College and Careers (PARCC) took place during the 2014-2015 school year. In addition to the traditional paper-and-pencil format, the assessments were available for administration on a variety of electronic devices, including desktop computers, laptop computers,…
Descriptors: Computer Assisted Testing, Difficulty Level, Test Items, Scores
Peer reviewed
Direct link
Quellmalz, Edys S.; Davenport, Jodi L.; Timms, Michael J.; DeBoer, George E.; Jordan, Kevin A.; Huang, Chun-Wei; Buckley, Barbara C. – Journal of Educational Psychology, 2013
How can assessments measure complex science learning? Although traditional, multiple-choice items can effectively measure declarative knowledge such as scientific facts or definitions, they are considered less well suited for providing evidence of science inquiry practices such as making observations or designing and conducting investigations.…
Descriptors: Science Education, Educational Assessment, Psychometrics, Science Tests
Alonzo, Julie; Anderson, Daniel; Tindal, Gerald – Behavioral Research and Teaching, 2009
We present scaling outcomes for mathematics assessments used in the fall to screen students at risk of failing to learn the knowledge and skills described in the National Council of Teachers of Mathematics (NCTM) Focal Point Standards. At each grade level, the assessment consisted of a 48-item test with three 16-item sub-test sets aligned to the…
Descriptors: At Risk Students, Mathematics Teachers, National Standards, Item Response Theory