Showing 1 to 15 of 65 results
Peer reviewed
PDF on ERIC
Lopez, Alexis A.; Tolentino, Florencia – ETS Research Report Series, 2020
In this study we investigated how English learners (ELs) interacted with summative English language arts (ELA) and mathematics items, the embedded online tools, and accessibility features. We focused on how EL students navigated the assessment items; how they selected or constructed their responses; how they interacted with the…
Descriptors: English Language Learners, Student Evaluation, Language Arts, Summative Evaluation
Tomkowicz, Joanna; Kim, Dong-In; Wan, Ping – Online Submission, 2022
In this study we evaluated the stability of item parameters and student scores, using the pre-equated (pre-pandemic) parameters from Spring 2019 and post-equated (post-pandemic) parameters from Spring 2021 in two calibration and equating designs related to item parameter treatment: re-estimating all anchor parameters (Design 1) and holding the…
Descriptors: Equated Scores, Test Items, Evaluation Methods, Pandemics
Heather Dorrian – ProQuest LLC, 2021
This mixed methods study aimed to categorize and analyze the frequencies and percentages of complex thinking in the PARCC practice assessments in English Language Arts grade 10 and Geometry. The Hess Cognitive Rigor Matrix was used for the first part of the study to code each of the PARCC assessment questions in Language Arts grade 10 and…
Descriptors: Thinking Skills, Drills (Practice), Standardized Tests, Grade 10
Lazarus, Sheryl S.; Johnstone, Christopher J.; Liu, Kristin K.; Thurlow, Martha L.; Hinkle, Andrew R.; Burden, Kathryn – National Center on Educational Outcomes, 2022
This "Guide" is an update to the State Guide to Universally Designed Assessments produced by the National Center on Educational Outcomes (NCEO) in 2006 (Johnstone et al.). It provides a brief overview of what a universally designed assessment is, followed by a set of steps for states to consider when designing and developing, or…
Descriptors: Alternative Assessment, Educational Assessment, Test Construction, Summative Evaluation
Peer reviewed
Tony Albano; Brian F. French; Thao Thu Vo – Applied Measurement in Education, 2024
Recent research has demonstrated an intersectional approach to the study of differential item functioning (DIF). This approach expands DIF to account for the interactions between what have traditionally been treated as separate grouping variables. In this paper, we compare traditional and intersectional DIF analyses using data from a state testing…
Descriptors: Test Items, Item Analysis, Data Use, Standardized Tests
Peer reviewed
PDF on ERIC
Dianne S. McCarthy – Journal of Inquiry and Action in Education, 2023
Teacher certification exams are supposed to assess whether a candidate is likely to succeed in teaching. What if an exam seems to be inappropriate? This article is an inquiry into the New York State Content Specialty Test for Early Childhood Candidates, particularly the math section. It raises the issue of whether we are asking the right questions and…
Descriptors: Teacher Certification, Licensing Examinations (Professions), Preservice Teachers, Early Childhood Teachers
Kim, Dong-In; Julian, Marc; Hermann, Pam – Online Submission, 2022
In test equating, one critical property is group invariance, which requires that the equating function used to convert performance on each alternate form to the reporting scale be the same for various subgroups. To mitigate the impact of disrupted learning on the item parameters during the COVID-19 pandemic, a…
Descriptors: COVID-19, Pandemics, Test Format, Equated Scores
Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a group of states working together to develop a set of assessments that measure whether students are on track to be successful in college and careers. Administrations of the PARCC assessment included three Prose Constructed Responses (PCR), one per task for English…
Descriptors: Scoring Rubrics, Test Items, Literacy, Language Arts
Peer reviewed
Russell, Michael; Szendey, Olivia; Li, Zhushan – Educational Assessment, 2022
Recent research provides evidence that an intersectional approach to defining reference and focal groups results in a higher percentage of comparisons flagged for potential DIF. The study presented here examined the generalizability of this pattern across methods for examining DIF. While the level of DIF detection differed among the four methods…
Descriptors: Comparative Analysis, Item Analysis, Test Items, Test Construction
Peer reviewed
Russell, Michael; Moncaleano, Sebastian – Educational Assessment, 2019
Over the past decade, large-scale testing programs have employed technology-enhanced items (TEI) to improve the fidelity with which an item measures a targeted construct. This paper presents findings from a review of released TEIs employed by large-scale testing programs worldwide. Analyses examine the prevalence with which different types of TEIs…
Descriptors: Computer Assisted Testing, Fidelity, Elementary Secondary Education, Test Items
Peer reviewed
Dee, Thomas S.; Domingue, Benjamin W. – Educational Measurement: Issues and Practice, 2021
On the second day of a 2019 high-stakes English Language Arts assessment, Massachusetts 10th graders faced an essay question that was based on a passage from the novel "The Underground Railroad" and publicly characterized as racially insensitive. Though the state excluded the essay responses from student scores, an unresolved public…
Descriptors: High School Students, Grade 10, Language Arts, High Stakes Tests
Achieve, Inc., 2019
In 2013, the Council of Chief State School Officers (CCSSO), working collaboratively with state education agencies, released a set of criteria for states to use to evaluate and procure high-quality assessments. The English Language Arts (ELA)/Literacy section of the document included nine content-specific criteria to evaluate the alignment of…
Descriptors: Reading Skills, Student Evaluation, Evaluation Methods, Reading Tests
Nebraska Department of Education, 2021
This technical report documents the processes and procedures implemented to support the Spring 2021 Nebraska Student-Centered Assessment System (NSCAS) Phase I Pilot in English Language Arts (ELA), Mathematics, and Science assessments by NWEA® under the supervision of the Nebraska Department of Education (NDE). The technical report shows how the…
Descriptors: Psychometrics, Standard Setting, English, Language Arts
Madni, Ayesha; Kao, Jenny C.; Rivera, Nichole M.; Baker, Eva L.; Cai, Li – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2018
This report is the first in a series of five reports considering career-readiness features within high school assessments. Utilizing feature analysis and cognitive lab interviews, the primary objective of this study was to verify and validate the existence of specific career-readiness features in select math and English language arts (ELA) test…
Descriptors: Career Readiness, High School Students, Test Items, Student Evaluation
Peer reviewed
PDF on ERIC
van Rijn, Peter; Graf, Edith Aurora; Arieli-Attali, Meirav; Song, Yi – ETS Research Report Series, 2018
In this study, we explored the extent to which teachers agree on the ordering and separation of levels of two different learning progressions (LPs) in English language arts (ELA) and mathematics. In a panel meeting akin to a standard-setting procedure, we asked teachers to link the items and responses of summative educational assessments to LP…
Descriptors: Teacher Attitudes, Student Evaluation, Summative Evaluation, Language Arts