Showing 1 to 15 of 20 results
Peer reviewed
Burhan Ogut; Ruhan Circi; Huade Huo; Juanita Hicks; Michelle Yin – International Electronic Journal of Elementary Education, 2025
This study explored the effectiveness of extended time (ET) accommodations in the 2017 NAEP Grade 8 Mathematics assessment to enhance educational equity. Analyzing NAEP process data through an XGBoost model, we examined whether early interactions with assessment items could predict students' likelihood of requiring ET by identifying those who received…
Descriptors: Identification, Testing Accommodations, National Competency Tests, Equal Education
Peer reviewed
Blair Lehman; Jesse R. Sparks; Jonathan Steinberg – ETS Research Report Series, 2024
Over the last 20 years, many methods have been proposed to use process data (e.g., response time) to detect changes in engagement during the test-taking process. However, many of these methods were developed and evaluated in highly similar testing contexts: 30 or more single-select multiple-choice items presented in a linear, fixed sequence in…
Descriptors: National Competency Tests, Secondary School Mathematics, Secondary School Students, Mathematics Tests
Peer reviewed
Xin Wei – Grantee Submission, 2025
This study investigates the time-use patterns of students with learning disabilities during digital mathematics assessments and explores the role of extended time accommodations (ETA) in shaping these patterns. Using latent profile analysis, four distinct time-use profiles were identified separately for students with and without ETA. "Initial…
Descriptors: Computer Assisted Testing, Mathematics Tests, Students with Disabilities, Testing Accommodations
Peer reviewed
Congning Ni; Bhashithe Abeysinghe; Juanita Hicks – International Electronic Journal of Elementary Education, 2025
The National Assessment of Educational Progress (NAEP), often referred to as The Nation's Report Card, offers a window into the state of the U.S. K-12 education system. Since 2017, NAEP has transitioned to digital assessments, opening new research opportunities that were previously impossible. Process data tracks students' interactions with the…
Descriptors: Reaction Time, Multiple Choice Tests, Behavior Change, National Competency Tests
Peer reviewed
Jiang, Yang; Gong, Tao; Saldivia, Luis E.; Cayton-Hodges, Gabrielle; Agard, Christopher – Large-scale Assessments in Education, 2021
In 2017, the mathematics assessments that are part of the National Assessment of Educational Progress (NAEP) program underwent a transformation, shifting administration from paper-and-pencil formats to digitally based assessments (DBA). This shift introduced new interactive item types that bring rich process data and tremendous opportunities to…
Descriptors: Data Use, Learning Analytics, Test Items, Measurement
Peer reviewed
Patel, Nirmal; Sharma, Aditya; Shah, Tirth; Lomas, Derek – Journal of Educational Data Mining, 2021
Process Analysis is an emerging approach for discovering meaningful knowledge from temporal educational data. The study presented in this paper shows how we used Process Analysis methods on National Assessment of Educational Progress (NAEP) test data to model and predict student test-taking behavior. Our process-oriented data exploration…
Descriptors: Learning Analytics, National Competency Tests, Evaluation Methods, Prediction
Peer reviewed
Wang, Yan; Murphy, Kevin B. – National Center for Education Statistics, 2020
In 2018, the National Center for Education Statistics (NCES) administered two assessments--the National Assessment of Educational Progress (NAEP) Technology and Engineering Literacy (TEL) assessment and the International Computer and Information Literacy Study (ICILS)--to two separate nationally representative samples of 8th-grade students in the…
Descriptors: National Competency Tests, International Assessment, Computer Literacy, Information Literacy
Peer reviewed
Xin Wei – Educational Researcher, 2024
This study investigates the relationship between text-to-speech (TTS) usage and item-by-item performance in the 2017 eighth-grade National Assessment of Educational Progress (NAEP) math assessment, focusing on students with disabilities (SWDs), English language learners (ELLs), and their general education (GE) peers. Results indicate that all…
Descriptors: Assistive Technology, Students with Disabilities, English Language Learners, Regular and Special Education Relationship
Durán, Richard P.; Zhang, Ting; Sañosa, David; Stancavage, Fran – American Institutes for Research, 2020
The National Assessment of Educational Progress's (NAEP's) transition to an entirely digitally based assessment (DBA) began in 2017. As part of this transition, new types of NAEP items have begun to be developed that leverage the DBA environment to measure a wider range of knowledge and skills. These new item types include the science…
Descriptors: National Competency Tests, Computer Assisted Testing, Science Tests, Test Items
O'Malley, Fran; Norton, Scott – American Institutes for Research, 2022
This paper provides the National Center for Education Statistics (NCES), National Assessment Governing Board (NAGB), and the National Assessment of Educational Progress (NAEP) community with information that may help maintain the validity and utility of the NAEP assessments for civics and U.S. history as revisions are planned to the NAEP…
Descriptors: National Competency Tests, United States History, Test Validity, Governing Boards
Peer reviewed
Tate, Tamara P.; Warschauer, Mark – Technology, Knowledge and Learning, 2019
The quality of students' writing skills continues to concern educators. Because writing is essential to success in both college and career, poor writing can have lifelong consequences. Writing is now primarily done digitally, but students receive limited explicit instruction in digital writing. This lack of instruction means that students fail to…
Descriptors: Writing Tests, Computer Assisted Testing, Writing Skills, Writing Processes
Peer reviewed
Pan, Qianqian – AERA Online Paper Repository, 2016
Comparability studies between computer-based tests (CBT) and paper-and-pencil tests (PPT) have examined whether there are score differences between these two modes, but results have been inconsistent across studies and over time. The purpose of this study is to summarize testing mode effects for K-12 mathematics tests by conducting a meta-analysis. 103 effect sizes from 32 studies were collected…
Descriptors: Elementary School Mathematics, Secondary School Mathematics, Computer Assisted Testing, Student Evaluation
Peer reviewed
Ling, Guangming – International Journal of Testing, 2016
To investigate a possible iPad-related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…
Descriptors: Educational Testing, Computer Assisted Testing, Handheld Devices, Computers
White, Sheida; Kim, Young Yee; Chen, Jing; Liu, Fei – National Center for Education Statistics, 2015
This study examined whether fourth-graders could fully demonstrate their writing skills on the computer, and the factors associated with their performance on the National Assessment of Educational Progress (NAEP) computer-based writing assessment. The results suggest that high-performing fourth-graders (those who scored in the upper 20 percent…
Descriptors: National Competency Tests, Computer Assisted Testing, Writing Tests, Grade 4
Peer reviewed
Thompson, Meredith Myra; Braude, Eric John – Journal of Educational Computing Research, 2016
The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…
Descriptors: Computer Assisted Testing, Teaching Methods, Online Courses, Critical Thinking