Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 8 |
Since 2006 (last 20 years) | 13 |
Descriptor
Computer Assisted Testing | 13 |
Item Response Theory | 13 |
Grade 5 | 9 |
Grade 6 | 7 |
Test Items | 7 |
Elementary School Students | 5 |
Grade 4 | 5 |
Grade 3 | 4 |
Grade 7 | 4 |
Language Tests | 4 |
Test Construction | 4 |
Author
Alonzo, Julie | 3 |
Anderson, Daniel | 3 |
Park, Bitnara Jasmine | 3 |
Tindal, Gerald | 3 |
Albano, Anthony D. | 1 |
Ali, Usama | 1 |
Brown, Terran | 1 |
Chen, Guanhua | 1 |
Chen, Jianshen | 1 |
Chen, Shin-Feng | 1 |
Costanzo, Kate | 1 |
Publication Type
Reports - Research | 8 |
Numerical/Quantitative Data | 5 |
Journal Articles | 4 |
Reports - Evaluative | 2 |
Collected Works - Proceedings | 1 |
Dissertations/Theses -… | 1 |
Reports - Descriptive | 1 |
Speeches/Meeting Papers | 1 |
Education Level
Elementary Education | 13 |
Intermediate Grades | 13 |
Middle Schools | 11 |
Grade 5 | 9 |
Grade 6 | 7 |
Early Childhood Education | 5 |
Grade 4 | 5 |
Junior High Schools | 5 |
Primary Education | 5 |
Secondary Education | 5 |
Grade 3 | 4 |
Location
Maryland | 2 |
Arkansas | 1 |
Colorado | 1 |
District of Columbia | 1 |
Finland | 1 |
France | 1 |
Illinois | 1 |
Massachusetts | 1 |
Mississippi | 1 |
New Jersey | 1 |
New Mexico | 1 |
Assessments and Surveys
Measures of Academic Progress | 1 |
National Assessment of Educational Progress | 1 |
Program for International Student Assessment | 1 |
Nagy, Gabriel; Ulitzsch, Esther; Lindner, Marlit Annalena – Journal of Computer Assisted Learning, 2023
Background: Item response times in computerized assessments are frequently used to identify rapid guessing behaviour as a manifestation of response disengagement. However, non-rapid responses (i.e., with longer response times) are not necessarily engaged, which means that response-time-based procedures could overlook disengaged responses.…
Descriptors: Guessing (Tests), Academic Persistence, Learner Engagement, Computer Assisted Testing
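To make the response-time-based procedure mentioned above concrete, here is a minimal sketch, assuming a normative threshold of 10% of each item's median response time; the threshold fraction, item names, and data are illustrative assumptions, not the authors' method.

# Illustrative sketch (not the authors' procedure): flag a response as a
# rapid guess when its response time falls below a normative threshold,
# here a hypothetical 10% of the item's median response time.
from statistics import median

def rapid_guess_flags(response_times_by_item, threshold_fraction=0.10):
    """Return {item: [True if the response looks like a rapid guess, ...]}."""
    flags = {}
    for item, times in response_times_by_item.items():
        threshold = threshold_fraction * median(times)  # normative threshold
        flags[item] = [t < threshold for t in times]
    return flags

# Example with invented response times (seconds) for two hypothetical items.
times = {"item_1": [1.2, 18.5, 22.0], "item_2": [0.8, 25.3, 30.1]}
print(rapid_guess_flags(times))

As the abstract notes, responses that clear such a threshold are not necessarily engaged, so a flag like this captures only one manifestation of disengagement.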
Jewsbury, Paul A.; van Rijn, Peter W. – Journal of Educational and Behavioral Statistics, 2020
In large-scale educational assessment data consistent with a simple-structure multidimensional item response theory (MIRT) model, where every item measures only one latent variable, separate unidimensional item response theory (UIRT) models for each latent variable are often calibrated for practical reasons. While this approach can be valid for…
Descriptors: Item Response Theory, Computation, Test Items, Adaptive Testing
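As a rough illustration of the setup described above (a sketch based only on the abstract, with notation chosen here for convenience), a simple-structure MIRT model lets each item measure exactly one latent variable, so its item response function reduces to a unidimensional form:

$$P(X_{ij}=1 \mid \boldsymbol{\theta}_i) = \frac{\exp\{a_j\,\theta_{i,d(j)} + c_j\}}{1 + \exp\{a_j\,\theta_{i,d(j)} + c_j\}},$$

where $d(j)$ is the single dimension measured by item $j$, $a_j$ is the item's discrimination, and $c_j$ its intercept. Calibrating a separate UIRT model for each dimension fits this function one latent variable at a time, ignoring the relations among the dimensions, which is the practical shortcut whose validity the article examines.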
James, Syretta R.; Liu, Shihching Jessica; Maina, Nyambura; Wade, Julie; Wang, Helen; Wilson, Heather; Wolanin, Natalie – Montgomery County Public Schools, 2021
The impact of the COVID-19 pandemic continues to overwhelm the functioning and outcomes of educational systems throughout the nation. The public education system is under particular scrutiny given that students, families, and educators are under considerable stress to maintain academic progress. Since the beginning of the crisis, school systems…
Descriptors: Achievement Tests, COVID-19, Pandemics, Public Schools
Wang, Jing-Ru; Chen, Shin-Feng – International Journal of Science and Mathematics Education, 2016
This article reports on the development of an online dynamic approach for assessing and improving students' reading comprehension of science texts--the dynamic assessment for reading comprehension of science text (DARCST). The DARCST blended assessment and response-specific instruction into a holistic learning task for grades 5 and 6 students. The…
Descriptors: Computer Assisted Testing, Reading Comprehension, Science Instruction, Grade 5
Chen, Guanhua – ProQuest LLC, 2018
This study is part of a larger design study that iteratively improves a robotics programming curriculum as well as a computational thinking (CT) instrument. Its focus was mainly on CT assessment and particularly on an online CT instrument with logging functionality that can store a student's problem-solving process by recording interactions…
Descriptors: Elementary School Students, Test Construction, Cognitive Tests, Computer Assisted Testing
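As a hypothetical illustration of the logging functionality the abstract describes (field names and events are assumptions, not the study's actual schema), an online instrument might append each student action with a timestamp so the problem-solving process can be reconstructed later:

# Hypothetical interaction logger for an online assessment task; the
# event fields below are invented for illustration.
import json, time

class InteractionLog:
    def __init__(self, student_id):
        self.student_id = student_id
        self.events = []

    def record(self, item_id, action, payload=None):
        self.events.append({
            "t": time.time(),    # timestamp of the interaction
            "item": item_id,     # which assessment task was active
            "action": action,    # e.g., "run_program", "edit_block"
            "payload": payload,  # optional details of the action
        })

    def to_json(self):
        return json.dumps({"student": self.student_id, "events": self.events})

log = InteractionLog("S001")
log.record("task_3", "run_program", {"blocks_used": 7})
print(log.to_json())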
Liu, Junhui; Brown, Terran; Chen, Jianshen; Ali, Usama; Hou, Likun; Costanzo, Kate – Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium working to develop next-generation assessments that more accurately, compared to previous assessments, measure student progress toward college and career readiness. The PARCC assessments include both English Language Arts/Literacy (ELA/L) and…
Descriptors: Testing, Achievement Tests, Test Items, Test Bias
Steedle, Jeffrey; McBride, Malena; Johnson, Marc; Keng, Leslie – Partnership for Assessment of Readiness for College and Careers, 2016
The first operational administration of the Partnership for Assessment of Readiness for College and Careers (PARCC) took place during the 2014-2015 school year. In addition to the traditional paper-and-pencil format, the assessments were available for administration on a variety of electronic devices, including desktop computers, laptop computers,…
Descriptors: Computer Assisted Testing, Difficulty Level, Test Items, Scores
Wyse, Adam E.; Albano, Anthony D. – Applied Measurement in Education, 2015
This article used several data sets from a large-scale state testing program to examine the feasibility of combining general and modified assessment items in computerized adaptive testing (CAT) for different groups of students. Results suggested that several of the assumptions made when employing this type of mixed-item CAT may not be met for…
Descriptors: Adaptive Testing, Computer Assisted Testing, Test Items, Testing Programs
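To make the mixed-item CAT idea concrete, the sketch below selects the next item by maximum Fisher information under a 2PL model from a pool that mixes general and modified items; the item parameters, labels, and selection rule are assumptions for illustration, not the design studied in the article.

# Minimal CAT item-selection sketch over a mixed pool of "general" and
# "modified" items under a 2PL model; all values are invented.
import math

def information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def select_next_item(theta, pool, administered):
    """Pick the not-yet-administered item with maximum information."""
    candidates = [it for it in pool if it["id"] not in administered]
    return max(candidates, key=lambda it: information(theta, it["a"], it["b"]))

pool = [
    {"id": "G1", "type": "general",  "a": 1.2, "b": 0.0},
    {"id": "G2", "type": "general",  "a": 0.9, "b": 1.0},
    {"id": "M1", "type": "modified", "a": 0.7, "b": -0.5},
]
print(select_next_item(theta=-0.3, pool=pool, administered={"G1"}))

Pooling the two item types this way assumes they can be calibrated and selected on a common scale for every group of students, which is the kind of assumption the article finds may not be met.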
Liu, I-Fang; Ko, Hwa-Wei – International Association for Development of the Information Society, 2016
Perspectives from the reading and information fields have identified similar skills as belonging to two different kinds of literacy: online reading abilities and ICT skills. This causes a conflict between the two research fields and increases the difficulty of integrating study results. The purpose of this study was to determine which views are suitable for…
Descriptors: Information Technology, Information Literacy, Computer Literacy, Reading Skills
Alonzo, Julie; Anderson, Daniel; Park, Bitnara Jasmine; Tindal, Gerald – Behavioral Research and Teaching, 2012
In this technical report, we describe the development and piloting of a series of vocabulary assessments intended for use with students in grades two through eight. These measures, available as part of easyCBM[TM], an online progress monitoring and benchmark/screening assessment system, were developed in 2010 and administered to approximately 1200…
Descriptors: Curriculum Based Assessment, Vocabulary, Language Tests, Test Construction
Alonzo, Julie; Anderson, Daniel; Park, Bitnara Jasmine; Tindal, Gerald – Behavioral Research and Teaching, 2012
In this technical report, we describe the development and piloting of a series of vocabulary assessments intended for use with students in grades two through eight. These measures, available as part of easyCBM[TM], an online progress monitoring and benchmark/screening assessment system, were developed in 2010 and administered to approximately 1200…
Descriptors: Curriculum Based Assessment, Vocabulary, Language Tests, Test Construction
Alonzo, Julie; Anderson, Daniel; Park, Bitnara Jasmine; Tindal, Gerald – Behavioral Research and Teaching, 2012
In this technical report, we describe the development and piloting of a series of vocabulary assessments intended for use with students in grades two through eight. These measures, available as part of easyCBM[TM], an online progress monitoring and benchmark/screening assessment system, were developed in 2010 and administered to approximately 1200…
Descriptors: Curriculum Based Assessment, Vocabulary, Language Tests, Test Construction
Stamper, John, Ed.; Pardos, Zachary, Ed.; Mavrikis, Manolis, Ed.; McLaren, Bruce M., Ed. – International Educational Data Mining Society, 2014
The 7th International Conference on Educational Data Mining, held July 4th-7th, 2014, at the Institute of Education, London, UK, is the leading international forum for high-quality research that mines large data sets in order to answer educational research questions that shed light on the learning process. These data sets may come from the traces…
Descriptors: Information Retrieval, Data Processing, Data Analysis, Data Collection