Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 6
Since 2006 (last 20 years): 7
Descriptor
Computer Assisted Testing: 16
Scoring: 15
Test Construction: 8
Test Reliability: 6
Simulation: 4
Test Items: 4
Test Validity: 4
Adaptive Testing: 3
Benchmarking: 3
Career Readiness: 3
College Readiness: 3
…
Source
National Center for Education Statistics: 3
Nebraska Department of Education: 2
Behavioral Research and Teaching: 1
Grantee Submission: 1
Partnership for Assessment of Readiness for College and Careers: 1
Author
O'Neil, Harold F., Jr.: 3
Bennett, Randy Elliot: 2
Chung, Gregory K. W. K.: 2
Klein, Davina C. D.: 2
Schacter, John: 2
Allen, Nancy: 1
Alonzo, Julie: 1
Baker, Eva L.: 1
Ben Seipel: 1
Braswell, James: 1
Dalton, Ben: 1
…
Publication Type
Numerical/Quantitative Data: 16
Reports - Research: 8
Reports - Evaluative: 7
Tests/Questionnaires: 2
Collected Works - General: 1
Opinion Papers: 1
Education Level
Elementary Education: 4
Intermediate Grades: 3
Grade 4: 2
Grade 8: 2
Junior High Schools: 2
Middle Schools: 2
Secondary Education: 2
Early Childhood Education: 1
Grade 2: 1
Grade 3: 1
Grade 5: 1
…
Location
Nebraska: 2
Assessments and Surveys
Massachusetts Comprehensive Assessment System: 1
Progress in International Reading Literacy Study: 1
Ben Seipel; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison; Patrick C. Kennedy – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment) identifies students who struggle with comprehension and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Herget, Debbie; Dalton, Ben; Kinney, Saki; Smith, W. Zachary; Wilson, David; Rogers, Jim – National Center for Education Statistics, 2019
The Progress in International Reading Literacy Study (PIRLS) is an international comparative study of student performance in reading literacy at the fourth grade. PIRLS 2016 marks the fourth iteration of the study, which has been conducted every 5 years since 2001. New to the PIRLS assessment in 2016, ePIRLS provides a computer-based extension to…
Descriptors: Achievement Tests, Grade 4, Reading Achievement, Foreign Countries
Nebraska Department of Education, 2020
The Spring 2020 Nebraska Student-Centered Assessment System (NSCAS) General Summative testing was cancelled due to COVID-19. This technical report documents the processes and procedures that had been implemented to support the Spring 2020 assessments prior to the cancellation. The following sections are presented in this technical report: (1)…
Descriptors: English, Language Arts, Mathematics Tests, Science Tests
Nebraska Department of Education, 2019
This technical report documents the processes and procedures implemented to support the Spring 2019 Nebraska Student-Centered Assessment System (NSCAS) General Summative English Language Arts (ELA), Mathematics, and Science assessments by NWEA® under the supervision of the Nebraska Department of Education (NDE). The technical report shows how the…
Descriptors: English, Language Arts, Summative Evaluation, Mathematics Tests
Kahn, Josh; Nese, Joseph T.; Alonzo, Julie – Behavioral Research and Teaching, 2016
There is strong theoretical support for oral reading fluency (ORF) as an essential building block of reading proficiency. The current and standard ORF assessment procedure requires that students read aloud a grade-level passage (~250 words) in a one-to-one administration, with the number of words read correctly in 60 seconds constituting their…
Descriptors: Teacher Surveys, Oral Reading, Reading Tests, Computer Assisted Testing
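The scoring rule the abstract describes comes down to a words-correct-per-minute calculation. The sketch below, in Python, illustrates that arithmetic under assumed inputs (a list of per-word correctness marks and the timing window); it is not the authors' scoring code.

    # Minimal sketch of the standard ORF score described above: words read
    # correctly in 60 seconds, expressed as a per-minute rate. Inputs are
    # hypothetical and the study's own procedure may differ in detail.
    def words_correct_per_minute(word_marks, seconds_elapsed=60):
        """word_marks: booleans, True where a word was read correctly within
        the timed window; seconds_elapsed: length of the window in seconds."""
        if seconds_elapsed <= 0:
            raise ValueError("seconds_elapsed must be positive")
        words_correct = sum(bool(m) for m in word_marks)
        return words_correct * 60.0 / seconds_elapsed  # scale to one minute

    # Example: 112 words attempted in 60 seconds, 104 read correctly.
    marks = [True] * 104 + [False] * 8
    print(words_correct_per_minute(marks))  # 104.0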
Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium designed to create next-generation assessments that, compared to traditional K-12 assessments, more accurately measure student progress toward college and career readiness. The PARCC assessments are aligned to the Common Core State Standards…
Descriptors: Standardized Tests, Career Readiness, College Readiness, Test Validity
O'Neil, Harold F., Jr.; Klein, Davina C. D. – 1997
This report documents progress at the Center for Research on Evaluation, Standards, and Student Testing (CRESST) on the feasibility of scoring concept maps using technology. CRESST, in its integrated simulation approach to assessment, has assembled a suite of performance assessment tasks (the integrated simulation) onto which it has mapped the…
Descriptors: Automation, Computer Assisted Testing, Concept Mapping, Cooperation

Green, Bert F. – 2002
Maximum likelihood and Bayesian estimates of proficiency, typically used in adaptive testing, rely on item weights that themselves depend on the test taker proficiency being estimated. In this study, several methods using fixed item weights, which depend mainly on the items' difficulty, were explored through computer simulation. The simpler scores…
Descriptors: Adaptive Testing, Bayesian Statistics, Computer Assisted Testing, Computer Simulation
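Green's comparison concerns two ways of converting item responses into a proficiency score. The sketch below is a minimal illustration assuming a two-parameter logistic (2PL) IRT model with made-up item parameters: maximum likelihood scoring, whose effective item weights vary with the proficiency being estimated, versus a fixed-weight score whose weights are a function of item difficulty only. Neither the model nor the weighting function is taken from the study.

    import math

    # Illustrative 2PL item parameters (discrimination a, difficulty b);
    # invented for this sketch, not taken from Green (2002).
    ITEMS = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.4), (1.0, 1.1), (0.6, -1.2)]

    def p_correct(theta, a, b):
        # 2PL probability of a correct response at proficiency theta.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def ml_proficiency(responses, items=ITEMS):
        # Maximum likelihood estimate by grid search over theta in [-4, 4];
        # the weight an item carries in the likelihood varies with theta.
        def loglik(theta):
            return sum(math.log(p_correct(theta, a, b)) if u
                       else math.log(1.0 - p_correct(theta, a, b))
                       for u, (a, b) in zip(responses, items))
        grid = [g / 100.0 for g in range(-400, 401)]
        return max(grid, key=loglik)

    def fixed_weight_score(responses, items=ITEMS):
        # Fixed-weight alternative: each correct item adds a weight that is a
        # fixed function of its difficulty (the particular function here is
        # arbitrary; it only illustrates proficiency-independent weights).
        return sum(1.0 + 0.5 * b for u, (_, b) in zip(responses, items) if u)

    responses = [1, 1, 0, 0, 1]
    print(ml_proficiency(responses), fixed_weight_score(responses))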
Chung, Gregory K. W. K.; Herl, Howard E.; Klein, Davina C. D.; O'Neil, Harold F., Jr.; Schacter, John – 1997
This report examines issues in the scale-up of assessment software from the Center for Research on Evaluation, Standards, and Student Testing (CRESST). "Scale-up" is used in a metaphorical sense, meaning adding new assessment tools to CRESST's assessment software. During the past several years, CRESST has been developing and evaluating a…
Descriptors: Computer Assisted Testing, Computer Software, Concept Mapping, Educational Assessment
Sandene, Brent; Horkay, Nancy; Bennett, Randy Elliot; Allen, Nancy; Braswell, James; Kaplan, Bruce; Oranje, Andreas – National Center for Education Statistics, 2005
This publication presents the reports from two studies, Math Online (MOL) and Writing Online (WOL), part of the National Assessment of Educational Progress (NAEP) Technology-Based Assessment (TBA) project. Funded by the National Center for Education Statistics (NCES), the Technology-Based Assessment project is intended to explore the use of new…
Descriptors: Grade 8, Statistical Analysis, Scoring, Familiarity
O'Neil, Harold F., Jr.; Schacter, John – 1997
This document reviews several theoretical frameworks of problem-solving, provides a definition of the construct, suggests ways of measuring the construct, focuses on issues for assessment, and provides specifications for the computer-based assessment of problem solving. As defined in the model of the Center for Research on Evaluation, Standards,…
Descriptors: Computer Assisted Testing, Computer Software, Criteria, Educational Assessment
Russell, Michael; Plati, Tom – 2000
The studies reported here focus on the effects of mode of administration of the Massachusetts Comprehensive Assessment System (MCAS) in grades 4, 8, and 10. These studies, which draw on earlier studies, also examine the mode of administration effect at different levels of keyboarding speed and for students who received special education (SPED)…
Descriptors: Computer Assisted Testing, Elementary Secondary Education, Essay Tests, Keyboarding (Data Entry)
Bennett, Randy Elliot; Persky, Hilary; Weiss, Andrew R.; Jenkins, Frank – National Center for Education Statistics, 2007
The Problem Solving in Technology-Rich Environments (TRE) study was designed to demonstrate and explore innovative use of computers for developing, administering, scoring, and analyzing the results of National Assessment of Educational Progress (NAEP) assessments. Two scenarios (Search and Simulation) were created for measuring problem solving…
Descriptors: Computer Assisted Testing, National Competency Tests, Problem Solving, Simulation
Chung, Gregory K. W. K.; Baker, Eva L. – 1997
This report documents the technology initiatives of the Center for Research on Evaluation, Standards, and Student Testing (CRESST) in two broad areas: (1) using technology to improve the quality, utility, and feasibility of existing measures; and (2) using technology to design and develop new assessments and measurement approaches available…
Descriptors: Computer Assisted Testing, Constructed Response, Educational Planning, Educational Technology
Kingsbury, G. Gage; Weiss, David J. – 1981
Conventional mastery tests designed to make optimal mastery classifications were compared with fixed-length and variable-length adaptive mastery tests. Comparisons between the testing procedures were made across five content areas in an introductory biology course, using tests administered to volunteers. The criterion was the student's standing in…
Descriptors: Achievement Tests, Adaptive Testing, Biology, Comparative Analysis
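For readers unfamiliar with the distinction, a variable-length adaptive mastery test stops as soon as the accumulated responses support a classification, rather than after a fixed number of items. The sketch below shows one common stopping rule, a sequential probability ratio test on proportion correct; it is a generic illustration with invented cut points, not the procedure used in this study.

    import math

    # Generic sketch of a variable-length mastery stopping rule (a sequential
    # probability ratio test on proportion correct). Success rates and error
    # rates are invented for the example.
    P_MASTER, P_NONMASTER = 0.80, 0.60    # assumed success rates under each hypothesis
    ALPHA, BETA = 0.05, 0.05              # tolerated classification error rates
    UPPER = math.log((1 - BETA) / ALPHA)  # classify as master at or above this
    LOWER = math.log(BETA / (1 - ALPHA))  # classify as nonmaster at or below this

    def classify(responses, max_items=50):
        # Returns ('master' | 'nonmaster' | 'undecided', items administered).
        llr = 0.0
        for i, correct in enumerate(responses[:max_items], start=1):
            if correct:
                llr += math.log(P_MASTER / P_NONMASTER)
            else:
                llr += math.log((1 - P_MASTER) / (1 - P_NONMASTER))
            if llr >= UPPER:
                return "master", i
            if llr <= LOWER:
                return "nonmaster", i
        return "undecided", min(len(responses), max_items)

    # The test stops at item 11, before the remaining items are needed.
    print(classify([1] * 12 + [0, 1, 1]))  # ('master', 11)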