Andrea Cedeno – ProQuest LLC, 2022
The relationship between computer-based and paper-based testing may vary. Students with special needs may or may not perform better on computer-based testing compared to paper-based testing. Over the past few decades, computers and technology have increased in society and in classrooms. The use of technology has increased during the era of the…
Descriptors: Computer Assisted Testing, Test Format, Special Needs Students, Reading Comprehension
Ali Amjadi – Reading & Writing Quarterly, 2024
Over the last few years, technology has offered new ways of teaching and learning. Accordingly, educational systems are adopting what technology has purveyed to education. The abrupt upsurge of the COVID-19 pandemic also expedited this employment and impelled educational systems to shift to online teaching and learning. Consequently, the offline…
Descriptors: Test Format, Reading Comprehension, Computer Assisted Testing, Reading Strategies
Shin, Jinnie; Gierl, Mark J. – International Journal of Testing, 2022
Over the last five years, tremendous strides have been made in advancing the AIG methodology required to produce items in diverse content areas. However, the one content area where enormous problems remain unsolved is language arts, generally, and reading comprehension, more specifically. While reading comprehension test items can be created using…
Descriptors: Reading Comprehension, Test Construction, Test Items, Natural Language Processing
Ben-Yehudah, Gal; Eshet-Alkalai, Yoram – British Journal of Educational Technology, 2021
The use of digital environments for both learning and assessment is becoming prevalent. This often leads to incongruent situations, in which the study medium (eg, printed textbook) is different from the testing medium (eg, online multiple-choice exams). Despite some evidence that incongruent study-test situations are associated with inferior…
Descriptors: Reading Comprehension, Reading Tests, Test Format, Computer Assisted Testing
Kroehne, Ulf; Buerger, Sarah; Hahnel, Carolin; Goldhammer, Frank – Educational Measurement: Issues and Practice, 2019
For many years, reading comprehension in the Programme for International Student Assessment (PISA) was measured via paper-based assessment (PBA). In the 2015 cycle, computer-based assessment (CBA) was introduced, raising the question of whether central equivalence criteria required for a valid interpretation of the results are fulfilled. As an…
Descriptors: Reading Comprehension, Computer Assisted Testing, Achievement Tests, Foreign Countries
Ben Seipel; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison; Patrick C. Kennedy – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment), identifies students who struggle with comprehension, and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Gu, Lin; Ling, Guangming; Liu, Ou Lydia; Yang, Zhitong; Li, Guirong; Kardanova, Elena; Loyalka, Prashant – Assessment & Evaluation in Higher Education, 2021
We examine the effects of computer-based versus paper-based assessment of critical thinking skills, adapted from English (in the U.S.) to Chinese. Using data collected based on a random assignment between the two modes in multiple Chinese colleges, we investigate mode effects from multiple perspectives: mean scores, measurement precision, item…
Descriptors: Critical Thinking, Tests, Test Format, Computer Assisted Testing
Yeom, Soohye; Jun, Henry – Language Assessment Quarterly, 2020
This study investigated whether young Korean students learning English as a foreign language (EFL) engage in similar processes when responding to reading comprehension questions delivered in two different test-presentation modes: paper and computer. It examined the relationship between the two modes and test-takers' reading comprehension…
Descriptors: English (Second Language), Second Language Learning, Reading Strategies, Test Wiseness
Toroujeni, Seyyed Morteza Hashemi – Education and Information Technologies, 2022
Score interchangeability of Computerized Fixed-Length Linear Testing (henceforth CFLT) and Paper-and-Pencil-Based Testing (henceforth PPBT) has become a controversial issue over the last decade when technology has meaningfully restructured methods of the educational assessment. Given this controversy, various testing guidelines published on…
Descriptors: Computer Assisted Testing, Reading Tests, Reading Comprehension, Scoring
Zawoyski, Andrea; Ardoin, Scott P. – School Psychology Review, 2019
Reading comprehension assessments often include multiple-choice (MC) questions, but some researchers doubt their validity in measuring comprehension. Consequently, new assessments may include more short-answer (SA) questions. The current study contributes to the research comparing MC and SA questions by evaluating the effects of anticipated…
Descriptors: Eye Movements, Elementary School Students, Children, Test Format
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa – Preventing School Failure, 2016
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
Descriptors: Computer Assisted Testing, Generalization, Reading Strategies, Reading Comprehension
Chen, Jing; Sheehan, Kathleen M. – ETS Research Report Series, 2015
The "TOEFL"® family of assessments includes the "TOEFL"® Primary"™, "TOEFL Junior"®, and "TOEFL iBT"® tests. The linguistic complexity of stimulus passages in the reading sections of the TOEFL family of assessments is expected to differ across the test levels. This study evaluates the linguistic…
Descriptors: Language Tests, Second Language Learning, English (Second Language), Reading Comprehension
Brown, Kevin – CEA Forum, 2015
In this article, the author describes his project to take every standardized exam that English majors take. During the summer and fall semesters of 2012, the author signed up for and took the GRE General Test, the Praxis Content Area Exam (English Language, Literature, and Composition: Content Knowledge), the Senior Major Field Tests in…
Descriptors: College Faculty, College English, Test Preparation, Standardized Tests
Kobrin, Jennifer L.; Young, John W. – Applied Measurement in Education, 2003
Studied the cognitive equivalence of computerized and paper-and-pencil reading comprehension tests using verbal protocol analysis. Results for 48 college students indicate that the only significant difference between the computerized and paper-and-pencil tests was in the frequency of identifying important information in the passage. (SLD)
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Difficulty Level
Heppner, Frank H.; And Others – Journal of Reading, 1985
Reports that reading performance on a standardized test is better when the text is displayed in print, rather than on a computer display screen. (HOD)
Descriptors: Comparative Analysis, Computer Assisted Testing, Higher Education, Reading Comprehension