Showing 1 to 15 of 21 results
Andrea Cedeno – ProQuest LLC, 2022
The relationship between computer-based and paper-based testing may vary, and students with special needs may or may not perform better on computer-based testing than on paper-based testing. Over the past few decades, the presence of computers and technology in society and in classrooms has grown. The use of technology has increased during the era of the…
Descriptors: Computer Assisted Testing, Test Format, Special Needs Students, Reading Comprehension
Peer reviewed
Direct link
Ali Amjadi – Reading & Writing Quarterly, 2024
Over the last few years, technology has offered new ways of teaching and learning. Accordingly, educational systems are adopting what technology offers to education. The abrupt onset of the COVID-19 pandemic also accelerated this adoption and compelled educational systems to shift to online teaching and learning. Consequently, the offline…
Descriptors: Test Format, Reading Comprehension, Computer Assisted Testing, Reading Strategies
Peer reviewed
Direct link
Shin, Jinnie; Gierl, Mark J. – International Journal of Testing, 2022
Over the last five years, tremendous strides have been made in advancing the AIG methodology required to produce items in diverse content areas. However, the one content area where enormous problems remain unsolved is language arts, generally, and reading comprehension, more specifically. While reading comprehension test items can be created using…
Descriptors: Reading Comprehension, Test Construction, Test Items, Natural Language Processing
Peer reviewed
Direct link
Ben-Yehudah, Gal; Eshet-Alkalai, Yoram – British Journal of Educational Technology, 2021
The use of digital environments for both learning and assessment is becoming prevalent. This often leads to incongruent situations, in which the study medium (eg, printed textbook) is different from the testing medium (eg, online multiple-choice exams). Despite some evidence that incongruent study-test situations are associated with inferior…
Descriptors: Reading Comprehension, Reading Tests, Test Format, Computer Assisted Testing
Peer reviewed
Direct link
Kroehne, Ulf; Buerger, Sarah; Hahnel, Carolin; Goldhammer, Frank – Educational Measurement: Issues and Practice, 2019
For many years, reading comprehension in the Programme for International Student Assessment (PISA) was measured via paper-based assessment (PBA). In the 2015 cycle, computer-based assessment (CBA) was introduced, raising the question of whether central equivalence criteria required for a valid interpretation of the results are fulfilled. As an…
Descriptors: Reading Comprehension, Computer Assisted Testing, Achievement Tests, Foreign Countries
Peer reviewed
PDF on ERIC Download full text
Ben Seipel; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison; Patrick C. Kennedy – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment) identifies students who struggle with comprehension and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Peer reviewed
Direct link
Gu, Lin; Ling, Guangming; Liu, Ou Lydia; Yang, Zhitong; Li, Guirong; Kardanova, Elena; Loyalka, Prashant – Assessment & Evaluation in Higher Education, 2021
We examine the effects of computer-based versus paper-based assessment of critical thinking skills, adapted from English (in the U.S.) to Chinese. Using data collected based on a random assignment between the two modes in multiple Chinese colleges, we investigate mode effects from multiple perspectives: mean scores, measurement precision, item…
Descriptors: Critical Thinking, Tests, Test Format, Computer Assisted Testing
Peer reviewed
Direct link
Yeom, Soohye; Jun, Henry – Language Assessment Quarterly, 2020
This study investigated whether young Korean students learning English as a foreign language (EFL) engage in similar processes when responding to reading comprehension questions delivered in two different test-presentation modes: paper and computer. It examined the relationship between the two modes and test-takers' reading comprehension…
Descriptors: English (Second Language), Second Language Learning, Reading Strategies, Test Wiseness
Peer reviewed
Direct link
Toroujeni, Seyyed Morteza Hashemi – Education and Information Technologies, 2022
Score interchangeability of Computerized Fixed-Length Linear Testing (henceforth CFLT) and Paper-and-Pencil-Based Testing (henceforth PPBT) has become a controversial issue over the last decade, as technology has meaningfully restructured methods of educational assessment. Given this controversy, various testing guidelines published on…
Descriptors: Computer Assisted Testing, Reading Tests, Reading Comprehension, Scoring
Peer reviewed
Direct link
Zawoyski, Andrea; Ardoin, Scott P. – School Psychology Review, 2019
Reading comprehension assessments often include multiple-choice (MC) questions, but some researchers doubt their validity in measuring comprehension. Consequently, new assessments may include more short-answer (SA) questions. The current study contributes to the research comparing MC and SA questions by evaluating the effects of anticipated…
Descriptors: Eye Movements, Elementary School Students, Children, Test Format
Peer reviewed
Direct link
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa – Preventing School Failure, 2016
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
Descriptors: Computer Assisted Testing, Generalization, Reading Strategies, Reading Comprehension
Peer reviewed
PDF on ERIC Download full text
Chen, Jing; Sheehan, Kathleen M. – ETS Research Report Series, 2015
The "TOEFL"® family of assessments includes the "TOEFL"® Primary"™, "TOEFL Junior"®, and "TOEFL iBT"® tests. The linguistic complexity of stimulus passages in the reading sections of the TOEFL family of assessments is expected to differ across the test levels. This study evaluates the linguistic…
Descriptors: Language Tests, Second Language Learning, English (Second Language), Reading Comprehension
Peer reviewed
PDF on ERIC Download full text
Brown, Kevin – CEA Forum, 2015
In this article, the author describes his project to take every standardized exam that English majors take. During the summer and fall semesters of 2012, the author signed up for and took the GRE General Test, the Praxis Content Area Exam (English Language, Literature, and Composition: Content Knowledge), the Senior Major Field Tests in…
Descriptors: College Faculty, College English, Test Preparation, Standardized Tests
Peer reviewed
Kobrin, Jennifer L.; Young, John W. – Applied Measurement in Education, 2003
Studied the cognitive equivalence of computerized and paper-and-pencil reading comprehension tests using verbal protocol analysis. Results for 48 college students indicate that the only significant difference between the computerized and paper-and-pencil tests was in the frequency of identifying important information in the passage. (SLD)
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Difficulty Level
Peer reviewed
Heppner, Frank H.; And Others – Journal of Reading, 1985
Reports that reading performance on a standardized test is better when the text is displayed in print, rather than on a computer display screen. (HOD)
Descriptors: Comparative Analysis, Computer Assisted Testing, Higher Education, Reading Comprehension