Showing 1 to 15 of 19 results
Jeff Allen; Jay Thomas; Stacy Dreyer; Scott Johanningmeier; Dana Murano; Ty Cruce; Xin Li; Edgar Sanchez – ACT Education Corp., 2025
This report describes the process of developing and validating the enhanced ACT, covering the changes made to the test content and the processes by which these design decisions were implemented. The authors describe how they shared the overall scope of the enhancements, including the initial blueprints, with external expert panels,…
Descriptors: College Entrance Examinations, Testing, Change, Test Construction
Peer reviewed
Direct link
Pengelley, James; Whipp, Peter R.; Rovis-Hermann, Nina – Educational Psychology Review, 2023
The aim of the present study is to reconcile previous findings (a) that testing mode has no effect on test outcomes or cognitive load (Comput Hum Behav 77:1-10, 2017) and (b) that younger learners' working memory processes are more sensitive to computer-based test formats (J Psychoeduc Assess 37(3):382-394, 2019). We addressed key methodological…
Descriptors: Scores, Cognitive Processes, Difficulty Level, Secondary School Students
Peer reviewed
Direct link
Provasnik, Stephen – Large-scale Assessments in Education, 2021
This paper presents the concepts and observations in the author's keynote address at the May 2019 "Opportunity versus Challenge: Exploring Usage of Log-File and Process Data in International Large-Scale Assessments" conference in Dublin, Ireland. It briefly recaps some key points that emerged at the December 2018 ETS symposium on…
Descriptors: Data Collection, Cognitive Processes, Ethics, Student Evaluation
Peer reviewed
Direct link
Moon, Jung Aa; Sinharay, Sandip; Keehner, Madeleine; Katz, Irvin R. – International Journal of Testing, 2020
The current study examined the relationship between test-taker cognition and psychometric item properties in multiple-selection multiple-choice and grid items. In a study with content-equivalent mathematics items in alternative item formats, adult participants' tendency to respond to an item was affected by the presence of a grid and variations of…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Test Wiseness, Psychometrics
Peer reviewed
PDF on ERIC: Download full text
Hedlefs-Aguilar, Maria Isolde; Morales-Martinez, Guadalupe Elizabeth; Villarreal-Lozano, Ricardo Jesus; Moreno-Rodriguez, Claudia; Gonzalez-Rodriguez, Erick Alejandro – European Journal of Educational Research, 2021
This study explored the cognitive mechanism behind information integration in the test anxiety judgments of 140 engineering students. An experiment was designed to test four factors in combination (test goal orientation, test cognitive functioning level, test difficulty, and test mode). The experimental task required participants to read 36 scenarios,…
Descriptors: Test Anxiety, Engineering Education, Algebra, College Students
Peer reviewed
PDF on ERIC: Download full text
Morales-Martinez, Guadalupe Elizabeth; Lopez-Ramirez, Ernesto Octavio; Mezquita-Hoyos, Yanko Norberto; Lopez-Perez, Rafael; Resendiz, Ana Yolanda Lara – European Journal of Educational Research, 2019
A sample of 327 undergraduate engineering students from a public university in Mexico took part in an information integration study exploring the systematic thinking underlying the propensity for cheating during a course exam. All study participants were provided with written descriptions of 12 scenarios pertaining to the academic evaluation criteria and were…
Descriptors: Undergraduate Students, Engineering Education, Cheating, Computer Assisted Testing
Peer reviewed
Direct link
Carpenter, Rachel; Alloway, Tracy – Journal of Psychoeducational Assessment, 2019
School systems across the country are transitioning from paper-based testing (PBT) to computer-based testing (CBT). As this technological shift occurs, more research is necessary to understand the practical and performance implications of administering CBTs. Currently, there is a paucity of research using CBTs to examine working memory (WM)…
Descriptors: Computer Assisted Testing, Test Format, Short Term Memory, Cognitive Processes
Peer reviewed
Direct link
Smolinsky, Lawrence; Marx, Brian D.; Olafsson, Gestur; Ma, Yanxia A. – Journal of Educational Computing Research, 2020
Computer-based testing is an expanding use of technology that offers advantages to teachers and students. We studied Calculus II classes for science, technology, engineering, and mathematics majors using different testing modes. Three sections with a total of 324 students employed paper-and-pencil testing, computer-based testing, or both. Computer tests gave…
Descriptors: Test Format, Computer Assisted Testing, Paper (Material), Calculus
Peer reviewed
Direct link
Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia – European Educational Research Journal, 2017
Combinations of different item formats are found quite often in large-scale assessments, and dimensionality analyses often indicate that tests are multidimensional with respect to task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…
Descriptors: Foreign Countries, Computer Literacy, Information Literacy, International Assessment
Peer reviewed
Direct link
Zou, Xiao-Ling; Chen, Yan-Min – Technology, Pedagogy and Education, 2016
The effects of computer and paper test media on the writing scores and cognitive writing processes of EFL test-takers with different levels of computer familiarity were comprehensively explored from the learners' perspective as well as on the basis of related theory and practice. The results indicate significant differences in test scores among the…
Descriptors: English (Second Language), Second Language Learning, Second Language Instruction, Test Format
Peer reviewed
Direct link
Steinmetz, Jean-Paul; Brunner, Martin; Loarer, Even; Houssemand, Claude – Psychological Assessment, 2010
The Wisconsin Card Sorting Test (WCST) assesses executive and frontal lobe function and can be administered manually or by computer. Despite the widespread application of the two versions, the psychometric equivalence of their scores has rarely been evaluated, and only a limited set of criteria has been considered. The present experimental study (N =…
Descriptors: Computer Assisted Testing, Psychometrics, Test Theory, Scores
Peer reviewed
Kobrin, Jennifer L.; Young, John W. – Applied Measurement in Education, 2003
Studied the cognitive equivalence of computerized and paper-and-pencil reading comprehension tests using verbal protocol analysis. Results for 48 college students indicate that the only significant difference between the computerized and paper-and-pencil tests was in the frequency of identifying important information in the passage. (SLD)
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Difficulty Level
Peer reviewed
Cox, Kevin; Clark, David – Computers & Education, 1998
Describes how to construct questions to test all cognitive levels of learning for a course in introductory programming. The framework for learning is based on Bloom's taxonomy. Practical advice is given through examples and by describing a computer system to help deliver quizzes. Includes example questions. (Author/AEF)
Descriptors: Cognitive Processes, Computer Assisted Testing, Computer Science, Computer System Design
Kobrin, Jennifer L. – 2000
The comparability of computerized and paper-and-pencil tests was examined from a cognitive perspective, using verbal protocols, rather than psychometric methods, as the primary mode of inquiry. Reading comprehension items from the Graduate Record Examinations were completed by 48 college juniors and seniors, half of whom took the computerized test…
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Higher Education
Peer reviewed
Marshall, Thomas E.; And Others – Journal of Educational Technology Systems, 1996
Examines the strategies used in answering a computerized multiple-choice test where all questions on a semantic topic were grouped together or randomly distributed. Findings indicate that students grouped by test performance used different strategies in completing the test, reflecting distinct cognitive processes between the groups. (AEF)
Descriptors: Academic Achievement, Cognitive Processes, Computer Assisted Testing, Higher Education