Showing all 6 results
OECD Publishing, 2019
Log files from computer-based assessments can help researchers better understand respondents' behaviours and cognitive strategies. Analysis of timing information from the Programme for the International Assessment of Adult Competencies (PIAAC) reveals large differences in the time participants take to answer assessment items, as well as large country differences…
Descriptors: Adults, Computer Assisted Testing, Test Items, Reaction Time
Peer reviewed
Download full text (PDF on ERIC)
Storme, Martin; Myszkowski, Nils; Baron, Simon; Bernard, David – Journal of Intelligence, 2019
Assessing job applicants' general mental ability online poses psychometric challenges due to the necessity of having brief but accurate tests. Recent research (Myszkowski & Storme, 2018) suggests that recovering distractor information through Nested Logit Models (NLM; Suh & Bolt, 2010) increases the reliability of ability estimates in…
Descriptors: Intelligence Tests, Item Response Theory, Comparative Analysis, Test Reliability
Fraillon, Julian, Ed.; Ainley, John, Ed.; Schulz, Wolfram, Ed.; Friedman, Tim, Ed.; Duckworth, Daniel, Ed. – International Association for the Evaluation of Educational Achievement, 2020
IEA's International Computer and Information Literacy Study (ICILS) 2018 investigated how well students are prepared for study, work, and life in a digital world. ICILS 2018 measured international differences in students' computer and information literacy (CIL): their ability to use computers to investigate, create, participate, and communicate at…
Descriptors: International Assessment, Computer Literacy, Information Literacy, Computer Assisted Testing
Peer reviewed
Download full text (PDF on ERIC)
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Mattias – ETS Research Report Series, 2017
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Peer reviewed
Direct link
Lee, Yi-Hsuan; Haberman, Shelby J. – International Journal of Testing, 2016
The use of computer-based assessments makes it possible to collect detailed data that capture examinees' progress through the tests and the time spent on individual actions. This article presents a study using process and timing data to aid understanding of an international language assessment and its examinees. Issues regarding test-taking strategies,…
Descriptors: Computer Assisted Testing, Test Wiseness, Language Tests, International Assessment
OECD Publishing, 2013
The Programme for the International Assessment of Adult Competencies (PIAAC) has been planned as an ongoing program of assessment. The first cycle of the assessment has involved two "rounds." The first round, which is covered by this report, took place from January 2008 to October 2013. The main features of the first cycle of…
Descriptors: International Assessment, Adults, Skills, Test Construction