Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 10 |
Since 2006 (last 20 years) | 11 |
Author
Goldhammer, Frank | 4 |
Sälzer, Christine | 2 |
Zehner, Fabian | 2 |
Ainley, John, Ed. | 1 |
Asil, Mustafa | 1 |
Buerger, Sarah | 1 |
Duckworth, Daniel, Ed. | 1 |
Ercikan, Kadriye | 1 |
Fraillon, Julian, Ed. | 1 |
Friedman, Tim, Ed. | 1 |
Gerick, Julia | 1 |
Publication Type
Reports - Research | 9 |
Journal Articles | 8 |
Collected Works - General | 1 |
Guides - General | 1 |
Numerical/Quantitative Data | 1 |
Reports - Descriptive | 1 |
Reports - General | 1 |
Tests/Questionnaires | 1 |
Education Level
Secondary Education | 7 |
Grade 8 | 3 |
Junior High Schools | 3 |
Middle Schools | 3 |
Elementary Education | 2 |
Grade 9 | 1 |
High Schools | 1 |
Audience
Researchers | 1 |
Location
Germany | 11 |
South Korea | 6 |
France | 5 |
Chile | 4 |
Denmark | 4 |
Australia | 3 |
Italy | 3 |
Netherlands | 3 |
United States | 3 |
Canada | 2 |
China | 2 |
Assessments and Surveys
Program for International… | 5 |
Kroehne, Ulf; Buerger, Sarah; Hahnel, Carolin; Goldhammer, Frank – Educational Measurement: Issues and Practice, 2019
For many years, reading comprehension in the Programme for International Student Assessment (PISA) was measured via paper-based assessment (PBA). In the 2015 cycle, computer-based assessment (CBA) was introduced, raising the question of whether central equivalence criteria required for a valid interpretation of the results are fulfilled. As an…
Descriptors: Reading Comprehension, Computer Assisted Testing, Achievement Tests, Foreign Countries
Mikheeva, Ekaterina, Ed.; Meyer, Sebastian, Ed. – International Association for the Evaluation of Educational Achievement, 2020
IEA's International Computer and Information Literacy Study (ICILS) 2018 is designed to assess how well students are prepared for study, work, and life in a digital world. The study measures international differences in students' computer and information literacy (CIL): their ability to use computers to investigate, create, participate, and…
Descriptors: International Assessment, Computer Literacy, Information Literacy, Computer Assisted Testing
Ercikan, Kadriye; Asil, Mustafa; Grover, Raman – Education Policy Analysis Archives, 2018
Student learning is increasingly taking place in digital environments, both within and outside schooling contexts. Educational assessments are following suit, both to take advantage of the conveniences and opportunities that digital environments provide and to reflect the media of learning increasingly taking place in societies around the…
Descriptors: Access to Computers, Disadvantaged, Educational Assessment, Gender Differences
Zehner, Fabian; Goldhammer, Frank; Lubaway, Emily; Sälzer, Christine – Education Inquiry, 2019
In 2015, the "Programme for International Student Assessment" (PISA) introduced multiple changes in its study design, the most extensive being the transition from paper- to computer-based assessment. We investigated the differences between German students' text responses to eight reading items from the paper-based study in 2012 to text…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Fraillon, Julian, Ed.; Ainley, John, Ed.; Schulz, Wolfram, Ed.; Friedman, Tim, Ed.; Duckworth, Daniel, Ed. – International Association for the Evaluation of Educational Achievement, 2020
IEA's International Computer and Information Literacy Study (ICILS) 2018 investigated how well students are prepared for study, work, and life in a digital world. ICILS 2018 measured international differences in students' computer and information literacy (CIL): their ability to use computers to investigate, create, participate, and communicate at…
Descriptors: International Assessment, Computer Literacy, Information Literacy, Computer Assisted Testing
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Matthias – ETS Research Report Series, 2017
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error, because (a) humans often code inconsistently, and (b) coding reliability in…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Lee, Yi-Hsuan; Haberman, Shelby J. – International Journal of Testing, 2016
The use of computer-based assessments makes it possible to collect detailed data that capture examinees' progress through the tests and the time spent on individual actions. This article presents a study using process and timing data to aid understanding of an international language assessment and the examinees. Issues regarding test-taking strategies,…
Descriptors: Computer Assisted Testing, Test Wiseness, Language Tests, International Assessment
Jerrim, John; Micklewright, John; Heine, Jörg-Henrik; Sälzer, Christine; McKeown, Caroline – Oxford Review of Education, 2018
The Programme for International Student Assessment (PISA) is an important cross-national study of 15-year-olds' academic knowledge and skills. Educationalists and public policymakers eagerly await the triennial results, with particular interest in whether their country has moved up or slid down the international rankings, as compared to earlier…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank – Educational and Psychological Measurement, 2016
Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
Descriptors: Educational Assessment, Coding, Automation, Responses
Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia – European Educational Research Journal, 2017
The combination of different item formats is found quite often in large-scale assessments, and analyses of dimensionality often indicate that tests are multidimensional with respect to task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…
Descriptors: Foreign Countries, Computer Literacy, Information Literacy, International Assessment
OECD Publishing, 2013
The Programme for the International Assessment of Adult Competencies (PIAAC) has been planned as an ongoing program of assessment. The first cycle of the assessment has involved two "rounds." The first round, which is covered by this report, took place over the period of January 2008-October 2013. The main features of the first cycle of…
Descriptors: International Assessment, Adults, Skills, Test Construction