Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 4
  Since 2016 (last 10 years): 14
  Since 2006 (last 20 years): 14
Author
  Goldhammer, Frank: 14
  Zehner, Fabian: 5
  Kroehne, Ulf: 4
  Hahnel, Carolin: 3
  Lüdtke, Oliver: 3
  Sälzer, Christine: 3
  Brandhuber, Liene: 2
  Eichmann, Beate: 2
  Greiff, Samuel: 2
  Martens, Thomas: 2
  Naumann, Johannes: 2
Publication Type
  Journal Articles: 13
  Reports - Research: 12
  Reports - Evaluative: 2
  Numerical/Quantitative Data: 1
Education Level
  Secondary Education: 10
  Grade 9: 1
  High Schools: 1
  Junior High Schools: 1
  Middle Schools: 1
Location
  Germany: 9
  Canada: 3
  Australia: 2
  Austria: 2
  Belgium: 2
  Czech Republic: 2
  Denmark: 2
  Estonia: 2
  Finland: 2
  France: 2
  Ireland: 2
Assessments and Surveys
  Program for International Student Assessment: 10
Andersen, Nico; Zehner, Fabian; Goldhammer, Frank – Journal of Computer Assisted Learning, 2023
Background: In the context of large-scale educational assessments, the effort required to code open-ended text responses is considerably more expensive and time-consuming than the evaluation of multiple-choice responses because it requires trained personnel and long manual coding sessions. Aim: Our semi-supervised coding method eco (exploring…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
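A minimal sketch of the general idea behind semi-supervised coding of open-ended text responses, as described in the entry above: cluster similar responses so that a human codes only a few exemplars and the codes propagate within clusters. The data and the cluster-to-code mapping are invented, and this is an illustration of the concept with scikit-learn, not the eco implementation.
```python
# Illustrative sketch only (invented responses, not the eco method itself):
# cluster similar text responses, let a human code one exemplar per cluster,
# and propagate that code to the remaining cluster members.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

responses = [
    "the gas expands when it is heated",
    "heat makes the gas expand",
    "because the balloon is red",
    "the balloon changes its colour",
]

vectors = TfidfVectorizer().fit_transform(responses)              # bag-of-words features
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# A human coder inspects one exemplar per cluster and assigns a code
# (the codes below are hypothetical and would come from that manual inspection).
human_code = {0: "correct", 1: "incorrect"}
coded = [(text, human_code[c]) for text, c in zip(responses, labels)]
print(coded)
```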
Harrison, Scott; Kroehne, Ulf; Goldhammer, Frank; Lüdtke, Oliver; Robitzsch, Alexander – Large-scale Assessments in Education, 2023
Background: Mode effects, the variations in item and scale properties attributed to the mode of test administration (paper vs. computer), have stimulated research around test equivalence and trend estimation in PISA. The PISA assessment framework provides the backbone to the interpretation of the results of the PISA test scores. However, an…
Descriptors: Scoring, Test Items, Difficulty Level, Foreign Countries
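As a hedged illustration of the kind of item-level comparison that mode-effect questions rest on (not the model-based analysis used in the entry above), the sketch below compares proportions correct between paper-based and computer-based administrations on invented response data.
```python
# Descriptive mode-effect check on invented data: proportion correct per item
# in the paper-based vs. computer-based administration.
paper    = {"R01": [1, 1, 0, 1, 1], "R02": [0, 1, 0, 0, 1]}
computer = {"R01": [1, 0, 0, 1, 0], "R02": [0, 0, 0, 1, 0]}

def p_correct(scores):
    """Proportion of correct (1) responses."""
    return sum(scores) / len(scores)

for item in paper:
    diff = p_correct(paper[item]) - p_correct(computer[item])
    print(f"{item}: paper minus computer = {diff:+.2f}")
```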
Goldhammer, Frank; Hahnel, Carolin; Kroehne, Ulf; Zehner, Fabian – Large-scale Assessments in Education, 2021
International large-scale assessments such as PISA or PIAAC have started to provide public or scientific use files for log data; that is, events, event-related attributes and timestamps of test-takers' interactions with the assessment system. Log data and the process indicators derived from it can be used for many purposes. However, the intended…
Descriptors: International Assessment, Data, Computer Assisted Testing, Validity
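To make the notion of process indicators derived from log data concrete, here is a minimal sketch that computes time on task from timestamped events. The event names and record layout are assumptions for illustration, not the actual PISA or PIAAC log-data format.
```python
# Sketch: derive a simple process indicator (time on task, in seconds) from
# timestamped log events. Field names and event types are illustrative only.
from datetime import datetime

events = [
    {"item": "R1", "event": "item_start", "timestamp": "2015-04-01T10:00:00"},
    {"item": "R1", "event": "click",      "timestamp": "2015-04-01T10:00:12"},
    {"item": "R1", "event": "item_end",   "timestamp": "2015-04-01T10:01:30"},
]

def time_on_task(events, item):
    """Seconds between the item_start and item_end events of one item."""
    times = {e["event"]: datetime.fromisoformat(e["timestamp"])
             for e in events if e["item"] == item}
    return (times["item_end"] - times["item_start"]).total_seconds()

print(time_on_task(events, "R1"))  # 90.0
```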
Kunina-Habenicht, Olga; Goldhammer, Frank – Large-scale Assessments in Education, 2020
As a relevant cognitive-motivational aspect of ICT literacy, a new construct "ICT Engagement" is theoretically based on self-determination theory and involves the factors ICT interest, Perceived ICT competence, Perceived autonomy related to ICT use, and ICT as a topic in social interaction. In this manuscript, we present different…
Descriptors: International Assessment, Achievement Tests, Foreign Countries, Secondary School Students
Eichmann, Beate; Goldhammer, Frank; Greiff, Samuel; Brandhuber, Liene; Naumann, Johannes – Journal of Educational Psychology, 2020
In large-scale assessments, performance differences across different groups are regularly found. These group differences (e.g., gender differences) are often relevant for educational policy decisions and measures. However, the formation of these group differences usually remains unclear. We propose an approach for investigating this formation by…
Descriptors: Problem Solving, Data Use, Differences, Achievement Tests
Eichmann, Beate; Greiff, Samuel; Naumann, Johannes; Brandhuber, Liene; Goldhammer, Frank – Journal of Computer Assisted Learning, 2020
In this explorative study, we investigate how sequences of behaviour are related to success or failure in complex problem-solving (CPS). To this end, we analysed log data from two different tasks of the problem-solving assessment of the Programme for International Student Assessment 2012 study (n = 30,098 students). We first coded every…
Descriptors: Behavior Patterns, Difficulty Level, Problem Solving, Success
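A minimal sketch of one way to relate coded behaviour sequences to success or failure: count action bigrams separately for solved and unsolved attempts. The action codes and sequences are invented, and this is not the analysis pipeline of the study above.
```python
# Sketch: compare which action bigrams occur in solved vs. unsolved attempts.
# Action codes and sequences are invented for illustration.
from collections import Counter

sequences = [
    (["explore", "explore", "apply", "check"], True),   # (coded actions, solved?)
    (["apply", "apply", "apply"], False),
    (["explore", "apply", "check"], True),
]

def bigrams(actions):
    return list(zip(actions, actions[1:]))

solved = Counter(b for acts, ok in sequences if ok for b in bigrams(acts))
unsolved = Counter(b for acts, ok in sequences if not ok for b in bigrams(acts))
print("solved:", solved.most_common(3))
print("unsolved:", unsolved.most_common(3))
```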
Kroehne, Ulf; Buerger, Sarah; Hahnel, Carolin; Goldhammer, Frank – Educational Measurement: Issues and Practice, 2019
For many years, reading comprehension in the Programme for International Student Assessment (PISA) was measured via paper-based assessment (PBA). In the 2015 cycle, computer-based assessment (CBA) was introduced, raising the question of whether central equivalence criteria required for a valid interpretation of the results are fulfilled. As an…
Descriptors: Reading Comprehension, Computer Assisted Testing, Achievement Tests, Foreign Countries
Goldhammer, Frank; Kroehne, Ulf; Hahnel, Carolin; De Boeck, Paul – Journal of Educational Psychology, 2021
Efficiency in reading component skills is crucial for reading comprehension, as efficient subprocesses do not extensively consume limited cognitive resources, making them available for comprehension processes. Cognitive efficiency is typically measured with speeded tests of relatively easy items. Observed responses and response times indicate the…
Descriptors: Reading Rate, Reading Comprehension, Cognitive Ability, Reading Tests
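One simple way to summarize efficiency from responses and response times is a rate-correct score (correct responses per unit of working time). The sketch below is a hedged illustration with invented data, not the response-time modelling approach of the entry above.
```python
# Sketch: rate-correct score = number of correct responses per minute of
# working time, computed per person from invented response data.
data = {
    "p1": [(1, 4.2), (1, 3.8), (0, 6.0)],   # (correct, response time in seconds)
    "p2": [(1, 9.5), (0, 8.1), (1, 7.7)],
}

for person, responses in data.items():
    n_correct = sum(correct for correct, _ in responses)
    minutes = sum(rt for _, rt in responses) / 60
    print(f"{person}: {n_correct / minutes:.1f} correct per minute")
```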
Zehner, Fabian; Goldhammer, Frank; Sälzer, Christine – Large-scale Assessments in Education, 2018
Background: The gender gap in reading literacy is repeatedly found in large-scale assessments. This study compared girls' and boys' text responses in a reading test applying natural language processing. For this, a theoretical framework was compiled that allows mapping of response features to the preceding cognitive components such as micro- and…
Descriptors: Reading Comprehension, Gender Differences, Reader Response, Reader Text Relationship
Zehner, Fabian; Goldhammer, Frank; Lubaway, Emily; Sälzer, Christine – Education Inquiry, 2019
In 2015, the "Programme for International Student Assessment" (PISA) introduced multiple changes in its study design, the most extensive being the transition from paper- to computer-based assessment. We investigated the differences between German students' text responses to eight reading items from the paper-based study in 2012 and text…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Goldhammer, Frank; Martens, Thomas; Christoph, Gabriela; Lüdtke, Oliver – OECD Publishing, 2016
In this study, we investigated how empirical indicators of test-taking engagement can be defined, empirically validated, and used to describe group differences in the context of the Programme for the International Assessment of Adult Competencies (PIAAC). The approach was to distinguish between disengaged and engaged response behavior by means of…
Descriptors: International Assessment, Adults, Response Style (Tests), Reaction Time
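As a hedged sketch of a response-time-based disengagement indicator (responses faster than a threshold flagged as rapid guesses), with the threshold and data invented for illustration rather than taken from the study above:
```python
# Sketch: flag responses below a (here fixed, purely illustrative) response
# time threshold as disengaged rapid guessing.
THRESHOLD_SECONDS = 5.0

responses = [
    {"person": "p1", "item": "N1", "rt": 2.3},
    {"person": "p1", "item": "N2", "rt": 41.0},
    {"person": "p2", "item": "N1", "rt": 3.9},
]

for r in responses:
    r["disengaged"] = r["rt"] < THRESHOLD_SECONDS

share = sum(r["disengaged"] for r in responses) / len(responses)
print(f"share of flagged responses: {share:.2f}")
```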
Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank – Educational and Psychological Measurement, 2016
Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
Descriptors: Educational Assessment, Coding, Automation, Responses
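A minimal sketch of a baseline automatic coding pipeline built from openly licensed components: bag-of-words features plus a linear classifier trained on human-coded responses. The tiny training set is invented, scikit-learn is assumed, and this is not the system described in the entry above.
```python
# Sketch: train a baseline text-coding model on human-coded responses and
# code a new response automatically. Data are invented for illustration.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["gas expands when heated", "heat makes air expand",
         "the balloon is pretty", "i do not know"]
codes = [1, 1, 0, 0]  # human-assigned codes (1 = correct, 0 = incorrect)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, codes)
print(model.predict(["warm air expands"]))   # predicted code for a new response
```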
Goldhammer, Frank; Martens, Thomas; Lüdtke, Oliver – Large-scale Assessments in Education, 2017
Background: A potential problem of low-stakes large-scale assessments such as the Programme for the International Assessment of Adult Competencies (PIAAC) is low test-taking engagement. The present study pursued two goals in order to better understand conditioning factors of test-taking disengagement: First, a model-based approach was used to…
Descriptors: Student Evaluation, International Assessment, Adults, Competence
Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia – European Educational Research Journal, 2017
The combination of different item formats is found quite often in large-scale assessments, and analyses of dimensionality often indicate multi-dimensionality of tests regarding the task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…
Descriptors: Foreign Countries, Computer Literacy, Information Literacy, International Assessment