Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 6 |
Since 2006 (last 20 years) | 8 |
Descriptor
Difficulty Level | 27 |
Performance Factors | 27 |
Test Items | 27 |
Higher Education | 9 |
Foreign Countries | 8 |
Scores | 7 |
Multiple Choice Tests | 6 |
Cognitive Processes | 5 |
College Entrance Examinations | 5 |
College Students | 5 |
Mathematics Tests | 5 |
Author
Allison, Donald E. | 1 |
Almeida, Daniela | 1 |
Attali, Yigal | 1 |
Baird, J. Hugh | 1 |
Balta, Ebru | 1 |
Bethell-Fox, Charles E. | 1 |
Braswell, James | 1 |
Bridgeman, Brent | 1 |
Calders, Toon | 1 |
Chall, Jeanne S. | 1 |
Cline, Frederick | 1 |
Publication Type
Reports - Research | 19 |
Journal Articles | 15 |
Speeches/Meeting Papers | 5 |
Reports - Evaluative | 3 |
Collected Works - Proceedings | 1 |
Information Analyses | 1 |
Education Level
Higher Education | 5 |
Postsecondary Education | 3 |
Elementary Education | 2 |
Grade 5 | 1 |
Grade 8 | 1 |
High Schools | 1 |
Intermediate Grades | 1 |
Middle Schools | 1 |
Secondary Education | 1 |
Assessments and Surveys
SAT (College Admission Test) | 4 |
Graduate Record Examinations | 1 |
State Trait Anxiety Inventory | 1 |
Trends in International… | 1 |
Wechsler Adult Intelligence… | 1 |
Guo, Hongwen; Ercikan, Kadriye – ETS Research Report Series, 2021
In this report, we demonstrate use of differential response time (DRT) methodology, an extension of differential item functioning methodology, for examining differences in how students from different backgrounds engage with assessment tasks. We analyze response time data from a digitally delivered mathematics assessment to examine timing…
Descriptors: Test Wiseness, English Language Learners, Reaction Time, Mathematics Tests
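The differential response time (DRT) idea described in the entry above can be sketched as an item-level comparison of response times between a focal and a reference group. The minimal sketch below is a hypothetical illustration: the column names, the choice of medians, and the pandas layout are assumptions for illustration, not the authors' actual procedure.

```python
import pandas as pd

def drt_summary(log: pd.DataFrame) -> pd.DataFrame:
    """Per-item difference in median response time between two groups.

    Expects columns: 'item_id', 'group' ('focal' or 'reference'),
    and 'response_time' in seconds. (Hypothetical schema.)
    """
    medians = (
        log.groupby(["item_id", "group"])["response_time"]
        .median()
        .unstack("group")
    )
    # Positive values mean the focal group spent longer on the item.
    medians["drt"] = medians["focal"] - medians["reference"]
    return medians.sort_values("drt", ascending=False)

# Toy usage: two items, two groups of two examinees each.
log = pd.DataFrame({
    "item_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "group": ["focal", "focal", "reference", "reference"] * 2,
    "response_time": [42.0, 38.5, 30.0, 28.0, 55.0, 60.0, 57.0, 54.0],
})
print(drt_summary(log))
```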
Moro, Sérgio; Martins, António; Ramos, Pedro; Esmerado, Joaquim; Costa, Joana Martinho; Almeida, Daniela – Computers in the Schools, 2020
Many university programs include Microsoft Excel courses given their value as a scientific and technical tool. However, evaluating what is effectively learned by students is a challenging task. Considering that multiple-choice written exams are a standard evaluation format, this study aimed to uncover the features influencing students' success in…
Descriptors: Multiple Choice Tests, Test Items, Spreadsheets, Computer Software
Shanmugam, S.Kanageswari Suppiah; Veloo, Arsaythamby; Md-Ali, Ruzlan – EURASIA Journal of Mathematics, Science and Technology Education, 2021
This comparative research study examined Grade Five Malaysian Aboriginal pupils' mathematical performance on 30 computation and 20 word problem items in the academic Malay language and community "Temiar" language. The items were constructed in the Malay language before being adapted and audio recorded into the "Temiar" language…
Descriptors: Mathematics Achievement, Mathematics Tests, Test Items, Difficulty Level
Balta, Ebru; Omur Sunbul, Secil – Eurasian Journal of Educational Research, 2017
Purpose: Position effects may influence examinees' test performances in several ways and trigger other psychometric issues, such as Differential Item Functioning (DIF). This study aims to supply test forms in which items in the test are ordered differently, depending on their difficulty level (from easy to difficult or difficult to easy), to…
Descriptors: Test Items, Sequential Approach, Difficulty Level, Mathematics Tests
Polat, Murat – Novitas-ROYAL (Research on Youth and Language), 2020
Classroom practices, materials and teaching methods in language classes have changed a lot in the last decades and continue to evolve; however, the commonly used techniques to test students' foreign language skills have not changed much despite the recent awareness of Bloom's taxonomy. Testing units at schools rely mostly on multiple choice…
Descriptors: Multiple Choice Tests, Test Format, Test Items, Difficulty Level
Greenlees, Jane; Logan, Tracy – North American Chapter of the International Group for the Psychology of Mathematics Education, 2014
This study investigated the performance and reasoning of 143 Australian students who completed mathematics tasks sourced from their national test. Specifically, this study examined changes in student performance and reasoning on items where the graphic component was modified. The results of the study revealed significant performance differences…
Descriptors: Test Items, Test Construction, Mathematics Achievement, Thinking Skills
Qian, Xiaoyu; Nandakumar, Ratna; Glutting, Joseph; Ford, Danielle; Fifield, Steve – ETS Research Report Series, 2017
In this study, we investigated gender and minority achievement gaps on 8th-grade science items employing a multilevel item response methodology. Both gaps were wider on physics and earth science items than on biology and chemistry items. Larger gender gaps were found on items with specific topics favoring male students than on other items, for…
Descriptors: Item Analysis, Gender Differences, Achievement Gap, Grade 8

Attali, Yigal; Goldschmidt, Chanan – Journal of Educational Measurement, 1996
Sources of observed differences in difficulty between items in graph comprehension performance were studied using 132 items from the Psychometric Entrance Test, an Israeli college entrance examination. Results indicate that item difficulty can be predicted successfully on the basis of item characteristics and task demands. Implications for graph…
Descriptors: College Entrance Examinations, Comprehension, Difficulty Level, Foreign Countries
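The claim in the entry above, that item difficulty can be predicted from item characteristics and task demands, can be illustrated with a small regression sketch. The features, data values, and the use of an ordinary least-squares model via scikit-learn below are made-up assumptions for illustration; they do not reproduce the study's actual feature set or analysis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical codings of three item characteristics for five graph items:
# number of data points displayed, whether two data series must be compared
# (0/1), and number of reading steps the task demands.
X = np.array([
    [5, 0, 1],
    [8, 1, 2],
    [12, 1, 3],
    [6, 0, 2],
    [10, 1, 1],
])
# Observed difficulty, taken here as the proportion answering incorrectly.
y = np.array([0.20, 0.45, 0.70, 0.35, 0.40])

model = LinearRegression().fit(X, y)
print("feature weights:", model.coef_)
print("predicted difficulty for a new item:", model.predict([[9, 1, 2]])[0])
```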

Green, Kathy – Educational and Psychological Measurement, 1984
Two factors, language difficulty and option set convergence, were experimentally manipulated and their effects on item difficulty assessed. Option convergence was found to have a significant effect on item difficulty while the effect of language difficulty was not significant. (Author/BW)
Descriptors: Difficulty Level, Error Patterns, Higher Education, Multiple Choice Tests

Royer, Fred L. – Intelligence, 1978
Various experiments demonstrated that the difficulty level of several performance-type intelligence test tasks is determined directly by stimulus and task variables that vary the information to be processed. The implications of these findings for intelligence and the problems of an experimental approach to the measurement of intelligence are…
Descriptors: Adults, Cognitive Processes, Difficulty Level, Intelligence Tests
Bridgeman, Brent; Cline, Frederick – Journal of Educational Measurement, 2004
Time limits on some computer-adaptive tests (CATs) are such that many examinees have difficulty finishing, and some examinees may be administered tests with more time-consuming items than others. Results from over 100,000 examinees suggested that about half of the examinees must guess on the final six questions of the analytical section of the…
Descriptors: Guessing (Tests), Timed Tests, Adaptive Testing, Computer Assisted Testing

Huck, Schuyler W. – Journal of Educational Measurement, 1978
Providing examinees with advance knowledge of the difficulty of an item led to an increase in test performance with no loss of reliability. This finding was consistent across several test formats. (Author/JKS)
Descriptors: Difficulty Level, Feedback, Higher Education, Item Analysis
Allison, Donald E. – Measurement and Evaluation in Guidance, 1984
Administered three item-difficulty sequence forms of an achievement test to 107 sixth-grade students. No relationship between item-difficulty sequence and test performance, reliability, or item difficulty and discrimination was discovered. (Author/JAC)
Descriptors: Achievement Tests, Difficulty Level, Elementary School Students, Intermediate Grades

Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number right and elimination scores were analyzed on a college level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests
Halpin, Glennelle; And Others – 1979
Forty-five graduate students in two educational psychology classes were randomly assigned to three groups for a classroom test on the course content. One group was asked to go to another room and then was privately excused from the test. The other two groups took tests with the same 32 items, but one group used a multiple-choice format and the…
Descriptors: Academic Achievement, Difficulty Level, Higher Education, Knowledge Level