Stegelmann, Werner – Psychometrika, 1983 (Peer reviewed)
The Rasch model is generalized to a multicomponent model, so that observations of component events are not needed to apply the model. It is shown that the generalized model maintains the property of specific objectivity of the Rasch model. An application to a mathematics test is provided. (Author/JKS)
Descriptors: Estimation (Mathematics), Item Analysis, Latent Trait Theory, Mathematical Models
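For reference, the cited paper builds on the standard dichotomous Rasch model, which (as general background, not taken from the abstract above) gives the probability that person v answers item i correctly as

P(X_{vi} = 1 \mid \theta_v, \beta_i) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)},

where \theta_v is the person's ability and \beta_i the item's difficulty; Stegelmann's multicomponent generalization extends this form while preserving specific objectivity.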
Vacc, Nicholas A.; Loesch, Larry C.; Lubik, Ruth E. – 2001
Multiple choice tests are widely viewed as the most effective and objective means of assessment. Item development is the central component of creating an effective test, yet test developers often lack a background in item development. This document describes the three cognitive levels of test items: recall, application, and analysis. It…
Descriptors: Educational Assessment, Evaluation, Item Analysis, Measures (Individuals)
Wang, Jianjun; Staver, John – 1999
Development of the test instrument in the Third International Mathematics and Science Study (TIMSS) was based on the expertise of many researchers, including "distinguished scholars from 10 countries" who served on the TIMSS Subject Matter Advisory Committee. However, a close examination of the TIMSS Science items suggests that not…
Descriptors: Achievement Rating, Elementary Secondary Education, Foreign Countries, Item Analysis
Liu, Jinghua; Schuppan, Fred; Walker, Michael E. – College Board, 2005
This study explored whether adding items with more advanced math content to the SAT Reasoning Test™ (SAT®) would affect test-taker performance. Two sets of SAT math equating sections were modified to form four subforms each. Different numbers of items with advanced content, taken from the SAT II: Mathematics Level IC Test (Math IC),…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Difficulty Level
Bennett, S. N. – British Journal of Educational Psychology, 1973 (Peer reviewed)
The JEPI was re-analysed by item, factor and cluster analysis. (Editor)
Descriptors: Cluster Analysis, Correlation, Educational Psychology, Educational Research
McQuitty, Louis L. – Educational and Psychological Measurement, 1973 (Peer reviewed)
The paper analyzes additive variance in such a fashion that it supports both a theory of types and a cognitive-frustration theory of behavior in the development of tests designed to assess "mental disturbance" versus "normality" amongst college students. (Author)
Descriptors: Emotional Disturbances, Item Analysis, Psychological Testing, Tables (Data)
Board, Cynthia; Whitney, Douglas R. – Journal of Educational Measurement, 1972 (Peer reviewed)
For the principles studied here, poor item-writing practices serve to obscure (or attenuate) differences between good and poor students. (Authors)
Descriptors: College Students, Item Analysis, Multiple Choice Tests, Test Construction
Pyrczak, Fred – Reading Research Quarterly, 1972 (Peer reviewed)
Descriptors: Item Analysis, Multiple Choice Tests, Reading Comprehension, Reading Research
Gerst, Marvin S.; Moos, Rudolf H. – Journal of Educational Psychology, 1972 (Peer reviewed)
The development, initial standardization, and substantive data of the University Residence Environment Scales (URES) are presented. (Authors)
Descriptors: Comparative Analysis, Dormitories, Environment, Evaluation Criteria
Oosterhof, Albert C.; Kocher, A. Thel – Educational and Psychological Measurement, 1972 (Peer reviewed)
A listing of this program, which includes illustrative input and output, can be obtained by writing either author, Bureau of Educational Research, The University of Kansas, Lawrence, Kansas. (Authors)
Descriptors: Computer Programs, Feedback, Input Output, Item Analysis
Bishop, A. J.; and others – Int J Educ Sci, 1969
Descriptors: Item Analysis, Multiple Choice Tests, Questioning Techniques, Test Construction
MacGregor, Ronald – Studies in Art Education, 1972 (Peer reviewed)
The author developed an instrument, called the Perceptual Index, designed to provide teachers of art with a valid and reliable measure of response to visual stimuli. (Author/MB)
Descriptors: Art Education, Elementary School Students, Item Analysis, Measurement Instruments
Michael, Joan J.; Michael, William B. – Educ Psychol Meas, 1969
Descriptors: Academic Achievement, College Mathematics, Item Analysis, Multiple Choice Tests
McKenzie, Gary R. – American Educational Research Journal, 1972 (Peer reviewed)
This study offers one bit of evidence that quizzes written to require reasoning are more effective in attaining "thinking" objectives than are recall quizzes. (Author)
Descriptors: Abstract Reasoning, Cognitive Processes, Comparative Testing, Grade 8
Huck, Schuyler W.; Bowers, Norman D. – Journal of Educational Measurement, 1972 (Peer reviewed)
The study investigated whether the proportion of examinees who answer an item correctly may be influenced by the difficulty of the immediately preceding item. (Authors/MB)
Descriptors: Achievement Tests, Difficulty Level, Hypothesis Testing, Item Analysis


