Publication Date

| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 8 |
| Since 2022 (last 5 years) | 36 |
| Since 2017 (last 10 years) | 115 |
| Since 2007 (last 20 years) | 378 |
Descriptor

| Descriptor | Count |
| --- | --- |
| Test Theory | 1166 |
| Test Items | 262 |
| Test Reliability | 252 |
| Test Construction | 246 |
| Test Validity | 245 |
| Psychometrics | 183 |
| Scores | 176 |
| Item Response Theory | 168 |
| Foreign Countries | 160 |
| Item Analysis | 141 |
| Statistical Analysis | 134 |
Location

| Location | Count |
| --- | --- |
| United States | 17 |
| United Kingdom (England) | 15 |
| Canada | 14 |
| Australia | 13 |
| Turkey | 12 |
| Sweden | 8 |
| United Kingdom | 8 |
| Netherlands | 7 |
| Texas | 7 |
| New York | 6 |
| Taiwan | 6 |
Laws, Policies, & Programs

| Law, Policy, or Program | Count |
| --- | --- |
| No Child Left Behind Act 2001 | 4 |
| Elementary and Secondary… | 3 |
| Individuals with Disabilities… | 3 |
Yarnold, Paul R.; And Others – 1985
This paper reports on a short version of the student Jenkins Activity Survey (JAS), a multiple-choice questionnaire that measures Type A "coronary-prone" behavior and classifies subjects as Type A or Type B. The primary objective was to determine whether the short and long forms of the student JAS represent similar measurement instruments. A secondary…
Descriptors: Behavior Rating Scales, College Students, Comparative Testing, Factor Analysis
Norris, Stephen P. – 1988
A study examined whether the process of gathering verbal reports of subjects' thinking while taking multiple-choice critical thinking tests could be used to infer the reasoning processes used and to identify test items that do not require critical thinking skills. Four factors can render an inference of a subject's critical thinking skills…
Descriptors: Cognitive Processes, Critical Thinking, High School Students, High Schools
Brutten, Sheila R.; And Others – 1988
A study attempted to estimate the instructional sensitivity of items in three reading comprehension tests in English as a second language (ESL). Instructional sensitivity is a test-item construct defined as the tendency for a test item to vary in difficulty as a function of instruction. Similar tasks were given to readers at different proficiency…
Descriptors: College Students, Comparative Analysis, Difficulty Level, English (Second Language)
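The abstract defines instructional sensitivity as the tendency for an item's difficulty to vary with instruction. A minimal sketch of one conventional way to quantify that construct for a single item is the pretest-posttest difference index; the function names and data below are hypothetical, and this is not the procedure used in the study, which compared readers at different proficiency levels rather than pre- and post-instruction groups.

```python
# Minimal sketch of one common way to quantify instructional sensitivity:
# the pretest-posttest difference index (PPDI) for a single item.
# Illustrative only; not the procedure used in the study.

def item_difficulty(responses):
    """Proportion of examinees answering the item correctly (1 = correct, 0 = incorrect)."""
    return sum(responses) / len(responses)

def ppdi(pre_responses, post_responses):
    """Post-instruction difficulty minus pre-instruction difficulty.
    Values near 0 suggest the item is insensitive to instruction; larger positive
    values suggest the item becomes easier once the material has been taught."""
    return item_difficulty(post_responses) - item_difficulty(pre_responses)

# Hypothetical data: 10 examinees before and after instruction on the tested content.
pre  = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]   # 30% correct before instruction
post = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 80% correct after instruction
print(f"PPDI = {ppdi(pre, post):.2f}")  # 0.50 -> item appears instructionally sensitive
```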
Sullivan, Francis J. – 1987
To examine "bluffing"--ways in which conflicts in classrooms and evaluation procedures influence the styles of student writing and teachers' responses to different styles, a study analyzed the placement-test essays of 99 undergraduates entering Temple University (Pennsylvania) in the fall of 1982. Analysis of the texts was based on a…
Descriptors: Constructed Response, Essay Tests, Higher Education, Response Style (Tests)
Lancaster, Diana M.; And Others – 1987
Item difficulty and discrimination were compared between multiple-choice and short-answer items on the midterm and final examinations for the internal medicine course at Louisiana State University School of Dentistry. The examinations were administered to 67 sophomore dental students in that course. Additionally, the impact of the source of the…
Descriptors: Dental Schools, Dentistry, Difficulty Level, Discriminant Analysis
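For readers unfamiliar with the two indices being compared across item formats, the sketch below illustrates the classical definitions: difficulty as the proportion of examinees answering correctly, and discrimination as the point-biserial correlation between the item and the rest-of-test score. The data and function names are hypothetical, and the study's own analysis may have differed in detail.

```python
# Minimal sketch of classical item-analysis indices: difficulty (proportion correct)
# and discrimination (point-biserial correlation with the rest-of-test score).
from statistics import mean, pstdev

def difficulty(item_scores):
    """Proportion of examinees answering the item correctly."""
    return mean(item_scores)

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between a dichotomous item (0/1) and a criterion score."""
    n = len(item_scores)
    correct   = [t for x, t in zip(item_scores, total_scores) if x == 1]
    incorrect = [t for x, t in zip(item_scores, total_scores) if x == 0]
    sd_total = pstdev(total_scores)
    if sd_total == 0 or not correct or not incorrect:
        return 0.0
    p = len(correct) / n
    q = 1 - p
    return (mean(correct) - mean(incorrect)) / sd_total * (p * q) ** 0.5

# Hypothetical data: 8 examinees' scores on one item and on the rest of the test.
item = [1, 1, 0, 1, 0, 1, 0, 1]
rest = [18, 17, 9, 15, 11, 16, 8, 14]
print(f"difficulty = {difficulty(item):.2f}, discrimination = {point_biserial(item, rest):.2f}")
```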
Peer reviewed: Bennett, Randy Elliot; And Others – Journal of Educational Measurement, 1987
To identify broad classes of items on the Scholastic Aptitude Test that behave differentially for handicapped examinees taking special, extended-time administrations, the performance of nine handicapped groups and one nonhandicapped group on each of two forms of the SAT was investigated through a two-stage procedure. (Author/LMO)
Descriptors: College Entrance Examinations, Disabilities, Hearing Impairments, High Schools
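The abstract does not describe the study's two-stage procedure itself, so the sketch below only illustrates, under assumed data, what "behaving differentially" typically means operationally: a Mantel-Haenszel common odds ratio far from 1.0 for reference and focal groups matched on total score. The function and data are hypothetical and are not taken from the study.

```python
# Generic illustration of screening one item for differential item functioning (DIF)
# with the Mantel-Haenszel common odds ratio, stratifying on total score.
from collections import defaultdict

def mantel_haenszel_odds_ratio(records):
    """records: iterable of (group, stratum, correct), where group is 'reference' or
    'focal', stratum is a matching score level, and correct is 0/1. Returns the MH
    common odds ratio; values far from 1.0 flag the item for review."""
    cells = defaultdict(lambda: [[0, 0], [0, 0]])  # stratum -> [group][correct] counts
    for group, stratum, correct in records:
        g = 0 if group == "reference" else 1
        cells[stratum][g][correct] += 1
    num = den = 0.0
    for table in cells.values():
        (ref_wrong, ref_right), (foc_wrong, foc_right) = table
        n = ref_wrong + ref_right + foc_wrong + foc_right
        if n == 0:
            continue
        num += ref_right * foc_wrong / n
        den += ref_wrong * foc_right / n
    return num / den if den else float("inf")

# Hypothetical responses: (group, total-score stratum, item answered correctly?)
data = ([("reference", "high", 1)] * 40 + [("reference", "high", 0)] * 10
        + [("focal", "high", 1)] * 25 + [("focal", "high", 0)] * 25
        + [("reference", "low", 1)] * 20 + [("reference", "low", 0)] * 30
        + [("focal", "low", 1)] * 10 + [("focal", "low", 0)] * 40)
print(f"MH odds ratio = {mantel_haenszel_odds_ratio(data):.2f}")  # well above 1 -> flag for review
```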
Peer reviewed: Altepeter, Tom – School Psychology Review, 1983
A critical review of the Expressive One-Word Picture Vocabulary Test (Gardner) is offered. The reviewer concludes that the instrument cannot be recommended in its present form. Further research concerning the manual and theoretical issues (particularly test-retest stability) is strongly recommended. (Author/PN)
Descriptors: Error of Measurement, Intelligence Tests, Item Analysis, Pictorial Stimuli
Peer reviewed: Duchastel, Phillippe C.; Nungester, Ronald J. – Journal of Educational Research, 1982
For this study, high school students studied a brief history text and then: (1) took a short-answer test; (2) took a multiple-choice test; or (3) completed a questionnaire about study habits. Results concerning the effects of these procedures on retention of the material are discussed. (Authors/CJ)
Descriptors: Constructed Response, Essay Tests, High School Students, High Schools
Popham, W. James – American School Board Journal, 2003
Claims that standards-based tests neither measure skills and knowledge accurately nor help educators do a better instructional job. The article offers suggestions in four areas to make the tests contribute to improved instruction: measurement of content standards; descriptions of standards; standard-by-standard reporting; and locally administered…
Descriptors: Academic Achievement, Academic Standards, Cognitive Tests, Educational Testing
Peer reviewed: Marlaire, Courtney L.; Maynard, Douglas W. – Sociology of Education, 1990
Describes and analyzes the social organization of testing as an interactional phenomenon. Provides examples of exchanges between clinician and child in a diagnostic testing situation. States these coorientational activities demonstrate that testing has an interactional basis. Explores the implications for research on bias in mental testing. (NL)
Descriptors: Cognitive Measurement, Diagnostic Tests, Educational Research, Elementary Secondary Education
Peer reviewed: Hills, John R.; And Others – Journal of Educational Measurement, 1988
Five methods of equating minimum-competency tests were compared using the Florida Statewide Student Assessment Test, Part II, for 1984 and 1986. Four of the five methods yielded essentially comparable results for the highest-scoring 84% of the students. Anchor-item sets of different lengths were compared, using the concurrent item response theory equating…
Descriptors: Comparative Analysis, Equated Scores, Evaluation Methods, Graduation Requirements
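As a concrete point of reference for what anchor-based equating does, the sketch below shows chained linear equating through a common anchor test, one of the simplest members of the family of methods the study compares; it is not one of the study's five methods or its concurrent IRT equating, and all scores below are hypothetical.

```python
# Minimal sketch of chained linear equating through a common anchor test:
# link new-form scores to the anchor scale in the new group, then link the anchor
# scale to the old form in the old group. Hypothetical data throughout.
from statistics import mean, pstdev

def linear_link(score, from_mean, from_sd, to_mean, to_sd):
    """Map a score from one scale to another by matching means and standard deviations."""
    return to_sd / from_sd * (score - from_mean) + to_mean

def chained_linear_equate(y, new_form, new_anchor, old_anchor, old_form):
    """Place a new-form score y on the old-form scale via the common anchor."""
    v = linear_link(y, mean(new_form), pstdev(new_form), mean(new_anchor), pstdev(new_anchor))
    return linear_link(v, mean(old_anchor), pstdev(old_anchor), mean(old_form), pstdev(old_form))

# Hypothetical score distributions: each list is one group's scores on one test.
new_form   = [52, 60, 47, 66, 58, 71, 49, 63]   # new group's scores on the new form
new_anchor = [18, 22, 15, 25, 20, 27, 16, 23]   # same group's scores on the anchor items
old_anchor = [17, 21, 14, 24, 19, 26, 15, 22]   # old group's scores on the same anchor items
old_form   = [55, 63, 50, 69, 61, 74, 52, 66]   # old group's scores on the old form
equated = chained_linear_equate(60, new_form, new_anchor, old_anchor, old_form)
print(f"A new-form score of 60 equates to about {equated:.1f} on the old form")
```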
Peer reviewed: Sinclair, Robert C.; Soldat, Alexander S.; Mark, Melvin M. – Teaching of Psychology, 1998
Argues that external cues provide affective information that influences processing strategy and, therefore, examination performance. Notes differences in performance on two midterm examinations that were identical except that they were printed on blue or red paper. Discusses a method for appropriately adjusting scores to control for form effects…
Descriptors: Cognitive Psychology, Color, Cues, Dimensional Preference
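The abstract does not spell out the adjustment method, so the sketch below is only a plausible, assumption-laden stand-in rather than the authors' procedure: re-centering each form's scores on the pooled mean so that the form (here, paper color) no longer shifts the average. The function name and data are hypothetical.

```python
# Hedged illustration of one simple way to remove a form effect: shift each form's
# scores so that its mean matches the pooled mean. Not the authors' actual method.
from statistics import mean

def adjust_for_form(scores_by_form):
    """scores_by_form: dict mapping form label -> list of raw scores.
    Returns adjusted scores with each form's mean moved to the pooled mean."""
    pooled = mean(s for scores in scores_by_form.values() for s in scores)
    return {
        form: [s - mean(scores) + pooled for s in scores]
        for form, scores in scores_by_form.items()
    }

# Hypothetical exam scores: the blue-paper form happened to run a few points higher.
raw = {"blue": [82, 78, 90, 85, 74], "red": [76, 71, 84, 79, 68]}
adjusted = adjust_for_form(raw)
print({form: round(mean(scores), 1) for form, scores in adjusted.items()})  # both means now equal
```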
Peer reviewed: Brennan, Robert L. – Educational Measurement: Issues and Practice, 1998
Explores the relationship between measurement theory and practice, considering five broad categories: (1) models, assumptions, and terminology; (2) reliability; (3) validity; (4) scaling; and (5) setting performance standards. It must be recognized that measurement is not an end in itself. (SLD)
Descriptors: Educational Assessment, Educational Practices, Measurement Techniques, Models
Aberg-Bengtsson, Lisbeth; Erickson, Gudrun – Educational Research and Evaluation, 2006
The research project presented in this article was set in the Swedish school context and carried out on a set of compulsory national tests for English, Swedish, and mathematics used at the end of compulsory school. The aims were: (a) to gain a deeper knowledge of the internal structure of the tests and (b) to separate individual performance from…
Descriptors: Individual Testing, Factor Analysis, Structural Equation Models, Foreign Countries
Frary, Robert B. – 1995
This digest presents a list of recommendations for writing multiple-choice test items, based on psychometrics and logical deduction. Questions should test more than mere knowledge of facts and should not contain superfluous information as an introduction to the question. Each question should focus on some specific aspect of the course, and the item…
Descriptors: Culture Fair Tests, Distractors (Tests), Educational Assessment, Item Bias
