Shin, Jinnie; Gierl, Mark J. – International Journal of Testing, 2022
Over the last five years, tremendous strides have been made in advancing the AIG methodology required to produce items in diverse content areas. However, the one content area where enormous problems remain unsolved is language arts, generally, and reading comprehension, more specifically. While reading comprehension test items can be created using…
Descriptors: Reading Comprehension, Test Construction, Test Items, Natural Language Processing
Gu, Lin; Ling, Guangming; Liu, Ou Lydia; Yang, Zhitong; Li, Guirong; Kardanova, Elena; Loyalka, Prashant – Assessment & Evaluation in Higher Education, 2021
We examine the effects of computer-based versus paper-based assessment of critical thinking skills, adapted from English (in the U.S.) to Chinese. Using data collected based on a random assignment between the two modes in multiple Chinese colleges, we investigate mode effects from multiple perspectives: mean scores, measurement precision, item…
Descriptors: Critical Thinking, Tests, Test Format, Computer Assisted Testing
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
O'Reilly, Tenaha; Sabatini, John – ETS Research Report Series, 2013
This paper represents the third installment of the Reading for Understanding (RfU) assessment framework. This paper builds upon the two prior installments (Sabatini & O'Reilly, 2013; Sabatini, O'Reilly, & Deane, 2013) by discussing the role of performance moderators in the test design and how scenario-based assessment can be used as a tool…
Descriptors: Reading Comprehension, Reading Tests, Test Construction, Student Characteristics
Hassan, Nurul Huda; Shih, Chih-Min – Language Assessment Quarterly, 2013
This article describes and reviews the Singapore-Cambridge General Certificate of Education Advanced Level General Paper (GP) examination. As a written test that is administered to preuniversity students, the GP examination is internationally recognised and accepted by universities and employers as proof of English competence. In this article, the…
Descriptors: Foreign Countries, College Entrance Examinations, English (Second Language), Writing Tests
National Assessment Governing Board, 2009
As the ongoing national indicator of what American students know and can do, the National Assessment of Educational Progress (NAEP) in Reading regularly collects achievement information on representative samples of students in grades 4, 8, and 12. The information that NAEP provides about student achievement helps the public, educators, and…
Descriptors: National Competency Tests, Reading Tests, Test Items, Test Format
Qian, David D. – Language Assessment Quarterly, 2008
In the last 15 years or so, language testing practitioners have increasingly favored assessing vocabulary in context. The discrete-point vocabulary measure used in the old version of the Test of English as a Foreign Language (TOEFL) has long been criticized for encouraging test candidates to memorize wordlists out of context although test items…
Descriptors: Predictive Validity, Context Effect, Vocabulary, English (Second Language)

Hartley, James; Trueman, Mark – Journal of Research in Reading, 1986
Reports on two studies of the effect of different typographic settings on the speed and accuracy of responses to cloze procedure reading tests. Concludes that in-text responding and dashes produce significantly higher scores. (SRT)
Descriptors: Cloze Procedure, Layout (Publications), Reading Comprehension, Reading Research

Baldauf, Richard B., Jr. – Educational and Psychological Measurement, 1982
A Monte Carlo design examined how the effects of guessing and item dependence influence test characteristics and student scores. Although validity for cloze variants was high, multiple-choice cloze had significantly lower reliabilities than did true score equivalents. (Author/PN)
Descriptors: Cloze Procedure, Elementary Education, Guessing (Tests), Reading Comprehension
Krieken, Robert van – 1993
The research project reported in this paper had two aims: to demonstrate what standard multiple-choice questions and two new varieties (multiple-choice cloze and guided summary) test, and to assess whether the new formats test the same skills. To find out what the various question formats test, the study looked at a battery of tests, each of which…
Descriptors: Foreign Countries, Multiple Choice Tests, Reading Comprehension, Scores
Huntley, Renee M.; Miller, Sherri – 1994
Whether the shaping of test items can itself result in qualitative differences in examinees' comprehension of reading passages was studied using the Pearson-Johnson item classification system. The specific practice studied incorporated, within an item stem, line references that point the examinee to a specific location within a reading passage.…
Descriptors: Ability, Classification, Difficulty Level, High School Students
Virkkunen, Anu – 1990
A study investigated whether or not a discrete-item test of English affixes could be used to measure second language reading comprehension. Subjects were 1,254 mostly first-year university students in four academic departments studying to pass the first half of a foreign language requirement, an English reading comprehension test. Results from two…
Descriptors: Affixes, Comparative Analysis, Construct Validity, English (Second Language)
Way, Walter D.; And Others – 1992
This study provided an exploratory investigation of item features that might contribute to a lack of invariance of item parameters for the Test of English as a Foreign Language (TOEFL). Data came from seven forms of the TOEFL administered in 1989. Subjective and quantitative measures developed for the study provided consistent information related…
Descriptors: Ability, English (Second Language), Goodness of Fit, Item Response Theory
Roid, Gale; And Others – 1980
Using informal, objectives-based, or linguistic methods, three elementary school teachers and three experienced item writers developed criterion-referenced pretests-posttests to accompany a prose passage. Item difficulties were tabulated on the responses of 364 elementary students. The informal-subjective method, used by many achievement test…
Descriptors: Criterion Referenced Tests, Difficulty Level, Elementary Education, Elementary School Teachers
Rachor, Robert E.; Gray, George T. – 1996
Two frequently cited guidelines for writing multiple choice test item stems are: (1) the stem can be written in either a question or statement-to-be-completed format; and (2) only positively worded stems should be used. These guidelines were evaluated in a survey of the test item banks of 13 nationally administered examinations in the physician…
Descriptors: Allied Health Personnel, Difficulty Level, High Achievement, Item Banks