Showing all 9 results
Peer reviewed
Download full text (PDF on ERIC)
Attali, Yigal – ETS Research Report Series, 2014
Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the "GRE"® revised General Test, it…
Descriptors: College Entrance Examinations, Graduate Study, Calculators, Test Items
Emmerich, Walter; And Others – 1991
The aim of this research was to identify, develop, and evaluate empirically new reasoning item types that might be used to broaden the analytical measure of the Graduate Record Examinations (GRE) General Test and to strengthen its construct validity. Six item types were selected for empirical evaluation, including the two currently used in the GRE…
Descriptors: Construct Validity, Correlation, Evaluation Methods, Sex Differences
Peer reviewed
Direct link
Coutinho, Savia A. – Educational Research and Reviews, 2006
This study examined the relationship between need for cognition, defined as the tendency to engage in effortful cognitive activity, and metacognition, defined as one's thinking about one's own thinking, and how these variables relate to intellectual task performance. Participants completed measures of need for cognition, metacognition, and problem-solving…
Descriptors: Metacognition, Correlation, Academic Achievement, Educational Psychology
Freedle, Roy; Kostin, Irene – 1992
This study examines the predictability of Graduate Record Examinations (GRE) reading item difficulty (equated delta) for the three major reading item types: main idea, inference, and explicit statement items. Each item type is analyzed separately, using 110 GRE reading passages and their associated 244 reading items; selective analyses of 285…
Descriptors: College Entrance Examinations, Correlation, Difficulty Level, Higher Education
Peer reviewed
Stricker, Lawrence J. – Educational and Psychological Measurement, 1984
The stability of three indexes, a partial correlation index, comparisons of item characteristic curves, and comparisons of item difficulties, was evaluated in assessing race and sex differences in performance on verbal items of the Graduate Record Examination Aptitude Test. All three indexes exhibited consistency in identifying the same items in different…
Descriptors: College Entrance Examinations, Comparative Analysis, Correlation, Difficulty Level
Wilson, Kenneth M. – 1989
Possible population differences in speed versus level of Graduate Record Examinations (GRE) reading comprehension scores were explored. The study used operational measures computed post hoc from item-level data in GRE files for a pre-October 1977 version of the verbal test in which 40 GRE reading comprehension (RC) items were included as a…
Descriptors: Correlation, English, Ethnicity, Graduate Study
Bennett, Randy Elliot; And Others – 1991
This study investigated the convergent validity of expert-system scores for four mathematical constructed-response item formats. A five-factor model was proposed comprised of four constructed-response format factors and a Graduate Record Examinations (GRE) General Test quantitative factor. Subjects were drawn from examinees taking a single form of…
Descriptors: College Students, Constructed Response, Correlation, Expert Systems
Peer reviewed
Direct link
Gorin, Joanna S.; Embretson, Susan E. – Applied Psychological Measurement, 2006
Recent assessment research joining cognitive psychology and psychometric theory has introduced a new technology, item generation. In algorithmic item generation, items are systematically created based on specific combinations of features that underlie the processing required to correctly solve a problem. Reading comprehension items have been more…
Descriptors: Difficulty Level, Test Items, Modeling (Psychology), Paragraph Composition
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing