Showing all 12 results
Breland, Hunter M.; Jones, Robert J. – 1988
The reliability, validity, and score discrepancies of 94 expository essays scored in conference versus remote settings were studied. Focus was on comparing holistic ratings obtained in both settings. Essays written by college freshmen on two different topics were scored by readers working in a conference setting and by different readers working in…
Descriptors: College Freshmen, Comparative Analysis, Conferences, Essay Tests
Breland, Hunter M.; Lytle, Eldon G. – 1990
The utility of computer analysis in the assessment of written products was studied using the WordMAP software package. Data were collected for 92 college freshmen, using: (1) the Test of Standard Written English (TSWE); (2) the English Composition Test of the College Board; (3) verbal and mathematical Scholastic Aptitude Tests; (4) two narrative…
Descriptors: College Freshmen, Computer Assisted Testing, Educational Assessment, Essay Tests
Breland, Hunter M.; Jones, Robert J. – 1982
Research was conducted into the specific characteristics of brief, impromptu essay writing. A random sample of 806 essays was taken from the more than 80,000 written for the College Board's English Composition Achievement Test (ECT) in 1979. Using a special taxonomy of 20 writing characteristics, these essays were subjected to a second reading in…
Descriptors: Educational Assessment, Essay Tests, Evaluation Criteria, Holistic Evaluation
Breland, Hunter M. – College Board Review, 1977
One reliable way to measure student writing ability is to gather and evaluate a series of writing samples or essays over a period of time. The use of multiple-choice tests in combination with essay assignments can be the most educationally sound solution to the administrative problems involved in college course placements. (Editor/LBH)
Descriptors: Comparative Analysis, Essay Tests, Essays, Expository Writing
Breland, Hunter M.; And Others – 1987
Six university English departments collaborated in this examination of the differences between multiple-choice and essay tests in evaluating writing skills. The study also investigated ways the two tools can complement one another, ways to improve cost effectiveness of essay testing, and ways to integrate assessment and the educational process.…
Descriptors: Comparative Testing, Efficiency, Essay Tests, Higher Education
Peer reviewed
Breland, Hunter M.; Griswold, Philip A. – Journal of Educational Psychology, 1982
The relationships among scores on traditional college entrance tests and scores on an essay placement test for women and men and four ethnic groups were examined. The tests correlated highly with essay performance. However, women tended to be underestimated and men and ethnic minorities overestimated by these measures. (Author/PN)
Descriptors: College Entrance Examinations, Essay Tests, Higher Education, Multiple Choice Tests
Breland, Hunter M. – 1983
Direct assessment of writing skill, usually considered to be synonymous with assessment by means of writing samples, is reviewed in terms of its history and with respect to evidence of its reliability and validity. Reliability is examined as it is influenced by reader inconsistency, domain sampling, and other sources of error. Validity evidence is…
Descriptors: Essay Tests, Evaluation Needs, Higher Education, Interrater Reliability
Breland, Hunter M.; Kubota, Melvin Y.; Bonner, Marilyn W. – College Entrance Examination Board, 1999
This study examined the SAT® II: Writing Subject Test as a predictor of writing performance in college English courses. Writing performance was based on eight writing samples submitted as part of regular course work by students in eight colleges. The samples consisted of drafts and final papers submitted in response to four take-home writing…
Descriptors: College Entrance Examinations, Writing Tests, College English, Predictive Validity
Peer reviewed
Breland, Hunter M.; Gaynor, Judith L. – Journal of Educational Measurement, 1979
Over 2,000 writing samples were collected from four undergraduate institutions and compared, where possible, with scores on a multiple-choice test. High correlations between ratings of the writing samples and multiple-choice test scores were obtained. Samples contributed substantially to the prediction of both college grades and writing…
Descriptors: Achievement Tests, Comparative Testing, Correlation, Essay Tests
Breland, Hunter M. – 1978
Data were collected for a brief objective English test, the Test of Standard Written English (TSWE), and for subsequent essay writing performance in four institutions. Data were then pooled across the four institutions and subclassified into four groups: males, females, majorities, and minorities. The groups were then compared with respect to TSWE…
Descriptors: Analysis of Covariance, College Entrance Examinations, Correlation, Essay Tests
Peer reviewed
Breland, Hunter M.; Duran, Richard P. – Educational and Psychological Measurement, 1985
English writing ability of Hispanic college candidates taking the College Board's English Composition Test (ECT) was studied. The performances of three groups on the essay and multiple-choice portions of the ECT were compared with each other and with the performance of ECT test takers as a whole on the same measures. (Author/DWH)
Descriptors: College Bound Students, College Entrance Examinations, English (Second Language), Essay Tests
Breland, Hunter M. – 1996
Recent trends in writing skill assessment suggest a movement toward the use of free-response writing tasks and away from the traditional multiple-choice test. A number of national examinations, including major college admissions tests, have included free-response components. Most of the arguments in support of this trend relate to the hypothesized…
Descriptors: College Entrance Examinations, Computer Uses in Education, Cost Effectiveness, Educational Technology