Showing all 12 results
Peer reviewed
Finch, W. Holmes – Applied Psychological Measurement, 2012
Increasingly, researchers interested in identifying potentially biased test items are encouraged to use a confirmatory, rather than exploratory, approach. One such method for confirmatory testing is rooted in differential bundle functioning (DBF), where hypotheses regarding potential differential item functioning (DIF) for sets of items (bundles)…
Descriptors: Test Bias, Test Items, Statistical Analysis, Models
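The snippet above mentions confirmatory testing of item bundles for bias but is cut off before describing the procedure. Purely as a hedged illustration of the underlying idea (not Finch's specific DBF method, which the snippet does not spell out), the sketch below computes the Mantel-Haenszel common odds ratio, a standard single-item DIF statistic; DBF generalizes the same question from one item to a hypothesized set of items. The function name and data layout are assumptions made for this example.

```python
# Minimal sketch: Mantel-Haenszel DIF statistic for one dichotomous item,
# stratified by total test score (the usual matching variable).
from collections import defaultdict

def mantel_haenszel_dif(item, total, group):
    """item: 0/1 responses; total: matching scores; group: 'ref' or 'focal'."""
    strata = defaultdict(lambda: {"A": 0, "B": 0, "C": 0, "D": 0, "n": 0})
    for y, t, g in zip(item, total, group):
        cell = strata[t]
        cell["n"] += 1
        if g == "ref":
            cell["A" if y == 1 else "B"] += 1   # reference group correct / incorrect
        else:
            cell["C" if y == 1 else "D"] += 1   # focal group correct / incorrect
    num = sum(c["A"] * c["D"] / c["n"] for c in strata.values() if c["n"] > 0)
    den = sum(c["B"] * c["C"] / c["n"] for c in strata.values() if c["n"] > 0)
    # Common odds ratio: values near 1 indicate little evidence of DIF.
    return num / den if den > 0 else float("inf")
```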
Wainer, Howard – 1994
This study examined the Law School Admission Test (LSAT) through the use of testlet methods to model its inherent, locally dependent structure. Precision, measured by reliability, and fairness, measured by the comparability of performance across all identified subgroups of examinees, were the focus of the study. The polytomous item response theory…
Descriptors: College Entrance Examinations, Item Response Theory, Reading Comprehension, Reading Tests
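The abstract says a polytomous item response theory model was applied to testlet-level scores, but the text is truncated before naming the model. As a rough illustration of what "polytomous IRT" can mean in this setting (the specific model below is an assumption, not taken from the study), this sketch computes category probabilities under the partial credit model, one common choice when a testlet is scored as a single multi-category item.

```python
import math

def partial_credit_probs(theta, deltas):
    """Category probabilities for one polytomous item under the partial credit model.
    theta: examinee ability; deltas: step difficulties d_1..d_m; categories 0..m."""
    # Cumulative sums of (theta - d_k) give the unnormalized log-weights;
    # the empty sum for category 0 is defined as 0.
    cumulative = [0.0]
    for d in deltas:
        cumulative.append(cumulative[-1] + (theta - d))
    weights = [math.exp(c) for c in cumulative]
    z = sum(weights)
    return [w / z for w in weights]
```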
Peer reviewed
Evans, Franklin R.; Reilly, Richard R. – Journal of Educational Measurement, 1972
A study to determine whether potential bias exists in the Law School Admission Test (LSAT), which fee-free center candidates complete within the available time in a smaller proportion than regular center candidates do. (MB)
Descriptors: Black Students, Reaction Time, Response Style (Tests), Scoring
Peer reviewed
van der Linden, Wim J.; Ariel, Adelaide; Veldkamp, Bernard P. – Journal of Educational and Behavioral Statistics, 2006
Test-item writing efforts typically result in item pools with an undesirable correlational structure between the content attributes of the items and their statistical information. If such pools are used in computerized adaptive testing (CAT), the algorithm may be forced to select items with less than optimal information, that violate the content…
Descriptors: Adaptive Testing, Computer Assisted Testing, Test Items, Item Banks
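The abstract describes a tension in CAT between picking the most informative items and respecting content specifications, but stops before the authors' remedy. The sketch below is only a schematic of that tension, not the constrained pool-assembly method the paper itself studies: a greedy maximum-information selection for a 2PL pool, restricted to content areas with remaining quota. The data layout and function names are assumptions for the example.

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def select_next_item(theta, pool, administered, content_quota):
    """Greedy maximum-information selection limited to content areas whose quota
    is not yet exhausted. pool: list of dicts with 'id', 'a', 'b', 'area';
    content_quota: remaining item count per content area."""
    eligible = [it for it in pool
                if it["id"] not in administered and content_quota.get(it["area"], 0) > 0]
    if not eligible:
        return None
    return max(eligible, key=lambda it: item_information(theta, it["a"], it["b"]))
```

If the pool's most informative items cluster in a few content areas, a greedy rule like this quickly exhausts those quotas and is forced onto weaker items, which is the structural problem the abstract points to.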
Peer reviewed
Weber, David A. – Buffalo Law Review, 1975
The question of possible racial bias in the Law School Admissions Test (LSAT), the issue crucial to future tests of the constitutionality of preferential admissions, is explored with the conclusion that uncertainties in this area should not overshadow the necessity for reexamination of law school admission criteria. (JT)
Descriptors: Admission Criteria, Higher Education, Law Schools, Predictive Measurement
Peer reviewed
Powell, Brian; Steelman, Lala Carr – Integrated Education, 1982
Compares men's and women's performance on the Law School Admission Test (LSAT), and suggests that the math section may have penalized women, since their scores matched men's on the other sections. Questions the validity of mathematics performance as a predictor of success in law school and as a lawyer. (GC)
Descriptors: Achievement Tests, Females, Higher Education, Law Schools
Evans, Franklin R.; Reilly, Richard – 1971
Specially constructed "speeded" and "unspeeded" forms of a Reading Comprehension section of the Law School Admission Test (LSAT) were administered to regular center and fee-free center LSAT candidates in an effort to determine: (1) if the test was more speeded for fee-free candidates, and (2) if reducing the amount of…
Descriptors: Blacks, Fees, Financial Support, Item Analysis
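The entry above compares speeded and unspeeded test forms across candidate groups. As a minimal, assumed illustration of how speededness is often summarized (the study's actual indices are not given in the snippet), this sketch computes two common quantities from a response matrix: the proportion of examinees reaching the final item and the mean proportion of not-reached items.

```python
def speededness_summary(responses):
    """responses: list of per-examinee answer lists, with None marking items the
    examinee never reached (an encoding assumed for this example)."""
    n = len(responses)
    n_items = len(responses[0])
    reached_last = sum(1 for r in responses if r[-1] is not None)
    not_reached = sum(sum(1 for x in r if x is None) for r in responses)
    return {
        "pct_reaching_last_item": reached_last / n,
        "mean_not_reached_rate": not_reached / (n * n_items),
    }
```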
Peer reviewed
Linn, Robert L. – Journal of Legal Education, 1975
The use of the Law School Admission Test (LSAT) and undergraduate grade point average for members of minority groups is examined in relationship to recent LSAT studies and related research on admissions tests and test bias. Traditional predictors of law school grades were found to be as accurate for minority as for majority persons. (JT)
Descriptors: Admission Criteria, Grade Point Average, Grade Prediction, Graduate Students
Peer reviewed
Journal of Blacks in Higher Education, 2000
The nonprofit corporation that performs standardized testing for law school admission has in recent years produced $100 million in profits. For minority groups who tend to score poorly on the test, the question is why the corporation refuses to dedicate some of its wealth to commissioning an independent study of whether test results correlate with…
Descriptors: Black Students, College Admission, Higher Education, Law Schools
Peer reviewed
Simon, Rita J.; Danner, Mona J. E. – Journal of Legal Education, 1990
A study evaluated the accuracy of Law School Admission Test (LSAT) scores in predicting student law school performance. Male and female scores and White, Black, or Hispanic scores were compared. Data were drawn from 1987 and 1988 graduating classes of five geographically diverse law schools. No significant differences between groups were found.…
Descriptors: Blacks, College Entrance Examinations, Comparative Analysis, Higher Education
White, David M. – 1986
This book discusses tricks for answering questions on the Law School Admission Test (LSAT). The tricks are based on an analysis of 12 editions of the LSAT which have been made public pursuant to New York's Truth in Testing Law. Sample LSAT questions published by the Law School Admission Council are referenced to exemplify the tricks' applications…
Descriptors: College Entrance Examinations, Higher Education, Multiple Choice Tests, Pretesting
Breland, Hunter M.; Bridgeman, Brent; Fowles, Mary E. – College Entrance Examination Board, 1999
A comprehensive review was conducted of writing research literature and writing test program activities in a number of testing programs. The review was limited to writing assessments used for admission in higher education. Programs reviewed included ACT, Inc.'s ACT™ program, the California State Universities and Colleges (CSUC) testing program,…
Descriptors: Writing Research, Writing Tests, Writing (Composition), Writing Instruction