Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 13
Source
College Board: 13
Author
Mattern, Krista D.: 5
Patterson, Brian F.: 5
Kobrin, Jennifer L.: 3
Shaw, Emily J.: 3
Barbuti, Sandra M.: 1
Chajewski, Michael: 1
Ewing, Maureen: 1
Godfrey, Kelly E.: 1
Hendrickson, Amy: 1
Jagesic, Sanja: 1
Kaliski, Pamela: 1
Publication Type
Reports - Research: 8
Non-Print Media: 5
Reference Materials - General: 5
Numerical/Quantitative Data: 4
Speeches/Meeting Papers: 1
Education Level
Higher Education: 10
Postsecondary Education: 10
High Schools: 5
Secondary Education: 3
Location
Massachusetts: 1
New Jersey: 1
Assessments and Surveys
SAT (College Admission Test): 9
Advanced Placement…: 2
College Level Examination…: 1
Mattern, Krista D.; Patterson, Brian F. – College Board, 2012
The College Board formed a research consortium with four-year colleges and universities to build a national higher education database with the primary goal of validating the revised SAT®, which consists of three sections: critical reading (SAT-CR), mathematics (SAT-M), and writing (SAT-W), for use in college admission. A study by Mattern and…
Descriptors: College Entrance Examinations, Scores, Test Validity, Academic Persistence
Godfrey, Kelly E.; Jagesic, Sanja – College Board, 2016
The College-Level Examination Program® (CLEP®) is a computer-based prior-learning assessment that allows examinees the opportunity to demonstrate mastery of knowledge and skills necessary to earn postsecondary course credit in higher education. Currently, there are 33 exams in five subject areas: composition and literature, world languages,…
Descriptors: Student Placement, Test Validity, Scores, Mathematics Tests
Reshetar, Rosemary; Kaliski, Pamela; Chajewski, Michael; Lionberger, Karen – College Board, 2012
This presentation summarizes a pilot study conducted after the May 2011 administration of the AP Environmental Science Exam. The study used analytical methods based on scaled anchoring as input to a Performance Level Descriptor validation process that solicited systematic input from subject matter experts.
Descriptors: Advanced Placement Programs, Science Tests, Achievement Tests, Classification
Shaw, Emily J.; Marini, Jessica; Mattern, Krista D. – College Board, 2014
The study evaluated the predictive validity of various operationalizations of AP® Exam and course information that could be used to make college admission decisions. The incremental validity of different AP variables, above and beyond traditional admission measures such as SAT® and high school grade point average (HSGPA), in predicting first-year…
Descriptors: College Admission, Advanced Placement Programs, Grade Point Average, Predictive Validity
Patelis, Thanos – College Board, 2012
This is a keynote presentation given at AERA on developing a validity agenda for growth models in a large-scale (e.g., state) setting. The emphasis of this presentation was to indicate that growth models, and the validity agenda designed to provide evidence supporting the claims to be made, need to be personalized to meet the local or…
Descriptors: Test Validity, Statistical Analysis, Conferences (Gatherings), Evidence
Mattern, Krista D.; Patterson, Brian F. – College Board, 2011
The College Board formed a research consortium with four-year colleges and universities to build a national higher education database with the primary goal of validating the SAT® for use in college admission. The first sample included first-time, first-year students entering college in fall 2006, with 110 institutions providing students'…
Descriptors: College Entrance Examinations, Scores, Statistical Analysis, Predictive Validity
Mattern, Krista D.; Patterson, Brian F. – College Board, 2011
This report presents the findings from a replication of the analyses from the report, "Is Performance on the SAT Related to College Retention?" (Mattern & Patterson, 2009). The tables presented herein are based on the 2007 sample; the findings are largely the same as those presented in the original report and show that SAT scores are…
Descriptors: College Entrance Examinations, Correlation, Academic Persistence, Academic Achievement
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2010
There is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict first-year college performance at different institutions. This paper demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this variability. In a model that…
Descriptors: Scores, Validity, Prediction, College Freshmen
Shaw, Emily J. – College Board, 2011
Presented at the 23rd Annual Historically Black Colleges & Universities (HBCU) Conference in Atlanta, GA, in September 2011. Admitted Class Evaluation Service (ACES) is the College Board's free online service that predicts how admitted students will perform at a college or university generally, and how successful students will be in specific…
Descriptors: College Admission, Student Placement, Test Validity, Graphs
Shaw, Emily – College Board, 2010
Presented at the College Board National Forum in Washington, D.C., October 2010. This presentation examines the recent national validity evidence that supports the use of SAT Writing in college admissions and English placement. Additionally, it includes information on the College Board's free online Admitted Class Evaluation Service (ACES) system,…
Descriptors: Test Validity, College Entrance Examinations, Writing Achievement, Writing Tests
Hendrickson, Amy; Patterson, Brian; Ewing, Maureen – College Board, 2010
The psychometric considerations and challenges associated with including constructed response items on tests are discussed along with how these issues affect the form assembly specifications for mixed-format exams. Reliability and validity, security and fairness, pretesting, content and skills coverage, test length and timing, weights, statistical…
Descriptors: Multiple Choice Tests, Test Format, Test Construction, Test Validity
Kobrin, Jennifer L.; Kimmel, Ernest W. – College Board, 2006
Based on statistics from the first few administrations of the SAT writing section, the test is performing as expected. The reliability of the writing section is very similar to that of other writing assessments. Based on preliminary validity research, the writing section is expected to add modestly to the prediction of college performance when…
Descriptors: Test Construction, Writing Tests, Cognitive Tests, College Entrance Examinations
Mattern, Krista D.; Patterson, Brian F.; Shaw, Emily J.; Kobrin, Jennifer L.; Barbuti, Sandra M. – College Board, 2008
The purpose of the study is to examine the differential validity and prediction of the SAT using a nationally representative sample of first-year college students admitted with the revised version of the SAT. The findings demonstrate that there are similar patterns of differential validity and prediction by gender, race/ethnicity, and best…
Descriptors: Validity, Prediction, College Entrance Examinations, Standardized Tests