Showing 1 to 15 of 19 results
Shaw, Emily J.; Kobrin, Jennifer L. – College Board, 2013
This study examines the relationship between students' SAT essay scores and college outcomes, including first-year grade point average (FYGPA) and first-year English course grade average (FY EngGPA), overall and by various demographic and academic performance subgroups. Results showed that the SAT essay score has a positive relationship with both…
Descriptors: College Entrance Examinations, Scores, Essays, Writing Skills
Kobrin, Jennifer L.; Patterson, Brian F.; Wiley, Andrew; Mattern, Krista D. – College Board, 2012
In 2011, the College Board released its SAT college and career readiness benchmark, which represents the level of academic preparedness associated with a high likelihood of college success and completion. The goal of this study, which was conducted in 2008, was to establish college success criteria to inform the development of the benchmark. The…
Descriptors: College Entrance Examinations, Standard Setting, College Readiness, Career Readiness
Kobrin, Jennifer L.; Sinharay, Sandip; Haberman, Shelby J.; Chajewski, Michael – College Board, 2011
This study examined the adequacy of a multiple linear regression model for predicting first-year college grade point average (FYGPA) using SAT® scores and high school grade point average (HSGPA). A variety of techniques, both graphical and statistical, were used to examine if it is possible to improve on the linear regression model. The results…
Descriptors: Multiple Regression Analysis, Goodness of Fit, College Entrance Examinations, Test Validity
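The baseline model described in the Kobrin, Sinharay, Haberman, and Chajewski (2011) abstract above is an ordinary multiple linear regression of FYGPA on SAT scores and HSGPA. The sketch below is a minimal illustration of that kind of model; the column names (sat_cr, sat_m, sat_w, hsgpa, fygpa), the input file, and the use of scikit-learn are assumptions for illustration, not the authors' actual code or data.

```python
# Illustrative sketch only: ordinary least squares regression of first-year GPA
# (FYGPA) on SAT section scores and high school GPA (HSGPA), the kind of baseline
# model the study evaluates. Column names and data file are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

df = pd.read_csv("students.csv")  # hypothetical student-level data

X = df[["sat_cr", "sat_m", "sat_w", "hsgpa"]]  # predictors
y = df["fygpa"]                                # criterion

model = LinearRegression().fit(X, y)
print("R^2:", r2_score(y, model.predict(X)))
print("Coefficients:", dict(zip(X.columns, model.coef_)))
```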
Peer reviewed
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – Educational and Psychological Measurement, 2011
This study examined discrepant high school grade point average (HSGPA) and SAT performance as measured by the difference between a student's standardized SAT composite score and standardized HSGPA. The SAT-HSGPA discrepancy measure was used to examine whether certain students are more likely to exhibit discrepant performance and in what direction.…
Descriptors: Grade Point Average, College Entrance Examinations, Predictive Validity, College Admission
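The Mattern, Shaw, and Kobrin (2011) abstract above defines the discrepancy measure as the difference between a student's standardized SAT composite score and standardized HSGPA. A minimal sketch of that computation is shown below; the variable names and sample values are assumptions, not the study's data.

```python
# Illustrative sketch only: SAT-HSGPA discrepancy computed as the standardized
# (z-scored) SAT composite minus the standardized HSGPA. Values are hypothetical.
import pandas as pd

def zscore(s: pd.Series) -> pd.Series:
    """Standardize a series to mean 0, standard deviation 1."""
    return (s - s.mean()) / s.std()

df = pd.DataFrame({
    "sat_composite": [1850, 1400, 2100, 1600],  # hypothetical composite scores
    "hsgpa": [3.2, 3.8, 3.9, 2.9],              # hypothetical HSGPAs
})

# Positive values: SAT performance above what HSGPA would suggest; negative: below.
df["discrepancy"] = zscore(df["sat_composite"]) - zscore(df["hsgpa"])
print(df)
```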
Peer reviewed
Kobrin, Jennifer L.; Patterson, Brian F. – Educational Assessment, 2011
Prior research has shown that there is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict 1st-year college performance at different institutions. This article demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this…
Descriptors: College Entrance Examinations, Scores, Grade Point Average, High School Students
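The Kobrin and Patterson (2011) abstract above describes a multilevel model with students nested within institutions, used to study how the predictive strength of the SAT and HSGPA varies across institutions. The sketch below shows one common way such a two-level model can be specified; the variable names, input file, and choice of statsmodels are assumptions, not the authors' specification.

```python
# Illustrative sketch only: a two-level (students within institutions) model with
# a random intercept and a random SAT slope by institution, mirroring the kind of
# multilevel model the abstract describes. Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file with columns: fygpa, sat, hsgpa, institution
df = pd.read_csv("students_by_institution.csv")

model = smf.mixedlm(
    "fygpa ~ sat + hsgpa",      # fixed effects
    data=df,
    groups=df["institution"],   # level-2 grouping: institution
    re_formula="~sat",          # random intercept and random SAT slope
)
result = model.fit()
print(result.summary())
```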
Peer reviewed
Kobrin, Jennifer L.; Kim, YoungKoung; Sackett, Paul R. – Educational and Psychological Measurement, 2012
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice vs. constructed response), cognitive complexity, and content of these assessments (achievement vs. aptitude) at the forefront of the discussion. This study addressed these questions by investigating the…
Descriptors: Grade Point Average, Standardized Tests, Predictive Validity, Predictor Variables
Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2011
This study presents a case for applying a transformation (Box and Cox, 1964) of the criterion used in predictive validity studies. The goals of the transformation were to better meet the assumptions of the linear regression model and to reduce the residual variance of fitted (i.e., predicted) values. Using data for the 2008 cohort of first-time,…
Descriptors: Predictive Validity, Evaluation Criteria, Regression (Statistics), College Freshmen
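The Patterson and Kobrin (2011) abstract above describes applying a Box-Cox (1964) transformation to the criterion (the first-year GPA) before fitting the prediction model. The sketch below shows a minimal example of that transformation; the sample GPA values and the small positive shift (Box-Cox requires strictly positive inputs) are assumptions for illustration.

```python
# Illustrative sketch only: Box-Cox transformation (Box & Cox, 1964) of a
# hypothetical FYGPA criterion. The 0.01 shift keeps zero GPAs strictly positive,
# which the transformation requires; it is an assumption, not the study's choice.
import numpy as np
from scipy import stats

fygpa = np.array([0.0, 2.1, 2.8, 3.3, 3.9, 4.0])  # hypothetical FYGPA values

shifted = fygpa + 0.01
transformed, lam = stats.boxcox(shifted)  # lambda estimated by maximum likelihood

print("Estimated lambda:", lam)
print("Transformed criterion:", transformed)
```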
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2012
This study examines student performance on the SAT and SAT Subject Tests in order to identify groups of students who score differently on these two tests, and to determine whether certain demographic groups score higher on one test compared to the other. Discrepancy scores were created to capture individuals' performance differences on the…
Descriptors: College Entrance Examinations, Scores, Performance, Standardized Tests
Mattern, Krista D.; Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2012
This study examined the validity of the SAT for predicting performance in first-year English and mathematics courses. Results reveal a significant positive relationship between SAT scores and course grades, with slightly higher correlations for mathematics courses compared to English courses. Correlations were estimated by student characteristics…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Patterson, Brian F.; Packman, Sheryl; Kobrin, Jennifer L. – College Board, 2011
The purpose of this study was to examine the effects of Advanced Placement® (AP®) exam participation and performance on college grades for courses taken in the same subject area as students' AP Exam(s). Students' first-year college subject area grade point averages (SGPAs) were examined in nine subject areas: mathematics, computer science,…
Descriptors: Advanced Placement, Intellectual Disciplines, Engineering, Natural Sciences
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2010
There is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict first-year college performance at different institutions. This paper demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this variability. In a model that…
Descriptors: Scores, Validity, Prediction, College Freshmen
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – College Board, 2010
Presented at the annual meeting of the American Educational Research Association (AERA) in 2010. This presentation describes an alternative way of presenting the unique information provided by the SAT over HSGPA, namely examining students with discrepant SAT-HSGPA performance.
Descriptors: College Entrance Examinations, Grade Point Average, High School Students, Scores
Patterson, Brian F.; Mattern, Krista D.; Kobrin, Jennifer L. – College Board, 2009
This report presents the findings from a replication of the Kobrin et al. (2008) and Mattern et al. (2008) reports. Students who were missing at least one of the following were excluded from the analyses: SAT scores, a self-reported high school grade point average (HSGPA), and a valid first-year GPA (FYGPA); this resulted in a final sample size of…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Patterson, Brian F.; Packman, Sheryl; Kobrin, Jennifer L. – College Board, 2011
The purpose of this study was to examine the effects of Advanced Placement (AP) exam participation and performance on college grades for courses taken in the same subject area as students' AP Exam(s). Students' first-year college subject area grade point averages (SGPAs) were examined in nine subject areas: mathematics, computer science,…
Descriptors: Advanced Placement, College Freshmen, Grades (Scholastic), Achievement Tests