Showing 1 to 15 of 22 results
Shaw, Emily J.; Marini, Jessica; Mattern, Krista D. – College Board, 2014
The study evaluated the predictive validity of various operationalizations of AP® Exam and course information that could be used to make college admission decisions. The incremental validity of different AP variables, above and beyond traditional admission measures such as SAT® and high school grade point average (HSGPA), in predicting first-year…
Descriptors: College Admission, Advanced Placement Programs, Grade Point Average, Predictive Validity
Mattern, Krista D.; Patterson, Brian F. – College Board, 2013
The College Board formed a research consortium with four-year colleges and universities to build a national higher education database with the primary goal of validating the revised SAT for use in college admission. A study by Mattern and Patterson (2009) examined the relationship between SAT scores and retention to the second year. The sample…
Descriptors: College Entrance Examinations, Scores, School Holding Power, Test Validity
Peer reviewed
Shaw, Emily J.; Marini, Jessica P.; Mattern, Krista D. – Educational and Psychological Measurement, 2013
The current study evaluated the relationship between various operationalizations of the Advanced Placement® (AP) exam and course information with first-year grade point average (FYGPA) in college to better understand the role of AP in college admission decisions. In particular, the incremental validity of the different AP variables, above…
Descriptors: Advanced Placement Programs, Grade Point Average, College Freshmen, College Admission
Kobrin, Jennifer L.; Patterson, Brian F.; Wiley, Andrew; Mattern, Krista D. – College Board, 2012
In 2011, the College Board released its SAT college and career readiness benchmark, which represents the level of academic preparedness associated with a high likelihood of college success and completion. The goal of this study, which was conducted in 2008, was to establish college success criteria to inform the development of the benchmark. The…
Descriptors: College Entrance Examinations, Standard Setting, College Readiness, Career Readiness
Patterson, Brian F.; Mattern, Krista D. – College Board, 2013
The continued accumulation of validity evidence for the intended uses of educational assessments is critical to ensure that proper inferences will be made for those purposes. To that end, the College Board has continued to collect college outcome data to evaluate the relationship between SAT® scores and college success. This report provides…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Patterson, Brian F.; Mattern, Krista D. – College Board, 2013
The continued accumulation of validity evidence for the core uses of educational assessments is critical to ensure that proper inferences will be made for those core purposes. To that end, the College Board has continued to follow previous cohorts of college students and this report provides updated validity evidence for using the SAT to predict…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Peer reviewed
Mattern, Krista D.; Patterson, Brian F. – Journal of Applied Psychology, 2013
Research on the predictive bias of cognitive tests has generally shown (a) no slope effects and (b) small intercept effects, typically favoring the minority group. Aguinis, Culpepper, and Pierce (2010) simulated data and demonstrated that statistical artifacts may have led to a lack of power to detect slope differences and an overestimate of the…
Descriptors: College Admission, Cognitive Tests, Statistical Bias, Test Bias
Mattern, Krista D.; Patterson, Brian F. – College Board, 2012
The College Board formed a research consortium with four-year colleges and universities to build a national higher education database with the primary goal of validating the revised SAT for use in college admission. A study by Mattern and Patterson (2009) examined the relationship between SAT scores and retention to the second year of college. The…
Descriptors: College Entrance Examinations, Scores, School Holding Power, Test Validity
Peer reviewed
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – Educational and Psychological Measurement, 2011
This study examined discrepant high school grade point average (HSGPA) and SAT performance as measured by the difference between a student's standardized SAT composite score and standardized HSGPA. The SAT-HSGPA discrepancy measure was used to examine whether certain students are more likely to exhibit discrepant performance and in what direction.…
Descriptors: Grade Point Average, College Entrance Examinations, Predictive Validity, College Admission
Peer reviewed
Shaw, Emily J.; Mattern, Krista D.; Patterson, Brian F. – Educational Assessment, 2011
Despite the similarities that researchers note between the cognitive processes and knowledge involved in reading and writing, there are students who are much stronger readers than writers and those who are much stronger writers than readers. The addition of the writing section to the SAT provides an opportunity to examine whether certain groups of…
Descriptors: College Entrance Examinations, Critical Reading, Reading Tests, Writing Tests
Marini, Jessica P.; Mattern, Krista D.; Shaw, Emily J. – College Board, 2011
There is a common misperception that test scores do not predict above a minimum threshold (Sackett, Borneman, & Connelly, 2008). That is, test scores may be useful for identifying students with very low levels of ability; however, higher scores are considered unrelated to higher performance for those above a certain threshold. This study aims…
Descriptors: College Entrance Examinations, Scores, Grade Point Average, High Achievement
Mattern, Krista D.; Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2012
This study examined the validity of the SAT for predicting performance in first-year English and mathematics courses. Results reveal a significant positive relationship between SAT scores and course grades, with slightly higher correlations for mathematics courses compared to English courses. Correlations were estimated by student characteristics…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Patterson, Brian F.; Mattern, Krista D. – College Board, 2011
The findings for the 2008 sample are largely consistent with the previous reports. SAT scores were found to be correlated with FYGPA (r = 0.54), with a magnitude similar to HSGPA (r = 0.56). The best set of predictors of FYGPA remains SAT scores and HSGPA (r = 0.63), as the addition of the SAT sections to the correlation of HSGPA alone with FYGPA…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Shaw, Emily J.; Mattern, Krista D. – College Board, 2012
The current study will explore the validity and potential of using the SAT, in conjunction with HSGPA, to arrive at a predicted FYGPA to improve student retention at four-year postsecondary institutions. Specifically, this study examined whether college students who did not perform as expected (observed FYGPA minus predicted FYGPA) were more…
Descriptors: College Entrance Examinations, Test Validity, Grade Point Average, High School Students
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – College Board, 2010
Presented at the national conference for the American Educational Research Association (AERA) in 2010. This presentation describes an alternative way of presenting the unique information provided by the SAT over HSGPA, namely examining students with discrepant SAT-HSGPA performance.
Descriptors: College Entrance Examinations, Grade Point Average, High School Students, Scores