Showing 1 to 15 of 24 results
Shaw, Emily J.; McKenzie, Elizabeth – College Board, 2010
[Slides] presented at the annual conference of the Southern Association for College Admission Counseling, April 2010. This presentation summarizes recent research from the national SAT Validity Study and includes information on the Admitted Class Evaluation Service (ACES) system and how ACES can help institutions conduct their own validity…
Descriptors: College Entrance Examinations, Test Validity, Educational Research, Predictive Validity
Reshetar, Rosemary; Kaliski, Pamela; Chajewski, Michael; Lionberger, Karen – College Board, 2012
This presentation summarizes a pilot study conducted after the May 2011 administration of the AP Environmental Science Exam. The study used analytical methods based on scaled anchoring as input to a Performance Level Descriptor validation process that solicited systematic input from subject matter experts.
Descriptors: Advanced Placement Programs, Science Tests, Achievement Tests, Classification
Shaw, Emily J.; Mattern, Krista D. – College Board, 2012
The current study explored the validity and potential of using the SAT, in conjunction with HSGPA, to arrive at a predicted FYGPA to improve student retention at four-year postsecondary institutions. Specifically, this study examined whether college students who did not perform as expected (observed FYGPA minus predicted FYGPA) were more…
Descriptors: College Entrance Examinations, Test Validity, Grade Point Average, High School Students
Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary – College Board, 2012
The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…
Descriptors: Advanced Placement Programs, Achievement Tests, Item Response Theory, Models
Patelis, Thanos – College Board, 2012
This is a keynote presentation given at AERA on developing a validity agenda for growth models in a large-scale (e.g., state) setting. The presentation emphasized that growth models, and the validity agenda designed to provide evidence supporting the claims to be made, need to be personalized to meet the local or…
Descriptors: Test Validity, Statistical Analysis, Conferences (Gatherings), Evidence
Camara, Wayne – College Board, 2011
This presentation was given at the 2011 National Conference on Student Assessment (CCSSO). The focus of this presentation is how to validate the common core state standards (CCSS) in math and ELA and the subsequent assessments that will be developed by state consortia. The CCSS specify the skills students need to be ready for post-secondary…
Descriptors: College Readiness, Career Readiness, Benchmarking, Student Evaluation
Proctor, Thomas P.; Kim, YoungKoung Rachel – College Board, 2009
Presented at the national conference for the American Educational Research Association (AERA) in April 2009. This study examined the utility of scores on the SAT writing test, specifically examining the reliability of scores using generalizability and item response theories. The study also provides an overview of current predictive validity…
Descriptors: College Entrance Examinations, Writing Tests, Psychometrics, Predictive Validity
Kaliski, Pamela; France, Megan; Huff, Kristen; Thurber, Allison – College Board, 2011
Developing a cognitive model of task performance is an important and often overlooked phase in assessment design; failing to establish such a model can threaten the validity of the inferences made from the scores produced by an assessment (e.g., Leighton, 2004). Conducting think aloud interviews (TAIs), where students think aloud while completing…
Descriptors: World History, Advanced Placement Programs, Achievement Tests, Protocol Analysis
Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2011
This study presents a case for applying a transformation (Box and Cox, 1964) of the criterion used in predictive validity studies. The goals of the transformation were to better meet the assumptions of the linear regression model and to reduce the residual variance of fitted (i.e., predicted) values. Using data for the 2008 cohort of first-time,…
Descriptors: Predictive Validity, Evaluation Criteria, Regression (Statistics), College Freshmen
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2010
There is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict first-year college performance at different institutions. This paper demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this variability. In a model that…
Descriptors: Scores, Validity, Prediction, College Freshmen
Shaw, Emily J.; Kobrin, Jennifer L.; Patterson, Brian F.; Mattern, Krista D. – College Board, 2011
Presented at the Annual Meeting of the American Educational Research Association (AERA) in New Orleans, LA in April 2011. The current study examined the differential validity of the SAT for predicting cumulative GPA through the second-year of college by college major, as well as the differential prediction of cumulative GPA by college major among…
Descriptors: College Entrance Examinations, Predictive Validity, Grade Point Average, College Students
Hendrickson, Amy; Huff, Kristen; Luecht, Ric – College Board, 2009
[Slides] presented at the Annual Meeting of National Council on Measurement in Education (NCME) in San Diego, CA in April 2009. This presentation describes how the vehicles for gathering student evidence--task models and test specifications--are developed.
Descriptors: Test Items, Test Construction, Evidence, Achievement
Shaw, Emily J. – College Board, 2011
Presented at the 23rd Annual Historically Black Colleges & Universities (HBCU) Conference in Atlanta, GA, in September 2011. Admitted Class Evaluation Service (ACES) is the College Board's free online service that predicts how admitted students will perform at a college or university generally, and how successful students will be in specific…
Descriptors: College Admission, Student Placement, Test Validity, Graphs
Shaw, Emily – College Board, 2010
Presented at the College Board National Forum in Washington, D.C., October 2010. This presentation examines the recent national validity evidence that supports the use of SAT Writing in college admissions and English placement. Additionally, it includes information on the College Board's free online Admitted Class Evaluation Service (ACES) system,…
Descriptors: Test Validity, College Entrance Examinations, Writing Achievement, Writing Tests
Wiley, Andrew – College Board, 2009
Presented at the national conference for the American Educational Research Association (AERA) in 2009. This presentation discussed the development and implementation of the new SAT writing section.
Descriptors: Aptitude Tests, Writing Tests, Test Construction, Test Format