Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 0 |
Since 2006 (last 20 years) | 11 |
Descriptor
Difficulty Level | 13 |
College Entrance Examinations | 9 |
Models | 7 |
Test Items | 7 |
High School Students | 6 |
Scores | 6 |
Academic Standards | 5 |
Grade Point Average | 5 |
Academic Achievement | 4 |
College Readiness | 4 |
Correlation | 4 |
Source
College Board | 13 |
Author
Engelhard, George, Jr. | 3 |
Kaliski, Pamela | 3 |
Camara, Wayne J. | 2 |
Huff, Kristen | 2 |
Wiley, Andrew | 2 |
Wind, Stefanie A. | 2 |
Wyatt, Jeffrey N. | 2 |
Barry, Carol | 1 |
Beatty, Adam S. | 1 |
Camara, Wayne | 1 |
France, Megan | 1 |
Publication Type
Reports - Research | 8 |
Non-Print Media | 5 |
Numerical/Quantitative Data | 5 |
Reference Materials - General | 5 |
Speeches/Meeting Papers | 1 |
Tests/Questionnaires | 1 |
Education Level
Higher Education | 9 |
Postsecondary Education | 9 |
High Schools | 7 |
Secondary Education | 7 |
Location
New York | 1 |
Assessments and Surveys
SAT (College Admission Test) | 8 |
Advanced Placement… | 2 |
Engelhard, George, Jr.; Wind, Stefanie A. – College Board, 2013
The major purpose of this study is to examine the quality of ratings assigned to constructed-response (CR) questions in large-scale assessments from the perspective of Rasch Measurement Theory. Rasch Measurement Theory provides a framework for the examination of rating scale category structure that can yield useful information for interpreting the…
Descriptors: Measurement Techniques, Rating Scales, Test Theory, Scores
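For orientation only, and not drawn from the report above: rating scale category structure of the kind it examines is usually modeled with the Andrich rating-scale form of the Rasch model, which can be sketched in log-odds form as

  \ln\!\left(\frac{P_{nik}}{P_{ni(k-1)}}\right) = \theta_n - \delta_i - \tau_k

where \theta_n is the location (proficiency) of person n, \delta_i is the difficulty of item i, and \tau_k is the threshold governing the step from rating category k-1 to category k. Disordered or poorly separated \tau_k estimates are the usual sign that the rating categories are not functioning as intended.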
Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary – College Board, 2012
The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…
Descriptors: Advanced Placement Programs, Achievement Tests, Item Response Theory, Models
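For orientation only, using the standard formulation of the model rather than anything specific to the study above: the Many-Facet Rasch (MFR) Model adds a facet for each source of variation in the ratings, so for a standard-setting panel it might be sketched as

  \ln\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = \theta_n - \delta_i - \lambda_j - \tau_k

where \theta_n is the location of examinee (or performance level) n, \delta_i is the difficulty of item i, \lambda_j is the severity of panelist j, and \tau_k is the threshold between rating categories k-1 and k. Fit statistics on the \lambda_j facet are what make it possible to evaluate the quality of individual panelists' judgments.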
Beatty, Adam S.; Sackett, Paul R.; Kuncel, Nathan R.; Kiger, Thomas B.; Rigdon, Jana L.; Shen, Winny; Walmsley, Philip T. – College Board, 2013
In recent decades, increasing emphasis has been placed on college graduation rates and on reducing attrition, due to the social and economic benefits, at both the individual and national levels, proposed to accrue from a more highly educated population (Bureau of Labor Statistics, 2011). In the United States in particular, there is a concern…
Descriptors: Graduation Rate, High Schools, Academic Standards, College Preparation
Mattern, Krista D.; Wyatt, Jeffrey N. – College Board, 2012
A recurrent trend in higher education research has been to identify additional predictors of college success beyond the traditional measures of high school grade point average (HSGPA) and standardized test scores, given that a large percentage of the variance in college performance remains unaccounted for. A recent study by Wyatt, Wiley, Camara, and…
Descriptors: College Readiness, Grade Point Average, College Entrance Examinations, Scores
Kaliski, Pamela; France, Megan; Huff, Kristen; Thurber, Allison – College Board, 2011
Developing a cognitive model of task performance is an important and often overlooked phase in assessment design; failing to establish such a model can threaten the validity of the inferences made from the scores produced by an assessment (e.g., Leighton, 2004). Conducting think aloud interviews (TAIs), where students think aloud while completing…
Descriptors: World History, Advanced Placement Programs, Achievement Tests, Protocol Analysis
Wyatt, Jeffrey N.; Wiley, Andrew; Camara, Wayne J.; Proestler, Nina – College Board, 2012
The academic intensity, or academic rigor, of students' high school curriculum is positively related to several college outcomes, including avoidance of remediation and graduation attainment. However, research on academic rigor has been limited, possibly due to the difficulty in obtaining a quantitative measure applicable across schools and districts.…
Descriptors: Academic Standards, College Readiness, College Preparation, Difficulty Level
Wiley, Andrew; Wyatt, Jeffrey; Camara, Wayne J. – College Board, 2011
This report presents a methodology for the measurement and tracking of the college readiness level of high school students who are engaged in the college admission process. The proposed index uses the three distinct hurdles of SAT scores, high school GPA and a newly developed measure of academic rigor. Appendix A contains tables and numerical…
Descriptors: College Entrance Examinations, College Readiness, Measurement, Progress Monitoring
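As an illustration only: the three-hurdle structure described in the entry above lends itself to a conjunctive index, in which a student is flagged as college ready only if every hurdle is cleared. The sketch below uses placeholder cutoffs and a made-up rigor scale, not the report's actual benchmarks or weighting.

# Hypothetical sketch of a conjunctive three-hurdle readiness index.
# All cutoffs are illustrative placeholders, not the report's benchmarks.
def college_ready(sat_total: int, hs_gpa: float, rigor: float,
                  sat_cut: int = 1500, gpa_cut: float = 3.0,
                  rigor_cut: float = 0.5) -> bool:
    """Return True only if all three hurdles are cleared."""
    return sat_total >= sat_cut and hs_gpa >= gpa_cut and rigor >= rigor_cut

# Example: clears the SAT and GPA hurdles but not the (hypothetical) rigor hurdle.
print(college_ready(sat_total=1600, hs_gpa=3.4, rigor=0.3))  # False

Because such an index is conjunctive, a very high value on one hurdle cannot compensate for falling short on another; that is the usual reading of a "distinct hurdles" design, though the report's exact rules are not reproduced here.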
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Camara, Wayne – College Board, 2011
This presentation was delivered at the 2011 National Conference on Student Assessment (CCSSO). Its focus is how to validate the Common Core State Standards (CCSS) in math and ELA and the subsequent assessments that will be developed by state consortia. The CCSS specify the skills students need to be ready for post-secondary…
Descriptors: College Readiness, Career Readiness, Benchmarking, Student Evaluation
Kaliski, Pamela; Huff, Kristen; Barry, Carol – College Board, 2011
For educational achievement tests that employ multiple-choice (MC) items and aim to reliably classify students into performance categories, it is critical to design MC items that are capable of discriminating student performance according to the stated achievement levels. This is accomplished, in part, by clearly understanding how item design…
Descriptors: Alignment (Education), Academic Achievement, Expertise, Evaluative Thinking
Gierl, Mark J.; Leighton, Jacqueline P.; Wang, Changjiang; Zhou, Jiawen; Gokiert, Rebecca; Tan, Adele – College Board, 2009
The purpose of the study is to present research focused on validating the four algebra cognitive models in Gierl, Wang, et al., using student response data collected with protocol analysis methods to evaluate the knowledge structures and processing skills used by a sample of SAT test takers.
Descriptors: Algebra, Mathematics Tests, College Entrance Examinations, Student Attitudes
Liu, Jinghua; Schuppan, Fred; Walker, Michael E. – College Board, 2005
This study explored whether the addition of items with more advanced math content to the SAT Reasoning Test™ (SAT®) would impact test-taker performance. Two sets of SAT math equating sections were modified to form four subforms each. Different numbers of items with advanced content, taken from the SAT II: Mathematics Level IC Test (Math IC),…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Difficulty Level
Engelhard, George, Jr.; Myford, Carol M. – College Board, 2003
The purpose of this study was to examine, describe, evaluate, and compare the rating behavior of faculty consultants who scored essays written for the Advanced Placement English Literature and Composition (AP® ELC) Exam. Data from the 1999 AP ELC Exam were analyzed using FACETS (Linacre, 1998) and SAS. The faculty consultants were not all…
Descriptors: Advanced Placement, College Faculty, Consultants, Scoring