Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 3 |
| Since 2007 (last 20 years) | 5 |
Source
| College Board | 2 |
| College Entrance Examination… | 2 |
| ProQuest LLC | 2 |
| Assessing Writing | 1 |
| Assessment for Effective… | 1 |
| Grantee Submission | 1 |
Author
| McMaster, Kristen L. | 2 |
| Patchan, Melissa M. | 2 |
| Puranik, Cynthia S. | 2 |
| Sears, Mary M. | 2 |
| Adams, Kimberly | 1 |
| Bonner, Marilyn W. | 1 |
| Breland, Hunter M. | 1 |
| Camara, Wayne J. | 1 |
| Day, Rachel | 1 |
| Isonio, Steven | 1 |
| James, Cindy L. | 1 |
Publication Type
| Reports - Research | 5 |
| Journal Articles | 3 |
| Numerical/Quantitative Data | 3 |
| Dissertations/Theses -… | 2 |
| Reports - Evaluative | 2 |
| Non-Print Media | 1 |
| Reference Materials - General | 1 |
| Tests/Questionnaires | 1 |
Education Level
| Higher Education | 5 |
| Postsecondary Education | 5 |
| Kindergarten | 2 |
| Early Childhood Education | 1 |
| Elementary Education | 1 |
| High Schools | 1 |
| Primary Education | 1 |
| Secondary Education | 1 |
| Two Year Colleges | 1 |
Location
| Pennsylvania | 2 |
| California | 1 |
| Mississippi | 1 |
Assessments and Surveys
| SAT (College Admission Test) | 4 |
| COMPASS (Computer Assisted… | 1 |
Puranik, Cynthia S.; Patchan, Melissa M.; Sears, Mary M.; McMaster, Kristen L. – Assessment for Effective Intervention, 2017
Curriculum-based measures (CBMs) are necessary for educators to quickly assess student skill levels and monitor progress. This study examined the use of the alphabet writing fluency task, a CBM of writing, to assess handwriting fluency--that is, how well children access, retrieve, and write letter forms automatically. In the current study, the…
Descriptors: Kindergarten, Alphabets, Writing Skills, Time on Task
Puranik, Cynthia S.; Patchan, Melissa M.; Sears, Mary M.; McMaster, Kristen L. – Grantee Submission, 2017
Curriculum-based measures (CBMs) are necessary for educators to quickly assess student skill levels and monitor progress. This study examined the use of the alphabet writing fluency task, a CBM of writing, to assess handwriting fluency--that is, how well children access, retrieve, and write letter forms automatically. In the current study, the…
Descriptors: Kindergarten, Alphabets, Writing Skills, Time on Task
Platt, Sara A. – ProQuest LLC, 2017
The adoption of Common Core State Standards by many states prompted the development of new standardized writing assessments. A limited number of studies investigated the predictive ability of curriculum-based measurements (CBMs) as related to state assessments in writing, and none have analyzed the Mississippi Assessment Program (MAP) for writing.…
Descriptors: Writing Tests, Standardized Tests, Curriculum Based Assessment, Common Core State Standards
Verbout, Mary F. – ProQuest LLC, 2013
Multiple-choice tests of punctuation and usage are used throughout the United States to assess the writing skills of new community college students in order to place them in either a basic writing course or first-year composition. To determine whether the COMPASS Writing Test (CWT) is a valid placement measure at a community college, student test…
Descriptors: Predictive Validity, Multiple Choice Tests, Student Placement, Community Colleges
Shaw, Emily; Mattern, Krista; Patterson, Brian – College Board, 2009
Presented at the national conference of the American Educational Research Association (AERA). This study examined whether there are distinct differences in demographic characteristics, high school GPA (HSGPA), first-year college performance, and second-year college retention rates among students with discrepant critical reading and writing (CR-W) performance.
Descriptors: Critical Reading, Reading Achievement, Writing Achievement, Scores
James, Cindy L. – Assessing Writing, 2006
How do scores from writing samples generated by computerized essay scorers compare to those generated by "untrained" human scorers, and what combination of scores, if any, is more accurate at placing students in composition courses? This study endeavored to answer this two-part question by evaluating the correspondence between writing sample…
Descriptors: Writing (Composition), Predictive Validity, Scoring, Validity
Breland, Hunter M.; Kubota, Melvin Y.; Bonner, Marilyn W. – College Entrance Examination Board, 1999
This study examined the SAT® II: Writing Subject Test as a predictor of writing performance in college English courses. Writing performance was based on eight writing samples submitted as part of regular course work by students in eight colleges. The samples consisted of drafts and final papers submitted in response to four take-home writing…
Descriptors: College Entrance Examinations, Writing Tests, College English, Predictive Validity
Norris, Dwayne; Oppler, Scott; Kuang, Daniel; Day, Rachel; Adams, Kimberly – College Board, 2006
This study assessed the predictive and incremental validity of a prototype version of the forthcoming SAT® writing section that was administered to a sample of incoming students at 13 colleges and universities. For these participants, SAT scores, high school GPA, and first-year grades also were obtained. Using these data, analyses were conducted…
Descriptors: College Entrance Examinations, Writing Tests, Predictive Validity, Test Validity
Isonio, Steven – 1991
In May 1991, Golden West College (California) conducted a validation study of the English portion of the Assessment and Placement Services for Community Colleges (APS), followed by a predictive validity study in July 1991. The initial study was designed to aid in the implementation of the new test at GWC by comparing data on APS use at other…
Descriptors: College English, Community Colleges, Correlation, Grades (Scholastic)
Kobrin, Jennifer L.; Camara, Wayne J.; Milewski, Glenn B. – College Entrance Examination Board, 2002
This study examines the relative utility and predictive validity of the SAT I and SAT II for various subgroups in both California and the nation. The effect of eliminating the SAT I on the test impact and on the over- and under-prediction of various gender and racial/ethnic subgroups is examined. Two statistical adjustments and tables are appended.
Descriptors: College Entrance Examinations, Research Reports, Evaluation Utilization, College Admission
