Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 5
Since 2016 (last 10 years) | 7
Since 2006 (last 20 years) | 8
Descriptor
Predictive Validity | 9
Writing Evaluation | 9
Writing Tests | 9
Curriculum Based Assessment | 5
Elementary School Students | 5
Scores | 5
Writing Skills | 4
Essays | 3
Accuracy | 2
African American Students | 2
Alphabets | 2
Source
Grantee Submission | 4
Assessment for Effective Intervention | 1
Assessment in Education: Principles, Policy & Practice | 1
Community College Review | 1
Journal of Special Education | 1
School Psychology | 1
Publication Type
Reports - Research | 8
Journal Articles | 6
Reports - Evaluative | 1
Education Level
Elementary Education | 5
Grade 4 | 2
Grade 7 | 2
Intermediate Grades | 2
Junior High Schools | 2
Kindergarten | 2
Middle Schools | 2
Secondary Education | 2
Early Childhood Education | 1
Grade 8 | 1
Primary Education | 1
Location
Pennsylvania | 2
Texas | 2
New Jersey | 1
Laws, Policies, & Programs
Assessments and Surveys
What Works Clearinghouse Rating
Implications of Bias in Automated Writing Quality Scores for Fair and Equitable Assessment Decisions
Matta, Michael; Mercer, Sterett H.; Keller-Margulis, Milena A. – School Psychology, 2023
Recent advances in automated writing evaluation have enabled educators to use automated writing quality scores to improve assessment feasibility. However, there has been limited investigation of bias for automated writing quality scores with students from diverse racial or ethnic backgrounds. The use of biased scores could contribute to…
Descriptors: Bias, Automation, Writing Evaluation, Scoring
Implications of Bias in Automated Writing Quality Scores for Fair and Equitable Assessment Decisions
Michael Matta; Sterett H. Mercer; Milena A. Keller-Margulis – Grantee Submission, 2023
Recent advances in automated writing evaluation have enabled educators to use automated writing quality scores to improve assessment feasibility. However, there has been limited investigation of bias for automated writing quality scores with students from diverse racial or ethnic backgrounds. The use of biased scores could contribute to…
Descriptors: Bias, Automation, Writing Evaluation, Scoring
Michael Matta; Sterett H. Mercer; Milena A. Keller-Margulis – Grantee Submission, 2022
Written expression curriculum-based measurement (WE-CBM) is a formative assessment approach for screening and progress monitoring. To extend evaluation of WE-CBM, we compared hand-calculated and automated scoring approaches in relation to the number of screening samples needed per student for valid scores, the long-term predictive validity and…
Descriptors: Writing Evaluation, Writing Tests, Predictive Validity, Formative Evaluation
Michael Matta; Sterett H. Mercer; Milena A. Keller-Margulis – Assessment in Education: Principles, Policy & Practice, 2022
Written expression curriculum-based measurement (WE-CBM) is a formative assessment approach for screening and progress monitoring. To extend evaluation of WE-CBM, we compared hand-calculated and automated scoring approaches in relation to the number of screening samples needed per student for valid scores, the long-term predictive validity and…
Descriptors: Writing Evaluation, Writing Tests, Predictive Validity, Formative Evaluation
Gary A. Troia; Frank R. Lawrence; Julie S. Brehmer; Kaitlin Glause; Heather L. Reichmuth – Grantee Submission, 2023
Much of the research that has examined the writing knowledge of school-age students has relied on interviews to ascertain this information, which is problematic because interviews may underestimate breadth and depth of writing knowledge, require lengthy interactions with participants, and do not permit a direct evaluation of a prescribed array of…
Descriptors: Writing Tests, Writing Evaluation, Knowledge Level, Elementary School Students
Puranik, Cynthia S.; Patchan, Melissa M.; Sears, Mary M.; McMaster, Kristen L. – Assessment for Effective Intervention, 2017
Curriculum-based measures (CBMs) are necessary for educators to quickly assess student skill levels and monitor progress. This study examined the use of the alphabet writing fluency task, a CBM of writing, to assess handwriting fluency--that is, how well children access, retrieve, and write letter forms automatically. In the current study, the…
Descriptors: Kindergarten, Alphabets, Writing Skills, Time on Task
Puranik, Cynthia S.; Patchan, Melissa M.; Sears, Mary M.; McMaster, Kristen L. – Grantee Submission, 2017
Curriculum-based measures (CBMs) are necessary for educators to quickly assess student skill levels and monitor progress. This study examined the use of the alphabet writing fluency task, a CBM of writing, to assess handwriting fluency--that is, how well children access, retrieve, and write letter forms automatically. In the current study, the…
Descriptors: Kindergarten, Alphabets, Writing Skills, Time on Task
Amato, Janelle M.; Watkins, Marley W. – Journal of Special Education, 2011
Curriculum-based measurement (CBM) is an alternative to traditional assessment techniques. Technical work has begun to identify CBM writing indices that are psychometrically sound for monitoring older students' writing proficiency. This study examined the predictive validity of CBM writing indices in a sample of 447 eighth-grade students.…
Descriptors: Grade 8, Writing Evaluation, Curriculum Based Assessment, Punctuation
Bers, Trudy H.; Smith, Kerry E. – Community College Review, 1990
Describes a study of the validity and reliability of a writing skills assessment test taken by 4,284 2-year college students in 1986-87. Assesses interrater reliability, influences of nonperformance factors (e.g., gender, native language, and form of test), predictive validity of test for future performance, and implications of findings. (DMM)
Descriptors: Basic Writing, Community Colleges, High Risk Students, Predictive Validity