Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 3
Since 2006 (last 20 years): 3
Author
Braude, Eric John: 1
Forcht, Emily R.: 1
Nelson, Peter M.: 1
Parker, David C.: 1
Thompson, Meredith Myra: 1
Van Norman, Ethan R.: 1
Zaslofsky, Anne F.: 1
Publication Type
Journal Articles: 3
Reports - Research: 3
Education Level
Elementary Education: 3
Grade 5: 3
Grade 6: 3
Intermediate Grades: 3
Middle Schools: 3
Grade 4: 2
Early Childhood Education: 1
Grade 3: 1
Grade 7: 1
Grade 8: 1
Higher Education: 1
Location
Massachusetts: 1
Minnesota: 1
Assessments and Surveys
ACT Assessment: 1
Flesch Kincaid Grade Level…: 1
Flesch Reading Ease Formula: 1
Minnesota Comprehensive…: 1
National Assessment of…: 1
SAT (College Admission Test): 1
Van Norman, Ethan R.; Forcht, Emily R. – Assessment for Effective Intervention, 2023
This study explored the validity of growth on two computer adaptive tests, Star Reading and Star Math, in explaining performance on an end-of-year achievement test for a sample of students in Grades 3 through 6. Results from quantile regression analyses indicate that growth on Star Reading explained a statistically significant amount of variance…
Descriptors: Test Validity, Computer Assisted Testing, Adaptive Testing, Grade Prediction
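The analysis described in this abstract relates within-year growth on a computer adaptive test to end-of-year achievement using quantile regression. As a rough illustration of that kind of model, the sketch below fits conditional-quantile regressions with statsmodels; the variable names (fall_score, star_growth, eoy_score) and the simulated data are hypothetical stand-ins, not the study's data or exact specification.

    # Minimal sketch of a quantile regression of end-of-year achievement on
    # fall CAT score and fall-to-spring CAT growth. Hypothetical data only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    fall_score = rng.normal(500, 50, n)    # fall CAT scaled score (simulated)
    star_growth = rng.normal(40, 15, n)    # fall-to-spring growth (simulated)
    eoy_score = 0.8 * fall_score + 1.2 * star_growth + rng.normal(0, 25, n)

    X = sm.add_constant(pd.DataFrame({"fall_score": fall_score,
                                      "star_growth": star_growth}))

    # Fit the same predictors at several conditional quantiles of achievement.
    for q in (0.25, 0.50, 0.75):
        res = sm.QuantReg(eoy_score, X).fit(q=q)
        print(f"q={q}: growth coefficient = {res.params['star_growth']:.2f}")

Estimating the model at several quantiles is the usual reason to prefer quantile regression over ordinary least squares here: it lets the growth coefficient differ across the achievement distribution rather than assuming one average effect.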
Nelson, Peter M.; Parker, David C.; Zaslofsky, Anne F. – Assessment for Effective Intervention, 2016
The purpose of the current study was to evaluate the importance of growth in math fact skills within the context of overall math proficiency. Data for 1,493 elementary and middle school students were included for analysis. Regression models were fit to examine the relative value of math fact fluency growth, prior state test performance, and a fall…
Descriptors: Mathematics, Mathematics Instruction, Mathematics Skills, Mathematics Achievement
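This abstract describes regression models that weigh math fact fluency growth against prior state test performance and a fall screening score. One common way to frame "relative value" is to compare nested models and inspect the change in R-squared; the sketch below shows that pattern with statsmodels. All variable names and the simulated data are hypothetical, and this is not the study's actual model specification.

    # Sketch: incremental variance explained by fluency growth beyond prior
    # state test scores and a fall screener. Hypothetical data only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 1000
    prior_state = rng.normal(0, 1, n)        # prior-year state test (z-score)
    fall_screen = 0.6 * prior_state + rng.normal(0, 0.8, n)
    fluency_growth = rng.normal(0, 1, n)     # fall-to-winter fact fluency growth
    outcome = (0.7 * prior_state + 0.2 * fall_screen
               + 0.15 * fluency_growth + rng.normal(0, 0.6, n))

    data = pd.DataFrame({"prior_state": prior_state, "fall_screen": fall_screen,
                         "fluency_growth": fluency_growth, "outcome": outcome})

    base = sm.OLS(data["outcome"],
                  sm.add_constant(data[["prior_state", "fall_screen"]])).fit()
    full = sm.OLS(data["outcome"],
                  sm.add_constant(data[["prior_state", "fall_screen",
                                        "fluency_growth"]])).fit()

    # The change in R-squared is the unique variance tied to fluency growth.
    print(f"R2 base = {base.rsquared:.3f}, R2 full = {full.rsquared:.3f}, "
          f"delta = {full.rsquared - base.rsquared:.3f}")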
Thompson, Meredith Myra; Braude, Eric John – Journal of Educational Computing Research, 2016
The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…
Descriptors: Computer Assisted Testing, Teaching Methods, Online Courses, Critical Thinking
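The truncated abstract says Knowla scores students automatically as they assemble a set of items, but it does not describe the scoring algorithm. Purely as a generic illustration of automatically scoring an assembly or ordering response, the sketch below compares a student's ordering to an expert key with Kendall's tau from scipy; the item names and the rank-correlation scoring rule are assumptions for illustration, not Knowla's documented method.

    # Generic illustration of auto-scoring an ordering task: compare the
    # student's arrangement to an expert key with a rank correlation.
    # Item names and the scoring rule are hypothetical, not Knowla's method.
    from scipy.stats import kendalltau

    expert_key = ["claim", "evidence", "warrant", "backing", "conclusion"]
    student_response = ["claim", "warrant", "evidence", "backing", "conclusion"]

    # Convert each ordering to the rank position of every item in the key.
    positions_key = [expert_key.index(item) for item in expert_key]
    positions_student = [expert_key.index(item) for item in student_response]

    tau, _ = kendalltau(positions_key, positions_student)
    score = (tau + 1) / 2    # rescale from [-1, 1] to a [0, 1] score
    print(f"agreement with key: tau={tau:.2f}, scaled score={score:.2f}")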