Showing 1 to 15 of 20 results
Peer reviewed
Kuchler, Kirstin; Finestack, Lizbeth – Teaching and Learning in Communication Sciences & Disorders, 2022
The use of multiple-choice testing is common at all levels of education. This study examined one type of multiple-choice testing: the Immediate Feedback Assessment Technique® (IF-AT®), which uses an answer-until-correct testing format. More than 300 undergraduate students in a speech-language-hearing sciences course used the IF-AT® to take…
Descriptors: Multiple Choice Tests, Feedback (Response), Student Evaluation, Evaluation Methods
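The answer-until-correct format the IF-AT implements is easy to illustrate with a partial-credit scoring rule: full credit for a correct first choice and progressively less for each additional attempt. A minimal Python sketch under that assumption (the actual IF-AT credit weights are instructor-chosen and are not given in the abstract):

```python
def answer_until_correct_score(attempts_to_correct: int,
                               credit_schedule=(1.0, 0.5, 0.25, 0.0)) -> float:
    """Partial credit for one item under an answer-until-correct format.

    attempts_to_correct: 1 means the first choice was correct, 2 the second,
    and so on. The credit schedule is an illustrative assumption, not the
    published IF-AT weights.
    """
    if attempts_to_correct < 1:
        raise ValueError("attempts_to_correct must be at least 1")
    index = min(attempts_to_correct - 1, len(credit_schedule) - 1)
    return credit_schedule[index]

# A student who scratches off the key on the second try on a 4-option item:
print(answer_until_correct_score(2))  # 0.5
```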
Tomkowicz, Joanna; Kim, Dong-In; Wan, Ping – Online Submission, 2022
In this study we evaluated the stability of item parameters and student scores, using the pre-equated (pre-pandemic) parameters from Spring 2019 and post-equated (post-pandemic) parameters from Spring 2021 in two calibration and equating designs related to item parameter treatment: re-estimating all anchor parameters (Design 1) and holding the…
Descriptors: Equated Scores, Test Items, Evaluation Methods, Pandemics
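Design 1 re-estimates the anchor item parameters, which then have to be placed back on the reference scale before scores can be compared. A minimal sketch of one standard way to do that, the mean-sigma transformation of anchor difficulties (the study's actual calibration model and equating software are not named in the snippet, so this is illustrative only):

```python
import statistics

def mean_sigma_constants(old_anchor_b, new_anchor_b):
    """Slope A and intercept B mapping the new scale onto the old one,
    so that b_old ~= A * b_new + B (mean-sigma method)."""
    A = statistics.stdev(old_anchor_b) / statistics.stdev(new_anchor_b)
    B = statistics.mean(old_anchor_b) - A * statistics.mean(new_anchor_b)
    return A, B

# Hypothetical anchor difficulties from the pre- and post-pandemic calibrations.
old_b = [-1.2, -0.4, 0.1, 0.8, 1.5]
new_b = [-1.0, -0.3, 0.2, 0.9, 1.7]
A, B = mean_sigma_constants(old_b, new_b)
print(f"A = {A:.3f}, B = {B:.3f}")
```

A fixed-anchor design, by contrast, skips this step: holding the anchor parameters at their prior values keeps the new calibration on the old scale by construction.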
Peer reviewed
Ben Seipel; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison; Patrick C. Kennedy – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment) identifies students who struggle with comprehension and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
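A first-pass screen for mode effects is the standardized mean difference between the paper and online score distributions; values near zero argue against a statistical adjustment. A minimal sketch with hypothetical scores (ACT's operational comparability analyses are more extensive than this):

```python
import statistics

def cohens_d(paper_scores, online_scores):
    """Standardized mean difference (pooled-SD Cohen's d) between modes."""
    n1, n2 = len(paper_scores), len(online_scores)
    v1 = statistics.variance(paper_scores)
    v2 = statistics.variance(online_scores)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(paper_scores) - statistics.mean(online_scores)) / pooled_sd

paper = [20, 22, 25, 19, 23, 21]
online = [21, 22, 24, 18, 22, 20]
print(f"d = {cohens_d(paper, online):.3f}")  # near zero: little evidence of a mode effect
```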
Alonzo, Julie; Anderson, Daniel – Behavioral Research and Teaching, 2018
In response to a request for additional analyses, in particular reporting confidence intervals around the results, we re-analyzed the data from prior studies. This supplementary report presents the results of the additional analyses addressing classification accuracy, reliability, and criterion-related validity evidence. For ease of reference, we…
Descriptors: Curriculum Based Assessment, Computation, Statistical Analysis, Classification
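One way to report a confidence interval around classification accuracy, as the supplementary analyses do, is a normal-approximation interval for the proportion of consistent classifications (the report's exact interval method is not stated in the snippet, so this is an assumption):

```python
import math

def accuracy_ci(consistent: int, total: int, z: float = 1.96):
    """Wald 95% confidence interval for a classification-accuracy proportion."""
    p = consistent / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return max(0.0, p - half_width), min(1.0, p + half_width)

# Hypothetical: 170 of 200 students classified consistently with the criterion.
low, high = accuracy_ci(170, 200)
print(f"accuracy = {170/200:.3f}, 95% CI = ({low:.3f}, {high:.3f})")
```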
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Zoblotsky, Todd; Bertz, Christine; Gallagher, Brenda; Alberg, Marty – Center for Research in Educational Policy (CREP), 2017
In August 2010, the Smithsonian Science Education Center (SSEC) received a grant of more than $25 million from the U.S. Department of Education's Investing in Innovation (i3) program for a five-year study to validate its Leadership Assistance for Science Education Reform (LASER) model in three very diverse regions of the United States: rural North…
Descriptors: Multiple Choice Tests, Science Tests, Elementary School Students, Middle School Students
Peer reviewed
Chen, Haiwen H.; von Davier, Matthias; Yamamoto, Kentaro; Kong, Nan – ETS Research Report Series, 2015
One major issue with large-scale assessments is that respondents may give no response to many items, resulting in less accurate estimation of both assessed abilities and item parameters. This report studies how item types affect item-level nonresponse rates and how different methods of treating item-level nonresponses have an…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
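The practical stakes of nonresponse treatment are easy to see with two common scoring conventions: omitted items scored as incorrect versus omitted items treated as not administered. A minimal sketch with hypothetical responses (the report's own treatments are model-based and not reproduced here):

```python
# 1 = correct, 0 = incorrect, None = omitted / not reached.
responses = [1, 0, None, 1, 1, None, 0, 1]

answered = [r for r in responses if r is not None]

# Treatment A: omits count as incorrect (denominator includes all items).
p_as_incorrect = sum(answered) / len(responses)

# Treatment B: omits treated as not administered (denominator shrinks).
p_not_administered = sum(answered) / len(answered)

print(f"omits as incorrect:     {p_as_incorrect:.3f}")      # 0.500
print(f"omits not administered: {p_not_administered:.3f}")  # 0.667
```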
Todd Zoblotsky; Christine Bertz; Brenda Gallagher; Marty Alberg – Center for Research in Educational Policy (CREP), 2016
In August 2010, the Smithsonian Science Education Center (SSEC) received a grant of more than $25 million from the U.S. Department of Education's Investing in Innovation (i3) program for a five-year study to validate its Leadership Assistance for Science Education Reform (LASER) model in three very diverse regions of the United States: rural North…
Descriptors: Multiple Choice Tests, Science Tests, Elementary School Students, Middle School Students
Verbic, Srdjan; Tomic, Boris; Kartal, Vesna – Online Submission, 2010
On-line trial testing of fourth-grade students was an exploratory study conducted as part of the project "Developing annual test of students' achievement in Nature & Society," carried out by the Institute for Education Quality and Evaluation. The main aims of the study were to explore the possibilities for on-line testing at the national level in…
Descriptors: Foreign Countries, Item Response Theory, High School Students, Computer Assisted Testing
Saez, Leilani; Park, Bitnara; Nese, Joseph F. T.; Jamgochian, Elisa; Lai, Cheng-Fei; Anderson, Daniel; Kamata, Akihito; Alonzo, Julie; Tindal, Gerald – Behavioral Research and Teaching, 2010
In this series of studies, we investigated the technical adequacy of three curriculum-based measures used as benchmarks and for monitoring progress in three critical reading-related skills: fluency, reading comprehension, and vocabulary. In particular, we examined the following easyCBM measurement across grades 3-7 at fall, winter, and spring…
Descriptors: Elementary School Students, Middle School Students, Vocabulary, Reading Comprehension
Samejima, Fumiko; Trestman, Robert L. – 1980
This report initiates the first step of the data analysis with respect to the eventual application of various new methods in latent trait theory. The data are a set of approximately 500 item responses from each of 7,439 examinees to the Iowa Tests of Basic Skills, Form 6, at one of three difficulty levels, which correspond to the ages of 11, 12 and…
Descriptors: Achievement Tests, Data Analysis, Goodness of Fit, Junior High Schools
Peer reviewed
Handwerk, Phil – ETS Research Report Series, 2007
Online high schools are growing significantly in number, popularity, and function. However, little empirical data has been published about the effectiveness of these institutions. This research examined the frequency of group work and extended essay writing among online Advanced Placement Program® (AP®) students, and how these tasks may have…
Descriptors: Advanced Placement Programs, Advanced Placement, Computer Assisted Testing, Models
Obiekwe, Jerry C. – 2001
Palmore's Facts on Aging Quiz (FAQ) (E. Palmore, 1977) is an instrument used to educate, measure learning, test knowledge, measure attitudes toward aging, and support research. A comparative analysis was performed between the FAQ I and its multiple-choice version and the FAQ II and its multiple-choice version in terms of their item…
Descriptors: Aging (Individuals), Attitude Measures, Educational Research, Guessing (Tests)
Richichi, Rudolph V. – 1996
An item analysis study was conducted for two multiple-choice tests of introductory psychology created from a publisher's test bank. A 47-item multiple-choice test was administered to 247 students, and 433 students took a different 42-item multiple-choice test. Participants were from a large northeastern suburban community college. The difficulty…
Descriptors: College Students, Community Colleges, Difficulty Level, Higher Education
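An item analysis of this kind typically reports each item's difficulty (proportion correct) and discrimination (its correlation with the rest of the test). A minimal sketch over a 0/1 scored response matrix, assuming Python 3.10+ for statistics.correlation (the study's own computations are not shown in the abstract):

```python
import statistics

def item_statistics(score_matrix, item_index):
    """Difficulty (p-value) and corrected point-biserial for one item.

    score_matrix: one list of 0/1 item scores per student.
    """
    item = [student[item_index] for student in score_matrix]
    # Correlate the item with the total score excluding the item itself.
    rest = [sum(student) - student[item_index] for student in score_matrix]
    difficulty = statistics.mean(item)
    discrimination = statistics.correlation(item, rest)
    return difficulty, discrimination

scores = [  # hypothetical 5 students x 4 items
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
p, r_pb = item_statistics(scores, item_index=0)
print(f"difficulty = {p:.2f}, corrected point-biserial = {r_pb:.2f}")
```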