Showing all 3 results
Peer reviewed
Park, Siwon – Journal of Pan-Pacific Association of Applied Linguistics, 2017
This paper examines how different test methods may tap different aspects of second language knowledge. It employs multiple-choice (MC) and constructed-response (CR) items to see whether they yield distinct or convergent information in the computer-delivered testing of English. In order to examine the effects of test method, a…
Descriptors: Evaluation Methods, Second Language Learning, English (Second Language), Computer Assisted Testing
Peer reviewed
Quellmalz, Edys S.; Davenport, Jodi L.; Timms, Michael J.; DeBoer, George E.; Jordan, Kevin A.; Huang, Chun-Wei; Buckley, Barbara C. – Journal of Educational Psychology, 2013
How can assessments measure complex science learning? Although traditional multiple-choice items can effectively measure declarative knowledge such as scientific facts or definitions, they are considered less well suited for providing evidence of science inquiry practices such as making observations or designing and conducting investigations.…
Descriptors: Science Education, Educational Assessment, Psychometrics, Science Tests
Peer reviewed
Sawaki, Yasuyo; Stricker, Lawrence; Oranje, Andreas – ETS Research Report Series, 2008
The present study investigated the factor structure of a field trial sample of the Test of English as a Foreign Language™ Internet-based test (TOEFL® iBT). An item-level confirmatory factor analysis (CFA) was conducted for a polychoric correlation matrix of items on a test form completed by 2,720 participants in the 2003-2004 TOEFL iBT Field…
Descriptors: Factor Structure, Computer Assisted Testing, Multitrait Multimethod Techniques, Scores