Showing all 7 results
Peer reviewed
Huang, Becky H.; Bailey, Alison L.; Sass, Daniel A.; Shawn Chang, Yung-hsiang – Language Testing, 2021
Given the increasing emphasis on communicative competence in English as a foreign language (EFL) contexts and the lack of validation research on speaking assessments for adolescent EFL learners, in the current study we examined the validity of the TOEFL Junior® speaking test, a relatively new speaking assessment developed by Educational Testing…
Descriptors: Test Validity, Language Tests, English (Second Language), Second Language Learning
Peer reviewed
Kyle, Kristopher; Crossley, Scott A.; McNamara, Danielle S. – Language Testing, 2016
This study explores the construct validity of speaking tasks included in the TOEFL iBT (e.g., integrated and independent speaking tasks). Specifically, advanced natural language processing (NLP) tools, MANOVA difference statistics, and discriminant function analyses (DFA) are used to assess the degree to which and in what ways responses to these…
Descriptors: Construct Validity, Natural Language Processing, Speech Skills, Speech Acts
Peer reviewed
Sawaki, Yasuyo; Stricker, Lawrence J.; Oranje, Andreas H. – Language Testing, 2009
This construct validation study investigated the factor structure of the Test of English as a Foreign Language™ Internet-based test (TOEFL® iBT). An item-level confirmatory factor analysis was conducted for a test form completed by participants in a field study. A higher-order factor model was identified, with a higher-order general factor…
Descriptors: Speech Communication, Construct Validity, Factor Structure, Factor Analysis
Peer reviewed
Kostin, Irene; Freedle, Roy – Language Testing, 1999
A study investigated whether examinees taking the Test of English as a Foreign Language (TOEFL) attended to the text passages in the "minitalks" when answering the multiple-choice items (n=337) testing listening comprehension. Results support the construct validity of the minitalks, and also allow comparison between reading and listening…
Descriptors: Construct Validity, English (Second Language), Language Tests, Listening Comprehension
Peer reviewed
Stricker, L. J. – Language Testing, 2004
The purpose of this study was to replicate previous research on the construct validity of the paper-based version of the TOEFL and extend it to the computer-based TOEFL. Two samples of Graduate Record Examination (GRE) General Test-takers were used: native speakers of English specially recruited to take the computer-based TOEFL, and ESL…
Descriptors: Native Speakers, Construct Validity, English (Second Language), Computer Assisted Instruction
Peer reviewed
Schmitt, Norbert – Language Testing, 1999
One way of determining construct validity of vocabulary items in language tests is to interview subjects directly after taking the items to ascertain what is known about the target words in question. This approach was combined within the framework of lexical competency in a study of the behavior of lexical items on the Test of English as a Foreign…
Descriptors: Associative Learning, Construct Validity, English (Second Language), Foreign Countries
Peer reviewed
Powers, Donald E. – Language Testing, 1986
Survey responses of faculty in six graduate study fields and undergraduate English faculty identified nine listening skills that they perceived as particularly important to academic success in lecture classes. Faculty believed that non-native English speakers had greater difficulty with listening activities. Research suggestions are presented…
Descriptors: Academic Achievement, College Faculty, Construct Validity, English (Second Language)