Showing 1 to 15 of 123 results
Peer reviewed
Wu, Ying Choon; Müller, Horst M.; Coulson, Seana – Discourse Processes: A Multidisciplinary Journal, 2022
Multi-modal discourse comprehension requires listeners to combine information from speech and gestures. To date, little research has addressed the cognitive resources that underlie these processes. Here we used a dual-task paradigm to test the relative importance of verbal and visuospatial working memory in speech-gesture comprehension. Healthy,…
Descriptors: Short Term Memory, Comprehension, Nonverbal Communication, Speech
Peer reviewed
Tsui, Angeline Sin Mei; Byers-Heinlein, Krista; Fennell, Christopher T. – Developmental Psychology, 2019
Associative word learning, the ability to pair a concept to a word, is an essential mechanism for early language development. One common method by which researchers measure this ability is the Switch task (Werker, Cohen, Lloyd, Casasola, & Stager, 1998), wherein infants are habituated to 2 word-object pairings and then tested on their ability…
Descriptors: Associative Learning, Vocabulary Development, Language Acquisition, Infants
Peer reviewed
Gerken, LouAnn; Quam, Carolyn; Goffman, Lisa – Language Learning and Development, 2019
Beginning with the classic work of Shepard, Hovland, & Jenkins (1961), Type II visual patterns (e.g., exemplars are large white squares OR small black triangles) have held a special place in investigations of human learning. Recent research on Type II "linguistic" patterns has shown that they are relatively frequent across languages…
Descriptors: Infants, Language Patterns, Language Acquisition, Learning Processes
Peer reviewed
Stewart, Mary E.; Petrou, Alexandra M.; Ota, Mitsuhiko – Journal of Autism and Developmental Disorders, 2018
This study tested whether individuals with autism spectrum conditions (n = 23) show enhanced discrimination of acoustic differences that signal a linguistic contrast (i.e., /g/ versus /k/ as in "goat" and "coat") and whether they process such differences in a less categorical fashion as compared with 23 IQ-matched typically…
Descriptors: Auditory Perception, Speech, Adults, Autism
Peer reviewed
Havy, Mélanie; Foroud, Afra; Fais, Laurel; Werker, Janet F. – Child Development, 2017
Visual information influences speech perception in both infants and adults. It is still unknown whether lexical representations are multisensory. To address this question, we exposed 18-month-old infants (n = 32) and adults (n = 32) to new word-object pairings: Participants either heard the acoustic form of the words or saw the talking face in…
Descriptors: Infants, Vocabulary Development, Adults, Speech
Peer reviewed
Higgins, Meaghan C.; Penney, Sarah B.; Robertson, Erin K. – Journal of Psycholinguistic Research, 2017
The roles of phonological short-term memory (pSTM) and speech perception in spoken sentence comprehension were examined in an experimental design. Deficits in pSTM and speech perception were simulated through task demands while typically-developing children (N = 71) completed a sentence-picture matching task. Children performed the control,…
Descriptors: Phonology, Short Term Memory, Speech, Auditory Perception
Peer reviewed
Lee, Jiyeon; Yoshida, Masaya; Thompson, Cynthia K. – Journal of Speech, Language, and Hearing Research, 2015
Purpose: Grammatical encoding (GE) is impaired in agrammatic aphasia; however, the nature of such deficits remains unclear. We examined grammatical planning units during real-time sentence production in speakers with agrammatic aphasia and control speakers, testing two competing models of GE. We queried whether speakers with agrammatic aphasia…
Descriptors: Grammar, Aphasia, Language Impairments, Control Groups
Peer reviewed
Sohail, Juwairia; Johnson, Elizabeth K. – Language Learning and Development, 2016
Much of what we know about the development of listeners' word segmentation strategies originates from the artificial language-learning literature. However, many artificial speech streams designed to study word segmentation lack a salient cue found in all natural languages: utterance boundaries. In this study, participants listened to a…
Descriptors: Phonology, Linguistic Theory, Speech, Cues
Peer reviewed
Lowenstein, Joanna H.; Nittrouer, Susan – Journal of Speech, Language, and Hearing Research, 2015
Purpose: One task of childhood involves learning to optimally weight acoustic cues in the speech signal in order to recover phonemic categories. This study examined the extent to which spectral degradation, as associated with cochlear implants, might interfere. The 3 goals were to measure, for adults and children, (a) cue weighting with spectrally…
Descriptors: Hearing Impairments, Acoustics, Cues, Word Recognition
Peer reviewed
Brouwer, Susanne; Bradlow, Ann R. – Journal of Psycholinguistic Research, 2016
This study examined the temporal dynamics of spoken word recognition in noise and background speech. In two visual-world experiments, English participants listened to target words while looking at four pictures on the screen: a target (e.g. "candle"), an onset competitor (e.g. "candy"), a rhyme competitor (e.g.…
Descriptors: Oral Language, Word Recognition, Visual Stimuli, Task Analysis
Peer reviewed
Galle, Marcus E.; Apfelbaum, Keith S.; McMurray, Bob – Language Learning and Development, 2015
Recent work has demonstrated that the addition of multiple talkers during habituation improves 14-month-olds' performance in the switch task (Rost & McMurray, 2009). While the authors suggest that this boost in performance is due to the increase in acoustic variability (Rost & McMurray, 2010), it is also possible that there is…
Descriptors: Vocabulary Development, Infants, Acoustics, Auditory Stimuli
Peer reviewed
Richards, Susan; Goswami, Usha – Journal of Speech, Language, and Hearing Research, 2015
Purpose: We investigated whether impaired acoustic processing is a factor in developmental language disorders. The amplitude envelope of the speech signal is known to be important in language processing. We examined whether impaired perception of amplitude envelope rise time is related to impaired perception of lexical and phrasal stress in…
Descriptors: Auditory Perception, Language Processing, Language Impairments, Correlation
Peer reviewed
Yang, Seung-yun; Van Lancker Sidtis, Diana – Journal of Speech, Language, and Hearing Research, 2016
Purpose: This study investigates the effects of left- and right-hemisphere damage (LHD and RHD) on the production of idiomatic or literal expressions utilizing acoustic analyses. Method: Twenty-one native speakers of Korean with LHD or RHD and in a healthy control (HC) group produced 6 ditropically ambiguous (idiomatic or literal) sentences in 2…
Descriptors: Korean, Figurative Language, Brain Hemisphere Functions, Acoustics
Peer reviewed
Rudner, Mary; Mishra, Sushmit; Stenfelt, Stefan; Lunner, Thomas; Rönnberg, Jerker – Journal of Speech, Language, and Hearing Research, 2016
Purpose: Seeing the talker's face improves speech understanding in noise, possibly releasing resources for cognitive processing. We investigated whether it improves free recall of spoken two-digit numbers. Method: Twenty younger adults with normal hearing and 24 older adults with hearing loss listened to and subsequently recalled lists of 13…
Descriptors: Hearing Impairments, Recall (Psychology), Older Adults, Young Adults
Peer reviewed
Mayr, Ulrich; Kleffner-Canucci, Killian; Kikumoto, Atsushi; Redford, Melissa A. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2014
It is almost a truism that language aids serial-order control through self-cuing of upcoming sequential elements. We measured speech onset latencies as subjects performed hierarchically organized task sequences while "thinking aloud" each task label. Surprisingly, speech onset latencies and response times (RTs) were highly synchronized,…
Descriptors: Language Role, Executive Function, Task Analysis, College Students