Secora, Kristen; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2020
As spatial languages, sign languages rely on spatial cognitive processes that are not involved for spoken languages. Interlocutors have different visual perspectives of the signer's hands, requiring a mental transformation for successful communication about spatial scenes. It is unknown whether visual-spatial perspective-taking (VSPT) or mental…
Descriptors: American Sign Language, Deafness, Hearing Impairments, Adults
Brozdowski, Chris; Secora, Kristen; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2019
In ASL spatial classifier expressions, the location of the hands in signing space depicts the relative position of described objects. When objects are physically present, the arrangement of the hands maps to the observed position of objects in the world (Shared Space). For non-present objects, interlocutors must perform a mental transformation to…
Descriptors: American Sign Language, Spatial Ability, Perspective Taking, Comprehension
Emmorey, Karen; Li, Chuchu; Petrich, Jennifer; Gollan, Tamar H. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2020
When spoken language (unimodal) bilinguals switch between languages, they must simultaneously inhibit 1 language and activate the other language. Because American Sign Language (ASL)-English (bimodal) bilinguals can switch into and out of code-blends (simultaneous production of a sign and a word), we can tease apart the cost of inhibition (turning…
Descriptors: Bilingualism, Code Switching (Language), Task Analysis, Second Language Learning
Sehyr, Zed Sevcikova; Giezen, Marcel R.; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2018
This study investigated the impact of language modality and age of acquisition on semantic fluency in American Sign Language (ASL) and English. Experiment 1 compared semantic fluency performance (e.g., name as many animals as possible in 1 min) for deaf native and early ASL signers and hearing monolingual English speakers. The results showed…
Descriptors: American Sign Language, English, Language Fluency, Semantics
Giezen, Marcel R.; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2016
Semantic and lexical decision tasks were used to investigate the mechanisms underlying code-blend facilitation: the finding that hearing bimodal bilinguals comprehend signs in American Sign Language (ASL) and spoken English words more quickly when they are presented together simultaneously than when each is presented alone. More robust…
Descriptors: Semantics, American Sign Language, Bilingual Education, Lexicology
Hauser, Peter C.; Paludneviciene, Raylene; Riddle, Wanda; Kurz, Kim B.; Emmorey, Karen; Contreras, Jessica – Journal of Deaf Studies and Deaf Education, 2016
The American Sign Language Comprehension Test (ASL-CT) is a 30-item multiple-choice test that measures ASL receptive skills and is administered through a website. This article describes the development and psychometric properties of the test based on a sample of 80 college students including deaf native signers, hearing native signers, deaf…
Descriptors: American Sign Language, Comprehension, Multiple Choice Tests, Receptive Language
Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H. – Journal of Deaf Studies and Deaf Education, 2013
The frequency-lag hypothesis proposes that bilinguals have slowed lexical retrieval relative to monolinguals and in their nondominant language relative to their dominant language, particularly for low-frequency words. These effects arise because bilinguals divide their language use between 2 languages and use their nondominant language less…
Descriptors: Deafness, Bilingualism, Monolingualism, Language Processing
Nicodemus, Brenda; Emmorey, Karen – Bilingualism: Language and Cognition, 2013
Spoken language (unimodal) interpreters often prefer to interpret from their non-dominant language (L2) into their native language (L1). Anecdotally, signed language (bimodal) interpreters express the opposite bias, preferring to interpret from L1 (spoken language) into L2 (signed language). We conducted a large survey study (N =…
Descriptors: Deaf Interpreting, Sign Language, Native Language, Second Languages
Emmorey, Karen; Petrich, Jennifer A. F. – Journal of Deaf Studies and Deaf Education, 2012
Two lexical decision experiments are reported that investigate whether the same segmentation strategies are used for reading printed English words and fingerspelled words (in American Sign Language). Experiment 1 revealed that both deaf and hearing readers performed better when written words were segmented with respect to an orthographically…
Descriptors: Deafness, Adults, Language Processing, Written Language
Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H. – Journal of Memory and Language, 2012
Bilinguals who are fluent in American Sign Language (ASL) and English often produce "code-blends"--simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization…
Descriptors: Speech, Language Processing, American Sign Language, Semantics
Emmorey, Karen; Xu, Jiang; Braun, Allen – Brain and Language, 2011
To identify neural regions that automatically respond to linguistically structured, but meaningless manual gestures, 14 deaf native users of American Sign Language (ASL) and 14 hearing non-signers passively viewed pseudosigns (possible but non-existent ASL signs) and non-iconic ASL signs, in addition to a fixation baseline. For the contrast…
Descriptors: Phonetics, Task Analysis, American Sign Language, Language Processing
Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Ponto, Laura L. B.; Grabowski, Thomas J. – Language and Cognitive Processes, 2011
We investigated the functional organisation of neural systems supporting language production when the primary language articulators are also used for meaningful, but nonlinguistic, expression such as pantomime. Fourteen hearing nonsigners and 10 deaf native users of American Sign Language (ASL) participated in an H₂[superscript…
Descriptors: Pantomime, Verbs, Deafness, American Sign Language
Bosworth, Rain G.; Emmorey, Karen – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2010
Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, nonarbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than…
Descriptors: Priming, Semantics, Familiarity, American Sign Language
Casey, Shannon; Emmorey, Karen; Larrabee, Heather – Bilingualism: Language and Cognition, 2012
Given that the linguistic articulators for sign language are also used to produce co-speech gesture, we examined whether one year of academic instruction in American Sign Language (ASL) impacts the rate and nature of gestures produced when speaking English. A survey study revealed that 75% of ASL learners (N = 95), but only 14% of Romance language…
Descriptors: Cognitive Processes, American Sign Language, Cartoons, Second Language Learning
Emmorey, Karen; Thompson, Robin; Colvin, Rachael – Journal of Deaf Studies and Deaf Education, 2009
An eye-tracking experiment investigated where deaf native signers (N = 9) and hearing beginning signers (N = 10) look while comprehending a short narrative and a spatial description in American Sign Language produced live by a fluent signer. Both groups fixated primarily on the signer's face (more than 80% of the time) but differed with respect to…
Descriptors: Eye Movements, American Sign Language, Native Speakers, Comprehension