Showing 1 to 15 of 140 results
Huteng Dai – ProQuest LLC, 2024
In this dissertation, I establish a research program that uses computational modeling as a testbed for theories of phonological learning. It focuses on a fundamental question: how do children acquire sound patterns from noisy, real-world data, especially in the presence of lexical exceptions that defy regular patterns? For instance,…
Descriptors: Phonology, Language Acquisition, Computational Linguistics, Linguistic Theory
Peer reviewed
Thornton, Chris – Cognitive Science, 2021
Semantic composition in language must be closely related to semantic composition in thought. But the way the two processes are explained differs considerably. Focusing primarily on propositional content, language theorists generally take semantic composition to be a truth-conditional process. Focusing more on extensional content, cognitive…
Descriptors: Semantics, Cognitive Processes, Linguistic Theory, Language Usage
Peer reviewed
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Peer reviewed
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Peer reviewed
Trott, Sean; Jones, Cameron; Chang, Tyler; Michaelov, James; Bergen, Benjamin – Cognitive Science, 2023
Humans can attribute beliefs to others. However, it is unknown to what extent this ability results from an innate biological endowment or from experience accrued through child development, particularly exposure to language describing others' mental states. We test the viability of the language exposure hypothesis by assessing whether models…
Descriptors: Models, Language Processing, Beliefs, Child Development
Peer reviewed
van Schijndel, Marten; Linzen, Tal – Cognitive Science, 2021
The disambiguation of a syntactically ambiguous sentence in favor of a less preferred parse can lead to slower reading at the disambiguation point. This phenomenon, referred to as a garden-path effect, has motivated models in which readers initially maintain only a subset of the possible parses of the sentence, and subsequently require…
Descriptors: Syntax, Ambiguity (Semantics), Reading Processes, Linguistic Theory
Byung-Doh Oh – ProQuest LLC, 2024
Decades of psycholinguistic research have shown that human sentence processing is highly incremental and predictive. This has provided evidence for expectation-based theories of sentence processing, which posit that the processing difficulty of linguistic material is modulated by its probability in context. However, these theories do not make…
Descriptors: Language Processing, Computational Linguistics, Artificial Intelligence, Computer Software
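The two entries above both invoke expectation-based sentence processing, on which a word's processing difficulty tracks its surprisal in context, -log2 P(word | preceding words). As a point of reference, here is a minimal sketch of how word-level surprisals are commonly estimated from a pretrained language model; it assumes the Hugging Face transformers package and the public gpt2 checkpoint, and the garden-path example sentence is illustrative rather than taken from either study.

```python
# A sketch under stated assumptions: the surprisal of token i is
# -log2 P(token_i | tokens_<i) under a pretrained language model.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def surprisals(sentence: str) -> list[tuple[str, float]]:
    """Return (token, surprisal in bits) for every token after the first."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # log P(next token | prefix) at every position
        log_probs = torch.log_softmax(model(ids).logits, dim=-1)
    return [
        (
            tokenizer.decode(ids[0, i]),
            -log_probs[0, i - 1, ids[0, i]].item() / math.log(2),
        )
        for i in range(1, ids.size(1))
    ]

# Classic garden-path item: the disambiguating word ("fell") should show
# elevated surprisal, mirroring the slower reading at the disambiguation
# point discussed in the van Schijndel and Linzen entry above.
for token, bits in surprisals("The horse raced past the barn fell."):
    print(f"{token!r}: {bits:.2f} bits")
```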
Mai Al-Khatib – ProQuest LLC, 2023
Linguistic meaning is generated by the mind and can be expressed in multiple languages. One might assume that texts/utterances rendered equivalent across two languages through translation generate equivalent meanings in their readers/hearers. This would follow if the meaning calculated from the linguistic input were solely objective in nature. However,…
Descriptors: Semantics, Linguistic Input, Bilingualism, Language Processing
Peer reviewed
Finley, Sara – First Language, 2020
In this commentary, I discuss why, despite the existence of gradience in phonetics and phonology, there is still a need for abstract representations. Most proponents of exemplar models assume multiple levels of abstraction, allowing for an integration of the gradient and the categorical. Ben Ambridge's dismissal of generative models such as…
Descriptors: Phonology, Phonetics, Abstract Reasoning, Linguistic Theory
Peer reviewed
Stringer, David – Second Language Research, 2021
Westergaard (2021) presents an updated account of the Linguistic Proximity Model and the micro-cue approach to the parser as an acquisition device. The property-by-property view of transfer inherent in this approach contrasts with other influential models that assume that third language (L3) acquisition involves the creation of a full copy of only…
Descriptors: Transfer of Training, Linguistic Theory, Second Language Learning, Multilingualism
Peer reviewed
Lieven, Elena; Ferry, Alissa; Theakston, Anna; Twomey, Katherine E. – First Language, 2020
During language acquisition, children generalise at multiple layers of granularity. Ambridge argues that abstraction-based accounts suffer from lumping (over-general abstractions) or splitting (over-precise abstractions). He further argues that the only way to overcome this conundrum lies in a purely exemplar/analogy-based system in which…
Descriptors: Language Acquisition, Children, Generalization, Abstract Reasoning
Peer reviewed
Logacev, Pavel – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2023
A number of studies have found evidence for the so-called "ambiguity advantage," that is, faster processing of ambiguous sentences compared with unambiguous counterparts. While several proposals regarding the mechanism underlying this phenomenon have been made, the empirical evidence to date is far from unequivocal. It is compatible…
Descriptors: Phrase Structure, Accuracy, Ambiguity (Semantics), Sentences
Peer reviewed
Jacobs, Cassandra L.; Cho, Sun-Joo; Watson, Duane G. – Cognitive Science, 2019
Syntactic priming in language production is the increased likelihood of using a recently encountered syntactic structure. In this paper, we examine two theories of why speakers can be primed: error-driven learning accounts (Bock, Dell, Chang, & Onishi, 2007; Chang, Dell, & Bock, 2006) and activation-based accounts (Pickering &…
Descriptors: Priming, Syntax, Prediction, Linguistic Theory
Peer reviewed
González Alonso, Jorge; Rothman, Jason – Second Language Research, 2021
In this commentary on Westergaard (2021), we focus on two main questions. The first, and most important, is what type of L3 data may be construed as supporting evidence--as opposed to a merely compatible outcome--for the Linguistic Proximity Model. In this regard, we highlight a number of areas in which it remains difficult to derive testable predictions…
Descriptors: Transfer of Training, Second Language Learning, Native Language, Linguistic Theory
Peer reviewed
Koring, Loes; Giblin, Iain; Thornton, Rosalind; Crain, Stephen – First Language, 2020
This response argues against the proposal that novel utterances are formed by analogy with stored exemplars that are close in meaning. Strings of words that are similar in meaning or even identical can behave very differently once inserted into different syntactic environments. Furthermore, phrases with similar meanings but different underlying…
Descriptors: Language Acquisition, Figurative Language, Syntax, Phrase Structure