Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 0
Since 2016 (last 10 years) | 10
Since 2006 (last 20 years) | 10
Descriptor
Computational Linguistics | 10
Figurative Language | 10
Language Acquisition | 10
Abstract Reasoning | 9
Language Processing | 9
Models | 9
Linguistic Theory | 8
Syntax | 6
Generalization | 4
Phonology | 4
Psycholinguistics | 4
Source
First Language | 10
Author
Ambridge, Ben | 2
Adger, David | 1
Chandler, Steve | 1
Ferry, Alissa | 1
Frank, Michael C. | 1
Hartshorne, Joshua K. | 1
Hou, Lynn | 1
Kachergis, George | 1
Knabe, Melina L. | 1
Lieven, Elena | 1
Mahowald, Kyle | 1
Publication Type
Journal Articles | 10
Reports - Evaluative | 8
Opinion Papers | 7
Reports - Descriptive | 2
Lieven, Elena; Ferry, Alissa; Theakston, Anna; Twomey, Katherine E. – First Language, 2020
During language acquisition children generalise at multiple layers of granularity. Ambridge argues that abstraction-based accounts suffer from lumping (over-general abstractions) or splitting (over-precise abstractions), and that the only way to overcome this conundrum lies in a purely exemplar/analogy-based system in which…
Descriptors: Language Acquisition, Children, Generalization, Abstract Reasoning
Adger, David – First Language, 2020
The syntactic behaviour of human beings cannot be explained by analogical generalization on the basis of concrete exemplars: analogies in surface form are insufficient to account for human grammatical knowledge, because they fail to hold in situations where they should, and fail to extend in situations where they need to. [For Ben Ambridge's…
Descriptors: Syntax, Figurative Language, Models, Generalization
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
Ambridge, Ben – First Language, 2020
In this response to commentators, I agree with those who suggested that the distinction between exemplar- and abstraction-based accounts is something of a false dichotomy and therefore move to an abstractions-made-of-exemplars account under which (a) we store all the exemplars that we hear (subject to attention, decay, interference, etc.) but (b)…
Descriptors: Language Acquisition, Syntax, Computational Linguistics, Language Research
Rose, Yvan – First Language, 2020
Ambridge's proposal cannot account for the most basic observations about phonological patterns in human languages. Outside of the earliest stages of phonological production by toddlers, the phonological systems of speakers/learners exhibit internal behaviours that point to the representation and processing of inter-related units ranging in size…
Descriptors: Phonology, Language Patterns, Toddlers, Language Processing
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Chandler, Steve – First Language, 2020
Ambridge reviews and augments an impressive body of research demonstrating both the advantages and the necessity of an exemplar-based model of knowledge of one's language. He cites three computational models that have been applied successfully to issues of phonology and morphology. Focusing on Ambridge's discussion of sentence-level constructions,…
Descriptors: Models, Figurative Language, Language Processing, Language Acquisition
Hou, Lynn; Morford, Jill P. – First Language, 2020
The visual-manual modality of sign languages renders them a unique test case for language acquisition and processing theories. In this commentary the authors describe evidence from signed languages, and ask whether it is consistent with Ambridge's proposal. The evidence includes recent research on collocations in American Sign Language that reveal…
Descriptors: Sign Language, Phrase Structure, American Sign Language, Syntax
Knabe, Melina L.; Vlach, Haley A. – First Language, 2020
Ambridge argues that there is widespread agreement among child language researchers that learners store linguistic abstractions. In this commentary the authors first argue that this assumption is incorrect; anti-representationalist/exemplar views are pervasive in theories of child language. Next, the authors outline what has been learned from this…
Descriptors: Child Language, Children, Language Acquisition, Models
Ambridge, Ben – First Language, 2020
The goal of this article is to make the case for a radical exemplar account of child language acquisition, under which unwitnessed forms are produced and comprehended by on-the-fly analogy across multiple stored exemplars, weighted by their degree of similarity to the target with regard to the task at hand. Across the domains of (1) word meanings,…
Descriptors: Language Acquisition, Morphology (Languages), Phonetics, Phonology
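The target article's central mechanism, on-the-fly analogy across stored exemplars weighted by similarity to the target, can be illustrated with a small sketch. The Python code below is a minimal, hypothetical illustration in the style of a Nosofsky-type Generalized Context Model: the feature vectors, the exponential similarity function, and the toy past-tense "lexicon" are assumptions made for exposition, not taken from Ambridge's article or the commentaries.

```python
import math

# Minimal sketch of similarity-weighted exemplar analogy (GCM-style).
# All feature vectors and the toy lexicon below are illustrative assumptions.

def similarity(target, exemplar, sensitivity=2.0):
    """Exponential similarity over city-block feature distance (Shepard-style)."""
    distance = sum(abs(t - e) for t, e in zip(target, exemplar))
    return math.exp(-sensitivity * distance)

def predict(target, exemplars):
    """Score each candidate outcome by summing the similarity of the stored
    exemplars that exhibit it, then normalise the scores into a distribution."""
    scores = {}
    for features, outcome in exemplars:
        scores[outcome] = scores.get(outcome, 0.0) + similarity(target, features)
    total = sum(scores.values())
    return {outcome: weight / total for outcome, weight in scores.items()}

# Toy "lexicon": hypothetical feature vectors for verb stems, each stored with
# the past-tense pattern it was heard with.
stored_exemplars = [
    ((1.0, 0.2), "-ed"),           # regular-looking stems
    ((0.9, 0.3), "-ed"),
    ((0.1, 0.9), "vowel change"),  # irregular-looking stems (sing -> sang)
    ((0.2, 0.8), "vowel change"),
]

# An unwitnessed form is resolved by analogy across all stored exemplars:
# the prediction is graded, dominated by the most similar stored items.
print(predict((0.8, 0.4), stored_exemplars))
```

In this sketch no abstract rule is stored; generalisation to a novel item falls out of similarity weighting over whole stored exemplars, which is the behaviour the exemplar/abstraction debate in these commentaries turns on.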