Lieven, Elena; Ferry, Alissa; Theakston, Anna; Twomey, Katherine E. – First Language, 2020
During language acquisition children generalise at multiple layers of granularity. Ambridge argues that abstraction-based accounts suffer from lumping (over-general abstractions) or splitting (over-precise abstractions). Ambridge argues that the only way to overcome this conundrum is in a purely exemplar/analogy-based system in which…
Descriptors: Language Acquisition, Children, Generalization, Abstract Reasoning
McClelland, James L. – First Language, 2020
Humans are sensitive to the properties of individual items, and exemplar models are useful for capturing this sensitivity. I am a proponent of an extension of exemplar-based architectures that I briefly describe. However, exemplar models are very shallow architectures in which it is necessary to stipulate a set of primitive elements that make up…
Descriptors: Models, Language Processing, Artificial Intelligence, Language Usage
Adger, David – First Language, 2020
The syntactic behaviour of human beings cannot be explained by analogical generalization on the basis of concrete exemplars: analogies in surface form are insufficient to account for human grammatical knowledge, because they fail to hold in situations where they should, and fail to extend in situations where they need to. [For Ben Ambridge's…
Descriptors: Syntax, Figurative Language, Models, Generalization
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
Rose, Yvan – First Language, 2020
Ambridge's proposal cannot account for the most basic observations about phonological patterns in human languages. Outside of the earliest stages of phonological production by toddlers, the phonological systems of speakers/learners exhibit internal behaviours that point to the representation and processing of inter-related units ranging in size…
Descriptors: Phonology, Language Patterns, Toddlers, Language Processing
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Chandler, Steve – First Language, 2020
Ambridge reviews and augments an impressive body of research demonstrating both the advantages and the necessity of an exemplar-based model of knowledge of one's language. He cites three computational models that have been applied successfully to issues of phonology and morphology. Focusing on Ambridge's discussion of sentence-level constructions,…
Descriptors: Models, Figurative Language, Language Processing, Language Acquisition
Hou, Lynn; Morford, Jill P. – First Language, 2020
The visual-manual modality of sign languages renders them a unique test case for language acquisition and processing theories. In this commentary the authors describe evidence from signed languages, and ask whether it is consistent with Ambridge's proposal. The evidence includes recent research on collocations in American Sign Language that reveal…
Descriptors: Sign Language, Phrase Structure, American Sign Language, Syntax
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Knabe, Melina L.; Vlach, Haley A. – First Language, 2020
Ambridge argues that there is widespread agreement among child language researchers that learners store linguistic abstractions. In this commentary the authors first argue that this assumption is incorrect; anti-representationalist/exemplar views are pervasive in theories of child language. Next, the authors outline what has been learned from this…
Descriptors: Child Language, Children, Language Acquisition, Models
Ambridge, Ben – First Language, 2020
The goal of this article is to make the case for a radical exemplar account of child language acquisition, under which unwitnessed forms are produced and comprehended by on-the-fly analogy across multiple stored exemplars, weighted by their degree of similarity to the target with regard to the task at hand. Across the domains of (1) word meanings,…
Descriptors: Language Acquisition, Morphology (Languages), Phonetics, Phonology