Demuth, Katherine; Johnson, Mark – First Language, 2020
Exemplar-based learning requires: (1) a segmentation procedure for identifying the units of past experiences that a present experience can be compared to, and (2) a similarity function for comparing these past experiences to the present experience. This article argues that for a learner to learn a language these two mechanisms will require…
Descriptors: Comparative Analysis, Language Acquisition, Linguistic Theory, Grammar
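The abstract names two mechanisms — a segmentation procedure and a similarity function — without specifying either. A minimal sketch of how such an exemplar-based learner could be wired together is below; the class and function names, word-level segmentation, and sequence-overlap similarity are all illustrative assumptions, not details from the article.

```python
from difflib import SequenceMatcher

# Hypothetical sketch of the two mechanisms the abstract names:
# (1) a segmentation procedure and (2) a similarity function.
# All choices here (whitespace segmentation, sequence overlap) are
# stand-ins, not claims about the article's proposal.

def segment(utterance):
    """Segment an experience into comparable units (here: whitespace words)."""
    return utterance.split()

def similarity(past_units, present_units):
    """Compare a stored exemplar to the present experience (sequence overlap)."""
    return SequenceMatcher(None, past_units, present_units).ratio()

class ExemplarLearner:
    def __init__(self):
        self.memory = []  # raw stored exemplars; no abstractions

    def store(self, utterance):
        self.memory.append(segment(utterance))

    def most_similar(self, utterance):
        present = segment(utterance)
        return max(self.memory, key=lambda past: similarity(past, present))

learner = ExemplarLearner()
learner.store("the dog chased the cat")
learner.store("a bird sang loudly")
learner.most_similar("the dog chased a ball")
# → ['the', 'dog', 'chased', 'the', 'cat']
```

Even this toy version makes the article's point concrete: the learner's behaviour depends entirely on how `segment` carves up experience and how `similarity` weighs the comparison, so both mechanisms must be specified before the account is testable.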
McClelland, James L. – First Language, 2020
Humans are sensitive to the properties of individual items, and exemplar models are useful for capturing this sensitivity. I am a proponent of an extension of exemplar-based architectures that I briefly describe. However, exemplar models are very shallow architectures in which it is necessary to stipulate a set of primitive elements that make up…
Descriptors: Models, Language Processing, Artificial Intelligence, Language Usage
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
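The "on-the-fly calculation" the commentary refers to — producing novel forms by analogy to stored exemplars rather than by applying a stored rule — can be illustrated with a toy past-tense example. Everything below (the exemplar list, the string-overlap similarity, the suffix-copying step) is a hypothetical stand-in, not the model under discussion.

```python
from difflib import SequenceMatcher

# Toy sketch of on-the-fly analogy over stored exemplars: a novel form
# is computed from the single most similar stored (stem, past) pair,
# with no stored rule or abstraction. All details are illustrative.
exemplars = [("walk", "walked"), ("jump", "jumped"), ("sing", "sang")]

def similarity(a, b):
    # String overlap as a stand-in for a psychologically real metric.
    return SequenceMatcher(None, a, b).ratio()

def past_by_analogy(novel_stem):
    stem, past = max(exemplars, key=lambda pair: similarity(pair[0], novel_stem))
    if past.startswith(stem):            # regular neighbour: copy its ending
        return novel_stem + past[len(stem):]
    return past                          # irregular neighbour: echo stored form

past_by_analogy("talk")  # analogises to "walk"/"walked" → "talked"
```

The generalisation here is computed at retrieval time from raw exemplars, which is the parsimony claim the commentary attributes to Ambridge; whether such calculation can in fact "do everything abstracting models can" is exactly what the commentary disputes.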
Knabe, Melina L.; Vlach, Haley A. – First Language, 2020
Ambridge argues that there is widespread agreement among child language researchers that learners store linguistic abstractions. In this commentary the authors first argue that this assumption is incorrect; anti-representationalist/exemplar views are pervasive in theories of child language. Next, the authors outline what has been learned from this…
Descriptors: Child Language, Children, Language Acquisition, Models
