Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
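The abstract asks whether networks like word2vec or transformers "count" as exemplar models because they have enough parametric capacity to store training items. A minimal sketch of the exemplar side of that question is below: every item is stored verbatim and a new item is handled by retrieving its most similar stored neighbor. The vectors, labels, and cosine-similarity choice are illustrative assumptions, not the paper's models or data.

```python
import numpy as np

# Minimal exemplar model over toy embeddings: store every training item
# and categorize a query by its single most similar stored exemplar.
rng = np.random.default_rng(0)
exemplars = rng.normal(size=(100, 50))   # 100 stored items, 50-dim vectors (synthetic)
labels = rng.integers(0, 2, size=100)    # toy category labels

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def classify(query):
    # No abstraction step: compare the query against every stored exemplar.
    sims = np.array([cosine(query, e) for e in exemplars])
    return labels[np.argmax(sims)]

print(classify(rng.normal(size=50)))
```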
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
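The abstract contrasts storing abstractions with exemplar models that do "on-the-fly calculation." One way to picture that contrast, as a toy sketch with synthetic data and placeholder distance measures rather than anything from the paper, is below: an abstracting model keeps only a per-category prototype, while an exemplar model keeps every item and defers all computation to test time.

```python
import numpy as np

# Toy contrast: stored abstraction (one prototype per category) versus
# exemplar storage with on-the-fly comparison at classification time.
rng = np.random.default_rng(1)
items = {  # category -> stored exemplars (synthetic placeholders)
    "noun": rng.normal(0.0, 1.0, size=(20, 8)),
    "verb": rng.normal(2.0, 1.0, size=(20, 8)),
}

# Abstraction: collapse each category to a single stored summary vector.
prototypes = {cat: ex.mean(axis=0) for cat, ex in items.items()}

def classify_by_prototype(x):
    return min(prototypes, key=lambda c: np.linalg.norm(x - prototypes[c]))

def classify_by_exemplars(x):
    # On-the-fly calculation: distance to every stored exemplar, no summary kept.
    return min(items, key=lambda c: np.min(np.linalg.norm(items[c] - x, axis=1)))

x = rng.normal(2.0, 1.0, size=8)
print(classify_by_prototype(x), classify_by_exemplars(x))
```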
Demuth, Katherine; Johnson, Mark – First Language, 2020
Exemplar-based learning requires: (1) a segmentation procedure for identifying the units of past experiences that a present experience can be compared to, and (2) a similarity function for comparing these past experiences to the present experience. This article argues that for a learner to learn a language these two mechanisms will require…
Descriptors: Comparative Analysis, Language Acquisition, Linguistic Theory, Grammar
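This abstract names two mechanisms an exemplar learner needs: a segmentation procedure and a similarity function. The sketch below instantiates both with deliberately simple stand-ins, whitespace segmentation and a normalized sequence-matching ratio; these are placeholder choices for illustration, not the mechanisms the article proposes.

```python
from difflib import SequenceMatcher

# Stored past experiences (toy utterances).
stored_utterances = ["the dog ran", "a dog barked", "the cat ran"]

def segment(utterance):
    # (1) Segmentation procedure: identify the units over which experiences are compared.
    return utterance.split()

def similarity(units_a, units_b):
    # (2) Similarity function: compare a past experience to the present one.
    return SequenceMatcher(None, units_a, units_b).ratio()

def most_similar(new_utterance):
    new_units = segment(new_utterance)
    return max(stored_utterances,
               key=lambda past: similarity(segment(past), new_units))

print(most_similar("the dog barked"))  # prints the closest stored utterance
```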
McClelland, James L. – First Language, 2020
Humans are sensitive to the properties of individual items, and exemplar models are useful for capturing this sensitivity. I am a proponent of an extension of exemplar-based architectures that I briefly describe. However, exemplar models are very shallow architectures in which it is necessary to stipulate a set of primitive elements that make up…
Descriptors: Models, Language Processing, Artificial Intelligence, Language Usage