Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition

Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory