Publication Date
In 2025 | 0 |
Since 2024 | 3 |
Since 2021 (last 5 years) | 8 |
Since 2016 (last 10 years) | 28 |
Since 2006 (last 20 years) | 31 |
Descriptor
Language Processing | 31 |
Language Acquisition | 28 |
Linguistic Theory | 16 |
Models | 16 |
Abstract Reasoning | 13 |
Computational Linguistics | 13 |
Figurative Language | 13 |
Syntax | 10 |
Grammar | 9 |
Task Analysis | 9 |
Foreign Countries | 8 |
Source
First Language | 31 |
Author
Ambridge, Ben | 2 |
Adger, David | 1 |
Lee, Alan L. F. | 1 |
Andreou, Maria | 1 |
Siyanova-Chanturia, Anna | 1 |
Armon-Lotem, Sharon | 1 |
Aktan-Erciyes, Asli | 1 |
Bambini, Valentina | 1 |
Bianco, Federica | 1 |
Boloh, Yves | 1 |
Brooks, Patricia J. | 1 |
Publication Type
Journal Articles | 31 |
Reports - Evaluative | 17 |
Opinion Papers | 15 |
Reports - Research | 11 |
Reports - Descriptive | 3 |
Education Level
Elementary Education | 4 |
Intermediate Grades | 2 |
Early Childhood Education | 1 |
Grade 3 | 1 |
Grade 4 | 1 |
Grade 6 | 1 |
Higher Education | 1 |
Middle Schools | 1 |
Postsecondary Education | 1 |
Primary Education | 1 |
Assessments and Surveys
Digit Span Test | 1 |
Peabody Picture Vocabulary Test | 1 |
Shang Jiang; Anna Siyanova-Chanturia – First Language, 2024
Recent studies suggest that children, like adults, exhibit a processing advantage for formulaic language (e.g. "save energy") over novel language (e.g. "sell energy"), as well as sensitivity to phrase frequencies. The majority of these studies are based on formulaic sequences in their canonical form. In…
Descriptors: Phrase Structure, Language Processing, Language Acquisition, Child Language
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
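The contrast this commentary turns on can be made concrete with a deliberately simplified sketch: a non-parametric exemplar model that stores every training item and labels new inputs by their nearest stored neighbour, versus a parametric stand-in that compresses each category into a single prototype vector. The toy data, function names and prototype model below are illustrative assumptions only; they are not Mahowald, Kachergis and Frank's analysis, nor a claim about how word2vec or transformers actually work.

import math

def distance(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Toy training data: (feature vector, category) pairs.
train = [((0.0, 0.1), "A"), ((0.1, 0.0), "A"), ((1.0, 0.9), "B"), ((0.9, 1.0), "B")]

def exemplar_classify(x):
    """Exemplar model: retain every training item; label by the nearest one."""
    return min(train, key=lambda item: distance(x, item[0]))[1]

def prototype_classify(x):
    """Parametric stand-in: each category is reduced to a mean prototype vector."""
    groups = {}
    for vec, label in train:
        groups.setdefault(label, []).append(vec)
    means = {lab: tuple(sum(dim) / len(vs) for dim in zip(*vs)) for lab, vs in groups.items()}
    return min(means, key=lambda lab: distance(x, means[lab]))

print(exemplar_classify((0.2, 0.2)), prototype_classify((0.2, 0.2)))  # both print 'A'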
Chandler, Steve – First Language, 2020
Ambridge reviews and augments an impressive body of research demonstrating both the advantages and the necessity of an exemplar-based model of knowledge of one's language. He cites three computational models that have been applied successfully to issues of phonology and morphology. Focusing on Ambridge's discussion of sentence-level constructions,…
Descriptors: Models, Figurative Language, Language Processing, Language Acquisition
Lieven, Elena; Ferry, Alissa; Theakston, Anna; Twomey, Katherine E. – First Language, 2020
During language acquisition children generalise at multiple layers of granularity. Ambridge argues that abstraction-based accounts suffer from lumping (over-general abstractions) or splitting (over-precise abstractions), and that the only way to overcome this conundrum is in a purely exemplar/analogy-based system in which…
Descriptors: Language Acquisition, Children, Generalization, Abstract Reasoning
Adger, David – First Language, 2020
The syntactic behaviour of human beings cannot be explained by analogical generalization on the basis of concrete exemplars: analogies in surface form are insufficient to account for human grammatical knowledge, because they fail to hold in situations where they should, and fail to extend in situations where they need to. [For Ben Ambridge's…
Descriptors: Syntax, Figurative Language, Models, Generalization
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
Brooks, Patricia J.; Kempe, Vera – First Language, 2020
The radical exemplar model resonates with work on perceptual classification and categorization highlighting the role of exemplars in memory representations. Further development of the model requires acknowledgment of both the fleeting and fragile nature of perceptual representations and the gist-based, good-enough quality of long-term memory…
Descriptors: Models, Language Acquisition, Classification, Memory
Rose, Yvan – First Language, 2020
Ambridge's proposal cannot account for the most basic observations about phonological patterns in human languages. Outside of the earliest stages of phonological production by toddlers, the phonological systems of speakers/learners exhibit internal behaviours that point to the representation and processing of inter-related units ranging in size…
Descriptors: Phonology, Language Patterns, Toddlers, Language Processing
Hou, Lynn; Morford, Jill P. – First Language, 2020
The visual-manual modality of sign languages renders them a unique test case for language acquisition and processing theories. In this commentary the authors describe evidence from signed languages, and ask whether it is consistent with Ambridge's proposal. The evidence includes recent research on collocations in American Sign Language that reveal…
Descriptors: Sign Language, Phrase Structure, American Sign Language, Syntax
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Finley, Sara – First Language, 2020
In this commentary, I discuss why, despite the existence of gradience in phonetics and phonology, there is still a need for abstract representations. Most proponents of exemplar models assume multiple levels of abstraction, allowing for an integration of the gradient and the categorical. Ben Ambridge's dismissal of generative models such as…
Descriptors: Phonology, Phonetics, Abstract Reasoning, Linguistic Theory
Odijk, Lotte; Gillis, Steven – First Language, 2023
The inflectional diversity of parents' speech directed to children acquiring Dutch was investigated. Inflectional diversity is defined as the number of inflected forms of a particular lemma (e.g. singular, plural of a noun) and measured by means of Mean Size of Paradigm (MSP). Changes in the inflectional diversity of infant directed speech (IDS)…
Descriptors: Parent Child Relationship, Vocabulary Development, Language Acquisition, Longitudinal Studies
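The Mean Size of Paradigm (MSP) measure named in this entry can be sketched directly: given a sample of (lemma, inflected form) tokens, MSP is read here as the mean number of distinct inflected forms per lemma. A minimal sketch under that simple reading of the definition; the toy Dutch sample and function name are invented for illustration and are not Odijk and Gillis's implementation.

from collections import defaultdict

def mean_size_of_paradigm(tokens):
    """Compute MSP: the mean number of distinct inflected forms per lemma.

    `tokens` is an iterable of (lemma, inflected_form) pairs drawn from a
    sample of child- or infant-directed speech.
    """
    paradigms = defaultdict(set)
    for lemma, form in tokens:
        paradigms[lemma].add(form)
    if not paradigms:
        return 0.0
    return sum(len(forms) for forms in paradigms.values()) / len(paradigms)

# Illustrative toy sample: the noun 'boek' appears in two forms and the verb
# 'lopen' in three, so MSP = (2 + 3) / 2 = 2.5.
sample = [("boek", "boek"), ("boek", "boeken"),
          ("lopen", "loop"), ("lopen", "loopt"), ("lopen", "liep")]
print(mean_size_of_paradigm(sample))  # 2.5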
Koring, Loes; Giblin, Iain; Thornton, Rosalind; Crain, Stephen – First Language, 2020
This response argues against the proposal that novel utterances are formed by analogy with stored exemplars that are close in meaning. Strings of words that are similar in meaning or even identical can behave very differently once inserted into different syntactic environments. Furthermore, phrases with similar meanings but different underlying…
Descriptors: Language Acquisition, Figurative Language, Syntax, Phrase Structure
Demuth, Katherine; Johnson, Mark – First Language, 2020
Exemplar-based learning requires: (1) a segmentation procedure for identifying the units of past experiences that a present experience can be compared to, and (2) a similarity function for comparing these past experiences to the present experience. This article argues that for a learner to learn a language these two mechanisms will require…
Descriptors: Comparative Analysis, Language Acquisition, Linguistic Theory, Grammar
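The two mechanisms Demuth and Johnson list, a segmentation procedure and a similarity function, can be illustrated with a minimal sketch under stated assumptions: whitespace word segmentation stands in for a genuine segmentation procedure, and Jaccard overlap over word sets stands in for a genuine similarity function between a present experience and stored exemplars. Both choices are placeholders, not anything proposed in the article.

def segment(utterance):
    """Placeholder segmentation procedure: split an utterance into word units.
    A real learner would have to discover its units rather than assume them."""
    return utterance.lower().split()

def similarity(units_a, units_b):
    """Placeholder similarity function: Jaccard overlap between two unit sets."""
    a, b = set(units_a), set(units_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Stored exemplars: past experiences, segmented into units.
exemplars = ["the dog chased the cat", "the cat sat on the mat"]
stored = [segment(e) for e in exemplars]

# A present experience is compared to each stored exemplar.
current = segment("the dog sat on the mat")
scores = [(ex, similarity(current, seg)) for ex, seg in zip(exemplars, stored)]
print(max(scores, key=lambda pair: pair[1]))  # most similar stored exemplar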
Zettersten, Martin; Schonberg, Christina; Lupyan, Gary – First Language, 2020
This article reviews two aspects of human learning: (1) people draw inferences that appear to rely on hierarchical conceptual representations; (2) some categories are much easier to learn than others given the same number of exemplars, and some categories remain difficult despite extensive training. Both of these results are difficult to reconcile…
Descriptors: Models, Language Acquisition, Prediction, Language Processing