Publication Date
  In 2025: 0
  Since 2024: 2
  Since 2021 (last 5 years): 11
  Since 2016 (last 10 years): 34
  Since 2006 (last 20 years): 74
Descriptor
  Language Acquisition: 117
  Models: 117
  Language Processing: 110
  Linguistic Theory: 46
  Computational Linguistics: 26
  Language Research: 26
  Grammar: 25
  Learning Processes: 22
  Syntax: 21
  Child Language: 20
  Phonology: 19
Author
  Plunkett, Kim: 3
  Mareschal, Denis: 2
  Plaut, David C.: 2
  Purser, Harry R. M.: 2
  Seidenberg, Mark S.: 2
  Thomas, Michael S. C.: 2
  van den Bosch, Antal: 2
  Adger, David: 1
  Alex Warstadt: 1
  Allen, Joseph: 1
  Alt, Mary: 1
Education Level
  Early Childhood Education: 2
  Elementary Education: 2
  Grade 2: 1
  Primary Education: 1
Audience
  Researchers: 2
  Practitioners: 1
  Teachers: 1
Laws, Policies, & Programs
Assessments and Surveys
  Flesch Reading Ease Formula: 1
  National Assessment of…: 1
  Peabody Picture Vocabulary…: 1
Huteng Dai – ProQuest LLC, 2024
In this dissertation, I establish a research program that uses computational modeling as a testbed for theories of phonological learning. This dissertation focuses on a fundamental question: how do children acquire sound patterns from noisy, real-world data, especially in the presence of lexical exceptions that defy regular patterns? For instance,…
Descriptors: Phonology, Language Acquisition, Computational Linguistics, Linguistic Theory
Ryan Daniel Budnick – ProQuest LLC, 2023
The past thirty years have shown a rise in models of language acquisition in which the state of the learner is characterized as a probability distribution over a set of non-stochastic grammars. In recent years, increasingly powerful models have been constructed as earlier models have failed to generalize well to increasingly complex and realistic…
Descriptors: Grammar, Feedback (Response), Algorithms, Computational Linguistics
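The characterization above, a learner whose state is a probability distribution over a set of non-stochastic grammars, can be made concrete with a toy sketch. The snippet below is a hypothetical illustration in the spirit of linear reward-penalty (variational) learners, not Budnick's model: two invented candidate grammars, with probability mass shifted toward whichever one parses the incoming sentences.

    import random

    # Toy "grammars": each simply says whether a sentence is acceptable.
    grammars = {
        "verb_second": lambda s: len(s) > 1 and s[1] == "V",
        "verb_final":  lambda s: len(s) > 0 and s[-1] == "V",
    }

    # Learner state: a probability distribution over the grammar set.
    probs = {name: 1.0 / len(grammars) for name in grammars}
    GAMMA = 0.05  # learning rate

    def update(sentence):
        """Sample one grammar; reward it if it parses the sentence, penalize otherwise."""
        name = random.choices(list(probs), weights=list(probs.values()))[0]
        n = len(probs)
        if grammars[name](sentence):          # reward the sampled grammar
            for g in probs:
                probs[g] = probs[g] + GAMMA * (1 - probs[g]) if g == name else (1 - GAMMA) * probs[g]
        else:                                 # penalize it, redistribute its mass
            for g in probs:
                probs[g] = (1 - GAMMA) * probs[g] if g == name else GAMMA / (n - 1) + (1 - GAMMA) * probs[g]

    for _ in range(500):                      # toy verb-final input
        update(["S", "O", "V"])
    print(probs)                              # mass shifts toward "verb_final"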
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Unger, Layla; Yim, Hyungwook; Savic, Olivera; Dennis, Simon; Sloutsky, Vladimir M. – Developmental Science, 2023
Recent years have seen a flourishing of Natural Language Processing models that can mimic many aspects of human language fluency. These models harness a simple, decades-old idea: It is possible to learn a lot about word meanings just from exposure to language, because words similar in meaning are used in language in similar ways. The successes of…
Descriptors: Natural Language Processing, Language Usage, Vocabulary Development, Linguistic Input
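The distributional idea summarized above (words similar in meaning are used in similar ways) can be illustrated with a minimal sketch. The toy corpus, window size, and count-based vectors below are invented for the example; the cited NLP models (e.g., word2vec) learn such representations at far larger scale.

    from collections import Counter
    from math import sqrt

    corpus = [
        "the cat chased the mouse",
        "the dog chased the ball",
        "the cat ate the fish",
        "the dog ate the bone",
    ]

    WINDOW = 2  # words within this distance count as context

    def cooccurrence_vector(target, sentences):
        """Count context words appearing within WINDOW positions of the target."""
        counts = Counter()
        for sentence in sentences:
            tokens = sentence.split()
            for i, tok in enumerate(tokens):
                if tok == target:
                    for j in range(max(0, i - WINDOW), min(len(tokens), i + WINDOW + 1)):
                        if j != i:
                            counts[tokens[j]] += 1
        return counts

    def cosine(u, v):
        """Cosine similarity between two sparse count vectors."""
        dot = sum(u[k] * v[k] for k in u if k in v)
        norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
        return dot / norm if norm else 0.0

    cat, dog, fish = (cooccurrence_vector(w, corpus) for w in ("cat", "dog", "fish"))
    print(cosine(cat, dog), cosine(cat, fish))  # "cat" patterns more like "dog" than "fish"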
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Trott, Sean; Jones, Cameron; Chang, Tyler; Michaelov, James; Bergen, Benjamin – Cognitive Science, 2023
Humans can attribute beliefs to others. However, it is unknown to what extent this ability results from an innate biological endowment or from experience accrued through child development, particularly exposure to language describing others' mental states. We test the viability of the language exposure hypothesis by assessing whether models…
Descriptors: Models, Language Processing, Beliefs, Child Development
Botarleanu, Robert-Mihai; Dascalu, Mihai; Watanabe, Micah; McNamara, Danielle S.; Crossley, Scott Andrew – Grantee Submission, 2021
The ability to objectively quantify the complexity of a text can be a useful indicator of how likely learners of a given level will comprehend it. Before creating more complex models of assessing text difficulty, the basic building block of a text consists of words and, inherently, its overall difficulty is greatly influenced by the complexity of…
Descriptors: Multilingualism, Language Acquisition, Age, Models
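As one concrete example of surface-level text difficulty scoring, the sketch below computes the classic Flesch Reading Ease Formula (listed under Assessments and Surveys above); the crude syllable counter is only an approximation, and the cited study builds considerably richer models.

    import re

    def count_syllables(word):
        """Approximate syllables as runs of vowels (a rough heuristic)."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        """Flesch Reading Ease: higher scores indicate easier text."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

    print(round(flesch_reading_ease("The cat sat on the mat. It was warm."), 1))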
Finley, Sara – First Language, 2020
In this commentary, I discuss why, despite the existence of gradience in phonetics and phonology, there is still a need for abstract representations. Most proponents of exemplar models assume multiple levels of abstraction, allowing for an integration of the gradient and the categorical. Ben Ambridge's dismissal of generative models such as…
Descriptors: Phonology, Phonetics, Abstract Reasoning, Linguistic Theory
Hao Wu; Shan Li; Ying Gao; Jinta Weng; Guozhu Ding – Education and Information Technologies, 2024
Natural language processing (NLP) has captivated the attention of educational researchers over the past three decades. In this study, a total of 2,480 studies were retrieved through a comprehensive literature search. We used neural topic modeling and pre-trained language modeling to explore the research topics pertaining to the application of NLP…
Descriptors: Natural Language Processing, Educational Research, Research Design, Educational Trends
Lieven, Elena; Ferry, Alissa; Theakston, Anna; Twomey, Katherine E. – First Language, 2020
During language acquisition children generalise at multiple layers of granularity. Ambridge argues that abstraction-based accounts suffer from lumping (over-general abstractions) or splitting (over-precise abstractions). Ambridge argues that the only way to overcome this conundrum is in a purely exemplar/analogy-based system in which…
Descriptors: Language Acquisition, Children, Generalization, Abstract Reasoning
Jiménez, Eva; Hills, Thomas T. – Child Development, 2022
This study investigates the influence of semantic maturation on early lexical development by examining the impact of contextual diversity--known to influence semantic development--on word promotion from receptive to productive vocabularies (i.e., comprehension-expression gap). Study 1 compares the vocabularies of 3685 American-English-speaking…
Descriptors: Semantics, Language Acquisition, Child Development, Delayed Speech
Koring, Loes; Giblin, Iain; Thornton, Rosalind; Crain, Stephen – First Language, 2020
This response argues against the proposal that novel utterances are formed by analogy with stored exemplars that are close in meaning. Strings of words that are similar in meaning or even identical can behave very differently once inserted into different syntactic environments. Furthermore, phrases with similar meanings but different underlying…
Descriptors: Language Acquisition, Figurative Language, Syntax, Phrase Structure
Demuth, Katherine; Johnson, Mark – First Language, 2020
Exemplar-based learning requires: (1) a segmentation procedure for identifying the units of past experiences that a present experience can be compared to, and (2) a similarity function for comparing these past experiences to the present experience. This article argues that for a learner to learn a language these two mechanisms will require…
Descriptors: Comparative Analysis, Language Acquisition, Linguistic Theory, Grammar
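The two required mechanisms named above can be made concrete with a hypothetical sketch: stored exemplars (here, ready-segmented toy forms standing in for the output of a segmentation procedure) and a similarity function used to compare a new form against them. Nothing below is the authors' proposal; it only illustrates the division of labor.

    def similarity(a, b):
        """Crude similarity: proportion of segments matching position by position."""
        matches = sum(1 for x, y in zip(a, b) if x == y)
        return matches / max(len(a), len(b))

    # Stored exemplars: (segmented form, category) pairs.
    exemplars = [
        (("w", "ɔ", "k", "t"), "past"),
        (("p", "l", "eɪ", "d"), "past"),
        (("w", "ɔ", "k", "s"), "present"),
    ]

    def classify(new_form):
        """Label a new form by its most similar stored exemplar."""
        best = max(exemplars, key=lambda ex: similarity(new_form, ex[0]))
        return best[1]

    print(classify(("t", "ɔ", "k", "t")))  # -> "past", closest to the first exemplar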
Alex Warstadt – ProQuest LLC, 2022
Data-driven learning uncontroversially plays a role in human language acquisition--how large a role is a matter of much debate. The success of artificial neural networks in NLP in recent years calls for a re-evaluation of our understanding of the possibilities for learning grammar from data alone. This dissertation argues the case for using…
Descriptors: Language Acquisition, Artificial Intelligence, Computational Linguistics, Ethics
Zettersten, Martin; Schonberg, Christina; Lupyan, Gary – First Language, 2020
This article reviews two aspects of human learning: (1) people draw inferences that appear to rely on hierarchical conceptual representations; (2) some categories are much easier to learn than others given the same number of exemplars, and some categories remain difficult despite extensive training. Both of these results are difficult to reconcile…
Descriptors: Models, Language Acquisition, Prediction, Language Processing