Publication Date
| Date range | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 1 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 3 |
Descriptor
| Descriptor | Results |
| --- | --- |
| Bayesian Statistics | 3 |
| Generalization | 3 |
| Learning Processes | 3 |
| Child Language | 2 |
| Grammar | 2 |
| Language Acquisition | 2 |
| Models | 2 |
| Abstract Reasoning | 1 |
| Artificial Intelligence | 1 |
| Cognitive Processes | 1 |
| Computational Linguistics | 1 |
Author
| Author | Results |
| --- | --- |
| Carey, Susan | 1 |
| Cheyette, Samuel J. | 1 |
| Ferrigno, Stephen | 1 |
| Hartshorne, Joshua K. | 1 |
| Perfors, Amy | 1 |
| Tenenbaum, Joshua B. | 1 |
| Wonnacott, Elizabeth | 1 |
Publication Type
| Publication Type | Results |
| --- | --- |
| Journal Articles | 3 |
| Opinion Papers | 1 |
| Reports - Descriptive | 1 |
| Reports - Evaluative | 1 |
| Reports - Research | 1 |
Stephen Ferrigno; Samuel J. Cheyette; Susan Carey – Cognitive Science, 2025
Complex sequences are ubiquitous in human mental life, structuring representations within many different cognitive domains: natural language, music, mathematics, and logic, to name a few. However, the representational and computational machinery used to learn abstract grammars and process complex sequences is unknown. Here, we used an artificial…
Descriptors: Sequential Learning, Cognitive Processes, Knowledge Representation, Training
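The snippet cuts off at "an artificial…", which in this literature typically refers to an artificial grammar learning paradigm. As a purely illustrative sketch (not the authors' stimuli or grammars, which the truncated abstract does not show), the Python below generates sequences from two toy grammars commonly contrasted in sequence-learning studies: a finite-state (AB)^n pattern and a center-embedded A^n B^n pattern.

```python
import random

# Hypothetical token inventories; the paper's actual materials are not
# shown in the truncated abstract.
A_TOKENS = ["ka", "po", "ti"]
B_TOKENS = ["lu", "ne", "ro"]

def sample_ab_n(n: int) -> list[str]:
    """Finite-state pattern: A B A B ... (n alternating pairs)."""
    seq = []
    for _ in range(n):
        seq.append(random.choice(A_TOKENS))
        seq.append(random.choice(B_TOKENS))
    return seq

def sample_a_n_b_n(n: int) -> list[str]:
    """Center-embedded pattern: n A-tokens followed by n B-tokens."""
    return ([random.choice(A_TOKENS) for _ in range(n)]
            + [random.choice(B_TOKENS) for _ in range(n)])

if __name__ == "__main__":
    random.seed(0)
    print("(AB)^3  :", " ".join(sample_ab_n(3)))
    print("A^3 B^3 :", " ".join(sample_a_n_b_n(3)))
```

Distinguishing the two patterns is the standard test of whether learners acquire an abstract (here, context-free) grammar rather than mere surface transition statistics.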
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
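For concreteness, "exemplar models" classify new items by graded similarity to stored instances rather than by an abstracted rule or prototype. A minimal sketch of one textbook exemplar model, the Generalized Context Model (Nosofsky, 1986), follows; it is offered only to make the term concrete, is not a model proposed by Ambridge or Hartshorne, and its feature values are invented.

```python
import math

def gcm_probability(probe, exemplars, labels, target, c=1.0):
    """Generalized Context Model: P(category | probe) is the probe's summed
    similarity to that category's stored exemplars, normalized over all
    categories. No rule or abstraction is stored, only the exemplars."""
    def similarity(x, y):
        # Similarity decays exponentially with city-block distance;
        # c controls how sharply it falls off.
        dist = sum(abs(a - b) for a, b in zip(x, y))
        return math.exp(-c * dist)

    sums = {}
    for ex, lab in zip(exemplars, labels):
        sums[lab] = sums.get(lab, 0.0) + similarity(probe, ex)
    return sums[target] / sum(sums.values())

# Invented two-feature exemplars for two categories.
exemplars = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0), (0.9, 0.8)]
labels = ["regular", "regular", "irregular", "irregular"]
print(gcm_probability((0.1, 0.2), exemplars, labels, "regular"))
```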
Perfors, Amy; Tenenbaum, Joshua B.; Wonnacott, Elizabeth – Journal of Child Language, 2010
We present a hierarchical Bayesian framework for modeling the acquisition of verb argument constructions. It embodies a domain-general approach to learning higher-level knowledge in the form of inductive constraints (or overhypotheses), and has been used to explain other aspects of language development such as the shape bias in learning object…
Descriptors: Verbs, Inferences, Language Acquisition, Bayesian Statistics
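As a rough illustration of the overhypothesis idea (not the authors' actual model of verb argument constructions), the sketch below uses a beta-binomial hierarchy: each verb i has a latent bias theta_i toward one of two constructions, all theta_i are drawn from a shared Beta(alpha, beta), and inferring (alpha, beta) from observed verbs constrains generalization to a novel verb. The counts and the parameter grid are invented for illustration.

```python
import itertools
import math

def beta_binomial_loglik(k, n, alpha, beta):
    """log P(k of n uses in construction 1 | alpha, beta), with the verb's
    bias theta integrated out (up to a constant binomial coefficient)."""
    return (math.lgamma(alpha + beta) - math.lgamma(alpha) - math.lgamma(beta)
            + math.lgamma(alpha + k) + math.lgamma(beta + n - k)
            - math.lgamma(alpha + beta + n))

# Toy corpus: (uses in construction 1, total uses) per observed verb.
verbs = [(9, 10), (8, 10), (10, 10)]

# Grid posterior over the overhypothesis (alpha, beta), flat prior on the grid.
grid = [0.5, 1.0, 2.0, 5.0, 10.0]
post = {(a, b): math.exp(sum(beta_binomial_loglik(k, n, a, b)
                             for k, n in verbs))
        for a, b in itertools.product(grid, grid)}
z = sum(post.values())
a, b = max(post, key=post.get)
print("MAP overhypothesis (alpha, beta):", (a, b),
      "with posterior mass", post[(a, b)] / z)

# Generalization: a novel verb observed once in construction 1.
# Posterior predictive under the MAP overhypothesis: mean of Beta(alpha+1, beta).
print("P(novel verb's next use is construction 1):", (a + 1) / (a + b + 1))
```

The key behavior is that the learned (alpha, beta) acts as an inductive constraint: after seeing a novel verb only once, the model generalizes according to what it has learned across all previously observed verbs.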

