Peer reviewed
Ferrigno, Stephen; Cheyette, Samuel J.; Carey, Susan – Cognitive Science, 2025
Complex sequences are ubiquitous in human mental life, structuring representations within many different cognitive domains--natural language, music, mathematics, and logic, to name a few. However, the representational and computational machinery used to learn abstract grammars and process complex sequences is unknown. Here, we used an artificial…
Descriptors: Sequential Learning, Cognitive Processes, Knowledge Representation, Training
Peer reviewed
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
Peer reviewed
Perfors, Amy; Tenenbaum, Joshua B.; Wonnacott, Elizabeth – Journal of Child Language, 2010
We present a hierarchical Bayesian framework for modeling the acquisition of verb argument constructions. It embodies a domain-general approach to learning higher-level knowledge in the form of inductive constraints (or overhypotheses), and has been used to explain other aspects of language development such as the shape bias in learning object…
Descriptors: Verbs, Inferences, Language Acquisition, Bayesian Statistics
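The overhypothesis idea in this last abstract lends itself to a compact illustration. The sketch below is not the Perfors, Tenenbaum, and Wonnacott model itself, only a minimal two-level Beta-Binomial hierarchy fit by grid approximation: each verb has its own bias toward one argument construction, the learner infers the hyperparameters governing how those biases vary across verbs, and that higher-level knowledge then constrains generalization from a single observation of a novel verb. All data and parameter choices here are invented for illustration.

# Minimal sketch of an "overhypothesis" learner (illustrative, not the
# authors' model). Each verb v is seen n[v] times and appears in
# construction A k[v] times; theta_v ~ Beta(alpha, beta) is that verb's
# bias toward A. The learner infers (alpha, beta) across verbs, then uses
# the result to generalize about a new verb from sparse data.
import numpy as np
from scipy.special import betaln

# Toy corpus: (k, n) per verb -- most verbs strongly prefer one construction.
verbs = [(9, 10), (1, 10), (10, 10), (0, 10), (8, 10)]

# Grid over hyperparameters (alpha, beta), flat prior on the grid.
grid = np.linspace(0.1, 20.0, 80)
log_post = np.zeros((len(grid), len(grid)))
for i, a in enumerate(grid):
    for j, b in enumerate(grid):
        # Beta-Binomial marginal likelihood per verb, summed in log space
        # (binomial coefficient omitted: constant in (a, b)).
        ll = sum(betaln(a + k, b + n - k) - betaln(a, b) for k, n in verbs)
        log_post[i, j] = ll

post = np.exp(log_post - log_post.max())
post /= post.sum()

# Posterior predictive for a NOVEL verb observed once in construction A:
# E[theta_new] = (alpha + 1) / (alpha + beta + 1), averaged over the
# hyperparameter posterior.
A, B = np.meshgrid(grid, grid, indexing="ij")
pred = np.sum(post * (A + 1) / (A + B + 1))
print(f"P(construction A | 1 observation of a novel verb) ~ {pred:.2f}")

Because most verbs in the toy corpus are near-categorical, the inferred hyperparameters favor extreme biases, so a single observation of the novel verb pushes the prediction well past 0.5: the hierarchy, not the one data point, is doing most of the inferential work.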