Showing all 4 results
Peer reviewed
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
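For readers unfamiliar with the term, the sketch below illustrates what an exemplar model of the kind debated in this entry typically looks like: every encountered instance is stored, and new items are classified by summed similarity to the stored exemplars rather than by an abstracted rule. The stimuli, feature space, and sensitivity parameter are illustrative assumptions, not taken from Ambridge's or Hartshorne's work.

```python
# A minimal exemplar-model sketch (Generalized Context Model style): store every
# instance, classify new items by summed similarity to stored exemplars.
import math

SENSITIVITY = 2.0  # how sharply similarity falls off with distance (assumed value)

def similarity(x, exemplar):
    distance = sum(abs(a - b) for a, b in zip(x, exemplar))
    return math.exp(-SENSITIVITY * distance)

def classify(x, exemplars_by_category):
    """Return P(category | x) from summed similarity to each category's exemplars."""
    evidence = {c: sum(similarity(x, e) for e in exs)
                for c, exs in exemplars_by_category.items()}
    total = sum(evidence.values())
    return {c: v / total for c, v in evidence.items()}

# Toy example: two "categories" of stored instances in a 2-D feature space.
stored = {
    "regular":   [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25)],
    "irregular": [(0.8, 0.9), (0.9, 0.8)],
}
print(classify((0.2, 0.2), stored))   # should favour "regular"
```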
Peer reviewed
McClelland, James L.; Mirman, Daniel; Bolger, Donald J.; Khaitan, Pranav – Cognitive Science, 2014
In a seminal 1977 article, Rumelhart argued that perception required the simultaneous use of multiple sources of information, allowing perceivers to optimally interpret sensory information at many levels of representation in real time as information arrives. Building on Rumelhart's arguments, we present the Interactive Activation…
Descriptors: Perception, Comprehension, Cognitive Processes, Alphabets
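As a rough illustration of the interactive-activation idea this entry refers to (bottom-up and top-down sources of information constraining each other in real time), here is a minimal two-word toy network. The lexicon, connection weights, and parameter values are illustrative assumptions, not the published model.

```python
# A toy interactive-activation style network: letters excite consistent words,
# words inhibit each other and feed excitation back to their letters.
WORDS = {"AT": [("A", 0), ("T", 1)], "IT": [("I", 0), ("T", 1)]}
LETTER_UNITS = [("A", 0), ("I", 0), ("T", 1)]

REST, MIN_A, MAX_A, DECAY = -0.1, -0.2, 1.0, 0.1
EXC, INH = 0.2, 0.2  # excitatory / inhibitory weights (assumed values)

def delta(a, net):
    # Standard interactive-activation update: positive net input drives activation
    # toward the ceiling, negative net toward the floor, plus decay toward rest.
    effect = net * (MAX_A - a) if net > 0 else net * (a - MIN_A)
    return effect - DECAY * (a - REST)

def clamp(a):
    return max(MIN_A, min(MAX_A, a))

def step(letter_act, word_act, letter_input):
    """One update cycle over the letter and word layers."""
    new_word = {}
    for w, word_letters in WORDS.items():
        net = sum(EXC * max(letter_act[l], 0.0) for l in word_letters)
        net -= sum(INH * max(word_act[v], 0.0) for v in WORDS if v != w)
        new_word[w] = clamp(word_act[w] + delta(word_act[w], net))
    new_letter = {}
    for l in LETTER_UNITS:
        net = letter_input.get(l, 0.0)  # bottom-up sensory evidence
        net += sum(EXC * max(word_act[w], 0.0) for w, ls in WORDS.items() if l in ls)
        new_letter[l] = clamp(letter_act[l] + delta(letter_act[l], net))
    return new_letter, new_word

# Present bottom-up evidence favouring "A" in position 0 and "T" in position 1.
letters = {l: REST for l in LETTER_UNITS}
words = {w: REST for w in WORDS}
evidence = {("A", 0): 0.3, ("T", 1): 0.3}
for _ in range(30):
    letters, words = step(letters, words, evidence)
print({w: round(a, 3) for w, a in words.items()})  # "AT" should dominate "IT"
```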
Peer reviewed
Beekhuizen, Barend; Bod, Rens; Zuidema, Willem – Language and Speech, 2013
In this paper we present three design principles of language (experience, heterogeneity, and redundancy) and describe recent developments in a family of models incorporating them, namely Data-Oriented Parsing/Unsupervised Data-Oriented Parsing. Although the idea of some form of redundant storage has become part and parcel of parsing technologies and…
Descriptors: Language Acquisition, Models, Bayesian Statistics, Computational Linguistics
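To make the notion of redundant storage concrete, the sketch below extracts, DOP-style, every fragment (a subtree whose frontiers may be cut off at a category label) from a two-sentence toy treebank and counts them; in Data-Oriented Parsing, fragment counts of this kind are what new analyses are assembled from. The toy treebank and the fragment encoding are illustrative assumptions, not the authors' implementation.

```python
# Count all DOP-style tree fragments in a tiny toy treebank.
import itertools
from collections import Counter

def fragments(tree):
    """Yield every fragment rooted at this node: for each child, either cut
    (keep only its category as a frontier node) or substitute one of its own
    fragments. Word leaves are always kept as-is."""
    if isinstance(tree, str):           # a word: the only fragment is the word itself
        yield tree
        return
    label, children = tree
    per_child = []
    for child in children:
        if isinstance(child, str):
            options = [child]                     # words stay as they are
        else:
            options = [(child[0], ())]            # cut here: bare frontier nonterminal
            options += list(fragments(child))     # or keep expanding
        per_child.append(options)
    for combo in itertools.product(*per_child):
        yield (label, combo)

def all_nodes(tree):
    yield tree
    if not isinstance(tree, str):
        for child in tree[1]:
            yield from all_nodes(child)

treebank = [
    ("S", [("NP", ["she"]), ("VP", [("V", ["saw"]), ("NP", [("Det", ["the"]), ("N", ["dog"])])])]),
    ("S", [("NP", ["he"]), ("VP", [("V", ["heard"]), ("NP", [("Det", ["a"]), ("N", ["cat"])])])]),
]

counts = Counter()
for tree in treebank:
    for node in all_nodes(tree):
        if not isinstance(node, str):
            counts.update(fragments(node))

# In DOP1, a fragment's weight is its relative frequency among fragments sharing
# the same root category; derivations combine fragments and multiply their weights.
print(counts.most_common(5))
```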
Peer reviewed
Lamberts, Koen; Kent, Christopher – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2008
The time course of perception and retrieval of object features was investigated. Participants completed a perceptual matching task and 2 recognition tasks under time pressure. The recognition tasks imposed different retention loads. A stochastic model of feature sampling with a Bayesian decision component was used to estimate the rate of feature…
Descriptors: Memory, Language Processing, Bayesian Statistics, Recognition (Psychology)
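This entry mentions a stochastic feature-sampling model with a Bayesian decision component. The sketch below shows one simple way such a model can be simulated: features become available at random times governed by per-feature sampling rates, and a Bayesian log-odds accumulator decides "same" versus "different" once it is confident enough. All parameter values are illustrative assumptions, not estimates from the paper.

```python
# Toy simulation: stochastic feature sampling feeding a Bayesian decision rule.
import math
import random

SAMPLING_RATES = [2.0, 1.2, 0.8, 0.5]   # per-feature sampling rates (assumed values)
P_MATCH_GIVEN_SAME = 0.95               # a sampled feature matches if items are the same
P_MATCH_GIVEN_DIFF = 0.50               # chance-level match if items differ
PRIOR_SAME = 0.5
THRESHOLD = 0.90                        # respond once the posterior is this confident
DT = 0.01                               # simulation time step (seconds)

def simulate_trial(same: bool, rng: random.Random):
    """Return (response, reaction_time) for one recognition trial."""
    sampled = [False] * len(SAMPLING_RATES)
    log_odds = math.log(PRIOR_SAME / (1 - PRIOR_SAME))
    t = 0.0
    while True:
        t += DT
        for i, rate in enumerate(SAMPLING_RATES):
            if sampled[i] or rng.random() >= 1 - math.exp(-rate * DT):
                continue
            sampled[i] = True
            # Once a feature is sampled, it either matches the stored item or not.
            matches = rng.random() < (P_MATCH_GIVEN_SAME if same else P_MATCH_GIVEN_DIFF)
            if matches:
                log_odds += math.log(P_MATCH_GIVEN_SAME / P_MATCH_GIVEN_DIFF)
            else:
                log_odds += math.log((1 - P_MATCH_GIVEN_SAME) / (1 - P_MATCH_GIVEN_DIFF))
        posterior_same = 1 / (1 + math.exp(-log_odds))
        if posterior_same > THRESHOLD:
            return "same", t
        if posterior_same < 1 - THRESHOLD:
            return "different", t
        if all(sampled):                 # all evidence is in: go with the current belief
            return ("same" if posterior_same >= 0.5 else "different"), t

rng = random.Random(0)
trials = [simulate_trial(same=True, rng=rng) for _ in range(200)]
accuracy = sum(r == "same" for r, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(f"accuracy={accuracy:.2f}, mean RT={mean_rt:.3f}s")
```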