Showing 1 to 15 of 43 results
Peer reviewed
Stanojevic, Miloš; Brennan, Jonathan R.; Dunagan, Donald; Steedman, Mark; Hale, John T. – Cognitive Science, 2023
To model behavioral and neural correlates of language comprehension in naturalistic environments, researchers have turned to broad-coverage tools from natural-language processing and machine learning. Where syntactic structure is explicitly modeled, prior work has relied predominantly on context-free grammars (CFGs), yet such formalisms are not…
Descriptors: Correlation, Language Processing, Brain Hemisphere Functions, Natural Language Processing
Peer reviewed
John Hollander; Andrew Olney – Cognitive Science, 2024
Recent investigations on how people derive meaning from language have focused on task-dependent shifts between two cognitive systems. The symbolic (amodal) system represents meaning as the statistical relationships between words. The embodied (modal) system represents meaning through neurocognitive simulation of perceptual or sensorimotor systems…
Descriptors: Verbs, Symbolic Language, Language Processing, Semantics
Peer reviewed
Ramotowska, Sonia; Steinert-Threlkeld, Shane; Maanen, Leendert; Szymanik, Jakub – Cognitive Science, 2023
According to logical theories of meaning, the meaning of an expression can be formalized and encoded in truth conditions. Vagueness in language and individual differences between people are a challenge to incorporate into meaning representations. In this paper, we propose a new approach to study truth-conditional representations of vague…
Descriptors: Computation, Models, Semantics, Decision Making
Peer reviewed
Abu-Zhaya, Rana; Arnon, Inbal; Borovsky, Arielle – Cognitive Science, 2022
Meaning in language emerges from multiple words, and children are sensitive to multi-word frequency from infancy. While children successfully use cues from single words to generate linguistic predictions, it is less clear whether and how they use multi-word sequences to guide real-time language processing and whether they form predictions on the…
Descriptors: Sentences, Language Processing, Semantics, Prediction
Peer reviewed
Thornton, Chris – Cognitive Science, 2021
Semantic composition in language must be closely related to semantic composition in thought. But the way the two processes are explained differs considerably. Focusing primarily on propositional content, language theorists generally take semantic composition to be a truth-conditional process. Focusing more on extensional content, cognitive…
Descriptors: Semantics, Cognitive Processes, Linguistic Theory, Language Usage
Peer reviewed
Brehm, Laurel; Cho, Pyeong Whan; Smolensky, Paul; Goldrick, Matthew A. – Cognitive Science, 2022
Subject-verb agreement errors are common in sentence production. Many studies have used experimental paradigms targeting the production of subject-verb agreement from a sentence preamble ("The key to the cabinets") and eliciting verb errors (… "*were shiny"). Through reanalysis of previous data (50 experiments; 102,369…
Descriptors: Sentences, Sentence Structure, Grammar, Verbs
Peer reviewed
Masato Nakamura; Shota Momma; Hiromu Sakai; Colin Phillips – Cognitive Science, 2024
Comprehenders generate expectations about upcoming lexical items in language processing using various types of contextual information. However, a number of studies have shown that argument roles do not impact neural and behavioral prediction measures. Despite these robust findings, some prior studies have suggested that lexical prediction might be…
Descriptors: Diagnostic Tests, Nouns, Language Processing, Verbs
Peer reviewed
Trott, Sean; Jones, Cameron; Chang, Tyler; Michaelov, James; Bergen, Benjamin – Cognitive Science, 2023
Humans can attribute beliefs to others. However, it is unknown to what extent this ability results from an innate biological endowment or from experience accrued through child development, particularly exposure to language describing others' mental states. We test the viability of the language exposure hypothesis by assessing whether models…
Descriptors: Models, Language Processing, Beliefs, Child Development
Peer reviewed
van Schijndel, Marten; Linzen, Tal – Cognitive Science, 2021
The disambiguation of a syntactically ambiguous sentence in favor of a less preferred parse can lead to slower reading at the disambiguation point. This phenomenon, referred to as a garden-path effect, has motivated models in which readers initially maintain only a subset of the possible parses of the sentence, and subsequently require…
Descriptors: Syntax, Ambiguity (Semantics), Reading Processes, Linguistic Theory
Peer reviewed
Jacobs, Cassandra L.; Cho, Sun-Joo; Watson, Duane G. – Cognitive Science, 2019
Syntactic priming in language production is the increased likelihood of using a recently encountered syntactic structure. In this paper, we examine two theories of why speakers can be primed: error-driven learning accounts (Bock, Dell, Chang, & Onishi, 2007; Chang, Dell, & Bock, 2006) and activation-based accounts (Pickering &…
Descriptors: Priming, Syntax, Prediction, Linguistic Theory
Peer reviewed
Johns, Brendan T.; Mewhort, Douglas J. K.; Jones, Michael N. – Cognitive Science, 2019
Distributional models of semantics learn word meanings from contextual co-occurrence patterns across a large sample of natural language. Early models, such as LSA and HAL (Landauer & Dumais, 1997; Lund & Burgess, 1996), counted co-occurrence events; later models, such as BEAGLE (Jones & Mewhort, 2007), replaced counting co-occurrences…
Descriptors: Semantics, Learning Processes, Models, Prediction
Peer reviewed
Divjak, Dagmar; Milin, Petar; Medimorec, Srdan; Borowski, Maciej – Cognitive Science, 2022
Although there is a broad consensus that both the procedural and declarative memory systems play a crucial role in language learning, use, and knowledge, the mapping between linguistic types and memory structures remains underspecified: by default, a dual-route mapping of language systems to memory systems is assumed, with declarative memory…
Descriptors: Memory, Grammar, Vocabulary Development, Language Processing
Peer reviewed
Johns, Brendan T.; Jamieson, Randall K. – Cognitive Science, 2018
The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers,…
Descriptors: Statistical Analysis, Written Language, Models, Language Enrichment
Peer reviewed
Paape, Dario; Avetisyan, Serine; Lago, Sol; Vasishth, Shravan – Cognitive Science, 2021
We present computational modeling results based on a self-paced reading study investigating number attraction effects in Eastern Armenian. We implement three novel computational models of agreement attraction in a Bayesian framework and compare their predictive fit to the data using k-fold cross-validation. We find that our data are better…
Descriptors: Computational Linguistics, Indo European Languages, Grammar, Bayesian Statistics
Peer reviewed
Virpioja, Sami; Lehtonen, Minna; Hultén, Annika; Kivikari, Henna; Salmelin, Riitta; Lagus, Krista – Cognitive Science, 2018
Determining optimal units of representing morphologically complex words in the mental lexicon is a central question in psycholinguistics. Here, we utilize advances in computational sciences to study human morphological processing using statistical models of morphology, particularly the unsupervised Morfessor model that works on the principle of…
Descriptors: Statistical Analysis, Models, Morphology (Languages), Vocabulary