Showing all 11 results
Peer reviewed
Stanojevic, Miloš; Brennan, Jonathan R.; Dunagan, Donald; Steedman, Mark; Hale, John T. – Cognitive Science, 2023
To model behavioral and neural correlates of language comprehension in naturalistic environments, researchers have turned to broad-coverage tools from natural-language processing and machine learning. Where syntactic structure is explicitly modeled, prior work has relied predominantly on context-free grammars (CFGs), yet such formalisms are not…
Descriptors: Correlation, Language Processing, Brain Hemisphere Functions, Natural Language Processing
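For readers unfamiliar with the formalism mentioned above, the toy sketch below builds a context-free grammar and parses a sentence with NLTK's chart parser; the grammar, sentence, and parser choice are illustrative assumptions, not materials from the study.

```python
# Illustrative only: a toy context-free grammar of the kind the abstract
# contrasts with richer formalisms; not the grammar used in the paper.
import nltk

grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the'
    N  -> 'dog' | 'ball'
    V  -> 'chased'
""")

parser = nltk.ChartParser(grammar)
tokens = "the dog chased the ball".split()
for tree in parser.parse(tokens):
    tree.pretty_print()   # prints the single parse licensed by this toy CFG
```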
Peer reviewed
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
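As a point of reference for the models named in the abstract, the sketch below trains a tiny word2vec model with gensim; the corpus and hyperparameters are arbitrary illustrations, not the authors' setup.

```python
# Illustrative only: training a tiny word2vec model with gensim, to show the
# kind of distributed-representation model the abstract refers to.
from gensim.models import Word2Vec

toy_corpus = [
    ["the", "child", "saw", "the", "dog"],
    ["the", "child", "fed", "the", "dog"],
    ["the", "dog", "chased", "the", "ball"],
]

# vector_size/window/min_count/epochs are arbitrary toy settings, not values from the paper
model = Word2Vec(sentences=toy_corpus, vector_size=16, window=2, min_count=1, epochs=50)

print(model.wv.most_similar("dog", topn=2))  # nearest neighbours in the toy embedding space
```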
Peer reviewed
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
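One way to make the exemplar-versus-abstraction contrast concrete is the toy comparison below: a nearest-neighbour model that stores every training item and computes similarity on the fly, alongside a parametric model that compresses the data into a handful of coefficients. The data and models here are invented for illustration and do not come from the commentary.

```python
# Illustrative contrast only (not from the commentary): an exemplar-style model
# that stores all training items and computes similarity on the fly (k-NN),
# versus a model that abstracts the data into a small set of parameters.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))              # toy "linguistic" feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # toy binary outcome

exemplar_model = KNeighborsClassifier(n_neighbors=5).fit(X, y)   # keeps every exemplar
abstracting_model = LogisticRegression().fit(X, y)               # keeps only 6 parameters

print(exemplar_model.score(X, y), abstracting_model.score(X, y))
```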
Peer reviewed
Sinclair, Jeanne; Jang, Eunice Eunhee; Rudzicz, Frank – Journal of Educational Psychology, 2021
Advances in machine learning (ML) are poised to contribute to our understanding of the linguistic processes associated with successful reading comprehension, which is a critical aspect of children's educational success. We used ML techniques to investigate and compare associations between children's reading comprehension and 260 linguistic…
Descriptors: Prediction, Reading Comprehension, Natural Language Processing, Speech Communication
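The general recipe described in the abstract -- predicting a comprehension outcome from a large battery of linguistic features -- can be sketched with synthetic data as below; the paper's actual features, learners, and sample are not reproduced here.

```python
# A minimal sketch, with synthetic data, of relating reading-comprehension
# scores to a large set of linguistic features; the paper's actual features,
# models, and data are not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_children, n_features = 300, 260           # 260 linguistic features, as in the abstract
X = rng.normal(size=(n_children, n_features))
comprehension = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_children)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, comprehension, cv=5, scoring="r2")
print(scores.mean())                        # cross-validated variance explained
```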
Peer reviewed
Luka, Barbara J.; Choi, Heidi – Journal of Memory and Language, 2012
Three experiments examine whether a naturalistic reading task can induce long-lasting changes of syntactic patterns in memory. Judgment of grammatical acceptability is used as an indirect test of memory for sentences that are identical or only syntactically similar to those read earlier. In previous research (Luka & Barsalou, 2005) both sorts of…
Descriptors: Priming, Comprehension, Sentences, Grammar
Rajkumar, Rajakrishnan – ProQuest LLC, 2012
Natural Language Generation (NLG) is the process of generating natural language text from an input consisting of a communicative goal and a database or knowledge base. Informally, the architecture of a standard NLG system consists of the following modules (Reiter and Dale, 2000): content determination, sentence planning (or microplanning) and surface…
Descriptors: Natural Language Processing, Linguistics, Language Processing, Models
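The three-module pipeline cited from Reiter and Dale (2000) can be pictured as a chain of functions; the sketch below is a deliberately simplified stand-in, not the dissertation's system.

```python
# A schematic sketch of the standard NLG pipeline named in the abstract
# (Reiter and Dale, 2000); the stages here are trivially simplified stand-ins.

def content_determination(goal, knowledge_base):
    # pick which facts to express for the communicative goal
    return [fact for fact in knowledge_base if fact["topic"] == goal]

def sentence_planning(facts):
    # microplanning: group facts into abstract sentence specifications
    return [{"subject": f["entity"], "predicate": f["attribute"], "object": f["value"]}
            for f in facts]

def surface_realization(sentence_plans):
    # map each abstract specification onto a concrete word string
    return [f'{p["subject"]} has {p["predicate"]} {p["object"]}.' for p in sentence_plans]

kb = [{"topic": "weather", "entity": "Tuesday", "attribute": "a high of", "value": "18C"}]
print(surface_realization(sentence_planning(content_determination("weather", kb))))
```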
Wu, Stephen Tze-Inn – ProQuest LLC, 2010
This thesis aims to define and extend a line of computational models for text comprehension that are humanly plausible. Since natural language is human by nature, computational models of human language will always be just that--models. To the degree that they miss out on information that humans would tap into, they may be improved by considering…
Descriptors: Comprehension, Semantics, Syntax, Short Term Memory
Peer reviewed
Madden, Carol; Hoen, Michel; Dominey, Peter Ford – Brain and Language, 2010
This article addresses issues in embodied sentence processing from a "cognitive neural systems" approach that combines analysis of the behavior in question, analysis of the known neurophysiological bases of this behavior, and the synthesis of a neuro-computational model of embodied sentence processing that can be applied to and tested in the…
Descriptors: Sentences, Simulation, Interaction, Language Processing
Heintz, Ilana – ProQuest LLC, 2010
The goal of this dissertation is to introduce a method for deriving morphemes from Arabic words using stem patterns, a feature of Arabic morphology. The motivations are three-fold: modeling with morphemes rather than words should help address the out-of-vocabulary problem; working with stem patterns should prove to be a cross-dialectally valid…
Descriptors: Semitic Languages, Dialects, Vowels, Morphemes
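To illustrate what a stem pattern does, the sketch below matches a romanized, invented "CaCaC" template with a regular expression and pulls out the consonantal root; actual Arabic script and the dissertation's morphological model are not reproduced here.

```python
# Illustrative only: matching a romanized Arabic-style stem pattern with a
# regular expression; the dissertation's actual stem-pattern modeling is richer.
import re

# Toy pattern "CaCaC": three root consonants separated by 'a' vowels
stem_pattern = re.compile(r"^([^aeiou])a([^aeiou])a([^aeiou])$")

for word in ["katab", "darras", "kutib", "daras"]:
    match = stem_pattern.match(word)
    root = "".join(match.groups()) if match else None
    print(word, "->", root)   # extracts the consonantal root when the pattern fits
```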
Boyd-Graber, Jordan – ProQuest LLC, 2010
Topic models like latent Dirichlet allocation (LDA) provide a framework for analyzing large datasets where observations are collected into groups. Although topic modeling has been fruitfully applied to problems in social science, biology, and computer vision, it has been most widely used to model datasets where documents are modeled as exchangeable…
Descriptors: Language Patterns, Semantics, Linguistics, Multilingualism
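For concreteness, the sketch below fits a two-topic LDA model with scikit-learn on a toy corpus of bag-of-words documents; the corpus and settings are illustrative assumptions rather than anything from the dissertation.

```python
# Illustrative only: fitting a small LDA topic model of the kind the abstract
# describes, using scikit-learn on a toy corpus.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "genes protein cell expression biology",
    "court law judge legal ruling",
    "protein sequence cell genome",
    "judge court appeal law",
]

counts = CountVectorizer().fit(docs)
X = counts.transform(docs)                      # documents as exchangeable bags of words
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

vocab = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [vocab[i] for i in topic.argsort()[-4:]]
    print(f"topic {k}:", top)                   # highest-weight words per topic
```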
Huang, Jian – ProQuest LLC, 2010
With the increasing wealth of information on the Web, information integration is ubiquitous as the same real-world entity may appear in a variety of forms extracted from different sources. This dissertation proposes supervised and unsupervised algorithms that are naturally integrated in a scalable framework to solve the entity resolution problem,…
Descriptors: Electronic Libraries, Language Processing, Profiles, Social Networks
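The core entity-resolution idea -- deciding when differently written mentions refer to the same real-world entity -- can be caricatured with a string-similarity clustering like the one below; the dissertation's supervised and unsupervised framework is far richer than this toy threshold rule, and the names are invented.

```python
# A minimal sketch of the entity-resolution idea in the abstract: grouping
# surface forms that likely refer to the same real-world entity by string
# similarity, using a naive pairwise-threshold rule.
from difflib import SequenceMatcher

mentions = ["J. Huang", "Jian Huang", "John Smith", "J. Smith"]

def similar(a, b, threshold=0.6):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

clusters = []
for mention in mentions:
    for cluster in clusters:
        if any(similar(mention, member) for member in cluster):
            cluster.append(mention)
            break
    else:
        clusters.append([mention])

print(clusters)   # mentions grouped into tentative entity clusters
```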