Publication Date
In 2025 | 0 |
Since 2024 | 1 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 1 |
Since 2006 (last 20 years) | 4 |
Source
Cognitive Science | 5 |
Author
Swingley, Daniel | 1 |
Frank, Robert | 1 |
Gatt, Albert | 1 |
Graesser, Arthur C. | 1 |
Jackson, G. Tanner | 1 |
Jordan, Pamela | 1 |
Koolen, Ruud | 1 |
Krahmer, Emiel | 1 |
Matthews, Danielle E. | 1 |
Olney, Andrew | 1 |
Power, Richard | 1 |
Publication Type
Journal Articles | 5 |
Reports - Evaluative | 5 |
Opinion Papers | 1 |
Education Level
Higher Education | 1 |
Swingley, Daniel; Algayres, Robin – Cognitive Science, 2024
Computational models of infant word-finding typically operate over transcriptions of infant-directed speech corpora. It is now possible to test models of word segmentation on speech materials, rather than transcriptions of speech. We propose that such modeling efforts be conducted over the speech of the experimental stimuli used in studies…
Descriptors: Sentences, Word Recognition, Psycholinguistics, Infants
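The entry above contrasts segmentation models that operate over transcriptions of infant-directed speech with models tested on speech materials themselves. For orientation only, here is a minimal sketch of the transcription-based setting: a simple transitional-probability segmenter over syllabified utterances. This is an illustrative heuristic, not the model proposed or evaluated in the paper; the toy corpus and syllable representation are assumptions.

```python
# Minimal sketch (not the authors' model): transitional-probability word
# segmentation over phonemically/syllabically transcribed utterances.
# Boundaries are posited at local minima of forward transitional probability.
from collections import Counter
from typing import Dict, List, Tuple

def transition_probs(utterances: List[List[str]]) -> Dict[Tuple[str, str], float]:
    """Estimate P(next syllable | current syllable) from within-utterance bigrams."""
    unigrams, bigrams = Counter(), Counter()
    for utt in utterances:
        for a, b in zip(utt, utt[1:]):
            unigrams[a] += 1
            bigrams[(a, b)] += 1
    return {(a, b): c / unigrams[a] for (a, b), c in bigrams.items()}

def segment(utt: List[str], tp: Dict[Tuple[str, str], float]) -> List[List[str]]:
    """Split one utterance at local minima of transitional probability."""
    if len(utt) < 3:
        return [utt]
    probs = [tp.get((a, b), 0.0) for a, b in zip(utt, utt[1:])]
    words, current = [], [utt[0]]
    for i in range(1, len(utt)):
        left = probs[i - 1]  # TP into syllable i
        is_min = (i - 2 < 0 or probs[i - 2] > left) and (i >= len(probs) or probs[i] > left)
        if is_min:  # dip in predictability -> posit a word boundary
            words.append(current)
            current = []
        current.append(utt[i])
    words.append(current)
    return words

# Toy corpus: "pretty" recurs as a unit, so its internal transition stays strong.
corpus = [["pre", "ty", "ba", "by"], ["pre", "ty", "doll"], ["big", "ba", "by"]]
tp = transition_probs(corpus)
print(segment(["pre", "ty", "ba", "by"], tp))  # -> [['pre', 'ty'], ['ba', 'by']]
```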
Krahmer, Emiel; Koolen, Ruud; Theune, Mariët – Cognitive Science, 2012
In a recent article published in this journal (van Deemter, Gatt, van der Sluis, & Power, 2012), the authors criticize the Incremental Algorithm (a well-known algorithm for the generation of referring expressions due to Dale & Reiter, 1995, also in this journal) because of its strong reliance on a pre-determined, domain-dependent Preference Order.…
Descriptors: Natural Language Processing, Mathematics, Computational Linguistics
van Deemter, Kees; Gatt, Albert; van der Sluis, Ielka; Power, Richard – Cognitive Science, 2012
A substantial amount of recent work in natural language generation has focused on the generation of "one-shot" referring expressions whose only aim is to identify a target referent. Dale and Reiter's Incremental Algorithm (IA) is often thought to be the best algorithm for maximizing the similarity to referring expressions produced by people. We…
Descriptors: Natural Language Processing, Mathematics, Computational Linguistics
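The two entries above both debate Dale and Reiter's (1995) Incremental Algorithm (IA) and its reliance on a pre-determined, domain-dependent Preference Order. As a minimal sketch of the IA's core loop, the following uses an illustrative attribute-value scene representation that is not drawn from either paper (the original algorithm also always includes the head-noun "type" attribute; that detail is omitted here for brevity).

```python
# Minimal sketch of the Incremental Algorithm: try attributes in a fixed
# preference order; add a property whenever it rules out at least one
# remaining distractor; never retract a property once added.
from typing import Dict, List, Tuple

def incremental_algorithm(
    target: Dict[str, str],
    distractors: List[Dict[str, str]],
    preference_order: List[str],
) -> List[Tuple[str, str]]:
    """Return attribute-value pairs that jointly distinguish the target."""
    description: List[Tuple[str, str]] = []
    remaining = list(distractors)
    for attr in preference_order:
        value = target.get(attr)
        if value is None:
            continue
        ruled_out = [d for d in remaining if d.get(attr) != value]
        if ruled_out:  # the property is useful, so include it
            description.append((attr, value))
            remaining = [d for d in remaining if d.get(attr) == value]
        if not remaining:  # referent uniquely identified
            break
    return description

# Toy scene: refer to the small black dog among three other objects.
target = {"type": "dog", "colour": "black", "size": "small"}
others = [
    {"type": "cat", "colour": "black", "size": "small"},
    {"type": "dog", "colour": "brown", "size": "small"},
    {"type": "dog", "colour": "black", "size": "large"},
]
print(incremental_algorithm(target, others, ["type", "colour", "size"]))
# -> [('type', 'dog'), ('colour', 'black'), ('size', 'small')]
```

The behaviour criticized in the van Deemter et al. entry is visible in the signature: the output depends entirely on the `preference_order` argument supplied up front, rather than on any property of the scene itself.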
Frank, Robert – Cognitive Science, 2004
Theories of natural language syntax often characterize grammatical knowledge as a form of abstract computation. This paper argues that such a characterization is correct, and that fundamental properties of grammar can and should be understood in terms of restrictions on the complexity of possible grammatical computation, when defined in terms of…
Descriptors: Syntax, Natural Language Processing, Computational Linguistics, Generative Grammar
Matthews, Danielle E.; VanLehn, Kurt; Graesser, Arthur C.; Jackson, G. Tanner; Jordan, Pamela; Olney, Andrew; Rosé, Carolyn P. – Cognitive Science, 2007
It is often assumed that engaging in a one-on-one dialogue with a tutor is more effective than listening to a lecture or reading a text. Although earlier experiments have not always supported this hypothesis, this may be due in part to allowing the tutors to cover different content than the noninteractive instruction. In 7 experiments, we tested…
Descriptors: Tutoring, Natural Language Processing, Physics, Computer Assisted Instruction