Publication Date
In 2025 | 0
Since 2024 | 1
Since 2021 (last 5 years) | 3
Since 2016 (last 10 years) | 8
Since 2006 (last 20 years) | 23
Descriptor
Artificial Languages | 30
Models | 30
Grammar | 13
Language Acquisition | 12
Language Research | 7
Cognitive Processes | 6
Comparative Analysis | 6
Language Processing | 6
Learning Processes | 6
Second Language Learning | 6
Syntax | 6
Author
Culbertson, Jennifer | 3
Frank, Michael C. | 2
Smolensky, Paul | 2
Keogh, Aislinn | 1
Amato, Michael S. | 1
Anderson, John R. | 1
Arnon, Inbal | 1
Aslin, Richard N. | 1
Avrutin, Sergey | 1
Bierschenk, Bernhard | 1
Bierschenk, Inger | 1
Publication Type
Journal Articles | 22
Reports - Research | 19
Dissertations/Theses -… | 5
Reports - Evaluative | 4
Dissertations/Theses -… | 1
Information Analyses | 1
Reports - Descriptive | 1
Tests/Questionnaires | 1
Education Level
Adult Education | 2
Higher Education | 2
Postsecondary Education | 2
Location
Canada | 1
United States | 1
Aislinn Keogh; Simon Kirby; Jennifer Culbertson – Cognitive Science, 2024
General principles of human cognition can help to explain why languages are more likely to have certain characteristics than others: structures that are difficult to process or produce will tend to be lost over time. One aspect of cognition that is implicated in language use is working memory--the component of short-term memory used for temporary…
Descriptors: Language Variation, Learning Processes, Short Term Memory, Schemata (Cognition)
Carter, William Thomas Jeffrey – ProQuest LLC, 2019
Despite the success of Optimality Theory (OT), opaque alternations prove difficult to capture with constraints, and some violate the theory's formal restrictions. Here, I propose a novel account of opacity drawing upon developments in psychology. Rather than one grammar, I propose a dual-system model with implicit and explicit mechanisms, a domain-specific OT-like system…
Descriptors: Grammar, Language Patterns, Models, Psychology
Cooper, Angela; Paquette-Smith, Melissa; Bordignon, Caterina; Johnson, Elizabeth K. – Language Learning and Development, 2023
Foreign accents can vary considerably in the degree to which they deviate from the listener's native accent, but little is known about how the relationship between a speaker's accent and a listener's native language phonology mediates adaptation. Using an artificial accent methodology, we addressed this issue by constructing a set of three…
Descriptors: Pronunciation, Auditory Perception, Adults, Toddlers
Thontirawong, Pipat; Chinchanachokchai, Sydney – Marketing Education Review, 2021
In the age of big data and analytics, it is important that students learn about artificial intelligence (AI) and machine learning (ML). Machine learning is a discipline that focuses on building a computer system that can improve itself using experience. ML models can be used to detect patterns from data and recommend strategic marketing actions.…
Descriptors: Marketing, Artificial Languages, Career Development, Time Management
Radulescu, Silvia; Wijnen, Frank; Avrutin, Sergey – Language Learning and Development, 2020
From limited evidence, children track the regularities of their language impressively fast and they infer generalized rules that apply to novel instances. This study investigated what drives the inductive leap from memorizing specific items and statistical regularities to extracting abstract rules. We propose an innovative entropy model that…
Descriptors: Linguistic Input, Language Acquisition, Grammar, Learning Processes
Fowlie, Meaghan – ProQuest LLC, 2017
Adjuncts and arguments exhibit different syntactic behaviours, but modelling this difference in minimalist syntax is challenging: on the one hand, adjuncts differ from arguments in that they are optional, transparent, and iterable, but on the other hand they are often strictly ordered, reflecting the kind of strict selection seen in argument…
Descriptors: Persuasive Discourse, Syntax, Form Classes (Languages), Language Research
Schuler, Kathryn Dolores – ProQuest LLC, 2017
In natural language, evidence suggests that, while some rules are productive (regular), applying broadly to new words, others are restricted to a specific set of lexical items (irregular). Further, the literature suggests that children make a categorical distinction between regular and irregular rules, applying only regular rules productively…
Descriptors: Prediction, Linguistic Theory, Language Acquisition, Grammar
Ouyang, Long; Boroditsky, Lera; Frank, Michael C. – Cognitive Science, 2017
Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of…
Descriptors: Semiotics, Computational Linguistics, Syntax, Semantics
Kurumada, Chigusa; Meylan, Stephan C.; Frank, Michael C. – Cognition, 2013
Word frequencies in natural language follow a highly skewed Zipfian distribution, but the consequences of this distribution for language acquisition are only beginning to be understood. Typically, learning experiments that are meant to simulate language acquisition use uniform word frequency distributions. We examine the effects of Zipfian…
Descriptors: Statistical Distributions, Word Frequency, Language Acquisition, Artificial Languages
Perruchet, Pierre; Poulin-Charronnat, Benedicte – Journal of Memory and Language, 2012
Endress and Mehler (2009) reported that when adult subjects are exposed to an unsegmented artificial language composed from trisyllabic words such as ABX, YBC, and AZC, they are unable to distinguish between these words and what they coined as the "phantom-word" ABC in a subsequent test. This suggests that statistical learning generates knowledge…
Descriptors: Artificial Languages, Probability, Models, Simulation
Aslin, Richard N.; Newport, Elissa L. – Language Learning, 2014
In the past 15 years, a substantial body of evidence has confirmed that a powerful distributional learning mechanism is present in infants, children, adults and (at least to some degree) in nonhuman animals as well. The present article briefly reviews this literature and then examines some of the fundamental questions that must be addressed for…
Descriptors: Linguistic Input, Grammar, Language Research, Computational Linguistics
Culbertson, Jennifer; Smolensky, Paul – Cognitive Science, 2012
In this article, we develop a hierarchical Bayesian model of learning in a general type of artificial language-learning experiment in which learners are exposed to a mixture of grammars representing the variation present in real learners' input, particularly at times of language change. The modeling goal is to formalize and quantify hypothesized…
Descriptors: Models, Bayesian Statistics, Artificial Languages, Language Acquisition
Hamrick, Phillip – Language Learning, 2014
Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…
Descriptors: Second Language Learning, Role, Syntax, Computational Linguistics
Arnon, Inbal; Ramscar, Michael – Cognition, 2012
Why do adult language learners typically fail to acquire second languages with native proficiency? Does prior linguistic experience influence the size of the "units" adults attend to in learning, and if so, how does this influence what gets learned? Here, we examine these questions in relation to grammatical gender, which adult learners almost…
Descriptors: Sentences, Nouns, Grammar, Linguistics
Sakas, William Gregory; Fodor, Janet Dean – Language Acquisition: A Journal of Developmental Linguistics, 2012
We present data from an artificial language domain that suggest new contributions to the theory of syntactic triggers. Whether a learning algorithm is capable of matching the achievements of child learners depends in part on how much parametric ambiguity there is in the input. For practical reasons this cannot be established for the domain of all…
Descriptors: Ambiguity (Semantics), Artificial Languages, Language Acquisition, Linguistic Theory