Publication Date
  In 2025: 0
  Since 2024: 1
  Since 2021 (last 5 years): 3
  Since 2016 (last 10 years): 8

Descriptor
  Artificial Languages: 8
  Models: 8
  Language Acquisition: 4
  Learning Processes: 4
  Adults: 3
  Generalization: 3
  Grammar: 3
  Language Research: 3
  Prediction: 3
  Psycholinguistics: 3
  Auditory Stimuli: 2
Publication Type
  Journal Articles: 5
  Reports - Research: 5
  Dissertations/Theses -…: 3
Education Level
  Higher Education: 1
  Postsecondary Education: 1
Location
  Canada: 1
Aislinn Keogh; Simon Kirby; Jennifer Culbertson – Cognitive Science, 2024
General principles of human cognition can help to explain why languages are more likely to have certain characteristics than others: structures that are difficult to process or produce will tend to be lost over time. One aspect of cognition that is implicated in language use is working memory--the component of short-term memory used for temporary…
Descriptors: Language Variation, Learning Processes, Short Term Memory, Schemata (Cognition)
Carter, William Thomas Jeffrey – ProQuest LLC, 2019
Despite the success of Optimality Theory (OT), opaque alternations prove difficult to capture with constraints, and some violate the theory's formal restrictions. Here, I propose a novel account of opacity drawing upon developments in psychology. Rather than one grammar, I propose a dual-system model with implicit and explicit mechanisms, a domain-specific OT-like system…
Descriptors: Grammar, Language Patterns, Models, Psychology
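
To make the OT-style component concrete, here is a minimal sketch of ranked-constraint evaluation, with made-up constraints and candidate forms rather than the analysis developed in the dissertation:

```python
# Toy OT-style evaluation: each candidate output is scored against a ranked
# list of constraints (higher-ranked first); the candidate with the
# lexicographically smallest violation profile wins. Constraints and
# candidates are illustrative placeholders only.

VOWELS = set("aeiou")

def no_coda(form):
    """Count syllables that end in a consonant."""
    return sum(1 for syll in form.split(".") if syll[-1] not in VOWELS)

def dep_io(form):
    """Count epenthetic segments (marked here with '+')."""
    return form.count("+")

def evaluate(candidates, ranked_constraints):
    return min(candidates, key=lambda c: tuple(con(c) for con in ranked_constraints))

# Input /patka/ -> faithful "pat.ka" vs. epenthesized "pa.t+a.ka"
print(evaluate(["pat.ka", "pa.t+a.ka"], [no_coda, dep_io]))  # -> "pa.t+a.ka"
```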
Cooper, Angela; Paquette-Smith, Melissa; Bordignon, Caterina; Johnson, Elizabeth K. – Language Learning and Development, 2023
Foreign accents can vary considerably in the degree to which they deviate from the listener's native accent, but little is known about how the relationship between a speaker's accent and a listener's native language phonology mediates adaptation. Using an artificial accent methodology, we addressed this issue by constructing a set of three…
Descriptors: Pronunciation, Auditory Perception, Adults, Toddlers
Thontirawong, Pipat; Chinchanachokchai, Sydney – Marketing Education Review, 2021
In the age of big data and analytics, it is important that students learn about artificial intelligence (AI) and machine learning (ML). Machine learning is a discipline that focuses on building a computer system that can improve itself using experience. ML models can be used to detect patterns from data and recommend strategic marketing actions.…
Descriptors: Marketing, Artificial Languages, Career Development, Time Management
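
A minimal sketch of the kind of pattern-detection workflow the abstract describes, using scikit-learn and invented toy customer data (not materials from the article):

```python
# A model learns a pattern from past customer data and then recommends an
# action for new customers; "improving with experience" means the decision
# rule comes from the data, not hand-written logic.
from sklearn.linear_model import LogisticRegression

# Each row: [ad_clicks_last_month, purchases_last_year] (invented values)
X_train = [[0, 0], [1, 0], [3, 1], [5, 2], [8, 4], [10, 6]]
y_train = [0, 0, 0, 1, 1, 1]  # 1 = responded to a past email campaign

model = LogisticRegression().fit(X_train, y_train)

# Target the new customers the model scores as likely responders.
new_customers = [[2, 0], [9, 5]]
print(model.predict(new_customers))  # e.g. [0 1]
```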
Radulescu, Silvia; Wijnen, Frank; Avrutin, Sergey – Language Learning and Development, 2020
From limited evidence, children track the regularities of their language impressively fast and they infer generalized rules that apply to novel instances. This study investigated what drives the inductive leap from memorizing specific items and statistical regularities to extracting abstract rules. We propose an innovative entropy model that…
Descriptors: Linguistic Input, Language Acquisition, Grammar, Learning Processes
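
The entropy model itself is not spelled out in the snippet; the sketch below shows only the underlying textbook quantity, Shannon entropy over the items in a learner's input:

```python
# Shannon entropy of the item distribution in the input:
# H = -sum_i p_i * log2(p_i). This is the basic quantity an entropy-based
# account of rule induction builds on, not the specific model in the article.
from collections import Counter
from math import log2

def entropy(items):
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Low-variability input (few distinct items) has low entropy; high-variability
# input, where memorizing individual items is costly, has high entropy.
print(entropy(["ba", "ba", "ba", "po", "po", "po"]))  # 1.0 bit
print(entropy(["ba", "po", "ki", "du", "me", "to"]))  # ~2.58 bits
```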
Fowlie, Meaghan – ProQuest LLC, 2017
Adjuncts and arguments exhibit different syntactic behaviours, but modelling this difference in minimalist syntax is challenging: on the one hand, adjuncts differ from arguments in that they are optional, transparent, and iterable, but on the other hand they are often strictly ordered, reflecting the kind of strict selection seen in argument…
Descriptors: Persuasive Discourse, Syntax, Form Classes (Languages), Language Research
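
As a rough illustration of the tension the abstract describes (optional, iterable adjuncts that are nonetheless strictly ordered), here is a toy well-formedness check with invented ordering classes; it is not the Minimalist Grammar machinery developed in the dissertation:

```python
# Arguments are selected by the head and obligatory; adjuncts are optional
# and iterable but must respect a fixed order (here: manner < place < time).
from dataclasses import dataclass, field

ADJUNCT_ORDER = ["manner", "place", "time"]  # hypothetical ordering classes

@dataclass
class VerbPhrase:
    verb: str
    arguments: list                                # selected, obligatory
    adjuncts: list = field(default_factory=list)   # (class, phrase) pairs

    def well_formed(self):
        has_args = len(self.arguments) > 0
        ranks = [ADJUNCT_ORDER.index(cls) for cls, _ in self.adjuncts]
        return has_args and ranks == sorted(ranks)  # ordered, but iterable

vp = VerbPhrase("ate", ["the cake"],
                [("manner", "quickly"), ("place", "at home"), ("time", "yesterday")])
print(vp.well_formed())  # True
```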
Schuler, Kathryn Dolores – ProQuest LLC, 2017
In natural language, evidence suggests that, while some rules are productive (regular), applying broadly to new words, others are restricted to a specific set of lexical items (irregular). Further, the literature suggests that children make a categorical distinction between regular and irregular rules, applying only regular rules productively…
Descriptors: Prediction, Linguistic Theory, Language Acquisition, Grammar
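
One prominent quantitative criterion in this literature for when a rule is applied productively is the Tolerance Principle (a rule over N items tolerates at most N / ln N exceptions); the snippet does not say whether this is the account tested here, so the sketch below is only an illustration of a categorical regular/irregular prediction:

```python
# Tolerance Principle threshold: a rule covering n_items is productive iff
# its exceptions number at most n_items / ln(n_items). Illustrative only.
from math import log

def is_productive(n_items, n_exceptions):
    return n_exceptions <= n_items / log(n_items)

print(is_productive(9, 4))  # True:  4 <= 9 / ln 9 ~= 4.1 -> rule generalizes
print(is_productive(9, 5))  # False: 5 exceptions tip it into item-by-item storage
```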
Ouyang, Long; Boroditsky, Lera; Frank, Michael C. – Cognitive Science, 2017
Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of…
Descriptors: Semiotics, Computational Linguistics, Syntax, Semantics
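
A minimal sketch of the distributional idea in the abstract: words with quantitatively similar context counts come out as similar under a measure such as cosine. The toy corpus is invented for illustration:

```python
# Build bag-of-words context vectors from a tiny corpus and compare them
# with cosine similarity; words sharing contexts (postman/mailman) score
# higher than unrelated pairs.
from collections import Counter
from math import sqrt

corpus = [
    "the postman delivered the letter",
    "the mailman delivered the package",
    "the postman carried the package",
    "the dog chased the mailman",
    "the dog buried the bone",
]

def context_vector(word, window=2):
    counts = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        for i, tok in enumerate(tokens):
            if tok == word:
                for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                    if j != i:
                        counts[tokens[j]] += 1
    return counts

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in set(u) | set(v))
    norm = lambda w: sqrt(sum(x * x for x in w.values()))
    return dot / (norm(u) * norm(v))

print(cosine(context_vector("postman"), context_vector("mailman")))  # ~0.92
print(cosine(context_vector("postman"), context_vector("bone")))     # ~0.67
```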