Shipley, Bill – Structural Equation Modeling, 2002
Describes a method for choosing rejection probabilities for the tests of independence that are used in constraint-based algorithms of exploratory path analysis. The method consists of generating a Markov or semi-Markov model from the equivalence class represented by a partial ancestral graph and then testing the d-separation implications. (SLD)
Descriptors: Algorithms, Markov Processes, Path Analysis
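The core move here, testing a conditional-independence implication of a graph, can be illustrated with a toy sketch (not Shipley's procedure): in the chain X → Y → Z, d-separation implies X ⊥ Z | Y, which a partial-correlation test on simulated data should confirm. All variable names and coefficients below are illustrative assumptions.

```python
# Illustrative check of one d-separation implication (X _||_ Z | Y)
# for the chain graph X -> Y -> Z, via partial correlation.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(size=n)   # Y depends on X
z = 0.8 * y + rng.normal(size=n)   # Z depends on Y only

def partial_corr(a, b, c):
    """Correlation of a and b after removing the linear effect of c."""
    ra = a - np.polyval(np.polyfit(c, a, 1), c)
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return np.corrcoef(ra, rb)[0, 1]

marginal = np.corrcoef(x, z)[0, 1]   # clearly non-zero: X and Z covary
conditional = partial_corr(x, z, y)  # near zero: X _||_ Z given Y
```

In a constraint-based search, each such implied independence would be tested at some rejection probability; choosing those probabilities well is the subject of the article.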

Seltzer, Michael; Novak, John; Choi, Kilchan; Lim, Nelson – Journal of Educational and Behavioral Statistics, 2002
Examines the ways in which level-1 outliers can impact the estimation of fixed effects and random effects in hierarchical models (HMs). Also outlines and illustrates the use of Markov Chain Monte Carlo algorithms for conducting sensitivity analyses under "t" level-1 assumptions, including algorithms for settings in which the degrees of…
Descriptors: Algorithms, Estimation (Mathematics), Markov Processes, Monte Carlo Methods
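The intuition, that heavy-tailed "t" level-1 assumptions temper the pull of outliers on estimates, can be sketched with a minimal Metropolis sampler (not the authors' algorithms; the data, step size, and degrees of freedom below are illustrative assumptions):

```python
# Minimal Metropolis sketch: posterior mean of a location parameter mu
# under normal vs. Student-t (df=3) level-1 errors, with one gross outlier.
import numpy as np

rng = np.random.default_rng(1)
data = np.array([0.1, -0.2, 0.3, 0.0, -0.1, 8.0])  # last value is an outlier

def log_lik_normal(mu):
    return -0.5 * np.sum((data - mu) ** 2)

def log_lik_t(mu, df=3.0):
    # Student-t log-likelihood (unit scale), up to an additive constant
    return -0.5 * (df + 1) * np.sum(np.log1p((data - mu) ** 2 / df))

def metropolis(log_lik, n_iter=20000, step=0.5):
    mu, ll, samples = 0.0, log_lik(0.0), []
    for _ in range(n_iter):
        prop = mu + step * rng.normal()
        ll_prop = log_lik(prop)
        if np.log(rng.uniform()) < ll_prop - ll:  # flat prior on mu
            mu, ll = prop, ll_prop
        samples.append(mu)
    return np.mean(samples[n_iter // 2:])  # discard burn-in

mean_normal = metropolis(log_lik_normal)  # dragged toward the outlier
mean_t = metropolis(log_lik_t)            # stays near the bulk of the data
```

Under the normal model the posterior mean tracks the sample mean (≈1.35 here), while the t model largely discounts the outlier; the article develops this kind of sensitivity analysis in the hierarchical-model setting.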

Bookstein, Abraham; Klein, Shmuel T.; Raita, Timo – Information Processing & Management, 1997
Discussion of text compression focuses on a method to reduce the amount of storage needed to represent a Markov model with an extended alphabet, by applying a clustering scheme that brings together similar states. Highlights include probability vectors; algorithms; implementation details; and experimental data with natural languages. (Author/LRW)
Descriptors: Algorithms, Computer Science, Markov Processes, Models
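The general idea of merging states with similar probability vectors can be sketched in a few lines (a toy illustration, not the paper's clustering scheme; the sample text and distance threshold are assumptions):

```python
# Toy sketch: build a first-order character Markov model, then greedily
# merge states whose next-character probability vectors are close in L1
# distance, so fewer probability vectors need to be stored.
from collections import Counter, defaultdict

text = "abababcabababcababab"
counts = defaultdict(Counter)
for a, b in zip(text, text[1:]):
    counts[a][b] += 1

alphabet = sorted(set(text))

def prob_vector(state):
    total = sum(counts[state].values())
    return [counts[state][c] / total for c in alphabet]

vectors = {s: prob_vector(s) for s in counts}

# Greedy merge: each state joins the first cluster whose representative
# vector is within L1 distance 0.6, else it opens a new cluster.
clusters = []  # list of (representative_vector, member_states)
for s, v in vectors.items():
    for rep, members in clusters:
        if sum(abs(x - y) for x, y in zip(rep, v)) < 0.6:
            members.append(s)
            break
    else:
        clusters.append((v, [s]))

n_states, n_clusters = len(vectors), len(clusters)
# n_clusters < n_states: the merged model stores fewer probability vectors
```

Here states 'b' and 'c' have similar successor distributions and collapse into one cluster, so only two probability vectors are stored instead of three; the paper studies this trade-off at scale with extended alphabets and natural-language data.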