Showing all 15 results
Peer reviewed
Akaeze, Hope O.; Wu, Jamie Heng-Chieh; Lawrence, Frank R.; Weber, Everett P. – Journal of Psychoeducational Assessment, 2023
This paper reports an investigation into the psychometric properties of the COR-Advantage1.5 (COR-Adv1.5) assessment tool, a criterion-referenced, observation-based instrument designed to assess the developmental abilities of children from birth through kindergarten. Using data from 8,534 children participating in a state-funded preschool program…
Descriptors: Criterion Referenced Tests, Evaluation Methods, Measures (Individuals), Measurement Techniques
Peer reviewed
Martin, Andrew J.; Yu, Kai; Papworth, Brad; Ginns, Paul; Collie, Rebecca J. – Journal of Psychoeducational Assessment, 2015
This study explored motivation and engagement among North American (the United States and Canada; n = 1,540), U.K. (n = 1,558), Australian (n = 2,283), and Chinese (n = 3,753) secondary school students. Motivation and engagement were assessed via students' responses to the Motivation and Engagement Scale-High School (MES-HS). Confirmatory factor…
Descriptors: Foreign Countries, Motivation, Learner Engagement, Secondary School Students
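As a point of reference for the confirmatory factor analysis reported by Martin et al. (2015), the standard single-group CFA measurement model and the measurement-invariance constraints typically tested when comparing country samples can be written as follows; the notation is generic and assumed for illustration, not taken from the article.

```latex
% Generic CFA measurement model for item j and respondent i in group g,
% with the invariance constraints usually tested across groups
% (illustrative notation, not the article's).
\[
  x_{ij}^{(g)} \;=\; \tau_j^{(g)} + \lambda_j^{(g)} \xi_i^{(g)} + \varepsilon_{ij}^{(g)},
  \qquad
  \varepsilon_{ij}^{(g)} \sim \mathcal{N}\!\bigl(0,\, \theta_{jj}^{(g)}\bigr)
\]
\[
  \text{metric invariance: } \lambda_j^{(1)} = \dots = \lambda_j^{(G)},
  \qquad
  \text{scalar invariance: } \tau_j^{(1)} = \dots = \tau_j^{(G)}
\]
```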
Peer reviewed
Rhemtulla, Mijke; Brosseau-Liard, Patricia E.; Savalei, Victoria – Psychological Methods, 2012
A simulation study compared the performance of robust normal theory maximum likelihood (ML) and robust categorical least squares (cat-LS) methodology for estimating confirmatory factor analysis models with ordinal variables. Data were generated from 2 models with 2-7 categories, 4 sample sizes, 2 latent distributions, and 5 patterns of category…
Descriptors: Factor Analysis, Computation, Simulation, Sample Size
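To make the kind of simulation design described in the Rhemtulla, Brosseau-Liard, and Savalei (2012) abstract concrete, the sketch below generates ordinal indicators from a one-factor model by thresholding continuous responses; the sample size, loadings, and thresholds are illustrative placeholders, not the study's actual conditions.

```python
# Minimal sketch (not the study's design): ordinal indicators obtained by
# discretizing continuous one-factor responses at fixed thresholds.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

n, n_items, n_categories = 500, 4, 5     # illustrative values
loadings = np.full(n_items, 0.7)         # hypothetical standardized loadings

eta = rng.standard_normal(n)                                  # latent factor scores
errors = rng.standard_normal((n, n_items)) * np.sqrt(1.0 - loadings**2)
y_continuous = eta[:, None] * loadings + errors               # continuous responses

# Discretize at equally spaced normal quantiles to get ordered categories 0..4.
thresholds = norm.ppf(np.linspace(0, 1, n_categories + 1)[1:-1])
y_ordinal = np.digitize(y_continuous, thresholds)

print(y_ordinal[:3])
```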
Peer reviewed
Tucker-Drob, Elliot M.; Salthouse, Timothy A. – International Journal of Behavioral Development, 2009
Although factor analysis is the most commonly used method for examining the structure of cognitive variable interrelations, multidimensional scaling (MDS) can provide visual representations highlighting the continuous nature of interrelations among variables. Using data (N = 8,813; ages 17-97 years) aggregated across 38 separate studies, MDS was…
Descriptors: Construct Validity, Multidimensional Scaling, Factor Analysis, Cognitive Ability
Peer reviewed
Dickes, Paul; Valentova, Marie; Borsenberger, Monique – Social Indicators Research, 2010
The aim of the paper is to assess the construct validity of a multidimensional measure of social cohesion that is theoretically well grounded and has an equivalent/comparable interpretation across all European countries. Published research on social cohesion to date is deficient in one or both of these important aspects. This paper…
Descriptors: Construct Validity, Multidimensional Scaling, Factor Analysis, Foreign Countries
Peer reviewed
Kim, Se-Kang; Davison, Mark L.; Frisby, Craig L. – Multivariate Behavioral Research, 2007
This paper describes the Confirmatory Factor Analysis (CFA) parameterization of the Profile Analysis via Multidimensional Scaling (PAMS) model to demonstrate validation of profile pattern hypotheses derived from multidimensional scaling (MDS). PAMS is an exploratory method for identifying major…
Descriptors: Profiles, Factor Analysis, Multidimensional Scaling, Evaluation Methods
Peer reviewed
Kunina-Habenicht, Olga; Rupp, Andre A.; Wilhelm, Oliver – Studies in Educational Evaluation, 2009
In recent years, there has been increasing international interest in fine-grained diagnostic inferences on multiple skills for formative purposes. Providing such inferences in a way that supports meaningful instructional decision-making requires (a) careful diagnostic assessment design coupled with (b) empirical support for the structure…
Descriptors: Educational Testing, Diagnostic Tests, Multidimensional Scaling, Factor Analysis
Peer reviewed
Burdsal, Charles A.; Harrison, Paul D. – Assessment & Evaluation in Higher Education, 2008
The purpose of this research is to provide additional empirical evidence supporting the use of both a multidimensional profile and an overall evaluation of teaching effectiveness as valid indicators of student perceptions of effective classroom instruction. A factor analytic teaching evaluation instrument was used that also included open-ended…
Descriptors: Student Evaluation of Teacher Performance, Factor Analysis, Profiles, Multidimensional Scaling
Peer reviewed
Subkoviak, Michael J. – Review of Educational Research, 1975
Illustrated is the power of multidimensional scaling in reducing a complex set of proximity measures to a simple geometric picture that shows the relationship among data objects. Methods are discussed for determining the number of dimensions needed to represent a set. (Author/DEP)
Descriptors: Educational Research, Evaluation Methods, Factor Analysis, Mathematical Models
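A minimal illustration of the idea in the Subkoviak (1975) abstract -- reducing a proximity matrix to a low-dimensional configuration and using stress to judge how many dimensions are needed -- is sketched below with scikit-learn; the data are synthetic and the API choice is simply one convenient implementation, not anything used in the article.

```python
# Illustrative sketch: fit metric MDS to a dissimilarity matrix and watch
# stress decrease as the number of dimensions grows (synthetic data).
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
points = rng.standard_normal((20, 3))          # hypothetical objects
dissimilarities = pairwise_distances(points)   # symmetric proximity matrix

for k in range(1, 6):
    mds = MDS(n_components=k, dissimilarity="precomputed", random_state=0)
    mds.fit(dissimilarities)
    print(k, round(mds.stress_, 3))            # stress by dimensionality
```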
Peer reviewed
Stone, Clement A.; Yeh, Chien-Chi – Educational and Psychological Measurement, 2006
Examination of a test's internal structure can be used to identify what domains or dimensions are being measured, identify relationships between the dimensions, provide evidence for hypothesized multidimensionality and test score interpretations, and identify construct-irrelevant variance. The purpose of this research is to provide a…
Descriptors: Multiple Choice Tests, Factor Structure, Factor Analysis, Licensing Examinations (Professions)
Peer reviewed
Krus, David J.; And Others – Research in Higher Education, 1975
Reports an exploratory study designed to describe the development of the scales, investigate concerns regarding student perceptions of a large university, and examine, in a standard research situation, the viability of a new method of multidimensional analysis (order analysis, proposed as an inferential alternative to factor analysis). (JT)
Descriptors: Evaluation Methods, Factor Analysis, Higher Education, Institutional Environment
Peer reviewed
Farley, John U.; And Others – Multivariate Behavioral Research, 1974
Evaluations of attributes of a subcompact car were combined in linear regressions predicting liking and purchase intention. Of two forms--raw scales and scales weighted by the importance attached to each attribute by each subject--unweighted evaluations proved to be more consistent and important predictors than those weighted by their saliency. (Author)
Descriptors: Attitudes, Decision Making, Design Preferences, Design Requirements
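The comparison described in the Farley et al. (1974) abstract -- raw attribute evaluations versus importance-weighted ones as regression predictors -- can be sketched as follows; the data, weights, and outcome here are entirely synthetic and hypothetical, used only to show the shape of the analysis.

```python
# Illustrative sketch (synthetic data, not Farley et al.'s): compare raw and
# importance-weighted attribute evaluations as predictors of overall liking.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n, n_attributes = 200, 5

ratings = rng.normal(size=(n, n_attributes))                  # raw evaluations
importance = rng.uniform(0.5, 1.5, size=(n, n_attributes))    # per-subject weights
liking = ratings @ np.ones(n_attributes) + rng.normal(scale=2.0, size=n)

for name, X in [("raw", ratings), ("importance-weighted", ratings * importance)]:
    r2 = LinearRegression().fit(X, liking).score(X, liking)
    print(name, "R^2 =", round(r2, 3))
```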
McCroskey, James C.; McCain, Thomas A. – 1972
This paper reports a factor analytic investigation of the interpersonal attraction construct. Two hundred fifteen subjects completed 30 Likert-type, 7-step scales concerning an acquaintance. Factor analysis indicated three dimensions of the interpersonal attraction construct, which were labeled "task," "social," and "physical." Obtained internal…
Descriptors: Attitudes, Evaluation Criteria, Evaluation Methods, Factor Analysis
Peer reviewed
Raymond, Mark R. – Evaluation and the Health Professions, 1989
Multidimensional scaling (MDS) and its potential use for research and evaluation in health-related professions are discussed. Useful data types, interpretation of results, and various applications of MDS are presented. MDS is less restrictive than factor analysis since it does not assume a linear relationship between the objects/variables of…
Descriptors: Allied Health Occupations, Cluster Analysis, Data Interpretation, Discriminant Analysis
Peer reviewed
Brunner, Martin; Süß, Heinz-Martin – Educational and Psychological Measurement, 2005
Two aspects of the reliability of multidimensional measures can be distinguished: the amount of scale score variance that is accounted for by all underlying factors (composite reliability) and the degree to which the scale score reflects one particular factor (construct reliability). Confidence intervals for composite and construct reliabilities…
Descriptors: Measures (Individuals), Intervals, Intelligence Tests, Evaluation Methods
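The distinction drawn in the Brunner and Süß (2005) abstract maps onto the usual coefficient-omega formulas for a model with one general factor g and group factors s; the notation below is assumed for illustration rather than taken from the article.

```latex
% Scale-score variance under a model with a general factor g and group factors s,
% and the two reliability coefficients (illustrative notation, not the article's).
\[
  \sigma_X^{2} \;=\; \Bigl(\sum_i \lambda_{g,i}\Bigr)^{2}
                  + \sum_s \Bigl(\sum_{i \in s} \lambda_{s,i}\Bigr)^{2}
                  + \sum_i \theta_{ii}
\]
\[
  \omega_{\text{composite}} \;=\;
    \frac{\Bigl(\sum_i \lambda_{g,i}\Bigr)^{2} + \sum_s \Bigl(\sum_{i \in s} \lambda_{s,i}\Bigr)^{2}}{\sigma_X^{2}},
  \qquad
  \omega_{\text{construct}} \;=\;
    \frac{\Bigl(\sum_i \lambda_{g,i}\Bigr)^{2}}{\sigma_X^{2}}
\]
```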