Showing all 10 results
Peer reviewed
Takane, Yoshio; Jung, Sunho – Psychometrika, 2008
Methods of incorporating a ridge type of regularization into partial redundancy analysis (PRA), constrained redundancy analysis (CRA), and partial and constrained redundancy analysis (PCRA) were discussed. The usefulness of ridge estimation in reducing mean square error (MSE) has been recognized in multiple regression analysis for some time,…
Descriptors: Predictor Variables, Multiple Regression Analysis, Least Squares Statistics, Data Analysis
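The regularization the abstract refers to is the ridge idea familiar from multiple regression: shrinking coefficient estimates toward zero to trade a little bias for a large reduction in variance (and hence MSE) when predictors are nearly collinear. A minimal sketch of the plain ridge estimator on simulated data, not the authors' PRA/CRA/PCRA methods:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated regression data with a near-collinear predictor pair,
# where ordinary least squares tends to have high variance.
n, p = 50, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)  # near-collinear with column 0
beta = np.array([1.0, 1.0, 0.5, 0.0, 0.0])
y = X @ beta + rng.normal(size=n)

def ridge(X, y, lam):
    """Ridge estimator: (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)      # lam = 0 recovers ordinary least squares
b_ridge = ridge(X, y, 10.0)   # lam > 0 shrinks the estimate

# Shrinkage: the ridge coefficient vector always has smaller norm than OLS.
print(np.linalg.norm(b_ridge) < np.linalg.norm(b_ols))  # True
```

The choice of the ridge constant `lam` governs the bias-variance trade-off; the value 10.0 here is arbitrary, chosen only to make the shrinkage visible.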
Peer reviewed
Olkin, Ingram – Psychometrika, 1981
It is known that for trivariate distributions, if two correlations are fixed, the remaining correlation is constrained. If just one is fixed, the remaining two are constrained. Both results are extended to the case of a multivariate distribution. (Author/JKS)
Descriptors: Correlation, Data Analysis, Matrices, Multiple Regression Analysis
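In the trivariate case, the constraint arises because the 3x3 correlation matrix must remain positive semidefinite: fixing r12 and r13 confines r23 to the interval r12*r13 ± sqrt((1 - r12^2)(1 - r13^2)). A small numerical check of this classical bound (`r23_bounds` and `is_psd` are illustrative helper names, not from the paper):

```python
import numpy as np

def r23_bounds(r12, r13):
    """Range of the remaining correlation r23 that keeps the 3x3
    correlation matrix positive semidefinite."""
    d = np.sqrt((1 - r12**2) * (1 - r13**2))
    return r12 * r13 - d, r12 * r13 + d

lo, hi = r23_bounds(0.8, 0.8)
print(lo, hi)  # approximately 0.28 and 1.0: r23 is forced to be positive

# Cross-check against positive semidefiniteness directly.
def is_psd(r23, r12=0.8, r13=0.8):
    R = np.array([[1, r12, r13],
                  [r12, 1, r23],
                  [r13, r23, 1]])
    return np.linalg.eigvalsh(R).min() >= -1e-12

print(is_psd(0.29), is_psd(0.27))  # just inside vs. just outside the bound
```

The multivariate extension discussed in the paper generalizes this same positive-semidefiniteness requirement to larger correlation matrices.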
Peer reviewed
Tyler, David E. – Psychometrika, 1982
The index of redundancy is a measure of association between a set of independent variables and a set of dependent variables. Properties and interpretations of redundancy variables, in a particular subset of the original variables, are discussed. (JKS)
Descriptors: Correlation, Data Analysis, Multiple Regression Analysis, Multivariate Analysis
Peer reviewed
Hedges, Larry V.; Olkin, Ingram – Psychometrika, 1981
Commonality components have been defined as a method of partitioning squared multiple correlations. The asymptotic joint distribution of all possible squared multiple correlations is derived. The asymptotic joint distribution of linear combinations of squared multiple correlations is obtained as a corollary. (Author/JKS)
Descriptors: Correlation, Data Analysis, Mathematical Models, Multiple Regression Analysis
Peer reviewed
Muller, Keith E. – Psychometrika, 1981
Redundancy analysis is an attempt to provide nonsymmetric measures of the dependence of one set of variables on another set. This paper attempts to clarify the nature of redundancy analysis and its relationships to canonical correlation and multivariate multiple linear regression. (Author/JKS)
Descriptors: Correlation, Data Analysis, Multiple Regression Analysis, Multivariate Analysis
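The nonsymmetry the abstract emphasizes can be seen in the total redundancy index (in the Stewart-Love sense): the average squared multiple correlation of each variable in one set regressed on the other set, which generally changes when the roles of the two sets are swapped. A sketch under that definition, not Muller's own derivation:

```python
import numpy as np

def redundancy_index(X, Y):
    """Average squared multiple correlation of each Y variable
    regressed on the full X set (total redundancy of Y given X)."""
    Xc = np.column_stack([np.ones(len(X)), X])  # add intercept
    r2 = []
    for j in range(Y.shape[1]):
        y = Y[:, j]
        coef, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        resid = y - Xc @ coef
        r2.append(1 - resid.var() / y.var())
    return float(np.mean(r2))

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))
B = np.array([[1.0, 0.0],
              [0.5, 1.0],
              [0.0, -1.0]])
Y = X @ B + rng.normal(size=(n, 2))

# Nonsymmetric: the dependence of Y on X is not the dependence of X on Y.
print(redundancy_index(X, Y), redundancy_index(Y, X))
```

Contrast this with canonical correlation, which is symmetric in the two variable sets — the distinction the paper sets out to clarify.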
Peer reviewed
Harris, Chester W. – Psychometrika, 1978
A simple proof is presented: the squared multiple correlation of a variable with the remaining variables in a set is a lower bound to the communality of that variable. (Author/JKS)
Descriptors: Correlation, Data Analysis, Factor Analysis, Mathematical Models
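The squared multiple correlation (SMC) of variable i with the rest of the set can be read directly off the inverse correlation matrix via the standard identity SMC_i = 1 - 1/(R^{-1})_{ii}. A numerical illustration of the bound on a one-factor correlation matrix, where the communality of each variable is its squared loading (a sketch, not Harris's proof):

```python
import numpy as np

# One-factor model: R = L L' off the diagonal with unit diagonal,
# so the communality of variable i is loadings[i]**2.
loadings = np.array([0.9, 0.8, 0.7, 0.6])
R = np.outer(loadings, loadings)
np.fill_diagonal(R, 1.0)

# Squared multiple correlation of each variable with the remaining ones:
# SMC_i = 1 - 1 / (R^{-1})_{ii}
smc = 1 - 1 / np.diag(np.linalg.inv(R))
communality = loadings**2

print(np.all(smc <= communality + 1e-12))  # the lower bound holds: True
```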
Peer reviewed
Koopman, Raymond F. – Psychometrika, 1976
This note proposes an alternative implementation of the regression method which should be slightly faster than the principal components methods for estimating missing data. (RC)
Descriptors: Comparative Analysis, Data Analysis, Factor Analysis, Multiple Regression Analysis
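The regression method for missing data estimates a missing entry by regressing its variable on the observed variables over the complete cases, then predicting from the incomplete case's observed values. A minimal sketch of that general idea on simulated data (not Koopman's specific implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Complete data with one value held out to play the role of "missing".
n = 200
x = rng.normal(size=(n, 2))
y = x @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=n)

true_val = y[0]
# Fit y on x using the complete cases (all rows but the first),
# then predict the missing entry from its observed predictors.
X_cc, y_cc = x[1:], y[1:]
coef, *_ = np.linalg.lstsq(X_cc, y_cc, rcond=None)
estimate = x[0] @ coef

print(estimate, true_val)  # the regression estimate tracks the held-out value
```

The principal-components alternative the note compares against would instead reconstruct the missing entry from a low-rank approximation of the data matrix; the regression route avoids the eigendecomposition, which is the source of the claimed speed advantage.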
Peer reviewed
Bentler, P. M.; Freeman, Edward H. – Psychometrika, 1983
Interpretations regarding the effects of exogenous and endogenous variables on endogenous variables in linear structural equation systems depend upon the convergence of a matrix power series. The test for convergence developed by Jöreskog and Sörbom is shown to be only sufficient, not necessary and sufficient. (Author/JKS)
Descriptors: Data Analysis, Mathematical Models, Matrices, Multiple Regression Analysis
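The power series in question is the Neumann series I + B + B^2 + ..., and the standard necessary-and-sufficient condition for its convergence is that the spectral radius of B (the largest eigenvalue in absolute value) is below 1, in which case it sums to (I - B)^{-1}. A small numerical check of this general fact (`power_series_converges` is an illustrative helper, not from the paper):

```python
import numpy as np

def power_series_converges(B):
    """The series I + B + B^2 + ... converges exactly when the
    spectral radius of B is strictly below 1."""
    return np.max(np.abs(np.linalg.eigvals(B))) < 1

B = np.array([[0.0, 0.9],
              [0.5, 0.0]])
print(power_series_converges(B))  # True: spectral radius is sqrt(0.45)

# When it converges, the partial sums approach (I - B)^{-1}.
S = np.linalg.inv(np.eye(2) - B)
approx = sum(np.linalg.matrix_power(B, k) for k in range(60))
print(np.allclose(S, approx))  # True
```

A merely sufficient test (e.g., one based on a matrix norm rather than the spectral radius) can fail for matrices like this one whose norm exceeds 1 even though the series converges — the gap the paper identifies.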
Peer reviewed
Steiger, James H. – Psychometrika, 1979
A theorem which gives the range of possible correlations between a common factor and an external variable (not contained in the factor analysis) is presented. Analogous expressions for component theory are also derived. (Author/JKS)
Descriptors: Correlation, Data Analysis, Factor Analysis, Multiple Regression Analysis
Peer reviewed
Ramsay, J. O. – Psychometrika, 1975
Many data analysis problems in psychology may be posed conveniently in terms which place the parameters to be estimated on one side of an equation and an expression in these parameters on the other side. A rule for improving the rate of convergence of the iterative solution of such equations is developed and applied to four problems. (Author/RC)
Descriptors: Computer Programs, Data Analysis, Factor Analysis, Individual Differences
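Equations of the form x = g(x), with the parameters on one side and an expression in them on the other, are typically solved by fixed-point iteration, and the rate of that iteration can be improved by acceleration rules. Aitken's delta-squared rule is one classical example, sketched here purely as an illustration of the idea — it is not necessarily the rule developed in the paper:

```python
import math

def aitken(g, x0, n_iter=5):
    """Aitken delta-squared acceleration of the fixed-point
    iteration x_{k+1} = g(x_k)."""
    x = x0
    for _ in range(n_iter):
        x1, x2 = g(x), g(g(x))
        denom = x2 - 2 * x1 + x
        if denom == 0:          # already converged to machine precision
            return x2
        x = x - (x1 - x) ** 2 / denom
    return x

# Fixed point of cos: the solution of x = cos(x), roughly 0.739085.
x_star = aitken(math.cos, 1.0)
print(x_star)
```

Plain iteration of `math.cos` from the same start needs dozens of steps for comparable accuracy; the accelerated update converges in a handful, which is the kind of rate improvement the abstract describes.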