Showing all 3 results
Peer reviewed
Quellmalz, Edys S.; Davenport, Jodi L.; Timms, Michael J.; DeBoer, George E.; Jordan, Kevin A.; Huang, Chun-Wei; Buckley, Barbara C. – Journal of Educational Psychology, 2013
How can assessments measure complex science learning? Although traditional multiple-choice items can effectively measure declarative knowledge such as scientific facts or definitions, they are considered less well suited to providing evidence of science inquiry practices such as making observations or designing and conducting investigations.…
Descriptors: Science Education, Educational Assessment, Psychometrics, Science Tests
Peer reviewed
Bowler, Mark C.; Woehr, David J. – Journal of Vocational Behavior, 2009
Recent Monte Carlo research has illustrated that the traditional method for assessing the construct-related validity of assessment center (AC) post-exercise dimension ratings (PEDRs), an application of confirmatory factor analysis (CFA) to a multitrait-multimethod matrix, produces inconsistent results [Lance, C. E., Woehr, D. J., & Meade, A. W.…
Descriptors: Monte Carlo Methods, Multitrait Multimethod Techniques, Construct Validity, Validity
Peer reviewed
Czogalik, Dietmar; Russell, Robert L. – Journal of Consulting and Clinical Psychology, 1995
Assigned factor scores to therapist and client utterances. Constructed a 17 × 17 correlation matrix consisting of correlations across 4 lagged utterances. Principal-components analysis revealed 4 therapist-client interaction factors: mutual therapeutic engagement, mutual therapeutic negotiation, undirected client reminiscence, and sustained…
Descriptors: Clinical Psychology, Counselor Client Relationship, Factor Analysis, Factor Structure