Showing all 12 results
Peer reviewed
Liou, Gloria; Bonner, Cavan V.; Tay, Louis – International Journal of Testing, 2022
With the advent of big data and advances in technology, psychological assessments have become increasingly sophisticated and complex. Nevertheless, traditional psychometric issues concerning the validity, reliability, and measurement bias of such assessments remain fundamental in determining whether score inferences of human attributes are…
Descriptors: Psychometrics, Computer Assisted Testing, Adaptive Testing, Data
Peer reviewed
Choi, Youn-Jeng; Asilkalkan, Abdullah – Measurement: Interdisciplinary Research and Perspectives, 2019
About 45 R packages for analyzing data with item response theory (IRT) have been developed over the last decade. This article introduces these 45 R packages with their descriptions and features. It also describes advanced IRT models that can be fit with R packages, as well as dichotomous and polytomous IRT models, and R packages that contain applications…
Descriptors: Item Response Theory, Data Analysis, Computer Software, Test Bias
Peer reviewed
Feinberg, Mark E.; Gomez, Brendan J.; Puddy, Richard W.; Greenberg, Mark T. – Health Education & Behavior, 2008
Community coalitions (CCs) have labored with some difficulty to demonstrate empirical evidence of effectiveness in preventing a wide range of adolescent problem behaviors. Training and technical assistance (TA) have been identified as important elements in promoting improved functioning of CCs. A reliable, valid, and inexpensive method to assess…
Descriptors: Prevention, Construct Validity, Risk, Questionnaires
Peer reviewed
Jacobs, Ronald L.; And Others – Journal of Educational Computing Research, 1985
This study adapted the Hidden Figures Test for use on PLATO and determined the reliability of the computerized version compared to the paper and pencil version. Results indicate the test was successfully adapted with some modifications, and it was judged reliable although it may be measuring additional constructs. (MBR)
Descriptors: Computer Assisted Testing, Educational Research, Field Dependence Independence, Higher Education
Oosterhof, Albert C.; Salisbury, David F. – 1984
The Assessment Resource Center (ARC) at Florida State University provides computer assisted testing (CAT) for approximately 4,000 students each term. Computer capabilities permit a small proctoring staff to administer tests simultaneously to large numbers of students. Programs provide immediate feedback for students and generate a variety of…
Descriptors: Computer Assisted Testing, Criterion Referenced Tests, Feedback, Higher Education
Bruno, James E. – Journal of Computer-Based Instruction, 1987
Reports preliminary findings of a study that used a modified Admissible Probability Measurement (APM) test scoring system in the design of computer-based instructional management systems. The use of APM for curriculum analysis is discussed, as well as its value in enhancing individualized learning. (Author/LRW)
Descriptors: Computer Assisted Testing, Computer Managed Instruction, Curriculum Evaluation, Design
Peer reviewed
Nelson, Larry R. – Educational Measurement: Issues and Practice, 1984
The author argues that scoring, reporting, and deriving final grades can be considerably assisted by using a computer. He also contends that the time saved and the resulting computer database will allow instructors to gauge test quality and reflect on the quality of instruction. (BW)
Descriptors: Achievement Tests, Affective Objectives, Computer Assisted Testing, Educational Testing
Sorensen, H. Barbara – AEDS Monitor, 1985
This study validated computerized versions of selected tests from the Kit of Factor-Referenced Cognitive Tests of 1976, compared performance of higher education students on computerized and paper-and-pencil versions of the tests, and explored whether gender, academic status, or age interacted with the tests. (MBR)
Descriptors: Age Differences, Cognitive Ability, Cognitive Tests, Comparative Analysis
O'Brien, Michael; Hampilos, John P. – 1984
The feasibility of creating an item bank from a teacher-made test was examined in two comparable sections of a graduate-level introductory measurement course. The 67-item midterm examination contained multiple-choice and master matching items, which required higher level cognitive processes such as application and analysis. The feasibility of…
Descriptors: Computer Assisted Testing, Criterion Referenced Tests, Difficulty Level, Higher Education
Carlson, Sybil B.; Camp, Roberta – 1985
This paper reports on Educational Testing Service research studies investigating the parameters critical to reliability and validity in both the direct and indirect writing ability assessment of higher education applicants. The studies involved: (1) formulating an operational definition of writing competence; (2) designing and pretesting writing…
Descriptors: College Entrance Examinations, Computer Assisted Testing, English (Second Language), Essay Tests
Harnisch, Delwyn L. – 1985
Computer adaptive testing systems are feasible for certification and licensure testing, due in part to the wide availability of inexpensive computers. Modern item response theory, combined with computerized adaptive testing, yields a powerful new method of testing which provides greater accuracy and efficiency and less boredom for…
Descriptors: Adaptive Testing, Certification, Computer Assisted Testing, Cost Effectiveness
Wisniewski, Dennis R. – 1986
Three questions concerning the Binary Search Method (BSM) of computerized adaptive testing were studied: (1) whether it provided a reliable and valid estimation of examinee ability; (2) its effect on examinee attitudes toward computerized adaptive testing and conventional paper-and-pencil testing; and (3) the relationship between item response…
Descriptors: Adaptive Testing, Computer Assisted Testing, Difficulty Level, Grade 5