Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 4 |
Since 2006 (last 20 years) | 9 |
Descriptor
Higher Education | 99 |
Item Banks | 99 |
Test Construction | 54 |
Test Items | 44 |
Computer Assisted Testing | 38 |
Item Analysis | 17 |
Multiple Choice Tests | 17 |
College Students | 16 |
Foreign Countries | 16 |
Test Validity | 16 |
Adaptive Testing | 15 |
Author
Bejar, Isaac I. | 3 |
Reckase, Mark D. | 3 |
Seely, Oliver, Jr. | 3 |
Adema, Jos J. | 2 |
Ory, John C. | 2 |
Willis, Van | 2 |
van der Linden, Wim J. | 2 |
Aesche, Darryl W. | 1 |
Ardolino, Piermatteo | 1 |
Benbasat, Izak | 1 |
Bergstrom, Betty A. | 1 |
Education Level
Higher Education | 9 |
Postsecondary Education | 6 |
Elementary Secondary Education | 2 |
Audience
Practitioners | 4 |
Researchers | 2 |
Teachers | 1 |
Location
United Kingdom | 3 |
India | 2 |
Ireland | 2 |
Italy | 2 |
Netherlands | 2 |
South Korea | 2 |
Tennessee | 2 |
United States | 2 |
Africa | 1 |
Asia | 1 |
Australia | 1 |
Laws, Policies, & Programs
Comprehensive Education… | 2 |
Ardolino, Piermatteo; Noventa, Stefano; Formicuzzi, Maddalena; Cubico, Serena; Favretto, Giuseppe – Higher Education: The International Journal of Higher Education Research, 2016
An observational study was carried out to analyse differences in performance between students of different undergraduate curricula on the same written business administration examination, focusing particularly on possible effects of "integrated" or "multi-modular" examinations, a format that has recently become widespread in Italian…
Descriptors: Business Administration, Undergraduate Study, Higher Education, Foreign Countries
Oliveri, Maria Elena; Lawless, Rene; Robin, Frederic; Bridgeman, Brent – Applied Measurement in Education, 2018
We analyzed a pool of items from an admissions test for differential item functioning (DIF) for groups based on age, socioeconomic status, citizenship, or English language status using Mantel-Haenszel and item response theory. DIF items were systematically examined to identify their possible sources by item type, content, and wording. DIF was…
Descriptors: Test Bias, Comparative Analysis, Item Banks, Item Response Theory
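The Mantel-Haenszel procedure mentioned above is straightforward to compute once examinees are matched on total score. Below is a minimal sketch (not the authors' code; the "score", "group", and "item1" column names and the demo data are hypothetical) of the MH common odds ratio and the ETS delta-MH statistic for a single dichotomous item.

```python
# Minimal Mantel-Haenszel DIF sketch for one dichotomous item.
import numpy as np
import pandas as pd

def mantel_haenszel_dif(df, item, group_col="group", score_col="score",
                        ref="reference", focal="focal"):
    """Return the MH common odds ratio and the ETS delta-MH for one item."""
    num, den = 0.0, 0.0
    for _, stratum in df.groupby(score_col):   # strata matched on total score
        n = len(stratum)
        ref_rows = stratum[stratum[group_col] == ref]
        foc_rows = stratum[stratum[group_col] == focal]
        a = (ref_rows[item] == 1).sum()        # reference group, correct
        b = (ref_rows[item] == 0).sum()        # reference group, incorrect
        c = (foc_rows[item] == 1).sum()        # focal group, correct
        d = (foc_rows[item] == 0).sum()        # focal group, incorrect
        num += a * d / n
        den += b * c / n
    alpha_mh = num / den if den > 0 else np.nan
    delta_mh = -2.35 * np.log(alpha_mh)        # ETS delta metric
    return alpha_mh, delta_mh

# Hypothetical demo: 8 examinees in two score strata.
demo = pd.DataFrame({
    "score": [10, 10, 10, 10, 11, 11, 11, 11],
    "group": ["reference", "reference", "focal", "focal",
              "reference", "reference", "focal", "focal"],
    "item1": [1, 0, 1, 0, 1, 1, 1, 0],
})
print(mantel_haenszel_dif(demo, "item1"))
```

Under the common ETS classification, items whose absolute delta-MH reaches 1.5 (and is statistically significant) are typically flagged as showing large, "C"-level DIF.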
Choy, S. Chee; Goh, Pauline Swee Choo; Sedhu, Daljeet Singh – International Journal of Teaching and Learning in Higher Education, 2016
The development of the 21-item Learner Awareness Levels Questionnaire (LALQ) was carried out using data from three separate studies. The LALQ is a self-reporting questionnaire assessing how and why students learn. Study 1 refined the initial pool of items to 21 using exploratory factor analysis. In Study 2, the analysis showed evidence for a…
Descriptors: Questionnaires, Higher Education, College Students, Factor Analysis
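For readers curious how an item pool is trimmed with exploratory factor analysis, here is a minimal sketch under assumed conditions (simulated responses with a rough three-factor structure and an arbitrary 0.40 loading cutoff); it is not the LALQ authors' analysis.

```python
# Sketch: fit a factor model and keep items whose strongest loading clears a cutoff.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 300, 30, 3

# Simulate responses with an approximate 3-factor structure (purely illustrative).
latent = rng.normal(size=(n_respondents, n_factors))
true_loadings = rng.uniform(0.2, 0.9, size=(n_factors, n_items))
X = latent @ true_loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=n_factors)
fa.fit(X)
loadings = fa.components_.T            # shape: (n_items, n_factors)

cutoff = 0.40                          # conventional, but arbitrary here
kept = [j for j in range(n_items) if np.max(np.abs(loadings[j])) >= cutoff]
print(f"Retained {len(kept)} of {n_items} items")
```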
Cheung, K. Y. F.; Stupple, E. J. N.; Elander, J. – Studies in Higher Education, 2017
One approach to plagiarism prevention focuses on improving students' authorial identity, but work in this area depends on robust measures. This paper presents the development of a psychometrically robust measure of authorial identity--the Student Attitudes and Beliefs about Authorship Scale. In the item generation phase, a pool of items was…
Descriptors: Student Attitudes, Plagiarism, Academic Discourse, Psychometrics
Risquez, Angelica; Raftery, Damien; Costello, Eamon – British Journal of Educational Technology, 2015
The Irish inter-institutional virtual learning environments (VLEs) open dataset stems from a rolling longitudinal survey of students' usage of VLEs that has been running in 12 higher education institutions since 2008. The project has collected over 21,000 student responses to date through the growth of an extended network of…
Descriptors: Foreign Countries, Virtual Classrooms, Educational Environment, Student Attitudes
McAllister, Daniel; Guidice, Rebecca M. – Teaching in Higher Education, 2012
The primary goal of teaching is to successfully facilitate learning. Testing can help accomplish this goal in two ways. First, testing can provide a powerful motivation for students to prepare when they perceive that the effort involved leads to valued outcomes. Second, testing can provide instructors with valuable feedback on whether their…
Descriptors: Testing, Role, Student Motivation, Feedback (Response)
Farrow, Robert; Pitt, Rebecca; de los Arcos, Beatriz; Perryman, Leigh-Anne; Weller, Martin; McAndrew, Patrick – British Journal of Educational Technology, 2015
The true power of comparative research around the impact and use of open educational resources is only just being realised, largely through the work done by the Hewlett-funded OER Research Hub, based at The Open University (UK). Since late 2012, the project has used a combination of surveys, interviews and focus groups to gather data about the use…
Descriptors: Educational Resources, Open Source Technology, Surveys, Interviews
Shelton, Kaye – ProQuest LLC, 2010
As the demands for public accountability increase for the higher education industry, institutions are seeking methods for continuous improvement in order to demonstrate quality within programs and processes, including those provided through online education. Because of the rapid growth of online education programs, institutions are further called…
Descriptors: Delphi Technique, Higher Education, Distance Education, Online Courses
van der Linden, Wim J. – 1998
Six methods for assembling tests from a pool with an item-set structure are presented. All methods are computational and based on the technique of mixed integer programming. The methods are evaluated using such criteria as the feasibility of their linear programming problems and their expected solution times. The methods are illustrated for two…
Descriptors: Higher Education, Item Banks, Selection, Test Construction
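To illustrate the kind of mixed integer programming formulation the abstract refers to, here is a minimal sketch (hypothetical pool, information values, and constraints; not one of van der Linden's six methods) of assembling a fixed-length test from a pool with an item-set structure, using the PuLP solver.

```python
# 0-1 programming sketch of test assembly with an item-set structure.
import pulp

# Hypothetical pool: item_id -> (set_id, information at the cut score).
items = {
    1: ("A", 0.42), 2: ("A", 0.37), 3: ("A", 0.25),
    4: ("B", 0.51), 5: ("B", 0.33), 6: ("C", 0.46), 7: ("C", 0.29),
}
sets = {"A", "B", "C"}
TEST_LENGTH = 4               # items per form
MAX_SETS = 2                  # stimuli per form

prob = pulp.LpProblem("test_assembly", pulp.LpMaximize)
x = pulp.LpVariable.dicts("item", list(items), cat="Binary")   # item selected?
z = pulp.LpVariable.dicts("set", list(sets), cat="Binary")     # set selected?

# Objective: maximize total information at the cut score.
prob += pulp.lpSum(info * x[i] for i, (_, info) in items.items())

# Constraints: fixed test length, limited number of sets, item-set linkage.
prob += pulp.lpSum(x.values()) == TEST_LENGTH
prob += pulp.lpSum(z.values()) <= MAX_SETS
for i, (s, _) in items.items():
    prob += x[i] <= z[s]      # an item enters the form only if its stimulus does

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("Selected items:", [i for i in items if x[i].value() == 1])
```

The binary variables x select items and z select stimulus sets; the linkage constraint x_i <= z_s is what encodes the item-set structure in this toy formulation.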

Clute, Ronald C.; McGrail, George R. – Journal of Education for Business, 1989
Eight test banks that accompany cost accounting textbooks were evaluated for the presence of bias in the distribution of correct responses. All but one showed considerable bias, and three of the eight showed significant choice bias. (SK)
Descriptors: Accounting, Higher Education, Item Banks, Multiple Choice Tests
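Checking a test bank for positional bias in its answer key reduces to a goodness-of-fit test. A minimal sketch follows, with hypothetical counts rather than the authors' data, using a chi-square test against a uniform distribution of correct-option positions.

```python
# Chi-square goodness-of-fit check on the answer-key position distribution.
from scipy.stats import chisquare

# Observed counts of the correct answer landing in option A, B, C, D (hypothetical).
observed = [140, 95, 88, 77]
stat, p = chisquare(observed)          # default expectation: uniform across options
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Key positions deviate from a uniform distribution (possible choice bias).")
```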

Schnirman, Geoffrey M.; Welsh, Marilyn C.; Retzlaff, Paul D. – Assessment, 1998
The Tower of London test (T. Shallice, 1982), a measure of executive function, was reconstructed to increase its reliability through revisions tested with successive samples of 50, 50, and 34 college students. Adjusting the item pool resulted in acceptable test-retest reliability. (SLD)
Descriptors: Cognitive Tests, College Students, Higher Education, Item Banks
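Test-retest reliability of the kind reported here is usually estimated as the correlation between scores from two administrations of the same measure. A minimal sketch with made-up scores (not the study's data):

```python
# Test-retest reliability as a Pearson correlation across two occasions.
import numpy as np
from scipy.stats import pearsonr

time1 = np.array([12, 15, 9, 14, 11, 16, 10, 13])   # hypothetical scores, occasion 1
time2 = np.array([13, 14, 10, 15, 10, 17, 9, 12])   # same examinees, occasion 2
r, p = pearsonr(time1, time2)
print(f"test-retest r = {r:.2f} (p = {p:.3f})")
```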
Michener, R. Dean; And Others – 1978
A specific application of the process of automating exams for any introductory statistics course is described. The process of automating exams was accomplished by using the Statistical Test Item Collection System (STICS). This system was first used to select a set of questions based on course requirements established in advance; afterward, STICS…
Descriptors: Computer Assisted Testing, Computer Programs, Higher Education, Item Banks

Seely, Oliver, Jr.; Willis, Van – AEDS Journal, 1976
Describes the SOCRATES Computer-Assisted Test Retrieval system available to the faculty and students of the California State University and Colleges. (Author/IRT)
Descriptors: Computer Assisted Instruction, Databases, Educational Testing, Higher Education
Chen, Hui-Chuan; And Others – Educational Research and Methods, 1977
Discusses how to design a large exam question file for any discipline and how to use this file to extract examinations. (MLH)
Descriptors: Computer Assisted Testing, Computers, Evaluation, Higher Education

Masters, Joan C.; Hulsmeyer, Barbara S.; Pike, Mary E.; Leichty, Kathy; Miller, Margaret T.; Verst, Amy L. – Journal of Nursing Education, 2001
A sample of 2,913 questions from 17 nursing test banks was evaluated for adherence to multiple-choice guidelines, cognitive level in Bloom's Taxonomy, and distribution of correct answers. Analysis revealed 2,233 guideline violations; 47.3% of items were written at the knowledge level, 6.5% at the analysis level; and correct answers were evenly…
Descriptors: Higher Education, Item Analysis, Item Banks, Multiple Choice Tests