Daniel A. DeCino; Steven R. Chesnut; Phillip L. Waalkes; Reed N. Keen – Measurement and Evaluation in Counseling and Development, 2025
Objective: The purpose of this study was to develop and validate the Counselor Self-Reflection Inventory (CSRI) from a Transformative Learning Theory framework for counselors and counselors-in-training to use in clinical and training settings. Method: A sample of 351 counselors, mostly female (86.89%), white (85.19%), and holding an MS or MA (88.08%)…
Descriptors: Test Construction, Test Validity, Test Reliability, Attitude Measures
Irwin, Clare W.; Stafford, Erin T. – Regional Educational Laboratory Northeast & Islands, 2016
This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey…
Descriptors: Surveys, Test Construction, Educational Cooperation, Test Items

Fox, Robert A. – Journal of School Health, 1980
Some practical guidelines for developing multiple choice tests are offered. Included are three steps: (1) test design; (2) proper construction of test items; and (3) item analysis and evaluation. (JMF)
Descriptors: Guidelines, Objective Tests, Planning, Test Construction
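The item-analysis step Fox lists is commonly carried out with two classical statistics: item difficulty (the proportion of examinees answering correctly) and item discrimination (the point-biserial correlation between item score and total score). A minimal sketch in Python; the function names and sample data are illustrative, not drawn from the article:

```python
# Classical item analysis: difficulty (proportion correct) and
# discrimination (point-biserial correlation with total score).
# Data below are made up for illustration.

def item_difficulty(responses):
    """Proportion of examinees answering the item correctly (0..1)."""
    return sum(responses) / len(responses)

def item_discrimination(item_scores, total_scores):
    """Point-biserial correlation between item score and total test score."""
    n = len(item_scores)
    mean_i = sum(item_scores) / n
    mean_t = sum(total_scores) / n
    cov = sum((i - mean_i) * (t - mean_t)
              for i, t in zip(item_scores, total_scores)) / n
    var_i = sum((i - mean_i) ** 2 for i in item_scores) / n
    var_t = sum((t - mean_t) ** 2 for t in total_scores) / n
    if var_i == 0 or var_t == 0:
        return 0.0  # item answered uniformly; correlation undefined
    return cov / (var_i ** 0.5 * var_t ** 0.5)

# Five examinees: one item (1 = correct, 0 = incorrect) and their test totals.
item = [1, 1, 0, 1, 0]
totals = [9, 8, 4, 7, 3]
print(round(item_difficulty(item), 2))               # 0.6
print(round(item_discrimination(item, totals), 2))   # 0.95
```

Items with very high or very low difficulty, or low discrimination, are the usual candidates for revision in step three.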
Berk, Ronald A. – Educational Technology, 1980
Examines four factors involved in the determination of how many test items should be constructed or sampled for a set of objectives: (1) the type of decision to be made with results, (2) importance of objectives, (3) number of objectives, and (4) practical constraints. Specific guidelines that teachers and evaluators can use and an illustrative…
Descriptors: Behavioral Objectives, Criterion Referenced Tests, Guidelines, Test Construction
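One practical reading of Berk's factors (2) and (4) — objective importance under a fixed item budget — is proportional allocation of items across objectives. A sketch using largest-remainder apportionment; the weights and item budget are illustrative, not taken from the article:

```python
# Allocate a fixed number of test items across objectives in proportion
# to importance weights (largest-remainder apportionment).
# Weights and the 30-item budget are illustrative.

def allocate_items(weights, total_items):
    """Return an item count per objective, proportional to its weight."""
    total_w = sum(weights)
    quotas = [w * total_items / total_w for w in weights]
    counts = [int(q) for q in quotas]          # integer parts first
    remainder = total_items - sum(counts)
    # Hand leftover items to the objectives with the largest fractional parts.
    order = sorted(range(len(weights)),
                   key=lambda i: quotas[i] - counts[i], reverse=True)
    for i in order[:remainder]:
        counts[i] += 1
    return counts

print(allocate_items([3, 2, 1], 30))  # [15, 10, 5]
```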
Mizokawa, Donald T.; Hamlin, Michael D. – Educational Technology, 1984
Suggestions for software design in computer managed testing (CMT) cover instructions to testees, their physical format, provision of practice items, and time limit information; test item presentation, physical format, discussion of task demands, review capabilities, and rate of presentation; pedagogically helpful utilities; typefonts; vocabulary;…
Descriptors: Computer Assisted Testing, Decision Making, Guidelines, Test Construction

Kolstad, Rosemarie K.; Kolstad, Robert A. – Clearing House, 1982
Argues that multiple choice tests can be effective only if the items are written in a format suitable for testing the mastery of specific instructional objectives. Proposes the use of nonrestrictive test items and cites examples of such items. (FL)
Descriptors: Elementary Secondary Education, Multiple Choice Tests, Test Construction, Test Format
Long, Susan; Cognetta, Randall A. – 1978
After a brief introduction and discussion of the advantages and disadvantages of questionnaires, this publication explains how to develop a questionnaire. Each of the five parts of a questionnaire (heading, procedural statements, items, comment space, and procedures for return) is discussed, with concrete examples. Guidelines for pre-test/tryout…
Descriptors: Data Analysis, Data Collection, Field Tests, Questionnaires

Livne, Nava L.; Livne, Oren E.; Milgram, Roberta M. – International Journal of Mathematical Education in Science and Technology, 1999
Develops a mapping sentence to construct test items measuring academic and creative abilities in mathematics at four levels. Describes the three stages of the process of developing the mapping sentence and presents examples of test items representing each ability/level combination. Contains 63 references. (Author/ASK)
Descriptors: Ability Identification, Academic Ability, Creativity, Mathematics Education
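The facet-crossing idea behind a mapping sentence can be sketched by enumerating every ability/level combination as an item-specification cell, in the spirit of Livne et al.; the facet labels below are illustrative:

```python
# Enumerate item-specification cells from a mapping sentence's facets:
# each ability x level combination is one cell for which items are written.
# Facet labels are illustrative, not taken from the article.
from itertools import product

abilities = ["academic", "creative"]
levels = [1, 2, 3, 4]

specs = [f"{a} ability, level {lvl}" for a, lvl in product(abilities, levels)]
print(len(specs))  # 8 ability/level cells
```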
Haladyna, Thomas M. – 1999
This book explains how to write effective multiple-choice test items and how to study responses to items in order to evaluate and improve them, two topics central to the development of many cognitive tests. The chapters are: (1) "Providing a Context for Multiple-Choice Testing"; (2) "Constructed-Response and Multiple-Choice Item Formats"; (3)…
Descriptors: Constructed Response, Multiple Choice Tests, Test Construction, Test Format

Mershon, Donald H. – Teaching of Psychology, 1982
Describes a method for increasing the efficiency of examination production that uses file cards to store and organize test items. The process of reproducing tests directly from master copies made with file cards is discussed. (AM)
Descriptors: Efficiency, Higher Education, Item Banks, Job Simplification
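Mershon's file-card scheme is a physical precursor of today's electronic item banks. A minimal sketch of the same idea in Python; all class names, fields, and sample items are illustrative, not from the article:

```python
# A tiny in-memory item bank mirroring the file-card scheme: each "card"
# stores one test item plus bookkeeping, and exams are assembled by
# pulling the least-used cards on a topic.
from dataclasses import dataclass

@dataclass
class ItemCard:
    stem: str            # the question text
    options: list        # answer choices
    answer: int          # index of the correct option
    topic: str           # content area, used for retrieval
    times_used: int = 0  # usage tally, as one might pencil on a card

class ItemBank:
    def __init__(self):
        self.cards = []

    def add(self, card):
        self.cards.append(card)

    def draw(self, topic, n):
        """Pull up to n least-used cards on a topic, like thumbing a file box."""
        pool = sorted((c for c in self.cards if c.topic == topic),
                      key=lambda c: c.times_used)
        chosen = pool[:n]
        for c in chosen:
            c.times_used += 1
        return chosen

bank = ItemBank()
bank.add(ItemCard("2 + 2 = ?", ["3", "4", "5"], 1, "arithmetic"))
bank.add(ItemCard("3 * 3 = ?", ["6", "9", "12"], 1, "arithmetic"))
exam = bank.draw("arithmetic", 2)
print(len(exam))  # 2
```

Tracking `times_used` per card plays the same role as rotating cards to the back of the file box so items are not overexposed.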
Sommer, Thomas W. – 1986
A model was developed to provide a uniform method for vocational and technical education content experts to develop test items (written questions and performance measures) that are congruous with course-level exit competencies. The model is essentially a closed-system approach in that specific action verbs must be identified and operationally…
Descriptors: Course Objectives, Models, Objective Tests, Performance Tests
Delaware State Dept. of Education, Dover. – 2002
The Delaware Student Testing Program (DSTP) is designed to assess progress toward the Delaware Content Standards. Every year a certain number of items are removed from the test and then selected for public release. This booklet contains items released from the 2001 administration of the DSTP Science tests for grades 8 and 11. It contains examples…
Descriptors: Academic Standards, Sciences, Secondary Education, State Programs
Delaware State Dept. of Education, Dover. Assessment and Accountability Branch. – 2003
This guide contains materials to help Delaware educators understand and use reports from the Delaware Student Testing Program (DSTP). The DSTP tests are tied to the Delaware content standards that define the knowledge and skills required for students to progress beyond high school. In spring 2003, the DSTP reading, writing, and mathematics tests…
Descriptors: Elementary Secondary Education, Scores, State Programs, Teachers

Conderman, Greg; Koroghlanian, Carol – Intervention in School and Clinic, 2002
This article presents guidelines for writing better test items to evaluate student learning more effectively. Recommendations are provided for writing true-false items (test only one idea in each item), multiple-choice items (include the bulk of the information in the stem), and matching items (use only homogeneous lists). Examples are provided. (Contains 7…
Descriptors: Elementary Secondary Education, Learning Disabilities, Student Evaluation, Teacher Made Tests
Rigol, Gretchen W. – College Board Review, 1991
The College Entrance Examination Board has not permitted calculator use on the Scholastic Aptitude Test because of unresolved concerns about equity, implications for test content, and logistical and security issues. Those issues no longer seem insurmountable, and significant changes are being introduced on many tests. (MSE)
Descriptors: Calculators, Cheating, College Entrance Examinations, Higher Education