
Module 10: Evaluation Methods

Module 10.3: Construction of Evaluation Tools

The overall objective is to understand the construction of evaluation
tools, recognize the various steps in developing evaluation devices, and
apply these tools to assess students in terms of knowledge, skills and
attitude.

Learning Outcome
At the end of this module the students will be able to:
• Explain the steps in test construction
• Describe the guidelines for construction
• List the problems faced in construction
• Identify examples of questionnaires

List of Topics
• Introduction
• Steps in test construction
• Guidelines for construction
• Problems faced in construction
• Examples of questionnaire
• Summary
• References

Introduction
Educational evaluation is a complex, continuing process and an integral part of teaching and learning. It requires many skills that are of equal importance with other elements of the instructional process: how to devise, use and interpret tests, and how to understand the basic concepts of measurement and the construction of a variety of tools.

Steps in Test Construction
1. Planning an Achievement Test: This step is concerned with determining the maximum time, marks and nature of the test.
• This should be decided in terms of the nature and scope of the unit or units involved in the testing.
• A test for a single unit may generally be of 40 to 45 minutes duration with a maximum of 20 to 25 marks.
• If the test is conducted at the end of the session, the duration may be about 2 to 3 hours and the maximum marks may be 100.
2. Developing the Test Design: The objectives, content, form of questions and weightage of difficulty level are the most important factors to be considered while designing the test.
• The weightage for the different forms of questions to be included, and for the difficulty level to be maintained, is also considered while finalising the design.
• This will be followed by the scheme of options.

Design for a Unit
a. Weightage of Instructional Objectives

Sl. No   Objective        Marks   Percentage
1        Knowledge        10      20
2        Understanding    20      40
3        Application      8       16
4        Analysis         5       10
5        Synthesis        5       10
6        Evaluation       2       4
         Total            50      100

b. Weightage to Content Areas

Sl. No   Sub-Unit   Marks   Percentage
1        I          15      30
2        II         10      20
3        III        10      20
4        IV         5       10
5        V          10      20
         Total      50      100

c. Weightage to Form of Questions

Sl. No   Form of Question    No. of Questions   Marks   Percentage
1        Objective Type      25                 25      50
2        Short Answer Type   5                  15      30
3        Long Essay Type     1                  10      20
         Total               31                 50      100

d. Weightage to Difficulty Level

Sl. No   Level of Difficulty   Marks   Percentage
1        Easy                  10      20
2        Average               30      60
3        Difficult             10      20
         Total                 50      100
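As an illustration, the marks columns in weightage tables like the ones above follow mechanically from the percentage weightages and the total marks. A minimal Python sketch, using the objective-wise figures from the sample design:

```python
# Sketch: derive marks for each instructional objective from its
# percentage weightage, given the total marks for the test.
total_marks = 50

# Percentage weightages from the sample objective-wise table.
weightage = {
    "Knowledge": 20,
    "Understanding": 40,
    "Application": 16,
    "Analysis": 10,
    "Synthesis": 10,
    "Evaluation": 4,
}

marks = {obj: total_marks * pct // 100 for obj, pct in weightage.items()}

# The allocated marks must add back up to the total (catches rounding
# loss as well as percentages that do not cover 100%).
assert sum(marks.values()) == total_marks
for obj, m in marks.items():
    print(f"{obj}: {m} marks")
```

The same computation applies to the content-area and difficulty-level tables; only the weightage dictionary changes.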

Guidelines for Preparing Test Design
• The design should reflect the predetermined objectives envisaged at the time of instruction.
• As in the case of weightage to content, there is no final ruling regarding the number of sub-units into which the content has to be divided; it depends on the total content area as well as its nature.
• Regarding the number of questions under each form, too, there cannot be any uniformly acceptable design, nor on whether more weightage has to be given to a particular form of question.
• Regarding weightage of difficulty level, sixty percent of items should be of average difficulty, with twenty percent on either side.
• The modern trend is to avoid options.

3. Prepare the Blueprint for the Test
The next step in the construction of an achievement test is preparing a blueprint according to the design. Normally a blueprint for a test is prepared as a three-dimensional chart indicating the distribution of questions objective-wise, content-wise and form-wise.

Content   Hours   Knowledge (60%)   Application (30%)   Critical Analysis (10%)   Total
Area 1    20      13                6                   1                         20
Area 2    20      12                6                   2                         20
Area 3    15      10                4                   1                         15
Area 4    25      13                9                   3                         25
Area 5    20      12                5                   3                         20
Total     100     60                30                  10                        100
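Because a blueprint is a two-way (content-by-objective) grid, its row totals must match each area's marks and its column totals must match the objective-wise weightages. A small Python sketch that checks this consistency, using the sample blueprint figures:

```python
# Sketch: represent the blueprint as a content-by-objective grid and
# verify that row and column totals agree with the design.
blueprint = {
    #          Knowledge, Application, Critical analysis
    "Area 1": (13, 6, 1),
    "Area 2": (12, 6, 2),
    "Area 3": (10, 4, 1),
    "Area 4": (13, 9, 3),
    "Area 5": (12, 5, 3),
}
area_totals = {"Area 1": 20, "Area 2": 20, "Area 3": 15,
               "Area 4": 25, "Area 5": 20}

# Each row must add up to the marks assigned to that content area.
for area, cells in blueprint.items():
    assert sum(cells) == area_totals[area], area

# Column totals give the objective-wise distribution (60/30/10 here).
column_totals = [sum(col) for col in zip(*blueprint.values())]
print(column_totals)  # [60, 30, 10]
```

Running a check like this before writing items catches arithmetic slips in the chart early.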

Steps in Test Construction
4. Construction of the Test: The blueprint gives a very definite idea regarding the number of questions to be set from each sub-unit, their forms and scope. While setting the questions and making the final selection, care has to be taken to maintain the weightage of difficulty level suggested by the design.
5. Organization of the Test: After finalising the items, these have to be arranged according to the scheme of sections as suggested in the design. Before that, the preliminary details, such as the name of the examination, the maximum marks and time, and instructions for answering each part, have to be written.

Steps in Test Construction
6. Preparation of the Scheme for Evaluation: One of the steps suggested for maintaining objectivity is to make the scoring strictly in accordance with a predesigned scheme of evaluation. In this, a scoring key can be used for objective-type items, and the point method for short-answer and essay-type items.
7. Test Administration: Motivate the students to do their best, follow the directions closely, keep the time correctly, record any significant events that might influence test scores, and collect the test material promptly.

Essay Type Items
• Avoid phrases like "discuss briefly", "state everything that you know", etc.
• The question should be structured and cover a specific topic.
• Questions should be worded carefully.
• Have a range of complexity and difficulty.
• Do not give too many or too lengthy questions.
• Do not allow too many choices.
Scoring Procedure
• Set out the elements which should appear in the answer.
• Use a point system of scoring based upon the elements.
• Score the answers of all the students to one question before scoring another.
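The point system of scoring can be sketched as a simple tally: each element expected in the model answer carries a point value, and the score is the sum over the elements the student's answer actually contains. A minimal Python illustration; the element names and point values are hypothetical:

```python
# Sketch of point-method scoring for essay/short-answer items.
# Each expected element carries a point value (hypothetical scheme).
scheme = {
    "definition": 2,
    "classification": 3,
    "examples": 3,
    "precautions": 2,
}

def score_answer(elements_present):
    """Total the points for the scheme elements found in the answer."""
    return sum(scheme[e] for e in elements_present if e in scheme)

print(score_answer(["definition", "examples"]))  # 5 out of 10
```

Fixing the elements and their values before scoring begins is what keeps the marking consistent across students.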

Guidelines for Construction
Short-Answer Type Items:
• Avoid phrases like "briefly", "short notes on", etc.
• As far as possible, use action-oriented, precise verbs.
• Keep the question as long as necessary, but make the answer short.
• Language should be precise and accurate in relation to the subject-matter area.
Multiple Choice Items:
• Each item should deal with important content.
• Have enough content in the stem, with distractors as small as possible, but avoid a lengthy stem.
• Make the stem simple and brief; don't load the stem with irrelevant material.
• Use positive statements in the stem as far as possible; if a negative statement is to be used, underline it or write it in capital letters so that it will not be overlooked.
• Be sure that there is only one correct and best answer.

Guidelines for Construction
• Make distractors that resemble the correct answer, i.e. distractors should be plausible; don't make lists that are quite different.
• Have four to five distractors only.
• Avoid using the distractors "all of the above" or "none of the above" as far as possible.
• Avoid completing the stem with "an" or "a", which confuses or gives a clue to the learner.
• Provide a blank space, or a separate answer sheet, against each item for writing the number/letter of the correct answer.
• Arrange the distractors in such a way that there is no pattern evident about correct answers.
Matching Type Items:
• Make relatively several short matching items if the number of items matched is more than 10.
• Stimulus and response columns should preferably be on the same page.

Guidelines for Construction
• Give some heading to both the columns, e.g. Column A and Column B, or List A and List B, rather than "items on the right side" and "items on the left side".
True and False Items:
• Give a single idea in the statement; emphasize important points.
• Write clear and direct statements; avoid ambiguous statements.
• Avoid lengthy statements.
• Avoid the use of negative statements, particularly double negatives.
• Avoid "trick and catch" items.
• Avoid using clues like all, always, usually, should, nothing, none, sometimes, may, no, etc.

Guidelines for Construction
• Have an equal number of true and false items.
• Determine the order of true and false by chance.
Checklist:
• Should relate directly to learning objectives.
• Needs to be confined to performance areas that can be assessed sufficiently by examining positive and negative criteria only, and where sufficient opportunity for observation exists.
• Multiple observations provide a more accurate assessment of performance than does a single instance.
• Students should be evaluated in the natural setting, or one approximating reality as closely as possible.
• A copy of the completed checklist should be given to each student for review, followed by an individual session in which instructor and student discuss strengths and weaknesses of the performance and formulate a plan to improve it.
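Because a checklist records only whether each behaviour was observed, a summary score reduces to counting checked items. A minimal Python sketch, with step names modelled on the dressing-change example later in the module (the observation values are hypothetical):

```python
# Sketch: a checklist captures only presence/absence of each behaviour,
# so a summary score is simply the count (or proportion) of steps
# observed. Observation values here are hypothetical.
observed = {
    "Explains procedure to the patient": True,
    "Checks appearance of present dressing": True,
    "Washes hands": False,
    "Obtains necessary equipment": True,
}

done = sum(observed.values())  # True counts as 1, False as 0
print(f"{done}/{len(observed)} steps completed")  # 3/4 steps completed
```

This presence/absence structure is also the checklist's main limitation, noted later: it cannot grade how well a step was performed, only whether it occurred.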

Guidelines for Construction
Rating Scale:
• Should relate directly to learning objectives.
• Needs to be confined to performance areas that can be observed.
• Three to seven rating positions may need to be provided.
• Provide for omitting items the instructors feel unqualified to judge.
• All raters should be oriented to the specific scale as well as to the process of rating in general.
• Consider the evaluation setting, feedback, and student participation in instrument development.
• Rating scales are vulnerable to errors resulting from the subjective judgement required of the observers.

Guidelines for Construction
Numerical Rating Scales:
• These are set up so that the rater assigns a code number to each trait of the person being rated. Code numbers are assigned to descriptive phrases, arranged in order of the degree, level, intensity or frequency with which they indicate possession, lack or occurrence of each trait. The rater places the appropriate number beside each trait being rated.
Graphic Rating Scales:
• These have descriptive phrases printed horizontally at various points along a line. The degrees of each characteristic are arranged so that the rater can make as fine a distinction as the rater wishes. The rater indicates the subject's standing with respect to each trait by placing a checkmark at the appropriate point along the line.

Guidelines for Construction
Anecdotal Records:
• Determine in advance the behaviour to assess, then limit observation to those categories or qualities.
• Limit each anecdote to a brief description of a single specific incident.
• Record enough of the situations to decrease subjectivity, and record the incident as soon as possible after its occurrence.
Critical Incident Report:
• Actual behaviour must be reported rather than a general trait.
• The behaviour must be actually observed by the reporter.
• All relevant factors in the situation must be given.
• The observer must make a definite judgment about behaviour that is considered to be critical.

Problems Faced in Construction
Essay Type Items:
• Covers only a limited field of knowledge in one test.
• Subjectivity of scoring and lack of objectivity.
• Difficulties in obtaining consistent judgement of performance.
• Contaminated by extraneous factors like spelling, handwriting, neatness, grammar and length of answer.
• Requires excessive time to score.
• Provides little useful feedback.

Problems Faced in Construction
Short-Answer Type Items:
• Difficulty in the construction of reliable items.
• Provides little or no opportunity for measurement of students' ability to organize and express ideas.
Objective Type Test Items:
• Need lots of time and effort in preparing the test.
• Need more stationery.
• Cost aspects.
Rating Scale:
• Personal bias: errors indicated by a general tendency to rate all individuals at approximately the same position on the scale.
• The halo effect: an error that occurs when a rater's general impression of a person influences the rating of individual characteristics.
• A logical error: results when two characteristics are rated as more alike than they actually are because of the rater's belief concerning their relationship.

Checklist:
• Has limited application.
• Determines only presence or absence of an action.
• Provides no means of judging the extent to which a behaviour is possessed by the student.
Anecdotal Records:
• Subjectivity.
• Lack of standardization.
• Difficulty in scoring.
• Time consuming.
• Limited application.

Examples of Questionnaire
Essay Type Items:
Example:
• Poor: Discuss immunity. (10)
• Better: Define immunity. Differentiate between passive and active immunity. Describe the hazards of immunization. List general precautions to be taken while giving immunization. (1+3+3+3 = 10)
Short-Answer Type Items:
Example:
• Poor: Give your best definition of health.
• Better: What is the definition of health according to the W.H.O.?

Examples of Questionnaire
Multiple Choice Items:
Example: The collection of fluid in the pleural space is known as:
• Pleuritis
• Pleural effusion
• Pleural tapping
• Pleurodesis
Matching Type Items:
Example:
Column A        Column B
• Long bone     Skull
• Small bone    Tibia
• Flat bone     Femur
                Stapes

Examples of Questionnaire
True and False Items:
Example: Note: Tick the correct response.
• The pancreas is an endocrine gland. True / False
• The largest gland in the body is the pituitary. True / False

Examples of Questionnaire
Checklist:
Example: Change of Dressing
Instruction: Tick the appropriate column.

Behaviour                                    Yes   No
1. Explains procedure to the patient
2. Checks appearance of present dressing
3. Washes hands
4. Obtains necessary equipment

Comments:
Student's Signature          Instructor's Signature

Examples of Questionnaire
Rating Scale:
Example: Change of Dressing
Instruction: Tick the appropriate column.

Behaviour                                    Excellent (5)   Very Good (4)   Fair (3)   Poor (2)   Very Poor (1)
1. Explains procedure to the patient
2. Checks appearance of present dressing
3. Washes hands
4. Obtains necessary equipment

Comments:
Student's Signature          Instructor's Signature

Examples of Questionnaire
Anecdotal Record:
Example:
• Name of the Student:
• Year:
• Date & Time:
• Name of the Observer:
• Setting:
• Incident:
• Interpretation:
• Recommendations:
Observer's Signature

Examples of Questionnaire
Critical Incident Report:
Example – Empathy:
• Positive behaviours (to be encouraged): Uses patient's name in all communications.
• Negative behaviours (needing improvement): Addresses patient by a general term such as "Grandpa".

Date   Behaviour (number in items)   What happened: record antecedents, behaviour and consequences

Observer's Signature

Summary
Educational evaluation requires many skills, including how to devise, use and interpret tests, understand the basic concepts of measurement, and construct a variety of tools such as achievement tests and performance appraisals.