Item analysis


Transcript of Item analysis

Page 1: Item analysis


Page 2: Item analysis

Basics of Item Analysis

Page 3: Item analysis

QUESTIONS?


Contact Me! © James L. Paglinawan

e-mail me @: [email protected]

[email protected]

Page 4: Item analysis

Item Analysis

Techniques to improve test items and instruction

Page 5: Item analysis

Test Development Process

1. Review National and Professional Standards
2. Convene National Advisory Committee
3. Develop Domain, Knowledge and Skills Statements
4. Conduct Job Analysis
5. Establish Test Specifications
6. Develop Test Design
7. Develop New Test Questions
8. Review Test Questions
9. Assemble Operational Test Forms
10. Produce Printed Test Materials
11. Administer Tests
12. Conduct Item Analysis
13. Standard Setting Study
14. Set Passing Standard

Page 6: Item analysis

What is Item Analysis?

• a process that examines student responses to individual test items in order to assess the quality of the items and of the test as a whole

• valuable for improving items that will be used again in later tests and for eliminating ambiguous or misleading items

• valuable for increasing instructors' skills in test construction, and

• for identifying specific areas of course content which need greater emphasis or clarity.

Page 7: Item analysis

Several Purposes

1. More diagnostic information on students

– Classroom level:
• determine which questions most students found very difficult or guessed on – reteach that concept
• determine which questions all students got right – don't waste more time on this area
• find which wrong answers students are choosing – identify common misconceptions

– Individual level:
• isolate the specific errors this student made

Page 8: Item analysis

2. Build future tests, revise test items to make them better

• you learn how much work goes into writing good questions

• SHOULD NOT REUSE WHOLE TESTS --> diagnostic teaching means responding to the needs of students, so after a few years you build up a test bank and can choose a test suited to the class

• you can spread difficulty levels across your blueprint (TOS)

Page 9: Item analysis

3. Part of continuing professional development

– doing occasional item analysis will help you become a better test writer

– it documents just how good your evaluation is

– useful for dealing with parents or administrators if there's ever a dispute

– once you start bringing out these statistics, parents and administrators are more likely to accept why some students failed.

Page 10: Item analysis

Classical Item Analysis Statistics

• Reliability (test level statistic)

• Difficulty (item level statistic)

• Discrimination (item level statistic)

Page 11: Item analysis

Test level statistic: Quality of the Test

• Reliability and Validity
– Reliability: consistency of measurement
– Validity: truthfulness of response

Overall Test Quality vs. Individual Item Quality

Page 12: Item analysis

Reliability refers to the extent to which the test is likely to produce consistent scores.

Characteristics:

1. The intercorrelations among the items – the stronger the positive relationships among items, the greater the reliability.

2. The length of the test – a test with more items will have a higher reliability, all other things being equal.

Page 13: Item analysis

3. The content of the test – generally, the more diverse the subject matter tested and the testing techniques used, the lower the reliability.

4. The heterogeneity of the group of test takers – a more heterogeneous group produces greater score variance and, all other things being equal, a higher reliability estimate.

Page 14: Item analysis

Types of Reliability

• Stability: 1. Test-Retest

Page 15: Item analysis

• Stability: 2. Inter-rater / Observer / Scorer
• applicable mostly to essay questions
• use Cohen's Kappa statistic (a short sketch follows below)
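A minimal Python sketch (not from the original slides) of Cohen's Kappa for two raters who each scored the same ten essay responses as 1 (credit) or 0 (no credit); the ratings are made-up data.

```python
# Cohen's kappa: agreement between two raters, corrected for chance agreement.
# Illustrative, made-up ratings for ten essay responses.
rater_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]
n = len(rater_a)

# Observed agreement: proportion of responses both raters scored the same way
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Expected chance agreement, from each rater's marginal proportions of 1s and 0s
p_a1, p_b1 = sum(rater_a) / n, sum(rater_b) / n
p_e = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)

kappa = (p_o - p_e) / (1 - p_e)
print(f"kappa = {kappa:.2f}")   # 0.60 for this made-up data
```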

Page 16: Item analysis

• Equivalence: 3. Parallel-Forms / Equivalent Forms

Used to assess the consistency of the results of two tests constructed in the same way from the same content domain.
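A minimal sketch, assuming the same students took Form A and Form B: parallel-forms reliability is simply the Pearson correlation between the two sets of total scores. The scores are made-up data (statistics.correlation requires Python 3.10+).

```python
import statistics

# Made-up total scores of ten students on two equivalent forms
form_a = [42, 35, 47, 30, 38, 44, 29, 41, 36, 45]
form_b = [40, 33, 45, 32, 36, 46, 27, 43, 35, 44]

r = statistics.correlation(form_a, form_b)   # Pearson r = parallel-forms reliability
print(f"parallel-forms reliability = {r:.2f}")
```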

Page 17: Item analysis

• Internal Consistency: used to assess the consistency of results across items within a test.
• 4. Split-Half (a short sketch follows below)
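A minimal sketch of the split-half approach, assuming a small made-up 1/0 score matrix: correlate odd-item and even-item half scores, then apply the Spearman-Brown correction to estimate full-length reliability.

```python
import statistics  # statistics.correlation requires Python 3.10+

# Made-up data: rows = students, columns = items scored 1 (correct) / 0 (incorrect)
scores = [
    [1, 1, 1, 1, 1, 0, 1, 1],
    [1, 1, 0, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 0, 0],
    [1, 0, 1, 1, 0, 1, 1, 0],
]

odd_half  = [sum(row[0::2]) for row in scores]   # items 1, 3, 5, 7
even_half = [sum(row[1::2]) for row in scores]   # items 2, 4, 6, 8

r_half = statistics.correlation(odd_half, even_half)
r_full = (2 * r_half) / (1 + r_half)             # Spearman-Brown correction
print(f"half-test r = {r_half:.2f}, split-half reliability = {r_full:.2f}")
```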

Page 18: Item analysis

• 5. Kuder-Richardson Formula 20 / 21: the correlation is determined from a single administration of a test through a study of score variances (a short sketch of KR-20 follows below).
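A minimal sketch of KR-20 from a single administration, assuming dichotomously scored (1/0) items; the score matrix is made-up data. KR-21 is the simpler variant that assumes all items are equally difficult.

```python
import statistics

# Made-up data: rows = students, columns = items scored 1/0
scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 0, 1, 1],
]
k = len(scores[0])                               # number of items
n = len(scores)                                  # number of students
totals = [sum(row) for row in scores]            # each student's total score
var_total = statistics.pvariance(totals)         # variance of total scores

p = [sum(row[i] for row in scores) / n for i in range(k)]   # proportion correct per item
sum_pq = sum(pi * (1 - pi) for pi in p)                     # sum of item variances p*q

kr20 = (k / (k - 1)) * (1 - sum_pq / var_total)
print(f"KR-20 = {kr20:.2f}")
```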

Page 19: Item analysis

• 6. Cronbach's Alpha (α) (a short sketch follows below)
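A minimal sketch of Cronbach's alpha, which generalizes KR-20 to items that need not be scored 1/0 (e.g., Likert or partial-credit items); the scores are made-up data.

```python
import statistics

# Made-up data: rows = students, columns = item scores (here on a 1-5 scale)
scores = [
    [4, 3, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 4, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
]
k = len(scores[0])
item_vars = [statistics.pvariance([row[i] for row in scores]) for i in range(k)]
total_var = statistics.pvariance([sum(row) for row in scores])

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```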

Page 20: Item analysis

Reliability Indices and Interpretation

.91 and above: Excellent reliability; at the level of the best standardized tests.

.81 – .90: Very good for a classroom test.

.71 – .80: Good for a classroom test; in the range of most. There are probably a few items which could be improved.

.61 – .70: Somewhat low. This test needs to be supplemented by other measures (e.g., more tests) to determine grades. There are probably some items which could be improved.

.51 – .60: Suggests need for revision of the test, unless it is quite short (ten or fewer items). The test definitely needs to be supplemented by other measures (e.g., more tests) for grading.

.50 or below: Questionable reliability. This test should not contribute heavily to the course grade, and it needs revision.

Page 21: Item analysis

Test Item Statistics

• Item Difficulty: percent answering correctly

• Item Discrimination: how well the item "functions"; how "valid" the item is, based on the total test score criterion

Page 22: Item analysis

WHAT IS A WELL-FUNCTIONING TEST ITEM?

• How many students got it correct? (DIFFICULTY)

• Which students got it correct? (DISCRIMINATION)

Page 23: Item analysis

Three important pieces of information on the quality of test items

• Item difficulty: measures whether an item was too easy or too hard.

• Item discrimination: measures whether an item discriminated between students who knew the material well and students who did not.

• Effectiveness of alternatives: determines whether distracters (incorrect but plausible answers) tend to be marked by the less able students and not by the more able students.

Page 24: Item analysis

Item Difficulty

• Item difficulty is simply the percentage of students who answer an item correctly. In this case, it is also equal to the item mean.

Diff = (number of students choosing correctly ÷ total number of students) × 100

• The item difficulty index ranges from 0 to 100; the higher the value, the easier the question. (A short sketch of the computation follows below.)
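A minimal sketch of the formula above, using made-up responses to a single multiple-choice item whose key is "B".

```python
# Difficulty index = percentage of students answering the item correctly
responses = ["B", "B", "C", "B", "A", "B", "B", "D", "B", "B"]   # made-up answers
key = "B"

difficulty = 100 * sum(r == key for r in responses) / len(responses)
print(f"difficulty index = {difficulty:.0f}%")   # higher value = easier item
```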

Page 25: Item analysis

Item Difficulty Level: Definition

The percentage of students who answered the item correctly.

• High (Difficult): <= 30%
• Medium (Moderate): > 30% and < 80%
• Low (Easy): >= 80%

(The cut-offs are applied in the short sketch below.)
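A minimal sketch applying these cut-offs to a % correct value; the test values are the ones used in the sample on the next slide.

```python
def difficulty_level(pct_correct: float) -> str:
    """Classify an item by its % correct, using the slide's cut-offs."""
    if pct_correct <= 30:
        return "High (Difficult)"
    elif pct_correct < 80:
        return "Medium (Moderate)"
    return "Low (Easy)"

for pct in (30, 50, 70, 90):
    print(pct, difficulty_level(pct))   # High, Medium, Medium, Low
```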

Page 26: Item analysis

Item Difficulty Level: Sample

Number of students who answered each item = 50

Item No.   No. of Correct Answers   % Correct   Difficulty Level
1          15                       30          High
2          25                       50          Medium
3          35                       70          Medium
4          45                       90          Low

Page 27: Item analysis

Item Difficulty Level: Questions/Discussion

• Is a test that nobody failed too easy?

• Is a test on which nobody got 100% too difficult?

• Should items that are “too easy” or “too difficult” be thrown out?

Page 28: Item analysis

Item Discrimination

• Traditionally computed using high and low scoring groups (upper 27% and lower 27%).

• Computerized analyses provide a more accurate assessment of the discrimination power of items, since they account for all responses rather than just the high and low scoring groups.

• Equivalent to the point-biserial correlation. It provides an estimate of the degree to which an individual item is measuring the same thing as the rest of the items. (A short sketch of both computations follows below.)
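A minimal sketch of both approaches, using made-up data: the classical D index from upper and lower 27% groups, and the point-biserial correlation between the item score and the total test score (statistics.correlation requires Python 3.10+).

```python
import statistics

# Made-up data: 1/0 scores on one item and total test scores for twelve students
item  = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0]
total = [45, 42, 20, 40, 25, 38, 18, 22, 44, 28, 36, 15]

# (1) D = p_upper - p_lower, using the top and bottom 27% by total score
n = len(total)
g = max(1, round(0.27 * n))
order = sorted(range(n), key=lambda i: total[i], reverse=True)
upper, lower = order[:g], order[-g:]
D = sum(item[i] for i in upper) / g - sum(item[i] for i in lower) / g

# (2) Point-biserial: Pearson correlation between the 1/0 item and the total score
r_pb = statistics.correlation(item, total)

print(f"D = {D:.2f}, point-biserial = {r_pb:.2f}")
```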

Page 29: Item analysis

What is Item Discrimination?

• Generally, students who did well on the exam should select the correct answer to any given item on the exam.

• The Discrimination Index distinguishes for each item between the performance of students who did well on the exam and students who did poorly.

Page 30: Item analysis

Indices of Difficulty and Discrimination

(by Hopkins and Antes)

Index            Difficulty        Discrimination
0.86 and above   Very Easy         To be discarded
0.71 – 0.85      Easy              To be revised
0.30 – 0.70      Moderate          Very good items
0.15 – 0.29      Difficult         To be revised
0.14 and below   Very Difficult    To be discarded

Page 31: Item analysis

Item Discrimination: Questions / Discussion

• What factors could contribute to low item discrimination between the two groups of students?

• What is a likely cause for a negative discrimination index?

Page 32: Item analysis

ITEM ANALYSIS PROCESS

Page 33: Item analysis

Sample TOS

Section      Remember              Understand   Apply   Total
Section A    4 (1, 3, 7, 9)        6            10      20
Section B    5 (2, 5, 8, 11, 15)   5            4       14
Section C    3 (6, 17, 21)         7            6       16
Total        12                    18           20      50

Page 34: Item analysis

Steps in Item analysis

1. Code the test items:
– 1 for correct and 0 for incorrect
– vertical columns = item numbers
– horizontal rows = respondents/students

Page 35: Item analysis

Student No. | Test Items: 1 2 3 4 5 6 7 8 9 10 11 12 13 14 . . . . 50

1 1 0 1 1 1 0 0 0 0 1 1 1 0 0 0 0 0 1 1

2 1 1 0 1 1 1 0 1 1 1 0 1 1 1 1 0 1 1 1

3 0 0 0 1 0 0 0 1 0 0 0 1 1 1 1 1 1 1 0

4 0 1 0 0 0 1 0 0 0 1 0 0 1 0 0 0 1 0 0

5 1 0 1 1 1 0 1 1 1 0 1 1 0 1 1 1 0 1 0

6 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1

7 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1

8 1 1 0 1 1 1 0 1 1 1 0 1 1 0 0 0 1 0 0

Page 36: Item analysis

2. In SPSS:

Analyze > Scale > Reliability Analysis > (move the item variables into the Items box) > Statistics > check "Scale if item deleted" > OK.

Page 37: Item analysis

****** Method 1 (space saver) will be used for this analysis ******

R E L I A B I L I T Y   A N A L Y S I S   -   S C A L E   (A L P H A)

Item-total Statistics

            Scale Mean       Scale Variance    Corrected Item-     Alpha
            if Item Deleted  if Item Deleted   Total Correlation   if Item Deleted
VAR00001    14.4211          127.1053          .9401               .9502
VAR00002    14.6316          136.8440          .7332               .9542
VAR00003    14.4211          141.5695          .4774               .9574
VAR00004    14.4737          128.6109          .6511               .9508
VAR00005    14.4737          128.8252          .8274               .9509
VAR00006    14.0526          130.6579          .2236               .9525
VAR00007    14.2105          127.8835          .2533               .9511
VAR00008    14.1053          128.6673          .1906               .9515
VAR00009    14.4211          129.1410          .7311               .9513
.....................
VAR00022    14.4211          129.1410          .7311               .9513
VAR00023    14.4211          127.1053          .4401               .9502
VAR00024    14.6316          136.8440          -.0332              .9542
VAR00047    14.4737          128.6109          .8511               .9508
VAR00048    14.4737          128.8252          .8274               .9509
VAR00049    14.0526          130.6579          .5236               .9525
VAR00050    14.2105          127.8835          .7533               .9511

Reliability Coefficients
N of Cases = 57.0    N of Items = 50
Alpha = .9533

Page 38: Item analysis

3. In the output:

• Alpha is reported at the bottom.

• The corrected item-total correlation is the point-biserial correlation, which serves as the basis for judging each item and the reliability of the test. (A Python sketch of these two statistics follows below.)
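A minimal Python/pandas sketch (the slides themselves use SPSS) that reproduces the two columns used above: the corrected item-total correlation and alpha if the item is deleted. It assumes `df` is the 1/0 matrix coded in step 1 (rows = students, columns = items); the file name in the usage lines is hypothetical.

```python
import pandas as pd

def cronbach_alpha(frame: pd.DataFrame) -> float:
    """Cronbach's alpha for a students-by-items score matrix."""
    k = frame.shape[1]
    item_vars = frame.var(axis=0, ddof=1)
    total_var = frame.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_statistics(df: pd.DataFrame) -> pd.DataFrame:
    """Corrected item-total correlation and alpha-if-item-deleted for each item."""
    rows = []
    for col in df.columns:
        rest = df.drop(columns=col).sum(axis=1)     # total score excluding this item
        rows.append({
            "item": col,
            "corrected_item_total_r": df[col].corr(rest),
            "alpha_if_deleted": cronbach_alpha(df.drop(columns=col)),
        })
    return pd.DataFrame(rows)

# Usage (hypothetical file of coded 1/0 responses):
# df = pd.read_csv("coded_scores.csv")
# print(item_total_statistics(df))
```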

Page 39: Item analysis

4. Count the number of items discarded and fill in the summary item analysis table.

Page 40: Item analysis

Test Item Reliability Analysis Summary (sample)

Test              Level of Difficulty   Number of Items   %    Item Numbers
Math (50 items)   Very Easy             1                 2    1
                  Easy                  2                 4    2, 5
                  Moderate              10                20   3, 4, 10, 15, …
                  Difficult             30                60   6, 7, 8, 9, 11, …
                  Very Difficult        7                 14   16, 24, 32, …

Page 41: Item analysis

5. Count the number of items retained under each cognitive domain in the TOS. Compute the percentage of items retained at each level.

Page 42: Item analysis

             Remember        Understand       Apply
             N    Retained   N    Retained    N    Retained
Section A    4    1          6    3           10   3
Section B    5    3          5    3           4    2
Section C    3    2          7    4           6    3
Total        12   6          18   10          20   8
%                 50%             56%              40%

Overall: 24/50 = 48%

Page 43: Item analysis

• Realistically: do item analysis on your most important tests

– end-of-unit tests, final exams --> summative evaluation

– common exams with other teachers (departmentalized exams)

• common exams give a bigger sample to work with, which is good

• they also make sure that questions another teacher wrote are working for YOUR class

Page 44: Item analysis

ITEM ANALYSIS is one area where even a lot of otherwise very good classroom teachers fall down:

– they think they're doing a good job;

– they think they're doing good evaluation;

– but without doing item analysis, they can't really know.

Page 45: Item analysis

ITEM ANALYSIS is not an end in itself:
– there is no point unless you use it to revise items, and
– to help students on the basis of the information you get out of it.

Page 46: Item analysis

END OF PRESENTATION…

THANK YOU FOR LISTENING…

HAVE A RELIABLE AND ENJOYABLE DAY…
