Transcript of "CEdMA Certification Special Interest Group: Highlights from ATP 2013" (February 2013)

Page 1

CEdMA Certification Special Interest Group

Highlights from ATP 2013

February 2013

Page 2

Beauties and the Beast

[Photo of presenters: Beverly, Liz, Jesse, Joe, Joan, Eva, Jamie]

Page 3

CEdMA Certification Special Interest Group

• Standard Setting

– Eva Chase – Informatica

• Defending Exam Enforcement in Court

– Liz Burns – Juniper Networks

• Item Positioning and Cheating Techniques

– Joe Cannata – Brocade

• Performance Based Testing

– Beverly Van de Velde - Symantec

Agenda

3

Page 4

CEdMA Certification Special Interest Group

• An Interactive Workshop

• Presenters:

– Dr. Canda Mueller

– Dr. Timothy Vansickle

• From:

– Questar Assessment, Inc.

The Life of a Standard Setting Panelist

4

Page 5

CEdMA Certification Special Interest Group

Standard Setting: Reason Integrated Judgment (RIJ) Process

• The RIJ process is to review the exam as a whole; good for when you are starting from scratch

– "Through this approach, various sources of data, methods of analysis, and perspectives of stakeholders can be simultaneously considered while making cut score recommendations." (http://www.pearsonassessments.com/hai/images/tmrs/Bulletin21_Evidence_Based_Standard_Setting.pdf)

• This is different from the Angoff method, where every item is reviewed for the target candidate (a minimal sketch of the Angoff calculation follows below)

2/22/2013 Eva Chase
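The Angoff contrast above can be made concrete with a small illustration. In the Angoff method each panelist estimates, item by item, the probability that a minimally competent candidate answers correctly; summing a panelist's estimates gives that panelist's recommended raw cut, and the panel's cut score is the average of those sums. The sketch below is illustrative only, with invented ratings; it is not taken from the session.

    # Angoff-style cut score: average, across panelists, of each panelist's summed
    # per-item probability estimates for the minimally competent candidate.
    # Ratings are invented for illustration (3 panelists x 5 items).
    ratings = [
        [0.6, 0.8, 0.4, 0.9, 0.7],   # panelist 1
        [0.5, 0.7, 0.5, 0.8, 0.6],   # panelist 2
        [0.7, 0.9, 0.3, 0.9, 0.8],   # panelist 3
    ]
    panelist_cuts = [sum(r) for r in ratings]             # expected raw score per panelist
    cut_score = sum(panelist_cuts) / len(panelist_cuts)   # panel's recommended raw cut
    print(f"Recommended cut score: {cut_score:.1f} out of {len(ratings[0])} items")

The RIJ approach described on this slide instead asks panelists to judge the exam as a whole, so no per-item table like this is produced.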

Page 6

CEdMA Certification Special Interest Group

Data, Analysis and Perspectives: What is to be used?

• Key to everything is to provide a candidate profile: 3 to 5 skills listed (for example, remember the 5-foot hurdle jumper)

– Entire exam measures the minimally competent candidate

• Document roles and responsibilities

– Candidate profile is determined by the governance council / learning objectives / IPS skills

– Cut score and standard setting are determined by SMEs

– Document who our SMEs are and their qualifications

2/22/2013 Eva Chase

Page 7

CEdMA Certification Special Interest Group

• Methods of analysis

– What is needed to pass? Use the PLDs (performance level descriptions) to understand the definition and the stated required skills

– Item mapping: order items by difficulty

– Minimum cut, medium cut, highest cut: what is minimally competent?

• Perspectives of stakeholders

– State the purpose, which is to recommend a cut score and secure agreement

– Skill level of the SMEs is always higher than the typical candidate's

– Consolidate the SMEs' average scores to provide a good level-setting experience

– Be sure that they don't have the answer key for the 1st pass through the exam

– Panel should be no more than 12

– Engage SMEs as discussion leaders

– Set the stage so they have the same expectations of the target candidate

7

Data, Analysis and Perspectives: What is to be used? (continued)

Page 8

CEdMA Certification Special Interest Group

Sample Workshop Agenda - The process

• Focus on the outcome! The goal is to determine cut scores and set the standard

• Break SMEs into groups

– Appoint SMEs as discussion leaders per technology/product

– Take the exam with no answer key provided; have them rank items by difficulty

– Take the exam again, with the answer key

– Track all results for both rounds

– Collect comments on notecards

• Determine what is proficient and advanced; what is the difference?

– Review the candidate profile and PLDs

– List out example skills and rate them

• Review the exam and add up how many items are at the basic, proficient, or advanced level (a short tally sketch follows this slide)

– Cut score is determined by the panelists' expectation of the target candidate

– All outlier items in the exam should be dropped

• Provide a final evaluation before they leave

– Document that they believe the process followed was fair and true

2/22/2013 Eva Chase
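The tally and outlier-dropping steps in this agenda can be sketched briefly. The code below is not the presenters' process or tooling; it simply illustrates, with invented data and an assumed disagreement rule, how panelist classifications could be counted by level and how items with widely split ratings could be flagged for review or dropping.

    # Tally panelists' item-level classifications and flag outlier items.
    # Data and the disagreement rule are invented for illustration.
    from collections import Counter

    # item_ratings[item] = level assigned by each panelist
    item_ratings = {
        "item_01": ["basic", "basic", "proficient"],
        "item_02": ["proficient", "proficient", "proficient"],
        "item_03": ["basic", "advanced", "proficient"],   # panel is widely split
        "item_04": ["advanced", "advanced", "proficient"],
    }

    def consensus(levels, max_distinct=2):
        """Return the majority level, or None when the panel is too split (outlier item)."""
        counts = Counter(levels)
        if len(counts) > max_distinct:
            return None                       # candidate for dropping from the exam
        return counts.most_common(1)[0][0]

    tally = Counter()
    outliers = []
    for item, levels in item_ratings.items():
        level = consensus(levels)
        if level is None:
            outliers.append(item)
        else:
            tally[level] += 1

    print("Items per level:", dict(tally))
    print("Outlier items to review or drop:", outliers)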

Page 9

CEdMA Certification Special Interest Group

• Standard Setting

– Eva Chase – Informatica

• Defending Exam Enforcement in Court

– Liz Burns – Juniper Networks

• Item Positioning and Cheating Techniques

– Joe Cannata – Brocade

• Performance Based Testing

– Beverly Van de Velde - Symantec

Agenda

9

Page 10

CEdMA Certification Special Interest Group

Defending Exam Enforcement in Court

Presentation – Tell It To The Judge

• Aimee Hobby Rhodes, JD - CFA Institute

• Dr. James (Jim) Wollack, Ph.D. - University of Wisconsin-Madison

• Rachel Schoenig - ACT, Inc.

• Jennifer Ancona Semko - Baker & McKenzie LLP (Washington DC)

• Steve Addicott - Caveon Test Security

10

Page 11

CEdMA Certification Special Interest Group

Top Ten “Do nots”

1. Don’t use an “expert” who is an expert…but not in this field!

2. Don’t use an untested methodology (e.g., new and unpublished analysis)

3. Don’t fail to report on obvious analysis because it doesn’t show cheating

4. Don’t use poorly set up statistical procedures

5. Don’t use a non-objective expert

6. Don’t over-state the conclusion – Statistics don’t indicate intent

7. Don’t use an expert who is hard to understand (too technical)

8. Don’t withhold facts from your expert

9. Don’t limit your investigation to the statistics alone if other information could be relevant (e.g., no past relationship, or past scores that are consistent with the current score)

10. Don’t stray from your candidate agreement and the processes it prescribes

11

Page 12

CEdMA Certification Special Interest Group

1. Use an expert who is truly an expert

2. Use a trusted, tested methodology

3. Report fairly

4. Set up procedures properly

5. Be objective

6. Be reasonable in what you’re concluding

7. Use your expert as a teacher – he/she should be able to explain what they did and why the evidence is compelling

8. Ensure that your expert has all of the facts

9. Take into consideration all reasonably available evidence – look at the evidence as a whole

10. Apply the terms of your candidate agreement reasonably and in good faith

Top Ten “Do’s”

12

Page 13

CEdMA Certification Special Interest Group

• Standard Setting

– Eva Chase – Informatica

• Defending Exam Enforcement in Court

– Liz Burns – Juniper Networks

• Item Positioning and Cheating Techniques

– Joe Cannata – Brocade

• Performance Based Testing

– Beverly Van de Velde - Symantec

Agenda

13

Page 14

CEdMA Certification Special Interest Group

• Study performed by National Restaurant Association

– Food handling and protection certification exam

– 450,000 tested annually

• Exam details

– 90 items, 4-option multiple choice questions

• Study details

– Exams issued over a 3-month period in pencil & paper format

– 49,402 tested, primarily in the US

– 5 delivery orders

• Random

• Hardest to easiest

• Easiest to hardest

• Content in order learned

• Content in reverse order learned

Effects of Item Position on Performance Joe Cannata, Brocade

14

Page 15

CEdMA Certification Special Interest Group

• Study results

– No significant variance in candidate performance

– Item performance variances did not significantly impact results

• Conclusion

– Candidates scored the same in each exam section, regardless of the order (an illustrative significance check is sketched after this slide)

• Study conducted by

– Amy Roedl, PhD, National Restaurant Association

– Evelyn Tessitore, M.S. Ed., Professional Examination Service

Effects of Item Position on Performance Joe Cannata, Brocade

15
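The "no significant variance" finding implies a statistical comparison of scores across the five delivery orders; the study's actual analysis is not described in this summary. Purely as an illustration of how such a check could be run, the sketch below applies a one-way ANOVA to randomly generated scores (none of the numbers are the study's data).

    # Illustrative only: compare mean scores across the five delivery orders
    # with a one-way ANOVA. Scores are simulated, not the study's data.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(0)
    orders = ["random", "hard_to_easy", "easy_to_hard", "order_learned", "reverse_learned"]
    # Simulate 1,000 candidates per order on a 90-item exam with the same underlying mean.
    scores = {o: rng.normal(loc=72, scale=8, size=1000).clip(0, 90) for o in orders}

    f_stat, p_value = f_oneway(*scores.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
    # A large p-value (commonly > 0.05) would be consistent with no significant
    # difference in candidate performance across delivery orders.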

Page 16

CEdMA Certification Special Interest Group

• Presenters

– Layne Pethick, GMAC

– Heather Mullen, CFA Institute

– Ben Mannes, American Board of Internal Medicine

• Demonstration of several covert cheating devices

– Cameras

• Hidden in a baseball cap

• Hidden in a necktie

• Hidden in a button

• Hidden in a working Casio calculator

• Hidden in a fake Coke Zero can (labeled Cake Zero)

• Hidden in eyeglasses, jewelry, watches

– Recording/scanning devices

• Hidden in an eraser (voice)

• Pens/writing instruments that scan text

A Close Look at Covert Cheating Devices Joe Cannata, Brocade

16

Page 17

CEdMA Certification Special Interest Group

• Presenters

– Layne Pethick, GMAC

– Daniel Eyob, GMAC

– Lawrence Rudner, GMAC

• They have a tool that they developed to seek out keywords (a generic keyword-search sketch follows this slide)

• GMAC is willing to make it available, as is

– Email [email protected]

– It comes with a manual

– Email me if you need to see their presentation

Free Tools to Nail Braindumps Joe Cannata, Brocade

17
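GMAC's tool is not described here beyond "seeking out keywords", so the sketch below is only a generic illustration of that idea and is not the GMAC tool: fetch pages suspected of hosting braindump content and search them for distinctive phrases lifted from the item bank. The URL and phrases are placeholders.

    # Generic illustration of keyword-based braindump detection (not the GMAC tool).
    # The URL and phrases are placeholders.
    import urllib.request

    suspect_urls = ["https://example.com/forum/thread-123"]
    distinctive_phrases = [
        "configure the replication schedule before enabling",   # unusual wording from an item stem
        "which three statements about the audit log are true",
    ]

    for url in suspect_urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                page = resp.read().decode("utf-8", errors="ignore").lower()
        except OSError as err:
            print(f"Could not fetch {url}: {err}")
            continue
        hits = [p for p in distinctive_phrases if p.lower() in page]
        if hits:
            print(f"Possible exposure on {url}: {hits}")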


Page 18

CEdMA Certification Special Interest Group

• Standard Setting

– Eva Chase – Informatica

• Defending Exam Enforcement in Court

– Liz Burns – Juniper Networks

• Item Positioning and Cheating Techniques

– Joe Cannata – Brocade

• Performance Based Testing

– Beverly Van de Velde - Symantec

Agenda

18

Page 19

CEdMA Certification Special Interest Group

Summary of 3 PBT sessions from ATP 2013 covering examples from Cisco, HP, Sun, RedHat, Microsoft, VMware, Citrix, Adobe

1. Developing a Performance-Based Board Review Test: Factors and Considerations

* Kim Thayn, Certification Management Services

* Karen Petrini, Hewlett Packard Corporation

2. Developing the Cisco Certified Design Expert Practical (CCDEp) Examination: A Case Study in Developing an IT Performance Test

* Russell Smith & Brian Adams, Alpine Testing Solutions

* Theresa Maniar, Cisco Systems, Inc.

3. Strategies for the Development and Delivery of Performance-Based Assessments

* Sinead Hogan & Gary Fluitt, Certiport

* Jim Brinton & Kim Thayn, Certification Management Services

Strategies for Performance-Based Testing Beverly van de Velde, Symantec

19

Page 20

CEdMA Certification Special Interest Group

• Many options exist depending on:

– Levels of complexity and interaction

– Exam security and integrity

• Design considerations

– Secure a psychometrician early in the process

– Conduct a global needs analysis and define program scope

– Determine resources and support / commitment required

– Identify key competencies and job tasks by:

• Levels of cognitive complexity

• Function / job considerations

• Blueprint weighting

• Environmental considerations (e.g., new versus old equipment, other various work models)

Strategies for Performance-Based Testing

20

Page 21

CEdMA Certification Special Interest Group

• Development time and complexity of exam format

– Average 6 to 12 months

– Proctored multiple-choice assessing “knowledge” combined with one or more of the following:

• Application / eligibility form, questionnaire, and/or references

• Work product / assignment evaluation (offline and/or panel reviewed)

• Proctored essay

• ILT followed by proctored hands-on lab

• Remote, cloud-based hands-on lab

• Subject matter expert panel review

– Determine how to represent a real-world situation

Strategies for Performance-Based Testing Beverly van de Velde, Symantec

21

Page 22

CEdMA Certification Special Interest Group

• Costs vary based on item types and scoring approach

– Advanced computer-based performance tasks and multiple choice, e.g.:

– 1 item is assigned one of 10 different scenarios

– Items have branching depending on the answer options selected by the candidates

– 1 item may have a table requiring multiple answers

• High fidelity, real-world, fully functional Flash-based items

• Low fidelity, simulations, HTML5 / Java-based items

– Lab-based (cloud/virtual or local/client); inside live application

• Human versus computer scored via custom scripts

– SME panel review of a work product / scenario

• Scoring rubrics based on individual tasks or final outcome

– Scored and unscored items

– Boolean logic to score “optimal”, “sub-optimal”, and “less than optimal” performance/responses (or unqualified, minimally qualified, and expert); a minimal sketch follows this slide

– Identification of “show stoppers”

Strategies for Performance-Based Testing (continued)

22
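The Boolean-logic scoring bullet above can be illustrated with a minimal sketch. The check names, the required/optional split, and the "show stopper" condition below are assumptions made up for the example, not an actual rubric from the sessions.

    # Minimal sketch of Boolean-logic rubric scoring for one performance task.
    # Check names and rules are invented; a real rubric maps to actual task steps.
    def score_task(checks: dict) -> str:
        """checks maps rubric-check names to True/False outcomes observed in the lab."""
        # "Show stopper": an outcome that fails the task regardless of anything else.
        if checks.get("deleted_production_data", False):
            return "less than optimal"
        required = ["service_running", "config_valid"]
        optional = ["logging_enabled", "backup_scheduled"]
        if all(checks.get(c, False) for c in required):
            # "Optimal" only if the best-practice steps were also completed.
            return "optimal" if all(checks.get(c, False) for c in optional) else "sub-optimal"
        return "less than optimal"

    print(score_task({"service_running": True, "config_valid": True,
                      "logging_enabled": True, "backup_scheduled": True}))   # optimal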

Page 23

CEdMA Certification Special Interest Group

• Challenges

– Development costs for exam items

– Delivery costs

– Development time; allow more time than anticipated for Alpha, Beta, and flexibility for “restarts”

– Cost to scale

– Validity & reliability

– Industry role definitions and terminology

– Homogenous SMEs, commitment, and availability

– Developing items of equal levels of difficulty

– Eliminating subjectivity and ensuring inter-rater reliability (panel reviews); an agreement-statistic sketch follows this slide

– “Funneling” to the correct answer / path (simulations)

– Localization [because of need for multi-lingual assessors]

Strategies for Performance-Based Testing (continued)

23
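For the inter-rater reliability challenge above, one common check (not necessarily what any of these programs use) is Cohen's kappa between two raters scoring the same candidates, which corrects raw agreement for agreement expected by chance. A small sketch with invented pass/fail ratings:

    # Cohen's kappa for two raters scoring the same candidates (invented ratings).
    from collections import Counter

    rater_a = ["pass", "fail", "pass", "pass", "fail", "pass", "fail", "pass"]
    rater_b = ["pass", "fail", "pass", "fail", "fail", "pass", "pass", "pass"]

    def cohens_kappa(a, b):
        n = len(a)
        observed = sum(x == y for x, y in zip(a, b)) / n          # p_o: raw agreement
        counts_a, counts_b = Counter(a), Counter(b)
        expected = sum((counts_a[c] / n) * (counts_b[c] / n)      # p_e: chance agreement
                       for c in set(a) | set(b))
        return (observed - expected) / (1 - expected)

    print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")   # values near 1 mean strong agreement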

Page 24

CEdMA Certification Special Interest Group

• Other considerations

– Delayed scoring

– Content / exam development and technical team collaboration

– Level of fidelity (look & feel, relevant functionality) requires careful and early planning; determine how to represent a real-world situation

– SME tech reviews “as-you-go” and training against real-world environment

– Retakes and expiration

– Candidate acknowledgement and videotaped delivery

– Logistics and delivery

– SME / panel / board members and benefits

– Scalability

• Scoring rubric – product agnostic with scenarios based on technology versus product

Strategies for Performance-Based Testing (continued)

24

Page 25

CEdMA Certification Special Interest Group

• Other considerations (continued)

– Determining levels of candidate benefits for achievement

– Ongoing, continuous maintenance / updates & adjusting scoring rubrics

– Ease of process and preparing candidates for what to expect

– Marketing and visibility

– Tracking, reporting, and analysis

– Instructors as SMEs are usually sensitive to localization issues

– Prepare for appeals process

– Exam security and proctored lunches / breaks

– No “skip item” and no “go back” due to branching items

– Assign mentors to candidates who fail

Strategies for Performance-Based Testing (continued)

25

Page 26

CEdMA Certification Special Interest Group

Certification Components

Application

• Submitted by candidate to showcase their business and technical skills to the review board

• External candidates must select a Reference to validate their skills and experience

• Board reviews application and makes accept/reject recommendation

Training

• Optional 2-day instructor-led class

• Training alone will NOT fully prepare a candidate to pass either exam

Entrance Exam

• Computer-based exam

• Administered following training class or at select Pearson VUE centers

Board Review Exam

• One day of solution design for customer scenario

• One-hour presentation to the Board

• Scored by Board Members to a detailed scoring rubric

HP Master Accredited Solutions Expert Candidate Path

NOTE: This slide provided by HP & CMS at ATP 2013

Page 27

CEdMA Certification Special Interest Group

Questions?

2/22/2013

27