
Challenges in Classifying Adverse Events in Cancer Clinical Trials

Steven Joffe, MD, MPH

Dave Harrington, PhD

David Studdert, JD, PhD

Saul Weingart, MD, PhD

Damiana Maloof, RN

Disclosure

• Member of clinical trial adverse event review board for Genzyme Corp (not oncology-related)

Adverse Events in Clinical Trials

• Adverse events (AEs) are critically important outcomes of clinical trials
  – Human subjects protection
  – Endpoints for judgments about benefits & risks of study interventions

• Captured on Case Report Forms

• Reported to oversight agencies


Components of AE Assessment

• Type

• Severity

• Relatedness to study agent(s)

• Expectedness

Together, these feed a global judgment about reportability to the IRB

Reporting Criteria (to Dana-Farber IRB)

• Grade 5 (fatal)

• Grade 4, unless specifically exempted

• Grade 2/3, if unexpected AND possibly, probably or definitely related

• Virtually identical to NCI’s Adverse Event Expedited Reporting System (AdEERS) criteria
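Stated as a decision rule, the criteria above amount to something like the following minimal Python sketch (function and parameter names are hypothetical; the actual policy includes exemption lists and case-by-case nuances not captured here):

```python
def reportable_to_irb(grade, expected, relatedness, grade4_exempt=False):
    """Hypothetical sketch of the reporting criteria listed above."""
    if grade == 5:                       # fatal events are always reportable
        return True
    if grade == 4:                       # grade 4, unless specifically exempted
        return not grade4_exempt
    if grade in (2, 3):                  # grade 2/3: unexpected AND at least possibly related
        return (not expected) and relatedness in ("possible", "probable", "definite")
    return False                         # other events do not meet the criteria
```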

AE Grading in Oncology

• NCI’s Common Terminology Criteria for Adverse Events (CTCAE) typically used
  – Effort to standardize nomenclature
  – Developed by consensus methods; no formal process to establish reliability of grading

http://ctep.cancer.gov/protocolDevelopment/electronic_applications/ctc.htm#ctc_v30

Aims

1. To assess the validity of physician reviewers’ determinations about whether AEs in cancer trials meet IRB reporting criteria

2. To assess the interrater reliability of reviewers’ determinations about whether AEs that occur in cancer trials meet IRB reporting criteria

3. To assess the validity and reliability of reviewers’ judgments about the components of AEs

Study Methods

Panelists’ Roles

• Review primary data from criterion sets of AEs

• Rate each AE:
  – Classification (from CTCAE)
  – Grade (from CTCAE)
  – Relatedness
  – Expectedness
  – Reportable to IRB

Panelist Demographics

                                  Expert Panel (n=3)   Second Panel (n=10)
Years since fellowship training
  Mean                            20 yrs               6.3 yrs
  Range                           10 – 32 yrs          2 – 17 yrs
Academic rank
  Instructor / Asst Prof          1                    10
  Assoc Prof / Prof               2                    0

Panelists’ Experience

                                  Expert Panel (n=3)   Second Panel (n=10)
Clinical trials served as overall Principal Investigator
  0 – 5                           0                    7
  ≥ 6                             3                    3
Clinical trials served as PI, site PI, or Co-Investigator
  0 – 5                           0                    2
  6 – 20                          0                    3
  > 20                            3                    5

Panelists’ Experience (continued)

                                  Expert Panel (n=3)   Second Panel (n=10)
Patients personally enrolled in a clinical trial during past 3 years
  0 – 10                          0                    1
  11 – 30                         1                    4
  > 30                            2                    5
Adverse event reports personally filed with the IRB during past 3 years
  0 – 10                          1                    6
  11 – 30                         0                    1
  > 30                            2                    3

Statistical Analysis

• Validity of judgments regarding reportability to IRB
  – % agreement with gold standard

• Interrater reliability of raters’ judgments
  – Kappa coefficients (formula sketched below)
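The slides do not name the kappa variant; with ten raters per judgment, a natural choice is Fleiss’ kappa, sketched here for reference:

```latex
% Fleiss' kappa: chance-corrected agreement among multiple raters
% \bar{P}_o = observed proportion of agreeing rater pairs, averaged over subjects
% \bar{P}_e = agreement expected by chance from the marginal category frequencies
\[
  \kappa = \frac{\bar{P}_o - \bar{P}_e}{1 - \bar{P}_e}
\]
```

Here κ = 1 indicates perfect agreement, and κ ≤ 0 indicates agreement no better than chance.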

Results

Criterion Set of AEs

Type of AE                            Grade   Related    Expected   Reportable
High triglycerides                    4       Definite   Y          Y
Osteonecrosis                         3       Definite   Y          N
Sensory neuropathy                    1       Probable   Y          N
Cardiac ischemia                      4       Possible   Y          Y
Rash                                  2       Probable   Y          N
Thrombosis                            4       Unlikely   N          Y
High uric acid                        4       Probable   N          Y
Cardiac dysfunction                   2       Definite   Y          N
Thrombotic thrombocytopenic purpura   4       Possible   N          Y
Renal failure                         4       Definite   Y          Y

Validity of Judgments Regarding Reportability to IRB

Adverse Event             Not Reportable   Reportable   % Agree
1. High triglycerides     0                10           100
2. Osteonecrosis          6                4            60
3. Sensory neuropathy     10               0            100
4. Cardiac ischemia       0                10           100
5. Rash                   9                1            90
6. Thrombosis             0                10           100
7. High uric acid         0                10           100
8. Cardiac dysfunction    8                2            80
9. TTP                    0                10           100
10. Renal failure         0                10           100
TOTAL                                                   93%
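The 93% total is consistent with pooling all 100 rater-level judgments (10 reviewers × 10 AEs) and counting matches with the gold standard:

```latex
% Votes agreeing with the gold standard, per AE, from the table above
\[
  \frac{10+6+10+10+9+10+10+8+10+10}{10 \times 10} = \frac{93}{100} = 93\%
\]
```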

Interrater Reliability of Panelists’ Judgments

Judgment        Kappa   P value
Reportability   0.75    < 0.0001
Grade           0.52    < 0.0001
Relatedness     0.22    < 0.0001
Expectedness    0.88    < 0.0001
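As a consistency check, here is a minimal Fleiss’ kappa sketch in Python (the exact statistic the authors used is an assumption, as noted above). Applied to the reportability vote counts from the validity table, it reproduces the reported kappa of 0.75:

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an N-subjects x k-categories matrix of rater counts.

    counts[i, j] = number of raters assigning adverse event i to category j;
    every row must sum to the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                                  # raters per AE
    p_e = np.square(counts.sum(axis=0) / counts.sum()).sum()   # chance agreement
    p_o = ((np.square(counts).sum(axis=1) - n) / (n * (n - 1))).mean()  # observed
    return (p_o - p_e) / (1 - p_e)

# (not reportable, reportable) vote counts from the validity table above
votes = [[0, 10], [6, 4], [10, 0], [0, 10], [9, 1],
         [0, 10], [0, 10], [8, 2], [0, 10], [0, 10]]
print(round(fleiss_kappa(votes), 2))  # -> 0.75, matching the table above
```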

Role of Experience: Rank
[Figure: kappa coefficients by academic rank]

Role of Experience: Service as PI
[Figure: kappa coefficients by service as PI]

Role of Experience: Number of AE Reports Filed
[Figure: kappa coefficients by number of AE reports filed]

Conclusions

• Oncologists’ judgments about whether AEs require reporting to the IRB show high agreement with the gold standard

• Interrater reliability of oncologists’ judgments about components of AEs varies
  – High: expectedness of AE; need for reporting
  – Moderate: grade of AE
  – Low: relationship of AE to study agents

Limitations

• Small sample sizes
  – Criterion set of AEs
  – Panel of physician reviewers

• Generalizability of set of AEs

• Reviewers may not reflect population of investigators who file AE reports

• Judgments based on document review rather than on firsthand knowledge

Thoughts About Direction of Bias in Agreement Statistics

• Factors biasing towards less agreement
  – Reviewer experience

• Factors biasing towards greater agreement
  – Standardized set of documents for review
  – Criterion set selected based on maximum agreement among expert panel reviewers

Implications

• Judgments about AEs are complex

– Human subjects: efforts to enhance reliability, or to minimize reliance on judgments about causation, are needed

– Science: toxicity data from uncontrolled trials may be misleading

– RCR (responsible conduct of research): education about the need for reporting is important but insufficient

Acknowledgments

• Debra Morley
• Anna Mattson-DiCecca
• Physician panelists

• ORI
• NCI
• Milton Fund