
1

Methodologic Issues and Next Steps in the Implementation of the Results Database

Deborah A. Zarin, M.D., ClinicalTrials.gov

February 24, 2009

2

History of ClinicalTrials.gov

• FDAMA 113 (1997): Mandates Registry
  – IND trials for serious and life-threatening diseases or conditions
• ClinicalTrials.gov Launched in February 2000
• Calls for Increased Transparency of Clinical Trials
  – Maine State Law; State Attorneys General
  – Journal Editors (2004)
• ClinicalTrials.gov Accommodates Other Policies
• FDAAA 801 (2007): Expands Registry and Adds Results Database

3

Public Law 110-85, Sec. 801: Expanded Clinical Trial Registry

• Enacted on September 27, 2007
• Requires Trial Registration (Dec 2007)
  – Phase II-IV drug and device trials for all diseases
  – Data elements: ClinicalTrials.gov + ~ WHO/ICMJE
• Requires Results Reporting (Sept 2008)
  – Trials of FDA-approved or cleared drugs and devices
  – “Basic” Results: Baseline Characteristics, Primary & Secondary Outcomes, Statistical Analyses
  – Adverse Events (Sept 2009)
  – “Expansion” of results by rulemaking (Sept 2010)
• Added enforcement provisions

Enforcement Provisions

• Notices of non-compliance

• Civil monetary penalties up to $10,000/day

• Withholding of NIH grant funds

Key Terms

• Applicable Clinical Trials
  – Interventional trials
  – Phase 2-4 drug, biologic, device
  – >= one site in U.S.
  – Ongoing as of 9/27/07, or later
• Responsible Party
  – Sponsor, grantee
  – PI if designated
• Completion Date

New Registrations Continue to Increase: Number of New Records Since May 1, 2005

[Figure: number of new records by date, May 2005 through October 2008; y-axis scale 0 to 70,000]

7

ClinicalTrials.gov Statistics (as of 02/03/2009)

                                         Number    Percent
Total                                    67,064    100%
Type of Trial*
  Observational                          10,690    16%
  Interventional                         57,119    84%
    – Drug & Biologic                    42,684
    – Surgical Procedure                  8,585
    – Behavioral, Gene Transfer, Other    7,997
    – Device**                            3,862
International Sites (161 countries)
  US only                                32,772    49%
  Non-US only                            23,109    34%
  US & Non-US mixed                       4,064     6%
  Missing                                 7,119    11%

*171 records missing Study Type information
**173 device trials – “delayed posting”

8

ClinicalTrials.gov Statistics (cont.) (as of 02/03/2009)

User Statistics
  Page Views per month        40 Million
  Unique visitors per month   500,000

Trials by Data Provider          Number    Percent
  US Federal (including NIH)     18,088    27%
  Industry                       21,072    31%
  University, Other              28,820    42%
  Total                          67,980

9

Basic Results Database

10

Basic Results Database: General Characteristics

• Results of “applicable clinical trials” of FDA-approved/cleared medical products

• Generally, submission within 12 months of the earlier of estimated or actual trial completion date (of primary outcome)

• Delayed Submission of Results
  – Seeking approval of a new use
  – Extensions for “good cause”
    • E.g., data are still blinded
    • NOT: awaiting publication

Wood AJJ. Progress and deficiencies in the registration of clinical trials. NEJM. 2009


Basic Results Modules

• Participant Flow
• Baseline and Demographic Characteristics
• Outcome Measures
• Adverse Events (summary data)
• Other Information
  – “Certain Agreements” Restricting Results Disclosure
  – Overall Limitations and Caveats
  – Results Point of Contact

Key Issues

1. The Results Database is Working

2. Is the Results Database a Good Thing?

3. Who is the Audience?

4. What have we learned so far?

5. To Be Considered…

1. The Results Database is Working

16

Current Status – “Basic Results” (as of 02/06/09)

• Functional Web-based Data Entry System
• Launched in September 2008
• Ongoing system of feedback and improvements
• 410 Results Records have been submitted
  – Industry: 293 records from 72 data providers
  – Other: 117 records from 80 data providers
• Anticipate greatly increased rate of submission

17

Results Data Entry Process

Technical Issues

18

Design Requirements

• Display consists of data tables with minimal text—must be self-explanatory
• System must accommodate range of study designs and facilitate comparison across studies
• NLM directed to:
  – Consider different methods of display based on principles of risk communication for different audiences
  – Ensure the data are searchable in many ways
• Structured data entry required to facilitate search and display needs

19

Design Features

• Tables are “constructed” by the data provider
  – Columns are pre-set as study arms, but can be changed by the data provider
  – Rows are measures—some are pre-set, others are customized for each study
  – Type of measure determines specific design of “cells”
• Attempt to balance fixed structure with flexibility
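The “constructed table” idea above lends itself to a simple illustration. The Python sketch below is not part of the original slides and is not the NLM implementation; all class and field names are hypothetical. It shows arms as pre-set columns and measures as rows, with the measure type carried as an attribute that would drive cell layout.

```python
# Minimal sketch of a results table: columns are study arms, rows are measures.
# Hypothetical structure, for illustration only.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Measure:
    name: str                     # row label, e.g. "Number of Participants Analyzed"
    unit: str                     # e.g. "Participants", "mmHg"
    measure_type: str             # e.g. "Number", "Mean" -- would determine cell design
    values: Dict[str, str] = field(default_factory=dict)   # arm label -> cell text


@dataclass
class ResultsTable:
    arms: List[str]                           # pre-set columns (arms)
    rows: List[Measure] = field(default_factory=list)

    def render(self) -> str:
        """Render a plain-text table: one column per arm, one row per measure."""
        header = "Measure".ljust(45) + "".join(a.ljust(15) for a in self.arms)
        lines = [header]
        for m in self.rows:
            label = f"{m.name} [{m.unit}]"
            cells = "".join(m.values.get(a, "").ljust(15) for a in self.arms)
            lines.append(label.ljust(45) + cells)
        return "\n".join(lines)


table = ResultsTable(arms=["Placebo", "Drug X"])
table.rows.append(Measure("Number of Participants Analyzed", "Participants", "Number",
                          {"Placebo": "220", "Drug X": "211"}))
print(table.render())
```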

20

Principles for Using the Basic Results Database

• Submitted data are used to develop basic tables for the public display

• Tables must be interpretable by people not familiar with each particular study

• Labels for rows, columns, and units of measure must be meaningful and precise

Resources to Help Data Providers

• “Helpful Hints”
  – Illustrates process for entering different study designs (parallel, crossover, diagnostic accuracy, bioequivalence—in progress)
• Webinar
• “Common Errors”
• Individual discussions regarding particular studies
• Presentations

22

Results Database: Challenges

• Data tables will be a public representation of the study—must be clear and informative

• Many entries are not logical or informative

• Posting such entries will embarrass all parties and will not contribute to the public good

23

Sample Posted Results

Reasons Not Completed

Milestone

Arms

Multiple “Periods”

Crossover Design

User-Specified Measure

“Default” Required Measures

Categories

Arms

29

Categories

Statistical Analysis

Statistical Analysis

34

36

Example: Study Record with Results including 29 Posted Outcome Measures

Results Data Submissions (as of 2/23/09)

• 432 Studies Total
• By Sector
  – 311 (72%) Private
  – 121 (28%) Public
• Data Providers
  – 76 Private (Mean: 4.1 studies/org)
  – 104 Public (Mean: 0.9 studies/org)

Results Data Posted (as of 2/20/09)

• 54 Total
• Study Type
  – 3 (6%) Observational
  – 51 (94%) Interventional
• Provider Category
  – 3 (6%) NIH
  – 35 (65%) Industry
  – 16 (29%) Other

Results Data Posted (cont.) (as of 2/20/09)

• Intervention Type
  – 39 Drug & Biologic
  – 6 Device
  – 4 Behavioral
  – 1 Medical Procedure
• Study Design
  – 39 Randomized
  – 23 Masked, double
  – 15 Control, active
  – 13 Control, placebo
• Adverse Events (optional)
  – 7/54 (13%) Studies
• Phase
  – 5 Phase 1
  – 9 Phase 2
  – 25 Phase 3
  – 11 Phase 4
  – 9 N/A
• Outcome Measures [mean (range)]
  – 6.6 Primary (1-49)
  – 6.4 Secondary (0-49)
  – <1 Other Prespecified (0-8)
• 33 of 87 include Adverse Event module

2. Is the Results Database a Good Thing?

Powerful Search Features

• Synonymy
  – UMLS, etc.

• Hierarchies

• Spelling relaxation

What about lack of peer review?

• Large social experiment—results unknown

• On the other hand, industry (e.g., GSK) results databases have been around
  – Nissen: rosiglitazone

• Peer review is not a panacea

48

Published Results

• Search of PubMed Secondary Source ID [SI] field by NCT number from 248 records with results
  – Identified 10 citations (NCT # in Abstract)
• Manual search of 79 records with results in PubMed (without NCT numbers)
  – Identified 2 citations (no NCT # in Abstract)
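The PubMed lookup described above can be approximated programmatically. Below is a hedged sketch using NCBI E-utilities (esearch against the Secondary Source ID [si] field); it assumes the third-party requests package and reflects the E-utilities interface as I understand it, not the exact method behind the counts on this slide.

```python
# Sketch: find PubMed citations whose Secondary Source ID field contains an NCT number.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def pubmed_ids_for_trial(nct_number: str) -> list:
    """Return PubMed IDs indexed with the given NCT number in the [si] field."""
    params = {"db": "pubmed", "term": f"{nct_number}[si]", "retmode": "json"}
    resp = requests.get(EUTILS, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]


if __name__ == "__main__":
    # Example NCT number taken from this talk (the JUPITER trial).
    print(pubmed_ids_for_trial("NCT00239681"))
```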

Conclusions: Paroxetine is generally well tolerated and effective for major depression in adolescents.


Publication Bias

Source: Figure 1A. Turner et al. (NEJM, 2008)

Source: Silverstein FE et al. JAMA. 2000 Sep 13;284(10):1247-55.

Kaplan-Meier estimates for ulcer complications according to traditional definition. Results are truncated after 12 months; no ulcer complications occurred after this period. Adapted from Lu 2001.

Source: Jüni P, Rutjes AW, Dieppe PA. BMJ. 2002 Jun 1;324(7349):1287-8.

Lee K, Bacchetti P, Sim I. PLoS Med. 2008;5(9): e191

Zarin DA, Tse T. Medicine. Moving toward transparency of clinical trials. Science. 2008 Mar 7;319(5868):1340-2.

3. Who is the Audience?

Is it possible to create “stand-alone” tables that are informative?

Quality Assurance Challenges

• Data tables will be the public representation of the study—must be clear and informative
• NLM QA Focuses on:
  – Apparent Validity (when possible)
  – Meaningful Entries
  – Internal consistency/logic
  – Format

58

Common Quality Concerns

• Reporting of percentage without reporting absolute numbers
• Improper use of terms
  – Incidence
  – Proportion and Ratio
  – Frequency
• Reporting a change—lack of specificity
  – Subtraction: minuend vs. subtrahend
  – Ratio: numerator vs. denominator
• Complicated outcomes that cannot be understood
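Two of these concerns lend themselves to automated screening. The sketch below is illustrative only; the field names and rules are hypothetical, not ClinicalTrials.gov QA code. It flags a “%” sign in a field that should be numeric and a percentage reported without its underlying counts.

```python
# Sketch: screen a single data cell for two of the common quality concerns above.
import re
from typing import List, Optional

PERCENT_SIGN = re.compile(r"%")
PLAIN_NUMBER = re.compile(r"^-?\d+(\.\d+)?$")


def check_measure_cell(value: str, n_analyzed: Optional[int]) -> List[str]:
    """Return a list of problems found in one reported cell (illustrative only)."""
    problems = []
    if PERCENT_SIGN.search(value):
        problems.append("invalid entry: a numeric field cannot include '%'")
        if n_analyzed is None:
            problems.append("percentage reported without the absolute numbers behind it")
    elif not PLAIN_NUMBER.match(value.strip()):
        problems.append("entry is not a plain number")
    return problems


print(check_measure_cell("4.4%", n_analyzed=None))   # both flags raised
print(check_measure_cell("186", n_analyzed=351))     # a count with its denominator -> no flags
```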

59

Where is the Quality Line?

Domains of Quality:
  • Quality of Entries: Not Meaningful ↔ Meaningful
  • Scope of Entries: Minimal ↔ Comprehensive

QA Staff Resources

Registration and Results Data Must be Consistent

• Participant Flow Numbers and Enrollment
• Study Design and Results Tables
  – Number of Arms

61

Summary Protocol Section:
  Actual Enrollment: 229
  Study Start Date: June 2006
  Study Completion Date: October 2007
  Primary Completion Date: October 2007 (Final data collection date for primary outcome measure)

Basic Results Section:
  Participant Flow: Initial Treatment
                    Placebo    Drug X
  STARTED             220        211
  COMPLETED           218        210
  NOT COMPLETED         2          1

Actual enrollment (229) displayed in the protocol section does not match the total number started in the basic results section (220 + 211 = 431).

BEFORE Revision (Public View)
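The cross-check illustrated above is mechanical: Actual Enrollment in the protocol section should equal the total number STARTED across arms in the first participant-flow period. A minimal sketch, with hypothetical field names:

```python
# Sketch: registration/results consistency check (illustrative only).
def enrollment_matches_flow(actual_enrollment: int, started_by_arm: dict) -> bool:
    """True if enrollment equals the sum of participants started across arms."""
    return actual_enrollment == sum(started_by_arm.values())


# Example from the slide: 229 enrolled vs 220 + 211 = 431 started -> flagged.
print(enrollment_matches_flow(229, {"Placebo": 220, "Drug X": 211}))   # False
```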

Table Structure Must be Logical

63

Treatment Satisfaction Questionnaire After 18 Weeks of Treatment [units: Score]

Measured Values                    Drug X, Week 10    Drug X, Change from Week 10 to 18
Number of Participants Analyzed          88                     80
Mean ± Standard Deviation            81 ± 17.46             7.9 ± 12.16

Inconsistency between columns and rows: Measure “at week 10” and Measure “after 18 weeks of treatment”.

BEFORE Revision (Public View)

Data Must Make Sense

• Outcome Measure Name, Description, Units and Data are Compatible

65

Hours Per Day of Sleep [units: Average Hours per Day]

Measured Values                    Intervention X    Control
Number of Participants Analyzed          28             27
Mean ± Standard Deviation            823 ± 92        864 ± 106

Inconsistency between Units of Measure, “average hours per day,” and Measure Data: the values provided are greater than the total number of hours in a day.

BEFORE Revision (Public View)
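A unit-plausibility screen would catch this kind of entry automatically. The sketch below is illustrative only; the bounds table is an assumption, not an actual ClinicalTrials.gov rule set.

```python
# Sketch: flag values that fall outside plausible bounds for their unit of measure.
PLAUSIBLE_BOUNDS = {
    "hours per day": (0.0, 24.0),
    "percentage of participants": (0.0, 100.0),
}


def plausible(value: float, unit: str) -> bool:
    """True if the value is within the plausible range for the given unit."""
    low, high = PLAUSIBLE_BOUNDS.get(unit.lower(), (float("-inf"), float("inf")))
    return low <= value <= high


print(plausible(823, "hours per day"))   # False -- flagged for review
print(plausible(7.2, "hours per day"))   # True
```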

66

Secondary Outcome Measure: Use of Community Health Resources
  Measure Type: Secondary
  Measure Name: Use of Community Health Resources
  Measure Description: Evaluation of visits to primary care pediatrician, hospital emergency and re-hospitalization
  Time Frame: Up to 3 months after discharge
  Safety Issue: No

Measured Values                      Early Discharge    Standard Discharge
Number of Participants Analyzed            90                  86
Use of Community Health Resources
[units: Number]                           4.4%               10.5%

BEFORE Revision (Public View)

• Data are inconsistent: percentages of what?
• Invalid entry: needs to be numerical (cannot include “%”)
• “[units: Number]” implies number of health resources used – how was it measured?

67

Primary Outcome Measure: Frequency and Magnitude of Antibody Response
  Measure Type: Primary
  Measure Name: Frequency and Magnitude of Antibody Response
  Measure Description: Nasal secretions to Virus A/12 and B/14. Antibody Response: Three-fold increase after immunization
  Time Frame: Visit 3 (Week 15)
  Safety Issue: Yes

Measured Values                                 Vaccine, Low Dose    Vaccine, High Dose
Number of Participants Analyzed                        35                   34
Frequency and Magnitude of Antibody Response
[units: Participants]                                  17                   21

BEFORE Revision (Public View)

• “Three-fold increase” may mean “three-fold or greater increase”
• The same unit cannot represent measures of “frequency” and “magnitude”
• “Participants” is not a unit of measure for “frequency” or “magnitude”
• Best to provide both categories for a dichotomous measure: < 3x increase; ≥ 3x increase

Tables Must Be Informative

• Scales should include:
  – Full name
  – Construct or domain (e.g., pain)
  – Direction of scores (Best/Worst Value)
  – Other information as necessary
• Measures Have Useful Descriptions
• Avoid Abbreviations

69

BEFORE Revision (Public View)

Baseline Measures                        Investigational Drug X
GOG Performance Status [units: Score]
  0                                               48
  1                                               27
  2                                                4

• Need information about this scale: Full Name, Construct/domain, Range and directionality
• Need information about these values (e.g., is “0” better or worse than “2”?)
• Units need to change to “participants” – data represent “number of participants” with a particular score
• Are these the only possible scores?

Secondary Outcome Measure: Duration (Days)
  Measure Type: Secondary
  Measure Name: Duration (Days)
  Measure Description: Extent of Exposure for All Treated Subjects
  Time Frame: Duration of Study
  Safety Issue: No

Measured Values                      Open Label
Number of Participants Analyzed         403
Duration (Days) [units: Days]
Mean ± Standard Deviation            195.5 ± 43.87

BEFORE Revision (Public View)

• Needs description: Duration of what?
• Needs Arm Label: What is the intervention?

Measure Information Must be Precise and Accurate

• Avoid misuse of terms, e.g.,
  – proportion
  – ratio
  – incidence
• State what is being measured and how
  – Do not provide results in measure description field

Primary Outcome Measure: Proportion of Patients with Controlled SBP
  Measure Type: Primary
  Measure Name: Proportion of Patients with Controlled SBP
  Measure Description: Controlled SBP defined as SBP < 130 mmHg
  Time Frame: Baseline to 12 weeks
  Safety Issue: No

Measured Values                                Drug X    Drug Y    Drug X + Y
Number of Participants Analyzed                  351       361        384
Proportion of Patients with Controlled SBP
[units: Participants]                            186       135        287

BEFORE Revision (Public View)

• Spell out acronym (SBP)
• The reported values are counts of participants, not proportions
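For completeness, the proportions the measure name promises can be derived from the counts and the number analyzed. A small illustrative calculation (values taken from the table above):

```python
# Sketch: convert the reported counts into the proportions the measure name implies.
def proportion_controlled(n_controlled: int, n_analyzed: int) -> float:
    return n_controlled / n_analyzed


for arm, (controlled, analyzed) in {
    "Drug X": (186, 351),
    "Drug Y": (135, 361),
    "Drug X + Y": (287, 384),
}.items():
    print(f"{arm}: {controlled}/{analyzed} = {proportion_controlled(controlled, analyzed):.2f}")
```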

Primary Outcome Measure: Change in Sitting DBP From Baseline to End of Study
  Measure Type: Primary
  Measure Name: Change in Sitting DBP From Baseline to End of Study
  Measure Description: Change in Sitting DBP
  Time Frame: Baseline to 12 weeks
  Safety Issue: No

Measured Values                                        Drug X       Drug Y      Drug X + Y
Number of Participants Analyzed                          351          361          384
Change in Sitting DBP From Baseline to End of Study
[units: mmHg]
Least Squares Mean ± Standard Error                   -8.4 ± 0.2   -6.7 ± 0.2   -11.2 ± 0.3

BEFORE Revision (Public View)

• Spell out acronym (DBP)
• Specify calculation details: which value was subtracted from which?

Secondary Outcome Measure: To Compare Drug X and Drug Y for Efficacy
  Measure Type: Secondary
  Measure Name: To Compare Drug X and Drug Y for Efficacy
  Measure Description: (blank)
  Time Frame: 4 months
  Safety Issue: No

BEFORE Revision (Public View)

• State what is being measured, not the purpose
• Needs description: what is being measured and how?

Data in All Tables Must be Internally Consistent and Logical

• Participants must “flow”

• “Number analyzed” must be consistent with participant flow data

• Avoid Illogical Entries

76

Participant Flow: First Period
                 Placebo    Drug X
STARTED            301        299
COMPLETED          291        285
NOT COMPLETED       10         14

Participant Flow: Second Period
                 Placebo    Drug X
STARTED            298        290
COMPLETED          288        278
NOT COMPLETED       10         12

BEFORE Revision (Public View)

Number of participants STARTED in the second period of the Participant Flow needs to be the same as the number COMPLETED in the first period.
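The rule stated above can be checked mechanically, arm by arm and period by period. A minimal sketch with a hypothetical data layout:

```python
# Sketch: participants must "flow" -- STARTED in period i must equal COMPLETED in period i-1.
def flow_violations(periods: list) -> list:
    """Return (period_index, arm) pairs where the cross-period rule is violated."""
    violations = []
    for i in range(1, len(periods)):
        prev, curr = periods[i - 1], periods[i]
        for arm in curr["started"]:
            if curr["started"][arm] != prev["completed"].get(arm):
                violations.append((i, arm))
    return violations


periods = [
    {"started": {"Placebo": 301, "Drug X": 299}, "completed": {"Placebo": 291, "Drug X": 285}},
    {"started": {"Placebo": 298, "Drug X": 290}, "completed": {"Placebo": 288, "Drug X": 278}},
]
print(flow_violations(periods))   # [(1, 'Placebo'), (1, 'Drug X')] -- both arms flagged
```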

77

Hours Per Day of Sleep [units: Average Hours per Day]

Measured Values              Intervention X    Control
Number of Participants             28             27
Mean ± Standard Deviation      823 ± 92        864 ± 106

Inconsistency between Units of Measure, “average hours per day,” and Measure Data: the values provided are greater than the total number of hours in a day.

BEFORE Revision (Public View)

Statistical Analyses

• Must be Logical

• Compatible with Data

• Informative (report informative metrics)

79

Statistical Analysis 1 for Parental Stress
  Groups: Early Discharge vs. Standard Discharge
  Method: ANOVA
  P-Value: 0.05
  Mean Difference (Net): 9

Measured Values              Early Discharge    Standard Discharge
Number of Participants            100                 100
Parental Stress [units: Points on a Likert Scale]
Mean ± Standard Deviation       9.3 ± 1.2           7.8 ± 2.1

BEFORE Revision (Public View)

Inconsistency between Measure Data and Method of Estimation:
• Reported Mean Difference: “9”
• By Inspection: 9.3 – 7.8 = 1.5
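A plausibility check of this kind compares the reported estimate with what the displayed group means imply. The sketch below is illustrative only; the tolerance and field names are assumptions.

```python
# Sketch: is the reported mean difference compatible with the difference of the group means?
def mean_difference_plausible(reported: float, mean_a: float, mean_b: float,
                              tolerance: float = 0.1) -> bool:
    """True if the reported difference matches mean_a - mean_b within a rounding tolerance."""
    return abs(reported - (mean_a - mean_b)) <= tolerance


# Example from the slide: reported 9, but 9.3 - 7.8 = 1.5 -> flagged.
print(mean_difference_plausible(9.0, 9.3, 7.8))   # False
```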

80

Secondary Outcome Measure: Time to Relapse of a Mood Episode
  Measure Type: Secondary
  Measure Name: Time to Relapse of a Mood Episode
  Measure Description: (blank)
  Time Frame: 24 months
  Safety Issue: No

Measured Values                            Placebo           Investigational Drug X
Number of Participants                       148                     153
Time to Relapse of a Mood Episode [units: Days]
Median (Inter-Quartile Range)           219 (83 to NA)          NA (173 to NA)

BEFORE Revision (Public View)

• Needs description
• Invalid entry (“NA”)

Who is the Audience?
• PI and Clinical Research Team (You!)
• Other Medical Researchers in same field
• [Study Sponsor]
• Other Medical Researchers in other fields
• Other Readers of the medical literature
• Science Writers
• Lay Public (readers of consumer health literature)

Lessons Learned from Early Submissions of Basic Results

• Many iterations with the QA staff are necessary to reach minimal quality standards and to correct serious flaws
• Data Providers must be able to understand the study design and data analysis
  – Typically, the investigator and a statistician will need to be involved

82

4. What Have We Learned So Far?

84

“Studies” vary in size!!

• 1 Primary Outcome Measure
  – 1 Statistical Analysis
• 13 Secondary Outcome Measures
  – 147 Statistical Analyses
• 31 Other Pre-specified Outcome Measures
  – 351 Statistical Analyses

85

Primary Outcome Measure

• Sum of Sore Throat Pain Intensity Difference (SPID2) on Swallowing at 2 hours Post-First Dose

86

Sample Secondary Outcome Measures

• Sore Throat Pain Intensity Difference (PID) Within 6 hours Post-First Dose

• Sum of Sore Throat Pain Intensity Difference (SPID) Within 6 hours Post-First Dose

• Sore Throat Relief Rating Scale (STRRS) Within 6 hours Post-First Dose

• Sore Throat Relief Rating Scale (STRRS) - 'Moderate Relief' at 6 hours Post-First Dose

• Patient's Global Evaluation of Study Medication at 6 hours Post-First Dose

87

Sample Other Pre-specified Outcome Measures

• Throat Soreness Scale (TSS) Difference Within 6 hours Post-First Dose
• Sum of Throat Soreness Scale (TSS) Difference Within 6 hours Post-First Dose
• Difficulty Swallowing Scale (DSS) Difference Within 6 hours Post-First Dose
• Sum of Difficulty Swallowing Scale (DSS) Difference Within 6 hours Post-First Dose
• Difficulty Swallowing Scale (DSS) Difference at Least 50% Gone at 6 hours Post-First Dose
• Subjects with >= 50% Total Pain Relief (TOTPAR) at 6 hours Post-First Dose
• Number Needed to Treat (NNT) to Achieve at Least 50% of Maximum Total Pain Relief (TOTPAR) at 6 hours Post-First Dose
• Subjects with Sore Throat Pain at Least 35% Gone and at Least 50% Gone at 2 and 6 hours Post-First Dose
• Subjects who Achieved Their Own Level of 'Meaningful Relief' Within 6 hours who had Perceptible Relief Onset Time Within 1 hour

88

29 Posted Outcome Measures

89

Interesting Findings to Date

• Large numbers of submitted Outcome Measures and Statistical Analyses
• Power of Defaults (e.g., “Baseline Measures”)
  – Age > 65
  – Race and Ethnicity
  – Region of Enrollment
• Problems with imprecise entries

Study-Specific Measures: Characteristics
(n = 87 submitted records with results)

• Total: 134 Measures
• Mean: 1.5 Measures per Record
• Range: 0 to 16
• Examples
  – Baseline LDL-C strata
  – Daytime asthma symptom score
  – Glomerular filtration rate
  – Maternal body mass index
  – Type of allergic rhinitis

Distribution of Measure Types for Study-Specific Measures

(n = 87)

• 42 Number

• 78 Mean

• 14 Median

• 0 Least Squares Mean

• 0 Geometric Mean

• 0 Log Mean

Distribution of Measure of Dispersion for Study-Specific Measures

(n=87)

• 42 Not Applicable

• 73 Standard Deviation

• 2 Inter-Quartile Range

• 17 Full Range

Distribution of Measure of Dispersion for All Outcome Measures

(n=87)

• 149 Not Applicable

• 169 Standard Deviation

• 5 Inter-Quartile Range

• 20 Full Range

• 37 Standard Error

• 59 95% Confidence Interval

Statistical Analysis Data Structure

Method - Choices

• ANCOVA
• ANOVA
• Chi-squared
• Chi-squared, Corrected
• Cochran-Mantel-Haenszel
• Fisher Exact
• Kruskal-Wallis
• Log Rank
• Mantel Haenszel
• McNemar
• Mixed Models Analysis
• Regression, Cox
• Regression, Linear
• Regression, Logistic
• Sign Test
• t-Test, 1-sided
• t-Test, 2-sided
• Wilcoxon (Mann-Whitney)
• Other
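One way to picture this structured data entry is as a record whose method field is constrained to the enumerated choices above, with free text required only for “Other”. The sketch below is a hypothetical representation, not the actual ClinicalTrials.gov schema.

```python
# Sketch: a structured statistical-analysis entry with an enumerated Method field.
from dataclasses import dataclass
from typing import Optional

METHOD_CHOICES = {
    "ANCOVA", "ANOVA", "Chi-squared", "Chi-squared, Corrected",
    "Cochran-Mantel-Haenszel", "Fisher Exact", "Kruskal-Wallis", "Log Rank",
    "Mantel Haenszel", "McNemar", "Mixed Models Analysis", "Regression, Cox",
    "Regression, Linear", "Regression, Logistic", "Sign Test",
    "t-Test, 1-sided", "t-Test, 2-sided", "Wilcoxon (Mann-Whitney)", "Other",
}


@dataclass
class StatisticalAnalysis:
    groups: str                               # e.g. "Early Discharge vs. Standard Discharge"
    method: str                               # must be one of METHOD_CHOICES
    p_value: Optional[float] = None
    other_method_text: Optional[str] = None   # required free text when method == "Other"

    def __post_init__(self):
        if self.method not in METHOD_CHOICES:
            raise ValueError(f"method must be one of the pre-defined choices: {self.method!r}")
        if self.method == "Other" and not self.other_method_text:
            raise ValueError("method 'Other' requires descriptive text")


analysis = StatisticalAnalysis("Early Discharge vs. Standard Discharge", "ANOVA", p_value=0.05)
print(analysis.method)
```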

“Other” Statistical Test Methods

• Percent of ITT subjects

• LL of 97.5% CI of difference

• Kaplan Meier

• Rank-Sum Test

• Student's t-test

• binomial distribution

• Linear mixed model

• t-Test, Paired

• 2-sided Log Rank test

• General Linear Model

• McNemar-Bowker

• Nonparametric ANCOVA

• Longitudinal Data Analysis (LDA)

• One proportion binomial exact test

• Wilcoxon Rank-Sum

Estimated Parameter - Choices

• Cox Proportional Hazard
• Hazard Ratio (HR)
• Hazard Ratio, log
• Mean Difference (Final Values)
• Mean Difference (Net)
• Median Difference (Final Values)
• Median Difference (Net)
• Odds Ratio (OR)
• Odds Ratio, log
• Risk Difference (RD)
• Risk Ratio (RR)
• Risk Ratio, log
• Slope
• Other

“Other” Parameter Names

• Mean
• Mean Response Ratio
• Difference in cumulative probability
• Kaplan Meier Estimate
• Cumulative probability
• LS mean difference of SPID48
• Least Square Mean Change Difference

• Proportion of patients achieving control

• Geometric least-squares mean ratio

• Rate Difference

• Percentage

• Percentage of patients castrated

• Clopper-Pearson

Distribution of Estimation Parameter Method for All Statistical Analyses

(n=87 records)

• 0 Cox Proportional Hazard

• 2 Hazard Ratio (HR)

• 0 Hazard Ratio, log

• 63 Mean Difference (Final Values)

• 2 Mean Difference (Net)

• 3 Median Difference (Final Values)

• 0 Median Difference (Net)

• 2 Odds Ratio (OR)

• 0 Odds Ratio, log

• 1 Risk Difference (RD)

• 2 Risk Ratio (RR)

• 0 Risk Ratio, log

• 0 Slope

• 99 Other

Statistical Analyses: Characteristics
(n = 87 submitted records with results)

• Total: 305 Analyses
• Per Record
  – Mean: 3.5
  – Median: 0
  – Range: 0 – 40
• Per Outcome Measure
  – Mean: 0.8
  – Median: 0
  – Range: 0 – 20

Distribution of Statistical Test Method for All Statistical Analyses

(n=87 records)

• 55 ANCOVA
• 2 ANOVA
• 4 Chi-squared
• 0 Chi-squared, Corrected
• 12 Cochran-Mantel-Haenszel
• 12 Fisher Exact
• 2 Kruskal-Wallis
• 9 Log Rank
• 0 Mantel Haenszel
• 1 McNemar
• 10 Mixed Models Analysis
• 0 Regression, Cox
• 0 Regression, Linear
• 2 Regression, Logistic
• 0 Sign Test
• 3 t-Test, 1-sided
• 0 t-Test, 2-sided
• 4 Wilcoxon (Mann-Whitney)
• 64 Other

104

Posted Basic Results: Certain Agreements (as of 1/12/09)

• 8 PIs are Employees
• 36 PIs are Not Employees
  – 15 No Agreement
  – 21 Has Agreement
    • 7 Embargo ≤ 60 days
    • 7 Embargo > 60 days, but ≤ 180 days
    • 7 Other*

* “The PI shall have the right to publish the results of research to include in any publication. The PI will provide the Sponsor with at least sixty (60) days for review of a manuscript. No paper that incorporates Sponsor Confidential Information will be submitted for publication without Sponsor’s prior written agreement.”

5. Issues To Be Considered

• Display

• Content

Display

• Tracking and Displaying Changes

• Displaying Scale Information from a “data bank”

• Links to Systematic Reviews and other “authoritative information”

• Links to Consumer Health Information

Public View and Changes to Records—Current Policy

• Default public view is always most recent entry in registry and results records
• Records are never “frozen”
  – Data provider can change entries at any time
• Post “First received” and “last updated”
• Archive site “tracks changes”
• Concerns:
  – Not all users understand what they are seeing
  – May not be best policy

Issues

• Distinguish between “freezing” data and default display
• Registry issues
  – Different users/uses
    • Recruitment of participants
    • Scientific integrity
• Results issues
  – New data may be added
  – Existing data may be improved, changed or deleted

108

Data Element “Permanence”

• Changes Rarely
  – Sponsor
  – Unique Protocol ID
  – Study Start Date
  – Condition & Intervention
  – Study Design
• May Change
  – Outcome Measures
• Expected to Change
  – Recruitment Status, Overall and by Location
  – Enrollment (expected → actual)
  – Primary Completion Date (expected → actual)

Factors leading to changes in data elements

• Must keep data up to date, including location and recruitment status
• Evolving policies, database structure
• QA requests for changes
• Correction of errors
• Added specificity
• On the other hand:
  – Requirement to report outcome measures
  – Requirement re: completion date

110

Example: Change in Recruiting Status

History of NCT00239681(The JUPITER Trial)

• 11-13-08: Changed Overall Status from “Active, not yet completed” to “Terminated”

• 11-16-08: Changed Overall Status from “Terminated” to “Completed”

• 02-03-09: Changed Overall Status from “Completed” to “Terminated”

Last Update

Tracking Outcome Measures

• How to distinguish between additions, deletions, and modifications to primary and secondary outcome measures?

• How to handle changes that transform one general measure to several more specific measures?

Challenge of Tracking Outcome Measures

ORIGINAL SUBMISSION                UPDATED SUBMISSION
Blood Pressure (BP) at 3 mos.      Systolic BP (SBP) at 3 mos.; Diastolic BP (DBP) at 3 mos.; Change in SBP at 3 mos. to baseline; Change in DBP at 3 mos. to baseline; Percent Change in SBP…; Percent Change in DBP…; etc.
Blood Pressure (BP) at 3 mos.      Diastolic BP (DBP) at 12 weeks; DBP at 9 mos.; DBP at 12 mos.
Blood Pressure (BP) at 3 mos.      Quality of Life at 6 mos.

Example from Current Record

BEFORE
• Effectiveness of the SAGB post-operatively (for the follow-up period)
• Rate of device-related adverse events and SAGB malfunctions in subjects implanted with the SAGB throughout the post-operative follow-up period
• Changes in excess body weight throughout post-operative follow-up period
• Change in body mass index (BMI) throughout the post-operative follow-up period
• Absolute weight loss and percent change in absolute weight throughout the post-operative follow-up period
• Change in Quality of Life measures throughout the post-operative follow-up period
• Changes in specific laboratory test values throughout the post-operative follow-up period
• Rate of all adverse events in subjects implanted with the SAGB throughout the post-operative follow-up period

AFTER
• Percent Excess Weight Loss
• Percent of Adverse Events With Swedish Adjustable Gastric Band (SAGB)
• Changes in Quality of Life (QOL) Measures
• Changes in Excess Body Weight (EBW)
• Changes in Total Cholesterol
• Changes in Body Mass Index (BMI)
• Change in Absolute Weight
• Number of All Adverse Events of Subjects Implanted With the SAGB

http://clinicaltrials.gov/archive/NCT00166205/2008_12_10/changes


Other Issues Related to Outcome Measures

• What is a “primary” outcome measure?
  – Can there be 5? 10?
• What is a “secondary” outcome measure?
  – Can there be 20? 50?

• Where does “outcome measure” end and “analytic plan” begin?

• Can the same display be used for the lay public (recruitment) and to ensure scientific integrity?

119

Proposed Change

• Add a “box” in tabular view that highlights key facts about the record

• For each data element, indicate whether or not it’s been changed

• Provide link to “history of changes” at each data element

• Make it easier to access your preferred view
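The per-element “changed / unchanged” indicator proposed above could be derived by comparing the current record with an archived version. A minimal sketch with a hypothetical record layout:

```python
# Sketch: element-by-element change detection between two versions of a record.
def changed_elements(previous: dict, current: dict) -> dict:
    """Map each data element to True if its value changed between versions."""
    keys = set(previous) | set(current)
    return {k: previous.get(k) != current.get(k) for k in sorted(keys)}


v1 = {"Overall Status": "Active, not yet completed", "Primary Outcome Measures": "BP at 3 mos."}
v2 = {"Overall Status": "Terminated", "Primary Outcome Measures": "BP at 3 mos."}
for element, changed in changed_elements(v1, v2).items():
    print(f"{element}: {'changed' if changed else 'no change'}")
```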

Mock-Up: Summary Tracking Information

First Registered:                October 14, 2005
Study Start Date:                February 2, 2003
First Posted Outcome Measures:   October 14, 2005
Outcome Measures (First Posted):
  • Investigate whether long-term treatment with rosuvastatin compared with placebo will decrease the rate of major cardiovascular events
  • Investigate the safety of long-term treatment with rosuvastatin compared with placebo through comparisons of events
  • Investigate whether therapy with rosuvastatin reduces the incidence of diabetes mellitus, venous thromboembolic events, & the incidence of bone fractures
Last Changed Outcome Measures:   No changes since initial registration

Click here to see full history of the record

Bank of Scales

• Create a database of standard outcome measures, scales and possibly interventions
• Allow data providers to select from “bank” when appropriate (e.g., Hamilton Rating Scale for Depression)
• Option to add new scales
  – Data provider specific “bank”
  – Suggestions for additions to “general bank”
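A scale bank could be as simple as a lookup keyed by scale name. The sketch below is hypothetical; the Beck Depression Inventory II entry paraphrases the description quoted on a later slide.

```python
# Sketch: a "bank of scales" as a lookup keyed by scale name (illustrative only).
SCALE_BANK = {
    "Beck Depression Inventory II": {
        "construct": "severity of symptoms of depression",
        "items": 21,
        "range": (0, 63),
        "direction": "higher scores reflect greater severity",
    },
}


def describe_scale(name: str) -> str:
    """Return a standard description if the scale is in the bank, else a prompt to the provider."""
    info = SCALE_BANK.get(name)
    if info is None:
        return f"{name}: not in bank; data provider must supply full scale information"
    low, high = info["range"]
    return (f"{name}: {info['items']}-item scale of {info['construct']}, "
            f"total score {low}-{high}, {info['direction']}")


print(describe_scale("Beck Depression Inventory II"))
print(describe_scale("GOG Performance Status"))
```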

125

BEFORE Revision (Public View)

Baseline Measures                        Investigational Drug X
GOG Performance Status [units: Score]
  0                                               48
  1                                               27
  2                                                4

• Need information about this scale: Full Name, Construct/domain, Range and directionality
• Need information about these values (e.g., is “0” better or worse than “2”?)
• Units need to change to “participants” – data represent “number of participants” with a particular score
• Are these the only possible scores?

Scale Information

• Beck Depression Inventory II (Beck, Steer & Brown, 1996) [NCT00056316]
  – “21-item self-report instrument to assess severity of symptoms of depression. There is a four-point scale for each item ranging from 0 to 3. The total continuous score can range from 0 to 63 points, with higher scores reflective of greater severity.”
• Source of information
  – Always data provider?
  – Other authoritative source(s)?

Possible Links

• FDA information
• Publications (use NCT #s)
• Systematic reviews
  – Same condition
  – Same intervention (or intervention class)
  – How to choose?
• MedlinePlus?
• How to read a study?

Key Issues in Expansion of Results Database

• Expand results reporting to trials of unapproved products?
• Include narrative summaries? Can it be done without being promotional and misleading?
  – Technical
  – Lay Language
• Data Quality Verification
  – Process (e.g., Pilot Quality Control Project)
  – External Sources
• Full protocol versus extract “necessary to help evaluate the results”

Issues (cont)

• Results Submission
  – New Submission Deadline: up to 18 months?
• Other
  – Implementation of adverse event requirements
    • Threshold for non-serious?
  – Definition of applicable clinical trial

Public Meeting

• “Expansion” of results database

• April 20, 2009

• NIH campus

• Please come!

130

131

Pilot Quality Control Study

• 12/243 results records appear to have one or more journal publications

• 19/43 posted results summaries have some entry at drugs@FDA

• What to use as reference standard for this study?

Applicable Clinical Trials?

• Components involve devices and/or drugs
  – ImageChecker DMax computer-aided detection system, version 8.1 (Hologic/R2 Technology)

Gilbert FJ et al. N Engl J Med 2008;359:1675-84.

135

Additional Information

• Email LISTSERV and other FDAAA information:
  – http://prsinfo.clinicaltrials.gov/fdaaa.html
• Other general information:
  – http://prsinfo.clinicaltrials.gov
• Questions?
  – register@clinicaltrials.gov

136

Finding Results at ClinicalTrials.gov

• From Homepage
  – Go to “Search for Clinical Trials”
  – Select “Advanced Search”
  – Select “Studies with Results” from the menu for the Study Results field
  – Select study record from results list
  – Click “Study Results” tab

137

138

http://prsinfo.clinicaltrials.gov/fdaaa.html

139

140

141

142

FDAAA and Prior Publishing

143

ICMJE

“…will not consider results posted in the same primary clinical trials register in which the initial registration resides as previous publication if the results are presented in the form of a brief, structured (<500 words) abstract or table.”

144

BMJ

“Firstly, disclosure will be a legal requirement, so there is nothing editors can do about it if they still want to publish important trials of drugs and devices. Moreover, journals will continue to add value by publishing useful and readable trial reports that clinicians, the media, and patients can interpret and use. And, most importantly, the results disclosed for the FDA will not have been externally peer reviewed and will be preliminary. Peer review not only provides a stamp of quality assurance, it often leads to reanalysis of results.”

145

PLoS

“supports the public disclosure of all clinical trial results, as mandated for example by the FDA Amendments Act, 2007. Prior disclosure of results on a public website such as clinicaltrials.gov will not affect the decision to peer review or acceptance of papers in PLoS journals.”