Transcript of Participant-Level Data Collection - PEP-C
Participant-Level Data Collection (Questionnaires & Dosage Forms)
Overview for CSAP’s MAI Grantees
Presenter: Nilufer Isvan
MAI Grantee Training
PEP-C
First Things First…
• Audio
– Your computer’s speakers or
– Dial: 866-503-4560; Participant Code: 1031268581
• Webinar is being recorded, use listen-only mode
• Participant Questions
– Use the Q & A box on your screen at any time
• Download PowerPoint Presentation
– Use link provided on your screen
• Technical Issues
– Use Q & A box to reach our TA Team
Training Objectives
By the end of this webinar, participants will be…
• Acquainted with the new evaluation contract and its SAMHSA management team
• Familiar with PEP-C’s MAI evaluation plan
• Familiar with SAMHSA/CSAP’s MAI GPRA measures
• Able to administer the cross-site instruments
– Youth and Adult Questionnaires
– Individual and Group Dosage Forms
• Able to implement the cross-site data collection protocol
• Able to identify and avoid common obstacles to data quality
Agenda
• Introductory remarks from SAMHSA
– PEP-C and the Cross-Site Team
– Importance of participant-level data for SAMHSA
– MAI GPRA measures
• The cross-site evaluation logic model
• Collecting participant data with CSAP’s standard instruments
• Common data quality issues to avoid
• Online data systems under development
• Submitting data during system transition
Introducing the SAMHSA Team
Contracting Officer’s Representative (COR) in charge of PEP-C: Sara Azimi-Bolourian, Ph.D., Public Health Analyst, SAMHSA/CSAP
Alternate Contracting Officer’s Representative (ACOR) for PEP-C: Thomas Clarke, Ph.D., Social Science Analyst, SAMHSA/CSAP
Program Evaluation for Prevention Contract (PEP-C) Tasks
SAMHSA/CSAP’s evaluation contract
National cross-site evaluations of four SAMHSA/CSAP grant initiatives:
• Partnerships for Success (PFS)
• Strategic Prevention Framework State Incentive Grants (SPF SIG)
• Minority AIDS Initiative (MAI)
• STOP Act Grants (one-year retrospective study, completed)
Program Evaluation for Prevention Contract (PEP-C) Leadership
Project Director: Phillip W. Graham, Dr.P.H.
– Director of the Drugs, Violence, and Delinquency Prevention Research Program in the Center for Justice, Safety, and Resilience, RTI International
Deputy Project Director: Elvira Elek, Ph.D. (Social Psychology)
– Research Public Health Analyst, RTI International
PEP-C’s MAI National Cross-Site Evaluation Team
Name & Degree | Role on the Team | Affiliation
Nilufer Isvan, Ph.D. in Sociology | Team Co-Lead | HSRI
Mindy Herman Stahl, Ph.D. in Human Development | Team Co-Lead | RTI International
Darigg Brown, Ph.D. in Biobehavioral Health | HIV/AIDS Prevention Evaluation Specialist | RTI International
Melissa Burnett, B.A. in Psychology | Research Analyst, TTA Liaison | HSRI
Leena Elsadek, B.A. in Global Health and Anthropology | Research Analyst, TTA Support | RTI International
Rachael Gerber, M.P.H. | Research Analyst, Data Manager | HSRI
Lisa Lundquist, M.A. in Criminal Justice | Research Analyst, Data Manager | HSRI
Federal Reporting Requirements
Government Performance and Results Act (GPRA)
• Originally enacted in 1993
• Currently: GPRA Modernization Act of 2010
• Federally funded programs required to report performance measures & meet targets
• Each year’s performance is used to justify budget requests for the following year
SAMHSA/CSAP’s MAI GPRA Measures
Measure | Target | Actual (FY 2013)
Number of program participants exposed to substance abuse prevention education services (Output) | 5,734 | 6,437
Percent of program participants who rate the risk of harm from substance abuse as great (all ages) (Outcome) | 88.0% | 96.2%
Percent of program participants who report no use of alcohol at pre-test who remain non-users at post-test (all ages) (Outcome) | 91.2% | 89.2%
Percent of participants who report no illicit drug use at pre-test who remain non-users at post-test (all ages) (Outcome) | 92.6% | 93.9%
Number of persons tested for HIV through the Minority AIDS Initiative prevention activities (Outcome) | Baseline | 36,707
Source: CSAP Accountability Report, Volume XII, FY 2013.
MAI Cross-Site Evaluation Logic Model
Inputs
• MAI funding
• TTA

Activities
• Needs assessment
• Capacity building
• Strategic planning
• Implementation
– # Direct prevention
– # Environmental strategies
– # HIV/HCV testing
– # Referrals and service linkages
• Evaluation

Outputs (Participants)
• # Served, by demographics
• # Trained in SA, HIV, HCV prevention
• # Tested for HIV/HCV, # positive, # counseled, and # linked to care
• # Tested for the first time
• # With knowledge of test results

Proximal Outcomes (Individual Level)
• Knowledge
• Attitudes
• Risk perceptions
• Self-efficacy
• Intentions
• Social norms
• Awareness of and access to health services

Intermediate Outcomes (Individual Level)
• Any alcohol use
• Binge/heavy drinking
• Any illicit drug use
• Unprotected sex
• Sex while drunk or high

Distal Outcomes (Environmental Level)
• Changes in community social norms and attitudes around alcohol use and risky sexual behaviors associated with HIV/HCV transmission

Individual-Level Moderators
• Sociodemographics
• Victimization
• Discrimination
• Mental health
• Criminal justice involvement
• Social support

Grantee-Environment-Level Moderators
• Fidelity
• Baseline prevalence of HIV/HCV/STIs
• Baseline social/economic characteristics
MAI Participant-Level Instruments
The National Minority SA/HIV Prevention Initiative has four standard instruments
• Youth Questionnaire
• Adult Questionnaire
• Group Dosage Form
• Individual Dosage Form
Recent Instrument Revisions
• Recent additions to questionnaires (cleared by OMB)
– Questions on military status
– Gender-specific definitions of binge drinking
• Future versions of questionnaires (pending OMB clearance)
– Youth and adult questionnaires shortened by 25%
– Deleted/consolidated questions not crucial for evaluation and accountability
– Added questions on newly emerging national priorities
Which Participants are Included in this Data Collection Protocol?
Grantees are required to collect these data from program participants receiving funded direct service interventions.
Key Concept – Direct Prevention Service
• Delivered in direct interaction with participants
• Can be either one-on-one (individual) or group format.
Examples – Direct Services
• HIV or substance abuse prevention education classes
• Motivational interviewing
• Problem identification, referral, and case management services
• One-on-one or group counseling
• Refusal skills training
• HIV testing
When Not to Administer the Instruments
Participant-level data collection protocol does not apply to:
• Individuals contacted through community outreach or other recruitment efforts only
• Individuals who only receive testing services
Data Collection from “Testing-Only” Participants
• HHS requires testing and related data to be reported in the aggregate
• Keep careful records of
– demographic characteristics
– homeless status
– whether or not first-time tested
– test results (if available)
– whether referred out or tested by grantee
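Because these data are reported only in the aggregate, a grantee's careful records ultimately boil down to tallies. The sketch below illustrates one way to compute them in Python; the field names (`homeless`, `first_time`, `referred_out`, `result`) are illustrative assumptions, not CSAP's official reporting variables.

```python
from collections import Counter

def aggregate_testing_records(participants):
    """Tally testing-only participant records for aggregate reporting.

    Each record is a dict; all field names here are illustrative,
    not official CSAP reporting variables.
    """
    return {
        "total_tested": len(participants),
        "homeless": sum(1 for p in participants if p.get("homeless")),
        "first_time_tested": sum(1 for p in participants if p.get("first_time")),
        "referred_out": sum(1 for p in participants if p.get("referred_out")),
        "results": Counter(p.get("result", "unknown") for p in participants),
    }
```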
Record Management Section
• Included in all instruments (questionnaires and dosage forms)
• Filled out ONLY by a staff member with access to the necessary information
• Must be completed prior to administering questionnaires to participants
Record Management Fields
• Grant ID
• Unique Participant ID
• Interview type
• Interview date
• Intervention(s) received
• Service duration
• Intervention vs. comparison group
Questionnaire Sections
• SECTION 1: Facts About You
– Demographic and socioeconomic information
– Output measures (people served)
– Disparities in outcomes
• SECTION 2: Attitudes & Knowledge
– HIV knowledge, perception of risk, self-efficacy
– Proximal outcomes (expected to change soon after the program)
• SECTION 3: Behavior & Relationships
– Substance use, risky sexual behaviors, emotional support
– May take some time to change (at least 30 days)
Participant Burden Reduction
Participants with shorter service duration receive:
• Fewer questions on the questionnaire
• Fewer survey administrations
Key Concept – Service Duration
• Length of time between the first and last direct service encounters with the participant.
• Divided into three categories
– “Single Session” (does not exceed a single day)
– “Multiple Session Brief” (2-29 days)
– “Multiple Session Long” (30+ days)
Which Questionnaire Sections to Administer and When
Participant’s Service Duration Category | Questionnaire Sections to Administer | Data Collection Time Points
“Single Session” (no longer than a single day) | All of Section One and 3 to 5 relevant questions selected from Section Two | Exit only
“Multiple Session Brief” (2-29 days) | All of Sections One and Two | Baseline and exit
“Multiple Session Long” (30+ days) | Entire questionnaire (Sections One, Two, and Three) | Baseline, exit, and 3-6-month post-exit follow-up
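The mapping above can be sketched in code. This is an illustrative sketch only, assuming the day count runs inclusively from the first to the last service encounter; it is not an official CSAP algorithm.

```python
from datetime import date

def duration_category(first_service: date, last_service: date) -> str:
    """Classify a participant's service duration into the three
    protocol categories, assuming an inclusive day count."""
    days = (last_service - first_service).days + 1
    if days <= 1:
        return "Single Session"
    if days <= 29:
        return "Multiple Session Brief"
    return "Multiple Session Long"

# Sections and time points associated with each category (from the table above)
PROTOCOL = {
    "Single Session": (["Section 1", "3-5 items from Section 2"], ["Exit"]),
    "Multiple Session Brief": (["Sections 1-2"], ["Baseline", "Exit"]),
    "Multiple Session Long": (["Sections 1-3"], ["Baseline", "Exit", "Follow-up"]),
}
```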
Terminology Alert!
• Enter participant’s service duration into field labeled “Intervention Duration.”
• Enter name(s) {or code(s)} of interventions received by participant into the “Intervention Name” fields
• Can enter up to three intervention names {or codes}
Grant Identification Number
• The Grant ID field is in the upper left-hand corner of the Record Management section
• Each grantee will use its assigned grant identification number provided by CSAP
If this number is missing or inaccurate, the data record cannot be processed or used in the evaluation
Study Design Group
• Intervention Group: group receiving services
• Comparison Group: group NOT receiving any services
• Comparison groups are NOT required by CSAP. Select “Intervention” for all of your participants if you are not using a comparison group.
Records with missing Study Design Group cannot be used in the evaluation.
Participant Identification Number
• A unique number should be assigned to each program participant by qualified staff.
• The same Participant ID number will be used for ALL records associated with the participant (all survey and dosage data).
Multiple records sharing identical Participant ID, Survey Administration Date, and Interview Type (e.g., Baseline, Exit, Follow-up) will be flagged and may be eliminated from analysis, as this suggests duplicated records or the same Participant ID assigned to more than one participant.
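The duplicate check described above can be sketched as follows. This is a hypothetical illustration; the field names are chosen for readability rather than taken from CSAP's templates.

```python
from collections import Counter

def flag_duplicates(records):
    """Return records whose (Participant ID, Interview Type, Survey Date)
    combination appears more than once -- these would be flagged for review."""
    key = lambda r: (r["participant_id"], r["interview_type"], r["survey_date"])
    counts = Counter(key(r) for r in records)
    return [r for r in records if counts[key(r)] > 1]
```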
Date of Survey Administration
• Enter the 2-digit month, 2-digit day, and 4-digit year.
• This should be the date the questionnaire was administered, not the date the data were entered.
Records with missing, incomplete, or inaccurate administration dates cannot be used in the evaluation.
If administration dates are out of order (e.g., the exit interview date precedes the baseline interview date), neither record can be used in the evaluation.
Interview Type
• Baseline: first data collection point; administered no more than 30 days before program exposure
• Exit: second data collection point; administered up to 10 days after the final service encounter with the participant. If services lasted a single day (“single session”), administer at the end of the day.
• Follow-up: three to six months after program exit
If not accurately filled out, a participant may appear to have two interviews at the same time point, and it may not be possible to use the data in the evaluation.
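These timing windows can be expressed as a simple validity check. The sketch below approximates "three to six months" as 90-180 days and uses the final service encounter as the exit reference point; both are assumptions made for illustration, not CSAP rules.

```python
from datetime import date, timedelta

def interview_in_window(interview_type: str, interview_date: date,
                        first_service: date, last_service: date) -> bool:
    """Check whether an interview date falls in its allowed window."""
    if interview_type == "Baseline":
        # No more than 30 days before the first service encounter
        return timedelta(0) <= first_service - interview_date <= timedelta(days=30)
    if interview_type == "Exit":
        # Up to 10 days after the final service encounter
        return timedelta(0) <= interview_date - last_service <= timedelta(days=10)
    if interview_type == "Follow-up":
        # Three to six months after exit (approximated here as 90-180 days)
        return timedelta(days=90) <= interview_date - last_service <= timedelta(days=180)
    return False
```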
Terminology Reminder!
In order to avoid confusion, let’s keep in mind:
On the Adult and Youth Questionnaires, the term “Intervention Duration” is used to refer to the total duration of services for the participant and not the specific EBP(s) that the participant is receiving.
Service Duration
• Single Session “Intervention”
– Total service duration does not exceed a single day
– Section 1 of the questionnaire
– 3 to 5 questions from Section 2
• Multiple Session Brief “Intervention”
– Total service duration between 2 and 29 days
– Sections 1 & 2 of the questionnaire
• Multiple Session Long “Intervention”
– Total service duration 30 days or longer
– Sections 1, 2 & 3 of the questionnaire
Service duration will be used to select the appropriate outcome variables for the participant. If incorrectly assigned, the data may not be included in the relevant outcome analyses.
Frequently Asked Question
Before services begin, we may not know how long the participant will stay in our program.
How do we fill out “Intervention Duration” in the baseline survey?
Response: “Informed Guess”
• Enter “best guess” at baseline
• Enter actual duration of services at exit
• Evaluation team will make the necessary correction before analysis
Intervention Name(s)
Workaround during system transition:
• Go to the PEP-C MAI Knowledge Base & select “Youth & Adult Questionnaires”
• Download document named “MAI Intervention Names”
• Look up and enter the code assigned to your intervention
• If your intervention is not on the list, write in the name of your intervention
Sample Scenario
An individual participated in Voices/Voces one day; then 10 days later, she enrolled in Protocol-Based HIV Counseling and Testing that lasted 15 days.
• “Intervention Duration” is 1+10+15=26 days – select the “Multiple Session Brief Intervention” option.
• Enter the code for Voices/Voces (IPN060) as “Intervention Name 1” and Protocol-Based HIV Counseling and Testing (IPN076) as “Intervention Name 2.”
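The scenario's arithmetic can be checked in code. This is a sketch under the assumption that total duration is counted inclusively from the first to the last encounter date; the specific dates below are hypothetical placements consistent with the scenario's 26-day total.

```python
from datetime import date

def total_service_days(encounter_dates):
    """Inclusive day span from the first to the last service encounter."""
    return (max(encounter_dates) - min(encounter_dates)).days + 1

# Hypothetical dates: Voices/Voces on Jan 1; the HIV counseling and testing
# protocol begins 10 days later and runs so the final encounter is Jan 26.
days = total_service_days([date(2010, 1, 1), date(2010, 1, 11), date(2010, 1, 26)])
# 26 days falls in the 2-29 day "Multiple Session Brief" range
```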
Recap: Common Record Management Issues to Avoid (1)
Data records with the following issues cannot be used in the evaluation analysis:
– Design Group
Missing or inconsistent across time
– Participant ID
Missing, incomplete, or the same ID assigned to multiple participants
– Grant ID
Missing, incomplete, or inconsistent format
Recap: Common Record Management Issues to Avoid (2)
– Survey Administration Date
The day, month, or year entered is missing or invalid (e.g., a future date)
– Interview Type
Missing or incorrectly filled out
– Out-of-range values
E.g., the valid response range is 0-30, but the value entered is 40
– Combination of Items
Multiple records share the same Participant ID, Interview Type, and Survey Administration Date
Other Common Data Issues Leading to Data Loss
Missing Records:
• The participant is missing the baseline, exit, or follow-up questionnaire record.
• Be sure to submit all records and administer questionnaires at the required times.
Age Too Young:
• The respondent is younger than 12.
• The Youth Questionnaire is validated for ages 12 and older.
• Youth under 12 will be included in the number of people served but will not be included in outcome analyses.
• Administering only Section 1 is an appropriate option.
Preparing the Questionnaire
• Familiarize survey administration staff with the questionnaires and Administration Guides
• Determine the total duration of funded services that the participant is intended to receive
• Prepare appropriate section(s) of appropriate questionnaire (youth or adult)
• Determine timing of Baseline, Exit, and Follow-up data collection
• Complete the record management section before giving to participants
Administering the Questionnaire
• Who administers the questionnaires?
– Qualified staff familiar with the instruments and trained in survey administration
– Service providers should not administer questionnaires
• Choose a space that provides sufficient ventilation, lighting, and privacy
• Budget an appropriate amount of time to complete the questionnaire
– Single-day services: ~5 minutes
– Services lasting 2-29 days: ~30 minutes
– Services lasting 30+ days: ~50 minutes
• Allow extra time for administrative issues (reading instructions, etc.)
• Provide services or referrals in the event the questionnaire items about personal issues such as partner abuse cause emotional distress among participants
Single Session Data Collection
• The participant engages in the program for a single day and is not expected to return for any additional services
• Section One of the survey, as well as 3-5 items from Section Two
• Section Two items selected on the basis of targeted risk/protective factor(s)
Timing of the Single-Session Survey
• A baseline survey is not required
• Administer survey once, immediately after the end of services
• This is an Exit survey
Administering a Single Session Questionnaire
Prior to questionnaire administration:
1. Complete the Record Management Section
• Intervention Duration = Single Session Intervention
• Interview Type = Exit
2. Highlight or circle 3 to 5 items from Section Two and explain that participants should only answer the identified questions
3. Tear off and discard Section Three of questionnaire
Dosage Forms
• Used to record the type and duration of direct contact with participants
• Two different service delivery formats:
– Group (more than one participant receiving service during the encounter)
– Individual (one-on-one service delivery)
• Service codes are provided at the end of the dosage form
• A dosage record should be submitted for every service encounter with a participant
• Record management fields:
• Encounter Date
• Grant ID
• Administration Format
• Participant ID Number(s)
• Data fields:
• Service Code(s)
• Duration Code(s) (in minutes, rounded up to the nearest 5-min. interval)
• Dosage data should not be collected from comparison or control group participants
• Dosage records with missing or invalid record management information may be excluded from the evaluation
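Rounding a duration up to the nearest 5-minute interval, as the Duration Code field requires, can be done with integer arithmetic; a minimal sketch:

```python
def duration_code(minutes: int) -> int:
    """Round a service duration up to the closest 5-minute interval."""
    return -(-minutes // 5) * 5  # ceiling division, then back to minutes
```

For example, an 88-minute session is coded as 90 minutes, and a 124-minute session as 125.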
Individual Dosage Form
Example:
On March 22, 2010, Marie (ID#65471) participated in a behavioral health intervention. It included an individual HIV education session and an HIV testing counseling session at the Jones Health Center (Grant ID SP00009). The education session lasted 88 minutes and the HIV testing counseling lasted 15 minutes. She decided not to stay for testing on that particular day but may come back.
Completed form (key entries):
• Encounter Date: 03 / 22 / 10; Grant ID: SP00009; Grp. Typ.: 1; Adm. Frmt.: 1; Participant ID #: 65471
• Service #1 (individual HIV education session): circle the appropriate individual service code (01-15); Duration Code 90 (88 minutes, rounded up to the closest 5-minute interval)
• Service #2 (HIV testing counseling session): circle the appropriate individual service code (01-15); Duration Code 15
Group Dosage Form
Example:
Cityside Prevention Center (Grant ID SP00017) holds a 2-hour group counseling session for adults once a week. On April 12, 2010, 12 people attended. The session went over the normal 2 hour time slot by 4 minutes.
Completed form (key entries):
• Encounter Date: 04 / 12 / 10; Grant ID: SP00017; Grp. Type: 1; Adm. Frmt.: 2
• Group Service #1 (group counseling): circle the appropriate group service code (11-25); Duration Code 125 (124 minutes, rounded up to the closest 5-minute interval)
• Participant ID Numbers: all 12 attendees' IDs (05472, 13634, 14765, 52175, 52138, 68483, 25734, 21766, 13691, 32865, 94576, 62538)
Data Entry
Completed questionnaires and dosage forms can be entered into digital format for analysis in one of two ways:
• PEP-C’s online data entry system (under development)
• Standard templates and coding manuals (ready for immediate use)
Data Submission During System Transition Period
Data submission instructions have already been sent to grantees:
• Access codebooks, templates, and intervention codes from Knowledge Base
• Follow indicated file naming conventions
• E-mail data to the PEP-C MAI Technical Assistance Team
• Ensure that the required information is provided in the transmittal e-mail
How to Access the Instruments and Supporting Materials
PEP-C MAI Knowledge Base website:
https://pep-c.rti.org/HERO/KB/PEP-C-MAI-KB/MAI-KB.htm
Satisfaction Survey
• A link to a brief survey will be e-mailed to all participants
• Please take a moment to respond!
• We diligently evaluate ourselves and use your feedback to improve future trainings
Future Questions and TA Requests
E-mail: [email protected]
Phone: 866-558-0724
Your question will be triaged to a Cross-Site Team member best qualified to respond
Of course, you can always contact your SAMHSA Project Officer with your questions and TA requests