
Ann. occup. Hyg., Vol. 45, No. 1001, pp. S43–S47, 2001. © 2001 British Occupational Hygiene Society.
Published by Elsevier Science Ltd. All rights reserved. Printed in Great Britain.
0003–4878/01/$20.00
PII: S0003-4878(00)00098-3

Probabilistic Exposure Assessment of Operator and Residential Exposure; A Canadian Regulatory Perspective

MARY MITCHELL* and CATHY CAMPBELL

Pest Management Regulatory Agency, Health Canada, Sir Charles Tupper Building, 2270 Riverside Drive, Ottawa, Canada K1A 0K9

An overview of the considerations central to the selection of probabilistic versus deterministic approaches to the assessment of operator and residential exposure is provided. From a regulatory perspective, the decision to use probabilistic over deterministic assessments should include consideration of factors such as the nature of the populations being assessed, including the expected duration and frequency of their exposures, as well as an understanding of the toxicity endpoints to which the exposure assessment will be linked during risk assessment. In situations where there is an identifiable need to characterize variability and uncertainty and/or to quantify the exposure that will represent most of the exposed population, and where there are adequate data to characterize input parameters, probabilistic assessments may be appropriate. Issues with respect to probabilistic assessments for which detailed, harmonized guidance is required are outlined. These issues are discussed within the context of a tiered approach to exposure and risk assessment. © 2001 British Occupational Hygiene Society. Published by Elsevier Science Ltd. All rights reserved

Keywords: probabilistic assessment; regulatory perspective; pesticide exposure

INTRODUCTION

All regulatory decisions must be accurate, transparent and consistent. In Canada, management of submission policy puts the onus on the registrant to ensure submission of a data package that is adequate for a regulatory decision. The regulatory confidence in an exposure or risk assessment, whether deterministic or probabilistic, is dependent on the quality of the data and on the scientific strength of the assumptions on which it is based. Consideration of inherent variability and uncertainty is critical. Canada’s Pest Management Regulatory Agency (PMRA) agrees with the US-EPA’s policy position on probabilistic exposure assessment:

probabilistic analysis techniques (such) as Monte Carlo analysis, given adequate supporting data and credible assumptions, can be viable statistical tools for analysing variability and uncertainty in risk assessments. (US-EPA, 1997)

Received 17 August 2000; in final form 30 November 2000.
*Author to whom correspondence should be addressed. Tel.: +1-613-736-3471; Fax: +1-613-736-3489; E-mail: Mary [email protected]

EPA has identified a set of key conditions for probabilistic risk assessments (US-EPA, 1997, 1998, 2000) which the PMRA considers to be a sound basis upon which to build a harmonized regulatory framework.

The objective of this paper is to present a regulatory perspective through a constructive critique of the paper presented by Curt Lunchick (Lunchick, 2001). The focus is probabilistic assessment for specific use scenarios. All of the issues addressed would also apply to aggregate probabilistic assessments; however, the more complex issues associated with the timing and probability of occurrence of exposures will have to be considered at a later time as the guidance for probabilistic assessment evolves.

METRIC SELECTION

Whether a deterministic or probabilistic approach is used to generate an exposure estimate, selection of appropriate values for input parameters and from the output distribution is critical. In this paper, we refer to the chosen value for a parameter as a ‘metric’ (i.e., input metric or output metric).


It has often been stated (e.g., Lunchick, 2001) that a ‘traditional point estimate’ will usually lie at the upper end of a probabilistic output distribution. We agree that if multiple conservative assumptions are used in a deterministic assessment, the resulting estimate will certainly reflect this compounding conservatism. That is why it is important to endeavour to determine the correct metric for each parameter in an exposure algorithm, based on as robust a data set as possible and an understanding of underlying distributions and associated variability and uncertainty. If accurate estimates of central tendency are used for all the input parameters in a deterministic model, the result should be near the median of the output of a probabilistic distribution.
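This compounding effect can be illustrated with a small Monte Carlo sketch. All inputs below are hypothetical lognormal parameters invented for illustration, not regulatory data: a deterministic estimate built from median inputs lands near the median of the simulated output, while one built by compounding 95th-percentile inputs lands far above the 95th percentile of the output.

```python
import bisect
import math
import random

random.seed(1)

# Hypothetical lognormal inputs (mu, sigma of the log) for a simple
# multiplicative exposure algorithm: exposure = unit_exposure * area * rate.
params = [(0.0, 0.5), (3.0, 0.4), (-1.0, 0.3)]

n = 50_000
sims = sorted(
    math.prod(random.lognormvariate(mu, s) for mu, s in params)
    for _ in range(n)
)

# Deterministic estimate from central-tendency (median) inputs.
det_median = math.prod(math.exp(mu) for mu, _ in params)

# Deterministic estimate compounding 95th-percentile values for every input.
det_p95 = math.prod(math.exp(mu + 1.645 * s) for mu, s in params)

def output_rank(x):
    """Fraction of the simulated output distribution lying below x."""
    return bisect.bisect_left(sims, x) / n

print(f"median inputs -> percentile {output_rank(det_median):.2f} of output")
print(f"p95 inputs    -> percentile {output_rank(det_p95):.3f} of output")
```

With these invented inputs, the median-based estimate sits near the 50th percentile of the output, while the compounded estimate exceeds the 99th, which is the sense in which multiple conservative assumptions overshoot.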

For probabilistic assessments, a key consideration is whether the variability and uncertainty of the input parameters are adequately described in the model, as the description of variability and uncertainty is the primary benefit of this type of analysis. Ideally, data generation should be structured to provide enough information to fully characterize the distribution, including its tails. Differing approaches to characterizing the tails of the input distributions can potentially have significant effects on the output distribution, particularly when the higher percentiles of the output distribution are of interest in the risk assessment.
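This tail sensitivity can be sketched numerically (the lognormal inputs and the truncation point are invented for illustration): clipping the dominant input near its 95th percentile barely moves the output median but noticeably lowers the output 99th percentile.

```python
import random

random.seed(2)

n = 100_000

# Two hypothetical lognormal inputs; the first dominates the variability.
pairs = [(random.lognormvariate(0.0, 0.8), random.lognormvariate(0.0, 0.3))
         for _ in range(n)]

full = sorted(x * y for x, y in pairs)
# Same draws, but the dominant input truncated near its 95th percentile (~3.7).
clipped = sorted(min(x, 3.7) * y for x, y in pairs)

def pctl(sorted_samples, p):
    """Empirical percentile of an already-sorted sample."""
    return sorted_samples[int(p * len(sorted_samples))]

print(f"median: {pctl(full, 0.50):.2f} vs {pctl(clipped, 0.50):.2f}")
print(f"P99:    {pctl(full, 0.99):.2f} vs {pctl(clipped, 0.99):.2f}")
```

Using the same random draws for both runs makes the comparison a pure effect of the tail treatment rather than of sampling noise.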

Knowledge of the variability and uncertainty associated with the input distributions may also have an impact on the selection of the output metric. Variability is an inherent factor that must be addressed in any exposure/risk assessment (e.g., through selection of a higher percentile from the output distribution, or possibly reduced through risk mitigation measures), while uncertainty can usually be reduced only by additional data or through better conceptualization of the exposure scenario.

Selection of an appropriate metric from the output distribution also requires an understanding of the nature of the populations being assessed, including the expected duration and frequency of their exposures, as well as an understanding of the toxicity endpoints that the exposure assessment will be linked to during risk assessment.

Recent consultations on central tendency selection for the Pesticide Handler Exposure Database (PHED) resulted in general guidance (Ginevan, 1999, personal communication) that may have ramifications beyond PHED data. The concept is to focus on the logical basis for selection of a given metric using exposure duration and toxicological considerations. These considerations may be applied to selection of the appropriate probabilistic output metric as well as to deterministic assessments:

• The median is an appropriate measure of central tendency for short-term exposures:

  For exposure estimates based on short-term exposure durations, the median is an appropriate and highly robust measure of central tendency. For both lognormal and normal distributions, the mean (geometric or arithmetic, respectively) approximates the median. Using the median as a metric nullifies the debate over whether data are adequate to characterize distribution type.

• The arithmetic mean is an appropriate measure of central tendency for longer term exposures:

  The central limit theorem dictates that the average of a large enough number of exposure events will converge to the arithmetic mean of the original distribution from which the observations were drawn, and that the distribution of these averages will follow a normal distribution, regardless of the form of the original underlying distribution.

• A reasonably high end measure of exposure is appropriate for risk assessments addressing toxicants which have significant acute toxicity:

  In this situation, it is the peak rather than the central exposure that is of concern.
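The central-limit argument for longer term exposures can be sketched numerically. The daily exposure distribution below is a hypothetical lognormal chosen purely for illustration: averages over many exposure events cluster near the arithmetic mean of the daily distribution, well above its median, and are far less skewed.

```python
import math
import random
import statistics

random.seed(3)

# Hypothetical skewed daily exposure distribution: lognormal(mu=0, sigma=1).
# Its median is exp(0) = 1.0; its arithmetic mean is exp(0.5), about 1.65.
mu, sigma = 0.0, 1.0

# Long-term exposure metric: the average over 90 exposure events per season.
season_means = [
    statistics.fmean(random.lognormvariate(mu, sigma) for _ in range(90))
    for _ in range(5_000)
]

# Per the central limit theorem, the averages converge on the arithmetic
# mean, and their distribution is roughly normal despite the skewed input.
print(f"mean of season averages:   {statistics.fmean(season_means):.2f}")
print(f"median of season averages: {statistics.median(season_means):.2f}")
```

The near-agreement of the mean and median of the season averages, compared with the 1.65-to-1.0 gap in the daily distribution, is the skew reduction the theorem predicts.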

THE TIERED APPROACH — WHEN IS IT APPROPRIATE TO CONDUCT A PROBABILISTIC ASSESSMENT?

Existing tiered structures for exposure assessment (e.g., OECD, 1997) are driven by data availability. As regulators, we believe that exposure and risk assessments, whether deterministic or probabilistic, should be a ‘best estimate’ based on all available data.

An objective of a probabilistic assessment is not to generate a value that is lower than the comparable deterministic estimate. A probabilistic estimate that is significantly lower than a deterministic estimate, based on equivalent information, would suggest a flaw in one of the models. If central tendency estimates are used in a deterministic assessment, it should compare well with the median value of the probabilistic output. If, however, conservative estimates are used in the deterministic assessment, then the resulting output should compare well with the higher percentiles of the probabilistic output distribution. As such, a probabilistic assessment should not be represented per se as a higher step in a tiered approach to risk assessment.

In our experience, an exposure estimate can only be refined by using additional data. We agree with Lunchick (2001) that probabilistic methodology fits within the framework of a tiered approach to risk assessment, but propose a different hierarchy, primarily driven by data availability and by an identified need for probabilistic assessment:

Tier I: few data are available:
• default assumptions must be used to generate a conservative screening-level estimate.

Tier II: refined, product-specific information is available (e.g., use information, dermal absorption data, dislodgeable residue data):
• a deterministic or probabilistic estimate may be generated, depending on the purpose of the assessment and the associated toxicity profile.

Tier III: a field exposure evaluation is required:
• a deterministic or probabilistic estimate may be generated, depending on the purpose of the assessment and the associated toxicity profile.

ADVANTAGES AND DISADVANTAGES OF DETERMINISTIC VS PROBABILISTIC RISK ASSESSMENT

Lunchick (2001) considers the advantages and disadvantages of deterministic and probabilistic approaches. He argues that deterministic assessments are less labour intensive, require less supporting data, and are easier to comprehend and interpret, while probabilistic assessments take into account all available information and reflect the variability and uncertainty surrounding the exposure estimates.

In the Canadian regulatory experience, point estimates for exposure variables are seldom easy to confirm or reproduce and, although the resulting estimates may appear deceptively simple, they are usually the result of many assumptions and are often difficult to understand and interpret. In our experience, the principal limitation of either a deterministic or a probabilistic exposure assessment is the lack of adequate information to characterize the variability/uncertainty around individual variables.

In the Canadian regulatory process, large amounts of knowledge regarding use patterns and other variables are often incorporated into deterministic assessments. Generating and analysing data and developing credible assumptions is the most labour intensive component of an exposure/risk assessment. Canadian regulators use high-end or upper-bound values only when assessing risk for severe, acute toxicity endpoints or in the absence of better information. For example:

• In current Canadian deterministic assessments, the value for area treated per day is not a high-end value, but rather is based on central tendency estimates from surveys of farmers and custom applicators, with data sorted by crop and equipment type.

• There has been a recent trend to identifying situations in which different short-term and acute exposure estimates for the same application scenario are appropriate, as indicated by parameters such as a rapid residue dissipation curve. For example, in a case where high quality dislodgeable residue dissipation curves were available to verify rapid dissipation between applications, a 7-week time-weighted exposure value was calculated for children’s post-application exposure from turf treated at two-week intervals. An ‘acute’ exposure estimate was also generated based on residues on the day of application. These two exposure values were then compared to appropriate toxicity endpoints in parallel risk assessments.
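The parallel short-term/acute comparison can be sketched as a toy calculation. Every number below is hypothetical (a first-order dissipation curve with a 2-day half-life and arbitrary deposit units); the actual case described above used measured dislodgeable-residue curves.

```python
import math

# Hypothetical first-order dislodgeable-residue dissipation.
half_life_days = 2.0
k = math.log(2) / half_life_days
application_days = [0, 14, 28, 42]       # applications every two weeks
deposit = 1.0                            # residue per application (arbitrary units)

def residue(day):
    """Total residue on a given day, summed over all prior applications."""
    return sum(deposit * math.exp(-k * (day - a))
               for a in application_days if a <= day)

days = range(7 * 7)                                  # the 7-week window
twa = sum(residue(d) for d in days) / len(days)      # time-weighted average
acute = residue(0)                                   # day-of-application value

# With rapid dissipation, the time-weighted value is far below the acute
# value, so comparing each to its own toxicity endpoint is informative.
print(f"acute: {acute:.2f}  7-week TWA: {twa:.2f}")
```

With these assumptions the 7-week time-weighted value is roughly a quarter of the day-of-application value, which is why the two metrics support different, parallel risk comparisons.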

Where a registrant submits refined use information, or any other data that can be used to refine an exposure assessment, the regulator should incorporate this into either a deterministic or probabilistic exposure assessment, provided the data are relevant and supported by a verifiable citation.

The major advantage of probabilistic assessment is that it is possible to characterize the variability and uncertainty around an estimate for a specific percentile of concern. However, the output of a probabilistic assessment provides useful information only if the input data are robust and the modelling assumptions are well supported.

VALIDATION OF PROBABILISTIC MODELS

There are numerous probabilistic modelling tools available. From the regulatory perspective, a modelling tool must be structured so that:

• All algorithms and assumptions inherent to the model can be identified and validated. For customized models, this generally means that the regulator should have access to the source code so that they can commission independent review of the mathematics and coding;

• Model predictions can be compared between models. Stability of the output distribution should be evident across as well as within modelling tools;

• Limitations of the sensitivity analysis and other indicators of variability and uncertainty can be identified and assessed, e.g., truncation of input distributions, use of point values, etc. As noted in US-EPA (1997), ‘a rationale must be provided as to the justification of the selection of the maximum and minimum values’.
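The within-tool stability criterion can be probed with a simple reproducibility check. The two-parameter lognormal model below is hypothetical and purely illustrative: rerunning the same simulation with independent random seeds should leave the output metric of interest nearly unchanged.

```python
import random

def simulated_p95(seed, n=50_000):
    """95th percentile of a toy two-parameter lognormal exposure model."""
    rng = random.Random(seed)
    sims = sorted(rng.lognormvariate(0.0, 0.7) * rng.lognormvariate(1.0, 0.4)
                  for _ in range(n))
    return sims[int(0.95 * n)]

# A stable model gives nearly identical answers across independent runs;
# large run-to-run spread signals too few iterations (or a coding problem).
runs = [simulated_p95(seed) for seed in range(5)]
spread = (max(runs) - min(runs)) / min(runs)
print([round(r, 2) for r in runs], f"relative spread: {spread:.1%}")
```

The same idea extends across modelling tools: implementing the identical model in two tools and comparing the resulting output percentiles is the cross-tool analogue of this seed check.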

HARMONIZATION

A ‘from the ground up’ harmonized approach to probabilistic exposure and risk assessment, by both industry and regulators, will facilitate the adoption of such approaches by regulatory agencies. The groundwork has been laid through the publication of the EPA guidance documents for probabilistic assessments (US-EPA, 1997, 1998, 2000) and through previous harmonization initiatives, such as the NAFTA Technical Working Group on Pesticides, that have resulted in North American harmonization of many of the key parameters and assumptions inherent to occupational and residential exposure assessments.

Issues for which harmonized, detailed guidance is required include:

• The types of analyses for which probabilistic methodologies potentially add value.

• Criteria for model validation.

• The key considerations for fitting appropriate distributions to specific types of data:

— There is increasing regulatory awareness of the limitations of ‘goodness of fit’ tests for normality or lognormality. Such tests reject poor distribution fits rather than identify good fits. A small data set is actually more likely to be declared a fit using such tests than is a large data set;

— For regulators, numerical stability of the output distribution’s central tendency and upper tail is essential. It follows mathematically that truncating the ends of input distributions will stabilize the ends of output distributions, and we agree with Lunchick (2001) that when a continuous input distribution is used the ends of the distribution should sometimes be truncated. Ideally, the impact of selecting or truncating distributions should be characterized by generating ‘alternative’ assessments with different input descriptors so that the magnitude of the effects of truncation can be probed. It may be possible to adopt default distribution truncation values for common data sets (e.g., body weights). Any limitations associated with either approach should be clearly described;

— There may be situations where it is better to use empirical versus fitted probability distribution functions (i.e., to directly sample an empirical data set rather than to sample from a fitted distribution), and vice versa. A limitation associated with direct sampling from the input data set, without fitting a distribution, is that this effectively truncates the ‘distribution’ and may mask real uncertainty/variability;

— The ultimate mask of variability and uncertainty is using a point value for an input. As such, point estimates should generally be used for variables which are considered to have low variability, rather than basing the choice of using a point estimate solely on lack of data. Lack of variability/uncertainty for point estimates may also stop these values from being identified in sensitivity analyses and may result in an important variable being inadequately characterized in the assessment;

— In the absence of robust data, it may be best to utilize the widest distribution consistent with the given state of knowledge. Bayesian approaches may sometimes be appropriate.

• Selection of appropriate output metrics, defined by the purpose and scope of assessments (e.g., acute vs. chronic toxicity).

• Data reporting formats, including sensitivity analysis and the appropriate level of annotation within spreadsheets.

• Characterization of variability and uncertainty in input and output distributions. The advantages and limitations of quantitative vs. qualitative approaches should be elucidated. A one-dimensional probabilistic analysis cannot propagate variability and uncertainty at the same time and therefore is of limited use. A two-dimensional analysis is generally preferable, as it can quantitatively characterize how uncertainty in the input data affects the range of possible output data sets. A general discussion of two-dimensional simulations is provided by Cullen and Frey (1999). Consideration should be given to harmonizing the conditions under which a one-dimensional versus a two-dimensional analysis would be acceptable.

• The appropriate degree of formality for documenting distributions and assumptions based on ‘expert judgement’.
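A two-dimensional analysis can be sketched in a few lines. All distributions here are hypothetical: an outer loop samples the uncertainty about a parameter (the geometric standard deviation of an exposure distribution, imagined as estimated from a small field study), and an inner loop samples person-to-person variability, yielding an uncertainty interval around the variability percentile of interest.

```python
import random
import statistics

random.seed(4)

p95_estimates = []
for _ in range(200):                       # outer loop: uncertainty
    # Uncertain "true" log-scale spread, e.g. from a small field study.
    sigma = random.uniform(0.5, 1.0)
    inner = sorted(random.lognormvariate(0.0, sigma)
                   for _ in range(2_000))  # inner loop: variability
    p95_estimates.append(inner[int(0.95 * 2_000)])

# The spread of the 95th-percentile estimates shows how parameter
# uncertainty propagates into the output metric of interest.
cuts = statistics.quantiles(p95_estimates, n=20)
print(f"95th percentile of exposure: {cuts[0]:.2f} to {cuts[-1]:.2f} "
      f"(central 90% uncertainty interval)")
```

A one-dimensional analysis would collapse the outer loop into a single sigma and report one 95th percentile, discarding exactly the interval this sketch produces.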

CONCLUSIONS

For the regulator, it is important to keep an exposure assessment as transparent as possible. This generally means clearly defining the purpose of the assessment, identifying relevant model algorithms and keeping it as simple as possible, while utilizing all available data. In situations where input data are variable or uncertain, a probabilistic assessment can help characterize the tails of the output distribution, allow investigation of sources of variability and uncertainty, and can include a sensitivity analysis to identify variables that are driving the assessment or are contributing to the instability of the assessment. This can help focus additional data gathering or risk mitigation proposals.

As harmonized guidance is developed, probabilistic approaches will play an increasingly important role in pesticide regulation. As a starting point, we propose that probabilistic assessments should be conducted for regulatory purposes only when all of the following conditions are met:

• A ‘Tier I’ high-end estimate does not result in acceptable risk;

• There are adequate data to support a refined exposure estimate (all raw input data must be available and supported by verifiable data sources);

• There is an identifiable need to characterize variability and uncertainty and/or to quantify the exposure that will represent most of the exposed population (e.g., for a severe acute toxicant). Otherwise, a well characterized point estimate based on central tendency estimates may suffice;

• There are adequate data to characterize data distributions (i.e., statistical or other support for selecting a distribution shape, a reasonable basis for bounding the distribution, etc.);

• There is a scientific rationale available to support the credibility of each of the assumptions in the probability model;

• The presence or absence of correlations between input parameters can be identified and documented, or an appropriate sensitivity analysis indicates that such correlations have minimal impact.
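The impact of an undocumented correlation can be demonstrated with a short sketch. The two log-scale inputs and the correlation value are invented, with the correlated pair built from a shared standard-normal component: ignoring a positive correlation leaves the median essentially unchanged but understates the upper percentiles.

```python
import math
import random

random.seed(5)

n = 100_000
r = 0.8                                   # assumed log-scale correlation

independent, correlated = [], []
for _ in range(n):
    z1, z2, zc = (random.gauss(0.0, 1.0) for _ in range(3))
    # Independent case: exposure = product of two lognormal inputs.
    independent.append(math.exp(0.5 * z1) * math.exp(0.5 * z2))
    # Correlated case: both log-inputs share a common component zc,
    # giving corr(a, b) = r while each keeps unit variance.
    a = math.sqrt(r) * zc + math.sqrt(1 - r) * z1
    b = math.sqrt(r) * zc + math.sqrt(1 - r) * z2
    correlated.append(math.exp(0.5 * a) * math.exp(0.5 * b))

def pctl(samples, p):
    """Empirical percentile of an unsorted sample."""
    return sorted(samples)[int(p * len(samples))]

print(f"median: {pctl(independent, 0.50):.2f} vs {pctl(correlated, 0.50):.2f}")
print(f"P95:    {pctl(independent, 0.95):.2f} vs {pctl(correlated, 0.95):.2f}")
```

Because the upper percentiles are typically the output metrics of regulatory interest, this is why the condition above requires correlations to be either documented or shown to have minimal impact.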

REFERENCES

Cullen AC, Frey HC. Probabilistic techniques in exposure assessment: a handbook for dealing with variability and uncertainty in models and inputs. Society for Risk Analysis. New York: Plenum Press, 1999.

Ginevan M. Personal communication. Michael E. Ginevan,Ph.D. M.E. Ginevan and Associates, Washington, DC, 1999.

Lunchick C. Probabilistic exposure assessment of operator and residential non-dietary exposure (in press).

OECD. Guidance document for the conduct of studies of occupational exposure to pesticides during agricultural application. Organisation for Economic Co-operation and Development. Series on Testing and Assessment No. 9. OECD/GD(97)148.58, 1997.

US-EPA. Guiding principles for Monte Carlo analysis. United States Environmental Protection Agency, Risk Assessment Forum. EPA/630/R-97/001, 1997.

US-EPA. Guidance for submission of probabilistic human health exposure assessments to the Office of Pesticide Programs. Draft. 63 FR 67063-67066, 1998.

US-EPA. Choosing a percentile of acute dietary exposure as a threshold of regulatory concern. United States Environmental Protection Agency, Office of Pesticide Programs, 2000.