Exposure Factors Handbook (Post 2011)

Project ID: 1854
Category: Other
Added on: April 3, 2012, 9:48 a.m.

Technical Report

Abstract  This guidance contains principles for developing and describing EPA risk assessments, with a particular emphasis on risk characterization. The current document is an update of the guidance issued with the Agency's 1992 policy (Guidance on Risk Characterization for Risk Managers and Risk Assessors, February 26, 1992). The guidance has not been substantially revised, but includes some clarifications and changes to give more prominence to certain issues, such as the need to explain the use of default assumptions. As in the 1992 policy, some aspects of this guidance focus on cancer risk assessment, but the guidance applies generally to human health effects (e.g., neurotoxicity, developmental toxicity) and, with appropriate modifications, should be used in all health risk assessments. This document has not been revised to specifically address ecological risk assessment; however, initial guidance for ecological risk characterization is included in EPA's Framework for Ecological Risk Assessment (EPA/630/R-92/001). Nor does this guidance address in detail the use of risk assessment information (e.g., information from the Integrated Risk Information System (IRIS)) to generate site- or media-specific risk assessments. Additional program-specific guidance will be developed to enable implementation of EPA's Risk Characterization Policy. Development of such guidance will be overseen by the Science Policy Council and will involve risk assessors and risk managers from across the Agency.

Journal Article

Abstract  The presence of measurement error in exposure and risk factor data potentially affects any inferences regarding variability and uncertainty, because the distribution representing the observed data set deviates from the distribution that represents an error-free data set. A methodology for improving the characterization of variability and uncertainty with known measurement errors in data is demonstrated in this article based on an observed data set, known measurement error, and a measurement-error model. A practical method for constructing an error-free data set is presented, and a numerical method based upon bootstrap pairs, incorporating two-dimensional Monte Carlo simulation, is introduced to address uncertainty arising from measurement error in selected statistics. When measurement error is a large source of uncertainty, substantial differences occur between the distribution representing variability of the observed data set and the distribution representing variability of the error-free data set. Furthermore, the shape and range of the probability bands for uncertainty differ between the observed and error-free data sets. Failure to separately characterize contributions from random sampling error and measurement error will lead to bias in the variability and uncertainty estimates. However, a key finding is that total uncertainty in the mean can be properly quantified even if measurement and random sampling errors cannot be separated. An empirical case study is used to illustrate the application of the methodology.
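
A minimal numerical sketch of this idea, assuming independent additive measurement error with known standard deviation; the simulated data set and all values are invented for illustration and are not the authors' implementation:

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical observed exposure-factor data with known, independent
# additive measurement error of standard deviation sd_me.
obs = rng.lognormal(mean=1.0, sigma=0.5, size=50)
sd_me = 0.4

# Construct an approximate "error-free" data set by shrinking each
# observation toward the sample mean so that the variance equals
# var(observed) - var(measurement error).
var_true = max(obs.var(ddof=1) - sd_me**2, 0.0)
shrink = np.sqrt(var_true / obs.var(ddof=1))
error_free = obs.mean() + shrink * (obs - obs.mean())

# Bootstrap pairs of (random sampling, measurement error): each replicate
# resamples the error-free data and re-adds simulated measurement error,
# so the spread of replicate means reflects total uncertainty in the mean,
# consistent with the abstract's key finding.
n_boot = 5000
means = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(error_free, size=error_free.size, replace=True)
    means[i] = (resample + rng.normal(0.0, sd_me, size=resample.size)).mean()

print("95% uncertainty interval for the mean:", np.percentile(means, [2.5, 97.5]))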

Journal Article

Abstract  This article addresses the ubiquitous nature of uncertainty in industrial hygiene-related risk assessment and defines two basic types of uncertainty: natural variability and a basic lack of knowledge. A relatively simple physical-chemical modeling example is provided as an illustration in which uncertainty and sensitivity are described using two methods: a conventional technique and readily available, user-friendly computer simulation software. The need for and value of monitoring data for validation and as a “reality check” are emphasized.
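
A sketch of the kind of simple model the article describes, here a Monte Carlo treatment of a well-mixed-room mass-balance model C = G / Q; the model choice and all parameter values are assumptions for illustration, not taken from the article:

import numpy as np

rng = np.random.default_rng(6)
n = 50_000
G = rng.lognormal(np.log(100.0), 0.3, n)   # contaminant generation rate, mg/h
Q = rng.uniform(20.0, 60.0, n)             # room ventilation rate, m^3/h
C = G / Q                                  # steady-state air concentration, mg/m^3

# Propagated uncertainty in the predicted concentration:
print("median and 95th percentile (mg/m^3):", np.percentile(C, [50, 95]).round(2))

Comparing the simulated percentiles against measured concentrations would serve as the “reality check” the article emphasizes.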

Technical Report

Abstract  This guidance has been developed as a basis for transparently characterizing uncertainties in exposure assessment to enable their full consideration in regulatory and policy decision-making processes. These uncertainties are grouped under three categories, namely parameter, model, and scenario, with the guidance addressing both qualitative and quantitative descriptions. Guidance offered here is consistent with other projects addressing exposure in the WHO/IPCS harmonization initiative, including a monograph on “IPCS Glossary of Key Exposure Assessment Terminology” and a monograph on “Principles of Characterizing and Applying Human Exposure Models”. This document recommends a tiered approach to the evaluation of uncertainties in exposure assessment using both qualitative and quantitative methods, including both deterministic and probabilistic methodologies. It is intended for use by risk assessors who are not intimately familiar with uncertainty analysis. The key sections of the report include: definition and identification of different sources of uncertainty in exposure assessment; considerations for selecting the appropriate approach to uncertainty analysis as dictated by the specific objective; identification of the information needs of decision-makers; and recommendations for adopting a set of guiding principles for uncertainty analysis. The document also provides guidance on ways to consider or characterize exposure uncertainties during risk assessment and risk management decision-making, and on communicating the results. Illustrative examples based on environmental exposure and risk analysis case-studies are also provided. The framework is considered applicable across a full range of chemical categories, e.g. industrial chemicals, pesticides, food additives and others. A tiered approach to choosing alternative methods for uncertainty analysis is proposed, with the degree of quantitative analysis increasing as progress is made through each tier. Finally, the monograph is developed to provide insight into the complexities associated with characterizing uncertainties in exposure assessment and suggested strategies for incorporating them during human health risk assessments for environmental contaminants. This is presented in the context of comparability with uncertainties associated with hazard quantification in risk assessment.

Journal Article

Abstract  Women of child-bearing age (15 to 44 years) and, in particular, pregnant and lactating women in this age cohort are considered a sensitive subpopulation when assessing risk from ingestion of water, because waterborne contaminants may pose a risk not only to the mother but also to the fetus or infant. This article presents estimates of daily average per capita water ingestion for women of child-bearing age and for three subgroups: pregnant, lactating, and non-pregnant/non-lactating women. Estimates of means and upper percentiles of subgroup ingestion distributions were generated using participant responses and survey weights from the United States Department of Agriculture's (USDA) 1994–96 and 1998 Continuing Survey of Food Intakes by Individuals (CSFII). The ingestion estimates are empirical and not based on an assumed parametric distribution of daily average water ingestion. Water occurring naturally in foods or added by manufacturers to commercial products is not included in the estimates presented. These estimates of water ingestion by women of child-bearing age are compared to those attributed to Ershow and Cantor (1989) by Burmaster. Those estimates, based on data collected in 1978, were used by Burmaster to characterize the distribution of daily average per capita ingestion as lognormal. The lognormal estimates of total water ingestion are generally greater than the total water ingestion estimates based on the CSFII data. Possible explanations for the differences are discussed.
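
A minimal sketch of survey-weighted percentile estimation in the spirit of such empirical (distribution-free) estimates; the intake values and weights below are invented for illustration:

import numpy as np

intake = np.array([0.4, 0.9, 1.1, 1.5, 2.2, 3.0])   # L/day, hypothetical responses
weights = np.array([1.2, 0.8, 1.5, 1.0, 0.7, 0.9])  # hypothetical survey weights

# Weighted empirical percentile: sort by intake and find where the
# cumulative weight fraction first reaches the target percentile.
order = np.argsort(intake)
x = intake[order]
cum_frac = np.cumsum(weights[order]) / weights.sum()
p90 = x[np.searchsorted(cum_frac, 0.90)]
print("weighted 90th percentile:", p90, "L/day")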

Journal Article

Abstract  This paper reanalyzes the dataset cited by the U.S. Environmental Protection Agency in its Exposure Factors Handbook that contains measurements of skin area, height, and body weight for 401 people spanning all stages of development. The reanalysis shows that a univariate model for total skin area as a function of body weight gives useful and practical results with little or no loss of reliability compared to the Agency's bivariate model. This new result leads to a new method to develop lognormal distributions for total skin area as a function of body weight alone.
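
One way such a univariate lognormal model can be fit, sketched on simulated data; the allometric coefficients and error spread below are assumptions for illustration, not the paper's fitted values:

import numpy as np

rng = np.random.default_rng(0)

# Simulated measurements: skin area SA (m^2) vs body weight BW (kg),
# with multiplicative (lognormal) scatter around an assumed power law.
bw = rng.uniform(10, 100, size=200)
sa = 0.09 * bw**0.7 * rng.lognormal(0.0, 0.05, size=bw.size)

# Fit log(SA) = a + b*log(BW); the residual SD on the log scale gives
# the lognormal spread of SA at any given body weight.
b, a = np.polyfit(np.log(bw), np.log(sa), 1)
resid_sd = np.std(np.log(sa) - (a + b * np.log(bw)), ddof=2)

bw0 = 70.0
print("median SA at 70 kg (m^2):", round(np.exp(a + b * np.log(bw0)), 3))
print("geometric SD:", round(np.exp(resid_sd), 3))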

Journal Article

Abstract  Four hundred and forty personal air measurements were carried out on 54 workers employed in the main processes of six different factories (6-13 workers in each). Potential exposure was to lead, benzene, and dust. Ten randomly repeated hygiene surveys were carried out over 1 year. To estimate the magnitude and sources of the variability in workers' exposure over time (the variance between workers and the variance within a worker), a nested unbalanced analysis-of-variance model was fitted to the log-transformed data. Of the total exposure variance, the within-worker variance over time accounted for 51% (geometric standard deviation, GSD = 3.1) and the variance between workers, factories, and air contaminants for 49%. The variance between workers was due mainly to variance between workers within the same factory (67%). Outdoor locations, mobility of the worker, and mobility of the sources of exposure had a positive influence on both the between-worker variance (26%, ANOVA) and the variance of a worker's exposure over time (39%, regression). These variables are therefore important in designing strategies for sampling workers' exposure. For valid compliance testing and assessment of workers' exposure, the mean and the within- and between-worker variances of exposure over time should be considered. Exposure should be measured several times a year, at random, to prevent misclassification of workers. To assess yearly exposure, a GSD of 3.1 can be used to calculate confidence limits for the arithmetic mean of a worker's exposure measurements, in circumstances similar to those in this study.
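
A worked sketch of that final recommendation, computing approximate confidence limits for the arithmetic mean of one worker's measurements using the pooled GSD of 3.1; the measurement values and the normal-approximation shortcut are illustrative assumptions:

import numpy as np

x = np.array([0.8, 2.5, 1.1, 4.0, 0.6, 1.9])   # hypothetical exposures, mg/m^3

sigma = np.log(3.1)            # pooled within-worker log-scale SD (GSD = 3.1)
mu_hat = np.log(x).mean()
se = sigma / np.sqrt(x.size)
z = 1.96                       # two-sided 95% normal quantile

# Arithmetic mean of a lognormal is exp(mu + sigma^2/2); approximate
# 95% limits follow from the sampling uncertainty in mu alone.
am = np.exp(mu_hat + sigma**2 / 2)
lo = np.exp(mu_hat - z * se + sigma**2 / 2)
hi = np.exp(mu_hat + z * se + sigma**2 / 2)
print(f"arithmetic mean ~ {am:.2f} mg/m^3, 95% CI ~ ({lo:.2f}, {hi:.2f})")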

Journal Article

Abstract  Uncertainty permeates the process of risk assessment. It arises from recognized sources such as inadequacy of toxicological data, lack of exposure information, and imprecise identification of sensitive populations. In addition to these ambiguities, comparative risk exercises, which amount to risk assessment on a scale wider than that applied to single agents, also entail balancing community values, cost-benefit analyses, and other factors not directly tied to toxicology. Such exercises often convene evaluation panels to attempt a ranking of different stressors or stressor groups. The size of these panels is necessarily limited, and they usually strive to reach some form of consensus on ranking. Because ordinal assignments of risk are so difficult to achieve, these exercises often evolve into rating agendas in which stressors are categorized as high, medium, or low risks. Whether ranking or rating is its aim, this process, with its emphasis on agreement, usually overlooks two major components of uncertainty. One is variability among raters in assigning a score or category. The other is the degree of uncertainty they implicitly attach to their individual ratings. Both serve as guides to the scope and clarity of the available information. To gather more information about these critical but usually overlooked contributions to uncertainty and, simultaneously, to query a broader sample of respondents, a survey method was designed to exploit the possibilities of electronic communication based on the World Wide Web. This method can secure risk ratings of selected stressors from many different samples of respondents. It can also provide information about the extent of rating variability among risk assessors, individuals, or groups of respondents, about the bases of the ratings, and, concurrently, about the confidence respondents place in their judgments. Comparative risk endeavors conducted in this format make their aims and content easy to modify. Data obtained by such a method can serve as pointers to new research initiatives, to regulatory priorities, or to further iterations.

Journal Article

Abstract  The development of quantitative models of the interindividual (or intraspecies) uncertainty factor requires a clear and objective description of the interindividual uncertainty factor itself. Currently, a number of empirical approaches have been proposed for defining the uncertainty factor; however, the exact purpose of the factor remains unclear. This paper reviews the historical definition of the interindividual uncertainty factor and identifies two separate conceptual models that have been used to define it: the sensitive population model and the limited sample size model. These two models address separate sources of uncertainty, both of which should be addressed in determining the size of the interindividual uncertainty factor for chemicals.

Journal Article

Abstract  In recent years, there has been a trend toward the use of probabilistic methods for the analysis of uncertainty and variability in risk assessment. By developing a plausible distribution of risk, it is possible to obtain a more complete characterization of risk than is provided by either "best estimates" or "upper bounds" on risk. In this article, we propose a general framework for the evaluation of uncertainty and variability in risk assessment. Within this framework, the contributions made by individual variables affecting risk to overall uncertainty and variability can be identified. First-order approximations are developed which avoid the need to resort to Monte Carlo simulation for evaluating uncertainty and variability. A practical application based on a multiplicative risk model for a population of individuals ingesting contaminated fish is presented to illustrate the application of the proposed methods.
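
For a purely multiplicative risk model the first-order idea is transparent: on the log scale the input variances simply add, so each input's share of the total can be read off without simulation. A sketch with assumed geometric standard deviations; the model form and all values are illustrative, not the authors' case study:

import numpy as np

# Toy multiplicative model R = C * IR * EF / BW with independent
# lognormal inputs; Var(ln R) = sum of Var(ln X_i).
gsd = {"C": 1.5, "IR": 1.3, "EF": 1.2, "BW": 1.1}   # assumed GSDs
var_ln = {k: np.log(g)**2 for k, g in gsd.items()}
total = sum(var_ln.values())

print("GSD of risk:", round(float(np.exp(np.sqrt(total))), 3))
for k, v in var_ln.items():
    print(f"share of variance from {k}: {v/total:.1%}")

# Cross-check by Monte Carlo (unnecessary here, which is the point):
rng = np.random.default_rng(2)
lnR = sum(rng.normal(0.0, np.log(g), 100_000) for g in gsd.values())
print("GSD of risk (Monte Carlo):", round(float(np.exp(lnR.std())), 3))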

Journal Article

Abstract  The potential application of categorical (i.e., species-, pathway-, or group-specific) defaults for several components of uncertainty relevant to the development of tolerable or reference concentrations/doses is considered, namely interspecies variation and adequacy of the database. For the former, the adequacy of allometric scaling by body surface area as a species-specific default for oral tolerable or reference doses is considered. For the latter, the extent to which data from analyses of subchronic:chronic effect levels, LOAELs/NOAELs, and critical effect levels for complete versus incomplete datasets inform the selection of defaults is examined. The relative role of categorical defaults for these aspects is considered in the context of the continuum of increasingly data-informed approaches to the characterization of uncertainty and variability, ranging from default ("presumed protective") to "biologically based predictive".
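
A worked illustration of the body-surface-area default mentioned above: surface area scales roughly as BW^(2/3), so an equivalent per-kilogram oral dose scales as (BW_animal / BW_human)^(1/3). All values below are invented:

# Allometric (body-surface-area) scaling of an oral dose, rat to human.
bw_rat, bw_human = 0.25, 70.0      # body weights, kg (assumed)
dose_rat = 10.0                    # hypothetical rat NOAEL, mg/kg-day

dose_human = dose_rat * (bw_rat / bw_human) ** (1 / 3)
print(f"surface-area-scaled human equivalent dose: {dose_human:.2f} mg/kg-day")
# (70 / 0.25)^(1/3) ~ 6.5, i.e., roughly a 6.5-fold reduction from the rat dose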

Journal Article

Abstract  Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight some of the problem areas that must be addressed in actual applications. The following approaches to uncertainty and sensitivity analysis are considered: (1) response surface methodology based on input determined from a fractional factorial design; (2) Latin hypercube sampling with and without regression analysis; and (3) differential analysis. These techniques are investigated with respect to (1) ease of implementation, (2) flexibility, (3) estimation of the cumulative distribution function of the output, and (4) adaptability to different methods of sensitivity analysis. With respect to these criteria, the technique using Latin hypercube sampling and regression analysis had the best overall performance. The models used in the investigation are well documented, thus making it possible for researchers to make comparisons of other techniques with the results in this study.
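
A compact sketch of the best-performing combination per the abstract, Latin hypercube sampling followed by regression-based sensitivity analysis; the toy model, input ranges, and use of scipy's qmc module are assumptions for illustration:

import numpy as np
from scipy.stats import qmc

# Latin hypercube sample of three inputs over assumed ranges.
sampler = qmc.LatinHypercube(d=3, seed=0)
x = qmc.scale(sampler.random(n=500),
              l_bounds=[0.5, 1.0, 10.0], u_bounds=[1.5, 3.0, 50.0])

y = x[:, 0] ** 2 * x[:, 1] + 0.1 * x[:, 2]   # illustrative model output

# Regression on standardized inputs: the standardized regression
# coefficients (SRCs) rank the inputs' influence on the output.
xs = (x - x.mean(axis=0)) / x.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(xs, ys, rcond=None)
print("SRCs:", dict(zip(["x1", "x2", "x3"], src.round(3))))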

Journal Article

Abstract  I use an analogy with the history of physical measurements and of population and energy projections, and I analyze the trends in several data sets to quantify the overconfidence of experts in the reliability of their uncertainty estimates. Data sets include (i) time trends in sequential measurements of the same physical quantity; (ii) national population projections; and (iii) projections for the U.S. energy sector. Probabilities of large deviations of the true values are parametrized by an exponential distribution with the slope determined by the data. Statistics of past errors can be used in probabilistic risk assessment to hedge against unsuspected uncertainties and to include the possibility of human error in the framework of uncertainty analysis. By means of a sample Monte Carlo simulation of cancer risk caused by ingestion of benzene in soil, I demonstrate how the upper 95th percentiles of risk change when unsuspected uncertainties are included. I recommend inflating the estimated uncertainties by default safety factors determined from the relevant historical data sets.
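
A sketch of the inflation idea: widening each input's log-scale spread by a default factor and observing the shift in the upper 95th percentile of simulated risk. The two-input risk model and every numerical value are invented, not the paper's benzene case:

import numpy as np

rng = np.random.default_rng(3)
n = 100_000

def risk_p95(inflate=1.0):
    # Toy risk = dose * potency, both lognormal; 'inflate' widens the
    # assumed log-scale spreads to stand in for unsuspected uncertainty.
    ln_dose = rng.normal(np.log(1e-6), inflate * np.log(2.0), n)
    ln_potency = rng.normal(0.0, inflate * np.log(1.5), n)
    return np.percentile(np.exp(ln_dose + ln_potency), 95)

for f in (1.0, 1.5, 2.0):   # hypothetical default inflation factors
    print(f"inflation factor {f}: 95th percentile of risk = {risk_p95(f):.2e}")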

Journal Article

Abstract  This paper first reviews, in plain language, some basic concepts and methods for estimating inter-individual variability in susceptibility from human data. A scale is offered to allow different variability findings to be understood and compared. The accumulated results of different variability analyses (information on how much variability has been observed, and how often) are then summarized in the form of a series of box plots. Data are presented on pharmacodynamic variability for various non-cancer effects, variability in susceptibility to infectious organisms, and variability in susceptibility to carcinogenesis by genetically acting agents.

Technical Report

Abstract  This guidance has been developed as a basis for transparently characterizing uncertainty in chemical exposure assessment to enable its full consideration in regulatory and policy decision-making processes. Uncertainties in exposure assessment are grouped under three categories, namely parameter, model and scenario, with the guidance addressing both qualitative and quantitative descriptions. Guidance offered here is consistent with other projects addressing exposure in the WHO/IPCS Harmonization Project, including a monograph on IPCS Risk Assessment Terminology, which includes a glossary of key exposure assessment terminology, and a monograph on Principles of Characterizing and Applying Human Exposure Models. The framework described in this monograph is considered applicable across a full range of chemical categories, such as industrial chemicals, pesticides, food additives and others. It is intended primarily for use by exposure assessors who are not intimately familiar with uncertainty analysis. The monograph aims to provide an insight into the complexities associated with characterizing uncertainties in exposure assessment and suggested strategies for incorporating them during human health risk assessments for environmental contaminants. This is presented in the context of comparability with uncertainties associated with hazard quantification in risk assessment. This document recommends a tiered approach to the evaluation of uncertainties in exposure assessment using both qualitative and quantitative (both deterministic and probabilistic) methods, with the complexity of the analysis increasing as progress is made through the tiers. The report defines and identifies different sources of uncertainty in exposure assessment, outlines considerations for selecting the appropriate approach to uncertainty analysis as dictated by the specific objective and identifies the information needs of decision-makers and stakeholders. The document also provides guidance on ways to consider or characterize exposure uncertainties during risk assessment and risk management decision-making and on communicating the results. Illustrative examples based on environmental exposure and risk analysis case-studies are provided. The monograph also recommends the adoption of 10 guiding principles for uncertainty analysis. These guiding principles are considered to be the general desirable goals or properties of good exposure assessment. They are mentioned in the text where most appropriate and are supported by more detailed recommendations for good practice. The 10 guiding principles are as follows:

1) Uncertainty analysis should be an integral part of exposure assessment.
2) The level of detail of the uncertainty analysis should be based on a tiered approach and consistent with the overall scope and purpose of the exposure and risk assessment.
3) Sources of uncertainty and variability should be systematically identified and evaluated in the exposure assessment.
4) The presence or absence of moderate to strong dependencies between model inputs is to be discussed and appropriately accounted for in the analysis.
5) Data, expert judgement or both should be used to inform the specification of uncertainties for scenarios, models and model parameters.
6) Sensitivity analysis should be an integral component of the uncertainty analysis in order to identify key sources of variability, uncertainty or both and to aid in iterative refinement of the exposure model.
7) Uncertainty analyses for exposure assessment should be documented fully and systematically in a transparent manner, including both qualitative and quantitative aspects pertaining to data, methods, scenarios, inputs, models, outputs, sensitivity analysis and interpretation of results.
8) The uncertainty analysis should be subject to an evaluation process that may include peer review, model comparison, quality assurance or comparison with relevant data or independent observations.
9) Where appropriate to an assessment objective, exposure assessments should be iteratively refined over time to incorporate new data, information and methods to better characterize uncertainty and variability.
10) Communication of the results of exposure assessment uncertainties to the different stakeholders should reflect the different needs of the audiences in a transparent and understandable manner.

Technical Report

Abstract  U.S. EPA increasingly utilizes physiologically based pharmacokinetic (PBPK) models in the development of its risk assessments. As reviewed in U.S. EPA’s Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment (U.S. EPA, 2006), these models are designed to determine the relationship between external exposure and biologically relevant (usually internal) dose, and their predictions can be used for extrapolating across routes, levels, or patterns of exposure, and for quantitatively characterizing differences in susceptibility across species, populations, and life stages. However, characterizing uncertainty and variability in PBPK models and their predictions has been an ongoing challenge, and this report summarizes some of the recent progress in this area that has been conducted or funded by the National Center for Environmental Assessment (NCEA). Specifically, the elements of this work are:

- Identification of (i) the key issues in characterizing uncertainty and variability in PBPK modeling; (ii) the state of the science on addressing those issues; and (iii) the key areas in need of improvement through research or enhanced implementation. These issues were discussed as a part of the International Workshop on Uncertainty and Variability in PBPK Models, held on October 31 - November 2, 2006. The outcome of this workshop has been summarized by Barton et al. (2007).

- Case examples of chemical-specific applications that demonstrate the methods and issues associated with characterizing uncertainty and variability in PBPK modeling. Specifically, the following case examples were completed: (i) uncertainty and variability in the human pharmacokinetics of tetrachloroethylene (Chiu and Bois, 2006; Chiu, 2006); (ii) uncertainty in the route-to-route extrapolation of vinyl chloride pharmacokinetics (Chiu, 2006); (iii) the development of a method to characterize inter-individual variability when only pooled data (mean and standard deviation) are available, using data on 1,3-butadiene (Chiu and Bois, 2007); and (iv) evaluation of uncertainty in human dose metrics for methyl tertiary-butyl ether (MTBE) exposures (Blancato et al., 2007).
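
A minimal sketch of the two-dimensional (uncertainty vs. variability) separation the report is concerned with, using a one-compartment surrogate in place of a full PBPK model; the model itself and every value are invented for illustration:

import numpy as np

rng = np.random.default_rng(5)
dose_rate = 1.0                                    # mg/h, constant exposure

# Outer loop: uncertainty about the population geometric mean clearance.
pop_gm_cl = rng.lognormal(np.log(5.0), 0.2, 200)   # L/h, assumed

metric_p95 = []
for gm in pop_gm_cl:
    # Inner loop: inter-individual variability around the population value.
    cl = rng.lognormal(np.log(gm), 0.4, 1000)      # individual clearances
    css = dose_rate / cl                           # steady-state conc., mg/L
    metric_p95.append(np.percentile(css, 95))      # dose metric of interest

print("uncertainty interval for the 95th-percentile individual Css:",
      np.percentile(metric_p95, [2.5, 97.5]).round(3))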

Technical Report

Abstract  The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating, on a national level, the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas are presented. (ERA citation 13:049624)

Book/Book Chapter

Abstract  In the field of modelling it is easier to find academic papers, guidelines tailored to specific disciplines and handbooks of numerical simulation than plain textbooks of broad appeal. The various academic communities go about modelling largely independently of each other. Is this an indication that modelling is not a science but a craft, as argued by epistemologists? In other words, is it because it is impossible to define a single set of rules to encode natural or man-made systems into sets of mathematical rules called models? If modelling is in fact characterized by such heterogeneity and lack of systematization, it might seem overly ambitious to offer a set of good practices of universal application in sensitivity analysis. Furthermore, if one looks at the available literature, in most instances ‘sensitivities’ are understood as derivatives of a particular output versus a particular input (such as elasticities in economics). This is not surprising, as contemporary researchers – like the authors of the present volume – are likely to have received more training in calculus than in Monte Carlo methods and to have seen more Jacobians and Hessians than Russian roulettes. A minority of sensitivity analysis practitioners (mostly in statistics, risk analysis and reliability) actively use importance measures such as those described in this book, whereby the influence of factors on outputs is assessed by looking at the entire input space rather than at a point in that space. Slowly these methods are finding their way into more recent modelling guidelines in other disciplines (see, for example, those of the Environmental Protection Agency in the USA, EPA, 2001). The purpose of this book is to offer students an easy-to-read manual for sensitivity analysis covering importance measures and to show how these global methods may help to produce more robust or parsimonious models as well as to make models more defensible in the face of scientific or technical controversy.
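
A small contrast between the two viewpoints the passage describes, local derivatives versus a global, variance-based importance measure, on an invented test function; the crude binning estimator below is a common textbook device, not the book's code:

import numpy as np

# y = x1 + 10*x2^2 with x1, x2 independent uniform on [0, 1].
# At the nominal point x2 = 0, the local derivative dy/dx2 = 20*x2 is zero,
# yet x2 dominates the output variance over the whole input space.
rng = np.random.default_rng(4)
n = 100_000
x1, x2 = rng.random(n), rng.random(n)
y = x1 + 10 * x2**2

def first_order_index(x, y, bins=50):
    # Crude first-order index: Var(E[y | x]) / Var(y), via equal-count bins.
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.digitize(x, edges[1:-1])
    cond_means = np.array([y[idx == k].mean() for k in range(bins)])
    return cond_means.var() / y.var()

print("S1(x1) ~", round(first_order_index(x1, y), 3))   # small
print("S1(x2) ~", round(first_order_index(x2, y), 3))   # dominant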

Technical Report

Abstract  This report presents daily average per capita fish consumption estimates. The objective of the report is to provide estimates of fish consumption that may be used in estimating risk to human health from the consumption of contaminated freshwater and estuarine finfish and shellfish species. The reported estimates were calculated using data from the combined 1994-1996 and 1998 Continuing Survey of Food Intakes by Individuals (CSFII), conducted annually by the United States Department of Agriculture (USDA). Estimates in this report are of “as prepared” and “uncooked” fish. The “as prepared” designation indicates that consumption of fish-containing prepared foods reported by the CSFII participants was adjusted to reflect the amount of prepared fish consumed. “Uncooked” fish estimates convert that portion of prepared fish back to the amount of uncooked fish tissue that entered the fish-containing food as consumed. Thus, the estimates in this report are not biased high by the other ingredients in fish-containing prepared foods.
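
A worked arithmetic sketch of the two adjustments described above; the recipe fish fraction and cooking yield are invented values, not the report's factors:

# Back-calculating fish amounts from a reported fish-containing dish.
reported_dish_g = 200.0    # grams of prepared dish reported as consumed
fish_fraction = 0.40       # assumed fraction of the dish that is prepared fish
cooking_yield = 0.75       # assumed cooked weight / uncooked weight

prepared_fish_g = reported_dish_g * fish_fraction   # "as prepared" basis
uncooked_fish_g = prepared_fish_g / cooking_yield   # "uncooked" basis
print(prepared_fish_g, round(uncooked_fish_g, 1))   # 80.0 g prepared, 106.7 g uncooked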
