Exposure Factors Handbook (Post 2011)

Project ID: 1854

Category: Other

Added on: April 3, 2012, 9:48 a.m.

Book/Book Chapter

Abstract  The public depends on competent risk assessment from the federal government and the scientific community to grapple with the threat of pollution. When risk reports turn out to be overblown--or when risks are overlooked--public skepticism abounds. This comprehensive and readable book explores how the U.S. Environmental Protection Agency (EPA) can improve its risk assessment practices, with a focus on implementation of the 1990 Clean Air Act Amendments. With a wealth of detailed information, pertinent examples, and revealing analysis, the volume explores the "default option" and other basic concepts. It offers two views of EPA operations: The first examines how EPA currently assesses exposure to hazardous air pollutants, evaluates the toxicity of a substance, and characterizes the risk to the public. The second, more holistic, view explores how EPA can improve in several critical areas of risk assessment by focusing on cross-cutting themes and incorporating more scientific judgment. This comprehensive volume will be important to the EPA and other agencies, risk managers, environmental advocates, scientists, faculty, students, and concerned individuals.

Journal Article

Abstract  This paper presents an approach for characterizing the probability of adverse effects occurring in a population exposed to dose rates in excess of the Reference Dose (RfD). The approach uses a linear threshold (hockey stick) model of response and is based on the current system of uncertainty factors used in setting RfDs. The approach requires generally available toxicological estimates such as No-Observed-Adverse-Effect Levels (NOAELs) or Benchmark Doses and doses at which adverse effects are observed in 50% of the test animals (ED50s). In this approach, Monte Carlo analysis is used to characterize the uncertainty in the dose response slope based on the range and magnitude of the key sources of uncertainty in setting protective doses. The method does not require information on the shape of the dose response curve for specific chemicals, but is amenable to the inclusion of such data. The approach is applied to four compounds to produce estimates of response rates for dose rates greater than the RfD.
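
A minimal sketch of the kind of calculation this abstract describes, in Python; the NOAEL, ED50, composite 100-fold factor, and lognormal spread are all illustrative assumptions, not the paper's actual procedure or values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

noael = 1.0          # mg/kg-day, illustrative NOAEL from an animal study
ed50 = 30.0          # mg/kg-day, dose producing effects in 50% of animals
rfd = noael / 100.0  # RfD derived with a composite 100-fold uncertainty factor

# Treat the human population threshold as uncertain: the NOAEL divided by a
# lognormally distributed composite uncertainty factor (spread is assumed).
uf = rng.lognormal(mean=np.log(100.0), sigma=0.8, size=n)
threshold = noael / uf

def response(dose, thr):
    """Linear 'hockey stick': zero below the threshold, rising to 50%
    response at the animal-derived ED50."""
    slope = 0.5 / np.maximum(ed50 - thr, 1e-9)
    return np.clip(slope * np.maximum(dose - thr, 0.0), 0.0, 1.0)

for mult in (1, 10, 100):  # dose rates at multiples of the RfD
    r = response(mult * rfd, threshold)
    print(f"{mult:>3}x RfD: mean predicted response {r.mean():.2e}")
```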

Journal Article

Abstract  This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.

Book/Book Chapter

Abstract  The authors explain the ways in which uncertainty is an important factor in the problems of risk and policy analysis. This book outlines the source and nature of uncertainty, discusses techniques for obtaining and using expert judgment, and reviews a variety of simple and advanced methods for analyzing uncertainty.

Journal Article

Abstract  Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses in favor of stochastic methods which provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, a task that is cumbersome using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
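
For the multiplicative case this abstract refers to, the analytic shortcut can be shown in a few lines: if risk is a product of lognormal factors, the log-variances add, and each factor's contribution falls out directly. The factor names and parameters below are invented for illustration, not taken from the radon analysis:

```python
import numpy as np

# (geometric mean, geometric standard deviation) for each lognormal factor
factors = {
    "water concentration": (1.0, 2.5),
    "ingestion rate":      (1.2, 1.6),
    "unit risk":           (3e-6, 3.0),
}

# risk = product of factors, so ln(risk) is normal with additive variances
log_var = {name: np.log(gsd) ** 2 for name, (gm, gsd) in factors.items()}
total_var = sum(log_var.values())

gm_risk = np.prod([gm for gm, _ in factors.values()])
gsd_risk = np.exp(np.sqrt(total_var))
print(f"risk: GM = {gm_risk:.2e}, GSD = {gsd_risk:.2f}")
for name, v in log_var.items():
    print(f"  {name}: {100 * v / total_var:.0f}% of variance in ln(risk)")
```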

Journal Article

Abstract  Uncertainty in risk assessment results from a lack of knowledge about a substance's toxicity to the target population. Currently used deterministic risk assessment methods yield human limit values or margins of safety (MOS) without quantitative measurements of uncertainty. Qualitative and quantitative uncertainty analysis would enable risk managers to better judge the consequences of different management options. This article discusses sources of uncertainty and possibilities for quantification of uncertainty associated with different steps in the risk assessment of non-carcinogenic health effects. Knowledge gaps causing uncertainty in risk assessment are overcome by extrapolation. Distribution functions for extrapolation factors are based on empirical data and provide information about the extent of uncertainty introduced by these factors. Whereas deterministic methods can account only qualitatively for uncertainty of the resulting human limit value, probabilistic risk assessment methods are able to quantify several aspects of uncertainty. However, there is only limited experience with these methods in practice. Their acceptance and future application will depend on the establishment of evidence-based distribution functions, the flexibility and practicability of the methods, and the unambiguity of the results.

Journal Article

Abstract  Using probability plots and Maximum Likelihood Estimation (MLE), we fit lognormal distributions to data compiled by Ershow et al. for daily intake of total water and tap water by three groups of women (controls, pregnant, and lactating; all between 15 and 49 years of age) in the United States. We also develop bivariate lognormal distributions for the joint distribution of water ingestion and body weight for these three groups. Overall, we recommend the marginal distributions for water intake as fit by MLE for use in human health risk assessments.
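
For a lognormal distribution, the MLE reduces to the mean and standard deviation of the log-transformed observations; a sketch with made-up intake values (the Ershow et al. data are not reproduced here):

```python
import numpy as np
from scipy.stats import norm

# daily water intake in mL/day; values are invented for illustration
intake = np.array([950., 1200., 1500., 1100., 2100., 800., 1700., 1300.])

# MLEs for a lognormal sample: mean and (population) SD of the logs
logs = np.log(intake)
mu_hat, sigma_hat = logs.mean(), logs.std(ddof=0)

print(f"geometric mean  = {np.exp(mu_hat):.0f} mL/day")
print(f"geometric SD    = {np.exp(sigma_hat):.2f}")
print(f"fitted 95th pct = {np.exp(mu_hat + norm.ppf(0.95) * sigma_hat):.0f} mL/day")
```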

Journal Article

Abstract  An integrated, quantitative approach to incorporating both uncertainty and interindividual variability into risk prediction models is described. Individual risk R is treated as a variable distributed in both an uncertainty dimension and a variability dimension, whereas population risk I (the number of additional cases caused by R) is purely uncertain. I is shown to follow a compound Poisson-binomial distribution, which in low-level risk contexts can often be approximated well by a corresponding compound Poisson distribution. The proposed analytic framework is illustrated with an application to cancer risk assessment for a California population exposed to 1,2-dibromo-3-chloropropane from ground water.
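
A toy version of the two-dimensional structure described here; the potency and exposure distributions are assumptions chosen only to keep individual risks small, the regime in which Poisson(sum of individual risks) approximates the exact compound Poisson-binomial count:

```python
import numpy as np

rng = np.random.default_rng(1)
n_people = 10_000   # exposed population (variability dimension)
n_uncert = 1_000    # draws in the uncertainty dimension

cases = np.empty(n_uncert)
for j in range(n_uncert):
    potency = rng.lognormal(np.log(1e-5), 1.0)         # uncertain
    exposure = rng.lognormal(0.0, 0.7, size=n_people)  # variable
    indiv_risk = np.minimum(potency * exposure, 1.0)
    # compound Poisson approximation to the Poisson-binomial case count
    cases[j] = rng.poisson(indiv_risk.sum())

print(f"expected cases: {cases.mean():.2f}")
print(f"95% uncertainty interval: {np.percentile(cases, [2.5, 97.5])}")
```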

Journal Article

Abstract  Mathematical modelers from different disciplines and regulatory agencies worldwide agree on the importance of a careful sensitivity analysis (SA) of model-based inference. The most popular SA practice seen in the literature is that of 'one-factor-at-a-time' (OAT). This consists of analyzing the effect of varying one model input factor at a time while keeping all others fixed. While the shortcomings of OAT are known from the statistical literature, its widespread use among modelers raises concerns about the quality of the associated sensitivity analyses. The present paper introduces a novel geometric proof of the inefficiency of OAT, with the purpose of providing the modeling community with a convincing and possibly definitive argument against OAT. Alternatives to OAT are indicated which are based on statistical theory, drawing from experimental design, regression analysis and sensitivity analysis proper.
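
The geometric point can be reproduced in a few lines: OAT moves away from a baseline point along one axis at a time, so its design points stay inside the sphere inscribed in the unit hypercube, and that sphere's share of the hypercube's volume collapses as the number of factors k grows:

```python
from math import gamma, pi

def inscribed_sphere_fraction(k: int) -> float:
    # volume of a radius-1 k-ball divided by the volume of a side-2 k-cube
    return (pi ** (k / 2) / gamma(k / 2 + 1)) / 2 ** k

for k in (2, 3, 5, 10, 20):
    print(f"k = {k:>2}: OAT can explore at most "
          f"{inscribed_sphere_fraction(k):.2e} of the factor space")
```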

Journal Article

Abstract  Quantification of uncertainty associated with risk estimates is an important part of risk assessment. In recent years, the use of second-order distributions and two-dimensional simulations has been suggested for quantifying both variability and uncertainty. These approaches are better interpreted within the Bayesian framework. To help practitioners better use such methods and interpret the results, in this article, we describe propagation and interpretation of uncertainty in the Bayesian paradigm. We consider both the estimation problem, where some summary measures of the risk distribution (e.g., mean, variance, or selected percentiles) are to be estimated, and the prediction problem, where the risk values for some specific individuals are to be predicted. We discuss some connections and differences between uncertainties in estimation and prediction problems, and present an interpretation of a decomposition of total variability/uncertainty into variability and uncertainty in terms of expected squared error of prediction and its reduction from perfect information. We also discuss the role of Monte Carlo methods in characterizing uncertainty. We explain the basic ideas using a simple example, and demonstrate Monte Carlo calculations using another example from the literature.
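
A minimal two-dimensional Monte Carlo sketch of the estimation problem discussed above, with invented distributions: the outer loop draws uncertain parameters, the inner loop draws interindividual variability, and each outer draw yields one candidate value of the summary measure (here a 95th percentile):

```python
import numpy as np

rng = np.random.default_rng(2)
n_outer, n_inner = 500, 2_000

p95 = np.empty(n_outer)
for j in range(n_outer):
    mu = rng.normal(0.0, 0.2)      # uncertain mean of ln(exposure)
    sigma = rng.uniform(0.5, 0.9)  # uncertain SD of ln(exposure)
    exposure = rng.lognormal(mu, sigma, size=n_inner)  # variability
    p95[j] = np.percentile(exposure, 95)

print(f"95th-percentile exposure: median estimate {np.median(p95):.2f}, "
      f"90% uncertainty interval {np.percentile(p95, [5, 95]).round(2)}")
```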

Journal Article

Abstract  In 1987, James and Knuiman published their analysis of a comprehensive domestic water use study conducted in Perth, Western Australia to quantify the components of water usage in approximately 3000 households. This manuscript corrects errors and omissions about James and Knuiman's study in the U.S. EPA's Exposure Factors Handbook, and it shows James and Knuiman's results in a form and notation more readily used in Monte Carlo simulations.

Journal Article

Abstract  Practitioners and consumers of risk assessments often wonder whether the trend toward more complex risk models, incorporating increasing amounts of biological knowledge and increasing numbers of biologically interpretable parameters, actually leads to better risk estimates. A contrary view might be that the need to estimate more uncertain quantities undermines the advantages of greater descriptive realism so much that the final risk estimates are less certain than the ones traditionally obtained from simpler, less realistic, statistical curve-fitting models. In opposition to this pessimistic view is the widespread common-sense notion that including more information in a risk model can never worsen (and will usually improve) the resulting risk estimates. This paper appeals to mathematical arguments to resolve these conflicting intuitions. First, it emphasizes the fact that risk depends on multiple inputs only through a small number of reduced quantities — perhaps on only one, which would then be defined as internal dose. Thus, uncertainty about risk may have limited sensitivity to uncertainties in the original input quantities. The concept of internal dose and its possible algebraic relations to the original input quantities are clarified using concepts from dimensional analysis. Then, the question of whether greater model complexity leads to better or worse risk estimates is addressed in an information-theoretic framework, using entropies of probability distributions to quantify uncertainties. Within this framework, it is shown that models with greater intrinsic or structural complexity (meaning complexity that can not be eliminated by reformulating the model in terms of its reduced quantities) lead to better-informed, and hence more certain (lower-entropy) risk estimates. The compatibility of this result with results from decision theory, in which expected loss rather than entropy is used as a criterion, is discussed.

Journal Article

Abstract  A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly complementary) methods commonly used in uncertainty assessment and characterisation: data uncertainty engine (DUE), error propagation equations, expert elicitation, extended peer review, inverse modelling (parameter estimation), inverse modelling (predictive uncertainty), Monte Carlo analysis, multiple model simulation, NUSAP, quality assurance, scenario analysis, sensitivity analysis, stakeholder involvement and uncertainty matrix. The applicability of these methods has been mapped according to purpose of application, stage of the modelling process and source and type of uncertainty addressed. It is concluded that uncertainty assessment is not just something to be added after the completion of the modelling work. Instead uncertainty should be seen as a red thread throughout the modelling study starting from the very beginning, where the identification and characterisation of all uncertainty sources should be performed jointly by the modeller, the water manager and the stakeholders.

Journal Article

Abstract  A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two-part article. This Part 2 article discusses sensitivity and uncertainty analyses conducted to assess the key model inputs and areas of needed research for children's exposure to CCA-treated playsets and decks. The following types of analyses were conducted: (1) sensitivity analyses using a percentile scaling approach and multiple stepwise regression; and (2) uncertainty analyses using the bootstrap and two-stage Monte Carlo techniques. The five most important variables, based on both sensitivity and uncertainty analyses, were: wood surface residue-to-skin transfer efficiency; wood surface residue levels; fraction of hand surface area mouthed per mouthing event; average fraction of nonresidential outdoor time a child plays on/around CCA-treated public playsets; and frequency of hand washing. In general, parameter uncertainty produced a factor of 8 spread in predicted population dose estimates at the 5th and 95th percentiles, and a factor of 4 spread at the 50th percentile. Data were available for most of the key model inputs identified with sensitivity and uncertainty analyses; however, there were few or no data for some key inputs. To evaluate and improve the accuracy of model results, future measurement studies should obtain longitudinal time-activity diary information on children, spatial and temporal measurements of residue and soil concentrations on or near CCA-treated playsets and decks, and key exposure factors. Future studies should also address other sources of uncertainty in addition to parameter uncertainty, such as scenario and model uncertainty.
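
One of the sensitivity techniques named above, regression on Monte Carlo output, can be illustrated on a toy multiplicative dose model (a simplified, non-stepwise version of the idea). The input names echo the paper's influential variables, but the distributions and the model itself are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
inputs = {
    "residue-to-skin transfer efficiency": rng.lognormal(np.log(0.2), 0.5, n),
    "wood surface residue level":          rng.lognormal(np.log(1.0), 0.9, n),
    "hand-mouthing frequency":             rng.lognormal(np.log(5.0), 0.4, n),
}
dose = np.prod(list(inputs.values()), axis=0)  # toy model: pure product

# standardized regression coefficients of ln(dose) on each ln(input);
# exact for a product model, an approximation for a real exposure model
y = np.log(dose)
for name, x in inputs.items():
    lx = np.log(x)
    beta = np.cov(lx, y, ddof=0)[0, 1] / (lx.std() * y.std())
    print(f"{name:>37}: standardized coefficient {beta:.2f}")
```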

Technical Report

Abstract  In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented. (ERA citation 13:047344)

Journal Article

Abstract  Water ingestion estimates are important for the assessment of risk to human populations of exposure to water-borne pollutants. This paper reports mean and percentile estimates of the distributions of daily average per capita water ingestion for a number of age range groups. The age ranges, based on guidance from the US EPA's Risk Assessment Forum, are narrow for younger ages when development is rapid and wider for older ages when the rate of development decreases. Estimates are based on data from the United States Department of Agriculture's (USDA's) 1994-1996 and 1998 Continuing Survey of Food Intake by Individuals (CSFII). Water ingestion estimates include water ingested directly as a beverage and water added to foods and beverages during preparation at home or in local establishments. Water occurring naturally in foods or added by manufacturers to commercial products (beverage or food) is not included. Estimates are reported in milliliters (ml/person/day) and milliliters per kilogram of body weight (ml/kg/day). As a by-product of constructing estimates in terms of body weight of respondents, distributions of self-reported body weights based on the CSFII were estimated and are also reported here.

Technical Report

Abstract  The most common basic definition of risk assessment used within the U.S. Environmental Protection Agency (EPA) is paraphrased from the 1983 report Risk Assessment in the Federal Government: Managing the Process (NRC, 1983), by the National Academy of Sciences’ (NAS’s) National Research Council (NRC): Risk assessment is a process in which information is analyzed to determine if an environmental hazard might cause harm to exposed persons and ecosystems. This process is highly interdisciplinary in that it draws from such diverse fields as biology, toxicology, ecology, engineering, geology, statistics, and the social sciences to create a rational framework for evaluating environmental hazards. While this definition has been somewhat enhanced and elaborated upon through subsequent NAS writings, it still basically describes risk assessment as it is performed within EPA. EPA uses risk assessment as a tool to integrate exposure and health effects or ecological effects information into a characterization of the potential for health hazards in humans or other hazards to our environment.

Journal Article

Abstract  The use of uncertainty factors in the standard method for deriving acceptable intake or exposure limits for humans, such as the Reference Dose (RfD), may be viewed as a conservative method of taking various uncertainties into account. As an obvious alternative, the use of uncertainty distributions instead of uncertainty factors is gaining attention. This paper presents a comprehensive discussion of a general framework that quantifies both the uncertainties in the no-adverse-effect level in the animal (using a benchmark-like approach) and the uncertainties in the various extrapolation steps involved (using uncertainty distributions). This approach results in an uncertainty distribution for the no-adverse-effect level in the sensitive human subpopulation, reflecting the overall scientific uncertainty associated with that level. A lower percentile of this distribution may be regarded as an acceptable exposure limit (e.g., RfD) that takes account of the various uncertainties in a nonconservative fashion. The same methodology may also be used as a tool to derive a distribution for possible human health effects at a given exposure level. We argue that in a probabilistic approach the uncertainty in the estimated no-adverse-effect level in the animal should be explicitly taken into account. Not only is this source of uncertainty too large to be ignored, it also has repercussions for the quantification of the other uncertainty distributions.
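
A sketch of the probabilistic replacement for fixed uncertainty factors (every distribution below is an illustrative assumption, not a value from this paper): each extrapolation step gets its own uncertainty distribution, the draws are combined, and a lower percentile of the resulting human no-adverse-effect-level distribution plays the role of the RfD:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# uncertain animal no-adverse-effect level (benchmark-dose-like), mg/kg-day
animal_nael = rng.lognormal(np.log(10.0), 0.3, n)
# uncertainty distributions replacing the fixed 10-fold factors
interspecies = rng.lognormal(np.log(4.0), 0.5, n)
intraspecies = rng.lognormal(np.log(3.0), 0.5, n)

human_nael = animal_nael / (interspecies * intraspecies)
print(f"median human no-adverse-effect level: {np.median(human_nael):.3f} mg/kg-day")
print(f"5th percentile (candidate probabilistic RfD): {np.percentile(human_nael, 5):.3f} mg/kg-day")
```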

Technical Report

Abstract  The United States Environmental Protection Agency (U.S. EPA) generated the estimates in this report in response to legislative mandates in the Safe Drinking Water Act Amendments of 1996. These mandates require up-to-date information on water ingestion to identify subpopulations at elevated risk of health effects from exposure to contaminants in drinking water. The estimates also support characterization of health risks to sensitive populations from contaminants in drinking water. The estimates in this document characterize the empirical distributions of 2-day average per capita ingestion of water for specific subpopulations. Subpopulation estimates apply to demographic categories, but do not distinguish individuals with a history of serious illness or with lifestyles that affect water ingestion.

Journal Article

Abstract  This article reviews some of the current guidance concerning the separation of variability and uncertainty in presenting the results of human health and ecological risk assessments. Such guidance and some of the published examples of its implementation using two-stage Monte Carlo simulation methods have not emphasized the fact that there is considerable judgment involved in determining which input parameters can be modeled as purely variable or purely uncertain, and which require explicit treatment in both dimensions. Failure to discuss these choices leads to confusion and misunderstanding of the proposed methods. We conclude with an example illustrating some of the reasoning and statistical calculations that might be used to inform such choices.

Journal Article

Abstract  OBJECTIVES: This study used Monte Carlo (MC) simulation to examine the influence of uncertainty on an exposure model and to determine whether a difference exists between two worker groups in a ceramic fiber manufacturing plant.

METHODS: Data on work practices and conditions were gathered in interviews with long-serving employees. With the use of previously developed deterministic modeling techniques and likely distributions for model parameters, MC simulations generated exposure profiles for the two job titles.

RESULTS: The exposure profiles overlapped considerably, although the average estimated exposure for one job was approximately double that of the other. However, when the correlation between the model parameters in the two jobs was considered, it was concluded that there was a significant difference between the two estimates.

CONCLUSIONS: Models are increasingly being used to estimate exposure. Different work situations inevitably result in different exposure estimates. However, it is difficult to determine whether such differences in estimated exposure between worker groups are simply the result of uncertainty with respect to the model parameters or whether they reflect real differences between occupational groups. This study demonstrates the value of MC simulation in helping define the uncertainty in deterministic model estimates.
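
A toy illustration of the correlation point in the conclusions: when the two jobs share uncertain model parameters, the comparison should be made within paired Monte Carlo draws (for example, via the ratio), not by the overlap of the two marginal exposure distributions. All parameters here are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# uncertainty shared by both jobs (same plant, same model structure)
shared = rng.lognormal(np.log(1.0), 0.8, n)
# job-specific multipliers
job_a = shared * rng.lognormal(np.log(2.0), 0.3, n)
job_b = shared * rng.lognormal(np.log(1.0), 0.3, n)

ratio = job_a / job_b  # the shared uncertainty cancels in the pairing
print(f"P(job A exposure > job B) across draws: {(ratio > 1).mean():.3f}")
print(f"90% interval for the A:B ratio: {np.percentile(ratio, [5, 95]).round(2)}")
```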

Journal Article

Abstract  Based on results reported from the NHANES II Survey (the National Health and Nutrition Examination Survey II) for people living in the United States during 1976-1980, we use exploratory data analysis, probability plots, and the method of maximum likelihood to fit lognormal distributions to percentiles of body weight for males and females as a function of age from 6 months through 74 years. The results are immediately useful in probabilistic (and deterministic) risk assessments.
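
When only percentiles are published, a probability-plot fit does the job: regressing the log of the reported percentile values on the corresponding standard-normal quantiles recovers the lognormal parameters as intercept and slope. The body weights below are invented, not NHANES II values:

```python
import numpy as np
from scipy.stats import norm, linregress

pcts = np.array([5, 10, 25, 50, 75, 90, 95]) / 100
weights_kg = np.array([51., 55., 62., 70., 80., 91., 99.])  # hypothetical

z = norm.ppf(pcts)
fit = linregress(z, np.log(weights_kg))
mu, sigma = fit.intercept, fit.slope

print(f"geometric mean = {np.exp(mu):.1f} kg")
print(f"geometric SD   = {np.exp(sigma):.2f}")
print(f"probability-plot r^2 = {fit.rvalue ** 2:.3f}")
```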

Journal Article

Abstract  Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems – Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU). The 3MRA model encompasses 966 multi-dimensional input variables, over 185 of which are explicitly stochastic. Design of SuperMUSE, a 215 GHz PC-based, Windows-based Supercomputer for Model Uncertainty and Sensitivity Evaluation, is described. Developed for 3MRA and extendable to other computer models, an accompanying platform-independent, Java-based parallel processing software toolset is also discussed. For 3MRA, comparison of stand-alone PC versus SuperMUSE simulation executions showed a parallel computing overhead of only 0.57 seconds/simulation, a relative cost increase of 0.7% over average model runtime. Parallel computing software tools represent a critical aspect of exploiting the capabilities of such modeling systems. The Java toolset developed here readily handled machine and job management tasks over the Windows cluster, and is currently capable of completing over 3 million 3MRA model simulations per month on SuperMUSE. Preliminary work is reported for an example uncertainty analysis of benzene disposal that describes the relative importance of various exposure pathways in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 facilities across the United States representing 419 site-WMU combinations.
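
The workload such a cluster handles is embarrassingly parallel; a standard-library stand-in (not the Java toolset described above) shows the pattern of farming out independent model realizations and pooling the results:

```python
import numpy as np
from multiprocessing import Pool

def run_batch(seed: int) -> np.ndarray:
    """Stand-in for a batch of independent 3MRA-style model runs:
    returns sampled risk values from a toy lognormal model."""
    rng = np.random.default_rng(seed)
    return rng.lognormal(-12.0, 1.5, size=1_000)

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        batches = pool.map(run_batch, range(8))  # 8 independent batches
    risks = np.concatenate(batches)
    print(f"{risks.size} simulations, 95th-percentile risk = "
          f"{np.percentile(risks, 95):.2e}")
```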

Journal Article

Abstract  This guest editorial is a summary of the NCSU/USDA Workshop on Sensitivity Analysis held June 11-12, 2001 at North Carolina State University and sponsored by the U.S. Department of Agriculture's Office of Risk Assessment and Cost Benefit Analysis. The objective of the workshop was to learn across disciplines in identifying, evaluating, and recommending sensitivity analysis methods and practices for application to food-safety process risk models. The workshop included presentations regarding the Hazard Analysis and Critical Control Points (HACCP) framework used in food-safety risk assessment, a survey of sensitivity analysis methods, invited white papers on sensitivity analysis, and invited case studies regarding risk assessment of microbial pathogens in food. Based on the sharing of interdisciplinary information represented by the presentations, the workshop participants, divided into breakout sessions, responded to three trigger questions: What are the key criteria for sensitivity analysis methods applied to food-safety risk assessment? What sensitivity analysis methods are most promising for application to food safety and risk assessment? and What are the key needs for implementation and demonstration of such methods? The workshop produced agreement regarding key criteria for sensitivity analysis methods and the need to use two or more methods to try to obtain robust insights. Recommendations were made regarding a guideline document to assist practitioners in selecting, applying, interpreting, and reporting the results of sensitivity analysis.

Journal Article

Abstract  A major challenge for drug development and environmental or occupational health is the prediction of pharmacokinetic and pharmacodynamic interactions between drugs, natural chemicals or environmental contaminants. This article briefly reviews past developments in the area of physiologically based pharmacokinetic (PBPK) modelling of interactions. It also demonstrates a systems biology approach to the question, and the capabilities of new software tools to facilitate that development. Individual Systems Biology Markup Language models of metabolic pathways can now be automatically merged and coupled to a template PBPK model, using for example the GNU MCSim software. The global model generated is very efficient and able to simulate the interactions between a theoretically unlimited number of substances. Development time and the number of model parameters increase only linearly with the number of substances considered, even though the number of possible interactions increases exponentially.
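
The kind of metabolic interaction a merged model captures can be sketched without MCSim or SBML: two substances competing for one enzyme in a single compartment. The Michaelis-Menten parameters are invented; the point is that each added substance adds parameters linearly, while the shared denominator generates all pairwise interactions:

```python
import numpy as np
from scipy.integrate import solve_ivp

vmax, km_a, km_b = 10.0, 1.0, 2.0  # shared enzyme; per-substance Km values

def rhs(t, conc):
    a, b = conc
    denom = 1.0 + a / km_a + b / km_b  # competitive-inhibition term
    return [-vmax * (a / km_a) / denom,
            -vmax * (b / km_b) / denom]

sol = solve_ivp(rhs, (0.0, 2.0), [5.0, 5.0])
print("concentrations at t = 2 h:", sol.y[:, -1].round(3))
```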
