Exposure Factors Handbook (Post 2011)

Project ID: 1854

Category: Other

Added on: April 3, 2012, 9:48 a.m.

Technical Report

Abstract  The importance of adequately characterizing variability and uncertainty in risk assessments has been emphasized in several science and policy documents. These include the 1992 U.S. Environmental Protection Agency (EPA) Exposure Assessment Guidelines, the 1992 EPA Risk Assessment Council (RAC) Guidance, the 1995 EPA Policy for Risk Characterization, the EPA Proposed Guidelines for Ecological Risk Assessment, the EPA Region 3 Technical Guidance Manual on Risk Assessment, the EPA Region 8 Superfund Technical Guidance, the 1994 National Academy of Sciences "Science and Judgment in Risk Assessment," and the report by the Commission on Risk Assessment and Risk Management. As part of the implementation of the recommendations contained in these reports, the Agency is issuing guidance on the appropriate use of probabilistic techniques for analyzing variability and uncertainty in Agency risk assessments. This policy and the attached guiding principles are designed to support the use of various techniques for characterizing variability and uncertainty. Further, the policy defines a set of Conditions for Acceptance. These conditions are important for ensuring good scientific practice in quantifying uncertainty and variability. In accordance with EPA's 1995 Policy for Risk Characterization, this policy also emphasizes the importance of clarity, transparency, reasonableness, and consistency in risk assessments. There are a variety of different methods for characterizing uncertainty and variability. These methods cover a broad range of complexity, from the simple comparison of discrete points to probabilistic techniques like Monte Carlo analysis. Recently, interest in using Monte Carlo analysis for risk assessment has increased. This method has the advantage of allowing the analyst to account for relationships between input variables and of providing the flexibility to investigate the effects of different modeling assumptions. Experience has shown that to benefit fully from the advantages of such probabilistic techniques as Monte Carlo analysis, certain standards of practice are to be observed. The Agency is therefore issuing this policy statement and associated guiding principles. While Monte Carlo analysis is the most frequently encountered probabilistic tool for analyzing variability and uncertainty in risk assessments, the intent of this policy is not to indicate that Monte Carlo analysis is the only acceptable approach for Agency risk assessments. The spirit of this policy and the Conditions for Acceptance described herein are equally applicable to other methods for analyzing variability and uncertainty.
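
As a concrete illustration of the kind of probabilistic analysis the policy covers, the sketch below propagates a few hypothetical input distributions through a simple intake equation with Monte Carlo sampling and reports percentiles of the resulting dose distribution. The distributions, parameter values, and the intake equation itself are illustrative assumptions, not values drawn from the policy.

    # Minimal Monte Carlo sketch: propagate assumed input distributions through a
    # simple average-daily-dose equation and summarize the output as percentiles.
    # All distributions and parameter values are illustrative, not EPA defaults.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n = 100_000

    conc = rng.lognormal(np.log(2.0), 0.5, n)           # mg/L, hypothetical water concentration
    intake = rng.lognormal(np.log(1.4), 0.3, n)         # L/day, hypothetical ingestion rate
    weight = rng.normal(70.0, 12.0, n).clip(min=30.0)   # kg, body weight

    dose = conc * intake / weight                       # mg/kg-day

    for q in (0.50, 0.90, 0.95, 0.99):
        print(f"{int(q * 100)}th percentile: {np.quantile(dose, q):.4f} mg/kg-day")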

Technical Report

Abstract  Exposure assessments, except those based upon measured exposure levels for a probability sample of population members, rely upon a model to predict exposure. The model may be any mathematical function that estimates the population distribution of exposure or an individual's exposure as a function of one or more input variables. Whenever a model that has not been validated is used as the basis for an exposure assessment, the uncertainty associated with the exposure assessment may be substantial. The primary characterization of uncertainty is at least partly qualitative in this case, i.e., it includes a description of the assumptions inherent in the model and their justification. Plausible alternative models should be discussed. Sensitivity of the exposure assessment to model formulation can be investigated by replicating the assessment for plausible alternative models. When an exposure assessment is based upon directly measured exposure levels for a probability sample of population members, uncertainty can be greatly reduced and described quantitatively. The primary sources of uncertainty are measurement errors and sampling errors. A quality assurance program should be designed into the study to ensure that the magnitude of measurement errors can be estimated. The effects of all sources of random error should be measured quantitatively.
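
The model-sensitivity point above can be illustrated with a small sketch: the same sampled inputs are run through two plausible alternative model formulations and the resulting exposure distributions are compared. Both model forms and every parameter value are hypothetical, chosen only to show the mechanics of the comparison.

    # Replicate a simple inhalation exposure assessment under two alternative model
    # formulations and compare the predicted distributions. Hypothetical throughout.
    import numpy as np

    rng = np.random.default_rng(seed=2)
    air_conc = rng.lognormal(np.log(0.05), 0.6, 50_000)        # mg/m^3, assumed

    # Model A: fixed inhalation rate (m^3/day) and body weight (kg).
    dose_a = air_conc * 20.0 / 70.0                            # mg/kg-day
    # Model B: inhalation rate treated as variable across the population.
    inh_rate = rng.normal(16.0, 3.0, air_conc.size).clip(min=8.0)
    dose_b = air_conc * inh_rate / 70.0

    for name, dose in (("model A", dose_a), ("model B", dose_b)):
        print(f"{name}: median {np.median(dose):.4f}, 95th {np.quantile(dose, 0.95):.4f} mg/kg-day")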

Technical Report

Abstract  The Guidelines for Exposure Assessment describe the general concepts of exposure assessment, including definitions and associated units, and provide guidance on planning and conducting an exposure assessment. Guidance is also provided on presenting the results of the exposure assessment and characterizing uncertainty. Although these Guidelines focus on exposures of humans to chemical substances, much of the guidance also pertains to assessing wildlife exposure to chemicals, or human exposures to biological, noise, or radiological agents. The Guidelines include a glossary that helps standardize terminology used by the Agency in exposure assessment. They emphasize that exposure assessments done as part of a risk assessment need to consider the hazard identification and dose-response parts of the risk assessment in the planning stages of the exposure assessment so that these three parts can be smoothly integrated into the risk characterization. The Guidelines discuss and reference a number of approaches and tools for exposure assessment, along with guidance on their appropriate use. The Guidelines also stress that exposure estimates, along with supporting information, will be fully presented in Agency risk assessment documents, and that Agency scientists will identify the strengths and weaknesses of each assessment by describing uncertainties, assumptions, and limitations, as well as the scientific basis and rationale for each assessment.

Technical Report

Abstract  RAGS Volume 3: Part A addresses the technical and policy issues associated with the use of probabilistic risk assessment (PRA) in the EPA Superfund program. This guidance builds upon basic concepts of risk assessment outlined in RAGS Volume I (U.S. EPA, 1989a; 2001), recent guidance for ecological risk assessment (U.S. EPA, 1992, 1994, 1997a, 1998a; 1999), and the Agency Probabilistic Analysis Policy document (U.S. EPA, 1997b). RAGS Volume 3: Part A addresses the use of PRA for both human health and ecological risk assessments. RAGS Volume 3: Part A was developed to provide risk assessors and risk managers with basic guidelines for incorporating PRA into Superfund site-specific risk assessments. It is not intended to be a detailed technical reference on PRA methods; however, it does direct the reader to appropriate literature on important technical subjects. A primary purpose of RAGS Volume 3: Part A is to help prevent misuse and misinterpretation of PRA.

Book/Book Chapter

Abstract  Risk assessment has become a dominant public policy tool for making choices, based on limited resources, to protect public health and the environment. It has been instrumental to the mission of the U.S. Environmental Protection Agency (EPA) as well as other federal agencies in evaluating public health concerns, informing regulatory and technological decisions, prioritizing research needs and funding, and in developing approaches for cost-benefit analysis. However, risk assessment is at a crossroads. Despite advances in the field, risk assessment faces a number of significant challenges, including lengthy delays in making complex decisions; lack of data leading to significant uncertainty in risk assessments; and many chemicals in the marketplace that have not been evaluated and emerging agents requiring assessment. Science and Decisions makes practical scientific and technical recommendations to address these challenges. This book is a complement to the widely used 1983 National Academies book, Risk Assessment in the Federal Government (also known as the Red Book). The earlier book established a framework for the concepts and conduct of risk assessment that has been adopted by numerous expert committees, regulatory agencies, and public health institutions. The new book embeds these concepts within a broader framework for risk-based decision-making. Together, these are essential references for those working in the regulatory and public health fields.

Journal Article

Abstract  This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts.

Journal Article

Abstract  Standard statistical methods understate the uncertainty one should attach to effect estimates obtained from observational data. Among the methods used to address this problem are sensitivity analysis, Monte Carlo risk analysis (MCRA), and Bayesian uncertainty assessment. Estimates from MCRAs have been presented as if they were valid frequentist or Bayesian results, but examples show that they need not be either in actual applications. It is concluded that both sensitivity analyses and MCRA should begin with the same type of prior specification effort as Bayesian analysis.

Journal Article

Abstract  Addressing human variability and sensitive subpopulations is one of the challenges of risk assessment and is an important aspect of the Food Quality Protection Act, the law passed in 1996 that regulates food use pesticides in the United States. The intraspecies uncertainty factor is intended to address differences in susceptibility within the human population. This paper examines the history and scientific basis for the intraspecies uncertainty factor. Our best source of knowledge about human variability in the response to chemicals comes from clinical trials of pharmaceuticals. This large body of data allows both qualitative and quantitative characterization of variability in pharmacokinetic and pharmacodynamic parameters in the general population and in subgroups such as children. The preponderance of evidence in the areas of pharmacodynamics and pharmacokinetics supports the routine use of an intraspecies uncertainty factor in the range of 1-10 as being protective of greater than 99% of the human population. The intraspecies uncertainty factor is highly protective of various subpopulations, including infants and children.
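
For orientation, the intraspecies factor discussed above typically enters a reference-value calculation alongside an interspecies factor; the worked example below uses assumed numbers (not values from the paper) purely to show where the factor sits.

    % Illustrative only: assumed NOAEL of 10 mg/kg-day, interspecies factor UF_A = 10,
    % intraspecies (human variability) factor UF_H = 10.
    \mathrm{RfD} = \frac{\mathrm{NOAEL}}{UF_A \times UF_H}
                 = \frac{10\ \text{mg/kg-day}}{10 \times 10}
                 = 0.1\ \text{mg/kg-day}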

Journal Article

Abstract  We review briefly some examples that would support an extended role for quantitative sensitivity analysis in the context of model-based analysis (Section 1). We then review what features a quantitative sensitivity analysis needs to have to play such a role (Section 2). The methods that meet these requirements are described in Section 3; an example is provided in Section 4. Some pointers to further research are set out in Section 5.


Journal Article

Abstract  Tenfold uncertainty factors have been used in risk assessment for about 40 years to allow for species differences and inter-individual variability. Each factor has to allow for toxicokinetic and toxicodynamic differences. Subdividing the 10-fold factors into kinetic and dynamic defaults, which when multiplied give a product of 10, offers a number of advantages. A major advantage is that chemical-specific data can be introduced to replace one or more of the default subfactors, hence contributing to a chemical-related overall factor. Subdivision of the 10-fold factors also facilitates analysis of the appropriateness of the overall 10-fold defaults, and the development of a more refined approach to the use of uncertainty factors.
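
The arithmetic of the subdivision can be made explicit. The split below is the commonly cited default scheme (as in the WHO/IPCS chemical-specific adjustment factor framework); treat the exact subfactor values as context from that framework rather than from this abstract.

    % Default subdivision of the 10-fold factors into kinetic and dynamic components,
    % each pair multiplying back to 10; a chemical-specific value can replace either
    % subfactor while the remaining default is retained.
    10_{\text{interspecies}} = 10^{0.6} \times 10^{0.4} \approx 4.0\,(\text{toxicokinetics}) \times 2.5\,(\text{toxicodynamics})
    10_{\text{intraspecies}} = 10^{0.5} \times 10^{0.5} \approx 3.16\,(\text{toxicokinetics}) \times 3.16\,(\text{toxicodynamics})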

Journal Article

Abstract  A method for estimating long-term exposures from short-term measurements is validated using data from a recent EPA study of exposure to fine particles. The method was developed a decade ago, but long-term exposure data to validate it did not exist until recently. In this article, exposure data from repeated visits to 37 persons over 1 year (up to 28 measurements per person) are used to test the model. Both fine particle mass and elemental concentrations measured indoors, outdoors, and on the person are examined. To provide the most stringent test of the method, only two single-day distributions are randomly selected for each element to predict the long-term distributions. The precision of the method in estimating the long-term geometric mean and geometric standard deviation appears to be of the order of 10%, with no apparent bias. The precision in estimating the 99th percentile ranges from 19% to 48%, again without obvious bias. The precision can be improved by selecting a number of pairs of single-day distributions instead of just one pair. Occasionally, the method fails to provide an estimate for the long-term distribution. In that case, a repeat of the random selection procedure can provide an estimate. Although the method assumes a log-normal distribution, most of the distributions tested failed the chi-square test for log-normality; because the method nonetheless performed well, it appears suitable for application to distributions that depart from log-normality.
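
Because the method summarizes each long-term distribution by a geometric mean (GM) and geometric standard deviation (GSD), an upper percentile follows directly from the lognormal assumption; the sketch below shows that step with assumed GM/GSD values rather than the study's data.

    # Upper percentile of a lognormal distribution from its GM and GSD:
    # X_p = GM * GSD**z_p. The GM and GSD here are assumed, not from the study.
    from scipy.stats import norm

    gm, gsd = 12.0, 2.1                 # e.g. ug/m^3, illustrative values
    z99 = norm.ppf(0.99)                # ~2.326
    p99 = gm * gsd ** z99
    print(f"Estimated 99th percentile: {p99:.1f} ug/m^3")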

Book/Book Chapter

Abstract  Monte Carlo statistical methods, particularly those based on Markov chains, are now an essential component of the standard set of techniques used by statisticians. This new edition has been revised towards a coherent and flowing coverage of these simulation techniques, with incorporation of the most recent developments in the field. In particular, the introductory coverage of random variable generation has been totally revised, with many concepts being unified through a fundamental theorem of simulation. There are five completely new chapters that cover Monte Carlo control, reversible jump, slice sampling, sequential Monte Carlo, and perfect sampling. There is a more in-depth coverage of Gibbs sampling, which is now contained in three consecutive chapters. The development of Gibbs sampling starts with slice sampling and its connection with the fundamental theorem of simulation, and builds up to two-stage Gibbs sampling and its theoretical properties. A third chapter covers the multi-stage Gibbs sampler and its variety of applications. Lastly, chapters from the previous edition have been revised towards easier access, with the examples getting more detailed coverage. This textbook is intended for a second-year graduate course, but will also be useful to someone who either wants to apply simulation techniques for the resolution of practical problems or wishes to grasp the fundamental principles behind those methods. The authors do not assume familiarity with Monte Carlo techniques (such as random variable generation), with computer programming, or with any Markov chain theory (the necessary concepts are developed in Chapter 6). A solutions manual, which covers approximately 40% of the problems, is available for instructors who require the book for a course.
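
As a minimal illustration of the two-stage Gibbs sampling the book builds up to, the sketch below alternates draws from the full conditionals of a bivariate normal with correlation rho; it is a textbook toy, not an example taken from the book.

    # Two-stage Gibbs sampler for a standard bivariate normal with correlation rho:
    # each coordinate is drawn from its full conditional given the current other one.
    import numpy as np

    rng = np.random.default_rng(seed=3)
    rho, n_iter = 0.8, 10_000
    x = y = 0.0
    samples = np.empty((n_iter, 2))

    for i in range(n_iter):
        x = rng.normal(rho * y, np.sqrt(1.0 - rho ** 2))
        y = rng.normal(rho * x, np.sqrt(1.0 - rho ** 2))
        samples[i] = (x, y)

    kept = samples[1_000:]                          # discard burn-in
    print("sample correlation:", round(float(np.corrcoef(kept.T)[0, 1]), 3))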

Journal Article

Abstract  Using the Monte Carlo method and physiologically based pharmacokinetic modeling, an occupational inhalation exposure to trichloroethylene consisting of 7 h of exposure per day for 5 days was simulated in populations of men and women of 5000 individuals each. The endpoint of concern for occupational exposure was drowsiness. The toxicologic condition leading to drowsiness was assumed to be high levels of both trichloroethanol and trichloroethylene. Therefore, the output of the simulation or dose metric was the maximum value of the sum of the concentration of trichloroethylene in blood and the concentration of trichloroethanol within its volume of distribution occurring within 1 week of exposure. The distributions of the dose metric in the simulated populations were lognormal. To protect 99% of a worker population, a concentration of 30 ppm over a 7-h period of the work day should not be exceeded. Subjecting a susceptible individual (the 99th percentile of the dose metric) to 200 ppm (the ACGIH short-term exposure limit or STEL) for 15 min twice a day over a work week necessitates a 2.5-h rest in fresh air following the STEL exposure to allow the blood concentrations of trichloroethylene and trichloroethanol to drop to levels that would not cause drowsiness. Both the OSHA PEL and the ACGIH TLV are greater than the value of 30 ppm derived here. As well as suggesting a new occupational guidance value, this study provides an example of this method of guidance value derivation.
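
The population Monte Carlo step described above can be caricatured with a much simpler surrogate: sample kinetic parameters for simulated individuals, compute a crude dose metric, and compare its 99th percentile with a protection threshold. The surrogate model, parameter distributions, and threshold below are hypothetical and are not the study's PBPK model or values.

    # Crude stand-in for the Monte Carlo/PBPK step: propagate interindividual
    # variability to a dose metric and check its 99th percentile against a threshold.
    # Everything here (model form, distributions, threshold) is hypothetical.
    import numpy as np

    rng = np.random.default_rng(seed=4)
    n = 5_000                                          # simulated individuals
    exposure_ppm = 30.0

    blood_air = rng.lognormal(np.log(9.0), 0.2, n)     # blood:air partition coefficient, assumed
    clearance = rng.lognormal(np.log(1.0), 0.3, n)     # relative metabolic clearance, assumed

    dose_metric = exposure_ppm * blood_air / clearance # surrogate for peak blood concentration

    p99 = np.quantile(dose_metric, 0.99)
    threshold = 700.0                                  # hypothetical level linked to the effect
    print(f"99th-percentile dose metric: {p99:.0f} (threshold {threshold:.0f})")
    print("99% of the simulated population protected" if p99 <= threshold
          else "exposure concentration should be lowered")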

Book/Book Chapter

Abstract  In a family study of breast cancer, epidemiologists in Southern California increase the power for detecting a gene-environment interaction. In Gambia, a study helps a vaccination program reduce the incidence of Hepatitis B carriage. Archaeologists in Austria place a Bronze Age site in its true temporal location on the calendar scale. And in France, researchers map a rare disease with relatively little variation. Each of these studies applied Markov chain Monte Carlo methods to produce more accurate and inclusive results. General state-space Markov chain theory has seen several developments that have made it both more accessible and more powerful to the general statistician. Markov Chain Monte Carlo in Practice introduces MCMC methods and their applications, providing some theoretical background as well. The authors are researchers who have made key contributions in the recent development of MCMC methodology and its application. Considering the broad audience, the editors emphasize practice rather than theory, keeping the technical content to a minimum. The examples range from the simplest application, Gibbs sampling, to more complex applications. The first chapter contains enough information to allow the reader to start applying MCMC in a basic way. The following chapters cover main issues, important concepts and results, techniques for implementing MCMC, improving its performance, assessing model adequacy, choosing between models, and applications and their domains. Markov Chain Monte Carlo in Practice is a thorough, clear introduction to the methodology and applications of this simple idea with enormous potential. It shows the importance of MCMC in real applications, such as archaeology, astronomy, biostatistics, genetics, epidemiology, and image analysis, and provides an excellent base for MCMC to be applied to other fields as well.

Book/Book Chapter

Abstract  Incorporating new and updated information, this second edition of THE bestselling text in Bayesian data analysis continues to emphasize practice over theory, describing how to conceptualize, perform, and critique statistical analyses from a Bayesian perspective. Its world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include: -Stronger focus on MCMC -Revision of the computational advice in Part III -New chapters on nonlinear models and decision analysis -Several additional applied examples from the authors' recent research -Additional chapters on current models for Bayesian data analysis such as nonlinear models, generalized linear mixed models, and more -Reorganization of chapters 6 and 7 on model checking and data collection Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.

Journal Article

Abstract  A call for risk assessment approaches that better characterize and quantify uncertainty has been made by the scientific and regulatory community. This paper responds to that call by demonstrating a distributional approach that draws upon human data to derive potency estimates and to identify and quantify important sources of uncertainty. The approach is rooted in the science of decision analysis and employs an influence diagram, a decision tree, probabilistic weights, and a distribution of point estimates of carcinogenic potency. Its results estimate the likelihood of different carcinogenic risks (potencies) for a chemical under a specific scenario. For this exercise, human data on formaldehyde were employed to demonstrate the approach. Sensitivity analyses were performed to determine the relative impact of specific levels and alternatives on the potency distribution. The resulting potency estimates are compared with the results of an exercise using animal data on formaldehyde. The paper demonstrates that distributional risk assessment is readily adapted to situations in which epidemiologic data serve as the basis for potency estimates. Strengths and weaknesses of the distributional approach are discussed. Areas for further application and research are recommended.
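
A stripped-down sketch of the decision-tree mechanics may help: each branch carries a probability weight and a potency point estimate, and the weighted collection is the potency distribution. The branch labels, weights, and potency values below are invented for illustration and are not the formaldehyde results.

    # Hypothetical decision tree: branch -> (probability weight, potency per mg/kg-day).
    import numpy as np

    branches = {
        ("relative risk model", "linear low-dose"):    (0.40, 2.0e-3),
        ("relative risk model", "threshold low-dose"): (0.20, 0.0),
        ("absolute risk model", "linear low-dose"):    (0.25, 8.0e-4),
        ("absolute risk model", "threshold low-dose"): (0.15, 0.0),
    }

    weights = np.array([w for w, _ in branches.values()])
    potency = np.array([p for _, p in branches.values()])
    assert abs(weights.sum() - 1.0) < 1e-9             # weights form a probability distribution

    mean_potency = float(weights @ potency)
    order = np.argsort(potency)
    cdf = np.cumsum(weights[order])
    median_potency = potency[order][np.searchsorted(cdf, 0.5)]
    print(f"weighted mean potency {mean_potency:.1e}, median {median_potency:.1e}")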

Journal Article

Abstract  Identification and qualitative comparison of sensitivity analysis methods that have been used across various disciplines, and that merit consideration for application to food-safety risk assessment models, are presented in this article. Sensitivity analysis can help in identifying critical control points, prioritizing additional data collection or research, and verifying and validating a model. Ten sensitivity analysis methods, including four mathematical methods, five statistical methods, and one graphical method, are identified. The selected methods are compared on the basis of their applicability to different types of models, computational issues such as initial data requirement and complexity of their application, representation of the sensitivity, and the specific uses of these methods. Applications of these methods are illustrated with examples from various fields. No one method is clearly best for food-safety risk models. In general, use of two or more methods, preferably with dissimilar theoretical foundations, may be needed to increase confidence in the ranking of key inputs.

Technical Report

Abstract  The importance of adequately characterizing variability and uncertainty in fate, transport, exposure, and dose-response assessments for human health and ecological risk assessments has been emphasized in several U.S. Environmental Protection Agency (EPA) documents and activities. As a follow-up to these activities, EPA issued a Policy for Use of Probabilistic Analysis in Risk Assessment and preliminary guidance on using probabilistic analysis. The policy documents the EPA's position "that such probabilistic analysis techniques as Monte Carlo analysis, given adequate supporting data and credible assumptions, can be viable statistical tools for analyzing variability and uncertainty in risk assessments." The policy also establishes conditions that are to be satisfied by risk assessments that use probabilistic techniques. These conditions are in keeping with the Agency's risk characterization policy that requires clarity, consistency, transparency, and reproducibility in risk assessments. "Guiding Principles for Monte Carlo Analysis" (EPA/630/R-97/001) presents a general framework and broad set of principles important for ensuring good scientific practices. Many of the principles apply generally to the various techniques for conducting quantitative analyses of variability and uncertainty; however, the focus of the principles is on Monte Carlo analysis. EPA recognizes that quantitative risk assessment methods and quantitative variability and uncertainty analysis are undergoing rapid development. The guiding principles are intended to serve as a minimum set of principles and are not intended to constrain or prevent the use of new or innovative improvements where scientifically defensible.

Journal Article

Abstract  This paper presents estimates of daily average per capita fish consumption by age and gender for the 48 conterminous states. The estimated consumption rates are reported for three fish habitats: freshwater/estuarine fish, marine fish, and all fish. The estimates were generated from the combined 1989, 1990, and 1991 Continuing Survey of Food Intake by Individuals (CSFII), a national food consumption survey conducted by the United States Department of Agriculture (USDA). Point and interval estimates of per capita fish consumption were generated from the empirical distribution of daily average per capita consumption. The point estimates include the mean, 50th, 75th, 90th, 95th, and 99th percentiles. Ninety percent confidence intervals are provided for the estimated mean and 90% bootstrap intervals are provided for percentile estimates. Information in a recipe file provided by USDA was used to calculate the amount of fish in recipes which contain fish. The estimated consumption rates are based on the weight of fish in its prepared or "as consumed" condition. The estimated mean consumption rate for all fish for the U.S. population of the 48 conterminous states was 15.65 grams/person/day (C.I.:14.67-16.63) of which 4.71 grams/person/day (C.I.:4.17-5.25) was freshwater/estuarine fish and 10.94 grams/person/day (C.I.:10.14-11.73) was marine fish.
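
The bootstrap interval for an upper percentile works as sketched below; simulated intake values stand in for the CSFII records, and the survey weights used in the actual analysis are ignored here for simplicity.

    # 90% bootstrap interval for the 95th percentile of daily intake, using simulated
    # data in place of the CSFII survey records (survey weights not handled).
    import numpy as np

    rng = np.random.default_rng(seed=5)
    intake = rng.lognormal(np.log(8.0), 1.2, 3_000)     # g/person/day, simulated stand-in

    boot = np.array([
        np.quantile(rng.choice(intake, size=intake.size, replace=True), 0.95)
        for _ in range(2_000)
    ])
    lo, hi = np.quantile(boot, [0.05, 0.95])
    point = np.quantile(intake, 0.95)
    print(f"95th percentile {point:.1f} g/day, 90% bootstrap interval ({lo:.1f}, {hi:.1f})")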

Journal Article

Abstract  To conduct an initial exposure assessment for an airborne toxicant, industrial hygienists usually prefer air monitoring to mathematical modeling, even if only one exposure value is to be measured. This article argues that mathematical modeling may provide a more accurate (less uncertain) exposure estimate than monitoring if only a few air samples are to be collected, if anticipated exposure variability is high, and if information on exposure determinants is not too uncertain. To explore this idea, a hypothetical "true" distribution of 8-hour time-weighted average airborne exposure values, C, is posited based on a near-field (NF) exposure model. The C distribution is approximately lognormal. Estimation of the mean value, μC (the long-term average exposure level), is considered. Based on simple random sampling of workdays and use of the sample mean of C to estimate μC, accuracy (uncertainty) in the estimate is measured by the mean square error, MSE(C). In the alternative, a modeling estimate can be made using estimates of the mean chemical emission rate μG, the mean room dilution supply air rate μQ, and the mean dilution ventilation rate in the NF of the source, μβ. By positing uniform distributions for the estimates of μG, μQ, and μβ, an equation for the modeling mean square error MSE(μC) is presented. It is shown that for a sample size of three or fewer workdays, mathematical modeling rather than air monitoring should provide a more accurate estimate of μC if the anticipated geometric standard deviation for the C distribution exceeds 2.3.
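
A small simulation conveys the comparison being made: the mean square error of a three-sample monitoring estimate of μC versus that of a modeling estimate built from uncertain inputs. The "true" exposure distribution, the simplified G/Q model, and the uniform uncertainty ranges below are stand-ins, not the article's NF-model specification.

    # Monitoring vs. modeling: compare MSEs for estimating the long-term mean exposure.
    # All distributions and ranges are hypothetical stand-ins for the article's setup.
    import numpy as np

    rng = np.random.default_rng(seed=6)
    gm, gsd = 1.0, 2.5                                 # "true" C distribution (mg/m^3), GSD > 2.3
    mu_c = gm * np.exp(0.5 * np.log(gsd) ** 2)         # true long-term mean of the lognormal

    n_trials, n_days = 20_000, 3
    c = rng.lognormal(np.log(gm), np.log(gsd), (n_trials, n_days))
    mse_monitoring = np.mean((c.mean(axis=1) - mu_c) ** 2)

    # Modeling estimate: mean emission rate G (mg/min) divided by mean supply air rate
    # Q (m^3/min), each judged to lie within +/-30% of a central value (crude stand-in).
    g_hat = rng.uniform(0.7 * 15.0, 1.3 * 15.0, n_trials)
    q_hat = rng.uniform(0.7 * 10.0, 1.3 * 10.0, n_trials)
    mse_modeling = np.mean((g_hat / q_hat - mu_c) ** 2)

    print(f"true mean {mu_c:.2f}; MSE monitoring {mse_monitoring:.2f}, MSE modeling {mse_modeling:.2f}")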

Journal Article

Abstract  A statistical method using linear regression is shown for quantifying each variable's contribution to the uncertainty analysis in environmental health risk assessments. The method suggests that uncertainty analyses can be significantly simplified when a linear relationship can be established between risk or log(risk) and the independent variables.
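
The regression idea can be sketched directly: when log(risk) is approximately linear in the inputs, squared standardized regression coefficients apportion the output variance among them. The risk equation and input distributions below are hypothetical.

    # Regression-based uncertainty apportionment: regress standardized log(risk) on
    # standardized (log) inputs; squared coefficients approximate each input's share
    # of the output variance. Inputs and the risk equation are hypothetical.
    import numpy as np

    rng = np.random.default_rng(seed=7)
    n = 50_000
    conc = rng.lognormal(np.log(2.0), 0.6, n)
    intake = rng.lognormal(np.log(1.5), 0.3, n)
    bw = rng.lognormal(np.log(70.0), 0.2, n)

    log_risk = np.log(1e-3 * conc * intake / bw)      # multiplicative model -> linear in logs

    X = np.column_stack([np.log(conc), np.log(intake), np.log(bw)])
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (log_risk - log_risk.mean()) / log_risk.std()
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

    for name, b in zip(["concentration", "intake", "body weight"], beta):
        print(f"{name:13s} share of variance ≈ {b ** 2:.2f}")

Note that the squared coefficients sum to roughly one only when the inputs are independent and the (log-transformed) model is close to linear, which is the situation the paper's simplification addresses.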

Journal Article

Abstract  An essential facet of a risk assessment is the correct evaluation of uncertainties inherent in the numerical results. If the calculation is based on an explicit algebraic expression, an analytical treatment of error propagation is possible, usually as an approximation valid for small errors. In many instances, however, the errors are large and uncertain. It is the purpose of this paper to demonstrate that despite large errors, an analytical treatment is possible in many instances. These cases can be identified by an analysis of the algebraic structure and a detailed examination of the errors in input parameters and mathematical models. From a general formula, explicit formulas for some simple algebraic structures that occur often in risk assessments are derived and applied to practical problems.
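
One familiar example of the kind of explicit, structure-specific result meant here (stated from standard lognormal theory, not reproduced from the paper): for a purely multiplicative expression with independent lognormal factors, the uncertainty propagates exactly rather than only to first order.

    % For R = X_1 X_2 / X_3 with independent lognormal factors X_i, the geometric
    % standard deviations combine exactly:
    \ln^{2}\!\bigl(\mathrm{GSD}_{R}\bigr) \;=\; \sum_{i=1}^{3} \ln^{2}\!\bigl(\mathrm{GSD}_{X_i}\bigr)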

Book/Book Chapter

Abstract  This book is a ‘primer’ in global sensitivity analysis (SA). Its ambition is to enable the reader to apply global SA to a mathematical or computational model. It offers a description of a few selected techniques for sensitivity analysis, used for assessing the relative importance of model input factors. These techniques will answer questions of the type ‘which of the uncertain input factors is more important in determining the uncertainty in the output of interest?’ or ‘if we could eliminate the uncertainty in one of the input factors, which factor should we choose to reduce the variance of the output the most?’ Throughout this primer, the input factors of interest will be those that are uncertain, i.e. whose values lie within a finite interval of non-zero width. As a result, the reader will not find sensitivity analysis methods here that look at the local property of the input–output relationships, such as derivative-based analysis. Special attention is paid to the selection of the method, to the framing of the analysis and to the interpretation and presentation of the results. The examples will help the reader to apply the methods in a way that is unambiguous and justifiable, so as to make the sensitivity analysis an added value to model-based studies or assessments. Both diagnostic and prognostic uses of models will be considered (a description of these is in Chapter 2), and Bayesian tools of analysis will be applied in conjunction with sensitivity analysis. When discussing sensitivity with respect to factors, we shall interpret the term ‘factor’ in a very broad sense: a factor is anything that can be changed in a model prior to its execution. This also includes structural or epistemic sources of uncertainty. To give an example, factors will be presented in applications that are in fact ‘triggers’, used to select one model structure versus another, one mesh size versus another, or altogether different conceptualisations of a system.
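
The ‘which factor would reduce the output variance most’ question corresponds to the first-order sensitivity index S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates it crudely by binning on each factor; the test model y = x1 + 2*x2 + noise is purely illustrative and is not an example from the book.

    # Crude first-order (variance-based) sensitivity indices via conditional means:
    # S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning the factor into quantile bins.
    import numpy as np

    rng = np.random.default_rng(seed=8)
    n, n_bins = 200_000, 50
    x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
    y = x1 + 2.0 * x2 + rng.normal(0, 0.2, n)          # illustrative test model

    def first_order_index(x, y, n_bins=50):
        bins = np.quantile(x, np.linspace(0, 1, n_bins + 1))
        idx = np.clip(np.digitize(x, bins[1:-1]), 0, n_bins - 1)
        cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
        return cond_means.var() / y.var()

    print("S_x1 ≈ %.2f, S_x2 ≈ %.2f" % (first_order_index(x1, y), first_order_index(x2, y)))

For this additive test model the indices should come out near 0.18 and 0.73, reflecting that fixing x2 would shrink the output variance far more than fixing x1.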

Book/Book Chapter

Abstract  Probabilistic risk analysis aims to quantify the risk caused by high technology installations. Increasingly, such analyses are being applied to a wider class of systems in which problems such as lack of data, complexity of the systems, uncertainty about consequences, make a classical statistical analysis difficult or impossible. The authors discuss the fundamental notion of uncertainty, its relationship with probability, and the limits to the quantification of uncertainty. Drawing on extensive experience in the theory and applications of risk analysis, the authors focus on the conceptual and mathematical foundations underlying the quantification, interpretation and management of risk. They cover standard topics as well as important new subjects such as the use of expert judgement and uncertainty propagation. The relationship of risk analysis with decision making is highlighted in chapters on influence diagrams and decision theory. Finally, the difficulties of choosing metrics to quantify risk, and current regulatory frameworks are discussed.
