Third Biofuels Report to Congress

Project ID: 2779
Category: Other
Added on: Nov. 21, 2018, 10:12 a.m.

Journal Article

Abstract  There is widespread potential for human exposure to disinfection byproducts (DBPs) in drinking water because everyone drinks, bathes, cooks, and cleans with water. The need for clean and safe water led the U.S. Congress to pass the Safe Drinking Water Act more than 20 years ago in 1974. In 1976, chloroform, a trihalomethane (THM) and a principal DBP, was shown to be carcinogenic in rodents. This prompted the U.S. Environmental Protection Agency (U.S. EPA) in 1979 to develop a drinking water rule that would provide guidance on the levels of THMs allowed in drinking water. Further concern was raised by epidemiology studies suggesting a weak association between the consumption of chlorinated drinking water and the occurrence of bladder, colon, and rectal cancer. In 1992 the U.S. EPA initiated a negotiated rulemaking to evaluate the need for additional controls for microbial pathogens and DBPs. The goal was to develop an approach that would reduce the level of exposure from disinfectants and DBPs without undermining the control of microbial pathogens. The product of these deliberations was a proposed stage 1 DBP rule. It was agreed that additional information was necessary on how to optimize the use of disinfectants while maintaining control of pathogens before further controls to reduce exposure beyond stage 1 were warranted. In response to this need, the U.S. EPA developed a 5-year research plan to support the development of the longer term rules to control microbial pathogens and DBPs. A considerable body of toxicologic data has been developed on DBPs that occur in the drinking water, but the main emphasis has been on THMs. Given the complexity of the problem and the need for additional data to support the drinking water DBP rules, the U.S. EPA, the National Institute of Environmental Health Sciences, and the U.S. Army are working together to develop a comprehensive biologic and mechanistic DBP database. Selected DBPs will be tested using 2-year toxicity and carcinogenicity studies in standard rodent models; transgenic mouse models and small fish models; in vitro mechanistic and toxicokinetic studies; and reproductive, immunotoxicity, and developmental studies. The goal is to create a toxicity database that reflects a wide range of DBPs resulting from different disinfection practices. This paper describes the approach developed by these agencies to provide the information needed to make scientifically based regulatory decisions.

DOI
Journal Article

Abstract  Estimates of carbon leaching losses from different land use systems are few and their contribution to the net ecosystem carbon balance is uncertain. We investigated leaching of dissolved organic carbon (DOC), dissolved inorganic carbon (DIC), and dissolved methane (CH4), at forests, grasslands, and croplands across Europe. Biogenic contributions to DIC were estimated by means of its δ13C signature. Leaching of biogenic DIC was 8.3±4.9 g m−2 yr−1 for forests, 24.1±7.2 g m−2 yr−1 for grasslands, and 14.6±4.8 g m−2 yr−1 for croplands. DOC leaching equalled 3.5±1.3 g m−2 yr−1 for forests, 5.3±2.0 g m−2 yr−1 for grasslands, and 4.1±1.3 g m−2 yr−1 for croplands. The average flux of total biogenic carbon across land use systems was 19.4±4.0 g C m−2 yr−1. Production of DOC in topsoils was positively related to their C/N ratio and DOC retention in subsoils was inversely related to the ratio of organic carbon to iron plus aluminium (hydr)oxides. Partial pressures of CO2 in soil air and soil pH determined DIC concentrations and fluxes, but soil solutions were often supersaturated with DIC relative to soil air CO2. Leaching losses of biogenic carbon (DOC plus biogenic DIC) from grasslands equalled 5–98% (median: 22%) of net ecosystem exchange (NEE) plus carbon inputs with fertilization minus carbon removal with harvest. Carbon leaching increased the net losses from cropland soils by 24–105% (median: 25%). For the majority of forest sites, leaching hardly affected actual net ecosystem carbon balances because of the small solubility of CO2 in acidic forest soil solutions and large NEE. Leaching of CH4 proved to be insignificant compared with other fluxes of carbon. Overall, our results show that leaching losses are particularly important for the carbon balance of agricultural systems.
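
Illustrative example: the share calculation described above (biogenic carbon leaching relative to NEE plus fertilization inputs minus harvest removal) can be sketched with the grassland leaching fluxes reported in the abstract; the NEE, fertilization, and harvest values below are hypothetical placeholders, not data from the study.

    # Carbon leaching expressed as a share of the net ecosystem carbon balance.
    # Units: g C m-2 yr-1. Leaching fluxes are the grassland means from the
    # abstract; NEE, fertilization, and harvest values are assumed for illustration.
    doc_leaching = 5.3            # grassland DOC leaching (abstract)
    biogenic_dic_leaching = 24.1  # grassland biogenic DIC leaching (abstract)

    nee = -150.0                  # net ecosystem exchange (uptake negative); assumed
    fertilization_input = 20.0    # carbon added with organic fertilization; assumed
    harvest_removal = 100.0       # carbon removed with harvest; assumed

    # Net carbon gain of the ecosystem before accounting for leaching
    net_gain = abs(nee) + fertilization_input - harvest_removal

    total_leaching = doc_leaching + biogenic_dic_leaching
    share = total_leaching / net_gain
    print(f"Leaching = {total_leaching:.1f} g C m-2 yr-1 = {100 * share:.0f}% of net gain")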

DOI
Journal Article

Abstract  Cellulosic biofuels are intended to improve future energy and climate security. Nitrogen (N) fertilizer is commonly recommended to stimulate yields but can increase losses of the greenhouse gas nitrous oxide (N2O) and other forms of reactive N, including nitrate. We measured soil N2O emissions and nitrate leaching along a switchgrass (Panicum virgatum) high resolution N-fertilizer gradient for three years post-establishment. Results revealed an exponential increase in annual N2O emissions that each year became stronger (R² > 0.9, P < 0.001) and deviated further from the fixed percentage assumed for IPCC Tier 1 emission factors. Concomitantly, switchgrass yields became less responsive each year to N fertilizer. Nitrate leaching (and calculated indirect N2O emissions) also increased exponentially in response to N inputs, but neither methane (CH4) uptake nor soil organic carbon changed detectably. Overall, N fertilizer inputs at rates greater than crop need curtailed the climate benefit of ethanol production almost two-fold, from a maximum mitigation capacity of -5.71 ± 0.22 Mg CO₂e ha⁻¹ yr⁻¹ in switchgrass fertilized at 56 kg N ha⁻¹ to only -2.97 ± 0.18 Mg CO₂e ha⁻¹ yr⁻¹ in switchgrass fertilized at 196 kg N ha⁻¹. Minimizing N fertilizer use will be an important strategy for fully realizing the climate benefits of cellulosic biofuel production.
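
Illustrative example: to show how an exponential fertilizer response departs from a fixed-percentage (IPCC Tier 1-style) emission factor, here is a minimal Python sketch; the curve parameters are hypothetical and only illustrate the shape of the deviation, not the study's fitted values.

    # Fixed-percentage (Tier 1-style) N2O emission estimate vs. an exponential
    # response to N fertilizer rate. Parameters are illustrative assumptions.
    import numpy as np

    n_rate = np.linspace(0, 200, 9)              # kg N ha-1 yr-1 applied

    # Tier 1-style: direct N2O-N emissions = 1% of applied N, converted to N2O
    n2o_tier1 = n_rate * 0.01 * 44.0 / 28.0      # kg N2O ha-1 yr-1

    # Exponential response: emissions grow faster than linearly with N rate
    base, k = 0.5, 0.015                         # hypothetical intercept and rate constant
    n2o_exp = base * np.exp(k * n_rate)          # kg N2O ha-1 yr-1

    for rate, lin, ex in zip(n_rate, n2o_tier1, n2o_exp):
        print(f"{rate:6.0f} kg N ha-1: Tier 1 = {lin:5.2f}, exponential = {ex:5.2f}")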

DOI
Journal Article

Abstract  Farmers’ cropping decisions are a product of a complex mix of socio-economic, cultural, and natural environments in which factors operating at a number of different spatial scales affect how farmers ultimately decide to use their land in any given year or over a set of years. Some environmentalists are concerned that increased demand for corn driven by ethanol production is leading to conversion of non-cropland into corn production (which we label as “extensification”). Ethanol industry advocates counter that more than enough corn supply comes from crop switching to corn and increased yields (which we label as “intensification”). In this study, we determine whether either response to corn demand -- intensification or extensification -- is supported. This is determined through an analysis of land-use/land-cover (LULC) data that covers the state of Kansas and a measure of a corn demand shifter related to ethanol production -- distance to the closest ethanol plant -- between 2007 and 2009.

DOI
Journal Article

Abstract  Although the United States has pursued rapid development of corn ethanol as a matter of national biofuel policy, relatively little is known about this policy's widespread impacts on agricultural land conversion surrounding ethanol refineries. This knowledge gap impedes policy makers' ability to identify and mitigate potentially negative environmental impacts of ethanol production. We assessed changes to the landscape during initial implementation of the Renewable Fuel Standard v2 (RFS2) from 2008 to 2012 and found nearly 4.2 million acres of arable non-cropland converted to crops within 100 miles of refinery locations, including 3.6 million acres of converted grassland. Aggregated across all ethanol refineries, the rate of grassland conversion to cropland increased linearly with proximity to a refinery location. Despite this widespread conversion of the landscape, recent cropland expansion could have made only modest contributions to mandated increases in conventional biofuel capacity required by RFS2. Collectively, these findings demonstrate a shortcoming in the existing 'aggregate compliance' method for enforcing land protections in the RFS2 and suggest an alternative monitoring mechanism would be needed to appropriately capture the scale of observed land use changes.

DOI
Journal Article

Abstract  The Conservation Reserve Program (CRP) is the largest agricultural land-retirement program in the United States, providing many environmental benefits, including wildlife habitat and improved air, water, and soil quality. Since 2007, however, CRP area has declined by over 25% nationally with much of this land returning to agriculture. Despite this trend, it is unclear what types of CRP land are being converted, to what crops, and where. All of these specific factors greatly affect environmental impacts. To answer these questions, we quantified shifts in expiring CRP parcels to five major crop-types (corn, soy, winter and spring wheat, and sorghum) in a 12-state, Midwestern region of the United States using a US Department of Agriculture (USDA), field-level CRP database and USDA's Cropland Data Layer. For the years 2010 through 2013, we estimate almost 30%, or more than 530 000 ha, of expiring CRP land returned to the production of these five crops in our study area, with soy and corn accounting for the vast majority of these shifts. Grasslands were the largest type of CRP land converted (360 000 ha), followed by specifically designated wildlife habitat (76 000 ha), and wetland areas (53 000 ha). These wetland areas were not just wetlands themselves, but also a mix of land covers enhancing or protecting wetland ecosystem services (e.g., wetland buffers). Areas in the Dakotas, Nebraska, and southern Iowa were hotspots of change, with the highest areas of CRP land moving back to agriculture. By contrast, we estimate only a small amount (~3%) of the expiring land shifted into similar, non-CRP land-retirement or easement programs. Reconciling needs for food, feed, fuel, and healthy ecosystems is an immense challenge for farmers, conservationists, and state and federal agencies. Reduced enrollment and the turnover of CRP land from conservation to agriculture raise questions about sustaining ecosystem services in this region.

DOI
Journal Article

Abstract  Cover crops play an important role in improving productivity of subsequent row crops by improving soil physical, chemical, and biological properties. The objective of this article is to review recent advances in cover crop practice, in the context of potential benefits and drawbacks for annual crop production and sustained soil quality. Desirable attributes of a cover crop are the ability to establish rapidly under less than ideal conditions, provide sufficient dry matter or soil cover, fix atmospheric nitrogen (N), establish a deep root system to facilitate nutrient uptake from lower soil depths, produce organic matter with a low residue carbon/nitrogen (C/N) ratio, and have no phytotoxic or allelopathic effects on subsequent crops. Cover crops can be leguminous or nonleguminous. Leguminous cover crops provide a substantial amount of biologically fixed N to the primary crop, as well as ease of decomposition due to their low C/N ratio. Legume cover crops also possess a strong ability to absorb nutrients of low availability in the soil profile and can help increase the concentration of plant nutrients in the surface layers of soil. Some nonleguminous cover crops have a high N-scavenging capacity compared with leguminous crops, but the growth of these scavenging grass cover crops is sometimes limited by N deficiency; growing grass/legume mixtures therefore appears to be the best strategy for obtaining maximum benefits from cover crops.

DOI
Journal Article

Abstract  Novel energy production systems are needed that not only offer reductions in greenhouse gas emissions but also cause fewer overall environmental impacts. How to identify and implement more sustainable biofuel production alternatives, and how to overcome economic challenges for their implementation, is a matter of debate. In this study, the environmental impacts of alternative approaches to biofuel production (i.e., first, second, and third generation biofuels), with a focus on biodiversity and ecosystem services, were contrasted to develop a set of criteria for guiding the identification of sustainable biofuel production alternatives (i.e., those that maximize socioeconomic and environmental benefits), as well as strategies for decreasing the economic barriers that prevent the implementation of more sustainable biofuel production systems. The identification and implementation of sustainable biofuel production alternatives should be based on rigorous assessments that integrate socioeconomic and environmental objectives at local, regional, and global scales. Further development of environmental indicators, standardized environmental assessments, multi-objective case studies, and globally integrated assessments, along with improved estimations of biofuel production at fine spatial scales, can enhance the identification of more sustainable biofuel production systems. In the short term, several governmental mandates and incentives, along with the development of financial and market-based mechanisms and applied research partnerships, can accelerate the implementation of more sustainable biofuel production alternatives. The set of criteria and strategies developed here can guide decision making towards the identification and adoption of sustainable biofuel production systems.

DOI
Book/Book Chapter

Abstract  This chapter examines the interplay among biomass production, land management, and conservation practices and their impact(s) on water quality and hydrology. The study evaluated proposed future biomass production under partial land use change, crop residue harvest, and conservation practices in a Corn Belt watershed using the SWAT model, followed by a temporal analysis. Results showed that installing riparian buffers along the entire stream network in the watershed could deliver a strong performance in reducing losses of sediment (62%) and phosphorus (30%) and have the lowest impact on stream flow (1%) among the four scenarios, despite its small reduction in nitrate levels (5%). Partial land conversion to grow switchgrass (SWG), along with residue harvest and a cover crop, could achieve the highest nitrate loading reduction (26%) and performed well especially in high-flow years, although it decreased annual stream flow by 13% on a 20-year average. Switchgrass could contribute to changes in evapotranspiration and in the distribution among surface runoff, lateral flow, groundwater flow, and tile drain flow. Our analysis indicates the benefits of incorporating conservation practices into land use planning with consideration of regionally distinct landscape, soil, climate, and crop conditions.

DOI
Journal Article

Abstract  The overall goal of this project was to quantify the long-term water quality impacts of land management changes associated with increased demands for corn as a transportation biofuel feedstock in the United States. A modeling approach that considers a nonpoint source model, Groundwater Loading Effects of Agricultural Management Systems and National Agricultural Pesticide Risk Analysis, was used to simulate annual losses in runoff, percolation, erosion, nitrate-nitrogen, total phosphorus, atrazine (1-chloro-3-ethylamino-5-isopropylamino-2,4,6-triazine), and pyraclostrobin (Methyl {2-[1-(4-chlorophenyl)-1H-pyrazol-3-yloxymethyl] phenyl} methoxycarbamate) to the edge-of-field and bottom-of-root zones associated with multiple cropping scenarios. Model results for representative soils, throughout Indiana, were analyzed to determine 10% (worst case) and 50% (average case) probability of exceedance in the aforementioned water quality indicators. Modeling results indicated significant differences (p<0.05) in water quality indicators between continuous corn and corn-soybean rotations. The results showed that agricultural management decisions would have greater impacts on nutrient, runoff, erosion, and pesticide losses from agricultural fields compared to water quality indicators associated with the projected changes in crop rotation systems. The model results point to the need for additional research to fully understand the water impacts of land management decisions associated with corn grain as a feedstock for biofuel production.
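
Illustrative example: the 10% (worst case) and 50% (average case) probabilities of exceedance referred to above can be read from a distribution of simulated annual losses, since the value exceeded in 10% of years is the 90th percentile and the value exceeded in 50% of years is the median. The sketch below uses synthetic loss values, not GLEAMS/NAPRA output.

    # Deriving 10% and 50% probability-of-exceedance values from simulated
    # annual losses. The loss values are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    annual_nitrate_loss = rng.lognormal(mean=2.5, sigma=0.6, size=1000)  # kg N ha-1 yr-1

    worst_case_10pct = np.percentile(annual_nitrate_loss, 90)    # exceeded in 10% of years
    average_case_50pct = np.percentile(annual_nitrate_loss, 50)  # exceeded in 50% of years

    print(f"10% exceedance (worst case):   {worst_case_10pct:.1f} kg N ha-1 yr-1")
    print(f"50% exceedance (average case): {average_case_50pct:.1f} kg N ha-1 yr-1")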

DOI
Journal Article

Abstract  Several biofuel cropping scenarios were evaluated with an improved version of the Soil and Water Assessment Tool (SWAT) as part of the CenUSA Bioenergy consortium for the Boone River Watershed (BRW), which drains about 2,370 km² in north central Iowa. The adoption of corn stover removal, switchgrass, and/or Miscanthus biofuel cropping systems was simulated to assess the impact of cellulosic biofuel production on pollutant losses. The stover removal results indicate that removal of 20 or 50% of corn stover in the BRW would have negligible effects on streamflow and relatively minor or negligible effects on sediment and nutrient losses, even on higher sloped cropland. Complete cropland conversion into switchgrass or Miscanthus resulted in reductions of streamflow, sediment, nitrate, and other pollutants ranging from 23 to 99%. The predicted nitrate reductions due to Miscanthus adoption were over two times greater compared to switchgrass, with the largest impacts occurring for tile-drained cropland. Targeting of switchgrass or Miscanthus on cropland above a 2% or a 7% slope revealed that a disproportionate amount of sediment and sediment-bound nutrient reductions could be obtained by protecting these relatively small areas of higher sloped cropland. Overall, the results indicate that all biofuel cropping systems could be effectively implemented in the BRW, with the most robust approach being corn stover removal adopted on tile-drained cropland in combination with a perennial biofuel crop on higher sloped landscapes.

DOI
Journal Article

Abstract  The United States (US) is among the global hotspots of nitrogen (N) deposition, and assessing the temporal trends of wet N deposition is relevant to quantifying the effectiveness of existing N regulation policies and their consequent environmental effects. This study analyzed changes in observed wet deposition of dissolved inorganic N (DIN = ammonium + nitrate) in the US between 1985 and 2012 by applying a Mann-Kendall test and a Regional Kendall test. Current wet DIN deposition data (2011-2012) were used to gain insight into the current pattern of N deposition. Wet DIN deposition generally decreased in the order Midwest > Northeast > South > West, with a national mean rate of 3.5 kg N ha⁻¹ yr⁻¹. Ammonium dominated wet DIN deposition in the Midwest, South, and West regions, whereas nitrate and ammonium each contributed about half in the Northeast region. Wet DIN deposition showed no significant change at the national scale between 1985 and 2012, but profound changes occurred in its components. Wet ammonium deposition showed a significant increasing trend at the national scale (0.013 kg N ha⁻¹ yr⁻²), with the highest increase in the Midwest and the eastern part of the South region. Inversely, wet nitrate deposition decreased significantly at the national scale (-0.014 kg N ha⁻¹ yr⁻²), with the largest reduction in the Northeast region. Overall, ratios of ammonium versus nitrate in wet deposition showed a significant increase in all four regions, resulting in a transition of the dominant N species from nitrate to ammonium. Distinct magnitudes, trends, and patterns of wet ammonium and nitrate deposition suggest the need to control N emissions by species and region to avoid negative effects of N deposition on ecosystem health and function in the US.
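
Illustrative example: a minimal sketch of the Mann-Kendall trend test named above, applied to a synthetic annual wet deposition series; ties and serial correlation, which a full analysis would need to handle, are ignored here.

    # Mann-Kendall trend test on an annual wet deposition series (synthetic data).
    import numpy as np

    def mann_kendall(x):
        """Return the Mann-Kendall S statistic and the normalized Z score (no ties)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        if s > 0:
            return s, (s - 1) / np.sqrt(var_s)
        if s < 0:
            return s, (s + 1) / np.sqrt(var_s)
        return s, 0.0

    years = np.arange(1985, 2013)
    rng = np.random.default_rng(1)
    # Hypothetical wet ammonium deposition with a small upward trend (kg N ha-1 yr-1)
    deposition = 2.0 + 0.013 * (years - years[0]) + rng.normal(0, 0.2, len(years))

    s, z = mann_kendall(deposition)
    print(f"S = {s:.0f}, Z = {z:.2f}  (|Z| > 1.96 suggests a trend at p < 0.05)")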

Journal Article

Abstract  The Environmental Protection Agency (EPA) is promulgating today's final rule, the Stage 2 Disinfectants and Disinfection Byproducts Rule (DBPR), to provide for increased protection against the potential risks for cancer and reproductive and developmental health effects associated with disinfection byproducts (DBPs). The final Stage 2 DBPR contains maximum contaminant level goals for chloroform, monochloroacetic acid and trichloroacetic acid; National Primary Drinking Water Regulations, which consist of maximum contaminant levels (MCLs) and monitoring, reporting, and public notification requirements for total trihalomethanes (TTHM) and haloacetic acids (HAA5); and revisions to the reduced monitoring requirements for bromate. This document also specifies the best available technologies for the final MCLs. EPA is also approving additional analytical methods for the determination of disinfectants and DBPs in drinking water. EPA believes the Stage 2 DBPR will reduce the potential risks of cancer and reproductive and developmental health effects associated with DBPs by reducing peak and average levels of DBPs in drinking water supplies. The Stage 2 DBPR applies to public water systems (PWSs) that are community water systems (CWSs) or nontransient noncommunity water systems (NTNCWs) that add a primary or residual disinfectant other than ultraviolet light or deliver water that has been treated with a primary or residual disinfectant other than ultraviolet light. This rule also makes minor corrections to drinking water regulations, specifically the Public Notification tables. New endnotes were added to these tables in recent rulemakings; however, the corresponding footnote numbering in the tables was not changed. In addition, this rule makes a minor correction to the Stage 1 Disinfectants and Disinfection Byproducts Rule by replacing a sentence that was inadvertently removed.

Book/Book Chapter

Abstract  Authoritative, comprehensive, and written by experts, Water Quality & Treatment: A Handbook on Drinking Water has been the essential reference for municipal water supply professionals for decades. The sixth edition covers more topics than ever and goes deeper into every topic, making it the most useful edition yet. It is truly a one-volume reference. It covers all aspects of drinking water supply: state-of-the-art technologies; water quality from source to tap, conventional and advanced methods and processes in water treatment, and drinking water standards and regulations. Importantly, it emphasizes principles (theory) and applications (practice), making it as useful for the academician as the practitioner.

Journal Article

Abstract  Forests form the critical source water areas for downstream drinking water supplies in many parts of the world, including the Rocky Mountain regions of North America. Large scale natural disturbances from wildfire and severe insect infestation are more likely because of a warming climate and can significantly impact water quality downstream of forested headwaters regions. To investigate potential implications of changing climate and wildfire on drinking water treatment, the 2003 Lost Creek Wildfire in Alberta, Canada was studied. Four years of comprehensive hydrology and water quality data from seven watersheds were evaluated and synthesized to assess the implications of wildfire and post-fire intervention (salvage-logging) on downstream drinking water treatment. The 95th percentile turbidity and DOC remained low in streams draining unburned watersheds (5.1 NTU, 3.8 mg/L), even during periods of potential treatment challenge (e.g., stormflows, spring freshet); in contrast, they were elevated in streams draining burned (15.3 NTU, 4.6 mg/L) and salvage-logged (18.8 NTU, 9.9 mg/L) watersheds. Persistent increases in these parameters and observed increases in other contaminants such as nutrients, heavy metals, and chlorophyll-a in discharge from burned and salvage-logged watersheds present important economic and operational challenges for water treatment; most notably, a potential increased dependence on solids and DOC removal processes. Many traditional source water protection strategies would fail to adequately identify and evaluate many of the significant wildfire- and post-fire management-associated implications to drinking water "treatability"; accordingly, it is proposed that "source water supply and protection strategies" should be developed to consider a supplier's ability to provide adequate quantities of potable water to meet demand by addressing all aspects of drinking water "supply" (i.e., quantity, timing of availability, and quality) and their relationship to "treatability" in response to land disturbance.

DOI
Journal Article

Abstract  Severe wildfires often have dramatic short-term effects on water quality, although there is increasing evidence that in some catchments their effects can persist for many years. Forest recovery after the 2002 Hayman Fire burned catchments that supply drinking water to over a half million users in Denver, CO, has been extremely slow and has caused persistent water quality concerns. To evaluate whether postfire water quality changes increase the potential to form undesirable by-products of water disinfection, we compared stream water from eight burned catchments within the Hayman Fire and five adjacent unburned catchments. We tested dissolved organic carbon (DOC) concentrations and the formation of disinfection by-products (trihalomethanes [THMs], haloacetonitriles [HANs], chloral hydrate [CHD], and haloketones [HKTs]) in stream water monthly during 2014 and 2015. Stream DOC, THMs, CHD, and specific ultraviolet absorbance at 254 nm (SUVA254) were elevated in catchments with a moderate extent of high-severity wildfire (8–46% of catchment area) relative to catchments that were unburned and those that burned more extensively (>74% of catchment area) 14 yr after the fire. In contrast, formation of highly toxic but unregulated nitrogenous HANs increased linearly with wildfire extent. Although these findings should not raise concern regarding drinking water safety, they highlight the long-term influences of high severity wildfire on source water C content, composition, and treatability.

DOI
Technical Report

Abstract  Sustaining the quality of the Nation’s water resources and the health of our diverse ecosystems depends on the availability of sound water-resources data and information to develop effective, science-based policies. Effective management of water resources also brings more certainty and efficiency to important economic sectors. Taken together, these actions lead to immediate and long-term economic, social, and environmental benefits that make a difference to the lives of the almost 400 million people projected to live in the United States by 2050. In 1991, Congress established the U.S. Geological Survey National Water-Quality Assessment (NAWQA) to address where, when, why, and how the Nation’s water quality has changed, or is likely to change in the future, in response to human activities and natural factors. Since then, NAWQA has been a leading source of scientific data and knowledge used by national, regional, state, and local agencies to develop science-based policies and management strategies to improve and protect water resources used for drinking water, recreation, irrigation, energy development, and ecosystem needs. Plans for the third decade of NAWQA (2013–23) address priority water-quality issues and science needs identified by NAWQA stakeholders, such as the Advisory Committee on Water Information and the National Research Council, and are designed to meet increasing challenges related to population growth, increasing needs for clean water, and changing land-use and weather patterns. This report is one of a series of publications, The Quality of Our Nation’s Waters, which describes major findings of the NAWQA Project on water-quality issues of regional and national concern and provides science-based information for assessing and managing the quality of our groundwater resources. Other reports in this series focus on occurrence and distribution of nutrients, pesticides, and volatile organic compounds in streams and groundwater, the effects of contaminants and stream-flow alteration on the condition of aquatic communities in streams, and on the quality of groundwater from private domestic and public supply wells. Each report builds toward a more comprehensive understanding of the quality of regional and national water resources. All NAWQA reports are available online (https://water.usgs.gov/nawqa/bib/). We hope this publication will provide you with insights and information to meet your water-resource needs and will foster increased citizen awareness and involvement in the protection and restoration of our Nation’s waters. The information in this report is intended primarily for those interested or involved in resource management and protection, conservation, regulation, and policymaking at the regional and national levels.

DOI
Technical Report

Abstract  This study characterized the amount and quality of organic matter in the Clackamas River, Oregon, to gain an understanding of sources that contribute to the formation of chlorinated and brominated disinfection by-products (DBPs), focusing on regulated DBPs in treated drinking water from two direct-filtration treatment plants that together serve approximately 100,000 customers. The central hypothesis guiding this study was that natural organic matter leaching out of the forested watershed, in-stream growth of benthic algae, and phytoplankton blooms in the reservoirs contribute different and varying proportions of organic carbon to the river. Differences in the amount and composition of carbon derived from each source affect the types and concentrations of DBP precursors entering the treatment plants and, as a result, yield varying DBP concentrations and species in finished water. The two classes of DBPs analyzed in this study, trihalomethanes (THMs) and haloacetic acids (HAAs), form from precursors within the dissolved and particulate pools of organic matter present in source water. The five principal objectives of the study were to (1) describe the seasonal quantity and character of organic matter in the Clackamas River; (2) relate the amount and composition of organic matter to the formation of DBPs; (3) evaluate sources of DBP precursors in the watershed; (4) assess the use of optical measurements, including in-situ fluorescence, for estimating dissolved organic carbon (DOC) concentrations and DBP formation; and (5) assess the removal of DBP precursors during treatment by conducting treatability "jar-test" experiments at one of the treatment plants. Data collection consisted of (1) monthly sampling of source and finished water at two drinking-water treatment plants; (2) event-based sampling in the mainstem, tributaries, and North Fork Reservoir; and (3) in-situ continuous monitoring of fluorescent dissolved organic matter (FDOM), turbidity, chlorophyll-a, and other constituents to continuously track source-water conditions in near real-time. Treatability tests were conducted during the four event-based surveys to determine the effectiveness of coagulant and powdered activated carbon (PAC) on the removal of DBP precursors. Sample analyses included DOC, total particulate carbon (TPC), total and dissolved nutrients, absorbance and fluorescence spectroscopy, and, for regulated DBPs, concentrations of THMs and HAAs in finished water and laboratory-based THM and HAA formation potentials (THMFP and HAAFP, respectively) for source water and selected locations throughout the watershed. The results of this study may not be typical given the record and near record amounts of precipitation that occurred during spring that produced streamflow much higher than average in 2010-11. Although there were algal blooms, lower concentrations of chlorophyll-a were observed in the water column during the study period compared to historical data. Concentrations of DBPs in finished (treated) water averaged 0.024 milligrams per liter (mg/L) for THMs and 0.022 mg/L for HAAs; maximum values were about 0.040 mg/L for both classes of DBPs. Although DBP concentrations were somewhat higher within the distribution system, none of the samples collected for this study or for the quarterly compliance monitoring by the water utilities exceeded levels permissible under existing U.S. Environmental Protection Agency (USEPA) regulations: 0.080 mg/L for THMs and 0.060 mg/L for HAAs.
DOC concentrations were generally low in the Clackamas River, typically about 1.0-1.5 mg/L. Concentrations in the mainstem occasionally increased to nearly 2.5 mg/L during storms; DOC concentrations in tributaries were sometimes much higher (up to 7.8 mg/L). The continuous in-situ FDOM measurements indicated sharp rises in DOC concentrations in the mainstem following rainfall events; concentrations were relatively stable during summer base flow. Even though the first autumn storm mobilized appreciable quantities of carbon, higher concentrations of DBPs in finished water were observed 3 weeks later, after the ground was saturated from additional rainfall. The majority of the DOC in the lower Clackamas River appears to originate from the upper basin, suggesting terrestrial carbon was commonly the dominant source. Lower-basin tributaries typically contained the highest concentrations of DOC and DBP precursors and contributed substantially to the overall loads in the mainstem during storms. During low-flow periods, tributaries were not major sources of DOC or DBP precursors to the Clackamas River. Although the dissolved fraction of organic carbon contributed the majority of DBP precursors, at times the particulate fraction (inorganic sediment and organic particles including detritus and algal material) contributed a substantial fraction of DBP precursors. Considering just the mainstem sites, on average, 10 percent of THMFP and 32 percent of HAAFP were attributed to particulate carbon. This finding suggests water-treatment methods that remove particles prior to chlorination would reduce finished-water DBP concentrations to some degree. Overall, concentrations of THM and HAA precursors were closely linked to DOC concentrations; laboratory DBP formation potentials (DBPFPs) clearly showed that THMFP and HAAFP were greatest in the downstream tributaries that contained elevated carbon concentrations. However, carbon-normalized "specific" formation potentials for THMs and HAAs (STHMFP and SHAAFP, respectively) revealed changes in carbon character over time that affected the two types of DBP classes differently. HAA precursors were elevated in waters containing aromatic-rich soil-derived material arising from forested areas. In contrast, THM precursors were associated with carbon having a lower aromatic content; highest STHMFP occurred in autumn 2011 in the mainstem from North Fork Reservoir downstream to LO DWTP. This pattern suggests the potential for a link between THM precursors and algal-derived carbon. The highest STHMFP value was measured within North Fork Reservoir, indicating reservoir-derived carbon may be important for this class of DBPs. Weak correlations between STHMFP and SHAAFP emphasize that precursor sources for these types of DBPs may be different. This highlights not only that different locations within the watershed produce carbon with different reactivity (specific DBPFP), but also that different management approaches for each class of DBP precursors could be required for control. Treatability tests conducted on source water during four basin-wide surveys demonstrated that an average of about 40 percent of DOC can be removed by coagulation. While the decrease in THMFP following coagulation was similar to DOC, the decrease in HAAFP was much greater (approximately 70 percent), indicating coagulation is particularly effective at removing HAA precursors, likely because of the aromatic nature of the carbon associated with HAA precursors.
Several findings from this study have direct implications for managing drinking-water resources and for providing useful information that may help improve treatment-plant operations. For example, the use of in-situ fluorometers that measure FDOM provided an excellent proxy for DOC concentration in this system and revealed short-term, rapid changes in DOC concentration during storm events. In addition, the strong correlation between FDOM values measured in-situ and HAA5 concentrations in finished water may permit estimation of continuous HAA concentrations, as was done here. As part of this study, multiple in-situ FDOM sensors were deployed continuously and in real-time to characterize the composition of dissolved organic matter. Although the initial results were promising, additional research and engineering developments will be needed to demonstrate the full utility of these sensors for this purpose. In conclusion, although DBPFPs were strongly correlated to DOC concentration, some DBPs formed from particulate carbon, including terrestrial leaf material and algal material such as planktonic species of blue-green algae and sloughed filaments, stalks, and cells of benthic algae. Different precursor sources in the watershed were evident from the data, suggesting specific actions may be available to address some of these sources. In-situ measurements of FDOM proved to be an excellent proxy for DOC concentration as well as HAA formation during treatment, which suggests further development and refinement of these sensors have the potential to provide real-time information about complex watershed processes to operators at the drinking-water treatment plants. Follow-up studies could examine the relative roles that terrestrial and algal sources have on the DBP precursor pool to better understand how watershed-management activities may be affecting the transport of these compounds to Clackamas River drinking-water intakes. Given the low concentrations of algae in the water column during this study, additional surveys during more typical river conditions could provide a more complete understanding of how algae contribute DBP precursors. Further development of FDOM-sensor technology can improve our understanding of carbon dynamics in the river and how concentrations may be trending over time. This study was conducted in collaboration with Clackamas River Water and the City of Lake Oswego water utilities. Other research partners included Oregon Health and Science University in Hillsboro, Oregon, Alexin Laboratory in Tigard, Oregon, U.S. Geological Survey National Research Program Laboratory in Denver, Colorado, and the U.S. Geological Survey Water Science Centers in Portland, Oregon, and Sacramento, California. This project was supported with funding from Clackamas River Water, City of Lake Oswego, the U.S. Geological Survey, and the Water Research Foundation.
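
Illustrative example: the proxy relationship described above (in-situ FDOM used to estimate finished-water HAA5) amounts to a simple calibration regression applied to a continuous sensor record. The paired values below are hypothetical, not data from the report.

    # Calibrating in-situ FDOM as a proxy for finished-water HAA5 and applying
    # the fit to a continuous FDOM record. All values are hypothetical.
    import numpy as np

    fdom_cal = np.array([8.0, 10.5, 12.0, 15.5, 18.0, 22.0])         # sensor units
    haa5_cal = np.array([0.010, 0.014, 0.016, 0.021, 0.025, 0.031])  # mg/L

    slope, intercept = np.polyfit(fdom_cal, haa5_cal, deg=1)  # ordinary least squares

    fdom_continuous = np.array([9.2, 11.7, 14.3, 19.8])  # e.g., 15-minute sensor data
    haa5_estimated = slope * fdom_continuous + intercept

    for f, h in zip(fdom_continuous, haa5_estimated):
        print(f"FDOM = {f:5.1f} -> estimated HAA5 = {h:.3f} mg/L")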

DOI
Journal Article

Abstract  Corn (Zea mays L.) stover is considered one of the prime lignocellulosic feedstocks for biofuel production. While producing renewable energy from biomass is necessary, impacts of harvesting corn stover on soil organic carbon (SOC) sequestration, agricultural productivity, and environmental quality must also be carefully and objectively assessed. We conducted a 2 1/2 year study of stover management in long-term (> 8 yr) no-tillage (NT) continuous corn systems under three contrasting soils in Ohio to determine changes in SOC sequestration, CO2 emissions, soil physical properties, and agronomic productivity. These measurements were made on a Rayne silt loam (RSL) (fine-loamy, mixed, active, mesic Typic Hapludult) with 6% slope, Celina silt loam (CSL) (fine, mixed, active, mesic Aquic Hapludalfs) with 2% slope, and Hoytville clay loam (HCL) (fine, illitic, mesic Mollic Epiaqualfs) with < 1% slope. Stover treatments consisted of removing 0, 25, 50, 75, and 100% of corn stover following each harvest. At the start of the experiment in May 2004, these percentages of removal corresponded to 5, 3.75, 2.5, 1.25, and 0 Mg ha⁻¹ yr⁻¹ of stover left on the soil surface, respectively. Annual stover removal rate of > 25% reduced SOC and soil productivity, but the magnitude of impacts depended on soil type and topographic conditions. Stover removal rate of 50% reduced grain yield by about 1.94 Mg ha⁻¹, stover yield by 0.97 Mg ha⁻¹, and SOC by 1.63 Mg ha⁻¹ in an unglaciated, sloping, and erosion-prone soil (P < 0.05). The initial water infiltration rates were significantly reduced by > 25% of stover removal on a RSL and CSL. Plant available water reserves and earthworm population were significantly reduced by 50% of stover removal at all soils. Increases in soil compaction due to stover removal were moderate. Stover removal impacts on SOC, crop yield, and water infiltration for HCL were not significant. Results from this study following 2 1/2 yr of stover management suggest that only a small fraction (≤ 25%) of the total corn stover produced can be removed for biofuel feedstocks from sloping and erosion-prone soils.

Journal Article

Abstract  Widespread use of neonicotinoid insecticides in North America has led to frequent detection of neonicotinoids in surface waters. Despite frequent surface water detection, few studies have evaluated underlying sediments for the presence of neonicotinoids. Thus, we sampled water and sediments for neonicotinoids during a one-year period at 40 floodplain wetlands throughout Missouri. Analyzed for six common neonicotinoids, sediment samples consistently (63% of samples) contained neonicotinoids (e.g., imidacloprid and clothianidin) in all sampling periods. Mean sediment and aqueous neonicotinoid concentrations were 1.19 μg kg⁻¹ (range: 0-17.99 μg kg⁻¹) and 0.03 μg L⁻¹ (0-0.97 μg L⁻¹), respectively. We used boosted regression tree analysis to explain sediment neonicotinoid concentrations and ultimately identified six variables that accounted for 31.6% of concentration variability. Efforts to limit sediment neonicotinoid contamination could include reducing agriculture within a wetland below a threshold of 25% area planted. Also, prolonging periods of overlying water >25 cm deep when water temperatures reach/exceed 18 °C could promote conditions favorable for neonicotinoid degradation. Results of this study can be useful in determining potential routes and levels of neonicotinoid exposure experienced by nontarget benthic aquatic invertebrates as well as potential means to mitigate neonicotinoid concentrations in floodplain wetlands.
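
Illustrative example: a minimal sketch of a boosted regression tree analysis of the kind mentioned above, using scikit-learn's gradient boosting; the predictor variables and data are placeholders, not the study's dataset.

    # Boosted regression trees relating wetland/landscape predictors to sediment
    # neonicotinoid concentrations. Data and variable names are hypothetical.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(2)
    n = 200
    X = np.column_stack([
        rng.uniform(0, 100, n),   # % of wetland area planted to agriculture
        rng.uniform(0, 80, n),    # overlying water depth (cm)
        rng.uniform(5, 30, n),    # water temperature (deg C)
    ])
    y = 0.1 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0, 2, n)  # ug kg-1 (synthetic)

    brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01, max_depth=3)
    brt.fit(X, y)

    for name, imp in zip(["% area planted", "water depth", "water temperature"],
                         brt.feature_importances_):
        print(f"{name:18s} relative influence: {imp:.2f}")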

Journal Article

Abstract  Imidacloprid, a widely used neonicotinoid insecticide, has led to a decline in the honey bee population worldwide. Invertebrate insect prey carrying neonicotinoid residues can adversely affect insectivores, such as echolocating bats. The aim of the current study was to examine whether imidacloprid toxicity interferes with the echolocation system, including the vocal, auditory, orientation, and spatial memory systems, of insectivorous bats. By comparing ultrasound spectra, auditory brainstem-evoked potentials, and flight trajectories, we found that imidacloprid toxicity may interfere with vocal, auditory, orientation, and spatial memory functions in insectivorous bats (Hipposideros armiger terasensis). Immunohistochemistry and western blot evidence suggested that insectivorous bats suffering imidacloprid toxicity may show decreased vocal-related FOXP2 expression in the superior colliculus, decreased auditory-related prestin expression in the cochlea, and decreased auditory-related otoferlin expression in the cochlea and the inferior colliculus, along with inflammation and mitochondrial dysfunction-related apoptosis in the hippocampal CA1 and medial entorhinal cortex. These results may provide a reasonable explanation for imidacloprid-induced interference with the echolocation system of insectivorous bats.

DOI
Journal Article

Abstract  Herbicides became the dominant pesticide type applied in the United States, with the bulk applied to corn, cotton, soybeans, and wheat. Major factors affecting use trends since 1980 are crop acreage, the use of newer compounds applied at lower per-acre rates, and the adoption of genetically-engineered crops.
