WFLC - CAIF Report

Project ID

3013

Category

Other

Added on

Dec. 14, 2020, 8:58 a.m.

Journal Article

Abstract  This study employed the hedonic price framework to examine the effects of 256 wildfires and environmental amenities on home values in northwest Montana between June 1996 and January 2007. The study revealed environmental amenities, including proximity to lakes, national forests, Glacier National Park and golf courses, have large positive effects on property values in northwest Montana. However, proximity to and view of wildfire burned areas has had large and persistent negative effects on home values. The analysis supports an argument that homebuyers may correlate proximity to and view of a wildfire burned area with increased wildfire risk. Indeed, when a burned area is not visible from a home, wildfire risk appears to be out of sight and out of mind for homebuyers. Findings from this research can be used to inform debate about efficient allocation of resources to wildfire preparedness, including public education programs, and suppression activities around the wildland-urban interface.
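The hedonic framework described above amounts to regressing (log) sale price on structural attributes and on proximity to amenities and disamenities. A minimal sketch with synthetic data follows; the variable names, coefficients, and data are invented for illustration and are not the study's model or estimates:

```python
import numpy as np

# Synthetic hedonic regression: log sale price on house size and
# distances to an amenity (lake) and a disamenity (wildfire burned area).
# All values below are invented for illustration.
rng = np.random.default_rng(0)
n = 500
sqft = rng.uniform(1000, 3500, n)        # structural attribute
km_to_lake = rng.uniform(0.1, 20, n)     # amenity: closer is better
km_to_burn = rng.uniform(0.1, 30, n)     # disamenity: closer is worse

log_price = (11.0 + 0.0004 * sqft
             - 0.02 * km_to_lake         # lake proximity premium
             + 0.01 * km_to_burn         # burned-area proximity discount
             + rng.normal(0, 0.05, n))   # idiosyncratic noise

# Ordinary least squares recovers the signs of the (dis)amenity effects.
X = np.column_stack([np.ones(n), sqft, km_to_lake, km_to_burn])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
```

In the study's setting, a negative coefficient on distance-to-amenity (here `beta[2]`) and a positive coefficient on distance-to-burned-area (here `beta[3]`) would correspond to the reported amenity premiums and burned-area discounts.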

Journal Article

Abstract  Forests form the critical source water areas for downstream drinking water supplies in many parts of the world, including the Rocky Mountain regions of North America. Large scale natural disturbances from wildfire and severe insect infestation are more likely because of warming climate and can significantly impact water quality downstream of forested headwaters regions. To investigate potential implications of changing climate and wildfire on drinking water treatment, the 2003 Lost Creek Wildfire in Alberta, Canada was studied. Four years of comprehensive hydrology and water quality data from seven watersheds were evaluated and synthesized to assess the implications of wildfire and post-fire intervention (salvage-logging) on downstream drinking water treatment. The 95th percentile turbidity and DOC remained low in streams draining unburned watersheds (5.1 NTU, 3.8 mg/L), even during periods of potential treatment challenge (e.g., stormflows, spring freshet); in contrast, they were elevated in streams draining burned (15.3 NTU, 4.6 mg/L) and salvage-logged (18.8 NTU, 9.9 mg/L) watersheds. Persistent increases in these parameters and observed increases in other contaminants such as nutrients, heavy metals, and chlorophyll-a in discharge from burned and salvage-logged watersheds present important economic and operational challenges for water treatment; most notably, a potential increased dependence on solids and DOC removal processes. 
Many traditional source water protection strategies would fail to adequately identify and evaluate many of the significant wildfire- and post-fire management-associated implications to drinking water "treatability"; accordingly, it is proposed that "source water supply and protection strategies" should be developed to consider a supplier's ability to provide adequate quantities of potable water to meet demand by addressing all aspects of drinking water "supply" (i.e., quantity, timing of availability, and quality) and their relationship to "treatability" in response to land disturbance.

Journal Article

Abstract  Fire severity and burn patchiness are frequently cited as important to post-fire surface runoff and erosion, yet few studies quantify their effects. A better understanding of their role is needed to predict post-fire erosion and design prescribed burns. Therefore, this study quantified the effects of fire severity and burn patchiness on surface runoff, erosion and hydrologic connectivity using 116 unbounded runoff samplers. The samplers were installed in recently prescribed-burnt dry eucalypt forest in Victoria, Australia. Sediment loads over 16 months were approximately three orders of magnitude higher on burnt compared with unburnt hillslopes while differences in runoff and erosion between the low and high severity hillslopes were relatively small. Unburnt patches were often highly effective at reducing hydrologic connectivity from upslope burnt areas, with sediment loads over 16 months reduced by 1.3%, 98.1% and 99.9% downslope of 1, 5 and 10 m wide unburnt patches respectively. Hydrologic connectivity was limited most effectively by wider unburnt patches (10 m) and during lower magnitude storms. The results suggest overall that post-fire runoff and erosion may be substantially limited by unburnt patches while fire severity is a less important factor (within the context of prescribed burning). Consequently, post-fire erosion models should consider the spatial arrangement of unburnt patches, and unburnt patches (>10 m wide) should be retained within prescribed burns to minimise erosion.

Journal Article

Abstract  Giant sequoias (Sequoiadendron giganteum [Lindl.] J. Buchholz) preserve a detailed history of fire within their annual rings. We developed a 3000 year chronology of fire events in one of the largest extant groves of ancient giant sequoias, the Giant Forest, by sampling and tree-ring dating fire scars and other fire-related indicators from 52 trees distributed over an area of about 350 ha. When all fire events were included in composite chronologies, the mean fire intervals (years between fires of any size) declined as a function of increasing spatial extent from tree, to group, to multiple groups, to grove scales: 15.5 yr (0.1 ha), 7.4 yr (1 ha), 3.0 yr (70 ha), and 2.2 yr (350 ha), respectively. We interpreted widespread fires (i.e., fire events recorded on ≥2 trees, or ≥25% of all trees recording fires within composites) to have occurred in areas of 70 ha to 350 ha at mean intervals ranging from about 6 yr to 35 yr. We compared the annual, multi-decadal and centennial variations in Giant Forest fire frequency with those documented in tree-ring and charcoal-based fire chronologies from four other giant sequoia groves in the Sierra Nevada, and with independent tree-ring-based reconstructions of summer drought and temperatures. The other giant sequoia fire histories (tree rings and charcoal-based) were significantly (P < 0.001) correlated with the Giant Forest fire frequency record and independent climate reconstructions, and confirm a maximum fire frequency during the warm and drought-prone period from 800 C.E. to 1300 C.E. (Common Era). This was the driest period of the past two millennia, and it may serve as an analog for warming and drying effects of anthropogenic greenhouse gases in the next few decades. Sequoias can sustain very high fire frequencies, and historically they have done so during warm, dry times.
We suggest that preparation of sequoia groves for anticipated warming may call for increasing the rate of prescribed burning in most parts of the Giant Forest.
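The decline in mean fire interval with spatial scale reported above is a property of compositing: pooling fire dates recorded by more trees over a larger area yields more recorded events and shorter intervals between them. A sketch with invented fire-scar years:

```python
# Compositing fire-scar records: the mean fire interval shrinks as
# records from more trees (a larger area) are pooled. Fire years below
# are invented for illustration, not data from the Giant Forest.
tree_records = [
    {1710, 1742, 1761, 1790, 1820},
    {1725, 1761, 1777, 1805, 1820},
    {1718, 1734, 1761, 1784, 1812},
]

def mean_fire_interval(years):
    """Mean gap in years between successive recorded fire dates."""
    ys = sorted(years)
    gaps = [b - a for a, b in zip(ys, ys[1:])]
    return sum(gaps) / len(gaps)

single = mean_fire_interval(tree_records[0])          # one tree: 27.5 yr
composite = mean_fire_interval(set().union(*tree_records))  # pooled: 10.0 yr
```

Pooling never lengthens the composite interval, which is why grove-scale intervals (2.2 yr at 350 ha) are so much shorter than single-tree intervals (15.5 yr at 0.1 ha).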

Journal Article

Abstract  Forests provide the most stable and highest quality water supplies among all land uses. Quantitatively evaluating the benefits of forest water supply functions is important to effectively mitigate the impacts of land development, climate change, and population growth. Here, by integrating a water balance model and national drinking water data, we determined the amount of surface water yield originating on different forest ownership types at a fine resolution (88,000 watersheds) and tracked that water through the river network to drinking water intakes and the populations they serve. We found that forested lands comprised 36% of the total land area but contributed 50% of the total surface water yield. Of the 23,983 public water supply intakes depending on surface water sources, 89% (serving around 150 million people) received some surface water from forested lands, and 38% (serving about 60 million people) received more than 50% of their water supply from forested lands. Privately-owned forests were the most important water source in the eastern U.S., benefiting 16 million people, followed by federal forests (14.4% of the total drinking water supply). In contrast, federally-owned forested lands were the dominant water source (52% of all drinking water supply) in the West. Privately-owned forests are the most vulnerable to future land use change and associated water supply impacts. Continuing programs that support private forest landowners with financial and technical assistance through federal and state forest management agencies and potentially developing payment for ecosystem service schemes could maximize benefits for landowners so they may retain their land assets while minimizing forest loss and associated impacts on critical ecosystem services, including the provisioning of a clean and reliable water supply for the American public.

Journal Article

Abstract  Wildfire smoke is a growing public health concern in the United States. Numerous studies have documented associations between ambient smoke exposure and severe patient outcomes for single-fire seasons or limited geographic regions. However, there are few national-scale health studies of wildfire smoke in the United States, few studies investigating Intensive Care Unit (ICU) admissions as an outcome, and few specifically framed around hospital operations. This study retrospectively examined the associations between ambient wildfire-related PM2.5 at a hospital ZIP code with total hospital ICU admissions using a national-scale hospitalization data set. Wildfire smoke was characterized using a combination of kriged PM2.5 monitor observations and satellite-derived plume polygons from National Oceanic and Atmospheric Administration's Hazard Mapping System. ICU admissions data were acquired from Premier, Inc. and encompass 15%-20% of all U.S. ICU admissions during the study period. Associations were estimated using a distributed-lag conditional Poisson model under a time-stratified case-crossover design. We found that a 10 μg/m³ increase in daily wildfire PM2.5 was associated with a 2.7% (95% CI: 1.3, 4.1; p = 0.00018) increase in ICU admissions 5 days later. Under stratification, positive associations were found among patients aged 0-20 and 60+, patients living in the Midwest Census Region, patients admitted in the years 2013-2015, and non-Black patients, though other results were mixed. Following a simulated severe 7-day 120 μg/m³ smoke event, our results predict ICU bed utilization peaking at 131% (95% CI: 43, 239; p < 10⁻⁵) over baseline. Our work suggests that hospitals may need to preposition vital critical care resources when severe smoke events are forecast.
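Assuming the log-linear form implied by the Poisson model, the reported 2.7% increase per 10 μg/m³ corresponds to a coefficient β = ln(1.027)/10 per μg/m³, which can be rescaled to other exposure levels. This single-lag scaling is only illustrative; the 131% peak utilization reported above arises from the distributed-lag structure, not from this simple extrapolation:

```python
import math

# Coefficient implied by a 2.7% increase per 10 ug/m3 under a
# log-linear (Poisson) exposure-response model.
beta = math.log(1.027) / 10.0

def icu_admission_multiplier(wildfire_pm25):
    """Expected multiplier on baseline ICU admissions at the 5-day lag."""
    return math.exp(beta * wildfire_pm25)

rr_10 = icu_admission_multiplier(10)    # recovers 1.027 by construction
rr_120 = icu_admission_multiplier(120)  # ~1.38 for a single severe smoke day
```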

Technical Report

Abstract  This report documents research completed by investigators at the Laboratory of Tree-Ring Research with support from the Sierra Nevada Global Change research program for the period 1991 to 1997. The body of this report is prepared as a draft manuscript intended for revision and submission to a peer reviewed journal (probably Ecology or Ecological Monographs). This paper describes the completed work on the reconstruction of fire histories along transects in the Sierra Nevada, an evaluation of fire regime patterns related to elevation, and an investigation of interannual climate-fire patterns. Appendices list and illustrate additional details of the fire history data, and ongoing research using these data. A computer disk with all of the related data files is also delivered with this report. Since this report is considered a draft and is subject to revision following comments from reviewers and additional work by the authors, we ask that it not be distributed beyond the National Parks (Sequoia, Kings Canyon, and Yosemite).

Journal Article

Abstract  Green roofs are among the most popular type of green infrastructure implemented in highly urbanized watersheds due to their low cost and efficient utilization of unused or under-used space. In this study, we examined the effectiveness of green roofs to attenuate stormwater runoff across a large metropolitan area in the Pacific Northwest, United States. We utilized a spatially explicit ecohydrological watershed model called Visualizing Ecosystem Land Management Assessments (VELMA) to simulate the resulting stormwater hydrology of implementing green roofs over 25%, 50%, 75%, and 100% of existing buildings within four urban watersheds in Seattle, Washington, United States. We simulated the effects of two types of green roofs: extensive green roofs, which are characterized by shallow soil profiles and short vegetative cover, and intensive green roofs, which are characterized by deeper soil profiles and can support larger vegetation. While buildings only comprise approximately 10% of the total area within each of the four watersheds, our simulations showed that 100% implementation of green roofs on these buildings can achieve approximately 10-15% and 20-25% mean annual runoff reductions for extensive and intensive green roofs, respectively, over a 28-year simulation. These results provide an upper limit for volume reductions achievable by green roofs in these urban watersheds. We also showed that stormwater runoff reductions are proportionately smaller during higher flow regimes caused by increased precipitation, likely due to the limited storage capacity of saturated green roofs. In general, green roofs can be effective at reducing stormwater runoff, and their effectiveness is limited by both their areal extent and storage capacity. 
Our results showed that green roof implementation can be an effective stormwater management tool in highly urban areas, and we demonstrated that our modeling approach can be used to assess the watershed-scale hydrologic impacts of the widespread adoption of green roofs across large metropolitan areas.

Technical Report

Abstract  The U.S. Geological Survey (USGS) began a 5-year study in 2003 that focused on postfire stream-water quality and postfire sediment load in streams within the Hayman and Hinman fire study areas. This report compares water quality of selected streams receiving runoff from unburned areas and burned areas using concentrations and loads, and trend analysis, from seasonal data (approximately April–November) collected 2003–2007 at the Hayman fire study area, and data collected from 1999–2000 (prefire) and 2003 (postfire) at the Hinman fire study area. The water-quality data collected during this study include onsite measurements of streamflow, specific conductance, and turbidity, laboratory-determined pH, and concentrations of major ions, nutrients, organic carbon, trace elements, and suspended sediment. Postfire floods and effects on water quality of streams, lakes and reservoirs, drinking-water treatment, and the comparison of measured concentrations to applicable water quality standards also are discussed. Exceedances of Colorado water-quality standards in streams of both the Hayman and Hinman fire study areas only occurred for concentrations of five trace elements (not all trace-element exceedances occurred in every stream). Selected samples analyzed for total recoverable arsenic (fixed), dissolved copper (acute and chronic), total recoverable iron (chronic), dissolved manganese (acute, chronic, and fixed) and total recoverable mercury (chronic) exceeded Colorado aquatic-life standards.

Technical Report

Abstract  …synthesis and provides region-specific management options for increasing resilience to drought for Alaska and Pacific Northwest, California, Hawai'i and U.S.-Affiliated Pacific Islands, Interior West, Great Plains, Northeast and Midwest, and Southeast. Ecological drought refers to the negative impacts of meteorological drought on ecosystem services, generally focused on observable changes (e.g., forest mortality, soil loss in rangelands), but less observable responses (e.g., lower plant productivity) can have observable changes and economic consequences over the long term. The magnitude of these impacts depends on the severity, duration, frequency, and spatial extent of drought events. A wide range of management options is available for minimizing the adverse impacts of drought when they occur, facilitating postdrought recovery, and creating ecosystem conditions that reduce negative impacts of future droughts. For forests, a common theme among regions is reducing water demand by managing stands at a lower density and favoring species that either require less water or can tolerate drought. Responses to hydrological drought include restoring riparian areas and wetlands to improve functionality, ensuring that aquatic habitats for fish and other organisms provide refugia and passage during low streamflow conditions, and carefully managing consumptive uses for livestock grazing, recreation, agriculture, and drinking water during droughts. For drought management to be effective, timely implementation is needed across large spatial scales, facilitated by coordination among agencies and stakeholders. Optimal responses can be developed by integrating existing policies and practices with new information and by timely reporting of current conditions. 
The following strategic actions will help institutionalize awareness of drought effects and drought responses in public and private land management: (1) establish and maintain relationships with providers of drought information, (2) include drought in collaborative efforts among agencies and stakeholders, (3) revise best management practices as needed, (4) incorporate drought in relevant planning processes, (5) establish long-term monitoring of drought effects, and (6) share information on effectiveness of drought responses. If drought-informed practices are institutionalized as part of agency operations, then planning and management will be more effective, and "crisis management" in response to drought can be avoided.

Journal Article

Abstract  Much of California, U.S. experienced a severe drought in 2012-2015 inciting a large tree mortality event in the central and southern Sierra Nevada. We assessed causal agents and rates of tree mortality, and short-term impacts to forest structure and composition based on a network of 11.3-m fixed-radius plots installed within three elevation bands on the Eldorado, Stanislaus, Sierra and Sequoia National Forests (914-1219, 1219-1524 and 1524-1829 m on the Eldorado, Stanislaus, Sierra; 1219-1524, 1524-1829, and 1829-2134 m on the Sequoia), where tree mortality was most severe. About 48.9% of trees died between 2014 and 2017. Tree mortality ranged from 46.1 +/- 3.3% on the Eldorado National Forest to 58.7 +/- 3.7% on the Sierra National Forest. Significantly higher levels of tree mortality occurred in the low elevation band (60.4 +/- 3.0%) compared to the high elevation band (46.1 +/- 2.9%). Ponderosa pine, Pinus ponderosa Dougl. ex Laws., exhibited the highest levels of tree mortality (89.6%), with 39.4% of plots losing all P. ponderosa. Mortality of P. ponderosa was highest at the lowest elevations, concentrated in larger-diameter trees, and attributed primarily to colonization by western pine beetle, Dendroctonus brevicomis LeConte. About 89% of P. ponderosa in the three largest diameter classes were killed, representing loss of an important structural component of these forests with implications to wildlife species of conservation concern. Sugar pine, P. lambertiana Dougl., exhibited the second highest levels of tree mortality (48.1%). Mortality of P. lambertiana was concentrated in the mid-diameter classes and attributed primarily to colonization by mountain pine beetle, D. ponderosae Hopkins. White fir, Abies concolor (Gord. & Glend.) Lindl. ex Hildebr., and incense cedar, Calocedrus decurrens (Torr.) Florin, exhibited 26.3% and 23.2% mortality, respectively. Only one Quercus died. 
Tree mortality (numbers of trees killed) was positively correlated with tree density and slope. A time lag was observed between the occurrence of drought and the majority of tree mortality. Tree regeneration (seedlings and saplings) was dominated by C. decurrens and Quercus spp., representing a potential long-term shift in composition from forests that were dominated by P. ponderosa. About 22.2% of plots contained plant species considered invasive, including cheatgrass, Bromus tectorum L., ripgut brome, Bromus diandrus Roth, bull thistle, Cirsium vulgare (Savi) Ten., and yellow star-thistle, Centaurea solstitialis L. The implications of these and other results to recovery and management of drought-impacted forests in the central and southern Sierra Nevada are discussed.

Journal Article

Abstract  Land managers use prescribed fire to return a vital process to fire-adapted ecosystems, restore forest structure from a state altered by long-term fire suppression, and reduce wildfire intensity. However, fire often produces favorable conditions for invasive plant species, particularly if it is intense enough to reveal bare mineral soil and open previously closed canopies. Understanding the environmental or fire characteristics that explain post-fire invasive plant abundance would aid managers in efficiently finding and quickly responding to fire-caused infestations. To that end, we used an information-theoretic model-selection approach to assess the relative importance of abiotic environmental characteristics (topoedaphic position, distance from roads), pre- and post-fire biotic environmental characteristics (forest structure, understory vegetation, fuel load), and prescribed fire severity (measured in four different ways) in explaining invasive plant cover in ponderosa pine forest in South Dakota’s Black Hills. Environmental characteristics (distance from roads and post-fire forest structure) alone provided the most explanation of variation (26%) in post-fire cover of Verbascum thapsus (common mullein), but a combination of surface fire severity and environmental characteristics (pre-fire forest structure and distance from roads) explained 36–39% of the variation in post-fire cover of Cirsium arvense (Canada thistle) and all invasives together. For four species and all invasives together, their pre-fire cover explained more variation (26–82%) in post-fire cover than environmental and fire characteristics did, suggesting one strategy for reducing post-fire invasive outbreaks may be to find and control invasives before the fire. Finding them may be difficult, however, since pre-fire environmental characteristics explained only 20% of variation in pre-fire total invasive cover, and less for individual species. 
Thus, moderating fire intensity or targeting areas of high severity for post-fire invasive control may be the most efficient means for reducing the chances of post-fire invasive plant outbreaks when conducting prescribed fires in this region.

Journal Article

Abstract  Predicting the efficacy of fuel treatments aimed at reducing high severity fire in dry-mixed conifer forests in the western US is a challenging problem that has been addressed in a variety of ways using both field observations and wildfire simulation models. One way to describe the efficacy of fuel treatments is to quantify how often wildfires are expected to intersect areas prioritized for treatment. In real landscapes, treatments are static, restricted to a small portion of the landscape, and set against a background of stochastic fire and dynamic vegetation; thus, the likelihood of fire encountering a treatment during the period treatments remain effective is small. In this paper we simulate a wide range of different treatment prioritization schemes using the forest landscape simulation model Envision to examine 50 years of fire-treatment interactions and forest succession. We first reviewed 47 fuel management projects in Oregon, USA to build prioritization schemes that addressed different fuel management objectives. We then simulated different priority schemes in the 18 planning areas of the Deschutes National Forest in central Oregon and measured potential fire-treatment interactions over time. Simulated annual area burned was used to calculate the success odds for each priority scheme and planning area. Out of the ten metrics considered, only three had higher success odds than a random prioritization of planning areas. Spatial allocation of projects based on burn probability and transmitted wildfire had the highest success odds among the tested metrics. However, success odds declined sharply as desired success levels increased, suggesting that fuel management goals need to be tempered to consider the stochastic nature of wildfire. Meeting long-term multiple management goals over time can benefit from consideration of short- and long-term tradeoffs from different treatment prioritization schemes. 
Our work contributes towards a better framing of both management and public expectations regarding the performance of fuel treatment programs.
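The low success odds have a simple arithmetic core: if a treated area has a small annual probability p of being reached by fire, the chance of at least one fire-treatment encounter during the treatment's effective lifespan of T years is 1 − (1 − p)^T. The values of p and T below are invented for illustration and are not from the Envision simulations:

```python
# Probability of at least one fire-treatment encounter while the
# treatment remains effective, assuming independent annual fire arrivals.
def encounter_probability(p_annual, lifespan_years):
    return 1.0 - (1.0 - p_annual) ** lifespan_years

# Hypothetical numbers: a 1% annual burn probability over a 15-year
# effective lifespan gives only about a 14% chance the treatment is
# ever tested by fire.
p = encounter_probability(0.01, 15)
```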

Journal Article

Abstract  Large and severe wildfires are an observable consequence of an increasingly arid American West. There is increasing consensus that human communities, land managers, and fire managers need to adapt and learn to live with wildfires. However, a myriad of human and ecological factors constrain adaptation, and existing science-based management strategies are not sufficient to address fire as both a problem and solution. To that end, we present a novel risk-science approach that aligns wildfire response decisions, mitigation opportunities, and land management objectives by consciously integrating social, ecological and fire management system needs. We use fire-prone landscapes of the US Pacific Northwest as our study area, and report on and describe how three complementary risk-based analytic tools—quantitative wildfire risk assessment, mapping of suppression difficulty, and atlases of potential control locations—can form the foundation for adaptive governance in fire management. Together, these tools integrate wildfire risk with fire management difficulties and opportunities, providing a more complete picture of the wildfire risk management challenge. Leveraging recent and ongoing experience integrating local experiential knowledge with these tools, we provide examples and discuss how these geospatial datasets create a risk-based planning structure that spans multiple spatial scales and uses. These uses include pre-planning strategic wildfire response, implementing safe wildfire response balancing risk with likelihood of success, and alignment of non-wildfire mitigation opportunities to support wildfire risk management more directly. We explicitly focus on multi-jurisdictional landscapes to demonstrate how these tools highlight the shared responsibility of wildfire risk mitigation. 
By integrating quantitative risk science, expert judgement, and adaptive co-management, this process provides a much-needed pathway for transforming fire-prone social-ecological systems to be more responsive and adaptable to change, and for living with fire in an increasingly arid American West.

Journal Article

Abstract  Prior to Euro–American settlement, dry ponderosa pine and mixed conifer forests (hereafter, the “dry forests”) of the Inland Northwest were burned by frequent low- or mixed-severity fires. These mostly surface fires maintained low and variable tree densities, light and patchy ground fuels, simplified forest structure, and favored fire-tolerant trees, such as ponderosa pine, and a low and patchy cover of associated fire-tolerant shrubs and herbs. Low- and mixed-severity fires provided other important feedbacks and effects to ponderosa pine-dominated stands and landscapes. For example, in stands, frequent surface fires favored an ongoing yet piecemeal regeneration of fire-tolerant trees by periodically exposing patches of mineral soil. They maintained fire-tolerant forest structures by elevating tree crown bases and scorching or consuming many seedlings, saplings, and pole-sized trees. They cycled nutrients from branches and foliage to the soil, where they could be used by other plants, and promoted the growth and development of low and patchy understory shrub and herb vegetation. Finally, surface fires reduced the long-term threat of running crown fires by reducing the fuel bed and metering out individual tree and group torching, and they reduced competition for site resources among surviving trees, shrubs, and herbs. In landscapes, the patterns of dry forest structure and composition that resulted from frequent fires reinforced the occurrence of low- or mixed-severity fires, because frequent burning spatially isolated conditions that supported high-severity fires. These spatial patterns reduced the likelihood of severe fire behavior and effects at each episode of fire. Rarely, dry forest landscapes were affected by more severe climate-driven events. Extant dry forests no longer appear or function as they once did. 
Large landscapes are homogeneous in their composition and structure, and the regional landscape is set up for severe, large fire and insect disturbance events. Among ecologists, there is also a high degree of concern about how future dry forests will develop, if fires continue to be large and severe. In this paper, we describe the key landscape pattern and process changes wrought by the sum of the settlement and management influences to date, and we point to an uncertain future for ecosystem management. Widespread selection cutting of the largest and oldest ponderosa pine and Douglas-fir in the 20th century has reduced much of the economic opportunity that might have been associated with restoration, and long-term investment will likely be needed, if large-scale restoration activities are attempted. An uncertain future for ecosystem management is based on the lack of current and improbable future social consensus concerning desired outcomes for public forestlands, the need for significant financial investment in ecosystem restoration, a lack of integrated planning and decision tools, and mismatches between the existing planning process, Congressional appropriations, and complex management and restoration problems.

Journal Article

Abstract  Airborne and ground-based Pandora spectrometer NO2 column measurements were collected during the 2018 Long Island Sound Tropospheric Ozone Study (LISTOS) in the New York City/Long Island Sound region, which coincided with early observations from the Sentinel-5P TROPOspheric Monitoring Instrument (TROPOMI). Both airborne and ground-based measurements are used to evaluate the TROPOMI NO2 Tropospheric Vertical Column (TrVC) product v1.2 in this region, which has high spatial and temporal heterogeneity in NO2. First, airborne and Pandora TrVCs are compared to evaluate the uncertainty of the airborne TrVC and establish the spatial representativeness of the Pandora observations. The 171 coincidences between Pandora and airborne TrVCs are found to be highly correlated (r² = 0.92 and slope of 1.03), with the largest individual differences being associated with high temporal and/or spatial variability. These reference measurements (Pandora and airborne) are complementary with respect to temporal coverage and spatial representativity. Pandora spectrometers can provide continuous long-term measurements but may lack areal representativity when operated in direct-sun mode. Airborne spectrometers are typically only deployed for short periods of time, but their observations are more spatially representative of the satellite measurements, with the added capability of retrieving at subpixel resolutions of 250 m × 250 m over the entire TROPOMI pixels they overfly. Thus, airborne data are more correlated with TROPOMI measurements (r² = 0.96) than Pandora measurements are with TROPOMI (r² = 0.84). The largest outliers between TROPOMI and the reference measurements appear to stem from too spatially coarse a priori surface reflectivity (0.5°) over bright urban scenes. In this work, this results in cloud-free scenes that, at times, are affected by errors in the TROPOMI cloud pressure retrieval, impacting the calculation of tropospheric air mass factors.
This factor causes a high bias in TROPOMI TrVCs of 4 %–11 %. Excluding these cloud-impacted points, TROPOMI has an overall low bias of 19 %–33 % during the LISTOS timeframe of June–September 2018. Part of this low bias is caused by coarse a priori profile input from the TM5-MP model; replacing these profiles with those from a 12 km North American Model–Community Multiscale Air Quality (NAMCMAQ) analysis results in a 12 %–14 % increase in the TrVCs. Even with this improvement, the TROPOMI-NAMCMAQ TrVCs have a 7 %–19 % low bias, indicating needed improvements in the a priori assumptions of the air mass factor calculation. Future work should explore additional impacts of a priori inputs to further assess the remaining low biases in TROPOMI using these datasets.
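As a loose illustration of the coincidence statistics quoted above (a slope and r² between paired column measurements), the comparison can be sketched as an ordinary least-squares fit. All data below are synthetic and the function name is hypothetical; this is not the study's actual processing:

```python
import numpy as np

def compare_columns(reference, satellite):
    """Least-squares fit of one NO2 column dataset against another.

    Returns slope, intercept, and r^2 -- the statistics reported for
    each pair of datasets in the evaluation described above.
    """
    slope, intercept = np.polyfit(reference, satellite, 1)
    r = np.corrcoef(reference, satellite)[0, 1]
    return slope, intercept, r ** 2

# Synthetic coincident tropospheric vertical columns; values are
# illustrative only (171 coincidences, as in the study).
rng = np.random.default_rng(0)
pandora = rng.uniform(2.0, 15.0, 171)
airborne = 1.03 * pandora + rng.normal(0.0, 0.8, 171)

slope, intercept, r2 = compare_columns(pandora, airborne)
print(f"slope={slope:.2f}, r^2={r2:.2f}")
```
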

DOI
Journal Article

Abstract  PurpleAir sensors, which measure particulate matter (PM), are widely used by individuals, community groups, and other organizations, including state and local air monitoring agencies. PurpleAir sensors comprise a massive global network of more than 10,000 sensors. Previous performance evaluations have typically studied a limited number of PurpleAir sensors in small geographic areas or laboratory environments. While useful for determining sensor behavior and data normalization for those geographic areas, little work has been done to understand the broad applicability of these results outside those regions and conditions. Here, PurpleAir sensors operated by air quality monitoring agencies are evaluated in comparison to collocated ambient air quality regulatory instruments. In total, almost 12,000 24-hour averaged PM2.5 measurements from collocated PurpleAir sensors and Federal Reference Method (FRM) or Federal Equivalent Method (FEM) PM2.5 instruments were collected across diverse regions of the United States (U.S.), including 16 states. Consistent with previous evaluations, under typical ambient and smoke-impacted conditions, the raw data from PurpleAir sensors overestimate PM2.5 concentrations by about 40 % in most parts of the U.S. A simple linear regression reduces much of this bias across most U.S. regions, but adding a relative humidity term further reduces the bias and improves consistency in the biases between different regions. More complex multiplicative models did not substantially improve results when tested on an independent dataset. The final PurpleAir correction reduces the root mean square error (RMSE) of the raw data from 8 µg m⁻³ to 3 µg m⁻³, with an average FRM or FEM concentration of 9 µg m⁻³. This correction equation, along with proposed data cleaning criteria, has been applied to PurpleAir PM2.5 measurements across the U.S.
in the AirNow Fire and Smoke Map (fire.airnow.gov) and has the potential to be successfully used in other air quality and public health applications.
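The raw-plus-humidity correction described above has the form pm25 ≈ a·raw + b·RH + c. A minimal sketch of fitting such a model, using synthetic collocation data — the coefficients recovered below are illustrative, not the paper's published values:

```python
import numpy as np

# Synthetic collocated data: raw PurpleAir PM2.5 (ug/m3), relative
# humidity (%), and a reference FRM/FEM value generated from an
# assumed linear relationship plus noise. All numbers are made up.
rng = np.random.default_rng(1)
raw = rng.uniform(2.0, 60.0, 500)
rh = rng.uniform(20.0, 90.0, 500)
frm = 0.52 * raw - 0.085 * rh + 5.7 + rng.normal(0.0, 1.0, 500)

# Fit pm25 ~ a*raw + b*rh + c by least squares.
X = np.column_stack([raw, rh, np.ones_like(raw)])
a, b, c = np.linalg.lstsq(X, frm, rcond=None)[0]

# Apply the fitted correction and check the residual error.
corrected = a * raw + b * rh + c
rmse = float(np.sqrt(np.mean((corrected - frm) ** 2)))
print(f"a={a:.2f}, b={b:.3f}, c={c:.2f}, RMSE={rmse:.1f} ug/m3")
```

In practice the coefficients would be fit on the full multi-state collocation dataset and validated on held-out sites, as the abstract describes.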

Journal Article

Abstract  Forest fire is common to ecosystems throughout the world. It affects both vegetation and soil, and it also helps maintain the diversity and stability of ecosystems. The effect of forest fire and prescribed fire on forest soil is very complex. Fire affects soil organic matter, macro- and micro-nutrients, physical properties of soil such as texture, colour, pH, and bulk density, as well as soil biota. The impact of fire on forest soil depends on various factors, such as the intensity of the fire, fuel load, and soil moisture. Fire can be beneficial as well as harmful for forest soil, depending on its severity and the fire return interval. In low-intensity fires, combustion of litter and soil organic matter increases plant-available nutrients, which results in rapid growth of herbaceous plants and a significant increase in plant storage of nutrients. In contrast, high-intensity fires can result in complete loss of soil organic matter, volatilization of N, P, S, and K, death of microbes, etc. Intense forest fire results in the formation of some organic compounds with hydrophobic properties, which produces highly water-repellent soils. Forest fire also causes long-term effects on forest soil. The purpose of this paper is to review the effects of forest fire on various properties of soil, which are important in maintaining a healthy ecosystem.

DOI
Journal Article

Abstract  Contemporary forest management requires highly-detailed, spatially-contiguous, multi-temporal, and scenario-comparable forest condition data. Field inventories and individual-tree models often contain highly-detailed data and allow for long-term comparison of complex scenarios, but the information is only available at sampled locations and lacks complete spatial coverage. Forest landscape models (FLMs) provide landscape-level spatiotemporal data, but the details that are important to land managers are often lost in the generalized outputs. We developed a modeling framework, F-3, to integrate FIA (Forest Inventory and Analysis) plots, the Forest Vegetation Simulator (FVS), and FastEmap (Field And SatelliTe for Ecosystem MAPping) to simulate spatiotemporal forest change under natural succession and vegetation management. F-3 extrapolates the details of forest inventory plots and individual-tree model outputs to a spatially-contiguous landscape by fusing tree-list field measurements, individual-tree growth and yield models, and remote sensing and environmental geospatial datasets. F-3 allows for area-specific simulation of management actions. F-3 compares FVS results with field measurements for temporal accuracy assessment and uses leave-one-out cross-validation for spatial accuracy assessment. F-3 adopts parallel computation techniques to implement the modeling in an automatic and efficient manner. A proof of concept of F-3 was demonstrated in the Tahoe National Forest (TNF), showing spatiotemporal changes in six forest structural metrics (quadratic mean diameter, basal area, biomass, habitat suitability index, canopy cover, and coarse woody debris) under natural succession, regeneration-cut, and thinning scenarios for the years 2014-2114 at a 30 m resolution.
F-3 can be used for initializing FLMs and for analyzing a wide range of ecosystem services; however, the under-representation of certain forest types in the FIA plot data set, modeling bias from FVS, and the choice of FastEmap covariates contribute major uncertainties to the framework.
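Leave-one-out cross-validation of the kind F-3 uses for spatial accuracy assessment can be sketched generically: each plot is withheld in turn, a predictor is fit on the remaining plots, and the withheld value is predicted from its covariates. The 1-nearest-neighbour predictor and all names and numbers below are hypothetical stand-ins for the framework's actual imputation model:

```python
import numpy as np

def leave_one_out_rmse(covariates, observed, fit):
    """Withhold each plot in turn, fit on the rest, predict the
    withheld plot from its covariates, and summarize errors as RMSE."""
    n = len(observed)
    errors = []
    for i in range(n):
        keep = np.arange(n) != i
        model = fit(covariates[keep], observed[keep])
        errors.append(model(covariates[i:i + 1])[0] - observed[i])
    return float(np.sqrt(np.mean(np.square(errors))))

def nn_fit(X, y):
    """Toy 1-nearest-neighbour predictor in covariate space."""
    def predict(Xq):
        d = np.sum((X[None, :, :] - Xq[:, None, :]) ** 2, axis=2)
        return y[np.argmin(d, axis=1)]
    return predict

# Synthetic plots: 3 environmental covariates and one structural metric.
rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, (60, 3))
y = X @ np.array([10.0, 5.0, 2.0]) + rng.normal(0.0, 0.5, 60)
rmse = leave_one_out_rmse(X, y, nn_fit)
print(f"LOO RMSE = {rmse:.2f}")
```
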

Journal Article

Abstract  Exposure to outdoor fine particulate matter (PM2.5) is a leading risk factor for mortality. We develop global estimates of annual PM2.5 concentrations and trends for 1998-2018 using advances in satellite observations, chemical transport modeling, and ground-based monitoring. Aerosol optical depths (AODs) from advanced satellite products including finer resolution, increased global coverage, and improved long-term stability are combined and related to surface PM2.5 concentrations using geophysical relationships between surface PM2.5 and AOD simulated by the GEOS-Chem chemical transport model with updated algorithms. The resultant annual mean geophysical PM2.5 estimates are highly consistent with globally distributed ground monitors (R² = 0.81; slope = 0.90). Geographically weighted regression is applied to the geophysical PM2.5 estimates to predict and account for the residual bias with PM2.5 monitors, yielding even higher cross-validated agreement (R² = 0.90-0.92; slope = 0.90-0.97) with ground monitors and improved agreement compared to all earlier global estimates. The consistent long-term satellite AOD and simulation enable trend assessment over a 21-year period, identifying significant trends for eastern North America (-0.28 ± 0.03 µg/m³/yr), Europe (-0.15 ± 0.03 µg/m³/yr), India (1.13 ± 0.15 µg/m³/yr), and globally (0.04 ± 0.02 µg/m³/yr). The positive trend (2.44 ± 0.44 µg/m³/yr) for India over 2005-2013 and the negative trend (-3.37 ± 0.38 µg/m³/yr) for China over 2011-2018 are remarkable, with implications for the health of billions of people.
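A toy version of the geographically weighted regression step described above — every function name, kernel choice, and number here is an illustrative assumption, not the paper's implementation: at each location, nearby monitors receive Gaussian distance weights, a local linear model of observed versus geophysical PM2.5 is fit, and that model adjusts the geophysical estimate at the location.

```python
import numpy as np

def gwr_adjust(pred_xy, pred_geo, mon_xy, mon_geo, mon_obs, bandwidth=3.0):
    """At each prediction point, fit a distance-weighted linear model
    obs ~ a + b * geophysical on the monitors, then apply it to the
    geophysical PM2.5 estimate at that point."""
    adjusted = np.empty(len(pred_xy))
    for i, p in enumerate(pred_xy):
        # Gaussian kernel weights from distance to each monitor.
        w = np.exp(-np.sum((mon_xy - p) ** 2, axis=1) / (2 * bandwidth ** 2))
        X = np.column_stack([np.ones_like(mon_geo), mon_geo])
        Xw = X * w[:, None]  # diag(w) @ X, i.e. the weighted design matrix
        # Weighted least squares: (X'WX) beta = X'W y
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ mon_obs)
        adjusted[i] = beta[0] + beta[1] * pred_geo[i]
    return adjusted

# Synthetic monitors whose observations carry a uniform residual bias
# relative to the geophysical estimate (all values illustrative).
rng = np.random.default_rng(3)
mon_xy = rng.uniform(0.0, 10.0, (40, 2))
mon_geo = rng.uniform(5.0, 40.0, 40)
mon_obs = 0.9 * mon_geo + 2.0

# Predicting back at the monitor sites should recover the observations.
adjusted = gwr_adjust(mon_xy, mon_geo, mon_xy, mon_geo, mon_obs)
max_err = float(np.max(np.abs(adjusted - mon_obs)))
print(f"max |adjusted - observed| = {max_err:.2e}")
```
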
