
Showing papers by "United States Environmental Protection Agency published in 2006"


Journal ArticleDOI
TL;DR: The Model of Emissions of Gases and Aerosols from Nature (MEGAN) is described and used to quantify net terrestrial biosphere emission of isoprene into the atmosphere.
Abstract: Reactive gases and aerosols are produced by terrestrial ecosystems, processed within plant canopies, and can then be emitted into the above-canopy atmosphere. Estimates of the above-canopy fluxes are needed for quantitative earth system studies and assessments of past, present and future air quality and climate. The Model of Emissions of Gases and Aerosols from Nature (MEGAN) is described and used to quantify net terrestrial biosphere emission of isoprene into the atmosphere. MEGAN is designed for both global and regional emission modeling and has global coverage with ~1 km² spatial resolution. Field and laboratory investigations of the processes controlling isoprene emission are described, and data available for model development and evaluation are summarized. The factors controlling isoprene emissions include biological, physical and chemical driving variables. MEGAN driving variables are derived from models and from satellite and ground observations. Tropical broadleaf trees contribute almost half of the estimated global annual isoprene emission, due to their relatively high emission factors and because they are often exposed to conditions conducive to isoprene emission. The remaining flux is primarily from shrubs, which have a widespread distribution. The annual global isoprene emission estimated with MEGAN ranges from about 500 to 750 Tg isoprene (440 to 660 Tg carbon) depending on the driving variables, which include temperature, solar radiation, leaf area index, and plant functional type. The global annual isoprene emission estimated using the standard driving variables is ~600 Tg isoprene. Differences in driving variables result in emission estimates that differ by more than a factor of three for specific times and locations.
It is difficult to evaluate isoprene emission estimates using the concentration distributions simulated using chemistry and transport models, due to the substantial uncertainties in other model components, but at least some global models produce reasonable results when using isoprene emission distributions similar to MEGAN estimates. In addition, comparison with isoprene emissions estimated from satellite formaldehyde observations indicates reasonable agreement. The sensitivity of isoprene emissions to earth system changes (e.g., climate and land-use) demonstrates the potential for large future changes in emissions. Using temperature distributions simulated by global climate models for year 2100, MEGAN estimates that isoprene emissions increase by more than a factor of two. This is considerably greater than previous estimates and additional observations are needed to evaluate and improve the methods used to predict future isoprene emissions.
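The paired ranges quoted above (500–750 Tg isoprene vs. 440–660 Tg carbon) follow directly from the carbon mass fraction of isoprene (C5H8). The sketch below is plain stoichiometric arithmetic for checking that conversion, not output from MEGAN itself:

```python
# Carbon mass fraction of isoprene (C5H8) -- illustrative arithmetic only.
C, H = 12.011, 1.008                   # atomic masses, g/mol
m_isoprene = 5 * C + 8 * H             # molar mass of C5H8, ~68.1 g/mol
carbon_fraction = 5 * C / m_isoprene   # ~0.88

# Convert the quoted isoprene totals to carbon mass.
for tg_isoprene in (500, 600, 750):
    print(f"{tg_isoprene} Tg isoprene ~= {tg_isoprene * carbon_fraction:.0f} Tg C")
```

The endpoints reproduce the abstract's 440 and 660 Tg C to within rounding.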

3,746 citations


Journal ArticleDOI
TL;DR: Concern was expressed about applying the TEF/total toxic equivalency (TEQ) approach directly to abiotic matrices, such as soil and sediment, in human risk assessment, as the present TEF scheme and TEQ methodology are primarily intended for estimating exposure and risks via oral ingestion.

3,284 citations


Journal ArticleDOI
Leming Shi1, Laura H. Reid, Wendell D. Jones, Richard Shippy2, Janet A. Warrington3, Shawn C. Baker4, Patrick J. Collins5, Francoise de Longueville, Ernest S. Kawasaki6, Kathleen Y. Lee7, Yuling Luo, Yongming Andrew Sun7, James C. Willey8, Robert Setterquist7, Gavin M. Fischer9, Weida Tong1, Yvonne P. Dragan1, David J. Dix10, Felix W. Frueh1, Federico Goodsaid1, Damir Herman6, Roderick V. Jensen11, Charles D. Johnson, Edward K. Lobenhofer12, Raj K. Puri1, Uwe Scherf1, Jean Thierry-Mieg6, Charles Wang13, Michael A Wilson7, Paul K. Wolber5, Lu Zhang7, William Slikker1, Shashi Amur1, Wenjun Bao14, Catalin Barbacioru7, Anne Bergstrom Lucas5, Vincent Bertholet, Cecilie Boysen, Bud Bromley, Donna Brown, Alan Brunner2, Roger D. Canales7, Xiaoxi Megan Cao, Thomas A. Cebula1, James J. Chen1, Jing Cheng, Tzu Ming Chu14, Eugene Chudin4, John F. Corson5, J. Christopher Corton10, Lisa J. Croner15, Christopher Davies3, Timothy Davison, Glenda C. Delenstarr5, Xutao Deng13, David Dorris7, Aron Charles Eklund11, Xiaohui Fan1, Hong Fang, Stephanie Fulmer-Smentek5, James C. Fuscoe1, Kathryn Gallagher10, Weigong Ge1, Lei Guo1, Xu Guo3, Janet Hager16, Paul K. Haje, Jing Han1, Tao Han1, Heather Harbottle1, Stephen C. Harris1, Eli Hatchwell17, Craig A. Hauser18, Susan D. Hester10, Huixiao Hong, Patrick Hurban12, Scott A. Jackson1, Hanlee P. Ji19, Charles R. Knight, Winston Patrick Kuo20, J. Eugene LeClerc1, Shawn Levy21, Quan Zhen Li, Chunmei Liu3, Ying Liu22, Michael Lombardi11, Yunqing Ma, Scott R. Magnuson, Botoul Maqsodi, Timothy K. McDaniel3, Nan Mei1, Ola Myklebost23, Baitang Ning1, Natalia Novoradovskaya9, Michael S. Orr1, Terry Osborn, Adam Papallo11, Tucker A. Patterson1, Roger Perkins, Elizabeth Herness Peters, Ron L. Peterson24, Kenneth L. Philips12, P. Scott Pine1, Lajos Pusztai25, Feng Qian, Hongzu Ren10, Mitch Rosen10, Barry A. Rosenzweig1, Raymond R. Samaha7, Mark Schena, Gary P. Schroth, Svetlana Shchegrova5, Dave D. 
Smith26, Frank Staedtler24, Zhenqiang Su1, Hongmei Sun, Zoltan Szallasi20, Zivana Tezak1, Danielle Thierry-Mieg6, Karol L. Thompson1, Irina Tikhonova16, Yaron Turpaz3, Beena Vallanat10, Christophe Van, Stephen J. Walker27, Sue Jane Wang1, Yonghong Wang6, Russell D. Wolfinger14, Alexander Wong5, Jie Wu, Chunlin Xiao7, Qian Xie, Jun Xu13, Wen Yang, Liang Zhang, Sheng Zhong28, Yaping Zong 
TL;DR: This study describes the experimental design and probe mapping efforts behind the MicroArray Quality Control project and shows intraplatform consistency across test sites as well as a high level of interplatform concordance in terms of genes identified as differentially expressed.
Abstract: Over the last decade, the introduction of microarray technology has had a profound impact on gene expression research. The publication of studies with dissimilar or altogether contradictory results, obtained using different microarray platforms to analyze identical RNA samples, has raised concerns about the reliability of this technology. The MicroArray Quality Control (MAQC) project was initiated to address these concerns, as well as other performance and data analysis issues. Expression data on four titration pools from two distinct reference RNA samples were generated at multiple test sites using a variety of microarray-based and alternative technology platforms. Here we describe the experimental design and probe mapping efforts behind the MAQC project. We show intraplatform consistency across test sites as well as a high level of interplatform concordance in terms of genes identified as differentially expressed. This study provides a resource that represents an important first step toward establishing a framework for the use of microarrays in clinical and regulatory settings.

1,987 citations


Journal ArticleDOI
TL;DR: The scope of the thresholds concept in ecological science is defined and methods for identifying and investigating thresholds using a variety of examples from terrestrial and aquatic environments, at ecosystem, landscape and regional scales are discussed.
Abstract: An ecological threshold is the point at which there is an abrupt change in an ecosystem quality, property or phenomenon, or where small changes in an environmental driver produce large responses in the ecosystem. Analysis of thresholds is complicated by nonlinear dynamics and by multiple factor controls that operate at diverse spatial and temporal scales. These complexities have challenged the use and utility of threshold concepts in environmental management despite great concern about preventing dramatic state changes in valued ecosystems, the need for determining critical pollutant loads and the ubiquity of other threshold-based environmental problems. In this paper we define the scope of the thresholds concept in ecological science and discuss methods for identifying and investigating thresholds using a variety of examples from terrestrial and aquatic environments, at ecosystem, landscape and regional scales. We end with a discussion of key research needs in this area.

1,049 citations


Journal ArticleDOI
TL;DR: In this paper, the reference condition for biological integrity (RC(BI)) is proposed to describe the naturalness of the biota, where naturalness implies the absence of significant human disturbance or alteration.
Abstract: An important component of the biological assessment of stream condition is an evaluation of the direct or indirect effects of human activities or disturbances. The concept of a "reference condition" is increasingly used to describe the standard or benchmark against which current condition is compared. Many individual nations, and the European Union as a whole, have codified the concept of reference condition in legislation aimed at protecting and improving the ecological condition of streams. However, the phrase "reference condition" has many meanings in a variety of contexts. One of the primary purposes of this paper is to bring some consistency to the use of the term. We argue the need for a "reference condition" term that is reserved for referring to the "naturalness" of the biota (structure and function) and that naturalness implies the absence of significant human disturbance or alteration. To avoid the confusion that arises when alternative definitions of reference condition are used, we propose that the original concept of reference condition be preserved in this modified form of the term: "reference condition for biological integrity," or RC(BI). We further urge that these specific terms be used to refer to the concepts and methods used in individual bioassessments to characterize the expected condition to which current conditions are compared: "minimally disturbed condition" (MDC); "historical condition" (HC); "least disturbed condition" (LDC); and "best attainable condition" (BAC). We argue that each of these concepts can be narrowly defined, and each implies specific methods for estimating expectations. We also describe current methods by which these expectations are estimated, including the reference-site approach (condition at minimally or least-disturbed sites); best professional judgment; interpretation of historical condition; extrapolation of empirical models; and evaluation of ambient distributions. Because different assumptions about what constitutes reference condition will have important effects on the final classification of streams into condition classes, we urge that bioassessments be consistent in describing the definitions and methods used to set expectations.

907 citations


Journal ArticleDOI
TL;DR: In this article, the authors explored the use of 250m multi-temporal MODIS NDVI 16-day composite data to provide an automated change detection and alarm capability on a 1-year time-step for the Albemarle-Pamlico Estuary System (APES) region of the US.

783 citations


Journal ArticleDOI
TL;DR: The deprivation index was associated with the unadjusted prevalence of preterm birth and low birth weight for white non-Hispanic and, to a lesser extent, black non-Hispanic women across the eight sites, suggesting the utility of a deprivation index for research into neighborhood effects on adverse birth outcomes.
Abstract: Census data are widely used for assessing neighborhood socioeconomic context. Research using census data has been inconsistent in variable choice and usually limited to single geographic areas. This paper seeks to a) outline a process for developing a neighborhood deprivation index using principal components analysis and b) demonstrate an example of its utility for identifying contextual variables that are associated with perinatal health outcomes across diverse geographic areas. Year 2000 U.S. Census and vital records birth data (1998–2001) were merged at the census tract level for 19 cities (located in three states) and five suburban counties (located in three states), which were used to create eight study areas within four states. Census variables representing five socio-demographic domains previously associated with health outcomes, including income/poverty, education, employment, housing, and occupation, were empirically summarized using principal components analysis. The resulting first principal component, hereafter referred to as neighborhood deprivation, accounted for 51 to 73% of the total variability across the eight study areas. Component loadings were consistent both within and across study areas (0.2–0.4), suggesting that each variable contributes approximately equally to "deprivation" across diverse geographies. The deprivation index was associated with the unadjusted prevalence of preterm birth and low birth weight for white non-Hispanic and, to a lesser extent, black non-Hispanic women across the eight sites. The high correlations between census variables, the inherent multidimensionality of constructs like neighborhood deprivation, and the observed associations with birth outcomes suggest the utility of a deprivation index for research into neighborhood effects on adverse birth outcomes.
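The index construction described above (standardize the census variables, take the first principal component, report its loadings and explained variance) can be sketched in a few lines. The data below are synthetic stand-ins, not the paper's census variables, and the PCA is done via SVD rather than any particular statistics package:

```python
import numpy as np

# Hypothetical tract-level indicators (rows = tracts; columns = five domain
# measures, e.g. % poverty, % < high school, % unemployed, % crowded housing,
# % low-wage occupations). Values are invented for illustration.
rng = np.random.default_rng(0)
shared = rng.normal(size=(200, 1))            # common "deprivation" signal
X = shared + 0.5 * rng.normal(size=(200, 5))  # five correlated indicators

# Standardize, then take the first principal component as the index,
# mirroring the paper's approach (sketch only).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
loadings = Vt[0]                              # component loadings per variable
index = Z @ loadings                          # neighborhood deprivation index
explained = S[0] ** 2 / (S ** 2).sum()        # fraction of variance on PC1
print(f"PC1 explains {explained:.0%} of total variance")
```

With strongly correlated inputs, the loadings come out nearly equal in magnitude and of one sign, which is the property the paper cites as evidence that each variable contributes about equally to the index.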

688 citations


Journal ArticleDOI
TL;DR: Morphometric analysis of the CNS indicated unequivocally that the brain is a critical target for PM exposure and implicated oxidative stress as a predisposing factor that links PM exposure and susceptibility to neurodegeneration.
Abstract: Particulate air pollution has been associated with respiratory and cardiovascular disease. Evidence for cardiovascular and neurodegenerative effects of ambient particles was reviewed as part of a workshop. The purpose of this critical update is to summarize the evidence presented for the mechanisms involved in the translocation of particles from the lung to other organs and to highlight the potential of particles to cause neurodegenerative effects. Fine and ultrafine particles, after deposition on the surfactant film at the air-liquid interface, are displaced by surface forces exerted on them by surfactant film and may then interact with primary target cells upon this displacement. Ultrafine and fine particles can then penetrate through the different tissue compartments of the lungs and eventually reach the capillaries and circulating cells or constituents, e.g. erythrocytes. These particles are then translocated by the circulation to other organs including the liver, the spleen, the kidneys, the heart and the brain, where they may be deposited. It remains to be shown by which mechanisms ultrafine particles penetrate through pulmonary tissue and enter capillaries. In addition to translocation of ultrafine particles through the tissue, fine and coarse particles may be phagocytized by macrophages and dendritic cells which may carry the particles to lymph nodes in the lung or to those closely associated with the lungs. There is the potential for neurodegenerative consequence of particle entry to the brain. Histological evidence of neurodegeneration has been reported in both canine and human brains exposed to high ambient PM levels, suggesting the potential for neurotoxic consequences of PM-CNS entry. PM mediated damage may be caused by the oxidative stress pathway. Thus, oxidative stress due to nutrition, age, genetics among others may increase the susceptibility for neurodegenerative diseases. 
The relationship between PM exposure and CNS degeneration can also be detected under controlled experimental conditions. Transgenic mice (ApoE-/-), known to have high baseline levels of oxidative stress, were exposed by inhalation to well-characterized, concentrated ambient air pollution. Morphometric analysis of the CNS indicated unequivocally that the brain is a critical target for PM exposure and implicated oxidative stress as a predisposing factor that links PM exposure and susceptibility to neurodegeneration. Together, these data present evidence for potential translocation of ambient particles to organs distant from the lung and for the neurodegenerative consequences of exposure to air pollutants.

541 citations


Journal ArticleDOI
TL;DR: This review article summarizes what is known about human health following exposure to dioxins and is meant primarily for health professionals but was also written with the general public in mind.

489 citations


Journal ArticleDOI
TL;DR: This work has now been extended to noncancer effects, with the eventual objective of harmonizing framework approaches to both cancer and noncancer endpoints.
Abstract: Structured frameworks are extremely useful in promoting transparent, harmonized approaches to the risk assessment of chemicals. One area where this has been particularly successful is in the analysis of modes of action (MOAs) for chemical carcinogens in experimental animals and their relevance to humans. The International Programme on Chemical Safety (IPCS) recently published an updated version of its MOA framework in animals to address human relevance (cancer human relevance framework, or HRF). This work has now been extended to noncancer effects, with the eventual objective of harmonizing framework approaches to both cancer and noncancer endpoints. As in the cancer HRF, the first step is to determine whether the weight of evidence based on experimental observations is sufficient to establish a hypothesized MOA. This comprises a series of key events causally related to the toxic effect, identified using an approach based on the Bradford Hill criteria. These events are then compared qualitatively and, next, quantitatively between experimental animals and humans. The output of the analysis is a clear statement of conclusions, together with the confidence, analysis, and implications of the findings. This framework provides a means of ensuring a transparent evaluation of the data, identification of key data gaps and of information that would be of value in the further risk assessment of the compound, such as on dose-response relationships, and recognition of potentially susceptible subgroups, for example, based on life-stage considerations.

472 citations


Journal ArticleDOI
TL;DR: The results reveal that the VBN theory is a plausible explanation for the differences measured in the respondents' perception of ecological risk.
Abstract: A mail survey on ecological risk perception was administered in the summer of 2002 to a randomized sample of the lay public and to selected risk professionals at the U.S. Environmental Protection Agency (US EPA). The ranking of 24 ecological risk items, from global climate change to commercial fishing, reveals that the lay public is more concerned about low-probability, high-consequence risks whereas the risk professionals are more concerned about risks that pose long-term, ecosystem-level impacts. To test the explanatory power of the value-belief-norm (VBN) theory for risk perception, respondents were questioned about their personal values, spiritual beliefs, and worldviews. The most consistent predictors of the risk rankings are belief in the new ecological paradigm (NEP) and Schwartz's altruism. The NEP and Schwartz's altruism explain from 19% to 46% of the variance in the risk rankings. Religious beliefs account for less than 6% of the variance and do not show a consistent pattern in predicting risk perception although religious fundamentalists are generally less concerned about the risk items. While not exerting as strong an impact, social-structural variables do have some influence on risk perception. Ethnicities show no effect on the risk scales but the more educated and financially well-off are less concerned about the risk items. Political leanings have no direct influence on risk rankings, but indirectly affect rankings through the NEP. These results reveal that the VBN theory is a plausible explanation for the differences measured in the respondents' perception of ecological risk.

Journal ArticleDOI
TL;DR: The development of stable markets in ecosystem services is now a major neoliberal policy initiative in the United States and elsewhere; such markets, however, require ecosystem scientists to play a crucial role.
Abstract: The development of stable markets in ecosystem services is now a major neoliberal policy initiative in the United States and elsewhere. Such markets, however, require ecosystem scientists to play a...

Journal ArticleDOI
TL;DR: The results improve the understanding of P and its labile components within a spatially explicit context and distinguish P-enriched areas from unaffected ("natural") areas and intermediate zones that are currently undergoing change as P is mobilized and translocated.
Abstract: The hazards associated with pathogens in land-applied animal and human wastes have long been recognized. Management of these risks requires an understanding of sources, concentrations, and removal by processes that may be used to treat the wastes; survival in the environment; and exposure to sensitive populations. The major sources are animal feeding operations, municipal wastewater treatment plant effluents, biosolids, and on-site treatment systems. More than 150 known enteric pathogens may be present in the untreated wastes, and one new enteric pathogen has been discovered every year over the past decade. There has been increasing demand that risks associated with the land treatment and application be better defined. For risks to be quantified, more data are needed on the concentrations of pathogens in wastes, the effectiveness of treatment processes, standardization of detection methodology, and better quantification of exposure.

Journal ArticleDOI
TL;DR: In this paper, the authors describe the main components of this international validation effort, including the current participants, their ground LAI measurements and scaling techniques, and the metadata and infrastructure established to share data.
Abstract: Initiated in 1984, the Committee on Earth Observation Satellites' Working Group on Calibration and Validation (CEOS WGCV) pursues activities to coordinate, standardize and advance calibration and validation of civilian satellites and their data. One subgroup of CEOS WGCV, Land Product Validation (LPV), was established in 2000 to define standard validation guidelines and protocols and to foster data and information exchange relevant to the validation of land products. Since then, a number of leaf area index (LAI) products have become available to the science community at both global and regional extents. Having multiple global LAI products and multiple, disparate validation activities related to these products presents the opportunity to realize efficiency through international collaboration. The LPV subgroup therefore established an international LAI intercomparison validation activity. This paper describes the main components of this international validation effort. The paper documents the current participants, their ground LAI measurements and scaling techniques, and the metadata and infrastructure established to share data. The paper concludes by describing plans for sharing both field data and high-resolution LAI products from each site. Many considerations of this global LAI intercomparison can apply to other products, and this paper presents a framework for such collaboration.

Journal ArticleDOI
TL;DR: In this paper, treatment processes are reviewed with regard to their potential for removing endocrine-disrupting chemicals, focusing on specific groups of endocrine disruptors (estrogens and alkylphenols).

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the potential implications of alternative modeling approaches for conclusions about future range shifts and extinctions using a common data set, which entailed the current ranges of 100 randomly selected mammal species found in the western hemisphere.
Abstract: Predicted changes in the global climate are likely to cause large shifts in the geographic ranges of many plant and animal species. To date, predictions of future range shifts have relied on a variety of modeling approaches with different levels of model accuracy. Using a common data set, we investigated the potential implications of alternative modeling approaches for conclusions about future range shifts and extinctions. Our common data set entailed the current ranges of 100 randomly selected mammal species found in the western hemisphere. Using these range maps, we compared six methods for modeling predicted future ranges. Predicted future distributions differed markedly across the alternative modeling approaches, which in turn resulted in estimates of extinction rates that ranged between 0% and 7%, depending on which model was used. Random forest predictors, a model-averaging approach, consistently outperformed the other techniques (correctly predicting >99% of current absences and 86% of current presences). We conclude that the types of models used in a study can have dramatic effects on predicted range shifts and extinction rates; and that model-averaging approaches appear to have the greatest potential for predicting range shifts in the face of climate change.
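The finding that model averaging outperformed single models can be illustrated with a toy ensemble: several simple presence/absence rules vote, and a cell is predicted inside the range when a majority agree. The rules, thresholds, and cells below are invented stand-ins, not the paper's actual six modeling approaches:

```python
# Toy ensemble sketch of the model-averaging idea (e.g. random forests):
# combine several weak presence/absence classifiers by averaging their votes.

def model_temp(cell):    # presence if temperature falls in a tolerable band
    return 1 if 5 <= cell["temp"] <= 25 else 0

def model_precip(cell):  # presence if rainfall is sufficient
    return 1 if cell["precip"] >= 400 else 0

def model_elev(cell):    # presence below a treeline-like cutoff
    return 1 if cell["elev"] <= 2500 else 0

def ensemble_predict(cell, models, threshold=0.5):
    """Average the models' votes; predict presence above the threshold."""
    votes = [m(cell) for m in models]
    return sum(votes) / len(votes) >= threshold

models = [model_temp, model_precip, model_elev]
cells = [
    {"temp": 18, "precip": 600, "elev": 900},   # suitable on all criteria
    {"temp": 30, "precip": 650, "elev": 1200},  # too hot; 2 of 3 vote present
    {"temp": 2,  "precip": 150, "elev": 3100},  # fails every criterion
]
predicted_range = [ensemble_predict(c, models) for c in cells]
print(predicted_range)  # -> [True, True, False]
```

Averaging damps the idiosyncratic errors of any single rule, which is the same intuition behind the random forest predictors the study found most accurate.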

Journal ArticleDOI
TL;DR: In this article, the authors developed a modeling framework to estimate the emissions from fires in North and parts of Central America by taking advantage of a combination of complementary satellite and ground-based data to refine estimates of fuel loadings.

Journal ArticleDOI
TL;DR: It is illustrated how environmental tobacco smoke, outdoor air pollution, and climate change may act as environmental risk factors for the development of asthma and mechanistic explanations for how some of these effects can occur are provided.
Abstract: Asthma is a multifactorial airway disease that arises from a relatively common genetic background interfaced with exposures to allergens and airborne irritants. The rapid rise in asthma over the past three decades in Western societies has been attributed to numerous diverse factors, including increased awareness of the disease, altered lifestyle and activity patterns, and ill-defined changes in environmental exposures. It is well accepted that persons with asthma are more sensitive than persons without asthma to air pollutants such as cigarette smoke, traffic emissions, and photochemical smog components. It has also been demonstrated that exposure to a mix of allergens and irritants can at times promote the development phase (induction) of the disease. Experimental evidence suggests that complex organic molecules from diesel exhaust may act as allergic adjuvants through the production of oxidative stress in airway cells. It also seems that climate change is increasing the abundance of aeroallergens such as pollen, which may result in greater incidence or severity of allergic diseases. In this review we illustrate how environmental tobacco smoke, outdoor air pollution, and climate change may act as environmental risk factors for the development of asthma and provide mechanistic explanations for how some of these effects can occur.

Journal ArticleDOI
TL;DR: In this article, the authors examine the concept and implementation of sustainable transport in the urban context and identify four emerging areas of innovation: New Mobility, City Logistics, Intelligent System Management, and Livability.

Journal ArticleDOI
TL;DR: A descriptive model, the Biological Condition Gradient (BCG), that describes how 10 ecological attributes change in response to increasing levels of stressors is proposed that will provide a means to make more consistent, ecologically relevant interpretations of the response of aquatic biota to stressors and to better communicate this information to the public.
Abstract: The United States Clean Water Act (CWA; 1972, and as amended, U.S. Code title 33, sections 1251–1387) provides the long-term, national objective to “restore and maintain the ... biological integrity of the Nation's waters” (section 1251). However, the Act does not define the ecological components, or attributes, that constitute biological integrity nor does it recommend scientific methods to measure the condition of aquatic biota. One way to define biological integrity was described over 25 years ago as a balanced, integrated, adaptive system. Since then a variety of different methods and indices have been designed and applied by each state to quantify the biological condition of their waters. Because states in the United States use different methods to determine biological condition, it is currently difficult to determine if conditions vary across states or to combine state assessments to develop regional or national assessments. A nationally applicable model that allows biological condition to be interp...

Journal ArticleDOI
TL;DR: This review of the economic literature on aquatic invasive species is the first stage in the development of a consistent method to estimate the national costs of aquatic invasives.
Abstract: Invasive species are a growing threat in the United States, causing losses in biodiversity, changes in ecosystems, and impacts on economic enterprises such as agriculture, fisheries, and international trade. The costs of preventing and controlling invasive species are not well understood or documented, but estimates indicate that the costs are quite high. The costs of aquatic invasive species are even less well understood than those for terrestrial species. A systematic approach is needed to develop a consistent method to estimate the national costs of aquatic invasives. This review of the economic literature on aquatic invasive species is the first stage in the development of that estimate. We reviewed over sixty sources and include both empirical papers that present cost estimates as well as theoretical papers on preventing and mitigating the impacts of aquatic invasive species. Species-specific estimates are included for both animals and plants.

Journal ArticleDOI
TL;DR: Chronic exposure to a pesticide and mitochondrial toxin brings into play three systems: DJ-1, alpha-synuclein, and the ubiquitin-proteasome system. This implies that mitochondrial dysfunction and oxidative stress link environmental and genetic forms of the disease.

Journal ArticleDOI
TL;DR: This analysis shows that national exposure estimates are most influenced by reported concentrations in imported tuna, swordfish, and shrimp; Pacific pollock; and Atlantic crabs, indicating the importance of spatially refined mercury concentration data.
Abstract: Human exposure to methylmercury (MeHg) causes a variety of adverse health effects, including developmental delays in children of exposed mothers (Cohen et al. 2005) and deficits in neurocognitive function in adults (Yokoo et al. 2003). Blood MeHg concentrations in individuals are strongly correlated with the frequency and types of seafood consumed (Mahaffey et al. 2004). However, even for pregnant women, consuming seafood has a variety of health benefits when dietary MeHg intake is known to be low (e.g., Daniels et al. 2004; Mozaffarian and Rimm 2006). Regulatory agencies rely on information about how individuals are exposed to MeHg to evaluate trade-offs among health benefits from fish consumption and potential risks of MeHg exposure. In the United States, MeHg risk management takes the form of both advisories recommending limits on amounts of high-Hg fish consumed and regulations that control emissions from human sources. Assessing the effectiveness of both strategies in terms of changes in human exposure requires data on a) geographic supply regions for fish consumed by the U.S. population, and b) concentrations of Hg in fish and shellfish. Comparing the supply of fisheries products for all individuals from the commercial market (18.9 g/person/day, 2000–2002) [National Marine Fisheries Service (NMFS) 2003] to the total intake from dietary recall surveys (16.9 g/person/day, uncooked fish weight, 1994–1996–1998) [U.S. Environmental Protection Agency (EPA) 2002] shows that mean consumption estimates are comparable in magnitude. Hence, across the entire U.S. population, most seafood consumed comes from the commercial market. Estuarine and marine fish and shellfish dominate the edible supply of fish in the commercial market, comprising > 90% of the market share (Carrington et al. 2004). Thus, dietary intake of MeHg from estuarine and marine seafood accounts for most exposure in the U.S. population. 
Although many studies have investigated how variability in amounts and types of fish consumed affects MeHg exposure, few have addressed uncertainties resulting from natural stochasticity in MeHg concentrations within seafood categories in the commercial market. Instead, most studies rely on Food and Drug Administration (FDA) survey data to characterize Hg concentration distributions (e.g., Carrington and Bolger 2002; Carrington et al. 2004; Mahaffey et al. 2004; Tran et al. 2004). However, FDA survey data are usually aggregated into one mean Hg concentration for each commercial market category. This can be problematic because each market category (e.g., fresh and frozen tuna) may comprise a number of different biological species (e.g., for tuna: albacore, bigeye, bluefin, skipjack, yellowfin) with different growth rates and dietary preferences that affect Hg bioaccumulation. In addition, fish and shellfish in the commercial market consist of domestic landings from the Atlantic and Pacific oceans and imported species from a variety of countries. Many researchers have reported geographic variability in Hg concentrations among commercially important fish and shellfish species. For example, various tuna species caught in the Atlantic, Pacific, and Mediterranean oceans have significantly different length- and weight-normalized tissue Hg residues (Adams 2004; Anderson and Depledge 1997; Brooks 2004; Morrisey et al. 2004; Storelli et al. 2002). In addition, although imported shrimp make up a large fraction of domestic seafood consumption (NMFS 2003), Hg concentrations reported by the FDA are typically below detection limits (FDA 2006a, 2006b). However, measured Hg concentrations in shrimp caught in a variety of countries vary by an order of magnitude (Minganti et al. 1996; Plessi et al. 2001; Ruelas-Inzunza et al. 2004). Although high Hg concentrations can sometimes be attributed to sampling at contaminated sites (Chvojka et al. 1990) or age and size classes of fish not commonly found in the commercial seafood market, Burger et al. (2005) also found significant differences between nationwide FDA values and Hg levels in fish sold in seafood markets in the New Jersey region. Based on these data, we hypothesize that variability in Hg intakes within each species category in the commercial market is not adequately captured by grouping Hg concentrations in fish caught in geographically diverse regions into a single population mean. Better resolution in Hg concentration data used for exposure assessments may be obtained by grouping survey data by the origin of each marine and estuarine seafood product in the commercial market. This study assessed how estimated Hg exposure from estuarine and marine seafood in the U.S. population is affected by variability in Hg concentrations among different supply regions. To do this, the supply of fisheries products was divided into categories based on the geographic sources of seafood in the commercial market consumed by the U.S. population. Expected Hg intake rates for different age groups, such as children and women of childbearing age, were modeled using Hg concentration data from each supply region, market share, and total consumption of each species from the NMFS (2001, 2002, 2003). Data from the U.S. Department of Agriculture’s Continuing Survey of Food Intake by Individuals (CSFII) (U.S. EPA 2002) and the National Health and Nutrition Examination Survey (NHANES) (NCHS 2006) provided information on variability in consumption patterns and body weights in the U.S. population. Distributions of intakes calculated in this study from geographically explicit Hg data were compared with values obtained using FDA Hg concentrations to assess whether variability in Hg concentrations by species and geographic regions significantly affects per capita intakes used to evaluate risks associated with Hg exposure.
Geographically referenced exposure data provide a building block for quantitatively assessing how global changes in environmental Hg concentrations will affect human exposure to Hg in the United States.
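The exposure model described above reduces to simple arithmetic: each market category's Hg concentration is the average of regional concentrations weighted by each supply region's market share, multiplied by daily consumption and divided by body weight. A minimal sketch of that calculation (the function name and all numbers are hypothetical illustrations, not values from the study):

```python
def mehg_intake(consumption_g_day, market_shares, hg_ug_g, body_weight_kg):
    """Estimate per capita MeHg intake (ug per kg body weight per day)
    for one market category, weighting each supply region's mean Hg
    concentration by that region's share of the commercial market."""
    weighted_conc = sum(share * conc
                        for share, conc in zip(market_shares, hg_ug_g))
    return consumption_g_day * weighted_conc / body_weight_kg

# Hypothetical category supplied by two regions with different Hg levels
intake = mehg_intake(
    consumption_g_day=17.0,    # mean daily seafood intake, g/day
    market_shares=[0.6, 0.4],  # fraction of supply from each region
    hg_ug_g=[0.10, 0.35],      # mean Hg concentration per region, ug/g
    body_weight_kg=70.0,
)
```

Collapsing `hg_ug_g` to a single nationwide mean corresponds to the FDA-style aggregation the study critiques; keeping the regional breakdown is what allows intake distributions to reflect supply-region variability.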

Journal ArticleDOI
TL;DR: Enantiomer-specific formulations could decrease pesticide use and protect the environment from unintended effects, according to scientists at the European Food Safety Authority.
Abstract: Enantiomer-specific formulations could decrease pesticide use and protect the environment from unintended effects.

Journal ArticleDOI
TL;DR: Information is presented on the classes of environmental chemicals that display antiandrogenic and androgenic activities in vitro and in vivo, and insight is provided into how mixtures of these chemicals might behave in utero.
Abstract: Within the last decade, several classes of chemicals have been shown in laboratory studies to disrupt reproductive development by acting as androgen receptor (AR) antagonists and/or inhibitors of fetal Leydig cell testosterone production. Some phthalate esters alter gubernacular differentiation by reducing insulin-like 3 (insl3) mRNA levels. We have found that AR antagonists and inhibitors of fetal testis hormone production generally induce cumulative, apparently dose-additive adverse effects when administered in mixtures. New research has also revealed the presence of androgens in the environment. Effluents from pulp and paper mills display androgenic activity of sufficient potency to masculinize and/or sex-reverse female fish. Effluent from beef cattle concentrated animal feedlot operations in the United States also displays androgenic activity in vitro, due, in part, to the presence of a steroid used to promote growth in beef cattle. In summary, we are only beginning to identify the classes of chemicals that have the potential to alter the androgen signalling pathway in utero. This review will (i) present information on the classes of environmental chemicals that display antiandrogenic and androgenic activities in vitro and in vivo, and (ii) provide insight into how mixtures of these chemicals might behave in utero.
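The "apparently dose-additive" mixture behavior noted above is conventionally modeled by summing toxic units, i.e., each chemical's dose scaled by its own potency. A minimal sketch under that standard assumption (the function name and all values are hypothetical illustrations, not data from the review):

```python
def dose_addition_index(doses, ed50s):
    """Sum of toxic units under a dose-addition model: each chemical's
    dose divided by its own ED50. An index of 1.0 means the mixture is
    predicted to act like a half-maximal dose of a single chemical."""
    return sum(dose / ed50 for dose, ed50 in zip(doses, ed50s))

# Hypothetical three-chemical antiandrogen mixture: each component is
# well below its own ED50, yet the summed toxic units accumulate
index = dose_addition_index(doses=[10.0, 5.0, 2.0],
                            ed50s=[100.0, 25.0, 20.0])
```

The point the review makes is visible in the arithmetic: components that are individually at "no effect" doses can still sum to a substantial combined toxic-unit index.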

Journal ArticleDOI
TL;DR: Simulation tests to compare methods for detecting recent bottlenecks using microsatellite data find that Mk was the method most likely to correctly identify a bottleneck when a bottleneck lasted several generations, the population had made a demographic recovery, and mutation rates were high or pre-bottleneck population sizes were large.
Abstract: This paper describes simulation tests to compare methods for detecting recent bottlenecks using microsatellite data. This study considers both type I error (detecting a bottleneck when there wasn’t one) and type II error (failing to detect a bottleneck when there was one) under a variety of scenarios. The two most promising methods were the range in allele size conditioned on the number of alleles, Mk, and heterozygosity given the number of alleles, Hk, under a two-phase mutation model; in most of the simulations one of these two methods had the lowest type I and type II error relative to other methods. Mk was the method most likely to correctly identify a bottleneck when a bottleneck lasted several generations, the population had made a demographic recovery, and mutation rates were high or pre-bottleneck population sizes were large. On the other hand Hk was most likely to correctly identify a bottleneck when a bottleneck was more recent and less severe and when mutation rates were low or pre-bottleneck population sizes were small. Both methods were prone to type I errors when assumptions of the model were violated, but it may be easier to design a conservative heterozygosity test than a conservative ratio test.
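The allele-size-range statistic discussed above is closely related to the Garza–Williamson M-ratio, M = k/(r + 1), where k is the number of distinct alleles and r is the range in allele size: bottlenecks tend to remove alleles (lowering k) faster than they shrink the size range (r), depressing the ratio. A minimal sketch of that ratio (the function name and example data are illustrative, and this simplification omits the conditioning on allele number used in the paper's tests):

```python
def m_ratio(allele_sizes):
    """Garza-Williamson M-ratio for one microsatellite locus: number of
    distinct alleles divided by (allele size range + 1). Low values
    (gaps in the allele-size ladder) suggest a recent bottleneck."""
    sizes = sorted(set(allele_sizes))
    k = len(sizes)            # number of distinct alleles
    r = sizes[-1] - sizes[0]  # range in allele size (repeat units)
    return k / (r + 1)

# Hypothetical locus: 4 alleles spanning a range of 7 repeat units,
# so half of the possible size classes are missing -> M = 0.5
m = m_ratio([8, 9, 11, 15])
```

A locus with no gaps (e.g., alleles at sizes 8, 9, 10) gives M = 1.0; the test's power comes from how quickly drift opens gaps in that ladder.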

Journal ArticleDOI
TL;DR: Overall, although these intercomparisons suggest areas where further research is needed, they support the contention that PM2.5 mass source apportionment results are consistent across users and methods, and that today's source apportionment methods are robust enough for application to PM2.5 health effects assessments.
Abstract: During the past three decades, receptor models have been used to identify and apportion ambient concentrations to sources. A number of groups are employing these methods to provide input into air quality management planning. A workshop has explored the use of resolved source contributions in health effects models. Multiple groups have analyzed particulate composition data sets from Washington, DC and Phoenix, AZ. Similar source profiles were extracted from these data sets by the investigators using different factor analysis methods. There was good agreement among the major resolved source types. Crustal (soil), sulfate, oil, and salt were the sources that were most unambiguously identified (generally highest correlation across the sites). Traffic and vegetative burning showed considerable variability among the results with variability in the ability of the methods to partition the motor vehicle contributions between gasoline and diesel vehicles. However, if the total motor vehicle contributions are estimated, good correspondence was obtained among the results. The source impacts were especially similar across various analyses for the larger mass contributors (e.g., in Washington, secondary sulfate SE=7% and 11% for traffic; in Phoenix, secondary sulfate SE=17% and 7% for traffic). Especially important for time-series health effects assessment, the source-specific impacts were found to be highly correlated across analysis methods/researchers for the major components (e.g., mean analysis to analysis correlation, r>0.9 for traffic and secondary sulfates in Phoenix and for traffic and secondary nitrates in Washington. The sulfate mean r value is >0.75 in Washington.). 
Overall, although these intercomparisons suggest areas where further research is needed (e.g., better division of traffic emissions between diesel and gasoline vehicles), they support the contention that PM2.5 mass source apportionment results are consistent across users and methods, and that today's source apportionment methods are robust enough for application to PM2.5 health effects assessments.

Journal ArticleDOI
TL;DR: In this article, the authors used data for average riverine nitrogen flux and anthropogenic inputs of nitrogen over a 6-year period (1988-1993) for 16 major watersheds in the northeastern United States to examine if there is also a climatic influence on nitrogen fluxes in rivers.
Abstract: The flux of nitrogen in large rivers in North America and Europe is well explained as a function of the net anthropogenic inputs of nitrogen to the landscape, with on average 20 to 25% of these inputs exported in rivers and 75 to 80% of the nitrogen retained or denitrified in the landscape. Here, we use data for average riverine nitrogen fluxes and anthropogenic inputs of nitrogen over a 6-year period (1988–1993) for 16 major watersheds in the northeastern United States to examine if there is also a climatic influence on nitrogen fluxes in rivers. Previous studies have shown that for any given river, nitrogen fluxes are greater in years with higher discharge, but this can be interpreted as storage of nitrogen in the landscape during dry years and flushing of this stored nitrogen during wet years. Our analyses demonstrate that there is also a longer-term steady-state influence of climate on riverine nitrogen fluxes. Those watersheds that have higher precipitation and higher discharge export a greater fraction of the net anthropogenic inputs of nitrogen. This fractional export ranges from 10 to 15% of the nitrogen inputs in drier watersheds in the northeastern United States to over 35% in the wetter watersheds. We believe this is driven by lower rates of denitrification in the wetter watersheds, perhaps because shorter water residence times do not allow for as much denitrification in riparian wetlands and low-order streams. Using mean projections for the consequences of future climate change on precipitation and discharge, we estimate that nitrogen fluxes in the Susquehanna River to Chesapeake Bay may increase by 3 to 17% by 2030 and by 16 to 65% by 2095 due to greater fractional delivery of net anthropogenic nitrogen inputs as precipitation and discharge increase. 
Although these projections are highly uncertain, they suggest a need to better consider the influence of climate on riverine nitrogen fluxes as part of management efforts to control coastal nitrogen pollution.
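The central quantity in the analysis above is the fractional export: riverine nitrogen flux divided by net anthropogenic nitrogen inputs, with the remainder retained or denitrified in the landscape. A minimal sketch of that ratio (the function name and numbers are illustrative, not the study's data):

```python
def fractional_export(riverine_n_flux, net_anthropogenic_n_input):
    """Fraction of net anthropogenic N inputs exported by a river;
    the remainder is retained or denitrified in the landscape.
    Both arguments must be in the same units (e.g., kg N/km2/yr)."""
    return riverine_n_flux / net_anthropogenic_n_input

# Hypothetical watershed: 220 of 1000 kg N/km2/yr exported -> 0.22,
# within the 20-25% average the paper cites for large rivers; the
# paper finds wetter watersheds can exceed 0.35
f = fractional_export(220.0, 1000.0)
```

The climate effect the paper describes then enters as a dependence of this fraction on precipitation and discharge, rather than as a change in the anthropogenic inputs themselves.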

BookDOI
01 Dec 2006
TL;DR: Wu et al. as discussed by the authors proposed a framework and methods for simplifying complex landscapes to reduce uncertainty in predictions, and assessed the influence of spatial scale on the relationship between avian nesting success and forest fragmentation.
Abstract: Dedication. Preface J. Wu et al. List of Contributors.- Part I. Concepts and Methods. 1. Concepts of scale and scaling J. Wu, H. Li. 2. Perspectives and methods of scaling J. Wu, H. Li. 3. Uncertainty analysis in ecological studies: An overview H. Li, J. Wu. 4. Multilevel statistical models and ecological scaling R.A. Berk, J. De Leeuw. 5. Downscaling abundance from the distribution of species: Occupancy theory and applications F. He, W. Reed. 6. Scaling terrestrial biogeochemical processes: Contrasting intact and model experimental systems M.A. Bradford, J.F. Reynolds. 7. A framework and methods for simplifying complex landscapes to reduce uncertainty in predictions D.P.C. Peters et al. 8. Building up with a top-down approach: The role of remote sensing in deciphering functional and structural diversity C.A. Wessman, C.A. Bateson.- Part II. Case studies. 9. Carbon fluxes across regions: Observational constraints at multiple scales B.E. Law et al. 10. Landscape and regional scale studies of nitrogen gas fluxes P.M. Groffman et al. 11. Multiscale relationships of landscape characteristics and nitrogen concentrations in streams K.B. Jones et al. 12. Uncertainty in scaling nutrient export coefficients J.D. Wickham et al. 13. Causes and consequences of land use change in the North Carolina Piedmont: The scope of uncertainty D.L. Urban et al. 14. Assessing the influence of spatial scale on the relationship between avian nesting success and forest fragmentation P. Lloyd et al. 15. Scaling issues in mapping riparian zones with remote sensing data: Quantifying errors and sources of uncertainty T.P. Hollenhorst et al. 16. Scale issues in lake-watershed interaction: Assessing shoreline development impacts on water clarity C.A. Johnston, B.A. Shmagin. 17. Scaling and uncertainty in region-wide water quality decision-making O.L. Loucks et al.- Part III. Synthesis. 18. Scaling with known uncertainty: A Synthesis J. Wu et al.- Index.

Journal ArticleDOI
TL;DR: Attributes of the fathead minnow make it an excellent model for addressing new challenges in aquatic toxicology, including identification of sensitive life-stages/endpoints for chemicals with differing modes/mechanisms of action, predicting population-level effects based on data collected from lower levels of biological organization, and exploring the emerging role of genomics in research and regulation.