
Showing papers by "Carleton University" published in 2007


Journal ArticleDOI
TL;DR: This work presents a technique for real-time adaptive thresholding using the integral image of the input, an extension of a previous method that is more robust to illumination changes in the image.
Abstract: Image thresholding is a common task in many computer vision and graphics applications. The goal of thresholding an image is to classify pixels as either "dark" or "light." Adaptive thresholding is a form of thresholding that takes into account spatial variations in illumination. We present a technique for real-time adaptive thresholding using the integral image of the input. Our technique is an extension of a previous method. However, our solution is more robust to illumination changes in the image. Additionally, our method is simple and easy to implement. Our technique is suitable for processing live video streams at a real-time frame-rate, making it a valuable tool for interactive applications such as augmented reality. Source code is available online.

1,041 citations
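
The core of the method is a local mean computed in constant time per pixel from an integral image. The NumPy sketch below illustrates that idea; the window size and darkness fraction are illustrative choices, and the code is a schematic of the integral-image approach rather than the authors' released source.

```python
# Schematic integral-image adaptive thresholding (illustrative parameters).
import numpy as np

def adaptive_threshold(gray, window=15, t=0.15):
    """Classify each pixel as dark (0) or light (255) against its local mean."""
    h, w = gray.shape
    # Integral image with a zero row/column prepended so window sums are four lookups.
    ii = np.zeros((h + 1, w + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(gray.astype(np.float64), axis=0), axis=1)

    half = window // 2
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        for x in range(w):
            x0, x1 = max(0, x - half), min(w, x + half + 1)
            count = (y1 - y0) * (x1 - x0)
            window_sum = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            # "Light" unless the pixel is noticeably darker than its local mean.
            out[y, x] = 255 if gray[y, x] * count > window_sum * (1.0 - t) else 0
    return out
```

In practice the per-pixel loop would be vectorized; the point being illustrated is the constant-time window sum from four integral-image lookups.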


Journal ArticleDOI
TL;DR: A state aggregation technique is developed to obtain a set of decentralized control laws for the individuals which possesses an ε-Nash equilibrium property, and a stability property of the mass behavior is established.
Abstract: We consider linear quadratic Gaussian (LQG) games in large population systems where the agents evolve according to nonuniform dynamics and are coupled via their individual costs. A state aggregation technique is developed to obtain a set of decentralized control laws for the individuals which possesses an ε-Nash equilibrium property. A stability property of the mass behavior is established, and the effect of inaccurate population statistics on an isolated agent is also analyzed by variational techniques.

978 citations
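
For orientation, the cost-coupled LQG setup studied in this line of work can be written in the following generic form; the notation and the coupling function Φ below are illustrative assumptions, not necessarily the paper's exact formulation.

```latex
% Illustrative cost-coupled LQG game with nonuniform agent dynamics:
\begin{align*}
  dz_i &= (a_i z_i + b\,u_i)\,dt + \sigma\,dw_i, \qquad 1 \le i \le N,\\
  J_i(u_i,u_{-i}) &= \mathbb{E}\int_0^{\infty} e^{-\rho t}
     \Big[\big(z_i - \Phi(\bar z)\big)^2 + r\,u_i^2\Big]dt,
  \qquad \bar z \triangleq \frac{1}{N}\sum_{k\neq i} z_k .
\end{align*}
% An \varepsilon-Nash property means no agent can lower its cost by more than
% \varepsilon through a unilateral deviation, with \varepsilon \to 0 as N \to \infty.
```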


Journal ArticleDOI
TL;DR: This article reviews the effects of dietary methylmercury exposure on wild piscivorous fish, birds, and mammals; limited field-based studies, most extensively of the common loon, corroborate laboratory-based results.
Abstract: Wild piscivorous fish, mammals, and birds may be at risk for elevated dietary methylmercury intake and toxicity. In controlled feeding studies, the consumption of diets that contained Hg (as methylmercury) at environmentally realistic concentrations resulted in a range of toxic effects in fish, birds, and mammals, including behavioral, neurochemical, hormonal, and reproductive changes. Limited field-based studies, especially with certain wild piscivorous bird species, e.g., the common loon, corroborated laboratory-based results, demonstrating significant relations between methylmercury exposure and various indicators of methylmercury toxicity, including reproductive impairment. Potential population effects in fish and wildlife resulting from dietary methylmercury exposure are expected to vary as a function of species life history, as well as regional differences in fish-Hg concentrations, which, in turn, are influenced by differences in Hg deposition and environmental methylation rates. However, population modeling suggests that reductions in Hg emissions could have substantial benefits for some common loon populations that are currently experiencing elevated methylmercury exposure. Predicted benefits would be mediated primarily through improved hatching success and development of hatchlings to maturity as Hg concentrations in prey fish decline. Other piscivorous species may also benefit from decreased Hg exposure but have not been as extensively studied as the common loon.

905 citations


Journal ArticleDOI
TL;DR: Six major themes in the ecology and conservation of landscapes are assessed, and 13 important issues that need to be considered in developing approaches to landscape conservation are identified, including recognizing the importance of landscape mosaics and of interactions between vegetation cover and vegetation configuration.
Abstract: The management of landscapes for biological conservation and ecologically sustainable natural resource use are crucial global issues. Research for over two decades has resulted in a large literature, yet there is little consensus on the applicability or even the existence of general principles or broad considerations that could guide landscape conservation. We assess six major themes in the ecology and conservation of landscapes. We identify 13 important issues that need to be considered in developing approaches to landscape conservation. They include recognizing the importance of landscape mosaics (including the integration of terrestrial and aquatic areas), recognizing interactions between vegetation cover and vegetation configuration, using an appropriate landscape conceptual model, maintaining the capacity to recover from disturbance and managing landscapes in an adaptive framework. These considerations are influenced by landscape context, species assemblages and management goals and do not translate directly into on-the-ground management guidelines but they should be recognized by researchers and resource managers when developing guidelines for specific cases. Two crucial overarching issues are: (i) a clearly articulated vision for landscape conservation and (ii) quantifiable objectives that offer unambiguous signposts for measuring progress.

673 citations


Journal ArticleDOI
TL;DR: In this article, a 6-year balance computed from continuous net ecosystem CO2 exchange (NEE), regular instantaneous measurements of methane (CH4) emissions, and export of dissolved organic C (DOC) from a northern ombrotrophic bog is presented.
Abstract: Northern peatlands contain up to 25% of the world’s soil carbon (C) and have an estimated annual exchange of CO2-C with the atmosphere of 0.1–0.5 Pg yr⁻¹ and of CH4-C of 10–25 Tg yr⁻¹. Despite this overall importance to the global C cycle, there have been few, if any, complete multiyear annual C balances for these ecosystems. We report a 6-year balance computed from continuous net ecosystem CO2 exchange (NEE), regular instantaneous measurements of methane (CH4) emissions, and export of dissolved organic C (DOC) from a northern ombrotrophic bog. From these observations, we have constructed complete seasonal and annual C balances, examined their seasonal and interannual variability, and compared the mean 6-year contemporary C exchange with the apparent C accumulation for the last 3000 years obtained from C density and age–depth profiles from two peat cores. The 6-year mean NEE-C and CH4-C exchange, and net DOC loss are −40.2 ± 40.5 (±1 SD), 3.7 ± 0.5, and 14.9 ± 3.1 g m⁻² yr⁻¹, giving a 6-year mean balance of −21.5 ± 39.0 g m⁻² yr⁻¹ (where positive exchange is a loss of C from the ecosystem). NEE had the largest magnitude and variability of the components of the C balance, but DOC and CH4 had similar proportional variabilities and their inclusion is essential to resolve the C balance. There are large interseasonal and interannual ranges to the exchanges due to variations in climatic conditions. We estimate, from the largest and smallest seasonal exchanges, quasi-maximum limits of the annual C balance between 50 and −105 g m⁻² yr⁻¹. The net C accumulation rates obtained from the two peatland cores for the interval 400–3000 BP (samples from the anoxic layer only) were 21.9 ± 2.8 and 14.0 ± 37.6 g m⁻² yr⁻¹, which are not significantly different from the 6-year mean contemporary C balance.

627 citations
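
For readers tracking the sign convention, the component fluxes quoted above combine as a simple budget; the values are as reported, and the small mismatch with the quoted mean balance is presumably rounding of the components.

```latex
% Net ecosystem C balance as the sum of the reported mean components
% (positive = loss of C from the ecosystem):
\begin{align*}
  \text{balance} &= \text{NEE-C} + \text{CH}_4\text{-C} + \text{DOC}\\
                 &= -40.2 + 3.7 + 14.9 \;=\; -21.6 \ \mathrm{g\,m^{-2}\,yr^{-1}},
\end{align*}
% i.e. a mean net uptake of roughly 21.5 g C m^-2 yr^-1 over the six years,
% consistent with the reported -21.5 +/- 39.0 g m^-2 yr^-1.
```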


Journal ArticleDOI
TL;DR: In this article, the authors develop an integrative perspective on catch-and-release (C&R) by drawing on historical, philosophical, socio-psychological, biological, and managerial insights and perspectives.
Abstract: Most research on catch-and-release (C&R) in recreational fishing has been conducted from a disciplinary angle focusing on the biological sciences and the study of hooking mortality after release. This hampers understanding of the complex and multifaceted nature of C&R. In the present synopsis, we develop an integrative perspective on C&R by drawing on historical, philosophical, socio-psychological, biological, and managerial insights and perspectives. Such a perspective is helpful for a variety of reasons, such as 1) improving the science supporting successful fisheries management and conservation, 2) facilitating dialogue between managers, anglers, and other stakeholders, 3) minimizing conflict potentials, and 4) paving the path toward sustainable recreational fisheries management. The present work highlights the array of cultural, institutional, psychological, and biological factors and dimensions involved in C&R. Progress toward successful treatment of C&R might be enhanced by acknowledging the complex...

594 citations


Journal ArticleDOI
TL;DR: The purpose of this report is to set out the salient issues associated with clinical implementation and experimental verification of MC dose algorithms, and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
Abstract: The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.

591 citations
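
One of the "unique components" the report highlights is the statistical uncertainty attached to every MC dose estimate. The toy Python sketch below shows the standard history-by-history estimate of that uncertainty; the per-history energy deposit is a made-up random model used only for illustration, not a radiation transport engine.

```python
# Toy history-by-history dose tally with its statistical uncertainty estimate.
import math
import random

def score_voxel_dose(n_histories=100_000):
    total, total_sq = 0.0, 0.0
    for _ in range(n_histories):
        # Hypothetical per-history energy deposit in one voxel (arbitrary units):
        # 30% of histories deposit an exponentially distributed amount, the rest none.
        d = random.expovariate(1.0) if random.random() < 0.3 else 0.0
        total += d
        total_sq += d * d
    mean = total / n_histories
    variance = max(total_sq / n_histories - mean * mean, 0.0)
    sigma = math.sqrt(variance / n_histories)   # standard error of the mean dose
    return mean, sigma

mean, sigma = score_voxel_dose()
print(f"dose = {mean:.4f} +/- {sigma:.4f} ({100 * sigma / mean:.2f}% relative uncertainty)")
```

Halving the relative uncertainty requires roughly four times as many histories, which is the trade-off behind the calculation-time discussion above.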


Journal ArticleDOI
TL;DR: This synthesis reviews the understanding of the relationship between landscape structure and animal movement in human-modified landscapes and develops a hypothesis that predicts the relative importance of the different population-level consequences of non-optimal movements.
Abstract: Summary 1. I synthesize the understanding of the relationship between landscape structure and animal movement in human-modified landscapes. 2. The variety of landscape structures is first classified into four categories: continuous habitat, patchy habitat with high-quality matrix, patchy habitat with low-quality matrix, and patchy, ephemeral habitat. Using this simplification I group the range of evolved movement parameters into four categories or movement types. I then discuss how these movement types interact with current human-caused landscape changes, and how this often results in non-optimal movement. 3. From this synthesis I develop a hypothesis that predicts the relative importance of the different population-level consequences of these non-optimal movements, for the four movement types. 4. Populations of species that have inhabited landscapes with high habitat cover or patchy landscapes with low-risk matrix should have evolved low boundary responses and moderate to high movement probabilities. These species are predicted to be highly susceptible to increased movement mortality resulting from habitat loss and reduced matrix quality. 5. In contrast, populations of species that evolved in patchy landscapes with high-risk matrix or dynamic patchy landscapes are predicted to be highly susceptible to decreased immigration and colonization success, due to the increasing patch isolation that results from habitat loss. 6. Finally, I discuss three implications of this synthesis: (i) ‘least cost path’ analysis should not be used for land management decisions without data on actual movement paths and movement risks in the landscape; (ii) ‘dispersal ability’ is not simply an attribute of a species, but varies strongly with landscape structure such that the relative rankings of species’ dispersal abilities can change following landscape alteration; and (iii) the assumption that more mobile species are more resilient to human-caused landscape change is not generally true, but depends on the structure of the landscape where the species evolved.

544 citations


Book
01 Jan 2007
TL;DR: In this book, the authors present rigorous descriptions of the main algorithms and their analyses for different variations of the Geometric Spanner Network Problem, along with several basic principles and results that are used throughout the book.
Abstract: Aimed at an audience of researchers and graduate students in computational geometry and algorithm design, this book uses the Geometric Spanner Network Problem to showcase a number of useful algorithmic techniques, data structure strategies, and geometric analysis techniques with many applications, practical and theoretical. The authors present rigorous descriptions of the main algorithms and their analyses for different variations of the Geometric Spanner Network Problem. Though the basic ideas behind most of these algorithms are intuitive, very few are easy to describe and analyze. For most of the algorithms, nontrivial data structures need to be designed, and nontrivial techniques need to be developed in order for analysis to take place. Still, there are several basic principles and results that are used throughout the book. One of the most important is the powerful well-separated pair decomposition. This decomposition is used as a starting point for several of the spanner constructions.

444 citations
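
To make the object of study concrete: a geometric t-spanner is a graph on the input points in which every pair is connected by a path at most t times their Euclidean distance. The sketch below implements the naive greedy construction, one of the simplest algorithms treated in this literature; it is not the WSPD-based construction highlighted in the abstract and is written for clarity, not efficiency.

```python
# Naive greedy t-spanner: process point pairs by increasing distance and add an
# edge only if the current spanner does not already provide a short enough path.
import heapq
import math

def greedy_spanner(points, t=2.0):
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    adj = {i: {} for i in range(n)}               # spanner adjacency list

    def spanner_distance(src, dst):
        # Dijkstra over the partial spanner built so far.
        best = {src: 0.0}
        pq = [(0.0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == dst:
                return d
            if d > best.get(u, math.inf):
                continue
            for v, w in adj[u].items():
                nd = d + w
                if nd < best.get(v, math.inf):
                    best[v] = nd
                    heapq.heappush(pq, (nd, v))
        return math.inf

    edges = []
    pairs = sorted(((i, j) for i in range(n) for j in range(i + 1, n)),
                   key=lambda p: dist(*p))
    for a, b in pairs:
        d = dist(a, b)
        if spanner_distance(a, b) > t * d:        # stretch factor t would be violated
            adj[a][b] = adj[b][a] = d
            edges.append((a, b))
    return edges

print(greedy_spanner([(0, 0), (1, 0), (2, 0), (1, 1)], t=1.5))
```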


Journal ArticleDOI
TL;DR: The cold regions hydrological model (CRHM) presented in this paper is a flexible, object-oriented modelling system for simulating the cold regions hydrological cycle over small to medium-sized basins.
Abstract: After a programme of integrated field and modelling research, hydrological processes of considerable uncertainty such as snow redistribution by wind, snow interception, sublimation, snowmelt, infiltration into frozen soils, hillslope water movement over permafrost, actual evaporation, and radiation exchange to complex surfaces have been described using physically based algorithms. The cold regions hydrological model (CRHM) platform, a flexible object-oriented modelling system, was devised to incorporate these algorithms and others and to connect them for purposes of simulating the cold regions hydrological cycle over small to medium sized basins. Landscape elements in CRHM can be linked episodically in process-specific cascades via blowing snow transport, overland flow, organic layer subsurface flow, mineral interflow, groundwater flow, and streamflow. CRHM has a simple user interface but no provision for calibration; parameters and model structure are selected based on the understanding of the hydrological system; as such the model can be used both for prediction and for diagnosis of the adequacy of hydrological understanding. The model is described and demonstrated in basins from the semi-arid prairie to boreal forest, mountain and muskeg regions of Canada where traditional hydrological models have great difficulty in describing hydrological phenomena. Some success is shown in simulating various elements of the hydrological cycle without calibration; this is encouraging for predicting hydrology in ungauged basins.

426 citations
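
The modular structure described above, physically based process algorithms attached to landscape elements and linked into cascades, can be pictured with a deliberately simplified object sketch. All names and numbers below are hypothetical; CRHM's real modules are physically based and far richer.

```python
# Schematic only (not CRHM code): landscape elements whose process modules are
# chained, and elements linked in a cascade so upslope output feeds downslope input.
class Element:
    def __init__(self, name, processes):
        self.name = name
        self.processes = processes                # ordered list of process callables

    def step(self, inflow_mm):
        flux = inflow_mm
        for process in self.processes:
            flux = process(flux)
        return flux

# Hypothetical process modules with toy behaviour (units: mm of water).
snowmelt     = lambda f: f + 2.0                  # melt adds water
infiltration = lambda f: max(f - 1.5, 0.0)        # part of the water infiltrates
runoff       = lambda f: 0.8 * f                  # fraction routed downslope

basin = [
    Element("ridge",  [snowmelt, infiltration]),
    Element("slope",  [snowmelt, infiltration, runoff]),
    Element("valley", [runoff]),
]

flux = 0.0
for element in basin:                             # process-specific cascade
    flux = element.step(flux)
print(f"streamflow contribution: {flux:.2f} mm")
```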


Journal ArticleDOI
TL;DR: The increased demands of decision-making and visual processing in hypertext impaired reading performance, and individual differences in readers, such as working memory capacity and prior knowledge, mediated the impact of hypertext features.

Proceedings ArticleDOI
23 May 2007
TL;DR: Software Performance Engineering encompasses efforts to describe and improve performance, with two distinct approaches: an early-cycle predictive model-based approach, and a late-cycle measurement-based approach.
Abstract: Performance is a pervasive quality of software systems; everything affects it, from the software itself to all underlying layers, such as operating system, middleware, hardware, communication networks, etc. Software Performance Engineering encompasses efforts to describe and improve performance, with two distinct approaches: an early-cycle predictive model-based approach, and a late-cycle measurement-based approach. Current progress and future trends within these two approaches are described, with a tendency (and a need) for them to converge, in order to cover the entire development cycle.
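
The contrast between the two approaches can be made concrete with a toy example: a late-cycle measurement of an operation's response time versus an early-cycle prediction from a textbook M/M/1 queueing model. The R = S / (1 − U) formula is standard; the service time, arrival rate, and measured operation below are invented stand-ins.

```python
# Toy contrast of the two Software Performance Engineering approaches.
import time

def measured_response_time(operation, n=1000):
    # Late-cycle, measurement-based: time the real operation.
    start = time.perf_counter()
    for _ in range(n):
        operation()
    return (time.perf_counter() - start) / n

def predicted_response_time(service_time_s, arrival_rate_per_s):
    # Early-cycle, model-based: textbook M/M/1 mean response time R = S / (1 - U).
    utilization = arrival_rate_per_s * service_time_s
    if utilization >= 1.0:
        raise ValueError("model predicts saturation")
    return service_time_s / (1.0 - utilization)

print("predicted R:", predicted_response_time(service_time_s=0.002, arrival_rate_per_s=300))
print("measured  R:", measured_response_time(lambda: sum(range(10_000))))
```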

Journal ArticleDOI
TL;DR: In this article, the authors explore one of the major challenges associated with governance for sustainable development: managing change in a context where power is distributed across diverse societal subsystems, and consider how change can be managed in such a context.
Abstract: This paper explores one of the major challenges associated with governance for sustainable development: managing change in a context where power is distributed across diverse societal subsystems an...

Book ChapterDOI
24 Sep 2007
TL;DR: This work proposes and examines the usability and security of Cued Click Points (CCP), a cued-recall graphical password technique, and suggests that CCP provides greater security than PassPoints because the number of images increases the workload for attackers.
Abstract: We propose and examine the usability and security of Cued Click Points (CCP), a cued-recall graphical password technique. Users click on one point per image for a sequence of images. The next image is based on the previous click-point. We present the results of an initial user study, which yielded positive results. Performance was very good in terms of speed, accuracy, and number of errors. Users preferred CCP to PassPoints (Wiedenbeck et al., 2005), saying that selecting and remembering only one point per image was easier, and that seeing each image triggered their memory of where the corresponding point was located. We also suggest that CCP provides greater security than PassPoints because the number of images increases the workload for attackers.
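
The feature that makes CCP cued-recall, each click determining the next image, can be sketched as a deterministic mapping from a discretized click location to an image index. The grid size, hash, and function inputs below are assumptions made for illustration; the paper's actual discretization and selection function may differ.

```python
# Hedged sketch: snap the click to a tolerance grid, then hash to pick the next image.
import hashlib

GRID = 19  # hypothetical tolerance square size in pixels

def next_image(current_image_id: str, click_xy: tuple, num_images: int) -> int:
    cell = (click_xy[0] // GRID, click_xy[1] // GRID)
    digest = hashlib.sha256(f"{current_image_id}:{cell}".encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_images

# Clicks in different grid cells generally lead to different next images, which is
# what can give a legitimate user implicit feedback that an earlier click was off.
print(next_image("img_042", (187, 305), num_images=330))
print(next_image("img_042", (40, 12), num_images=330))
```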

Journal ArticleDOI
TL;DR: The hypothesis that the quality of the child's rearing environment is one mechanism that carries risk to children of depressed parents is supported, and interventions for parents whose symptoms of depression interfere with parenting responsibilities could help reduce the risk of some childhood disorders.
Abstract: This study examined parental behaviors as mediators in links between depressive symptoms in mothers and fathers and child adjustment problems. Participants were 4,184 parents and 6,048 10- to 15-year-olds enrolled in the 1998 and 2000 cycles of the Canadian National Longitudinal Survey of Children and Youth. Mothers and fathers self-reported symptoms of depression at Times 1 and 2 and their children assessed parental nurturance, rejection, and monitoring and self-reported internalizing and externalizing problems and prosocial behavior at Time 2. Hierarchical linear modeling showed evidence of mediation involving all three domains of parental behavior. Findings supported the hypothesis that the quality of the child's rearing environment is one mechanism that carries risk to children of depressed parents. Interventions for parents whose symptoms of depression interfere with parenting responsibilities could help reduce the risk of some childhood disorders.

Journal ArticleDOI
TL;DR: This article reviewed background to generalized linear mixed models and the inferential techniques which have been developed for them and considered a few extensions including additive models, models for zero-heavy data, and models accommodating latent clusters.
Abstract: Breslow and Clayton (J Am Stat Assoc 88:9–25, 1993) was, and still is, a highly influential paper mobilizing the use of generalized linear mixed models in epidemiology and a wide variety of fields. An important aspect is the feasibility in implementation through the ready availability of related software in SAS (SAS Institute, PROC GLIMMIX, SAS Institute Inc., URL http://www.sas.com, 2007), S-plus (Insightful Corporation, S-PLUS 8, Insightful Corporation, Seattle, WA, URL http://www.insightful.com, 2007), and R (R Development Core Team, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria, URL http://www.R-project.org, 2006) for example, facilitating its broad usage. This paper reviews background to generalized linear mixed models and the inferential techniques which have been developed for them. To provide the reader with a flavor of the utility and wide applicability of this fundamental methodology we consider a few extensions including additive models, models for zero-heavy data, and models accommodating latent clusters.
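
For reference, the class of models under review can be written in the following standard form (generic GLMM notation, not tied to any of the software packages cited above).

```latex
% Generalized linear mixed model: conditional on cluster-level random effects b_i,
% the responses follow an exponential-family distribution with linear predictor
\begin{align*}
  g\!\big(\mathbb{E}[y_{ij} \mid b_i]\big) = x_{ij}^{\top}\beta + z_{ij}^{\top} b_i,
  \qquad b_i \sim \mathcal{N}(0, D),
\end{align*}
% where g is the link function, \beta the fixed effects, and D the random-effects
% covariance; a random-intercept logistic regression takes g = logit and z_{ij} = 1.
```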

Journal ArticleDOI
TL;DR: This paper presents the fundamental properties of causality, stability, and passivity that electrical interconnect models must satisfy in order to be physically consistent, and interprets several common situations where either model derivation or model use in a computer-aided design environment fails dramatically.
Abstract: Modern packaging design requires extensive signal integrity simulations in order to assess the electrical performance of the system. The feasibility of such simulations is granted only when accurate and efficient models are available for all system parts and components having a significant influence on the signals. Unfortunately, model derivation is still a challenging task, despite the extensive research that has been devoted to this topic. In fact, it is a common experience that modeling or simulation tasks sometimes fail, often without a clear understanding of the main reason. This paper presents the fundamental properties of causality, stability, and passivity that electrical interconnect models must satisfy in order to be physically consistent. All basic definitions are reviewed in time domain, Laplace domain, and frequency domain, and all significant interrelations between these properties are outlined. This background material is used to interpret several common situations where either model derivation or model use in a computer-aided design environment fails dramatically. We show that the root cause for these difficulties can always be traced back to the lack of stability, causality, or passivity in the data providing the structure characterization and/or in the model itself.
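
The three properties have standard mathematical statements for a linear time-invariant model with impulse response h(t) and immittance matrix H(s); the forms below are the usual textbook ones, given here for orientation rather than quoted from the paper.

```latex
\begin{align*}
  \text{causality:}\quad & h(t) = 0 \ \text{for } t < 0
    \ \ (\text{equivalently, } \operatorname{Re} H(j\omega) \text{ and } \operatorname{Im} H(j\omega)
    \text{ form a Hilbert-transform pair}),\\
  \text{stability:}\quad & \text{all poles of } H(s) \text{ satisfy } \operatorname{Re} s < 0,\\
  \text{passivity:}\quad & H(s) \text{ analytic for } \operatorname{Re} s > 0
    \ \text{and } H(s) + H^{H}(s) \succeq 0 \ \text{there (positive-real);}\\
  & \text{for scattering representations the corresponding bounded-real condition }
    I - S^{H}(s)S(s) \succeq 0 \text{ applies.}
\end{align*}
```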

Journal ArticleDOI
TL;DR: Threshold-based maximal ratio combining (MRC) and threshold-based selection combining (SC) of these multiple antenna signals are studied and analyzed, and it is found that the end-to-end (E2E) error performance of a network which has few relays with many antennas is not significantly worse than that of a network which has many relays, each with fewer antennas.
Abstract: Space, cost, and signal processing constraints, among others, often preclude the use of multiple antennas at wireless terminals. This paper investigates distributed decode-and-forward fixed relays (infrastructure-based relaying) which are engaged in cooperation in a two-hop wireless network as a means of removing the burden of multiple antennas on wireless terminals. In contrast to mobile terminals, the deployment of a small number of antennas on infrastructure-based fixed relays is feasible; thus, the paper examines the impact of multiple antennas on the performance of the distributed cooperative fixed relays. Threshold-based maximal ratio combining (MRC) and threshold-based selection combining (SC) of these multiple antenna signals are studied and analyzed. It is found that the end-to-end (E2E) error performance of a network which has few relays with many antennas is not significantly worse than that which has many relays each with fewer antennas. Obviously, the former network has a tremendous deployment cost advantage over the latter. It is also observed that the E2E error performance of a network in which the multiple antennas at relays are configured in SC fashion is not significantly worse than that in which MRC is used. For implementation, SC presents a significantly lower complexity and cost than a full-blown MRC. The analysis in this paper uses the versatile Nakagami fading channels in contrast to the Rayleigh model used in most previous works.
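
For reference, the two combining rules being compared have the standard forms below, written for the per-branch instantaneous SNRs at an L-antenna relay; in the threshold-based variants, combining or forwarding is additionally conditioned on the SNR exceeding a preset level so the relay forwards only when reliable decoding is likely.

```latex
% Standard post-combining SNRs at an L-antenna relay:
\begin{align*}
  \gamma_{\mathrm{MRC}} = \sum_{\ell=1}^{L} \gamma_{\ell},
  \qquad
  \gamma_{\mathrm{SC}} = \max_{1 \le \ell \le L} \gamma_{\ell}.
\end{align*}
% MRC needs all branch gains (full channel estimation and weighting); SC needs only
% the strongest branch, which is the complexity/cost advantage noted above.
```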

Journal ArticleDOI
TL;DR: In this paper, a constructed wetland for the treatment of agricultural wastewater from a 150-cow dairy operation in the study watershed was monitored in its eighth operating season to evaluate the proportion of total nitrogen (TN; approximated by total Kjeldahl nitrogen (TKN) due to low NO3−) and total phosphorus (TP) removal that could be attributed to storage in Typha latifolia L.

Journal ArticleDOI
TL;DR: The transmission spectrum of fiber Bragg gratings with grating planes tilted at a small angle relative to the fiber axis shows a large number of narrowband cladding mode resonances within a 100 nm wide spectrum.
Abstract: The transmission spectrum of fiber Bragg gratings with grating planes tilted at a small angle (2°–10°) relative to the fiber axis shows a large number of narrowband cladding mode resonances within a 100 nm wide spectrum. When a gold coating with a thickness between 10 and 30 nm is deposited on the fiber, the transmission spectrum shows anomalous features for values of the outside medium refractive index between 1.4211 and 1.4499. These features are shown to correspond to the excitation of surface plasmon resonances at the external surface of the gold film.
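
The comb of cladding-mode resonances described above follows from the usual tilted-grating phase-matching conditions. The expressions below are the standard ones (with Λ the grating period and θ the tilt angle; conventions differ on whether Λ is taken along the fiber axis or normal to the grating planes) and are given for orientation, not quoted from the paper.

```latex
% Common phase-matching conditions for a tilted fiber Bragg grating:
\begin{align*}
  \lambda_{\mathrm{Bragg}} = \frac{2\, n_{\mathrm{eff}}^{\mathrm{core}}\,\Lambda}{\cos\theta},
  \qquad
  \lambda_{i}^{\mathrm{clad}} = \frac{\big(n_{\mathrm{eff}}^{\mathrm{core}} + n_{\mathrm{eff},i}^{\mathrm{clad}}\big)\,\Lambda}{\cos\theta},
\end{align*}
% so each cladding mode i appears as its own narrowband dip; with a thin gold
% coating, modes whose effective index is near the plasmon condition can transfer
% power to surface plasmons, producing the anomalous features reported above.
```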

Journal ArticleDOI
TL;DR: Some recent advances in understanding the biochemical mechanisms of metabolic arrest are reviewed, with a focus on ideas such as the strategies used to reorganize metabolic priorities for ATP expenditure, molecular controls that suppress cell functions, and changes in gene expression that support hypometabolism.
Abstract: Entry into a hypometabolic state is an important survival strategy for many organisms when challenged by environmental stress, including low oxygen, cold temperatures and lack of food or water. The molecular mechanisms that regulate transitions to and from hypometabolic states, and stabilize long-term viability during dormancy, are proving to be highly conserved across phylogenic lines. A number of these mechanisms were identified and explored using anoxia-tolerant turtles as the model system, particularly from the research contributions made by Dr Peter L. Lutz in his explorations of the mechanisms of neuronal suppression in anoxic brain. Here we review some recent advances in understanding the biochemical mechanisms of metabolic arrest with a focus on ideas such as the strategies used to reorganize metabolic priorities for ATP expenditure, molecular controls that suppress cell functions (e.g. ion pumping, transcription, translation, cell cycle arrest), changes in gene expression that support hypometabolism, and enhancement of defense mechanisms (e.g. antioxidants, chaperone proteins, protease inhibitors) that stabilize macromolecules and promote long-term viability in the hypometabolic state.

Journal ArticleDOI
TL;DR: The experimental results show that the wavelength separation between selected resonances allows the measurement of the refractive index of the medium surrounding the fiber for values between 1.25 and 1.44 with an accuracy approaching 1×10⁻⁴.
Abstract: Short-period fiber Bragg gratings with weakly tilted grating planes generate multiple strong resonances in transmission. Our experimental results show that the wavelength separation between selected resonances allows the measurement of the refractive index of the medium surrounding the fiber for values between 1.25 and 1.44 with an accuracy approaching 1×10⁻⁴. The sensor element is 10 mm long and made from standard single-mode telecommunication grade optical fiber by ultraviolet light irradiation through a phase mask.

Journal ArticleDOI
TL;DR: This article examined the factor structure of a widely used assessment of parenting practices, the Alabama Parenting Questionnaire, and produced a 9-item short scale around its three supported factors: Positive Parenting, Inconsistent Discipline and Poor Supervision.
Abstract: Brief assessments of parenting practices can provide important information about the development of disruptive behavior disorders in children. We examined the factor structure of a widely used assessment of parenting practices, the Alabama Parenting Questionnaire, and produced a 9-item short scale around its three supported factors: Positive Parenting, Inconsistent Discipline and Poor Supervision. The short scale was then validated in independent community samples using confirmatory factor analysis and measures of disruptive behavioral disorders in children. The scale showed good fit to a three-factor model and good convergent validity by differentiating parents of children with disruptive behavioral disorders and parents of children without such disorders. Results indicated that this new measure is an informative tool for researchers and clinicians who require brief assessments of parenting practices relating to disruptive behavioral disorders in children.

Proceedings Article
06 Aug 2007
TL;DR: The results suggest that these graphical password schemes are at least as susceptible to offline attack as the traditional text passwords they were proposed to replace.
Abstract: Although motivated by both usability and security concerns, the existing literature on click-based graphical password schemes using a single background image (e.g., PassPoints) has focused largely on usability. We examine the security of such schemes, including the impact of different background images, and strategies for guessing user passwords. We report on both short- and long-term user studies: one lab-controlled, involving 43 users and 17 diverse images, and the other a field test of 223 user accounts. We provide empirical evidence that popular points (hot-spots) do exist for many images, and explore two different types of attack to exploit this hot-spotting: (1) a "human-seeded" attack based on harvesting click-points from a small set of users, and (2) an entirely automated attack based on image processing techniques. Our most effective attacks are generated by harvesting password data from a small set of users to attack other targets. These attacks can guess 36% of user passwords within 2³¹ guesses (or 12% within 2¹⁶ guesses) in one instance, and 20% within 2³³ guesses (or 10% within 2¹⁸ guesses) in a second instance. We perform an image-processing attack by implementing and adapting a bottom-up model of visual attention, resulting in a purely automated tool that can guess up to 30% of user passwords in 2³⁵ guesses for some instances, but under 3% on others. Our results suggest that these graphical password schemes appear to be at least as susceptible to offline attack as the traditional text passwords they were proposed to replace.
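
To put the quoted guess counts in context, the sketch below computes the full theoretical password space of a five-click, single-image scheme under assumed parameters (the image size and tolerance square below are hypothetical, not taken from the paper) and compares it with a 2³¹-guess budget.

```python
# Rough size of the theoretical space of a 5-click graphical password,
# under assumed image and tolerance-square dimensions.
import math

width, height, tolerance, clicks = 451, 331, 19, 5           # assumed values
cells = (width // tolerance) * (height // tolerance)          # distinguishable click regions
space = cells ** clicks
print(f"theoretical space ~ 2^{math.log2(space):.1f} passwords")   # about 2^43 here
print(f"2^31 guesses cover {2**31 / space:.2e} of that space")     # a small fraction
```

Guessing a third of real passwords while exploring such a small fraction of the space is the hot-spotting effect the study documents.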

Journal ArticleDOI
TL;DR: The findings indicate that mother herring gulls are exposed to several current-use flame retardants via their diet, with in ovo transfer to their eggs.
Abstract: Of the 13 current-use, non-polybrominated diphenyl ether (PBDE) flame retardants (FRs) monitored, hexabromobenzene (HBB), pentabromoethylbenzene (PBEB), pentabromotoluene (PBT), 1,2-bis(2,4,6-tribromophenoxy)ethane (BTBPE) and α- and γ-isomers of hexabromocyclododecane (HBCD), and the syn- and anti-isomers of the chlorinated Dechlorane Plus (DP) were quantified in egg pools of herring gulls (Larus argentatus) collected in 2004 from six sites in all five of the Laurentian Great Lakes of North America. α-HBCD concentrations ranged from 2.1 to 20 ng/g (wet weight (ww)). Other “new” FR levels ranged from 0.004 to 1.4 ng/g ww and were much lower than those of the major BDE congeners that are in technical mixtures (namely BDE-47, -99, -100), where Σ3PBDE ranged from 186 to 498 ng/g ww. Nineteen hepta-BDEs (Σhepta = 4.9−11 ng/g ww), octa-BDEs (Σocta = 2.6−9.1 ng/g ww), and nona-BDEs (Σnona = 0.12−5.6 ng/g ww) were detectable at all six colonies, while BDE-209 was low but quantifiable (<0.1−0.21 ng/g ww) at two ...

Journal ArticleDOI
TL;DR: This article used brachiopods of Ordovician to Cretaceous age, complemented by published data from belemnites and planktonic foraminifera, to reconstruct the evolution of calcium isotope composition of seawater over the Phanerozoic.

Journal ArticleDOI
TL;DR: In this article, the authors conducted short- and long-distance translocations and trapping studies of white-footed mice (Peromyscus leucopus) and eastern chipmunks (Tamias striatus) near two-lane paved roads, which differed widely in traffic amount, from 47 to 15 433 vehicles per day.
Abstract: Summary 1. Roads can act as barriers to animal movement, which may reduce population persistence by reducing recolonization of empty habitats and limiting immigration. Appropriate mitigation of this barrier effect (e.g. seasonal road closures, location and design of wildlife over- or underpasses) depends upon whether the animals avoid the road itself or the traffic on the road. Empirical studies of road avoidance to date do not generally differentiate between these. 2. We conducted short- and long-distance translocations and trapping studies of white-footed mice (Peromyscus leucopus) and eastern chipmunks (Tamias striatus) near two-lane paved roads, which differed widely in traffic amount, from 47 to 15 433 vehicles per day. 3. In the trapping study (13 sites) only five animals moved across a road, in comparison to 36 animals that moved the same distance without an intervening road (P < 0.0001). In the short-distance translocations (15 sites), 51% of the small mammals that were translocated across roads returned, in comparison to a return rate of 77% of animals that were translocated a similar distance with no intervening road (P = 0.009). 4. In the long-distance translocation study (24 sites) we found that each intervening road reduced the probability of successful return by about 50%. 5. We found no significant effects of traffic amount on return rates in either the short-distance or the long-distance translocation studies. 6. Small mammal densities were not lower near roads and we found no evidence for a decrease in density near roads with increasing traffic amount. 7. Synthesis and applications. Our results suggest that small mammals avoid the road itself, and not emissions such as noise from the traffic on the roads. Our results imply that the barrier effect of roads on these species cannot be mitigated by measures aimed at reducing traffic amount; other measures such as wildlife passages would be needed.

Journal ArticleDOI
TL;DR: In this paper, a direct internal reforming solid oxide fuel cell (DIR-SOFC) is modeled thermodynamically from the energy point of view, where syngas produced from a gasification process is selected as a fuel for the SOFC.

Journal ArticleDOI
TL;DR: The mechanisms by which microorganisms, plants, and animals cope with the cold, and the resulting biotechnological perspectives, are described.
Abstract: Microorganisms, plants, and animals have successfully colonized cold environments, which represent the majority of the biosphere on Earth. They have evolved special mechanisms to overcome the life-endangering influence of low temperature and to survive freezing. Cold adaptation includes a complex range of structural and functional adaptations at the level of all cellular constituents, such as membranes, proteins, metabolic activity, and mechanisms to avoid the destructive effect of intracellular ice formation. These strategies offer multiple biotechnological applications of cold-adapted organisms and/or their products in various fields. In this review, we describe the mechanisms of microorganisms, plants, and animals to cope with the cold and the resulting biotechnological perspectives.

Journal ArticleDOI
TL;DR: It is argued that video games contain systems of values which players perceive and adopt, and which shape the play of the game; this argument points toward a genuine video game HCI.