scispace - formally typeset

Showing papers by "Australian National University" published in 2006


Journal ArticleDOI
TL;DR: Genalex is a user-friendly cross-platform package that runs within Microsoft Excel, enabling population genetic analyses of codominant, haploid and binary data.
Abstract: genalex is a user-friendly cross-platform package that runs within Microsoft Excel, enabling population genetic analyses of codominant, haploid and binary data. Allele frequency-based analyses include heterozygosity, F statistics, Nei's genetic distance, population assignment, probabilities of identity and pairwise relatedness. Distance-based calculations include amova, principal coordinates analysis (PCA), Mantel tests, multivariate and 2D spatial autocorrelation and twogener. More than 20 different graphs summarize data and aid exploration. Sequence and genotype data can be imported from automated sequencers, and exported to other software. Initially designed as a tool for teaching, genalex 6 now offers features for researchers as well. Documentation and the program are available at http://www.anu.edu.au/BoZo/GenAlEx/
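Two of the allele frequency-based measures named above have simple closed forms. The following is a minimal sketch of single-locus expected heterozygosity and Nei's (1972) standard genetic distance; the function names and example allele frequencies are invented for illustration and are not GenAlEx's interface.

```python
from math import log, sqrt

def expected_heterozygosity(p):
    """Expected heterozygosity He = 1 - sum(p_i^2) over allele frequencies p."""
    return 1.0 - sum(f * f for f in p)

def nei_distance(p, q):
    """Nei's (1972) standard genetic distance for one locus:
    D = -ln( Jxy / sqrt(Jx * Jy) ), the log of the normalized identity."""
    jxy = sum(a * b for a, b in zip(p, q))
    jx = sum(a * a for a in p)
    jy = sum(b * b for b in q)
    return -log(jxy / sqrt(jx * jy))

# Hypothetical allele frequencies at one locus in two populations:
pop1 = [0.5, 0.3, 0.2]
pop2 = [0.4, 0.4, 0.2]
print(round(expected_heterozygosity(pop1), 3))  # → 0.62
print(round(nei_distance(pop1, pop2), 4))
```

A multilocus analysis would average Jx, Jy and Jxy across loci before taking the logarithm.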

15,786 citations


Journal ArticleDOI
TL;DR: The essay addresses issues of causality, explanation, prediction, and generalization that underlie an understanding of theory, and suggests that the type of theory under development can influence the choice of an epistemological approach.
Abstract: The aim of this research essay is to examine the structural nature of theory in Information Systems. Despite the importance of theory, questions relating to its form and structure are neglected in comparison with questions relating to epistemology. The essay addresses issues of causality, explanation, prediction, and generalization that underlie an understanding of theory. A taxonomy is proposed that classifies information systems theories with respect to the manner in which four central goals are addressed: analysis, explanation, prediction, and prescription. Five interrelated types of theory are distinguished: (1) theory for analyzing, (2) theory for explaining, (3) theory for predicting, (4) theory for explaining and predicting, and (5) theory for design and action. Examples illustrate the nature of each theory type. The applicability of the taxonomy is demonstrated by classifying a sample of journal articles. The paper contributes by showing that multiple views of theory exist and by exposing the assumptions underlying different viewpoints. In addition, it is suggested that the type of theory under development can influence the choice of an epistemological approach. Support is given for the legitimacy and value of each theory type. The building of integrated bodies of theory that encompass all theory types is advocated.

3,070 citations


Journal ArticleDOI
23 Jun 2006-Science
TL;DR: Reconstructed time lines, causes, and consequences of change in 12 once diverse and productive estuaries and coastal seas worldwide show similar patterns: Human impacts have depleted >90% of formerly important species, destroyed >65% of seagrass and wetland habitat, degraded water quality, and accelerated species invasions.
Abstract: Estuarine and coastal transformation is as old as civilization yet has dramatically accelerated over the past 150 to 300 years. Reconstructed time lines, causes, and consequences of change in 12 once diverse and productive estuaries and coastal seas worldwide show similar patterns: Human impacts have depleted >90% of formerly important species, destroyed >65% of seagrass and wetland habitat, degraded water quality, and accelerated species invasions. Twentieth-century conservation efforts achieved partial recovery of upper trophic levels but have so far failed to restore former ecosystem structure and function. Our results provide detailed historical baselines and quantitative targets for ecosystem-based management and marine conservation.

2,795 citations


Book
01 Jan 2006
TL;DR: The Uses of Heritage as mentioned in this paper explores the use of heritage throughout the world and argues that heritage value is not inherent in physical objects or places, but rather that these objects and places are used to give tangibility to the values that underpin different communities and to assert and affirm these values.
Abstract: Examining international case studies including USA, Asia, Australia and New Zealand, Laurajane Smith identifies and explores the use of heritage throughout the world. Challenging the idea that heritage value is self-evident, and that things must be preserved because they have an inherent importance, Smith forcefully demonstrates that heritage value is not inherent in physical objects or places, but rather that these objects and places are used to give tangibility to the values that underpin different communities and to assert and affirm these values. A practically grounded accessible examination of heritage as a cultural practice, The Uses of Heritage is global in its benefit to students and field professionals alike.

2,516 citations


Journal ArticleDOI
TL;DR: This review summarises the epidemiological evidence of how climate variations and trends affect various health outcomes, and argues that evidence and anticipation of adverse health effects will strengthen the case for pre-emptive policies and guide priorities for planned adaptive strategies.

2,205 citations


Journal ArticleDOI
26 Oct 2006-Nature
TL;DR: The genome sequence of the honeybee Apis mellifera is reported, suggesting a novel African origin for the species A. mellifera and offering insights into whether Africanized bees spread throughout the New World via hybridization or displacement.
Abstract: Here we report the genome sequence of the honeybee Apis mellifera, a key model for social behaviour and essential to global ecology through pollination. Compared with other sequenced insect genomes, the A. mellifera genome has high A+T and CpG contents, lacks major transposon families, evolves more slowly, and is more similar to vertebrates for circadian rhythm, RNA interference and DNA methylation genes, among others. Furthermore, A. mellifera has fewer genes for innate immunity, detoxification enzymes, cuticle-forming proteins and gustatory receptors, more genes for odorant receptors, and novel genes for nectar and pollen utilization, consistent with its ecology and social organization. Compared to Drosophila, genes in early developmental pathways differ in Apis, whereas similarities exist for functions that differ markedly, such as sex determination, brain function and behaviour. Population genetics suggests a novel African origin for the species A. mellifera and insights into whether Africanized bees spread throughout the New World via hybridization or displacement.

1,673 citations


Proceedings ArticleDOI
16 Oct 2006
TL;DR: This paper recommends benchmarking selection and evaluation methodologies, and introduces the DaCapo benchmarks, a set of open source, client-side Java benchmarks that improve over SPEC Java in a variety of ways, including more complex code, richer object behaviors, and more demanding memory system requirements.
Abstract: Since benchmarks drive computer science research and industry product development, which ones we use and how we evaluate them are key questions for the community. Despite complex runtime tradeoffs due to dynamic compilation and garbage collection required for Java programs, many evaluations still use methodologies developed for C, C++, and Fortran. SPEC, the dominant purveyor of benchmarks, compounded this problem by institutionalizing these methodologies for their Java benchmark suite. This paper recommends benchmarking selection and evaluation methodologies, and introduces the DaCapo benchmarks, a set of open source, client-side Java benchmarks. We demonstrate that the complex interactions of (1) architecture, (2) compiler, (3) virtual machine, (4) memory management, and (5) application require more extensive evaluation than C, C++, and Fortran which stress (4) much less, and do not require (3). We use and introduce new value, time-series, and statistical metrics for static and dynamic properties such as code complexity, code size, heap composition, and pointer mutations. No benchmark suite is definitive, but these metrics show that DaCapo improves over SPEC Java in a variety of ways, including more complex code, richer object behaviors, and more demanding memory system requirements. This paper takes a step towards improving methodologies for choosing and evaluating benchmarks to foster innovation in system design and implementation for Java and other managed languages.
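The statistical reporting the paper advocates can be illustrated with a small generic sketch: summarizing repeated benchmark runs as a mean execution time with a confidence interval rather than a single score. This is not DaCapo's actual harness; the timings are hypothetical, and the t critical value assumes n = 5 runs.

```python
from statistics import mean, stdev
from math import sqrt

def confidence_interval(times, t_crit=2.776):
    """Mean execution time with a 95% CI (Student t, n-1 dof).
    t_crit = 2.776 is the two-sided 95% value for 4 degrees of freedom,
    i.e. it assumes exactly 5 runs."""
    n = len(times)
    m, s = mean(times), stdev(times)
    half = t_crit * s / sqrt(n)
    return m, m - half, m + half

runs_ms = [812, 798, 845, 820, 803]  # hypothetical timings of one benchmark
m, lo, hi = confidence_interval(runs_ms)
print(round(m, 1), round(lo, 1), round(hi, 1))  # → 815.6 792.7 838.5
```

Overlapping intervals between two configurations then signal that a claimed speedup may not be statistically meaningful.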

1,561 citations


Journal ArticleDOI
TL;DR: In this article, the physics of the 21 cm transition was reviewed, focusing on processes relevant at high redshifts, and the insights to be gained from such observations were described.

1,315 citations


Journal ArticleDOI
TL;DR: In this paper, the Global Mapping Function (GMF) was proposed based on data from the global ECMWF numerical weather model and the coefficients of the GMF were obtained from an expansion of the Vienna mapping function (VMF1) parameters into spherical harmonics on a global grid.
Abstract: Troposphere mapping functions are used in the analyses of Global Positioning System and Very Long Baseline Interferometry observations to map a priori zenith hydrostatic and wet delays to any elevation angle. Most analysts use the Niell Mapping Function (NMF) whose coefficients are determined from site coordinates and the day of year. Here we present the Global Mapping Function (GMF), based on data from the global ECMWF numerical weather model. The coefficients of the GMF were obtained from an expansion of the Vienna Mapping Function (VMF1) parameters into spherical harmonics on a global grid. Similar to NMF, the values of the coefficients require only the station coordinates and the day of year as input parameters. Compared to the 6-hourly values of the VMF1 a slight degradation in short-term precision occurs using the empirical GMF. However, the regional height biases and annual errors of NMF are significantly reduced with GMF.
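Mapping functions of the NMF/VMF1/GMF family share a three-coefficient continued-fraction form in the sine of the elevation angle; only the way the coefficients a, b, c are obtained differs. A minimal sketch of the evaluation step follows; the coefficient values here are illustrative placeholders of plausible magnitude, not actual GMF output.

```python
from math import sin, radians

def mapping_function(elev_deg, a, b, c):
    """Three-term continued-fraction (Marini/Herring) mapping function:
    scales a zenith delay to the slant delay at elevation elev_deg.
    Normalized so that the value at zenith (90 degrees) is exactly 1."""
    s = sin(radians(elev_deg))
    top = 1.0 + a / (1.0 + b / (1.0 + c))
    bot = s + a / (s + b / (s + c))
    return top / bot

# Placeholder hydrostatic coefficients (illustrative only, not GMF values):
a, b, c = 1.2e-3, 2.9e-3, 62.5e-3
print(mapping_function(90.0, a, b, c))  # → 1.0 at zenith
print(round(mapping_function(5.0, a, b, c), 3))  # roughly 1/sin(e), ~10 at 5 deg
```

In GMF the coefficient a additionally varies with station coordinates and day of year via the spherical-harmonic expansion described in the abstract.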

1,232 citations


Journal ArticleDOI
TL;DR: The McKean-Vlasov NCE method presented in this paper has a close connection with the statistical physics of large particle systems: both identify a consistency relationship between the individual agent at the microscopic level and the mass of individuals at the macroscopic level.
Abstract: We consider stochastic dynamic games in large population conditions where multiclass agents are weakly coupled via their individual dynamics and costs. We approach this large population game problem by the so-called Nash Certainty Equivalence (NCE) Principle which leads to a decentralized control synthesis. The McKean-Vlasov NCE method presented in this paper has a close connection with the statistical physics of large particle systems: both identify a consistency relationship between the individual agent (or particle) at the microscopic level and the mass of individuals (or particles) at the macroscopic level. The overall game is decomposed into (i) an optimal control problem whose Hamilton-Jacobi-Bellman (HJB) equation determines the optimal control for each individual and which involves a measure corresponding to the mass effect, and (ii) a family of McKean-Vlasov (M-V) equations which also depend upon this measure. We designate the NCE Principle as the property that the resulting scheme is consistent (or soluble), i.e. the prescribed control laws produce sample paths which produce the mass effect measure. By construction, the overall closed-loop behaviour is such that each agent’s behaviour is optimal with respect to all other agents in the game theoretic Nash sense.
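The NCE consistency relationship can be caricatured with a static, scalar toy problem. This is a drastic simplification of the paper's dynamic HJB / McKean-Vlasov setting: the quadratic cost, parameters, and fixed-point iteration below are invented for illustration only.

```python
def nce_fixed_point(a_bar=1.0, lam=0.5, r=1.0, iters=100):
    """Toy NCE consistency loop: guess the mass effect m, compute each
    agent's best response, recompute the mass, repeat.
    Agent i minimizes (u_i - lam*m - a_i)^2 + r*u_i^2, giving best
    response u_i = (lam*m + a_i) / (1 + r); consistency requires that
    m equal the population mean of the best responses."""
    m = 0.0  # initial guess for the mass effect
    for _ in range(iters):
        m = (lam * m + a_bar) / (1.0 + r)  # mean best response to current m
    return m

m_star = nce_fixed_point()
print(round(m_star, 4))  # analytic fixed point: a_bar / (1 + r - lam) = 0.6667
```

The iteration converges because the mean best-response map is a contraction (factor lam/(1+r) = 0.25 here), mirroring the soluble/consistent property the paper designates as the NCE Principle.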

1,195 citations


Journal ArticleDOI
TL;DR: The authors present maximum-likelihood regression models assuming that the dependent variable is conditionally beta distributed rather than Gaussian; the approach models both means and variances with their own distinct sets of predictors, thereby modeling heteroscedasticity.
Abstract: Uncorrectable skew and heteroscedasticity are among the “lemons” of psychological data, yet many important variables naturally exhibit these properties. For scales with a lower and upper bound, a suitable candidate for models is the beta distribution, which is very flexible and models skew quite well. The authors present maximum-likelihood regression models assuming that the dependent variable is conditionally beta distributed rather than Gaussian. The approach models both means (location) and variances (dispersion) with their own distinct sets of predictors (continuous and/or categorical), thereby modeling heteroscedasticity. The location submodel link function is the logit and thereby analogous to logistic regression, whereas the dispersion submodel is log linear. Real examples show that these models handle the independent observations case readily. The article discusses comparisons between beta regression and alternative techniques, model selection and interpretation, practical estimation, and software.
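The likelihood these models maximize has a compact form under the links described above: a logit link for the location (mean) submodel and a log link for the dispersion (precision) submodel. A minimal stdlib sketch of evaluating it follows; in practice one would maximize it with a dedicated optimizer or package, and the function names here are invented.

```python
from math import lgamma, log, exp

def inv_logit(z):
    """Logistic function: maps a linear predictor to (0, 1)."""
    return 1.0 / (1.0 + exp(-z))

def beta_reg_loglik(beta, gamma_, X, Z, y):
    """Beta regression log-likelihood in the mean/precision parameterization:
    mu_i = inv_logit(x_i . beta), phi_i = exp(z_i . gamma_), and
    y_i ~ Beta(mu_i * phi_i, (1 - mu_i) * phi_i)."""
    ll = 0.0
    for xi, zi, yi in zip(X, Z, y):
        mu = inv_logit(sum(b * x for b, x in zip(beta, xi)))
        phi = exp(sum(g * z for g, z in zip(gamma_, zi)))
        p, q = mu * phi, (1.0 - mu) * phi
        ll += (lgamma(phi) - lgamma(p) - lgamma(q)
               + (p - 1.0) * log(yi) + (q - 1.0) * log(1.0 - yi))
    return ll

# Sanity check: intercept-only model with mu = 0.5, phi = 2 is Beta(1, 1),
# the uniform density, so any sample in (0, 1) has log-likelihood 0:
print(beta_reg_loglik([0.0], [log(2.0)], [[1.0]] * 3, [[1.0]] * 3,
                      [0.2, 0.5, 0.7]))  # → 0.0
```

Because phi gets its own predictors, groups can differ in variance even at the same mean, which is how the approach models heteroscedasticity directly.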


Journal ArticleDOI
TL;DR: In this article, the authors outline ten basic steps of good, disciplined model practice, including identifying clearly the clients and objectives of the modelling exercise, documenting the nature (quantity, quality, limitations) of the data used to construct and test the model, providing a strong rationale for the choice of model family and features, justifying the techniques used to calibrate the model; serious analysis, testing and discussion of model performance; and making a resultant statement of model assumptions, utility, accuracy, limitations, and scope for improvement.
Abstract: Models are increasingly being relied upon to inform and support natural resource management. They are incorporating an ever broader range of disciplines and now often confront people without strong quantitative or model-building backgrounds. These trends imply a need for wider awareness of what constitutes good model-development practice, including reporting of models to users and sceptical review of models by users. To this end the paper outlines ten basic steps of good, disciplined model practice. The aim is to develop purposeful, credible models from data and prior knowledge, in consort with end-users, with every stage open to critical review and revision. Best practice entails identifying clearly the clients and objectives of the modelling exercise; documenting the nature (quantity, quality, limitations) of the data used to construct and test the model; providing a strong rationale for the choice of model family and features (encompassing review of alternative approaches); justifying the techniques used to calibrate the model; serious analysis, testing and discussion of model performance; and making a resultant statement of model assumptions, utility, accuracy, limitations, and scope for improvement. In natural resource management applications, these steps will be a learning process, even a partnership, between model developers, clients and other interested parties.

Journal ArticleDOI
TL;DR: The Ecological Society of America recommends that the federal government take the following six actions: use new information and practices to better manage commercial and other pathways to reduce the transport and release of potentially harmful species, and establish a National Center for Invasive Species Management.
Abstract: The Ecological Society of America has evaluated current U.S. national policies and practices on biological invasions in light of current scientific knowledge. Invasions by harmful nonnative species are increasing in number and area affected; the damages to ecosystems, economic activity, and human welfare are accumulating. Without improved strategies based on recent scientific advances and increased investments to counter invasions, harm from invasive species is likely to accelerate. Federal leadership, with the cooperation of state and local governments, is required to increase the effectiveness of prevention of invasions, detect and respond quickly to new potentially harmful invasions, control and slow the spread of existing invasions, and provide a national center to ensure that these efforts are coordinated and cost effective. Specifically, the Ecological Society of America recommends that the federal government take the following six actions: (1) Use new information and practices to better manage commercial and other pathways to reduce the transport and release of potentially harmful species; (2) Adopt more quantitative procedures for risk analysis and apply them to every species proposed for importation into the country; (3) Use new cost-effective diagnostic technologies to increase active surveillance and sharing of information about invasive species so that responses to new invasions can be more rapid and effective; (4) Create new legal authority and provide emergency funding to support rapid responses to emerging invasions; (5) Provide funding and incentives for cost-effective programs to slow the spread of existing invasive species in order to protect still uninvaded ecosystems, social and industrial infrastructure, and human welfare; and (6) Establish a National Center for Invasive Species Management (under the existing National Invasive Species Council) to coordinate and lead improvements in federal, state, and international policies on invasive species. Recent scientific and technical advances provide a sound basis for more cost-effective national responses to invasive species. Greater investments in improved technology and management practices would be more than repaid by reduced damages from current and future invasive species. The Ecological Society of America is committed to assist all levels of government and provide scientific advice to improve all aspects of invasive-species management.

Journal ArticleDOI
TL;DR: These ecological functions support the argument that scattered trees are keystone structures in a wide range of landscapes, and their contribution to ecosystem functioning is disproportionately large given the small area occupied and low biomass of any given tree, and the low density of scattered trees collectively.

Journal ArticleDOI
TL;DR: In a retrospective cohort analysis of a tertiary mixed emergency department, presentation during periods of high ED occupancy was associated with increased in-hospital mortality at 10 days.
Abstract: Objective: To quantify any relationship between emergency department (ED) overcrowding and 10-day patient mortality. Design and setting: Retrospective stratified cohort analysis of three 48-week periods in a tertiary mixed ED in 2002–2004. Mean “occupancy” (a measure of overcrowding based on number of patients receiving treatment) was calculated for 8-hour shifts and for 12-week periods. The shifts of each type in the highest quartile of occupancy were classified as overcrowded. Participants: All presentations of patients (except those arriving by interstate ambulance) during “overcrowded” (OC) shifts and during an equivalent number of “not overcrowded” (NOC) shifts (same shift, weekday and period). Main outcome measure: In-hospital death of a patient recorded within 10 days of the most recent ED presentation. Results: There were 34 377 OC and 32 231 NOC presentations (736 shifts each); the presenting patients were well matched for age and sex. Mean occupancy was 21.6 on OC shifts and 16.4 on NOC shifts. There were 144 deaths in the OC cohort and 101 in the NOC cohort (0.42% and 0.31%, respectively; P = 0.025). The relative risk of death at 10 days was 1.34 (95% CI, 1.04–1.72). Subgroup analysis showed that, in the OC cohort, there were more presentations in more urgent triage categories, decreased treatment performance by standard measures, and a higher mortality rate by triage category. Conclusions: In this hospital, presentation during high ED occupancy was associated with increased in-hospital mortality at 10 days, after controlling for seasonal, shift, and day of the week effects. The magnitude of the effect is about 13 deaths per year. Further
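The headline relative risk and its confidence interval can be reproduced from the counts given in the abstract using a standard log-scale Wald interval:

```python
from math import log, sqrt, exp

def relative_risk_ci(d1, n1, d0, n0, z=1.96):
    """Relative risk (d1/n1)/(d0/n0) with an approximate 95% Wald CI
    computed on the log scale: SE of ln(RR) is
    sqrt(1/d1 - 1/n1 + 1/d0 - 1/n0)."""
    rr = (d1 / n1) / (d0 / n0)
    se = sqrt(1.0 / d1 - 1.0 / n1 + 1.0 / d0 - 1.0 / n0)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# Counts reported in the abstract: 144 deaths / 34 377 presentations on
# overcrowded shifts vs 101 / 32 231 on not-overcrowded shifts.
rr, lo, hi = relative_risk_ci(144, 34377, 101, 32231)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 1.34 1.04 1.72
```

The recomputed values match the abstract's reported RR of 1.34 (95% CI, 1.04–1.72).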

Journal ArticleDOI
TL;DR: Sufficient conditions for exponential stability and weighted L2-gain are developed for a class of switching signals with average dwell time, and these conditions are given in the form of linear matrix inequalities (LMIs).

Journal ArticleDOI
TL;DR: Carotenoids and tocopherols are the two most abundant groups of lipid-soluble antioxidants in chloroplasts and are already providing a knowledge base for breeding and transgenic approaches to modify the types and levels of these important compounds in agricultural crops.
Abstract: Carotenoids and tocopherols are the two most abundant groups of lipid-soluble antioxidants in chloroplasts. In addition to their many functional roles in photosynthetic organisms, these compounds are also essential components of animal diets, including humans. During the past decade, a near complete set of genes required for the synthesis of both classes of compounds in photosynthetic tissues has been identified, primarily as a result of molecular genetic and biochemical genomics-based approaches in the model organisms Arabidopsis thaliana and Synechocystis sp. PCC6803. Mutant analysis and transgenic studies in these and other systems have provided important insight into the regulation, activities, integration, and evolution of individual enzymes and are already providing a knowledge base for breeding and transgenic approaches to modify the types and levels of these important compounds in agricultural crops.

Journal ArticleDOI
TL;DR: Self-embarrassment and expectations that others would respond negatively predicted the likelihood of help-seeking from professional sources, and interventions should focus on minimizing expectations of negative responses from others and negative self-responses to help- seeking, and should target younger people.
Abstract: Objective: Research has shown that people are reluctant to seek professional help for depression, especially from mental health professionals. This may be because of the impact of stigma which can ...

Journal ArticleDOI
TL;DR: It is asserted that all Rubiscos may be nearly perfectly adapted to the differing CO2, O2, and thermal conditions in their subcellular environments, optimizing this compromise between CO2/O2 specificity and the maximum rate of catalytic turnover.
Abstract: The cornerstone of autotrophy, the CO2-fixing enzyme, d-ribulose-1,5-bisphosphate carboxylase/oxygenase (Rubisco), is hamstrung by slow catalysis and confusion between CO2 and O2 as substrates, an “abominably perplexing” puzzle, in Darwin's parlance. Here we argue that these characteristics stem from difficulty in binding the featureless CO2 molecule, which forces specificity for the gaseous substrate to be determined largely or completely in the transition state. We hypothesize that natural selection for greater CO2/O2 specificity, in response to reducing atmospheric CO2:O2 ratios, has resulted in a transition state for CO2 addition in which the CO2 moiety closely resembles a carboxylate group. This maximizes the structural difference between the transition states for carboxylation and the competing oxygenation, allowing better differentiation between them. However, increasing structural similarity between the carboxylation transition state and its carboxyketone product exposes the carboxyketone to the strong binding required to stabilize the transition state and causes the carboxyketone intermediate to bind so tightly that its cleavage to products is slowed. We assert that all Rubiscos may be nearly perfectly adapted to the differing CO2, O2, and thermal conditions in their subcellular environments, optimizing this compromise between CO2/O2 specificity and the maximum rate of catalytic turnover. Our hypothesis explains the feeble rate enhancement displayed by Rubisco in processing the exogenously supplied carboxyketone intermediate, compared with its nonenzymatic hydrolysis, and the positive correlation between CO2/O2 specificity and 12C/13C fractionation. It further predicts that, because a more product-like transition state is more ordered (decreased entropy), the effectiveness of this strategy will deteriorate with increasing temperature.
Keywords: enzyme mechanisms; isotope fractionation; transition states

Journal ArticleDOI
TL;DR: There are some recent radiations in CYP6, CYP9 and certain CCE clades in A. mellifera that could be associated with the evolution of the hormonal and chemosensory processes underpinning its highly organized eusociality.
Abstract: The honeybee genome has substantially fewer protein coding genes (approximate to 11 000 genes) than Drosophila melanogaster (approximate to 13 500) and Anopheles gambiae (approximate to 14 000). Some of the most marked differences occur in three superfamilies encoding xenobiotic detoxifying enzymes. Specifically there are only about half as many glutathione-S-transferases (GSTs), cytochrome P450 monooxygenases (P450s) and carboxyl/cholinesterases (CCEs) in the honeybee. This includes 10-fold or greater shortfalls in the numbers of Delta and Epsilon GSTs and CYP4 P450s, members of which clades have been recurrently associated with insecticide resistance in other species. These shortfalls may contribute to the sensitivity of the honeybee to insecticides. On the other hand there are some recent radiations in CYP6, CYP9 and certain CCE clades in A. mellifera that could be associated with the evolution of the hormonal and chemosensory processes underpinning its highly organized eusociality.

Journal ArticleDOI
TL;DR: In this paper, the authors used nonparametric function estimation theory to extract the density profiles, and their derivatives, from a set of N-body halos generated from ΛCDM simulations of gravitational clustering, as well as isolated spherical collapses.
Abstract: We use techniques from nonparametric function estimation theory to extract the density profiles, and their derivatives, from a set of N-body dark matter halos. We consider halos generated from ΛCDM simulations of gravitational clustering, as well as isolated spherical collapses. The logarithmic density slopes γ ≡ d log ρ/d log r of the ΛCDM halos are found to vary as power laws in radius, reaching values of γ ≈ -1 at the innermost resolved radii, ~10^-2 r_vir. This behavior is significantly different from that of broken-power-law models like the Navarro-Frenk-White (NFW) profile but similar to that of models like de Vaucouleurs's. Accordingly, we compare the N-body density profiles with various parametric models to find which provide the best fit. We consider an NFW-like model with arbitrary inner slope; Dehnen & McLaughlin's anisotropic model; Einasto's model (identical in functional form to Sersic's model but fitted to the space density); and the density model of Prugniel & Simien that was designed to match the deprojected form of Sersic's R^(1/n) law. Overall, the best-fitting model to the ΛCDM halos is Einasto's, although the Prugniel-Simien and Dehnen-McLaughlin models also perform well. With regard to the spherical-collapse halos, both the Prugniel-Simien and Einasto models describe the density profiles well, with an rms scatter some 4 times smaller than that obtained with either the NFW-like model or the three-parameter Dehnen-McLaughlin model. Finally, we confirm recent claims of a systematic variation in profile shape with halo mass.
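For reference, the Einasto model that best fits the ΛCDM halos has a simple closed form, and its logarithmic slope is an exact power law in radius, consistent with the N-body behavior described above. A small sketch follows; the parameter values are placeholders, not fits from the paper.

```python
from math import exp

def einasto_density(r, rho_s, r_s, alpha):
    """Einasto profile: rho(r) = rho_s * exp(-(2/alpha) * ((r/r_s)**alpha - 1)),
    where r_s is the radius at which the logarithmic slope equals -2 and
    rho_s is the density there."""
    return rho_s * exp(-(2.0 / alpha) * ((r / r_s) ** alpha - 1.0))

def einasto_log_slope(r, r_s, alpha):
    """Logarithmic slope gamma = d ln(rho) / d ln(r) = -2 * (r/r_s)**alpha:
    exactly a power law in radius, unlike broken-power-law models (e.g. NFW)."""
    return -2.0 * (r / r_s) ** alpha

# Placeholder parameters (alpha ~ 0.1-0.3 is typical for ΛCDM halos):
print(einasto_log_slope(1.0, 1.0, 0.17))  # → -2.0 at r = r_s, by construction
print(round(einasto_log_slope(0.01, 1.0, 0.17), 3))  # shallower slope at small r
```

Because gamma varies continuously with r, the profile never settles onto a fixed inner power law, which is why it tracks the measured slopes better than NFW-like models.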

Journal ArticleDOI
26 Jan 2006-Nature
TL;DR: The detection of OGLE-2005-BLG-390Lb, a cool, ~5.5 Earth-mass planetary companion to the lens star of a microlensing event, suggests that sub-Neptune-mass planets may be more common than gas giant planets, as predicted by the core accretion theory.
Abstract: Over 170 extrasolar planets have so far been discovered, with a wide range of masses and orbital periods, but until last July no planet of Neptune's mass or less had been detected any more than 0.15 astronomical units (AU) from a normal star. (That's close — Earth is one AU from the Sun). On 11 July 2005 the OGLE Early Warning System recorded a notable event: gravitational lensing of light from a distant object by a foreground star revealed a small planet of about 5.5 Earth masses, orbiting at about 2.6 AU from the foreground star. This is the lowest known mass for an extrasolar planet orbiting a main sequence star, and its detection suggests that cool, sub-Neptune mass planets are more common than gas giants, as predicted by the favoured core accretion theory of planet formation. In the favoured core-accretion model of formation of planetary systems, solid planetesimals accumulate to build up planetary cores, which then accrete nebular gas if they are sufficiently massive. Around M-dwarf stars (the most common stars in our Galaxy), this model favours the formation of Earth-mass (M⊕) to Neptune-mass planets with orbital radii of 1 to 10 astronomical units (au), which is consistent with the small number of gas giant planets known to orbit M-dwarf host stars (refs 1-4). More than 170 extrasolar planets have been discovered with a wide range of masses and orbital periods, but planets of Neptune's mass or less have not hitherto been detected at separations of more than 0.15 au from normal stars. Here we report the discovery of a planetary companion of about 5.5 Earth masses at a separation of about 2.6 au from an M-dwarf host star. (We propose to name it OGLE-2005-BLG-390Lb, indicating a planetary mass companion to the lens star of the microlensing event.) The mass is lower than that of GJ876d (ref. 5), although the error bars overlap.
Our detection suggests that such cool, sub-Neptune-mass planets may be more common than gas giant planets, as predicted by the core accretion theory.

Journal ArticleDOI
TL;DR: Several problems with existing hypotheses for the evolution of parental care are identified and a number of poorly understood contrasts which, once resolved, should help elucidate avian social evolution are highlighted.
Abstract: Estimates of the incidence of major classes of parental care by birds are drawn from classical studies that preceded both the publication of a massive secondary literature and the revolution driven by molecular approaches to avian phylogeny. Here, I review this literature in the light of new phylogenetic hypotheses and estimate the prevalence of six distinct modes of care: use of geothermal heat to incubate eggs, brood parasitism, male only care, female only care, biparental care and cooperative breeding. Female only care and cooperative breeding are more common than has previously been recognized, occurring in 8 and 9% of species, respectively. Biparental care by a pair bonded male and female is the most common pattern of care but at 81% of species, the pattern is less common than once believed. I identify several problems with existing hypotheses for the evolution of parental care and highlight a number of poorly understood contrasts which, once resolved, should help elucidate avian social evolution.

Journal ArticleDOI
TL;DR: In this article, a model of the electronic structure and the associated dynamics of the nitrogen-vacancy center in diamond was presented for the occurrence of optically induced spin polarization, for the change of emission level with spin polarization and for new experimental measurements of transient emission.
Abstract: Symmetry considerations are used in presenting a model of the electronic structure and the associated dynamics of the nitrogen-vacancy center in diamond. The model accounts for the occurrence of optically induced spin polarization, for the change of emission level with spin polarization and for new experimental measurements of transient emission. The rate constants given are at variance with those reported previously.

Journal ArticleDOI
TL;DR: Both attitudinal and behavioural features of eating disorder psychopathology tended to decrease with increasing age, and these data will inform researchers intending to use the EDE-Q in epidemiological studies.

Proceedings ArticleDOI
17 Jun 2006
TL;DR: A novel spatiotemporal segmentation algorithm is employed to generate salient edgels that are robust to changes in appearance of clothing and invariant signatures are generated by combining normalized color and salient edgel histograms.
Abstract: In many surveillance applications it is desirable to determine if a given individual has been previously observed over a network of cameras. This is the person reidentification problem. This paper focuses on reidentification algorithms that use the overall appearance of an individual as opposed to passive biometrics such as face and gait. Person reidentification approaches have two aspects: (i) establish correspondence between parts, and (ii) generate signatures that are invariant to variations in illumination, pose, and the dynamic appearance of clothing. A novel spatiotemporal segmentation algorithm is employed to generate salient edgels that are robust to changes in appearance of clothing. The invariant signatures are generated by combining normalized color and salient edgel histograms. Two approaches are proposed to generate correspondences: (i) a model-based approach that fits an articulated model to each individual to establish a correspondence map, and (ii) an interest point operator approach that nominates a large number of potential correspondences which are evaluated using a region growing scheme. Finally, the approaches are evaluated on a 44-person database across 3 disparate views.
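The normalized-color-histogram half of such a signature can be illustrated with a toy sketch. This is not the paper's algorithm (which also uses salient edgel histograms from spatiotemporal segmentation); the bin count, the joint-RGB binning, and the random "region" pixels below are all assumptions for demonstration.

```python
import numpy as np

def color_histogram(pixels, bins=8):
    """Normalized joint RGB histogram for an (N, 3) uint8 pixel array.

    Each channel is quantized into `bins` bins, the joint histogram is
    flattened, and normalizing to sum 1 makes it invariant to region size.
    """
    idx = (pixels // (256 // bins)).astype(int)           # per-channel bin index
    flat = idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]
    h = np.bincount(flat, minlength=bins ** 3).astype(float)
    return h / h.sum()

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]; higher means more alike."""
    return float(np.minimum(h1, h2).sum())

# Two synthetic "regions" standing in for segmented body parts.
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(500, 3), dtype=np.uint8)
b = rng.integers(0, 256, size=(500, 3), dtype=np.uint8)
print(intersection(color_histogram(a), color_histogram(a)))  # self-match, close to 1.0
print(intersection(color_histogram(a), color_histogram(b)))  # cross-match, lower
```

A real system would compute one such signature per body part (from the correspondence step) and combine the per-part similarities into a final match score.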

Journal ArticleDOI
TL;DR: This review focuses on applications and protocols of recent studies where docking calculations and molecular dynamics simulations were combined to dock small molecules into protein receptors, and is structured to lead the reader from the simpler to more compute‐intensive methods.
Abstract: A rational approach is needed to maximize the chances of finding new drugs, and to exploit the opportunities of potential new drug targets emerging from genomic and proteomic initiatives, and from the large libraries of small compounds now readily available through combinatorial chemistry. Despite a shaky early history, computer-aided drug design techniques can now be effective in reducing costs and speeding up drug discovery. This happy outcome results from development of more accurate and reliable algorithms, use of more thoughtfully planned strategies to apply them, and greatly increased computer power to allow studies with the necessary reliability to be performed. Our review focuses on applications and protocols, with the main emphasis on critical analysis of recent studies where docking calculations and molecular dynamics (MD) simulations were combined to dock small molecules into protein receptors. We highlight successes to demonstrate what is possible now, but also point out drawbacks and future directions. The review is structured to lead the reader from the simpler to more compute-intensive methods. Thus, while inexpensive and fast docking algorithms can be used to scan large compound libraries and reduce their size, more accurate but expensive MD simulations can be applied when a few selected ligand candidates remain. MD simulations can be used: during the preparation of the protein receptor before docking, to optimize its structure and account for protein flexibility; for the refinement of docked complexes, to include solvent effects and account for induced fit; to calculate binding free energies, to provide an accurate ranking of the potential ligands; and in the latest developments, during the docking process itself to find the binding site and correctly dock the ligand a priori.
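The funnel strategy the review describes (a cheap docking score prunes the large library; the expensive MD-based estimate re-ranks only the survivors) can be sketched as follows. Both scoring functions and all compound data here are placeholders, not real docking or MD code.

```python
def cheap_dock_score(ligand):
    # Stand-in for a fast docking score (lower = better binding).
    return ligand["quick_score"]

def expensive_md_score(ligand):
    # Stand-in for an MD-based binding free energy estimate (lower = better).
    return ligand["md_score"]

def screen(library, keep_top=3):
    # Stage 1: rank the whole library with the fast score, keep the best few.
    shortlist = sorted(library, key=cheap_dock_score)[:keep_top]
    # Stage 2: re-rank only the shortlist with the costly method.
    return sorted(shortlist, key=expensive_md_score)

# Hypothetical compounds with made-up scores, for illustration only.
library = [
    {"name": f"cpd{i}", "quick_score": q, "md_score": m}
    for i, (q, m) in enumerate([(-7.1, -9.0), (-5.0, -4.0), (-8.2, -6.5),
                                (-6.9, -10.1), (-3.3, -2.0)])
]
ranked = screen(library)
print([c["name"] for c in ranked])  # prints ['cpd3', 'cpd0', 'cpd2']
```

Note how the final ordering differs from the cheap ranking: cpd2 tops stage 1 but drops to third after refinement, which is exactly why the review argues for reserving MD for the final few candidates rather than the whole library.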

Journal ArticleDOI
TL;DR: In this paper, the authors propose a set of five guiding principles for biodiversity conservation that are broadly applicable to any forested area: (1) the maintenance of connectivity; (2) the maintenance of landscape heterogeneity; (3) the maintenance of stand structural complexity; (4) the maintenance of aquatic ecosystem integrity; and (5) the use of natural disturbance regimes to guide human disturbance regimes.