
Showing papers from the Australian National University published in 2004


Journal ArticleDOI
TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
Abstract: In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from an SV perspective.
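To make the function-estimation setting concrete, here is a small sketch using scikit-learn's SVR (an assumed dependency, not code from the tutorial); the kernel choice and the C and epsilon values are illustrative:

```python
import numpy as np
from sklearn.svm import SVR

# Noisy samples of sin(x) on [-3, 3]
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

# epsilon-insensitive SV regression: C trades off flatness against
# training error, epsilon sets the width of the insensitivity tube
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)

print(model.predict([[0.5]]))  # close to sin(0.5)
```

Support vectors are the training points that end up on or outside the epsilon-tube; the fitted function is a kernel expansion over those points only.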

10,696 citations


Journal ArticleDOI
TL;DR: A multidisciplinary, international group of experts discussed the current status and future directions of MCI, with regard to clinical presentation, cognitive and functional assessment, and the role of neuroimaging, biomarkers and genetics.
Abstract: The First Key Symposium was held in Stockholm, Sweden, 2-5 September 2003. The aim of the symposium was to integrate clinical and epidemiological perspectives on the topic of Mild Cognitive Impairment (MCI). A multidisciplinary, international group of experts discussed the current status and future directions of MCI, with regard to clinical presentation, cognitive and functional assessment, and the role of neuroimaging, biomarkers and genetics. Agreement on new perspectives, as well as recommendations for management and future research were discussed by the international working group. The specific recommendations for the general MCI criteria include the following: (i) the person is neither normal nor demented; (ii) there is evidence of cognitive deterioration shown by either objectively measured decline over time and/or subjective report of decline by self and/or informant in conjunction with objective cognitive deficits; and (iii) activities of daily living are preserved and complex instrumental functions are either intact or minimally impaired.

4,206 citations


Journal ArticleDOI
TL;DR: The authors examine three sectors of the economy: the State Sector (state-owned firms), the Listed Sector (publicly listed firms), and the Private Sector (all other firms with various types of private and local government ownership).
Abstract: China is an important counterexample to the findings in the law, institutions, finance, and growth literature: neither its legal nor financial system is well developed by existing standards, yet it has one of the fastest growing economies. We examine 3 sectors of the economy: the State Sector (state-owned firms), the Listed Sector (publicly listed firms), and the Private Sector (all other firms with various types of private and local government ownership). The law-finance-growth nexus established by existing literature applies to the State and Listed Sectors: with poor legal protections of minority and outside investors, external markets are weak, and the growth of these firms is slow or negative. However, with arguably poorer applicable legal and financial mechanisms, the Private Sector grows much faster than the State and Listed Sectors, and provides most of the economy's growth. This suggests that there exist effective alternative financing channels and governance mechanisms, such as those based on reputation and relationships, to support this growth.

2,829 citations


01 Jul 2004
TL;DR: The authors developed a conditional estimator for the fixed-effect ordered logit model and found that assuming ordinality or cardinality of happiness scores makes little difference, whilst allowing for fixed-effects does change results substantially.
Abstract: Psychologists and sociologists usually interpret happiness scores as cardinal and comparable across respondents, and thus run OLS regressions on happiness and changes in happiness. Economists usually assume only ordinality and have mainly used ordered latent response models, thereby not taking satisfactory account of fixed individual traits. We address this problem by developing a conditional estimator for the fixed-effect ordered logit model. We find that assuming ordinality or cardinality of happiness scores makes little difference, whilst allowing for fixed-effects does change results substantially. We call for more research into the determinants of the personality traits making up these fixed-effects.
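A minimal sketch of the underlying idea (an illustrative simulation, not the authors' code): dichotomizing the ordered scores at a fixed cutpoint turns the fixed-effect ordered logit into a binary fixed-effect logit, to which Chamberlain's conditional logit applies in a two-period panel:

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 500, 1.0

# Panel (T = 2): latent satisfaction = beta*x + individual effect + logistic noise
alpha = rng.normal(size=n)                    # fixed individual traits
x = rng.normal(size=(n, 2))
ystar = beta * x + alpha[:, None] + rng.logistic(size=(n, 2))
y = np.digitize(ystar, [-1.0, 0.0, 1.0])      # 4-category "happiness" score

# Dichotomize at a fixed cutpoint -> binary fixed-effect logit
z = (y >= 2).astype(int)

# Chamberlain's conditional logit: conditioning on "switchers" (z1 + z2 = 1)
# removes the fixed effect, and P(z2 = 1 | switch) = logistic(beta * (x2 - x1))
sw = z.sum(axis=1) == 1
dx = x[sw, 1] - x[sw, 0]
d = z[sw, 1]

b = 0.0
for _ in range(25):                           # Newton-Raphson on the log likelihood
    p = 1.0 / (1.0 + np.exp(-b * dx))
    b -= np.sum((d - p) * dx) / -np.sum(p * (1 - p) * dx**2)

print(round(b, 2))  # should be near the true beta = 1.0
```

The paper's estimator handles general T and the choice of cutpoint more carefully; this two-period version only illustrates how conditioning on switchers eliminates the fixed individual traits.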

2,460 citations


Journal ArticleDOI
TL;DR: This paper developed a conditional estimator for the fixed-effect ordered logit model and found that assuming ordinality or cardinality of happiness scores makes little difference, whilst allowing for fixed-effects does change results substantially.
Abstract: Psychologists and sociologists usually interpret happiness scores as cardinal and comparable across respondents, and thus run OLS regressions on happiness and changes in happiness. Economists usually assume only ordinality and have mainly used ordered latent response models, thereby not taking satisfactory account of fixed individual traits. We address this problem by developing a conditional estimator for the fixed-effect ordered logit model. We find that assuming ordinality or cardinality of happiness scores makes little difference, whilst allowing for fixed-effects does change results substantially. We call for more research into the determinants of the personality traits making up these fixed-effects.

2,384 citations


Journal ArticleDOI
TL;DR: In this paper, a vibration-based piezoelectric generator has been developed as an enabling technology for wireless sensor networks, where the authors discuss the modeling, design, and optimization of the generator based on a two-layer bending element.
Abstract: Enabling technologies for wireless sensor networks have gained considerable attention in research communities over the past few years. It is highly desirable, even necessary in certain situations, for wireless sensor nodes to be self-powered. With this goal in mind, a vibration based piezoelectric generator has been developed as an enabling technology for wireless sensor networks. The focus of this paper is to discuss the modeling, design, and optimization of a piezoelectric generator based on a two-layer bending element. An analytical model of the generator has been developed and validated. In addition to providing intuitive design insight, the model has been used as the basis for design optimization. Designs of 1 cm³ in size generated using the model have demonstrated a power output of 375 µW from a vibration source of 2.5 m s⁻² at 120 Hz. Furthermore, a 1 cm³ generator has been used to power a custom designed 1.9 GHz radio transmitter from the same vibration source.

1,782 citations


Journal ArticleDOI
05 Aug 2004-Nature
TL;DR: A novel ubiquitin ligase domain is defined and two sequential mechanisms by which A20 downregulates NF-κB signalling are identified, both of which participate in mediating a distinct regulatory effect.
Abstract: NF-kappaB transcription factors mediate the effects of pro-inflammatory cytokines such as tumour necrosis factor-alpha and interleukin-1beta. Failure to downregulate NF-kappaB transcriptional activity results in chronic inflammation and cell death, as observed in A20-deficient mice. A20 is a potent inhibitor of NF-kappaB signalling, but its mechanism of action is unknown. Here we show that A20 downregulates NF-kappaB signalling through the cooperative activity of its two ubiquitin-editing domains. The amino-terminal domain of A20, which is a de-ubiquitinating (DUB) enzyme of the OTU (ovarian tumour) family, removes lysine-63 (K63)-linked ubiquitin chains from receptor interacting protein (RIP), an essential mediator of the proximal TNF receptor 1 (TNFR1) signalling complex. The carboxy-terminal domain of A20, composed of seven C2/C2 zinc fingers, then functions as a ubiquitin ligase by polyubiquitinating RIP with K48-linked ubiquitin chains, thereby targeting RIP for proteasomal degradation. Here we define a novel ubiquitin ligase domain and identify two sequential mechanisms by which A20 downregulates NF-kappaB signalling. We also provide an example of a protein containing separate ubiquitin ligase and DUB domains, both of which participate in mediating a distinct regulatory effect.

1,749 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated correlations between the age offsets and P, Sm and Nd abundances in the zircons, and concluded that the presence of Nd is not the primary cause of the apparent matrix effect.

1,485 citations


Journal ArticleDOI
TL;DR: This paper reviews the state of the art in the adjuvant field, explores future directions of adjuvant development, and examines some of the impediments and barriers to the development and registration of new human adjuvants.
Abstract: The problem with pure recombinant or synthetic antigens used in modern day vaccines is that they are generally far less immunogenic than older style live or killed whole organism vaccines. This has created a major need for improved and more powerful adjuvants for use in these vaccines. With few exceptions, alum remains the sole adjuvant approved for human use in the majority of countries worldwide. Although alum is able to induce a good antibody (Th2) response, it has little capacity to stimulate cellular (Th1) immune responses which are so important for protection against many pathogens. In addition, alum has the potential to cause severe local and systemic side-effects including sterile abscesses, eosinophilia and myofascitis, although fortunately most of the more serious side-effects are relatively rare. There is also community concern regarding the possible role of aluminium in neurodegenerative diseases such as Alzheimer's disease. Consequently, there is a major unmet need for safer and more effective adjuvants suitable for human use. In particular, there is demand for safe and non-toxic adjuvants able to stimulate cellular (Th1) immunity. Other needs in light of new vaccine technologies are adjuvants suitable for use with mucosally-delivered vaccines, DNA vaccines, cancer and autoimmunity vaccines. Each of these areas are highly specialized with their own unique needs in respect of suitable adjuvant technology. This paper reviews the state of the art in the adjuvant field, explores future directions of adjuvant development and finally examines some of the impediments and barriers to development and registration of new human adjuvants.

964 citations


Journal ArticleDOI
TL;DR: The EDE-Q has good concurrent validity and acceptable criterion validity, and the measure appears well-suited to use in prospective epidemiological studies.

964 citations


Journal ArticleDOI
TL;DR: In this paper, the authors combine multiple cross-sections of data drawn from the National Population Health Survey and Canadian Community Health Survey to confirm the existence of the "healthy immigrant effect", specifically that immigrants are in relatively better health on arrival in Canada compared to native-born Canadians, and that immigrant health converges with years in Canada to native-born levels.

Journal ArticleDOI
TL;DR: In this article, a reproducing kernel Hilbert space was proposed for online learning in a wide range of problems such as classification, regression, and novelty detection, and worst-case loss bounds were derived.
Abstract: Kernel-based algorithms such as support vector machines have achieved considerable success in various problems in batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By considering classical stochastic gradient descent within a feature space and the use of some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds, and moreover, we show the convergence of the hypothesis to the minimizer of the regularized risk functional. We present some experimental results that support the theory as well as illustrating the power of the new algorithms for online novelty detection.
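The flavour of such algorithms can be sketched as stochastic gradient descent on the squared loss in an RKHS (a NORMA-style update; the step size, regularization constant and kernel width are assumed illustrative values, not the paper's implementation). Each step shrinks the existing coefficients, which implements the regularizer, and appends one new kernel term for the incoming point:

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * (a - b) ** 2)

eta, lam = 0.2, 0.01          # step size and regularization (assumed values)
centers, coefs = [], []

def predict(x):
    # Current hypothesis: a kernel expansion over all points seen so far
    return sum(c * rbf(x, xc) for c, xc in zip(coefs, centers))

rng = np.random.default_rng(2)
errs = []
for _ in range(300):          # online stream of noisy sin(x) samples
    x = rng.uniform(-3, 3)
    y = np.sin(x) + 0.1 * rng.normal()
    err = y - predict(x)
    errs.append(err ** 2)
    # SGD in the RKHS: decay existing weights, add a term for the new point
    coefs = [(1 - eta * lam) * c for c in coefs]
    coefs.append(eta * err)
    centers.append(x)

# Squared error should fall as the hypothesis improves
print(np.mean(errs[:50]), np.mean(errs[-50:]))
```

In a real-time setting the expansion would also be truncated to bound memory and per-step cost; that refinement is omitted here.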

Journal ArticleDOI
29 Jan 2004-BMJ
TL;DR: Depression literacy (BluePages) significantly improved participants' understanding of effective evidence based treatments for depression (P < 0.05) and both cognitive behaviour therapy and psychoeducation delivered via the internet are effective in reducing symptoms of depression.
Abstract: Objective To evaluate the efficacy of two internet interventions for community-dwelling individuals with symptoms of depression—a psychoeducation website offering information about depression and an interactive website offering cognitive behaviour therapy. Design Randomised controlled trial. Setting Internet users in the community, in Canberra, Australia. Participants 525 individuals with increased depressive symptoms recruited by survey and randomly allocated to a website offering information about depression (n = 166) or a cognitive behaviour therapy website (n = 182), or a control intervention using an attention placebo (n = 178). Main outcome measures Change in depression, dysfunctional thoughts; knowledge of medical, psychological, and lifestyle treatments; and knowledge of cognitive behaviour therapy. Results Intention to treat analyses indicated that information about depression and interventions that used cognitive behaviour therapy and were delivered via the internet were more effective than a credible control intervention in reducing symptoms of depression in a community sample. For the intervention that delivered cognitive behaviour therapy the reduction in score on the depression scale of the Center for Epidemiologic Studies was 3.2 (95% confidence interval 0.9 to 5.4). For the “depression literacy” site (BluePages), the reduction was 3.0 (95% confidence interval 0.6 to 5.2). Cognitive behaviour therapy (MoodGYM) reduced dysfunctional thinking and increased knowledge of cognitive behaviour therapy. Depression literacy (BluePages) significantly improved participants' understanding of effective evidence based treatments for depression (P < 0.05). Conclusions Both cognitive behaviour therapy and psychoeducation delivered via the internet are effective in reducing symptoms of depression.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the levels of precision and accuracy obtainable by laser ablation Hf-isotope analysis of zircons in real-world situations using a 193-nm ArF excimer laser coupled to a Nu Plasma MC-ICPMS.

Journal ArticleDOI
TL;DR: The present state of theory and experiment concerning Hofmeister effects is reviewed.
Abstract: The present state of affairs, theory and experiment with Hofmeister effects is reviewed.

Posted Content
TL;DR: In this paper, the authors analyse gender pay gaps by sector across the wages distribution for ten countries and find that the gender pay gap is typically higher at the top than at the bottom end of the wage distribution, suggesting that glass ceilings are more prevalent than sticky floors.
Abstract: Using harmonised data from the European Union Household Panel, we analyse gender pay gaps by sector across the wages distribution for ten countries. We find that the mean gender pay gap in the raw data typically hides large variations in the gap across the wages distribution. We use quantile regression (QR) techniques to control for the effects of individual and job characteristics at different points of the distribution, and calculate the part of the gap attributable to differing returns between men and women. We find that, first, gender pay gaps are typically bigger at the top of the wage distribution, a finding that is consistent with the existence of glass ceilings. Second, for some countries gender pay gaps are also bigger at the bottom of the wage distribution, a finding that is consistent with sticky floors. Third, the gender pay gap is typically higher at the top than the bottom end of the wage distribution, suggesting that glass ceilings are more prevalent than sticky floors and that these prevail in the majority of our countries. Fourth, the gender pay gap differs significantly across the public and the private sector wages distribution for each of our EU countries.
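The glass-ceiling pattern can be illustrated with a toy simulation (purely hypothetical numbers, not the EU panel data): when women's log-wage distribution is compressed at the top, the raw gap widens toward the upper quantiles:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Hypothetical log wages: similar at the bottom, compressed top for women
men = 3.00 + 0.50 * rng.normal(size=n)
women = 2.95 + 0.40 * rng.normal(size=n)

for q in (0.10, 0.50, 0.90):
    gap = np.quantile(men, q) - np.quantile(women, q)
    print(f"raw gap at q={q:.2f}: {gap:+.2f}")
# The gap grows toward the top of the distribution
```

The paper goes further, using quantile regression to net out individual and job characteristics at each quantile; the sketch above shows only raw quantile gaps.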

Journal ArticleDOI
TL;DR: Stochastic boundaries are compatible with H modes and may be attractive for ELM control in next-step fusion tokamaks, and the H mode transport barrier and core confinement are unaffected by the stochastic boundary.
Abstract: A stochastic magnetic boundary, produced by an externally applied edge resonant magnetic perturbation, is used to suppress large edge localized modes (ELMs) in high confinement (H-mode) plasmas. The resulting H-mode displays rapid, small oscillations with a bursty character modulated by a coherent 130 Hz envelope. The H-mode transport barrier is unaffected by the stochastic boundary. The core confinement of these discharges is unaffected, despite a three-fold drop in the toroidal rotation in the plasma core. These results demonstrate that stochastic boundaries are compatible with H-modes and may be attractive for ELM control in next-step burning fusion tokamaks.

Journal ArticleDOI
TL;DR: Heterotopic bone induction to form a mandibular replacement inside the latissimus dorsi muscle in a human being is possible and allows for a lower operative burden compared with conventional techniques by avoiding creation of a secondary bone defect.

Journal ArticleDOI
TL;DR: A mathematical model that successfully describes a wide range of results in humans and other mammals is presented, showing that the time-course of human dark adaptation and pigment regeneration is determined by the local concentration of 11-cis retinal, and that after a large bleach the recovery is limited by the rate at which 11-cis retinal is delivered to opsin in the bleached rod outer segments.

Journal ArticleDOI
TL;DR: In this article, the authors estimate global empirical orthogonal functions that are then combined with historical tide gauge data to estimate monthly distributions of large-scale sea level variability and change over the period 1950-2000.
Abstract: TOPEX/Poseidon satellite altimeter data are used to estimate global empirical orthogonal functions that are then combined with historical tide gauge data to estimate monthly distributions of large-scale sea level variability and change over the period 1950–2000. The reconstruction is an attempt to narrow the current broad range of sea level rise estimates, to identify any pattern of regional sea level rise, and to determine any variation in the rate of sea level rise over the 51-yr period. The computed rate of global-averaged sea level rise from the reconstructed monthly time series is 1.8 ± 0.3 mm yr−1. With the decadal variability in the computed global mean sea level, it is not possible to detect a significant increase in the rate of sea level rise over the period 1950–2000. A regional pattern of sea level rise is identified. The maximum sea level rise is in the eastern off-equatorial Pacific and there is a minimum along the equator, in the western Pacific, and in the eastern Indian Ocean. A g...

Journal ArticleDOI
TL;DR: PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field, and implements numerous optimisations that make highly parameter rich likelihood functions solvable within hours on multi-CPU hardware.
Abstract: Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences – ignoring the biological significance of sequence differences. A suite of sophisticated likelihood based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpG's, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter rich models with a large data set, reducing the time required for optimisation from ~10 days to ~6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. 
The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter rich likelihood functions solvable within hours on multi-CPU hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software .
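As a minimal illustration of the phylogeny-based likelihood calculations such toolkits perform (standard Jukes-Cantor textbook formulas, not PyEvolve code, and hypothetical toy sequences), the maximum-likelihood branch length between two sequences recovers the closed-form JC distance:

```python
import numpy as np

# Two aligned toy sequences (hypothetical data)
s1 = "ACGTACGTACGTACGTACGA"
s2 = "ACGTACGAACGTTCGTACGA"
n = len(s1)
n_diff = sum(a != b for a, b in zip(s1, s2))

def log_lik(t):
    # Jukes-Cantor site probabilities (up to a constant factor per site)
    e = np.exp(-4.0 * t / 3.0)
    p_same = 0.25 + 0.75 * e
    p_diff = 0.25 - 0.25 * e        # one specific mismatching base
    return (n - n_diff) * np.log(p_same) + n_diff * np.log(p_diff)

# ML branch length (expected substitutions per site) by grid search
grid = np.linspace(1e-4, 2.0, 20_000)
t_ml = grid[np.argmax(log_lik(grid))]

# Closed-form JC distance agrees with the ML estimate
p_hat = n_diff / n
t_jc = -0.75 * np.log(1 - 4.0 * p_hat / 3.0)
print(round(t_ml, 3), round(t_jc, 3))
```

Richer models (codon or dinucleotide substitution, as in the CpG example above) follow the same pattern, only with larger rate matrices and more parameters to optimise.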

Journal ArticleDOI
TL;DR: In this article, the authors apply a new conditional fixed-effect ordinal estimator to their measure of life satisfaction using data from the German Socio-economic Panel (GSOEP).
Abstract: One of the most prominent political and economic events of recent decades was the fall of the Berlin Wall on November 9, 1989, which was quickly followed by the reunification of the formerly separate entities of East and West Germany. It is well acknowledged that the fall of the wall was widely unanticipated in Germany (Stefan Bach and Harold Trabold, 2000), and thus it provides some useful exogenous variation with which we can more firmly establish causality in empirical analyses. In this paper, we aim to contribute to the growing economics literature on the determinants of life satisfaction (or happiness) by investigating how life satisfaction in East Germany changed over the decade following reunification. We are particularly interested in identifying the contribution that the substantial increase in real household income in East Germany in the post-reunification years (i.e., around 60 percent between 1990 and 2001) made to reported levels of life satisfaction. In order to achieve this aim, we apply a new conditional fixed-effect ordinal estimator to our measure of life satisfaction using data from the German Socio-Economic Panel (GSOEP). The estimates from this new model are then decomposed, using a new causal technique, in order to identify the factors that drove average changes in life satisfaction in East Germany following reunification. Our methodology exploits the fact that the GSOEP is an evolving panel, allowing us to make a distinction among changes in variables affecting everyone, changes in the aggregate unobserved fixed individual characteristics of the panel due to new entrants (who are also mostly younger cohorts), and panel attrition. In Section I, we briefly review the literature and describe our data. In Section II, we present the fixed-effect methodology and the causal decomposition approach that we adopt. Section III presents the results. Finally, Section IV concludes.

Journal ArticleDOI
TL;DR: Relative to a control, two internet sites significantly reduced personal stigma, although the effects were small and changes in stigma were not mediated by changes in depression, depression literacy or cognitive–behavioural therapy literacy; the internet warrants further investigation as a means of delivering stigma reduction programmes for depression.
Abstract: Background Little is known about the efficacy of educational interventions for reducing the stigma associated with depression. Aims To investigate the effects on stigma of two internet depression sites. Method A sample of 525 individuals with elevated scores on a depression assessment scale were randomly allocated to a depression information website (BluePages), a cognitive–behavioural skills training website (MoodGYM) or an attention control condition. Personal stigma (personal stigmatising attitudes to depression) and perceived stigma (perception of what most other people believe) were assessed before and after the intervention. Results Relative to the control, the internet sites significantly reduced personal stigma, although the effects were small. BluePages had no effect on perceived stigma and MoodGYM was associated with an increase in perceived stigma relative to the control. Changes in stigma were not mediated by changes in depression, depression literacy or cognitive–behavioural therapy literacy. Conclusions The internet warrants further investigation as a means of delivering stigma reduction programmes for depression.

Journal ArticleDOI
TL;DR: A systematic search of the literature on the psychometric properties and validity of the IQCODE, carried out using three databases, shows that the questionnaire has high reliability and measures a single general factor of cognitive decline.
Abstract: Background and aims: The IQCODE is widely used as a screening test for dementia, particularly where the subject is unable to undergo direct cognitive testing or for screening in populations with low levels of education and literacy. This review draws together research on the psychometric properties and validity of the IQCODE. Method: A systematic search of the literature was carried out using three databases. Results: The review shows that the questionnaire has high reliability and measures a single general factor of cognitive decline. It validly reflects past cognitive decline, performs at least as well at screening as conventional cognitive screening tests, predicts incident dementia, and correlates with a wide range of cognitive tests. A particular strength is that the IQCODE is relatively unaffected by education and pre-morbid ability or by proficiency in the culture's dominant language. The disadvantage of the IQCODE is that it is affected by informant characteristics such as depression and anxiety in the informant and the quality of the relationship between the informant and the subject. Conclusions: Because the IQCODE provides information complementary to brief cognitive tests, harnessing them together can improve screening accuracy.

Journal ArticleDOI
TL;DR: In this paper, the authors use eustasy, glacio-hydro-isostasy and vertical tectonic motion to predict relative sea-level change along the Italian coast and in adjacent seas.

Journal ArticleDOI
TL;DR: Diversity Arrays Technology can be effectively applied to genetic mapping and diversity analyses of barley and is highlighted as a generic technique for genome profiling in the context of molecular breeding and genomics.
Abstract: Diversity Arrays Technology (DArT) can detect and type DNA variation at several hundred genomic loci in parallel without relying on sequence information. Here we show that it can be effectively applied to genetic mapping and diversity analyses of barley, a species with a 5,000-Mbp genome. We tested several complexity reduction methods and selected two that generated the most polymorphic genomic representations. Arrays containing individual fragments from these representations generated DArT fingerprints with a genotype call rate of 98.0% and a scoring reproducibility of at least 99.8%. The fingerprints grouped barley lines according to known genetic relationships. To validate the Mendelian behavior of DArT markers, we constructed a genetic map for a cross between cultivars Steptoe and Morex. Nearly all polymorphic array features could be incorporated into one of seven linkage groups (98.8%). The resulting map comprised ≈385 unique DArT markers and spanned 1,137 centimorgans. A comparison with the restriction fragment length polymorphism-based framework map indicated that the quality of the DArT map was equivalent, if not superior, to that of the framework map. These results highlight the potential of DArT as a generic technique for genome profiling in the context of molecular breeding and genomics.

Journal ArticleDOI
TL;DR: A better understanding of the evolving social dynamics of emerging infectious diseases ought to help anticipate and, hopefully, ameliorate current and future risks.
Abstract: Fifty years ago, the age-old scourge of infectious disease was receding in the developed world in response to improved public health measures, while the advent of antibiotics, better vaccines, insecticides and improved surveillance held the promise of eradicating residual problems. By the late twentieth century, however, an increase in the emergence and re-emergence of infectious diseases was evident in many parts of the world. This upturn looms as the fourth major transition in human-microbe relationships since the advent of agriculture around 10,000 years ago. About 30 new diseases have been identified, including Legionnaires' disease, human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS), hepatitis C, bovine spongiform encephalopathy (BSE)/variant Creutzfeldt-Jakob disease (vCJD), Nipah virus, several viral hemorrhagic fevers and, most recently, severe acute respiratory syndrome (SARS) and avian influenza. The emergence of these diseases, and resurgence of old ones like tuberculosis and cholera, reflects various changes in human ecology: rural-to-urban migration resulting in high-density peri-urban slums; increasing long-distance mobility and trade; the social disruption of war and conflict; changes in personal behavior; and, increasingly, human-induced global changes, including widespread forest clearance and climate change. Political ignorance, denial and obduracy (as with HIV/AIDS) further compound the risks. The use and misuse of medical technology also pose risks, such as drug-resistant microbes and contaminated equipment or biological medicines. A better understanding of the evolving social dynamics of emerging infectious diseases ought to help us to anticipate and hopefully ameliorate current and future risks.

Journal ArticleDOI
01 Feb 2004-Ecology
TL;DR: The results from the analyses suggest that, as a promoter of species coexistence, the IDH is both broader in scope and richer in detail than has previously been recognized.
Abstract: The intermediate disturbance hypothesis (IDH) has been used for several decades as an explanation for the coexistence of species in ecological communities. It is intuitively simple, but deceptively so. We show, via discussion and examples, that the IDH is not one mechanism of coexistence, but rather summarizes a set of similar phenomena that can arise from the action of several different coexistence mechanisms. These underlying mechanisms are defined by the various ways in which species differ in their response to disturbance-induced spatial and temporal variability in resources and environmental conditions. As an example, the original specification of the IDH required patchy disturbances for coexistence. However, because the underlying mechanisms of coexistence can also operate at the within-patch scale, patchy disturbances are not a necessary requirement for coexistence under intermediate-disturbance regimes. These conclusions are illustrated through the analysis of three models: a spatial within-patch model, a spatial between-patch model, and a purely temporal model. All three generate similar patterns of coexistence under intermediate disturbance, yet underlying that coexistence lie at least two quite-distinct mechanisms of species coexistence: the storage effect and relative nonlinearity. The results from our analyses suggest that, as a promoter of species coexistence, the IDH is both broader in scope and richer in detail than has previously been recognized.
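The coexistence-at-intermediate-disturbance pattern the abstract describes can be made concrete with a toy model. The sketch below is *not* one of the three models analysed in the paper; it is the classic competition-colonisation trade-off, in which disturbance acts as a uniform mortality rate m, species 1 is the superior competitor and species 2 the superior coloniser. All parameter values are illustrative.

```python
# Toy illustration (assumed model, not the paper's): competition-colonisation
# trade-off with disturbance as a uniform patch-mortality rate m.
def simulate(m, c1=1.0, c2=4.0, dt=0.01, t_end=500.0):
    """Euler-integrate patch occupancies (p1, p2) to near-equilibrium."""
    p1, p2 = 0.1, 0.1
    for _ in range(int(t_end / dt)):
        # Superior competitor: colonises empty or inferior-held patches.
        dp1 = c1 * p1 * (1 - p1) - m * p1
        # Superior coloniser: only empty patches, and is displaced by species 1.
        dp2 = c2 * p2 * (1 - p1 - p2) - m * p2 - c1 * p1 * p2
        p1 = max(p1 + dt * dp1, 0.0)
        p2 = max(p2 + dt * dp2, 0.0)
    return p1, p2

# Analytically, species 2 persists only if m > c1**2 / c2 and species 1 only
# if m < c1, so both species coexist solely at intermediate disturbance.
for m in (0.1, 0.5, 1.2):   # low, intermediate, high disturbance
    p1, p2 = simulate(m)
    print(f"m = {m}: p1 = {p1:.3f}, p2 = {p2:.3f}")
```

With these parameters the coexistence window is 0.25 < m < 1.0: low disturbance leaves only the competitor, high disturbance only the coloniser, and both persist in between, which is the qualitative IDH pattern the abstract attributes to several distinct underlying mechanisms.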

Journal ArticleDOI
TL;DR: The 6dF Galaxy Survey (6dFGS), as discussed by the authors, will be the largest redshift survey of the nearby Universe, reaching out to z ~ 0.15 and more than an order of magnitude larger than any peculiar velocity survey to date.
Abstract: The 6dF Galaxy Survey (6dFGS) aims to measure the redshifts of around 150 000 galaxies, and the peculiar velocities of a 15 000-member subsample, over almost the entire southern sky. When complete, it will be the largest redshift survey of the nearby Universe, reaching out to z ~ 0.15, and more than an order of magnitude larger than any peculiar velocity survey to date. The targets are all galaxies brighter than K_tot = 12.75 in the 2MASS Extended Source Catalog (XSC), supplemented by 2MASS and SuperCOSMOS galaxies that complete the sample to limits of (H, J, r_F, b_J) = (13.05, 13.75, 15.6, 16.75). Central to the survey is the Six-Degree Field (6dF) multifibre spectrograph, an instrument able to record 150 simultaneous spectra over the 5.7° field of the UK Schmidt Telescope. An adaptive tiling algorithm has been employed to ensure around 95 per cent fibring completeness over the 17 046 deg² of the southern sky with |b| > 10°. Spectra are obtained in two observations using separate V and R gratings that together give R ~ 1000 over at least 4000-7500 Å and a signal-to-noise ratio of ~10 per pixel. Redshift measurements are obtained semi-automatically, and are assigned a quality value based on visual inspection. The 6dFGS database is available at http://www-wfau.roe.ac.uk/6dFGS/, with public data releases occurring after the completion of each third of the survey.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the observed correlation between galaxy environment and Hα emission-line strength, using volume-limited samples and group catalogues of 24 968 galaxies at 0.05 < z < 0.095, drawn from the 2dF Galaxy Redshift Survey (M_bJ < -19.5) and the Sloan Digital Sky Survey (M_r < -20.6).
Abstract: We analyse the observed correlation between galaxy environment and Hα emission-line strength, using volume-limited samples and group catalogues of 24 968 galaxies at 0.05 < z < 0.095, drawn from the 2dF Galaxy Redshift Survey (M_bJ < -19.5) and the Sloan Digital Sky Survey (M_r < -20.6). We characterize the environment by: (1) Sigma_5, the surface number density of galaxies determined by the projected distance to the fifth nearest neighbour; and (2) rho_1.1 and rho_5.5, three-dimensional density estimates obtained by convolving the galaxy distribution with Gaussian kernels of dispersion 1.1 and 5.5 Mpc, respectively. We find that star-forming and quiescent galaxies form two distinct populations, as characterized by their Hα equivalent width, W_0(Hα). The relative numbers of star-forming and quiescent galaxies vary strongly and continuously with local density. However, the distribution of W_0(Hα) amongst the star-forming population is independent of environment. The fraction of star-forming galaxies shows strong sensitivity to the density on large scales, rho_5.5, which is likely independent of the trend with local density, rho_1.1. We use two differently selected group catalogues to demonstrate that the correlation with galaxy density is approximately independent of group velocity dispersion, for sigma = 200-1000 km s⁻¹. Even in the lowest-density environments, no more than ~70 per cent of galaxies show significant Hα emission. Based on these results, we conclude that the present-day correlation between star formation rate and environment is a result of short-time-scale mechanisms that take place preferentially at high redshift, such as starbursts induced by galaxy-galaxy interactions.
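The two environment measures in this abstract, a fifth-nearest-neighbour surface density and Gaussian-kernel volume densities, can be sketched as follows. This is a minimal toy version with random positions in place of survey data and hypothetical function names; the survey's actual estimators additionally handle survey edges and redshift-space effects not shown here.

```python
import numpy as np

def sigma_n(xy, n=5):
    """Projected surface density from the n-th nearest neighbour:
    Sigma_n = n / (pi * d_n**2)  (one common convention)."""
    d2 = np.sum((xy[:, None, :] - xy[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)               # exclude self-matches
    d_n = np.sqrt(np.sort(d2, axis=1)[:, n - 1])
    return n / (np.pi * d_n ** 2)

def gaussian_density(xyz, sigma=1.1):
    """3-D density at each galaxy: sum of Gaussian kernels of
    dispersion `sigma` (Mpc) centred on every other galaxy."""
    d2 = np.sum((xyz[:, None, :] - xyz[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)               # exclude the galaxy itself
    w = np.exp(-d2 / (2 * sigma ** 2))
    return w.sum(axis=1) / (2 * np.pi * sigma ** 2) ** 1.5

rng = np.random.default_rng(0)
pos = rng.uniform(0, 50, size=(200, 3))        # toy galaxy positions, Mpc
rho_11 = gaussian_density(pos, sigma=1.1)      # small-scale (local) density
rho_55 = gaussian_density(pos, sigma=5.5)      # large-scale density
s5 = sigma_n(pos[:, :2], n=5)                  # projected (x, y) only
```

The 1.1 and 5.5 Mpc bandwidths mirror the rho_1.1 / rho_5.5 split above: the broad kernel averages over many neighbours and so varies much more smoothly across the sample than the narrow one, which is what lets the authors separate large-scale from local trends.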