
Showing papers by "University of Alberta" published in 2006


Journal ArticleDOI
23 Nov 2006-Nature
TL;DR: A first-generation CNV map of the human genome is constructed through the study of 270 individuals from four populations with ancestry in Europe, Africa or Asia, underscoring the importance of CNV in genetic diversity and evolution and the utility of this resource for genetic disease studies.
Abstract: Copy number variation (CNV) of DNA sequences is functionally significant but has yet to be fully ascertained. We have constructed a first-generation CNV map of the human genome through the study of 270 individuals from four populations with ancestry in Europe, Africa or Asia (the HapMap collection). DNA from these individuals was screened for CNV using two complementary technologies: single-nucleotide polymorphism (SNP) genotyping arrays, and clone-based comparative genomic hybridization. A total of 1,447 copy number variable regions (CNVRs), which can encompass overlapping or adjacent gains or losses, covering 360 megabases (12% of the genome) were identified in these populations. These CNVRs contained hundreds of genes, disease loci, functional elements and segmental duplications. Notably, the CNVRs encompassed more nucleotide content per genome than SNPs, underscoring the importance of CNV in genetic diversity and evolution. The data obtained delineate linkage disequilibrium patterns for many CNVs, and reveal marked variation in copy number among populations. We also demonstrate the utility of this resource for genetic disease studies.

4,275 citations


Journal ArticleDOI
TL;DR: DrugBank is a unique bioinformatics/cheminformatics resource that combines detailed drug data with comprehensive drug target information and is fully searchable supporting extensive text, sequence, chemical structure and relational query searches.
Abstract: DrugBank is a unique bioinformatics/cheminformatics resource that combines detailed drug (i.e. chemical) data with comprehensive drug target (i.e. protein) information. The database contains >4100 drug entries including >800 FDA approved small molecule and biotech drugs as well as >3200 experimental drugs. Additionally, >14,000 protein or drug target sequences are linked to these drug entries. Each DrugCard entry contains >80 data fields with half of the information being devoted to drug/chemical data and the other half devoted to drug target or protein data. Many data fields are hyperlinked to other databases (KEGG, PubChem, ChEBI, PDB, Swiss-Prot and GenBank) and a variety of structure viewing applets. The database is fully searchable supporting extensive text, sequence, chemical structure and relational query searches. Potential applications of DrugBank include in silico drug target discovery, drug design, drug docking or screening, drug metabolism prediction, drug interaction prediction and general pharmaceutical education. DrugBank is available at http://redpoll.pharmacy.ualberta.ca/drugbank/.

3,087 citations


Journal ArticleDOI
TL;DR: Roy Suddaby is asked to tackle another “big issue” that the editorial team has noticed with respect to qualitative submissions to AMJ: overly generic use of the term “grounded theory” and confusion regarding alternative epistemological approaches to qualitative research.
Abstract: Editor’s Note. Three years ago, I invited Robert (Bob) Gephart to write a “From the Editors” column designed to help authors improve their chances of success when submitting qualitative research to AMJ. Judging from the increasing number of qualitative studies that have been accepted and published in AMJ since that time, I would like to think that his article, “Qualitative Research and the Academy of Management Journal,” has had a positive impact. Continuing in this tradition, I asked Roy Suddaby—an excellent reviewer (and author) of qualitative research—to tackle another “big issue” that the editorial team has noticed with respect to qualitative submissions to AMJ: overly generic use of the term “grounded theory” and confusion regarding alternative epistemological approaches to qualitative research. Like Bob before him, Roy has, I believe, produced an analysis that will greatly benefit those who are relatively new to qualitative research or who have not yet had much success in getting their qualitative research published. Hopefully, Roy’s analysis will help even more authors to succeed, thus allowing AMJ and other journals to continue to increase the quality of insights provided by rich qualitative studies of individual, organizational, and institutional phenomena. Sara L. Rynes

2,598 citations


Journal ArticleDOI
TL;DR: The authors examine change initiated from the center of mature organizational fields and address the paradox of embedded agency, i.e., how actors enact change in the very institutional context that shapes them.
Abstract: This study examines change initiated from the center of mature organizational fields. As such, it addresses the paradox of embedded agency—that is, the paradox of how actors enact changes to the co...

1,916 citations


Journal ArticleDOI
TL;DR: Islet transplantation with the use of the Edmonton protocol can successfully restore long-term endogenous insulin production and glycemic stability in subjects with type 1 diabetes mellitus and unstable control, but insulin independence is usually not sustainable.
Abstract: Background Islet transplantation offers the potential to improve glycemic control in a subgroup of patients with type 1 diabetes mellitus who are disabled by refractory hypoglycemia. We conducted an international, multicenter trial to explore the feasibility and reproducibility of islet transplantation with the use of a single common protocol (the Edmonton protocol). Methods We enrolled 36 subjects with type 1 diabetes mellitus, who underwent islet transplantation at nine international sites. Islets were prepared from pancreases of deceased donors and were transplanted within 2 hours after purification, without culture. The primary end point was defined as insulin independence with adequate glycemic control 1 year after the final transplantation. Results Of the 36 subjects, 16 (44%) met the primary end point, 10 (28%) had partial function, and 10 (28%) had complete graft loss 1 year after the final transplantation. A total of 21 subjects (58%) attained insulin independence with good glycemic control at any point throughout the trial. Of these subjects, 16 (76%) required insulin again at 2 years; 5 of the 16 subjects who reached the primary end point (31%) remained insulin-independent at 2 years. Conclusions Islet transplantation with the use of the Edmonton protocol can successfully restore long-term endogenous insulin production and glycemic stability in subjects with type 1 diabetes mellitus and unstable control, but insulin independence is usually not sustainable. Persistent islet function even without insulin independence provides both protection from severe hypoglycemia and improved levels of glycated hemoglobin. (ClinicalTrials.gov number, NCT00014911.)

1,784 citations


Journal ArticleDOI
TL;DR: Broad protection against cytohistological outcomes beyond that anticipated for HPV 16/18, as well as protection against incident infection with HPV 45 and HPV 31, is noted, and the vaccine has a good long-term safety profile.

1,601 citations


Journal ArticleDOI
TL;DR: Adalimumab was superior to placebo for induction of remission in patients with moderate to severe Crohn's disease naive to anti-TNF therapy and was well tolerated.

1,579 citations


Journal ArticleDOI
12 Jan 2006-Nature
TL;DR: It is shown that a recent mass extinction associated with pathogen outbreaks is tied to global warming, and it is proposed that temperatures at many highland localities are shifting towards the growth optimum of Batrachochytrium, thus encouraging outbreaks.
Abstract: As the Earth warms, many species are likely to disappear, often because of changing disease dynamics. Here we show that a recent mass extinction associated with pathogen outbreaks is tied to global warming. Seventeen years ago, in the mountains of Costa Rica, the Monteverde harlequin frog (Atelopus sp.) vanished along with the golden toad (Bufo periglenes). An estimated 67% of the 110 or so species of Atelopus, which are endemic to the American tropics, have met the same fate, and a pathogenic chytrid fungus (Batrachochytrium dendrobatidis) is implicated. Analysing the timing of losses in relation to changes in sea surface and air temperatures, we conclude with 'very high confidence' (> 99%, following the Intergovernmental Panel on Climate Change, IPCC) that large-scale warming is a key factor in the disappearances. We propose that temperatures at many highland localities are shifting towards the growth optimum of Batrachochytrium, thus encouraging outbreaks. With climate change promoting infectious disease and eroding biodiversity, the urgency of reducing greenhouse-gas concentrations is now undeniable.

1,528 citations


Journal ArticleDOI
TL;DR: This review supports current guidelines that identify individuals with CKD as being at high risk for cardiovascular mortality; determining which interventions best offset this risk remains a health priority.
Abstract: Current guidelines identify people with chronic kidney disease (CKD) as being at high risk for cardiovascular and all-cause mortality. Because as many as 19 million Americans may have CKD, a comprehensive summary of this risk would be potentially useful for planning public health policy. A systematic review of the association between non–dialysis-dependent CKD and the risk for all-cause and cardiovascular mortality was conducted. Patient- and study-related characteristics that influenced the magnitude of these associations also were investigated. MEDLINE and EMBASE databases were searched, and reference lists through December 2004 were consulted. Authors of 10 primary studies provided additional data. Cohort studies or cohort analyses of randomized, controlled trials that compared mortality between those with and without chronically reduced kidney function were included. Studies were excluded from review when participants were followed for

1,476 citations


Journal ArticleDOI
20 Jul 2006-Leukemia
TL;DR: The pleiotropic effects of MV that are important for communication between cells, as well as the role of MV in carcinogenesis, coagulation, immune responses and modulation of susceptibility/infectability of cells to retroviruses or prions are discussed.
Abstract: Normal and malignant cells shed from their surface membranes as well as secrete from the endosomal membrane compartment circular membrane fragments called microvesicles (MV). MV that are released from viable cells are usually smaller in size compared to the apoptotic bodies derived from damaged cells and unlike them do not contain fragmented DNA. Growing experimental evidence indicates that MV are an underappreciated component of the cell environment and play an important pleiotropic role in many biological processes. Generally, MV are enriched in various bioactive molecules and may (i) directly stimulate cells as a kind of 'signaling complex', (ii) transfer membrane receptors, proteins, mRNA and organelles (e.g., mitochondria) between cells and finally (iii) deliver infectious agents into cells (e.g., human immunodeficiency virus, prions). In this review, we discuss the pleiotropic effects of MV that are important for communication between cells, as well as the role of MV in carcinogenesis, coagulation, immune responses and modulation of susceptibility/infectability of cells to retroviruses or prions.

1,291 citations


Journal ArticleDOI
TL;DR: In this paper, a series of numerical issues related to the analysis and implementation of fractional step methods for incompressible flows are addressed, and the essential results are summarized in a table which could serve as a useful reference to numerical analysts and practitioners.
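The entry above names fractional step (projection) methods without showing one. As a rough, generic illustration only, and not the schemes analyzed in this paper, the Python sketch below performs the two characteristic sub-steps on a doubly periodic grid: an explicit advection-diffusion update that ignores pressure, followed by an FFT Poisson solve whose gradient is subtracted to restore incompressibility. The grid size, viscosity, time step, and Taylor-Green-like initial field are arbitrary choices made for the demo.

```python
import numpy as np

# Generic fractional-step (projection) update for 2-D incompressible flow on a
# doubly periodic grid. Illustrative only; not the schemes analyzed in the paper.
N = 64
L = 2 * np.pi
dx = L / N
dt, nu = 1e-3, 1e-2

x = np.arange(N) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y)          # divergence-free (Taylor-Green-like) start
v = -np.cos(X) * np.sin(Y)

k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
KX, KY = np.meshgrid(k, k, indexing="ij")
K2 = KX**2 + KY**2
K2[0, 0] = 1.0                     # avoid 0/0 for the mean mode

def ddx(f, K):
    """Spectral derivative of a periodic field along one direction."""
    return np.real(np.fft.ifft2(1j * K * np.fft.fft2(f)))

def lap(f):
    return np.real(np.fft.ifft2(-K2 * np.fft.fft2(f)))

for _ in range(200):
    # Sub-step 1: provisional velocity u* from advection + diffusion, no pressure.
    u_star = u + dt * (-u * ddx(u, KX) - v * ddx(u, KY) + nu * lap(u))
    v_star = v + dt * (-u * ddx(v, KX) - v * ddx(v, KY) + nu * lap(v))

    # Sub-step 2: solve lap(phi) = div(u*), then project u = u* - grad(phi),
    # which makes the new velocity divergence-free (pressure = phi / dt).
    u_hat, v_hat = np.fft.fft2(u_star), np.fft.fft2(v_star)
    div_hat = 1j * KX * u_hat + 1j * KY * v_hat
    phi_hat = -div_hat / K2
    u = np.real(np.fft.ifft2(u_hat - 1j * KX * phi_hat))
    v = np.real(np.fft.ifft2(v_hat - 1j * KY * phi_hat))

print("max |div u| after projection:", float(np.abs(ddx(u, KX) + ddx(v, KY)).max()))
```

The printed divergence should sit near machine precision, which is the whole point of the projection sub-step.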

Journal ArticleDOI
29 Jun 2006-BMJ
TL;DR: The observed association between good adherence to placebo and mortality supports the existence of the “healthy adherer” effect, whereby adherence to drug therapy may be a surrogate marker for overall healthy behaviour.
Abstract: Objective To evaluate the relation between adherence to drug therapy, including placebo, and mortality. Design Meta-analysis of observational studies. Data sources Electronic databases, contact with investigators, and textbooks and reviews on adherence. Review methods Predefined criteria were used to select studies reporting mortality among participants with good and poor adherence to drug therapy. Data were extracted for disease, drug therapy groups, methods for measurement of adherence rate, definition for good adherence, and mortality. Results Data were available from 21 studies (46 847 participants), including eight studies with placebo arms (19 633 participants). Compared with poor adherence, good adherence was associated with lower mortality (odds ratio 0.56, 95% confidence interval 0.50 to 0.63). Good adherence to placebo was associated with lower mortality (0.56, 0.43 to 0.74), as was good adherence to beneficial drug therapy (0.55, 0.49 to 0.62). Good adherence to harmful drug therapy was associated with increased mortality (2.90, 1.04 to 8.11). Conclusion Good adherence to drug therapy is associated with positive health outcomes. Moreover, the observed association between good adherence to placebo and mortality supports the existence of the “healthy adherer” effect, whereby adherence to drug therapy may be a surrogate marker for overall healthy behaviour.
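The pooled odds ratios quoted above come from standard inverse-variance meta-analysis on the log-odds scale. As a worked illustration of that arithmetic only, the sketch below pools three hypothetical studies into a fixed-effect odds ratio with a 95% confidence interval; the 2x2 counts are invented for the example and are not data from this review, whose own analysis also handles between-study heterogeneity.

```python
import math

# Hypothetical 2x2 counts per study: (deaths_good, alive_good, deaths_poor, alive_poor).
# Illustrative numbers only; NOT taken from the meta-analysis.
studies = [
    (30, 970, 55, 945),
    (12, 488, 20, 480),
    (45, 1455, 70, 1430),
]

num = den = 0.0
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))   # log odds ratio for one study
    var = 1 / a + 1 / b + 1 / c + 1 / d    # its approximate variance
    w = 1 / var                            # inverse-variance weight
    num += w * log_or
    den += w

pooled_log_or = num / den
se = math.sqrt(1 / den)
lo = math.exp(pooled_log_or - 1.96 * se)
hi = math.exp(pooled_log_or + 1.96 * se)
print(f"pooled OR = {math.exp(pooled_log_or):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```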

Journal ArticleDOI
TL;DR: For stroke patients treated 3 to 6 hours after onset, baseline MRI findings can identify subgroups that are likely to benefit from reperfusion therapies and can potentially identify subgroups that are unlikely to benefit or may be harmed.
Abstract: Objective To determine whether prespecified baseline magnetic resonance imaging (MRI) profiles can identify stroke patients who have a robust clinical response after early reperfusion when treated 3 to 6 hours after symptom onset. Methods We conducted a prospective, multicenter study of 74 consecutive stroke patients admitted to academic stroke centers in North America and Europe. An MRI scan was obtained immediately before and 3 to 6 hours after treatment with intravenous tissue plasminogen activator 3 to 6 hours after symptom onset. Baseline MRI profiles were used to categorize patients into subgroups, and clinical responses were compared based on whether early reperfusion was achieved. Results Early reperfusion was associated with significantly increased odds of achieving a favorable clinical response in patients with a perfusion/diffusion mismatch (odds ratio, 5.4; p = 0.039) and an even more favorable response in patients with the Target Mismatch profile (odds ratio, 8.7; p = 0.011). Patients with the No Mismatch profile did not appear to benefit from early reperfusion. Early reperfusion was associated with fatal intracranial hemorrhage in patients with the Malignant profile. Interpretation For stroke patients treated 3 to 6 hours after onset, baseline MRI findings can identify subgroups that are likely to benefit from reperfusion therapies and can potentially identify subgroups that are unlikely to benefit or may be harmed. Ann Neurol 2006

Journal ArticleDOI
TL;DR: In this paper, the authors examine the nature of family businesses in an attempt to explain why some seem to do so well and others so poorly, and draw conclusions about the drivers that make some family businesses great competitors, while leaving others at a disadvantage.
Abstract: After decades of being viewed as obsolete and problem ridden, recent research has begun to show that major, publicly traded family-controlled businesses (FCBs) actually out-perform other types of businesses. This article examines the nature of such family businesses in an attempt to explain why some seem to do so well and others so poorly. It begins with four fundamental governance choices that distinguish among different kinds of family businesses: level and mode of family ownership, family leadership, the broader involvement of multiple family members, and the planned or actual participation of later generations. Using precepts from agency and stewardship theory, it relates these dimensions to the nature of the resource-allocation decisions made by the business and capability development, which in turn have implications for financial performance. Propositions are drawn about the drivers that make some family businesses great competitors—while leaving others at a disadvantage.

Journal ArticleDOI
TL;DR: 1. Office of Population Census and Surveys (OPCS)—Surveys of Psychiatric Morbidity in Great Britain Report 1: The prevalence of psychiatric morbidity amongst adults living in private households.
Abstract:
1. Meltzer H, Gill H, Petticrew M, Hinds K. Office of Population Census and Surveys (OPCS): Surveys of Psychiatric Morbidity in Great Britain, Report 1: The prevalence of psychiatric morbidity amongst adults living in private households. London: HMSO, 1995.
2. Beekman AT, Copeland JR, Prince MJ. Review of community prevalence of depression in later life. Br J Psychiatry 1999; 174: 307–11.
3. Prescription Pricing Authority (PPA) PACT Centre Pages. Drugs used in Mental Health. http://www.ppa.org.uk/news/pact-112003/pact-112003.htm (4 November 2004, date last accessed).
4. Middleton N, Gunnell D, Whitley E, Dorling D, Frankel S. Secular trends in antidepressant prescribing in the UK, 1975–1998. J Public Health Med 2001; 23: 262–6.
5. National Institute for Clinical Excellence. Management of depression in primary and secondary care. Clinical Guideline 23. National Institute for Clinical Excellence, 2004.
6. Percudani M, Barbui C, Fortino I, Petrovich L. Antidepressant drug prescribing among elderly subjects: a population-based study. Int J Geriatr Psychiatry 2005; 20: 113–8.
7. Lawreson RA, Tyrere F, Newson RB, Farmer RDT. The treatment of depression in UK general practice: selective serotonin reuptake inhibitors and tricyclic antidepressants compared. J Affect Disord 2000; 59: 149–57.
8. Wilson KC, Copeland JR, Taylor S, Donoghue J, McCracken CF. Natural history of pharmacotherapy of older depressed community resident. The MRC-ALPHA Study. Br J Psychiatry 1999; 175: 439–43.
9. Living in Britain. A summary of changes over time: Use of health services. Office of National Statistics (ONS). http://www.statistics.gov.uk (16 February 2005, date last accessed).
10. Rosenbaum JF, Zajecka J. Clinical management of antidepressant discontinuation. J Clin Psychiatry 1998; 59: 535–7.
11. Zermansky AG. Who controls repeats? Br J Gen Pract 1996; 46: 643–7.

Journal ArticleDOI
TL;DR: The authors pursue a more in-depth discussion of the positions of Glaser, using Glaser's work, and Strauss, using Strauss's and Strauss and Corbin's (1990) work, regarding the different phases of data analysis, specifically addressing the coding procedures, verification, and the issue of forcing versus emergence.
Abstract: Grounded theory, as an evolving qualitative research method, is a product of its history as well as of its epistemology. Within the literature, there have been a number of discussions focusing on the differences between Glaser's (1978, 1992) and Strauss's (1987, 1990) versions of grounded theory. The purpose of this article is to add a level of depth and breadth to this discussion through specifically exploring the Glaser-Strauss debate by comparing the data analysis processes and procedures advocated by Glaser and by Strauss. To accomplish this task, the authors present the article in two sections. First, they provide relevant background information on grounded theory as a research method. Second, they pursue a more in-depth discussion of the positions of Glaser, using Glaser's work, and Strauss, using Strauss's and Strauss and Corbin's (1990) work, regarding the different phases of data analysis, specifically addressing the coding procedures, verification, and the issue of forcing versus emergence.

Journal ArticleDOI
TL;DR: Exercise is an effective intervention to improve quality of life, cardiorespiratory fitness, physical functioning and fatigue in breast cancer patients and survivors and larger trials that examine the long-term benefits of exercise are needed for this patient group.
Abstract: Background: Physical exercise has been identified as a potential intervention to improve quality of life in women with breast cancer. We sought to summarize the available evidence concerning the effects of exercise on breast cancer patients and survivors. Methods: We searched the Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, CINAHL, PsychINFO, CancerLit, PEDro and SportDiscus as well as conference proceedings, clinical practice guidelines and other unpublished literature resources. We included only randomized controlled trials that examined exercise interventions for breast cancer patients or survivors with quality of life, cardiorespiratory fitness or physical functioning as primary outcomes. We also extracted data on symptoms of fatigue, body composition and adverse effects. Results: Of 136 studies identified, 14 met all the inclusion criteria. Despite significant heterogeneity and relatively small samples, the point estimates in terms of the benefits of exercise for all outcomes were positive even when statistical significance was not achieved. Exercise led to statistically significant improvements in quality of life as assessed by the Functional Assessment of Cancer Therapy–General (weighted mean difference [WMD] 4.58, 95% confidence interval [CI] 0.35 to 8.80) and Functional Assessment of Cancer Therapy–Breast (WMD 6.62, 95% CI 1.21 to 12.03). Exercise also led to significant improvements in physical functioning and peak oxygen consumption and in reducing symptoms of fatigue. Interpretation: Exercise is an effective intervention to improve quality of life, cardiorespiratory fitness, physical functioning and fatigue in breast cancer patients and survivors. Larger trials that have a greater focus on study quality and adverse effects and that examine the long-term benefits of exercise are needed for this patient group.

Journal ArticleDOI
TL;DR: A broad survey is conducted of the different types of machine learning methods being used, the types of data being integrated and the performance of these methods in cancer prediction and prognosis; trends noted include a growing dependence on protein biomarkers and microarray data, a strong bias towards applications in prostate and breast cancer, and a heavy reliance on "older" technologies.
Abstract: Machine learning is a branch of artificial intelligence that employs a variety of statistical, probabilistic and optimization techniques that allow computers to "learn" from past examples and to detect hard-to-discern patterns from large, noisy or complex data sets. This capability is particularly well-suited to medical applications, especially those that depend on complex proteomic and genomic measurements. As a result, machine learning is frequently used in cancer diagnosis and detection. More recently machine learning has been applied to cancer prognosis and prediction. This latter approach is particularly interesting as it is part of a growing trend towards personalized, predictive medicine. In assembling this review we conducted a broad survey of the different types of machine learning methods being used, the types of data being integrated and the performance of these methods in cancer prediction and prognosis. A number of trends are noted, including a growing dependence on protein biomarkers and microarray data, a strong bias towards applications in prostate and breast cancer, and a heavy reliance on "older" technologies.
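The review surveys classifiers such as artificial neural networks and support vector machines applied to high-dimensional proteomic and microarray data. As a hedged sketch of that kind of workflow, and not an analysis from the review itself, the example below cross-validates two such models on a synthetic "many features, few samples" dataset; all names and parameters are illustrative choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a proteomic/microarray dataset: many features, few samples.
X, y = make_classification(n_samples=120, n_features=500, n_informative=15,
                           random_state=0)

models = {
    "ANN (MLP)": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```

Cross-validation, rather than a single train/test split, is the usual safeguard against overly optimistic accuracy estimates on small, high-dimensional datasets of this kind.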

Journal ArticleDOI
TL;DR: In the years ahead, climate warming will aggravate eutrophication in lakes receiving point sources of nutrients as a result of increasing water residence times, and decreased silica supplies may increasingly favor the replacement of diatoms by nitrogen-fixing Cyanobacteria.
Abstract: Major advances in the scientific understanding and management of eutrophication have been made since the late 1960s. The control of point sources of phosphorus reduced algal blooms in many lakes. Diffuse nutrient sources from land use changes and urbanization in the catchments of lakes have proved possible to control but require many years of restoration efforts. The importance of water residence time to eutrophication has been recognized. Changes in aquatic communities contribute to eutrophication via the trophic cascade, nutrient stoichiometry, and transport of nutrients from benthic to pelagic regions. Overexploitation of piscivorous fishes appears to be a particularly common amplifier of eutrophication. Internal nutrient loading can be controlled by reducing external loading, although the full response of lakes may take decades. In the years ahead, climate warming will aggravate eutrophication in lakes receiving point sources of nutrients, as a result of increasing water residence times. Decreased silica supplies from dwindling inflows may increasingly favor the replacement of diatoms by nitrogen-fixing Cyanobacteria. Increases in transport of nitrogen by rivers to estuaries and coastal oceans have followed increased use of nitrogen in agriculture and increasing emissions to the atmosphere. Our understanding of eutrophication and its management has evolved from simple control of nutrient sources to recognition that it is often a cumulative effects problem that will require protection and restoration of many features of a lake's community and its catchment.

Journal ArticleDOI
01 Sep 2006-Chest
TL;DR: It was shown that BMI has significant effects on all of the lung volumes; the greatest effects were on FRC and ERV and occurred at BMI values < 30 kg/m2. These findings will assist clinicians when interpreting PFT results in patients with normal airway function.

Journal ArticleDOI
TL;DR: Recent progress in the advanced oxidation of aqueous pharmaceuticals is reviewed; ozonation and advanced oxidation processes appear promising for the efficient degradation of pharmaceuticals in water and wastewater.
Abstract: A vast number of pharmaceuticals have been detected in surface water and drinking water around the world, which indicates their ineffective removal from water and wastewater using conventional treatment technologies. Concerns have been raised over the potential adverse effects of pharmaceuticals on public health and aquatic environment. Among the different treatment options, ozonation and advanced oxidation processes are likely promising for efficient degradation of pharmaceuticals in water and wastewater. Recent progress of advanced oxidation of aqueous pharmaceuticals is reviewed in this paper. The pharmaceuticals and non-therapeutic medical agents of interest include antibiotics, anticonvulsants, antipyretics, beta-blockers, cytostatic drugs, H2 antagonists, estrogenic hormones and contraceptives, blood lipid regulators, and X-ray contrast media.

01 Dec 2006
TL;DR: In this article, the authors show that sulfate reduction in diagenetic environments occurs in two mutually exclusive thermal regimes, i.e., low-temperature and high-temperature diagenetic environments, respectively.
Abstract: The association of dissolved sulfate and hydrocarbons is thermodynamically unstable in virtually all diagenetic environments. Hence, redox reactions occur, whereby sulfate is reduced by hydrocarbons either bacterially (bacterial sulfate reduction, BSR) or inorganically (thermochemical sulphate reduction, TSR). Their geologically and economically significant products are similar. Based on empirical evidence, BSR and TSR occur in two mutually exclusive thermal regimes, i.e. low-temperature and high-temperature diagenetic environments, respectively. BSR is common in diagenetic settings from 0 up to about 60–80°C. Above this temperature range, almost all sulfate-reducing microbes cease to metabolize. Those few types of hyperthermophilic microbes that can form H2S at higher temperatures appear to be very rare and do not normally occur and/or metabolize in geologic settings that are otherwise conducive to BSR. TSR appears to be common in geologic settings with temperatures of about 100–140°C, but in some settings temperatures of 160–180°C appear to be necessary. TSR does not have a sharply defined, generally valid minimum temperature because the onset and rate of TSR are governed by several factors that vary from place to place, i.e. the composition of the available organic reactants, kinetic inhibitors and/or catalysts, anhydrite dissolution rates, wettability, as well as migration and diffusion rates of the major reactants toward one another. BSR is geologically instantaneous in most geologic settings. Rates of TSR are much lower, but still geologically significant. TSR may form sour gas reservoirs and/or MVT deposits in several tens of thousands to a few million years in the temperature range of 100–140°C. BSR and TSR may be exothermic or endothermic, depending mainly on the presence or absence of specific organic reactants. However, if the reactions are exothermic, the amount of heat liberated is very small, and this heat usually dissipates quickly. Hence, heat anomalies found in association with TSR settings are normally not generated by TSR. The main organic reactants for BSR are organic acids and other products of aerobic or fermentative biodegradation. The main organic reactants for TSR are branched and n-alkanes, followed by cyclic and mono-aromatic species, in the gasoline range. Sulfate is derived almost invariably from the dissolution of gypsum and/or anhydrite, which may be primary or secondary deposits at or near the redox-reaction site(s). The products of BSR and TSR are similar, but their relative amounts vary widely and are determined by a number of locally variable factors, including availability of reactants, formation water chemistry, and wettability. The primary inorganic reaction products in both thermal regimes are H2S (HS-) and HCO3- (CO2). The presence of alkali earth metals often results in the formation of carbonates, particularly calcite and dolomite. Other carbonates, i.e. ankerite, siderite, witherite, strontianite, may form if the respective metal cations are available. Iron sulfides, galena, and sphalerite form as by-products of hydrogen sulfide generation, if the respective transition or base metals are present or transported to a BSR/TSR reaction site. Elemental sulfur may accumulate as a volumetrically significant...

Journal ArticleDOI
TL;DR: A model for self-regulated, reiterative patterning of all vein orders is derived and a common epidermal auxin-focusing mechanism for major-vein positioning and phyllotactic patterning is postulated.
Abstract: The formation of the leaf vascular pattern has fascinated biologists for centuries. In the early leaf primordium, complex networks of procambial cells emerge from homogeneous subepidermal tissue. The molecular nature of the underlying positional information is unknown, but various lines of evidence implicate gradually restricted transport routes of the plant hormone auxin in defining sites of procambium formation. Here we show that a crucial member of the AtPIN family of auxin-efflux-associated proteins, AtPIN1, is expressed prior to pre-procambial and procambial cell fate markers in domains that become restricted toward sites of procambium formation. Subcellular AtPIN1 polarity indicates that auxin is directed to distinct “convergence points” in the epidermis, from where it defines the positions of major veins. Integrated polarities in all emerging veins indicate auxin drainage toward pre-existing veins, but veins display divergent polarities as they become connected at both ends. Auxin application and transport inhibition reveal that convergence point positioning and AtPIN1 expression domain dynamics are self-organizing, auxin-transport-dependent processes. We derive a model for self-regulated, reiterative patterning of all vein orders and postulate at its onset a common epidermal auxin-focusing mechanism for major-vein positioning and phyllotactic patterning.

Proceedings ArticleDOI
25 Jun 2006
TL;DR: This work learns mappings from features to cost so that an optimal policy in an MDP with these costs mimics the expert's behavior, and demonstrates a simple, provably efficient approach to structured maximum margin learning, based on the subgradient method, that leverages existing fast algorithms for inference.
Abstract: Imitation learning of sequential, goal-directed behavior by standard supervised techniques is often difficult. We frame learning such behaviors as a maximum margin structured prediction problem over a space of policies. In this approach, we learn mappings from features to cost so that an optimal policy in an MDP with these costs mimics the expert's behavior. Further, we demonstrate a simple, provably efficient approach to structured maximum margin learning, based on the subgradient method, that leverages existing fast algorithms for inference. Although the technique is general, it is particularly relevant in problems where A* and dynamic programming approaches make learning policies tractable in problems beyond the limitations of a QP formulation. We demonstrate our approach applied to route planning for outdoor mobile robots, where the behavior a designer wishes a planner to execute is often clear, while specifying cost functions that engender this behavior is a much more difficult task.
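As a toy illustration of the subgradient idea described above (a sketch written for this summary, not the authors' implementation or their robot-navigation data), the example below learns one cost weight per terrain type on a small grid so that the cheapest path under the learned cost map reproduces a hand-specified "expert" path. The grid, features, expert demonstration, and hyperparameters are all invented for the example.

```python
import heapq
import numpy as np

# Toy maximum-margin-planning sketch: learn per-terrain costs w so that the
# cheapest path in a small grid "MDP" mimics an expert demonstration.
H, W = 5, 5
terrain = np.array([[0, 1, 1, 2, 2],      # 0 = road, 1 = grass, 2 = rocks
                    [0, 1, 2, 2, 2],
                    [0, 0, 0, 1, 1],
                    [2, 2, 0, 0, 1],
                    [2, 2, 1, 0, 0]])
expert = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2),
          (3, 2), (3, 3), (4, 3), (4, 4)]  # demonstration: stays on the road

def feat(cell):
    """One-hot terrain feature vector for a grid cell."""
    f = np.zeros(3)
    f[terrain[cell]] = 1.0
    return f

def cheapest_path(costs, start=(0, 0), goal=(4, 4)):
    """Dijkstra on the 4-connected grid; entering a cell costs costs[cell]."""
    dist, prev = {start: costs[start]}, {}
    pq = [(costs[start], start)]
    while pq:
        d, s = heapq.heappop(pq)
        if s == goal:
            break
        if d > dist[s]:
            continue
        r, c = s
        for n in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= n[0] < H and 0 <= n[1] < W and d + costs[n] < dist.get(n, np.inf):
                dist[n], prev[n] = d + costs[n], s
                heapq.heappush(pq, (dist[n], n))
    path, s = [goal], goal
    while s != start:
        s = prev[s]
        path.append(s)
    return path[::-1]

def path_feats(path):
    return sum(feat(s) for s in path)

def cost_map(w):
    return np.array([[max(w @ feat((r, c)), 1e-3) for c in range(W)] for r in range(H)])

# Per-cell loss: 1 for cells the expert never visits, 0 on the expert path.
loss = np.array([[0.0 if (r, c) in expert else 1.0 for c in range(W)] for r in range(H)])

w, eta, lam = np.ones(3), 0.1, 0.01
for _ in range(200):
    # Loss-augmented planning: find the path that is both cheap and unlike the expert.
    aug = cheapest_path(np.maximum(cost_map(w) - loss, 1e-3))
    g = lam * w + path_feats(expert) - path_feats(aug)   # subgradient of the hinge objective
    w = np.maximum(w - eta * g, 0.0)                     # keep costs non-negative

print("learned costs (road, grass, rocks):", np.round(w, 2))
print("expert path:             ", expert)
print("path under learned costs:", cheapest_path(cost_map(w)))
```

Loss augmentation (subtracting the per-cell loss from the cost before planning) is what turns the ordinary shortest-path search into the margin-enforcing inference step; the planner itself can be any fast combinatorial solver, which is the point made in the abstract.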

Journal ArticleDOI
TL;DR: A simulation approach was used to clarify the application of random effects under three common situations for telemetry studies; it was found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection.
Abstract: 1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.
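The simulation logic described in the abstract is easy to mimic. The sketch below (invented numbers, not the grizzly bear data) gives each animal its own selection coefficient and an unequal number of locations, then contrasts a naive pooled logistic regression with the per-animal estimates; a random-intercept/random-coefficient model is what would formally reconcile the two.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative simulation: individual-specific selection with unbalanced sampling.
rng = np.random.default_rng(1)
n_animals = 10
n_locs = rng.integers(80, 400, size=n_animals)   # unbalanced sampling effort
betas = rng.normal(1.0, 0.5, size=n_animals)     # individual selection coefficients

xs, ys, per_animal = [], [], []
for i in range(n_animals):
    x = rng.normal(size=n_locs[i])                # habitat covariate at each location
    p = 1 / (1 + np.exp(-(-0.5 + betas[i] * x)))  # individual-specific use probability
    y = rng.binomial(1, p)                        # 1 = used, 0 = available
    per_animal.append(sm.Logit(y, sm.add_constant(x)).fit(disp=0).params[1])
    xs.append(x)
    ys.append(y)

pooled = sm.Logit(np.concatenate(ys), sm.add_constant(np.concatenate(xs))).fit(disp=0)
print("true mean coefficient:       ", round(float(betas.mean()), 2))
print("mean of per-animal estimates:", round(float(np.mean(per_animal)), 2))
print("naive pooled estimate:       ", round(float(pooled.params[1]), 2))
# A mixed model with a random intercept and random coefficient per animal
# (e.g. glmer in R's lme4) would estimate the population mean and the
# between-animal variance directly instead of averaging separate fits.
```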

Journal ArticleDOI
TL;DR: In this article, the authors link the domains of corporate governance, investment policies, competitive asymmetries, and sustainable capabilities, and propose a framework to link these domains to the domain of sustainable capabilities.
Abstract: This article seeks to link the domains of corporate governance, investment policies, competitive asymmetries, and sustainable capabilities. Conditions such as concentrated ownership, lengthy tenure...

Journal ArticleDOI
TL;DR: The three-dimensional structures of two class IIa immunity proteins have been determined, and it has been shown that the C-terminal halves of these cytosolic four-helix bundle proteins specify whichclass IIa bacteriocin they protect against.
Abstract: Many bacteria produce antimicrobial peptides, which are also referred to as peptide bacteriocins. The class IIa bacteriocins, often designated pediocin-like bacteriocins, constitute the most dominant group of antimicrobial peptides produced by lactic acid bacteria. The bacteriocins that belong to this class are structurally related and kill target cells by membrane permeabilization. Despite their structural similarity, class IIa bacteriocins display different target cell specificities. In the search for new antibiotic substances, the class IIa bacteriocins have been identified as promising new candidates and have thus received much attention. They kill some pathogenic bacteria (e.g., Listeria) with high efficiency, and they constitute a good model system for structure-function analyses of antimicrobial peptides in general. This review focuses on class IIa bacteriocins, especially on their structure, function, mode of action, biosynthesis, bacteriocin immunity, and current food applications. The genetics and biosynthesis of class IIa bacteriocins are well understood. The bacteriocins are ribosomally synthesized with an N-terminal leader sequence, which is cleaved off upon secretion. After externalization, the class IIa bacteriocins attach to potential target cells and, through electrostatic and hydrophobic interactions, subsequently permeabilize the cell membrane of sensitive cells. Recent observations suggest that a chiral interaction and possibly the presence of a mannose permease protein on the target cell surface are required for a bacteria to be sensitive to class IIa bacteriocins. There is also substantial evidence that the C-terminal half penetrates into the target cell membrane, and it plays an important role in determining the target cell specificity of these bacteriocins. Immunity proteins protect the bacteriocin producer from the bacteriocin it secretes. The three-dimensional structures of two class IIa immunity proteins have been determined, and it has been shown that the C-terminal halves of these cytosolic four-helix bundle proteins specify which class IIa bacteriocin they protect against.

Journal ArticleDOI
TL;DR: A new validation method is presented for evaluating the predictive performance of an RSF and for assessing whether the model deviates from being proportional to the probability of use of a resource unit.
Abstract: Applications of logistic regression in a used–unused design in wildlife habitat studies often suffer from asymmetry of errors: used resource units (landscape locations) are known with certainty, whereas unused resource units might be observed to be used with greater sampling intensity. More appropriate might be to use logistic regression to estimate a resource selection function (RSF) tied to a use–availability design based on independent samples drawn from used and available resource units. We review the theoretical motivation for RSFs and show that sample “contamination” and the exponential form commonly assumed for the RSF are not concerns, contrary to recent statements by Keating and Cherry (2004; Use and interpretation of logistic regression in habitat-selection studies. Journal of Wildlife Management 68:774–789). To do this, we re-derive the use–availability likelihood and show that it can be maximized by logistic regression software. We then consider 2 case studies that illustrate our find...
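The use-availability design discussed above is simple to demonstrate in code. The sketch below uses simulated data (not the paper's case studies): "available" points are drawn from the landscape, "used" points are drawn with probability proportional to an exponential RSF, and ordinary logistic regression on used (1) versus available (0) points recovers the RSF slope; the intercept only absorbs the sampling ratio and is not interpreted.

```python
import numpy as np
import statsmodels.api as sm

# Simulated use-availability data for an exponential RSF w(x) = exp(beta * x).
rng = np.random.default_rng(0)
beta_true = 1.5
landscape_x = rng.normal(size=200_000)                     # covariate over the landscape

w = np.exp(beta_true * landscape_x)
used = rng.choice(landscape_x, size=2_000, p=w / w.sum())  # use proportional to w(x)
avail = rng.choice(landscape_x, size=10_000)               # independent availability sample

y = np.concatenate([np.ones(len(used)), np.zeros(len(avail))])
x = np.concatenate([used, avail])
fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)          # standard logistic regression
print("true beta:", beta_true, " estimated beta:", round(float(fit.params[1]), 2))
```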

Journal ArticleDOI
TL;DR: The authors briefly outline the history and development of the methodology of narrative inquiry and draw attention to the need for careful delineation of terms and assumptions; issues of social significance, purpose and ethics are also outlined.
Abstract: The paper briefly outlines the history and development of the methodology of narrative inquiry. It draws attention to the need for careful delineation of terms and assumptions. A Deweyan view of experience is central to narrative inquiry methodology and is used to frame a metaphorical three-dimensional narrative inquiry space. An illustration from a recent narrative inquiry into curriculum making is used to show what narrative inquirers do. Issues of social significance, purpose and ethics are also outlined.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that the institutions and sites where regulation takes place affect both the outcome of the regulatory process and the legitimacy of the rules and practices produced, and that changes in regulatory processes affect opportunities for democratic control and legitimacy.
Abstract: This review paper argues that the institutions and sites of professionalization projects and regulatory processes matter. The institutions and locations where regulation takes place affect both the outcome of the regulatory process and the legitimacy of the rules and practices produced. Changes in regulatory processes affect opportunities for democratic control and legitimacy. A common position in the accounting literature is to examine both the process of professionalization and accounting and audit regulation within and around professional associations and related organizations, such as standard setting bodies and regulatory agencies. We argue that professional firms are increasingly important in professionalization and regulatory processes and have not received the attention that they warrant: an examination of the multi-national professional service firms (currently known as the Big 4) can enhance an understanding of professionalization and professional regulation. We suggest that these are important sites where accounting practices are themselves standardized and regulated, where accounting rules and standards are translated into practice, where professional identities are mediated, formed and transformed, and where important conceptions of personal, professional and corporate governance and management are transmitted.