
Showing papers by "University of Rennes" published in 2009


Journal ArticleDOI
TL;DR: This review puts the current knowledge of marine picocyanobacterial genomics into an environmental context and presents previously unpublished genomic information arising from extensive genomic comparisons in order to provide insights into the adaptations of these marine microbes to their environment and how they are reflected at the genomic level.
Abstract: Marine picocyanobacteria of the genera Prochlorococcus and Synechococcus numerically dominate the picophytoplankton of the world ocean, making a key contribution to global primary production. Prochlorococcus was isolated around 20 years ago and is probably the most abundant photosynthetic organism on Earth. The genus comprises specific ecotypes which are phylogenetically distinct and differ markedly in their photophysiology, allowing growth over a broad range of light and nutrient conditions within the 45 degrees N to 40 degrees S latitudinal belt that they occupy. Synechococcus and Prochlorococcus are closely related, together forming a discrete picophytoplankton clade, but are distinguishable by their possession of dissimilar light-harvesting apparatuses and differences in cell size and elemental composition. Synechococcus strains have a more ubiquitous oceanic distribution than Prochlorococcus strains and are characterized by phylogenetically discrete lineages with a wide range of pigmentation. In this review, we put our current knowledge of marine picocyanobacterial genomics into an environmental context and present previously unpublished genomic information arising from extensive genomic comparisons in order to provide insights into the adaptations of these marine microbes to their environment and how they are reflected at the genomic level.

623 citations


Journal ArticleDOI
TL;DR: Results on real images demonstrate that the proposed adaptation of the nonlocal (NL)-means filter for speckle reduction in ultrasound (US) images accurately preserves edges and structural details of the image.
Abstract: In image processing, restoration is expected to improve both the qualitative inspection of the image and the performance of quantitative image analysis techniques. In this paper, an adaptation of the nonlocal (NL)-means filter is proposed for speckle reduction in ultrasound (US) images. Because the NL-means filter was originally developed for additive white Gaussian noise, we propose a Bayesian framework to derive an NL-means filter adapted to a relevant ultrasound noise model. Quantitative results on synthetic data show the performance of the proposed method compared to well-established and state-of-the-art methods. Results on real images demonstrate that the proposed method accurately preserves edges and structural details of the image.

547 citations
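
To make the filtering idea concrete, here is a minimal sketch of the classical NL-means scheme the paper builds on: each pixel is replaced by a weighted average of pixels whose surrounding patches look similar. The Bayesian, speckle-adapted weighting that is the paper's actual contribution is not reproduced; the patch size, search window, and smoothing parameter h are illustrative choices.

```python
# Plain non-local means on a 2D float image (classical scheme, not the
# paper's Bayesian ultrasound variant).
import numpy as np

def nl_means(img, patch=3, search=7, h=0.1):
    half_p, half_s = patch // 2, search // 2
    padded = np.pad(img, half_p + half_s, mode="reflect")
    out = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + half_p + half_s, j + half_p + half_s   # centre in padded image
            ref = padded[ci - half_p:ci + half_p + 1, cj - half_p:cj + half_p + 1]
            weights, values = [], []
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - half_p:ni + half_p + 1, nj - half_p:nj + half_p + 1]
                    d2 = np.mean((ref - cand) ** 2)             # patch similarity
                    weights.append(np.exp(-d2 / h ** 2))
                    values.append(padded[ni, nj])
            w = np.asarray(weights)
            out[i, j] = np.dot(w, values) / w.sum()
    return out

# Toy usage: denoise a noisy gradient image.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 1, 32), (32, 1))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
print(nl_means(noisy).shape)  # (32, 32)
```

The four nested loops make this O(pixels × search² × patch²); practical implementations vectorize or use blockwise variants.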


Journal ArticleDOI
10 Jun 2009-JAMA
TL;DR: The benefits and risks of corticosteroid treatment in severe sepsis and septic shock and the influence of dose and duration are examined; analysis of the prolonged low-dose subgroup suggests a beneficial drug effect on short-term mortality.
Abstract: Context The benefit of corticosteroids in severe sepsis and septic shock remains controversial. Objective We examined the benefits and risks of corticosteroid treatment in severe sepsis and septic shock and the influence of dose and duration. Data Sources We searched the CENTRAL, MEDLINE, EMBASE, and LILACS (through March 2009) databases as well as reference lists of articles and proceedings of major meetings, and we contacted trial authors. Study Selection Randomized and quasi-randomized trials of corticosteroids vs placebo or supportive treatment in adult patients with severe sepsis/septic shock per the American College of Chest Physicians/Society of Critical Care Medicine consensus definition were included. Data Extraction All reviewers agreed on trial eligibility. One reviewer extracted data, which were checked by the other reviewers and by the trials' authors whenever possible. Some unpublished data were obtained from the trials' authors. The primary outcome for this review was 28-day mortality. Results We identified 17 randomized trials (n = 2138) and 3 quasi-randomized trials (n = 246) that had acceptable methodological quality to pool in a meta-analysis. Twenty-eight-day mortality for treated vs control patients was 388/1099 (35.3%) vs 400/1039 (38.5%) in randomized trials (risk ratio [RR], 0.84; 95% confidence interval [CI], 0.71-1.00; P = .05; I² = 53% by random-effects model) and 28/121 (23.1%) vs 24/125 (19.2%) in quasi-randomized trials (RR, 1.05; 95% CI, 0.69-1.58; P = .83). In 12 trials investigating prolonged low-dose corticosteroid treatment, 28-day mortality for treated vs control patients was 236/629 (37.5%) vs 264/599 (44%) (RR, 0.84; 95% CI, 0.72-0.97; P = .02). This treatment increased 28-day shock reversal (6 trials; 322/481 [66.9%] vs 276/471 [58.6%]; RR, 1.12; 95% CI, 1.02-1.23; P = .02; I² = 4%) and reduced intensive care unit length of stay by 4.49 days (8 trials; 95% CI, –7.04 to –1.94; P < .001; I² = 0%) without increasing the risk of gastroduodenal bleeding (13 trials; 65/800 [8.1%] vs 56/764 [7.3%]; P = .50; I² = 0%), superinfection (14 trials; 184/998 [18.4%] vs 170/950 [17.9%]; P = .92; I² = 8%), or neuromuscular weakness (3 trials; 4/407 [1%] vs 7/404 [1.7%]; P = .58; I² = 30%). Corticosteroids increased the risk of hyperglycemia (9 trials; 363/703 [51.6%] vs 308/670 [46%]; P < .001; I² = 0%) and hypernatremia (3 trials; 127/404 [31.4%] vs 77/401 [19.2%]; P < .001; I² = 0%). Conclusions Corticosteroid therapy has been used in varied doses for sepsis and related syndromes for more than 50 years, with no clear benefit on mortality. Since 1998, studies have consistently used prolonged low-dose corticosteroid therapy, and analysis of this subgroup suggests a beneficial drug effect on short-term mortality.

536 citations
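
For readers unfamiliar with the statistics quoted above, the sketch below shows how a risk ratio and its 95% CI are computed from raw event counts, applied to the crude totals of the prolonged low-dose subgroup (236/629 treated vs 264/599 control deaths). Note that this naive aggregate is not the paper's method: the authors pool per-trial ratios with a random-effects model, which yields the reported RR 0.84 (95% CI, 0.72-0.97).

```python
# Risk ratio with a log-scale Wald 95% CI from a single 2x2 table.
import math

def risk_ratio(e1, n1, e0, n0, z=1.96):
    rr = (e1 / n1) / (e0 / n0)
    se = math.sqrt(1/e1 - 1/n1 + 1/e0 - 1/n0)   # SE of log(RR)
    lo, hi = (rr * math.exp(s * z * se) for s in (-1, 1))
    return rr, lo, hi

print("crude RR %.2f (95%% CI %.2f-%.2f)" % risk_ratio(236, 629, 264, 599))
# crude RR 0.85 (95% CI 0.74-0.97) -- close to, but not the same as, the pooled estimate
```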



Journal ArticleDOI
TL;DR: The R package flowCore provides a set of flexible open source computational tools that constitute a shared and extensible research platform, enabling collaboration between bioinformaticians, computer scientists, statisticians, biologists, and clinicians and fostering the development of novel analytic methods for flow cytometry.
Abstract: Background: Recent advances in automation technologies have enabled the use of flow cytometry for high throughput screening, generating large complex data sets often in clinical trials or drug discovery settings. However, data management and data analysis methods have not advanced sufficiently far from the initial small-scale studies to support modeling in the presence of multiple covariates. Results: We developed a set of flexible open source computational tools in the R package flowCore to facilitate the analysis of these complex data. A key component is a set of data structures that supports the application of similar operations to a collection of samples or a clinical cohort. In addition, our software constitutes a shared and extensible research platform that enables collaboration between bioinformaticians, computer scientists, statisticians, biologists and clinicians. This platform will foster the development of novel analytic methods for flow cytometry. Conclusion: The software has been applied in the analysis of various data sets and its data structures have proven to be highly efficient in capturing and organizing the analytic work flow. Finally, a number of additional Bioconductor packages successfully build on the infrastructure provided by flowCore, opening new avenues for flow data analysis.

468 citations
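
The design idea at the heart of the abstract — one container type so that the same operation applies uniformly to every sample in a cohort — can be sketched as follows. This is an illustrative Python analogue, not flowCore's actual R API (whose core classes are flowFrame and flowSet); the class name, methods, and toy gate are invented.

```python
# A named collection of per-sample event matrices (rows = cells) supporting
# uniform transforms and gates across the whole cohort.
import numpy as np

class SampleSet:
    def __init__(self, samples):          # samples: dict name -> 2D array
        self.samples = samples

    def transform(self, fn):
        """Apply the same channel transformation to every sample."""
        return SampleSet({k: fn(v) for k, v in self.samples.items()})

    def gate(self, predicate):
        """Keep, per sample, only the events satisfying the predicate."""
        return SampleSet({k: v[predicate(v)] for k, v in self.samples.items()})

rng = np.random.default_rng(1)
cohort = SampleSet({f"patient{i}": rng.lognormal(size=(1000, 2)) for i in range(3)})
logged = cohort.transform(np.log10)                    # like a log transform
bright = logged.gate(lambda x: x[:, 0] > 0.0)          # like a 1D gate
print({k: v.shape for k, v in bright.samples.items()})
```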


Journal ArticleDOI
01 Aug 2009
TL;DR: This paper applies Bayesian analysis to decide dependence between sources, designs an algorithm that iteratively detects dependence and discovers truth from conflicting information, and extends the model by considering accuracy of data sources and similarity between values.
Abstract: Many data management applications, such as setting up Web portals, managing enterprise data, managing community data, and sharing scientific data, require integrating data from multiple sources. Each of these sources provides a set of values and different sources can often provide conflicting values. To present quality data to users, it is critical that data integration systems can resolve conflicts and discover true values. Typically, we expect a true value to be provided by more sources than any particular false one, so we can take the value provided by the majority of the sources as the truth. Unfortunately, a false value can be spread through copying and that makes truth discovery extremely tricky. In this paper, we consider how to find true values from conflicting information when there are a large number of sources, among which some may copy from others. We present a novel approach that considers dependence between data sources in truth discovery. Intuitively, if two data sources provide a large number of common values and many of these values are rarely provided by other sources (e.g., particular false values), it is very likely that one copies from the other. We apply Bayesian analysis to decide dependence between sources and design an algorithm that iteratively detects dependence and discovers truth from conflicting information. We also extend our model by considering accuracy of data sources and similarity between values. Our experiments on synthetic data as well as real-world data show that our algorithm can significantly improve accuracy of truth discovery and is scalable when there are a large number of data sources.

439 citations
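
The iterative structure described — alternate between deciding truths from weighted votes and re-estimating source quality — can be sketched as below. The Bayesian copy-detection step that is the paper's main contribution is omitted, and the sources, data items, and priors are invented for illustration.

```python
# Accuracy-weighted truth discovery: (1) vote for values with sources
# weighted by accuracy, (2) re-estimate each source's accuracy from the
# current truths, and repeat.
from collections import defaultdict

claims = {  # source -> {data item -> claimed value}
    "s1": {"capital_fr": "Paris", "capital_de": "Berlin"},
    "s2": {"capital_fr": "Paris", "capital_de": "Bonn"},
    "s3": {"capital_fr": "Lyon",  "capital_de": "Bonn"},  # shares s2's Bonn claim
}

accuracy = {s: 0.8 for s in claims}            # uniform prior accuracy
for _ in range(10):
    truths = {}
    for item in {i for c in claims.values() for i in c}:
        votes = defaultdict(float)
        for s, c in claims.items():
            if item in c:
                votes[c[item]] += accuracy[s]  # weighted vote
        truths[item] = max(votes, key=votes.get)
    for s, c in claims.items():                # re-score each source
        hits = sum(truths[i] == v for i, v in c.items())
        accuracy[s] = (hits + 1) / (len(c) + 2)  # smoothed accuracy
print(truths, {s: round(a, 2) for s, a in accuracy.items()})
```

Without copy detection, a value duplicated by copiers still gains votes; the paper's point is precisely that detected copies should be discounted.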


Journal ArticleDOI
TL;DR: In this paper, the authors analyze the factors influencing European perceptions of the statement that "fish caught using an environmentally friendly technique may carry a special label" and show a significant connection between the desire for eco-labeling and seafood features.

381 citations


Journal ArticleDOI
TL;DR: An approach for specifying and executing dynamically adaptive software systems that combines model-driven and aspect-oriented techniques to help engineers tame the complexity of such systems while offering a high degree of automation and validation is presented.
Abstract: Today's society increasingly depends on software systems deployed in large companies, banks, airports, and so on. These systems must be available 24/7 and continuously adapt to varying environmental conditions and requirements. Such dynamically adaptive systems exhibit degrees of variability that depend on user needs and runtime fluctuations in their contexts. The paper presents an approach for specifying and executing dynamically adaptive software systems that combines model-driven and aspect-oriented techniques to help engineers tame the complexity of such systems while offering a high degree of automation and validation.

375 citations


Journal ArticleDOI
TL;DR: Effects on the rate of conversion to CDMS and the favourable long-term safety and tolerability profile support early initiation of treatment with interferon beta-1b, although a delay in treatment by up to 2 years did not affect long-term disability outcomes.
Abstract: BACKGROUND: The Betaferon/Betaseron in newly emerging multiple sclerosis for initial treatment (BENEFIT) trial investigated the effect of treatment with interferon beta-1b after a clinically isolated syndrome. The 5-year active treatment extension compares the effects of early and delayed treatment with interferon beta-1b on time to clinically definite multiple sclerosis (CDMS) and other disease outcomes, including disability progression. METHODS: Patients with a first event suggestive of multiple sclerosis and a minimum of two clinically silent lesions in MRI were randomly assigned to receive interferon beta-1b 250 microg (n=292; early treatment) or placebo (n=176; delayed treatment) subcutaneously every other day for 2 years, or until diagnosis of CDMS. All patients were then eligible to enter a prospectively planned follow-up phase with open-label interferon beta-1b up to a maximum of 5 years after randomisation. Patients and study personnel remained unaware of initial treatment allocation throughout the study. Primary endpoints were time to CDMS, time to confirmed disability progression measured with the expanded disability status scale, and the functional assessment of multiple sclerosis trial outcomes index (FAMS-TOI) at 5 years. Analysis of the primary endpoints was by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00185211. FINDINGS: 235 (80%) patients from the early treatment and 123 (70%) from the delayed treatment group completed the 5-year study. Early treatment reduced the risk of CDMS by 37% (hazard ratio [HR] 0.63, 95% CI 0.48-0.83; p=0.003) compared with delayed treatment. The risk for confirmed disability progression was not significantly lower in the early treatment group (0.76, 0.52-1.11; p=0.177). At 5 years, median FAMS-TOI scores were 125 in both groups. No significant differences in other disability related outcomes were recorded. Frequency and severity of adverse events remained within the established safety and tolerability profile of interferon beta-1b. INTERPRETATION: Effects on the rate of conversion to CDMS and the favourable long-term safety and tolerability profile support early initiation of treatment with interferon beta-1b, although a delay in treatment by up to 2 years did not affect long-term disability outcomes. FUNDING: Bayer Schering Pharma.

356 citations


Journal ArticleDOI
TL;DR: In this critical review, examples of coordination complexes with efficient chiral transfer determining stereochemistry at the metal centre and throughout the overall molecular assembly are presented.
Abstract: In this critical review we present examples of coordination complexes with efficient chiral transfer determining stereochemistry at the metal centre and throughout the overall molecular assembly. The general features controlling the transmission of chirality are presented. The transfer of chirality is considered here with the special purpose of obtaining a molecular material displaying a particular property or function. Coordination complexes in fields as diverse as chiral luminescent materials, homochiral MOFs, chiral liquid crystals, enantioselective sensors, chiroptical switches, and magnetochiral compounds are presented (162 references).

350 citations


Journal ArticleDOI
TL;DR: The goal is not to replace software user guides, but to provide key concepts, principles, and procedures to be applied during geomodeling tasks, with a specific focus on quality control.
Abstract: Building a 3D geological model from field and subsurface data is a typical task in geological studies involving natural resource evaluation and hazard assessment. However, there is quite often a gap between research papers presenting case studies or specific innovations in 3D modeling and the objectives of a typical class in 3D structural modeling, as such classes are increasingly offered at universities. In this paper, we present general procedures and guidelines to effectively build a structural model made of faults and horizons from typical sparse data. Then we describe a typical 3D structural modeling workflow based on triangulated surfaces. Our goal is not to replace software user guides, but to provide key concepts, principles, and procedures to be applied during geomodeling tasks, with a specific focus on quality control.
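
As a concrete anchor for the surface-based workflow mentioned above, here is a minimal triangulated-surface representation: a vertex array plus a triangle index array, with per-triangle unit normals (useful, for instance, when checking horizon orientation during quality control). It is tied to no particular geomodeling package, and the tiny patch is invented.

```python
# A two-triangle "horizon" patch stored as vertices + triangle indices.
import numpy as np

vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.1],
                     [1.0, 1.0, 0.2],
                     [0.0, 1.0, 0.1]])
triangles = np.array([[0, 1, 2], [0, 2, 3]])

def triangle_normals(v, t):
    """Unit normal of each triangle, from the cross product of two edges."""
    e1 = v[t[:, 1]] - v[t[:, 0]]
    e2 = v[t[:, 2]] - v[t[:, 0]]
    n = np.cross(e1, e2)
    return n / np.linalg.norm(n, axis=1, keepdims=True)

print(triangle_normals(vertices, triangles))
```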

Journal ArticleDOI
02 Oct 2009-Science
TL;DR: Genome-wide association studies of more than 1000 dogs from 80 domestic breeds identified distinct mutations in three genes, RSPO2, FGF5, and KRT71 (encoding R-spondin-2, fibroblast growth factor-5, and keratin-71, respectively), that together account for most coat phenotypes in purebred dogs in the United States.
Abstract: Coat color and type are essential characteristics of domestic dog breeds. Although the genetic basis of coat color has been well characterized, relatively little is known about the genes influencing coat growth pattern, length, and curl. We performed genome-wide association studies of more than 1000 dogs from 80 domestic breeds to identify genes associated with canine fur phenotypes. Taking advantage of both inter- and intrabreed variability, we identified distinct mutations in three genes, RSPO2, FGF5, and KRT71 (encoding R-spondin-2, fibroblast growth factor-5, and keratin-71, respectively), that together account for most coat phenotypes in purebred dogs in the United States. Thus, an array of varied and seemingly complex phenotypes can be reduced to the combinatorial effects of only a few genes.
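
The single-marker test underlying such a genome-wide scan can be as simple as a chi-square test on a genotype-by-phenotype contingency table, as sketched below. The counts are invented for illustration and do not come from the study.

```python
# Chi-square association test between carrying a variant and a coat phenotype.
from scipy.stats import chi2_contingency

#                 curly  straight   (dogs per coat phenotype)
table = [[40, 5],   # hypothetical KRT71 variant carriers
         [8, 47]]   # non-carriers
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p={p:.2e}")  # a small p suggests association
```

In a real scan this test is repeated across hundreds of thousands of markers, with multiple-testing correction and population-structure controls.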

Journal ArticleDOI
15 Dec 2009-Geoderma
TL;DR: In this paper, controlled incubations of a wetland soil were performed under oxic and anoxic conditions to investigate the extent to which the following processes account for this phenomenon: i) production of organic metabolites by microbes during soil reduction, ii) release of organic matter (OM) from Mn- and Fe-oxyhydroxides that undergo reductive dissolution, and iii) desorption of OM from soil minerals due to pH changes.

Journal ArticleDOI
TL;DR: It is demonstrated that cross regulation between the gap genes causes their expression to approach dynamical attractors, reducing initial variation and providing a robust output, and more generally it is shown that the complex multigenic phenomenon of canalization can be understood at a quantitative and predictive level by the application of a precise dynamical model.
Abstract: Developing embryos exhibit a robust capability to reduce phenotypic variations that occur naturally or as a result of experimental manipulation. This reduction in variation occurs by an epigenetic mechanism called canalization, a phenomenon which has resisted understanding because of a lack of necessary molecular data and of appropriate gene regulation models. In recent years, quantitative gene expression data have become available for the segment determination process in the Drosophila blastoderm, revealing a specific instance of canalization. These data show that the variation of the zygotic segmentation gene expression patterns is markedly reduced compared to earlier levels by the time gastrulation begins, and this variation is significantly lower than the variation of the maternal protein gradient Bicoid. We used a predictive dynamical model of gene regulation to study the effect of Bicoid variation on the downstream gap genes. The model correctly predicts the reduced variation of the gap gene expression patterns and allows the characterization of the canalizing mechanism. We show that the canalization is the result of specific regulatory interactions among the zygotic gap genes. We demonstrate the validity of this explanation by showing that variation is increased in embryos mutant for two gap genes, Krüppel and knirps, disproving competing proposals that canalization is due to an undiscovered morphogen, or that it does not take place at all. In an accompanying article in PLoS Computational Biology (doi:10.1371/journal.pcbi.1000303), we show that cross regulation between the gap genes causes their expression to approach dynamical attractors, reducing initial variation and providing a robust output. These results demonstrate that the Bicoid gradient is not sufficient to produce gap gene borders having the low variance observed, and instead this low variance is generated by gap gene cross regulation. More generally, we show that the complex multigenic phenomenon of canalization can be understood at a quantitative and predictive level by the application of a precise dynamical model.
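
A toy model helps illustrate the attractor mechanism the abstract invokes (it is not the paper's fitted gap-gene circuit): in a two-gene mutual-repression system, trajectories started from scattered initial conditions relax onto the same attractor, so the spread across simulated "embryos" collapses. All rates and the Hill exponent are invented.

```python
# Mutual repression with a biased input; trajectories from varied initial
# conditions converge to one attractor, shrinking across-run variation.
import numpy as np

def relax(a0, b0, bias=1.2, dt=0.05, steps=400):
    a, b = a0, b0
    traj = []
    for _ in range(steps):                 # Euler integration of the ODEs
        da = bias / (1 + b**4) - a         # gene a: repressed by b
        db = 1.0 / (1 + a**4) - b          # gene b: repressed by a
        a, b = a + dt * da, b + dt * db
        traj.append(a)
    return np.array(traj)

rng = np.random.default_rng(3)
runs = np.array([relax(*rng.uniform(0.4, 1.2, 2)) for _ in range(30)])
print("sd of gene a across runs: start %.3f -> end %.6f"
      % (runs[:, 0].std(), runs[:, -1].std()))   # spread collapses
```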

Journal ArticleDOI
TL;DR: Novel hypotheses are suggested on why certain specialists may not be particularly at risk and consequently why certain generalists deserve no less attention from conservationists than specialists.
Abstract: The question 'what renders a species extinction prone' is crucial to biologists. Ecological specialization has been suggested as a major constraint impeding the response of species to environmental changes. Most neoecological studies indicate that specialists suffer declines under recent environmental changes. This was confirmed by many paleoecological studies investigating longer-term survival. However, phylogeneticists, studying the entire histories of lineages, showed that specialists are not trapped in evolutionary dead ends and could even give rise to generalists. Conclusions from these approaches diverge possibly because (i) of approach-specific biases, such as lack of standardization for sampling efforts (neoecology), lack of direct observations of specialization (paleoecology), or binary coding and prevalence of specialists (phylogenetics); (ii) neoecologists focus on habitat specialization; (iii) neoecologists focus on extinction of populations, phylogeneticists on persistence of entire clades through periods of varying extinction and speciation rates; (iv) many phylogeneticists study species in which specialization may result from a lack of constraints. We recommend integrating the three approaches by studying common datasets, and accounting for range-size variation among species, and we suggest novel hypotheses on why certain specialists may not be particularly at risk and consequently why certain generalists deserve no less attention from conservationists than specialists.

Journal ArticleDOI
TL;DR: Theoretical analyses, statistical studies, and experimental electron density determinations converge to describe halogen bonding as a relatively weak structure-directing tool, when compared with hydrogen bonding, as mentioned in this paper.
Abstract: Halogen bonding (XB), as a directional interaction between covalently bound halogen atoms (XB donor) and Lewis bases (A, XB acceptor), has been recently intensively investigated as a powerful tool in crystal engineering. After a short review on the origin and general features of halogen bonding, current developments towards (i) the elaboration of three-dimensional networks, (ii) the interaction with anionic XB acceptors, (iii) its identification in biological systems and (iv) the formation of liquid crystal phases will be described. Theoretical analyses, statistical studies and experimental electron density determinations converge to describe halogen bonding as a relatively weak structure-directing tool, when compared with hydrogen bonding. However, only when the halogen atom is strongly activated, as in iodoperfluorinated molecules or cationic aromatic systems, can halogen bonding act as an efficient and reliable structure-directing tool.

Journal ArticleDOI
TL;DR: This work revealed an unsuspected diversity of species in three of the five fungal phyla, including a new branch of Chytridiomycota forming an ancient evolutionary lineage, and opens the way to new studies of the diversity, ecology, and physiology of fungi in oceans.
Abstract: Deep-sea hydrothermal ecosystems are considered oases of life in oceans. Since the discovery of these ecosystems in the late 1970s, many endemic species of Bacteria, Archaea, and other organisms, such as annelids and crabs, have been described. Considerable knowledge has been acquired about the diversity of (micro)organisms in these ecosystems, but the diversity of fungi has not been studied to date. These organisms are considered key organisms in terrestrial ecosystems because of their ecological functions and especially their ability to degrade organic matter. The lack of knowledge about them in the sea reflects the widely held belief that fungi are terrestrial organisms. The first inventory of such organisms in deep-sea hydrothermal environments was obtained in this study. Fungal diversity was investigated by analyzing the small-subunit rRNA gene sequences amplified by culture-independent PCR using DNA extracts from hydrothermal samples and from a culture collection that was established. Our work revealed an unsuspected diversity of species in three of the five fungal phyla. We found a new branch of Chytridiomycota forming an ancient evolutionary lineage. Many of the species identified are unknown, even at higher taxonomic levels in the Chytridiomycota, Ascomycota, and Basidiomycota. This work opens the way to new studies of the diversity, ecology, and physiology of fungi in oceans and might stimulate new prospecting for biomolecules. From an evolutionary point of view, the diversification of fungi in the oceans can no longer be ignored.

Journal ArticleDOI
01 Apr 2009-Cancer
TL;DR: It is shown that, relative to radical nephrectomy (RN), partial nephrectomy (PN) performed for renal cell carcinoma (RCC) may protect from non-cancer-related deaths; the authors tested this hypothesis in a cohort of PN and RN patients.
Abstract: BACKGROUND: Relative to radical nephrectomy (RN), partial nephrectomy (PN) performed for renal cell carcinoma (RCC) may protect from non-cancer-related deaths. The authors tested this hypothesis in a cohort of PN and RN patients. METHODS: The Surveillance, Epidemiology, and End Results-9 database allowed identification of 2198 PN (22.4%) and 7611 RN (77.6%) patients treated for T1aN0M0 RCC between 1988 and 2004. Analyses matched for age, year of surgery, tumor size, and Fuhrman grade addressed the effect of nephrectomy type (RN vs PN) on overall mortality (Cox regression models) and on non-cancer-related mortality (competing-risks regression models). RESULTS: Relative to PN, RN was associated with a 1.23-fold (P = .001) increased overall mortality rate, which translated into a 4.9% and 3.1% absolute increase in mortality at 5 and 10 years after surgery, respectively. Similarly, the non-cancer-related death rate was significantly higher after RN in competing-risks regression models (P < .001), which translated into a 4.6% and 4.5% absolute increase in non-cancer-related mortality at 5 and 10 years after surgery, respectively. CONCLUSIONS: Relative to PN, RN predisposes to an increase in overall mortality and non-cancer-related death rate in patients with T1a RCC. In consequence, PN should be attempted whenever technically feasible. Selective referrals should be considered if PN expertise is unavailable.

Journal ArticleDOI
TL;DR: In this paper, the authors show that differences between species in adaptations to various dispersal vectors, in combination with changes in the availability of these vectors, contribute significantly to explaining losses in plant diversity in Northwest Europe in the 20th century.
Abstract: The ongoing decline of many plant species in Northwest Europe indicates that traditional conservation measures to improve the habitat quality, although useful, are not enough to halt diversity losses. Using recent databases, we show for the first time that differences between species in adaptations to various dispersal vectors, in combination with changes in the availability of these vectors, contribute significantly to explaining losses in plant diversity in Northwest Europe in the 20th century. Species with water- or fur-assisted dispersal are over-represented among declining species, while others (wind- or bird-assisted dispersal) are under-represented. Our analysis indicates that the 'colonization deficit' due to a degraded dispersal infrastructure is no less important in explaining plant diversity losses than the more commonly accepted effect of eutrophication and associated niche-based processes. Our findings call for measures that aim to restore the dispersal infrastructure across entire regions and that go beyond current conservation practices.

Journal ArticleDOI
01 Aug 2009
TL;DR: The authors develop a Hidden Markov Model that decides whether a source is a copier of another source and identifies the specific moments at which it copies, and a Bayesian model that aggregates information from the sources to decide both the true value for a data item and the evolution of the true values over time.
Abstract: Modern information management applications often require integrating data from a variety of data sources, some of which may copy or buy data from other sources. When these data sources model a dynamically changing world (e.g., people's contact information changes over time, restaurants open and go out of business), sources often provide out-of-date data. Errors can also creep into data when sources are updated often. Given out-of-date and erroneous data provided by different, possibly dependent, sources, it is challenging for data integration systems to provide the true values. Straightforward ways to resolve such inconsistencies (e.g., voting) may lead to noisy results, often with detrimental consequences. In this paper, we study the problem of finding true values and determining the copying relationship between sources, when the update history of the sources is known. We model the quality of sources over time by their coverage, exactness and freshness. Based on these measures, we conduct a probabilistic analysis. First, we develop a Hidden Markov Model that decides whether a source is a copier of another source and identifies the specific moments at which it copies. Second, we develop a Bayesian model that aggregates information from the sources to decide the true value for a data item, and the evolution of the true values over time. Experimental results on both real-world and synthetic data show high accuracy and scalability of our techniques.
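
A stripped-down version of the first model can be phrased as a two-state HMM over a source's update history: the hidden state says whether the source is currently copying, each observation records whether its update matches the other source's value, and Viterbi decoding identifies the copying moments. All probabilities below are invented, and the paper's actual model conditions on much richer evidence (coverage, exactness, freshness).

```python
# Two-state HMM (independent vs copying) decoded with Viterbi in log space.
import numpy as np

states = ["independent", "copying"]
start  = np.log([0.7, 0.3])
trans  = np.log([[0.9, 0.1],    # independent -> independent / copying
                 [0.2, 0.8]])   # copying tends to stay copying
emit   = np.log([[0.6, 0.4],    # P(match | independent) = 0.4
                 [0.1, 0.9]])   # P(match | copying) = 0.9

obs = [0, 1, 1, 1, 0, 1, 1]     # 1 = update matches the other source

def viterbi(obs):
    v = start + emit[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = v[:, None] + trans          # scores[i, j]: from state i to j
        back.append(scores.argmax(axis=0))
        v = scores.max(axis=0) + emit[:, o]
    path = [int(v.argmax())]
    for b in reversed(back):                 # backtrack the best path
        path.append(int(b[path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi(obs))
```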

Journal ArticleDOI
TL;DR: The Central and North Armorican domains (which together constitute the core of the Armorica microplate) are bounded by two composite suture zones, as discussed by the authors, and the collision occurred during a Late Devonian event associated with a second generation of eclogites (Cellier).

Journal ArticleDOI
TL;DR: In this article, the synthesis, properties and applications of metal complexes of tetrathiafulvalene-based group XV (N, P, As, Sb) ligands are discussed.

Journal ArticleDOI
TL;DR: In this new phenomenon, the "chromocapillary effect", an interfacial flow generates droplet motion in the direction opposite to the gradient at a liquid/liquid interface.
Abstract: Liquid droplets can be manipulated in a controlled fashion along trajectories of any desired shape (such as a heart, see picture) by using light to create a wavelength-dependent interfacial tension gradient at a liquid/liquid interface. In this new phenomenon, the "chromocapillary effect", an interfacial flow generates droplet motion in the direction opposite to the gradient.

Proceedings ArticleDOI
01 Aug 2009
TL;DR: A model for solving interactions between virtual humans is presented and the low number of parameters of the proposed model enables its automatic calibration from available experimental data.
Abstract: An interaction occurs between two humans when they walk with converging trajectories. They need to adapt their motion in order to avoid and cross one another at a respectful distance. This paper presents a model for solving interactions between virtual humans. The proposed model is elaborated from experimental interaction data. We first focus our study on the pair-interaction case. In a second stage, we extend our approach to the multiple interactions case. Our experimental data allow us to state the conditions for interactions to occur between walkers, as well as each one's role during interaction and the strategies walkers adopt to adapt their motion. The low number of parameters of the proposed model enables its automatic calibration from available experimental data. We validate our approach by comparing simulated trajectories with real ones. We also provide comparisons with previous solutions. We finally discuss the ability of our model to be extended to complex situations.
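
The prediction-and-adaptation loop such models implement can be sketched crudely (this is not the authors' calibrated model): each walker extrapolates the distance of closest approach under current velocities and, if it falls below a comfort radius, turns and slows slightly until the predicted crossing distance is respectful again. The turn gain, comfort radius, and speeds are invented.

```python
# Two walkers on converging trajectories adapt until the predicted
# closest-approach distance exceeds a comfort radius.
import numpy as np

def closest_approach(p1, v1, p2, v2):
    """Time and distance of minimum separation under constant velocities."""
    dp, dv = p2 - p1, v2 - v1
    t = max(0.0, -np.dot(dp, dv) / (np.dot(dv, dv) + 1e-9))
    return t, np.linalg.norm(dp + dv * t)

def step(p, v, p_other, v_other, radius=1.0, dt=0.1):
    t, d = closest_approach(p, v, p_other, v_other)
    if 0 < t and d < radius:                 # predicted interaction: adapt
        angle = 0.1 * (1 - d / radius)       # turn left (sign fixed for simplicity)
        c, s = np.cos(angle), np.sin(angle)
        v = np.array([c * v[0] - s * v[1], s * v[0] + c * v[1]]) * 0.98
    return p + v * dt, v

p1, v1 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
p2, v2 = np.array([10.0, 0.3]), np.array([-1.0, 0.0])   # converging walkers
for _ in range(100):
    (p1, v1), (p2, v2) = step(p1, v1, p2, v2), step(p2, v2, p1, v1)
print("final separation %.2f m" % np.linalg.norm(p1 - p2))
```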

Journal ArticleDOI
TL;DR: The ring-opening polymerization of a mixture of enantiomerically pure but different monomers using an yttrium complex as initiator proceeds readily at room temperature to give the corresponding highly alternating polyester.
Abstract: The ring-opening polymerization of a mixture of enantiomerically pure but different monomers using an yttrium complex as initiator proceeds readily at room temperature to give the corresponding highly alternating polyester.

Proceedings ArticleDOI
16 May 2009
TL;DR: This paper presents how an AOM approach can be used to tame the combinatorial explosion of DAS modes by deriving a wide range of modes through weaving aspects into an explicit model reflecting the runtime system.
Abstract: Since software systems need to be continuously available under varying conditions, their ability to evolve at runtime is increasingly seen as one key issue. Modern programming frameworks already provide support for dynamic adaptations. However, the high variability of features in Dynamic Adaptive Systems (DAS) introduces an explosion of possible runtime system configurations (often called modes) and mode transitions. Designing these configurations and their transitions is tedious and error-prone, making the system feature evolution difficult. While Aspect-Oriented Modeling (AOM) was introduced to improve the modularity of software, this paper presents how an AOM approach can be used to tame the combinatorial explosion of DAS modes. Using AOM techniques, we derive a wide range of modes by weaving aspects into an explicit model reflecting the runtime system. We use these generated modes to automatically adapt the system. We validate our approach on an adaptive middleware for home automation currently deployed in the Rennes metropolis.
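
The core trick — derive any of the 2^n runtime modes on demand by weaving selected aspects into one base model, rather than hand-writing each mode — can be illustrated as follows. The home-automation components and aspects are invented, and real AOM weavers operate on structured models rather than plain dictionaries.

```python
# Deriving runtime modes by weaving aspects into a base model on demand.
from copy import deepcopy
from itertools import combinations

base_model = {"components": {"heating": "off", "lights": "off", "alarm": "off"}}

# Each aspect is a small model transformation woven into the base model.
aspects = {
    "night": lambda m: m["components"].update(lights="dimmed"),
    "away":  lambda m: m["components"].update(alarm="armed", heating="eco"),
    "cold":  lambda m: m["components"].update(heating="on"),
}

def weave(selected):
    """Derive one runtime mode by weaving the selected aspects, in order."""
    model = deepcopy(base_model)
    for name in selected:
        aspects[name](model)
    return model["components"]

print(weave(["away", "cold"]))   # {'heating': 'on', 'lights': 'off', 'alarm': 'armed'}
# 2^3 = 8 possible modes exist, yet none is written by hand:
print(sum(1 for r in range(4) for _ in combinations(aspects, r)))   # 8
```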

Journal ArticleDOI
TL;DR: The authors proposed a two-regime spatial Durbin model with spatial and time-period fixed effects to test for political yardstick competition and exclude any other explanation that might produce spatial interaction effects among the dependent variable, the independent variables, or the error term.
Abstract: This research proposes a two-regime spatial Durbin model with spatial and time-period fixed effects to test for political yardstick competition and exclude any other explanation that might produce spatial interaction effects among the dependent variable, the independent variables, or the error term. The study also derives the maximum likelihood estimator and variance–covariance matrix of the parameters of this model. Data pertaining to welfare spending by 93 departments in France during 1992–2000 provide significant empirical evidence in support of political yardstick competition. Departments governed by a small political majority mimic neighboring expenditures on welfare to a greater extent than do departments governed by a large political majority.
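
For reference, the standard spatial Durbin model with spatial and time-period fixed effects has the form below, where W = (w_ij) is the spatial weight matrix; the paper's two-regime variant lets the interaction coefficients differ between departments governed by small and large political majorities.

```latex
% Spatial Durbin model with spatial (mu_i) and time-period (lambda_t)
% fixed effects.
\[
  y_{it} \;=\; \rho \sum_{j=1}^{N} w_{ij}\, y_{jt}
        \;+\; x_{it}\beta
        \;+\; \sum_{j=1}^{N} w_{ij}\, x_{jt}\,\theta
        \;+\; \mu_i \;+\; \lambda_t \;+\; \varepsilon_{it}
\]
```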

Journal ArticleDOI
TL;DR: IL‐33 is strongly associated with fibrosis in chronic liver injury and activated HSC are a source of IL‐33, which was increased in cultured HSC when stimulated by pro‐inflammatory cytokines.
Abstract: Interleukin-33 (IL-33), the most recently identified member of the IL-1 family, induces synthesis of T Helper 2 (Th2)-type cytokines via its heterodimeric ST2/IL-1RAcP receptor. Th2-type cytokines play an important role in fibrosis; thus, we investigated the role of IL-33 in liver fibrosis. IL-33, ST2 and IL-1RAcP gene expression was analysed in mouse and human normal (n= 6) and fibrotic livers (n= 28), and in human hepatocellular carcinoma (HCC; n= 22), using real-time PCR. IL-33 protein was detected in normal and fibrotic liver sections and in isolated liver cells using Western blotting and immunolocalization approaches. Our results showed that IL-33 and ST2 mRNA was overproduced in mouse and human fibrotic livers, but not in human HCC. IL-33 expression correlated with ST2 expression and also with collagen expression in fibrotic livers. The major sources of IL-33 in normal liver from both mice and human beings are the liver sinusoidal endothelial cells and, in fibrotic liver, the activated hepatic stellate cells (HSC). Moreover, IL-33 expression was increased in cultured HSC when stimulated by pro-inflammatory cytokines. In conclusion, IL-33 is strongly associated with fibrosis in chronic liver injury and activated HSC are a source of IL-33.

Journal ArticleDOI
TL;DR: In this article, the authors define a new class of orogens, called ultra-hot orogens (UHO), in which the weakest type of lithosphere on Earth is deformed.

Journal ArticleDOI
TL;DR: In this paper, the analysis of a turbulent boundary layer saturated with saltating particles was carried out in a wind tunnel 15 m long and 0.6 m wide at the University of Aarhus in Denmark with sand grains 242 μm in size for wind speeds ranging from the threshold speed to twice its value.
Abstract: The work presented here focuses on the analysis of a turbulent boundary layer saturated with saltating particles. Experiments were carried out in a wind tunnel 15 m long and 0.6 m wide at the University of Aarhus in Denmark with sand grains 242 μm in size for wind speeds ranging from the threshold speed to twice its value. The saltating particles were analysed using particle image velocimetry (PIV) and particle-tracking velocimetry (PTV), and vertical profiles of particle concentration and velocity were extracted. The particle concentration was found to decrease exponentially with the height above the bed, and the characteristic decay height was independent of the wind speed. In contrast with the logarithmic profile of the wind speed, the grain velocity was found to vary linearly with the height. In addition, the measurements indicated that the grain velocity profile depended only slightly on the wind speed. These results are shown to be closely related to the features of the splash function that characterizes the impact of the saltating particles on a sand bed. A numerical simulation is developed that explicitly incorporates low-velocity moments of the splash function in a calculation of the boundary conditions that apply at the bed. The overall features of the experimental measurements are reproduced by simulation.
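
The two empirical profiles reported above can be written compactly as follows, where z is the height above the bed, the decay height z_c was found to be independent of wind speed, and c_0, u_0, and γ are fitted constants.

```latex
% Exponential concentration profile and linear grain-velocity profile.
\[
  c(z) \;=\; c_0\, e^{-z/z_c},
  \qquad
  u_p(z) \;=\; u_0 + \gamma\, z
\]
```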