
Showing papers by "École Normale Supérieure published in 2005"


Journal ArticleDOI
TL;DR: Understanding this complexity, while taking strong steps to minimize current losses of species, is necessary for responsible management of Earth's ecosystems and the diverse biota they contain.
Abstract: Humans are altering the composition of biological communities through a variety of activities that increase rates of species invasions and species extinctions, at all scales, from local to global. These changes in components of the Earth's biodiversity cause concern for ethical and aesthetic reasons, but they also have a strong potential to alter ecosystem properties and the goods and services they provide to humanity. Ecological experiments, observations, and theoretical developments show that ecosystem properties depend greatly on biodiversity in terms of the functional characteristics of organisms present in the ecosystem and the distribution and abundance of those organisms over space and time. Species effects act in concert with the effects of climate, resource availability, and disturbance regimes in influencing ecosystem properties. Human activities can modify all of the above factors; here we focus on modification of these biotic controls. The scientific community has come to a broad consensus on many aspects of the relationship between biodiversity and ecosystem functioning, including many points relevant to management of ecosystems. Further progress will require integration of knowledge about biotic and abiotic controls on ecosystem properties, how ecological communities are structured, and the forces driving species extinctions and invasions. To strengthen links to policy and management, we also need to integrate our ecological knowledge with understanding of the social and economic constraints of potential management practices. Understanding this complexity, while taking strong steps to minimize current losses of species, is necessary for responsible management of Earth's ecosystems and the diverse biota they contain.

6,891 citations


Book
01 Jan 2005
TL;DR: A century after the publication of Max Weber's The Protestant Ethic and the "Spirit" of Capitalism, a major new work examines network-based organization, employee autonomy and post-Fordist horizontal work structures.
Abstract: A century after the publication of Max Weber's The Protestant Ethic and the "Spirit" of Capitalism, a major new work examines network-based organization, employee autonomy and post-Fordist horizontal work structures.

2,892 citations


Journal ArticleDOI
08 Dec 2005-Nature
TL;DR: It is shown, using data from 854 sites across Africa, that maximum woody cover in savannas receiving a mean annual precipitation (MAP) of less than ∼650 mm is constrained by, and increases linearly with, MAP.
Abstract: Savannas are globally important ecosystems of great significance to human economies. In these biomes, which are characterized by the co-dominance of trees and grasses, woody cover is a chief determinant of ecosystem properties [1–3]. The availability of resources (water, nutrients) and disturbance regimes (fire, herbivory) are thought to be important in regulating woody cover [1,2,4,5], but perceptions differ on which of these are the primary drivers of savanna structure. Here we show, using data from 854 sites across Africa, that maximum woody cover in savannas receiving a mean annual precipitation (MAP) of less than ∼650 mm is constrained by, and increases linearly with, MAP. These arid and semi-arid savannas may be considered ‘stable’ systems in which water constrains woody cover and permits grasses to coexist, while fire, herbivory and soil properties interact to reduce woody cover below the MAP-controlled upper bound. Above a MAP of ∼650 mm, savannas are ‘unstable’ systems in which MAP is sufficient for woody canopy closure, and disturbances (fire, herbivory) are required for the coexistence of trees and grass. These results provide insights into the nature of African savannas and suggest that future changes in precipitation [6] may considerably affect their distribution and dynamics.

1,740 citations


Journal ArticleDOI
TL;DR: The authors' simple model predicts correctly the timing of 96% of the spikes of the detailed model in response to injection of noisy synaptic conductances and has enough expressive power to reproduce qualitatively several electrophysiological classes described in vitro.
Abstract: We introduce a two-dimensional integrate-and-fire model that combines an exponential spike mechanism with an adaptation equation, based on recent theoretical findings. We describe a systematic method to estimate its parameters with simple electrophysiological protocols (current-clamp injection of pulses and ramps) and apply it to a detailed conductance-based model of a regular spiking neuron. Our simple model predicts correctly the timing of 96% of the spikes (±2 ms) of the detailed model in response to injection of noisy synaptic conductances. The model is especially reliable in high-conductance states, typical of cortical activity in vivo, in which intrinsic conductances were found to have a reduced role in shaping spike trains. These results are promising because this simple model has enough expressive power to reproduce qualitatively several electrophysiological classes described in vitro.
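The description above (an exponential spike mechanism plus an adaptation equation) maps directly onto a small simulation. The following is a minimal forward-Euler sketch of such an adaptive exponential integrate-and-fire neuron; the parameter values are illustrative assumptions, not the values fitted in the paper.

```python
import numpy as np

def simulate_adex(I, dt=0.1):
    """Minimal forward-Euler sketch of an adaptive exponential integrate-and-fire
    neuron (illustrative parameters, not the fitted values from the paper).
    I: input current in nA per time step; dt in ms. Returns V trace and spike times."""
    # Illustrative parameters (assumptions)
    C, gL, EL = 0.2, 0.01, -70.0          # capacitance (nF), leak (uS), rest (mV)
    VT, DeltaT = -50.0, 2.0               # threshold and spike slope factor (mV)
    a, tau_w, b = 0.002, 100.0, 0.02      # adaptation: coupling (uS), time const (ms), jump (nA)
    Vr, Vpeak = -58.0, 0.0                # reset potential and spike cutoff (mV)

    V, w = EL, 0.0
    Vtrace, spikes = [], []
    for k, Ik in enumerate(I):
        # leak + exponential spike-initiation current - adaptation current + input
        dV = (-gL * (V - EL) + gL * DeltaT * np.exp((V - VT) / DeltaT) - w + Ik) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vpeak:                     # spike: reset voltage, increment adaptation
            spikes.append(k * dt)
            V = Vr
            w += b
        Vtrace.append(V)
    return np.array(Vtrace), spikes

# Example: 500 ms step current of 0.5 nA
I = np.full(5000, 0.5)
V, spikes = simulate_adex(I)
print(f"{len(spikes)} spikes, first at {spikes[0]:.1f} ms" if spikes else "no spikes")
```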

1,128 citations


Journal ArticleDOI
10 Mar 2005-Nature
TL;DR: The implementation of Grover's search algorithm demonstrates that one-way quantum computation is ideally suited for such tasks.
Abstract: Standard quantum computation is based on sequences of unitary quantum logic gates that process qubits. The one-way quantum computer proposed by Raussendorf and Briegel is entirely different. It has changed our understanding of the requirements for quantum computation and more generally how we think about quantum physics. This new model requires qubits to be initialized in a highly entangled cluster state. From this point, the quantum computation proceeds by a sequence of single-qubit measurements with classical feedforward of their outcomes. Because of the essential role of measurement, a one-way quantum computer is irreversible. In the one-way quantum computer, the order and choices of measurements determine the algorithm computed. We have experimentally realized four-qubit cluster states encoded into the polarization state of four photons. We characterize the quantum state fully by implementing experimental four-qubit quantum state tomography. Using this cluster state, we demonstrate the feasibility of one-way quantum computing through a universal set of one- and two-qubit operations. Finally, our implementation of Grover's search algorithm demonstrates that one-way quantum computation is ideally suited for such tasks.

1,058 citations


Journal ArticleDOI
TL;DR: Theoretical and phenomenological implications of R-parity violation in supersymmetric theories are discussed in the context of particle physics and cosmology, including their relation to continuous and discrete symmetries.

949 citations


Proceedings ArticleDOI
13 Mar 2005
TL;DR: This paper suggests that the base station be mobile so that the nodes located close to it change over time; the resulting improvement in network lifetime is on the order of 500%.
Abstract: Although many energy efficient/conserving routing protocols have been proposed for wireless sensor networks, the concentration of data traffic towards a small number of base stations remains a major threat to the network lifetime. The main reason is that the sensor nodes located near a base station have to relay data for a large part of the network and thus deplete their batteries very quickly. The solution we propose in this paper suggests that the base station be mobile; in this way, the nodes located close to it change over time. Data collection protocols can then be optimized by taking both base station mobility and multi-hop routing into account. We first study the former, and conclude that the best mobility strategy consists in following the periphery of the network (we assume that the sensors are deployed within a circle). We then consider jointly mobility and routing algorithms in this case, and show that a better routing strategy uses a combination of round routes and short paths. We provide a detailed analytical model for each of our statements, and corroborate it with simulation results. We show that the obtained improvement in terms of network lifetime is in the order of 500%.

937 citations


Journal ArticleDOI
03 Jun 2005-Cell
TL;DR: It is shown here that several supposedly silent intergenic regions in the genome of S. cerevisiae are actually transcribed by RNA polymerase II, suggesting that the expressed fraction of the genome is higher than anticipated.

830 citations


Journal Article
TL;DR: This contribution presents LASVM, an online SVM algorithm, and shows that active example selection can yield faster training, higher accuracies, and simpler models while using only a fraction of the training example labels.
Abstract: Very high dimensional learning systems become theoretically possible when training examples are abundant. The computing cost then becomes the limiting factor. Any efficient learning algorithm should at least take a brief look at each example. But should all examples be given equal attention? This contribution proposes an empirical answer. We first present an online SVM algorithm based on this premise. LASVM yields competitive misclassification rates after a single pass over the training examples, outspeeding state-of-the-art SVM solvers. Then we show how active example selection can yield faster training, higher accuracies, and simpler models, using only a fraction of the training example labels.
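LASVM itself is not reproduced here. As a rough, hypothetical stand-in for the active-selection idea, the sketch below uses scikit-learn's SGDClassifier with hinge loss as an online linear SVM and, at each step, labels and trains on the pool example closest to the current decision boundary; the smallest-margin criterion is a common active-learning heuristic and is only an illustration, not necessarily the paper's selection rule.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Toy data standing in for a large labeled pool (assumption)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
classes = np.unique(y)

clf = SGDClassifier(loss="hinge", alpha=1e-4, random_state=0)  # online linear SVM surrogate
labeled = list(range(10))                                      # small seed set
clf.partial_fit(X[labeled], y[labeled], classes=classes)

pool = [i for i in range(len(X)) if i not in labeled]
for _ in range(200):                           # label only a fraction of the pool
    margins = np.abs(clf.decision_function(X[pool]))
    pick = pool.pop(int(np.argmin(margins)))   # most "uncertain" example
    labeled.append(pick)
    clf.partial_fit(X[[pick]], y[[pick]])      # single online update

print("training accuracy with", len(labeled), "labels:", clf.score(X, y))
```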

700 citations


Proceedings ArticleDOI
24 Oct 2005
TL;DR: It is first shown using simple arguments that the so-called residual and stratified methods do yield an improvement over the basic multinomial resampling approach, and a central limit theorem is established for the case where resampling is performed using the residual approach.
Abstract: This contribution is devoted to the comparison of various resampling approaches that have been proposed in the literature on particle filtering. It is first shown using simple arguments that the so-called residual and stratified methods do yield an improvement over the basic multinomial resampling approach. A simple counter-example showing that this property does not hold true for systematic resampling is given. Finally, some results on the large-sample behavior of the simple bootstrap filter algorithm are given. In particular, a central limit theorem is established for the case where resampling is performed using the residual approach.
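For concreteness, here is a small sketch of three of the schemes discussed above (multinomial, residual, stratified), all operating on normalized particle weights; the function names are mine. On a given weight vector, the residual and stratified variants typically reduce the Monte Carlo variability of the ancestor counts relative to plain multinomial sampling, consistent with the improvement the paper establishes.

```python
import numpy as np

def multinomial_resample(weights, rng):
    """Draw N ancestor indices i.i.d. from the weight distribution."""
    N = len(weights)
    return rng.choice(N, size=N, p=weights)

def residual_resample(weights, rng):
    """Keep floor(N*w_i) copies deterministically, then resample the residual part."""
    N = len(weights)
    counts = np.floor(N * weights).astype(int)
    residual = N * weights - counts
    n_left = N - counts.sum()
    idx = np.repeat(np.arange(N), counts)
    if n_left > 0:
        residual /= residual.sum()
        extra = rng.choice(N, size=n_left, p=residual)
        idx = np.concatenate([idx, extra])
    return idx

def stratified_resample(weights, rng):
    """One uniform draw per stratum [k/N, (k+1)/N), inverted through the weight CDF."""
    N = len(weights)
    u = (np.arange(N) + rng.random(N)) / N
    c = np.cumsum(weights)
    c[-1] = 1.0                       # guard against floating-point round-off
    return np.searchsorted(c, u)

rng = np.random.default_rng(0)
w = rng.random(1000); w /= w.sum()
for f in (multinomial_resample, residual_resample, stratified_resample):
    print(f.__name__, "unique ancestors:", len(np.unique(f(w, rng))))
```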

692 citations


Journal ArticleDOI
TL;DR: A unifying expression is proposed that gathers the majority of PDE-based formalisms for vector-valued image regularization into a single generic anisotropic diffusion equation, allowing the regularization framework to be implemented accurately by taking the local filtering properties of the proposed equations into account.
Abstract: In this paper, we focus on techniques for vector-valued image regularization, based on variational methods and PDE. Starting from the study of PDE-based formalisms previously proposed in the literature for the regularization of scalar and vector-valued data, we propose a unifying expression that gathers the majority of these previous frameworks into a single generic anisotropic diffusion equation. On one hand, the resulting expression provides a simple interpretation of the regularization process in terms of local filtering with spatially adaptive Gaussian kernels. On the other hand, it naturally disassembles any regularization scheme into the smoothing process itself and the underlying geometry that drives the smoothing. Thus, we can easily specialize our generic expression into different regularization PDE that fulfill desired smoothing behaviors, depending on the considered application: image restoration, inpainting, magnification, flow visualization, etc. Specific numerical schemes are also proposed, allowing us to implement our regularization framework with accuracy by taking the local filtering properties of the proposed equations into account. Finally, we illustrate the wide range of applications handled by our selected anisotropic diffusion equations with application results on color images.
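The unified trace-based formulation itself is not reproduced here. As a simpler illustration of the kind of PDE the paper generalizes, the sketch below iterates classical Perona-Malik scalar anisotropic diffusion on a noisy synthetic image; the test image, edge-stopping function and parameters are assumptions.

```python
import numpy as np

def perona_malik(img, n_iter=50, kappa=0.4, dt=0.2):
    """Classical scalar anisotropic diffusion (Perona-Malik), shown only as a
    simple instance of PDE-based regularization; it is NOT the unified
    vector-valued trace-based formulation proposed in the paper."""
    u = img.astype(float).copy()

    def g(d):                      # edge-stopping function: small where |gradient| is large
        return np.exp(-(d / kappa) ** 2)

    for _ in range(n_iter):
        # finite differences toward the four neighbors (periodic boundaries via roll)
        dN = np.roll(u, -1, axis=0) - u
        dS = np.roll(u,  1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u,  1, axis=1) - u
        u += dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u

# Example: denoise a synthetic noisy step edge while preserving the edge
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[:, 32:] = 1.0
noisy = img + 0.2 * rng.standard_normal(img.shape)
smoothed = perona_malik(noisy)
print("error std before/after:", np.std(noisy - img).round(3), np.std(smoothed - img).round(3))
```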

Book ChapterDOI
23 Jan 2005
TL;DR: This paper presents a natural generic construction of a three-party protocol, based on any two-party authenticated key exchange protocol, and proves its security without making use of the Random Oracle model, making it the first provably-secure password-based protocol in the three-party setting.
Abstract: Password-based authenticated key exchange are protocols which are designed to be secure even when the secret key or password shared between two users is drawn from a small set of values. Due to the low entropy of passwords, such protocols are always subject to on-line guessing attacks. In these attacks, the adversary may succeed with non-negligible probability by guessing the password shared between two users during its on-line attempt to impersonate one of these users. The main goal of password-based authenticated key exchange protocols is to restrict the adversary to this case only. In this paper, we consider password-based authenticated key exchange in the three-party scenario, in which the users trying to establish a secret do not share a password between themselves but only with a trusted server. Towards our goal, we recall some of the existing security notions for password-based authenticated key exchange protocols and introduce new ones that are more suitable to the case of generic constructions. We then present a natural generic construction of a three-party protocol, based on any two-party authenticated key exchange protocol, and prove its security without making use of the Random Oracle model. To the best of our knowledge, the new protocol is the first provably-secure password-based protocol in the three-party setting.

Journal ArticleDOI
TL;DR: After two years the plant communities pollinated by the most functionally diverse pollinator assemblage contained about 50% more plant species than did plant communities pollinated by less-diverse pollinator assemblages, suggesting the functional diversity of pollination networks may be critical to ecosystem sustainability.
Abstract: Pollination is exclusively or mainly animal mediated for 70% to 90% of angiosperm species. Thus, pollinators provide an essential ecosystem service to humankind. However, the impact of human-induced biodiversity loss on the functioning of plant–pollinator interactions has not been tested experimentally. To understand how plant communities respond to diversity changes in their pollinating fauna, we manipulated the functional diversity of both plants and pollinators under natural conditions. Increasing the functional diversity of both plants and pollinators led to the recruitment of more diverse plant communities. After two years the plant communities pollinated by the most functionally diverse pollinator assemblage contained about 50% more plant species than did plant communities pollinated by less-diverse pollinator assemblages. Moreover, the positive effect of functional diversity was explained by a complementarity between functional groups of pollinators and plants. Thus, the functional diversity of pollination networks may be critical to ecosystem sustainability.

Journal ArticleDOI
TL;DR: In this article, a planar microcavity photon mode strongly coupled to a semiconductor inter-subband transition in presence of a two-dimensional electron gas is described. And the quantum properties of the ground state (a two-mode squeezed vacuum), which can be tuned in situ by changing the value of the Rabi frequency, e.g., through an electrostatic gate.
Abstract: We present a quantum description of a planar microcavity photon mode strongly coupled to a semiconductor intersubband transition in the presence of a two-dimensional electron gas. We show that, in this kind of system, the vacuum Rabi frequency $\Omega_R$ can be a significant fraction of the intersubband transition frequency $\omega_{12}$. This regime of ultrastrong light-matter coupling is enhanced for long-wavelength transitions, because for a given doping density, effective mass and number of quantum wells, the ratio $\Omega_R/\omega_{12}$ increases as the square root of the intersubband emission wavelength. We characterize the quantum properties of the ground state (a two-mode squeezed vacuum), which can be tuned in situ by changing the value of $\Omega_R$, e.g., through an electrostatic gate. We finally point out how the tunability of the polariton quantum vacuum can be exploited to generate correlated photon pairs out of the vacuum via quantum electrodynamics phenomena reminiscent of the dynamical Casimir effect.
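A heuristic scaling sketch of the quoted square-root law (my assumptions: a half-wavelength cavity with $L_{cav} \sim \lambda_{12}/(2\sqrt{\epsilon_r})$, and fixed doping density $N_s$, number of wells $N_{QW}$, effective mass $m^*$ and oscillator strength $f_{12}$):

$$\Omega_R^2 \;\propto\; \frac{e^2 f_{12} N_{QW} N_s}{\epsilon_0 \epsilon_r m^* L_{cav}} \;\propto\; \frac{1}{\lambda_{12}} \;\propto\; \omega_{12} \quad\Longrightarrow\quad \frac{\Omega_R}{\omega_{12}} \;\propto\; \frac{1}{\sqrt{\omega_{12}}} \;\propto\; \sqrt{\lambda_{12}}.$$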

Journal ArticleDOI
TL;DR: Continuous two-phase systems appear to be highly efficient technologies for the anaerobic digestion of fruit and vegetable waste (FVW); their greatest advantage lies in the buffering of the organic loading rate taking place in the first stage, allowing a more constant feeding rate of the methanogenic second stage.

Book ChapterDOI
03 Oct 2005
TL;DR: A new algorithm, Poker (Price Of Knowledge and Estimated Reward), is described and analyzed; its performance compares favorably to that of other existing algorithms in several experiments, although the naive ε-greedy strategy often proves hard to beat.
Abstract: The multi-armed bandit problem for a gambler is to decide which arm of a K-slot machine to pull to maximize his total reward in a series of trials. Many real-world learning and optimization problems can be modeled in this way. Several strategies or algorithms have been proposed as a solution to this problem in the last two decades, but, to our knowledge, there has been no common evaluation of these algorithms. This paper provides a preliminary empirical evaluation of several multi-armed bandit algorithms. It also describes and analyzes a new algorithm, Poker (Price Of Knowledge and Estimated Reward) whose performance compares favorably to that of other existing algorithms in several experiments. One remarkable outcome of our experiments is that the most naive approach, the ε-greedy strategy, proves to be often hard to beat.
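As a baseline for comparison, here is a minimal sketch of the ε-greedy strategy mentioned above (Poker itself is not reproduced); the Bernoulli arms and parameter values are assumptions.

```python
import numpy as np

def epsilon_greedy(arm_probs, n_trials=10000, eps=0.1, seed=0):
    """Play a K-armed Bernoulli bandit: with probability eps explore a random
    arm, otherwise exploit the arm with the best empirical mean so far."""
    rng = np.random.default_rng(seed)
    K = len(arm_probs)
    counts = np.zeros(K)
    means = np.zeros(K)
    total = 0.0
    for _ in range(n_trials):
        if rng.random() < eps:
            a = int(rng.integers(K))            # explore
        else:
            a = int(np.argmax(means))           # exploit
        r = float(rng.random() < arm_probs[a])  # Bernoulli reward
        counts[a] += 1
        means[a] += (r - means[a]) / counts[a]  # incremental mean update
        total += r
    return total

arms = [0.2, 0.5, 0.55, 0.8]                    # hypothetical arm payout probabilities
reward = epsilon_greedy(arms)
print("total reward:", reward, "(best arm expects", 0.8 * 10000, ")")
```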

Proceedings ArticleDOI
13 Mar 2005
TL;DR: This work proposes a mechanism for secure positioning of wireless devices, termed verifiable multilateration, and shows how this mechanism can be used to secure positioning in sensor networks.
Abstract: So far, the problem of positioning in wireless networks has been mainly studied in a non-adversarial setting. In this work, we analyze the resistance of positioning techniques to position and distance spoofing attacks. We propose a mechanism for secure positioning of wireless devices, that we call verifiable multilateration. We then show how this mechanism can be used to secure positioning in sensor networks. We analyze our system through simulations.
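A minimal geometric sketch of the idea (my reading, not a faithful reproduction of the protocol): each verifier obtains a distance bound to the device (a bound an attacker can only enlarge, not shorten), and a claimed position is accepted only if it is consistent with every bound and lies inside the triangle formed by the verifiers. The tolerance value and helper names are assumptions.

```python
import numpy as np

def inside_triangle(p, a, b, c):
    """Sign test: is point p inside triangle (a, b, c)?"""
    def sign(p1, p2, p3):
        return (p1[0]-p3[0])*(p2[1]-p3[1]) - (p2[0]-p3[0])*(p1[1]-p3[1])
    s1, s2, s3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

def verify_position(claimed, verifiers, bounds, tol=1.0):
    """Accept 'claimed' only if (1) it matches every measured distance bound up to
    'tol' and (2) it lies inside the verification triangle, so that an attacker who
    can only enlarge distances cannot report a consistent spoofed position."""
    claimed = np.asarray(claimed, float)
    for v, d in zip(verifiers, bounds):
        if abs(np.linalg.norm(claimed - np.asarray(v, float)) - d) > tol:
            return False
    return inside_triangle(claimed, *verifiers)

verifiers = [(0.0, 0.0), (100.0, 0.0), (50.0, 90.0)]   # hypothetical verifier positions
true_pos = (45.0, 30.0)
bounds = [float(np.linalg.norm(np.subtract(true_pos, v))) for v in verifiers]

print(verify_position(true_pos, verifiers, bounds))       # True: honest report
print(verify_position((70.0, 60.0), verifiers, bounds))   # False: spoofed position
```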

Proceedings ArticleDOI
20 Jun 2005
TL;DR: This paper advocates the use of randomized trees as the classification technique, which is both fast enough for real-time performance and more robust, and gives a principled way not only to match keypoints but to select during a training phase those that are the most recognizable ones.
Abstract: In earlier work, we proposed treating wide baseline matching of feature points as a classification problem, in which each class corresponds to the set of all possible views of such a point. We used a K-mean plus Nearest Neighbor classifier to validate our approach, mostly because it was simple to implement. It has proved effective but still too slow for real-time use. In this paper, we advocate instead the use of randomized trees as the classification technique. It is both fast enough for real-time performance and more robust. It also gives us a principled way not only to match keypoints but to select during a training phase those that are the most recognizable ones. This results in a real-time system able to detect and position in 3D planar, non-planar, and even deformable objects. It is robust to illumination changes, scale changes and occlusions.
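The full pipeline (synthesizing many warped views of each keypoint and training the randomized trees on them) is beyond a short sketch. As a toy, hypothetical stand-in, the code below trains scikit-learn's ExtraTreesClassifier, a randomized-tree ensemble, to recognize keypoint "classes" from noisy synthetic patch descriptors; the data generation is an assumption, not the paper's training procedure.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(0)
n_keypoints, views_per_kp, dim = 50, 200, 64   # 50 keypoint classes, 8x8 "patch" descriptors

# Toy stand-in for the training set: each keypoint has a reference descriptor, and its
# training "views" are noisy perturbations of it (real training would warp image patches
# with random affine transforms and illumination changes).
prototypes = rng.random((n_keypoints, dim))
X = np.repeat(prototypes, views_per_kp, axis=0) \
    + 0.15 * rng.standard_normal((n_keypoints * views_per_kp, dim))
y = np.repeat(np.arange(n_keypoints), views_per_kp)

clf = ExtraTreesClassifier(n_estimators=50, max_depth=12, n_jobs=-1, random_state=0)
clf.fit(X, y)

# At "run time", a new view of keypoint 7 should be classified back to class 7
query = prototypes[7] + 0.15 * rng.standard_normal(dim)
print("predicted keypoint class:", clf.predict(query.reshape(1, -1))[0])
```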

Book ChapterDOI
14 Aug 2005
TL;DR: This work identifies and fills some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS) and provides a transform of an anonymous IBE scheme to a secure PEKS scheme that guarantees consistency.
Abstract: We identify and fill some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS). We define computational and statistical relaxations of the existing notion of perfect consistency, show that the scheme of [7] is computationally consistent, and provide a new scheme that is statistically consistent. We also provide a transform of an anonymous IBE scheme to a secure PEKS scheme that, unlike the previous one, guarantees consistency. Finally we suggest three extensions of the basic notions considered here, namely anonymous HIBE, public-key encryption with temporary keyword search, and identity-based encryption with keyword search.

Proceedings ArticleDOI
13 Mar 2005
TL;DR: "Random trip", a generic mobility model for independent mobiles, is defined; it contains as special cases the random waypoint on convex or non-convex domains, random walk with reflection or wrapping, city section, space graph and other models.
Abstract: We define "random trip", a generic mobility model for independent mobiles that contains as special cases: the random waypoint on convex or non convex domains, random walk with reflection or wrapping, city section, space graph and other models. We use Palm calculus to study the model and give a necessary and sufficient condition for a stationary regime to exist. When this condition is satisfied, we compute the stationary regime and give an algorithm to start a simulation in steady state (perfect simulation). The algorithm does not require the knowledge of geometric constants. For the special case of random waypoint, we provide for the first time a proof and a sufficient and necessary condition of the existence of a stationary regime. Further, we extend its applicability to a broad class of non convex and multi-site examples, and provide a ready-to-use algorithm for perfect simulation. For the special case of random walks with reflection or wrapping, we show that, in the stationary regime, the mobile location is uniformly distributed and is independent of the speed vector, and that there is no speed decay. Our framework provides a rich set of well understood models that can be used to simulate mobile networks with independent node movements. Our perfect sampling is implemented to use with ns-2, and it is freely available to download from http://ica1www.epfl.ch/RandomTrip.
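For orientation, a naive random waypoint simulator on a square (convex) domain is sketched below; note that it starts from an arbitrary state rather than from the stationary regime, which is exactly the initialization bias the paper's perfect-simulation algorithm removes. Domain size and speed range are assumptions.

```python
import numpy as np

def random_waypoint(n_steps, side=1000.0, vmin=1.0, vmax=20.0, dt=1.0, seed=0):
    """Naive random waypoint on a square: pick a uniform destination and a uniform
    speed, move toward it, repeat. Started from an arbitrary state, it only
    converges toward the stationary regime over time (the paper's perfect
    simulation would instead sample the initial state from that regime)."""
    rng = np.random.default_rng(seed)
    pos = rng.random(2) * side
    dest = rng.random(2) * side
    speed = rng.uniform(vmin, vmax)
    trace = np.empty((n_steps, 2))
    for t in range(n_steps):
        step = dest - pos
        dist = np.linalg.norm(step)
        if dist <= speed * dt:             # waypoint reached: draw a new trip
            pos = dest
            dest = rng.random(2) * side
            speed = rng.uniform(vmin, vmax)
        else:
            pos = pos + step / dist * speed * dt
        trace[t] = pos
    return trace

trace = random_waypoint(100000)
print("mean position over the run:", trace.mean(axis=0).round(1))  # concentrates toward the center
```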

Journal ArticleDOI
TL;DR: In this paper, the authors present a multisite analysis of the relationship between plant diversity and ecosystem functioning within the European BIODEPTH network of plant-diversity manipulation experiments, showing that communities with a higher diversity of species and functional groups were more productive and utilized resources more completely by intercepting more light, taking up more nitrogen, and occupying more of the available space.
Abstract: We present a multisite analysis of the relationship between plant diversity and ecosystem functioning within the European BIODEPTH network of plant-diversity manipulation experiments. We report results of the analysis of 11 variables addressing several aspects of key ecosystem processes like biomass production, resource use (space, light, and nitrogen), and decomposition, measured across three years in plots of varying plant species richness at eight different European grassland field sites. Differences among sites explained substantial and significant amounts of the variation of most of the ecosystem processes examined. However, against this background of geographic variation, all the aspects of plant diversity and composition we examined (i.e., both numbers and types of species and functional groups) produced significant, mostly positive impacts on ecosystem processes. Analyses using the additive partitioning method revealed that complementarity effects (greater net yields than predicted from monocultures due to resource partitioning, positive interactions, etc.) were stronger and more consistent than selection effects (the covariance between monoculture yield and change in yield in mixtures) caused by dominance of species with particular traits. In general, communities with a higher diversity of species and functional groups were more productive and utilized resources more completely by intercepting more light, taking up more nitrogen, and occupying more of the available space. Diversity had significant effects through both increased vegetation cover and greater nitrogen retention by plants when this resource was more abundant through N2 fixation by legumes. However, additional positive diversity effects remained even after controlling for differences in vegetation cover and for the presence of legumes in communities. Diversity effects were stronger on above- than belowground processes. In particular, clear diversity effects on decomposition were only observed at one of the eight sites. The ecosystem effects of plant diversity also varied between sites and years. In general, diversity effects were lowest in the first year and stronger later in the experiment, indicating that they were not transitional due to community establishment. These analyses of our complete ecosystem process data set largely reinforce our previous results, and those from comparable biodiversity experiments, and extend the generality of diversity–ecosystem functioning relationships to multiple sites, years, and processes.

Journal ArticleDOI
TL;DR: An analysis of the French heat wave episode of 2003 highlights how heat wave dangers result from the intricate association of natural and social factors, and examines the causes and effects of the risk's sudden shift from attenuation into amplification.
Abstract: In an analysis of the French episode of heat wave in 2003, this article highlights how heat wave dangers result from the intricate association of natural and social factors. Unusually high temperatures, as well as socioeconomic vulnerability, along with social attenuation of hazards, in a general context where the anthropogenic contribution to climate change is becoming more plausible, led to an excess of 14,947 deaths in France, between August 4 and 18, 2003. The greatest increase in mortality was due to causes directly attributable to heat: dehydration, hyperthermia, heat stroke. In addition to age and gender, combinatorial factors included preexisting disease, medication, urban residence, isolation, poverty, and, probably, air pollution. Although diversely impacted or reported, many parts of Europe suffered human and other losses, such as farming and forestry through drought and fires. Summer 2003 was the hottest in Europe since 1500, very likely due in part to anthropogenic climate change. The French experience confirms research establishing that heat waves are a major mortal risk, number one among so-called natural hazards in postindustrial societies. Yet France had no policy in place, as if dangerous climate were restricted to a distant or uncertain future of climate change, or to preindustrial countries. We analyze the heat wave's profile as a strongly attenuated risk in the French context, as well as the causes and the effects of its sudden shift into amplification. Research and preparedness needs are highlighted.

Journal ArticleDOI
TL;DR: In this paper, a modified sol-gel method was used for the preparation of multi-walled carbon nanotubes (MWNT) and TiO2 composite catalysts for photocatalytic degradation of phenol.
Abstract: Multi-walled carbon nanotubes (MWNT) and TiO2 composite catalysts were prepared by a modified sol–gel method. The nanoscaled composite materials were extensively characterized by TG, N2 adsorption-desorption isotherm, XRD, SEM, EDX, TEM and UV–vis spectra. The photocatalytic degradation of phenol was performed under visible light irradiation on these catalysts. An optimum of synergetic effect on photocatalytic activity was observed for a weight ratio MWNT/TiO2 equal to 20% with an increase in the first-order rate constant by a factor of 4.1. The synergetic effect, induced by a strong interphase interaction between MWNT and TiO2, was discussed in terms of different roles played by MWNT in the composite catalysts.
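For reference, the "first-order rate constant" quoted above is the $k_{app}$ of the standard pseudo-first-order degradation law (a textbook relation, not restated in the abstract):

$$-\frac{dC}{dt} = k_{app}\,C \quad\Longrightarrow\quad \ln\frac{C_0}{C(t)} = k_{app}\,t,$$

so the reported synergy corresponds to $k_{app}$ for the 20% MWNT/TiO2 composite being about 4.1 times that of the reference photocatalyst (presumably neat TiO2).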

Journal ArticleDOI
23 Dec 2005-Science
TL;DR: Initial 176Hf/177Hf values of detrital zircons from Jack Hills, Western Australia, support the view that continental crust had formed by 4.4 to 4.5 Ga and was rapidly recycled into the mantle.
Abstract: The long-favored paradigm for the development of continental crust is one of progressive growth beginning at ∼4 billion years ago (Ga). To test this hypothesis, we measured initial 176Hf/177Hf values of 4.01- to 4.37-Ga detrital zircons from Jack Hills, Western Australia. ϵHf (deviations of 176Hf/177Hf from bulk Earth in parts per 10^4) values show large positive and negative deviations from those of the bulk Earth. Negative values indicate the development of a Lu/Hf reservoir that is consistent with the formation of continental crust (Lu/Hf ≈ 0.01), perhaps as early as 4.5 Ga. Positive ϵHf deviations require early and likely widespread depletion of the upper mantle. These results support the view that continental crust had formed by 4.4 to 4.5 Ga and was rapidly recycled into the mantle.
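For reference, εHf at crystallization age t is conventionally defined against the chondritic uniform reservoir (CHUR); this standard relation, not restated in the abstract, is

$$\epsilon_{Hf}(t) = \left[\frac{(^{176}\mathrm{Hf}/^{177}\mathrm{Hf})_{\mathrm{zircon}}(t)}{(^{176}\mathrm{Hf}/^{177}\mathrm{Hf})_{\mathrm{CHUR}}(t)} - 1\right] \times 10^{4},$$

so negative values indicate a source whose time-integrated Lu/Hf lies below the chondritic value, as expected for continental crust.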

Journal ArticleDOI
TL;DR: Results demonstrate that manipulation of a plant regulatory gene can simultaneously influence the production of several phytonutrients generated from independent biosynthetic pathways, and provide a novel example of the use of organ-specific gene silencing to improve the nutritional value of plant-derived products.
Abstract: Tomatoes are a principal dietary source of carotenoids and flavonoids, both of which are highly beneficial for human health. Overexpression of genes encoding biosynthetic enzymes or transcription factors have resulted in tomatoes with improved carotenoid or flavonoid content, but never with both. We attempted to increase tomato fruit nutritional value by suppressing an endogenous photomorphogenesis regulatory gene, DET1, using fruit-specific promoters combined with RNA interference (RNAi) technology. Molecular analysis indicated that DET1 transcripts were indeed specifically degraded in transgenic fruits. Both carotenoid and flavonoid contents were increased significantly, whereas other parameters of fruit quality were largely unchanged. These results demonstrate that manipulation of a plant regulatory gene can simultaneously influence the production of several phytonutrients generated from independent biosynthetic pathways, and provide a novel example of the use of organ-specific gene silencing to improve the nutritional value of plant-derived products.

Journal ArticleDOI
TL;DR: In this paper, the authors characterize N = 1 vacua of type-II theories in terms of generalized complex structure on the internal manifold M. The conditions for preserving N = 1 supersymmetry turn out to be simple generalizations of equations that have appeared in the context of N = 2 and topological strings.
Abstract: We characterize N = 1 vacua of type-II theories in terms of generalized complex structure on the internal manifold M. The structure group of T(M)⊕T*(M) being SU(3) × SU(3) implies the existence of two pure spinors $\Phi_1$ and $\Phi_2$. The conditions for preserving N = 1 supersymmetry turn out to be simple generalizations of equations that have appeared in the context of N = 2 and topological strings. They are $(d + H\wedge)\Phi_1 = 0$ and $(d + H\wedge)\Phi_2 = F_{RR}$. The equation for the first pure spinor implies that the internal space is a twisted generalized Calabi-Yau manifold of a hybrid complex-symplectic type, while the RR-fields serve as an integrability defect for the second.

Journal ArticleDOI
TL;DR: The authors observed linear negative trajectories that characterized the changes in individuals across time in both affective and normative commitment and an individual's intention to quit the organization was characterized by a positive trajectory.
Abstract: Through the use of affective, normative, and continuance commitment in a multivariate 2nd-order factor latent growth modeling approach, the authors observed linear negative trajectories that characterized the changes in individuals across time in both affective and normative commitment. In turn, an individual's intention to quit the organization was characterized by a positive trajectory. A significant association was also found between the change trajectories such that the steeper the decline in an individual's affective and normative commitments across time, the greater the rate of increase in that individual's intention to quit, and, further, the greater the likelihood that the person actually left the organization over the next 9 months. Findings regarding continuance commitment and its components were mixed.

Book ChapterDOI
04 Apr 2005
TL;DR: ASTREE is an abstract interpretation-based static program analyzer aiming at proving automatically the absence of run time errors in programs written in the C programming language, producing a correctness proof for complex software without any false alarm in a few hours of computation.
Abstract: ASTREE is an abstract interpretation-based static program analyzer aiming at proving automatically the absence of run time errors in programs written in the C programming language. It has been applied with success to large embedded control-command safety critical real-time software generated automatically from synchronous specifications, producing a correctness proof for complex software without any false alarm in a few hours of computation.

Journal ArticleDOI
01 Oct 2005-Carbon
TL;DR: In this article, the pore texture of both dry and pyrolyzed carbon gels depends on the drying process, and several more or less expensive methods (supercritical drying, freeze-drying, evaporative drying) were tested in order to determine which process is the most suitable for the synthesis of a porous carbon with a definite texture.

Journal ArticleDOI
TL;DR: In this paper, the authors investigate the moderating effects of ambient odors on shoppers' emotions, perceptions of the retail environment, and perceptions of product quality under various levels of retail density.