
Showing papers by "Polytechnic University of Turin", published in 2011


Journal ArticleDOI
TL;DR: In this article, the status of worldwide research in the thermal conductivity of carbon nanotubes and their polymer nanocomposites is reviewed, as well as the relationship between thermal conductivities and the micro- and nano-structure of the composites.

2,102 citations


Journal ArticleDOI
13 May 2011-Science
TL;DR: Natural photosynthesis is compared with present technologies for photovoltaic-driven electrolysis of water to produce hydrogen and opportunities in which the frontiers of synthetic biology might be used to enhance natural photosynthesis for improved solar energy conversion efficiency are considered.
Abstract: Comparing photosynthetic and photovoltaic efficiencies is not a simple issue. Although both processes harvest the energy in sunlight, they operate in distinctly different ways and produce different types of products: biomass or chemical fuels in the case of natural photosynthesis and nonstored electrical current in the case of photovoltaics. In order to find common ground for evaluating energy-conversion efficiency, we compare natural photosynthesis with present technologies for photovoltaic-driven electrolysis of water to produce hydrogen. Photovoltaic-driven electrolysis is the more efficient process when measured on an annual basis, yet short-term yields for photosynthetic conversion under optimal conditions come within a factor of 2 or 3 of the photovoltaic benchmark. We consider opportunities in which the frontiers of synthetic biology might be used to enhance natural photosynthesis for improved solar energy conversion efficiency.
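As a rough illustration of the kind of efficiency comparison the paper makes, the sketch below multiplies assumed component efficiencies to get a solar-to-hydrogen figure. All numbers are generic ballpark assumptions for illustration, not values taken from the paper.

```python
# Illustrative solar-to-fuel arithmetic; the efficiency figures below are
# generic ballpark assumptions, NOT values taken from the paper.

pv_efficiency = 0.18              # assumed PV module efficiency
electrolyzer_efficiency = 0.75    # assumed electrolyzer energy efficiency
solar_to_hydrogen = pv_efficiency * electrolyzer_efficiency

photosynthesis_annual = 0.01      # assumed annual-average photosynthetic efficiency

print(f"PV-driven electrolysis: {solar_to_hydrogen:.1%} solar-to-fuel")
print(f"Natural photosynthesis: {photosynthesis_annual:.1%} (annual average)")
print(f"ratio: {solar_to_hydrogen / photosynthesis_annual:.0f}x")
```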

1,379 citations


Journal ArticleDOI
TL;DR: The findings suggest that contacts predicted by DCA can be used as a reliable guide to facilitate computational predictions of alternative protein conformations, protein complex formation, and even the de novo prediction of protein domain structures, contingent on the existence of a large number of homologous sequences which are being rapidly made available due to advances in genome sequencing.
Abstract: The similarity in the three-dimensional structures of homologous proteins imposes strong constraints on their sequence variability. It has long been suggested that the resulting correlations among amino acid compositions at different sequence positions can be exploited to infer spatial contacts within the tertiary protein structure. Crucial to this inference is the ability to disentangle direct and indirect correlations, as accomplished by the recently introduced direct-coupling analysis (DCA). Here we develop a computationally efficient implementation of DCA, which allows us to evaluate the accuracy of contact prediction by DCA for a large number of protein domains, based purely on sequence information. DCA is shown to yield a large number of correctly predicted contacts, recapitulating the global structure of the contact map for the majority of the protein domains examined. Furthermore, our analysis captures clear signals beyond intradomain residue contacts, arising, e.g., from alternative protein conformations, ligand-mediated residue couplings, and interdomain interactions in protein oligomers. Our findings suggest that contacts predicted by DCA can be used as a reliable guide to facilitate computational predictions of alternative protein conformations, protein complex formation, and even the de novo prediction of protein domain structures, contingent on the existence of a large number of homologous sequences which are being rapidly made available due to advances in genome sequencing.
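The core computational step of mean-field DCA can be sketched as follows: one-hot encode the alignment columns, invert a regularized correlation matrix, and score residue pairs by the norm of their coupling block. This is a minimal sketch that omits the paper's sequence reweighting and exact pseudocount scheme; the ridge term `reg` is a crude stand-in.

```python
import numpy as np

def direct_coupling_scores(msa, q=21, reg=0.5):
    """Minimal mean-field DCA sketch. `msa` is an integer-encoded alignment of
    shape (n_sequences, n_positions) with symbols in 0..q-1. Omits the paper's
    sequence reweighting and exact pseudocounts; `reg` is a crude regularizer."""
    n_seq, n_pos = msa.shape
    k = q - 1                                # drop one state per column
    X = np.zeros((n_seq, n_pos * k))
    for i in range(n_pos):
        for a in range(k):
            X[:, i * k + a] = (msa[:, i] == a)
    f = X.mean(axis=0)
    C = X.T @ X / n_seq - np.outer(f, f)     # connected correlation matrix
    J = -np.linalg.inv(C + reg * np.eye(C.shape[0]))  # mean-field couplings
    scores = np.zeros((n_pos, n_pos))        # Frobenius norm of coupling blocks
    for i in range(n_pos):
        for j in range(i + 1, n_pos):
            block = J[i * k:(i + 1) * k, j * k:(j + 1) * k]
            scores[i, j] = scores[j, i] = np.linalg.norm(block)
    return scores
```

High-scoring pairs are the candidate direct contacts; ranking them and comparing against a known structure is the evaluation step the abstract describes.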

1,319 citations


Journal ArticleDOI
29 Apr 2011-PLOS ONE
TL;DR: OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics, is presented.
Abstract: Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method performs comparably to the best existing algorithms on artificial benchmark graphs. Several applications to real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks.
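The statistical test at the heart of OSLOM can be caricatured as follows: under a configuration-model null, how surprising is a node's number of links into a cluster? The sketch below uses a simple binomial tail as a stand-in; OSLOM itself refines this with order statistics over all candidate nodes.

```python
from scipy.stats import binom

def node_cluster_significance(k_in, k, cluster_degree, total_degree):
    """Probability that a node of degree k has at least k_in links into a
    cluster by chance, under a configuration-model null where each edge
    endpoint lands in the cluster with probability
    p = cluster_degree / total_degree. A crude stand-in for OSLOM's test."""
    p = cluster_degree / total_degree
    return binom.sf(k_in - 1, k, p)          # P(X >= k_in)

# a node with 8 of its 10 links inside a cluster holding 5% of all endpoints:
print(node_cluster_significance(k_in=8, k=10, cluster_degree=50, total_degree=1000))
```

A very small probability means the node is far more connected to the cluster than expected at random, which is the kind of significance the fitness function expresses.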

1,205 citations


Journal ArticleDOI
07 Dec 2011-PLOS ONE
TL;DR: Surprisingly, it is found that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures, and the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy.
Abstract: The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7–4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of protein structures, new strategies in protein and drug design, and the identification of functional genetic variants in normal and disease genomes.
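Coupling-based contact predictions of this kind are commonly evaluated by the precision of the top-L ranked pairs. The sketch below shows that generic evaluation (not the paper's exact protocol); `scores` and `distance_map` are assumed inputs, a pairwise coupling matrix and a residue-residue distance matrix from a reference structure.

```python
import numpy as np

def top_L_precision(scores, distance_map, L, cutoff=8.0, min_sep=5):
    """Generic contact-prediction evaluation (not the paper's exact protocol):
    rank residue pairs with sequence separation >= min_sep by coupling score,
    take the top L, and report the fraction closer than `cutoff` angstroms in
    the reference structure."""
    n = scores.shape[0]
    pairs = [(scores[i, j], i, j)
             for i in range(n) for j in range(i + min_sep, n)]
    pairs.sort(reverse=True)
    top = pairs[:L]
    hits = sum(1 for _, i, j in top if distance_map[i, j] < cutoff)
    return hits / len(top)
```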

1,125 citations


Journal ArticleDOI
TL;DR: It is shown that multiresolution modularity is not capable of recovering the planted community structure for any value of the resolution parameter, even when the structure is pronounced and easily detectable by other methods.
Abstract: Modularity maximization is the most popular technique for the detection of community structure in graphs. The resolution limit of the method is supposedly solved by the introduction of modified versions of the measure with tunable resolution parameters. We show that multiresolution modularity suffers from two opposite coexisting problems: the tendency to merge small subgraphs, which dominates when the resolution is low, and the tendency to split large subgraphs, which dominates when the resolution is high. In benchmark networks with heterogeneous distributions of cluster sizes, the simultaneous elimination of both biases is not possible, and multiresolution modularity cannot recover the planted community structure for any value of the resolution parameter, not even when it is pronounced and easily detectable by other methods. This holds for other multiresolution techniques and is likely to be a general problem of methods based on global optimization.
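Multiresolution modularity itself is straightforward to compute; a minimal sketch for an undirected graph follows, with `gamma` the tunable resolution parameter the abstract discusses (gamma < 1 favors larger clusters, gamma > 1 smaller ones).

```python
import numpy as np

def modularity(A, labels, gamma=1.0):
    """Multiresolution modularity Q(gamma) for an undirected graph with
    adjacency matrix A and a partition given by the integer array `labels`:
    Q = (1/2m) * sum_ij [A_ij - gamma * k_i k_j / 2m] * delta(c_i, c_j)."""
    k = A.sum(axis=1)
    two_m = A.sum()                                   # 2m for an undirected graph
    same = (labels[:, None] == labels[None, :])       # delta(c_i, c_j)
    return ((A - gamma * np.outer(k, k) / two_m) * same).sum() / two_m
```

Sweeping `gamma` and maximizing Q(gamma) is exactly the multiresolution procedure whose two opposite biases the paper exposes.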

461 citations


Journal ArticleDOI
31 Jan 2011-PLOS ONE
TL;DR: A comprehensive computational and theoretical study of the role of travel restrictions in halting and delaying pandemics by using a model that explicitly integrates air travel and short-range mobility data with high-resolution demographic data across the world and that is validated by the accumulation of data from the 2009 H1N1 pandemic.
Abstract: After the emergence of the H1N1 influenza in 2009, some countries responded with travel-related controls during the early stage of the outbreak in an attempt to contain or slow down its international spread. These controls along with self-imposed travel limitations contributed to a decline of about 40% in international air traffic to/from Mexico following the international alert. However, no containment was achieved by such restrictions and the virus was able to reach pandemic proportions in a short time. When gauging the value and efficacy of mobility and travel restrictions it is crucial to rely on epidemic models that integrate the wide range of features characterizing human mobility and the many options available to public health organizations for responding to a pandemic. Here we present a comprehensive computational and theoretical study of the role of travel restrictions in halting and delaying pandemics by using a model that explicitly integrates air travel and short-range mobility data with high-resolution demographic data across the world and that is validated by the accumulation of data from the 2009 H1N1 pandemic. We explore alternative scenarios for the 2009 H1N1 pandemic by assessing the potential impact of mobility restrictions that vary with respect to their magnitude and their position in the pandemic timeline. We provide a quantitative discussion of the delay obtained by different mobility restrictions and the likelihood of containing outbreaks of infectious diseases at their source, confirming the limited value and feasibility of international travel restrictions. These results are rationalized in the theoretical framework characterizing the invasion dynamics of the epidemics at the metapopulation level.
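A toy metapopulation model conveys why travel restrictions delay but rarely contain an outbreak: scaling the mobility matrix down by a factor `alpha` postpones the seeding of other subpopulations without changing local epidemic growth. The sketch below uses invented parameters, not the paper's data-driven worldwide model.

```python
import numpy as np

def simulate(alpha=1.0, n=4, beta=0.3, gamma=0.1, days=300, seed_city=0):
    """Toy metapopulation SIR with a mobility matrix scaled by `alpha`
    (alpha < 1 models a travel restriction). All parameters are invented for
    illustration; the paper couples air-travel, commuting and census data."""
    N = np.full(n, 1e6)
    S, I, R = N.copy(), np.zeros(n), np.zeros(n)
    S[seed_city] -= 10.0; I[seed_city] = 10.0
    w = alpha * 1e-4 * (np.ones((n, n)) - np.eye(n))  # daily per-capita travel rates
    arrival = np.full(n, np.inf)                      # day each city reaches 100 cases
    for t in range(days):
        new_inf = beta * S * I / N
        S, I, R = S - new_inf, I + new_inf - gamma * I, R + gamma * I
        for X in (S, I, R):                           # mobility step
            flux = w * X[:, None]                     # flux[i, j]: travelers i -> j
            X += flux.sum(axis=0) - flux.sum(axis=1)
        arrival = np.where((I > 100) & np.isinf(arrival), t, arrival)
    return arrival

print("no restriction:", simulate(alpha=1.0))   # cutting mobility delays, but does
print("90% reduction :", simulate(alpha=0.1))   # not prevent, the seeding of cities
```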

458 citations


Journal ArticleDOI
TL;DR: A review and critical analysis of the mathematical literature concerning the modeling of vehicular traffic and crowd phenomena is presented, concluding with research perspectives on the development of a unified modeling strategy.
Abstract: This paper presents a review and critical analysis of the mathematical literature concerning the modeling of vehicular traffic and crowd phenomena. The survey of models deals with the representation scales and the mathematical frameworks that are used for the modeling approach. The paper also considers the challenging objective of modeling complex systems consisting of large numbers of individuals interacting in a nonlinear manner, where one of the difficulties is that such systems are hard to model at a global level based only on the description of the dynamics of their individual elements. The review is concluded with a critical analysis focused on research perspectives that consider the development of a unified modeling strategy.

434 citations


Journal ArticleDOI
TL;DR: In this article, electron beam melting (EBM) is used to achieve selective densification of metal powder by melting it in a layerwise manner following a CAD design. The microstructure, residual porosity and chemical composition of the samples have been investigated both immediately after EBM and after heat treatments.

377 citations


Journal ArticleDOI
TL;DR: In this article, the effect of thickness stretching in plate/shell structures made by materials which are functionally graded (FGM) in the thickness directions was evaluated by removing or retaining the transverse normal strain in the kinematics assumptions of various refined plate and shell theories.
Abstract: The present work evaluates the effect of thickness stretching in plate/shell structures made by materials which are functionally graded (FGM) in the thickness directions. That is done by removing or retaining the transverse normal strain in the kinematics assumptions of various refined plate/shell theories. Variable plate/shell models are implemented according to Carrera’s Unified Formulation. Plate/shell theories with constant transverse displacement are compared with the corresponding linear to fourth order of expansion in the thickness direction ones. Single-layered and multilayered FGM structures have been analyzed. A large numerical investigation, encompassing various plate/shell geometries as well as various grading rates for FGMs, has been conducted. It is mainly concluded that a refinements of classical theories that include additional in-plane variables could results meaningless unless transverse normal strain effects are taken into account.
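For reference, the kinematic expansion underlying Carrera's Unified Formulation can be stated compactly as below (a standard statement of the formulation, not reproduced from the paper); theories that keep the transverse displacement constant through the thickness force the transverse normal strain to vanish, which is exactly the effect under study.

```latex
% Carrera's Unified Formulation: the displacement field is expanded through
% the thickness coordinate z (implicit summation over tau),
\[
\mathbf{u}(x,y,z) \;=\; F_\tau(z)\,\mathbf{u}_\tau(x,y), \qquad \tau = 0,1,\dots,N,
\]
% e.g. Taylor-type thickness functions F_tau(z) = z^tau. Keeping the transverse
% displacement u_z constant in z forces eps_zz = 0; retaining its tau >= 1
% terms restores the thickness-stretching effect:
\[
\varepsilon_{zz} \;=\; \frac{\partial u_z}{\partial z}
\;=\; \frac{\partial F_\tau}{\partial z}\, u_{z\tau}.
\]
```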

373 citations


Proceedings ArticleDOI
02 Nov 2011
TL;DR: It is shown that the YouTube system is highly optimized for PC access and leverages aggressive buffering policies to guarantee excellent video playback; however, this causes 25%-39% of data to be unnecessarily transferred, since users abort the playback very early.
Abstract: In this paper we present a complete measurement study that compares YouTube traffic generated by mobile devices (smart-phones, tablets) with traffic generated by common PCs (desktops, notebooks, netbooks). We investigate the users' behavior and correlate it with the system performance. Our measurements are performed using unique data sets which are collected from vantage points in nation-wide ISPs and University campuses from two countries in Europe and the U.S. Our results show that the user access patterns are similar across a wide range of user locations, access technologies and user devices. Users stick with default player configurations, e.g., not changing video resolution or rarely enabling full screen playback. Furthermore, it is very common that users abort video playback, with 60% of videos watched for no more than 20% of their duration. We show that the YouTube system is highly optimized for PC access and leverages aggressive buffering policies to guarantee excellent video playback. This, however, causes 25%-39% of data to be unnecessarily transferred, since users abort the playback very early. This waste of data transferred is even higher when mobile devices are considered. The limited storage offered by those devices makes the video download more complicated and overall less efficient, so that clients typically download more data than the actual video size. Overall, this result calls for better system optimization for both PC and mobile access.
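The waste the authors quantify comes from the gap between what the player buffers and what the user actually watches; a back-of-the-envelope version of that accounting, with hypothetical fractions rather than the paper's measurements:

```python
def wasted_fraction(watch_fraction, buffered_fraction):
    """Toy model of the waste the paper measures: the player has buffered
    `buffered_fraction` of the video when the user aborts after watching
    `watch_fraction`; the surplus is downloaded but never played. Both
    fractions here are hypothetical, not measured values."""
    downloaded = min(buffered_fraction, 1.0)
    return max(downloaded - watch_fraction, 0.0)

# aggressive buffering fetched 60% of the file, the user left after 20%:
print(wasted_fraction(watch_fraction=0.2, buffered_fraction=0.6))  # 0.4
```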

Journal ArticleDOI
TL;DR: An efficient strategy for controlling a vast range of nonintegrable quantum many-body one-dimensional systems that can be merged with state-of-the-art tensor network simulation methods such as the density matrix renormalization group is presented.
Abstract: We present an efficient strategy for controlling a vast range of nonintegrable quantum many-body one-dimensional systems that can be merged with state-of-the-art tensor network simulation methods such as the density matrix renormalization group. To demonstrate its potential, we employ it to solve a major issue in current optical-lattice physics with ultracold atoms: we show how to reduce by about 2 orders of magnitude the time needed to bring a superfluid gas into a Mott insulator state, while suppressing defects by more than 1 order of magnitude as compared to current experiments [T. Stoferle et al., Phys. Rev. Lett. 92, 130403 (2004)]. Finally, we show that the optimal pulse is robust against atom number fluctuations.

Journal ArticleDOI
TL;DR: The family of two-echelon vehicle routing problems (VRPs), a term that broadly covers such settings, where the delivery from one or more depots to customers is managed by routing and consolidating freight through intermediate depots, is introduced.
Abstract: Multiechelon distribution systems are quite common in supply-chain and logistics. They are used by public administrations in their transportation and traffic planning strategies, as well as by companies to model their own distribution systems. In the literature, most of the studies address issues relating to the movement of flows throughout the system from their origins to their final destinations. Another recent trend is to focus on the management of the vehicle fleets required to provide transportation among different echelons. The aim of this paper is twofold. First, it introduces the family of two-echelon vehicle routing problems (VRPs), a term that broadly covers settings where the delivery from one or more depots to customers is managed by routing and consolidating freight through intermediate depots. Second, it considers in detail the basic version of two-echelon VRPs, the two-echelon capacitated VRP, an extension of the classical VRP in which deliveries must pass through intermediate depots, called satellites. A mathematical model for the two-echelon capacitated VRP, some valid inequalities, and two math-heuristics based on the model are presented. Computational results on instances with up to 50 customers and four satellites show the effectiveness of the methods developed.
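The two-echelon structure is easy to see in code. The sketch below is only a cluster-first, route-second caricature in the spirit of decomposition heuristics (the paper's actual method is a MILP with valid inequalities and model-based math-heuristics): customers are assigned to their nearest satellite, and each satellite's route is then built greedily.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def two_echelon_sketch(depot, satellites, customers):
    """Caricature of a two-echelon delivery system: assign customers to their
    nearest satellite, then route each satellite with a nearest-neighbor
    heuristic. Capacities and costs are ignored for brevity."""
    assignment = {s: [] for s in range(len(satellites))}
    for c in customers:
        j = min(range(len(satellites)), key=lambda j: dist(satellites[j], c))
        assignment[j].append(c)
    # first echelon: one depot -> satellite -> depot trip per used satellite
    first_echelon = {s: 2 * dist(depot, satellites[s])
                     for s, custs in assignment.items() if custs}
    routes = {}                                # second echelon
    for s, custs in assignment.items():
        route, pos, remaining = [], satellites[s], custs[:]
        while remaining:
            nxt = min(remaining, key=lambda c: dist(pos, c))
            remaining.remove(nxt)
            route.append(nxt)
            pos = nxt
        routes[s] = route
    return first_echelon, routes

print(two_echelon_sketch((0, 0), [(5, 0), (0, 5)],
                         [(6, 1), (4, -1), (1, 6), (-1, 4)]))
```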

Journal ArticleDOI
TL;DR: In this paper, the authors consider continuous-time average consensus dynamics in which the agents' states are communicated through uniform quantizers and prove that solutions to the resulting system are defined in the Krasowskii sense and converge to conditions of practical consensus.
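A discrete-time caricature of this setup (the paper analyzes the continuous-time system and its Krasowskii solutions; the forward-Euler step below is only illustrative) shows the typical behavior: the average of the states is preserved, and the states converge to a band whose width scales with the quantizer step rather than to exact consensus.

```python
import numpy as np

def quantize(x, delta=0.1):
    return delta * np.floor(x / delta + 0.5)   # uniform quantizer

# forward-Euler discretization of xdot = -L q(x) on a 4-agent line graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian

x = np.array([3.0, -1.0, 0.5, 2.0])
dt = 0.01
for _ in range(20000):
    x = x - dt * (L @ quantize(x))

# states agree only up to ~delta ("practical consensus"); the average is preserved
print(x, "average:", x.mean())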

Journal ArticleDOI
TL;DR: In this article, the role of poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS) in the degradation of polymer:PCBM ([6,6]-phenyl C61-butyric acid methyl ester) solar cells was elucidated.

Journal ArticleDOI
TL;DR: The purpose of this paper is to showcase successful VBM applications and to make the case that VBM does provide valuable information in real-world applications when used appropriately and without unrealistic expectations.
Abstract: Structural health monitoring (SHM) is a relatively new paradigm for civil infrastructure stakeholders including operators, consultants and contractors, which has in the last two decades witnessed an acceleration of academic and applied research in related areas such as sensing technology, system identification, data mining and condition assessment. SHM has a wide range of applications including, but not limited to, diagnostic and prognostic capabilities. However, when it comes to practical applications, stakeholders usually need answers to basic and pragmatic questions about in-service performance, maintenance and management of a structure which the technological advances are slow to address. Typical of the mismatch between expectation and capability is the topic of vibration-based monitoring (VBM), which is a subset of SHM. On the one hand there is abundant reporting of exercises using vibration data to locate damage in highly controlled laboratory conditions or in numerical simulations, while the real test of a reliable and cost-effective technology is operation on a commercial basis. Such commercial applications are hard to identify, with the vast majority of implementations dealing with data collection and checking against parameter limits. In addition, there persists an unhelpful association between VBM and ‘damage detection’ among some civil infrastructure stakeholders in the UK and North America, due to unsuccessful transfer of technology from the laboratory to the field, and this has resulted in unhealthy industry scepticism which hinders acceptance of successful technologies. Hence the purpose of this paper is to showcase successful VBM applications and to make the case that VBM does provide valuable information in real-world applications when used appropriately and without unrealistic expectations.

Journal ArticleDOI
TL;DR: Non-resorbable and resorbable commercially available membranes are described, based on expanded polytetrafluoroethylene, poly(lactic acid), poly(glycolic acid) and their copolymers.
Abstract: In this review, different barrier membranes are described for guided bone regeneration (GBR), a useful surgical technique to enhance bone regeneration in damaged alveolar sites before placing implants and fitting other dental appliances. The GBR procedure encourages bone regeneration through cellular exclusion and prevents the invasion of epithelial and connective tissues that would otherwise grow at the defective site instead of bone tissue. The barrier membrane should satisfy various properties, such as biocompatibility, non-immunogenicity, non-toxicity, and a degradation rate slow enough to permit mechanical support during bone formation. Other characteristics such as tissue integration, nutrient transfer, space maintenance and manageability are also of interest. In this review, various non-resorbable and resorbable commercially available membranes are described, based on expanded polytetrafluoroethylene, poly(lactic acid), poly(glycolic acid) and their copolymers. The polyester-based membranes are biodegradable, permit a single-stage procedure, and have higher manageability than non-resorbable membranes; however, they have shown poor biocompatibility. In contrast, membranes based on natural materials, such as collagen, are biocompatible but are characterized by poor mechanical properties and stability due to their early degradation. Moreover, new approaches are described, such as the use of multi-layered, graft-copolymer-based and composite membranes containing osteoconductive ceramic fillers as alternatives to conventional membranes.

Journal ArticleDOI
TL;DR: This review article traces the evolution of glass-based scaffolds for bone tissue engineering, highlighting the promise of multifunctional systems able to combine bone regeneration and drug release, the increasing role of nondestructive advanced imaging techniques, such as X-ray microtomography, for scaffold investigation, and the potential of incorporating stem cells into scaffolds.
Abstract: Biomaterials used in regenerative medicine are often designed to act as 3D porous templates (scaffolds) able to support and promote the growth and repair of natural tissues. Some types of glasses have a great potential for making bone tissue engineering scaffolds, as they can bond to host bone, stimulate bone cells toward osteogenesis, and resorb at the same time as the bone is repaired. This review article traces the evolution of glass-based scaffolds for bone tissue engineering; specifically, the features, limitations, and advantages of the different types of glass-derived scaffolds proposed in the literature (macroporous glass-ceramic, sol-gel glass, composite, graded, hybrid, and hierarchical implants) are critically examined, discussed, and compared. Future directions for the research are also suggested, highlighting the promise of multifunctional systems able to combine bone regeneration and drug release abilities, the increasing role of nondestructive advanced imaging techniques, such as X-ray microtomography, for scaffold investigation, and the potential of incorporating stem cells into scaffolds.

Journal ArticleDOI
TL;DR: This study reveals a general mechanism to map nanoscale properties to the macroscale and provides a potent design strategy toward novel fiber and bulk nanomaterials through hierarchical structures.
Abstract: Silk is an exceptionally strong, extensible, and tough material made from simple protein building blocks. The molecular structure of dragline spider silk repeat units consists of semiamorphous and nanocrystalline β-sheet protein domains. Here we show by a series of computational experiments how the nanoscale properties of silk repeat units are scaled up to create macroscopic silk fibers with outstanding mechanical properties despite the presence of cavities, tears, and cracks. We demonstrate that the geometric confinement of silk fibrils to diameters of 50 ± 30 nm is critical to facilitate a powerful mechanism by which hundreds of thousands of protein domains synergistically resist deformation and failure to provide enhanced strength, extensibility, and toughness at the macroscale, closely matching experimentally measured mechanical properties. Through this mechanism silk fibers exploit the full potential of the nanoscale building blocks, regardless of the details of microscopic loading conditions and despite the presence of large defects. Experimental results confirm that silk fibers are composed of silk fibril bundles with diameters in the range of 20-150 nm, in agreement with our predicted length scale. Our study reveals a general mechanism to map nanoscale properties to the macroscale and provides a potent design strategy toward novel fiber and bulk nanomaterials through hierarchical structures.

Journal ArticleDOI
TL;DR: In this paper, PET fabrics were coated with colloidal silica nanoparticles using layer-by-layer assembly; five bilayers of positively and negatively charged colloidal particles were used to improve the flame retardant properties of the textile fabric.

Journal ArticleDOI
TL;DR: Recent progress in the understanding of basic phenomena related to the use of SiC and SiC composites in fusion applications is presented in this article, including both fundamental radiation effects in SiC and engineering issues such as joining and general materials properties.

Journal ArticleDOI
TL;DR: In this paper, the authors review the performed or proposed attempts to detect the Lense-Thirring effect affecting the orbital motions of natural and artificial bodies in the gravitational fields of the Sun, Earth, Mars and Jupiter.
Abstract: Recent years have seen increasing efforts to directly measure some aspects of the general relativistic gravitomagnetic interaction in several astronomical scenarios in the solar system. After briefly overviewing the concept of gravitomagnetism from a theoretical point of view, we review the performed or proposed attempts to detect the Lense-Thirring effect affecting the orbital motions of natural and artificial bodies in the gravitational fields of the Sun, Earth, Mars and Jupiter. In particular, we will focus on the evaluation of the impact of several sources of systematic uncertainties of dynamical origin to realistically elucidate the present and future perspectives in directly measuring such an elusive relativistic effect.
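The Lense-Thirring nodal precession rate is a standard closed-form expression, and plugging in approximate LAGEOS parameters reproduces the well-known ~31 mas/yr figure for that satellite; the sketch below uses textbook constants, with the orbital parameters only approximate.

```python
import math

# Lense-Thirring nodal precession of a satellite orbit (standard formula):
#   dOmega/dt = 2 G J / (c^2 a^3 (1 - e^2)^(3/2))
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
J = 5.86e33            # Earth's angular momentum, kg m^2/s (approximate)
a = 12.27e6            # LAGEOS semimajor axis, m (approximate)
e = 0.0045             # LAGEOS eccentricity (approximate)

rate = 2 * G * J / (c**2 * a**3 * (1 - e**2)**1.5)           # rad/s
mas_per_year = rate * 3.156e7 * (180 / math.pi) * 3600e3     # milliarcseconds/yr
print(f"{mas_per_year:.0f} mas/yr")   # ~31 mas/yr, the well-known LAGEOS value
```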

Journal ArticleDOI
01 Jul 2011-Energy
TL;DR: In this paper, a multi-scale model of storage tanks is proposed to analyze the operation of storage systems during the heating season and to predict their effects on the primary energy consumption and cash flows.

Journal ArticleDOI
TL;DR: In this paper, a new multiscale modeling technique is proposed, which relies on a recently introduced measure-theoretic approach, which allows one to manage the microscopic and the macroscopic scale under a unique framework.
Abstract: In this paper a new multiscale modeling technique is proposed. It relies on a recently introduced measure-theoretic approach, which allows one to manage the microscopic and the macroscopic scale under a unique framework. In the resulting coupled model the two scales coexist and share information. This way it is possible to perform numerical simulations in which the trajectories and the density of the particles affect each other. Crowd dynamics is the motivating application throughout the paper.
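A toy one-dimensional version conveys the coupling idea (this is not the paper's measure-theoretic scheme): the particle positions induce a density via a histogram, and every particle's speed depends on the local density, so trajectories and density affect each other at every step.

```python
import numpy as np

def step(x, dt=0.05, v_max=1.0, rho_max=60.0, bins=20, domain=(0.0, 10.0)):
    """One update of a toy 1D crowd: the macroscopic density (histogram of
    microscopic positions) feeds back into each particle's speed."""
    hist, edges = np.histogram(x, bins=bins, range=domain)
    width = (domain[1] - domain[0]) / bins
    rho = hist / width                                     # macroscopic density
    idx = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
    v = v_max * np.maximum(1.0 - rho[idx] / rho_max, 0.0)  # crowding slows walkers
    return x + dt * v                                      # microscopic update

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0, size=100)    # a crowd packed near the entrance
for _ in range(200):
    x = step(x)
print("mean position:", x.mean(), "spread:", x.std())
```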

Journal ArticleDOI
TL;DR: In this paper, a dispersion of graphite in N-methylpyrrolidone was used to synthesize poly(N-isopropylacrylamide) nanocomposite hydrogels containing graphene.
Abstract: Frontal polymerization has been successfully used to synthesize poly(N-isopropylacrylamide) nanocomposite hydrogels containing graphene. The graphene was obtained directly by ultrasound treatment of a dispersion of graphite in N-methylpyrrolidone. The dispersion, with a concentration of 2.21 g L−1, was characterized by TEM analysis and mixed with suitable amounts of N-isopropylacrylamide for the synthesis of graphene-containing nanocomposite polymer hydrogels. The nanocomposite hydrogels were analyzed by SEM and Raman spectroscopy, and their swelling and rheological properties were investigated. It was found that graphene strongly influences the swelling ratio, dramatically increasing it, even if present in small amounts. Finally, the rheological properties of the hydrogels were correlated with the graphene content: G′ modulus and complex viscosity were found to increase with increasing nanofiller concentration, thus indicating the occurrence of good interactions between the two phases. Nevertheless, at a high concentration (i.e., 0.13 wt.%), graphene showed a lubrication effect, lowering the rheological parameters and approaching the pseudoplastic behaviour of the unfilled material.

Journal ArticleDOI
TL;DR: A comparison of the merits obtained from the different models shows that Phytoremediation emerges as the most sustainable WWT technology for small cheese factories and that the ANP method, which allows more sophisticated analyses to be made, succeeds in offering better results.
Abstract: Multicriteria analyses (MCAs) are used to make comparative assessments of alternative projects or heterogeneous measures and allow several criteria to be taken into account simultaneously in a complex situation. The paper shows the application of different MCA techniques to a real decision problem concerning the choice of the most sustainable wastewater treatment (WWT) technology, namely Anaerobic digestion, Phytoremediation and Composting, for small cheese factories. In particular, the Analytic Hierarchy Process (AHP) and its recent implementation, the Analytic Network Process (ANP), have been considered for prioritizing the different technologies. The models enable all the elements of the decision process to be considered, namely environmental aspects, technological factors and economic costs, and to be compared to find the best alternative. The AHP and ANP techniques are applied through specific software packages with user-friendly interfaces called Expertchoice and Superdecision, respectively. A comparison of the merits obtained from the different models shows that Phytoremediation emerges as the most sustainable WWT technology for small cheese factories and that the ANP method, which allows more sophisticated analyses to be made, succeeds in offering better results.
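The computational core of AHP is compact: priorities are the principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check on the judgments. The matrix entries below are invented for illustration, not the paper's expert judgments.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # invented pairwise judgments among, e.g.,
              [1/3, 1.0, 2.0],     # environmental, technological and
              [1/5, 1/2, 1.0]])    # economic criteria

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                        # AHP priority vector

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
RI = 0.58                           # Saaty's random index for n = 3
print("priorities:", w.round(3), "consistency ratio:", round(CI / RI, 3))
# a consistency ratio below 0.1 is conventionally considered acceptable
```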

Journal ArticleDOI
TL;DR: In this paper, the authors analyze and compare the field applications of zero-valent iron other than permeable reactive barriers, whose number has grown rapidly over the last 10 years and currently stands at 112.
Abstract: In the last 10 years, the number of field applications of zero-valent iron other than permeable reactive barriers has grown rapidly and currently stands at 112. This study analyzes and compares these field applications. By using statistical analysis, especially ANOVA and principal component analysis, this study shows that chlorinated solvent contamination can be treated efficiently by using zero-valent iron material singly or in association with other technologies. In the analyzed sample of case studies, the association with microbial dechlorination significantly increased the performance of nanoscale iron. This is likely due to the synergistic effect between the two processes. Millimetric iron was always used in association with source zone containment; therefore, it is not possible to distinguish the contributions of the two techniques. The comparison also shows that catalyst addition does not seem to dramatically improve treatment efficiency and that such improvement is not statistically significant. Finally, the injection technology is correlated with the type of iron and with the soil permeability.
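A minimal version of the kind of ANOVA comparison used here, run on synthetic data (the removal efficiencies below are invented for illustration, not drawn from the study's 112 field cases):

```python
import numpy as np
from scipy.stats import f_oneway

# does treatment efficiency differ when nanoscale ZVI is used alone vs.
# combined with microbial dechlorination? (synthetic data)
rng = np.random.default_rng(0)
zvi_alone    = rng.normal(0.70, 0.10, 20)   # fraction of contaminant removed
zvi_plus_bio = rng.normal(0.85, 0.10, 20)

F, p = f_oneway(zvi_alone, zvi_plus_bio)
print(f"F = {F:.2f}, p = {p:.4f}")          # a small p-value -> significant difference
```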

Journal ArticleDOI
TL;DR: The hemodynamics within the aorta of five healthy humans were investigated; group analysis suggested that aortic helical blood flow is an emerging behavior common to normal individuals, and the results indicate that helical flow might be caused by natural optimization of fluid transport processes in the cardiovascular system.
Abstract: The hemodynamics within the aorta of five healthy humans were investigated to gain insight into the complex helical flow patterns that arise from the existence of asymmetries in the aortic region. The adopted approach is aimed at (1) overcoming the relative paucity of quantitative data regarding helical blood flow dynamics in the human aorta and (2) identifying common characteristics in physiological aortic flow topology, in terms of its helical content. Four-dimensional phase-contrast magnetic resonance imaging (4D PC MRI) was combined with algorithms for the calculation of advanced fluid dynamics in this study. These algorithms allowed us to obtain a 4D representation of intra-aortic flow fields and to quantify the aortic helical flow. For our purposes, helicity was used as a measure of the alignment of the velocity and the vorticity. There were two key findings of our study: (1) intra-individual analysis revealed a statistically significant difference in the helical content at different phases of systole and (2) group analysis suggested that aortic helical blood flow dynamics is an emerging behavior that is common to normal individuals. Our results also suggest that helical flow might be caused by natural optimization of fluid transport processes in the cardiovascular system, aimed at obtaining efficient perfusion. The approach here applied to assess in vivo helical blood flow could be the starting point to elucidate the role played by helicity in the generation and decay of rotating flows in the thoracic aorta.
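Helicity density is simply the inner product of velocity and vorticity, h = v · (∇×v), the alignment measure the paper applies to 4D PC-MRI velocity fields. A sketch on a synthetic grid follows; for the simple helical field used here, h ≈ 1 everywhere (up to finite-difference error at the boundaries).

```python
import numpy as np

def helicity_density(u, v, w, dx, dy, dz):
    """h = v . (curl v) on a uniform grid, with vorticity from central
    differences (np.gradient returns [d/dx, d/dy, d/dz] for a 3D array)."""
    du = np.gradient(u, dx, dy, dz)
    dv = np.gradient(v, dx, dy, dz)
    dw = np.gradient(w, dx, dy, dz)
    wx = dw[1] - dv[2]     # dW/dy - dV/dz
    wy = du[2] - dw[0]     # dU/dz - dW/dx
    wz = dv[0] - du[1]     # dV/dx - dU/dy
    return u * wx + v * wy + w * wz

n = 32
ax = np.linspace(0, 2 * np.pi, n)
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
u, v, w = np.sin(Z), np.cos(Z), np.ones_like(Z)   # simple helical (Beltrami-like) field
d = ax[1] - ax[0]
h = helicity_density(u, v, w, d, d, d)
print(h.mean())    # ~1: velocity and vorticity are everywhere aligned
```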

Proceedings ArticleDOI
22 May 2011
TL;DR: The speaker verification score for a pair of i-vectors representing a trial is computed with a functional form derived from the successful PLDA generative model, which provides up to 40% relative improvement on the NIST SRE 2010 evaluation task.
Abstract: Recently, i-vector extraction and Probabilistic Linear Discriminant Analysis (PLDA) have proven to provide state-of-the-art speaker verification performance. In this paper, the speaker verification score for a pair of i-vectors representing a trial is computed with a functional form derived from the successful PLDA generative model. In our case, however, parameters of this function are estimated based on a discriminative training criterion. We propose to use the objective function to directly address the task in speaker verification: discrimination between same-speaker and different-speaker trials. Compared with a baseline which uses a generatively trained PLDA model, discriminative training provides up to 40% relative improvement on the NIST SRE 2010 evaluation task.
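The functional form described in the abstract is a quadratic function of the i-vector pair whose parameters are trained discriminatively on same-speaker/different-speaker trials. The sketch below writes down that form and a logistic trial loss; parameter values are random placeholders and the training loop is omitted.

```python
import numpy as np

def pair_score(x1, x2, Lmat, Gmat, c, k):
    """PLDA-derived trial score for an i-vector pair:
    s = x1'L x2 + x2'L x1 + x1'G x1 + x2'G x2 + c'(x1 + x2) + k,
    with (L, G, c, k) learned discriminatively rather than generatively."""
    return (x1 @ Lmat @ x2 + x2 @ Lmat @ x1
            + x1 @ Gmat @ x1 + x2 @ Gmat @ x2
            + c @ (x1 + x2) + k)

def logistic_trial_loss(score, same_speaker):
    """Logistic loss on a labeled trial -- the discriminative criterion:
    push scores up for same-speaker trials, down for different-speaker ones."""
    y = 1.0 if same_speaker else -1.0
    return np.log1p(np.exp(-y * score))

d = 5                                   # toy i-vector dimension
rng = np.random.default_rng(1)
Lmat, Gmat = rng.normal(size=(d, d)), rng.normal(size=(d, d))
c, k = rng.normal(size=d), 0.0
x1, x2 = rng.normal(size=d), rng.normal(size=d)
print(pair_score(x1, x2, Lmat, Gmat, c, k))
```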

Journal ArticleDOI
TL;DR: Canal modifications appear to be significantly reduced when a glide path is established before using the new WaveOne nickel-titanium single-file system.