
Showing papers by "Fondazione Bruno Kessler" published in 2012


Journal ArticleDOI
TL;DR: The advanced interferometer network will herald a new era in observational astronomy, and there is a very strong science case to go beyond the advanced detector network and build detectors that operate in a frequency range from 1 Hz to 10 kHz, with sensitivity a factor 10 better in amplitude as discussed by the authors.
Abstract: The advanced interferometer network will herald a new era in observational astronomy. There is a very strong science case to go beyond the advanced detector network and build detectors that operate in a frequency range from 1 Hz to 10 kHz, with sensitivity a factor 10 better in amplitude. Such detectors will be able to probe a range of topics in nuclear physics, astronomy, cosmology and fundamental physics, providing insights into many unsolved problems in these areas.

441 citations


Journal ArticleDOI
08 Aug 2012-PLOS ONE
TL;DR: It is shown that the Confusion Entropy, a measure of performance in multiclass problems, has a strong (monotone) relation with the multiclass generalization of a classical metric, the Matthews Correlation Coefficient.
Abstract: We show that the Confusion Entropy, a measure of performance in multiclass problems, has a strong (monotone) relation with the multiclass generalization of a classical metric, the Matthews Correlation Coefficient. Analytical results are provided for the limit cases of general no-information (n-face dice rolling) and of binary classification. Computational evidence supports the claim in the general case.

310 citations
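The multiclass generalization of the MCC referred to in the entry above is Gorodkin's R_K statistic, which can be computed directly from a confusion matrix. A minimal sketch (the function name and test matrices are illustrative, not taken from the paper):

```python
import math

def multiclass_mcc(conf):
    """Multiclass Matthews Correlation Coefficient (Gorodkin's R_K).

    conf is a K x K confusion matrix: conf[i][j] counts samples of
    true class i predicted as class j. Returns a value in [-1, 1];
    0.0 is returned in the degenerate single-class cases.
    """
    k = len(conf)
    s = sum(sum(row) for row in conf)          # total samples
    c = sum(conf[i][i] for i in range(k))      # correct predictions
    t = [sum(row) for row in conf]             # true counts per class
    p = [sum(conf[i][j] for i in range(k)) for j in range(k)]  # predicted counts
    num = c * s - sum(pj * tj for pj, tj in zip(p, t))
    den = math.sqrt(s * s - sum(pj * pj for pj in p)) * \
          math.sqrt(s * s - sum(tj * tj for tj in t))
    return num / den if den else 0.0
```

For K = 2 this reduces to the familiar binary MCC; the paper's claim is that Confusion Entropy ranks classifiers consistently (monotonically) with this quantity.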


Journal ArticleDOI
TL;DR: It is demonstrated that the simultaneous addition of Fe2+ and Fe3+ ions during apatite nucleation under controlled synthesis conditions induces intrinsic magnetization in the final product while minimizing the formation of magnetite as a secondary phase; this potentially opens new perspectives for biodevices aimed at bone regeneration and for anti-cancer therapies based on hyperthermia.

235 citations


Journal ArticleDOI
TL;DR: A suite of GUI programs written in MATLAB for advanced data collection and analysis of full-field transmission X-ray microscopy data including mosaic imaging, tomography and XANES imaging is presented.
Abstract: Transmission X-ray microscopy (TXM) has been well recognized as a powerful tool for non-destructive investigation of the three-dimensional inner structure of a sample with spatial resolution down to a few tens of nanometers, especially when combined with synchrotron radiation sources. Recent developments of this technique have presented a need for new tools for both system control and data analysis. Here a software package developed in MATLAB for script command generation and analysis of TXM data is presented. The first toolkit, the script generator, allows automating complex experimental tasks which involve up to several thousand motor movements. The second package was designed to accomplish computationally intense tasks such as data processing of mosaic and mosaic tomography datasets; dual-energy contrast imaging, where data are recorded above and below a specific X-ray absorption edge; and TXM X-ray absorption near-edge structure imaging datasets. Furthermore, analytical and iterative tomography reconstruction algorithms were implemented. The compiled software package is freely available.

209 citations
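Of the processing tasks listed in the entry above, dual-energy contrast imaging is the simplest to illustrate: a per-pixel log-ratio of the transmission recorded below and above an element's absorption edge cancels the non-edge attenuation. A minimal Python sketch (the toolkit itself is MATLAB; the function name and values here are illustrative):

```python
import math

def dual_energy_map(t_below, t_above):
    """Per-pixel absorption-edge contrast from two transmission images.

    t_below / t_above are 2-D lists of flat-field-corrected transmission
    values (I/I0) recorded below and above the target element's X-ray
    absorption edge. The log-ratio is positive where the element is
    present and ~0 where attenuation is the same at both energies.
    """
    return [[math.log(b / a) for b, a in zip(row_b, row_a)]
            for row_b, row_a in zip(t_below, t_above)]
```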


Proceedings ArticleDOI
05 Sep 2012
TL;DR: This work assesses the performance of different subsets of structural network features, in particular those concerned with ego-networks, in predicting the Big-5 personality traits, and focuses on social networks derived from real-life data gathered through smartphones.
Abstract: In this work, we investigate the relationships between social network structure and personality; we assess the performances of different subsets of structural network features, and in particular those concerned with ego-networks, in predicting the Big-5 personality traits. In addition to traditional survey-based data, this work focuses on social networks derived from real-life data gathered through smartphones. Besides showing that the latter are superior to the former for the task at hand, our results provide a fine-grained analysis of the contribution the various feature sets are able to provide to personality classification, along with an assessment of the relative merits of the various networks exploited.

191 citations


Journal ArticleDOI
TL;DR: The target application for this sensor is time-resolved imaging, in particular fluorescence lifetime imaging microscopy and 3D imaging, and the characterization shows the suitability of the proposed sensor technology for these applications.
Abstract: We report on the design and characterization of a novel time-resolved image sensor fabricated in a 130 nm CMOS process. Each pixel within the 32×32 pixel array contains a low-noise single-photon detector and a high-precision time-to-digital converter (TDC). The 10-bit TDC exhibits a timing resolution of 119 ps with a timing uniformity across the entire array of less than 2 LSBs. The differential non-linearity (DNL) and integral non-linearity (INL) were measured at ±0.4 and ±1.2 LSBs, respectively. The pixel array was fabricated with a pitch of 50 μm in both directions and with a total TDC area of less than 2000 μm². The target application for this sensor is time-resolved imaging, in particular fluorescence lifetime imaging microscopy and 3D imaging. The characterization shows the suitability of the proposed sensor technology for these applications.

170 citations
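The DNL and INL figures quoted above are conventionally obtained from a code-density test: feed the converter uniformly distributed events, histogram the output codes, and compare each bin to the ideal flat histogram. A minimal sketch (not the authors' measurement code):

```python
def dnl_inl(histogram):
    """DNL and INL of a TDC (or ADC) from a code-density test.

    histogram[i] is how often code i occurred under a uniformly
    distributed input; an ideal converter yields a flat histogram.
    DNL[i] is bin i's deviation from the ideal bin width, in LSB;
    INL[i] is the running sum of the DNL.
    """
    ideal = sum(histogram) / len(histogram)
    dnl = [h / ideal - 1.0 for h in histogram]
    inl, acc = [], 0.0
    for d in dnl:
        acc += d
        inl.append(acc)
    return dnl, inl
```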


Journal ArticleDOI
J. Albert, Xiao-Chao Fang, D.S. Smith, Clara Nellist, +291 more (45 institutions)
TL;DR: In this article, the ATLAS Collaboration will upgrade its semiconductor pixel tracking detector with a new Insertable B-layer (IBL) between the existing pixel detector and the vacuum pipe of the Large Hadron Collider.
Abstract: The ATLAS Collaboration will upgrade its semiconductor pixel tracking detector with a new Insertable B-layer (IBL) between the existing pixel detector and the vacuum pipe of the Large Hadron Collider. The extreme operating conditions at this location have necessitated the development of new radiation hard pixel sensor technologies and a new front-end readout chip, called the FE-I4. Planar pixel sensors and 3D pixel sensors have been investigated to equip this new pixel layer, and prototype modules using the FE-I4A have been fabricated and characterized using 120 GeV pions at the CERN SPS and 4 GeV positrons at DESY, before and after module irradiation. Beam test results are presented, including charge collection efficiency, tracking efficiency and charge sharing.

154 citations


Book ChapterDOI
07 Jul 2012
TL;DR: This paper generalizes IC3 from SAT to Satisfiability Modulo Theories (SMT), thus enabling the direct analysis of programs after an encoding in the form of symbolic transition systems, and adapts the "linear" search style of IC3 to a tree-like search.
Abstract: IC3 is a recently proposed verification technique for the analysis of sequential circuits. IC3 incrementally overapproximates the state space, refuting potential violations to the property at hand by constructing relative inductive blocking clauses. The algorithm relies on aggressive use of Boolean satisfiability (SAT) techniques, and has demonstrated impressive effectiveness. In this paper, we present the first investigation of IC3 in the setting of software verification. We first generalize it from SAT to Satisfiability Modulo Theories (SMT), thus enabling the direct analysis of programs after an encoding in the form of symbolic transition systems. Second, to leverage the Control-Flow Graph (CFG) of the program being analyzed, we adapt the "linear" search style of IC3 to a tree-like search. Third, we cast this approach in the framework of lazy abstraction with interpolants, and optimize it by using interpolants extracted from proofs, when useful. The experimental results demonstrate the great potential of IC3, and the effectiveness of the proposed optimizations.

153 citations


Journal ArticleDOI
TL;DR: This paper proposes a technique that classifies features extracted with EAPs computed on both optical and LiDAR images, leading to a fusion of the spectral, spatial and elevation data.
Abstract: Extended Attribute Profiles (EAPs), which are obtained by applying morphological attribute filters to an image in a multilevel architecture, can be used for the characterization of the spatial characteristics of objects in a scene. EAPs have proved to be discriminant features when considered for thematic classification in remote sensing applications, especially when dealing with very high resolution images. Altimeter data (such as LiDAR) can provide important information which, being complementary to the spectral one, can be valuable for a better characterization of the surveyed scene. In this paper, we propose a technique performing a classification of the features extracted with EAPs computed on both optical and LiDAR images, leading to a fusion of the spectral, spatial and elevation data. The experiments were carried out on LiDAR data along with a hyperspectral and a multispectral image acquired over a rural and an urban area of the city of Trento (Italy), respectively. The classification accuracies obtained point out the effectiveness of the features extracted by EAPs on both optical and LiDAR data for classification.

144 citations


Journal ArticleDOI
TL;DR: In this paper, the insertion of a new layer as close as 3.4 cm to the proton beams, inside the existing pixel layers of the ATLAS experiment, is discussed; the detector's proximity to the interaction point will require new radiation-hard technologies for both sensors and front-end electronics.
Abstract: 3D silicon sensors, where electrodes penetrate the silicon substrate fully or partially, have successfully been fabricated in different processing facilities in Europe and USA. The key to 3D fabrication is the use of plasma micro-machining to etch narrow deep vertical openings, allowing dopants to be diffused in and form the electrodes of p-i-n junctions. Similar openings can be used at the sensor's edge to reduce the perimeter's dead volume to as low as ∼4 μm. Since 2009, four industrial partners of the 3D ATLAS R&D Collaboration have pursued a joint effort aimed at one common design and compatible processing strategy for the production of 3D sensors for the LHC Upgrade, and in particular for the ATLAS pixel Insertable B-Layer (IBL). In this project, aimed for installation in 2013, a new layer will be inserted as close as 3.4 cm from the proton beams inside the existing pixel layers of the ATLAS experiment. The detector's proximity to the interaction point will therefore require new radiation-hard technologies for both sensors and front-end electronics. The latter, called FE-I4, is processed at IBM and is the biggest front-end chip of this kind ever designed, with a surface of ∼4 cm². The performance of 3D devices from several wafers was evaluated before and after bump-bonding. Key design aspects, device fabrication plans and quality assurance tests during the 3D sensors prototyping phase are discussed in this paper.

113 citations


Journal ArticleDOI
TL;DR: It is demonstrated that the small pixel fill factor of present complementary metal-oxide-semiconductor (CMOS) imagers, which decreases their light sensitivity, can be increased up to 100% by replacing silicon photodiodes with an organic photoactive layer deposited with a simple low-cost spray-coating process.
Abstract: The solution-processability of organic photodetectors allows a straightforward combination with other materials, including inorganic ones, without increasing cost and process complexity significantly compared with conventional crystalline semiconductors. Although the optoelectronic performance of these organic devices does not outmatch their inorganic counterparts, there are certain applications exploiting the benefit of the solution-processability. Here we demonstrate that the small pixel fill factor of present complementary metal-oxide-semiconductor (CMOS) imagers, which decreases their light sensitivity, can be increased up to 100% by replacing silicon photodiodes with an organic photoactive layer deposited with a simple low-cost spray-coating process. By performing a full optoelectronic characterization of this first solution-processable hybrid CMOS imager, including the first reported observation of different noise types in organic photodiodes, we demonstrate the suitability of this novel device for imaging. Furthermore, by monolithically integrating different organic materials onto the chip, we show the cost-effective portability of the hybrid concept to different wavelength regions.

Book ChapterDOI
12 Mar 2012
TL;DR: This paper summarizes the audio part of the 2011 community-based Signal Separation Evaluation Campaign (SiSEC2011), including datasets recorded in noisy or dynamic environments and a subset of the SiSEC2010 datasets.
Abstract: This paper summarizes the audio part of the 2011 community-based Signal Separation Evaluation Campaign (SiSEC2011). Four speech and music datasets were contributed, including datasets recorded in noisy or dynamic environments and a subset of the SiSEC2010 datasets. The participants addressed one or more tasks out of four source separation tasks, and the results for each task were evaluated using different objective performance criteria. We provide an overview of the audio datasets, tasks and criteria. We also report the results achieved with the submitted systems, and discuss organization strategies for future campaigns.

Journal ArticleDOI
TL;DR: The paper defines the syntax and the semantics of CKR and shows that concept satisfiability and subsumption are decidable with a complexity upper bound of 2NExpTime; it also provides a sound and complete natural deduction calculus that serves to characterize the propagation of knowledge between contexts.

Proceedings ArticleDOI
15 Jul 2012
TL;DR: This work proposes a novel approach that combines model-based and combinatorial testing in order to generate executable and effective test cases from a model, and introduces a post-optimization algorithm that can guarantee the combinatorial criterion of choice on the whole set of test paths extracted from the model.
Abstract: Model-based testing relies on the assumption that effective adequacy criteria can be defined in terms of model coverage achieved by a set of test paths. However, such test paths are only abstract test cases and input test data must be specified to make them concrete. We propose a novel approach that combines model-based and combinatorial testing in order to generate executable and effective test cases from a model. Our approach starts from a finite state model and applies model-based testing to generate test paths that represent sequences of events to be executed against the system under test. Such paths are transformed to classification trees, enriched with domain input specifications such as data types and partitions. Finally, executable test cases are generated from those trees using t-way combinatorial criteria. While test cases that satisfy a combinatorial criterion can be generated for each individual test path obtained from the model, we introduce a post-optimization algorithm that can guarantee the combinatorial criterion of choice on the whole set of test paths extracted from the model. The resulting test suite is smaller, but it still satisfies the same adequacy criterion. We developed a tool and used it to evaluate our approach on 6 subject systems of various types and sizes, to study the effectiveness of the generated test suites, the reduction achieved by the post-optimization algorithm, as well as the effort required to produce them.
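The post-optimization idea above — shrink the suite while preserving the combinatorial criterion — can be sketched for the 2-way (pairwise) case: a test case is redundant if every parameter-value pair it covers is covered elsewhere. This is an illustrative greedy reduction, not the authors' algorithm:

```python
from itertools import combinations

def pairs(test):
    """All (parameter-index, value) pairs exercised by one test case."""
    return {((i, test[i]), (j, test[j]))
            for i, j in combinations(range(len(test)), 2)}

def reduce_suite(suite):
    """Greedily drop test cases whose 2-way pairs are covered elsewhere;
    the reduced suite covers exactly the pairs of the original one."""
    target = set().union(*(pairs(t) for t in suite))
    kept, covered = [], set()
    for t in suite:
        if not pairs(t) <= covered:   # contributes at least one new pair
            kept.append(t)
            covered |= pairs(t)
    assert covered == target          # pairwise criterion preserved
    return kept
```

The greedy pass gives no optimality guarantee, but it preserves the adequacy criterion by construction, which is the property emphasized in the abstract.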

Journal ArticleDOI
TL;DR: 3D and 2D maps at 30 nm resolution could be obtained; they show heterogeneities in the pore structure and chemical composition of a catalyst particle of about 20 μm.
Abstract: A closer look at catalysis: In situ hard X‐ray nanotomography has been developed (see picture) as a method to investigate an individual iron‐based Fischer–Tropsch‐to‐Olefins (FTO) catalyst particle at elevated temperatures and pressures. 3D and 2D maps of 30 nm resolution could be obtained and show heterogeneities in the pore structure and chemical composition of the catalyst particle of about 20 μm.

Journal ArticleDOI
TL;DR: This study aims to characterize the pore-forming activity of the protein, starting from its monomeric form, and to formulate a model for pore formation at different conductance levels.

Journal ArticleDOI
07 Nov 2012-PLOS ONE
TL;DR: An assessment of gene expression signatures for predicting CRC prognosis showed that five signatures were significantly associated with prognosis and provided reasonable prediction accuracy in their own training datasets, but all signatures showed low reproducibility in independent data.
Abstract: Introduction: The traditional staging system is inadequate to identify those patients with stage II colorectal cancer (CRC) at high risk of recurrence or with stage III CRC at low risk. A number of gene expression signatures to predict CRC prognosis have been proposed, but none is routinely used in the clinic. The aim of this work was to assess the prediction ability and potential clinical usefulness of these signatures in a series of independent datasets.

Book ChapterDOI
03 Sep 2012
TL;DR: A holistic yet lightweight approach is presented that leverages a combination of static analysis and minimalistic emulation to apply supervised learning techniques to detecting malicious web pages associated with drive-by-download, phishing, injection, and malware distribution.
Abstract: Malicious web pages are among the major security threats on the Web. Most existing techniques for detecting malicious web pages focus on specific attacks. Unfortunately, attacks are getting more complex, and attackers use blended techniques to evade existing countermeasures. In this paper, we present a holistic yet lightweight approach, called BINSPECT, that leverages a combination of static analysis and minimalistic emulation to apply supervised learning techniques to detecting malicious web pages associated with drive-by-download, phishing, injection, and malware distribution, introducing new features that can effectively discriminate malicious and benign web pages. A large-scale experimental evaluation of BINSPECT achieved accuracy above 97% with low false signals. Moreover, the performance overhead of BINSPECT is in the range of 3-5 seconds to analyze a single web page, suggesting the suitability of our approach for real-life deployment.

Proceedings ArticleDOI
05 Sep 2012
TL;DR: This paper presents a novel, fully formal contract framework, which relies on an expressive property specification language, conceived for the formalization of embedded system requirements, and is supported by a verification engine based on automated SMT techniques.
Abstract: Contract-based design is an emerging paradigm for the design of complex systems, where each component is associated with a contract, i.e., a clear description of the expected behaviour. Contracts specify the input-output behaviour of a component by defining what the component guarantees, provided that its environment obeys some given assumptions. The ultimate goal of contract-based design is to allow for compositional reasoning, stepwise refinement, and a principled reuse of components that are already pre-designed, or designed independently. In this paper, we present a novel, fully formal contract framework. The decomposition of the system architecture is complemented with the corresponding decomposition of component contracts. The framework exploits such decomposition to automatically generate a set of proof obligations, which, once verified, allow concluding the correctness of the top-level system properties. The framework relies on an expressive property specification language, conceived for the formalization of embedded system requirements. The proof system reduces the correctness of contracts refinement to entailment of temporal logic formulas, and is supported by a verification engine based on automated SMT techniques.

Proceedings ArticleDOI
24 Jun 2012
TL;DR: A comprehensive framework for adaptivity of service-based applications is proposed, which exploits the concept of process fragments as a way to model reusable process knowledge and to allow for the dynamic, incremental, context-aware composition of such fragments into adaptable service- based applications.
Abstract: We propose a comprehensive framework for adaptivity of service-based applications, which exploits the concept of process fragments as a way to model reusable process knowledge and to allow for the dynamic, incremental, context-aware composition of such fragments into adaptable service-based applications. The framework provides a set of adaptation mechanisms that, combined through adaptation strategies, are able to solve complex adaptation problems. An implementation of the proposed solution is presented and evaluated on a real-world scenario from the logistics domain.

Journal ArticleDOI
TL;DR: The MultiFarm dataset, which has been designed as a benchmark for multilingual ontology matching, is presented; it is composed of a set of ontologies translated into different languages and the corresponding alignments between these ontologies.

Journal ArticleDOI
TL;DR: In this paper, the authors used the Schwinger-Dyson equations to compute the nonperturbative modifications caused to the infrared finite gluon propagator (in the Landau gauge) by the inclusion of a small number of quark families.
Abstract: In this article we use the Schwinger–Dyson equations to compute the nonperturbative modifications caused to the infrared finite gluon propagator (in the Landau gauge) by the inclusion of a small number of quark families. Our basic operating assumption is that the main bulk of the effect stems from the "one-loop dressed" quark loop contributing to the full gluon self-energy. This quark loop is then calculated, using as basic ingredients the full quark propagator and quark-gluon vertex; for the quark propagator we use the solution obtained from the quark-gap equation, while for the vertex we employ suitable Ansätze, which guarantee the transversality of the answer. The resulting effect is included as a correction to the quenched gluon propagator, obtained in recent lattice simulations. Our main finding is that the unquenched propagator displays a considerable suppression in the intermediate momentum region, which becomes more pronounced as we increase the number of active quark families. The influence of the quarks on the saturation point of the propagator cannot be reliably computed within the present scheme; the general tendency appears to be to decrease it, suggesting a corresponding increase in the effective gluon mass. The renormalization properties of our results, and the uncertainties induced by the unspecified transverse part of the quark-gluon vertex, are discussed. Finally, the gluon propagator is compared with the available unquenched lattice data, showing rather good agreement.

Journal ArticleDOI
TL;DR: The Generalized State Coherence Transform (GSCT) is analyzed which is a non-linear transform of the space represented by the whole demixing matrices that enables an accurate estimation of the propagation time-delay of multiple sources in multiple dimensions.
Abstract: According to the physical meaning of the frequency-domain blind source separation (FD-BSS), each mixing matrix estimated by independent component analysis (ICA) contains information on the physical acoustic propagation related to each source and then can be used for localization purposes. In this paper, we analyze the Generalized State Coherence Transform (GSCT) which is a non-linear transform of the space represented by the whole demixing matrices. The transform enables an accurate estimation of the propagation time-delay of multiple sources in multiple dimensions. Furthermore, it is shown that with appropriate nonlinearities and a statistical model for the reverberation, GSCT can be considered an approximated kernel density estimator of the acoustic propagation time-delay. Experimental results confirm the good properties of the transform and its effectiveness in addressing multiple source TDOA detection (e.g., 2-D TDOA estimation of several sources with only three microphones).
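For contrast with the multi-source GSCT described above, the classical single-pair time-delay estimator simply maximizes the cross-correlation between two microphone signals. A minimal sketch (illustrative only; the paper's transform instead pools delay information from the ICA demixing matrices across frequency bins):

```python
def tdoa_xcorr(x, y, max_lag):
    """Delay of signal y relative to x, in samples, as the lag that
    maximizes their cross-correlation over [-max_lag, max_lag].
    Works for a single dominant source; reverberation and multiple
    simultaneous sources are what motivate transforms such as the GSCT."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(x[n] * y[n + lag] for n in range(len(x))
                  if 0 <= n + lag < len(y))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag
```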

Journal ArticleDOI
TL;DR: This work presents a detailed description of a theory solver for Linear Integer Arithmetic (LA(Z) in a lazy SMT context that makes extensive use of layering and heuristics for combining different techniques in order to achieve good performance in practice.
Abstract: We present a detailed description of a theory solver for Linear Integer Arithmetic (LA(Z)) in a lazy SMT context. Rather than focusing on a single technique that guarantees theoretical completeness, the solver makes extensive use of layering and heuristics for combining different techniques in order to achieve good performance in practice. The viability of our approach is demonstrated by an empirical evaluation on a wide range of benchmarks, showing significant performance improvements over current state-of-the-art solvers.


Journal ArticleDOI
TL;DR: Results show that force sensors on the wrist are able to retrieve important information about hand and finger movements, although this information can vary depending on sensor placement.

Journal ArticleDOI
TL;DR: The performance of OBGMX in reproducing the structure of periodic systems is analyzed by calculating the root mean‐squared displacements of optimized configurations of a large set of metal–organic frameworks.
Abstract: OBGMX is a web service providing topologies for the GROMACS molecular dynamics software package according to the Universal Force Field, as implemented in the Open Babel package. OBGMX can deal with molecular and periodic systems. The geometrical parameters appearing in the potential energy functions for the bonded interactions can be set to those measured in the input configuration. The performance of OBGMX in reproducing the structure of periodic systems is analyzed by calculating the root mean-squared displacements of optimized configurations of a large set of metal-organic frameworks. OBGMX is available at http://software-lisc.fbk.eu/obgmx/.

Journal ArticleDOI
01 Oct 2012-PLOS ONE
TL;DR: This work proposes a simple SIR transmission model with vaccination choice, and shows that public intervention has a stabilising role which is able to reduce the strength of imitation-induced oscillations, to allow disease elimination, and to even make the disease-free equilibrium where everyone is vaccinated globally attractive.
Abstract: After a long period of stagnation, traditionally explained by the voluntary nature of the programme, a considerable increase in routine measles vaccine uptake has recently been observed in Italy after a set of public interventions aiming to promote MMR immunization, whilst retaining its voluntary aspect. To account for this take-off in coverage, we propose a simple SIR transmission model with vaccination choice, where, unlike similar works, vaccinating behaviour spreads not only through the diffusion of “private” information spontaneously circulating among parents of children to be vaccinated, which we call imitation, but also through public information communicated by the public health authorities. We show that public intervention has a stabilising role which is able to reduce the strength of imitation-induced oscillations, to allow disease elimination, and to even make the disease-free equilibrium where everyone is vaccinated globally attractive. The available Italian data are used to evaluate the main behavioural parameters, showing that the proposed model seems to provide a much more plausible behavioural explanation of the observed take-off of uptake of vaccine against measles than models based on pure imitation alone.
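Stripped of the behavioural (imitation and public-information) dynamics, the underlying compartmental model is a standard SIR system with a vaccinated fraction p of newborns. A forward-Euler sketch with illustrative parameters (p is held constant here, whereas in the paper it is the dynamic variable driven by imitation and public information):

```python
def sir_vaccination(beta, gamma, p, mu, days, dt=0.1):
    """Forward-Euler integration of an SIR model where a fraction p
    of newborns is vaccinated (birth/death rate mu, transmission
    rate beta, recovery rate gamma; all rates per day).
    Returns the final (S, I, R) fractions."""
    S, I, R = 1.0 - 1e-4, 1e-4, 0.0        # seed a small infective fraction
    for _ in range(int(days / dt)):
        dS = mu * (1.0 - p) - beta * S * I - mu * S
        dI = beta * S * I - gamma * I - mu * I
        dR = mu * p + gamma * I - mu * R
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    return S, I, R
```

Elimination requires p > 1 − 1/R0 with R0 = beta/(gamma + mu), which is the threshold the public intervention helps coverage reach and hold.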

Journal ArticleDOI
TL;DR: This article explores the use of profiles based on features derived from three supervised and two unsupervised feature-extraction techniques in classification, and compares the classification accuracies obtained by using the different techniques for two different classification methods.
Abstract: The classification of remote sensing data based on the exploitation of spatial features extracted with morphological and attribute profiles has been recently gaining importance. With the development of efficient algorithms to construct the profiles for large datasets, such methods are becoming even more relevant. When dealing with hyperspectral imagery, the profiles are traditionally built on the first few principal components computed from the data. However, it needs to be determined if other feature reduction approaches are better suited to create base images for the profiles. In this article, we explore the use of profiles based on features derived from three supervised feature-extraction techniques (i.e. Discriminant Analysis Feature Extraction, Decision Boundary Feature Extraction and Non-parametric Weighted Feature Extraction) and two unsupervised feature-extraction techniques (i.e. Principal Component Analysis (PCA) and Kernel PCA) in classification, and compare the classification accuracies obtained by the different techniques for two different classification methods.

Journal ArticleDOI
TL;DR: A new model for aggregating multiple criteria evaluations for relevance assessment is proposed, in which relevance is modeled as a multidimensional property of documents and a prioritization relationship over the criteria is taken into account.
Abstract: A new model for aggregating multiple criteria evaluations for relevance assessment is proposed. An Information Retrieval context is considered, where relevance is modeled as a multidimensional property of documents. The usefulness and effectiveness of such a model are demonstrated by means of a case study on personalized Information Retrieval with multi-criteria relevance. The following criteria are considered to estimate document relevance: aboutness, coverage, appropriateness, and reliability. The originality of this approach lies in the aggregation of the considered criteria in a prioritized way, by considering the existence of a prioritization relationship over the criteria. Such a prioritization is modeled by making the weights associated to a criterion dependent upon the satisfaction of the higher-priority criteria. This way, it is possible to take into account the fact that the weight of a less important criterion should be proportional to the satisfaction degree of the more important criterion. Experimental evaluations are also reported.
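The prioritized weighting described above can be made concrete with a Yager-style scheme: the most important criterion gets weight 1, and each subsequent weight is the previous weight damped by the satisfaction of the preceding, more important criterion. A sketch (illustrative; the paper's exact operator may differ in detail):

```python
def prioritized_score(scores):
    """Prioritized aggregation of satisfaction degrees in [0, 1],
    ordered most-important-first, e.g.
    [aboutness, coverage, appropriateness, reliability].
    w1 = 1 and w_i = w_{i-1} * x_{i-1}, so a poorly satisfied
    high-priority criterion suppresses the influence of all
    lower-priority ones."""
    weights, w = [], 1.0
    for x in scores:
        weights.append(w)
        w *= x
    return sum(wi * xi for wi, xi in zip(weights, scores)) / sum(weights)
```

For example, a document that completely fails the top-priority criterion scores 0 regardless of how well it satisfies the remaining criteria, which is exactly the dependence the abstract describes.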