
Showing papers by "École Normale Supérieure published in 2008"


Journal ArticleDOI
TL;DR: A survey of the theory of compressive sampling, also known as compressed sensing or CS, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition.
Abstract: Conventional approaches to sampling signals or images follow Shannon's theorem: the sampling rate must be at least twice the maximum frequency present in the signal (Nyquist rate). In the field of data conversion, standard analog-to-digital converter (ADC) technology implements the usual quantized Shannon representation - the signal is uniformly sampled at or above the Nyquist rate. This article surveys the theory of compressive sampling, also known as compressed sensing or CS, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition. CS theory asserts that one can recover certain signals and images from far fewer samples or measurements than traditional methods use.
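The recovery claim can be illustrated with a small sketch: a k-sparse signal is measured with far fewer random projections than its length and reconstructed exactly. The sizes and the choice of orthogonal matching pursuit (one common CS recovery algorithm, not necessarily the one the survey emphasizes) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: recover a k-sparse signal of length n from m << n
# random linear measurements via orthogonal matching pursuit (OMP).
rng = np.random.default_rng(0)
n, m, k = 128, 40, 5                 # signal length, measurements, sparsity

x = np.zeros(n)
support = rng.choice(n, k, replace=False)
x[support] = rng.standard_normal(k)  # the unknown sparse signal

A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
y = A @ x                                     # m linear measurements

def omp(A, y, k):
    """Greedily pick the column most correlated with the residual."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
print(float(np.linalg.norm(x_hat - x)))  # essentially zero: exact recovery
```

With Gaussian measurements and m well above k·log(n/k), recovery of the exact support succeeds with high probability, which is the regime this toy example sits in.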

9,686 citations


Journal ArticleDOI
TL;DR: In this article, a review of recent experimental and theoretical progress concerning many-body phenomena in dilute, ultracold gases is presented, focusing on effects beyond standard weak-coupling descriptions, such as the Mott-Hubbard transition in optical lattices, strongly interacting gases in one and two dimensions, or lowest-Landau-level physics in quasi-two-dimensional gases in fast rotation.
Abstract: This paper reviews recent experimental and theoretical progress concerning many-body phenomena in dilute, ultracold gases. It focuses on effects beyond standard weak-coupling descriptions, such as the Mott-Hubbard transition in optical lattices, strongly interacting gases in one and two dimensions, or lowest-Landau-level physics in quasi-two-dimensional gases in fast rotation. Strong correlations in fermionic gases are discussed in optical lattices or near-Feshbach resonances in the BCS-BEC crossover.

6,601 citations


Journal ArticleDOI
21 Feb 2008-Nature
TL;DR: Molecules that associate to form both chains and cross-links via hydrogen bonds are designed and synthesized; the resulting system shows recoverable extensibility up to several hundred per cent and little creep under load.
Abstract: Rubbers exhibit enormous extensibility up to several hundred per cent, compared with a few per cent for ordinary solids, and have the ability to recover their original shape and dimensions on release of stress. Rubber elasticity is a property of macromolecules that are either covalently cross-linked or connected in a network by physical associations such as small glassy or crystalline domains, ionic aggregates or multiple hydrogen bonds. Covalent cross-links or strong physical associations prevent flow and creep. Here we design and synthesize molecules that associate together to form both chains and cross-links via hydrogen bonds. The system shows recoverable extensibility up to several hundred per cent and little creep under load. In striking contrast to conventional cross-linked or thermoreversible rubbers made of macromolecules, these systems, when broken or cut, can be simply repaired by bringing together fractured surfaces to self-heal at room temperature. Repaired samples recuperate their enormous extensibility. The process of breaking and healing can be repeated many times. These materials can be easily processed, re-used and recycled. Their unique self-repairing properties, the simplicity of their synthesis, their availability from renewable resources and the low cost of raw ingredients (fatty acids and urea) bode well for future applications.

2,501 citations


Proceedings ArticleDOI
23 Jun 2008
TL;DR: In this paper, each visual region is mapped to a weighted set of visual words selected by proximity in descriptor space; the paper describes how this representation is incorporated into a standard tf-idf architecture and how spatial verification is modified under this soft assignment.
Abstract: The state of the art in visual object retrieval from large databases is achieved by systems that are inspired by text retrieval. A key component of these approaches is that local regions of images are characterized using high-dimensional descriptors which are then mapped to "visual words" selected from a discrete vocabulary. This paper explores techniques to map each visual region to a weighted set of words, allowing the inclusion of features which were lost in the quantization stage of previous systems. The set of visual words is obtained by selecting words based on proximity in descriptor space. We describe how this representation may be incorporated into a standard tf-idf architecture, and how spatial verification is modified in the case of this soft-assignment. We evaluate our method on the standard Oxford Buildings dataset, and introduce a new dataset for evaluation. Our results exceed the current state of the art retrieval performance on these datasets, particularly on queries with poor initial recall where techniques like query expansion suffer. Overall we show that soft-assignment is always beneficial for retrieval with large vocabularies, at a cost of increased storage requirements for the index.
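The soft-assignment idea can be sketched in a few lines: each descriptor votes for its r nearest visual words with distance-based weights instead of a single hard-quantized word. Vocabulary size, descriptor dimension, r, and the Gaussian weighting bandwidth are all assumed parameters, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = rng.standard_normal((50, 8))   # 50 visual words, 8-D toy descriptors
desc = rng.standard_normal((200, 8))   # descriptors from one image

def soft_assign(desc, vocab, r=3, sigma=1.0):
    """Histogram where each descriptor votes for its r nearest words."""
    d2 = ((desc[:, None, :] - vocab[None, :, :]) ** 2).sum(-1)
    hist = np.zeros(len(vocab))
    for row in d2:
        nearest = np.argsort(row)[:r]                  # r closest words
        w = np.exp(-row[nearest] / (2 * sigma ** 2))   # distance weights
        hist[nearest] += w / w.sum()                   # normalized soft votes
    return hist

hist = soft_assign(desc, vocab)
print(round(float(hist.sum()), 6))  # total mass equals the descriptor count
```

Each descriptor still contributes unit mass overall, so downstream tf-idf weighting applies unchanged; only the mass is spread over several words.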

1,630 citations


Journal ArticleDOI
13 Nov 2008-Nature
TL;DR: Analysis of molecular divergence compared with yeasts and metazoans reveals rapid rates of gene diversification in diatoms, and documents the presence of hundreds of genes from bacteria, likely to provide novel possibilities for metabolite management and for perception of environmental signals.
Abstract: Diatoms are photosynthetic secondary endosymbionts found throughout marine and freshwater environments, and are believed to be responsible for around one-fifth of the primary productivity on Earth(1,2). The genome sequence of the marine centric diatom Thalassiosira pseudonana was recently reported, revealing a wealth of information about diatom biology(3-5). Here we report the complete genome sequence of the pennate diatom Phaeodactylum tricornutum and compare it with that of T. pseudonana to clarify evolutionary origins, functional significance and ubiquity of these features throughout diatoms. In spite of the fact that the pennate and centric lineages have only been diverging for 90 million years, their genome structures are dramatically different and a substantial fraction of genes (∼40%) are not shared by these representatives of the two lineages. Analysis of molecular divergence compared with yeasts and metazoans reveals rapid rates of gene diversification in diatoms. Contributing factors include selective gene family expansions, differential losses and gains of genes and introns, and differential mobilization of transposable elements. Most significantly, we document the presence of hundreds of genes from bacteria. More than 300 of these gene transfers are found in both diatoms, attesting to their ancient origins, and many are likely to provide novel possibilities for metabolite management and for perception of environmental signals. These findings go a long way towards explaining the incredible diversity and success of the diatoms in contemporary oceans.

1,500 citations


Proceedings Article
08 Dec 2008
TL;DR: A novel sparse representation for signals belonging to different classes in terms of a shared dictionary and discriminative class models is proposed, with results on standard handwritten digit and texture classification tasks.
Abstract: It is now well established that sparse signal models are well suited for restoration tasks and can be effectively learned from audio, image, and video data. Recent research has been aimed at learning discriminative sparse models instead of purely reconstructive ones. This paper proposes a new step in that direction, with a novel sparse representation for signals belonging to different classes in terms of a shared dictionary and discriminative class models. The linear version of the proposed model admits a simple probabilistic interpretation, while its most general variant admits an interpretation in terms of kernels. An optimization framework for learning all the components of the proposed model is presented, along with experimental results on standard handwritten digit and texture classification tasks.
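The reconstructive half of such models is classical sparse coding, min_a 0.5·||x − Da||² + λ·||a||₁. A minimal sketch using iterative soft-thresholding (ISTA) follows; the discriminative class terms of the paper are omitted, and all sizes and λ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)              # unit-norm dictionary atoms
a_true = np.zeros(50)
a_true[[4, 17, 31]] = [1.0, -2.0, 1.5]      # a 3-sparse code
x = D @ a_true                              # signal exactly representable in D

def ista(x, D, lam=0.02, steps=500):
    """Minimize 0.5*||x - D a||^2 + lam*||a||_1 by iterative shrinkage."""
    L = np.linalg.norm(D, 2) ** 2           # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(steps):
        g = a + D.T @ (x - D @ a) / L       # gradient step on the quadratic term
        a = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return a

a = ista(x, D)
print(float(np.linalg.norm(D @ a - x)))     # small reconstruction residual
```

The discriminative variants in the paper add per-class terms to this energy so that codes are simultaneously good for reconstruction and for classification.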

1,108 citations


Journal ArticleDOI
TL;DR: A practical overview of meshless methods for solid mechanics based on global weak forms, illustrated through a simple and well-structured MATLAB code.

1,088 citations


Journal ArticleDOI
14 Mar 2008-Science
TL;DR: It is proposed that working memory is sustained by calcium-mediated synaptic facilitation in the recurrent connections of neocortical networks, with presynaptic residual calcium serving as a buffer that is loaded, refreshed, and read out by spiking activity.
Abstract: It is usually assumed that enhanced spiking activity in the form of persistent reverberation for several seconds is the neural correlate of working memory. Here, we propose that working memory is sustained by calcium-mediated synaptic facilitation in the recurrent connections of neocortical networks. In this account, the presynaptic residual calcium is used as a buffer that is loaded, refreshed, and read out by spiking activity. Because of the long time constants of calcium kinetics, the refresh rate can be low, resulting in a mechanism that is metabolically efficient and robust. The duration and stability of working memory can be regulated by modulating the spontaneous activity in the network.
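The buffer idea can be caricatured in a few lines: a facilitation variable u is loaded by spikes and decays with a slow, calcium-like time constant, so that sparse refresh spikes keep the trace alive. All parameter values below are assumptions for illustration, not the paper's.

```python
import numpy as np

dt, tau_f, U = 0.001, 1.5, 0.3           # step (s), slow decay (s), load per spike
t = np.arange(0.0, 4.0, dt)
spikes = np.zeros_like(t)
spikes[np.arange(500, len(t), 500)] = 1  # one refresh spike every 0.5 s

u = np.zeros_like(t)
for i in range(1, len(t)):
    u[i] = u[i - 1] - u[i - 1] / tau_f * dt   # slow calcium-like decay
    if spikes[i]:
        u[i] += U * (1 - u[i])                # each spike loads facilitation

# the trace never decays back to baseline between refreshes
print(float(u[-1]))
```

Because tau_f is long relative to the refresh interval, a very low spike rate suffices to hold u well above baseline, which is the metabolic-efficiency argument of the abstract.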

1,076 citations


Journal ArticleDOI
TL;DR: This article developed a response-and-effect functional framework, concentrating on how the relationships among species' response, effect, and abundance can lead to general predictions concerning the magnitude and direction of the influence of environmental change on function.
Abstract: Predicting ecosystem responses to global change is a major challenge in ecology. A critical step in that challenge is to understand how changing environmental conditions influence processes across levels of ecological organization. While direct scaling from individual to ecosystem dynamics can lead to robust and mechanistic predictions, new approaches are needed to appropriately translate questions through the community level. Species invasion, loss, and turnover all necessitate this scaling through community processes, but predicting how such changes may influence ecosystem function is notoriously difficult. We suggest that community-level dynamics can be incorporated into scaling predictions using a trait-based response-effect framework that differentiates the community response to environmental change (predicted by response traits) and the effect of that change on ecosystem processes (predicted by effect traits). We develop a response-and-effect functional framework, concentrating on how the relationships among species' response, effect, and abundance can lead to general predictions concerning the magnitude and direction of the influence of environmental change on function. We then detail several key research directions needed to better scale the effects of environmental change through the community level. These include (1) effect and response trait characterization, (2) linkages between response-and-effect traits, (3) the importance of species interactions on trait expression, and (4) incorporation of feedbacks across multiple temporal scales. Increasing rates of extinction and invasion that are modifying communities worldwide make such a research agenda imperative.

996 citations


Journal ArticleDOI
TL;DR: It is demonstrated that the generative model can effectively handle occlusions in each time frame independently, even when the only data available comes from the output of a simple background subtraction algorithm and when the number of individuals is unknown a priori.
Abstract: Given two to four synchronized video streams taken at eye level and from different angles, we show that we can effectively combine a generative model with dynamic programming to accurately follow up to six individuals across thousands of frames in spite of significant occlusions and lighting changes. In addition, we also derive metrically accurate trajectories for each of them. Our contribution is twofold. First, we demonstrate that our generative model can effectively handle occlusions in each time frame independently, even when the only data available comes from the output of a simple background subtraction algorithm and when the number of individuals is unknown a priori. Second, we show that multiperson tracking can be reliably achieved by processing individual trajectories separately over long sequences, provided that a reasonable heuristic is used to rank these individuals and that we avoid confusing them with one another.
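The trajectory step can be illustrated with a tiny Viterbi-style dynamic program over a 1-D grid: per-frame detection scores plus a smoothness penalty between consecutive positions. The scores and penalty below are made up for illustration; the paper's model is far richer (2-D ground plane, occlusion-aware generative observation model).

```python
import numpy as np

score = np.array([[0.1, 0.9, 0.2],
                  [0.2, 0.8, 0.3],
                  [0.1, 0.3, 0.9],
                  [0.2, 0.2, 0.8]])    # frames x grid positions (made up)
penalty = 0.3                           # cost per cell moved between frames

T, P = score.shape
best = score[0].copy()
back = np.zeros((T, P), dtype=int)
for t in range(1, T):
    new = np.empty(P)
    for p in range(P):
        trans = best - penalty * np.abs(np.arange(P) - p)  # smoothness cost
        back[t, p] = int(np.argmax(trans))
        new[p] = trans[back[t, p]] + score[t, p]
    best = new

path = [int(np.argmax(best))]
for t in range(T - 1, 0, -1):           # backtrack the optimal trajectory
    path.append(int(back[t, path[-1]]))
path.reverse()
print(path)                             # → [1, 1, 2, 2]
```

The recovered path tracks the high-score detections while the penalty forbids implausible jumps, which is the essence of processing one individual's trajectory at a time.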

865 citations


Proceedings ArticleDOI
23 Jun 2008
TL;DR: This article proposes an energy formulation with both sparse reconstruction and class discrimination components, jointly optimized during dictionary learning, for local image discrimination tasks, and paves the way for a novel scene analysis and recognition framework based on simultaneously learning discriminative and reconstructive dictionaries.
Abstract: Sparse signal models have been the focus of much recent research, leading to (or improving upon) state-of-the-art results in signal, image, and video restoration. This article extends this line of research into a novel framework for local image discrimination tasks, proposing an energy formulation with both sparse reconstruction and class discrimination components, jointly optimized during dictionary learning. This approach improves over the state of the art in texture segmentation experiments using the Brodatz database, and it paves the way for a novel scene analysis and recognition framework based on simultaneously learning discriminative and reconstructive dictionaries. Preliminary results in this direction using examples from the Pascal VOC06 and Graz02 datasets are presented as well.

Journal ArticleDOI
12 Dec 2008-Science
TL;DR: It is found that morphogenesis at the Arabidopsis shoot apex depends on the microtubule cytoskeleton, which in turn is regulated by mechanical stress; a feedback loop encompassing tissue morphology, stress patterns, and microtubule-mediated cellular properties is sufficient to account for the coordinated patterns of microtubule arrays observed in epidermal cells.
Abstract: A central question in developmental biology is whether and how mechanical forces serve as cues for cellular behavior and thereby regulate morphogenesis. We found that morphogenesis at the Arabidopsis shoot apex depends on the microtubule cytoskeleton, which in turn is regulated by mechanical stress. A combination of experiments and modeling shows that a feedback loop encompassing tissue morphology, stress patterns, and microtubule-mediated cellular properties is sufficient to account for the coordinated patterns of microtubule arrays observed in epidermal cells, as well as for patterns of apical morphogenesis.

Journal ArticleDOI
11 Sep 2008-Nature
TL;DR: In this article, the formation of a Mott insulator of a repulsively interacting two-component Fermi gas in an optical lattice has been studied, and it is identified by three features: a drastic suppression of doubly occupied lattice sites, a strong reduction of the compressibility inferred from the response of double occupancy to an increase in atom number, and the appearance of a gapped mode in the excitation spectrum.
Abstract: Strong interactions between electrons in a solid material can lead to surprising properties. A prime example is the Mott insulator, in which suppression of conductivity occurs as a result of interactions rather than a filled Bloch band. Proximity to the Mott insulating phase in fermionic systems is the origin of many intriguing phenomena in condensed matter physics, most notably high-temperature superconductivity. The Hubbard model, which encompasses the essential physics of the Mott insulator, also applies to quantum gases trapped in an optical lattice. It is therefore now possible to access this regime with tools developed in atomic physics. However, an atomic Mott insulator has so far been realized only with a gas of bosons, which lack the rich and peculiar nature of fermions. Here we report the formation of a Mott insulator of a repulsively interacting two-component Fermi gas in an optical lattice. It is identified by three features: a drastic suppression of doubly occupied lattice sites, a strong reduction of the compressibility inferred from the response of double occupancy to an increase in atom number, and the appearance of a gapped mode in the excitation spectrum. Direct control of the interaction strength allows us to compare the Mott insulating regime and the non-interacting regime without changing tunnel-coupling or confinement. Our results pave the way for further studies of the Mott insulator, including spin-ordering and ultimately the question of d-wave superfluidity.

Book ChapterDOI
12 Oct 2008
TL;DR: SIFT flow, a method to align an image to its neighbors in a large image collection spanning a variety of scenes, is proposed and applied to two tasks: motion field prediction from a single static image and motion synthesis via transfer of moving objects.
Abstract: While image registration has been studied in different areas of computer vision, aligning images depicting different scenes remains a challenging problem, closer to recognition than to image matching. Analogous to optical flow, where an image is aligned to its temporally adjacent frame, we propose SIFT flow, a method to align an image to its neighbors in a large image collection consisting of a variety of scenes. For a query image, histogram intersection on a bag-of-visual-words representation is used to find the set of nearest neighbors in the database. The SIFT flow algorithm then consists of matching densely sampled SIFT features between the two images, while preserving spatial discontinuities. The use of SIFT features allows robust matching across different scene/object appearances and the discontinuity-preserving spatial model allows matching of objects located at different parts of the scene. Experiments show that the proposed approach is able to robustly align complicated scenes with large spatial distortions. We collect a large database of videos and apply the SIFT flow algorithm to two applications: (i) motion field prediction from a single static image and (ii) motion synthesis via transfer of moving objects.
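The nearest-neighbor step mentioned above is simple to sketch: histogram intersection between normalized bag-of-visual-words histograms ranks database images against a query. The histogram sizes and the perturbed query below are toy assumptions.

```python
import numpy as np

def hist_intersection(h1, h2):
    """Similarity of two normalized histograms: sum of bin-wise minima."""
    return float(np.minimum(h1, h2).sum())

rng = np.random.default_rng(2)
db = rng.random((10, 20))
db /= db.sum(1, keepdims=True)          # normalized word histograms
query = db[3] + 0.01 * rng.random(20)   # a slightly perturbed database entry
query /= query.sum()

scores = [hist_intersection(query, h) for h in db]
print(int(np.argmax(scores)))           # → 3, the closest histogram wins
```

Identical histograms score 1 and disjoint ones 0, so ranking by intersection retrieves the most similar scenes before the dense SIFT matching is run.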

Journal ArticleDOI
TL;DR: An updated view on progress in elucidating the epidemiology and pathogenesis of canine leishmaniosis is presented in the first part of this review; the second part focuses on advances in diagnosis, treatment and prevention.

Journal ArticleDOI
TL;DR: It is argued that a gravity theory in five dimensions coupled to a dilaton and an axion may capture the important qualitative features of pure YM theory; part of the effects of higher α'-corrections is resummed into a dilaton potential, whose knowledge determines the full structure of the vacuum solution.
Abstract: Various holographic approaches to QCD in five dimensions are explored using input both from the putative five-dimensional non-critical string theory as well as QCD. It is argued that a gravity theory in five dimensions coupled to a dilaton and an axion may capture the important qualitative features of pure YM theory. A part of the effects of higher α'-corrections is resummed into a dilaton potential. The potential is shown to be in one-to-one correspondence with the exact β-function of QCD, and its knowledge determines the full structure of the vacuum solution. The geometry near the UV boundary is that of AdS5 with logarithmic corrections reflecting the asymptotic freedom of QCD. We find that all relevant confining backgrounds have an IR singularity of the "good" kind that allows unambiguous spectrum computations. Near the singularity the 't Hooft coupling is driven to infinity. Asymptotically linear glueball masses can also be achieved. The classification of all confining asymptotics, the associated glueball spectra and meson dynamics are addressed in a companion paper arXiv:0707.1349

Journal ArticleDOI
TL;DR: In this paper, a 'grafting from' approach is used to attach poly(ε-caprolactone) (PCL) chains to cellulose nanocrystals by Sn(Oct)2-catalyzed ring-opening polymerization (ROP).
Abstract: A ‘grafting from’ approach was used to graft poly(ε-caprolactone) (PCL) polymers to cellulose nanocrystals by Sn(Oct)2-catalyzed ring-opening polymerization (ROP). The grafting efficiency was evidenced by the long-term stability of suspensions of PCL-grafted cellulose nanocrystals in toluene. These observations were confirmed by Fourier Transform Infrared Spectroscopy (FT-IR) and Time-of-Flight Secondary Ion Mass Spectrometry (TOF-SIMS). Extracted nanohybrids were characterized by Differential Scanning Calorimetry (DSC), X-ray photoelectron spectroscopy (XPS), and contact angle measurements. The morphology and crystalline structure of the PCL-grafted cellulose nanocrystals were examined by transmission electron microscopy (TEM) and X-ray diffraction, respectively. Results showed that cellulose nanocrystals kept their initial morphological integrity and their native crystallinity. Nanocomposites with high content of cellulose nanocrystals were prepared using either neat cellulose nanocrystals or PCL-grafted cellulose nanocrystals and high molecular weight PCL as matrix using a casting/evaporation technique. Thermo-mechanical properties of processed nanocomposites were studied by DSC, dynamical mechanical analyses (DMA) and tensile tests. A significant improvement in terms of Young's modulus and storage modulus was obtained.

Journal ArticleDOI
TL;DR: In this paper, an analytical theory of the prestellar core initial mass function (IMF) based on an extension of the Press-Schechter statistical formalism was derived.
Abstract: We derive an analytical theory of the prestellar core initial mass function (IMF) based on an extension of the Press-Schechter statistical formalism. Our approach relies on the general concept of the gravothermal and gravoturbulent collapse of a molecular cloud, with a selection criterion based on the thermal or turbulent Jeans mass, which yields the derivation of the mass spectrum of self-gravitating objects in a quiescent or a turbulent environment. The same formalism also yields the mass spectrum of non-self-gravitating clumps produced in supersonic flows. The mass spectrum of the self-gravitating cores reproduces well the observed IMF. The theory predicts that the shape of the IMF results from two competing contributions, namely, a power law at large scales and an exponential cutoff (lognormal form) centered around the characteristic mass for gravitational collapse. The cutoff exists both in the case of thermal or turbulent collapse, provided that the underlying density field has a lognormal distribution. Whereas pure thermal collapse produces a power-law tail steeper than the Salpeter value, dN/dlog M ∝ M^(−x) with x ≈ 1.35, the latter is recovered exactly for the (three-dimensional) value of the spectral index of the velocity power spectrum, n ≈ 3.8, found in observations and in numerical simulations of isothermal supersonic turbulence. Indeed, the theory predicts that x = (n + 1)/(2n − 4) for self-gravitating structures and x = 2 − n′/3 for non-self-gravitating structures, where n′ is the power spectrum index of log ρ. We show that, whereas supersonic turbulence promotes the formation of both massive stars and brown dwarfs, it has an overall negative impact on star formation, decreasing the star formation efficiency. This theory provides a novel theoretical foundation to understand the origin of the IMF and provides useful guidance to numerical simulations exploring star formation, while making testable predictions.
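The abstract's central prediction is easy to check numerically: plugging the quoted spectral index n ≈ 3.8 into the self-gravitating slope formula recovers a value close to the Salpeter slope.

```python
# Check of the predicted IMF slope x = (n + 1) / (2n - 4) for
# self-gravitating structures, at the quoted spectral index n ≈ 3.8.
def imf_slope(n):
    return (n + 1) / (2 * n - 4)

x = imf_slope(3.8)
print(round(x, 2))   # → 1.33, close to the Salpeter value x ≈ 1.35
```

The formula is steep in n near the quoted value (n = 3.9 would give x ≈ 1.29), which is why the agreement with Salpeter at the observed n is a meaningful consistency check.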

Journal ArticleDOI
TL;DR: In this article, the effect of out-of-plane motion on 2D and 3D digital image correlation measurements is demonstrated using basic theoretical pinhole image equations and experimentally through synchronized, multi-system measurements.

Journal ArticleDOI
TL;DR: The basic properties of the wavelet approach to time-series analysis are reviewed from an ecological perspective; the approach is notably free from the assumption of stationarity that makes most other methods unsuitable for many ecological time series.
Abstract: Wavelet analysis is a powerful tool that is already in use throughout science and engineering. The versatility and attractiveness of the wavelet approach lie in its decomposition properties, principally its time-scale localization. It is especially relevant to the analysis of non-stationary systems, i.e., systems with short-lived transient components, like those observed in ecological systems. Here, we review the basic properties of the wavelet approach for time-series analysis from an ecological perspective. Wavelet decomposition offers several advantages that are discussed in this paper and illustrated by appropriate synthetic and ecological examples. Wavelet analysis is notably free from the assumption of stationarity that makes most methods unsuitable for many ecological time series. Wavelet analysis also permits analysis of the relationships between two signals, and it is especially appropriate for following gradual change in forcing by exogenous variables.
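The time-scale localization being described can be shown with a minimal Morlet-wavelet sketch (all parameters assumed; a real analysis would use a dedicated wavelet library): a non-stationary signal switches from 2 Hz to 8 Hz, and the matching wavelet scale dominates only in its own half of the series.

```python
import numpy as np

def morlet(tau, scale, w0=6.0):
    """Complex Morlet wavelet sampled at times tau, stretched by scale."""
    x = tau / scale
    return np.exp(1j * w0 * x) * np.exp(-x ** 2 / 2) / np.sqrt(scale)

fs = 100.0
t = np.arange(0.0, 10.0, 1 / fs)
# non-stationary signal: 2 Hz in the first half, 8 Hz in the second
sig = np.where(t < 5, np.sin(2 * np.pi * 2 * t), np.sin(2 * np.pi * 8 * t))

power = {}
for f in (2.0, 8.0):
    s = 6.0 / (2 * np.pi * f)             # scale matched to frequency f
    tau = np.arange(-4 * s, 4 * s, 1 / fs)
    power[f] = np.abs(np.convolve(sig, morlet(tau, s), mode="same")) ** 2

# each scale dominates only during its own half of the series
print(power[2.0][100:400].mean() > power[8.0][100:400].mean(),
      power[8.0][650:950].mean() > power[2.0][650:950].mean())
```

A Fourier spectrum of the whole series would show both frequencies but not *when* each occurs; the wavelet power does, which is exactly the property the review emphasizes for transient ecological dynamics.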

Journal ArticleDOI
TL;DR: It is proposed that individuals with dyslexia have a deficit in access to phonological representations and it is speculated that a similar notion might also adequately describe the nature of other associated cognitive deficits when present.
Abstract: We review a series of experiments aimed at understanding the nature of the phonological deficit in developmental dyslexia. These experiments investigate input and output phonological representations, phonological grammar, foreign speech perception and production, and unconscious speech processing and lexical access. Our results converge on the observation that the phonological representations of people with dyslexia may be intact, and that the phonological deficit surfaces only as a function of certain task requirements, notably short-term memory, conscious awareness, and time constraints. In an attempt to reformulate those task requirements more economically, we propose that individuals with dyslexia have a deficit in access to phonological representations. We discuss the explanatory power of this concept and we speculate that a similar notion might also adequately describe the nature of other associated cognitive deficits when present.

Proceedings ArticleDOI
23 Jun 2008
TL;DR: A novel local image descriptor for dense wide-baseline matching, inspired by earlier descriptors such as SIFT and GLOH but much faster to compute for this purpose, and free of artifacts that degrade matching performance.
Abstract: We introduce a novel local image descriptor designed for dense wide-baseline matching purposes. We feed our descriptors to a graph-cuts based dense depth map estimation algorithm and this yields better wide-baseline performance than the commonly used correlation windows for which the size is hard to tune. As a result, unlike competing techniques that require many high-resolution images to produce good reconstructions, our descriptor can compute them from pairs of low-quality images such as the ones captured by video streams. Our descriptor is inspired from earlier ones such as SIFT and GLOH but can be computed much faster for our purposes. Unlike SURF which can also be computed efficiently at every pixel, it does not introduce artifacts that degrade the matching performance. Our approach was tested with ground truth laser scanned depth maps as well as on a wide variety of image pairs of different resolutions and we show that good reconstructions are achieved even with only two low quality images.

Journal ArticleDOI
TL;DR: In this paper, the authors carried out a study with stakeholders of the fashion industry and reported on their views on the challenges and conflicts of the different dimensions of sustainability, and discussed how to leverage both the internal and external organizations in the European supply chain.

Journal ArticleDOI
TL;DR: In this paper, an empirical analysis of the relationship between the stringency of environmental regulation and total factor productivity (TFP) growth in the Quebec manufacturing sector is provided, which is consistent with Michael Porter's conjecture, and this effect is stronger in a subgroup of industries which are more exposed to international competition.
Abstract: This paper provides an empirical analysis of the relationship between the stringency of environmental regulation and total factor productivity (TFP) growth in the Quebec manufacturing sector. This allows us to investigate more fully the Porter hypothesis in three directions. First, the dynamic aspect of the hypothesis is captured through the use of lagged regulatory variables. Second, we argue that the hypothesis is more relevant for more polluting sectors. Third, we argue that the hypothesis is more relevant for sectors which are more exposed to international competition. Our empirical results suggest that: (1) the contemporaneous impact of environmental regulation on productivity is negative; (2) the opposite result is observed with lagged regulatory variables, which is consistent with Michael Porter’s conjecture; and (3) this effect is stronger in a subgroup of industries which are more exposed to international competition.

Journal ArticleDOI
TL;DR: This work presents an online method that makes it possible to detect when an image comes from an already perceived scene using local shape and color information, and extends the bag-of-words method used in image classification to incremental conditions and relies on Bayesian filtering to estimate loop-closure probability.
Abstract: In robotic applications of visual simultaneous localization and mapping techniques, loop-closure detection and global localization are two issues that require the capacity to recognize a previously visited place from current camera measurements. We present an online method that makes it possible to detect when an image comes from an already perceived scene using local shape and color information. Our approach extends the bag-of-words method used in image classification to incremental conditions and relies on Bayesian filtering to estimate loop-closure probability. We demonstrate the efficiency of our solution by real-time loop-closure detection under strong perceptual aliasing conditions in both indoor and outdoor image sequences taken with a handheld camera.
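The Bayesian-filtering idea can be reduced to a toy update (hypothetical numbers): each frame's bag-of-words match quality supplies a likelihood, and the posterior probability of a loop closure grows as consecutive frames keep matching the same stored place.

```python
def update(prior, likelihood_loop, likelihood_new):
    """One Bayes step: posterior that the current image closes a loop."""
    num = likelihood_loop * prior
    return num / (num + likelihood_new * (1 - prior))

p = 0.1                                 # prior: most frames show a new place
for _ in range(3):                      # three consecutive well-matching frames
    p = update(p, likelihood_loop=0.8, likelihood_new=0.2)
print(round(p, 3))                      # → 0.877
```

Requiring several consistent frames before the posterior crosses a decision threshold is what makes such detectors robust to perceptual aliasing, where a single frame can match the wrong place.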

Journal ArticleDOI
TL;DR: This paper argues in favor of a three-tiered dynamic model of intention, links it to an expanded version of the internal model theory of action control and specification, and uses this theoretical framework to guide an analysis of the contents, possible sources and temporal course of complementary aspects of the phenomenology of action.

Journal ArticleDOI
TL;DR: In this article, the impact of several chemical treatments, including NaOH, polyethyleneimine, ethylenediaminetetraacetic acid, Ca(OH)2 and CaCl2, on the composition and structure of hemp fibres was evaluated using differential thermal analysis, scanning electron microscopy, X-ray diffraction and Fourier transform infrared spectroscopy.
Abstract: The impact of several chemical treatments, including NaOH, polyethyleneimine, ethylenediaminetetraacetic acid, Ca(OH)2 and CaCl2, on the composition and structure of hemp fibres was evaluated using differential thermal analysis, scanning electron microscopy, X-ray diffraction and Fourier transform infrared spectroscopy. Comparison of results obtained with the latter two techniques allows us to quantify the impact of the chemical treatments on the crystallinity index.

Book ChapterDOI
13 Apr 2008
TL;DR: The goal of this paper is to provide an assessment of lattice reduction algorithms' behaviour based on extensive experiments performed with the NTL library, and to suggest several conjectures on the worst case and the actual behaviour of lattice reduction algorithms.
Abstract: Despite their popularity, lattice reduction algorithms remain mysterious cryptanalytical tools. Though it has been widely reported that they behave better than their proved worst-case theoretical bounds, no precise assessment has ever been given. Such an assessment would be very helpful to predict the behaviour of lattice-based attacks, as well as to select keysizes for lattice-based cryptosystems. The goal of this paper is to provide such an assessment, based on extensive experiments performed with the NTL library. The experiments suggest several conjectures on the worst case and the actual behaviour of lattice reduction algorithms. We believe the assessment might also help to design new reduction algorithms overcoming the limitations of current algorithms.
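As an illustrative sketch (not from the paper): Lagrange-Gauss reduction of a two-dimensional lattice basis is the simplest member of the reduction family whose higher-dimensional relatives (LLL, BKZ) the experiments benchmark. It shortens a skewed basis to one made of near-orthogonal short vectors.

```python
import numpy as np

def gauss_reduce(u, v):
    """Lagrange-Gauss reduction of the 2-D lattice basis (u, v)."""
    u = np.array(u, dtype=np.int64)
    v = np.array(v, dtype=np.int64)
    if u @ u > v @ v:
        u, v = v, u                          # keep u the shorter vector
    while True:
        m = int(np.rint((u @ v) / (u @ u)))  # integer Babai rounding step
        v = v - m * u
        if v @ v >= u @ u:                   # v cannot be shortened further
            return u, v
        u, v = v, u

u, v = gauss_reduce([4, 1], [7, 2])
print(u.tolist(), v.tolist())                # → [-1, 0] [0, 1]
```

In dimension 2 this provably finds a shortest basis; the paper's point is that in high dimension the analogous guarantees of LLL-type algorithms are far weaker than their observed behaviour, hence the need for empirical assessment.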

Journal ArticleDOI
TL;DR: This work identifies and fills some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS) and defines computational and statistical relaxations of the existing notion of perfect consistency.
Abstract: We identify and fill some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS). We define computational and statistical relaxations of the existing notion of perfect consistency, show that the scheme of Boneh et al. (Advances in Cryptology—EUROCRYPT 2004, ed. by C. Cachin, J. Camenisch, pp. 506–522, 2004) is computationally consistent, and provide a new scheme that is statistically consistent. We also provide a transform of an anonymous identity-based encryption (IBE) scheme to a secure PEKS scheme that, unlike the previous one, guarantees consistency. Finally, we suggest three extensions of the basic notions considered here, namely anonymous hierarchical identity-based encryption, public-key encryption with temporary keyword search, and identity-based encryption with keyword search.

Journal ArticleDOI
TL;DR: In this paper, a colloidal aqueous suspension of cellulose whiskers was used as filler to obtain tensile properties of polyvinyl alcohol (PVA) nanocomposite materials.