
Showing papers by "IBM" published in 2004


Journal ArticleDOI
TL;DR: A publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates is described.
Abstract: The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we derive a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
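
As a concrete illustration of the kind of output LARS produces, here is a minimal sketch that computes a full Lasso path with scikit-learn's independent lars_path implementation on synthetic data; it is not the authors' published algorithm, and all data and settings below are made up.

```python
# Hedged sketch: the Lasso modification of LARS on synthetic data, via scikit-learn.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))             # 100 samples, 10 candidate covariates
beta = np.zeros(10)
beta[:3] = [2.0, -1.5, 1.0]                    # sparse ground truth
y = X @ beta + 0.1 * rng.standard_normal(100)

# method="lasso" applies the Lasso modification of LARS described in the paper,
# returning the entire regularization path in a single pass.
alphas, active, coefs = lars_path(X, y, method="lasso")

print("order in which covariates entered the model:", active)
print("coefficients at the least-regularized end:", coefs[:, -1].round(2))
```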

7,828 citations


Journal ArticleDOI
Shouheng Sun, Hao Zeng, David B. Robinson, Simone Raoux, Philip M. Rice, Shan X. Wang, Guanxiong Li
TL;DR: As-synthesized iron oxide nanoparticles have a cubic spinel structure, as characterized by HRTEM, SAED, and XRD; the hydrophobic particles can be transformed into hydrophilic ones by adding bipolar surfactants, and aqueous nanoparticle dispersions are readily made.
Abstract: High-temperature solution phase reaction of iron(III) acetylacetonate, Fe(acac)3, with 1,2-hexadecanediol in the presence of oleic acid and oleylamine leads to monodisperse magnetite (Fe3O4) nanoparticles. Similarly, reaction of Fe(acac)3 and Co(acac)2 or Mn(acac)2 with the same diol results in monodisperse CoFe2O4 or MnFe2O4 nanoparticles. Particle diameter can be tuned from 3 to 20 nm by varying reaction conditions or by seed-mediated growth. The as-synthesized iron oxide nanoparticles have a cubic spinel structure as characterized by HRTEM, SAED, and XRD. Further, Fe3O4 can be oxidized to Fe2O3, as evidenced by XRD, NEXAFS spectroscopy, and SQUID magnetometry. The hydrophobic nanoparticles can be transformed into hydrophilic ones by adding bipolar surfactants, and aqueous nanoparticle dispersion is readily made. These iron oxide nanoparticles and their dispersions in various media have great potential in magnetic nanodevice and biomagnetic applications.

3,244 citations


Journal ArticleDOI
TL;DR: Sputter-deposited polycrystalline MTJs grown on an amorphous underlayer, but with highly oriented MgO tunnel barriers and CoFe electrodes, exhibit TMR values of up to ∼220% at room temperature and ∼300% at low temperatures, which will accelerate the development of new families of spintronic devices.
Abstract: Magnetically engineered magnetic tunnel junctions (MTJs) show promise as non-volatile storage cells in high-performance solid-state magnetic random access memories (MRAM). The performance of these devices is currently limited by the modest (less than approximately 70%) room-temperature tunnelling magnetoresistance (TMR) of technologically relevant MTJs. Much higher TMR values have been theoretically predicted for perfectly ordered (100)-oriented single-crystalline Fe/MgO/Fe MTJs. Here we show that sputter-deposited polycrystalline MTJs grown on an amorphous underlayer, but with highly oriented (100) MgO tunnel barriers and CoFe electrodes, exhibit TMR values of up to approximately 220% at room temperature and approximately 300% at low temperatures. Consistent with these high TMR values, superconducting tunnelling spectroscopy experiments indicate that the tunnelling current has a very high spin polarization of approximately 85%, which rivals that previously observed only using half-metallic ferromagnets. Such high values of spin polarization and TMR in readily manufacturable and highly thermally stable devices (up to 400 degrees C) will accelerate the development of new families of spintronic devices.

2,931 citations


Journal ArticleDOI
TL;DR: This paper presents a middleware platform which addresses the issue of selecting Web services for the purpose of their composition in a way that maximizes user satisfaction expressed as utility functions over QoS attributes, while satisfying the constraints set by the user and by the structure of the composite service.
Abstract: The paradigmatic shift from a Web of manual interactions to a Web of programmatic interactions driven by Web services is creating unprecedented opportunities for the formation of online business-to-business (B2B) collaborations. In particular, the creation of value-added services by composition of existing ones is gaining a significant momentum. Since many available Web services provide overlapping or identical functionality, albeit with different quality of service (QoS), a choice needs to be made to determine which services are to participate in a given composite service. This paper presents a middleware platform which addresses the issue of selecting Web services for the purpose of their composition in a way that maximizes user satisfaction expressed as utility functions over QoS attributes, while satisfying the constraints set by the user and by the structure of the composite service. Two selection approaches are described and compared: one based on local (task-level) selection of services and the other based on global allocation of tasks to services using integer programming.
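
To make the local-versus-global distinction concrete, here is a toy sketch, not the paper's middleware: two tasks with two candidate services each, hypothetical (latency, price) QoS values, a made-up additive utility, and a global budget that only the global allocation respects. Exhaustive search stands in for the paper's integer programming.

```python
# Toy illustration of local (task-level) vs. global service selection.
from itertools import product

candidates = {                        # hypothetical QoS data: (latency_ms, price)
    "flight": [(120, 9.0), (60, 20.0)],
    "hotel":  [(200, 5.0), (90, 15.0)],
}
BUDGET = 25.0                         # global constraint on total price

def utility(latency, price):          # higher is better; weights are arbitrary
    return -0.05 * latency - 0.2 * price

# Local selection: best utility per task, blind to the global budget.
local = {t: max(svcs, key=lambda s: utility(*s)) for t, svcs in candidates.items()}

# Global selection: exhaustive search standing in for integer programming.
best, best_u = None, float("-inf")
for combo in product(*candidates.values()):
    if sum(s[1] for s in combo) <= BUDGET:
        u = sum(utility(*s) for s in combo)
        if u > best_u:
            best, best_u = combo, u

print("local pick (violates the budget here):", local)
print("global pick (respects the budget):", best)
```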

2,872 citations


Journal ArticleDOI
TL;DR: This paper surveys the ways in which Bloom filters have been used and modified in a variety of network problems, with the aim of providing a unified mathematical and practical framework for understanding them and stimulating their use in future applications.
Abstract: A Bloom filter is a simple space-efficient randomized data structure for representing a set in order to support membership queries. Bloom filters allow false positives but the space savings often outweigh this drawback when the probability of an error is controlled. Bloom filters have been used in database applications since the 1970s, but only in recent years have they become popular in the networking literature. The aim of this paper is to survey the ways in which Bloom filters have been used and modified in a variety of network problems, with the aim of providing a unified mathematical and practical framework for understanding them and stimulating their use in future applications.
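
For readers new to the data structure, a minimal Bloom filter sketch follows; the array size and hash construction are illustrative choices, not taken from the survey.

```python
# Minimal Bloom filter: k hash positions over an m-bit array.
# False positives are possible; false negatives are not.
import hashlib

class BloomFilter:
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8 + 1)

    def _positions(self, item: str):
        # derive k positions by salting one cryptographic hash
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: str):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

bf = BloomFilter()
bf.add("10.0.0.1")
print("10.0.0.1" in bf)   # True
print("10.0.0.2" in bf)   # almost certainly False at this low load
```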

2,199 citations


Journal ArticleDOI
TL;DR: A re-parameterization of the standard TIP4P water model for use with Ewald techniques is introduced, providing an overall global improvement in water properties relative to several popular nonpolarizable and polarizable water potentials.
Abstract: A re-parameterization of the standard TIP4P water model for use with Ewald techniques is introduced, providing an overall global improvement in water properties relative to several popular nonpolarizable and polarizable water potentials. Using high-precision simulations and careful application of standard analytical corrections, we show that the new TIP4P-Ew potential has a density maximum at approximately 1 °C, and reproduces experimental bulk densities and the enthalpy of vaporization, ΔH_vap, from −37.5 to 127 °C at 1 atm with an absolute average error of less than 1%. Structural properties are in very good agreement with x-ray scattering intensities at temperatures between 0 and 77 °C, and dynamical properties such as the self-diffusion coefficient are in excellent agreement with experiment. The parameterization approach used can be easily generalized to rehabilitate any water force field using available experimental data over a range of thermodynamic points.

1,741 citations


Proceedings ArticleDOI
17 May 2004
TL;DR: It is shown that a small number of expressed trusts/distrusts per individual allows us to predict trust between any two people in the system with high accuracy.
Abstract: A (directed) network of people connected by ratings or trust scores, and a model for propagating those trust scores, is a fundamental building block in many of today's most successful e-commerce and recommendation systems. We develop a framework of trust propagation schemes, each of which may be appropriate in certain circumstances, and evaluate the schemes on a large trust network consisting of 800K trust scores expressed among 130K people. We show that a small number of expressed trusts/distrusts per individual allows us to predict trust between any two people in the system with high accuracy. Our work appears to be the first to incorporate distrust in a computational trust propagation setting.
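
As a toy illustration of one building block in such a framework, the sketch below applies direct propagation only, summing weighted walks over a made-up 4-person trust matrix; the paper's full framework combines several atomic propagations and handles distrust, which this sketch omits.

```python
# Direct trust propagation as weighted powers of the trust matrix B.
import numpy as np

# hypothetical 4-person network: B[i, j] = 1 means person i trusts person j
B = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

# propagated trust: discounted sum over walks of length 1..3
weights = [1.0, 0.5, 0.25]
P = sum(w * np.linalg.matrix_power(B, n + 1) for n, w in enumerate(weights))
print(np.round(P, 2))   # P[0, 3] > 0: trust inferred along the chain 0 -> 1 -> 2 -> 3
```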

1,583 citations


Journal ArticleDOI
15 Jul 2004 - Nature
TL;DR: The long relaxation time of the measured signal suggests that the state of an individual spin can be monitored for extended periods of time, even while subjected to a complex set of manipulations that are part of the MRFM measurement protocol.
Abstract: Magnetic resonance imaging (MRI) is well known as a powerful technique for visualizing subsurface structures with three-dimensional spatial resolution. Pushing the resolution below 1 μm remains a major challenge, however, owing to the sensitivity limitations of conventional inductive detection techniques. Currently, the smallest volume elements in an image must contain at least 10^12 nuclear spins for MRI-based microscopy, or 10^7 electron spins for electron spin resonance microscopy. Magnetic resonance force microscopy (MRFM) was proposed as a means to improve detection sensitivity to the single-spin level, and thus enable three-dimensional imaging of macromolecules (for example, proteins) with atomic resolution. MRFM has also been proposed as a qubit readout device for spin-based quantum computers. Here we report the detection of an individual electron spin by MRFM. A spatial resolution of 25 nm in one dimension was obtained for an unpaired spin in silicon dioxide. The measured signal is consistent with a model in which the spin is aligned parallel or anti-parallel to the effective field, with a rotating-frame relaxation time of 760 ms. The long relaxation time suggests that the state of an individual spin can be monitored for extended periods of time, even while subjected to a complex set of manipulations that are part of the MRFM measurement protocol.

1,379 citations


Proceedings ArticleDOI
13 Jun 2004
TL;DR: This work presents an order-preserving encryption scheme for numeric data that allows any comparison operation to be directly applied on encrypted data, and is robust against estimation of the true value in such environments.
Abstract: Encryption is a well established technology for protecting sensitive data. However, once encrypted, data can no longer be easily queried aside from exact matches. We present an order-preserving encryption scheme for numeric data that allows any comparison operation to be directly applied on encrypted data. Query results produced are sound (no false hits) and complete (no false drops). Our scheme handles updates gracefully and new values can be added without requiring changes in the encryption of other values. It allows standard database indexes to be built over encrypted tables and can easily be integrated with existing database systems. The proposed scheme has been designed to be deployed in application environments in which the intruder can get access to the encrypted database, but does not have prior domain information such as the distribution of values and cannot encrypt or decrypt arbitrary values of his choice. The encryption is robust against estimation of the true value in such environments.
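
The sketch below is a deliberately naive order-preserving map, shown only to make the "comparisons work directly on ciphertexts" property concrete; it is not the paper's scheme, which additionally models and flattens the plaintext distribution precisely to resist the estimation attacks discussed above.

```python
# Toy order-preserving map: strictly increasing random ciphertexts.
import random

def build_ope_table(domain_size: int, seed: int = 42):
    rng = random.Random(seed)          # the seed plays the role of the secret key
    table, c = {}, 0
    for p in range(domain_size):
        c += rng.randint(1, 100)       # strictly positive random gap keeps order
        table[p] = c
    return table

enc = build_ope_table(1000)
a, b = enc[17], enc[42]
assert (a < b) == (17 < 42)            # order carries over to ciphertexts
print(17, "->", a, "|", 42, "->", b)
```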

1,303 citations


Journal ArticleDOI
TL;DR: A macroscopic characterization of topic propagation through the authors' corpus, formalizing the notion of long-running "chatter" topics consisting recursively of "spike" topics generated by outside world events, or more rarely, by resonances within the community.
Abstract: We study the dynamics of information propagation in environments of low-overhead personal publishing, using a large collection of WebLogs over time as our example domain. We characterize and model this collection at two levels. First, we present a macroscopic characterization of topic propagation through our corpus, formalizing the notion of long-running "chatter" topics consisting recursively of "spike" topics generated by outside world events, or more rarely, by resonances within the community. Second, we present a microscopic characterization of propagation from individual to individual, drawing on the theory of infectious diseases to model the flow. We propose, validate, and employ an algorithm to induce the underlying propagation network from a sequence of posts, and report on the results.

1,292 citations


Journal ArticleDOI
TL;DR: An elegant, efficient measurement method that yields the elastic moduli of nanoscale polymer films in a rapid and quantitative manner without the need for expensive equipment or material-specific modelling is introduced.
Abstract: As technology continues towards smaller, thinner and lighter devices, more stringent demands are placed on thin polymer films as diffusion barriers, dielectric coatings, electronic packaging and so on. Therefore, there is a growing need for testing platforms to rapidly determine the mechanical properties of thin polymer films and coatings. We introduce here an elegant, efficient measurement method that yields the elastic moduli of nanoscale polymer films in a rapid and quantitative manner without the need for expensive equipment or material-specific modelling. The technique exploits a buckling instability that occurs in bilayers consisting of a stiff, thin film coated onto a relatively soft, thick substrate. Using the spacing of these highly periodic wrinkles, we calculate the film's elastic modulus by applying well-established buckling mechanics. We successfully apply this new measurement platform to several systems displaying a wide range of thicknesses (nanometre to micrometre) and moduli (MPa to GPa).

Journal ArticleDOI
TL;DR: In this article, a two-part review summarizes the observations, theory, and simulations of interstellar turbulence and their implications for many fields of astrophysics, including basic fluid equations, solenoidal and compressible modes, global inviscid quadratic invariants, scaling arguments for the power spectrum, phenomenological models for the scaling of higher-order structu...
Abstract: Turbulence affects the structure and motions of nearly all temperature and density regimes in the interstellar gas. This two-part review summarizes the observations, theory, and simulations of interstellar turbulence and their implications for many fields of astrophysics. The first part begins with diagnostics for turbulence that have been applied to the cool interstellar medium and highlights their main results. The energy sources for interstellar turbulence are then summarized along with numerical estimates for their power input. Supernovae and superbubbles dominate the total power, but many other sources spanning a large range of scales, from swing-amplified gravitational instabilities to cosmic ray streaming, all contribute in some way. Turbulence theory is considered in detail, including the basic fluid equations, solenoidal and compressible modes, global inviscid quadratic invariants, scaling arguments for the power spectrum, phenomenological models for the scaling of higher-order structu...

Proceedings ArticleDOI
Tong Zhang
04 Jul 2004
TL;DR: Stochastic gradient descent algorithms on regularized forms of linear prediction methods, related to online algorithms such as the perceptron, are studied, and numerical rates of convergence for such algorithms are obtained.
Abstract: Linear prediction methods, such as least squares for regression, and logistic regression and support vector machines for classification, have been extensively used in statistics and machine learning. In this paper, we study stochastic gradient descent (SGD) algorithms on regularized forms of linear prediction methods. This class of methods, related to online algorithms such as the perceptron, is both efficient and very simple to implement. We obtain numerical rates of convergence for such algorithms and discuss their implications. Experiments on text data are provided to demonstrate the numerical and statistical consequences of our theoretical findings.
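
A minimal sketch of the setting studied: plain SGD on an L2-regularized logistic-regression objective over synthetic data. The step size and regularization constant are arbitrary choices, not values from the paper.

```python
# SGD on the regularized logistic loss: log(1 + e^{-y w.x}) + (lam/2)||w||^2.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = (X @ w_true + 0.1 * rng.standard_normal(500) > 0) * 2.0 - 1.0  # labels in {-1, +1}

w, lam, eta = np.zeros(5), 1e-3, 0.1
for epoch in range(20):
    for i in rng.permutation(len(y)):
        margin = y[i] * (w @ X[i])
        # gradient of the per-example regularized loss
        grad = -y[i] * X[i] / (1.0 + np.exp(margin)) + lam * w
        w -= eta * grad

print("recovered direction:", np.round(w / np.linalg.norm(w), 2))
```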


Journal ArticleDOI
TL;DR: A multi-document summarizer, MEAD, is presented, which generates summaries using cluster centroids produced by a topic detection and tracking system and an evaluation scheme based on sentence utility and subsumption is applied.
Abstract: We present a multi-document summarizer, MEAD, which generates summaries using cluster centroids produced by a topic detection and tracking system. We describe two new techniques, a centroid-based summarizer, and an evaluation scheme based on sentence utility and subsumption. We have applied this evaluation to both single and multiple document summaries. Finally, we describe two user studies that test our models of multi-document summarization.
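
To make the centroid idea concrete, here is a bare-bones sketch, not MEAD itself: sentences are scored by the summed cluster-wide term weights of their words, and the top sentence becomes the summary. MEAD proper adds positional and overlap features plus redundancy (subsumption) handling, which this sketch omits.

```python
# Centroid-style sentence scoring over a tiny, made-up two-document cluster.
from collections import Counter

docs = [
    "The storm hit the coast on Monday. Thousands lost power.",
    "Power outages followed the storm. Crews are restoring power.",
]
sentences = [s.strip() for d in docs for s in d.split(".") if s.strip()]

tokenized = [s.lower().split() for s in sentences]
centroid = Counter(w for toks in tokenized for w in toks)  # cluster term weights

def score(tokens):
    # sum the centroid weights of the distinct words in the sentence
    return sum(centroid[w] for w in set(tokens))

ranked = sorted(zip(sentences, tokenized), key=lambda p: -score(p[1]))
print("1-sentence summary:", ranked[0][0])
```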

Proceedings Article
13 Aug 2004
TL;DR: This work shows that many of the Microsoft NGSCB guarantees can be obtained on today's hardware and today's software and that these guarantees do not require a new CPU mode or operating system but merely depend on the availability of an independent trusted entity, a TPM for example.
Abstract: We present the design and implementation of a secure integrity measurement system for Linux. All executable content that is loaded onto the Linux system is measured before execution and these measurements are protected by the Trusted Platform Module (TPM) that is part of the Trusted Computing Group (TCG) standards. Our system is the first to extend the TCG trust measurement concepts to dynamic executable content from the BIOS all the way up into the application layer. In effect, we show that many of the Microsoft NGSCB guarantees can be obtained on today's hardware and today's software and that these guarantees do not require a new CPU mode or operating system but merely depend on the availability of an independent trusted entity, a TPM for example. We apply our trust measurement architecture to a web server application where we show how our system can detect undesirable invocations, such as rootkit programs, and that our measurement architecture is practical in terms of the number of measurements taken and the performance impact of making them.
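
The core measure-and-extend pattern the architecture builds on can be sketched in a few lines. This toy version hashes files in software and folds them into an aggregate the way a TPM extends a PCR (new = H(old || measurement)); the real system hooks the kernel's loader and anchors the aggregate in TPM hardware, neither of which is shown here.

```python
# Toy measure-and-extend: the final register value commits to the load sequence.
import hashlib

def measure(path: str) -> bytes:
    with open(path, "rb") as f:
        return hashlib.sha1(f.read()).digest()   # SHA-1 matches TCG-era usage

def extend(pcr: bytes, measurement: bytes) -> bytes:
    return hashlib.sha1(pcr + measurement).digest()  # PCR_new = H(PCR_old || m)

pcr = bytes(20)                       # registers start at zero
measurement_list = []                 # kept so a verifier can replay the chain
for path in ["/bin/sh"]:              # illustrative load sequence; adjust per system
    m = measure(path)
    measurement_list.append((path, m.hex()))
    pcr = extend(pcr, m)

print("measurements:", measurement_list)
print("aggregate:", pcr.hex())
```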

Journal ArticleDOI
TL;DR: A new philosophy in designing image and video quality metrics is followed, which uses structural distortion as an estimate of perceived visual distortion as part of full-reference (FR) video quality assessment.
Abstract: Objective image and video quality measures play important roles in a variety of image and video processing applications, such as compression, communication, printing, analysis, registration, restoration, enhancement and watermarking. Most proposed quality assessment approaches in the literature are error sensitivity-based methods. In this paper, we follow a new philosophy in designing image and video quality metrics, which uses structural distortion as an estimate of perceived visual distortion. A computationally efficient approach is developed for full-reference (FR) video quality assessment. The algorithm is tested on the video quality experts group (VQEG) Phase I FR-TV test data set. Keywords: image quality assessment, video quality assessment, human visual system, error sensitivity, structural distortion, video quality experts group (VQEG)
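
A whole-frame version of a structural-distortion index in this family can be written compactly. The sketch below uses the standard stabilizing constants from the closely related SSIM work and applies the statistics globally, whereas the paper's method works on local windows and across video frames.

```python
# Global structural-similarity index between two frames (SSIM-style statistics).
import numpy as np

def ssim_global(x: np.ndarray, y: np.ndarray, L: float = 255.0) -> float:
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2       # standard stabilizers
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / (
        (mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64)).astype(float)
noisy = np.clip(frame + rng.normal(0, 10, frame.shape), 0, 255)
print("SSIM(frame, frame):", ssim_global(frame, frame))    # 1.0 by construction
print("SSIM(frame, noisy):", round(ssim_global(frame, noisy), 3))
```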

Journal ArticleDOI
Yurii A. Vlasov, Sharee J. McNab
TL;DR: Propagation and bending losses in single-mode silicon waveguides with submicron dimensions, fabricated on silicon-on-insulator wafers, are accurately measured; the record-low numbers can be used as a benchmark for further development of silicon microphotonic components and circuits.
Abstract: We report the fabrication and accurate measurement of propagation and bending losses in single-mode silicon waveguides with submicron dimensions fabricated on silicon-on-insulator wafers. Owing to the small sidewall surface roughness achieved by processing on a standard 200 mm CMOS fabrication line, minimal propagation losses of 3.6±0.1 dB/cm for the TE polarization were measured at the telecommunications wavelength of 1.5 μm. Losses per 90° bend are measured to be 0.086±0.005 dB for a bending radius of 1 μm and as low as 0.013±0.005 dB for a bend radius of 2 μm. These record low numbers can be used as a benchmark for further development of silicon microphotonic components and circuits.

Proceedings ArticleDOI
19 May 2004
TL;DR: This paper presents an open, fair and dynamic QoS computation model for web services selection, through implementation of and experimentation with a QoS registry in a hypothetical phone service provisioning marketplace application.
Abstract: The emerging Service-Oriented Computing (SOC) paradigm promises to enable businesses and organizations to collaborate in an unprecedented way by means of standard web services. To support rapid and dynamic composition of services in this paradigm, web services that meet requesters' functional requirements must be located and bound dynamically from a large and constantly changing number of service providers, based on their Quality of Service (QoS). In order to enable quality-driven web service selection, we need an open, fair, dynamic and secure framework to evaluate the QoS of a vast number of web services. The fair computation and enforcement of QoS of web services should have minimal overhead yet still achieve sufficient trust from both service requesters and providers. In this paper, we present an open, fair and dynamic QoS computation model for web services selection, through implementation of and experimentation with a QoS registry in a hypothetical phone service provisioning marketplace application.

Journal ArticleDOI
David A. Ferrucci, Adam Lally
TL;DR: A general introduction to UIMA is given focusing on the design points of its analysis engine architecture, and how UIMA is helping to accelerate research and technology transfer is discussed.
Abstract: IBM Research has over 200 people working on Unstructured Information Management (UIM) technologies with a strong focus on Natural Language Processing (NLP). These researchers are engaged in activities ranging from natural language dialog, information retrieval, topic-tracking, named-entity detection, document classification and machine translation to bioinformatics and open-domain question answering. An analysis of these activities strongly suggested that improving the organization's ability to quickly discover each other's results and rapidly combine different technologies and approaches would accelerate scientific advance. Furthermore, the ability to reuse and combine results through a common architecture and a robust software framework would accelerate the transfer of research results in NLP into IBM's product platforms. Market analyses indicating a growing need to process unstructured information, specifically multilingual, natural language text, coupled with IBM Research's investment in NLP, led to the development of middleware architecture for processing unstructured information dubbed UIMA. At the heart of UIMA are powerful search capabilities and a data-driven framework for the development, composition and distributed deployment of analysis engines. In this paper we give a general introduction to UIMA focusing on the design points of its analysis engine architecture and we discuss how UIMA is helping to accelerate research and technology transfer.

Journal ArticleDOI
18 May 2004
TL;DR: This work presents various methods that monolithically bind a cryptographic key with the biometric template of a user stored in the database in such a way that the key cannot be revealed without a successful biometric authentication.
Abstract: In traditional cryptosystems, user authentication is based on possession of secret keys; the method falls apart if the keys are not kept secret (i.e., shared with non-legitimate users). Further, keys can be forgotten, lost, or stolen and, thus, cannot provide non-repudiation. Current authentication systems based on physiological and behavioral characteristics of persons (known as biometrics), such as fingerprints, inherently provide solutions to many of these problems and may replace the authentication component of traditional cryptosystems. We present various methods that monolithically bind a cryptographic key with the biometric template of a user stored in the database in such a way that the key cannot be revealed without a successful biometric authentication. We assess the performance of one of these biometric key binding/generation algorithms using the fingerprint biometric. We illustrate the challenges involved in biometric key generation primarily due to drastic acquisition variations in the representation of a biometric identifier and the imperfect nature of biometric feature extraction and matching algorithms. We elaborate on the suitability of these algorithms for digital rights management systems.
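
One classic key-binding construction in this family is fuzzy commitment (Juels and Wattenberg), sketched below with a simple repetition code over made-up bit templates. The paper evaluates fingerprint-based binding/generation algorithms rather than this exact toy, but the flow is the same: bind a key to an enrollment template, and release it only from a close-enough fresh sample.

```python
# Toy fuzzy commitment: bind a key to a noisy biometric bit template.
import hashlib, secrets

R = 5  # each key bit is repeated R times; each block tolerates < R/2 bit flips

def bind(key_bits, template_bits):
    codeword = [b for b in key_bits for _ in range(R)]
    offset = [c ^ t for c, t in zip(codeword, template_bits)]
    return offset, hashlib.sha256(bytes(key_bits)).hexdigest()

def release(offset, template_bits, key_hash):
    codeword = [o ^ t for o, t in zip(offset, template_bits)]
    key = [int(sum(codeword[i*R:(i+1)*R]) > R // 2)      # majority decode
           for i in range(len(codeword) // R)]
    return key if hashlib.sha256(bytes(key)).hexdigest() == key_hash else None

key = [secrets.randbelow(2) for _ in range(16)]
template = [secrets.randbelow(2) for _ in range(16 * R)]   # enrollment scan
noisy = template[:]; noisy[3] ^= 1; noisy[40] ^= 1         # later, noisy scan
offset, h = bind(key, template)
print("key recovered from noisy scan:", release(offset, noisy, h) == key)  # True
```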

Proceedings ArticleDOI
17 May 2004
TL;DR: A macroscopic characterization of topic propagation through the authors' corpus is presented, formalizing the notion of long-running "chatter" topics consisting recursively of "spike" topics generated by outside world events, or more rarely, by resonances within the community.
Abstract: We study the dynamics of information propagation in environments of low-overhead personal publishing, using a large collection of weblogs over time as our example domain. We characterize and model this collection at two levels. First, we present a macroscopic characterization of topic propagation through our corpus, formalizing the notion of long-running "chatter" topics consisting recursively of "spike" topics generated by outside world events, or more rarely, by resonances within the community. Second, we present a microscopic characterization of propagation from individual to individual, drawing on the theory of infectious diseases to model the flow. We propose, validate, and employ an algorithm to induce the underlying propagation network from a sequence of posts, and report on the results.

Patent
20 Dec 2004
TL;DR: An apparatus, system, and method for provisioning a database resource within a grid database system are described, where an analysis module analyzes a data query stream from an application to a database instance and determines if the query stream exhibits a predetermined performance attribute.
Abstract: An apparatus, system, and method are disclosed for provisioning a database resource within a grid database system. The federation apparatus includes an analysis module and a provision module. The analysis module analyzes a data query stream from an application to a database instance and determines if the data query stream exhibits a predetermined performance attribute. The provision module provisions a database resource in response to a determination that the data query stream exhibits the predetermined performance attribute. The provisioned database resource may be a database instance or a cache. The provisioning of the new database resource is advantageously substantially transparent to a client on the database system.

Book ChapterDOI
02 May 2004
TL;DR: This work proposes a simple and efficient construction of a CCA-secure public-key encryption scheme from any CPA-secure identity-based encryption (IBE) scheme, which avoids non-interactive proofs of “well-formedness” which were shown to underlie most previous constructions.
Abstract: We propose a simple and efficient construction of a CCA-secure public-key encryption scheme from any CPA-secure identity-based encryption (IBE) scheme. Our construction requires the underlying IBE scheme to satisfy only a relatively “weak” notion of security which is known to be achievable without random oracles; thus, our results provide a new approach for constructing CCA-secure encryption schemes in the standard model. Our approach is quite different from existing ones; in particular, it avoids non-interactive proofs of “well-formedness” which were shown to underlie most previous constructions. Furthermore, applying our conversion to some recently-proposed IBE schemes results in CCA-secure schemes whose efficiency makes them quite practical.

Proceedings ArticleDOI
25 Apr 2004
TL;DR: This paper investigates the dynamics of Wikipedia, a prominent, thriving wiki, and focuses on the relevance of authorship, the value of community surveillance in ameliorating antisocial behavior, and how authors with competing perspectives negotiate their differences.
Abstract: The Internet has fostered an unconventional and powerful style of collaboration: "wiki" web sites, where every visitor has the power to become an editor. In this paper we investigate the dynamics of Wikipedia, a prominent, thriving wiki. We make three contributions. First, we introduce a new exploratory data analysis tool, the history flow visualization, which is effective in revealing patterns within the wiki context and which we believe will be useful in other collaborative situations as well. Second, we discuss several collaboration patterns highlighted by this visualization tool and corroborate them with statistical analysis. Third, we discuss the implications of these patterns for the design and governance of online collaborative social spaces. We focus on the relevance of authorship, the value of community surveillance in ameliorating antisocial behavior, and how authors with competing perspectives negotiate their differences.

Proceedings ArticleDOI
Bianca Zadrozny
04 Jul 2004
TL;DR: This paper formalizes the sample selection bias problem in machine learning terms and studies analytically and experimentally how a number of well-known classifier learning methods are affected by it.
Abstract: Classifier learning methods commonly assume that the training data consist of randomly drawn examples from the same distribution as the test examples about which the learned model is expected to make predictions. In many practical situations, however, this assumption is violated, in a problem known in econometrics as sample selection bias. In this paper, we formalize the sample selection bias problem in machine learning terms and study analytically and experimentally how a number of well-known classifier learning methods are affected by it. We also present a bias correction method that is particularly useful for classifier evaluation under sample selection bias.
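
A small sketch of the reweighting idea used for correction: examples are weighted by the inverse of their selection probability (known here by construction), so the reweighted training sample mimics the test distribution. A nonlinear truth plus a linear fit shows how an uncorrected learner goes wrong when the model is misspecified. All data and probabilities below are synthetic.

```python
# Importance weighting for sample selection bias: weight by 1 / P(selected | x).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 20000)
y = x**2 + rng.normal(0, 0.1, x.size)        # true relation is nonlinear
p_select = 1.0 / (1.0 + np.exp(3 * x))       # biased sampling: prefers x < 0
sel = rng.uniform(size=x.size) < p_select
xs, ys, w = x[sel], y[sel], 1.0 / p_select[sel]

# Best linear fit on the full distribution has slope ~0 (x**2 is symmetric).
naive_slope = np.polyfit(xs, ys, 1)[0]                   # fooled by the bias
weighted_slope = np.polyfit(xs, ys, 1, w=np.sqrt(w))[0]  # polyfit weights residuals
print("naive slope:", round(naive_slope, 2),
      "| weighted slope:", round(weighted_slope, 2))     # weighted is near 0
```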

Proceedings ArticleDOI
25 Oct 2004
TL;DR: In this article, direct anonymous attestation (DAA) is proposed for remote authentication of a hardware module, called Trusted Platform Module (TPM), while preserving the privacy of the user of the platform that contains the module.
Abstract: This paper describes the direct anonymous attestation scheme (DAA). This scheme was adopted by the Trusted Computing Group (TCG) as the method for remote authentication of a hardware module, called Trusted Platform Module (TPM), while preserving the privacy of the user of the platform that contains the module. DAA can be seen as a group signature without the feature that a signature can be opened, i.e., the anonymity is not revocable. Moreover, DAA allows for pseudonyms, i.e., for each signature a user (in agreement with the recipient of the signature) can decide whether or not the signature should be linkable to another signature. DAA furthermore allows for detection of "known" keys: if the DAA secret keys are extracted from a TPM and published, a verifier can detect that a signature was produced using these secret keys. The scheme is provably secure in the random oracle model under the strong RSA and the decisional Diffie-Hellman assumption.

Journal ArticleDOI
TL;DR: How the structure of the nanotube is the key enabler of this particular one-dimensional tunneling effect is discussed; current flow is controlled here by the valence and conduction band edges in a bandpass-filter-like arrangement.
Abstract: A detailed study on the mechanism of band-to-band tunneling in carbon nanotube field-effect transistors (CNFETs) is presented. Through a dual-gated CNFET structure tunneling currents from the valence into the conduction band and vice versa can be enabled or disabled by changing the gate potential. Different from a conventional device where the Fermi distribution ultimately limits the gate voltage range for switching the device on or off, current flow is controlled here by the valence and conduction band edges in a bandpass-filter-like arrangement. We discuss how the structure of the nanotube is the key enabler of this particular one-dimensional tunneling effect.

Journal ArticleDOI
TL;DR: In this article, the authors take a critical look at the relationship between the security of cryptographic schemes in the Random Oracle Model and the security of the schemes that result from implementing the random oracle by so-called "cryptographic hash functions".
Abstract: We take a critical look at the relationship between the security of cryptographic schemes in the Random Oracle Model, and the security of the schemes that result from implementing the random oracle by so-called "cryptographic hash functions". The main result of this article is a negative one: There exist signature and encryption schemes that are secure in the Random Oracle Model, but for which any implementation of the random oracle results in insecure schemes. In the process of devising the above schemes, we consider possible definitions for the notion of a "good implementation" of a random oracle, pointing out limitations and challenges.

Proceedings ArticleDOI
13 Jun 2004
TL;DR: The gIndex approach not only provides an elegant solution to the graph indexing problem, but also demonstrates how database indexing and query processing can benefit from data mining, especially frequent pattern mining.
Abstract: Graph has become increasingly important in modelling complicated structures and schemaless data such as proteins, chemical compounds, and XML documents. Given a graph query, it is desirable to retrieve graphs quickly from a large database via graph-based indices. In this paper, we investigate the issues of indexing graphs and propose a novel solution by applying a graph mining technique. Different from the existing path-based methods, our approach, called gIndex, makes use of frequent substructure as the basic indexing feature. Frequent substructures are ideal candidates since they explore the intrinsic characteristics of the data and are relatively stable to database updates. To reduce the size of index structure, two techniques, size-increasing support constraint and discriminative fragments, are introduced. Our performance study shows that gIndex has 10 times smaller index size, but achieves 3--10 times better performance in comparison with a typical path-based method, GraphGrep. The gIndex approach not only provides an elegant solution to the graph indexing problem, but also demonstrates how database indexing and query processing can benefit from data mining, especially frequent pattern mining. Furthermore, the concepts developed here can be applied to indexing sequences, trees, and other complicated structures as well.
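
A sketch of the filter step in feature-based graph indexing: an inverted index maps features to the graphs containing them, and a query's candidate set is the intersection over the query's features, after which exact subgraph-isomorphism verification would run. For brevity, labeled edges stand in for gIndex's mined frequent, discriminative substructures.

```python
# Feature-based candidate filtering for graph queries (gIndex-style, simplified).

def features(graph_edges):
    # stand-in feature set: labeled edges; gIndex instead mines frequent,
    # discriminative substructures up to a size bound
    return set(graph_edges)

db = {
    "g1": [("C", "C"), ("C", "O")],
    "g2": [("C", "N"), ("N", "O")],
    "g3": [("C", "C"), ("C", "N")],
}

index = {}                                   # inverted index: feature -> graph ids
for gid, edges in db.items():
    for f in features(edges):
        index.setdefault(f, set()).add(gid)

query = [("C", "C"), ("C", "N")]
candidate_sets = [index.get(f, set()) for f in features(query)]
candidates = set.intersection(*candidate_sets) if candidate_sets else set()
print("candidates for exact verification:", candidates)   # only g3 survives
```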