
Showing papers by "IBM published in 1993"


Journal ArticleDOI
TL;DR: A description of the ab initio quantum chemistry package GAMESS is presented; chemical systems containing atoms up through radon can be treated with wave functions ranging from the simplest closed-shell case up to a general MCSCF case, permitting calculations at the necessary level of sophistication.
Abstract: A description of the ab initio quantum chemistry package GAMESS is presented. Chemical systems containing atoms through radon can be treated with wave functions ranging from the simplest closed-shell case up to a general MCSCF case, permitting calculations at the necessary level of sophistication. Emphasis is given to novel features of the program. The parallelization strategy used in the RHF, ROHF, UHF, and GVB sections of the program is described, and detailed speedup results are given. Parallel calculations can be run on ordinary workstations as well as dedicated parallel machines. © John Wiley & Sons, Inc.

18,546 citations


Proceedings ArticleDOI
01 Jun 1993
TL;DR: An efficient algorithm is presented that generates all significant association rules between items in a database of customer transactions, incorporating buffer management and novel estimation and pruning techniques.
Abstract: We are given a large database of customer transactions. Each transaction consists of items purchased by a customer in a visit. We present an efficient algorithm that generates all significant association rules between items in the database. The algorithm incorporates buffer management and novel estimation and pruning techniques. We also present results of applying this algorithm to sales data obtained from a large retailing company, which show the effectiveness of the algorithm.
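
As a rough illustration of the support/confidence framing the abstract describes (not the authors' algorithm, which adds buffer management, estimation, and pruning), a minimal brute-force rule generator in Python might look like the following; all names, thresholds, and the toy baskets are illustrative.

```python
from itertools import combinations

def association_rules(transactions, min_support=0.3, min_confidence=0.6):
    """Naive generate-and-test miner: enumerate small itemsets, keep the frequent
    ones, then emit rules X -> Y whose confidence clears the threshold."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    support = {}
    for size in range(1, 4):                      # brute force up to 3-itemsets
        for itemset in combinations(items, size):
            count = sum(1 for t in transactions if set(itemset) <= set(t))
            if count / n >= min_support:
                support[frozenset(itemset)] = count / n
    rules = []
    for itemset, supp in support.items():
        if len(itemset) < 2:
            continue
        for k in range(1, len(itemset)):          # split into antecedent -> consequent
            for antecedent in combinations(itemset, k):
                a = frozenset(antecedent)
                if a in support:
                    conf = supp / support[a]
                    if conf >= min_confidence:
                        rules.append((set(a), set(itemset - a), supp, conf))
    return rules

baskets = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter", "bread"}, {"milk"}]
for lhs, rhs, supp, conf in association_rules(baskets):
    print(f"{lhs} -> {rhs}  support={supp:.2f} confidence={conf:.2f}")
```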

15,645 citations


Proceedings ArticleDOI
Mihir Bellare1, Phillip Rogaway1
01 Dec 1993
TL;DR: It is argued that the random oracle model—where all parties have access to a public random oracle—provides a bridge between cryptographic theory and cryptographic practice, and yields protocols much more efficient than standard ones while retaining many of the advantages of provable security.
Abstract: We argue that the random oracle model—where all parties have access to a public random oracle—provides a bridge between cryptographic theory and cryptographic practice. In the paradigm we suggest, a practical protocol P is produced by first devising and proving correct a protocol PR for the random oracle model, and then replacing oracle accesses by the computation of an “appropriately chosen” function h. This paradigm yields protocols much more efficient than standard ones while retaining many of the advantages of provable security. We illustrate these gains for problems including encryption, signatures, and zero-knowledge proofs.
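
As a toy illustration of the "replace oracle accesses by a concrete hash function" step (my own choice of primitive and construction, not a protocol from the paper), here is a hash-based commitment in which SHA-256 stands in for the public random oracle:

```python
import hashlib, os

def H(data: bytes) -> bytes:
    """Concrete function standing in for the public random oracle."""
    return hashlib.sha256(data).digest()

def commit(message: bytes):
    """Commit to a message: c = H(r || m) with fresh randomness r."""
    r = os.urandom(32)
    return H(r + message), r          # (commitment, opening value)

def verify(commitment: bytes, message: bytes, r: bytes) -> bool:
    return H(r + message) == commitment

c, r = commit(b"attack at dawn")
print(verify(c, b"attack at dawn", r))   # True
print(verify(c, b"attack at dusk", r))   # False
```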

5,313 citations


Journal Article
TL;DR: The authors describe a series of five statistical models of the translation process and give algorithms for estimating the parameters of these models given a set of pairs of sentences that are translations of one another.
Abstract: We describe a series of five statistical models of the translation process and give algorithms for estimating the parameters of these models given a set of pairs of sentences that are translations of one another. We define a concept of word-by-word alignment between such pairs of sentences. For any given pair of such sentences each of our models assigns a probability to each of the possible word-by-word alignments. We give an algorithm for seeking the most probable of these alignments. Although the algorithm is suboptimal, the alignment thus obtained accounts well for the word-by-word relationships in the pair of sentences. We have a great deal of data in French and English from the proceedings of the Canadian Parliament. Accordingly, we have restricted our work to these two languages; but we feel that because our algorithms have minimal linguistic content they would work well on other pairs of languages. We also feel, again because of the minimal linguistic content of our algorithms, that it is reasonable to argue that word-by-word alignments are inherent in any sufficiently large bilingual corpus.
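
A compact sketch of the expectation-maximization training loop for the simplest of these models (commonly known as IBM Model 1); the variable names and the tiny corpus are illustrative, and real training adds NULL words, the richer Models 2 through 5, and far larger data.

```python
from collections import defaultdict

def train_model1(sentence_pairs, iterations=10):
    """EM estimation of word translation probabilities t(f|e) for IBM Model 1."""
    f_vocab = {f for _, fs in sentence_pairs for f in fs}
    t = defaultdict(lambda: 1.0 / len(f_vocab))        # uniform initialization
    for _ in range(iterations):
        count = defaultdict(float)
        total = defaultdict(float)
        for es, fs in sentence_pairs:
            for f in fs:
                z = sum(t[(f, e)] for e in es)         # normalize over alignments
                for e in es:
                    c = t[(f, e)] / z                  # expected alignment count
                    count[(f, e)] += c
                    total[e] += c
        for (f, e), c in count.items():
            t[(f, e)] = c / total[e]                   # M-step
    return t

pairs = [(["the", "house"], ["la", "maison"]),
         (["the", "book"], ["le", "livre"]),
         (["a", "book"], ["un", "livre"])]
t = train_model1(pairs)
print(round(t[("maison", "house")], 3), round(t[("livre", "book")], 3))
```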

4,693 citations


Journal ArticleDOI
Abhay Parekh1, Robert G. Gallager1
TL;DR: Worst-case bounds on delay and backlog are derived for leaky bucket constrained sessions in arbitrary topology networks of generalized processor sharing (GPS) servers and the effectiveness of PGPS in guaranteeing worst-case session delay is demonstrated under certain assignments.
Abstract: Worst-case bounds on delay and backlog are derived for leaky bucket constrained sessions in arbitrary topology networks of generalized processor sharing (GPS) servers. The inherent flexibility of the service discipline is exploited to analyze broad classes of networks. When only a subset of the sessions are leaky bucket constrained, we give succinct per-session bounds that are independent of the behavior of the other sessions and also of the network topology. However, these bounds are only shown to hold for each session that is guaranteed a backlog clearing rate that exceeds the token arrival rate of its leaky bucket. A much broader class of networks, called consistent relative session treatment (CRST) networks is analyzed for the case in which all of the sessions are leaky bucket constrained. First, an algorithm is presented that characterizes the internal traffic in terms of average rate and burstiness, and it is shown that all CRST networks are stable. Next, a method is presented that yields bounds on session delay and backlog given this internal traffic characterization. The links of a route are treated collectively, yielding tighter bounds than those that result from adding the worst-case delays (backlogs) at each of the links in the route. The bounds on delay and backlog for each session are efficiently computed from a universal service curve, and it is shown that these bounds are achieved by "staggered" greedy regimes when an independent sessions relaxation holds. Propagation delay is also incorporated into the model. Finally, the analysis of arbitrary topology GPS networks is related to Packet GPS networks (PGPS). The PGPS scheme was first proposed by Demers, Shenker and Keshav (1991) under the name of weighted fair queueing. For small packet sizes, the behavior of the two schemes is seen to be virtually identical, and the effectiveness of PGPS in guaranteeing worst-case session delay is demonstrated under certain assignments.
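
A minimal sketch of the packet-by-packet approximation (PGPS / weighted fair queueing) for the special case in which every session is continuously backlogged, so each packet's virtual finish time is simply the previous one plus length divided by weight; handling sessions that go idle requires the full GPS virtual-time machinery, which is omitted here. This is my simplification, not the paper's construction.

```python
import heapq

def pgps_order(sessions, weights):
    """Order packets as PGPS would when all sessions stay backlogged:
    stamp each packet with F = F_prev + length/weight, send in increasing F."""
    heap = []
    finish = {s: 0.0 for s in sessions}
    for s, packets in sessions.items():
        for k, length in enumerate(packets):
            finish[s] += length / weights[s]
            heapq.heappush(heap, (finish[s], s, k, length))
    return [heapq.heappop(heap)[1:] for _ in range(len(heap))]

sessions = {"A": [100, 100, 100], "B": [300], "C": [150, 150]}
weights = {"A": 0.5, "B": 0.25, "C": 0.25}
for sess, idx, length in pgps_order(sessions, weights):
    print(f"send packet {idx} of session {sess} ({length} bytes)")
```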

3,967 citations


Journal ArticleDOI
Donald S. Bethune1, C. H. Kiang1, M.S. de Vries1, G. Gorman1, R. Savoy1, J. E. Vazquez1, Robert Beyers1
17 Jun 1993-Nature
TL;DR: In this paper, it was shown that covaporizing carbon and cobalt in an arc generator leads to the formation of carbon nanotubes which all have very small diameters (about 1.2 nm) and walls only a single atomic layer thick.
Abstract: Carbon exhibits a unique ability to form a wide range of structures. In an inert atmosphere it condenses to form hollow, spheroidal fullerenes. Carbon deposited on the hot tip of the cathode of the arc-discharge apparatus used for bulk fullerene synthesis will form nested graphitic tubes and polyhedral particles. Electron irradiation of these nanotubes and polyhedra transforms them into nearly spherical carbon 'onions'. We now report that covaporizing carbon and cobalt in an arc generator leads to the formation of carbon nanotubes which all have very small diameters (about 1.2 nm) and walls only a single atomic layer thick. The tubes form a web-like deposit woven through the fullerene-containing soot, giving it a rubbery texture. The uniformity and single-layer structure of these nanotubes should make it possible to test their properties against theoretical predictions.

3,758 citations


Proceedings ArticleDOI
TL;DR: The main algorithms for color, texture, shape, and sketch queries are presented, example query results are shown, and future directions are discussed.
Abstract: In the query by image content (QBIC) project we are studying methods to query large on-line image databases using the images' content as the basis of the queries. Examples of the content we use include color, texture, and shape of image objects and regions. Potential applications include medical (`Give me other images that contain a tumor with a texture like this one'), photo-journalism (`Give me images that have blue at the top and red at the bottom'), and many others in art, fashion, cataloging, retailing, and industry. Key issues include derivation and computation of attributes of images and objects that provide useful query functionality, retrieval methods based on similarity as opposed to exact match, query by image example or user drawn image, the user interfaces, query refinement and navigation, high dimensional database indexing, and automatic and semi-automatic database population. We currently have a prototype system written in X/Motif and C running on an RS/6000 that allows a variety of queries, and a test database of over 1000 images and 1000 objects populated from commercially available photo clip art images. In this paper we present the main algorithms for color, texture, shape and sketch query that we use, show example query results, and discuss future directions. © (1993) COPYRIGHT SPIE--The International Society for Optical Engineering.
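
For the color component, a toy version of histogram-based similarity retrieval might look like the sketch below. It uses a plain Euclidean distance between normalized RGB histograms rather than QBIC's full quadratic-form distance over a color-similarity matrix; the bin count and synthetic database are illustrative only.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Normalized joint RGB histogram; `image` is an H x W x 3 uint8 array."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins, bins, bins), range=[(0, 256)] * 3)
    return (hist / hist.sum()).ravel()

def query_by_color(query_image, database_images, k=3):
    """Return indices of the k database images whose histograms are closest."""
    q = color_histogram(query_image)
    dists = [np.linalg.norm(q - color_histogram(img)) for img in database_images]
    return np.argsort(dists)[:k]

rng = np.random.default_rng(0)
db = [rng.integers(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(10)]
print(query_by_color(db[4], db, k=3))   # index 4 should rank first
```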

2,127 citations


Book ChapterDOI
13 Oct 1993
TL;DR: An indexing method for time sequences that supports similarity queries is proposed: sequences are mapped to a few Discrete Fourier Transform coefficients and indexed with R*-trees, and experimental results show that the method is superior to search based on sequential scanning.
Abstract: We propose an indexing method for time sequences for processing similarity queries. We use the Discrete Fourier Transform (DFT) to map time sequences to the frequency domain, the crucial observation being that, for most sequences of practical interest, only the first few frequencies are strong. Another important observation is Parseval's theorem, which specifies that the Fourier transform preserves the Euclidean distance in the time or frequency domain. Having thus mapped sequences to a lower-dimensionality space by using only the first few Fourier coefficients, we use R*-trees to index the sequences and efficiently answer similarity queries. We provide experimental results which show that our method is superior to search based on sequential scanning. Our experiments show that a few coefficients (1–3) are adequate to provide good performance. The performance gain of our method increases with the number and length of sequences.
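
A small sketch of the filter-and-refine idea: keep the first few DFT coefficients as a low-dimensional key, use distance in that space as a lower bound on true Euclidean distance (Parseval's theorem, with an orthonormal scaling of the transform), then verify candidates on the raw series. A linear scan of the reduced keys stands in here for the R*-tree; dimensions and data are illustrative.

```python
import numpy as np

def dft_key(series, n_coeffs=3):
    """First few DFT coefficients, scaled so truncated-feature distance
    never exceeds the true Euclidean distance between sequences."""
    x = np.fft.fft(np.asarray(series, dtype=float)) / np.sqrt(len(series))
    return x[:n_coeffs]

def range_query(query, database, epsilon, n_coeffs=3):
    """Return indices of sequences within `epsilon` of `query` (Euclidean)."""
    q_key = dft_key(query, n_coeffs)
    hits = []
    for i, seq in enumerate(database):
        # Filter step: the reduced-space distance is a lower bound, so it
        # can only produce false alarms, never false dismissals.
        if np.linalg.norm(q_key - dft_key(seq, n_coeffs)) <= epsilon:
            # Refine step: check the actual distance to discard false alarms.
            if np.linalg.norm(np.asarray(query) - np.asarray(seq)) <= epsilon:
                hits.append(i)
    return hits

rng = np.random.default_rng(1)
db = [np.cumsum(rng.normal(size=128)) for _ in range(50)]   # random-walk-like series
print(range_query(db[7] + rng.normal(scale=0.1, size=128), db, epsilon=5.0))
```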

2,082 citations


Book ChapterDOI
Mihir Bellare1, Phillip Rogaway1
22 Aug 1993
TL;DR: This work provides the first formal treatment of entity authentication and authenticated key distribution appropriate to the distributed environment and presents a definition, protocol, and proof that the protocol meets its goal, assuming only the existence of a pseudorandom function.
Abstract: We provide the first formal treatment of entity authentication and authenticated key distribution appropriate to the distributed environment. Addressed in detail are the problems of mutual authentication and authenticated key exchange for the symmetric, two-party setting. For each we present a definition, protocol, and proof that the protocol meets its goal, assuming only the existence of a pseudorandom function.
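
The abstract does not reproduce the protocol itself; as a hedged sketch of the general flavor (a symmetric two-party challenge-response exchange, with HMAC-SHA256 standing in for the shared pseudorandom function, and not the paper's exact flows or definitions), mutual authentication might look like:

```python
import hmac, hashlib, os

def prf(key: bytes, data: bytes) -> bytes:
    """HMAC-SHA256 standing in for the shared pseudorandom function."""
    return hmac.new(key, data, hashlib.sha256).digest()

# Both parties hold the same long-lived symmetric key.
key = os.urandom(32)

# Flow 1: A sends a fresh challenge.
r_a = os.urandom(16)

# Flow 2: B replies with its own challenge and a tag binding both nonces and its role.
r_b = os.urandom(16)
tag_b = prf(key, b"B->A" + r_a + r_b)

# A verifies B, then Flow 3: A proves itself to B with the reversed role label.
assert hmac.compare_digest(tag_b, prf(key, b"B->A" + r_a + r_b))
tag_a = prf(key, b"A->B" + r_a + r_b)

# B verifies A.
assert hmac.compare_digest(tag_a, prf(key, b"A->B" + r_a + r_b))
print("mutual authentication complete (toy three-flow exchange)")
```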

1,926 citations


Journal ArticleDOI
TL;DR: The authors' perspective of database mining as the confluence of machine learning techniques and the performance emphasis of database technology is presented and an algorithm for classification obtained by combining the basic rule discovery operations is given.
Abstract: The authors' perspective of database mining as the confluence of machine learning techniques and the performance emphasis of database technology is presented. Three classes of database mining problems involving classification, associations, and sequences are described. It is argued that these problems can be uniformly viewed as requiring discovery of rules embedded in massive amounts of data. A model and some basic operations for the process of rule discovery are described. It is shown how the database mining problems considered map to this model, and how they can be solved by using the basic operations proposed. An example is given of an algorithm for classification obtained by combining the basic rule discovery operations. This algorithm is efficient in discovering classification rules and has accuracy comparable to ID3, one of the best current classifiers.

1,539 citations


Journal ArticleDOI
08 Oct 1993-Science
TL;DR: Tunneling spectroscopy performed inside the corral revealed a series of discrete resonances, providing evidence for size quantization, and STM images show that the corral's interior local density of states is dominated by the eigenstate density expected for an electron trapped in a round two-dimensional box.
Abstract: A method for confining electrons to artificial structures at the nanometer lengthscale is presented. Surface state electrons on a copper(111) surface were confined to closed structures (corrals) defined by barriers built from iron adatoms. The barriers were assembled by individually positioning iron adatoms with the tip of a 4-kelvin scanning tunneling microscope (STM). A circular corral of radius 71.3 Å was constructed in this way out of 48 iron adatoms. Tunneling spectroscopy performed inside of the corral revealed a series of discrete resonances, providing evidence for size quantization. STM images show that the corral's interior local density of states is dominated by the eigenstate density expected for an electron trapped in a round two-dimensional box.
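
For reference, the "round two-dimensional box" comparison invokes the textbook hard-wall disk problem (standard result, not a formula quoted from the paper): the eigenstates of an electron of effective mass m* confined to a disk of radius r0 are Bessel functions, with energies set by the Bessel-function zeros j_{n,m},

```latex
\psi_{n,m}(r,\theta) \propto J_{n}\!\left(j_{n,m}\,\tfrac{r}{r_0}\right) e^{i n \theta},
\qquad
E_{n,m} = \frac{\hbar^{2}\, j_{n,m}^{2}}{2\, m^{*} r_0^{2}},
```

so the local density of states inside the corral is expected to show peaks at these discrete energies.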

Journal ArticleDOI
TL;DR: The capability maturity model (CMM), developed to present sets of recommended practices in a number of key process areas that have been shown to enhance software-development and maintenance capability, is discussed.
Abstract: The capability maturity model (CMM), developed to present sets of recommended practices in a number of key process areas that have been shown to enhance software-development and maintenance capability, is discussed. The CMM was designed to help developers select process-improvement strategies by determining their current process maturity and identifying the issues most critical to improving their software quality and process. The initial release of the CMM, version 1.0, was reviewed and used by the software community during 1991 and 1992. A workshop on CMM 1.0, held in April 1992, was attended by about 200 software professionals. The current version of the CMM is the result of the feedback from that workshop and ongoing feedback from the software community. The technical report that describes version 1.1 is summarised.

Journal ArticleDOI
01 Jun 1993-Nature
TL;DR: In this paper, standing-wave patterns in the local density of states of the Cu(111) surface were observed using the scanning tunnelling microscope (STM) at low temperature.
Abstract: Electrons occupying surface states on the close-packed surfaces of noble metals form a two-dimensional nearly free electron gas. These states can be probed using the scanning tunnelling microscope (STM), providing a unique opportunity to study the local properties of electrons in low-dimensional systems. Here we report the direct observation of standing-wave patterns in the local density of states of the Cu(111) surface using the STM at low temperature. These spatial oscillations are quantum-mechanical interference patterns caused by scattering of the two-dimensional electron gas off step edges and point defects. Analysis of the spatial oscillations gives an independent measure of the surface state dispersion, as well as insight into the interaction between surface-state electrons and scattering sites on the surface.

Journal ArticleDOI
TL;DR: Extended self-similarity (ESS) holds at high as well as at low Reynolds number, and it is characterized by the same scaling exponents of the velocity differences of fully developed turbulence.
Abstract: We report on the existence of a hitherto undetected form of self-similarity, which we call extended self-similarity (ESS). ESS holds at high as well as at low Reynolds number, and it is characterized by the same scaling exponents of the velocity differences of fully developed turbulence.
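
In the usual formulation (standard structure-function notation, not quoted from the paper), ESS means that the velocity structure functions scale against the third-order structure function even at Reynolds numbers where clean power-law scaling in the separation r is absent:

```latex
S_p(r) \;=\; \langle |\delta v(r)|^{p} \rangle \;\propto\; \big[S_3(r)\big]^{\zeta_p/\zeta_3},
```

so that plotting S_p against S_3 extends the range over which the relative exponents ζ_p/ζ_3 can be measured.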

Book ChapterDOI
Jarek Rossignac1, Paul Borrel1
01 Jan 1993
TL;DR: This work presents a simple, effective, and efficient technique for approximating arbitrary polyhedra based on triangulation and vertex-clustering, and produces a series of 3D approximations that resemble the original object from all viewpoints, but contain an increasingly smaller number of faces and vertices.
Abstract: We present a simple, effective, and efficient technique for approximating arbitrary polyhedra. It is based on triangulation and vertex-clustering, and produces a series of 3D approximations (also called “levels of detail”) that resemble the original object from all viewpoints, but contain an increasingly smaller number of faces and vertices. The simplification is more efficient than competing techniques because it does not require building and maintaining a topological adjacency graph. Furthermore, it is better suited for mechanical CAD models, which often exhibit patterns of small features, because it automatically groups and simplifies features that are geometrically close, but need not be topologically close or even part of a single connected component. Using a lower level of detail when displaying small, distant, or background objects improves graphic performance without a significant loss of perceptual information, and thus enables real-time inspection of complex scenes or a convenient environment for animation or walkthrough preview.
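
A toy version of the vertex-clustering step (snap vertices to a uniform grid, merge each cell's vertices into one representative, drop degenerate triangles); the grid resolution, centroid representative, and data layout are illustrative, and the paper's method additionally weights representative vertices and handles collapsed edges and points.

```python
import numpy as np

def cluster_simplify(vertices, triangles, cell_size):
    """Collapse all vertices in each grid cell to one representative (their mean),
    then rebuild the triangle list, discarding triangles that became degenerate."""
    vertices = np.asarray(vertices, dtype=float)
    cells = np.floor(vertices / cell_size).astype(int)
    cell_ids = {}                       # grid cell -> new vertex index
    remap = np.empty(len(vertices), dtype=int)
    for i, c in enumerate(map(tuple, cells)):
        remap[i] = cell_ids.setdefault(c, len(cell_ids))
    new_vertices = np.zeros((len(cell_ids), 3))
    counts = np.zeros(len(cell_ids))
    for i, j in enumerate(remap):       # representative = centroid of its cluster
        new_vertices[j] += vertices[i]
        counts[j] += 1
    new_vertices /= counts[:, None]
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if len({a, b, c}) == 3:         # keep only non-degenerate faces
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles

verts = np.random.default_rng(2).random((200, 3))
tris = [(i, i + 1, i + 2) for i in range(0, 198, 3)]
v2, t2 = cluster_simplify(verts, tris, cell_size=0.25)
print(len(verts), "->", len(v2), "vertices;", len(tris), "->", len(t2), "triangles")
```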

Patent
13 May 1993
TL;DR: In this paper, the authors used image processing to determine information about the position of a designated object in an endoscopic surgery procedure, which is particularly useful in applications where the object is difficult to view or locate.
Abstract: The present method and apparatus use image processing to determine information about the position of a designated object. The invention is particularly useful in applications where the object is difficult to view or locate. In particular, the invention is used in endoscopic surgery to determine positional information about an anatomical feature within a patient's body. The positional information is then used to position or reposition an instrument (surgical instrument) in relation to the designated object (anatomical feature). The invention comprises an instrument which is placed in relation to the designated object and which is capable of sending information about the object to a computer. Image processing methods are used to generate images of the object and determine positional information about it. This information can be used as input to robotic devices or can be rendered, in various ways (video graphics, speech synthesis), to a human user. Various input devices are attached to the transmitting or other instruments in use to provide control inputs to the computer.

Journal ArticleDOI
Gunter Dueck1
TL;DR: The quality of the computational results obtained so far by record-to-record travel (RRT) and the great deluge algorithm (GDA) shows that the new algorithms perform as well as threshold accepting (TA) and thus, a fortiori, better than simulated annealing (SA).
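
No abstract is shown for this entry. As a minimal sketch of the great-deluge idea in a minimization form (the acceptance rule, "rain speed", and toy objective below are illustrative choices, not Dueck's exact formulation or parameters):

```python
import random, math

def great_deluge(objective, neighbor, x0, rain_speed=0.01, steps=20000):
    """Great-deluge search: accept any move whose objective value stays below a
    'water level'; every accepted move lowers the level a little, so the search
    is gradually forced toward better and better solutions."""
    x, best = x0, x0
    level = objective(x0)
    for _ in range(steps):
        candidate = neighbor(x)
        if objective(candidate) <= level:
            x = candidate
            level -= rain_speed                  # "rain": tighten the water level
            if objective(x) < objective(best):
                best = x
    return best

# Toy objective: a bumpy 1-D function with its global minimum near x = -0.3.
f = lambda x: x * x + 3 * math.sin(5 * x)
step = lambda x: x + random.uniform(-0.5, 0.5)
print(round(great_deluge(f, step, x0=4.0), 3))
```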

Journal ArticleDOI
Jerry Tersoff1, Rudolf M. Tromp1
TL;DR: It is shown that strained epitaxial islands, as they increase in size, may undergo a shape transition to a long thin shape, which allows better elastic relaxation of the island's stress.
Abstract: Strained epitaxial layers tend initially to grow as dislocation-free islands. Here we show that such islands, as they increase in size, may undergo a shape transition. Below a critical size, islands have a compact, symmetric shape. But at larger sizes, they adopt a long thin shape, which allows better elastic relaxation of the island's stress. We have observed such elongated islands, with aspect ratios greater than 50:1, in low energy electron microscopy studies of growth of Ag on Si (001). These islands represent a novel approach to the fabrication of "quantum wires".

Journal ArticleDOI
01 Nov 1993-Nature
TL;DR: The existence of such endohedral species is now strongly supported by a growing body of experimental evidence; fullerene–metal complexes are being produced and purified in milligram quantities, and their structures and properties are beginning to be explored.
Abstract: Encapsulating atoms or molecules inside fullerene cages could give rise to a myriad of novel molecules and materials. The existence of such species is now strongly supported by a growing body of experimental evidence. Fullerene–metal complexes generally thought to be endohedral are being produced and purified in milligram quantities, and their structures and properties are beginning to be explored.

Journal ArticleDOI
F. Ohnesorge1, G. Binnig1
04 Jun 1993-Science
TL;DR: The (1014) cleavage plane of calcite has been investigated by atomic force microscopy in water at room temperature; true lateral atomic-scale resolution was achieved, providing the highest, most reliable resolution on a flat, well-ordered surface.
Abstract: The (1014) cleavage plane of calcite has been investigated by atomic force microscopy in water at room temperature. True lateral atomic-scale resolution was achieved; the atomic-scale periodicities as well as the expected relative positions of the atoms within each unit cell were obtained. Along monoatomic step lines, atomic-scale kinks, representing point-like defects, were resolved. Attractive forces on the order of 10⁻¹¹ newton acting between single atomic sites on the sample and the front atoms of the tip were directly measured and provided the highest, most reliable resolution on a flat, well-ordered surface.

Journal ArticleDOI
TL;DR: This paper looks at why it may not be sufficient to work on any one of these areas in isolation or to only harmonize business strategy and information technology, and why the Strategic Alignment Model is applied.
Abstract: The strategic use of information technology (I/T) is now and has been a fundamental issue for every business. In essence, I/T can alter the basic nature of an industry. The effective and efficient utilization of information technology requires the alignment of the I/T strategies with the business strategies, something that was not done successfully in the past with traditional approaches. New methods and approaches are now available. The strategic alignment framework applies the Strategic Alignment Model to reflect the view that business success depends on the linkage of business strategy, information technology strategy, organizational infrastructure and processes, and I/T infrastructure and processes. In this paper, we look at why it may not be sufficient to work on any one of these areas in isolation or to only harmonize business strategy and information technology. One reason is that, often, too much attention is placed on technology, rather than business, management, and organizational issues. The objective is to build an organizational structure and set of business processes that reflect the interdependence of enterprise strategy and information technology capabilities. The attention paid to the linkage of information technology to the enterprise can significantly affect the competitiveness and efficiency of the business. The essential issue is how information technology can enable the achievement of competitive and strategic advantage for the enterprise.

Journal ArticleDOI
TL;DR: The authors show the existence of effective bandwidths for multiclass Markov fluids and other types of sources that are used to model ATM traffic and show that when such sources share a buffer with deterministic service rate, a constraint on the tail of the buffer occupancy distribution is a linear constraint onThe number of sources.
Abstract: The authors show the existence of effective bandwidths for multiclass Markov fluids and other types of sources that are used to model ATM traffic. More precisely, it is shown that when such sources share a buffer with deterministic service rate, a constraint on the tail of the buffer occupancy distribution is a linear constraint on the number of sources. That is, for a small loss probability one can assume that each source transmits at a fixed rate called its effective bandwidth. When traffic parameters are known, effective bandwidths can be calculated and may be used to obtain a circuit-switched style call acceptance and routing algorithm for ATM networks. The important feature of the effective bandwidth of a source is that it is a characteristic of that source and the acceptable loss probability only. Thus, the effective bandwidth of a source does not depend on the number of sources sharing the buffer or the model parameters of other types of sources sharing the buffer.
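
One common way to state the result (a generic large-buffer formulation in now-standard notation, not the paper's specific matrix-eigenvalue characterization): for a tail-decay parameter δ tied to the acceptable loss probability, a source of class j with cumulative arrivals A_j[0,t] has effective bandwidth

```latex
\alpha_j(\delta) \;=\; \lim_{t\to\infty} \frac{1}{\delta\, t}\,
  \log \mathbb{E}\!\left[ e^{\delta A_j[0,t]} \right],
\qquad
\Pr\{\text{buffer occupancy} > B\} \;\lesssim\; e^{-\delta B}
\;\;\text{whenever}\;\;
\sum_j n_j\, \alpha_j(\delta) \;\le\; c,
```

so the tail constraint becomes a linear constraint on the numbers n_j of admitted sources for a link of service rate c.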

Patent
19 Jan 1993
TL;DR: In this article, the authors propose a method to detect undesirable software entities, such as a computer virus, worm, or Trojan Horse, in a data processing system by detecting anomalous behavior that may indicate their presence.
Abstract: A method includes the following component steps, or some functional subset of these steps: (A) periodic monitoring of a data processing system (10) for anomalous behavior that may indicate the presence of an undesirable software entity such as a computer virus, worm, or Trojan Horse; (B) automatic scanning for occurrences of known types of undesirable software entities and taking remedial action if they are discovered; (C) deploying decoy programs to capture samples of unknown types of computer viruses; (D) identifying machine code portions of the captured samples which are unlikely to vary from one instance of the virus to another; (E) extracting an identifying signature from the executable code portion and adding the signature to a signature database; (F) informing neighboring data processing systems on a network of an occurrence of the undesirable software entity; and (G) generating a distress signal, if appropriate, so as to call upon an expert to resolve difficult cases. A feature of this invention is the automatic execution of the foregoing steps in response to a detection of an undesired software entity, such as a virus or a worm, within a data processing system. The automatic extraction of the identifying signature, the addition of the signature to a signature data base, and the immediate use of the signature by a scanner provides protection from subsequent infections of the system, and also a network of systems, by the same or an altered form of the undesirable software entity.
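
Step (B), scanning for occurrences of known signatures, is the most mechanical of the listed components; a toy scanner over a byte-signature database (the signatures, file handling, and directory walk here are purely illustrative, not the patent's mechanism) might be:

```python
from pathlib import Path

# Toy signature database: name -> byte pattern previously extracted from samples.
SIGNATURES = {
    "example-virus-a": bytes.fromhex("deadbeef4c6f61"),
    "example-worm-b":  b"\x90\x90\xeb\xfe",
}

def scan_file(path: Path):
    """Return the names of all known signatures found in the file's bytes."""
    try:
        data = path.read_bytes()
    except OSError:
        return []
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

def scan_tree(root: Path):
    """Walk a directory tree and report infected files (remedial action not shown)."""
    for path in root.rglob("*"):
        if path.is_file():
            hits = scan_file(path)
            if hits:
                print(f"{path}: matches {hits}")

scan_tree(Path("."))
```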

Journal ArticleDOI
TL;DR: An ab initio molecular dynamics simulation of liquid water has been performed using density functional theory in the Kohn-Sham formulation and a plane wave basis set to determine the electronic structure and the forces at each time step.
Abstract: An ab initio molecular dynamics simulation of liquid water has been performed using density functional theory in the Kohn–Sham formulation and a plane wave basis set to determine the electronic structure and the forces at each time step. For an accurate description of the hydrogen bonding in the liquid, it was necessary to extend the exchange functional with a term that depends on the gradient of the electron density. A further important technical detail is that supersoft pseudopotentials were used to treat the valence orbitals of the oxygen atoms in a plane wave expansion. The structural and dynamical properties of the liquid were found to be in good agreement with experiment. The ab initio molecular dynamics also yields information on the electronic structure. The electronic feature of special interest is the lowest unoccupied molecular orbital (LUMO) of the liquid which is the state occupied by a thermalized excess electron in the conductive state. The main result of calculating the liquid LUMO is that it is a delocalized state distributed over interstitial space between the molecules with a significant admixture of the σ* orbitals of the individual water molecules.

Patent
10 Sep 1993
TL;DR: In this paper, a method and apparatus for enabling a cluster of computers to appear as a single computer to host computers outside the cluster is presented, where a host computer communicates only with a gateway to access destination nodes and processes within the cluster.
Abstract: The present invention provides a method and apparatus for enabling a cluster of computers to appear as a single computer to host computers outside the cluster. A host computer communicates only with a gateway to access destination nodes and processes within the cluster. The gateway has at least one message switch which processes incoming and outgoing port type messages crossing the cluster boundary. This processing comprises examining certain information on the message headers and then changing some of this header information either to route an incoming message to the proper computer node, port and process or to make an outgoing message appear as if originated at the gateway node. The message switch uses a table to match incoming messages to a particular routing function which can be run to perform the changes necessary to correctly route different kinds of messages.
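
A toy sketch of the header rewriting the message switch performs at the cluster boundary (the table format, node names, and port numbers are invented for illustration, not taken from the patent):

```python
# Routing table: externally visible (gateway) port -> (internal node, internal port).
PORT_TABLE = {
    80:  ("node-3", 8080),
    443: ("node-5", 8443),
}

def route_incoming(message):
    """Rewrite an inbound message so it reaches the right cluster node and port."""
    node, port = PORT_TABLE[message["dst_port"]]
    return {**message, "dst_host": node, "dst_port": port}

def route_outgoing(message, gateway="cluster-gw"):
    """Rewrite an outbound message so it appears to originate at the gateway."""
    return {**message, "src_host": gateway}

inbound = {"src_host": "host-A", "dst_host": "cluster-gw", "dst_port": 80, "payload": b"GET /"}
print(route_incoming(inbound))
print(route_outgoing({"src_host": "node-3", "dst_host": "host-A", "payload": b"200 OK"}))
```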

Journal ArticleDOI
TL;DR: Five DLB strategies are presented which illustrate the tradeoff between knowledge - the accuracy of each balancing decision, and overhead - the amount of added processing and communication incurred by the balancing process.
Abstract: Dynamic load balancing strategies for minimizing the execution time of single applications running in parallel on multicomputer systems are discussed. Dynamic load balancing (DLB) is essential for the efficient use of highly parallel systems when solving non-uniform problems with unpredictable load estimates. With the evolution of more highly parallel systems, centralized DLB approaches which make use of a high degree of knowledge become less feasible due to the load balancing communication overhead. Five DLB strategies are presented which illustrate the tradeoff between 1) knowledge - the accuracy of each balancing decision, and 2) overhead - the amount of added processing and communication incurred by the balancing process. All five strategies have been implemented on an Intel iPSC/2 hypercube.
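
As a toy illustration of the knowledge/overhead trade-off (a simple threshold-based, sender-initiated scheme with a single random poll per round; this is not one of the paper's five strategies, and all parameters are illustrative):

```python
import random

def balance_step(queues, threshold=2):
    """One sender-initiated balancing round: each overloaded node polls a single
    random peer and ships one task if that peer is below the threshold. The single
    poll keeps communication overhead low at the cost of less accurate placement."""
    nodes = list(queues)
    for node in nodes:
        if len(queues[node]) > threshold:
            peer = random.choice([n for n in nodes if n != node])
            if len(queues[peer]) < threshold:
                queues[peer].append(queues[node].pop())
    return queues

work = {"p0": list(range(6)), "p1": [], "p2": [0], "p3": list(range(4))}
for _ in range(5):
    work = balance_step(work)
print({n: len(q) for n, q in work.items()})
```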

Journal ArticleDOI
TL;DR: This work clarifies the roles of such attractors in producing intermittency, and provides examples, and relates them to previous work.
Abstract: On-off intermittency is an aperiodic switching between static, or laminar, behavior and chaotic bursts of oscillation. It can be generated by systems having an unstable invariant (or quasi-invariant) manifold, within which is found a suitable attractor. We clarify the roles of such attractors in producing intermittency, provide examples, and relate them to previous work.

Journal ArticleDOI
29 Jan 1993-Science
TL;DR: In this article, the authors used circularly polarized soft x-rays with an imaging photoelectron microscope to record images of magnetic domains at a spatial resolution of 1 micrometer.
Abstract: Circularly polarized soft x-rays have been used with an imaging photoelectron microscope to record images of magnetic domains at a spatial resolution of 1 micrometer. The magnetic contrast, which can be remarkably large, arises from the fact that the x-ray absorption cross section at inner-shell absorption edges of aligned magnetic atoms depends on the relative orientation of the photon spin and the local magnetization direction. The technique is element-specific, and, because of the long mean free paths of the x-rays and secondary electrons, it can record images of buried magnetic layers.

Journal ArticleDOI
TL;DR: The optical study of trivalent 3d transition-metal-oxide compounds with the perovskitelike structure has revealed the variation of their electronic structure with the 3d element (M) as well as the A-site rare-earth element (R).
Abstract: The optical study of trivalent 3d transition-metal-oxide compounds (RMO3) with the perovskitelike structure has revealed the variation of their electronic structure with the 3d element (M) as well as the A-site rare-earth element (R). The crossover of the gap nature from the Mott type to charge-transfer (CT) type with increasing atomic number of M is observed to occur around M=Cr. The variation of Mott and CT gaps with M species is quantitatively consistent with the tendency expected from an ionic model. However, for the low-energy electronic structures for the narrow-gap (or metallic) compounds (M=Ti,Co,Ni), the effects of the M-3d--O-2p hybridization must be included.

Journal ArticleDOI
Norman Bobroff1
TL;DR: In this article, the state of the art in high-resolution displacement measuring interferometry is reviewed, and several approaches to improving measurement accuracy, particularly in air, are described, including multi-wavelength interferometry.
Abstract: The present state of high-resolution displacement measuring interferometry is reviewed. Factors which determine the accuracy, linearity and repeatability of nanometre-scale measurements are emphasized. Many aspects of interferometry are discussed, including general metrology and alignment errors, as well as path length errors. Optical mixing and the nonlinear relation between phase and displacement are considered, as well as the influence of diffraction on accuracy. Environmental stability is a major factor in the repeatability and accuracy of measurement. It is difficult to obtain a measurement accuracy of 10⁻⁷ when working in air. Several approaches to improving this situation are described, including multiwavelength interferometry. Recent measurements of the short- and long-term frequency stability of lasers are summarized. Optical feedback is a subtle, but important source of frequency destabilization, and methods of detection and isolation are reviewed. Calibration of phase measuring electronics used for subfringe interpolation is included. Progress in 'in situ' identification of error sources and methods of validating accuracy are emphasized.
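
For orientation (a textbook relation, not a formula quoted from the review): in a displacement interferometer whose measurement beam makes a round trip to the moving reflector, the detected phase and the displacement are related through the wavelength in the ambient medium,

```latex
\Delta\phi \;=\; \frac{2\pi \,(2\, n \,\Delta L)}{\lambda_{\mathrm{vac}}}
\quad\Longrightarrow\quad
\Delta L \;=\; \frac{\lambda_{\mathrm{vac}}}{4\pi\, n}\,\Delta\phi ,
```

where n is the refractive index of air, which is why environmental stability and the air index enter directly into the error budget for measurements at the 10⁻⁷ level.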