
Showing papers by "Stanford University published in 1994"


Proceedings ArticleDOI
21 Jun 1994
TL;DR: A feature selection criterion that is optimal by construction because it is based on how the tracker works, and a feature monitoring method that can detect occlusions, disocclusions, and features that do not correspond to points in the world are proposed.
Abstract: No feature-based vision system can work unless good features can be identified and tracked from frame to frame. Although tracking itself is by and large a solved problem, selecting features that can be tracked well and correspond to physical points in the world is still hard. We propose a feature selection criterion that is optimal by construction because it is based on how the tracker works, and a feature monitoring method that can detect occlusions, disocclusions, and features that do not correspond to points in the world. These methods are based on a new tracking algorithm that extends previous Newton-Raphson style search methods to work under affine image transformations. We test performance with several simulations and experiments.
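The selection criterion this abstract describes became the standard "good features to track" test: accept a window when the smaller eigenvalue of its summed gradient structure matrix is large. A minimal NumPy sketch of that idea (the window size and quality fraction here are illustrative defaults, not the paper's values):

```python
import numpy as np

def good_features(image, win=7, quality=0.01):
    """Score windows by the minimum eigenvalue of the 2x2 gradient
    structure matrix summed over each win x win window."""
    img = image.astype(float)
    gy, gx = np.gradient(img)              # image gradients (axis0, axis1)
    Ixx, Iyy, Ixy = gx * gx, gy * gy, gx * gy

    def box_sum(a):
        # Sum over every win x win window via a 2-D cumulative sum.
        c = np.cumsum(np.cumsum(a, 0), 1)
        c = np.pad(c, ((1, 0), (1, 0)))
        return c[win:, win:] - c[:-win, win:] - c[win:, :-win] + c[:-win, :-win]

    sxx, syy, sxy = box_sum(Ixx), box_sum(Iyy), box_sum(Ixy)
    # Smaller eigenvalue of [[sxx, sxy], [sxy, syy]], in closed form.
    lam_min = (sxx + syy) / 2 - np.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    keep = lam_min > quality * lam_min.max()
    return lam_min, keep
```

Flat windows score zero, edges score near zero in one direction, and only corner-like windows (trackable in both directions) score high.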

8,432 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed a spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients, and achieved a performance within a factor log² n of the ideal performance of piecewise polynomial and variable-knot spline methods.
Abstract: With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable knot spline, or variable bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially-adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory which we call the oracle inequality shows that attained performance differs from ideal performance by at most a factor of approximately 2 log n, where n is the sample size. Moreover no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor log² n of the performance of piecewise polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.
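The mechanics of wavelet shrinkage are easy to sketch. RiskShrink's minimax thresholds are tabulated in the paper itself; the toy below uses the closely related universal threshold σ√(2 log n) with an orthonormal Haar transform (both are illustrative substitutions, and n must be a power of two):

```python
import numpy as np

def haar_dwt(x):
    """Full-depth orthonormal Haar transform; returns the final
    approximation plus detail coefficients, finest level first."""
    coeffs, a = [], np.asarray(x, dtype=float)
    while len(a) > 1:
        s = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail
        coeffs.append(d)
        a = s
    return a, coeffs

def haar_idwt(a, coeffs):
    """Inverse of haar_dwt: rebuild from coarsest level upward."""
    for d in reversed(coeffs):
        up = np.empty(2 * len(a))
        up[0::2] = (a + d) / np.sqrt(2)
        up[1::2] = (a - d) / np.sqrt(2)
        a = up
    return a

def soft(w, t):
    # Soft thresholding: shrink toward zero by t, kill anything below t.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def wavelet_denoise(y, sigma):
    """Shrink empirical wavelet coefficients with the universal
    threshold sigma * sqrt(2 log n)."""
    t = sigma * np.sqrt(2 * np.log(len(y)))
    a, coeffs = haar_dwt(y)
    return haar_idwt(a, [soft(d, t) for d in coeffs])
```

On a noisy piecewise-constant signal this typically cuts the mean squared error well below that of the raw observations, which is the spatial-adaptation behavior the abstract describes.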

8,153 citations


Journal ArticleDOI
TL;DR: Alur et al., as discussed by the authors, proposed timed automata to model the behavior of real-time systems over time, and showed that the universality problem and the language inclusion problem are solvable only for deterministic automata: both problems are undecidable (Π¹₁-hard) in the non-deterministic case and PSPACE-complete in the deterministic case.
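A timed automaton extends a finite automaton with real-valued clocks that advance with time, can be reset on transitions, and are compared against constants in guards. As a toy illustration (this example automaton is mine, not from the paper): accept words over {a, b} where every b occurs within 2 time units of the preceding a.

```python
def accepts(timed_word):
    """Run a fixed one-clock timed automaton over a timed word:
    a list of (symbol, absolute_time) pairs with nondecreasing times.
    Clock x is reset on 'a'; the guard on 'b' requires x <= 2."""
    state, x_reset_at = "q0", 0.0
    for sym, t in timed_word:
        x = t - x_reset_at                 # current clock value
        if state == "q0" and sym == "a":
            state, x_reset_at = "q1", t    # transition resets clock x
        elif state == "q1" and sym == "b" and x <= 2.0:
            state = "q0"                   # guard x <= 2 satisfied
        else:
            return False                   # no enabled transition
    return state == "q0"                   # q0 is the accepting state
```

Deciding emptiness for such automata is what the paper's region construction handles; this sketch only checks membership of a single timed word.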

7,096 citations


Journal ArticleDOI
K. Hagiwara, Ken Ichi Hikasa1, Koji Nakamura, Masaharu Tanabashi1, M. Aguilar-Benitez, Claude Amsler2, R. M. Barnett3, Patricia R. Burchat4, C. D. Carone5, C. Caso, G. Conforto6, Olav Dahl3, Michael Doser7, Semen Eidelman8, Jonathan L. Feng9, L. K. Gibbons10, Maury Goodman11, Christoph Grab12, D. E. Groom3, Atul Gurtu13, Atul Gurtu7, K. G. Hayes14, J. J. Herna`ndez-Rey15, K. Honscheid16, Christopher Kolda17, Michelangelo L. Mangano7, David Manley18, Aneesh V. Manohar19, John March-Russell7, Alberto Masoni, Ramon Miquel3, Klaus Mönig, Hitoshi Murayama20, Hitoshi Murayama3, S. Sánchez Navas12, Keith A. Olive21, Luc Pape7, C. Patrignani, A. Piepke22, Matts Roos23, John Terning24, Nils A. Tornqvist23, T. G. Trippe3, Petr Vogel25, C. G. Wohl3, Ron L. Workman26, W-M. Yao3, B. Armstrong3, P. S. Gee3, K. S. Lugovsky, S. B. Lugovsky, V. S. Lugovsky, Marina Artuso27, D. Asner28, K. S. Babu29, E. L. Barberio7, Marco Battaglia7, H. Bichsel30, O. Biebel31, Philippe Bloch7, Robert N. Cahn3, Ariella Cattai7, R. S. Chivukula32, R. Cousins33, G. A. Cowan34, Thibault Damour35, K. Desler, R. J. Donahue3, D. A. Edwards, Victor Daniel Elvira, Jens Erler36, V. V. Ezhela, A Fassò7, W. Fetscher12, Brian D. Fields37, B. Foster38, Daniel Froidevaux7, Masataka Fukugita39, Thomas K. Gaisser40, L. Garren, H.-J. Gerber12, Frederick J. Gilman41, Howard E. Haber42, C. A. Hagmann28, J.L. Hewett4, Ian Hinchliffe3, Craig J. Hogan30, G. Höhler43, P. Igo-Kemenes44, John David Jackson3, Kurtis F Johnson45, D. Karlen, B. Kayser, S. R. Klein3, Konrad Kleinknecht46, I.G. Knowles47, P. Kreitz4, Yu V. Kuyanov, R. Landua7, Paul Langacker36, L. S. Littenberg48, Alan D. Martin49, Tatsuya Nakada50, Tatsuya Nakada7, Meenakshi Narain32, Paolo Nason, John A. Peacock47, Helen R. Quinn4, Stuart Raby16, Georg G. Raffelt31, E. A. Razuvaev, B. Renk46, L. Rolandi7, Michael T Ronan3, L.J. Rosenberg51, Christopher T. Sachrajda52, A. I. Sanda53, Subir Sarkar54, Michael Schmitt55, O. Schneider50, Douglas Scott56, W. G. 
Seligman57, Michael H. Shaevitz57, Torbjörn Sjöstrand58, George F. Smoot3, Stefan M Spanier4, H. Spieler3, N. J. C. Spooner59, Mark Srednicki60, A. Stahl, Todor Stanev40, M. Suzuki3, N. P. Tkachenko, German Valencia61, K. van Bibber28, Manuella Vincter62, D. R. Ward63, Bryan R. Webber63, M R Whalley49, Lincoln Wolfenstein41, J. Womersley, C. L. Woody48, O. V. Zenin 
Tohoku University1, University of Zurich2, Lawrence Berkeley National Laboratory3, Stanford University4, College of William & Mary5, University of Urbino6, CERN7, Budker Institute of Nuclear Physics8, University of California, Irvine9, Cornell University10, Argonne National Laboratory11, ETH Zurich12, Tata Institute of Fundamental Research13, Hillsdale College14, Spanish National Research Council15, Ohio State University16, University of Notre Dame17, Kent State University18, University of California, San Diego19, University of California, Berkeley20, University of Minnesota21, University of Alabama22, University of Helsinki23, Los Alamos National Laboratory24, California Institute of Technology25, George Washington University26, Syracuse University27, Lawrence Livermore National Laboratory28, Oklahoma State University–Stillwater29, University of Washington30, Max Planck Society31, Boston University32, University of California, Los Angeles33, Royal Holloway, University of London34, Université Paris-Saclay35, University of Pennsylvania36, University of Illinois at Urbana–Champaign37, University of Bristol38, University of Tokyo39, University of Delaware40, Carnegie Mellon University41, University of California, Santa Cruz42, Karlsruhe Institute of Technology43, Heidelberg University44, Florida State University45, University of Mainz46, University of Edinburgh47, Brookhaven National Laboratory48, Durham University49, University of Lausanne50, Massachusetts Institute of Technology51, University of Southampton52, Nagoya University53, University of Oxford54, Northwestern University55, University of British Columbia56, Columbia University57, Lund University58, University of Sheffield59, University of California, Santa Barbara60, Iowa State University61, University of Alberta62, University of Cambridge63
TL;DR: This biennial Review summarizes much of Particle Physics using data from previous editions, plus 2205 new measurements from 667 papers, and features expanded coverage of CP violation in B mesons and of neutrino oscillations.
Abstract: This biennial Review summarizes much of Particle Physics. Using data from previous editions, plus 2205 new measurements from 667 papers, we list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons. We also summarize searches for hypothetical particles such as Higgs bosons, heavy neutrinos, and supersymmetric particles. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as the Standard Model, particle detectors, probability, and statistics. This edition features expanded coverage of CP violation in B mesons and of neutrino oscillations. For the first time we cover searches for evidence of extra dimensions (both in the particle listings and in a new review). Another new review is on Grand Unified Theories. A booklet is available containing the Summary Tables and abbreviated versions of some of the other sections of this full Review. All tables, listings, and reviews (and errata) are also available on the Particle Data Group website: http://pdg.lbl.gov.

5,143 citations


Journal ArticleDOI
TL;DR: Social dominance orientation (SDO), one's degree of preference for inequality among social groups, is introduced in this article; SDO is related to beliefs in a large number of social and political ideologies that support group-based hierarchy and to support for policies that have implications for intergroup relations (e.g., war, civil rights, and social programs).
Abstract: Social dominance orientation (SDO), one's degree of preference for inequality among social groups, is introduced. On the basis of social dominance theory, it is shown that (a) men are more social dominance-oriented than women, (b) high-SDO people seek hierarchy-enhancing professional roles and low-SDO people seek hierarchy-attenuating roles, (c) SDO was related to beliefs in a large number of social and political ideologies that support group-based hierarchy (e.g., meritocracy and racism) and to support for policies that have implications for intergroup relations (e.g., war, civil rights, and social programs), including new policies. SDO was distinguished from interpersonal dominance, conservatism, and authoritarianism.

3,967 citations


Book ChapterDOI
10 Jul 1994
TL;DR: A method for feature subset selection using cross-validation that is applicable to any induction algorithm is described, and experiments conducted with ID3 and C4.5 on artificial and real datasets are discussed.
Abstract: We address the problem of finding a subset of features that allows a supervised induction algorithm to induce small high-accuracy concepts. We examine notions of relevance and irrelevance, and show that the definitions used in the machine learning literature do not adequately partition the features into useful categories of relevance. We present definitions for irrelevance and for two degrees of relevance. These definitions improve our understanding of the behavior of previous subset selection algorithms, and help define the subset of features that should be sought. The features selected should depend not only on the features and the target concept, but also on the induction algorithm. We describe a method for feature subset selection using cross-validation that is applicable to any induction algorithm, and discuss experiments conducted with ID3 and C4.5 on artificial and real datasets.
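The wrapper idea in this abstract is that candidate feature subsets should be scored by cross-validated accuracy of the induction algorithm itself. A greedy forward-selection sketch (a nearest-centroid classifier stands in for ID3/C4.5 here, and the function names and defaults are illustrative):

```python
import numpy as np

def cv_accuracy(X, y, feats, k=5):
    """k-fold CV accuracy of a nearest-centroid classifier restricted
    to the feature columns in `feats` (the inducer is pluggable)."""
    folds = np.array_split(np.arange(len(y)), k)
    correct = 0
    for f in range(k):
        test = folds[f]
        train = np.concatenate([folds[g] for g in range(k) if g != f])
        Xtr, Xte = X[train][:, feats], X[test][:, feats]
        cents = {c: Xtr[y[train] == c].mean(axis=0) for c in np.unique(y[train])}
        for xi, yi in zip(Xte, y[test]):
            pred = min(cents, key=lambda c: np.linalg.norm(xi - cents[c]))
            correct += (pred == yi)
    return correct / len(y)

def forward_select(X, y, k=5):
    """Greedy wrapper: repeatedly add the single feature that most
    improves cross-validated accuracy; stop when nothing helps."""
    selected, best = [], 0.0
    remaining = list(range(X.shape[1]))
    while remaining:
        acc, j = max((cv_accuracy(X, y, selected + [j], k), j) for j in remaining)
        if acc <= best:
            break
        selected.append(j)
        remaining.remove(j)
        best = acc
    return selected, best
```

Because the score is the inducer's own CV accuracy, the subset found depends on the induction algorithm, exactly the point the abstract argues.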

2,581 citations


Journal ArticleDOI
TL;DR: The Sharpe Index, as mentioned in this paper, is a measure for the performance of mutual funds; Sharpe proposed the term reward-to-variability ratio to describe it (the measure is also described in Sharpe [1975]).
Abstract: Over 25 years ago, in Sharpe [1966], I introduced a measure for the performance of mutual funds and proposed the term reward-to-variability ratio to describe it (the measure is also described in Sharpe [1975]). While the measure has gained considerable popularity, the name has not. Other authors have termed the original version the Sharpe Index (Radcliff [1990, p. 286] and Haugen [1993, p. 315]), the Sharpe Measure (Bodie, Kane and Marcus [1993, p. 804], Elton and Gruber [1991, p. 652], and Reilly [1989, p. 803]), or the Sharpe Ratio (Morningstar [1993, p. 24]). Generalized versions have also appeared under various names (see, for example, BARRA [1992, p. 21] and Capaul, Rowley and Sharpe [1993, p. 33]).
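The ex-post form of the reward-to-variability ratio is just the mean excess return divided by its standard deviation, which makes it scale-invariant in the excess returns. A minimal sketch (per-period; annualization conventions vary and are omitted):

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0):
    """Ex-post reward-to-variability ratio: mean excess return over
    the sample standard deviation of excess returns."""
    excess = np.asarray(returns, dtype=float) - risk_free
    return excess.mean() / excess.std(ddof=1)
```

Doubling every excess return leaves the ratio unchanged, which is why it ranks funds by risk-adjusted rather than raw performance.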

2,513 citations


Journal ArticleDOI
TL;DR: In this paper, the authors introduce the stationary bootstrap to calculate standard errors of estimators and construct confidence regions for parameters based on weakly dependent stationary observations, generalizing earlier block-resampling schemes in which the block length m is fixed.
Abstract: This article introduces a resampling procedure called the stationary bootstrap as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on weakly dependent stationary observations. Previously, a technique based on resampling blocks of consecutive observations was introduced to construct confidence intervals for a parameter of the m-dimensional joint distribution of m consecutive observations, where m is fixed. This procedure has been generalized by constructing a “blocks of blocks” resampling scheme that yields asymptotically valid procedures even for a multivariate parameter of the whole (i.e., infinite-dimensional) joint distribution of the stationary sequence of observations. These methods share the construction of resampling blocks of observations to form a pseudo-time series, so that the statistic of interest may be recalculated based on the resampled data set. But in the context of applying this method to stationary data, it is natural...
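The stationary bootstrap's pseudo-series is built from blocks whose start points are uniform and whose lengths are geometric with mean 1/p, wrapping around the series circularly; this is what makes the resampled series itself stationary. A sketch (parameter choices illustrative):

```python
import numpy as np

def stationary_bootstrap(x, p=0.1, rng=None):
    """One pseudo-time series via the stationary bootstrap: copy
    consecutive observations, and with probability p at each step
    jump to a fresh uniformly chosen start (mean block length 1/p)."""
    rng = rng or np.random.default_rng()
    n = len(x)
    out = np.empty(n, dtype=float)
    i = int(rng.integers(n))            # start of the first block
    for t in range(n):
        out[t] = x[i % n]               # circular wrap-around
        if rng.random() < p:
            i = int(rng.integers(n))    # end block, start a new one
        else:
            i += 1                      # continue the current block
    return out

def bootstrap_se(x, stat=np.mean, p=0.1, B=500, seed=0):
    """Bootstrap standard error of `stat` from B pseudo-series."""
    rng = np.random.default_rng(seed)
    reps = [stat(stationary_bootstrap(x, p, rng)) for _ in range(B)]
    return np.std(reps, ddof=1)
```

For i.i.d. data the resulting standard error of the mean should land near s/√n; for dependent data the geometric blocks preserve short-range dependence that i.i.d. resampling would destroy.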

2,418 citations


Journal ArticleDOI
TL;DR: In this approach to software development, application programs are written as software agents, i.e. software “components” that communicate with their peers by exchanging messages in an expressive agent communication language.
Abstract: The software world is one of great richness and diversity. Many thousands of software products are available to users today, providing a wide variety of information and services in a wide variety of domains. While most of these programs provide their users with significant value when used in isolation, there is increasing demand for programs that can interoperate – to exchange information and services with other programs and thereby solve problems that cannot be solved alone. Part of what makes interoperation difficult is heterogeneity. Programs are written by different people, at different times, in different languages; and, as a result, they often provide different interfaces. The difficulties created by heterogeneity are exacerbated by dynamics in the software environment. Programs are frequently rewritten; new programs are added; old programs removed. Agent-based software engineering was invented to facilitate the creation of software able to interoperate in such settings. In this approach to software development, application programs are written as software agents, i.e. software “components” that communicate with their peers by exchanging messages in an expressive agent communication language. Agents can be as simple as subroutines; but typically they are larger entities with some sort of persistent control (e.g. distinct control threads within a single address space, distinct processes on a single machine, or separate processes on different machines). The salient feature of the language used by agents is its expressiveness. It allows for the exchange of data and logical information, individual commands and scripts (i.e. programs). Using this language, agents can communicate complex information and goals, directly or indirectly “programming” each other in useful ways. Agent-based software engineering is often compared to object-oriented programming. 
Like an “object”, an agent provides a message-based interface independent of its internal data structures and algorithms. The primary difference between the two approaches lies in the language of the interface. In general object-oriented programming, the meaning of a message can vary from one object to another. In agent-based software engineering, agents use a common language with an agent-independent semantics. The concept of agent-based software engineering raises a number of important questions.
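The contrast drawn above, object-specific message meanings versus a shared language with agent-independent semantics, can be sketched in a few lines. Everything below (the `Agent` class, the `ask`/`tell` performatives, the fact store) is an illustrative invention in the spirit of the article, not its actual language:

```python
class Agent:
    """Toy agent: every agent interprets the same (performative,
    content) message format with the same fixed semantics."""

    def __init__(self, name, facts=None):
        self.name, self.facts = name, dict(facts or {})

    def handle(self, message):
        performative, content = message
        if performative == "ask":          # query this agent's knowledge
            return ("tell", self.facts.get(content))
        if performative == "tell":         # assert a (key, value) fact
            key, value = content
            self.facts[key] = value
            return ("ack", key)
        return ("sorry", None)             # shared fallback semantics

# Two agents interoperate without knowing each other's internals.
weather = Agent("weather", {"sky": "clear"})
planner = Agent("planner")
reply = weather.handle(("ask", "sky"))
planner.handle(("tell", ("sky", reply[1])))
```

Because the meaning of `ask` and `tell` does not depend on which agent receives them, either agent could be swapped for a differently implemented one, which is the interoperation property the abstract argues for.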

2,373 citations


Book
01 Jan 1994
TL;DR: This book introduces the mathematics that supports advanced computer programming and the analysis of algorithms, and is an indispensable text and reference not only for computer scientists - the authors themselves rely heavily on it!
Abstract: From the Publisher: This book introduces the mathematics that supports advanced computer programming and the analysis of algorithms. The primary aim of its well-known authors is to provide a solid and relevant base of mathematical skills - the skills needed to solve complex problems, to evaluate horrendous sums, and to discover subtle patterns in data. It is an indispensable text and reference not only for computer scientists - the authors themselves rely heavily on it! - but for serious users of mathematics in virtually every discipline. Concrete Mathematics is a blending of CONtinuous and disCRETE mathematics. "More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems." The subject matter is primarily an expansion of the Mathematical Preliminaries section in Knuth's classic Art of Computer Programming, but the style of presentation is more leisurely, and individual topics are covered more deeply. Several new topics have been added, and the most significant ideas have been traced to their historical roots. The book includes more than 500 exercises, divided into six categories. Complete answers are provided for all exercises, except research problems, making the book particularly valuable for self-study. Major topics include: sums; recurrences; integer functions; elementary number theory; binomial coefficients; generating functions; discrete probability; and asymptotic methods. This second edition includes important new material about mechanical summation. In response to the widespread use of the first edition as a reference book, the bibliography and index have also been expanded, and additional nontrivial improvements can be found on almost every page. Readers will appreciate the informal style of Concrete Mathematics. Particularly enjoyable are the marginal graffiti contributed by students who have taken courses based on this material.
The authors want to convey not only the importance of the techniques presented, but some of the fun in learning and using them.

2,318 citations


Journal ArticleDOI
TL;DR: This article proposed conditions under which the use of small groups in classrooms can be productive, including task instructions, student preparation, and the nature of the teacher role that are eminently suitable for supporting interaction in more routine learning tasks.
Abstract: Moving beyond the general question of effectiveness of small group learning, this conceptual review proposes conditions under which the use of small groups in classrooms can be productive. Included in the review is recent research that manipulates various features of cooperative learning as well as studies of the relationship of interaction in small groups to outcomes. The analysis develops propositions concerning the kinds of discourse that are productive of different types of learning as well as propositions concerning how desirable kinds of interaction may be fostered. Whereas limited exchange of information and explanation are adequate for routine learning in collaborative seatwork, more open exchange and elaborated discussion are necessary for conceptual learning with group tasks and ill-structured problems. Moreover, task instructions, student preparation, and the nature of the teacher role that are eminently suitable for supporting interaction in more routine learning tasks may result in unduly con...

Journal ArticleDOI
TL;DR: Three models for how gender differences in depression might develop in early adolescence are described and evaluated and it is concluded that Model 3 is best supported by the available data, although much more research is needed.
Abstract: There are no gender differences in depression rates in prepubescent children, but, after the age of 15, girls and women are about twice as likely to be depressed as boys and men. In this article, three models for how gender differences in depression might develop in early adolescence are described and evaluated.

Journal ArticleDOI
TL;DR: Four areas have seen major progress in the TGF-β superfamily in the last 3 years: structural characterization of the signaling molecule, isolation of new family members, cloning of receptor molecules, and new genetic tests of the functions of these factors in different organisms.
Abstract: In the last 10 years, a large family of secreted signaling molecules has been discovered that appear to mediate many key events in normal growth and development. The family is known as the TGF-β superfamily (Massague 1990), a name taken from the first member of the family to be isolated (transforming growth factor-β1). This name is somewhat misleading, because TGF-β1 has a large number of effects in different systems (Sporn and Roberts 1992). It actually inhibits the proliferation of many different cell lines, and its original "transforming" activity may be due to secondary effects on matrix production and synthesis of other growth factors (Moses et al. 1990). The two dozen other members of the TGF-β superfamily have a remarkable range of activities. In Drosophila, a TGF-β-related gene is required for dorsoventral axis formation in early embryos, communication between tissue layers in gut development, and correct proximal-distal patterning of adult appendages. In Xenopus, a TGF-β-related gene is expressed specifically at one end of fertilized eggs and may function in early signaling events that lay out the basic body plan. In mammals, TGF-β-related molecules have been found that control sexual development, pituitary hormone production, and the creation of bones and cartilage. The recognition of TGF-β superfamily members in many different organisms and contexts provides one of the major unifying themes in recent molecular studies of animal growth and development. The rough outlines of the TGF-β family were first recognized in the 1980s. Since that time, a number of excellent reviews have appeared that summarize the properties of different family members (Ying 1989; Massague 1990; Lyons et al. 1991; Sporn and Roberts 1992). Here, I will focus on four areas that have seen major progress in the last 3 years: structural characterization of the signaling molecule, isolation of new family members, cloning of receptor molecules, and new genetic tests of the functions of these factors in different organisms.

Book
John R. Koza1
01 Jan 1994
TL;DR: This book presents a method to automatically decompose a program into solvable components, called automatically defined functions (ADF), and then presents case studies of the application of this method to a variety of problems.
Abstract: This book is a follow-on to the book in which John Koza introduced genetic programming (GP) to the world, "Genetic Programming: On the Programming of Computers by Means of Natural Selection" [5]. As such, the primary intended audience is someone already familiar with GP; however, Koza does provide introductory material to both genetic algorithms (GA) and GP. The driving force behind this book is a method to automatically decompose a program into solvable components. The book presents this method, called automatically defined functions (ADF), and then presents case studies of the application of this method to a variety of problems. While this book's size is intimidating, there is a wealth of information to be found by the reader willing to conduct a prolonged campaign. The reader is advised to study the first seven chapters of the book to gain an understanding of the concepts behind ADFs. Then the reader should be able to select which case studies he or she finds to be of interest. If the reader is feeling overwhelmed by the information presented in the book, there are several concise chapters dealing with ADFs by Koza and others in Advances in Genetic Programming edited by Kenneth Kinnear [3].

Journal ArticleDOI
TL;DR: Non-Hodgkin's lymphoma affecting the stomach, but not other sites, is associated with previous H. pylori infection, and a causative role for the organism is plausible, but remains unproved.
Abstract: Background Helicobacter pylori infection is a risk factor for gastric adenocarcinoma. We examined whether this infection is also a risk factor for primary gastric non-Hodgkin's lymphoma. Methods This nested case-control study involved two large cohorts (230,593 participants). Serum had been collected from cohort members and stored, and all subjects were followed for cancer. Thirty-three patients with gastric non-Hodgkin's lymphoma were identified, and each was matched to four controls according to cohort, age, sex, and date of serum collection. For comparison, 31 patients with nongastric non-Hodgkin's lymphoma from one of the cohorts were evaluated, each of whom had been previously matched to 2 controls. Pathological reports and specimens were reviewed to confirm the histologic type of the tumor. Serum samples from all subjects were tested for H. pylori IgG by an enzyme-linked immunosorbent assay. Results Thirty-three cases of gastric non-Hodgkin's lymphoma occurred a median of 14 years after serum collection...

Journal ArticleDOI
10 Mar 1994-Nature
TL;DR: A new in vitro assay using a feedback enhanced laser trap system allows direct measurement of force and displacement that results from the interaction of a single myosin molecule with a single suspended actin filament.
Abstract: A new in vitro assay using a feedback enhanced laser trap system allows direct measurement of force and displacement that results from the interaction of a single myosin molecule with a single suspended actin filament. Discrete stepwise movements averaging 11 nm were seen under conditions of low load, and single force transients averaging 3-4 pN were measured under isometric conditions. The magnitudes of the single forces and displacements are consistent with predictions of the conventional swinging-crossbridge model of muscle contraction.

Journal ArticleDOI
31 Mar 1994-Nature
TL;DR: It is shown that polymorphic microsatellites (primarily CA repeats) allow trees of human individuals to be constructed that reflect their geographic origin with remarkable accuracy by the analysis of a large number of loci for each individual, in spite of the small variations in allele frequencies existing between populations.
Abstract: Genetic variation at hypervariable loci is being used extensively for linkage analysis and individual identification, and may be useful for inter-population studies. Here we show that polymorphic microsatellites (primarily CA repeats) allow trees of human individuals to be constructed that reflect their geographic origin with remarkable accuracy. This is achieved by the analysis of a large number of loci for each individual, in spite of the small variations in allele frequencies existing between populations. Reliable evolutionary relationships could also be established in comparisons among human populations but not among great ape species, probably because of constraints on allele length variation. Among human populations, diversity of microsatellites is highest in Africa, which is in contrast to other nuclear markers and supports the hypothesis of an African origin for humans.
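Tree-building from microsatellites starts from a pairwise distance between genotypes. One simple choice (the allele-sharing distance; the paper evaluates several measures, and this sketch is only illustrative) averages, over loci, the proportion of alleles two individuals share:

```python
import numpy as np

def allele_sharing_distance(g1, g2):
    """1 minus the mean proportion of shared alleles across loci.
    g1, g2: lists of (allele_a, allele_b) genotypes, one pair per
    microsatellite locus (alleles encoded e.g. as repeat counts)."""
    shared = []
    for (a1, b1), (a2, b2) in zip(g1, g2):
        pool, s = [a2, b2], 0
        for al in (a1, b1):
            if al in pool:          # match each allele at most once
                pool.remove(al)
                s += 1
        shared.append(s / 2)        # 0, 1, or 2 alleles shared per locus
    return 1.0 - float(np.mean(shared))
```

Averaging over many loci is what makes the trees accurate despite small per-locus allele-frequency differences between populations, which is the abstract's central point.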

Journal ArticleDOI
TL;DR: These preliminary results demonstrate that endovascular stent-graft repair is safe in highly selected patients with descending thoracic aortic aneurysms and will, however, require careful long-term evaluation.
Abstract: Background The usual treatment for thoracic aortic aneurysms is surgical replacement with a prosthetic graft, but the associated morbidity and mortality are considerable. We studied the use of transluminally placed endovascular stent-graft devices as an alternative to surgical repair. Methods We evaluated the feasibility, safety, and effectiveness of transluminally placed stent-grafts to treat descending thoracic aortic aneurysms in 13 patients over a 24-month period. Atherosclerotic, anastomotic, and post-traumatic true or false aneurysms and aortic dissections were treated. The mean diameter of the aneurysms was 6.1 cm (range, 5 to 8). The endovascular stent-grafts were custom-designed for each patient and were constructed of self-expanding stainless-steel stents covered with woven Dacron grafts. Results Endovascular placement of the stent-graft prosthesis was successful in all patients. There was complete thrombosis of the thoracic aortic aneurysm surrounding the stent-graft in 12 patients, and partial...

Journal ArticleDOI
TL;DR: It is found that at the first stage of reheating the classical inflaton field rapidly decays into particles or into other bosons due to a broad parametric resonance, which implies that the inflaton field can be a dark matter candidate.
Abstract: The theory of reheating of the Universe after inflation is developed. We have found that typically at the first stage of reheating the classical inflaton field φ rapidly decays into φ particles or into other bosons due to a broad parametric resonance. Then these bosons decay into other particles, which eventually become thermalized. Complete reheating is possible only in those theories where a single particle φ can decay into other particles. This imposes strong constraints on the structure of inflationary models, and implies that the inflaton field can be a dark matter candidate.
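The "broad parametric resonance" has a standard textbook form, sketched here under simplifying assumptions (a coupling ½g²φ²χ² to a scalar χ, an oscillating background φ(t) = Φ sin(mt), and cosmic expansion neglected):

```latex
% Mode equation for each Fourier mode of the produced field \chi:
\ddot{\chi}_k + \left[ k^2 + g^2 \Phi^2 \sin^2(mt) \right] \chi_k = 0 .
% With z = mt this is the Mathieu equation
\frac{d^2 \chi_k}{dz^2} + \left( A_k - 2q \cos 2z \right) \chi_k = 0 ,
\qquad A_k = \frac{k^2}{m^2} + 2q , \qquad q = \frac{g^2 \Phi^2}{4 m^2} ,
```

whose instability bands make χ_k grow exponentially; the "broad" regime of the abstract corresponds to q ≫ 1, where wide bands of momenta are amplified at once.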

Journal ArticleDOI
TL;DR: Vascular remodeling is an active process of structural alteration that involves changes in at least four cellular processes -- cell growth, cell death, cell migration, and production or degradation of extracellular matrix -- and is dependent on a dynamic interaction between locally.
Abstract: The vessel wall is an active, integrated organ composed of endothelial, smooth-muscle, and fibroblast cells coupled to each other in a complex autocrine-paracrine set of interactions. The vasculature is capable of sensing changes within its milieu, integrating these signals by intercellular communication, and changing itself through the local production of mediators that influence structure as well as function. Vascular remodeling is an active process of structural alteration that involves changes in at least four cellular processes -- cell growth, cell death, cell migration, and production or degradation of extracellular matrix -- and is dependent on a dynamic interaction between locally...

Proceedings ArticleDOI
24 Jul 1994
TL;DR: A method for combining a collection of range images into a single polygonal mesh that completely describes an object to the extent that it is visible from the outside is presented.
Abstract: Range imaging offers an inexpensive and accurate means for digitizing the shape of three-dimensional objects. Because most objects self occlude, no single range image suffices to describe the entire object. We present a method for combining a collection of range images into a single polygonal mesh that completely describes an object to the extent that it is visible from the outside. The steps in our method are: 1) align the meshes with each other using a modified iterated closest-point algorithm, 2) zipper together adjacent meshes to form a continuous surface that correctly captures the topology of the object, and 3) compute local weighted averages of surface positions on all meshes to form a consensus surface geometry. Our system differs from previous approaches in that it is incremental; scans are acquired and combined one at a time. This approach allows us to acquire and combine large numbers of scans with minimal storage overhead. Our largest models contain up to 360,000 triangles. All the steps needed to digitize an object that requires up to 10 range scans can be performed using our system with five minutes of user interaction and a few hours of compute time. We show two models created using our method with range data from a commercial rangefinder that employs laser stripe technology.
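The alignment step above rests on iterated closest points. A minimal sketch of one unmodified ICP iteration (the paper uses a modified variant; this is the textbook point-to-point form with the closed-form SVD solve for the rigid transform):

```python
import numpy as np

def icp_step(src, dst):
    """One point-to-point ICP iteration: match each source point to its
    nearest destination point, then compute the best rigid transform
    in closed form via SVD, and apply it to the source points."""
    # Nearest-neighbour correspondences (brute force, for clarity).
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    matched = dst[d2.argmin(axis=1)]
    # Closed-form rigid alignment of src onto its matched points.
    mu_s, mu_m = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_s
    return src @ R.T + t, R, t
```

Iterating `icp_step` from a rough initial alignment drives the meshes together; the zippering and consensus-geometry stages then operate on the aligned scans.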

Journal ArticleDOI
06 Jul 1994-JAMA
TL;DR: It is concluded that ulcer patients with H. pylori infection require treatment with antimicrobial agents in addition to antisecretory drugs whether on first presentation with the illness or on recurrence.
Abstract: The National Institutes of Health Consensus Development Conference on Helicobacter pylori in Peptic Ulcer Disease brought together specialists in gastroenterology, surgery, infectious diseases, epidemiology, and pathology, as well as the public, to address the following questions: (1) What is the causal relationship of H pylori to upper gastrointestinal disease? (2) How does one diagnose and eradicate H pylori infection? (3) Does eradication of H pylori infection benefit the patient with peptic ulcer disease? (4) What is the relationship between H pylori infection and gastric malignancy? (5) Which H pylori-infected patients should be treated? (6) What are the most important questions that must be addressed by future research in H pylori infections? Following 1½ days of presentations by experts and discussion by the audience, a consensus panel weighed the evidence and prepared their consensus statement. Among their findings, the consensus panel concluded that (1) ulcer patients with H pylori infection require treatment with antimicrobial agents in addition to antisecretory drugs whether on first presentation with the illness or on recurrence; (2) the value of treating nonulcerative dyspepsia patients with H pylori infection remains to be determined; and (3) the interesting relationship between H pylori infection and gastric cancers requires further exploration. (JAMA. 1994;272:65-69)

Journal ArticleDOI
TL;DR: In this paper, the inversion layer mobility in n- and p-channel Si MOSFETs with a wide range of substrate impurity concentrations (10^15 to 10^18 cm^-3) was examined.
Abstract: This paper reports the studies of the inversion layer mobility in n- and p-channel Si MOSFETs with a wide range of substrate impurity concentrations (10^15 to 10^18 cm^-3). The validity and limitations of the universal relationship between the inversion layer mobility and the effective normal field (E_eff) are examined. It is found that the universality of both the electron and hole mobilities does hold up to 10^18 cm^-3. The E_eff dependences of the universal curves are observed to differ between electrons and holes, particularly at lower temperatures. This result means a different influence of surface roughness scattering on the electron and hole transports. On substrates with higher impurity concentrations, the electron and hole mobilities significantly deviate from the universal curves at lower surface carrier concentrations because of Coulomb scattering by the substrate impurity. Also, the deviation caused by the charged centers at the Si/SiO2 interface is observed in the mobility of MOSFETs degraded by Fowler-Nordheim electron injection.
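The effective normal field E_eff in this abstract is conventionally computed from the depletion and inversion layer charge densities as E_eff = (Q_dep + η·Q_inv)/ε_Si, with η usually taken as 1/2 for electrons and 1/3 for holes in universal-mobility plots. A minimal sketch of the arithmetic (function and variable names are ours):

```python
EPS_SI = 1.04e-12  # permittivity of silicon in F/cm (approx. 11.7 * 8.85e-14)

def effective_field(q_dep, q_inv, eta=0.5):
    """Effective normal field E_eff in V/cm.

    q_dep, q_inv: depletion and inversion layer charge densities in C/cm^2.
    eta: weighting of the inversion charge -- commonly 1/2 for electrons
         and 1/3 for holes when constructing universal-mobility curves.
    """
    return (q_dep + eta * q_inv) / EPS_SI
```

Plotting measured mobility against this quantity is what collapses data from different substrate dopings onto the "universal" curves the paper examines.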

Journal ArticleDOI
TL;DR: The authors argue that attribution patterns reflect implicit theories acquired from induction and socialization and hence differentially distributed across human cultures, and they test the hypothesis that dispositionalism in attribution for behavior reflects a theory of social behavior more widespread in individualist than collectivist cultures.
Abstract: The authors argue that attribution patterns reflect implicit theories acquired from induction and socialization and hence differentially distributed across human cultures. In particular, the authors tested the hypothesis that dispositionalism in attribution for behavior reflects a theory of social behavior more widespread in individualist than collectivist cultures.

Journal ArticleDOI
TL;DR: In this paper, the authors identify the distinctive strengths and limitations of university research and argue that industry is more effective in dealing with problems that are located close to the market place and that new policies will need to respect this division of labor.

Proceedings ArticleDOI
24 Apr 1994
TL;DR: Five experiments provide evidence that individuals’ interactions with computers are fundamentally social, and show that social responses to computers are not the result of conscious beliefs that computers are human or human-like.
Abstract: This paper presents a new experimental paradigm for the study of human-computer interaction. Five experiments provide evidence that individuals’ interactions with computers are fundamentally social. The studies show that social responses to computers are not the result of conscious beliefs that computers are human or human-like. Moreover, such behaviors do not result from users’ ignorance or from psychological or social dysfunctions, nor from a belief that subjects are interacting with programmers. Rather, social responses to computers are commonplace and easy to generate. The results reported here present numerous and unprecedented hypotheses, unexpected implications for design, new approaches to usability testing, and direct methods for verification.

Journal ArticleDOI
TL;DR: This article examines two candidate neural codes: information is represented in the spike rate of neurons, or information is represented in the precise timing of individual spikes typically observed in cerebral cortex.

Journal ArticleDOI
TL;DR: Mortality rates are increased at least 2-fold in RA, and are linked to clinical severity, with a large excess of deaths attributable to cardiovascular and cerebrovascular diseases.
Abstract: Objective. To determine the risk and causes of death and to quantify mortality predictors in patients with rheumatoid arthritis (RA). Methods. RA patients (n = 3,501) from 4 centers (Saskatoon n = 905, Wichita n = 1,405, Stanford n = 886, and Santa Clara n = 305) were followed for up to 35 years; 922 patients died. Results. The overall standardized mortality ratio (SMR) was 2.26 (Saskatoon 2.24, Wichita 1.98, Stanford 3.08, Santa Clara 2.18) and increased with time. Mortality was strikingly increased for specific causes: infection, lymphoproliferative malignancy, gastroenterologic, and RA. In addition, as an effect of the SMR of 2.26, the expected number of deaths was increased nonspecifically across all causes (except cancer), with a large excess of deaths attributable to cardiovascular and cerebrovascular diseases. Independent predictors of mortality included age, education, male sex, function, rheumatoid factor, nodules, erythrocyte sedimentation rate, joint count, and prednisone use.
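The standardized mortality ratio reported above is the number of observed deaths divided by the number expected if the cohort had died at reference-population rates, summed over strata such as age and sex bands. A toy illustration (stratum sizes and rates invented for the example):

```python
def expected_deaths(person_years, reference_rates):
    """Deaths expected if the cohort experienced reference-population
    death rates, summed over strata (e.g. age/sex bands)."""
    return sum(py * rate for py, rate in zip(person_years, reference_rates))

def smr(observed, person_years, reference_rates):
    """Standardized mortality ratio: observed deaths / expected deaths.
    An SMR of 2.26 means 2.26 times as many deaths as expected."""
    return observed / expected_deaths(person_years, reference_rates)
```

For example, a cohort contributing 1,000 and 2,000 person-years in two strata with reference rates of 0.005 and 0.0025 deaths per person-year has 10 expected deaths; observing 20 deaths gives an SMR of 2.0.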

Journal ArticleDOI
TL;DR: Differences in adjustment associated with variations in parenting are either maintained or increase over time, whereas the benefits of authoritative parenting are largely in the maintenance of previous levels of high adjustment, the deleterious consequences of neglectful parenting continue to accumulate.
Abstract: In a previous report, we demonstrated that adolescents' adjustment varies as a function of their parents' style (e.g., authoritative, authoritarian, indulgent, neglectful). This 1-year follow-up was conducted in order to examine whether the observed differences are maintained over time. In 1987, an ethnically and socioeconomically heterogeneous sample of approximately 2,300 14-18-year-olds provided information used to classify the adolescents' families into 1 of 4 parenting style groups. That year, and again 1 year later, the students completed a battery of standardized instruments tapping psychosocial development, school achievement, internalized distress, and behavior problems. Differences in adjustment associated with variations in parenting are either maintained or increase over time. However, whereas the benefits of authoritative parenting are largely in the maintenance of previous levels of high adjustment, the deleterious consequences of neglectful parenting continue to accumulate.

Proceedings ArticleDOI
24 Jul 1994
TL;DR: A new object-order volume rendering algorithm based on a shear-warp factorization of the viewing transformation is described that is significantly faster than published algorithms with minimal loss of image quality; the factorization is also extended to perspective viewing transformations.
Abstract: Several existing volume rendering algorithms operate by factoring the viewing transformation into a 3D shear parallel to the data slices, a projection to form an intermediate but distorted image, and a 2D warp to form an undistorted final image. We extend this class of algorithms in three ways. First, we describe a new object-order rendering algorithm based on the factorization that is significantly faster than published algorithms with minimal loss of image quality. Shear-warp factorizations have the property that rows of voxels in the volume are aligned with rows of pixels in the intermediate image. We use this fact to construct a scanline-based algorithm that traverses the volume and the intermediate image in synchrony, taking advantage of the spatial coherence present in both. We use spatial data structures based on run-length encoding for both the volume and the intermediate image. Our implementation running on an SGI Indigo workstation renders a 256^3 voxel medical data set in one second. Our second extension is a shear-warp factorization for perspective viewing transformations, and we show how our rendering algorithm can support this extension. Third, we introduce a data structure for encoding spatial coherence in unclassified volumes (i.e. scalar fields with no precomputed opacity). When combined with our shear-warp rendering algorithm this data structure allows us to classify and render a 256^3 voxel volume in three seconds. The method extends to support mixed volumes and geometry and is parallelizable.
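For the parallel-projection case, the shear in such a factorization can be read directly off the view direction: shearing each slice by (s_x, s_y) = (-v_x/v_z, -v_y/v_z) makes the viewing rays perpendicular to the slices, leaving only a 2D warp. A minimal sketch (z assumed to be the principal viewing axis; names ours):

```python
import numpy as np

def shear_for_view(view_dir):
    """Per-slice shear mapping a parallel view direction onto the z axis,
    assuming z is the principal viewing axis (|v_z| is the largest component)."""
    vx, vy, vz = view_dir
    sx, sy = -vx / vz, -vy / vz
    shear = np.array([[1.0, 0.0, sx],
                      [0.0, 1.0, sy],
                      [0.0, 0.0, 1.0]])
    return shear, (sx, sy)
```

After this shear, compositing reduces to traversing voxel rows in step with intermediate-image pixel rows, which is what makes the run-length-encoded scanline traversal possible; the residual image distortion is undone by the final 2D warp.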