
Showing papers by "University of Oxford" published in 1997


Proceedings ArticleDOI
28 Jul 1997
TL;DR: It is argued that the ease of implementation and more accurate estimation features of the new filter recommend its use over the EKF in virtually all applications.
Abstract: The Kalman Filter (KF) is one of the most widely used methods for tracking and estimation due to its simplicity, optimality, tractability and robustness. However, the application of the KF to nonlinear systems can be difficult. The most common approach is to use the Extended Kalman Filter (EKF) which simply linearizes all nonlinear models so that the traditional linear Kalman filter can be applied. Although the EKF (in its many forms) is a widely used filtering strategy, over thirty years of experience with it has led to a general consensus within the tracking and control community that it is difficult to implement, difficult to tune, and only reliable for systems which are almost linear on the time scale of the update intervals. In this paper a new linear estimator is developed and demonstrated. Using the principle that a set of discretely sampled points can be used to parameterize mean and covariance, the estimator yields performance equivalent to the KF for linear systems yet generalizes elegantly to nonlinear systems without the linearization steps required by the EKF. We show analytically that the expected performance of the new approach is superior to that of the EKF and, in fact, is directly comparable to that of the second order Gauss filter. The method is not restricted to assuming that the distributions of noise sources are Gaussian. We argue that the ease of implementation and more accurate estimation features of the new filter recommend its use over the EKF in virtually all applications.
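
For illustration, a minimal Python sketch of the sigma-point idea summarised above: a small, deterministically chosen set of points parameterizes the mean and covariance, is propagated through a nonlinear function, and the transformed moments are recomputed. The weighting scheme, the kappa parameter and the polar-to-Cartesian example are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a mean and covariance through a nonlinear function f using
    a symmetric set of 2n+1 sigma points (illustrative sketch only)."""
    n = len(mean)
    # Matrix square root of (n + kappa) * cov via Cholesky factorization
    sqrt_cov = np.linalg.cholesky((n + kappa) * cov)

    # Sigma points: the mean itself, plus/minus each column of the square root
    sigma = [mean]
    for i in range(n):
        sigma.append(mean + sqrt_cov[:, i])
        sigma.append(mean - sqrt_cov[:, i])
    sigma = np.array(sigma)

    # Weights: kappa/(n+kappa) for the central point, 1/(2(n+kappa)) otherwise
    weights = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    weights[0] = kappa / (n + kappa)

    # Push every sigma point through the nonlinearity, then recover moments
    transformed = np.array([f(s) for s in sigma])
    new_mean = weights @ transformed
    diff = transformed - new_mean
    new_cov = (weights[:, None] * diff).T @ diff
    return new_mean, new_cov

# Example: propagate a Gaussian through a polar-to-Cartesian conversion
f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
m, P = unscented_transform(np.array([1.0, 0.5]), np.diag([0.01, 0.09]), f)
```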

5,314 citations


Journal ArticleDOI
F. Kunst1, Naotake Ogasawara2, Ivan Moszer1, Alessandra M. Albertini3  +151 moreInstitutions (30)
20 Nov 1997-Nature
TL;DR: The complete genome sequence of Bacillus subtilis, the best-characterized member of the Gram-positive bacteria, contains at least ten prophages or remnants of prophages, indicating that bacteriophage infection has played an important evolutionary role in horizontal gene transfer, in particular in the propagation of bacterial pathogenesis.
Abstract: Bacillus subtilis is the best-characterized member of the Gram-positive bacteria. Its genome of 4,214,810 base pairs comprises 4,100 protein-coding genes. Of these protein-coding genes, 53% are represented once, while a quarter of the genome corresponds to several gene families that have been greatly expanded by gene duplication, the largest family containing 77 putative ATP-binding transport proteins. In addition, a large proportion of the genetic capacity is devoted to the utilization of a variety of carbon sources, including many plant-derived molecules. The identification of five signal peptidase genes, as well as several genes for components of the secretion apparatus, is important given the capacity of Bacillus strains to secrete large amounts of industrially important enzymes. Many of the genes are involved in the synthesis of secondary metabolites, including antibiotics, that are more typically associated with Streptomyces species. The genome contains at least ten prophages or remnants of prophages, indicating that bacteriophage infection has played an important evolutionary role in horizontal gene transfer, in particular in the propagation of bacterial pathogenesis.

3,753 citations


Journal ArticleDOI
TL;DR: This paper describes a new approach to low level image processing; in particular, edge and corner detection and structure preserving noise reduction and the resulting methods are accurate, noise resistant and fast.
Abstract: This paper describes a new approach to low level image processing; in particular, edge and corner detection and structure preserving noise reduction. Non-linear filtering is used to define which parts of the image are closely related to each individual pixel; each pixel has associated with it a local image region which is of similar brightness to that pixel. The new feature detectors are based on the minimization of this local image region, and the noise reduction method uses this region as the smoothing neighbourhood. The resulting methods are accurate, noise resistant and fast. Details of the new feature detectors and of the new noise reduction method are described, along with test results.
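
A rough Python sketch of the "local region of similar brightness" idea described above: for each pixel, neighbours of similar brightness within a circular mask are counted, and a small region signals an edge or corner. The mask radius, brightness threshold and smooth similarity function are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def similar_brightness_response(image, radius=3, t=27.0, g_frac=0.5):
    """For each pixel, measure the area of the surrounding region whose
    brightness is close to that pixel, and respond where this area falls
    below a geometric threshold (illustrative, unoptimised sketch)."""
    h, w = image.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    mask = (ys ** 2 + xs ** 2) <= radius ** 2
    offsets = np.argwhere(mask) - radius
    g = g_frac * len(offsets)                     # geometric threshold

    response = np.zeros_like(image, dtype=float)
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            centre = image[y, x]
            area = 0.0
            for dy, dx in offsets:
                # Smooth similarity between the neighbour and the centre pixel
                area += np.exp(-((image[y + dy, x + dx] - centre) / t) ** 6)
            # A smaller similar-brightness area suggests an edge or corner
            response[y, x] = max(0.0, g - area)
    return response
```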

3,669 citations


Journal ArticleDOI
TL;DR: Interactions among extrastriate, inferotemporal, and posterior parietal regions during visual processing, under different attentional and perceptual conditions, are focused on.

2,917 citations


Journal ArticleDOI
Eugenia E. Calle1, Clark W. Heath1, R. J. Coates2, Jonathan M. Liff2  +191 moreInstitutions (45)
TL;DR: Of the many factors examined that might affect the relation between breast cancer risk and use of HRT, only a woman's weight and body-mass index had a material effect: the increase in the relative risk of breast cancer diagnosed in women using HRT and associated with long durations of use in current and recent users was greater for women of lower than of higher weight or body-mass index.

2,343 citations


Journal ArticleDOI
TL;DR: Seq-Gen is a program that will simulate the evolution of nucleotide sequences along a phylogeny, using common models of the substitution process, including the general reversible model.
Abstract: PSeq-Gen will simulate the evolution of protein sequences along evolutionary trees following the procedures previously reported for the DNA sequence simulator Seq-Gen (Sequence-Generator, Rambaut and Grassly, 1997). Statistics calculated from these sequences can be used to give expectations under specific null hypotheses of protein evolution. This Monte Carlo simulation approach to testing hypotheses is often termed 'parametric bootstrapping' (see, for example, Efron, 1985; Huelsenbeck et al., 1996; Huelsenbeck and Rannala, 1997), and has many powerful applications, such as testing the molecular clock (Goldman, 1993), detecting recombination (Grassly and Holmes, 1997), and evaluating competing phylogenetic hypotheses (Hillis et al., 1996). Three common models of amino acid substitution are implemented (PAM: Dayhoff et al., 1978; JTT: Jones et al., 1992; mtREV: Adachi and Hasegawa, 1995). These models use instantaneous rate matrices derived from observed patterns of accepted point mutations for aligned nuclear (PAM, JTT) and mitochondrial (mtREV) genes. The amino acid frequencies found for these alignments are the default in each model, but it is also possible to specify their frequencies independently in the implementation here [following Adachi (1995), but see also Kishino et al. (1990)]. In addition, site-specific rate heterogeneity following a gamma distribution is allowed (as described in Rambaut and Grassly, 1997; Yang, 1993). As with Seq-Gen, any number of trees may be read in and any number of data sets can be simulated for each tree, allowing large sets of replicate simulations to be created easily. Maximum-likelihood algorithms exist which will reconstruct phylogenies from protein sequences using the PAM, JTT and mtREV substitution models (e.g. Adachi and Hasegawa, 1995; Yang, 1996), and hence bootstrapping to obtain confidence limits about phylogenetic parameters is easily achieved.
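
As a hedged illustration of the Monte Carlo simulation approach described above, the sketch below evolves a sequence along a small tree by drawing child states from the transition matrix P(t) = expm(Qt). It uses a toy Jukes-Cantor DNA model purely for brevity; the programs discussed use empirical amino-acid (or general reversible nucleotide) matrices, and none of the names below come from their code.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)

def jc69_rate_matrix(mu=1.0):
    """Toy Jukes-Cantor rate matrix: equal substitution rates, zero row sums."""
    q = np.full((4, 4), mu / 4.0)
    np.fill_diagonal(q, 0.0)
    np.fill_diagonal(q, -q.sum(axis=1))
    return q

def evolve(seq, branch_length, q):
    """Draw a child sequence given a parent sequence and a branch length."""
    p = expm(q * branch_length)                 # transition probabilities P(t)
    return np.array([rng.choice(4, p=p[s] / p[s].sum()) for s in seq])

def simulate(tree, seq, q):
    """tree is either a leaf name or a tuple (left, left_len, right, right_len)."""
    if isinstance(tree, str):
        return {tree: seq}
    left, bl_left, right, bl_right = tree
    leaves = simulate(left, evolve(seq, bl_left, q), q)
    leaves.update(simulate(right, evolve(seq, bl_right, q), q))
    return leaves

q = jc69_rate_matrix()
root = rng.choice(4, size=200)                  # random ancestral sequence
leaves = simulate((("A", 0.1, "B", 0.1), 0.2, "C", 0.3), root, q)
```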

1,681 citations


Journal ArticleDOI
TL;DR: Using intense synchrotron sources, it is observed that six different ex vivo amyloid fibrils and two synthetic fibril preparations all gave similar high-resolution X-ray fibre diffraction patterns, consistent with a helical array of beta-sheets parallel to the fibre long axis, with the strands perpendicular to this axis, which confirms that amyloid fibrils comprise a structural superfamily and share a common protofilament substructure.

1,648 citations


Journal ArticleDOI
01 Mar 1997-Brain
TL;DR: The results show the expression of amyloid precursor protein in damaged axons within acute multiple sclerosis lesions, and in the active borders of less acute lesions, which may have implications for the design and timing of therapeutic intervention.
Abstract: One of the histological hallmarks of early multiple sclerosis lesions is primary demyelination, with myelin destruction and relative sparing of axons. On the other hand, it is widely accepted that axonal loss occurs in, and is responsible for, the permanent disability characterizing the later chronic progressive stage of the disease. In this study, we have used an antibody against amyloid precursor protein, known to be a sensitive marker of axonal damage in a number of other contexts, in immunocytochemical experiments on paraffin embedded multiple sclerosis lesions of varying ages in order to see at which stage of the disease axonal damage, in addition to demyelination, occurs and may thus contribute to the development of disability in patients. The results show the expression of amyloid precursor protein in damaged axons within acute multiple sclerosis lesions, and in the active borders of less acute lesions. This observation may have implications for the design and timing of therapeutic intervention, one of the most important aims of which must be the reduction of permanent disability.

1,532 citations


Journal ArticleDOI
01 Aug 1997-Pain
TL;DR: The results indicate that if a patient records a baseline VAS score in excess of 30 mm they would probably have recorded at least moderate pain on a 4‐point categorical scale.
Abstract: One way to ensure adequate sensitivity for analgesic trials is to test the intervention on patients who have established pain of moderate to severe intensity. The usual criterion is at least moderate pain on a categorical pain intensity scale. When visual analogue scales (VAS) are the only pain measure in trials we need to know what point on a VAS represents moderate pain, so that these trials can be included in meta-analysis when baseline pain of at least moderate intensity is an inclusion criterion. To investigate this we used individual patient data from 1080 patients from randomised controlled trials of various analgesics. Baseline pain was measured using a 4-point categorical pain intensity scale and a pain intensity VAS under identical conditions. The distribution of the VAS scores was examined for 736 patients reporting moderate pain and for 344 reporting severe pain. The VAS scores corresponding to moderate or severe pain were also examined by gender. Baseline VAS scores recorded by patients reporting moderate pain were significantly different from those of patients reporting severe pain. Of the patients reporting moderate pain 85% scored over 30 mm on the corresponding VAS, with a mean score of 49 mm. For those reporting severe pain 85% scored over 54 mm with a mean score of 75 mm. There was no difference between the corresponding VAS scores of men and women. Our results indicate that if a patient records a baseline VAS score in excess of 30 mm they would probably have recorded at least moderate pain on a 4-point categorical scale.
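
A small illustrative calculation of the kind of cutoff analysis described above, using synthetic VAS scores generated only to echo the reported means, not the trial data themselves.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical VAS scores (0-100 mm), generated purely to illustrate the analysis
vas_moderate = rng.normal(49, 13, 736).clip(0, 100)
vas_severe = rng.normal(75, 13, 344).clip(0, 100)

for label, scores in [("moderate", vas_moderate), ("severe", vas_severe)]:
    over_cutoff = 100.0 * np.mean(scores > 30)
    print(f"{label}: mean {scores.mean():.0f} mm, {over_cutoff:.0f}% score over 30 mm")
```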

1,385 citations


Journal ArticleDOI
TL;DR: In this middle-aged and elderly population, moderate alcohol consumption slightly reduced overall mortality, and the benefit depended in part on age and background cardiovascular risk and was far smaller than the large increase in risk produced by tobacco.
Abstract: Background Alcohol consumption has both adverse and beneficial effects on survival. We examined the balance of these in a large prospective study of mortality among U.S. adults. Methods Of 490,000 men and women (mean age, 56 years; range, 30 to 104) who reported their alcohol and tobacco use in 1982, 46,000 died during nine years of follow-up. We compared cause-specific death rates and rates of death from all causes across categories of base-line alcohol consumption, adjusting for other risk factors, and related drinking and smoking habits to the cumulative probability of dying between the ages of 35 and 69 years. Results Causes of death associated with drinking were cirrhosis and alcoholism; cancers of the mouth, esophagus, pharynx, larynx, and liver combined; breast cancer in women; and injuries and other external causes in men. The mortality from breast cancer was 30 percent higher among women reporting at least one drink daily than among nondrinkers (relative risk, 1.3; 95 percent confidence interval,...

1,262 citations


Journal ArticleDOI
20 Mar 1997-Nature
TL;DR: It is shown that targeted disruption of the MSR-A gene in mice results in a reduction in the size of atherosclerotic lesions in animals deficient in apolipoprotein E, and that MSR-A-deficient mice are more susceptible to infection, indicating that MSR-A may play a part in host defence against pathogens.
Abstract: Macrophage type-I and type-II class-A scavenger receptors (MSR-A) are implicated in the pathological deposition of cholesterol during atherogenesis as a result of receptor-mediated uptake of modified low-density lipoproteins (mLDL)1–6. MSR-A can bind an extraordinarily wide range of ligands, including bacterial pathogens7, and also mediates cation-independent macrophage adhesion in vitro8. Here we show that targeted disruption of the MSR-A gene in mice results in a reduction in the size of atherosclerotic lesions in an animal deficient in apolipoprotein E. Macrophages from MSR-A-deficient mice show a marked decrease in mLDL uptake in vitro, whereas mLDL clearance from plasma occurs at a normal rate, indicating that there may be alternative mechanisms for removing mLDL from the circulation. In addition, MSR-A-knockout mice show an increased susceptibility to infection with Listeria monocytogenes or herpes simplex virus type-1, indicating that MSR-A may play a part in host defence against pathogens.

Journal ArticleDOI
TL;DR: Among six donors who made a strong CTL response to an immunodominant HLA-B27-restricted epitope, the two who progressed to AIDS showed CTL escape to fixation by the same mutation, but only after 9–12 years of epitope stability.
Abstract: The precise role played by HIV-specific cytotoxic T lymphocytes (CTL) in HIV infection remains controversial. Despite strong CTL responses being generated during the asymptomatic phase, the virus persists and AIDS ultimately develops. It has been argued that the virus is so variable, and the virus turnover so great, that escape from CTL recognition would occur continually, but so far there is limited evidence for CTL escape. The opposing argument is that evidence for CTL escape is present but hard to find because multiple anti-HIV immune responses are acting simultaneously during the asymptomatic phase of infection. We describe six donors who make a strong CTL response to an immunodominant HLA-B27-restricted epitope. In the two donors who progressed to AIDS, CTL escape to fixation by the same mutation was observed, but only after 9-12 years of epitope stability. CTL escape may play an important role in the pathogenesis of HIV infection.

Journal ArticleDOI
TL;DR: Anatomical, electrophysiological, psychophysical and brain-imaging studies have contributed to elucidating the functional organization of visual confusions, finding that dyslexics may be unable to process fast incoming sensory information adequately in any domain.

Journal ArticleDOI
TL;DR: A simple method of obtaining optical sectioning in a conventional wide-field microscope by projecting a single-spatial-frequency grid pattern onto the object and processing images that are substantially similar to those obtained with confocal microscopes is described.
Abstract: We describe a simple method of obtaining optical sectioning in a conventional wide-field microscope by projecting a single-spatial-frequency grid pattern onto the object. Images taken at three spatial positions of the grid are processed in real time to produce optically sectioned images that are substantially similar to those obtained with confocal microscopes.
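
A minimal sketch of the image-processing step implied above, assuming the three exposures are taken with the grid shifted by one third of its period; the square-root-of-squared-differences combination is the commonly quoted form of this approach, shown here for illustration rather than as the authors' exact algorithm.

```python
import numpy as np

def optical_section(i1, i2, i3):
    """Combine three wide-field images taken with the illumination grid shifted
    by one third of its period between exposures; the uniform (out-of-focus)
    background cancels and only the grid-modulated, in-focus signal remains."""
    i1, i2, i3 = (np.asarray(x, dtype=float) for x in (i1, i2, i3))
    return np.sqrt((i1 - i2) ** 2 + (i1 - i3) ** 2 + (i2 - i3) ** 2)

# A conventional (non-sectioned) image is recoverable from the same data as
# the average of the three exposures: (i1 + i2 + i3) / 3.
```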

Journal ArticleDOI
TL;DR: From an analysis of the distributions of measures of transmission rates among hosts, an empirical relationship is identified suggesting that, typically, 20% of the host population contributes at least 80% of the net transmission potential, as measured by the basic reproduction number, R0.
Abstract: From an analysis of the distributions of measures of transmission rates among hosts, we identify an empirical relationship suggesting that, typically, 20% of the host population contributes at least 80% of the net transmission potential, as measured by the basic reproduction number, R0. This is an example of a statistical pattern known as the 20/80 rule. The rule applies to a variety of disease systems, including vector-borne parasites and sexually transmitted pathogens. The rule implies that control programs targeted at the “core” 20% group are potentially highly effective and, conversely, that programs that fail to reach all of this group will be much less effective than expected in reducing levels of infection in the population as a whole.
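
For illustration, a short Python sketch of how the "20/80" share can be computed from per-host transmission contributions; the gamma-distributed example population is hypothetical.

```python
import numpy as np

def top_share(contributions, top_fraction=0.2):
    """Fraction of total transmission potential contributed by the most
    infectious `top_fraction` of hosts (illustrative calculation)."""
    c = np.sort(np.asarray(contributions, dtype=float))[::-1]
    k = max(1, int(round(top_fraction * len(c))))
    return c[:k].sum() / c.sum()

# Hypothetical, highly skewed host population (gamma-distributed contributions)
rng = np.random.default_rng(42)
hosts = rng.gamma(shape=0.3, scale=1.0, size=10_000)
print(f"Top 20% of hosts contribute {100 * top_share(hosts):.0f}% of transmission")
```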

Journal ArticleDOI
27 Feb 1997-Nature
TL;DR: Biophysical studies suggest that partly folded intermediates are involved in fibrillogenesis, and this may be relevant to amyloidosis generally.
Abstract: Tissue deposition of soluble proteins as amyloid fibrils underlies a range of fatal diseases. The two naturally occurring human lysozyme variants are both amyloidogenic, and are shown here to be unstable. They aggregate to form amyloid fibrils with transformation of the mainly helical native fold, observed in crystal structures, to the amyloid fibril cross-beta fold. Biophysical studies suggest that partly folded intermediates are involved in fibrillogenesis, and this may be relevant to amyloidosis generally.

Journal ArticleDOI
TL;DR: The profile of the PDQ-39 should be of value in studies aimed at determining the impact of treatment regimes upon particular aspects of functioning and well-being in patients with Parkinson's disease, while the PDSI will provide a summary score ofThe impact of the illness on functioning andWell-being and will be of use in the evaluation of the overall effect of different treatments.
Abstract: Objectives: to briefly outline the development and validation of the Parkinson's Disease Questionnaire (PDQ-39) and then to provide evidence for the use of the measure as either a profile of health status scores or a single index figure. Design: the PDQ-39 was administered in two surveys: a postal survey of patients registered with local branches of the Parkinson's Disease Society of Great Britain (n = 405) and a survey of patients attending neurology clinics for treatment for Parkinson's disease (n = 146). Data from the eight dimensions of the PDQ-39 were factor-analysed. This produced a single factor on the data from both surveys. Outcome measures: the eight dimensions of the PDQ-39 and the new single index score—the Parkinson's disease summary index (PDSI), together with clinical assessments (the Columbia rating scale and the Hoehn and Yahr staging score). Results: in the postal survey 227 patients returned questionnaires (58.2%). All 146 patients approached in the clinic sample agreed to take part. Higher-order principal-components factor analysis was undertaken on the eight dimensions of the PDQ-39 and produced one factor on both datasets. Consequently it was decided that the scores of the eight domains could be summed to produce a single index figure. The psychometric properties of this index were explored using reliability tests and tests of construct validity. The newly derived single index was found to be both internally reliable and valid. Discussion: data from the PDQ-39 can be presented either in profile form or as a single index figure. The profile should be of value in studies aimed at determining the impact of treatment regimes upon particular aspects of functioning and well-being in patients with Parkinson's disease, while the PDSI will provide a summary score of the impact of the illness on functioning and well-being and will be of use in the evaluation of the overall effect of different treatments. Furthermore, the PDSI reduces the number of statistical comparisons and hence the role of chance when exploring data from the PDQ-39.
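
A minimal sketch of collapsing the eight PDQ-39 dimension scores into a single summary figure, as described above. The dimension names, the 0-100 scaling and the use of a simple average are assumptions for illustration; the published scoring rules are authoritative.

```python
import numpy as np

# Assumed dimension names and 0-100 scaling (higher = worse), for illustration only
DIMENSIONS = ["mobility", "activities_of_daily_living", "emotional_wellbeing",
              "stigma", "social_support", "cognition", "communication",
              "bodily_discomfort"]

def summary_index(scores):
    """Collapse the eight dimension scores into one summary figure by averaging."""
    return float(np.mean([scores[d] for d in DIMENSIONS]))

patient = dict(zip(DIMENSIONS, [40, 35, 20, 10, 5, 25, 15, 30]))
print(f"Summary index: {summary_index(patient):.1f}")
```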

Journal ArticleDOI
Mark Pagel1
TL;DR: A set of maximum likelihood statistical methods for inferring historical evolutionary processes and anticipating the wealth of information becoming available to biological scientists from genetic studies that pin down relationships among organisms with unprecedented accuracy are described.
Abstract: Evolutionary processes shape the regular trends of evolution and are responsible for the diversity and distribution of contemporary species. They include correlated evolutionary change and trajectories of trait evolution, convergent and parallel evolution, differential rates of evolution, speciation and extinction, the order and direction of change in characters, and the nature of the evolutionary process itself—does change accumulate gradually, episodically, or in punctuational bursts. Phylogenies, in combination with information on species, contain the imprint of these historical evolutionary processes. By applying comparative methods based upon statistical models of evolution to well resolved phylogenies, it is possible to infer the historical evolutionary processes that must have existed in the past, given the patterns of diversity seen in the present. I describe a set of maximum likelihood statistical methods for inferring such processes. The methods estimate parameters of statistical models for inferring correlated evolutionary change in continuously varying characters, for detecting correlated evolution in discrete characters, for estimating rates of evolution, and for investigating the nature of the evolutionary process itself. They also anticipate the wealth of information becoming available to biological scientists from genetic studies that pin down relationships among organisms with unprecedented accuracy.

Journal ArticleDOI
TL;DR: In this paper, the authors show that the moral hazard problem may not disappear and capital requirements alone may not achieve the socially efficient allocation, whereas that allocation can be achieved by also using a deposit rate control.
Abstract: Capital requirements are traditionally viewed as an effective form of prudential regulation - by increasing capital the bank internalizes more of the risk of its investment decisions. While the traditional view is accurate in the sense that capital requirement can be effective in combating moral hazard, we find, in contrast, that capital requirements are Pareto inefficient. With deposit insurance, freely determined deposit rates undermine prudent bank behavior. To induce a bank to choose to make prudent investments, the bank must have sufficient franchise value at risk. Free deposit rates combined with competitive markets serve to reduce franchise value to the point where banks gamble. Deposit rate controls create franchise value by increasing the per-period profits of the bank. We find that deposit rate controls combined with capital requirements can more inexpensively replicate any outcome that is induced using capital requirements alone. Even in an economy where the government can credibly commit not to offer deposit insurance, the moral hazard problem may still not disappear and capital requirements alone may not achieve the socially efficient allocation, whereas that allocation can be achieved by also using a deposit rate control.

Proceedings ArticleDOI
04 Jun 1997
TL;DR: It is proved that this algorithm yields consistent estimates irrespective of the actual correlations, which is illustrated in an application of decentralised estimation where it is impossible to consistently use a Kalman filter.
Abstract: This paper addresses the problem of estimation when the cross-correlation in the errors between different random variables are unknown. A new data fusion algorithm, the covariance intersection algorithm (CI), is presented. It is proved that this algorithm yields consistent estimates irrespective of the actual correlations. This property is illustrated in an application of decentralised estimation where it is impossible to consistently use a Kalman filter.
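
A compact Python sketch of the covariance intersection update described above: the fused inverse covariance is a convex combination of the two inverse covariances, with the weight omega chosen here by minimising the determinant of the fused covariance (one common criterion). The optimisation routine and example numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(a, pa, b, pb):
    """Fuse two estimates (a, Pa) and (b, Pb) whose cross-correlation is unknown:
    the fused inverse covariance is a convex combination of the two inverse
    covariances, weighted by omega."""
    pa_inv, pb_inv = np.linalg.inv(pa), np.linalg.inv(pb)

    def fused_cov(omega):
        return np.linalg.inv(omega * pa_inv + (1.0 - omega) * pb_inv)

    # Choose omega by minimising the determinant of the fused covariance
    res = minimize_scalar(lambda w: np.linalg.det(fused_cov(w)),
                          bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    omega = res.x
    pc = fused_cov(omega)
    c = pc @ (omega * pa_inv @ a + (1.0 - omega) * pb_inv @ b)
    return c, pc, omega

# Example with two 2-D estimates of the same quantity
a, pa = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
b, pb = np.array([0.5, 0.3]), np.diag([0.5, 2.0])
c, pc, omega = covariance_intersection(a, pa, b, pb)
```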

Journal ArticleDOI
14 Aug 1997-Nature
TL;DR: It was found that A2aR-knockout mice were viable and bred normally, and male mice were much more aggressive towards intruders, whereas caffeine, which normally stimulates exploratory behaviour, became a depressant of exploratory activity.
Abstract: Adenosine is released from metabolically active cells by facilitated diffusion, and is generated extracellularly by degradation of released ATP. It is a potent biological mediator that modulates the activity of numerous cell types, including various neuronal populations, platelets, neutrophils and mast cells, and smooth muscle cells in bronchi and vasculature. Most of these effects help to protect cells and tissues during stress conditions such as ischaemia. Adenosine mediates its effects through four receptor subtypes: the A1, A2a, A2b and A3 receptors. The A2a receptor (A2aR) is abundant in basal ganglia, vasculature and platelets, and stimulates adenylyl cyclase. It is a major target of caffeine, the most widely used psychoactive drug. Here we investigate the role of the A2a receptor by disrupting the gene in mice. We found that A2aR-knockout (A2aR-/-) mice were viable and bred normally. Their exploratory activity was reduced, whereas caffeine, which normally stimulates exploratory behaviour, became a depressant of exploratory activity. Knockout animals scored higher in anxiety tests, and male mice were much more aggressive towards intruders. The response of A2aR-/- mice to acute pain stimuli was slower. Blood pressure and heart rate were increased, as well as platelet aggregation. The specific A2a agonist CGS 21680 lost its biological activity in all systems tested.

Journal ArticleDOI
01 Mar 1997-Brain
TL;DR: PET was used to image the neural system underlying visuospatial attention; activations in the right anterior cingulate gyrus (Brodmann area 24), the intraparietal sulcus of right posterior parietal cortex, and the mesial and lateral premotor cortices were observed to form the core of a neural network for spatial attention.
Abstract: PET was used to image the neural system underlying visuospatial attention. Analysis of data at both the group and individual-subject level provided anatomical resolution superior to that described to date. Six right-handed male subjects were selected from a pilot behavioural study in which behavioural responses and eye movements were recorded. The attention tasks involved covert shifts of attention, where peripheral cues indicated the location of subsequent target stimuli to be discriminated. One attention condition emphasized reflexive aspects of spatial orientation, while the other required controlled shifts of attention. PET activations agreed closely with the cortical regions recently proposed to form the core of a neural network for spatial attention. The two attention tasks evoked largely overlapping patterns of neural activation, supporting the existence of a general neural system for visuospatial attention with regional functional specialization. Specifically, neocortical activations were observed in the right anterior cingulate gyrus (Brodmann area 24), in the intraparietal sulcus of right posterior parietal cortex, and in the mesial and lateral premotor cortices (Brodmann area 6).

Journal ArticleDOI
TL;DR: Calcium transients were of a similar magnitude and duration in response to both mannitol and isoosmolar concentrations of salt, suggesting that a factor other than calcium is involved in the discrimination between drought and salinity signals in Arabidopsis.
Abstract: Changes in cytosolic free calcium concentration ([Ca2+]cyt) in response to mannitol (drought) and salt treatments were detected in vivo in intact whole Arabidopsis seedlings. Transient elevations of [Ca2+]cyt to around 1.5 microM were observed, and these were substantially inhibited by pretreatment with the calcium-channel blocker lanthanum and to a lesser extent, the calcium-chelator EGTA. The expression of three genes, p5cs, which encodes delta(1)-pyrroline-5-carboxylate synthetase (P5CS), the first enzyme of the proline biosynthesis pathway, rab18 and Iti78 which both encode proteins of unknown function, was induced by mannitol and salt treatments. The induction of all three genes by mannitol was inhibited by pretreatment with lanthanum. Salt-induced p5cs, but not rab18 and Iti78, expression was also inhibited by lanthanum. Induction of p5cs by mannitol was also inhibited by the calcium channel-blockers gadolinium and verapamil and the calcium chelator EGTA, further suggesting the involvement of calcium signalling in this response. Mannitol induced greater levels of p5cs gene expression than an isoosmolar concentration of salt, at both relatively high and low concentrations. However, calcium transients were of a similar magnitude and duration in response to both mannitol and isoosmolar concentrations of salt, suggesting that a factor other than calcium is involved in the discrimination between drought and salinity signals in Arabidopsis. In order to gauge the involvement of the vacuole as an intracellular calcium store in the response of Arabidopsis to mannitol, [Ca2+]cyt was measured at the microdomain adjacent to the vacuolar membrane. The results obtained were consistent with a significant calcium release from the vacuole contributing to the overall mannitol-induced [Ca2+]cyt response. Data obtained by using inhibitors of inositol signalling suggested that this release was occurring through IP3-dependent calcium channels.

Journal ArticleDOI
TL;DR: The Meta-Cognitions Questionnaire, developed to measure beliefs about worry and intrusive thoughts, showed good psychometric properties on a range of indices of reliability and validity; regression analyses showed that the independent predictors of worry were Positive Beliefs about Worry, Negative Beliefs about the Controllability of Thoughts and Corresponding Danger, and Cognitive Confidence.

Journal ArticleDOI
TL;DR: It is concluded that percentile bootstrap confidence interval methods provide a promising approach to estimating the uncertainty of ICER point estimates; however, successive bootstrap estimates of bias and standard error suggest that these may be unstable, so a cautious interpretation of such estimates is strongly recommended.
Abstract: The statistic of interest in the economic evaluation of health care interventions is the incremental cost effectiveness ratio (ICER), which is defined as the difference in cost between two treatment interventions over the difference in their effect. Where patient-specific data on costs and health outcomes are available, it is natural to attempt to quantify uncertainty in the estimated ICER using confidence intervals. Recent articles have focused on parametric methods for constructing confidence intervals. In this paper, we describe the construction of non-parametric bootstrap confidence intervals. The advantage of such intervals is that they do not depend on parametric assumptions of the sampling distribution of the ICER. We present a detailed description of the non-parametric bootstrap applied to data from a clinical trial, in order to demonstrate the strengths and weaknesses of the approach. By examining the bootstrap confidence limits successively as the number of bootstrap replications increases, we conclude that percentile bootstrap confidence interval methods provide a promising approach to estimating the uncertainty of ICER point estimates. However, successive bootstrap estimates of bias and standard error suggest that these may be unstable; accordingly, we strongly recommend a cautious interpretation of such estimates.
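
A minimal Python sketch of the non-parametric percentile bootstrap for the ICER described above; patient-level costs and effects are resampled with replacement within each arm. Variable names are illustrative, and the instability noted in the paper shows up here when the bootstrapped effect difference approaches zero.

```python
import numpy as np

def icer_percentile_ci(cost_a, eff_a, cost_b, eff_b,
                       n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap for the ICER (difference in mean cost over the
    difference in mean effect); patients are resampled with replacement
    within each treatment arm, keeping each patient's cost and effect paired."""
    rng = np.random.default_rng(seed)
    ratios = np.empty(n_boot)
    for i in range(n_boot):
        ia = rng.integers(0, len(cost_a), len(cost_a))
        ib = rng.integers(0, len(cost_b), len(cost_b))
        d_cost = cost_a[ia].mean() - cost_b[ib].mean()
        d_eff = eff_a[ia].mean() - eff_b[ib].mean()
        ratios[i] = d_cost / d_eff          # unstable when d_eff is near zero
    point = (cost_a.mean() - cost_b.mean()) / (eff_a.mean() - eff_b.mean())
    lower, upper = np.percentile(ratios, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return point, (lower, upper)
```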

Journal ArticleDOI
TL;DR: A variety of robust methods for the computation of the Fundamental Matrix, the calibration-free representation of camera motion, are developed from the principal categories of robust estimators, viz. case deletion diagnostics, M-estimators and random sampling, and the theory required to apply them to non-linear orthogonal regression problems is developed.
Abstract: This paper has two goals. The first is to develop a variety of robust methods for the computation of the Fundamental Matrix, the calibration-free representation of camera motion. The methods are drawn from the principal categories of robust estimators, viz. case deletion diagnostics, M-estimators and random sampling, and the paper develops the theory required to apply them to non-linear orthogonal regression problems. Although a considerable amount of interest has focussed on the application of robust estimation in computer vision, the relative merits of the many individual methods are unknown, leaving the potential practitioner to guess at their value. The second goal is therefore to compare and judge the methods. Comparative tests are carried out using correspondences generated both synthetically in a statistically controlled fashion and from feature matching in real imagery. In contrast with previously reported methods the goodness of fit to the synthetic observations is judged not in terms of the fit to the observations per se but in terms of fit to the ground truth. A variety of error measures are examined. The experiments allow a statistically satisfying and quasi-optimal method to be synthesized, which is shown to be stable with up to 50 percent outlier contamination, and may still be used if there are more than 50 percent outliers. Performance bounds are established for the method, and a variety of robust methods to estimate the standard deviation of the error and covariance matrix of the parameters are examined. The results of the comparison have broad applicability to vision algorithms where the input data are corrupted not only by noise but also by gross outliers.
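
As a hedged illustration of the random-sampling category of robust estimators discussed above, the sketch below runs a plain RANSAC loop around an unnormalised eight-point fit with a Sampson error test. It is a simplified stand-in, not the quasi-optimal method synthesised in the paper (which would include, among other things, coordinate normalisation and more careful refinement).

```python
import numpy as np

def eight_point(x1, x2):
    """Linear (unnormalised) eight-point estimate of the fundamental matrix
    from >= 8 correspondences given as N x 2 arrays; simplified sketch."""
    a = np.zeros((len(x1), 9))
    for i, ((u1, v1), (u2, v2)) in enumerate(zip(x1, x2)):
        a[i] = [u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1.0]
    _, _, vt = np.linalg.svd(a)
    f = vt[-1].reshape(3, 3)
    # Enforce the rank-2 constraint on F
    u, s, vt = np.linalg.svd(f)
    s[2] = 0.0
    return u @ np.diag(s) @ vt

def sampson_error(f, x1, x2):
    """First-order geometric (Sampson) distance for each correspondence."""
    ones = np.ones((len(x1), 1))
    p1 = np.hstack([x1, ones])
    p2 = np.hstack([x2, ones])
    fx1 = p1 @ f.T            # rows are F x1
    ftx2 = p2 @ f             # rows are F^T x2
    num = np.sum(p2 * fx1, axis=1) ** 2
    den = fx1[:, 0] ** 2 + fx1[:, 1] ** 2 + ftx2[:, 0] ** 2 + ftx2[:, 1] ** 2
    return num / den

def ransac_fundamental(x1, x2, n_iter=500, thresh=1.0, seed=0):
    """Fit minimal 8-point samples, keep the model with the largest inlier
    set, then refit on those inliers (assumes at least 8 inliers exist)."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(x1), 8, replace=False)
        f = eight_point(x1[idx], x2[idx])
        inliers = sampson_error(f, x1, x2) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return eight_point(x1[best_inliers], x2[best_inliers]), best_inliers
```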

Journal ArticleDOI
20 Mar 1997-Nature
TL;DR: This work describes the construction of oligopeptides, rationally designed or based on segments of native proteins, that aggregate in suitable solvents into long, semi-flexible β-sheet tapes and suggests that it should be possible to engineer a wide range of properties in these gels by appropriate choice of the peptide primary structure.
Abstract: Molecular self-assembly is becoming an increasingly popular route to new supramolecular structures and molecular materials. The inspiration for such structures is commonly derived from self-assembling systems in biology. Here we show that a biological motif, the peptide beta-sheet, can be exploited in designed oligopeptides that self-assemble into polymeric tapes and with potentially useful mechanical properties. We describe the construction of oligopeptides, rationally designed or based on segments of native proteins, that aggregate in suitable solvents into long, semi-flexible beta-sheet tapes. These become entangled even at low volume fractions to form gels whose viscoelastic properties can be controlled by chemical (pH) or physical (shear) influences. We suggest that it should be possible to engineer a wide range of properties in these gels by appropriate choice of the peptide primary structure.

Journal ArticleDOI
01 Mar 1997-Geology
TL;DR: In the western Southern Alps of New Zealand, landslides are scale invariant and have a power-law magnitude-frequency distribution, an observation of critical importance for evaluating the impact of events of different length scales over different time intervals on landscape evolution.
Abstract: In humid uplands landsliding is the dominant mass wasting process. In the western Southern Alps of New Zealand landslides are scale invariant and have a power-law magnitude frequency distribution. Independent studies from other regions suggest that this is a general property of landsliding. This observation is of critical importance to the evaluation of the impact of events of different length scales over different time intervals on landscape evolution. It is particularly useful when estimating regional geomorphic rates, because it constrains the frequency and overall significance of extreme events, which cannot otherwise be evaluated. By integrating the complete response of the system, we estimate the regional denudation rate due to landsliding to be 9 ± 4 mm yr⁻¹. Sediment discharge from the western Southern Alps is dominated by landslide-derived material.
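
For illustration, a short sketch of fitting the exponent of a power-law magnitude-frequency distribution from a landslide inventory using the maximum-likelihood (Hill) estimator; the inventory here is synthetic and the estimator choice is an assumption, not the paper's method.

```python
import numpy as np

def powerlaw_exponent(areas, a_min):
    """Maximum-likelihood (Hill) estimate of the exponent alpha of a power-law
    magnitude-frequency distribution p(A) ~ A**(-alpha) above a cutoff a_min."""
    a = np.asarray(areas, dtype=float)
    a = a[a >= a_min]
    return 1.0 + len(a) / np.sum(np.log(a / a_min))

# Synthetic landslide inventory (areas in m^2) drawn from a Pareto tail with a
# true exponent of about 2.4, used only to exercise the estimator
rng = np.random.default_rng(7)
areas = 100.0 * rng.random(5000) ** (-1.0 / 1.4)
print(f"Estimated exponent: {powerlaw_exponent(areas, 100.0):.2f}")
```

Combining such a fitted distribution with an area-volume scaling is the kind of integration step needed to turn an inventory into a regional denudation rate of the sort quoted above.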

Journal ArticleDOI
TL;DR: This paper adopts scoring algorithms for the UK SF-36 and SF-12 summary scores to evaluate the picture of change gained in various treatment groups, and suggests that where two summary scores of health status are adequate then the SF-12 may be the instrument of choice.
Abstract: Background The SF-36 is a generic health status measure which has gained popularity as a measure of outcome in a wide variety of patient groups and social surveys. However, there is a need for even shorter measures, which reduce respondent burden. The developers of the SF-36 have consequently suggested that a 12-item sub-set of the items may accurately reproduce the two summary component scores which can be derived from the SF-36 [the Physical Component Summary Score (PCS) and Mental Health Component Summary Score (MCS)]. In this paper, we adopt scoring algorithms for the UK SF-36 and SF-12 summary scores to evaluate the picture of change gained in various treatment groups. Methods The SF-36 was administered in three treatment groups (ACE inhibitors for congestive heart failure, continuous positive airways therapy for sleep apnoea, and open vs laparoscopic surgery for inguinal hernia). Results PCS and MCS scores calculated from the SF-36 or a sub-set of 12 items (the 'SF-12') were virtually identical, and indicated the same magnitude of ill-health and degree of change over time. Conclusion The results suggest that where two summary scores of health status are adequate then the SF-12 may be the instrument of choice.

Journal ArticleDOI
TL;DR: In this article, the authors discuss the limits to the maximum precision achievable in the spectroscopy of n two-level atoms in the presence of decoherence by means of quantum entanglement.
Abstract: The optimal precision of frequency measurements in the presence of decoherence is discussed. We analyze different preparations of n two-level systems as well as different measurement procedures. We show that standard Ramsey spectroscopy on uncorrelated atoms and optimal measurements on maximally entangled states provide the same resolution. The best resolution is achieved using partially entangled preparations with a high degree of symmetry. The rapid development of laser cooling and trapping techniques has opened up new perspectives in high-precision spectroscopy. Frequency standards based on laser-cooled ions are expected to achieve accuracies of the order of 1 part in 10¹⁴–10¹⁸ [1]. In this Letter we discuss the limits to the maximum precision achievable in the spectroscopy of n two-level atoms in the presence of decoherence. This question is particularly timely in view of current efforts to improve high-precision spectroscopy by means of quantum entanglement. In the present context standard Ramsey spectroscopy refers to the situation schematically depicted in Fig. 1. An ion trap is loaded with n ions initially prepared in the same internal state |0⟩. A Ramsey pulse of frequency ω is applied to all ions. The pulse shape and duration are carefully chosen so that it drives the atomic transition |0⟩ → |1⟩ of natural frequency ω₀ and prepares an equally weighted superposition of the two internal states |0⟩ and |1⟩ for each ion. Next the system evolves freely for a time t followed by the second Ramsey pulse. Finally, the internal state of each particle is measured. Provided that the duration of the Ramsey pulses is much smaller than the free evolution time t, the probability that an ion is found in |1⟩ is given by
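
The equation truncated at the end of the abstract is not reproduced here; for orientation only, the standard decoherence-free Ramsey fringe and the shot-noise frequency uncertainty for n uncorrelated atoms take the familiar textbook forms below (t is the free-evolution time and T the total interrogation time).

```latex
% Standard textbook expressions, quoted for orientation only -- not necessarily
% the equation at which the abstract breaks off.
P_{|1\rangle}(t) = \tfrac{1}{2}\bigl[1 + \cos\bigl((\omega - \omega_0)\,t\bigr)\bigr],
\qquad
\Delta\omega \approx \frac{1}{\sqrt{n\,t\,T}}
```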