Journal ArticleDOI

The Contiguity Ratio and Statistical Mapping

01 Nov 1954 - The Incorporated Statistician, Vol. 5, Iss. 3, pp. 115-141
TL;DR: This paper considers the problem of determining whether statistics given for each "county" in a "country" are distributed at random or whether they form a pattern.
Abstract: The problem discussed in this paper is to determine whether statistics given for each "county" in a "country" are distributed at random or whether they form a pattern. The statistical instrument is the contiguity ratio c defined by formula (1.1) of the paper, which is an obvious generalization of the von Neumann (1941) ratio used in one-dimensional analysis, particularly time series. While the applications in the paper are confined to one- and two-dimensional problems, it is evident that the theory applies to any number of dimensions. If the figures for adjoining counties are generally closer than those for counties not adjoining, the ratio will clearly tend to be less than unity. The constants are such that when the statistics are distributed at random in the counties, the average value of the ratio is unity. The statistics will be regarded as contiguous if the actual ratio found is significantly less than unity, by reference to the standard error. The theory is discussed from the viewpoints of both randomization and classical normal theory. With the randomization approach, the observations themselves are the "universe" and no assumption need be made as to the character of the frequency distribution. In the "normal case," the assumption is that the observations may be regarded as a random sample from a normal universe. In this case it seems certain that the ratio tends very rapidly to normality as the number of counties increases. The exact values of the first four semi-invariants are given for the normal case. These functions depend only on the configuration, and the calculated values for Ireland, with only 26 counties, show that the distribution of the ratio is very close to normal. Accordingly, one can have confidence in deciding on significance from the standard error.
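The ratio itself is straightforward to compute from the county values and a contiguity structure. Below is a minimal sketch in Python of the standard form of Geary's c (the function name and the use of a 0/1 adjacency matrix are my assumptions, not details from the paper); under random placement its expectation is unity, and values significantly below unity indicate contiguity:

```python
import numpy as np

def geary_c(x, w):
    """Geary's contiguity ratio c for county values x and a symmetric
    0/1 contiguity matrix w (w[i, j] = 1 when counties i and j adjoin).
    Expectation is 1 under random placement; c significantly below 1
    means adjoining counties have closer figures than non-adjoining ones."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    n = x.size
    # Sum of squared differences over all ordered pairs of neighbours.
    num = (n - 1) * (w * (x[:, None] - x[None, :]) ** 2).sum()
    # w.sum() counts each adjacency in both directions, matching the
    # double sum in the numerator.
    den = 2.0 * w.sum() * ((x - x.mean()) ** 2).sum()
    return num / den
```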
Citations
Journal ArticleDOI
01 Sep 1993-Ecology
TL;DR: The paper first discusses how autocorrelation in ecological variables can be described and measured, then presents ways of explicitly introducing spatial structures into ecological models, proposing two approaches.
Abstract: Autocorrelation is a very general statistical property of ecological variables observed across geographic space; its most common forms are patches and gradients. Spatial autocorrelation, which comes either from the physical forcing of environmental variables or from community processes, presents a problem for statistical testing because autocorrelated data violate the assumption of independence of most standard statistical procedures. The paper discusses first how autocorrelation in ecological variables can be described and measured, with emphasis on mapping techniques. Then, proper statistical testing in the presence of autocorrelation is briefly discussed. Finally, ways are presented of explicitly introducing spatial structures into ecological models. Two approaches are proposed: in the raw-data approach, the spatial structure takes the form of a polynomial of the x and y geographic coordinates of the sampling stations; in the matrix approach, the spatial structure is introduced in the form of a geographic distance matrix among locations. These two approaches are compared in the concluding section. A table provides a list of computer programs available for spatial analysis.
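As an illustration of the raw-data approach, the polynomial of the x and y coordinates can be assembled directly as regression terms (trend surface). A minimal sketch, with the function name and the cubic default being my assumptions:

```python
import numpy as np

def trend_surface_terms(xy, degree=3):
    """Monomials of the x, y sampling coordinates up to total degree
    `degree` -- the 'raw-data' way of entering spatial structure
    into a regression model."""
    xy = np.asarray(xy, dtype=float)
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([x**i * y**j
                            for i in range(degree + 1)
                            for j in range(degree + 1 - i)])

# Least-squares fit of a response z on the spatial polynomial:
# coef, *_ = np.linalg.lstsq(trend_surface_terms(xy), z, rcond=None)
```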

3,491 citations

Journal ArticleDOI
TL;DR: In this article, residential segregation is viewed as a multidimensional phenomenon varying along five distinct axes of measurement: evenness, exposure, concentration, centralization, and clustering; 20 indices of segregation are surveyed and related conceptually to one of the five dimensions.
Abstract: This paper conceives of residential segregation as a multidimensional phenomenon varying along five distinct axes of measurement: evenness, exposure, concentration, centralization, and clustering. Twenty indices of segregation are surveyed and related conceptually to one of the five dimensions. Using data from a large set of US metropolitan areas, the indices are intercorrelated and factor analyzed. Orthogonal and oblique rotations produce pattern matrices consistent with the postulated dimensional structure. Based on the factor analyses and other information, one index was chosen to represent each of the five dimensions, and these selections were confirmed with a principal components analysis. The paper recommends adopting these indices as standard indicators in future studies of segregation.
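The abstract does not reproduce the index formulas, but the index of dissimilarity D, the classic measure on the evenness dimension, gives the flavor of what is being intercorrelated. A minimal sketch (the function name is mine):

```python
def dissimilarity_index(minority, majority):
    """Index of dissimilarity D = 0.5 * sum_i |m_i/M - n_i/N| over
    areal units i: the share of either group that would have to move
    to equalize the two distributions (0 = even, 1 = fully segregated)."""
    M, N = sum(minority), sum(majority)
    return 0.5 * sum(abs(m / M - n / N)
                     for m, n in zip(minority, majority))
```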

2,833 citations


Cites background or methods from "The Contiguity Ratio and Statistica..."


  • ...Geary (1954) illustrates the calculation of the contiguity index with sample data from Britain, and Duncan, Cuzzort, and Duncan (1961) provide a more extensive empirical evaluation using U.S. data....


  • ...Geographers have long been concerned with the "checkerboard problem" under the rubric of the "contiguity problem" and have devised a variety of indicators to measure it (Dacey 1968; Geary 1954)....



Journal ArticleDOI
TL;DR: The spatial heterogeneity of populations and communities plays a central role in many ecological theories, such as succession, adaptation, maintenance of species diversity, community stability, competition, predator-prey interactions, parasitism, epidemics and other natural catastrophes, and ergoclines.
Abstract: The spatial heterogeneity of populations and communities plays a central role in many ecological theories, for instance the theories of succession, adaptation, maintenance of species diversity, community stability, competition, predator-prey interactions, parasitism, epidemics and other natural catastrophes, ergoclines, and so on. This paper reviews how the spatial structure of biological populations and communities can be studied. We first demonstrate that many of the basic statistical methods used in ecological studies are impaired by autocorrelated data. Most if not all environmental data fall in this category. We then look briefly at ways of performing valid statistical tests in the presence of spatial autocorrelation. Methods now available for analysing the spatial structure of biological populations are described and illustrated by vegetation data. These include various methods to test for the presence of spatial autocorrelation in the data: univariate methods (all-directional and two-dimensional spatial correlograms, and two-dimensional spectral analysis), and the multivariate Mantel test and Mantel correlogram; other descriptive methods of spatial structure: the univariate variogram, and the multivariate methods of clustering with spatial contiguity constraint; the partial Mantel test, presented here as a way of studying causal models that include space as an explanatory variable; and finally, various methods for mapping ecological variables and producing either univariate maps (interpolation, trend surface analysis, kriging) or maps of truly multivariate data (produced by constrained clustering). A table classifies the methods in terms of the ecological questions they help to resolve. Reference is made to available computer programs.
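Among the multivariate tools listed, the Mantel test is the workhorse: it correlates two distance matrices and assesses significance by permuting one of them. A minimal permutation sketch (the function name, the one-tailed convention, and the use of Pearson correlation on upper-triangle entries are my assumptions, not details from the paper):

```python
import numpy as np

def mantel_test(d1, d2, n_perm=999, seed=None):
    """Permutation Mantel test: Pearson correlation between the
    off-diagonal entries of two symmetric distance matrices, with a
    one-tailed p-value from random relabelings of one matrix."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d1, k=1)
    r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
    n, count = d1.shape[0], 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        r = np.corrcoef(d1[p][:, p][iu], d2[iu])[0, 1]
        count += r >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)
```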

2,166 citations


Cites methods from "The Contiguity Ratio and Statistica..."

  • ...In the case of quantitative variables, spatial autocorrelation can be measured by either Moran's I (1950) or Geary's c (1954) spatial autocorrelation coefficients....

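For reference, the two coefficients named in this snippet differ in form: Moran's I is a cross-product statistic (expectation -1/(n-1) under randomness, large positive values indicating similarity among neighbours), while Geary's c is the squared-difference ratio sketched earlier. A minimal sketch of Moran's I (helper name assumed):

```python
import numpy as np

def morans_i(x, w):
    """Moran's I spatial autocorrelation coefficient for values x and a
    symmetric spatial weights matrix w: a weighted cross-product of
    deviations from the mean, normalized by the total weight."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    z = x - x.mean()
    n = x.size
    return n / w.sum() * (w * np.outer(z, z)).sum() / (z ** 2).sum()
```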

Journal ArticleDOI
TL;DR: The principal coordinates of neighbour matrices (PCNM) approach, as discussed by the authors, creates spatial predictors that can be easily incorporated into regression or canonical analysis models, providing a flexible tool, especially when contrasted with the family of autoregressive models and the trend surface analysis commonly used in ecological and geographical analysis.
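In outline, PCNM builds its spatial predictors by truncating the among-site distance matrix and taking principal coordinates of the result, keeping the axes with positive eigenvalues as regression predictors. A rough sketch under those assumptions (the replace-beyond-threshold-with-4×threshold rule follows the published PCNM recipe; the names are mine):

```python
import numpy as np

def pcnm_vectors(d, threshold):
    """Sketch of PCNM spatial predictors: truncate the distance matrix,
    then take principal coordinates (classical MDS) and keep the axes
    with positive eigenvalues."""
    d = np.asarray(d, dtype=float)
    d = np.where(d > threshold, 4.0 * threshold, d)  # truncation rule
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n              # centering matrix
    g = -0.5 * j @ (d ** 2) @ j                      # Gower-centered matrix
    vals, vecs = np.linalg.eigh(g)
    keep = vals > 1e-10                              # positive eigenvalues only
    return vecs[:, keep] * np.sqrt(vals[keep])
```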

1,620 citations

Journal ArticleDOI
TL;DR: A set of 169 radiomic features was standardized, enabling verification and calibration of different radiomics software; most of the standardized features could be excellently reproduced.
Abstract: Background Radiomic features may quantify characteristics present in medical imaging. However, the lack of standardized definitions and validated reference values have hampered clinical use. Purpose To standardize a set of 174 radiomic features. Materials and Methods Radiomic features were assessed in three phases. In phase I, 487 features were derived from the basic set of 174 features. Twenty-five research teams with unique radiomics software implementations computed feature values directly from a digital phantom, without any additional image processing. In phase II, 15 teams computed values for 1347 derived features using a CT image of a patient with lung cancer and predefined image processing configurations. In both phases, consensus among the teams on the validity of tentative reference values was measured through the frequency of the modal value and classified as follows: less than three matches, weak; three to five matches, moderate; six to nine matches, strong; 10 or more matches, very strong. In the final phase (phase III), a public data set of multimodality images (CT, fluorine 18 fluorodeoxyglucose PET, and T1-weighted MRI) from 51 patients with soft-tissue sarcoma was used to prospectively assess reproducibility of standardized features. Results Consensus on reference values was initially weak for 232 of 302 features (76.8%) at phase I and 703 of 1075 features (65.4%) at phase II. At the final iteration, weak consensus remained for only two of 487 features (0.4%) at phase I and 19 of 1347 features (1.4%) at phase II. Strong or better consensus was achieved for 463 of 487 features (95.1%) at phase I and 1220 of 1347 features (90.6%) at phase II. Overall, 169 of 174 features were standardized in the first two phases. In the final validation phase (phase III), most of the 169 standardized features could be excellently reproduced (166 with CT; 164 with PET; and 164 with MRI). Conclusion A set of 169 radiomics features was standardized, which enabled verification and calibration of different radiomics software. © RSNA, 2020 Online supplemental material is available for this article. See also the editorial by Kuhl and Truhn in this issue.
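The consensus rule described in Materials and Methods is easy to restate in code. A small sketch of the modal-frequency classification (thresholds taken verbatim from the abstract; the function name is mine):

```python
from collections import Counter

def consensus_strength(team_values):
    """Consensus category from the frequency of the modal value among
    teams: <3 weak; 3-5 moderate; 6-9 strong; >=10 very strong."""
    matches = Counter(team_values).most_common(1)[0][1]
    if matches < 3:
        return "weak"
    if matches <= 5:
        return "moderate"
    if matches <= 9:
        return "strong"
    return "very strong"
```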

1,563 citations

References
Journal ArticleDOI

77 citations


"The Contiguity Ratio and Statistica..." refers background or methods in this paper

  • ...C. R. Rao (1952) has suggested a method of orthogonalizing the original variables $x_{it}$ which consists in taking new variables $x'_{it}$ as follows: $x'_{1t} = x_{1t}$, $x'_{2t} = x_{2t} - a_{21}x_{1t}$, $x'_{3t} = x_{3t} - a_{32}x_{2t} - a_{31}x_{1t}$, ... (see the sketch after this list)....


  • [Garbled table fragment: the quoted context is the heading of the paper's table of Irish county statistics, including percentage of agricultural holdings in valuation groups (£2-£10, £10-£50, above £50), crops and pasture per 1,000 acres (1952), milch cows, other cattle, pigs and sheep, population, retail sales per person, and private cars and radio licences (1952).]
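The Rao (1952) scheme quoted in the first snippet above is successive residualization of the Gram-Schmidt type: each new variable is the original minus its projections on the variables already formed, with the $a$ coefficients as the projection weights. A minimal sketch (not from the paper; column-by-column and unnormalized):

```python
import numpy as np

def successive_orthogonalize(X):
    """Gram-Schmidt-style residualization: column k of the result is
    X[:, k] minus its projections onto the previously formed columns,
    so the output columns are mutually orthogonal."""
    X = np.array(X, dtype=float)
    Q = np.empty_like(X)
    for k in range(X.shape[1]):
        v = X[:, k].copy()
        for j in range(k):
            q = Q[:, j]
            v -= (v @ q) / (q @ q) * q   # subtract projection on column j
        Q[:, k] = v
    return Q
```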