
Showing papers on "Naturalness" published in 2013


Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed enhancement algorithm can not only enhance the details but also preserve the naturalness for non-uniform illumination images.
Abstract: Image enhancement plays an important role in image processing and analysis. Among various enhancement algorithms, Retinex-based algorithms can efficiently enhance details and have been widely adopted. Since Retinex-based algorithms regard illumination removal as a default preference and fail to limit the range of reflectance, the naturalness of non-uniform illumination images cannot be effectively preserved. However, naturalness is essential for image enhancement to achieve pleasing perceptual quality. In order to preserve naturalness while enhancing details, we propose an enhancement algorithm for non-uniform illumination images. In general, this paper makes the following three major contributions. First, a lightness-order-error measure is proposed to assess naturalness preservation objectively. Second, a bright-pass filter is proposed to decompose an image into reflectance and illumination, which, respectively, determine the details and the naturalness of the image. Third, we propose a bi-log transformation, which is utilized to map the illumination to make a balance between details and naturalness. Experimental results demonstrate that the proposed algorithm can not only enhance the details but also preserve the naturalness for non-uniform illumination images.
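
The pipeline the abstract outlines (decompose the image into illumination and reflectance, compress the illumination, recombine) can be sketched as follows. This is a minimal stand-in rather than the authors' algorithm: the Gaussian-blur illumination estimate substitutes for their bright-pass filter, and the simple log compression substitutes for their bi-log transformation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance(image, sigma=25.0, eps=1e-6):
    """Retinex-style enhancement sketch for a single channel scaled to [0, 1]."""
    # Illumination estimate: a plain Gaussian blur stands in for the paper's
    # bright-pass filter; taking the max with the input keeps reflectance <= 1.
    illumination = np.maximum(gaussian_filter(image, sigma), image)

    # Reflectance carries the details and is bounded to [0, 1] by construction.
    reflectance = image / (illumination + eps)

    # Compress the illumination's dynamic range (placeholder for the bi-log
    # transformation), then rescale it back into [0, 1].
    mapped = np.log1p(255.0 * illumination)
    mapped = mapped / (mapped.max() + eps)

    # Recombine: details come from the reflectance, the overall lightness order
    # from the gently compressed illumination, which is what preserves naturalness.
    return np.clip(reflectance * mapped, 0.0, 1.0)
```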

918 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a systematic effective lagrangian description of the phenomenology of the lightest top-partners in composite Higgs models, based on symmetry, on selection rules and on plausible dynamical assumptions.
Abstract: We provide a systematic effective lagrangian description of the phenomenology of the lightest top-partners in composite Higgs models. Our construction is based on symmetry, on selection rules and on plausible dynamical assumptions. The structure of the resulting simplified models depends on the quantum numbers of the lightest top partner and of the operators involved in the generation of the top Yukawa. In all cases the phenomenology is conveniently described by a small number of parameters, and the results of experimental searches are readily interpreted as a test of naturalness. We recast presently available experimental bounds on heavy fermions into bounds on top partners: LHC has already stepped well inside the natural region of parameter space.

217 citations


Book ChapterDOI
TL;DR: A brief introduction to naturalness problems in cosmology, and to the cosmological constant problem in particular, can be found in this article, where several notions of naturalness are defined, including the closely related ideas of technical naturalness and 't Hooft naturalness.
Abstract: These notes present a brief introduction to `naturalness' problems in cosmology, and to the Cosmological Constant Problem in particular. The main focus is the `old' cosmological constant problem, though the more recent variants are also briefly discussed. Several notions of naturalness are defined, including the closely related ideas of technical naturalness and `t Hooft naturalness, and it is shown why these naturally arise when cosmology is embedded within a framework --- effective field theories --- that efficiently captures what is consistent with what is known about the physics of smaller distances. Some care is taken to clarify conceptual issues, such as the relevance or not of quadratic divergences, about which some confusion has arisen over the years. A set of minimal criteria are formulated against which proposed solutions to the problem can be judged, and a brief overview made of the general limitations of most of the approaches. A somewhat more in-depth discussion is provided of what I view as the most promising approach. These notes are aimed at graduate students with a basic working knowledge of quantum field theory and cosmology, but with no detailed knowledge of particle physics.

206 citations


Journal ArticleDOI
TL;DR: In this article, the authors study the renormalization of some dimension-4, 7 and 10 operators in a class of nonlinear scalar-tensor theories, which are invariant under linear diffeomorphisms which represent an exact symmetry of the full nonlinear action, and global field-space Galilean transformations of the scalar field.
Abstract: We study the renormalization of some dimension-4, 7 and 10 operators in a class of nonlinear scalar-tensor theories. These theories are invariant under (a) linear diffeomorphisms which represent an exact symmetry of the full nonlinear action, and (b) global field-space Galilean transformations of the scalar field. The Lagrangian contains a set of nontopological interaction terms of the above-mentioned dimensionality, which we show are not renormalized at any order in perturbation theory. We also discuss the renormalization of other operators, that may be generated by loops and/or receive loop corrections, and identify the regime in which they are subleading with respect to the operators that do not get renormalized. Interestingly, such scalar-tensor theories emerge in a certain high-energy limit of the ghost-free theory of massive gravity. One can use the nonrenormalization properties of the high-energy limit to estimate the magnitude of quantum corrections in the full theory. We show that the quantum corrections to the three free parameters of the model, one of them being the graviton mass, are strongly suppressed. In particular, we show that having an arbitrarily small graviton mass is technically natural.

158 citations


Journal ArticleDOI
TL;DR: In this article, a model of a confining dark sector, dark technicolor, that communicates with the Standard Model through the Higgs portal is proposed, and the electroweak scale is generated dynamically.
Abstract: We propose a model of a confining dark sector, dark technicolor, that communicates with the Standard Model through the Higgs portal. In this model electroweak symmetry breaking and dark matter share a common origin, and the electroweak scale is generated dynamically. Our motivation to suggest this model is the absence of evidence for new physics from recent LHC data. Although the conclusion is far from certain at this point, this lack of evidence may suggest that no mechanism exists at the electroweak scale to stabilise the Higgs mass against radiative corrections from UV physics. The usual reaction to this puzzling situation is to conclude that the stabilising new physics is either hidden from us by accident, or that it appears at energies that are currently inaccessible, such that nature is indeed fine-tuned. In order to re-examine the arguments that have led to this dichotomy, we review the concept of naturalness in effective field theories, discussing in particular the role of quadratic divergences in relation to different energy scales. This leads us to suggest classical scale invariance as a guideline for model building, implying that explicit mass scales are absent in the underlying theory.

147 citations


Journal ArticleDOI
TL;DR: In this article, the authors construct a family of toy UV complete quantum theories providing a proof of concept for the second possibility, where low energy physics is described by a tuned effective field theory, which exhibits relevant interactions not protected by any symmetries and separated by an arbitrarily large mass gap from the new gravitational physics, represented by a set of irrelevant operators.
Abstract: The cosmological constant problem and the absence of new natural physics at the electroweak scale, if confirmed by the LHC, may either indicate that nature is fine-tuned or that a refined notion of naturalness is required. We construct a family of toy UV complete quantum theories providing a proof of concept for the second possibility. Low energy physics is described by a tuned effective field theory, which exhibits relevant interactions not protected by any symmetries and separated by an arbitrarily large mass gap from the new “gravitational” physics, represented by a set of irrelevant operators. Nevertheless, the only available language to describe dynamics at all energy scales does not require any fine-tuning. The interesting novel feature of this construction is that UV physics is not described by a fixed point, but rather exhibits asymptotic fragility. Observation of additional unprotected scalars at the LHC would be a smoking gun for this scenario. Natural tuning also favors TeV scale unification.

139 citations


Journal ArticleDOI
Kenton O'Hara1, Richard Harper1, Helena M. Mentis1, Abigail Sellen1, Alex S. Taylor1 
TL;DR: Norman's critique is indicative of the issue that while using the word natural might have become natural, it is coming at a cost and there is a need to understand the key assumptions implicit within it and how these frame approaches to design and engineering in particular ways.
Abstract: Norman's critique is indicative of the issue that while using the word natural might have become natural, it is coming at a cost. In other words, precisely because the notion of naturalness has become so commonplace in the scientific lexicon of HCI, it is becoming increasingly important that there is a critical examination of the conceptual work being performed when it is used. There is a need to understand the key assumptions implicit within it and how these frame approaches to design and engineering in particular ways. A second significant element of this perspective comes from Wittgenstein, and his claim that, through action, people create shared meanings with others, and these shared meanings are the essential common ground that enables individual perception to be cohered into socially organized, understood, and coordinated experiences.

125 citations


Journal ArticleDOI
TL;DR: In this article, the authors modify the usual criterion for naturalness by ignoring the uncomputable power divergences, and show that the Standard Model satisfies the modified criterion ('finite naturalness') for the measured values of its parameters.
Abstract: Motivated by LHC results, we modify the usual criterion for naturalness by ignoring the uncomputable power divergences. The Standard Model satisfies the modified criterion ('finite naturalness') for the measured values of its parameters. Extensions of the SM motivated by observations (Dark Matter, neutrino masses, the strong CP problem, vacuum instability, inflation) satisfy finite naturalness in special ranges of their parameter spaces which often imply new particles below a few TeV. Finite naturalness bounds are weaker than usual naturalness bounds because any new particle with SM gauge interactions gives a finite contribution to the Higgs mass at two loop order.
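
The last sentence can be restated parametrically: for a particle of mass M coupled to the Higgs only through SM gauge interactions, the leading correction to the Higgs mass arises at two loops and is finite. The schematic form below (numerical coefficients, group factors and logarithms omitted) merely restates that scaling and is not the paper's computed expression; Δ denotes the amount of fine-tuning one is willing to tolerate.

```latex
% Schematic two-loop, gauge-mediated contribution (coefficients and logs omitted):
\delta m_h^2 \;\sim\; \left(\frac{g^2}{16\pi^2}\right)^{\!2} M^2 ,
\qquad
\text{finite naturalness:}\quad \delta m_h^2 \,\lesssim\, \Delta\, m_h^2
\;\;\Rightarrow\;\; \text{an upper bound on } M .
```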

124 citations


Journal ArticleDOI
TL;DR: Any new scalar fields that perturbatively solve the hierarchy problem by stabilizing the Higgs boson mass also generate new contributions to the Higgs boson field-strength renormalization, irrespective of their gauge representation.
Abstract: Any new scalar fields that perturbatively solve the hierarchy problem by stabilizing the Higgs boson mass also generate new contributions to the Higgs boson field-strength renormalization, irrespective of their gauge representation. These new contributions are physical, and in explicit models their magnitude can be inferred from the requirement of quadratic divergence cancellation; hence, they are directly related to the resolution of the hierarchy problem. Upon canonically normalizing the Higgs field, these new contributions lead to modifications of Higgs couplings that are typically great enough that the hierarchy problem and the concept of electroweak naturalness can be probed thoroughly within a precision Higgs boson program. Specifically, at a lepton collider this can be achieved through precision measurements of the Higgs boson associated production cross section. This would lead to indirect constraints on perturbative solutions to the hierarchy problem in the broadest sense, even if the relevant new fields are gauge singlets.
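
The mechanism invoked here is standard effective-field-theory reasoning and can be written schematically (a generic sketch, not the paper's exact expressions): a loop of the new scalars shifts the Higgs kinetic term by an amount δZ_h, and canonical normalization then rescales every Higgs coupling by the same factor, which is what a precision measurement of, for example, the associated production cross section probes.

```latex
% Generic sketch: kinetic-term shift and the resulting universal coupling rescaling.
\mathcal{L} \;\supset\; \tfrac{1}{2}\,(1+\delta Z_h)\,(\partial_\mu h)^2
\quad\xrightarrow{\;h\,\to\,h/\sqrt{1+\delta Z_h}\;}\quad
g_{hXX} \;\to\; \frac{g_{hXX}}{\sqrt{1+\delta Z_h}} \;\simeq\; g_{hXX}\left(1-\tfrac{1}{2}\,\delta Z_h\right).
```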

118 citations


Journal ArticleDOI
TL;DR: It is shown that, when human observers categorize global information in real-world scenes, the brain exhibits strong sensitivity to low-level summary statistics, and that global scene information may be computed by spatial pooling of responses from early visual areas (e.g., LGN or V1).
Abstract: The visual system processes natural scenes in a split second. Part of this process is the extraction of "gist," a global first impression. It is unclear, however, how the human visual system computes this information. Here, we show that, when human observers categorize global information in real-world scenes, the brain exhibits strong sensitivity to low-level summary statistics. Subjects rated a specific instance of a global scene property, naturalness, for a large set of natural scenes while EEG was recorded. For each individual scene, we derived two physiologically plausible summary statistics by spatially pooling local contrast filter outputs: contrast energy (CE), indexing contrast strength, and spatial coherence (SC), indexing scene fragmentation. We show that behavioral performance is directly related to these statistics, with naturalness rating being influenced in particular by SC. At the neural level, both statistics parametrically modulated single-trial event-related potential amplitudes during an early, transient window (100-150 ms), but SC continued to influence activity levels later in time (up to 250 ms). In addition, the magnitude of neural activity that discriminated between man-made versus natural ratings of individual trials was related to SC, but not CE. These results suggest that global scene information may be computed by spatial pooling of responses from early visual areas (e.g., LGN or V1). The increased sensitivity over time to SC in particular, which reflects scene fragmentation, suggests that this statistic is actively exploited to estimate scene naturalness.
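
The phrase "spatial pooling of local contrast filter outputs" can be illustrated with a toy computation. The two statistics below are crude stand-ins, not the estimators used in the paper: CE is taken as the mean pooled gradient magnitude, and SC as a simple ratio of pooled strength to spread (higher for more coherent, less fragmented scenes).

```python
import numpy as np
from scipy.ndimage import sobel

def toy_summary_statistics(image):
    """Toy contrast energy (CE) and spatial coherence (SC) for a 2D grayscale scene."""
    # Local contrast filter outputs: gradient magnitude at each pixel.
    gx, gy = sobel(image, axis=1), sobel(image, axis=0)
    contrast = np.hypot(gx, gy)

    # Spatial pooling over the whole image.
    ce = contrast.mean()               # overall contrast strength
    sc = ce / (contrast.std() + 1e-9)  # crude (inverse) fragmentation proxy
    return ce, sc
```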

113 citations


Journal ArticleDOI
TL;DR: Whether the patterns of phonotactic well-formedness internalized by language learners are direct reflections of the phonological patterns they encounter, or reflect in addition principles of phonological naturalness, is investigated; the authors conclude in favor of a learning bias account.
Abstract: We investigate whether the patterns of phonotactic well-formedness internalized by language learners are direct reflections of the phonological patterns they encounter, or reflect in addition principles of phonological naturalness. We employed the phonotactic learning system of Hayes and Wilson (2008) to search the English lexicon for phonotactic generalizations and found that it learned many constraints that are evidently unnatural, having no typological or phonetic basis. We tested 10 such constraints by obtaining native-speaker ratings of 40 nonce words: 10 violated our unnatural constraints, 10 violated natural constraints assigned comparable weights by the learner, and 20 were control forms. Violations of the natural constraints had a powerful effect on ratings, violations of the unnatural constraints at best a weak one. We assess various hypotheses intended to explain this disparity, and conclude in favor of a learning bias account.
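
For readers unfamiliar with the Hayes and Wilson (2008) learner: it assigns each constraint a weight and scores a form by the weighted sum of its violations (its "harmony"), with well-formedness falling off exponentially in that penalty. The sketch below illustrates only this scoring step; the two constraints and their weights are invented placeholders, not the natural or unnatural constraints the study extracted from the English lexicon.

```python
import math
import re

# Placeholder constraints over toy orthographic forms; each returns a violation count.
# These are NOT the constraints learned in the paper.
WEIGHTED_CONSTRAINTS = [
    (lambda w: len(re.findall(r"^[ptkbdg][ptkbdg]", w)), 4.2),  # *word-initial stop-stop cluster
    (lambda w: len(re.findall(r"[aeiou]{3}", w)), 2.5),         # *three vowels in a row
]

def harmony(word):
    """Penalty = weighted sum of constraint violations (lower = better formed)."""
    return sum(weight * count(word) for count, weight in WEIGHTED_CONSTRAINTS)

def maxent_score(word):
    """Unnormalized maxent well-formedness, exp(-harmony)."""
    return math.exp(-harmony(word))

print(maxent_score("blick"), maxent_score("bdick"))  # the cluster violation lowers the second score
```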

Journal ArticleDOI
TL;DR: In this article, it was shown that the measured Higgs mass, couplings, and percent-level naturalness of the weak scale are compatible with stops at 3.5 TeV and higgsinos at 1 TeV.
Abstract: We show that naturalness of the weak scale can be comfortably reconciled with both LHC null results and observed Higgs properties provided the double protection of supersymmetry and the twin Higgs mechanism. This double protection radically alters conventional signs of naturalness at the LHC while respecting gauge coupling unification and precision electroweak limits. We find the measured Higgs mass, couplings, and percent-level naturalness of the weak scale are compatible with stops at ~3.5 TeV and higgsinos at ~1 TeV. The primary signs of naturalness in this scenario include modifications of Higgs couplings, a modest invisible Higgs width, resonant Higgs pair production, and an invisibly-decaying heavy Higgs.

Journal ArticleDOI
TL;DR: In this paper, rescaled direct and indirect higgsino-like WIMP detection rates were calculated in SUSY models that fulfill the electroweak naturalness condition. The rescaled results imply that these experiments should either discover WIMPs or exclude the concept of electroweak naturalness in R-parity-conserving natural supersymmetric models.

Journal ArticleDOI
01 Jan 2013
TL;DR: In this article, a set of pictures of the different urban green space typologies was shown to fifty undergraduate students of the University of Bari, and then measures of perceived restorativeness were taken.
Abstract: Green spaces have positive effects on human well-being and quality of life in cities. So far, studies in this field mainly compared preferences for, and outcomes of contact with, natural vs. built environments. Less attention has been given to the study of the psychological effects of contact with green spaces differing in their degree of naturalness. This paper thus aims at understanding the relation between ecological (e.g., level of naturalness) and psychological factors (e.g., perceived restorativeness) in shaping evaluations of different urban and peri-urban green spaces. Five typologies of green space have been identified in the city of Bari (southern Italy), ranging from minimum (i.e., high level of man-made elements) to maximum levels of naturalness (i.e., low level of man-made elements). A set of pictures of the different urban green space typologies was shown to fifty undergraduate students of the University of Bari, and then measures of perceived restorativeness were taken. Results show...

Posted Content
Gian F. Giudice1
TL;DR: In this paper, the status of naturalness of the weak scale is reviewed after the results from the LHC operating at an energy of 8 TeV; the talk was delivered at the 2013 Europhysics Conference on High Energy Physics (EPS), Stockholm, Sweden, 18-24 July 2013.
Abstract: I review the status of naturalness of the weak scale after the results from the LHC operating at an energy of 8 TeV. Talk delivered at the 2013 Europhysics Conference on High Energy Physics (EPS), Stockholm, Sweden, 18-24 July 2013.

Journal ArticleDOI
TL;DR: In this paper, the naturalness of the Minimal Supersymmetric Standard Model (MSSM) in the light of recent LHC results from the ATLAS and CMS experiments is analyzed.
Abstract: We analyse the naturalness of the Minimal Supersymmetric Standard Model (MSSM) in the light of recent LHC results from the ATLAS and CMS experiments. We study non-universal boundary conditions for the scalar and the gaugino sector, with fixed relations between some of the soft breaking parameters, and find a significant reduction of fine-tuning for non-universal gaugino masses. For a Higgs mass of about 125 GeV, as observed recently, we find parameter regions with a fine-tuning of O(10), taking into account experimental and theoretical uncertainties. These regions also survive after comparison with simplified model searches in ATLAS and CMS. For a fine-tuning less than 20 the lightest neutralino is expected to be lighter than about 400 GeV and the lighter stop can be as heavy as 3.5 TeV. On the other hand, the gluino mass is required to be above 1.5 TeV. For non-universal gaugino masses, we discuss which fixed GUT scale ratios can lead to a reduced fine-tuning and find that the recent Higgs results have a strong impact on which ratio is favoured. We also discuss the naturalness of GUT scale Yukawa relations, comparing the non-universal MSSM with the CMSSM.
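
The abstract quotes fine-tuning values without stating the measure; a common choice in MSSM analyses of this kind is a Barbieri-Giudice-style sensitivity of the Z mass to the high-scale input parameters, shown below. Whether this exact definition is the one adopted in the paper is an assumption on our part; a value Δ of about 10 then corresponds to roughly 10% tuning.

```latex
% Barbieri-Giudice-style fine-tuning measure (assumed; the paper may use a variant):
\Delta \;=\; \max_{p_i} \left| \frac{\partial \ln M_Z^2}{\partial \ln p_i} \right| ,
\qquad p_i \in \{ m_0,\; M_{1/2},\; A_0,\; \mu,\; \dots \} .
```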

Journal ArticleDOI
TL;DR: In this article, the authors considered the possibility that the Standard Model, and its minimal extension with the addition of singlets, merges with a high-scale supersymmetric theory at a scale satisfying the Veltman condition and therefore with no sensitivity to the cutoff.
Abstract: In this paper we have considered the possibility that the Standard Model, and its minimal extension with the addition of singlets, merges with a high-scale supersymmetric theory at a scale satisfying the Veltman condition and therefore with no sensitivity to the cutoff. The matching of the Standard Model is achieved at Planckian scales. In its complex singlet extension the matching scale depends on the strength of the coupling between the singlet and Higgs fields. For order one values of the coupling, still in the perturbative region, the matching scale can be located in the TeV ballpark. Even in the absence of quadratic divergences there remains a finite adjustment of the parameters in the high-energy theory which should guarantee that the Higgs and the singlets in the low-energy theory are kept light. This fine-tuning (unrelated to quadratic divergences) is the entire responsibility of the ultraviolet theory and remains as the missing ingredient to provide a full solution to the hierarchy problem.

Journal ArticleDOI
TL;DR: In this article, the effects of naturalness and daylight characteristics on preference are studied simultaneously, using both explicit and implicit measures of preference: direct ratings of the scenes and an affective priming task, respectively.

Journal ArticleDOI
Helena Siipi1
TL;DR: It is concluded that some very common current uses of the term “natural,” such as naturalness as lack of human influence, are not conceptually connected to the healthiness of food.
Abstract: Is food’s naturalness conceptually connected to its healthiness? Answering the question requires spelling out the following: (1) What is meant by the healthiness of food? (2) What different conceptual meanings the term natural has in the context of food? (3) Are some of those meanings connected to the healthiness of food? In this paper the healthiness of food is understood narrowly as food’s accordance with nutritional needs of its eater. The connection of healthiness to the following five food-related senses of the term “natural’’ is analyzed: naturalness as nutritive suitability, naturalness as moderate need satisfaction, naturalness as lack of human influence, naturalness as authenticity, and naturalness as familiarity. It is concluded that some very common current uses of the term “natural,” such as naturalness as lack of human influence, are not conceptually connected to the healthiness of food. Nevertheless, the first two senses of naturalness are strongly conceptually connected to healthiness in the food context and the last one may be indirectly related to it. Thus, desire for natural food is not necessarily mistaken and misguided.

Journal ArticleDOI
TL;DR: In this paper, lighting booth experiments were conducted to understand people's judgement of the naturalness of object colours and preference for the lit environment, and seven different LED spectral power distribut...
Abstract: To understand people's judgement of the naturalness of object colours and preference for the lit environment, lighting booth experiments were conducted. Seven different LED spectral power distribut...

Journal ArticleDOI
Hikaru Kawai1
TL;DR: In this article, the authors consider the multilocal low energy effective action and the wave function of the entire multiverse, and show that the coupling constants in low energy physics are determined by the dynamics of the multiverse, being fixed in such a way that the total entropy of the universe at the late stage is maximized.
Abstract: In quantum gravity or string theory, it is natural to take the topology change of the space into account. We consider the low energy effective action for such a case and show that it does not have the simple form of a local action but has a multilocal form. Actually, in quantum gravity or matrix models, there are mechanisms by which the low energy effective action becomes S_eff = Σ_i c_i S_i + Σ_ij c_ij S_i S_j + Σ_ijk c_ijk S_i S_j S_k + ⋯, where each S_i is a local action. We further discuss that the topology change of the space naturally leads to the multiverse, in which an indefinite number of macroscopic universes exist in parallel. In this case, the space–time coordinates x in the multilocal action may sit either in the same universe or in different ones. We then consider the wave function of the entire multiverse, and see how locality and causality are recovered in such a theory. We further discuss the possibility of solving the naturalness problem. In doing so, we need to introduce some assumptions to interpret the multiverse wave function. We consider two different possibilities. One is to simply assume the probabilistic interpretation for the multiverse wave function. The other is to assume infrared cutoff independence of the partition function of the universe. In both cases, we find that the big fix occurs, in which all the coupling constants in the low energy physics are determined by the dynamics of the multiverse. Actually, we find that they are fixed in such a way that the total entropy of the universe at the late stage (in the far future) is maximized. Although the argument here is similar to Coleman's original one given in the late 1980s, our results are based on Lorentzian signature theory and the dynamical mechanism is rather different.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that vector-like quarks are usually expected to mix predominantly with the third generation, and discuss the expected size of this mixing and its naturalness.
Abstract: We argue why vector-like quarks are usually expected to mix predominantly with the third generation, and discuss the expected size of this mixing and its naturalness.

Journal ArticleDOI
TL;DR: In this paper, a calculation scheme of radiative corrections utilizing a hidden duality is proposed, in the expectation that the unnaturalness of scalar masses might be an artifact of the effective theory that could be improved if features of an ultimate theory are taken into account.
Abstract: We reconsider naturalness from the viewpoint of effective field theories, motivated by the alternative scenario that the standard model holds until a high-energy scale such as the Planck scale. We propose a calculation scheme of radiative corrections utilizing a hidden duality, in the expectation that the unnaturalness of scalar masses might be an artifact of the effective theory that could be improved if features of an ultimate theory are taken into account.

Journal ArticleDOI
TL;DR: For decades, the unnaturalness of the weak scale has been the dominant problem motivating new particle physics, and weak-scale supersymmetry has become the dominant solution as discussed by the authors. But this paradigm is now being challenged by a wealth of experimental data.
Abstract: For decades, the unnaturalness of the weak scale has been the dominant problem motivating new particle physics, and weak-scale supersymmetry has been the dominant proposed solution. This paradigm is now being challenged by a wealth of experimental data. In this review, we begin by recalling the theoretical motivations for weak-scale supersymmetry, including the gauge hierarchy problem, grand unification, and WIMP dark matter, and their implications for superpartner masses. These are set against the leading constraints on supersymmetry from collider searches, the Higgs boson mass, and low-energy constraints on flavor and CP violation. We then critically examine attempts to quantify naturalness in supersymmetry, stressing the many subjective choices that impact the results both quantitatively and qualitatively. Finally, we survey various proposals for natural supersymmetric models, including effective supersymmetry, focus point supersymmetry, compressed supersymmetry, and R-parity-violating supersymmetry, and summarize their key features, current status, and implications for future experiments.

Journal ArticleDOI
TL;DR: Running couplings can be understood as arising from the spontaneous breaking of an exact scale invariance in appropriate effective quantum field theories with no dilatation anomaly; any ordinary quantum field theory can be embedded into a theory with spontaneously broken exact scale invariance in such a way that the ordinary running is recovered in the appropriate limit.
Abstract: Running couplings can be understood as arising from the spontaneous breaking of an exact scale invariance in appropriate effective theories with no dilatation anomaly. Any ordinary quantum field theory, even if it has massive fields, can be embedded into a theory with spontaneously broken exact scale invariance in such a way that the ordinary running is recovered in the appropriate limit, as long as the potential has a flat direction. These scale-invariant theories, however, do not necessarily solve the cosmological constant or naturalness problems, which become manifest in the need to fine-tune dimensionless parameters.

Journal ArticleDOI
Guido Altarelli1
TL;DR: In this paper, the authors present a concise outlook of particle physics after the first LHC results at 7-8 TeV, and review the established facts so far and present a tentative assessment of the open problems.
Abstract: We present a concise outlook of particle physics after the first LHC results at 7-8 TeV. The discovery of the Higgs boson at 126 GeV will remain as one of the major physics discoveries of our time. But also the surprising absence of any signals of new physics, if confirmed in the continuation of the LHC experiments, is going to drastically change our vision of the field. At present the indication is that Nature does not care too much about our notion of naturalness. Still, the argument for naturalness is a solid one and we are facing a puzzling situation. We review the established facts so far and present a tentative assessment of the open problems.

Journal Article
TL;DR: In this article, it is argued that notions such as naturalness, fundamentality, and structure can be analyzed in terms of the notion of degree of being or grade of being, a notion that many contemporary metaphysicians regard as unintelligible.
Abstract: Let us agree that everything that there is exists, and that to be, to be real, and to exist are one and the same. Does everything that there is exist to the same degree? Or do some things exist more than others? Are there gradations of being? Perhaps no view is more despised by analytic metaphysicians than that there are gradations of being. But what if, unbeknownst to them, they have helped themselves to the doctrine that being comes in degrees when formulating various metaphysical theories or conducting metaphysical disputes? What if gradation of being is already playing a significant role in their theorizing, albeit under a different guise? Consider the following technical terms employed in many contemporary metaphysical debates: ‘naturalness’ as used by David Lewis (1986), ‘fundamentality’ or ‘structure’ as used by Ted Sider (2009, 2012), ‘grounding’ as used by Jonathan Schaffer (2009) and others, and the ubiquitous ‘in virtue of’. I have argued elsewhere that, given certain plausible assumptions, the notion of degree of being or grade of being can be analyzed in terms of these notions. Here I will argue that, given certain plausible assumptions, each of these notions can be analyzed in terms of the notion that being comes in degrees or grades. There are several reasons why this result is interesting. First, the notions of naturalness, fundamentality, or structure are ones that most contemporary metaphysicians grant are intelligible, whereas the claim that existence, being, or reality might come in degrees is regarded by many metaphysicians as being unintelligible. One way to assist a philosopher in grasping a notion that she regards as unintelligible

Journal ArticleDOI
TL;DR: In this article, the authors examined the effect of point-of-purchase claims on the perception of naturalness and found that the authority which claims the naturalness of the product is of major importance, leading consumers to perceive the claim as more credible.