
Showing papers on "Naturalness published in 2017"


Journal ArticleDOI
TL;DR: In this paper, a systematic review identified 72 studies conducted in 32 countries involving 85,348 consumers and found that the items used to measure the importance of naturalness can be classified into three categories: 1) the way the food has been grown (food origin), 2) how the food has been produced (what technology and ingredients have been used), and 3) the properties of the final product.
Abstract: Background Consumers’ perceptions of naturalness are important for the acceptance of foods and food technologies. Thus, several studies have examined the significance of naturalness among consumers. Nonetheless, the aspects that are considered essential in perceiving a food item as natural may vary across consumers and different stakeholder groups. Scope and approach This systematic review identified 72 studies conducted in 32 countries involving 85,348 consumers. We aimed to answer the following questions: 1) How has the perceived importance of naturalness for consumers been defined and measured? 2) To what extent is perceived naturalness important to consumers? 3) Are there individual differences regarding the importance given to food naturalness that can be explained by consumers' characteristics? 4) Do consumers’ attitudes toward food naturalness influence their intentions and behavior? Key findings and conclusions The review clearly shows that for the majority of consumers, food naturalness is crucial. This finding could be observed across countries and in the different years when the studies were conducted. Therefore, neglecting the aspect of naturalness in the food industry may be very costly in the end. Our review also reveals differences across studies in how naturalness has been defined and measured. Based on a content analysis of the measurement scales, the items used to measure the importance of naturalness can be classified into three categories: 1) the way the food has been grown (food origin), 2) how the food has been produced (what technology and ingredients have been used), and 3) the properties of the final product.

441 citations


Journal ArticleDOI
TL;DR: The right-handed neutrinos within the type-I seesaw mechanism can induce large radiative corrections to the Higgs mass and naturalness arguments can then be used to set limits on their mass scale and Yukawa couplings.
Abstract: The right-handed neutrinos within the type-I seesaw mechanism can induce large radiative corrections to the Higgs mass, and naturalness arguments can then be used to set limits on their mass scale and Yukawa couplings. Driven by minimality, we consider the presence of two degenerate right-handed neutrinos. We compare the limits from naturalness with the ones from the stability of the electroweak vacuum and from lepton flavor violation. Implications from neutrinoless double beta decay are also discussed and renormalization effects for the light neutrino parameters are presented. Adding small perturbations to the degenerate heavy neutrino spectrum allows for successful leptogenesis.
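The kind of naturalness limit described can be illustrated with a standard back-of-the-envelope estimate (a sketch under textbook assumptions, not the paper's actual computation): in the type-I seesaw, the one-loop Higgs mass correction scales roughly as δm_H² ~ m_ν M³ / (4π²v²), so demanding δm_H² below a cutoff bounds the right-handed mass scale M.

```python
import math

# Rough one-loop seesaw estimate: delta_mH^2 ~ m_nu * M^3 / (4 pi^2 v^2).
# Requiring delta_mH^2 < (1 TeV)^2 bounds the heavy-neutrino mass M.
# All values in GeV; the inputs are illustrative, not taken from the paper.
v = 246.0                 # electroweak vev
m_nu = 0.05e-9            # light neutrino mass, ~0.05 eV
delta_mH2_max = 1.0e3**2  # naturalness cutoff: (1 TeV)^2

M_max = (4 * math.pi**2 * v**2 * delta_mH2_max / m_nu) ** (1.0 / 3.0)
print(f"Naturalness bound on the seesaw scale: M < {M_max:.1e} GeV")
```

With these inputs the bound comes out at a few times 10^7 GeV, the scale usually quoted in this context.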

98 citations


Book ChapterDOI
TL;DR: In an imaginary conversation with Guido Altarelli, the author expresses his views on the status of particle physics beyond the Standard Model and its future prospects.
Abstract: In an imaginary conversation with Guido Altarelli, I express my views on the status of particle physics beyond the Standard Model and its future prospects.

81 citations


Journal ArticleDOI
TL;DR: In this paper, a weak scalar triplet is introduced and the interplay of direct and indirect constraints on the type II seesaw model with its contribution to the Higgs mass is analyzed.

56 citations


Journal ArticleDOI
TL;DR: In this article, the authors extend the list of theories featuring a rigorous interacting ultraviolet fixed point by constructing the first theory featuring a Higgs-like scalar with gauge, Yukawa and quartic interactions.
Abstract: We extend the list of theories featuring a rigorous interacting ultraviolet fixed point by constructing the first theory featuring a Higgs-like scalar with gauge, Yukawa and quartic interactions. We show that the theory enters a perturbative asymptotically safe regime at energies above a physical scale $\Lambda$. We determine the salient properties of the theory and use it as a concrete example to test whether scalar masses unavoidably receive quantum corrections of order $\Lambda$. Having at our disposal a calculable model that allows us to precisely relate the IR and UV of the theory, we demonstrate that the scalars can be lighter than $\Lambda$. Although we do not have an answer to whether the Standard Model hypercharge coupling's growth towards a Landau pole at around $\Lambda \sim 10^{40}$ GeV can be tamed by non-perturbative asymptotic safety, our results indicate that such a possibility is worth exploring. In fact, if successful, it might also offer an explanation for the unbearable lightness of the Higgs.
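The quoted Landau-pole scale can be reproduced from the standard one-loop running of the hypercharge coupling, dg'/d ln μ = b g'³/(16π²) with b = 41/6, which integrates to a pole at Λ = M_Z exp(8π²/(b g'(M_Z)²)). This is a sketch with illustrative input values; the paper's non-perturbative question lies beyond this approximation.

```python
import math

# One-loop running of the SM hypercharge coupling g':
#   dg'/dlnmu = b * g'^3 / (16 pi^2),  b = 41/6,
# giving a Landau pole at Lambda = M_Z * exp(8 pi^2 / (b * g'(M_Z)^2)).
# Input values are illustrative assumptions.
b = 41.0 / 6.0
MZ = 91.19        # GeV
gprime = 0.357    # g'(M_Z)

Lambda = MZ * math.exp(8 * math.pi**2 / (b * gprime**2))
print(f"One-loop Landau pole at ~1e{math.log10(Lambda):.0f} GeV")
```

The result lands around 10^41 GeV, consistent with the order of magnitude cited in the abstract.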

51 citations


Journal ArticleDOI
TL;DR: In this paper, the authors identify two dimensions of food naturalness and relate them to credibility, attractiveness, quality, and purchase intention, with differences according to the three types of packaging tested.
Abstract: The aim of this article is to understand the concept of food “naturalness,” as it is perceived by the consumer via the packaging. The research is based on a qualitative study from which three types of experimental packaging were constructed (emotional, functional, and mixed) and a quantitative study carried out on 163 French consumers. The research identified two dimensions of food naturalness and related them to credibility, attractiveness, quality, and purchase intention, with differences according to the three types of packaging tested. The highlighting of their role in the perception of the naturalness of a food product should help managers to avoid overexposure of the concept.

46 citations


Journal ArticleDOI
TL;DR: In this paper, a controlled experiment examined how academic achievement and cognitive, emotional and social aspects of perceived learning are affected by the level of medium naturalness (face-to-face, one-way and two-way videoconferencing) and by learners' personality traits (extroversion-introversion and emotional stability-neuroticism).
Abstract: This controlled experiment examined how academic achievement and cognitive, emotional and social aspects of perceived learning are affected by the level of medium naturalness (face-to-face, one-way and two-way videoconferencing) and by learners’ personality traits (extroversion–introversion and emotional stability–neuroticism). The Media Naturalness Theory explains the degree of medium naturalness by comparing its characteristics to face-to-face communication, considered to be the most natural form of communication. A total of 76 participants were randomly assigned to three experimental conditions: face-to-face, one-way and two-way videoconferencing. E-learning conditions were conducted through Zoom videoconferencing, which enables natural and spontaneous communication. Findings shed light on the trade-off involved in media naturalness: one-way videoconferencing, the less natural learning condition, enhanced the cognitive aspect of perceived learning but compromised the emotional and social aspects. Regarding the impact of personality, neurotic students tended to enjoy and succeed more in face-to-face learning, whereas emotionally stable students enjoyed and succeeded in all of the learning conditions. Extroverts tended to enjoy more natural learning environments but had lower achievements in these conditions. In accordance with the ‘poor get richer’ principle, introverts enjoyed environments with a low level of medium naturalness. However, they remained focused and had higher achievements in the face-to-face learning. (Published 13 July 2017) Citation: Research in Learning Technology 2017, 25: 1945 - http://dx.doi.org/10.25304/rlt.v25.1974

46 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of observing gauge-Higgs sectors in primordial non-Gaussianities, and show that it may be possible with reasonable precision given favorable couplings to the inflationary dynamics.
Abstract: Future measurements of primordial non-Gaussianity can reveal cosmologically produced particles with masses of order the inflationary Hubble scale and their interactions with the inflaton, giving us crucial insights into the structure of fundamental physics at extremely high energies. We study gauge-Higgs theories that may be accessible in this regime, carefully imposing the constraints of gauge symmetry and its (partial) Higgsing. We distinguish two types of Higgs mechanisms: (i) a standard one in which the Higgs scale is constant before and after inflation, where the particles observable in non-Gaussianities are far heavier than can be accessed by laboratory experiments, perhaps associated with gauge unification, and (ii) a "heavy-lifting" mechanism in which couplings to curvature can result in Higgs scales of order the Hubble scale during inflation while reducing to far lower scales in the current era, where they may now be accessible to collider and other laboratory experiments. In the heavy-lifting option, renormalization-group running of terrestrial measurements yield predictions for cosmological non-Gaussianities. If the heavy-lifted gauge theory suffers a hierarchy problem, such as does the Standard Model, confirming such predictions would demonstrate a striking violation of the Naturalness Principle. While observing gauge-Higgs sectors in non-Gaussianities will be challenging given the constraints of cosmic variance, we show that it may be possible with reasonable precision given favorable couplings to the inflationary dynamics.

45 citations


Journal ArticleDOI
TL;DR: Criterion values to achieve ‘good’ preference, naturalness and vividness levels were determined, and the observers’ assessments were predicted by recent colour quality indices and CIELAB chroma differences.
Abstract: In Part 2 of this work, observers scaled colour preference, naturalness and vividness visually on interval scales (0–100) labelled by semantic categories (e.g. ‘moderate’, ‘good’ and ‘very good’) i...

43 citations


Proceedings ArticleDOI
20 Aug 2017
TL;DR: This work proposes to use Multi-Task Learning (MTL) with gender and naturalness as auxiliary tasks in deep neural networks, and finds that the proposed MTL method improved performance significantly.
Abstract: One of the challenges in Speech Emotion Recognition (SER) "in the wild" is the large mismatch between training and test data (e.g. speakers and tasks). In order to improve the generalisation capabilities of the emotion models, we propose to use Multi-Task Learning (MTL) with gender and naturalness as auxiliary tasks in deep neural networks. This method was evaluated in within-corpus and various cross-corpus classification experiments that simulate conditions "in the wild". In comparison to state-of-the-art Single-Task Learning (STL) methods, we found that our proposed MTL method improved performance significantly. In particular, models using both gender and naturalness achieved greater gains than those using either gender or naturalness separately. This benefit was also found in the high-level representations of the feature space obtained from our proposed method, where discriminative emotional clusters could be observed.
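The multi-task setup described, a shared representation feeding a main emotion head plus an auxiliary head trained on a weighted combined loss, can be sketched with a toy NumPy model. All shapes, data, and loss weights below are hypothetical; the paper uses deep neural networks on real speech corpora.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 "utterances" with 20 acoustic features, a main (emotion-like)
# regression target and an auxiliary (gender-like) target; all synthetic.
X = rng.normal(size=(200, 20))
y_emotion = X @ rng.normal(size=20) + 0.1 * rng.normal(size=200)
y_gender = (X[:, 0] > 0).astype(float)

# Shared representation + two linear heads, trained on a weighted sum of
# per-task MSE losses (the 0.3 auxiliary weight is an assumption).
W_shared = rng.normal(size=(20, 8)) * 0.1
w_emo = np.zeros(8)
w_gen = np.zeros(8)
lr, aux_weight = 0.01, 0.3

def losses():
    H = X @ W_shared
    return np.mean((H @ w_emo - y_emotion) ** 2), np.mean((H @ w_gen - y_gender) ** 2)

loss0 = sum(losses())
for _ in range(500):
    H = X @ W_shared
    err_e = H @ w_emo - y_emotion   # main-task residual
    err_g = H @ w_gen - y_gender    # auxiliary-task residual
    # Gradient of L = MSE_emo + aux_weight * MSE_gen w.r.t. the shared layer:
    grad_H = 2 * (np.outer(err_e, w_emo) + aux_weight * np.outer(err_g, w_gen)) / len(X)
    W_shared -= lr * X.T @ grad_H
    w_emo -= lr * 2 * H.T @ err_e / len(X)
    w_gen -= lr * aux_weight * 2 * H.T @ err_g / len(X)

print(f"combined loss: {loss0:.3f} -> {sum(losses()):.3f}")
```

The key design point is that both heads backpropagate into the same `W_shared`, so the auxiliary task shapes the representation used by the main task.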

41 citations


Journal ArticleDOI
TL;DR: A technique is proposed in this paper in which a pool of pre-trained transformations between a set of speakers is used, making it possible to produce de-identified speech in real-time with a high level of naturalness.

Journal ArticleDOI
TL;DR: High-level visual features play a prominent role in predicting aesthetic preference, but do not completely eliminate the predictive power of the low-level visual features; these strong predictors provide powerful insights for future research relating to landscape and urban design.
Abstract: Previous research has investigated ways to quantify visual information of a scene in terms of a visual processing hierarchy, i.e. making sense of the visual environment by segmentation and integration of elementary sensory input. Guided by this research, studies have developed categories for low-level visual features (e.g., edges, colors), high-level visual features (scene-level entities that convey semantic information, such as objects), and models of how those features predict aesthetic preference and naturalness. For example, in Kardan et al. (2015), 52 participants provided aesthetic preference and naturalness ratings, which are used in the current study, for 307 images of mixed natural and urban content. Kardan et al. (2015) then developed a model using low-level features to predict aesthetic preference and naturalness and could do so with high accuracy. What has yet to be explored is the ability of higher-level visual features (e.g., horizon line position relative to viewer, geometry of building distribution relative to visual access) to predict aesthetic preference and naturalness of scenes, and whether higher-level features mediate some of the association between the low-level features and aesthetic preference or naturalness. In this study we investigated these relationships and found that low- and high-level features explain 68.4% of the variance in aesthetic preference ratings and 88.7% of the variance in naturalness ratings. Additionally, several high-level features mediated the relationship between the low-level visual features and aesthetic preference. In a multiple mediation analysis, the high-level feature mediators accounted for over 50% of the variance in predicting aesthetic preference. These results show that high-level visual features play a prominent role in predicting aesthetic preference, but do not completely eliminate the predictive power of the low-level visual features. These strong predictors provide powerful insights for future research relating to landscape and urban design with the aim of maximizing subjective well-being, which could lead to improved health outcomes on a larger scale.
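The mediation analysis mentioned can be illustrated in its simplest single-mediator, Baron-Kenny-style form: regress preference on a low-level feature alone for the total effect, then add the high-level mediator to obtain the direct effect; the gap between the two is the mediated part. The data below are synthetic and the feature names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Synthetic scenes: a low-level feature (say, edge density) influences
# preference partly through a high-level mediator (say, perceived openness).
low = rng.normal(size=n)
mediator = 0.8 * low + rng.normal(scale=0.5, size=n)
preference = 0.3 * low + 0.6 * mediator + rng.normal(scale=0.5, size=n)

def ols(predictors, y):
    # Ordinary least squares with an intercept; returns the coefficients.
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(X, y, rcond=None)[0]

total_effect = ols([low], preference)[1]             # preference ~ low
direct_effect = ols([low, mediator], preference)[1]  # preference ~ low + mediator
print(f"total = {total_effect:.2f}, direct = {direct_effect:.2f}, "
      f"mediated share = {(total_effect - direct_effect) / total_effect:.0%}")
```

With these generating coefficients the total effect is about 0.78 and the direct effect about 0.3, so roughly 60% of the low-level feature's association with preference runs through the mediator.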

Journal ArticleDOI
TL;DR: Natural elements, life and life-like processes, as well as representations of them, can produce positive experiences within the built environment, as a number of empirical studies over the past decade have shown.
Abstract: Natural elements, life and life-like processes, as well as representations of them, can produce positive experiences within the built environment. Over the past decade, a number of empirical studie...

Journal ArticleDOI
TL;DR: In this article, the authors compare fine-tuning in a Ω3 conserving semi-constrained NMSSM to the constrained MSSM (CMSSM), and show that naturalness priors provide valuable insight into the hierarchy problem.
Abstract: The Higgs boson discovery stirred interest in next-to-minimal supersymmetric models, due to the apparent fine-tuning required to accommodate it in minimal theories. To assess their naturalness, we compare fine-tuning in a ℤ3 conserving semi-constrained Next-to-Minimal Supersymmetric Standard Model (NMSSM) to the constrained MSSM (CMSSM). We contrast popular fine-tuning measures with naturalness priors, which automatically appear in statistical measures of the plausibility that a given model reproduces the weak scale. Our comparison shows that naturalness priors provide valuable insight into the hierarchy problem and rigorously ground naturalness in Bayesian statistics. For the CMSSM and semi-constrained NMSSM we demonstrate qualitative agreement between naturalness priors and popular fine tuning measures. Thus, we give a clear plausibility argument that favours relatively light superpartners.
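The "popular fine-tuning measures" contrasted here with naturalness priors are typically Barbieri-Giudice sensitivities, Δ_p = |∂ ln m_Z² / ∂ ln p|. A toy numerical version is sketched below using a simplified large-tan β tree-level relation, m_Z² ≈ -2μ² - 2m_{Hu}², rather than the paper's full spectrum calculation; the parameter values are illustrative.

```python
# Barbieri-Giudice fine-tuning Delta_p = |d ln mZ^2 / d ln p|, computed by
# symmetric finite differences for the toy large-tan(beta) relation
#   mZ^2 = -2*mu^2 - 2*mHu2   (GeV and GeV^2; values are illustrative).
MZ = 91.19

def mZ2(mu, mHu2):
    return -2 * mu**2 - 2 * mHu2

def delta(mu, mHu2, eps=1e-6):
    up = mZ2(mu * (1 + eps), mHu2)
    down = mZ2(mu * (1 - eps), mHu2)
    return abs((up - down) / (2 * eps * mZ2(mu, mHu2)))

mu = 500.0
mHu2 = -(mu**2 + MZ**2 / 2)   # tuned so that mZ2(mu, mHu2) = MZ^2

print(f"Delta_mu = {delta(mu, mHu2):.0f}  "
      f"(analytic 4*mu^2/MZ^2 = {4 * mu**2 / MZ**2:.0f})")
```

For μ = 500 GeV this gives Δ_μ ≈ 120, matching the analytic 4μ²/m_Z²; large Δ values are what the fine-tuning measures flag as unnatural.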

Posted Content
TL;DR: The authors demonstrate additivity dominance for the first time using equivalent adding and subtracting procedures, finding that adding something to a natural product (an additive) reduces naturalness more than removing an equivalent entity (a subtractive).
Abstract: Naturalness is important and valued by most lay Western individuals. Yet, little is known about the lay meaning of "natural". We examine the phenomenon of additivity dominance: adding something to a natural product (additive) reduces naturalness more than removing an equivalent entity (subtractive). We demonstrate additivity dominance for the first time using equivalent adding and subtracting procedures. We find that adding something reduces naturalness more than removing the same thing (e.g., adding pulp to orange juice reduces naturalness more than removing pulp from orange juice; Study 1); an organism with a gene added is less natural than one with a gene removed (Study 2); and framing a product as an additive (versus as a subtractive) reduces naturalness (Study 3). We begin to examine accounts of additivity dominance. We find that it is not due to the connotations of the word "additive" (Study 4). However, data are consistent with an extra processing account, where additives involve more processing (extracting and adding) than subtractives (only removing), and with a contagion account, where adding is more contaminating than removing (Study 5).

Book
T. Givón1
19 Jan 2017
TL;DR: This article studied the naturalness, universality and well-governedness of zero by studying it from four closely related perspectives: cognitive and communicative function, natural-text distribution, cross-language typological distribution, and the diachronic rise of referent coding devices.
Abstract: The zero coding of referents or other clausal constituents is one of the most natural, communicatively and cognitively-transparent grammatical devices in human language. Together with its functional equivalent, obligatory pronominal agreement, zero is both extremely widespread cross-linguistically and highly frequent in natural text. In the domain of reference, zero represents, somewhat paradoxically, either anaphorically-governed high continuity or cataphorically-governed low topicality. And whether in conjoined/chained or syntactically-subordinate clauses, zero is extremely well-governed, at a level approaching 100% in natural text. The naturalness, cross-language ubiquity and well-governedness of zero have been largely obscured by an approach that, for 30-odd years, has considered it a typological exotica, the so-called "pro-drop" associated with a dubious "non-configurational" language type. The main aim of this book is to reaffirm the naturalness, universality and well-governedness of zero by studying it from four closely related perspectives: (i) cognitive and communicative function; (ii) natural-text distribution; (iii) cross-language typological distribution; and (iv) the diachronic rise of referent coding devices. The latter is particularly central to our understanding the functional interplay between zero anaphora, pronominal agreement and related referent-coding devices.

Journal ArticleDOI
TL;DR: Motivated by naturalness, the authors study a simplified Minimal Supersymmetric Standard Model (MSSM) scenario in which only the bino-like lightest supersymmetric particle (LSP) and the higgsino-like next-to-lightest supersymmetric particle (NLSP) are light.
Abstract: Motivated by the naturalness, we study a simplified Minimal Supersymmetric Standard Model (MSSM) scenario where only the bino-like lightest supersymmetric particle (LSP) and higgsino-like next-ligh...

Journal ArticleDOI
TL;DR: In this paper, the authors extend the NMSSM by inverse seesaw mechanism to generate neutrino mass, and show that in certain parameter space the lightest sneutrino may act as a viable dark matter candidate, i.e. it can annihilate by multi-channels to get correct relic density and meanwhile satisfy all experimental constraints.
Abstract: In supersymmetric theories like the Next-to-Minimal Supersymmetric Standard Model (NMSSM), the lightest neutralino with bino or singlino as its dominant component is customarily taken as dark matter (DM) candidate. Since light Higgsinos favored by naturalness can strengthen the couplings of the DM and thus enhance the DM-nucleon scattering rate, the tension between naturalness and DM direct detection results becomes more and more acute with the improved experimental sensitivity. In this work, we extend the NMSSM by the inverse seesaw mechanism to generate neutrino mass, and show that in certain parameter space the lightest sneutrino may act as a viable DM candidate, i.e. it can annihilate through multiple channels to obtain the correct relic density and meanwhile satisfy all experimental constraints. The most striking feature of the extension is that the DM-nucleon scattering rate can be naturally below its current experimental bounds regardless of the higgsino mass, and hence it alleviates the tension between naturalness and DM experiments. Other interesting features include that the Higgs phenomenology becomes much richer than that of the original NMSSM due to the relaxed constraints from DM physics and also due to the presence of extra neutrinos, and that the signatures of sparticles at colliders are quite different from those with a neutralino as DM candidate.

Journal ArticleDOI
TL;DR: In this paper, the authors study the naturalness properties of the $B-L$ Supersymmetric Standard Model (BLSSM) and compare them to those of the Minimal Supersymmetric Standard Models (MSSM), at both low (i.e., Large Hadron Collider) energies and high scales.
Abstract: We study the naturalness properties of the $B-L$ Supersymmetric Standard Model (BLSSM) and compare them to those of the Minimal Supersymmetric Standard Model (MSSM) at both low (i.e., Large Hadron Collider) energies and high (i.e., unification) scales. By adopting standard measures of naturalness, we assess that, in the presence of full unification of the additional gauge couplings and scalar/fermionic masses of the BLSSM, such a scenario reveals a somewhat higher degree of Fine-Tuning (FT) than the MSSM, when the latter is computed at the unification scale and all available theoretical and experimental constraints except the Dark Matter (DM) ones are taken into account. Yet, such a difference, driven primarily by the collider limits requiring a high mass for the gauge boson associated with the breaking of the additional $U(1)_{B-L}$ gauge group of the BLSSM in addition to the $SU(3)_C\times SU(2)_L \times U(1)_Y$ of the MSSM, should be regarded as a modest price to pay for the former in relation to the latter, if one notices that the non-minimal scenario offers a significant volume of parameter space where numerous DM solutions of different compositions can be found to the relic density constraints, unlike the case of the minimal structure, wherein only one type of solution is accessible over an ever diminishing parameter space. In fact, this different level of tension within the two SUSY models in complying with current data is well revealed when the FT measure is recomputed in terms of the low energy spectra of the two models, over their allowed regions of parameter space now in the presence of all DM bounds, as it is shown that the tendency is now opposite, the BLSSM appearing more natural than the MSSM.

Journal ArticleDOI
TL;DR: A unified CQ model was developed with a multiple nonlinear regression equation combining the Illuminating Engineering Society of North America color rendition method that accords satisfactorily with the subjective evaluation, while being applicable to a wide range of CCTs.
Abstract: Considering that the existing color quality (CQ) metrics for light sources cannot correlate well with the subjective evaluation, in an immersive environment equipped with a multichannel LED light source, a psychophysical experiment by categorical judgment method was carried out to assess the three perception-related CQ attributes of light sources in terms of naturalness, colorfulness, and preference. The experiment collected the subjective responses to these attributes of up to 41 metameric spectra at each of four test correlated color temperatures (CCTs) ranging from 2800 to 6500 K, which covers the usual white-light range for general lighting. The results indicate that preference exhibits relatively high correlation with naturalness and colorfulness, and naturalness is weakly related to colorfulness. Besides, 20 typical CQ metrics were adopted to examine their validity in characterizing the subjective data, confirming their limited performance. Meanwhile, the underlying relationship of these metrics and the subjective data was also analyzed by the multidimensional scaling, revealing that almost all metrics can correspond to one attribute of naturalness, colorfulness, and preference, and that the saturation level is identified as a critical factor affecting these attributes. Based on these results, a unified CQ model was developed with a multiple nonlinear regression equation combining the Illuminating Engineering Society of North America color rendition method. The model accords satisfactorily with the subjective evaluation, while being applicable to a wide range of CCTs.
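The final step, a multiple nonlinear regression predicting a subjective rating from colour-rendition quantities, can be sketched as a least-squares fit that is linear in its coefficients but nonlinear in the predictors. All data and the exact functional form below are assumptions, loosely modelled on TM-30-style fidelity/gamut indices; the paper's actual model combines the IES colour rendition measures.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 160

# Synthetic stimuli: a fidelity-like index Rf and a gamut-like index Rg
# (hypothetical stand-ins for IES TM-30 quantities).
Rf = rng.uniform(60, 100, n)
Rg = rng.uniform(80, 120, n)
rating = 20 + 0.5 * Rf + 0.8 * Rg - 0.004 * (Rg - 105) ** 2 + rng.normal(0, 2, n)

# Multiple nonlinear regression: nonlinear in Rg via a quadratic term,
# but linear in the coefficients, so ordinary least squares suffices.
X = np.column_stack([np.ones(n), Rf, Rg, (Rg - 105) ** 2])
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((rating - pred) ** 2) / np.sum((rating - rating.mean()) ** 2)
print(f"fitted coefficients: {np.round(coef, 3)},  R^2 = {r2:.3f}")
```

The quadratic term captures the saturation effect the study identifies: ratings peak at an intermediate gamut level rather than rising indefinitely.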

Journal ArticleDOI
TL;DR: A scalar singlet extension of the standard model, in which the multiple-point principle (MPP) condition of a vanishing Higgs potential at the Planck scale is realized, was proposed in this paper.
Abstract: We suggest a scalar singlet extension of the standard model, in which the multiple-point principle (MPP) condition of a vanishing Higgs potential at the Planck scale is realized. Although there have been many attempts to realize the MPP at the Planck scale, achieving it while preserving naturalness is quite difficult. Our model can easily achieve the MPP at the Planck scale without large Higgs mass corrections. It is worth noting that the electroweak symmetry can be radiatively broken in our model. From the naturalness point of view, the singlet scalar mass should be of ${\cal O}(1)\,{\rm TeV}$ or less. We also consider a right-handed neutrino extension of the model for neutrino mass generation. The extension does not affect the MPP scenario, and might preserve naturalness with the new particle mass scale beyond the TeV range, thanks to an accidental cancellation of Higgs mass corrections.

Journal ArticleDOI
TL;DR: In this paper, the authors compared four different methodological approaches in order to identify which is the most reliable when analyzing habitat naturalness, and found that relative naturalness indicator values performed well in differentiating among near-natural and degraded veg...
Abstract: Assessing habitat naturalness belongs to the most current issues in conservation biology. It has been recognized that plants are able to indicate the naturalness of their habitat. Thus, species may be given relative naturalness indicator values (i.e. scores on an ordinal scale), reflecting their different tolerances against habitat degradation. In the present study, our first goal was to test whether relative naturalness indicator values are able to reveal known differences in naturalness levels. Our second purpose was to compare four different methodological approaches in order to identify which is the most reliable when analyzing habitat naturalness. We compared near-natural and degraded plots on the bases of (1) unweighted plot means, (2) plot medians, (3) unweighted naturalness indicator value populations, and (4) frequency-weighted naturalness indicator value populations. We found that relative naturalness indicator values performed well in differentiating among near-natural and degraded veg...
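The four methodological approaches can be mimicked on a toy plot: unweighted plot means, plot medians, pooled (unweighted) indicator-value populations, and frequency-weighted populations in which each species' indicator value is repeated in proportion to its abundance. The species names, scores, and abundances below are invented for illustration.

```python
import statistics

# Toy plot: per-species naturalness indicator values (ordinal scores)
# and abundances (percent cover); all numbers are illustrative.
plot = {"Festuca": (5, 40), "Bromus": (4, 30), "Ambrosia": (1, 20), "Poa": (3, 10)}

values = [v for v, _ in plot.values()]

unweighted_mean = statistics.mean(values)       # approach 1: plot mean
plot_median = statistics.median(values)         # approach 2: plot median
pooled_population = sorted(values)              # approach 3: unweighted population
# approach 4: frequency-weighted population (each value repeated by abundance)
weighted_population = [v for v, w in plot.values() for _ in range(w)]
weighted_mean = statistics.mean(weighted_population)

print(unweighted_mean, plot_median, weighted_mean)
```

Note how weighting shifts the summary: the abundant high-scoring species pull the frequency-weighted mean (3.7) above the unweighted mean (3.25), which is exactly the kind of difference the study's comparison probes.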

Journal ArticleDOI
01 Feb 2017-Synthese
TL;DR: The role of naturalness as a guide to theory model-building is being severely questioned, as the proliferation of heavy scalars generically destabilizes the Higgs boson mass, raising it toward the highest and most remote scalar mass values in nature.
Abstract: Sensitivity to the square of the cutoff scale of quantum corrections of the Higgs boson mass self-energy has led many authors to conclude that the Higgs theory suffers from a naturalness or fine-tuning problem. However, speculative new physics ideas to solve this problem have not manifested themselves yet at high-energy colliders, such as the Large Hadron Collider at CERN. For this reason, the role of naturalness as a guide to theory model-building is being severely questioned. Most attacks suggest that one should not resort to arguments involving gravity, which is a much less understood quantum field theory. Another line of attack is against the assumption that there exists a multitude of additional heavy states specifically charged under the Standard Model gauge symmetries. Nevertheless, if we give ground on both of these assaults on naturalness, what remains is a naturalness concern over the prospect of numerous additional spin-zero scalar states in nature. The proliferation of heavy scalars generically destabilizes the Higgs boson mass, raising it to the highest and most remote scalar mass values in nature, thus straining the legitimacy of the Standard Model. The copious use of extra scalars in theory model building, from explaining flavor physics to providing an inflationary potential and more, and the generic expectation of extra scalar bosons in nature argues for the proliferation instability problem being the central concern for naturalness of the Standard Model. Some approaches to solving this problem are presented.


Posted Content
01 May 2017-viXra
TL;DR: The consistency factor of the Theory of Everything (ToE) presented by the authors is 1/(7+5+4) = 0.0625, which they argue is the highest possible value for ToE-like theories.
Abstract: Can we guess the initial conditions for the Theory of Everything (ToE)? We understand such initial conditions as a set of all parameters, initial symmetries, and initial equations. Initial symmetries and initial equations can point to possible phase transitions which can lead to additional symmetries and additional equations, called here the additional conditions. Such additional conditions result from the initial conditions, so they do not decrease the consistency of a theory. On the other hand, anomalies appearing in a theory that cannot be explained within the initial and additional conditions always lead to new/free parameters. Free parameters need ad hoc hypotheses (i.e. corrections that do not result from the initial and additional conditions), which always weaken theories. Elimination of ad hoc/free parameters by increasing the number of initial conditions makes Occam's razor a determinant of the consistency of theories describing the same phenomena. Occam's razor is defined as follows: "Among competing hypotheses, the one with the fewest assumptions should be selected" [1]. It means that the consistency of a theory can be defined as the inverse of the sum of all parameters, initial symmetries and initial equations (the sum of the elements of the three different groups of initial conditions). New symmetries and new equations, which appear in a natural way on higher levels of the ToE (the Standard Model (SM) and General Relativity (GR) are the higher levels of the ToE), do not decrease the consistency of the theory if we know the lowest levels of the ToE. Authors of theories add ad hoc hypotheses to prevent them from being falsified. Such a non-scientific method causes theories to become more and more complex, so their consistency becomes lower and lower. In physics, naturalness means that the dimensionless ratios between parameters take values of order 1. Parameters varying by many orders of magnitude need so-called fine-tuning symmetries. This suggests that fine-tuned theories should be more complex, i.e. their consistency should be lower. But Nature shows the reverse, which leads to the conclusion that fine-tuned theories are closer to the ToE. Here we guess the initial conditions for the ToE and explain why the consistency of the ToE presented here is the highest and why it is a fine-tuned theory. The consistency factor of the ToE presented here is 1/(7+5+4) = 0.0625, the highest possible value for ToE-like theories. The consistency factor of the SM is much lower, so it is an incomplete theory, sometimes leading to incorrect results.

Journal ArticleDOI
TL;DR: The role of the Higgs potential in particle physics, in particular in spontaneous symmetry breaking and in mass generation using an example of a simple reflection symmetry, has been discussed in this article, where temperature and quantum corrections to the potential lead to the naturalness problem and vacuum stability.
Abstract: The physics associated with the Higgs field potential is rich and interesting and deserves a concise summary for a broader audience to appreciate the beauty and the challenges of this subject. We discuss the role of the Higgs potential in particle physics, in particular in spontaneous symmetry breaking and in mass generation, using the example of a simple reflection symmetry, then continue with temperature and quantum corrections to the potential, which lead us to the naturalness problem and vacuum stability.

Posted Content
TL;DR: In this article, the authors extend the list of theories featuring a rigorous interacting ultraviolet fixed point by constructing the first theory featuring a Higgs-like scalar with gauge, Yukawa and quartic interactions.
Abstract: We extend the list of theories featuring a rigorous interacting ultraviolet fixed point by constructing the first theory featuring a Higgs-like scalar with gauge, Yukawa and quartic interactions. We show that the theory enters a perturbative asymptotically safe regime at energies above a physical scale $\Lambda$. We determine the salient properties of the theory and use it as a concrete example to test whether scalar masses unavoidably receive quantum corrections of order $\Lambda$. Having at our disposal a calculable model that allows us to precisely relate the IR and UV of the theory, we demonstrate that the scalars can be lighter than $\Lambda$. Although we do not have an answer to whether the Standard Model hypercharge coupling's growth towards a Landau pole at around $\Lambda \sim 10^{40}$ GeV can be tamed by non-perturbative asymptotic safety, our results indicate that such a possibility is worth exploring. In fact, if successful, it might also offer an explanation for the unbearable lightness of the Higgs.

Dissertation
28 Jul 2017
TL;DR: In this paper, a general naturalness criterion for minimal supersymmetric scenarios is proposed. It builds on the standard fine-tuning measure, which only deals with the cancellations needed to obtain the electroweak symmetry breaking (EWSB) scale, introducing several improvements such as the mixing of the fine-tuning conditions and the dependence on the low- and high-energy (HE) scales.
Abstract: Supersymmetry (SUSY) has long been considered the leading paradigm of beyond the Standard Model (SM) physics, as a framework that tackles the SM hierarchy problem, provides gauge coupling unification and a well-behaved cold dark matter (DM) candidate. Nevertheless, current experimental searches for new physics seem to have cornered the minimal versions of these models in unnatural regions of their parameter space, according to the standard Natural SUSY scenario. With the aim of formulating a general naturalness criterion for minimal supersymmetric scenarios, we have carefully re-examined the standard fine-tuning measure, which only deals with the cancellations needed to obtain the electroweak symmetry breaking (EWSB) scale, introducing several improvements such as the mixing of the fine-tuning conditions and the dependence on the low- and high-energy (HE) scales. Furthermore, we have outlined a method that allows one to straightforwardly derive naturalness bounds on the initial parameters and mass spectrum of any minimal supersymmetric standard model (MSSM) defined at any HE scale. We have applied this method to specific scenarios to compute a complete set of naturalness bounds, and also employed it to find the most natural gauge-mediated SUSY breaking model. Contrary to expectations, we show that Natural SUSY, in general, does not demand light stops. The most stringent upper bound from naturalness is that on the gluino mass, which typically sets the level of fine-tuning but strongly depends on the HE scale. The most robust result of Natural SUSY is by far that Higgsinos should be rather light. Besides, we have investigated other potential sources of fine cancellations in the MSSM that, if present, must be combined with that of the EWSB scale, the most important being the tuning needed to obtain the experimental Higgs mass and that required to reproduce the correct DM relic abundance.
We have quantified them with p-value-like measures that allow us to multiplicatively combine them with the electroweak (EW) fine-tuning. Regarding DM, we have considered the lightest neutralino as the DM particle and explored the various possibilities for its mass, composition and interactions that could give rise to accurate arrangements of the initial parameters to achieve the observed DM relic density. Finally, to illustrate the utility of the above criteria for estimating the global degree of naturalness, we have applied all of them to a specific model that features low-mass neutralinos and sleptons at low energy. We find that these scenarios are rather unnatural when taking into account the aforementioned sources of tuning, which would have gone unnoticed had we only considered the EW fine-tuning.

Posted Content
30 Jul 2017
TL;DR: In this article, the authors extend the NMSSM by inverse seesaw mechanism to generate neutrino mass, and show that in certain parameter space the lightest sneutrino may act as a viable dark matter candidate, i.e. it can annihilate by multi-channels to get correct relic density and meanwhile satisfy all experimental constraints.
Abstract: In supersymmetric theories like the Next-to-Minimal Supersymmetric Standard Model (NMSSM), the lightest neutralino with bino or singlino as its dominant component is customarily taken as the dark matter (DM) candidate. Since the light Higgsinos favored by naturalness can strengthen the couplings of the DM and thus enhance the DM-nucleon scattering rate, the tension between naturalness and DM direct detection results becomes more and more acute with the improved experimental sensitivity. In this work, we extend the NMSSM by the inverse seesaw mechanism to generate neutrino mass, and show that in certain parameter space the lightest sneutrino may act as a viable DM candidate, i.e. it can annihilate through multiple channels to obtain the correct relic density and meanwhile satisfy all experimental constraints. The most striking feature of the extension is that the DM-nucleon scattering rate can be naturally below its current experimental bounds regardless of the Higgsino mass, and hence it alleviates the tension between naturalness and DM experiments. Other interesting features include that the Higgs phenomenology becomes much richer than that of the original NMSSM, due to the relaxed constraints from DM physics and also due to the presence of extra neutrinos, and that the signatures of sparticles at colliders are quite different from those with a neutralino as the DM candidate.

Journal ArticleDOI
TL;DR: In this paper, the authors compared and measured the restoration success of three well-established methods for grassland restoration (sod transplantation, hay transfer, seeding) with three commonly used indices (diversity, number of target species, similarity to reference sites).
Abstract: How should the somewhat vague term of restoration success be measured? This is a critical question rooted in European law, where in fact the creation of proper replacement habitats is a prerequisite for permitting projects that trigger a loss of species or habitats. Previous studies have used indices that relied on a comparison to reference sites, for example the number of species from a predefined pool of target species or compositional similarity. However, since restoration sites rarely have the same biotic and abiotic conditions as reference sites, plant communities in restored sites will not perfectly match the reference sites. Furthermore, such indices fail when reference sites are lacking or degraded. Hence, there is a need for an alternative approach that evaluates the conservation value of a restored site independently of reference sites. We propose that naturalness indicator values can be an option to measure restoration success. The approach of using naturalness indicator values makes use of the fact that plants are able to indicate environmental parameters, including degradation and regeneration. We compared and measured the restoration success of three well-established methods for grassland restoration (sod transplantation, hay transfer, seeding) with three commonly used indices (diversity, number of target species, similarity to reference sites). The results verified earlier studies and showed that sod transplantation led to the highest restoration success, followed by hay transfer and seeding of site-specific seed mixtures. Further, we used those well-established indices for an evaluation of novel, naturalness-based indices (unweighted and cover-weighted mean naturalness indicator values, and the sum of naturalness indicator values).
While calculating the means of naturalness indicator values failed to offer conclusive information on restoration success, we could show that the sum of naturalness indicator values was highly correlated with the number of target species and compositional similarity to reference sites. Thus, our case study demonstrated that naturalness indices can be an excellent option to estimate success in grassland restoration.
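The three naturalness-based indices compared in this abstract can be sketched for a single vegetation plot as follows. This is a minimal illustration of the index definitions only; the species names, indicator values, and cover percentages below are invented for the example:

```python
# Three naturalness-based indices for one hypothetical vegetation plot:
# unweighted mean, cover-weighted mean, and sum of naturalness indicator
# values. All data below are illustrative, not from the study.

plot = [
    # (species, naturalness indicator value, % cover)
    ("Species A", 5, 40.0),
    ("Species B", 3, 25.0),
    ("Species C", 1, 10.0),
]

values = [v for _, v, _ in plot]
covers = [c for _, _, c in plot]

unweighted_mean = sum(values) / len(values)
cover_weighted_mean = sum(v * c for v, c in zip(values, covers)) / sum(covers)
naturalness_sum = sum(values)

print(unweighted_mean, cover_weighted_mean, naturalness_sum)  # 3.0 3.8 9
```

Note that the sum, unlike the two means, grows with the number of indicator species present, which is consistent with the abstract's finding that it correlates with the number of target species.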