
Showing papers on "Naturalness" published in 2020


Journal ArticleDOI
TL;DR: For the case of weak scale supersymmetry (SUSY), which is touted as a simple and elegant solution to the gauge hierarchy problem and a likely low energy limit of compactified string theory, LHC has found rather generally that gluinos are beyond about 2.2 TeV whilst top squarks must lie beyond 1.1 TeV, as discussed in this paper.
Abstract: After completion of LHC Run 2, the ATLAS and CMS experiments had collected of order 139 fb⁻¹ of data at √s = 13 TeV. While a very Standard Model-like Higgs boson of mass mh ≃ 125 GeV has been discovered, no solid signal for physics beyond the Standard Model has emerged so far at LHC. In addition, no WIMP signals have emerged so far at ton-scale noble liquid WIMP search experiments. For the case of weak scale supersymmetry (SUSY), which is touted as a simple and elegant solution to the gauge hierarchy problem and a likely low energy limit of compactified string theory, LHC has found rather generally that gluinos are beyond about 2.2 TeV whilst top squarks must lie beyond 1.1 TeV. These limits contradict older simplistic notions of naturalness that emerged in the 1980s–1990s, leading to the rather pessimistic view that SUSY is now excluded except for perhaps some remaining narrow corners of parameter space. Yet, this picture ignores several important developments in SUSY/string theory that emerged in the 21st century: 1. the emergence of the string theory landscape and its solution to the cosmological constant problem, 2. a more nuanced view of naturalness including the notion of “stringy naturalness”, 3. the emergence of anomaly-free discrete R-symmetries and their connection to R-parity, Peccei-Quinn symmetry, the SUSY μ problem and proton decay, and 4. the importance of including a solution to the strong CP problem. Rather general considerations from the string theory landscape favor large values of soft terms, subject to the vacuum selection criteria that electroweak symmetry is properly broken (no charge and/or color breaking (CCB) minima) and the resulting magnitude of the weak scale is not too far from our measured value. Then stringy naturalness predicts a Higgs mass mh ~ 125 GeV whilst sparticle masses are typically lifted beyond present LHC bounds. In light of these refinements in theory perspective confronted by LHC and dark matter search results, we review the most likely LHC, ILC and dark matter signatures that are expected to arise from weak scale SUSY as we understand it today.
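
For orientation, the bottom-up electroweak naturalness measure referred to in such analyses (not spelled out in this abstract, so take this as a hedged sketch of the standard convention) follows from the tree-level minimization condition of the MSSM Higgs potential, which ties the Z mass to the soft terms and μ:

$$
\frac{m_Z^2}{2} \;=\; \frac{m_{H_d}^2 + \Sigma_d^d - \left(m_{H_u}^2 + \Sigma_u^u\right)\tan^2\beta}{\tan^2\beta - 1} \;-\; \mu^2
\;\simeq\; -m_{H_u}^2 - \mu^2 ,
\qquad
\Delta_{EW} \;\equiv\; \frac{\max_i |C_i|}{m_Z^2/2},
$$

where the $C_i$ are the individual contributions on the right-hand side (including the radiative corrections $\Sigma_u^u$, $\Sigma_d^d$); "natural" spectra are those in which no single contribution is much larger than the observed $m_Z^2/2$.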

61 citations


Posted Content
Isaac Elias, Heiga Zen, Jonathan Shen, Yu Zhang, Ye Jia, Ron Weiss, Yonghui Wu
TL;DR: A non-autoregressive neural text-to-speech model augmented with a variational autoencoder-based residual encoder, called Parallel Tacotron, is highly parallelizable during both training and inference, allowing efficient synthesis on modern parallel hardware.
Abstract: Although neural end-to-end text-to-speech models can synthesize highly natural speech, there is still room for improvement in their efficiency and naturalness. This paper proposes a non-autoregressive neural text-to-speech model augmented with a variational autoencoder-based residual encoder. This model, called Parallel Tacotron, is highly parallelizable during both training and inference, allowing efficient synthesis on modern parallel hardware. The use of the variational autoencoder relaxes the one-to-many mapping nature of the text-to-speech problem and improves naturalness. To further improve the naturalness, we use lightweight convolutions, which can efficiently capture local contexts, and introduce an iterative spectrogram loss inspired by iterative refinement. Experimental results show that Parallel Tacotron matches a strong autoregressive baseline in subjective evaluations with significantly decreased inference time.
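
A hedged sketch of the variational-autoencoder residual encoder idea described above, written in PyTorch; the module layout, sizes, and time-pooling are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: a VAE-style residual encoder that summarizes a mel spectrogram
# into a latent vector used to condition a non-autoregressive decoder.
import torch
import torch.nn as nn

class ResidualVAEEncoder(nn.Module):
    def __init__(self, n_mels=80, hidden=256, latent_dim=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_mels, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.to_mean = nn.Linear(hidden, latent_dim)
        self.to_logvar = nn.Linear(hidden, latent_dim)

    def forward(self, mel):                      # mel: [batch, n_mels, frames]
        h = self.conv(mel).mean(dim=2)           # average-pool over time
        mean, logvar = self.to_mean(h), self.to_logvar(h)
        z = mean + torch.randn_like(mean) * torch.exp(0.5 * logvar)  # reparameterization
        kl = -0.5 * torch.sum(1 + logvar - mean.pow(2) - logvar.exp(), dim=1)
        return z, kl                             # z conditions the decoder; kl enters the loss
```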

54 citations


Proceedings ArticleDOI
25 Oct 2020
TL;DR: It is shown that the reliability of deep learning-based naturalness prediction can be improved by transfer learning from speech quality prediction models that are trained on objective POLQA scores.
Abstract: In this paper, we present a new objective prediction model for synthetic speech naturalness. It can be used to evaluate Text-To-Speech or Voice Conversion systems and works language independently. The model is trained end-to-end and is based on a CNN-LSTM network that has previously been shown to give good results for speech quality estimation. We trained and tested the model on 16 different datasets, such as those from the Blizzard Challenge and the Voice Conversion Challenge. Further, we show that the reliability of deep learning-based naturalness prediction can be improved by transfer learning from speech quality prediction models that are trained on objective POLQA scores. The proposed model is made publicly available and can, for example, be used to evaluate different TTS system configurations.
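
The transfer-learning step described above can be pictured with a short PyTorch sketch; the architecture details and the checkpoint name are assumptions for illustration, not the released model.

```python
# Sketch: a CNN-LSTM regressor over mel spectrograms that predicts a single
# naturalness (MOS-like) score, optionally warm-started from a quality model.
import torch
import torch.nn as nn

class CNNLSTMPredictor(nn.Module):
    def __init__(self, n_mels=48, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.lstm = nn.LSTM(input_size=32 * (n_mels // 4), hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, spec):                     # spec: [batch, 1, n_mels, frames]
        h = self.cnn(spec)                       # [batch, 32, n_mels//4, frames//4]
        h = h.permute(0, 3, 1, 2).flatten(2)     # [batch, frames//4, features]
        out, _ = self.lstm(h)
        return self.head(out.mean(dim=1))        # pool over time, predict one score

model = CNNLSTMPredictor()
# Transfer learning: load weights pre-trained on objective quality scores
# (hypothetical checkpoint path), then fine-tune on subjective naturalness ratings.
# model.load_state_dict(torch.load("quality_pretrained.pt"), strict=False)
```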

43 citations


Journal ArticleDOI
TL;DR: In this article, the one-loop running of the dimension-six CP-even Higgs operators in the Standard Model effective field theory involving the right-handed component of the would-be Dirac neutrinos was investigated.
Abstract: We compute the one-loop running of the dimension-six CP-even Higgs operators in the Standard Model effective field theory involving the right-handed component of the would-be Dirac neutrinos. Then, on the basis of naturalness arguments, for some operators we obtain bounds that surpass direct constraints by orders of magnitude. We also discuss the implications of a large Dirac neutrino magnetic dipole moment. In particular, we demonstrate that a neutrino magnetic moment explaining the recent XENON1T excess induces Higgs and Z invisible decays with branching ratios in the range [10⁻¹⁸, 10⁻¹²]. These numbers are unfortunately beyond the reach of current and near future facilities.
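
For readers unfamiliar with the operator in question, the Dirac neutrino magnetic moment arises at dimension six; the following is a schematic form in a common convention (the precise normalization, and a companion operator with $W^a_{\mu\nu}$, are not fixed by the abstract):

$$
\mathcal{L} \supset \frac{C_{\nu B}}{\Lambda^2}\,\big(\bar{L}_L\,\sigma^{\mu\nu}\nu_R\big)\tilde{H}\,B_{\mu\nu} + \text{h.c.}
\;\longrightarrow\;
\frac{\mu_\nu}{2}\,\bar{\nu}\,\sigma^{\mu\nu}\nu\,F_{\mu\nu},
\qquad
\mu_\nu \sim \frac{v\,C_{\nu B}}{\Lambda^2},
$$

after electroweak symmetry breaking with $\langle H\rangle = v/\sqrt{2}$; the same coefficient then feeds into the Higgs and Z invisible widths quoted above.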

34 citations


Journal ArticleDOI
TL;DR: In this paper, a review of governmental sources and scientific literature related to food naturalness and its evaluation is presented, which highlights the need for a more extended basis for naturalness evaluation of food ingredients.
Abstract: Background: Food naturalness has been the subject of several recent studies and is a key trend in the food industry. There is currently no comprehensive legal definition of food naturalness, which is a multi-faceted and complex principle composed of many aspects. The naturalness-influencing aspects of food ingredients are similar to those already investigated for finished food products.
Scope and approach: Two research questions are posed in this review:
• To what extent are the naturalness criteria for food ingredients set by ISO technical specification 19657 “Definitions and technical criteria for food ingredients to be considered as natural” in line with the latest trends in consumer studies, reviews and reports on the topic?
• What aspects contributing to the naturalness of food ingredients are the most present across food ingredient categories?
The first question is answered through a review of governmental sources and scientific literature related to food naturalness and its evaluation. The second question is addressed through four case studies.
Key findings and conclusions: ISO TS 19657 evaluates food ingredients' naturalness but only partially fulfils consumers' requests. To build a more comprehensive evaluation system, other aspects, e.g. farming practices, should be taken into consideration. The case studies presented in this review highlight the need for a broader basis for evaluating the naturalness of food ingredients. A gap emerged between the technical and safety need for processing and consumer perceptions of processing in relation to naturalness.

30 citations


Posted Content
TL;DR: An overview of effective field theory (EFT) methods can be found in this paper, where toy model EFTs, chiral perturbation theory, Fermi liquid theory, and non-relativistic QED are discussed.
Abstract: These notes are an overview of effective field theory (EFT) methods. I discuss toy model EFTs, chiral perturbation theory, Fermi liquid theory, and non-relativistic QED, and use these examples to introduce a variety of EFT concepts, including: matching at tree and loop level; summation of large logarithms; naturalness problems; spurion fields; coset construction; Wess-Zumino-Witten terms; chiral and gauge anomalies; field rephasings; and method of regions. These lecture notes were prepared for the 2nd Joint ICTP-Trieste/ICTP-SAIFR school on Particle Physics, Sao Paulo, Brazil, June 22 - July 3, 2020.
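
As a one-line reminder of what "summation of large logarithms" amounts to (standard EFT material, used here only as an illustration): a Wilson coefficient fixed by matching at the scale Λ is run down with its anomalous dimension, which resums the leading logs. In a common convention,

$$
\mu\frac{dC}{d\mu} = \frac{\gamma_0\,\alpha(\mu)}{4\pi}\,C,
\qquad
\mu\frac{d\alpha}{d\mu} = -\frac{\beta_0}{2\pi}\,\alpha^2
\;\;\Longrightarrow\;\;
C(\mu) = C(\Lambda)\left[\frac{\alpha(\mu)}{\alpha(\Lambda)}\right]^{-\gamma_0/(2\beta_0)} ,
$$

so that powers of $\alpha\log(\Lambda/\mu)$ are collected to all orders instead of being treated term by term.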

25 citations


Posted Content
TL;DR: In this article, the LHC data have qualitatively modified the hierarchy problem into the Loerarchy Problem, where the objective is to generate an IR scale without accompanying visible structure.
Abstract: We begin this thesis with an extensive pedagogical introduction aimed at clarifying the foundations of the hierarchy problem. After introducing effective field theory, we discuss renormalization at length from a variety of perspectives. We focus on conceptual understanding and connections between approaches, while providing a plethora of examples for clarity. With that background we can then clearly understand the hierarchy problem, which is reviewed primarily by introducing and refuting common misconceptions thereof. We next discuss some of the beautiful classic frameworks to approach the issue. However, we argue that the LHC data have qualitatively modified the issue into 'The Loerarchy Problem' (how to generate an IR scale without accompanying visible structure), and we discuss recent work on this approach. In the second half, we present some of our own work in these directions, beginning with explorations of how the Neutral Naturalness approach motivates novel signatures of electroweak naturalness at a variety of physics frontiers. Finally, we propose a New Trail for Naturalness and suggest that the physical breakdown of EFT, which gravity demands, may be responsible for the violation of our EFT expectations at the LHC.
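
For concreteness, the hierarchy problem discussed in this thesis is usually summarized by the quadratic sensitivity of the Higgs mass parameter to heavy thresholds: schematically, a new state of mass $M$ coupled to the Higgs with strength $g$ contributes

$$
\delta m_h^2 \sim \frac{g^2}{16\pi^2}\,M^2 ,
$$

so keeping $m_h \ll M$ requires either a finely tuned cancellation or new structure (a symmetry, compositeness, or, as argued here, a breakdown of the EFT expectation itself) near the weak scale.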

24 citations


Journal ArticleDOI
TL;DR: In this article, the color-neutral top partners generate the Higgs potential radiatively without quadratic divergence, and a pseudo Nambu-Goldstone Higgs arising from SO(5)/SO(4) breaking is constructed.
Abstract: We build a minimal neutral naturalness model in which the top partners are not charged under QCD, with a pseudo Nambu-Goldstone Higgs arising from SO(5)/SO(4) breaking. The color-neutral top partners generate the Higgs potential radiatively without quadratic divergence. The misalignment between the electroweak scale and the global symmetry breaking scale is naturally obtained from suppression of the Higgs quadratic term, due to a cancellation between the singlet and doublet top partner contributions. This model can be embedded into an ultraviolet holographic setup in the composite Higgs framework, which even realizes a finite Higgs potential.
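
A schematic form of the radiatively generated pseudo Nambu-Goldstone potential and the misalignment referred to above, in generic SO(5)/SO(4) notation (the coefficients here are placeholders, not the paper's results):

$$
V(h) \simeq -\gamma\,\sin^2\!\frac{h}{f} + \beta\,\sin^4\!\frac{h}{f},
\qquad
\xi \equiv \frac{v^2}{f^2} = \sin^2\!\frac{\langle h\rangle}{f} = \frac{\gamma}{2\beta},
$$

so a suppression of the quadratic coefficient γ relative to β (here from the cancellation between singlet and doublet top partner loops) yields $v \ll f$.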

22 citations


Journal ArticleDOI
TL;DR: In this article, the authors discuss naturalness in the context of the relatively weak binding of nuclei, where discrete scale invariance plays a role in the emergence of complexity, and discuss the role of the naturalness assumption in nuclear effective field theories.
Abstract: Nuclear effective field theories (EFTs) have been developed over the last quarter-century with considerable impact on the description of light and even medium-mass nuclei. At the core of any EFT is a systematic expansion of observables, which is usually obtained from a rule based on an assumption of naturalness. I discuss naturalness in the context of the relatively weak binding of nuclei, where discrete scale invariance plays a role in the emergence of complexity.
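
As a reminder of what discrete scale invariance means in this setting (the standard three-body Efimov result, quoted only for illustration): instead of continuous scale invariance, observables repeat under rescaling by a fixed factor,

$$
\Lambda \to \lambda_0^{\,n}\,\Lambda, \qquad \lambda_0 = e^{\pi/s_0} \approx 22.7 \quad (s_0 \approx 1.00624),
$$

which governs, for example, the geometric tower of three-boson bound states at the unitary limit.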

19 citations


Journal ArticleDOI
TL;DR: In this paper, the Brazilian Atlantic Forest was used as an object of study: information about 967 secondary, mature and restoration forests was compiled over a wide geographical extent, and 14 ecological indicators were assessed in a sampling area of 1,928,024 m².

19 citations


Proceedings ArticleDOI
25 Oct 2020
TL;DR: The authors investigated to what extent multilingual multi-speaker modeling can be an alternative to monolingual multi-Speaker modeling, and explored how data from foreign languages may best be combined with low-resource language data.
Abstract: Recent advances in neural TTS have led to models that can produce high-quality synthetic speech. However, these models typically require large amounts of training data, which can make it costly to produce a new voice with the desired quality. Although multi-speaker modeling can reduce the data requirements necessary for a new voice, this approach is usually not viable for many low-resource languages for which abundant multi-speaker data is not available. In this paper, we therefore investigated to what extent multilingual multi-speaker modeling can be an alternative to monolingual multi-speaker modeling, and explored how data from foreign languages may best be combined with low-resource language data. We found that multilingual modeling can increase the naturalness of low-resource language speech, showed that multilingual models can produce speech with a naturalness comparable to monolingual multi-speaker models, and saw that the target language naturalness was affected by the strategy used to add foreign language data.

Proceedings ArticleDOI
25 Oct 2020
TL;DR: UTACO demonstrates that attention can be successfully applied to the singing synthesis field, showing a strong improvement in naturalness over previous neural singing synthesis models.
Abstract: We present UTACO, a singing synthesis model based on an attention-based sequence-to-sequence mechanism and a vocoder based on dilated causal convolutions. These two classes of models have significantly affected the field of text-to-speech, but have never been thoroughly applied to the task of singing synthesis. UTACO demonstrates that attention can be successfully applied to the singing synthesis field and improves naturalness over the state of the art. The system requires considerably less explicit modelling of voice features such as F0 patterns, vibratos, and note and phoneme durations than previous models in the literature. Despite this, it shows a strong improvement in naturalness with respect to previous neural singing synthesis models. The model does not require any durations or pitch patterns as inputs, and learns to insert vibrato autonomously according to the musical context. However, we observe that, by completely dispensing with any explicit duration modelling, it becomes harder to obtain the fine control of timing needed to exactly match the tempo of a song.
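
A hedged sketch of the dilated causal convolution stack of the kind used in such vocoders (WaveNet-style); layer counts and channel sizes are illustrative, not the UTACO configuration.

```python
# Minimal sketch: a stack of dilated causal 1-D convolutions whose receptive
# field grows exponentially with depth while never looking at future samples.
import torch
import torch.nn as nn

class DilatedCausalConv(nn.Module):
    def __init__(self, channels, dilation, kernel_size=3):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-pad so no future samples leak in
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                        # x: [batch, channels, time]
        return self.conv(nn.functional.pad(x, (self.pad, 0)))

stack = nn.Sequential(*[DilatedCausalConv(64, 2 ** i) for i in range(8)])
y = stack(torch.randn(1, 64, 16000))             # receptive field: 1 + 2 * (2**8 - 1) samples
```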

Posted Content
TL;DR: In this article, the authors present lecture notes for a one-semester course or for self-study to understand the string landscape and its relation to hierarchy problems and naturalness at a reasonably technical level.
Abstract: The cosmological constant and electroweak hierarchy problem have been a great inspiration for research. Nevertheless, the resolution of these two naturalness problems remains mysterious from the perspective of a low-energy effective field theorist. The string theory landscape and a possible string-based multiverse offer partial answers, but they are also controversial for both technical and conceptual reasons. The present lecture notes, suitable for a one-semester course or for self-study, attempt to provide a technical introduction to these subjects. They are aimed at graduate students and researchers with a solid background in quantum field theory and general relativity who would like to understand the string landscape and its relation to hierarchy problems and naturalness at a reasonably technical level. Necessary basics of string theory are introduced as part of the course. This text will also benefit graduate students who are in the process of studying string theory at a deeper level. In this case, the present notes may serve as additional reading beyond a formal string theory course.

Journal ArticleDOI
TL;DR: In this paper, the authors examined several theoretical aspects of this discovery plane in both the gravity-mediation NUHM2 model and the general miragemediation (GMM′) models, including the associated chargino mass m χ ˜ 1 ±, the expected regions of the bottom-up notion of electroweak naturalness Δ E W, and the expected region of stringy naturalness.

Journal ArticleDOI
01 Jan 2020
TL;DR: The qualitative and quantitative comparisons show that the proposed method outperforms others and dehazed images are restored effectively maintaining their naturalness.
Abstract: This article proposes a novel single image dehazing method using a Type-2 membership function based similarity function matrix. The proposed method estimates the depth map and global atmospheric light of the observed hazy image. The estimated depth map is then used to produce the true scene transmission. Finally, the observed hazy image is dehazed via the atmospheric scattering model using the scene transmission and global atmospheric light. Qualitative and quantitative comparisons of the proposed method with benchmarked state-of-the-art methods are presented. The experiments have been extensively performed on benchmarked natural hazy images, the MiddleBury Stereo dataset, the REalistic Single Image DEhazing (RESIDE) dataset, the RESIDE-β dataset, and the Stanford ImageNet dataset. The performance metrics used for comparison are peak signal-to-noise ratio and structural similarity index as quantitative measures, and lightness order error and naturalness image quality evaluator as qualitative measures. Moreover, the detection results using YOLOv2 on the RESIDE-β dataset have also been compared in terms of F1-score and area-under-curve measures. The qualitative and quantitative comparisons show that the proposed method outperforms the others and that dehazed images are restored effectively while maintaining their naturalness.
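
The atmospheric scattering model mentioned above is standard; a minimal sketch of the final recovery step follows (the paper's Type-2 similarity-matrix estimation of the depth map and transmission is not reproduced here, so `t` and `A` are assumed given).

```python
# Sketch: recover scene radiance J from a hazy image I with the standard
# atmospheric scattering model I = J * t + A * (1 - t).
import numpy as np

def dehaze(I, t, A, t_min=0.1):
    """I: hazy image [H, W, 3] in [0, 1]; t: transmission map [H, W]; A: airlight [3]."""
    t = np.clip(t, t_min, 1.0)[..., None]        # lower-bound t to avoid amplifying noise
    J = (I - A) / t + A                          # invert the scattering model
    return np.clip(J, 0.0, 1.0)
```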

Journal ArticleDOI
Abstract: So-called natural food is one of the most significant current trends in the food business. Despite this trend, previous research on the measurement of naturalness has made no distinction between different groups of consumers. Therefore, the objective of this study is to explore the attributes important to millennial university students when evaluating food naturalness. The study is based on a questionnaire administered to a sample of 372 respondents. Using a partial least squares (PLS) methodology, it performs a standard confirmatory factor analysis for measurement and validation. As a result, it identifies one attribute linked to how the food is grown and eight attributes associated with how it is produced and processed. These findings have several implications. Apart from testing previous scales in a millennial context, they confirm that market strategies must take different understandings of naturalness into account, contingent on the consumer group.

Journal ArticleDOI
TL;DR: Several concerns were raised as a result of the review, including the reliability and validity of measures, inadequate definitions of terminology, lack of detail in method descriptions, and the need to address relationships between naturalness and other variables included in the studies.
Abstract: The concept of speech naturalness is used widely in clinic and research applications. Unfortunately, the lack of consistency in research methods means that comparing findings between studies is dif...

Journal ArticleDOI
TL;DR: In this article, the past of the universe, extrapolated from standard physics and measured cosmological parameters, might be a non-singular bounce, and quite stringent constraints can be put on the reheating temperature and number of inflationary e-folds, basically fixing $T_{RH}\sim T_{GUT}$ and $N\sim 70$.
Abstract: In this article, we argue that the past of the Universe, extrapolated from standard physics and measured cosmological parameters, might be a non-singular bounce. We also show that, in this framework, quite stringent constraints can be put on the reheating temperature and number of inflationary e-folds, basically fixing $$T_{RH}\sim T_{GUT}$$ and $$N\sim 70$$. We draw some conclusions about the shape of the inflaton potential and raise the “naturalness” issue in this context. Finally, we argue that this could open a very specific window on the “pre big bounce" universe.

Journal ArticleDOI
TL;DR: In this article, the background evolution is presented, as a function of the parameters controlling the cosmic evolution, and the primordial tensor spectrum is also calculated and possible observational footprints of the model are underlined.
Abstract: Recent data suggest that the Universe could be positively curved. Combined with an inflationary stage, this might lead to a curvature bounce instead of the Big Bang. The background evolution is presented, as a function of the parameters controlling the cosmic evolution. The primordial tensor spectrum is also calculated and possible observational footprints of the model are underlined. Several potentials are considered and general remarks are made about "naturalness" in this context.

Journal ArticleDOI
Abstract: Effective Quantum Field Theories (EFTs) are effective insofar as they apply within a prescribed range of length-scales, but within that range they predict and describe with extremely high accuracy and precision. The effectiveness of EFTs is explained by identifying the features – the scaling behaviour of the parameters – which lead to effectiveness. The explanation relies on distinguishing autonomy with respect to changes in microstates, from autonomy with respect to changes in microlaws, and relating these, respectively, to renormalisability and naturalness. It is claimed that the effectiveness of EFTs is a consequence of each theory’s microstate-autonomy rather than its microlaw-autonomy.

Journal ArticleDOI
01 Jan 2020-Mind
TL;DR: It is argued that the epistemologist should borrow the metaphysician’s concept of naturalness and assign higher priors to more natural hypotheses.
Abstract: Many epistemological problems can be solved by the objective Bayesian view that there are rationality constraints on priors, that is, inductive probabilities. But attempts to work out these constraints have run into such serious problems that many have rejected objective Bayesianism altogether. I argue that the epistemologist should borrow the metaphysician’s concept of naturalness and assign higher priors to more natural hypotheses.
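
A toy numerical sketch of the proposal (my own illustration, not from the paper): two hypotheses fit the evidence equally well, but the more natural one receives a higher prior and therefore a higher posterior.

```python
# Toy illustration: priors proportional to an (invented) naturalness weight,
# updated on the same evidence via Bayes' theorem.
naturalness = {"green": 1.0, "grue": 0.05}                # hypothetical naturalness scores
total = sum(naturalness.values())
prior = {h: w / total for h, w in naturalness.items()}
likelihood = {"green": 0.8, "grue": 0.8}                  # both fit the data equally well
evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
print(posterior)                                           # the more natural hypothesis dominates
```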

Journal ArticleDOI
TL;DR: In this article, the authors set out some of the key themes addressed by the papers in the special issue on "Confronting the Naturalness of disaster in the Pacific".
Abstract: This introduction sets out some of the key themes addressed by the papers in the special issue on ‘Confronting the Naturalness of Disaster in the Pacific’. Disasters are now widely understood not a...

Journal ArticleDOI
TL;DR: The article describes how the idea of “naturalness” was used by three different groups in arguments over the risk of livestock vaccines developed in synthetic biology.
Abstract: The article describes how the idea of “naturalness” was used by three different groups in arguments over the risk of livestock vaccines developed in synthetic biology. Based on interviews with two ...

Journal ArticleDOI
TL;DR: In this paper, the authors analyzed magnetic and axial two-nucleon contact terms in a combined large-$N_c$ and pionless effective field theory expansion, and showed that the large-$N_c$ expansion hints towards a hierarchy between the two leading-order magnetic terms that matches that found in phenomenological fits.
Abstract: We analyze magnetic and axial two-nucleon contact terms in a combined large-$N_c$ and pionless effective field theory expansion. These terms play important roles in correctly describing, e.g., the low-energy cross section of radiative neutron capture and the deuteron magnetic moment. We show that the large-$N_c$ expansion hints towards a hierarchy between the two leading-order magnetic terms that matches that found in phenomenological fits. We also comment on the issue of naturalness in different Lagrangian bases.

Proceedings ArticleDOI
09 Nov 2020
TL;DR: This paper proposes the first methodology and system design to quantify, improve, and tune the privacy-utility trade-off, while simultaneously also improving the naturalness of the generated images.
Abstract: Image data analysis techniques such as facial recognition can threaten individuals' privacy. Whereas privacy risks often can be reduced by adding noise to the data, this approach reduces the utility of the images. For this reason, image de-identification techniques typically replace directly identifying features (e.g., faces, car number plates) present in the data with synthesized features, while still preserving other non-identifying features. As of today, existing techniques mostly focus on improving the naturalness of the generated synthesized images, without quantifying their impact on privacy. In this paper, we propose the first methodology and system design to quantify, improve, and tune the privacy-utility trade-off, while simultaneously also improving the naturalness of the generated images. The system design is broken down into three components that address separate but complementing challenges. This includes a two-step cluster analysis component to extract low-dimensional feature vectors representing the images (embedding) and to cluster the images into fixed-sized clusters. While the importance of good clustering mostly has been neglected in previous work, we find that our novel approach of using low-dimensional feature vectors can improve the privacy-utility trade-off by better clustering similar images. The use of these embeddings has been found particularly useful when wanting to ensure high naturalness and utility of the synthetically generated images. By combining improved clustering and incorporating StyleGAN, a state-of-the-art Generative Neural Network, into our solution, we produce more realistic synthesized faces than prior works, while also better preserving properties such as age, gender, skin tone, or even emotional expressions. Finally, our iterative tuning method exploits non-linear relations between privacy and utility to identify good privacy-utility trade-offs. We note that an example benefit of these improvements is that our solution allows car manufacturers to train their autonomous vehicles while complying with privacy laws.
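
A hedged sketch of the two-step cluster-analysis component described above; the embedding model, the clustering choice, and the generator call are placeholders rather than the paper's pipeline.

```python
# Sketch: embed face crops into low-dimensional vectors, cluster them, and
# synthesize one replacement face per cluster. `embed_faces` and
# `generate_face` are hypothetical stand-ins for the paper's components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def anonymize(face_crops, embed_faces, generate_face, n_clusters=50, dim=32):
    feats = embed_faces(face_crops)                   # [N, D] high-dimensional embeddings
    low = PCA(n_components=dim).fit_transform(feats)  # step 1: low-dimensional feature vectors
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(low)  # step 2: clustering
    replacements = {c: generate_face(low[labels == c].mean(axis=0))     # one synthetic face
                    for c in range(n_clusters)}                         # per cluster
    return [replacements[c] for c in labels]          # each image gets its cluster's face
```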

01 Jan 2020
TL;DR: In this article, it is argued that there is much more complexity to the concept/property relation than the natural thought seems to presuppose, and that we look for concepts that play a role in explanations of things that cry out for explanation.
Abstract: On a view implicitly endorsed by many, a concept is epistemically better than another if and because it does a better job at ‘carving at the joints’, or if the property corresponding to it is ‘more natural’ than the one corresponding to another. This chapter offers an argument against this seemingly plausible thought, starting from three key observations about the way we use and evaluate concepts from an epistemic perspective: that we look for concepts that play a role in explanations of things that cry out for explanation; that we evaluate not only ‘empirical’ concepts, but also mathematical and perhaps moral concepts from an epistemic perspective; and that there is much more complexity to the concept/property relation than the natural thought seems to presuppose. These observations, it is argued, rule out giving a theory of conceptual evaluation that is a corollary of a metaphysical ranking of the relevant properties.

Journal ArticleDOI
TL;DR: In this article, the past of the universe, extrapolated from standard physics and measured cosmological parameters, might be a non-singular bounce, and quite stringent constraints can be put on the reheating temperature and number of inflationary e-folds, basically fixing $T_{RH}\sim T_{GUT}$ and $N\sim 70$.
Abstract: In this letter, we argue that the past of the Universe, extrapolated from standard physics and measured cosmological parameters, might be a non-singular bounce. We also show that, in this framework, quite stringent constraints can be put on the reheating temperature and number of inflationary e-folds, basically fixing $T_{RH}\sim T_{GUT}$ and $N\sim 70$. We draw some conclusions about the shape of the inflaton potential and raise the "naturalness" issue in this context. Finally, we argue that this could open a very specific window on the "pre big bounce" universe.

Journal ArticleDOI
TL;DR: Perceptual speech naturalness for both transfeminine and transmasculine speakers is strongly associated with gender cues in spontaneous speech.
Abstract: Purpose The purpose of this study was to investigate how speech naturalness relates to masculinity–femininity and gender identification (accuracy and reaction time) for cisgender male and female sp...

Proceedings ArticleDOI
25 Oct 2020
TL;DR: This work explored a way to include linguistic features in the sequence-to-sequence Tacotron2 system to improve the naturalness of the generated voice, making the prosody of the synthesized speech more like that of the real human speaker.
Abstract: State-of-the-art end-to-end speech synthesis models have reached levels of quality close to human capabilities. However, there is still room for improvement in terms of naturalness, related to prosody, which is essential for human-machine interaction. Therefore, part of current research has shifted its focus to improving this aspect, with many solutions that mainly involve prosody adaptability or control. In this work, we explored a way to include linguistic features in the sequence-to-sequence Tacotron2 system to improve the naturalness of the generated voice, that is, to make the prosody of the synthesized speech more like that of the real human speaker. Specifically, we embedded part-of-speech tags and punctuation mark locations of the input text with an additional encoder to condition Tacotron2's generation. We propose two different architectures for this parallel encoder: one based on a stack of convolutional plus recurrent layers, and another formed by a stack of bidirectional recurrent plus linear layers. To evaluate the similarity between real read speech and the synthesis, we carried out an objective test using signal processing metrics and a perceptual test. The presented results show that we achieved an improvement in naturalness.
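
A hedged sketch of the second parallel-encoder variant described above (bidirectional recurrent plus linear layers over part-of-speech and punctuation features); dimensions and the way its output is combined with the Tacotron2 text encoder are assumptions.

```python
# Sketch: encode POS tags and punctuation-location features with a BiLSTM +
# linear stack, to be concatenated with Tacotron2 encoder outputs.
import torch
import torch.nn as nn

class LinguisticEncoder(nn.Module):
    def __init__(self, n_pos_tags=20, n_punct=5, emb=64, hidden=128, out=128):
        super().__init__()
        self.pos_emb = nn.Embedding(n_pos_tags, emb)
        self.punct_emb = nn.Embedding(n_punct, emb)
        self.rnn = nn.LSTM(2 * emb, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, out)

    def forward(self, pos_ids, punct_ids):       # both: [batch, text_len]
        x = torch.cat([self.pos_emb(pos_ids), self.punct_emb(punct_ids)], dim=-1)
        h, _ = self.rnn(x)
        return self.proj(h)                      # [batch, text_len, out]

# Illustrative conditioning: concatenate with the Tacotron2 encoder output
# along the feature axis before the attention/decoder step, e.g.
# memory = torch.cat([tacotron_memory, ling(pos_ids, punct_ids)], dim=-1)
```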