Topic

Naturalness

About: Naturalness is a research topic. Over its lifetime, 1,305 publications have been published within this topic, receiving 31,737 citations.


Papers
Journal ArticleDOI
TL;DR: This paper argues that Williams's new sceptical challenge to Lewis's criterion of naturalness rests on the problematic assumption that the arithmetical interpretation is independent of fundamental properties and relations, and that the criterion, properly applied, can answer the challenge.
Abstract: The criterion of naturalness represents David Lewis’s attempt to answer some of the sceptical arguments in (meta-)semantics by comparing the naturalness of meaning candidates. Recently, the criterion has been challenged by a new sceptical argument. Williams argues that the criterion cannot rule out candidates which are not permuted versions of an intended interpretation. He presents such a candidate – the arithmetical interpretation (a specific instantiation of Henkin’s model) – and argues that it opens up the possibility of Pythagorean worlds, i.e. worlds similar to ours in which the arithmetical interpretation is the best candidate for a semantic theory. The aim of this paper is (a) to reconsider the general conditions for the applicability of Lewis’s criterion of naturalness and (b) to show that Williams’s new sceptical challenge is based on the problematic assumption that the arithmetical interpretation is independent of fundamental properties and relations. As I show, if the criterion of naturalness is applied properly, it can respond even to the new sceptical challenge.

4 citations

Proceedings ArticleDOI
23 May 2022
TL;DR: In this article, an autoregressive left-right no-skip hidden Markov model, defined by a neural network, is proposed to replace the non-monotonic attention in neural sequence-to-sequence TTS.
Abstract: Neural sequence-to-sequence TTS has achieved significantly better output quality than statistical speech synthesis using HMMs. However, neural TTS is generally not probabilistic and uses non-monotonic attention. Attention failures increase training time and can make synthesis babble incoherently. This paper describes how the old and new paradigms can be combined to obtain the advantages of both worlds, by replacing attention in neural TTS with an autoregressive left-right no-skip hidden Markov model defined by a neural network. Based on this proposal, we modify Tacotron 2 to obtain an HMM-based neural TTS model with monotonic alignment, trained to maximise the full sequence likelihood without approximation. We also describe how to combine ideas from classical and contemporary TTS for best results. The resulting example system is smaller and simpler than Tacotron 2, and learns to speak with fewer iterations and less data, whilst achieving comparable naturalness prior to the post-net. Our approach also allows easy control over speaking rate.
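The core of the approach described above is replacing attention with a left-right no-skip HMM, whose alignment between text states and speech frames is monotonic by construction: at each frame the model either stays in the current state or advances to the next one. A minimal sketch of the full-sequence likelihood computation for such an HMM is shown below (this is an illustrative forward algorithm, not the paper's implementation; in the actual model the per-frame emission log-probabilities and transition probabilities would be produced by a neural network, whereas here they are plain numbers):

```python
import math

def logsumexp2(a, b):
    # Numerically stable log(exp(a) + exp(b)).
    if a == -math.inf:
        return b
    if b == -math.inf:
        return a
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def forward_loglik(emit_logp, stay_logp):
    """Exact full-sequence log-likelihood of a left-right no-skip HMM.

    emit_logp[t][s]: log-probability of frame t under state s.
    stay_logp[s]:    log-probability of the self-loop at state s; the only
                     alternative is advancing to state s+1, so every
                     alignment is monotonic with no skipped states.
    """
    T = len(emit_logp)      # number of speech frames
    S = len(emit_logp[0])   # number of states
    # Probability of advancing is 1 - exp(stay_logp), in log space.
    move_logp = [math.log1p(-math.exp(p)) if p < 0 else -math.inf
                 for p in stay_logp]
    # alpha[s]: log-prob of frames 0..t over all alignments ending in state s.
    alpha = [-math.inf] * S
    alpha[0] = emit_logp[0][0]          # must start in the first state
    for t in range(1, T):
        new = [-math.inf] * S
        for s in range(S):
            p = alpha[s] + stay_logp[s]                        # stayed in s
            if s > 0:
                p = logsumexp2(p, alpha[s - 1] + move_logp[s - 1])  # advanced
            new[s] = p + emit_logp[t][s]
        alpha = new
    return alpha[S - 1]                 # must finish in the last state
```

Because the alignment lattice only permits "stay" or "advance by one" moves, this likelihood can be maximised exactly, without the approximations or attention failures mentioned in the abstract; slowing or speeding the self-loop probabilities directly changes the speaking rate.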

4 citations


Network Information
Related Topics (5)
Statistical model – 19.9K papers, 904.1K citations – 69% related
Sentence – 41.2K papers, 929.6K citations – 69% related
Vocabulary – 44.6K papers, 941.5K citations – 67% related
Detector – 146.5K papers, 1.3M citations – 67% related
Cluster analysis – 146.5K papers, 2.9M citations – 66% related
Performance Metrics
No. of papers in the topic in previous years:
Year – Papers
2023 – 282
2022 – 610
2021 – 82
2020 – 63
2019 – 83
2018 – 52