Topic

Sigmoid function

About: Sigmoid function is a research topic. Over its lifetime, 2,228 publications on this topic have been published, receiving 59,557 citations. The topic is also known as the S curve.
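
For reference, the sigmoid most often meant in this context is the logistic function σ(x) = 1 / (1 + e^(-x)), which maps the real line onto (0, 1) and traces the characteristic S curve. Below is a minimal NumPy sketch, not tied to any of the papers listed here; the split by sign is only there to avoid overflow in exp() for inputs of large magnitude.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid sigma(x) = 1 / (1 + exp(-x)); an S-shaped map from R onto (0, 1)."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.empty_like(x)
    pos = x >= 0
    # Evaluate each half with the numerically safe form to avoid exp() overflow.
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    expx = np.exp(x[~pos])
    out[~pos] = expx / (1.0 + expx)
    return out

print(sigmoid([-6.0, 0.0, 6.0]))  # approximately [0.0025, 0.5, 0.9975]
```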


Papers
Journal Article
TL;DR: This paper reviews possible candidate models that may be used in theoretical modelling and empirical studies of species–area relationships (SARs) and suggests two main types of species–area curves: sample curves, which are inherently convex, and isolate curves, which are sigmoid.
Abstract: Aim: This paper reviews possible candidate models that may be used in theoretical modelling and empirical studies of species–area relationships (SARs). The SAR is an important and well-proven tool in ecology. The power and the exponential functions are by far the models that are best known and most frequently applied to species–area data, but they might not be the most appropriate. Recent work indicates that the shape of species–area curves in arithmetic space is often not convex but sigmoid, and also has an upper asymptote. Methods: Characteristics of six convex and eight sigmoid models are discussed and interpretations of the different parameters summarized. The convex models include the power, exponential, Monod, negative exponential, asymptotic regression and rational functions; the sigmoid models include the logistic, Gompertz, extreme value, Morgan–Mercer–Flodin, Hill, Michaelis–Menten, Lomolino and Chapman–Richards functions plus the cumulative Weibull and beta-P distributions. Conclusions: There are two main types of species–area curves: sample curves, which are inherently convex, and isolate curves, which are sigmoid. Both types may have an upper asymptote. Only a few studies have attempted to fit convex asymptotic and/or sigmoid models to species–area data instead of the power or exponential models. Some of these or other models reviewed in this paper should be useful, especially if species–area models are to be based more on biological processes and patterns in nature than on mere curve fitting. The negative exponential function is an example of a convex model, and the cumulative Weibull distribution an example of a sigmoid model, that should prove useful. A location parameter may be added to these two and some of the other models to simulate absolute minimum area requirements.
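
To make the curve-fitting point concrete, the sketch below fits one of the sigmoid candidates named above, the cumulative Weibull model, to species–area data with SciPy. The data are synthetic (made up purely for illustration), and the parameterization S(A) = d(1 − exp(−c·A^z)) is one common form of the model, not necessarily the exact one used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_sar(area, d, c, z):
    """Cumulative-Weibull species-area model S(A) = d * (1 - exp(-c * A**z));
    d is the upper asymptote, c and z control where and how steeply the curve rises."""
    return d * (1.0 - np.exp(-c * area**z))

# Synthetic island areas and species counts, made up purely for illustration.
area = np.array([1, 5, 10, 50, 100, 500, 1000, 5000], dtype=float)
rng = np.random.default_rng(0)
species = weibull_sar(area, d=120.0, c=0.05, z=0.6) + rng.normal(0.0, 3.0, size=area.size)

params, _ = curve_fit(weibull_sar, area, species, p0=[100.0, 0.1, 0.5], maxfev=10000)
print("fitted d, c, z:", params)
```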

430 citations

Journal Article
TL;DR: The utility of a mean field theory for sigmoid belief networks based on ideas from statistical mechanics is demonstrated on a benchmark problem in statistical pattern recognition: the classification of handwritten digits.
Abstract: We develop a mean field theory for sigmoid belief networks based on ideas from statistical mechanics. Our mean field theory provides a tractable approximation to the true probability distribution in these networks; it also yields a lower bound on the likelihood of evidence. We demonstrate the utility of this framework on a benchmark problem in statistical pattern recognition: the classification of handwritten digits.
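
For context, a sigmoid belief network is a layered directed model of binary units in which each unit turns on with probability given by a sigmoid of the weighted sum of its parents; the paper's mean field theory approximates the intractable posterior over these hidden units. Below is a minimal ancestral-sampling sketch of the generative model only, with a hypothetical 4-8-16 architecture and random parameters chosen purely for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_sbn(weights, biases, top_prior, rng):
    """Ancestral sampling from a layered sigmoid belief network: every binary unit
    is on with probability sigmoid(weighted sum of its parents plus a bias)."""
    top = (rng.random(top_prior.size) < sigmoid(top_prior)).astype(float)
    layers = [top]
    for W, b in zip(weights, biases):          # walk down the layers, top to bottom
        p = sigmoid(W @ layers[-1] + b)        # Bernoulli means for the next layer
        layers.append((rng.random(p.size) < p).astype(float))
    return layers

rng = np.random.default_rng(0)
# Hypothetical 4-8-16 architecture with random parameters, for illustration only.
weights = [rng.normal(0.0, 1.0, size=(8, 4)), rng.normal(0.0, 1.0, size=(16, 8))]
biases = [np.zeros(8), np.zeros(16)]
sample = sample_sbn(weights, biases, top_prior=np.zeros(4), rng=rng)
print([layer.shape for layer in sample])  # [(4,), (8,), (16,)]
```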

428 citations

Proceedings Article
01 Nov 2018
TL;DR: This paper introduces CROWN, a general framework to certify robustness of neural networks with general activation functions for given input data points and facilitates the search for a tighter certified lower bound by adaptively selecting appropriate surrogates for each neuron activation.
Abstract: Finding the minimum distortion of adversarial examples, and thus certifying robustness of neural network classifiers, is known to be a challenging problem. Nevertheless, it has recently been shown to be possible to give a non-trivial certified lower bound on the minimum distortion, and some progress has been made in this direction by exploiting the piecewise-linear nature of ReLU activations. However, generic robustness certification for general activation functions remains largely unexplored. To address this issue, in this paper we introduce CROWN, a general framework to certify robustness of neural networks with general activation functions. The novelty in our algorithm consists of bounding a given activation function with linear and quadratic functions, allowing it to tackle general activation functions including, but not limited to, four popular choices: ReLU, tanh, sigmoid and arctan. In addition, we facilitate the search for a tighter certified lower bound by adaptively selecting appropriate surrogates for each neuron activation. Experimental results show that CROWN on ReLU networks can notably improve the certified lower bounds compared to the current state-of-the-art algorithm Fast-Lin, while having comparable computational efficiency. Furthermore, CROWN also demonstrates its effectiveness and flexibility on networks with general activation functions, including tanh, sigmoid and arctan.
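
The bounding idea at the heart of this approach can be illustrated for the sigmoid: on an input interval [l, u], one looks for linear functions αx + β that sandwich σ(x). The sketch below is a simplified illustration rather than the paper's exact construction: it uses the endpoint chord and a midpoint tangent on purely convex or purely concave intervals, and falls back to constant bounds (valid because the sigmoid is increasing) on intervals that cross zero, where CROWN itself derives tighter tangent-based bounds.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def linear_bounds(l, u):
    """Return ((aL, bL), (aU, bU)) with aL*x + bL <= sigmoid(x) <= aU*x + bU on [l, u].

    Sigmoid is convex on (-inf, 0] and concave on [0, inf), so when the interval
    stays on one side a tangent line and the endpoint chord give valid bounds;
    for intervals crossing 0 this sketch uses conservative constant bounds,
    which are valid by monotonicity (CROWN uses tighter tangents there).
    """
    chord = (sigmoid(u) - sigmoid(l)) / (u - l)
    mid = 0.5 * (l + u)
    if u <= 0.0:    # convex part: tangent at midpoint below, chord above
        aL, bL = sigmoid_prime(mid), sigmoid(mid) - sigmoid_prime(mid) * mid
        aU, bU = chord, sigmoid(l) - chord * l
    elif l >= 0.0:  # concave part: chord below, tangent at midpoint above
        aL, bL = chord, sigmoid(l) - chord * l
        aU, bU = sigmoid_prime(mid), sigmoid(mid) - sigmoid_prime(mid) * mid
    else:           # interval straddles 0: conservative constant bounds
        aL, bL = 0.0, sigmoid(l)
        aU, bU = 0.0, sigmoid(u)
    return (aL, bL), (aU, bU)

print(linear_bounds(-3.0, -1.0))  # bounds on a convex piece
print(linear_bounds(0.5, 2.5))    # bounds on a concave piece
```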

387 citations

01 Jan 1999
TL;DR: A taxonomy of activation and output functions is proposed, the advantages of various non-local and local neural transfer functions are discussed, and several less-known types of transfer functions and new combinations of activation/output functions are described.
Abstract: The choice of transfer functions may strongly influence the complexity and performance of neural networks. Although sigmoidal transfer functions are the most common, there is no a priori reason why models based on such functions should always provide optimal decision borders. A large number of alternative transfer functions have been described in the literature. A taxonomy of activation and output functions is proposed, and the advantages of various non-local and local neural transfer functions are discussed. Several less-known types of transfer functions and new combinations of activation/output functions are described. Universal transfer functions, parametrized to change from localized to delocalized type, are of greatest interest. Other types of neural transfer functions discussed here include functions with activations based on non-Euclidean distance measures, bicentral functions formed from products or linear combinations of pairs of sigmoids, and extensions of such functions that make rotations of localized decision borders in high-dimensional spaces practical. Nonlinear input preprocessing techniques are briefly described, offering an alternative way to change the shapes of decision borders.
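
As an example of the combinations mentioned above, a bicentral function built from the product of a rising and a falling sigmoid gives a localized, window-shaped response rather than a monotone one. A minimal sketch, with illustrative function and parameter names (`bicentral`, `left`, `right`, `slope` are not taken from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bicentral(x, left, right, slope=1.0):
    """Window-shaped transfer function from a product of two sigmoids:
    close to 1 between `left` and `right`, falling toward 0 outside."""
    return sigmoid(slope * (x - left)) * sigmoid(-slope * (x - right))

x = np.linspace(-10.0, 10.0, 5)
print(bicentral(x, left=-2.0, right=2.0, slope=3.0))
# large near the centre of [-2, 2] and near zero well outside it
```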

299 citations


Network Information
Related Topics (5)

Topic                           Papers    Citations    Related
Artificial neural network       207K      4.5M         69%
Deep learning                   79.8K     2.1M         68%
Linear regression               21.3K     1.2M         68%
Convolutional neural network    74.7K     2M           67%
Sampling (statistics)           65.3K     1.2M         67%
Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2023    253
2022    674
2021    121
2020    158
2019    167
2018    134