scispace - formally typeset
Topic

Inverse trigonometric functions

About: Inverse trigonometric functions is a research topic. Over the lifetime, 854 publications have been published within this topic receiving 11141 citations. The topic is also known as: arcus function & antitrigonometric function.


Papers
Journal Article
TL;DR: In this paper, analytic expressions are given for the inverses of several trigonometric functions restricted to monotone intervals of their domains, and these expressions are applied to solve a few exercises.
Abstract: Starting from the definition of the inverse function, this paper gives analytic expressions for the inverses of several trigonometric functions on monotone intervals of their domains, and applies them to solve a few exercises.
Posted ContentDOI
18 Jan 2023
TL;DR: In this paper, a generalized series expansion of the arctangent function is derived using the enhanced midpoint integration (EMI) method, which significantly improves convergence and requires no surd numbers in the computation of the arctangent function.
Abstract: In this work we derive a generalized series expansion of the arctangent function by using enhanced midpoint integration (EMI). The algorithmic implementation of the generalized series expansion uses a simple two-step iteration. This approach significantly improves the convergence and requires no surd numbers in the computation of the arctangent function.
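The paper's EMI-derived expansion is not reproduced in the abstract, but a classic surd-free arctangent series (Euler's transformation of the Maclaurin series) illustrates the same goal of radical-free, accelerated convergence. This sketch is an illustration of that general idea, not the authors' method:

```python
import math

def arctan_series(x, terms=60):
    """Euler's surd-free series for arctan:
    arctan(x) = sum_{n>=0} a_n, where
    a_0 = x / (1 + x^2) and a_n = a_{n-1} * (2n / (2n+1)) * x^2 / (1 + x^2).
    Every term is rational in x; no square roots are needed."""
    y = x * x / (1.0 + x * x)      # common ratio factor x^2 / (1 + x^2)
    term = x / (1.0 + x * x)       # n = 0 term
    total = term
    for n in range(1, terms):
        term *= (2 * n) * y / (2 * n + 1)
        total += term
    return total
```

At x = 1 the ratio factor is 1/2, so the series gains roughly one bit of accuracy per term, far faster than the Leibniz series at the same point.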
Posted ContentDOI
06 Jul 2022
TL;DR: In this article, simple general formulas for expectations of functions of a random walk and its running extremum are derived using the inverse $Z$-transform, Fourier/Laplace inversion, and Wiener-Hopf factorization.
Abstract: We prove simple general formulas for expectations of functions of a random walk and its running extremum. Under additional conditions, we derive analytical formulas using the inverse $Z$-transform, Fourier/Laplace inversion, and Wiener-Hopf factorization, and discuss efficient numerical methods for realizing these formulas. As applications, we calculate the cumulative probability distribution function of the process and its running maximum, and the price of the option to exchange the power of a stock for its maximum. The most efficient numerical methods use a new efficient numerical realization of the inverse $Z$-transform, the sinh-acceleration technique, and the simplified trapezoid rule. A Matlab program running on a Mac with moderate specifications achieves a precision of E-10 or better in a few dozen milliseconds, and E-14 in a fraction of a second.
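The quantity in the first application, the joint distribution of a random walk and its running maximum, can be sanity-checked by brute force. The sketch below estimates it by Monte Carlo for a simple symmetric walk; it only illustrates the object being computed, not the paper's inverse $Z$-transform / Wiener-Hopf machinery, and the thresholds a and b are arbitrary choices:

```python
import random

def walk_and_max_cdf(a=0, b=4, n_steps=50, n_paths=20000, seed=0):
    """Monte Carlo estimate of P(X_N <= a, M_N <= b) for a simple
    symmetric random walk X_N with running maximum M_N."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_paths):
        x = 0   # current position X
        m = 0   # running maximum M
        for _ in range(n_steps):
            x += 1 if rng.random() < 0.5 else -1
            if x > m:
                m = x
        if x <= a and m <= b:
            hits += 1
    return hits / n_paths
```

Transform-based methods like those in the paper replace this O(paths × steps) simulation with near machine-precision evaluation.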
Posted ContentDOI
06 Apr 2023
TL;DR: In this article, the authors compare different activation functions, focusing on the arc tangent and its variations, and find that different irrationals work well for different problems: arctan ϕ performs best for multiclass classification, while arctan e gives the best results for time-series prediction.
Abstract: Deep learning has been applied in many areas and has had a significant impact on applications that address real-life challenges. Its success across a wide range of domains is due in part to activation functions, which are particularly effective at solving non-linear problems. Activation functions are a key focus for researchers in artificial intelligence who aim to improve the performance of neural networks. This article provides a comprehensive explanation and comparison of different activation functions, focusing on the arc tangent and its variations. The paper presents experimental results showing that variations of the arc tangent using irrational numbers such as pi, the golden ratio, and Euler's number, as well as a self-arctan function, produce promising results. Having experimented with these activation functions on two different problems and datasets, we found that different irrationals work well for different problems: arctan ϕ gives the best results for multiclass classification, while arctan e gives the best results for time-series prediction. To show the impact of the activation functions, the paper applies a multi-class classification problem to the Reuters Newswire dataset and a time-series prediction problem to Türkiye's energy trade value.
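The abstract names the activation variants but not their exact formulas, so the sketch below is one plausible reading in which each irrational constant scales the input of arctan; the self_arctan form is likewise a guess, not the authors' definition:

```python
import math

PHI = (1 + math.sqrt(5)) / 2   # golden ratio, phi

# Assumed forms: each irrational scales the pre-activation input.
def arctan_pi(x):
    return math.atan(math.pi * x)

def arctan_phi(x):
    return math.atan(PHI * x)

def arctan_e(x):
    return math.atan(math.e * x)

def self_arctan(x):
    # hypothetical "self-arctan": the input multiplies its own arctan
    return x * math.atan(x)
```

All three scaled variants stay bounded in (-pi/2, pi/2) like plain arctan; the scaling only changes the slope near zero, which is the usual motivation for such variants.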
Journal ArticleDOI
TL;DR: In this article, the authors present a theoretical study of an absolute, ratiometric inductive position sensor (IPS) based on eddy currents and show that the best choice is a rectangular target with rectangular receivers.
Abstract: This article presents a theoretical study of an absolute, ratiometric inductive position sensor (IPS) based on eddy currents. The aim is to describe the working principle of the sensor, having as key components a transmitting coil, the receiving coils, and the conductive target, by introducing area-of-overlap functions. We show that each target–receiver pair needs the adoption of a different reconstruction formula for the identification of the target position, whereas in the literature the usual inverse tangent function is applied for every possible pair. Then, we seek the target–receiver pair that maximizes the amplitude of the induced voltages on the receivers. The results show that to achieve the maximum value of the induced voltages, the best choice is to have a rectangular target and rectangular receivers. To verify the theory, a simulation and optimization method has been applied to the rectangular receiver coils on two rotary IPS realized with printed circuit board (PCB) technology. Measurements performed on the prototypes have shown an increment of the induced voltage of more than 57% with respect to the commonly used sinusoidal receivers. However, a linearity error of 1.5%FS is obtained by using the inverse tangent reconstruction formula. When using the formula provided by the theory, the linearity error becomes 0.6%FS for the nonoptimized prototype and below 0.15%FS for the optimized one.
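For sinusoidal receivers, the "usual inverse tangent function" discussed above is the standard ratiometric reconstruction: the target angle is recovered from the ratio of two quadrature receiver voltages, so any common coupling gain cancels. A minimal sketch, where the voltage model and the gain value are illustrative assumptions rather than details from the article:

```python
import math

def ratiometric_angle(v_sin, v_cos):
    """Standard inverse-tangent reconstruction for sinusoidal receivers.
    Only the ratio of the two receiver voltages matters, which is why
    the sensor is called ratiometric: common-mode gain cancels."""
    return math.atan2(v_sin, v_cos)

# Example: target at 30 degrees, ideal sinusoidal receiver model
theta = math.radians(30)
gain = 0.73  # arbitrary common coupling gain; it cancels in the ratio
v_sin = gain * math.sin(theta)
v_cos = gain * math.cos(theta)
recovered = math.degrees(ratiometric_angle(v_sin, v_cos))
```

The article's point is that this formula is only exact when the receiver signals really are sinusoidal; for other target-receiver shapes, a pair-specific reconstruction formula gives lower linearity error.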

Network Information
Related Topics (5)
Differential equation: 88K papers, 2M citations, 81% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 80% related
Bounded function: 77.2K papers, 1.3M citations, 79% related
Boundary value problem: 145.3K papers, 2.7M citations, 78% related
Nonlinear system: 208.1K papers, 4M citations, 77% related
Performance
Metrics
No. of papers in the topic in previous years
Year: Papers
2023: 35
2022: 98
2021: 34
2020: 27
2019: 18
2018: 14