Topic

Piecewise linear function

About: Piecewise linear function is a research topic. Over the lifetime, 8133 publications have been published within this topic receiving 161444 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors consider a class of piecewise linear (PL) equations which have been proposed to model biological control systems and prove that for the associated PL equation, all trajectories in the regions of phase space corresponding to the cyclic attractor either (i) approach a unique stable limit cycle attractor, or (ii) approach the origin, in the limit t → ∞.
Abstract: Oscillations in a class of piecewise linear (PL) equations which have been proposed to model biological control systems are considered. The flows in phase space determined by the PL equations can be classified by a directed graph, called a state transition diagram, on an N-cube. Each vertex of the N-cube corresponds to an orthant in phase space and each edge corresponds to an open boundary between neighboring orthants. If the state transition diagram contains a certain configuration called a cyclic attractor, then we prove that for the associated PL equation, all trajectories in the regions of phase space corresponding to the cyclic attractor either (i) approach a unique stable limit cycle attractor, or (ii) approach the origin, in the limit t → ∞. An algebraic criterion is given to distinguish the two cases. Equations which can be used to model feedback inhibition are introduced to illustrate the techniques.

175 citations
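
The dynamics described above can be pictured with a small numerical experiment. The following is a minimal sketch, not the paper's exact model: it integrates a Glass-type piecewise linear system dx_i/dt = -x_i + F_i(sign(x)) for a hypothetical three-variable cyclic inhibition loop (the function F and all parameters here are illustrative assumptions) and records the sequence of orthants, i.e. vertices of the N-cube, that the trajectory visits, which is a walk along the state transition diagram.

```python
# Minimal sketch (not the paper's exact model): a Glass-type piecewise
# linear ODE  dx_i/dt = -x_i + F_i(s),  where s is the vector of signs of
# the state.  A hypothetical 3-variable cyclic inhibition loop is used.
import numpy as np

def F(s):
    # Illustrative cyclic feedback: x1 inhibited by x3, x2 by x1, x3 by x2.
    # Each forcing term is +1 when its inhibitor is negative, -1 otherwise.
    return np.array([-s[2], -s[0], -s[1]], dtype=float)

def simulate(x0, dt=1e-3, steps=20000):
    x = np.array(x0, dtype=float)
    orthants = []                      # sequence of visited sign patterns
    for _ in range(steps):
        s = np.sign(x)
        x += dt * (-x + F(s))          # forward-Euler step of the PL flow
        pattern = tuple(int(v > 0) for v in x)
        if not orthants or orthants[-1] != pattern:
            orthants.append(pattern)   # record an orthant (vertex) change
    return x, orthants

x_final, visited = simulate([0.3, -0.2, 0.1])
print("final state:", np.round(x_final, 3))
print("orthant sequence (last few):", visited[-8:])
```

For a cyclic attractor in the state transition diagram, the recorded orthant sequence settles into a repeating cycle, consistent with case (i) of the theorem.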

Journal ArticleDOI
TL;DR: In this paper, upwind methods for the 1-D Euler equations are reinterpreted as residual distribution schemes, assuming continuous piecewise linear space variation of the unknowns defined at the cell vertices.

174 citations
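
Only the TL;DR is available for this entry, but the reinterpretation it refers to can be illustrated on a simpler problem. The sketch below is an assumption-laden stand-in: it uses scalar 1-D linear advection instead of the Euler system, with nodal unknowns varying continuously and piecewise linearly between cell vertices; each cell's residual is evaluated and distributed entirely to the downstream vertex, an upwind residual distribution scheme.

```python
# Minimal sketch, assuming scalar 1-D advection u_t + a u_x = 0 as a
# stand-in for the Euler system: unknowns live at cell vertices with
# continuous piecewise linear variation, each cell's residual is computed
# and then distributed upwind to its downstream vertex.
import numpy as np

a, nx = 1.0, 100
dx, dt, nsteps = 1.0 / nx, 0.5 / nx, 80        # CFL = a*dt/dx = 0.5
x = np.linspace(0.0, 1.0, nx + 1)
u = np.exp(-200.0 * (x - 0.3) ** 2)            # initial nodal values

for _ in range(nsteps):
    residual = a * (u[1:] - u[:-1])            # cell residual: integral of a*u_x
    unew = u.copy()
    if a > 0:
        unew[1:] -= dt / dx * residual         # send residual to downstream node
    else:
        unew[:-1] -= dt / dx * residual
    u = unew                                   # node 0 (inflow) left unchanged

print("max of advected profile:", round(u.max(), 3))
```

With this distribution choice the scheme reduces to the classical first-order upwind method, which is the kind of equivalence the reinterpretation makes explicit.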

Journal ArticleDOI
TL;DR: A class of implementable algorithms is described for minimizing any convex, not necessarily differentiable, function f of several variables; the methods have flexible storage requirements and a per-iteration computational effort that can be controlled by the user.
Abstract: A class of implementable algorithms is described for minimizing any convex, not necessarily differentiable, function f of several variables. The methods require only the calculation of f and one subgradient of f at designated points. They generalize Lemaréchal's bundle method. More specifically, instead of using all previously computed subgradients in search direction finding subproblems that are quadratic programming problems, the methods use an aggregate subgradient which is recursively updated as the algorithms proceed. Each algorithm yields a minimizing sequence of points, and if f has any minimizers, then this sequence converges to a solution of the problem. Particular members of this algorithm class terminate when f is piecewise linear. The methods are easy to implement and have flexible storage requirements and computational effort per iteration that can be controlled by a user.

172 citations
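
For concreteness, here is a minimal sketch in the spirit of the setting above, though it is not the paper's aggregate-subgradient bundle method: a plain subgradient method applied to a piecewise linear convex function f(x) = max_i |a_i·x - c_i| (the data A, c and step rule are illustrative assumptions), using only a function value and a single subgradient per iterate, as the described methods do. The paper's algorithms additionally combine past subgradients into a recursively updated aggregate subgradient via a small quadratic programming subproblem.

```python
# Minimal sketch, not the paper's aggregate-subgradient bundle method:
# a plain subgradient method on the piecewise linear convex function
# f(x) = max_i |a_i . x - c_i|, using only one f-value and one
# subgradient per iterate.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(8, 3))            # slopes of the affine pieces
c = rng.normal(size=8)                 # offsets

def f_and_subgrad(x):
    r = A @ x - c
    i = int(np.argmax(np.abs(r)))      # index of an active piece
    return abs(r[i]), np.sign(r[i]) * A[i]   # f(x) and one subgradient

x = np.zeros(3)
best = np.inf
for k in range(1, 2001):
    fx, g = f_and_subgrad(x)
    best = min(best, fx)
    x = x - (1.0 / k) * g              # diminishing step length
print("best f value found:", round(best, 4))
```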

Posted Content
TL;DR: This article shows that MARS (multivariate adaptive regression splines) is improperly learnable by DNNs, in the sense that for any given function that can be expressed as a function in MARS with M parameters there exists a multilayer neural network with O(M log(M/ε)) parameters that approximates this function up to sup-norm error ε.
Abstract: Deep neural networks (DNNs) generate much richer function spaces than shallow networks. Although the function spaces induced by shallow networks have several approximation theoretic drawbacks, this does not necessarily explain the success of deep networks. In this article we take another route by comparing the expressive power of DNNs with ReLU activation function to piecewise linear spline methods. We show that MARS (multivariate adaptive regression splines) is improperly learnable by DNNs in the sense that for any given function that can be expressed as a function in MARS with $M$ parameters there exists a multilayer neural network with $O(M \log (M/\varepsilon))$ parameters that approximates this function up to sup-norm error $\varepsilon$. We show a similar result for expansions with respect to the Faber-Schauder system. Based on this, we derive risk comparison inequalities that bound the statistical risk of fitting a neural network by the statistical risk of spline-based methods. This shows that deep networks perform better or only slightly worse than the considered spline methods. We provide a constructive proof for the function approximations.

170 citations
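
The basic correspondence the article builds on can be made concrete in a few lines: a first-order MARS hinge basis function (x_j - t)_+ is exactly one ReLU unit, so an additive MARS expansion is already a one-hidden-layer ReLU network; the O(M log(M/ε)) bound concerns the extra depth needed to approximate products of such hinges. The sketch below uses a hypothetical two-variable additive MARS model (knots and coefficients invented for illustration) and checks that a hand-built ReLU layer reproduces it exactly.

```python
# Minimal sketch of the hinge/ReLU correspondence: an additive MARS model
# f(x) = sum_m beta_m (x_{j_m} - t_m)_+  written as a one-hidden-layer
# ReLU network.  Higher-order MARS terms (products of hinges) are the part
# that needs additional depth in the paper's construction.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Hypothetical additive MARS model in two variables.
knots = [(0, 0.3), (1, -0.5), (0, 0.8)]     # (coordinate index j_m, knot t_m)
betas = np.array([1.5, -2.0, 0.7])

def mars_additive(x):
    return sum(b * max(x[j] - t, 0.0) for b, (j, t) in zip(betas, knots))

def relu_network(x):
    # Same function as one hidden ReLU layer: W1 selects the coordinate,
    # b1 holds -t_m, and the output weights are the beta_m.
    W1 = np.zeros((len(knots), 2))
    b1 = np.array([-t for _, t in knots])
    for row, (j, _) in enumerate(knots):
        W1[row, j] = 1.0
    return betas @ relu(W1 @ x + b1)

x = np.array([0.9, -0.1])
print(mars_additive(x), relu_network(x))    # identical values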

Journal ArticleDOI
TL;DR: The proposed truncated linear replenishment policy (TLRP) is piecewise linear with respect to demand history, improves upon static and linear policies, and achieves objective values that are reasonably close to optimal.
Abstract: We propose a robust optimization approach to address a multiperiod inventory control problem under ambiguous demands, that is, with only limited information on the demand distributions such as mean, support, and some measures of deviation. Our framework extends to correlated demands and is developed around a factor-based model, which has the ability to incorporate business factors as well as time-series forecast effects of trend, seasonality, and cyclic variations. We can obtain the parameters of the replenishment policies by solving a tractable deterministic optimization problem in the form of a second-order cone optimization problem (SOCP), whose solution time, unlike that of dynamic programming approaches, is polynomial and independent of parameters such as replenishment lead time, demand variability, and correlations. The proposed truncated linear replenishment policy (TLRP), which is piecewise linear with respect to demand history, improves upon static and linear policies and achieves objective values that are reasonably close to optimal.

169 citations
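
A truncated linear replenishment policy can be sketched generically as follows; this is an assumed generic form, not the paper's exact parameterization, and the coefficients below are illustrative placeholders, whereas in the paper they would come from solving the SOCP. The order in period t is a linear function of the observed demand history, clipped to truncation levels, which is what makes the policy piecewise linear in the demands.

```python
# Minimal sketch, assuming a generic truncated linear replenishment policy:
# the period-t order is a linear function of past demands, clipped to
# lower/upper truncation levels.  All numbers are illustrative only.
import numpy as np

def tlrp_order(t, demand_history, x0, coeffs, lower, upper):
    # x0[t]      : nominal (static) order for period t
    # coeffs[t]  : weights on past demands d_1, ..., d_{t-1}
    hist = np.asarray(demand_history, dtype=float)
    linear = x0[t] + coeffs[t][: len(hist)] @ hist
    return float(np.clip(linear, lower[t], upper[t]))

T = 4
x0 = np.array([10.0, 8.0, 8.0, 8.0])             # illustrative parameters
coeffs = [np.zeros(T), np.array([0.6, 0.0, 0.0, 0.0]),
          np.array([0.3, 0.5, 0.0, 0.0]), np.array([0.2, 0.3, 0.4, 0.0])]
lower = np.zeros(T)                               # orders must be nonnegative
upper = np.full(T, 25.0)                          # capacity-style truncation

demands = [12.0, 7.0, 15.0]                       # observed demand history
for t in range(T):
    q = tlrp_order(t, demands[:t], x0, coeffs, lower, upper)
    print(f"period {t}: order {q:.1f}")
```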


Network Information
Related Topics (5)
Nonlinear system: 208.1K papers, 4M citations, 89% related
Linear system: 59.5K papers, 1.4M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 87% related
Robustness (computer science): 94.7K papers, 1.6M citations, 86% related
Differential equation: 88K papers, 2M citations, 86% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    179
2022    377
2021    312
2020    353
2019    329
2018    297