Author

Cédric Join

Bio: Cédric Join is an academic researcher from the University of Lorraine. The author has contributed to research in topics: Nonlinear system & Fault detection and isolation. The author has an h-index of 32 and has co-authored 178 publications receiving 4,562 citations. Previous affiliations of Cédric Join include Nancy-Université & Concordia University Wisconsin.


Papers
Posted Content
TL;DR: In this paper, the authors studied short-term forecasts and risk management for photovoltaic energy via a new standpoint on time series: a result published by P. Cartier and Y. Perrin in 1995 permits, without any probabilistic and/or statistical assumption, an additive decomposition of a time series into its mean or trend, and quick fluctuations around it.
Abstract: Short-term forecasts and risk management for photovoltaic energy are studied via a new standpoint on time series: a result published by P. Cartier and Y. Perrin in 1995 permits, without any probabilistic and/or statistical assumption, an additive decomposition of a time series into its mean, or trend, and quick fluctuations around it. The forecasts are achieved by applying quite new estimation techniques and some extrapolation procedures where the classic concept of "seasonalities" is fundamental. The quick fluctuations make it easy to define prediction bands around the mean. Several convincing computer simulations via real data, where the Gaussian probability distribution law is not satisfied, are provided and discussed. The concrete implementation of our setting needs neither tedious machine learning nor large historical data, contrary to many other viewpoints.

34 citations
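The additive decomposition described in the entry above can be illustrated with a short Python sketch. The centred moving-average trend, the quantile prediction band, the window length, and the toy production signal are illustrative assumptions standing in for the paper's algebraic estimators, which are not reproduced here.

import numpy as np

def decompose(series, window=24):
    """Additive decomposition: series = trend + quick fluctuations.

    The trend is a centred moving average -- a simple stand-in for the
    estimation techniques used in the paper, not the authors' method.
    """
    kernel = np.ones(window) / window
    trend = np.convolve(series, kernel, mode="same")
    fluctuations = series - trend
    return trend, fluctuations

def prediction_band(fluctuations, level=0.95):
    """Distribution-free band around the trend from the empirical
    quantiles of the quick fluctuations (no Gaussian assumption)."""
    lo = np.quantile(fluctuations, (1 - level) / 2)
    hi = np.quantile(fluctuations, 1 - (1 - level) / 2)
    return lo, hi

# Toy hourly 'production' signal: daily seasonality plus irregular noise.
t = np.arange(24 * 30)
series = np.clip(np.sin(2 * np.pi * t / 24), 0, None) + 0.1 * np.random.randn(t.size)

trend, fluct = decompose(series)
lo, hi = prediction_band(fluct)
print(f"band around trend: [{lo:.3f}, {hi:.3f}]")

Once the quick fluctuations are isolated, the band is just an empirical quantile envelope around the trend, which is why no Gaussian or other distributional assumption is needed.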

Posted Content
TL;DR: A longstanding quarrel in quantitative finance is settled by proving the existence of trends in financial time series thanks to a theorem due to P. Cartier and Y. Perrin, and the role of probability theory is discussed.
Abstract: We settle a longstanding quarrel in quantitative finance by proving the existence of trends in financial time series thanks to a theorem due to P. Cartier and Y. Perrin, which is expressed in the language of nonstandard analysis (Integration over finite sets, F. & M. Diener (Eds): Nonstandard Analysis in Practice, Springer, 1995, pp. 195--204). Those trends, which might coexist with some altered random walk paradigm and efficient market hypothesis, seem nevertheless difficult to reconcile with the celebrated Black-Scholes model. They are estimated via recent techniques stemming from control and signal theory. Several quite convincing computer simulations on the forecasting of various financial quantities are depicted. We conclude by discussing the role of probability theory.

32 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present the first application of the new framework of model-free control to the promising technology of shape memory alloy actuators, in particular antagonistic shape memory alloy actuators.

29 citations

Journal ArticleDOI
TL;DR: This work extends previous work on model-free control to switched nonlinear SISO systems and yields PID-like regulators which ensure practical stability, utilizing new algebraic methods for numerical differentiation.

28 citations
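The PID-like regulators mentioned above rely on real-time derivative estimates from noisy measurements. The Python sketch below uses an ordinary least-squares polynomial fit over a sliding window as a simple stand-in for the algebraic differentiators referred to in the paper; the window length, polynomial degree, and test signal are illustrative assumptions.

import numpy as np

def derivative_estimate(samples, dt, degree=2):
    """Estimate the signal derivative at the most recent sample by
    fitting a low-degree polynomial to a short window of noisy samples.

    This is a least-squares stand-in for the algebraic differentiators
    referred to in the paper, not the authors' method.
    """
    n = len(samples)
    t = np.arange(n) * dt            # local time axis over the window
    coeffs = np.polyfit(t, samples, degree)
    dpoly = np.polyder(np.poly1d(coeffs))
    return dpoly(t[-1])              # derivative at the window's end

# Example: noisy samples of sin(t); the true derivative at t = 0.5 is cos(0.5).
dt = 0.01
t = np.arange(0, 0.5 + dt, dt)
noisy = np.sin(t) + 0.001 * np.random.randn(t.size)
print(derivative_estimate(noisy[-21:], dt), np.cos(0.5))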

Journal ArticleDOI
TL;DR: In this paper, intelligent Proportional-Derivative feedback loops, or iPDs, derived from model-free control, are proposed to replace the PIs and PIDs that play a key role in control engineering.
Abstract: This paper suggests replacing PIs and PIDs, which play a key role in control engineering, by intelligent Proportional-Derivative feedback loops, or iPDs, which are derived from model-free control. This standpoint is enhanced by a laboratory experiment.

27 citations
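The iPD structure above comes from the ultra-local model y'' = F + alpha*u of model-free control, where F lumps the unknown dynamics and disturbances and is re-estimated at every step. The Python sketch below is a minimal discrete-time illustration under that assumption; the finite-difference estimate of F, the gains, and the toy plant are illustrative choices, not taken from the paper or its laboratory experiment.

import numpy as np

def ipd_controller(y_hist, u_prev, y_ref, dy_ref, d2y_ref, dt,
                   alpha=1.0, kp=25.0, kd=10.0):
    """One step of an intelligent PD (iPD) controller.

    Assumes the second-order ultra-local model  y'' = F + alpha * u.
    F is estimated from a finite-difference second derivative of the
    output and the previous control; gains and alpha are illustrative.
    """
    y2, y1, y0 = y_hist[-3], y_hist[-2], y_hist[-1]   # three latest outputs
    d2y = (y0 - 2 * y1 + y2) / dt**2                  # crude y'' estimate
    dy = (y0 - y1) / dt                               # crude y' estimate
    F_hat = d2y - alpha * u_prev                      # ultra-local estimate of F
    e, de = y0 - y_ref, dy - dy_ref                   # tracking errors
    return (-F_hat + d2y_ref - kp * e - kd * de) / alpha

# Toy closed loop on an 'unknown' plant y'' = -0.5*y' + u + 0.2*sin(t).
dt, y, dy, u = 0.01, 0.0, 0.0, 0.0
hist = [0.0, 0.0, 0.0]
for k in range(2000):
    t = k * dt
    u = ipd_controller(hist, u, y_ref=1.0, dy_ref=0.0, d2y_ref=0.0, dt=dt)
    d2y = -0.5 * dy + u + 0.2 * np.sin(t)
    dy += d2y * dt
    y += dy * dt
    hist = hist[1:] + [y]
print(f"output after 20 s: {y:.3f}")   # should settle near the setpoint 1.0

Closing the loop this way yields e'' + kd*e' + kp*e roughly equal to F minus its estimate, so tracking quality hinges on how well F is re-estimated at each step rather than on any plant model.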


Cited by
Journal ArticleDOI
TL;DR: A bibliographical review on reconfigurable fault-tolerant control systems (FTCS) is presented, with emphasis on reconfigurable/restructurable controller design techniques.

2,455 citations

Book ChapterDOI
15 Feb 2011

1,876 citations

01 Nov 1981
TL;DR: In this paper, the authors studied the effect of local derivatives on the detection of intensity edges in images, where the local difference of intensities is computed for each pixel in the image.
Abstract: Most of the signal processing that we will study in this course involves local operations on a signal, namely transforming the signal by applying linear combinations of values in the neighborhood of each sample point. You are familiar with such operations from calculus, namely taking derivatives, and you are also familiar with them from optics, namely blurring a signal. We will be looking at sampled signals only. Let's start with a few basic examples. Local difference: suppose we have a 1D image and we take the local difference of intensities, DI(x) = (1/2)(I(x + 1) − I(x − 1)), which gives a discrete approximation to a partial derivative. (We compute this for each x in the image.) What is the effect of such a transformation? One key idea is that such a derivative would be useful for marking positions where the intensity changes. Such a change is called an edge. It is important to detect edges in images because they often mark locations at which object properties change. These can include changes in illumination along a surface due to a shadow boundary, a material (pigment) change, or a change in depth, as when one object ends and another begins. The computational problem of finding intensity edges in images is called edge detection. We could look for positions at which DI(x) has a large negative or positive value. Large positive values indicate an edge that goes from low to high intensity, and large negative values indicate an edge that goes from high to low intensity. Example: suppose the image consists of a single (slightly sloped) edge:

1,829 citations
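The local-difference operator in the abstract above is easy to make concrete. The following Python sketch applies it to a toy 1D image with a single sloped edge; the threshold value and the test signal are illustrative choices, not from the source.

import numpy as np

def local_difference(I):
    """Central difference DI(x) = 0.5 * (I(x+1) - I(x-1)) for a 1D image,
    computed at interior pixels only."""
    return 0.5 * (I[2:] - I[:-2])

def detect_edges(I, threshold=0.5):
    """Mark positions where |DI| is large: positive values flag dark-to-bright
    edges, negative values bright-to-dark edges."""
    DI = local_difference(I)
    return np.nonzero(np.abs(DI) >= threshold)[0] + 1   # shift back to image coords

# Toy 1D image: a single slightly sloped edge from intensity 0 to 10.
I = np.concatenate([np.zeros(5), np.linspace(0, 10, 4), 10 * np.ones(5)])
print(detect_edges(I, threshold=1.0))   # indices of the sloped transition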