Author

Thomas Dionysopoulos

Bio: Thomas Dionysopoulos is an academic researcher. The author has contributed to research in topics: Data analysis & Analytics. The author has an h-index of 2, and has co-authored 7 publications receiving 11 citations.

Papers
Posted Content
TL;DR: The proposed method relies on sparse representations of the original time series in terms of dictionary atoms, which are learned and updated directly from the available data in a rolling-window fashion, extracting sparse patterns that compactly capture the meaningful information of volatile financial data.
Abstract: Financial time series usually exhibit non-stationarity and time-varying volatility. Extraction and analysis of complicated patterns, such as trends and transient changes, are at the core of modern financial data analytics. Furthermore, efficient and timely analysis is often hindered by the large volumes of raw data that are nowadays supplied and stored. In this paper, the power of learned dictionaries in adapting accurately to the underlying micro-local structures of time series is exploited to extract sparse patterns, aiming at compactly capturing the meaningful information of volatile financial data. Specifically, our proposed method relies on sparse representations of the original time series in terms of dictionary atoms, which are learned and updated from the available data directly in a rolling-window fashion. In contrast to previous methods, our extracted sparse patterns enable both compact storage and highly accurate reconstruction of the original data. Equally importantly, financial analytics, such as volatility clustering, can be performed on the sparse patterns directly, thus reducing the overall computational cost without deteriorating accuracy. Experimental evaluation on 12 market indexes reveals a superior performance of our approach against a modified symbolic representation and a well-established wavelet transform-based technique, in terms of information compactness, reconstruction accuracy, and volatility clustering efficiency.
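
A minimal sketch of the rolling-window dictionary-learning idea, with scikit-learn's MiniBatchDictionaryLearning standing in for the paper's dictionary-update scheme; the window length, number of atoms, and sparsity level are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
returns = rng.standard_normal(2048)              # stand-in for index returns

win, n_atoms = 64, 32
# Slice the series into overlapping windows (one training sample per row).
X = np.lib.stride_tricks.sliding_window_view(returns, win)[::8]

dico = MiniBatchDictionaryLearning(
    n_components=n_atoms,
    transform_algorithm="omp",                   # sparse coding via OMP
    transform_n_nonzero_coefs=5,                 # sparsity level per window
    random_state=0,
)
codes = dico.fit(X).transform(X)                 # the sparse patterns
recon = codes @ dico.components_                 # reconstruction from atoms

rmse = np.sqrt(np.mean((X - recon) ** 2))
print(f"{n_atoms} atoms, 5 nonzeros per window, reconstruction RMSE {rmse:.4f}")
```

The sparse codes, rather than the raw windows, would then be the objects stored and fed to downstream analytics such as volatility clustering.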

3 citations

Book ChapterDOI
01 Jan 2016
TL;DR: In this chapter, a matrix-completion-based approach is proposed to restore the corrupted cross-recurrence plot (CRP) prior to estimating the time-synchronization relationship.
Abstract: The success of a trading strategy can be significantly enhanced by tracking accurately the implied volatility changes, which refers to the amount of uncertainty or risk about the degree of changes in a market index. This fosters the need for accurate estimation of the time-synchronization profile between a given market index and its associated volatility index. In this chapter, we advance existing solutions, which are based largely on the typical correlation, for identifying this temporal interdependence. To this end, cross-recurrence plot (CRP) analysis is exploited for extracting the underlying dynamics of a given pair of market and volatility indexes, along with their time-synchronization profile. However, CRPs of degraded quality, for instance due to missing information, may yield a completely erroneous estimation of this profile. To overcome this drawback, a restoration stage based on the concept of matrix completion is applied on a corrupted CRP prior to the estimation of the time-synchronization relationship. A performance evaluation on the S&P 500 index and its associated VIX volatility index reveals the superior capability of our proposed approach in restoring accurately their CRP and subsequently estimating a temporal relation between the two indexes even when 80% of CRP values are missing.
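
A minimal sketch of the restoration stage, using SVD soft-thresholding (a SoftImpute-style matrix-completion step) on a synthetic corrupted CRP; the recurrence radius, shrinkage threshold, and iteration count are illustrative assumptions rather than the chapter's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(200))          # stand-in market index
y = np.cumsum(rng.standard_normal(200))          # stand-in volatility index

# Binary CRP: 1 where the two trajectories fall within a small radius.
D = np.abs(x[:, None] - y[None, :])
crp = (D < 0.1 * D.std()).astype(float)

mask = rng.random(crp.shape) > 0.8               # observe only 20% of entries
Z = np.where(mask, crp, 0.0)

for _ in range(100):                             # SoftImpute-style iterations
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    Z = (U * np.maximum(s - 1.0, 0.0)) @ Vt      # shrink singular values
    Z[mask] = crp[mask]                          # re-impose observed entries

err = np.abs((Z > 0.5).astype(float) - crp)[~mask].mean()
print(f"restoration error on the missing 80%: {err:.3f}")
```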

3 citations

Journal ArticleDOI
TL;DR: In this article, a novel measure is proposed that better adapts to the time-frequency content of market indices for quantifying their degree of integration, distinguishing between short- and long-term investors.
Abstract: Accurate quantification of the integration strength between dynamically evolving markets has become a major issue in the context of the recent financial predicament, with the typical approaches relying mainly on the time-varying aspects of market indices. Despite its recognized virtue, incorporation of both temporal and frequency information has still gained limited attention in the framework of market integration. In this paper, a novel measure is proposed, which better adapts to the time-frequency content of market indices for quantifying the degree of their integration. To this end, advanced statistical signal processing techniques are employed to extract market interrelations not only across time, but also across frequency, thus distinguishing between short- and long-term investors. Specifically, probabilistic principal component analysis is employed to extract the principal factors explaining the cross-market returns, while a Hough transformation, applied on appropriate time-scale wavelet decompositions of the original time series and the principal factors, is exploited to extract global patterns in the time-scale domain by detecting local features. Then, statistical divergence between the corresponding Hough-transformed time-scale decompositions is used to quantify the degree of market integration. The efficiency of the proposed measure is evaluated on a set of 12 equity indices in the framework of well-diversified portfolio construction, revealing an improved performance against alternative market integration measures in terms of typical financial performance metrics.
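
A minimal sketch of the time-scale/Hough portion of the pipeline (the PPCA factor step is omitted): wavelet scalograms of two return series are binarized, Hough-transformed to capture global line-like patterns, and compared with a Jensen-Shannon divergence as a rough integration score. The wavelet, scales, binarization percentile, and divergence choice are illustrative assumptions.

```python
import numpy as np
import pywt
from skimage.transform import hough_line
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(2)
r1 = rng.standard_normal(512)                     # stand-in market-1 returns
r2 = 0.7 * r1 + 0.3 * rng.standard_normal(512)    # partially integrated market

def hough_signature(returns):
    # Continuous wavelet scalogram, binarized to keep only strong features.
    coeffs, _ = pywt.cwt(returns, scales=np.arange(1, 33), wavelet="morl")
    binary = np.abs(coeffs) > np.percentile(np.abs(coeffs), 90)
    hspace, _, _ = hough_line(binary)             # global line-like patterns
    p = hspace.astype(float).ravel()
    return p / p.sum()                            # normalize to a distribution

score = jensenshannon(hough_signature(r1), hough_signature(r2))
print(f"time-scale divergence (lower = more integrated): {score:.4f}")
```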

3 citations

Journal ArticleDOI
TL;DR: In this article, a long-short beta-neutral portfolio strategy is proposed based on earnings-yield forecasts, where positions are adjusted for time-varying risk budgeting through an appropriate integration measure.
Abstract: In this paper, a long-short beta-neutral portfolio strategy is proposed based on earnings-yield forecasts, where positions are adjusted for time-varying risk budgeting by employing an appropriate integration measure. In contrast to previous works, which primarily rely on a standard principal component analysis (PCA), here we exploit the advantages of a probabilistic PCA (PPCA) framework to extract the factors to be used for designing an efficient integration measure, as well as relating these factors to an asset-pricing model. Our experimental evaluation with a dataset of 12 developed equity market indexes reveals certain improvements of our proposed approach, in terms of an increased representation capability of the underlying principal factors, along with an increased robustness to noisy and/or missing data in the original dataset.
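
A minimal sketch of the beta-neutral construction alone, with hypothetical forecasts and betas; the paper's PPCA-based integration measure and time-varying risk budgeting are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 12                                            # e.g., 12 equity indexes
ey_forecast = rng.normal(0.05, 0.02, n)           # hypothetical E/P forecasts
beta = rng.normal(1.0, 0.3, n)                    # hypothetical market betas

order = np.argsort(ey_forecast)
longs, shorts = order[-4:], order[:4]             # top/bottom of the ranking

w = np.zeros(n)
w[longs] = 1.0 / len(longs)                       # equal-weight long leg
# Scale the short leg so that sum(w * beta) == 0 (beta neutrality).
w[shorts] = -(w[longs] @ beta[longs]) / beta[shorts].sum()

print("portfolio beta:", round(float(w @ beta), 12))
```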

2 citations

Journal ArticleDOI
TL;DR: Results indicate an improved interpretation capability of RQA when applied to data denoised using the proposed approach, and an increased accuracy of the proposed method in detecting switching volatility regimes, which is important for estimating the risk associated with a financial instrument.
Abstract: In this paper we propose an enhancement of recurrence quantification analysis (RQA) performance in extracting the underlying non-linear dynamics of market index returns, under the assumption of data corrupted by additive white Gaussian noise. More specifically, first we show that the statistical distribution of wavelet decompositions of distinct index returns is best fitted using members of the alpha-stable distribution family. Then, an efficient maximum a posteriori (MAP) estimator is applied on pairs of wavelet coefficients at adjacent levels to suppress the noise effect, prior to performing RQA. Quantitative and qualitative results on 22 futures indices indicate an improved interpretation capability of RQA when applied to data denoised using our proposed approach, as opposed to previous methods based solely on a Gaussian assumption for the underlying statistics, in terms of extracting the underlying dynamical structure of the index-return generating processes. Furthermore, our results reveal an increased accuracy of the proposed method in detecting switching volatility regimes, which is important for estimating the risk associated with a financial instrument.
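
A minimal sketch of the denoise-then-analyze workflow on synthetic data, with plain universal soft-thresholding standing in for the paper's alpha-stable MAP estimator on adjacent-level coefficient pairs, which is considerably more involved.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
clean = 0.01 * np.sin(np.linspace(0, 20 * np.pi, 1024))
noisy = clean + 0.005 * rng.standard_normal(1024)  # AWGN-corrupted returns

coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise-scale estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")             # input to RQA afterwards

print(f"residual std: {(noisy - clean).std():.5f} -> "
      f"{(denoised - clean).std():.5f}")
```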

2 citations


Cited by
Posted Content
TL;DR: In this paper, the authors explore a novel approach that combines high-frequency volatility matrix estimation with low-frequency dynamic models, and establish the asymptotic theory for the proposed methodology in a framework that allows the sample size, the number of assets, and the number of days to go to infinity together.
Abstract: It is increasingly important in financial economics to estimate volatilities of asset returns. However, most of the available methods are not directly applicable when the number of assets involved is large, due to the lack of accuracy in estimating high-dimensional matrices. Therefore it is pertinent to reduce the effective size of volatility matrices in order to produce adequate estimates and forecasts. Furthermore, since high-frequency financial data for different assets are typically not recorded at the same time points, conventional dimension-reduction techniques are not directly applicable. To overcome those difficulties we explore a novel approach that combines high-frequency volatility matrix estimation with low-frequency dynamic models. The proposed methodology consists of three steps: (i) estimate daily realized covolatility matrices directly based on high-frequency data, (ii) fit a matrix factor model to the estimated daily covolatility matrices, and (iii) fit a vector autoregressive model to the estimated volatility factors. We establish the asymptotic theory for the proposed methodology in a framework that allows the sample size, the number of assets, and the number of days to go to infinity together. Our theory shows that the relevant eigenvalues and eigenvectors can be consistently estimated. We illustrate the methodology with high-frequency price data on several hundred stocks traded on the Shenzhen and Shanghai Stock Exchanges over a period of 177 days in 2003. Our approach pools together the strengths of modeling and estimation at both the intra-daily (high-frequency) and inter-daily (low-frequency) levels.
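
A minimal sketch of the three-step pipeline on synthetic data: (i) daily realized covolatility matrices from intraday returns, (ii) a crude factor step via the leading eigenvectors of the average covolatility matrix, and (iii) a VAR(1) on the daily factor series. The factor extraction is a simplification of the paper's matrix factor model.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
n_days, n_ticks, n_assets = 60, 390, 10

# (i) one realized covolatility matrix per day from intraday returns
intraday = 0.001 * rng.standard_normal((n_days, n_ticks, n_assets))
rcov = np.einsum("dti,dtj->dij", intraday, intraday)

# (ii) leading eigenvectors of the mean covolatility matrix as loadings
_, eigvecs = np.linalg.eigh(rcov.mean(axis=0))
L = eigvecs[:, -3:]                               # top 3 factor directions
factors = np.einsum("ik,dij,jk->dk", L, rcov, L)  # daily factor variances

# (iii) low-frequency dynamics of the daily factor series
var_fit = VAR(factors).fit(maxlags=1)
print(var_fit.params.shape)                       # (1 + 3, 3) coefficients
```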

77 citations

Posted Content
01 Jan 2007
TL;DR: In this article, a model to select the optimal hedge ratio of a portfolio comprised of an arbitrary number of commodities is presented, where returns dependency and heterogeneous investment horizons are accounted for by copulas and wavelets, respectively.
Abstract: This article presents a model to select the optimal hedge ratios of a portfolio comprised of an arbitrary number of commodities. In particular, returns dependency and heterogeneous investment horizons are accounted for by copulas and wavelets, respectively. We analyze a portfolio of London Metal Exchange metals for the period July 1993-December 2005, and conclude that neglecting cross correlations leads to biased estimates of the optimal hedge ratios and the degree of hedge effectiveness. Furthermore, when compared with a multivariate-GARCH specification, our methodology yields higher hedge effectiveness for the raw returns and their short-term components.
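
A minimal sketch of the horizon-dependent part of the idea: per-scale minimum-variance hedge ratios h = Cov(s, f) / Var(f) computed on wavelet detail coefficients; the copula modeling of cross-commodity dependence is omitted and the data are synthetic.

```python
import numpy as np
import pywt

rng = np.random.default_rng(6)
f = 0.01 * rng.standard_normal(1024)              # futures returns
s = 0.8 * f + 0.004 * rng.standard_normal(1024)   # spot returns

# Detail coefficients per scale (wavedec lists them coarse to fine).
s_details = pywt.wavedec(s, "db4", level=4)[1:]
f_details = pywt.wavedec(f, "db4", level=4)[1:]

for j, (sd, fd) in enumerate(zip(s_details, f_details), start=1):
    c = np.cov(sd, fd)                            # 2x2 covariance matrix
    print(f"scale {j}: hedge ratio = {c[0, 1] / c[1, 1]:.3f}")
```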

49 citations

Journal ArticleDOI
01 Jun 2019
TL;DR: The traditional financial paradigm seeks to understand financial markets by using models in which markets are perfect, including agents who are "rational" and update their beliefs correctly based on new information.
Abstract: The traditional financial paradigm seeks to understand financial markets by using models in which markets are perfect, which includes agents who are “rational” and update their beliefs correctly based on new information. By comparison, the new institutional economics approach attempts to provide a more realistic picture of economic processes, even in financial markets, by postulating several market imperfections, including the agents’ limited rationality. In contrast, behavioral finance completely challenges the rationality assumption and aims to improve the understanding of financial markets by assuming that, due to psychological factors, investors’ decisions will contradict the expected utility theory. However, the traditional, new institutional and the behavioral finance models all share one important feature: They are all based on the notion of a representative agent even though this mythological figure is dressed differently. Evolutionary finance suggests a model of portfolio selection and asset price dynamics that is explicitly based on the ideas of investors’ heterogeneity, dynamics and changes, learning and a natural selection of strategies. The paper suggests a systematization of this new approach, which is subsequently used to conduct a state-of-the-art literature survey and an evaluation of evolutionary finance research.

26 citations

Journal Article
TL;DR: In this paper, a time series summarization and prediction framework is presented to analyse non-stationary, volatile and high-frequency time series data, where multiscale wavelet analysis is used to separate out the trend, cyclical fluctuations and autocorrelational effects.
Abstract: Most financial time series processes are nonstationary and their frequency characteristics are time-dependent. In this paper we present a time series summarization and prediction framework to analyse nonstationary, volatile and high-frequency time series data. Multiscale wavelet analysis is used to separate out the trend, cyclical fluctuations and autocorrelational effects. The framework can generate verbal signals to describe each effect. The summary output is used to reason about the future behaviour of the time series and to give a prediction. Experiments on the intra-day European currency spot exchange rates are described. The results are compared with a neural network prediction framework.
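
A minimal sketch of the summarization idea: a multilevel wavelet decomposition separates a smooth trend from cyclical detail layers, and a simple rule verbalizes the trend. The verbalization rule is an illustrative placeholder, not the paper's framework.

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
series = np.cumsum(rng.standard_normal(512)) + np.linspace(0, 5, 512)

coeffs = pywt.wavedec(series, "db4", level=4)
# Trend: reconstruct from the approximation only, with details zeroed out.
trend = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                     "db4")

direction = "rising" if trend[-1] > trend[-64] else "falling"
cycle_energy = sum(float(np.sum(c ** 2)) for c in coeffs[1:])
print(f"trend is {direction}; cyclical energy = {cycle_energy:.2f}")
```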

20 citations

Journal ArticleDOI
Fan He, Xuansen He
TL;DR: The experimental results show that compared with the classical shrinkage (hard, soft, and nonnegative garrote) functions, the proposed thresholding function not only has the advantage of a continuous derivative, but also delivers very competitive denoising performance.
Abstract: In economic (financial) time series analysis, prediction plays an important role, and the inclusion of noise in the time series data is also a common phenomenon. In particular, stock market data are highly random and non-stationary, and thus contain much noise. Prediction of the noise-free data is quite difficult when noise is present. Therefore, removal of such noise before predicting can significantly improve the prediction accuracy of economic models. Based on this consideration, this paper proposes a new shrinkage (thresholding) function to improve the performance of wavelet shrinkage denoising. The proposed thresholding function is an arctangent function with several parameters to be determined, and the optimal parameters are determined by ensuring that the thresholding function is continuously differentiable. The closing price data of the Shanghai Composite Index from January 1991 to December 2014 are used to illustrate the application of the proposed shrinkage function in denoising the stock data. The experimental results show that, compared with the classical shrinkage (hard, soft, and nonnegative garrote) functions, the proposed thresholding function not only has the advantage of a continuous derivative, but also delivers very competitive denoising performance.
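
A minimal sketch comparing hard and soft shrinkage with an arctangent-style smooth shrinkage on wavelet coefficients. The arctangent rule below is one plausible continuously differentiable form, not necessarily the exact parametrization proposed in the paper.

```python
import numpy as np
import pywt

def atan_shrink(w, t, k=2.0):
    # Smooth gain in (0, 1): near 0 below threshold t, near 1 well above it,
    # so large coefficients pass almost unchanged while the transition
    # remains continuously differentiable (unlike hard thresholding).
    return w * (2.0 / np.pi) * np.arctan((np.abs(w) / t) ** k)

rng = np.random.default_rng(8)
clean = np.sin(np.linspace(0, 8 * np.pi, 1024))
noisy = clean + 0.3 * rng.standard_normal(1024)

coeffs = pywt.wavedec(noisy, "sym8", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
t = sigma * np.sqrt(2 * np.log(noisy.size))       # universal threshold

for name, rule in [("hard", lambda c: pywt.threshold(c, t, "hard")),
                   ("soft", lambda c: pywt.threshold(c, t, "soft")),
                   ("atan", lambda c: atan_shrink(c, t))]:
    den = pywt.waverec([coeffs[0]] + [rule(c) for c in coeffs[1:]], "sym8")
    print(f"{name}: RMSE = {np.sqrt(np.mean((den - clean) ** 2)):.4f}")
```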

10 citations