Author

Yi-Ting Chen

Other affiliations: University of Montpellier
Bio: Yi-Ting Chen is an academic researcher from National Chiao Tung University. The author has contributed to research on the topics of wavelets and the wavelet transform, has an h-index of 5, and has co-authored 9 publications receiving 82 citations. Previous affiliations of Yi-Ting Chen include the University of Montpellier.

Papers
Journal ArticleDOI
TL;DR: A new wavelet-based methodology, the generalized optimal wavelet decomposition algorithm (GOWDA), is proposed that allows price series to be deconstructed into the true efficient price and microstructure noise, particularly the noise that induces phase-transition behaviors. A generic, illustrative wavelet-denoising sketch follows this entry.

55 citations
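The following is a minimal sketch of the general idea behind this kind of wavelet-based separation: decompose a price series, shrink the detail coefficients, and treat the reconstruction as the smooth "efficient price" and the remainder as noise. It is not the GOWDA procedure itself; the wavelet choice, decomposition level, and universal soft threshold are illustrative assumptions.

```python
import numpy as np
import pywt

def split_price_series(prices, wavelet="db4", level=4):
    """Split a price series into a smooth component and residual noise
    via a plain DWT with universal soft thresholding (illustrative only)."""
    x = np.asarray(prices, dtype=float)
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise scale estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    efficient = pywt.waverec(denoised, wavelet)[: len(x)]
    return efficient, x - efficient
```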

Journal ArticleDOI
TL;DR: A novel method for forecasting travel time from big data collected through industrial IoT infrastructure; it separates a global gradient-boosting decision-tree regression model into several partitions to capture time-varying features simultaneously. An illustrative sketch of the partitioning idea follows this entry.

36 citations
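A hypothetical sketch of the partitioning idea: rather than one global model, fit a separate gradient-boosting regressor per time-of-day partition so that time-varying patterns are captured. The partitioning rule, feature layout, and hyperparameters are assumptions for illustration, not the paper's design.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def fit_partitioned_gbdt(X, y, hour_of_day, n_partitions=4):
    """Fit one GBDT per time-of-day partition (illustrative stand-in)."""
    bins = np.linspace(0, 24, n_partitions + 1)
    part = np.digitize(hour_of_day, bins[1:-1])  # partition index per sample
    models = {}
    for p in np.unique(part):
        idx = part == p
        models[p] = GradientBoostingRegressor().fit(X[idx], y[idx])
    return models, bins
```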

Journal ArticleDOI
TL;DR: This article uses a recurrently adaptive separation algorithm based on the maximal overlap discrete wavelet transform (MODWT) that can effectively identify the time-variant jumps, extract the time-consistent patterns from the noise (jumps), and denoise the marginal perturbations. An illustrative sketch of a single separation pass follows this entry.
Abstract: High-frequency data is a form of big data in finance in which a large number of intra-day transactions arriving irregularly in financial markets are recorded. Given the high frequency and irregularity, such data require efficient tools to filter out the noise (i.e. jumps) arising from the anomalies, irregularities, and heterogeneity of financial markets. In this article, we use a recurrently adaptive separation algorithm based on the maximal overlap discrete wavelet transform (MODWT) that can effectively: (1) identify the time-variant jumps, (2) extract the time-consistent patterns from the noise (jumps), and (3) denoise the marginal perturbations. In addition, the proposed algorithm enables reinforcement learning to optimize a multiple-criteria decision or convex program when reconstructing the wavelet-denoised data. Using simulated data, we show that the proposed approach performs efficiently in comparison with other conventional methods documented in the literature. We also apply our method in an empirical study using high-frequency data from the US stock market and confirm that the proposed method can significantly improve the accuracy of predictive analytics models for financial market returns.

12 citations
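A rough sketch of one jump/noise separation pass, using PyWavelets' stationary wavelet transform (pywt.swt) as a stand-in for the MODWT: detail coefficients exceeding a scale-wise threshold are treated as jumps and zeroed before reconstruction. The recursive/reinforcement-learning step described in the abstract is not reproduced, and the threshold rule is an assumption.

```python
import numpy as np
import pywt

def separate_jumps(returns, wavelet="haar", level=3, k=3.0):
    """One undecimated-wavelet pass: smooth component vs. jump residual."""
    # pywt.swt requires the length to be a multiple of 2**level, so trim.
    n = len(returns) - len(returns) % (2 ** level)
    x = np.asarray(returns[:n], dtype=float)
    coeffs = pywt.swt(x, wavelet, level=level)
    cleaned = []
    for cA, cD in coeffs:
        sigma = np.median(np.abs(cD)) / 0.6745          # robust scale estimate
        cD_clean = np.where(np.abs(cD) > k * sigma, 0.0, cD)  # drop jump-like details
        cleaned.append((cA, cD_clean))
    smooth = pywt.iswt(cleaned, wavelet)
    return smooth, x - smooth
```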

Journal ArticleDOI
TL;DR: A new integrated wavelet denoising method, named the smoothness-oriented wavelet denoising algorithm (SOWDA), which optimally determines the wavelet function, the maximal level of decomposition, and the threshold rule by using a smoothness score function that simultaneously detects the global and local extrema. An illustrative sketch of such a selection loop follows this entry.
Abstract: Intelligent pattern recognition imposes new challenges in high-frequency financial data mining due to the data's irregularities and roughness. Based on the wavelet transform for decomposing systematic patterns and noise, in this paper we propose a new integrated wavelet denoising method, named the smoothness-oriented wavelet denoising algorithm (SOWDA), that optimally determines the wavelet function, the maximal level of decomposition, and the threshold rule by using a smoothness score function that simultaneously detects the global and local extrema. We discuss the properties of our method and propose a new evaluation procedure to show its robustness. In addition, we apply this method in both simulation and empirical investigations. Both the simulation results, based on three typical stylized features of financial data, and the empirical results from analyzing high-frequency financial data from the Frankfurt Stock Exchange confirm that SOWDA significantly (based on the RMSE comparison) improves the performance of classical econometric models relative to denoising the data with the classical discrete wavelet transform (DWT) and maximal overlap discrete wavelet transform (MODWT) methods.

11 citations
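A hedged sketch of the kind of selection loop SOWDA implies: score each candidate wavelet/level combination by the smoothness of the denoised output and keep the best. The smoothness measure below (negative energy of second differences), the candidate grid, and the thresholding rule are simple stand-ins, not the paper's score function or search procedure.

```python
import numpy as np
import pywt

def smoothness(x):
    """Illustrative smoothness score: penalize curvature of the series."""
    return -np.sum(np.diff(x, n=2) ** 2)

def select_denoising(x, wavelets=("db2", "db4", "sym4"), levels=(2, 3, 4)):
    """Grid-search wavelet and level, keeping the smoothest denoised series."""
    x = np.asarray(x, dtype=float)
    best = None
    for w in wavelets:
        for lvl in levels:
            coeffs = pywt.wavedec(x, w, level=lvl)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(x)))
            den = pywt.waverec(
                [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]], w
            )[: len(x)]
            score = smoothness(den)
            if best is None or score > best[0]:
                best = (score, w, lvl, den)
    return best  # (score, wavelet, level, denoised series)
```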

Journal ArticleDOI
TL;DR: A novel machine learning method based on parallel neural networks is proposed for robustly monitoring and forecasting power demand, enhancing supervisory control and data acquisition for new industrial trends such as Industry 4.0 and the Energy IoT. An illustrative sketch of the parallel-ensemble idea follows this entry.
Abstract: Traditional methods applied in electricity demand forecasting have been challenged by the curse of dimensionality that arises as a growing number of distributed or decentralized energy systems are deployed. Without manual data preprocessing, classic models are not robust when dealing with disruptive elements (e.g., demand changes during holidays and extreme weather). Based on the application of big-data-driven analytics, we propose a novel machine learning method based on parallel neural networks for robust monitoring and forecasting of power demand, to enhance supervisory control and data acquisition for new industrial trends such as Industry 4.0 and the Energy IoT. Through our approach, we generalize the implementation of machine learning by parallelizing classic feed-forward neural networks so that the proposed method achieves superior performance when dealing with high dimensionality and disruptiveness. Using high-frequency consumption data from Australia from January 2009 to December 2015, the overall empirical results confirm that our proposed method performs significantly better for dynamic monitoring and forecasting of power demand compared with the classic methods.

7 citations
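A minimal sketch of the parallelization idea: train several independent feed-forward networks in parallel and average their forecasts. The network sizes, ensemble size, parallel backend (joblib), and aggregation rule are illustrative assumptions, not the paper's architecture.

```python
import numpy as np
from joblib import Parallel, delayed
from sklearn.neural_network import MLPRegressor

def fit_parallel_mlps(X, y, n_models=8, n_jobs=-1):
    """Fit an ensemble of feed-forward nets in parallel (illustrative)."""
    def fit_one(seed):
        return MLPRegressor(hidden_layer_sizes=(64, 32),
                            random_state=seed, max_iter=500).fit(X, y)
    return Parallel(n_jobs=n_jobs)(delayed(fit_one)(s) for s in range(n_models))

def predict_demand(models, X_new):
    """Average the individual network forecasts."""
    return np.mean([m.predict(X_new) for m in models], axis=0)
```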


Cited by
Journal ArticleDOI
TL;DR: The present paper aims to provide a systematic literature review that can help to explain the mechanisms through which big data analytics (BDA) leads to competitive performance gains; it also identifies gaps in the extant literature and proposes six future research themes.
Abstract: With big data growing rapidly in importance over the past few years, academics and practitioners have been considering the means through which they can incorporate the shifts these technologies bring into their competitive strategies. To date, emphasis has been on the technical aspects of big data, with limited attention paid to the organizational changes they entail and how they should be leveraged strategically. As with any novel technology, it is important to understand the mechanisms and processes through which big data can add business value to companies, and to have a clear picture of the different elements and their interdependencies. To this end, the present paper aims to provide a systematic literature review that can help to explain the mechanisms through which big data analytics (BDA) leads to competitive performance gains. The research framework is grounded on past empirical work on IT business value research, and builds on the resource-based view and dynamic capabilities view of the firm. By identifying the main areas of focus for BDA and explaining the mechanisms through which they should be leveraged, this paper attempts to add to the literature on how big data should be examined as a source of competitive advantage. To this end, we identify gaps in the extant literature and propose six future research themes.

431 citations

01 Jan 2012
TL;DR: A systematic review of the current state of research in travel time reliability, and more explicitly in the value of travel time reliability, is presented.
Abstract: Travel time reliability is a fundamental factor in travel behavior. It represents the temporal uncertainty experienced by users in their movement between any two nodes in a network. The importance of travel time reliability depends on the penalties incurred by the users. In road networks, travelers account for trip travel time uncertainty in different choice situations (departure time, route, mode, and others). In this paper, a systematic review of the current state of research in travel time reliability, and more explicitly in the value of travel time reliability, is presented. Moreover, a meta-analysis is performed in order to determine the reasons behind the discrepancies among the reliability estimates.

352 citations

Journal ArticleDOI
TL;DR: This article presents a comprehensive, well-informed examination and realistic analysis of deploying big data analytics successfully in companies, and presents a methodical analysis of the usage of Big Data Analytics in various applications such as agriculture, healthcare, cyber security, and smart cities.
Abstract: Big Data Analytics (BDA) is increasingly becoming a trending practice that generates an enormous amount of data and provides a new opportunity that is helpful in relevant decision-making. The developments in Big Data Analytics provide a new paradigm and solutions for big data sources, storage, and advanced analytics. BDA provides a nuanced view of big data development and insights into how it can truly create value for firms and customers. This article presents a comprehensive, well-informed examination and realistic analysis of deploying big data analytics successfully in companies. It provides an overview of the architecture of BDA, including six components, namely: (i) data generation, (ii) data acquisition, (iii) data storage, (iv) advanced data analytics, (v) data visualization, and (vi) decision-making for value creation. In this paper, the seven V characteristics of BDA, namely Volume, Velocity, Variety, Valence, Veracity, Variability, and Value, are explored. The various big data analytics tools, techniques, and technologies are described. Furthermore, the paper presents a methodical analysis of the usage of Big Data Analytics in various applications such as agriculture, healthcare, cyber security, and smart cities. It also highlights the previous research, challenges, current status, and future directions of big data analytics for various application platforms. This overview highlights three issues, namely (i) the concepts, characteristics, and processing paradigms of Big Data Analytics; (ii) the state-of-the-art framework for decision-making in BDA that helps companies gain insight into value creation; and (iii) the current challenges of Big Data Analytics as well as possible future directions.

274 citations

Journal ArticleDOI
TL;DR: A conceptual framework, built from constructs obtained by reducing the gathered data, that summarizes the role of big data analytics in supporting world-class sustainable manufacturing (WCSM) is proposed, and its importance for academia and practice is highlighted.
Abstract: Big data (BD) has attracted increasing attention from both academics and practitioners. This paper aims at illustrating the role of big data analytics in supporting world-class sustainable manufacturing (WCSM). Using an extensive literature review to identify different factors that enable the achievement of WCSM through BD, and 405 usable responses from senior managers gathered through social networking sites (SNS), we propose a conceptual framework, built from constructs obtained by reducing the gathered data, that summarizes this role; test this framework using data that are heterogeneous, diverse, voluminous, and of high velocity; and highlight the importance for academia and practice. Finally, we summarize our research findings and outline future research directions.

264 citations

Journal ArticleDOI
TL;DR: It was found that most of the existing research on big data focuses mainly on the consumer discretionary sector, followed by public administration, and that these studies pay little attention to the tools used for the analysis; this study addresses that gap.
Abstract: The importance of data science and big data analytics is growing very fast as organizations gear up to leverage their information assets to gain competitive advantage. The flexibility offered through big data analytics empowers functional as well as firm-level performance. In the first phase of the study, we analyze the research on big data published in high-quality business management journals. The analysis was visualized using tools for big data and text mining to understand the dominant themes and how they are connected. Subsequently, an industry-specific categorization of the studies was done to understand the key use cases. It was found that most of the existing research focuses mainly on the consumer discretionary sector, followed by public administration. Methodologically, a major focus in such exploration is on social media analytics, text mining, and machine learning applications for meeting objectives in marketing and supply chain management. However, it was found that these studies pay little attention to the tools used for the analysis. To address this gap, this study also discusses the evolution, types, and usage of big data tools. A brief overview of big data technologies, grouped by the services they enable, and some of their applications is presented. The study categorizes these tools into big data analysis platforms, databases and data warehouses, programming languages, search tools, and data aggregation and transfer tools. Finally, based on the review, future directions for exploration in big data are provided for academia and practice.

136 citations