Proceedings ArticleDOI

Solar Irradiance Prediction Based on Weather Patterns Using Bagging-Based Ensemble Learners with Principal Component Analysis

TL;DR: In this article, a bagging-based ensemble learning system was used to predict solar irradiance based on weather patterns, and the results showed that ensemble learners produced unbiased and more accurate results compared to single learners.
Abstract: The energy production of a photovoltaic (PV) system depends on the amount of solar irradiance at a given location. Accurate prediction of solar irradiance ensures economical integration of PV systems into the grid and leads to optimal dispatching of available energy resources. Weather conditions have a strong correlation with solar irradiance, and their erratic nature causes fluctuations in energy production. It is therefore difficult to achieve consistently optimal energy production and reliable prediction of solar irradiance. In this study, a bagging-based ensemble learning system was used to predict solar irradiance based on weather patterns. Previous research has confirmed that ensemble learners produce unbiased and more accurate results than single learners. A pre-processed stacked long short-term memory model (stacked LSTM) was used as the base learner in the ensemble, since it performs well on time-series sequences. A plot comparing the performance of the single learner and the ensemble learners shows that, after some iterations, the ensemble learners consistently provide more accurate predictions than the single learner. Metrics used in the study include explained variance score, maximum residual error, mean absolute error, mean squared error, and regression score function.
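The five evaluation metrics named in the abstract can be written out in plain Python. This is an illustrative sketch (function and variable names are my own, not the paper's); libraries such as scikit-learn provide equivalent implementations.

```python
# Plain-Python versions of the five metrics listed in the abstract.
# y_true / y_pred are equal-length lists of numbers.

def mean(xs):
    return sum(xs) / len(xs)

def explained_variance(y_true, y_pred):
    # 1 - Var(residuals) / Var(y_true)
    residuals = [t - p for t, p in zip(y_true, y_pred)]
    res_mean = mean(residuals)
    var_res = mean([(r - res_mean) ** 2 for r in residuals])
    var_true = mean([(t - mean(y_true)) ** 2 for t in y_true])
    return 1.0 - var_res / var_true

def max_residual_error(y_true, y_pred):
    # worst-case absolute error over all samples
    return max(abs(t - p) for t, p in zip(y_true, y_pred))

def mean_absolute_error(y_true, y_pred):
    return mean([abs(t - p) for t, p in zip(y_true, y_pred)])

def mean_squared_error(y_true, y_pred):
    return mean([(t - p) ** 2 for t, p in zip(y_true, y_pred)])

def r2_score(y_true, y_pred):
    # the "regression score function": 1 - SS_res / SS_tot
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean(y_true)) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Note that explained variance and R² coincide exactly when the residuals have zero mean; a gap between the two indicates a biased predictor.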
Citations
Proceedings ArticleDOI
03 Dec 2020
TL;DR: In this article, the authors present renewable energy sources, particularly solar and wind energy, as well as the demand for energy in agriculture; the number of publications is forecasted to increase by 71.1% and 134.3% in 2030 and 2040, respectively.
Abstract: Since the modernization of industries, the impacts of climate change on the environment have been alarming. Conventional energy practices in industry, which use non-renewable energy sources, pose threats to global ecosystems. Hence, the need for renewable energy and energy management strategies should be given importance. Being able to develop new and sustainable solutions is therefore essential to dealing with the effects of climate change. Sustainable development in energy demands the use of renewable energy sources and their efficient utilization in industry. This study presents renewable energy sources, particularly solar and wind energy, as well as the demand for energy in agriculture. Standards and regulations for energy management are being implemented globally, as well as in the Philippines, to support environmental efforts. Energy management systems have adopted modern technologies that bolster sustainability. Further, the number of total publications on renewable energy management systems in agriculture has been increasing, with an average growth rate of 10.2% over the past 10 years and total publications of 810 and 926 in 2018 and 2019, respectively. The number of publications is forecasted to increase by 71.1% and 134.3% in 2030 and 2040, respectively. Lastly, with the increasing number of publications in the following years, modern strategies and technologies are expected to be adopted by energy management systems beyond the agriculture industry.

3 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present an overview of battery management system (BMS) functions, LiFePO4 characteristics, key issues, estimation techniques, main features, and drawbacks of using this battery type.
Abstract: Lithium iron phosphate (LiFePO4) has become the top-choice battery chemistry in photovoltaic (PV) systems due to its numerous advantages over lead-acid batteries. However, LiFePO4 needs a battery management system (BMS) to optimize energy utilization. State of charge (SoC), state of health (SoH), cell balancing, and remaining useful life are some of its crucial parameters. This review paper presents an overview of BMS functions, LiFePO4 characteristics, key issues, estimation techniques, main features, and drawbacks of using this battery type.

2 citations
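As a concrete illustration of one SoC estimation technique a BMS review like the one above typically covers, here is a minimal coulomb-counting update. The numbers in the usage note are invented for the example, not taken from the paper.

```python
# Coulomb counting: integrate battery current over time to track
# state of charge (SoC), clamped to the physical range [0, 1].

def update_soc(soc, current_a, dt_s, capacity_ah):
    """One SoC update step.
    soc         -- current state of charge, 0..1
    current_a   -- current in amperes (positive = charging)
    dt_s        -- time step in seconds
    capacity_ah -- nominal cell capacity in ampere-hours
    """
    delta_ah = current_a * dt_s / 3600.0   # charge moved this step
    soc = soc + delta_ah / capacity_ah
    return min(1.0, max(0.0, soc))         # clamp to valid range
```

For example, discharging a hypothetical 100 Ah pack at 10 A for one hour from 50% SoC leaves it at roughly 40%. Real BMS implementations correct this open-loop integration for sensor drift, temperature, and aging, e.g. with Kalman-filter-based observers.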

Journal ArticleDOI
TL;DR: A review of current advances and prospects in forecasting renewable energy generation using machine learning (ML) and deep learning (DL) techniques is presented in this article, which surveys the different approaches and models used for renewable energy forecasting and discusses their strengths and limitations.
Abstract: This article presents a review of current advances and prospects in the field of forecasting renewable energy generation using machine learning (ML) and deep learning (DL) techniques. With the increasing penetration of renewable energy sources (RES) into the electricity grid, accurate forecasting of their generation becomes crucial for efficient grid operation and energy management. Traditional forecasting methods have limitations, and thus ML and DL algorithms have gained popularity due to their ability to learn complex relationships from data and provide accurate predictions. This paper reviews the different approaches and models that have been used for renewable energy forecasting and discusses their strengths and limitations. It also highlights the challenges and future research directions in the field, such as dealing with uncertainty and variability in renewable energy generation, data availability, and model interpretability. Finally, this paper emphasizes the importance of developing robust and accurate renewable energy forecasting models to enable the integration of RES into the electricity grid and facilitate the transition towards a sustainable energy future.

1 citation

Proceedings ArticleDOI
01 Dec 2022
TL;DR: In this paper, simple linear, Gauss-Newton and Nernst-based nonlinear, and 14-gene genetic programming regression models were developed and embedded in motes in four selected rain test areas in Metro Manila and Rizal province to predict two weather states (no rain and raining).
Abstract: Temperature and humidity are two of the many factors that play vital roles in weather, and these two factors are used in determining present weather conditions. They concern not only meteorology but also the food science and agricultural fields. Weather monitoring is an activity in which the state of the atmosphere is analyzed, usually covering variables such as wind speed, temperature, humidity, air moisture, pressure, and rainfall. This study uses the Arduino Uno board as a microcontroller and the DHT11 temperature (T) and humidity (H) sensor to gather information about the environment and display it on the LCD module. Simple linear, Gauss-Newton and Nernst-based non-linear, and 14-gene genetic programming regression models were developed and embedded in motes in four selected rain test areas in Metro Manila and Rizal province to predict two weather states (no rain and raining). The expected result of this system is an approximation as to whether or not it would rain based on the data gathered throughout the project development. Weather data were automatically uploaded and stored on a ThingSpeak server using an ESP32 and viewed in the form of a graph. Based on the results, temperature changes only slightly during rainfall, while humidity changes much more drastically and is a key telltale sign of rainfall. Linear regression outperformed the other models in binary rain prediction based on temperature and humidity parameters alone.
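A linear-regression rain predictor of the kind this entry describes can be sketched as ordinary least squares on temperature and humidity, thresholded for the binary rain / no-rain decision. This is a generic sketch, not the authors' code; the data values and the 0.5 threshold are illustrative assumptions.

```python
# Fit y ~ w0 + w1*T + w2*H by ordinary least squares via the normal
# equations, solved with Gaussian elimination (pure Python, no numpy).

def fit_ols(rows, ys):
    # rows: list of [T, H] feature pairs; prepend a bias column of 1s
    X = [[1.0] + list(r) for r in rows]
    n = len(X[0])
    # normal equations A w = b, where A = X^T X and b = X^T y
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i] * ys[k] for k in range(len(X))) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    # back-substitution
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (b[i] - sum(A[i][c] * w[c] for c in range(i + 1, n))) / A[i][i]
    return w

def predict_rain(w, t, h):
    # regress, then threshold at 0.5 for the binary rain decision
    return 1 if (w[0] + w[1] * t + w[2] * h) >= 0.5 else 0
```

With rain labels encoded as 0/1, high-humidity samples push the fitted value above the threshold, matching the paper's observation that humidity is the dominant telltale of rainfall.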
References
Proceedings ArticleDOI
01 Dec 2018
TL;DR: The empirical studies conducted and reported in this article show that deep learning-based algorithms such as LSTM outperform traditional algorithms such as the ARIMA model; the average reduction in error rates obtained by LSTM was between 84 and 87 percent compared to ARIMA, indicating the superiority of LSTM over ARIMA.
Abstract: Forecasting time series data is an important subject in economics, business, and finance. Traditionally, there are several techniques to effectively forecast the next lag of time series data, such as univariate Autoregressive (AR), univariate Moving Average (MA), Simple Exponential Smoothing (SES), and, more notably, Autoregressive Integrated Moving Average (ARIMA) with its many variations. In particular, the ARIMA model has demonstrated strong precision and accuracy in predicting the next lags of a time series. With the recent advancement in computational power and, more importantly, the development of more advanced machine learning algorithms and approaches such as deep learning, new algorithms have been developed to analyze and forecast time series data. The research question investigated in this article is whether and how the newly developed deep learning-based algorithms for forecasting time series data, such as Long Short-Term Memory (LSTM), are superior to the traditional algorithms. The empirical studies conducted and reported in this article show that deep learning-based algorithms such as LSTM outperform traditional algorithms such as the ARIMA model. More specifically, the average reduction in error rates obtained by LSTM was between 84 and 87 percent compared to ARIMA, indicating the superiority of LSTM over ARIMA. Furthermore, it was noticed that the number of training passes, known as "epochs" in deep learning, had no effect on the performance of the trained forecast model, which exhibited truly random behavior.

508 citations
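The "84 to 87 percent reduction" figure quoted above is a relative error reduction, which is easy to state precisely. The sketch below uses invented error values, not the paper's data, purely to show the arithmetic.

```python
# Relative reduction of a model's error rate with respect to a baseline,
# expressed as a percentage (e.g. LSTM error vs. ARIMA error).

def error_reduction_pct(err_baseline, err_new):
    return 100.0 * (err_baseline - err_new) / err_baseline
```

For instance, a hypothetical baseline RMSE of 10.0 reduced to 1.5 is an 85% reduction, i.e. inside the 84 to 87 percent band the paper reports for LSTM against ARIMA.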

Proceedings ArticleDOI
01 Aug 2013
TL;DR: This work proposes to combine the benefits of cross-validation and forecast aggregation, i.e. crogging: diverse base models of neural networks for time series prediction are trained on different data subsets, and their individual multiple-step-ahead predictions are averaged.
Abstract: In classification, regression, and time series prediction alike, cross-validation is widely employed to estimate the expected accuracy of a predictive algorithm by averaging predictive errors across mutually exclusive subsamples of the data. Similarly, bootstrapping aims to increase the validity of estimating expected accuracy by repeatedly sub-sampling the data with replacement, creating overlapping samples of the data. Estimates are then used to anticipate future risk in decision making, or to guide model selection where multiple candidates are feasible. Beyond error estimation, bootstrapping has recently been extended to combine each of the diverse models created for estimation, aggregating over their predictions (rather than their errors), coined bootstrap aggregation or bagging. However, similar extensions of cross-validation to create diverse forecasting models have not been considered. In accordance with bagging, we propose to combine the benefits of cross-validation and forecast aggregation, i.e. crogging. We assess different levels of cross-validation, including a (single-fold) hold-out approach, 2-fold and 10-fold cross-validation, and Monte Carlo cross-validation, to create diverse base models of neural networks for time series prediction trained on different data subsets, and average their individual multiple-step-ahead predictions. Results of forecasting the 111 time series of the NN3 competition indicate significant improvements in accuracy through crogging relative to bagging or individual model selection of neural networks.

32 citations
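The crogging idea above can be sketched in a few lines: build k cross-validation folds, train one model per fold on the remaining data, and average the models' forecasts. To keep the sketch self-contained the "model" is a trivial mean predictor, whereas the paper trains neural networks; the fold construction and the final aggregation are the point.

```python
# Crogging sketch: cross-validation folds produce diverse training
# subsets; one model per subset; forecasts are averaged across models.

def kfold_indices(n, k):
    """Split indices 0..n-1 into k contiguous, mutually exclusive folds."""
    folds = []
    base, extra = divmod(n, k)
    start = 0
    for i in range(k):
        size = base + (1 if i < extra else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def crogging_forecast(series, k):
    folds = kfold_indices(len(series), k)
    forecasts = []
    for hold_out in folds:
        # train on everything except the held-out fold
        train = [series[i] for i in range(len(series)) if i not in hold_out]
        forecasts.append(sum(train) / len(train))  # stand-in base model
    return sum(forecasts) / len(forecasts)         # aggregate k forecasts
```

Unlike bagging, the k training subsets here are built without replacement and are mutually exclusive in what they hold out, which is exactly the contrast the abstract draws.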

Proceedings ArticleDOI
11 Jul 2010
TL;DR: A novel ensemble model is proposed that refines the bagging algorithm with an optimization process and reveals that the new model does outperform the original method in terms of learning accuracy and complexity.
Abstract: Bagging (Bootstrap Aggregating) has proved to be a useful, effective, and simple ensemble learning methodology. In generic bagging methods, all the classifiers trained on the different training datasets created by bootstrap resampling of the original dataset serve as base classifiers, and their results are combined to compute the final result. This paper proposes a novel ensemble model that refines the bagging algorithm with an optimization process. The optimization process mainly emphasizes how to select the optimal classifiers according to the accuracy and diversity of the base classifiers; the selected classifiers then constitute the final base classifiers. The empirical results reveal that the new model outperforms the original method in terms of learning accuracy and complexity.

24 citations
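A toy version of the selection step described above: keep base classifiers whose accuracy is above average and whose predictions disagree enough with those already kept. The specific criteria and the `min_div` threshold are illustrative assumptions, not the paper's exact optimization procedure.

```python
# Select base classifiers by accuracy and pairwise diversity.
# all_preds: list of prediction lists, one per classifier.

def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def disagreement(a, b):
    # fraction of samples on which two classifiers differ
    return sum(x != y for x, y in zip(a, b)) / len(a)

def select_classifiers(all_preds, labels, min_div=0.1):
    accs = [accuracy(p, labels) for p in all_preds]
    avg = sum(accs) / len(accs)
    # consider the most accurate classifiers first
    order = sorted(range(len(all_preds)), key=lambda i: -accs[i])
    chosen = []
    for i in order:
        if accs[i] < avg:
            continue  # below-average accuracy: drop
        if all(disagreement(all_preds[i], all_preds[j]) >= min_div
               for j in chosen):
            chosen.append(i)  # accurate AND diverse enough: keep
    return chosen
```

The intuition matches the paper's: a duplicate of an already-selected classifier adds nothing to the ensemble, so diversity is screened alongside accuracy.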

Journal ArticleDOI
26 Feb 2018-Energies
TL;DR: An unsupervised load forecasting scheme using combined classic methods of principal component analysis (PCA) and autoregressive (AR) modeling, as well as a supervised scheme using orthonormal partial least squares (OPLS) are proposed in this paper.
Abstract: Healthcare buildings exhibit a different electrical load predictability depending on their size and nature. Large hospitals behave similarly to small cities, whereas primary care centers are expected to have different consumption dynamics. In this work, we jointly analyze the electrical load predictability of a large hospital and that of its associated primary care center. An unsupervised load forecasting scheme using combined classic methods of principal component analysis (PCA) and autoregressive (AR) modeling, as well as a supervised scheme using orthonormal partial least squares (OPLS), are proposed. Both methods reduce the dimensionality of the data to create an efficient and low-complexity data representation and eliminate noise subspaces. Because the former method tended to underestimate the load and the latter tended to overestimate it in the large hospital, we also propose a convex combination of both to further reduce the forecasting error. The analysis of data from 7 years in the hospital and 3 years in the primary care center shows that the proposed low-complexity dynamic models are flexible enough to predict both types of consumption at practical accuracy levels.

20 citations
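The autoregressive half of the PCA+AR pipeline described above reduces, in its simplest form, to a least-squares AR(1) fit on a single series (e.g. one principal-component score series). The PCA step is omitted here to keep the sketch short; this is a generic illustration, not the paper's model.

```python
# AR(1) without intercept: x[t] ~ phi * x[t-1], with phi fitted by
# least squares over the observed series.

def fit_ar1(series):
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den   # least-squares AR(1) coefficient

def forecast_ar1(series):
    # one-step-ahead forecast from the last observation
    return fit_ar1(series) * series[-1]
```

In the paper's scheme, a forecast like this would be run per retained principal component and the results mapped back through the PCA basis to the original load dimensions.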

Proceedings ArticleDOI
01 Oct 2019
TL;DR: Comparing training and prediction against two benchmark algorithms, deep neural network (DNN) and random forest (RF), on data generated under different sliding windows, the network built with the Stacked LSTM is superior in MSE, RMSE, and MAE and more stable in R-squared.
Abstract: Temperature is a commonly used meteorological variable that plays an important role in society, agricultural production, and the economy. In this paper, a stacked long short-term memory network (Stacked LSTM) is used to process temperature time series data and to provide temperature predictions every half hour. Comparing training and prediction against two benchmark algorithms, deep neural network (DNN) and random forest (RF), on data generated under different sliding windows, the network built with the Stacked LSTM is superior in MSE, RMSE, and MAE and more stable in R-squared. The prediction accuracy of the Stacked LSTM network is further improved by fusing the models in a linear fusion mode. The comparison shows that the result of combining the random forest and the Stacked LSTM does not differ greatly from the Stacked LSTM alone, while the combination of the Stacked LSTM and the deep neural network is optimal. Specifically, the prediction results of combining the DNN with the Stacked LSTM are 1.7%, 3.8%, and 8.7% better than those of the Stacked LSTM, DNN, and random forest alone, respectively.

18 citations
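The "linear fusion mode" mentioned in the entry above can be sketched as a convex combination of two models' predictions with a weight chosen to minimize squared error on a validation set; the one-parameter case has a closed form. The model outputs below are stand-ins, not the paper's data.

```python
# Linear fusion of two prediction series: y_hat = alpha*a + (1-alpha)*b.
# best_alpha minimizes sum((alpha*a + (1-alpha)*b - y)^2) in closed form.

def best_alpha(pred_a, pred_b, y_true):
    # setting the derivative w.r.t. alpha to zero gives
    # alpha = sum((a-b)(y-b)) / sum((a-b)^2)
    num = sum((a - b) * (y - b) for a, b, y in zip(pred_a, pred_b, y_true))
    den = sum((a - b) ** 2 for a, b in zip(pred_a, pred_b))
    return num / den

def fuse(pred_a, pred_b, alpha):
    return [alpha * a + (1 - alpha) * b for a, b in zip(pred_a, pred_b)]
```

Fitting the weight on held-out data rather than fixing it at 0.5 is what lets the fused model beat either member, matching the gains the abstract reports for the DNN + Stacked LSTM combination.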