Author

Anam-Nawaz Khan

Bio: Anam-Nawaz Khan is an academic researcher from Jeju National University. The author has contributed to research in the topics of Computer science and Ensemble forecasting, has an h-index of 1, and has co-authored 2 publications receiving 8 citations.

Papers
Journal ArticleDOI
23 May 2021-Energies
TL;DR: A spatial and temporal ensemble forecasting model for short-term electric consumption forecasting is presented; it efficiently captures dynamic electric consumption characteristics, exploits ensemble model diversity, and achieves lower forecasting error.
Abstract: Due to the availability of smart metering infrastructure, high-resolution electric consumption data is readily available to study the dynamics of residential electric consumption at finely resolved spatial and temporal scales. Analyzing the electric consumption data enables policymakers and building owners to understand consumers' demand-consumption behaviors. Furthermore, analysis and accurate forecasting of electric consumption are substantial for consumer involvement in time-of-use tariffs, critical peak pricing, and consumer-specific demand response initiatives. Alongside its vast economic and sustainability implications, such as reducing energy wastage and decarbonizing the energy sector, accurate consumption forecasting facilitates power system planning and stable grid operations. Energy consumption forecasting is an active research area; despite the abundance of devised models, electric consumption forecasting in residential buildings remains challenging due to the high variability of occupant energy use behavior. Hence, the search for an appropriate model for accurate electric consumption forecasting is ever continuing. To this aim, this paper presents a spatial and temporal ensemble forecasting model for short-term electric consumption forecasting. The proposed work involves exploring electric consumption profiles at the apartment level through cluster analysis based on the k-means algorithm. The ensemble forecasting model consists of two deep learning models: the Long Short-Term Memory (LSTM) unit and the Gated Recurrent Unit (GRU). First, the apartment-level historical electric consumption data is clustered. Later, the clusters are aggregated based on the consumption profiles of consumers. At the building and floor level, the ensemble models are trained using aggregated electric consumption data. The proposed ensemble model forecasts the electric consumption at three spatial scales (apartment, building, and floor level) for hourly, daily, and weekly forecasting horizons. Furthermore, the impact of spatial-temporal granularity and cluster analysis on the prediction accuracy is analyzed. The dataset used in this study comprises high-resolution electric consumption data acquired through smart meters on an hourly basis over the period of one year. The consumption data belongs to four multifamily residential buildings situated in an urban area of South Korea. To prove the effectiveness of our proposed forecasting model, we compared our model with widely known machine learning models and deep learning variants. The results achieved by our proposed ensemble scheme verify that the model has learned the sequential behavior of electric consumption, producing superior performance with the lowest MAPE of 4.182 and 4.54 for building- and floor-level prediction, respectively. The experimental findings suggest that the model has efficiently captured the dynamic electric consumption characteristics to exploit ensemble model diversity and achieve lower forecasting error. The proposed ensemble forecasting scheme is well suited for predictive modeling and short-term load forecasting.
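
As a rough illustration of the pipeline above, the following Python sketch clusters apartment profiles with k-means, aggregates one cluster's load, and averages the forecasts of an LSTM and a GRU member. Everything in it (synthetic data, window length, cluster count, layer sizes) is an assumption for illustration, not the authors' code.

```python
# Minimal sketch of the clustering-then-ensemble idea, assuming synthetic
# hourly data; window length, cluster count, and layer sizes are placeholders.
import numpy as np
from sklearn.cluster import KMeans
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import LSTM, GRU, Dense

rng = np.random.default_rng(0)
load = rng.gamma(2.0, 0.5, size=(40, 8760))   # 40 apartments, one year hourly

# 1) Cluster apartments on their mean daily profile (24-dim signature).
profiles = load.reshape(40, 365, 24).mean(axis=1)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)

# 2) Aggregate consumption within one cluster (building/floor analogue).
agg = load[labels == 0].sum(axis=0)

# 3) Supervised windows: previous 24 hours -> next hour.
W = 24
X = np.stack([agg[i:i + W] for i in range(len(agg) - W)])[..., None]
y = agg[W:]

def make_member(cell):
    m = Sequential([Input((W, 1)), cell(32), Dense(1)])
    m.compile(optimizer="adam", loss="mae")
    return m

# 4) Train LSTM and GRU members; the ensemble forecast is their average.
members = [make_member(LSTM), make_member(GRU)]
for m in members:
    m.fit(X[:-168], y[:-168], epochs=2, batch_size=64, verbose=0)
forecast = np.mean([m.predict(X[-168:], verbose=0).ravel() for m in members],
                   axis=0)   # held-out final week
```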

26 citations

Journal ArticleDOI
TL;DR: In this article, an ensemble GWL prediction (E-GWLP) model using boosting and bagging models based on stacking techniques is presented to predict GWL for enhancing hydraulic resource management and planning. Compared with existing ensemble and baseline models, the proposed E-GWLP model performs accurately, with MAE, MSE, and RMSE of 0.340, 0.564, and 0.751, respectively.
Abstract: Drilling data for groundwater extraction incur changes over time due to variations in hydrogeological and weather conditions. Whenever a change in drilling operations needs to be deployed, drilling companies keep monitoring the time-series drilling data to make sure it is not introducing any changes or new errors. Therefore, a solution is needed to predict groundwater levels (GWL) and detect changes in borehole data to improve drilling efficiency. The proposed study presents an ensemble GWL prediction (E-GWLP) model using boosting and bagging models based on stacking techniques to predict GWL for enhancing hydraulic resource management and planning. The proposed research study consists of two modules: descriptive analysis of borehole data, and a GWL prediction model using an ensemble model based on stacking. First, descriptive analysis techniques, such as correlation analysis and difference mechanisms, are applied to investigate borehole log data and extract underlying characteristics, which is critical for enhancing hydraulic resource management. Second, an ensemble prediction model is developed based on multiple hydrological patterns using robust machine learning (ML) techniques to predict GWL for enhancing drilling efficiency and water resource management. The architecture of the proposed ensemble model involves three boosting algorithms as base models (level-0) and a bagging algorithm as a meta-model that combines the base models' predictions (level-1). The base models consist of the following boosting algorithms: eXtreme Gradient Boosting (XGBoost), AdaBoost, and Gradient Boosting (GB). The meta-model uses Random Forest (RF) as the bagging algorithm, referred to as the level-1 model. Furthermore, different evaluation metrics are used, including mean absolute error (MAE), mean square error (MSE), root mean square error (RMSE), mean absolute percentage error (MAPE), and R2 score. The performance of the proposed E-GWLP model is compared with existing ensemble and baseline models. The experimental results reveal that the proposed model performed accurately, with MAE, MSE, and RMSE of 0.340, 0.564, and 0.751, respectively. The MAPE and R2 score of our proposed approach are 12.658 and 0.976, respectively, which signifies the importance of our work. Moreover, the experimental results suggest that the E-GWLP model is suitable for sustainable water resource management and improves reservoir engineering.
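
The level-0/level-1 layout described above maps naturally onto scikit-learn's stacking API. The sketch below reproduces that layout (XGBoost, AdaBoost, and Gradient Boosting bases under a Random Forest meta-model) on synthetic data; the features, target, and hyperparameters are assumptions, not the paper's configuration.

```python
# Minimal sketch of the described stacking layout with synthetic borehole
# features; hyperparameters and data are assumptions, not the paper's setup.
import numpy as np
from sklearn.ensemble import (AdaBoostRegressor, GradientBoostingRegressor,
                              RandomForestRegressor, StackingRegressor)
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                  # stand-in hydrological features
y = X @ rng.normal(size=6) + rng.normal(scale=0.3, size=500)  # stand-in GWL

# Level-0: three boosting base models. Level-1: a Random Forest meta-model
# that combines their out-of-fold predictions, mirroring E-GWLP's layout.
model = StackingRegressor(
    estimators=[("xgb", XGBRegressor(n_estimators=200)),
                ("ada", AdaBoostRegressor(n_estimators=200)),
                ("gb", GradientBoostingRegressor(n_estimators=200))],
    final_estimator=RandomForestRegressor(n_estimators=200),
    cv=5,
)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("MAE ", mean_absolute_error(y_te, pred))
print("RMSE", mean_squared_error(y_te, pred) ** 0.5)
print("R2  ", r2_score(y_te, pred))
```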

10 citations

Journal ArticleDOI
TL;DR: The core objective is to develop a robust model that enhances the performance of existing tumor detection systems in terms of accuracy and efficiency; the findings suggest that the proposed model is more effective and efficient in facilitating clinical research and practice for MRI classification.
Abstract: A brain tumor is a growth of abnormal cells in certain brain tissues with a high mortality rate; therefore, it requires high precision in diagnosis, as a minor error in human judgment can eventually cause severe consequences. Magnetic Resonance Imaging (MRI) serves as a non-invasive tool to detect the presence of a tumor. However, Rician noise is inevitably instilled during the image acquisition process, which leads to poor observation and interferes with the treatment. Computer-Aided Diagnosis (CAD) systems can perform early diagnosis of the disease, potentially increasing the chances of survival and lessening the need for an expert to analyze the MRIs. Convolutional Neural Networks (CNN) have proven to be very effective in tumor detection in brain MRIs. Multiple studies have been dedicated to brain tumor classification; however, these techniques neither evaluate the impact of Rician noise on state-of-the-art deep learning techniques nor consider the impact of scale on deep learning performance, even though the size and location of tumors vary from image to image, with irregular shapes and boundaries. Moreover, transfer learning-based pre-trained models such as AlexNet and ResNet have been used for brain tumor detection. However, these architectures have many trainable parameters and hence a high computational cost. This study proposes a two-fold solution: (a) a Multi-Scale CNN (MSCNN) architecture to develop a robust classification model for brain tumor diagnosis, and (b) minimizing the impact of Rician noise on the performance of the MSCNN. The proposed model is a multi-class classification solution that classifies MRIs into glioma, meningioma, pituitary, and non-tumor classes. The core objective is to develop a robust model for enhancing the performance of existing tumor detection systems in terms of accuracy and efficiency. Furthermore, MRIs are denoised using a Fuzzy Similarity-based Non-Local Means (FSNLM) filter to improve the classification results. Different evaluation metrics are employed, such as accuracy, precision, recall, specificity, and F1-score, to evaluate and compare the performance of the proposed multi-scale CNN with state-of-the-art techniques such as AlexNet and ResNet. In addition, the trainable and non-trainable parameters of the proposed model and the existing techniques are compared to evaluate computational efficiency. The experimental results show that the proposed multi-scale CNN model outperforms AlexNet and ResNet in terms of accuracy and efficiency at a lower computational cost. Based on the experimental results, our proposed MCNN2 achieved an accuracy and F1-score of 91.2% and 91%, respectively, which are significantly higher than those of the existing AlexNet and ResNet techniques. Moreover, our findings suggest that the proposed model is more effective and efficient in facilitating clinical research and practice for MRI classification.
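
To make the multi-scale idea concrete, here is a hypothetical Keras sketch in which parallel convolution branches with different kernel sizes feed a shared four-class classifier. The input size, filter counts, and branch depth are illustrative assumptions rather than the paper's exact MSCNN.

```python
# Minimal multi-scale CNN sketch: parallel convolution branches with kernel
# sizes 3/5/7 capture features at several scales. Input size and filter
# counts are assumptions; this is not the authors' exact MSCNN.
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import (Conv2D, MaxPooling2D, Concatenate,
                                     GlobalAveragePooling2D, Dense)

inp = Input(shape=(128, 128, 1))            # one grayscale MRI slice
branches = []
for k in (3, 5, 7):                         # three receptive-field scales
    b = Conv2D(16, k, padding="same", activation="relu")(inp)
    b = MaxPooling2D(2)(b)
    b = Conv2D(32, k, padding="same", activation="relu")(b)
    branches.append(GlobalAveragePooling2D()(b))

x = Concatenate()(branches)                 # fuse the scales
out = Dense(4, activation="softmax")(x)     # glioma/meningioma/pituitary/none
model = Model(inp, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```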

7 citations

Journal ArticleDOI
TL;DR: In this paper, an integrated solution using an enhanced TCA task scheduling mechanism based on a predictive optimization approach is presented to improve smart manufacturing production efficiency. The mechanism is an improved variant of FEF scheduling that considers accurate decision (prediction) measures and tasks' minimal (optimal) time to schedule tasks efficiently.

5 citations

Journal ArticleDOI
TL;DR: In this paper, a fuzzy logic-based control module running on an IoT device maps the optimized parameters to the actuators and operates them accordingly; the proposed mechanism saves 36% of energy.
Abstract: Greenhouses are a productive system that allows us to respond to the growing global demand for fresh and healthy food throughout the year, but the greenhouse environment is not easily controlled because its climate parameters are interrelated. Moreover, a number of actuators are operated in parallel to maintain the greenhouse environment; as a result, the energy consumption of greenhouses is high. In this study, we present an optimization module that considers the outdoor environment with the aim of minimizing energy consumption. Metaheuristic-based differential evolution (DE) is used to optimize the climate parameters under indoor and outdoor environmental constraints. Furthermore, a long short-term memory (LSTM)-based inference model is offloaded onto the Internet of Things (IoT) device to predict the next environmental situation. The objective function selects the optimal parameters within user preferences with minimum energy consumption, based on the inferred parameter values. The open-source software framework IoTivity, implementing Open Connectivity Foundation (OCF) technical standards, is used for the real-time connection between IoT devices and the IoT platform. Greenhouse owners can set the preferences based on the requirements of the plants in the greenhouse through a smart and remotely accessible Android-based interface. A fuzzy logic-based control module running on an IoT device maps the optimized parameters to the actuators and operates them accordingly. The proposed model is analyzed, and the performance is evaluated in terms of energy consumption for each climate parameter and actuator in the greenhouse. The results show that the proposed mechanism saves 36% of energy.
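
The optimization step can be sketched with SciPy's differential evolution: search for indoor setpoints that minimize an energy proxy while staying inside user-preference bounds. The cost function, coefficients, and outdoor readings below are invented for illustration and are not the paper's objective.

```python
# Toy sketch of the optimization step: differential evolution searches for
# setpoints that minimize an energy proxy within user-preference bounds.
# The cost model and coefficients are illustrative assumptions only.
from scipy.optimize import differential_evolution

outdoor = {"temp": 5.0, "humidity": 40.0}    # assumed sensed outdoor state

def energy_cost(x):
    temp_sp, hum_sp = x
    # Energy grows with the gap the actuators must close against outdoors.
    heating = 1.5 * max(temp_sp - outdoor["temp"], 0.0)
    humidifying = 0.4 * abs(hum_sp - outdoor["humidity"])
    return heating + humidifying

bounds = [(18.0, 26.0),   # temperature setpoint (deg C), user preference range
          (55.0, 75.0)]   # relative humidity setpoint (%), user preference range

result = differential_evolution(energy_cost, bounds, seed=0)
print("optimal setpoints:", result.x, "energy proxy:", result.fun)
```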

2 citations


Cited by
Journal ArticleDOI
TL;DR: The performance comparison results show that the proposed ensemble model-based intrusion detection significantly improves the intrusion detection accuracy.
Abstract: The connectivity of our surrounding objects to the internet plays a tremendous role in our daily lives. Many network applications have been developed in every domain of life, including business, healthcare, smart homes, and smart cities, to name a few. As these network applications provide a wide range of services for large user groups, network intruders are prone to developing intrusion skills for attacks and malicious activities. Therefore, safeguarding network applications and things connected to the internet has always been a point of interest for researchers. Many studies propose solutions for intrusion detection systems and intrusion prevention systems. Network communities have produced benchmark datasets available for researchers to improve the accuracy of intrusion detection systems. The scientific community has presented data mining and machine learning-based mechanisms to detect intrusions with high classification accuracy. This paper presents an intrusion detection system based on an ensemble of prediction and learning mechanisms to improve anomaly detection accuracy in a network intrusion environment. The learning mechanism is based on automated machine learning, and the prediction model is based on the Kalman filter. The performance of the proposed intrusion detection system is evaluated using the publicly available intrusion datasets UNSW-NB15 and CICIDS2017. The proposed model achieves an intrusion detection accuracy of 98.801 percent on the UNSW-NB15 dataset and 97.02 percent on the CICIDS2017 dataset. The performance comparison results show that the proposed ensemble model-based intrusion detection significantly improves the intrusion detection accuracy.
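
The prediction side of such an ensemble can be illustrated with a one-dimensional Kalman filter that tracks a traffic statistic and flags large innovations. The sketch below uses synthetic data and assumed noise settings; the paper's full system couples the filter with an AutoML-trained classifier, which is not reproduced here.

```python
# Sketch of the prediction component only: a 1-D Kalman filter tracks a
# traffic statistic and flags large innovations as candidate anomalies.
# Noise values, threshold, and data are assumptions; the paper's system
# pairs the filter with an AutoML-trained classifier (not reproduced here).
import numpy as np

rng = np.random.default_rng(0)
traffic = rng.normal(100.0, 2.0, size=300)  # synthetic per-interval statistic
traffic[200:205] += 25.0                    # injected burst as stand-in attack

q, r = 1e-3, 4.0            # assumed process / measurement noise variances
x, p = traffic[0], 1.0      # state estimate and its variance
alerts = []
for t, z in enumerate(traffic):
    p += q                     # predict step (random-walk state model)
    innov = z - x              # innovation: measurement minus prediction
    s = p + r                  # innovation variance
    if innov ** 2 / s > 9.0:   # ~3-sigma gate marks an anomaly
        alerts.append(t)
    k = p / s                  # Kalman gain
    x += k * innov             # update step
    p *= 1.0 - k
print("anomalous intervals:", alerts)
```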

29 citations

Journal ArticleDOI
TL;DR: This research study examines the integration of BC technology with IoT and analyzes the advancements of these innovative paradigms in the healthcare sector and comprehensively studies the peculiarities of the IoHT environment and the security, performance, and progression of the enabling technologies.
Abstract: With the growth of computing and communication technologies, the information processing paradigm of the healthcare environment is evolving. Patient information is stored electronically, making it convenient to store and retrieve patient information remotely when needed. However, evolving healthcare systems into smart healthcare environments comes with challenges and additional pressures. The Internet of Things (IoT) connects things, such as computing devices, through wired or wireless mediums to form a network. There are numerous security vulnerabilities and risks in existing IoT-based systems due to the lack of intrinsic security technologies. For example, patient medical data, data privacy, data sharing, and convenience are considered imperative for collecting and storing electronic health records (EHR). However, traditional IoT-based EHR systems cannot deal with these paradigms because of inconsistent security policies and data access structures. Blockchain (BC) technology is a decentralized and distributed ledger that is well suited to storing patient data and addressing data integrity and confidentiality challenges. Therefore, it is a viable solution for addressing existing IoT data security and privacy challenges. BC paves a tremendous path to revolutionize traditional IoT systems by enhancing data security, privacy, and transparency. The scientific community has shown a variety of healthcare applications based on artificial intelligence (AI) that improve health diagnosis and monitoring practices. Moreover, technology companies and startups are revolutionizing healthcare with AI and related technologies. This study illustrates the implications of integrated technologies based on BC, IoT, and AI for meeting growing healthcare challenges. This research study examines the integration of BC technology with IoT and analyzes the advancements of these innovative paradigms in the healthcare sector. In addition, our research study presents a detailed survey on enabling technologies for the futuristic, intelligent, and secure internet of health things (IoHT). Furthermore, this study comprehensively studies the peculiarities of the IoHT environment and the security, performance, and progression of the enabling technologies. First, the research gaps are identified by mapping security and performance benefits inferred by the BC technologies. Second, practical issues related to the integration process of BC and IoT devices are discussed. Third, applications integrating IoT, BC, and ML in healthcare environments are discussed. Finally, the research gaps, future directions, and limitations of the enabling technologies are discussed.

17 citations

Journal ArticleDOI
TL;DR: The results indicated that research on blockchain applications is still relatively new and fragmented across several topics, and guidelines for further research on blockchain applications in the AEC industry were provided.
Abstract: Blockchain is regarded as a potential technology for transforming the architecture, engineering, and construction (AEC) industry, and the number of related publications is increasing rapidly. However, a systematic review of blockchain applications in the AEC industry is lacking. The objective of this study was to review the current status of blockchain applications via a bibliometric analysis combined with a systematic literature review. According to related articles collected from databases, the present status of blockchain was analysed with regard to the distribution of articles over publication years, journals, institutions, countries, cooperation networks between authors, keyword co-occurrence networks, and research methodologies. The results indicated that research on blockchain applications is still relatively new and fragmented with regard to several topics. Five areas of benefit were identified: (i) supply chain management, (ii) contract management, (iii) information management, (iv) stakeholder management, and (v) integration management. On the basis of the technology–organisation–environment framework, nine types of challenges were identified. Future research opportunities were proposed according to the research findings. This study contributes to the current body of knowledge and provides guidelines for further research on blockchain applications in the AEC industry.

15 citations

Journal ArticleDOI
22 Nov 2021-Energies
TL;DR: In this article, an optimal number of households is selected adaptively, and the total aggregated residential load of the selected households is used for load prediction; the ordering points to identify the clustering structure (OPTICS) algorithm is also used to adaptively cluster households with similar power consumption patterns.
Abstract: Short-term residential load forecasting is the precondition of the day-ahead and intra-day scheduling strategy of the household microgrid. Existing short-term electric load forecasting methods are mainly used to obtain regional power load for system-level power dispatch. Due to the high volatility, strong randomness, and weak regularity of the residential load of a single household, the mean absolute percentage error (MAPE) of the traditional methods' forecasting results would be too large for home energy management. With an increase in the total number of households, the aggregated load becomes more and more stable, and the cyclical pattern of the aggregated load becomes more and more distinct. In the meantime, the maximum daily load does not increase linearly with the increase in households in a small area. Therefore, in our proposed short-term residential load forecasting method, an optimal number of households is selected adaptively, and the total aggregated residential load of the selected households is used for load prediction. In addition, the ordering points to identify the clustering structure (OPTICS) algorithm is also used to adaptively cluster households with similar power consumption patterns; as an alternative, it can be used to enhance the periodic regularity of the aggregated load. The aggregated residential load and encoded external factors are then used to predict the load in the next half hour. The long short-term memory (LSTM) deep learning algorithm is used in the prediction because of its inherent ability to maintain historical data regularity in the forecasting process. The experimental results verify the effectiveness and accuracy of our proposed method.
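
The household-grouping step can be sketched with scikit-learn's OPTICS, which finds clusters of similar consumption shapes without fixing their number in advance; the loads within a cluster are then aggregated as the forecasting input. The data, shapes, and min_samples below are placeholders, not the paper's settings.

```python
# Sketch of the household-grouping step: OPTICS clusters daily consumption
# shapes without fixing the cluster count, and loads within a cluster are
# aggregated for forecasting. Data and min_samples are placeholders.
import numpy as np
from sklearn.cluster import OPTICS

rng = np.random.default_rng(0)
# Three synthetic consumption shapes, 20 households each, 48 half-hour slots.
base = rng.gamma(2.0, 0.4, size=(3, 48))
profiles = np.vstack([b + rng.normal(0, 0.05, size=(20, 48)) for b in base])

labels = OPTICS(min_samples=5).fit_predict(profiles)  # -1 marks noise points
for c in sorted(set(labels) - {-1}):
    members = profiles[labels == c]
    agg = members.sum(axis=0)       # aggregated cluster load, the LSTM's input
    print(f"cluster {c}: {len(members)} households, "
          f"peak aggregated load {agg.max():.2f}")
```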

14 citations

Journal ArticleDOI
TL;DR: According to the research, machine learning approaches achieve higher accuracies than mathematical model techniques; it is therefore advised that academics employ new machine learning techniques while also considering mathematical model approaches to predicting groundwater level changes.
Abstract: With the effects of climate change such as increasing heat, higher rainfall, and more recurrent extreme weather events including storms and floods, a unique approach to studying the effects of climatic elements on groundwater level variations is required. Such approaches will help people make better decisions. Researchers and stakeholders can attain these goals if they become familiar with current machine learning and mathematical model approaches to predicting groundwater level changes. However, descriptions of machine learning and mathematical model approaches for forecasting groundwater level changes are lacking. This study selected 117 papers from the Scopus scholarly database to address this knowledge gap. In a systematic review, the publications were examined using quantitative and qualitative approaches, and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) was chosen as the reporting format. According to the study, machine learning and mathematical model techniques have made significant contributions to predicting groundwater level changes. However, the domain is skewed because machine learning has been more popular in recent years, with random forest (RF) methods dominating, followed by support vector machine (SVM) and artificial neural network (ANN) methods. Machine learning ensembles have also been found to help with aspects of computational complexity, such as performance and training times. Furthermore, our research shows that machine learning approaches achieve higher accuracies than mathematical model techniques. As a result, it is advised that academics employ new machine learning techniques while also considering mathematical model approaches to predicting groundwater level changes.

10 citations