
Showing papers by "Ryerson University published in 2019"


Proceedings ArticleDOI
08 Apr 2019
TL;DR: In this article, a generalized focal loss function based on the Tversky index was proposed to address data imbalance in medical image segmentation; it achieved a better trade-off between precision and recall when training on small structures such as lesions.
Abstract: We propose a generalized focal loss function based on the Tversky index to address the issue of data imbalance in medical image segmentation. Compared to the commonly used Dice loss, our loss function achieves a better trade-off between precision and recall when training on small structures such as lesions. To evaluate our loss function, we improve the attention U-Net model by incorporating an image pyramid to preserve contextual features. We experiment on the BUS 2017 dataset and ISIC 2018 dataset, where lesions occupy 4.84% and 21.4% of the image area, and improve segmentation accuracy over the standard U-Net by 25.7% and 3.6%, respectively.
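The Tversky index behind this loss weights false negatives and false positives asymmetrically. A minimal NumPy sketch of the idea (the α, β, γ values below are illustrative, not necessarily those used in the paper):

```python
import numpy as np

def tversky_index(pred, target, alpha=0.7, beta=0.3, eps=1e-7):
    """Tversky index: TP / (TP + alpha*FN + beta*FP).
    With alpha > beta, false negatives cost more than false positives,
    pushing the trade-off toward recall on small structures."""
    pred = np.asarray(pred, dtype=float).ravel()
    target = np.asarray(target, dtype=float).ravel()
    tp = np.sum(pred * target)
    fn = np.sum((1 - pred) * target)
    fp = np.sum(pred * (1 - target))
    return (tp + eps) / (tp + alpha * fn + beta * fp + eps)

def focal_tversky_loss(pred, target, alpha=0.7, beta=0.3, gamma=0.75):
    """Focalized (generalized) Tversky loss: (1 - TI)**gamma."""
    return (1.0 - tversky_index(pred, target, alpha, beta)) ** gamma
```

A perfect prediction drives the index to 1 and the loss to 0; the focal exponent γ further emphasizes hard, poorly segmented examples.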

515 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyzed the structure and properties of various performance metrics and proposed a new typology comprising four categories: primary metrics, extended metrics, composite metrics, and hybrid sets of metrics.
Abstract: Aim/Purpose: The aim of this study was to analyze various performance metrics and approaches to their classification. The main goal was to develop a new typology that will help to advance knowledge of metrics and facilitate their use in machine learning regression algorithms.
Background: Performance metrics (error measures) are vital components of the evaluation frameworks in various fields. A performance metric can be defined as a logical and mathematical construct designed to measure how close the actual results are to what was expected or predicted. A vast variety of performance metrics have been described in the academic literature; the most commonly mentioned in research studies are Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), etc. Knowledge about metric properties needs to be systematized to simplify the design and use of metrics.
Methodology: A qualitative study was conducted to achieve the objectives, comprising identification of related peer-reviewed research studies and literature reviews, critical thinking, and inductive reasoning.
Contribution: The main contribution of this paper is in ordering knowledge of performance metrics and enhancing understanding of their structure and properties by proposing a new typology, a generic mathematical formula for primary metrics, and a visualization chart.
Findings: Based on the analysis of the structure of numerous performance metrics, we proposed a framework of metrics which includes four categories: primary metrics, extended metrics, composite metrics, and hybrid sets of metrics. The paper identified three key components (dimensions) that determine the structure and properties of primary metrics: the method of determining point distance, the method of normalization, and the method of aggregating point distances over a data set. For each component, implementation options have been identified. The suggested new typology has been shown to cover over 40 commonly used primary metrics.
Recommendations for Practitioners: The presented findings can be used to facilitate teaching performance metrics to university students and to expedite metric selection and implementation for practitioners.
Recommendation for Researchers: By using the proposed typology, researchers can streamline the development of new metrics with predetermined properties.
Impact on Society: The outcomes of this study could be used to improve evaluation results in machine learning regression, forecasting, and prognostics, with direct or indirect positive impacts on innovation and productivity in a societal sense.
Future Research: Future research is needed to examine the properties of the extended metrics, composite metrics, and hybrid sets of metrics. An empirical study of the metrics, using R Studio or Azure Machine Learning Studio, is needed to find associations between the properties of primary metrics and their "numerical" behavior across a wide spectrum of data characteristics and business or research requirements.
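The three structural components identified for primary metrics (point distance, normalization, aggregation) can be illustrated with a small sketch in which MAE and RMSE differ only in their distance and aggregation functions; the function names are illustrative, not from the paper:

```python
import math

def primary_metric(actual, predicted, distance, aggregate):
    """A primary metric decomposed into its components: a per-point
    distance function plus an aggregation over the data set
    (normalization, the third component, is folded into `distance` here)."""
    distances = [distance(a, p) for a, p in zip(actual, predicted)]
    return aggregate(distances)

def mae(y, yhat):
    # absolute point distance, aggregated by the arithmetic mean
    return primary_metric(y, yhat, lambda a, p: abs(a - p),
                          lambda d: sum(d) / len(d))

def rmse(y, yhat):
    # squared point distance, aggregated by root-mean
    return primary_metric(y, yhat, lambda a, p: (a - p) ** 2,
                          lambda d: math.sqrt(sum(d) / len(d)))
```

Swapping either component yields other primary metrics in the typology (e.g., a percentage-normalized distance gives MAPE-style measures).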

289 citations


Journal ArticleDOI
TL;DR: Rituximab was noninferior to cyclosporine in inducing complete or partial remission of proteinuria at 12 months and was superior in maintaining proteinuria remission up to 24 months.
Abstract: Background B-cell anomalies play a role in the pathogenesis of membranous nephropathy. B-cell depletion with rituximab may therefore be noninferior to treatment with cyclosporine for induc...

260 citations


Journal ArticleDOI
TL;DR: From an empirical study of Canadian smartphone owners, the results show that perceived privacy concerns influence perceived value and that intention to use is significantly influenced by hedonic motivation and perceived value.

226 citations


Proceedings ArticleDOI
02 Dec 2019
TL;DR: In this article, a grid-based representation for event cameras is proposed that can learn the input event representation together with the task-dedicated network in an end-to-end manner.
Abstract: Event cameras are vision sensors that record asynchronous streams of per-pixel brightness changes, referred to as "events". They have appealing advantages over frame-based cameras for computer vision, including high temporal resolution, high dynamic range, and no motion blur. Due to the sparse, non-uniform spatio-temporal layout of the event signal, pattern recognition algorithms typically aggregate events into a grid-based representation and subsequently process it with a standard vision pipeline, e.g., a Convolutional Neural Network (CNN). In this work, we introduce a general framework to convert event streams into grid-based representations by means of strictly differentiable operations. Our framework comes with two main advantages: (i) it allows learning the input event representation together with the task-dedicated network in an end-to-end manner, and (ii) it lays out a taxonomy that unifies the majority of extant event representations in the literature and identifies novel ones. Empirically, we show that our approach to learning the event representation end-to-end yields an improvement of approximately 12% on optical flow estimation and object recognition over state-of-the-art methods.
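As a rough illustration of the grid-based representations discussed (not the paper's learned representation), events can be accumulated into a spatio-temporal voxel grid with a piecewise-linear temporal kernel, the kind of operation that stays differentiable when the kernel is made learnable. The event layout (x, y, timestamp, polarity) is an assumption for this sketch:

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events (rows of x, y, timestamp, polarity) into a
    (num_bins, H, W) grid, splitting each event's polarity between its
    two nearest temporal bins with a piecewise-linear kernel."""
    grid = np.zeros((num_bins, height, width), dtype=float)
    ts = events[:, 2]
    # normalize timestamps onto [0, num_bins - 1]
    t = (ts - ts.min()) / max(ts.max() - ts.min(), 1e-9) * (num_bins - 1)
    lower = np.floor(t).astype(int)
    for (x, y, _, p), lo, tv in zip(events, lower, t):
        frac = tv - lo
        grid[lo, int(y), int(x)] += p * (1.0 - frac)
        if lo + 1 < num_bins:
            grid[lo + 1, int(y), int(x)] += p * frac
    return grid
```

The resulting tensor can be fed to an ordinary CNN; replacing the fixed linear kernel with a trainable one is the step toward the end-to-end learned representations the paper describes.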

212 citations


Journal ArticleDOI
TL;DR: This paper investigates the joint problem of partial offloading scheduling and resource allocation for MEC systems with multiple independent tasks and proposes iterative algorithms for the joint POSP problem.
Abstract: Mobile edge computing (MEC) is a promising technique to enhance computation capacity at the edge of mobile networks. The joint problem of partial offloading decision, offloading scheduling, and resource allocation for MEC systems is challenging. In this paper, we investigate the joint problem of partial offloading scheduling and resource allocation for MEC systems with multiple independent tasks. A partial offloading scheduling and power allocation (POSP) problem in single-user MEC systems is formulated. The goal is to minimize the weighted sum of the execution delay and energy consumption while guaranteeing the transmission power constraint of the tasks. The execution delay of tasks running at both the MEC server and the mobile device is considered, as is the energy consumption of both task computing and task data transmission. The formulated problem is a nonconvex mixed-integer optimization problem. To solve it, we propose a two-level alternation framework based on Lagrangian dual decomposition. The task offloading decision and offloading scheduling problem, given the allocated transmission power, is solved in the upper level using flow shop scheduling theory or a greedy strategy, and the suboptimal power allocation given the partial offloading decision is obtained in the lower level using convex optimization techniques. We propose iterative algorithms for the joint POSP problem. Numerical results demonstrate that the proposed algorithms achieve near-optimal delay performance with a large reduction in energy consumption.
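The weighted delay-energy objective can be illustrated with a toy single-task grid search over the offloading fraction. This is a stand-in for the paper's Lagrangian-dual and flow-shop machinery, and every parameter name and unit choice below is illustrative, not from the paper:

```python
def best_offload_fraction(task_bits, f_local, f_mec, rate, p_tx,
                          w_delay=0.5, w_energy=0.5, k=1e-27, steps=100):
    """Grid-search the offloading fraction rho in [0, 1] that minimizes
    w_delay * delay + w_energy * energy for one task.
    f_local/f_mec: processing rates (bits/s, a toy simplification),
    rate: uplink rate (bits/s), p_tx: transmit power (W),
    k: effective energy coefficient of the local CPU."""
    best = (float("inf"), 0.0)
    for i in range(steps + 1):
        rho = i / steps
        local_bits, off_bits = (1 - rho) * task_bits, rho * task_bits
        # local compute and the offload path run in parallel
        t_local = local_bits / f_local
        t_off = off_bits / rate + off_bits / f_mec
        delay = max(t_local, t_off)
        energy = k * f_local ** 2 * local_bits + p_tx * off_bits / rate
        obj = w_delay * delay + w_energy * energy
        if obj < best[0]:
            best = (obj, rho)
    return best[1]
```

With a fast, cheap uplink the search pushes work to the edge server; with a slow uplink it keeps everything local, which is the trade-off the formulated POSP problem optimizes jointly across many tasks.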

210 citations


Journal ArticleDOI
TL;DR: There is a cluster of four methods that rank significantly better than the other methods, with one clear winner, and the inter-scanner robustness ranking shows that not all the methods generalize to unseen scanners.
Abstract: Quantification of cerebral white matter hyperintensities (WMH) of presumed vascular origin is of key importance in many neurological research studies. Currently, measurements are often still obtained from manual segmentations on brain MR images, which is a laborious procedure. Automatic WMH segmentation methods exist, but a standardized comparison of their performance is lacking. We organized a scientific challenge, in which developers could evaluate their methods on a standardized multi-center/-scanner image dataset, giving an objective comparison: the WMH Segmentation Challenge. Sixty T1 + FLAIR images from three MR scanners were released with manual WMH segmentations for training. A test set of 110 images from five MR scanners was used for evaluation. The segmentation methods had to be containerized and submitted to the challenge organizers. Five evaluation metrics were used to rank the methods: 1) Dice similarity coefficient; 2) modified Hausdorff distance (95th percentile); 3) absolute log-transformed volume difference; 4) sensitivity for detecting individual lesions; and 5) F1-score for individual lesions. In addition, the methods were ranked on their inter-scanner robustness. Twenty participants submitted their methods for evaluation. This paper provides a detailed analysis of the results. In brief, there is a cluster of four methods that rank significantly better than the other methods, with one clear winner. The inter-scanner robustness ranking shows that not all the methods generalize to unseen scanners. The challenge remains open for future submissions and provides a public platform for method evaluation.
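Two of the five ranking metrics, the Dice similarity coefficient and the per-lesion F1-score, reduce to short expressions. A sketch assuming binary masks and pre-counted lesion matches:

```python
import numpy as np

def dice(seg, gt):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    seg, gt = np.asarray(seg, bool), np.asarray(gt, bool)
    denom = seg.sum() + gt.sum()
    return 2.0 * np.logical_and(seg, gt).sum() / denom if denom else 1.0

def lesion_f1(tp, fp, fn):
    """Per-lesion F1 from counts of correctly detected (tp),
    spurious (fp), and missed (fn) individual lesions."""
    total = 2 * tp + fp + fn
    return 2 * tp / total if total else 1.0
```

Voxel-level Dice and lesion-level F1 capture different failure modes, which is why the challenge ranks on both (alongside the Hausdorff, volume, and sensitivity metrics).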

194 citations


Journal ArticleDOI
TL;DR: This paper reviews enzymatic pretreatment, a biological pretreatment method that has received less attention in the literature than other pretreatment methods, and surveys the current status of research to improve the biogas rate and yield from the AD of lignocellulosic biomass via enzyme pretreatment.

188 citations


Journal ArticleDOI
TL;DR: Type of social media use, body image dimension, country grouping, and age were all found to be significant moderators of this relationship. Strengths and limitations of the meta-analysis, as well as future directions for this line of research, are discussed.

177 citations


Journal ArticleDOI
TL;DR: This paper addresses the more challenging case of not only using a single camera but also not leveraging markers: going directly from 2D appearance to 3D geometry, using a novel approach that treats 2D joint locations as latent variables whose uncertainty distributions are given by a deep fully convolutional neural network.
Abstract: Recovering 3D full-body human pose is a challenging problem with many applications. It has been successfully addressed by motion capture systems with body worn markers and multiple cameras. In this paper, we address the more challenging case of not only using a single camera but also not leveraging markers: going directly from 2D appearance to 3D geometry. Deep learning approaches have shown remarkable abilities to discriminatively learn 2D appearance features. The missing piece is how to integrate 2D, 3D, and temporal information to recover 3D geometry and account for the uncertainties arising from the discriminative model. We introduce a novel approach that treats 2D joint locations as latent variables whose uncertainty distributions are given by a deep fully convolutional neural network. The unknown 3D poses are modeled by a sparse representation and the 3D parameter estimates are realized via an Expectation-Maximization algorithm, where it is shown that the 2D joint location uncertainties can be conveniently marginalized out during inference. Extensive evaluation on benchmark datasets shows that the proposed approach achieves greater accuracy over state-of-the-art baselines. Notably, the proposed approach does not require synchronized 2D-3D data for training and is applicable to “in-the-wild” images, which is demonstrated with the MPII dataset.

175 citations


Journal ArticleDOI
TL;DR: Evidence supports that probiotic supplementation in healthy adults can lead to transient improvement in gut microbiota concentration of supplement-specific bacteria, and evidence also supports the role of probiotics in improving immune system responses, stool consistency, bowel movement, and vaginal lactobacilli concentration.
Abstract: Probiotic supplements have a positive impact on several health outcomes. However, the majority of published studies have focused on populations with specific health pathologies. Therefore, this study reviewed the current literature on the health effects of probiotic consumption in "healthy adults." The findings from this review may help guide consumers, researchers, and manufacturers regarding probiotic supplementation. Relevant literature published between 1990 and August 2017 was reviewed. Studies were included if they were experimental trials, included healthy adults, used live bacteria, and had accessible full-text articles published in English. Included studies were classified according to common foci that emerged. Forty-five studies were included in this review. Five foci emerged: gut microbiota changes (n = 15); immune system response (n = 16); lipid profile and cardiovascular disease risk (n = 14); gastrointestinal discomfort (n = 11); and female reproductive health (n = 4). Results suggest that probiotic supplementation in healthy adults can lead to transient improvement in the gut microbiota concentration of supplement-specific bacteria. Evidence also supports the role of probiotics in improving immune system responses, stool consistency, bowel movement, and vaginal lactobacilli concentration. There is insufficient evidence to support the role of probiotics in improving blood lipid profile. Probiotic consumption can improve immune, gastrointestinal, and female reproductive health in healthy adults. However, this review failed to support the ability of probiotics to cause persistent changes in gut microbiota or to improve lipid profile in healthy adults. The feasibility of probiotic consumption to provide benefits in healthy adults requires further investigation.

Journal ArticleDOI
TL;DR: The methodology and results of the ISTSS Guideline process are described and the interpretation and implementation of the recommendations are considered.
Abstract: Over the last two decades, treatment guidelines have become major aids in the delivery of evidence-based care and improvement of clinical outcomes. The International Society for Traumatic Stress Studies (ISTSS) produced the first guidelines for the prevention and treatment of posttraumatic stress disorder (PTSD) in 2000 and published its latest recommendations, along with position papers on complex PTSD (CPTSD), in November 2018. A rigorous methodology was developed and followed; scoping questions were posed, systematic reviews were undertaken, and 361 randomized controlled trials were included according to the a priori agreed inclusion criteria. In total, 208 meta-analyses were conducted and used to generate 125 recommendations (101 for adults and 24 for children and adolescents) for specific prevention and treatment interventions, using an agreed definition of clinical importance and a recommendation-setting algorithm. Of these, eight were strong recommendations, eight standard, five low effect, 26 based on emerging evidence, and 78 had insufficient evidence to make a recommendation. The inclusion of separate scoping questions on treatments for complex presentations of PTSD was considered but decided against due to definitional issues and the virtual absence of studies specifically designed to clearly answer possible scoping questions in this area. Narrative reviews were undertaken and position papers prepared (one for adults and one for children and adolescents) to consider the current issues around CPTSD and make recommendations to facilitate further research. This paper describes the methodology and results of the ISTSS Guideline process and considers the interpretation and implementation of the recommendations.

Journal ArticleDOI
TL;DR: A blockchain-assisted lightweight anonymous authentication (BLA) mechanism for distributed VFS, provisioned to driving vehicles, which achieves anonymity and grants vehicle users the responsibility of preserving their privacy by effectively combining modern cryptographic and blockchain technologies.
Abstract: As modern vehicles and distributed fog services advance apace, vehicular fog services (VFSs) are being expected to span across multiple geo-distributed datacenters, which inevitably leads to cross-datacenter authentication. Traditional cross-datacenter authentication models are not suitable for the scenario of high-speed moving vehicles accessing VFS, because these models either ignored user privacy or ignored the delay requirement of driving vehicles. This paper proposes a blockchain-assisted lightweight anonymous authentication (BLA) mechanism for distributed VFS, which is provisioned to driving vehicles. BLA can achieve the following advantages: 1) realizing a flexible cross-datacenter authentication, in which a vehicle can decide whether to be reauthenticated or not when it enters a new vehicular fog datacenter; 2) achieving anonymity, and granting vehicle users the responsibility of preserving their privacy; 3) it is lightweight by achieving noninteractivity between vehicles and service managers (SMs), and eliminating the communication between SMs in the authentication process, which significantly reduces the communication delay; and 4) resisting the attack that the database governed by one center is tampered with. BLA achieves these advantages by effectively combining modern cryptographic technology and blockchain technology. These security features are demonstrated by carrying out security analysis. Meanwhile, extensive simulations are conducted to validate the efficiency and practicality of BLA.

Journal ArticleDOI
TL;DR: The models predict diabetes from commonly used lab results with high discriminatory power and satisfactory sensitivity; they could be built into an online tool to help physicians identify patients at risk of future diabetes and provide necessary preventive interventions.
Abstract: Diabetes Mellitus is an increasingly prevalent chronic disease characterized by the body's inability to metabolize glucose. The objective of this study was to build an effective predictive model with high sensitivity and selectivity to better identify Canadian patients at risk of having Diabetes Mellitus, based on patient demographic data and laboratory results from their visits to medical facilities. Using the most recent records of 13,309 Canadian patients aged between 18 and 90 years, along with their laboratory information (age, sex, fasting blood glucose, body mass index, high-density lipoprotein, triglycerides, blood pressure, and low-density lipoprotein), we built predictive models using Logistic Regression and Gradient Boosting Machine (GBM) techniques. The area under the receiver operating characteristic curve (AROC) was used to evaluate the discriminatory capability of these models. We used the adjusted-threshold method and the class-weight method to improve sensitivity, the proportion of Diabetes Mellitus patients correctly predicted by the model. We also compared these models to other machine learning techniques such as Decision Tree and Random Forest. The AROC for the proposed GBM model is 84.7% with a sensitivity of 71.6%, and the AROC for the proposed Logistic Regression model is 84.0% with a sensitivity of 73.4%. The GBM and Logistic Regression models perform better than the Random Forest and Decision Tree models. The ability of our models to predict diabetes from commonly used lab results is high, with satisfactory sensitivity. These models can be built into an online computer program to help physicians predict patients at risk of future diabetes and provide necessary preventive interventions. The model is developed and validated on the Canadian population, making it more specific and applicable to Canadian patients than existing models developed from US or other populations. Fasting blood glucose, body mass index, high-density lipoprotein, and triglycerides were the most important predictors in these models.
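The adjusted-threshold idea, lowering the decision threshold until a target sensitivity is reached, can be sketched as follows. This is a hypothetical illustration of the technique, not the study's code:

```python
def sensitivity_at_threshold(probs, labels, threshold):
    """Sensitivity (recall of the positive class) when a patient is
    flagged if the predicted probability meets or exceeds `threshold`."""
    tp = sum(1 for p, y in zip(probs, labels) if p >= threshold and y == 1)
    fn = sum(1 for p, y in zip(probs, labels) if p < threshold and y == 1)
    return tp / (tp + fn) if (tp + fn) else 0.0

def pick_threshold(probs, labels, target_sensitivity):
    """Lower the decision threshold from the default 0.5 until the
    target sensitivity is reached -- the 'adjusted threshold' method.
    Trades false positives for fewer missed positive cases."""
    threshold = 0.5
    while (threshold > 0 and
           sensitivity_at_threshold(probs, labels, threshold)
           < target_sensitivity):
        threshold -= 0.01
    return round(threshold, 2)
```

The class-weight method mentioned in the abstract attacks the same imbalance during training instead, by upweighting the positive class in the loss.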

Journal ArticleDOI
TL;DR: The experimental results demonstrate that the proposed SDN-based IIoT solution can improve grid reliability and enhance smart grid resilience, achieving multifunctional control and optimization by providing operators with real-time data monitoring to manage demand and resources and to increase system reliability.
Abstract: Software-defined networking (SDN) is a key enabling technology of the industrial Internet of Things (IIoT) that provides dynamic reconfiguration to improve data network robustness. In the context of smart grid infrastructure, the strong demand for seamless data transmission during critical events (e.g., failures or natural disturbances) is fundamentally shifting attitudes toward emerging technology. SDN will therefore play a vital role in the energy revolution by enabling flexible interfacing between smart utility domains and facilitating the integration of mixed renewable energy resources to deliver efficient power for a sustainable grid. In this regard, we propose a new SDN platform based on IIoT technology to support resiliency by reacting immediately whenever a failure occurs, recovering smart grid networks using real-time monitoring techniques. We employ an SDN controller to address the multifunctional control and optimization challenge by providing operators with real-time data monitoring to manage demand and resources and to increase system reliability. Data processing is used to manage resources at the local network level by employing an SDN switch segment, which is connected to the SDN controller through an IIoT aggregation node. Furthermore, we address different scenarios to control packet flows between switches on a hub-to-hub basis using traffic indicators of the infrastructure layer, in addition to any other data from the application layer. Extensive experimental simulation is conducted to validate the proposed platform model. The experimental results demonstrate that the proposed SDN-based IIoT solution can improve grid reliability and enhance smart grid resilience.

Journal ArticleDOI
06 Dec 2019-Science
TL;DR: It is concluded that conservation efforts to limit edges created by fragmentation will be most important in the world’s tropical forests because species that have evolved in, and survived in, high-disturbance environments are less sensitive to forest fragmentation.
Abstract: Habitat loss is the primary driver of biodiversity decline worldwide, but the effects of fragmentation (the spatial arrangement of remaining habitat) are debated. We tested the hypothesis that forest fragmentation sensitivity, driven by avoidance of habitat edges, should be predicted by species' historical exposure to, and thus their evolutionary responses to, disturbance. Using a database containing 73 datasets collected worldwide (encompassing 4489 animal species), we found that the proportion of fragmentation-sensitive species was nearly three times as high in regions with low rates of historical disturbance compared with regions with high rates of disturbance (i.e., fires, glaciation, hurricanes, and deforestation). These disturbances coincide with a latitudinal gradient in which sensitivity increases sixfold at low versus high latitudes. We conclude that conservation efforts to limit edges created by fragmentation will be most important in the world's tropical forests.

Journal ArticleDOI
TL;DR: In this article, the authors survey approaches and insights that have helped to identify and remove systemic bias and barriers in science and medicine, and propose tools that will help drive organisational change toward gender equality.

Journal ArticleDOI
TL;DR: The simulation results show that renewable energy resources can improve frequency excursions under various operating conditions, provided that comprehensive small-signal dynamic models for RESs are introduced into the isolated micro-grid and their contribution to load frequency control studies is properly considered.

Journal ArticleDOI
TL;DR: In this article, biochar, produced from residual biomass of the bio-ethanol industry (Dry Distillers Grains), was added as filler to a standard concrete, aiming at finding potential solutions for simultaneous carbon sequestration and improved properties and performance of the concrete.
Abstract: The current imbalance of carbon in the atmosphere is stimulating the search for carbon sequestration opportunities and for alternative processes and products with a reduced carbon footprint. Biochar, produced from residual biomass of the bio-ethanol industry (Dry Distillers Grains), was added as filler to a standard concrete, aiming at finding potential solutions for simultaneous carbon sequestration and improved properties and performance of the concrete. The addition of biochar resulted in a linear decrease in concrete density, with a concrete density of 1454 kg/m3 for 15 wt% biochar. The addition of biochar also considerably increased the sound absorption coefficient of concrete across the range of 200–2000 Hz, as it created pore networks within the concrete. The thermal conductivity of the concrete showed the largest reductions with 2 wt% of biochar, reaching lows of 0.192 W/(m·K). Finally, the incorporation of biochar showed a detrimental effect on the compressive strength of the concrete, which would put bio-enhanced concretes in the low-strength concrete classification category.

Journal ArticleDOI
TL;DR: Topographical substrates that control cell adhesion in two and three dimensions are reviewed and compared with two- and three-dimensional culture models.
Abstract: In the body, cells reside within a complex three-dimensional (3D) extracellular matrix that provides physical and chemical signals to regulate cell fate. Cells cultured in Petri dishes and tissue culture flasks (2D) receive completely different environmental cues compared to natural tissues, causing radical alterations in cell morphology and function. Three-dimensional culture models have been able to revolutionize biomedical applications by better emulating natural tissues. However, sample handling and high-throughput screening can be challenging with 3D cell culture. Moreover, most 3D matrices are unable to quantify intracellular mechanics due to their structurally undefined surface characteristics. Therefore, highly structured surfaces (2½D) comprising various micro- and nano-patterns were introduced to address these limitations. The topographical substrates have also been shown to retain in vivo cell functionalities, such as proliferative capacity. Here, we review recent advancements in the modulation of surface patterns that can control cell adhesion in two or three dimensions, and their impacts on cell behavior. Finally, we provide a comparison between 2D, 2½D and 3D systems and present several clinical applications of non-planar substrates.

Journal ArticleDOI
TL;DR: This review focuses on the concept that the lysosome surface serves as a platform to assemble major signaling hubs like mTORC1, AMPK, GSK3 and the inflammasome, which enable responses such as autophagy, cell growth, membrane repair and microbe clearance.
Abstract: Lysosomes are the terminal degradative compartment of autophagy, endocytosis and phagocytosis. What once was viewed as a simple acidic organelle in charge of macromolecular digestion has emerged as a dynamic organelle capable of integrating cellular signals and producing signal outputs. In this review, we focus on the concept that the lysosome surface serves as a platform to assemble major signaling hubs like mTORC1, AMPK, GSK3 and the inflammasome. These molecular assemblies integrate and facilitate cross-talk between signals such as amino acid and energy levels, membrane damage and infection, and ultimately enable responses such as autophagy, cell growth, membrane repair and microbe clearance. In particular, we review how molecular machinery like the vacuolar-ATPase proton pump, sestrins, the GATOR complexes, and the Ragulator, modulate mTORC1, AMPK, GSK3 and inflammation. We then elaborate how these signals control autophagy initiation and resolution, TFEB-mediated lysosome adaptation, lysosome remodeling, antigen presentation, inflammation, membrane damage repair and clearance. Overall, by being at the cross-roads for several membrane pathways, lysosomes have emerged as the ideal surveillance compartment to sense, integrate and elicit cellular behavior and adaptation in response to changing environmental and cellular conditions.

Journal ArticleDOI
TL;DR: The importance of each challenge in MPC and its impact on system performance are discussed, and the MMC mathematical models used in the implementation of MPC are presented.
Abstract: Model predictive control (MPC) has emerged as a promising approach to control a modular multilevel converter (MMC). With the help of a cost function, the control objectives of an MMC are achieved easily using an MPC approach. However, MPC has several technical challenges and issues when it comes to the control of an MMC, including the need for accurate system models, computational complexity, variable switching-frequency operation, and weighting factor selection. In the past few years, several research studies have been conducted to address some of these challenges, and several model predictive algorithms have been developed for the MMC. In this paper, the importance of each challenge and its impact on system performance are discussed. Also, the MMC mathematical models used in the implementation of MPC are presented. Furthermore, some of the popular MPC algorithms are discussed briefly, and their features and performance are highlighted through case studies. Finally, a summary and future trends of MPC for the MMC are presented.
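The cost-function mechanism at the heart of finite-control-set MPC can be sketched with a toy first-order plant; the plant, reference, and weighting factor below are illustrative, not an MMC model:

```python
def fcs_mpc_step(x, candidates, predict, cost):
    """One step of finite-control-set MPC: predict the next state for
    each candidate switching state and return the candidate that
    minimizes the cost function."""
    best_u, best_cost = None, float("inf")
    for u in candidates:
        c = cost(predict(x, u), u)
        if c < best_cost:
            best_u, best_cost = u, c
    return best_u

# Toy first-order plant x_next = x + u, tracking a reference of 1.0 with
# a weighted penalty on control effort -- the weighting-factor selection
# the survey identifies as one of the open challenges.
ref, lam = 1.0, 0.1
u = fcs_mpc_step(0.0, [-1, 0, 1],
                 predict=lambda x, u: x + u,
                 cost=lambda xn, u: (xn - ref) ** 2 + lam * abs(u))
```

For an MMC, `candidates` would be the admissible submodule switching states and the cost would combine terms such as output-current tracking, circulating-current suppression, and capacitor-voltage balancing, each with its own weighting factor.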

Journal ArticleDOI
TL;DR: In this article, the phase change cycle of phase change materials (PCMs) and their impact on indoor air and interior building surface temperatures are assessed, and the results indicate improved performance of the test cell containing the composite PCM system in lowering peak indoor and surface temperatures by up to 6°C.

Journal ArticleDOI
TL;DR: In this article, the authors investigate whether the use of virtual, interactive learning can improve outcomes in online and technology-enabled learning in postsecondary education, and find that it can.
Abstract: There are growing trends in postsecondary education that emphasize the importance of online and technology-enabled learning. This study aims to investigate whether the use of virtual, interactive, ...

Journal ArticleDOI
TL;DR: A systematic literature review on the applications of learning curves in production and operations management is presented, with a framework that includes typical learning curve models developed and a discussion of the most important and informative articles in each of the major categories.

Journal ArticleDOI
TL;DR: A taxonomy of deep learning-based aspect-level sentiment classification is designed and a comprehensive summary of the state-of-the-art methods is given, showing the tremendous progress that has already been made in ASC.
Abstract: This survey focuses on deep learning-based aspect-level sentiment classification (ASC), which aims to determine the sentiment polarity toward an aspect mentioned within the document. Along with the success of deep learning in many applications, deep learning-based ASC has attracted a lot of interest from both academia and industry in recent years. However, there is still no systematic taxonomy of existing approaches or comparison of their performance; these are the gaps that our survey aims to fill. Furthermore, to quantitatively evaluate the performance of various approaches, standardization of the evaluation methodology and shared datasets is necessary. In this paper, an in-depth overview of the current state-of-the-art deep learning-based methods is given, showing the tremendous progress that has already been made in ASC. In particular, we first provide a comprehensive review of recent research efforts on deep learning-based ASC. More concretely, we design a taxonomy of deep learning-based ASC and provide a comprehensive summary of the state-of-the-art methods. Then, we collect all benchmark ASC datasets for researchers to study and conduct extensive experiments over five public standard datasets with various commonly used evaluation measures. Finally, we discuss some of the most challenging open problems and point out promising future research directions in this field.
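Among the "commonly used evaluation measures" for ASC, accuracy and macro-averaged F1 are standard choices. A minimal sketch of macro-F1 follows; the three-polarity label names are illustrative, not tied to any particular dataset in the survey:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute per-class F1 (e.g. over the positive,
    negative, and neutral polarities) and average with equal class weight,
    so rare polarities count as much as frequent ones."""
    labels = sorted(set(y_true) | set(y_pred))
    f1_scores = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        denom = precision + recall
        f1_scores.append(2 * precision * recall / denom if denom else 0.0)
    return sum(f1_scores) / len(f1_scores)
```

Macro averaging matters for ASC because sentiment datasets are often skewed toward one polarity; micro-averaged scores would mask poor performance on the minority classes.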

Journal ArticleDOI
TL;DR: Intersectionality involves the study of the ways that race, gender, disability, sexuality, class, age, and other social categories are mutually shaped and interrelated through forces such as coloni...
Abstract: Intersectionality involves the study of the ways that race, gender, disability, sexuality, class, age, and other social categories are mutually shaped and interrelated through forces such as coloni...

Journal ArticleDOI
TL;DR: This paper proposes a current-controlled voltage-mode control scheme for dispatchable electronically coupled distributed energy resource (DER) units based on a sliding-mode control (SMC) strategy that provides fast and stable control of the terminal voltage and frequency of the DER unit and ensures protection of the power-electronic interface against external faults.
Abstract: This paper proposes a current-controlled voltage-mode control scheme for dispatchable electronically coupled distributed energy resource (DER) units based on a sliding-mode control (SMC) strategy. The proposed control strategy provides fast and stable control of the terminal voltage and frequency of the DER unit and ensures protection of the power-electronic interface against external faults. In addition, it maintains the quality of the output voltage of the host DER unit in spite of unbalanced and/or distorted load currents. Performance of the proposed SMC strategy is demonstrated through time-domain simulation of a single islanded DER unit, starting from black start and subjected to various operating scenarios, and is compared to the performance of a proportional-integral (PI) based control strategy. A current-mode control strategy based on SMC is also proposed for DER units. The performance of both the voltage- and current-mode control strategies is demonstrated in a sample master-slave organized three-unit microgrid.
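The sliding-mode idea can be illustrated with a generic first-order sliding surface and a boundary-layer (saturated) reaching law. The surface, gains, and saturation width here are illustrative assumptions, not the paper's specific voltage- or current-mode design:

```python
def smc_law(error, d_error, lam, K, phi):
    """Generic SMC law: drive the state to the sliding surface
    s = de/dt + lam*e, then slide along it. The discontinuous sign
    function is replaced by a saturation over boundary-layer width phi
    to limit chattering of the power-electronic switches."""
    s = d_error + lam * error
    sat = max(-1.0, min(1.0, s / phi))  # smooth stand-in for sign(s)
    return -K * sat
```

Outside the boundary layer the output saturates at -K or +K, giving the fast, robust reaching behavior that motivates SMC over a PI loop; inside it, the law acts like a high-gain linear controller.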

Journal ArticleDOI
TL;DR: A review of the continual development of context-aware recommender systems, analyzing different kinds of context without limiting the scope to any specific application domain, and examining what modifications or additions are applied on top of conventional recommendation approaches to produce context-aware recommendations.

Journal ArticleDOI
TL;DR: In this paper, a review of the use of phase change materials (PCMs) as an additive or replacement material in typical concrete mixtures for building applications is presented. The review shows that organic paraffin and non-paraffin PCMs are the most suitable for incorporation into concrete mixtures, as they have melting points that match human comfort temperatures, high heat capacity, low volume change during the phase transition, and good chemical and thermal stability.