
Showing papers by "Concordia University published in 2016"


Journal ArticleDOI
Daniel J. Klionsky, Kotb Abdelmohsen, Akihisa Abe, Joynal Abedin, +2,519 more (695 institutions)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations



Journal ArticleDOI
TL;DR: This paper provides a survey-style introduction to dense small cell networks and considers many research directions, namely, user association, interference management, energy efficiency, spectrum sharing, resource management, scheduling, backhauling, propagation modeling, and the economics of UDN deployment.
Abstract: The exponential growth and availability of data in all forms is the main booster to the continuing evolution in the communications industry. The popularization of traffic-intensive applications including high definition video, 3-D visualization, augmented reality, wearable devices, and cloud computing defines a new era of mobile communications. The immense amount of traffic generated by today’s customers requires a paradigm shift in all aspects of mobile networks. Ultradense network (UDN) is one of the leading ideas in this racetrack. In UDNs, the access nodes and/or the number of communication links per unit area are densified. In this paper, we provide a survey-style introduction to dense small cell networks. Moreover, we summarize and compare some of the recent achievements and research findings. We discuss the modeling techniques and the performance metrics widely used to model problems in UDN. Also, we present the enabling technologies for network densification in order to understand the state-of-the-art. We consider many research directions in this survey, namely, user association, interference management, energy efficiency, spectrum sharing, resource management, scheduling, backhauling, propagation modeling, and the economics of UDN deployment. Finally, we discuss the challenges and open problems to the researchers in the field or newcomers who aim to conduct research in this interesting and active area of research.

828 citations


Journal ArticleDOI
TL;DR: An overview of both historical and recent USV development is provided, along with some fundamental definitions, and existing USV GNC approaches are outlined and classified according to various criteria, such as their applications, methodologies, and challenges.

709 citations


Journal ArticleDOI
TL;DR: This work adapted Kahneman's seminal (1973) Capacity Model of Attention to listening and proposed a heuristically useful Framework for Understanding Effortful Listening (FUEL), which incorporates the well-known relationship between cognitive demand and the supply of cognitive capacity that is the foundation of cognitive theories of attention.
Abstract: The Fifth Eriksholm Workshop on “Hearing Impairment and Cognitive Energy” was convened to develop a consensus among interdisciplinary experts about what is known on the topic, gaps in knowledge, the use of terminology, priorities for future research, and implications for practice. The general term cognitive energy was chosen to facilitate the broadest possible discussion of the topic. It goes back to Titchener (1908) who described the effects of attention on perception; he used the term psychic energy for the notion that limited mental resources can be flexibly allocated among perceptual and mental activities. The workshop focused on three main areas: (1) theories, models, concepts, definitions, and frameworks; (2) methods and measures; and (3) knowledge translation. We defined effort as the deliberate allocation of mental resources to overcome obstacles in goal pursuit when carrying out a task, with listening effort applying more specifically when tasks involve listening. We adapted Kahneman’s seminal (1973) Capacity Model of Attention to listening and proposed a heuristically useful Framework for Understanding Effortful Listening (FUEL). Our FUEL incorporates the well-known relationship between cognitive demand and the supply of cognitive capacity that is the foundation of cognitive theories of attention. Our FUEL also incorporates a motivation dimension based on complementary theories of motivational intensity, adaptive gain control, and optimal performance, fatigue, and pleasure. Using a three-dimensional illustration, we highlight how listening effort depends not only on hearing difficulties and task demands but also on the listener’s motivation to expend mental effort in the challenging situations of everyday life.

686 citations


Journal ArticleDOI
TL;DR: The Triple Layered Business Model Canvas as discussed by the authors is a tool for exploring sustainability-oriented business model innovation, which extends the original business model canvas by adding two layers: an environmental layer based on a lifecycle perspective and a social layer based on a stakeholder perspective.

589 citations


Journal ArticleDOI
TL;DR: The progress of different modeling and control approaches for piezo-actuated nanopositioning stages is discussed and new opportunities for extended studies are highlighted.
Abstract: Piezo-actuated stages have become more and more promising in nanopositioning applications due to their fast response time, large mechanical force, and extremely fine resolution. Modeling and control are critical to achieve objectives for high-precision motion. However, piezo-actuated stages themselves suffer from inherent drawbacks produced by creep and hysteresis nonlinearities and by vibration caused by the lightly damped resonant dynamics, which make modeling and control of such systems challenging. To address these challenges, various techniques have been reported in the literature. This paper surveys and discusses the progress of different modeling and control approaches for piezo-actuated nanopositioning stages and highlights new opportunities for extended studies.

458 citations


Journal ArticleDOI
TL;DR: In this article, the authors evaluated different UHI mitigation strategies in different urban neighborhoods of Toronto, selected according to their building density, and evaluated the effects of cool surfaces (on the roofs, on the street pavements or as vegetation areas) through numerical simulations using the software ENVI-met.

328 citations


Journal ArticleDOI
01 Jun 2016
TL;DR: This paper presents a model predictive control scheme incorporating neural-dynamic optimization to achieve trajectory tracking of nonholonomic mobile robots (NMRs), and demonstrates that the MPC scheme performs effectively on a real mobile robot system.
Abstract: Mobile robots tracking a reference trajectory are constrained by the motion limits of their actuators, which impose the requirement for high autonomy driving capabilities in robots. This paper presents a model predictive control (MPC) scheme incorporating neural-dynamic optimization to achieve trajectory tracking of nonholonomic mobile robots (NMRs). By using the derived tracking-error kinematics of nonholonomic robots, the proposed MPC approach is iteratively transformed into a constrained quadratic programming (QP) problem, and then a primal–dual neural network is used to solve this QP problem over a finite receding horizon. The applied neural-dynamic optimization makes the cost function of the MPC converge to the exact optimal value of the formulated constrained QP. Compared with existing fast MPC, which requires repeatedly calculating the Hessian matrix of the Lagrangian and then solving a quadratic program with computational complexity $O(n^3)$, the proposed neural-dynamic optimization requires only $O(n^2)$ operations. Finally, extensive experiments illustrate that the MPC scheme performs effectively on a real mobile robot system.
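The core idea of solving the per-step constrained QP with an iterative dynamical scheme can be sketched as follows. The paper's primal-dual neural network is replaced here by plain projected gradient descent, and the QP data are arbitrary illustrative values, not a real robot model.

```python
def solve_box_qp(Q, c, lo, hi, steps=2000, lr=0.05):
    """Minimize 0.5*x'Qx + c'x subject to lo <= x <= hi by
    projected gradient descent -- a simple stand-in for the
    primal-dual neural dynamics used in the paper."""
    n = len(c)
    x = [0.0] * n
    for _ in range(steps):
        # gradient of the quadratic cost: Qx + c
        g = [sum(Q[i][j] * x[j] for j in range(n)) + c[i]
             for i in range(n)]
        # gradient step followed by projection onto the box
        x = [min(hi[i], max(lo[i], x[i] - lr * g[i]))
             for i in range(n)]
    return x

# Toy 2-D QP: unconstrained minimum at (1, 2), box [0, 1]^2,
# so the constrained optimum is (1, 1).
Q = [[2.0, 0.0], [0.0, 2.0]]
c = [-2.0, -4.0]
x_star = solve_box_qp(Q, c, lo=[0.0, 0.0], hi=[1.0, 1.0])
```

In receding-horizon MPC, a QP like this would be re-solved at every control step with updated tracking-error data, which is why cheap per-step solvers matter.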

309 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a selective representation of the research on the development and evaluation of mitigation measures, including: cool roofs, cool pavements, and urban vegetation, to counter the effects of summertime UHI.

302 citations


Journal ArticleDOI
TL;DR: The basics of WSN virtualization are introduced, its pertinence is motivated with carefully selected scenarios, and existing works are presented in detail and critically evaluated using a set of requirements derived from the scenarios.
Abstract: Wireless Sensor Networks (WSNs) are the key components of the emerging Internet-of-Things (IoT) paradigm. They are now ubiquitous and used in a plurality of application domains. WSNs are still domain specific and usually deployed to support a specific application. However, as WSNs' nodes are becoming more and more powerful, it is getting more and more pertinent to research how multiple applications could share a very same WSN infrastructure. Virtualization is a technology that can potentially enable this sharing. This paper is a survey on WSN virtualization. It provides a comprehensive review of the state-of-the-art and an in-depth discussion of the research issues. We introduce the basics of WSN virtualization and motivate its pertinence with carefully selected scenarios. Existing works are presented in detail and critically evaluated using a set of requirements derived from the scenarios. The pertinent research projects are also reviewed. Several research issues are also discussed with hints on how they could be tackled.

Journal ArticleDOI
TL;DR: The study sheds light on the controversial act of presenteeism, uncovering both positive and negative underlying mechanisms and underlines the opportunities for researchers to meaningfully investigate and for organizations to manage it.
Abstract: Interest in presenteeism, attending work while ill, has flourished in light of its consequences for individual well-being and organizational productivity. Our goal was to identify its most significant causes and correlates by quantitatively summarizing the extant research. Additionally, we built an empirical model of some key correlates and compared the etiology of presenteeism versus absenteeism. We used meta-analysis (in total, K = 109 samples, N = 175,965) to investigate the correlates of presenteeism and meta-analytic structural equation modeling to test the empirical model. Salient correlates of working while ill included general ill health, constraints on absenteeism (e.g., strict absence policies, job insecurity), elevated job demands and felt stress, lack of job and personal resources (e.g., low support and low optimism), negative relational experiences (e.g., perceived discrimination), and positive attitudes (satisfaction, engagement, and commitment). Moreover, our dual process model clarified how job demands and job and personal resources elicit presenteeism via both health impairment and motivational paths, and they explained more variation in presenteeism than absenteeism. The study sheds light on the controversial act of presenteeism, uncovering both positive and negative underlying mechanisms. The greater variance explained in presenteeism as opposed to absenteeism underlines the opportunities for researchers to meaningfully investigate the behavior and for organizations to manage it.

Journal ArticleDOI
TL;DR: As discussed by the authors, the sustainable development of cities is increasingly recognized as crucial to meeting collectively agreed sustainability goals at local, regional and global scales, and more broadly to securing hu...

Journal ArticleDOI
TL;DR: This paper uses data from the popular online Q&A site, Stack Overflow, and analyzes 13,232,821 posts to examine what mobile developers ask about, and establishes a novel approach for analyzing questions asked on Q&A forums.
Abstract: The popularity of mobile devices has been steadily growing in recent years. These devices heavily depend on software from the underlying operating systems to the applications they run. Prior research showed that mobile software is different than traditional, large software systems. However, to date most of our research has been conducted on traditional software systems. Very little work has focused on the issues that mobile developers face. Therefore, in this paper, we use data from the popular online Q&A site, Stack Overflow, and analyze 13,232,821 posts to examine what mobile developers ask about. We employ Latent Dirichlet allocation-based topic models to help us summarize the mobile-related questions. Our findings show that developers are asking about app distribution, mobile APIs, data management, sensors and context, mobile tools, and user interface development. We also determine what popular mobile-related issues are the most difficult, explore platform specific issues, and investigate the types (e.g., what, how, or why) of questions mobile developers ask. Our findings help highlight the challenges facing mobile developers that require more attention from the software engineering research and development communities in the future and establish a novel approach for analyzing questions asked on Q&A forums.
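The paper's Latent Dirichlet allocation step can be sketched in miniature with a collapsed Gibbs sampler. The toy "posts" and hyperparameters below are invented for illustration; a real analysis of millions of Stack Overflow posts would use an optimized library implementation.

```python
import random

def lda_gibbs(docs, n_topics, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Toy collapsed Gibbs sampler for LDA, an illustrative
    miniature of topic modeling over developer questions."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    w2i = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    # count tables: doc-topic, topic-word, per-topic totals
    ndk = [[0] * n_topics for _ in docs]
    nkw = [[0] * V for _ in range(n_topics)]
    nk = [0] * n_topics
    z = []  # topic assignment per token
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            t = rng.randrange(n_topics)
            zd.append(t)
            ndk[d][t] += 1; nkw[t][w2i[w]] += 1; nk[t] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                ndk[d][t] -= 1; nkw[t][w2i[w]] -= 1; nk[t] -= 1
                # full conditional p(topic | all other assignments)
                p = [(ndk[d][k] + alpha) * (nkw[k][w2i[w]] + beta)
                     / (nk[k] + V * beta) for k in range(n_topics)]
                r = rng.random() * sum(p)
                t = 0
                while t < n_topics - 1 and r > p[t]:
                    r -= p[t]; t += 1
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][w2i[w]] += 1; nk[t] += 1
    return z, nkw, vocab

# Two obvious hypothetical themes: UI questions vs. data storage.
docs = [["button", "layout", "view", "button"],
        ["layout", "view", "fragment", "button"],
        ["sqlite", "database", "query", "cursor"],
        ["database", "query", "sqlite", "cursor"]]
z, nkw, vocab = lda_gibbs(docs, n_topics=2)
```

After sampling, each topic's most frequent words in `nkw` serve as its summary, which is essentially how topic labels such as "user interface development" or "data management" are derived from the model.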

Journal ArticleDOI
TL;DR: This paper proposes to employ a multi-antenna base station (BS) as a source of green interference to enhance secure transmission in the satellite network and presents two beamforming schemes, namely, hybrid zero- forcing and partial zero-forcing to solve the optimization problem and obtain the BF weight vectors in a closed form.
Abstract: This paper investigates the physical layer security of a satellite network, whose downlink spectral resource is shared with a terrestrial cellular network. We propose to employ a multi-antenna base station (BS) as a source of green interference to enhance secure transmission in the satellite network. By taking the mutual interference between these two networks into account, we first formulate a constrained optimization problem to maximize the instantaneous rate of the terrestrial user while satisfying the interference probability constraint of the satellite user. Then, with the assumption that imperfect channel state information (CSI) and statistical CSI of the link between the BS and satellite user are available at the BS, we present two beamforming (BF) schemes, namely, hybrid zero-forcing and partial zero-forcing, to solve the optimization problem and obtain the BF weight vectors in a closed form. Moreover, we analyze the secrecy performance of the primary satellite network by considering two practical scenarios, namely: Scenario I, the eavesdropper's CSI is unknown at the satellite, and Scenario II, the eavesdropper's CSI is known at the satellite. Specifically, we derive the analytical expressions for the secrecy outage probability for Scenario I and the average secrecy rate for Scenario II. Finally, numerical results are provided to confirm the superiority of the proposed BF schemes and the validity of the performance analysis, as well as demonstrate the impacts of various parameters on the secrecy performance of the satellite network.
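The zero-forcing idea underlying the proposed BF schemes can be sketched as projecting the served user's channel onto the orthogonal complement of the channel whose interference must be nulled. The 2-antenna complex channel vectors below are arbitrary illustrative values, not the paper's system model.

```python
def zero_forcing_weights(h, g):
    """Unit-norm beamforming weights that serve channel h while
    nulling channel g: w is the normalized component of h
    orthogonal to g."""
    def hdot(a, b):  # Hermitian inner product a^H b
        return sum(x.conjugate() * y for x, y in zip(a, b))
    # subtract the component of h lying along g
    coef = hdot(g, h) / hdot(g, g)
    w = [hx - coef * gx for hx, gx in zip(h, g)]
    norm = abs(hdot(w, w)) ** 0.5
    return [x / norm for x in w]

# Illustrative 2-antenna channels: h to the terrestrial user,
# g the BS-to-satellite-user link whose interference we null.
h = [1 + 1j, 0.5 - 0.2j]
g = [0.3 + 0.1j, 1 - 0.5j]
w = zero_forcing_weights(h, g)

leak = sum(x.conjugate() * y for x, y in zip(g, w))  # interference term
gain = sum(x.conjugate() * y for x, y in zip(h, w))  # desired-user gain
```

By construction `g^H w = 0`, so the protected link sees no interference, while the desired user keeps a nonzero array gain; the paper's hybrid and partial variants trade off this hard null against imperfect CSI.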

Journal ArticleDOI
TL;DR: This article surveys the security, privacy, and safety aspects of civilian drones, discusses the security properties required by their critical operation environment, and forecasts that security will be a central enabling technology for the next generation of civilian unmanned aerial vehicles.
Abstract: The market for civilian unmanned aerial vehicles, also known as drones, is expanding rapidly as new applications are emerging to incorporate the use of civilian drones in our daily lives. On one hand, the convenience of offering certain services via drones is attractive. On the other hand, the mere operation of these airborne machines, which rely heavily on their cyber capabilities, poses great threats to people and property. Also, while the Federal Aviation Administration NextGen project aims to integrate civilian drones into the national airspace, the regulation is still a work-in-progress and does not cope with their threats. This article surveys the main security, privacy, and safety aspects associated with the use of civilian drones in the national airspace. In particular, we identify both the physical and cyber threats of such systems and discuss the security properties required by their critical operation environment. We also identify the research challenges and possible future directions in the fields of civilian drone security, safety, and privacy. Based on our investigation, we forecast that security will be a central enabling technology for the next generation of civilian unmanned aerial vehicles.

Journal ArticleDOI
TL;DR: This paper addresses the problem of evaluating green supplier development programs and proposes a fuzzy NGT (Nominal Group Technique)-VIKOR (VlseKriterijumska Optimizacija I Kompromisno Resenje) based solution approach.

Journal ArticleDOI
TL;DR: In this article, a collection of short articles written by experts in thermal spray is compiled; the experts were asked to present a snapshot of the current state of their specific field, give their views on current challenges faced by the field, and provide some guidance as to the R&D required to meet these challenges.
Abstract: Considerable progress has been made over the last decades in thermal spray technologies, practices and applications. However, like other technologies, they have to continuously evolve to meet new problems and market requirements. This article aims to identify the current challenges limiting the evolution of these technologies and to propose research directions and priorities to meet these challenges. It was prepared on the basis of a collection of short articles written by experts in thermal spray who were asked to present a snapshot of the current state of their specific field, give their views on current challenges faced by the field and provide some guidance as to the R&D required to meet these challenges. The article is divided in three sections that deal with the emerging thermal spray processes, coating properties and function, and biomedical, electronic, aerospace and energy generation applications.


Journal ArticleDOI
TL;DR: A genetic algorithm-based method for solving the VNF scheduling problem efficiently is developed and it is shown that dynamically adjusting the bandwidths on virtual links connecting virtual machines, hosting the network functions, reduces the schedule makespan by 15%-20% in the simulated scenarios.
Abstract: To accelerate the implementation of network functions/middle boxes and reduce the deployment cost, recently, the concept of network function virtualization (NFV) has emerged and become a topic of much interest attracting the attention of researchers from both industry and academia. Unlike the traditional implementation of network functions, a software-oriented approach for virtual network functions (VNFs) creates more flexible and dynamic network services to meet a more diversified demand. Software-oriented network functions bring along a series of research challenges, such as VNF management and orchestration, service chaining, VNF scheduling for low latency and efficient virtual network resource allocation with NFV infrastructure, among others. In this paper, we study the VNF scheduling problem and the corresponding resource optimization solutions. Here, the VNF scheduling problem is defined as a series of scheduling decisions for network services on network functions and activating the various VNFs to process the arriving traffic. We consider VNF transmission and processing delays and formulate the joint problem of VNF scheduling and traffic steering as a mixed integer linear program. Our objective is to minimize the makespan/latency of the overall VNFs’ schedule. Reducing the scheduling latency enables cloud operators to service (and admit) more customers, and cater to services with stringent delay requirements, thereby increasing operators’ revenues. Owing to the complexity of the problem, we develop a genetic algorithm-based method for solving the problem efficiently. Finally, the effectiveness of our heuristic algorithm is verified through numerical evaluation. We show that dynamically adjusting the bandwidths on virtual links connecting virtual machines, hosting the network functions, reduces the schedule makespan by 15%–20% in the simulated scenarios.
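The flavor of the genetic algorithm approach can be sketched with a toy makespan-minimization problem. The permutation encoding, operators, and job durations below are invented for illustration and are far simpler than VNF scheduling with traffic steering.

```python
import random

def makespan(order, durations, n_machines=2):
    """Greedy list scheduling: assign each job, in the given order,
    to the machine that frees up first; return the makespan."""
    loads = [0.0] * n_machines
    for j in order:
        m = loads.index(min(loads))
        loads[m] += durations[j]
    return max(loads)

def ga_schedule(durations, pop_size=30, gens=100, seed=1):
    """Tiny genetic algorithm over job permutations: elitist
    survival, order crossover, and swap mutation."""
    rng = random.Random(seed)
    n = len(durations)

    def crossover(a, b):
        # order crossover: keep a slice of parent a, fill the
        # rest in parent b's order
        i, j = sorted(rng.sample(range(n), 2))
        mid = a[i:j]
        rest = [x for x in b if x not in mid]
        return rest[:i] + mid + rest[i:]

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda p: makespan(p, durations))
        pop = scored[:pop_size // 2]           # elitist survival
        while len(pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)  # parents from the best
            child = crossover(a, b)
            if rng.random() < 0.2:             # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            pop.append(child)
    return min(pop, key=lambda p: makespan(p, durations))

durations = [4, 7, 2, 5, 3, 6, 1, 8]   # hypothetical job lengths
best = ga_schedule(durations)
best_ms = makespan(best, durations)
```

Since the durations sum to 36, no two-machine schedule can finish before 18, which gives a simple lower bound to judge the GA's result against.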

Proceedings ArticleDOI
01 Nov 2016
TL;DR: This work monitored Java projects hosted on GitHub to detect recently applied refactorings, and asked developers to explain the reasons behind their decision to refactor the code, compiling a catalogue of 44 distinct motivations for 12 well-known refactoring types.
Abstract: Refactoring is a widespread practice that helps developers to improve the maintainability and readability of their code. However, there is a limited number of studies empirically investigating the actual motivations behind specific refactoring operations applied by developers. To fill this gap, we monitored Java projects hosted on GitHub to detect recently applied refactorings, and asked the developers to explain the reasons behind their decision to refactor the code. By applying thematic analysis on the collected responses, we compiled a catalogue of 44 distinct motivations for 12 well-known refactoring types. We found that refactoring activity is mainly driven by changes in the requirements and much less by code smells. Extract Method is the most versatile refactoring operation serving 11 different purposes. Finally, we found evidence that the IDE used by the developers affects the adoption of automated refactoring tools.

Journal ArticleDOI
21 Nov 2016 - PLOS ONE
TL;DR: It is found that comparatively expensive mitigation measures reduce large mammal road-kill much more than inexpensive measures, and that inexpensive measures such as reflectors should not be used until and unless their effectiveness is tested using a high-quality experimental approach.
Abstract: Road traffic kills hundreds of millions of animals every year, posing a critical threat to the populations of many species. To address this problem there are more than forty types of road mitigation measures available that aim to reduce wildlife mortality on roads (road-kill). For road planners, deciding on what mitigation method to use has been problematic because there is little good information about the relative effectiveness of these measures in reducing road-kill, and the costs of these measures vary greatly. We conducted a meta-analysis using data from 50 studies that quantified the relationship between road-kill and a mitigation measure designed to reduce road-kill. Overall, mitigation measures reduce road-kill by 40% compared to controls. Fences, with or without crossing structures, reduce road-kill by 54%. We found no detectable effect on road-kill of crossing structures without fencing. We found that comparatively expensive mitigation measures reduce large mammal road-kill much more than inexpensive measures. For example, the combination of fencing and crossing structures led to an 83% reduction in road-kill of large mammals, compared to a 57% reduction for animal detection systems, and only a 1% reduction for wildlife reflectors. We suggest that inexpensive measures such as reflectors should not be used until and unless their effectiveness is tested using a high-quality experimental approach. Our meta-analysis also highlights the fact that there are insufficient data to answer many of the most pressing questions that road planners ask about the effectiveness of road mitigation measures, such as whether other less common mitigation measures (e.g., measures to reduce traffic volume and/or speed) reduce road mortality, or to what extent the attributes of crossing structures and fences influence their effectiveness.
To improve evaluations of mitigation effectiveness, studies should incorporate data collection before the mitigation is applied, and we recommend a minimum study duration of four years for Before-After, and a minimum of either four years or four sites for Before-After-Control-Impact designs.
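The pooling step behind such a meta-analysis can be sketched with a fixed-effect inverse-variance average of log response ratios. The per-study effect sizes and variances below are invented for illustration and are not the paper's data.

```python
import math

def pooled_log_ratio(log_ratios, variances):
    """Fixed-effect inverse-variance pooling of per-study
    log response ratios (road-kill with / without mitigation)."""
    weights = [1.0 / v for v in variances]
    return (sum(w * y for w, y in zip(weights, log_ratios))
            / sum(weights))

# Hypothetical studies: log(road-kill with mitigation / without),
# so values below 0 mean the measure reduced road-kill.
log_ratios = [math.log(0.4), math.log(0.6),
              math.log(0.5), math.log(0.75)]
variances = [0.04, 0.09, 0.02, 0.10]

pooled = pooled_log_ratio(log_ratios, variances)
percent_reduction = (1 - math.exp(pooled)) * 100
```

More precise studies (smaller variances) get larger weights, so the pooled percent reduction is pulled toward the best-measured effects rather than a simple average.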

Journal ArticleDOI
TL;DR: It is demonstrated that higher levels of education and daily FOSC are related to larger brain volume than predicted by CA which supports the utility of regional gray matter volume as a biomarker of healthy brain aging.

Journal ArticleDOI
TL;DR: In this article, a classification of energy sources is presented in terms of their sustainability and ease of integration to district heating systems (DHS), and the recent studies of DHS are classified in accordance with the optimization objectives, including energy/exergy efficiency, cost, exergo-economic/thermo-economic and green-house gas (GHG) and pollutant production.

Journal ArticleDOI
TL;DR: A comprehensive review of the BIPV/T technology, including major developments of various building-integrated photovoltaic/thermal (BIPV/T) systems, experimental and numerical studies, and the impact of BIPV/T systems on building performance, is presented in this article.
Abstract: The concept of building-integrated photovoltaic/thermal (BIPV/T) systems emerged in the early 1990s. It has attracted increasing attention since 2000 due to its potential to facilitate the design of net-zero energy buildings through enhanced solar energy utilisation. This article presents a comprehensive review of the BIPV/T technology, including major developments of various BIPV/T systems, experimental and numerical studies, and the impact of BIPV/T system on building performance. The BIPV/T systems reviewed here are: air-based systems, water-based systems, concentrating systems and systems involving a phase change working medium such as BIPV/T with either heat pipe or heat pump evaporator. This work provides an overview of research, development, application and status of BIPV/T systems and modules. Finally, research needs and opportunities are identified.

Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the application of Multi-Criteria Decision Making (MCDM) methods to the selection of green technologies for retrofitting to existing buildings and propose an integrated green technology assessment and selection framework.

Journal ArticleDOI
TL;DR: Predictors of smoking onset for which there is robust evidence should be considered in the design of interventions to prevent first puff, in order to optimize their effectiveness; onset should be defined clearly as the transition from never use to first use.

Journal ArticleDOI
TL;DR: The extended linear matrix inequalities (LMIs) are used to reduce the conservativeness of the SFDCC results by introducing additional matrix variables to eliminate the couplings of Lyapunov matrices with the system matrices.

Journal ArticleDOI
TL;DR: In this article, a review of parameters which affect the performance of photocatalytic degradation processes by using the ZnO catalyst, and discusses methods for the modification of zinc oxide structure.

Journal ArticleDOI
TL;DR: In this paper, the effects of tree size and the spacing between trees on outdoor comfort are compared using environmental simulation, and it is demonstrated that the correlation (R2) between tree cover (SVF) and the urban heat island (UHI) effect is about 0.64 at summer midnight.