About: National Ilan University is an education organization based in Yilan, Taiwan. It is known for its research contributions in the topics of Membrane & Compressive strength. The organization has 1881 authors who have published 2996 publications receiving 48215 citations. The organization is also known as: Guólì Yilán Dàxué & NIU.
Topics: Membrane, Compressive strength, Wireless sensor network, Microbial fuel cell, Control theory
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target, by gene knockout or RNA interference, more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, implying that not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
TL;DR: The results of this study show that the technologies of cloud and big data can be used to enhance the performance of the healthcare system so that humans can then enjoy various smart healthcare applications and services.
Abstract: Advances in information technology have driven great progress in healthcare technologies across various domains. However, these new technologies have also made healthcare data not only much bigger but also much more difficult to handle and process. Moreover, because the data are created by a variety of devices within a short time span, they arrive quickly and are stored in heterogeneous formats, which can, to a large extent, be regarded as a big data problem. To provide more convenient healthcare services and environments, this paper proposes a cyber-physical system for patient-centric healthcare applications and services, called Health-CPS, built on cloud and big data analytics technologies. This system consists of a data collection layer with a unified standard, a data management layer for distributed storage and parallel computing, and a data-oriented service layer. The results of this study show that cloud and big data technologies can be used to enhance the performance of the healthcare system so that humans can enjoy various smart healthcare applications and services.
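The three-layer decomposition described above (unified collection, distributed management, data-oriented services) can be illustrated with a minimal sketch. All class and field names here are hypothetical stand-ins; the paper's actual Health-CPS data model and APIs are not specified in this summary:

```python
from dataclasses import dataclass
from typing import Any, Dict

# Hypothetical record type standing in for the paper's "unified standard".
@dataclass
class HealthRecord:
    patient_id: str
    source: str              # originating device, e.g. a wearable sensor
    payload: Dict[str, Any]  # measurements in a normalized form

class DataCollectionLayer:
    """Collection layer: normalizes heterogeneous device data."""
    def collect(self, raw: dict) -> HealthRecord:
        meta = {"id", "device"}
        return HealthRecord(raw["id"], raw.get("device", "unknown"),
                            {k: v for k, v in raw.items() if k not in meta})

class DataManagementLayer:
    """Management layer: stands in for distributed storage and parallel compute."""
    def __init__(self):
        self.store = {}  # patient_id -> list of HealthRecord

    def save(self, rec: HealthRecord) -> None:
        self.store.setdefault(rec.patient_id, []).append(rec)

class DataServiceLayer:
    """Service layer: data-oriented services built over the managed store."""
    def __init__(self, mgmt: DataManagementLayer):
        self.mgmt = mgmt

    def latest_vitals(self, patient_id: str) -> dict:
        recs = self.mgmt.store.get(patient_id, [])
        return recs[-1].payload if recs else {}

collector, mgmt = DataCollectionLayer(), DataManagementLayer()
service = DataServiceLayer(mgmt)
mgmt.save(collector.collect({"id": "p1", "device": "wrist-sensor", "hr": 72}))
service.latest_vitals("p1")  # -> {'hr': 72}
```

The point of the layering is that each layer only depends on the one below it: services never touch raw device formats, and collection never cares where records are stored.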
TL;DR: The question that arises now is how to develop a high-performance platform to efficiently analyze big data, and how to design an appropriate mining algorithm to extract useful information from it.
Abstract: The age of big data has arrived, but traditional data analytics may not be able to handle such large quantities of data. The question that arises now is how to develop a high-performance platform to efficiently analyze big data, and how to design an appropriate mining algorithm to extract useful information from it. To discuss this issue in depth, this paper begins with a brief introduction to data analytics, followed by a discussion of big data analytics. Some important open issues and further research directions are also presented for the next steps of big data analytics.
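One classic answer to the platform question the abstract raises is the map-reduce pattern: partition the data, compute partial results independently, then merge them. A minimal single-machine word-count sketch (on a real platform the map phase would be distributed across worker nodes; the function names here are illustrative):

```python
from collections import Counter
from functools import reduce

def map_chunk(lines):
    """Map step: count words in one chunk (could run on one worker node)."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def reduce_counts(a, b):
    """Reduce step: merge partial counts from two workers."""
    return a + b

corpus = ["big data needs big platforms",
          "mining big data",
          "data analytics"]
chunks = [corpus[:2], corpus[2:]]            # partition across workers
partials = [map_chunk(ch) for ch in chunks]  # map phase (parallelizable)
totals = reduce(reduce_counts, partials)     # reduce phase
# totals["big"] == 3 and totals["data"] == 3
```

Because each chunk is processed independently, the map phase scales out horizontally, which is exactly the property a high-performance big data platform needs.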
TL;DR: This study pioneers the use of the fuzzy decision-making trial and evaluation laboratory (DEMATEL) method to find influential factors in selecting SCM suppliers, and finds that stable delivery of goods has the greatest influence on, and the strongest connection to, the other criteria.
Abstract: Supply chain management (SCM) practices have flourished since the 1990s. Enterprises realize that a large amount of direct and indirect profit can be obtained from effective and efficient SCM practices. Supplier selection has a great impact on the integration of the supply chain relationship. Effective and accurate supplier selection decisions are significant components of production and logistics management in many firms seeking to enhance their organizational performance. This study pioneers the use of the fuzzy decision-making trial and evaluation laboratory (DEMATEL) method to find influential factors in selecting SCM suppliers. The DEMATEL method evaluates supplier performance to find the key criteria for improving performance, and provides a novel approach to decision-making information in SCM supplier selection. This research designed a fuzzy DEMATEL questionnaire and sent it to seventeen professional purchasing personnel in the electronics industry. Our results find that stable delivery of goods has the greatest influence on, and the strongest connection to, the other criteria.
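The crisp DEMATEL core that underlies the fuzzy variant above is a short matrix computation: normalize the averaged direct-influence matrix, sum all indirect influence paths via a matrix inverse, then read off "prominence" (importance) and "relation" (cause vs. effect) scores. A minimal sketch with a hypothetical 3-criterion matrix (the fuzzy variant adds defuzzification of the questionnaire's triangular fuzzy numbers before this step):

```python
import numpy as np

def dematel(avg_influence):
    """DEMATEL core computation on an averaged direct-influence matrix.
    Returns (prominence, relation): prominence ranks overall importance;
    relation > 0 marks cause criteria, relation < 0 marks effect criteria."""
    A = np.asarray(avg_influence, dtype=float)
    n = A.shape[0]
    D = A / A.sum(axis=1).max()            # normalize so the power series converges
    T = D @ np.linalg.inv(np.eye(n) - D)   # total relation: D + D^2 + D^3 + ...
    r = T.sum(axis=1)                      # influence each criterion exerts
    c = T.sum(axis=0)                      # influence each criterion receives
    return r + c, r - c

# Hypothetical averaged influence matrix (0 = no influence .. 4 = very high),
# e.g. criteria such as delivery stability, price, quality.
A = [[0, 3, 2],
     [1, 0, 3],
     [2, 1, 0]]
prominence, relation = dematel(A)
```

A criterion like "stable delivery of goods" being "most influential and most connected" corresponds to it having both a high prominence (r + c) and a positive relation (r − c) in this computation.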
TL;DR: The accuracy of tree height, after removing gross errors, was better than 0.5 m in all tree height classes with the best methods investigated in this experiment, suggesting that minimum curvature-based tree detection, accompanied by point cloud-based cluster detection for suppressed trees, is a solution that deserves attention in the future.
Abstract: The objective of the “Tree Extraction” project organized by EuroSDR (European Spatial Data Research) and ISPRS (International Society of Photogrammetry and Remote Sensing) was to evaluate the quality, accuracy, and feasibility of automatic tree extraction methods, mainly based on laser scanner data. In the final report of the project, Kaartinen and Hyyppa (2008) reported a high variation in the quality of the published methods under boreal forest conditions and with varying laser point densities. This paper summarizes the findings beyond the final report after analyzing the results obtained in different tree height classes. Omission/commission statistics as well as neighborhood relations are taken into account. Additionally, four automatic tree detection and extraction techniques were added to the test. Several methods in this experiment were superior to manual processing in the dominant, co-dominant, and suppressed tree storeys. In general, as expected, the taller the tree, the better the location accuracy. The accuracy of tree height, after removing gross errors, was better than 0.5 m in all tree height classes with the best methods investigated in this experiment. For forest inventory, minimum curvature-based tree detection accompanied by point cloud-based cluster detection for suppressed trees is a solution that deserves attention in the future.
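The minimum-curvature detector favored above is more sophisticated than can be sketched here, but the baseline family it improves on, treetop detection as local maxima on a rasterized canopy height model (CHM), is easy to illustrate. The function name, grid, window size, and height threshold below are all hypothetical choices for the sketch:

```python
import numpy as np

def detect_treetops(chm, window=1, min_height=2.0):
    """Flag CHM cells as treetop candidates when they are the highest cell
    in their (2*window+1)^2 neighborhood and taller than min_height meters."""
    rows, cols = chm.shape
    tops = []
    for r in range(rows):
        for c in range(cols):
            h = chm[r, c]
            if h < min_height:
                continue  # below the threshold that separates crowns from ground
            r0, r1 = max(0, r - window), min(rows, r + window + 1)
            c0, c1 = max(0, c - window), min(cols, c + window + 1)
            if h == chm[r0:r1, c0:c1].max():
                tops.append((r, c, h))
    return tops

# Toy 5x5 CHM in meters with two crowns of different heights.
chm = np.array([[0, 0, 0, 0, 0],
                [0, 8, 0, 0, 0],
                [0, 0, 0, 0, 6],
                [0, 0, 0, 0, 0],
                [0, 0, 0, 0, 0]], dtype=float)
treetops = detect_treetops(chm)  # -> [(1, 1, 8.0), (2, 4, 6.0)]
```

This simple local-maximum approach is exactly where suppressed trees get lost (they are never the local maximum under a taller neighbor), which is why the paper pairs crown-surface detection with point cloud-based cluster detection for the suppressed storey.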
Authors:
Hong-Yuan Mark Liao
Chung Hsiung Wang
Related Institutions (5)
National Cheng Kung University: 69.7K papers, 1.4M citations
National Chiao Tung University: 52.4K papers, 956.2K citations
South China University of Technology: 69.4K papers, 1.2M citations
National Taiwan University: 130.8K papers, 3.3M citations
Nanyang Technological University: 112.8K papers, 3.2M citations