

Book ChapterDOI
08 Oct 2016
TL;DR: Markovian Generative Adversarial Networks (MGANs) are proposed, a method for training generative networks for efficient texture synthesis that surpasses previous neural texture synthesizers by a significant margin and applies to texture synthesis, style transfer, and video stylization.
Abstract: This paper proposes Markovian Generative Adversarial Networks (MGANs), a method for training generative networks for efficient texture synthesis. While deep neural network approaches have recently demonstrated remarkable results in terms of synthesis quality, they still come at considerable computational costs (minutes of run-time for low-res images). Our paper addresses this efficiency issue. Instead of the numerical deconvolution used in previous work, we precompute a feed-forward, strided convolutional network that captures the feature statistics of Markovian patches and is able to directly generate outputs of arbitrary dimensions. Such a network can directly decode brown noise to realistic texture, or photos to artistic paintings. With adversarial training, we obtain quality comparable to recent neural texture synthesis methods. As no optimization is required at generation time, our run-time performance (0.25 M pixel images at 25 Hz) surpasses previous neural texture synthesizers by a significant margin (at least 500 times faster). We apply this idea to texture synthesis, style transfer, and video stylization.

1,403 citations


Journal ArticleDOI
Sergey Levine, Peter Pastor, Alex Krizhevsky, Julian Ibarz, Deirdre Quillen
TL;DR: The approach achieves effective real-time control, can successfully grasp novel objects, and corrects mistakes by continuous servoing, and illustrates that data from different robots can be combined to learn more reliable and effective grasping.
Abstract: We describe a learning-based approach to hand-eye coordination for robotic grasping from monocular images. To learn hand-eye coordination for grasping, we trained a large convolutional neural netwo...

1,402 citations


Journal ArticleDOI
TL;DR: The latest guidelines for the treatment of HCC recommend evidence-based management and are considered suitable for universal use in the Asia–Pacific region, which has a diversity of medical environments.
Abstract: There is great geographical variation in the distribution of hepatocellular carcinoma (HCC), with the majority of all cases worldwide found in the Asia–Pacific region, where HCC is one of the leading public health problems. Since the “Toward Revision of the Asian Pacific Association for the Study of the Liver (APASL) HCC Guidelines” meeting held at the 25th annual conference of the APASL in Tokyo, the newest guidelines for the treatment of HCC published by the APASL have been discussed. These latest guidelines recommend evidence-based management of HCC and are considered suitable for universal use in the Asia–Pacific region, which has a diversity of medical environments.

1,402 citations


Proceedings ArticleDOI
01 Jun 2019
TL;DR: SKNet as discussed by the authors proposes a dynamic selection mechanism in CNNs that allows each neuron to adaptively adjust its receptive field size based on multiple scales of input information, which can capture target objects with different scales.
Abstract: In standard Convolutional Neural Networks (CNNs), the receptive fields of artificial neurons in each layer are designed to share the same size. It is well-known in the neuroscience community that the receptive field sizes of visual cortical neurons are modulated by the stimulus, which has been rarely considered in constructing CNNs. We propose a dynamic selection mechanism in CNNs that allows each neuron to adaptively adjust its receptive field size based on multiple scales of input information. A building block called Selective Kernel (SK) unit is designed, in which multiple branches with different kernel sizes are fused using softmax attention that is guided by the information in these branches. Different attentions on these branches yield different sizes of the effective receptive fields of neurons in the fusion layer. Multiple SK units are stacked into a deep network termed Selective Kernel Networks (SKNets). On the ImageNet and CIFAR benchmarks, we empirically show that SKNet outperforms the existing state-of-the-art architectures with lower model complexity. Detailed analyses show that the neurons in SKNet can capture target objects with different scales, which verifies the capability of neurons for adaptively adjusting their receptive field sizes according to the input. The code and models are available at https://github.com/implus/SKNet.
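The fusion step described in the abstract can be illustrated with a minimal sketch: two branches (e.g. from 3×3 and 5×5 kernels) are summed, pooled to a channel descriptor, and per-channel softmax attention decides how much each branch contributes. The shapes and the tiny linear "attention" projection here are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def sk_fuse(branch_a, branch_b, w_attn):
    """Toy Selective-Kernel-style fusion of two (C, H, W) feature maps."""
    fused = branch_a + branch_b                 # element-wise sum of branches
    s = fused.mean(axis=(1, 2))                 # global average pool -> (C,)
    z = w_attn @ s                              # compact channel descriptor
    logits = np.stack([z, -z])                  # one logit per branch, per channel
    attn = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)  # softmax over branches
    a = attn[0][:, None, None]                  # per-channel weight for branch A
    b = attn[1][:, None, None]                  # per-channel weight for branch B
    return a * branch_a + b * branch_b          # attention-weighted mixture

rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
xa = rng.normal(size=(C, H, W))                 # stand-in for a 3x3-kernel branch
xb = rng.normal(size=(C, H, W))                 # stand-in for a 5x5-kernel branch
w = rng.normal(size=(C, C))                     # hypothetical attention weights
out = sk_fuse(xa, xb, w)
print(out.shape)  # (4, 8, 8)
```

Because the softmax weights sum to one per channel, the output is a convex combination of the two branches, which is the core idea behind the adaptive receptive field.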

1,401 citations


Journal ArticleDOI
TL;DR: Rigour, or the integrity with which a study is conducted, is outlined, and concepts such as reliability, validity and generalisability, typically associated with quantitative research, are compared with alternative terminology in relation to their application to qualitative research.
Abstract: Evaluating the quality of research is essential if findings are to be utilised in practice and incorporated into care delivery. In a previous article we explored ‘bias’ across research designs and outlined strategies to minimise bias.1 The aim of this article is to further outline rigour, or the integrity in which a study is conducted, and ensure the credibility of findings in relation to qualitative research. Concepts such as reliability, validity and generalisability typically associated with quantitative research and alternative terminology will be compared in relation to their application to qualitative research. In addition, some of the strategies adopted by qualitative researchers to enhance the credibility of their research are outlined. Assessing the reliability of study findings requires researchers and health professionals to make judgements about the ‘soundness’ of the research in relation to …

1,401 citations


Journal ArticleDOI
TL;DR: Modules for Experiments in Stellar Astrophysics (MESA) as discussed by the authors can now simultaneously evolve an interacting pair of differentially rotating stars undergoing transfer and loss of mass and angular momentum, greatly enhancing the prior ability to model binary evolution.
Abstract: We substantially update the capabilities of the open-source software instrument Modules for Experiments in Stellar Astrophysics (MESA). MESA can now simultaneously evolve an interacting pair of differentially rotating stars undergoing transfer and loss of mass and angular momentum, greatly enhancing the prior ability to model binary evolution. New MESA capabilities in fully coupled calculation of nuclear networks with hundreds of isotopes now allow MESA to accurately simulate advanced burning stages needed to construct supernova progenitor models. Implicit hydrodynamics with shocks can now be treated with MESA, enabling modeling of the entire massive star lifecycle, from pre-main sequence evolution to the onset of core collapse and nucleosynthesis from the resulting explosion. Coupling of the GYRE non-adiabatic pulsation instrument with MESA allows for new explorations of the instability strips for massive stars while also accelerating the astrophysical use of asteroseismology data. We improve treatment of mass accretion, giving more accurate and robust near-surface profiles. A new MESA capability to calculate weak reaction rates "on-the-fly" from input nuclear data allows better simulation of accretion induced collapse of massive white dwarfs and the fate of some massive stars. We discuss the ongoing challenge of chemical diffusion in the strongly coupled plasma regime, and exhibit improvements in MESA that now allow for the simulation of radiative levitation of heavy elements in hot stars. We close by noting that the MESA software infrastructure provides bit-for-bit consistency for all results across all the supported platforms, a profound enabling capability for accelerating MESA's development.

1,401 citations


Journal ArticleDOI
19 Mar 2019
TL;DR: The authors explored the role of social presence in online learning environments and its relationship to students' perceptions of learning and satisfaction with the instructor, and found that students with high overall perceptions of social presence also scored high in terms of perceived learning and perceived satisfaction with the instructor.
Abstract: Research has demonstrated that social presence not only affects outcomes but also student, and possibly instructor, satisfaction with a course [1]. Teacher immediacy behaviors and the presence of others are especially important issues for those involved in delivering online education. This study explored the role of social presence in online learning environments and its relationship to students’ perceptions of learning and satisfaction with the instructor. The participants for this study were students who completed Empire State College’s (ESC) online learning courses in the spring of 2000 and completed the end of semester course survey (n=97). A correlational design was utilized. This study found that students with high overall perceptions of social presence also scored high in terms of perceived learning and perceived satisfaction with the instructor. Students’ perceptions of social presence overall, moreover, contributed significantly to the predictor equation for students’ perceived learning overall. Gender accounted for some of the variability of students’ overall perception of social presence, while age and number of college credits earned did not account for any of the variability.

1,399 citations


Journal ArticleDOI
TL;DR: This work provides a comprehensive overview of fundamental principles that underpin blockchain technologies, such as system architectures and distributed consensus algorithms, and discusses opportunities, potential challenges and limitations for a number of use cases, ranging from emerging peer-to-peer energy trading and Internet of Things applications, to decentralised marketplaces, electric vehicle charging and e-mobility.
Abstract: Blockchains or distributed ledgers are an emerging technology that has drawn considerable interest from energy supply firms, startups, technology developers, financial institutions, national governments and the academic community. Numerous sources coming from these backgrounds identify blockchains as having the potential to bring significant benefits and innovation. Blockchains promise transparent, tamper-proof and secure systems that can enable novel business solutions, especially when combined with smart contracts. This work provides a comprehensive overview of fundamental principles that underpin blockchain technologies, such as system architectures and distributed consensus algorithms. Next, we focus on blockchain solutions for the energy industry and inform the state-of-the-art by thoroughly reviewing the literature and current business cases. To our knowledge, this is one of the first academic, peer-reviewed works to provide a systematic review of blockchain activities and initiatives in the energy sector. Our study reviews 140 blockchain research projects and startups from which we construct a map of the potential and relevance of blockchains for energy applications. These initiatives were systematically classified into different groups according to the field of activity, implementation platform and consensus strategy used. Opportunities, potential challenges and limitations for a number of use cases are discussed, ranging from emerging peer-to-peer (P2P) energy trading and Internet of Things (IoT) applications, to decentralised marketplaces, electric vehicle charging and e-mobility. For each of these use cases, our contribution is twofold: first, in identifying the technical challenges that blockchain technology can solve for that application as well as its potential drawbacks, and second, in briefly presenting the research and industrial projects and startups that are currently applying blockchain technology to that area.
The paper ends with a discussion of challenges and market barriers the technology needs to overcome to get past the hype phase, prove its commercial viability and finally be adopted in the mainstream.

1,399 citations


Journal ArticleDOI
TL;DR: Results add further challenge to the assumption that carotenoids are directly involved in supporting physiological function in vertebrate animals, and suggest that they may play little to no direct role in key physiological processes in birds.
Abstract: Dietary carotenoids have been proposed to boost immune system and antioxidant functions in vertebrate animals, but studies aimed at testing these physiological functions of carotenoids have often failed to find support. Here we subject yellow canaries (Serinus canaria), which possess high levels of carotenoids in their tissue, and white recessive canaries, which possess a knockdown mutation that results in very low levels of tissue carotenoids, to oxidative and pathogen challenges. Across diverse measures of physiological performance, we detect no differences between carotenoid-rich yellow and carotenoid-deficient white canaries. These results add further challenge to the assumption that carotenoids are directly involved in supporting physiological function in vertebrate animals. While some dietary carotenoids provide indirect benefits as retinoid precursors, our observations suggest that carotenoids themselves may play little to no direct role in key physiological processes in birds.

1,398 citations


Journal ArticleDOI
TL;DR: InteractiVenn is a flexible tool for interacting with Venn diagrams of up to six sets; it offers a clean interface for Venn diagram construction and enables analysis of set unions while preserving the shape of the diagram.
Abstract: Set comparisons permeate a large number of data analysis workflows, in particular workflows in biological sciences. Venn diagrams are frequently employed for such analysis but current tools are limited. We have developed InteractiVenn, a more flexible tool for interacting with Venn diagrams including up to six sets. It offers a clean interface for Venn diagram construction and enables analysis of set unions while preserving the shape of the diagram. Set unions are useful to reveal differences and similarities among sets and may be guided in our tool by a tree or by a list of set unions. The tool also allows obtaining subsets’ elements, saving and loading sets for further analyses, and exporting the diagram in vector and image formats. InteractiVenn has been used to analyze two biological datasets, but it may serve set analysis in a broad range of domains. InteractiVenn allows set unions in Venn diagrams to be explored thoroughly, consequently extending the ability to analyze combinations of sets with additional observations, yielded by novel interactions between joined sets. InteractiVenn is freely available online at: www.interactivenn.net.
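The kind of set comparison a Venn diagram visualizes is easy to sketch programmatically: exclusive regions and pairwise overlaps over named sets. The sets and names below are hypothetical examples, not InteractiVenn's API.

```python
from itertools import combinations

def region_counts(sets):
    """Count elements exclusive to each set and shared by each pair of sets."""
    names = list(sets)
    counts = {}
    for name in names:
        # everything appearing in any *other* set
        others = set().union(*(sets[n] for n in names if n != name))
        counts[name] = len(sets[name] - others)       # exclusive region size
    for a, b in combinations(names, 2):
        counts[(a, b)] = len(sets[a] & sets[b])       # pairwise overlap size
    return counts

# Hypothetical gene lists from three tissues.
genes = {
    "liver":  {"A", "B", "C"},
    "kidney": {"B", "C", "D"},
    "heart":  {"C", "E"},
}
print(region_counts(genes))
```

For three sets this yields the exclusive and pairwise regions of a classic three-circle diagram; a full diagram also needs the higher-order intersections, which tools like InteractiVenn compute for up to six sets.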

Proceedings ArticleDOI
18 Jun 2018
TL;DR: Yu et al. as discussed by the authors proposed a new deep generative model-based approach which can not only synthesize novel image structures but also explicitly utilize surrounding image features as references during network training to make better predictions.
Abstract: Recent deep learning based approaches have shown promising results for the challenging task of inpainting large missing regions in an image. These methods can generate visually plausible image structures and textures, but often create distorted structures or blurry textures inconsistent with surrounding areas. This is mainly due to ineffectiveness of convolutional neural networks in explicitly borrowing or copying information from distant spatial locations. On the other hand, traditional texture and patch synthesis approaches are particularly suitable when it needs to borrow textures from the surrounding regions. Motivated by these observations, we propose a new deep generative model-based approach which can not only synthesize novel image structures but also explicitly utilize surrounding image features as references during network training to make better predictions. The model is a feedforward, fully convolutional neural network which can process images with multiple holes at arbitrary locations and with variable sizes during the test time. Experiments on multiple datasets including faces (CelebA, CelebA-HQ), textures (DTD) and natural images (ImageNet, Places2) demonstrate that our proposed approach generates higher-quality inpainting results than existing ones. Code, demo and models are available at: https://github.com/JiahuiYu/generative_inpainting.

Journal ArticleDOI
TL;DR: This guidance provides a data-supported approach to risk stratification, diagnosis, and management of patients with cirrhosis and portal hypertension (PH), varices, and variceal hemorrhage (VH), and statements are based on the following.

Journal ArticleDOI
19 Oct 2018-Science
TL;DR: What it will take to achieve this so-called quantum internet is reviewed and different stages of development that each correspond to increasingly powerful applications are defined, including a full-blown quantum internet with functional quantum computers as nodes connected through quantum communication channels.
Abstract: The internet-a vast network that enables simultaneous long-range classical communication-has had a revolutionary impact on our world. The vision of a quantum internet is to fundamentally enhance internet technology by enabling quantum communication between any two points on Earth. Such a quantum internet may operate in parallel to the internet that we have today and connect quantum processors in order to achieve capabilities that are provably impossible by using only classical means. Here, we propose stages of development toward a full-blown quantum internet and highlight experimental and theoretical progress needed to attain them.

Book
06 Jun 2017
TL;DR: Each component of emotional intelligence is discussed, with examples showing how to recognize it in potential leaders, how and why it leads to measurable business results, and how it can be learned.
Abstract: Superb leaders have very different ways of directing a team, a division, or a company. Some are subdued and analytical; others are charismatic and go with their gut. And different situations call for different types of leadership. Most mergers need a sensitive negotiator at the helm, whereas many turnarounds require a more forceful kind of authority. Psychologist and noted author Daniel Goleman has found, however, that effective leaders are alike in one crucial way: they all have a high degree of what has come to be known as emotional intelligence. In fact, Goleman's research at nearly 200 large, global companies revealed that emotional intelligence--especially at the highest levels of a company--is the sine qua non for leadership. Without it, a person can have first-class training, an incisive mind, and an endless supply of good ideas, but he still won't make a great leader. The components of emotional intelligence--self-awareness, self-regulation, motivation, empathy, and social skill--can sound unbusinesslike. But exhibiting emotional intelligence at the workplace does not mean simply controlling your anger or getting along with people. Rather, it means understanding your own and other people's emotional makeup well enough to move people in the direction of accomplishing your company's goals. In this article, the author discusses each component of emotional intelligence and shows through examples how to recognize it in potential leaders, how and why it leads to measurable business results, and how it can be learned. It takes time and, most of all, commitment. But the benefits that come from having a well-developed emotional intelligence, both for the individual and the organization, make it worth the effort.


Journal ArticleDOI
TL;DR: In this article, a comprehensive tutorial on the potential benefits and applications of UAVs in wireless communications is presented, and the important challenges and the fundamental tradeoffs in UAV-enabled wireless networks are thoroughly investigated.
Abstract: The use of flying platforms such as unmanned aerial vehicles (UAVs), popularly known as drones, is rapidly growing. In particular, with their inherent attributes such as mobility, flexibility, and adaptive altitude, UAVs admit several key potential applications in wireless systems. On the one hand, UAVs can be used as aerial base stations to enhance coverage, capacity, reliability, and energy efficiency of wireless networks. On the other hand, UAVs can operate as flying mobile terminals within a cellular network. Such cellular-connected UAVs can enable several applications ranging from real-time video streaming to item delivery. In this paper, a comprehensive tutorial on the potential benefits and applications of UAVs in wireless communications is presented. Moreover, the important challenges and the fundamental tradeoffs in UAV-enabled wireless networks are thoroughly investigated. In particular, the key UAV challenges such as 3D deployment, performance analysis, channel modeling, and energy efficiency are explored along with representative results. Then, open problems and potential research directions pertaining to UAV communications are introduced. Finally, various analytical frameworks and mathematical tools, such as optimization theory, machine learning, stochastic geometry, transport theory, and game theory are described. The use of such tools for addressing unique UAV problems is also presented. In a nutshell, this tutorial provides key guidelines on how to analyze, optimize, and design UAV-based wireless communication systems.

Proceedings ArticleDOI
03 Mar 2021
TL;DR: The authors take a step back and ask: How big is too big? What are the possible risks associated with this technology and what paths are available for mitigating those risks? They provide recommendations including weighing the environmental and financial costs first, investing resources into curating and carefully documenting datasets rather than ingesting everything on the web, carrying out pre-development exercises evaluating how the planned approach fits into research and development goals and supports stakeholder values, and encouraging research directions beyond ever larger language models.
Abstract: The past 3 years of work in NLP have been characterized by the development and deployment of ever larger language models, especially for English. BERT, its variants, GPT-2/3, and others, most recently Switch-C, have pushed the boundaries of the possible both through architectural innovations and through sheer size. Using these pretrained models and the methodology of fine-tuning them for specific tasks, researchers have extended the state of the art on a wide array of tasks as measured by leaderboards on specific benchmarks for English. In this paper, we take a step back and ask: How big is too big? What are the possible risks associated with this technology and what paths are available for mitigating those risks? We provide recommendations including weighing the environmental and financial costs first, investing resources into curating and carefully documenting datasets rather than ingesting everything on the web, carrying out pre-development exercises evaluating how the planned approach fits into research and development goals and supports stakeholder values, and encouraging research directions beyond ever larger language models.

Journal ArticleDOI
TL;DR: An ambient-driven actuator is presented that takes advantage of inherent nanoscale molecular channels within a commercial perfluorosulfonic acid ionomer (PFSA) film, fabricated by simple solution processing, to realize rapid-response, self-adaptive, and exceptionally stable actuation.
Abstract: The ability to achieve simultaneous intrinsic deformation with fast response in commercially available materials that can safely contact skin continues to be an unresolved challenge for artificial actuating materials. Rather than using a microporous structure, here we show an ambient-driven actuator that takes advantage of inherent nanoscale molecular channels within a commercial perfluorosulfonic acid ionomer (PFSA) film, fabricated by simple solution processing to realize a rapid response, self-adaptive, and exceptionally stable actuation. Selective patterning of PFSA films on an inert soft substrate (polyethylene terephthalate film) facilitates the formation of a range of different geometries, including a 2D (two-dimensional) roll or 3D (three-dimensional) helical structure in response to vapor stimuli. Chemical modification of the surface allowed the development of a kirigami-inspired single-layer actuator for personal humidity and heat management through macroscale geometric design features, to afford a bilayer stimuli-responsive actuator with multicolor switching capability.

Journal ArticleDOI
TL;DR: Among patients with previously untreated advanced NSCLC with an EGFR mutation, those who received osimertinib had longer overall survival than those who received a comparator EGFR-TKI.
Abstract: Background Osimertinib is a third-generation, irreversible tyrosine kinase inhibitor of the epidermal growth factor receptor (EGFR-TKI) that selectively inhibits both EGFR-TKI–sensitizing ...

Journal ArticleDOI
TL;DR: This amended and improved digestion method (INFOGEST 2.0) avoids challenges associated with the original method, such as the inclusion of the oral phase and the use of gastric lipase.
Abstract: Developing a mechanistic understanding of the impact of food structure and composition on human health has increasingly involved simulating digestion in the upper gastrointestinal tract. These simulations have used a wide range of different conditions that often have very little physiological relevance, and this impedes the meaningful comparison of results. The standardized protocol presented here is based on an international consensus developed by the COST INFOGEST network. The method is designed to be used with standard laboratory equipment and requires limited experience to encourage a wide range of researchers to adopt it. It is a static digestion method that uses constant ratios of meal to digestive fluids and a constant pH for each step of digestion. This makes the method simple to use but not suitable for simulating digestion kinetics. Using this method, food samples are subjected to sequential oral, gastric and intestinal digestion while parameters such as electrolytes, enzymes, bile, dilution, pH and time of digestion are based on available physiological data. This amended and improved digestion method (INFOGEST 2.0) avoids challenges associated with the original method, such as the inclusion of the oral phase and the use of gastric lipase. The method can be used to assess the endpoints resulting from digestion of foods by analyzing the digestion products (e.g., peptides/amino acids, fatty acids, simple sugars) and evaluating the release of micronutrients from the food matrix. The whole protocol can be completed in ~7 d, including ~5 d required for the determination of enzyme activities.
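The static protocol described above lends itself to a schematic encoding: sequential phases, each with a fixed meal-to-fluid ratio and a constant pH. The numeric values below are illustrative approximations of the published workflow (simulated salivary, gastric and intestinal fluids), not a substitute for the protocol itself.

```python
# Schematic INFOGEST-style phase table; values are illustrative, not normative.
PHASES = [
    {"name": "oral",       "fluid": "SSF", "ratio": "1:1", "pH": 7.0, "minutes": 2},
    {"name": "gastric",    "fluid": "SGF", "ratio": "1:1", "pH": 3.0, "minutes": 120},
    {"name": "intestinal", "fluid": "SIF", "ratio": "1:1", "pH": 7.0, "minutes": 120},
]

def total_minutes(phases):
    """Total in-vitro digestion time across all sequential phases."""
    return sum(p["minutes"] for p in phases)

print(total_minutes(PHASES))  # 242
```

Because every phase uses a constant ratio and pH, the whole run reduces to a fixed parameter table, which is exactly what makes the static method simple but unsuitable for simulating digestion kinetics.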

Journal ArticleDOI
17 May 2016-Immunity
TL;DR: Co-inhibitory receptors, such as CTLA-4 and PD-1, have an important role in regulating T cell responses and have proven to be effective targets in the setting of chronic diseases where constitutive co-inhibitory receptor expression on T cells dampens effector T-cell responses.

Journal ArticleDOI
TL;DR: In this article, the authors discuss and review the development of this rapidly growing research field that encompasses the characterization, quantification, manipulation, dynamical evolution, and operational application of quantum coherence.
Abstract: The coherent superposition of states, in combination with the quantization of observables, represents one of the most fundamental features that mark the departure of quantum mechanics from the classical realm. Quantum coherence in many-body systems embodies the essence of entanglement and is an essential ingredient for a plethora of physical phenomena in quantum optics, quantum information, solid state physics, and nanoscale thermodynamics. In recent years, research on the presence and functional role of quantum coherence in biological systems has also attracted a considerable interest. Despite the fundamental importance of quantum coherence, the development of a rigorous theory of quantum coherence as a physical resource has only been initiated recently. In this Colloquium we discuss and review the development of this rapidly growing research field that encompasses the characterization, quantification, manipulation, dynamical evolution, and operational application of quantum coherence.
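The "quantification" of coherence mentioned in the abstract has concrete instances; one standard example from this literature (chosen here for illustration) is the l1-norm of coherence: the sum of the absolute values of the off-diagonal entries of a density matrix in a fixed reference basis.

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: sum of |off-diagonal| entries of a density matrix."""
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

# Maximally coherent single-qubit state |+><+| vs. an incoherent mixture,
# both written in the computational basis.
plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])
mixed = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(l1_coherence(plus), l1_coherence(mixed))  # 1.0 0.0
```

The measure vanishes exactly on diagonal (incoherent) states and grows with the off-diagonal weight, matching the resource-theoretic requirements the review discusses.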

Journal ArticleDOI
28 Apr 2015-JAMA
TL;DR: PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD, and although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis.
Abstract: Importance Systematic reviews and meta-analyses of individual participant data (IPD) aim to collect, check, and reanalyze individual-level data from all studies addressing a particular research question and are therefore considered a gold standard approach to evidence synthesis. They are likely to be used with increasing frequency as current initiatives to share clinical trial data gain momentum and may be particularly important in reviewing controversial therapeutic areas. Objective To develop PRISMA-IPD as a stand-alone extension to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement, tailored to the specific requirements of reporting systematic reviews and meta-analyses of IPD. Although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis. Design Development of PRISMA-IPD followed the EQUATOR Network framework guidance and used the existing standard PRISMA Statement as a starting point to draft additional relevant material. A web-based survey informed discussion at an international workshop that included researchers, clinicians, methodologists experienced in conducting systematic reviews and meta-analyses of IPD, and journal editors. The statement was drafted and iterative refinements were made by the project, advisory, and development groups. The PRISMA-IPD Development Group reached agreement on the PRISMA-IPD checklist and flow diagram by consensus. Findings Compared with standard PRISMA, the PRISMA-IPD checklist includes 3 new items that address (1) methods of checking the integrity of the IPD (such as pattern of randomization, data consistency, baseline imbalance, and missing data), (2) reporting any important issues that emerge, and (3) exploring variation (such as whether certain types of individual benefit more from the intervention than others). 
An additional item was created by reorganization of standard PRISMA items relating to interpreting results. Wording was modified in 23 items to reflect the IPD approach. Conclusions and Relevance PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD.

Journal ArticleDOI
TL;DR: The mechanisms of action of natural antioxidant compounds, together with antioxidant assays and their reaction mechanisms, can help in evaluating the antioxidant activity of various antioxidant compounds as well as in the development of novel antioxidants.
Abstract: The normal biochemical reactions in our body, increased exposure to the environment, and higher levels of dietary xenobiotics result in the generation of reactive oxygen species (ROS) and reactive nitrogen species (RNS). The ROS and RNS create oxidative stress in different pathophysiological conditions. The reported chemical evidence suggests that dietary antioxidants help in disease prevention. The antioxidant compounds react in one-electron reactions with free radicals in vivo/in vitro and prevent oxidative damage. Therefore, it is very important to understand the reaction mechanisms of antioxidants with the free radicals. This review elaborates on the mechanisms of action of the natural antioxidant compounds and assays for the evaluation of their antioxidant activities. The reaction mechanisms of the antioxidant assays are briefly discussed (165 references). Practical applications: Understanding the reaction mechanisms can help in evaluating the antioxidant activity of various antioxidant compounds as well as in the development of novel antioxidants.

Proceedings ArticleDOI
01 Jun 2016
TL;DR: A large-scale dataset for RGB+D human action recognition with more than 56 thousand video samples and 4 million frames, collected from 40 distinct subjects is introduced and a new recurrent neural network structure is proposed to model the long-term temporal correlation of the features for each body part, and utilize them for better action classification.
Abstract: Recent approaches in depth-based human activity analysis achieved outstanding performance and proved the effectiveness of 3D representation for classification of action classes. Currently available depth-based and RGB+D-based action recognition benchmarks have a number of limitations, including the lack of training samples, distinct class labels, camera views, and variety of subjects. In this paper we introduce a large-scale dataset for RGB+D human action recognition with more than 56 thousand video samples and 4 million frames, collected from 40 distinct subjects. Our dataset contains 60 different action classes including daily, mutual, and health-related actions. In addition, we propose a new recurrent neural network structure to model the long-term temporal correlation of the features for each body part, and utilize them for better action classification. Experimental results show the advantages of applying deep learning methods over state-of-the-art handcrafted features on the suggested cross-subject and cross-view evaluation criteria for our dataset. The introduction of this large-scale dataset will enable the community to apply, develop, and adapt various data-hungry learning techniques for the task of depth-based and RGB+D-based human activity analysis.
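The part-based recurrent modelling described above can be sketched at its simplest as: group the skeleton's joints into body parts, build a per-part feature sequence for each group, then fuse the parts before classification. The NumPy sketch below illustrates only that grouping-and-fusion idea; the joint indices, part names, and shapes are illustrative assumptions, not the paper's actual configuration or network.

```python
import numpy as np

# Illustrative grouping of 25 skeleton joints into five body parts.
# The indices here are assumptions for this sketch, not the paper's setup.
BODY_PARTS = {
    "torso":     [0, 1, 2, 3, 20],
    "left_arm":  [4, 5, 6, 7, 21, 22],
    "right_arm": [8, 9, 10, 11, 23, 24],
    "left_leg":  [12, 13, 14, 15],
    "right_leg": [16, 17, 18, 19],
}

def part_features(skeleton_seq):
    """Split a (T, 25, 3) joint sequence into per-part feature sequences.

    Each part's 3D joints are flattened per frame, mimicking the idea of
    feeding each body part to its own recurrent branch before fusion.
    """
    feats = {}
    for name, idx in BODY_PARTS.items():
        feats[name] = skeleton_seq[:, idx, :].reshape(len(skeleton_seq), -1)
    return feats

def fuse(feats):
    """Concatenate per-part features frame by frame (the fusion step)."""
    return np.concatenate([feats[k] for k in sorted(feats)], axis=1)

T = 30                               # frames in the clip
seq = np.random.randn(T, 25, 3)      # (frames, joints, xyz)
fused = fuse(part_features(seq))
print(fused.shape)                   # (30, 75): 25 joints x 3 coords per frame
```

In the paper's network the per-part sequences would each pass through recurrent units before fusion; here the concatenation stands in for that stage to keep the sketch dependency-free.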

Journal ArticleDOI
TL;DR: It is shown that it is possible to disrupt, restore, and move loops and domains using targeted mutations as small as a single base pair at CTCF sites, and it is found that the observed contact domains are inconsistent with the equilibrium state for an ordinary condensed polymer.
Abstract: We recently used in situ Hi-C to create kilobase-resolution 3D maps of mammalian genomes. Here, we combine these maps with new Hi-C, microscopy, and genome-editing experiments to study the physical structure of chromatin fibers, domains, and loops. We find that the observed contact domains are inconsistent with the equilibrium state for an ordinary condensed polymer. Combining Hi-C data and novel mathematical theorems, we show that contact domains are also not consistent with a fractal globule. Instead, we use physical simulations to study two models of genome folding. In one, intermonomer attraction during polymer condensation leads to formation of an anisotropic "tension globule." In the other, CCCTC-binding factor (CTCF) and cohesin act together to extrude unknotted loops during interphase. Both models are consistent with the observed contact domains and with the observation that contact domains tend to form inside loops. However, the extrusion model explains a far wider array of observations, such as why loops tend not to overlap and why the CTCF-binding motifs at pairs of loop anchors lie in the convergent orientation. Finally, we perform 13 genome-editing experiments examining the effect of altering CTCF-binding sites on chromatin folding. The convergent rule correctly predicts the affected loops in every case. Moreover, the extrusion model accurately predicts in silico the 3D maps resulting from each experiment using only the location of CTCF-binding sites in the WT. Thus, we show that it is possible to disrupt, restore, and move loops and domains using targeted mutations as small as a single base pair.

Journal ArticleDOI
11 Feb 2019
TL;DR: Using satellite data from 2000–2017, this study finds striking greening of both China and India, driven primarily by land-use change, with forest growth and cropland intensification more important in China, and cropland intensification more important in India.
Abstract: Satellite data show increasing leaf area of vegetation due to direct (human land-use management) and indirect factors (climate change, CO2 fertilization, nitrogen deposition, recovery from natural disturbances, etc.). Among these, climate change and CO2 fertilization effect seem to be the dominant drivers. However, recent satellite data (2000-2017) reveal a greening pattern that is strikingly prominent in China and India, and overlapping with croplands world-wide. China alone accounts for 25% of the global net increase in leaf area with only 6.6% of global vegetated area. The greening in China is from forests (42%) and croplands (32%), but in India is mostly from croplands (82%) with minor contribution from forests (4.4%). China is engineering ambitious programs to conserve and expand forests with the goal of mitigating land degradation, air pollution and climate change. Food production in China and India has increased by over 35% since 2000 mostly due to increasing harvested area through multiple cropping facilitated by fertilizer use and surface/ground-water irrigation. Our results indicate that the direct factor is a key driver of the "Greening Earth", accounting for over a third, and likely more, of the observed net increase in green leaf area. They highlight the need for realistic representation of human land-use practices in Earth system models.
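The abstract's headline figures lend themselves to a quick back-of-envelope check. All numbers below are taken directly from the abstract; the calculation is only an illustrative consistency check, not new analysis.

```python
# China's shares, as stated in the abstract.
china_share_of_greening = 0.25   # fraction of global net leaf-area increase
china_share_of_veg_area = 0.066  # fraction of global vegetated area

# How strongly China "over-contributes" relative to its vegetated-area share:
intensity = china_share_of_greening / china_share_of_veg_area
print(round(intensity, 1))       # 3.8 -> China greens ~3.8x its area share

# Within China, forests (42%) plus croplands (32%) explain ~74% of greening.
print(round(0.42 + 0.32, 2))     # 0.74
```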

Journal ArticleDOI
TL;DR: In this article, the authors updated the American Thoracic Society/European Respiratory Society/Japanese Respiratory Society/Latin American Thoracic Association guideline on idiopathic pulmonary fibrosis treatment using systematic reviews and meta-analyses.
Abstract: Background: This document updates the American Thoracic Society/European Respiratory Society/Japanese Respiratory Society/Latin American Thoracic Association guideline on idiopathic pulmonary fibrosis treatment. Methods: Systematic reviews and, when appropriate, meta-analyses were performed to summarize all available evidence pertinent to our questions. The evidence was assessed using the GRADE (Grading of Recommendations, Assessment, Development and Evaluation) approach and then discussed by a multidisciplinary panel. Predetermined conflict-of-interest management strategies were applied, and recommendations were formulated, written, and graded exclusively by the nonconflicted panelists. Results: After considering the confidence in effect estimates, the importance of outcomes studied, desirable and undesirable consequences of treatment, cost, feasibility, acceptability of the intervention, and implications to health equity, recommendations were made for or against specific treatment interventions. Conclusion...

Journal ArticleDOI
TL;DR: This work establishes the detailed Fermi-surface topology of the recently identified WSM TaP via combined angle-resolved quantum-oscillation spectra and band-structure calculations and observes a large negative longitudinal magnetoresistance.
Abstract: Weyl semimetals (WSMs) are topological quantum states wherein the electronic bands disperse linearly around pairs of nodes with fixed chirality, the Weyl points. In WSMs, nonorthogonal electric and magnetic fields induce an exotic phenomenon known as the chiral anomaly, resulting in an unconventional negative longitudinal magnetoresistance, the chiral-magnetic effect. However, it remains an open question to which extent this effect survives when chirality is not well-defined. Here, we establish the detailed Fermi-surface topology of the recently identified WSM TaP via combined angle-resolved quantum-oscillation spectra and band-structure calculations. The Fermi surface forms banana-shaped electron and hole pockets surrounding pairs of Weyl points. Although this means that chirality is ill-defined in TaP, we observe a large negative longitudinal magnetoresistance. We show that the magnetoresistance can be affected by a magnetic field-induced inhomogeneous current distribution inside the sample.