
Posted Content
06 May 2018 - bioRxiv
TL;DR: FMRIPrep has the potential to transform fMRI research by equipping neuroscientists with a high-quality, robust, easy-to-use and transparent preprocessing workflow which can help ensure the validity of inference and the interpretability of their results.
Abstract: Preprocessing of functional MRI (fMRI) involves numerous steps to clean and standardize data before statistical analysis. Generally, researchers create ad hoc preprocessing workflows for each new dataset, building upon a large inventory of tools available for each step. The complexity of these workflows has snowballed with rapid advances in MR data acquisition and image processing techniques. We introduce fMRIPrep, an analysis-agnostic tool that addresses the challenge of robust and reproducible preprocessing for task-based and resting fMRI data. FMRIPrep automatically adapts a best-in-breed workflow to the idiosyncrasies of virtually any dataset, ensuring high-quality preprocessing with no manual intervention. By introducing visual assessment checkpoints into an iterative integration framework for software-testing, we show that fMRIPrep robustly produces high-quality results on a diverse fMRI data collection comprising participants from 54 different studies in the OpenfMRI repository. We review the distinctive features of fMRIPrep in a qualitative comparison to other preprocessing workflows. We demonstrate that fMRIPrep achieves higher spatial accuracy as it introduces less uncontrolled spatial smoothness than one commonly used preprocessing tool. FMRIPrep has the potential to transform fMRI research by equipping neuroscientists with a high-quality, robust, easy-to-use and transparent preprocessing workflow which can help ensure the validity of inference and the interpretability of their results.

684 citations


Journal Article
TL;DR: A novel algorithm, Hyperband, is introduced for hyperparameter optimization, formulated as a pure-exploration non-stochastic infinite-armed bandit problem in which a predefined resource like iterations, data samples, or features is allocated to randomly sampled configurations.
Abstract: Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While recent approaches use Bayesian optimization to adaptively select configurations, we focus on speeding up random search through adaptive resource allocation and early-stopping. We formulate hyperparameter optimization as a pure-exploration nonstochastic infinite-armed bandit problem where a predefined resource like iterations, data samples, or features is allocated to randomly sampled configurations. We introduce a novel algorithm, Hyperband, for this framework and analyze its theoretical properties, providing several desirable guarantees. Furthermore, we compare Hyperband with popular Bayesian optimization methods on a suite of hyperparameter optimization problems. We observe that Hyperband can provide over an order-of-magnitude speedup over our competitor set on a variety of deep-learning and kernel-based learning problems.
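To make the adaptive resource allocation concrete, here is a minimal sketch of the successive-halving style subroutine that Hyperband builds its brackets around; sample_config and evaluate are hypothetical user-supplied callables, and the default n, r, and eta values are illustrative rather than taken from the paper.

```python
import math
import random

def successive_halving(sample_config, evaluate, n=81, r=1, eta=3):
    """Draw n random configurations, then repeatedly keep the best 1/eta of them
    while multiplying each survivor's resource budget by eta (early stopping)."""
    configs = [sample_config() for _ in range(n)]
    for i in range(round(math.log(n, eta)) + 1):
        resource = r * eta ** i                        # budget per surviving configuration
        losses = [evaluate(c, resource) for c in configs]
        keep = max(1, len(configs) // eta)             # advance only the top 1/eta configs
        ranked = sorted(zip(losses, configs), key=lambda t: t[0])
        configs = [c for _, c in ranked[:keep]]
    return configs[0]

# Toy usage: a "configuration" is just a number, and more resource means a less noisy score.
best = successive_halving(lambda: random.random(),
                          lambda c, res: abs(c - 0.3) + random.gauss(0, 0.5 / res))
```

Hyperband itself additionally loops over several (n, r) trade-offs so that no single early-stopping aggressiveness has to be chosen in advance.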

683 citations


Proceedings Article
14 Jun 2020
TL;DR: A novel single-shot, multi-level face localisation method, named RetinaFace, which unifies face box prediction, 2D facial landmark localisation and 3D vertices regression under one common target: point regression on the image plane.
Abstract: Though tremendous strides have been made in uncontrolled face detection, accurate and efficient 2D face alignment and 3D face reconstruction in-the-wild remain an open challenge. In this paper, we present a novel single-shot, multi-level face localisation method, named RetinaFace, which unifies face box prediction, 2D facial landmark localisation and 3D vertices regression under one common target: point regression on the image plane. To fill the data gap, we manually annotated five facial landmarks on the WIDER FACE dataset and employed a semi-automatic annotation pipeline to generate 3D vertices for face images from the WIDER FACE, AFLW and FDDB datasets. Based on extra annotations, we propose a mutually beneficial regression target for 3D face reconstruction, that is predicting 3D vertices projected on the image plane constrained by a common 3D topology. The proposed 3D face reconstruction branch can be easily incorporated, without any optimisation difficulty, in parallel with the existing box and 2D landmark regression branches during joint training. Extensive experimental results show that RetinaFace can simultaneously achieve stable face detection, accurate 2D face alignment and robust 3D face reconstruction while being efficient through single-shot inference.
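The unified training target can be summarised as a multi-task objective; the sketch below is an illustration only, with assumed weighting factors lambda_i rather than the values or exact loss terms used in the paper.

```latex
% Illustrative multi-task objective for joint training (weights \lambda_i are assumed)
\mathcal{L} = \mathcal{L}_{\text{cls}}
            + \lambda_1\,\mathcal{L}_{\text{box}}
            + \lambda_2\,\mathcal{L}_{\text{2D landmarks}}
            + \lambda_3\,\mathcal{L}_{\text{3D vertices}}
```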

683 citations


Journal Article
24 Apr 2015 - Autism
TL;DR: Nearly all medical conditions were significantly more common in adults with autism, including immune conditions, gastrointestinal and sleep disorders, seizure, obesity, dyslipidemia, hypertension, and diabetes.
Abstract: Compared to the general pediatric population, children with autism have higher rates of co-occurring medical and psychiatric illnesses, yet very little is known about the general health status of adults with autism. The objective of this study was to describe the frequency of psychiatric and medical conditions among a large, diverse, insured population of adults with autism in the United States. Participants were adult members of Kaiser Permanente Northern California enrolled from 2008 to 2012. Autism spectrum disorder cases (N = 1507) were adults with autism spectrum disorder diagnoses (International Classification of Diseases-9-Clinical Modification codes 299.0, 299.8, 299.9) recorded in medical records on at least two separate occasions. Controls (N = 15,070) were adults without any autism spectrum disorder diagnoses sampled at a 10:1 ratio and frequency matched to cases on sex and age. Adults with autism had significantly increased rates of all major psychiatric disorders including depression, anxiety...

683 citations


19 Dec 2016
TL;DR: The first digital signature algorithm, devised by the author after Whitfield Diffie described the problem of signing a document, was impractical (roughly 64 bits of published key to sign a single bit); this report describes an improvement that removes the drawbacks of Rabin's later, more practical scheme.
Abstract: At a coffee house in Berkeley around 1975, Whitfield Diffie described a problem to me that he had been trying to solve: constructing a digital signature for a document. I immediately proposed a solution. Though not very practical–it required perhaps 64 bits of published key to sign a single bit–it was the first digital signature algorithm. Diffie and Hellman mention it in their classic paper: Whitfield Diffie and Martin E. Hellman. New Directions in Cryptography. IEEE Transactions on Information Theory IT-22, 6 (1976), 644-654. (I think it’s at the bottom right of page 650.) In 1978, Michael Rabin published a paper titled Digitalized Signatures containing a more practical scheme for generating digital signatures of documents. (I don’t remember what other digital signature algorithms had already been proposed.) However, his solution had some drawbacks that limited its utility. This report describes an improvement to Rabin’s algorithm that eliminates those drawbacks. I’m not sure why I never published this report. However, I think it was because, after writing it, I realized that the algorithm could be fairly easily derived directly from Rabin’s algorithm. So, I didn’t feel that it added much to what Rabin had done. However, I’ve been told that this paper is cited in the cryptography literature and is considered significant, so perhaps I was wrong.
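To make the "published key to sign a single bit" idea concrete, here is a hedged sketch of a hash-based one-time signature for a single bit in the Lamport/Diffie spirit; it illustrates the general construction only, not the exact 1975 scheme or Rabin's algorithm.

```python
import hashlib
import os

def keygen():
    # Secret key: one random preimage per possible bit value; public key: their hashes.
    secret = [os.urandom(32), os.urandom(32)]
    public = [hashlib.sha256(s).digest() for s in secret]
    return secret, public

def sign(secret, bit):
    # Signing reveals the preimage for the chosen bit; the key must be used only once.
    return secret[bit]

def verify(public, bit, signature):
    return hashlib.sha256(signature).digest() == public[bit]

secret, public = keygen()
sig = sign(secret, 1)
assert verify(public, 1, sig) and not verify(public, 0, sig)
```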

683 citations


Journal Article
TL;DR: In this article, the authors provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II.
Abstract: We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. We finally discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.
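As an illustration of the two delivery formats mentioned above, the sketch below applies the standard uncertainty prescriptions to a single observable evaluated once per set member; the function and argument names are hypothetical and this is not the PDF4LHC tooling itself.

```python
import numpy as np

def mc_replica_uncertainty(obs_replicas):
    """Monte Carlo replicas: central value = mean over replicas, uncertainty = standard deviation."""
    obs_replicas = np.asarray(obs_replicas)
    return obs_replicas.mean(), obs_replicas.std(ddof=1)

def hessian_symmetric_uncertainty(obs_central, obs_eigenvectors):
    """Symmetric Hessian eigenvectors: add the shifts from the central member in quadrature."""
    shifts = np.asarray(obs_eigenvectors) - obs_central
    return obs_central, np.sqrt(np.sum(shifts ** 2))
```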

683 citations


Journal Article
TL;DR: In this article, valley transport of sound is reported for a macroscopic triangular-lattice array of rod-like scatterers in a 2D air waveguide.
Abstract: Valleytronics — exploiting a system’s pseudospin degree of freedom — is being increasingly explored in sonic crystals. Now, valley transport of sound is reported for a macroscopic triangular-lattice array of rod-like scatterers in a 2D air waveguide.

683 citations


Journal Article
TL;DR: In this article, a 4-level conceptual framework is proposed to clarify the embedded themes in the relationship between sex/gender differences and autism, to better understand the implications of existing research, and to help design future studies.
Abstract: Objective The relationship between sex/gender differences and autism has attracted a variety of research ranging from clinical and neurobiological to etiological, stimulated by the male bias in autism prevalence. Findings are complex and do not always relate to each other in a straightforward manner. Distinct but interlinked questions on the relationship between sex/gender differences and autism remain underaddressed. To better understand the implications from existing research and to help design future studies, we propose a 4-level conceptual framework to clarify the embedded themes. Method We searched PubMed for publications before September 2014 using search terms "‘sex OR gender OR females' AND autism." A total of 1,906 articles were screened for relevance, along with publications identified via additional literature reviews, resulting in 329 articles that were reviewed. Results Level 1, "Nosological and diagnostic challenges," concerns the question, "How should autism be defined and diagnosed in males and females?" Level 2, "Sex/gender-independent and sex/gender-dependent characteristics," addresses the question, "What are the similarities and differences between males and females with autism?" Level 3, "General models of etiology: liability and threshold," asks the question, "How is the liability for developing autism linked to sex/gender?" Level 4, "Specific etiological–developmental mechanisms," focuses on the question, "What etiological–developmental mechanisms of autism are implicated by sex/gender and/or sexual/gender differentiation?" Conclusions Using this conceptual framework, findings can be more clearly summarized, and the implications of the links between findings from different levels can become clearer. Based on this 4-level framework, we suggest future research directions, methodology, and specific topics in sex/gender differences and autism.

683 citations


Journal Article
TL;DR: The goal of this survey article is to give a coherent and comprehensive review of the literature around the construction and use of Normalizing Flows for distribution learning, providing context and explanation of the models.
Abstract: Normalizing Flows are generative models which produce tractable distributions where both sampling and density evaluation can be efficient and exact. The goal of this survey article is to give a coherent and comprehensive review of the literature around the construction and use of Normalizing Flows for distribution learning. We aim to provide context and explanation of the models, review current state-of-the-art literature, and identify open questions and promising future directions.
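The exact density evaluation referred to above rests on the change-of-variables formula; in the usual notation (assumed here) for an invertible flow f mapping data x to a base variable z with density p_Z:

```latex
% Change of variables for an invertible flow z = f(x) with base density p_Z
\log p_X(x) = \log p_Z\big(f(x)\big) + \log\left|\det \frac{\partial f(x)}{\partial x}\right|
```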

683 citations


Journal Article
01 Jan 2016 - Gut
TL;DR: The strength of the associations between stool consistency and species richness, enterotypes and community composition emphasises the crucial importance of stool consistency assessment in gut metagenome-wide association studies.
Abstract: Objective The assessment of potentially confounding factors affecting colon microbiota composition is essential to the identification of robust microbiome based disease markers. Here, we investigate the link between gut microbiota variation and stool consistency using Bristol Stool Scale classification, which reflects faecal water content and activity, and is considered a proxy for intestinal colon transit time. Design Through 16S rDNA Illumina profiling of faecal samples of 53 healthy women, we evaluated associations between microbiome richness, Bacteroidetes:Firmicutes ratio, enterotypes, and genus abundance with self-reported, Bristol Stool Scale-based stool consistency. Each sample’s microbiota growth potential was calculated to test whether transit time acts as a selective force on gut bacterial growth rates. Results Stool consistency strongly correlates with all known major microbiome markers. It is negatively correlated with species richness, positively associated to the Bacteroidetes:Firmicutes ratio, and linked to Akkermansia and Methanobrevibacter abundance. Enterotypes are distinctly distributed over the BSS-scores. Based on the correlations between microbiota growth potential and stool consistency scores within both enterotypes, we hypothesise that accelerated transit contributes to colon ecosystem differentiation. While shorter transit times can be linked to increased abundance of fast growing species in Ruminococcaceae-Bacteroides samples, hinting to a washout avoidance strategy of faster replication, this trend is absent in Prevotella-enterotyped individuals. Within this enterotype adherence to host tissue therefore appears to be a more likely bacterial strategy to cope with washout. Conclusions The strength of the associations between stool consistency and species richness, enterotypes and community composition emphasises the crucial importance of stool consistency assessment in gut metagenome-wide association studies.

683 citations


Posted Content
TL;DR: This paper introduces SC2LE (StarCraft II Learning Environment), a reinforcement learning environment based on the StarCraft II game that offers a new and challenging environment for exploring deep reinforcement learning algorithms and architectures and gives initial baseline results for neural networks trained from this data to predict game outcomes and player actions.
Abstract: This paper introduces SC2LE (StarCraft II Learning Environment), a reinforcement learning environment based on the StarCraft II game. This domain poses a new grand challenge for reinforcement learning, representing a more difficult class of problems than considered in most prior work. It is a multi-agent problem with multiple players interacting; there is imperfect information due to a partially observed map; it has a large action space involving the selection and control of hundreds of units; it has a large state space that must be observed solely from raw input feature planes; and it has delayed credit assignment requiring long-term strategies over thousands of steps. We describe the observation, action, and reward specification for the StarCraft II domain and provide an open source Python-based interface for communicating with the game engine. In addition to the main game maps, we provide a suite of mini-games focusing on different elements of StarCraft II gameplay. For the main game maps, we also provide an accompanying dataset of game replay data from human expert players. We give initial baseline results for neural networks trained from this data to predict game outcomes and player actions. Finally, we present initial baseline results for canonical deep reinforcement learning agents applied to the StarCraft II domain. On the mini-games, these agents learn to achieve a level of play that is comparable to a novice player. However, when trained on the main game, these agents are unable to make significant progress. Thus, SC2LE offers a new and challenging environment for exploring deep reinforcement learning algorithms and architectures.
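The observation/action/reward specification described above fits the usual episodic agent-environment loop; the sketch below uses hypothetical make_env and random_action placeholders purely for illustration and is not the actual SC2LE Python interface.

```python
# Hypothetical episodic interaction loop; make_env and random_action are placeholder
# names used for illustration, not the real SC2LE Python interface.
def run_episode(make_env, random_action, max_steps=1000):
    env = make_env()
    obs = env.reset()                     # raw feature-plane observation, per the paper
    total_reward = 0.0
    for _ in range(max_steps):
        action = random_action(obs)       # a learned policy would be queried here
        obs, reward, done = env.step(action)
        total_reward += reward            # credit may be delayed over long horizons
        if done:
            break
    return total_reward
```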

Book
11 Jul 2018
TL;DR: Virtues of the Mind is a powerful systematic development of a virtue approach to epistemology, including a penetrating discussion of virtue in general and moral virtue in particular.
Abstract: Virtues of the Mind is a powerful systematic development of a virtue approach to epistemology. As a bonus there is a penetrating discussion of virtue in general and moral virtue in particular. The book deserves to be at the center of discussion of virtue epistemology for some time to come. But enough of this encomium. I come not to praise Linda Zagzebski, nor, indeed, to bury her, but to raise a few critical questions about positions she takes in the book.

Book Chapter
08 Oct 2016
TL;DR: This paper presents an approach for learning a visual representation from the raw spatiotemporal signals in videos using a Convolutional Neural Network, and shows that this method captures information that is temporally varying, such as human pose.
Abstract: In this paper, we present an approach for learning a visual representation from the raw spatiotemporal signals in videos. Our representation is learned without supervision from semantic labels. We formulate our method as an unsupervised sequential verification task, i.e., we determine whether a sequence of frames from a video is in the correct temporal order. With this simple task and no semantic labels, we learn a powerful visual representation using a Convolutional Neural Network (CNN). The representation contains complementary information to that learned from supervised image datasets like ImageNet. Qualitative results show that our method captures information that is temporally varying, such as human pose. When used as pre-training for action recognition, our method gives significant gains over learning without external data on benchmark datasets like UCF101 and HMDB51. To demonstrate its sensitivity to human pose, we show results for pose estimation on the FLIC and MPII datasets that are competitive, or better than approaches using significantly more supervision. Our method can be combined with supervised representations to provide an additional boost in accuracy.
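A minimal sketch of how the sequential-verification training signal can be generated from unlabeled video, assuming frames have already been extracted; the tuple length and the shuffle-based negative are illustrative simplifications of the paper's sampling scheme.

```python
import random

def make_order_verification_pair(frames, tuple_len=3):
    """Sample a temporally ordered tuple (label 1) and a shuffled tuple (label 0)
    from a list of frames; no semantic labels are required."""
    start = random.randrange(len(frames) - tuple_len + 1)
    idx = list(range(start, start + tuple_len))
    shuffled_idx = idx[:]
    while shuffled_idx == idx:                 # ensure the negative really is out of order
        random.shuffle(shuffled_idx)
    positive = [frames[i] for i in idx]
    negative = [frames[i] for i in shuffled_idx]
    return (positive, 1), (negative, 0)        # training pairs for a CNN order classifier
```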

Journal Article
18 Feb 2017 - Cancers
TL;DR: An overview of the clinical trials conducted over the last 10 years is provided, illustrating how PDT is applied in the clinic today, together with the factors that hamper the exploration of this effective therapy and what should be changed to render it a more effective and more widely available option for patients.
Abstract: Photodynamic therapy (PDT) is a clinically approved cancer therapy, based on a photochemical reaction between a light activatable molecule or photosensitizer, light, and molecular oxygen. When these three harmless components are present together, reactive oxygen species are formed. These can directly damage cells and/or vasculature, and induce inflammatory and immune responses. PDT is a two-stage procedure, which starts with photosensitizer administration followed by a locally directed light exposure, with the aim of confined tumor destruction. Since its regulatory approval, over 30 years ago, PDT has been the subject of numerous studies and has proven to be an effective form of cancer therapy. This review provides an overview of the clinical trials conducted over the last 10 years, illustrating how PDT is applied in the clinic today. Furthermore, examples from ongoing clinical trials and the most recent preclinical studies are presented, to show the directions, in which PDT is headed, in the near and distant future. Despite the clinical success reported, PDT is still currently underutilized in the clinic. We also discuss the factors that hamper the exploration of this effective therapy and what should be changed to render it a more effective and more widely available option for patients.

Book
06 Jul 2015
TL;DR: In this article, the authors argue that an individual can use marihuana for pleasure only if he learns to smoke it in a way that will produce real effects; learns to recognize the effects and connect them with drug use; and learns to enjoy the sensations he perceives.
Abstract: An individual will be able to use marihuana for pleasure only when he 1. learns to smoke it in a way that will produce real effects; 2. learns to recognize the effects and connect them with drug use; and 3. learns to enjoy the sensations he perceives. This proposition, based on an analysis of fifty interviews with marihuana users, calls into question theories which ascribe behavior to antecedent predispositions and suggests the utility of explaining behavior in terms of the emergence of motives and dispositions in the course of experience.

Book
04 Aug 2016
TL;DR: A definitional framework for NbS is proposed, including a set of general principles for any NbS intervention, and the scope of NbS is defined as an umbrella concept embracing a number of different ecosystem-based approaches.
Abstract: This report has been prepared as part of an effort by IUCN to define its position on Nature-based Solutions (NbS) and plan for future work to advance this concept and support effective implementation of NbS to enhance ecosystem services provision and address societal challenges. The report proposes a definitional framework for NbS, including a set of general principles for any NbS intervention. The report also defines the scope of NbS as an umbrella concept embracing a number of different ecosystem-based approaches.

Posted Content
TL;DR: Analysis shows that Cs-based devices are as efficient as, and more stable than, methylammonium-based ones after aging, as well as under constant illumination and under electron beam irradiation.
Abstract: Direct comparison between perovskite-structured hybrid organic-inorganic methylammonium lead bromide (MAPbBr3) and all-inorganic cesium lead bromide (CsPbBr3) allows identifying possible fundamental differences in their structural, thermal and electronic characteristics. Both materials possess a similar direct optical band-gap, but CsPbBr3 demonstrates a higher thermal stability than MAPbBr3. In order to compare device properties we fabricated solar cells, with similarly synthesized MAPbBr3 or CsPbBr3, over mesoporous titania scaffolds. Both cell types demonstrated comparable photovoltaic performances under AM1.5 illumination, reaching power conversion efficiencies of ~6% with a poly-aryl amine-based derivative as hole transport material. Further analysis shows that Cs-based devices are as efficient as, and more stable than, methylammonium-based ones after aging (storing the cells for 2 weeks in a dry (relative humidity 15-20%) air atmosphere in the dark), under constant illumination (at maximum power), and under electron beam irradiation.

Journal Article
TL;DR: It is often advantageous to transform a strongly nonlinear system into a linear one in order to simplify its analysis for prediction and control, so the authors combine dynamical systems with deep learning to identify these hard-to-find transformations.
Abstract: Identifying coordinate transformations that make strongly nonlinear dynamics approximately linear is a central challenge in modern dynamical systems. These transformations have the potential to enable prediction, estimation, and control of nonlinear systems using standard linear theory. The Koopman operator has emerged as a leading data-driven embedding, as eigenfunctions of this operator provide intrinsic coordinates that globally linearize the dynamics. However, identifying and representing these eigenfunctions has proven to be mathematically and computationally challenging. This work leverages the power of deep learning to discover representations of Koopman eigenfunctions from trajectory data of dynamical systems. Our network is parsimonious and interpretable by construction, embedding the dynamics on a low-dimensional manifold that is of the intrinsic rank of the dynamics and parameterized by the Koopman eigenfunctions. In particular, we identify nonlinear coordinates on which the dynamics are globally linear using a modified auto-encoder. We also generalize Koopman representations to include a ubiquitous class of systems that exhibit continuous spectra, ranging from the simple pendulum to nonlinear optics and broadband turbulence. Our framework parametrizes the continuous frequency using an auxiliary network, enabling a compact and efficient embedding at the intrinsic rank, while connecting our models to half a century of asymptotics. In this way, we benefit from the power and generality of deep learning, while retaining the physical interpretability of Koopman embeddings.
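A hedged sketch of the core idea: an autoencoder whose latent coordinates are advanced by a learned linear operator and trained with reconstruction, linearity, and prediction losses. Layer sizes, the equal loss weighting, and the omission of the auxiliary network for continuous spectra are all simplifying assumptions.

```python
import torch
import torch.nn as nn

class KoopmanAutoencoder(nn.Module):
    """Autoencoder with a learned linear operator K acting on the latent coordinates."""
    def __init__(self, state_dim, latent_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                     nn.Linear(64, state_dim))
        self.K = nn.Linear(latent_dim, latent_dim, bias=False)   # linear latent dynamics

    def loss(self, x_t, x_next):
        z_t = self.encoder(x_t)
        recon = nn.functional.mse_loss(self.decoder(z_t), x_t)          # reconstruct x_t
        z_pred = self.K(z_t)                                            # advance linearly in latent space
        linear = nn.functional.mse_loss(z_pred, self.encoder(x_next))   # latent dynamics stay linear
        pred = nn.functional.mse_loss(self.decoder(z_pred), x_next)     # predict the next state
        return recon + linear + pred
```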

Journal Article
TL;DR: This paper shows how gradients of expectation values of quantum measurements can be estimated using the same, or almost the same, architecture that executes the original circuit, and proposes recipes for the computation of gradients of continuous-variable circuits.
Abstract: An important application for near-term quantum computing lies in optimization tasks, with applications ranging from quantum chemistry and drug discovery to machine learning. In many settings, most prominently in so-called parametrized or variational algorithms, the objective function is a result of hybrid quantum-classical processing. To optimize the objective, it is useful to have access to exact gradients of quantum circuits with respect to gate parameters. This paper shows how gradients of expectation values of quantum measurements can be estimated using the same, or almost the same, architecture that executes the original circuit. It generalizes previous results for qubit-based platforms, and proposes recipes for the computation of gradients of continuous-variable circuits. Interestingly, in many important instances it is sufficient to run the original quantum circuit twice while shifting a single gate parameter to obtain the corresponding component of the gradient. More general cases can be solved by conditioning a single gate on an ancilla.
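The "run the circuit twice with a shifted parameter" recipe can be checked on a single-qubit example: assuming a rotation gate whose measured expectation value is cos(theta), the two shifted evaluations reproduce the analytic gradient -sin(theta).

```python
import math

def expectation(theta):
    # For a single qubit prepared in |0>, rotated by RY(theta) and measured in Z,
    # the expectation value is cos(theta).
    return math.cos(theta)

def parameter_shift_gradient(theta, shift=math.pi / 2):
    # Two runs of the same circuit with the gate parameter shifted by +/- pi/2.
    return (expectation(theta + shift) - expectation(theta - shift)) / 2

theta = 0.7
assert abs(parameter_shift_gradient(theta) - (-math.sin(theta))) < 1e-12
```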


Journal Article
TL;DR: In this paper, the equivalence of antiferromagnets and ferromagnets for effects that are an even function of the magnetic moment is exploited, via even-in-moment relativistic transport phenomena, to demonstrate room-temperature electrical switching and read-out in antiferromagnetic CuMnAs thin-film devices.
Abstract: Louis Neel pointed out in his Nobel lecture that while abundant and interesting from a theoretical viewpoint, antiferromagnets did not seem to have any applications. Indeed, the alternating directions of magnetic moments on individual atoms and the resulting zero net magnetization make antiferromagnets hard to control by tools common in ferromagnets. Remarkably, Neel in his lecture provides the key which, as we show here, allows us to control antiferromagnets by electrical means analogous to those which paved the way to the development of ferromagnetic spintronics applications. The key noted by Neel is the equivalence of antiferromagnets and ferromagnets for effects that are an even function of the magnetic moment. Based on even-in-moment relativistic transport phenomena, we demonstrate room-temperature electrical switching between two stable configurations combined with electrical read-out in antiferromagnetic CuMnAs thin film devices. Our magnetic memory is insensitive to and produces no magnetic field perturbations which illustrates the unique merits of antiferromagnets for spintronics.

Journal Article
TL;DR: The assessment of chronic obstructive pulmonary disease has been refined to separate the spirometric assessment from symptom evaluation, and the concept of de‐escalation of therapy is introduced in the treatment assessment scheme.
Abstract: This Executive Summary of the Global Strategy for the Diagnosis, Management and Prevention of COPD, Global Initiative for Chronic Obstructive Lung Disease (GOLD) 2017 Report focuses primarily on the revised and novel parts of the document. The most significant changes include: (i) the assessment of chronic obstructive pulmonary disease has been refined to separate the spirometric assessment from symptom evaluation. ABCD groups are now proposed to be derived exclusively from patient symptoms and their history of exacerbations; (ii) for each of the groups A to D, escalation strategies for pharmacological treatments are proposed; (iii) the concept of de-escalation of therapy is introduced in the treatment assessment scheme; (iv) non-pharmacological therapies are comprehensively presented and (v) the importance of co-morbid conditions in managing COPD is reviewed.

Journal Article
TL;DR: MRI-guided focused ultrasound thalamotomy reduced hand tremor in patients with essential tremor, and secondary outcome measures assessing disability and quality of life improved with active treatment as compared with the sham procedure.
Abstract: BackgroundUncontrolled pilot studies have suggested the efficacy of focused ultrasound thalamotomy with magnetic resonance imaging (MRI) guidance for the treatment of essential tremor. MethodsWe enrolled patients with moderate-to-severe essential tremor that had not responded to at least two trials of medical therapy and randomly assigned them in a 3:1 ratio to undergo unilateral focused ultrasound thalamotomy or a sham procedure. The Clinical Rating Scale for Tremor and the Quality of Life in Essential Tremor Questionnaire were administered at baseline and at 1, 3, 6, and 12 months. Tremor assessments were videotaped and rated by an independent group of neurologists who were unaware of the treatment assignments. The primary outcome was the between-group difference in the change from baseline to 3 months in hand tremor, rated on a 32-point scale (with higher scores indicating more severe tremor). After 3 months, patients in the sham-procedure group could cross over to active treatment (the open-label exte...

Journal Article
TL;DR: This paper aims to provide a history of colorectal surgery in Italy and Spain over the past 50 years, a period during which some of the techniques used have changed significantly.

Journal Article
TL;DR: The ICLabel classifier improves upon existing methods by increasing the accuracy of the computed label estimates and by enhancing computational efficiency; it outperforms or performs comparably to the previous best publicly available automated IC classification method for all measured IC categories.

Journal Article
16 Jan 2019
TL;DR: In this paper, a method for training a neural network policy in simulation and transferring it to a state-of-the-art legged system is presented, addressing the fact that reinforcement learning for legged robots has so far been largely confined to simulation, with only a few comparably simple examples deployed on real systems.
Abstract: Legged robots pose one of the greatest challenges in robotics. Dynamic and agile maneuvers of animals cannot be imitated by existing methods that are crafted by humans. A compelling alternative is reinforcement learning, which requires minimal craftsmanship and promotes the natural evolution of a control policy. However, so far, reinforcement learning research for legged robots is mainly limited to simulation, and only few and comparably simple examples have been deployed on real systems. The primary reason is that training with real robots, particularly with dynamically balancing systems, is complicated and expensive. In the present work, we introduce a method for training a neural network policy in simulation and transferring it to a state-of-the-art legged system, thereby leveraging fast, automated, and cost-effective data generation schemes. The approach is applied to the ANYmal robot, a sophisticated medium-dog-sized quadrupedal system. Using policies trained in simulation, the quadrupedal machine achieves locomotion skills that go beyond what had been achieved with prior methods: ANYmal is capable of precisely and energy-efficiently following high-level body velocity commands, running faster than before, and recovering from falling even in complex configurations.

Journal Article
TL;DR: It is indicated that ACT is more effective than treatment as usual or placebo and that ACT may be as effective in treating anxiety disorders, depression, addiction, and somatic health problems as established psychological interventions.
Abstract: Background: The current study presents the results of a meta-analysis of 39 randomized controlled trials on the efficacy of acceptance and commitment therapy (ACT), including 1,821 patients with mental disorders or somatic health problems. Methods: We searched PsycINFO, MEDLINE and the Cochrane Central Register of Controlled Trials. Information provided by the ACBS (Association of Contextual Behavioral Science) community was also included. Statistical calculations were conducted using Comprehensive Meta-Analysis software. Study quality was rated using a methodology rating form. Results: ACT outperformed control conditions (Hedges' g = 0.57) at posttreatment and follow-up assessments in completer and intent-to-treat analyses for primary outcomes. ACT was superior to waitlist (Hedges' g = 0.82), to psychological placebo (Hedges' g = 0.51) and to treatment as usual (TAU) (we defined TAU as the standard treatment as usual; Hedges' g = 0.64). ACT was also superior on secondary outcomes (Hedges' g = 0.30), life satisfaction/quality measures (Hedges' g = 0.37) and process measures (Hedges' g = 0.56) compared to control conditions. The comparison between ACT and established treatments (cognitive behavioral therapy) did not reveal any significant differences between these treatments (p = 0.140). Conclusions: Our findings indicate that ACT is more effective than treatment as usual or placebo and that ACT may be as effective in treating anxiety disorders, depression, addiction, and somatic health problems as established psychological interventions. More research that focuses on quality of life and processes of change is needed to understand the added value of ACT and its transdiagnostic nature.
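For reference, a small helper computing the effect size quoted throughout the abstract (Hedges' g, a bias-corrected standardized mean difference); the group statistics in the example are hypothetical.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) with Hedges' small-sample bias correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    correction = 1 - 3 / (4 * (n1 + n2) - 9)    # approximate correction factor J
    return d * correction

# Hypothetical treatment vs. control group statistics.
print(round(hedges_g(12.5, 4.5, 30, 10.0, 4.0, 30), 2))
```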

Journal Article
TL;DR: An assortment of candidates for nonstandard multiquark mesons, meson-gluon hybrids, and pentaquark baryons containing heavy (charm or bottom) quarks has recently been discovered, and this review compares the experimental evidence for these states with quark-model expectations and with the various "QCD inspired" phenomenological models proposed for nonstandard hadrons.
Abstract: Quantum chromodynamics (QCD), the generally accepted theory for strong interactions, describes the interactions between quarks and gluons. The strongly interacting particles that are seen in nature are hadrons, which are composites of quarks and gluons. Since QCD is a strongly coupled theory at distance scales that are characteristic of observable hadrons, there are no rigorous, first-principle methods to derive the spectrum and properties of the hadrons from the QCD Lagrangian, except for lattice QCD simulations that are not yet able to cope with all aspects of complex and short-lived states. Instead, a variety of “QCD inspired” phenomenological models have been proposed. Common features of these models are predictions for the existence of hadrons with substructures that are more complex than the standard quark-antiquark mesons and the three-quark baryons of the original quark model that provides a concise description of most of the low-mass hadrons. Recently, an assortment of candidates for nonstandard multiquark mesons, meson-gluon hybrids, and pentaquark baryons that contain heavy (charm or bottom) quarks has been discovered. Here the experimental evidence for these states is reviewed and some general comparisons of their measured properties with standard quark model expectations and predictions of various models for nonstandard hadrons are made. The conclusion is that the spectroscopy of all but the simplest hadrons is not yet understood.

Journal Article
TL;DR: The coproduction principle is used to examine the roles, relationships and aims of this interdependent work, and the principle's implications and challenges for health professional development, for service delivery system design and for understanding and measuring benefit in healthcare services.
Abstract: Efforts to ensure effective participation of patients in healthcare are called by many names—patient centredness, patient engagement, patient experience. Improvement initiatives in this domain often resemble the efforts of manufacturers to engage consumers in designing and marketing products. Services, however, are fundamentally different than products; unlike goods, services are always ‘coproduced’. Failure to recognise this unique character of a service and its implications may limit our success in partnering with patients to improve health care. We trace a partial history of the coproduction concept, present a model of healthcare service coproduction and explore its application as a design principle in three healthcare service delivery innovations. We use the principle to examine the roles, relationships and aims of this interdependent work. We explore the principle's implications and challenges for health professional development, for service delivery system design and for understanding and measuring benefit in healthcare services.

Posted Content
TL;DR: This paper performs a thorough empirical evaluation of four prominent GNN models and suggests that simpler GNN architectures are able to outperform the more sophisticated ones if the hyperparameters and the training procedure are tuned fairly for all models.
Abstract: Semi-supervised node classification in graphs is a fundamental problem in graph mining, and the recently proposed graph neural networks (GNNs) have achieved unparalleled results on this task. Due to their massive success, GNNs have attracted a lot of attention, and many novel architectures have been put forward. In this paper we show that existing evaluation strategies for GNN models have serious shortcomings. We show that using the same train/validation/test splits of the same datasets, as well as making significant changes to the training procedure (e.g. early stopping criteria) precludes a fair comparison of different architectures. We perform a thorough empirical evaluation of four prominent GNN models and show that considering different splits of the data leads to dramatically different rankings of models. Even more importantly, our findings suggest that simpler GNN architectures are able to outperform the more sophisticated ones if the hyperparameters and the training procedure are tuned fairly for all models.
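A hedged sketch of the evaluation protocol argued for above: score each model on several random train/validation/test splits of the same dataset and compare aggregated results rather than a single fixed split; the make_split and train_and_test callables are hypothetical placeholders, not part of any particular GNN library.

```python
import random
import statistics

def evaluate_over_splits(models, dataset, make_split, train_and_test, n_splits=10, seed=0):
    """Compare models on many random train/validation/test splits instead of a single fixed one.

    make_split(dataset, rng) and train_and_test(model, split) are hypothetical callables
    supplied by the user; the latter is expected to return a test accuracy."""
    rng = random.Random(seed)
    splits = [make_split(dataset, rng) for _ in range(n_splits)]
    results = {}
    for name, model in models.items():
        scores = [train_and_test(model, split) for split in splits]
        results[name] = (statistics.mean(scores), statistics.stdev(scores))
    return results   # mean and std accuracy per model; rankings can flip across splits
```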