Journal ArticleDOI
TL;DR: Patients with left-sided tumors had a markedly better prognosis than those with right-sided tumors, and a significant interaction was observed between primary tumor location and treatment for OS within the RAS wt populations of both studies in multivariable models that also included sex, prior adjuvant therapy, and BRAF mutational status.
Abstract: Importance Metastatic colorectal cancer (mCRC) is heterogeneous, and primary tumors arising from different regions of the colon are clinically and molecularly distinct. Objective To examine the prognostic and predictive value of primary tumor location in patients with RAS wild-type (wt) mCRC treated with first-line fluorouracil, leucovorin, and irinotecan (FOLFIRI) plus cetuximab in the Cetuximab Combined With Irinotecan in First-line Therapy for Metastatic Colorectal Cancer (CRYSTAL) trial and FOLFIRI Plus Cetuximab Versus FOLFIRI Plus Bevacizumab as First-Line Treatment For Patients With Metastatic Colorectal Cancer (FIRE-3) trial. Design, Setting, and Participants In this retrospective analysis patients with RAS wt metastatic colorectal cancer from the CRYSTAL and FIRE-3 trials were classified as having left-sided or right-sided mCRC, defined, respectively, as patients whose tumors originated in the splenic flexure, descending colon, sigmoid colon, or rectum vs appendix, cecum, ascending colon, hepatic flexure, or transverse colon. Main Outcomes and Measures Progression-free survival (PFS), overall survival (OS), and objective response rate (ORR) were assessed according to tumor location and treatment arm. Results In the RAS wt populations of the CRYSTAL and FIRE-3 trials, patients with left-sided tumors (n = 142 and n = 157, respectively) had markedly superior PFS, OS, and ORR compared with patients with right-sided tumors (n = 33 and n = 38, respectively). Among CRYSTAL and FIRE-3 study patients with RAS wt left-sided tumors, FOLFIRI plus cetuximab significantly improved OS relative to the respective comparators (FOLFIRI and FOLFIRI plus bevacizumab); in contrast, in RAS wt patients with poor-prognosis right-sided tumors, limited efficacy benefits were observed upon the addition of cetuximab to FOLFIRI in CRYSTAL, and comparable outcomes were observed between the FOLFIRI plus cetuximab and FOLFIRI plus bevacizumab arms of FIRE-3. A significant interaction was observed between primary tumor location and treatment for OS (CRYSTAL: hazard ratio [HR], 1.95; 95% CI, 1.09-3.48 and FIRE-3: HR, 0.40; 95% CI, 0.23-0.70) within the RAS wt populations of both studies in multivariable models that also included sex, prior adjuvant therapy, and BRAF mutational status. Conclusions and Relevance In the RAS wt populations of CRYSTAL and FIRE-3, patients with left-sided tumors had a markedly better prognosis than those with right-sided tumors. First-line FOLFIRI plus cetuximab clearly benefitted patients with left-sided tumors (vs FOLFIRI or FOLFIRI plus bevacizumab, respectively), whereas patients with right-sided tumors derived limited benefit from standard treatments. Trial Registration clinicaltrials.gov Identifiers: CRYSTAL, NCT00154102, and FIRE-3, NCT00433927

555 citations


Proceedings ArticleDOI
Pavlo Molchanov, Xiaodong Yang, Shalini Gupta, Kihwan Kim, Stephen Tyree, Jan Kautz
27 Jun 2016
TL;DR: A recurrent three-dimensional convolutional neural network that performs simultaneous detection and classification of dynamic hand gestures from multi-modal data and achieves state-of-the-art performance on SKIG and ChaLearn2014 benchmarks.
Abstract: Automatic detection and classification of dynamic hand gestures in real-world systems intended for human computer interaction is challenging as: 1) there is a large diversity in how people perform gestures, making detection and classification difficult, and 2) the system must work online in order to avoid noticeable lag between performing a gesture and its classification; in fact, a negative lag (classification before the gesture is finished) is desirable, as feedback to the user can then be truly instantaneous. In this paper, we address these challenges with a recurrent three-dimensional convolutional neural network that performs simultaneous detection and classification of dynamic hand gestures from multi-modal data. We employ connectionist temporal classification to train the network to predict class labels from in-progress gestures in unsegmented input streams. In order to validate our method, we introduce a new challenging multimodal dynamic hand gesture dataset captured with depth, color and stereo-IR sensors. On this challenging dataset, our gesture recognition system achieves an accuracy of 83.8%, outperforms competing state-of-the-art algorithms, and approaches human accuracy of 88.4%. Moreover, our method achieves state-of-the-art performance on SKIG and ChaLearn2014 benchmarks.
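To make the training setup concrete, here is a minimal PyTorch sketch of the core recipe the abstract describes: 3D-convolutional features over a clip, a recurrent layer over time, and CTC loss on unsegmented label streams. The architecture, layer sizes, class count, and toy data below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's exact architecture): a 3D-CNN front end,
# a GRU over per-frame features, and CTC loss for unsegmented gesture streams.
import torch
import torch.nn as nn

class R3DCNNSketch(nn.Module):
    def __init__(self, num_classes: int, hidden: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((None, 1, 1)),   # keep time axis, pool space
        )
        self.rnn = nn.GRU(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes + 1)  # +1 for the CTC blank

    def forward(self, clips):                 # clips: (B, 3, T, H, W)
        f = self.conv(clips)                  # (B, 64, T, 1, 1)
        f = f.squeeze(-1).squeeze(-1).transpose(1, 2)   # (B, T, 64)
        h, _ = self.rnn(f)                    # (B, T, hidden)
        return self.head(h).log_softmax(-1)   # (B, T, C+1)

model = R3DCNNSketch(num_classes=25)
x = torch.randn(2, 3, 16, 64, 64)             # two 16-frame toy clips
log_probs = model(x).transpose(0, 1)          # CTC expects (T, B, C+1)
targets = torch.tensor([3, 7, 7, 1])          # concatenated label sequences
loss = nn.CTCLoss(blank=25)(log_probs, targets,
                            input_lengths=torch.tensor([16, 16]),
                            target_lengths=torch.tensor([2, 2]))
loss.backward()
```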

555 citations


Journal ArticleDOI
TL;DR: Dysfunction in the production and/or the bioavailability of NO characterizes endothelial dysfunction, which is associated with cardiovascular diseases such as hypertension and atherosclerosis.

555 citations


Posted Content
TL;DR: This work introduces PoseCNN, a new Convolutional Neural Network for 6D object pose estimation, which is highly robust to occlusions, can handle symmetric objects, and provide accurate pose estimation using only color images as input.
Abstract: Estimating the 6D pose of known objects is important for robots to interact with the real world. The problem is challenging due to the variety of objects as well as the complexity of a scene caused by clutter and occlusions between objects. In this work, we introduce PoseCNN, a new Convolutional Neural Network for 6D object pose estimation. PoseCNN estimates the 3D translation of an object by localizing its center in the image and predicting its distance from the camera. The 3D rotation of the object is estimated by regressing to a quaternion representation. We also introduce a novel loss function that enables PoseCNN to handle symmetric objects. In addition, we contribute a large scale video dataset for 6D object pose estimation named the YCB-Video dataset. Our dataset provides accurate 6D poses of 21 objects from the YCB dataset observed in 92 videos with 133,827 frames. We conduct extensive experiments on our YCB-Video dataset and the OccludedLINEMOD dataset to show that PoseCNN is highly robust to occlusions, can handle symmetric objects, and provide accurate pose estimation using only color images as input. When using depth data to further refine the poses, our approach achieves state-of-the-art results on the challenging OccludedLINEMOD dataset. Our code and dataset are available at this https URL.
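The symmetry-handling loss is the part most worth sketching. Below is a hedged re-implementation of the idea behind PoseCNN's symmetric loss: match each predicted-rotation model point to the closest ground-truth-rotation point, so all symmetric orientations score equally. Function names and the toy point set are illustrative, and details may differ from the paper's exact formulation.

```python
# Hedged sketch of a symmetry-aware pose loss in the spirit of PoseCNN:
# penalize each rotated model point's distance to the *closest* point of the
# ground-truth-rotated model, so symmetric orientations are not penalized.
import torch

def quat_to_rotmat(q: torch.Tensor) -> torch.Tensor:
    """q = (w, x, y, z), assumed normalized -> 3x3 rotation matrix."""
    w, x, y, z = q
    return torch.stack([
        torch.stack([1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)]),
        torch.stack([2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)]),
        torch.stack([2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]),
    ])

def shape_match_loss(q_pred, q_true, points):
    """points: (m, 3) model points; mean squared closest-point distance."""
    p_pred = points @ quat_to_rotmat(q_pred).T   # (m, 3)
    p_true = points @ quat_to_rotmat(q_true).T   # (m, 3)
    d = torch.cdist(p_pred, p_true)              # (m, m) pairwise distances
    return 0.5 * (d.min(dim=1).values ** 2).mean()

# Toy check: for a point set symmetric under 180-degree rotation about z,
# the predicted rotation z180 matches identity up to symmetry -> loss ~ 0.
points = torch.tensor([[1., 0., 0.], [-1., 0., 0.], [0., 0., 1.]])
q_id   = torch.tensor([1., 0., 0., 0.])          # identity
q_z180 = torch.tensor([0., 0., 0., 1.])          # 180 degrees about z
print(shape_match_loss(q_z180, q_id, points))    # ~0 for this symmetric set
```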

555 citations


Journal ArticleDOI
TL;DR: In this article, the detection, creation, manipulation and deletion of individual skyrmions in ultrathin transition metal films and in multilayers are surveyed, and their control by currents and external fields is discussed.
Abstract: Magnetic skyrmions are chiral quasiparticles that show promise for the transportation and storage of information. On a fundamental level, skyrmions are model systems for topologically protected spin textures and can be considered as the counterpart of topologically protected electronic states, emphasizing the role of topology in the classification of complex states of condensed matter. Recent impressive demonstrations of the control of individual nanometre-scale skyrmions — including their creation, detection, manipulation and deletion — have raised expectations for their use in future spintronic devices, including magnetic memories and logic gates. From a materials perspective, it is remarkable that skyrmions can be stabilized in ultrathin transition metal films, such as iron — one of the most abundant elements on earth — if in contact with materials that exhibit high spin–orbit coupling. At present, research in this field is focused on the development of transition-metal-based magnetic multilayer structures that support skyrmionic states at room temperature and allow for the precise control of skyrmions by spin-polarized currents and external fields. Magnetic skyrmions are quasiparticles that hold promise for future spintronic devices. In this Review, the detection, creation, manipulation and deletion of individual skyrmions in ultrathin films and in multilayers are surveyed, and their control by currents and external fields is discussed.

555 citations


Journal ArticleDOI
04 Nov 2016-Science
TL;DR: It is shown that an optical processing approach based on a network of coupled optical pulses in a ring fiber can be used to model and optimize large-scale Ising systems, and a coherent Ising machine outperformed simulated annealing in terms of accuracy and computation time for a 2000-node complete graph.
Abstract: The analysis and optimization of complex systems can be reduced to mathematical problems collectively known as combinatorial optimization. Many such problems can be mapped onto ground-state search problems of the Ising model, and various artificial spin systems are now emerging as promising approaches. However, physical Ising machines have suffered from limited numbers of spin-spin couplings because of implementations based on localized spins, resulting in severe scalability problems. We report a 2000-spin network with all-to-all spin-spin couplings. Using a measurement and feedback scheme, we coupled time-multiplexed degenerate optical parametric oscillators to implement maximum cut problems on arbitrary graph topologies with up to 2000 nodes. Our coherent Ising machine outperformed simulated annealing in terms of accuracy and computation time for a 2000-node complete graph.
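For reference, the classical baseline the coherent Ising machine is compared against can be written in a few lines. This is a toy simulated-annealing MAX-CUT solver over an Ising encoding (couplings J_ij = +1 on graph edges, so minimizing the Ising energy maximizes the cut); the step count and cooling schedule are arbitrary choices, not the paper's benchmark configuration.

```python
# Simulated annealing on an Ising encoding of MAX-CUT: minimize
# H = sum over edges of s_i * s_j, which maximizes the number of cut edges.
import math
import random

def sa_max_cut(edges, n, steps=20000, t0=2.0, t1=0.01):
    spins = [random.choice((-1, 1)) for _ in range(n)]
    nbrs = [[] for _ in range(n)]
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)        # geometric cooling
        i = random.randrange(n)
        # Energy change of flipping spin i: dE = -2 * s_i * sum_j s_j
        local = sum(spins[j] for j in nbrs[i])
        dE = -2 * spins[i] * local
        if dE <= 0 or random.random() < math.exp(-dE / t):
            spins[i] = -spins[i]                 # Metropolis acceptance
    cut = sum(1 for i, j in edges if spins[i] != spins[j])
    return cut, spins

# 4-cycle: the graph is bipartite, so the optimal cut crosses all 4 edges.
print(sa_max_cut([(0, 1), (1, 2), (2, 3), (3, 0)], n=4)[0])  # expect 4
```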

555 citations


Journal ArticleDOI
TL;DR: The misclassification of benign variants as pathogenic variants that were found in this study shows the need for sequencing the genomes of diverse populations, both in asymptomatic controls and the tested patient population.
Abstract: BackgroundFor more than a decade, risk stratification for hypertrophic cardiomyopathy has been enhanced by targeted genetic testing. Using sequencing results, clinicians routinely assess the risk of hypertrophic cardiomyopathy in a patient’s relatives and diagnose the condition in patients who have ambiguous clinical presentations. However, the benefits of genetic testing come with the risk that variants may be misclassified. MethodsUsing publicly accessible exome data, we identified variants that have previously been considered causal in hypertrophic cardiomyopathy and that are overrepresented in the general population. We studied these variants in diverse populations and reevaluated their initial ascertainments in the medical literature. We reviewed patient records at a leading genetic-testing laboratory for occurrences of these variants during the near-decade-long history of the laboratory. ResultsMultiple patients, all of whom were of African or unspecified ancestry, received positive reports, with va...

555 citations


Proceedings ArticleDOI
TL;DR: Zhang et al., as discussed in this paper, propose a triplet ranking loss characterizing that one image is more similar to a second image than to a third, within a deep hashing pipeline that achieves state-of-the-art performance.
Abstract: Similarity-preserving hashing is a widely-used method for nearest neighbour search in large-scale image retrieval tasks. For most existing hashing methods, an image is first encoded as a vector of hand-engineering visual features, followed by another separate projection or quantization step that generates binary codes. However, such visual feature vectors may not be optimally compatible with the coding process, thus producing sub-optimal hashing codes. In this paper, we propose a deep architecture for supervised hashing, in which images are mapped into binary codes via carefully designed deep neural networks. The pipeline of the proposed deep architecture consists of three building blocks: 1) a sub-network with a stack of convolution layers to produce the effective intermediate image features; 2) a divide-and-encode module to divide the intermediate image features into multiple branches, each encoded into one hash bit; and 3) a triplet ranking loss designed to characterize that one image is more similar to the second image than to the third one. Extensive evaluations on several benchmark image datasets show that the proposed simultaneous feature learning and hash coding pipeline brings substantial improvements over other state-of-the-art supervised or unsupervised hashing methods.
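The third building block is easy to state precisely. Below is a hedged sketch of a triplet ranking loss of this kind: the anchor's code is pulled toward the similar image's code and pushed from the dissimilar one's by a margin. The margin, code length, and relaxed real-valued codes are illustrative assumptions rather than the paper's exact formulation, which adds a binarization step.

```python
# Triplet ranking loss over (relaxed) hash codes: require the anchor-similar
# distance to beat the anchor-dissimilar distance by at least a margin.
import torch
import torch.nn.functional as F

def triplet_ranking_loss(h_anchor, h_similar, h_dissimilar, margin=1.0):
    d_pos = (h_anchor - h_similar).pow(2).sum(dim=1)
    d_neg = (h_anchor - h_dissimilar).pow(2).sum(dim=1)
    return F.relu(d_pos - d_neg + margin).mean()

# Toy usage with 48-bit relaxed (real-valued) codes from some encoder.
h_a, h_p, h_n = (torch.randn(8, 48, requires_grad=True) for _ in range(3))
loss = triplet_ranking_loss(torch.sigmoid(h_a), torch.sigmoid(h_p),
                            torch.sigmoid(h_n))
loss.backward()
```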

555 citations


Journal ArticleDOI
24 Jul 2015-Science
TL;DR: In this paper, the authors demonstrate the coherent coupling between a single-magnon excitation in a millimeter-sized ferromagnetic sphere and a superconducting qubit.
Abstract: Rigidity of an ordered phase in condensed matter results in collective excitation modes spatially extending to macroscopic dimensions. A magnon is a quantum of such collective excitation modes in ordered spin systems. Here, we demonstrate the coherent coupling between a single-magnon excitation in a millimeter-sized ferromagnetic sphere and a superconducting qubit, with the interaction mediated by the virtual photon excitation in a microwave cavity. We obtain the coupling strength far exceeding the damping rates, thus bringing the hybrid system into the strong coupling regime. Furthermore, we use a parametric drive to realize a tunable magnon-qubit coupling scheme. Our approach provides a versatile tool for quantum control and measurement of the magnon excitations and may lead to advances in quantum information processing.

555 citations


Journal ArticleDOI
TL;DR: In patients with anterior STEMI referred for primary PCI, intravenous cyclosporine did not result in better clinical outcomes than placebo and did not prevent adverse left ventricular remodeling at 1 year.
Abstract: BACKGROUND: Experimental and clinical evidence suggests that cyclosporine may attenuate reperfusion injury and reduce myocardial infarct size. We aimed to test whether cyclosporine would improve clinical outcomes and prevent adverse left ventricular remodeling. METHODS: In a multicenter, double-blind, randomized trial, we assigned 970 patients with an acute anterior ST-segment elevation myocardial infarction (STEMI) who were undergoing percutaneous coronary intervention (PCI) within 12 hours after symptom onset and who had complete occlusion of the culprit coronary artery to receive a bolus injection of cyclosporine (administered intravenously at a dose of 2.5 mg per kilogram of body weight) or matching placebo before coronary recanalization. The primary outcome was a composite of death from any cause, worsening of heart failure during the initial hospitalization, rehospitalization for heart failure, or adverse left ventricular remodeling at 1 year. Adverse left ventricular remodeling was defined as an increase of 15% or more in the left ventricular end-diastolic volume. RESULTS: A total of 395 patients in the cyclosporine group and 396 in the placebo group received the assigned study drug and had data that could be evaluated for the primary outcome at 1 year. The rate of the primary outcome was 59.0% in the cyclosporine group and 58.1% in the control group (odds ratio, 1.04; 95% confidence interval [CI], 0.78 to 1.39; P=0.77). Cyclosporine did not reduce the incidence of the separate clinical components of the primary outcome or other events, including recurrent infarction, unstable angina, and stroke. No significant difference in the safety profile was observed between the two treatment groups. CONCLUSIONS: In patients with anterior STEMI who had been referred for primary PCI, intravenous cyclosporine did not result in better clinical outcomes than those with placebo and did not prevent adverse left ventricular remodeling at 1 year. (Funded by the French Ministry of Health and NeuroVive Pharmaceutical; CIRCUS ClinicalTrials.gov number, NCT01502774; EudraCT number, 2009-013713-99.)

555 citations


Journal ArticleDOI
TL;DR: By systematic measurements on individual droplets, it is demonstrated quantitatively that quantum fluctuations mechanically stabilize them against the mean-field collapse, and the interference of several droplets indicates that this stable many-body state is phase coherent.
Abstract: The collapse of a trapped ultracold magnetic gas is arrested by quantum fluctuations, creating quantum droplets of superfluid atoms.

Book ChapterDOI
10 Sep 2017
TL;DR: Wang et al., as discussed in this paper, train a fully convolutional network (FCN) to generate CT from a given MR image and apply an Auto-Context Model (ACM) to implement a context-aware generative adversarial network.
Abstract: Computed tomography (CT) is critical for various clinical applications, e.g., radiation treatment planning and also PET attenuation correction in MRI/PET scanner. However, CT exposes radiation during acquisition, which may cause side effects to patients. Compared to CT, magnetic resonance imaging (MRI) is much safer and does not involve radiations. Therefore, recently researchers are greatly motivated to estimate CT image from its corresponding MR image of the same subject for the case of radiation planning. In this paper, we propose a data-driven approach to address this challenging problem. Specifically, we train a fully convolutional network (FCN) to generate CT given the MR image. To better model the nonlinear mapping from MRI to CT and produce more realistic images, we propose to use the adversarial training strategy to train the FCN. Moreover, we propose an image-gradient-difference based loss function to alleviate the blurriness of the generated CT. We further apply Auto-Context Model (ACM) to implement a context-aware generative adversarial network. Experimental results show that our method is accurate and robust for predicting CT images from MR images, and also outperforms three state-of-the-art methods under comparison.
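The image-gradient-difference loss is simple to sketch. The version below penalizes mismatch between the spatial gradient maps of generated and real CT, which is the stated anti-blurring intent; the exact weighting and gradient definition in the paper may differ, so treat this as an assumption-laden illustration.

```python
# Image-gradient-difference loss: compare |dx| and |dy| maps of generated and
# real CT so the generator is penalized for producing overly smooth output.
import torch

def gradient_difference_loss(fake, real):
    """fake, real: (B, 1, H, W) images."""
    def grads(img):
        dx = img[..., :, 1:] - img[..., :, :-1]   # horizontal differences
        dy = img[..., 1:, :] - img[..., :-1, :]   # vertical differences
        return dx.abs(), dy.abs()
    fdx, fdy = grads(fake)
    rdx, rdy = grads(real)
    return ((fdx - rdx) ** 2).mean() + ((fdy - rdy) ** 2).mean()

fake, real = torch.rand(2, 1, 64, 64), torch.rand(2, 1, 64, 64)
print(gradient_difference_loss(fake, real))
```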

Journal ArticleDOI
16 Jun 2015-JAMA
TL;DR: Although antibiotic treatment did not meet the prespecified noninferiority criterion relative to appendectomy, most patients randomized to antibiotic treatment for uncomplicated appendicitis did not require appendectomy during the 1-year follow-up period, and those who did require appendectomy did not experience significant complications.
Abstract: Importance An increasing amount of evidence supports the use of antibiotics instead of surgery for treating patients with uncomplicated acute appendicitis. Objective To compare antibiotic therapy with appendectomy in the treatment of uncomplicated acute appendicitis confirmed by computed tomography (CT). Design, Setting, and Participants The Appendicitis Acuta (APPAC) multicenter, open-label, noninferiority randomized clinical trial was conducted from November 2009 until June 2012 in Finland. The trial enrolled 530 patients aged 18 to 60 years with uncomplicated acute appendicitis confirmed by a CT scan. Patients were randomly assigned to early appendectomy or antibiotic treatment with a 1-year follow-up period. Interventions Patients randomized to antibiotic therapy received intravenous ertapenem (1 g/d) for 3 days followed by 7 days of oral levofloxacin (500 mg once daily) and metronidazole (500 mg 3 times per day). Patients randomized to the surgical treatment group were assigned to undergo standard open appendectomy. Main Outcomes and Measures The primary end point for the surgical intervention was the successful completion of an appendectomy. The primary end point for antibiotic-treated patients was discharge from the hospital without the need for surgery and no recurrent appendicitis during a 1-year follow-up period. Results There were 273 patients in the surgical group and 257 in the antibiotic group. Of 273 patients in the surgical group, all but 1 underwent successful appendectomy, resulting in a success rate of 99.6% (95% CI, 98.0% to 100.0%). In the antibiotic group, 70 patients (27.3%; 95% CI, 22.0% to 33.2%) underwent appendectomy within 1 year of initial presentation for appendicitis. Of the 256 patients available for follow-up in the antibiotic group, 186 (72.7%; 95% CI, 66.8% to 78.0%) did not require surgery. The intention-to-treat analysis yielded a difference in treatment efficacy between groups of −27.0% (95% CI, −31.6% to ∞) (P = .89). Given the prespecified noninferiority margin of 24%, we were unable to demonstrate noninferiority of antibiotic treatment relative to surgery. Of the 70 patients randomized to antibiotic treatment who subsequently underwent appendectomy, 58 (82.9%; 95% CI, 72.0% to 90.8%) had uncomplicated appendicitis, 7 (10.0%; 95% CI, 4.1% to 19.5%) had complicated acute appendicitis, and 5 (7.1%; 95% CI, 2.4% to 15.9%) did not have appendicitis but received appendectomy for suspected recurrence. There were no intra-abdominal abscesses or other major complications associated with delayed appendectomy in patients randomized to antibiotic treatment. Conclusions and Relevance Among patients with CT-proven, uncomplicated appendicitis, antibiotic treatment did not meet the prespecified criterion for noninferiority compared with appendectomy. Most patients randomized to antibiotic treatment for uncomplicated appendicitis did not require appendectomy during the 1-year follow-up period, and those who required appendectomy did not experience significant complications. Trial Registration clinicaltrials.gov Identifier: NCT01022567
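As a worked check of the primary comparison, the numbers reported above can be plugged in directly: the observed difference in treatment efficacy is about −27.0%, which falls below the prespecified −24% noninferiority margin, so noninferiority could not be demonstrated.

```python
# Worked check using the success counts reported in the abstract.
surgery_success = 272 / 273          # 99.6% (all but 1 of 273)
antibiotic_success = 186 / 256       # 72.7% (no surgery within 1 year)
difference = antibiotic_success - surgery_success
margin = -0.24                       # prespecified noninferiority margin

print(f"difference = {difference:+.1%}")   # about -27.0%
print("noninferior" if difference > margin else "noninferiority not shown")
```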

Journal ArticleDOI
TL;DR: An overview of the technologies used to implement surface plasmon resonance (SPR) effects into fiber-optic sensors for chemical and biochemical applications and a survey of results reported over the last ten years is presented.
Abstract: This paper presents a brief overview of the technologies used to implement surface plasmon resonance (SPR) effects into fiber-optic sensors for chemical and biochemical applications and a survey of results reported over the last ten years. The performance indicators that are relevant for such systems, such as refractometric sensitivity, operating wavelength, and figure of merit (FOM), are discussed and listed in table form. A list of experimental results with reported limits of detection (LOD) for proteins, toxins, viruses, DNA, bacteria, glucose, and various chemicals is also provided for the same time period. Configurations discussed include fiber-optic analogues of the Kretschmann–Raether prism SPR platforms, made from geometry-modified multimode and single-mode optical fibers (unclad, side-polished, tapered, and U-shaped), long period fiber gratings (LPFG), tilted fiber Bragg gratings (TFBG), and specialty fibers (plastic or polymer, microstructured, and photonic crystal fibers). Configurations involving the excitation of surface plasmon polaritons (SPP) on continuous thin metal layers as well as those involving localized SPR (LSPR) phenomena in nanoparticle metal coatings of gold, silver, and other metals at visible and near-infrared wavelengths are described and compared quantitatively.
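The performance indicators listed above relate through a commonly used definition (assumed here, since such surveys note several variants): refractometric sensitivity is the resonance shift per refractive-index unit, and the figure of merit divides that by the resonance linewidth. The numbers below are illustrative only.

```python
# FOM = sensitivity / FWHM (a common definition for spectral SPR sensors):
# sensitivity in nm per refractive-index unit (RIU), linewidth in nm.
def spr_figure_of_merit(shift_nm, delta_n_riu, fwhm_nm):
    sensitivity = shift_nm / delta_n_riu        # nm / RIU
    return sensitivity, sensitivity / fwhm_nm   # FOM in RIU^-1

s, fom = spr_figure_of_merit(shift_nm=20.0, delta_n_riu=0.01, fwhm_nm=50.0)
print(f"sensitivity = {s:.0f} nm/RIU, FOM = {fom:.0f} RIU^-1")
# sensitivity = 2000 nm/RIU, FOM = 40 RIU^-1
```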

Journal ArticleDOI
TL;DR: Improved neurological impairment and long‐term neuroprotection associated with enhanced angioneurogenesis were noticed in stroke mice receiving EVs from two different bone marrow‐derived MSC lineages, providing clinically relevant evidence warranting rapid proof‐of‐concept studies in stroke patients.
Abstract: Although the initial concepts of stem cell therapy aimed at replacing lost tissue, more recent evidence has suggested that stem and progenitor cells alike promote postischemic neurological recovery by secreted factors that restore the injured brain's capacity to reshape. Specifically, extracellular vesicles (EVs) derived from stem cells such as exosomes have recently been suggested to mediate restorative stem cell effects. In order to define whether EVs indeed improve postischemic neurological impairment and brain remodeling, we systematically compared the effects of mesenchymal stem cell (MSC)-derived EVs (MSC-EVs) with MSCs that were i.v. delivered to mice on days 1, 3, and 5 (MSC-EVs) or on day 1 (MSCs) after focal cerebral ischemia in C57BL6 mice. For as long as 28 days after stroke, motor coordination deficits, histological brain injury, immune responses in the peripheral blood and brain, and cerebral angiogenesis and neurogenesis were analyzed. Improved neurological impairment and long-term neuroprotection associated with enhanced angioneurogenesis were noticed in stroke mice receiving EVs from two different bone marrow-derived MSC lineages. MSC-EV administration closely resembled responses to MSCs and persisted throughout the observation period. Although cerebral immune cell infiltration was not affected by MSC-EVs, postischemic immunosuppression (i.e., B-cell, natural killer cell, and T-cell lymphopenia) was attenuated in the peripheral blood at 6 days after ischemia, providing an appropriate external milieu for successful brain remodeling. Because MSC-EVs have recently been shown to be apparently safe in humans, the present study provides clinically relevant evidence warranting rapid proof-of-concept studies in stroke patients. STEM CELLS TRANSLATIONAL MEDICINE 2015;4:1–13

Journal ArticleDOI
12 Mar 2015-Cell
TL;DR: A SNP in COLD1, SNP2, which originated from Chinese Oryza rufipogon, is identified as responsible for the ability of COLD1(jap/ind) to confer chilling tolerance, supporting the importance of COLD1 in plant adaptation.


Journal ArticleDOI
TL;DR: Progress in cancer control over the study period was evident for stomach, colon, lung (in males), and ovarian cancer; stage of disease at diagnosis, timely access to effective treatment, and the extent of comorbidity are likely the main determinants of patient outcomes.
Abstract: Summary Background Population-based cancer survival estimates provide valuable insights into the effectiveness of cancer services and can reflect the prospects of cure. As part of the second phase of the International Cancer Benchmarking Partnership (ICBP), the Cancer Survival in High-Income Countries (SURVMARK-2) project aims to provide a comprehensive overview of cancer survival across seven high-income countries and a comparative assessment of corresponding incidence and mortality trends. Methods In this longitudinal, population-based study, we collected patient-level data on 3·9 million patients with cancer from population-based cancer registries in 21 jurisdictions in seven countries (Australia, Canada, Denmark, Ireland, New Zealand, Norway, and the UK) for seven sites of cancer (oesophagus, stomach, colon, rectum, pancreas, lung, and ovary) diagnosed between 1995 and 2014, and followed up until Dec 31, 2015. We calculated age-standardised net survival at 1 year and 5 years after diagnosis by site, age group, and period of diagnosis. We mapped changes in incidence and mortality to changes in survival to assess progress in cancer control. Findings In 19 eligible jurisdictions, 3 764 543 cases of cancer were eligible for inclusion in the study. In the 19 included jurisdictions, over 1995–2014, 1-year and 5-year net survival increased in each country across almost all cancer types, with, for example, 5-year rectal cancer survival increasing more than 13 percentage points in Denmark, Ireland, and the UK. For 2010–14, survival was generally higher in Australia, Canada, and Norway than in New Zealand, Denmark, Ireland, and the UK. Over the study period, larger survival improvements were observed for patients younger than 75 years at diagnosis than those aged 75 years and older, and notably for cancers with a poor prognosis (ie, oesophagus, stomach, pancreas, and lung). Progress in cancer control (ie, increased survival, decreased mortality and incidence) over the study period was evident for stomach, colon, lung (in males), and ovarian cancer. Interpretation The joint evaluation of trends in incidence, mortality, and survival indicated progress in four of the seven studied cancers. Cancer survival continues to increase across high-income countries; however, international disparities persist. While truly valid comparisons require differences in registration practice, classification, and coding to be minimal, stage of disease at diagnosis, timely access to effective treatment, and the extent of comorbidity are likely the main determinants of patient outcomes. Future studies are needed to assess the impact of these factors to further our understanding of international disparities in cancer survival. Funding Canadian Partnership Against Cancer; Cancer Council Victoria; Cancer Institute New South Wales; Cancer Research UK; Danish Cancer Society; National Cancer Registry Ireland; The Cancer Society of New Zealand; National Health Service England; Norwegian Cancer Society; Public Health Agency Northern Ireland, on behalf of the Northern Ireland Cancer Registry; The Scottish Government; Western Australia Department of Health; and Wales Cancer Network.

Journal ArticleDOI
TL;DR: In this article, the authors present a review of the use of gamification in education, highlighting the need for systematically designed studies and rigorously tested approaches to confirm the educational benefits of gamified learning.
Abstract: Gamification of education is a developing approach for increasing learners’ motivation and engagement by incorporating game design elements in educational environments. With the growing popularity of gamification and yet mixed success of its application in educational contexts, the current review is aiming to shed a more realistic light on the research in this field by focusing on empirical evidence rather than on potentialities, beliefs or preferences. Accordingly, it critically examines the advancement in gamifying education. The discussion is structured around the used gamification mechanisms, the gamified subjects, the type of gamified learning activities, and the study goals, with an emphasis on the reliability and validity of the reported outcomes. To improve our understanding and offer a more realistic picture of the progress of gamification in education, consistent with the presented evidence, we examine both the outcomes reported in the papers and how they have been obtained. While the gamification in education is still a growing phenomenon, the review reveals that (i) insufficient evidence exists to support the long-term benefits of gamification in educational contexts; (ii) the practice of gamifying learning has outpaced researchers’ understanding of its mechanisms and methods; (iii) the knowledge of how to gamify an activity in accordance with the specifics of the educational context is still limited. The review highlights the need for systematically designed studies and rigorously tested approaches confirming the educational benefits of gamification, if gamified learning is to become a recognized instructional approach.

Journal ArticleDOI
06 Sep 2016-JAMA
TL;DR: Exposure to MRI during the first trimester of pregnancy compared with nonexposure was not associated with increased risk of harm to the fetus or in early childhood, whereas gadolinium MRI at any time during pregnancy was associated with an increased risk of a broad set of rheumatological, inflammatory, or infiltrative skin conditions and of stillbirth or neonatal death.
Abstract: Importance Fetal safety of magnetic resonance imaging (MRI) during the first trimester of pregnancy or with gadolinium enhancement at any time of pregnancy is unknown. Objective To evaluate the long-term safety after exposure to MRI in the first trimester of pregnancy or to gadolinium at any time during pregnancy. Design, Setting, and Participants Universal health care databases in the province of Ontario, Canada, were used to identify all births of more than 20 weeks, from 2003-2015. Exposures Magnetic resonance imaging exposure in the first trimester of pregnancy, or gadolinium MRI exposure at any time in pregnancy. Main Outcomes and Measures For first-trimester MRI exposure, the risk of stillbirth or neonatal death within 28 days of birth and any congenital anomaly, neoplasm, and hearing or vision loss was evaluated from birth to age 4 years. For gadolinium-enhanced MRI in pregnancy, connective tissue or skin disease resembling nephrogenic systemic fibrosis (NSF-like) and a broader set of rheumatological, inflammatory, or infiltrative skin conditions from birth were identified. Results Of 1 424 105 deliveries (48% girls; mean gestational age, 39 weeks), the overall rate of MRI was 3.97 per 1000 pregnancies. Comparing first-trimester MRI (n = 1737) to no MRI (n = 1 418 451), there were 19 stillbirths or deaths vs 9844 in the unexposed cohort (adjusted relative risk [RR], 1.68; 95% CI, 0.97 to 2.90) for an adjusted risk difference of 4.7 per 1000 person-years (95% CI, −1.6 to 11.0). The risk was also not significantly higher for congenital anomalies, neoplasm, or vision or hearing loss. Comparing gadolinium MRI (n = 397) with no MRI (n = 1 418 451), the hazard ratio for NSF-like outcomes was not statistically significant. The broader outcome of any rheumatological, inflammatory, or infiltrative skin condition occurred in 123 vs 384 180 births (adjusted HR, 1.36; 95% CI, 1.09 to 1.69) for an adjusted risk difference of 45.3 per 1000 person-years (95% CI, 11.3 to 86.8). Stillbirths and neonatal deaths occurred among 7 MRI-exposed vs 9844 unexposed pregnancies (adjusted RR, 3.70; 95% CI, 1.55 to 8.85) for an adjusted risk difference of 47.5 per 1000 pregnancies (95% CI, 9.7 to 138.2). Conclusions and Relevance Exposure to MRI during the first trimester of pregnancy compared with nonexposure was not associated with increased risk of harm to the fetus or in early childhood. Gadolinium MRI at any time during pregnancy was associated with an increased risk of a broad set of rheumatological, inflammatory, or infiltrative skin conditions and for stillbirth or neonatal death. The study may not have been able to detect rare adverse outcomes.

Journal ArticleDOI
TL;DR: The LPA locus link with cardiovascular risk exemplifies how detailed metabolic profiling may inform underlying aetiology via extensive associations with very-low-density lipoprotein and triglyceride metabolism and strengthens the argument for safe LPA-targeted intervention to reduce cardiovascular risk.
Abstract: Genome-wide association studies have identified numerous loci linked with complex diseases, for which the molecular mechanisms remain largely unclear. Comprehensive molecular profiling of circulating metabolites captures highly heritable traits, which can help to uncover metabolic pathophysiology underlying established disease variants. We conduct an extended genome-wide association study of genetic influences on 123 circulating metabolic traits quantified by nuclear magnetic resonance metabolomics from up to 24,925 individuals and identify eight novel loci for amino acids, pyruvate and fatty acids. The LPA locus link with cardiovascular risk exemplifies how detailed metabolic profiling may inform underlying aetiology via extensive associations with very-low-density lipoprotein and triglyceride metabolism. Genetic fine mapping and Mendelian randomization uncover wide-spread causal effects of lipoprotein(a) on overall lipoprotein metabolism and we assess potential pleiotropic consequences of genetically elevated lipoprotein(a) on diverse morbidities via electronic health-care records. Our findings strengthen the argument for safe LPA-targeted intervention to reduce cardiovascular risk.
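For intuition about the Mendelian randomization step, the simplest estimator is the Wald ratio: the variant-outcome association divided by the variant-exposure association, with a first-order delta-method standard error. The sketch below uses made-up numbers and ignores the covariance term; the paper's analysis is more elaborate than this.

```python
# Wald-ratio sketch for Mendelian randomization (illustrative numbers only).
beta_gx, se_gx = 0.20, 0.02   # variant -> exposure (e.g., a metabolic trait)
beta_gy, se_gy = 0.05, 0.01   # variant -> outcome (e.g., disease log-odds)

wald = beta_gy / beta_gx      # causal effect estimate of exposure on outcome
# First-order delta-method standard error for a ratio (covariance ignored).
se = abs(wald) * ((se_gy / beta_gy) ** 2 + (se_gx / beta_gx) ** 2) ** 0.5

print(f"causal estimate = {wald:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```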

Journal ArticleDOI
02 Jun 2020-JAMA
TL;DR: This study describes demographic characteristics and hospital bed capacities of the 5 New York City boroughs, and evaluates whether differences in testing for coronavirus disease 2019 (COVID-19), hospitalizations, and deaths have emerged as a signal of racial, ethnic, and financial disparities.
Abstract: This study describes demographic characteristics and hospital bed capacities of the 5 New York City boroughs, and evaluates whether differences in testing for coronavirus disease 2019 (COVID-19), hospitalizations, and deaths have emerged as a signal of racial, ethnic, and financial disparities.

Proceedings ArticleDOI
15 Feb 2018
TL;DR: This work analyzes four gradient-based attribution methods, formally proves conditions of equivalence and approximation between them, and constructs a unified framework that enables a direct comparison as well as an easier implementation.
Abstract: Understanding the flow of information in Deep Neural Networks (DNNs) is a challenging problem that has gained increasing attention over the last few years. While several methods have been proposed to explain network predictions, there have been only a few attempts to compare them from a theoretical perspective. What is more, no exhaustive empirical comparison has been performed in the past. In this work, we analyze four gradient-based attribution methods and formally prove conditions of equivalence and approximation between them. By reformulating two of these methods, we construct a unified framework which enables a direct comparison, as well as an easier implementation. Finally, we propose a novel evaluation metric, called Sensitivity-n, and test the gradient-based attribution methods alongside a simple perturbation-based attribution method on several datasets in the domains of image and text classification, using various network architectures.
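One of the gradient-based methods of this family, gradient * input, is short enough to show directly; for a linear model it exactly decomposes the target logit (minus the bias), which is the flavor of equivalence result such frameworks formalize. A minimal PyTorch sketch with an illustrative model follows.

```python
# Gradient * input attribution: backpropagate the target-class score to the
# input and multiply elementwise, yielding one attribution per input feature.
import torch

def gradient_times_input(model, x, target_class):
    x = x.clone().requires_grad_(True)
    score = model(x)[0, target_class]
    score.backward()
    return x.grad * x          # same shape as the input

model = torch.nn.Sequential(torch.nn.Linear(4, 3))   # toy stand-in network
attr = gradient_times_input(model, torch.randn(1, 4), target_class=1)
print(attr)
```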

Journal ArticleDOI
TL;DR: This article inquires into Facebook’s development as a platform by situating it within the transformation of social network sites into social media platforms with a historical perspective on platformization, or the rise of the platform as the dominant infrastructural and economic model of the social web and its consequences.
Abstract: In this article, I inquire into Facebook’s development as a platform by situating it within the transformation of social network sites into social media platforms. I explore this shift with a histo...

Proceedings ArticleDOI
18 Apr 2017
TL;DR: A lightweight BC-based architecture for IoT that virtually eliminates the overheads of classic BC, while maintaining most of its security and privacy benefits, is proposed.
Abstract: There has been increasing interest in adopting BlockChain (BC), that underpins the crypto-currency Bitcoin, in Internet of Things (IoT) for security and privacy. However, BCs are computationally expensive and involve high bandwidth overhead and delays, which are not suitable for most IoT devices. This paper proposes a lightweight BC-based architecture for IoT that virtually eliminates the overheads of classic BC, while maintaining most of its security and privacy benefits. IoT devices benefit from a private immutable ledger, that acts similar to BC but is managed centrally, to optimize energy consumption. High resource devices create an overlay network to implement a publicly accessible distributed BC that ensures end-to-end security and privacy. The proposed architecture uses distributed trust to reduce the block validation processing time. We explore our approach in a smart home setting as a representative case study for broader IoT applications. Qualitative evaluation of the architecture under common threat models highlights its effectiveness in providing security and privacy for IoT applications. Simulations demonstrate that our method decreases packet and processing overhead significantly compared to the BC implementation used in Bitcoin.
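The ingredient shared by the private immutable ledger and the overlay blockchain is hash chaining, which is easy to illustrate. The toy sketch below shows only that primitive (blocks linked by SHA-256 hashes so past transactions cannot be silently altered); field names and the verification routine are illustrative, not the paper's protocol, and real immutability additionally relies on the centralized manager or distributed consensus the paper describes.

```python
# Toy hash-chained ledger: each block commits to its contents and to the
# previous block's hash, so editing any past transaction breaks verification.
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    block = {"time": time.time(), "tx": transactions, "prev": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain):
    for idx, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False                       # block contents were altered
        if idx > 0 and block["prev"] != chain[idx - 1]["hash"]:
            return False                       # chain linkage was broken
    return True

genesis = make_block(["device enrolled"], prev_hash="0" * 64)
block1 = make_block(["thermostat -> hub: reading"], prev_hash=genesis["hash"])
print(verify([genesis, block1]))               # True; tampering flips it
```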

Journal ArticleDOI
TL;DR: PD-1 blockade may facilitate the proliferation of highly suppressive PD-1+ eTreg cells in HPD, resulting in inhibition of antitumor immunity; depletion of these cells may help treat and prevent HPD.
Abstract: PD-1 blockade is a cancer immunotherapy effective in various types of cancer. In a fraction of treated patients, however, it causes rapid cancer progression called hyperprogressive disease (HPD). With our observation of HPD in ∼10% of anti–PD-1 monoclonal antibody (mAb)-treated advanced gastric cancer (GC) patients, we explored how anti–PD-1 mAb caused HPD in these patients and how HPD could be treated and prevented. In the majority of GC patients, tumor-infiltrating FoxP3highCD45RA−CD4+ T cells [effector Treg (eTreg) cells], which were abundant and highly suppressive in tumors, expressed PD-1 at equivalent levels as tumor-infiltrating CD4+ or CD8+ effector/memory T cells and at much higher levels than circulating eTreg cells. Comparison of GC tissue samples before and after anti–PD-1 mAb therapy revealed that the treatment markedly increased tumor-infiltrating proliferative (Ki67+) eTreg cells in HPD patients, contrasting with their reduction in non-HPD patients. Functionally, circulating and tumor-infiltrating PD-1+ eTreg cells were highly activated, showing higher expression of CTLA-4 than PD-1− eTreg cells. PD-1 blockade significantly enhanced in vitro Treg cell suppressive activity. Similarly, in mice, genetic ablation or antibody-mediated blockade of PD-1 in Treg cells increased their proliferation and suppression of antitumor immune responses. Taken together, PD-1 blockade may facilitate the proliferation of highly suppressive PD-1+ eTreg cells in HPDs, resulting in inhibition of antitumor immunity. The presence of actively proliferating PD-1+ eTreg cells in tumors is therefore a reliable marker for HPD. Depletion of eTreg cells in tumor tissues would be effective in treating and preventing HPD in PD-1 blockade cancer immunotherapy.

Journal ArticleDOI
14 Jun 2016-JAMA
TL;DR: All active agents were associated with significant excess weight loss compared with placebo at 1 year; phentermine-topiramate and liraglutide were associated with the highest odds of achieving at least 5% weight loss at 52 weeks, while liraglutide and naltrexone-bupropion were linked with the highest odds of adverse event-related treatment discontinuation.
Abstract: Importance Five medications have been approved for the management of obesity, but data on comparative effectiveness are limited. Objective To compare weight loss and adverse events among drug treatments for obesity using a systematic review and network meta-analysis. Data Sources MEDLINE, EMBASE, Web of Science, Scopus, and Cochrane Central from inception to March 23, 2016; clinical trial registries. Study Selection Randomized clinical trials conducted among overweight and obese adults treated with US Food and Drug Administration–approved long-term weight loss agents (orlistat, lorcaserin, naltrexone-bupropion, phentermine-topiramate, or liraglutide) for at least 1 year compared with another active agent or placebo. Data Extraction and Synthesis Two investigators identified studies and independently abstracted data using a predefined protocol. A Bayesian network meta-analysis was performed and relative ranking of agents was assessed using surface under the cumulative ranking (SUCRA) probabilities. Quality of evidence was assessed using GRADE criteria. Main Outcomes and Measures Proportions of patients with at least 5% weight loss and at least 10% weight loss, magnitude of decrease in weight, and discontinuation of therapy because of adverse events at 1 year. Results Twenty-eight randomized clinical trials with 29 018 patients (median age, 46 years; 74% women; median baseline body weight, 100.5 kg; median baseline body mass index, 36.1) were included. A median 23% of placebo participants had at least 5% weight loss vs 75% of participants taking phentermine-topiramate (odds ratio [OR], 9.22; 95% credible interval [CrI], 6.63-12.85; SUCRA, 0.95), 63% of participants taking liraglutide (OR, 5.54; 95% CrI, 4.16-7.78; SUCRA, 0.83), 55% taking naltrexone-bupropion (OR, 3.96; 95% CrI, 3.03-5.11; SUCRA, 0.60), 49% taking lorcaserin (OR, 3.10; 95% CrI, 2.38-4.05; SUCRA, 0.39), and 44% taking orlistat (OR, 2.70; 95% CrI, 2.34-3.09; SUCRA, 0.22). All active agents were associated with significant excess weight loss compared with placebo at 1 year—phentermine-topiramate, 8.8 kg (95% CrI, −10.20 to −7.42 kg); liraglutide, 5.3 kg (95% CrI, −6.06 to −4.52 kg); naltrexone-bupropion, 5.0 kg (95% CrI, −5.94 to −3.96 kg); lorcaserin, 3.2 kg (95% CrI, −3.97 to −2.46 kg); and orlistat, 2.6 kg (95% CrI, −3.04 to −2.16 kg). Compared with placebo, liraglutide (OR, 2.95; 95% CrI, 2.11-4.23) and naltrexone-bupropion (OR, 2.64; 95% CrI, 2.10-3.35) were associated with the highest odds of adverse event–related treatment discontinuation. High attrition rates (30%-45% in all trials) were associated with lower confidence in estimates. Conclusions and Relevance Among overweight or obese adults, orlistat, lorcaserin, naltrexone-bupropion, phentermine-topiramate, and liraglutide, compared with placebo, were each associated with achieving at least 5% weight loss at 52 weeks. Phentermine-topiramate and liraglutide were associated with the highest odds of achieving at least 5% weight loss.

Journal ArticleDOI
TL;DR: This comprehensive guide extends the base methodology from the health sciences and other fields with numerous adaptations to meet the needs of methodologically diverse fields such as IS research, especially those that involve including and synthesizing both quantitative and qualitative studies.
Abstract: Many scholars are not well trained in conducting a standalone literature review, a scholarly paper that in its entirety summarizes and synthesizes knowledge from a prior body of research. Numerous guides that exist for information systems (IS) research mainly concentrate on only certain parts of the process; few span the entire process. This paper introduces the rigorous, standardized methodology for the systematic literature review (also called systematic review) to IS scholars. This comprehensive guide extends the base methodology from the health sciences and other fields with numerous adaptations to meet the needs of methodologically diverse fields such as IS research, especially those that involve including and synthesizing both quantitative and qualitative studies. Moreover, this guide provides many examples from IS research and provides references to guides with further helpful details for conducting a rigorous and valuable literature review. Although tailored to IS research, it is sufficiently broad to be applicable and valuable to scholars from any social science field.

Book
27 Mar 2018
TL;DR: Imitation learning, as discussed in this paper, is the process of learning a desired behavior from demonstrations: rather than attempting to manually engineer a behavior, a teacher demonstrates it and the agent learns to reproduce or explain it.
Abstract: As robots and other intelligent agents move from simple environments and problems to more complex, unstructured settings, manually programming their behavior has become increasingly challenging and expensive. Often, it is easier for a teacher to demonstrate a desired behavior rather than attempt to manually engineer it. This process of learning from demonstrations, and the study of algorithms to do so, is called imitation learning. This work provides an introduction to imitation learning. It covers the underlying assumptions, approaches, and how they relate; the rich set of algorithms developed to tackle the problem; and advice on effective tools and implementation. We intend this paper to serve two audiences. First, we want to familiarize machine learning experts with the challenges of imitation learning, particularly those arising in robotics, and the interesting theoretical and practical distinctions between it and more familiar frameworks like statistical supervised learning theory and reinforcement learning. Second, we want to give roboticists and experts in applied artificial intelligence a broader appreciation for the frameworks and tools available for imitation learning. We pay particular attention to the intimate connection between imitation learning approaches and those of structured prediction [Daume III et al., 2009]. To structure this discussion, we categorize imitation learning techniques based on the following key criteria which drive algorithmic decisions: 1) The structure of the policy space. Is the learned policy a time-indexed trajectory (trajectory learning), a mapping from observations to actions (so-called behavioral cloning [Bain and Sammut, 1996]), or the result of a complex optimization or planning problem at each execution, as is common in inverse optimal control methods [Kalman, 1964, Moylan and Anderson, 1973]? 2) The information available during training and testing. In particular, is the learning algorithm privy to the full state that the teacher possesses? Is the learner able to interact with the teacher and gather corrections or more data? Does the learner have a (typically a priori) model of the system with which it interacts? Does the learner have access to the reward (cost) function that the teacher is attempting to optimize? 3) The notion of success. Different algorithmic approaches provide varying guarantees on the resulting learned behavior. These guarantees range from weaker (e.g., measuring disagreement with the agent’s decision) to stronger (e.g., providing guarantees on the performance of the learner with respect to a true cost function, either known or unknown). We organize our work by paying particular attention to distinction (1): dividing imitation learning into directly replicating desired behavior (sometimes called behavioral cloning) and learning the hidden objectives of the desired behavior from demonstrations (called inverse optimal control or inverse reinforcement learning [Russell, 1998]). In the latter case, behavior arises as the result of an optimization problem solved for each new instance that the learner faces. In addition to method analysis, we discuss the design decisions a practitioner must make when selecting an imitation learning approach.
Moreover, application examples — such as robots that play table tennis [Kober and Peters, 2009], programs that play the game of Go [Silver et al., 2016], and systems that understand natural language [Wen et al., 2015] — illustrate the properties and motivations behind different forms of imitation learning. We conclude by presenting a set of open questions and point towards possible future research directions for machine learning.
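To ground the taxonomy's first category, here is a minimal behavioral-cloning sketch: a policy mapping observations to actions, fit by supervised regression on teacher demonstrations. Dimensions, network, and data are placeholders; real systems must also contend with compounding errors at test time, which the interactive approaches surveyed in the text address.

```python
# Behavioral cloning: supervised learning of a policy from (obs, action)
# pairs demonstrated by a teacher. All shapes and data here are toy values.
import torch
import torch.nn as nn

obs_dim, act_dim = 8, 2
policy = nn.Sequential(nn.Linear(obs_dim, 64), nn.Tanh(),
                       nn.Linear(64, act_dim))
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Pretend demonstrations: observations and the teacher's actions.
demo_obs = torch.randn(512, obs_dim)
demo_act = torch.randn(512, act_dim)

for epoch in range(100):
    pred = policy(demo_obs)
    loss = nn.functional.mse_loss(pred, demo_act)   # imitate the teacher
    opt.zero_grad()
    loss.backward()
    opt.step()
```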

Proceedings ArticleDOI
16 Apr 2018
TL;DR: The FEVER dataset, as discussed in this paper, is a publicly available dataset for verification against textual sources; it consists of 185,445 claims generated by altering sentences extracted from Wikipedia and subsequently verified without knowledge of the sentence from which they were derived.
Abstract: In this paper we introduce a new publicly available dataset for verification against textual sources, FEVER: Fact Extraction and VERification. It consists of 185,445 claims generated by altering sentences extracted from Wikipedia and subsequently verified without knowledge of the sentence they were derived from. The claims are classified as SUPPORTED, REFUTED or NOTENOUGHINFO by annotators achieving 0.6841 in Fleiss κ. For the first two classes, the annotators also recorded the sentence(s) forming the necessary evidence for their judgment. To characterize the challenge of the dataset presented, we develop a pipeline approach and compare it to suitably designed oracles. The best accuracy we achieve on labeling a claim accompanied by the correct evidence is 31.87%, while if we ignore the evidence we achieve 50.91%. Thus we believe that FEVER is a challenging testbed that will help stimulate progress on claim verification against textual sources.
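For a feel of the evidence-selection step in such a pipeline, the sketch below ranks candidate Wikipedia sentences against a claim by TF-IDF cosine similarity; the actual FEVER baseline combines retrieval of this general kind with a separate entailment classifier for the SUPPORTED/REFUTED/NOTENOUGHINFO decision. The example sentences are made up.

```python
# Rank candidate evidence sentences for a claim by TF-IDF cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "The Eiffel Tower is located in Paris.",
    "Paris is the capital of France.",
    "The tower was completed in 1889.",
]
claim = "The Eiffel Tower was finished in 1889."

vec = TfidfVectorizer().fit(sentences + [claim])
scores = cosine_similarity(vec.transform([claim]),
                           vec.transform(sentences))[0]
best = max(range(len(sentences)), key=lambda i: scores[i])
print(sentences[best])   # highest-ranked evidence sentence for the claim
```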