scispace - formally typeset
Institution

Yahoo!

Company · London, United Kingdom
About: Yahoo! is a company based in London, United Kingdom. It is known for research contributions in the topics Population & Web search query. The organization has 26749 authors who have published 29915 publications receiving 732583 citations. The organization is also known as: Yahoo! Inc. & Maudwen-Yahoo! Inc.


Papers
Journal ArticleDOI
Ozlem Er
TL;DR: There is still a long way to go in characterizing cancer stem cells and understanding their role in carcinogenesis; further studies should lead to a greater understanding of the biology of these cells, with significant implications for the diagnosis, treatment, and prevention of cancer.
Abstract: Carcinogenesis and tumor cell biology are very complex subjects. Recent findings on cancer stem cells may have a great impact on the understanding of the biology of malignancies and the development of future treatment approaches. Cancer stem cells are defined by their potential to self-renew and differentiate. Identification of cancer stem cells is possible by detecting expression of a certain combination of various cell surface markers and also by functional assays. The cancer stem cell concept was originally developed for hematological malignancies but is now becoming accepted for solid tumors as well. Emerging data indicate that cancer stem cells are responsible for the growth and spread of tumors. There is still a long way to go in characterizing cancer stem cells and understanding their role in carcinogenesis. Further studies should lead to a greater understanding of the biology of these cells, with significant implications for the diagnosis, treatment, and prevention of cancer. The recent studies regarding the stem cell origin of various solid tumors will be discussed briefly in this review.

280 citations

Proceedings ArticleDOI
20 Jun 2007
TL;DR: In this paper, a trust region Newton method is applied to maximize the log-likelihood of the logistic regression model; it uses only approximate Newton steps in the beginning but achieves fast convergence in the end.
Abstract: Large-scale logistic regression arises in many applications such as document classification and natural language processing. In this paper, we apply a trust region Newton method to maximize the log-likelihood of the logistic regression model. The proposed method uses only approximate Newton steps in the beginning, but achieves fast convergence in the end. Experiments show that it is faster than the commonly used quasi Newton approach for logistic regression. We also compare it with linear SVM implementations.
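The approach described above can be sketched with SciPy's general-purpose `trust-ncg` optimizer, which likewise takes approximate (conjugate-gradient) trust-region Newton steps. This is an illustrative stand-in, not the paper's TRON implementation; the synthetic data and the `lam` regularization strength are assumptions made for the sketch:

```python
# Sketch: L2-regularized logistic regression fitted with a trust-region
# Newton method (SciPy's 'trust-ncg'), illustrating the idea in the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)  # labels in {0, 1}
lam = 1e-2  # L2 regularization strength (assumed for the sketch)

def loss(w):
    z = X @ w
    # Negative log-likelihood: sum log(1 + exp(z)) - y*z, computed stably.
    return np.sum(np.logaddexp(0.0, z) - y * z) + 0.5 * lam * w @ w

def grad(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))  # sigmoid probabilities
    return X.T @ (p - y) + lam * w

def hess(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    D = p * (1.0 - p)                   # per-example Hessian weights
    return X.T @ (X * D[:, None]) + lam * np.eye(X.shape[1])

res = minimize(loss, np.zeros(5), jac=grad, hess=hess, method="trust-ncg")
print(res.success, np.linalg.norm(grad(res.x)))
```

The exact Hessian is cheap here; at the paper's document-classification scale one would supply a Hessian-vector product (`hessp`) instead of the full matrix.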

279 citations

Posted Content
TL;DR: FeUdal Networks (FuN), as presented in this paper, is a hierarchical reinforcement learning architecture that employs a Manager module and a Worker module, where the Manager operates at a lower temporal resolution and sets abstract goals which are conveyed to and enacted by the Worker.
Abstract: We introduce FeUdal Networks (FuNs): a novel architecture for hierarchical reinforcement learning. Our approach is inspired by the feudal reinforcement learning proposal of Dayan and Hinton, and gains power and efficacy by decoupling end-to-end learning across multiple levels -- allowing it to utilise different resolutions of time. Our framework employs a Manager module and a Worker module. The Manager operates at a lower temporal resolution and sets abstract goals which are conveyed to and enacted by the Worker. The Worker generates primitive actions at every tick of the environment. The decoupled structure of FuN conveys several benefits -- in addition to facilitating very long timescale credit assignment it also encourages the emergence of sub-policies associated with different goals set by the Manager. These properties allow FuN to dramatically outperform a strong baseline agent on tasks that involve long-term credit assignment or memorisation. We demonstrate the performance of our proposed system on a range of tasks from the ATARI suite and also from a 3D DeepMind Lab environment.
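The Manager/Worker temporal decoupling described above can be illustrated with a toy loop: the Manager emits a goal every `c` ticks while the Worker picks a primitive action at every tick. All names, dimensions, and the random "policies" here are invented for the sketch; this is not the paper's network:

```python
# Toy sketch of FuN's decoupled control loop (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
c = 4                 # Manager's temporal resolution: one goal per c ticks
goal_dim, n_actions = 8, 3
Wg = rng.normal(size=(goal_dim, n_actions))  # Worker's goal-to-action weights

def manager(state):
    # Abstract goal in a latent space (here just a squashed slice of the state).
    return np.tanh(state[:goal_dim])

def worker(state, goal):
    # Primitive action conditioned on the Manager's current goal.
    logits = goal @ Wg + 0.1 * state[:n_actions]
    return int(np.argmax(logits))

goal = None
actions = []
for t in range(12):
    state = rng.normal(size=16)   # stand-in for an environment observation
    if t % c == 0:                # Manager acts at the lower resolution
        goal = manager(state)
    actions.append(worker(state, goal))  # Worker acts at every tick
print(actions)
```

The point of the structure is visible even in the toy: the Worker's behaviour only changes goal every `c` steps, which is what enables long-timescale credit assignment at the Manager level.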

277 citations

Journal ArticleDOI
TL;DR: In this article, the authors evaluated the efficacy and safety of colchicine for the secondary prevention of recurrent pericarditis in 120 patients with a first recurrence of pericarditis.
Abstract: Background Recurrence is the most common complication of pericarditis, affecting 10% to 50% of patients. Objective To evaluate the efficacy and safety of colchicine for the secondary prevention of recurrent pericarditis. Design Prospective, randomized, double-blind, placebo-controlled multicenter trial. (ClinicalTrials.gov registration number: NCT00128414) Setting 4 general hospitals in urban areas of Italy. Patients 120 patients with a first recurrence of pericarditis. Intervention In addition to conventional treatment, patients were randomly assigned to receive either placebo or colchicine, 1.0 to 2.0 mg on the first day followed by a maintenance dose of 0.5 to 1.0 mg/d, for 6 months. Measurements The primary study end point was the recurrence rate at 18 months. Secondary end points were symptom persistence at 72 hours, remission rate at 1 week, number of recurrences, time to first recurrence, disease-related hospitalization, cardiac tamponade, and rate of constrictive pericarditis. Results At 18 months, the recurrence rate was 24% in the colchicine group and 55% in the placebo group (absolute risk reduction, 0.31 [95% CI, 0.13 to 0.46]; relative risk reduction, 0.56 [CI, 0.27 to 0.73]; number needed to treat, 3 [CI, 2 to 7]). Colchicine reduced the persistence of symptoms at 72 hours (absolute risk reduction, 0.30 [CI, 0.13 to 0.45]; relative risk reduction, 0.56 [CI, 0.27 to 0.74]) and mean number of recurrences, increased the remission rate at 1 week, and prolonged the time to subsequent recurrence. The study groups had similar rates of side effects and drug withdrawal. Limitation Multiple recurrences and neoplastic or bacterial causes were excluded. Conclusion Colchicine is safe and effective for secondary prevention of recurrent pericarditis.
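The headline effect sizes follow directly from the two reported recurrence rates; a quick check of the abstract's arithmetic:

```python
# Reproducing the trial's summary statistics from the reported 18-month
# recurrence rates (values taken from the abstract above).
placebo_rate = 0.55     # recurrence at 18 months, placebo arm
colchicine_rate = 0.24  # recurrence at 18 months, colchicine arm

arr = placebo_rate - colchicine_rate  # absolute risk reduction
rrr = arr / placebo_rate              # relative risk reduction
nnt = 1.0 / arr                       # number needed to treat

print(round(arr, 2), round(rrr, 2), round(nnt))  # 0.31 0.56 3
```

These match the abstract's reported point estimates (ARR 0.31, RRR 0.56, NNT 3); the confidence intervals require the underlying counts, which the abstract does not give.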

276 citations

Proceedings Article
01 Jan 2019
TL;DR: In this article, the Object Relation Transformer (ORT) is proposed to explicitly incorporate information about the spatial relationships between detected input objects through geometric attention, leading to improvements on all common captioning metrics on the MS-COCO dataset.
Abstract: Image captioning models typically follow an encoder-decoder architecture which uses abstract image feature vectors as input to the encoder. One of the most successful algorithms uses feature vectors extracted from the region proposals obtained from an object detector. In this work we introduce the Object Relation Transformer, that builds upon this approach by explicitly incorporating information about the spatial relationship between input detected objects through geometric attention. Quantitative and qualitative results demonstrate the importance of such geometric attention for image captioning, leading to improvements on all common captioning metrics on the MS-COCO dataset. Code is available at https://github.com/yahoo/object_relation_transformer .
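The geometric attention idea can be sketched as a bias added to standard scaled dot-product attention logits, computed from relative bounding-box geometry in the style of the relation networks the paper builds on. Function and parameter names here are illustrative, not the API of the linked repository:

```python
# Sketch of geometric attention over detected-object features (illustrative).
import numpy as np

def box_relation_features(boxes):
    # boxes: (N, 4) as (cx, cy, w, h). Pairwise relative-geometry features:
    # log-scaled center offsets and log size ratios, one (N, N, 4) tensor.
    cx, cy, w, h = boxes.T
    dx = np.log(np.abs(cx[:, None] - cx[None, :]) / w[:, None] + 1e-3)
    dy = np.log(np.abs(cy[:, None] - cy[None, :]) / h[:, None] + 1e-3)
    dw = np.log(w[None, :] / w[:, None])
    dh = np.log(h[None, :] / h[:, None])
    return np.stack([dx, dy, dw, dh], axis=-1)

def geometric_attention(q, k, v, boxes, wg):
    # Appearance logits plus a log geometric-weight bias, then softmax.
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)                               # (N, N)
    geom = np.maximum(box_relation_features(boxes) @ wg, 1e-6)  # (N, N), clipped
    logits = logits + np.log(geom)
    a = np.exp(logits - logits.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)
    return a @ v

# Tiny usage example with random features and boxes.
rng = np.random.default_rng(1)
N, d = 3, 8
q, k, v = (rng.normal(size=(N, d)) for _ in range(3))
boxes = np.abs(rng.normal(size=(N, 4))) + 0.5  # positive widths/heights
wg = rng.normal(size=4)                        # learned projection (assumed)
out = geometric_attention(q, k, v, boxes, wg)
print(out.shape)
```

In the paper this bias sits inside the Transformer encoder's multi-head attention, with the geometric features passed through a learned embedding rather than the single linear projection `wg` used here.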

276 citations


Authors

Showing all 26766 results

Name | H-index | Papers | Citations
Ashok Kumar | 151 | 5654 | 164086
Alexander J. Smola | 122 | 434 | 110222
Howard I. Maibach | 116 | 1821 | 60765
Sanjay Jain | 103 | 881 | 46880
Amirhossein Sahebkar | 100 | 1307 | 46132
Marc Davis | 99 | 412 | 50243
Wenjun Zhang | 96 | 976 | 38530
Jian Xu | 94 | 1366 | 52057
Fortunato Ciardiello | 94 | 695 | 47352
Tong Zhang | 93 | 414 | 36519
Michael E. J. Lean | 92 | 411 | 30939
Ashish K. Jha | 87 | 503 | 30020
Xin Zhang | 87 | 1714 | 40102
Theunis Piersma | 86 | 632 | 34201
George Varghese | 84 | 253 | 28598
Network Information
Related Institutions (5)
University of Toronto: 294.9K papers, 13.5M citations (85% related)
University of California, San Diego: 204.5K papers, 12.3M citations (85% related)
University College London: 210.6K papers, 9.8M citations (84% related)
Cornell University: 235.5K papers, 12.2M citations (84% related)
University of Washington: 305.5K papers, 17.7M citations (84% related)

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 2
2022 | 47
2021 | 1,088
2020 | 1,074
2019 | 1,568
2018 | 1,352