Institution
Company • Tel Aviv, Israel
About: Facebook is a company based in Tel Aviv, Israel. It is known for its research contributions in the topics of Computer science and Artificial neural networks. The organization has 7856 authors who have published 10906 publications receiving 570123 citations. The organization is also known as facebook.com and FB.
Topics: Computer science, Artificial neural network, Language model, Context (language use), Reinforcement learning
Papers published on a yearly basis
Papers
12 Sep 2006
TL;DR: In this article, the content maintained in an online social network or other online communities is tracked for changes and updates, such as user profiles, digital photos, digital audio and video files, testimonials, and identification of users who are friends.
Abstract: Content maintained in an online social network or other online communities is tracked for changes and updates. The content may include user profiles, digital photos, digital audio and video files, testimonials, and identification of users who are friends. When such change or update occurs, users of the online social network or online community are notified according to various criteria that they have set. The notification may be provided by e-mail, an RSS feed, or a web page when accessed. With this feature, users can browse through content of other users with efficiency.
117 citations
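The tracking-and-notification mechanism described in the abstract can be sketched as a simple tracker/subscriber pattern. All names here (`ContentTracker`, `subscribe`, `update`) are illustrative assumptions, not taken from the patent:

```python
class ContentTracker:
    """Tracks versioned content items and notifies subscribed users on change."""

    def __init__(self):
        self.versions = {}      # item_id -> last seen content snapshot
        self.subscribers = {}   # item_id -> {user_id: delivery_channel}
        self.outbox = []        # (user_id, channel, item_id) notification records

    def subscribe(self, user_id, item_id, channel="email"):
        # Per the abstract, users choose how they are notified:
        # e-mail, an RSS feed, or a web page when accessed.
        self.subscribers.setdefault(item_id, {})[user_id] = channel

    def update(self, item_id, content):
        # Record the new snapshot; notify subscribers only if it changed.
        changed = self.versions.get(item_id) != content
        self.versions[item_id] = content
        if changed:
            for user_id, channel in self.subscribers.get(item_id, {}).items():
                self.outbox.append((user_id, channel, item_id))
        return changed
```

A real system would persist versions and deliver notifications asynchronously; this sketch only captures the change-detection and per-user-criteria routing the abstract describes.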
01 Apr 2021
TL;DR: The authors show that large scale models can learn these skills when given appropriate training data and choice of generation strategy, and build variants of these recipes with 90M, 2.7B and 9.4B parameter models, and make their models and code publicly available.
Abstract: Building open-domain chatbots is a challenging area for machine learning research. While prior work has shown that scaling neural models in the number of parameters and the size of the data they are trained on gives improved results, we highlight other ingredients. Good conversation requires blended skills: providing engaging talking points, and displaying knowledge, empathy and personality appropriately, while maintaining a consistent persona. We show that large scale models can learn these skills when given appropriate training data and choice of generation strategy. We build variants of these recipes with 90M, 2.7B and 9.4B parameter models, and make our models and code publicly available. Human evaluations show our best models outperform existing approaches in multi-turn dialogue on engagingness and humanness measurements. We then discuss the limitations of this work by analyzing failure cases of our models.
117 citations
TL;DR: In this paper, a heuristic clustering method groups Bitcoin wallets based on evidence of shared authority, and re-identification attacks (i.e., empirical purchasing of goods and services) are then used to classify the operators of those clusters.
Abstract: Bitcoin is a purely online virtual currency, unbacked by either physical commodities or sovereign obligation; instead, it relies on a combination of cryptographic protection and a peer-to-peer protocol for witnessing settlements. Consequently, Bitcoin has the unintuitive property that while the ownership of money is implicitly anonymous, its flow is globally visible. In this paper we explore this unique characteristic further, using heuristic clustering to group Bitcoin wallets based on evidence of shared authority, and then using re-identification attacks (i.e., empirical purchasing of goods and services) to classify the operators of those clusters. From this analysis, we consider the challenges for those seeking to use Bitcoin for criminal or fraudulent purposes at scale.
117 citations
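One clustering heuristic consistent with "evidence of shared authority" is the multi-input heuristic: addresses that appear together as inputs of the same transaction are assumed to share an owner, so they are merged into one cluster via union-find. A minimal sketch (illustrative, not the paper's actual code):

```python
def cluster_wallets(transactions):
    """transactions: list of lists of input addresses per transaction.
    Returns a mapping address -> cluster representative."""
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    for inputs in transactions:
        for addr in inputs:
            find(addr)                 # register every address, even singletons
        for addr in inputs[1:]:
            union(inputs[0], addr)     # co-spent inputs share an owner
    return {addr: find(addr) for addr in parent}
```

Transitivity is what makes the heuristic powerful: if A and B co-spend in one transaction and B and C in another, all three collapse into one cluster.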
18 Jun 2018
TL;DR: This paper considers the problem of inferring image labels when only a few annotated examples are available at training time, a setup referred to as low-shot learning; a standard approach is to retrain the last few layers of a CNN learned on separate classes for which training examples are abundant.
Abstract: This paper considers the problem of inferring image labels from images when only a few annotated examples are available at training time. This setup is often referred to as low-shot learning, where a standard approach is to retrain the last few layers of a convolutional neural network learned on separate classes for which training examples are abundant. We consider a semi-supervised setting based on a large collection of images to support label propagation. This is possible by leveraging the recent advances on large-scale similarity graph construction. We show that despite its conceptual simplicity, scaling label propagation up to hundred millions of images leads to state of the art accuracy in the low-shot learning regime.
117 citations
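The label-propagation step can be sketched on a toy similarity graph: labeled seed nodes are clamped, and every other node repeatedly adopts the weighted majority label of its labeled neighbors. The function and graph below are illustrative assumptions; the paper operates on k-NN graphs over image descriptors at the scale of hundreds of millions of nodes:

```python
def propagate_labels(edges, seeds, n_nodes, n_iters=20):
    """edges: list of (i, j, weight) undirected similarity edges.
    seeds: dict node -> label for the few annotated examples.
    Returns a label for every node reachable from a seed."""
    neighbors = {i: [] for i in range(n_nodes)}
    for i, j, w in edges:
        neighbors[i].append((j, w))
        neighbors[j].append((i, w))

    labels = dict(seeds)
    for _ in range(n_iters):
        updates = {}
        for node in range(n_nodes):
            if node in seeds:
                continue  # clamp the annotated seeds
            votes = {}
            for nb, w in neighbors[node]:
                if nb in labels:
                    lab = labels[nb]
                    votes[lab] = votes.get(lab, 0.0) + w
            if votes:
                updates[node] = max(votes, key=votes.get)
        labels.update(updates)
    return labels
```

Each iteration pushes labels one hop further through the graph, which is why a handful of seeds can label a large connected component.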
14 Jun 2004
TL;DR: In this article, relevant content is prepared and selected for delivery to a member of a network based on prior online activities of other members of the network, and the closeness of the member's relationship with the other members.
Abstract: Relevant content is prepared and selected for delivery to a member of a network based, in part, on prior online activities of the other members of the network, and the closeness of the member's relationship with the other members of the network. The relevant content may be an online ad, and is selected from a number of candidate online ads based on click-through rates of groups that are predefined with respect to the member and with respect to certain attributes. An online ad's revenue-generating potential may be considered in the selection process.
117 citations
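The selection rule in the abstract, scoring candidate ads by the click-through rate of the member's predefined group together with each ad's revenue potential, can be sketched as an expected-revenue ranking. All field and function names here are illustrative, not from the patent:

```python
def select_ad(candidate_ads, group_ctrs):
    """candidate_ads: list of dicts with 'name', 'group', 'revenue_per_click'.
    group_ctrs: dict mapping a predefined member group -> observed click-through rate.
    Returns the candidate with the highest expected revenue (CTR x revenue potential)."""
    def expected_revenue(ad):
        ctr = group_ctrs.get(ad["group"], 0.0)  # unknown group -> no expected clicks
        return ctr * ad["revenue_per_click"]
    return max(candidate_ads, key=expected_revenue)
```

The point of the rule is that a cheap ad shown to a high-CTR group can out-earn an expensive ad shown to a group that rarely clicks.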
Authors
Showing all 7875 results
Name | H-index | Papers | Citations
---|---|---|---
Yoshua Bengio | 202 | 1033 | 420313 |
Xiang Zhang | 154 | 1733 | 117576 |
Jitendra Malik | 151 | 493 | 165087 |
Trevor Darrell | 148 | 678 | 181113 |
Christopher D. Manning | 138 | 499 | 147595 |
Robert W. Heath | 128 | 1049 | 73171 |
Pieter Abbeel | 126 | 589 | 70911 |
Yann LeCun | 121 | 369 | 171211 |
Li Fei-Fei | 120 | 420 | 145574 |
Jon Kleinberg | 117 | 444 | 87865 |
Sergey Levine | 115 | 652 | 59769 |
Richard Szeliski | 113 | 359 | 72019 |
Sanjeev Kumar | 113 | 1325 | 54386 |
Bruce Neal | 108 | 561 | 87213 |
Larry S. Davis | 107 | 693 | 49714 |