Institution

Amazon.com

Company · Seattle, Washington, United States
About: Amazon.com is a company based in Seattle, Washington, United States. It is known for research contributions in the topics of Service (business) and Service provider. The organization has 13363 authors who have published 17317 publications receiving 266589 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, the average bole density of trees in Amazonian forests was found to be 5.3% lower than the commonly used wood density estimates from published species-level lists.

139 citations

Journal ArticleDOI
TL;DR: With the exception of fat, amounts of macronutrients, minerals and titratable acids decreased during the ripening process, and the same trend was observed for most of the phenolic constituents identified by HPLC-ESI-MS/MS.

138 citations

Patent
26 Sep 2011
TL;DR: In this patent, the authors present a system, methods, and interfaces for managing request routing functionality associated with resource requests for one or more resources associated with a content provider. The request routing corresponds to processing domain name service (DNS) requests from computing devices and resolving them by identifying the network address of a computing device that will provide the requested resources.
Abstract: A system, methods, and interfaces for managing request routing functionality associated with resource requests for one or more resources associated with a content provider. The request routing functionality can correspond to the processing of domain name service (“DNS”) requests for resources by computing devices and the resolution of the DNS requests by the identification of a network address of a computing device that will provide the requested resources. Unlike traditional CDN service provider implementation, the processing of resource requests by the service provider is separate from the delivery of the content by the content provider (or on behalf of the content provider).

138 citations
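To make the routing flow above concrete, here is a minimal sketch, assuming a hypothetical routing table and a trivial selection policy; it illustrates resolving a DNS request to the network address of a delivery server while content delivery itself remains with the content provider. It is not Amazon's implementation, and all hostnames, addresses, and names are made up.

```python
# Hypothetical sketch of DNS-based request routing: the service provider
# resolves a DNS query to the network address of a server that will provide
# the requested resource, while content delivery itself remains with the
# content provider. Hostnames, addresses, and the policy are illustrative.
import zlib

# Map of content-provider hostnames to candidate delivery servers.
ROUTING_TABLE = {
    "images.example-provider.com": ["198.51.100.10", "198.51.100.11"],
    "video.example-provider.com": ["203.0.113.20"],
}


def resolve_dns_request(hostname: str, client_region: str) -> str:
    """Return the network address of a computing device that will provide
    the requested resource, using a trivial region-based selection policy."""
    candidates = ROUTING_TABLE.get(hostname)
    if not candidates:
        raise LookupError(f"no route configured for {hostname}")
    # Deterministically spread regions across candidates so a given region
    # keeps resolving to the same delivery server.
    index = zlib.crc32(client_region.encode()) % len(candidates)
    return candidates[index]


if __name__ == "__main__":
    # The DNS answer tells the client where to fetch the content; the bytes
    # are then served by or on behalf of the content provider, not the resolver.
    print(resolve_dns_request("images.example-provider.com", "us-west"))
```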

Patent
17 Nov 2014
TL;DR: In this patent, the authors describe systems, devices, and methods that provide countermeasures against threats that may compromise uncrewed autonomous vehicles (UAVs), using external data received from a second UAV over a mesh network to cross-check the first UAV's own data.
Abstract: Uncrewed autonomous vehicles (“UAVs”) may navigate from one location to another location. Described herein are systems, devices, and methods providing countermeasures for threats that may compromise the UAVs. A plurality of UAVs may establish a mesh network to distribute information to one another. A first UAV may receive external data from a second UAV using the mesh network. The external data may be used to confirm or cross-check data such as location, heading, altitude, and so forth. Disagreement between data generated by the first UAV with external data from the second UAV may result in the determination that the first UAV is compromised. Remedial actions may be taken, such as the first UAV may be directed to a safe location to land or park, may receive commands from another UAV, and so forth.

138 citations
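The cross-check described in the abstract can be sketched roughly as below. This is a hypothetical illustration, assuming made-up telemetry fields and tolerances rather than anything specified in the patent: a UAV compares its own navigation data against estimates received from peers over the mesh network and treats majority disagreement as a sign it may be compromised.

```python
# Hypothetical sketch of the cross-check described in the abstract: a UAV
# compares its own navigation data with external data received from peer
# UAVs over the mesh network, and treats majority disagreement as a sign
# it may be compromised. Field names and tolerances are illustrative.
from dataclasses import dataclass


@dataclass
class Telemetry:
    latitude: float
    longitude: float
    altitude_m: float
    heading_deg: float


def is_compromised(own: Telemetry, peer_estimates: list[Telemetry],
                   max_position_err_deg: float = 0.01,
                   max_altitude_err_m: float = 50.0) -> bool:
    """Flag the UAV as compromised when its own data disagrees with the
    majority of peer estimates beyond the given tolerances."""
    disagreements = 0
    for est in peer_estimates:
        position_err = (abs(own.latitude - est.latitude)
                        + abs(own.longitude - est.longitude))
        altitude_err = abs(own.altitude_m - est.altitude_m)
        if position_err > max_position_err_deg or altitude_err > max_altitude_err_m:
            disagreements += 1
    return disagreements > len(peer_estimates) / 2


def remedial_action(compromised: bool) -> str:
    """Pick a simple remedial action, such as diverting to a safe landing site."""
    return "divert_to_safe_landing_site" if compromised else "continue_mission"


if __name__ == "__main__":
    own = Telemetry(47.6, -122.3, 900.0, 90.0)   # own altitude reading looks off
    peers = [Telemetry(47.6, -122.3, 120.0, 91.0),
             Telemetry(47.6, -122.3, 118.0, 89.0)]
    print(remedial_action(is_compromised(own, peers)))  # divert_to_safe_landing_site
```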

Proceedings ArticleDOI
03 May 2021
TL;DR: The Speech processing Universal PERformance Benchmark (SUPERB) as discussed by the authors is a leaderboard to benchmark the performance of a shared model across a wide range of speech processing tasks with minimal architecture changes and labeled data.
Abstract: Self-supervised learning (SSL) has proven vital for advancing research in natural language processing (NLP) and computer vision (CV). The paradigm pretrains a shared model on large volumes of unlabeled data and achieves state-of-the-art (SOTA) for various tasks with minimal adaptation. However, the speech processing community lacks a similar setup to systematically explore the paradigm. To bridge this gap, we introduce Speech processing Universal PERformance Benchmark (SUPERB). SUPERB is a leaderboard to benchmark the performance of a shared model across a wide range of speech processing tasks with minimal architecture changes and labeled data. Among multiple usages of the shared model, we especially focus on extracting the representation learned from SSL due to its preferable re-usability. We present a simple framework to solve SUPERB tasks by learning task-specialized lightweight prediction heads on top of the frozen shared model. Our results demonstrate that the framework is promising as SSL representations show competitive generalizability and accessibility across SUPERB tasks. We release SUPERB as a challenge with a leaderboard and a benchmark toolkit to fuel the research in representation learning and general speech processing.

138 citations
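The framework the authors describe, training lightweight task-specific heads on top of a frozen shared model, can be sketched as follows. This is a hypothetical PyTorch illustration with a placeholder encoder and dimensions; it is not the SUPERB toolkit itself.

```python
# Hypothetical PyTorch sketch of the recipe described above: freeze a
# pretrained shared model and train only a lightweight task-specific
# prediction head on its representations. The encoder and dimensions
# are placeholders, not the actual SUPERB toolkit.
import torch
import torch.nn as nn


class StandInSharedModel(nn.Module):
    """Placeholder for a pretrained SSL encoder returning (batch, time, dim)."""

    def __init__(self, input_dim: int = 80, feature_dim: int = 256):
        super().__init__()
        self.gru = nn.GRU(input_dim, feature_dim, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.gru(x)[0]


class LightweightHead(nn.Module):
    """Small prediction head trained on top of the frozen representations."""

    def __init__(self, feature_dim: int, num_classes: int):
        super().__init__()
        self.proj = nn.Linear(feature_dim, num_classes)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Mean-pool over time, then classify (e.g. a keyword-spotting task).
        return self.proj(features.mean(dim=1))


def build_downstream_model(shared: nn.Module, feature_dim: int, num_classes: int):
    # Freeze the shared model so only the head receives gradient updates.
    for param in shared.parameters():
        param.requires_grad = False
    head = LightweightHead(feature_dim, num_classes)
    optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
    return head, optimizer


if __name__ == "__main__":
    shared = StandInSharedModel()
    head, optimizer = build_downstream_model(shared, feature_dim=256, num_classes=10)

    features = shared(torch.randn(4, 100, 80))  # frozen forward pass
    logits = head(features)                     # trainable head
    print(logits.shape)                         # torch.Size([4, 10])
```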


Authors


Name                                  H-index   Papers   Citations
Jiawei Han                            168       1,233    143,427
Bernhard Schölkopf                    148       1,092    149,492
Christos Faloutsos                    127       789      77,746
Alexander J. Smola                    122       434      110,222
Rama Chellappa                        120       1,031    62,865
William F. Laurance                   118       470      56,464
Andrew McCallum                       113       472      78,240
Michael J. Black                      112       429      51,810
David Heckerman                       109       483      62,668
Larry S. Davis                        107       693      49,714
Chris M. Wood                         102       795      43,076
Pietro Perona                         102       414      94,870
Guido W. Imbens                       97        352      64,430
W. Bruce Croft                        97        426      39,918
Chunhua Shen                          93        681      37,468
Network Information
Related Institutions (5)
Microsoft: 86.9K papers, 4.1M citations (89% related)
Google: 39.8K papers, 2.1M citations (88% related)
Carnegie Mellon University: 104.3K papers, 5.9M citations (87% related)
ETH Zurich: 122.4K papers, 5.1M citations (82% related)
University of Maryland, College Park: 155.9K papers, 7.2M citations (82% related)

Performance Metrics
No. of papers from the Institution in previous years
Year    Papers
2023    4
2022    168
2021    2,015
2020    2,596
2019    2,002
2018    1,189