
Showing papers by "Amazon.com" published in 2022


Journal ArticleDOI
TL;DR: A survey of the application of deep learning techniques in NLP, with a focus on the tasks where deep learning has shown the strongest impact.

123 citations


Journal ArticleDOI
TL;DR: A survey of the state-of-the-art semantic networks for engineering design and propositions of future research to build and utilize large-scale semantic networks as knowledge bases to support engineering design research and practice are provided.
Abstract: In the past two decades, there has been increasing use of semantic networks in engineering design for supporting various activities, such as knowledge extraction, prior art search, idea generation and evaluation. Leveraging large-scale pre-trained graph knowledge databases to support engineering design-related natural language processing (NLP) tasks has attracted a growing interest in the engineering design research community. Therefore, this paper aims to provide a survey of the state-of-the-art semantic networks for engineering design and propositions of future research to build and utilize large-scale semantic networks as knowledge bases to support engineering design research and practice. The survey shows that WordNet, ConceptNet and other semantic networks, which contain common-sense knowledge or are trained on non-engineering data sources, are primarily used by engineering design researchers to develop methods and tools. Meanwhile, there are emerging efforts in constructing engineering and technical-contextualized semantic network databases, such as B-Link and TechNet, through retrieving data from technical data sources and employing unsupervised machine learning approaches. On this basis, we recommend six strategic future research directions to advance the development and uses of large-scale semantic networks for artificial intelligence applications in engineering design.

28 citations


Journal ArticleDOI
TL;DR: In this article, the authors determine the properties of copper mining wastes generated in the eastern Amazon and their potential risks to the environment and human health, and demonstrate that only the artisanal rock waste is associated with environmental risk.

26 citations


Journal ArticleDOI
Tim Januschowski1
TL;DR: The prevalence of gradient boosted trees among the top contestants in the M5 competition is potentially the most eye-catching result discussed in this paper: tree-based methods outshone other solutions, in particular deep learning-based ones.

15 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that anthropogenic activities may have increased the concentrations of potentially toxic elements (PTEs) in fish from the southeastern Carajas Mineral Province in Brazil, a region that had not previously been studied.

15 citations


Journal ArticleDOI
TL;DR: This work measures fairness according to Demographic Parity, requiring the probability of the model decisions to be independent of the sensitive information, and investigates how to impose this constraint in the different layers of deep neural networks for complex data, with particular reference to deep networks for graph and face recognition.

13 citations
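Demographic Parity, as used in the TL;DR above, requires the model's positive-decision rate to be independent of the sensitive attribute. A minimal sketch of measuring the violation (the decision and group data below are invented for illustration, not from the paper):

```python
# Demographic Parity gap: difference in positive-decision rates between groups.
import numpy as np

def demographic_parity_gap(decisions, sensitive):
    """Absolute difference in positive-decision rates between two groups (0/1)."""
    decisions = np.asarray(decisions)
    sensitive = np.asarray(sensitive)
    rate_a = decisions[sensitive == 0].mean()
    rate_b = decisions[sensitive == 1].mean()
    return abs(rate_a - rate_b)

# A model that accepts 3/4 of group 0 but only 1/4 of group 1:
gap = demographic_parity_gap([1, 1, 1, 0, 1, 0, 0, 0],
                             [0, 0, 0, 0, 1, 1, 1, 1])
print(gap)  # 0.5
```

The paper goes further, imposing this constraint inside the layers of deep networks; the sketch only shows the quantity being constrained.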


Journal ArticleDOI
TL;DR: In this paper, a deep learning model (i.e. superpixel-based Residual Networks 50, SP-ResNet50) was used to automatically differentiate leaves from non-leaves in phenocam images and to derive leaf fraction at the tree-crown scale.
Abstract: Tropical leaf phenology—particularly its variability at the tree-crown scale—dominates the seasonality of carbon and water fluxes. However, given enormous species diversity, accurate means of monitoring leaf phenology in tropical forests is still lacking. Time series of the Green Chromatic Coordinate (GCC) metric derived from tower-based red–green–blue (RGB) phenocams have been widely used to monitor leaf phenology in temperate forests, but its application in the tropics remains problematic. To improve monitoring of tropical phenology, we explored the use of a deep learning model (i.e. superpixel-based Residual Networks 50, SP-ResNet50) to automatically differentiate leaves from non-leaves in phenocam images and to derive leaf fraction at the tree-crown scale. To evaluate our model, we used a year of data from six phenocams in two contrasting forests in Panama. We first built a comprehensive library of leaf and non-leaf pixels across various acquisition times, exposure conditions and specific phenocams. We then divided this library into training and testing components. We evaluated the model at three levels: 1) superpixel level with a testing set, 2) crown level by comparing the model-derived leaf fractions with those derived using image-specific supervised classification, and 3) temporally using all daily images to assess the diurnal stability of the model-derived leaf fraction. Finally, we compared the model-derived leaf fraction phenology with leaf phenology derived from GCC. Our results show that: 1) the SP-ResNet50 model accurately differentiates leaves from non-leaves (overall accuracy of 93%) and is robust across all three levels of evaluations; 2) the model accurately quantifies leaf fraction phenology across tree-crowns and forest ecosystems; and 3) the combined use of leaf fraction and GCC helps infer the timing of leaf emergence, maturation and senescence, critical information for modeling photosynthetic seasonality of tropical forests.
Collectively, this study offers an improved means for automated tropical phenology monitoring using phenocams.

11 citations
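The GCC metric used above has a simple closed form: the green channel's share of total pixel intensity, GCC = G / (R + G + B). A minimal sketch (the image values are invented; real use would read phenocam frames):

```python
# Green Chromatic Coordinate: mean green share of RGB intensity per image.
import numpy as np

def gcc(rgb):
    """Mean Green Chromatic Coordinate of an RGB image array of shape (H, W, 3)."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1)
    # Guard against division by zero on pure-black pixels.
    green = np.divide(rgb[..., 1], total,
                      out=np.zeros_like(total), where=total > 0)
    return green.mean()

# A uniformly mid-green image: G dominates, so GCC rises above 1/3.
img = np.full((4, 4, 3), [40, 120, 40], dtype=float)
print(round(gcc(img), 3))  # 0.6
```

A fully gray image would score exactly 1/3; leaf green-up pushes the value higher, which is why GCC time series track phenology.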


Journal ArticleDOI
TL;DR: In this article, the authors show how document-ordered indexes can be organized such that they can be queried in an anytime fashion, enabling strict latency control with effective early termination.
Abstract: Inverted indexes continue to be a mainstay of text search engines, allowing efficient querying of large document collections. While there are a number of possible organizations, document-ordered indexes are the most common, since they are amenable to various query types, support index updates, and allow for efficient dynamic pruning operations. One disadvantage with document-ordered indexes is that high-scoring documents can be distributed across the document identifier space, meaning that index traversal algorithms that terminate early might put search effectiveness at risk. The alternative is impact-ordered indexes, which primarily support top-k disjunctions but also allow for anytime query processing, where the search can be terminated at any time, with search quality improving as processing latency increases. Anytime query processing can be used to effectively reduce high-percentile tail latency that is essential for operational scenarios in which a service level agreement (SLA) imposes response time requirements. In this work, we show how document-ordered indexes can be organized such that they can be queried in an anytime fashion, enabling strict latency control with effective early termination. Our experiments show that processing document-ordered topical segments selected by a simple score estimator outperforms existing anytime algorithms, and allows query runtimes to be accurately limited to comply with SLA requirements.

9 citations
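The anytime idea can be sketched as budgeted early termination over score-ordered segments. This is only an illustration of the control flow, not the paper's algorithm; the segment layout, score estimates, and latency budget below are all invented:

```python
# Anytime top-k retrieval sketch: visit index segments best-estimate-first,
# stop when the latency budget expires, so quality degrades gracefully
# instead of the query blowing its SLA.
import heapq
import time

def anytime_topk(segments, k, budget_s):
    """segments: list of (score_estimate, [(doc_id, score), ...]) pairs,
    visited in decreasing order of estimate until the budget runs out."""
    deadline = time.monotonic() + budget_s
    heap = []  # min-heap of (score, doc_id), size <= k
    for _, postings in sorted(segments, key=lambda s: -s[0]):
        if time.monotonic() >= deadline:
            break  # anytime early termination
        for doc_id, score in postings:
            if len(heap) < k:
                heapq.heappush(heap, (score, doc_id))
            elif score > heap[0][0]:
                heapq.heapreplace(heap, (score, doc_id))
    return sorted(heap, reverse=True)

segments = [(9.0, [(1, 8.7), (4, 6.1)]), (5.0, [(2, 4.9), (7, 3.3)])]
print(anytime_topk(segments, 2, budget_s=0.05))  # [(8.7, 1), (6.1, 4)]
```

The key property: shrinking `budget_s` can only truncate the tail of the segment visit order, so the highest-estimate segments are always processed first.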


Journal ArticleDOI
Engin Er1
TL;DR: In this article, the authors discuss the worldwide epidemiology of shrimp allergy and the properties of shrimp's main allergens, and summarize the latest findings on novel processing methods to lower the allergenicity of shrimp, including microwave, ultrasound, pulsed light, cold plasma, fermentation, enzymatic hydrolysis, high pressure, and combinations of several processing methods.
Abstract: Shrimp, belonging to the class Malacostraca and specifically to the order Decapod, is a common species in crustaceans with abundant nutrients, such as protein, amino acids, minerals, unsaturated fatty acids, vitamins, astaxanthin, and antioxidants. However, shrimp is considered as one of the “big eight” allergenic foods, which leads to a series of allergic reactions from mild to life-threatening in shrimp-allergic individuals. Tropomyosin is identified as shrimp's major allergen. Food processing techniques induce the structural changes in allergens to further affect shrimp allergenicity. Compared to conventional treatments (e.g. heating, steaming), novel processing treatments show superior effects in reducing shrimp allergenicity and retaining the nutritional value and sensory quality. This review discusses the epidemiology worldwide of shrimp allergy and the properties of shrimp's main allergens. It summarizes the latest findings of novel processing methods to lower the allergenicity of shrimps, including microwave, ultrasound, pulsed light, cold plasma, fermentation, enzymatic hydrolysis, high pressure, and a combination of several processing methods. Besides, current strategies and future therapies for patients are also discussed to better manage shrimp allergy. Shrimp allergy is a critical health issue with globally increasing prevalence. In the food industry, non-thermal processing techniques can modify allergen structures and meanwhile maintain shrimp physiochemical properties better than thermal techniques. Ultrasound processing is the most commonly discussed technique that effectively reduces shrimp allergenicity. Compared to when the novel processing technique was applied alone, a combination of several processing techniques applied sequentially can reduce shrimp allergenicity more efficiently and effectively. 
To date, studies focusing on the application of novel processing techniques to reduce/modify shrimp allergens are still scarce, and improved processing efficiency is also required in further research.
• Non-thermal methods can retain shrimp physiochemical properties better.
• Heat-stable tropomyosin can be modified by high-intensity thermal treatments.
• Ultrasound is the most common processing method to reduce shrimp allergenicity.
• A combination of several processing techniques can be more efficient and effective.

8 citations


Journal ArticleDOI
Sekino, Nozomu1
TL;DR: In this paper, the authors extend k-means clustering to group assets by risk prices and introduce a formal test for whether differences in risk premiums across market segments are too large to occur by chance.
Abstract: Abstract Equal compensation across assets for the same risk exposures is a bedrock of asset pricing theory and empirics. Yet real-world frictions can violate this equality and create apparently high Sharpe ratio opportunities. We develop new methods for asset pricing with cross-sectional heterogeneity in compensation for risk. We extend k-means clustering to group assets by risk prices and introduce a formal test for whether differences in risk premiums across market segments are too large to occur by chance. We find significant evidence of cross-sectional variation in risk prices for almost all combinations of test assets, factor models, and time periods considered.

7 citations
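The clustering step can be illustrated with plain Lloyd's-iteration k-means over factor exposures. This is a sketch of the general idea only, not the paper's extended method or its formal test; the exposure values and seed indices are invented:

```python
# Group assets into market segments by clustering their factor exposures.
import numpy as np

def kmeans(X, init, iters=50):
    """Plain Lloyd's iterations; `init` gives row indices of the seed centers.
    (Empty clusters are not handled in this sketch.)"""
    centers = X[list(init)].astype(float).copy()
    for _ in range(iters):
        labels = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(len(init))])
    return labels

# Invented exposures for six assets forming two clearly separated segments.
X = np.array([[1.0, 0.2], [1.1, 0.1], [0.9, 0.3],   # segment A
              [3.0, 2.0], [3.2, 1.9], [2.8, 2.1]])  # segment B
labels = kmeans(X, init=(0, 3))
print(labels)  # the two triples land in different clusters
```

The paper's contribution is grouping assets by estimated *risk prices* and then testing whether cross-segment premium differences exceed chance; the sketch shows only the grouping mechanic.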


Journal ArticleDOI
TL;DR: In this paper, a flexible, transparent RRAM device based on CsPbBr3 quantum dots (QDs) mixed into graphene oxide (GO) is introduced, which is controllable by both an electric field and illumination.
Abstract: Perovskite resistive random-access memory (RRAM) is a promising candidate for next-generation logic, adaptive and nonvolatile memory devices, because of its high ON/OFF ratio, low-cost fabrication, and good photoelectric regulation performance. In this work, a flexible transparent CsPbBr3 quantum dots (QDs) mixed in graphene oxide (GO) RRAM device is introduced, which is controllable by both an electric field and illumination. Under illumination, the ON/OFF ratio of the Ag/CsPbBr3 QDs:GO/ITO device is ≈1.4 × 10⁷, which is 1077 times larger than that in the dark condition (1.3 × 10⁴). The SET/RESET voltages are +2.28/−2.04 V and +1.68/−1.08 V under the dark and illumination conditions, respectively. As a flexible memory device, the resistances are little affected by bending curvature and load-cycling. Before and after 10⁴ bending cycles with a radius of 5 mm under illumination, the ON/OFF ratios remain of the same order of magnitude, at 2.5 × 10⁷ and 2.3 × 10⁷, respectively. The corresponding values are 8.8 × 10⁴ and 2.9 × 10⁴ under the dark condition. This resistive memory based on the CsPbBr3 QDs:GO hybrid film opens a broad design space for the development of photoelectrically dual-controlled flexible RRAM devices.

Journal ArticleDOI
TL;DR: Saikosaponin D (SSD), a triterpenoid saponin with many biological activities including anti-inflammatory effects and antioxidant properties, provides protection against pathologic cardiac remodeling and fibrosis, as discussed in this paper.
Abstract: As a highly efficient anticancer agent, doxorubicin (DOX) is used for treatment of various cancers, but DOX-induced oxidative damages contribute to a degenerative irreversible cardiac toxicity. Saikosaponin D (SSD), which is a triterpenoid saponin with many biological activities including anti-inflammatory effects and antioxidant properties, provides protection against pathologic cardiac remodeling and fibrosis. In the present study, we investigated the work of SSD for DOX-induced cardiotoxicity and the involved mechanisms. We observed that DOX injection induced cardiac injury and malfunction and decreased survival rate. Besides, DOX treatment increased lactate dehydrogenase leakage, cardiomyocyte apoptosis, and myocardium fibrosis and decreased the size of cardiomyocytes. Meanwhile, all the effects were notably attenuated by SSD treatment. In vitro, we found that 1 μM SSD could enhance the proliferation of H9c2 cells and inhibit DOX-induced apoptosis. It was found that the levels of malondialdehyde (MDA) and reactive oxygen species were significantly reduced by improving the activities of the endogenous antioxidative enzymes including catalase and glutathione peroxidase. Furthermore, SSD treatment could downregulate the DOX-induced p38 phosphorylation. Our results suggested that SSD efficiently protected the cardiomyocytes from DOX-induced cardiotoxicity by inhibiting the excessive oxidative stress via p38-MAPK (mitogen-activated protein kinase, MAPK) signaling pathway.

Journal ArticleDOI
TL;DR: In this article, Zhang et al. reported zircon U-Pb ages, bulk-rock geochemical and zircon Hf isotopic data on the orthogneisses in the Milin area of the southeastern Lhasa terrane, southern Tibet.

Journal ArticleDOI
TL;DR: In this article, a temporal convolutional neural network (TCN) and a Transformer-based architecture were proposed to detect and count simultaneous, overlapping speakers in a multichannel, distant-microphone scenario.

Book ChapterDOI
TL;DR: A distributed system that uses an efficient evolutionary algorithm to design a modular autoencoder that beats random search by nearly an order of magnitude on both tasks while achieving near linear horizontal scaling as additional worker nodes are added to the system.
Abstract: Autoencoders have seen wide success in domains ranging from feature selection to information retrieval. Despite this success, designing an autoencoder for a given task remains a challenging undertaking due to the lack of firm intuition on how the backing neural network architectures of the encoder and decoder impact the overall performance of the autoencoder. In this work we present a distributed system that uses an efficient evolutionary algorithm to design a modular autoencoder. We demonstrate the effectiveness of this system on the tasks of manifold learning and image denoising. The system beats random search by nearly an order of magnitude on both tasks while achieving near linear horizontal scaling as additional worker nodes are added to the system.
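A toy sketch of the evolutionary search idea follows. The architecture encoding (layer-width tuples), mutation operator, and fitness function are all invented stand-ins; a real system, like the one described above, would train each candidate autoencoder and score its reconstruction error, distributing that work across worker nodes:

```python
# Evolutionary architecture search sketch: mutate layer-width encodings,
# keep the fitter half each generation, return the best individual.
import random

def fitness(widths):
    # Stand-in objective: prefer a bottleneck near 8 units, penalize total size.
    return -(abs(min(widths) - 8) + 0.1 * sum(widths))

def mutate(widths, rng):
    i = rng.randrange(len(widths))
    out = list(widths)
    out[i] = max(2, out[i] + rng.choice([-4, 4]))  # widths never drop below 2
    return tuple(out)

def evolve(pop_size=12, gens=40, seed=1):
    rng = random.Random(seed)
    pop = [tuple(rng.randrange(2, 64) for _ in range(3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Because the top half survives unchanged each generation, best-so-far fitness never decreases; the expensive part in practice is evaluating `fitness`, which is exactly what the chapter's distributed system parallelizes.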

Journal ArticleDOI
TL;DR: In this article, the authors considered a retailer with an online store and a network of stores operating in an omni-channel strategy, where the fulfillment decision for an online order, which contains a number of items, involves the allocation of these items to the stores where they are available and the selection of one store for consolidation of the items into the final package to be dispatched to the customer.


Journal ArticleDOI
01 Jan 2022-Catena
TL;DR: In this article, the impacts of pastoral use systems on C and N stocks and the natural abundance of ¹³C (δ¹³C) in the fractions of soil organic matter in the Brazilian Amazon biome were evaluated.
Abstract: Changes in land use and management are the second largest cause of greenhouse gas emissions (E-GHG) into the atmosphere, so it is important to better understand agricultural systems regarding C and N stocks, which are directly associated with soil quality. The objective of this work was to evaluate the impacts of pastoral use systems on C and N stocks and the natural abundance of ¹³C (δ¹³C) in the fractions of soil organic matter in the Brazilian Amazon biome. The pastoral systems evaluated involved two silvopastoral systems with 30% (SP30) and 60% (SP60) shading; two full-sun pasture systems, one in use under intensive management (IMP) and the other in fallow and considered degraded (DP); as well as a control area under forest (NV). The dominant grass of the pasture areas was Mombasa grass (Megathyrsus maximus). The experimental design consisted of four replicates in which the collection sites (trenches) were systematically distributed within the areas (SP30, SP60, NV, DP and IMP) and the layers (0–5, 5–15, 15–30, 30–60 and 60–100 cm). The silvopastoral system with 60% shading had the greatest carbon stock, even greater than the control forest at some depths. In comparison to the control condition (NV) in the soil profile, SP30, DP and IMP reduced C-total stocks by 24%, 17% and 20%, and N-total by 14%, 10% and 18%, respectively. The δ¹³C values were higher in the IMP and DP systems and lower in SP60 and NV, both in organic matter associated with minerals (MaOM) and particulate organic matter (POM). An isotopic enrichment occurred in all systems in the soil profile.

Journal ArticleDOI
TL;DR: In this article, phase alignment between a target and its background was found to produce a local contrast signal that facilitates detection when target-background similarity is high.
Abstract: The sensitivity of the human visual system is thought to be shaped by environmental statistics. A major endeavor in vision science, therefore, is to uncover the image statistics that predict perceptual and cognitive function. When searching for targets in natural images, for example, it has recently been proposed that target detection is inversely related to the spatial similarity of the target to its local background. We tested this hypothesis by measuring observers' sensitivity to targets that were blended with natural image backgrounds. Targets were designed to have a spatial structure that was either similar or dissimilar to the background. Contrary to masking from similarity, we found that observers were most sensitive to targets that were most similar to their backgrounds. We hypothesized that a coincidence of phase alignment between target and background results in a local contrast signal that facilitates detection when target-background similarity is high. We confirmed this prediction in a second experiment. Indeed, we show that, by solely manipulating the phase of a target relative to its background, the target can be rendered easily visible or undetectable. Our study thus reveals that, in addition to its structural similarity, the phase of the target relative to the background must be considered when predicting detection sensitivity in natural images.
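The phase effect described above can be illustrated with two sinusoids: blending a target in phase with its background raises peak local contrast, while the opposed phase partially cancels it. The signals are invented one-dimensional stand-ins for the paper's image stimuli:

```python
# Phase alignment demo: identical target amplitude, very different contrast.
import numpy as np

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
background = np.sin(x)
target_aligned = 0.5 * np.sin(x)            # same phase as background
target_opposed = 0.5 * np.sin(x + np.pi)    # opposite phase

aligned = background + target_aligned       # peaks reinforce: amplitude 1.5
opposed = background + target_opposed       # peaks cancel:    amplitude 0.5
print(aligned.max(), opposed.max())
```

Only the target's phase changed between the two blends, yet the aligned case has three times the peak amplitude, mirroring the paper's finding that phase, not just structural similarity, governs detectability.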

Journal ArticleDOI
Shuqing Chen1
TL;DR: This paper explores the spatial effects of IPR protection on city-level carbon dioxide (CO2) emissions through research and development (R&D) investment, foreign direct investment (FDI) technology spillover, and intercity technology spillover (DS).
Abstract: Technological progress is vital for China to reduce carbon emissions. A comprehensive understanding of the relationship between intellectual property rights (IPR) protection and city carbon emissions can help China achieve its carbon peak and carbon neutrality goals. This study focuses on exploring the spatial effects of IPR protection on city-level carbon dioxide (CO2) emissions through research and development (R&D) investment, foreign direct investment (FDI) technology spillover, and intercity technology spillover (DS). This work developed an extended Environmental Kuznets Curve model to investigate the influencing mechanism of IPR protection using the geographically weighted regression Kriging method. Our study period covered 2005, 2009, 2013, and 2017. We identified spatial correlation characteristics across cities and found that the impact of IPR protection on carbon emissions through R&D investment first decreased and then increased. IPR protection through DS had a significant inhibiting effect on CO2 emissions. The impact of IPR protection on carbon emissions through FDI fluctuated up and down. Spatial coordination in IPR protection, technology convergence between cities, infrastructure construction, and a strict negative list for foreign investment all contributed to city carbon-emission reduction.

Journal ArticleDOI
Engin Er1
05 Apr 2022-Robotica
TL;DR: In this paper, a novel underactuated positioning system built from different sets of linear motion units (defined as positioning lines) is proposed, enabling multiple degree-of-freedom manipulators to be actuated with one motor.
Abstract: Parallel manipulators are increasingly utilized in extensive industrial applications due to their high accuracy, compact structure, and significant stiffness characteristics. However, many actuators are usually involved in constructing and controlling a parallel manipulator, which burdens structure design and controller development. In this paper, a novel underactuated positioning system built from different sets of linear motion units (defined as positioning lines) is proposed, enabling multiple degree-of-freedom manipulators to be actuated with one motor. To achieve this, a smart shape memory alloy (SMA) clutch is presented to obtain the positioning function of each positioning line. Further, to obtain decoupled motion regulation of the positioning lines, a new thermal kinematic model of the SMA clutch, which considers the influence of heat dissipation on the metal components, was built and validated with physical prototypes. The experimental results show that the constitutive model of the SMA clutch developed in this paper can be validated within an error of 5.3%. It is also found that the heat dissipation of the metal component has a significant influence on the model accuracy of the SMA clutch (i.e., 2.6% of the model accuracy). The experiments on the underactuated positioning system produce the following results: a single positioning line can achieve high positioning (average error: 1.01%) and tracking (average error ≤ 1 mm) accuracy; the underactuated positioning system can perform decoupled motions in the three positioning lines with high accuracy (±2 mm within a stroke of 180 mm).

Journal ArticleDOI
TL;DR: In this article, the authors proposed a smart home architecture composed of an Internet of Things (IoT), people, and physical content, which can provide digital services to optimize space use and enhance user experience.
Abstract: Smart spaces such as smart homes deliver digital services to optimize space use and enhance user experience. They are composed of an Internet of Things (IoT), people, and physical content. They dif...

Journal ArticleDOI
TL;DR: In this paper, the authors present simulations of the seismic liquefaction response of dense and loose clean Ottawa sand under low and high overburden in the centrifuge, using Program FLAC3D and the P2Psand constitutive model.


Journal ArticleDOI
Andrew S. Richman1
02 Sep 2022-Medicine
TL;DR: In this article, a multivariate logistic regression analysis was performed to identify independent risk factors for surgical site infection (SSI) in patients with upper extremity fracture; intraoperative blood loss of more than 135 mL was among the factors identified.

Journal ArticleDOI
TL;DR: In this article, the authors analyze the current situation of university teachers' informationized teaching, the significance of improving it, and strategies for promoting teachers' informationized-teaching ability.
Abstract: The 13th Five-Year Plan clearly proposes to enhance teachers’ ability of informationized teaching, so as to make informationized teaching a routine mode. In 2020, in order to prevent and control COVID-19, colleges launched online teaching activities, which required the use of information technology in teaching and also tested teachers’ informationized-teaching ability. In this context, teachers should change their teaching ideology and improve their informationized-teaching ability in an all-round way. Firstly, this paper analyzes the current situation of university teachers’ informationized teaching. Secondly, it analyzes the significance of improving the informationized-teaching ability of university teachers. Finally, it proposes strategies for promoting university teachers’ informationized-teaching ability.

Book ChapterDOI
Ammar Abbas1
01 Jan 2022

Journal ArticleDOI
Ruoyuan Gao1
TL;DR: In this article, the authors proposed a new metric called FAIR, which unifies standard IR metrics and fairness measures into an integrated metric, and developed an effective ranking algorithm that jointly optimized user utility and fairness.
Abstract: With the emerging needs of creating fairness-aware solutions for search and recommendation systems, a daunting challenge exists of evaluating such solutions. While many of the traditional information retrieval (IR) metrics can capture the relevance, diversity, and novelty for the utility with respect to users, they are not suitable for inferring whether the presented results are fair from the perspective of responsible information exposure. On the other hand, existing fairness metrics do not account for user utility or do not measure it adequately. To address this problem, we propose a new metric called FAIR. By unifying standard IR metrics and fairness measures into an integrated metric, this metric offers a new perspective for evaluating fairness-aware ranking results. Based on this metric, we developed an effective ranking algorithm that jointly optimized user utility and fairness. The experimental results showed that our FAIR metric could highlight results with good user utility and fair information exposure. We showed how FAIR related to a set of existing utility and fairness metrics and demonstrated the effectiveness of our FAIR-based algorithm. We believe our work opens up a new direction of pursuing a metric for evaluating and implementing the FAIR systems.
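As a generic illustration of unifying utility and fairness into one score (not the paper's exact FAIR formulation, whose definition is in the article itself), one can combine a normalized DCG-style utility with a group-exposure balance term via a harmonic mean. The relevance values and group labels below are invented:

```python
# Combine ranking utility (normalized DCG) with positional-exposure balance.
import math

def dcg(relevances):
    return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(relevances))

def exposure_balance(groups):
    """1.0 when positional exposure is split evenly between two groups (0/1)."""
    exp = {0: 0.0, 1: 0.0}
    for rank, g in enumerate(groups):
        exp[g] += 1 / math.log2(rank + 2)  # same position-discount as DCG
    total = exp[0] + exp[1]
    return 1 - abs(exp[0] - exp[1]) / total

def combined(relevances, groups):
    ideal = dcg(sorted(relevances, reverse=True))
    util = dcg(relevances) / ideal if ideal else 0.0
    fair = exposure_balance(groups)
    return 2 * util * fair / (util + fair)  # harmonic mean of the two

# Perfect relevance ordering with alternating groups scores well on both axes:
print(round(combined([3, 2, 1, 0], [0, 1, 0, 1]), 3))
```

The harmonic mean ensures neither axis can be traded away entirely: a ranking with perfect utility but zero exposure balance (or vice versa) scores near zero, which is the qualitative behavior the abstract describes for a joint utility-fairness metric.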