Institution
Alibaba Group
Company • Hangzhou, China
About: Alibaba Group is a company based in Hangzhou, China. It is known for its research contributions in the topics of Computer science and Terminal (electronics). The organization has 6810 authors who have published 7389 publications receiving 55653 citations. The organization is also known as: Alibaba Group Holding Limited & Alibaba Group (Cayman Islands).
Topics: Computer science, Terminal (electronics), Graph (abstract data type), Node (networking), Deep learning
Papers published on a yearly basis
Papers
23 Aug 2020 • TL;DR: This work proposes a training strategy that combines reinforcement learning with supervised learning for a model based on a graph convolutional network, which takes node features and edge features as input and embeds them.
Abstract: Our model is based on the graph convolutional network (GCN) with node features (coordinates and demand) and edge features (the real distance between nodes) as input, which are then embedded. Separate decoders are proposed to decode the representations of these two features. The output of one decoder is the supervision of the other decoder. We propose a strategy that combines reinforcement learning with supervised learning to train the model. Through comprehensive experiments on real-world data, we show that 1) the edge feature is important to consider explicitly in the model; 2) the joint learning strategy can accelerate the convergence of training and improve solution quality; 3) our model significantly outperforms several well-known algorithms in the literature, especially when the problem size is large; 4) our model generalizes beyond the size of the problem instances it was trained on.
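As a rough illustration of the joint training idea in the abstract above, here is a minimal NumPy sketch. All dimensions, weights, and the toy reward are made up for illustration; this is not the paper's actual architecture, only the pattern of one GCN over node and edge features feeding two decoders, with one decoder's output supervising the other and an RL-style loss added on top.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: 5 nodes with (x, y, demand) features and a
# pairwise real-distance matrix as the edge feature, as in the abstract.
n = 5
node_feat = rng.random((n, 3))            # coordinates + demand
edge_feat = rng.random((n, n))            # real distances between nodes
np.fill_diagonal(edge_feat, 0.0)

def gcn_layer(h, dist, w):
    """One message-passing step: neighbors weighted by (negative) distance."""
    attn = np.exp(-dist)                  # closer nodes contribute more
    attn /= attn.sum(axis=1, keepdims=True)
    return np.tanh(attn @ h @ w)

d = 8
w_embed = rng.normal(scale=0.1, size=(3, d))
w_gcn = rng.normal(scale=0.1, size=(d, d))

h = np.tanh(node_feat @ w_embed)          # embed node features
h = gcn_layer(h, edge_feat, w_gcn)        # mix in the edge (distance) feature

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Two separate decoders over the shared representation; per the abstract,
# the output of one decoder supervises the other.
w_dec1 = rng.normal(scale=0.1, size=(d, n))
w_dec2 = rng.normal(scale=0.1, size=(d, n))
p1 = softmax(h @ w_dec1)                  # policy decoder (trained with RL)
p2 = softmax(h @ w_dec2)                  # auxiliary decoder (supervised)

# Joint-objective sketch: a REINFORCE-style policy loss plus a supervised
# cross-entropy that pulls decoder 2 toward decoder 1's output.
actions = p1.argmax(axis=1)
reward = -edge_feat[np.arange(n), actions].sum()   # shorter hops = higher reward
policy_loss = -reward * np.log(p1[np.arange(n), actions]).sum()
supervised_loss = -(p1 * np.log(p2 + 1e-9)).sum()  # decoder 1 supervises decoder 2
total_loss = policy_loss + supervised_loss
```

In a real implementation both losses would be backpropagated jointly each step, which is what the abstract credits for faster convergence and better solutions.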
39 citations
27 Feb 2016 • TL;DR: Findings reveal that Chinese cultural beliefs and Chinese beliefs about traditional medicine significantly affect patients' understandings of depression, illness management, and social interactions; the paper draws implications for how Chinese society as a whole may respond to the misunderstanding of mental illness and raise public awareness.
Abstract: More than 350 million people worldwide suffer from depression. Major depressive disorder has a hugely negative impact on psychological well-being, work, and family life. Yet culture may shape how depressed patients interpret their symptoms, choose treatments, and behave. This paper reports a case study, including participant observations and interviews, of the Chinese online depression community, SunForum. Our findings reveal that Chinese cultural beliefs (e.g., the power of inner self-control) and Chinese beliefs about traditional medicine (e.g., the integrated body-mind relationship) significantly affect patients' understandings of depression, illness management, and social interactions. These beliefs create problems of understanding depression in society, including among family members, friends, co-workers, and others, and present various challenges for depressed patients, who can become marginalized, suffer discrimination, and lose their jobs. We draw implications for how Chinese society as a whole may respond to the misunderstanding of mental illness and raise public awareness. We also propose specific social media designs to support depressed patients as they seek online information and social support.
39 citations
TL;DR: This work explores a new way to generate personalized product descriptions by combining the power of neural networks and knowledge bases, proposing a KnOwledge Based pErsonalized (or KOBE) product description generation model in the context of e-commerce.
Abstract: Quality product descriptions are critical for providing a competitive customer experience on an e-commerce platform. An accurate and attractive description not only helps customers make an informed decision but also improves the likelihood of purchase. However, crafting a successful product description is tedious and highly time-consuming. Due to its importance, automating product description generation has attracted considerable interest from both the research and industrial communities. Existing methods mainly use templates or statistical methods, and their performance can be rather limited. In this paper, we explore a new way to generate personalized product descriptions by combining the power of neural networks and knowledge bases. Specifically, we propose a KnOwledge Based pErsonalized (or KOBE) product description generation model in the context of e-commerce. In KOBE, we extend the encoder-decoder framework, the Transformer, to a sequence modeling formulation using self-attention. In order to make the description both informative and personalized, KOBE considers a variety of important factors during text generation, including product aspects, user categories, and knowledge bases. Experiments on real-world datasets demonstrate that the proposed method outperforms the baselines on various metrics. KOBE achieves an improvement of 9.7% over the state of the art in terms of BLEU. We also present several case studies as anecdotal evidence to further demonstrate the effectiveness of the proposed approach. The framework has been deployed in Taobao, the largest online e-commerce platform in China.
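To make the conditioning idea concrete, here is a minimal single-head self-attention sketch in NumPy. The token ids, the user-category embedding table, and the way the knowledge snippet is appended to the encoder input are all illustrative assumptions, not KOBE's actual data format; the sketch only shows the pattern of folding attribute and knowledge signals into a Transformer-style encoder input.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 16

# Hypothetical inputs: product-title token ids, a user-category id, and a
# retrieved knowledge-snippet token id (names and sizes are made up).
title_tokens = np.array([3, 7, 2, 9])
user_category = 1
knowledge_token = 5

tok_embed = rng.normal(scale=0.1, size=(32, d))   # toy token embeddings
cat_embed = rng.normal(scale=0.1, size=(4, d))    # toy user-category embeddings

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention (Transformer-style)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(x.shape[1])
    scores -= scores.max(axis=1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ v

wq, wk, wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))

# KOBE-style conditioning sketch: add the user-category embedding to every
# title token and append the knowledge token before self-attention.
x = tok_embed[title_tokens] + cat_embed[user_category]
x = np.vstack([x, tok_embed[knowledge_token][None, :]])
enc = self_attention(x, wq, wk, wv)               # encoder representation
```

A full model would stack such layers and decode the description autoregressively from `enc`; the point here is only how personalization and knowledge can enter as extra inputs to the encoder.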
39 citations
01 Nov 2018 • TL;DR: This paper proposes an approach that extends a neural abstractive model trained on large-scale SDS data to the MDS task, making use of only a small number of multi-document summaries for fine-tuning.
Abstract: To date, neural abstractive summarization methods have achieved great success for single-document summarization (SDS). However, due to the lack of large-scale multi-document summaries, such methods can hardly be applied to multi-document summarization (MDS). In this paper, we investigate neural abstractive methods for MDS by adapting a state-of-the-art neural abstractive summarization model for SDS. We propose an approach to extend the neural abstractive model trained on large-scale SDS data to the MDS task. Our approach makes use of only a small number of multi-document summaries for fine-tuning. Experimental results on two benchmark DUC datasets demonstrate that our approach can outperform a variety of baseline neural models.
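One practical ingredient of such SDS-to-MDS adaptation is merging several documents into a single pseudo-document that the SDS model can consume. The stdlib sketch below is a hypothetical stand-in for that step (the paper's actual adaptation is not specified here): it ranks sentences by how document-frequent their words are and keeps the top ones in original order, before the result would be fed to the fine-tuned SDS model.

```python
import re
from collections import Counter

def adapt_mds_to_sds(documents, max_sentences=5):
    """Merge several documents into one pseudo-document for an SDS model,
    keeping the sentences whose words occur in the most documents."""
    sentences = []
    for doc in documents:
        sentences += [s.strip() for s in re.split(r"[.!?]", doc) if s.strip()]
    # Document frequency of each word across the input documents.
    df = Counter()
    for doc in documents:
        for word in set(re.findall(r"\w+", doc.lower())):
            df[word] += 1
    def score(sentence):
        words = re.findall(r"\w+", sentence.lower())
        return sum(df[w] for w in words) / (len(words) or 1)
    ranked = sorted(sentences, key=score, reverse=True)[:max_sentences]
    ranked.sort(key=sentences.index)      # restore original reading order
    return ". ".join(ranked) + "."

docs = ["The model was trained on news. It summarizes one document well.",
        "Multi-document input confuses the model. News stories overlap across sources."]
pseudo_doc = adapt_mds_to_sds(docs, max_sentences=2)
```

The fine-tuning step itself would then be a standard training loop over the small set of (pseudo-document, multi-document summary) pairs.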
39 citations
TL;DR: This work proposes the Attention-aware Temporal Weighted CNN (ATW CNN) for action recognition in videos, which embeds a visual attention model into a temporal weighted multi-stream CNN and contributes substantially to performance gains by focusing on the more discriminative snippets and relevant video segments.
Abstract: Research in human action recognition has accelerated significantly since the introduction of powerful machine learning tools such as Convolutional Neural Networks (CNNs). However, effective and efficient methods for incorporating temporal information into CNNs are still being actively explored in the recent literature. Motivated by the popular recurrent attention models in natural language processing, we propose the Attention-aware Temporal Weighted CNN (ATW CNN) for action recognition in videos, which embeds a visual attention model into a temporal weighted multi-stream CNN. This attention model is implemented simply as temporal weighting, yet it effectively boosts the recognition performance of video representations. Moreover, each stream in the proposed ATW CNN framework is capable of end-to-end training, with both network parameters and temporal weights optimized by stochastic gradient descent (SGD) with back-propagation. Our experimental results on the UCF-101 and HMDB-51 datasets show that the proposed attention mechanism contributes substantially to the performance gains by selecting the more discriminative snippets and focusing on the more relevant video segments.
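The "attention implemented as temporal weighting" idea can be sketched in a few lines of NumPy. Everything here is a toy assumption (snippet count, class count, random per-stream scores); the sketch only shows the pattern the abstract describes: one normalized weight per sampled snippet, shared across streams, used to fuse per-snippet class scores into a video-level prediction.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: 3 snippets sampled from a video, 4 action classes,
# and two streams (appearance and motion), as in multi-stream action CNNs.
snippets, classes = 3, 4
rgb_scores = rng.random((snippets, classes))    # per-snippet stream scores
flow_scores = rng.random((snippets, classes))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# The attention model as temporal weighting: one learnable logit per
# snippet, normalized with softmax so the weights sum to one.
temporal_logits = rng.normal(size=snippets)
w = softmax(temporal_logits)                    # attention over snippets

def fuse(stream_scores, w):
    """Attention-weighted average of per-snippet class scores."""
    return (w[:, None] * stream_scores).sum(axis=0)

video_score = fuse(rgb_scores, w) + fuse(flow_scores, w)
prediction = int(np.argmax(video_score))        # predicted action class
```

In the full model, `temporal_logits` would be produced by the attention network and trained end-to-end by SGD along with the stream CNNs, which is how the weighting learns to emphasize the discriminative snippets.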
39 citations
Authors
Showing all 6829 results
Name | H-index | Papers | Citations |
---|---|---|---|
Philip S. Yu | 148 | 1914 | 107374 |
Lei Zhang | 130 | 2312 | 86950 |
Jian Xu | 94 | 1366 | 52057 |
Wei Chu | 80 | 670 | 28771 |
Le Song | 76 | 345 | 21382 |
Yuan Xie | 76 | 739 | 24155 |
Narendra Ahuja | 76 | 474 | 29517 |
Rong Jin | 75 | 449 | 19456 |
Beng Chin Ooi | 73 | 408 | 19174 |
Wotao Yin | 72 | 303 | 27233 |
Deng Cai | 70 | 326 | 24524 |
Xiaofei He | 70 | 260 | 28215 |
Irwin King | 67 | 476 | 19056 |
Gang Wang | 65 | 373 | 21579 |
Xiaodan Liang | 61 | 318 | 14121 |