Chuan-Ju Wang

Researcher at Academia Sinica

Publications -  68
Citations -  813

Chuan-Ju Wang is an academic researcher from Academia Sinica. The author has contributed to research in topics including Ranking (information retrieval) and Computer science. The author has an h-index of 12 and has co-authored 60 publications receiving 488 citations. Previous affiliations of Chuan-Ju Wang include the Center for Information Technology and the University of Taipei.

Papers
Proceedings ArticleDOI

HOP-Rec: High-Order Proximity for Implicit Recommendation

TL;DR: This paper presents HOP-Rec, a unified and efficient method that incorporates factorization and graph-based models and significantly outperforms the state of the art on a range of large-scale real-world datasets.
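The sketch below is an illustration only, not the authors' released code: it shows one way to pair matrix-factorization updates with high-order neighbors sampled by random walks on the user-item bipartite graph, in the spirit of the summary above. The toy interaction data, the walk orders, and the 1/order confidence decay are all assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, dim = 100, 200, 32

# toy interactions: each user likes 5 random items (placeholder for real data)
user_items = {u: list(rng.choice(n_items, size=5, replace=False)) for u in range(n_users)}
item_users = {}
for u, items in user_items.items():
    for i in items:
        item_users.setdefault(i, []).append(u)

P = rng.normal(scale=0.1, size=(n_users, dim))   # user factors
Q = rng.normal(scale=0.1, size=(n_items, dim))   # item factors

def sample_item(u, order):
    # random walk alternating user -> item -> user; returns an item `order` hops out
    for _ in range(order):
        i = int(rng.choice(user_items[u]))
        u = int(rng.choice(item_users[i]))
    return i

lr = 0.05
for epoch in range(10):
    for u in range(n_users):
        for order in (1, 2, 3):                    # higher order, weaker confidence
            pos = sample_item(u, order)
            neg = int(rng.integers(n_items))
            if neg in user_items[u]:
                continue
            diff = Q[pos] - Q[neg]
            g = (1.0 / order) / (1.0 + np.exp(P[u] @ diff))   # decayed BPR-style gradient
            pu = P[u].copy()
            P[u] += lr * g * diff
            Q[pos] += lr * g * pu
            Q[neg] -= lr * g * pu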
Proceedings ArticleDOI

Collaborative Similarity Embedding for Recommender Systems

TL;DR: A unified framework that exploits the comprehensive collaborative relations available in a user-item bipartite graph for representation learning and recommendation, together with a sampling technique specifically designed to capture the two types of proximity relations.
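As a rough illustration only (the function names and graph layout are assumptions, not the paper's implementation), the generator below samples the two kinds of proximity the summary refers to: direct user-item edges, and neighborhood pairs (user-user and item-item) reached through a shared neighbor in the bipartite graph.

import random

def sample_proximity_pairs(user_items, item_users, n_samples, seed=0):
    # yields ("relation", a, b) tuples from a toy bipartite interaction graph
    rng = random.Random(seed)
    users = list(user_items)
    for _ in range(n_samples):
        u = rng.choice(users)
        i = rng.choice(user_items[u])
        yield ("user-item", u, i)                            # direct proximity: observed edge
        yield ("user-user", u, rng.choice(item_users[i]))    # two users sharing item i
        v = rng.choice(item_users[i])
        yield ("item-item", i, rng.choice(user_items[v]))    # two items sharing user v

Pairs of each relation type would then feed a shared embedding objective; the loss itself is deliberately left out of this sketch.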
Journal ArticleDOI

On the risk prediction and analysis of soft information in finance reports

TL;DR: The experimental results show that, based on a bag-of-words model, using only financial sentiment words yields performance comparable to using the whole texts; this confirms the importance of financial sentiment words for risk prediction.
Proceedings Article

Financial Sentiment Analysis for Risk Prediction

TL;DR: The experimental results show that, based on the bag-of-words model, models trained only on sentiment words perform comparably to those trained on the original texts, which confirms the importance of financial sentiment words for risk prediction.
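Both sentiment papers above compare full-text bag-of-words features against features restricted to financial sentiment words. The fragment below is a minimal, hypothetical sketch of that restriction using scikit-learn; the tiny lexicon, toy reports, and Ridge regressor are placeholders, whereas the actual studies rely on a financial sentiment word list and real financial reports.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import Ridge

# hypothetical lexicon and toy reports; real work would use a financial
# sentiment word list (e.g. Loughran-McDonald) and 10-K style filings
sentiment_words = ["loss", "gain", "risk", "litigation", "growth", "impairment"]
reports = [
    "revenue growth offset by litigation risk",
    "impairment loss widened amid litigation",
    "steady gain and growth across segments",
]
volatility = [0.21, 0.34, 0.12]   # toy risk targets

# restricting the vocabulary keeps only sentiment-word counts as features
X = CountVectorizer(vocabulary=sentiment_words).fit_transform(reports)
model = Ridge(alpha=1.0).fit(X, volatility)
print(model.predict(X))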
Posted Content

Conversational Question Reformulation via Sequence-to-Sequence Architectures and Pretrained Language Models

TL;DR: An examination of a variety of architectures with different numbers of parameters demonstrates that the recent Text-to-Text Transfer Transformer (T5) achieves the best results on both CANARD and CAsT with fewer parameters than comparable transformer architectures.
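A hedged sketch of the kind of sequence-to-sequence rewriting described above, using the Hugging Face Transformers T5 classes. The off-the-shelf t5-base checkpoint and the "|||" history-separator format are assumptions, not the paper's released model, and the model would need fine-tuning on CANARD-style (history, rewritten question) pairs before the output is a useful reformulation.

from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# conversation history followed by the context-dependent question to rewrite
history = "Who wrote Frankenstein? ||| Mary Shelley."
question = "When was she born?"
inputs = tokenizer(history + " ||| " + question, return_tensors="pt")

outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))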