Hangyeol Yu
Researcher at KAIST
Publications - 13
Citations - 163
Hangyeol Yu is an academic researcher from KAIST. The author has contributed to research in topics: Differentiable function & Computer science. The author has an h-index of 5, having co-authored 9 publications receiving 58 citations.
Papers
Proceedings ArticleDOI
SAINT+: Integrating Temporal Features for EdNet Correctness Prediction
TL;DR: SAINT+, a successor of SAINT, a Transformer-based knowledge tracing model that separately processes exercise information and student response information, achieves state-of-the-art performance in knowledge tracing with an improvement of 1.25% in area under the receiver operating characteristic curve compared to SAINT.
Posted Content
On Correctness of Automatic Differentiation for Non-Differentiable Functions
TL;DR: This paper investigates a class of functions, called PAP functions, that includes nearly all (possibly non-differentiable) functions used in deep learning today; it proposes a new type of derivative, called intensional derivatives, and proves that these derivatives always exist and coincide with standard derivatives for almost all inputs.
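The phenomenon the paper studies can be seen in a toy setting. Below is a minimal forward-mode automatic differentiation sketch using dual numbers (not the paper's code; all names are illustrative). At the ReLU kink, where no standard derivative exists, AD still returns a definite value by following one branch of the program; such AD-computed values are an example of what the paper calls intensional derivatives, which agree with the standard derivative almost everywhere.

```python
# Minimal dual-number forward-mode AD (illustrative sketch).
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # primal value
        self.der = der  # derivative (tangent) value

def relu(x):
    # AD branches on the primal value: at x.val == 0 the else-branch fires,
    # so AD reports derivative 0 even though the standard derivative
    # does not exist there -- one valid intensional derivative.
    if x.val > 0:
        return Dual(x.val, x.der)
    else:
        return Dual(0.0, 0.0)

def grad_relu(x):
    # Seed the tangent with 1.0 to get d relu / dx at x.
    return relu(Dual(x, 1.0)).der

grad_relu(1.0)   # 1.0 (standard derivative)
grad_relu(-1.0)  # 0.0 (standard derivative)
grad_relu(0.0)   # 0.0 (no standard derivative; AD picks a branch)
```

Away from the kink the AD result matches the standard derivative, illustrating the "coincide for almost all inputs" claim.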
Proceedings ArticleDOI
SAINT+: Integrating Temporal Features for EdNet Correctness Prediction
TL;DR: SAINT+ as discussed by the authors is a successor of SAINT, a Transformer-based knowledge tracing model that separately processes exercise information and student response information, and it incorporates two temporal feature embeddings into the response embedding: elapsed time, the time taken for a student to answer, and lag time.
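The response-side input described above can be sketched as an embedding sum. This is a hypothetical illustration, not the authors' code: the embedding-table sizes, bucketing of the two temporal features, and function names are all assumptions.

```python
import numpy as np

# Illustrative sketch of SAINT+-style response input: the response embedding
# is combined with two temporal feature embeddings by summation.
rng = np.random.default_rng(0)
d_model = 8
E_response = rng.normal(size=(2, d_model))     # correct / incorrect
E_elapsed  = rng.normal(size=(301, d_model))   # elapsed time, bucketed to 0..300 s (assumed)
E_lag      = rng.normal(size=(1441, d_model))  # lag time, bucketed to 0..1440 min (assumed)

def response_input(correct, elapsed_s, lag_min):
    """One decoder input token = response emb + elapsed-time emb + lag-time emb."""
    return (E_response[correct]
            + E_elapsed[min(elapsed_s, 300)]
            + E_lag[min(lag_min, 1440)])

tok = response_input(correct=1, elapsed_s=12, lag_min=30)
tok.shape  # (8,) -- one d_model-dimensional token for the Transformer decoder
```

Summing (rather than concatenating) keeps the token dimension fixed at `d_model`, so the temporal features can be added without changing the Transformer architecture.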
Journal ArticleDOI
Dialogue Summaries as Dialogue States (DS2), Template-Guided Summarization for Few-shot Dialogue State Tracking
TL;DR: It is hypothesized that dialogue summaries are essentially unstructured dialogue states; hence, dialogue state tracking is reformulated as a dialogue summarization problem. The resulting method, DS2, outperforms previous works on few-shot DST on MultiWoZ 2.0 and 2.1.
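The core idea, that a structured dialogue state and a templated summary carry the same information, can be sketched as a round-trip between the two. The template wording, slot names, and parsing rules below are purely illustrative assumptions, not the templates used in the paper.

```python
# Hypothetical sketch: a dialogue state <-> templated-summary round trip.
PREFIX = "The user is looking for a booking where "

def state_to_summary(state):
    """Render a slot-value state dict as a templated natural-language summary."""
    parts = [f"{slot.replace('-', ' ')} is {value}" for slot, value in state.items()]
    return PREFIX + " and ".join(parts) + "."

def summary_to_state(summary):
    """Parse the templated summary back into a slot-value state dict."""
    body = summary[len(PREFIX):].rstrip(".")
    state = {}
    for part in body.split(" and "):
        slot, value = part.split(" is ", 1)
        state[slot.replace(" ", "-")] = value
    return state

s = {"hotel-area": "north", "hotel-stars": "4"}
summary = state_to_summary(s)
assert summary_to_state(summary) == s  # lossless round trip
```

Because the mapping is lossless, a model trained to generate such summaries implicitly tracks the dialogue state, which is what lets a pretrained summarizer be repurposed for few-shot DST.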
Posted Content
Reparameterization Gradient for Non-differentiable Models
TL;DR: In this paper, the authors propose an algorithm for stochastic variational inference that targets models with non-differentiable densities by generalizing the reparameterization trick, one of the most effective techniques for addressing the variance issue for differentiable models.
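For context, here is the standard reparameterization trick that the paper generalizes, shown on a fully differentiable toy objective (this sketch does not implement the paper's extension to non-differentiable densities; the objective and constants are illustrative).

```python
import numpy as np

# Reparameterization trick: write z ~ N(mu, sigma^2) as z = mu + sigma * eps
# with eps ~ N(0, 1), so the randomness no longer depends on the parameters
# and gradients can pass through the sample.
rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.5

eps = rng.standard_normal(100_000)   # parameter-free noise
z = mu + sigma * eps                 # reparameterized samples

# Monte Carlo estimate of d/dmu E[z^2]: since dz/dmu = 1,
# d/dmu z^2 = 2 * z, averaged over samples.
grad_mu_est = np.mean(2.0 * z)

# Closed form for comparison: d/dmu E[(mu + sigma*eps)^2] = 2 * mu = 3.0
```

The estimator is low-variance precisely because the gradient flows through `z`; the paper's contribution is making a trick of this kind applicable when the model density is non-differentiable.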