Open Access Proceedings Article

Learning Program Embeddings to Propagate Feedback on Student Code

TLDR
A neural network method is introduced that encodes programs as linear mappings from an embedded precondition space to an embedded postcondition space, and an algorithm for feedback at scale is proposed that uses these linear maps as features.
Abstract
Providing feedback, both assessing final work and giving hints to stuck students, is difficult for open-ended assignments in massive online classes which can range from thousands to millions of students. We introduce a neural network method to encode programs as a linear mapping from an embedded precondition space to an embedded postcondition space and propose an algorithm for feedback at scale using these linear maps as features. We apply our algorithm to assessments from the Code.org Hour of Code and Stanford University's CS1 course, where we propagate human comments on student assignments to orders of magnitude more submissions.
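A minimal, hypothetical sketch of that idea (not the authors' implementation): each submission is summarized by a matrix A that maps an embedded precondition (the program state before execution) to an embedded postcondition (the state after execution), and the flattened matrices serve as features so that human feedback attached to a few submissions can be propagated to their nearest neighbors. The embedding function, dimensions, and the ridge-regression fit below are illustrative stand-ins for the learned neural encoder.

import numpy as np

def fit_program_matrix(pre_states, post_states, embed, reg=1e-3):
    # Embed the memory states observed before/after running the program on test inputs.
    P = np.stack([embed(s) for s in pre_states])   # (n, d)
    Q = np.stack([embed(s) for s in post_states])  # (n, d)
    # Ridge-regression stand-in for the learned map: find A with P @ A ~= Q.
    d = P.shape[1]
    A = np.linalg.solve(P.T @ P + reg * np.eye(d), P.T @ Q)
    return A.ravel()  # flattened (d*d,) feature vector for this submission

def propagate_feedback(features, labeled_idx, labels, k=5):
    # features: (num_submissions, d*d) flattened program matrices.
    # labels: dict mapping a labeled submission index to its human feedback.
    out = {}
    for i in range(len(features)):
        dist = np.linalg.norm(features[labeled_idx] - features[i], axis=1)
        nearest = np.array(labeled_idx)[np.argsort(dist)[:k]]
        out[i] = max(set(labels[j] for j in nearest),
                     key=lambda c: sum(labels[j] == c for j in nearest))
    return out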



Citations
Proceedings Article

Dynamic Neural Program Embeddings for Program Repair

TL;DR: A novel semantic program embedding learned from program execution traces is proposed, showing that program states expressed as sequential tuples of live variable values not only capture program semantics more precisely but also offer a more natural fit for recurrent neural networks to model.
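A compact sketch, with assumed names and a PyTorch encoder, of how such a dynamic embedding could be computed: the execution trace (the sequence of live-variable values at each step, discretized here to integer state ids for simplicity) is read by a GRU whose final hidden state serves as the program embedding.

import torch
import torch.nn as nn

class TraceEncoder(nn.Module):
    # Hypothetical sketch: embed an execution trace, i.e. the sequence of
    # live-variable value tuples observed at each step, with a GRU.
    def __init__(self, num_values, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.value_embed = nn.Embedding(num_values, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, trace_ids):
        # trace_ids: (batch, steps) integer ids of discretized variable states
        x = self.value_embed(trace_ids)
        _, h = self.rnn(x)
        return h[-1]  # (batch, hidden_dim) program embedding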
Proceedings Article

Writing Reusable Code Feedback at Scale with Mixed-Initiative Program Synthesis

TL;DR: A mixed-initiative approach is introduced that combines teacher expertise with data-driven program synthesis techniques, helping teachers better understand student bugs and write reusable feedback that scales to a massive introductory programming classroom.
Posted Content

Context-Aware Attentive Knowledge Tracing

TL;DR: In this paper, the authors propose attentive knowledge tracing (AKT), which couples flexible attention-based neural network models with a series of novel, interpretable model components inspired by cognitive and psychometric models.
Proceedings Article

Deep Knowledge Tracing On Programming Exercises

TL;DR: This work feeds embedded program submissions into a recurrent neural network and trains it to predict the student's success on the subsequent programming exercise, reliably predicting future student performance.
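A hedged sketch of that setup, with hypothetical names: each submission is assumed to already be embedded as a fixed-size vector, and an LSTM over the sequence of (embedding, correctness) pairs predicts the probability of success on the following exercise.

import torch
import torch.nn as nn

class ProgramDKT(nn.Module):
    def __init__(self, embed_dim, hidden_dim=128):
        super().__init__()
        self.rnn = nn.LSTM(embed_dim + 1, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, submissions, correct):
        # submissions: (batch, steps, embed_dim); correct: (batch, steps) in {0, 1}
        x = torch.cat([submissions, correct.unsqueeze(-1).float()], dim=-1)
        h, _ = self.rnn(x)
        # Predicted probability of success on the next exercise at each step.
        return torch.sigmoid(self.head(h)).squeeze(-1)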
Posted Content

code2seq: Generating Sequences from Structured Representations of Code

TL;DR: code2seq represents a code snippet as the set of compositional paths in its abstract syntax tree (AST) and uses attention to select the relevant paths while decoding.
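An illustrative fragment of the attention step only, simplified to dot-product attention over already-encoded AST path vectors; the actual model also encodes each path with a bidirectional LSTM and embeds token subwords, which is omitted here.

import torch
import torch.nn.functional as F

def attend_over_paths(path_vecs, decoder_state):
    # path_vecs: (num_paths, d) encodings of the AST paths of one snippet
    # decoder_state: (d,) current decoder hidden state
    scores = path_vecs @ decoder_state   # dot-product attention scores
    weights = F.softmax(scores, dim=0)
    return weights @ path_vecs           # context vector for this decoding step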
References
Proceedings Article

Adaptive Subgradient Methods for Online Learning and Stochastic Optimization

TL;DR: Adaptive subgradient methods are presented that dynamically incorporate knowledge of the geometry of the data observed in earlier iterations to perform more informative gradient-based learning, making it possible to find needles in haystacks in the form of very predictive but rarely seen features.
Journal Article

Adaptive Subgradient Methods for Online Learning and Stochastic Optimization

TL;DR: This work describes and analyzes an apparatus for adaptively modifying the proximal function, which significantly simplifies setting a learning rate and yields regret guarantees provably as good as those of the best proximal function chosen in hindsight.
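A minimal sketch of the diagonal AdaGrad update these papers analyze: each coordinate accumulates its squared gradients and is scaled by the inverse square root of that sum, so rarely updated but predictive features keep a large effective step size. Variable names are illustrative.

import numpy as np

def adagrad_step(theta, grad, accum, lr=0.1, eps=1e-8):
    accum += grad ** 2                             # per-coordinate sum of squared gradients
    theta -= lr * grad / (np.sqrt(accum) + eps)    # per-coordinate learning rate
    return theta, accum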
Journal Article

Random search for hyper-parameter optimization

TL;DR: This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
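A small self-contained sketch of random search over a hyper-parameter space defined by per-parameter samplers; the space, trial budget, and names here are illustrative.

import random

def random_search(objective, space, n_trials=60, seed=0):
    # space maps each hyper-parameter to a sampler, e.g.
    # {"lr": lambda: 10 ** random.uniform(-5, -1), "depth": lambda: random.randint(1, 4)}
    random.seed(seed)
    best_score, best_params = float("inf"), None
    for _ in range(n_trials):
        params = {name: sample() for name, sample in space.items()}
        score = objective(params)
        if score < best_score:
            best_score, best_params = score, params
    return best_score, best_params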
Proceedings Article

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

TL;DR: A Sentiment Treebank with fine-grained sentiment labels for 215,154 phrases in the parse trees of 11,855 sentences is introduced, posing new challenges for sentiment compositionality, and the Recursive Neural Tensor Network is proposed to address them.
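The core composition step of the Recursive Neural Tensor Network, sketched with assumed shapes: two child vectors are combined through a tensor term plus an affine term and squashed with tanh (the bias here is an added convenience).

import numpy as np

def rntn_compose(a, b, V, W, bias):
    # a, b: (d,) child vectors; V: (d, 2d, 2d) tensor; W: (d, 2d); bias: (d,)
    ab = np.concatenate([a, b])                        # (2d,)
    tensor_term = np.einsum("i,kij,j->k", ab, V, ab)   # quadratic interaction term
    return np.tanh(tensor_term + W @ ab + bias)        # (d,) parent vector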
Journal Article

A complexity measure

TL;DR: A graph-theoretic complexity measure for managing and controlling program complexity is presented; the measure is independent of physical size and depends only on the decision structure of a program.
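For reference, the measure is cyclomatic complexity computed on the program's control-flow graph; a one-line sketch:

def cyclomatic_complexity(edges, nodes, components=1):
    # McCabe's measure: M = E - N + 2P on the control-flow graph,
    # where E = edges, N = nodes, P = connected components.
    return edges - nodes + 2 * components

# e.g. a single function containing one if/else: 6 edges, 6 nodes, 1 component -> M = 2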