Learning Program Embeddings to Propagate Feedback on Student Code
Citations
Cites background from "Learning Program Embeddings to Prop..."
...Thus, minor syntactic changes correspond to major changes in terms of function [14]....
[...]
...Piech and colleagues report multiplication factors of up to 214, that is, a human tutor's annotation for one program permits inference of said annotation for up to 214 other programs [14]....
[...]
...Recently, Piech and colleagues have criticized this approach and judged syntax trees not sufficiently discriminative to capture the strong functional consequences of small syntactic changes [14]....
[...]
References
Additional excerpts
...Learning rates are set using Adagrad (Duchi et al., 2011)....
[...]
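The Adagrad update mentioned in the excerpt above can be sketched as follows. This is a minimal, self-contained illustration of the per-parameter learning-rate rule from Duchi et al. (2011), not the paper's training code; the toy objective, dimensions, and step size are arbitrary choices for the sketch.

```python
import numpy as np

def adagrad_update(w, grad, hist, lr=0.1, eps=1e-8):
    # Accumulate squared gradients; each coordinate's effective step
    # size shrinks as its own gradient history grows (Duchi et al., 2011).
    hist = hist + grad ** 2
    w = w - lr * grad / (np.sqrt(hist) + eps)
    return w, hist

# Toy objective f(w) = ||w||^2 with gradient 2w; the starting point,
# learning rate, and iteration count are arbitrary for this sketch.
w = np.array([1.0, -2.0])
hist = np.zeros_like(w)
for _ in range(5000):
    grad = 2 * w
    w, hist = adagrad_update(w, grad, hist)
```

Coordinates with larger accumulated gradient histories take smaller steps, which is why Adagrad needs little manual learning-rate tuning.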
"Learning Program Embeddings to Prop..." refers to methods in this paper
...We use random search (Bergstra & Bengio, 2012) to optimize over hyperparameters (e.g., regularization parameters, matrix dimensions, and minibatch size)....
[...]
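Random search over hyperparameters, as in the excerpt above, can be sketched as below. The search space and the stand-in objective are hypothetical (the excerpt names only the kinds of quantities searched: regularization parameters, matrix dimensions, minibatch size); this is an illustration of Bergstra & Bengio (2012), not the paper's actual setup.

```python
import random

def random_search(objective, space, n_trials=100, seed=0):
    # Sample each hyperparameter independently from its own distribution
    # and keep the best-scoring configuration (Bergstra & Bengio, 2012).
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: sample(rng) for name, sample in space.items()}
        score = objective(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Hypothetical search space mirroring the quantities the excerpt names.
space = {
    "l2_reg":    lambda r: 10 ** r.uniform(-6, -2),   # regularization strength
    "embed_dim": lambda r: r.choice([10, 30, 50, 100]),  # matrix dimension
    "batch":     lambda r: r.choice([16, 32, 64]),       # minibatch size
}

# Stand-in for a validation loss, just to make the sketch runnable:
# it happens to prefer embed_dim near 50 and small regularization.
def fake_loss(cfg):
    return abs(cfg["embed_dim"] - 50) / 50 + cfg["l2_reg"]

best, loss = random_search(fake_loss, space)
```

Because each trial is drawn independently, random search explores the important dimensions of the space more densely than a grid of the same size, which is the core argument of Bergstra & Bengio.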
"Learning Program Embeddings to Prop..." refers to background or methods in this paper
...The programs for these assignments operate in maze worlds where an agent can move, turn, and test for conditions of its current location....
[...]
...Our models are related to recent work from the NLP and deep learning communities on recursive neural networks, particularly for modeling semantics in sentences or symbolic expressions (Socher et al., 2013; 2011; Zaremba et al., 2014; Bowman, 2013)....
[...]
...…on recursive neural networks (called the NPM-RNN model) in which we parametrize a matrix MA in this new model with an RNN whose architecture follows the abstract syntax tree (similar to the way in which RNN architectures might take the form of a parse tree in an NLP setting (Socher et al., 2013))....
[...]
[...]
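The idea in the excerpts above — a recursive network whose architecture follows the abstract syntax tree, as in Socher et al. (2013) — can be sketched with a toy AST from the maze domain. All names here (the `seq` node, the `move`/`turnLeft` tokens, the weight matrices, the dimension) are illustrative assumptions, not the paper's NPM-RNN parametrization.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding dimension (arbitrary choice for this sketch)

# One weight matrix per child position, shared across the whole tree,
# plus learned vectors for leaf tokens (randomly initialized here).
W_left = rng.normal(0.0, 0.1, (D, D))
W_right = rng.normal(0.0, 0.1, (D, D))
b = np.zeros(D)
leaf_embed = {"move": rng.normal(0.0, 0.1, D),
              "turnLeft": rng.normal(0.0, 0.1, D)}

def embed(node):
    # A leaf is a primitive token; an internal node combines its
    # children's embeddings through a nonlinearity, so the network's
    # wiring mirrors the syntax tree bottom-up.
    if isinstance(node, str):
        return leaf_embed[node]
    _label, left, right = node
    return np.tanh(W_left @ embed(left) + W_right @ embed(right) + b)

# Toy AST for a two-statement maze program: seq(move, turnLeft).
vec = embed(("seq", "move", "turnLeft"))
```

Because the recursion follows the tree, two programs with different tree shapes are embedded by differently shaped computations, while sharing the same small set of parameters.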