
Showing papers by "Dirk T. Tempelaar published in 2015"


Journal ArticleDOI
TL;DR: This empirical contribution provides an application of Buckingham Shum and Deakin Crick's theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs.

352 citations


Journal ArticleDOI
TL;DR: In this article, an empirical study based on a large sample of university students was conducted to demonstrate that relaxing the stringent assumptions typically made in meaning-system research, and thereby using the meaning system framework to its full potential, provides strong benefits: effort beliefs are crucial mediators of the relationships between implicit theories and achievement goals and academic motivations.
Abstract: Empirical studies into meaning systems surrounding implicit theories of intelligence typically entail two stringent assumptions: that different implicit theories and different effort beliefs represent opposite poles on a single scale, and that implicit theories directly impact constructs such as achievement goals and academic motivations. Through an empirical study based on a large sample of university students, we aim to demonstrate that relaxing these stringent assumptions, and thereby using the meaning system framework to its full potential, provides strong benefits: effort beliefs are crucial mediators of the relationships between implicit theories and achievement goals and academic motivations, and the different poles of implicit theories and effort beliefs expose different relationships with goal-setting behaviour and academic motivations. A structural equation model, cross-validated by demonstrating gender invariance of path coefficients, demonstrates that incremental and entity theory views have less predictive power than positive and negative effort beliefs in explaining achievement goals and motivations.

73 citations


DOI
24 Mar 2015
TL;DR: In this article, the authors developed a framework which links a course-contextualized antecedent - academic control in Pekrun's (2006) Control Value Theory of Achievement Emotions - with generic antecedents - adaptive and maladaptive cognitions and behaviors from Martin's (2007) Motivation and Engagement Wheel framework - to explain the emergence of learning-related emotions (LREs) in a transition period.
Abstract: Recent work suggests that learning-related emotions (LREs) play a crucial role in performance, especially in the first year of university, a period of transition for most students; however, additional research is needed to show how these emotions emerge. We developed a framework which links a course-contextualized antecedent - academic control in Pekrun’s (2006) Control Value Theory of Achievement Emotions - with generic antecedents - adaptive and maladaptive cognitions and behaviours from Martin’s (2007) Motivation and Engagement Wheel framework - to explain a classical problem: the emergence of LREs in a transition period. Using a large sample (N = 3451) of first-year university students, our study explores these two antecedents to better understand how four LREs (enjoyment, anxiety, boredom and hopelessness) emerge in a mathematics and statistics course. Using path modelling, we found that academic control has a strong effect on all four LREs, with the strongest impact observed for learning hopelessness, followed by learning anxiety. Academic control, in turn, builds on contributions from adaptive and maladaptive cognitions. Furthermore, adaptive cognitions have a positive impact on learning enjoyment and a negative impact on boredom. Surprisingly, though, maladaptive behaviours positively impact learning enjoyment and negatively impact learning anxiety. Following this, we predicted performance outcomes in the course and again found academic control to be the main predictor, followed by learning hopelessness. Overall, this study brings evidence that adaptive and maladaptive cognitions and behaviours act as important antecedents of academic control, the main predictor of LREs and course performance outcomes.

19 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the nature of hospitality managers' knowledge when solving typical hotel management problems, highlighting knowledge as an important element in performance on hospitality management tasks.
Abstract: This study examines the nature of hospitality managers’ knowledge when solving typical hotel management problems. In a cross-sectional study, data were collected from first-, fourth-, and eighth-semester students and compared with experts who had seven to ten years of experience working as hospitality managers. Three typical hospitality management cases were used to measure the way in which participants with different levels of education and experience cognitively represented the cases. Participants' recall and problem solving processes were also assessed. Through a method of proposition analysis, the data were scored and then evaluated using one-way analysis of variance. The results point to knowledge as an important element in performance on hospitality management tasks. Hospitality experts' recall and problem solving abilities in this study were strengthened by their rich and broad knowledge base of facts, concepts and inferences which they used to abstract information from the cases and then accuratel...
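The proposition scores compared across experience levels in this abstract were evaluated with one-way analysis of variance. As an illustrative sketch only (the group names, scores, and resulting F statistic below are invented, not the study's data), the F statistic can be computed by hand:

```python
# Hand-rolled one-way ANOVA F statistic on synthetic proposition-analysis
# scores for three experience groups. All data here are invented.

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over lists of scores."""
    all_scores = [x for g in groups for x in g]
    n_total = len(all_scores)
    k = len(groups)
    grand_mean = sum(all_scores) / n_total
    # Between-group sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within

# Hypothetical case-representation scores per group
first_semester = [4, 5, 6, 5]
eighth_semester = [7, 8, 7, 9]
experts = [12, 11, 13, 12]
f_stat = one_way_anova_f([first_semester, eighth_semester, experts])
```

A large F statistic, as here, indicates that between-group differences dominate within-group variation, which is the pattern the study reports between novices and experts.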

14 citations


Proceedings ArticleDOI
23 May 2015
TL;DR: Focusing on predictive power, this paper provides evidence of both stability and sensitivity of regression-type prediction models, and offers an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics.
Abstract: Learning analytics seek to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. In this follow-up study of previous research (Tempelaar, Rienties, and Giesbers, 2015), we focus on the issues of stability and sensitivity of Learning Analytics (LA) based prediction models. Do prediction models stay intact when the instructional context is repeated in a new cohort of students, and do prediction models indeed change when relevant aspects of the instructional context are adapted? This empirical contribution provides an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. We compare two cohorts of a large introductory quantitative methods module, with 1005 students in the ’13/’14 cohort and 1006 students in the ’14/’15 cohort. Both modules were based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials, and have similar instructional designs, except for an intervention into the design of quizzes administered in the module. Focusing on predictive power, we provide evidence of both stability and sensitivity of regression-type prediction models.
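The stability question posed here (fit a prediction model on one cohort, then check whether it still predicts well in the next) can be sketched with a simple least-squares model. All numbers below, the mastery scores, exam scores, and cohort labels, are invented for illustration; the study's actual regression models use far richer disposition and track data.

```python
# Sketch of cohort-to-cohort stability: fit a one-predictor least-squares
# model on a "'13/'14" cohort, then score it on a "'14/'15" cohort.
# All data are hypothetical.

def fit_ols(x, y):
    """Fit a simple least-squares line y = a + b*x; return (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def r_squared(x, y, a, b):
    """Proportion of variance in y explained by the fitted line."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Formative-assessment mastery (0-1) and exam score per student, per cohort
mastery_1314 = [0.2, 0.4, 0.5, 0.7, 0.9]
exam_1314 = [35, 48, 55, 68, 82]
mastery_1415 = [0.3, 0.5, 0.6, 0.8]
exam_1415 = [42, 54, 61, 75]

a, b = fit_ols(mastery_1314, exam_1314)
# A stable model keeps its predictive power on the unseen new cohort
r2_new = r_squared(mastery_1415, exam_1415, a, b)
```

If the out-of-cohort fit stays high, the model is stable under a repeated instructional context; a drop after a design intervention (such as the quiz redesign mentioned above) would be the sensitivity the authors look for.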

12 citations


Book ChapterDOI
23 May 2015
TL;DR: This empirical contribution compares two cohorts of a large module introducing mathematics and statistics, and analyses bivariate and multivariate relationships between module performance and track and disposition data to provide evidence of both stability and sensitivity of prediction models.
Abstract: In this empirical contribution, a follow-up study of previous research [1], we focus on the issues of stability and sensitivity of Learning Analytics based prediction models. Do prediction models stay intact when the instructional context is repeated in a new cohort of students, and do prediction models indeed change when relevant aspects of the instructional context are adapted? Applying Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics, combined with formative assessments and learning management systems, we compare two cohorts of a large module introducing mathematics and statistics. Both modules were based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials, and have similar instructional designs, except for an intervention into the design of quizzes administered in the module. We analyse bivariate and multivariate relationships between module performance and track and disposition data to provide evidence of both stability and sensitivity of prediction models.

5 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the predictive value of four learning-related emotions (Enjoyment, Anxiety, Boredom and Hopelessness) for achievement outcomes in the first year of study at university.
Abstract: Introduction. This study examined the predictive value of four learning-related emotions (Enjoyment, Anxiety, Boredom and Hopelessness) for achievement outcomes in the first year of study at university. Method. We used a large sample (N = 2337) of first-year university students enrolled over three consecutive academic years in a mathematics and statistics course, from undergraduate Economics and International Business degree programs. Results. We first showed significant differences in the emotional experiences between the students who attended the final exam and those who were absent from it. Second, the present study found emotions, particularly learning hopelessness, to have strong predictive value for student exam scores, alongside prior mathematics background. This relationship was consistent over three consecutive academic years. Discussion and Conclusion. Recommendations for improving educational practice have been formulated and are shared in this article.

5 citations


Book ChapterDOI
TL;DR: Time on task data appear to be more sensitive to the effects of heterogeneity than mastery data, providing a further argument to prioritize formative assessment mastery data as predictor variables in the design of prediction models directed at the generation of learning feedback.
Abstract: Mastery data derived from formative assessments constitute a rich data set in the development of student performance prediction models. The dominance of formative assessment mastery data over use intensity data, such as time on task or number of clicks, was the outcome of previous research by the authors in a dispositional learning analytics context [1, 2, 3]. Practical implications of these findings are far-reaching, contradicting current practices of developing (learning analytics based) student performance prediction models with intensity data as central predictor variables. In this empirical follow-up study using data from 2011 students, we search for an explanation of why time on task data are dominated by mastery data. We do so by investigating more general models, allowing for nonlinear, even non-monotonic, relationships between time on task and performance measures. Clustering students into subsamples with different time on task characteristics suggests heterogeneity of the sample to be an important cause of the nonlinear relationships with performance measures. Time on task data appear to be more sensitive to the effects of heterogeneity than mastery data, providing a further argument to prioritize formative assessment mastery data as predictor variables in the design of prediction models directed at the generation of learning feedback.
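The clustering step described above, splitting students into subsamples with different time-on-task characteristics, can be sketched with a minimal one-dimensional 2-means. The hours below are invented, and the sketch assumes two well-separated groups; the chapter's actual cluster analysis is richer.

```python
# Minimal 1-D 2-means clustering of time-on-task hours, illustrating how
# a heterogeneous sample splits into subsamples. Data are hypothetical.

def two_means_1d(values, iters=20):
    """Cluster numbers into two groups by iterated mean updates.

    Assumes the data contain two separated groups, so neither cluster
    empties out during iteration.
    """
    c1, c2 = min(values), max(values)  # initial centroids at the extremes
    for _ in range(iters):
        # Assign each value to its nearest centroid (ties go to the first)
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        # Recompute centroids as group means
        c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)
    return g1, g2

# Hours spent in the e-tutorial: a fast-working and a slow-working subsample
hours = [2, 3, 3, 4, 10, 11, 12, 13]
fast, slow = two_means_1d(hours)
```

Fitting performance models separately within such subsamples is one way the nonlinear, non-monotonic pooled relationship between time on task and performance can be traced back to sample heterogeneity.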

4 citations