Quantitative Research Methods Training in Education Leadership
and Administration Preparation Programs as Disciplined Inquiry for
Building School Improvement Capacity
Alex J. Bowers
Teachers College, Columbia University
ABSTRACT:
The quantitative research methods course is a staple of graduate
programs in education leadership and administration.
Historically, these courses serve to train aspiring district and
school leaders in fundamental statistical research topics. This
article argues for programs to focus as well in these courses on
helping aspiring leaders develop skills as practitioner-scholars,
including deepening their practice around data analytics,
providing opportunities to read and evaluate peer-reviewed
research, analyzing data using current methods, and applying
findings to facilitate building evidence-based improvement
cycles in their schools. Additional data leadership training
should be offered for the practicing administrator, educational
quantitative analyst, research specialist and district data scientist.
KEYWORDS: Quantitative Methods, Statistical Analysis,
Educational Administration, School Administration, Leadership,
College Programs, Doctoral Programs, Graduate Study, Methods
Courses, Methods Research, Data Analytics, Data Science,
Design-based research, Continuous Improvement
Introduction:
I believe that the greatest impact of the quantitative
approach will not be in the area of problem solving,
although it will have growing usefulness there. Its
greatest impact will be on problem formulation: the
way managers think about their problems, how they
size them up, bring new insights to bear on them, relate
them to other problems, communicate with other people
about them, and gather information for analyzing them.
In this sense, the results that "quantitative people" have
produced are beginning to contribute in a really
significant way to the art of management (Farmer,
1970, p.21).
1
This document is a pre-print of this manuscript, published in
the Journal of Research on Leadership Education. Citation:
Bowers, A.J. (2017) Quantitative Research Methods Training in
Education Leadership and Administration Preparation Programs
as Disciplined Inquiry for Building School Improvement
Capacity. Journal of Research on Leadership Education, 12(1),
p.72-96. http://doi.org/10.1177/1942775116659462
2
Alex J. Bowers (bowers@tc.edu); Teachers College, Columbia
University; 525 W. 120th Street, New York,
New York 10027. ORCiD: 0000-0002-5140-6428
The purpose of this article is to provide a discussion of the use of
quantitative research methods instruction in university graduate
programs of education leadership and administration, with a
specific focus on training doctoral students in Ed.D. and Ph.D.
degree programs as practitioner-scholars who aim to work as
district leader practitioners. These programs have historically
included methods training for practitioner-scholars aspiring to
hold administrative positions in K-12 schooling organizations
with an Ed.D. or Ph.D. degree, with many graduates becoming
school district central office staff, superintendents, or state or
national policymakers. A quantitative research methods course
has a longstanding tradition of being included within education
leadership graduate programs (Anderson & Reynolds, 2015;
Hess & Kelly, 2005; Militello, Gajda, & Bowers, 2009;
Thornton & Perreault, 2002), along with a host of other courses
in programs designed to help graduates learn to lead schools and
districts, courses such as qualitative methods, diversity and
social justice issues, law, policy, finance and budgeting, human
resource management, facilities, labor negotiation, curriculum
and instruction, assessment and evaluation, ethics, and the list
goes on. Over the past few decades, programs focused on
doctoral training in education leadership are on the rise, with
continually increasing numbers of programs and graduates in the
U.S. (B. D. Baker, Orr, & Young, 2007; Goldring &
Schuermann, 2009; Hackmann & McCarthy, 2011).
However, over the same time period there has been a host of
critiques of education leadership preparation, with increasing
attention on the Ed.D. as a problematic degree and training
structure for preparing graduates to actually lead schools and
districts well (Goldring & Schuermann, 2009; Perry, 2012;
Shulman, Golde, Bueschel, & Garabedian, 2006). Nevertheless,
throughout this context the quantitative research methods course
remains. It is within this context that I aim to consider the
following questions as a means to help engage students,
preparation programs, district and state school administrators,
and university faculty in examining the quantitative research
methods course. These questions include: What are quantitative
research methods courses in education leadership? What is the
purpose of such courses? Why are they included? What are the
expectations for student outcomes, especially as applied to their
work on the ground in districts? And what are some useful
structures, curricula, and instructional techniques for these types
of courses that can prepare practitioner-scholars to use data and
research in their everyday practice to help motivate instructional
improvement in their organizations?

In this article, I first overview the current conversation in the
research literature on the delivery of graduate programs in
education leadership, such as the EdD and PhD, aimed at
training scholar-practitioners to take leadership positions in
schools and districts. Second, I discuss the purpose of the
quantitative methods course in these types of programs of study,
an issue rarely discussed in the research literature. Third, I then
turn to specifics of how graduate programs can provide
professional capacity building through opportunities to discuss
and apply current research, engage in
meaningful analysis and critique of data in schools, and provide
opportunities for increased collaboration between universities
and districts. Throughout, my contention here for quantitative
methods courses in education leadership preparation programs is
that while it is important to provide instruction on basic statistics
and empirical reasoning through structured testing using
quantitative methods, quantitative methods courses provide an
opportunity to build the capacity of school leaders as
practitioner-scholars in assessment literacy, data literacy, and
how to facilitate and lead building professional capacity through
evidence-based improvement cycles.
Preparing Administrators to Lead Schooling Organizations:
Traditionally in the preparation of school district administration
and leadership, the quantitative research methods course has
been one of many courses designed to help the school system
leader learn the skills needed to effectively manage systems of
schools (Bruno & Fox, 1973; Kowalski, McCord, Peterson,
Young, & Ellerson, 2011). More recently, as research has shown
the effect that central office district administrators and the
superintendent can have on schooling outcomes, such as growth
in student achievement, the professional development of
principals and the central office, and the influence over school
facilities and community involvement (Bird, Dunaway,
Hancock, & Wang, 2013; Bowers, 2008, 2010b, 2015; Bowers
& Chen, 2015; Bowers & Lee, 2013; Honig, 2003, 2008, 2009,
2012; Wallace Foundation, 2013), preparation programs have
included the areas of instructional improvement, adult
development, and continuous systems improvement, among
others (Bryk, Gomez, & Grunow, 2011; Carter, Glass, & Hord,
1993; Drago-Severson & Blum-DeStefano, 2013). Across these
types of programs, graduates have historically rated their
experiences as preparing them “well” or “very well” for central
office and superintendent roles (Kowalski, et al., 2011). As an
example, the American Association of School Administrators
(AASA) has conducted over 40 years of extensive surveys of
superintendents from across the US, asking them a variety of
questions about the job every ten years, including their
perceptions of how well they were trained (Knezevich, 1971;
Kowalski, et al., 2011).
As a recent example of the positive perception of superintendent
training, when asked to rate their overall perception of their
academic training program, 78.5% of respondents replied that
their training was either “good” or “excellent”, mirroring other
similar studies (Kowalski, et al., 2011). Additionally, from the
2000 to 2010 AASA study, while there was significant growth of
the number of preparation programs and the overall number of
graduates with EdDs and PhDs from these programs,
superintendent responses to rating the credibility of their
professors as “good” or “excellent” rose from 65.9% in 2000 to
81.1% in 2010 (Kowalski, et al., 2011). In relation to the
importance of specific courses in their preparation programs to
the job of the superintendent, the majority of superintendent
respondents have continually rated school law, finance, public
relations and human resource management as “extremely
important”. However, interestingly in relation to the topic of the
discussion in this article, the courses receiving the most
responses for “unimportant” for superintendents are
organizational theory, tests and measurements, research, and
diversity.
Thus, these findings, from the people who actually do the job of
district administration, present an interesting conundrum given
the recent research literature on the EdD and PhD in education
leadership. Superintendents continually rate their university
training programs highly, yet there is a deep line of criticism in
the research literature of the focus, quality, and rigor of doctoral
programs in education leadership, in which these critiques focus
on the extent to which programs can prepare leaders for actual
practice in schools and districts (Goldring & Schuermann,
2009). In recent years, there have been multiple reports that have
critiqued the extent to which university preparation programs
train leaders for the job of running school districts (Grogan &
Andrews, 2002; Levine, 2005; Shulman, et al., 2006), especially
when it comes to the use of the EdD as the central capstone of a
practitioner degree - a degree which historically has taken the
form of a research dissertation (Townsend, 2002). This critique
also has extended to the PhD in the same and similar programs,
as there has historically been little difference between the two
degrees in practice, with aspiring researchers and practitioners
obtaining either degree, with the only difference in requirements
being an advanced statistics course for the PhD (McCarthy &
Forsyth, 2009; Osguthorpe & Wong, 1993).
Despite the positive responses of superintendents to their past
university training programs, to address these critiques from the
research literature many university programs have recently
engaged in redesigns (Sanzo & Scribner, 2015; Smrekar &
McGraner, 2009), refocusing their doctoral training programs on
the issues and the problems of practice that are of most concern
to their students in their daily work in schools (Carnegie Project
on the Education Doctorate, n.d.; Goldring & Schuermann,
2009; Shulman, et al., 2006). This refocusing is meant in part to
make the training more meaningful and relevant for practice in
districts. The vast majority of graduate students in these
programs are full time school practitioners who are steeped in
the everyday issues of schooling systems. A “problems of
practice” perspective is meant to engage practitioners in action
research (Herr & Anderson, 2015) in which graduate students
take on these issues that are most relevant for their context as a
means to engage both the graduate student and the organization
to be studied in working to solve real-world problems in schools
(Coburn, Penuel, & Geil, 2013). This thus addresses one of the

3
Bowers, A.J. (2017)
central critiques from the literature on the EdD on relevance of
the preparation program to practice.
However, despite the current popularity of a problems of
practice approach, there is a long-standing counter argument.
Published in the very first volume and issue of Educational
Administration Quarterly in 1965, Hills noted a central issue
with problems of practice when it comes to training future
leaders of schools:
This is emphatically not to say that practical problems
are not important, nor even that they are less important
than some other kinds of problems. But it is to say that
a problem centered approach to the study and practice
of administration, regardless of how scientific, obscures
and possibly precludes the recognition of a further, and
to me, equally significant kind of relevance... The tied-to-action
quality of current approaches to the study and
practice of administration, even those which
wholeheartedly embrace the social sciences, rules out
the possibility of developing in students and
practitioners what Berger has called the "sociological
consciousness" or "intellectual irreverence". p.23-24
(Hills, 1965).
In this quote and article, Hills outlines the point that a problems
of practice approach limits the student to attempting to solve the
everyday problems of a system that itself may be in need of
rethinking. To the point, Hills notes that the distinctive
characteristic of the action-oriented, applied science approach is
that “the goal is always given” (p.25). Thus, a goal of
university programs should be “that administrators in particular
should be aware of the fact that other worlds besides their own
do exist, that there are alternatives” (p.27). While written over
50 years ago, Hills’ points have a certain salience today,
especially when considered within the context of current social
justice critiques of the education system and education
leadership specifically (Brooks, 2012; Davis, Gooden, &
Bowers, 2015; Horsford, 2010; Reyes & Wagstaff, 2005), a
system for which Hills might argue a problems of practice
approach might help prop up rather than rethink, restructure and
reform.
To sum up these points, relying on yet another nearly 50-year-old
article on these issues, Cunningham and Nystrand (1969) could
easily be talking about contemporary issues of administrator
preparation when they noted:
Although we now perceive the administrator as an
applied social scientist and urge students to become
capable students of behavioral science, we have not put
aside altogether the images of educational superman,
technical expert, and democratic leader. We have
developed instead a very crowded curriculum which, in
too many cases, conveys a composite image of the
administrator who is all-knowing, well-versed in all
details of administering schools, and able to use
behavioral science "principles" p.10 (Cunningham &
Nystrand, 1969)
Thus, graduate students in education leadership today enter into
a field of university training programs simultaneously seen as a
positive stepping stone into the profession yet also under critique
and revision in an effort to make the student’s investment of
time and money in their training relevant, rigorous, applied, and
research-based. As with the majority of the sub-domains within
educational leadership research and practice (Oplatka, 2009;
Wang & Bowers, 2016; Wang, Bowers, & Fikis, 2015), the
history of professional preparation of school leaders in
university programs could be termed, as Riehl has recently
termed the research in educational leadership overall, as “mostly
unpunctuated disequilibrium” (Riehl, 2015). It is within this
context that I aim to discuss the issue of quantitative methods in
education leadership preparation programs, especially as they
relate to training doctoral students as scholar-practitioners who
aim to work as district leader practitioners.
Why Teach Quantitative Research Methods to Aspiring District
Leaders?
Why do we teach quantitative methods in programs that are
aimed to train working practitioners for roles in school and
district organizational leadership? Historically, it has been a
taken-for-granted course in university programs. But why
include it among all of the other possible courses that vie for
attention to help prepare students? As noted above,
superintendents rate research methods and data and assessment
courses as some of the least useful. Additionally, it is well-
known that while school leaders will often justify decisions in
schools through using the phrase “research says” as well as refer
obliquely to vague research topics such as “brain science”,
studies show that school leaders rarely read current education
research, nor do they incorporate specific research findings into
their practice, and even rarer still do they do primary research in
their schools (Fusarelli, 2008). Nevertheless, over the last 50
years, and especially the first decade and a half of the 21st
century, there has been an ever-increasing trend in the
publication of high-quality education research, from across
ideological, methodological and
epistemological domains (Wang & Bowers, 2016; Wang, et al.,
2015). Specifically for quantitative methods, as recently noted
by Guthrie (2009) in relation to discussing the EdD in education
leadership:
Modern education research increasingly is
characterized by a rigorous methodological and
philosophical paradigm entirely different than was true
even in the late 20th century. Experimentation and large
data set analyses, random and fixed effect modeling, are
now the expected research mode. Measurement
techniques such as those regularly used by
epidemiologists, psychologists, and economists,
regression and discontinuity regression analysis,
propensity analysis, and hierarchical linear modeling
are increasingly threshold quantitative skills for
research methodological competency. These are skill
sets and understandings that take time to impart, require
immersion in analyses and research to perfect, and are
not learned by lecture and from textbooks alone. p.3-4
(Guthrie, 2009)
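To make concrete one of the techniques Guthrie names, the sketch
below fits a minimal two-level hierarchical linear model (students
nested within schools) in Python. This is an illustrative sketch
only: the data file, column names, and model specification are
assumptions for demonstration, not taken from the article.

```python
# Minimal sketch of a two-level hierarchical linear model: students
# nested within schools, with a random intercept per school.
# The data file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student: school_id, ses (socioeconomic status), score.
df = pd.read_csv("student_scores.csv")

# Fixed effect for student SES; random intercept for each school.
model = smf.mixedlm("score ~ ses", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())  # fixed effects and between-school variance
```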

In addition to the growing diversity of quantitative methods
aimed to capture and model the complex sociological
interactions in schools (Goff & Finch, 2015; Hallinger & Heck,
2011), a search of the ERIC.gov education research search
engine shows that for the 2014 year alone, there were over
20,000 articles that mentioned “leadership” or “administration”,
with almost 3,000 of these mentioning “statistical analysis” as a
keyword. So what is the aspiring scholar-practitioner to do? Is it
possible to keep up with 20,000 articles or even 3,000, while
holding jobs in schools that require many more than 40 hours a
week? If we have reserved the time and space in a busy and
crowded university training program for quantitative research
methods, how should we use that time to best meet the needs of
the students, their organizations, the program and the research
literature?
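As an aside on reproducing counts like those above, ERIC exposes a
public search API, so a query of this kind can be scripted. The
sketch below is a hedged illustration: the endpoint, parameter
names, and field names are assumptions based on the public ERIC API
and should be verified against its current documentation.

```python
# Sketch: count ERIC records matching a keyword query via the public
# ERIC API. Endpoint, parameters, and field names are assumptions;
# verify against the current ERIC API documentation before use.
import requests

BASE_URL = "https://api.ies.ed.gov/eric/"

def eric_count(query: str) -> int:
    """Return the number of ERIC records matching a search query."""
    resp = requests.get(
        BASE_URL,
        params={"search": query, "format": "json", "rows": 0},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["response"]["numFound"]

# Approximation of the 2014 searches described above.
broad = eric_count(
    '(leadership OR administration) AND publicationdateyear:2014')
narrow = eric_count(
    '(leadership OR administration) AND '
    'subject:"statistical analysis" AND publicationdateyear:2014')
print(broad, narrow)
```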
This question of the point of quantitative research methods
programming has rarely been taken up in the education
leadership research literature. One of the few attempts to
overview the purpose of quantitative research methods courses
in education leadership graduate programs, as well as differences
and innovations across programs, was an effort by Bruno and
Fox in 1973 in a report commissioned by the University Council
for Educational Administration (UCEA) titled, appropriately
enough, Quantitative Analysis in Educational Administrator
Preparation Programs (Bruno & Fox, 1973). In their extensive
report, Bruno and Fox reviewed the literature at the time on the
use of quantitative methods in education administration and
management programs, and how the methods courses could help
aspiring school administrators address the needs of the rising
dual demands of accountability and instructional improvement.
Additionally, they provided overviews of the content of multiple
university programs, providing evidence that has not been
updated in the 40 years since. Indeed, a main recommendation of
the present paper is to encourage UCEA or other like-minded
institutions or researchers to provide evidence from programs in
a similar manner. In the following quote, Bruno and Fox (1973)
summarize well the position of programs on the purpose of
quantitative methods courses in graduate education administrator
programs both then and currently:
It is important to emphasize that programs constructed
for the practicing decision-maker should not be
designed to make him an expert in the use of the
various technical tools and concepts that are involved.
Rather, these programs should be designed to acquaint
him with what tools and concepts are available, under
what situations they can be used, and, most importantly,
what their limitations are. It is possible that most
program analyses will be performed by central office
staff or outside consultants. Other district personnel
should know what this group can do and be able to
interpret and apply the results of such analyses.
Moreover, all decision-makers should be able to apply
analytical thinking to the decisions they must make
daily. In brief, general administrators should be trained
to criticize and utilize analyses, rather than formulate
them themselves. p.24-25 (Bruno & Fox, 1973)
While obviously dated in their pronoun use, this quote from
Bruno and Fox exemplifies the central argument of the present
article: quantitative research methods courses in education
leadership preparation programs should teach the practicing
decision maker how to apply analytical thinking, formulate
evidence-based questions, and criticize and utilize analyses.
Given the vast quantity of research published annually,
combined with the ever increasing sophistication of research
methods, including quantitative, qualitative, and mixed methods,
a central purpose of leadership training is to train future school
system leaders to become consumers of this work and apply
critical thinking and evaluation of analytics to their decisions in
their schools on a daily basis.
Contemporary authors have worked to detail specifics of what
aspects of quantitative analysis may be the most useful for
practitioners to be fluent in, specifically assessment literacy and
data literacy. The term “fluent” here is purposeful, as much of
this literature uses the term “literacy” to evoke the idea of the
ability to read, unpack and summarize research and apply critical
thinking and questioning of that research to practice. First,
assessment literacy (Boudett, City, & Murnane, 2013; Popham,
2009, 2010) includes a working knowledge of assessment and
evaluation. Given the increasing demands of accountability in
schools and the use of standardized assessments, research on
assessment literacy has shown that a key component of
preparation programs should be to instill an ability in their
graduates to critique assessments and evaluations, know what to
look for when examining content, criterion and construct validity
arguments, and understand how to help teachers assess both
student growth and the teacher’s own development in valid and
reliable ways. This type of knowledge can change the stance of
administrators towards assessments from compliance to useful
feedback on student, teacher, school and organizational
performance, measured in many different ways beyond test
scores for formative and summative feedback (Halverson, 2010;
Leithwood, 2013). Second, data literacy (Jacobs, Gregory,
Hoppey, & Yendol-Hoppey, 2009; Mandinach, Friedman, &
Gummer, 2015; Mandinach & Gummer, 2013) includes the
concepts of knowing how to identify and collect relevant data
that can then be turned to analysis to help test hypotheses and
questions to help decide on and then monitor and iterate on
decisions and courses of action (Bowers, Shoho, & Barnett,
2014). More specific than data driven decision making (Wayman
& Stringfield, 2006), data literacy focuses on the tasks and skills
needed to organize and understand the information flow in
schooling organizations, and how to prioritize and analyze data
to help inform current decisions and evidence-based
improvement cycles (Bowers, 2008; Bowers, Krumm, Feng, &
Podkul, 2016; Bryk, et al., 2011; Cho & Wayman, 2015; Feng,
Krumm, Bowers, & Podkul, 2016; Marsh, 2012; Schildkamp,
Poortman, & Handelzalts, 2016; Wayman, Cho, Jimerson, &
Snodgrass Rangel, 2015).
Moreover, when it comes to the quantitatively-oriented
questions of practitioners in schools, recent research has shown
that the data and analysis needs of the system differ at the
teacher, principal and superintendent levels (Brocato, Willis, &
Dechert, 2014; Corcoran, Peck, & Reitzug, 2013; Cosner, 2014;
Farley-Ripple & Cho, 2014). The evidence from this work can
help form a basis for creating conversations around the data and
analytic needs of graduate students in education leadership
doctoral programs and their current and future organizations. For
example, Brocato, Willis and Dechert (2014) asked a large
sample of districts in a state to have superintendents, principals
and teachers respond to the prompt: “what components of a
statewide longitudinal data system are needed for that system to
best meet the respective needs of superintendent, principal and
teacher leaders?” Interestingly, their results showed that each
organizational level responded with very different needs in
which teachers focused on individual student demographic,
performance and growth needs, principals focused on teacher
evaluation and hiring, and superintendents focused on
comparative data (comparing student, teacher and school growth
over time) as well as budgets and community relations. The
responses point to three main issues. First, different stakeholders
have different data needs (Bernhardt, 2013). Thus, any
recommendations for encouraging data use must incorporate
these differing perspectives. Second, while all of the respondents
wished for data that would inform decisions on specific people,
such as students or teachers, they also indicated that
comparisons were very important. This points to the need for
analysis, especially correlations, cross-tabs, and scatterplots as
an accessible means to make comparisons. Third, the results
relate to the large variety of data and analysis needs throughout
each level of a schooling system, which highlights the need for
graduate instruction in how to select the “data story” among the
large variety of choices, as a means to focus the development of
assessment and data literacy skills within a school or district as a
means to build capacity for instructional improvement
(Bambrick-Santoyo, 2010; Bernhardt, 2013; Boudett, et al.,
2013; Cosner, 2014; Marsh, 2012; Marsh, Bertrand, & Huguet,
2015; Piety, 2013).
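To ground the comparison-oriented analyses named above (cross-tabs,
correlations, scatterplots), the following is a minimal sketch in
Python; the data file and column names are hypothetical stand-ins
for a district data extract, not data from the studies cited.

```python
# Sketch of the accessible comparison analyses named above, run on a
# hypothetical district extract with columns: school, proficient,
# attendance_rate, score.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("district_students.csv")  # hypothetical file

# Cross-tab: proficiency status by school, shown as row percentages.
print(pd.crosstab(df["school"], df["proficient"], normalize="index"))

# Correlation: attendance rate versus assessment score.
print(df["attendance_rate"].corr(df["score"]))

# Scatterplot of the same relationship, one point per student.
df.plot.scatter(x="attendance_rate", y="score")
plt.title("Attendance rate vs. assessment score")
plt.show()
```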
As an example from my own courses, in addition to the
interesting results of their study, the question asked by Brocato,
Willis and Dechert (2014) can be very useful to start a
discussion in a quantitative methods course or in engaging
school leaders in discussions about their data, helping to
structure conversations around issues of assessment literacy,
data literacy and data driven decision making. I have found that
school leaders will dig deeply into the main issues when I use
the following protocol: 1) ask the Brocato, Willis and Dechert
(2014) question, but ask participants to first write down their
responses as if they were a teacher in their own organization,
then repeat for the principal, and then the superintendent; 2)
have participants form groups of two to three and have them
discuss their answers; 3) then ask them to draw a Venn diagram,
with one circle each for the three different levels; 4) discuss
where the overlaps are and where they are not and ask why; 5)
then review their answers in light of the answers from the study
itself and provide time to discuss the differences and what they
might mean for how to understand the data and assessment
needs of a district. This type of discussion protocol in a course or
professional development opportunity provides an excellent
opportunity for practitioners to begin to unpack the differences
in data needs across an organization. For students in doctoral
programs in education leadership who most likely are teachers or
school building administrators, this type of discussion
encourages them to consider the data perspectives of each of the
levels in the system, and how those may differ, and then to
consider why they differ. For the practitioner-scholar in a
graduate quantitative methods course, using data encounter and
discussion protocols such as this creates a space in the course
and curriculum for the expression of their current data needs
around their problems of practice while structuring these
discussions through opening up the conversations to consider the
broader needs across organizations. The questions and issues
raised by students and instructors through this type of dialogue
can then be incorporated into course discussions, assignments, or
as a start in developing action research questions.
Given the large efforts of the work of educators over the last 50
years to generate and record ever increasing streams of data in
schools, in addition to assessment and data literacy, a central
professional development need of educators is now in turning to
building their capacity around data use to inform evidence-based
improvement cycles in their organizations (Bowers, et al., 2014).
As has been noted in the recent literature on evidence-based
improvement cycles in schools, schooling organizations should
strive to build trusting and robust cultures around evidence and
data use for everyone within the system (Boudett, et al., 2013;
Schildkamp & Poortman, 2015; Schildkamp, Poortman, &
Handelzalts, 2015). When done well, any teacher should be able
to ask any other teacher, principal or central office staff “what is
your evidence for that statement?”, and this question should be
interpreted by a colleague as a trusting and helpful question
which is aimed at helping the entire organization improve
(Boudett, et al., 2013; Bryk, et al., 2011). Note also that this
changes the orientation of the school organization, from one of
low evidence/high inference, to high evidence/low inference,
addressing the problems noted above, such as the use of the phrase
“research says”, instead focusing a school on examining the
evidence (Bowers, et al., 2014). As part of my argument here in
this article, quantitative methods courses should go beyond the
notion of action research focusing on addressing a specific
problem of practice for a student’s organization, and include an
opportunity for students to develop the skills on how to lead
evidence-based improvement cycles, also termed plan-do-study-
act cycles.
Thus, throughout this article I argue for a more applied focus for
the quantitative research methods course. Nevertheless, the
fundamentals of statistics methods and research are important
considerations to include within these types of courses,
especially when it comes to interpretation and application. As an
example, a component of the data literacy and assessment
literacy domains (Mandinach & Gummer, 2016; Popham, 2010)
is providing students with an understanding of sampling
distributions, especially when it comes to interpreting t-tests,
ANOVAs, and correlations (Cohen, Cohen, West, & Aiken,
2003), but also in building capacity around discussing both
averages, the variation around those averages (DeAngelis &
