USE IN PRACTICE:
Paper presented at the American Educational Research Association annual meeting. This
paper reports results from a study supported by a grant from the Joyce
Foundation and the Wisconsin Center for Education Research. Any opinions,
findings, and conclusions are those of the author and do not necessarily reflect
those of the supporting agencies.
This paper describes in detail the results of work in six Milwaukee Public Schools (MPS) on using data to inform decision-making. The work was the result of collaboration between a team from the Wisconsin Center for Education Research (WCER), headed by Norman Webb, and teams from each of the six schools (henceforth labeled QSP teams). The case studies represent a synthesis of data collected over the course of the two-year project.
Review of the Literature on Using Data in Schools
The Quality School Portfolio project was conceived to help schools adopt a
school-based version of Deming's Total Quality Management (TQM) for use in
school improvement planning and decision-making (Deming, 1986, cited in
Schenkat, 1993). Schmoker, borrowing from Deming's work (Schmoker, 1996;
Schmoker & Wilson, 1995), argues that short-term, measurable results are the
key to long-term improvement. Schools, Schmoker argues, would improve if they
focused on short-term, local data to guide the
achievement of long-term goals. Elmore and Rothman (cited in
) concur: "The theory of action of the basic standards-based reform model
suggests that, armed with data on how students perform against standards,
schools will make the instructional changes needed to improve performance."
Schmoker suggests that a process of collaboration and brainstorming, combined
with data collection and analysis based on student assessment, was responsible
for numerous examples of schools increasing test scores, dramatically so in a
number of his examples (Schmoker & Wilson, 1995).
This vision of school improvement is not without its critics. Kohn (1993)
describes a number of problems and inconsistencies experienced by schools
adopting a TQM approach. One of the
goals of TQM, according to Kohn, is that the improvement process be internally
driven. This characteristic
conflicts with the reliance on standardized test scores as a measure of student
achievement on the part of many school-based TQM proponents: most standardized
tests are externally developed without regard to a particular school’s
improvement process. Kohn
further argues that applying workplace terminology to schools is inappropriate
because it places undue emphasis on how well students learn particular content
rather than on what content students are learning. This can lead to a focus on
rote skill development at the expense of broader learning goals.
Nevertheless, the national standards and accountability movements have pressured schools into using data in their school improvement plans (Massell, 2001). Massell describes efforts in eight states to use data to inform decision-making and notes the increased emphasis on the use of data at the state, local, and school levels over the last twenty years. This, Massell argues, is the result of the standards and accountability movement, which has put pressure on schools to increase their test scores. The states’ role has gone beyond emphasizing the use of data in school improvement planning to providing professional development to school personnel. Massell reports that these efforts have resulted in increased demand for data. In some cases, local districts or schools have supplemented state efforts in providing assessment data.
Fullan argues that school improvement models need to be coherent (Fullan,
1999; Fullan & Hargreaves, 1998).
Fullan and Hargreaves (1998) suggest that schools need to develop
assessment literacy, defined as the capacity to: (1) analyze student data
and make sense of it; (2) develop school improvement plans based on data; and
(3) enter the debate about the uses and misuses of achievement data. Some local
efforts to develop this literacy are discussed below.
Herman and Gribbons (2001) and Nichols and Singer (2000)
describe specific local efforts to get schools to use data.
Nichols and Singer
describe how the use of “data mentors,” along with other measures, has helped
schools to interpret their data and increase their test scores. Herman and
Gribbons describe their efforts in two school groups to use school data
profiles; the profiles, which were portrayed in graphic form, provided schools
with accessible summaries of their data.
Herman and Gribbons
describe how one school used data to investigate some of the school’s
inequities. In another school, efforts to promote data use were met with
resistance. In these cases, the
district kept longitudinal data and the analysis was not complex. The authors
cite a need for data beyond standardized test data to understand and monitor
students’ progress. The
non-performance indicators, such as attendance and dropout data, also proved
problematic, and Herman and Gribbons suggest changes in the way they are
calculated to make them more useful. For
example, the district calculated attendance rates by looking at the average
number of absences, while schools would be more interested in data on the
percentage of days a student attends school.
The issue of non-performance indicators is particularly relevant to the case studies
discussed below. The school system
in which the schools are located tested only 4th, 8th, and 10th grades on an
annual basis. This limited
assessment data pushed the schools to consider other types of indicators to
measure improvement. Such data use contrasts with the literature discussed
above, which focused primarily on test scores as forms of measurement.
The literature shows an increased use of data in many local districts and schools. The anecdotal evidence provides a number of examples of
small-scale successes in solving particular school problems and in raising test
scores. This study contributes to
this small but growing body of literature on efforts by school staffs to use
data to make decisions.
This section will report findings from the six MPS case study schools in
terms of changes in data practices as a result of the QSP project. The Phase 1
schools, Forrester Middle School and Garden Heights Middle School,
began their involvement in the QSP project in January, 2000. The four Phase 2
schools, two middle and two high schools, began their involvement in August,
2000. We found clear distinctions in the level of success in using data for
decision-making between the Phase 1 schools and the Phase 2 schools. The
discussion below will highlight those differences and describe the most
important findings.
The practices that most distinguished the efforts of the Phase 1
schools were: 1) the creation of
reports at regular intervals on a variety of indicators; and, 2) the creation of
in-house databases to capture data normally sent to the district, or to generate
data at a finer grain-size than required by the district. The indicators in the
reports, mainly discipline-referral information, were aggregated at the school
team level (called Families at Garden Heights and Teams at Forrester). The data
necessary to measure the indicators were created by school-run databases; these
databases required dedicated data entry.
Both Garden Heights and Forrester were able to incorporate data as a regular part of their
decision making process. Garden Heights’s efforts were tightly focused on the
principal’s attempts to inform the school teams about their rates of
discipline referrals and resource usage. This was designed in part to put some
pressure on the teams to be more proactive and take more responsibility for
student behavior. The principal also used this data to justify a hiring
decision. The six-week reports created by the Forrester principal also centered
on discipline issues, although there was a greater effort to include academic
data and to link attendance, discipline, and classroom-level academic data to
district-level assessment scores. On an annual basis, the Forrester QSP team
used demographic and assessment data from the incoming sixth graders. The
Forrester team’s efforts to use data appeared ready at the end of our two-year
study to expand to additional staff
members and to affect a greater number of instructional decisions. Forrester’s
principal also used the results of his data analysis to inform hiring decisions.
Both principals felt that the use of data would produce greater accountability
among their staffs.
Forrester personnel, however, appeared ready to alter their current decision
making processes as a result of the QSP project. The principal made attempts to
expand the number of staff involved in using data; this included training
additional staff on the QSP software and on data analysis. He also held a data
workshop for about twenty teachers that coincided with the technical assistance
visit from the WCER research staff. Garden
Heights’ data use was expanded, but the school’s decision making process did
not appear to be greatly affected. In
a June, 2001, focus group, several staff members reported that the principal
still controlled most of the data and the way it was used.
They expressed the opinion that the same group of people was still
responsible for making decisions in the same ways as before.
The four Phase 2 schools were unable to establish any regular use of data to inform
their decision making during the course of the project. These teams
continued to express an interest in using data but needed greater access to
reliable data and increased technical capacity. An additional consideration
for the Phase 2 group was the inclusion of two high schools. The data management
system for the two high schools proved more problematic than the system at the
middle school level. These high schools reported access and accuracy problems.
Furthermore, the two high schools were structured by departments according to
academic discipline, while the four middle schools had school teams that cut
across disciplines. The obstacles faced by all six teams included a dearth of data, issues of technical
capacity, and lack of personnel resources. The lack of data available to schools
was affected by both the district’s data management policies, as well as its
assessment practices. The district is currently developing a plan to increase
testing so that most grades will be assessed every year; this will increase the
quantity of assessment data the schools will receive. MPS policies regarding
data flow back to schools were limited as a result of the transition that the
district’s information system infrastructure is undergoing. This transition has created barriers to data
collection and retrieval. The
completion of the district data warehouse and query tools should alleviate many
of the barriers to data access experienced by the MPS schools in the 2000-2001
school year. The lack of school-level technical capacity remains an enduring
problem. It became
clear to us that these capacity problems crossed technical, analytical, and
organizational areas in each of the six schools.
Despite the WCER research staff’s efforts, school staff members still
displayed a lack of confidence and ability to create functional databases and to
analyze the data they were able to access.
The capacity issue is aggravated by the need to master and integrate
several software packages and data systems.
Furthermore, the ability of the school teams to generate appropriate
questions and draw proper conclusions will require more training. Each school
also faced problems of data management. There
was limited time and there were few personnel available to dedicate to the task of
data entry and analysis. This was
why the principal at Garden Heights insisted that this process would only work
if there were a seamless way to incorporate the data entry into the appropriate
school routines. In
summary, the two Phase 1 schools that successfully incorporated
data: 1) had strong leadership from the principal; 2) invested considerable time
and effort; and 3) had two years of involvement in the project. The most
intriguing of the four Phase 2 schools, Norris, did not appear to be following
the pattern of strong principal leadership, but relied more heavily on
technology personnel to provide the impetus for using data.
In January, 2000, QSP research staff interviewed members of the two Phase
1 school teams. A total of nine team members were interviewed regarding:
the nature of the school staff’s decision-making processes; the
staff’s use of data; criteria for school success; and the staff’s most
pressing concerns. In addition, staff members at both schools were given a
survey that inquired about the same topics. Fifty-three surveys were returned.
The results of these interviews and surveys were compiled and analyzed to
determine baseline data on the two schools.
Technical assistance and training were provided for the Spring, 2000,
semester at the two schools. A total of four training days were allotted for
each school; the training focused on decision-making processes, software
operation, and data analysis. Technical assistance was provided by telephone and
by visits from the Milwaukee liaison. In addition to school-level assistance,
WCER staff members were also involved at the district level, partly in an effort
to gain access to data for the schools.
In August, 2000, WCER research staff interviewed members of the four
Phase 2 schools. The protocol used during the first round of interviews was used
for these interviews. In addition, the four school staffs were given the same
survey that was administered to the original two staffs. A total of 33
interviews were conducted and 44 surveys were completed.
These interviews and surveys comprised the baseline data for the four new schools.
The Phase 2 schools were allotted three days of training, which focused
on decision-making processes, QSP software operation, and data analysis.
Technical assistance was provided throughout the school year by telephone and
e-mail with WCER research staff and by visits by one of the WCER research staff
members. In addition, three of the schools had day-long technical assistance
visits in May and June to further facilitate the teams’ efforts. In May and
June, 2001, the WCER research staff conducted focus groups at each of the six
school sites. The focus group participants were asked about their efforts to use
data and about the impact of the QSP project.
The case study data were analyzed to obtain answers to the project’s
four research questions:
What are the data needs of schools?
How can the quality and flow of data to schools be improved?
What level of data analysis is useful to schools?
How can schools use data effectively to meet their needs?
What are the data needs of schools?
All six of the school QSP teams expressed a desire for rapid access to a
wide variety of data in convenient formats. In general, they wanted data that
would allow them to track their students’ academic and behavioral progress on
a regular basis. The majority of the QSP teams wanted both historical data on
the students, as well as data generated throughout the school year, such as
current attendance, discipline, and academic data.
Current academic data would include grades, aggregated at both the
student and teacher levels, and school-administered assessments. Several teams
explicitly commented on the need to integrate data collection and analysis into
regular school routines. All of the teams commented on their lack of access to
this wide variety of data.
The school teams wanted access to academic data, such as grades and
standardized test results, and data on attendance and discipline. The teams felt
it was important to monitor non-academic indicators, in part because these data
were readily available, but also because the teams felt that a causal link
existed between these factors and learning.
The teams wanted access to historical and current data. Historical data
refers to data from previous years and could include grades, standardized test
scores, and Special Education data. Current data refers to data collected
throughout the year, such as attendance, discipline, and current school-year
grades.
The teams expressed a need to receive data that were already in a format
convenient for entry into a database or computer program (like QSP) for
analysis. The district’s School Management System (SMS) had that potential,
but at the time of our study it lacked a functioning export feature, and it
lacked the range of reporting features of QSP.
The case studies provide examples of the ways individual school teams
varied in their expression of data needs. Teams at the two Phase 1 schools were
more sophisticated in the way they expressed their needs; they were interested
both in establishing baseline data and in measuring indicators at regular
intervals to monitor progress. The two Phase 2 high schools had great difficulty
obtaining trustworthy data of any sort; this situation constrained the efforts
of team members to get beyond data access issues to the utilization of data for
decision-making. At the two Phase 2 middle schools, the staff members of one
demonstrated increasingly sophisticated notions of how they might be able to
collect and utilize data to monitor their students’ progress.
How can the quality and flow of data to schools be improved?
All of the school teams experienced difficulty in gaining access to data
maintained by the district. The data the schools did receive were not formatted
conveniently for school use, were not electronically stored, were at an
inappropriate level of aggregation, and were not linked to other essential data.
Some data, such as attendance records, were sent by the district to
schools in large stacks of paper. Schools had to enter these data into their
computers in order to do analyses. Data were either aggregated only at the
school level or available only at the individual level. Compiling data on one
student might require accessing several different database systems, and, often
these data were of questionable accuracy. The database in use at the high school
level, the School Management System (SMS), had features such as the ability to
aggregate data above the student level and to report attendance accurately; these
features often did not work properly, and the data were not exportable to, or
compatible with, other database systems in place at the school.
The accuracy of data was cited by all of the QSP teams as a problem.
Attendance data from the SMS at the high school level were viewed as
particularly problematic, but district data on Special Education, state
assessments, and grades were also questioned by both the middle and high
school teams.
Nevertheless, the school teams expressed a desire for greater access to
the historical and operational data maintained by the district. A number of team
members expressed frustration with the district’s reluctance to provide
schools with data in an electronic format. In late Spring of 2001, the district
began training school-level staff on Brio software, which promised to facilitate
access to the district’s data warehouse.
What level of data analysis is useful to schools?
The level of analysis can be expressed along several dimensions: temporal, grain
size, and statistical analysis. The temporal dimension refers to the time
intervals used by schools in their regular collection and analysis of data; for
example, the interval could be six weeks or it could be a year, depending on the
availability and use of the data. Watson,
referring to the data rather than to its collection, calls this characteristic temporal
resolution (Watson, 2001). For example, a school might collect and aggregate
discipline referral data every week. This datum’s temporal resolution is
weekly. The grain size
(Watson, 2001) refers to the size of the group a particular datum is describing,
or to the number of items in an assessment that a particular score represents.
For example, a school team might want to look at the school’s median test
score on a mathematics assessment; or the members might decide to look at the
scores of the 6th graders from a particular feeder school. Statistical
analysis is represented by the types of charts and techniques schools employ to
analyze data. A simple descriptive
analysis might include pie charts and histograms; a medium level of
sophistication would include statistics that describe distributions, such as
measures of central tendency and variance.
A more sophisticated analysis would include some inferential statistics.
School teams were interested in
looking at data at regular intervals, although only the two Phase 1 schools were
able to accomplish this. The Phase 2 schools expressed an interest in generating
data that would regularly measure the progress of students along a variety of
indicators, but did not have the capacity, or access to appropriate data, to
accomplish this. The school teams did look at data generated on an annual basis,
such as state or district standardized assessments. These data were frequently
used in the schools’ Educational Plans, the district-mandated school
improvement plans.
All four middle school teams
were interested in looking at data on groups of students. The most common way
the school teams wanted to group the data was according to internal school teams
of teachers and students. For example, one principal wanted to compare rates of
discipline referrals among the various teams (the school-level division of
teachers and students) at his
school. Another example involved looking at 8th graders who needed to pass the
MPS Middle School Proficiencies. The
school teams analyzed data aggregated at the school level or grade level as
well. Examples include looking at the students’ performance on a standardized
assessment, or at the distribution of mathematics scores of an incoming 6th-grade
class.
The school teams mostly relied
on QSP to produce simple descriptive analyses, such as frequency tables, pie
charts, and histograms. In many cases, they had never before seen these types of
visual representations of their students’ characteristics. The two Phase 1 teams
occasionally utilized QSP to produce distributions of means to compare
cohorts of students, but otherwise the schools did not engage in analysis much
beyond simple description.
How can schools use data effectively to meet their needs?
The two Phase 1 schools were most effective in using data to inform their
decision-making. Both of these schools prepared reports that were disseminated
to their internal school teams at regular intervals. The reports differed, but
they both contained data regarding the number of discipline referrals. The
Garden Heights report did this in a more detailed way and also included data on
resource usage of each internal team. The Forrester report contained information
about grades and attendance, in addition to the discipline data. It is important
to note that these schools focused on these data in part because they were the
most accessible and because both schools felt that discipline was an important
indicator to monitor.
The Forrester QSP team tended to use data to inform a broader set of
decisions than did Garden Heights. Members used data as a basis for resource
decisions, for the proportional distribution of students to school teams, for
school policy, and for attempts to align grading policies with achievement results.
This ambitious agenda reflected the principal’s strong long-term
interest in data use.
The teams at the four Phase 2 schools provided a number of examples of how they used
data to inform their decision-making. These examples can be analyzed in terms
of: 1) who was using the data; 2) what data were used; and 3) what kinds of
decisions were made as a result of using data. The examples were wide-ranging on
all three criteria.
The three groups of people from the Phase 2 schools who reported using data were
teachers, administrators, and support personnel (counselors, Special Education
teachers, curriculum specialists). For example, an administrator at one of the
high schools used attendance and assessment data to alter the 9th-grade
schedule. A counselor at one of the middle schools used attendance and
discipline referral data to select students into a program to help 8th graders
pass the MPS Middle School Proficiencies.
The data mentioned were reading scores from a variety of assessments, grades,
pass/fail rates, Honors Level discipline data, attendance rates, MPS Proficiency
scores, and enrollment rates. For
example, pass/fail rates at one of the high schools were the basis for abandoning
the 9th grade family team structure. In another school, low mathematics scores on a
statewide assessment motivated the school to switch to block scheduling.
Staff members made decisions that influenced instruction, scheduling, course
offerings, school structure, student privileges, team performance, resource
allocation, and program enrollment. For
example, one school used a software program to track students’ discipline
referrals; students with positive ratings were granted privileges for
extracurricular activities. At
another school, low reading scores on an internally administered assessment were
used by the librarian as a basis for purchasing more appropriate reading books.
The difference between the Phase 1 schools and the Phase 2 schools was
that the examples of data use in the latter schools seemed to be isolated and
uncoordinated. The two Phase 1 schools had principals who were strong advocates
of a data-driven approach and this was reflected in the more integrated manner
in which they incorporated data into their decision making process.
At the end of the two years of the project, each QSP school team participated in a
focus group. The purpose of these focus groups was to assess the results of the
efforts of the school teams to use data to inform their decision-making. Some of
the comments have been summarized above. Other comments respond in a tangential
but relevant way to the research questions. Some comments are also included
below.
The QSP school teams identified a variety of conditions that would
facilitate the use of data to inform decisions. These conditions reflected both
the resource commitment necessary for data collection and analysis and the
cultural change necessary to implement such a process.
The resource requirements
included the need for personnel who have both time and technical expertise. The
required expertise was defined as the ability to enter and analyze the data, as
well as the ability to interpret the results. This included the ability to
determine what questions to ask and what variables to compare. Another resource
requirement involved finding convenient processes and software for collecting
and analyzing data.
The necessary cultural changes entailed the need to establish a focus for data
collection and analysis. This meant that data collection and analysis needed to
follow an established plan that had been discussed by all of the involved
stakeholders. The other cultural change needed involved the creation of what one
team member called a “non-defensive atmosphere,” which would promote the use
of data for school-wide improvement and thus make school staff more comfortable
in using data.
Changes in the Data Process as a Result of the Project
The two Phase 1 school teams reported a more substantial impact from the
QSP project than did the four Phase 2 teams. Phase 1 teams reported an expanded
role for data use in their decision-making processes and in a school-wide
awareness of the impact of the use of data at both sites.
The Forrester team reported a stronger focus in its data collection; it also reported that the
use of data had helped to reduce the emotional aspect of important decisions and
had helped to expand the number of staff members involved in decision-making.
The Garden Heights’s team also reported increased use of data to inform
decisions, but the attempts were less focused and created greater levels of
anxiety among the staff than those at Forrester.
The four Phase 2 school teams reported an increased awareness of the difficulties of
collecting and analyzing data to help inform decision making. This awareness
included the need for future efforts to be more focused and to include multiple
sources of data to analyze a problem. Beyond that, their efforts were still in
the preliminary stages at the end of the academic year. The four schools
reported obstacles in the gathering of data, as well as in developing the
technical capacity to use data efficiently.
Data Uses and Processes
The data uses that the QSP school teams would like to implement in the
future fall into two categories: individual student profiles and school
team-level indicators.
Four of the six teams would ultimately like to create extensive
electronic student profiles. This
would allow teachers to easily retrieve information regarding a student’s past
academic performance, Special Education history, and discipline referral
information. One principal also
expressed an interest in having students monitor their progress via electronic
profiles. Ideally, this information
would be updated at regular intervals to help teachers track their students’
progress.
The two Phase 1 schools already provide their school teams with regular reports containing about half a dozen measures (which differ at the two sites). Both of these QSP school teams would like to see the number of measures expanded to help determine the effectiveness of various interventions and teaching approaches. In addition, two of the other QSP school teams also expressed an interest in creating school team-level indicators. These would help the school teams track the progress of their students on important indicators, both behavioral and academic.
Fullan, M. (1999). Change forces: The sequel. Philadelphia: Falmer Press.
Fullan, M., & Hargreaves, A. (1998). What's worth fighting for out there? New York: Teachers College Press.
Herman, J., & Gribbons, B. (2001). Lessons learned in using data to support school inquiry and continuous improvement: Final report to the Stuart Foundation . Los Angeles, CA: Center for the Study of Evaluation.
Kohn, A. (1993). Turning learning into a business: Concerns about total quality. Educational Leadership, 51(1), 58-61.
Massell, D. (2001). The theory and practice of using data to build capacity: State and local strategies and their effects. In S. H. Fuhrman (Ed.), From the capitol to the classroom: Standards-based reform in the states. One hundredth yearbook of the National Society for the Study of Education, Part II. Chicago: University of Chicago Press.
Nichols, B. W., & Singer, K. P. (2000). Developing data mentors. Educational Leadership, 57(5), 34-37.
Schenkat, R. (1993). Deming's quality: Our last but best hope. Educational Leadership, 51(1), 64-65.
Schmoker, M. (1996). Results: The key to continuous school improvement. Alexandria, VA: Association for Supervision and Curriculum Development.
Schmoker, M., & Wilson, R. B. (1995). Results: The key to renewal. Educational Leadership, 52(7), 62-64.
 The names of all schools mentioned in this paper are fictitious.