Center for the Study of Systemic Reform
   in Milwaukee Public Schools



 Data Use in the Classroom:
The Challenges of Implementing Data-based Decision-making at the School Level


Christopher A. Thorn
University of Wisconsin-Madison
Wisconsin Center for Education Research


Presented at the Annual Meeting of the American Educational Research Association, April 1-5, 2002.

This paper reports results from a study supported by a grant from the Joyce Foundation and the Wisconsin Center for Education Research. Any opinions, findings, and conclusions are those of the author and do not necessarily reflect those of the supporting agencies.


This paper will examine problems school-level staff encounter when attempting to implement data-based decision-making reform efforts, specifically those decisions that influence teaching and learning in the classroom. The paper will also offer recommendations for professional development that address gaps in traditional principal and teacher training.

Many schools and districts are exploring data-driven decision making as a solution for improving resource allocation and instructional program decisions. One of the most challenging problems policy makers and educators face in attempting to implement curriculum reforms is that intervention decisions are made at least one organizational level above that of the teachers—the persons actually engaged in instruction.

Any systemic effort to implement a focus on data-based decision making at the school and classroom level faces several challenges. First, most data available within district information systems are limited to what has been deemed important for the operational needs of schools and the district and will only be available on systems supported by centralized computing services. These data include attendance, discipline, and basic demographic data. District systems will also contain detailed information about human resources, budgets, and other business processes. Typically, the only outcome data available are the results from centrally administered tests (which are often annual events) and grades. While these data are useful for framing annual analyses of school-, classroom-, or student-level outcomes, they are inadequate for making mid-course or interim instructional decisions within a single grade/marking period.

Second, educational organizations have much to learn about the complexities of integrating data effectively into their decision-making processes. While the issues involved in successful data-based decision making are beginning to be discussed in the literature on educational administration and assessment, an important and growing body of relevant work is emerging from business schools around the world. These studies range from considerations of the role of experts in organizational learning (Albert and Bradley 1997) to multi-dimensional representations of the lifecycle of knowledge (Boisot 1998). The work that is emerging from educational sources tends to be focused on application. For example, the journal School Administrator recently dedicated an entire issue (April 2001) to data-driven decisions, addressing in particular the question of what data and what representational forms are appropriate for district- and school-level administrators as they attempt to evaluate the impact of curricular changes (Creighton 2001). This work addresses systemic reform concerns that help to provide both the technical and information resources necessary to support school-level data-based decision making.

Third, while large-scale efforts to make assessment and accountability data more generally available do provide some insight into the performance of an educational system, it also seems reasonable to assume that there are major differences between the evidence used for external accountability systems and the data needed for making instructional decisions on a quarterly, weekly, or daily basis in the classroom. The large gap between the time horizons of state-level testing and the more immediate needs of program administrators and individual teachers in the classroom means that data needs and rules of evidence will be commensurately divergent. Moreover, many teachers (and administrators) do not have the skills to adequately understand and accurately use standardized test results and accountability data. The problem at this level is ensuring that building-level leaders and individual teachers have the necessary skills to engage in action research, data analysis, and evaluation.

Approaches to Understanding Decision Making and Knowledge Work

            There is a growing literature on information seeking, information processing, and information use that provides insights into how individuals and groups identify information needs and then respond (or choose not to respond) to those needs (Davenport and Prusak 1998; Wai-yi 1998; Choo, Detlor et al. 2000; Höglund and Wilson 2000; Ford 2001; Höglund and Wilson 2001). This work draws on and can be used to frame other work in the areas of group decision making (Eden and Ackermann 1998; Snyder 2000; Bhatt and Zaveri 2001; Fazlollahi and Vahidov 2001; Alter 2002), knowledge management (Greengard 1998; Hibbard 1998; Davenport 1999; Creighton 2001; Harrison and Pelletier 2001; Thorn 2001; Jasinski and Huff 2002), and the human factors of decision support systems (Albert and Bradley 1997; Boisot 1998; Eden and Ackermann 1998; Shneiderman 1998; Bhatt and Zaveri 2001; Harrison and Pelletier 2001; Alter 2002).

            One recent study of information seeking on the web provides an excellent synopsis of what the authors call an integrated model of human information seeking (Choo, Detlor et al. 2000); Figure 1 summarizes this model. The important aspects of the model for this paper are the intersections between the different behavioral areas: the identification of informational needs, seeking to fulfill those needs, and use of information to address the identified needs. The three points outlined in the introduction can be addressed through this model. I will use a school improvement team attempting to create an improvement plan for math instruction as an example case to illuminate several aspects of this information seeking model. The anecdotes described in this case are taken from lessons learned from working with school improvement teams and from school improvement plan documents.

Information Needs

First, the identification of information needs is a primary problem in any type of school reform or curriculum improvement. As indicated above, much of the data available from district information systems is limited to data useful for making district-level decisions. The granularity[1] and temporal resolution[2] of the available data severely restrict its usefulness for different user groups. In an effort to understand current performance and how that performance relates to school and district goals, the school improvement planning [SIP] team can examine district accountability reports for aggregate data and school- and individual-level score sheets from the math component of the annual standardized achievement test, and compare the results on these metrics to goals set by the school, district, and/or state for their desired or expected performance.

In this situation, the SIP team attempts to frame a problem. For example, fourth and fifth grade students seem to display declining scores on the math component of a district-wide test, leaving a gap between the observed performance of the students and the outcomes expressed by the accountability goals. At this point, the problem has been defined as a gap. Typically, the data contained in the accountability system are not sufficient to diagnose the cause of the performance gap. Information on curriculum, teacher ability, classroom resources, attendance, discipline problems, contributing social/home factors, etc. comes from many different sources. The next step is a decision making process that produces one of two outcomes. Either the SIP team identifies gaps in its knowledge about what might explain the performance gap and initiates a search for additional information, or the choice is made to avoid the problem. This avoidance might take the form of a simple denial of the problem (bad tests, high student turnover, etc.), or the team might choose a solution based solely on the easily available aggregate data and make an intervention decision based on incomplete (or nonexistent) data.
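The gap-framing step described above is, at bottom, a comparison of observed scores against accountability targets. As a minimal sketch, assuming entirely hypothetical subscale scores and a hypothetical district target, it might look like this:

```python
# Hypothetical illustration of gap identification: comparing observed math
# subscale scores for two grade cohorts against an accountability target.
# All cohort names, subscale names, and numbers are invented for the sketch.

TARGET = 70.0  # hypothetical district proficiency target

observed = {
    ("grade 4", "computation"): 72.1,
    ("grade 4", "problem solving"): 63.5,
    ("grade 5", "computation"): 68.9,
    ("grade 5", "problem solving"): 61.2,
}

def find_gaps(scores, target):
    """Return (cohort, subscale, shortfall) for every score below target."""
    return [
        (cohort, subscale, round(target - score, 1))
        for (cohort, subscale), score in scores.items()
        if score < target
    ]

for cohort, subscale, shortfall in find_gaps(observed, TARGET):
    print(f"{cohort} / {subscale}: {shortfall} points below target")
```

Even this toy version makes the point the paragraph above develops: accountability data can locate where the gaps are, but says nothing about their cause.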


Figure 1 Human Information Seeking: An integrated model (Choo, Detlor et al. 2000, p. 22)

Figure 2 Sensemaking, Knowledge Creating, and Decision Making – adapted from (Choo, Detlor et al. 2000, p. 64).

Information Seeking

            This step is the most intriguing part of the information processing model when examining school-level decision making and the use of decision support tools. Once an information gap is identified, the SIP team is faced with expressing a model of learning that encompasses the outcome and includes factors that the team feels contribute, either directly or indirectly, to that outcome. It is at this point that the search for new sources of information begins.[3] These data may reside in teacher grade books, lesson planning software, local databases, locked filing cabinets, etc. The team is faced with the task of assembling data that complies with the model of learning they have expressed. One important dimension of the information seeking activity is differential ease of access. Some data will be readily available in a central data store or in the school office. Other data will be in paper form in or on an individual teacher’s desk. Resolving access issues is a constant struggle, since the resources needed to acquire hard-to-find or hard-to-manage data may exceed their value in the analysis.

Issues of the reliability and validity of the data are also important components of the discussion, particularly if students or teachers with substandard performance are to be exposed to serious consequences. In-class assessments, for example, may be entirely valid measures in that they accurately represent the content of the curriculum and related learning standards. The fact that they are scored by different teachers and that the sample sizes are small (individual classrooms), however, means that scores will not be reliable. It is at this point that tools for supporting data collection and data exploration are most important. The ability to visualize student outcomes across any of these data types will show how consistently a certain group of students performs at a particular level.
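The reliability concern about small classroom samples can be made concrete with a back-of-the-envelope calculation: the standard error of a classroom mean shrinks only with the square root of the sample size. A sketch, assuming a hypothetical within-classroom standard deviation:

```python
# Why small classroom samples make in-class assessment scores unreliable:
# the standard error of a classroom mean is large relative to the score
# differences a SIP team might want to detect. All values are hypothetical.

import math

def standard_error(sd, n):
    """Standard error of the mean for a sample of size n."""
    return sd / math.sqrt(n)

sd = 12.0  # hypothetical within-classroom standard deviation of scores

for n in (25, 100, 400):
    se = standard_error(sd, n)
    # Rough 95% margin of error around the observed classroom mean
    print(f"n={n:3d}: mean score +/- {1.96 * se:.1f} points")
```

At a classroom-sized n of 25, the margin of error is several points, which is why single-classroom comparisons against a district mean are so fragile.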

            There are several districts in Wisconsin where district-level IT and/or research units are attempting to make more and more data available through a data warehouse interface. Some of these efforts rely on a combination of warehouse access for accountability and school management data and local tools (spreadsheets, database packages, statistical analysis, report generators, etc.) that allow school-level staff to combine local information with district-level data for a variety of possible uses. Decision support models need to reflect both a district’s governance structure and its IT infrastructure.
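At its simplest, the combination of warehouse and local data is a join on a student identifier. The following sketch illustrates the idea; all field names, identifiers, and records are invented for illustration, and real district extracts would differ.

```python
# Minimal sketch of combining district warehouse extracts with locally held
# classroom records, joined on a student ID. All data here is hypothetical.

district = {  # e.g., exported from a district data warehouse
    "S001": {"attendance_pct": 94, "math_scale_score": 610},
    "S002": {"attendance_pct": 81, "math_scale_score": 545},
}

local = {  # e.g., keyed in from a teacher's grade book
    "S001": {"quarter_grade": "B"},
    "S002": {"quarter_grade": "D"},
    "S003": {"quarter_grade": "A"},  # no district record yet
}

def merge_records(district, local):
    """Outer join on student ID; missing fields are simply absent."""
    merged = {}
    for sid in set(district) | set(local):
        merged[sid] = {**district.get(sid, {}), **local.get(sid, {})}
    return merged

combined = merge_records(district, local)
```

The outer join matters in practice: a student like "S003" with only local records still appears in the combined view rather than silently disappearing from the analysis.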

Information Use

            The information use portion of the model describes the combination of the identified needs and the acquired information and its translation into action. This is the area where the identification of gaps in student understanding or in teacher professional development is codified based on the gathered information and a plan is implemented to address the shortcoming.

            As the nodes within this section of the model indicate, information use interacts with human tendencies to engage in maintenance of the current practice and accept existing norms of behavior. Information may also be used to choose new paths, reassign resources, and question existing rules. There are any number of instances where school reform initiatives challenge existing practices, allocation of resources, and the skills of teaching professionals. It is only in gathering appropriate information that newly defined problems can be examined and the results of new efforts can be evaluated. School reform efforts interact with all areas of this model. Policy makers and implementers must recognize these social structures and practices in order to support the desired behavior and achieve the target outcomes.

Decision Making Models

Choo et al. provide a second model (see Figure 2, above) that parallels the human information seeking model. This model provides a shorthand for a larger decision making model and has three parts that culminate in decision making. The authors argue that sensemaking is the first important activity: the process of coordinating beliefs with extant information. The sensemaking process often requires that one engage in knowledge creation, bringing external information together with implicit and explicit knowledge about the current problem. Knowledge creation serves to fill gaps in understanding left by the sensemaking process. Finally, one then moves on to the decision making process, which integrates new understanding with existing rules and procedures that guide action.

Knowledge-based systems are designed to support a decision making model such as the one outlined above. Knowledge systems allow users to explore many different alternatives in a series of “what if” models. Systems that support exploration of large, complex data sets also support the development of increasingly sophisticated mental models on the part of users. Use of the environment itself can contribute to the user’s personal skills as an analyst (Jasinski and Huff 2002). School reform efforts encourage school-level staff to make informed decisions that both operationalize their own long-term strategies for student learning and professional growth and align with larger district and societal goals. Decision support systems can help to make sense of the overwhelming complexity of large data collections.

Another important benefit that decision support systems should provide is supporting users as they search for and explore the attributes of alternative strategies. The search for alternatives is one of the most important activities in which a school improvement team engages. Shortcomings in existing outcomes lead to an identification of gaps, failures, and needs. The search for solutions is much enhanced if the governance structures and the norms of interaction support searching for alternative approaches. In situations where resources are fixed in a particular array (funding, staffing, curricular support materials, etc.), alternatives may be quite limited. It is, however, the search for alternatives that provides a context for meaningful dialogue between professionals on a school improvement team.[4] In Figure 2 (above), Choo et al. summarize an important parallel to the information use model. Rather than focusing on information itself, this model addresses the actions that take place in each of three main areas of activity.


Sensemaking

In the case of our example, sensemaking is one of the core activities that is often overlooked in school improvement planning. Planning templates are usually created at the district level as a part of a district’s accountability system. Outcome metrics are annual scores or percentiles that apply a single accountability model to all schools equally. The sensemaking process provides the framework for local decision making. In this example, when faced with declining math scores, a SIP team might look at the subscale scores on that test for both grade cohorts and individual students to see whether they observe score declines across the board or in specific areas. This would allow the team to reconcile their local beliefs about the quality of student work and the relationship of existing classroom practices to that work. This process mirrors the information needs process in that it is through the identification of what is to be studied that information needs emerge.

Knowledge Creation

            The knowledge creation process can again be seen as parallel to information seeking. Knowledge is created by assembling individual understanding, bringing in outside information in the form of possible alternatives, lessons learned, etc., and relating that to information already in use. In the example case, this is likely to take the form of exploring what the math test actually measures. This information is then combined with an understanding of the relevant teaching and learning standards to compare what is measured with what is being taught. This synthesis then frames the gaps in the test and the gaps in the curriculum. This gap analysis, combined with the sensemaking process, then provides a framework for crafting outcomes.

Decision Making

            Now that the necessary new knowledge has been created and the SIP team has a clear understanding of the instructional needs of the students and the areas that the test does not measure, it is time to examine a series of alternatives that would achieve the learning goal. This is another area where teams are likely to find great frustration. Often, schools have few resources for identifying alternative approaches to instructional challenges. They often have to rely on a few professional development days, personal experience, and limited contacts with other professionals outside of the school year. However, the team that takes the sensemaking process seriously, perhaps by setting aside the narrowly focused SIP framework, is often better prepared to make a more concrete statement and choose a plan that fulfills their understanding of the needs of the current situation.


            These models are not presented as a panacea for decision making challenges, but they do provide a more human and responsible framework for understanding the dynamics of successful school improvement planning at the school level. It is in this area that decision support systems will be heavily challenged to provide useful support.

Characteristics of Group Decision Making

Carneiro describes the group decision making process in very accessible language (Carneiro 2001, p. 223). He states that the process begins with a needs assessment and a determination of the scope of the problem. A decision team then moves on to collect information, evaluate alternatives, and implement the solution.

Identification of individual preferences is vitally important to understanding the outcomes of decision making exercises. The social combination approach to decision making focuses on how much support each alternative has at the outset, before any discussion (Baron, Kerr et al. 1992, p. 94). A school where the majority of SIP team members prefer to focus on personal professional growth as the primary force in school improvement would take a very different approach than would a school where SIP members valued collective action. The preferences of the team members participating in the decision making process have a great deal of influence over the outcome before any evidence is introduced. This understanding of group decision making can be played off against the more traditional approach of examining the content of the discussion within the group to trace the movement from disagreement to consensus (Baron, Kerr et al. 1992, p. 104). It is the interplay between the information brought to the group process and the individual preferences of participants that must be studied in combination if one is to understand decision outcomes.

Strategic Decision Success

Making successful strategic decisions has been studied many times. One interesting analysis of strategic choices provides a simple model that describes the challenges faced by decision makers in differing organizational environments (Harrison and Pelletier 2001). Harrison and Pelletier describe a simple 2x2 matrix that yields four different outcomes. The rows represent the decision maker’s attitude toward a particular decision.

In this model, some actors are categorized as being focused on maximizing the outcome from a choice. As maximizers, decision makers in this group use a computational approach toward a decision, weighing the costs and benefits in an objective manner. Seeking to maximize outcomes also presumes that one has better information, more flexible resources, etc. The other approach on this axis is seeking a satisfactory outcome. In this case, the decision maker seeks simply to meet the strategic goal. There is an acknowledgement that the decision maker is making a judgment call on the best available data, but the focus is on achieving the goal, not on maximizing the outcome. The other axis represents the decision maker’s attitude toward the decision making process itself. Attitude is expressed in terms of the relative openness of the decision making process and whether or not the strategic goals are attainable.

The authors suggest that while Type D is a position that guarantees failure, Type B and Type C situations hold out some possibility of a successful outcome if the decision maker can see the incentive structure that suggests a change in attitude. 

Table 1 Strategic Decision Matrix adapted from (Harrison and Pelletier 2001, pp. 172-3)


                                              Attitude toward the decision making process

Attitude toward the decision                  Attainable objectives/              Unattainable objectives/
                                              Open decision making process        Closed decision making process

Judgmental decision making strategy/          Type A                              Type B
Satisficing outcome

Computational decision making strategy/       Type C                              Type D
Maximizing outcome


What this example means in a school decision support setting is that district- or school-level processes that do not allow teachers and other school-level practitioners to participate in the goal setting, needs assessment, implementation, and evaluation processes are very likely to fail because they do not include the appropriate participants. This lack of participation also increases the likelihood that the objectives will be unrealistic and unattainable. Along the same lines, a strategy that focuses on extracting the maximum benefit from a particular intervention is guaranteed to rob much-needed resources from other areas of the educational enterprise where they are needed just as desperately.
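Because the Harrison and Pelletier matrix is a two-attribute lookup, its logic can be encoded in a few lines. The attribute names below are my paraphrase of the table's row and column labels, not anything taken from the original source:

```python
# The strategic decision matrix as a lookup table. The string labels
# ('judgmental', 'computational', 'open', 'closed') are paraphrases of the
# table's row and column headings, chosen for this sketch.

def decision_type(strategy, process):
    """strategy: 'judgmental' (satisficing) or 'computational' (maximizing);
    process: 'open' (attainable objectives) or 'closed' (unattainable)."""
    matrix = {
        ("judgmental", "open"): "A",
        ("judgmental", "closed"): "B",
        ("computational", "open"): "C",
        ("computational", "closed"): "D",
    }
    return matrix[(strategy, process)]

# A maximizing strategy in a closed process is the Type D position that
# the authors argue guarantees failure.
assert decision_type("computational", "closed") == "D"
```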

Bhatt and Zaveri provide an alternate perspective on the characteristics of decision support systems [DSS]. In this model, DSS include a wide range of data management and data visualization features. The authors describe efficient access to data, the ability to experiment with variables to find new correlations, generation of alternative models, trend analysis of historical data, and summative evaluation as vitally important aspects of decision support and as necessary if one is serious about supporting organizational learning (Bhatt and Zaveri 2001, pp. 304-5). Many data warehouse and other district-level information access projects never provide any services beyond enabling access to data. Without the incentives provided by the more sophisticated elements, however, many potential users may not find that learning to use the system is worth the effort. In the case of a school improvement model, limited time and skills on the part of the participants argue for a system that contains a great deal of embedded knowledge that can be manipulated by novice users.

Technical Capacity

There is no single approach to supporting better decision making at the school level. There are, however, a number of approaches to data collection and analysis that might support the development of more sophisticated questions and a more rigorous search for alternatives. As discussed above, state- or district-level testing is often the only common assessment metric (aside from grades) that is available for analysis of student outcomes. In addition, schools may have access to attendance, discipline, and other administrative data at some level of aggregation. In the case of elementary schools, in particular, much of this data is kept in paper form in teacher grade books and lesson plans. In such cases, only aggregated data is passed up to the school- and district-level for reporting student outcomes and providing compliance evidence.

Centralized classroom record keeping is possible in some environments. The majority of school classrooms have Internet-connected computers and would, therefore, potentially have access to remote systems for entering classroom data and receiving reports. According to a recent National Center for Education Statistics [NCES] study, the availability of networked computers in classrooms has increased substantially.

Since 1994, when 3 percent of instructional rooms had computers with Internet access, public schools have made consistent progress in this area: in fall 2000, 77 percent of instructional rooms were connected to the Internet, up from 64 percent in 1999. However, in 2000, as in previous years, there were differences in Internet access in instructional rooms by school characteristics. For example, in schools with the highest concentration of students in poverty (75 percent or more students eligible for free or reduced-price school lunch), a smaller percentage of instructional rooms were connected to the Internet (60 percent) than in schools with lower concentrations of poverty (77 to 82 percent of instructional rooms) (Cattagni and Farris 2001, p. 1).

While schools and students in urban districts continue to be relatively disadvantaged in their access to Internet technologies in the classroom, there have been substantial improvements in technical infrastructure. To cite a regional example, both the Madison and Milwaukee public school systems have made major improvements in network infrastructure and have increased the number of networked computers available for administrative and instructional uses. Unfortunately, the technology alone is not sufficient to support robust data-based decision making.

Organizational capacity

            Technology in the classroom is important, but more important is an expanded vision of what constitutes data. As noted above, most district information systems can provide annual test score data, grades, demographics, attendance, and discipline data. This is the sort of information required to run the business of a district. Much of this data is also required by state and federal agencies that fund or regulate different segments of K-12 activity. One recent article described Palisades School District in Pennsylvania, where parents, teachers, and administrators (including the superintendent and assistant superintendent, who each did over 100 interviews) conduct two short, one-on-one interviews with over half of the students in the district (Barnes 2001). These data were collected to provide an additional perspective on how the district was making progress toward its learning goals.[5] Teachers use the feedback they get from students to reflect on instructional practices, refine teaching approaches, and select curricular materials. An interview team visits a single school and spends the day doing interviews. At the end of the day, the team meets with the principal and the teaching staff to summarize the comments made by the students. The interview notes are left at the school to give teachers the opportunity to engage in a more detailed analysis of the responses. The survey was designed to explore and make clear the district’s expectations for particular areas of the curriculum in each year. In this way, the administration of the survey both reinforced the district’s goals and gathered detailed information about students’ progress toward meeting those goals.
The fact that the walk-throughs are repeated twice a year, every year, allows teachers and administrators to observe change more clearly and provides a longitudinal view of student experiences that is often lost as students progress from grade to grade with little interaction between grade cohorts.

The importance of active, focused, local school reform is made even more pressing and relevant by a recent study showing that, based on NAEP 1996 test data, the impact of teacher classroom practice is as important as socio-economic background and ethnicity (Wenglinsky 2002). Educational improvement planning is happening at the school level and has the potential to have a large positive impact on student learning. One of the things that is missing is an open, reflective decision making process that would support school administrators and local teaching staff in their desire to engage in meaningful, locally-led improvement. Linking larger educational goals to local school- and classroom-level practices must be a vital part of any serious improvement project.


There is a long literature on decision making. The Handbook of Social Psychology provides an excellent cross-disciplinary overview and describes structure versus process, riskless versus risky choices, and normative versus descriptive models of choice as the important dimensions of decision making (Abelson and Levi 1985, p. 233). The authors suggest several approaches that would improve the quality of decision making, and several of their recommendations are particularly relevant for this case (Ibid., pp. 274-293).

First, improved access to relevant data – whether the data is locally held or extracted from a district data store – would reduce the school improvement team’s uncertainty about the important dimensions of the problem at hand. This clearer view of the gap in expected outcomes would allow the team to craft a more targeted and appropriate response.

Second, providing access to high quality professional development materials in the form of case studies that link practice, materials, and outcomes would help the team expand its search for alternatives outside the narrow box of improving a particular score.

Third, incentives that reinforce short-term gains over long-term structural improvements should be reviewed. The current climate in many urban schools is one of “improve this year or else.” Value-added analyses that relate performance to the rate and/or amount of improvement for particular groups of students should be explored as a method for rewarding movement toward a goal and providing incentives for improvement to both high- and low-performing schools.
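The value-added idea in this recommendation can be sketched with two hypothetical schools tested in consecutive years: ranking by gain rather than by level rewards the school that improved most. This deliberately ignores the mobility and demographic adjustments a real value-added model would require.

```python
# Bare-bones sketch of a gain-based (value-added style) ranking.
# School names and scores are hypothetical mean scores on the same test
# in two consecutive years.

schools = {
    "High-performing": (82.0, 83.0),   # small gain from a high base
    "Low-performing": (58.0, 64.0),    # large gain from a low base
}

def gain(before, after):
    """Simple year-over-year gain; a real value-added model would also
    adjust for student mobility and demographic mix."""
    return after - before

ranked = sorted(schools, key=lambda s: gain(*schools[s]), reverse=True)
# Under a gain metric the low-performing school ranks first, even though
# its absolute score is still lower.
```

This is the sense in which a gain metric can provide an incentive for improvement to low-performing schools that a level metric cannot.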

Finally, district school improvement planners need to recognize that providing school leaders with highly constrained choices about how to show improvement reduces these leaders’ ability to think creatively and limits their ability to search for alternatives that would be most meaningful for their particular conditions.

State- and district-level administrators need to provide leadership by showing that they value appropriate use of data to support school improvement. They also need to provide the professional development opportunities and personal mentoring necessary to make their staff comfortable with the process of needs analysis and program evaluation at the school level. While political accountability and its related metrics are important, they are too far removed from instructional practices and provide very little relevant feedback for local, classroom-level improvements.


Abelson, R. P. and A. Levi (1985). Decision making and decision theory. The Handbook of social psychology. E. Aronson and G. Lindzey. New York, Random House. 1: 231-309.

Albert, S. and K. Bradley (1997). Managing knowledge: experts, agencies, and organizations. Cambridge, U.K.; New York, Cambridge University Press.

Alter, S. (2002). "The collaboration triangle." CIO Insight 9(January): 21-27.

Barnes, F. V. M., Marilyn (2001). "Data analysis by walking around." The School administrator 58(April (4)): 20-25.

Baron, R. S., N. L. Kerr, et al. (1992). Group process, group decision, group action. Pacific Grove, Calif., Brooks/Cole Pub. Co.

Bhatt, G. D. and J. Zaveri (2001). "The enabling role of decision support systems in organizational learning." Decision Support Systems 32: 297-309.

Boisot, M. (1998). Knowledge assets: securing competitive advantage in the information economy. Oxford ; New York, Oxford University Press.

Carneiro, A. (2001). "A group decision support system for strategic alternatives selection." Management Decision 39(3): 218-226.

Cattagni, A. and E. Farris (2001). Internet Access in U.S. Public Schools and Classrooms: 1994-2000. Washington, D.C., U.S. Department of Education, National Center for Education Statistics: 20.

Choo, C. W., B. Detlor, et al. (2000). Web Work: Information seeking and knowledge work on the world wide web. Boston, Kluwer Academic Publishers.

Creighton, T. B. (2001). "Data Analysis in Administrators' Hands: An Oxymoron?" The School administrator 58(April (4)): 6-11.

Davenport, T. H. and L. Prusak (1998). Working knowledge : how organizations manage what they know. Boston, Mass, Harvard Business School Press.

Davenport, T. H. and D. Marchand (1999). Is KM just good information management? The Financial Times Mastering Series: Mastering Information Management. London: 2-3.

Eden, C. and F. Ackermann (1998). Making strategy : the journey of strategic management. London ; Thousand Oaks, Calif., Sage Publications.

Fazlollahi, B. and R. Vahidov (2001). "A method for generation of alternatives by decision support systems." Journal of Management Information Systems 18(2): 229-250.

Ford, N. (2001). "The increasing relevance of Pask's work to modern information seeking and use." Kybernetes: The International Journal of Systems & Cybernetics 30(5,1): 603-630.

Greengard, S. (1998). "Will Your Culture Support KM?" Workforce 77(10 (October)): 93-94.

Harrison, E. F. and M. A. Pelletier (2001). "Revisiting strategic decision success." Management Decision 39(3): 169-179.

Hibbard, J. and K. Carrillo (1998). "Knowledge Revolution." Information Week (January 5): 49-50, 54.

Höglund, L. and T. Wilson, Eds. (2000). The New Review of Information Behaviour Research. Studies of Information Seeking in Context. Cambridge, UK, Taylor Graham.

Höglund, L. and T. Wilson, Eds. (2001). The New Review of Information Behaviour Research. Studies of Information Seeking in Context. Cambridge, UK, Taylor Graham.

Jasinski, D. W. and A. S. Huff, Eds. (2002). Using a knowledge-based system to study strategic options. Mapping Strategic Knowledge. London, Sage Publications Ltd.

Shneiderman, B. (1998). Designing the user interface : strategies for effective human-computer-interaction. Reading, Mass, Addison Wesley Longman.

Snyder, W. M. (2000). "Communities of Practice: The Organizational Frontier." Harvard Business Review(January/February): 139.

Thorn, C. A. (2001). "Knowledge Management for Educational Information Systems: What Is the State of the Field?" Education Policy Analysis Archives 9(47).

Wai-yi, B. C. (1998). "An information seeking and using process model in the workplace: a constructivist approach." Asian Libraries 7(12): 375-390.

Wenglinsky, H. (2002). "How Schools Matter: The link between teacher classroom practices and student academic performance." Education Policy Analysis Archives 10(12).

[1] Granularity is a term used to describe the level of aggregation of data. For example, attendance data could be listed as follows in increasingly fine granularity: days absent this year, days absent this semester, days absent this week, or periods absent this day. The finer the grain size, the more detailed the analysis can be. The tradeoff, however, is that the finer the granularity, the more data one must manage.
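This tradeoff can be illustrated with a small sketch. The period-level attendance records below are hypothetical, and the aggregation rules (e.g., counting a day as absent only when every period is missed) are illustrative assumptions, not a district's actual business rules.

```python
# Illustrate granularity: the same attendance data at two grain sizes.
# Each record is (date, period, present?) -- all values invented.
from collections import defaultdict

records = [
    ("2001-10-01", 1, False), ("2001-10-01", 2, False),  # missed periods 1-2
    ("2001-10-01", 3, True),
    ("2001-10-02", 1, True), ("2001-10-02", 2, True), ("2001-10-02", 3, True),
]

# Finest grain: periods absent on each day.
periods_absent = {d: sum(1 for date, p, ok in records if date == d and not ok)
                  for d in {date for date, _, _ in records}}

# Coarser grain: a day counts as absent only if every period was missed.
by_day = defaultdict(list)
for date, period, present in records:
    by_day[date].append(present)
days_absent = sum(1 for flags in by_day.values() if not any(flags))
```

Here the coarse measure reports zero days absent even though two periods were missed on October 1: aggregation discards exactly the detail a finer grain preserves.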

[2] Temporal resolution refers to the span of time to which a particular datum or data set refer. Annual test scores have the temporal resolution of one year. Weekly spelling test scores have a temporal resolution of one week. The temporal resolution of a particular type of data makes it more or less useful for measuring the state of or the change within a system over a given span of time.

[3] Some examples include: attendance and tardiness data for individual students, discipline data, results of in-class assessments, seniority and educational level of teachers, quality of the curriculum, quality and availability of professional development in the area in question, etc.

[4] For a more technical discussion of methods used to generate alternatives, see Fazlollahi, B. and R. Vahidov (2001). "A method for generation of alternatives by decision support systems." Journal of Management Information Systems 18(2): 229-250.

[5] The 15-minute interviews are based on a structured protocol that is adjusted to be grade-appropriate.
