
Classroom Assessment Techniques
Weekly Reports


Theory and Research
The use of Weekly Reports in college SMET teaching and assessment emerges from a "constructivist" philosophy of science education. In its most basic form, Human Constructivism (Mintzes, Wandersee, & Novak, Eds., 1998) rejects the view that knowledge is a product that can be faithfully conveyed by instructors, and substitutes the idea that knowledge is an idiosyncratic, dynamic construction of each human being, with instructors acting as "middlemen" or negotiators of meaning. Taken to an extreme, this philosophy suggests that meaning cannot be transmitted at all; it can only be reconstructed by each learner.

What are the implications of this view for college SMET instructors? Perhaps the most important is that without knowing what students are thinking, an instructor can offer only limited help with their learning. Put slightly differently, "... the most important single factor influencing learning is what the learner already knows. Ascertain this and teach him [sic] accordingly" (Ausubel, Novak, & Hanesian, 1978). That is why metacognition is crucial to implementing a constructivist approach to teaching and assessment.

Metacognition refers to knowledge, awareness, and control of one's own learning (Baird, 1990, p. 184). Gunstone and Mitchell (1998) suggest examples of learning behaviors that illustrate metacognition in classrooms:

    Examples include telling the teacher what students don't understand, planning a general strategy before starting a task, seeking links with other activities or topics, and justifying opinions. (p. 137)
Clearly, one cannot subscribe to a constructivist philosophy of teaching while keeping a traditional approach to assessment. If we want our students to construct their own understanding, reflect on their knowledge, ask questions, and plan their own learning, we need to devise assessment methods that capture these aspects of learning.

Journal writing is one of the least used forms of alternative assessment (Lester, Lambdin, & Preston, 1997). This may be due, at least in part, to how time consuming it is both to write journal entries and to assess that writing. Yet Bagley and Gallenberger (1992) suggest that,

    Writing is more than just a means of expressing what we think - it is a means of shaping, clarifying, and discovering our ideas (p. 660).
Weekly Reports are not just journals - they are structured journals that serve additional purposes. At first, students usually express negative feelings toward them, but very soon they begin to appreciate the help the Reports provide. Here are some comments from our own students:
  • "Weekly Reports helped me to learn in this course. Sometimes I write that is said in class rather quickly. Writing it again gave me more of a chance to understand/learn it. It is better that I had to review, because I understand/remember more. Don't get me wrong: I hated doing them at the time because they took forever! But I am glad we had to now!"
  • "Although the Weekly Reports seemed to be a burden throughout the semester, I can conclude that they greatly helped me to learn. I was forced to evaluate how much and just how I learned what I learned and correct misconceptions I had. Also it forced me to pay excellent attention in class because I knew I would eventually do a report in it".
  • "Sometimes I would leave class being so filled with information that my Weekly Reports were my only way of organizing my knowledge".
  • "They allowed me to organize my weekly learning and ask any questions about misunderstandings. They will also act as clear, concise notes for future reference".
  • "Sometimes at the end of the class I thought I understood everything but when I wrote my Report I would find what I did not understand".
  • "I like the fact that we could pose questions that remained unclear, and you actually answered all of them. I also liked that we could predict your questions".
  • "The Weekly Reports did help because although they were time consuming, they actually forced me to grapple the ideas that we were presented in class. The structure we were asked to put them in also helped me to think about the concepts in a more logical way".
  • "The Reports made me stop and check if there was something I did not understand".
  • "Even though I hated the Weekly Reports because they were time consuming they were a great learning tool for me. It forced me to sit down and digest what we learned. It also made me think about how to put it all into clear thoughts that made sense. This was a perfect assessment and it definitely reinforced what we had learned".
  • "They helped me to organize the information in a meaningful manner. Instead of having information scattered all over, that is not related to each other, I had the information easy to access and understand".
Our own research (Etkina, 199-) supports the value of using Weekly Reports. For example, when the number of student questions at each level of difficulty is graphed as a percent of the total for both parts of the Weekly Reports (Fig. 1), the majority of questions in both parts fall in the "low" and "moderate" categories. In general, students tend to ask questions that help them clarify and apply previously taught concepts, which suggests that they assign much the same value to the content as the professor does. Research in two courses, "Introductory Physics" and "Electrodynamics," also revealed a mismatch between the two parts of the Reports. In the introductory course, "minimal" level questions were much more frequent in part three (the questions students predicted the professor would ask), while "highest" level questions were considerably more frequent in part two (the questions students asked for themselves); in the advanced electrodynamics course the pattern was reversed. In other words, introductory students expected the professor to ask much easier questions than they wanted to ask themselves, while electrodynamics students expected much harder ones. Such mismatches do little to build students' confidence.

Fig. 1. Student questions in the Weekly Reports, sorted into four levels of difficulty (minimal, low, moderate, highest) and plotted as a percent of the total; in both samples, "moderate" questions made up the largest share.
A - questions asked in the second part of the Reports, expressed as a percent of the total number of questions asked in that part.
B - questions asked in the third part of the Reports, expressed as a percent of the total number of questions asked in that part.
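
For readers who want to reproduce this kind of analysis, the bookkeeping behind Fig. 1 is easy to automate. The sketch below is a hypothetical illustration rather than code from the study: it assumes each question has already been hand-coded into one of the four difficulty levels, and it simply converts the tallies for parts two and three of the Reports into percent-of-total figures like those plotted above. The sample codings are invented for the example.

    from collections import Counter

    LEVELS = ["minimal", "low", "moderate", "highest"]

    def level_percentages(coded_questions):
        # Turn a list of hand-coded difficulty labels into the
        # percent-of-total figures plotted in Fig. 1.
        counts = Counter(coded_questions)
        total = sum(counts.values()) or 1   # guard against an empty week
        return {level: 100.0 * counts[level] / total for level in LEVELS}

    # Invented codings for one week, for illustration only.
    part2 = ["moderate", "low", "moderate", "highest", "low", "moderate"]  # students' own questions
    part3 = ["minimal", "moderate", "low", "moderate", "minimal", "low"]   # predicted professor questions

    for name, coded in (("Part 2", part2), ("Part 3", part3)):
        pct = level_percentages(coded)
        print(name, {lvl: round(p) for lvl, p in pct.items()})

Printing the two summaries side by side makes the part-two/part-three mismatch described above easy to spot for a given class.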



Sources
Ausubel, D. P., Novak, J. D., & Hanesian, H. (1978). Educational Psychology: A Cognitive View. New York: Holt, Rinehart and Winston.

Bagley, T., & Gallenberger, C. (1992). Assessing students' dispositions: Using journals to improve students' performance. Mathematics Teacher, 85, 660-663.

Baird, J. R. (1990). Metacognition, purposeful inquiry and conceptual change. In E. Hegarty-Hazel (Ed.), The Student Laboratory and the Science Curriculum. London: Routledge.

Cizek, G. J. (1997). Learning, achievement, and assessment. In G. D. Phye (Ed.), Classroom Assessment: Learning, Achieving and Adjustment (pp. 2-29). San Diego, CA: Academic Press.

Gunstone, R. F., & Mitchell, I. J. (1998). Metacognition and conceptual change. In J. J. Mintzes, J. H. Wandersee, & J. D. Novak (Eds.), Teaching Science for Understanding: A Human Constructivist View (pp. 133-163). San Diego, CA: Academic Press.

Lester, F. K., Lambdin, D. V., & Preston, R. V. (1997). A new vision of the nature and purposes of assessment in the mathematics classroom. In G. D. Phye (Ed.), Classroom Assessment: Learning, Achieving and Adjustment (pp. 287-319). San Diego, CA: Academic Press.

Mintzes, J. J., Wandersee, J. H., & Novak, J. D. (Eds.) (1998). Teaching Science for Understanding: A Human Constructivist View. San Diego, CA: Academic Press.

Novak, J. D., & Gowin, D. B. (1984). Learning How to Learn. Cambridge: Cambridge University Press.

Redish, E. F. (1994). Implications of cognitive studies for teaching physics. American Journal of Physics, 62, 796-803.
