Beth L. Chance

University of the Pacific

Journal of Statistics Education v.5, n.3 (1997)

Copyright (c) 1997 by Beth L. Chance, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.

**Key Words:** Computer labs; Journals; Projects; Take-home exams.

In an effort to align evaluation with new instructional goals, authentic assessment techniques (see, e.g., Archbald and Newmann 1988, Crowley 1993, and Garfield 1994) have recently been introduced in introductory statistics courses at the University of the Pacific. Such techniques include computer lab exercises, term projects with presentations and peer reviews, take-home final exam questions, and student journals. In this article, I discuss the University of the Pacific's goals and experiences with these techniques, along with strategies for more effective implementation.

1 As instructional goals in statistics courses change, so must the assessment techniques used to evaluate progress towards these goals. Many statistics courses are shifting focus (see, e.g., Cobb 1992 and NCTM 1989), emphasizing skills such as the ability to interpret, evaluate, and apply statistical ideas over procedural calculations. Many of these outcomes are not adequately assessed using traditional tests, which too often emphasize the final numerical answer over the reasoning process (Garfield 1993, NCTM 1993). Thus, instructors need to accompany these new instructional aims with more authentic assessment techniques that address students' ability to evaluate and utilize statistical knowledge in new domains, communicate and justify statistical results, and produce and interpret computer output. Further, students need to receive feedback not only on their exam performance, but also constructive indications of their strengths and weaknesses, guidelines for improving their understanding, and challenges to extend their knowledge. The instructor needs to know not only how the students are mastering the material, but also how to improve instructional techniques and enhance statistical understanding. Since students' attitudes towards statistics can affect their learning (see, e.g., Gal and Ginsburg 1994), an assessment program should also include a way of judging how students are reacting to the material, including their impression of the relevance and fairness of the assessment process. Above all, assessment should mirror the skills students will need in order to be effective consumers and evaluators of statistical information.

2 Towards these goals, I have incorporated several techniques into my courses that seem well suited to statistics instruction. These techniques include a computer laboratory component, a term project with peer reviews and oral presentations, a take-home component to the final exam, minute papers, and student journals. In this paper, I first discuss my goals for these assessment procedures and how my implementation of these techniques has evolved. Based on what I find works and doesn't work in my courses, I describe what I feel are the essential features of effective assessment techniques in the introductory statistics course.

3 The University of the Pacific is a small (4,000 students), private, comprehensive university. One of the university's goals is to maintain small class sizes, allowing more personalized instruction. The main statistics course, Math 37, fulfills a General Education requirement and is taken by students in numerous disciplines, such as business, science (computer, natural, social, sports), physical therapy, humanities, and pharmacy. Business is the most common major (Fall 1995: 36%, Spring 1996: 61%), and students come from all years (Fall 1995: 62% juniors, Spring 1996: 77% freshmen and sophomores). The prerequisite is intermediate algebra, and the course is worth four credits. A different course is generally taken by mathematics and engineering majors. Typically, Math 37 is the only statistics course these students will see, and there are various levels of mathematics and computer anxiety and competency among the students. From surveys distributed the first day of class, most students indicate they are taking the course to fulfill a requirement for their major or to fulfill the General Education requirement (Fall 1995: 80%, Spring 1996: 92%). When asked their initial interest level in statistics, on a scale of one to five with five being high, 79% in the Fall and 88% in the Spring indicated an interest level of three or below. I believe authentic assessment techniques should work well in this environment due to the focus on student-centered learning at my university, and the flexibility I am given by my department in structuring the course.

4 When designing my course, I identified several ways I wanted to enhance students' statistical learning experience and my assessment of that experience. In particular I wanted to better gauge the students' understanding of statistical ways of thinking, evaluate their communication and collaboration skills, measure their statistical and computer literacy, and monitor their interest in statistics. To see if these goals were met, I felt I needed to move beyond traditional assessment techniques and incorporate several nontraditional techniques in conjunction with traditional exams, homework exercises, and quizzes. Below I discuss how I combined several different assessment components together to address these goals. The Appendix gives an overview of how these different techniques are implemented in my introductory course.

5 I want my students to think of statistics as not just plugging numbers into formulas, but as a process for gaining information. Thus, I feel it is important to evaluate student understanding of this process by requiring them to complete such a process from beginning (conception of the question of interest) to end (presentation of results). One way to accomplish this is through semester-long projects. Use of projects in statistics courses has received considerable discussion in the literature (for example, Fillebrown 1994; Garfield 1993, 1994; Mackisack 1994; and Roberts 1992). My goals include seeing how students apply their knowledge to real (and possibly messy) data that they collect, while integrating ideas from many parts of the course (data collection, description, and analysis) into one complete package. By using the same project throughout the semester, students are able to rework their plan as new ideas occur, illustrating the dynamic nature of the process. Students also have an application to constantly refer to as new topics are covered in lecture. While the students' final projects may not incorporate all of the statistical techniques covered during the course, they need to evaluate each idea to judge its suitability for incorporation. Thus the students are determining the necessary components for their analysis. I feel this leads to more creativity in the projects, and better mastery of the statistical ideas, than when I tell them to apply one specific idea.

6 Students also see this cycle in their labs and final exam. Five of their 14 labs require students to collect the data themselves and then complete a technical report detailing their data collection procedures (e.g., precautions, operational definitions), analysis, diagnostic checks, and final conclusions. Students must discuss pitfalls in the data collection and analysis, and suggest potential improvements. At the end of the course, to further assess their ability to apply what they have learned, I now incorporate a take-home component to the final exam. Students are given a series of questions pertaining to a new data set. Students identify an appropriate test procedure based on the question asked, perform the computer analysis (choosing informative graphs and relevant diagnostic procedures), and interpret the results. Students need to integrate different portions of the course and review several potential approaches to the problem. Students are graded on their ability to identify and justify appropriate techniques, perform the analysis, and interpret their results to formulate a conclusion.

7 These assessment techniques provide the instructor with an authentic performance assessment that cannot be obtained solely from traditional timed, in-class exams. In particular, these approaches judge whether students can move beyond the textbook and apply the statistical concepts to new situations.

8 Another assessment goal in my course is to judge whether students possess sufficient knowledge to interpret statistical arguments and computer output. Many current newspaper articles are discussed in the course, and I often ask students to evaluate the merits of the numerical arguments and generalizations. For example, many of the homework assignments and test questions involve recent articles, following the model given in the CHANCE course (Snell and Finn 1991). Students are also asked to critique classmates' project reports. These reviews are completed in 10 to 20 minutes at the end of one class, reviewed by the instructor, and then returned to the groups the next class period. Each project group receives two to four reviews. It would also be possible to grade the peer reviewer on the quality of the review to see if he or she can evaluate the proposal critically. In both these cases, I can see whether the students understand the ideas well enough to distinguish between good and bad uses of statistics.

9 While many questions can ask students to evaluate computer output, it is also important for students to work with computers directly, producing statistical output and interpreting that output in their own words. This was my initial justification for incorporating weekly computer lab sessions into the course. In most of these labs, students work with Minitab to analyze existing or self-collected data. The rest of the labs are directly aimed at enhancing student understanding of statistical concepts. These labs use programs developed by delMas (1996) to develop visual understanding of concepts such as the standard normal curve, sampling distributions, and probability. Thus, students use the computer as a tool for analysis and for obtaining deeper conceptual understanding. In all of these labs, students' explanations of what they learned are emphasized.

10 Writing and being able to explain ideas to others have become important components of my course. In the lab write-ups, students either explain an important statistical concept at length, or identify, execute, and justify the relevant statistical procedure. I feel some of the students' most significant learning gains come from debating ideas with each other, so I encourage students to work in pairs on the lab write-ups to further increase this practice. Requiring students to explain ideas in their own words also greatly augments my ability to judge their level of understanding, while giving the students repetitive practice with the language of statistics and further internalization of the concepts. (For research on how giving explanations helps, see, for example, Rosenshine and Meister 1994.)

11 My method for allocating points on assignments reinforces the importance I give to the different components. When I grade the full lab write-ups, 50% of the points are given for discussion. Thus, students learn that interpretations and explanations are important and highly valued components of the analysis. Students find the writing component quite time consuming, often indicating that their least favorite lab is "any full lab write-up." They can have tremendous difficulty explaining their reasoning, reinforcing my belief in its necessity. The most common feedback I give is "Why?" or "How do you know this is true?" as I try to teach the students to build a logical argument based only on the statistical evidence. In grading the labs it is important to reward thoughtfulness and creativity, and especially to allow multiple interpretations of the same idea as long as the interpretations have been justified.

12 Another way I sometimes address the need for writing is through the use of student journals. I use many in-class activities (primarily based on those found in Scheaffer, Gnanadesikan, Watkins, and Witmer 1996), but I notice that while students find the in-class activities entertaining, they often miss the important statistical concept involved unless they are asked to further reflect on the ideas. Thus, when I use journals, I require students to include some explanation of the activity, such as a simple paragraph explaining what the activity was about or answers to a series of directed questions. I also require chapter summaries to encourage students to continually review the material, blending lectures with text explanations and tying topics together. Due dates of project reports roughly correspond with midterms to help students assimilate the ideas they are reviewing. Hearing presentations of other project groups enhances review for the final exam. These additional reflections greatly aid students' understanding of the underlying ideas.

13 Peer reviews and oral presentations require the students to express their understanding to each other. I find that when I am the only one presenting examples in class, students often don't participate fully and then later cannot extend the knowledge. Now each week, three to four students are responsible for presenting homework solutions on the board, with no prior notice. Also, at the end of the semester, each project group is required to give a presentation of their results to the class. The rest of the class and I evaluate these presentations. From homework and project presentations, I learn much more about the student's or group's ability to communicate results, as well as their ability to summarize the important points and to effectively convey results visually. In each of these tasks, students must learn to utilize the statistical terminology in a coherent manner and to explain the ideas to themselves, to other students, and to me. This requires them to develop a deeper understanding of statistical concepts.

14 Learning to function in groups should also be an important component of statistics education. Thus, the term projects are completed by teams of 3 to 5 students. In extreme situations (e.g., commuting from another city), students have been allowed to work individually. However, I encourage the group experience to foster debate and discussion among the group members, as well as to model their future working environment. The first semester I tried project groups, I learned that students at the University of the Pacific can have trouble working together in groups. To address these difficulties, I have given them more opportunities to work with other students (e.g., in-class activities) prior to selecting groups. Also, I now split the grades into an individual component and a group component. A student's final grade is based on a group grade (85%) and an individual grade (15%) that varies among the team members if there is a severe discrepancy in effort. This discrepancy is identified by confidential input from the group members at the end of the semester. The discrepancy has to be clear before points are adjusted. Knowing from the beginning that their teammates will provide this information has eliminated many of the potential problems. Contracts for projects can also be included, detailing which parts of the project each person will work on. Some instructors have also recommended including a quiz or test question requiring the individual students to demonstrate knowledge of the group project. Overall, I believe learning to work in groups should be a requirement, but the instructor needs to incorporate some flexibility and individual responsibility into the grading.
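The 85%/15% weighting described above is simple arithmetic. The sketch below illustrates it in Python; the weights come from the text, but the specific effort-adjustment rule (scaling the individual component by a peer-rated effort share) is an invented example, not the author's exact procedure:

```python
def project_grade(group_score, effort_share, group_weight=0.85):
    """Combine a common group score (0-100) with an individual component.

    `effort_share` is a 0-1 multiplier summarizing confidential peer
    input; 1.0 means full individual credit (an assumed rule, for
    illustration only).
    """
    individual = group_score * effort_share
    return group_weight * group_score + (1 - group_weight) * individual

# A member rated at full effort keeps the full group score:
full = project_grade(90, 1.0)     # 0.85*90 + 0.15*90 = 90.0
# A member flagged for clearly lower effort loses part of the 15%:
reduced = project_grade(90, 0.5)  # 0.85*90 + 0.15*45 = 83.25
```

Note that under this scheme the adjustment is bounded: even a member credited with zero individual effort retains 85% of the group score, which matches the article's intent that the group grade dominates and points are docked only for a clear discrepancy.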

15 I also want to utilize assessment practices that allow students to reevaluate their work, in order to establish a dialogue with the instructor and improve students' own self-evaluation skills. Through reevaluation of their work, students can develop their knowledge over time. For example, I grade the lab reports "progressively," requiring slightly higher levels of explanation as they proceed through the labs and incorporate the earlier feedback. Pairing students also allows me to provide more immediate feedback. Another process for continual review is the use of periodic project reports. Four times during the semester each group is asked to submit a report on their current progress for the instructor to review. This allows me to give guidance, prompt the students with questions, and answer their questions at several points during the semester while they are thinking about different stages of the project. I find this has greatly enhanced my ability to monitor and challenge the teams. Students see that statistics is an evolving process, and they also learn to justify their decisions.

16 Similarly, with the journals, the only grade that counts is their final grade, and I encourage students to resubmit earlier work to increase their credit. My goal is for students to continually reflect on a concept until they understand it fully. This approach has been most effective with the lab reports as I do see a tremendous improvement in the quality of the reports over the semester. It is important for students to be able to revisit an idea, so they can clear up misconceptions and enhance their understanding.

17 It is also important for students to learn to assess their own work. In the journals, I require self-evaluations so students will think critically about their own participation in the course and knowledge of the material. I have found that many of my students do not ask questions in class or office hours, and so I hope that by having them submit questions through the journal I can establish a dialogue with them and provide personally directed feedback in a nonthreatening environment.

18 Since my students do not enter the course with much interest in the material, I want my assessment program to monitor and enhance their interest levels. Thus, I choose techniques that are effective not only as instructional and assessment tools, but also as motivational tools. This is important because I believe increasing their interest will also increase their learning and retention of the material. Students often agree, as shown in the following comments.

"Doing the project did help strengthen my knowledge of proportions. When I actually had to sit down and do the calculations I found that the results made much more sense than before."

"When you apply what we've learned to the real world we see everyday it makes more sense and becomes more interesting."

"Putting the ideas to use further implants them into the brain cells."

19 To maximize interest, I allow students to pick their own project topics. I want to be sure the topic is of interest to the students, and I encourage them to expand on activities from other courses. This fosters student creativity and ownership in the topic. Students appreciate seeing a "real world application."

"It certainly made it more interesting to see that we could actually use what we learned in class."

"It was good to analyze something we were interested in."

"It was good to be able to use what we learned in a study we conducted ourselves."

"The project helped me to see the practical use of statistics. This type of project was more interesting than just computing data for homework."

20 When students are given some flexibility in the topics they examine, they better see the relevance of statistics and want to learn the ideas. When asked if they enjoyed the project, 79% of Spring 1996 students (n = 63) said yes; 86% said the project helped them understand the material; 62% said the project increased their interest in the material.

21 Similarly, I find increasing the number of labs where students collect data about themselves and their classmates increases their interest in the answers. I also incorporate an "Other" category in the journal assignment. Students must contribute experiences they have with statistics outside the course, for example, an interesting article or news reference. My aim is to give students the opportunity to expand their knowledge and find interesting uses of statistics in the world around them.

22 To track student interest levels I ask the class to complete minute papers at various points during the semester. These papers, which are completed at the end of a class period and can be turned in anonymously, are designed to inform me how I can improve the course or identify concepts that need further explanation. Students are also asked again at the end of the semester to rate their interest level in statistics, revealing how their views have changed. For example, typically I've found 10 to 20% of the students have an initial interest level of four or higher, but over 50% do at the end of the semester. I can also compare these numbers across semesters. Thus, I have very quick and efficient ways of tracking student opinions during the course, allowing immediate adjustments.
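The before-and-after interest comparison above is a simple tally of ratings at four or higher. The sketch below shows one way to compute it; the rating lists are invented example data chosen only to echo the reported "10 to 20%" initial and "over 50%" final figures:

```python
def pct_high_interest(ratings, threshold=4):
    """Percent of 1-5 interest ratings at `threshold` or above."""
    high = sum(1 for r in ratings if r >= threshold)
    return 100 * high / len(ratings)

# Hypothetical start- and end-of-semester ratings for ten students:
start = [3, 2, 4, 1, 3, 2, 3, 5, 2, 3]
end   = [4, 3, 5, 2, 4, 4, 3, 5, 4, 2]

pct_high_interest(start)  # 20.0 -- in the "10 to 20%" initial range
pct_high_interest(end)    # 60.0 -- over 50%, as reported at term's end
```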

23 After implementing the above strategies in various stages, I have found the following guidelines to be essential to the success of the evaluation process.

24 Assessment techniques should provide a dialogue between the instructor and the student. The feedback needs to identify the problem to the student and provide guidance for how he or she may proceed. On homework, instead of simply telling students the correct answers, I try to show them how their solutions are insufficient and guide them towards an improved solution. When I use student graders, I also distribute homework solutions for further review. One student explained, "The solutions handed out were more helpful to determine how to fix what I'd done wrong."

25 Students also need to be given the opportunity to reevaluate their work and receive additional feedback. The journals are an excellent way to maintain ongoing discussions with the students. However, I find that many students may not naturally take the opportunity to resubmit earlier work. Conversely, the project reports have worked well to continually monitor students' efforts and provide hints at different stages of the process, and students have found them valuable. The lab reports also allow me to provide feedback on data collection issues before the group project has been completed. The "full" write-ups provide the students with lots of practice before they produce the final project report. These approaches reinforce to students that statistics is an evolving process, while allowing me to monitor their methods as they are being developed.

26 I find peer reviews of student projects quite helpful. These reviews show students what others in the course are doing, and provide additional feedback from their peers. This feedback is sometimes more valuable to the students than the instructor's input. Overall, the student feedback is quite constructive, providing input that is prompt and rich with several new perspectives. These techniques provide the students with information they can use in their next assignment, instead of only giving them a final interpretation of their work at the end of the term.

27 It is important to develop students' ability to evaluate their own work and challenge themselves. Decreasing the amount of specific instructions in the labs as the semester progresses allows me to measure students' level of self-guidance and independence. The labs have proven more and more successful as I incorporate students' suggestions on how to clarify the instructions. For example, at the students' request, the lab instructions now include an index, glossary, and frequently asked questions section. By minimizing the computer learning time, students are able to concentrate longer on the conceptual points of the lab. I also now hire a temporary student assistant each semester to provide feedback on the lab exercises and the clarity of the instructions while I am developing them. I find this has greatly enhanced my ability to present the material at the appropriate level for the students, so they can focus on the concepts being addressed.

28 The index, glossary, and frequently asked questions sections also help me to assess the students' ability to help themselves when they encounter a problem. Including these sections has certainly helped to provide students with guidance, but I am still trying to find the proper balance between leading and telling. I could also include more grading information in the lab instructions, but I think it is important for the students to learn to identify the components of an effective statistical analysis or explanation for themselves. I want students to do much of the discovery for themselves, because what students construct for themselves they will understand better and remember longer (see, e.g., Resnick 1987).

29 Often it is difficult to get students to become more independent learners. For example, with the journals, many students didn't appreciate the opportunity to outline the chapters and summarize the relevant points for later review. I thought students would like the opportunity to receive feedback and credit for review techniques they were already utilizing, but the study habits of my students are not at this level. Still, I think it is advantageous for students to generate their own summaries, and I continue to ask them to develop their own review sheets (which I look over) instead of relying on the instructor. (An overview of research on how students actively construct knowledge is presented in Anderson 1983).

30 Students also had mixed reactions about the "Other" category of the journals. A few students took the opportunity to expand their knowledge and find interesting examples in the world around them, but most did not. Thus, instead of increasing their opportunity to see the relevance of statistics, the requirement instead tended to add to their anxiety about the course load.

31 It is important to put students in situations that require the skills you are trying to assess. For example, I use the take-home component of the final exam to allow students to show me what they can produce independently. By putting them in the role of a statistician, I challenge them to develop a solution to a new problem on their own. I encourage them to ask questions about the computer commands, as this is not my main focus, and I want them to be able to move on with the problem. However, if they ask for statistical knowledge, they are "charged" by losing points if I tell them the answer. I indicate that this is similar to having to pay an external consultant when they need statistical expertise. Usually, if I tell them a question will cost them, students accept the challenge to think more about the problem. I hope by offering myself as a resource I also reduce unpermitted collaboration. Students are given the freedom to demonstrate to me how they can use the knowledge they have gained in the course.

32 Almost every time I have implemented a new technique, I have erred in not providing the students with enough guidance as to what I expect. This is primarily because I have not fully developed the ideas ahead of time. Students have commented that they "did not always know what to write about the different activities" or "what the point was." To address these difficulties, I now give quizzes to preview exam questions, include project checklists in the syllabus, supply a detailed lab manual, and make model papers available for review.

33 For example, initially my in-class exams were slightly different from what students were expecting because the exams focused as much on the conceptual topics in the course as on the calculations. Now I give biweekly quizzes, partly for evaluation, but also to introduce students to the types of questions that may appear on an exam. The quiz questions are quite conceptual in nature, typically requiring more thought than the homework problems. I alleviate some test anxiety with this approach, and indicate to students the topics I find most important. I have found this to be sufficient exposure to the exams, and do not provide exams from previous semesters for study.

34 For the projects, I attempt to incorporate the suggestions made by Sevin (1995): give clear, detailed instructions; provide feedback and guidance. For example, I give students a checklist in their syllabus explaining what needs to be discussed in the final project report. A clear timeline for the periodic project reports is also included. Students are also given access to a web page of previous project topics to review at their convenience. This gives the students more of a framework for the project, and encourages them to extend themselves beyond what was done previously.

35 The largest initial problem when developing lab exercises was the students' computer anxiety and lack of experience. I now produce a lab manual that is given to the students at the beginning of the semester (as part of a lab fee paid with the course). The manual, modeled after those by Egge, Foley, Haskins, and Johnson (1995) and Spurrier, Edwards, and Thombs (1995), includes instructions for the 14 lab assignments, sections on how to use the computers, Microsoft Word, and Minitab, as well as the index, glossary, and frequently asked questions sections. The manual provides detailed computer steps, including pictures of the menus they will see, for all of the labs throughout the semester. Guidelines for the full lab write-ups are also included (similar to Spurrier et al. 1995). Thus, the manual provides students with a constant reference and knowledge of what is expected of them.

36 Each week, a lab write-up that received one of the highest grades is posted in the lab and on the course web page. The goal is for students to review these papers, feel pride if their paper is selected, see the work of their peers (and thus what they are also capable of), and notice that there can be multiple interpretations and conclusions for the same data. I purposely post these materials after the lab is due, so that students can utilize the feedback for their next lab but cannot use the papers as templates for each lab. Unfortunately, fewer than 20% of students indicated that they reviewed these model papers. In fact, three students indicated that they were not aware of their existence. Perhaps this is one reason students still felt there wasn't enough guidance on what was expected in the lab write-ups. In the future, I need to increase student awareness of this resource.

37 Lack of guidance also appeared to be a problem with the journals. A definitive set of guidelines for the journals, similar to those for the projects and labs, given at the beginning of the semester, would probably help significantly. In the future, I will be more clear about which topics will be included in the journals and what I expect from students. For example, the journal questions are now distributed as handouts, with more focused questions to be addressed, rather than requiring merely a simple summary. These handouts also include graphs of data collected in class for students to reflect on and utilize to support their conclusions. I now also include an initial practice example and distribute a model explanation of the activity.

38 In short, students need to be given enough direction to start the problem, but not so much that their approach becomes rote. Evaluation techniques should give students the ability to continue to learn and develop, while being flexible enough to allow students to develop in their own ways.

39 Students always know if they are getting different grades for the same work and may discount feedback if they don't feel it is valid. This is especially a concern to me because grading the labs and projects can be time consuming (approximately 15 to 20 minutes for full lab write-ups, 30 minutes for projects). Development of point systems (how much is taken off for each type of mistake) and scoring rubrics has proven quite helpful for consistency, and can be reused in subsequent semesters. The point system can also be revealed to students if they desire additional information on the grading process.

40 For example, I use a breakdown sheet for the projects each semester: 20 points for the oral presentation, 20 points for the quality of the write-up, 20 points for the data collection design, 20 points for choice and correctness of the analysis, and 20 points for interpretation of the results. During the project presentations, students rate the following features on a five-point scale: introduction, organization, clarification, visual aids, and conclusions. Comments and suggestions from the class are summarized into a final feedback report for the group. Such sheets have helped my consistency from semester to semester and have clarified for the students what I am looking for.

41 For the journals, I grade the explanations of in-class activities on a scale from 0 to 3, with 0 = incomplete, 1 = severe misunderstanding, 2 = partially correct, and 3 = correct (Garfield 1993). Half points are often assigned. This point system allows me to assign grades more quickly, but the totals often need to be adjusted when assigning letter grades. Furthermore, I seldom gave a three initially, hoping to encourage further reflection. However, if students don't resubmit their work, the grading can seem harsh to them.

42 Most of the student complaints about lack of fairness came with the journals. While several students expressed appreciation in the course evaluations that their questions were answered personally and in detail, others felt their grade shouldn't be lowered because they did not have questions. Evaluating the self-evaluations also proved problematic. I tried to make it clear that these self-evaluations are different from the in-class minute papers, in that students should reflect on their efforts (instead of mine). I gave full credit if I felt the student was being constructive and insightful. However, students didn't see how I could grade their opinions. Again, more clarity is needed regarding my expectations.

43 For the assessment technique to be informative, students need to understand why it is being used and why it is important. Otherwise the evaluation will probably not be an accurate reflection of their abilities, and their anxiety about the course may increase.

44 Initially, students saw the lab activities and journals as time consuming and as not adding to their knowledge. Of the students working with journals, 66% found them too time consuming, with 41% pinpointing the chapter summaries as the most time-consuming component. One of the main problems with my implementation of the journal technique was that I did not explain the concept of the journal to students. They didn't really understand what I meant by "journal," thinking instead of a diary that had to be completed daily. Furthermore, students did not understand how the journals assessed a component of their learning separate from the traditional homework assignments. One approach I may adopt is to have the journals substitute for the homework assignments for one week or at the end of a large block of material, though the feasibility of this may depend on the pace of the course. I will also encourage students to keep their journal entries on the computer so that later revision will be less troublesome. Journal entries can also be submitted through e-mail.

45 Some students don't see the data collection and writing components of the course as "worth the effort." I need to further emphasize why these tasks are being asked of them and what I want them to get out of the experience. This will help the students concentrate on expressing what they have learned instead of dwelling on how long the task is taking. Some students have expressed appreciation, though perhaps reluctantly, for the gain in understanding of the material, despite the additional work:

"I hated doing them (journals), but I realize now that they really helped in learning the concepts."

"(I didn't like) writing about the chapter summaries, but it did help me get some of the concepts down."

Furthermore, students need to realize how each assessment tool is different. This was evident when journals were first introduced:

"You already provided most of the service the journal was meant to be for."

"If I have a problem, I would go straight and ask you and you can judge my understanding by looking at my homework and lab assignment."

Comments like these illustrate why students did not easily accept the journal component. Conversely, students can more easily see the purposes of the take-home component of the final exam and the lab activities.

46 I believe additional clarification of the distinct goals of the different assessment activities will increase student participation. Students will better appreciate what they are learning from the activities, and the assessment will more accurately reflect students' abilities.

47 For evaluation procedures to be successful, they need to be well organized and complementary to the lecture material. All assessment components should be described at the beginning of the course. Consideration also needs to be given to the amount of time required by the students and the instructor for each component. If too many tasks are assigned, neither the students nor the instructor will be able to devote sufficient energy to each task, and the speed and consistency of the assessment will decline.

48 Initially, labs and journals were not well received because of my lack of organization and the need for better integration of the journal and lab assignments with traditional homework assignments. I find I have more success when I make a point of briefly mentioning the labs and the projects during lecture time. These reminders help illustrate to the students when and where the ideas fit in. I have also been able to better time the lab exercises to coincide with lecture topics, although this can lead to a rather rigid syllabus. It is important for students to see that all these ideas are related, not just several disjoint tasks.

49 Students will also resent an assessment activity if they feel the time required is excessive.

"The labs didn't really help because I was too busy trying to get it done that I didn't learn anything."

Students claim that a lab requiring 15 minutes of observation outside of class is unreasonable. They prefer smaller tasks or data collected in class. The lab activities became more successful after I introduced the separate lab hour into the course and incorporated more of the data collection into the lecture time. During the lab, half of the class meets at a time. Students work in pairs on the lab assignment, with the instructor available for immediate assistance and clarification. Students appreciate having access to the instructor and seem able to approach the computer with more efficiency and less anxiety. I design the lab assignments so that the computer work can be completed during the lab hour, with students completing their explanations and interpretations on their own during the week. I also reduced the number of full lab write-ups to five of the 14 labs. These changes allow students time to absorb the ideas and to incorporate ideas from lecture into their lab reports. Reducing the number of full write-ups and encouraging students to work in pairs also reduces the grading load.

50 Furthermore, the assessment strategies need to complement each other. I think the activities I currently use accomplish this, providing me with a more complete picture of student understanding and performance. Each technique gives me a different source of information, whereas, as Garfield (1994) states, a single indicator would limit the amount of information I receive. Still, too many competing assignments can overwhelm students or seem repetitive. I must be careful not to use so many distinct indicators that they overload the students and are no longer effective. Currently, I do not require a separate journal assignment, but I do integrate some of the journal questions into the homework assignments. While undergraduates grade most of the homework problems, I grade these more open-ended questions. This approach achieves many of the journal goals. I also include exam questions that directly relate to the main ideas I want students to learn from the labs and in-class activities. (For example, Scheaffer et al. 1996 include assessment questions after each activity.) This reminds students to reflect on the learning goals of these activities.

51 I concur with Garfield's (1994) suggestion to only attempt to incorporate one new assessment technique at a time. These techniques can be quite time consuming and demanding on the instructor, and need to be well organized and thought out ahead of time. Both the instructor and the administration need to be cognizant of the additional demands required by these techniques. Successful implementation will not be instantaneous, but will require continual review and redevelopment. My current system of weekly homework, biweekly quizzes, weekly labs, and a semester-long project seems to be at the upper limit of what I can ask of my students and of myself.

52 I have found that traditional assessment techniques often do not sufficiently inform me as to whether my students possess the skills I desire. For example, at what level do they understand the material? Will they be able to read and critique articles in their field? Can they read and debate statistical evidence? Can they make reasoned arguments with data and explain their ideas to others? Can they read and interpret computer output? Can they use what they have learned? I have found the techniques discussed here to be quite effective in expanding my knowledge base about the students, including where they are struggling and what topics need to be reviewed. Still, such techniques should be implemented gradually and with careful planning. For example, I did not find the journals added significantly to the techniques I had already implemented, but instead created an excessive workload. However, I think journals could successfully be substituted for more traditional homework assignments or in situations where more individualized assessment is not otherwise possible. At the university level, we often have tremendous flexibility in the types of assessment we can incorporate into a course. Furthermore, as students become accustomed to such approaches, we will be able to more easily expand their use. I find the above techniques to be excellent tools for informing me about what students know and can do. At the same time, these techniques tell me whether students appreciate statistics and computers. They also allow me to provide prompt feedback that is meaningful to the students. Such information is crucial to both the instructor and the students, and is much richer than that provided by traditional techniques.

Math 37 is a four-unit course with three 80-minute lectures per week and one 50-minute lab per week for 15 weeks. Sections are limited to 45 students. The text is Moore and McCabe (1993). The following assessment components are incorporated into Math 37:

- Two in-class midterms (15% each)
- Final exam (15%)

85% from in-class exam, 15% from take-home exam distributed one week before the traditional in-class final. Students work individually on distinct questions. Sources of data: ASA datasets, Journal of Statistics Education, casebooks (e.g., Chatterjee, Handcock, and Simonoff 1996).

- Weekly homework assignments (15%)

Due on Friday. Randomly selected students present "practice problem" solutions to the class on Wednesday.

- Weekly computer labs (15%)

Students meet for one hour per week in the computer lab with the instructor and have one week to complete the lab write-up. Sections are limited to 23 students. Lab manuals are handed out the first week. Students complete nine 25-point labs and five 50-point labs ("full labs"). The full lab write-ups are graded on presentation (10%), computer output (40%), and interpretation of results (50%).

- Quizzes (10%)

Eight to ten quizzes given; lowest two grades dropped.

- Term Project (15%)

Groups of 3 to 5 work together on one project over the semester. Projects are graded on data collection design (20%), analysis (20%), interpretation (20%), written report (20%), and final oral presentation to the class (20%). The grade given is 85% group grade and 15% individual grade. Previous project topics can be viewed at http://www.uop.edu/cop/math/chance/projects.html

Groups submit four periodic project reports:

- Report 1 (Week 4): Topic, population, type of
study, sampling frame.
- Report 2 (Week 6): Data collection with copy of
design or survey. Submitted for peer review.
- Report 3 (Week 11): Preliminary descriptive
statistics, goals for analysis.
- Report 4 (Week 12): Final selection of statistical
tests.
- Rough Draft (Week 14): Optional.
- Final Report and Presentations (Week 15)

In the peer reviews students are asked to review the proposed study, indicating where the study is lacking in clarity or has potential biases. The instructor looks over the reviews and returns them to the groups.

- If journals are used, they are worth one-half of the homework grade. A 3-point scoring rubric is used to assign points for write-ups on ten in-class activities, and a 2-point rubric is used for ten chapter summaries. There are 10 points total for self-evaluations, 10 points total for questions on the material, and 10 points total for finding "other" applications. Journals are collected every 2 to 3 weeks.
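The grading scheme above is simple weighted arithmetic. As an illustration only (the code, function names, and 0-100 scoring scale are my own; the weights are those stated above), a minimal sketch of how the components combine:

```python
# Illustrative sketch of the Math 37 grade components; not part of the
# course materials. All component scores are assumed to be on a 0-100 scale.

WEIGHTS = {
    "midterm_1": 0.15,
    "midterm_2": 0.15,
    "final_exam": 0.15,   # itself split 85% in-class, 15% take-home
    "homework": 0.15,
    "labs": 0.15,
    "quizzes": 0.10,      # lowest two quiz grades dropped before averaging
    "project": 0.15,      # itself split 85% group, 15% individual
}

def final_exam_score(in_class, take_home):
    """Combine the final exam: 85% in-class, 15% take-home."""
    return 0.85 * in_class + 0.15 * take_home

def project_score(group, individual):
    """Combine the project grade: 85% group grade, 15% individual grade."""
    return 0.85 * group + 0.15 * individual

def quiz_score(quiz_grades):
    """Average the quizzes after dropping the lowest two grades."""
    kept = sorted(quiz_grades)[2:]
    return sum(kept) / len(kept)

def course_grade(scores):
    """Weighted average of all component scores."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights total 100%
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)
```

For example, a student scoring 100 on every component receives a course grade of 100, and the weights are easy to audit because they must sum to one.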

Anderson, J. R. (1983), The Architecture of Cognition, Cambridge, MA: Harvard University Press.

Archbald, D. and Newmann, F. (1988), Beyond Standardized Testing: Assessing Authentic Academic Achievement in the Secondary School, Reston, VA: National Association of Secondary School Principals.

Chatterjee, S., Handcock, M., and Simonoff, J. (1996), A Casebook for a First Course in Statistics and Data Analysis, New York: John Wiley and Sons.

Cobb, G. (1992), "Teaching Statistics," in Heeding the Call for Change: Suggestions for Curricular Action, ed. L. Steen. MAA Notes, No. 22, Washington: Mathematical Association of America, pp. 3-34.

Crowley, M. L. (1993), "Student Mathematics Portfolio: More Than a Display Case," The Mathematics Teacher, 87, 544-547.

delMas, R. (1996), "A Framework for the Development of Software for Teaching Statistical Concepts," Proceedings of the 1996 International Association of Statistics Education (IASE) Roundtable, Granada, Spain.

Egge, E., Foley, S., Haskins, L., and Johnson, R. (1995), "Statistics Lab Manual," Carleton University, Mathematics and Computer Science Department, 3rd edition.

Fillebrown, S. (1994), "Using Projects in an Elementary Statistics Course for Non-Science Majors," Journal of Statistics Education [Online], 2(2). (http://www.amstat.org/publications/jse/v2n2/fillebrown.html)

Gal, I. and Ginsburg, L. (1994), "The Role of Beliefs and Attitudes in Learning Statistics: Towards an Assessment Framework," Journal of Statistics Education [Online], 2(2). (http://www.amstat.org/publications/jse/v2n2/gal.html)

Garfield, J. (1993), "An Authentic Assessment of Students' Statistical Knowledge," in National Council of Teachers of Mathematics 1993 Yearbook: Assessment in the Mathematics Classroom, ed. N. Webb, Reston, VA: NCTM, pp. 187-196.

----- (1994), "Beyond Testing and Grading: Using Assessment to Improve Student Learning," Journal of Statistics Education [Online], 2(1). (http://www.amstat.org/publications/jse/v2n1/garfield.html)

Mackisack, M. (1994), "What is the Use of Experiments Conducted by Statistics Students?" Journal of Statistics Education [Online], 2(1). (http://www.amstat.org/publications/jse/v2n1/mackisack.html)

Moore, D. S. and McCabe, G. P. (1993), Introduction to the Practice of Statistics (2nd ed.), New York: W. H. Freeman.

National Council of Teachers of Mathematics (1989), Curriculum and Evaluation Standards for School Mathematics, Reston, VA: NCTM.

----- (1993), Assessment Standards for School Mathematics, Reston, VA: NCTM.

Resnick, L. (1987), Education and Learning to Think, Washington, DC: National Research Council.

Roberts, H. V. (1992), "Student-Conducted Projects in Introductory Statistics Courses," in Statistics for the Twenty-First Century, eds. F. Gordon and S. Gordon, MAA Notes, No. 26, Washington: Mathematical Association of America, pp. 109-121.

Rosenshine, B., and Meister, C. (1994), "Reciprocal Teaching: A Review of the Research," Review of Educational Research, 64(4), 479-530.

Scheaffer, R., Gnanadesikan, M., Watkins, A., and Witmer, J. (1996), Activity-Based Statistics, New York: Springer-Verlag.

Sevin, A. (1995), "Some Tips for Helping Students in Introductory Statistics Classes Carry Out Successful Data Analysis Projects," presented at the Annual Meeting of the American Statistical Association, Orlando, FL.

Snell, J. L., and Finn, J. (1992), "A Course called 'Chance'," Chance, 5(3-4), 12-16.

Spurrier, J. D., Edwards, D., and Thombs, L. A. (1995), Elementary Statistics Lab Manual, Belmont: Wadsworth Publishing Co.

Beth L. Chance

Department of Mathematics

University of the Pacific

Stockton, CA 95211
