INTRODUCTORY STATISTICAL EDUCATION: RADICAL REDESIGN IS NEEDED, OR IS IT?

Roger W. Hoerl
General Electric Company
Introduction. The bulk of this article was originally presented as a commentary on Neil Ullman's paper "Statistical or Quantitative Thinking as a Fundamental Intelligence", presented at the 1995 Joint Statistical Meetings in Orlando. (Editors' Note: A condensed version of N. Ullman's paper appeared in Vol. 2, No. 1-Winter 1996 of this Newsletter.) The purpose of including it in this publication is to suggest that introductory statistics courses must be radically redesigned, not incrementally improved, if statistics is to assume its rightful place in US society. It is further argued that how to implement this radical revision is basically known; it simply requires combining several suggestions made previously by recognized researchers in the field. While these suggestions do not appear radical individually, combining them would result in an introductory course virtually unrecognizable by today's standards.

The remainder of the article is organized as follows. First, a case is made for radical revision of introductory statistical education as a key leverage point for positively impacting the statistics profession as a whole. Next, an overall approach to redesign, based on four areas of revision, is presented. These four areas are then explored in detail, and an overall summary is given last.

The need for radical redesign. The status of the statistics profession has been discussed at great length over the past ten or so years, and has been addressed by virtually every ASA president in this time frame. There is general consensus that we should be more influential in important issues facing our country, such as health care, economic policy, and the evaluation of various social programs. The word "crisis" and the phrase "under siege" have been used by some to describe our current situation. While I do not intend to review this vast literature, the following quote from Bailar (1995) is indicative of the tone of much of the commentary.

"As academic statisticians, we are missing the boat. We are barking up the wrong tree. We do not see what is plainly before us. We are kidding ourselves when we think that 'our' kind of statistics is vital to the welfare of the nation and the world....More and more, despite occasional appearances otherwise, we as academic statisticians are talking to ourselves. Even at this symposium, we talk about how to do the old things better and more broadly, not about what we could offer to society and what needs to be done...The kinds of statistics that we teach in under-graduate and especially in graduate programs have almost nothing to contribute to anything that matters on the scale of these problems."

Unfortunately, many view the problem as one of "image" rather than substance, and do not see a need for us to fundamentally change. In diagnosing why this is so, I believe Carl Morris (1995) gave the correct answer when he said: "Can the needed changes be made? I am pessimistic about this. It is awfully hard to change, because to do so requires performing surgery on ourselves, and that hurts."

If lack of influence is a symptom, what might be the root causes? As has been pointed out by many, the first exposure to formal statistics that most people receive is through introductory statistics courses in college. As someone who has both taught this course and worked with numerous people in business and industry who went through it, I can testify that it can significantly and permanently shape students' perception of statistics. Unfortunately, this first impression is often negative, and may leave the future engineer or business leader believing that formal statistics is to be avoided at all costs. Bob Hogg (1991) spoke for many of us when he stated: "So students frequently view statistics as the worst course taken in college." I firmly believe that as a profession, we have significantly underestimated the long-term damage this perception does to us. I would further suggest that radical improvement of these introductory courses may be the single most powerful leverage point for improving the influence of our profession.

Clearly, we are not solely to blame for the poor performance of introductory statistics courses. The lack of quantitative literacy in incoming students, the low value placed on introductory courses by academic administrators, academic reward systems which force non-tenured faculty to focus their energies narrowly on publication, and the frequent use of untrained non-statisticians to teach statistics all contribute to the problem. However, most commentators have acknowledged that we should have the leading role in improving the introductory courses, and many positive reforms are under way. My concern is that individually these reforms appear detached from one another and address only isolated symptoms. In other words, they appear as "one-at-a-time" experiments unguided by theory. What follows is an attempt to view the reforms holistically, and to provide a theoretical framework for how they might be integrated.

A suggested approach. If we organize the main reform suggestions which have been made, we are left with four major redesign areas, which are closely interrelated. Together they provide a suggested course of action, which I would view as a "known solution," in that the individual ideas have been suggested previously and are generally agreed upon.

  1. Consciously develop course objectives based on students' and future employers' needs, and structure the course to meet these objectives.
  2. Radically revise the course content based on the above objectives, which should lead to more emphasis on statistical thinking and less on numerical calculations.
  3. Utilize experiential learning more and lecture less.
  4. Reverse the organization of the course to teach overall approaches to scientific inquiry first, and individual tools afterwards.

All of these points are reflected in Shon's paradox, mentioned by Ullman (1995). Shon's paradox (gender neutral version) states: "The paradox of learning a really new competence is this: that students cannot at first understand what they need to learn, can learn it only by educating themselves, and can educate themselves only by beginning to do what they do not yet understand."

I will now discuss the individual suggestions.

Course objectives. As noted in Shon's paradox, students do not know what they need to learn. Therefore, student evaluations cannot provide much guidance on what we should try to accomplish in the introductory course. I believe that much of the disagreement over course design is due to differing views of what we are trying to accomplish. As the old saying goes, if we don't know where we are going, any road will do! Despite the vast literature on reforms to the introductory course, precious little meaningful discussion of proper objectives has occurred. This point has been made previously:

Hogg (1991): "In some of these beginning courses, we must carefully consider the goals which differ from one group of students to another. Certainly, students in the liberal arts, taking statistics only to satisfy some 'quantitative reasoning' requirement, do not need to know about t and F tests....but no matter what the first course is under consideration, it is important to state the goals and give an appropriate course."

Wild (1994): "We need to give careful consideration to the aims and objectives of every part of our programs. If we have not thought through very carefully what we are trying to achieve, we are in no position to assess the quality of what we provide. Unfortunately, so much of what we do is not thought through from a careful consideration of customers, aims, and objectives; it just grows in an ad hoc way over the years, building on what has been done before."

In summary, we can't agree on a course because we haven't agreed on (or even discussed) objectives. Hopefully, however, all would agree that trying to make students experts in all statistical tools is not a realistic objective. It is interesting to look at typical introductory courses and texts and to think inductively about what set of objectives would lead one to develop such a course. Such an exercise would seem to lend credence to Wild's point.

Content. Again, as noted in Shon's paradox, the student is not in a good position to suggest content. Clearly the content should be based directly on the objectives. It has been convincingly argued, however, that any reasonable set of objectives would lead one to stress sound conceptual understanding of statistics over memorization of formulas. The following comments are illustrative.

Cobb (1991): "Emphasize statistical thinking: any introductory course should take as its main goal helping students to learn the basic elements of statistical thinking. Many advanced courses would be improved by a more explicit emphasis on those same basic elements."

Hogg (1991): "The course should focus on the process of learning how to ask appropriate questions, how to collect data effectively how to summarize and interpret that information, and how to understand the limitations of statistical inference .… Good statistics is not equated with mathematical rigor or purity, but is more closely associated with careful thinking."

Wild (1994): "These characteristics concern mental habits (rather than technical skills), which play a large part in statistical thinking...It would be very beneficial to society as a whole if these mental habits could be instilled early and widely, beginning with Stat 101."

Bradstreet (1996): "Our profession must decide if we want to teach people to compute before we ask them to think."

In summary, any reasonable set of objectives would lead to emphasizing statistical thinking over statistical techniques. The formal techniques can then be more rigorously addressed in subsequent courses. This is the approach typically used in other disciplines, such as economics, engineering, psychology, physics, and so on.

Since statistical thinking has become somewhat of a "buzzword", I would like to clarify that I am using the term in the sense of the definition published by the Statistics Division of ASQC (1996):

Statistical thinking is a philosophy of learning and action based on the following fundamental principles:

  • All work occurs in a system of interconnected processes.
  • Variation exists in all processes.
  • Understanding and reducing variation are keys to success.

This definition is discussed in greater detail in the Special Publication on Statistical Thinking (1996).

Experiential learning. Shon notes that students can only learn by educating themselves. While there are various learning styles, it is virtually impossible to develop a new competence without actually doing it yourself. For example, imagine trying to learn to play chess or bridge solely by studying the rules in a class. Again, the suggestion to move towards less lecturing and more experiential learning has been made by many.

Hogg (1991): "Instead of asking students to work on 'old' data, even though real, is it not better to have them find or generate their own data? Projects give students experience in asking questions, defining problems, formulating hypotheses and operational definitions, designing experiments and surveys, collecting data and dealing with measurement error, summarizing data, analyzing data, communicating findings, and planning 'follow-up' experiments suggested by the findings."

Garfield (in Cobb 1991): "I do not lecture at all."

Snee (1993): "We all learn more from what we do than from what we watch. Value comes from using statistics in one's life ... collection and analysis of data is at the heart of statistical thinking. Data collection promotes learning by experience and connects the learning process to reality."

In summary, this reform appears obvious, and more progress has probably been made on it than on the others. Considering the previous two points, however, using experiential learning to teach the wrong content is of little value.

Organization of the course. This is probably the reform area which has received the least attention in the literature. This is unfortunate, as it is well known in educational and behavioral research that the sequence in which material is presented significantly impacts understanding and retention. For example, it is known (Forrester 1990) that students learn most easily by seeing the big picture first, and then learning the details (whole to parts); by seeing a tangible example before learning the theory behind it (tangible to abstract); and by first grasping the concept of what they are doing, and then developing the technical capability to do it (gross to fine). These points provide a proven theory to guide reform efforts. Statistics, especially at the introductory level, tends to use just the opposite approach: individual tools before the big picture, abstract theory before tangible examples, and fine technical skill before gross conceptual understanding.

There are obvious tie-ins to content here. The overall process of scientific inquiry is a "whole". Real, sequential case studies can be used to introduce new techniques; in fact, Harry Roberts has been making this suggestion for years. "Gross" conceptual understanding corresponds to statistical thinking, while "fine" technical competency corresponds to statistical techniques.

One potential explanation for the statistical community's resistance to utilizing this theory is the current overemphasis on mathematics (see Box 1993). Anything mathematical is often viewed as "rigorous" or "pure", while the overall process of investigation, sequential case studies, and conceptual understanding, although certainly scientific, are often viewed as extraneous, or "fluff", because they cannot be expressed with mathematical rigor. What is really "meat" and what is "fluff" obviously depends on the objectives.

Shon's point that students "...can educate themselves only by beginning to do what they do not yet understand" refers to the need for gross understanding prior to technical competence, i.e., students need to begin applying what they have learned before they have fully mastered it. This point also explains why it is not a problem to show students techniques in a case study which they have not yet been formally taught. They can still get a gross understanding of what the technique is doing, and why it is being applied. This also creates "suction" from the students, in that after seeing the technique used, they are anxious to learn more about it.

Several authors have also noted the need to explicitly teach the overall process of scientific inquiry prior to teaching individual tools. This provides an overall context for the tools, so that students understand why they need to learn a particular tool, and how it fits in with the rest of the course. For example: Wild (1994): "The process of investigation as a whole should be the heart of any statistics program, particularly of the basic introductory course, Stat 101. It is something one should never lose sight of, and should always come back to." And Hogg (1991): "In particular, students should appreciate how statistics is used in the endless cycle associated with the scientific method..."

In summary, students' understanding and retention could be significantly enhanced by teaching the overall process of investigation before the tools, by using tangible case studies to introduce and motivate new topics, and by striving for gross understanding of key concepts (statistical thinking) before fine skills in applying numerical tools. It should be obvious that this would require virtually a complete reversal of the traditional course and text.

Conclusion. Revitalizing the introductory statistics courses may be the single most impactful thing we can do in the long run to enhance the influence of our profession within US society. Enough theory and specific examples exist to guide radical revision of these introductory courses. Individually, the reforms have been proven effective. Integrating them would lead to the following overall approach:

  1. Clarify our objectives. Suggestion - the main objectives should be to develop capability to apply statistical thinking and the overall process of investigation to everyday situations.
  2. Base the content on the objectives. Suggestion - emphasize statistical thinking versus statistical techniques.
  3. Utilize experiential learning. Suggestion - sequential, real projects of the students' choosing are excellent vehicles for learning the process of investigation.
  4. Reorganize the course in light of educational and behavioral science. Suggestion - a potential sequence might be: Why are we studying this topic? (Gross understanding); What is it? (Part 1, Tangible whole - real case studies); What is it? (Part 2, Abstract whole - overall framework or model); How do I do it? (Fine skills in the tools).
This sequence may seem unusual (it is!), but it is in fact natural, both for the overall course and for individual sections. This article uses the same basic sequence, although in the interest of space I have not provided a tangible case study for the redesign. For such an example, see Hoerl and Snee (1995). The article began by explaining why improving the introductory courses is so important for our profession (Why). It then presented the overall suggested approach for reform (What, Part 1). Shon's paradox could be considered a theoretical framework for this approach (What, Part 2). A much better example of the type of theoretical framework I am suggesting we use is the model in Chapter 1 of Box, Hunter, and Hunter (1978), which illustrates the sequential nature of experimentation, iterating between data and subject matter theory. This is one example of explicitly teaching the process of investigation prior to the individual tools.

Once the overall suggested approach was presented at a high level, details of the individual points (How) were discussed. The reader will have to decide whether this approach was more effective than if the article had begun with details of the individual reform areas, and then tried to tie them together at the end, without ever discussing why this subject is important. Such an approach would be consistent with the typical introductory statistics course.

Since how to change is basically known, the key question becomes: will we? As noted by Hogg (1991): "Probably each of us thinks the others should change, but it is important that we must start with ourselves." Now is not the time to point fingers at one another; rather, it is time for those who are willing to "belly up to the bar" and "perform surgery on ourselves." Good luck with your surgery!

For further information contact:
Roger Hoerl
General Electric Co.
Bldg K1, 4C39A
PO Box 8
Schenectady NY 12301
(518) 387-4040
Fax: (518) 387-5714
hoerl@crd.ge.com

REFERENCES

  • Bailar, John C. (1995), "A Larger Perspective," in Special Section on CATS Symposium on Modern Interdisciplinary University Statistics Education, The American Statistician, 49, 1, 10-11.

  • Box, G.E.P. (1993), "The Role of Statistics in Quality Systems - Methods and Training," presented at the International Statistical Institute biennial meeting, Florence, Italy.

  • Box, G.E.P., Hunter, W.G., and Hunter, J.S. (1978), Statistics for Experimenters, John Wiley and Sons, New York.

  • Bradstreet, Thomas E. (1996), "Teaching Introductory Statistics Courses So That Nonstatisticians Experience Statistical Reasoning," The American Statistician, 50, 1, 69-78.

  • Cobb, George (1991), "Teaching Statistics: More Data, Less Lecturing," Amstat News, December, 182, 1 and 4.

  • Forrester, A.D. (1990), "An Examination of the Parallels Between Deming's Model for Transforming Industry and Current Trends in Education," Small College Creativity, 2, 2, 43-66.

  • Hoerl, R.W., and Snee, R.D. (1995), "Redesigning the Introductory Statistics Course," Report No. 130, Center for Quality and Productivity Improvement, University of Wisconsin-Madison.

  • Hogg, Robert V. (1991), "Statistical Education: Improvements Are Badly Needed," The American Statistician, 45, 4, 342-343.

  • Morris, Carl N. (1995), Comments on the Special Section on CATS Symposium on Modern Interdisciplinary University Statistics Education, The American Statistician, 49, 1, 21-23.

  • Snee, Ronald D. (1993), "What's Missing in Statistical Education?," The American Statistician, 47, 149-154.

  • Statistics Division, ASQC (1996), Glossary and Tables for Statistical Quality Control, Quality Press, Milwaukee, Wisconsin.

  • Statistics Division, ASQC (1996), Special Publication on Statistical Thinking, available from Quality Information Center, ASQC, 611 East Wisconsin Avenue, PO Box 3005, Milwaukee, WI 53201-3005. (800) 248-1946.

  • Ullman, Neil R. (1995), "Statistical or Quantitative Thinking as a Fundamental Intelligence," presented at the 1995 Joint Statistical Meetings, Orlando, Florida.

  • Wild, C.J. (1994), "Embracing the 'Wider View' of Statistics," The American Statistician, 48, 2, 165-171.