Newsletter for the Section on Statistical Education
Volume 7, Number 1 (Winter 2001)
Statistics began its emergence 200 years ago to meet needs of science and government, and has enjoyed considerable success in guiding quantitative studies for 100 years. Why, then, is it not considered part of everyone's basic school education? Why is it still among the most despised of college courses? Why has its use not become epidemic in business, industry, and government? Is progress being made to remedy these situations? The answer to the last question is a qualified "yes," but much work remains if statistics is to move to the forefront of quantitative education and practice. Opportunities abound, however, and the future could be bright!
1. Perusing the Past: Many Parents but No Home
As the recent history and current trends in statistics education have their genesis in the development of the discipline of statistics over the past 200 years, it seems appropriate to review key elements in that development. Education is generally entrenched in tradition, and statistics education is no exception. The problem is that the sciences saw statistics as a useful servant to many but a key component of none, and the traditions that became entrenched had wide enough cracks to allow statistics to fall through.
From the Pharaohs to Napoleon. From the beginning of recorded history to the late eighteenth century, the developments in statistics were few and far between. There were a few glimmers of excitement, to be sure, such as censuses in the Roman Empire, John Graunt's Observations upon Bills of Mortality in 1662, John Arbuthnot's 1710 argument in favor of Divine Providence governing births, and the exciting developments in the companion field of probability. Emperors were interested in keeping track of their people, money and key events (such as wars and the flooding of the Nile) but little else in the way of quantitative assessment of the world at large. Quantification was a tremendous problem, as pointed out by historian Theodore Porter.
As recently as two centuries ago, few people were quantitatively literate. Scientists were a partial exception, but not a very important one, because there were so few of them, and because many, even in what we now consider the physical sciences, made little use of numbers. As recently as 1800, the most important practitioners of quantification were merchants. The manipulation of quantities was an extraordinarily challenging task in those days when measures often varied from town to town, when there were different measures for different substances... The complexity of measure provided one of the main sources of support for mathematicians in Europe through the eighteenth century. (In Steen, 1997)
In short, many mathematicians earned a living working for government or business doing complicated arithmetic, and this was not all bad.
In the late 1700's, the situation began changing rapidly. It was the Age of Reason, The Enlightenment, in which great mathematicians and scientists worked together to begin developing theoretical as well as practical solutions to the problems of the day. The unifying decimal system was developed in France (and was immediately suspected by the lower classes as a government scheme that would work to their disadvantage). Nations began serious attempts to gather data on their residents (the first United States census being in 1790), an exercise that gave our field its name, and government bureaucracies were born. One of the hallmarks of this period, though, was that mathematicians still gained income by solving arithmetic problems, and so many of the great names of the day had government jobs that allowed them to lend their brilliance to the solution of everyday problems. A case in point is Laplace, "the Newton of France," who worked for the French government in various capacities (once being the Minister of the Interior) and made innovative contributions to the country's census. Data were becoming important, and great minds were thinking about how to collect and use them intelligently.
From Measurement to Statistical Science. The science that held sway above all others around 1800 was astronomy, and the great mathematicians of the day made their scientific contributions in that area. Legendre (least squares), Gauss (normal theory of errors), and Laplace (least squares and the central limit theorem) all were motivated by problems in astronomy, an appropriate science for the Age of Reason. The importance of quantification in the physical sciences is summarized succinctly in Lord Kelvin's dictum, "If you cannot measure, then your knowledge is meager and insubstantial." Serious quantitative work in psychology and the social sciences began, however, in the nineteenth century (the Age of Romance), and by the end of the century the main thrust of statistical thinking had taken hold. In the 1860's Fechner introduced paired comparison and factorial designs for experiments in sensory perception, and by the 1890's Galton had laid the foundation for regression and correlation (Stigler, 1986). It took about 100 years to get from least squares to regression analysis, whereas the two topics are now hopelessly intermingled in introductory statistics books. There was, and is, quite a philosophical difference between using least squares to estimate physical parameters of the solar system on the one hand and means of conditional distributions in a theoretical probability model on the other. Statistics is, indeed, subtle!
As statistics moved from least squares to regression, what happened to the mathematicians? Some were still fascinated by statistics, such as Galton's associates Edgeworth and Karl Pearson, but most found homes in the many universities that were established around the world in the Romantic period. As educational opportunities opened up for large segments of populations, mathematicians could find positions that paid them to think about abstract research. No longer did they have to do complicated arithmetic, or statistics, to have a decent income. So, as the number of students taking mathematics courses in colleges and universities increased, the interest of the mathematicians in statistics decreased. In addition, these same mathematicians were dictating to the schools what mathematics a high school student was required to have before entering college. Statistics in the late nineteenth and early twentieth century was becoming the province of the social sciences (Florence Nightingale being one of its proponents) and, after a young geneticist named Ronald A. Fisher took a job at an English agricultural research station in 1919, the biological sciences and agriculture. At the same time, the ever-increasing demands of industry were producing new fields of engineering, which made use of statistical procedures in research and development and in maintaining quality of manufactured products. Walter Shewhart's control charts (1925) embodied the spirit of the latter. In short, statistics lost a home with the mathematicians and was now spread across the social sciences, the biological sciences, and engineering. Statistics was emerging as a science, but had a troubled childhood; many homes offered a bed, but none would support its maturing to its full potential; this boded ill for statistics education.
From Research Centers to Undergraduate Courses in Universities. The influence of agriculture and Fisher on the maturing of statistics as a discipline cannot be overstated, and this influence permeates statistics education as well. Under Fisher's influence, research centers began springing up in Commonwealth countries, notably India and Australia. These led to educating graduate students in statistics to solve applied problems and to develop new theory, as new problems were being found with great regularity. In the United States, the first graduate programs in statistics were formed in the 1930's and 1940's at land-grant universities such as Iowa State, North Carolina State, and Virginia Polytechnic Institute, followed quickly by more theoretical programs at such universities as North Carolina, Michigan, and California at Berkeley.
World War II brought statistics back into the limelight as a very useful methodology for solving problems of importance to industrial productivity and to the success of military campaigns. Not since the Enlightenment Period (the days of Laplace) had so many great minds in the mathematical and physical sciences been directed toward the solution of practical problems. After the war, many retained an interest in statistics and myriad new programs sprang up around the U.S. and around the world. Not all went under the name of "statistics" though, as new names such as operations research, industrial engineering, and management science came into being.
During this period, undergraduate courses in statistics began to appear, especially the infamous introductory course that came to be much hated by large numbers of students over many years. This course was typically taught in a mathematics department and based on an outline that was watered down from a graduate-level theory course, covering some probability and the basics of inference (hypothesis testing and confidence intervals) with little on design of studies or analyzing data. The courses on statistical methods were very often taught in the various disciplines using statistics (particularly the social sciences, business, and agriculture). Even with all the success of the discipline during the war years, statistics still had trouble finding a stable home.
From Grand Ideas to Practical Revolutions in the Schools. In the United States, every major effort at curricular reform in the twentieth century had statistics on the list of new topics for emphasis in the schools. Typical of the thinking of the early part of the century is the following 1926 statement from Herbert Slaught, an educator at the University of Chicago.
One of the most significant evidences that the importance of mathematics is permeating the whole fabric of modern life is shown in the recent unparalleled development of the use of statistical methods in the study of quantitative relations in almost every department of investigation. This appears in the simplest form in all the proposed new curricula for the junior high schools. It is emphasized in the reorganized programs for the senior high schools. It is further developed in the enriched courses for college students not only in all the sciences but also in economics, sociology, anthropology, etc., and in most of the effective studies in education. It is indispensable in all laboratories everywhere, whether in the schools or in the factories, in commerce, in big business of every kind. These are phases of mathematics which are gradually growing into the consciousness of the general public and which are capable of recognition and widespread understanding. (Slaught, 1926)
A similar statement came from one of the most highly regarded university teachers of statistics in the first half of the century, Helen Walker of Columbia University.
More and more the modern temper relies upon statistical method in its attempts to understand and to chart the workings of the world in which we live.
The average man -- a phrase which itself is a statistical abstraction -- finds that to a certain degree he must think statistically in order to read his newspaper and the current magazines.
Any one vitally concerned with the teaching of high school pupils and observant of the rapidly growing public need for some knowledge of quantitative method in social problems must be asking what portions of statistical method can be brought within the comprehension of high school boys and girls, and in what way these can best be presented to them. (Walker, 1931)
In later years others added similar pleas for statistics in the schools. A National Research Council report of 1947 called for introducing elementary statistics into the high school curriculum "as soon as there is a sufficient supply of trained teachers," and W. Edwards Deming in 1948 pointed to the "pressing need for introducing very general courses into the high schools and more widely in the colleges so that ... future citizens may have the valuable orientation in quantitative thinking about social affairs which statistics affords" (Dutka, 1950). By 1975, however, a National Committee on Mathematics Education reported that "While probability instruction seems to have made some progress, statistics instruction has yet to get off the ground."
Why were these grand plans for statistics never achieved? There are many reasons, of course, among them the lack of interest and preparation of the teachers and the stranglehold that mathematicians held on the mathematics curriculum in the schools. And then there was the lack of a unified voice for statistics! Schools did teach very applied topics, such as business arithmetic, for much of the century, but statistics kept falling through the cracks on both the applied and theoretical sides. It took a couple of great turns of events to finally allow statistics to succeed in the schools. One was the emphasis upon data analysis and the other was the development of technology.
2. Embracing the Present: Revolutions in Data, Technology, and Measurement
Tukey and Technology. John Tukey's admonition, "Let the data speak," set the tone for a revolution in statistical thinking in the 1970's and 80's. All of a sudden it became respectable to explore data and to look at modeling as an interactive process between theory and data (even though this is what good statisticians had done for years). But data exploration is not easy to do by hand, so a companion revolution in inexpensive, easy-to-use technology had to accompany it. We should be clear, however, about what Tukey meant by data analysis. To him, it embraces "procedures for analyzing data, techniques for interpreting the results, ways of planning and gathering data, and the machinery of mathematical statistics which apply to analyzing data." Data analysis is part art and part science, and should emphasize "the art of cookery" rather than "cookbookery" (Tukey, 1962).
"In God we trust; all others bring data." As data analysis was becoming acceptable, the idea of collecting data to make objective decisions on all sorts of problems, from the quality of manufactured products to the efficiency of an office, was sweeping much of the world. The spirit of this revolution is captured in David Moore's remark, "If you don't know what to measure, measure anyway: you'll learn what to measure" (Cobb, 1993). This transition in statistical thinking to measurement and data analysis, with the help of easy computation, led to profound changes in the introductory college courses (changes that are still going on) and to the acceptance of a strand in statistics within the K-12 mathematics curriculum.
The NCTM-ASA Joint Committee. In the mid 1980's the Joint Committee between the National Council of Teachers of Mathematics (NCTM) and ASA, organized by Fred Mosteller in 1967, developed a series of booklets on statistics for the middle school and early high school grades. The goal was to show students and teachers how an emphasis on collecting and analyzing data and on simulation of probabilistic events could motivate and illustrate much of the mathematics curriculum while, at the same time, teaching students some important statistical skills. This so-called Quantitative Literacy Project (QLP) became quite successful in changing the way many teachers thought about statistics, and became the basis for the statistics strand in NCTM's Curriculum and Evaluation Standards for School Mathematics released in 1989.
Collecting, organizing, describing and interpreting data ... are skills that are increasingly important in a society based on technology and communication. These processes are particularly appropriate for younger children because they can be used to solve problems that are inherently interesting, represent significant applications of mathematics to practical questions, and offer rich opportunities for mathematical inquiry. (NCTM, 1989)
These Standards became the blueprint for the revamping of mathematics curricula in most states, and have even influenced curricula in some countries outside of the U.S. The QLP and the Standards influenced other work, notably the reports of the National Research Council's Mathematical Sciences Education Board.
Secondary school mathematics should introduce the entire spectrum of mathematical sciences: ... data analysis, probability and sampling distributions, and inferential reasoning. (MSEB, 1990)
The statistics strand has become so widely accepted as part of the K-12 mathematics curriculum that it is now one of the areas covered in the National Assessment of Educational Progress (NAEP), the so-called Nation's Report Card. Mathematics on the NAEP is defined as five areas: number sense, properties, and operations; measurement; geometry and spatial sense; data analysis, statistics, and probability; and algebra and functions. These NAEP definitions also formed the basis for the mathematics portion of the Third International Mathematics and Science Study (TIMSS).
Philosophy and Style of the "New" Statistics. Developed by a team of statisticians and high school teachers, the QLP attempted to capture the spirit of modern statistics and modern ideas of education by following a philosophy that emphasized understanding and communication. That philosophy is outlined in the following steps.
1. Data analysis is central.
2. Statistics is not probability.
3. Resistant statistics should play a large role.
4. There is more than one way to approach a problem in statistics.
5. Real data of interest and importance to the students should be used.
6. The emphasis should be on good examples and building intuition.
7. Students should write more and calculate less.
8. The statistics taught in the schools should be important and useful in its own right, for all students.
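Point 3 of this philosophy, the role of resistant statistics, is easy to demonstrate. The following sketch (an illustration added here, with made-up numbers, not an example from the QLP materials) shows how a single wild value drags the mean far off while leaving the median untouched:

```python
# Resistant statistics: the median barely moves when one value is wildly
# wrong, while the mean can be pulled far from the bulk of the data.
def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

clean = [10, 11, 12, 13, 14]
with_outlier = [10, 11, 12, 13, 140]  # one hypothetical data-entry error

print(mean(clean), median(clean))                # 12.0 12
print(mean(with_outlier), median(with_outlier))  # 37.2 12
```

One corrupted observation moved the mean from 12.0 to 37.2, while the median stayed at 12, which is why resistant summaries are a safer starting point for students exploring raw, messy data.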
This philosophy is best put into classroom practice with a teaching style emphasizing a hands-on approach that engages students to DO an activity, SEE what happens, THINK about what they just saw, and then CONSOLIDATE the new information with what they have learned in the past. This style requires a laboratory in which to experiment and collect data, but the "laboratory" could be the classroom itself; it does not need to be a computer laboratory, although the use of appropriate technology is highly encouraged.
College Courses and the Advanced Placement Connection. The same philosophy and style that mark the QLP are recommended by many for the introductory college course. Geoffrey Jowett, one of the great teachers of statistics in New Zealand and England, stated in his 1990 address to the Third International Conference on Teaching Statistics that "A statistics course at a university should have as many laboratory hours as physics or chemistry." In fact, many have suggested that the teaching of statistics should resemble the teaching of science more than the traditional teaching of mathematics. Many of the originators and master teachers of statistics courses in this century (William Cochran and Fred Mosteller, to name two) actually made use of laboratory activities long before the advent of computers.
Another Joint Committee, this one between the Mathematical Association of America (MAA) and ASA, put its ideas on teaching statistics, similar to those expressed above, into formal recommendations for the introductory course that have the approval of both named associations. In summary, the recommendations (Cobb, 1992) are:
1. Emphasize statistical thinking.
2. More data and concepts; less theory; fewer recipes.
3. Foster active learning.
There seems to be a large measure of agreement these days on what content to emphasize in introductory statistics and how to teach the course. As a result, statistics education is making some progress and the introductory course is no longer as hated as it once was (although it is still not as well loved as many would like).
The K-12 strand in statistics and the introductory college course in the subject should both be built around the spirit of modern data analysis, design of studies, measurement, and simulation, with appropriate use of technology. The strand should serve as good background for the course. Realizing the connection between these two, a group of statistics and mathematics educators thought that the college course could, perhaps, be moved into the high school curriculum for good students interested in another option in high school mathematics. The mechanism for accomplishing this in a way that would establish national standards for the high school course was the Advanced Placement program of the College Board. An AP Statistics course was finally approved and offered for the first time in 1997. In 1999 the exam for this course was given to over 25,000 students in 1,795 high schools across the country.
With a K-12 strand in statistics, an AP Statistics course, and exciting introductory courses in colleges and universities, statistics education has truly come of age. The next step is to enhance undergraduate offerings in statistics so that more college and university students have opportunities to major in the subject or to at least strengthen their backgrounds in the subject for whatever their field of choice might be. The ASA is now planning a project that will address these issues.
Before addressing the future, it seems appropriate to look once more at how the current situation in teaching statistics is connected to the past. The modern instructional methods that emphasize simulation often use randomization procedures to introduce the notion of hypothesis testing. Some may think that is a relatively new idea. Well, that is the way R. A. Fisher thought about tests of significance 75 years ago.
It seems to have escaped recognition that the physical act of randomization, which is necessary for the validity of any test of significance, affords the means ... of examining the wider hypothesis in which no normality of distribution is implied. (In Box, 1978)
In other words, randomization procedures are the way to go, and they will work without any assumption of normality. The t-test is an approximation to randomization, not the other way around. If Fisher had had a workstation, the history of statistics would have been much different. The more statistics changes to a modern approach that emphasizes data, the more it seems to agree with the old masters' original thoughts on the subject.
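Fisher's idea is simple enough to show in a few lines. The sketch below (an illustration added here, with hypothetical treatment-yield numbers, not data from the article) runs a randomization test for a difference in two group means: shuffle the group labels many times and count how often the shuffled difference is as extreme as the observed one. No normality assumption appears anywhere.

```python
# Randomization (permutation) test for the difference between two group
# means: the p-value is the fraction of label shuffles whose absolute
# difference in means is at least as large as the observed difference.
import random

def randomization_test(group_a, group_b, n_shuffles=10000, seed=0):
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_shuffles):
        rng.shuffle(pooled)            # randomly reassign the labels
        a, b = pooled[:n_a], pooled[n_a:]
        diff = abs(sum(a) / len(a) - sum(b) / len(b))
        if diff >= observed:
            extreme += 1
    return extreme / n_shuffles        # approximate p-value

# Hypothetical yields under two treatments (illustrative numbers only)
p = randomization_test([24, 31, 29, 35, 28], [18, 22, 25, 20, 21])
print(p)  # a small p-value suggests a real treatment difference
```

With technology this can run in a classroom in seconds, which is exactly why the randomization view Fisher described 75 years ago has become teachable.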
3. Charting the Future: Surmounting the Opportunities
From elementary school to graduate school, from customers to manufacturers, from sports fanatics to health food fanatics, almost everyone seems to be interested in statistics these days. As Pogo would say, "We seem to be confronted with insurmountable opportunity." In boating (an analogy Pogo would appreciate) prudent navigation requires charting a course. Sometimes, however, the channels seen on a map are not open for efficient service because they need to be cleared of debris, widened, or deepened. The same can be said for the channels to be used in charting the course of the future of statistics education.
Clearing Channels of Communication. "Sampling is guessing," says a prominent Senator. "Sampling is no science," says a prominent newspaper columnist. There are, to be sure, legitimate scientific reasons to criticize any particular sampling plan put forth to adjust the Census, but a blanket condemnation of sampling as an invalid scientific procedure shows that there is something fouling up communication channels. "Racial Discrimination and Blood Pressure" is the title of a research article purporting to show that high blood pressure in blacks is caused by discrimination (Satel, 1997). The only problem is that the data do not show this; the social agenda of the researcher got confounded with the science. Communication problem? The recent publicity on the uses of statistical procedures to improve quality of products and services in business and industry (TQM) might lead one to believe that almost all companies use these procedures. Research shows, however, that quality concepts and tools are used extensively by fewer than half of Fortune 500 companies (Lackritz, 1997). Something is not being communicated clearly. A chemist was heard to remark, "We have discovered neural networks and no longer need statistics." This scientist needs to communicate with someone about both neural networks and statistics.
What is the debris that is blocking the communication channels? Some of it emanates from uncommitted leaders with limited understanding of quantitative issues in business, industry, government (B/I/G) and education. Some comes from a public that is easily swayed by the most recent alarming anecdote. Some comes from a workforce surrounded by technology but, at the same time, afraid of technology and easily swayed by black-box magic. Teachers at all levels need to work on clearing communication channels, even though they are sometimes caught in the middle with little support from either educational leadership or the public. Communication on statistical issues must be improved, if not with this generation of leaders, then with the next, ... or the next.
Broadening Channels of Application. Improved communication among B/I/G and educational systems at both the school and college levels will require identification of the strengths that unite and the gaps that divide. The principal uniting strength is data -- its collection and use to solve real problems. Schoolteachers must be armed with examples that not only motivate students but also convince administrators and school boards that statistics is a valuable and necessary component of the curriculum. While more independent in their decisions about courses and content, colleges and universities need to adjust their offerings to capture the interest of high school students with some statistical experience and to prepare those students for their academic and career goals. B/I/G must help provide motivating and convincing examples of the uses of statistics and work with colleges on improving course content so that college graduates, in whatever field, have an understanding of statistical thinking. All of this communication must take place in a spirit of collegiality and cooperation.
One desirable outcome of the improved communication is to have students at all levels see statistics broadly. This broad view, which must be emphasized in all courses that deal with statistical issues, can be approached by viewing statistics in three inter-related components.
Statistics as number sense: Do I understand what the numbers mean? (seeing data as numbers in context, reading charts, graphs and tables, understanding numerical and graphical summaries of data, etc.)
Statistics as a way of understanding the world: Can I use existing data to help make decisions? (using census data, birth and death rates, disease rates, CPI, ratings, rankings, etc., to describe, decide and defend)
Statistics as organized problem solving: Can I design and carry out a study to answer specific questions? (pose problem, collect data according to a plan, analyze data, and draw conclusions from data)
Graduates of high school or college are expected to be able to read literature related to their personal life or job, understand what they read, and then use what they have learned to make decisions. (That may be what intelligence is all about.) Why should we expect less of them when the "literature" involves data?
Deepening Channels of Content. With statistics courses built around study design, data collection, and data analysis, and with the availability of appropriate technology, serious questions arise as to how content should change to enhance statistical thinking and understanding of concepts over rote use of standard procedures. Those issues are much too complex to fully address here, but a few suggestions will be offered.
Deepen the discussion of exploratory data analysis, using more of the power of revelation, residuals, re-expression, and resistance as recommended by the originators of this approach to data.
Deepen the exposure to study design, separating sample surveys (random sampling, stratification, and estimation of parameters) from experiments (random assignment, blocking, and tests of significant treatment differences).
Deepen the understanding of inferential procedures for both continuous and categorical variables, making use of randomization and resampling techniques.
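The resampling techniques mentioned in the last point are also within easy reach of an introductory course. The sketch below (an added illustration with made-up data, not an example from the article) uses the bootstrap: resample the observed data with replacement many times to approximate the sampling variability of a statistic, with no distributional formula required.

```python
# Bootstrap percentile interval: resample the data with replacement,
# recompute the statistic each time, and take the middle 95% of the
# resulting bootstrap distribution.
import random

def bootstrap_interval(data, statistic, n_resamples=5000, seed=0):
    rng = random.Random(seed)
    stats = []
    for _ in range(n_resamples):
        resample = [rng.choice(data) for _ in data]  # draw with replacement
        stats.append(statistic(resample))
    stats.sort()
    lo = stats[int(0.025 * n_resamples)]  # 2.5th percentile
    hi = stats[int(0.975 * n_resamples)]  # 97.5th percentile
    return lo, hi

# Hypothetical measurements (illustrative numbers only)
data = [12, 15, 9, 20, 14, 11, 17, 13, 16, 10]
lo, hi = bootstrap_interval(data, lambda xs: sum(xs) / len(xs))
print(lo, hi)  # a rough 95% interval for the mean
```

The same function works for the median, a trimmed mean, or any other statistic the class invents, which is precisely the conceptual payoff: inference becomes a visible computation rather than a formula to memorize.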
Should students be exposed to statistical techniques whose derivations or computations they cannot understand (or even see)? Some say "no" and use this as an argument against introducing transformations, logistic regression, smoothing and density estimation, and other modern topics in the introductory courses. With modern technology, which allows many numerical examples to be explored quickly, it is time to rethink this position. Perhaps we should tell students about statistics the way it is practiced, not the way it is stated in textbooks. Wouldn't that help communication up and down the line?
Statistics has its roots in many fields; there is strength in diversity.
Statistics was built on real measurement problems; utility is still its greatest asset.
Statistics is dynamic; that's exciting!!!