Characteristics and Performance of Students in an Online Section of Business Statistics

John Dutton
North Carolina State University

Marilyn Dutton
North Carolina Central University

Journal of Statistics Education Volume 13, Number 3 (2005), www.amstat.org/publications/jse/v13n3/dutton.html

Copyright © 2005 by John Dutton and Marilyn Dutton, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the authors and advance notification of the editor.


Key Words: Distance education; Internet course; Online education.

Abstract

We compare students in online and lecture sections of a business statistics class taught simultaneously by the same instructor using the same content, assignments, and exams in the fall of 2001. Student data are based on class grades, registration records, and two surveys. The surveys asked for information on preparedness, reasons for section choice, and evaluations of course experience and satisfaction. Using descriptive statistics, regression analysis, and standard hypothesis tests, we test for significant differences between the online and lecture sections with regard to performance and satisfaction with the course as well as motivation and preparedness for taking an online course. We report several differences, including better performance by online students.

1. Introduction

Distance education courses taught completely via the internet have become increasingly common in a wide variety of disciplines. These courses hold appeal for both existing and potential students. Like other forms of distance learning, they make the university more accessible to groups who because of distance, other commitments, or demographics might not otherwise be able or willing to attend university classes (Bengiamin, Johnson, Zidon, Moen, and Ludlow 1998). At the same time, when online class sections are established, universities often find that students who are currently enrolled on-campus are clamoring for inclusion (Burbules and Callister 2000). For these students, being able to control the pace and time of study often outweighs the benefits of face-to-face lectures, particularly when the classroom alternative is a large lecture hall with two hundred or more other students (Wallace and Mutooni 1997; Wilson and Whitelock 1998). Indeed, as one university administrator has commented, students in these large sections are already engaged in a form of distance education.

There has been a great deal of speculation about the motives and characteristics of the students who enroll in online classes, as well as about the success of internet courses as vehicles of instruction. Comparisons of online and traditional lecture formats generally indicate that on average, students perform at least as well in classes with an online component (Davis 1996; Dutton, Dutton, and Perry 2001, 2002; Russell 1999). Moreover, there is evidence that student ratings of various aspects of courses are also similar for online and lecture modes (Spooner, Jordan, Algazzine and Spooner 1999).

In previous work, we studied student performance and characteristics in two pairs of side-by-side sections of a computer programming class (Dutton, Dutton, and Perry 2001, 2002). An important question is whether the results from computer science extend to other disciplines, since some courses are better suited than others to the online format. For example, theater and studio art may not readily be adapted to an online version. One objective of this study is to evaluate whether business statistics can effectively be taught in an online format. In this paper we take advantage of a side-by-side arrangement in which one of the authors taught both an online and a lecture section of a business statistics class during the fall 2001 semester. This situation affords us an opportunity to see to what extent our results from computer science extend to statistics. We can address the question of whether students in a statistics course can do as well with an online format as with lectures. We can also compare the characteristics, motivations, and responses of students using the two course formats.

An important caveat of our previous studies was that because they involved computer science classes, the students were possibly more technically proficient and computer literate than the typical undergraduate and therefore better able to handle an online format. In business statistics, the students are likely to be less proficient and more representative of the typical university student in their technical and computer skills. One might argue that they are a particularly interesting group for a study like this: they are taking a course in the mathematically oriented field of statistics, but most are doing so as part of social studies curricula. It is notable that recent research shows survey respondents willing to take online classes were predominantly interested in business courses (Evans and Haase 2001); business students therefore may be a particularly important source of demand for online statistics courses. The results of this study therefore have implications for the use of online courses in both the statistics and business curricula.

With regard to course content, assignments, and exams, the lecture and online sections were almost identical except that students in the online section did not attend the regular lectures. Online students came to campus for an initial organization session and took the three intra-semester tests and the final exam together with the lecture students. Each test or exam was offered both during the regular class period and in an evening session, and students from the two sections had the option of taking the test at either time. The tests and final exam consisted entirely of multiple-choice questions and were graded with an optical scanning system. For security purposes, several equivalent versions of each test were used. The test versions were very similar and were designed to be of the same level of difficulty. They contained virtually the same questions. Some of the computational problems were varied by simply changing numerical parameters. In others the order of the possible answers was varied.

The near-equivalence of the two sections, traditional lecture and online, provides an excellent opportunity for a side-by-side comparison. We focus on three particular issues in this paper. First, we want to identify any systematic differences between the online and lecture students. To find such differences, we look at such student characteristics as GPA, course hours, and major. We also look at self-reported learning needs taken from the responses to surveys distributed at the beginning and end of the semester. Second, we compare the two sections in the degree to which students used the various resources of the class and their own evaluation of their experiences in the class. Finally, we look at the academic performance of students in the two sections. We are interested in whether there is a significant difference in performance between the online and traditional sections, and especially in whether the online students perform more poorly than those with the benefit of class meetings.

In this work, we employ comparative descriptive statistics, regression analysis, and standard hypothesis tests to identify significant differences and similarities between the online and lecture students.

2. The Data

The data for this study are taken from two class sections of BUS/ST 350, Economics and Business Statistics, which were taught at North Carolina State University in the 2001 fall semester. One section was a traditional lecture section with 159 students and the other was an online section with 48 students (both measured as of the end of the refund period, when a student could still be refunded tuition). Both sections were taught by the same instructor. A third large lecture section of the course was offered that semester by another instructor but is not included in this study.

Both sections in the study were listed in the regular course offerings of the university (some online courses appear in a separate listing specifically for distance education, but the online section of this course did not). The two sections cost the same, and students signed up for both through the same registration system. In the registration materials, the online section looked much like any traditional on-campus section, except that no regular meetings were listed.

The online section had a cap of about 50 and the lecture section had a cap of 235. In both cases most of the seats were reserved for College of Management students. However, the instructor informally loosened the restriction for the online section (in order to ensure that the sample size in that section would not be too small). Possibly as a result of this loosening, a somewhat smaller fraction of the students in the online section were College of Management undergraduates (59% as opposed to 70% in the lecture section). All students in both sections selected those sections on their own; none were recruited.

BUS/ST 350 is a standard introductory business statistics course required for majors in business, accounting, and economics, as well as for a few other majors on campus. A few students take the course as preparation for entering MBA programs. The course is cross-listed as both a Business Management and a Statistics offering. Topics include measures of central tendency and variation, the binomial distribution, confidence intervals and inference for means and proportions and for differences of means and proportions, correlation, simple and multiple regression, qualitative variables, and chi-squared tests for multinomial distributions and contingency tables.

The on-campus lecture section met in two 75-minute lecture sessions, plus an optional weekly hour-long lab held in the College of Management computing facilities. The lectures met in a large hall and involved little individual attention or interaction. The lab meetings were conducted in groups of 30 or fewer. These labs were led by a graduate student who served primarily as an immediately available consultant. Students in the online section did not formally enroll in a lab but were offered the option of attending one. The lectures were open to them as well, but lecture attendance by the online students appeared to be very occasional. The instructor was able to monitor attendance to some degree by taking up short collaborative assignments completed in class. There was little indication that online students were attending class. (Students seemed to take their placement in one section or the other seriously; one student who chose to switch from the lecture to the online section early in the semester went to the trouble of formally changing, rather than just ceasing attendance in the lecture class.)

Grades in the two classes were based on the same set of components, which included short daily assignments.

In the lecture section the short daily assignments consisted of short question sets completed in class by groups of 2 to 4 students. These same questions were made available on the web for completion individually by the online students.

Online courses have been offered at N.C. State over the past several years in increasing numbers. Some of the offerings have been in the business area. The instructor of the two BUS/ST 350 sections under study had been providing substantial web-based materials to lecture sections for several years; however, the fall of 2001 was the first completely online offering of the course.

The course website (used for both sections) consisted of the following major elements:

  1. General Information in a syllabus, including contact information, course policies, etc.
  2. A bulletin with periodic announcements by the instructor.
  3. A set of 40 lessons covering the material of the course.
  4. PowerPoint slides that formed the basis of presentations in the lecture section.
  5. Lab pages consisting of detailed instructions on performing statistical computations, Excel operations, etc.
  6. A schedule page outlining the topic coverage for each class day, as well as due dates for lab and homework assignments. Though they had no class meetings, the online students followed the same timing of assignments as the lecture students.
  7. Study aids consisting of old tests, an FAQ page, and project assignment pages.
  8. Access to WebAssign©, a homework and testing facility developed at North Carolina State University. WebAssign was used extensively to deliver assignments to the students. It provides automatic and immediate grading, multiple submission opportunities, algorithmically generated questions, etc., and is one of the most interactive elements of the course. Both sections used WebAssign to submit their work.

The software used for statistical calculations in the course was Excel. Students were assumed to be familiar with Excel, though some time was devoted to training them in the spreadsheet functions specific to statistics. Other than using WebAssign and other web features, Excel was the only software required for use in the course.

At the beginning of the semester we administered a simple survey instrument to as many enrolled students as possible in the two class sections. The questionnaire (presented in the Appendix) collected information on work and child care responsibilities, commuting distance and prior computer experience, as well as attitudes toward various aspects of the course. In addition, we were able to obtain information from the university registrar on the gender, age, university program, current semester hours enrolled, and total GPA of each student. At the end of the semester we collected information in a follow-up survey (also in the Appendix) asking for information about students’ experiences with the course. The two survey instruments were each provided both in a paper format and in an online version on the website. Students had the option of completing either form of the survey. In addition to the surveys and information from Registration and Records, the other main source of data was the instructor’s records of student performance in the course.

As of August 31, the end of the period when a full tuition refund was possible, the two sections of the class together contained 207 students, 159 in the lecture section and 48 in the online section. We chose to keep in our study only those students committed enough to remain enrolled past that period. Unfortunately, our cutoff was not as clean as we intended. We missed one student who dropped on September 2 and would have been included under our cutoff rule. By the time we realized the error, the semester was over and we were unable to add that student as a data point. (It would be useful in future work to look more closely at those who dropped earlier and why they did so.)


Table 1: Course completion by section

Section    Finished course    Dropped by Oct 1    Dropped after Oct 1    Totals
Lecture          137                  6                   16               159
Online            41                  6                    1                48
Total            178                 12                   17               207


Of the 207 students still enrolled at the end of the refund period, 178 completed the course (took the final examination), and 29 did not. Of the 29, 12 dropped during the drop period ending October 1, when students could drop without approval or penalty; the other 17 simply failed to take the final exam. These 207 students were the basis of our study. Table 1 provides a breakdown.
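The per-section completion rates implied by Table 1 can be verified directly from the counts. A small Python sketch (our own check, not part of the original study):

```python
# Students enrolled at the end of the refund period (159 lecture, 48 online)
# and those who went on to take the final exam, as reported in Table 1.
enrolled = {"lecture": 159, "online": 48}
finished = {"lecture": 137, "online": 41}

rates = {s: finished[s] / enrolled[s] for s in enrolled}
for section, rate in rates.items():
    print(f"{section}: {rate:.1%} of enrollees completed the course")
```

The two sections completed the course at nearly identical rates (roughly 86% for lecture and 85% for online), so overall attrition alone does not distinguish the groups.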

Of the 207 who started the course, 178 completed a usable version of the first survey, for a completion rate of 86%. Completion rates were 84% in the lecture section and 94% in the online section. There were 152 completed usable versions of the second survey, for a rate of 73%, with 75% completion for lecture and 69% for online. Note that both rates are based on the 207 students still enrolled at the end of the refund period. There was some attrition during the course, so completion rates for the second survey (still based on the original 207) were naturally lower. Students were granted one point of extra credit toward one of the tests as an incentive to complete each survey.

3. Student Characteristics

In this section, we examine the data to identify differences between the characteristics of students who chose the online section and those who opted for the lecture section. We had two data sources for this. From Registration and Records we were able to get information such as enrolled hours, cumulative GPA prior to the beginning of the semester, age, gender, college class and major curriculum. The responses to the first survey provided additional information regarding computer experience, typical mode of internet access, work and child care commitments, and commuting distance as well as students’ preferences for learning and study styles. Table 2 summarizes means and proportions of some series from the registration data and Survey 1. Figures 1 through 4 provide data on several other variables.


Table 2: Descriptive statistics, various characteristics.

Entries are mean or percentage (st. dev.), with number of observations in brackets; the difference is online minus lecture.

                              Lecture              Online               Total                Difference   Test stat.   p-value
Semester hours                15.0 (2.2) [137]     14.1 (4.3) [41]      14.8 (2.8) [178]       -0.89      t = 1.29     0.2032
% WebAssign experience        86.1% (3.1%) [122]   82.9% (5.9%) [41]    85.3% (2.8%) [163]     -3.1%      z = -0.49    0.6236
% commuting                   18.0% (3.5%) [122]   26.8% (6.9%) [41]    20.2% (3.1%) [163]      8.8%      z = 0.15     0.8770
% working                     83.6% (3.4%) [122]   85.4% (5.5%) [41]    84.0% (2.9%) [163]      1.8%      z = 0.27     0.7901
Hours of work                 16.9 (8.5) [101]     22.4 (13.2) [34]     18.3 (10.1) [135]       5.5       t = 2.27     0.0281
% female                      36.5% (4.1%) [137]   58.5% (7.7%) [41]    41.6% (3.7%) [178]     22.0%      z = 2.51     0.0120
GPA                           2.92 (0.48) [126]    3.17 (0.51) [39]     2.98 (0.50) [165]       0.25      t = 2.93     0.0039
Age                           21.9 (2.6) [137]     23.6 (5.1) [41]      22.3 (3.4) [178]        1.8       t = 2.13     0.0387
% previous distance educ.     19.7% (3.6%) [122]   56.1% (7.8%) [41]    28.8% (3.5%) [163]     36.4%      z = 4.45     0.0000


In many respects, the sections were quite similar and appeared to draw students from the same general university population. Both were composed of primarily full-time students with similar majors and computer experience who lived within ten miles of campus. The average semester hours of enrollment were 14.1 for online and 15.0 for lecture students (with the difference not statistically significant). Based on student self-evaluations with regard to computer experience, the two sections appeared to be nearly identical (Figure 1). In the lecture section 42% of the students had had computer experience beyond word processing and spreadsheet applications while 46% of the online section reported having similar experience. Only 4% of the online section compared with 1% of the lecture section reported extensive programming experience. The p-value for a chi-square test of difference in skill distributions is 0.48, indicating no significant difference between the two sections. Another question in the first survey asked about experience with WebAssign, the homework/testing software described above. The two sections were almost identical in that area as well; in both sections more than 80% of the students had used WebAssign before.
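The two-sample tests in Table 2 can be reproduced from the summary statistics alone. A minimal Python sketch (our reconstruction, not the authors' original code) of a two-sample t test for semester hours and a pooled two-proportion z test for the percentage of women:

```python
import math

# Two-sample t test from summary statistics (unpooled/Welch form):
# semester hours, lecture vs. online, from Table 2.
m1, s1, n1 = 15.0, 2.2, 137   # lecture mean, st. dev., n
m2, s2, n2 = 14.1, 4.3, 41    # online mean, st. dev., n
se = math.sqrt(s1**2 / n1 + s2**2 / n2)
t = (m1 - m2) / se            # about 1.29, matching the reported statistic

# Pooled two-proportion z test: % female, lecture vs. online.
p1, p2 = 0.365, 0.585
pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
z = (p2 - p1) / math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
# about 2.51; two-sided p-value via the normal CDF (erf):
p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))   # about 0.012
```

Whether the authors pooled variances for the t test is not stated in the paper, but the unpooled form shown here reproduces the reported value of 1.29, and the pooled z statistic matches the reported 2.51.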


Figure 1. Computer Experience Distributions.


There was also considerable similarity in the majors represented (Figure 2). Students in Business (58%) and Accounting (8%) comprised the majority of the students in both sections. Textiles Management was the third most heavily represented major (6%). The percentages were largely similar across the two sections, and a chi-square test revealed no significant difference in the distributions. The slightly higher proportion of Business majors in the lecture section (61% versus 50%) was most likely the result of a College of Management registration policy that gave College of Management students priority in enrollment for the business statistics class. After the lecture section closed, the professor allowed a few additional students into the online section; often those were students from other disciplines. While enrollment policies probably had some influence on who went into which type of section, students themselves had by far the largest say in that decision.


Figure 2. Distribution of Majors.


The two sections were also fairly similar in how far students needed to commute to campus, in commitment to outside work, and in location of computer access. As Table 2 indicates, the percentage of students with a long commute (more than ten miles) was somewhat higher for online than for lecture students (27% versus 18%), but the difference was not statistically significant. Table 2 also shows that almost identical percentages in each section were employed at work outside class. In Figure 3 it is apparent that sources of computer access were also similar for online and lecture students. Online students were somewhat more prone to obtain access at home or work and lecture students were more likely to use dormitory or campus lab access, but the differences were not statistically significant.


Figure 3. Percentages, Computer Access Time.


Although in general the two sections appeared to be drawn from the same student population, they differed substantially in several important ways. These differences indicate a large degree of self-selection among the students. The fact that women were much more heavily represented in the online section was somewhat surprising, and we have no ready explanation for it. Other differences, however, indicate that the two sections appealed to students with disparate academic and scheduling needs. For example, the online section appeared to attract students who were better able to handle the demands of the distance education format. The online students had a higher cumulative GPA at the time of entering the course (0.25 points higher on a 4-point scale), a difference that was statistically significant. In addition, the online section had a larger proportion of mature students (in terms of class standing and age) than the lecture section. The fraction of seniors in the online section was more than twice as large as in the lecture section (Figure 4), and a chi-squared test indicates the difference in class distributions is statistically significant. The online students were also almost two years older on average (a statistically significant difference). These differences fit the usual description of students in distance education. Finally, a significantly higher proportion of online students reported experience with a previous online course (56.1% versus 19.7%).


Figure 4. Class Distributions.


It also appears that the two sections appealed to students with different requirements for flexibility and learning styles. The need for scheduling flexibility shows up in the responses to a number of questions on the first survey. For example, online students tended to work more hours in outside jobs than lecture students.

Further evidence of differences between the two sections is contained in Table 3, which summarizes the responses to Likert scale questions relating to scheduling needs and learning styles. For each item the table contains the question, the relative distributions of responses for the two groups, the number of usable responses for each group, and results of a chi-square test of independence between the two distributions. For each question we conclude that response distributions were significantly different for online and lecture students if the p-value is small (generally we use the standard cutoff of .05).
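The chi-square tests of independence operate on the underlying response counts, which can be back-calculated (approximately) from the percentages and sample sizes reported in Table 3. A Python sketch for question 1, using counts we reconstructed ourselves since the raw counts are not published:

```python
import math

# Chi-square test of independence between section (lecture/online) and
# Likert response for question 1 of Table 3. Counts are back-calculated
# from the reported percentages and sample sizes (121 and 41) and are
# therefore approximate.
lecture = [21, 38, 29, 25, 8]   # SD, D, N, A, SA
online  = [4, 10, 7, 11, 9]

rows = [lecture, online]
row_totals = [sum(r) for r in rows]
col_totals = [sum(c) for c in zip(*rows)]
grand = sum(row_totals)

chi2 = sum(
    (obs - exp) ** 2 / exp
    for r, rt in zip(rows, row_totals)
    for obs, ct in zip(r, col_totals)
    for exp in [rt * ct / grand]    # expected count under independence
)
df = (len(rows) - 1) * (len(col_totals) - 1)   # (2-1) * (5-1) = 4

# For df = 4 the chi-square survival function has a closed form.
p_value = math.exp(-chi2 / 2) * (1 + chi2 / 2)
# chi2 is about 9.70 with p about 0.046, matching the reported 0.0458
```

The reconstructed counts reproduce the published p-value for this question, which suggests the back-calculation is faithful; in general, one would run such tests on the raw counts.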


Table 3: Survey 1 Responses concerning student needs, with P-values for tests of independence between section type and responses.

                Strongly                                   Strongly   Sample
                Disagree   Disagree   Neutral    Agree     Agree      Size    P-value

1. Conflict between class time and work commitments is a significant problem for me.
   Lecture        17.4%      31.4%     24.0%     20.7%       6.6%      121
   Online          9.8%      24.4%     17.1%     26.8%      22.0%       41    0.0458

2. Work commitments strongly influenced my choice between lecture and online sections.
   Lecture        23.3%      32.5%     31.7%     10.8%       1.7%      120
   Online         12.2%      26.8%     17.1%     19.5%      24.4%       41    0.0000

3. A course scheduling conflict strongly influenced my choice between lecture and online sections.
   Lecture        17.7%      30.3%     35.3%     10.9%       5.9%      119
   Online          9.8%      19.5%      7.3%     26.8%      36.6%       41    0.0000

4. I find commuting to class a hardship.
   Lecture        20.8%      33.3%     15.0%     23.3%       7.5%      120
   Online         17.1%      26.8%     26.8%     22.0%       7.3%       41    0.5534

5. Face-to-face contact with the instructors of my classes is very important to me.
   Lecture         0.0%       4.1%     15.7%     49.6%      30.6%      121
   Online          0.0%      14.6%     48.8%     34.2%       2.4%       41    0.0000

6. Face-to-face contact with fellow students in my classes is very important to me.
   Lecture         0.8%       7.4%     31.4%     39.7%      20.7%      121
   Online          4.9%      43.9%     29.3%     17.1%       4.9%       41    0.0000

7. Attending regular class meetings is an important motivator for me.
   Lecture         1.7%       5.0%     12.4%     49.6%      31.4%      121
   Online          4.9%      43.9%     31.7%     19.5%       0.0%       41    0.0000

8. I learn better when I have face-to-face contact with my fellow students.
   Lecture         2.5%      11.6%     32.2%     37.2%      16.5%      121
   Online         12.2%      41.5%     36.6%      4.9%       4.9%       41    0.0000

9. If I need help, I'm more likely to email my instructor than to meet with him/her face-to-face.
   Lecture        15.7%      17.4%     18.2%     33.9%      14.9%      121
   Online          2.4%      14.6%     12.2%     41.5%      29.3%       41    0.0593

10. Flexibility in setting pace and time for studying is important to me.
   Lecture         0.8%       4.1%     18.2%     53.7%      23.1%      121
   Online          0.0%       2.4%      4.9%     56.1%      36.6%       41    0.1919

11. I learn better from hearing a lecture.
   Lecture         0.8%       6.6%     20.7%     45.5%      26.5%      121
   Online          5.0%      20.0%     50.0%     22.5%       2.5%       40    0.0000

12. I learn better from reading lecture materials.
   Lecture         2.5%      17.4%     30.6%     35.5%      14.1%      121
   Online          2.4%      12.2%     43.9%     26.8%      14.6%       41    0.5895


Several questions related to the effect of scheduling problems on choice of class section. Questions 1 through 3 indicate that conflicts between class and work and conflicts between classes were significantly more important for online students. For example, only 13% of lecture students responded “agree” or “strongly agree” to the statement “Work commitments strongly influenced my choice between lecture and online sections”; the comparable number for online students was 44%. The results concerning conflicts between classes are particularly notable and may reflect the fact that numbers of available class sections have declined recently because of budgetary stringency. For the statement, “A course scheduling conflict strongly influenced my choice between lecture and online sections,” only 17% of lecture students but 63% of online students agreed or strongly agreed. Not shown in the table are results for child care commitments. The numbers indicating that child care was important were small. (10% of online versus 4% of lecture students agreed or strongly agreed that “Conflict between class time and child care commitments is a significant problem for me.”) However, those numbers do tend to back up the result that conflicts were more important for online students and had a greater effect on their choice of section.

The last question related to scheduling difficulty in Table 3 is question 4, on commuting difficulty. (31% of lecture and 29% of online students agreed or strongly agreed that commuting was a hardship.) The fact that student responses concerning commuting difficulty did not significantly differ between the two sections probably stems from the similarity between the two sections regarding length of commute to campus. Most students in both sections lived within ten miles of campus.

In addition to scheduling issues, the data in Table 3 indicate that the two sections attracted students with different learning styles. The importance that students placed on face-to-face contact and on the opportunity to attend a lecture differed substantially across the two sections. Questions 5 and 6 in the table deal with face-to-face contact with the instructor and with other students. In both cases students in the online section placed significantly less importance on those contacts. (Among lecture students, 80% found face-to-face contact with instructors to be very important and 60% found face-to-face contact with fellow students to be very important. The corresponding numbers for online students were only 37% and 22%.)

Questions 7 and 8 back up the results of 5 and 6. (For example, 81% of lecture versus 20% of online students agreed or strongly agreed that class meetings are an important motivator.) Question 9 indicates that online students are more likely to use email, though in this case the difference is only marginally significant. (71% of online versus 49% of lecture students agree or strongly agree that, “If I need help, I’m more likely to email my instructor than to meet with him/her face-to-face.”) In response to question 10, on flexibility in pace and time of study, online students report a somewhat higher level of agreement that flexibility is important (but the difference between sections is not significant). These differences are underscored by the responses to Questions 11 and 12 concerning learning style. Lecture students are significantly more likely to report learning better from hearing a lecture (72% versus 25%). There is no significant difference in reported learning from reading lecture materials.

Responses to the end of semester survey confirm many of the differences in preferences that were apparent in the responses to Survey 1. Some of these are summarized in Table 4. Students in the lecture section were significantly more likely to agree with the statement: “I needed lectures in class meeting to really understand the material of the course,” (82% of lecture students agreed or strongly agreed versus 6% of online students). Consistent with this attitude, lecture students were significantly less likely to find the Website materials sufficient for learning the course. However, while lecture students reported significantly more difficulty balancing the demands of this course with their other commitments, the two groups were similar in feeling that they had ultimately been able to keep up with the demands of the course.


Table 4: Survey 2 Responses concerning student needs with P-values for tests of independence between section type and responses.

                Strongly                                   Strongly   Sample
                Disagree   Disagree   Neutral    Agree     Agree      Size    P-value

I needed lectures in class meeting to really understand the material of the course.
   Lecture         2.5%       5.1%     10.2%     53.4%      28.8%      118
   Online         12.5%      46.9%     34.4%      6.3%       0.0%       32    0.0000

I was able to learn what I needed for the course from the materials on the website.
   Lecture         4.2%      11.0%     26.3%     47.5%      11.0%      118
   Online          0.0%       6.3%      3.1%     56.3%      34.4%       32    0.0028

I found it difficult to balance the demands of the course with my other commitments.
   Lecture         1.7%      26.3%     28.8%     37.3%       5.9%      118
   Online         12.5%      31.3%     25.0%     21.9%       9.4%       32    0.0419

I was able to avoid getting behind in the course during the semester.
   Lecture         1.7%      18.6%     16.9%     50.8%      11.9%      118
   Online          6.3%       9.4%     25.0%     40.6%      18.8%       32    0.2340


4. Student Uses of Course Resources

Next we looked at the degree to which the two groups made use of the resources available to them in the class. Some of these resources were accessible only online. For example, all students were given online access to a record of their grades as well as the course bulletin, lectures and PowerPoint slides that were posted on the website. In addition, students were free to email the instructor or the Teaching Assistants (TA’s) with questions regarding the class, tests and assignments. Other resources were accessible only on campus. Students were assigned to but not required to attend weekly lab sessions in which TA’s provided help with class problems and projects. TA’s also conducted special review sessions prior to each test and were available by appointment for special sessions with students. In addition, students could visit the instructor during office hours and by appointment. It should be emphasized that students in both sections had equal access to all resources and could effectively switch sections. That is, online students could attend lectures if they so desired and lecture students were not required to attend class and had full access to all online resources. We were interested to see how similar their use of these resources was.

In some regards, the two groups of students were quite similar in the way they utilized the course resources. Students in general are very interested in their grades, and most of these students frequently checked the record of their grades online; the proportion was greater than 90% for both groups (see Table 5). The instructor had posted the PowerPoint slides from the lectures on the website and approximately 40% of both groups of students reported using this resource frequently.
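The P-values in Table 5 and the tables that follow come from chi-square tests of independence between section type and survey response. As a sketch of how such a statistic is computed (the counts below are hypothetical for illustration; the paper does not publish its underlying frequency tables):

```python
# Chi-square statistic for a test of independence between section type
# (rows) and survey response (columns). Counts below are made up.
observed = [
    [5, 62, 47],   # Lecture: Disagree, Neutral, Agree (hypothetical counts)
    [2, 11, 19],   # Online
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Expected count under independence: (row total * column total) / grand total
chi_sq = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi_sq += (obs - expected) ** 2 / expected

df = (len(observed) - 1) * (len(observed[0]) - 1)
print(f"chi-square = {chi_sq:.3f} on {df} degrees of freedom")
```

The statistic is then compared against a chi-square critical value (5.991 for df = 2 at the 5% level); a library routine such as scipy.stats.chi2_contingency performs the same computation and also returns the exact P-value.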


Table 5: Survey 2 responses concerning student use of resources, with P-values for tests of independence between section type and responses.

Likert scale: SD = Strongly Disagree, D = Disagree, N = Neutral, A = Agree, SA = Strongly Agree

During the semester I referred frequently to the record of my grades accessible through the website.
    Lecture: SD 0.8%,   D 3.4%,   N 3.4%,   A 52.5%,  SA 39.8%   (n = 118)
    Online:  SD 0.0%,   D 0.0%,   N 6.3%,   A 34.4%,  SA 59.4%   (n = 32)    P value = 0.3198

During the semester I referred frequently to the PowerPoint slides on the website.
    Lecture: SD 11.0%,  D 31.4%,  N 21.2%,  A 25.4%,  SA 11.0%   (n = 118)
    Online:  SD 16.1%,  D 19.4%,  N 25.8%,  A 22.6%,  SA 16.1%   (n = 31)    P value = 0.6261

During the semester I referred frequently to the series of 40 Lessons on the website.
    Lecture: SD 10.3%,  D 35.0%,  N 22.2%,  A 23.1%,  SA 9.4%    (n = 117)
    Online:  SD 0.0%,   D 0.0%,   N 3.1%,   A 18.8%,  SA 78.1%   (n = 32)    P value = 0.0000

I regularly read the bulletins on the course website.
    Lecture: SD 0.9%,   D 11.1%,  N 18.8%,  A 51.3%,  SA 17.9%   (n = 117)
    Online:  SD 0.0%,   D 0.0%,   N 9.7%,   A 25.8%,  SA 64.5%   (n = 31)    P value = 0.0001


In many other regards, though, we found a large degree of segregation in the ways the students used the course resources. Online students made greater use of resources that might be considered substitutes for information otherwise provided in the on-campus lecture, while the lecture students took greater advantage of the on-campus supplemental assistance. For example, the instructor had made available a set of 40 lessons that explained the PowerPoint slides. The online students referred to these lessons significantly more than the lecture students did. Although the course bulletin was used by both groups, online students reported using it significantly more.

In keeping with their greater tendency to rely on electronic communication via the internet, online students were also significantly more likely to communicate with the instructor via email (see Table 6). There was no significant difference between the two sections in reported visits to instructor office hours.


Table 6: Survey 2 responses concerning contact with the instructor and TA’s.

About how many times during the semester did you send emails to the instructor?
                                            Lecture Section   Online Section
    Never                                   43.4%             6.6%
    Between 1 and 4 times                   52.2%             75.0%
    Between 5 and 10 times                  2.6%              12.5%
    More than 10 times                      1.7%              6.3%
    P-value for chi-square test of independence between section and response = 0.0003

How often did you come to the instructor's office hours?
                                            Lecture Section   Online Section
    Never                                   71.2%             62.5%
    Once or twice during the semester       22.9%             28.1%
    Every week or two                       3.4%              6.3%
    Once a week or more                     2.5%              3.1%
    P-value for chi-square test of independence between section and response = 0.7748

How often did you attend the weekly lab sessions in the basement of Nelson Hall?
                                            Lecture Section   Online Section
    Never                                   49.2%             96.8%
    About a quarter of the time             29.7%             3.2%
    About half the time                     5.9%              0.0%
    About three quarters of the time        6.8%              0.0%
    All the time or almost all the time     8.5%              0.0%
    P-value for chi-square test of independence between section and response = 0.0013

How many of the test review sessions conducted by the TA's did you attend?
    Mean                                    2.15              1.41
    P-value for 2-tailed t test for difference in means (variances equal) = 0.0050

On average, how often did you consult the TA's for help outside of the lab or review sessions?
                                            Lecture Section   Online Section
    Never                                   44.9%             81.3%
    Once or twice during the semester       43.2%             18.8%
    Every week or two                       7.6%              0.0%
    Once a week or more                     4.2%              0.0%
    P-value for chi-square test of independence between section and response = 0.0113


Table 6 also shows that online students were less likely to take advantage of the assistance provided on campus through resources such as the weekly lab sessions and test review sessions. Only about 3% of the online students ever attended the weekly lab sessions, and none went to more than a quarter of the sessions. By comparison, more than half of the lecture students reported attending the labs, and 15% went to more than half of the weekly sessions. Online students were also significantly less likely to attend the test review sessions given by the TA’s the evening before each test. In addition, most online students (81%) had no contact with the TA’s outside of the labs and review sessions, whereas more than half of the lecture students reported consulting the TA’s for extra help. This difference in contact with the TA’s was statistically significant.

These results on use of course resources seem to indicate that, while they may have had similar access to on-campus resources, the course experience was quite different for online and lecture students. For the online students the course was truly distance education, while lecture students made far greater use of the face-to-face resources on campus.
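The comparison of mean review-session attendance in Table 6 uses a pooled (equal-variance) two-sample t test. A minimal sketch of that statistic, computed here on small made-up samples rather than the actual attendance data:

```python
import math

def pooled_t(sample_a, sample_b):
    """Two-sample t statistic assuming equal variances (as in Table 6)."""
    na, nb = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / na
    mean_b = sum(sample_b) / nb
    # Pooled variance combines the two sums of squares, weighted by df.
    ss_a = sum((x - mean_a) ** 2 for x in sample_a)
    ss_b = sum((x - mean_b) ** 2 for x in sample_b)
    pooled_var = (ss_a + ss_b) / (na + nb - 2)
    se = math.sqrt(pooled_var * (1 / na + 1 / nb))
    return (mean_a - mean_b) / se

# Hypothetical review-session counts for a few lecture and online students
lecture = [3, 2, 4, 1, 2, 3]
online = [1, 2, 0, 1]
t = pooled_t(lecture, online)
print(f"t = {t:.3f} on {len(lecture) + len(online) - 2} degrees of freedom")
```

The resulting t value is referred to a t distribution with na + nb − 2 degrees of freedom to obtain the two-tailed P-value reported in the table.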

Although the online students clearly relied more heavily on the electronic medium, the two groups appeared to be fairly similar in their attitude toward the electronic features of the course (Table 7). The two groups reported similar experiences in finding things on the course website. Both groups were positive about WebAssign. More than 80% of both groups preferred to submit homework electronically rather than on paper, and the proportions were similar. Students in both groups overwhelmingly agreed that WebAssign helped in learning the material, both because it allowed students to resubmit answers and because they could get immediate feedback on whether answers were correct. The two groups also were very close in their evaluation of the number of homework and lab assignments, with 56% agreeing or strongly agreeing that the number was about right for the material covered.


Table 7: Survey 2 responses concerning electronic features of the course, with P-values for tests of independence between section type and responses.

Likert scale: SD = Strongly Disagree, D = Disagree, N = Neutral, A = Agree, SA = Strongly Agree

It is easy to find things in the course website.
    Lecture: SD 0.8%,   D 5.1%,   N 5.1%,   A 55.1%,  SA 33.9%   (n = 118)
    Online:  SD 0.0%,   D 0.0%,   N 6.5%,   A 42.2%,  SA 48.4%   (n = 31)    P value = 0.6771

For homework assignments I prefer using WebAssign rather than doing assignments to turn in on paper for grading.
    Lecture: SD 4.2%,   D 8.5%,   N 6.8%,   A 41.5%,  SA 39.0%   (n = 118)
    Online:  SD 3.1%,   D 3.1%,   N 3.1%,   A 31.3%,  SA 59.4%   (n = 32)    P value = 0.3181

The ability to resubmit my answers several times on WebAssign helped me learn more from the assignments.
    Lecture: SD 0.8%,   D 5.1%,   N 4.2%,   A 37.3%,  SA 52.5%   (n = 118)
    Online:  SD 0.0%,   D 0.0%,   N 6.3%,   A 31.3%,  SA 62.5%   (n = 32)    P value = 0.8554

Knowing immediately whether my answers are correct or not on WebAssign helped me learn more from the assignments.
    Lecture: SD 1.7%,   D 3.4%,   N 4.2%,   A 39.8%,  SA 50.8%   (n = 118)
    Online:  SD 0.0%,   D 3.1%,   N 0.0%,   A 25.0%,  SA 71.9%   (n = 32)    P value = 0.4153

The number of homework and lab assignments was about right for the material we covered.
    Lecture: SD 2.6%,   D 21.4%,  N 19.7%,  A 42.7%,  SA 13.7%   (n = 118)
    Online:  SD 6.3%,   D 15.6%,  N 21.9%,  A 40.6%,  SA 15.6%   (n = 32)    P value = 0.8116


5. Assessment of Student Performance

The last comparison we made was in the area of academic performance. In our previous work with side-by-side comparisons, online students performed somewhat better on the final examination, and we expected to observe a similar pattern here. In this study the average exam score was 82.7 for the online students versus 74.5 for the lecture students, a statistically significant difference. We also computed a broader measure of performance consisting of a weighted average of all tests and projects. The online section performed better on that measure as well, with an average grade of 87.6 versus 81.1, a 6.5 point difference that is also statistically significant.

These simple averages are revealing and seem to show that the online mode of course delivery has a large effect on academic performance. However, we saw previously that there were several differences in the composition of the two sections, and other factors may well have influenced relative performance. To control for these factors, we used linear regression analysis, a technique that allowed us to estimate the simultaneous effects of several explanatory variables on academic performance. Table 8 and Table 9 present results of two sets of regressions, one set using the exam grade as the dependent variable and the other using the broader course average. The regressions are based on the students who completed the course and for whom we have data on all the included variables. The number of students included in each regression (indicated by “n”) varies because in some cases we did not have complete responses to survey questions.

In each of the regressions, we included a qualitative variable for online status that took on a value of 1 for online students and 0 for lecture students. We also included five additional explanatory variables. Two of these, GPA and HWLab, were quantitative and were used to control for differences in the caliber of students in the two sections. GPA was taken from university records for courses prior to this one and served as an indicator of a student’s overall academic performance; we included it to control for the fact that students in the online class tended to have higher GPA’s. HWLab is the average of homework and lab assignment grades and was used as a control for the effort devoted to completing class assignments (since students could correct and resubmit homework multiple times, a student who was either well prepared or persistent could get 100 percent of the homework correct). The other three variables, Commute, Computer Experience and Female, were all qualitative. We expected students with a long commute to have a harder time meeting the demands of the class, while those with extensive programming experience would be better prepared to handle the technical requirements of the statistics class. We included Female as an explanatory variable because women sometimes perform less well in quantitative subjects (Huang, Taddese, Walter, and Peng 2000). Each of these variables was assigned a value of one if the corresponding condition held (living more than 10 miles from campus, extensive programming experience, being female) and zero otherwise. All of these additional variables are intended to control for individual characteristics that might affect the overall academic performance of a given section of the statistics class.

Table 8 presents the regression results using the final exam grade as the dependent variable. The reported coefficients indicate the impact of each variable on the final exam grade. The first regression, which included only “Online” as an explanatory variable, simply reproduces the difference in mean exam scores, 8.26 points. The second regression included the online status variable as well as the other explanatory variables. Once we controlled for their effects, the difference in exam scores shrank, though it remained significant at α = .05. This implies that the difference in exam scores is attributable in part to factors other than online status.
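The first regression's equivalence to a difference in means is easy to verify: with a single 0/1 regressor, the OLS intercept equals the mean of the group coded 0 and the slope equals the difference in group means. A quick illustration with invented scores (not the actual grade data):

```python
# With a single 0/1 regressor, OLS reduces to group means:
# intercept = mean of the 0-coded group, slope = difference in means.
# Scores below are invented for illustration.
scores = [70, 78, 74, 80, 85, 88, 81]
online = [0, 0, 0, 0, 1, 1, 1]  # 1 = online section, 0 = lecture

lecture_mean = sum(s for s, d in zip(scores, online) if d == 0) / online.count(0)
online_mean = sum(s for s, d in zip(scores, online) if d == 1) / online.count(1)

intercept = lecture_mean             # OLS intercept
slope = online_mean - lecture_mean   # OLS coefficient on the Online dummy
print(f"intercept = {intercept:.2f}, Online coefficient = {slope:.2f}")
```

In the paper's first regression the same logic applies: the intercept (74.47) is the lecture-section mean exam score and the Online coefficient (8.26) is the online-minus-lecture gap.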


Table 8: OLS regression coefficients for exam grades (p-values shown beneath each coefficient).

Variable        Regression 1   Regression 2
Intercept          74.4721        12.3798
  p-value           0.0000         0.0596
Online              8.2615         4.8101
  p-value           0.0032         0.0371
GPA                               12.9755
  p-value                          0.0000
HWLab                              0.3147
  p-value                          0.0000
Commute                           -7.2767
  p-value                          0.0043
Comp Exper                         2.7137
  p-value                          0.2485
Female                            -3.2484
  p-value                          0.1134
R Squared           0.0483         0.4677
N                      178            150


In addition to online status, three of the explanatory variables appearing in the last regression have low p-values, indicating statistical significance (a p-value reported as 0.0000 is not zero, but is small enough to round to zero at four decimal places). The GPA prior to entering the course has a strong positive effect on the predicted exam grade (a 13 point increase in exam grade for each 1 point of GPA). The average of homework and lab assignment grades also has a significant positive effect (each additional point on that average leads to a predicted increase of about a third of a point on the exam grade). The third variable with a significant effect on exam grade is commuting status: students with a long commute are predicted to score about 7 points lower than those without such a commute.

The last two variables we included are qualitative variables for computer experience and for gender. According to the results, those with computer experience scored almost 3 points higher and females scored about 3 points lower on the final exam. In neither case were the coefficients statistically significant.
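Putting the coefficients together, the fitted second regression predicts an exam grade as a linear combination of student characteristics. The coefficients below are taken from Table 8; the particular student profile is invented for illustration:

```python
# Coefficients from the second regression in Table 8.
COEF = {
    "intercept": 12.3798,
    "online": 4.8101,
    "gpa": 12.9755,
    "hwlab": 0.3147,
    "commute": -7.2767,
    "comp_exper": 2.7137,
    "female": -3.2484,
}

def predicted_exam(online, gpa, hwlab, commute, comp_exper, female):
    """Fitted exam grade; dummy arguments are 0 or 1, gpa and hwlab numeric."""
    return (COEF["intercept"]
            + COEF["online"] * online
            + COEF["gpa"] * gpa
            + COEF["hwlab"] * hwlab
            + COEF["commute"] * commute
            + COEF["comp_exper"] * comp_exper
            + COEF["female"] * female)

# A hypothetical online student: 3.0 GPA, 90% homework/lab average,
# no long commute, no extensive programming experience, male.
print(f"predicted exam grade = {predicted_exam(1, 3.0, 90, 0, 0, 0):.2f}")
```

Such fitted values are only illustrative of how the regression combines the controls; they are not additional results from the paper.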

We recognize that the final exam score is a limited measure of academic performance. So, in order to obtain a broader picture of academic performance in the course, we also performed regressions with a partial course grade (based just on the exam, three tests, and two projects) as the dependent variable. The results of these regressions for online student performance, presented in Table 9, are qualitatively similar to those for the exam grade. Online students did significantly better than lecture students, with the gap shrinking when control variables are included.


Table 9: OLS regression coefficients for partial course grades (p-values shown beneath each coefficient).

Variable        Regression 1   Regression 2
Intercept          81.0774        29.7459
  p-value           0.0000         0.0000
Online              6.4971         3.6462
  p-value           0.0014         0.0181
GPA                                8.4215
  p-value                          0.0000
HWLab                              0.3328
  p-value                          0.0000
Commute                           -2.4894
  p-value                          0.1395
Comp Exper                         1.8690
  p-value                          0.2335
Female                            -3.0567
  p-value                          0.0262
R Squared           0.0565         0.5672
N                      178            150


Results for the control variables in this second set of regressions roughly paralleled those of Table 8, with the same coefficient signs. However, the commute variable was no longer significant, while female status was, with women scoring about three points lower than men.

In addition to the variables used in these two sets of regressions, we tried several other variables that did not demonstrate statistically significant effects on academic performance. These included age, work status, work hours (in separate equations from work status), semester hours enrolled, previous experience with distance education, and number of visits to the website per week. We were surprised that age, work status, and work hours had no significant effects. We had surmised that the higher average age (and therefore maturity) of online students might account for their higher grades; however, the regression results did not support that notion. (Age was not a significant predictor of performance in our earlier papers either.) We also thought that working at jobs outside of class would have a negative effect on performance. In our earlier papers we found that students who reported holding outside jobs did more poorly; however, that result was not duplicated in the current analysis. There were some outliers in our data set arising from particularly low grades. To check our results we performed robust regression with the SAS ROBUSTREG procedure, using Huber's M and Least Trimmed Squares (LTS) estimators, and found the results to be qualitatively very similar to the OLS results in this section. See Appendix B for our estimations and a discussion. The only qualitative change was that the Female coefficients appeared to be more significant.

In addition to looking at the final exam score and more general course grade, we also considered completion of the course as a measure of academic success. Of the 207 students who began the course (defined as still being enrolled at the end of the official refund period on August 31), 178 completed the course by taking the final exam and 29 did not. Table 10 contains the results of logit regressions that use course completion as the binary dependent variable. That variable is defined to take on a value of 1 for those who finished and 0 for those who did not. The coefficients from a logit regression indicate the effects of the explanatory variables on the log of the odds (e.g. one-to-three or three-to-five) of completing the course. From these coefficient values we can also derive the effects of the explanatory variables on the probability (e.g. a 10 percent chance or 50 percent chance) of completing the course and we focus on this because it is intuitively easier to interpret. The probability effects are not constant since they depend on the values of the explanatory variables. We follow the convention of using the mean values of the explanatory variables in the computations. There are three sets of values reported in Table 10: each variable’s logit coefficient, the p-value associated with that coefficient, and the variable’s effect on the probability of completing the course. We had initially used a larger set of variables in the logit regressions, but these variables had such high p-values that we decided not to include them in the set of regressions reported here.
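The marginal probability effects reported in Table 10 follow the standard logit formula: evaluated at the mean, the effect of a one-unit change in a variable is its coefficient times p̄(1 − p̄), where p̄ is the mean of the dependent variable. This small check reproduces the effects reported in the table's second regression:

```python
def logit_marginal_effect(coef, p_bar):
    """Marginal effect of a logit coefficient evaluated at mean probability p_bar."""
    return coef * p_bar * (1 - p_bar)

# Values from the second regression in Table 10.
p_bar = 0.8549          # mean of the dependent variable (completion rate)
gpa_effect = logit_marginal_effect(2.0163, p_bar)      # reported as 0.2501
online_effect = logit_marginal_effect(-0.6022, p_bar)  # reported as -0.0747
print(f"GPA effect: {gpa_effect:.4f}, Online effect: {online_effect:.4f}")
```

Both computed values match the table to four decimal places, confirming that the reported probability effects were evaluated at the mean of the dependent variable.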


Table 10: Logit coefficients.
Limited dependent variable = finished
(Coefficients, p-values, and marginal probability effects.)

Variable         Regression 1   Regression 2
Intercept            1.8289        -3.6664
  p-value            0.0000         0.0078
Online              -0.0613        -0.6022
  p-value            0.8061         0.2462
  prob effect       -0.0074        -0.0747
GPA                                 2.0163
  p-value                           0.0001
  prob effect                       0.2501
Mean dep var         0.8599         0.8549
N                       207            193


In our earlier papers, we found that online students were less likely to complete the course. The current study, however, provides little or no support for that result: the coefficient for online status, though negative, was not significant in our regressions. We did control for GPA and found that the GPA level upon entering the course had a strong and significant positive effect on the likelihood of course completion. A one-point increase in GPA raised the predicted probability of finishing the class by about 25 percentage points.

In ending this section, it is worth noting that the superior grade performance of online students in our study does not imply that all students should take statistics or other courses online. Online courses may never be suitable for students with certain learning styles. We can reasonably conclude, however, that a substantial component of the student population taking introductory statistics can learn at least as much online as in a large lecture class. A large group of students selected themselves into the online class for various reasons and, as a group, managed to outperform students in the lecture class.

6. Summary and Conclusions

In this paper we have reported the results of a side-by-side comparison of two sections of an undergraduate business statistics course, one a traditional lecture section and the other a completely online section. We compared the two sections in three specific areas. First, we looked at the characteristics of students and the factors that affected their choice of section. Second, we focused on students’ use of the course resources and their overall satisfaction with various aspects of the course. Finally, we compared the academic performance of students in the two sections.

In many respects students in the two statistics sections were quite similar. Both groups were predominantly full time students in undergraduate degree programs. There were some non-traditional students, but not nearly so many as in the two studies of a computer programming course that we conducted previously. From our results here it would seem that online sections in business statistics at North Carolina State are being used not so much because of work or child care commitments (though those motivations remain important) but more as close substitutes for traditional lecture sections. For that reason the online students here seem very like the traditional students. Another possible reason that the online section is not more different from the lecture section is that relatively few majors outside Business Management and Accounting are likely to take the course. The two sections come from a single, relatively homogeneous population of students and so are quite similar.

Despite the substantial similarities between the lecture and online students, the differences indicate that a large degree of self-selection takes place. The online students were significantly older on average and included a larger proportion of seniors. They entered the class with higher Grade Point Averages. Finally, though similar fractions of the two groups engaged in outside work, the online students reported working longer hours on average. Somewhat to our surprise, women were disproportionately represented in the online section. (We are not sure why this is so and would like to see whether the same gender difference appears in comparisons for other courses and sections.)

Self-selection became even more evident in students’ responses to questions regarding their motivations for choosing the particular type of class format. To some degree the differences were in keeping with our prior expectations. Lecture students placed a high value on the stimulation and motivation provided by face-to-face contact with the instructor and with fellow students. Online students needed greater flexibility to accommodate conflicts with other commitments.

A somewhat surprising result concerning choice of section was the large fraction of online students who reported choosing the online section in order to avoid conflicts with other classes. This motivation fits a picture of online students as being regular full-time undergraduates who take some of their courses online to increase flexibility, rather than being primarily older part-time students engaged in post-baccalaureate study or other lifelong learning. This finding for business statistics is somewhat in contrast with what we found in our earlier studies of the computer programming course.

Student use of course resources differed between the two sections much as we expected. Online students reported utilizing the website more and emailing the instructor more, while lecture students made greater use of on-campus resources such as the teaching assistants and the labs. These patterns held even though all the resources were, in practice, equally accessible to both sections. Students chose one mode of receiving the course and then stuck to that mode, even though they could have “crossed over” and essentially taken the course by the other mode. Regardless of section, students indicated roughly equal levels of satisfaction with the course.

Finally, we looked at comparisons of academic performance. We performed regressions to identify which factors helped predict success in the course. The major result here was that students in the online section had higher academic performance, even when we controlled for other important variables. The difference in performance ranged from 3 to more than 8 percentage points in grades and was statistically significant. This differential in favor of the online students reaffirmed what we found in our two earlier studies of the computer programming course, i.e., that there is a significant group of students who can perform at least as well in an online statistics course as in a traditional lecture course.

In order to isolate the impact of online status, we controlled for a number of student characteristics that would influence academic performance. Previous academic success (GPA at the beginning of the semester) and student effort (measured by the homework average) were both important positive predictors of success in this course. In some (but not all) specifications, computer experience had a significant positive effect while commuting and being female both had significant negative effects. The online-lecture differential became smaller when other explanatory variables were added, but it still remained statistically significant.

In addition to grades, we also considered course completion as an indicator of success. We used logit regression equations to see what factors influenced the probability of course completion (as indicated by exam completion). Prior GPA had a large and significant positive effect on probability of completion. Online status showed a negative effect but it was never statistically significant. This is in contrast to our findings in previous work that online students were significantly less likely to finish.

This study differs from our two previous comparisons of online and lecture students in a computer science course because here we use data for business statistics students. We were interested to determine whether our previous results would extend to the statistics class, which generally contains a different and perhaps more typical set of undergraduates. In many ways these comparisons turned out to be similar to our previous results: in both cases online students tended to be older and more mature, and in both they performed better academically. In other important ways, though, the students in this study were different from those in the previous work. The online section of the computer science class consisted of many part-time, lifelong, or nontraditional students who enrolled in the online section mainly because of work or child care commitments. In the current business statistics case, the online students seemed to be much more like the lecture students. They were older, it is true, but they were largely enrolled in traditional, full-time degree programs. This fact may well indicate an increasing importance of online education for full-time on-campus students. Our finding that online students performed at least as well as lecture students is evidence in favor of applying the online format to the statistics class.

An interesting extension of this work would be to examine whether there are differences between online and lecture students in learning of specific topics within the statistics course. It is possible that the online format works better to convey certain concepts or methods while others are better taught using the traditional lecture. Such research would involve subdividing and collecting performance measures by topic and comparing online and lecture students for each topic. We leave this study to a future paper.


Appendix A

Survey 1

To members of section 003 of BUS/ST 350:

A co-author and I are doing research about online education, investigating who takes online courses and how the course performance of online students compares with that of lecture students. We have already published one journal article about this subject and written another. Since I am teaching both a lecture section and an online section this fall, we have a good opportunity for a side by side comparison of the two types of courses. For that reason I am requesting that you complete information about your situation and intentions with respect to BUS/ST 350. I will be distributing this survey and a couple of others during the semester.

Your help with this is extremely important. There are many online courses being offered these days but we do not know enough about how effective they are and whom they cater to. The research based on this survey will be a useful advance of knowledge about online education.

Important: All individual information gathered in this study will be held in the strictest confidence. I am the only person who will see the individual responses. The data will be aggregated and only averages, percentages, and similar statistics for groups of individuals will be in the reports that result from our research. Also, the answers you give will have no effect on your grade.

I do ask that you provide identifying information. I plan to match what you say here with future survey results, with your grades, and with other information available from NCSU Registration and Records.

This survey is voluntary. I hope you agree to participate. If you do, please indicate by completing this form, signing below, and completing the survey attached.

NAME:						 
SS#:				
UNITY USER ID:				
Signature: 						 .
Date:				


UNITY USER ID:

  1. Will you be working this semester? ______ Yes _______ No

    If yes, how many hours per week, on average? ______

  2. Will you have child care responsibilities this semester? ______ Yes _______ No

    If yes, how many hours per week, on average? ______

  3. Do you commute 10 or more miles to class from where you live (one way)? ______ Yes _______ No

  4. Which of the following best describes your computer experience?
    _______ None
    _______ Experience with word processing and/or spreadsheet applications.
    _______ Experience with typical software applications plus web page development.
    _______ Experience with all of the above plus some programming.
    _______ Extensive programming experience.

  5. Have you successfully completed a distance education course prior to this semester? ____Yes ____ No

  6. Have you used WebAssign in a class before? _______Yes _______ No

  7. Where will you do most of your internet work for this course? ___Home ___ Work ___Campus ___Other

  8. What kind of internet access do you have at home?
    ___ Telephone modem ____ Cable modem or DSL ___ Other

  9. Please indicate the degree to which you agree with the following statements.

    1=Strongly disagree 2=Disagree 3=Neutral 4=Agree 5=Strongly agree

    _______ Face-to-face contact with the instructors of my classes is very important to me.
    _______ Face-to-face contact with fellow students in my classes is very important to me.
    _______ Conflict between class time and work commitments is a significant problem for me.
    _______ Work commitments strongly influenced my choice between lecture and online sections.
    _______ Conflict between class time and child care commitments is a significant problem for me.
    _______ Child care commitments strongly influenced my choice between lecture and online sections.
    _______ A course scheduling conflict strongly influenced my choice between lecture and online sections.
    _______ I find commuting to class a hardship.
    _______ Attending regular class meetings is an important motivator for me.
    _______ Flexibility in setting pace and time for studying is important to me.
    _______ I learn better from hearing a lecture.
    _______ I learn better from reading lecture materials.
    _______ I learn better when I have face-to-face contact with my fellow students.
    _______ My choice of lecture vs online section was influenced by advice from an advisor or other university official.
    _______ If I need help, I’m more likely to email my instructor than to meet with him/her face-to-face.

Please make on the reverse side any comments you wish about your choice of online vs. lecture section.


Survey 2

To members of BUS/ST 350 sections 002 and 003:

Earlier in the semester I collected a survey from many of you. Now as part of our research about online education, my co-author and I would like you to complete a second survey. Part of it is follow-up to the first survey and part is to get information on your experiences with the course. We intend to use the survey data in a paper similar to two others we have already written on the subject.

Your help with this is extremely important. There are many online courses being offered these days but we do not know enough about how effective they are and whom they cater to. The research based on this survey will be a useful advance of knowledge about online education.

Important: All individual information gathered in this study will be held in the strictest confidence. My secretary, my co-author, and I are the only persons who will see the individual responses. The data will be aggregated and only averages, percentages, and similar statistics for groups of individuals will be in the reports that result from our research. Also, the answers you give will have no effect on your grade.

I do ask that you provide identifying information. I plan to match what you say here with previous survey results, with your grades, and with other information available from NCSU Registration and Records.

This survey is voluntary. I hope you agree to participate. Please indicate whether you agree by answering below, completing the identification information, and signing.

You will receive full credit for one in-class or extra point assignment if you answer this question. The survey will have no other effect on your grade.

If you do agree to participate, please then complete the attached survey.

I agree to allow my data to be used in this study: Yes No (Circle one)

NAME:						 
SS#:				
UNITY USER ID:				
Signature: 						 .
Date:				


Survey 2 for BUS/ST 350

  1. During the fall semester, did you increase the number of hours you work?
    ___ Yes
    ___ No
    ___ Not applicable

  2. During the fall semester, did you decrease the number of hours you work?
    ___ Yes
    ___ No
    ___ Not applicable

  3. What percentage of your computer work for this course did you do in each of the following places (percentages should total 100%):
    ______Computer lab in Nelson Hall
    ______Computer labs elsewhere on campus
    ______On campus in your dormitory room
    ______In your home or apartment off campus
    ______At your place of work
    ______Other

  4. What percentage of the time did you use each of the following modes of access to the internet (please make the percentages total 100%)?
    ______Access through university facilities
    ______Telephone modem
    ______Cable modem (for example, Roadrunner)
    ______DSL line
    ______Other

  5. On average, about how many times per week did you visit the course website? ______

  6. How often did you attend the weekly lab sessions in the basement of Nelson Hall?
    ___ Never
    ___ About a quarter of the time
    ___ About half the time
    ___ About three-quarters of the time
    ___ All the time or almost all the time

  7. How many of the test review sessions conducted by the TAs did you attend? _____

  8. On average, how often did you consult the TAs for help outside of the lab or review sessions?
    ___ Never
    ___ Once or twice during the semester
    ___ Every week or two
    ___ Once a week or more

  9. How often did you come to the instructor’s office hours?
    ___ Never
    ___ Once or twice during the semester
    ___ Every week or two
    ___ Once a week or more

  10. How many online courses (with no lecture meetings) have you taken up to now? Include this course if you are taking it online, as well as others you are taking this semester. ____

  11. Which of the following characterizes your experience with the completely online version of BUS/ST 350 (Section 003)?
    ___ Not applicable. I was in Section 002.
    ___ I preferred taking the course online and doing so has worked out very well for me.
    ___ I preferred taking the course online initially but now I wish I had taken it as a lecture course.
    ___ From the beginning, I would rather have taken a lecture section. It turns out though that the online section has worked out well for me.
    ___ From the beginning, I would rather have taken a lecture section. The online section has not worked out well for me.
    ___ Other (please explain).

  12. How does the online section of BUS/ST 350 compare with other online courses you have taken?
    ___ Not applicable. I was in Section 002.
    ___ Not applicable. This is the first completely online course I have taken.
    ___ It was generally better than the others I have taken.
    ___ It was generally about the same as the others I have taken.
    ___ It was generally not so good as the others I have taken.

  13. About how many times during the semester did you send emails to the instructor?
    ____ Never
    ____ Between 1 and 4 times
    ____ Between 5 and 10 times
    ____ More than 10 times

    Please indicate your level of agreement with each of the following statements by circling the appropriate response:

  14. I needed lectures in class meetings to really understand the material of the course.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  15. I had as much contact with fellow students as I wanted during the course.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  16. I found it difficult to balance the demands of the course with my other commitments.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  17. I was able to avoid getting behind in the course during the semester.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  18. It is easy to find things in the course website.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  19. I regularly read the bulletins on the course website.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  20. During the semester I referred frequently to the PowerPoint slides on the website.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  21. During the semester I referred frequently to the series of 40 Lessons on the website.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  22. During the semester I referred frequently to the record of my grades accessible through the website.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  23. I was able to learn what I needed for the course from materials on the website.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  24. For homework assignments I prefer using WebAssign rather than doing assignments to turn in on paper for grading.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  25. The ability to resubmit my answers several times on WebAssign helped me learn more from the assignments.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  26. Knowing immediately whether my answers are correct or not on WebAssign helped me learn more from the assignments.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  27. The number of homework and lab assignments was about right for the material we covered.
    Strongly disagree Disagree Neutral Agree Strongly Agree

  28. What improvements would you make in the website?

  29. What other comments do you have about the course and the course website?


Appendix B

Tables B-8 and B-9 contain results of robust regressions paralleling those of Table 8 and Table 9 in the text. The OLS estimates from the earlier tables are repeated for easier comparison. The Huber M estimates in the middle columns were computed by iteratively reweighted least squares, which assigns lower weights in the least-squares computation to observations with outlying residuals; the estimation used the SAS ROBUSTREG procedure with its default bisquare weight function. At the right are Final Weighted Least Squares (FWLS) results from Least Trimmed Squares regressions, also performed with ROBUSTREG. The FWLS estimates are OLS results computed with the outliers detected by the procedure left out; the number of observations left out is indicated in the last row of each table.
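To illustrate the iteratively reweighted least squares idea described above, the following is a minimal sketch in Python of M-estimation for a simple regression y = a + b*x using a Huber-type weight function. This is not the authors' SAS ROBUSTREG code (which used the default bisquare weights); the data, tuning constant c, and iteration count are illustrative assumptions.

```python
def weighted_ls(x, y, w):
    """Weighted least-squares fit of y = a + b*x; returns (a, b)."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    b = sxy / sxx
    return ybar - b * xbar, b

def huber_irls(x, y, c=1.345, iters=25):
    """Refit weighted LS repeatedly, down-weighting large residuals."""
    w = [1.0] * len(x)
    for _ in range(iters):
        a, b = weighted_ls(x, y, w)
        resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        # Robust scale estimate: median absolute deviation, rescaled
        # to be consistent for normal errors (guard against zero).
        scale = sorted(abs(r) for r in resid)[len(resid) // 2] / 0.6745 or 1.0
        # Huber weights: 1 inside c*scale, declining like 1/|r| outside.
        w = [1.0 if abs(r) <= c * scale else c * scale / abs(r) for r in resid]
    return a, b

# One gross outlier pulls the OLS slope well away from the true value 3;
# the reweighting largely removes its influence.
x = list(range(10))
y = [2.0 + 3.0 * xi for xi in x]
y[9] += 50.0                      # contaminate one observation
a_ols, b_ols = weighted_ls(x, y, [1.0] * len(x))
a_rob, b_rob = huber_irls(x, y)
```

The down-weighting step is the essential difference from OLS: an observation with a large residual relative to the robust scale estimate contributes proportionally less to the next least-squares fit, so a few outliers cannot dominate the coefficients.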

The two tables show little qualitative or quantitative difference between the OLS results and the Huber or FWLS results. The biggest difference is that the Female coefficient is larger in magnitude and statistically significant in the robust regressions with Exam as the dependent variable. A similar but much less substantial change is evident for the Female coefficient in Table B-9 as well.


Table B-8: Regression coefficients for exam grades.

Coefficient estimates with p-values beneath. For each estimator, the first
column is the regression including only Online (N = 178); the second is the
full specification (N = 150).

              OLS estimates      Huber M estimates     FWLS estimates
Intercept    74.4721  12.3798   75.0806   12.3798    74.8377  14.4937
  p-value     0.0000   0.0596    0.0000    0.0554     0.0000   0.0207
Online        8.2615   4.8101    9.7182    6.4680    10.2925   6.2689
  p-value     0.0032   0.0371    0.0006    0.0041     0.0000   0.0048
GPA                   12.9755             12.6744             12.3666
  p-value              0.0000              0.0000              0.0000
HWLab                  0.3147              0.3348              0.3170
  p-value              0.0000              0.0000              0.0000
Commute               -7.276              -7.5813             -7.1857
  p-value              0.0043              0.0022              0.0012
CompExper              2.7137              2.4909              2.1071
  p-value              0.2485              0.2803              0.3484
Female                -3.2484             -4.9137             -4.0229
  p-value              0.1134              0.0145              0.0404
R-squared     0.0483   0.4677    0.0591    0.4160     0.0738   0.4160
N                178      150       178       150        178      150
Outliers                                                  3        1


Table B-9: Regression coefficients for partial course grades.

Coefficient estimates with p-values beneath. For each estimator, the first
column is the regression including only Online (N = 178); the second is the
full specification (N = 150).

              OLS estimates      Huber M estimates     FWLS estimates
Intercept    81.0774  29.7459   81.8194   33.5690    81.7661  29.7459
  p-value     0.0000   0.0000    0.0000    0.0000     0.0000   0.0000
Online        6.4971   3.6462    6.7590    3.8055     6.6115   3.6462
  p-value     0.0014   0.0181    0.0004    0.0096     0.0002   0.0168
GPA                    8.4215              8.6484              8.4215
  p-value              0.0000              0.0000              0.0000
HWLab                  0.3328              0.2903              0.3328
  p-value              0.0000              0.0000              0.0000
Commute               -2.4894             -2.4572             -2.4894
  p-value              0.1395              0.1280              0.1373
CompExper              1.8690              1.5533              1.8690
  p-value              0.2335              0.3022              0.2316
Female                -3.0567             -3.7833             -3.0567
  p-value              0.0262              0.0039              0.0246
R-squared     0.0565   0.5672    0.0609    0.4586     0.0763   0.6609
N                178      150       178       150        178      150
Outliers                                                  3        0


References

Bengiamin, N. N., Johnson, A., Zidon, M., Moen, D. and Ludlow, D. K. (1998), “The Development of an Undergraduate Distance Learning Engineering Degree for Industry – A University/Industry Collaboration,” Journal of Engineering Education, 87, 277-282.

Burbules, N.C. and Callister, T.A., Jr. (2000), “Universities in Transition: The Promise and the Challenge of New Technologies,” Teachers College Record, 102, 271-293.

Davis, J.L. (1996), “Computer-Assisted Distance Learning, Part II: Examination Performance of Students On and Off-Campus,” Journal of Engineering Education, 85, 77-82.

Dutton, J., Dutton, M. and Perry, J. (2001), “Do Online Students Perform as Well as Lecture Students?” Journal of Engineering Education, 90, 131-136.

Dutton, J., Dutton, M. and Perry, J. (2002), “How Do Online Students Differ From Lecture Students?” Journal of Asynchronous Learning Networks [Online], 6(1). www.sloan-c.org/publications/jaln/v6n1/v6n1_dutton.asp

Evans, J.R. and Haase, I.L. (2001), “Online Business Education in the Twenty-First Century: An Analysis of Potential Target Markets,” Internet Research: Electronic Networking Applications and Policy, 11, 246-260.

Huang, G., Taddese, N., Walter, E., and Peng, S. (2000), “Entry and Persistence of Women and Minorities in College Science and Engineering Education,” Research and Development Report, National Center for Education Statistics, U.S. Department of Education.

Russell, T.L. (1999), “The No Significant Difference Phenomenon,” North Carolina State University, Raleigh, NC.

Spooner, F., Jordan, L., Algozzine, B. and Spooner, M. (1999), “Student Ratings of Instruction in Distance Learning and On-Campus Classes,” Journal of Educational Research, 92, 132-140.

Wallace, D. R. and Mutooni, P. (1997), “A Comparative Evaluation of World Wide Web-Based and Classroom Teaching,” Journal of Engineering Education, 86, 211-219.

Wilson, T. and Whitelock, D. (1998), “What are the Perceived Benefits of Participating in a Computer-Mediated Communication (CMC) Environment for Distance Learning Computer Science Students?” Computers & Education, 30, 259-269.


Acknowledgement

This paper was presented at the Hawaii International Conference on Business in June 2002 and at the North Carolina Teaching and Learning with Technology Conference in March 2003. We thank several anonymous referees for helpful comments.


John Dutton
College of Management
N.C. State University
Raleigh, NC 27695-7229
U.S.A.
John_Dutton@ncsu.edu

Marilyn Dutton
School of Business
N.C. Central University
Durham, NC 27707
U.S.A.
mdutton@wpo.nccu.edu

