The Development and Validation of a Teacher Preparation Program Follow-Up Survey

Laura E. Schulte
University of Nebraska at Omaha

Journal of Statistics Education Volume 16, Number 3 (2008), www.amstat.org/publications/jse/v16n3/schulte.html

Copyright © 2008 by Laura E. Schulte, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.


Key Words: Scale development; Applied statistics; Service learning.

Abstract

Students in my applied advanced statistics course for educational administration doctoral students developed a follow-up survey for teacher preparation programs, using the following scale development processes: adopting a framework; developing items; providing evidence of content validity; conducting a pilot test; and analyzing data. The students developed the survey items by using the Interstate New Teacher Assessment and Support Consortium (INTASC) principles as the framework to operationally define the knowledge and skills that highly qualified teachers should possess. The students analyzed the data from the pilot study for their final exam in the course. The follow-up survey is currently being used by our university for program evaluation, improvement, and accreditation.

1. Introduction

The doctoral students in my applied advanced statistics course for educational administration developed a follow-up survey for teacher preparation programs, using the following scale development processes: adopting a framework; developing items; providing evidence of content validity; conducting a pilot test; and analyzing data (DeVellis 2003). The course project served as a way to model and teach best practices in scale development. The students in the class are practicing school administrators who are responsible for assessing student achievement, staff effectiveness, and graduates’ and community members’ perceptions of school programs. The course project helped the students gain the skills needed to develop sound assessment instruments.

2. Adopting a Framework

The students used the Interstate New Teacher Assessment and Support Consortium (INTASC 1992) principles and their corresponding indicators as the framework to operationally define the knowledge and skills that highly qualified teachers should possess (Guskey 2005; Wiggins and McTighe 2006). INTASC’s (1992) Model Standards for Beginning Teacher Licensing and Development include 10 principles and their corresponding knowledge and skill indicators. Listed below are the 10 INTASC principles and, as an example, the knowledge and skill indicators for Principle 1.

• Principle 1: The teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and can create learning experiences that make these aspects of subject matter meaningful for students.

Knowledge Indicators

- The teacher understands major concepts, assumptions, debates, processes of inquiry, and ways of knowing that are central to the discipline(s) s/he teaches.

- The teacher understands how students’ conceptual frameworks and their misconceptions for an area of knowledge can influence their learning.

- The teacher can relate his/her disciplinary knowledge to other subject areas.

Skill Indicators

- The teacher effectively uses multiple representations and explanations of disciplinary concepts that capture key ideas and links them to students’ prior understandings.

- The teacher can represent and use differing viewpoints, theories, "ways of knowing," and methods of inquiry in his/her teaching of subject matter concepts.

- The teacher can evaluate teaching resources and curriculum materials for their comprehensiveness, accuracy, and usefulness for representing particular ideas and concepts.

- The teacher engages students in generating knowledge and testing hypotheses according to the methods of inquiry and standards of evidence used in the discipline.

- The teacher develops and uses curricula that encourage students to see, question, and interpret ideas from diverse perspectives.

- The teacher can create interdisciplinary learning experiences that allow students to integrate knowledge, skills, and methods of inquiry from several subject areas.

• Principle 2: The teacher understands how children learn and develop and can provide learning opportunities that support their intellectual, social, and personal development.

• Principle 3: The teacher understands how students differ in their approaches to learning and creates instructional opportunities that are adapted to diverse learners.

• Principle 4: The teacher understands and uses a variety of instructional strategies to encourage students’ development of critical thinking, problem solving, and performance skills.

• Principle 5: The teacher uses an understanding of individual and group motivation and behavior to create a learning environment that encourages positive social interaction, active engagement in learning, and self-motivation.

• Principle 6: The teacher uses knowledge of effective verbal, nonverbal, and media communication techniques to foster active inquiry, collaboration, and supportive interaction in the classroom.

• Principle 7: The teacher plans instruction based upon knowledge of subject matter, students, the community, and curriculum goals.

• Principle 8: The teacher understands and uses formal and informal assessment strategies to evaluate and ensure the continuous intellectual and social development of the learner.

• Principle 9: The teacher is a reflective practitioner who continually evaluates the effects of his/her choices and actions on others (students, parents, and other professionals in the learning community) and who actively seeks out opportunities to grow professionally.

• Principle 10: The teacher fosters relationships with school colleagues, parents, and agencies in the larger community to support students’ learning and well-being. (INTASC 1992)

3. Developing Items

The item development panel consisted of 13 professional educators: the 8 educational administration doctoral students enrolled in my applied advanced statistics course; 4 teachers in the surrounding school districts; and 1 professor in the College of Education. At the time of this study, the item development panel members’ mean years of experience in the field of education was 17.14 (SD = 9.39).

During the first class session of the statistics course, I gave an overview of the INTASC (1992) principles and corresponding knowledge and skill indicators to the item development panel. To model the item development process I worked with the item development panel to generate knowledge and skill items for INTASC Principle 10. I provided sample knowledge and skill items for each of the other nine INTASC principles. Then, the item development panel broke into small groups and generated knowledge and skill items for the remaining nine INTASC principles. In total, the item development panel generated 100 knowledge and skill items that were reviewed for content validity.

4. Providing Evidence of Content Validity

A group of 21 persons with experience in teacher education reviewed the 100 items from the item development panel to provide evidence of the College of Education (COE) Follow-Up Survey’s content validity. None of the members of the content validity panel was a member of the item development group. The content validity panel included 12 community members who were graduate students, teachers, and/or administrators in area P-12 schools and 9 staff, professors, and/or administrators from the College of Education. The reviewers’ years of experience in the field of education ranged from 2 to 36 years with a mean of 21.71 years (SD = 10.68).

We provided the reviewers with each INTASC principle and the corresponding knowledge and skill indicators. We asked the reviewers to rate the appropriateness of the 100 survey items in measuring the knowledge or skills represented by each INTASC (1992) principle on a 3-point scale (1 = not appropriate, 2 = marginally appropriate, and 3 = very appropriate). We asked the reviewers to provide ways to improve the items that they rated "1" or "2", if possible. In addition, we asked the reviewers to circle the items that best captured the essence of the knowledge or skill indicators for each INTASC principle. This step was deemed necessary because of the large number of items and the need to reduce the number of items by about 50%.
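The rating-and-circling step above can be sketched in a few lines of Python. The 1-3 appropriateness scale and the "circle the best items" instruction come from the text; the item names, ratings, and the retention thresholds below are hypothetical, chosen only to illustrate how the two kinds of reviewer input might be aggregated.

```python
from statistics import mean

# Reviewer appropriateness ratings on the paper's 3-point scale
# (1 = not appropriate, 2 = marginally appropriate, 3 = very appropriate).
# Item names and ratings are invented for illustration.
ratings = {
    "item_01": [3, 3, 2, 3, 3],
    "item_02": [2, 1, 2, 2, 1],
    "item_03": [3, 2, 3, 3, 3],
}

# Number of reviewers who circled each item as best capturing
# the essence of its INTASC principle (also invented).
circled = {"item_01": 4, "item_02": 0, "item_03": 3}

def summarize(item):
    """Collapse a single item's reviewer input into summary figures."""
    return {"mean_rating": round(mean(ratings[item]), 2),
            "times_circled": circled[item]}

# Hypothetical retention rule: a high mean rating AND circled at least once.
retained = [i for i in ratings
            if mean(ratings[i]) >= 2.5 and circled[i] >= 1]
print(retained)
```

A rule of this shape, applied to 100 items, would support the roughly 50% reduction the panel needed; the actual decisions in the study also incorporated the panel's qualitative revisions.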

The process of retaining and/or revising items followed a three-step procedure. During another class session of the statistics course, the item development panel members broke into their item development groups to: 1. determine how frequently each item was chosen as capturing the essence of an INTASC (1992) principle, 2. consider each item’s ratings, and 3. revise (if necessary) each item based on input from the content validity panel. Of the original 100 survey items, 49 items were retained following the three-step process. Of those 49 items, 12 were revised based on input from the content validity panel. The 49 items consisted of 24 knowledge and 25 skill items. We then added 19 disposition items from the Teacher Dispositions Index (Schulte, Edick, Edwards, and Mackiel 2004), which was developed using a similar process with students in another statistics class. Because the Teacher Dispositions Index items were developed for INTASC principles 1, 2, 3, 5, 6, 7, and 9, fewer INTASC principles are represented by the disposition items in the COE Follow-Up Survey. In total, the COE Follow-Up Survey contained 68 knowledge, skill, and disposition items that represented the 10 INTASC principles.

5. Conducting a Pilot Test

We received approval from the university’s Institutional Review Board (IRB) to collect data on the COE Follow-Up Survey from recent graduates of our teacher preparation programs. Going through this process also prepared the doctoral students in the statistics class for obtaining the IRB approval required for their own dissertation research.

For the pilot study, graduates were surveyed using two methods. We mailed surveys to recent graduates of our bachelor’s and master’s degree teacher preparation programs. We also asked professors teaching graduate-level courses in the College of Education to identify students in their classes who had graduated from the university with bachelor’s degrees in teacher preparation programs; those graduates were asked to complete the COE Follow-Up Survey if they had not already received and completed a mailed survey. Through these two methods, 487 graduates were asked to participate, and 123 returned completed surveys, a 25% response rate.

The survey information included the following: (a) a cover letter that explained the purposes of the study and informed the respondents that participation was voluntary and that responses would be anonymous, (b) demographic questions used to describe the sample, and (c) the 68-item survey with three additional open-ended items that asked respondents to indicate in what areas their teacher preparation program was strong and to provide recommendations for changes and any other comments. The graduates were asked to give their perception of their teacher preparation program using a response scale ranging from "1" strongly disagree to "5" strongly agree.

6. Analyzing Data

To further validate the survey and to provide an estimation of its reliability, the data from the 123 graduates who completed the COE Follow-Up Survey were analyzed by the students in the applied advanced statistics course for their final examination. The students conducted the following statistical analyses using SPSS for Windows to investigate the construct validity and reliability of the COE Follow-Up Survey:

1. The construct validity and dimensionality of the COE Follow-Up Survey were investigated with exploratory factor analyses using a principal axis factoring method followed by a varimax rotation of the extracted factors. The principal axis factoring method was used rather than the principal components method because the purpose was to investigate common variance in order to determine the number of dimensions that the COE Follow-Up Survey measured (Kachigan 1991).

2. The reliability of the COE Follow-Up Survey subscales was estimated using coefficient alpha (Cronbach’s alpha) (Crocker and Algina 1986).

6.1 Factor Analysis

For the factor analysis part of the final exam, all 8 students indicated that a two-factor solution best fit the data. The first factor had an eigenvalue of 25.97 and accounted for 38.20% of the total variance. The second factor had an eigenvalue of 4.71 and accounted for 6.92% of the total variance. The two factors accounted for approximately 45.12% of the variance in the COE Follow-Up Survey items. When the students considered including a third factor, they found that the third factor had an eigenvalue of 2.60 accounting for 3.83% of the total variance with only three items loading on the factor.
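The variance percentages above follow directly from the eigenvalues: with 68 standardized items, the total variance in the analysis equals the number of items, so each factor's share is its eigenvalue divided by 68. A short pure-Python check reproduces the reported figures (to rounding):

```python
# With standardized items, total variance equals the number of items,
# so a factor's share of variance is eigenvalue / n_items.
n_items = 68
eigenvalues = {"factor_1": 25.97, "factor_2": 4.71, "factor_3": 2.60}

pct = {f: 100 * ev / n_items for f, ev in eigenvalues.items()}

# These come out near the reported 38.20%, 6.92%, and 3.83%,
# and the first two together near the reported 45.12%.
for factor, p in pct.items():
    print(factor, round(p, 2))
```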

Using a factor loading cutoff value of .50, the students removed 19 of the original 68 items that did not load on either factor. They found that the remaining items measured a knowledge and skills dimension and a dispositions dimension (see Table 1). Thus, the results of the factor analysis yielded a 49-item COE Follow-Up Survey that measures two unique constructs that encompass all 10 INTASC (1992) principles with items for the knowledge (15 items), skill (20 items), and disposition (14 items) indicators (see Table 1).
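A retention rule of this kind can be sketched as follows. The .50 cutoff is the one stated in the text; the exact decision rule (here: keep an item only if its larger absolute loading reaches the cutoff, and assign it to that factor) is an assumption, and the third item below is invented to show a dropped case. The first two loadings are taken from Table 1.

```python
# Item -> (loading on Factor 1, loading on Factor 2).
# First two rows are from Table 1; the third is a hypothetical weak item.
loadings = {
    "I connect students' prior knowledge with current learning.": (.606, .190),
    "I honor my commitments.": (-.035, .910),
    "hypothetical weak item": (.45, .38),
}

def assign(f1, f2, cutoff=0.50):
    """Assign an item to the factor with the larger absolute loading,
    or drop it (return None) if neither loading reaches the cutoff."""
    if max(abs(f1), abs(f2)) < cutoff:
        return None  # item removed from the survey
    return "F1" if abs(f1) >= abs(f2) else "F2"

assignments = {item: assign(*ld) for item, ld in loadings.items()}
```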

When I asked the students to reflect upon the results, they indicated that they were not surprised that the factor analysis clustered the knowledge and skill items together in the dominant factor and the disposition items into the secondary factor. Because the doctoral students are practicing administrators with many years of experience working with teachers, they know that teachers must possess both subject matter knowledge (Shulman 1986) and pedagogical skills (Banks et al. 2005; Grant and Gillette 2006; Leahy, Lyon, Thompson, and Wiliam 2005; LePage et al. 2005; Shepard et al. 2005) to be effective teachers. In addition they realize that teachers’ dispositions are the bridge between knowledge and skills that enable teachers to be effective with all students (Grant and Gillette 2006; Sockett 2006). All but one of the skill items and one of the knowledge items loaded on the knowledge and skills factor, and all but three of the disposition items loaded on the dispositions factor. Although the assignment of items to factors based on the factor analysis was not perfect, it was quite impressive given the relationship among knowledge, skills, and dispositions.

Table 1. College of Education Follow-Up Survey Items with INTASC Principles and Indicators and Factor Loadings

College of Education Follow-Up Survey

Knowledge and Skills Subscale Item | P | I | F1 | F2
1. I connect students’ prior knowledge with current learning. | 1 | S | .606 | .190
2. I develop lessons that engage students in critical thinking and problem solving. | 1 | S | .550 | .280
3. I develop lessons that encourage students to consider ideas from diverse perspectives. | 1 | S | .591 | .125
4. I stimulate students’ interests. | 1 | D | .531 | .273
5. I know how to use a variety of instructional strategies that promote student learning. | 2 | K | .684 | .117
6. I know how to adapt my instruction to meet the developmental needs of students. | 2 | K | .502 | .355
7. I understand the connection between physical, social, emotional, moral, and cognitive performance. | 2 | K | .526 | .382
8. I design instruction that meets students’ cognitive, social, emotional, moral, and physical needs. | 2 | S | .662 | .332
9. I actively engage all students in learning. | 2 | S | .522 | .457
10. I provide appropriate feedback to encourage students in their development. | 2 | D | .512 | .429
11. I know how to design instruction that helps use students’ strengths as the basis for growth. | 3 | K | .597 | .439
12. I understand how students’ learning is influenced by their culture. | 3 | K | .540 | .271
13. I differentiate instruction to meet individual learning styles/needs. | 3 | S | .527 | .466
14. I understand the cognitive processes associated with various kinds of learning. | 4 | K | .754 | .192
15. I understand the advantages of using a variety of strategies in the teaching-learning process. | 4 | K | .603 | .310
16. I monitor and adjust teaching strategies in response to student feedback. | 4 | S | .645 | .239
17. I present concepts clearly and accurately. | 4 | S | .504 | .486
18. I know a wide range of strategies to promote positive relationships and purposeful learning. | 5 | K | .572 | .474
19. I know how to help students become self-motivated. | 5 | K | .655 | .357
20. I create a classroom environment where students engage in purposeful learning activities. | 5 | S | .635 | .364
21. I effectively use a wide repertoire of classroom management techniques. | 5 | S | .599 | .349
22. I know techniques for effective verbal and nonverbal communication. | 6 | K | .633 | .250
23. I model effective communication strategies. | 6 | S | .648 | .415
24. I use a variety of questioning strategies to stimulate critical thinking and problem solving. | 6 | S | .586 | .203
25. I plan lessons using effective instructional strategies. | 7 | S | .693 | .178
26. I plan activities that promote learning for all students. | 7 | S | .602 | .438
27. I reflect upon the effectiveness of both short-range and long-term plans. | 7 | S | .588 | .473
28. I know how to evaluate students’ learning using formative and summative assessments. | 8 | K | .679 | .127
29. I understand assessment related issues, such as reliability, validity, bias, and scoring concerns. | 8 | K | .595 | -.029
30. I use a variety of assessment strategies to measure students’ learning. | 8 | S | .768 | .097
31. I use assessment results to evaluate students’ progress and modify instruction. | 8 | S | .705 | .074
32. I involve students in self-assessment of their learning. | 8 | S | .655 | .175
33. I effectively communicate students’ progress to students, parents, and colleagues. | 8 | S | .611 | .215
34. I understand the impact that self-assessment and reflection have on teaching and learning. | 9 | K | .536 | .212
35. I know how to locate and use best practices. | 9 | K | .756 | .148
36. I engage in research-based teaching practices. | 9 | D | .545 | .365

Dispositions Subscale Item | P | I | F1 | F2
1. I demonstrate qualities of humor, empathy, and warmth with others. | 5 | D | .251 | .674
2. I treat students with dignity and respect at all times. | 5 | D | .323 | .574
3. I am patient when working with students. | 5 | D | .389 | .531
4. I am a thoughtful and responsive listener. | 6 | D | .365 | .587
5. I assume responsibility when working with others. | 7 | D | .357 | .667
6. I cooperate with colleagues in planning instruction. | 7 | D | .300 | .519
7. I collaborate with my professional colleagues to grow professionally. | 9 | S | .340 | .516
8. I am punctual and reliable in my attendance. | 9 | D | .041 | .789
9. I maintain a professional appearance. | 9 | D | .003 | .861
10. I honor my commitments. | 9 | D | -.035 | .910
11. I am willing to receive feedback and assessment of my teaching. | 9 | D | .051 | .795
12. I uphold the laws and ethical codes governing the teaching profession. | 9 | D | .013 | .762
13. I understand how my school is an organization that operates within the larger community. | 10 | K | .348 | .578

Note. Column P indicates the INTASC (1992) principle represented by the item; Column I indicates the INTASC principle indicator represented by the item (K = Knowledge, S = Skills, D = Dispositions); Column F1 indicates the loading on Factor 1; Column F2 indicates the loading on Factor 2.

6.2 Reliability Analysis

The students estimated reliability using Cronbach’s alpha for each of the two COE Follow-Up Survey subscales. The reliability estimate for the 36-item knowledge and skills subscale was .97. The mean of the corrected item-total correlations was .66 (SD = .06). The reliability estimate for the 13-item dispositions subscale was .92. The mean of the corrected item-total correlations was .67 (SD = .08). When I asked the students to reflect upon the reliability estimates, they said the coefficients indicated that respondents were very consistent in their responses to items measuring each construct. They also stated that the coefficients were well above the acceptable level of .70 (Cortina 1993).
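Coefficient alpha, which the students computed in SPSS, has a simple closed form: for k items, alpha = k/(k-1) x (1 - sum of item variances / variance of total scores). A pure-Python sketch with invented response data illustrates the computation; it is not the study's data or SPSS output.

```python
from statistics import variance

def cronbach_alpha(rows):
    """Coefficient alpha for rows = list of respondents,
    each respondent a list of item scores on the same scale."""
    k = len(rows[0])
    items = list(zip(*rows))                       # one tuple per item
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(r) for r in rows])   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Perfectly parallel items (every respondent answers all items
# identically) yield the maximum alpha of 1.0.
perfect = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]

# Slightly noisy but still highly consistent responses (invented)
# yield a high alpha, in the spirit of the .97 and .92 reported above.
noisy = [[5, 4, 5], [4, 4, 4], [3, 3, 4], [2, 2, 2], [1, 2, 1]]

print(cronbach_alpha(perfect), cronbach_alpha(noisy))
```

Because the formula compares the sum of item variances with the variance of the total score, alpha rises as items covary more strongly, which is why consistent responding across the 36 knowledge and skills items produced such a high coefficient.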

7. Conclusion

This service learning statistics course project served two unique purposes. First, it provided a way to model and teach best practices in scale development to our doctoral students who are practicing administrators. The students need the information to develop sound assessment instruments to measure student achievement, staff effectiveness, and graduates’ and community members’ perceptions of their school programs. Second, the course project provided our university with a follow-up survey that has many potential uses for our teacher preparation programs. Currently, it is being used to (a) assess graduates’ and their employers’ perceptions of the effectiveness of our university’s teacher preparation programs and (b) provide university administrators and faculty members with information for program evaluation, improvement, and accreditation (Fitzpatrick, Sanders, and Worthen 2003). In the future, the data from the employer version of the survey could be analyzed to help further establish the survey’s construct validity and reliability.

Through the course project the students learned the scale development processes of adopting a framework, developing items, providing evidence of content validity, conducting a pilot test, and analyzing data (DeVellis 2003). The students’ grades on the final exam ranged from 84% to 100% with a mean of 95.88%. Their performance on the final exam indicated that they had learned how to appropriately analyze and interpret data using factor and reliability analyses. The students said that they enjoyed the course project and appreciated that their efforts resulted in an assessment instrument that was being used to assess the university’s teacher preparation programs. One of the students in the statistics course presented the COE Follow-Up Survey project at a national conference, where it was well received. Former students have asked what scale development project we are working on in the statistics course this semester, which indicates to me that the students feel the projects are worthwhile and beneficial. I plan to continue to conduct service learning scale development projects in the statistics course in the future.


Acknowledgements

The author would like to thank the members of the item development panel for their efforts in developing the COE Follow-Up Survey items. The members included the following doctoral students enrolled in my applied advanced statistics course: Shari Hoffman, Robert Ingram, Kraig Lofquist, Andrew Rikli, Dorothy Sansom, Peter Smith, Tami Williams, and Joan Wilson and the following community members: Cheryl Heineman-Pitt, Kay Keiser, Sean Leverty, Jane Pille, and Char Riewer.


References

Banks, J., Cochran-Smith, M., Moll, L., Richert, A., Zeichner, K., LePage, P., Darling-Hammond, L., Duffy, H., and McDonald, M. (2005), "Teaching Diverse Learners," In L. Darling-Hammond, and J. Bransford (eds.), Preparing Teachers for a Changing World: What Teachers Should Learn and Be Able to Do, San Francisco: Jossey-Bass, pp. 232-274.

Cortina, J. M. (1993). "What is Coefficient Alpha? An Examination of Theory and Applications, " Journal of Applied Psychology, 78(1), 98-104.

Crocker, L., and Algina, J. (1986), Introduction to Classical and Modern Test Theory, New York: CBS College Publishing.

DeVellis, R. F. (2003), Scale Development: Theory and Applications (2nd ed.), Thousand Oaks, CA: SAGE Publications.

Fitzpatrick, J. L., Sanders, J. R., and Worthen, B. R. (2003), Program Evaluation: Alternative Approaches and Practical Guidelines (3rd ed.), Boston, MA: Allyn and Bacon.

Grant, C. A., and Gillette, M. (2006), "A Candid Talk to Teacher Educators about Effectively Preparing Teachers Who Can Teach Everyone’s Children," Journal of Teacher Education, 57(3), 292-299.

Guskey, T. R. (2005), "Mapping the Road to Proficiency," Educational Leadership, 63(3), 32-38.

Interstate New Teacher Assessment and Support Consortium (INTASC). (1992), Model Standards for Beginning Teacher Licensing, Assessment and Development: A Resource for State Dialogue, Washington, DC: Council of Chief State School Officers. Retrieved on November 26, 2005, from http://www.ccsso.org/content/pdfs/corestrd.pdf

Kachigan, S. K. (1991), Multivariate Statistical Analysis: A Conceptual Introduction (2nd ed.), New York: Radius Press.

Leahy, S., Lyon, C., Thompson, M., and Wiliam, D. (2005), "Classroom Assessment: Minute by Minute, Day by Day," Educational Leadership, 63(3), 18-24.

LePage, P., Darling-Hammond, L., Akar, H., Gutierrez, C., Jenkins-Gunn, E., and Rosebrock, K. (2005), "Classroom Management," In L. Darling-Hammond, and J. Bransford (eds.), Preparing Teachers for a Changing World: What Teachers Should Learn and Be Able to Do, San Francisco: Jossey-Bass, pp. 327-357.

Schulte, L.E., Edick, N., Edwards, S., and Mackiel, D. (2004), "The Development and Validation of the Teacher Dispositions Index," Essays In Education, 12.

Shepard, L., Hammerness, K., Darling-Hammond, L., Rust, F., Snowden, J. B., Gordon, E., Gutierrez, C., and Pacheco, A. (2005), "Assessment," In L. Darling-Hammond, and J. Bransford (eds.), Preparing Teachers for a Changing World: What Teachers Should Learn and Be Able to Do, San Francisco: Jossey-Bass, pp. 275-326.

Shulman, L. S. (1986), "Those Who Understand: Knowledge Growth in Teaching," Educational Researcher, 15(2), 4-14.

Sockett, H. (ed.), (2006), Teacher Dispositions: Building a Teacher Education Framework of Moral Standards, Washington, DC: AACTE Publications.

Wiggins, G., and McTighe, J. (2006), "Examining the Teaching Life," Educational Leadership, 63(6), 26-29.


Laura E. Schulte, Ph.D.
University of Nebraska at Omaha
Teacher Education Department
6001 Dodge Street
Kayser Hall 314
Omaha, NE 68182-0163
lschulte@mail.unomaha.edu

