
Running head: TEACHER TECHNOLOGY USAGE

A Survey of Technology Training, Knowledge, Usage, and Attitudes

of Southern California Primary and Secondary School Teachers

Richard Wainess

University of Southern California

Abstract

Competence with and use of technology, inside and outside the classroom, are becoming requirements for primary and secondary school teachers. Federal and state educational departments are requiring teachers to learn and use technology, universities have begun to offer technology courses to their education students, and staff development courses in technology are being offered to inservice teachers, yet it is unclear whether these training programs are effective. Research has shown that effective training requires diversity, contextualized training or information, peer communication, and convenient and ongoing instruction. The purpose of this study is to discover, through a survey instrument, whether teachers are receiving the training they need to become comfortable using technology for administrative purposes, teaching with technology, and teaching technology.

A Survey of Technology Training, Knowledge, Usage, and Attitudes

of Southern California Primary and Secondary School Teachers

The availability of computers for student use in public schools has changed significantly in recent years. In the last decade, the student-to-computer ratio has moved from 26.7:1 to 5.7:1, and it is estimated that, since 1991, $19 billion has been spent on creating the information technology infrastructure for the nation’s public schools (Dorman, 2001). As society continues to rely more and more on computers, current and future educators must keep up to date with technology to provide meaningful educational experiences for their students (McCannon & Crews, 2000). However, despite the growing access to technology, only 20 percent of the nation’s 2.5 million public school teachers are comfortable with using computers in the classroom (Dorman, 2001). Adding to this problem, students today often know more about computers and technology than their teachers (Clifford, 1998). “Clearly, the need for teacher training in the use of information technology is crucial in all schools” (Lanier & White, 1998, p. 2).

Most technology planning concentrates on hardware and software, not on staff development (Benson, 1997), yet without proper training and motivation, the technology will probably not be used. Staff development must be factored into school technology plans (Clifford, 1998). Currently, 38 states have technology requirements in their teacher preparation programs (Hornung & Bronack, 2000). For example, Title 5 Regulation, in Section 44161.7 of the California Education Code (California State Legislature, 1997), requires preservice and inservice teachers to take an educational computing course. These courses are typically designed to teach basic computing skills as well as integration of computers into the classroom (Yildirim, 2000). However, according to Smith (2000), today’s teachers are not entering the classroom well prepared to use technology. Many teachers feel they have received inadequate training, even though most schools already have some form of technology and those same teachers are expected to be competent in its use for delivery and support of instruction (Clark, 2000). Technology education has not kept pace with the rapid changes in computer technology, and many teachers find their experience with the practical use of computers lacking. Adding to this problem, Hornung and Bronack (2000) state that many education faculty do not receive the training they themselves need to model proper or effective use of technology. They often do not understand the impact of technology on K-12 classrooms and have not adjusted their own teaching methodologies to reflect those changes. This lack of modeling is detrimental to teacher technology training (Hornung & Bronack, 2000).

Background

For students to benefit from technology, they must have teachers who are trained in the use of those technologies for education. Teachers must integrate technology into the curriculum as a tool for students to use for information gathering, data manipulation, problem-solving, and critical thinking (Tinson, 1996). However, where technology training does exist, its design is often inadequate to support teachers’ needs (McCannon & Crews, 2000). Research has shown that effective teacher technology training requires a blend of up-to-date technology instruction (Milbrath & Kinzie, 2000) that supports students with diverse levels of prior computer knowledge and abilities (Yildirim, 2000), contextual and best practice resources for applying technology to a wide range of subjects and classroom situations (Linnell, 2000), ongoing peer communication (Jones, 2001a), and convenient and ongoing delivery of instruction (Hornung & Bronack, 2000). If current and future teachers are to value technology and help students become comfortable with its use, teachers must have access to appropriate training, resources, and activities (Linnell, 2000). Milbrath and Kinzie (2000) have shown that course exposure and frequent use of technology lead to increased comfort levels as well as embracing of technology in the classroom.

Teacher preparation programs must provide their students with many and varied experiences (Clifford, 1998). TERC, a nonprofit educational research and development organization, has found that, to support teacher integration of tools and technology-related content into their classrooms, training must embrace a broad and comprehensive approach to professional development that incorporates pedagogical objectives and is responsive to individual teacher needs (Grant, 1999). Krueger, Hansen, and Smaldino (2000), too, state that teachers must know more than just how to use technology. They must know how to use technology to enhance learning. For example, one fifth-grade classroom used a spreadsheet to plan a party within a budget (Grant, 1999).

According to Benson (1997), schools are finding that traditional models of staff development, particularly one-time inservice training, are ineffective for teaching computer use and for helping teachers develop methods to integrate computers as instructional tools. Staff development programs must meet teachers’ diverse, ongoing, and ever-changing technology needs (Benson, 1997). To support varying student needs, Yildirim (2000) suggests assessing teachers’ computer competency levels before they enroll in courses. Even with an effective training program, though, many teachers have not adopted technology as a tool for use in their curriculum simply because they do not have time to attend classes (Hornung & Bronack, 2000). One solution is to provide instruction online (Jones, 2001a). In addition, to support learning and application of technology, effective staff development for teachers should allow teachers to interact with one another for technical support and classroom ideas (Jones, 2001a).

As an added benefit to technology training, research shows that as computer knowledge increases, anxiety levels decrease and comfort levels increase (Barker, 1994; Laffey & Musser, 1998; McCannon & Crews, 2000; Shick, 1996), and classroom use increases (Larner & Timberlake, 1995; McCannon & Crews, 2000).

Statement of the Problem

While teacher technology training exists, there is little information regarding its effectiveness in teaching teachers, its effect on teachers’ comfort levels in using technology, and how teachers are using the knowledge they have gained. In addition, little research has been conducted on the differences between the training delivered to inservice and preservice teachers. Furthermore, with the rapidly changing nature of technology, it is imperative to evaluate current teachers and their knowledge and use of current technology, so schools will know whether teachers are capable of using or teaching current systems and are keeping pace with their students.

Since computers first entered the schools, educational institutions have struggled with how to embed technology into the teaching process, how to train teachers in the use of those technologies as teaching tools, and how to prepare students for the workplace (Yildirim, 2000). While federal, local, and district mandates exist, the implementation of solutions to those mandates appears to be inconsistent and often flawed. Many teachers are not learning what they should. Many teachers are learning to use the tools but not how to apply them to the classroom (Grant, 1999; Jones, 2001a, 2001b). Many teachers feel alone and unsupported in their technology education, with no place to turn for guidance or support. And many teachers do not have the time to travel to or attend technology courses (Grant, 1999; Jones, 2001b).

It is unclear whether the current training models are effective—whether teachers are learning what they need to know to use technology and to incorporate it into the classroom. From the current research, it appears there is little or no training available that incorporates all of the necessary elements of an effective technology training program: up-to-date training that supports diverse levels of prior knowledge and experience, contextual support and resources, peer communication, ongoing support, and convenience. The first step to creating a new, effective program is to analyze the results of current systems, to determine whether teachers feel they have received the training they need to use technology, to teach with technology, and to teach technology.

Purpose of the Study

This study will investigate teachers’ attitudes regarding technology and their comfort with using technology inside and outside the classroom. Combining quantitative and qualitative approaches, the study will collect data from teachers in several southern California school systems to determine the effectiveness of current technology training and whether training is comparable across districts and among schools.

Significance of the Study

While technology training currently exists, it is unclear whether it is effective and whether it meets all the requirements of effective technology instruction. This study will assess teachers from multiple districts who have been exposed to technology training, to determine whether training practices have been effective in preparing educators to use, teach with, and teach technology; which training methods have been effective; whether comfort levels are associated with specific training methods; and whether current practices are lacking. No study to date has compared the effectiveness of training programs across districts or from multiple training sources, or examined how different programs might result in varied comfort and usage levels. This study will provide insight into the possible need to establish more defined training requirements and more consistent training practices.

Research Hypotheses and Questions

1. How comfortable are teachers using technology on the job?

H1: Different training methods will result in different comfort levels.

H2: All training methods will result in reduced anxiety levels toward computer usage.

2. How useful do teachers consider technology for their job?

H3: Teachers who have received technology training will perceive technology as more useful than those who have not received technology training.

H4: There is a relationship between a teacher’s age and the teacher’s attitude toward the usefulness of technology.

3. Do contextually based training experiences increase the likelihood that teachers will use technology in the classroom?

H5: There is a positive relationship between the willingness of a teacher to use technology as a teaching tool and how much contextual training or information the teacher has received.

4. What is the effect of technical support on a teacher’s willingness to use technology?

H6: There is a positive relationship between a teacher’s willingness to use technology on the job and the availability of qualified technical support.

5. What is the role of convenient training (e.g. online training) on a teacher’s willingness to access technology instruction?

H7: There is a positive relationship between training convenience and a teacher’s willingness to learn technology.

6. What is the effect of peer communication on a teacher’s use of and attitude towards technology in education?

H8: The greater the peer communication, the more likely a teacher is to use technology as a teaching tool.

H9: There is a positive relationship between peer communication regarding technology and comfort with technology.

Overview of the Method

This study will use a survey instrument administered to 900 teachers from three school districts (Los Angeles, Long Beach, and Orange Unified School Districts) to assess teachers’ attitudes regarding their abilities to use technology, deliver technology-based instruction, and teach technology, based on how the technology was learned, the teacher’s access to contextually based training or information, his or her age, his or her experience with peer communication, and the availability of technical support. It will also assess perceived usefulness of and comfort levels toward technology, and will determine whether teachers would be more likely to access learning opportunities if the training were more convenient. Using a stratified random sampling method, the 900 participants will represent 45 schools: 5 each of elementary, middle, and high schools from each of the three districts. Teachers will be sent the survey with an explanation of its purpose and a letter of support from their district’s school board. Teachers will be asked to complete the form and return it in the enclosed, self-addressed stamped envelope. Teachers will be informed that their responses will remain confidential and that results will be reported by district and by school type (e.g., elementary schools), but not by school. The survey contains demographic questions, including number of years teaching. The majority of the survey consists of questions about preservice and inservice training experiences: where and how software training was administered, frequency of software use (from never to daily), and comfort level (strongly uncomfortable to strongly comfortable) using the software for different academic purposes (administrative, as a teaching tool, or to teach the software). In addition, questions have been included to assess teachers’ attitudes toward the usefulness of computers and teachers’ comfort or anxiety levels regarding technology.
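For analysis, ordinal survey responses such as these are typically coded as numbers. The sketch below illustrates one plausible coding scheme; note that the text names only the endpoints of each scale (never to daily, strongly uncomfortable to strongly comfortable), so the intermediate labels here are placeholders, not the instrument’s actual response categories.

```python
# Illustrative numeric coding for the survey's ordinal scales.
# Only the endpoint labels come from the text; the intermediate
# labels ("rarely", "monthly", "neutral", etc.) are placeholders.
FREQUENCY = {"never": 0, "rarely": 1, "monthly": 2, "weekly": 3, "daily": 4}
COMFORT = {
    "strongly uncomfortable": 1,
    "uncomfortable": 2,
    "neutral": 3,
    "comfortable": 4,
    "strongly comfortable": 5,
}

def code_response(frequency_label, comfort_label):
    """Map one respondent's scale labels to numeric scores."""
    return FREQUENCY[frequency_label], COMFORT[comfort_label]
```

Numeric codes of this kind would feed directly into the descriptive statistics and reliability analyses the study plans to report.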

Assumptions

It is assumed that participants will be honest in their responses to the survey. It is assumed that the teachers surveyed will be representative of the population of teachers in the three school districts surveyed. It is assumed that, due to size of the sample (900 surveys will be sent), those surveyed will represent a diversity of training experiences, a diversity of technology attitudes, and a diversity of technology usage.

Limitations

There are three obvious limitations to this study. First, only southern California teachers will be assessed, and one of the three districts (Orange) represents more affluent, primarily white communities, which might mean better implementation and support of technology in its classrooms. Second, the assessment is based on self-evaluation, which relies on potentially biased self-opinion. Third, while based heavily on a prior instrument with established validity and reliability, the complete assessment instrument for this study is currently untested for validity and reliability. The partial instrument used for the pilot study contains two sections, and only the second section has been tested for reliability (alpha = 0.8701).
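In survey research of this kind, a reported reliability alpha is usually Cronbach’s alpha, a measure of internal consistency across a scale’s items. Assuming that is the statistic intended here, it can be computed from a respondents-by-items score matrix as in the sketch below; the function name and data shape are illustrative, not taken from the study.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]                              # number of items in the scale
    item_variances = X.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = X.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
```

Values near 1 indicate high internal consistency; an alpha of .87, as reported for the pilot’s second section, is conventionally considered acceptable for a research instrument.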

Delimitations

There are many individual differences that can affect computer acceptance, adoption, and use, such as type of prior computer experience (e.g., programming, multimedia authoring, spreadsheet use, database development, or word processing). Each of these variants of use has been shown to affect attitudes toward computer use (Reed, Oughton, Ayersman, Ervin, & Giessler, 2000). In this study, type or degree of prior computer experience will not be evaluated. Therefore, results will not account for these types of effects. Because the study surveys teachers in three southern California school districts, the results might only be generalizable to those three districts.

Review of the Literature

In the next decade, the United States will need over 2.2 million new teachers (Smith, 2000) and technology training is now a critical component in learning (Clifford, 1998). National and state teacher accreditation agencies have taken notice of this need (Smith, 2000), yet research shows that “traditional professional development activities are often short term, devoid of adequate follow up, and do not address school contexts” (Jones, 2001a, p. 2), and traditional programs are fragmented and unconnected to real classroom experiences (Jones, 2001a). According to Vojtek and Vojtek (2000), “while there is wide agreement about the importance of technology in classroom instruction, learning about technology is not central to teacher preparation programs” (p. 1). In addition, current technology training is usually about learning technology and not about applying those skills to educational content or pedagogy, leaving teachers with the task of determining on their own how to adapt what they’ve learned to the classroom (Vojtek & Vojtek, 2000).

A number of federal, state, and local programs have been created to either establish teacher technology training or to offer funding to such programs. In 1997, the National Council for Accreditation of Teacher Education (NCATE) formed a task force to determine what knowledge and skills new teachers should have with regards to technology. In 1999, the United States Department of Education began the Preparing Tomorrow's Teachers to Use Technology (PTTT) program that, in its first year of funding, provided 75 million dollars to colleges and universities across the country through various grants (Smith, 2000). Jones (2001) reports that PTTT has requested $150 million to provide technology training for new teachers. The University of Northern Iowa (UNI) Teacher Education Faculty has developed and adopted Preservice Teacher Technology Competencies, which is based on standards from several national agencies including the International Society for Technology in Education (ISTE; Krueger et al., 2000). The CEO Forum on Education and Technology, a partnership between industry and education leaders, developed the Teacher Preparation STaR Chart: A Self-Assessment Tool for Colleges of Education, which enables schools, colleges, and departments of education to assess their level of readiness in preparing tomorrow's teachers to use technology (Smith, 2000; Vojtek & Vojtek, 2000). Other states are developing similar partnerships (Smith, 2000). Case studies, which highlight how various colleges of education are integrating technology tools in teacher curriculum and exposing prospective teachers to technology-rich K-12 classroom environments, can be seen at http://www.ncate.org/projects/tech/Caseintr.html (Smith, 2000, p. 4). Online resources for teachers include 21st Century Teachers (http://www.nekesc.org/kids/21tea.html; Jones, 2001a, p. 2) for teachers seeking technology help, the 21st Century Teacher Network (http://www.21ct.org; Jones, 2001b, p. 1), which enables teachers to exchange technology ideas with peers from around the world, and Classroom Connect (http://www.classroom.net; Jones, 2001a, p. 2) for K-12 teachers to interact in online discussions and to obtain expert advice.

While a large number of online and traditional environments exist to train teachers in the use of technology and to provide peer communication, the effectiveness of those programs in preparing teachers to use technology in the classroom and to teach technology to their students is unclear. Are teachers learning to teach with technology and to teach technology, or are they simply learning the basics of how computers work and how to use computers as an administrative tool?

In a 2000 National Center for Education Statistics (NCES) survey, only one-third of full-time regular public-school teachers reported feeling well prepared or very well prepared to use computers for classroom instruction (Jones, 2001b). In a survey conducted by McCannon and Crews (2000) of inservice teachers in the state of Georgia, 78 percent of respondents reported using the computer for administrative tasks, such as test creation and grading, while only 31 percent reported using computers to enhance lectures or presentations, and only 28 percent used computers to introduce new material (McCannon & Crews, 2000). Respondents ranked word processing classes as first in importance and computer-curriculum integration classes second (McCannon & Crews, 2000). According to respondents to a recent NCES survey, 77 percent had participated in 32 or fewer hours of technology training and felt they had not received the technology training necessary to incorporate technology into their classrooms (Jones, 2001a). Similarly, Dorman (2001) found that only 53 percent of teachers with computers in their schools used them for classroom instructional purposes. In a phone survey conducted by Linnell (2000) of faculty at university schools of education that offered ESTE (elementary school technology education) courses in their curriculum, respondents indicated that the majority of their ESTE courses were electives. Only five universities in the United States had a required ESTE course. Overall, there are few programs that prepare teachers to implement elementary technology education (Linnell, 2000). At one school that provided its own ad hoc staff development, administrators and staff realized their instruction “in no way replaces the need for systematic technology training at the district and school levels” (Booher & Taylor, 1999, p. 1). To be effective, teacher technology training must be determined, sustained, varied, calculated, supported, and frequently individualized (Clifford, 1998).

Training Requirements

Research has identified several elements that contribute to effective teacher technology training: training that results in teachers becoming comfortable with computers and using what they have learned to develop technology-embedded instruction, to teach using technology, and to teach technology. The individual elements are diverse instruction that supports varied levels of prior knowledge and experience (Yildirim, 2000); contextual lessons or materials that relate computer abilities to teaching needs and practices (Linnell, 2000); peer communication for support and advice (Jones, 2001a); and convenient and ongoing instruction that fits teachers’ schedules, provides timely or just-in-time training, refreshes knowledge, and keeps teachers up to date with technology (Benson, 1997; Grant, 1999; Hornung & Bronack, 2000; Jones, 2001b; Milbrath & Kinzie, 2000).

Need for diversity in instruction. Researchers agree that different teachers need different lessons. In a study by Yildirim (2000), the level of benefit respondents received from a basic computing course depended on their prior experience with computers: the lower the prior knowledge, the more beneficial the course. Advanced users indicated the course would have been beneficial if the content had been determined according to their needs (Yildirim, 2000). Benson (1997) also found that, in a group presentation, some teachers needed a slower pace of instruction while others wanted to move faster; a single group experience could not satisfy diverse interests and ability levels (Benson, 1997). Clifford (1998) writes, “Frequently, people are at different places technologically, so individual differences need to be considered” (p. 3). In addition to satisfying varied levels of competency, individual contextual needs should be addressed as well. Technology training curricula should be designed to meet specific technology needs: the needs of the school where the teacher works (Yildirim, 2000). Lee (1997) suggests that teachers who are anxious about using technology and have low prior knowledge of computers might benefit from discussions, demonstrations, and information on the values and importance of using computer technology.

Need for contextual lessons or materials. According to Ropp (1999):

If preservice or inservice teachers demonstrate proficiency integrating technology into their teaching but do not believe that technology has a use in the classroom, they will probably not teach with technology despite their proficiency. In this respect, attitudes and beliefs about teaching with technology in a subject matter area could vitiate well-planned instruction in teacher preparation. (p. 2)

Teachers should be given training that is relevant to their work, interests, or area of specialty (Lee, 1997). Krueger et al. (2000) stated that there must be a balance between learning skills and understanding how technology can support learning. Instruction should guide students in how technology can be used within their own curriculum, helping them understand the role of technology in learning. Further, Lee (1997) suggested that teachers with low computing abilities would benefit from learning why, how, and when to use technology.

Some teachers have concerns about using technology for instruction: how it might affect the developmental needs of students, how it can support various curricular goals (Clark, 2000), or how to develop lesson plans that incorporate new technology (Jones, 2001a). Contextual classroom examples or lesson plans using technology can bridge the gap between fear of technology and its embrace as a teaching tool. Warner and Akins (1999) suggested that teachers be introduced to tools through demonstrations in a performance context. For example, Grant (1999) commented, “spreadsheets are often introduced through a set of isolated instructions without applying their power in a meaningful context” (p. 2). Tools that teachers learn to use can become tools they teach their students to use. Lanier and White (1998) suggested using computers as a means of independent learning. Teachers can use computers to search online sites and services for professional information in such areas as reading, math, science, music, and special education, and similarly, teach their students to search online databases for information related to historical, scientific, or literary topics (Lanier & White, 1998). Grant (1999) found positive results from a two-day workshop on tools training that not only taught the tools but also explored the power of the tools for communicating and making sense of data, and provided examples of how the tools could be used to achieve content goals when applied to classroom practice. The workshop helped teachers step back, consider their goals, and think about how technology might support those goals (Grant, 1999).

Need for peer communication. The first place teachers look for technology help is their peers (Jones, 2001b). Faculty members who have been trained should be given time together to discuss successes or problems encountered and to develop methods of appropriate instruction (Clifford, 1998). Warner and Akins (1999) found teachers benefited from group discussions after training on how to use their newfound technology knowledge in the classroom and how to develop websites to support their students. Teachers discussed the viability of technology to assist in the development of problem-solving activities and discovery learning activities. Jones (2001a) discussed the effectiveness of online discussion groups, forums, e-mail lists, bulletin boards, message boards, and chat rooms for peer communication. Hornung and Bronack (2000) found that teachers who encounter problems while teaching with technology and have support are more willing to keep trying; equally, the opposite is true: teachers who lack support and encouragement are more likely to give up on technology-enriched lessons. Linnell (2000) commented:

Developing exemplary preservice elementary technology education curricula that are open to inclusion of other disciplines, open to advice and change, and proactive in seeking participation of faculty from math, science, and elementary education will create a greater understanding among faculty and teachers about how ESTE can benefit young schoolchildren. (p. 4)

Need for convenient learning and continued support. Smaldino and Muffoletto (1997) commented that technology training cannot be done as a single event. Yet, according to Jones (2001b), research shows that traditional professional development activities are often short term and devoid of adequate follow-up. Adding to the problem, in a survey of Georgia elementary school teachers, reasons stated for not participating in staff development courses included the distance to travel after school to attend the training and the heavy traffic where the courses were offered (McCannon & Crews, 2000). Many teachers complain that the lack of available technology training prevents them from taking advantage of the wide range of educational technology (Jones, 2001a). Booher and Taylor (1999) found that teachers responded well to short, 15-20 minute lessons.

Gaps in the Literature

According to Yildirim (2000), more than 1,300 institutions of higher education prepare future teachers in the United States and within the next decade schools will need to hire nearly two million teachers. Teachers teach as they have been taught, and it is unlikely that computer skills will be transferred to students and encouraged by teachers unless the teachers have positive attitudes toward computer use (Yildirim, 2000). It is important to prepare future teachers to use the technology already present in most schools, to adapt to new technology, and to critically examine technology for instruction (Clark, 2000).

While technology training currently exists, it is unclear whether it is effective and whether it meets all the requirements of effective technology instruction. This study will assess teachers who have been exposed to technology training, to determine if current training practices are effective in preparing educators to use, teach with, and teach technology or whether current practices are lacking. In addition, this study will examine the attitudes of teachers in multiple districts who have been taught technology through varied training experiences. No study, to date, has compared the effectiveness of training programs across districts or from multiple training sources. This study will provide insight into the potential need to establish more defined training requirements and more consistent training practices.

Methods

Overview of the Problem

According to Dorman (2001), the availability of computers for student use in public schools has increased significantly in recent years. However, despite the growing access to technology, only 20 percent of the nation’s 2.5 million public school teachers are comfortable with using computers in the classroom (Dorman, 2001). Most technology planning concentrates on hardware and software, not on staff development (Benson, 1997), yet according to Clifford (1998), staff development needs to be factored into school technology plans. Currently, 38 states have technology requirements in their teacher preparation programs (Hornung & Bronack, 2000). However, these courses are typically designed to teach basic computing skills (Yildirim, 2000), leaving teachers unprepared to effectively use technology in the classroom (Smith, 2000). Compounding the problem, the education faculty who are expected to teach the teachers are not receiving the training they need to model proper or effective use of technology (Hornung & Bronack, 2000).

Population

The population for this study is inservice southern California elementary school, middle school, and high school teachers, including emergency credentialed teachers, newly credentialed teachers, and experienced teachers. The population includes teachers who entered the profession both before and after local, state, and federal agencies moved to incorporate technology into education and to place more formal technology requirements on teachers.

Sample

The sample for this study will consist of 900 teachers from three southern California school districts (the Los Angeles, Long Beach, and Orange Unified School Districts). Using a stratified random sampling procedure, fifteen schools (5 each of elementary, middle, and high schools) will be randomly selected from each district. According to the National Center for Educational Statistics (NCES, 2002), the Los Angeles Unified School District (LAUSD) serves a population of 4,444,560, with 710,007 students and 33,754 teachers in 655 primary and secondary schools. The Long Beach Unified School District (LBUSD) serves a population of 507,068, with 91,465 students and 4,079 teachers in 86 primary and secondary schools. The Orange Unified School District (OUSD) serves a population of 210,820, with 30,858 students and 1,442 teachers in 41 primary and secondary schools. The LAUSD population is 45.5% white, 11.4% black or African American, and 9.4% Asian, while its school-age (18 and younger) population is 38.3% white, 11.5% black or African American, and 6.2% Asian. The LBUSD population is 47.7% white, 13.7% black or African American, and 11.8% Asian, while its school-age population is 34.6% white, 16.3% black or African American, and 11.9% Asian. The OUSD population is 70.1% white, 1.7% black or African American, and 11.2% Asian, while its school-age population is 63.8% white, 1.9% black or African American, and 10.2% Asian.

Nature of the Study

This study uses both non-experimental (descriptive) and qualitative methods. There is no control group and no variables will be manipulated. Using a survey instrument, this study will attempt to answer the following research questions.

1. How comfortable are teachers using technology on the job?

2. How useful do teachers consider technology for their job?

3. Do contextually based training experiences increase the likelihood that teachers will use technology in the classroom?

4. What is the effect of technical support on a teacher’s willingness to use technology?

5. What is the role of convenient training (e.g. online training) on a teacher’s willingness to access technology instruction?

6. What is the effect of peer communication on a teacher’s use of and attitude towards technology in education?

Instrument

This study will use a modified version of an instrument developed for a study by Milbrath and Kinzie (2000). It has been modified for applicability to teachers, for evaluating learning experiences, and for determining the types of use for various technologies. Questions were added to assess each dependent variable in this study: attitude toward computers, comfort level with computers, comfort levels using computers for administration and teaching, comfort levels teaching various computer applications, attitudes toward the value of convenient learning, attitudes toward the value of and need for peer communication, attitudes toward the need for diversity in training to support varied prior knowledge, and attitudes toward the need for contextual learning and information. The original Computer Technology Survey by Milbrath and Kinzie (2000) included three parts: demographic information and two instruments--Attitudes Toward Computer Technologies (ACT) and Self-Efficacy with Computer Technologies (SCT). The modified instrument used in this study (see Appendix B) contains two of those three original parts: demographic information and one instrument--Attitudes Toward Computer Technologies (ACT).

Section one of the modified instrument begins with demographic questions eliciting the teacher’s school, type of school (elementary, middle, or high school), years teaching, age, gender (male/female), and race/ethnicity (with the option to mark multiple ethnicities). A second part has been added to the first section to assess if and how teachers have learned nine common computer applications (word processing, e-mail, presentation, spreadsheet, web browsing, database management, multimedia authoring, web creation, and video editing); frequency of use of each application; perceived comfort using five of the nine applications (word processing, e-mail, spreadsheet, database management, and web creation) for administrative purposes; perceived comfort using four of the nine applications (presentation, spreadsheet, database management, and web browsing) as teaching tools; and perceived comfort teaching eight of the nine applications (word processing, e-mail, presentation, spreadsheet, database management, multimedia authoring, web creation, and video editing). Frequency of use will be measured using a five-point Likert-type scale (1=never, 2=once a year, 3=once a month, 4=once a week, 5=daily), with participants marking the most representative choice for each of the nine applications. Method of learning will be assessed with participants marking one or more of six options, to indicate all methods used for learning each application (1=haven’t learned, 2=self-taught, 3=academic courses, 4=online courses, 5=hands-on staff development, 6=lecture staff development). These six options are the result of a modification made after administering the pilot study; in the pilot study, method of learning used five options (1=no formal training, 2=college courses, 3=online courses, 4=hands-on workshop, 5=seminar). The instrument in Appendix B is the instrument used for the pilot study.
Perceived comfort level for using computers for administrative tasks, teaching with computers, and teaching computers will all be assessed using a four-point Likert-type scale (1=very uncomfortable, 2=slightly uncomfortable, 3=slightly comfortable, 4=very comfortable).

Additional questions, not included in the pilot study and, therefore, not appearing in Appendix B, will be added to the instrument to assess experiences with and attitudes toward peer communication, convenient learning (e.g. online, on-the-job lunch workshops, CD-ROM-based instruction), and diversity of training (training designed for varied levels of prior knowledge). Three open-ended questions, with approximately one-third of a letter-sized page available for each answer, will also be added. These questions will ask about attitudes toward technology usage at the teacher’s school, attitudes toward currently available technology training options, and what the teacher would like to see offered with regard to technology training. These items did not appear in the pilot study, but will be added and tested in a future pilot study.

Section two of the instrument, the modified ACT scale, consists of 18 items to assess perceived comfort/anxiety with (7 items) and perceived usefulness of (11 items) computer technologies. Questions were altered to specifically address teachers and the three computer uses being assessed by this instrument (administration using computers, teaching with computers, and teaching computers). In addition, two questions were added to assess teachers’ attitudes toward the benefit of students using computers. The ACT section uses a four-point Likert-type scale (1=strongly disagree, 2=slightly disagree, 3=slightly agree, 4=strongly agree). To obtain the ACT score, responses to the negatively phrased items will be re-coded (1=4, 2=3, 3=2, and 4=1) and summed for each subscale (comfort/anxiety, usefulness). The highest possible score is 28 for the perceived comfort/anxiety subscale and 44 for the perceived usefulness subscale, for a highest possible total score of 72.
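The scoring rule just described can be sketched in Python. This is a hypothetical illustration: the item numbers, the assignment of items to subscales, and the set of negatively phrased items below are placeholders, not taken from the actual instrument.

```python
# Sketch of ACT scoring: re-code negatively phrased items (1=4, 2=3, 3=2, 4=1),
# then sum each subscale. Item assignments here are illustrative only.

def reverse_code(response):
    """Re-code a 4-point Likert response: 1 -> 4, 2 -> 3, 3 -> 2, 4 -> 1."""
    return 5 - response

def score_act(responses, comfort_items, usefulness_items, negative_items):
    """responses maps item number -> raw response (1-4).
    Returns (comfort/anxiety subscale, usefulness subscale, total)."""
    adjusted = {item: reverse_code(r) if item in negative_items else r
                for item, r in responses.items()}
    comfort = sum(adjusted[i] for i in comfort_items)
    usefulness = sum(adjusted[i] for i in usefulness_items)
    return comfort, usefulness, comfort + usefulness

# Hypothetical subscale layout: 7 comfort/anxiety items (max 28) and
# 11 usefulness items (max 44), for a maximum total of 72.
comfort_items = [1, 2, 3, 4, 5, 6, 7]
usefulness_items = [8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18]
```

With no negatively phrased items and every response at the most favorable value of 4, the subscales reach their maxima of 28 and 44.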

Construct validity for the original Milbrath and Kinzie (2000) instrument was tested using Principal Component Analysis with varimax rotation. Sets of item loadings indicated a valid measure of each factor extracted. High internal consistency reliability (alpha) estimates were reported for the factors on the ACT scale. The modified version of the instrument used in this study had not been tested for reliability or validity prior to the pilot study.

Procedure

At the beginning of the Spring 2003 semester, the names of all schools from the Los Angeles, Long Beach, and Orange school districts (southern California) will be added to a database management system (Access XP, 2002). To obtain this list, each district will be contacted, first by phone and then by mail, to explain the study and request the desired information. Then, each school principal will be contacted by mail, explaining the study and requesting the participation of their teachers. The principals of all schools that do not respond, or that refuse to participate, will be contacted by phone. Each school that agrees to participate will be assigned a code for school type (elementary, middle, or high) and district (LA, LB, and OC for Los Angeles, Long Beach, and Orange, respectively). A SQL query using random number generation will select 5 elementary, 5 middle, and 5 high schools from each district, for a total of 15 schools per district, or 45 schools in total.

A list of all full-time teachers, including emergency-credentialed teachers, from the 45 selected schools will be added to the Access database, with each teacher linked to his or her respective school. Using a SQL query with random number generation, 20 teachers will be selected from each of the 45 previously selected schools, for a total of 900 teachers.
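The two-stage selection (random schools within each district and school type, then random teachers within each selected school) can be sketched in Python rather than SQL. The data structures and function names here are hypothetical, not part of the study's actual Access/SQL implementation.

```python
import random

def select_schools(schools_by_district_type, per_type=5, seed=0):
    """schools_by_district_type: {district: {school_type: [school ids]}}.
    Randomly picks `per_type` schools of each type from each district."""
    rng = random.Random(seed)
    selected = []
    for district in sorted(schools_by_district_type):
        by_type = schools_by_district_type[district]
        for school_type in sorted(by_type):
            selected.extend(rng.sample(by_type[school_type], per_type))
    return selected

def select_teachers(teachers_by_school, schools, per_school=20, seed=0):
    """Randomly picks `per_school` full-time teachers from each selected school."""
    rng = random.Random(seed)
    return [t for school in schools
            for t in rng.sample(teachers_by_school[school], per_school)]
```

With 3 districts, 3 school types, 5 schools per type, and 20 teachers per school, this yields the 45 schools and 900 teachers described above.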

In February 2003, the 900 randomly selected teachers will be sent a packet containing the survey instrument, a consent form, and a letter from their respective school district board explaining the purpose of the study, promising confidentiality, and encouraging participation. The letter will state that participation is purely voluntary and there will be no repercussions for non-participation. Also included with the survey will be a self-addressed, stamped return envelope. Teachers will be asked to sign the consent form, complete the survey, and return them both using the enclosed envelope. To encourage participation, two weeks later a postcard will be sent to all participants reminding them to complete and return the survey. Two weeks after that, a “final” reminder card will be sent.

Data Analysis

Analysis will begin the first week of April 2003. The effects of demographic variables such as school, age, sex, ethnic background, and computer course experience will be analyzed using descriptive statistics, t-tests, chi-square tests, ANOVA, and correlations. Attitude variables will be analyzed using chi-square tests, ANOVA, and correlations.
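Two of the planned statistics, the Pearson correlation and the one-way ANOVA F ratio, can be computed directly from their definitions. The following pure-Python sketch is illustrative only; in the study itself these values would be produced by SPSS.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def one_way_anova_f(groups):
    """F = MS_between / MS_within for a list of groups of scores."""
    scores = [s for g in groups for s in g]
    grand_mean = sum(scores) / len(scores)
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((s - sum(g) / len(g)) ** 2 for g in groups for s in g)
    df_between = len(groups) - 1
    df_within = len(scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```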

Validity and Reliability

The ACT portion of the modified instrument used for this study was analyzed for internal consistency using Cronbach’s alpha (alpha = 0.8595). Item 24 was removed based on “alpha if item deleted,” resulting in a final alpha of 0.8701. Subscale analyses, also using Cronbach’s alpha, were performed on the comfort/anxiety and usefulness portions of the ACT scale. On the six-item comfort/anxiety scale (with item 24 removed), alpha = 0.8271. On the eleven-item usefulness scale, alpha = 0.8616.
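For reference, Cronbach’s alpha can be computed from the item-level data as shown in the minimal sketch below (in the study, SPSS performed the actual calculation; the function names here are illustrative).

```python
def sample_variance(xs):
    """Unbiased (n-1) sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item; each list covers the same respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    n = len(items[0])
    totals = [sum(item[j] for item in items) for j in range(n)]
    item_var_sum = sum(sample_variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / sample_variance(totals))
```

When every item is a copy of the same score vector, the items are perfectly consistent and alpha equals 1.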

Reliability was further examined through the pilot study, in which the instrument was administered to 21 elementary, middle, and high school teachers, as well as two university professors. Post-test discussion supported the overall ability of the questions and scales to accurately assess teacher technology usage and attitudes. One flaw uncovered in the pilot study was the response options for question 8 (how each technology was learned). While the question states that participants should only circle responses for software they have learned, not every participant appeared to have read the question carefully. Compounding the problem, the first response category, “no formal training,” may have been misinterpreted by some participants as meaning “no training.” In addition, there were concerns as to whether “seminar” also meant “staff development.” These problems will be addressed and corrected before the second pilot study. The other problem some participants had was with the terms used for certain software applications, such as “database management.” To alleviate this problem, a glossary of terms, with brief definitions and more examples than are contained in the pilot study instrument, will be added.

Pilot Study

A pilot study was conducted in May 2002. The survey instrument (see Appendix B) was administered to students and instructors of the final master’s seminar course at the Rossier School of Education at the University of Southern California; the researcher for this study was a student in that class. The survey was administered in class, on May 27, 2002, to all students who were present (21 students) and the two class professors, for a total of 23 completed surveys. As the instruments were handed out, the researcher explained the purpose of the study and told those who did not teach primary or secondary school to respond as if they did and to select one of the three school types for question 2. Students and instructors were told that, while participation was encouraged, it was not required. Participants then completed the surveys, which were collected by the researcher. An informal group discussion followed, to explore impressions and opinions about the survey, including its strengths and weaknesses.

Following administration of the survey, responses were entered into an Excel spreadsheet in preparation for import into SPSS for Windows. Two weeks later, using SPSS, the internal consistency of the instrument was analyzed; the results were discussed in the validity and reliability section of chapter 3. In addition to testing for reliability, SPSS was used to analyze some of the data. Means and standard deviations were computed for how teachers used word processing software (for administration and to teach; “to teach with” was not included, since word processing was not listed as a software to teach with). Because there were five learning methods, and participants could select more than one, eight individual or combined methods were established, based on participant responses. To simplify analysis, the researcher made assumptions about the characteristics of each method and used those assumptions to combine learning methods into three categories (1=college or self-taught, 2=hands-on workshop or online, 3=lecture-based workshop). An analysis of means was run (see Appendix A Table 1). The results indicated a significant difference (p<.05) in the effect of learning method on comfort level using word processing for administration, but not on comfort level teaching word processing. Learning in college or being self-taught resulted in greater comfort than either of the other two learning methods (hands-on or online; lecture). A one-way ANOVA was also run on the data. The between-groups results indicated F=689.557, p=0.000 for administrative comfort and F=0.859, p=0.427 for teaching comfort (see Appendix A Table 2). A Scheffe post hoc analysis for administrative comfort also indicated a significant difference between each of the three learning method groups (see Appendix A Table 3).
Similar analyses were run for the effect of learning methods for spreadsheets on comfort levels using spreadsheets as an administrative tool, as a teaching tool, and to teach spreadsheet programming. The results of the analysis (see Appendix A Tables 4 and 5) showed no significant difference for learning method on comfort levels for any of the three types of use.

A final analysis was run to test correlations between years teaching, age, totals from the ACT scale, totals for the “perceived comfort/anxiety level with technology” subscale portion of the ACT scale, and totals for the “perceived usefulness of technology” subscale portion of the ACT scale. All correlations (see Appendix A Table 6) were in the predicted direction. Several were significant as hypothesized. Others were trended but not significant.

It should be noted that some of the data were changed for analysis purposes. Several participants, intentionally or unintentionally, left some responses blank. Questions 10, 11, and 12 (see Appendix B) evaluated perceived comfort levels using a variety of software for administration and teaching. There were 17 required responses per participant; with 23 participants, 391 responses were expected. However, 22 responses (5.6% of the total possible) were left blank. For analysis purposes, each blank response was replaced with the mode of the obtained responses for the respective question. In addition, the ACT portion (the comfort/anxiety and usefulness portion) of the survey consisted of 18 questions. One participant left one response blank, one left 2 blank, one left 3 blank, and one left 4 blank; two participants marked only the first 5 questions, each leaving 13 responses blank. With 18 possible responses per participant and 23 participants, 414 responses were expected; as indicated, 36 responses (8.7% of the total possible) were left blank. To facilitate analysis, each blank response was replaced with the mode of the responses to the respective question. These substitutions may have skewed or misrepresented the results.
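The blank-response substitution described above is simple mode imputation. A minimal sketch of the rule, with blanks represented as None (the function name is illustrative):

```python
from collections import Counter

def impute_with_mode(responses):
    """Replace blank (None) entries in one question's responses with the
    mode of the observed responses for that question."""
    observed = [r for r in responses if r is not None]
    mode = Counter(observed).most_common(1)[0][0]
    return [mode if r is None else r for r in responses]
```

Applied question by question, this fills each blank with that question’s most common observed response.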

References

Benson, D. (Jan. 1997). Technology training: Meeting teachers’ changing needs [Electronic Version]. Principal, 76, 17-19.

Booher, A.; Taylor, M. (Nov./Dec. 1999). Byte-sized technology sessions: Teachers training teachers [Electronic Version]. Book Report, 18(3), 46-47.

Clark, P.; Martin, L.; Hall, V. (Autumn 2000). Preparing preservice teachers to use computers effectively in elementary schools [Electronic Version]. The Teacher Educator, 36(2), 102-114.

Clifford, W. (Aug./Sept. 1998). Updating teachers’ technology skills [Electronic Version]. Momentum, 29(3), 34-36.

Dorman, S. M. (2001). Are teachers using computers for instruction [Electronic Version]? The Journal of School Health, 71(2), 83-84.

Grant, C. M. (Nov. 1999). Beyond tool training [Electronic Version]. Learning and Leading with Technology, 27(3), 42-47.

Hornung, C. S.; Bronack, S. C. (May 2000). Preparing technology-based teachers [Electronic Version]. TechTrends, 44(4), 17-20.

Jones, C. A. (May/June 2001a). Tech support: Preparing teachers to use technology [Electronic Version]. Principal Leadership (High School Ed.), 1(9), 35-39.

Jones, C.A. (Oct. 2001b). When teachers’ computer literacy doesn’t go far enough [Electronic Version]. The Education Digest, 67(2), 57-61.

Krueger, K.; Hansen, L.; Smaldino, S. E. (Apr. 2000). Preservice teacher technology competencies: A model for preparing teachers of tomorrow to use Technology [Electronic Version]. TechTrends, 44(3), 47-50.

Laffey, J.; Musser, D. (1998). Attitudes of preservice teachers about using technology in teaching [Electronic Version]. Journal of Technology and Teacher Education, 6(4), 223-241.

Lanier, K. A.; White, W. (May 1998). Building computer competency [Electronic Version]. Principal, 77(5), 41-42.

Lee, D. (Apr. 1997). Factors influencing the success of computer skills learning among inservice teachers [Electronic Version]. British Journal of Educational Technology, 28, 139-141.

Linnell, C. C. (Fall 2000). Identifying institutions that prepare elementary teachers to teach technology education: Promoting ESTE awareness [Electronic Version]. Journal of Industrial Teacher Education, 38(1), 91-102.

McCannon, M.; Crews, T. B. (2000). Assessing the technology training needs of elementary school teachers [Electronic Version]. Journal of Technology and Teacher Education, 8(2), 111-121.

Milbrath, Y-c. L.; Kinzie, M. B. (2000). Computer technology training for prospective teachers: Computer attitudes and perceived self-efficacy [Electronic Version]. Journal of Technology and Teacher Education, 8(4), 373-396.

NCES: National Center for Educational Statistics. (2002). District demographic information. Retrieved June 19, 2002, from http://www.nces.ed.org.

Reed, W. M.; Oughton, J. M.; Ayersman, D. J.; Ervin Jr., J. R.; Giessler, S. F. (2000). Computer experience, learning style, and hypermedia navigation. Computers in Human Behavior, 16, 609-628.

Ropp, M. M. (Summer 1999). Exploring individual characteristics associated with learning to use computer in preservice teacher preparation [Electronic Version]. Journal of Research Computing in Education, 31(4), 402-424.

Smith, S. (Winter 2000). Teacher education [Electronic Version]. Journal of Special Education Technology, 15(1), 59-62.

Tinson, L. (Apr. 1996). Teachers’ vital role in bringing technology into the classroom: Technology for learning project, Los Angeles county, California [Electronic Version]. Thrust for Educational Leadership, 25, 10-11.

Vojtek, R.; Vojtek, R. O. (Summer 2000). Tech cum laude [Electronic Version]. Journal of Staff Development, 21(3), 61-62.

Warner, M.; Akins, M. (Oct. 1999). Training today’s teachers for tomorrow’s classrooms [Electronic Version]. T.H.E. Journal, 27(3), 118.

Yildirim, S. (Summer 2000). Effects of an educational computing course on preservice and inservice teachers: A discussion and analysis of attitudes and use [Electronic Version]. Journal of Research on Computing in Education, 32(4), 479-495.

APPENDIX A

Table 1

Comfort level using word processing for administration and comfort level teaching word processing, by how word processing was learned.

How learned                          N       M       SD
1. Use Word Processing for Admin
   1a                               80     3.95    0.219
   2b                                4     3.00    0.000
   3c                                4     0.00    0.000
2. Teach Word Processing
   1a                               80     3.45    1.124
   2b                                4     3.00    0.000
   3c                                4     4.00    0.000

Note. aCombines two learning methods: self-taught and college. bCombines two learning methods: hands-on workshop and online. cLearning by lecture-style workshop.

Table 2

One-way ANOVA for effects of the method participants used to learn word processing on comfort levels using word processing for administration and teaching word processing.

                                         F       Significance
1. Use Word Processing for Admin
   Between Groups                     689.557       0.000
2. Teach Word Processing
   Between Groups                       0.859       0.427

Table 3

Scheffe post hoc comparisons of comfort using word processing as an administrative tool and comfort teaching word processing, across the three learning-method groups.

How learned (I vs. J)            Mean Difference    Significance
1. Using Word Processing for Admin
   1a vs. 2b                          0.95*            0.000*
   1a vs. 3c                          3.95*            0.000*
   2b vs. 1a                         -0.95*            0.000*
   2b vs. 3c                          3.00*            0.000*
   3c vs. 1a                         -3.95*            0.000*
   3c vs. 2b                         -3.00*            0.000*
2. Teach Word Processing
   1a vs. 2b                          0.45             0.721
   1a vs. 3c                         -0.55             0.614
   2b vs. 1a                         -0.45             0.721
   2b vs. 3c                         -1.00             0.430
   3c vs. 1a                          0.55             0.614
   3c vs. 2b                          1.00             0.430

Note. aCombines two learning methods: self-taught and college. bCombines two learning methods: hands-on workshop and online. cLearning by lecture-style workshop.

*p < .05

Table 4

Comfort level using spreadsheet programs for administration, as a teaching tool, and to teach, based on how spreadsheet programs were learned.

How learned                          N       M       SD
1. Use spreadsheet for administration
   1a                               88     3.32    1.067
   3b                                4     3.00    0.000
2. Use spreadsheet as a teaching tool
   1a                               88     3.27    1.014
   3b                                4     3.00    0.000
3. Teach spreadsheets
   1a                               88     2.86    1.261
   3b                                4     3.00    0.000

Note. aCombines two learning methods: self-taught and college. bLearning by lecture-style workshop.

Table 5

One-way ANOVA for effects of the method participants used to learn spreadsheet software on comfort levels using spreadsheets for administration, as a teaching tool, and to teach.

                                         F       Significance
1. Use Excel for Administration
   Between Groups                       0.352       0.555
2. Teach with Excel
   Between Groups                       0.286       0.594
3. Teach Excel
   Between Groups                       0.046       0.830

Table 6

Correlations between years teaching, age, ACT total, comfort total, and usefulness total.

                                      Years                  ACT       Comfort    Usefulness
                                      teaching    Age        total     total      total
1. Years teaching    Pearson r         1.00      0.741**     0.066    -0.009      0.065
                     Sig. (2-tailed)    __       0.000       0.534     0.929      0.538
2. Age               Pearson r         0.741**   1.00       -0.143    -0.094     -0.173
                     Sig. (2-tailed)   0.000      __         0.175     0.374      0.099
3. ACT total         Pearson r         0.066    -0.143       1.00      0.733**    0.899**
                     Sig. (2-tailed)   0.534     0.175        __       0.000      0.000
4. Comfort total     Pearson r        -0.009    -0.094       0.733**   1.00       0.381**
                     Sig. (2-tailed)   0.929     0.374       0.000      __        0.000
5. Usefulness total  Pearson r         0.065    -0.173       0.899**   0.381**    1.00
                     Sig. (2-tailed)   0.538     0.099       0.000     0.000       __

** p < .01

APPENDIX B

COMPUTER TECHNOLOGIES SURVEY

The purpose of this survey is to find out how teachers feel about computer technologies. Results from this survey will be used to help determine how computer technologies are perceived and used by teachers. Your responses are an important contribution to the process.

Within this survey, the terms “technology” and "computer technologies" are defined as the use of computers and related hardware and software to perform specific tasks. Computer technologies in education are most often used for word processing (e.g., WordPerfect, Word), communicating with others (e.g., email), class budgeting and class grade sheets (e.g. Excel), database creation and management of multiple classes or larger budgets (e.g. Access, Paradox, Dbase), classroom presentations (e.g. PowerPoint), and browsing or searching the Internet (e.g. Internet Explorer, Netscape, AOL). When responding to the following statements, consider your use of any or all of these technologies.

There are two segments to this survey: Background Information and Attitudes. It should take about 5-10 minutes to complete all sections. Your responses will be kept completely confidential.

Thank you in advance for your participation.

PART ONE

BACKGROUND INFORMATION

Please respond to each of the following items:

1. School where you teach:

2. Type of school where you teach (check one):

[ ] Elementary school [ ] Middle school [ ] High school

3. Number of years you’ve been teaching:

4. Your Age:

5. Your Gender (check one): [ ] Male [ ] Female

6. Your Ethnicity (check all that apply):

a. [ ] African-American

b. [ ] Caucasian (non Hispanic)

c. [ ] Hispanic

d. [ ] Native American

e. [ ] Asian/Pacific Islander

f. [ ] Other:

7. How often do you use any of the following for any part of your job (circle the answer that most closely represents your use)?

(1 = Never, 2 = Once a year, 3 = Once a month, 4 = Once a week, 5 = Daily)

Word Processing software                   1   2   3   4   5
E-mail                                     1   2   3   4   5
Presentation software (e.g. PowerPoint)    1   2   3   4   5
Spreadsheet software                       1   2   3   4   5
Web browsing software                      1   2   3   4   5
Database Management software               1   2   3   4   5
Multimedia Authoring software              1   2   3   4   5
Website Creation software                  1   2   3   4   5
Video Editing software                     1   2   3   4   5

8. Have you taken any courses, workshops, or seminars to learn to use any of the following (circle responses only for those technologies you use on or off the job, including for fun)?

(1 = No formal learning, 2 = College course(s), 3 = Online course(s), 4 = Hands-on workshop, 5 = Seminar)

Word Processing software                   1   2   3   4   5
E-mail                                     1   2   3   4   5
Presentation software (e.g. PowerPoint)    1   2   3   4   5
Spreadsheet software                       1   2   3   4   5
Web Browsing software                      1   2   3   4   5
Database Management software               1   2   3   4   5
Multimedia Authoring software              1   2   3   4   5
Website Creation software                  1   2   3   4   5
Video Editing software                     1   2   3   4   5

9. How comfortable are you with using the following technologies for administrative tasks (circle the answer that most closely represents your level of comfort; if you do not use a particular technology, do not circle any of the numbers for that technology)?

(1 = Very uncomfortable, 2 = Slightly uncomfortable, 3 = Slightly comfortable, 4 = Very comfortable)

Word Processing software                   1   2   3   4
E-mail                                     1   2   3   4
Spreadsheet software                       1   2   3   4
Database Management software               1   2   3   4
Website Creation software                  1   2   3   4

10. How comfortable would you be using any of the following technologies as a teaching tool in the classroom (circle the answer that most closely represents your level of comfort)?

(1 = Very uncomfortable, 2 = Slightly uncomfortable, 3 = Slightly comfortable, 4 = Very comfortable)

Presentation software (e.g. PowerPoint)    1   2   3   4
Spreadsheet software (e.g. Excel)          1   2   3   4
Database Management software               1   2   3   4
Web browsing software (e.g. Netscape)      1   2   3   4

11. How comfortable would you be teaching any of the following technologies to your students (circle the answer that most closely represents your comfort level)?

(1 = Very uncomfortable, 2 = Slightly uncomfortable, 3 = Slightly comfortable, 4 = Very comfortable)

Word Processing software                   1   2   3   4
E-mail                                     1   2   3   4
Presentation software (e.g. PowerPoint)    1   2   3   4
Spreadsheet software                       1   2   3   4
Database Management software               1   2   3   4
Multimedia Authoring software              1   2   3   4
Website Creation software                  1   2   3   4
Video Editing software                     1   2   3   4

PART TWO

ATTITUDES TOWARD COMPUTER TECHNOLOGIES

This survey has 18 statements about computer technologies. After reading each statement, please indicate the extent to which you agree or disagree by circling the number to the right of each statement. Please respond to all statements. There are no correct or incorrect responses.

(1 = Strongly disagree, 2 = Slightly disagree, 3 = Slightly agree, 4 = Strongly agree)

12. I don’t have any use for computer technologies on a day-to-day basis.   1   2   3   4

13. Using computer technologies to communicate with others over a computer network can help me in my job.   1   2   3   4

14. I am confident about my ability to do well in any task that requires me to use computer technologies.   1   2   3   4

15. Using computer technologies for administrative tasks will mean more work for me.   1   2   3   4

16. I do not think that computer technologies will be useful to me in my profession.   1   2   3   4

17. I feel at ease learning about computer technologies.   1   2   3   4

18. With the use of computer technologies, I can create materials to enhance my performance as a teacher.   1   2   3   4

19. If I can use word processing software, I will be more productive.   1   2   3   4

20. Anything that computer technologies can be used for, I can do just as well some other way.   1   2   3   4

21. The thought of using computer technologies in the classroom frightens me.   1   2   3   4

22. Computer technologies are confusing to me.   1   2   3   4

23. I could use computer technologies to access many types of information sources for my work.   1   2   3   4

24. I do not feel threatened by the impact of computer technologies on my job.   1   2   3   4

25. I am anxious about computer technologies because I don’t know what to do if something goes wrong.   1   2   3   4

26. Computer technologies can be used to assist my students in learning new skills.   1   2   3   4

27. I feel comfortable about my ability to work with computer technologies.   1   2   3   4

28. Using computer technologies in the classroom will not improve my performance as a teacher.   1   2   3   4

29. My students can use computers to help them learn critical thinking and problem solving.   1   2   3   4

Thanks for your participation!