

Review of Educational Research March 2012, Vol. 82, No. 1, pp. 90–126

DOI: 10.3102/0034654312438409 © 2012 AERA. http://rer.aera.net


Are We Asking the Right Questions? A Conceptual Review of the Educational Development Literature in Higher Education

Cheryl Amundsen and Mary Wilson
Simon Fraser University

This is a conceptual review of the literature variously referred to as faculty development, educational development, instructional development, and academic development in higher education. Previous empirical reviews covering more than 30 years of published literature could draw only tentative and weak conclusions about the effectiveness of educational development practices. The authors used different questions that queried the nature of educational development practice and the thinking underlying practice. Their conceptual review yielded a framework with six foci of practice (skill, method, reflection, disciplinary, institutional, and action research or inquiry) that was drawn from an analysis of the design elements of the educational development practices in the research they reviewed and from an analysis of the conceptual, theoretical, and empirical literature cited by those articles. This six-cluster framework provides a new way of thinking about the design of practice and a more meaningful basis for investigating the effectiveness of educational development practice.

Keywords: educational development, faculty development, academic development.

Some long-standing assumptions in higher education are that academics are not adequately prepared for their teaching role, have unsophisticated conceptions of teaching and learning, and have little knowledge of effective teaching practices, both in general and in their own specific discipline (Evers & Hall, 2009). These assumptions have led universities to pursue a number of ways to develop teaching practice. At the same time, a literature variously referred to as faculty development, educational development, instructional development, and academic development (Gosling, 2009) has emerged in support of research and practice, and this literature is the focus of our review. We use the term educational development to describe actions, planned and undertaken by faculty members themselves or by others working with faculty, aimed at enhancing teaching. The sources of this literature are diverse and include journals that are (a) discipline specific (e.g., European Journal of Engineering Education), (b) about higher education in general (e.g., Research in Higher Education), (c) dedicated specifically to teaching in higher education (e.g., Teaching in Higher Education), and (d) focused on the broader academic role (including teaching) in higher education (e.g., International Journal for Academic Development).

Most would agree that educational development in higher education remains a developing field—one that includes practitioners (educational development staff), faculty, researchers, and those who assume all three roles simultaneously to focus on the improvement of teaching and learning in higher education. One of the ways a developing field is able to further define itself and to enhance practice and further research is through the periodic review, analysis, and synthesis of the published literature. Three systematic empirical reviews of the educational development literature have been conducted, all focused on the question of effectiveness of educational development practices (Levinson-Rose & Menges, 1981; Steinert et al., 2006; Stes, Min-Leliveld, Gijbels, & Van Petegem, 2010). Despite the review of more than 30 years of literature represented by these reviews, the good work reflected in the well-designed and conducted review processes, and the useful attention to levels of learning and assessment outcomes, the answer to the question posed—What are the features of educational development that make it effective?—has been elusive. Authors of these reviews were able to draw only tentative and weak conclusions about effectiveness. All three reviews call for not only more rigorous research designs but also more qualitative research, a better theoretical and conceptual grounding of educational development practice, and a more detailed description of practice, so that each new study can build more explicitly on previous ones. The paucity of findings and the strikingly similar recommendations of all three reviews led us to ponder whether they were asking the right question if their intention was to inform practice and further research.

Kennedy (2007) distinguished empirical reviews, often based on cause–effect questions, from conceptual reviews, which she suggested “share an interest in gaining new insights into an issue” (p. 139). Essentially, the two types of reviews ask different questions. A conceptual review would not ask, for example, What educational development practices have the greatest impact? but would instead ask, How are educational development practices designed? and What is the thinking underlying the design of practice? Kennedy argued that the term systematic review is usually used to refer to empirical reviews but that both empirical reviews and conceptual reviews may adhere to the criteria for a systematic review: defined question(s) to drive the review, ensuring that an attempt has been made to identify all literature relevant to the questions posed, and employment of specific inclusion and exclusion criteria in defining the body of literature to be reviewed.

This article describes a systematically conducted conceptual review. The questions that drove the review were How are educational development practices designed? and What is the thinking underpinning the design of educational development practice? We had two specific purposes for conducting this review. Our first purpose was to deepen the understanding of current practice by describing and characterizing educational development practice in more detail than others have done. We began by documenting the core characteristics of each educational development initiative described in the articles we reviewed. By core characteristics, we mean the intention or goal of an initiative, the processes and activities planned to realize the intention or goal, and the evidence collected to demonstrate success in achieving the intention or goal. A framework with six clusters of practice emerged from this process: skill, method, reflection, disciplinary, institutional, and action research or inquiry. Our second and related purpose was to explore the thinking underpinning the design of practice. Evidence of this was drawn from an analysis of the core characteristics of the educational development practice and from an analysis of the conceptual, theoretical, and empirical literature cited by the articles in our review.

We begin by discussing how this conceptual review builds on and adds to previous empirical reviews. We then present the design and procedures of our review, followed by a detailed description and analysis of the six-cluster framework. Our conclusion considers the implications of this study for educational development practice, further research, and the continuing definition of the field of educational development.

Previous Empirical Reviews

Previous empirical reviews grouped studies by format (e.g., workshop, one-to-one consultations), by level of learning outcome (e.g., self-report of attitude, observed behavior), and by individual variables (e.g., duration of the activity) in relationship to reported measures of impact. We are aware of three systematic and empirical reviews of the literature;1 they are briefly described here to provide a sense of the work we are building on.

The earliest systematic review (Levinson-Rose & Menges, 1981) was based on 71 reports published between the mid-1960s and 1980 and had the expressed purpose of drawing conclusions for both “researchers and those who design and implement instructional improvement programs” (p. 416). Articles were grouped by the format of the initiative (e.g., workshops, one-to-one consultations) and discussed according to three levels of assessment of impact: teacher skill based on observation, student attitude from self-report, and student learning from tests or observer reports. The researchers evaluated the research design of each study and applied a rating (high, fair, or low) to indicate how much confidence should be placed in its findings; 78% of the articles reviewed supported the general success of the initiative described, although the level of support was reduced to 62% when the confidence ratings were considered. The authors noted that although workshops and seminars were the most common instructional development intervention (short, “one-shot” workshops were most numerous), they were also the least evaluated and, in their opinion, the least likely to “produce lasting changes in teaching behaviour or lasting impact on students” (Levinson-Rose & Menges, 1981, p. 419).

Steinert et al. (2006) conducted another empirical review 25 years later based on 53 identified articles published between 1980 and 2002 and specifically focused on educational development in the medical sciences. Like Levinson-Rose and Menges (1981), these researchers also hoped to draw conclusions about “features of faculty development that make it effective” and “the impact on knowledge, attitudes and skills of teachers” (Steinert et al., 2006, p. 499). Articles were coded by format of the initiative (e.g., workshop, seminar), by level of learning or assessment outcome adapted from Kirkpatrick’s (1994) four levels of assessment outcomes and similar to the levels used by Levinson-Rose and Menges (1981), and by research design type. Like the previous review, they developed a 5-point confidence rating scale based on quality of the research design; the mean confidence rating across the 53 articles they reviewed was 3.14 out of 5. They organized the discussion of their findings in two ways. First, they provided an overview of the articles in their review, reporting frequency counts for several aspects, including format, duration, and level of outcome assessed. Second, they discussed specific articles organized first by format (e.g., workshop, seminar) and, within that, by level of learning or assessment outcome. The authors admitted that neither their review nor the work of others to date had the evidence to “tease apart features of faculty development that make it effective” (Steinert et al., 2006, p. 509), but they did provide some “preliminary conclusions” (p. 509). These include the value of feedback, the importance of peers, and the use of multiple instructional methods.

The most recent empirical review (Stes et al., 2010) was based on 36 identified articles; the review was not limited in terms of time (earliest article was 1977 and latest 2007) or source of publication, and the authors intentionally looked beyond peer-reviewed journals. Articles were coded by level of learning or assessment outcome and by format of the initiative (workshop, one-to-one consultation), consistent with Steinert et al.’s (2006) review, and by research design type (quantitative, qualitative, and mixed methods). They also investigated the relationship of several individual variables (duration, format, target group) to impact levels. Impact level was coded as no indication of impact, indication of impact, partial indication of impact, or unclear impact. They found weak evidence that interventions of longer duration were reported to have more positive learning outcomes at the level of the faculty participant and that course-length interventions showed more positive outcomes at the level of the student.

Little changed in the conclusions or recommendations from the first review (Levinson-Rose & Menges, 1981) to the most recent one (Stes et al., 2010). All three reviews were critical of the quality of the studies in terms of their theoretical grounding and clarity of goals. Steinert et al. (2006) reported that 57% of the articles in their sample, drawn from educational development in the medical sciences, cited a theoretical or conceptual framework, drawn primarily from adult learning, instructional design, or reflective practice. Still, they called for more explicit attention to theoretical and conceptual grounding as well as to acknowledging the importance of context.

The authors of all three reviews noted the difficulties in drawing any meaningful findings because of the limited number of studies meeting their inclusion criteria and their standards of rigor in research design. All three called for the use of qualitative research designs that draw on more diverse forms of evidence (e.g., narratives, critical incident interviews, journals, portfolios). However, most articles matching these descriptions would likely not have met the selection criteria of these reviews and so would have been excluded. We return to this point later.

The particular features used to group educational development initiatives in all three of the previous reviews were, we argue, problematic and may provide at least a partial explanation for their lack of findings. All three reviews grouped educational development initiatives by format (e.g., workshop, one-to-one consultations). As best we can determine, authors of the reviews simply used the label given in the article, not defining it further. The term workshop, based on our reading of the literature, can refer to activities that vary widely in terms of purpose, content, duration, and quality. The format might be thought of as the outer shell, not reflecting the variation in actual practice contained inside. The feature of duration is also poorly defined. Two of the three reviews collected information about the duration of the initiative, but only Stes et al. (2010) used this information in a subsequent analysis of impact. We know from the articles we reviewed that information about duration, when provided, is variously described as number of contact hours, number of workshops, or the length of time the initiative spanned (number of days, months, or years).

The clear difficulty in comparing these diverse (and often inexact) measures may be what prompted Stes et al. (2010) to use a dichotomous classification for duration—one-time events versus longer initiatives. Grouping educational development initiatives using vague descriptors is, in our view, only part of the problem. The more important issue is the consideration of individual features separate from the overall design of an initiative. In the case of duration, this begs the questions, Longer than what? and Longer to achieve what? Similarly, grouping initiatives by kind and level of assessment without consideration of the broader design ignores the critical link among conceptual or theoretical grounding, core characteristics of the design, and learning. A more meaningful question to ask is, Is this a reasonable outcome given the design of the initiative? The authors of the most recent review recognized this weakness:

Future research not focusing on one specific educational feature of instructional development (e.g., format, duration), but considering the core characteristics of instructional development initiatives (e.g., the theoretical foundation, goals and content) in their internal connection would be even more worthwhile. (Stes et al., 2010, p. 47)

This is where our conceptual review picks up. A conceptual review, with the stated interest of “gaining new insights into an issue” (Kennedy, 2007, p. 139), seemed to us an appropriate way to proceed. We began by identifying the literature that describes and analyzes initiatives that are planned and undertaken by faculty members themselves or by others working with them and that are aimed at enhancing teaching.

The Review Process

Review Team

Seven different individuals worked on the review: the two authors and five graduate students at Simon Fraser University in Canada.2 At least three of these individuals were working on the team at any one time. Both authors were actively involved throughout the review process.

Inclusion and Exclusion Criteria

We used inclusion and exclusion criteria that moved from broad to more specific. We began by searching databases with these general keywords: instructional development, faculty development, professional development, educational development, academic development, teaching development, higher education, and post-secondary education. We then used the following three more specific criteria.

Focus of activities, initiatives, programs. We included all types of formal and informal teaching development initiatives, including those initiated by both centralized teaching support centers or other formal administrative groups (e.g., workshops, seminars, courses) and those initiated by individual academics or groups of academics (e.g., book groups, peer mentoring, learning communities, classroom research studies). To be included, initiatives had to be fully articulated in terms of the core characteristics of the design, including how effectiveness was determined.

Assessment of impact and evaluation of effectiveness. We began by including only reports of initiatives that went beyond satisfaction ratings, in keeping with the critiques of the literature articulated by two of the previous review teams. We adjusted this, however, to more fully capture the literature and included a few articles that provided a detailed account of procedures or activities that had been put in place to determine effectiveness but did not report any findings.

Institutional context. Our focus in this review is the university context—the context in which we work as teachers, researchers, and educational developers. One motivator to conduct this review was to situate our own educational development practice in the broader literature. Specifically, we included initiatives that were designed for two types of universities: medical or doctoral, and comprehensive, where faculty generally have joint roles as both teacher and researcher. These are labels commonly used in Canada. Medical or doctoral universities offer a broad range of Ph.D. programs and research; all institutions in this category have medical schools. Comprehensive universities have a significant degree of research activity and a wide range of programs at the undergraduate and graduate levels, including professional degrees.3

Search Process

We followed multiple steps in identifying articles for review.

1. We searched three databases (ERIC, Google Scholar, PsycINFO) for literature published in peer-reviewed journals between 1995 and 2008 using the keywords listed above.

2. We identified conference papers, proceedings, and reports using the same keywords and the same databases, but only for 1997 to 2004. We wanted to include this literature because we knew it would be a good source, but we did not have the resources to consider the same span of years as for the peer-reviewed journals. Consequently, we selected a span of time that would provide a reasonable sampling of this literature. Based on our examination of this sample, we were satisfied that it did not represent a literature with substantially different characteristics from that found in the peer-reviewed journals we reviewed.

3. We read the abstracts of the 3,048 peer-reviewed journal articles (in 104 different journals), conference papers, proceedings, and reports identified in the first two steps. Each abstract was read by a team member to determine if the article described an educational development initiative.

4. We conducted a manual search (reading article abstracts) of 11 journals known to our team as sources of articles concerning educational development. These 11 journals were the Journal of Higher Education, Studies in Higher Education, Teaching in Higher Education, Higher Education, Higher Education Quarterly, Adult Education Quarterly, New Directions for Teaching and Learning, the International Journal for Academic Development, Higher Education Research and Development, Active Learning in Higher Education, and Innovative Higher Education.

5. The database search and manual search left us with 428 articles (excluding duplicates) to read fully. These were read in the context of our three specific inclusion and exclusion criteria described above. Every article citation was entered into a reference management system (RefWorks), along with the names of the two readers; article copies were made for the readers and retained in our files for cross-checking.

The review and screening proceeded as follows:

• At least two readers read each article in full and applied our specific criteria to decide to include or exclude that article. A third reader was used when there was disagreement, and that article was also discussed at our weekly meeting.

• One reader for each article noted this specific information in an Excel table: first author, year of publication, reader name, description of the initiative, focus or goal of the initiative, duration of the initiative, theoretical or conceptual underpinnings or rationale for the design, evidence of impact or effectiveness, and philosophies, beliefs, and values about what constitutes effective teaching and/or learning in higher education.

• We documented in RefWorks and in the Excel table the reasons for excluding any article.

Our final database of articles totaled 137. Of the 137 articles included in our review, 64 (47%) were published in the 11 journals listed above (some of these articles were discovered in our manual search and some in our electronic searches) and, of these, 26 (19%) were published in the International Journal for Academic Development. An obvious omission is the lack of articles published in languages other than English, although we did include more articles written by authors residing in countries other than the United States (20 other countries) than did previous reviews.

Emergence of a Conceptual Framework

We noticed that the initiatives we were reading about seemed to cluster based on the purpose or focus of the initiative, the core characteristics of the design, and the literature cited. In one of our weekly meetings, we began to sketch out these emerging clusters on a large piece of paper. We then moved to develop a coding sheet that was revised and refined as we progressed with the review process. For each article, coding sheets were completed by two readers and compared. If there was not consensus, a third reader read and coded the article and it was discussed at a full team meeting. The coding scheme and revisions were documented in a codebook. Once the clusters and the associated elements appeared stable (i.e., this iterative cluster-defining process was not changing our cluster definitions, and no new clusters were being identified), all articles were reread and recoded by the two authors. Essentially, we followed an emergent coding process as recommended by Huberman and Miles (2002). We were able to agree on a good fit with the focus of a particular cluster for every article; in other words, the educational development initiative described in each article adhered to all of the elements in one particular cluster. The last version of our coding framework included the following six clusters (see Table 1).

• The skill focus cluster includes 14 articles that focus on the acquisition or enhancement of observable teaching skills and techniques (voice projection, presentation skills, discussion facilitation skills, etc.).

• The method focus cluster includes 33 articles that focus on mastery of a particular teaching method (e.g., problem-based learning).

• The reflection focus cluster includes 30 articles that focus on change in individual teacher conceptions of teaching and learning.

• The institutional focus cluster includes 37 articles that focus on coordinated institutional plans to support teaching improvement.

• The disciplinary focus cluster includes 4 articles that focus on disciplinary understanding to develop pedagogical knowledge.

• The action research or inquiry focus cluster includes 19 articles that focus on individuals or groups of faculty investigating teaching and learning questions of interest to them.

The next section describes these six clusters of practice in detail. It is important to note that it is not our intention to value one cluster over another; we discuss the rationale for this further in the concluding section. We attend, in the following discussion, to the questions motivating this review—How are educational development practices designed? and What is the thinking underpinning the design of educational development practice?—by describing the core characteristics of the initiatives in each cluster (intention or goal of an initiative, the processes and activities planned to realize the intention or goal, and the evidence collected to demonstrate success in achieving the intention or goal) and their internal consistency. To further address the conceptual consistency of the initiatives in each cluster and the thinking underpinning design, we discuss the literature most cited in the articles in each cluster and, to the extent possible given the information provided in each article, how this literature was drawn on in the design of each initiative. This citation information resulted from a separate analysis in which we identified all of the (first) authors who were cited at least three times in three separate articles in each cluster (the complete analysis is in Tables 2–7 in Appendix B in the online journal).


Table 1
Educational development clusters and associated elements of thinking and design

Skills focus: Acquisition or enhancement of observable teaching skills and techniques

• Emphasis on observable skills and techniques (e.g., presentation skills)
• Largely generic, not discipline based
• Focus of intervention is to support change in specific behaviors identified through course ratings, class observations, or self-reports
• Assessment of impact is based on change in student perception (e.g., course ratings) or observable skills (e.g., class observation)
• Draws on relevant literature (e.g., individual consultations based on student ratings)

Method focus: Mastery of a particular teaching method

• Emphasis on learning about a particular teaching method and how to use it (e.g., problem-based learning)
• The elements that make up the method have integrity and coherence
• Design of training models the method being taught
• Assessment of impact based on how well the method is demonstrated during training (e.g., trueness to approach, consistency) and how widely adopted afterward (once training is finished)
• Draws on theoretical, ideological, or empirical literature relevant to the particular method

Reflection focus: Change in individual teacher’s conceptions of teaching and learning

• Assumption is that reflection leads to conceptual change and that this in turn leads to change in teaching practice
• Design of activity is to prompt and support individual reflection
• Includes a collegial element to aid individual reflection
• Assessment of impact based on individual change in conceptions about teaching and learning and sometimes the link from changed conceptions to new teaching practices
• Draws on relevant literature (e.g., teaching conceptions, reflection)

Institutional focus: Coordinated institutional plan to support teaching improvement

• Top-down approach with the assumption that the initiative is useful and beneficial for all
• Strategic planning involved
• A focus on human resource development
• Assessment of impact focuses on diffusion and uptake of the initiative
• Draws on relevant literature (e.g., organizational change)

Disciplinary focus: Examine disciplinary understanding to develop pedagogical knowledge

• Assumption is that teaching is different (at least in part) in different disciplines because the structure of knowledge is different
• Assumption is that academics identify best with their own disciplinary culture, knowledge, and practices and, therefore, disciplinary understanding is the foundation on which to build pedagogical knowledge
• Activities are focused on scholarly discussion among colleagues
• Assessment of impact is informal (e.g., participation in discussions, reflection portfolios, and ongoing teaching projects)
• Draws on relevant literature (e.g., discipline-based understanding)

Action research or inquiry focus: Individuals or groups of faculty pursue topics of interest

• Work is peer based
• Focus of the inquiry is chosen by the individual or group
• Involves mentoring among group members
• Inquiry process initiated by faculty, instructional developers, or both in collaboration
• Assessment of impact is informal (e.g., reflection on course materials, action plans, dissemination of findings and materials produced by individual or group)
• Draws on relevant literature (e.g., communities of practice, scholarship of teaching and learning)

Cluster 1: Skill Focus

The 14 articles that were coded as having a skill focus describe initiatives that seek to improve teaching through the enhancement of observable teaching skills and techniques, primarily as demonstrated in a classroom setting. Initiatives took the form of pretest-intervention-posttest (sometimes administered more than once over time) or intervention-posttest. Specific skills or techniques to be learned or improved were identified by instructors themselves and/or by students (e.g., through course evaluations or interviews) and/or by educational developers. Targeted skills and techniques, as described in these articles, included questioning techniques, interactive techniques, presentation skills, consultation skills, clinical teaching skills, and lesson planning. Interventions took the form of one-to-one consultations or targeted workshops led by educational developers. Assessment of the effectiveness of the intervention was accomplished through student course ratings, structured student interviews, self-evaluation of performance using video classroom observations, self-report of teaching practice, self-perceptions of performance, and classroom or clinical observations by others.

Many of the articles in this cluster cited the work of authors who have contributed to the well-developed literature on faculty consultation using pre–post student course ratings (e.g., H. W. Marsh, P. A. Cohen) as the basis or rationale for the design of the initiative.4 This is a largely atheoretical literature commonly employing quasi-experimental designs to determine change in pre–post measures. Another source regularly drawn on in the design of these initiatives was the literature discussing evidence-based teaching techniques (e.g., W. McKeachie, A. Saroyan) and the assessment of clinical teaching skills (e.g., M. Hewson). In total, there were 19 authors cited as the first author three or more times across the 14 skill focus articles we reviewed (see Table 2 in Appendix B online).

Example from the skill focus cluster. Nasmith and Steinert (2001) investigated the effectiveness of a workshop that allowed faculty to explore interactive techniques they can incorporate into their lectures.5 The workshop design drew on the medical education clinical teaching strategies literature (e.g., Skeff, Stratos, Berman, & Bergen, 1992) and that dealing with evidence-based teaching techniques (e.g., McKeachie, 1994). The experimental group and comparison group were compared on an immediate postworkshop questionnaire, a six-month postworkshop questionnaire, and independently rated videotapes of selected faculty from both groups six months after the workshop. All measures showed a change in favor of the increased use of interactive techniques among the experimental group.

All of the initiatives included in this cluster were designed as interventions either to address a documented need of a particular instructor (individual consultation) or to teach a group of faculty how to incorporate a particular technique into their teaching practice. The emphasis was on an observable outcome. Quasi-experimental designs were used, incorporating pre–post or posttest-only measures and sometimes longitudinal measures and control or comparison groups.

Cluster 2: Method Focus

The focus of the 33 articles in this cluster is a teaching method that is based on a particular view of learning and encompasses a set of teaching and learning strategies that, taken together, support the desired learning (e.g., problem-based learning; PBL). Thus, the purpose of these initiatives is not only learning to use the teaching method but also coming to understand the particular view of learning that underpins it. This is what primarily differentiates this cluster from the skill focus cluster, where the concentration is on individual, observable skills and techniques. The methods evident in these articles were originally developed based on an instructional need (e.g., PBL in medical education, case-based learning in business education) or, in a few cases, from a particular theoretical stance (e.g., constructivist course design, self-directed learning).

Of the 33 articles in this cluster, 14 described initiatives addressing online learning course design (8), PBL (3), or case-based learning (3). Other teaching methods included guided independent learning, self-directed learning, collaborative or cooperative learning, sustainable teaching and learning strategies, and course design for diversity and social justice. In many cases, the design of the training modeled the method being taught—for example, faculty learning to teach with a case-based method were taught using cases as their students would be. Assessments of effectiveness were developed to gather evidence of mastery of the method during the training as well as use of it in subsequent teaching practice. Self-report measures were most common (questionnaires, written reflections, interviews, focus groups, implementation logs), often administered multiple times after the completion of the training. Classroom observations with feedback were also common, followed by student evaluation of course materials developed or teaching practices specific to the training.

The authors cited most frequently in the method focus cluster fell loosely into four groups; the first two groups are most tightly linked to how an initiative was designed or the rationale for choosing a particular method. The first group contained articles that described a particular method of teaching, for example, technology integration (e.g., L. Cuban) and PBL in medical education (e.g., D. Irby). A second group was literature that provided a theoretical rationale for the selected teaching method: a social practice perspective (J. Brown & P. Duguid), constructivist design (D. Jonassen), adult learning principles (M. Knowles), types of faculty knowledge (L. Shulman), and reflective practice (D. Schön). The third group comprised citations to broad discussion articles about the practice of educational development or future directions (e.g., C. Bland). Finally, the fourth group of citations addressed the assessment of faculty learning (e.g., J. Biggs). In total, 15 authors were cited at least three times in the 33 articles we reviewed in the method focus cluster (see Table 3 in Appendix B online).

Example from the method focus cluster. Eisen and Barlett (2006), both faculty members, one in biology and the other in anthropology, and both with responsibilities for educational development in their academic units, provided a description of the Piedmont Project on environmental sustainability. Principles underlying the design of the initiative were drawn from Dewey’s notions of inquiry learning as well as from research in environmental education. Emphasis was on (a) interdisciplinary cooperation, (b) shifting from teacher as expert to teacher as facilitator of learning, (c) combining research skills with ethical reflection, and (d) personal responsibility toward environmental sustainability. Faculty participants identified a new course to develop or an existing course to revise and attended a two-day course development workshop led by faculty participants from the previous year. There was an end-of-summer field trip to local sustainability-relevant sites and a follow-up dinner a year later to discuss their experiences with their new or revised courses and the impact this project had had on their professional perspectives and teaching methods. Effectiveness was assessed through three surveys conducted a few days after the summer workshop, one year later, and four years later. Course materials and written reflection statements were collected and analyzed. Faculty reports and course materials evidenced change in teaching approaches (especially including field trips and real-world problem inquiry).

All of the initiatives in this cluster were designed with the intention of teaching faculty to use a particular method of teaching and to come to an understanding of the type of learning supported by that method. Generally, as part of the design of the initiative, faculty experienced the method as they learned to use it. Measures of effectiveness focused on the use of the method both in the training sessions and later in individual teaching practice. Thus, self-report measures of teaching practice were most evident, but there was also ample use of outside observers and targeted student feedback.

Cluster 3: Institution Focus

Each of the 37 initiatives coded as having an institutional focus outlined a coordinated plan for academic development at the institutional level, part or all of which had to do with teaching development. These initiatives commonly placed an emphasis on strategic planning and human resource development. Initiatives were “top-down,” in response to an institutional or national agenda (e.g., technology integration) and sometimes involved multiple universities. The assumption of the institution was that the initiative (e.g., technology integration) was generally beneficial to all instructors in all academic contexts. The emphasis of the development efforts was quite varied and included technology integration, developing a network of teaching scholars across the university who worked with colleagues to improve teaching, widespread adoption and resource management to support a particular approach (e.g., PBL), systematic evaluation of teaching across the institution, and specific curriculum addition or refocus (e.g., professionalism, technology literacy, action research).

Although the emphasis of the initiatives in this cluster may resemble some in the method focus cluster (e.g., PBL, technology integration), articles in this cluster have a stated goal to have an impact across the entire institution or academic unit (i.e., faculty or school). Thus, the focus of implementation is ensuring the successful diffusion of the initiative by building the structures and processes to accomplish this. Structures and processes ranging from simple to more elaborate were used, including combinations of the following: departmental and cross-departmental development teams, curriculum redesign groups, mentoring processes, highlighting and modeling of existing practice consistent with the development initiative, project grants, and focused workshops. Assessment of effectiveness sought to show evidence of successful diffusion rather than assessing an individual instructor’s teaching. Thus, common broad indicators of successful diffusion were activity reports from various levels of the institution (faculty, department, program) about numbers of faculty participating, degree of uptake, and challenges met and overcome. More focused assessment considered the products of the diffusion process, such as redesigned courses, instructor portfolios, and targeted student feedback.

The cited literature in the institutional focus cluster moved beyond a focus on the individual faculty member to a broader organizational perspective on teaching and on educational development. The emphasis on strategic development and human resource development evident in many of the initiatives in this cluster was a reflection of the guidelines provided in the organizational change literature addressing institutional-level rethinking of teaching and learning (e.g., D. Laurillard; P. Ramsden; P. Trowler) and of educational development in general (e.g., G. Webb; G. Gibbs) and as specific to the health sciences (e.g., C. Bland; Y. Steinert). As with the method focus cluster, literature particular to the initiative being diffused across the institution was cited, for example, PBL in medical education (e.g., H. Barrows) and the design of technology integration (e.g., D. Jonassen). In total, 19 authors were cited as first author three or more times across the 37 institutional focus articles we reviewed (see Table 4 in Appendix B online).

Example from the institution focus cluster. Stigmar (2008) described an initiative (Educational Action Program) with the stated intention of making the “transition from conveying to creating knowledge” (p. 107). A continuous and structured process of dialogue in all academic units of the university was put in place to implement the stated intention of the Educational Action Program in a way that was meaningful to each unit. This initiative, like those described in other articles in this cluster, was a top-down response to a specific institutional mandate. Strategic planning led to work at three different levels: institutional (linking the program to the development plan for the whole university), educational development unit (staff of this unit work as change agents), and departmental (required to adapt the program to its activities by creating program action plans). The primary source of literature drawn on addressed organizational change through the scholarship of teaching and learning (e.g., Lueddeke, 2003). Evaluation of the initiative was conducted at the department level, using department reports of progress toward institutional goals, interviews with department representatives, and analysis of other relevant documents. Seminars and other activities to gather input on the process were used as safeguards against faculty perceiving this as only a top-down initiative. Departments were also encouraged to adapt the program to fit their contexts. The author credits these activities and the general way the initiative was implemented for success in realizing the institutional goals across the university.

All of the initiatives in the institutional focus cluster address cross-institutional change around some aspect of teaching and learning. Thus, the design of these initiatives emphasizes the development of structures and processes at various levels of the institution to support the diffusion of the initiative with the assumption that it will become embedded in everyday practice. The assessment of successful diffusion focuses on the adequacy of these structures and processes to support the initiative and provide evidence of uptake.

Cluster 4: Reflection Focus

The focus of the 30 articles coded with a reflection focus is to engage faculty members in a process of reflection (individually and/or collaboratively) with the purpose of changing or clarifying their conceptions of teaching and learning and linking this to change in their teaching practice (McAlpine & Weston, 2000). The intention here is markedly different from the previous three clusters, which could be broadly described as placing value on the outcome: acquisition or enhancement of specific skills and techniques in the skills focus cluster, demonstration and use of a particular method in the method focus cluster, and diffusion of an approach or perspective across the institution in the institution focus cluster. In this cluster, the reflection focus cluster, the value is placed on the process of reflection as learning.

Some articles in this cluster explicitly argued that a transformation in thinking or in conceptions of teaching and learning is a necessary pre- or corequisite step to changes in teaching practice. Focusing only on teaching skills or techniques, or even methods, as in the first two clusters, is seen as insufficient. In fact, many of the reflection-focused articles reported programs that were designed in reaction to workshops on teaching skills, techniques, and methods. The authors of one of the articles in this cluster, Hubball, Collins, and Pratt (2005), provided a clear example of this thinking in their explanation of the intention of the program they have developed at a university in Canada “to promote reflection defined as thoughtful consideration and questioning of what we do, what works and what doesn’t work, and what premises and rationales underlie our teaching and that of others” (p. 60).

Faculty who participated in these initiatives engaged in reflection in a number of ways. It could be prompted by reflecting on one’s conceptions and practices in relationship to a personal goal or ideal (reflect and discuss), to a colleague’s practice (observe, discuss, and reflect), to representations in the literature (read, discuss, and reflect), and/or to newly developed knowledge (i.e., read or learn, reflect, discuss). For example, an individual faculty member who desired to improve her interactions with students asked a colleague to review and discuss her written reflections and those of her students after each class session (Stewart & McCormack, 1997). Pairs of physicians involved in clinical coteaching followed a process of observing each other’s teaching, debriefing this, and planning the next session together (Orlander, Gupta, Fincke, Manning, & Hershman, 2000). A several-month-long course focused faculty on comparing their own practice to theory-based practices in the literature, ending with a provocative interview challenging them to integrate theory and their teaching practices (Halliday & Soden, 1998).

Although some of the tools used to determine success of these initiatives paralleled those evident in the first three clusters (questionnaires and surveys, classroom observations), others involved the analysis of the same activities used to promote reflection: personal journals, instructor and student reflections, focus groups, teaching philosophies, critical classroom incident assessments, teaching portfolios, projects.

The most cited literature in this cluster was decidedly more theoretical—not surprising given that reflection is a theoretical construct and is not directly observable in the same ways as the intended outcomes of initiatives in the skill and method focus clusters. Authors who wrote specifically about the process of reflection (e.g., D. Schön; J. Mezirow) and those who designed initiatives using reflection (e.g., P. Cranton; C. Kreber) were cited frequently. In addition, authors who wrote about conceptions of teaching and learning held by faculty and the relationship to approaches to teaching and student learning were cited consistently (e.g., P. Ramsden; A. Ho; K. Trigwell & M. Prosser). The design of initiatives in this cluster often worked to “ground” the reflection process in the individual faculty member’s teaching practice, citing literature that addressed the importance of disciplinary identification (e.g., R. Barnett), scholarly discussion (e.g., S. Rowland), adult learning principles (e.g., S. Brookfield), and embeddedness in the academic workplace (e.g., D. Boud). In total, 38 authors were cited as first author three or more times across the 30 reflection focus articles we reviewed (see Table 5 in Appendix B online).

Example from the reflection focus cluster. Sandretto, Kane, and Heath (2002) drew on the work of Argyris and Schön (1974) to design a program with the assumption that once instructors were aware of discrepancies between what they said they did (espoused theories) and what they actually did (theories in action), they would seek to address them. They also drew on the adult education literature (e.g., Brookfield, 1990) in their design. The authors began by identifying two groups: experienced excellent teachers in the sciences and early-career academics. The program consisted of 10 weekly 2-hour sessions. New faculty engaged in several activities: discussions (with experienced faculty), readings, reflective journals, writing of personal histories relevant to teaching, repertory grid interviews, written reflections based on viewing videotaped classroom observations of experienced teachers with accompanying explanations of thinking and practice, watching and reflecting on their own teaching, and ultimately articulating a teaching philosophy. An analysis of the resulting artifacts showed an articulation of espoused theories and theories in action and a movement to bring these together.

All of the initiatives in this cluster value reflection as learning, and, therefore, the activities faculty engage in are meant to support the process of reflection on their conceptions or understandings of teaching and learning and their teaching practice. Some initiatives go beyond this and specifically link or integrate what is learned through reflection to changes in teaching practice (e.g., teaching projects).


For the first time, in this cluster we see a significant number of initiatives that are wholly designed by faculty to improve their own teaching, whereas in several others educational developers act as facilitators of an essentially faculty-driven process. In this cluster, documenting the reflection process and its link to conceptions of teaching and learning and to teaching practice was often accomplished through analyzing the same tools that supported reflection (e.g., journals, personal histories, teaching philosophies).

Cluster 5: Discipline Focus

Like many of the articles included in the reflection focus cluster, the four articles in the discipline focus cluster also noted the insufficiency of teaching skills or techniques and methods, but for a different reason. The thinking here is that skills or techniques and methods, if they are to be meaningful, must be understood through the lens of an instructor’s discipline (Davidson, 2004; Healey & Jenkins, 2003; Neumann, 2001; Neumann, Parry, & Becher, 2002). The stated assumption is that the structure of knowledge within a discipline should in good part determine course designs and teaching methods, and therefore teaching improvement requires in-depth knowledge of the discipline. Also assumed is that academics identify best with their own disciplinary culture, knowledge, and practices and that efforts for teaching improvement are best approached from this perspective. It is important to note that articles that simply drew their participants from one discipline for convenience or because of who initiated the program were not included in this cluster; the element of disciplinary understanding linked to teaching and student learning had to be explicitly stated. Characteristic of the stance taken in these four articles was the sentiment expressed by Mathias (2005):

A more fundamental issue is the extent to which the responsibility for the development of new university teachers has been gradually removed from the traditional academic disciplinary communities of practice and placed in the hands of education specialists, and whether this has undermined the ownership and commitment academic departments should have for the development of the teaching function within the context of disciplinary cultures and practices. (p. 97)

The designs of initiatives in this cluster were largely grounded in discussion about teaching experiences and/or educational readings. Designs supported developing teaching projects, collegial mentoring, and peer collaboration. Assessment of impact was accomplished through an analysis of teaching projects and self-reports of thinking and change in teaching practices. Given that there were only four articles in this cluster, we were not able to identify common citations meeting our criteria of three common citations in at least three separate articles in the cluster.

Example from the discipline focus cluster. Rowland and Barton (1994) and Rowland (1999) designed an initiative centered on scholarly critique, suggesting that critique is fundamental to academic discussion and critical to fostering understanding because it allows different disciplinary values or underlying theories to emerge (citing the work of Carl Rogers and Habermas regarding the democracy of true discourse). Participants engaged in critique of readings, of each other’s ideas and perspectives, and of their reflections on their own teaching. Between the weekly sessions, which continued over the course of two years, participants read texts related to teaching and learning from the field of education as well as from other fields and were encouraged to “make observations and interpretations of their own teaching, try out different strategies in light of these and develop their own ongoing [teaching] projects” (Rowland & Barton, 1994, p. 369). The authors argued that the educational development program was built to “raise fundamental questions concerning pedagogy—the relationships between teacher, student and subject matter—within the particular social [disciplinary] context, rather than merely focus on the ‘how to do it’ of teaching skills” (Rowland & Barton, 1994, p. 369).

All four articles in this cluster took as a starting point that the structure of knowledge varies between disciplines and that faculty identify strongly with their discipline, both as teachers and as researchers. Thus, initiatives were designed to use scholarly processes to prompt thinking about teaching and learning from the perspective of the discipline. Activities challenged faculty to uncover their understandings about knowledge development within the discipline and to link this to their teaching practice.

Cluster 6: Action Research or Inquiry Focus

There were 19 articles in this cluster, all with the underlying rationale that teaching improvement is fostered by individual or group inquiry or research into topics or questions of interest related to teaching and learning. Like the two previous clusters where the value is placed on process as learning—in the reflection focus cluster, the reflective process, and in the discipline focus cluster, the process of scholarly discussion and critique—value is placed in this cluster on the process of inquiry as learning (Cox, 2001; Kreber, 2005; Kreber & Cranton, 2000; Richlin & Cox, 2004; Trigwell & Shale, 2004). Initiatives ranged from those investigating individual teaching practice to group inquiry motivated by the desire to explore questions of mutual interest. For example, in one initiative, junior faculty investigated questions specific to their own teaching but with the support of a cohort composed of other junior faculty and more senior faculty. Project reports focused on what was learned from the inquiry and resulting changes in teaching practice, including a systematic assessment of learning (Middendorf, 2004).

In another initiative, a group of faculty followed an action research process to investigate two questions concerning integrating technology and incorporating an interdisciplinary approach. The final activity was to construct an action plan for moving forward (Garcia & Roblin, 2008). Assessment of effectiveness varied depending on what aspect was of interest. The meaningfulness of the inquiry process itself was assessed through questionnaires, focus groups, action plans, redesigned course materials, documentation of change in teaching practice, and portfolios. Some initiatives also sought to determine broader impact by looking at numbers of resulting conference presentations and publications, evidence of cross-unit collaboration, and alignment with strategic goals.

Authors who have written about the theoretical underpinnings of learning in a community, specifically in communities of practice (e.g., E. Wenger; J. Lave), and workplace learning (e.g., D. Boud) were cited regularly. Group inquiry initiatives were often designed around the elements described by E. Wenger as essential to a community of practice and around M. Cox’s guidelines in establishing learning communities in higher education. Individual investigations, on the other hand, tended to adhere to a more formal research structure, often citing the literature on the scholarship of teaching and learning (e.g., Boyer). As suggested in one article, “This approach shifts the attention . . . from the enhancement of teaching using traditional methods such as workshops, formal courses, or in-faculty curriculum and assessment advice, to encouraging and supporting participant research on learning and teaching, and disciplinary-based research” (Reid & Petocz, 2003, p. 105). Several initiatives targeted new faculty and cited literature specific to this group (e.g., R. Boice; A. Austin). In total, seven first authors were cited three or more times across the 19 action research or inquiry focus articles we reviewed (see Table 6 in Appendix B online).

Example from the action research or inquiry focus cluster. Faculty and students (Koch et al., 2002) from a variety of disciplines (psychology, chemistry, political science, education, English, economics) collaborated in an initiative in which junior faculty investigated a self-selected facet of their teaching. They developed individual learning projects with goals, activities, and evaluation techniques. They worked with a faculty mentor and a student associate of their choice over a one-year period to design, implement, and assess their project. Cross-disciplinary relationships developed from the regularly scheduled progress meetings. Evaluation was accomplished through student feedback, faculty observations, student performance, and self-reflection by student helpers and faculty mentors.

All of the articles in this cluster supported a process of inquiry that took the form of investigating one’s own teaching practice or collaborative forms of inquiry where groups of faculty explored questions of mutual interest. However, even when the investigation was about an individual’s teaching practice, faculty rarely worked alone and had support from other colleagues or educational developers. The investigations were initiated by faculty members or educational developers or both in collaboration—but faculty always chose the topic or question to be investigated. In keeping with an action research cycle (McNiff & Whitehead, 2006), the process of inquiry, the findings of the inquiry, and subsequent changes in teaching were variously documented as evidence of success of the different initiatives.

Looking Across the Six Clusters

Outcome Versus Process

The six clusters in our framework can be broadly characterized as reflecting an emphasis on either outcome or process. Initiatives designed to support a specific outcome are located in the skill, method, and institutional focus clusters of our framework, whereas initiatives designed to support a process are located in the reflection, disciplinary, and action research or inquiry focus clusters.

In initiatives emphasizing an outcome, the outcome is identified and anticipated ahead of time. In the skill focus cluster, where the intended outcome is to acquire or enhance specific observable skills or techniques, initiatives often determine a baseline against which to compare later posttesting. In the method focus cluster, where the intended outcome is that faculty come to understand learning as supported by a particular method (e.g., case-based learning) and use it in their teaching, assessment methods seek evidence of mastery of the method within the training sessions and evidence of use of the method in subsequent teaching practice. In the institutional focus cluster, the intended outcome is the successful diffusion and uptake of a particular teaching and learning innovation (e.g., technology integration) or orientation (e.g., student diversity). In this cluster, broad assessment measures consider the adequacy of structures and processes put into place to support effective diffusion and document evidence of uptake. In short, the core characteristics of initiatives in these three clusters have coherence with respect to the intended learning, the intervention, and the assessment of the predetermined learning.

Initiatives emphasizing a process, in contrast, do not specify a particular outcome but rather highlight a process of learning that may result in different outcomes for different faculty or multiple outcomes for an individual faculty member. The assumption in these initiatives is that engaging in the process (reflection, scholarly discussion and critique, action research or inquiry) will lead to changed thinking about teaching and, over time, more effective teaching. The focus is almost always on individual meaning making. A stated goal in many of these initiatives is to develop and support a questioning orientation to teaching and learning. Several initiatives formally measure changes in conceptions of teaching and learning through inventories, interviews, and/or the analysis of projects. Otherwise, effectiveness is most often determined by analyzing the same activities in which faculty engage to support the reflection process (faculty and student journals, teaching projects, teaching portfolios, peer observation, teaching philosophies). In short, the core characteristics of these three types of initiatives are consistently aimed at supporting a particular process of learning.

The differences between educational development initiatives highlighting outcomes and those highlighting process challenge us to think more deeply. Our thinking has led us to explore the various assumptions designers of educational development make about the institutional, intellectual, and contextual positioning of their work.

Positioning of Educational Development

We use three sets of contrasts to organize our discussion of positioning: institutional location (centralized or decentralized), intellectual location (focused on content or ongoing professional development), and contextual location (teaching as individual practice or as socially situated practice). We recognize that any attempt to categorize has its limitations, and this surely is the case here. These contrasts overlap to some extent. We think, however, that they provide a useful guide for considering educational development practice and for considering the various clusters of educational development practice composing the six-cluster framework we describe.

Institutional Positioning

The institutional positioning of educational development is often referred to as being either a centralized or a decentralized model, or a hybrid of the two. Centralized in this instance refers to initiatives led by educational development staff from a centralized teaching support center who deliver workshops, consultations, and other programs in the name of that center; faculty members leave their own departments to attend. Decentralized, by contrast, refers to models where educational development staff take on more facilitative roles, working with faculty in academic units (e.g., facilitating teaching circles) or as members of project teams with faculty. It is not surprising that these terms are most evident in institutional documents that describe the functions of centralized teaching support centers and the roles of staff assigned to them (Clegg, 2009).

The initiatives in the skill and method focus clusters of our framework could be almost exclusively identified with a centralized model where educational developers work with individual faculty or groups of faculty and are responsible for planning and carrying out the initiative. In the institutional focus cluster, the broad goals of these initiatives lead to both centralized and decentralized ways of working for educational development staff. In the reflection, discipline, and action research or inquiry focus clusters, we see a hybrid of centralized and decentralized, with educational developers assuming roles that involve varying amounts of facilitation, ranging from being the primary architects in planning and implementing an initiative (workshops on reflective practice) to acting as facilitator and resource person for an essentially faculty-led process (faculty learning community).

Intellectual Positioning

The intellectual positioning of an educational development initiative refers to the intended learning: an emphasis on acquiring either particular content or the tools of ongoing professional development (Webster-Wright, 2009). If we consider the learning intention of initiatives in the skill focus and method focus clusters, then certainly the emphasis is on a specific content. Initiatives in the institutional focus cluster are more mixed. In some initiatives in this cluster, considerable effort was made to adapt the innovation to existing teaching practices and departmental cultures, and in others, there was little or no customizing of an essentially top-down initiative. The learning intention of the remaining three clusters, reflection, discipline, and action research or inquiry focus, is to support a process (reflection, scholarly discussion and critique, action research and inquiry) that can lead to ongoing professional development. Many of the initiatives in the reflection and action research or inquiry focus clusters were initiated, planned, and implemented by faculty members for the purpose of their own professional development without the support of educational development staff.

Contextual Positioning

We think contextual positioning is the most interesting to think about and perhaps provides the most substantive contrast. Here the distinction is between activities focusing on improving or enhancing an instructor’s individual teaching practice versus activities that engage faculty in teaching enhancement as a socially situated practice (Boud, 1999; D’Eon, Overgaard, & Rutledge, 2000; Gregory & Jones, 2009; McAlpine, Weston, Timmermans, Berthiaume, & Fairbank-Roch, 2006). The initiatives in both the skill and method focus clusters are embedded in the context of individual teaching practice, with assessment measures constructed to collect evidence of newly learned knowledge and skills within one’s teaching practice. In the reflection, disciplinary, and action research or inquiry focus clusters, some initiatives support the learning process only at the level of the individual instructor, whereas others either embed the focus on individual practice within a support network of colleagues or focus on a collaborative group process where individual meaning making of new knowledge is a social process with faculty colleagues.

Along with these contrasts has come debate over which types are best or most effective or what the balance should be. Knight, Tait, and Yorke (2006) characterized programs and initiatives as ranging from “event-delivery methods” to situated approaches that take place within the context in which faculty work. They maintained that “there is still a place for event-based educational professional development, but it [should] complement, rather than displace, situated social learning” (p. 320). Webster-Wright (2009) leveled a stronger critique at what she considered an overreliance on formal, didactic approaches in professional development, arguing that these types of programs and initiatives were often “decontextualized . . . and separated from engagement with authentic work experiences” (p. 703). We are not convinced that these black-and-white views of educational development are as meaningful as they could be in deepening our thinking about educational development. And, in fact, even Webster-Wright (2009), in her critique of the professional development literature, concluded that

the implication of this separation is that learning at work is different from learning through attending a professional development workshop. Although the activities may differ, if the professional learns from either or both experiences, then this separation is artificial; a convention reinforced by prevailing discourse. (p. 713)

We have paid close attention to the design of the initiatives in our review, particularly to the internal alignment and coherence of core characteristics of initiatives to support learning. We acknowledge, however, that even a seemingly coherent design, although critically important, does not necessarily result in learning that is subsequently reflected in teaching and student learning. We understand the development process as a dynamic interplay among individual, disciplinary, and organizational elements, mediated by reflection on action. This is consistent with the model of professional growth proposed by Clarke and Hollingsworth (2002) that focuses on K–12 teachers. Holding such a conceptualization of educational development suggests that there are different points of entry and multiple pathways to faculty learning. McAlpine, Amundsen, Clement, and Light (2009) suggested that ultimately change is an interplay between thinking and practice:

Changes in teaching practice (theories in use) may lead to changes in thinking about teaching and learning (espoused theories) or changes in thinking about teaching and learning (espoused theories) may lead to changes in teaching practice (theories in use). (p. 272)

One critical observation we make when looking across the initiatives in the six clusters of our framework is how infrequently these researchers explicitly recognize the broader context in which faculty teach and, therefore, in which educational development happens. Only a few initiatives were explicit about the academic and social context in which faculty work and in which new knowledge must be embedded, practiced, and refined (Boud & Walker, 1998; Trowler & Cooper, 2002). This may be an illustration of Webster-Wright’s (2009) contention that much of the research and practice in professional development considers “the professional and learning context as separate though related” (p. 712).

We tend to align with Boud (1999), who recognized that most “development takes place where faculty spend most of their time, departments, professional settings and research sites” (p. 3) and contended that these informal learning experiences may have a more profound influence on the understanding of teaching and learning and on teaching practice than organized activities labeled as development. At the same time, Boud (1999) recognized the benefit of well-designed outside learning to challenge taken-for-granted assumptions and provide innovative approaches, but he argued that “formalized approaches are also usefully conceptualized as being located in sites of academic practice” (p. 3).

We interpret Boud’s (1999) words as essentially meaning that there is a place and purpose for different paths to improvement of teaching and learning but that all must take account of the situated and social nature of teaching. At this point in time, we know more about how to design educational development initiatives to improve individual teaching practice but less about how this learning is actualized and embedded in the academic workplace (Åkerlind, 2007, 2008; Dall’Alba & Sandberg, 2006; Fitzmaurice, 2010; Menges & Austin, 2001; Zellers, Howard, & Barcic, 2008).

Conclusion

We believe that the narrow questions about impact and effectiveness of educational development asked by previous empirical reviews of the educational development literature may not be the best questions to ask to more deeply understand practice and to build a sound foundation for further practice and research. Kennedy (2007) reminded us that empirical reviews are necessarily narrow, and this can make them less “informative, for such reviews are likely to eliminate studies that introduce new ideas, use new methodologies, or use unique methodologies” (p. 146). We believe that this is the case with previous empirical reviews. We argue that the primary attention in previous empirical reviews to how the impact of educational development initiatives is assessed has led to the exclusion of many articles that present educational initiatives that focus on the learning process rather than on specific learning outcomes. It is noteworthy that although the extensive overlap of articles in our review with those of the reviews by Steinert et al. (2006) and by Stes et al. (2010) provides some evidence that we are all exploring substantially the same body of work, only one (Hubball et al., 2005) of these common articles fell in a cluster of our framework that focused on process as learning.

We submit that the framework we articulate in this conceptual review is a fruitful way to build a better understanding of the variation and complexity of educational development practice and the thinking underpinning this practice. The framework has the potential to serve as a tool for those involved in the practice of educational development to engage in the analysis of their practice. The conceptually consistent core characteristics of initiatives in each cluster and at least some evidence of common citations to the literature in each cluster suggest that five of our clusters have integrity as descriptors of educational development practice and underlying thinking. The exception may be the discipline focus cluster, which contained only four articles and not enough common citations to meet our criteria of at least three citations in three separate articles. We think that this cluster is conceptually distinct from the others, but we realize that additional examples are needed to distinguish it more clearly, particularly from the reflection focus cluster. In short, our better articulation of practice provides a model of intentional design and reflects the amount and kind of detail that is necessary if, as a field, we hope to build on one another’s research and practice in a meaningful and systematic way.

In identifying the core characteristics of each initiative, we provided information about the intention or goal of an initiative, the processes and activities planned to realize the intention or goal, and the evidence collected to demonstrate success in achieving the intention or goal. How the effectiveness of a particular initiative was determined was but one core characteristic considered as part of the whole design; it was not considered separately from the design itself, as in previous reviews. We think this is a critical point to make. Developing a better understanding of effective practice must be done in consideration of the whole of the design. The clusters of the framework we have articulated are not meant to be compared to one another in terms of effectiveness; indeed, meaningful comparison may be impossible. We have, however, provided at least a possible inroad toward comparing “apples with apples” (Desimone, 2009) in the quest to answer an elusive question: What are the features of educational development that make it effective? New research approaches drawn from the expanding methodologies of metasynthesis and metastudy could provide the means for this further examination (Paterson, Thorne, Canam, & Jillings, 2001).

We reflect on the work we have done in this study and ask ourselves if the literature we identified is evidence that educational development has become a more well-defined field of inquiry since Levinson-Rose and Menges (1981), authors of the first empirical review, provided the following critique:

A well-defined field of inquiry should draw upon coherent theory, subscribe to high standards of research, and build upon previous research in a systematic way. By these criteria, research on improving college teaching is not a well-defined field. For most studies, the basis in theory is strained and for some it is nonexistent. (p. 418)

Our review identified literature between 1995 and 2008, a reasonable amount of time to reflect any changes from the body of literature reviewed by Levinson-Rose and Menges. We did find, unlike Levinson-Rose and Menges, some consistency in the conceptual, theoretical, and empirical literature cited by articles within each of the clusters of our framework. We also found 16 authors who were cited as first authors three or more times in at least two different clusters in our model (see Table 7 in Appendix B online for details). This suggests some consistency and integrity in the field as a whole. Some of these multiple most-cited authors referenced classic theoretical works (e.g., J. B. Biggs; M. S. Knowles; D. A. Kolb; J. Lave; P. Ramsden; D. A. Schön), providing further evidence that this field is becoming more grounded in an established theory base.

The separation of the literature of educational development in the medical sciences from that in other academic contexts, noted by Steinert et al. (2006), was reconfirmed in this review. There is little evidence of citations across these two bodies of literature or the relatively more developed professional development literature focusing on K–12 teachers. These three contexts for educational development remain in quite separate silos, with very little sharing of experience. This discourages the cross-pollination required to build on relevant research.

We also did not find much evidence of practice systematically building on other published reports of practice, a finding consistent with Levinson-Rose and Menges’s (1981) critique, although there was more evidence of this in some clusters of our framework than others (skill, action research or inquiry). This may be partially attributed to Stes et al.’s (2010) contention that practice is not described well or in enough detail in the literature to allow others to build on it. We tried in our review to redress this by excluding articles without sufficient detail of the practice and providing a model for what to include so that others will be able to build on it.

Our review has a number of limitations. Using only published literature means that we have undoubtedly missed many diverse and interesting examples of educational development practice that have not been formally documented. We reviewed only articles describing initiatives at medical or doctoral and comprehensive universities, our primary context of interest, and so surely missed interesting examples of educational development practices in colleges and polytechnics. Finally, we could examine educational development initiatives only one by one as published articles, so we could not know the whole of educational development practice or thinking as it actually happens in any one place.

Our conceptual review has provided a broadened view of educational development practice and the thinking underpinning it. The six-cluster framework provides a new way of thinking about the design of practice and a more meaningful basis on which to investigate the effectiveness of educational development practice. The review also provides some guidance for the continued development of this field.

Appendix A
Papers in the Review (n = 137) Organized by the Six Clusters

Skill Focus

Bahar-Ozvaris, S., Aslan, D., Sahin-Hodoglugil, N., & Sayek, I. (2004). A faculty development program evaluation: From needs assessment to long-term effects of the teaching skills improvement program. Teaching and Learning in Medicine, 16, 368-375. doi:10.1207/s15328015tlm1604_11

Bardella, I. J., Janosky, J., Elnicki, D. M., Ploof, D., & Kolarik, R. (2005). Observed versus reported precepting skills: Teaching behaviours in a community ambulatory clerkship. Medical Education, 39, 1036-1044. doi:10.1111/j.1365-2929.2005.02269.x

Dennick, R. (2003). Long-term retention of teaching skills after attending the teaching improvement project: A longitudinal, self-evaluation study. Medical Teacher, 25, 314-318. doi:10.1080/0142159031000100436

Gelula, M. H., & Yudkowsky, R. (2003). Using standardised students in faculty development workshops to improve clinical teaching skills. Medical Education, 37, 621-629. doi:10.1046/j.1365-2923.2003.01556.x

Hativa, N. (1995). The department-wide approach to improving faculty instruction in higher education: A qualitative evaluation. Research in Higher Education, 36, 377-413. doi:10.1007/BF02207904


Hativa, N. (2000). Becoming a better teacher: A case of changing the pedagogical knowledge and beliefs of law professors. Instructional Science, 28, 491-523. doi:10.1023/A:1026521725494

MacKinnon, M. M. (2001). Using observational feedback to promote academic development. International Journal for Academic Development, 6, 21-28. doi:10.1080/13601440110033689

Nasmith, L., & Steinert, Y. (2001). The evaluation of a workshop to promote interactive lecturing. Teaching and Learning in Medicine, 13, 43-48. doi:10.1207/S15328015TLM1301_8

Notzer, N., & Abramovitz, R. (2008). Can brief workshops improve clinical instruction? Medical Education, 42, 152-156. doi:10.1111/j.1365-2923.2007.02947.x

Ottolini, M. C., Cuzzi, S., Tender, J., Coddington, D. A., Focht, C., Patel, K. M., & Greenberg, L. (2007). Decreasing variability in faculty ratings of student case presentations: A faculty development intervention focusing on reflective practice. Teaching and Learning in Medicine, 19, 239-243. doi:10.1080/10401330701366390

Piccinin, S., Cristi, C., & McCoy, M. (1999). The impact of individual consultation on student ratings of teaching. International Journal for Academic Development, 4, 75-88. doi:10.1080/1360144990040202

Piccinin, S., & Moore, J. (2002). The impact of individual consultation on the teaching of younger versus older faculty. International Journal for Academic Development, 7, 123-134. doi:10.1080/1360144032000071323

Preston-Whyte, M. E., Fraser, R. C., & McKinley, R. K. (1998). Teaching and assessment in the consultation: A hospital clinicians’ preparatory workshop for integrated teaching of clinical method to undergraduate medical students. Medical Teacher, 20, 266-267. doi:10.1080/01421599881066

Wong, J. G., & Agisheva, K. (2007). Developing teaching skills for medical educators in Russia: A cross-cultural faculty development project. Medical Education, 41, 318-324. doi:10.1111/j.1365-2929.2006.02676.x

Method Focus

Amin, Z., Khoo, H. E., Gwee, M., Tan, C. H., & Koh, D. R. (2006). Addressing the needs and priorities of medical teachers through a collaborative intensive faculty development programme. Medical Teacher, 28, 85-88. doi:10.1080/01421590500314124

Baroffio, A., Nendaz, M. R., Perrier, A., Layat, C., Vermeulen, B., & Vu, N. V. (2006). Effect of teaching context and tutor workshop on tutorial skills. Medical Teacher, 28, e112-e119. doi:10.1080/01421590600726961

Bellows, L., & Danos, J. R. (2003). Transforming instructional development: Online workshops for faculty. To Improve the Academy: Resources for Faculty, Instructional, and Organizational Development, 21, 160-178.

Breda, J., Clement, M., & Waeytens, K. (2003). An interactive training programme for beginning faculty: Issues of implementation. International Journal for Academic Development, 8, 91-104. doi:10.1080/1360144042000277964

Burton, L. (1997). Overcoming the inertia of traditional instruction: Final report of the social work faculty development program at Andrews University. Presented to the Chairs Council, The College of Arts & Sciences, Andrews University, Berrien Springs, MI.


Cahn, D. D. (1999, November). Faculty development at SUNY: Shifting from teaching to learning. Paper presented at the annual meeting of the National Communication Association, Chicago, IL.

Chalmers, D., & Fuller, R. (1999). Research and a professional development programme on teaching learning strategies as part of course content. International Journal for Academic Development, 4, 28-33. doi:10.1080/1360144990040105

Clegg, S., Konrad, J., & Tan, J. (2000). Preparing academic staff to use ICTs in support of student learning. International Journal for Academic Development, 5, 138-148. doi:10.1080/13601440050200743

Crang-Svalenius, E., & Stjernquist, M. (2005). Applying the case method for teaching within the health professions–teaching the teachers. Medical Teacher, 27, 489-492. doi:10.1080/01421590500136154

Eisen, A., & Barlett, P. (2006). The Piedmont Project: Fostering faculty development toward sustainability. Journal of Environmental Education, 38, 25-36. doi:10.3200/JOEE.38.1.25-36

Fidler, P. P., Neururer-Rotholz, J., & Richardson, S. (1999). Teaching the freshman seminar: Its effectiveness in promoting faculty development. Journal of the First-Year Experience and Students in Transition, 11, 59-74.

Gallos, M., Berg, E., & Treagust, D. (2005). The effect of integrated course and faculty development: Experiences of a university chemistry department in the Philippines. International Journal of Science Education, 27, 985-1006. doi:10.1080/09500690500038447

Gerrard, C. (2005). The evaluation of a staff development (pilot) programme for online tutoring: A case study. Campus-Wide Information Systems, 22, 148-153. doi:10.1108/10650740510606144

Gold, S. (2001). A constructivist approach to online training for online teachers. Journal of Asynchronous Learning Networks, 5, 35-57.

Hammerschlag, R., Lasater, K., Salanti, S., & Fleishman, S. (2008). Research scholars program: A faculty development initiative at the Oregon College of Oriental Medicine. Journal of Alternative and Complementary Medicine, 14, 437-443. doi:10.1089/acm.2007.0813

Koehler, M. J., & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research, 32, 131-152. doi:10.2190/0EW7-01WB-BKHL-QDYV

Lavoie, D., & Rosman, A. J. (2007). Using active student-centered learning-based instructional design to develop faculty and improve course design, delivery, and evaluation. Issues in Accounting Education, 22, 105-118. doi:10.2308/iace.2007.22.1.105

Littlejohn, A. H. (2002). Improving continuing professional development in the use of ICT. Journal of Computer Assisted Learning, 18, 166-174. doi:10.1046/j.0266-4909.2001.00224.x

Littlejohn, A., & Sclater, N. (1998, November). Overcoming conceptual barriers to the use of internet technology in university education. Proceedings of the WebNet 98 World Conference of the WWW, Internet and Intranet. Orlando, FL.

Matthew-Maich, N., Mines, C., Brown, B., Lunyk-Child, O., Carpio, B., Drummond-Young, M., et al. (2007). Evolving as nurse educators in problem-based learning through a community of faculty development. Journal of Professional Nursing, 23, 75-82. doi:10.1016/j.profnurs.2006.07.004


McAlpine, L., & Winer, L. (2002). Sustainable faculty development: An Indonesian case study. Innovations in Education and Teaching International, 39, 205-216. doi:10.1080/13558000210150027

McShannon, J., Hynes, P., Nirmalakhandan, N., Venkataramana, G., Ricketts, C., Ulery, A., & Steiner, R. (2006). Gaining retention and achievement for students program: A faculty development program. Journal of Professional Issues in Engineering Education and Practice, 132, 204-208. doi:10.1061/(ASCE)1052-3928(2006)132:3(204)

Meacham, J., & Ludwig, J. (2001). Faculty and students at the center: Faculty development for general education courses. The Journal of General Education, 50, 254-269. doi:10.1353/jge.2001.0027

Meskill, C., & Anthony, N. (2007). Learning to orchestrate online instructional conversations: A case of faculty development for foreign language educators. Computer Assisted Language Learning, 20, 5-19. doi:10.1080/09588220601118487

Nasmith, L., & McAlpine, L. (1995). Teaching by case definition: A faculty development intervention. Medical Teacher, 17, 419-430. doi:10.3109/01421599509036779

Pinheiro, S. O., Rohrer, J. D., & Heimann, C. F. L. (1998, April). Assessing change in the teaching practice of faculty in a faculty development program for primary care physicians: Toward a mixed method evaluation approach. Paper presented at the American Educational Research Association Annual Conference, San Diego, CA.

Pololi, L., Clay, M. C., Lipkin Jr., M., Hewson, M., Kaplan, C., & Frankel, R. M. (2001). Reflections on integrating theories of adult education into a medical school faculty development course. Medical Teacher, 23, 276-283. doi:10.1080/01421590120043053

Premkumar, K., & Bonnycastle, D. (2006). Games as active learning strategies: A faculty development workshop. Medical Education, 40, 1129. doi:10.1111/j.1365-2929.2006.02595.x

Regan-Smith, M., Hirschmann, K., & Iobst, W. (2007). Direct observation of faculty with feedback: An effective means of improving patient-centered and learner-centered teaching skills. Teaching and Learning in Medicine, 19, 278-286. doi:10.1080/10401330701366739

Stes, A., Clement, M., & Van Petegem, P. (2007). The effectiveness of a faculty training programme: Long-term and institutional impact. International Journal for Academic Development, 12, 99-109. doi:10.1080/13601440701604898

Stevenson, C. B., Duran, R. L., Barrett, K. A., & Colarulli, G. C. (2005). Fostering faculty collaboration in learning communities: A developmental approach. Innovative Higher Education, 30, 23-36. doi:10.1007/s10755-005-3293-3

Vanhanen, H., Pitkälä, K., Puolakkainen, P., Strandberg, T. E., & Lonka, K. (2001). The problem-based learning tutorial laboratory - a method for training medical teachers. Medical Teacher, 23, 99-101. doi:10.1080/01421590125201

Whittier, D., & Lara, S. (2006). Preparing tomorrow’s teachers to use technology (PT3) at Boston University through faculty development: Assessment of three years of the project. Technology, Pedagogy and Education, 15, 321-335. doi:10.1080/14759390600923816

Institutional Focus

Asmar, C. (2002). Strategies to enhance learning and teaching in a research-extensive university. International Journal for Academic Development, 7, 18-30. doi:10.1080/13601440210156448


Athey, S., & Hoffman, K. D. (2007). The master teacher initiative: A framework for faculty development. Marketing Education Review, 17, 1-9.

Baxley, E. G., Probst, J. C., Schell, B. J., Bogdewic, S. P., & Cleghorn, G. D. (1999). Program-centered education: A new model for faculty development. Teaching and Learning in Medicine, 11, 94-99. doi:10.1207/S15328015TL110207

Boucher, B. A., Chyka, P. A., Fitzgerald, W. L., Hak, L. J., Miller, D. D., Parker, R. B., Phelps, S. J., Wood, G. C., & Gourley, D. R. (2006). A comprehensive approach to faculty development. American Journal of Pharmaceutical Education, 70.

Bramson, R., Vanlandingham, A., Heads, A., Paulman, P., & Mygdal, W. (2007). Reaching and teaching preceptors: Limited success from a multifaceted faculty development program. Family Medicine, 39, 386-388.

Camblin Jr., L. D., & Steger, J. A. (2000). Rethinking faculty development. Higher Education, 39, 1-18. doi:10.1023/A:1003827925543

Carey, K., & Glander, C. L. (1996, May). Multiple-campus assessment of general education: A course-embedded approach. Paper presented at the annual forum of the Association for Institutional Research, Albuquerque, NM.

Dalrymple, K. R., Wuenschell, C., & Shuler, C. F. (2006). Development and implementation of a comprehensive faculty development program in PBL core skills. Journal of Dental Education, 70, 948-955.

Edwards, H., Webb, G., & Murphy, D. (2000). Modeling practice – academic development for flexible learning. International Journal for Academic Development, 5, 149-155. doi:10.1080/13601440050200752

Fox, M., & Helford, P. (1999). Advancing the boundaries of higher education in Arizona using the world wide web. Interactive Learning Environments, 7, 155-174. doi:10.1076/ilee.7.2.155.7426

Grubb, A., & Hines, P. (1999). Innovative on-line instructional strategies: Faculty members as distance learners. (Report). Georgia College and State University. (Email: [email protected]).

Hall, R., Harding, D., & Ramsden, C. (2001). Priming institutional change through effective project management: A case study of the chic project. International Journal for Academic Development, 6, 152-161. doi:10.1080/13601440110033661

Herrmann, M., Lichte, T., Von Unger, H., Gulich, M., Waechtler, H., Donner-Banzhoff, N., et al. (2007). Faculty development in general practice in Germany: Experiences, evaluations, perspectives. Medical Teacher, 29, 219-224. doi:10.1080/01421590701299231

Ives, C., McWhaw, K., & De Simone, C. (2005). Reflections of researchers involved in the evaluation of pedagogical technological innovations in a university setting. Canadian Journal of Higher Education, 35, 61-84.

Johnston, S. (1997). Preparation for the role of teacher as part of induction into faculty life and work. New Directions for Teaching and Learning, 72, 31-39. doi:10.1002/tl.7204

Kirkpatrick, D. (2001). Staff development for flexible learning. International Journal for Academic Development, 6, 168-176. doi:10.1080/713769268

Leh, A. S. C. (2005). Lessons learned from service learning and reverse mentoring in faculty development: A case study in technology training. Journal of Technology and Teacher Education, 13, 25-41.


Lueddeke, G. R. (1997). Preparing academics for teaching in higher education: Towards an institutional model of professional practice. Reflections on Higher Education, 9, 51-75.

Major, C. H. (2002). Problem-based learning in general education at Samford University: A case study of changing faculty culture through targeted improvement efforts. The Journal of General Education, 51, 235-256. doi:10.1353/jge.2003.0015

Manwell, L. B., Pfeifer, J., & Stauffacher, E. A. (2006). An interdisciplinary faculty development model for the prevention and treatment of alcohol use disorders. Alcoholism: Clinical & Experimental Research, 30, 1393-1399. doi:10.1111/j.1530-0277.2006.00166.x

McLoughlin, C. (2000). Creating partnerships for generative learning and systemic change: Redefining academic roles and relationships in support of learning. International Journal for Academic Development, 5, 116-128. doi:10.1080/13601440050200725

Murray, I., & Savin-Baden, M. (2000). Staff development in problem-based learning. Teaching in Higher Education, 5, 107-126. doi:10.1080/135625100114993

Newton, J. (2003). Implementing an institution-wide learning and teaching strategy: Lessons in managing change. Studies in Higher Education, 28, 427-441. doi:10.1080/0307507032000122279

Petrone, M. C. (2004). Supporting diversity with faculty learning communities: Teaching and learning across boundaries. New Directions for Teaching and Learning, 97, 111-125. doi:10.1002/tl.138

Radloff, A., de la Harpe, B., & Wright, L. (2000). A professional development program to help academic staff to foster student self-directed learning. In A. Herrmann and M. M. Kulski (Eds.), Flexible futures in tertiary teaching. Proceedings of the 9th Annual Teaching Learning Forum, February 2-4, Perth: Curtin University of Technology.

Rosenbaum, M. E., Lenoch, S., & Ferguson, K. J. (2005). Outcomes of a teaching scholars program to promote leadership in faculty development. Teaching and Learning in Medicine, 17, 247-253. doi:10.1207/s15328015tlm1703_8

Schreurs, M., Roebertsen, H., & Bouhuijs, P. A. J. (1999). Leading the horse to the water: Teacher training for all teachers in a faculty of health sciences. International Journal for Academic Development, 4, 115-123. doi:10.1080/1360144990040206

Schweitzer, L., & Eells, T. D. (2007). Post-tenure review at the University of Louisville School of Medicine: A faculty development and revitalization tool. Academic Medicine: Journal of the Association of American Medical Colleges, 82, 713-717.

Sharp, S., & McLaughlin, P. (1997). Disseminating development initiatives in British higher education: A case study. Higher Education, 33, 309-329. doi:10.1023/A:1002959730812

Shih, M., & Sorcinelli, M. D. (2007). Technology as a catalyst for senior faculty devel-opment. Journal of Faculty Development, 21, 23-31.

Smith, J., & Oliver, M. (2000). Academic development: A framework for embedding learning technology. International Journal for Academic Development, 5, 129-137. doi:10.1080/13601440050200734

Steinert, Y., Cruess, R. L., Cruess, S. R., Boudreau, J. D., & Fuks, A. (2007). Faculty development as an instrument of change: A case study on teaching professionalism. Academic Medicine: Journal of the Association of American Medical Colleges, 82, 1057-1064.


Stigmar, M. (2008). Faculty development through an educational action programme. Higher Education Research and Development, 27, 107-120. doi:10.1080/07294360701805242

Verhesschen, P., & Verburgh, A. (2004). The introduction of the bachelor-master’s structure at the K. U. Leuven: Challenges and opportunities for faculty development. International Journal for Academic Development, 9, 133-152. doi:10.1080/1360144042000334636

Villar, L. M., & Alegre, O. M. (2007). Measuring the learning of university teachers following online staff development courses: A Spanish case study. International Journal of Training and Development, 11, 200-213. doi:10.1111/j.1468-2419.2007.00281.x

Weaver, B. E., & Nilson, L. B. (2005). Laptops in class: What are they good for? What can you do with them? New Directions for Teaching and Learning, 101, 3-13. doi:10.1002/tl.181

Winton, P. J., & Catlett, C. (1997). Southeastern Institute for faculty training: A training model for systems change. A final report. Early education programs for children with disabilities. (Grant No. H024P20002) Washington, DC: US Department of Education.

Reflection Focus

Amundsen, C., Weston, C., & McAlpine, L. (2008). Concept mapping to support university academics’ analysis of course content. Studies in Higher Education, 33, 633-652. doi:10.1080/03075070802373180

Beaty, L. (1999). Consultation through action learning. New Directions for Teaching and Learning, 79, 51-58. doi:10.1002/tl.7906

Bell, M. (2001). Supported reflective practice: A programme of peer observation and feedback for academic teaching development. International Journal for Academic Development, 6, 29-39. doi:10.1080/13601440110033643

Brew, A., & Barrie, S. (1999). Academic development through a negotiated curriculum. International Journal for Academic Development, 4, 34-42. doi:10.1080/1360144990040106

Cole, K. A., Barker, L. R., Kolodner, K., Williamson, P., Wright, S. M., & Kern, D. E. (2004). Faculty development in teaching skills: An intensive longitudinal model. Academic Medicine: Journal of the Association of American Medical Colleges, 79, 469-480.

Cowan, J., & Westwood, J. (2006). Collaborative and reflective professional development: A pilot. Active Learning in Higher Education, 7, 63-71. doi:10.1177/1469787406061149

Cowie, N. (1996). Cooperative development: Professional self-development for language teachers through cooperation with a colleague. Saitama University Review, 32, 121-141.

Dall’Alba, G. (2005). Improving teaching: Enhancing ways of being university teachers. Higher Education Research and Development, 24, 361-372. doi:10.1080/07294360500284771

Frost, S. H., & Jean, P. M. (2003). Bridging the disciplines: Interdisciplinary discourse and faculty scholarship. Journal of Higher Education, 74, 119-149. doi:10.1353/jhe.2003.0013


Halliday, J., & Soden, R. (1998). Facilitating changes in lecturers’ understanding of learning. Teaching in Higher Education, 3, 21-35. doi:10.1080/1356215980030102

Hatem, D. S., Barrett, S. V., Hewson, M., Steele, D., Purwono, U., & Smith, R. (2007). Teaching the medical interview: Methods and key learning issues in a faculty development course. Journal of General Internal Medicine, 22, 1718-1724. doi:10.1007/s11606-007-0408-9

Hatzipanagos, S., & Lygo-Baker, S. (2006). Teaching observations: Promoting development through critical reflection. Journal of Further and Higher Education, 30, 421-431. doi:10.1080/03098770600965425

Ho, A., Watkins, D., & Kelly, M. (2001). The conceptual change approach to improving teaching and learning: An evaluation of a Hong Kong staff development programme. Higher Education, 42, 143-169. doi:10.1023/A:1017546216800

Hubball, H., Collins, J., & Pratt, D. (2005). Enhancing reflective teaching practices: Implications for faculty development programs. The Canadian Journal of Higher Education, 35, 57-81.

Kreber, C. (1999). A course-based approach to the development of teaching-scholarship: A case study. Teaching in Higher Education, 4, 309-325. doi:10.1080/1356251990040301

Kumagai, A. K., White, C. B., Ross, P. T., Purkiss, J. A., O’Neal, C. M., & Steiger, J. A. (2007). Use of interactive theater for faculty development in multicultural medical education. Medical Teacher, 29, 335-340. doi:10.1080/01421590701378662

Light, G., & Calkins, S. (2008). The experience of faculty development: Patterns of variation in conceptions of teaching. International Journal for Academic Development, 13, 27-40. doi:10.1080/13601440701860227

Macdonald, J., & Hills, L. (2005). Combining reflective logs with electronic networks for professional development among distance education tutors. Distance Education, 26, 325-339. doi:10.1080/01587910500291405

November, P. (1997). Learning to teach experientially: A pilgrim’s progress. Studies in Higher Education, 22, 289-299. doi:10.1080/03075079712331380906

Orlander, J. D., Gupta, M., Fincke, B. G., Manning, M. E., & Hershman, W. (2000). Co-teaching: A faculty development strategy. Medical Education, 34, 257-265. doi:10.1046/j.1365-2923.2000.00494.x

Peel, D. (2005). Peer observation as a transformatory tool? Teaching in Higher Education, 10, 489-504. doi:10.1080/13562510500239125

Pereira, M. A. (1999). My reflective practice as research. Teaching in Higher Education, 4, 339-354. doi:10.1080/1356251990040303

Pickering, A. M. (2006). Learning about university teaching: Reflections on a research study investigating influences for change. Teaching in Higher Education, 11, 319-335. doi:10.1080/13562510600680756

Pololi, L. H., & Frankel, R. M. (2005). Humanising medical education through faculty development: Linking self-awareness and teaching skills. Medical Education, 39, 154-162. doi:10.1111/j.1365-2929.2004.02065.x

Quinn, L. (2003). A theoretical framework for professional development in a South African university. International Journal for Academic Development, 8, 61-75. doi:10.1080/1360144042000277946

Sandretto, S., Kane, R., & Heath, C. (2002). Making the tacit explicit: A teaching intervention programme for early career academics. International Journal for Academic Development, 7, 135-145. doi:10.1080/1360144032000071314


Schiller, S. A., Taylor, M. M., & Gates, P. S. (2004). Teacher evaluation within a community of truth: Testing the ideas of Parker Palmer. Innovative Higher Education, 28, 163-186.

Schuerholz-Lehr, S., Caws, C., Van Gyn, G., & Preece, A. (2007). Internationalizing the higher education curriculum: An emerging model for transforming faculty perspectives. Canadian Journal of Higher Education, 37, 67-94.

Stewart, J., & McCormack, C. (1997). Experiencing and supporting change: From lecture to interactive groupwork. Teaching in Higher Education, 2, 167-180.

Wlodarsky, R. (2005). The professoriate: Transforming teaching practices through critical reflection and dialogue. Teaching & Learning: The Journal of Natural Inquiry & Reflective Practice, 19, 156-172.

Disciplinary Focus

MacDonald, I. (2001). The teaching community: Recreating university teaching. Teaching in Higher Education, 6, 153-167. doi:10.1080/13562510120045168

Mathias, H. (2005). Mentoring on a programme for new university teachers: A partnership in revitalizing and empowering collegiality. International Journal for Academic Development, 10, 95-106. doi:10.1080/13601440500281724

Quinlan, K. M., & Akerlind, G. S. (2000). Factors affecting departmental peer collaboration for faculty development: Two cases in context. Higher Education, 40, 23-52. doi:10.1023/A:1004096306094

Rowland, S. (1999). The role of theory in a pedagogical model for lecturers in higher education. Studies in Higher Education, 24, 303-314. doi:10.1080/03075079912331379915

Action Research/Inquiry Focus

Bernacchio, C., Ross, F., Washburn, K. R., Whitney, J., & Wood, D. R. (2007). Faculty collaboration to improve equity, access, and inclusion in higher education. Equity and Excellence in Education, 40, 56-66. doi:10.1080/10665680601066511

Blackwell, R., Channell, J., & Williams, J. (2001). Teaching circles: A way forward for part-time teachers in higher education? International Journal for Academic Development, 6, 40-53. doi:10.1080/13601440110033652

Boud, D. (1999). Situating academic development in professional work: Using peer learning. International Journal for Academic Development, 4, 3-10. doi:10.1080/1360144990040102

Briggs, C. L. (2007). Curriculum collaboration: A key to continuous program renewal. Journal of Higher Education, 78, 676-711.

Buchbinder, S. B., Alt, P. M., Eskow, K., Forbes, W., Hester, E., Struck, M., et al. (2005). Creating learning prisms with an interdisciplinary case study workshop. Innovative Higher Education, 29, 257-274. doi:10.1007/s10755-005-2861-x

Carlson, T., MacDonald, D., Gorely, T., Hanrahan, S., & Burgess-Limerick, R. (2000). Implementing criterion-referenced assessment within a multi-disciplinary university department. Higher Education Research and Development, 19, 103-116. doi:10.1080/07294360050020507

Cornelius, S., & Macdonald, J. (2008). Online informal professional development for distance tutors: Experiences from the Open University in Scotland. Open Learning, 23, 43-55. doi:10.1080/02680510701815319

Cox, M. D. (1999). Peer consultation and faculty learning communities. New Directions for Teaching and Learning, 79, 39-49. doi:10.1002/tl.7905

Garcia, L. M., & Roblin, N. P. (2008). Innovation, research and professional development in higher education: Learning from our own experience. Teaching and Teacher Education: An International Journal of Research and Studies, 24, 104-116. doi:10.1016/j.tate.2007.03.007

Koch, L., Holland, L., Price, D., Gonzalez, G. L., Lieske, P., Butler, A., Wilson, K., & Holly, M. (2002). Engaging new faculty in the scholarship of teaching. Innovative Higher Education, 27, 83-94. doi:10.1023/A:1021153225914

Lynd-Balta, E., Erklenz-Watts, M., Freeman, C., & Westbay, T. D. (2006). Professional development using an interdisciplinary learning circle: Linking pedagogical theory to practice. Journal of College Science Teaching, 35, 18-24.

Mezeske, B. A. (2006). Teaching circles: Low-cost, high-impact faculty development. Academic Leader, 22, 8-8.

Middendorf, J. (2004). Facilitating a faculty learning community using the decoding the disciplines model. New Directions for Teaching and Learning, 98, 95-107. doi:10.1002/tl.151

Orzech, M. (1998). A departmental perspective on educational development. International Journal for Academic Development, 3, 18-23. doi:10.1080/1360144980030104

Reid, A., & Petocz, P. (2003). Enhancing academic work through the synergy between teaching and research. International Journal for Academic Development, 8, 105-117. doi:10.1080/1360144042000277982

Romano, J. L., Hoesing, R., O’Donovan, K., & Weinsheimer, J. (2004). Faculty at mid-career: A program to enhance teaching and learning. Innovative Higher Education, 29, 21-48. doi:10.1023/B:IHIE.0000035365.92454.a5

Saad, A., Uskov, V. L., Cedercreutz, K., Geonetta, S., Spille, J., & Abel, D. (1999, March). Faculty collaboration on multidisciplinary web-based education. Proceedings of the Mid-South Instructional Technology Conference, Murfreesboro, TN.

Trinidad, S., & Albon, R. (2001). Tapping out new rhythms in the journey of learning. In A. Herrmann & M. M. Kulski (Eds.), Expanding horizons in teaching and learning: Proceedings of the 10th Annual Teaching Learning Forum, February 7-9. Perth: Curtin University of Technology.

Wildman, T. M., Hable, M. P., Preston, M. M., & Magliaro, S. G. (2000). Faculty study groups: Solving “good problems” through study, reflection, and collaboration. Innovative Higher Education, 24, 247-263. doi:10.1023/B:IHIE.0000047413.00693.8c

Notes

The initial idea for this review was conceived through discussions with Phillip Abrami, Lynn McAlpine, and Cynthia Weston. We thank Lynn McAlpine and Cynthia Weston for input to an early draft and Rob Dainow for suggestions on later drafts. Review of Educational Research editors and reviewers were extremely helpful in bringing this article to its final form. Funding was provided by the Social Sciences and Humanities Research Council of Canada and the Educational Developers Caucus of the Canadian Society for Teaching and Learning in Higher Education.

1There are other often-cited reviews (e.g., Weimer & Lenze, 1991), but these were not systematically conducted.

2All students working on the review were conducting their theses in the field of educational development and had good working knowledge of the literature. We wish to acknowledge the following members of our team: Greg Hum, Marie Krbavac, Alicia Kronberg, Amrit Mundy, and Qi Zhang.

3See http://oncampus.macleans.ca/education/2010/11/11/measuring-excellence/.

4The analysis of citations identified first authors cited at least three times in at least three articles within a cluster. Thus, first authors were often authors of more than one article cited in the cluster. For example, P. A. Cohen authored three articles used in this cluster (1980, 1981, 1990). See Tables 2–7 in Appendix B online for more detail.

5All references cited in the text of the article are included in the reference list immediately following the text. Therefore, some citations appear in both the reference list and Appendix A.

References

Åkerlind, G. S. (2007). Constraints on academics’ potential for developing as a teacher. Studies in Higher Education, 32, 21–37. doi:10.1080/03075070601099416

Åkerlind, G. S. (2008). A phenomenographic approach to developing academics’ understanding of the nature of teaching and learning. Teaching in Higher Education, 13, 633–644. doi:10.1080/13562510802452350

Argyris, C., & Schön, D. A. (1974). Theory in practice: Increasing professional effectiveness. San Francisco, CA: Jossey-Bass.

Boud, D. (1999). Situating academic development in professional work: Using peer learning. International Journal for Academic Development, 4, 3–10. doi:10.1080/1360144990040102

Boud, D., & Walker, D. (1998). Promoting reflection in professional courses: The challenge of context. Studies in Higher Education, 23, 192–206. doi:10.1080/03075079812331380384

Brookfield, S. D. (1990). The skillful teacher: On technique, trust, and responsiveness in the classroom. San Francisco, CA: Jossey-Bass.

Clarke, D., & Hollingsworth, H. (2002). Elaborating a model of teacher professional growth. Teaching and Teacher Education, 18, 947–967. doi:10.1016/S0742-051X(02)00053-7

Clegg, S. (2009). Forms of knowing and academic practice. Studies in Higher Education, 34, 403–416. doi:10.1080/03075070902771937

Cox, M. D. (2001). Faculty learning communities: Change agents for transforming institutions into learning organizations. To Improve the Academy, 19, 69–93.

Dall’Alba, G., & Sandberg, J. (2006). Unveiling professional development: A critical review of stage models. Review of Educational Research, 76, 383–412. doi:10.3102/00346543076003383

Davidson, M. (2004). Bones of contention: Using self and story in the quest to professionalize higher education teaching—An interdisciplinary approach. Teaching in Higher Education, 9, 299–310.

D’Eon, M., Overgaard, V., & Rutledge, S. (2000). Teaching as social practice: Implications for faculty development. Advances in Health Sciences Education, 5, 151–162. doi:10.1023/A:1009898031033

Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38, 181–199. doi:10.3102/0013189X08331140

Eisen, A., & Barlett, P. (2006). The Piedmont Project: Fostering faculty development toward sustainability. Journal of Environmental Education, 38, 25–36. doi:10.3200/JOEE.38.1.25-36

Evers, F., & Hall, S. (with Britnell, J., Brockerhoff-Macdonald, B., Carter, L., Dawson, D., Kerr, D., Mighty, J., Siddall, J., & Wolf, P.). (2009). Faculty engagement in teaching development activities—Phase 1: Literature review. Toronto, Canada: Higher Education Quality Council of Ontario.

Fitzmaurice, M. (2010). Considering teaching in higher education as a practice. Teaching in Higher Education, 15, 45–55. doi:10.1080/13562510903487941

Garcia, L. M., & Roblin, N. P. (2008). Innovation, research and professional development in higher education: Learning from our own experience. Teaching and Teacher Education, 24, 104–116. doi:10.1016/j.tate.2007.03.007

Gosling, D. (2009). Educational development in the UK: A complex and contradictory reality. International Journal for Academic Development, 14, 5–18. doi:10.1080/13601440802659122

Gregory, J., & Jones, R. (2009). “Maintaining competence”: A grounded theory typology of approaches to teaching in higher education. Higher Education, 57, 769–785. doi:10.1007/s10734-008-9175-8

Halliday, J., & Soden, R. (1998). Facilitating changes in lecturers’ understanding of learning. Teaching in Higher Education, 3, 21–35. doi:10.1080/1356215980030102

Healey, M., & Jenkins, A. (2003). Discipline-based educational development. In H. Eggins & R. Macdonald (Eds.), The scholarship of academic development (pp. 47–47). Philadelphia, PA: Open University Press.

Hubball, H., Collins, J., & Pratt, D. (2005). Enhancing reflective teaching practices: Implications for faculty development programs. Canadian Journal of Higher Education, 35, 57–81.

Huberman, M., & Miles, M. (2002). The qualitative researcher’s companion. Thousand Oaks, CA: Sage.

Kennedy, M. M. (2007). Defining a literature. Educational Researcher, 36, 139–147. doi:10.3102/0013189X07299197

Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler.

Knight, P., Tait, J., & Yorke, M. (2006). The professional learning of teachers in higher education. Studies in Higher Education, 31, 319–339. doi:10.1080/03075070600680786

Koch, L., Holland, L., Price, D., Gonzalez, G. L., Lieske, P., Butler, A., . . . Holly, M. (2002). Engaging new faculty in the scholarship of teaching. Innovative Higher Education, 27, 83–94.

Kreber, C. (2005). Charting a critical course on the scholarship of university teaching movement. Studies in Higher Education, 30, 389–405. doi:10.1080/03075070500160095

Kreber, C., & Cranton, P. (2000). Exploring the scholarship of teaching. Journal of Higher Education, 71, 476–495. doi:10.2307/2649149

Levinson-Rose, J., & Menges, R. J. (1981). Improving college teaching: A critical review of research. Review of Educational Research, 51, 403–434.

Lueddeke, G. R. (2003). Professionalising teaching practice in higher education: A study of disciplinary variation and “teaching-scholarship.” Studies in Higher Education, 28, 213–228. doi:10.1080/0307507032000058082

Mathias, H. (2005). Mentoring on a programme for new university teachers: A partnership in revitalizing and empowering collegiality. International Journal for Academic Development, 10, 95–106. doi:10.1080/13601440500281724

McAlpine, L., Amundsen, C., Clement, M., & Light, G. (2009). Challenging the assumptions of what we do as academic developers. Studies in Continuing Education, 31, 261–280. doi:10.1080/01580370903271461

McAlpine, L., & Weston, C. (2000). Reflection: Issues related to improving professors’ teaching and students’ learning. Instructional Science, 28, 363–385. doi:10.1023/A:1026583208230

McAlpine, L., Weston, C., Timmermans, J., Berthiaume, D., & Fairbank-Roch, G. (2006). Zones: Reconceptualizing teacher thinking in relation to action. Studies in Higher Education, 31, 601–615. doi:10.1080/03075070600923426

McKeachie, W. (1994). Teaching tips. Lexington, MA: D.C. Heath.

McNiff, J., & Whitehead, J. (2006). All you need to know about action research. Thousand Oaks, CA: Sage.

Menges, R. J., & Austin, A. E. (2001). Teaching in higher education. In V. Richardson (Ed.), Handbook of research on teaching (4th ed., pp. 1122–1156). Washington, DC: American Educational Research Association.

Middendorf, J. (2004). Facilitating a faculty learning community using the decoding the disciplines model. New Directions for Teaching and Learning, 98, 95–107. doi:10.1002/tl.151

Nasmith, L., & Steinert, Y. (2001). The evaluation of a workshop to promote interactive lecturing. Teaching and Learning in Medicine, 13, 43–48. doi:10.1207/S15328015TLM1301_8

Neumann, R. (2001). Disciplinary differences and university teaching. Studies in Higher Education, 26, 135–146. doi:10.1080/03075070120052071

Neumann, R., Parry, S., & Becher, T. (2002). Teaching and learning in the disciplinary contexts: A conceptual analysis. Studies in Higher Education, 27, 405–417. doi:10.1080/0307507022000011525

Orlander, J. D., Gupta, M., Fincke, B. G., Manning, M. E., & Hershman, W. (2000). Co-teaching: A faculty development strategy. Medical Education, 34, 257–265. doi:10.1046/j.1365-2923.2000.00494.x

Paterson, B. L., Thorne, S. E., Canam, C., & Jillings, C. (2001). Meta-study of qualitative health research. Thousand Oaks, CA: Sage.

Reid, A., & Petocz, P. (2003). Enhancing academic work through the synergy between teaching and research. International Journal for Academic Development, 8, 105–117. doi:10.1080/1360144042000277982

Richlin, L., & Cox, M. (2004). Developing scholarly teaching and the scholarship of teaching and learning through faculty learning communities. New Directions for Teaching and Learning, 97, 127–135. doi:10.1002/tl.139

Rowland, S. (1999). The role of theory in a pedagogical model for lecturers in higher education. Studies in Higher Education, 24, 303–314. doi:10.1080/03075079912331379915

Rowland, S., & Barton, L. (1994). Making things difficult: Developing a research approach to teaching in higher education. Studies in Higher Education, 19, 367–374. doi:10.1080/03075079412331381940

Sandretto, S., Kane, R., & Heath, C. (2002). Making the tacit explicit: A teaching intervention programme for early career academics. International Journal for Academic Development, 7, 135–145. doi:10.1080/1360144032000071314

Skeff, K. M., Stratos, G. A., Berman, J., & Bergen, M. R. (1992). Improving clinical teaching. Archives of Internal Medicine, 152, 1156–1160. doi:10.1001/archinte.1992.00400180028004

Steinert, Y., Mann, K., Centeno, A., Dolmans, D., Spencer, J., Gelula, M., & Prideaux, D. (2006). A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education. Medical Teacher, 28, 497–526. doi:10.1080/01421590600902976

Stes, A., Min-Leliveld, M., Gijbels, D., & Van Petegem, P. (2010). The impact of instructional development in higher education: The state-of-the-art of the research. Educational Research Review, 5, 25–49. doi:10.1016/j.edurev.2009.07.001

Stewart, J., & McCormack, C. (1997). Experiencing and supporting change: From lecture to interactive groupwork. Teaching in Higher Education, 2, 167–180. doi:10.1080/1356251970020206

Stigmar, M. (2008). Faculty development through an educational action programme. Higher Education Research and Development, 27, 107–120. doi:10.1080/07294360701805242

Trigwell, K., & Shale, S. (2004). Student learning and the scholarship of university teaching. Studies in Higher Education, 29, 523–536. doi:10.1080/0307507042000236407

Trowler, P., & Cooper, A. (2002). Teaching and learning regimes: Implicit theories and recurrent practices in the enhancement of teaching and learning through educational development programmes. Higher Education Research and Development, 21, 221–240. doi:10.1080/0729436022000020742

Webster-Wright, A. (2009). Reframing professional development through understanding authentic professional learning. Review of Educational Research, 79, 702–739. doi:10.3102/0034654308330970

Weimer, M., & Lenze, L. F. (1991). Instructional interventions: A review of the literature on efforts to improve instruction. In J. Smart (Ed.), Higher education: Handbook of theory and research (pp. 294–333). New York, NY: Agathon.

Zellers, D. F., Howard, V. M., & Barcic, M. A. (2008). Faculty mentoring programs: Reenvisioning rather than reinventing the wheel. Review of Educational Research, 78, 552–588. doi:10.3102/0034654308320966

Authors

CHERYL AMUNDSEN is director of the Institute for the Study of Teaching and Learning in the Disciplines and associate professor in the Faculty of Education at Simon Fraser University, 8888 University Drive, Burnaby, British Columbia, Canada; e-mail: [email protected]. She has a long-standing interest in academic development, and her research has focused on how academics develop pedagogical knowledge in relationship to their subject matter and the thinking underlying instructional decisions. Currently, she is also investigating identity development of early-career academics (doctoral students, postdocs, and pretenure faculty).

MARY WILSON is completing her Ph.D. in the curriculum theory and implementation program at Simon Fraser University; e-mail: [email protected]. Her Ph.D. research, a metastudy of instructional development in higher education, reflects her long-standing interest in teaching adults. Her professional work is as an educational developer.