
TA Training Program Report for 2011/2012 Department of Psychology

15 June 2012

Psychology Graduate Student Professional Development Series in Teaching and Teaching Assistance

Photos from TA Development Day 2011 by Ben Cheung, Psychology Graduate Student

Principal Applicant: Dr. Catherine Rawn
Instructor I, Department of Psychology
Email: [email protected]
Fax: (604) 822-6923


1. Description of the delivered curriculum and format of the training program

Over the past four years, the Psychology program has evolved to accommodate the training needs of our graduate student Teaching Assistants and Teaching Fellows (i.e., TAs with teaching roles in our department), as well as the pragmatics of implementing the program in our department specifically. Throughout this process, the program’s mandate has remained consistent: to enhance the efficacy of our teaching assistants (TAs) and teaching fellows (TFs) to benefit our undergraduate students’ education and our graduate students’ professional growth.

As it has since fall 2008, the 2011/2012 program began with a full-day event, the “TA Development Day,” geared primarily toward first-year graduate students who are new to the TA role. Fifteen participants were divided into three groups, each led by an experienced “TA Mentor.” TA Mentors were recruited based on their years of experience in the TA role and their general interest in teaching well; they facilitated discussion and offered their advice and ideas throughout the day’s sessions. I facilitated the full day with the help of Alyssa Croft, a graduate student in Psychology. With the additional help of a volunteer photographer and a logistics coordinator in charge of food (both Psychology graduate students), the day was a success. As in years past, TAs practiced developing professional relationships with students and instructors through role plays, by generating lists of qualities that make TAs great, by setting goals for themselves, and by grappling with the challenge of developing grading rubrics. All of these activities were designed specifically to address the learning objectives we set out in our prior proposal (which remain roughly the same this year).

One key change was made this past year to the TA Development Day program in response to feedback from 2010/2011 and earlier years. We had previously ended the day with a quiet reflection session in which TAs could write down their take-home messages. The problem was that people (myself included) were too tired by then to be motivated to reflect. So this year we designed a short multiple choice test of some of the day’s key facts (e.g., grading policies, best practices for answering student questions). To raise the energy level we ran it Team Based Learning style: once individually, and then once in teams with scratch cards (depicted in the lower left panel of the photo collage on the cover of this report). It was a fun and energizing way to end the day. Both new TAs and mentors enjoyed the session, and it yielded enlightening data (see below), so we will continue it in 2012.

In addition to TA Development Day, we held a set of concurrent Technology sessions in late September, during which TAs could drop in to learn about WebCT Vista, using the Scantron machine, etc.

After that, I hit one snag in implementing the TA program. I had proposed a two-hour refresher session in January to focus on discussing challenges and successes from the previous term, and on brainstorming learner-centred options for how to address them next time. Unfortunately, I was unable to follow through on this commitment. One change I will make for next year is to formalize a committee that meets to assist me in implementing these meetings.

The Teaching Fellow Development Group has also met with some success. This group has primarily discussed critical incidents people are encountering as they grapple with teaching students. I am continuing to meet with the group monthly. Attendance has dropped this year, ranging from 3 to 9 people (average of 5, which is down 3 from the previous year). Based on recollection and a cursory review of minutes from these meetings, we met two of the three key objectives I proposed last year: to “build a teaching support network and a learner-centred language for discussing challenges,” and to “consider learner-centred approaches to learning assessments, lesson planning, honouring diversity in the classroom, course design, and the use of technology.” The diversity of experiences among participants has led to some rich (albeit relatively unstructured) discussions as we have grappled with the day-to-day complexities of teaching well. The remaining objective, to build a draft of a professional portfolio, was an ongoing goal and point of discussion, but few participants actually completed a full draft. To address this need (an interest TFs have expressed in previous years) and others, I will be teaching a course on the Teaching of Psychology (Psyc 508) in January that requires completion of a teaching portfolio as a major component.

2. Adjustments to the program based on feedback collected

Re-create the grading and evaluation lesson so that it provides more opportunities for participants to use rubrics and receive feedback on their use of them.

Add a critical incident simulation that requires participants to explain the grading norms in the psychology department as they would to an undergraduate student.

Keep the Team Based Learning style individual and team test at the end of TA Development Day because it yielded rich data and the team version seemed to help clear up individual difficulties. It was also an energizing way to end the day.

Invite the Mentors to complete the Peer Mentor workshop to be offered by Joseph Topornycky at CTLT to potentially increase their positive impact.

Come up with a way to gather more data at the time of registration as well as at the Technology Workshop. Increase attendance at the Technology Workshop.

Based on informal feedback from Teaching Fellows throughout the past few years, the major addition to the program for 2012 will be the Teaching of Psychology graduate course I will be offering in January (Psyc 508).

3. The total number of participants and elements of the program completed

In total, across the two series (TA and TF), we engaged 26 of approximately 89 TAs/TFs in our Department. This 29% participation rate is down from earlier years (34% in 2010/2011, 45% in 2009/2010, 43% in 2008/2009). I suspect this is partly due to another small entering cohort (17 rather than the usual 20+), a need to refresh the TF program, and the absence of the January follow-up meeting. (Note: Because some people participated in multiple program elements below, the totals add to more than 26.)

Teaching Assistant Development Day (3 September 2011)

13 new Teaching Assistants (76% of our 17 new Psychology Teaching Assistants) plus 2 experienced Teaching Assistants participated

3 experienced Teaching Assistants participated as mentors

2 experienced Teaching Assistants participated as logistics support and photographer, respectively

1 experienced Teaching Assistant participated as a co-facilitator (Alyssa Croft)

Total: 21 Psychology TAs were involved in TA Development Day

Teaching Assistant Development Technology Workshop (22 September 2011)

7 of the 17 new Teaching Assistants participated, plus 2 returning TAs

All 3 of the experienced Teaching Assistants came as part of their ongoing role as mentors

1 experienced Teaching Assistant returned as logistics support (Alyssa Croft)

Total: 13 Psychology TAs returned to participate in the Technology Workshop


Teaching Fellow Development Group Meetings

Seven meetings occurred: Early October (9 participants), late October (4 participants), November (3 participants), December (6 participants), January (5 participants), February (4 participants) and May (3 participants)

Of the 12 self-declared Teaching Fellows in our department in 2011/2012, 9 Teaching Fellows attended at least one meeting (3 attended one meeting; 5 attended five meetings; 1 attended six meetings).

Total: 9 Psychology TFs (75% of our 12 TFs) were involved in the TF Development Group

4. Brief summary of expenditures in comparison to the allocated funds

Expense | Amount
TA Development Day: Honoraria (1 facilitator @ $700, 3 mentors @ $200, 1 logistics coordinator @ $100) | $1400.00
TA Development Day: Materials (markers, handbooks, pens, tape, index cards, etc.) | $137.08
TA Development Day: Food and snacks | $429.53
TF Development Group: Monthly refreshments (7 meetings, average 4-5 participants each) | $181.72
TOTAL | $2148.33
Allocated funds | $3438.75
Remaining (remainder estimated in proposal for next year = $1275.31) | $1290.42

5. An assessment of the program, including data (quantitative and qualitative) from short-term measures demonstrating the degree of achievement of expected learning outcomes

Demonstrating the degree of achievement of expected learning outcomes is a complex task. Last year I developed evaluation materials for 2011/2012 and received research ethics approval, in case they ultimately culminate in publishable data. So far, program evaluation efforts have focused on TA Development Day, given that it is the critical introduction to the TAship (and is, for some graduate students, the only TA/teaching training they will ever pursue). Evaluation includes subjective feedback on how helpful the day was, results from a quiz based on learning objectives, and subjective reports of feeling prepared for various duties. Data suggest that TA Development Day is gaining traction toward some of our learning objectives (especially developing professional relationships with students and instructors), but that others need to be addressed differently (especially the use of grading rubrics and explaining departmental grading norms).

5.1 Data Offering Insight into Learning Objectives

5.1.1 Multiple Choice Test Results: Overview

After a few years of trying various ways to close the day’s events (e.g., an open question and answer period, time for personal reflection), we decided to try a technique inspired by Team Based Learning methods. Specifically, we designed a 14-item multiple choice quiz to test some of the more measurable learning objectives that guide the day. Participants completed this quiz once on their own as a basic knowledge test, and then completed it again in their small groups using a scratch card so they received immediate feedback. On the team version, one small group scored perfectly, and the other two groups got all but one answer correct on their second attempt. Given that the average individual score was only 85.6%, the experience of the team test may have clarified incorrect responses for the people who did not score perfectly on their own (i.e., all but one person).

5.1.2 Multiple Choice Test Results: Indicators Suggesting Successfully Addressed Learning Objectives

Eight items on this test were answered correctly by 14 or 15 out of 15 respondents, suggesting that the learning objectives addressed by these items were at least partially met. (Of course, one alternative explanation is that these items were too easy. We will revisit these items to evaluate their difficulty.) Most of these items addressed learning objectives concerning ways to maintain professional and effective interactions with faculty and students.

Learning Objective | Test Item Topic
Generate strategies for dealing with instructors & diverse students effectively. | Identify a situation in which you should seek outside assistance when dealing with a student (93.3% correct)
Practice communication skills in the context of dealing with challenging situations with students and instructors. | Identify which is the least professional way of responding to questions to which you do not know the answer (93.3% correct); Identify the most professional way of handling a situation in which you notice an error in the grades file you sent to the professor (100% correct)
List critical skills required of a TA (such as organization, communication, professionalism, listening) along with specific behaviors you can do to demonstrate these skills. | Identify the importance of tracking hours (100% correct); Identify a specific example of how you could cultivate a professional relationship with the course instructor (100% correct)
State what teaching opportunities are available to you as a graduate student in psychology. | Identify which opportunity to gain teaching experience is unavailable until PhD candidacy (100% correct)
Apply Robert Bjork’s concept of desirable difficulties to generate realistic and effective strategies for assisting diverse students to develop study skills. | Identify a reason why simply rewriting lecture notes may not be a sufficiently good study tool (100% correct)
Identify resources on campus for students who have specific academic and non-academic needs. | Identify which location on campus would be most appropriate to direct a student experiencing a life crisis (100% correct)


5.1.3 Multiple Choice Test Results: Indicators Suggesting Learning Objectives Not Met Well Enough

One learning objective was clearly not met well enough, judging by two test items. Specifically, two items measuring the extent to which participants could identify the required grading norms and their rationale were answered correctly by only 46.67% and 66.67% of participants, respectively. Being able to explain to our students what the grading norms are and why they exist is an important skill, and one that requires tact and thoughtfulness. For 2012, we will add a critical incident simulation that raises this specific issue to reinforce the conversation we have about this policy earlier in the day.

5.1.4 Multiple Choice and Open-Ended Test Results: Ability to Use a Rubric to Grade Written Work

One of the major roles that Teaching Assistants in our department take on is grading written work. Four items, including an opportunity to grade a piece of writing, were included to address the two major learning objectives on this topic:

1. Identify strategies for fair and consistent marking, including use and revision of a rubric.
2. Apply these strategies to samples of student work that represent our psychology courses (papers, exams, and labs).

Performance on these items varied. Almost everyone (93.3% and 86.7% of participants, respectively) could independently identify factors to consider when creating a rubric (e.g., the assignment’s learning goals, grade level, class’s ability) and how to handle a situation in which the mean is too high after grading is complete. However, only 33.3% of participants could identify what Turnitin.com does, and only 20% of participants assigned a grade to a sample paper in the grade range this paper actually received (as graded by PI Rawn’s own TA the year before using the same rubric). Most participants assigned this paper an A, whereas it actually earned a C+. It was not greatly surprising that participants graded this paper very highly; for one thing, they had not had the chance to compare it to 70+ other papers. After they found out it actually earned a C+, we had a lengthy and useful discussion about why that was so.

Investigation of the written comments participants provided proved very useful in interpreting the assigned grades. Participants tended to write a fair amount of commentary (average: 24.5 words; maximum: 50 words; minimum: 1 word, from one participant). To evaluate whether participants picked up on the importance of including both positive comments (i.e., what was done well) and negative comments (i.e., what needs improvement), the comments were coded. On average, participants wrote 1.80 positive and 1.40 negative comments. Eleven of 15 participants (73.3%) wrote more positive than negative comments. Everyone wrote at least one suggestion for improvement or negative comment, but three participants did not write any positive or encouraging comment whatsoever. Given that this paper did in fact have some strengths, and that we had tried to communicate earlier in the day the importance of including at least some positive feedback, this result is disappointing.

Based on these results, along with results below suggesting that people felt less confident than last year’s cohort at using and creating grading rubrics, it is time to rethink the Grading and Evaluation lesson on TA Development Day. Specifically, we will give more opportunities to practice using existing rubrics (rather than emphasizing creating them), and to both give and receive feedback.


5.1.5 Feeling Prepared

Part of a successful TAship is feeling prepared (i.e., having confidence) in one’s ability to demonstrate particular skills. Therefore, participants were asked the following question: When considering your upcoming TA position and duties, how much do you feel prepared to perform each of the following activities? The response scale was 1-5, where 1 was labeled not at all prepared, 3 was labeled moderately prepared, and 5 was labeled very prepared. In 2010, we could compare these items before and after TA Day. However, in 2011, we did not achieve an adequate response rate on the pre-TA Day measure. To help interpret the 2011 results, I compared them to last year’s post-TA Day ratings, and kept all earlier data in the table below. Note that in 2011 I added eight items for which we have no earlier data for comparison.

Unfortunately, participants after TA Day this year felt significantly less confident than last year’s participants at using a grading rubric, creating a grading rubric, and explaining the grading norms in the psychology department. This decreased confidence is consistent with the two areas of concern raised in the TBL quiz results, and underscores the need to refocus these two lessons (see above).

The eight new items show acceptable confidence levels. It is appropriate that participants feel unprepared to use Turnitin.com or the Scantron machine, because they have not yet been taught to do that. These items were included for comparison purposes (i.e., to ensure there was no response bias in reporting feeling prepared for everything), and to prompt participants to consider joining us for the Technology Workshops on these topics in September.

Item | Registration Form Mean 2010 (SD) | Post-TA Day Mean 2010 (SD) | Pre-Post Difference 2010 | Post-TA Day Mean 2011 (SD) | Post 2010 vs. Post 2011 Difference
Interacting effectively with students | 3.57 (0.70) | 4.07 (0.73) | significant increase from pre, p < .05 | 3.80 (0.68) | ns
Interacting effectively with course instructors | 3.85 (0.78) | 4.00 (0.78) | ns | 4.20 (0.56) | ns
Using a grading rubric or key to evaluate written work | 3.34 (1.09) | 4.29 (0.73) | significant increase from pre, p < .01 | 3.27 (0.88) | significant decrease from 2010-post, p < .001
Creating a grading rubric or key to evaluate written work | 2.61 (1.20) | 3.64 (1.01) | ns | 3.00 (0.85) | significant decrease from 2010-post, p < .02
Providing effective written feedback to students | 3.48 (0.77) | 3.86 (0.66) | ns | 3.47 (0.74) | ns
Interacting meaningfully with students for whom English is an additional language | 3.54 (0.76) | 3.50 (0.94) | ns | 3.07 (0.88) | ns
Explaining the grading norms in the psychology department | 2.69 (1.01) | 3.86 (0.77) | significant increase from pre, p < .001 | 3.13 (1.06) | significant decrease from 2010-post, p < .02
Recommending campus resources to students with specific needs | 2.19 (1.17) | 3.93 (0.73) | significant increase from pre, p < .001 | 4.20 (0.78) | ns
Recommending a variety of study strategies for students | 3.35 (0.85) | 4.36 (0.63) | significant increase from pre, p < .05 | 4.33 (0.72) | ns
Tracking the number of TA hours you have worked | 4.00 (0.85) | 4.64 (0.50) | significant increase from pre, p < .05 | 4.40 (0.63) | ns
Admitting mistakes | -- | -- | -- | 4.47 (0.64) | --
Admitting gaps in knowledge | -- | -- | -- | 4.20 (0.78) | --
Asking for help/advice from fellow TAs | -- | -- | -- | 4.40 (0.63) | --
Approaching an instructor for whom you are TAing to discuss responsibilities and expectations | -- | -- | -- | 4.47 (0.52) | --
Maintaining students’ confidentiality | -- | -- | -- | 4.40 (0.63) | --
Keeping student records (e.g., electronic grades files, paper exams) organized and complete | -- | -- | -- | 3.93 (0.80) | --
Using the Scantron machine to score multiple choice exams | -- | -- | -- | 1.67 (1.13) | --
Using Turnitin.com to investigate possible plagiarism for essays | -- | -- | -- | 1.67 (0.82) | --

5.2 Qualitative Feedback Summary

At the end of the day, program participants were asked to answer the following questions:
1. What were the most valuable aspects of this workshop?
2. I would like to suggest the following ideas for improving next year’s TA Development Day… because…
3. What was the most important message you learned today?


5.2.1 Valuable Aspects of the Workshop

Overall, each aspect of the workshop was considered valuable, from the information that was covered in each session to sharing experiences with other TAs. Many TAs valued the tools (e.g., grading rubrics) and resources (e.g., campus information), as well as interacting and discussing ideas with fellow TAs. Note that most participants listed more than one element as valuable, which suggests that the program as a whole is perceived to be valuable.

Common Theme | # Comments 2011 | # Comments 2010 | # Comments 2009 | # Comments 2008 | Illustrative Example of Topic-Specific Comment (2011)
Rubrics/Grading | 6 (25.0%) | 14 (29.8%) | 10 (16.1%) | 26 (24.1%) | Discussion about how to grade effectively.
Handbook/Resources | 3 (12.5%) | 10 (21.3%) | 13 (21.0%) | 17 (15.7%) | The binders with relevant info - it will be an invaluable resource later on.
Interaction with other TAs | 3 (12.5%) | 8 (17.0%) | 6 (9.7%) | 23 (21.3%) | Hearing firsthand experiences.
Critical Skills | 3 (12.5%) | 5 (10.6%) | 13 (21.0%) | 25 (23.1%) | Discussing how to handle student interactions; learning how to admit lack of knowledge.
Simulations | 3 (12.5%) | 5 (10.6%) | -- | -- | Role playing really made me realize what it might be like to have a student in front of me.
Strategies for helping students | 3 (12.5%) | 2 (4.3%) | n/a | n/a | Discussing why students struggle. Learning about the various resources available on campus.
Other | 2 (8.3%) | 1 (2.1%) | 12 (19.4%) | -- | All of the ideas to the different scenarios and multiple options for approaching each problem; explanation of grading norms.
Responsibilities | 1 (4.2%) | 2 (4.3%) | 2 (3.2%) | 13 (12.0%) | Strategies for structuring my TAship, i.e., where to draw the line for email, etc.
Total # of comments | 24 | 47 | 62 | 108 |

5.2.2 Ideas for Improvement

Overall, there were fewer than half as many ideas for improvement as nominations for what went well (Question 1). The majority of the responses to this question dealt with the length of the day. As mentioned earlier, the workshop took place over a full eight-hour day at the start of the fall term, and many found this schedule tiring. Other students suggested different allocations of time or activity type for specific content. Notably, the grading session was raised by two students.


Topic Area | # Comments 2011 | # Comments 2010 | # Comments 2009 | # Comments 2008 | Illustrative Example of Topic-Specific Comment (2011)
Length/Timing | 3 (27.3%) | 6 (25.0%) | 8 (41.1%) | 16 (34.8%) | Content is good but make it more concise (less than 4 hours if possible) because my attention span is short.
Changes to Format | 3 (27.3%) | 5 (20.8%) | 5 (26.3%) | 23 (50.0%) | More independent work/activity and a harder quiz.
Grading | 2 (18.1%) | -- | -- | -- | More feedback on problems with rubrics and practice because I still don’t feel comfortable creating a rubric or grading a paper.
More Information on Topics | 1 (9.1%) | 2 (8.3%) | 4 (21.0%) | 7 (15.2%) | Some info on turnitin and scantron (just basics) for those of us who might use it right away.
Nothing | 1 (9.1%) | 2 (8.3%) | -- | -- | Nothing – it was wonderful.
Total # of comments | 11 | 24 | 19 | 46 |

5.2.3 The Most Important Message

New for 2011, participants were asked to state the most important message that they learned from the day’s session. The two most common themes suggest that participants are aware that support resources are available for them, and that they have developed their thinking about how to approach dealing with students.

Theme | # Comments | All Comments
Resources are available | 4 (28.6%) | "That there are resources there for me if I need them, and also resources for students as well." / "There are people to go to with questions." / "That there are a lot of resources and options and many people are available to help you." / "I am not alone and there are resources and people who can help me develop as a TA."
Approaches to dealing with student situations | 4 (28.6%) | "That even if some situations seem hard/overwhelming (pp. 45-51) it's actually pretty easy to think of effective, good solutions! (Confidence up!)" / "Appropriate ways to handle student crises and that it's ok to admit you are unsure of something and there are a lot of good proactive things that can be done to prevent future issues." / "That there are very few right/wrongs and that we have the potential to positively affect students." / "It's ok to have questions and not know all the answers!"
Clarifying roles | 2 (14.3%) | "Make sure expectations are clear between you and the instructor as well as your students and yourself." / "Setting boundaries, documenting hours, etc."
Professionalism | 2 (14.3%) | "Be professional and prepared." / "Be professional and you will succeed!"
Grading | 1 (7.2%) | "Pay close attention to issues of rubrics/grading before grading assignments, to save a wealth of time and confusion."
Department | 1 (7.2%) | "That the psych department at UBC cares about teaching and wants to support their TAs."
Total # of comments | 14 |

5.3 Quantitative Feedback Summary

Participants were asked to rate aspects of the day which fell into three broad categories: Administration, Facilitation, and Content. Aggregated results are reported below, along with examples of the comments given for each category. Historical data since 2008 are included for comparison purposes. As noted below, statistical analyses using t-tests identified some significant differences between 2010 and 2011; those that are meaningful are discussed. Somewhat arbitrarily we have denoted all means at or above 4.5/5 in dark green to highlight particularly highly rated aspects of TA Development Day, and means that fall below 4 in light pink to highlight areas to consider improving.

5.3.1 OVERALL RATING

Participants were asked to provide their overall impressions of how helpful the day was. Ratings could range from 1 (not helpful) to 5 (very helpful). Of fourteen responses, the most frequently occurring rating was 5 (mean = 4.39, standard deviation = 0.74). The lowest rating given was 3 (moderately helpful).

5.3.2 ADMINISTRATION

Overall, these data show that participants rated all administrative aspects of TA Development Day as helpful. The food and the teamwork among facilitators stand out as consistently rated very highly. As last year, we did not ask about the convenience of the date because there is simply no better time at which we can offer the event.

ADMINISTRATION | 2011 Mean (SD) | 2010 Mean (SD) | 2009 Mean (SD) | 2008 Mean (SD)
Registration procedure | 4.27 (0.88) | 4.00 (0.94) | 4.20 (1.06) | 4.29 (0.78)
Facilities/location | 4.53 (0.64) | 4.71 (0.46) | 4.03 (0.77) | 4.37 (0.85)
Refreshments/food | 4.73 (0.46) | 4.71 (0.53) | 4.68 (0.47) | 4.77 (0.56)
Convenience of date | (not asked) | (not asked) | 4.08 (1.05) | 3.82 (1.16)
Teamwork among facilitators | 4.67 (0.49) | 4.71 (0.46) | 5.00 (0.00) | 4.56 (0.67)
Flow of the workshop | 4.53 (0.52) | 4.36 (0.73) | 4.75 (0.44) | 4.00 (0.85)
Overall Administration Average | 4.55 (0.52) | 4.50 (0.44) | 4.44 (0.40) | 4.30 (0.51)

Notes: Each aspect was rated on a 5-point scale ranging from 1 (Needs Improvement) to 5 (Very Effective). (In 2008, the scale endpoints were 1 (Not Helpful) and 5 (Very Helpful).) Number of responses ranged from 60-62 in 2008, 19-20 in 2009, 28 in 2010, and 15 in 2011.


5.3.3 FACILITATION

Overall, these data suggest that TAs rate the style of TA Development Day as helpful. Means tended to be a little lower in 2011 than in 2010, which in turn were a little lower than in 2009. Interestingly, this drop is occurring alongside lower and lower numbers of participants: our incoming class has simply gotten smaller in each year of this program (and will again in 2012, by the looks of things at this point). Although we are not prepared to re-think the whole program for a smaller group at this time, this is something to watch in the coming year. Despite this drop, all means remain in the “helpful” range, with only two significant decreases (i.e., visual aids and TA Mentors).

FACILITATION | 2011 Mean (SD) | 2010 Mean (SD) | 2009 Mean (SD) | 2008 Mean (SD)
Visual aids | 3.60 (0.63) | 4.11 (0.79) | 4.53 (0.72) | 4.31 (0.72)
Interactive style | 4.67 (0.64) | 4.32 (0.95) | 4.55 (0.83) | 4.23 (0.88)
Modeling of teaching techniques by facilitators | 4.00 (0.88) | 4.19 (0.92) | 4.55 (0.76) | 4.17 (0.83)
Handbook integration in day | 3.60 (0.99) | 4.18 (1.02) | 4.50 (0.69) | 4.47 (0.65)
Handbook as future resource | 4.47 (1.13) | 4.44 (0.89) | 4.95 (0.22) | 4.69 (0.53)
Opportunities for personal reflection | 3.50 (1.35) | 3.78 (1.05) | -- | --
Opportunities for brainstorming | 4.20 (0.92) | 4.46 (0.80) | -- | --
Opportunities for creativity | 3.73 (1.10) | 4.03 (1.07) | -- | --
Diversity of experience and educational background in your small group | 3.73 (0.96) | 4.04 (0.96) | -- | --
Your TA Mentor | 4.21 (0.80) | 4.76 (0.66) | -- | --
Your small group members | 4.21 (0.89) | 4.39 (0.92) | -- | --
Learning from peers | 4.29 (0.83) | 4.38 (0.91) | -- | --
Overall Facilitation Average | 3.99 (0.64) | 4.26 (0.66) | 4.62 (0.33) | 4.37 (0.54)

Notes: Each aspect was rated on a 5-point scale ranging from 1 (Not Helpful) to 5 (Very Helpful). Number of responses ranged from 58-62 in 2008, 19-20 in 2009, 25-28 in 2010, and 14-15 in 2011.

5.3.4 CONTENT

Overall, content is once again rated as helpful by participants. However, it doesn’t seem to have been rated quite as helpful as in years past. I’m not quite sure what to make of that result, but given that all means fall above moderately helpful (3), I’m not too concerned. Nonetheless, two issues stand out from these means.

a. In 2011, we replaced the former open question and answer period at the end of the day with the Team Based Learning style quiz. Although the quiz was not the highest rated element of the day, it was rated (non-significantly) more helpful than the open question and answer period from 2010. Given the rich data it yielded (see above), we are going to keep this TBL style quiz.

b. We were unable to collect an adequate amount of data to evaluate the Technology Workshops in September. That seems to be a recurring theme, and I intend to put more thought into how to promote that event (and its questionnaires) more successfully.


CONTENT | 2011 Mean (SD) | 2010 Mean (SD) | 2009 Mean (SD) | 2008 Mean (SD)
People and Policies | 3.64 (1.15) | 4.33 (0.96) | 4.15 (1.08) | 4.00 (0.97)
Professional Relationships | 3.93 (1.14) | 4.18 (0.95) | 4.55 (0.51) | 4.07 (1.00)
Grading & Evaluation | 4.00 (0.96) | 4.32 (0.82) | 4.60 (0.50) | 4.40 (0.70)
Opportunities & Resources | 3.92 (1.07) | 4.00 (0.98) | 4.43 (0.71) | 4.07 (1.02)
Helping Students Succeed | 4.36 (0.84) | 4.29 (0.90) | -- | --
Open Questions and Answers | -- | 3.25 (1.14) | -- | --
Critical Skills (role plays; integrated into Managing Relationships in 2009 & 2010) | 4.00 (1.18) | -- | -- | 3.97 (1.07)
Team-Based Learning Quiz | 3.64 (0.84) | -- | -- | --
Overall Non-Technology Average | 3.93 (0.84) | 4.06 (0.79) | 4.38 (0.42) | 4.12 (0.65)

Technology Sessions (break-out sessions during TA Day in 2008 and 2009; separate Tech Day in 2010 and 2011, but not enough data collected in 2011)

Technology Session | 2010 Mean (SD), N | 2009 Mean (SD), N | 2008 Mean (SD), N
WebCT Vista | 4.25 (0.50), N=4 | 3.75 (0.96), N=4 | --
Scantron | 4.00 (0.93), N=8 | 4.67 (0.50), N=9 | 3.93 (1.13), N=27
Library | -- | 3.45 (1.21), N=10 | 3.30 (0.77), N=23
Turnitin.com | 4.00 (0.00), N=3 | 3.64 (1.29), N=11 | 4.00 (0.96), N=38
SPSS (specifically for Scantron data in 2010) | 3.67 (1.53), N=3 | -- | 3.29 (1.05), N=28
Excel for Scantron data | 3.89 (1.13), N=8 | -- | --
Using classroom A/V equipment (handout only) | 4.00, N=1 | -- | --
Overall Technology Average | 3.83 (0.75) | 3.79 (1.11) | 3.65 (0.89)

Questions About Technology Day 2010 (not enough data in 2011)
Average Style | 4.08 (0.79)
Handouts | 4.67 (0.58), N=3
Breadth of Coverage | 3.40 (0.89), N=5
Informal, Drop-in Style | 4.00 (1.10), N=6
Average Logistics (N=10) | 4.33 (0.63)
Facilities/location | 4.20 (0.79)
Food/Refreshments | 4.60 (0.70)
Convenience of date | 4.20 (0.92)

Notes: Each aspect was rated on a 5-point scale ranging from 1 (Not Helpful) to 5 (Very Helpful). With the exception of the Technology Sessions, where sample size (N) is indicated, the number of responses ranged from 55-61 in 2008, 20 in 2009, 28 in 2010, and 14 in 2011.


6. A statement regarding recommendations for future iterations of the program

In 2011/2012 the TA Teaching Enhancement Community of Practice was established under the leadership of Joseph Topornycky of CTLT. I am absolutely thrilled by the development of this CoP; in fact, I have gained so much from it over the past year that it is the only CoP for which I carve out time in my hectic schedule. There is a core group of about a dozen of us from all over campus who meet each month for rich discussions about how to make our programs better. We share ideas, help each other respond to challenges, and ponder big questions, such as whether it is possible to have a common set of standards across our departments and disciplines. We are so collaborative that it is often difficult to tell who among us is a graduate student and who is faculty. Joseph’s leadership has been immensely helpful; he brought us together and laid the foundation for us to build the structure we needed. I no longer feel like I am on my own in this endeavour. I wholeheartedly recommend that this CoP continue, ideally under Joseph’s guidance.

Many thanks for your ongoing support of our TA Development Programs. I am so pleased to receive funding once again this year. The seeds this program planted back in 2008 continue to grow with each passing year. Although I cannot quantify it, I believe this program is helping to change, for the better, the landscape of teaching and learning in the Psychology Department.