
February 2010

NATIONAL TEACHING QUALITY INDICATORS PROJECT- FINAL REPORT

Rewarding and recognising quality teaching in higher education through systematic implementation of indicators and metrics on teaching and teacher effectiveness

Denise Chalmers http://www.catl.uwa.edu.au/projects/tqi

with

Deakin University, Griffith University, Macquarie University, RMIT University, The University of Queensland, University of South Australia,

University of Tasmania, The University of Western Australia

Support for this project has been provided by the Australian Learning and Teaching Council, an initiative of the Australian Government Department of Education, Employment and Workplace Relations. The views expressed in this report do not necessarily reflect the views of the Australian Learning and Teaching Council Ltd. This work is published under the terms of the Creative Commons Attribution-Noncommercial-ShareAlike 2.5 Australia Licence. Under this licence you are free to copy, distribute, display and perform the work and to make derivative works.

• Attribution: You must attribute the work to the original authors and include the following statement: Support for the original work was provided by the Australian Learning and Teaching Council Ltd, an initiative of the Australian Government Department of Education, Employment and Workplace Relations.

• Noncommercial: You may not use this work for commercial purposes.

• Share Alike: If you alter, transform, or build on this work, you may distribute the resulting work only under a licence identical to this one.

For any reuse or distribution, you must make clear to others the licence terms of this work. Any of these conditions can be waived if you get permission from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-sa/2.5/au/ or write to Creative Commons, 171 Second St, Suite 300, San Francisco, CA 94105 USA. Requests and inquiries concerning these rights should be addressed to the Australian Learning and Teaching Council, PO Box 2375, Strawberry Hills NSW 2012 or through the website: http://www.altc.edu.au

2010

Author contact details
Denise Chalmers
Centre for the Advancement of Teaching and Learning (M401)
The University of Western Australia
35 Stirling Highway, Crawley
Western Australia 6009
[email protected]

Project website: http://www.catl.uwa.edu.au/projects/tqi

Denise Chalmers, National Teaching Quality Indicators Project– Final Report 2010 i

Contents

Membership and Acknowledgements                                          ii

Executive Summary                                                         1

Project Context                                                           3
    Global trends and initiatives in teaching and learning
    Increasing emphasis on the use of performance indicators in higher education
    Definition of performance indicators
    Types of performance indicators
    An institutional focus on teaching and learning performance indicators

Project Aims and Description                                              7
    Project methodology and implementation processes
    Project leadership and management

Project Outcomes and Impacts                                             12
    Stage 1: Research and Framework Development
    Stage 2: Pilot university project outcomes and impacts
    Pilot university project sustainability and transferability
    Pilot University Project Impacts
    Summary of Outcomes, Factors for Success and Future Plans

Dissemination                                                            27

Linkages With the ALTC Objectives and Projects                           28
    Institutional Links within Pilot Universities
    International Links

Evaluation of the Project                                                32
    Stage 1: Evaluation
    Stage 2: Evaluation
    Evaluation of individual Pilot University Projects

References                                                               35

Appendices                                                               37
    1. Four Dimensions of Teaching Quality Framework Indicator Tables (May 2009)
    2. Summary of the National TQI Dissemination Activities
    3. Pilot University Reports, Resources and Publications


Membership and Acknowledgements

TQI National Project Team membership
    Project Leader: Professor Denise Chalmers
    Pilot Universities Project Leader: Professor Judyth Sachs
    Project Officer: Ms Kate Thomson
    Research Officers: Ms Katie Lee, Ms Becky Walker, Ms Tina Cunningham (now Erne), Ms Julia Göbel

TQI Pilot University Group membership

    Deakin University
        Pilot University Leaders: Professor John Rosenberg, Professor Alan Farley, Professor Marcia Devlin
        Project Officer: Ms Jennifer Brockett
    Griffith University
        Pilot University Leaders: Professor John Dewar, Professor Sue Spence
        Project Officer: Ms Lynda Davies
    Macquarie University
        Pilot University Leader: Professor Judyth Sachs
        Project Officer: Ms Bronwyn Kosman
    RMIT University
        Pilot University Leader: Professor Jim Barber
        Project Officers: Mr Amgad Louka, Dr Josephine Lang
    The University of Queensland
        Pilot University Leaders: Professor Michael Keniger, Professor Deborah Terry
        Project Officer: Dr Anne Gilmore
    University of South Australia
        Pilot University Leaders: Professor Peter Lee, Professor Margaret Hicks
        Project Officers: Mr Ric Bierbaum, Ms Narelle Walker
    University of Tasmania
        Pilot University Leaders: Professor David Rich, Professor Gail Hart
        Project Officer: Mr Steve Heron
    The University of Western Australia
        Pilot University Leader: Professor Jane Long
        Project Officer: Ms Jacqueline Flowers

TQI Reference Group membership
    ALTC Board member: Professor Margaret Gardner (Chair)
    Chair, Universities Australia DVC/PVC (A): Professor Jane den Hollander (2008), Professor Peter Booth (2007)
    Australian Universities Quality Agency (AUQA): Dr David Woodhouse, Dr Stella Anthony
    DEEWR, Higher Education Quality Branch Manager: Ms Catherine Vandermark (2008), Ms Lois Sparkes (2007)
    Expert on Scholarship of Teaching and Learning: Professor Keith Trigwell
    Project Leader (and Executive Officer to Reference Group): Professor Denise Chalmers
    Pilot Universities Project Leader: Professor Judyth Sachs


Acknowledgements

This project would not have been possible without the generous contributions of a number of people through the provision of their expertise, support and advice.

The Teaching Quality Indicators Project Team undertook the task of identifying and sourcing the reports, research and literature required to inform this report. In particular, Kate Thomson contributed her considerable organisational skills as Project Officer in addition to her research skills, as did Katie Lee, Julia Göbel, Tina Cunningham and Becky Walker, all of whom undertook the extensive literature reviews and research on performance indicators, and edited and sourced information on Australian practices. Thanks also to Associate Professor Simon Barrie, Dr Paul Ginns and Rachel Symons for their contribution to the project through their commissioned Student Survey study.

Professor Judyth Sachs, Provost, Macquarie University, and leader of the group of pilot universities for the Teaching Quality Indicators Project, provided ongoing leadership, support, insight and advice as a member of the project team and as pilot university leader. The Pilot University Leaders and Project Officers participated generously, providing the ongoing commitment, goodwill and expertise without which this project could not have been completed.

The members of the Australian Learning and Teaching Council (ALTC) Board, the National Project Reference Group, the Reference Groups in each of the pilot universities, and the ALTC Directors all provided valuable support and insights in addition to the funding that made the project possible. My thanks are extended to the many people who provided feedback, access to their staff, and their university information and resources, including those contacted through the Universities Australia group of DVC/PVC(A)s and those in AUQA, DEST and DEEWR who so generously provided comment and advice.

My thanks and gratitude to colleagues Associate Professor Janice Orrell, for her insights and access to resources, particularly those related to assessment, and David Sadler and Carolyn Webb, for their expertise, support and willingness to evaluate the project.

The findings, conclusions and wording of the report are the responsibility of the author and do not necessarily reflect the views of the Teaching Quality Indicators Reference Group, the ALTC, or The University of Western Australia.

Denise Chalmers
February 2010


Executive Summary

The Teaching Quality Indicators project responds to the need for an agreed approach to recognising and rewarding quality teaching and teachers in higher education. A key aspect of recognising quality teaching is the development and implementation of agreed indicators and metrics in universities and across the Australian university sector. This initiative by the Australian Learning and Teaching Council was to provide the Australian higher education sector with the opportunity to proactively engage with the issue of recognising and rewarding quality teaching and teachers, and to lead institutions and the sector in defining and developing indicators and outcomes of quality teaching. The aim was to contribute to enhancing the quality of teaching and teachers in institutions of higher education by providing tools and metrics to measure their performance and enable institutions to respond to issues identified by the evidence.

In order to understand the national and international context and the range and type of teaching and learning indicators in use, a major review of the literature, reports, and international, national and institutional practices was undertaken. These are reported in a number of comprehensive reports (e.g. Chalmers, 2007; 2008a; Chalmers, Lee and Walker, 2008; Chalmers and Thomson, 2008). These reports informed the Teaching Quality Framework by identifying the indicators that can most usefully inform on the quality of teaching in an institutional context (Chalmers, 2007). The report Indicators of university teaching and learning quality (Chalmers, 2008a) describes the teaching quality models and the range of indicators they utilise, providing examples from five continents. In the final section of that report, a range of teaching indicators for use in Australia at the national level is proposed.
The project was carried out in two stages. The aim of the first stage was to provide a comprehensive overview of what is currently recognised as quality teaching at individual, institutional, national and international levels. Following this overview, a Teaching Quality Framework was developed that proposed indicators of quality teaching at four levels within universities (Chalmers, 2007). Five major reports were produced and circulated widely, including an extensive mapping of each Australian university's use of teaching and learning indicators, summarised in the Snapshot of teaching and learning practice report (Chalmers and Thomson, 2008), and the Student surveys on teaching and learning report (Barrie, Ginns and Symons, 2008). A draft framework with indicators was developed and widely circulated, and process documents and resources were developed for use by the pilot universities to guide their activities.

Stage two involved eight pilot universities auditing and reviewing their teaching and learning practices, identifying one Teaching Quality dimension from the Framework on which to focus, and then implementing strategies to establish indicators at multiple levels throughout their university. Each university established a reference group, conducted an audit of teaching and learning policy and practice, and developed aims and intended outcomes aligned with their university vision and strategic plans. While the eight universities chose different aspects of the framework on which to focus, they engaged as a group, agreeing to share strategies, evidence, resources and experiences. In addition to the combined outcomes of this project, the experiences and outcomes of each of the universities are documented in eight pilot university project reports.

The project's dissemination strategy engaged the Australian higher education sector throughout the duration of the project. It involved systematic reporting and feedback to university leaders, and numerous updates through written reports, presentations and engagement with discipline and representational organisations and groups. Feedback was


sought on resources from the pilot university communities, from experts in the various framework dimensions, and from leaders of other Australian universities. The two stages of the project were independently evaluated; the evaluations concluded that a solid foundation had been laid for future work, and that the experiences of the pilot universities provided a valuable resource of case studies and experience in adopting a systematic, evidence-based approach to rewarding and recognising teaching and learning in Australian universities.


Project Context

The quality of teaching and learning in Australian higher education has been of increasing interest to governments and national bodies, the communities they serve, and the universities themselves. Interest has grown as greater numbers of students enter higher education, and as a result of the economic drivers of increasing participation in higher education to develop an educated and skilled population.

Over a period of three decades, Australia has established a robust and effective quality framework for higher education. National and state governments, working with universities and higher education institutions, have implemented quality reviews and audits, established protocols and guidelines for accreditation, standardised reporting and data collection, and established a range of funding models. Within the higher education sector, much has been achieved and recognised internationally as leading practice; for example, the early initiatives of administering national student course experience and graduate destination surveys have been adopted elsewhere. The quality auditing process established under the Australian Universities Quality Agency in 2000 is well regarded and considered effective and practical. The national data collection process through the Institutional Assessment Framework has evolved into its current form to improve the quality of data collected. The introduction of the Learning and Teaching Performance Fund (LTPF) focused universities' attention on the quality of their teaching and the measures used to assess it.

More recently, the Australian Government has proposed an indicator framework for higher education performance funding in a discussion paper (DEEWR, 2009), in which it is proposed that performance funding will be allocated to universities through compact agreements.

Global trends and initiatives in teaching and learning

A pervasive trend across all of the countries reviewed (Chalmers, 2007; 2008a; Chalmers, Lee & Walker, 2008) is the establishment of national systems of accreditation, quality processes and audits, and requirements to provide information on a range of performance indicators. Identifying valid and reliable indicators of teaching and learning quality remains a major challenge facing institutions, funding bodies and governments. Performance indicators are required at the national/regional level for the following broad purposes:

1. For quality audit and accreditation processes
2. To inform budget and funding decisions and allocations
3. Provision of mandated data that is centrally collected and may be selectively reported in national and international comparative reports
4. Survey data about student perceptions of their satisfaction, engagement, learning experiences and employment
5. Tests of learning which indicate student readiness for university study and acquisition of generic skills and knowledge (discipline or general), to inform decisions on professional/graduate admissions.

A selection of data from centrally collected and publicly available information, sometimes with additional information commissioned from a range of sources, has been used to create a number of ranking and league tables. The majority of league tables are instigated by media organisations (for example, the Times in the United Kingdom and US News in the United States), though university-based tables, such as the Shanghai Jiao Tong University ranking, are more highly regarded by the universities themselves.


Global trends evident in higher education include:

• Higher education now more than ever being referred to as an economic commodity, with increased interest in linking employment outcomes to higher education (employment and graduate destinations). This in turn has led to interest from governments and funding agencies in measuring the employability of students through measures of learning and their employment outcomes.

• Active steps taken to develop and use performance indicators at the national/sector level, as evidenced by the OECD reports on the PISA study, the Measuring Up reports, the Education at a Glance reports and international rankings.

• Growing interest and pilot projects in identifying ‘direct’ indicators or measures, particularly of student learning.

• Increasing interest in performance funding based on output measures and indicators.

• Renewed interest in benchmarking at the national, regional and discipline levels (e.g. the European Higher Education Area).

• Greater emphasis on quality auditing and accreditation within countries and regional groupings (e.g. the Bologna Process and European Higher Education Area, the Spellings report and regional accreditation organisations in the USA, TEQSA in Australia).

• Steady moves in Europe to assign greater autonomy and independence to higher education institutions, with less direct involvement from governments through quality auditing and accreditation mechanisms. This contrasts with the USA, where there are increasingly strong calls for greater government oversight of higher education institutions through the use of standardised indicators, measures and accreditation processes.

• Concerns expressed by researchers and higher education institutions about the impact of growing government requirements for national/sector performance indicators on the autonomy and diversity of institutions.

Australian trends and initiatives in teaching and learning

Many of the global trends noted in reviews of international practices of quality teaching and learning are well established in the Australian higher education sector. These include:

• National Australian Graduate Survey, which comprises the Course Experience Questionnaire (CEQ) and Graduate Destination Survey (GDS).

• National system of quality auditing on a five-year cycle (AUQA), which will be subsumed into the new Tertiary Education Quality and Standards Agency (TEQSA) from 2011 to oversee the regulation, accreditation and quality assurance of the tertiary education sector.

• National protocols, accreditation and qualifications framework that apply to all higher education institutions and their programs of study.

• National data collection of information related to students and universities (IAF).

• National fund to reward quality teaching and learning (LTPF), now discontinued. Compacts between universities and government will in future lead to the allocation of performance funding that reflects the government's broad objectives and quality framework.

• National government-funded organisation to enhance learning and teaching in higher education institutions, the Australian Learning and Teaching Council (ALTC).

• National awards for quality university teaching (administered through the ALTC).

• Transnational quality strategy for international education on and off shore.

• Reviews of higher education, the most recent of which was the Bradley review (2008).

• Benchmarking activities within and across institutions, disciplines and national boundaries (e.g. ACODE, EQUIS, ATN).


• Disciplinary accreditation of university and tertiary programs of study (e.g. Engineering, Accounting, Medicine, Psychology).

Increasing emphasis on the use of performance indicators in higher education

Higher education institutions use performance indicators for four primary reasons (Rowe, 2004):

1. To monitor their own performance for comparative purposes
2. To facilitate the assessment and evaluation of institutional operations
3. To provide information for external quality assurance audits
4. To provide information to the government for accountability and reporting purposes.

The reasons for using performance indicators differ at the national or state level, where they are designed to (Fisher, Rubenson, Rockwell, Grosjean & Atkinson-Grosjean, 2000):

1. Ensure accountability for public funds
2. Improve the quality of higher education provision
3. Stimulate competition within and between institutions
4. Verify the quality of new institutions
5. Assign institutional status
6. Underwrite transfer of authority between the state and institutions
7. Facilitate international comparisons.

These different reasons for the use of performance indicators between institutions and government and national organisations may lead to disagreements on the most appropriate indicators for identifying quality teaching and learning.

Definition of performance indicators The following definition has been synthesised from the literature and is the definition used for the Teaching Quality Indicators project (Chalmers, 2008a,b).

Performance indicators are defined as measures which give context to information and statistics, permitting comparisons between fields, over time, and with commonly accepted standards. They provide information about the degree to which teaching and learning quality objectives are being met within the higher education sector and institutions.

Types of performance indicators

There is general agreement that performance indicators are of four types: Input, Process, Output and Outcome. These can be more broadly categorised as Quantitative and Qualitative indicators. See Chalmers (2008a) for a detailed description of the types of performance indicators and their origins.

That report found that while qualitative outcome and process indicators are more insightful and accurate in measuring the methods and quality of teaching and learning, they are not often utilised, as quantitative input and output indicators are more easily measured. This has resulted in an inappropriate dependence on less informative quantitative input and output performance indicators. Consistent with the literature and the conclusions of this project's reports, the more frequent use of these quantitative indicators (particularly input measures) reflects a system overly removed from the objectives of higher education. Quantitative input and output indicators are unlikely, perhaps even unable, to effectively and accurately measure the quality of teaching and learning in isolation from the qualitative process and outcome indicators; hence the importance of using these in combination.


Each type of indicator has different characteristics and objectives, but all are operationally related. Successful indicator systems, whether at national, institutional or campus level, incorporate all four types of indicators to inform their decision-making and quality assessments. A balance of the four types of indicators is particularly important at the national level, where the emphasis can be on output or outcome indicators. Emphasis on output or outcome indicators over input and process indicators is likely to result in an unbalanced system with unintended and negative consequences (Borden & Bottrill, 1994; Burke & Minassians, 2001; 2002). It is also important to understand indicators as interrelated and linked; for example, input indicators such as level of funding, or student and staff numbers, can have a confounding effect on output, outcome and process indicators (Guthrie & Neumann, 2006).

Although indicators can depict trends and uncover interesting questions about the state of higher education, they do not objectively provide explanations which reflect the complexity of higher education or permit conclusions to be drawn. Multiple sources of information and indicators are required to diagnose differences and suggest solutions. Without multiple sources of both quantitative and qualitative information, interpretation may be erroneous. It is therefore imperative that indicators only be interpreted in light of contextual information concerning institutional operations, and with purposes and assumptions made explicit.

In summary, the measurement of quality teaching and learning within the higher education sector should entail indicators which are significant in informing individual and institutional performance and, where feasible, also significant on a common national or sector-wide scale. A useful performance indicator is one that informs the development of strategic decision-making, resulting in measurable improvements to desired educational outcomes following implementation (Rowe & Lievesley, 2002).

There are clear trends emerging of greater regulation and a desire for quantitative and standardised measures of learning and institutional effectiveness at the national level. However, while national and sector organisations seek a small number of ‘direct’, quantitative and ubiquitous measures that will reliably inform them of the quality of teaching and learning in higher education, this is considered unlikely to be achieved in a way that will yield meaningful information (Chalmers, 2008a). The more promising measures and indicators are those that are situated in institutional practice; these are the focus of this project.

An institutional focus on teaching and learning performance indicators A concerted effort on the part of universities to undertake active strategies is required to achieve:

• Higher participation and completion rates, particularly among socio-economically disadvantaged students, racial and ethnic minorities, and equity groups more broadly.

• More active involvement in education reform and engagement in the school sector. (Students in disadvantaged groups are often lost to the HE system by Year 9).

• Increased engagement with adult and further education, and the creation of pathways in and out of higher education.

• Development of programs that meet the educational needs of adults of all ages, from school, early career, mid career, end career and older citizens (adapted from Davies, 2006).

Accounting for the key imperatives of attracting, retaining and graduating students at the national and institutional levels is crucial, because these imperatives will continue to drive national policy pressure on universities. However, national imperatives must also take account of institutional mission and values and allow for a variety of approaches.


Focusing on student retention and completion provides an entry point to the dimensions of quality teaching that are important in terms of learning. This is not to suggest retention of students at any cost, or that there should be any reduction in the quality of learning outcomes. However, retention and progression must be part of any set of key indicators of quality learning and a quality learning experience.

It was argued, in the initial stages of the project, that a process to enhance learning through a focus on teaching quality indicators must involve an institutional, multilevel audit. Such an audit will clarify the missions and goals of the university in relation to its students, uncover policy and process barriers, identify needs for new policies, bring stakeholders together to establish broad levels of agreement, design implementation strategies, assign key tasks to appropriate individuals and groups, and monitor and review progress towards the achievement of the goals established by the universities.

A key aspect of recognising quality teaching is the development and implementation of agreed indicators and metrics in universities and across the Australian higher education sector. It was intended that the TQI initiative would provide the sector with the opportunity to proactively engage with the issues of recognising and rewarding quality teaching and teachers, and to lead institutions and the sector in defining and developing indicators and outcomes of quality teaching. This would contribute to enhancing the quality of teaching and teachers in universities by providing tools and metrics to measure their performance and enable institutions to respond to issues identified by the evidence.
A systematic approach was argued for, and accepted, because of the shift to a performance-based quality assurance culture in higher education, and the need for the sector to take advantage of this trend to advance the quality of teaching and learning in institutions and enhance the quality of the student experience. By embracing this move, the Australian higher education sector would provide leadership to the university sector internationally, nationally and locally.

Project Aims and Description

The aim of the project was to develop an approach for recognising and rewarding quality teaching in higher education. The mechanism for this was the development and implementation of agreed indicators and metrics. The major outcome of this project was the development and implementation of a framework that identified indicators and outcomes of teaching quality at the institutional and internal university levels. The framework also identified systems and processes that support and value teaching quality.

The project had been conceived by the project leader, then a Director at the ALTC (formerly the Carrick Institute). A proposal was prepared and presented to the ALTC Board as taking place over four stages, with progression through each stage dependent on achieving the goals and outcomes of the previous stage. The proposal was approved in November 2006 and Stage 1 commenced in 2007.

Stage 1: Research and Framework Development
The aim of the first stage was to provide a comprehensive overview of what is currently recognised as quality teaching at individual, institutional, national and international levels. Following this overview, it was intended that a Framework be developed that proposed indicators of quality teaching. An examination of the ways in which quality teaching is rewarded was also undertaken at the individual, institutional, national and international levels. The methodology included ongoing consultation at the four levels, empirical research, literature review and environmental scanning.


Stage 2: Pilot implementation of Framework Following the development of the Framework, and with extensive consultation, its utility was trialled in eight universities. This involved each of the pilot universities examining and revising relevant policies and practices that impact on the quality of teaching and learning, establishing the necessary infrastructure and systems to gather and interpret the data and implementing strategies to build a culture that values, recognises and rewards teaching quality. Following the trial, the tools and matrices, case studies and guidelines for implementing the framework would be made widely available and promoted within the sector. Although all pilot universities were to use the resources provided, each pilot university leader determined the direction the project would take at their university. Each university established their own Reference group, conducted an audit of teaching and learning policy and practice and developed aims and intended outcomes aligned with their university vision and strategic plans. Stage 3: Wider Framework Implementation It was intended that should the first two stages be successful, it might be feasible to consider supporting a further ten institutions wishing to implement the framework and apply it to their own institution, with mentoring and support provided by the project team and pilot university project members. Stage 4: Benchmarking Subsequent to a significant number of universities adopting and implementing the framework in their universities, it might be possible to then identify a set of agreed benchmarking data that would be available for sharing across the sector and utilising for national indicators. Through subsequent consultation with the Project Reference Group and ALTC and in light of the changing political and policy context, it was agreed that that the project would not progress beyond Stage 2. 
While funding for the project ceased at the end of 2008, the pilot universities and project leaders committed to continue implementing the initiatives, identifying and applying further indicators in their own universities, and disseminating the outcomes over the following 18 months, concluding in mid-2010.

Project methodology and implementation processes

Stage 1: Research and Framework Development
The research was carried out by a team of research officers under the direction of the project leader and in consultation with the project team. The research formed the basis of the reports produced in Stage 1, in which the Teaching Quality Framework was grounded. Detailed tables of indicators were developed to show how the four dimensions could be used to audit university practice across multiple levels and organisational areas, and subsequently to guide the development and widespread use of performance indicators for review and enhancement practices. These tables were then summarised into indicator tables for each of the four dimensions (Appendix 1). Reports also identified potential benchmarking and national indicators (e.g. Chalmers, 2008a). The framework was canvassed widely for discussion and comment, particularly amongst the pilot institutions, the Universities Australia group of DVC/PVC(A)s and through them their institutions, members of the Reference Group, AUQA and DEST/DEEWR. The framework was revised in light of the comments received and was then used by the pilot universities for their consideration and detailed use. The Institute for Teaching and Learning at The University of Sydney, led by Associate Professor Simon Barrie, was commissioned to undertake a study on student surveys on teaching and learning, providing an environmental scan and initial analysis of student feedback surveys currently in use in Australian universities.


Literature and data sourcing and management

Internet and library searches were conducted to find significant sources of information relevant to the quality of teaching and learning in the policy and practice of institutions, governments and other organisations. The research officers searched by country at the sector and government standard/regulation levels for quality controls, accreditation, funding and reporting requirements. International standards, benchmarks and regulations on the quality and outcomes of teaching and learning were also of interest. The researchers also searched by country at the institutional and individual levels for standards, indicators and evidence identified as relevant for quality teaching and learning, and documented the associated funding and reward systems. The use of reward and recognition mechanisms in other professions was also investigated, with a particular emphasis on schools and teachers. Literature, professional websites and journals were sourced for indicators and measures and the processes by which they were implemented.

Data were collected on student surveys in use in Australia at the teacher or subject/unit, program/course and whole-of-university levels. The purpose, number and type of questions, and the ways in which the data were used, were recorded. Surveys of interest used at any of these levels internationally were also explored, in addition to national or sector-wide surveys such as the CEQ or GDS.

The research officers searched all Australian university websites for policies and evidence of practice related to teaching and learning. These included institutional and teaching indicators, for example institutional quality systems, strategic planning and initiatives, institutional funding, documentation available on quality indicators, and evaluation processes and evidence. Teacher indicators were sought, particularly appointment and promotion criteria and performance management processes.
The researchers also sourced learner and learning indicators, such as student learning indicators, learning quality processes and assessment policies. This information was documented by institution and used to create a Snapshot of Practice, which drew on and extended the Australian Vice-Chancellors' Committee (now Universities Australia) (2004) report summary of teaching indicators. Documents that were not publicly available from university websites were posted or emailed to the research officer by a staff member nominated by their DVC(A). An email address was set up so that those interested in teaching quality could contact the project team directly and communication could be categorised by university. All information, including communication, was managed according to ethical and ALTC guidelines. Printed documents were filed according to university and policy title and stored securely at the ALTC.

At a preliminary meeting of the pilot university leaders to establish the scope, processes and principles of the project, it was recognised that there are many agendas and priorities in universities. A draft set of principles to guide the project and those working on it was presented, extensively discussed and revised. It was agreed that the focus of the project would be on students' learning experience and on teaching and learning at the institutional level. Based on these principles, the following set of values was adopted (Table 1).

Table 1: Values underpinning the teaching quality indicators project

Values of the Teaching and Learning Quality Indicators Project

Global value
• Research and teaching as the heartland of all university activities

Institutional values
• Diversity over standardisation and comparability
• The expectation that learning in higher education should be active, cooperative and intellectually challenging
• Trust and openness at all levels
• Equity principles and practices for both staff and students
• Creative renewal opportunities for staff and students

Measures of performance values
• A teaching quality framework as an opportunity for development and enhancement
• Measures and indicators that contribute to the development of good/effective teaching and learning practice
• Good judgement that is aided by the use of performance indicators
• An evidence-based approach to decision making
• Performance indicators that can contribute to or be generalised to the wider sector
Preparation for Stage 2: Pilot University Training

The project team's activities in Stage 1 preceded the pilot projects and then continued concurrently with Stage 2, with ongoing framework development and revision, and implementation at each pilot university. In preparation for Stage 2, all pilot universities were provided with funding for a senior-level project officer. All project officers were required to participate in an intensive project management training program, with a number of follow-up sessions, provided by Learning Partnership International (LPI), a small team of consultants with a commitment to learning and associated outcomes and processes, particularly change, culture, development, improvement and quality. The training program was developed specifically for the project but built on a pilot project management training program developed for the discipline-based initiatives at the instigation of Associate Professor Janice Orrell. The program provided by LPI was subsequently modified and adopted for the professional development of all ALTC grants program project leaders and project officers.

Following this initial training, project officers were provided with a guide detailing the process they could use to identify a focus area for implementing the project within their university (Process for Getting Started, Chalmers 2007). The key steps established for the pilot universities in this guide were:

• Appoint a university leadership group, chaired by the DVC/PVC(A) • Ground the agenda in the priorities of the university and project • Complete a policy, practice, process and data audit • Meet with key people throughout the institution and key stakeholders • Report back to the leadership group, finalise the agenda and assign responsibilities • Monitor and steward the initiatives.

Stage 2: Pilot University Implementation

Each of the pilot institutions appointed a project officer whose role was to oversee and document the mapping and implementation of the framework. It was requested that the project officer be attached to the office of the DVC(A) to flag institutional support and ensure progress was maintained. This did not consistently occur in implementation, with some project officers more connected to the office of the DVC/PVC(A) than others. In the early stages of implementation:

• Each university conducted audits and formed leadership/steering groups to determine its priorities. Each institution mapped its institutional policies and practices to the framework to assist in identifying the direction the project would take at that institution. A number also held focus groups to determine staff and student practice and perceptions in relation to teaching and learning quality. The implementation of the framework was different at each of the pilot universities, reflecting their diverse contexts and priorities, while at the same time being understood as part of a whole.


• The project officers met regularly with the project team and were provided with intensive leadership and project management development and ongoing support. In addition, documents and processes were developed to guide each university's audit and review, in order to assist their selection of dimensions and indicators.

• During these meetings and between them, the project officers compared their work implementing the framework and the extent to which their institutions ‘fit’ the framework. Project managers detailed what actions needed to be undertaken to ‘fit’ the framework.

• Regular meetings and communication between the Project Officers, DVC/PVC(A) and the Project Team ensured that issues were dealt with, ideas, resources and experiences were shared, and focus was maintained.

• Project Managers met with their institutional constituents to discuss the implications and process of implementing the framework.

• DVC(A)s and institutional Project Managers met as a group to report on their progress and issues that were raised, develop strategies, and where appropriate adjust the framework to fit their context.

In the later stages of implementation:

• Framework implementation continued as each of the pilot universities made a commitment to continue implementing the project for 12-18 months beyond the funded period, to ensure the outcomes of the project were embedded within their institutions' policies and practices.

• Each pilot institution provided an 18-month report on its implementation and experience.

• Ongoing dissemination and communication with the sector on the progress and outcomes of the project has been a feature of Stages One and Two.

Project leadership and management

This project was initiated and led by Professor Denise Chalmers (previously at the ALTC and currently at The University of Western Australia). Professor Judyth Sachs (Macquarie University) was the leader of the pilot group of universities and a member of the national project team. The project was overseen by a Reference Group with membership from the ALTC, DEEWR, AUQA and Universities Australia through the Chair of the DVC/PVC(A) group. The project was managed by the project leader, with support from a team of research officers and, in Stage Two of the project, a project officer and the project team. For the greater part of the project's duration (November 2006–January 2008) the project leader and her team were based at the ALTC. The project was proposed to and received the ALTC Board's approval, with its progress reported regularly to the Board.

The project Reference Group was appointed and met twice in 2007 and twice in 2008. Each member was selected for their individual expertise on issues relevant to teaching quality and/or for their role as representative of a group or organisation focussed on raising the quality of teaching. The original membership of this group included Professor Margaret Gardner (chair) (RMIT University), Professor Peter Booth (University of Technology, Sydney; chair of the DVC/PVC (Academic) group), Professor Keith Trigwell (The University of Sydney), Ms Lois Sparkes (DEEWR), Dr David Woodhouse and Dr Antony Stella (AUQA), Professor Sachs (Macquarie University) and Professor Chalmers (The University of Western Australia). Changes in membership took place in 2008 when representatives changed roles, resulting in Professor Jane den Hollander (Curtin University of Technology) representing the Universities Australia group of DVC/PVC (Academic) and Ms Catherine Vandermark representing DEEWR.


Project Outcomes and Impacts

Stage 1: Research and framework development

Stage 1 of the project examined the ways in which quality teaching is recognised and rewarded at the individual, institutional, national and international levels. It was argued that quality teaching is recognised and rewarded differentially at these levels, and that there is little relationship between what is recognised and what is rewarded. The first stage of the project involved a comprehensive review of the national and international literature and current practice in relation to teaching and learning quality. This research has been released as a series of reports and forms the evidence base for subsequent stages of the project. These can be found at the project website http://www.altc.edu.au/teaching-quality-indicators and include the following:

1. A review of Australian and international quality systems and indicators of learning and teaching (2007). Chalmers, D. (122 pages)
This report is designed to provide a broad overview of teaching and learning practice within Australia and internationally. A summary of the Australian context and an outline of current practices and initiatives at the national level are provided. The second section provides an overview of global initiatives related to the quality of teaching and learning by country or region. Commonly used student surveys and tests of learning used to identify quality of teaching and learning are described, and the section concludes with issues that surround the use of some of the measures at the national level. The final section provides an overview of performance indicators of teaching and learning that have a substantial evidence base to support their use in institutions, some of which will be suitable to be reported up to the national level.

2. Indicators of university teaching and learning quality (2008). Chalmers, D. (82 pages)
This report provides a brief background on the growing national focus on quality and the use of performance indicators of teaching and learning in higher education. It then focuses on the ways that national and sector-level teaching and learning indicators are utilised in performance management models or systems: accreditation, quality audits, performance funding and budgeting, performance reporting, and surveys and tests. It presents an institutional-level framework for teaching and learning quality and identifies indicators and measures that have the potential to monitor teaching and learning at the national level.

3. International and national indicators and outcomes of quality teaching and learning currently in use (2008). Chalmers, D., Lee, K., & Walker, B. (78 pages)
This report provides an environmental scan of national and international approaches to the quality assurance of higher education. As such it provides an illustrative rather than comprehensive overview of current trends and practices. The first section is a summary of global trends and issues. The focus of the report is the use of performance models in higher education: quality audit, accreditation, performance funding and performance budgeting, performance reporting, and surveys and tests. The five models use a variety of performance indicators, some of which overlap in method of collection, but the model used will influence the way in which the indicator is interpreted. A summary of Australian and international use of performance models in teaching and learning is provided.

4. Snapshot of Teaching and Learning Practice in Australian Higher Education Institutions (2008). Chalmers, D., & Thomson, K. (40 pages, plus university reports on the 13 tables, averaging 30 pages for each of the 37 universities)
This study investigates the policies, practices and quality assurance systems linked to teaching and learning at 37 Australian universities. Universities were given the opportunity to submit additional information until 31 January 2008; consequently, this summary represents 2007 teaching and learning practice. As universities continuously review and update their policies and procedures, this report should be considered a snapshot of practice. The study examines practice from a quality systems point of view, with a focus on important delineators shown to be relevant to teaching and learning. The topics covered represent a comprehensive snapshot of practice related to recognising and rewarding quality teaching and learning at Australian higher education institutions. In early 2008, each university was provided with its updated set of 13 tables and the overview report. Several universities informally reported using these tables to inform their internal quality processes and reviews, and to identify good practice within their own and cognate universities. The eight pilot universities agreed to share their complete sets of tables with each other, and these have been used extensively within the pilot universities for internal and external reference. The University of Western Australia recently completed a database built on these tables to bring together its teaching and learning policies and practices at the institution, faculty and department levels, and was commended at its Teaching and Learning Committee. The database was also demonstrated at the Pilot Project meeting in August 2008, with several other pilots expressing interest in developing a similar database.

5. Defining teaching and learning indicators in universities (2008). Chalmers, D. (18 pages)
This report summarises information on quality indicators for those not familiar with the use of indicators to demonstrate quality in teaching and learning. It outlines the context and rationale for developing performance indicators for teaching and learning in higher education. It also offers a definition of 'indicators', their purpose and types: quantitative (input and output) and qualitative (process and outcome). Illustrative indicators for each type are provided across four levels: national, institutional, department/program and teacher/student. It is suggested that qualitative indicators measured and used to inform policy and practice at the institutional level might be most meaningful for the sector.

6. Student surveys on teaching and learning. Final Report (2008). Barrie, S., Ginns, P., & Symons, R. (123 pages)
This study provides an environmental scan and initial analysis of within-university Student Evaluation of Teaching (SET) practices across 29 Australian universities. The report is based on data collected in the first half of 2007 and as such represents a 'snapshot' of institutional SET practice which may well have changed since. The report analyses the core items drawn from internal student evaluation of teaching surveys currently in use in these Australian universities, using a framework derived from a key review of the major multi-section validity studies of SET. It considers patterns of use of SET items in Australian universities and suggests how the analytic framework might be developed for use in the Australian context. Based on the findings of the analysis, the report provides some preliminary conclusions and suggestions to better enable institutions to use internal SET data for benchmarking and quality assurance. The report also identifies a range of validated survey scales which could be used to gather SET data in relation to the proposed TQI dimensions (Chalmers 2007), and a network of SET experts with the potential to collaborate in developing and validating shared SET items, scales and procedures for those levels of the proposed TQI dimensions where no suitable validated SET scales were identified. This report does not provide prescriptive recommendations, as its intention is to support discussion on current SET use.

7. Four dimensions of Teaching Quality Framework Indicator tables (2009). Chalmers, D., & Thomson, K. (9 pages)
This document provides a summary of the Quality Indicators tables for each of the four dimensions (Institutional Climate and Systems, Diversity, Engagement, and Assessment), under four types of indicators (Input, Process, Output and Outcome) and at four organisational levels (Institution, Faculty, Department, Teacher).


8. Review of Australian and international performance indicators and measures of the quality of teaching and learning in higher education (126 pages)
This commissioned report was prepared for the Department of Education, Science and Training (DEST), renamed the Department of Education, Employment and Workplace Relations (DEEWR) in 2008. It was compiled by the ALTC from the body of work identified above.

Development of the Framework of Teaching Quality

The above reports and documents contributed to the development of a framework, resources and tools that assist universities to review their teaching and learning systems, policies and processes, and to implement changes. The framework links quality indicators across four dimensions (see Figure 1 for a representation). The Institutional Climate and Systems dimension forms the basis upon which the other dimensions are built. The dimensions of Assessment, Engagement and Learning Community, and Diversity are theoretically and empirically supported by the literature as having a strong alignment with student learning (see Chalmers, 2007, for the full review and references).

Figure 1: Framework of Teaching Quality

These dimensions and indicators can be further broken down by level within the institution. Four levels are developed in the tables of the framework:

• Institution-wide
• Faculty
• Department/program
• Teacher/Individual

It may be appropriate for an institution with multiple campuses to also consider including the campus level (off-shore, regional, multiple sites, distance, etc.).

Institutional climate and systems
An institutional climate is characterised by a commitment to the enhancement, transformation and innovation of learning. Institutional climate is a key dimension of quality teaching and learning, referring to the evaluation of institution, staff and student levels of satisfaction and experience. The measurement of student experience and satisfaction is currently a common indicator of quality teaching and learning; however, the data contribute only a limited amount of information about the institution.

Diversity
Diversity in higher education relates to ethnic, cultural and socioeconomic diversity, as well as diversity in students' abilities, talents and learning approaches. Diversity is an indicator that is theoretically and empirically supported by the research literature and is frequently employed as a measure of quality teaching.

Assessment
The most direct measures of student learning are the assessment tasks undertaken while students are studying in their enrolled program of study. Research has repeatedly shown that assessment does not merely serve to inform students about their achievements, but is a necessary condition for quality learning. The literature on good practice in assessment is extensive and well developed, and many universities have adopted a number of effective approaches to assessment. There exists a great variety of methods, ideally aligned with specific learning goals, student learning approaches and the particular subject. This diversity is desirable and essential, yet it is not an end in itself; it should also be used to encourage institutional improvement. Universities should therefore direct their attention to the design, delivery and administration of assessment, the provision of feedback, the setting and moderation of standards, and the review of assessment. Indicators of quality assessment include the development and implementation of systems and reviews with an "enhancement-led" approach.

Engagement and learning community
The academic environment is the primary means by which students further their learning, abilities and interests, making it a central dimension of student success (Smart, Feldman & Ethington, 2000). Student engagement is a term used to describe students' commitment to and engagement with their own education.
It is important to also include staff engagement with their students and their institution.

Table 2: Indicative teaching and learning indicators for four dimensions of teaching practice

Institutional climate & systems
• Adoption of a student-centred learning perspective
• Possession of desirable teacher characteristics
• Relevant and appropriate teaching experience, qualifications and development
• Use of current research findings in informing teaching and curriculum/course content
• Community engagement/partnership
• Funding model in support of teaching and learning

Diversity
• Valuing and accommodating student and staff diversity
• Provision of adequate support services
• Active recruitment and admissions
• Provision of transition and academic support
• Active staff recruitment
• Multiple pathways for reward and recognition of staff

Assessment
• Assessment policies address issues of pedagogy
• Adopting an evidence-based approach to assessment policies
• Alignment between institutional policy for best practice and faculty/departmental activities
• Commitment to formative assessment
• Provision of specific, continuous and timely feedback
• Explicit learning outcomes
• Monitoring, review and moderation of standards and assessment tasks

Engagement & learning community
• Student engagement
• Fostering and facilitating (academic) learning communities
• Engaging and identifying with a learning community
• Staff engagement

Each dimension of quality teaching is developed into a table which outlines indicative indicators for that dimension at the institutional, faculty, program and teacher levels, and indicative measurements for those indicators. Each table presents the indicators under input, process, output and outcome headings; these are then expanded into checklists for each level so that an institution can assess its practices and establish the necessary processes and measurements. While the indicative measures are shown with the particular dimension under scrutiny, a number will also apply to the other dimensions (see Appendix 1).

The levels and indicators outlined may not be appropriate for all institutions. Some indicators may not apply in some universities, and some may apply at different levels in the institution from those suggested in the framework. Each university is asked to consider the indicators and the levels as indicative rather than prescriptive requirements. In addition, each institution needs to be very clear about the students it is attracting, enrolling and retaining, and those it wishes to attract, enrol and retain (whether they are the very able, first-in-family, culturally and ethnically diverse, etc.), considered in the context of its mission and strategic plans. These considerations will then shape the indicators selected and the way in which they are measured and reported.

Stage 2: Pilot university project outcomes and impacts

Stage 2 of the project sought to develop and implement the framework that identified indicators and outcomes of teaching quality. Eight pilot universities were identified, and each carried out a detailed review of its teaching and learning policy and practice, informed by its Snapshot of Practice tables, which were shared across the group. Each pilot university elected the dimension of the framework most pertinent to its priorities and mission at that point in time. While funding was provided for an 18-month period only, each university committed to the project for a period of three years. The aim was to apply the framework in the eight pilot universities, report on the experiences and identify implications for Australian universities. In addition, the resources and tools developed would be reported and made available across the sector (see Tables, Dissemination and Linkages below). All pilot universities were supported to develop their own agenda and intended outcomes for their university. University strategic plans, leadership changes and ongoing review led to expansion or reorientation for some of the universities. The pilot universities' original and revised aims are summarised in Table 3.

Table 3: Pilot universities' original and revised aims and Framework dimension of focus

Pilot University Focus / Aims Deakin University Dimension: Engagement and Learning Community

Initial Focus Areas: Curriculum (including assessment); Leadership; Provision of services for students; Recognition and rewarding of staff Aims: Short Term: To identify and disseminate quality-based teaching and learning processes, linked to the Teaching and Learning Plan. Long Term: Embed a culture of systematic teaching and learning best practice. Revised: Subsequently refocused the project to identify ways in which Deakin University could provide and use quality information to faculties and departments on engagement (principally through the use of the AUSSE and

Denise Chalmers, National Teaching Quality Indicators Project– Final Report 2010 17

Pilot University Focus / Aims university data) to better direct their efforts and resources and to reflect their mission.

Griffith University Dimension: Assessment Aims: • To raise the status of teaching processes and practices. • Inform policies and practices in relation to career pathways and

rewards/recognition for academic staff work. Short Term: To establish a process for identifying, applying and evaluating indicators of teaching quality in relation to assessment. Long term: As only one of two pilot universities addressing the Assessment Dimension exclusively, Griffith has a unique opportunity to improve, develop and implement the indicators for adoption by other Australian universities.

Macquarie University

Dimension: Institutional Climate and Systems - Promotion, Reward and Recognition Focus: Appointment, Probation, Performance, Promotion, Appraisal and Management Aim:

Raise the profile of learning and teaching and to explicitly indicate that it was critical to all operations and activities of the institution. Short term: To facilitate policy development and management – as an absence of clear, transparent, equitable and published academic policies and procedures directly impacts on the consistency and quality of teaching, learning and the student experience. Long Term: To embed a performance culture of teaching – the emphasis is on embedding the recognition, value, celebration and reward of quality teaching right across the institution. Additional: Substantial review and development in the Assessment dimension.

RMIT University
Dimensions: Institutional Climate and Systems / Assessment
Focus: It was difficult to separate RMIT priorities and associated projects within a single TQI dimension, so it was acknowledged that there would be overlap. Consequently, RMIT identified synergies between relevant TQI dimensions and the metrics appropriate to each project.
Aims:
Short Term: To scope the potential areas for investigation to advance learning and teaching within the RMIT institutional context and the TQI Project context.
Long Term: To draw upon the findings of initial investigations to inform the development and/or refinement of policies, projects and initiatives to advance learning and teaching at RMIT.
Additional: Input into the development of an Australian Technology Network of universities (ATN) strategy for developing an academic standards model.
Revised: Following the scoping stage, in alignment with the longer term aims and after revision of the appointment and promotions policy, the project focused on the Assessment dimension and the RMIT Assessment Standards Project commenced.

The University of Queensland
Dimension: Assessment / Institutional Climate and Systems
Aims (Assessment):
• To make better use of current policy and practice and identify gaps by evaluating policy and practice across the university.
• To actively engage more staff and students so that they are aware of the current policy, resources and support mechanisms available to assist them to improve assessment outcomes.
• To determine improved mechanisms to implement performance-based teaching and learning outcomes.
Revised: After the original aims were substantially achieved, the focus moved to the Institutional Climate and Systems dimension, with an emphasis on funding models for teaching and learning.

The University of Western Australia
Dimension: Institutional Climate and Systems – Promotion, Reward and Recognition
Aims:
Short Term: To develop more robust mechanisms for determining what constitutes good quality teaching, and to use these mechanisms in the promotion, tenure and professional development review processes and for the development of institutional indicators. Four subprojects were used to achieve the overall goal.
Long Term: To embed these changes into policy and practice, staff development and workforce planning.

University of South Australia
Dimension: Engagement and Learning Community
Focus: Employer feedback
Aims: To develop the indicators and systems that inform the university about employers’ needs and their satisfaction with its graduates, and to communicate, manage and respond to that information appropriately.
Additional: Also interested in developing assessment standards; worked with RMIT on this aspect of Assessment. The outcomes of these projects were to feed into both the TQI Project and the ATN academic standards project.

University of Tasmania
Initial Dimensions: Institutional Climate and Systems; Assessment; Engagement; Diversity
Initial Focus: Initially interested in pursuing aspects of all dimensions, then decided to focus on the one most salient for the university. Reviewed aspects of reward and recognition of quality teaching, revised appointment and promotion policy, and assessment, and shared documents and experiences within the pilot group.
Revised Dimension: Diversity
Aims: Valuing and accommodating student and staff diversity
• Provision of transition and academic support
• Multiple pathways for reward and recognition of staff.

Project outcomes and impacts varied across the universities in accordance with the particular dimension of teaching practice that the university focussed on:

a. Institutional climate and systems


The aim at Macquarie University was to raise the profile of learning and teaching and to indicate explicitly that it was critical to all operations and activities of the institution. Consequently, the Academic Promotion Policy was revised to include clear selection criteria on learning and teaching, and especially the level and quality of evidence required for academic promotion. An institution-wide policy framework was also developed and implemented across all areas of the institution. Institutional climate and systems was also the focus at The University of Western Australia (UWA), which succeeded in embedding an evidence-based mode of thinking about teaching quality in wider discussions and planning. The project produced the following outputs:

• On-line database of teaching and learning policy and practice
• Benchmark Statements for the Reward and Recognition of Teaching at Institutional and Department level (jointly with Macquarie University)
• Conference paper: Flowers & Kosman, 2008, Teaching Matters: Developing Indicators of Teaching Quality, a comparative study, presented at AUQF2008 (jointly with Macquarie University)
• Proposal for institutional Indicators for Evaluating Reward and Recognition of Teaching at The University of Western Australia
• Proposal for changes to professional development of teaching at The University of Western Australia (endorsed)
• The University of Western Australia Teaching Criteria Framework – a new framework for the formative and summative evaluation of teachers, to be used for the development of academic portfolios.

At The University of Queensland, a new Curriculum and Teaching Quality Appraisal (CTQA) and Academic Program Review (APR) policy was developed, along with tools to display comparative indicators covering key aspects of student enrolment, teaching and learning outcomes, and student feedback (the CTQA dashboard and APR templates).

b. Diversity and inclusivity
The University of Tasmania identified the following outcomes as a result of its focus on diversity and inclusivity:

• Increased awareness of the range of activities currently supporting the transition of students from diverse backgrounds at the University of Tasmania
• Increased awareness of the diversity and inclusivity agenda
• Improved understanding of the use of performance indicators at the University of Tasmania
• Contribution to related projects.

Outputs included a survey/interview document and accompanying information sheet. A spreadsheet was also developed to record information relating to existing activities that occur throughout the University and impact on, or are impacted by, the transition of students into the University and the accommodation of student diversity. The primary advantage of the matrix was that it allowed for easier identification of parallel and/or similar initiatives being conducted by different areas of the University. This provided the opportunity to amalgamate, refine or redevelop some of these initiatives to improve efficiency and effectiveness for the benefit of the University.

c. Assessment
Griffith University focussed on assessment and its intention to develop quality indicators for the University and the sector. The major achievement here was the development of the ‘Statements and Quality Indicators of Good Practice in Assessment’ (SQIs), which were tested and evaluated, and used to inform and underpin a comprehensive review of the University’s assessment policies and procedures. Other outputs included professional development materials, including an annotated bibliography on good practices in assessment; a Good Practice Guide to Feedback; workshops; briefings to stakeholders that increased literacy around the concept of quality indicators; and increased awareness of assessment as both a research-based discipline and an everyday practice. RMIT University shared this focus and undertook the RMIT Assessment Standards Project, which initiated a systematic examination of the status of assessment policy within the organisation and created the opportunity to engage RMIT University staff in re-visioning its assessment policies and practices to improve assessment for learning. This included the development of the RMIT Assessment Policies and Procedures Manual, supported by an Assessment section in the newly developed online resource, Guide to Teaching, which shared innovative assessment practices that also improve learning outcomes.

d. Engagement and learning community
At Deakin University, the focus on student engagement resulted in a research-led approach to the project, with the intention of promoting incentives that encourage a culture of excellence in teaching and learning within the institution. The Student Engagement Database was initiated, in which existing strategies and resources within Deakin for enhancing student engagement through teaching were mapped against the University’s 8 Principles for Teaching, Learning and the Student Experience. The database provided contextualised examples, exemplars, suggestions and advice on how to engage students in the teaching and learning context, together with ideas that staff may wish to pursue. This mapping informed the development of a suite of resources to assist staff to further engage students through teaching and learning activities. At the University of South Australia, the Employer Feedback Survey was developed as part of the ‘community engagement’ facet of the engagement and learning community dimension of the TQI Project, which considers the wider context of university, industry and community engagement. The survey was designed, trialled and evaluated, gathering feedback from key industry stakeholders and graduate employers about graduate qualities and other learning outcomes. The project also responded to a desire to collect this type of information at school level.

Pilot university project sustainability and transferability
Project sustainability and transferability were identified in positive terms by the participating universities. It was generally acknowledged that:

• the resources/tools developed as a result of the projects could be adapted by other universities to their own contexts; and

• the TQI Project provided opportunities to interrogate and add value to policies, processes and practices within the participating universities.

a. Deakin University

Deakin University identified numerous strategies that have the potential to sustain institutional awareness of, and commitment to, recognising and rewarding teaching that engages students. The outcomes have been embedded within institutional policy and practice. These changes will be carefully monitored to ensure sustainability and, where possible, further development within the funding priorities of the University. Deakin University experienced a high level of interest and involvement in its student engagement forum, and it is hoped that the same approach and methodology might be successfully applied to other dimensions of teaching practice. Deakin University suggests that, as more institutions commit to the TQI framework, their activity will become more relevant and accessible, and this will aid the transfer of project outcomes across each dimension of teaching practice.

b. Griffith University
It was anticipated that two major assessment projects undertaken at Griffith University would continue the implementation of the TQI Project and provide further tests of the applicability, usefulness and adaptability of the indicators. The University expects the SQIs to provide insight into the value and success of such interventions. Concurrent with the two major projects, work will continue on other initiatives such as the implementation and change management of the reviewed Assessment Policy, student surveys, and Course and Program improvement systems. The most likely direct input Griffith University will have to the transfer of the project to other institutions will be through the draft Benchmark Statements in Assessment that have been developed and disseminated, the case studies to be produced by the Project Leadership, and the research papers to be written about the project at Griffith.

c. Macquarie University
The focus on academic appointment and promotion, and the agreement that academic appointment, probation, performance development, promotion and professional development must all align, has provided Macquarie University with the opportunity to embed the concept of quality learning and teaching within its structures through the development and implementation of a University Policy Framework. The TQI Project provided the impetus for Macquarie University to develop benchmark statements intended to be shared across the sector. The benchmarking work undertaken by Macquarie University and The University of Western Australia to rework the TQI framework into a tool that can be applied in any area and at different levels, with a clear methodology for implementation, facilitates the sustainability and transferability of the project. It also provides other institutions with a more robust and measurable tool with which to interrogate their policies and processes than the original TQI framework tables.

d. The University of Queensland
At The University of Queensland the new CTQA/APR policies will have ongoing resource implications for the University. Moreover, because implementation of the CTQA/APR policy resides with the schools and faculties, their continued effort and engagement with the policy will be vital to its success. The University of Queensland also suggests that, given that all universities are grappling with the need to ensure that (a) effective curriculum review processes are undertaken on a regular basis and (b) sufficient attention is paid to key teaching performance indicators, the potential transferability of the project outcomes is high, particularly the key elements of the revised policy and the CTQA dashboard.

e. RMIT University
RMIT found that the key to sustaining the project outcomes was to integrate them within the priorities of the institutional policy and strategic directions context. This provided the opportunity to systematically examine issues that were significant to the organisation and helped to sustain project interest within the RMIT University community. Strong motivation for the project at the teacher level was an essential characteristic for project sustainability. Consequently, it was important to make explicit the connections between the project and the institutional context, its practices and issues, to help staff see the project’s relevance and to advocate for it. As the project offered staff opportunities for ‘getting involved’, the potential for the project to become sustainable also increased.

f. University of South Australia
The implementation of improvements to the survey tool used in the Employer Feedback Survey at the University of South Australia makes it more sustainable as an indicator, as well as more transferable across the sector. The requirement to benchmark graduate attributes and other outcomes across institutions supports the further evolution of the survey tool and process, with continuing improvements in efficiency and accuracy. In this way, refinement for the next survey will allow for transferability across institutions.

g. University of Tasmania
At the University of Tasmania, the TQI matrix has been developed so that it can be adapted for use across each of the Teaching Quality Framework dimensions, and will be made available across the sector. Further development of the matrix to convert it into an on-line system with flexible reporting options will enhance its usefulness. Establishing links between the TQI Project and other current and future initiatives will also be essential in ensuring sustainability of outcomes. The processes developed through the TQI Project are adaptable to quality assurance initiatives more broadly.

The development of good practice benchmark statements and assessment criteria is expected to provide further impetus for the project outcomes to be effectively carried forward. These statements, along with similar statements developed by other universities involved in the project, are designed to be transferable, and have the potential to provide consistency across all institutions.

h. The University of Western Australia
The sustainability of the project at The University of Western Australia relies on its integration into mainstream university business, and input will continue to be sought from a range of stakeholders within the faculties, schools and administrative sections of the University as implementation progresses. The incorporation of the Teaching Criteria Framework into the academic portfolio guidelines contributes significantly to the project’s sustainability. A number of other proposals relating to the TQI Project will also be implemented at the University, including the introduction of promotion, recognition and reward indicators; the maintenance of the database; and the improvement of professional development of teaching through measures such as improved sessional staffing policies. In relation to transferability, the methodology used for the pilot project at The University of Western Australia was designed to be transferable both to other aspects of the TQI framework and to other institutions implementing the framework. This, along with the tools developed at the University (including those developed jointly with Macquarie University), means that The University of Western Australia experience is readily transferable.

Pilot university project impacts
a. Deakin University

Deakin University indicated that the TQI Project has resulted in further interest and activity in terms of student engagement. Participation in the TQI Project provided a valuable means to begin to robustly evaluate teaching and learning related to student engagement within the University in a way that was both systematic and transparent.


b. Griffith University

The TQI Project provided the impetus and resources for Griffith University to: audit and review its assessment policies; conduct research into good practices in assessment and what practices it wants to focus on; inform the review and re-drafting of the assessment policies and procedures; and create and facilitate a focussed conversation about assessment practices and their quality.

c. Macquarie University
The introduction of a policy framework has had a major impact on how policy development and approval are undertaken across Macquarie University. This has also contributed significantly to raising the profile of learning and teaching throughout the University.

d. The University of Queensland
At The University of Queensland the new CTQA/APR policy was very positively received. The CTQA dashboard, the template reports and the clear instructions on requirements for the new processes also met with positive responses.

e. RMIT University
The TQI Project offered the opportunity to systematically examine the assessment issue across RMIT University and to learn from the experiences of others. This led to the establishment of the RMIT Assessment Standards Project.

f. University of South Australia
The value of the Employer Feedback Survey pilot at the University of South Australia has been in the identification of key improvements that enable the tool and process to become viable and sustainable as a quality indicator from its next iteration, and then transferable across the higher education sector. Data gathered have already informed part of the AUQA audit on internationalisation being conducted, so information about what employers and industry think about the University’s graduates can and will be applied across many areas.

g. University of Tasmania
The primary impact of the project at the University of Tasmania was raising the profile of evidence-based practice and the use of performance indicators. With the focus on transition and diversity, a related impact has been an increase in conversations about these issues and reinforcement of the priority that the University of Tasmania currently places on them.

h. The University of Western Australia
The TQI framework allowed The University of Western Australia to shift the focus of discussion about teaching quality toward an understanding of the way in which institutional climate and systems can affect student learning outcomes. Information gathered by the project, along with other internal review mechanisms, resulted in the Pro Vice-Chancellor (Teaching and Learning) announcing a wide-ranging and ongoing review of all student evaluations used by the University. The introduction of the Teaching Criteria Framework is changing the way academic staff approach the evaluation of their teaching and the way that quality is defined.

Pilot university project future plans
a. Deakin University

Deakin University’s involvement in the TQI Project has created a platform for further development and change, and many possibilities have been identified. Preliminary work on assessing the viability of capturing a very wide range of teaching and learning initiatives that promote student engagement has also begun. In ongoing academic staff development programs, the University would particularly like to focus on supporting and developing sessional staff, who are often ‘front and centre’ in terms of interaction with students and, therefore, key to ensuring high-level engagement. Deakin is considering the introduction of an additional institutional award focused on achievements in the area of student engagement, as well as expanding the current criteria for such recognition and reward systems. Deakin will also actively continue to seek external benchmarking opportunities with appropriate partner universities that are similarly focused on student engagement.

b. Griffith University
A long-term goal of Griffith University’s TQI Project is the improvement of student learning outcomes and experience through enhanced teaching and assessment practices. It may be difficult to measure this outcome directly, but it remains the underlying motivator for the project. The projects undertaken at Griffith University also provide direction for decisions about professional development needs across the University and about how best to allocate people and other resources to improve curriculum practices and, thereby, student learning outcomes.

c. Macquarie University
At Macquarie University it was acknowledged that the policy development and review aspect of the project will be ongoing. In recognition of the central role that policy development and implementation play in the operations and activities of the University, funding has been provided for a position to permanently support the institution-wide policy framework. A scoping exercise is already under way looking at developing a benchmark tool at the department/program level within the area of assessment. Work in this area will focus on moving teaching from being seen as an individual activity to one involving a range of staff. There was support to continue the momentum achieved through the development and approval of selection criteria for academic promotion, so as to facilitate alignment with the Performance and Development Review.

d. The University of Queensland
The outcomes of the current project have been embedded within the University’s teaching quality assurance framework. Hence, the project will continue within the context of further development and monitoring of the CTQA/APR policy and procedures. Further refinement and development of the policy and mechanisms for capturing data on teaching, learning and student outcomes is anticipated.

e. RMIT University
As a result of activities undertaken under RMIT University’s Assessment Standards Project, a number of possible activities were considered for the next period of the TQI Project and could be undertaken as separate projects beyond 2009. Furthermore, it might be useful to set boundaries around projects by targeting particular programs/Schools to meet the intended aims of the project and/or by identifying areas of priority within a project.

f. University of South Australia
At the University of South Australia, continuation of the Employer Feedback Survey will sit across two areas. Planning and Assurance Services would be responsible for quality assurance of the survey. Careers Services, within the Learning and Teaching Unit, would then take responsibility for harnessing the outcomes of the pilot to run the survey into the future.


g. University of Tasmania
The University of Tasmania anticipated continued development of the TQI Project, with processes and procedures for quality assurance and benchmarking to be developed, trialled and documented to provide a detailed toolkit for use in addressing other focus areas.

h. The University of Western Australia
Implementation of a number of proposals arising from the TQI Project is ongoing, including implementation of the Teaching Criteria Framework, changes to professional development in teaching, and the development of institutional-level indicators. UWA has embarked on a benchmarking project with Macquarie University using the best practice statements for the reward and recognition of teaching developed collaboratively by the two universities through the TQI Project. The full list of reports from each of the pilot universities, with web access details, is provided in Appendix 3; the reports are available from the project website.

Summary of outcomes, factors for success and future plans
Overall, the members of the project team and the pilot universities, as reported to the Reference Group and the ALTC Board, identified that the following outcomes were achieved by the TQI Project:

1. Contribution to scholarship on teaching and learning indicators.
2. A framework that has been tested across a range of teaching quality indicators.
3. A core set of benchmark statements, performance indicators and performance measures that can be shared between institutions.
4. A core set of materials that can be used to undertake the process of developing and embedding institutional indicators around the framework.
5. A process for institutions to engage in renewal of teaching and learning.
6. The development of a shared language regarding teaching performance.
7. An institutional and individual approach to teaching quality, tested through practice.
8. Enhanced opportunities and tools for benchmarking.
9. Case studies and experiences which provide resources for the sector.

Factors for Success
Factors considered important for the success of the TQI Project were summarised at the meeting of the Pilot Leaders and Project Officers held in August 2008 as follows:

1. One of the aims of the project was to enhance awareness of internal work at each university and to share expertise within and between universities. The Project Officers worked with a wide range of units, bringing together people from different areas and parts of the university. The process of consultation was emphasised across and within levels of the university.

2. One of the factors that ensured the commitment to change was that each university was in a context of change, for example, a change to senior leadership, an upcoming AUQA audit or an organisational restructure. The generally positive response to the concept of TQI facilitated staff willingness to change.

3. The audits of each university’s teaching and learning practice, and the initiatives developed as part of the project, were systematic, and deeper and broader than might have been possible otherwise. The approach was not top-down but grounded, increasing the sustainability of processes and structures. To increase sustainability, quality initiatives should be embedded within, or build upon, existing systems and structures whenever possible.


4. One of the challenges to measuring teaching quality is initiating dialogue and developing a shared language amongst individuals and organisations. Through the work of the Project Officers, and disseminating the project’s outcomes, some progress has been made towards meaningful discourse on teaching quality.

5. One of the aims of the project was to develop indicators of teaching quality, rather than wait until some were developed and enforced as part of the QA process. The focus of the project was not compliance but enhancement/aspiration and rewarding and recognising, which created a positive experience.

6. Another aim of the project was to enable universities to measure quality, and so the process of identifying and using evidence, measuring outcomes and targeting initiatives was critical in providing a means by which quality could be measured.

7. The senior level stewardship of Pilot Leaders and allocated Project Officers was critical in ensuring the project would have an impact across the university, through enhancing staff ownership and engagement.

Further discussion and analysis revealed that the hub-and-spoke model of the project design was critical to the success of the project. The central group provided ongoing support, resources and direction, and helped maintain momentum, while those in the pilot universities provided specific contexts and institutional expertise. The first meeting of the leaders, and the subsequent first meeting of the project managers, was crucial in establishing shared values and a common understanding of key concepts, directions, priorities and common ground. The regularly scheduled meetings of the pilot university leaders and the project managers provided additional opportunities for networking, engaging with and identifying common issues, and seeking support and resources.

Future plans for the project
Through consultation and feedback from the various stakeholders, including the pilot university leaders, the members of the Reference Group and the Universities Australia DVC/PVC(A) group, it has become apparent that there is a need for a set of resources to be developed, tentatively called the Teaching Quality Indicators Resources and Toolkit. It is intended that this will be a web-based resource that will include the full set of reports listed above. In addition, the following resources will be further developed and considered by the project universities.

• Rationale for using teaching performance indicators in universities
• Guidelines for carrying out an audit of teaching and learning
• TQI Audit and review of university Teaching and Learning Systems and Practices
• Analyses of indicators by dimension and type

The Project Team has focused on developing core indicators for use at different levels in the university, with appropriate tools. The following documents illustrate the resources that have been developed, are in partial development, or are intended to be developed in the future.

TQI Framework of benchmarking statements (in development): A set of global benchmarking statements, intended to assist institutions to determine the dimension on which to focus.

Indicator statements for each dimension (Appendix 1): A set of indicator statements for each dimension (Assessment, Reward and recognition of staff, Engagement and learning community, and Diversity), identified by indicator type. These have been drawn from the original sets of tables developed to assist universities to map and audit their practice.

Benchmarking statements for each dimension (in development): A number of the pilot universities have been constructing benchmarking statements, drawing on the project tables and on expertise within their own university contexts (e.g. Macquarie University and The University of Western Australia for Reward and Recognition of Quality Teaching, and Griffith University for Assessment). It is intended that these will be developed for each dimension to provide a common set that could be used as benchmarking statements across the university sector.

Subject-level student survey items by TQI dimension (Barrie et al., 2008): A provisional set of survey items was identified for each of the proposed TQI dimensions, for use at the level of individual subjects. The items will provide the basis for further consultative development and validation by universities participating in the Teaching Quality Indicators Project (now available on the project website).

Dissemination

Dissemination has been a key component of this project to ensure that the reports, documents, resources and experiences would be shared and available for comment and exchange. An extensive body of reports has been produced and distributed widely via direct mail, the project website, forums, and visits and presentations at universities (see Appendix 2 for a full list). Draft documents were developed and circulated widely, for example to the pilot project teams, the Reference Group, and sector groups such as the Universities Australia group of DVC/PVC(A)s and the Council of Australian Directors of Academic Development (CADAD), and provided as draft versions on the project website. Comments were invited and responded to before the documents were made available via the website. All materials were circulated first to the members of the project team and then to the members of the Reference Group prior to wider circulation. Table 4 provides an overview of the dissemination activities carried out throughout the project.

Table 4: Overview of dissemination activities of the TQI project

• Progress reports presented to the Universities Australia group of DVC/PVC(A)s; included as a standing item 2007-2008: November 2006; April 2007; November 2007; April 2008; October 2008.

• Pilot Group Meetings: 27 March 2007; 16-17 August 2007; 15-16 October 2007; 16 November 2007; 22 April 2008; 28 August 2008; 14 November 2008. These meetings were scheduled over two days. The first day was for the pilot project team members to consult, share experiences and undertake training in project management. The second day was attended by the DVC/PVC(A)s together with the project officers and national project team to discuss progress, identify strategic decisions, and share experiences and resources.

• Reference Group Meetings: 18 April 2007; 24 September 2007; 27 May 2008; 20 October 2008; February 2009 (by circular).

• Presentations at conferences, meetings and forums throughout Australia and internationally over a three-year period: see Appendix 2 for the full list of presentations and titles, e.g. conference presentations including IMHE/OECD, HERDSA, AUQF (2007, 2008, 2009) and ICED.

• Publication and distribution of reports, and publications in refereed journals and conference proceedings: see Appendix 2 for the full list.

• Pilot project dissemination activities: detailed in the Pilot University Project reports available on the ALTC project website.

A more detailed outline of dissemination activities undertaken throughout the project is reported in Appendix 2.

Linkages

With the ALTC objectives and projects

The Teaching Quality Indicators project has been comprehensively aligned with the objectives of the Australian Learning and Teaching Council. For example, it has:

1. Promoted and supported strategic change in higher education institutions for the enhancement of learning and teaching, including curriculum development and assessment, through:
• The grants awarded to eight Australian universities to engage in the project to develop and implement teaching and learning indicators within the university. The grant enabled the appointment of a Project Officer for each of the eight pilot universities to audit policy and practice in order to determine areas in need of change, related to the framework, and particularly those that would impact most significantly on the student learning experience.
• The development of a Teaching Quality Framework with comprehensive resources to support the framework.
• Reporting regularly to the Universities Australia Group of DVC/PVC(A)s on the project and the experiences of the pilot universities as the project progressed.
• Providing a comprehensive audit of all Australian universities' teaching and learning indicators and a detailed report to each university of these indicators in common tables so they could support benchmarking and identify good practice.
• Producing a number of widely disseminated reports.
• Contributing to national discussion on teaching quality indicators.

2. Raised the profile and encouraged recognition of the fundamental importance of teaching in higher education institutions and in the general community through:
• The revision of teaching criteria for appointment and promotion of academic staff at pilot universities (e.g. Macquarie University, The University of Western Australia, RMIT University and the University of Tasmania). This subsequently informed a number of Australian universities, which reviewed their practices against the indicators provided.
• The promotion of a framework of quality teaching within and beyond Australian universities.
• Widespread circulation of, and engagement in, the development of the resources with the Australian directors of academic development over a number of forums, university visits and conference presentations.

3. Fostered and acknowledged excellent teaching in higher education through:
• Showcasing university practice by detailing the extent and type of teaching indicators, including teaching excellence awards, in universities in the Snapshot of Practice reports provided to each Australian university.
• Developing indicators and benchmarks that explicitly address the reward and recognition of teaching quality in universities under the Teaching Quality Framework.
• Reporting and synthesising international and national literature on teaching and learning quality indicators in comprehensive reports.

4. Developed effective mechanisms for the identification, development, dissemination and embedding of good individual and institutional practice in learning and teaching in Australian higher education through:
• The model of the project, which involved a central team with project outcomes and eight pilot universities working independently and together, including reviewing and auditing their policies and practices, and benchmarking.
• The engaged dissemination strategies of regular reporting to the Universities Australia DVC/PVC(A) meetings, the Australian Quality Assurance Forum and the ALTC Board.
• Contributions to multiple fora as individual pilot universities, combined team presentations and workshops, and national team presentations.
• Engagement with a number of Australian and international universities, sharing experiences and resources.

5. Developed and supported reciprocal national and international arrangements for the purpose of sharing and benchmarking learning and teaching processes through:
• Wide dissemination of reports.
• Establishing benchmarking tools and facilitating engagement between universities.
• Meetings with international colleagues on quality indicators, e.g. meeting with and presenting to delegations of senior leadership from Chinese, Swedish, Japanese and UK universities; the Higher Education Academy (UK); and universities in the United Kingdom, Sweden and Hong Kong. International conference presentations, e.g. OECD, ISSOTL, HERDSA, AUQA and ICED.

6. Proactively identified and instigated a response to the issue of developing learning and teaching indicators, which has the potential to impact on the Australian higher education system and to facilitate national approaches to address these and other emerging issues, through:
• Initiation of the project and establishment of a process and strategy to achieve the outcomes.
• Reports on national and international quality models and practices, a national review of student evaluation of teaching surveys, and development of a process to review institutional student surveys at the unit/subject level.

Relationships and potential links to other ALTC projects

The Teaching Quality Indicators project is aligned with the objectives of the Australian Learning and Teaching Council (see above). Given these links between the TQI project and the ALTC's strategic objectives, many of the projects supported by ALTC grants involve practices and initiatives that could be enacted or adopted by a university when implementing the framework of quality teaching practice. These include projects that focus on the dimensions of assessment, diversity and engagement.

Institutional links within pilot universities

One of the outcomes of the project was that it raised awareness within and between institutions of current efforts to enhance the quality of teaching and learning. An example of this was the trial of the Australasian Survey of Student Engagement (AUSSE), designed to stimulate evidence-focused conversations about students' engagement in university study. The TQI pilot universities participating in the trial of the AUSSE were Deakin University, Griffith University, Macquarie University, RMIT University, The University of Queensland, The University of Western Australia, University of South Australia, and University of Tasmania. A number of the pilot universities shared their university data to better understand their performance and to begin working towards further benchmarking, e.g. The University of Queensland and The University of Western Australia.

Some of the pilot universities' internal activities are provided below to illustrate how a project focused on teaching quality can affect, and be affected by, multiple initiatives across a university. For a more comprehensive set of each pilot university's initiatives, systems and processes linked to this project, see the pilot university reports.

Deakin University
• Higher Education Research Group (HERG), facilitating scholarship and research in higher education including teaching and learning: the group was established in 2008 and now has over 200 members from across Deakin University. Members have won external and internal funding to conduct scholarship and research to inform teaching quality initiatives. The aims of the group include providing a research-led evidence base for teaching practice improvement and enhancements to student learning, student engagement and the student experience, and informing key policy and committee decisions in these areas.
• Professional development and leadership: a suite of resources, now hosted on Deakin University's Student Engagement Website, has been embedded through its inclusion in the Course Quality Framework. In addition to evidence-based but practical advice for teaching staff on student engagement, the website now also contains advice for students on how to become engaged, and for teaching and learning leaders on how they can contribute to the quality of teaching, learning and student engagement. Links to online and face-to-face professional development programs have been established.

Griffith University
• Professional development:
  - Development of resources on assessment
  - Program Convenors' Network, hosted by GIHE, on assessment issues
  - Scholarship of Learning and Teaching Community of Practice: sharing of resources on good practices in assessment developed through TQI
• Committee tasks/representation:
  - Assessment Policy Review Working Party: review of assessment policy, procedures and guidelines informed by TQI, with joint membership of the TQI Leadership Group
  - Griffith Awards for Excellence in Teaching for "programs that enhance learning": assessment is one of seven priority areas
  - Identifying teaching excellence initiative of the Education Excellence Committee
  - Implementation of the Work Integrated Learning, Blended Learning and Internationalisation of the Curriculum strategies: assessment implications
• Stakeholder networks:
  - Faculty Learning and Teaching Committees
  - Learning and Teaching Committee
  - Deans Learning and Teaching Forum
  - Heads of Schools meetings with GIHE staff (identification of need/requests for professional development on assessment)
  - School Administration Officers' Forum

Macquarie University
• New performance development and review process: a framework for identifying, evaluating and developing the performance of Academic and General Staff, linked to the achievement of individual, work area and organisational goals.
• Establishment of a Learning and Teaching Centre: amalgamation of the Centre for Professional Development and the Centre for Flexible Learning into the Learning and Teaching Centre, which assists the university to enhance student engagement, learning experience and outcomes by leading and supporting the University to develop and maintain the quality of its staff, curricula, and infrastructure for learning and teaching.

RMIT University
• Learning and Teaching Scholarship Series: establishing communities of practice where sharing of ideas and thinking about learning and teaching excellence are seen as essential to influence changes in teaching practices.
• Policy for new academic staff: to increase the profile of learning and teaching excellence, all new Level A and B academics without an education qualification are supported to acquire a Graduate Certificate in Tertiary Teaching & Learning. This is mandated as a condition of probation.

The University of Queensland
• Quality management and assurance framework, key elements of which include a commitment to improvement and excellence through a comprehensive process for the review of policies, schools, centres, institutes, central organisational units and academic programs (including generalist degrees); and the monitoring of performance indicators for research (research activity statements and outcomes of RQF/ERA preparations) and teaching and learning (annual curriculum and teaching quality appraisal process) to assist in assessing performance against strategic objectives.
• Creation of teaching-focussed academic positions: the University created teaching-focussed appointments which also carry an obligation to undertake scholarship in teaching and learning and to contribute to the development of pedagogy in their discipline. By July 2007, appointments to the new positions had been made within all faculties and at all position levels. This is part of UQ's process to build staff development, capability and understanding of the importance of teaching and learning as a major part of their contribution to the community.

University of South Australia
• Careers Services Work Placement Scheme: to supplement school-based formal clinical and field placements, work on industry-based projects and liaison with people from industry and the professions, Careers Services offers final-year students support in undertaking a placement in their own time and ensures that they contribute valuable knowledge and skills to the organisation in which they are placed.
• Collaborative partnerships: to develop and maintain partnerships and linkages with industry and government, and to ensure research undertaken at UniSA remains relevant to the rapidly changing needs of industry and translates the research of today into the new markets, goods and services of tomorrow, UniSA has a program for collaborative research projects jointly sponsored and funded by the University and an industry partner (with government funding in some circumstances).

University of Tasmania
• Pathways Program: seeks to provide individualised pathways for students entering the University from diverse educational backgrounds.
• Creating Accessible Teaching and Support: an ALTC-funded project that aims to improve the quality of teaching and support for students with disability, for example through providing information on inclusive teaching, assessment and curriculum design.

The University of Western Australia
• Proposal for a new academic career structure: this proposal makes an explicit link between performance and access to particular salary increments for the first time, making the development of robust teaching criteria that can be used to evaluate staff an even greater priority. It also makes confirmation of ongoing appointment for Level B academic teaching staff contingent on successful completion of the Foundations of University Teaching and Learning program, which directly affects the way UWA manages professional development of teaching.
• Review of all student evaluation of teaching surveys: this internal review, to be managed by the Office of the Pro Vice-Chancellor (Teaching and Learning), arises out of the TQI project and a small review of the university's SPOT (student perceptions of teaching) survey. The review will consider all mechanisms currently used to gain evaluative feedback from students, including the validity of the survey instruments, the policies and processes surrounding them, and their relationships to each other in providing robust evaluative data for the individual academic and the university. This review has direct implications for the Criteria sub-project, as student evaluations are an important evaluative tool for teachers, and for the Reward and Recognition Indicators sub-project, which is considering the potential for student evaluation data to be used at the school/faculty level.
• Planning for the new Operational Priorities Plan 2009-2013: at the time of writing, the University was in the planning stages for the new Operational Priorities Plan due to come into effect from 2009. This has direct relevance for the Reward and Recognition Indicators sub-project, which is involved in the development of quality indicators for teaching and learning matters.

International links

The project was discussed at length with Directors of the Higher Education Academy (HEA). For example, the CEO of the HEA, Professor Paul Ramsden, was fully briefed on the project during his visit to the ALTC in 2007, as was David Sadler, who had responsibility for the Quality Framework and the development of the Professional Standards Framework, during his visit in the same year. In 2008 the project leader, Denise Chalmers, visited the HEA in the UK, met with David Sadler to update him on progress, and gave a presentation on the project to HEA and QAA staff and HEA fellows. Presentations and discussions also took place with a number of UK universities. A presentation on the project was given at the IMHE/OECD conference in Paris in 2008 and at the ICED conference in 2010, and several discussions have taken place with colleagues from Hong Kong and Europe. Resources and reports have been shared. In Sweden, while attending an invited retreat with colleagues from the United States, Canada, England, Scotland, Australia, New Zealand and Sweden, there was an opportunity to discuss the project and its implications and applications for other universities. In addition, a visit was made to Lund University, Sweden, to talk with a number of senior staff about the project. Communication and discussions continue to take place.

Evaluation of the Project

A dissemination and evaluation plan was developed by the project team and approved by the Reference Group in May 2008.


The dissemination and evaluation activities for each stage were identified, actioned and monitored by the project team and reported to the Reference Group and the ALTC Board throughout the project as a formative process. In addition, external assessors were engaged to carry out independent, summative evaluations of two stages of the project.

Stage 1 evaluation

Professor Paul Ramsden, Executive Director of the Higher Education Academy in the United Kingdom, was contacted to identify an expert external reviewer of the reports developed during Stage 1 of the project. He nominated Mr David Sadler, Director (Networks) at the HEA, as the most appropriate person to carry out this review. Mr Sadler was provided with electronic and print copies of the reports. A follow-up meeting to discuss the scope of the evaluation was held between Mr Sadler and Professor Chalmers in September at the HEA, York. The report from Mr Sadler focused on the reports prepared for Stage 1:

1. Review of Australian and international performance indicators and measures of the quality of teaching and learning in higher education
2. Indicators of University Teaching and Learning Quality
3. International and National quality teaching and learning performance models currently in use
4. Snapshot of Teaching and Learning Practice in Australian Universities
5. Student surveys on Teaching and Learning: Final report

His review was provided to the Project Leader in November 2008. The Sadler evaluation of the body of work was very positive. He raised one point of oversight: the absence of a subject or discipline level in the framework, citing evidence of between-subject variation in teaching quality and effectiveness and in student perceptions. In addition to commenting on each report individually, he made the following general observations on the reports:

• They are of high quality in terms of research, methodology and scope.
• They show, where appropriate, wide engagement in the overall project by representatives from across the Australian HE sector.
• They are focussed and well written and achieve their individual objectives. They have a self-standing capacity in this sense outside of the overall project. I was especially struck, in this regard, by the Student Surveys on teaching and learning: final report, which is an excellent and well researched document which I think deserves a wide audience.
• They provide excellent examples and explanations of institutional practices in Australia, thus establishing a strong evidence base for the next stage of the project and the development of the Framework.
• They have, explicitly or as a by-product of the research, created a network of contacts and buy-in at senior levels of Australian HEIs to the overall project, which will be invaluable in the pilot implementation and sector-wide upscaling stages.
• As a collection they certainly achieve the intended goal of providing a comprehensive overview of the landscape. I have some quibbles with individual sections of some of the Reports and there is some (inevitable) overlap between some Reports, but my overall evaluation is that as a set, they achieve the first objective of Stage 1.
• There would be merit in an additional document which explained the rationale for the collection of Reports, how each links to the others, and how each builds capacity for the overall project.

These comments indicate that the project has achieved, and continues to achieve, its intended outcomes.


The Sadler report is included in the appendix of the larger evaluation of the project carried out by the independent evaluator, Carolyn Webb (refer to Appendix 3 and the ALTC project website for a copy of the full report).

Stage 2 evaluation

Carolyn Webb, an external consultant with extensive experience in evaluation and in the Australian higher education sector, was contracted by the project team to evaluate the project, with a focus on Stage 2, and to support and assist the project officers in the pilot universities in planning and carrying out their university evaluations. Based on the original terms of reference, Ms Webb provided a scoping document and, once this was approved by the project leaders and provided to the Reference Group, commenced work sourcing and collating documents. The sources of data used for her evaluation included the project plan and progress reports; pilot university project plans, progress and final reports; interviews with and surveys of pilot university project leaders and officers, and other stakeholders as relevant; and the project research reports. Ms Webb provided her preliminary review to Professor Chalmers in December 2008, with the final version submitted in May 2009. The evaluation report was widely circulated to key stakeholders, including all members of the Reference Group, pilot university project officers and leaders, and others who were involved in the project. Their comments on the evaluation report were sought, though no additional comments were received. The full report, Evaluation of the national ALTC Teaching Quality Indicators Project (Carolyn Webb, 2009), including the evaluation of Stage 1 by David Sadler, is listed in Appendix 3 and can be accessed from the project website.

Evaluation of individual pilot university projects

In addition to the overall project evaluations, each pilot university was required to carry out an evaluation of its university project. Carolyn Webb attended a number of project meetings and gave a workshop on the key issues involved in evaluation. She reviewed each of the pilot university evaluation plans submitted prior to the meeting and was available to provide advice and support to each project manager as they implemented their evaluation plan. The pilot university reports are listed in Appendix 3 and are located on the ALTC project website.


References

ALTC (2008). Review of Australian and international performance indicators and measures of the quality of teaching and learning in higher education. Report provided to DEEWR.

Barrie, S., Ginns, P. & Symons, R. (2008). Student surveys on teaching and learning: Final report. http://www.catl.uwa.edu.au/projects/tqi

Borden, V. & Bottrill, K. (1994). Performance indicators: Histories, definitions and methods. New Directions for Institutional Research, 82, 5-21.

Burke, J.C. & Minassians, H. (2001). Linking state resources to campus results: From fad to trend. The fifth annual survey (2001). Albany, New York: The Nelson A. Rockefeller Institute of Government.

Burke, J.C. & Minassians, H. (2002). Performance reporting: The preferred "no cost" accountability program. The sixth annual report (2002). Albany, New York: The Nelson A. Rockefeller Institute of Government.

Burke, J.C., Minassians, H. & Yang, P. (2002). State performance reporting indicators: What do they indicate? Planning for Higher Education, 31(1), 15-29.

Chalmers, D. (2007). A review of Australian and international quality systems and indicators of learning and teaching. Carrick Institute for Learning and Teaching in Higher Education Ltd, Sydney, NSW. http://www.catl.uwa.edu.au/projects/tqi

Chalmers, D. (2008a). Indicators of university teaching and learning quality. http://www.catl.uwa.edu.au/projects/tqi

Chalmers, D. (2008b). Defining teaching and learning indicators in universities. http://www.catl.uwa.edu.au/projects/tqi

Chalmers, D. (in press). Progress and challenges facing the recognition and reward of scholarship of teaching in higher education. Special issue on the scholarship of teaching and learning, Higher Education Research and Development, 29.

Chalmers, D. (in press). The practice and use of student feedback in the Australian national and university context. In P. Mertova & S. Nair (Eds), Student feedback: The cornerstone to an effective quality assurance system in higher education. Cambridge, UK: Woodhead Publishing.

Chalmers, D. & Sachs, J. (2007). Quality teaching and learning in higher education: What is it and how do we know? Paper presented at the 30th HERDSA Annual Conference: Enhancing Higher Education, Theory and Scholarship, Adelaide, 8-11 July.

Chalmers, D. & Thomson, K. (2008). Snapshot of teaching and learning practice in Australian higher education institutions. Carrick Institute for Learning and Teaching in Higher Education Ltd, Sydney, NSW. http://www.catl.uwa.edu.au/projects/tqi

Chalmers, D. & Thomson, K. (2009). Four dimensions of Teaching Quality Framework indicator tables. http://www.catl.uwa.edu.au/projects/tqi

Chalmers, D., Lee, K. & Walker, B. (2008). International and national indicators and outcomes of quality teaching and learning currently in use. The Australian Learning and Teaching Council. http://www.altc.edu.au/teaching-quality-indicators

DEEWR (2009). An indicator framework for higher education performance funding: Discussion paper, December. www.deewr.gov.au/tahes

Fisher, D., Rubenson, K., Rockwell, K., Grosjean, G. & Atkinson-Grosjean, J. (2000). Performance indicators: A summary. A study funded by the Humanities and Social Science Federation of Canada / Fédération canadienne des sciences humaines et sociales. Retrieved 2 February 2008 from http://www.fedcan.ca/english/fromold/perf-ind-impacts.cfm

Guthrie, J. & Neumann, R. (2006). Performance indicators in universities: The case of the Australian university system. (Submission for Public Management Review, final, February 2006).

Rowe, K. & Lievesley, D. (2002). Constructing and using educational performance indicators. Background paper for Day 1 of the inaugural Asia-Pacific Educational Research Association (APERA) regional conference, ACER, Melbourne, 16-19 April 2002. Retrieved 11 April 2007 from http://www.acer.edu.au/research/programs/documents/Rowe&LievesleyAPERAApril2002.pdf

Rowe, K. (2004). Analysing and reporting performance indicator data: 'Caress' the data and user beware! ACER, April; background paper for the Public Sector Performance & Reporting Conference, under the auspices of the International Institute for Research (IIR).

Smart, J.C., Feldman, K.A. & Ethington, C.A. (2000). Academic disciplines: Holland's theory and the study of college students and faculty. Nashville, TN: Vanderbilt University Press.

Appendix 1

Denise Chalmers & Kate Thomson: Teaching Quality Indicators Tables May, 2009 37

Four dimensions of Teaching Quality Framework Indicator Tables (May 2009)
Denise Chalmers & Kate Thomson

This document contains the summary quality indicator tables for four dimensions (Institutional climate and systems, Diversity, Engagement, and Assessment) under four types of indicators (Input, Process, Output and Outcome) for four organisational levels.

Figure 1: Framework of Teaching Quality

These dimensions and indicators have been broken down by level within the institution: Institution-wide, Faculty, Department/program, and Teacher/Individual. It may be appropriate for an institution to consider also including campus (off-shore, regional, multiple sites, distance, etc.) where multiple campuses are involved.

Performance Indicators for Quality Institutional Culture
TQI project: http://www.altc.edu.au/teaching-quality-indicators

A quality institutional culture is characterised by a commitment to the enhancement, transformation and innovation of learning. This is evident in the systems and climate of the institution.


Reward and Recognition - Appointment, Performance Appraisal and Management

Level | Input | Process | Output | Outcome

Institution

1. Staff qualifications and experience

2. Resources for introduction and orientation information, materials that include T & L goals, objectives

3. Funding/resources allocated to teaching; e.g. awards, grants, projects that focus on the scholarship and aspects of teaching and learning.

4. Funding model that recognizes faculty/department teaching performance and teaching quality on defined indicators

5. Criteria and processes on appointment and progression career positions/path(s) which include contributions for teaching and learning.

6. Policy and processes which link criteria to performance review, promotion, access to study leave, etc.

7. Developmental/ enhancement approach to career planning.

8. Policies, processes and criteria for regular performance review.

9. Criteria and quality of evidence required to demonstrate teaching performance (e.g. peer review, student feedback)

10. Provision of professional development for staff and supervisors.

11. Provision of foundation and teaching qualification programs

12. Number of enquiries, applications, appointments by type of contract, staff characteristics, faculty

13. Successful applications and total support for study/ conference leave with T & L focus

14. Number/proportion of staff applying and gaining promotions with a teaching emphasis

15. Funding spent on teaching grants, awards

16. Number of staff participating in professional development programs by faculty, department.

17. Number/proportion of staff completing foundation, teaching qualification programs.

18. Review of appointment and promotion practices and policy implementation

19. Periodic survey of applicants on process and information provided, messages obtained.

20. External evaluation and benchmarking of policy and processes, criteria and evidence required.

21. Student perception of learning; e.g. CEQ

22. Student perception of teaching e.g. SET/ AUSSE scores aggregated by school and faculty.

23. Outcomes of teaching projects, grants etc.

Department/Program

1. Staff mentored, provided with leadership opportunities and encouraged to apply for promotion.

2. Staff provided with a range of teaching and leadership opportunities.

3. Staff positions, levels and workloads are monitored and managed, including sessional teacher management.

4. Funding model reflects workload and program of study needs.

5. Workload models and allocation practices recognise a range of teaching contributions and changes of emphasis.

6. Workforce planning that includes discipline focus, knowledge and skill sets required for programs of study and the department's skill and knowledge base.

7. Supervisors trained on T & L criteria and development of staff performance, career planning relevant to discipline and organisational unit, linking with institutional requirements.

8. Policies and practice of performance management linked to institutional policies.

9. Evidence of policies being monitored and staff actively supported to participate in and attend programs, further study and development.

10. Implementation of policies on evidencing teaching quality (peer review, student feedback).

11. Active monitoring and support for performance review and development.

12. Provision of support and training for sessional teachers and their supervisors.

13. Number/proportion of staff with teaching qualifications.

14. Numbers of staff, including sessional teachers, participating in training by program/unit.

15. Numbers of staff in different types of positions applying for promotions (by equity group etc.) and number of successes.

16. Number of staff who have undertaken annual/biannual review.

17. Number/proportion of supervisors who have undertaken supervisor training.

18. Supervisor/peer review reports of teaching quality to promotion committees and performance review.

19. Reports on teaching research/scholarship in the program/department.

20. Student perception of learning; e.g. graduate capabilities.

21. Student perception of teaching e.g. SET/AUSSE scores aggregated by program and year.

Teacher

1. Undertake relevant qualifications and development

2. Develop career plan with supervisor and review annually

3. Identify experience required and actively seek opportunities to gain that experience/role.

4. Prepare and develop portfolio for annual review; development plan for discussion with supervisor

5. Develop scholarship /research in field/discipline.

6. Active monitoring and evaluation of development of teaching qualities

7. Seek and respond to student evaluation of teaching on teaching qualities.

8. Curriculum development, utilisation of technology, assessment practice etc is appropriate, relevant, current and informed by scholarship.

9. Number of peer review and student feedback responses on teaching quality relevant for position/level of appointment

10. Attendance at programs

11. Evidence of a sustained commitment to quality teaching and to student learning

12. Mentor and monitor sessional/tutorial staff in programs/units under teacher responsibility; contribute to development programs

13. Monitor quality of student learning outcomes

14. Student perception of teaching e.g. SET/ AUSSE scores for unit

Performance Indicators for Quality: Diversity

TQI project: http://www.altc.edu.au/teaching-quality-indicators

Diversity relates to ethnic, cultural, and socioeconomic diversity, as well as diversity regarding abilities, talents and learning approaches.


Level Input Process Output Outcome

Institution

1. Students have access to learning support including specific support for equity students

2. Student background characteristics are collected, such as student entrance scores, student alternate entry test scores, social origins of students, student gender, equity status

3. Staff background characteristics collected including qualifications/ experience, equity status.

4. Institutional funding for diversity identified in resource allocation and strategies

5. Obvious and consistent leadership showing commitment for diversity and equity

6. Policy on diversity and inclusive practice for staff and students

7. Active strategies to attract and recruit students from targeted diversity groups

8. Provision of services and support reflect needs of diversity and equity staff and students

9. Student systems established to monitor student attrition, progression and completion by diversity group and across groups (e.g. gender and SES) to program level.

10. Active strategies to work with targeted diversity groups prior to enrolment addressing specific needs.

11. Provision of scholarships targeted to equity students

12. Provision of comprehensive program of professional development in equity, diversity, cultural communication, inclusiveness etc for all teaching and administration staff

13. Active strategies to attract, appoint and retain staff from diverse backgrounds.

14. Provision of grants, funding for targeted initiatives to support diversity and equity of students, staff

15. Attrition, progression, completion data, trend data by faculty, student characteristics equity group

16. Requests for accommodation, funding support etc for number referred, provided, unmet

17. Proportion of students from diversity groups mapped to population, targets, monitored for trends

18. Proportion of staff by diversity characteristics groups mapped to position levels, population, targets monitored for trends

19. Staff retention, career progression by staff characteristics

20. Number and type of equity and diversity cases, complaints

21. Student portfolio mapping learning, graduate capabilities

22. Student satisfaction (CEQ) and engagement (AUSSE) with support and university services

23. Staff satisfaction, work-life experience and engagement by group

24. Exit survey for students who do not complete.


Level | Input | Process | Output | Outcome

Department

1. Student characteristics by program of study

2. Student load

3. Resources/student ratio

4. Staff workload

5. Mentoring support programs offered within the department for staff and students

6. Staff diversity actively pursued

7. Provision of academic and personal counseling services matched to the student profile

8. Curriculum development and review that accounts for diversity and encourages and values different perspectives and contributions.

9. External discipline/program review of curriculum includes diversity lens

10. Staff informed of program requirements and study options or aware of whom to direct students to for program of study advice

11. Student retention, graduate destination by student characteristics for program of study

12. Proportion of staff diversity characteristics trends over time

13. Staff retention and departure by diversity characteristics

14. Student satisfaction with and awareness of support service

15. Review of exit survey for students who withdraw from study by year, by program

16. Staff exit survey, interview.

Teacher

1. Student background characteristics

2. Staff qualifications/ experience

3. Demonstration of commitment to diversity in teaching portfolio: evident in attending training; teaching content, teaching strategies, materials, assessment tasks

4. Flexible curriculum design e.g. presentation methods, pace, level and type of assessment tasks

5. Participation in mentoring (as mentor or mentee) with colleagues

6. Active response to specific diversity, equity needs

7. Student attendance and attrition by background characteristics

8. Student satisfaction with inclusiveness and respect for diversity.

9. Student achievement/ grades by diversity characteristics

10. Review of student exit survey data

Performance Indicators for Quality Assessment

TQI project: http://www.altc.edu.au/teaching-quality-indicators

The most direct measures of student learning are the assessment tasks students complete during their enrolled program of study. The design, delivery and administration, provision of feedback, established standards, and moderation and review processes for assessment are where universities should direct their attention.


Level Input Process Output Outcome

Institution

1. Student access to learning support including provisions for equity students

2. Student background characteristics, such as student entrance scores, student alternate entry test scores, social origins of students, student gender

3. Staff qualifications/ experience in designing and administering assessment

4. Institutional funding for assessment and assessment processes

5. Assessment policies address issues of pedagogy; accommodation for staff/student diversity; adoption of an evidence-based approach to assessment policies; commitment to formative assessment

6. Alignment between institutional policy for best practice and faculty/ departmental activities

7. Provision of professional development on the development, marking/grading and review of assessment for faculty, staff and administrators

8. Systematic review of assessment practices of standards of student learning (using a mix of internal and external reviewers/moderators)

9. Quality assurance policy processes and practices established and implemented across the university.

10. Communication processes and resources for students to advise on issues such as academic integrity, regulations and expectations

11. Student retention rate; by program and year level

12. Student attrition rate; by program and year level

13. Graduation rate

14. Graduate employment rate

15. Trend data of student completion by relevant characteristics, e.g. SES, entry scores, entry tests, equity group

16. Transition rate (from bridging to degree program, from diploma to degree etc)

17. Number and type of student appeals related to matters of assessment, academic integrity, monitored for trends

18. Collection of longitudinal assessment data using quantitative and qualitative data

19. SET, CEQ, AUSSE survey items on assessment

20. Student satisfaction with assessment tasks, requirements and feedback provided by subject and or program

21. Examiners’ and moderators’ reports on standards of learning

22. Examiner’s reports of academic standards, practices, outcomes

23. Student self-reported attainment of graduate attributes

24. Staff participation in professional development to develop academic literacies

Department

1. Student load

2. Resources allocated to assessment and moderation

3. Resources/student ratio

4. Staff workload

5. Staff qualifications/experience

6. Assessment policies address issues of pedagogy; accommodation for staff/student diversity; adoption of an evidence-based approach to assessment practices; commitment to formative or ‘learning-oriented’ assessment

7. Assessment is built into departmental planning and review; practice reviewed by discipline, staff involved in developing assessment tools

8. Assessment tasks systematically reviewed to assess the achievement of graduate attributes and goals/objectives of specific subjects

9. Reliable assessment standards developed and moderated/peer reviewed within the department and by discipline experts

10. Student grades

11. Student retention rate; by subject, program and year level

12. Student attrition rate; by subject, program and year level

13. Graduation rate

14. Graduate employment rate

15. Trend data of student completion by relevant characteristics, e.g. SES, entry scores, entry tests, equity group

16. Number and type of student appeals related to matters of assessment

17. Reports on assessment quality and tasks; reviewed by program/year and trends

18. External moderation of standards; peer report on moderation

19. Student attainment of graduate attributes

20. Student self-reported learning outcomes

21. Student self-reported attainment of graduate attributes

22. Student engagement and motivation for lifelong learning.

Teacher

1. Student background characteristics

2. Staff workload

3. Explicit learning outcome statements linked to assessment requirements

4. Staff qualifications/ experience

5. Clear communication with students on expectations, criteria and standards

6. Employ a variety of assessment methods and types of assessment, accounting for student diversity

7. Assessment tasks systematically assess the achievement of graduate attributes and goals of specific subjects.

8. Provision of specific, continuous and timely feedback to encourage student learning

9. Robust review of the design, delivery and administration and feedback component of assessment tasks and their integration into a whole of program approach.

10. Participation in moderation and review of subject standards (and course and discipline standards)

11. Engage in professional development on assessment, standards.

12. Student pass rates

13. Student grades

14. Student retention rate

15. Student attrition rate

16. Graduation rate

17. Graduate employment rate

18. Student attainment of stated learning outcomes

19. Student achievement against established standards of achievement

20. Student satisfaction with assessment tasks, requirements and feedback provided

21. Student self-reported attainment of learning outcomes.

Performance Indicators for Quality: Engagement and Learning Community

TQI project: http://www.altc.edu.au/teaching-quality-indicators

Engagement refers to commitment to and engagement with one’s own education. Learning communities can be considered to include: (a) the extent to which students and staff feel they belong to and are engaged as a community of learners/scholars; and (b) the distinct and varied learning communities within the institution and the disciplines in which students and staff engage.


Level | Input | Process | Output | Outcome

Institution

1. Allocated resources for students promoting engagement and its importance for quality learning

2. Facilities and resources to promote informal and formal engagement, e.g. Library, online resources, clubs, meeting spaces, information and communication tools for staff and students.

3. Active links established to support students’ engagement with external community and employers

4. Established expectations for all staff (academic, general, professional) to develop/contribute to student learning experiences and engagement

5. Provision of activities for students to support formal and informal learning community development across cohorts and years.

6. Established cycle of systematic reporting of engagement data and plans.

7. Provision of professional development programs that include university approach to engagement, understanding of what constitutes engagement and how it can be developed.

8. Policies and curriculum planning that promotes student engagement with their learning

9. Number of students utilising services, support with retention, break down by equity group

10. Student attrition reviewed by faculty, program, year level, campus; establish and monitor trends

11. Student progression rates to further study, higher degrees

12. Number of students who return for further study

13. Number of students who register and engage in alumni activities after completion

14. Number of students participating in organized student activities

15. Attendance numbers of staff at PD programs and workshops

16. Student satisfaction with teaching, services, broken down by faculty, department, program

17. Student engagement survey* results broken down by faculty, program, year level, campus; establish and monitor trends

18. Staff engagement** with their teaching and commitment to student achievement and learning

* e.g. CEQ, Learning community scale, AUSSE, university surveys

** e.g. staff engagement surveys using AUSSE, NSSE, work-life university surveys

Department

1. Established orientation and transition/induction programs for students and staff on support and opportunities available.

2. Established relationships with major employer groups, opportunities for work integrated learning experiences for students.

3. Established services and facilities that recognise different groups of students and their needs and interests

4. Established systems for identifying students at risk and engaging with them, early- monitoring, advising, learning support etc

5. Policies and expectations of student time spent engaged in learning

8. Student attrition reviewed by department and program of study

9. Student attendance at scheduled classes/on-line time on task monitored

10. Student participation rates in informal engagement activities

11. Student self reported time-on-task, time in learning (e.g. AUSSE)

12. Graduate destinations (GDS) data by program of study.

13. Employer satisfaction with graduates of programs of study

14. Student (and staff) responses to engagement surveys (CEQ, AUSSE).


6. Programs, policy and curriculum practices to support and enhance engagement evaluated, reviewed and revised.

7. Recognition of contributions made by staff who enhance student engagement and learning opportunities.

Teacher

1. Development of relevant and appropriate teaching skills and understanding related to engagement and learning community.

2. Development of career plan with supervisor, and activities and opportunities for experience and professional development

3. Review of practices that facilitate student engagement in their learning

4. Development of teaching strategies and practice to promote and support student engagement that recognise students from diverse backgrounds.

5. Advise students of expected hours of formal learning, on-line, staff contact group work, etc

6. Students advised of expected hours of formal learning, on-line, staff contact group work, etc

7. Integrated work relevant learning experiences where appropriate.

8. Student attendance, attrition, progression rates for subjects and programs taught by identified equity groups, students.

9. Student survey data of reported time spent in study, reviewed against expected hours of study

10. Documented evidence of promoting engagement in teaching portfolio

11. Documented evidence of contribution to development of student learning communities.

Appendix 2


Summary of the National TQI Dissemination Activities

The national team sought, and was invited, to present on the project at individual universities and at forums and conferences within the Australian higher education sector and internationally. In addition, the dissemination activities of the pilot universities are provided in each project report.

Senior academics and university staff in Australian universities

• The Project Leader was invited to discuss the project with members of the senior executives and staff at the following universities:

o Charles Darwin University (2007)
o Deakin University (2007)
o Edith Cowan University (2008)
o Griffith University (2007)
o James Cook University (2007)
o La Trobe University (2009)
o Monash University (2007)
o The University of Queensland (2007)
o University of New England (2008)
o University of Southern Queensland (2007)

Universities Australia, DVC/PVC(A) meetings

• The project was first proposed to this group for their support, which was sought and given at the November 2006 meeting.

• The Project Leaders reported to every meeting from Nov 2006 to Nov 2008.

• The study, Australian University Teaching Indicators in Use, was provided to each university with their summarised data in 13 detailed tables and an interim summary of each table in 2007. Wider distribution and sharing of the tables was discussed. An agreement was made to update the tables within each institution, with further discussion on the wider use of the tables to take place at the next meeting. The information was updated and reissued to each university in 2008.

• The study of Student Surveys was also supported by the DVC(A)s, who provided a contact name for the research team.

AUQF2008 and AUQF2009

AUQF 2008: Quality & Standards in Higher Education: Making a Difference, 9-11 July 2008, Canberra. http://www.auqa.edu.au/auqf/pastfora/2008/

• Topic: Teaching and learning indicators of quality and universities

Presenters: Professor Denise Chalmers, Director, Centre for Advancement of Teaching and Learning and National Project Leader, Teaching Quality Indicators Project, The University of Western Australia; Ms Kate Thomson, Macquarie University; Ms Lynda Davies, Griffith University; Dr Josephine Lang, RMIT University; and Dr Anne Gilmore, The University of Queensland

o The project team were invited to present a workshop at the AUQF conference held in Canberra in July 2008. This workshop included an overview of the TQI project and a summary of the pilot university projects. A more detailed exploration of the Assessment dimension of the framework was provided by the Project Officers from The University of Queensland, RMIT University, and Griffith University. Participants discussed the Assessment Indicators table and provided valuable feedback to the team on its usefulness. The presentation was well attended, with additional chairs required, and received positive feedback from the participants.

• Topic: Teaching Matters: Developing Indicators of Teaching Quality, a Comparative Study

Presenters: Ms Jacqueline Flowers, Project Officer (TQI), The University of Western Australia, and Ms Bronwyn Kosman, Project Manager (TQI), Macquarie University

o The project officers from The University of Western Australia and Macquarie University presented a paper that explored the Institutional Climate and Systems dimension of the framework, ‘Teaching Matters: Developing Indicators of Teaching Quality, a Comparative Study’. This session was also well attended and received positive feedback from the participants.

AUQF2009: Internal and External Quality Assurance: Tensions & Synergies, 1-3 July 2009, Alice Springs. http://www.auqa.edu.au/auqf/index.shtml

• Topic: Developing Indicators of Quality Teaching and Learning for use in Universities

Presenters: Denise Chalmers, The University of Western Australia, with Dr Anne Gilmore, The University of Queensland; Ms Bronwyn Kosman, Macquarie University; Ms Jacqueline Flowers, The University of Western Australia; and Ms Lynda Davies, Griffith University.

o The project team members proposed a workshop which was accepted and presented during AUQF 2009.

• Topic: Beyond the Numbers: Achieving Best Practice in Learning and Teaching Presenters: Jacqueline Flowers, UWA, and Bronwyn Kosman, Macquarie University

• Topic: Assessment Reform and the Quality Context: Tensions and Synergies Presenter: Lynda Davies, Griffith University

HERDSA 2007, 2009

• Quality teaching and learning in higher education: what is it and how do we know? Denise Chalmers and Judith Sachs, HERDSA presentation, Adelaide, 2007.

• Enhancing the student learning experience through the development of quality indicators of teaching and learning. Chalmers and Thomson, presentation, Charles Darwin University, Darwin, 2009.

Other Australian Forums and Conferences

• Invited speaker, Academic Development Forum, ALTC Project, Melbourne, October, 2007.

• Invited Speaker, Chinese President’s Leadership Development Program. University of Sydney, 2007 and 2008

• Invited Speaker, Council of Australian Directors of Academic Development (CADAD), 2007, Canberra

• Invited speaker, UniServe Conference, 2007, Sydney.

• Presentation, Quality indicators of teaching and learning that enhance the student learning experience, Denise Chalmers, Teaching and Learning Forum, Curtin University, January 2009.

INTERNATIONAL

Higher Education Academy (HEA), 2007-2008

• Professor Chalmers visited the Higher Education Academy for two days. During this time she met with several people on aspects of the TQI framework, in particular: a team of researchers investigating university staff perceptions and use of teaching indicators; Helen Thomas, Assistant Director (Networks), who has responsibility for the Professional Standards Framework; Elaine Payne, Assistant Director, HE, on the Academy’s approach to assessment; Eddie Gulc, Senior Advisor, HEA, on learning communities and internationalisation; Simon Ball, Senior Advisor on the JISC TechDis Service, on issues relating to supporting students and staff with disabilities; and David Sadler, Director (Networks), HEA. An invited presentation on the TQI project was made to Academy staff, Subject Centre directors, representatives from selected Centres of Excellence in Teaching and Learning (CETL), and two senior staff from QAA. September 2008.

• The project was featured in the Higher Education Academy Winter 2007 edition.

Conference Presentations

• Professor Chalmers presented a paper on the project, ‘Teaching and learning quality indicators in Australian universities’, at the biennial OECD/IMHE general conference, “Outcomes of higher education: Quality, relevance and impact”, Paris, September 2008.

• Professor Chalmers presented a paper on the project, ‘Teaching and learning quality indicators in Australian universities’, at the biennial International Consortium for Educational Development (ICED) conference, Barcelona, June 2010.

Other international

• Professor Sachs presented on the project, ‘Performance Cultures of Teaching’, at the University of Helsinki to members of the International Network of Research Intensive Universities in June 2007.

• Professor Sachs presented on the project, ‘Performance Cultures of Teaching’, to invited staff during a visit to the University of Hong Kong in June 2007.

• Professor Chalmers attended an invited retreat, funded by the Swedish Quality Agency and hosted by Lund University. While there she met with senior staff on the Teaching Quality Indicators project and provided copies of the reports. May 2008.

• Professor Chalmers met with the senior executive and presented on the project at the University of Gloucestershire – ’How can we recognise and reward quality teaching and learning?’ Following this visit, the university has expressed interest in the possibility of being involved as an international pilot university, should the project continue. September 2008.

• Professor Chalmers made a number of ad hoc presentations on the project to visiting delegations to the Carrick Institute e.g. Delegation from Sweden, visitors from Japan, visitors from UK, 2007-2008.

Refereed Publications (to date)

Chalmers, D. (2007). Teaching and learning indicators in Australia. Academy Exchange, Winter, 38-39. Higher Education Academy, UK. http://www.heacademy.ac.uk/resources/detail/resources/publications/exhange_winter2007

Chalmers, D. (2008). Teaching and learning quality indicators in Australian universities. IMHE/OECD conference: The quality, relevance and impact of higher education. Paris. http://www.oecd.org/site/0,3407,en_21571361_38973579_1_1_1_1_1,00.html

Chalmers, D. (2010). Progress and challenges facing the recognition and reward of scholarship of teaching in higher education. Higher Education Research and Development, 29 (Special Issue on Scholarship of Teaching and Learning).

Chalmers, D. (2010). The practice and use of student feedback in the Australian national and university context. In Patricie Mertova and Sid Nair (Eds), Student Feedback: The Cornerstone to an Effective Quality Assurance System in Higher Education. Cambridge, UK: Woodhead Publishing.

Devlin, M., Brockett, J., and Nichols, S. (2009). Focusing on university student engagement at the institutional level. Journal of Higher Education Policy and Management, 31(2), 109-119, and in the Campus News.

Sachs, J. (2007). The scholarship of teaching in a research intensive university: Some reflections and a few possibilities. In Brew, A., & Sachs, J. (eds). Transforming a university: the scholarship of teaching and learning practice. Chippendale, NSW: Sydney University Press.

Appendix 3


Pilot University Reports, Resources and publications

The following reports and documents are all available on the ALTC project website.

1. Deakin University Project Report
2. Griffith University Project Report
3. Macquarie University Project Report
4. RMIT University Project Report
5. The University of Queensland Project Report
6. The University of Western Australia Project Report
7. University of South Australia Project Report
8. University of Tasmania Project Report
9. Barrie, S., Ginns, P., & Symons, R. (2008). Student surveys on teaching and learning. Final Report.
10. Chalmers, D. (2007). A review of Australian and international performance indicators and measures of the quality of teaching and learning in higher education.
11. Chalmers, D. (2008a). Indicators of university teaching and learning quality.
12. Chalmers, D. & Thomson, K. (2008). Snapshot of Teaching and Learning Practice in Australian Higher Education Institutions.
13. Chalmers, D. (2008b). Defining teaching and learning indicators in universities.
14. Chalmers, D., Lee, K. & Walker, B. (2008). International and national indicators and outcomes of quality teaching and learning currently in use.
15. Webb, C. (2009). Evaluation of the Teaching Quality Indicators Project.