
Republic of the Philippines
Department of Education

VOLUME 2: TRAINING & DEVELOPMENT NEEDS ASSESSMENT (TDNA) SYSTEM

Training and Development System
OPERATIONS MANUAL

A FIVE-VOLUME COMPILATION OF THE STANDARDS, PROCESSES AND TOOLS FOR IMPLEMENTING THE TRAINING AND DEVELOPMENT SYSTEM IN THE DEPARTMENT OF EDUCATION

June 2010


TABLE OF CONTENTS


Section 1.0: Overview of the Training and Development (T&D) System Framework

1.1. Rationale
1.2. Vision
1.3. Goal
1.4. Objectives
1.5. Standards and Guiding Principles
1.6. The T&D System and its Major Components


VOLUME 2: TRAINING & DEVELOPMENT NEEDS ASSESSMENT (TDNA) SYSTEM

The Training and Development (T&D) System Operations Manual, in five volumes, was developed and validated in Regions VI, VII and VIII, Divisions of Negros Occidental, Bohol/Tagbilaran and Northern Samar, through the AusAID-funded project STRIVE (Strengthening the Implementation of Basic Education in Selected Provinces of the Visayas), in coordination with the EDPITAF (Educational Development Project Implementing Task Force), and in consultation with the Teacher Education Development Program-Technical Working Group (TEDP-TWG) and the National Educators Academy of the Philippines (NEAP). The five volumes of the T&D System Operations Manual are:

Volume 1 – The Training and Development System Framework
Volume 2 – The Training & Development Needs Assessment (TDNA) System
Volume 3 – The Professional Development Planning (PDP) System
Volume 4 – The Program Designing and Resource Development (PDRD) System
Volume 5 – The Program Delivery (PDy) System


Section 2.0: The Training and Development Needs Assessment System

2.1. Training and Development Needs Assessment
2.2. The TDNA Systems Framework
2.3. The TDNA Tools
2.4. Standards for TDNA
2.5. Structure, Functions, Roles and Responsibilities

Section 3: General Description of the NCBTS-TSNA

3.1. The NCBTS Framework
3.2. The NCBTS-TSNA System Design
3.3. Development and Validation of the NCBTS-TSNA Tool
3.4. The NCBTS-TSNA Guide and Tools
3.5. The NCBTS-TSNA Orientation Package
3.6. Administration, Scoring and Interpreting the NCBTS-TSNA Tool
3.7. Organizational Structure and Process Flow
3.8. Monitoring and Evaluation of the NCBTS-TSNA
3.9. Management of the TDNA Consolidation Database

Section 4: General Description of the Training and Development Needs Assessment for School Heads

4.1. Basis and Purpose of the TDNASH
4.2. Organizational Structure and Process Flow
4.3. Competencies Assessed by the TDNASH
4.4. The TDNASH Guide and Tools
4.5. System Design and Assessment Methodology
4.6. Administration of the TDNASH
4.7. Analysis of Results
4.8. Reporting Cluster Results
4.9. Interpretation of Results
4.10. TDNA Consolidation Database for TDNASH
4.11. Monitoring and Evaluation of the Process
4.12. Refinement of the TDNASH Tool

Section 5: General Description of the Organizational TDNA for the Region and Division

5.1. The Basis of the Organizational TDNA
5.2. The Framework and Management Competencies of the Organizational TDNA
5.3. Development and Validation of the Organizational TDNA Tool
5.4. Methodology
5.5. Monitoring and Evaluation of the Organizational TDNA

Section 6.0: Monitoring and Evaluation of the TDNA System

Section 7.0: The TDNA System Guides and Tools

7.1. The NCBTS-TSNA Guide and Tools

Introductory Information
Establishment of Regional and Division TDNA Working Groups
Orientation of Supervisors, School Heads and NCBTS Coordinators
School/Cluster NCBTS-TSNA Implementation
Utilization of NCBTS-TSNA Results
Monitoring and Evaluation of the NCBTS-TSNA
Attachment 1: NCBTS-TSNA Tools and Templates
Attachment 2: M&E Tools for the NCBTS-TSNA

7.2. The TDNA for School Heads (TDNASH) Guide and Tools

Introductory Information
Assessment Approach and Methodology
Analysis of TDNASH Results
TDNA Consolidation Database for TDNASH
Monitoring and Evaluation of the TDNASH
Attachment 1: TDNASH Tools and Templates
Attachment 2: M&E Tools for the TDNASH
Attachment 3: The National Competency-Based Standards for School Heads

7.3. The Organizational TDNA for the Region Guide and Tools
Introductory Information
General Introductions for the TDNA Working Group
Monitoring and Evaluation of the Organizational TDNA
Attachment 1: Region Organizational TDNA Tools and Templates
Attachment 2: Region Organizational TDNA Monitoring and Evaluation Tools

7.4. The Organizational TDNA for the Division Guide and Tools
Introductory Information
General Introductions for the TDNA Working Group
Monitoring and Evaluation of the Organizational TDNA
Attachment 1: Division Organizational TDNA Tools and Templates
Attachment 2: Division Organizational TDNA Monitoring and Evaluation Tools


GLOSSARY OF ACRONYMS

AIP Annual Implementation Plan

BEAM Basic Education Assistance for Mindanao

BESRA Basic Education Sector Reform Agenda

CBTS Competency Based Teacher Standards

CO Central Office

COT Center of Training

DEDP Division Education Development Plan

DepED Department of Education

DO Division Office

ELMP Education Leadership and Management Program

EDPITAF Educational Development Project Implementing Task Force

EBEIS Enhanced Basic Education Information System

ES Education Supervisor

F3 Formal Face-to-Face

FGD Focus Group Discussion

GCA Group Consensual Assessment

HRD-SDD Human Resource Development – Staff Development Division

HRM Human Resources Management

HRTD Human Resource Training and Development

ICT Information Communication Technology

ICT4E Information Communication Technology for Education

INSET In-Service Education and Training

IRR Implementing Rules and Regulations of RA 9155, December 2007

IPPD Individual Plan for Professional Development

JEL Job-embedded Learning

KRT Key Results Thrust

KSA Knowledge, Skills and Attitudes

LAC Learning Action Cells

LEAP Learning Enhancement Action Program

LGU or LGA Local Government Unit or Local Government Authority

LOC Level of Competency

LOI Level of Importance

LRMDS Learning Resource Management and Development System

MOOE Maintenance and Other Operating Expenses

MPPD Master Plan for Professional Development

M&E Monitoring and Evaluation

NCBS-SH National Competency Based Standards for School Heads

NCBTS National Competency-Based Teacher Standards

NEAP National Educators Academy of the Philippines

NGO Non-Government Organization

NSHPI National School Heads Performance Indicators

OPS Office of Planning Service

PCR Program Completion Report

PDM Professional Development Materials


Section 1.0. Overview of the Training and Development (T&D) System Framework (Volume 1 contains the details of the Framework)

1.1. Rationale

The Department of Education (DepED) is presently implementing fundamental reforms that include efforts for human resource development at all levels to support the quality performance of schools and learners. The Department’s package of policy reforms, known as the Basic Education Sector Reform Agenda (BESRA), seeks “to systematically improve critical regulatory, institutional, structural, financial, cultural, physical and informational conditions affecting basic education provision, access and delivery on the ground. These policy reforms are expected to create critical changes necessary to further accelerate, broaden, deepen and sustain the improved education effort already being started” (BESRA PIP, 2006).

BESRA’s vision for human resource development propels a unified system that provides for the continuing quality professional development of in-service education personnel at all levels of the educational system. Two of the major policy reforms under BESRA serve as the core of this initiative for training and development. The first is the National School-Based Management Framework and Standards, which decentralizes decision-making authority to individual schools, allowing various stakeholders to plan and implement goals to improve school performance and student achievement. The second is the Teacher Education and Development Program (TEDP), which saw the establishment of the National Competency-Based Teacher Standards (NCBTS). This is a framework that contains “a set of competency standards for teacher performance so that teachers, pupils and parents are able to appreciate the complex set of behaviors, attitudes and skills that each teacher must possess in order to carry out a satisfactory performance of their roles and responsibilities” (TEDP Final Report, 2006). Necessarily, training and development for teachers and school heads, for instance, must be based on accepted standards of the profession such as the National Competency-Based Teacher Standards (NCBTS) and the National Competency-Based Standards for School Heads (NCBS-SH).

Training and Development (T&D) is defined for the purpose of this framework as the process of providing professional development for the personnel of the Department of Education. The process is aimed at improving competencies and work performance through the provision of a wide variety of opportunities for individual growth in knowledge, attitudes, and skills. It is a personal and professional growth process, which necessarily integrates the goals of the individual professional with the development goals of the school, division and region for better student outcomes. The ultimate beneficiaries of T&D are the learners whose rights to quality education shall be the system’s foremost consideration.

Professional development activities range from independent study, such as personal or structured professional reading; to supported learning, such as mentoring and coaching; to collective action, such as involvement in a professional organization or group research; and to formal programs, such as on-site face-to-face training, distance or online course study, and continuing formal education.

Training and Development in the education system is most successful in a learning community, which promotes the goals of school-based management with strong leadership and support systems. It is most likely to succeed when it is embedded in the vision, strategic plan and organizational structure of the school, division and region. Moreover, it must be conducted through a functional and integrated system guided by sets of standards, structures, processes, methodologies and tools for effective outcomes.


1.2. Vision

Professional Development for all, within a culture of collaborative and continuous learning, nurtured by transformative leadership, for the achievement of educational goals

1.3. Goal

The T&D System’s goal is to establish a transforming and integrated set of operations that includes standards, structures, processes, and tools for the provision of quality professional development for educational leaders, teachers and non-teaching personnel that is functional at the regional, division and school levels.

1.4. Objectives

The objectives of the T&D System are to:

1. identify priority development and learning needs of the various human resources through a systematic process of competency-based needs assessment for professional development/training;
2. develop needs-based Master Plans, training designs and resource packages for identified priority needs to support continuing professional development;
3. conduct priority programs, including post-training activities, for the professional development of educational leaders, school heads, teachers and non-teaching staff; and
4. provide the Information Communication Technology (ICT) and Monitoring and Evaluation (M&E) support operations through the T&D Information System (TDIS) for the T&D System at the central, regional, division and school levels.

1.5. Standards and Guiding Principles

To support the effective operations of a transformative and integrated T&D system, general standards and guiding principles are set:

Equity and Access
- All educational personnel, regardless of age, gender, creed, position, and physical abilities, have equal access to professional development.
- Effective strategies are utilized to increase the participation and involvement of education personnel in professional learning.
- Professional development endeavours, individual or collective, result in empowerment and improved well-being across diverse groups of clientele.

Sustained Culture of a Learning Organization
- Involvement and support are maximized if both internal and external stakeholders have shared aspirations, jointly make decisions and continuously support professional learning.
- Each member of the learning community possesses a deep sense of individual accountability for improving self and regards professional development as a way of life.

Effective and Efficient Use of Resources
- Efficiency and effectiveness of the system are ensured through the proper utilization of resources such as financial, physical, capital and human resources.


Collaboration
- Collaboration is a built-in value, with opportunities provided for educators to work together on a regular basis.
- Increased student learning as the focus of collaboration facilitates the attainment of professional development goals.
- Collaborative mechanisms engage joint efforts with training and development institutions and other educational partners for advancement programs.

Continuing and Cyclical Process
- Professional development, to be effective, is provided with sufficient ongoing follow-up and technical assistance.
- The entire cyclical process of professional development is informed by data and research findings that incorporate innovations and new knowledge.

Sustained by Transformative Leadership
- Professional development is nurtured by transformative leaders who are competent and skillful, open to change and results-oriented, and who have a deep sense of integrity and accountability.

Integrative of Individual and Institutional Development Goals Directed to Better Learner Outcomes
- Decisions are driven not only by individual professional aspirations but also by the development priorities of the school, division and region.
- Professional development is always directed to learners’ quality education and welfare, bearing in mind the promotion of a healthy and protective learning environment as well as the fostering of equality, respect for human rights, and participation of children.

Quality Training Content and Strategies
- The quality of training and learning depends largely on the relevance of the training and learning content and methodologies to the intended professional development goals.
- The utilization of research-based content and strategies ensures the effectiveness of training in improving targeted competencies.

TDNA-Based
- Professional development programs must be based on the development needs of the clientele, identified through a systematic process and based on the competency standards set for the profession.

ICT-Enabled
- An information management system is integral to the efficient delivery of a quality professional development program.
- The information management system and M&E mechanism that provide disaggregated data (e.g. by sex) on its clientele are essential inputs for integrating needs and experiences in the planning and development of the systems.

Quality Assured
- An effective T&D system has direct connectivity to the SBM’s active quality assurance (QA) and effective monitoring and evaluation systems to ensure that priority learning needs inform planning and that DepED personnel in the field apply the gains and benefits from the training.

Integrated and Unified
- T&D operates as a unified system that integrates professional development efforts at the central, regional, division and school levels.


1.6. The Training and Development (T&D) System and its Major Components

The T&D System, as presented in the functional diagram below, is an integrated system for the provision of continuing quality professional development for in-service educational personnel. It operates as a unified system at the central, regional, division and school levels. It is envisioned that the T&D System will engage teachers, school heads, educational leaders and non-teaching personnel in the continuous conduct and progressive provision of training and development programs through various modalities. It defines the inter-relationships of the different aspects of human resource development, from needs assessment, through program planning, designing and resource development, to the delivery of in-service professional development programs at the regional, division (including district or cluster), and school levels. In effect, the T&D System is a support mechanism for the central, regional, division and school demand for quality capability building to ensure best practice and outcomes in the workplace.

The T&D System is composed of four major interrelated subsystems namely: the TDNA System, the Professional Development Planning (PDP) System, Program Designing and Resource Development System (PDRD), and the Program Delivery (PDy) System. The Program Delivery (PDy) System is the main intervention that directly effects change in the knowledge, skills and attitudes (KSAs) of the education personnel. The PDP system is responsible for the completion of the Individual Plan for Professional Development (IPPD), School Plan for Professional Development (SPPD) as well as the Master Plans for Professional Development (MPPD) of the Region and the Division. The PDRD System generates appropriate T&D program designs and resource packages that would address the priority needs of the target personnel.

The Training and Development Needs Assessment (TDNA) System is significant because it informs program planning, designing and resource development. It establishes a match between the trainees’ needs and the training programs to be conducted. The TDNA instruments and processes of the system are guided by the current national standards and lists of competencies for the various educational personnel, such as the NCBTS, which articulates the essential parameters that characterize effective teaching, and the NCBS-SH, which identifies the expected competencies of an effective school leader. The system is supported by a TDNA Database that serves as the repository of the results of needs assessments done at the school, district/division and regional levels. It analyses uploaded data and generates reports.

The T&D System has its Monitoring and Evaluation (M&E) mechanism embedded across the four subsystems. The internal M&E mechanism of the TDNA, PDP, PDRD and the PDy Systems includes specific processes and tools that support the overall goal and objectives of the entire system. The Quality Assurance scheme sees to it that the outputs at the different levels are achieved based on set standards and specifications. Moreover, the M&E results provide information on the strengths and/or weaknesses of the Training & Development System itself and of the different sub-systems to support sustainability and improvement.

The T&D System is ICT-enabled through the T&D Information System (TDIS) and can be accessed through the EBEIS (Enhanced Basic Education Information System) at http://beis.deped.gov.ph/. The TDIS is one of the components of the Unified Integration System (UIS), which includes the enhanced BEIS and the LRMDS, among others. The TDIS is incorporated as a module of the enhanced BEIS, taking advantage of the BEIS personnel data represented in the HR module of the BEIS. The TDIS is responsible for collecting and processing data as well as creating the databases required for the analysis of results produced by the systems. It is also responsible for storing data obtained from the Training and Development Needs Assessment (TDNA), as well as for the consolidation and analysis of TDNA results based on identified variables.

The TDIS maintains and generates reports on training programs delivered, program management, trainers, training personnel, training records, and training evaluation. It also provides access to T&D documents and materials such as professional development plans, program designs and resource packages. Professional development materials (PDMs) developed through the PDRD System are also uploaded to and accessed through the Learning Resource Management and Development System (LRMDS) Portal.

Section 2: The Training and Development Needs Assessment (TDNA) System

2.1. Training and Development Needs Assessment

Training and Development Needs Assessment (TDNA) is the process of identifying the professional development needs of an individual or an organization through determining the gaps between an established set of standard competencies and the competencies presently possessed by the target personnel. The process involves job analysis, personnel analysis and gap analysis.

The provision of quality professional development programs is largely dependent on the alignment of the programs to the needs of the programs’ clientele. Results of the TDNA inform program planning and designing in order to establish a match between the professional development needs and the training programs to be conducted.

2.2. The TDNA Systems Framework

The TDNA involves three essential stages of training and development needs analysis. Phase I (Job Analysis for Effective Performance) is done by analyzing nationally set standards for the desired performance of personnel in their job or professional practice. In the case of teachers, for instance, a set of standards that defines effective teaching was established and validated by the Central Office and is known as the National Competency-Based Teacher Standards (NCBTS).

Phase II (Individual Training Needs Analysis) is the instrumentation to determine the current competency levels of the personnel in knowledge, skills and attitudes (KSA) terms. Objective instruments and methodologies are needed to assess the current KSAs of the target personnel vis-à-vis the desired competencies set by the national standards. Phase III (Strength and Needs Analysis) is the analysis of the discrepancies between the standards set and the current individual competencies. Wide gaps indicate needs that may be prioritized for development, while minimal or absent gaps may indicate strengths in particular competencies.
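To make the Phase III gap analysis concrete, the sketch below classifies competencies as strengths or needs from the difference between a required level and a self-assessed current level. The 1-4 rating scale, the cut-offs and the sample competency names are illustrative assumptions only; the actual TDNA tools define their own items, scales and interpretation rules.

```python
# Illustrative sketch of the Phase III strength-needs analysis described above.
# The competency names, the 1-4 rating scale and the cut-offs are assumptions
# made for this example; the actual TDNA tools define their own items and scales.

REQUIRED_LEVEL = 4  # assumed standard ("what should be") on a 1-4 mastery scale


def classify(current_level: int, required_level: int = REQUIRED_LEVEL) -> str:
    """Classify one competency from the gap between the standard and the current level."""
    gap = required_level - current_level
    if gap <= 0:
        return "strength"        # current level meets or exceeds the standard
    if gap == 1:
        return "minor need"
    return "priority need"       # wide gap: candidate for professional development planning


# Hypothetical self-assessment ratings for a single respondent
current_levels = {
    "Creates an environment that promotes fairness": 4,
    "Demonstrates skills in the use of ICT in teaching and learning": 2,
    "Communicates clear learning goals for the lessons": 3,
}

for competency, level in current_levels.items():
    print(f"{competency}: {classify(level)}")
```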

An important aspect of the TDNA process is the utilization of its results as input in the preparation of the Individual Plan for Professional Development (IPPD) and in designing T&D programs and activities at the school, division and regional levels. While the consolidated TDNA results at the school level inform the development of the School Plan for Professional Development (SPPD), the consolidated TDNA results at the Division and Regional levels inform the development of the Master Plans for Professional Development (MPPDs).

The framework for the TDNA is illustrated here:

[Figure: The TDNA Framework. Phase I (Job Analysis for Effective Performance) establishes the KSA required and the competency standards; Phase II (Individual Training Needs Analysis) covers instrumentation and data gathering on the current KSA and competency; Phase III (Strength-Needs Analysis) is the competency assessment that identifies competency strengths and learning needs, which feed into the SIP, DEDP and REDP.]

2.3 The TDNA Tools


2.3.1 The TDNA for Teachers

The TDNA for teachers was developed and validated for national implementation and is referred to as the NCBTS-Teachers’ Strengths and Needs Assessment (NCBTS-TSNA). The development of the Tool was based on the NCBTS, the national framework that establishes the competency standards for teacher performance in terms of knowledge, attitudes and skills that each teacher must possess. These competencies are identified to be necessary for the teachers to carry out a satisfactory performance of their roles and responsibilities.

The NCBTS-TSNA Guide and Tools (Section 7.1) was developed, validated and first implemented in the Visayas. The Guide serves as a resource to support the implementation of the NCBTS-TSNA at the school level. It aims to provide regional, division and school implementers and teacher beneficiaries with a deep understanding of the NCBTS and its relationship to in-service teachers’ professional development. It introduces a TSNA tool that is NCBTS-based, with consideration of the tool’s proper administration with teachers and how results can be utilized. The Guide includes the M&E scheme and corresponding M&E tools. An accompanying NCBTS-TSNA Orientation Package, containing a set of Structured Learning Episodes (SLEs) and resource materials to support the efficient and effective delivery of an NCBTS orientation program, has also been developed and can be accessed through the Training and Development Information System (TDIS), which is an element of the Enhanced Basic Education Information System (EBEIS) at http://beis.deped.gov.ph/, or through the Learning Resource Management and Development System (LRMDS) Portal at http://lrmds.deped.gov.ph/.

2.3.2. The TDNA for School Heads

The TDNA for School Heads is referred to as the Training & Development Needs Assessment for School Heads (TDNASH). The process intends to objectively determine the training and development needs of school heads in order to support improved educational leadership. The competencies identified in the TDNASH are based on the standards implied in RA 9155 and are adapted from the National Educators Academy of the Philippines (NEAP) School Leadership Experience Portfolio (SLEP). The list of competencies is referred to as the National School Heads Performance Indicators (NSHPI). The competencies have been classified into seven domains: school leadership; instructional leadership; creating a student-centered learning climate; professional development and human resource management; parental involvement and community partnership; school management and daily operations; and personal integrity and interpersonal effectiveness.

The TDNASH Guide and Tools (Section 7.2) was developed and validated with School Heads in Regions VI, VII and VIII. It contains a guide to support the implementation of the TDNASH, the tools for the triangulation technique, and the instruments for its M&E. The TDNASH intends to: (a) determine the current level of school heads’ competency, (b) ascertain the level of importance of each competency to the job, and (c) identify the priority training needs of the SHs vis-à-vis the seven domains for school leadership and management.

Since the development of the TDNASH Guide and Tools, Central DepED has reviewed the NSHPI and validated the new set of national competency-based standards for school heads, i.e. the NCBS-SH (DepEd Order No. 32, s. 2010, dated March 12, 2010). While these are very similar to the NSHPI, there are some variations. It is expected that DepED will use the NCBS-SH as the basis for the development of a revised TDNA tool for School Heads. In the meantime, the TDNASH provides a means by which the training and development needs of school heads can be identified. A copy of the NCBS-SH can be found in Section 7.2, Attachment 3.


2.3.3 The Organizational TDNA for the Region and Division Educational Leaders

The Training and Development Needs Assessment (TDNA) of the Region and the Division is designed to identify the organization’s current training and development needs vis-à-vis the desired organizational roles and responsibilities as stipulated in the Governance of Basic Education Act of 2001 (R.A. 9155). The ‘Management Competencies per Service Areas’ established for the Decentralized Management Training Program of the Secondary Education Development Improvement Project (SEDIP), a DepED project implemented by the Bureau of Secondary Education, served as the basis for the development of the TDNA tool.

The TDNA process includes a “self-assessment” done by a group of educational leaders, managers and staff from the different sections of the Office through a Focus Group Discussion (FGD) technique. The respondents from each section arrive at a consensual description of the organization vis-à-vis the management competencies. The TDNA is completed with an external assessment. In the case of the Division, the Regional Office assesses the management competencies of the Division, while the Division assesses the Region’s management competencies. Hence, each organization’s TDNA result is the consolidation of the “self-assessment” and the external assessment.
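As a simple illustration of this consolidation step, the sketch below averages the consensual self-assessment rating and the external assessment rating for each management competency. The competency names, the rating scale and the use of a simple average are assumptions made for illustration; the Organizational TDNA Guide and Tools (Sections 7.3 and 7.4) prescribe the actual procedure.

```python
# Minimal sketch of consolidating an organizational "self-assessment" with the
# external assessment described above. Averaging the two ratings per management
# competency is an assumption made for illustration; the Organizational TDNA
# Guide and Tools (Sections 7.3 and 7.4) prescribe the actual procedure.

def consolidate(self_assessment: dict, external_assessment: dict) -> dict:
    """Return one consolidated rating per management competency."""
    return {
        competency: (self_assessment[competency] + external_assessment[competency]) / 2
        for competency in self_assessment
    }


# Hypothetical ratings on an assumed 1-4 scale
division_self = {"Planning": 3, "Resource Management": 2, "Monitoring and Evaluation": 4}
regional_external = {"Planning": 2, "Resource Management": 2, "Monitoring and Evaluation": 3}

print(consolidate(division_self, regional_external))
# {'Planning': 2.5, 'Resource Management': 2.0, 'Monitoring and Evaluation': 3.5}
```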

Specific steps to support the administration and consolidation of results are detailed in the Organizational TDNA for Regions Guide and Tools (Section 7.3) and in the Organizational TDNA for Divisions Guide and Tools (Section 7.4). The initial versions of the Organizational TDNA tool and the processes followed were first developed and used in Regions VII and VIII and in the Divisions of Northern Samar and Bohol. These have been further refined and implemented in Region VI and in the Division of Negros Occidental.

2.4 Standards for TDNA

To support the effective delivery of a transformative and integrated TDNA system, general standards are set as follows:

2.4.1 Competency-Based
There should be a comprehensive TDNA based on the nationally accepted standards set for the target clientele (e.g. the NCBTS for teachers). A TDNA is used by the system to diagnose whether clientele need training and to identify the nature and content of that training.

2.4.2 Efficient
For each type of TDNA, a Guide and Tools are provided that stipulate the process flow, the roles and responsibilities of key implementers, the administration of the tool, the scoring and retrieval of data, the interpretation of results, and the monitoring and evaluation mechanism. An ICT-enabled version of the NCBTS-TSNA tool and a Consolidation Database are also provided to further increase the efficiency of data retrieval and utilization.

Integrative and Unified
The system has direct connectivity to the SBM and other systems that integrate professional development efforts at all levels. The TDNA operates as a unified system that integrates professional development efforts at the central, regional, division and school levels.


2.4.3 Collaborative
Consensual mechanisms are considered in the development of the TDNA system. The system promotes collaborative methodologies and is open to joint efforts with training and development institutions and other educational partners for advancement programs.

2.4.4 Quality Assured
An M&E mechanism is embedded in the TDNA System to ensure that priority learning needs are identified. Results inform the educational planners and implementers in the conduct of activities to address identified gaps.

2.4.5 Validity and Reliability
Assessment tools and systems are tried, tested, and acceptable according to nationally based principles and standards.

2.5 Structure, Functions, Roles and Responsibilities

2.5.1 Regional Level

Structure
The TDNA is one of the responsibilities of the T&D Regional Chief. The T&D Chief is full time and works closely with the office support staff and the TDNA Working Group (WG). The members of the WG are representatives from Elementary, Secondary, ALS, and Administration/Budget and Finance who are officially designated, within a given term, as members for the implementation of the TDNA system operations.

Mandated Functions
The TDNA Working Group aligns its roles and responsibilities with the functions of the Regional Office in “setting standards in learning outcomes and technical assistance in the form of training, performance evaluation, accountability processes, decentralization of functions, and budget in terms of localization, and integration of plans and best practices” (BESRA, 2006). It also plans and manages the effective and efficient use of the personnel, physical and fiscal resources of the region, including the professional staff development of the region (RA 9155 Rule 6: 6.1.26).

Roles and Responsibilities
The region establishes direction for the TDNA administration based on policy review and implementation, the setting of regional standards, and the conduct of the TDNA. It provides technical assistance to its divisions, monitoring and evaluating the implementation of the TDNASH and NCBTS-TSNA processes. The region also conducts the Organizational TDNA for the region, validates the Organizational TDNA of the Divisions, and consolidates, reviews and analyzes the results of the region and division Organizational TDNA to identify priority training needs.

2.5.2 Division/District Level

Structure
The TDNA is part of the responsibilities of the T&D Division Chair. The T&D Chair is full time and works closely with the office support staff and the TDNA Working Group, whose members are representatives from the Education Supervisors of Elementary, Secondary, ALS, and Administration/Budget and Finance, and the Public Schools District Supervisors (PSDS), who are officially designated as members for the implementation of the TDNA system operations.

Mandated Functions


The Division plans and manages the effective and efficient use of all personnel, physical and fiscal resources of the division, including the professional development of staff (RA 9155 Rule 7: 7.1.10). It prepares Division-specific plans for training school heads and other school-level stakeholders on SBM (BESRA) and provides opportunities for broad-based capacity building for leadership to support SBM (RA 9155 Rule 9: 9.1.24). It provides appropriate organizational support and authority to enhance their capability and competency to assist teachers in carrying out their mandated roles and responsibilities (BESRA KRT 1), and provides professional and instructional advice and support to the SHs and teachers/facilitators (RA 9155 Rule 8: 8.1.2). It also provides technical assistance in the form of training programs for school heads and teachers (BESRA KRT 1).

Roles and Responsibilities
The Division manages the administration of the NCBTS-TSNA to teachers. It reviews and consolidates the results to identify priority training needs for teachers across the district and division for incorporation into the Division Master Plan for Professional Development (MPPD).

The Division administers the TDNASH to School Heads, and reviews and consolidates the results to identify individual training needs to be used as a basis for developing the SHs’ IPPDs and the Division MPPD.

The Division conducts the administration of the Organizational TDNA of the division, reviews and consolidates the results to identify priority training needs of division personnel, and validates the Organizational TDNA of the Region.

2.5.3 School Level

Structure
The school TDNA team is part of the School T&D Team, chaired by the School Head and assisted by the NCBTS Coordinator, with membership from the teaching and non-teaching personnel.

Mandated Functions
Initiate and sustain the regular practice of teachers using the NCBTS as a guide for their personal self-appraisal as an integral part of preparing the SIP (BESRA PIP). Allocate and utilize funds at the school level to support teacher development needs identified in the SIP in accordance with SBM practice (BESRA PIP).

Roles and Responsibilities
In the administration of the NCBTS-TSNA, school heads, assisted by the NCBTS Coordinators, orient teachers on the NCBTS, consolidate TSNA results and prepare teachers for the development of their IPPDs and the School Plan for Professional Development (SPPD).


Section 3: The NCBTS-TSNA


3.1. The NCBTS Framework

The National Competency-Based Teacher Standards (NCBTS) is an integrated theoretical framework that defines the different dimensions of effective teaching, where effective teaching means being able to help all types of students achieve the learning goals outlined in the curriculum. Teachers can use the NCBTS to determine whether their actions and strategies as teachers are effective in helping their students learn the desired curriculum objectives. Thus, the NCBTS can be used as the basis for a self-assessment tool.

This self-assessment can help teachers plan their professional development in the short term and in the long term. For example, using the NCBTS, teachers can become aware of their strengths, ensure that they demonstrate these strengths more consistently, and act as mentors to others in these areas. At the same time, they can plan their professional development strategies so that they can improve on their weaknesses.

Competency-based means that the standards or criteria for characterizing good teaching are defined in terms of what the teacher is competent to do. In the NCBTS, good teaching is defined in terms of those practices that help students learn better. Thus, the competencies in the NCBTS were derived from educational theories and empirical research on the characteristics of learning environments and teaching practices that lead to effective student learning, and also from documented successful practices and programs of schools, divisions, regions and educational reform projects in different parts of the country.

The competency-based teacher standards are organized hierarchically. The “basic” level categories of the standards are seven domains. A domain is defined as a distinctive sphere of the teaching-learning process and a well-defined arena for demonstrating positive teacher practices. Each domain is defined in terms of a principle of ideal teaching associated with enhanced student learning.

Under each domain, there are strands. Strands refer to more specific dimensions of positive teacher practices under the broad conceptual domain. At the lowest level of the hierarchical organization, under the strands, specific indicators are defined. These indicators are concrete, observable, and measurable teacher behaviors, habits, actions, routines, and practices known to create, facilitate, and support enhanced student learning.

The seven domains are:

Domain 1: Social Regard for Learning
Domain 2: Learning Environment
Domain 3: Diversity of Learners
Domain 4: Curriculum
Domain 5: Planning, Assessing, and Reporting
Domain 6: Community Linkages
Domain 7: Personal Growth and Professional Development

To understand how the seven domains comprise an integrated whole, it helps to see the seven domains as falling under two broad categories. In the diagram below, the middle domains, Domains 2, 3, 4, 5, and 6 [the shaded areas] represent standards referring to “The Teacher as a Facilitator of Learning,” whereas the two outer domains, Domains 1 and 7 [the unshaded areas] represent standards referring to “ The Teacher as a Learner.”

[Figure: The NCBTS Framework diagram, showing the seven domains as nested areas: Domain 1 (Social Regard for Learning) and Domain 7 (Personal Growth & Professional Development) as the unshaded outer areas; Domain 2 (The Learning Environment) and Domain 6 (Community Linkages) as the lighter shaded areas; and Domain 3 (The Diversity of Learners), Domain 4 (Curriculum) and Domain 5 (Planning, Assessing & Reporting) as the darker shaded innermost areas.]

The middle domains can further be divided into two sub-categories. The innermost domains, Domains 3, 4, and 5 [the darker shaded areas], represent the specific teacher practices related to the technical aspects of the teaching-learning process, whereas the other domains, Domains 2 and 6 [the lighter shaded areas], represent the specific teacher practices that embed the learning process in an appropriate context.

The integration of the seven domains may be seen from the inside going out. At the center of the series of domains [the dark shaded areas] are the technical aspects of the teaching-learning process. The domains of The Diversity of Learners (3), Curriculum (4), and Planning, Assessing, and Reporting (5) refer to what may be called good teaching strategies, and are very closely related to each other. These domains express the new paradigm in teaching.

3.2. The NCBTS-TSNA System Design

The NCBTS-TSNA adopts the TDNA System Framework. The process determines the differences between the actual situation (what is) and the desired condition (what should be) in terms of teachers’ professional competencies. In the NCBTS-TSNA, the actual situation is described by the current competencies as perceived by the teacher. The profile of the teacher’s current competencies is compared to the NCBTS standards for effective teaching. The NCBTS-TSNA therefore identifies both competency strengths and needs by determining the difference between the expected and the current competencies of an individual or a group of teachers. These competencies are translated into Knowledge, Skills, and Attitudes (KSAs) that define the domains, strands and performance indicators of the NCBTS.

As shown in the TDNA Framework, the NCBTS-TSNA involves three essential stages of strengths and needs analysis. Phase I (Job Analysis for Effective Performance) is done by analyzing nationally set teacher standards in behavioral terms, or by identifying effective teaching competencies; the DepED Central Office and Regional Offices are tasked to do this phase of the TSNA process. Phase II (Individual Training Needs Analysis) is the instrumentation to determine the current teacher competency levels in KSA terms, which is done by the individual teacher at the school level. Phase III (Strengths-Needs Analysis) is the analysis of the discrepancies between the standards set and the teachers’ current data on their competencies. Minimal discrepancies indicate strengths, while wide discrepancies indicate learning needs. The consolidation of results is carried out at the school, cluster, district, division and regional levels for their respective purposes related to identifying teacher training and development needs.

The system design below charts the flow of the TSNA process, from the self-assessment of the individual teacher using the NCBTS-TSNA Tool to the consolidation of TSNA results at the school level by the School Head and the NCBTS Coordinator.

Each teacher accomplishes the NCBTS-TSNA tool using either the hard copy or the electronic version. Both versions end with the individual teacher’s summary of results, which indicates his/her strong and weak competencies across the seven domains and 23 strands. Each teacher’s results are used in the accomplishment of the teacher’s Individual Plan for Professional Development (IPPD). The teachers’ TSNA results are consolidated at the school, district, division and regional levels. The generated reports provide information for the completion of the SPPD for each school and the MPPDs for the Division and Region. These professional development plans are completed following the processes in the PDP System.

The system is ICT-enabled for tools that are accomplished through the electronic version of the NCBTS-TSNA. The TDNA database accommodates uploaded TSNA data and generates analysis reports on the NCBTS-TSNA of teachers at the school, district, division and regional levels.
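The sketch below illustrates the kind of upward consolidation described here, assuming each uploaded teacher record carries a mean rating per NCBTS strand. The field names, strand codes and simple averaging rule are assumptions made for illustration; the TDIS and the TDNA Consolidation Database implement the official report formats.

```python
# Illustrative sketch of consolidating teachers' TSNA results upward, assuming each
# uploaded record carries a mean rating per NCBTS strand. Field names, strand codes
# and the averaging rule are assumptions; the TDIS and the TDNA Consolidation
# Database implement the official reports.
from collections import defaultdict
from statistics import mean

# Hypothetical uploaded records (one per teacher)
teacher_records = [
    {"school": "School A", "strand_means": {"2.1": 3.2, "4.7": 1.8}},
    {"school": "School A", "strand_means": {"2.1": 3.6, "4.7": 2.4}},
    {"school": "School B", "strand_means": {"2.1": 2.9, "4.7": 3.1}},
]


def consolidate_by_school(records):
    """Average each strand across all teachers of the same school."""
    per_school = defaultdict(lambda: defaultdict(list))
    for record in records:
        for strand, value in record["strand_means"].items():
            per_school[record["school"]][strand].append(value)
    return {
        school: {strand: round(mean(values), 2) for strand, values in strands.items()}
        for school, strands in per_school.items()
    }


print(consolidate_by_school(teacher_records))
# {'School A': {'2.1': 3.4, '4.7': 2.1}, 'School B': {'2.1': 2.9, '4.7': 3.1}}
```

The same aggregation can be repeated at the district, division and regional levels by grouping on the corresponding field instead of the school.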


At the Division level, the consolidated TSNA results, along with the Division MPPDs, are used by the planners specifically for the human resource development component of the Division Education Development Plan (DEDP). Likewise, the NCBTS-TSNA results at the Division level are forwarded to the Regional level for consolidation. This information is used as one basis for the Regional MPPD and to shed light on the situational analysis for the Regional Education Development Plan (REDP).

3.3. Development and Validation of the NCBTS-TSNA Tool

The NCBTS-TSNA tool is anchored on the seven domains of the NCBTS set by the Department of Education. Each domain has its corresponding strands and each strand has performance indicators. A total of seven domains, 23 strands and 80 performance indicators make up the NCBTS set by the DepED. The domains and strands are listed below. The performance indicators may be seen in the NCBTS-TSNA Tool in the NCBTS-TSNA Guide and Tools found in Section 7.1.

Domain 1 - SOCIAL REGARD FOR LEARNING
Strand 1.1 Teacher’s Actions Demonstrate Value for Learning
Strand 1.2 Demonstrates that Learning is of Different Kinds and from Different Sources

Domain 2 - LEARNING ENVIRONMENT
Strand 2.1 Creates an Environment that Promotes Fairness
Strand 2.2 Makes the Classroom Environment Safe and Conducive to Learning
Strand 2.3 Communicates Higher Learning Expectations to Each Learner
Strand 2.4 Establishes and Maintains Consistent Standards of Learners’ Behavior
Strand 2.5 Creates a Healthy Psychological Climate for Learning

Domain 3 - DIVERSITY OF LEARNERS
Strand 3.1 Determines, Understands and Accepts the Learners’ Diverse Background Knowledge and Experience

Domain 4 - CURRICULUM
Strand 4.1 Demonstrates Mastery of the Subject
Strand 4.2 Communicates Clear Learning Goals for the Lessons that are Appropriate for Learners
Strand 4.3 Makes Good Use of Allotted Instructional Time
Strand 4.4 Selects Teaching Methods, Learning Activities and the Instructional Materials or Resources Appropriate to the Learners and Aligned to the Objectives of the Lesson
Strand 4.5 Recognizes General Learning Processes as well as Unique Processes of Individual Learners
Strand 4.6 Promotes Purposive Study
Strand 4.7 Demonstrates Skills in the Use of ICT in Teaching and Learning

Domain 5 - PLANNING, ASSESSING AND REPORTING
Strand 5.1 Develops and Utilizes Creative and Appropriate Instructional Plans
Strand 5.2 Develops and Uses a Variety of Appropriate Assessment Strategies to Monitor and Evaluate Learning
Strand 5.3 Monitors Regularly and Provides Feedback on Learners’ Understanding of Content
Strand 5.4 Communicates Promptly and Clearly to Learners, Parents and Superiors about Learners’ Progress

Domain 6 - COMMUNITY LINKAGES
Strand 6.1 Establishes Learning Environments That Respond to the Aspirations of the Community

Domain 7 - PERSONAL GROWTH AND PROFESSIONAL DEVELOPMENT
Strand 7.1 Takes Pride in the Nobility of Teaching as a Profession
Strand 7.2 Builds Professional Links with Colleagues to Enrich Teaching Practice
Strand 7.3 Reflects on the Level of the Attainment of Professional Development Goals

The domains, strands and performance indicators were translated into specific Knowledge, Skills, and Attitudes (KSAs) to compose the NCBTS-TSNA Tool with 270 KSAs. Various groups at different levels validated the TSNA Tool’s content and methodology across Regions VI, VII, and VIII, and the tool was further validated at the Central Office level. The validation process involved the following:

1) Preliminary Content Validation: The validation group included the STRIVE Project Component Team composed of 34 educators: 4 Regional Supervisors, 6 Division Supervisors, 22 Principals, 1 Administrative Officer V (former HRMO), and 1 District ALS Coordinator. The process reduced the original 375 items to 260 items.

2) Region, and Division Level Content Validation: Six Regional Education Division Chiefs; 16 Division Supervisors; 13 District Supervisors; 27 School Heads, 27 Elementary School Master Teachers and 27 High School Master Teachers were selected to review the tool for content and language used. They submitted their comments and marginal notes for the refinement of the tool.

3) Field Process Validation: Sixty in-service teachers were asked to respond to the two versions of the tool: thirty (30) teachers used the manual version and 30 used the electronic version. Results showed that it was more efficient to complete the electronic version: the time spent in accomplishing the tool averaged two hours for the manual version and one hour for the electronic version. Refinements were made to the electronic tool related to the programming of results per domain and strand.


4) Validation by six of the STRIVE project’s Technical Advisers: the TDNA and SBM Adviser, the National and International T&D Advisers, the National and International ICT Advisers, and the SBM-QAAF Adviser. There was a recommendation, submitted by the LRMDS Advisers, to include items specific to ICT competencies. The ICT4E standards were studied and ten items were added to the four original items to compose the ICT “domain”, bringing the total number of items to 270.

5) Experts’ Validation at the Central Office Level: The TEDP group, which was responsible for formulating the NCBTS, was consulted to review the tool. The group included: the Director of the Teacher Education Council (TEC); a Professor and former Vice-President for Academics of West Visayas State University; the Associate College Dean of Arts and Sciences of the University of the Philippines; the College Dean of Centro Escolar University and National President of PAFTE; a School Head and the President of NAPSSHI; a School Head and President of PESPA; and a SPED specialist and Assistant Chief of the Bureau of Elementary Education. Together with the T&D Team and the ICT and T&D Technical Advisers, they reviewed the manual and thoroughly inspected the 270 items, item by item. As a result, further refinements were incorporated into the Tool. Additionally, the TEDP expressed appreciation for the developed NCBTS tool and for the addition of the set of 14 items that composed an “ICT domain”.

6) Presentation of the Guide and Tools to a national group of teacher educators: Comments and points for refinement were gathered from the participants of the First National Conference of Centers of Training Institutions, attended by Heads and Deans of 82 Teacher Education Institutions (TEIs), including a few RDs, ARDs, and SDSs, held at the Development Academy of the Philippines, Tagaytay City. Points considered for the improvement of the Guide and tool included making the PSDSs co-responsible with the School Heads for the administration of the NCBTS Tool to teachers in their clusters, the inclusion of an item on guided reflection as a competency, and reconsideration of the length of the Tool, among others.

7) Preparation of the NCBTS-TSNA Orientation Package: In the course of the steps mentioned above, there was a clear recognition that teachers must have an adequate understanding of the NCBTS Framework and the standard competencies expected of them before the NCBTS needs assessment process is done. To address this need, the T&D Team developed a resource package that aimed to orient the implementers, such as the ESs, PSDSs, School Heads and NCBTS Coordinators, on the BESRA and the NCBTS. The NCBTS-TSNA Orientation Package, which consists of a series of Structured Learning Episodes (SLEs), was also to be conducted with teachers prior to the initial administration of the NCBTS-TSNA Tool.

8) Process Try-out of the NCBTS-TSNA Guide and Tools, including the NCBTS-TSNA Orientation Package: The NCBTS-TSNA system and procedures were tried out in a one-school sample that involved all the teachers and the School Head of Tabalong National High School, Dauis, Division of Bohol. The content and processes of conducting the SLEs, the tool administration, scoring, individual and school consolidation profiling, and the M&E mechanisms were tried with 33 teachers. Refinements were made following the try-out, based on the observations of the T&D Team and feedback from the teacher respondents. The Pilot Version of the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package was then prepared for a bigger sample of schools.

9) Division Pilot-Test of the NCBTS-TSNA Guide and Tools, including the NCBTS-TSNA Orientation Package: The pilot-testing of the NCBTS-TSNA system using the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package was done in the 300 pilot schools in the Divisions of Bohol, Negros Occidental and Northern Samar. This expanded to include the six hundred twenty (620) non-pilot schools in the Division of Negros Occidental. Technical reports were developed to document the process and were the basis for the finalization of the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package. These were turned over to the Central Office, which further validated the Guide and Tools and the Orientation Package in six regions outside the STRIVE sites.

10) Finalization of the NCBTS-TSNA Guide and Tool and NCBTS-TSNA Orientation Package: Based on the national validation conducted by the TEDP-TWG in Luzon and Mindanao, further revisions were made, such as the renaming of the package to NCBTS-TSNA (Teachers’ Strengths and Needs Assessment), the clustering of 10 of the 14 ICT items under a new strand (4.7), and the addition of a performance indicator on ICT (4.7.1). The 10 ICT items originally found in different strands of the original tool were selected and classified under this performance indicator and ICT strand.

3.4. Description of the NCBTS-TSNA Guide and Tools (The complete Guide and Tools is found in Section 7.1.)

The NCBTS-TSNA Guide and Tools is intended to support the implementation of the National Competency-Based Teacher Standards (NCBTS) Teachers Strengths and Needs Assessment (TSNA) at the school level, in line with the Basic Education Sector Reform Agenda (BESRA) of the Department of Education.

The NCBTS-TSNA Guide and Tools aims to provide implementers and teacher beneficiaries with an understanding of the NCBTS and its relationship to in-service teachers’ professional development. It introduces the NCBTS-TSNA tool, with consideration of how the tool can be administered to teachers and how results can be utilized.

The NCBTS-TSNA Guide and Tools contains the following information:

An introduction to the NCBTS-TSNA Guide and Tools that describes the NCBTS-TSNA System Framework, the development of the tool, and the roles and responsibilities of Supervisors, School Heads, NCBTS Coordinators and teachers in relation to the NCBTS – TSNA and includes an overview of the NCBTS-TSNA Orientation Package.

Processes for the administration, scoring and interpretation of the NCBTS-TSNA at the school level

The hard copy and electronic versions of the NCBTS-TSNA Tool that contains the NCBTS-TSNA Self-assessment tool, answer sheet, teacher’s profile, individual summary of NCBTS-TSNA results template, and school consolidation template.

Monitoring and Evaluation processes and tools for the administration of the NCBTS-TSNA.

3.5. The NCBTS-TSNA Orientation Package

The NCBTS-TSNA Orientation Package serves as a guide to the implementers responsible for the initial administration of the NCBTS-TSNA to teachers. It is envisaged that the implementers are the School Heads and School NCBTS Coordinators. Division and District Supervisors also have a role in the TSNA implementation and will be guided by the package.

The package aims to:

- enhance implementers’ understanding of BESRA and the significance of teacher development in achieving its goals
- provide orientation about the NCBTS, specifically as it relates to teacher in-service development
- deepen implementers’ understanding of the seven domains, strands and performance indicators of the NCBTS
- introduce the NCBTS-TSNA Tool and consider guidelines for its implementation
- prepare School Heads, NCBTS Coordinators and District Supervisors to implement the NCBTS-TSNA to teachers.

The package should be used by the Division TDNA-WG to orient the School Heads/NCBTS Coordinators and Division and District Supervisors within the clusters who will, in turn, assist their teachers to accomplish the NCBTS-TSNA tool. It can also be used by the Region TDNA-WG to orient Division T&D Teams on the field implementation and M&E of NCBTS-TSNA.

3.6. Administration, Scoring and Interpreting the NCBTS-TSNA Tool

3.6.1 Versions of the NCBTS-TSNA Tool

The NCBTS-TSNA Tool is a self-assessment instrument that is introduced by the School Head/NCBTS Coordinator through an orientation process so that the teacher-respondents see its importance and respond to the tool reflectively. The NCBTS-TSNA Tool is available in an electronic format with an auto-scoring system, or as a hard copy with a separate answer sheet and summary results template. If the electronic version is used, each teacher responds to the NCBTS-TSNA tool from a file installed on a common computer in the school. It takes approximately one and a half hours to accomplish the instrument, although no time limit should be imposed. The scores and the individual summary results of the teacher in the seven domains, with the corresponding strands, are generated electronically upon completion of the instrument.

For schools that have no access to the technology required for use of the e-version of the NCBTS-TSNA tool, or where teachers are not computer literate, the hard copy version of the tool may be used for implementation. The hard copy version takes approximately 2 hours to accomplish plus one hour for scoring and individual profiling. The hard copy version can be found in the NCBTS-TSNA Guide and Tools along with the separate Answer Sheet and a Summary Results Template.

It should be noted that the DepED Central Office has distributed the NCBTS Tool Kit for the TSNA and IPPD, which contains the Teacher’s Profile, the NCBTS-TSNA Tool, the Answer Sheet and the Individual Scoring Template, to all regions and divisions with the expectation that all teachers will be provided with a copy.

To support school, district and division level consolidation of the NCBTS-TSNA results, it is recommended that the School Head, with the assistance of the NCBTS Coordinator, ensure that all hard copies of teachers’ NCBTS-TSNA results are entered into the electronic version of the tool. Results can then be uploaded onto the TDNA Consolidation Database (see Section 3.9, Management of the TDNA Consolidation Database, for further details).


3.6.2. The Tool Specifications

The NCBTS-TSNA tool contains clusters of knowledge, skills and attitudes (KSAs) specific to a particular Domain, Strand and Performance Indicator. The final tool is composed of a total of 270 items in the various clusters according to the specifications below:

DOMAINS                                                  STRANDS   PERFORMANCE INDICATORS   KSAs   PERCENTAGE (%)
Domain 1: Social Regard for Learning                        2               5                 18        6.67%
Domain 2: Learning Environment                              5              17                 59       21.85%
Domain 3: Diversity of Learners                             1               8                 27       10.00%
Domain 4: Curriculum                                        7              22                 78       28.89%
Domain 5: Planning, Assessing and Reporting                 4              12                 40       14.81%
Domain 6: Community Linkages                                1               6                 18        6.67%
Domain 7: Personal Growth and Professional Development      3              10                 30       11.11%
Total (7 Domains)                                          23              80                270      100.00%

3.6.3. Responding to the Items

Each item has a common stem: “At what level do I…”. Considering that the NCBTS-TSNA tool is intended for self-assessment and not for performance ratings, the responses to the items are expressed qualitatively as competency levels, i.e. High (H), Satisfactory (S), Fair (F) and Low (L). However, quantitative data are easier to interpret; thus, in the response analysis, a numerical equivalent is assigned to each descriptor: H = 4, S = 3, F = 2, L = 1. The reference codes presented below guide the respondent in registering her/his self-assessment for each KSA:

Code of Competency Level    Qualitative Description
H (High)                    I am very competent in the KSA and this is not my priority training need.
S (Satisfactory)            I am competent in the KSA but I would benefit from further training.
F (Fair)                    I am fairly competent in the KSA but need further training.
L (Low)                     I have low competence in the KSA and require urgent training.
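To illustrate how the letter codes translate into the numerical scores used in the response analysis, the following is a minimal sketch of the computation; it is not part of the official e-tool, and the sample responses and their grouping by strand are hypothetical.

```python
# Minimal sketch (not the official NCBTS-TSNA e-tool): convert letter responses
# to their numerical equivalents (H=4, S=3, F=2, L=1) and average them.
SCALE = {"H": 4, "S": 3, "F": 2, "L": 1}

def average_score(responses):
    """Average the numerical equivalents of a list of letter responses
    (e.g. all KSA responses belonging to one domain or strand)."""
    scores = [SCALE[r] for r in responses]
    return sum(scores) / len(scores)

# Hypothetical responses of one teacher to the KSAs of a single strand.
strand_responses = ["H", "S", "S", "F", "H", "S", "L", "S"]
print(round(average_score(strand_responses), 2))  # prints 2.88
```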


3.6.4. Interpretation of Results

Upon completion of the NCBTS-TSNA Tool, an obtained score, whether an average for a domain or for a strand, can be interpreted using the appropriate indices in the chart below.

Description of the Level of Teaching Competence (referred to as the Teacher Professional Development Index in the DepED NCBTS-TSNA Primer):

Scale Scores 3.51 - 4.00 (Percentage Scores 87.51 - 100%): Expert
Very competent and can support other teachers’ improvement. The teacher has almost all the competencies for effective teaching at a high level. These are the identified strengths. Strengths have to be sustained and enhanced; however, professional development needs have to be continuously addressed.*

Scale Scores 2.51 - 3.50 (Percentage Scores 62.51 - 87.50%): Experienced
Competent in the KSA but would benefit from further training and development. The teacher has the majority of the competencies for effective teaching at a high level. Strengths have to be enhanced. Training and development needs have to be addressed.*

Scale Scores 1.51 - 2.50 (Percentage Scores 37.51 - 62.50%): Developing
Fairly competent in the KSA and needs further training and development. The teacher has an average number of the competencies for effective teaching at a high level. These strengths have to be enhanced; however, training needs have to be addressed as a priority.*

Scale Scores 1.00 - 1.50 (Percentage Scores 25.00 - 37.50%): Beginning
Lacking competence in the KSA and requires urgent training and development. The teacher has very few of the competencies for effective teaching at a high level. Training needs have to be given priority and addressed urgently.*

* Description used in the DepED NCBTS-TSNA Primer and NCBTS Toolkit.

For more details of the description of the NCBTS-TSNA Tool, refer to Section 7.1, Attachment 1 (hard and electronic versions of the tool) for the steps in accomplishing the tool and the procedures in scoring the tool.
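The chart above can also be read as a simple lookup from an obtained average to its competence level and approximate percentage score. The sketch below expresses that mapping; the function names are illustrative only, and the range boundaries are taken directly from the chart.

```python
# Minimal sketch: map an obtained average (for a domain or strand) to the
# level of teaching competence defined in the chart above.
def competence_level(average_score):
    if 3.51 <= average_score <= 4.00:
        return "Expert"
    if 2.51 <= average_score <= 3.50:
        return "Experienced"
    if 1.51 <= average_score <= 2.50:
        return "Developing"
    if 1.00 <= average_score <= 1.50:
        return "Beginning"
    raise ValueError("score falls outside the ranges defined in the chart")

def percentage_score(average_score):
    # Approximately the scale score expressed as a share of the 4-point maximum.
    return average_score / 4 * 100

print(competence_level(2.88), round(percentage_score(2.88), 2))  # Experienced 72.0
```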


3.7 Organizational Structure and Process Flow

3.7.1. Creation of Regional and Division NCBTS-TDNA Working Groups

The Regional and Division TDNA Working Groups, as indicated in the Organizational Structure of the T&D System Framework (see T&D System Operations Manual Volume 1), will be responsible for the management of the NCBTS-TSNA process. The Regional TDNA-WG members (representing the elementary and the secondary levels, and preferably Education Supervisors) will be designated by the Regional Director to support the T&D Chief and members. The Schools Division Superintendent (SDS) will organize the Division TDNA-WG to support the Division T&D Chair and members. Both Regional and Division TDNA-WG members should be chosen based on their experience in assessment and in the training of teachers.

3.7.2. TDNA-WG Roles and Responsibilities

The T&D Chief/Chair has the overall responsibility for the management of the NCBTS-TSNA process. She/He should ensure that all the TDNA-WG members are familiar with the process for conducting an NCBTS-TSNA orientation to groups of School Heads with their NCBTS Coordinators at the cluster/district level, and with the administration of the NCBTS-TSNA Tool to the teachers at the school level. The TDNA-WG members are expected to play a key role in the preparation, administration, orientation, monitoring, data consolidation and reporting of the results of the NCBTS-TSNA. Specific tasks include managing the:

- Organization and technical operations of the TDNA system
- Provision of technical orientation and administration of organizational and individual TDNA, e.g. NCBTS-TSNA, TDNASH, Organizational TDNA for the Region or Division
- Consolidation and reporting of TDNA results
- Technical assistance for the utilization of TDNA results in the SIP/AIP, DEDP and REDP
- Coordination with resource mobilization to support TDNA activities
- Monitoring and evaluation of the TDNA system and use of the TDIS
- Submission to concerned officials of recommendations based on the monitoring and evaluation results of the TDNA System operations

The general flow of processes related to the NCBTS-TSNA across the Regional, Division/District and School levels is seen in the diagram below.


NCBTS-TSNA Structural Process Flow

Region Level:
- RD issues a memo to Divisions commencing the NCBTS-TSNA, specifying among others the structure, functions, general process, resources and responsibilities
- RO designates/instructs the Regional TDNA-WG to commence NCBTS-TSNA activities
- Reg TDNA-WG meets with the Division TDNA-WG to orient and plan activities for cluster and school level implementation
- Reg TDNA-WG monitors and evaluates the Div TDNA-WG implementation of the NCBTS-TSNA
- RO utilizes NCBTS-TSNA results to inform the MPPD & REDP

Division Level:
- SDS instructs the TDNA-WG to commence the NCBTS-TSNA for teachers, specifying among others the structure, functions, general process, resources and responsibilities
- Div TDNA-WG meets with the Reg TDNA-WG and starts orientation and preparatory activities for the NCBTS-TSNA
- Reg & Div TDNA-WG convene the Facilitators/Trainers Team and conduct a walkthrough of the NCBTS-TSNA Orientation Package*
- TDNA-WG implements the cluster lead and district school level orientation for School Heads and NCBTS Coordinators

School Level:
- School Head and NCBTS Coordinator conduct the NCBTS orientation* and administer the NCBTS-TSNA to all teachers
- Teachers accomplish the NCBTS-TSNA self-assessment in hard copy or electronic format
- Teachers accomplish the NCBTS-TSNA summary of results
- Teachers identify their priority training needs
- SH and NCBTS Coordinator consolidate the NCBTS-TSNA results and submit a report to the District Supervisor and Division TDNA-WG

Division/District Level:
- Div TDNA-WG monitors cluster/district implementation
- TDNA-WG consolidates NCBTS-TSNA results and reports to the SDS
- NCBTS-TSNA results are utilized for the MPPD & DEDP re staff development
- SDS submits the NCBTS-TSNA report to the RD

* Note: The Orientation Package only needs to be delivered when first introducing the NCBTS-TSNA.

3.7.3. Roles and Responsibilities to Support the Orientation on NCBTS and the Administration of the NCBTS- TSNA

Regional Supervisors
- To provide technical assistance to the Division in the implementation of the NCBTS-TSNA Orientation and the administration of the NCBTS-TSNA Tool
- To monitor and evaluate the implementation by the Division of the NCBTS-TSNA

District/Division Supervisors
- To attend an orientation on the National Competency-Based Teacher Standards and the NCBTS-TSNA Tool
- To provide assistance to Lead School Heads in coordinating clusters to meet to undertake an orientation on the National Competency-Based Teacher Standards and the NCBTS-TSNA Tool
- To assist in generating resources for the Division NCBTS-TSNA activities
- To actively support School Heads and NCBTS Coordinators in the conduct of the orientation of all teachers regarding the NCBTS
- To actively support School Heads and NCBTS Coordinators in the administration of the NCBTS-TSNA Tool to all teachers
- To support the electronic consolidation of the results of the teacher NCBTS-TSNA at the school level and the incorporation of findings into School Improvement Plans and District Training and Development Plans
- To monitor and evaluate the conduct of the administration of the NCBTS-TSNA
- To give feedback and recommendations on the NCBTS-TSNA process

Lead School Heads
- To orient all School Heads and NCBTS Coordinators within their cluster to the National Competency-Based Teacher Standards and the Training Needs Assessment Tool
- To assist School Heads in identifying/organizing different working groups for the school level orientation on the National Competency-Based Teacher Standards and the NCBTS-TSNA Tool
- To support schools within their clusters with resources for the NCBTS-TSNA Orientation, the NCBTS-TSNA administration and the electronic consolidation of results

School Heads
- To attend an orientation on the National Competency-Based Teacher Standards and the NCBTS-TSNA Tool
- To identify/organize different working groups for the orientation on the National Competency-Based Teacher Standards and the NCBTS-TSNA Tool
- To orient all teaching staff within their school on the National Competency-Based Teacher Standards
- To administer the NCBTS-TSNA Tool to all teachers within their school
- To consolidate the school level NCBTS-TSNA results electronically
- To identify the strengths and needs of teachers based on the NCBTS-TSNA results

NCBTS Coordinators
- To attend an orientation on the NCBTS and the NCBTS-TSNA
- To assist the School Head in orienting all teaching staff within their school on the National Competency-Based Teacher Standards
- To assist the School Head in the administration of the NCBTS-TSNA Tool to all teachers within their school
- To assist in the electronic consolidation of the school level NCBTS-TSNA results

Teachers
- To attend an orientation on the NCBTS and the NCBTS-TSNA Tool
- To complete the NCBTS-TSNA, including the Individual Teacher Summary of NCBTS-TSNA Results

3.7.4. Criteria for the Selection of School NCBTS Coordinators

The orientation on the National Competency-Based Teacher Standards and the NCBTS-TSNA for teachers in all schools is the responsibility of the School Heads and their respective School NCBTS Coordinators. The designation of the School NCBTS Coordinator is at the discretion of the School Head, taking into consideration the following criteria.


The NCBTS Coordinator should have:

- at least three years of teaching experience;
- knowledge and experience in facilitating training activities;
- computer skills in Word, Excel and PowerPoint;
- good communication and leadership skills; and
- willingness and commitment to complete NCBTS-related tasks to support on-going teacher development.

3.8 Monitoring & Evaluation of the NCBTS-TSNA Implementation

An important function of the TDNA-WG established by the division and the region is the monitoring and evaluation of the NCBTS-TSNA process. The results of the evaluation will serve as feedback on how to improve future NCBTS-TSNA processes. The TDNA-WG is tasked with the preparation, conduct and consolidation of the NCBTS-TSNA M&E results, utilizing the tools developed. The M&E tools include:

- T&D-M&E Form 1: Individual Profile Template (for Region and Division TDNA-WG)
- NCBTS-M&E Form 1: Teacher’s Profile for NCBTS-TSNA
- NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
- NCBTS-M&E Form 3: NCBTS Coordinators Checklist and Consolidation Template
- NCBTS-M&E Form 4: Trainer’s Assessment of NCBTS Orientation Workshop and Consolidation Template
- NCBTS-M&E Form 5: Trainee’s End of F3 Program Assessment and Consolidation Template
- NCBTS-M&E Form 6: Documentation Tool for the Conduct of Cluster or School Level NCBTS-TSNA Implementation
- NCBTS-M&E Form 7: School’s NCBTS-TSNA Consolidation Template

A matrix that describes the M&E tools developed to support the NCBTS-TSNA process can be found in the NCBTS-TSNA Guide and Tools in Section 7.1.

Based on the results of the M&E, the Division TDNA-WG prepares the NCBTS-TSNA Program Completion report and informs the Division T&D Chair of the results. The report should include recommendations for the improvement of the process, which will in turn inform Regional policy review and adjustment of the NCBTS-TSNA component of the TDNA System.

3.9 Management of the TDNA Consolidation Database

3.9.1 The TDNA Consolidation Database

As a component of the Training and Development Information System (TDIS) a database has been developed to support the consolidation of the NCBTS-TSNA results. The TDNA Consolidation Database allows schools to upload their electronic versions of the accomplished NCBTS-TSNA tool and automatically generate individual teacher and school level results. An individual summary result as well as a school profile can be generated identifying a single teacher’s or school’s strengths and priority training and development needs according to the NCBTS domains and strands. Data can be analyzed and used to inform the teacher’s development of an Individual Professional Development Plan (IPPD) and the School Plan for Professional Development (SPPD).

Similarly, the database can be used to support the consolidation and analysis of NCBTS-TSNA results at the district, division and regional level.
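As a simplified illustration of the consolidation the database performs, the sketch below averages individual teachers' domain results into a school-level profile. The data structures and figures are hypothetical; the actual upload formats and reports are defined by the TDNA Consolidation Database and the TDIS, not here.

```python
# Minimal sketch of school-level consolidation: average each NCBTS domain
# across all teachers' individual NCBTS-TSNA results. Data are hypothetical.
def consolidate_school(teacher_results):
    """teacher_results: list of dicts mapping domain name -> average score."""
    domains = teacher_results[0].keys()
    return {
        d: round(sum(t[d] for t in teacher_results) / len(teacher_results), 2)
        for d in domains
    }

school_profile = consolidate_school([
    {"Curriculum": 2.8, "Learning Environment": 3.4},
    {"Curriculum": 2.2, "Learning Environment": 3.0},
])
print(school_profile)  # {'Curriculum': 2.5, 'Learning Environment': 3.2}
```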


The TDNA Database at the school and district level is a stand-alone database that does not require access to the internet. The database can be obtained from the Division along with an accompanying TDNA Consolidation Database Manual. The TDNA database that supports the Division and Regional consolidation of data is linked to the web-based TDIS and can be accessed through the EBEIS at http://beis.deped.gov.ph/

3.9.2 School Level Management of the TDNA Consolidated Database

School Heads, supported by their NCBTS Coordinators, are responsible for the management of the database at the school level. The Division will be responsible for ensuring all School Heads are trained in how to manage and operate the database. The main responsibility at the school level will be to ensure all NCBTS-TSNA tools accomplished by teachers are in the electronic format, i.e. any NCBTS-TSNA accomplished manually is re-entered into the electronic version of the tool.

Electronic files from all schools will need to be submitted to the District/Division to support District, Division and Regional consolidation.

3.9.3 District/Division Management of the TDNA Database

Division TDNA-WG will be responsible for ensuring all Education Supervisors (ES1 or PSDS who are monitoring elementary and secondary schools in the district), and all School Heads are trained in the use of the TDNA Consolidation Database. The Supervisors will be responsible for managing the database at the District level and consolidating the NCBTS-TSNA electronic results from all schools within their district. Results at this level should be used to inform District lead training and development activities. The District level NCBTS-TSNA electronic results should be submitted to the Division TDNA-WG to support division consolidation.

The T&D Chair at the Division level is responsible for managing the TDNA Consolidation Database at the Division level. With the support of the TDNA-WG, district electronic results should be consolidated and incorporated into the web-based TDIS. Results should be analyzed to inform Division MPPDs.

3.9.4 Region Management of the TDNA Database

The T&D Chief at the region level has overall responsibility for the management of the TDIS of which the TDNA Consolidation Database is one component. The Regional T&D Team will be responsible for ensuring all Division T&D Teams are trained in the management and use of the TDIS and are familiar with the process involved in consolidating and analyzing the NCBTS-TSNA results using both the TDIS and the school/district level database.

Section 4: The Training and Development Needs Assessment for School Heads

4.1. Basis and Purpose of the TDNASH

The Training and Development Needs Assessment for School Heads (TDNASH) is a means to systematically determine the training and development needs of School Heads (SHs) in order to support improved educational leadership. The competencies identified in the TDNASH are based on the mandate for School Heads indicated in RA 9155, its Implementing Rules and Regulations (IRR), and DepED Orders 80 and 81, series of 2003.


The TDNASH is an adaptation of the National Educators Academy of the Philippines (NEAP) assessment tool for school managers. A list of required competencies based on the mandate for School Heads was earlier developed by the NEAP for its School Leadership Experience Portfolio (SLEP). The SLEP is an assessment tool containing a rubric of behavioral indicators to assess school leaders on specific competencies and to classify their level of performance into one of five general types, namely: Awareness, Emerging, Practicing, Performing and Transforming. The SLEP Assessment Tool was validated and utilized in a previous DepED project, the Secondary Education Development Improvement Project (SEDIP), using a wide-ranging sample involving 16 DepED Divisions across the country.

The SLEP was the primary material that was reviewed and enhanced to develop the final list of competencies, in behavioral terms, for the purpose of identifying the training and development needs of School Heads in the target sites of Project STRIVE in 2006. The SLEP tool was designed for the training and development needs assessment of School Heads in Northern Samar and revised into a shorter form for the Division of Bohol in 2007. The tool was further refined and validated with 716 SHs in the Division of Negros Occidental in June 2008, and with 43 SHs in Northern Samar and 63 SHs in Bohol in September 2008. The pilot-test results were used to develop the present form, referred to as the Training and Development Needs Assessment for School Heads (TDNASH).

The TDNASH intends to:

- determine the current level of School Heads’ competency
- ascertain the level of importance of each competency to the job
- identify the priority training and development needs of the SHs vis-à-vis the seven domains for school leadership and management

4.2 Organizational Structure and Process Flow

4.2.1. The TDNA Working Group (TDNA-WG)

The Division Training and Development (T&D) Team, through its TDNA Working Group (TDNA-WG), is responsible for the management of the TDNASH process. The T&D Chair, acting as the Chair of the TDNA-WG, and two assisting Co-chairs (representing the elementary and the secondary levels, and preferably Division Supervisors) are designated by the Schools Division Superintendent (SDS). The TDNA-WG members are all the District/Division Supervisors. Each member should have direct responsibility for the supervision of a cluster of elementary and/or secondary schools.

4.2.2. TDNA-WG Roles and Responsibilities

The Chair and Co-chairs have overall responsibility for the management of the TDNASH process. They should ensure that the Supervisors, who are also TDNA-WG members, are familiar with the process for conducting the TDNASH with the three different sets of respondents, i.e. the SHs, Supervisors and teachers. The TDNA-WG members are expected to play a key role in the preparation, administration, data analysis and reporting of the results of the TDNASH. The general flow of processes related to the TDNASH across the Regional, Division/District and School levels is seen in the diagram below.


TDNASH Process Flow

Region Level:
- RD issues a memo to Divisions commencing the TDNASH, specifying among others the structure, functions, general processes, resources and responsibilities
- RO utilizes TDNASH results for the Reg-MPPD re SHs’ leadership training plan

Division Level:
- SDS instructs the T&D TDNA-WG to commence the TDNASH, specifying among others the structure, functions, general processes, resources and responsibilities
- T&D Chair convenes the TDNA-WG and starts orientation and preparatory activities for the TDNASH
- TDNA-WG monitors and evaluates the TDNASH process (internal M&E)

School Level:
- TDNA-WG administers the triangulation process for the TDNASH by cluster to SHs, teachers and supervisors
- SHs are furnished individual results of the TDNASH
- SHs identify their priority training and development needs
- SHs develop their IPPD

Division/District Level:
- TDNA-WG analyzes TDNASH data per district/cluster
- TDNA-WG consolidates TDNASH results and reports to the SDS
- TDNASH results are utilized for the Div-MPPD re SHs’ leadership training plan
- TDNA-WG monitors and evaluates the TDNASH process at the DO level
- SDS submits the report to the RD

4.3 Competencies Assessed by the TDNASH

The competencies identified for the School Heads have been classified into seven domains. These are: (1) School Leadership, (2) Instructional Leadership, (3) Creating a Student-centered Learning Climate, (4) Professional Development and HR Management, (5) Parental Involvement and Community Partnership, (6) School Management and Daily Operations, and (7) Personal Integrity and Interpersonal Effectiveness.

The School Head’s competencies are stated in terms of Knowledge, Skills and Attitudes (KSAs) that are classified according to the seven domains. Each KSA is associated with one of the five School Leadership Experience Levels (SLEL) described in the chart below:

LEVELS AND DESCRIPTORS OF SCHOOL LEADERSHIP EXPERIENCE LEVEL

Awareness (A) Level
o Observing the leadership of others
o Understanding the work of leadership from a theoretical perspective
o Following the directives or suggestions of DepED or the school team

Emerging (E) Level
o Identifying factors that affect work, programs or projects
o Participating as a member in the work of a team for a specific purpose
o Implementing a program or project selected by the team or downloaded by DepED

Practicing (P) Level
o Facilitating the work of a team as a leader
o Managing a program, project or initiative being done by a team
o Coordinating the work of others
o Developing programs and providing resources to be utilized by others in their work
o Supporting the work of individuals and teams
o Monitoring and evaluating performance of individuals, project accomplishment and utilization of budget
o Acting on the results of monitoring and evaluation of performance
o Providing feedback to teachers and teams regarding performance

Leading (L) Level
o Overseeing a school-wide program, project or initiative which involves delegating, decision-making and problem solving
o Ensuring that teaching and non-teaching personnel are results-oriented
o Exercising management control and evaluating school-wide projects, programs and personnel
o Designing, developing, and overseeing the implementation of a program or project
o Facilitating, coordinating and evaluating the work of several teams or sub-teams
o Monitoring and ensuring efficient and effective use of time and resources
o Observing, supervising, and coaching/mentoring teaching and non-teaching personnel for the purpose of professional growth
o Holding others accountable for their work and behavior while providing them with technical assistance

Transforming (T) Level
o Extending expertise to other school heads
o Providing opportunities for professional growth to teachers and non-teaching personnel
o Institutionalizing efforts or practices that are found to be effective
o Identifying and developing emerging and potential leaders in the school community
o Creating a culture of excellence to enhance peak performance
o Creating an atmosphere where individual differences are celebrated and where every individual grows and maximizes his/her potential

4.4. The TDNASH Guide and Tools

The TDNASH Guide and Tools can be found in Section 7.2. It includes introductory information, a description of the assessment approach and methodology, details on the analysis of the results, and the monitoring and evaluation mechanism. Attachments include the Individual Profile Template, the TDNASH Tool, the Answer Sheet, the Instructions for Supervisors in Answering the TDNASH Tool, the Consolidated Triangulation Results Template for an Individual School Head, the Consolidated Cluster Results Template, and the Monitoring and Evaluation Tools for the TDNASH.

The TDNASH Tool is composed of 410 KSAs classified into the seven domains of School Leadership competencies. It allows for the individual profiling of School Heads’ results and the consolidation of results at the cluster/district level.

4.5 System Design and Assessment Methodology

The administration of the TDNASH employs a triangulation process which involves a self-appraisal made by the School Head and two other assessment measures that validate the self-assessment made by the SH, as illustrated in the following diagram. The School Heads are asked to complete a self-assessment instrument where they identify the behaviors they have consistently demonstrated in their role as a school head across the seven domains. They also identify the importance of the competencies in relation to their ability to perform their job. To corroborate the self-assessment, a Group Consensual Assessment (GCA) technique is conducted with a small group of teachers (5-8) to consider the School Head’s level of competency. Finally, the School Head’s District/Division Supervisor’s assessment of the School Head makes up the third measure, using the same instrument covering all the domains and competencies expected of a school leader.

As shown in the design (1.2) above, the TDNA process for School Heads (SHs) is similar to that of the teachers’, except that the self-assessment of the SHs is reviewed through a triangulation process. A supervisor and a group of teachers serve as the second and third measures, respectively, to make up the TDNA profile of the SH. The TDNASH results of an individual SH are summarized by the ES/PSDS in charge, and a School Head Profile is generated manually or through the electronic tool provided. This will be used by the School Head in completing his/her IPPD. Data for groups of School Heads can be registered in the TDNA Consolidation Database for the generation of reports at the division level. Generated reports are inputs to the situational analysis completed for the Division Education Development Plan (DEDP) and inform Division MPPDs. The consolidated TDNASH results at the Regional level inform the Regional MPPD and the Region Educational Development Plan (REDP), specifically in relation to the identification of training priorities for School Heads in the region.

4.6 Administration of the TDNASH

4.6.1 Preliminary Preparations

Prior to the commencement of the administration of the TDNASH, the T&D Chair and Co-chairs are convened to study the TDNASH Guide and Tools, including the introductory information and procedures, and specifically to review the following:
- Individual Profile Template
- TDNASH Tool
- TDNASH Answer Sheet
- List of Competencies and Behavioral Indicators
- Monitoring and Evaluation Tools

The Chair and Co-chairs of the group lead a one-day orientation activity for the TDNA-WG, which includes the Supervisors, guided by the specific steps outlined in the TDNASH Guide and Tools found in Section 7.2. Key elements of the orientation include:

- Presentation of objectives and key understandings to be achieved
- TDNA-WG Roles and Responsibilities
- Walkthrough of the TDNASH Guide and Tools, including a review of: processes to be followed, the Individual Profile Template, the TDNASH Tool, the Answer Sheet, the list of Competencies and Behavioral Indicators, and the Level of Importance Rating Scale
- Simulation of the TDNASH administration process among the TDNA-WG members
- Review of the Group Consensual Assessment technique for the teacher assessors
- Process for the assessment of SHs by the Supervisors
- Review of the School Leadership Experience Levels
- Scoring the TDNASH Answer Sheets
- Consolidation of the triangulation results
- Reporting and interpreting cluster results
- Developing a plan for field administration of the TDNASH

4.6.2 Administration of the TDNASH to School Heads (Refer to the TDNASH Guide and Tools for specific steps)

Preparation of materials required for the administration of the TDNASH to the School Heads:
- Individual Profile Template
- TDNASH Tool
- TDNASH Answer Sheet

Steps for the actual administration include the following:
- Introduction of the TDNASH, stressing the purpose of the TDNA, its spirit of introspection or guided reflection, and that it is not intended to evaluate performance nor is it a measure to compare between or among various school heads.
- Presentation of the competencies and behavioral indicators identified in the TDNASH, which are based on the mandate for School Heads indicated in RA 9155 and are adapted from the National Educators Academy of the Philippines (NEAP) School Leadership Experience Portfolio (SLEP).
- Explanation that the TDNASH will follow a triangulation process.
- Accomplishment of the Individual Profile.
- Review of the TDNASH Tool (domains, qualifiers, indicators) with the School Head and explanation that for each domain there are a number of competencies.
- Provision of instructions for responding to the tool using the appropriate columns of the answer sheet and in rating the level of importance of each competency.

4.6.3 Administration of the TDNASH for the Teachers’ Group Consensual Assessment (GCA) (Refer to the TDNASH Guide and Tools for specific steps)

For the administration of the TDNASH by teachers, the following materials are required:
- TDNASH Tool (first page to be removed, 1 per teacher)
- TDNASH Answer Sheet (1 only per group)


A small group of teachers (5-8) is selected by the PSDS responsible for overseeing the school to participate in the Group Consensual Assessment. The teachers selected should be representative of the school’s teacher population and should have a good knowledge of the School Head’s work practices.

Specific steps for the GCA technique to be used with the teachers are outlined in the TDNASH Guide and Tools. Work towards the attainment of the following:
- Understanding of the genuine intention and spirit of a TDNA
- Understanding of the purpose of the TDNASH and the basic mechanics of the exercise
- Presentation of the TDNASH Tool and the Answer Sheet
- Understanding of each competency, the response mode and the rating scale for assessing the competency’s level of importance
- Articulation of individual perception on whether the School Head consistently demonstrates the behavioral indicators, and consensus agreement on whether the School Head consistently demonstrates the behavioral indicators
- Consensus agreement on the level of importance (LOI) of the competency to the job of a School Head

4.6.4 Administration of the TDNASH by a Supervisor (Refer to TDNASH Guide and Tool for detailed instructions.)

For the administration of the TDNASH by Supervisors, the following materials are required:
- TDNASH Tool
- TDNASH Answer Sheet
- Instructions for Supervisors

The Supervisor who is requested to complete the “TDNASH by Supervisors” should have a sound knowledge and background of the School Head’s work practices. This would normally be the Supervisor who has direct responsibility for overseeing the school where the School Head is based and is most likely a member of the TDNA-WG who has undergone the orientation activity for the TDNASH.

The following points serve as a guide for accomplishment of the TDNASH by Supervisors:
- Provide the Supervisor with a copy of the “Instructions for Supervisors” and allow him/her time to read them.
- Provide the Supervisor with a copy of the TDNASH Tool and the Answer Sheet.
- Clarify any questions the Supervisor may have regarding the completion of the TDNASH.
- Allow time for the Supervisor to respond to the TDNASH Tool.
- When the Supervisor has finished, collect the Answer Sheet and ensure the respondent’s information is completed and the Supervisor’s signature is affixed at the end of the Answer Sheet.

4.7. Analysis of the Results

4.7.1. Scoring the TDNASH

The procedures for scoring the TDNASH Answer Sheets are explained more explicitly in the TDNASH Guide and Tools found in Section 7.2. Scoring is based on a 5-point scale with the descriptors outlined below:


School Leadership Experience Level    Letter Code    Rating Scale
Awareness Level                            A              1
Emerging Level                             E              2
Practicing Level                           P              3
Leading Level                              L              4
Transforming Level                         T              5

4.7.2. Consolidation of the TDNASH Triangulation Results for Individual School Heads

When the data has been collected and analyzed from all three sources involved in the triangulation TDNASH process, the results need to be consolidated for an individual School Head by the TDNA-WG member who administered the TDNASH. A template has been developed to assist in the consolidation and an e-version is also available. (Refer to TDNASH Guide and Tools).

4.8. Reporting Cluster Results

When the results have been consolidated for each School Head, cluster results should be developed. A template is provided to facilitate this process (see TDNASH Guide and Tools). The template has three parts, namely: Part I – Cluster SH Identification; Part II – TDNASH Cluster Summary Sheet for SLEL; and Part III – TDNASH Cluster Summary Sheet for LOI. The steps for organizing the cluster results are explained in more detail in the TDNASH Guide and Tools.

4.9. Interpretation of Results

The SH’s School Leadership Experience Level (SLEL) is interpreted using the descriptors for the five levels: Awareness, Emerging, Practicing, Leading and Transforming. The simplest level, Awareness, is rated 1, and the most advanced level, Transforming, is rated 5.

Based on the results of the individual SLEL ratings, a School Head can recognize their level of competency for the various domains. Competencies where they are at the Awareness (A) or Emerging (E) level require further development and training, while competencies rated at the Transforming (T) level indicate a strength. Ratings at the Practicing (P) and Leading (L) levels indicate satisfactory levels of performance that should be further enhanced.

Interpretation of the SLEL results must be done with consideration of the corresponding Level of Importance (LOI) rating. The LOI indicates how essential a given competency is to the job of a School Head. Although a School Head may be rated at the same level of performance for two different competencies, the competencies may not have the same degree of importance to the job of the School Head. Thus, the competency rated more essential should be given higher priority for development or training purposes.

Consolidated TDNASH results gathered from groups of School Heads at the District/Division level may be analyzed in the same manner. The data analysis informs planners of training and development programs as to what priority training programs should be prepared to address low-level competencies that are most essential to the job of school managers.
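The prioritization described above, giving higher priority to competencies with low experience levels and high importance, can be sketched as a simple sort, shown below. The competency names, SLEL and LOI values are hypothetical.

```python
# Minimal sketch: rank competencies for training priority using the SLEL rating
# (1 = Awareness ... 5 = Transforming) and the Level of Importance (LOI).
# Low SLEL combined with high LOI comes first. Data are hypothetical.
competencies = [
    {"name": "Instructional Leadership", "slel": 2, "loi": 5},
    {"name": "School Management and Daily Operations", "slel": 2, "loi": 3},
    {"name": "Parental Involvement and Community Partnership", "slel": 4, "loi": 4},
]

priorities = sorted(competencies, key=lambda c: (c["slel"], -c["loi"]))
for c in priorities:
    print(c["name"], "SLEL:", c["slel"], "LOI:", c["loi"])
# "Instructional Leadership" is listed first: it shares the same low SLEL as
# "School Management and Daily Operations" but is rated more essential to the job.
```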


4.10. TDNA Consolidation Database for TDNASH

4.10.1. District/Division Management of the TDNA Database

The TDNA Consolidation Database allows districts and divisions to upload the electronically consolidated triangulated results from the TDNASH. An individual School Head summary result as well as a district/division profile can be generated, identifying strengths and priority training and development needs according to the TDNASH domains and competencies. Data can be analyzed and used to inform the School Head’s development of an Individual Professional Development Plan (IPPD) and the Division Master Plan for Professional Development (MPPD).

The TDNA database that supports the Division and regional consolidation of TDNASH data is linked to the web-based TDIS and can be accessed through the EBEIS at http://beis.deped.gov.ph/

The Division TDNA-WG will be responsible for ensuring all Education Supervisors are able to use the e-version of the template for consolidating the triangulation results of individual School Heads. The Supervisors will be responsible for managing the consolidation of the TDNASH results from all schools within their district. Results at this level should be used to inform District lead training and development activities for School Heads. The District level TDNASH consolidated electronic results should be submitted to the Division TDNA-WG.

The T&D Chair at the Division level is responsible for managing the TDNA Consolidation Database at the Division level. With the support of the TDNA-WG, district electronic results for the TDNASH are to be consolidated and incorporated into the web-based TDIS. Results should be analyzed to inform Division MPPDs.

4.10.2. Region Management of the TDNA Database

The T&D Chief at the region level has overall responsibility for the management of the TDIS of which the TDNA Consolidation Database is one component. The Regional T&D Team will be responsible for ensuring all Division T&D Teams are trained in the management and use of the TDIS and are familiar with the process involved in consolidating and analyzing the TDNASH results using the TDIS.

4.11. Monitoring and Evaluation of the Process

The TDNA-WG Chair, Co-chairs and selected TDNA-WG members are tasked to monitor and evaluate the administration of the TDNASH. It is not expected that group members will be able to observe the administration of the TDNASH to all School Heads and the validation by teachers; however, they should ensure they have an opportunity to observe the TDNASH administration in approximately 10% of sites.

Monitoring and Evaluation mechanisms and tools have been developed to support the TDNASH process. They consist of the following:

- T&D-M&E Form 1: Individual Profile Template
- TDNASH-M&E Form 1: Division M&E of Conduct of TDNASH
- TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template
- TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH

The tools are found in the TDNASH Guide and Tools in Section 7.2. The TDNA-WG should be convened anew to consolidate the results from the monitoring of the TDNASH and to develop recommendations for the improvement of the process. A matrix describing the M&E tools developed for use during the implementation of the TDNASH process can be found in the TDNASH Guide and Tools.

4.12 Refinement of the TDNASH Tool

The existing TDNASH tool was developed prior to the national validation of the National Competency-Based Standards for School Heads (NCBS-SH) by the SBM-WG, TEDP WGs and NEAP in September 2009, and the issuance of the DepED Order relating to the national adoption and implementation of the NCBS-SH (DepEd Order No. 32, s. 2010, dated March 12, 2010). While the new NCBS-SH are very similar to the SLEP competencies used as the basis of the TDNASH tool, it is recognized that the TDNASH tool may need to be reviewed and further refined to align with the NCBS-SH. The NCBS-SH can be found in Section 7.2, Attachment 3.

Section 5: The Organizational TDNA for the Region and Division

5.1 The Basis of the Organizational TDNA

As mandated in the Revised Implementing Rules and Regulations of R.A. 9155, it is one of the primary functions of the Regional Office to plan and manage the effective and efficient use of the personnel, physical and fiscal resources of the region, including professional staff development, and to implement and manage regional staff development programs. Related to this, the Division has the authority, accountability and responsibility to plan and manage professional staff development, including that of the division office management and technical staff. The Training and Development Needs Assessment (TDNA) of the Region and Division is designed to identify the organization’s current needs vis-à-vis the desired organizational roles and responsibilities as stipulated in the Governance of Basic Education Act of 2001 (R.A. 9155).

The ‘Management Competencies per Service Areas’ established for the Decentralized Management Training Program of the Secondary Education Development Improvement Project (SEDIP, a DepED project implemented by the Bureau of Secondary Education) served as the basis for the development of this Organizational TDNA tool. These competencies were developed for the Central Office (CO), Region Office (RO) and Division Office (DO) levels and validated with educational leaders across 15 Divisions and nine Regions of DepED. The management competencies are organized as follows:

- General Competencies (CO, RO, DO)
- Service Area 1: Educational Planning (RO, DO)
- Service Area 2: Learning Outcome Management (DO)
- Service Area 3: Monitoring and Evaluation (RO, DO)
- Service Area 4: Education Administration and Management (CO, RO, DO)
- Service Area 5: Policy Formulation and Standard Setting (CO, RO)
- Service Area 6: Curriculum Development (CO, RO)

5.2 The Framework and Management Competencies for the Organizational TDNA

TDNA is essential in the provision of quality professional development programs that are aligned to the needs of the organization in order to improve its performance. The TDNA determines the differences between the actual situation (what is) and the desired condition (what should be) in terms of personnel competencies, more specifically the knowledge, skills and attitudes expected of the organization.


5.2.1. The Management Competencies

The Management Competencies for the Regional and Division Organization are listed as follows:

A. General Competencies (For CO, RO, and DO):
1. Understanding DepED as an Organization
2. Understanding RA 9155 or the Governance of Basic Education Act
3. Management of Change
4. Organization Analysis/Diagnosis
5. Problem Solving
6. Decision-Making
7. Dealing Effectively with Pressure Groups
8. Conflict Management
9. Negotiation Skills
10. Transformational and Enabling Leadership

B. Educational Planning (For RO and DO):
11. Strategic Planning
12. Implementation Planning
13. Project/Program Identification
14. Resource Mobilization and Allocation
15. Financial Management and Control
16. Group Process Management
17. Facilitation Skills
18. Communication Skills
19. Advocacy

C. Monitoring and Evaluation (For RO and DO):
20. Monitoring and Evaluation Design and Development
21. Instrument/Tools Development for M&E Data Gathering
22. Data Processing, Analysis and Utilization
23. Communication Skills/Feedback Giving
24. Education Management Information System (EMIS)

D. Education Administration and Management (For CO, RO and DO):
25. Resource Mobilization and Management
26. Resource Procurement and Management
27. Building Partnerships
28. Human Resource Management
29. Delegation
30. Physical Facilities Programming
31. Records Management
32. Understanding the Intent of the Policy and Implementation

E. Policy Formulation and Standards Setting (For CO and RO):
33. Policy Framework Development
34. Policy Instrument Development
35. Policy Formulation
36. Policy Review
37. Standard Setting
38. Technical Writing
39. Advocacy for Policy Formulation/Implementation

F. Curriculum Development (For CO and RO):
40. Knowledge of the Technical Vocabulary of Curriculum Engineering
41. Understanding of the Foundations of the Curriculum
42. Application of the Foundations of the Curriculum in Curriculum Engineering
43. Curriculum Designing
44. Curriculum Structuring
45. Implementation of Various Curriculum Models
46. Curriculum Evaluation

G. Learning Outcome Management (For DO):
47. Understanding of the Revitalized Basic Education Curriculum
48. Curriculum Review
49. Curriculum Implementation Planning (Curriculum Indigenization)
50. Instructional Materials Development
51. Instructional Supervision and Management
52. Student/Pupil Assessment/Testing
53. Intervention Programming
54. Education Programs Management/Project Management
55. Tracking Student Progress
56. Quality Management

5.3. The Development and Validation of the Organizational TDNA Tool

The Organizational TDNA was first developed and used by Regions VII and VIII and the Divisions of Bohol and Northern Samar during Stage One of Project STRIVE. The tool was further refined and validated in Region VI and in the Division of Negros Occidental during Stage Two of Project STRIVE.

Two parallel sets of Organizational TDNA Guides and Tools have been developed, one for the Region organizational assessment (Section 7.3) and the other for the Division (Section 7.4). Each set includes the following sections:

- Basis and Objectives of the Organizational TDNA
- Region/Division Office Section Respondents
- Assessment Approach
- Documents to be Used for the Organizational TDNA
- General Instructions for the TDNA-WG
- Focused Group Discussion Flow for the Region/Division Organizational TDNA Self-Assessment
- Focused Group Discussion Flow for the Region/Division External Assessment
- List of Management Competencies and Behavioral Indicators

5.4 Methodology

Respondents comprising at least 20% of the regional or division management and technical staff are convened to participate in the actual Organizational TDNA, drawn from each of the region’s/division’s functional divisions/sections, e.g. RD/ARD (regional assessment) or SDS/ASDS (division assessment), Elementary Division, Secondary Division, ALS Division, Planning, Accounting/Budget, Cashier, Medical/Dental, Administrative, Legal, and Supply/Physical Facilities. Representation from each section should include the section head.


The Organizational TDNA is done first as a “self-assessment” exercise, participated in by the section respondents through a Focused Group Discussion (FGD) technique and following the steps outlined in the Organizational TDNA Guide and Tools. The respondents of each section arrive at a consensual description of the region/division vis-à-vis the management competencies, using the scale provided.

Results of the “self-assessment” will form 60% of the total result of the organization’s TDNA. The Organizational TDNA is completed with an external assessment, which forms the remaining 40% of the total result. In the case of the external assessment of the Division, the Regional Office assesses the management competencies of the Division. For the external assessment of the Region, at least three divisions should assess the Region’s competency, with the results consolidated and the average forming the 40% of the overall assessment. Hence, each organization’s TDNA result is the consolidation of the “self-assessment” plus the assessment represented by the external measure. Specific steps in the administration and consolidation of results are detailed in the Organizational TDNA Guide and Tools.
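A short sketch of the 60/40 weighting described above follows. The competency names and consensual scores are hypothetical, and for a Region the external score would itself be the average of the assessments made by at least three Divisions.

```python
# Minimal sketch: combine the Organizational TDNA self-assessment (60%) with
# the external assessment (40%) for each management competency.
# Scores are hypothetical consensual ratings from the FGDs.
def combine(self_assessment, external_assessment):
    return {
        comp: round(0.6 * self_assessment[comp] + 0.4 * external_assessment[comp], 2)
        for comp in self_assessment
    }

self_scores = {"Strategic Planning": 3.0, "Records Management": 4.0}
external_scores = {"Strategic Planning": 2.0, "Records Management": 3.5}
print(combine(self_scores, external_scores))
# {'Strategic Planning': 2.6, 'Records Management': 3.8}
```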

A TDNA-WG, under the leadership of the T&D Team established by the region/division, is responsible for overseeing the Organizational TDNA process. The group is expected to make preliminary preparations, to facilitate the FGD, and to consolidate each section’s results as well as the overall Organizational TDNA results by following the procedures outlined in the Organizational TDNA Guide and Tools and through use of the electronic Organizational TDNA template.

The system design below shows the process for accomplishing the Organizational TDNA.

5.5 Monitoring and Evaluation of the Organizational TDNA

The TDNA-WG is tasked to monitor and evaluate the preparation, conduct and consolidation of the Organizational TDNA results using the developed M&E tools. It also prepares reports and develops recommendations for the improvement of the process, which inform Regional policy review and adjustment of the Organizational TDNA component of the TDNA system at the regional and division levels. The Organizational TDNA M&E tools are the following:

- T&D-M&E Form 1: Individual Profile Template
- Org’l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion (FGD) Process at the Region/Division Level
- Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template
- Org’l TDNA-M&E Form 2b: Region Organizational TDNA Scores Summary Template
- Org’l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template
- Org’l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template
- Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA

The tools can be found in the Organizational TDNA Guide and Tools in Section 7.3 (Regional TDNA) and 7.4 (Division TDNA).

Section 6.0: Monitoring of the TDNA System

The diagram (1.4) below shows the procedural design for the TDNA monitoring and evaluation at the Division and Regional levels. The personnel from the TDNA-WG responsible for M&E are tasked to monitor and evaluate the preparation, conduct and consolidation of the TDNA results. Tools to monitor the process by which the TDNA is administered and its results utilized have been developed and can be found in the appropriate TDNA Guide and Tools in Sections 7.1, 7.2, 7.3 and 7.4 within this Operations Manual.

At the Division or Regional level, the M&E personnel prepare the M&E report and inform the T&D Office or Unit of the findings. The T&D Office/Unit then develops recommendations for the improvement of the process, which inform regional policy review and the adjustment of the TDNA System.


Below is the General Framework containing the standards at the input, process, output, and outcome system levels covering the T&D operations for the TDNA System at the region, division and school levels.

T&D System Monitoring and Evaluation General Framework: T&D Needs Assessment (TDNA) System

Outcome

Regional Level Standards:
- Increased % of RO division/units participation in the organizational T&D process
- Systematic and continuous TDNA for the region/divisions
- Informative TDNA results that serve as basis for needs-based Regional planning for HR T&D

Division Level Standards:
- Increased % of DO division/units participation in the organizational T&D process
- Increased % of SHs who are assessed of T&D needs
- Systematic and continuous TDNA for the division/district and schools
- Informative TDNA results that serve as basis for needs-based Division planning for HR T&D

School Level Standards:
- Increased % of Teachers who are assessed of T&D needs
- Systematic and continuous NCBTS-TSNA for the teachers
- Informative TDNA results that serve as basis for needs-based School planning for T&D

Output

Regional Level Standards:
- Reliable and valid TDNA results for the Regional Organizational TDNA
- Regularly updated database identifying T&D Priority Service Areas/Competency Needs for the RO
- Complete and accurate consolidation/analysis and profile of all (or % of) Divisions’ Priority Needs for T&D

Division Level Standards:
- Reliable and valid TDNA results for the Division Organizational TDNA
- Regularly updated database identifying T&D Priority Service Areas/Competency Needs for the DO
- Complete and accurate consolidation/analysis and profile of all (or % of) the Division’s and Districts’ Priority Needs of Teachers (TSNA) and School Heads (TDNASH)

School Level Standards:
- Reliable and valid NCBTS-TSNA results
- Complete and accurate consolidation/analysis and profile of Teachers’ NCBTS-TSNA Priority Needs for T&D

Process

Regional Level Standards:
- Systematic and efficient conduct of the Region’s Organizational TDNA
- Well-documented M&E of the conduct of the Region/Division Organizational TDNA, Division management of the TDNASH and NCBTS-TSNA
- Relevant feedback provided to improve the TDNA process in the region and division

Division Level Standards:
- Systematic and efficient conduct of the Division’s Organizational TDNA and TDNASH
- Well-documented M&E of the conduct of the NCBTS-TSNA
- Relevant feedback provided to improve the TDNA process in the division and schools

School Level Standards:
- Systematic and efficient conduct of the NCBTS-TSNA

Input

Regional Level Standards:
- Competent and sufficient personnel of the Regional TDNA-WG
- Sufficient and proper representation of the different sections’ respondents of the organizational TDNA
- Available and relevant support resources: Reg-T&D Work Plan, TDNA tools/materials (Volumes 1 & 2), funds

Division Level Standards:
- Competent and sufficient personnel of the Division TDNA-WG
- Sufficient and proper representation of the different sections’ respondents of the organizational TDNA
- Available and relevant support resources: Div-T&D Work Plan, TDNA tools/materials (Volumes 1 & 2), funds

School Level Standards:
- Competent and sufficient personnel of the School TDNA-WG (NCBTS Coordinator)
- Available and relevant support resources: School T&D Work Plan, NCBTS-TSNA tools/materials (Volumes 1 & 2), funds

References:
Basic Education Reform Agenda PIP, 2006
Teacher Education Development Program, KRT2 Report, 2006
DepED-STRIVE Training and Development Systems Framework, Volume 1, 2008
BEAM (Basic Education Assistance for Mindanao), Teachers’ Professional Development Framework, 2006
INSET Mechanism Manual, SEDIP (Secondary Education Development Implementation Project), 2007
NEAP, School Leadership Experience Portfolio, 2006


Section 7.0: The TDNA System Guides and Tools

7.1: The National Competency Based Teacher Standards - Teachers’ Strengths and Needs Assessment (NCBTS-TSNA) Guide and Tools


Republic of the Philippines
Department of Education

The National Competency Based Teacher Standards Teachers’ Strengths and Needs Assessment

(NCBTS-TSNA)

Guide and Tools

DepED-EDPITAF-STRIVE
Training and Development

June 2010


This document, The NCBTS-TSNA Guide and Tools, was developed and validated in Regions VI, VII and VIII, Divisions of Negros Occidental, Bohol/Tagbilaran and Northern Samar, through the AusAID-funded project STRIVE (Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas), in coordination with the EDPITAF (Educational Development Project Implementing Task Force) and in consultation with the TEDP-TWG, NEAP and the Bureaus of the Department of Education.


TABLE OF CONTENTS

1. Introductory Information
   Basis of the NCBTS-TSNA
   The NCBTS-TSNA System Framework
   Purpose of the NCBTS-TSNA
   Expected Outputs
   The NCBTS and the KSAs Developed for the NCBTS-TSNA
2. Establishment of Regional and Division TDNA Working Groups
   TDNA-Working Group Roles and Responsibilities
   NCBTS-TSNA Structural Process Flow
   Roles and Responsibilities to Support the Orientation on the NCBTS-TSNA
   Criteria for the Selection of School NCBTS Coordinators
3. Orientation of Supervisors, School Heads and NCBTS Coordinators
4. School/Cluster NCBTS-TSNA Implementation
   The Electronic and Hard Copy Versions of the NCBTS-TSNA Tool
   Self-Administration of the NCBTS-TSNA Tool
   Interpretation of the NCBTS-TSNA Results
   Consolidation of the NCBTS-TSNA Results
5. Utilization of NCBTS-TSNA Results
6. Monitoring and Evaluation of the NCBTS-TSNA

Attachment 1: NCBTS-TSNA Tools and Templates
Attachment 2: M&E Tools for the NCBTS-TSNA


GLOSSARY OF ACRONYMS

AIP Annual Implementation Plan

BESRA Basic Education Sector Reform Agenda

CBTS Competency Based Teacher Standards

CO Central Office

COT Center of Training

DEDP Division Education Development Plan

DepED Department of Education

DO Division Office

EDPITAF Educational Development Project Implementing Task Force

EBEIS Enhanced Basic Education Information System

ES Education Supervisor

FGD Focus Group Discussion

ICT Information Communication Technology

ICT4E Information Communication Technology for Education

INSET In-Service Education and Training

IRR Implementing Rules and Regulations of RA 9155, December 2007

IPPD Individual Plan for Professional Development

KRT Key Results Thrust

KSA Knowledge, Skills and Attitudes

LAC Learning Action Cells

LRMDS Learning Resource Management and Development System

MOOE Maintenance and Other Operating Expenses

MPPD Master Plan for Professional Development

M&E Monitoring and Evaluation

NCBTS National Competency-Based Teacher Standards

PDP Professional Development Planning

PDRD Program Designing and Resource Development

PDy Program Delivery

PSDS Public School District Supervisor

RA 9155 Republic Act 9155: Governance Act for Basic Education, 11 Aug 2001

REDP Regional Education Development Plan

RO Regional Office

SBM School-Based Management

SH School Head

SIP School Improvement Plan

SLE Structured Learning Episode


SLEP School Leadership Experience Portfolio

SPPD School Plan for Professional Development

STRIVE Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas

T&D Training and Development

TDIS Training and Development Information System

TEC Teacher Education Council

TEDP Teacher Education Development Program

TEI Teacher Education Institute

TDNA Training and Development Needs Assessment

TSNA Teachers Strengths and Needs Assessment

UIS Unified Information System

WG Working Group


National Competency-Based Teacher Standards - Teachers’ Strengths and Needs Assessment (NCBTS-TSNA)

1. Introductory Information

Basis of the NCBTS-TSNA

The Department of Education is presently pursuing a package of policy reforms that seeks to improve the quality of basic education. These policy reforms are expected to create the critical changes necessary to further accelerate, broaden, deepen and sustain the improved education effort already started. This package of policy reforms is referred to as the Basic Education Sector Reform Agenda (BESRA).

One key element in the reform agenda is the establishment of the National Competency-Based Teacher Standards (NCBTS). This is a framework that identifies the competency standards for teacher performance so that teachers, learners and stakeholders are able to appreciate the complex set of behaviors, attitudes and skills that each teacher must possess in order to carry out the satisfactory performance of their roles and responsibilities.

In response to the need for an instrument that identifies the professional strengths and development needs of the teachers, the NCBTS-TSNA was developed and validated through the AusAID-funded project Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas (STRIVE). This initiative was undertaken in coordination with the Educational Development Project Implementing Task Force (EDPITAF) and Regions VI, VII and VIII, Divisions of Negros Occidental, Bohol/Tagbilaran and Northern Samar, and further validated by the Teacher Education Development Program-Technical Working Group (TEDP-TWG) at the national level.

The NCBTS-TSNA System Framework

The NCBTS-TSNA adopts the TDNA System Framework. The process determines the differences between the actual situation (what is) and the desired condition (what should be) in terms of teacher professional competencies. In the NCBTS-TSNA, the actual situation is described by the current competencies as perceived by the teacher. The profile of the teacher’s current competencies is compared to the NCBTS standards for effective teaching. This NCBTS-TSNA, therefore, identifies both the competency strengths and needs as a result of determining the difference between the expected and the current teacher’s competencies. These competencies are translated in terms of Knowledge, Skills, and Attitudes (KSAs) that actually define the domains, strands and performance indicators of the NCBTS.

As in the TDNA Framework, the NCBTS-TSNA involves three essential stages of strengths and needs analysis: Phase I (Job Analysis for Effective Performance) is actually done by analyzing nationally set teacher standards in behavioral terms or by identifying effective teaching competencies. The DepED Central Office and Regional Offices are tasked to do this phase of the TSNA process. Phase II (Individual Training Needs Analysis) is the instrumentation to determine the current teacher competency levels in KSA terms which is done by the individual teacher at the school level. Phase III (Strengths-Needs analysis) is the analysis of the discrepancies between the standards set and the current teachers’ data on their competencies. Minimal discrepancies indicate strengths while big discrepancies indicate learning needs. The consolidation of results is carried out at the school, cluster, District, Division or Region level for their respective purposes related to identifying teacher training and development needs.
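
As a purely illustrative complement to the three phases above, the sketch below expresses the Phase III strengths-needs analysis in code: the gap between the standard set in Phase I and the current level gathered in Phase II is computed for each competency. The competency labels, levels and threshold are hypothetical examples, not values taken from the NCBTS-TSNA tool.

```python
# Illustrative sketch of the Phase III strengths-needs analysis: the discrepancy
# between the expected (standard) level and the current self-assessed level of
# each competency is computed; minimal gaps indicate strengths, large gaps
# indicate learning needs. All labels, levels and the threshold are hypothetical.

expected = {"KSA 1.1.1": 4, "KSA 1.1.2": 4, "KSA 2.1.1": 4}   # standards from Phase I
current  = {"KSA 1.1.1": 4, "KSA 1.1.2": 2, "KSA 2.1.1": 3}   # self-assessment from Phase II

NEED_THRESHOLD = 2  # a gap at or above this value is treated as a learning need here

strengths, needs = [], []
for ksa, standard in expected.items():
    gap = standard - current[ksa]
    (needs if gap >= NEED_THRESHOLD else strengths).append((ksa, gap))

print("Strengths:", strengths)       # minimal discrepancies
print("Learning needs:", needs)      # large discrepancies
```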

An important aspect of the NCBTS-TSNA process is the utilization of its results that will serve as inputs in the preparation of the Individual Plan for Professional Development (IPPD) and in designing programs and activities for teachers at the school, district and division levels. The consolidated NCBTS-TSNA results at the school, division and regional levels inform the School Improvement Plan (SIP), the Division Education Development Plan (DEDP) and the Regional Education Development Plan (REDP), with respect to the plans for professional development at the school, division and regional levels.

When established, the NCBTS-TSNA system ensures that “teachers routinely use CBTS in making self-assessments of their current practices to identify their individual development needs, and that school heads, division and regional offices also routinely use CBTS in identifying teacher performance factors that affect school-wide learning outcomes” (BESRA PIP, 2006 Version, PIP V.1, p. 21).

The framework is illustrated below.

[Framework diagram: Phase I, Job Analysis for Effective Performance (Competency Analysis of the KSAs required and competency standards); Phase II, Individual Training Needs Analysis (instrumentation and data gathering of current KSAs and competency); Phase III, Strengths-Needs Analysis (competency assessment yielding competency strengths and learning needs). Consolidated TDNA results feed the Trainee’s IPPD, the SPPD/Div-MPPD/Reg-MPPD, and the SIP/DEDP/REDP.]

Purpose of the NCBTS-TSNA

To realize the provision of quality Professional Development of Teachers, the NCBTS-TSNA is conducted to gather data on the competency strengths and needs of teachers that serve to inform the design and conduct of continuing training and development programs for the improvement of teaching-learning practice.

Specifically, the NCBTS-TSNA intends to:

1. Determine the competency strengths and learning needs in terms of KSAs of individual teachers vis-à-vis the standards set by the NCBTS in the seven domains

2. Consolidate the NCBTS-TSNA results at the school, district, division, and region levels

Expected Outputs


Based on the purpose stated above, the NCBTS-TSNA is expected to yield the following specific outputs:

A. At the individual level:

An Individual Teacher Summary of NCBTS-TSNA Results indicating the strengths and learning needs in each of the seven domains and 23 strands.

B. At the school level:

Consolidated NCBTS-TSNA results that reflect the general strengths and learning needs of the teachers in the school

C. At the cluster/district /division/region level:

Consolidated NCBTS-TSNA results of participating school teachers in a given cluster/district/division/region.

The NCBTS and the KSAs Developed for the NCBTS-TSNA

The NCBTS-TSNA tool is anchored on the NCBTS Framework set by the Department of Education. This contains seven integrated domains for effective teaching which are: Domain 1–Social Regard for Learning; Domain 2–Learning Environment; Domain 3–Diversity of Learners; Domain 4–Curriculum; Domain 5– Planning, Assessing and Reporting; Domain 6–Community Linkages; and Domain 7–Personal Growth and Professional Development. Each domain has its corresponding strands and each strand has performance indicators. A total of seven domains, 23 strands and 80 performance indicators make up the NCBTS competency standards set by the DepED.

The domains, strands and performance indicators were translated to specific KSAs to compose the NCBTS-TSNA Tool with 270 KSAs in the various clusters as described in the table below:

DOMAINS                                          STRANDS   PERFORMANCE INDICATORS   KSAs
Domain 1: Social Regard for Learning                2                5               18
Domain 2: Learning Environment                      5               17               59
Domain 3: Diversity of Learners                     1                8               27
Domain 4: Curriculum                                7               22               78
Domain 5: Planning, Assessing and Reporting         4               12               40
Domain 6: Community Linkages                        1                6               18
Domain 7: Personal and Professional Growth          3               10               30
Total (7 Domains)                                  23               80              270

The NCBTS-TSNA Tool content and methodology were validated by various groups at different levels across Regions VI, VII, and VIII and at the Central Office level. The validation process involved the following:

1) Preliminary Content Validation: The validation group included the STRIVE2 Project Component Team composed of 34 educators with 4 Regional Supervisors, 6 Division Supervisors, 22 Principals, 1 Administrative Officer V (former HRMO3), and 1 District ALS Coordinator. The process reduced the original 375 items to 260 items.

2) Region, and Division Level Content Validation: Six Regional Education Division Chiefs; 16 Division Supervisors; 13 District Supervisors; 27 School Heads, 27 Elementary School Master Teachers and 27 High School Master Teachers were selected to review the tool for content and language used. They submitted their comments and marginal notes for the refinement of the tool.


3) Field Process Validation: Sixty in-service teachers were asked to respond to two versions of the tool. Thirty (30) teachers used the manual version and 30 teachers used the electronic version. Results showed that it was more efficient to complete the electronic version. The time spent in accomplishing the versions was recorded to be, on average, two hours for the manual version and one hour for the electronic version. Refinements were made to the electronic tool related to the programming of results per domain and strand.

4) Validation by six of the STRIVE project’s Technical Advisers: the TDNA and SBM Adviser, the National and International T&D Advisers, the National and International ICT Advisers and the SBM-QAAF Adviser. There was a recommendation, submitted by the LRMDS Advisers, to include items specific to ICT competencies. The ICT4E standards were studied and ten items were added to the four original items to compose the ICT “domain”. This brought the total number of items to 270.

5) Experts’ Validation at the Central Office Level: The TEDP group, which was responsible for formulating the NCBTS, was consulted to review the tool. The group included: the Director of the Teacher Education Council (TEC), a Professor and former Vice-President for Academics of West Visayas State University, the Associate College Dean of Arts and Sciences of the University of the Philippines, the College Dean of Centro Escolar University and National President of PAFTE, a School Head and the President of NAPSSHI, a School Head and President of PESPA, and a SPED specialist and Assistant Chief of the Bureau of Elementary Education. Together with the T&D Team and the ICT and T&D Technical Advisers, they reviewed the manual and thoroughly inspected the 270 items, item by item. As a result, further refinements were incorporated into the Tool. Additionally, the TEDP expressed appreciation for the developed NCBTS tool and for the addition of a set of 14 items that composed an “ICT domain”.

6) Presentation of the Guide and Tools to a national group of teacher educators: Comments and points for refinement were gathered from the participants of the First National Conference of Centers of Training Institutions, attended by Heads and Deans of 82 Teacher Education Institutions (TEIs), including a few RDs, ARDs and SDSs, held at the Development Academy of the Philippines, Tagaytay City. Points considered for the improvement of the Guide and Tool were the inclusion of the PSDSs as co-responsible with the School Heads for the administration of the NCBTS Tool to teachers in their clusters, the inclusion of an item for guided reflection as a competency, and a reconsideration of the length of the Tool, among others.

7) Preparation of the NCBTS-TSNA Orientation Package: In the course of doing the steps mentioned above, there was a clear recognition that teachers must have an adequate understanding of the NCBTS Framework and the standard competencies that are expected from them before the NCBTS needs assessment process is done. To address this need, the T&D Team developed a resource package that aimed to orient the implementers such as the ES, PSDS and School Heads and NCBTS Coordinators on the BESRA and the NCBTS. The NCBTS-TSNA Orientation Package, which consists of a series of Structured Learning Episodes (SLEs), was also to be conducted to teachers prior to the initial administration of the NCBTS-TSNA Tool.

8) Process Try-out of the NCBTS-TSNA Guide and Tools, including the NCBTS-TSNA Orientation Package: The NCBTS-TSNA system and procedures were tried out in a one-school sample that involved all the teachers and the School Head of Tabalong National High School, Dauis, Division of Bohol. The content and processes of conducting the SLEs, the tool administration, scoring, individual and school consolidation profiling, and the M&E mechanisms were tried with 33 teachers. Refinements were made following the try-out based on the observations of the T&D Team and feedback from the teacher respondents. The Pilot Version of the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package was then prepared for a bigger sample of schools.

9) Division Pilot-Test of the NCBTS-TSNA Guide and Tools, including the NCBTS-TSNA Orientation Package: The pilot-testing of the NCBTS-TSNA system using the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package was done in the 300 pilot schools in the Divisions of Bohol, Negros Occidental and Northern Samar. This expanded to include the six hundred twenty (620) non-pilot schools in the Division of Negros Occidental. Technical reports were developed to document the process and were the basis for the finalization of the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package. These were turned over to the Central Office, which further validated the Guide and Tools and the Orientation Package in six regions outside the STRIVE sites.

10) Finalization of the NCBTS-TSNA Guide and Tools and the NCBTS-TSNA Orientation Package: Based on the national validation conducted by the TEDP-TWG in Luzon and Mindanao, further revisions were made, such as the renaming of the package to NCBTS-TSNA (Teachers Strengths and Needs Assessment), the clustering of 10 ICT items under a new strand (4.7), and the addition of a performance indicator on ICT (4.7.1). Ten of the 14 ICT items found in different strands in the original tool were selected and finally classified under this performance indicator and ICT strand.

2. Establishment of Regional and Division TDNA Working Groups

The Regional and Division TDNA Working Groups (TDNA-WGs) may be convened to take responsibility for the management of the NCBTS-TSNA process. The Regional TDNA-WG members, preferably Education Supervisors representing the elementary and secondary levels, are designated by the Regional Director. The Division TDNA-WG is organized by the Schools Division Superintendent (SDS). Both Regional and Division TDNA-WG members are chosen based on their experience in assessment and in the training of teachers. Chairs and Co-Chairs may be assigned to lead the working groups.

TDNA-WG Roles and Responsibilities

The TDNA-WGs have overall responsibility for the management of the NCBTS-TSNA process. They should be familiar with the process for orienting groups of School Heads and their NCBTS Coordinators at the cluster/district level on the conduct of the NCBTS-TSNA, and ensure they are then able to administer the tools to the teachers at the school level. The TDNA-WG members are expected to play a key role in the preparation, administration, monitoring, data consolidation and reporting of the results of the NCBTS-TSNA. The general flow of processes related to the NCBTS-TSNA across the Regional, Division/District and School levels is shown in the diagram below.


NCBTS-TSNA Structural Process Flow

Region Level:
- RD issues a memo to Divisions commencing the NCBTS-TSNA, specifying among others the structure, functions, general process, resources and responsibilities
- RO designates/instructs the Regional TDNA-WG to commence NCBTS-TSNA activities
- Reg and Div TDNA-WGs convene the Facilitators/Trainers Team and conduct a walkthrough of the NCBTS-TSNA Orientation Package*
- TDNA-WG implements the cluster lead and district school level implementation for School Heads and NCBTS Coordinators
- Reg TDNA-WG meets with the Div TDNA-WG to orient and plan activities for cluster and school level implementation
- Reg TDNA-WG monitors and evaluates the Div TDNA-WG implementation of the NCBTS-TSNA
- RO utilizes NCBTS-TSNA results to inform the MPPD and REDP

Division Level:
- SDS instructs the TDNA-WG to commence the NCBTS-TSNA for teachers, specifying among others the structure, functions, general process, resources and responsibilities
- Div TDNA-WG meets with the Reg TDNA-WG and starts orientation and preparatory activities for the NCBTS-TSNA

School Level:
- School Head and NCBTS Coordinator conduct the NCBTS orientation* and the NCBTS-TSNA for all teachers
- Teachers make the NCBTS-TSNA self-assessment in hard copy or electronic format
- Teachers accomplish the NCBTS-TSNA Summary of Results
- Teachers identify their priority training needs
- SH and NCBTS Coordinator consolidate the NCBTS-TSNA results and submit the report to the District Supervisor and Division TDNA-WG

Division/District Level:
- Div TDNA-WG monitors cluster/district implementation
- TDNA-WG consolidates NCBTS-TSNA results and reports to the SDS
- NCBTS-TSNA results are utilized for the MPPD and DEDP re staff development
- SDS submits the NCBTS-TSNA report to the RD

* Note: The Orientation Package only needs to be delivered when first introducing the NCBTS-TSNA.

Roles and Responsibilities to Support the Orientation on the NCBTS-TSNA

Regional Supervisors
- To support and monitor the Division in the implementation of the NCBTS-TSNA Orientation and the administration of the NCBTS-TSNA Tool

District/Division Supervisors
- To attend an orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To provide assistance to Lead School Heads in coordinating clusters to meet to undertake an orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To assist in generating resources for the Division NCBTS-TSNA activities
- To actively support School Heads and NCBTS Coordinators in the conduct of the orientation of all teachers regarding the NCBTS
- To actively support School Heads and NCBTS Coordinators in the administration of the NCBTS-TSNA Tool to all teachers
- To support the consolidation of the results of the teacher NCBTS-TSNA at the school level and the incorporation of findings into School Improvement Plans (SIP) and School Plans for Professional Development (SPPD)
- To consolidate district NCBTS-TSNA results and incorporate findings into plans for District level training
- To monitor and evaluate the conduct of the orientation and administration of the NCBTS-TSNA
- To give feedback and recommendations on the conduct of the orientation and administration of the NCBTS-TSNA

Lead School Heads
- To orient all School Heads and NCBTS Coordinators within their cluster to the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To assist School Heads in identifying/organizing different working groups for the school level orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To support schools within their clusters with resources for the NCBTS-TSNA Orientation and the electronic consolidation of NCBTS-TSNA results

School Heads
- To attend an orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To identify/organize different working groups for the orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To orient all teaching staff within their school on the National Competency-Based Teacher Standards
- To administer the NCBTS-TSNA Tool to all teachers within their school
- To consolidate the school level NCBTS-TSNA results electronically
- To identify the strengths and needs of teachers based on the NCBTS-TSNA results
- To submit consolidated NCBTS-TSNA Results to the District Supervisor and the Division TDNA-WG

NCBTS Coordinators
- To attend an orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment
- To assist the School Head to orient all teaching staff within their school on the National Competency-Based Teacher Standards
- To assist the School Head in the administration of the NCBTS-TSNA to all teachers within their school
- To assist in the electronic consolidation of the school level NCBTS-TSNA results

Teachers
- To attend an orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment Tool
- To read and reflect on every item of the NCBTS-TSNA Tool
- To answer every item based on an honest assessment of oneself to inform future professional development activities
- To complete a Teachers Strengths and Needs Assessment, including the development of an Individual Teacher Summary of NCBTS-TSNA Results
- To use the results in developing an Individual Plan for Professional Development (IPPD)

Criteria for the Selection of School NCBTS Coordinators

The orientation on the National Competency-Based Teacher Standards and the Teachers Strengths and Needs Assessment to teachers in all schools is the responsibility of the School Heads and their respective NCBTS School Coordinators. The designation of the NCBTS School Coordinator is at the discretion of the School Head taking into consideration the following criteria.


NCBTS Coordinator should have:

- at least 3 years of teaching experience;
- knowledge and experience in facilitating training activities;
- computer skills in Word, Excel and PowerPoint;
- good communication and leadership skills;
- willingness and commitment in completing NCBTS-related tasks to support on-going teacher development.

3. Orientation of Supervisors, School Heads and NCBTS School Coordinators

The principle of school-based management empowers School Heads to provide instructional leadership. In order to support teachers, therefore, School Heads must be aware of the NCBTS framework that defines the concept of effective teaching. In addition, School Heads can only effectively support the professional development of teachers when they have first-hand information about the training and development needs of teachers. School Heads and Schools Supervisors need to be oriented so that they are knowledgeable about the NCBTS and the features of the NCBTS-TSNA tool, its proper administration and the utilization of its results, if they are to provide effective instructional leadership.

The Division TDNA-WG is responsible for the orientation of all School Supervisors and School Heads with their respective NCBTS School Coordinators on the NCBTS and the TSNA. The NCBTS-TSNA Orientation Package serves as a guide and resource for the introduction of the NCBTS and the initial administration of the self-assessment tool. The package with accompanying resource materials is designed for knowledge building and advocacy on the NCBTS and for the transfer of the technology to conduct NCBTS-TSNA to the School Heads. The package can also be used to provide teacher beneficiaries with a deep understanding of the NCBTS and its relationship to in-service teachers’ professional development and to introduce the NCBTS-TSNA tool, with consideration of the tool’s proper administration with teachers and how results can be utilized.

The NCBTS-TSNA Orientation Package aims to:
- enhance implementers’ understanding of BESRA and the significance of teacher development in achieving its goals;
- introduce the NCBTS and its relevance to teacher in-service development;
- deepen implementers’ understanding of the seven domains, strands and performance indicators of the NCBTS;
- introduce the NCBTS-TSNA Tool and consider guidelines for its implementation;
- prepare School Heads, NCBTS Coordinators and Supervisors to implement the NCBTS-TSNA with teachers.

The NCBTS-TSNA Orientation Package consists of resource materials for the conduct of the orientation on the NCBTS for those responsible for implementing the NCBTS-TSNA (e.g. School Heads, NCBTS Coordinators and Supervisors) as well as for teachers. There are five Structured Learning Episodes (SLEs). Each SLE sets out the specific key understandings to be developed, the specific learning objectives to be achieved, the recommended duration, and a detailed description of the procedural flow of the session, e.g. the core activities to be delivered. Accompanying PowerPoint presentations and handouts are included in the package.

The content of the SLEs and corresponding elements are summarized in the matrix below:

SLE 0: NCBTS-TSNA Orientation Overview

Objectives:
1. Understand the objectives of the 3-day orientation
2. Define and share expectations of the 3-day orientation
3. Identify the house rules
4. Appreciate the value of sharing ideas

Key Understandings:
- An understanding of the objectives of the NCBTS-TSNA orientation will ensure participants are aware of the purpose of the training.
- Agreed learning expectations will ensure participants and facilitators are working towards the same goal.
- A set of “House Rules” will support the smooth conduct of the training.

Support Materials:
- Handouts: Program Objectives and Schedule: Orientation on the NCBTS-TSNA
- PowerPoint presentation

SLE 1: KRT 2 and BESRA

Objectives:
1. Discuss the initiatives being implemented through BESRA
2. Understand how the KRTs of BESRA lead to the attainment of EFA goals and quality education
3. Understand how one’s school vision can be attained through strong support for BESRA’s initiatives, particularly towards quality teacher development

Key Understandings:
- The Basic Education Sector Reform Agenda (BESRA) is a package of policy reforms expected to create the critical changes necessary to accelerate, broaden, deepen and sustain the improved education effort already started by the Department of Education (DepED).
- To achieve the desired educational outcomes for all Filipinos, BESRA focuses on specific policy actions within five Key Reform Thrusts (KRTs), including KRT 2: Teacher Development.
- As the instructional leader of the school, the School Head should support, encourage and motivate the teachers’ continuous professional development.

Support Materials:
- Handout on BESRA
- PowerPoint presentation

SLE 2: NCBTS – A Guide for all Filipino Teachers

Objectives:
1. Explain the framework, structure and features of the NCBTS
2. Explain the significance and importance of the NCBTS and its target users
3. Discuss the use of the NCBTS and how it can help teachers improve teaching and learning
4. Appreciate the value of the NCBTS

Key Understandings:
- The NCBTS is an integrated theoretical framework that defines the different dimensions of effective teaching, where effective teaching means being able to help all types of students achieve the various learning goals in the curriculum.
- The NCBTS provides a clear guide for all teacher development programs and projects from the school level up to the national level.
- Various stakeholders and institutions use the NCBTS in their roles to maintain quality education through effective teaching and learning.

Support Materials:
- Handout on the NCBTS
- PowerPoint presentation

SLE 3: The NCBTS Components and Structure: A Closer Look at the Domains

Objectives:
1. Define the different domains of the NCBTS
2. Classify indicators and strands according to domains
3. Appreciate the value of the NCBTS in teachers’ development

Key Understandings:
- The NCBTS defines seven domains within which teachers can develop professionally. The seven domains are closely connected to each other in very meaningful ways and are best understood as constituting an integrated whole.
- The seven domains can be classified into two broad categories, the first of which can be further divided into two sub-categories:
  1. Domains that relate to the teacher as a facilitator of learning (Domains 2 to 6)
     1.1 Domains on teaching practices related to the technical aspect of the teaching and learning processes (Domains 3, 4 and 5)
     1.2 Domains on teaching practices that embed the learning process in an appropriate context (Domains 2 and 6)
  2. Domains that relate to the teacher as a learner (Domains 1 and 7)

Support Materials:
- Handouts: List of NCBTS Domains, Strands and Performance Indicators
- PowerPoint presentation

SLE 4: The Administration of the NCBTS-TSNA to Teachers

Objectives:
1. Be familiar with the NCBTS-TSNA Tool and the guidelines for its administration
2. Practice the administration and profiling of the NCBTS-TSNA Tool (hard copy and electronic versions)
3. Appreciate the importance of the NCBTS-TSNA Tool in planning and designing professional development interventions and delivery modes for teachers

Key Understandings:
- The NCBTS-TSNA tool is anchored on the NCBTS Framework set by the Department of Education. This contains the seven integrated domains for effective teaching.
- The domains, strands and performance indicators are translated to specific Knowledge, Skills, and Attitudes (KSAs) to compose the NCBTS-TSNA Tool with 270 KSAs.
- District Supervisors, School Heads and their respective NCBTS Coordinators become the “Implementers” of the NCBTS-TSNA across the schools within their clusters.
- The NCBTS-TSNA Tool is available in an electronic format with an auto-scoring system or in hard copy with a separate answer sheet and Individual Teacher’s NCBTS-TSNA Results Summary Template.
- The Individual Teacher’s NCBTS-TSNA Results Summary Template is used for the development of the teacher’s IPPD.

Support Materials:
- Handouts: NCBTS-TSNA Tool with Answer Sheets and Individual Teacher’s NCBTS-TSNA Results Summary Template
- Teacher Profile
- School Consolidation Template
- Electronic version of the TSNA Tool
- TDNA Consolidation Database
- PowerPoint presentation

SLE 5 (for School Heads): Action Planning for NCBTS-TSNA Administration

Objectives:
1. Make a plan for cluster and school-based implementation of the NCBTS-TSNA
2. Positively accept suggestions made on the proposed action plan

Key Understandings:
- An Action Plan will support the implementation of the NCBTS-TSNA at the cluster and school level.
- An action plan should be complete and doable.
- An action plan should suit the cluster/school setting.
- An action plan should be done collaboratively.

Support Materials:
- Action Plan proforma
- PowerPoint presentation
- Participant Evaluation Proforma

The package can be accessed through the Training and Development Information System (TDIS), which is an element of the Enhanced Basic Education Information System (EBEIS) at http://beis.deped.gov.ph/, or through the Learning Resource Management and Development System (LRMDS) Portal at http://lrmds.deped.gov.ph/

4. School/Cluster NCBTS-TSNA Implementation

Schools within the Division are expected to form clusters. Each cluster should designate a Leader School. Leader School Heads and their respective NCBTS Coordinators become the “Implementers” of the NCBTS-TSNA across the schools within their clusters. School Heads from within each cluster are convened to go through parallel knowledge building and to conduct the NCBTS-TSNA for their own teachers. The District Supervisors will take the role of guiding and monitoring the NCBTS-TSNA orientation and administration procedures within the cluster or district.

The Electronic and Hard Copy Versions of the NCBTS-TSNA Tool

The NCBTS-TSNA Tool is a self-assessment procedure that is introduced by the School Head/NCBTS Coordinator through an orientation process in order for the teacher-respondents to see its importance and thus reflectively respond to the tool. The NCBTS-TSNA Tool is available in an electronic format with an auto-scoring system, or in a hard copy with a separate scoring and results summary template. If the electronic version is used, each teacher responds to the NCBTS-TSNA tool from a file installed on a common computer in the school. It takes approximately one-and-a-half hours to accomplish the instrument, although no time limit should be imposed. The scores and individual profile of the teacher in the seven domains with the corresponding strands are electronically generated instantly upon completion of the instrument. All schools are encouraged to use the e-version of the tool for easy profiling and consolidation.


Schools that have no access to the technology required for use of the e-version of the NCBTS-TSNA tool, or where teachers are not computer literate, may use the hard copy version for implementation. The hard copy version takes approximately 2 hours to accomplish plus one hour for scoring and developing an individual teacher results summary. The hard copy version can be found in Attachment 1 along with the separate Answer Sheet, a Teacher Profile and an Individual Teacher’s Summary of NCBTS-TSNA Results Template.

To support school, district and division level consolidation of the NCBTS-TSNA results, it is recommended that the School Head, with the assistance of the NCBTS Coordinator, ensure that all hard copies of teachers’ NCBTS-TSNA results are entered into the electronic version of the tool.

It should be noted that DepED Central has distributed the NCBTS Tool Kit for the TSNA and IPPD which contains the Teacher’s Profile, the NCBTS-TSNA Tool, the Answer Sheet and the Individual Results Template to all regions and divisions with the expectation that all teachers would be provided with a copy.

Self-Administration of the NCBTS-TSNA Tool

The NCBTS-TSNA Responses

The instrument contains clusters of KSAs specific to a particular performance indicator with a common stem: “At what level do I…” Considering that the NCBTS-TSNA tool is intended for self-assessment and not for performance ratings, the responses to the items are expressed qualitatively, i.e. High (H), Satisfactory (S), Fair (F), and Low (L). However, quantitative data are easier to interpret and rely upon for decisions; thus, in the response analysis, a numerical equivalent is assigned to each descriptor: H - 4; S - 3; F - 2; L - 1.

The reference codes presented below guide the respondent in registering her/his self-assessment for each KSA:

Code of Competency Level    Qualitative Description
H (High)                    I am very competent in the KSA and this is not my priority training need.
S (Satisfactory)            I am competent in the KSA but I would benefit from further training.
F (Fair)                    I am fairly competent in the KSA but need further training.
L (Low)                     I have low competence in the KSA and require urgent training.

Upon completion of the instrument, the respondent using the electronic version can automatically generate his/her Summary of NCBTS-TSNA Results. The Summary of NCBTS-TSNA Results Template is in printable format and each teacher is advised to print a copy for her/his own record. An interpretation of the results for each domain and strand is also provided.

For the hard copy version, the steps for scoring and summarizing the results are:

A. Scoring is completed on the individual answer sheet.

1. Get the equivalent score for each KSA using the numerical equivalent for the descriptor of the response e.g. H - 4; S - 3; F - 2; L - 1. Do this per cluster of KSAs in a particular performance indicator (e.g. for Box 1.1.1 or Box 1.1.2 and so on)

2. Compute the sum of the scores for each cluster of KSAs. Write the sub-scores on the cells beside each cluster in the last column.


3. Compute the sum of all the scores within each Strand (S). Write the sub-score on the cell beside each strand.

4. Compute the sum of all the S’s within each Domain. Write the sub-scores on the cell beside each Domain.

B. Transfer all the sub-scores for the Strands and Domains to the Individual Teachers Summary of NCBTS-TSNA Results Template. Use the row for Raw Score.

1. Compute the percentage score for each of the sub-scores by dividing the raw score by the highest possible score (HPS) and expressing the result as a percentage (see the illustrative sketch after these steps).

2. Plot the percentage scores on the bar graph provided in the template.
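
As an illustration of the manual scoring procedure above, the sketch below converts sample responses to their numerical equivalents, sums them per strand and per domain, and expresses each sub-score as a percentage of the highest possible score. The sample responses and the strand/domain grouping are hypothetical; only the H/S/F/L equivalents come from the text.

```python
# Illustrative sketch of the hard-copy scoring steps described above.
# Responses are converted to numerical equivalents (H=4, S=3, F=2, L=1),
# summed per strand and per domain, and expressed as percentage scores
# (raw score divided by the highest possible score). The sample responses
# and the strand/domain grouping are hypothetical.

SCORE = {"H": 4, "S": 3, "F": 2, "L": 1}

responses = {
    "Strand 1.1": ["H", "S", "S", "F"],   # hypothetical responses for one strand
    "Strand 1.2": ["S", "L", "F"],
}

domain_raw, domain_hps = 0, 0
for strand, answers in responses.items():
    raw = sum(SCORE[a] for a in answers)   # strand raw score
    hps = 4 * len(answers)                 # highest possible score for the strand
    print(f"{strand}: {raw}/{hps} = {100 * raw / hps:.1f}%")
    domain_raw += raw
    domain_hps += hps

print(f"Domain 1: {domain_raw}/{domain_hps} = {100 * domain_raw / domain_hps:.1f}%")
```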

Interpretation of the NCBTS-TSNA Results

Upon completion of the NCBTS-TSNA Tool, an obtained score, whether an average for a domain or for a strand, is interpreted using the appropriate indices in the chart below.

Description of the Level of Teaching Competence (referred to as the Teacher Professional Development Index in the DepED NCBTS-TSNA Primer):

Scale Scores 3.51 - 4.00 (Percentage Scores 87.51 - 100%): Expert
Very competent and can support other teachers’ improvement. The teacher has almost all the competencies for effective teaching at a high level. These are the identified strengths. Strengths have to be sustained and enhanced; however, professional development needs have to be continuously addressed.*

Scale Scores 2.51 - 3.50 (Percentage Scores 62.51 - 87.50%): Experienced
Competent in the KSA but would benefit from further training and development. The teacher has the majority of the competencies at a high level for effective teaching. Strengths have to be enhanced. Training and development needs have to be addressed.*

Scale Scores 1.51 - 2.50 (Percentage Scores 37.51 - 62.50%): Developing
Fairly competent in the KSA and needs further training and development. The teacher has an average number of the competencies at a high level for effective teaching. These strengths have to be enhanced; however, training needs have to be addressed as a priority.*

Scale Scores 1.00 - 1.50 (Percentage Scores 25.00 - 37.50%): Beginning
Lacking competence in the KSA and requires urgent training and development. The teacher has very few of the competencies at a high level for effective teaching. Training needs have to be given priority and addressed urgently.*


*Description used in the DepED NCBTS-TSNA Primer and NCBTS Toolkit
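
The sketch below shows one way the percentage ranges in the chart could be applied to an obtained score. It is illustrative only; the electronic tool's auto-scoring already performs this interpretation, and the sample scores are hypothetical.

```python
# Illustrative mapping of a percentage score to the level of teaching competence,
# using the percentage ranges from the chart above. Sample scores are hypothetical.

def competence_level(percentage):
    if percentage > 87.50:
        return "Expert"
    if percentage > 62.50:
        return "Experienced"
    if percentage > 37.50:
        return "Developing"
    return "Beginning"

for score in (92.0, 75.0, 40.0, 30.0):
    print(score, "->", competence_level(score))   # Expert, Experienced, Developing, Beginning
```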

Consolidation of NCBTS-TSNA Results

As a component of the Training and Development Information System (TDIS) a database was developed to support the consolidation of the NCBTS-TSNA results. The TDNA Consolidation Database allows schools to upload their electronic versions of the accomplished NCBTS-TSNA tool and automatically generate individual teacher and school level results. An individual summary result as well as a school profile can be generated identifying a single teacher’s or a school’s strengths and priority training and development needs according to the NCBTS domains and strands. Data can be analyzed and used to inform the teacher’s development of an Individual Professional Development Plan (IPPD) and the School Plan for Professional Development (SPPD).

Similarly, the database can be used to support the consolidation and analysis of NCBTS-TSNA results at the district, division and regional level.

The TDNA Database at the school and district level is a stand-alone database that does not require access to the internet. The database can be obtained from the Division along with an accompanying TDNA Consolidation Database Manual. The TDNA database that supports the Division and Regional consolidation of data is linked to the web-based TDIS and can be accessed through the EBEIS at http://beis.deped.gov.ph/

School Heads, supported by their NCBTS Coordinators, are responsible for the management of the database at the school level. The Division will be responsible for ensuring that all School Heads are trained in how to manage and operate the database. The main responsibility at the school level will be to ensure all NCBTS-TSNA tools accomplished by teachers are in the electronic format, i.e. any NCBTS-TSNA tool manually accomplished is re-entered into the electronic version of the tool.

Electronic files from all schools will need to be submitted to the District/Division to support District, Division and Regional consolidation.

A template for manually consolidated school level NCBTS-TSNA results can be found in Attachment 1.
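
As an illustration only, the sketch below consolidates individual teachers' domain percentage scores into a simple school profile and flags the lowest-scoring domains as priority needs. The data structure and domain labels are hypothetical and do not describe the actual TDNA Consolidation Database.

```python
# Illustrative sketch of a school-level consolidation: teachers' domain percentage
# scores are averaged into a school profile, and the lowest-scoring domains are
# flagged as the school's priority training and development needs.
# All values and labels are hypothetical.

teacher_results = [
    {"Domain 1": 82.0, "Domain 2": 64.0, "Domain 3": 55.0},
    {"Domain 1": 76.0, "Domain 2": 58.0, "Domain 3": 61.0},
    {"Domain 1": 90.0, "Domain 2": 70.0, "Domain 3": 49.0},
]

domains = teacher_results[0].keys()
school_profile = {
    d: round(sum(t[d] for t in teacher_results) / len(teacher_results), 1)
    for d in domains
}

# Domains with the lowest average scores become the priority needs.
priorities = sorted(school_profile, key=school_profile.get)

print("School profile:", school_profile)
print("Priority needs (lowest first):", priorities)
```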

5. Utilization of NCBTS-TSNA Results

Individual Results

The Individual Teacher’s Summary of NCBTS-TSNA Results is used for the development of the teacher’s Individual Plan for Professional Development (IPPD). The identified learning needs therein are appraised by the teacher while taking into consideration the priorities set by the school for its future development. It is important that teachers develop themselves in order to contribute towards addressing the most urgent needs and the priorities identified by the school. The IPPD is therefore prepared by the teachers to identify their training needs in line with their own priorities and those of the school. A separate document has been developed detailing the concepts and procedures related to the preparation of IPPDs.

School Level Consolidated NCBTS-TSNA Results

Consolidated NCBTS-TSNA results of all teachers from a school should be used to identify both the strengths and needs of individual teachers and of the school as a whole. Teachers with particular strengths in a domain or strand can become resource persons, coaches or mentors for other teachers who need further development in the same domain/strand. Common priority needs identified by groups of teachers should inform school planning activities such as the development of School Plans for Professional Development (SPPDs) and the SIP/AIP.


District Level Consolidated NCBTS-TSNA Results

District consolidated NCBTS-TSNA results should be analyzed by Supervisors so that technical assistance can be targeted to the identified needs of individuals as well as groups of teachers. Common needs of teachers across schools can become the focus for District-led training and development programs.

Division Level Consolidated NCBTS-TSNA Results

An analysis of Division level NCBTS-TSNA results can support the identification of common training and development needs of teachers across the division. This information can then inform the type of training and development activities that are conducted by the Division to support improved teaching and learning and be incorporated into Division MPPDs. Results should inform Training of Trainer programs, programs for teachers, as well as the type of technical assistance Division Supervisors provide to teachers.

Region Level Consolidated NCBTS-TSNA Results

While the Region does not normally provide training directly to teachers, an analysis of the consolidated NCBTS-TSNA results can be used to inform the region on the type of technical assistance they need to provide to the Divisions. Results should be analyzed when developing Regional MPPDs to provide direction on the type of training programs and resources that the region should be providing Division personnel. This is done so they are able to assist teachers in improving teaching and learning practices.

6. Monitoring and Evaluation of the NCBTS-TSNA

Monitoring and evaluation (M&E) activities are vital in ensuring that program implementation adheres to the standards set. In carrying out these activities, M&E instruments are indispensable, and the processes relating to the use of these instruments are equally important.

Inasmuch as valid data must be collected during the administration of the NCBTS-TSNA, preparation by the M&E implementers is necessary. The different M&E tools are intended to support this preparation and assist in the collection of different types of information, such as the overall quality of the delivery of the NCBTS-TSNA orientation, the qualities of the NCBTS Coordinators, and the adherence to standards during the NCBTS-TSNA implementation. The M&E tools include:

T&D-M&E Form 1: Individual Profile Template
NCBTS-M&E Form 1: Teacher’s Profile for NCBTS-TSNA
NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist and Consolidation Template
NCBTS-M&E Form 4: Trainer’s Assessment of NCBTS Orientation Workshop and Consolidation Template
NCBTS-M&E Form 5: Trainee’s End of F3 Program Assessment and Consolidation Template
NCBTS-M&E Form 6: Documentation Tool for the Conduct of Cluster or School Level NCBTS-TSNA Implementation
NCBTS-M&E Form 7: School’s NCBTS-TSNA Consolidation Template

A description of how the M&E tools are to be used is outlined below. The M&E tools can be found in Attachment 2.

What will be monitored: NCBTS Implementers' details in relation to their current position, their level of experience and qualification
How it will be monitored: All NCBTS Implementers will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to their involvement in the NCBTS-TSNA process
How will the results be used: Results will be analyzed to ensure NCBTS Implementers have the required KSAs. Results will be entered into the TDIS.

What will be monitored: Teachers' details in relation to their current position, their level of experience and qualification
How it will be monitored: All teachers will be asked to complete the profile
M&E tool to be used: NCBTS-M&E Form 1: Teacher Profile for NCBTS-TSNA
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the NCBTS-TSNA Tool
How will the results be used: Results will be entered into the TDIS database along with their corresponding NCBTS-TSNA results.

What will be monitored: Implementation of the NCBTS-TSNA Orientation Package in relation to the processes followed
How it will be monitored: A Process Observer will be assigned to complete a Learning Process Observation for each session
M&E tool to be used: NCBTS-M&E Form 2: Learning Process Observation
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: During the NCBTS orientation workshop
How will the results be used: Results will be discussed with individual Trainers to identify strengths and areas for improvement during debriefing sessions. Recommendations based on an analysis of the results should be included in the Program Completion Report.

What will be monitored: The competency of the NCBTS Coordinators in relation to the criteria set for the role
How it will be monitored: A TDNA-WG member will be assigned to observe the NCBTS Coordinator during the orientation process
M&E tool to be used: NCBTS-M&E Form 3: NCBTS Coordinator's Checklist
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: During the NCBTS orientation workshop
How will the results be used: Results will be discussed with individual NCBTS Coordinators to identify strengths and areas for improvement. Results will be used to inform future decisions regarding the criteria and process for selecting NCBTS Coordinators. Recommendations based on an analysis of the results should be included in the Program Completion Report.

What will be monitored: The overall effectiveness of the workshop as delivered by the whole Team
How it will be monitored: Each of the trainers will be asked to make an assessment of the orientation
M&E tool to be used: NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: Upon completion of the NCBTS orientation workshop
How will the results be used: Results will be collated and analyzed by the TDNA-WG. A summary of the results will be included in the Program Completion Report and will inform future training.

What will be monitored: Participants' perception of the training in relation to the overall quality of the training, the usefulness of the training, their ability to implement the content of the training, and the strengths and weaknesses of the training
How it will be monitored: All participants will be asked to complete the Trainee's End of F3 Program Assessment Form
M&E tool to be used: NCBTS-M&E Form 5: Trainee's End of F3 Program Assessment Form
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Upon completion of the NCBTS-TSNA orientation workshop
How will the results be used: Participants' evaluations will be collated by the TDNA-WG and the results analyzed. A summary of the results will be included in the Program Completion Report and will inform future training.

What will be monitored: The implementation of the NCBTS-TSNA Orientation at the division, cluster and school level
How it will be monitored: A Process Observer will be identified and asked to complete the tool
M&E tool to be used: NCBTS-M&E Form 6: Documentation Tool for the Conduct of Cluster or School Level NCBTS-TSNA Implementation
Who will be responsible for the monitoring: Region and Division TDNA-WG
When will the monitoring take place: During the NCBTS-TSNA Orientation Workshop at the Division, Cluster or School Level
How will the results be used: Results will be discussed with the Implementers to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future training.

What will be monitored: The priority training needs of teachers
How it will be monitored: The NCBTS Coordinator and the School Head will consolidate the results from the administration of the NCBTS-TSNA tool
M&E tool to be used: NCBTS-M&E Form 7: School's NCBTS-TSNA Consolidation Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: After the accomplishment of the NCBTS-TSNA tool
How will the results be used: Results will be used to inform school and division plans for professional development. Results will be submitted to the Division.


ATTACHMENT ONE: NCBTS-TSNA Tools & Templates
- The NCBTS-TSNA Tool and Answer Sheet
- Individual Teacher NCBTS-TSNA Results Summary Template, with sample
- School NCBTS-TSNA Consolidation, with sample

- E-Version of the NCBTS-TSNA Tool


The National Competency Based Teacher Standards Teachers’ Strengths and Needs Assessment

(NCBTS-TSNA)

Tool and Answer Sheet

June 2010 DepED-EDPITAF-STRIVE


NCBTS-TEACHERS STRENGTHS AND NEEDS ASSESSMENT (TSNA)


PURPOSE:

Teachers, like all professionals, need to be provided with opportunities to improve their knowledge, skills and attitudes through well-planned programs for professional development. This NCBTS-TSNA tool is designed to help you determine your professional development and training needs as a schoolteacher. The NCBTS-TSNA is a self-assessment exercise that allows you to reflect on your current competencies vis-à-vis the National Competency-Based Teacher Standards (NCBTS) set by the Department of Education.

It should be clear that the data from this activity will NOT be used for performance evaluation; thus, you should feel confident about providing accurate information. Your careful and honest manner when accomplishing the NCBTS-TSNA will be of direct benefit to you. The data, when analyzed, will help you chart your own professional development plan and will inform program planners in designing training programs and development activities for the benefit of teachers in your school, cluster, division and region.

General Directions for Accomplishing the NCBTS-TSNA:

1. Be sure you have participated in an orientation program on the NCBTS before you accomplish the NCBTS-TSNA. You should also have accomplished the Teacher's Profile before you start responding to this NCBTS-TSNA Tool.

2. Quickly scan the instrument. Note that it contains clusters of knowledge, skills and attitudes (KSAs) specific to the particular Performance Indicators within each Strand and Domain of the NCBTS.

3. The KSAs have a common stem: “At what level do I…”. Respond to each item with the code that best represents a true assessment of yourself. Each code is interpreted in the chart below.

Code of Competency Level / Qualitative Description

H - (High): I am very competent in the KSA and this is not my priority training need.
S - (Satisfactory): I am competent in the KSA but I would benefit from further training.
F - (Fair): I am fairly competent in the KSA but need further training.
L - (Low): I have low competence in the KSA and require urgent training.

4. On the Separate Answer Sheet, the codes have been placed in four columns. Tick the column of the code that best represents your self-assessment for each item.

Please do not leave any item unanswered.
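Note that the Individual Teacher Results Summary Template later in this attachment works on a four-point scale (the Highest Possible Score for a strand is 4 x its number of KSAs). The short sketch below is an illustration only; the code-to-point mapping (H = 4, S = 3, F = 2, L = 1) and the helper name item_points are assumptions consistent with that grid, not an official formula from this manual.

    # Hypothetical illustration: convert a ticked answer-sheet code into points.
    # Assumption: H=4, S=3, F=2, L=1, consistent with HPS = 4 x number of KSAs.
    CODE_POINTS = {"H": 4, "S": 3, "F": 2, "L": 1}

    def item_points(code: str) -> int:
        """Return the points recorded for one NCBTS-TSNA item."""
        return CODE_POINTS[code.upper()]

    print(item_points("S"))  # a tick under S records 3 points for that item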


NCBTS DOMAIN 1. SOCIAL REGARD FOR LEARNING

STRAND 1.1 TEACHER'S ACTIONS DEMONSTRATE VALUE FOR LEARNING

INDICATOR 1.1.1. Implements school policies and procedures.

At what level do I…    

1 know school policies and procedures?  

2 understand school operations?  

3 implement policies and procedures?  

4 communicate policies and procedures to students, parents and other concerned persons?

5 abide by the school policies and procedures?

INDICATOR 1.1.2. Demonstrates punctuality.

At what level do I…    

6 possess awareness on the implementation of "time on task" in all responsibilities ? 7 demonstrate punctuality in accomplishing expected tasks and functions? 8 model the value of punctuality?  

INDICATOR 1.1.3. Maintains appropriate appearance.

At what level do I…    

9 know decorum, i.e. dress code, behavior of teachers?
10 practice decorum on all occasions?

11 value decorum expected of teachers?  

INDICATOR 1.1.4. Is careful about the effect of one's behavior on students.

At what level do I…    

12 understand the theoretical concepts and principles of social learning?
13 show appropriate behavior even during unguarded moments?
14 apply knowledge on social learning in dealing with students?
15 consider the influence my behavior has on students?

STRAND 1.2 DEMONSTRATES THAT LEARNING IS OF DIFFERENT KINDS AND FROM DIFFERENT SOURCES

INDICATOR 1.2.1 Makes use of various learning experiences and resources.

At what level do I…    

16 know a range of sources through which social learning may be experienced?
17 use information from a variety of sources for learning (e.g. family, church, other sectors of the community)?

18 appreciate that students learn through a range of different social experiences?

NCBTS DOMAIN 2. LEARNING ENVIRONMENT

STRAND 2.1 CREATES AN ENVIRONMENT THAT PROMOTES FAIRNESS    

INDICATOR 2.1.1 Maintains a learning environment of courtesy and respect for different learners (e.g. ability, culture, gender)

At what level do I…
19 understand the dynamics of teaching learners from diverse backgrounds (e.g. ability, culture, family background & gender)?
20 maintain a learning environment that promotes courtesy and respect for all learners?
21 show courtesy and respect to everyone at all times?

INDICATOR 2.1.2 Provides gender-fair opportunities for learning.
At what level do I…

22 understand the objectives, principles and strategies for Gender and Development (GAD)?


23 provide gender-fair learning opportunities?

24 uphold gender sensitivity in my daily dealings with learners and others?    

INDICATOR 2.1.3 Recognizes that every learner has strengths.

At what level do I…    

25 understand the psychological foundations of learners' growth and development?
26 know about potentialities and uniqueness of individual learners?
27 provide learning activities that allow all learners to reach their full potential?
28 recognize learners' individual potentials and strengths?

STRAND 2.2 MAKES THE CLASSROOM ENVIRONMENT SAFE AND CONDUCIVE TO LEARNING

INDICATOR 2.2.1 Maintains a safe and orderly classroom free from distractions

At what level do I…    

29 know the principles of classroom management, room structuring, and safety measures?
30 maintain a safe, clean and orderly classroom free from distractions?

31 show concern for a safe and conducive learning environment?    

INDICATOR 2.2.2 Arranges challenging activities in a given physical environment.

At what level do I…  

32 know various challenging activities that can be adapted in any given physical environment? 33 conduct challenging learning activities despite physical environment constraints?    

34 show enthusiasm to conduct learning activities at any given situation?

INDICATOR 2.2.3 Uses individual and cooperative learning activities to improve capacities of learners for higher learning.

At what level do I…    

35 understand the importance and dynamics of both individual and cooperative learning?
36 know varied strategies for individual and cooperative learning?
37 balance the use of individual and cooperative learning activities?
38 see the value in creating individual and cooperative learning activities?

STRAND 2.3 COMMUNICATES HIGHER LEARNING EXPECTATIONS TO EACH LEARNER

INDICATOR 2.3.1 Encourages learners to ask questions.

At what level do I…    

39 know the art of questioning and different techniques of asking higher order questions?
40 provide opportunities for learners to ask questions?
41 ask questions that stimulate critical and creative thinking among learners?
42 show an accepting response/gesture in dealing with questions of learners?

INDICATOR 2.3.2 Provides learners with a variety of learning experiences

At what level do I…    

43 know various strategies that elevate students' level of learning?
44 provide learners with a variety of experiences that enhance learning?
45 willingly provide learners with a variety of challenging learning activities?

INDICATOR 2.3.3 Provides varied enrichment activities to nurture the desire for further learning.

At what level do I…    

46 understand how enrichment activities enhance the learners' desire to learn?
47 know ways of motivating the learners to learn further and more effectively?
48 facilitate varied enrichment activities that are interesting for further learning?


49 show diligence in making enrichment materials?

INDICATOR 2.3.4 Communicates and maintains high standards of learning performance.

At what level do I…    

50 know the implications of achieving high standards of learning for total human development?
51 help learners maintain high standards of learning?
52 inspire learners to set high performance targets for themselves?

STRAND 2.4 ESTABLISHES AND MAINTAINS CONSISTENT STANDARDS OF LEARNERS' BEHAVIOR

INDICATOR 2.4.1 Handles behavior problems quickly and with due respect to children's rights.

At what level do I…    

53 understand the rights and responsibilities of the child as embodied in different laws, e.g. RA 7610,PD 603?

54 know behavior management techniques for learners with behavioral problems?

55 identify learners with behavioral problems? 56 employ appropriate procedures and actions consistently when dealing with learners with behavioral problems?

57 show a compassionate and caring attitude in managing behavior problems?

INDICATOR 2.4.2 Gives timely feedback to reinforce appropriate learners' behavior.

At what level do I…    

58 know the concept, importance, and techniques of social reinforcement?
59 provide timely and appropriate reinforcement on learners' behavior?
60 believe that positive reinforcement leads to improved learner behavior?

INDICATOR 2.4.3 Guides individual learners requiring development of appropriate social and learning behavior.

At what level do I…    

61 understand the learners' social developmental stages? 62 know different strategies that enhance learners' social development ? 63 use varied teaching-learning strategies that encourage social interaction?

64 show patience in managing different social and learning activities?

INDICATOR 2.4.4 Communicates and enforces school policies and procedures for appropriate learner behavior.

At what level do I…    

65 know DepED/school policies and procedures on student discipline?
66 communicate and enforce policies and procedures related to students' behavior?

67 commit to enforcing school policies and procedures?

STRAND 2.5 CREATES A HEALTHY PSYCHOLOGICAL CLIMATE FOR LEARNING

INDICATOR 2.5.1 Encourages free expression of ideas from students.

At what level do I…    

68 know the concepts and principles of democratic expression of ideas?
69 provide activities that will encourage respect and free expression of ideas?

70 encourage learners to express their ideas freely and responsibly?


INDICATOR 2.5.2 Creates stress-free environment

At what level do I…
71 know the elements and importance of establishing a stress-free learning environment?
72 manage conflicts and other stress-related situations?

73 initiate and create programs (e.g. child-friendly school system) and activities that promote stress-free environment?

74 get involved in advocacy activities that create a stress-free environment?

INDICATOR 2.5.3 Takes measures to minimize anxiety and fear of the teacher and/or subject.

At what level do I…
75 know about child-friendly teaching strategies?
76 encourage learners to develop a positive attitude towards their subject and teacher?
77 let my students feel they are accepted?

NCBTS DOMAIN 3. DIVERSITY OF LEARNERS

STRAND 3.1 DETERMINES, UNDERSTANDS AND ACCEPTS THE LEARNERS' DIVERSE BACKGROUND KNOWLEDGE AND EXPERIENCE

INDICATOR 3.1.1 Obtains information on the learning styles, multiple intelligences and needs of learners

At what level do I…
78 understand the theories and concepts of multiple intelligences and learning styles?
79 identify learning styles and multiple intelligences of learners?

80 show diligence in obtaining information on different learning needs?

INDICATOR 3.1.2 Designs or selects learning experiences suited to different kinds of learners

At what level do I…  

81 know techniques and strategies in designing/selecting activities for varied types of learners?
82 utilize varied activities for various types of learners?

83 show respect and concern for individual differences of students?

INDICATOR 3.1.3 Establishes goals that define appropriate expectations for all learners

At what level do I…    

84 understand the requirements in setting goals for differentiated learning?
85 utilize differentiated activities to meet expected learning goals of learners?
86 assist learners in setting learning goals for themselves?
87 appreciate the need to consider the differences in experiences and capabilities of learners?

INDICATOR 3.1.4 Paces lessons appropriate to needs and difficulties of learners

At what level do I…    

88 know teaching principles and strategies for addressing learners' needs and difficulties? 89 pace lessons according to learners' needs and difficulties? 90 show flexibility in pacing lessons to support the needs of the learners?

INDICATOR 3.1.5 Initiates other learning approaches for learners whose needs have not been met by usual approaches

At what level do I…    

91 have the knowledge on teaching principles and strategies for students-at-risk?
92 keep track of students-at-risk?


93 provide appropriate intervention programs for learners-at-risk?
94 appreciate the need to help students-at-risk?

INDICATOR 3.1.6 Recognizes multi-cultural background of learners when providing learning opportunities

At what level do I…  

95 know the cultural background of my students and its implications to my teaching? 96 provide appropriate learning activities to students with different cultural background? 97 show appreciation for cultural diversities?

INDICATOR 3.1.7 Adopts strategies to address needs of differently-abled students

At what level do I…  

98 know the educational psychology of learners with special needs?
99 use appropriate strategies for learners with special needs?

100 show sensitivity to learners with special needs?

INDICATOR 3.1.8 Makes appropriate adjustments for learners of different socio-economic backgrounds

At what level do I…    

101 understand the effects of socio-economic status on learning performance? 102 determine the different socio-economic background of learners? 103 use techniques to motivate learners of the lower socio-economic status? 104 show fairness to all learners regardless of their socio-economic status?

NCBTS DOMAIN 4. CURRICULUM  

STRAND 4.1 DEMONSTRATES MASTERY OF THE SUBJECT

INDICATOR 4.1.1 Delivers accurate and updated content knowledge using appropriate methodologies, approaches and strategies

At what level do I…    

105 have updated knowledge in content and teaching strategies in my subject area?
106 apply the updated content and appropriate strategies in my teaching?
107 commit to deliver accurate and updated content knowledge?

INDICATOR 4.1.2. Integrates language, literacy and quantitative skill development and values in his/her subject area

At what level do I…    

108 have knowledge about multi-disciplinary integrative modes and techniques of teaching?
109 use multi-disciplinary integrative modes and techniques of teaching the subject area?
110 support the integration of language, literacy, skill development and values in the learning activities?

INDICATOR 4.1.3. Explains learning goals, instructional procedures and content clearly and accurately to students

At what level do I…    

111 possess in-depth understanding of the subject area's learning goals, instructional procedures and content based on curriculum ?

112 explain learning goals, concepts and processes clearly and accurately to learners?
113 give sufficient time to explain the lessons for clear understanding of the learners?

INDICATOR 4.1.4. Links the current content with past and future lessons

At what level do I…    

114 understand interrelation of topics/content within the subject area taught?

115 link the present subject matter content with the past and future lessons?

116 value the need to relate prior knowledge of learners with the present and future lessons?


INDICATOR 4.1.5. Aligns with lesson objectives the teaching methods, learning activities and instructional materials or resources appropriate to learners

At what level do I…    

117 have the knowledge in designing lessons with congruent objectives, teaching methods, learning activities and materials?

118 teach lessons that have congruency of objectives, procedure, materials and evaluation?

119 appreciate the value of aligning objectives with all the parts of a lesson?

INDICATOR 4.1.6. Creates situations that encourage learners to use high order thinking skills

At what level do I…    

120 understand the concept of critical thinking and the facets of understanding?

121 engage learners in activities that develop higher order thinking skills?

122 patiently motivate learners to develop higher order thinking skills?

INDICATOR 4.1.7 Engages and sustains learners' interests in the subject by making content meaningful and relevant to them

At what level do I…    

123 know strategies and materials that promote authentic learning?

124 apply various appropriate strategies and /or technology to motivate and sustain learning?

125 believe in relating classroom learning to real world experiences?

INDICATOR 4.1.8. Integrates relevant scholarly works and ideas to enrich the lesson as needed

At what level do I…    

126 update myself with relevant scholarly works and ideas related to my subject area?

127 integrate scholarly works and ideas to enrich the lesson for the learners?

128 show enthusiasm and openness to new learning?

INDICATOR 4.1.9. Integrates content of subject area with other disciplines

At what level do I…
129 know about other disciplines related to the subject I am teaching?

130 integrate content of subject area with other disciplines?

131 appreciate integrative mode of teaching?

STRAND 4.2 COMMUNICATES CLEAR LEARNING GOALS FOR THE LESSONS THAT ARE APPROPRIATE FOR LEARNERS

INDICATOR 4.2.1 Sets appropriate learning goals

At what level do I…
132 know the learning goals vis-à-vis specific subject content of the level I am teaching?

133 set doable and appropriate daily learning goals for the learners?

134 reflectively choose appropriate learning goals?

INDICATOR 4.2.2 Understands the learning goals

At what level do I…
135 understand the connection of the short-term goals to the long-term goals of learning?
136 practice relating short-term goals to long term goals for learning?
137 value the learning goals set in the curriculum?


STRAND 4.3 MAKE GOOD USE OF ALLOTTED INSTRUCTIONAL TIME

INDICATOR 4.3.1. Establishes routines and procedures to maximize instructional time

At what level do I…
138 understand the principles and procedure of maximizing instructional time?

139 apply techniques of "time on task" in planning and delivering lessons?

140 observe discipline on time management?

INDICATOR 4.3.2. Plans lessons to fit within available instructional time

At what level do I…
141 know the principles and techniques of lesson planning considering the allotted instructional time?

142 design parts of the lesson within available instructional time?

143 show efficiency in the use of time to effectively attain learning goals?

STRAND 4.4 SELECTS TEACHING METHODS, LEARNING ACTIVITIES AND THE INSTRUCTIONAL MATERIALS OR RESOURCES APPROPRIATE TO THE LEARNERS AND ALIGNED TO OBJECTIVES OF THE LESSON

INDICATOR 4.4.1 Translate learning competencies to instructional objectives.

At what level do I…    

144 know the learning competencies in my learning areas in order to formulate appropriate instructional objectives?

145 translate learning competencies into instructional objectives?
146 show a reflective attitude in translating learning competencies to instructional objectives?

INDICATOR 4.4.2 Selects, prepares, and utilizes technology and other instructional materials appropriate to the learners and the learning objectives.

At what level do I…    

147 know various technology and instructional materials appropriate for my learning area?
148 select and utilize updated and appropriate technology/instructional materials?
149 use appropriate technology resources to achieve curriculum standards and objectives?
150 prepare adequate and appropriate instructional materials for the learners and the learning objectives?
151 manifest resourcefulness in preparing instructional materials?

INDICATOR 4.4.3 Provides activities and uses materials which fit the learners' learning styles, goals and culture.

At what level do I…    

152 know the principles of instructional material preparation for different types of learners?
153 use relevant activities and materials suited to the learning styles, goals and culture of the learners?
154 believe in the need to provide activities and use materials appropriate to the learners?

INDICATOR 4.4.4 Uses a variety of teaching approaches and techniques appropriate to the subject matter and the learners.

At what level do I…
155 understand the theories, approaches and strategies in teaching the subject area?
156 use a variety of teaching strategies and techniques appropriate to the learners and subject matter?
157 show enthusiasm in using innovative and appropriate teaching techniques?

INDICATOR 4.4.5 Utilizes information derived from assessment to improve teaching and learning.

At what level do I…
158 understand the proper utilization of assessment results to improve teaching and learning?
159 use assessment results in setting learning objectives and learning activities?

160 appreciate the value of assessment in improving teaching and learning?

INDICATOR 4.4.6 Provides activities and uses materials which involve students in meaningful learning.

At what level do I…    

161 know various educational theories (e.g. constructivism) and their implications for meaningful learning?
162 apply relevant teaching approaches to achieve meaningful learning?
163 use improvised and indigenous materials for meaningful learning?
164 appreciate teaching approaches to meaningful learning (e.g., constructivism)?

STRAND 4.5 RECOGNIZES GENERAL LEARNING PROCESSES AS WELL AS UNIQUE PROCESSES OF INDIVIDUAL LEARNERS

INDICATOR 4.5.1 Designs and utilizes teaching methods that take into account the learning process.

At what level do I…    

165 know different teaching approaches and strategies suitable to various learners?
166 have knowledge on general and specific learning processes?
167 apply teaching-learning methodologies that respond to general and specific learning processes?
168 recognize the need to design teaching methods appropriate to the learning process?

STRAND 4.6 PROMOTES PURPOSIVE STUDY

INDICATOR 4.6.1 Cultivates good study habits through appropriate activities and projects.
At what level do I…

169 know the techniques in forming good study habits?
170 determine the current study habits of my students?
171 provide appropriate learning tasks and projects that support development of good study habits?
172 take extra time to help students form good habits?

STRAND 4.7 Demonstrates skills in the use of ICT in teaching and learning

INDICATOR 4.7.1 Utilizes ICT to enhance teaching and learning.

At what level do I…
173 know the nature and operations of technology systems as they apply to teaching and learning?
174 understand how ICT-based instructional materials/learning resources support teaching and learning?
175 understand the process in planning and managing ICT-assisted instruction?
176 design/develop new or modify existing digital and/or non-digital learning resources?
177 use ICT resources for planning and designing teaching-learning activities?
178 use ICT tools to process assessment and evaluation data and report results?
179 demonstrate proficiency in the use of computers to support teaching and learning?
180 use ICT tools and resources to improve efficiency and professional practice?
181 value and practice social responsibility, ethical and legal use of ICT tools and resources?
182 show a positive attitude towards the use of ICT in keeping records of learners?

NCBTS DOMAIN 5. PLANNING, ASSESSING AND REPORTING

STRAND 5.1 DEVELOPS AND UTILIZES CREATIVE AND APPROPRIATE INSTRUCTIONAL PLAN.

INDICATOR 5.1.1 Shows proofs of instructional planning.

At what level do I…
183 know the elements and process of developing an instructional plan (e.g. daily, weekly, quarterly, yearly)?

184 arrange sequentially the learning units with reasonable time allotment?

185 identify appropriate learning objectives, strategies, and accompanying materials in the plan?

186 identify appropriate and varied assessment procedures?


187 show enthusiasm in sourcing materials (e.g. lesson plan packages) as guides for instructional planning?

INDICATOR 5.1.2 Implements instructional plan.     

At what level do I…    

188 know the factors for successful implementation of the instructional plan? 189 adjust the instructional plan to ensure attainment of objectives?

190 appreciate the value of instructional planning?

INDICATOR 5.1.3 Demonstrates ability to cope with varied teaching milieu.    

At what level do I…    

191 know the different teaching-learning situations that could affect the implementation of the instructional plans?

192 cope with varied teaching milieu/setting?

193 manifest openness to make necessary adjustments to improve the instructional plan?

STRAND 5.2 DEVELOPS AND USES A VARIETY OF APPROPRIATE ASSESSMENT STRATEGIES TO MONITOR AND EVALUATE LEARNING.

INDICATOR 5.2.1 Prepares formative and summative tests in line with the curriculum.

At what level do I…

194 know the principles and purposes of instructional assessment including formative and summative testing?

195 construct valid and reliable formative and summative tests?

196 appreciate the value of testing as a tool to improve instruction and learning performance?

INDICATOR 5.2.2 Employs non-traditional assessment techniques (portfolio, journals, rubrics, etc.).

At what level do I…    

197 know the concepts, principles and strategies of non-traditional assessment?
198 use appropriate non-traditional assessment techniques?

199 value the use of non-traditional assessment?

INDICATOR 5.2.3 Interprets and uses assessment results to improve teaching and learning.

At what level do I…    

200 know concepts, principles on interpretation and utilization of assessment results?
201 interpret and use test results to improve teaching and learning?

202 manifest fairness in the interpretation of test results?

INDICATOR 5.2.4 Identifies teaching -learning difficulties and possible causes and takes appropriate action to address them.

At what level do I…    

203 know the concept and principles of diagnostic testing? 

204 know the types of remedial lessons for slow learners? 

205 identify teaching-learning difficulties and possible causes?

206 manage remediation programs? 207 manifest willingness and patience in conducting remediation programs? 

INDICATOR 5.2.5 Uses tools for assessing authentic learning.

At what level do I…    

208 know the concepts and principles of authentic learning assessment? 

209 utilize appropriate tools for assessing authentic learning? 

210 enthusiastically develop and use tools for assessing authentic learning?


STRAND 5.3 MONITORS REGULARLY AND PROVIDES FEEDBACK ON LEARNERS' UNDERSTANDING OF CONTENT.

INDICATOR 5.3.1 Provides timely and accurate feedback to learners to encourage them to reflect on and monitor their own learning growth.

At what level do I…    

211 know the principles of giving and receiving feedback on learners' progress?
212 use strategies for giving feedback/reporting progress of individual learners?

213 motivate learners to reflect on and monitor their learning growth?

214 consistently provide timely and accurate feedback?

INDICATOR 5.3.2 Keeps accurate records of grades/performance levels of learners.

At what level do I…    

215 know the current guidelines about the grading system?
216 maintain accurate and updated learners' records?

STRAND 5.4 COMMUNICATES PROMPTLY AND CLEARLY TO LEARNERS, PARENTS AND SUPERIORS ABOUT LEARNERS PROGRESS.

INDICATOR 5.4.1 Conducts regular meetings with learners and parents to report learners' progress
At what level do I…

217 know the dynamics of communicating learners' progress to students, parents and other stakeholders?
218 plan and implement a comprehensive program to report learners' progress to students and parents?

219 manifest accountability and responsibility in communicating the learners' progress to intended stakeholders?

INDICATOR 5.4.2 Involves parents to participate in school activities that promote learning.
At what level do I…

220 understand the role and responsibilities of parents in supporting school programs to enhance children's learning progress?

221 involve parents to participate in school activities that promote their children's learning progress?
222 establish rapport and a cooperative working relationship with parents?

NCBTS DOMAIN 6. COMMUNITY LINKAGES

STRAND 6.1 ESTABLISHES LEARNING ENVIRONMENT THAT RESPOND TO THE ASPIRATION OF THE COMMUNITY

INDICATOR 6.1.1 Involves community in sharing accountability for learners' achievement.
At what level do I…

223 know the programs, projects, and thrusts of DepED on school-community partnership?
224 involve the community in the programs, projects and thrusts of the school?
225 promote shared accountability for the learners' achievement?

INDICATOR 6.1.2 Uses community resources (human, material) to support learning.

At what level do I…  

226 know the various community resources available to enhance learning?
227 use available community resources (human, material) to support learning?
228 recognize community resources to support learning?

INDICATOR 6.1.3 Uses the community as a laboratory for learning.

At what level do I…  

229 know strategies for experiential learning outside the classroom?
230 make use of the community as a laboratory for learning?
231 appreciate the world as a learning environment?

INDICATOR 6.1.4 Participates in community activities that promote learning.

At what level do I…  


232 know the teacher's social responsibility?
233 link with sectors for involvement in community work?
234 show enthusiasm in joining community activities?

INDICATOR 6.1.5 Uses community networks to publicize school events and achievements.
At what level do I…

235 know the dynamics of community networking and information dissemination?
236 communicate the school events/achievements through community networks?
237 share information on school events/achievements to the community?

INDICATOR 6.1.6 Encourages students to apply classroom learning to the community.

At what level do I…   

238 know the social realities outside the classroom to make learning relevant?
239 provide learning activities ensuring their application to the community?
240 show sensitivity to the needs of the community?

NCBTS DOMAIN 7. PERSONAL GROWTH AND PROFESSIONAL DEVELOPMENT

STRAND 7.1 TAKES PRIDE IN THE NOBILITY OF TEACHING AS A PROFESSION. 

  

INDICATOR 7.1.1 Maintains stature and behavior that upholds the dignity of teaching.  

At what level do I…  

241 know the set of ethical and moral principles, standards and values embodied in the Code of Ethics for Professional Teachers?

242 practice the Code of Ethics for Professional Teachers?
243 manifest the values that uphold the dignity of teaching?

INDICATOR 7.1.2 Allocates time for personal and professional development through participation in educational seminars and workshops, reading educational materials regularly and engaging in educational research.

At what level do I…    

244 know the requirements/expectations for personal and professional development of teachers?
245 prepare and implement an individual personal and professional development plan (IPPD)?
246 manifest zeal in undertaking educational research?

INDICATOR 7.1.3 Manifests personal qualities like enthusiasm, flexibility & caring attitude.

At what level do I…    

247 know the value concepts of enthusiasm, flexibility and caring attitude and strategies to enhance them? 248 engage in self-assessment to enhance my personal qualities? 249 exhibit personal qualities such as enthusiasm, flexibility and caring attitude?

INDICATOR 7.1.4 Articulates and demonstrates one's personal philosophy of teaching.

At what level do I…    

250 understand the value of having a personal philosophy of teaching?
251 translate my philosophy of teaching into action?
252 share my personal philosophy of teaching with others?

STRAND 7.2 BUILDS PROFESSIONAL LINKS WITH COLLEAGUES TO ENRICH TEACHING PRACTICE.

INDICATOR 7.2.1 Keeps abreast with recent developments in education.

At what level do I…    

253 update myself with recent developments in education?
254 apply updated knowledge to enrich teaching practice?
255 manifest openness to recent developments in education?


INDICATOR 7.2.2 Links with other institutions and organizations for sharing best practices.

At what level do I…    

256 know of institutions and organizations with a goal to improve teaching practice?
257 link with other institutions and organizations that are helpful to the teaching profession?
258 get involved in professional organizations and other agencies that can improve my teaching practice?

STRAND 7.3 REFLECTS ON THE LEVEL OF THE ATTAINMENT OF PROFESSIONAL DEVELOPMENT GOALS

INDICATOR 7.3.1 Reflects on the quality of his/her own teaching.

At what level do I…    

259 know the techniques and benefits derived from theory-guided introspection?
260 make a self-assessment of my teaching competencies?
261 desire to improve the quality of my teaching?

INDICATOR 7.3.2 Improves teaching performance based on feedback from the mentor, students, peers, superiors and others.

At what level do I…    

262 know the purposes and approaches in establishing an effective feedback system? 

263 actively seek feedback from a range of people to improve my teaching performance?
264 manifest a positive attitude towards comments/recommendations?

INDICATOR 7.3.3 Accepts personal accountability to learners' achievement and performance.

At what level do I…    

265 know my accountability and responsibilities toward students' learning performance?
266 examine myself vis-a-vis my accountability for the learners and to the teaching profession?
267 accept my personal accountability to the learners?

INDICATOR 7.3.4 Uses self-evaluation to recognize and enhance one's strength and correct one's weaknesses.

At what level do I…
268 know the concept and strategies for self-evaluation?

269 identify my strengths and weaknesses as a person and as a teacher? 

270 manifest determination to become a better person and teacher? 

END of SELF-ASSESSMENT

[Answer Sheet layout: items 1-270 are listed by Domain, Strand and Indicator in the same order as the Tool above, from the "START HERE" mark at Domain 1 (Social Regard for Learning) through Domain 7 (Personal Growth and Professional Development) and the closing note "END of SELF-ASSESSMENT". Beside each item number are four tick columns labelled H, S, F and L; the respondent ticks the column that matches his or her self-assessment for each item.]

Page 87: Volume 2_TDNA System Operations Manual July V2010

T&D System Operations Manual-Volume 2: The TDNA System

Domains D1 D1Tot

Domain 2 D2Tot

D3 D3Tot

Domain 4 D4Tot

Domain 5 D5Tot

D6 D6Tot

Domain 7 D7TotStrands S

1.1S

1.2S

2.1S

2.2S

2.3S

2.4S

2.5S

3.1S

4.1S

4.2S

4.3S

4.4S

4.5S

4.6S

4.7S

5.1S

5.2S

5.3S

5.4S

6.1S

7.1S

7.2S

7.3

No. of KSAs 15 3 18 10 10 14 15 10 59 27 27 27 6 6 21 4 4 10 78 11 17 6 6 40 18 18 12 6 12 30

HPS* 60 12 72 40 40 56 60 40 236 108 108 108 24 24 84 16 16 40 312 44 68 24 24 160 72 72 48 24 48 120

Raw Score% Score

Level % Score

100%

93.75%

87.50%

81.25%

75.00%

68.75%

62.50%

56.25%

50.00%

43.75%

37.50%

31.25%

25.00%

Domains D1 D1Tot

Domain 2 D2Tot

D3 D3Tot

Domain 4 D4Tot

Domain 5 D5Tot

D6 D6Tot

Domain 7 D7TotStrands S

1.1S1.2

S2.1

S2.2

S2.3

S2.4

S2.5

S3.1

S4.1

S4.2

S4.3

S4.4

S4.5

S4.6

S4.7

S5.1

S5.2

S5.3

S5.4

S 6.1

S 7.1

S 7.2

S7.3

*Highest Possible Score

Domains D1 D1 Domain 2 D2 D3 D3 Domain 4 D4 Domain 5 D5 D6 D6 Domain 7 D7

Section 7.1: NCBTS-TSNA Guide and Tools Page 82

INDIVIDUAL TEACHER NCBTS-TSNA RESULTS SUMMARY TEMPLATEEx

pert

Beg

inni

ngD

evel

opin

gEx

perie

nce

d

INDIVIDUAL TEACHER NCBTS-TSNA RESULTS SUMMARY TEMPLATE - SAMPLE

Strand / Domain Total     No. of KSAs   HPS*   Raw Score   % Score
S1.1                      15             60     45          75
S1.2                       3             12      8          67
Domain 1 Total            18             72     53          74
S2.1                      10             40     32          80
S2.2                      10             40     28          70
S2.3                      14             56     40          71
S2.4                      15             60     45          75
S2.5                      10             40     20          50
Domain 2 Total            59            236    165          70
S3.1 / Domain 3 Total     27            108     60          56
S4.1                      27            108     80          74
S4.2                       6             24     12          50
S4.3                       6             24     12          50
S4.4                      21             84     60          71
S4.5                       4             16      8          50
S4.6                       4             16      8          50
S4.7                      10             40     12          30
Domain 4 Total            78            312    192          62
S5.1                      11             44     20          45
S5.2                      17             68     20          29
S5.3                       6             24     12          50
S5.4                       6             24     12          50
Domain 5 Total            40            160     64          40
S6.1 / Domain 6 Total     18             72     40          56
S7.1                      12             48     39          81
S7.2                       6             24     16          67
S7.3                      12             48     30          62
Domain 7 Total            30            120     85          71
*Highest Possible Score

In the sample, each % Score is then plotted on the Level chart (the 25.00% to 100% scale banded into Beginning, Developing, Experienced and Expert) to show the teacher's relative strengths and priority training needs.
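The arithmetic behind the summary template can be sketched in a few lines. The sketch below is a minimal illustration only, not the official e-Version: it assumes the four-point coding noted earlier (H = 4, S = 3, F = 2, L = 1), and the function name strand_summary is illustrative. Raw Score is the sum of item points for a strand, HPS is 4 x the number of KSAs, and % Score is Raw Score divided by HPS times 100, rounded to a whole number, which reproduces the figures in the sample above.

    # Minimal sketch (assumed coding H=4, S=3, F=2, L=1; not the official e-Version).
    CODE_POINTS = {"H": 4, "S": 3, "F": 2, "L": 1}

    def strand_summary(responses, items):
        """responses: item number -> ticked code; items: item numbers in the strand."""
        raw = sum(CODE_POINTS[responses[i]] for i in items)
        hps = 4 * len(items)                 # Highest Possible Score for the strand
        pct = round(raw / hps * 100)         # % Score, as in the summary template
        return raw, hps, pct

    # Strand 1.2 covers items 16-18; ticking H, S and L gives 8 out of 12, i.e. 67%,
    # matching the S1.2 row of the sample (Raw Score 8, HPS 12, % Score 67).
    print(strand_summary({16: "H", 17: "S", 18: "L"}, [16, 17, 18]))  # (8, 12, 67)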


Name of School: ____________________________________   Division: _______________________

School NCBTS-TSNA Results

School Head _________________________________________

NCBTS Coordinator: __________________________________


SCHOOL NCBTS-TSNA CONSOLIDATION

The consolidation template has one row per Strand and per Domain Total, with the following columns: Domain/Strand No.; Teacher's Percentage Score (one column per teacher: T1, T2, T3, T4, …); Total; and Average Percentage.

Rows: 1.1, 1.2, Total Domain 1; 2.1, 2.2, 2.3, 2.4, 2.5, Total Domain 2; 3.1, Total Domain 3; 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 4.7, Total Domain 4; 5.1, 5.2, 5.3, 5.4, Total Domain 5; 6.1, Total Domain 6; 7.1, 7.2, 7.3, Total Domain 7.


School NCBTS-TSNA Results (SAMPLE)

School Head _________________________________________

NCBTS Coordinator: __________________________________


COMPETENCY No.    Corazon Estrada   Pedro Mendoza   Total     Average Percentage
1.1               87.52             75.62           163.14    163.14 ÷ 2 = 81.57
1.2               82.54             78.67           161.21    161.21 ÷ 2 = 80.60
Total Domain 1    170.06            154.29          324.35    324.35 ÷ 4 = 81.09
2.1               …
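The averaging used in the consolidation template is straightforward to automate. The sketch below is an illustration only (the helper name consolidate is an assumption, not part of the manual): each strand's school average is the total of the teachers' % Scores divided by the number of teachers, as in the sample rows above.

    # Minimal sketch: average each strand's % Scores across a school's teachers.
    def consolidate(strand_scores):
        """strand_scores: strand number (e.g. '1.1') -> list of teachers' % Scores."""
        return {strand: round(sum(scores) / len(scores), 2)
                for strand, scores in strand_scores.items()}

    # Two-teacher example using the figures from the sample above:
    school = {"1.1": [87.52, 75.62], "1.2": [82.54, 78.67]}
    print(consolidate(school))  # {'1.1': 81.57, '1.2': 80.6}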


NCBTS-TSNA

ELECTRONIC Version

This can be accessed through your Division or downloaded from the Training and Development Information System (TDIS), which is an element of the Enhanced Basic Education Information System (EBEIS) at http://beis.deped.gov.ph/, or through the Learning Resource Management and Development System (LRMDS) Portal at http://lrmds.deped.gov.ph/.


Attachment 2: Monitoring and Evaluation Tools for NCBTS-TSNA

M&E Matrix of Tools for NCBTS-TSNA
T&D-M&E Form 1: Individual Profile Template
NCBTS-M&E Form 1: Teacher's Profile for NCBTS-TSNA
NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
NCBTS-M&E Form 3: NCBTS-Coordinator's Checklist and Consolidation Template
NCBTS-M&E Form 4: Trainer's Assessment of NCBTS Orientation Workshop and Consolidation Template
NCBTS-M&E Form 5: Trainee's End of F3 Program Assessment and Consolidation Template
NCBTS-M&E Form 6: Documentation Tool for the Conduct of Cluster or School Level NCBTS-TSNA Implementation
NCBTS-M&E Form 7: School's NCBTS-TSNA Consolidation Template


M&E Matrix of Tools for NCBTS-TSNA

What will be monitored: NCBTS Implementers' details in relation to their current position, their level of experience and qualification
How it will be monitored: All NCBTS Implementers will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to their involvement in the NCBTS-TSNA process
How will the results be used: Results will be analyzed to ensure NCBTS Implementers have the required KSAs. Results will be entered into the TDIS.

What will be monitored: Teachers' details in relation to their current position, their level of experience and qualification
How it will be monitored: All teachers will be asked to complete the profile
M&E tool to be used: NCBTS-M&E Form 1: Teacher Profile for NCBTS-TSNA
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the NCBTS-TSNA Tool
How will the results be used: Results will be entered into the TDIS database along with their corresponding NCBTS-TSNA results.

What will be monitored: Implementation of the NCBTS-TSNA Orientation Package in relation to the processes followed and the facilitation skills demonstrated
How it will be monitored: A Process Observer will be assigned to complete a Learning Process Observation for each session
M&E tool to be used: NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: During the NCBTS orientation workshop
How will the results be used: Results will be discussed with individual Trainers to identify strengths and areas for improvement during debriefing sessions. Recommendations based on an analysis of the results should be included in the Program Completion Report.

What will be monitored: The competency of the NCBTS Coordinators in relation to the criteria set for the role
How it will be monitored: A TDNA-WG member will be assigned to observe the NCBTS Coordinator during the orientation process
M&E tool to be used: NCBTS-M&E Form 3: NCBTS Coordinator's Checklist
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: During the NCBTS orientation workshop
How will the results be used: Results will be discussed with individual NCBTS Coordinators to identify strengths and areas for improvement. Results will be used to inform future decisions regarding the criteria and process for selecting NCBTS Coordinators. Recommendations based on an analysis of the results should be included in the Program Completion Report.

What will be monitored: The overall effectiveness of the workshop as delivered by the whole Team
How it will be monitored: Each of the trainers will be asked to make an assessment of the orientation
M&E tool to be used: NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: Upon completion of the NCBTS orientation workshop
How will the results be used: Results will be collated and analyzed by the TDNA-WG. A summary of the results will be included in the Program Completion Report and will inform future training.

What will be monitored: Participants' perception of the training in relation to the overall quality of the training, the usefulness of the training, their ability to implement the content of the training, and the strengths and weaknesses of the training
How it will be monitored: All participants will be asked to complete the Trainee's End of F3 Program Assessment Form
M&E tool to be used: NCBTS-M&E Form 5: Trainee's End of F3 Program Assessment Form
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Upon completion of the NCBTS-TSNA orientation workshop
How will the results be used: Participants' evaluations will be collated by the TDNA-WG and the results analyzed. A summary of the results will be included in the Program Completion Report and will inform future training.

What will be monitored: The implementation of the NCBTS-TSNA Orientation at the division, cluster and school level
How it will be monitored: A Process Observer will be identified and asked to complete the tool
M&E tool to be used: NCBTS-M&E Form 6: Documentation Tool for the Conduct of Cluster or School Level NCBTS-TSNA Implementation
Who will be responsible for the monitoring: Region and Division TDNA-WG
When will the monitoring take place: During the NCBTS-TSNA Orientation Workshop at the Division, Cluster or School Level
How will the results be used: Results will be discussed with the Implementers to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future training.

What will be monitored: The priority training needs of teachers
How it will be monitored: The NCBTS Coordinator and the School Head will consolidate the results from the administration of the NCBTS-TSNA tool
M&E tool to be used: NCBTS-M&E Form 7: School's NCBTS-TSNA Consolidation Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: After the accomplishment of the NCBTS-TSNA tool
How will the results be used: Results will be used to inform school and division plans for professional development. Results will be submitted to the Division.


T&D-M&E Form 1: Individual Profile Template

I. PERSONAL DATA

Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):
Sex:   Male   Female
Date of Birth:
Home Address:
Contact #:
E-mail Address:
Region:
Division:
District:
Office/School:
Address:
Current Position:
Other Designations:

 Highest Educational Attainment:           

II. WORK EXPERIENCE
(List from most current.)

Columns: POSITION | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | LEVEL (e.g. Elem/Sec/ALS; school, district, division, region) | INCLUSIVE PERIOD

Use additional sheet if necessary.


III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three years.

Columns: Training Focus | Training attended over last 3 years (✓) | Management Level of Training (Central / Region / Division / Cluster / School)

Curriculum

Resource Materials Development

Planning

Management

Policy Development

Research

Other, please specify ______________

IV. SIGNIFICANT EXPERIENCES
Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify ________________

Certified Trainers by:
NEAP Central
NEAP-Region
TEI
SEAMEO-INNOTECH
Foreign Assisted Projects (FAP)
Other, please specify __________

List your significant experiences in the identified areas

Use additional sheet if necessary.


V. TRAINING AND DEVELOPMENT EXPERIENCES

Identify which of the following specific areas you consider to be your area(s) of expertise:

Competency Assessment
Program Planning
Program Designing
Resource Materials Development
Program Delivery
Program Management
Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: 

Signature:    

Please submit completed form to Training and Development Division/Unit. Information will be incorporated into the T&D Information System Database.


NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills

This form is to be used during the actual delivery of a program. A Process Observer will need to be assigned to complete the Learning Process Observation for each session. Results should be used to inform daily debriefing sessions. At the end of this Form is a Checklist of Facilitation Skills which may be observed and recorded.

Session No. _____ Title: ____________________________________________

Time Session Started: ________________ Time Session Ended:____________

Process Observer: ___________________ Designation (M&E Team Member/Trainer)

Phases of Session

Facilitation Skills Demonstrated

Trainee’s Knowledge

/Insights/Skills, Values Learned

Comments

Introductory

Activity

Analysis


Abstraction

Application

Concluding Activity


Observe if the Facilitator has demonstrated the skill. If so, put a check in the appropriate column.

Checklist of Facilitation Skills (✓)

OBSERVING SKILLS
1. noted trainees' level of involvement in all activities
2. monitored the energy level of the trainees during sessions
3. sensed the needs of the trainees that may affect the learning process

QUESTIONING SKILLS
4. formulated questions in a simple manner
5. asked questions that were clear and focused
6. formulated follow-up questions to trainees' responses appropriately
7. asked Higher Order Thinking Skills (HOTS) questions
8. acknowledged trainees' responses
9. solicited, accepted and acted on feedback from trainees
10. processed responses with probing questions to elicit the desired training

LISTENING SKILLS
11. listened to and understood the meaning of what had been said
12. responded positively to trainees' insights
13. clarified and checked understanding of what was heard
14. reacted to ideas, not to the person

ATTENDING SKILLS
15. created the proper environment based on adult learning principles
16. directed and redirected the trainees to the learning tasks
17. managed the learning atmosphere throughout the sessions
18. acknowledged greetings and responses of trainees

INTEGRATING SKILLS
19. highlighted important results of the activity that led to the attainment of the objectives of the session
20. deepened and broadened trainees' outlook on the significance of the outputs

ORAL COMMUNICATION SKILLS
21. expressed ideas with clarity, logic and in grammatically correct sentences
22. spoke with a well-modulated voice
23. delivered ideas with confidence and sincerity

SKILL IN USING TRAINING AIDS
24. employed appropriate and updated training aids
25. made training aids that were simple and clear
26. used training aids that were attractive and interesting
27. utilized training aids that were socially, culturally, and gender-fair


NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist

Name of NCBTS Coordinator to be monitored: ____________________________________________

Please assess the competency of the NCBTS Coordinator according to the following indicators by checking under the appropriate column.

Legend: M - Manifested; NM - Not Manifested

The NCBTS Coordinator demonstrates… | M | NM | Comments

1 Proficiency in the use of MS Word

2 Proficiency in the use of MS Excel

3 Proficiency in the use of MS PowerPoint

4 Confidence in using the e-version of the tool with minimum assistance

5 Understanding of the scoring guide described in the manual version of the tool

6 Fluency in communicating ideas

7 Active participation in the workshop

8 Positive attitude towards interacting with the other participants

9 Leadership potential during group activities

10 Readiness to act as anchor/lead person as required

Name and Signature of the Monitor: _________________________________________________

Date Accomplished: _____________________________


NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist Consolidation Template

INSTRUCTIONS FOR NCBTS COORDINATOR’S CHECKLIST

Instructions for Administration

A. For the TDNA-WG Chair
1. Assign each member of the TDNA-WG to observe how the orientation is conducted at the cluster or school level.
2. Distribute this instrument to the members of the TDNA-WG. This instrument may also be given to Education Supervisors (ES) or Public Schools District Supervisors (PSDS) who may assist in the monitoring of the orientation activities.
3. Brief the members, ES, and PSDS on how to use this instrument.
4. Retrieve all the accomplished instruments.

B. For the TDNA-WG Member or ES/PSDS
- Be familiar with the indicators included in this instrument.
- Observe how the NCBTS Coordinator conducts the orientation activities.
- Use the instrument to record the indicators manifested by the NCBTS Coordinator.
- Submit the accomplished instruments to the TDNA-WG Chair.

Scoring and Consolidation
Use the template that follows to consolidate results. (This can efficiently be done using MS Excel.)

NCBTS Coordinator Checklist Consolidation
Items | Tally (M, NM) | Frequency (M, NM) | Total (M + NM) | Percentage of Manifestation (Total M ÷ Grand Total × 100%)

1

2

3

4

5

6

7

8

9

10

Total M | Grand Total | ___%
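As a quick illustration of the consolidation arithmetic above, the sketch below computes the Percentage of Manifestation from hypothetical M/NM tallies; in practice this is normally done directly in MS Excel, and the item tallies shown are invented for illustration only.

    # Hypothetical M (Manifested) and NM (Not Manifested) tallies per checklist item
    tallies = {1: (8, 2), 2: (7, 3), 3: (9, 1)}  # item: (M, NM)

    total_m = sum(m for m, _ in tallies.values())
    grand_total = sum(m + nm for m, nm in tallies.values())
    percentage_of_manifestation = total_m / grand_total * 100  # Total M / Grand Total x 100

    print(f"Total M = {total_m}, Grand Total = {grand_total}, "
          f"Percentage of Manifestation = {percentage_of_manifestation:.0f}%")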


NCBTS-M&E Form 4: Trainers' Assessment of the NCBTS Orientation Workshop

Trainer’s Name: _________________________________ Sex: Male Female

Please assess the effectiveness of the entire workshop according to the indicators below. Please refer to the following rating scale:

4 - Very High (VH); 3 - High (H); 2 - Low (L); 1 - Very Low (VL)

After the conduct of the Orientation Program by the Team and considering participants' outputs, I believe that…

Rating: 1 2 3 4

1. the workshop was well planned
2. the workshop objectives were met
3. new information was clearly presented
4. new information was appropriate to participants' roles and responsibilities
5. the strategies and methods used were interesting and enjoyable for participants
6. the andragogical (4 As) approach was properly applied
7. training activities moved quickly enough to maintain participants' interest
8. contributions of all participants, both male and female, were encouraged
9. participants were encouraged to consider how ideas and skills gained during the training could be incorporated into their own practices
10. handout materials were clear
11. workshop topics were summarised
12. PowerPoint presentations supported the flow of sessions
13. the resources provided were appropriate to participants' needs

My contribution to the objectives of the workshop: I …

14. contributed to the preparation for the workshop.
15. effectively delivered what was expected of me in the conduct of the workshop.
16. gave the support needed to the Team.

Please provide your honest response to each of the following questions:

What were the successful aspects of the workshop? Why?

What changes would you like to make to improve similar workshops in the future? Why?

Recommendations

Signature: _________________________________ Date Accomplished: ____________


NCBTS-M&E Form 4: Trainers' Assessment of the NCBTS Orientation Workshop Consolidation Template

INSTRUCTIONS FOR TRAINERS' ASSESSMENT OF THE WORKSHOP

I. Instructions for Administration

Give this instrument to the trainers prior to the beginning of the workshop. Brief the trainers on the content and purpose of the instrument prior to administration. Consolidate the results based on the accomplished instruments.

II. Scoring and Consolidation (This can efficiently be done using MS Excel.)

For each item, tally the responses under VL, L, H and VH, multiply each frequency by its rating (VL = 1, L = 2, H = 3, VH = 4), add the weighted frequencies to get (e), add the raw frequencies to get (f), and compute the Mean Rating = e ÷ f.

Items | Tally (T): VL, L, H, VH | Frequency: VL T×1 (a), L T×2 (b), H T×3 (c), VH T×4 (d) | (e) = a + b + c + d | (f) = VH + H + L + VL | Mean Rating = e ÷ f

Example: VL = 7, L = 9, H = 4, VH = 5
7×1 = 7; 9×2 = 18; 4×3 = 12; 5×4 = 20
(e) = 7 + 18 + 12 + 20 = 57
(f) = 7 + 9 + 4 + 5 = 25
Mean Rating = 57 ÷ 25 = 2.28

1

2

3

4

5

6

7

8

9

10

11

12

13

14

15

16
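A minimal computational sketch of the mean-rating formula illustrated in the example above; the frequency counts are the hypothetical figures from that example, and the tallying itself would normally be done in MS Excel.

    # Hypothetical frequency counts for one workshop-assessment item
    frequencies = {"VL": 7, "L": 9, "H": 4, "VH": 5}
    weights = {"VL": 1, "L": 2, "H": 3, "VH": 4}

    weighted_sum = sum(frequencies[k] * weights[k] for k in frequencies)  # (e) = 57
    total_responses = sum(frequencies.values())                          # (f) = 25
    mean_rating = weighted_sum / total_responses                         # 57 / 25 = 2.28

    print(f"Mean rating = {mean_rating:.2f}")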


NCBTS-M&E Form 5: Trainees’ End of the F3 Program Assessment

Trainee’s Name (Optional): _________________________ Sex: Male Female

Program Title: ________________________ Date: __________________

Direction: Please assess the effectiveness of the entire F3 component of the program according to the indicators below. Please refer to the following rating scale:

4-Strongly Agree (SA); 3-Agree (A); 2-Disagree (D); 1-Strongly Disagree (SD)

After the conduct of the F3 component of the program, I believe that …

Rating: 1 - SD; 2 - D; 3 - A; 4 - SA

A. Program Planning/Management/Preparation
1. the training program was delivered as planned
2. the training program was managed efficiently
3. the training program was well-structured

B. Attainment of Objectives
4. the program objectives were clearly presented
5. the session objectives were logically arranged
6. the program and session objectives were attained

C. Delivery of Program Content
7. program content was appropriate to trainees' roles and responsibilities
8. content delivered was based on authoritative and reliable sources
9. new learning was clearly presented
10. the session activities were effective in generating learning
11. adult learning methodologies were used effectively
12. management of learning was effectively structured, e.g. portfolio, synthesis of previous learning, etc.

D. Trainees' Learning
13. trainees were encouraged to consider how ideas and skills gained during the training could be incorporated into their own practices
14. contributions of all trainees, both male and female, were encouraged
15. trainees demonstrated a clear understanding of the content delivered

E. Trainers' Conduct of Sessions
16. the trainers' competencies were evident in the conduct of the sessions
17. teamwork among the trainers and staff was manifested
18. trainers established a positive learning environment
19. training activities moved quickly enough to maintain trainees' interest

F. Provision of Support Materials
20. training materials were clear and useful
21. PowerPoint presentations supported the flow of the sessions
22. the resources provided were appropriate to trainees' needs

G. Program Management Team
23. Program Management Team members were courteous
24. Program Management Team was efficient
25. Program Management Team was responsive to the needs of trainees

H. Venue and Accommodation
26. the venue was well lighted and ventilated
27. the venue was comfortable with sufficient space for program activities
28. the venue had sanitary and hygienic conditions
29. meals were nutritious and sufficient in quantity and quality
30. the accommodation was comfortable with sanitary and hygienic conditions

I. Overall
31. I have the knowledge and skills to apply the new learning
32. I have the confidence to implement the JEL contract

Please provide your honest response to each of the following questions:

What do you consider your most significant learning from the program?

What changes would you suggest to improve similar programs in the future?

Briefly describe what you have learned and how it will help you with your work.

What further recommendations do you have?


NCBTS-M&E Form 5: Trainees’ End of the F3 Program Assessment Consolidation Template

Collate the accomplished F3-M&E Form 5: Trainees’ End of the F3 Program Assessment, and review the results. Use the table below to consolidate the results for the quantitative items.

Note: The scoring and consolidation can be efficiently done using MS Excel.

Use the scale below to interpret the mean rating for each item of the assessment:
3.5 to 4.0 = (SA) Strongly Agree
2.5 to 3.4 = (A) Agree
1.5 to 2.4 = (D) Disagree
1.0 to 1.4 = (SD) Strongly Disagree

Qualitative results should also be summarized below.

Items | Tally (T): SD, D, A, SA | Frequency: SD T×1 (a), D T×2 (b), A T×3 (c), SA T×4 (d) | (e) = a + b + c + d | (f) = SA + A + D + SD | Mean Rating = e ÷ f

Example: SD = 0, D = 0, A = 8, SA = 7
0×1 = 0; 0×2 = 0; 8×3 = 24; 7×4 = 28
(e) = 0 + 0 + 24 + 28 = 52
(f) = 8 + 7 = 15
Mean Rating = 52 ÷ 15 = 3.47

A Program Planning/Management/Preparation

1

2

3

B Attainment of Objectives

4

5

6

C Delivery of Program Content

7

8

9

10

11

12

D Trainees’ Learning

13

14

15


E Trainers Conduct of Sessions

16

17

18

19

F Provision of Support Materials

20

21

22

G Program Management Team

23

24

25

H Venue and Accommodation

26

27

28

29

30

I Overall

31

32
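For the quantitative items above, the mean rating and its interpretation could be derived as in the following sketch; the response counts are hypothetical, and the boundary handling simply follows the interpretation ranges listed earlier in this template.

    # Hypothetical frequencies for one item of the end-of-program assessment
    freq = {"SD": 0, "D": 0, "A": 8, "SA": 7}
    weight = {"SD": 1, "D": 2, "A": 3, "SA": 4}

    mean_rating = sum(freq[k] * weight[k] for k in freq) / sum(freq.values())  # 52 / 15 = 3.47

    if mean_rating >= 3.5:
        interpretation = "Strongly Agree (SA)"
    elif mean_rating >= 2.5:
        interpretation = "Agree (A)"
    elif mean_rating >= 1.5:
        interpretation = "Disagree (D)"
    else:
        interpretation = "Strongly Disagree (SD)"

    print(f"Mean rating = {mean_rating:.2f} -> {interpretation}")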

Summary of Qualitative Responses


What do you consider your most significant learning from the program?

What changes would you suggest to improve similar programs in the future?

Briefly describe what you have learned and how it will help you with your work.

What further recommendations do you have?

NCBTS-M&E Form 6: Documentation Tool for the Conduct of Division, Cluster or School Level NCBTS-TSNA Implementation


This form is to be used to support Regional monitoring of the NCBTS-TSNA process at the Division level and Division monitoring of district and school level activities. It is expected that the assessment will be based on observations, discussions with the implementing team and review of relevant documents.

Division/District/School _________________________ Date: __________________

Rating Guide:
Numerical Rating | Interpretation | Description
4 | Very High Extent | In a very significant way
3 | High Extent | In a meaningful way
2 | Low Extent | In a limited way only
1 | Very Low Extent | Not in any meaningful way

Use the scale above to assess the extent to which the conduct of TDNA documentation adhered to the following:

To what extent … | 1 | 2 | 3 | 4
1. was thorough planning conducted prior to the NCBTS-TSNA orientation workshop?
2. were participants oriented to the NCBTS?
3. was the purpose of the NCBTS-TSNA explained?
4. was a clear explanation provided on how to accomplish the NCBTS-TSNA tools, e.g. manual and/or e-version?
5. was the scoring system for the NCBTS-TSNA tool explained?
6. were the steps involved in developing an Individual Summary of TSNA results explained?
7. were the steps involved in consolidating TSNA results explained?
8. was an explanation provided on how to interpret individual and consolidated results?
9. was technical assistance provided when required?
10. were the M&E tools and processes implemented?
11. was there evidence of teamwork and collaboration amongst the NCBTS Implementers?
12. were recommendations for improving the NCBTS-TSNA Orientation and Administration processes identified?

Recommendations:

Name: ___________________________________

Designation: ______________________________

Date: ____________________________________


NCBTS-M&E Form 7: School’s NCBTS-TSNA Consolidation Template

Name of School: ____________________________ Division: _______________________

School NCBTS-TSNA Results

School Head _________________________________________


Domain / Strand No. | Teacher's Percentage Score (T1, T2, T3, T4, …) | Total | Average Percentage
1.1
1.2
Total Domain 1
2.1
2.2
2.3
2.4
2.5
Total Domain 2
3.1
Total Domain 3
4.1
4.2
4.3
4.4
4.5
4.6
4.7
Total Domain 4
5.1
5.2
5.3
5.4
Total Domain 5
6.1
Total Domain 6
7.1
7.2
7.3
Total Domain 7


NCBTS Coordinator: __________________________________
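For illustration, the strand averages in the template above amount to a simple mean of the teachers' percentage scores; the sketch below shows the computation for one strand with hypothetical teacher scores (the official consolidation uses the e-template/MS Excel).

    # Hypothetical percentage scores for strand 1.1 from teachers T1 to T4
    strand_scores = [75.0, 60.0, 85.0, 70.0]

    total = sum(strand_scores)
    average_percentage = total / len(strand_scores)

    print(f"Strand 1.1: Total = {total}, Average Percentage = {average_percentage:.1f}%")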

7.2. The Training and Development Needs Assessment for School Heads (TDNASH) Guide and Tools


Republic of the Philippines

Department of Education

Training and Development Needs Assessment for School Heads

(TDNASH)

Guide and Tools

DepED-EDPITAF-STRIVE
June 2010


This document, The TDNASH Guide and Tools, was developed and validated in Regions VI, VII and VIII, Divisions of Negros Occidental, Bohol/Tagbilaran and Northern Samar through the AusAID-funded

project, STRIVE (Strengthening the Implementation of Basic Education in selected Provinces in the Visayas), in coordination with the EDPITAF (Educational Development Project Implementing Task

Force) and in consultation with the TEDP-TWG, NEAP and the Bureaus of the Department of Education.


Table of Contents

I. Introductory Information
   Basis and Purpose of the TDNASH
   The TDNASH Working Group
   TDNA Working Group Roles and Responsibilities
   TDNASH Process Flow

II. Assessment Approach and Methodology
   Preliminary Preparations
   Guide for the Conduct of the TDNASH Working Group Orientation
   Administration of the TDNASH to School Heads
   Administration of the TDNASH for Teachers using a Group Consensual Assessment Technique
   Flow for Teachers' Group Consensual Assessment Technique
   Administration of the TDNASH by a Supervisor

III. Analysis of TDNASH Results
   Scoring the TDNASH Answer Sheet
   Consolidation of TDNASH Triangulation Results for an Individual SH
   Reporting Cluster Results
   Interpretation of Results

IV. TDNA Consolidation Database for TDNASH

V. Monitoring and Evaluation of the TDNASH

Attachment 1: TDNASH Tools and Templates

Attachment 2: M&E Tools for the TDNASH

Attachment 3: National Competency-Based Standards for School Heads


GLOSSARY OF ACRONYMS

AIP Annual Implementation Plan

BESRA Basic Education Sector Reform Agenda

CO Central Office

DEDP Division Education Development Plan

DepED Department of Education

DO Division Office

ELMP Education Leadership and Management Program

EDPITAF Educational Development Project Implementing Task Force

EBEIS Enhanced Basic Education Information System

ES Education Supervisor

FGD Focus Group Discussion

GCA Group Consensual Assessment

IRR Implementing Rules and Regulations of RA 9155, December 2007

IPPD Individual Plan for Professional Development

KSA Knowledge, Skills and Attitudes

LOC Level of Competency

LOI Level of Importance

LRMDS Learning Resource Management and Development System

MPPD Master Plan for Professional Development

M&E Monitoring and Evaluation

NCBS-SH National Competency Based Standards for School Heads

NCBTS National Competency-Based Teacher Standards

NSHPI National School Heads Performance Indicators

PDM Professional Development Materials

PDP Professional Development Planning

PDRD Program Designing and Resource Development

PDy Program Delivery

PSDS Public School District Supervisor

RA 9155 Republic Act 9155: Governance Act for Basic Education, 11 Aug 2001

REDP Regional Education Development Plan

RO Regional Office

SBM School-Based Management

SEDIP Secondary Education Development Improvement Project

SH School Head

SIP School Improvement Plan

SLE Structured Learning Episode

SLEL School Leadership Experience Level

SLEP School Leadership Experience Portfolio

SPPD School Plan for Professional Development

STRIVE Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas

T&D Training and Development

TDIS Training and Development Information System

TDNA Training and Development Needs Assessment

TSNA Teachers Strengths and Needs Assessment


Training and Development Needs Assessment for School Heads (TDNASH)

I. INTRODUCTORY INFORMATION

Basis and Purpose of the TDNASH

The Training and Development Needs Assessment for School Heads (TDNASH) is a means to systematically determine the training and development needs of school heads (SH) in order to support improved educational leadership. The competencies identified in the TDNASH are based on the mandate for school heads indicated in RA 9155, its Implementing Rules and Regulations (IRR), and DepED Orders 80 and 81, series of 2003.

The TDNASH is an adaptation of the National Educators Academy of the Philippines (NEAP) assessment tool for school managers. A list of required competencies based on the mandate for school heads was earlier developed by the NEAP for its School Leadership Experience Portfolio (SLEP). The SLEP is an assessment tool containing a rubric of behavioral indicators to assess school leaders on specific competencies and to classify their level of performance into one of five general types, namely: Awareness, Emerging, Practicing, Performing and Transforming. The SLEP Assessment Tool was validated and utilized in a previous DepED project, the Secondary Education Development Improvement Project (SEDIP), involving 16 DepED Divisions across the country.

The SLEP was the primary material that was reviewed and enhanced to develop the final list of competencies in behavioral terms for the purpose of identifying the training and development needs of school heads in the target sites of Project STRIVE in 2006. The SLEP tool was designed for the training and development needs assessment of school heads in Northern Samar and revised into a shorter form for the Division of Bohol in 2007. The tool was further refined and validated with 716 SHs in the Division of Negros Occidental in June 2008 and with 43 SHs in Northern Samar and 63 SHs in Bohol in September 2008. The pilot-test results were used to develop this present form referred to as Training and Development Needs Assessment for School Heads (TDNASH).

The competencies identified for the school heads are classified into seven domains for school leadership and management which are:

Domain 1: School Leadership, Domain 2: Instructional Leadership, Domain 3: Creating a Student-centered Learning Climate, Domain 4: Professional Development and HR Management, Domain 5: Parental Involvement and Community Partnership, Domain 6: School Management and Daily Operations, and Domain 7: Personal Integrity and Interpersonal Effectiveness.

The TDNASH intends to:
- determine the current level of school heads' competency
- ascertain the level of importance of each competency to the job
- identify the priority training and development needs of the SHs vis-à-vis the seven domains for school leadership and management


TDNASH Process Flow (text summary of the process flow diagram):

Region Level: RD issues a memo to Divisions commencing the TDNASH, specifying among others the structure, functions, general processes, resources and responsibilities.

Division Level: SDS instructs the T&D TDNA-WG to commence the TDNASH, specifying among others the structure, functions, general processes, resources and responsibilities. The T&D Chair convenes the TDNA-WG and starts orientation and preparatory activities for the TDNASH.

School Level: The TDNA-WG administers the triangulation process for the TDNASH by cluster to SHs, teachers and supervisors.

Division/District Level: The TDNA-WG analyzes TDNASH data per district/cluster, consolidates TDNASH results and reports to the SDS; TDNASH results are utilized for the Division MPPD re SHs' leadership training plan.

The RO utilizes TDNASH results for the Regional MPPD re SHs' leadership training plan. The TDNA-WG monitors and evaluates the TDNASH process (internal M&E). SHs are furnished individual results of the TDNASH, identify their priority training and development needs, and develop their IPPD. The SDS submits a report to the RD. The TDNA-WG monitors and evaluates the TDNASH process at the DO level.


The TDNA Working Group (TDNA-WG)

The Division Training and Development (T&D) Team, through its TDNA Working Group (TDNA-WG), is responsible for the management of the TDNASH process. The T&D Chair, acting as the Chair of the TDNA-WG and assisted by two Co-chairs (representing the elementary and secondary levels, preferably Division Supervisors), is designated by the Schools Division Superintendent (SDS). The TDNA-WG members are all the District/Division Supervisors. Each member should have direct responsibility for the supervision of a cluster of elementary and/or secondary schools.

TDNA-WG Roles and Responsibilities

The Chair and Co-chairs have overall responsibility for the management of the TDNASH process. They should ensure that the Supervisors, who are also TDNA-WG members, are familiar with the process for conducting the TDNASH with the three different sets of respondents, i.e. the SHs, Supervisors and teachers. The TDNA-WG members are expected to play a key role in the preparation, administration, data analysis and reporting of the TDNASH results. The general flow of processes related to the TDNASH across the Regional, Division/District and School levels is summarized in the TDNASH Process Flow shown above.


II. Assessment Approach and Methodology

The TDNASH is basically a self-appraisal made by the School Head (SH), employing a triangulation process in which two other assessment measures validate the SH's self-assessment. School Heads are asked to complete a self-assessment instrument in which they identify the behaviors they have consistently demonstrated in their role as a school head across the seven domains. They also identify the importance of the competencies in relation to their ability to perform their job. To corroborate the self-assessment, a Group Consensual Assessment (GCA) technique is conducted with a small group of teachers (5-8) to consider the School Head's level of competency. Finally, the District/Division Supervisor's assessment of the School Head makes up the third measure, using the same instrument covering all the domains and competencies expected of a school leader.

A. Preliminary Preparations

Prior to the commencement of the administration of the TDNASH, the T&D Chair and Co-chairs are convened to accomplish the following tasks:

Study the TDNASH Guide and Tools, including the introductory information and procedures, and specifically review the following:
- Individual Profile Template
- TDNASH Tool
- TDNASH Answer Sheet
- TDNASH Consolidation Tool (e-version)
- List of Competencies and Behavioral Indicators
- Monitoring and Evaluation Tools

The Chair and Co-chairs of the group lead a one-day orientation activity for the TDNA-WG which includes all the Supervisors, guided by the following steps

Guide for the Conduct of the TDNA-WG Orientation

1. Conduct an introductory activity that provides an opportunity for participants to greet one another and feel comfortable working together.

2. Explain the objectives of the orientation:
- be familiarized with the purpose of the TDNASH
- develop a clear understanding of their role as TDNA-WG members in the administration of the TDNASH
- practice the processes involved in the administration of the TDNASH to the different respondents
- consolidate results from the TDNASH at the school and cluster level
- gain experience in analyzing the results from the TDNASH
- develop a plan for the completion of the TDNASH process

3. Give an overview of the orientation by focusing on the key understandings to be achieved:
- The Training and Development Needs Assessment for School Heads (TDNASH) aims to identify the School Leadership Experience Level (SLEL) of the School Head.
- The TDNASH uses a triangulation process with data collected from the School Head, the School Head's direct supervisor and from teachers working with the School Head.
- Strengths and prioritized training and development needs for individual School Heads and clusters of School Heads can be identified through reviewing the TDNASH results.
- The TDNA-WG members have the responsibility for the administration, consolidation and reporting of the TDNASH results.

4. Review the TDNASH Guide and Tools by:
- Discussing the Introductory Information and the TDNA Working Group Roles and Responsibilities.
- Highlighting all the other sections of the TDNASH Guide and Tools and explaining that these will be examined closely when the walk-through of the materials is conducted.
- Clarifying any questions about the TDNASH Guide and Tools.

5. Walk through the TDNASH Guide and Tools. Begin by reviewing the process for administering the TDNASH to School Heads. Consider the:
- process to be followed
- Individual Profile Template
- TDNASH Tool, including a review of the competencies and behavioral indicators
- TDNASH Answer Sheet
- Level of Importance Rating Scale
Simulate the administration of the TDNASH among the members present.

6. Review the purpose for collecting data from the teachers who are working at the school managed by the School Head.
- Discuss the process to be used, e.g. a Group Consensual Assessment (GCA) technique.
- Read the Focus Group Flow and practice the process for gathering corroborating evidence from teachers through the GCA technique.
NOTE: Point out that if a School Head is new to a school, it may be necessary to conduct the GCA technique with teachers from a school where the School Head has previously worked. If the School Head is new to the position (i.e. less than a year), it may not be possible to collect corroborating data from teachers. Hence, only the self-assessment data will be considered.

7. Review the purpose for collecting data from the Supervisors who are overseeing the school managed by the School Head.
- Discuss the process to be used, e.g. responding to the TDNASH tool to provide evidence that corroborates the School Head's self-assessment.
- Read the instructions for the Administration of the TDNASH by a Supervisor.
- Discuss the process and ensure the TDNA-WG members are familiar with the requirements.
NOTE: A decision has to be made as to who is in the best position to provide the data for each School Head. If a School Head is new to an area, it may be necessary to ask a Supervisor from the area where the School Head has previously worked to complete the TDNASH data gathering exercise. For a School Head who is new to the position (i.e. less than a year), only the self-assessment data will be considered.
Ensure participants are aware of the three different sets of respondents from whom data will be collected and the processes involved for each set.


8. Study how the indicators for the competencies included within the TDNASH tool are written as a continuum from the simplest to the most complex.
- Review the School Leadership Experience Levels and the expectations at the various levels.
- Look at the indicators for the competencies and note how each indicator is associated with a specific School Leadership Experience Level.

9. Review the Analysis of the Results section of the Guide and Tools:
- Scoring the TDNASH Answer Sheets
- Consolidation of the TDNASH Triangulation Results for Individual SHs, including the use of the e-version
- Reporting Cluster Results
- Interpretation of Results

10. Develop a plan for the field administration of the TDNASH. This should consider the:
- strategy to be followed, e.g. Supervisors visit schools within their cluster and current jurisdiction, administer the TDNASH to School Heads and conduct the GCA technique with teachers; Supervisors in charge of the cluster will complete the data for each school head separately as part of the triangulation process
- coordination of the monitoring and evaluating process
- management of the results analysis
- time frame for conducting the TDNASH
- reproduction of the materials for field use
- administrative tasks associated with the conduct of the TDNASH, e.g. writing of memos

11. Summarize the key elements of the day's orientation:
- Agreements that have been made for the administration of the TDNASH
- TDNA-WG members' roles and responsibilities
- Submission of triangulation results for individual School Heads and for clusters of School Heads to the TDNA-WG Chair
- TDNA-WG monitoring processes and the analysis of the results at the District/Division level

12. Thank participants for their involvement and commitment.

B. Administration of the TDNASH to School Heads

This may be done by district or cluster. It is also possible for the Supervisor to administer the tool to individual SHs during its administration to the teacher-respondents. The Supervisor in charge of the district/cluster or the school will be responsible for the administration of the TDNASH.

The materials required for the administration of the TDNASH to the School Heads are:
- TDNASH Tool (see Attachment 1)
- TDNASH Answer Sheet (see Attachment 1)
- Individual Profile Template (see Attachment 2)

1. Explain to the School Heads that they will be completing a TDNA to determine their training strengths and needs in order to support improved educational leadership. You may add that the TDNA results will help him/her in the development of an Individual Plan for Professional


Development (IPPD) and will also inform the Division in designing a Master Plan for Professional Development (MPPD) for the benefit of school managers.

2. Emphasize that TDNA is the initial stage of managing one’s professional growth where individual strengths are affirmed and learning needs are purposely taken care of through a deep sense of individual accountability for self-professional development. Stress to the School Head the TDNA’s spirit of introspection or guided-reflection and that it is not intended to evaluate performance nor is it a measure to compare between or among various school heads.

3. Present the competencies identified in the TDNASH based on the mandate for school Heads indicated in the RA 9155 and adapted from the National Educators Academy of the Philippines (NEAP) School Leadership Experience Portfolio (SLEP). The competencies are classified into seven domains for school leadership and management:

Domain 1: School Leadership, Domain 2: Instructional Leadership, Domain 3: Creating a Student-centered Learning Climate, Domain 4: Professional Development and HR Management, Domain 5: Parental Involvement and Community Partnership, Domain 6: School Management and Daily Operations, and Domain 7: Personal Integrity and Interpersonal Effectiveness

4. Explain that the TDNASH follows a triangulation process and in addition to the School Head’s self assessment, corroborating data will also be collected from a group of teachers from the School Head’s school through a Group Consensual Assessment (GCA) technique, as well as data from the School Head’s Supervisor. Results from the triangulation will be collated and a copy provided to the School Head to support them in identifying their training strengths and needs.

5. Review the TDNASH tool with the School Head and explain that for each domain there are a number of competencies. For each competency a qualifier has been provided which identifies a specific focus. Each competency also has a series of indicators which state the type of behaviors expected to be demonstrated by a school head as they develop their education leadership skills.

6. Ask the School Head to consider each indicator and reflect on the things they have done as a School Head related to the indicators and which demonstrate their level of competency.

7. Show the School Head the Answer Sheet and explain that they need to place a ✓ for Yes and an X for No in the appropriate column, representing their assessment of whether or not they consistently demonstrate this behavior in their work. They will need to do this for each indicator.

8. Tell the SH to note that the letter codes, A, E, P, L, T under the SLEL column are intended for the analysis of results. Please do not be distracted by these codes.


9. Explain to School Heads that in addition to assessing their level of competency in relation to each indicator they have to rate each competency in terms of how essential they believe it is in the performance of their job. Explain the rating scale which they will use to rate the level of importance (LOI) of each competency and show them where they are to record this information on the Answer Sheet, e.g. under the column labeled LOI.

10. Explain that the completion of the TDNASH will take approximately 2 hours, however no time limit is imposed.

11. Allow enough time for the School Head to complete the Individual Profile Template and the TDNASH.

12. When the School Head has completed the TDNASH collect the Individual Profile and the Answer Sheet. Ensure the School Head has affixed their name and signature at the end of the Answer Sheet.

13. Thank the School Head for their cooperation and explain that a copy of the results from the triangulation process will be forwarded when collated.

C. Administration of the TDNASH for Teachers using a Group Consensual Assessment (GCA) Technique

For the administration of the TDNASH by Teachers the following materials (see Attachment 1) are required:

- TDNASH Tool (first page to be removed; 1 per teacher)
- TDNASH Answer Sheet (1 only per group)

A small group of teachers (5-8) is selected by the PSDS responsible for overseeing the school, to participate in the Group Consensual Assessment. The teachers selected should be representative of the school teacher population and should have a good knowledge of the School Head’s work practices.

Use the following flow to guide the GCA technique.

Flow for Teachers' Group Consensual Assessment Technique
(The flow is organized by Objective and Activity. Rating Scale for Level of Importance (LOI) to the Job: 1 - Most Essential; 2 - More Essential; 3 - Essential; 4 - Less Essential; 5 - Least Essential.)

Objective: Have everyone at ease with the gathering of respondents. Explain the genuine intention and spirit of a TDNA.

Activity - Introduction
1. Explain to the Teachers that they will be completing a Training and Development Needs Assessment for their School Head (TDNASH) as viewed from their own perspective as teachers. Present the intention of a TDNA, which is to determine the training

needs of SHs in order to support improved educational leadership. Just like the purpose of the NCBTS-TSNA for teachers, you may add that the TDNA results will help the SH in the development of an individual plan for professional development and will also inform the Division in designing a Master Plan for Professional Development (MPPD) for the benefit of school managers.

2. Emphasize to the participants that as key respondents to this TDNA of the School Head, their answers and agreement with their colleagues to reach a consensual assessment will be most helpful for the future development of the educational leadership competencies of the School Head.

3. Stress the TDNA’s spirit of introspection or guided-reflection and that it is important for them to be objective and true to their assessment of the competencies of their SH.

Develop an understanding of the purpose of the TDNA and the basic mechanics of the exercise

4. Present the purpose of the TDNASH, e.g. to corroborate the School Head's self-assessment. Provide a brief description of the TDNASH process, e.g. a triangulation process, and the data gathering method for the teachers, e.g. a group assessment to reach a consensus.

5. Explain that the results of the GCA will be provided to the School Head as a consensual decision and not as individual teacher’s views.

6. Ask the group to select a recorder who will take charge of entering the consensual ratings on the Answer Sheet provided for the GCA.

Introduce the TDNASH Tool and the Answer Sheet. Achieve a common understanding of the response mode and the rating scale for assessing the competency's level of importance.

7. Review the tool with the teachers and explain that for each domain there are a number of competencies. For each competency a qualifier providing a specific focus has been identified as well as a series of indicators which state the type of behaviors expected to be demonstrated by a school head as they develop their education leadership skills.

8. Explain that the TDNASH tool requires them as a group to consider each of the indicators and decide if they believe the School Head consistently demonstrates this behavior.

9. Show that there are two sets of data required by the tool, one for the School Leadership Experience Level (SLEL) indicators that requires a Yes/No response, and another for the Level of Importance (LOI) of the competency to the job.

10. Provide an explanation of the rating scale to be used for rating the LOI, where 1 is Most Essential, 2 is More Essential, 3 is Essential, 4 is Less Essential, and 5 is Least Essential.

Achieve a common understanding of each competency

11. Walk through the ‘TDNASH Tool’ with the teachers. Consider one competency at a time. Consider each of the indicators for the competency and ask the teachers to reflect on whether or not the School Head consistently demonstrates this behavior.

Articulation of:
a. Individual perception on whether the School Head consistently demonstrates the behavioral indicators
b. Consensus agreement on whether the School Head consistently demonstrates the behavioral indicators
c. Individual perception on the level of importance of the competency to the job of a School Head
d. Consensus agreement on the level of importance of the competency to the job of a School Head

12. Each participant provides an individual perception of whether the School Head consistently demonstrates the behavioral indicators and provides evidence to support their perception

13. The group reaches a consensus and provides a Yes/No response for each indicator. They need to place a ✓ for Yes or an X for No in the appropriate column.

14. Record the group's consensual decision on the separate answer sheet provided.
15. Each participant provides an individual perception of the LOI of the competency to the job of a School Head.
16. The group reaches a consensus and provides a rating (1, 2, 3, 4, or 5) according to the rating scale provided.
17. Record the group's consensual decision on the answer sheet provided.
18. Continue Steps 12 to 17 for each competency.

Evidence of Completion of GCA part of the TDNASH

19. Record details of the participants who took part in the GCA and ask the teachers to affix their signature at the end of the Answer Sheet.


D. Administration of the TDNASH by a Supervisor

For the administration of the TDNASH by Supervisors the following materials (see Attachment 1) are required:

- TDNASH Tool (1 copy, first page removed)
- TDNASH Answer Sheet (1 copy)
- Instructions for Supervisors

The Supervisor asked to complete the "TDNASH by Supervisors" should have a sound knowledge of the School Head's work practices. This would normally be the Supervisor who has direct responsibility for overseeing the school where the School Head is based. It is most likely that the Supervisor is a member of the TDNA-WG who has undergone an orientation activity for the TDNASH. It is also possible that the Supervisor is the one who has administered the TDNASH to the SH and to the teachers.

The following points serve as a guide for accomplishment of the TDNASH by Supervisors:
1. Provide the Supervisor with a copy of the 'Instructions for Supervisors' and allow him/her time to read them.
2. Provide the Supervisor with a copy of the TDNASH Tool and the Answer Sheet.
3. Clarify any questions they may have regarding completing the TDNASH.
4. Allow time for the Supervisor to respond to the Tool.
5. When the Supervisor has finished, collect the Answer Sheet and ensure the respondent information is completed and the Supervisor's signature is provided at the end of the Answer Sheet.

III. ANALYSIS OF THE TDNASH RESULTS

Scoring the TDNASH Answer Sheets
The Supervisor is responsible for scoring the Answer Sheets from the three sets of respondents. The scoring process involves reviewing the indicators, which are arranged in a continuum from the simplest to the more complex, to identify the School Head's School Leadership Experience Level (SLEL).

Each Answer Sheet should be scored separately by completing the following steps:

1. Review each competency and the responses provided for the set of indicators.
2. Take note of where the ✓'s (representing a Yes response) have been given. These indicate that the school head consistently demonstrates these behaviors.
3. Identify where the SH's level of competency appears to be (e.g. where the ✓'s finish and where the X's, representing a No response, start). See the example below.

Domain 1: School Leadership
Competency 1.1 | Competency rating: 3 | LOI: 2

SLEL | Indicator | Yes (✓) / No (X)
A | 1 | ✓
E | 2 | ✓
P | 3 | ✓
P | 4 | ✓
P | 5 | ✓
L | 6 | X
L | 7 | X
L | 8 | X
L | 9 | X
T | 10 | X
T | 11 | X
T | 12 | X

4. In the example above take note that the letter P under the School Leadership Experience Level (SLEL) column corresponds to the SH level of competency based on the indicator described in step No. 3 above. The letter P stands for Performing level which is the SLEL for the specific competency. Based on the scale below, P corresponds to a rating of 3. Record this on the Answer Sheet in the competency column.

School Leadership Experience Levels | Letter Code | Rating Scale
Awareness Level | A | 1
Emerging Level | E | 2
Performing Level | P | 3
Leading Level | L | 4
Transforming Level | T | 5

5. In cases where there are cross marks followed by check marks, consider the first check mark followed by at least two crosses to be the registered competency level. See examples below.

Example A. The first ✓ followed by two Xs is registered as the final competency level.

Domain 1: School Leadership
Competency 1.1 | Registered competency level: 4 | LOI: 2

SLEL | Indicator | Yes (✓) / No (X)
A | 1 | ✓
E | 2 | ✓
P | 3 | X
P | 4 | ✓
P | 5 | ✓
L | 6 | ✓
L | 7 | X
L | 8 | X
L | 9 | ✓
T | 10 | ✓
T | 11 | X
T | 12 | X

Example B. The last ✓ of the series, followed alternately by X's, is registered as the final level.

Domain 1: School Leadership
Competency 1.1 | Registered competency level: 5 | LOI: 2

SLEL | Indicator | Yes (✓) / No (X)
A | 1 | ✓
E | 2 | ✓
P | 3 | X
P | 4 | ✓
P | 5 | X
L | 6 | ✓
L | 7 | X
L | 8 | ✓
L | 9 | X
T | 10 | ✓
T | 11 | X
T | 12 | ✓
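The registration rule in steps 3 to 5 can be expressed procedurally as in the sketch below. This is only an illustrative reading of the rule, not part of the official tools: the first Yes followed by at least two consecutive No responses registers the level, otherwise the last Yes does; the sample responses reproduce Example A, and the fallback for an all-No answer sheet is an assumption.

    SLEL_CODES = ["A", "E", "P", "P", "P", "L", "L", "L", "L", "T", "T", "T"]  # codes for indicators 1-12
    RATING = {"A": 1, "E": 2, "P": 3, "L": 4, "T": 5}

    def register_slel(responses):
        """responses: list of True (Yes) / False (No), one entry per indicator."""
        yes_positions = [i for i, r in enumerate(responses) if r]
        if not yes_positions:
            return "A", RATING["A"]  # assumption: no Yes responses defaults to the Awareness level
        # First Yes followed by at least two consecutive No's registers the level (Example A) ...
        for i in yes_positions:
            if responses[i + 1:i + 3] == [False, False]:
                return SLEL_CODES[i], RATING[SLEL_CODES[i]]
        # ... otherwise the last Yes in the series registers the level (Example B).
        last = yes_positions[-1]
        return SLEL_CODES[last], RATING[SLEL_CODES[last]]

    # Example A: Yes for indicators 1, 2, 4, 5, 6, 9, 10 -> registered level L (rating 4)
    example_a = [True, True, False, True, True, True, False, False, True, True, False, False]
    print(register_slel(example_a))  # ('L', 4)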


6. Complete this for each of the competencies and for the three sets of respondents.

7. This data becomes the basis for the consolidation of the TDNASH Triangulation Results.

Consolidation of the TDNASH Triangulation Results for Individual School Head

When the data has been collected and analyzed from all three sources involved in the TDNASH process, the results need to be consolidated by the TDNA-WG member who administered the TDNASH e.g. the Supervisor. A template (hard copy and an e-version) has been developed to assist with the consolidation (See Attachment 1).

1. Enter the final results of each SH’s ratings for both SLEL Competency ratings and the Level of Importance (LOI) ratings obtained from the School Head Self-Assessment, the Supervisor and the Teachers into the consolidation template provided.

2. Based on the SLEL ratings entered, compute the weighted average for the overall 'School Leadership Experience Level' for each competency. It is recommended that the School Head's results be given a heavier weight than the results from the others. Thus, the weight for the SH self-assessment is 50%, the Supervisor's is 25% and the Teachers' assessment is 25%. Enter the weighted average for each competency in the appropriate column. See the example below on how to calculate the weighted average.

SEE EXAMPLE BELOW:

TDNASH Items | Self-Assessment (SLEL 50%, LOI) | Supervisor (SLEL 25%, LOI) | Teachers (SLEL 25%, LOI) | Weighted Average of SLEL Ratings | Overall School Leadership Experience Level (A - Awareness, E - Emerging, P - Practicing, L - Leading, T - Transforming) | Overall (Simple) Average LOI

1 | 2, 4 | 2, 5 | 2, 4 | 2 | E | 4.33
2 | 2, 3 | 3, 2 | 3, 2 | 2.5 | P | 2.33
3 | 4, 2 | 3, 3 | 4, 4 | 3.75 | L | 3.00
4 | 3, 2 | 2, 1 | 1, 1 | 2.25 | E | 1.33

Example of computing the weighted average (See TDNASH Item #3 above):

Self- Assessment (4 x .50) + Supervisor (3 x .25) + Teachers (4 X .25) = (2.0) + (0.75) + (1.0) = 3.75


3. For the next Overall School Leadership Experience Level column, determine the equivalent SLEL descriptor (A, E, P, L, T) using the following scale ranges:
A - 1.00 to 1.49; E - 1.50 to 2.49; P - 2.50 to 3.49; L - 3.50 to 4.49; T - 4.50 to 5.00

4. For the level of importance, find the simple average of the LOI ratings across the results from the three sets of respondents.

5. A copy of the consolidated triangulation results should be provided to the School Head and to the TDNA-WG Chair. The Supervisor should discuss the results with the School Heads and assist with the interpretation of the results e.g. identifying priority training and development needs by finding the domains/competencies that have the lowest level of competency and are considered to be of the highest level of importance.
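A minimal sketch of the consolidation arithmetic in steps 2 to 4 above, using the ratings of TDNASH Item 3 from the example; in practice the e-version template performs the same computation in MS Excel.

    # (SLEL rating, LOI rating) from the SH self-assessment, the Supervisor and the Teachers
    self_assessment, supervisor, teachers = (4, 2), (3, 3), (4, 4)

    weighted_slel = self_assessment[0] * 0.50 + supervisor[0] * 0.25 + teachers[0] * 0.25  # 3.75
    average_loi = (self_assessment[1] + supervisor[1] + teachers[1]) / 3                   # 3.00

    # Map the weighted average to its SLEL descriptor using the scale ranges above
    if weighted_slel >= 4.50:
        descriptor = "T"
    elif weighted_slel >= 3.50:
        descriptor = "L"
    elif weighted_slel >= 2.50:
        descriptor = "P"
    elif weighted_slel >= 1.50:
        descriptor = "E"
    else:
        descriptor = "A"

    print(f"Weighted SLEL = {weighted_slel:.2f} ({descriptor}), Average LOI = {average_loi:.2f}")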

Reporting Cluster Results

When the triangulation results have been consolidated for each School Head, cluster results should be developed. A template (hard copy and an e-version) is provided to facilitate this process. (See Attachment 2). There are three parts of the template namely:

Part I – Cluster SH Identification; Part II – TDNASH Cluster Summary Sheet for SLEL; and Part III – TDNASH Cluster Summary Sheet for LOI.

The following are the steps for manually organizing the cluster results:

Part I.
1. Assign each of the School Heads within a specific cluster an identifying code number such as SH 1, SH 2, SH 3, until the last SH. These are entered in the first column of the template.
2. Enter the names of SHs corresponding to the code number identified.
3. Write the school's name beside the name of each of the SHs.

Part II.
1. Enter the individual Weighted Average of SLEL Ratings per School Head computed from the TDNASH triangulation process into the TDNASH Cluster Summary Sheet.

2. Compute the simple average across all the SHs in each of the competencies.

3. Determine the equivalent SLEL Descriptor (A, E, P, L or T) corresponding to the Overall Cluster SLEL average rating using the scale ranges below. Enter this in the last column.
A - 1.00 to 1.49; E - 1.50 to 2.49; P - 2.50 to 3.49; L - 3.50 to 4.49; T - 4.50 to 5.00

Part III.


1. Enter the consolidated average for the Level of Importance from the TDNASH triangulation for each School Head into the TDNASH Cluster Summary Sheet LOI Results worksheet.

2. Calculate the simple average for the Level of Importance for each competency across schools within the cluster.

Analyze the Cluster results and develop an interpretation (see Interpretation of Results below). This should identify the priority training and development needs of the School Heads within the Cluster.

Provide a copy of the cluster results and priority training and development needs to the Lead School Head and to the TDNA-WG Chair.

Interpretation of Results

The School Head's School Leadership Experience Level (SLEL) is interpreted using the descriptors found in the following table. The simplest level is Awareness and the most advanced is Transforming.

LEVELS | DESCRIPTORS of School Leadership Experience Level

Awareness (A) Level
o Observing the leadership of others
o Understanding the work of leadership from a theoretical perspective
o Following the directives or suggestions of DepED or the school team

Emerging (E) Level
o Identifying factors that affect work, programs or projects
o Participating as a member in the work of a team for a specific purpose
o Implementing a program or project selected by the team or downloaded by DepED

Practicing (P) Level
o Facilitating the work of a team as a leader
o Managing a program, project or initiative being done by a team
o Coordinating the work of others
o Developing programs and providing resources to be utilized by others in their work
o Supporting the work of individuals and teams
o Monitoring and evaluating performance of individuals, projects accomplishment and utilization of budget
o Acting on the results of monitoring and evaluation of performance
o Providing feedback to teachers and teams regarding performance

Leading (L) Level
o Overseeing school-wide program, project or initiative which involves delegating, decision-making and problem solving
o Ensuring that teaching and non-teaching personnel are results oriented
o Exercising management control and evaluating school-wide projects, programs and personnel
o Designing, developing, and overseeing the implementation of a program or project
o Facilitating, coordinating and evaluating the work of several teams or sub-teams
o Monitoring and ensuring efficient and effective use of time and resources
o Observing, supervising, and coaching/mentoring teaching and non-teaching personnel for the purpose of professional growth
o Holding others accountable for their work and behavior while providing them with technical assistance

Transforming (T) Level
o Extending expertise to other school heads
o Providing opportunities for professional growth to teachers and non-teaching personnel
o Institutionalizing efforts or practices that are found to be effective
o Identifying and developing emerging and potential leaders in the school community
o Creating a culture of excellence to enhance peak performance
o Creating an atmosphere where individual differences are celebrated and where every individual grows and maximizes his/her potential


Based on the results of the individual SLEL ratings, a School Head can recognize his/her level of competency across the various domains. Competencies rated at the Awareness (A) or Emerging (E) level require further development and training, while competencies rated at the Transforming (T) level indicate a strength. Ratings at the Practicing (P) and Leading (L) levels indicate satisfactory performance that should be further enhanced.
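As a purely illustrative sketch of the rule just described (the competency codes and descriptors below are hypothetical values, not actual results), the classification can be expressed as follows:

def interpret_slel(descriptor):
    """Classify a competency according to its SLEL descriptor, per the rule above."""
    if descriptor in ("A", "E"):
        return "priority for training and development"
    if descriptor in ("P", "L"):
        return "satisfactory; to be further enhanced"
    return "strength"   # T (Transforming)

results = {"1.1": "E", "1.2": "P", "2.1a": "T"}   # hypothetical individual results
for competency, descriptor in results.items():
    print(competency, "-", interpret_slel(descriptor))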

Sample of Consolidated Triangulation Results

Competency ratings obtained from the triangulation process:

TDNASH   Self-Assessment      Supervisor           Teachers             Weighted Average   Overall   Overall Level of Importance
Item     SLEL (50%)    LOI    SLEL (25%)    LOI    SLEL (25%)    LOI    of SLEL Ratings    SLEL      (Simple Average)
1        2             4      2             5      3             4      2.25               E         4.33 (Less Essential)
2        2             3      3             2      3             2      2.50               P         2.33
3        4             2      3             3      4             4      3.75               L         3.00
4        3             2      2             1      1             1      2.25               E         1.33 (Most Essential)

Overall SLEL descriptors: A - Awareness, E - Emerging, P - Practicing, L - Leading, T - Transforming

Interpretation of the SLEL results must be done with consideration of the corresponding Level of Importance (LOI) rating. The LOI indicates how essential a given competency is to the job of a School Head. Although a School Head may be rated at the same level of performance for two different competencies, the competencies may not have the same degree of importance to the job of the School Head. Thus, the competency rated more essential should be given higher priority for development or training purposes. The example above illustrates that although the School Head has the same SLEL for Competencies 1 and 4, the two competencies do not have the same degree of importance to the job of the School Head. Thus, Competency 4, which is rated more essential, should be given priority for development or training purposes.
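The computation behind the sample table can be illustrated with the short sketch below, which applies the triangulation weights shown in the table (Self-Assessment 50%, Supervisor 25%, Teachers 25%) and the simple LOI average; this is an illustrative sketch only, not an official DepED tool.

def weighted_slel(self_rating, supervisor_rating, teachers_rating):
    # Triangulation weights: Self 50%, Supervisor 25%, Teachers 25%
    return 0.50 * self_rating + 0.25 * supervisor_rating + 0.25 * teachers_rating

def loi_average(self_loi, supervisor_loi, teachers_loi):
    # Simple average of the three Level of Importance ratings
    return (self_loi + supervisor_loi + teachers_loi) / 3

# Competency 1 from the sample: SLEL ratings 2, 2, 3 and LOI ratings 4, 5, 4
print(round(weighted_slel(2, 2, 3), 2))   # -> 2.25 (Emerging)
print(round(loi_average(4, 5, 4), 2))     # -> 4.33 (Less Essential)

# Competency 4 from the sample has the same weighted SLEL (2.25) but an LOI
# average of 1.33 (Most Essential), so it is the higher training priority.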

Consolidated TDNASH results gathered from groups of school heads at the District/Division level may be analyzed in the same manner. The data analysis informs planners of training and development programs as to which priority training programs could be prepared to address low-level competencies that are most essential to the job of school managers.

IV. TDNA CONSOLIDATION DATABASE FOR TDNASH

The TDNA Consolidation Database allows districts and divisions to upload the electronically consolidated triangulated results from the TDNASH. An individual School Head summary result as well as a district/division profile can be generated identifying strengths and priority training and development needs according to TDNASH domains and competencies. Data can be analyzed and


used to inform the School Head's development of an Individual Plan for Professional Development (IPPD) and the Division Master Plan for Professional Development (MPPD).

The TDNA database that supports the Division and regional consolidation of TDNASH data is linked to the web-based TDIS and can be accessed through the EBEIS at http://beis.deped.gov.ph/

The Division TDNA-WG is responsible for ensuring that all Education Supervisors are able to use the e-version of the template for consolidating the triangulation results of individual School Heads. The Supervisors are responsible for managing the consolidation of TDNASH results from all schools within their district. Results at this level should be used to inform District-led training and development activities for School Heads. The District-level consolidated TDNASH electronic results should be submitted to the Division TDNA-WG.

The T&D Chair at the Division level is responsible for managing the TDNA Consolidation Database at the Division level. With the support of the TDNA-WG, district electronic results for the TDNASH are to be consolidated and incorporated into the web-based TDIS. Results should be analyzed to inform Division MPPDs.
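As an illustration only (this is not the TDIS/EBEIS interface, and the file name and column labels below are assumptions), consolidating a district's triangulated results into per-competency averages before analysis could look like the following sketch. The resulting averages can then be read against the SLEL scale and LOI priorities described earlier.

import csv
from collections import defaultdict

def consolidate_district(path="district_tdnash_results.csv"):
    """Average the weighted SLEL rating per competency across all School Heads in a district."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    with open(path, newline="") as f:
        # Each row is assumed to hold one School Head's result for one competency,
        # with columns "competency" and "weighted_slel" (hypothetical layout).
        for row in csv.DictReader(f):
            totals[row["competency"]] += float(row["weighted_slel"])
            counts[row["competency"]] += 1
    return {c: round(totals[c] / counts[c], 2) for c in totals}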

V. MONITORING AND EVALUATION OF THE TDNASH

The TDNA-WG Chair, Co-chairs and selected TDNA-WG members are tasked to monitor and evaluate the administration of the TDNASH. It is not expected that group members will be able to observe the administration of the TDNASH to all School Heads and its validation by teachers; however, they should ensure they have an opportunity to observe the TDNASH administration in approximately 10% of sites.

Monitoring and Evaluation mechanisms and tools have been developed to support the TDNASH process and consist of the following:

T&D-M&E Form 1: Individual Profile Template
TDNASH-M&E Form 1: Division M&E of Conduct of TDNASH
TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template
TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH

The matrix below describes the M&E tools developed for use during the implementation of the TDNASH process. The tools are found in Attachment 2. The TDNA-WG should be convened anew to consolidate the results from the monitoring of the TDNASH and develop recommendations for the improvement of the TDNASH process.

1. What will be monitored: School Heads' details in relation to their current position, their level of experience and qualifications.
   How it will be monitored: All School Heads will be asked to complete the profile.
   M&E tool to be used: T&D-M&E Form 1: Individual Profile Template.
   Who will be responsible for the monitoring: TDNA-WG.
   When the monitoring will take place: Prior to the accomplishment of the TDNASH Tool.
   How the results will be used: Results will be entered into the TDIS database along with the corresponding TDNASH results.

2. What will be monitored: The implementation of the TDNASH process at the school/cluster level.
   How it will be monitored: Members of the Division TDNA-WG will be asked to observe the conduct of the TDNASH at the school/cluster level and complete the tool.
   M&E tool to be used: TDNASH-M&E Form 1: Division Monitoring & Evaluation of the Conduct of TDNASH.
   Who will be responsible for the monitoring: Division TDNA-WG.
   When the monitoring will take place: During the accomplishment of the TDNASH process and the consolidation of results.
   How the results will be used: Results will be collated and analyzed by the TDNA-WG and used to inform future TDNASH processes.

3. What will be monitored: The training and development needs of the School Heads.
   How it will be monitored: The PSDS/ES will be asked to consolidate the School Head TDNASH results for a cluster of schools.
   M&E tool to be used: TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template.
   Who will be responsible for the monitoring: TDNA-WG.
   When the monitoring will take place: After the accomplishment of the TDNASH by a cluster of School Heads.
   How the results will be used: Results will be analyzed and used to inform the Division of the training and development needs of School Heads. Results will be incorporated into the MPPD and DEDP.

4. What will be monitored: The implementation of the TDNASH at the Division level.
   How it will be monitored: A Process Observer will be identified and asked to complete the tool.
   M&E tool to be used: TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH.
   Who will be responsible for the monitoring: Region TDNA-WG.
   When the monitoring will take place: During the TDNASH process at the Division level.
   How the results will be used: Results will be discussed with the Division to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.


ATTACHMENT ONE: TDNASH Tools & Templates

1.1 Training and Development Needs Assessment for School Heads (TDNASH) Tool
1.2 TDNASH Answer Sheet
1.3 Instructions for Supervisors in Answering the TDNASH Tool
1.4 Consolidated Triangulation Results Template for Individual School Head

(Note: e-version available through Division as part of the TDNA Consolidation Database)


Attachment 1.1

TRAINING AND DEVELOPMENT NEEDS ASSESSMENT FOR SCHOOL HEADS (TDNASH) TOOL

INSTRUCTIONS for SELF-Assessment:

1. Complete the Respondent Information on the Separate Answer Sheet provided.

2. Read the competencies for each domain and the supporting behavior indicators. Consider each indicator by reflecting on your past and current practices as a School Head. Recall any evidence that you have in relation to your performance for the stated behavior.

3. On the Answer Sheet, place a check (✓) for Yes or an X for No under the appropriate column to represent your assessment of whether or not you have been consistently demonstrating this behavior in your work. Do this for each indicator.

Example: School Leadership

Competency   SLEL   Indicator   YES = ✓ / NO = X   LOI
1.1          A      1
             E      2
             P      3           X
             P      4
             P      5
             L      6
             L      7
             L      8
             L      9
             T      10
             T      11
             T      12

Note: The Letter Codes, A, E, P, L, T under the SLEL column are intended for the analysis of results. Please do not be distracted by these codes.

4. After responding to each set of indicators, rate each competency in terms of how essential it is in performing the job of a School Head. Write the corresponding number in the appropriate box under the column LOI on the Answer Sheet using the scale below.


Rating Scale for Level of Importance (LOI)

Rating   Interpretation
1        Most Essential
2        More Essential
3        Essential
4        Less Essential
5        Least Essential


5. No time limit is imposed for completing the TDNASH. It is expected it will take approximately 1-2 hours to complete.

6. Please do not leave any item unanswered.

START HERE

DOMAIN 1: SCHOOL LEADERSHIP

COMPETENCY 1.1 Developing and Communicating a Vision

Qualifier Developing a clear vision and communicating it to others

SLEL Indicators

A 1 States and explains the school vision clearly
E 2 Communicates and explains the vision to faculty and students
P 3 Participates in committees which develop and/or revise the vision statement
P 4 Discusses the vision statement with all stakeholders
P 5 Involves stakeholders in crafting the school vision
L 6 Leads in developing the vision among stakeholders
L 7 Facilitates understanding of the vision among stakeholders for the purpose of aligning decisions to it
L 8 Aligns goals and objectives of school activities with the school vision
L 9 Revisits the vision yearly; regularly checks and monitors whether programs are aligned with the vision
T 10 Advocates and shares expertise with other school heads on facilitating vision development among stakeholders
T 11 Creates teams to champion the achievement of the vision of the school
T 12 Together with stakeholders, revisits and checks if school programs and activities are aligned with the vision of the school

COMPETENCY 1.2 Data Based Strategic Planning

Qualifier Creating strategic plans that are driven by data

SLEL Indicators

A 13 Explains the importance of establishing baseline data on all performance indicators, especially on student achievement (SRC)
E 14 Analyzes and interprets student achievement data
E 15 Gathers and establishes baseline data on all performance indicators (SRC)
P 16 Facilitates data usage to develop school goals
P 17 Uses baseline data in formulating school development plans
L 18 Facilitates baseline data usage among staff to develop school-wide goals for school improvement
T 19 Assesses processes and results of programs; modifies and/or improves said programs based on assessment results

COMPETENCY 1.3 Problem Solving


Qualifier Solving problems with an awareness of the educational context

SLEL Indicators

A 20 Identifies school and instructional problems and issues
E 21 Suggests solutions to common school and classroom problems and issues
P 22 Facilitates the resolution of an issue or problem involving relevant parties and makes informed decisions
P 23 Conducts dialogue with concerned parties re problems at hand
L 24 Identifies and assumes responsibility to see that an issue is resolved
L 25 Determines information gathering and decision making strategies
L 26 Collaborates as necessary with relevant parties in developing solutions/decisions and determines if the solution was effective
L 27 Initiates and makes use of steps in solving problems
T 28 Establishes and puts mechanisms in place that prevent the occurrence of potential problems
T 29 Provides creative solutions to problems at hand
T 30 Assesses decisions or solutions made and acts accordingly

COMPETENCY 1.4 Building High Performing Teams

Qualifier Establishing and leading team(s) to achieve positive results

SLEL Indicators

A 31 Follows team decisions
A 32 Seeks suggestions from the team
E 33 Participates in a team experience
E 34 Conducts participatory meetings for decision making
P 35 As a leader, ensures team members' understanding of purpose, tasks, responsibilities, and timeliness for completing the work of the team
P 36 Checks progress of the team in accomplishing its purpose
P 37 Forms different teams or committees for different tasks
P 38 Defines roles and functions of each team or committee
L 39 Ensures that all aspects of the team process allow individuals and groups to exercise professional responsibilities effectively and efficiently
L 40 Holds teams responsible for deadlines and results
L 41 Monitors accomplishment of each team/committee
L 42 Gives feedback on the team's performance
T 43 Creates a culture of excellence where individual differences are celebrated to enhance peak performance of each individual
T 44 Facilitates the development of shared organizational values to achieve high performance
T 45 Develops the group's capabilities in assessing team results and processes; employs synergy to assist/develop peak performing teams
T 46 Searches for and applies creative and innovative assessment tools for peak performance
T 47 Prepares assessment tools (e.g. rubrics) as a result of consultation with the members of the group
T 48 Creates champion teams to assess the accomplishments of the different groups

COMPETENCY 1.5 Coordinating the Work of Others


Qualifier Managing Projects

SLEL Indicators

A 49 Follows directions for project implementation
A 50 Follows directives from the higher office
E 51 Implements a project or event
E 52 Issues a school memo to implement a project
E 53 Forms different committees (List of Committees)
P 54 Coordinates the development, organization, and implementation of a project or event which supports teaching and learning
P 55 Coordinates with concerned staff on the implementation of the project
P 56 Monitors the project (Monitoring Form), documents the project (progress reports, pictorials) and evaluates the project (Project Assessment Form)
L 57 Monitors efficient use of resources and time on a collaborative project directly focused on improving teaching and learning
L 58 Monitors proper allocation of resources (time, fiscal, personnel, IMS, etc.)
L 59 Gives feedback & recommendations (Minutes of meetings, Completion Reports)
L 60 Publishes project results (School Bulletin, School Memo)
T 61 Facilitates the development of self-directed teachers in maximizing resources in improving teaching and learning
T 62 Mobilizes teachers/staff in sustaining a project

COMPETENCY 1.6 Leading and Managing Change

Qualifier Initiating and Managing Change

SLEL Indicators

A 63 Identifies areas that need change, considering the school as an organization and its environment (Form 178, Personal Journal)
E 64 Approves initiatives towards changes for school improvement
P 65 Initiates and manages changes for school improvement, applying change management skills
P 66 Organizes teams to implement changes (minutes of meetings)
P 67 Monitors the implementation of change programs
L 68 Delegates to staff some functions to respond to/address evaluation results
L 69 Evaluates results of change programs and modifies or effects new changes based on results
L 70 Assigns staff to take charge of different components of a program or project
L 71 Observes and applies multi-tasking in giving assignments
T 72 Builds capacities of teachers and staff to manage change
T 73 Sends staff to attend training and seminars to enhance knowledge, skills and attitudes (KSAs)
T 74 Sustains effective practices realized in changes introduced
T 75 Instills in teachers a pro-active stance in identifying, initiating and managing changes

DOMAIN 2: INSTRUCTIONAL LEADERSHIP

COMPETENCY 2.1a Assessment of and for Learning


Qualifier Using data to assess student performance and needs, and for designing programs to address these needs

SLEL Indicators

A 76 Follows department leadership directives on program implementation
A 77 Cites the orders/memoranda on assessment
E 78 Utilizes external assessment data to determine gaps in student achievement
E 79 Utilizes the National, Regional and Division test results as baseline for measuring achievement
P 80 Utilizes multiple assessment measures of student performance to evaluate student learning across a grade/year level or content area, such as standardized tests and performance-based portfolios
P 81 Supports grade/year level programs, or content area planning, to address needs of all learners
P 82 Identifies and prioritizes school programs to address student learning needs such as reading readiness, reading difficulties and the like
L 83 Ensures that relevant data are available and examined regularly to monitor and determine trends and patterns for programmatic attention
L 84 Conducts action research on needs like reading readiness, reading difficulties and the like
L 85 Develops intervention programs such as reading programs, remedial instruction and the like
L 86 Supervises/monitors programs based on results of action research and of other assessment techniques
T 87 Shares innovative assessment procedures with other school heads and sets trends and patterns for benchmarking
T 88 Sustains/institutionalizes intervention programs
T 89 Facilitates the conduct of continuous or regular action research per subject area or year level

COMPETENCY 2.1b Assessment of and for Learning

Qualifier Assessing the effectiveness of curricular programs and/ or instructional strategies

SLEL Indicators

A 90 Gives instructions to teachers to assess the effectiveness of curricular programs
E 91 Evaluates programs or strategies from teacher reaction rather than from student learning outcomes
E 92 Gathers feedback from teachers on programs
E 93 Gives feedback to teachers on programs based on set guidelines
P 94 Evaluates current programs or instructional strategies to assess their effectiveness based on student learning outcomes
P 95 Analyzes and checks programs to determine alignment to learning goals
P 96 Utilizes assessment/test results to improve instruction
L 97 Maintains a school-wide focus on improving learning for all students and all subgroups of students
L 98 Monitors and evaluates program results to determine impact on student achievement
L 99 Schedules monitoring of programs with teachers
L 100 Prepares monitoring reports and acts on the results


T 101 Organizes assessment groups to provide feedback on the effectiveness of curricular programs and strategies

T 102 Shares expertise with other school heads in the development of programs to address assessment needs

COMPETENCY 2.2.a Developing Instructional Programs and/ or Adapting Existing Programs

Qualifier Using research expertise, professional development, and/or other vehicles to assist in developing and implementing a cohesive and coherent school-wide curriculum

SLEL Indicators

A 103 Implements the prescribed BEC curriculum
A 104 Follows school curriculum directives
E 105 Participates in discussions on standards-based instruction and assessment
E 106 Explains standards-based instruction and assessment

P 107 Facilitates discussions and process of aligning grade or year level and/or content area curriculum to set standards

P 108 Facilitates grade or year level teams to determine common learning strategies

P 109 Conducts meetings with subject area coordinators/heads to align content per subject area with standards

L 110 Coordinates the alignment of assessment, instruction, and resources to the set Learning Standards and Benchmarks

L 111 Facilitates selection of common learning strategies across and within year or grade levels and content areas

L 112 Designs assessment tools with department heads/ subject area coordinators on the assessment and instruction

T 113 Provides expertise to other schools in assessing their curriculum programs or projects to achieve their vision

T 114 Develops research-based programs and projects considering school context

T 115 Serves as Resource Speaker re programs implemented (supported by the documents)

COMPETENCY 2.2b Developing Instructional Programs and/ or Adapting Existing Programs

Qualifier Addressing deficiencies and sustaining successes of current programs

SLEL Indicators

A 116 Explains and supports the need to address student achievement gaps
E 117 Identifies gaps and successes in the school instructional program
P 118 Facilitates the use of effective teaching-learning materials to address deficiencies
P 119 Monitors the implementation of strategies and use of materials to identify barriers and address them with new interventions
L 120 Analyzes test results
L 121 Creates effective interventions based on results of analysis
T 122 Provides assistance to other schools in creating effective interventions
T 123 Extends technical assistance to other schools
T 124 Conducts training programs/interventions to address identified learning gaps (use of SIM)

COMPETENCY 2.2c Developing Instructional Programs and/or Adapting Existing Programs

Qualifier Developing a culture of literacy

SLEL Indicators

A 125 Follows and implements DepED guidelines and procedures regarding literacy programs
E 126 Participates in and attends meetings towards developing school-wide literacy efforts or programs
P 127 Facilitates faculty discussions on literacy and on how to develop a school-wide culture of literacy
L 128 As a leader, contributes extensively to a school-wide literacy program
L 129 Oversees school-wide literacy programs
L 130 Monitors and evaluates the effectiveness of literacy programs in the school
T 131 Organizes teams or groups to sustain effective literacy programs in the school
T 132 Provides opportunities to all stakeholders to be involved in literacy programs

COMPETENCY 2.3 Implementation of instructional programs

Qualifier Organizing for instructional coherence

SLEL Indicators

A 133 Identifies instructional standards in lesson plans
A 134 Observes classes to identify instructional problems (Form 178)
E 135 Aligns curriculum to learning standards
P 136 Works with teachers in curriculum review per year level/subject area
L 137 Facilitates both vertical and horizontal team collaboration for curricular coherence
L 138 Involves teachers in reinforcing/enriching curricular offerings
T 139 Organizes teams to champion continuous implementation of instructional programs towards curricular coherence

DOMAIN 3: CREATING A STUDENT - CENTERED LEARNING CLIMATE

COMPETENCY 3.1a High Social & Academic Expectations

Qualifier Establishing and modeling high social and academic expectations for all

SLEL Indicators

A 140 Explains the need for teachers to have high expectationsA 141 Sets school targetsE 142 Models high social and academic expectations for all studentsE 143 Identifies practices of high performing schools that can be adapted in the schoolP 144 Conceptualizes school specific interventions that address identified problems

P 145 Establishing and implementing school practices that support high academic and social expectations for all students

P 146 Implements programs or interventions (social and academic)
P 147 Replicates best practices of high performing schools
L 148 Institutes practices found to effectively support a student-centered learning climate
L 149 Engages faculty in setting high social & academic expectations
L 150 Conducts regular monitoring and evaluation of student performance
T 151 Establishes structures and influences faculty and stakeholders to embrace and demonstrate shared commitments to high academic and social expectations
T 152 Develops advocates towards high social & academic expectations

T 153 Institutionalizes practices that have positively contributed to the attainment of target

T 154 Provides incentive system for high performing teachers

COMPETENCY 3.1b High Social & Academic Expectations

Qualifier Creating an engaging learning environment (e.g. differentiated learning opportunities, after-school programs, clubs, etc.)

SLEL Indicators

A 155 Explains the importance of Student – Centered Learning Environment (SCLE)E 156 Identifies and implements school programs to enhance learning environmentE 157 Supports the use of engaged learning strategies

P 158 Provides feedback to classroom teachers on ways to make the learning environment more student-centered for all learners

P 159 Manages programs and provides feedback to teachers on engaged learning strategies

P 160 Conducts periodic assessment of programs on improving the learning environment and provides feedback

L 161Facilitates dialogue through a variety of means with the faculty about how to develop and maintain student –centered classroom and school environments that support the needs of all learners

L 162 Ensures faculty understanding and implementation of engaged learning models

L 163 Conducts SLAC / SBTP (school-based training program) and meetings on improving the learning environment continuously

L 164 Participates actively in curricular and co-curricular activities (Teachers and students)

T 165 Instills pride & confidence among students in achieving high social & academic expectations

T 166 Empowers teachers to build students’ commitment to life-long learning

T 167 Creates and provides intrinsic and extrinsic rewards to teachers and programs designed to improve the learning environment

COMPETENCY 3.2a Creating school environments focused on the needs of the learner

Qualifier Contributing to the creation of a safe and orderly environment

SLEL Indicators

A 168 Follows school guidelines regarding safety proceduresA 169 Explains existing guidelines on safety proceduresE 170 Monitors and implements behavior management systemE 171 Formulates guidelines on behavior management and safety standards

P 172 Establishes student behavior management systems that address inappropriate behavior and acknowledge

P 173 Implements guidelines on safety procedures and behavior managementP 174 Establishes functional guidance program for behavior management

L 175 Guides teachers, staff, students and parents in understanding and taking ownership of rules and procedures in supporting learning for all students

L 176 Conducts conferences with parents , students and faculty regarding safety procedures and behavior management


T 177 Influences the school community to take ownership of rules & procedures in support of student learning

T 178 Involves the School Governing Council (SGC) in the reformulation and implementation of school policies

T 179 Mobilizes community support in sustaining programs and projects for behavior management and safety

COMPETENCY 3.2b Creating school environments focused on the needs of the learner

Qualifier Contributing to the creation of a nurturing and supportive environment

SLEL Indicators

A 180Follows directions on using appropriate models to enhance classroom learning environments

E 181 Models and promotes attitudes and behaviors that are supportive of caring for and towards students

E 182 Practices Child Friendly School System ( CFSS )

P 183 Facilitates dialogue through a variety of means with the faculty on how to improve and/or enhance learning for all students

P 184 Conducts consultative conferences with faculty

L 185Conducts information dissemination campaign to teachers and parents to promote understanding and taking ownership of factors that contribute to learning environments focused on the needs of all children

T 186 Introduces innovations in and mechanisms towards enhancing learning for all types of learners

T 187 Applies innovations that cater to all types of learners

DOMAIN 4: PROFESSIONAL DEVELOPMENT & HR MANAGEMENT

COMPETENCY 4.1a Creating a professional learning community

Qualifier Mentoring and coaching existing employees and facilitating the induction of new hires

SLEL Indicators

A 188 Welcomes new staff and facultyE 189 Supports new teachers in becoming familiar with school routing proceduresE 190 Provides informal guidance and feedback to faculty and staffE 191 Orients newly hired teachers and staff on School routine procedures / policiesP 192 Integrating new teachers into the school’s culture

P 193 Supports a new teachers in understanding and addressing performance expectations

P 194 Provides formal guidance and feedback to faculty and staffP 195 Involves new teachers in school activities/programs P 196 Administers evaluation instrument to gather feedbackP 197 Briefs teachers on PAST and its indicatorsL 198 Engages teachers in a self- analysis process that foster self-reflection and growthL 199 Conducts evaluation sessions with teachersL 200 Holds a regular program coaching with teachers

T 201 Establishes mechanisms to develop teacher-leaders to coach and mentor other teachers

T 202 Creates an atmosphere for teachers to be responsible for their growth through self-analysis and self-reflection

T 203 Providing rewards and recognition of teachers’ growth like initiating a yearly search for Outstanding Teachers with community involvement

T 204 Gives incentives to deserving teachers

COMPETENCY 4.1b Creating a professional learning community

Qualifier Assessing the needs of faculty members and collaboratively designing and leading professional development programs to meet these needs

SLEL Indicators

A 205Accomplishes assessment instruments regarding development needs, e.g. TDNASH-Training and Development Needs Assessment for School Heads

E 206Conducts assessment of teachers to determine professional development needs

and interest through the National Competency-Based Teacher Standards (NCBTS) Teachers Strengths and Needs Assessment

P 207 Works with teachers to plan professional development targeted to address identified needs

P 208 Plans intervention programs with the staff to address identified needs

P 209 Implements professional development programs like INSET, Scholarships, for teachers and staff

L 210 Monitors and evaluates both the professional needs of the faculty and the effectiveness of professional development programs

L 211 Ensures that time and resources spent on professional development are focused on adult learning that will improve student performance outcomes

T 212Provides opportunities for teachers to assess their strengths and weaknesses as bases for their Individual Plan for Professional Development (IPPD) that maximize time and resources

T 213 Provides guidelines for the utilization of teachers’ self- assessment results

COMPETENCY 4.1c Creating a professional learning community

Qualifier Aligning professional development plans with strategic goals

SLEL Indicators

A 214 Attends seminars, training, workshops etc for professional development

E 215 Encourages teachers to direct their individual plan for professional development (IPPD) or learning management plan (LMP) plans towards targeted needs

E 216 Plans and delivers professional development activities that are aligned to goals for improving teaching and learning

E 217 Assists teachers in the preparation of their IPPDs or LMPsP 218 Aligns IPPD/LMP focus with identified needs of teachers

P 219 Collaborates with staff on developing plans that are aligned with the school’s goals for improving teaching and learning

P 220 Assesses teachers IPPDs/LMPs whether they are aligned to school goals

L 221 Guides and supports continuous professional development that is results –oriented and research-based and that supports the school improvement process

L 222 Provides time and funds for teachers to conduct action research

T 223 Builds teachers’ competence in implementing their IPPDs, that are aligned with the school’s strategic goals

T 224 Conducts capability building through INSET / LACS and other forms of professional development


COMPETENCY 4.1d Creating a professional learning community

Qualifier Analyzing your career development needs as a leader and learner and seeking opportunities to address those needs

SLEL Indicators

A 225Explains the importance of establishing the IPPD/LMP in the School to teachers and staff

E 226 Participates in and evaluates his/her own professional growth through mandated professional development activities

E 227 Prepares and accomplishes his/her own individual Plan for Professional Development (IPPD)

P 228 Participates in a variety of personal professional development as a means of demonstrating a commitment to lifelong learning

P 229 Implements and monitors progress of his/her IPPD

P 230 Provides support for the development and implementation of the School Plan for Professional Development (SPPD)

P 231 Assists teachers in developing their IPPD

L 232 Participates in professional development via association memberships, professional writing, conference, and leadership coursework

L 233 Considers teachers who are members of professional clubs, organizations, and who have published articles, as plus factors in the PAST

T 234 Provides opportunities for teachers to assess their performance vis-à-vis their own needs in planning their career paths

T 235 Advocates for Life-long Learning through benchmarkingT 236 Helps other school heads develop and operationalize their SPPD

Competency 4.2 Recruiting and Hiring

Qualifier Providing technical input for improvement in the staff selection process to meet school needs (e.g. needs identification, interviewing, hiring, etc.)

SLEL Indicators

A 237Explains the basic qualification standards in hiring and recruiting teachers /staff to others

E 238 Participates in one aspect of the recruiting and hiring process

E 239 Creates a School Selection and Promotion Board and explains the members’ role and responsibility

P 240 Participates in several aspects of the recruiting and hiring process

P 241 As chair of the Selection and Promotion Board in the school, implements the guidelines set by DepED

L 242 Participates in several aspects of the recruiting and hiring process and demonstrating leadership for one or more of those components

L 243 Offers suggestions on how to improve the recruitment and hiring system in the school and in the Division

T 244 Recommends better ways and means to improve recruitment, hiring and performance appraisal of teachers.

T 245 Builds capacities of school ranking committee members and other school heads in conducting interviews of teacher-applicants

T 246 Trains members of the Selection Committee in Recruitment & Hiring

COMPETENCY 4.3 Teacher Observation and Instructional Supervision

Qualifier Conducting classroom observations for the purpose of instructional improvement and/or teacher evaluation

SLEL Indicators

A 247 Observes classroom instruction with a supervisor coaching him/her
E 248 Prepares a supervisory plan
P 249 Supervises and evaluates classroom instruction according to Division guidelines
P 250 Conducts classroom observation (STAR, TISS, etc.)
P 251 Conducts a post-conference after the actual observation
L 252 Provides timely, accurate and specific feedback to teachers regarding performance

L 253 Fosters and encourages professional growth of individuals and staff based on observation data

T 254 Projects instructional competence and charisma to encourage teachers to seek instructional support

T 255 Provides guidelines on how to conduct observations and instructional supervision

COMPETENCY 4.4 Performance management of Teachers and Staff

Qualifier Managing the performance of all teachers and staff from performance goal setting to performance evaluation

SLEL Indicators

A 256Allows teachers and staff to set their own performance goals and to assess their own performance

E 257 Sets performance goals for each direct subordinate

E 258 Identifies and deputizes specific persons to help manage the performance of other teachers and staff

P 259 Sets performance goals for each subordinate and coaching them for commitment

P 260 Conducts performance dialogue and feedback at the end of each performance period

P 261 Coaches deputized staff once in a while on how to manage performance of staff

L 262 Sets & resets performance goals for each subordinate and coaching them for commitment

L 263 Conducts performance dialogue and feedback at the accomplishment of each performance goal

L 264 Coaches deputized staff regularly on how they manage the performance of the staff assigned to them

T 265 Sets up systems to facilitate management of performance of all teachers and staff

T 266 Models the conduct of performance management from goal setting to coaching to performance evaluation

T 267 Provides follow-up sessions to hone the skills of deputized staff on managing performance of those assigned to them


DOMAIN 5: PARENT INVOLVEMENT & COMMUNITY PARTNERSHIP

COMPETENCY 5.1a

Parental Involvement

Qualifier Creating parent/family partnerships to support student peak performance

SLEL Indicators

A 268 Attends PTCA meeting or at least sends authorized representative

E 269 Participates or facilitates some aspect of an event involving parents’ support of student learning

E 270 Presents priority projects to PTCA for fund raisingP 271 Manages and/or implements opportunities for parents to support student learning

P 272 Designates key personnel who would assist in the implementation of programs and projects of PTCA

P 273 Extends advice and suggestions to PTCAs re programs and projectsL 274 Designs opportunities for parents to support student learning

L 275 Involves parents in planning and designing of programs and projects to support student learning

L 276 Creates monitoring / evaluating team with parents in programs / projects

T 277 Institutionalizes structures for parents to own responsibility in supporting student learning

T 278Creates working committees with parents in the implementation and assessment of programs and projects

COMPETENCY 5.1b Parental Involvement

Qualifier Creating a climate that supports parental involvement

SLEL Indicators

A 279 Provides communications to parents regarding a classroom topicA 280 Schedules the issuance of reports cards/homeroom PTCA meetingE 281 Provides communications to parents regarding a school-wide topicE 282 Providing assistance to teachers in organizing PTCA meetings

E 283 Conducts advocacy / information campaign to parents or school programs and projects

P 284 Creates & maintains communication with parents to solicit ideas, suggestions and opinions

P 285 Develops strategies to make parents feel welcome when they come to the schoolP 286 Solicits feedback or data from parents regularly

L 287 Establishes an open communication with the staff, students, parents, and community, including PTCA organizations, etc.

L 288 Establishes functional School Governing Council (SGC) and conducts regular meetings with them

L 289 Maintains a harmonious working relationship with parents and other stakeholders

T 290 Conducts school summit with stakeholders at the beginning and at the end of the school year

T 291 Organizes champion teams among parents to M&E programs and projects


COMPETENCY 5.2

External Community Partnership

Qualifier Creating opportunities for external community involvement/partnership

SLEL Indicators

A 292Participates in some aspect of a program or event that involves an external community partner

A 293 Attends to barangay meetings on educational programs or projects / meeting

E 294 Facilitates some aspect of a program or event that involves an external community partner

E 295 Participates actively in community activities

P 296 Manages and/or implements programs and events in which the community and school interact and share resources to support school improvement goals

P 297 Establishes strong linkages with other sectors by complying with the terms and conditions stated in the MOA or any other similar instrument

L 298 Develops and sustains ongoing partnership(s) and the school for the purpose of focusing and using resources to support school improvement goals

L 299 Taps NGOs to support and provide assistance to existing school programs and projects

T 300 Forges strong partnerships with external community towards focusing and maximally using resources to support school improvement goals

T 301Involves external community in developing plans

T 302 Establishes stronger partnership with sponsoring agencies in forging continuing assistance

DOMAIN 6: SCHOOL MANAGEMENT AND DAILY OPERATIONS

COMPETENCY 6.1 Managing Daily Operations

Qualifier Participating in the management of the school facility and daily operation

SLEL Indicators

A 303 Follows daily routines and procedures established by school leadershipA 304 Checks the proper use of school facilities and equipment

E 305 Participates in monitoring or managing some aspect of the daily operation of the school(s)

E 306 Provides support on one aspect of school plant operations for safety, security, and cleanliness

E 307 Prepares list of school facilities needing repair (faulty wirings, etc.)E 308 Identifies guidelines on proper use of school facilities / equipment

P 309 Establishes and implements school structures, timelines, and processes that support academic environments

P 310 Monitors some aspect(s) of school plant operations for safety, security, and cleanliness

P 311 Hires and assigns support personnel to do the needed work

P 312 Monitors the operations of school plant facilities for safety, security and cleanliness

L 313 Delegates some management functions to key personnel in the school and coaches them in the process

L 314 Manages school plant operations to create and maintains a safe, secure, and clean learning environment.

L 315 Assigns property custodian or appropriates school personnel to manage school plant operations

L 316 Oversees school plant operations and care and use of school facilities according to set guidelines

T 317Contributes to the development of organizational structures, practices and policies that complement and enhance each other in support of improving academic environments

T 318 Institutionalizes effective practices in managing school operations thereby creating a safe and secure learning environment

COMPETENCY 6.2a Fiscal Management

Qualifier Developing a school budget

SLEL Indicators

A 319 Follows budget decisions made by school leadershipE 320 Participates in budget discussions and decisionsE 321 Reviews / discusses budget proposal with teachersP 322 Aligns decisions about resource allocation with school improvement goals P 323 Identifies fund sources (MOOE, SB, Solicitations, etc)

L 324 Develops creative ways to obtain, allocate and conserve resources to support school improvement goals within the limits of board policy

L 325 Establishes tie-ups, linkages with LGUs, NGOs to provide funds for project implementation

T 326 Generates resources to support school improvement programs and activitiesT 327 Initiates income generating projects with community support

COMPETENCY 6.2b Fiscal Management

Qualifier Monitoring a budget and/or demonstrating knowledge of DepED financial policies and procedures

SLEL Indicators

A 328 Monitors budget implementation based on DepED guidelineE 329 Participates in monitoring some aspects of the school budgetE 330 Reviews / discusses budget proposal with teachersP 331 Monitors and evaluates school budgetP 332 Allocates funds according to priority needs P 333 Utilizes resources judiciouslyL 334 Monitors and revises the school budget according to results from monitoringL 335 Ensuring that all financial records are up-to-date and ready for audit at any timeL 336 Keeps accurate and up-to-date record of financial expenditures T 337 Voluntarily submits to regular auditing and scrutiny of all financial recordsT 338 Acquires new / improved school facilities with support with other stakeholder

T 339Publishes financial statement regularly for transparency

COMPETENCY 6.3 Use of Technology in the Management of Operations


Qualifier Applying TQM, Technology Management, Project Management, and/or Knowledge Management

SLEL Indicators

A 340Attends conferences or seminars about new technologies (TQM, Knowledge Management) that would be of help in managing daily operations

E 341 Seeks external help to introduce a technology in the schoolE 342 Organizes training for teachers and seeks sponsors for said trainingE 343 Invites Resource Persons-specialists in said or identified technologyE 344 Prepares Information Technology and plans for online CommunicationP 345 Implementing at least one technology in the management of operations

P 346 Sends memo to teachers on the use of new technology or online communication e.g. e-mails, e-grades

P 347 Implements the use of the new technologyP 348 Coordinates the development of instructional materials for the new technologyL 349 Advocates the use of technology in the management of operations.L 350 Facilitates internet use: e mail; e-library; e-conferencing; online grade

L 351 Conducts monitoring and evaluation on the use of new technology and submits recommendations

T 352Installs varied and appropriate technology in the management of operations

T 353Shares effective management practices with other school heads

T 354Shares with other principals the school experience in the use of new technology and their developed materials

DOMAIN 7: PERSONAL INTEGRITY AND INTERPERSONAL EFFECTIVENESS

COMPETENCY 7.1 Professionalism and Respect

Qualifier Creating and nurturing an atmosphere of professionalism and respect

SLEL Indicators

A 355 Models personal and professional behavior and respect

A 356Observes desirable professional and personal behavior such as honesty, dedication, patriotism, and genuine concern for others

E 357 Models personal and professional behavior and respect at all times

E 358 Demonstrates desirable professional and personal behavior such as honesty, dedication, patriotism and genuine concern for others most of the time

P 359 Promotes the expectation that staff members demonstrate personal and professional behavior and respect at all times

P 360 Issues school memo/circular on demonstration of desirable professional and personal behavior

L 361Builds and sustains a learning environment which sets high expectations for staff members professional growth, conduct, and dialogue, all focused on the needs of the children

L 362 Develops programs and projects to build and sustain a learning environment which sets high standard for staff members’ personal and professional behavior


L 363 Conducts teacher conferences to promote expectations that they demonstrate desirable professional and personal behavior

T 364 Enables teachers and staff to build and sustain a nurturing learning environment in their area of responsibility

T 365 Trains teachers in building and sustaining a nurturing learning environment in their area of responsibility through INSET or LACs

COMPETENCY 7.2a Communication

Qualifier Communicating effectively in writing to groups and individuals

SLEL Indicators

A 366Writes appropriately to different school audiences - teachers, pupils, parents and other stakeholders using the correct format

E 367Writes appropriate communication at all times to teachers, pupils, parents and other stakeholders following the correct format

P 368 Checks drafts of communication for clarity, correctness and appropriatenessL 369 Sets and models high standards for all written communication from the schoolL 370 Edits all written communication to conform with standards setT 371 Equips peers and their staff with high standards in all communications

T 372 Trains teachers and staff and even peers in high standards of written communication

COMPETENCY 7.2b Communication

Qualifier Speaking effectively to groups and individual

SLEL Indicators

A 373Speaks appropriately to different school audiences and explains the importance of speaking effectively to school audiences

E 374 Speaks appropriately to different school audiences at all timesP 375 Makes formal presentations/ speeches to different audiencesP 376 Presents views clearly to all types of audiencesL 377 Reflects on speaking experiences and refining presentation skills

L 378 Conducts self-assessment on speaking experience and improved presentation skills for teachers

T 379 Helps other people develop oral presentation skillsT 380 Conducts workshops on oral presentation to teachers and staff

COMPETENCY 7.2c Communication

Qualifier

Listening actively to all stakeholders’ needs and concerns and responding appropriately (in consideration of the political, social, legal and cultural context)

SLEL Indicators

A 381 Explains the importance of listening activelyE 382 Modeling active and responsive listening at all times

E 383 Attends to concerns of teachers, parents, pupils’ grievances, issues, opinions, etc at all times

P 384 Facilitates opportunities for faculty members to listen actively and responsively to each other

P 385 Provides opportunities for teachers to listen actively and responsively to each other such as open forum, round table discussion, etc.

L 386 Sets and models high standards for faculty to listen and respond appropriately to each other, students and parents

L 387 Develops practical guidelines for teachers, parents and pupils in listening actively and dealing with other stakeholders

L 388 Develops a feedback mechanism in the school

T 389 Models high standards for faculty and staff to listen and respond appropriately to each other, students and parents and the rest of the school community

T 390 Trains other stakeholders and co-school heads/teachers of other schools in listening skills

COMPETENCY 7.3 Interpersonal Sensitivity

Qualifier Interacting appropriately with a variety of audiences

SLEL Indicators

A 391 Models courteous and respectful interactionsA 392 Accepts opinions of others without passing judgmentE 393 Models courteous and respectful interactions at all timesE 394 Gives positive criticism to teachers, students and parents based on guidelinesE 395 Exercises self-control when faced with some issues and concerns

P 396 Facilitates and promotes courteous and respectful interactions among faculty, students and parents

L 397 Sets high standards for courteous and respectful interactions throughout school community

L 398 Formulates standards and evaluates programs/activities that promote courtesy

T 399 Models high standards for courteous and respectful interactions throughout the school community

T 400 Provides opportunities to others to manifest high standards of decorum for all interactions

T 401Extends programs /activities to other stakeholders and to other schools

COMPETENCY 7.4 Fairness and Integrity

Qualifier Demonstrating and understanding of the relationship between acting with integrity and fairness and the role of a school leader

SLEL Indicators

A 402 Cites the importance of fairness, honesty and integrity in all school practices.E 403 Acts with integrity and fairness in a leadership role

E 404 Demonstrates integrity, honesty and fairness in all school practices and in making decisions

P 405 Develops guidelines in helping teachers and staff to sustain fairness, honesty and integrity in all school practices

L 406 Holds individuals accountable for acting with integrity and fairnessL 407 Establishes management control meetings to hear and check on each personT 408 Creates a culture of accountability, fairness and integrityT 409 Models and manifesting adherence to set behavior and performance standards

T 410 Sets up an Awards System for teachers and staff to sustain integrity honesty and fairness in all school practices


Attachment 1.2: TDNASH Answer Sheet

School Head Name: ________________________ School: ___________________

Respondent: School Head Supervisor: Teachers:

For each competency, place a ✓ (Yes) or an X (No) beside each SLEL indicator number, and record the Level of Importance (LOI) rating for the competency.

Domain 1: School Leadership
1.1: A 1, E 2, P 3, P 4, P 5, L 6, L 7, L 8, L 9, T 10, T 11, T 12
1.2: A 13, E 14, E 15, P 16, P 17, L 18, T 19
1.3: A 20, E 21, P 22, P 23, L 24, L 25, L 26, L 27, T 28, T 29, T 30
1.4: A 31, A 32, E 33, E 34, P 35, P 36, P 37, P 38, L 39, L 40, L 41, L 42, T 43, T 44, T 45, T 46, T 47, T 48
1.5: A 49, A 50, E 51, E 52, E 53, P 54, P 55, P 56, L 57, L 58, L 59, L 60, T 61, T 62
1.6: A 63, E 64, P 65, P 66, P 67, L 68, L 69, L 70, L 71, T 72, T 73, T 74, T 75

Domain 2: Instructional Leadership
2.1a: A 76, A 77, E 78, E 79, P 80, P 81, P 82, L 83, L 84, L 85, L 86, T 87, T 88, T 89
2.1b: A 90, E 91, E 92, E 93, P 94, P 95, P 96, L 97, L 98, L 99, L 100, T 101, T 102
2.2a: A 103, A 104, E 105, E 106, P 107, P 108, P 109, L 110, L 111, L 112, T 113, T 114, T 115
2.2b: A 116, E 117, P 118, P 119, L 120, L 121, T 122, T 123, T 124
2.2c: A 125, E 126, P 127, L 128, L 129, L 130, T 131, T 132
2.3: A 133, A 134, E 135, P 136, L 137, L 138, T 139

Domain 3: Creating a Student-Centered Learning Climate
3.1a: A 140, A 141, E 142, E 143, P 144, P 145, P 146, P 147, L 148, L 149, L 150, T 151, T 152, T 153, T 154
3.1b: A 155, E 156, E 157, P 158, P 159, P 160, L 161, L 162, L 163, L 164, T 165, T 166, T 167
3.2a: A 168, A 169, E 170, E 171, P 172, P 173, P 174, L 175, L 176, T 177, T 178, T 179
3.2b: A 180, E 181, E 182, P 183, P 184, L 185, T 186, T 187

Domain 4: Professional Development & HR Management
4.1a: A 188, E 189, E 190, E 191, P 192, P 193, P 194, P 195, P 196, P 197, L 198, L 199, L 200, T 201, T 202, T 203, T 204
4.1b: A 205, E 206, P 207, P 208, P 209, L 210, L 211, T 212, T 213
4.1c: A 214, E 215, E 216, E 217, P 218, P 219, P 220, L 221, L 222, T 223, T 224
4.1d: A 225, E 226, E 227, P 228, P 229, P 230, P 231, L 232, L 233, T 234, T 235, T 236
4.2: A 237, E 238, E 239, P 240, P 241, L 242, L 243, T 244, T 245, T 246
4.3: A 247, E 248, P 249, P 250, P 251, L 252, L 253, T 254, T 255
4.4: A 256, E 257, E 258, P 259, P 260, P 261, L 262, L 263, L 264, T 265, T 266, T 267

Domain 5: Parent Involvement & Community Partnership
5.1a: A 268, E 269, E 270, P 271, P 272, P 273, L 274, L 275, L 276, T 277, T 278
5.1b: A 279, A 280, E 281, E 282, E 283, P 284, P 285, P 286, L 287, L 288, L 289, T 290, T 291
5.2: A 292, A 293, E 294, E 295, P 296, P 297, L 298, L 299, T 300, T 301, T 302

Domain 6: School Management and Daily Operations
6.1: A 303, A 304, E 305, E 306, E 307, E 308, P 309, P 310, P 311, P 312, L 313, L 314, L 315, L 316, T 317, T 318
6.2a: A 319, E 320, E 321, P 322, P 323, L 324, L 325, T 326, T 327
6.2b: A 328, E 329, E 330, P 331, P 332, P 333, L 334, L 335, L 336, T 337, T 338, T 339
6.3: A 340, E 341, E 342, E 343, E 344, P 345, P 346, P 347, P 348, L 349, L 350, L 351, T 352, T 353, T 354

Domain 7: Personal Integrity and Interpersonal Effectiveness
7.1: A 355, A 356, E 357, E 358, P 359, P 360, L 361, L 362, L 363, T 364, T 365
7.2a: A 366, E 367, P 368, L 369, L 370, T 371, T 372
7.2b: A 373, E 374, P 375, P 376, L 377, L 378, T 379, T 380
7.2c: A 381, E 382, E 383, P 384, P 385, L 386, L 387, L 388, T 389, T 390
7.3: A 391, A 392, E 393, E 394, E 395, P 396, L 397, L 398, T 399, T 400, T 401
7.4: A 402, E 403, E 404, P 405, L 406, L 407, T 408, T 409, T 410

END of Assessment

To be signed by School Head if Answer Sheet is for Self-Assessment:

Name of Respondent Signature Designation Date

To be signed by the Supervisor if the Answer Sheet is accomplished by the Supervisor as respondent:

Name of Respondent Signature Designation Date

To be signed by Teachers if Answer Sheet is used for the GCA:

Name of Respondent Signature Designation Date


Attachment 1.3: INSTRUCTIONS for Supervisors in answering the TDNASH Tool

1. Complete the Respondent Information on the Answer Sheet provided.

2. Reflect on the competencies of the School Head and consider their strengths and the areas where you think they need further assistance to become a more effective School Head.

3. Read the competencies for each domain and the supporting behavior indicators. Consider each indicator by reflecting on the School Head and the evidence you have in relation to his/her ability to consistently demonstrate the stated behavior.

4. On the Answer Sheet, place a ✓ for Yes or an X for No to represent your assessment of whether or not the School Head consistently demonstrates this behavior in his/her work. Do this for each indicator.

5. Rate each competency in terms of how essential you believe it is for the School Head to be able to perform his/her job. Write the corresponding number in the appropriate box under the LOI column on the Answer Sheet using the scale below.

6. No time limit is imposed for completing the TDNASH. It is expected that it will take approximately 1 to 2 hours to complete.

7. Submit the TDNASH results to the TDNASH Administrator for consolidation.

8. Thank you for your cooperation.


Rating Scale for Level of Importance (LOI)

1 - Most Essential
2 - More Essential
3 - Essential
4 - Less Essential
5 - Least Essential
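Where answer sheets are encoded electronically, the marks from step 4 and the LOI ratings from step 5 can be tallied per competency before consolidation. The sketch below is a minimal illustration only: the rule it uses to suggest a School Leadership Experience Level (the highest level on the continuum whose indicators are all marked Yes) is an assumed simplification rather than the prescribed scoring procedure, and the indicator data are hypothetical.

```python
# Minimal sketch: tallying one competency's answer-sheet marks.
# Indicators sit on the A-E-P-L-T continuum; True = check (Yes), False = X (No).
# The "highest level whose indicators are all Yes" rule is an assumption made
# here for illustration only, not the prescribed TDNASH scoring procedure.

LEVEL_ORDER = ["A", "E", "P", "L", "T"]

def tentative_slel(marks):
    """marks maps each SLEL level to the list of Yes/No marks for its indicators."""
    result = "A"  # defaults to Awareness if even the A indicators are not all Yes
    for level in LEVEL_ORDER:
        if marks.get(level) and all(marks[level]):
            result = level
        else:
            break
    return result

# Hypothetical marks for competency 1.2 (answer-sheet indicators 13-19).
competency_1_2 = {
    "A": [True],         # indicator 13
    "E": [True, True],   # indicators 14-15
    "P": [True, False],  # indicators 16-17
    "L": [True],         # indicator 18
    "T": [False],        # indicator 19
}
print(tentative_slel(competency_1_2))  # -> E
```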


Attachment 1.4: Consolidated Triangulation Results Template for Individual School Head

School Head Name: _____________________________________________ Date: ____________

For each TDNASH competency item, record the SLEL and LOI ratings obtained from the triangulation process (Self-Assessment, Supervisor, Teachers), the Overall School Leadership Experience Level (SLEL) taken as the weighted average of the SLEL ratings (A - Awareness, E - Emerging, P - Practicing, L - Leading, T - Transforming), and the Overall Level of Importance (LOI) taken as the average LOI.

TDNASH Competency Items

DOMAIN 1: SCHOOL LEADERSHIP
1.1 Developing and Communicating a Vision
1.2 Data-based Strategic Planning
1.3 Problem-Solving
1.4 Building High Performance Teams
1.5 Coordinating the Work of Others
1.6 Leading & Managing Change

DOMAIN 2: INSTRUCTIONAL LEADERSHIP
2.1a Assessment of and for Learning
2.1b Assessment of and for Learning
2.2a Developing Instructional Programs and/or Adapting Existing Programs
2.2b Developing Instructional Programs and/or Adapting Existing Programs
2.2c Developing Instructional Programs and/or Adapting Existing Programs
2.3 Implementation of Instructional Programs

DOMAIN 3: CREATING A STUDENT-CENTERED LEARNING CLIMATE
3.1a High Social & Academic Expectations
3.1b High Social & Academic Expectations
3.2a Creating School Environments Focused on the Needs of the Learner
3.2b Creating School Environments Focused on the Needs of the Learner

DOMAIN 4: PROFESSIONAL DEVELOPMENT & HR MANAGEMENT
4.1a Creating a Professional Learning Community
4.1b Creating a Professional Learning Community
4.1c Creating a Professional Learning Community
4.1d Creating a Professional Learning Community
4.2 Recruiting and Hiring
4.3 Teacher Observation and Instructional Supervision
4.4 Performance Management of Teachers and Staff

DOMAIN 5: PARENT INVOLVEMENT & COMMUNITY PARTNERSHIP
5.1a Parental Involvement
5.1b Parental Involvement
5.2 External Community Partnership

DOMAIN 6: SCHOOL MANAGEMENT AND DAILY OPERATIONS
6.1 Managing Daily Operations
6.2a Fiscal Management
6.2b Fiscal Management
6.3 Use of Technology in the Management of Operations

DOMAIN 7: PERSONAL INTEGRITY AND INTERPERSONAL EFFECTIVENESS
7.1 Professionalism and Respect
7.2a Communication
7.2b Communication
7.2c Communication
7.3 Interpersonal Sensitivity
7.4 Fairness and Integrity
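Where the triangulation template above is kept electronically, the weighted averaging can be scripted along the lines of the sketch below. It is illustrative only: the 1-to-5 numeric mapping of the SLEL levels and the respondent weights (self, supervisor, teachers) are assumptions made for demonstration, not values taken from this manual.

```python
# Minimal sketch: consolidating triangulation ratings for one competency item.
# Assumptions for illustration only: SLEL levels map to 1..5, and the three
# respondent groups carry hypothetical weights of 0.4 / 0.3 / 0.3.

SLEL_SCALE = {"A": 1, "E": 2, "P": 3, "L": 4, "T": 5}
LEVEL_NAME = {value: level for level, value in SLEL_SCALE.items()}
WEIGHTS = {"self": 0.4, "supervisor": 0.3, "teachers": 0.3}  # hypothetical weights

def consolidate(slel_ratings, loi_ratings):
    """Return (overall SLEL level equivalent, average LOI) for one competency item."""
    weighted = sum(WEIGHTS[source] * SLEL_SCALE[slel_ratings[source]] for source in WEIGHTS)
    overall_slel = LEVEL_NAME[round(weighted)]                   # nearest level equivalent
    average_loi = sum(loi_ratings.values()) / len(loi_ratings)   # LOI is on the 1-5 scale
    return overall_slel, round(average_loi, 2)

# Hypothetical ratings for competency 1.1 from the three triangulation sources.
print(consolidate({"self": "P", "supervisor": "L", "teachers": "P"},
                  {"self": 1, "supervisor": 2, "teachers": 2}))  # -> ('P', 1.67)
```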


Attachment 2: Monitoring and Evaluation Tools for TDNASH

M&E Matrix of Tools for TDNASH
T&D-M&E Form 1: Individual Profile Template
TDNASH-M&E Form 1: Division M&E of Conduct of TDNASH
TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template
TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH


M&E Matrix of Tools for TDNASH

1. What will be monitored: The School Head's details in relation to their current position, level of experience and qualifications.
How it will be monitored: All School Heads will be asked to complete the profile.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template.
Who will be responsible: TDNA-WG.
When the monitoring will take place: Prior to the accomplishment of the TDNASH Tool.
How the results will be used: Results will be entered into the TDIS database along with the corresponding TDNASH results.

2. What will be monitored: The implementation of the TDNASH process at the school/cluster level.
How it will be monitored: Members of the Division TDNA-WG will be asked to observe the conduct of the TDNASH at the school/cluster level and complete the tool.
M&E tool to be used: TDNASH-M&E Form 1: Division Monitoring & Evaluation of the Conduct of TDNASH.
Who will be responsible: Division TDNA-WG.
When the monitoring will take place: During the accomplishment of the TDNASH process and the consolidation of results.
How the results will be used: Results will be collated and analyzed by the TDNA-WG and used to inform future TDNASH processes.

3. What will be monitored: The training and development needs of the School Heads.
How it will be monitored: The PSDS/ES will be asked to consolidate the School Head TDNASH results for a cluster of schools.
M&E tool to be used: TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template.
Who will be responsible: TDNA-WG.
When the monitoring will take place: After the accomplishment of the TDNASH by a cluster of School Heads.
How the results will be used: Results will be analyzed and used to inform the Division of the training and development needs of School Heads; results will be incorporated into the MPPD and DEDP.

4. What will be monitored: The implementation of the TDNASH at the Division level.
How it will be monitored: A Process Observer will be identified and asked to complete the tool.
M&E tool to be used: TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH.
Who will be responsible: Region TDNA-WG.
When the monitoring will take place: During the TDNASH process at the Division level.
How the results will be used: Results will be discussed with the Division to identify strengths and areas for improvement; observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.


T&D-M&E Form 1: Individual Profile Template

I. PERSONAL DATA

Name: (Surname) (First Name) (Middle Name)
Employee Number (if applicable):
Sex: Male / Female
Date of Birth:
Home Address:
Contact #:
E-mail Address:
Region:   Division:   District:
Office/School:   Address:
Current Position:
Other Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE (List from most current.)

Columns: POSITION; MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised); LEVEL (e.g. Elem/Sec/ALS; school, district, division, region); INCLUSIVE PERIOD.

Use additional sheet if necessary.


III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three years.

Columns: Training Focus; Training attended over the last 3 years (✓); Management Level of Training (Central / Region / Division / Cluster / School).

Curriculum

Resource Materials Development

Planning

Management

Policy Development

Research

Other, please specify ______________

IV. SIGNIFICANT EXPERIENCES

Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify ________________

Certified Trainer by: NEAP Central / NEAP-Region / TEI / SEAMEO-INNOTECH / Foreign Assisted Projects (FAP) / Other, please specify _____

List your significant experiences in the identified areas

Use additional sheet if necessary.


V. TRAINING AND DEVELOPMENT EXPERIENCES

Identify which of the following specific areas you consider to be your area(s) of expertise:

Competency Assessment
Program Planning
Program Designing
Resource Materials Development
Program Delivery
Program Management
Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: 

Signature:    

Please submit completed form to Training and Development Division/Unit. Information will be incorporated into the T&D Information System Database.


TDNASH-M&E Form 1: Division Monitoring & Evaluation of the Conduct of TDNASH

TDNASH Administrator Monitored: ____________________ Dates: _______________

Activity Monitored: School Head completion of TDNASH / Supervisor / Teachers focus group discussion

Please consider the administration of the TDNASH in line with the indicators listed below. Please check (✔) the appropriate column to indicate your level of agreement for each of the statements: Strongly Agree, Agree, Disagree, Strongly Disagree.

Administration of the TDNASH
• Materials for conducting the TDNASH were organized and prepared in advance
• Respondents were informed of the purpose for conducting the TDNASH
• Clear instructions were provided on the process to be followed
• Clarification was provided by the TDNASH Administrator when necessary
• Answer Sheets were collected and checked to ensure respondent information is complete and answers have been provided for all indicators/competencies

Results Analysis
• Results have been accurately analysed to identify the School Leadership Experience Levels and the Level of Importance
• Results have been accurately consolidated for individual School Heads
• Results have been accurately consolidated for clusters of School Heads
• Results have been submitted to the School Head, Cluster Lead School Head and TDNA-WG Chair in a timely manner

General Comments

Recommendations to improve processes


TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template

PART I. Cluster School Heads Identification

Cluster Name: ____________________________ Division __________________________

Columns: No.; School Head Name; School.

SH 1
SH 2
SH 3
SH 4
SH 5
SH 6
SH 7
SH 8
SH 9
SH 10
SH 11
SH 12
SH 13
SH 14
SH 15


PART II. TDNASH Cluster Summary Sheet for School Leadership Experience Level (SLEL)

Cluster Name: _______________________

For each domain (D) and competency, record each School Head's SLEL overall rating (SH 1 to SH 15) obtained from the triangulation data, then the Cluster Overall SLEL (average rating and its level equivalent).

D1: 1.1, 1.2, 1.3, 1.4, 1.5, 1.6
D2: 2.1a, 2.1b, 2.2a, 2.2b, 2.2c, 2.3
D3: 3.1a, 3.1b, 3.2a, 3.2b
D4: 4.1a, 4.1b, 4.1c, 4.1d, 4.2, 4.3, 4.4
D5: 5.1a, 5.1b, 5.2
D6: 6.1, 6.2a, 6.2b, 6.3
D7: 7.1, 7.2a, 7.2b, 7.2c, 7.3, 7.4
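Where the cluster summary sheet is maintained as a spreadsheet or script, averaging the School Heads' ratings and converting the result back to a level equivalent can follow a pattern like the one sketched below. It is illustrative only; the 1-to-5 numeric mapping of SLEL levels is an assumption made for demonstration.

```python
# Minimal sketch: cluster-level consolidation of one competency's SLEL ratings.
# The 1-5 numeric mapping of SLEL levels is an assumption made for illustration.

SLEL_SCALE = {"A": 1, "E": 2, "P": 3, "L": 4, "T": 5}
LEVEL_NAME = {value: level for level, value in SLEL_SCALE.items()}

def cluster_slel(overall_ratings):
    """Average the School Heads' overall SLEL ratings and give the level equivalent."""
    scores = [SLEL_SCALE[rating] for rating in overall_ratings]
    average = sum(scores) / len(scores)
    return round(average, 2), LEVEL_NAME[round(average)]

# Hypothetical overall ratings of five School Heads for competency 1.1.
print(cluster_slel(["P", "P", "E", "L", "P"]))  # -> (3.0, 'P')
```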


PART III. TDNASH Cluster Summary Sheet for Level of Importance (LOI)

Cluster Name: _______________________

For each domain (D) and competency, record each School Head's LOI overall rating (SH 1 to SH 15) obtained from the triangulation data, then the Cluster LOI (average rating).

D1: 1.1, 1.2, 1.3, 1.4, 1.5, 1.6
D2: 2.1a, 2.1b, 2.2a, 2.2b, 2.2c, 2.3
D3: 3.1a, 3.1b, 3.2a, 3.2b
D4: 4.1a, 4.1b, 4.1c, 4.1d, 4.2, 4.3, 4.4
D5: 5.1a, 5.1b, 5.2
D6: 6.1, 6.2a, 6.2b, 6.3
D7: 7.1, 7.2a, 7.2b, 7.2c, 7.3, 7.4


TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH

This form is to be used to support Regional monitoring of the TDNASH process at the Division level and Division monitoring of district- and school-level activities. It is expected that the assessment will be based on observations, discussions with the implementing team and a review of relevant documents.

Division/District _________________________ Date: __________________

Rating Guide:

4 - Very High Extent (In a very significant way)
3 - High Extent (In a meaningful way)
2 - Low Extent (In a limited way only)
1 - Very Low Extent (Not in any meaningful way)

Use the scale above to evaluate the extent to which the conduct of TDNASH documentation adhered to the following:

To what extent (rate each item from 1 to 4) ...

1. was thorough planning conducted prior to the TDNASH orientation?
2. were participants oriented to the competencies expected of a School Head?
3. was the purpose of the TDNASH explained?
4. was the triangulation process used for the TDNASH explained (e.g. three different respondents, group consensual assessment technique)?
5. was a clear explanation provided on how to accomplish the TDNASH tools (e.g. manual and/or e-version)?
6. was the scoring system for the TDNASH tool explained (e.g. continuum of indicators for each competency matched to school leadership experience levels)?
7. were the steps involved in consolidating the triangulation results for an individual school head explained?
8. were the steps involved in consolidating TDNASH results for a group of school heads explained?
9. was an explanation provided on how to interpret individual and consolidated results?
10. was technical assistance provided when required?
11. were the M&E tools and processes implemented?
12. was there evidence of teamwork and collaboration amongst the TDNASH implementers?
13. were recommendations for improving the TDNASH administration processes identified?

Recommendations:

Name:______________________________________

Designation: _________________________________ Date: __________________________


Attachment 3:

The National Competency-Based Standards for School Heads (NCBS-SH)

NATIONAL COMPETENCY-BASED STANDARDS FOR SCHOOL HEADS (NCBS-SH)
(Inclosure to DepED Order No. 32, s. 2010)

Key Domains of School Heads' Competencies

Based on the Core Principle and a review of the literature on principal performance, the following domains of practice were identified as important for school heads to be effective.

School Leadership
Effective leadership is the core of every successful school. This domain emphasizes that effective school leaders collaboratively create a vision and establish a climate for teachers, non-teaching personnel and learners to reach their highest level of achievement. They follow a transformational leadership framework of owning, co-owning and co-creating. They use data-based analysis of best practices in education, society and the country in order to be responsive and proactive in changing schools to prepare children for the future in which they will live.

Instructional Leadership
Education reforms have created an urgent need for a strong emphasis on the development of instructional leadership skills. This domain covers those actions in instructional leadership (e.g. assessment for learning, development and implementation, instructional supervision and technical assistance) that school heads take or delegate to others to promote good teaching and high-level learning among pupils/students.

Creating a Student-Centered Learning Climate
This domain requires that effective school leaders set high standards and create high expectations for learners while at the same time recognizing their achievements. It also includes creating opportunities to make learners functionally literate. They create a learner-centered, safe and healthy environment that supports continuous learning and sharing of knowledge.


[Figure: Schematic presentation of the NCBS-SH Integrated Domains. The Core Principle ("School heads are competent, committed and accountable in providing access to quality and relevant education for all through transformational leadership and high degree of professionalism") is surrounded by the seven domains, each with its competency strands: Domain 1 - School Leadership; Domain 2 - Instructional Leadership; Domain 3 - Creating a Student-Centered Learning Climate; Domain 4 - HR Management and Professional Development; Domain 5 - Parent Involvement and Community Partnership; Domain 6 - School Management and Daily Operations; Domain 7 - Personal & Professional Attributes and Interpersonal Effectiveness.]

HR Management and Professional Development
Effective school leaders develop the skills and talents of those around them. This domain includes the nurturing and supporting of a learning community that recruits teachers based on the NCBTS and promotes the continuous growth and development of personnel based on the IPPD and SPPD. They recognize individual talents, assign responsibility and authority for specific tasks, and appraise the staff based on competency standards.

Parent Involvement and Community Partnership
Effective school heads engage in shared decision making with the community in achieving universal participation, completion and functional literacy. This domain covers the involvement of parents and other stakeholders to raise learners' performance. It also includes responsibility for promoting a positive image of the school, thereby establishing sustainable linkages with other sectors.

School Management and Operations
This domain covers the critical role school heads play in managing the implementation and monitoring of their schools' improvement plan/annual implementation plan. They are responsible for the generation and mobilization of funds and other resources and are accountable for their utilization. They also use ICT in the management of their daily operations.

Personal and Professional Attributes and Interpersonal Effectiveness
Effective school leaders are models of professionalism and ethical and moral leadership. This domain includes the development of pride in the nobility of the teaching profession. School leaders also project integrity by promoting and supporting an environment where teachers, non-teaching staff and learners adhere to doing "what is right." They also express themselves clearly and possess effective writing and presentation skills.

Key Domains and Competency Strands
The NCBS-SH is an integrated theoretical framework that defines the different dimensions of being an effective school head. An effective school head is one who can implement continuous school improvement, produce better learning outcomes among pupils/students and help change institutional culture, among others.

The NCBS-SH Domains are distinctive areas which guide school heads to be effective. Each domain is based on the functions expected of school heads. Under each domain are competency strands, and each competency strand is broken down into a number of performance indicators. A performance indicator points to the abilities, attitudes and underpinning knowledge which lead to competent performance.

The Figure below shows the schematic presentation of the Integrated Domains with the competency strands under each domain.


Guiding Principles in the Framing of the NCBS-SH

The following are the principles which guided the framing of the NCBS-SH.

• Function-based. The competencies are based on school head functions as stated in RA 9155, related laws and DepED policies.

• Responsive. Competencies are applicable in any range of contexts: big or small school, city or rural school, culturally divergent groups.

• Impartial. These are applicable to any school head regardless of position item, gender, age, experience and other personal circumstances.

• Coherent. These are clear and logical.

• Valid. All performance indicators are research- and experience-based.

CORE PRINCIPLE

School heads are competent, committed and accountable in providing access to quality and relevant education for all through transformational leadership and high degree of professionalism.

DOMAINS, COMPETENCY STRANDS AND INDICATORS

DOMAIN 1. SCHOOL LEADERSHIP

1.A. Developing & Communicating Vision, Mission, Goals, and Objectives (VMGO)
• Involves internal and external stakeholders in the drafting of the school vision, mission, goals and objectives for co-ownership
• Expresses ownership and personal responses to the identified issues
• Aligns goals and objectives with the school vision and mission
• Communicates the school VMGO clearly
• Explains the school vision to the general public
• Revisits and ensures that school activities are aligned with the school VMGO

1.B. Data-based Strategic Planning
• Establishes BEIS/SIS and baseline data of all performance indicators
• Involves all internal and external stakeholders in developing SIP/AIP
• Utilizes data, e.g. BEIS/SIS, SBM assessment, TSNA, and strategic planning in the development of SIP/AIP
• Aligns the SIP/AIP with national, regional and local education thrusts and policies
• Communicates effectively the SIP/AIP to internal and external stakeholders

1.C. Problem Solving
• Resolves problems at the school level
• Assists teachers and students to understand the problem and identify possible solutions
• Assists concerned parties in choosing solutions through a dialogue
• Addresses the causes of the problem rather than the symptoms
• Explores several approaches in handling problems

1.D. Building High Performance Teams
• Involves stakeholders in meetings and deliberations for decision making
• Provides opportunities for growth to develop members to be team players
• Defines roles and functions of each committee
• Monitors and evaluates accomplishment of different committees/teams
• Gives feedback on the team's performance using a performance-based assessment tool
• Establishes a system for rewards and benefits for teachers and staff

1.E. Coordinating with Others
• Collaborates with concerned staff and other stakeholders on the planning and implementation of programs and projects
• Ensures proper allocation and utilization of resources (time, fiscal, human, IMS, etc.)
• Provides feedback and updates to stakeholders on the status of progress and completion of programs and projects
• Mobilizes teachers/staff/stakeholders in sustaining a project

1.F. Leading & Managing Change
• Assists teachers to identify strengths and growth areas through monitoring and observation
• Introduces innovations in the school program to achieve higher learning outcomes
• Monitors and evaluates the implementation of change programs included in the SIP/AIP
• Observes and applies multi-tasking in giving assignments
• Advocates and executes plans for changes including culture change in the workplace
• Empowers teachers and personnel to identify, initiate and manage changes

DOMAIN 2. INSTRUCTIONAL LEADERSHIP

2.A. Assessment for Learning
• Manages the processes and procedures in monitoring student achievement
• Ensures utilization of a range of assessment processes to assess student performance
• Assesses the effectiveness of curricular/co-curricular programs and/or instructional strategies
• Creates and manages a school process to ensure student progress is conveyed to students and parents/guardians regularly

2.B. Developing Programs &/or Adapting Existing Programs
• Uses research, expertise, and/or other vehicles to assist in developing and implementing a coherent and responsive school-wide curriculum
• Addresses deficiencies and sustains successes of current programs in collaboration with the teachers, learners and stakeholders
• Develops a culture of functional literacy

2.C. Implementing Programs for Instructional Improvement
• Manages the introduction of curriculum initiatives in line with DepEd policies (e.g. BEC, Madrasah)
• Works with teachers in curriculum review
• Enriches curricular offerings based on local needs
• Manages curriculum innovation and enrichment with the use of technology
• Organizes teams to champion instructional innovation programs toward curricular responsiveness

2.D. Instructional Supervision
• Prepares an instructional supervisory plan
• Conducts instructional supervision using an appropriate strategy
• Evaluates lesson plans as well as classroom and learning management
• Provides timely, accurate and specific feedback in a collegial manner to teachers regarding performance
• Provides technical assistance/expertise and instructional support to teachers

DOMAIN 3. CREATING A STUDENT-CENTERED LEARNING CLIMATE

3.A. Setting high social & academic expectations
• Benchmarks school performance
• Establishes and models high social and academic expectations for all
• Creates an engaging learning environment
• Participates in the management of learner behavior within the school and other school-related activities done outside the school
• Supports learners' desire to pursue further learning
• Recognizes high performing learners and teachers and supportive parents and other stakeholders

3.B. Creating school environments focused on the needs of the learner
• Creates and sustains a safe, orderly, nurturing and healthy environment
• Provides an environment that promotes use of technology among learners and teachers

DOMAIN 4. HR MANAGEMENT AND PROFESSIONAL DEVELOPMENT

4.A. Creating a Professional Learning Community
• Builds a community of learners among teachers
• Assesses and analyzes the needs and interests of teachers and other school personnel
• Aligns the School Plan for Professional Development (SPPD) with the Individual Plan for Professional Development (IPPD) and identified needs of other school personnel
• Includes the SPPD in the SIP/AIP
• Mentors and coaches employees and facilitates the induction of new ones
• Recognizes potential of staff and provides opportunities for professional development
• Ensures the school development plan objectives are supported with resources for training and development programs
• Prepares, implements, and monitors school-based INSET for all teaching staff based on IPPD
• Monitors and evaluates training efficiency and effectiveness

4.B. Recruitment & Hiring
• Utilizes the basic qualification standards and adheres to pertinent policies in recruiting and hiring teachers/staff
• Creates a School Selection and Promotion Committee and trains its members
• Recommends better ways and means to improve recruitment, hiring and performance appraisal of teachers

4.C. Managing Performance of Teachers and Staff
• Assigns teachers and other personnel to their area of competence
• Assists teachers and staff in setting and resetting performance goals
• Monitors and evaluates performance of teaching and non-teaching personnel vis-a-vis targets
• Delegates specific tasks to help manage the performance of teaching and non-teaching personnel
• Coaches deputized staff as needed on managing performance
• Creates a functional school-based performance appraisal committee
• Assists and monitors the development of the IPPD of each teacher

DOMAIN 5. PARENT INVOLVEMENT & COMMUNITY PARTNERSHIP

5.A. Parental Involvement
• Establishes school and family partnerships that promote student peak performance
• Organizes programs that involve parents and other school stakeholders to promote learning
• Conducts dialogues, fora, training of teachers, learners and parents on the welfare and performance of learners

5.B. External Community Partnership
• Promotes the image of the school through school summits, State of the School Address (SOSA), cultural shows, learners' project exhibits, fairs, etc.
• Conducts dialogues and meetings with multi-stakeholders in crafting programs and projects
• Participates actively in community affairs
• Establishes sustainable linkages/partnerships with other sectors, agencies and NGOs through MOA/MOU or using Adopt-a-School Program policies

DOMAIN 6. SCHOOL MANAGEMENT AND OPERATIONS

6.A. Managing School Operations
• Manages the implementation, monitoring and review of the SIP/AIP and other action plans
• Establishes and maintains specific programs to meet needs of identified target groups
• Takes the lead in the design of a school physical plant and facilities improvement plan in consultation with an expert/s
• Allocates/prioritizes funds for improvement and maintenance of school physical facilities and equipment
• Oversees school operations and care and use of school facilities according to set guidelines
• Institutionalizes best practices in managing and monitoring school operations thereby creating a safe, secure and clean learning environment
• Assigns/hires appropriate support personnel to manage school operations

6.B. Fiscal Management
• Prepares a financial management plan
• Develops a school budget which is consistent with the SIP/AIP
• Generates and mobilizes financial resources
• Manages school resources in accordance with DepEd policies and accounting and auditing rules and regulations and other pertinent guidelines
• Accepts donations, gifts, bequests and grants in accordance with RA 9155
• Manages a process for the registration, maintenance and replacement of school assets and disposition of non-reusable properties
• Organizes a procurement committee and ensures that the official procurement process is followed
• Utilizes funds for approved school programs and projects as reflected in the SIP/AIP
• Monitors utilization, recording and reporting of funds
• Accounts for school funds
• Prepares financial reports and submits/communicates the same to higher education authorities and other education partners

6.C. Use of Technology in the Management of Operations
• Applies Information Technology (IT) plans for online communication
• Uses IT to facilitate the operationalization of the school management system (e.g. school information system, student tracking system, personnel information system)
• Uses IT to access Teacher Support Materials (TSM), Learning Support Materials (LSM) and assessment tools in accordance with the guidelines
• Shares with other school heads the school's experience in the use of new technology

DOMAIN 7. PERSONAL AND PROFESSIONAL ATTRIBUTES AND INTERPERSONAL EFFECTIVENESS

7.A. Professionalism
• Manifests genuine enthusiasm and pride in the nobility of the teaching profession
• Observes and demonstrates desirable personal and professional behaviors (RA 6713 and the Code of Ethics, RA 7836) like respect, honesty, dedication, patriotism and genuine concern for others at all times
• Maintains harmonious and pleasant personal and official relations with superiors, colleagues, subordinates, learners, parents and other stakeholders
• Recommends appointments, promotions and transfers on the bases of merit and needs in the interest of the service
• Maintains good reputation with respect to financial matters such as the settlement of his/her debts, loans and other financial affairs
• Develops programs and projects for continuing personal and professional development including moral recovery and values formation among teaching and non-teaching personnel

7.B. Communication
• Communicates effectively both in speaking and writing to staff and other stakeholders
• Listens to stakeholders' needs and concerns and responds appropriately in consideration of the political, social, legal and cultural context

7.C. Interpersonal Sensitivity
• Interacts appropriately with a variety of audiences
• Demonstrates ability to empathize with others


Section 7.3.

The Organizational TDNA for the Region Guide and Tools


Republic of the Philippines

Department of Education

Organizational Training & Development Needs Assessment (TDNA)

for the Region

Guide and Tools

DepED-EDPITAF-STRIVEJune 2010


This document, The Organizational TDNA for the Region Guide and Tools, was developed and validated in Regions VI, VII and VIII,

Divisions of Negros Occidental, Bohol/Tagbilaran and Northern Samar through the AusAID-funded project STRIVE (Strengthening the Implementation of Basic Education in selected Provinces in the

Visayas), in coordination with the EDPITAF (Educational Development Project Implementing Task Force) and in consultation

with the TEDP-TWG, NEAP and the Bureaus of the Department of Education.


TABLE of CONTENTS

1. Introductory Information
   Basis and Purpose of the Organizational TDNA
   Regional Office Respondents
   Assessment Approach and Methodology
   Documents to be used for the Organizational TDNA

2. General Instructions for the TDNA Working Group
   Preliminary Preparation
   Organizational TDNA Proper
   Post Organizational TDNA

3. Monitoring and Evaluation of the Organizational TDNA

Attachment 1: Region Organizational TDNA Tools and Templates
Attachment 2: Region Organizational TDNA Monitoring and Evaluation Tools


I n t r o d u c t o r y I n f o r m a t i o n

Basis and Purpose of the Organizational TDNA


GLOSSARY OF ACRONYMS

BESRA Basic Education Sector Reform Agenda

CBTS Competency Based Teacher Standards

CO Central Office

DEDP Division Education Development Plan

DepED Department of Education

DO Division Office

EDPITAF Educational Development Project Implementing Task Force

EBEIS Enhanced Basic Education Information System

ES Education Supervisor

FGD Focus Group Discussion

GCA Group Consensual Assessment

HRM Human Resources Management

ICT Information Communication Technology

IRR Implementing Rules and Regulations of RA 9155, December 2007

KSA Knowledge, Skills and Attitudes

LOC Level of Competency

LOI Level of Importance

LRMDS Learning Resource Management and Development System

MPPD Master Plan for Professional Development

M&E Monitoring and Evaluation

PSDS Public School District Supervisor

RA 9155 Republic Act 9155: Governance Act for Basic Education, 11 Aug 2001

REDP Regional Education Development Plan

RO Regional Office

SBM School-Based Management

SH School Head

STRIVE Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas

T&D Training and Development

TDIS Training and Development Information System

TDNA Training and Development Needs Assessment

UIS Unified Information System

WG Working Group


The Training and Development Needs Assessment (TDNA) of the region is designed to identify the organization's current training and development needs vis-à-vis the desired organizational roles and responsibilities as stipulated in the Governance of Basic Education Act of 2001 (R.A. 9155).

The ‘Management Competencies per Service Areas’ established for the Decentralized Management Training Program of the Secondary Education Development Improvement Project (SEDIP, a DepED project implemented by the Bureau of Secondary Education) served as the basis for the development of the Organizational TDNA tool. These competencies were developed through the Project for Central Office (CO), Region Office (RO) and Division Office (DO) levels and validated by educational leaders across 15 Divisions and nine Regions of DepED. The competencies are organized as follows:

General Competencies (CO, RO, DO)
Service Area 1: Educational Planning (RO, DO)
Service Area 2: Learning Outcome Management (DO)
Service Area 3: Monitoring and Evaluation (RO, DO)
Service Area 4: Education Administration and Management (CO, RO, DO)
Service Area 5: Policy Formulation and Standard Setting (CO, RO)
Service Area 6: Curriculum Development (CO, RO)

The Organizational TDNA was first developed and used by Region VII and VIII and the Divisions of Bohol and Northern Samar during Stage One of Project STRIVE. The tool was further refined and validated in Region VI and in the Division of Negros Occidental during Stage 2 of Project STRIVE.

Regional Office Respondents

Respondents comprising at least 20% of the regional office management and technical staff are convened to participate in the actual Organizational TDNA, drawn from each of the region's functional divisions, e.g. RD/ARD, Elementary Division, Secondary Division, ALS Division, Planning, Accounting/Budget, Cashier, Medical/Dental, Administrative, Legal, and Supply/Physical Facilities. Representation from each section should include the section head. Smaller sections may be clustered to form a group (e.g. Cashier, Medical/Dental and Legal can become one group).

Assessment Approach and Methodology

The Organizational TDNA is done first in a “self-assessment” exercise participated in by the section respondents through a Focused Group Discussion (FGD) technique and using FORM A of the FGD Flow (see Attachment 1). The respondents of each section will have to arrive at a consensual description of the region vis-à-vis the management competencies using the scale provided.

Results of the “self-assessment” will form 60% of the total result of the Region’s Organizational TDNA. To complete the Region’s TDNA, at least three divisions should do a parallel assessment of the Region’s current competencies using FORM B of the FGD Flow (see Attachment 1). Results of the parallel assessment will be collated and form 40% of the total results for the Region.
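Where the self-assessment and the parallel-assessment scores are recorded electronically, the 60/40 weighting can be computed as in the sketch below; the competency codes and ratings shown are hypothetical.

```python
# Minimal sketch: combining the Region's self-assessment (60%) with the
# divisions' parallel assessment (40%) per management competency.
# The competency codes and ratings below are hypothetical examples.

SELF_WEIGHT, DIVISION_WEIGHT = 0.60, 0.40

def combine(self_scores, division_scores):
    """Return the weighted Organizational TDNA score for each competency."""
    return {
        competency: round(SELF_WEIGHT * self_scores[competency]
                          + DIVISION_WEIGHT * division_scores[competency], 2)
        for competency in self_scores
    }

self_assessment = {"GC-1": 3.2, "SA1-1": 2.5}         # Region's FGD consensus ratings
division_assessment = {"GC-1": 2.8, "SA1-1": 2.0}     # collated from the divisions
print(combine(self_assessment, division_assessment))  # {'GC-1': 3.04, 'SA1-1': 2.3}
```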

A TDNA Working Group (TDNA-WG) established by the region will be responsible for overseeing the Organizational TDNA process. The TDNA-WG is expected to make preliminary preparations, to facilitate the FGD and consolidate each section’s results as well as the overall Organizational TDNA results by following the instructions and the procedures outlined in the FGD Flow.


Documents to be used for the Organizational TDNA

This Organizational TDNA for the Region Guide and Tools contains the general instructions for the TDNA-WG as well as the following materials (found in Attachment 1) to be used by the TDNA-WG Facilitators and the respondents during the conduct of the Organizational TDNA process:

(a) Materials for the Facilitators: FGD Flow, Matrix of the Management Competencies and the List of Competencies and Behavior Indicators, Rating Scale Descriptors, Competency Rating Board and Results Summary Template (e-version available)

(b) Materials for the Respondents: Matrix of Management Competencies and the List of Competencies and Behavior Indicators, Separate Answer Sheet, and Rating Scales Descriptions

G e n e r a l I n s t r u c t i o n s F o r T h e T D N A W o r k i n g G r o u p

A. Preliminary Preparations:

1. Study the Organizational TDNA for the Region Guide and Tools by conducting a "walkthrough" with all the members of the TDNA-WG before the actual conduct of the Organizational TDNA. The Chair of the group should lead this activity.

2. Assign specific tasks to each member such as:

a. Presenter of the Management Competencies
b. Facilitator(s) for the FGD
c. Data Recorder(s) of assessment ratings
d. Monitoring and evaluation

3. Reproduce the following set of materials to be used by each of the participating respondents: Matrix of Management Competencies, List of Competencies and Behavior Indicators, Rating Scales Descriptions and Separate Answer Sheet (one per section only).

4. Prepare the Competency Rating Boards. A Competency Rating Board is required for each competency across the service areas (e.g. 46 Competency Rating Boards will be required).

5. Negotiate for the selected Divisions to complete the parallel assessment of the Region’s current competencies using FORM B of the FGD Flow.

B. Organizational TDNA Proper

1. Let each participant accomplish T&D M&E Form 1: Individual Profile Template upon registration.

2. Follow the instructions listed in the FGD Flow, FORM A (see Attachment 1).

3. Accomplish Org'l TDNA-M&E Form 1: The Organizational TDNA Tool for the FGD Process at the Region/Division level to monitor the facilitation of the session.

C. Post Organizational TDNA


1. Consolidate the results of the self-assessment exercise (60%) and the results obtained from the Divisions' assessment of the regional competencies (40%). It should be noted that the parallel assessment of the Regional competencies completed by the Divisions is usually done as a separate activity and coincides with the "TDNA Self-Assessment" of the Division. Use the Org'l TDNA-M&E Form 2b: Organizational TDNA Scores Summary Template Region Level prepared for this purpose (e-copy available).

2. Analyze the results and identify the priority training and development needs of the Region, e.g. lowest level of competency, highest level of importance, using Org'l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template and Org'l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template (see the sketch after this list).

3. The assigned TDNA M&E team accomplishes Org'l TDNA-M&E Form 5: Documentation Review of Organizational TDNA Region/Division Level.
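One way to surface the priorities named in step 2 is to rank the consolidated competencies by lowest level of competency and, within ties, by highest level of importance, as in the sketch below. It is illustrative only; the field names, the records and the assumption that LOI follows the 1 = Most Essential ordering are not prescribed here.

```python
# Minimal sketch: ranking consolidated Organizational TDNA results so that the
# lowest level of competency (LOC) comes first and, within ties, the highest
# level of importance comes first (LOI assumed to use 1 = Most Essential).
# Field names and records are hypothetical examples.

results = [
    {"competency": "SA1-2 Educational Planning", "loc": 2.3, "loi": 1.4},
    {"competency": "GC-3 Communication",         "loc": 3.1, "loi": 2.0},
    {"competency": "SA4-1 Administration",       "loc": 2.3, "loi": 1.1},
]

# Smaller LOI means more essential, so ascending order puts the most important first.
priorities = sorted(results, key=lambda row: (row["loc"], row["loi"]))

for rank, row in enumerate(priorities, start=1):
    print(rank, row["competency"], row["loc"], row["loi"])
```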

M o n i t o r i n g a n d E v a l u a t i o n o f t h e O r g a n i z a t i o n a l T D N A

The TDNA-WG members are tasked to monitor and evaluate the administration of the Organizational TDNA. Monitoring and evaluation mechanisms and tools have been developed to support the Organizational TDNA process and consist of the following:

T&D-M&E Form 1: Individual Profile Template
Org'l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion (FGD) Process at the Region/Division Level
Org'l TDNA-M&E Form 2b: Organizational TDNA Scores Summary Template Region Level
Org'l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template
Org'l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template
Org'l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA

The matrix below describes the M&E tools developed for use during the implementation of the Organizational TDNA which can be found in Attachment 2. The TDNA-WG should be convened anew to consolidate the results from the monitoring of the Organizational TDNA and develop recommendations for the improvement of the process.


1. What will be monitored: The respondent's details in relation to their current position, level of experience and qualifications.
How it will be monitored: All participants in the Organizational TDNA will be asked to complete the profile.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template.
Who will be responsible: Division and Region TDNA-WG.
When the monitoring will take place: Prior to the accomplishment of the Organizational TDNA Tool.
How the results will be used: Information will be entered into the TDIS database.

2. What will be monitored: The processes followed during the conduct of the Focus Group Discussion (FGD) at the Region/Division level.
How it will be monitored: A process observer will be appointed and will use the tool.
M&E tool to be used: Org'l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division level.
Who will be responsible: Division and Region TDNA-WG.
When the monitoring will take place: During the conduct of the FGD for the Organizational TDNA.
How the results will be used: Results will be shared with the FGD facilitators to identify best practices and areas for improvement. Recommendations for improving the process will be included in the Program Completion Report to inform future processes.

3. What will be monitored: The level of competency and the level of importance of the Division/Region for the various management competencies across service areas.
How it will be monitored: Results of the Organizational TDNA will be consolidated using the template provided.
M&E tool to be used: Org'l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template; Org'l TDNA-M&E Form 2b: Regional Organizational TDNA Scores Summary Template.
Who will be responsible: Division and Region TDNA-WG.
When the monitoring will take place: Following the accomplishment of the Division/Region Organizational TDNA.
How the results will be used: Results will inform decisions on the training and development programs offered at the division level and will be incorporated into both the DEDP and REDP.

4. What will be monitored: The Organizational TDNA of the functional divisions/sections/units.
How it will be monitored: Results of the Organizational TDNA will be consolidated for each functional division/section/unit using the template provided.
M&E tool to be used: Org'l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template.
Who will be responsible: Division and Region TDNA-WG.
When the monitoring will take place: Following the accomplishment of the Organizational TDNA.
How the results will be used: Results will inform decisions on the training and development programs offered at the division/regional level and will be incorporated into both the DEDP and REDP.

5. What will be monitored: The Organizational TDNA results of the various divisions across a region.
How it will be monitored: Results of the Organizational TDNA will be consolidated for all divisions within a region using the template provided.
M&E tool to be used: Org'l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template.
Who will be responsible: Region TDNA-WG.
When the monitoring will take place: Following the submission of Division Organizational TDNA results.
How the results will be used: Results will inform decisions on the training and development programs offered at the region level and will be incorporated into the REDP; the results will also be analyzed to inform future TDNA policy.

6. What will be monitored: The implementation of the Organizational TDNA at the Division/Region levels.
How it will be monitored: A Process Observer will be identified and asked to complete the tool.
M&E tool to be used: Org'l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA.
Who will be responsible: Division and Region TDNA-WG.
When the monitoring will take place: During the conduct of the Organizational TDNA at the Division/Region level.
How the results will be used: Results will be discussed with the Division/Region to identify strengths and areas for improvement; observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.


Attachment 1: Region Organizational TDNA Tools and Templates

o FORM A: Focus Group Discussion Flow – Regional Organizational TDNA Self-Assessment

o FORM B: Focus Group Discussion Flow – Regional Organizational TDNA for Division Respondents

o Region Organizational TDNA Rating Scales

o Organizational TDNA Regional Self Assessment Answer Sheet

o Organizational TDNA for the Region by Division Respondents Answer Sheet

o Sample Competency Rating Board for the Region Organizational TDNA Self Assessment

o Sample Competency Rating Board Region Organizational TDNA for the Division Respondents

o Matrix of Management Competencies for the Region

o List of Competencies and Behavioral Indicators


FORM A: Focus Group Discussion Flow – Regional Organizational TDNA Self-Assessment

Objective: Establish a friendly atmosphere
Activity:
1. Introductions. Emphasize to the participants that, as key respondents to this Organizational TDNA of the Region, their answers and their collaboration with their colleagues to reach a consensual assessment will be most helpful for the future development of the management competencies of the region.

Objective: Develop an understanding of the purpose of the Organizational TDNA and the basic mechanics of the exercise
Activity:
2. Present the purpose of the Organizational TDNA and provide a brief description of the data gathering method, e.g. focus group discussion to reach a consensus.

Objective: Achieve a common understanding of the definition of the items in the rating scale
Activity:
3. Explain the rating scales.

Objective: Achieve a common understanding of each competency
Activities:
4. Walk through the ‘Management Competencies per Service Area’ one service area at a time. Begin with the General Competencies across units. Consider the behavioral indicators for each competency.
5. Facilitator clarifies queries on the competencies.

Objective: Articulate the individual perception of the level of competence of the region, the individual perception of the criticality of the competence to the performance of the task, and the section agreement on the level of competency of the region and the level of importance of the competency
Activities:
6. Organize respondents into their respective sections/groups.
7. Each participant provides an individual perception of the region per competency and shares it with the section/group.
8. The section/group reaches a consensus and provides a rating for each competency, e.g. 4, 3, 2 or 1.
9. Record the section/group rating on the separate answer sheet.
10. Each participant provides an individual perception of the ‘perceived importance of the competency in the performance of the region’s task/job’ and shares it with the section/group.
11. The section/group reaches a consensus and provides a rating for the importance of each competency, e.g. 1, 2, 3 or 4.
12. Record the section rating on the answer sheet.

Objective: Determine the average rating of the perceived importance of the competency in the performance of the region’s task/job, and of the perceived current level of competency, as an initial step to establish the consensual description of the region
Activities:
13. When all sections/groups have completed their ratings for a service area, each section records its ratings on the competency rating board.
14. Sections/groups repeat steps 4–13 for each of the remaining service areas.
15. Assigned TDNA-WG members calculate the average across all section ratings per competency while the groups continue rating the remaining competencies for each service area.

Objective: Identify which competencies will need further discussion to establish a consensual rating
Activities:
16. Assigned TDNA-WG members inspect each competency rating board to determine deviant (individual) or polarized ratings that may have tipped the average rating towards a non-representative perception of the group.
17. Once all competencies have been rated, the TDNA-WG presents the averages for all competencies across the service areas, identifying those averages which require further consideration (based on the inspection in Step 16).
18. Where necessary, the participants deliberate on the average competency rating for the region by articulating their views or sharing factual experience that best describes the current level of the region vis-à-vis the competency in question. The facilitator is expected to moderate the discussion.

Objective: Establish a consensus on competencies
Activity:
19. All sections/groups should agree on the consensual rating made per competency before considering the next competency.
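Steps 15–18 above amount to averaging the section ratings recorded on each competency rating board and flagging competencies whose ratings are so spread out that the average may not represent the group. The snippet below is a minimal illustrative sketch of that arithmetic only; it is not one of the manual’s tools, and the function name and the "spread of 2 or more" flagging rule are assumptions chosen for the example.

```python
# Illustrative sketch: average the section ratings for one competency (Step 15)
# and flag polarized ratings that warrant further deliberation (Steps 16-18).
from statistics import mean


def summarize_board(section_ratings: dict, spread_threshold: int = 2):
    """section_ratings maps a section name to its consensus rating (1-4)."""
    values = list(section_ratings.values())
    average = round(mean(values), 2)
    # A wide spread (e.g. one section rates 4 while another rates 1) suggests
    # the average may not represent the group, so the competency is flagged.
    needs_discussion = (max(values) - min(values)) >= spread_threshold
    return average, needs_discussion


if __name__ == "__main__":
    board = {"Section 1": 3, "Section 2": 4, "Section 3": 2, "Section 4": 2}
    avg, flag = summarize_board(board)
    print(f"Average: {avg}  Further discussion needed: {flag}")
```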


FORM B: Focus Group Discussion Flow – Regional Organizational TDNA by Division Respondents

Objective: Establish a friendly atmosphere
Activity:
1. Introductions. Emphasize to the participants that, as key respondents to this Organizational TDNA of the Region, their answers and their collaboration with their colleagues to reach a consensual assessment will be most helpful for the future development of the management competencies of the region.

Objective: Develop an understanding of the purpose of the Organizational TDNA and the basic mechanics of the exercise
Activity:
2. Present the purpose of the Organizational TDNA and provide a brief description of the data gathering method, e.g. focus group discussion to reach a consensus on the level of competency. NOTE: The Division respondents do not need to rate the level of importance of each competency.

Objective: Achieve a common understanding of the definition of the items in the rating scale
Activity:
3. Explain the rating scales.

Objective: Achieve a common understanding of each competency
Activities:
4. Walk through the ‘Management Competencies per Service Area’ one service area at a time. Begin with the General Competencies across units. Consider the behavioral indicators for each competency. Call the attention of the participants to the differences between the REGIONAL service areas and their DIVISION service areas.
5. Facilitator is expected to clarify queries on the competencies.

Objective: Articulate the individual perception of the level of competence of the region and the section agreement on the level of competency of the region
Activities:
6. Organize respondents into their respective sections/groups.
7. Each participant provides an individual perception of the region per competency and shares it with the section/group.
8. The section/group reaches a consensus and provides a rating for each competency, e.g. 4, 3, 2 or 1.
9. Record the section/group rating on the separate answer sheet.

Objective: Determine the average rating of the perceived current level of competency as an initial step to establish the consensual description of the region
Activities:
10. When all sections/groups have completed their ratings for a service area, each section records its ratings on the competency rating board.
11. Sections/groups repeat steps 4–10 for each of the remaining service areas.
12. Assigned TDNA-WG members calculate the average across all section/group ratings per competency while the sections continue rating the remaining competencies for each service area.

Objective: Identify which competencies will need further discussion to establish a consensual rating
Activities:
13. Assigned TDNA-WG members inspect each competency rating board to determine deviant (individual) or polarized ratings that may have tipped the average rating towards a non-representative perception of the group.
14. Once all competencies have been rated, the TDNA-WG presents the averages for all competencies across the service areas, identifying those averages which require further consideration (based on the inspection in Step 13).
15. Where necessary, the participants deliberate on the average competency rating for the region by articulating their views or sharing factual experience that best describes the current level of the region vis-à-vis the competency in question. The facilitator is expected to moderate the discussion.

Objective: Establish a consensus on competencies
Activity:
16. All sections/groups should agree on the consensual rating made per competency before considering the next competency.

Region Organizational TDNA Rating Scales


SCALE ON THE LEVEL OF COMPETENCE

4 – VERY HIGH LEVEL: The region has comprehensive knowledge of the competency and can apply it with a high level of confidence. Outputs resulting from the performance of the competency are viewed as very comprehensive, of high quality, and have been used as a benchmark for others.

3 – HIGH LEVEL: The region has substantial knowledge of the competency and can apply it without supervision/guidance. Outputs resulting from the performance of the competency are viewed as comprehensive, of quality, and very useful.

2 – MODERATE LEVEL: The region has a basic understanding of the competency and can apply it with supervision or some external support. Outputs resulting from the performance of the competency meet the basic standards.

1 – LIMITED LEVEL: The region has a minimal understanding of the competency and cannot apply the competency. Requires training or direct guidance to achieve outputs related to the performance of the competency.

SCALE ON THE LEVEL OF IMPORTANCE

1 – HIGHLY ESSENTIAL: The competency is indispensable, highly critical to the job performance of the Region, and the lack of it will adversely affect the quality of output/work.

2 – ESSENTIAL: The competency is necessary, and the lack of it will partly affect the quality of output/work.

3 – LESS ESSENTIAL: The competency supports the Region in the performance of its job, but the lack of it will have minimal impact on the quality of work.

4 – NOT ESSENTIAL: The competency enhances the Region in the performance of its job, but the lack of it has no impact on the quality of work.

Organizational Training and Development Needs Assessment for the Region – Self-Assessment Answer Sheet

SECTION/UNIT: _________________________________

For each competency number, record the agreed Level of Competency (4, 3, 2 or 1) and Level of Importance (1, 2, 3 or 4).

General Competencies Across Units (CO/DO/RO): competencies 1–10
Service Area 1 – Educational Planning (DO/RO): competencies 11–19
Service Area 3 – Monitoring & Evaluation (DO/RO): competencies 20–24
Service Area 4 – Education Administration & Management (CO/DO/RO): competencies 25–32
Service Area 5 – Policy Formulation and Standard Setting (CO/RO): competencies 33–39
Service Area 6 – Curriculum Development (CO/RO): competencies 40–46

Organizational Training and Development Needs Assessment for the Region by Division Respondents – Answer Sheet

SECTION: __________________________________________

For each competency number, record the agreed Level of Competency (4, 3, 2 or 1).

General Competencies Across Units (CO/DO/RO): competencies 1–10
Service Area 1 – Educational Planning (DO/RO): competencies 11–19
Service Area 3 – Monitoring & Evaluation (DO/RO): competencies 20–24
Service Area 4 – Education Administration & Management (CO/DO/RO): competencies 25–32
Service Area 5 – Policy Formulation and Standard Setting (CO/RO): competencies 33–39
Service Area 6 – Curriculum Development (CO/RO): competencies 40–46


Sample Competency Rating Board for the Region Organizational TDNA Self-Assessment

(Large copy paper (17” x 22”) may be used for each Board required.)

SERVICE AREA: ___________________________
COMPETENCY: ___________________________

Section | Level of COMPETENCY (each section writes in this column its perception of the current level of competency of the region) | Level of IMPORTANCE (each section writes in this column its perception of the level of importance of the competency)
Section 1
Section 2
Section 3
Section 4
Section 5
Section 6

Current competency level – Average: ______  Consensus: ______
Level of Importance – Average: ______  Consensus: ______


Sample Competency Rating Board for the Regional Organizational TDNA by Division Respondents

Section | Level of COMPETENCY (each section writes in this column its perception of the current level of competency of the region)
Section 1
Section 2
Section 3
Section 4
Section 5
Section 6

Current competency level – Average: ______  Consensus: ______


MATRIX OF MANAGEMENT COMPETENCIES FOR THE REGION

GENERAL COMPETENCIES ACROSS UNITS (CO/DO/RO)
1. Understanding DepEd as an Organization
2. Understanding RA 9155 or the Governance of BEd Act
3. Management of Change
4. Organization Analysis/Diagnosis
5. Problem Solving
6. Decision-Making
7. Dealing Effectively with Pressure Groups
8. Conflict Management
9. Negotiation Skills
10. Transformational and Enabling Leadership

SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
11. Strategic Planning
12. Implementation Planning
13. Project/Program Identification
14. Resource Mobilization and Allocation
15. Financial Management and Control
16. Group Process Management
17. Facilitation Skills
18. Communication Skills
19. Advocacy

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
20. Monitoring and Evaluation Design and Development
21. Instrument/Tools Development for M&E Data Gathering
22. Data Processing, Analysis and Utilization
23. Communication Skills/Feedback Giving
24. Education Management Information System (EMIS)

SERVICE AREA 4: EDUCATION ADMINISTRATION & MANAGEMENT (CO/RO/DO)
25. Resource Mobilization and Management
26. Resource Procurement and Management
27. Building Partnerships
28. Human Resource Management
29. Delegation
30. Physical Facilities Programming
31. Records Management
32. Understanding the Intent of the Policy and Implementation

SERVICE AREA 5: POLICY FORMULATION AND STANDARD SETTING (CO/RO)
33. Policy Framework Development
34. Policy Instrument Development
35. Policy Formulation
36. Policy Review
37. Standard Setting
38. Technical Writing
39. Advocacy for Policy Formulation/Implementation

SERVICE AREA 6: CURRICULUM DEVELOPMENT (CO/RO)
40. Knowledge on the Technical Vocabulary of Curriculum Engineering
41. Understanding of the Foundations of the Curriculum
42. Application of the Foundations of the Curriculum in Curriculum Engineering
43. Curriculum Designing
44. Curriculum Structuring
45. Implementation of Various Curriculum Models
46. Curriculum Evaluation


LIST OF COMPETENCIES AND BEHAVIORAL INDICATORS

GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)

1. Understanding DepED as an Organization

 Understanding DepED vision, goals, and objectives
 Being familiar with the organizational structure and the individuals involved

2. Understanding RA 9155 or the Governance of Basic Education Act

 Understanding the facets of the education sector decentralization law and its Implementing Rules and Regulations

3. Management of Change

 Understanding the nature and phases of change
 Facilitating the change process in the unit

4. Organization Analysis/Diagnosis

 Assessing the entire organization – its objectives, its resources and the ways it allocates resources to attain goals
 Knowledge of approaches and tools for organization analysis

5. Identifying and Solving Problems

 Ability to identify problems or potential problems in a timely manner and to identify strategies or means to address them

6. Decision-Making

 Efficiently arriving at conclusions on issues and problems, whether internal or external, minor or major
 Acting on the conclusions arrived at to achieve desired results

7. Dealing Effectively with Pressure Groups

 Handling the various interests of stakeholders
 Involves political skills and managing political interference

8. Conflict Management

 Ability to identify, handle and manage differences and conflict situations
 Ability to resolve conflicts

9. Negotiation Skills

 Ability to influence and work with people to reach solutions that everyone can live with

10. Transformational and Enabling Leadership

 Extending expertise to other regions
 Providing opportunities for professional growth to staff
 Institutionalizing efforts or practices found effective
 Identifying and developing emerging leaders
 Creating a culture of excellence to enhance peak performance
 Creating an atmosphere where individual differences are celebrated and where every individual grows and maximizes his/her potential


SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
Outputs: Division/Region Education Development Plan; Consolidated Annual Plan

11. Strategic Planning

Preparing clear statements that describe a course of action in terms of identified goals and objectives. Strategic planning involves an understanding of the following:

‐ Current and potential future developments at the provincial, national, and international levels
‐ Planning process and principles
‐ Department planning timelines and requirements
‐ Current Department directions
‐ Current regional thrusts
‐ Principles and processes in environmental scanning
‐ Processes related to collaborative educational planning

and proficiency in the following:

‐ Gather pertinent planning information / inventory of resources
‐ Analyze planning information
‐ Draw inferences from planning information
‐ Conduct a thorough analysis of the current situation
‐ Prioritize needs
‐ Identify core organizational values
‐ Formulate a vision
‐ Formulate a mission
‐ Perform data forecasting
‐ Set realistic goals to achieve the vision
‐ Set realistic performance targets
‐ Identify appropriate strategies

12. Implementation Planning

Translating goals and objectives into specific interventions, e.g. programs and projects. Implementation planning involves an understanding of the following:

‐ Forecasting and trend-setting
‐ Processes related to prioritization of needs
‐ Varied tools in plan development
‐ Designing a monitoring and evaluation system
‐ Budget procedures of the Department
‐ Possible funding sources
‐ Budget items

and proficiency in the following:

‐ Prioritize needs
‐ Identify sufficient activities to achieve targets and priorities
‐ Identify performance measures
‐ Identify means of verification
‐ Identify resource requirements
‐ Identify programs and projects to achieve objectives
‐ Link programs/projects with resources
‐ Perform financial forecasting and planning
‐ Allocate human, material, and fiscal resources

13. Project/Program Identification

Involves the generation of potential project ideas to address problems or gaps

14. Resource Mobilization and Allocation

 Identifying sources of funds
 Proper distribution of funds across funding requirements

15. Financial Management and Control

Understanding budgeting and financial management principles and funding processes

Understanding COA rules, regulations, and applications

16. Group Process Management

Managing group dynamics during planning activities to accomplish the tasks at hand. Group process management involves an understanding of the following:

‐ Workshop management
‐ Principles of group process management
‐ Interpersonal styles supportive of group process
‐ Areas of facilitation

and proficiency in the following:

‐ Generate group ideas
‐ Engage participants in plan development
‐ Manage conflict of ideas and move towards synergy
‐ Ask probing questions
‐ Summarize group outputs
‐ Design and manage planning workshops
‐ Build consensus

17. Facilitation Skills

Involves the ability to get a group of people to work together towards a common goal. It involves the following areas:

‐ Listening
‐ Observing
‐ Questioning
‐ Attending

18. Communication Skills

 Ability to convey the requirements needed to implement the SIP to various school stakeholders

Ability to engage stakeholders in SIP planning and implementation


Applying effective verbal and non-verbal communication methods to achieve desired results

Applying effective written communication methods to achieve desired results

19. Advocacy

Ability to generate awareness, ownership and commitment to plan preparation and implementation

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
Outputs: M&E Design; M&E Instruments and Tools; M&E Reports; Consolidated/Analyzed BEIS Data; Documentation of Success Stories and Best Practices

20. Monitoring and Evaluation Design and Development

Designing and developing the system, facilities, tools and mechanisms for monitoring and evaluation

Setting performance and quality standards for work processes and outputs

21. Instrument/Tools Development for M&E Data Gathering

Preparation of appropriate instruments and tools to gather information to satisfy major monitoring and evaluation areas

22. Data Processing, Analysis and Utilization

Processing, reviewing data gathered to make inferences to enable management to use data for decision-making

23. Communication Skills/Feedback Giving

 Ability to convey monitoring and evaluation results to improve program/project implementation towards improving learning outcomes

Establish and manage a mechanism for effective feedback

24. Education Management Information System (EMIS)

 Ability to convey monitoring and evaluation results to improve program/project implementation towards improving learning outcomes

Establish and manage a mechanism for effective feedback


SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO)
Outputs: Resource Mobilization Plan; Procurement Plan; Human Resource Management Plan; Physical Facilities Plan; Records Management System; Regular Report Requirements

25. Resource Mobilization and Management

Ability to rally support for educational initiatives and proper/prudent allocation of resources across needs in the division/region/central office

Understanding budgeting and financial management principles and funding processes

 Understanding COA rules, regulations, and applications
 Monitoring of resource utilization
 Providing feedback to funders on the status of funds utilization and the benefits derived

26. Resource Procurement and Management

Developing a system of procurement of resources, allocation of resources to specific activities e.g. learning/instructional materials, supplies, etc

Monitoring of resource utilization

27. Building Partnerships

 Developing useful contacts and dealing with a broad range of people from various levels and social and economic backgrounds

Working with other institutions (GOs, NGOs, LGUs, etc) both national and international to achieve common goals

28. Human Resource Management

Ability to plan for manpower requirements based on organizational vision

Ability to attract, select, and hire competent people for organizational deployment

Ability to develop people through various forms e.g. In-Service Training, Coaching Mentoring

Ability to compensate, appraise and reward performance

29. Delegation

 Involves the ability to entrust one’s authority to other people – a style of management which allows staff to use and develop their skills and knowledge to their full potential

30. Physical Facilities Programming

Ability to forecast physical facilities’ requirement and new school requirements

Setting standard specifications catering to requirements of schools, divisions, regions

Allocating funds, constructing facilities, rehabilitation, maintenance and repair


31. Records Management

 Developing a system for managing and updating records
 Maintaining an updated and functional system of storing and retrieving data on work processes and their outputs and other information relevant to decision-making

32. Understanding the intent of the Policy and Implementation

Identifying the purpose of the policy Ability to apply the policy as intended

SERVICE AREA 5: POLICY FORMULATION AND STANDARD SETTING (RO/CO)
Outputs: Regional Policy Framework; Regional Policies; Regional Standards

33. Policy Framework Development

 Identifying policy issues and developing strategies to address these in the design of a policy framework
 Developing an analytical framework for the policy studies undertaken
 Generating lessons from previous policy studies
 Applying these in the development of the policy framework for current research

34. Policy Instrument Development

Identifying and developing the appropriate policy instruments for specific policy implementation

35. Policy Formulation

Understanding the current environment and its requirements and determining policy requirements

Understanding research designs and data gathering methods

36. Policy Review

Applying research methods to assess effectiveness of policies formulated and enforced

37. Standard Setting

Developing a set of appropriate norms

38. Technical Writing

Preparing written material that follows generally accepted rules of style and form, is appropriate for the audience, and accomplishes its intended purpose

39. Advocacy for Policy Formulation/Implementation

Generating support for policy formulation and implementation


SERVICE AREA 6: CURRICULUM DEVELOPMENT (CO/RO)
Outputs: Basic Curriculum; Standards for Curriculum Adaptation; Policies to Guide Curriculum Implementation; Regional Policy Framework for Curriculum Localization

40. Knowledge on the Technical Vocabulary of Curriculum Engineering

Understanding of technical terms currently used for curriculum engineering

Using technical terms in curriculum engineering during professional meetings and technical reporting related to curriculum development

41. Understanding of the Foundation of the Curriculum

 Understanding of the philosophy of Philippine education and the thrust of 21st century education

Understanding of the relevant theories, approaches and requirements of curriculum engineering.

42. Application of the Foundations of the Curriculum in Curriculum Engineering

 Utilizing knowledge and understanding of the foundations of curriculum engineering in the design, implementation and evaluation of the curriculum

43. Curriculum Designing

Contributing significant inputs to the design of the curriculum’s framework, curriculum model, teaching-learning approaches and assessment.

Familiarity with theories of cognitive development and learning as they apply to development of instructional materials

Skills in development, assessment and evaluation of instructional materials

44. Curriculum Structuring

Contributing significant inputs in prescribing learning standards, objectives and content of the different subject areas in the various grade and year levels.

45. Implementation of Various Curriculum Models

 Implementing initiatives for the localization of the national curriculum
 Engaging in pilot-testing various curriculum models recommended as potentially effective
 Disseminating best practices and replicable models related to the curriculum

46. Curriculum Evaluation


Evaluating the effectiveness of the curriculum in addressing local learning needs

Providing feedback to appropriate bodies to improve the curriculum design and/or structure


Attachment 2: Region Organizational TDNA M&E Tools

o M&E Matrix of Tools for the Organizational TDNA
o T&D-M&E Form 1: Individual Profile Template
o Org’l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion (FGD) Process at the Region/Division Level
o Org’l TDNA-M&E Form 2b: Region Organizational TDNA Scores Summary Template
o Org’l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template
o Org’l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template
o Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA


What will be monitored: Respondents’ details in relation to their current position, their level of experience and qualifications
How it will be monitored: All participants in the Organizational TDNA will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the Organizational TDNA Tool
How will the results be used: Information will be entered into the TDIS database

What will be monitored: The processes followed during the conduct of the Focus Group Discussion (FGD) at the Region/Division level
How it will be monitored: A process observer will be appointed and will use the tool
M&E tool to be used: Org’l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division Level
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: During the conduct of the FGD for the Organizational TDNA
How will the results be used: Results will be shared with the FGD facilitators to identify best practices and areas for improvement. Recommendations for improving the process will be included in the Program Completion Report to inform future processes.

What will be monitored: The level of competency and the level of importance of the Division/Region for the various management competencies across service areas
How it will be monitored: Results of the Organizational TDNA will be consolidated using the templates provided
M&E tool to be used: Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template; Org’l TDNA-M&E Form 2b: Regional Organizational TDNA Scores Summary Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Following the accomplishment of the Division/Region Organizational TDNA
How will the results be used: Results will inform decisions on the training and development programs offered at the division level and will be incorporated into both the DEDP and REDP

What will be monitored: The Organizational TDNA of the functional divisions/sections/units
How it will be monitored: Results of the Organizational TDNA will be consolidated for each functional division/section/unit using the template provided
M&E tool to be used: Org’l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Following the accomplishment of the Organizational TDNA
How will the results be used: Results will inform decisions on the training and development programs offered at the division/regional level and will be incorporated into both the DEDP and REDP

What will be monitored: The Organizational TDNA results of the various divisions across a region
How it will be monitored: Results of the Organizational TDNA will be consolidated for all divisions within a region using the template provided
M&E tool to be used: Org’l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template
Who will be responsible for the monitoring: Region TDNA-WG
When will the monitoring take place: Following the submission of Division Organizational TDNA results
How will the results be used: Results will inform decisions on the training and development programs offered at the region level and will be incorporated into the REDP. The results will be analyzed to inform future TDNA policy.

What will be monitored: The implementation of the Organizational TDNA at the Division/Region levels
How it will be monitored: A process observer will be identified and asked to complete the tool
M&E tool to be used: Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: During the conduct of the Organizational TDNA at the Division/Region level
How will the results be used: Results will be discussed with the Division/Region to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.


T&D-M&E Form 1: Individual Profile Template

I. PERSONAL DATA

Name: ______________________________________________________________
          (Surname)                    (First Name)                    (Middle Name)
Employee Number (If Applicable): ____________   Sex:  Male   Female
Date of Birth: ____________________
Home Address: ______________________________________________________
Contact #: ____________________   E-mail Address: ____________________
Region: ____________   Division: ____________   District: ____________
Office/School: ____________________   Address: ____________________
Current Position: ____________________
Other Designations: ____________________
Highest Educational Attainment: ____________________

II. WORK EXPERIENCE
(List from most current.)

POSITION | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | INCLUSIVE PERIOD


Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three years.

Training Focus | Training attended over last 3 years () | Management Level of Training (Central / Region / Division / Cluster / School)

Curriculum

Resource Materials Development

Planning

Management

Policy Development

Research

Other, please specify ______________

IV. SIGNIFICANT EXPERIENCES

Identify which of the following areas you consider to be your area(s) of expertise:
 School Based Management    Quality Assurance    Monitoring and Evaluation
 Access Education (Subject Specialization: _____________)    Education Planning    Policy Development
 Learning Resource Materials Development    ICT    Delivery of Training
 Other, please specify ________________

Certified Trainer by:
 NEAP Central    NEAP-Region    TEI    SEAMEO-INNOTECH    Foreign Assisted Projects (FAP)
 Other, please specify ________________

List your significant experiences in the identified areas:


Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES

Identify which of the following specific areas you consider to be your area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: 

Signature:    

Please submit completed form to Training and Development Division/Unit. Information will be incorporated into the T&D Information System Database.


Org’l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion (FGD) Process at the Region/Division Level

_______ FGD Flow of Regional TDNA Self-Assessment
_______ FGD Flow of Division TDNA Self-Assessment
_______ Monitoring of Division Organizational TDNA by Regional Team

Please check (✔) under the manifested (M) column if the process was manifested and under the not manifested (NM) column if the process was not manifested. Please indicate any variations noted and include any additional comments regarding the facilitation of the session.

ACTIVITY (M / NM / Variations/Comments)

1. Facilitator emphasizes to the participants that, as key respondents to this Organizational TDNA of the Region/Division, their answers and collaboration with their colleagues to reach a consensual assessment will be most helpful for the future development of the management competencies of the region.
2. Facilitator clearly presents the purpose of the Organizational TDNA.
3. A brief description of the data gathering method (FGD) was provided by the facilitator.
4. Facilitator comprehensively explained the rating scales.
5. A systematic walkthrough of the ‘Management Competencies per Service Area’, one service area at a time, was carried out.
6. Each section collaboratively reached a consensus on the level of importance and the level of competency for each service area.
7. Section ratings were recorded properly.
8. Each participant provided an individual perception of the perceived importance of the competency in the performance of the region’s/division’s task/job and shared this with the group.
9. TDNA-WG members efficiently performed their assigned tasks.
10. Participants carefully deliberated on the average level of importance and competency ratings.
11. A consolidation of the Organizational TDNA results following the guidelines outlined in the FGD flow was accomplished.
12. An M&E committee was tasked to monitor and evaluate the preparation, conduct and consolidation of the TDNA results.

Total

Name and Signature of the Process Observer: ___________________________________ Date: ____________________________


Org’l TDNA-M&E Form 2b: Region Organizational TDNA Scores Summary Template

REGION ______

For each competency, the raw Level of Competency (LOC) scores from the Region’s self-assessment (S-A) and from the Division respondents (Div) are entered, the weighted rating is computed (S-A 60% + Div 40% = Total), and the Level of Importance (LOI) from the self-assessment is recorded.

GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)
1. Understanding DepED as an Organization
2. Understanding RA 9155 or the Governance of Basic Education Act
3. Management of Change
4. Organization Analysis/Diagnosis
5. Identifying and Solving Problems
6. Decision-Making
7. Dealing Effectively with Pressure Groups
8. Conflict Management
9. Negotiation Skills
10. Transformational and Enabling Leadership


SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO) – raw LOC scores (S-A, Div), weighted rating (S-A 60%, Div 40%, Total) and LOI
11. Strategic Planning
12. Implementation Planning
13. Project/Program Identification
14. Resource Mobilization and Allocation
15. Financial Management and Control
16. Group Process Management
17. Facilitation Skills
18. Communication Skills
19. Advocacy

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO) – raw LOC scores (S-A, Div), weighted rating (S-A 60%, Div 40%, Total) and LOI
20. Monitoring and Evaluation Design and Development
21. Instrument/Tools Development for M&E Data Gathering
22. Data Processing, Analysis and Utilization
23. Communication Skills/Feedback Giving
24. Education Management Information System (EMIS)


SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO) – raw LOC scores (S-A, Div), weighted rating (S-A 60%, Div 40%, Total) and LOI
25. Resource Mobilization and Management
26. Resource Procurement and Management
27. Building Partnerships
28. Human Resource Management
29. Delegation
30. Physical Facilities Programming
31. Records Management
32. Understanding the Intent of the Policy and Implementation

SERVICE AREA 5: POLICY FORMULATION AND STANDARD SETTING (RO/CO) – raw LOC scores (S-A, Div), weighted rating (S-A 60%, Div 40%, Total) and LOI
33. Policy Framework Development
34. Policy Instrument Development
35. Policy Formulation
36. Policy Review
37. Standard Setting
38. Technical Writing
39. Advocacy for Policy Formulation/Implementation


SERVICE AREA 6: CURRICULUM DEVELOPMENT (CO/RO) – raw LOC scores (S-A, Div), weighted rating (S-A 60%, Div 40%, Total) and LOI
40. Knowledge on the Technical Vocabulary of Curriculum Engineering
41. Understanding of the Foundations of the Curriculum
42. Application of the Foundations of the Curriculum in Curriculum Engineering
43. Curriculum Designing
44. Curriculum Structuring
45. Implementation of Various Curriculum Models
46. Curriculum Evaluation

NOTE: The lower the numerical value of the LOC and the LOI, the greater the need for training.
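The weighted rating columns in this template combine the Region’s self-assessment (60%) with the rating given by the Division respondents (40%). The following minimal sketch illustrates that computation; the function and variable names are hypothetical, not part of the template itself.

```python
# Illustrative sketch: weighted Level of Competency (LOC) for one competency,
# combining the Region self-assessment (60%) and the Division respondents' rating (40%).
def weighted_loc(self_assessment: float, division_assessment: float) -> float:
    """Return the total weighted LOC for one competency."""
    return round(0.6 * self_assessment + 0.4 * division_assessment, 2)


if __name__ == "__main__":
    # e.g. a competency rated 3 in the self-assessment and 2 by the Division respondents
    total = weighted_loc(3, 2)
    print(total)  # 2.6 -- the lower the value, the greater the training need
```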


Org’l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template

(To be accomplished by the TDNA-WG)

LEVEL OF PLAN:  REGION   DIVISION        Date Accomplished: ______________________

Supply the following data: 1) the name of each functional division/section/unit, and 2) the numerical LOI and LOC ratings for each service area of each division/section/unit.

Competencies / Service Areas (rows), by Name of Functional Division/Section/Unit (columns), with LOI and LOC entered for each:
General Competencies
Service Area 1: Educational Planning
Service Area 2: Learning Outcome Management
Service Area 3: Monitoring & Evaluation
Service Area 4: Education Administration & Management
Service Area 5: Policy Formulation and Standard Setting
Service Area 6: Curriculum Development

Priority 1: __________   Priority 2: __________   Priority 3: __________


Org’l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template

(To be accomplished by the Regional TDNA-WG)

REGION ___________________________        Date Accomplished: ______________________

NOTE: For Regions with more than seven (7) Schools Divisions, additional columns may be added. Only the general average of each service area is entered.

Competencies / Service Areas (rows), by Schools Division within the Region (columns: Division 1 … Division 7), with LOI and LOC entered for each:
General Competencies
Service Area 1: Educational Planning
Service Area 2: Learning Outcome Management
Service Area 3: Monitoring & Evaluation
Service Area 4: Education Administration & Management
Service Area 5: Policy Formulation and Standard Setting
Service Area 6: Curriculum Development

Priority 1: __________   Priority 2: __________   Priority 3: __________
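Since only the general average of each service area is entered in this consolidation template, competency-level LOC scores must first be averaged by service area for each Schools Division. The sketch below illustrates one way to do this; the data layout and names are assumptions for the example, and the competency-to-service-area mapping is abridged.

```python
# Illustrative sketch: compute the general average LOC per service area for one
# Schools Division, ready for transcription into the consolidation template.
from statistics import mean

# Competency number -> service area label (abridged; the full mapping covers 1-46).
SERVICE_AREA = {**{n: "General Competencies" for n in range(1, 11)},
                **{n: "Service Area 1: Educational Planning" for n in range(11, 20)}}


def service_area_averages(division_loc: dict) -> dict:
    """division_loc maps competency number -> consolidated LOC for one division."""
    buckets: dict = {}
    for comp, score in division_loc.items():
        buckets.setdefault(SERVICE_AREA[comp], []).append(score)
    return {area: round(mean(scores), 2) for area, scores in buckets.items()}


if __name__ == "__main__":
    division_1 = {1: 3.0, 2: 2.5, 11: 2.0, 12: 1.8}
    print(service_area_averages(division_1))
```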


Org’l TDNA-M&E Form 5: Documentation Review of Organizational TDNA Region/Division level

This form is to be used to support Regional monitoring of the Organizational TDNA processes at the Division level. It is expected that the assessment will be based on observations, discussions with the implementing team and review of relevant documents.

Division/Region _________________________ Date: __________________

Rating Guide:
4 – Very High Extent: In a very significant way
3 – High Extent: In a meaningful way
2 – Low Extent: In a limited way only
1 – Very Low Extent: Not in any meaningful way

Use the scale above to assess the extent to which the conduct of Organizational TDNA documentation adhered to the following:

To what extent … (rate 1–4)
1. was thorough planning conducted prior to administration?
2. was the purpose of the Organizational TDNA explained?
3. was the data collection method to be followed for administering the Organizational TDNA explained, e.g. group consensual assessment technique, self-assessment and an external assessment?
4. were participants oriented to the Organizational Management Competencies for each service area?
5. was a clear explanation provided on how to accomplish the Organizational TDNA process, e.g. consensus agreement within each division/unit regarding level of competence and level of importance, agreement across divisions/units, scoring system?
6. were the steps involved in consolidating the results for individual divisions/units as well as the overall region/division explained?
7. were the steps involved in consolidating the self-assessment and the external assessment explained?
8. was an explanation provided on how to interpret results to identify priority training needs?
9. was technical assistance provided when required?
10. were the M&E tools and processes implemented?
11. was there evidence of teamwork and collaboration amongst the Organizational TDNA implementers?
12. were recommendations for improving the Organizational TDNA administration processes identified?

Recommendations:

Name: ___________________________________

Position: _________________________________

Date: ____________________________________


Section 7.4.

The Organizational TDNA for the Division Guide and Tools


Republic of the Philippines

Department of Education

Organizational Training and Development Needs Assessment (TDNA) for the Division

Guide and Tools

DepED-EDPITAF-STRIVE 2
June 2010


This document, Organizational Training and Development Needs Assessment of the Division Guide and Tools, was developed and validated in Regions VI, VII and VIII, Divisions of Negros Occidental, Bohol/Tagbilaran and Northern Samar, through the AusAID-funded project STRIVE (Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas), in coordination with the EDPITAF (Educational Development Project Implementing Task Force), and in consultation with the TEDP-TWG, NEAP and the Bureaus of the Department of Education.


TABLE of CONTENTS

1. Introductory Information
   Basis and Purpose of the Organizational TDNA
   Division Office Respondents
   Assessment Approach and Methodology
   Documents to be Used for the Organizational TDNA

2. General Instructions for the TDNA Working Group
   Preliminary Preparation
   Organizational TDNA Proper
   Post Organizational TDNA

3. Monitoring and Evaluation of the Organizational TDNA

Attachment 1: Division Organizational TDNA Tools and Templates
Attachment 2: Division Organizational TDNA Monitoring and Evaluation Tools


1. Introductory Information

Basis and Purpose of the TDNA

The Training and Development Needs Assessment (TDNA) of the Division is designed to identify the organization’s current training and development needs vis-à-vis the desired organizational roles and responsibilities as stipulated in the Governance of Basic Education Act of 2001 (R.A. 9155).


GLOSSARY OF ACRONYMS

BESRA Basic Education Sector Reform Agenda

CO Central Office

DEDP Division Education Development Plan

DepED Department of Education

DO Division Office

EDPITAF Educational Development Project Implementing Task Force

EBEIS Enhanced Basic Education Information System

ES Education Supervisor

FGD Focus Group Discussion

GCA Group Consensual Assessment

ICT Information Communication Technology

INSET In-Service Education and Training

IRR Implementing Rules and Regulations of RA 9155, December 2007

IPPD Individual Plan for Professional Development

KSA Knowledge, Skills and Attitudes

LOC Level of Competency

LOI Level of Importance

LRMDS Learning Resource Management and Development System

MPPD Master Plan for Professional Development

M&E Monitoring and Evaluation

PSDS Public School District Supervisor

RA 9155 Republic Act 9155: Governance Act for Basic Education, 11 Aug 2001

REDP Regional Education Development Plan

RO Regional Office

SEDIP Secondary Education Development Improvement Project

SH School Head

STRIVE Strengthening the Implementation of Basic Education in Selected Provinces in the Visayas

T&D Training and Development

TDIS Training and Development Information System

TDNA Training and Development Needs Assessment

UIS Unified Information System

WG Working Group

The ‘Management Competencies per Service Areas’ established for the Decentralized Management Training Program of the Secondary Education Development Improvement Project (SEDIP, a DepED project implemented by the Bureau of Secondary Education) served as the basis for the development of this Organizational TDNA tool. These competencies were developed for Central Office (CO), Regional Office (RO) and Division Office (DO) levels and validated by educational leaders across 15 Divisions and nine Regions of DepED. The competencies are organized as follows:

 General Competencies (CO, RO, DO)
 Service Area 1: Educational Planning (RO, DO)
 Service Area 2: Learning Outcome Management (DO)
 Service Area 3: Monitoring and Evaluation (RO, DO)
 Service Area 4: Education Administration and Management (CO, RO, DO)
 Service Area 5: Policy Formulation and Standard Setting (CO, RO)
 Service Area 6: Curriculum Development (CO, RO)

The Organizational TDNA was first developed and used by Region VII and VIII and the Divisions of Bohol and Northern Samar during Stage One of Project STRIVE. The tool was further refined and validated in Region VI and in the Division of Negros Occidental during Stage 2 of Project STRIVE.

Division Office Section Respondents

Respondents comprising at least 20% of the division office management and technical staff from each section are convened to participate in the actual TDNA, e.g. SDS/ASDS, Elementary Division, Secondary Division, ALS Division, Planning, Accounting/Budget, Cashier, Medical/Dental, Administrative, Legal, and Supply/Physical Facilities. Representation from each section should include the section head. Smaller sections may be clustered to form a group (e.g. Cashier, Medical/Dental and Legal can become one group).

Assessment Approach

The Organizational TDNA is done first as a “self-assessment” exercise participated in by the section respondents through a Focus Group Discussion (FGD) technique and using FORM A of the FGD Flow (see Attachment 1). The respondents of each section will have to arrive at a consensual description of the division vis-à-vis the management competencies using the scale provided.

Results of the “self-assessment” will form 60% of the total result of the Division’s Organizational TDNA. To complete the Division’s TDNA, the Region should do a parallel assessment of the Division’s current competencies using FORM B of the FGD Flow (see Attachment 1). Results of the parallel assessment will form 40% of the total results for the Division.
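A minimal computational sketch of this 60/40 consolidation appears below. It is illustrative only and is not part of the official templates; the consolidation itself is recorded on Org’l TDNA-M&E Form 2a, and the sample ratings used here are hypothetical.

    # Illustrative sketch of the 60/40 consolidation described above; the official
    # consolidation is recorded on Org'l TDNA-M&E Form 2a. Sample values are hypothetical.

    SELF_WEIGHT = 0.60    # share of the Division's self-assessment
    REGION_WEIGHT = 0.40  # share of the Region's parallel assessment

    def consolidated_rating(self_rating: float, region_rating: float) -> float:
        """Combine the two ratings for one competency into the Division's total rating."""
        return SELF_WEIGHT * self_rating + REGION_WEIGHT * region_rating

    # Example: the Division rates itself 3 (High Level) on a competency,
    # while the Region's parallel assessment rates the Division 2 (Moderate Level).
    print(consolidated_rating(3, 2))  # -> 2.6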

A TDNA Working Group (TDNA-WG) established by the division will be responsible for overseeing the TDNA process. The Group is expected to make preliminary preparations to facilitate the FGD and consolidate each section’s results as well as the overall Organizational TDNA results by following the instructions and procedures outlined in the FGD Flow.

Documents to be used for the Division Organizational TDNA

The Organizational TDNA of the Division Guide and Tools contains the general instructions for the TDNA-WG as well as the following materials (found in Attachment 1) to be used by the TDNA-WG Facilitators and the respondents during the conduct of the Organizational TDNA process:

(a) Materials for the Facilitators: General Instructions for the TDNA-WG, FGD Flow, Matrix of the Management Competencies and List of Competencies and Behavior Indicators, Rating Scales Description, Competency Rating Board and Results Summary Template (e-version available)

(b) Materials for the Respondents: Matrix of Management Competencies and List of Competencies and Behavior Indicators, Separate Answer Sheet, and Rating Scales Descriptions

2. General Instructions for the TDNA Working Group

A. Preliminary Preparations:

1. Study the Organizational TDNA Guide and Tools by conducting a “walkthrough” with all the members of the TDNA-WG before the actual conduct of the Organizational TDNA. The Chair of the group should lead this activity.

2. Assign specific tasks to each member such as:

a. Presenter of the Management Competencies
b. Facilitator(s) for the FGD
c. Data Recorder(s) of assessment ratings
d. Monitoring and evaluation

3. Reproduce the following set of materials to be used by each of the participating respondents: Matrix of Management Competencies, List of Competencies and Behavior Indicators, Rating Scales Descriptions and Separate Answer Sheet (one per section only).

4. Prepare the Competency Rating Boards. A Competency Rating Board is required for each competency across the service areas, i.e. 44 Competency Rating Boards will be required.

5. Negotiate for the Region to complete a parallel assessment of the Division’s current competencies using FORM B of the FGD Flow.

B. Organizational TDNA Proper

1. Let each participant accomplish T&D-M&E Form 1: Individual Profile Template upon registration.
2. Follow the instructions listed in the FGD Flow Form A (see following page).
3. Accomplish Org’l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division Level to monitor the facilitation of the session.

C. Post Organizational TDNA

1. Consolidate the results of the self-assessment exercise (60%) and the results obtained from the Region’s assessment of the divisional competencies (40%). Note that the parallel assessment of the Division competencies completed by the Region is usually done as a separate activity and coincides with the “TDNA Self-Assessment” of the Region. Use Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template prepared for this purpose (e-copy available).

2. Analyze the results and identify the priority training and development needs of the Division (e.g. lowest level of competency, highest level of importance) using Org’l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template (a worked sketch follows this list).

3. The assigned TDNA M&E team accomplishes Org’l TDNA-M&E Form 5: Documentation Review of Organizational TDNA, Region/Division Level.
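The prioritization in Step 2 can be thought of as sorting the consolidated results so that the weakest and most essential competencies surface first. The sketch below is illustrative only: the competency names and ratings are hypothetical, and it assumes the consolidated values from Form 2a are already at hand. On the scales in this Guide a lower Level of Importance value means the competency is more essential, so both values are sorted in ascending order.

    # Illustrative sketch of the prioritization in Step 2: surface the competencies with the
    # lowest consolidated Level of Competency (LOC) first, breaking ties in favor of the more
    # essential competency (on the LOI scale, 1 = Highly Essential). Data are hypothetical.

    consolidated = [
        # (competency, consolidated LOC, consolidated LOI)
        ("Strategic Planning",                               2.4, 1.2),
        ("Records Management",                               3.1, 2.6),
        ("Monitoring and Evaluation Design and Development", 1.8, 1.4),
    ]

    priorities = sorted(consolidated, key=lambda row: (row[1], row[2]))

    for rank, (competency, loc, loi) in enumerate(priorities, start=1):
        print(f"Priority {rank}: {competency} (LOC {loc}, LOI {loi})")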

3. Monitoring and Evaluation of the Organizational TDNA

The TDNA-WG members are tasked to monitor and evaluate the administration of the Organizational TDNA. A Monitoring and Evaluation mechanism and tools have been developed to support the Organizational TDNA process and consist of the following:

T&D-M&E Form 1: Individual Profile Template
Org’l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion (FGD) Process at the Region/Division Level
Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template
Org’l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template
Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA

The matrix below describes the M&E tools developed for use during the implementation of the Organizational TDNA; these tools can be found in Attachment 2. The TDNA-WG should be convened anew to consolidate the results from the monitoring of the Organizational TDNA and to develop recommendations for the improvement of the process.

What will be monitored: Respondents’ details in relation to their current position, their level of experience and qualification
How it will be monitored: All participants in the Organizational TDNA will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the Organizational TDNA Tool
How will the results be used: Information will be entered into the TDIS database

What will be monitored: The processes followed during the conduct of the Focus Group Discussion (FGD) at the Region/Division level
How it will be monitored: A process observer will be appointed and will use the tool
M&E tool to be used: Org’l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division Level
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: During the conduct of the FGD for the Organizational TDNA
How will the results be used: Results will be shared with the FGD facilitators to identify best practices and areas for improvement. Recommendations for improving the process will be included in the Program Completion Report to inform future processes.

What will be monitored: The level of competency and the level of importance of the Division/Region for the various management competencies across service areas
How it will be monitored: Results of the Organizational TDNA will be consolidated using the template provided
M&E tool to be used: Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template; Org’l TDNA-M&E Form 2b: Regional Organizational TDNA Scores Summary Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Following the accomplishment of the Division/Region Organizational TDNA
How will the results be used: Results will inform decisions on the training and development programs offered at the division level and will be incorporated into both the DEDP and REDP

What will be monitored: The Organizational TDNA of the functional divisions/sections/units
How it will be monitored: Results of the Organizational TDNA will be consolidated for each functional division/section/unit using the template provided
M&E tool to be used: Org’l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Following the accomplishment of the Organizational TDNA
How will the results be used: Results will inform decisions on the training and development programs offered at the division/regional level and will be incorporated into both the DEDP and REDP

What will be monitored: The implementation of the Organizational TDNA at the Division/Region levels
How it will be monitored: A Process Observer will be identified and asked to complete the tool
M&E tool to be used: Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: During the conduct of the Organizational TDNA at the Division/Region level
How will the results be used: Results will be discussed with the Division/Region to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.

Attachment 1: Division Organizational TDNA Tools and Templates

FORM A: Focus Group Discussion Flow – Division Organizational TDNA Self-Assessment

FORM B: Focus Group Discussion Flow – Division Organizational TDNA by Region Respondents

Division Organizational TDNA Rating Scales

Organizational TDNA Self-Assessment for the Division Answer Sheet

Sample Competency Rating Board for the Division Organizational TDNA

Matrix of Management Competencies for the Division

List of Competencies and Behavioral Indicators

FORM A: Focus Group Discussion Flow – Division Organizational TDNA Self-Assessment

Objective: Establish a friendly atmosphere
Activity:
1. Introductions. Emphasize to the participants that, as key respondents to this Organizational TDNA of the Division, their answers and collaboration with their colleagues to reach a consensual assessment will be most helpful for the future development of the management competencies of the division.

Objective: Develop an understanding of the purpose of the Organizational TDNA and the basic mechanics of the exercise
Activity:
2. Present the purpose of the Organizational TDNA and provide a brief description of the data gathering method, e.g. focus group discussion to reach a consensus.

Objective: Achieve a common understanding of the definition of the items in the rating scale
Activity:
3. Explain the rating scales.

Objective: Achieve a common understanding of each competency
Activity:
4. Walk through the ‘Management Competencies per Service Area’ one service area at a time. Begin with the General Competencies across units. Consider the behavioral indicators for each competency.
5. Facilitator to clarify queries on the competencies.

Objective: Articulate the individual perception of the level of competence of the division, the individual perception of the criticality of the competence to the performance of the task, and the section/group agreement on the level of competency of the division and the level of importance of the competency
Activity:
6. Organize respondents into their respective sections/groups.
7. Each participant provides an individual perception of the division per competency and shares it with the section/group.
8. The section/group reaches a consensus and provides a rating for each competency, e.g. 4, 3, 2 or 1.
9. Record the section rating on the separate answer sheet.
10. Each participant provides an individual perception of the ‘perceived importance of the competency in the performance of the division’s task/job’ and shares it with the group.
11. The section/group reaches a consensus and provides a rating for the importance of each competency, e.g. 1, 2, 3 or 4.
12. Record the section rating on the answer sheet.

Objective: Determine the average rating of the perceived importance of the competency in the performance of the division’s task/job, and of the current level of competency, as an initial step to establish the consensual description of the division
Activity:
13. When all sections/groups have completed their ratings for a service area, each section/group records its ratings on the competency rating board.
14. Sections/groups repeat Steps 4-13 for each of the remaining service areas.
15. Assigned TDNA-WG members calculate the average across all section/group ratings per competency while the groups continue rating the remaining competencies for each service area.

Objective: Identify which competencies will need further discussion to establish a consensual rating
Activity:
16. Assigned TDNA-WG members inspect each competency rating board to determine deviant (individual) or polarized ratings that may have tipped the average rating to a non-representative perception of the group (a minimal scripted sketch of this averaging and flagging follows this form).
17. Once all competencies have been rated, the TDNA-WG presents the averages for all competencies across the service areas, identifying those averages which require further consideration (based on the inspection in Step 16).
18. Where necessary, the participants deliberate on the average competency rating for the division by articulating their views or sharing factual experience that best describes the current level of the division vis-à-vis the competency in question. The Facilitator is expected to moderate the discussion.

Objective: Establish a consensus on competencies
Activity:
19. All sections/groups should agree on the consensual ratings made per competency before considering the next competency.
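Steps 13 to 17 of the flow above have the TDNA-WG average the section/group ratings recorded on each Competency Rating Board and flag boards with deviant or polarized ratings for further deliberation. The sketch below illustrates that arithmetic only; the spread threshold used to flag a board is an assumption made for illustration, and the sample ratings are hypothetical.

    # Illustrative sketch of the averaging (Step 15) and flagging (Step 16) done on a
    # Competency Rating Board. The spread threshold is an assumed rule of thumb, not an
    # official standard; sample section ratings are hypothetical.

    def summarize_board(ratings: list[int], spread_threshold: int = 2):
        """Return the average rating and whether the board needs further deliberation."""
        average = sum(ratings) / len(ratings)
        needs_deliberation = (max(ratings) - min(ratings)) >= spread_threshold
        return round(average, 2), needs_deliberation

    # Hypothetical Level of Competency ratings from six sections for one competency.
    section_ratings = [4, 4, 3, 1, 4, 4]
    average, flagged = summarize_board(section_ratings)
    print(average, flagged)  # -> 3.33 True (one deviant rating tipped the average)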

FORM B: Focus Group Discussion Flow – Division Organizational TDNA by Region Respondents*

Objective: Establish a friendly atmosphere
Activity:
1. Introductions. Emphasize to the participants that, as key respondents to this Organizational TDNA of the Division, their answers and collaboration with their colleagues to reach a consensual assessment will be most helpful for the future development of the management competencies of the division.

Objective: Develop an understanding of the purpose of the Organizational TDNA and the basic mechanics of the exercise
Activity:
2. Present the purpose of the Organizational TDNA and provide a brief description of the data gathering method, e.g. focus group discussion to reach a consensus.

Objective: Achieve a common understanding of the definition of the items in the rating scale
Activity:
3. Explain the rating scales.

Objective: Achieve a common understanding of each competency
Activity:
4. Walk through the ‘Management Competencies per Service Area’ one service area at a time. Begin with the General Competencies across units. Consider the behavioral indicators for each competency. Call the attention of the participants to the differences between the DIVISION service areas and their REGION service areas.
5. Facilitator is expected to clarify queries on the competencies.

Objective: Articulate the:
a. individual perception of the level of competence of the division
b. individual perception of the criticality of the competence to the performance of the task
c. section/group agreement on the level of competency of the division and the level of importance of the competency
Activity:
6. Organize respondents into their respective sections/groups.
7. Each participant provides an individual perception of the division per competency and shares it with the section/group.
8. The section/group reaches a consensus and provides a rating for each competency, e.g. 4, 3, 2 or 1.
9. Record the section/group rating on the separate answer sheet.
10. Each participant provides an individual perception of the ‘perceived importance of the competency in the performance of the division’s task/job’ and shares it with the group.
11. The section/group reaches a consensus and provides a rating for the importance of each competency, e.g. 1, 2, 3 or 4.
12. Record the section/group rating on the answer sheet.

Objective: Determine the average rating of the perceived:
a. importance of the competency in the performance of the division’s task/job
b. current level of competency, as an initial step to establish the consensual description of the division
Activity:
13. When all sections/groups have completed their ratings for a service area, each section records its ratings on the competency rating board.
14. Sections/groups repeat Steps 4-13 for each of the remaining service areas.
15. Assigned TDNA-WG members calculate the average across all section/group ratings per competency while the sections continue rating the remaining competencies for each service area.

Objective: Identify which competencies will need further discussion to establish a consensual rating
Activity:
16. Assigned TDNA-WG members inspect each competency rating board to determine deviant (individual) or polarized ratings that may have tipped the average rating to a non-representative perception of the group.
17. Once all competencies have been rated, the TDNA-WG presents the averages for all competencies across the service areas, identifying those averages which require further consideration (based on the inspection in Step 16).
18. Where necessary, the participants deliberate on the average competency rating for the division by articulating their views or sharing factual experience that best describes the current level of the division vis-à-vis the competency in question. The Facilitator is expected to moderate the discussion.

Objective: Establish a consensus on competencies
Activity:
19. All sections/groups should agree on the consensual ratings made per competency before considering the next competency.

*Note: Form B is used by the Regional Office Focus Group for the purpose of the parallel assessment of the Division Organizational Competencies. It is completed by the Region and usually coincides with the “Organizational TDNA Self-Assessment” of the Region.

Division Organizational TDNA Rating Scales

SCALE ON THE LEVEL OF COMPETENCE

4 VERY HIGH LEVEL: The division has comprehensive knowledge of the competency and can apply it with a high level of confidence. Outputs resulting from the performance of the competency are viewed as very comprehensive, of high quality, and have been used as a benchmark for others.

3 HIGH LEVEL: The division has substantial knowledge of the competency and can apply it without supervision/guidance. Outputs resulting from the performance of the competency are viewed as comprehensive, of quality, and very useful.

2 MODERATE LEVEL: The division has a basic understanding of the competency and can apply it with supervision or some external support. Outputs resulting from the performance of the competency meet the basic standards.

1 LIMITED LEVEL: The division has minimal understanding of the competency and cannot apply the competency. It requires training or direct guidance to achieve outputs related to the performance of the competency.

SCALE ON THE LEVEL OF IMPORTANCE

1 HIGHLY ESSENTIAL: Competency is indispensable and highly critical to the job performance of the Division; the lack of it will adversely affect the quality of output/work.

2 ESSENTIAL: Competency is necessary and the lack of it will partly affect the quality of output/work.

3 LESS ESSENTIAL: Competency supports the Division in the performance of its job but the lack of it will have minimal impact on the quality of work.

4 NOT ESSENTIAL: Competency enhances the Division in the performance of its job but the lack of it bears no impact on the quality of work.
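Note that the two scales run in opposite directions: 4 is the strongest point on the Level of Competence scale, while 1 is the most essential point on the Level of Importance scale. The sketch below, which is illustrative only and not part of the official tools, shows one way to map a consolidated (possibly fractional) rating back to the nearest scale label; the nearest-point rounding rule is an assumption.

    # Illustrative mapping of consolidated ratings to the scale labels above. The scales run
    # in opposite directions (LOC 4 is strongest; LOI 1 is most essential). Rounding to the
    # nearest scale point is an assumption made for illustration.

    LOC_LABELS = {4: "Very High Level", 3: "High Level", 2: "Moderate Level", 1: "Limited Level"}
    LOI_LABELS = {1: "Highly Essential", 2: "Essential", 3: "Less Essential", 4: "Not Essential"}

    def nearest_label(rating: float, labels: dict[int, str]) -> str:
        """Return the label of the scale point closest to a consolidated rating."""
        return labels[min(labels, key=lambda point: abs(point - rating))]

    print(nearest_label(2.6, LOC_LABELS))  # -> High Level
    print(nearest_label(1.4, LOI_LABELS))  # -> Highly Essential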

Organizational Training and Development Needs Assessment for the Division

Answer Sheet

Section: ___________________________________________

For each competency number, record the section’s consensual rating for the Level of Competency (4, 3, 2 or 1) and the Level of Importance (1, 2, 3 or 4).

General Competencies Across Units (CO/DO/RO): Competencies 1-10
Service Area 1: Educational Planning (DO/RO): Competencies 11-19
Service Area 2: Learning Outcome Management (DO): Competencies 20-31
Service Area 3: Monitoring & Evaluation (DO/RO): Competencies 32-36
Service Area 4: Education Administration & Management (CO/DO/RO): Competencies 37-44

Sample Competency Rating Board for the Division Organizational TDNA
(Big-size copy paper (17” x 22”) may be used for each Board required.)

SERVICE AREA: ___________________________
COMPETENCY: ___________________________

Section | Level of COMPETENCY (each section writes in this column its perception of the current level of competency of the division) | Level of IMPORTANCE (each section writes in this column its perception of the level of importance of the competency)
Section 1
Section 2
Section 3
Section 4
Section 5
Section 6

Current competency level: Average ________
Level of Importance: Average ________
Consensus: ________

MATRIX of MANAGEMENT COMPETENCIES for the DIVISION

GENERAL COMPETENCIES ACROSS UNITS (CO/DO/RO)
1 Understanding DepED as an Organization
2 Understanding RA 9155 or the Governance of Basic Education Act
3 Management of Change
4 Organization Analysis/Diagnosis
5 Problem Solving
6 Decision-Making
7 Dealing Effectively with Pressure Groups
8 Conflict Management
9 Negotiation Skills
10 Transformational and Enabling Leadership

SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
11 Strategic Planning
12 Implementation Planning
13 Project/Program Identification
14 Resource Mobilization and Allocation
15 Financial Management and Control
16 Group Process Management
17 Facilitation Skills
18 Communication Skills
19 Advocacy

SERVICE AREA 2: LEARNING OUTCOME MANAGEMENT (DO)
20 Understanding of the Revitalized Basic Education Curriculum
21 Curriculum Review
22 Curriculum Implementation Planning (Curriculum Indigenization)
23 Instructional Materials Development
24 Instructional Supervision and Management
25 Student/Pupil Assessment/Testing
26 Intervention Programming
27 Education Programs Management/Project Management
28 Tracking Student Progress
29 Quality Management
30 Staff Development
31 Coaching and Mentoring

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
32 Monitoring and Evaluation Design and Development
33 Instrument/Tools Development for M&E Data Gathering
34 Data Processing, Analysis and Utilization
35 Communication Skills/Feedback Giving
36 Education Management Information System (EMIS)

SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO)
37 Resource Mobilization and Management
38 Resource Procurement and Management
39 Building Partnerships
40 Human Resource Management
41 Delegation
42 Physical Facilities Programming
43 Records Management
44 Understanding the Intent of the Policy and Implementation

LIST OF COMPETENCIES AND BEHAVIORAL INDICATORS

GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)

1. Understanding DepED as an Organization

Understanding the DepED vision, goals, and objectives
Being familiar with the organizational structure and individuals involved

2. Understanding RA 9155 or the Governance of Basic Education Act

Understanding the facets of the education sector decentralization law and its Implementing Rules and Regulations

3. Management of Change

Understanding the nature and phases of change
Facilitating the change process in the unit

4. Organization Analysis/Diagnosis

Assessing the entire organization – its objectives, its resources and the ways it allocates resources to attain goals

Knowledge of approaches and tools to organization analysis

5. Identifying and Solving Problems

Ability to identify problems or potential problems in a timely manner and to identify strategies or means to address them

6. Decision-Making

Efficiently arriving at conclusions on issues and problems whether internal or external, minor or major

Acting on conclusions arrived at to achieve desired results

7. Dealing Effectively with Pressure Groups

Handling various interests of stakeholders
Involves political skills and managing political interference

8. Conflict Management

Ability to identify, handle and manage differences and conflict situations
Ability to resolve conflicts

9. Negotiation Skills

Ability to influence and work with people to reach solutions that everyone can live with

10. Transformational and Enabling Leadership

Extending expertise to other regions
Providing opportunities for professional growth to staff
Institutionalizing efforts or practices found effective
Identifying and developing emerging leaders
Creating a culture of excellence to enhance peak performance
Creating an atmosphere where individual differences are celebrated and where every individual grows and maximizes his/her potential

SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
Outputs: Division/Region Education Development Plan; Consolidated Annual Plan

11. Strategic Planning

Preparing clear statements that describe a course of action in terms of identified goals and objectives. Strategic planning involves the understanding of the following:

- Current and potential future the provincial, national, and international levels
- Planning process and principles
- Department Planning timelines and requirements
- Current Department Directions
- Current regional thrusts
- Principles and processes in environmental scanning
- Processes related to collaborative educational planning

and proficiency in the following:

- Gather pertinent planning information/inventory of resources
- Analyze planning information
- Draw inferences from planning information
- Conduct a thorough analysis of the current situation
- Prioritize needs
- Identify core organizational values
- Formulate a vision
- Formulate a mission
- Perform data forecasting
- Set realistic Divisional goals to achieve the vision
- Set realistic performance targets
- Identify appropriate strategies

12. Implementation Planning

Translating goals and objectives into specific interventions e.g. programs and projects. Implementation Planning involves the understanding of the following:

- Forecasting and Trend-Setting
- Processes related to prioritization of needs
- Varied tools in plan development
- Designing Monitoring and Evaluation System
- Budget procedures of the Department
- Possible funding sources
- Budget items

and proficiency in the following:

- Prioritize Division needs
- Identify sufficient activities to achieve targets and priorities
- Identify performance measures
- Identify means of verification
- Identify resource requirements
- Identify programs and projects to achieve objectives
- Link programs/projects with resources
- Perform financial forecasting and planning
- Allocate human, material, fiscal resources

13. Project/Program Identification

Involves the generation of potential project ideas to address problems or gaps

14. Resource Mobilization and Allocation

Identifying sources of funds
Proper distribution of funds across funding requirements

15. Financial Management and Control

Understanding budgeting and financial management principles and funding processes
Understanding COA rules, regulations, and applications

16. Group Process Management

Managing group dynamics during planning activities to accomplish tasks at hand. Group process management involves the understanding of the following:

- Workshop management
- Principles of group process management
- Interpersonal styles supportive of group process
- Areas of Facilitation

and proficiency in the following:

- Generate group ideas
- Engage participants in plan development
- Manage conflict of ideas and move towards synergy
- Ask probing questions
- Summarize group outputs
- Design and manage planning workshops
- Build consensus

17. Facilitation Skills

Involves the ability to get a group of people to work together towards a common goal. It involves the following areas:

- Listening
- Observing
- Questioning
- Attending

18. Communication Skills

Ability to convey requirements needed to implement the SIP to various school stakeholders
Ability to engage stakeholders in SIP planning and implementation
Applying effective verbal and non-verbal communication methods to achieve desired results
Applying effective written communication methods to achieve desired results

19. Advocacy

Ability to generate awareness, ownership and commitment to plan preparation and implementation

SERVICE AREA 2: LEARNING OUTCOME MANAGEMENT (DO)
Outputs: Performance Standards for Managing Learning Outcomes; Findings/Analysis of Test Results and Utilization; Indigenized Curriculum; Indigenized Instructional Materials; Instructional Supervisory Plan; Programs and Projects to Address Learning Difficulties; System to Track Learning Outcomes; Staff Development Program

20. Understanding of the Revitalized Basic Education Curriculum

Understanding the philosophy and competencies covered by the current curriculum

21. Curriculum Review

Ability to understand current standard curriculum and assess it against the requirements for cultural and ethnic curriculum adaptations

Applying research methods to ascertain the effectiveness of the curriculum and to identify cultural considerations to facilitate adaptation of the curriculum

22. Curriculum Implementation Planning (Indigenized Curriculum and Instructional Materials)

Involves adapting the current curriculum to local requirements, following theories of cognitive development and learning

23. Instructional Materials Development

Understanding of the curriculum
Familiarity with theories of cognitive development and learning as they apply to the development of instructional materials

24. Instructional Supervision and Management

Familiarity with various instructional strategies and motivational techniques and their application in the classroom

Planning, implementing, organizing and monitoring an instructional supervision activity for quality assurance of instructional delivery

25. Student/Pupil Assessment/Testing

Development of tools to assess students’/pupils’ learning performance
Understanding the theories and applications in the development of testing instruments
Likewise involves the interpretation and use of assessment results to improve instruction

26. Intervention Programming

Identification and designing of appropriate interventions (e.g. programs/projects) to improve students’/pupils’ learning outcomes

27. Education Programs/Project Management

Effective and efficient management of intervention projects/programs to address gaps in learning outcomes

28. Tracking Student Progress

Development of a student/pupil achievement monitoring and reporting system to follow progress of students’/pupils’ learning outcomes. This requires the understanding of the following:

- Identifying issues related to EMIS
- Planning for information systems development
- Acquiring, developing, and installing hardware and software requirements
- Final testing of the system in the actual environment where it will be deployed
- Monitoring, evaluation, adjusting the system to the organization’s changing informational requirements
- Design and installation of system for generating, storing, maintaining and retrieving data

29. Quality Management

Development of performance standards/indicators in managing learning outcomes
Ensuring that inputs (e.g. instruction, curriculum, etc.) are at par with standards set
Following through implementation; achievement of goals
Taking corrective action

30. Staff Development

Identifying key need areas for staff development of personnel tasked with instructional delivery
Understanding key concepts and principles related to staff development
Designing and implementing programs to answer staff development needs

31. Coaching and Mentoring

Transferring acquired knowledge and skills by providing technical assistance and direct guidance to staff

Identifying and developing mechanisms or specific activities for effective transfer of knowledge and skills

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
Outputs: M&E Design; M&E Instruments and Tools; M&E Reports; Consolidated/Analyzed BEIS Data; Documentation of Success Stories and Best Practices

32. Monitoring and Evaluation Design and Development

Designing and developing the system, facilities, tools and mechanisms for monitoring and evaluation

Setting performance and quality standards for work processes and outputs

33. Instrument/Tools Development for M&E Data Gathering

Preparation of appropriate instruments and tools to gather information to satisfy major monitoring and evaluation areas

34. Data Processing, Analysis and Utilization

Processing and reviewing data gathered to make inferences that enable management to use data for decision-making

35. Communication Skills/Feedback Giving

Ability to convey the results of monitoring and evaluation to improve program/project implementation aimed at improving learning outcomes

Establish and manage a mechanism for effective feedback

36. Education Management Information System (EMIS)

Ability to convey results of monitoring and evaluation results to improve program/project implementation catering to improving learning outcomes

Establish and manage a mechanism for effective feedback

SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO)
Outputs: Resource Mobilization Plan; Procurement Plan; Human Resource Management Plan; Physical Facilities Plan; Records Management System; Regular Report Requirements

37. Resource Mobilization and Management

Ability to rally support for educational initiatives and proper/prudent allocation of resources across needs in the division/region/central office

Understanding budgeting and financial management principles and funding processes
Understanding COA rules, regulations, and applications
Monitoring of resource utilization
Providing feedback to funders on status of funds utilization and benefits derived

38. Resource Procurement and Management

Developing a system of procurement of resources, allocation of resources to specific activities e.g. learning/instructional materials, supplies, etc

Monitoring of resource utilization

39. Building Partnerships

Developing useful contacts and dealing with a broad range of people from various levels and social and economic backgrounds

Working with other institutions (GOs, NGOs, LGUs, etc) both national and international to achieve common goals

40. Human Resource Management

Ability to plan for manpower requirements based on organizational vision
Ability to attract, select, and hire competent people for organizational deployment
Ability to develop people through various forms, e.g. In-Service Training, Coaching, Mentoring
Ability to compensate, appraise and reward performance

41. Delegation

Involves the ability to entrust one’s authority to other people
A style of management which allows your staff to use and develop their skills and knowledge to their full potential

42. Physical Facilities Programming

Ability to forecast physical facilities requirements and new school requirements
Setting standard specifications catering to the requirements of schools, divisions and regions
Allocating funds, constructing facilities, rehabilitation, maintenance and repair

43. Records Management

Developing a system for managing and updating records
Maintaining an updated and functional system of storing and retrieving data on work processes and their outputs and other information relevant to decision-making

44. Understanding the Intent of the Policy and Implementation

Identifying the purpose of the policy
Ability to apply the policy as intended

Attachment 2: Division Organizational TDNA M&E Tools

M&E Matrix for the Division Organizational TDNA
T&D-M&E Form 1: Individual Profile Template
Org’l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion (FGD) Process at the Region/Division Level
Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template
Org’l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA Prioritization Template
Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA

M&E Matrix for the Division Organizational TDNA

What will be monitored: Respondents’ details in relation to their current position, their level of experience and qualification
How it will be monitored: All participants in the Organizational TDNA will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the Organizational TDNA Tool
How will the results be used: Information will be entered into the TDIS database

What will be monitored: The processes followed during the conduct of the Focus Group Discussion (FGD) at the Region/Division level
How it will be monitored: A process observer will be appointed and will use the tool
M&E tool to be used: Org’l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division Level
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: During the conduct of the FGD for the Organizational TDNA
How will the results be used: Results will be shared with the FGD facilitators to identify best practices and areas for improvement. Recommendations for improving the process will be included in the Program Completion Report to inform future processes.

What will be monitored: The level of competency and the level of importance of the Division/Region for the various management competencies across service areas
How it will be monitored: Results of the Organizational TDNA will be consolidated using the template provided
M&E tool to be used: Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template; Org’l TDNA-M&E Form 2b: Regional Organizational TDNA Scores Summary Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Following the accomplishment of the Division/Region Organizational TDNA
How will the results be used: Results will inform decisions on the training and development programs offered at the division level and will be incorporated into both the DEDP and REDP

What will be monitored: The Organizational TDNA of the functional divisions/sections/units
How it will be monitored: Results of the Organizational TDNA will be consolidated for each functional division/section/unit using the template provided
M&E tool to be used: Org’l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: Following the accomplishment of the Organizational TDNA
How will the results be used: Results will inform decisions on the training and development programs offered at the division/regional level and will be incorporated into both the DEDP and REDP

What will be monitored: The implementation of the Organizational TDNA at the Division/Region levels
How it will be monitored: A Process Observer will be identified and asked to complete the tool
M&E tool to be used: Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA
Who will be responsible for the monitoring: Division, Region TDNA-WG
When will the monitoring take place: During the conduct of the Organizational TDNA at the Division/Region level
How will the results be used: Results will be discussed with the Division/Region to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.

T&D-M&E Form 1: Individual Profile Template

I. PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):
Sex: Male / Female
Date of Birth:
Home Address:
Contact #:
e-mail address:
Region:
Division:
District:
Office/School:
Address:
Current Position:

Other Designations:

Highest Educational Attainment:

II. WORK EXPERIENCE (List from most current.)

POSITION / MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) / LEVEL (e.g. Elem/Sec/ALS school, district, division, region) / INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three years.

Training Focus / Training attended over last 3 years (✔) / Management Level of Training: Central, Region, Division, Cluster, School

Curriculum

Resource Materials Development

Planning

Management

Policy Development

Research

Other, please specify ______________

IV. SIGNIFICANT EXPERIENCES

Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Monitoring and Evaluation
Quality Assurance
Subject Specialization: _____________
Access
Education Policy Development
Education Planning
ICT
Learning Resource Materials Development
Delivery of Training
Other, please specify ________________

Certified Trainer by:
NEAP Central
NEAP-Region
TEI
SEAMEO-INNOTECH
Foreign Assisted Projects (FAP)
Other, please specify ____________

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES

Identify which of the following specific areas you consider to be your area(s) of expertise:

Competency Assessment
Program Planning
Program Designing
Resource Materials Development
Program Delivery
Program Management
Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in answer to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date:

Signature:

Please submit completed form to Training and Development Division/Unit. Information will be incorporated into the T&D Information System Database.

Org’l TDNA-M&E Form 1: Organizational TDNA Tool for Focus Group Discussion (FGD) Process at the Region/Division Level

_______ FGD Flow of Regional TDNA Self-Assessment
_______ FGD Flow of Division TDNA Self-Assessment
_______ Monitoring of Division Organizational TDNA by Regional Team

Please check (✔) under the manifested (M) column if the process was manifested and under the not manifested (NM) column if the process was not manifested. Please indicate any variations noted and include any additional comments regarding the facilitation of the session.

ACTIVITY / M / NM / Variations/Comments
13. Facilitator emphasizes to the participants that, as key respondents to this Organizational TDNA of the Region/Division, their answers and collaboration with their colleagues to reach a consensual assessment will be most helpful for the future development of the management competencies of the region.
14. Facilitator clearly presents the purpose of the Organizational TDNA.
15. A brief description of the data gathering method (FGD) was provided by the facilitator.
16. Facilitator comprehensively explained the rating scale.
17. A systematic walkthrough of the ‘Management Competencies per Service Area’, one service area at a time, was carried out.
18. Each section collaboratively reached a consensus on the level of importance and level of competency for each service area.
19. Section ratings were recorded properly.
20. Each participant provided an individual perception of the perceived importance of the competency in the performance of the region’s/division’s task/job and shared these with the group.
21. TDNA-WG members efficiently performed their assigned tasks.
22. Participants carefully deliberated on the average level of importance and competency ratings.
23. A consolidation of the Organizational TDNA results following the guidelines outlined in the FGD flow was accomplished.
24. An M&E committee was tasked to monitor and evaluate the preparation, conduct and consolidation of the TDNA results.
Total

Name and Signature of the Process Observer: ___________________________________ Date: ____________________________

Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template

DIVISION ___________________

For each competency, record the raw scores (Self-A and Reg.) and the weighted ratings (Self-A 60%, Reg 40%, Total) for both the Level of Competency (LOC) and the Level of Importance (LOI).

GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)
1 Understanding DepED as an Organization
2 Understanding RA 9155 or the Governance of Basic Education Act
3 Management of Change
4 Organization Analysis/Diagnosis
5 Identifying and Solving Problems
6 Decision-Making
7 Dealing Effectively with Pressure Groups
8 Conflict Management
9 Negotiation Skills
10 Transformational and Enabling Leadership

SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
11 Strategic Planning
12 Implementation Planning
13 Project/Program Identification
14 Resource Mobilization and Allocation
15 Financial Management and Control
16 Group Process Management
17 Facilitation Skills
18 Communication Skills
19 Advocacy

SERVICE AREA 2: LEARNING OUTCOME MANAGEMENT (DO)
20 Understanding of the Revitalized Basic Education Curriculum
21 Curriculum Review
22 Curriculum Implementation Planning (Indigenized Curriculum and Instructional Materials)
23 Instructional Materials Development
24 Instructional Supervision and Management
25 Student/Pupil Assessment/Testing
26 Intervention Programming
27 Education Programs/Project Management
28 Tracking Student Progress
29 Quality Management
30 Staff Development
31 Coaching and Mentoring

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
32 Monitoring and Evaluation Design and Development
33 Instrument/Tools Development for M&E Data Gathering
34 Data Processing, Analysis and Utilization
35 Communication Skills/Feedback Giving
36 Education Management Information System (EMIS)

SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO)
37 Resource Mobilization and Management
38 Resource Procurement and Management
39 Building Partnerships
40 Human Resource Management
41 Delegation
42 Physical Facilities Programming
43 Records Management
44 Understanding the Intent of the Policy and Implementation

Org’l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template

(To be accomplished by the TDNA-WG)

LEVEL OF PLAN: REGION / DIVISION          DATE Accomplished: ______________________

Supply the following data: 1) name of functional divisions/sections/units, and 2) numerical rating of LOI and LOC for each service area of each division/section/unit.

Competencies / Service Areas (LOI and LOC recorded for each functional division/section/unit):
General Competencies
Service Area 1: Educational Planning
Service Area 2: Learning Outcome Management
Service Area 3: Monitoring & Evaluation
Service Area 4: Education Administration & Management
Service Area 5: Policy Formulation and Standard Setting
Service Area 6: Curriculum Development
Priority 1
Priority 2
Priority 3

Org’l TDNA-M&E Form 4: Organizational TDNA Division Consolidation Template

(To be accomplished by the Regional TDNA-WG)

REGION ___________________________          DATE Accomplished: ______________________

NOTE: For Regions with more than seven (7) Schools Divisions, additional columns may be added. Only the general average of each service area is entered.

Competencies / Service Areas (LOI and LOC recorded for each Division within the Region, Division 1 to Division 7):
General Competencies
Service Area 1: Educational Planning
Service Area 2: Learning Outcome Management
Service Area 3: Monitoring & Evaluation
Service Area 4: Education Administration & Management
Service Area 5: Policy Formulation and Standard Setting
Service Area 6: Curriculum Development
Priority 1
Priority 2
Priority 3

Org’l TDNA-M&E Form 5: Documentation Review of Organizational TDNA Region/Division level

This form is to be used to support Regional monitoring of the Organizational TDNA processes at the Division level. It is expected that the assessment will be based on observations, discussions with the implementing team and review of relevant documents.

Division/Region _________________________ Date: __________________

Rating Guide:
Numerical Rating / Interpretation / Description
4 / Very High Extent / In a very significant way
3 / High Extent / In a meaningful way
2 / Low Extent / In a limited way only
1 / Very Low Extent / Not in any meaningful way

Use the scale above to assess the extent to which the conduct of Organizational TDNA documentation adhered to the following:

To what extent …… (rate 1, 2, 3 or 4)
1. was thorough planning conducted prior to administration?
2. was the purpose of the Organizational TDNA explained?
3. was the data collection method to be followed for administering the Organizational TDNA explained, e.g. group consensual assessment technique, self-assessment and an external assessment?
4. were participants oriented to the Organizational Management Competencies for each service area?
5. was a clear explanation provided on how to accomplish the Organizational TDNA process, e.g. consensus agreement within each division/unit regarding level of competence and level of importance, agreement across divisions/units, scoring system?
6. were the steps involved in consolidating the results for individual divisions/units as well as the overall region/division explained?
7. were the steps involved in consolidating the self-assessment and the external assessment explained?
8. was an explanation provided on how to interpret results to identify priority training needs?
9. was technical assistance provided when required?
10. were the M&E tools and processes implemented?
11. was there evidence of teamwork and collaboration amongst the Organizational TDNA implementers?
12. were recommendations for improving the Organizational TDNA administration processes identified?

Recommendations:

Name: ___________________________________

Position: _________________________________

Date: ____________________________________

Acknowledgements

to

Region VI                          Region VII                         Region VIII
Violenda Gonzales, AO-V            Milagros Villanueva, ES-II         Alejandra Lagumbay, P-II
Editha Segubre, ES-II              Flordeliza Sambrano, ES-II         Rita Dimakiling, ES-II
Gabriel Pintor, P-III              Churchita Villarin, ES-II          Adelma Rabuya, PSDS

Belen Zanoria, ES-I Ma. Lita Veloso, P-I

Grecia Bataluna, ES-I Jovena Amac, HT-III

Nimfa Bongo, P-III

Negros Occidental                  Bohol/Tagbilaran                   Northern Samar
Marsette Sabbaluca, ES-I           Debra Sabuero, P-I                 Nimfa Graciano, ES-I
Michell Acoyong, ES-I              John Ariel Lagura, P-I             Cristito Eco, P-III
Zorahayda Albayda, P-III           Lilibeth Laroga, P-I               Imelda Valenzuela, P-III
Nelson Bedaure, P-II               Ma. Lileth Calacat, P-I            Carlos Balanquit, PSDS
Regie Sama, P-II                   Helconida Bualat, P-1              Nedy Tingzon, P-I
Thelma Pedrosa, P-II               Rosanna Villaver, P-I              Noe Hermosilla, P-I
Eulalia Gargaritano, HT-IV         Remigio Arana, MT-I
Susan Severino, HT-IV

DepED- EDPITAF T&D Coordinator

Jonathan F. Batenga

Project STRIVE T&D Technical Advisers

Louise A. Quinn, International Technical Adviser
Twila G. Punsalan, National Technical Adviser

The Project STRIVE 2 Training and Development Component Members who developed the standards, processes and tools of the TDNA System Operations Manual, Volume 2
