
Naval exercise and experimentation support framework

Kurt Salchert
CAE Inc.

Prepared By: CAE Inc., 1135 Innovation Drive, Ottawa, ON K2K 3G7
Contractor's Document Number: 5902-002 Version 01
Contract Project Manager: Peter Avis, 613-247-0342
PWGSC Contract Number: W7714-083663/001/SV TASK 199
Technical Authority: Dr. P. Dobias and C. Eisler

Disclaimer: The scientific or technical validity of this Contract Report is entirely the responsibility of the Contractor and the contents do not necessarily have the approval or endorsement of the Department of National Defence of Canada.

Contract Report

DRDC-RDDC-2016-C095

March 2016


This S&T document is provided for convenience of reference only. Her Majesty the Queen in right of Canada, as represented by the Minister of National Defence ("Canada"), makes no representations or warranties, expressed or implied, of any kind whatsoever, and assumes no liability for the accuracy, reliability, completeness, currency or usefulness of any information, product, process or material included in this document. Nothing in this document should be interpreted as an endorsement for the specific use of any tool, technique or process examined in it. Any reliance on, or use of, any information, product, process or material included in this document is at the sole risk of the person so using it or relying on it. Canada does not assume any liability in respect of any damages or losses arising out of or in connection with the use of, or reliance on, any information, product, process or material included in this document.

© Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2016

© Sa Majesté la Reine (en droit du Canada), telle que représentée par le ministre de la Défense nationale, 2016


CAE Inc. 1135 Innovation Drive

Ottawa, Ont., K2K 3G7 Canada Tel: 613-247-0342

Fax: 613-271-0963

NAVAL EXERCISE AND EXPERIMENTATION SUPPORT FRAMEWORK

CONTRACT #: W7714-083663/001/SV TASK 199

FOR

MARPAC OPERATIONAL RESEARCH TEAM

(DR. P. DOBIAS & C. EISLER)

PO Box 17000 STN Forces, Victoria, BC, V9A 7N2

31 March 2016

Document No. 5902-002 Version 01

© Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2016

© Sa Majesté la Reine (en droit du Canada), telle que représentée par le ministre de la Défense nationale, 2016


A P P R O V A L S H E E T

Document No.: 5902-002 Version 01
Document Name: Naval Exercise and Experimentation Support Framework

Primary Author: Kurt Salchert, Senior Consultant
Reviewer: Tab Lamoureux, Senior Consultant
Approval: Peter Avis, Senior Consultant


R E V I S I O N H I S T O R Y

Revision     Reason for Change                                       Origin Date
Draft A      Initial document issued for comment                     18 March 2016
Version 01   Final report issued, incorporating customer comments    31 March 2016


T A B L E O F C O N T E N T S

1 INTRODUCTION AND BACKGROUND
1.1 Objectives of this Work
1.2 Organization of This Report
2 MILITARY EXERCISE DESIGN AND MEASUREMENT FRAMEWORK
2.1 Exercise Overview
2.1.1 Exercise Methods
2.1.2 Exercise Cycle
2.2 Fundamentals of Exercise Design, Development and Delivery
2.3 Phase 1 – Initiate
2.3.1 Phase 1 ORT Inputs and Outputs
2.4 Phase 2 – Conceive
2.4.1 Phase 2 ORT Inputs and Outputs
2.5 Phase 3 – Design
2.5.1 Determine Mission Essential Tasks
2.5.2 Building on the Mission Essential Tasks
2.5.3 Phase 3 ORT Inputs and Outputs
2.6 Phase 4 – Plan
2.6.1 Resource Planning Considerations
2.6.2 Scenario Planning Considerations
2.6.3 Master Scenario and Events List Planning Considerations
2.6.4 Modelling and Simulation Considerations
2.6.5 Phase 4 ORT Inputs and Outputs
2.7 Phase 5 – Conduct
2.7.1 Data Collection
2.7.2 Phase 5 ORT Inputs and Outputs
2.8 Phase 6 – Assess
2.8.1 Phase 6 ORT Inputs and Outputs
2.9 Sample Exercise and Training Objectives – Examples
2.9.1 Evaluation on the Conduct of a Task Group Exercise
2.9.2 Evaluation of a Concept of Operations during a TGEX
2.9.3 Validation of Concept of Operations of Tactical Elements during a TGEX
2.9.4 Validation of Concept of Employment of Combat Systems during Live Trials
2.9.5 Evaluation/Assessment of the Conduct of a Table-Top Exercise (TTX)
2.9.6 Validation of a Concept of Operations (including Multi-Agency) during a Table-Top Exercise
2.9.7 Assessment of Command-Post Exercises
2.10 Most Common Objectives, Topics and Measures


3 FORMULATING RESEARCH HYPOTHESES IN MILITARY EXPERIMENTS
3.1 Defence Experimentation
3.2 Experiment Types
3.3 Experiment Cycle
3.4 Hypothesis Formulation
3.5 Measurement Planning Considerations
3.5.1 Defining the Concept to be Measured
3.5.2 Establish the Importance of Measurement within the Event
3.5.3 Establish the Context of the Event
3.5.4 Establish the Timing of the Measurement Activity
3.6 Template Hypothesis Formulation by Activity Type
3.6.1 Comparison of Two or More Combat Systems/Naval Platforms
3.6.2 Evaluation of Operational Plans
3.6.3 Evaluation of Tactical Plans
3.6.4 Development and Assessment of a Concept of Operations
3.6.5 Development and Assessment of a Concept of Employment for a Particular Capability
3.6.6 Impact of an Increase or Decrease in a Particular Capability
3.6.7 Identification of Requirements Sets for a Particular Operation or a Type of Operations
3.6.8 Identification of Gaps in a Particular Capability Set
4 COMMON CONSIDERATIONS FOR EXERCISES AND EXPERIMENTS
4.1 Constraints in the C2 Measurement Event Type
4.2 Independent Variable
4.3 Dependent Variable
4.4 Measures of Performance and Effectiveness
4.4.1 Measures of Effectiveness
4.4.2 Measures of Performance
4.5 Measurement Approaches
4.5.1 Subjective Measures
4.5.2 Data Recording (Objective Measures)
4.6 Reliability and Validity
4.7 Direct versus Indirect Measures
4.8 Online vs. Offline Measures
4.9 Operational Performance vs. Human Performance Measures
4.9.1 Operational Performance Measures
4.9.2 Human Performance Measures
5 CONCLUSIONS AND RECOMMENDATIONS
6 REFERENCES
APPENDIX A GLOSSARY


L I S T O F F I G U R E S

Figure 2-1: Exercise Cycles
Figure 2-2: Exercise Planning Process
Figure 2-3: Organization of Conditions for U.S. Joint and Naval Tasks
Figure 2-4: MSEL Summary Template
Figure 2-5: MSEL Inject Template
Figure 3-1: Iterative Cycle for Experimentation
Figure 3-2: GUIDEx Experimentation Cycle
Figure 3-3: GUIDEx Hypothesis Formulation
Figure 3-4: GUIDEx Hypothesis Formulation (Alternate)
Figure 5-1: Synchronization between MARPAC ORT and the RCN
Figure 5-2: Flow of MARPAC ORT Activities in Support of an Exercise


L I S T O F T A B L E S

Table 2-1: 2.9.1 Training Objective 1 Measures
Table 2-2: 2.9.1 Training Objective 2 Measures
Table 2-3: 2.9.1 Training Objective 3 Measures
Table 2-4: 2.9.2 Training Objective 1 Measures
Table 2-5: 2.9.2 Training Objective 2 Measures
Table 2-6: 2.9.2 Training Objective 3 Measures
Table 2-7: 2.9.3 Training Objective 1 Measures
Table 2-8: 2.9.3 Training Objective 2 Measures
Table 2-9: 2.9.3 Training Objective 3 Measures
Table 2-10: 2.9.4 Training Objective 1 Measures
Table 2-11: 2.9.4 Training Objective 2 Measures
Table 2-12: 2.9.4 Training Objective 3 Measures
Table 2-13: 2.9.5 Training Objective 1 Measures
Table 2-14: 2.9.5 Training Objective 2 Measures
Table 2-15: 2.9.5 Training Objective 3 Measures
Table 2-16: 2.9.6 Training Objective 1 Measures
Table 2-17: 2.9.6 Training Objective 2 Measures
Table 2-18: 2.9.6 Training Objective 3 Measures
Table 2-19: 2.9.7 Training Objective 1 Measures
Table 2-20: 2.9.7 Training Objective 2 Measures
Table 2-21: 2.9.7 Training Objective 3 Measures
Table 2-22: Common Measurements Described in Sample Objectives
Table 4-1: Constraints Mapped to Different Event Types
Table 4-2: Operational Performance Measures


E X E C U T I V E S U M M A R Y

The Department of National Defence and the Canadian Armed Forces face pressures to achieve their objectives efficiently and effectively. The Maritime Forces Pacific (MARPAC) Operational Research Team supports MARPAC and Joint Task Force (Pacific) (JTFP) naval exercises, as well as table-top exercises and wargames. Currently, each exercise is treated as a separate entity, and each time an exercise support requirement arises, a new data collection and analysis plan is developed. As there are many commonalities in the design, planning and delivery approach to both exercises and experiments, it was determined that it would be desirable to develop a recommended exercise support and experimentation framework for the navy that could be adjusted to fit any particular exercise or wargame requirement.

The Naval Exercise and Experimentation Support Framework presented in this report is intended to improve exercise and experiment planning efficiency and effectiveness. It provides decision-makers and planners involved in the direction, oversight, planning, delivery, and analysis of exercises or experiments with an understanding of the exercise and experiment “cycles,” as well as the key inputs, processes and outputs required to conceive, design and develop combined training exercises and operational research experiments. Its value results from the collaboration of the two communities (Operator and Scientist) to engender full, mutual understanding and participation. Details regarding the actual conduct, data collection, analysis and reporting following an exercise or experiment are beyond the scope of this particular project and should be considered in future work.

The resulting Support Framework for design and planning is a requirements-based, iterative methodology for aligning MARPAC/JTFP exercises and experiments with defence priorities, assigned missions and tasks. It is consistent with direction and guidance from the Chief of the Defence Staff, Commander Canadian Joint Operations Command, Commander Royal Canadian Navy, Canadian Maritime Component Commanders, and Fleet Commanders Atlantic and Pacific. The application of this Framework could directly support the achievement of training, readiness and capability development objectives promulgated in the Chief of Defence Staff Force Posture and Readiness Directive (DND CDS FP&R, 2013), Commander Royal Canadian Navy’s Strategic Direction and Guidance (DND CRCN SD&G, 2015), the Royal Canadian Navy’s Readiness and Sustainment Policy (DND CFCD 129, 2015), and the Royal Canadian Navy Future Naval Training System Strategy (DND FNTS, 2015), as well as the RCN Ten Year Fleet Plan, the Five Year Operational Schedule, and the Fleet Schedules for the two coastal formations.

This Framework has been tailored to exercises and experiments conducted at the Regional Joint Task Force, Naval Task Group and individual ship or sub-unit level. However, it has been structured in such a way that it can be flexibly applied across a variety of exercises and experiments at the strategic, operational and tactical levels involving a wide range of multinational, joint, service, interagency and non-governmental stakeholders. Alternatively, when viewed through a training and readiness lens, this Framework will assist decision-makers and planners at all levels to design and plan well-conceived exercises which measure performance relative to required readiness standards and under defined conditions or variables of the operational environment. When viewed through an operational research lens, this Framework supports the design and planning of single experiments as well as campaigns of experimentation aimed at expanding scientific knowledge, promoting innovation and capability development.

This Framework draws on best practices in both exercise and experiment design and planning used by the Canadian Armed Forces, NATO and other close allies, as well as emergency management exercise methodologies used by Canadian federal and provincial emergency management organizations, the U.S. Department of Homeland Security, and the Australian Emergency Management Institute. It is expected that this framework will foster greater understanding, collaboration, and synchronisation between the Operational Research team and their Canadian Armed Forces clients.

The following paragraphs and figures, extracted from the main body of this report, provide an overview of military exercises and experimentation design and planning considerations.

An exercise is a military manoeuvre or simulated operation involving planning, preparation and execution. Exercises assist commanders and decision-makers at all levels to define the required levels of task performance, determine the current level of performance, conduct training to improve performance, and assess levels of task performance against measurable standards under defined conditions.

While terminology varies by organization, the exercise planning process follows a repeatable cycle starting with initiation by senior leadership and follows an iterative process of design and planning, followed by exercise delivery, analysis and reporting. This cycle generally repeats itself as the conclusions and recommendations of the last cycle are analysed and incorporated into the next exercise cycle as part of a larger process of continual improvement within an organization. The key phases of exercise planning are as follows:

Phase 1 – Initiate: Initiation is the foundation for the design and conduct of an exercise and should be a commander-led process to determine the training needs of his/her forces based on assigned missions and readiness postures;

Phase 2 – Conceive: The conceive phase includes Mission Analysis, during which high-level initiating direction is analyzed to establish the fundamental requirements for the exercise, including the concept, form, scope, setting, aim, exercise objectives, lessons learned priorities, force requirements, political implications, analysis arrangements and costs;

Phase 3 – Design: The design phase examines the high-level requirements for the exercise as well as the exercise aim and objectives and provides further guidance to the exercise planners. Specific and measurable training objectives based on Mission Essential Tasks (MET) are developed during this phase;


Phase 4 – Plan: The plan phase develops the detailed plan for how the exercise will be conducted, together with all the necessary administrative instructions. The exercise aims and training objectives are developed into a storyline within which themes, events, incidents and injects stimulate the training audience to perform the necessary tasks under given conditions, evaluated against measurable standards;

Phase 5 – Conduct: The conduct phase delivers the actual training events to evaluate the performance of the training audience relative to specified training objectives, tasks, standards and conditions. Throughout the conduct of the exercise, data is collected to support the analysis, assessments, or evaluations as specified in the exercise plan; and

Phase 6 – Assess: The purpose of the assessment phase is to determine whether the training audience was able to perform at the level required to meet the task standard, and which missions the organization is trained to accomplish. The outcome of this activity is a formal report designed to capture the effects of the exercise and provide recommendations on improvements for both the players and the process of designing and conducting future exercises. Lessons identified are tracked to ensure changes in behavior that result in improved capability.

Defence experimentation is the process of controlled research, testing and evaluation to discover information or test a hypothesis to establish cause and effect. The experiment process follows a similar repeatable cycle starting with initiation and follows an iterative process of definition and development, followed by experiment conduct, analysis and reporting. This cycle repeats itself as the conclusions and recommendations of the last cycle are analysed and incorporated into the next cycle as part of a larger campaign of experimentation.

Evident throughout this study is the finding that no exercise or experiment should be undertaken without sufficient preparation, and without senior leader/sponsor engagement to provide advice on, and buy-in to, exercise and experiment design. In the early stages of design, an analysis should establish what qualitative and quantitative opportunities exist to measure performance outcomes and outputs. The study found that while the Canadian Joint Operations Command, the U.S. Department of Defense, and the U.S. Department of Homeland Security have refined lists of tasks, conditions and standards from which to design, plan and deliver both exercises and experiments, the guidance for the RCN lacks the same level of detail, particularly in terms of performance standards, measures and criteria at the Task Group level.

This study proposes a six-phase planning process that is illustrated in Figure Ex.1 below and described more fully in Section 2 of the main body of this report. It should be noted that the methodology described in this Framework is based on complex exercises. For smaller, less complex exercises, or those that are limited in scope and resources, some of the activities described may not be required or relevant to the particular exercise or experiment.


Figure Ex.1: The six-phase exercise planning process


Of particular relevance to the MARPAC ORT is the manner in which they should organise their work to complement the processes of the RCN. These processes proceed from the macro level (i.e., guidance from the Government of Canada and CJOC) to the micro level (i.e., activities concerned with planning specific exercise and experiment events). The synchronisation of activities between the MARPAC ORT and the RCN is presented in the flow chart below (Figure Ex.2).

Figure Ex.2: Flow of MARPAC ORT Activities in Support of an Exercise


1 INTRODUCTION AND BACKGROUND

The development of this Naval Exercise and Experimentation Framework (the Framework) is timely in that it directly supports the achievement of recently promulgated high-level doctrine, strategy and guidance, including that contained in the Chief of Defence Staff's Directive on Force Posture and Readiness (DND CDS FP&R, 2013), the Commander Royal Canadian Navy's Strategic Direction and Guidance (DND CRCN SD&G, 2015), the Royal Canadian Navy's Readiness and Sustainment Policy (DND CFCD 129, 2015), the Royal Canadian Navy Future Naval Training System Strategy (DND FNTS, 2015), and the Naval Order 3771 (DND RCN, 2014) series related to science and technology, modelling and simulation, and concept development and experimentation. Central to this high-level direction and guidance is the requirement for the Royal Canadian Navy (RCN) to leverage leading-edge technologies and techniques in order to generate a highly interoperable naval force capable of delivering operational excellence within complex joint, interagency and multinational environments across the spectrum of conflict.

The Department of National Defence (DND) and the Canadian Armed Forces (CAF) face pressure to achieve their objectives efficiently and effectively. The Maritime Forces Pacific (MARPAC) Operational Research Team (ORT) supports MARPAC and Joint Task Force (Pacific) (JTFP) in the conduct of naval exercises, as well as table-top exercises and wargames. Currently, each exercise is treated as a separate entity, and each time an exercise support requirement arises, a new data collection and analysis plan is developed. However, there are many commonalities in the various support approaches. Hence, it was determined that it would be desirable to develop an exercise support and experimentation framework for the RCN that could be adjusted to fit any particular exercise or wargame requirements.

The proposed Framework has been designed to be flexible enough to support various types of live, simulated and table-top exercises and experiments; yet, it is specific enough so that it can be readily employed with minimum additional adjustments. Its value results from the collaboration of the two communities (Operator and Scientist) to engender full, mutual understanding and participation.

To support this work, MARPAC ORT contracted CAE Inc. through the standing offer held with the Centre for Operational Research and Analysis (CORA), contract number W7714-083663. The Technical Authorities (TAs) for this work are Dr. Peter Dobias and Ms. Cheryl Eisler.

1.1 Objectives of this Work

The objectives of this work are two-fold: to develop a framework that will enable efficient and effective support to naval and table-top exercises, and to develop a framework that will enable efficient and effective support of live experiments and table-top (and computer-assisted) wargames.


Based on these two high level objectives, the following sub-objectives were identified:

Define exercises and experiments as well as their respective design, development and delivery cycles;

Identify and define the general elements for designing and developing appropriate exercise objectives;

Identify and define sample objectives for activities and measures and indicators of performance for activities including evaluation of the conduct of Task Group Exercises; validation of concepts of operations; validation of concepts of employment of combat systems during live trials; evaluation/assessment of the conduct of table-top exercises and command-post exercises; and

Identify best practices in formulating research hypotheses in military experiments and develop template hypothesis formulations for activities including comparison of two or more combat systems/naval platforms; evaluation of operational and tactical plans; development and assessment of concepts of operations and concepts of employment; evaluation of the impact of an increase or decrease in a particular capability; identification of requirements sets for a particular operation or a type of operation; and identification of gaps in a particular capability set.

The deliverable for this work is this report.

1.2 Organization of This Report

This report is divided into the following sections:

Section 1: Provides the project background and objectives for this work;

Section 2: Provides an overview of exercises, their cycles and the fundamental building blocks of exercise design and development. This section identifies best practices (Canadian and Allied) in the development of appropriate objectives including a set of samples of objectives as well as a measurement framework for live exercises, trials and table-top exercises;

Section 3: Provides an overview of defence experiments, their cycles and the fundamental building blocks of problem definition, experiment design and development. This section identifies best practices in formulating research hypotheses in military experiments, including live experiments and table-top (and computer-assisted) wargames;

Section 4: Describes common considerations concerning measurement that should be made irrespective of whether support is being afforded to exercises or experiments;


Section 5: Draws conclusions from the results and suggests priorities for future activities to build upon this work; and

Section 6: Lists the references that were used to guide the project work.

A glossary of terms is provided at the end of this report in Appendix A.


2 MILITARY EXERCISE DESIGN AND MEASUREMENT FRAMEWORK

2.1 Exercise Overview

The CAF defines an exercise as “a military manoeuvre or simulated wartime operation involving planning, preparation, and execution. It is carried out for the purpose of training and evaluation. It may be a combined, joint, or single service exercise, depending on participating organizations.” (Defence Terminology Bank - DTB).

Exercises assist commanders and decision-makers at all levels to define the required levels of task performance, determine the current level of performance, conduct training to improve performance, and assess levels of task performance against measurable standards under defined conditions. The Canadian Army Simulation Centre Exercise Design, Development and Delivery Guide (DND CASC, 2014) and the U.S. Chairman of the Joint Chiefs of Staff Joint Training Manual (CJCS, 2015) provide excellent overviews of the relationship between tasks, conditions and standards. Exercises should focus on preparing for operations and are used in a variety of ways including, but not limited to:

Learning, maintaining, reinvigorating, and/or further developing critical knowledge and military skill sets;

Developing and improving decision-making capability;

Demonstrating the preparedness of forces to perform assigned roles and tasks in accordance with high-level direction;

Improving the capacity and capability of the CAF to conduct domestic and expeditionary operations effectively in a coalition, combined, inter-agency, and/or joint environment;

Refining and validating doctrine and procedures;

Supporting course of action (COA) development, wargaming, mission rehearsals, and rehearsal of concept (ROC) drills for potential Force Employment (FE);

Measuring performance and evaluating the delivery of military capability against assigned tasks, under set conditions and against established standards;

Reporting on general readiness or as part of a mission-specific readiness evaluation;

Identifying cause and effect relationships regarding priorities, constraints and restraints, thus enabling Commanders to better direct and manage resources; and

Exposing potential Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, Facilities and Interoperability (DOTMLPFI) capability gaps that can be translated into specific requirements to guide capability development processes and acquisition programs; and

Demonstrating, integrating and/or evaluating new and developing technologies through experimentation to adapt to emerging technologies, concepts and threats.

2.1.1 Exercise Methods

The NATO Collective Training and Exercise Directive (Bi-SC 75-3, 2013) and the U.S. Joint Training Manual provide extensive details on various types of exercises. The following are examples of commonly used exercises:

Command Post Exercises (CPX): An exercise in which the forces are simulated, involving the commander, his staff and communications within and between headquarters. These exercises are also known as Functional Exercises in civilian emergency management settings.

Field Training Exercises (FTX): An exercise conducted in the field under simulated war or emergency conditions in which units participate with troops and equipment. Often referred to as a Live Exercise (LIVEX) or Full Scale Exercise.

Table-Top Exercises (TTX): An exercise in which hypothetical scenarios are discussed in an informal setting and generally used to assess the adequacy of strategy, policy, plans, procedures, training, resources, relationships and interdependencies. Also known as a seminar, symposium or facilitated discussion.

Computer Assisted Exercise (CAX): A synthetic exercise where electronic means are used to simulate scenarios, operational environments, plans and procedures.

Case Study: Case studies promote critical thinking and discussion on complex situations with wide-ranging variables where there may be no one clear-cut solution but many alternatives. This method of training encourages exploration of complex issues and enables the training audience to apply new knowledge and skills.

Wargame: A disciplined process of action-reaction-counteraction with rules and steps that attempt to visualize the flow of the operation, given the friendly and the adversary's capabilities, strengths, weaknesses and force dispositions, as well as other situational and environmental considerations. Wargaming can be fairly rudimentary, with simple maps and charts, or it may involve complex, computer-aided modelling and simulation to observe moves and counter-moves. Wargames can also be "one-sided" concept development assessment games; seminar wargames/scenario discussions; or follow the vignette, task, requirement and option (VITRO) methodology (Dooley and Gauthier, 2013).

2.1.2 Exercise Cycle

While terminology varies by organization, the exercise planning process follows a repeatable cycle starting with initiation by senior leadership and follows an iterative process of design and planning, followed by exercise delivery, analysis and reporting. This cycle generally repeats itself as the conclusions and recommendations of the last cycle are analysed and incorporated into the next exercise cycle as part of a larger process of continual improvement within an organization. Figure 2-1 provides a visual representation of a variety of exercise cycles from across the defence and emergency management communities.


Figure 2-1: Exercise Cycles


2.2 Fundamentals of Exercise Design, Development and Delivery

Based on the examples of different exercise cycles illustrated in Figure 2-1, it is evident that there is no single best practice applied to the design, development and delivery of exercises. The most comprehensive description of exercise planning techniques is contained in the U.S. Joint Training Manual. However, for the sake of consistency in this Framework, the six-phase process illustrated in Figure 2-2 will be used. This process, developed specifically for this project, maps to the CJOC Guide to Designing, Conducting, Assessing, and Reporting on an Exercise (DND CJOC, 2013) methodology (i.e., the fourth cycle in Figure 2-1). Each phase in the CJOC method is described briefly below with reference to the exercise development process, before being described in greater detail in the subsequent sections. The detailed sections describe the manner in which the ORT can provide advice and subject matter expertise to the lead planners, how ORT efforts can be integrated with the development process, and what the ORT's major tasks will be during each phase. It should be noted that the methodology described in this Framework is based on complex exercises. For smaller, less complex exercises, or those which are limited in scope and resources, some of the activities described may not be required or relevant to the particular exercise.

Phase 1 – Initiate: Initiation is the foundation for the design and conduct of an exercise and should be a commander-led process to determine the training needs of his/her forces based on assigned missions and readiness postures;

Phase 2 – Conceive: The conceive phase includes Mission Analysis, during which high-level initiating direction is analyzed to establish the fundamental requirements for the exercise, including the concept, form, scope, setting, aim, exercise objectives, lessons learned priorities, force requirements, political implications, analysis arrangements and costs;

Phase 3 – Design: The design phase examines the high-level requirements for the exercise as well as the exercise aim and objectives and provides further guidance to the exercise planners. Specific and measurable training objectives based on Mission Essential Tasks (MET) are developed during this phase;

Phase 4 – Plan: The plan phase develops the detailed plan for how the exercise will be conducted, together with all the necessary administrative instructions. The exercise aims and training objectives are developed into a storyline within which themes, events, incidents and injects stimulate the training audience to perform the necessary tasks under given conditions, evaluated against measurable standards (an illustrative inject structure is sketched after Figure 2-2);

Phase 5 – Conduct: The conduct phase delivers the actual training events to evaluate the performance of the training audience relative to specified training objectives, tasks, standards and conditions. Throughout the conduct of the exercise, data is collected to support the analysis, assessments, or evaluations as specified in the exercise plan; and


Phase 6 – Assess: The purpose of the assessment phase is to determine whether the training audience was able to perform at the level required to meet the task standard, and which missions the organization is trained to accomplish. The outcome of this activity is a formal report designed to capture the effects of the exercise and provide recommendations on improvements for both the players and the process of designing and conducting future exercises. Lessons identified are tracked to ensure changes in behavior that result in improved capability.
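The phase structure above lends itself to a compact machine-readable representation, which can be useful when building planning checklists or tracking tools. The following Python sketch is purely illustrative: the phase names follow the six-phase process described above, but the output labels are simplified assumptions distilled from the phase descriptions, not an official schema.

```python
from enum import Enum


class Phase(Enum):
    """Six phases of the exercise planning process (Section 2.2)."""
    INITIATE = 1  # Commander-led determination of training needs
    CONCEIVE = 2  # Mission Analysis; produces the EXSPEC
    DESIGN = 3    # Measurable training objectives; produces the EXDIR
    PLAN = 4      # Detailed plan, storyline and injects
    CONDUCT = 5   # Deliver training events and collect data
    ASSESS = 6    # Evaluate against standards; formal report


# Simplified principal outputs per phase (illustrative assumption,
# distilled from the phase descriptions above).
PHASE_OUTPUTS = {
    Phase.INITIATE: ["initiating direction"],
    Phase.CONCEIVE: ["Exercise Specification (EXSPEC)"],
    Phase.DESIGN:   ["training objectives", "Exercise Directive (EXDIR)"],
    Phase.PLAN:     ["exercise plan", "administrative instructions"],
    Phase.CONDUCT:  ["collected data"],
    Phase.ASSESS:   ["formal report", "lessons identified"],
}

# The cycle is iterative: lessons identified in ASSESS feed the next
# cycle's INITIATE as part of continual improvement.
for phase in Phase:
    print(f"Phase {phase.value} - {phase.name}: {', '.join(PHASE_OUTPUTS[phase])}")
```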


Figure 2-2: Exercise Planning Process
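As a concrete illustration of the Phase 4 storyline decomposition referenced above, the sketch below models a single MSEL inject tying an event to the task it is meant to stimulate and the training objective it supports. The field names and example values are hypothetical assumptions for illustration; the actual MSEL summary and inject templates appear in Figures 2-4 and 2-5 (Section 2.6.3).

```python
from dataclasses import dataclass


@dataclass
class MselInject:
    """One scripted stimulus in a Master Scenario and Events List.

    Field names are illustrative assumptions, not the fields of the
    official MSEL templates shown in Figures 2-4 and 2-5.
    """
    serial: int              # Delivery order within the MSEL
    exercise_time: str       # When the inject is delivered, e.g. "D2 1405Z"
    delivery_means: str      # e.g. role-player call, simulated sensor track
    description: str         # Event or incident presented to the players
    expected_action: str     # Task the inject should stimulate
    training_objective: str  # Training objective the action supports


# Hypothetical example inject.
inject = MselInject(
    serial=12,
    exercise_time="D2 1405Z",
    delivery_means="simulated radar track",
    description="Unidentified high-speed contact closing the task group",
    expected_action="Detect, classify and report the contact",
    training_objective="Maintain the recognized maritime picture",
)
print(inject)
```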


2.3 Phase 1 – Initiate

Initiation is the foundation for the design and conduct of an exercise and should be a commander-led process to determine the training needs of his/her forces based on assigned missions and readiness postures.

Key inputs to the initiate process include high-level national military strategy, Government of Canada defence planning guidance and treaty obligations, existing operational plans (OPLANs) and contingency plans (CONPLANs), allied, joint and service-level doctrine, Chief of Defence Staff and Commander’s Guidance, lessons learned and after action review observations, issues and best practices.

At the highest level, this direction comes in the form of the Chief of Defence Staff Force Posture and Readiness Directive, which provides strategic military guidance based on policy and direction from the Government of Canada to the CAF as it relates to the relative importance of potential Force Employment (FE) operations; national policy objectives and the six core missions contained in the Canada First Defence Strategy (Government of Canada, 2008); and the measures of performance used to evaluate delivery of military capability against assigned tasks. CDS direction may also come in the form of an Initiating Directive, Tasking Order, or Annual Collective Training and Exercise Guidance to Force Generators and Force Employers. Direction could also come as a result of collaborative decisions taken by the Assistant Deputy Minister Emergency Management Committee (ADM EMC) which annually schedules and prioritizes the interagency National Exercise Program (NEP).

Other high-level direction may also come in the form of guidance from the Force Employer to the Force Generators. As the Joint Training Authority (JTA) for the CAF, Commander CJOC provides direction to the RCN under the Joint Managed Readiness Programme (JMRP), which serves to integrate exercises, experiments, and capability development to achieve the CDS priorities. Direction may also come from the Commander to Formation or Subordinate Commanders (i.e., Commander CJOC to Regional JTFs and/or the Maritime Component Commander). This direction may come in the form of standing guidance or as part of the annual integrated business planning process. High-level multinational exercise guidance comes from sources such as the annual North American Aerospace Defense Command and U.S. Northern Command (NORAD-USNORTHCOM) Joint Training Plan and invitations to participate in NATO, Rim of the Pacific (RIMPAC) and other international exercises.

2.3.1 Phase 1 ORT Inputs and Outputs

At this phase in the exercise development cycle, the ORT will be a recipient of the direction distributed through the chain of command. This information will be combined by the ORT with their own understanding of their priority areas for support to the RCN to form an initial plan to provide support to RCN exercises. This initial plan should be shared with the appropriate Maritime Component Commander's staff to ensure that they are aware of the plan and can begin to accommodate the ORT, both as an additional stakeholder in the exercise and as an additional source of evaluation, lessons learned, and analysis effort. Subsequently, the ORT representative should be invited to planning conferences for identified exercises. The ORT should expect these invitations and should make enquiries if they are not received.

2.4 Phase 2 – Conceive

The conceive phase includes Mission Analysis and a Concept Development Conference (CDC), during which high-level direction is analyzed to establish the fundamental requirements for the exercise, including the concept, form, scope, setting, aim, exercise objectives, lessons learned priorities, force requirements, political implications, analysis arrangements and costs. Some discussion regarding training objectives may occur during the CDC; however, detailed discussions would normally take place later, in the design phase.

The CDC is usually chaired by a representative of the Officer Scheduling the Exercise (OSE) or the Exercise Director. Participation in the CDC should be held to a strict minimum in order to keep the conference focused on the high-level aim, exercise objectives and key planning milestones. Chapter 4 of the CJOC Guide to Designing, Conducting, Assessing, and Reporting on an Exercise contains detailed work breakdown structures and deliverables, including workflow, CDC agenda, briefing, and Exercise Specification (EXSPEC) templates.

Experimentation during exercises is not specifically addressed in the CJOC Guide to Designing, Conducting, Assessing, and Reporting on an Exercise; however, both the U.S. Joint Training Manual and the NATO Education, Training, Exercise and Evaluation Directive (Bi-SC 75-2, 2013) note that early engagement between exercise planners and operational research communities is essential to ensure that experiment integration requirements and limitations are well understood in order to prevent conflicts between exercise aim, exercise objectives, training objectives, and experimental requirements.

The major output of the conceive phase is the EXSPEC signed by the OSE.

2.4.1 Phase 2 ORT Inputs and Outputs

If the ORT is going to be involved in an exercise, a representative should be present at the corresponding CDCs. Their involvement in the CDC should include assisting in developing the exercise objectives and providing preliminary estimates of the likely level of effort required, as well as of the scenarios and manipulations that will assist in meeting the objectives. This information will assist the OSE in making cost-benefit decisions and determining the final form of the exercise. Linked to the objectives will be evaluation and lessons learned priorities.

The ORT should not consider the CDC their last or only opportunity to contribute to the development of the exercise. Rather, the CDC results in the ORT’s ‘warning order’ that will describe the main effort of the exercise and the planning milestones that must be met by the ORT. This should begin a process of working backward in order for the ORT to understand what it needs to do, and whom it needs to consult or involve, in order to meet the planning milestones with sufficient time to obtain support and buy-in from the other stakeholders. Of particular importance is to begin planning the submission to the Human Research Ethics Committee (HREC), if HREC approval will be required for the exercise data collection plan. Since the HREC only meets once a month, the milestone for submission of the ethics protocol should also be identified with sufficient time to address HREC review comments that may be required before approval. Further detail concerning the HREC submission is provided in Section 2.5.3.

Regardless of whether or not experimentation will occur during an exercise, the ORT should be involved prior to and during the CDC to aid in developing the exercise aims and high-level objectives and, if experimentation is planned, to support the development of research questions and potential hypotheses.

2.5 Phase 3 – Design

The design phase examines the high-level requirements for the exercise as well as the exercise aim and objectives approved in the EXSPEC, and provides further guidance to the exercise planners. The key outcomes of the design phase are specific training objectives which are defined in the Exercise Directive (EXDIR).

The key activity of the design phase is the Initial Planning Conference (IPC), convened on the direction of the OSE and conducted by a representative of the Officer Conducting the Exercise (OCE), most likely the Exercise Director. The purpose of the IPC is to explore options for achieving the exercise aim and objectives. A calling letter will invite the exercise planning team and participants to review the EXSPEC and share each other’s training requirements with a view to developing specific training objectives that meet the needs of both the exercise sponsor and the participants. Key topics of discussion include exercise scope, constraints, restraints and assumptions; exercise aim and objectives; locations; identification of primary and secondary training audiences; exercise control staff and observers; the exercise architecture; and budget and resourcing. Chapter 5 of the CJOC Guide to Designing, Conducting, Assessing, and Reporting on an Exercise contains detailed work breakdown structures and templates for the conduct of the IPC and the development of the EXDIR.

The following sub-sections of the design phase describe in more detail the methodology for developing training objectives. As illustrated in Figure 2-2, training objectives are derived from commander-approved Mission Essential Tasks, which are selected from Allied, Joint and/or Service-level Joint Task Lists which serve as a “Task Library.” Training Objectives consist of the specific performance requirements expected of the training audience (tasks), which are performed under controlled variables of the operational training environment (conditions), and which are measured against a scale of performance (standards).
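
To make the task-conditions-standards structure concrete, the following minimal sketch shows one way a training objective could be represented for analysis purposes. It is illustrative only: the class names and the example values are assumptions made for this framework discussion, not part of any CAF system.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Standard:
        """A standard pairs a measure with a criterion (the acceptable level)."""
        measure: str    # basis for describing varying levels of task performance
        criterion: str  # acceptable level of performance, e.g. "<= 6 hours"

    @dataclass
    class TrainingObjective:
        """A task performed under conditions and evaluated against standards."""
        task: str                                  # drawn from a Task List / MET
        conditions: Dict[str, str] = field(default_factory=dict)
        standards: List[Standard] = field(default_factory=list)

    # Hypothetical example using the UNTL task discussed in Section 2.5.2.2
    objective = TrainingObjective(
        task="Move Naval Tactical Forces",
        conditions={"Sea (C 1.2)": "Littoral",
                    "Threat Naval Force Size (C 2.9)": "Moderate"},
        standards=[Standard(measure="Time to reposition the Task Group",
                            criterion="<= 6 hours")],
    )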

2.5.1 Determine Mission Essential Tasks

The development of Mission Essential Tasks (METs) is a commander-led process that establishes the most essential mission capability requirements expected of his/her forces. METs are derived from authoritative Task Lists such as the Canadian Joint Task List (DND CJOC, 2016), the RCN Naval Task List (DND CFCD 129, 2015), the NATO Task List (Bi-SC 80-90), the U.S. Universal Joint Task List (CJCS UJTL, 2016), the Universal Naval Task List (USN, USMC, USCG UNTL, 2007), and the Department of Homeland Security Universal Task List (DHS UTL, 2005). In whatever form they are defined, Task Lists are living documents that should be reviewed regularly and updated to incorporate changes in strategy, doctrine, missions, and capabilities.

The reason that Task Lists, and more importantly METs, are critical to exercise design and real-world operations is that they delineate the capabilities and specify the types of missions that can be performed by a military force. In order to properly design an exercise, the planning staff should know what the force can do, under what conditions, and to what standards. When communicating with higher and lower commands, other services, and multinational partners, the use of doctrinally based terms and the types of tasks described in Task Lists and METs is particularly useful for facilitating assessments of risk and mission readiness, and for communicating plans and orders.

The following five sub-sections describe the most relevant and the most well-developed of the various task lists. Note that the most detail is provided in the RCN Naval Task List (for Naval exercises) and the Canadian Joint Task List (for Joint exercises to which the RCN is contributing). Any ORT member should understand that they might need to follow the task lists to their lowest level in order to understand what the RCN will undertake to pursue a mission objective and, thereby, devise exercise interventions to test the range of factors implicated in the objective and develop suitable measures to evaluate performance.

2.5.1.1 RCN Naval Task List

The RCN Naval Task List is contained in CFCD 129 RCN Readiness and Sustainment Policy and provides the linkage between the Naval Task List and various Readiness States assigned to HMC Ships. Unlike the CAF JTL, the RCN Naval Task List has been grouped under five Capability Programs (Command, Conduct Operations, Mobility, Protect Forces, and Sustain Forces) with supporting capability streams listed in CFCD 102(L) RCN Combat Readiness Training Requirements.

Command Program: Divided into two Capability Streams:

Command, Control, Communication and Computers (C4) Capability Stream: Refers to the ability to command, control, and communicate, and generally includes the use of computers in doing so. It includes joint, combined, and maritime C4 at Task Force, Task Group, and unit levels. It includes the ability to exchange information via secure and non-secure communication paths.

Intelligence, Surveillance and Reconnaissance (ISR) Capability Stream: Refers to the ability to determine, receive, collect, and report information and intelligence requirements. It includes coordinating and contributing to tactical and operational pictures.

Conduct Operations Program: Consists of six Capability Streams:

Anti-Surface Warfare (ASuW) Capability Stream: Refers to that portion of maritime warfare in which operations are conducted to destroy or neutralize enemy naval surface forces.

Anti-Air Warfare (AAW) Capability Stream: Measures taken to defend a maritime force against attacks by airborne weapons launched from aircraft, ships, submarines and land-based sites. It includes Force level AAW, and Air Defence (AD).

Anti-Submarine Warfare (ASW) Capability Stream: Operations conducted with the intention of denying the enemy the effective use of their submarines. ASW does not include other types of underwater warfare, such as mine warfare.

Interdiction Capability Stream: Refers to boarding and approach operations, including seizing vessels and support to Other Government Departments (OGD).

Search and Rescue (SAR) Capability Stream: Refers to various types of assistance-oriented activities, including rendering assistance to recreational vessels, refugee assistance, or mass casualty evacuation.

Air Operations Capability Stream: Refers to those activities involved with maintaining the ability to conduct air operations. Air deck evolutions are therefore included, but a helicopter sortie, such as an ASW air sortie, for example, would not be; an ASW air sortie would contribute directly to the ASW Capability Stream.

Mobility Program: Consisting of five Capability Streams:

Force Mobility Capability Stream: Refers to all those elements that contribute to the movement of maritime units. Bridgemanship, engineering capabilities, and manoeuvring are examples of serial types that contribute to force mobility.

Special Operations (SOF) Capability Stream: This is an example where no training readiness serials exist in support of the stream. It is nevertheless necessary to capture the stream, as personnel and materiel readiness do contribute to the overall readiness captured through the CAF reporting system.

Naval Control and Guidance of Shipping (NCAGS) Capability Stream: This includes the means to manage the interaction between military forces and commercial shipping. Training readiness in this stream is only maintained at the TG staff level, and within the NCAGS community.

Mine Warfare (MW) Capability Stream: This includes mapping the sea floor, conducting sea bed intervention, and MW operations.

Seamanship Capability Stream: This includes seamanship-centric operations, such as Replenishment at Sea (RAS).

Protect Forces Program: Consisting of three Capability Streams:

Harbour Defence (HD) Capability Stream: Refers strictly to the elements of harbour defence practiced by Port Security Units, and not to other force protection operations conducted while ships are in harbours.

Chemical, Biological, Radiological and Nuclear (CBRN) Capability Stream: This stream includes core unit- and individual-level, as well as mission-specific, CBRN capabilities.

Force Protection (FP) Capability Stream: This includes all elements of unit and individual protection, including defence against some aspects of asymmetric warfare and stealth capabilities.

Sustain Forces Program: Consisting of two Capability Streams:

Specialist Support Capability Stream: This includes the ability to conduct various types of specialist support. There are no training readiness serials in support of this stream, since most of the skills required by specialists are generated within their own training and readiness regimes.

Survivability Capability Stream: Survivability refers to the ability to recover from damage or threats in order to return to the requisite float, move, and fight capabilities, and certain aspects of the protection of individuals. It includes damage control, battle damage repair, and man-overboard rescue capabilities.

Unlike other task lists, in particular the combined United States Navy (USN), United States Marine Corps (USMC), and United States Coast Guard (USCG) task lists, the standards and conditions against which the RCN tasks are to be evaluated have not been well defined in CFCD 129. While CFCD 102(L) and the RCN’s Maritime Command Sea Training Guide (DND RCN, 2013) contain basic standards and conditions, these are focussed on the individual or small-team level. Normally, the authority to assess whether a designated training or readiness standard has been successfully met is delegated to the applicable senior officer or supervisor, but success is ultimately determined by the Commanding Officer’s assessment. CFCD 102(L) does include a section on Task Group-level readiness and training requirements; however, the standards and conditions against which the tasks are to be evaluated lack well-defined quantitative or qualitative performance metrics and criteria against which performance is to be measured, and success is ultimately determined by the Fleet Commander.

2.5.1.2 Canadian Joint Task List

The Canadian Joint Task List (CJTL) is an operational-level task list and is maintained on the CJOC J7 portal on the Defence Wide Area Network (DWAN) at http://collaboration-cjoc-coic.forces.mil.ca/sites/JTL. The CJTL was initially designed for force development in support of expeditionary operations and included tasks at the strategic, operational and tactical levels. CAF doctrine, both existing and newly published, was used to add new tasks or as a source for modifications to existing tasks. On the appointment of the Commander CJOC as the CAF Joint Training Authority, the CJOC JTL was transformed into a more generic CAF JTL that describes all the activities that should be performed by the staff of various levels of Joint Headquarters during the conduct of their staff functions. The CAF Joint Tasks (JT) have been grouped under five CAF Operational Functions as follows:

1. CAF JT 1 Command: The assignment of tasks to subordinate commanders so as to achieve accomplishment of the assigned mission.

2. CAF JT 2 Sense: The provision of the commander with knowledge. This function includes the determination, collection and processing of information and intelligence products.

3. CAF JT 3 Act: The integration of maneuver, firepower, and information operations to achieve desired effects. This function covers the application of allocated joint force packages to achieve the military objectives of assigned missions.

4. CAF JT 4 Shield: The protection of a force, its capabilities, its information, and its freedom of action from conventional and asymmetric threats and from the operational environment.

5. CAF JT 5 Sustain: The ability of a military force to maintain its operational capability for the duration required to achieve its objectives.

Each Operational Function is composed of a number of Joint Tasks and Sub-Tasks that follow the same general format of a task title followed by a task description.

2.5.1.3 USN, USCG and USMC Universal Naval Task Lists

The U.S. Universal Naval Task List (UNTL) is a comprehensive single source document that combines the Navy Tactical Task List (NTTL) and the Marine Corps Task List (MCTL). As applied to joint training and readiness reporting, the UNTL provides a common language that commanders can use to document their command warfighting requirements as Mission Essential Tasks (METs) and use the same lexicon and task hierarchy as the U.S. Universal Joint Task List. A particular strength of the UNTL is that it clearly lays out the tasks, conditions and guidelines for performance measurement.

2.5.1.4 NATO Task List

Bi-SC 80-90, NATO Task List (NTL) Directive provides a common terminology and reference system for NATO’s Strategic Commanders, their subordinate commanders and agencies, operational planners, and for training and exercise planners to communicate mission requirements. Unfortunately, because the NATO Task List is classified, it was not analysed for this Framework.

2.5.1.5 U.S. Department of Homeland Security (DHS) Universal Task List

The DHS Universal Task List (DHS, 2005) provides a common language and reference system for users from multiple jurisdictions, disciplines and levels of government as well as the private sector in supporting homeland security exercise programs. It is also applied to a wide range of other functions including doctrine development, personnel requirements, logistics support analysis, interagency and inter-jurisdictional coordination and organizational development.

The DHS Target Capabilities List (DHS, 2007) complements the DHS Universal Task List by identifying specific capabilities related to the five homeland security mission areas of prevention, protection, mitigation, response, and recovery. The Target Capabilities List defines and establishes national guidance for preparing for major all-hazards events, such as those defined by the 15 U.S. National Planning Scenarios. The National Planning Scenarios are representative of major events including terrorism, natural disasters and other emergencies. Since no single jurisdiction or agency is expected to perform every task identified in the DHS Universal Task List and no two jurisdictions require the same level of capabilities, the task-tailored Target Capabilities List was developed.

The strength of the DHS approach is that each element in the Target Capabilities List has a well-defined list of preparedness and performance activities, tasks, measures and metrics, as well as planning assumptions.

2.5.2 Building on the Mission Essential Tasks

As discussed in the sections above, the development of the MET is a commander-led process that establishes the most essential mission capability requirements expected of his/her forces. Because exercises are limited in scope, time and resources, it would be impossible to design, plan and deliver an exercise that stimulates and measures the response of the training audience to every conceivable MET. Therefore, Commanders should assess current capabilities against required capabilities defined by METs to identify and prioritize specific training objectives within the scope of a particular exercise. In order for a MET to be used as a training objective and evaluation tool, the task needs to be performed under a set of conditions or variables of the operating environment and measured against some pre-defined standard.

The following sub-sections describe how conditions and standards are developed in exercise design to complement the tasks that were developed from the analysis of the Task List and commander-approved METs.

2.5.2.1 Determine Conditions

Having analyzed and selected specific tasks from the commander-approved METs which will be exercised during training, the next step is to determine the conditions under which training will be conducted. Conditions are variables of the environment that affect the performance of tasks. They are generally categorized as conditions of the physical environment (e.g., weather, sea state, terrain), the military environment (e.g., threat, command structure and relationships, force capabilities), and the civil environment (e.g., political, social, cultural, economic factors). Conditions are catalogued in detail in Appendix A to the U.S. Universal Naval Task List; Figure 2-3 shows a high-level overview of the conditions contained in the U.S. Joint and Universal Naval Task Lists.

Figure 2-3: Organization of Conditions for U.S. Joint and Naval Tasks

2.5.2.2 Relationship Between Tasks and Conditions

The following example of determining conditions illustrates the relationship between Tasks and Conditions. Note that this example is not exhaustive and merely shows some conditions which may act upon a single task. This example is drawn from the USN Universal Naval Task List.

Task: Move Naval Tactical Forces

Task Description: To move naval units and/or organizations and their systems from one position to another in order to gain a position of advantage or avoid a position of disadvantage with respect to an enemy.

Conditions of the Physical Environment (C 1.2 Sea): Those factors associated with the continuous salt-water ocean system to include oceans, seas, gulfs, inlets, bays, sounds, straits, channels, and rivers. Condition descriptors include:

Open (open ocean, blue water beyond 5 NM of land);

Littoral (coastal, within 5 NM of land areas); and

Riverine (inland from the littoral terrain to include rivers, canals, delta areas connected to landlocked waters).

These conditions can be further broken down as required to evaluate the training audience. For example, if the naval force is transiting in the littoral environment, it may be desirable to add additional conditions such as various sea states, ocean features, salinity, acoustic conditions and water depth.

Conditions of the Military Environment (C 2.9 Threat Naval Force Size): The relative size of naval forces of the potential aggressor to friendly naval forces. Condition descriptors include:

Overwhelming (significantly more enemy than friendly naval forces);

Large (somewhat more enemy than friendly naval forces);

Moderate (comparable level of enemy to friendly naval forces); and

Low (less enemy than friendly naval forces).

Conditions of the Civil Environment (C 3.1.2.1 Major Power Involvement): The major power interests about a region or military operation and the ability and willingness of a particular major power to act on those interests. Condition descriptors include:

Active (major power has interests and may be willing to act);

Limited (major power has interests but is not inclined to act); and

No (lack of major power interest).

These conditions can be further decomposed based on threat size, disposition, posture, and axis, or used to impose limitations such as political constraints on action, limits to host-nation support, or cultural impediments to collaboration.
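
If the ORT wishes to record the conditions under which each serial was performed, the condition descriptors above lend themselves to simple enumerations. The sketch below is an assumed encoding for data collection purposes; the descriptor names mirror the UNTL examples quoted above, but the code itself is not part of the UNTL.

    from enum import Enum

    class SeaCondition(Enum):            # C 1.2 Sea
        OPEN = "open ocean, beyond 5 NM of land"
        LITTORAL = "coastal, within 5 NM of land"
        RIVERINE = "inland waters connected to landlocked waters"

    class ThreatNavalForceSize(Enum):    # C 2.9 Threat Naval Force Size
        OVERWHELMING = 4
        LARGE = 3
        MODERATE = 2
        LOW = 1

    class MajorPowerInvolvement(Enum):   # C 3.1.2.1 Major Power Involvement
        ACTIVE = "has interests and may be willing to act"
        LIMITED = "has interests but is not inclined to act"
        NO = "lack of major power interest"

    # Conditions recorded for one hypothetical run of "Move Naval Tactical Forces"
    run_conditions = {
        "sea": SeaCondition.LITTORAL,
        "threat_size": ThreatNavalForceSize.MODERATE,
        "major_power": MajorPowerInvolvement.LIMITED,
    }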

2.5.2.3 Determine Standards, Measures and Criteria

Having selected specific training objectives based on the commander-approved METs and determined the conditions under which training will be conducted, the next step is to determine the proficiency levels required to perform a particular task. These standards are established by commanders and consist of measures and criteria, where a measure provides the basis for describing varying levels of task performance and a criterion defines the acceptable levels of performance. A criterion is often expressed as a minimum acceptable standard, and the combination of the measure and the criterion comprises the standard for a task.

CFCD 124, RCN Maritime Operational Test and Evaluation Guide (DND CFMWC, 2013) provides the following useful definitions and examples of measures:

Measures of Effectiveness (MOE): MOE quantify the degree to which a system or process accomplishes specified missions, tasks or effects in an operational environment. MOE must be testable, which means there must be a practical means of collecting and analyzing the necessary data, and there must be a correspondence between changes in the value of the MOE and changes in the achievement of aims, objectives or effects. Sample MOEs include mission success, fratricide/blue on white, casualties (numbers, friend/enemy ratio, initial/final strength ratio), time to accomplish mission, or size of area occupied, denied, controlled, under surveillance, cleared, etc.:

Example MOE: If the task of an air defence missile is to kill threat vehicles, an appropriate MOE might be the probability of kill (P(k)). Data would be collected on the number of attempted kills and the number of kills.
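
As a back-of-envelope illustration of this MOE, the sketch below computes P(k) from hypothetical engagement counts; the counts and the function name are invented for illustration.

    def probability_of_kill(kills: int, attempts: int) -> float:
        """MOE example: P(k) = assessed kills / attempted kills."""
        if attempts == 0:
            raise ValueError("no engagement attempts recorded")
        return kills / attempts

    # Hypothetical exercise data: 12 engagement attempts, 9 assessed kills
    print(f"P(k) = {probability_of_kill(9, 12):.2f}")  # P(k) = 0.75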

Measures of Performance (MOP): MOP are sub-elements of MOE that quantify the degree to which a system or process contributes to the accomplishment of specified missions, tasks or effects. MOP are measures of a system’s performance expressed as speed, payload, range, time-on-station, frequency, detections, lethality, or other distinctly quantifiable performance features.

Example MOP: If the task is to communicate timely force status reports from a TG Commander to his superior HQ, data would have to be gathered to determine the interval between the time when relevant information becomes available to the commander and the time it was used. It is dependent on parameters such as system availability, communications availability, transmission time and processing by staff.
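
A minimal sketch of how that timeliness MOP might be computed from timestamped records, assuming the data collectors log when information became available and when the report reached the superior HQ; the timestamps and the 15-minute criterion are invented for illustration.

    from datetime import datetime
    from statistics import mean

    # (time information became available, time report reached superior HQ)
    report_times = [
        (datetime(2016, 3, 1, 10, 0), datetime(2016, 3, 1, 10, 12)),
        (datetime(2016, 3, 1, 14, 30), datetime(2016, 3, 1, 14, 41)),
    ]

    latencies_min = [(received - available).total_seconds() / 60
                     for available, received in report_times]
    print(f"Mean report latency: {mean(latencies_min):.1f} min")  # 11.5 min
    print(all(lat <= 15 for lat in latencies_min))                # True: criterion met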

Metrics. A metric is a standard of measurement. Percent, time, distance, depth, altitude, temperature, salinity, velocity, rate of turn, radar cross section, infra-red signature, and even “yes/no” are examples of various types of metrics.

The key outcomes of the design phase are specific training objectives defined as METs. In order for a MET to be used as a training objective and evaluation tool, the task needs to be performed under a set of conditions or variables of the operating environment and measured against a standard.

2.5.3 Phase 3 ORT Inputs and Outputs

It is apparent that Phase 3 represents a considerable amount of work for the ORT. In particular, the reframing of the exercise objectives into the RCN Naval Task List Capability Streams, the subsequent selection of specific tasks, the selection of conditions to apply to the capabilities and tasks and, finally, the development of MOPs and MOEs will require significant activity and iteration.

Progress during this phase can be slow, as stakeholders wait for decisions to be made by other stakeholders before making their own. It is critical that the ORT seek involvement from the OSE and OCE and their representatives. Much of the ORT activity during this phase will involve seeking clarification and additional information from those who are involved in developing the scenario (refer to Section 2.6). Because of this integration, it may make sense for the ORT to collaborate with the OSE and OCE staffs and potentially participate in the Operations Planning Process (OPP) being carried out by the staff.

The ORT plan should give explicit consideration to the availability of data with which to carry out their analysis. In many exercises, it may prove difficult to obtain objective data, and the use of questionnaires or interviews may lead to fatigue on the part of exercise participants. The ORT should ensure their plan clearly describes the relationship between their MOP/MOE and the objectives, and how the interpretation of MOP/MOE will change should different conditions occur. The ORT should also plan how they will document their data so that it facilitates analysis with any specialty software or network systems (to handle designation or classification of information) to be used.
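
One way to make that traceability explicit is a simple matrix mapping each exercise objective to its MOEs, MOPs and data sources, which can then be reviewed with the exercise staff. The entries below are invented placeholders, not drawn from any actual exercise.

    # Hypothetical traceability matrix for an ORT data collection plan
    collection_plan = {
        "Objective 1: Conduct ASW in littoral waters": {
            "MOE": ["Probability of detection before simulated torpedo release"],
            "MOP": ["Time from initial sensor contact to classification"],
            "data sources": ["combat system logs", "controller observations",
                             "post-serial interviews"],
        },
    }

    for objective, plan in collection_plan.items():
        print(objective)
        for key, items in plan.items():
            print(f"  {key}: {', '.join(items)}")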

If required, the data collection plan will inform a part of the HREC submission. Typically the HREC will require a review of all data collection protocols and instruments that involve any interaction with human subjects, such as questionnaires or interview questions. However, an expedited review may be possible for research on human subjects that would not reasonably be expected to cause any physical or psychological distress. As noted previously, the HREC meets once a month, although an expedited review can be accommodated throughout the month according to the schedules of the HREC members.

Based on the Tri-Council Policy Statement (2010) of the Medical Research Council of Canada, the Natural Sciences and Engineering Council of Canada and the Social Sciences and Humanities Research Council of Canada, the HREC considers the following ethical principles:

Respect for human dignity;

Respect for free and informed consent;

Respect for vulnerable persons;

Respect for privacy and confidentiality;

Respect for justice and inclusiveness;

Balancing harms and benefits;

Minimizing harm; and,

Maximizing benefits.

For further information concerning research on human subjects, the reader should consult the Guidelines for Human Subject Participation in Research Projects (2002). The reader may also complete an ethical research course provided by the Tri-Council at https://tcps2core.ca/welcome (noting that any ORT researcher participating in the study and interacting with the human subjects must complete the course and hold this certification before the HREC will approve the submission), as well as consult the Government of Canada’s Panel on Research Ethics (www.pre.ethics.gc.ca).

NOTE: In practice, Phase 3 and Phase 4 may overlap to a significant degree, with planning only possible as the design matures, and design only possible as the plan solidifies.

The reader may refer to Section 4 for further information about the selection of measures by which to evaluate achievement of an exercise objective and acceptance or rejection of an experimental hypothesis.

2.6 Phase 4 – Plan

The plan phase develops the details of how the exercise will be conducted, who the key participants are, how evaluation will be undertaken, and all the administrative instructions necessary to conduct the exercise. This phase includes the conduct of the Main Planning Conference (MPC), the Scenario Writing Board, the Final Planning Conference (FPC) and the Master Scenario and Events List (MSEL) Synchronization Conference. The aim of these conferences and boards is to build the scenario and exercise operating environment from which events and injects can be developed to stimulate the training audience and to achieve the exercise aim and objectives.

The Exercise Director plays a key role throughout the planning phase by overseeing the development and coordination of the exercise including ensuring the availability of training areas and facilities, training support systems and services, modelling and simulation, observers, controllers, evaluators, data collection planning and all other resources to support the exercise.

2.6.1 Resource Planning Considerations

The availability of resources is a key early planning consideration. Areas to consider include:

The amount of time available to complete the training;

Support personnel required;

Modelling and simulation requirements;

Observer-controller support;

Higher Command (HICON), Lower Command (LOCON), Flanks and Opposing Force (OPFOR) requirements; and

Available training areas and facilities.

The amount and availability of resources can limit the size or number of live training events (for example, availability of targets, range facilities and area clearances for live fire exercises), requiring planners to consider a mix of virtual and constructive simulation exercises. Several key decisions need to be made early in the planning:

What are the time parameters of the exercise?

Will the exercise be a TTX, CPX, live, virtual or constructive wargame or a combination?

Will the scenario be based on an existing operational environment (OE) or does a new scenario and virtual environment need to be developed?

2.6.2 Scenario Planning Considerations

The CAF’s increased capability to integrate virtual and constructive simulations with live training, and the increased need for joint, interagency, multi-national and multi-echelon training, require exercise planners to develop a detailed and consistent scenario and exercise operating environment (OE) for each training event. Defining the scenario and OE provides top-down coherence and continuity to the exercise and allows interaction between all levels of the training audience. Scenario and OE development should include an examination of the following eight operational variables that stimulate the training audience. The following is a brief description of each Political, Military, Economic, Social, Information, Infrastructure, Physical Environment, and Time (PMESII-PT) variable, along with an example of a question a commander might need to have answered about each variable:

Political: Describes the distribution of responsibility and power at all levels of governance and formally constituted authorities, as well as informal or covert political powers. (e.g., Who is the leader of the town and does he exercise control over the security of the port?)

Military: Explores the military and/or paramilitary capabilities of all relevant actors (enemy, friendly, and neutral) in a given OE. (e.g., Does the enemy have access to shore-based cruise missiles and mines?)

Economic: Includes individual and group behaviors related to producing, distributing, and consuming resources. (e.g., Does the port have a high unemployment rate that makes it easy for the enemy to get dock-workers to perform illicit tasks?)

Social: Describes the cultural, religious, and ethnic makeup within an OE and the beliefs, values, customs, and behaviors of society members. (e.g., Who are the influential people in the port such as criminal bosses or corrupt officials?)

Information: Describes the nature, scope, characteristics, and effects of individuals, organizations, and systems that collect, process, disseminate, or act on information. (e.g., Does the opposing force have a signals intercept capability?)

Infrastructure: Is composed of the basic facilities, services, and installations needed for the functioning of a community or society. (e.g., Does the port have potable water and stable power?)

Physical Environment: Includes the geography and man-made structures as well as the climate and weather in the area of operations. (e.g., Do the oceanographic conditions and bottom topography favour enemy submarine operations?)

Time: Describes the timing and duration of activities, events, or conditions within an OE, as well as how the timing and duration are perceived by various actors in the OE. (e.g., At what times are fishermen likely to congest the approaches to the port or conduct activities that provide a cover for hostile operations?)

Together, the scenario and OE provide the consolidated storyline that ensures training and exercise objectives are accounted for in the exercise.

2.6.3 Master Scenario and Events List Planning Considerations

The exercise scenario is a key element of the background materials required to guide the preparation of exercise events and the Master Scenario and Events List (MSEL). In a NATO context, the MSEL may be referred to as the Main Events List/Main Incidents List (MEL/MIL). Using the scenario as the setting, the MSEL builds storylines designed to stimulate certain decisions and activities in the training audience based on the exercise and training objectives. The MSEL covers all events and injects provided to the training audience from the start of the exercise (STARTEX) through to the end of the exercise (ENDEX).

A detailed and accurate MSEL database and the supporting events and injects are the foundation of an effective and successful exercise. Writers should be sourced from those organizations that will be represented in Exercise Control (EXCON) and include writers representing Higher Control (HICON), Lower Control (LOCON) and Flanking Control (FLANKCON). Evaluation teams and the ORT, if part of the exercise, should assist in the development of the MSEL and request specific incidents and/or injects designed to support their evaluation/experimentation requirements.

Figure 2-4 from the Department of Homeland Security Exercise and Evaluation Program (DHS HSEEP, 2013) shows an example of a high-level MSEL summary template.

Figure 2-4: MSEL Summary Template

Figure 2-5 from the HSEEP shows an example of the detail required for a single MSEL inject. Individual injects sequenced within the overall exercise scenario or storyline are carefully managed and controlled to stimulate the training audience to perform the necessary tasks under given conditions, evaluated against measurable standards.
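
To illustrate how injects are sequenced against the exercise clock, the sketch below represents MSEL entries as simple records and replays them chronologically. The field names loosely follow the HSEEP-style templates in Figures 2-4 and 2-5, but the exact schema and all values here are assumptions for illustration.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Inject:
        number: int
        scenario_time: datetime   # when EXCON delivers the inject
        sender: str               # simulated originator (e.g., HICON, LOCON)
        recipient: str            # training audience element
        message: str              # inject content delivered to the players
        expected_action: str      # response the inject is designed to stimulate

    msel = [
        Inject(2, datetime(2016, 3, 2, 9, 30), "LOCON", "TG Commander",
               "Fishing vessel reports a possible mine sighting",
               "Initiate MW response"),
        Inject(1, datetime(2016, 3, 2, 8, 0), "HICON", "TG Commander",
               "Intelligence report of submarine activity in the littorals",
               "Adjust ASW posture"),
    ]

    # EXCON tracks exercise play chronologically against the MSEL
    for inject in sorted(msel, key=lambda i: i.scenario_time):
        print(inject.number, inject.scenario_time, inject.expected_action)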

Figure 2-5: MSEL Inject Template

2.6.4 Modelling and Simulation Considerations

Modelling and Simulation (M&S) plays an increasingly important role in such fields as military training, readiness evaluation, mission rehearsal, concept development, research and development, experimentation, operational test and evaluation, and defence acquisition processes. The planning phase should consider the M&S requirements and resources necessary to stimulate the expected responses from the training audience during the exercise. The 2015 DND/CAF Modelling and Simulation Roadmap (Draft Version 5.6), the Canadian Army Experimentation Centre’s Experimentation Guide, and the NATO Bi-SC 75-2 Education, Training, Exercise and Evaluation Directive provide a good overview of Canadian and Allied considerations regarding the use of M&S techniques.

In its most basic sense, a model is a representation of the properties and/or behaviour of some entity such as a ship, aircraft, vehicle, person or an area of air, sea or land. In some applications, these entities may appear as realistic-looking images; however, models may or may not have a visual depiction depending on the nature of the exercise or experiment. A simulation is the process of replicating activities over time while applying the model to determine how it responds to varying inputs or operating conditions against a controlled baseline. M&S is ideal in a training setting when human error with real-world equipment such as weapons and live ammunition could have disastrous consequences. Simulations are usually categorized as one of the following three types:

1. Constructive simulations. Involve simulated situations and entities with no real person interaction (e.g., fully computerized war games).

2. Virtual simulations. Involve real people interacting with simulated equipment operating in simulated situations (e.g., human-in-the-loop ship’s bridge and operations room, weapon, vehicle or flight simulators).

3. Live simulations. Involve real people using real equipment in representative operational conditions (e.g., TG Exercises, Work Ups or major joint and international exercises such as RIMPAC and JOINTEX).

During the plan phase, M&S for a particular exercise or experimentation within an exercise will be determined by such factors as performance requirements, the ability to control variables, cost and resources.

The major output of the plan phase is the Exercise Plan (EXPLAN), signed by the Exercise Director, which includes the approved training objectives and all of the necessary information for all of the participants (training audience, controllers and observers) to conduct and report on the exercise, as well as the integration of M&S and experimentation into the exercise. The plan phase also produces various instructions including, but not limited to, Participant Handbooks, Controller-Evaluator Handbooks, the Data Collection Plan, and guidance on the After Action Report, Post-Exercise Report and Lessons Learned Report.

One of the final steps in exercise planning is integration and operations testing, which occurs just prior to the exercise. Event tests confirm that all simulations, databases, and connectivity are operating correctly, including the communications infrastructure between exercise sites. Final preparations should include a complete end-to-end test of the M&S systems, models, databases, connectivity, and control infrastructure.

2.6.5 Phase 4 ORT Inputs and Outputs

If possible, the ORT should be working with the MSEL designers to ensure that scenario injects are sufficient to stimulate the evaluation of the exercise objectives, recognising that the ORT are unlikely to be the experts in the scenario or the tactical activities. Thus, the ORT can advocate for the inclusion of factors that will assist in the stimulation of measurement points but they must ultimately defer to the judgement of the military planners.

The ORT should be an active participant in the MPC, the scenario writing board and the FPC, ensuring that their measurement requirements are scheduled into the MSEL in a way that remains sensitive to the need to maintain realism in the event. Furthermore, decisions to embed ORT into the exercise should be made as early as possible in the planning process to account for any specific information technology and communication architecture required for data collection and analysis. By the time of the FPC it is likely that much of the effort of the ORT is concerned with developing trusting relationships with the OSE and OCE and their staff to ensure that they understand the necessity of the ORT’s involvement and ORT requests during the event.

During the Planning phase, the ORT will deliver their submission to the HREC (if and as necessary).

The ORT should also become extremely familiar with the scenario and the various synchronisation points that exist so that they do not have to interrupt any of those involved with the exercise, whether supporting staff or primary training audience, to understand what is happening.

Resource constraints, personnel availability, and other factors beyond the control of the ORT and exercise planners might impose limits on ORT involvement at this stage. Even if that is the case, however, the ORT should strive to maintain situational awareness in order to tailor their collection plans to the scenario realities.

2.7 Phase 5 – Conduct

The conduct phase begins with deployment to the exercise location(s) and ends with ENDEX and the facilitated After Action Review (AAR). The conduct phase delivers the actual training events to evaluate the performance of the training audience relative to the approved training objectives.

Controllers, data collectors and evaluators are pre-positioned at appropriate locations and administer the exercise by referring to the MSEL to ensure the exercise remains on schedule and within scope. Data collectors directly observe player actions during the exercise and ensure that manual and automatic data records are collected and retained for evaluation and analysis.

Depending on the complexity, scope and participants, the exercise may be conducted in phases of increasing complexity. For example, complex naval and joint exercises may involve nations with different operating cultures and languages, operating procedures, experiences and expectations. To ensure that the fullest possible range of participant exercise objectives is met, naval exercises typically adhere to a phased approach as follows:

1. Harbour/Garrison/Headquarters Training Phase focussing on:

a. Relationship building;

b. Learning and reinforcing safety, environment and warfighting protocols;

c. Confirming connectivity and weapon/sensor systems;

d. Understanding national and coalition command and control procedures; and

e. Seminar, TTX, and/or CPX rehearsal of final plan.

2. Force Integration Training:

a. Basic air, surface and sub-surface warfare at the task group level; and

b. Advanced warfare including live-firing events.

3. Advanced Force Integration and Free Play:

a. Task Force Integration and Training at Theatre Level; and

b. Complex operational concepts such as Theatre ASW, Maritime Interdiction Operations, Joint Targeting/Fires, Amphibious Assaults, Humanitarian Assistance and Disaster Relief Operations, and applying Command and Control processes for an extended period across the spectrum of conflict.

Throughout the conduct phase, the Exercise Director and EXCON monitor the situation to ensure the exercise progresses in accordance with the scenario to allow accomplishment of the exercise objectives. Specific activities include:

Providing a centralized control environment to present a synchronized and realistic exercise portrayal to the Training Audience;

Ensuring that scripted scenario events remain technically and operationally synchronized;

Ensuring that M&S capabilities support accomplishment of the exercise and training objectives;

Providing initial and follow-on intelligence and information in accordance with the scenario (dynamic exercise writing may be required to respond to requests for information or to develop additional products to support the exercise);

Replicating the functions, actions, responses, and decisions of agencies external to the Training Audience;

Monitoring and controlling the exercise to ensure accomplishment of the exercise objectives; and

Collecting and recording data for exercise analysis and reporting.

2.7.1 Data Collection

Throughout the conduct of the exercise, data are collected to support the analysis, assessments, or evaluations as specified in the EXPLAN, in order to generate the specified deliverables of the exercise. Evaluation and Lessons Learned processes are closely linked.

Data collection is critical to the conduct phase, as the key resources for evaluating the exercise or testing experiment hypotheses are the data and interviews collected during the training/experimentation event. Analysts form data collection teams of subject-matter experts (SMEs), and each team collects quantitative and qualitative data and conducts interviews of key personnel at various locations and measurement points based on the scenario, events and injects.
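
A sketch of how a data collection team might log observations in a consistent, analysis-ready form, with each record tagged to the measurement point it supports; the fields, tags and values below are purely illustrative assumptions.

    import csv
    from datetime import datetime

    # Each observation is tagged to a measurement point so it can later be
    # matched to the MOP/MOE it supports during post-exercise analysis.
    observations = [
        {"time": datetime(2016, 3, 2, 9, 42).isoformat(),
         "location": "TG Ops Room", "measurement_point": "MOP-03",
         "type": "quantitative", "value": "contact classified in 7 min"},
        {"time": datetime(2016, 3, 2, 10, 5).isoformat(),
         "location": "TG Ops Room", "measurement_point": "MOE-01",
         "type": "qualitative", "value": "watch team coordination effective"},
    ]

    with open("observations.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=observations[0].keys())
        writer.writeheader()
        writer.writerows(observations)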

Following ENDEX, the After Action Review (AAR) is convened for the benefit of the training audience. Redeployment of equipment and personnel marks the official conclusion of the execution stage. If experimentation is part of the exercise, the conduct stage will also collect data in order to assess the value of the experiment solutions and capabilities that are integrated into designated training events and establish whether the hypotheses were supported or rejected.

As a general practice, exercise controllers should receive those portions of the MSEL for which they are responsible, as well as any detailed inject information that they need to provide to exercise participants. Core control team members, primarily located at the Exercise Control Cell, will use the MSEL to track exercise play, manage and oversee simulation, and maintain situational awareness for the Exercise Director.

2.7.2 Phase 5 ORT Inputs and Outputs

As part of their planning to this point the ORT will already have made decisions concerning the positioning of their resources, the data to be collected, their timetable, reachback, and so on. The ORT should meet, in person or via some technological means, after each day’s activity to discuss the adequacy of the day’s work and to share planned corrective actions for the following day. It is always likely that the original plan will have to change, but these changes, and the rationale for them, should be documented so that they can be used during planning for future events.

In cases where HREC approval was required for the exercise, the ORT is required to report to the HREC after the exercise has concluded and data collection is complete. This report must include when the exercise concluded, the total number of participants, whether any injuries occurred during the exercise, and a copy of the signed consent forms from all participants.

2.8 Phase 6 – Assess

The evaluation stage begins with redeployment and ends when finalized data and products are distributed to the end users, to include feedback into the training cycle. Upon completion of redeployment and equipment recovery, all exercise and event information (observations, documents, model data (if applicable), outputs from the AAR, etc.) is analyzed for the Post Exercise Report and Lessons Learned Report, if applicable.

The purpose of the assessment phase is to determine whether the training audience was able to achieve the exercise objectives. If experimentation is part of the exercise, this phase should identify which experiment solutions and capabilities demonstrated value during the exercise,


establish whether the hypotheses were supported or rejected, and guide further concept and/or capability development and refinement in future exercise and experimentation cycles. The Assess phase should not mark the end of a closed loop, but rather be viewed as a spiral that moves forward in a process of continual learning, growth, and improvement for all exercise participants and stakeholders across the DOTMLPFI spectrum.

The spiral process of continual improvement could be a long and deliberate process to change existing policies and/or procedures. Validation of corrective action (i.e., ensuring that the lesson has been learned) normally requires an additional collection effort to determine whether there has been a positive change in organizational behaviour and/or effectiveness as a result of the implementation of corrective action and the tracking of lessons identified.

2.8.1 Phase 6 ORT Inputs and Outputs

The assessment phase of the exercise development cycle will be the main effort for the ORT; consequently, it may take a great deal of time. However, this time can be shortened by conducting some of the analyses during the exercise (e.g., employing a reachback team, if practical). The time and resource allocation for this phase should be adequately planned ahead of time.

2.9 Sample Exercise and Training Objectives – Examples

Having described the key inputs, processes and outputs of the exercise cycle, this section of the framework presents sample training objectives. As a reminder, a specific training objective is a desired goal expressed in terms of Mission Essential Tasks (METs), derived from an authoritative Task List, to be performed under set conditions and related to a defined standard. The Commander’s training objective should describe the desired outcome of a training activity for a training audience, as well as the measures for assessing the performance outcome. Specific training objectives consist of a specific performance requirement (task), the training situation (conditions) and the level of performance (standard). Training objectives are defined by the training audience Commanders based on their METs, and are promulgated with the OCE’s EXPLAN.

The following sample training objectives were derived from topical issues for the RCN, based on unclassified discussions with Dr. Peter Dobias (Lead, MARPAC ORT) on 8 and 26 January 2016, and on interviews with Commodore J.B. Zwick (Commander Canadian Fleet Pacific) on 9 January 2016, Commander C.G. Peschke (Commander Sea Training Pacific) on 22 January 2016, and Captain(N) D.M.C. Young (MARPAC/JTFP Chief of Staff) on 22 February 2016.

They are also derived from examining the concept of the Canadian Naval Task Group (TG), which remains central to any strategic thinking about the future fleet. It will almost certainly shape, influence, and inform capabilities that are to be developed and acquired in the near- to mid-term. The Task Group continues to play a particularly important role as the RCN moves forward on such new projects as the Canadian Surface Combatant (CSC), the Joint Support Ship (JSS), the Arctic/Offshore Patrol Ship (AOPS) and the Maritime Helicopter Project (MHP),


and explores new concepts with Unmanned Aerial Vehicles (UAVs) and Joint Fires and Targeting. For the RCN, the TG concept continues to offer a broad range of capabilities, both domestically and for expeditionary operations around the globe when needed, and forms the basis on which the RCN fulfils its core missions.

Each of the sample training objectives contains a table with four columns identifying the measure identifier, the measure of merit, the metric, and a short description of the measure. A deliberate decomposition of each objective into its component tasks was not possible in the time available for this work, but is recommended for follow-on work.
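This four-column structure lends itself to a simple machine-readable form. The sketch below (Python; illustrative only, with hypothetical names) captures one row of a measures table so that measures can be catalogued and reused across exercises:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Measure:
        """One row of a training-objective measures table."""
        ident: str        # measure identifier, e.g., "M1"
        merit: str        # measure of merit, e.g., "Clarity of Plans"
        metric: str       # metric type: Time, Number, Percent, Ratio, Yes/No, etc.
        description: str  # short description of the measure

    # Example instance drawn from Table 2-1 below:
    m1 = Measure("M1", "Communication of Plans and Orders", "Time",
                 "Available to complete planning.")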

2.9.1 Evaluation of the Conduct of a Task Group Exercise

Evaluation is the structured process of examining activities, capabilities and performance against defined standards or criteria. In the case of evaluating the conduct of a Task Group Exercise (TGEX), emphasis is placed on planning, delivery of orders, exercising tactical command and control, and maintaining information on force status during the TGEX.

Training Objective 1 – Prepare Timely and Clear Plans and Orders for the Conduct of TGEX:

Scenario: During the TGEX, there is a requirement for the TG Commander to communicate the Commander’s intent, guidance and decisions in a clear, useful form that is easily understood by those who must execute the plan or order. Plans and orders are communications that direct actions and focus subordinates’ activities toward accomplishing the mission. This task includes developing and completing plans and orders, coordinating support, and approving orders. The mission requirements and capabilities are considered in the production of plans. This task could include the planning, coordination and control of the entire TGEX, as well as specific phases or serialized events required to fulfil the aim of the TGEX.

Page 48: Naval exercise and experimentation support frameworkcradpdf.drdc-rddc.gc.ca/PDFS/unc236/p803538_A1b.pdf · 3 FORMULATING RESEARCH HYPOTHESES IN MILITARY EXPERIMENTS .... 56 3.1 Defence

Naval Exercise and Experimentation Support Framework

31 March 2016 – 34 – 5902-002 Version 01

© Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2016 © Sa Majesté la Reine (en droit du Canada), telle que représentée par le ministre de la

Défense nationale, 2016

Table 2-1: 2.9.1 Training Objective 1 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Communication of Plans and Orders | Time | Available to complete planning.
M2 | Communication of Plans and Orders | Time | To complete planning.
M3 | Clarity of Plans | Number | Of changes made to plans and orders in order to attain Commander’s approval.
M4 | Communication of Plans and Orders | Time | Prior to TGEX, phase or serialized event execution that OPLAN/OPORDER/OPGEN/OPTASK/OCS Intentions are published and delivered to units.
M5 | Communication of Plans and Orders | Percent | Of units that receive their orders on schedule.
M6 | Clarity of Plans | Ratio | Number of units at desired position and ordered degree of readiness at execution versus number of units at desired position but not at the ordered degree of readiness at execution.
M7 | Clarity of Plans | Ratio | Number of units at the desired position versus the number of units not at the desired position at execution.
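As a worked illustration of the two ratio measures above (a sketch only; the per-unit status representation is an assumption), M6 and M7 could be computed from position and readiness flags recorded for each unit at execution:

    def clarity_ratios(units):
        """M6 and M7 from Table 2-1. `units` is a list of
        (at_position, at_readiness) boolean pairs observed at execution."""
        pos_ready = sum(1 for p, r in units if p and r)
        pos_not_ready = sum(1 for p, r in units if p and not r)
        not_pos = sum(1 for p, r in units if not p)
        # M6: units on station and ready versus on station but not ready
        m6 = pos_ready / pos_not_ready if pos_not_ready else float("inf")
        # M7: units on station versus units not on station
        m7 = (pos_ready + pos_not_ready) / not_pos if not_pos else float("inf")
        return m6, m7

    # Four units: three on station and ready, one on station but not ready
    print(clarity_ratios([(True, True), (True, True), (True, True), (True, False)]))  # (3.0, inf)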

Training Objective 2 – Provide Direction and Purpose to Assigned Forces Through the Exercise of Tactical Command and Control during the Conduct of TGEX:

Scenario: During the conduct of TGEX, tactical command and control provides purpose and direction to the varied activities of forces assigned to the Task Group. It is the means by which the Officer in Tactical Command (OTC) or Officer Conducting the Serial (OCS) recognizes what needs to be done and sees to it that appropriate actions are taken to achieve the objectives. Tasks include ordering warfare degrees of readiness; directing asset assignment, movement, and employment; and controlling tactical assets, including Allied and Joint forces assigned to the TG.

Table 2-2: 2.9.1 Training Objective 2 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Communication of Plans and Orders | Time | For units to respond to tasking.
M2 | Communication of Plans and Orders | Time | Delay in response to orders.
M3 | Clarity of Plans | Percent | Of units responding appropriately to orders.
M4 | Clarity of Plans | Percent | Of mission objectives attained.

Page 49: Naval exercise and experimentation support frameworkcradpdf.drdc-rddc.gc.ca/PDFS/unc236/p803538_A1b.pdf · 3 FORMULATING RESEARCH HYPOTHESES IN MILITARY EXPERIMENTS .... 56 3.1 Defence

Naval Exercise and Experimentation Support Framework

31 March 2016 – 35 – 5902-002 Version 01

© Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2016 © Sa Majesté la Reine (en droit du Canada), telle que représentée par le ministre de la

Défense nationale, 2016

Training Objective 3 – Support Effective and Timely Decision-Making by Maintaining Accurate Information and Force Status during the Conduct of TGEX:

Scenario: During the TGEX, there is a requirement for the TG Commander and subordinate units to collect, display, evaluate, disseminate and archive data and information in a form that supports decision-making and the tactical picture. Data and information include, but are not limited to, organization, situation reports, assessments and readiness data, as well as friendly, enemy/OPFOR and other force dispositions, locations and status, which are represented in Tactical Decision Aids.

Table 2-3: 2.9.1 Training Objective 3 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Clarity of Communications | Percent | Of incoming pieces of information (which could affect the outcome of the operation) that do not get to the person needing them.
M2 | Awareness of Status | Percent | Of friendly units’/organizations’ personnel status that is known.
M3 | Time Management | Time | From receipt of reports until data is posted to appropriate databases or passed to the proper recipient.
M4 | Clarity of Plans | Time | To enter most current information on status of forces.
M5 | Awareness of Status | Percent | Of reports processed and disseminated to all agencies within specified time limits.
M6 | Time Management | Time | To access and display shared local databases.
M7 | Time Management | Time | To access and display shared remote databases.
M8 | Awareness of Status | Percent | Of operational readiness data displayed that is current.
M9 | Decision-quality Information | Percent | Of dual tracks at any given moment.
M10 | Decision-quality Information | Percent | Of decisions delayed because data was not presented to the decision maker in a suitable format.
M11 | Decision-quality Information | Number | Of unresolved ambiguities in the tactical picture.

2.9.2 Evaluation of a Concept of Operations during a TGEX

A concept of operations (CONOPS) expresses the military commander’s intentions on the use of forces, time and space to achieve the mission, objectives, and end state. The following sample training objectives represent topical issues for the RCN as it attempts to modernize and become a highly interoperable naval force capable of delivering operational excellence within complex joint, interagency and multinational environments across the spectrum of conflict.

Training Objective 1 – Evaluate the TG Commander’s CONOPS for the Engagement of Joint Time-Sensitive Targets with Land-Attack Harpoon:

Scenario: A CONOPS has been developed by the TG Commander to support the JFMCC by engaging time-sensitive targets requiring immediate response, which the Joint Force Commander (JFC) has validated. Time-sensitive targets (TSTs) are engaged

Page 50: Naval exercise and experimentation support frameworkcradpdf.drdc-rddc.gc.ca/PDFS/unc236/p803538_A1b.pdf · 3 FORMULATING RESEARCH HYPOTHESES IN MILITARY EXPERIMENTS .... 56 3.1 Defence

Naval Exercise and Experimentation Support Framework

31 March 2016 – 36 – 5902-002 Version 01

© Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2016 © Sa Majesté la Reine (en droit du Canada), telle que représentée par le ministre de la

Défense nationale, 2016

using either deliberate or dynamic targeting. Since TSTs are time-sensitive, and often fleeting or emerging, they tend to be engaged via dynamic targeting; however, guidance, validation, relative prioritization, assessment criteria, collection requirements, and many other aspects of developing TSTs can be accomplished during pre-operation planning and/or as part of deliberate targeting. For this task, the TG Commander’s CONOPS would involve the planning, decisions and approvals (through the JFMCC and JFC, as well as the Canadian National Chain of Command, if required) for the launch of Land-Attack Harpoon missiles against TSTs.

Table 2-4: 2.9.2 Training Objective 1 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plans | Yes/No | Did the CONOPS include procedures to obtain time-sensitive approvals of ROE through the Canadian Chain of Command and Weapon Control Orders from Higher Command to support TST?
M2 | Completeness of Plans | Yes/No | Did the CONOPS include procedures regarding direct liaison and coordination to plan TST engagement?
M3 | Clarity of Communications | Yes/No | Did the CONOPS enable near real-time sharing of information between Components, the TG Commander and firing units in a common and consistent language?
M4 | Completeness of Plans | Yes/No | Did the CONOPS describe how specific areas of the battlespace would be defined to enable commanders to efficiently coordinate, de-conflict, integrate, and synchronize operations?
M5 | Mission Success | Percent | Of attacking systems that penetrate to the target to deliver ordnance.
M6 | Mission Success | Time | After target identification to complete the attack.
M7 | Mission Success | Percent | Of enemy forces destroyed, delayed, disrupted, or degraded.

Training Objective 2 – Evaluate the TG Commander’s CONOPS to Support Coalition Theatre ASW Efforts:

Scenario: A CONOPS has been developed by the TG Commander to support the Theatre ASW Commander in executing the JFMCC plan in support of the JFC OPLAN. For this mission, the TG Commander and his TG have been assigned a patrol box within a larger Joint Operating Area (JOA) and maneuver forces to assist in establishing maritime superiority and deterring OPFOR aggression. The TG will have coalition maritime patrol and reconnaissance aircraft (MPRA) in support, conducting patrols throughout the JOA to support ISR and targeting as well as providing SUW/ASW protection to the force. Additionally, the TG Commander (and embarked Submarine Element Commander) will have Allied submarines operating in direct support in selected Submarine Operating Areas (SUBOPAREAs), providing SUW/ASW and ISR support to protect the force. For this task, the TG Commander’s CONOPS would involve the planning, decisions and approvals (through the Theatre ASW Commander and JFMCC, and possibly through


National Chains of Command) to employ assigned surface, MPRA and submarine forces to prosecute OPFOR submarine capabilities.

Table 2-5: 2.9.2 Training Objective 2 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Time Management | Time | Prior to execution that the CONOPS is published and delivered to units.
M2 | Time Management | Percent | Of units that receive the CONOPS on schedule.
M3 | Completeness of Plans | Yes/No | Did National Chains of Command, TASW and JFMCC authorize and implement sufficient guidance and ROE to support execution of the CONOPS?
M4 | Completeness of Plans | Yes/No | Did the CONOPS establish planned and reactive procedures for detection, tracking and prosecution of OPFOR?
M5 | Clarity of Plans | Number | Of units at desired position and appropriate degree of readiness at execution.
M6 | Mission Achievement | Percent | Of ASW surveillance collection requirements fulfilled by reconnaissance/surveillance assets in accordance with (IAW) the CONOPS.
M7 | Mission Achievement | Percent | Of contact cues converted into contact detections IAW the CONOPS.
M8 | Mission Achievement | Time | To classify a contact to a level where it could be prosecuted IAW the CONOPS.

Training Objective 3 – Evaluate the TG Commander’s CONOPS to Support Coalition Maritime Interdiction Operations:

Scenario: A CONOPS has been developed by the TG Commander to support the JFMCC in executing the JFMCC plan in support of the JFC OPLAN. For this mission, the TG Commander and his TG have been assigned a patrol box within a larger JOA and maneuver forces by conducting Maritime Interdiction Operations (MIO) as part of UN sanctions. The TG will have coalition MPRA in support, conducting patrols throughout the JOA to support ISR and targeting as well as providing SUW/ASW protection to the force. For this task, the TG Commander’s CONOPS would involve the planning, decisions and approvals (through the JFMCC, and possibly through National Chains of Command) to employ assigned surface and MPRA forces.


Table 2-6: 2.9.2 Training Objective 3 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plans | Yes/No | Did the CONOPS include procedures to obtain approvals of ROE through the Canadian Chain of Command and Levels of Force authorized by Higher Command to support MIO?
M2 | Completeness of Plans | Yes/No | Did the CONOPS include procedures regarding direct liaison and coordination to plan MIO?
M3 | Completeness of Plans | Yes/No | Did the CONOPS enable near real-time sharing of information between Components, the TG Commander and units assigned to the TG in a common and consistent language?
M4 | Completeness of Plans | Yes/No | Did the CONOPS describe how specific areas of the battlespace would be defined to enable commanders to efficiently coordinate, de-conflict, integrate, and synchronize operations?
M5 | Mission Success | Percent | Of OPFOR avenues of approach closed as maneuver possibilities due to friendly force actions defined in the CONOPS.
M6 | Force Status Reporting | Time | To transmit and receive direction from Higher HQ based on information, imagery and intelligence collected during boarding IAW the CONOPS.
M7 | Mission Success | Ratio | Of targeted forces detected versus successfully interdicted.
M8 | Mission Success | Percent | Reduction in flow of all supplies to (or from) a targeted nation.

2.9.3 Validation of Concept of Operations of Tactical Elements during a TGEX

Validation is the confirmation of the capabilities and performance of organizations, individuals, materiel or systems to meet defined standards or criteria, through the provision of objective evidence. The following sample training objectives have been proposed based on possible missions which tactical elements could be called on to perform in the future security environment.

Training Objective 1 – Validate the TG Commander’s CONOPS for the Coordination and Conduct of Naval Surface Fire Support by Tactical Elements:

Scenario: A CONOPS has been developed by the TG Commander and approved by JFMCC to coordinate Naval Surface Fire Support with maneuver of forces ashore into a cohesive action maximizing their effect in accomplishing the mission and minimizing adverse effects on friendly/neutral forces and non-combatants. The TG Commander has detached Tactical Elements to the Landing Force Commander to coordinate and conduct Naval Surface Fire Support.


Table 2-7: 2.9.3 Training Objective 1 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plans | Yes/No | Did the CONOPS include procedures to obtain approvals of ROE through the Canadian Chain of Command and Levels of Force authorized by Higher Command to support NSFS?
M2 | Completeness of Plans | Yes/No | Did the CONOPS enable near real-time sharing of information between Components, the TG Commander and units assigned to the TG in a common and consistent language?
M3 | Completeness of Plans | Yes/No | Did the CONOPS describe how specific areas of the battlespace would be defined to enable commanders to efficiently coordinate, de-conflict, integrate, and synchronize operations?
M4 | Clarity of Plans | Ratio | Number of units at desired position and ordered degree of readiness at execution versus number of units at desired position but not at the ordered degree of readiness at execution.
M5 | Mission Success | Percent | Of friendly forces that execute assigned missions on time.
M6 | Mission Success | Percent | Of attacking systems that penetrate to the target to deliver ordnance.
M7 | Time Management | Time | After target identification to complete the attack.
M8 | Force Status Reporting | Time | To provide full assessment of NSFS engagement success.

Training Objective 2 – Validate the TG Commander’s CONOPS for the Conduct of Tactical Reconnaissance and Surveillance using UAVs in Support of Force Protection:

Scenario: A CONOPS has been developed for the employment of a shipborne Unmanned Aerial Vehicle (UAV) to obtain, by various detection methods, information about the activities of an enemy or potential threat during a single-ship transit through a narrow channel into a safe and secure port for a mission-critical purpose. This task requires surveillance to systematically observe the area of operations using the various sensors on the UAV. It includes the development and execution of search plans.


Table 2-8: 2.9.3 Training Objective 2 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Time Management | Time | From receipt of tasking until the ship-borne UAV is airborne and mission ready.
M2 | Mission Achievement | Percent | Of collection requirements fulfilled by reconnaissance/surveillance assets.
M3 | Mission Achievement | Percent | Of time able to respond to collection requirements.
M4 | Time Management | Time | To receive/transmit updated cueing information between the UAV and the ship.
M5 | Time Management | Time | To analyze cueing information transmitted by the UAV.
M6 | Completeness of Plans | Yes/No | Was sufficient bandwidth available for transmission and receipt of UAV cueing information?
M7 | Decision-quality Information | Number | Of unresolved ambiguities in the tactical picture.
M8 | Decision-quality Information | Percent | Of decisions delayed because data was not presented to the decision maker in a suitable format.

Training Objective 3 – Validate the Joint Special Forces and Submarine Force Commander’s CONOPS for Special Forces Insertion and Extraction by Submarine to Support Direct Action:

Scenario: A CONOPS has been jointly developed by the Special Forces (SOF) Commander and the Submarine Force Commander to covertly insert CAF SOF via a Canadian diesel-electric submarine (SSK) to perform Direct Action (DA). Specifically, SOF will be inserted to conduct small-scale offensive actions in order to seize, destroy, capture, recover, or inflict damage on designated personnel or materiel. On completion, SOF will be extracted by the SSK.

Table 2-9: 2.9.3 Training Objective 3 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Time Management | Time | From receipt of tasking, SSK and embarked SOF are in position and mission ready.
M2 | Time Management | Time | Between planned and actual infiltration.
M3 | Coordination of Plans | Yes/No | Did SSK C4ISR capabilities and the requirement to remain covert impact full coordination and de-confliction of SOF plans prior to insertion?
M4A | Navigation Accuracy on SOF Extraction | Distance | Between actual position of SSK and SOF at time and location of planned extraction.
M4B | Navigation Accuracy on SOF Extraction | Percent | Of time SOF navigation equipment satisfies accuracy criteria.
M4C | Navigation Accuracy on SOF Extraction | Percent | Of time SSK navigation equipment satisfies accuracy criteria.


2.9.4 Validation of Concept of Employment of Combat Systems during Live Trials

A Concept of Employment (CONEMPL) describes how a specific system or systems will be employed to support mission accomplishment. Validation is the confirmation of the capabilities and performance of organizations, individuals, materiel or systems to meet defined standards or criteria, through the provision of objective evidence. The following sample training objectives have been selected to assess the potential application of new equipment, concepts and procedures which may be employed by a modernized RCN in the future security environment.

Training Objective 1 – Validate the Concept of Employment for the Main Gun Armament of the Canadian Surface Combatant in an Indirect Fire Role:

Scenario: Collateral damage can have an adverse impact on fragile civilian infrastructure and on maintaining the support of the local population. The RCN must ensure it coordinates its ground fire support coordinating measures and fully complies with rules of engagement and direction from Higher Command. This live trial would involve extensive data collection and post-exercise analysis.

Table 2-10: 2.9.4 Training Objective 1 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plans | Yes/No | Did the CONEMPL describe how specific areas of the battlespace would be defined to enable commanders to efficiently coordinate, de-conflict, integrate, and synchronize Indirect Fire Missions?
M2A | Response Time | Time | Interval between the time when relevant information becomes available to the commander and the time the Call for Fire order was issued.
M2B | Response Time | Time | Interval between the time the Call for Fire order was issued and the ship was in a position to conduct the Fire Mission.
M3A | Mission Success | Yes/No | Attacking systems penetrate to the target to deliver ordnance.
M3B | Mission Success | P(h) | Probability of a hit.
M3C | Mission Success | P(k) | Probability of a kill given a hit.
M4A | Call of Shot Correction | Time | After strike of previous round to provide adjustment data.
M4B | Call of Shot Correction | Distance | Miss distance after applying corrections to fall of shot.
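The M3B and M3C measures combine multiplicatively: the probability that a single engagement kills the target is P(h) × P(k). A minimal sketch of the arithmetic, using invented example values:

    def kill_probability(p_hit, p_kill_given_hit, rounds=1):
        """Probability that at least one of `rounds` independent engagements
        achieves a kill, combining M3B (P(h)) and M3C (P(k), given a hit)."""
        p_single = p_hit * p_kill_given_hit
        return 1.0 - (1.0 - p_single) ** rounds

    print(kill_probability(0.7, 0.5))            # 0.35 for a single round
    print(kill_probability(0.7, 0.5, rounds=3))  # ~0.73 across three rounds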


Training Objective 2 – Validate the Concept of Employment for the Canadian Surface Combatant to apply Airspace Control Measures when Assigned the Duty of Local Anti-Air Warfare Commander:

Scenario: A Canadian Surface Combatant has been directed to perform the duties of Local Anti-Air Warfare Commander (LAAWC) and is required to apply Airspace Coordination Measures (ACM) in accordance with the Joint Force Air Component Commander (JFACC) Airspace Control Plan (ACP) and Special Instructions (SPINS). This live trial involves validating new equipment and procedures to facilitate routing and recognition of friendly aircraft, establishment of identification zones and weapons engagement zones, and the direction of non-combat air resources.

Table 2-11: 2.9.4 Training Objective 2 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plans | Yes/No | Did the CONEMPL describe how specific areas of the battlespace would be defined to enable the LAAWC to efficiently coordinate, de-conflict, integrate, and synchronize operations IAW the ACM, ACP and SPINS?
M2 | Procedures Applied | Percent | Of fixed-wing sorties unable to complete their mission because of lack of clearance.
M3 | Fratricide | Percent | Of friendly aircraft sorties engaged by friendly weapons systems.
M4A | Information Accuracy | Percent | Of LAAWC’s Operational Area that has a complete air picture available.
M4B | Information Accuracy | Percent | Of time the information satisfies accuracy criteria.
M5 | Decision-quality Information | Percent | Of dual tracks at any given moment.
M6 | Decision-quality Information | Percent | Of decisions delayed because data was not presented to the decision maker in a suitable format.
M7 | Decision-quality Information | Number | Of unresolved ambiguities in the tactical picture.

Training Objective 3 – Validate the Concept of Employment for two Canadian Surface Combatants to Achieve a Time-Sensitive, Dual-Axis, Simultaneous Time-on-Top Attack Against a Maritime Surface Target with the Harpoon Missile:

Scenario: Two Canadian Surface Combatants have been assigned to attack surface targets with the intent to degrade the ability of enemy forces to conduct coordinated operations and/or perform critical tasks. For the purposes of this live trial, the concept of employment will require two ships and one maritime patrol and reconnaissance aircraft to validate a time-sensitive, dual-axis, simultaneous time-on-top engagement against a maritime surface target of similar size and capabilities. This live trial would involve extensive data collection and post-exercise analysis.


Table 2-12: 2.9.4 Training Objective 3 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plans | Yes/No | Did the CONEMPL describe how specific areas of the battlespace would be defined to enable the ships and aircraft to efficiently coordinate, de-conflict, integrate, and synchronize a time-sensitive, dual-axis, simultaneous time-on-top engagement against a maritime surface target?
M2 | Procedures Applied | Time | Required to develop an accurate plot and issue firing orders.
M3 | Information Accuracy | Percent | Of time aircraft-reported information satisfies engagement criteria.
M4A | Mission Success | Yes/No | Attacking systems penetrate to the target to deliver ordnance.
M4B | Mission Success | P(h) | Probability of a hit.
M4C | Mission Success | P(k) | Probability of a kill given a hit.
M4D | Maintenance | Number | Of no-launches due to mechanical reasons.
M4E | Operator Training | Percent | Of no-launches due to operator error.

2.9.5 Evaluation/Assessment of the Conduct of a Table-Top Exercise (TTX)

A TTX is an exercise in which hypothetical scenarios are discussed in an informal setting. The sample TTX training objectives focus on assessing the adequacy of high-level policy, plans, procedures, training, resources, relationships and stakeholder interdependencies at the TG Commander, MCC and MARPAC/JTFP levels. Because a TTX lacks the disciplined approach of a wargame, it generally lacks the robust action-reaction-counteraction process, with rules and steps, that would be followed in a wargame.

Training Objective 1 – Evaluate/Assess the Adequacy of the Canadian Maritime Component Commander’s Plans Related to the Provision of Support during a Naval Deployment to a Theatre of Operations in Time of Crisis:

Scenario: The TTX will evaluate/assess the adequacy of the Canadian MCC plans to support in-transit forces en route to a Joint Theatre of Operations in a time of crisis. Short-notice operational deployments, particularly of larger contributions of naval forces to a Joint Task Force, require intermediate staging bases or areas for refueling, air-bridge operations, forward basing/staging of personnel and equipment, and integration training. Staging bases or areas may require ports, airfields and facilities, transshipment facilities, construction services and health care services.


Table 2-13: 2.9.5 Training Objective 1 Measures

Measure ID | Measure of Merit | Metric | Description
M1A | Commander Involvement | Yes/No | Commander provided complete and relevant commander’s guidance to the staff for the TTX.
M1B | Commander Involvement | Yes/No | Commander was available on request to provide guidance and direction throughout the TTX, including COA evaluation criteria.
M2A | Cross-Disciplinary Involvement | Yes/No | Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the TTX.
M2B | Cross-Disciplinary Involvement | Ratio | Of SMEs who arrived knowledgeable about the TTX and their areas of planned involvement in the TTX.
M3 | Completeness of Plan | Percent | Completeness of plan at end of TTX versus start.
M4 | Completeness of Overall Plan | Percent | Of requests for information remaining unanswered at end of TTX.
M5A | Completeness of Annexes | Percent | Of annex complete describing how personnel, equipment and other resources are force generated in time of crisis (at end of TTX).
M5B | Completeness of Annexes | Percent | Of annex complete describing how to obtain agreements for access and basing (Diplomatic Clearances, Status of Forces Agreements, etc.) (at end of TTX).
M5C | Completeness of Annexes | Percent | Of annex complete describing how national-level support services are accessed (at end of TTX).
M5D | Completeness of Annexes | Percent | Of annex complete describing how in-theatre arrangements are made with the Host Nation and allies/partners (at end of TTX).
M5E | Completeness of Annexes | Percent | Of annex complete describing requirements for national communications infrastructure for in-theatre communications and the national rear link (at end of TTX).
M5F | Completeness of Annexes | Percent | Of annex complete describing the Command and Control structure and Transfer of Command Authority (TOA/TOCA) procedures (at end of TTX).
M5G | Completeness of Annexes | Percent | Of annex complete describing how to establish logistics arrangements for materiel and services, including in-theatre contracting (at end of TTX).

Training Objective 2 – Evaluate/Assess the Adequacy of TG Commander Plans and Procedures to Integrate Multinational Forces into a Canadian-Led Naval Task Group:

Scenario: The TTX will assess the adequacy of the Canadian TG Commander’s plans and procedures to integrate multinational naval forces into the TG. This TTX would require knowledgeable representation from foreign nations who would typically be involved in planning coalition operations. Because a TTX lacks the disciplined approach of a wargame, it generally lacks the robust action-reaction-counteraction process, with rules and steps, that would be followed in a wargame.


Table 2-14: 2.9.5 Training Objective 2 Measures

Measure ID | Measure of Merit | Metric | Description
M1A | Commander Involvement | Yes/No | Commander provided complete and relevant commander’s guidance to the staff for the TTX.
M1B | Commander Involvement | Yes/No | Commander was available on request to provide guidance and direction throughout the TTX, including COA evaluation criteria.
M2A | Cross-Disciplinary Involvement | Yes/No | Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the TTX.
M2B | Cross-Disciplinary Involvement | Ratio | Of SMEs who arrived knowledgeable about the TTX and their areas of planned involvement in the TTX.
M3 | Time Management | Time | Prior to TTX that the OPLAN/OPORDER/OPGEN/OPTASKs are published and delivered to participating TTX coalition forces.
M4 | Clarity of Plans | Percent | Of OPLAN/OPORDER/OPGEN/OPTASKs published in a common format and lexicon to participating TTX coalition forces.
M5 | Classification and Disclosure | Percent | Of orders, assessments and reports that bear proper classification markers and are written for release to multinational partners.
M6 | Completeness of Plan | Percent | Completeness of plan at end of TTX versus start.
M7 | Completeness of Plan | Percent | Of requests for information remaining unanswered at the end of TTX.
M8 | Completeness of Plan | Percent | Of coalition partner national rules (safety, ROE, policy) which constrain employment resolved by end of TTX.
M9A | Conflict Resolution | Number | Of conflicts identified during TTX.
M9B | Conflict Resolution | Percent | Of conflicts identified that were resolved during TTX.
M9C | Conflict Resolution | Percent | Of conflicts left unresolved by coalition partner acquiescence to lead-nation standards.
M10 | Interoperability | Percent | Of coalition partners who have knowledge of and capability in particular warfare areas, but were not able to fully integrate into the TG due to interoperability challenges identified during the TTX.

Training Objective 3 – Evaluate/Assess Adequacy of MARPAC/JTFP Plans and Procedures to Understand and Manage the Impact of Cyber-threats to the Formation:

Scenario: The TTX will assess the adequacy of MARPAC/JTFP Plans and Procedures to understand and manage the impact of cyber-threats to the formation. Formation-level cyberspace operations support the Commander’s ability to exercise command and control by providing communication and information systems that are reliable, secure, timely, and flexible. For the purposes of this TTX, cyberspace operations protect information and information processes through defensive means, rather than offensive cyberspace operations.


This TTX would require knowledgeable representation from Higher Command and Subject Matter Experts who are knowledgeable of cyberspace threats and planning, such as the Canadian Forces Network Operations Centre (CFNOC), which conducts national system operations with enterprise-wide scope. Because a TTX lacks the disciplined approach of a wargame, it generally lacks the robust action-reaction-counteraction process, with rules and steps, that would be followed in a wargame.

Table 2-15: 2.9.5 Training Objective 3 Measures

Measure ID | Measure of Merit | Metric | Description
M1A | Commander Involvement | Yes/No | Commander provided complete and relevant commander’s guidance to the staff for the TTX.
M1B | Commander Involvement | Yes/No | Commander was available on request to provide guidance and direction throughout the TTX, including COA evaluation criteria.
M2A | Cross-Disciplinary Involvement | Yes/No | Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the TTX.
M2B | Cross-Disciplinary Involvement | Ratio | Of SMEs who arrived knowledgeable about the TTX and their areas of planned involvement in the TTX.
M3 | Time Management | Time | Prior to TTX that relevant materials for the TTX were published and delivered to participants.
M4 | Clarity of Plans | Percent | Of relevant materials for the TTX published in a common format and lexicon to participants.
M5 | Classification and Disclosure | Percent | Of orders, assessments and reports that bear proper classification markers and are written for release to multinational partners.
M6A | Completeness of Plan | Percent | Completeness of plan at end of TTX versus start.
M6B | Completeness of Plan | Yes/No | Does the command have a roadmap to develop a mission assurance and critical cyberspace infrastructure protection plan, and continuity plans in the event of disruption, as a result of the TTX?
M7 | Completeness of Plan | Percent | Increase in the volume of cyber-threat information and recommendations on cyberspace issues available to be provided to the commander as a result of the TTX.
M8 | Information Completeness | Percent | Increase in the number of information sources in the commander’s regional cyberspace common operational picture (COP) as a result of the TTX.
M9 | Continuity of Operations | Percent | Of the command’s mission-critical networks synchronized and prioritized for restoration in the event of a cyber-attack.
M10 | Continuity of Operations | Time | To restore mission-critical networks in the event of a cyber-attack.


2.9.6 Validation of a Concept of Operations (including Multi-Agency) during a Table-Top Exercise

Validation of a Concept of Operations requires the confirmation of the capabilities and performance of organizations, individuals, materiel or systems against defined standards or criteria, through the provision of objective evidence. When multi-agency partners are involved, this can be a particularly daunting task given their different authorities, mandates and jurisdictions, interests, organizational cultures and lexicons. While not specifically aimed at validating a CONOPS, the observations and recommendations from TTX PACIFIC JEOPARDY 2015 (DRDC-RDDC-2015-L185) provide an excellent overview of the perils of scope creep, and of how a lack of the requisite knowledge of TTX procedures can leave participating decision-makers and subject matter experts unable to contribute fully to the discussion. The training objectives below for the validation of CONOPS were selected because they are complex and inter-disciplinary, and require advance research and preparation by the training audience. Moreover, the inclusion of a mix of qualitative and quantitative measures provides objective evidence of the ability to execute the concept of operations, and will help prevent a multi-agency TTX from degrading into a series of side-bar discussions and the exchange of business cards.

Training Objective 1 – Validate the JTFP CONOPS to provide Alternate Command, Control, Communications, Computers, and Intelligence (C4I) Services following a Catastrophic Earthquake:

Scenario: The focus of this TTX is to validate the CONOPS for the provision of secure communications and automated information system support encompassing command, control, communications, computers, and intelligence (C4I) to JTFP following a catastrophic earthquake.


Table 2-16: 2.9.6 Training Objective 1 Measures

Measure ID | Measure of Merit | Metric | Description
M1A | Commander Involvement | Yes/No | Commander provided complete and relevant commander’s guidance to the staff and interagency participants for the TTX.
M1B | Commander Involvement | Yes/No | Commander was available on request to provide guidance and direction throughout the TTX, including COA evaluation criteria.
M2A | Cross-Disciplinary Involvement | Yes/No | Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the TTX.
M2B | Cross-Disciplinary Involvement | Ratio | Of SMEs who arrived knowledgeable about the TTX and their areas of planned involvement in the TTX.
M3 | Time Management | Time | Prior to TTX that relevant background materials were published and delivered to participants.
M4 | Clarity of Plans | Percent | Of background materials published in a common format and lexicon to participants.
M5 | Classification and Disclosure | Percent | Of orders, assessments and reports that bear proper classification markers and are written for release to participants.
M6 | Completeness of Plan | Percent | Completeness of plan at end of TTX versus start.
M7 | Completeness of Plan | Percent | Of requests for information remaining unanswered at end of TTX.
M8 | Completeness of Plan | Percent | Of interagency limitations which constrain employment that were comprehended by all stakeholders by end of TTX.
M9A | Conflict Resolution | Number | Of conflicts identified during TTX.
M9B | Conflict Resolution | Percent | Of conflicts identified that were resolved during TTX.
M9C | Conflict Resolution | Percent | Of conflicts left unresolved by interagency partner acquiescence (or silence).
M10 | Interoperability | Time | To establish mobile C4I support to forward-deployed assets and liaison teams to Federal, Provincial, Municipal and First Nations partners.
M11 | Mission Assurance | Percent | Of commander’s critical communications systems that would be fully operational immediately following an earthquake.
M12 | Mission Assurance | Yes/No | Does the command have a roadmap to address C4I architecture gaps as a result of the TTX?
M13A | Mission Assurance | Yes/No | Does the command have established data, voice, and video services available to support operations from an Alternate HQ?
M13B | Mission Assurance | Percent | Of redundant systems at the Alternate HQ that are in place and regularly tested (IAW SOP) to ensure the commander’s critical communications systems remain fully operational.


Training Objective 2 – Validate the JTFP CONOPS for Phase 1 (Immediate Response) of JTFP CONPLAN PANORAMA:

Scenario: The focus of this TTX will be to validate the JTFP CONOPS for Phase 1 immediately following a catastrophic earthquake. Initial response tasks are always in support of the civil authorities and focus on meeting requested and approved humanitarian needs.

Table 2-17: 2.9.6 Training Objective 2 Measures

Measure ID | Measure of Merit | Metric | Description
M1 | Completeness of Plan | Percent | Completeness of plan at end of TTX versus start.
M2 | Completeness of Plan | Percent | Of requests for information remaining unanswered at end of TTX.
M3 | Completeness of Plan | Percent | Of interagency limitations which constrain employment that were comprehended by all stakeholders by end of TTX.
M4 | Completeness of Plan | Yes/No | Does the CONOPS describe in sufficient detail the tasks and desired effects required for individual and unit self-recovery?
M5 | Mission Assurance | Time | To re-establish C4I with the chain of command after the earthquake.
M6 | Mission Assurance | Percent | Of the AO having received initial damage assessment within critical milestones and timelines.
M7 | Mission Assurance | Time | For Lead Agencies to determine the degree of CAF services needed.
M8 | Mission Assurance | Percent | Of road and rail lines of communication surveyed and determined to be serviceable within planned milestones and timelines.
M9 | Mission Assurance | Time | To determine transportation requirements that can be used to support the mission.
M10 | Mission Assurance | Percent | Of Provincial requests for support filled within planned milestones and timelines.
M11 | Mission Assurance | Percent | Of initial high-readiness CAF reinforcements from outside the AO having gone through Reception, Staging, Onward Movement and Integration (RSOMI) within planned milestones and timelines.
M12 | Mission Assurance | Percent | Of equipment ready and available to provide assistance.
M13 | Mission Assurance | Percent | Of available forces dedicated to provide assistance.


Training Objective 3 – Validate the Canadian Maritime Component Commander’s CONOPS to Analyze the Relative Combat Power of an Adversary as part of the Operational Planning Process:

Scenario: Relative combat power analysis is a comparison of those friendly and adversary tangible and intangible factors that allow the MCC to generate combat power in order to achieve objectives. The most important information derived from conducting such an analysis is a deeper understanding of friendly and adversary force numbers, capabilities, strengths, and weaknesses relative to each other at a given time and in a particular geographic location. Pre-planned scenarios or vignettes, as well as Higher Command planning guidance and OPFOR order-of-battle information, are required for this TTX. This type of TTX should attract not only military planners, but also intelligence analysts, as well as research scientists and academics who are knowledgeable of OPFOR weapons, tactics and doctrine.
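One common way to make such a comparison explicit is a weighted scoring of the tangible and intangible factors. The toy sketch below is offered only as an illustration of the arithmetic; the factor names, weights and ratings are invented and do not represent CF doctrine:

    def relative_combat_power(friendly, adversary, weights):
        """Weighted friendly-to-adversary capability ratio; > 1.0 favours the
        friendly force. Scores are subjective 0-10 ratings per factor."""
        f = sum(weights[k] * friendly[k] for k in weights)
        a = sum(weights[k] * adversary[k] for k in weights)
        return f / a

    weights   = {"fires": 0.4, "c4isr": 0.3, "sustainment": 0.2, "training": 0.1}
    friendly  = {"fires": 6, "c4isr": 8, "sustainment": 7, "training": 9}
    adversary = {"fires": 7, "c4isr": 5, "sustainment": 6, "training": 6}
    print(relative_combat_power(friendly, adversary, weights))  # ~1.16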

Table 2-18: 2.9.6 Training Objective 3 Measures

Measure ID | Measure of Merit | Metric | Description
M1A | Commander Involvement | Yes/No | Commander provided complete and relevant commander’s guidance to the staff and participants for the TTX.
M1B | Commander Involvement | Yes/No | Commander was available on request to provide guidance and direction throughout the TTX, including COA evaluation criteria.
M2A | Cross-Disciplinary Involvement | Yes/No | Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the TTX.
M2B | Cross-Disciplinary Involvement | Ratio | Of SMEs who arrived knowledgeable about the TTX and their areas of planned involvement in the TTX.
M3 | Application of OPP | Percent | Of staff trained in CF OPP and relative combat power analysis techniques.
M4 | Completeness of Plans | Yes/No | Was the provided OPFOR data sufficient for analysis in terms of qualitative and quantitative data?
M5 | Quality of OPP | Percent | Of analysis based on factually based discussion rather than conjecture.
M6 | Quality of OPP | Yes/No | If an item could not be analyzed adequately, was it captured as a subjective estimate, or even an assumption, as part of the TTX?
M7 | Quality of OPP | Yes/No | Were both friendly and OPFOR doctrine, and how it affects their respective actions/equipment, analyzed?
M8 | Quality of OPP | Yes/No | Were both friendly and OPFOR weapons analyzed?
M9 | Quality of OPP | Yes/No | Were both friendly and OPFOR ammunition loadout and resupply issues and impacts analyzed?
M10 | Quality of OPP | Yes/No | Were both friendly and OPFOR training issues and impacts analyzed?


2.9.7 Assessment of Command-Post Exercises

The training objectives identified to assess Command-Post Exercises (CPXs) were selected to measure the performance of JTFP and Task Group staffs, supported by their commanders, in conducting key components of the operational planning process when responding to an emergent situation. Depending on the level of ambition for the CPX, this effort could include dedicated Red-Team players to challenge the ideas and assumptions of the commander and staff, and to expose alternatives, vulnerabilities, limitations and risks.

Training Objective 1 – Assess the Ability of JTFP Staff to Conduct Mission Analysis:

The process of mission analysis is described in the CF Operational Planning Process (OPP), CFJP 5.0 (DND, 2008). Mission analysis is a cognitive process conducted through a brainstorming session by the commander and selected senior staff. They determine what has to be accomplished, where, by whom, by when, and why. The mission analysis leads the commander to the production of the mission statement and the commander’s Planning Guidance or, if the analysis was generated by the commander, an initiating directive to a subordinate commander. The commander or senior staff then uses these products to orient the planning staff in the development of courses of action in subsequent steps of the OPP.

Scenario: A scenario with appropriate background information on Political, Military, Economic, Social, Information, Infrastructure, Physical Environment, and Time (PMESII-PT) factors would be required as an analytical start point for Mission Analysis. Performance of this training objective includes reviewing the mission and mission requirements, and evaluating updated status information. Planners analyze higher-level guidance, identify enemy centres of gravity, review assessments of the situation, and prepare a proposed mission statement. The Area of Interest (AI) is determined from terrain analysis and an analysis of friendly and threat capabilities and limitations, and should be examined in accordance with the guidance provided by the Commander. Commander’s Critical Information Requirements (CCIRs) are developed. The mission statement, commander’s intent, and initial planning guidance are developed and issued to facilitate development of the proposed course(s) of action.


Table 2-19: 2.9.7 Training Objective 1 Measures

M1A Commander Involvement Yes/No Commander provided complete and relevant commander's guidance to the staff and participants for the CPX.
M1B Commander Involvement Yes/No Commander was available on request to provide guidance and direction throughout the CPX and receive a back brief on the mission analysis.
M2A Cross-Disciplinary Involvement Yes/No Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the CPX.
M2B Cross-Disciplinary Involvement Ratio Of SMEs who arrived knowledgeable about the CPX and their areas of planned involvement in the CPX.
M3 Application of OPP Percent Of staff trained in the CF Operational Planning Process (OPP).
M4A Time Management Time Available to complete planning.
M4B Time Management Time To complete assessment of latest information regarding status of forces and the operational environment.
M4C Time Management Time To complete mission analysis.
M5A Quality of OPP Percent Of PMESII-PT factors analyzed to form the basis for mission analysis.
M5B Quality of OPP Percent Of available (and required) information that did not reach the person requiring it.
M6A Clarity of Plans Number Of changes made to the mission statement in order to attain Commander's approval.
M6B Clarity of Plans Number Of changes made to the commander's intent in order to attain Commander's approval.
M6C Clarity of Plans Number Of changes made to the initial planning guidance in order to attain Commander's approval.

Training Objective 2 – Assess the Ability of JTFP Staff to Analyze and Compare Courses of Action during a CPX:

Scenario: The use of the Commander's planning guidance, as well as Joint Intelligence Preparation of the Operational Environment (JIPOE) products, would be required to execute such a CPX. The planners do not judge or eliminate potential COAs: all possibilities are recorded for potential use. Using an array of employment possibilities, planners design a broad plan of how they intend to accomplish the mission. "How" they intend to accomplish the mission becomes a COA. Development of COAs with sufficient variety to provide the Commander with a range of employment options is critical.


Table 2-20: 2.9.7 Training Objective 2 Measures

M1A Commander Involvement Yes/No Commander provided complete and relevant commander's guidance to the staff and participants for the CPX.
M1B Commander Involvement Yes/No Commander was available on request to provide guidance and direction throughout the CPX and receive a back brief on the recommended COA.
M2A Cross-Disciplinary Involvement Yes/No Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the CPX.
M2B Cross-Disciplinary Involvement Ratio Of SMEs who arrived knowledgeable about the CPX and their areas of planned involvement in the CPX.
M3 Application of OPP Percent Of staff trained in the CF Operational Planning Process (OPP).
M4A Time Management Time Available to complete planning.
M4B Time Management Time To complete assessment of latest information regarding status of forces and the operational environment.
M4C Time Management Time To complete COA development.
M5A Quality of OPP Yes/No Did the Commander and staff follow the OPP regarding COA development and analysis procedures to visualize the flow of the operation, given friendly and adversary capabilities, strengths, weaknesses and force dispositions, as well as other situational and environmental considerations?
M5B Quality of OPP Number Of feasible COAs developed and war gamed.

Training Objective 3 – Assess the Ability of TG Commander Staff to Evaluate and Review Rules of Engagement (ROE) during a CPX:

Scenario: Using pre-planned scenarios or vignettes as well as higher command planning guidance, the TG Commander and staff are to determine limitations on tactical actions based on existing ROE. This also includes understanding the freedom of action provided or constrained by existing ROE. This type of CPX would involve a variety of complex vignettes to foster a robust dialogue involving legal advisors and staff echelons including MCC, CJOC and SJS. The output of this CPX is a commander-approved ROE Request Message (ROEREQ).


Table 2-21: 2.9.7 Training Objective 3 Measures

M1A Commander Involvement Yes/No Commander provided complete and relevant commander's guidance to the staff and participants for the CPX.
M1B Commander Involvement Yes/No Commander was available on request to provide guidance and direction throughout the CPX.
M2A Cross-Disciplinary Involvement Yes/No Subject matter experts (SMEs) arrived fully prepared and were able to contribute fully to the CPX.
M2B Cross-Disciplinary Involvement Ratio Of SMEs who arrived knowledgeable about the CPX and their areas of planned involvement in the CPX.
M3 Application of ROE Percent Of staff trained in ROE.
M4A Time Management Time Available to complete ROE planning.
M4B Time Management Time To complete assessment of latest information regarding status of forces and the operational environment.
M4C Time Management Time To complete an ROEREQ.
M5 Clarity of ROEREQ Number Of changes made in order to attain Commander's approval of the ROEREQ.
M6 Quality of OPP Yes/No Higher echelon staff and legal advisors and/or role players contributed to the CPX.

2.10 Most Common Objectives, Topics and Measures

The preceding sections presented a variety of sample exercise objectives that can be examined for commonalities. It should be noted that these commonalities are based only on the samples provided in this report. Ideally, a more comprehensive effort would analyse the exercises carried out by the RCN over the last ten to fifteen years to consider all the exercise objectives pursued, the exercise focus, and the types of measures applied.

At the highest level, five common objectives were listed in the sample exercise objectives section. These were:

Prepare;

Provide;

Support;

Evaluate/Assess; and,

Validate.

These five high-level objectives always have a particular focus, which is then coupled with a specific area of interest (e.g., use of a specific weapon or sensor system, preparations for specific operations). The particular common focus areas described in the sample exercise objectives were:

Plans and Orders/Plans and Procedures;

Direction and Purpose;

Decision Making;

CONOPS;

Concept of Employment; and

Ability (to carry out an action).

Every sample exercise objective was associated with a number of measures. Table 2-22 below presents a summary of the common measurement areas with their associated units of measurement.

NOTE: In some cases, measures described with different words have been grouped together and in other cases, overly specific measures have been generalised to a common topic.

Table 2-22: Common Measurements Described in Sample Objectives

Measurement Area: Unit(s) of Measure
Effectiveness and Achievement: Yes/No, Time, Percent, Ratio, P(h), P(k)
Decision Making: Percent, Number
ROEs: Percent, Number
Information Exchange: Percent
Situation Awareness: Percent
Maintenance: Number
Communications: Percent, Yes/No, Time
Navigation: Distance, Percent
Classification and Disclosure: Percent
Operator Training: Percent
Commander Involvement: Yes/No
Procedures Applied: Percent, Time
Plans and Planning: Yes/No, Percent, Number, Ratio
Time Management and Response Time: Time, Percent
Coordination and Conflict Resolution: Percent, Number, Yes/No, Distance
Interoperability/Cross-Disciplinary Involvement: Yes/No, Ratio, Percent
Continuity of Operations: Percent, Time


3 FORMULATING RESEARCH HYPOTHESES IN MILITARY EXPERIMENTS

The experiment process follows a repeatable cycle similar to that described for the exercise cycle, starting with initiation and following an iterative process of definition and development, followed by experiment conduct, analysis and reporting. This cycle repeats itself as the conclusions and recommendations of the last cycle are analysed and incorporated into the next cycle as part of a larger campaign of experimentation.

This section of the Framework provides an overview of defence experimentation and the iterative cycle of experimentation, and summarizes The Technical Cooperation Program's (TTCP's) Guide for Understanding and Implementing Defense Experimentation (GUIDEx) (2006), which complements the growing body of literature in defence experimentation. It also includes a section on hypothesis development techniques and measurement planning considerations. Finally, the section concludes with the final deliverable of this project: several examples of template hypotheses, together with the notional research questions that could precipitate their development.

3.1 Defence Experimentation

The CAF Defence Terminology Bank defines experimentation as "the process of controlled research, testing and evaluation to discover information or test a hypothesis to establish cause and effect." Opportunities to conduct experimentation may be found in training exercises and in operational test and evaluation (OT&E) events. Conducting experimentation during exercises involves careful planning and the integration of discrete test and experimentation events into a near-real operational environment. The chief advantages of experimentation during exercises are the availability of commanders, planners, operators, technicians and other subject matter experts capable of providing levels of insight not available through other means. Additionally, the routine inclusion of experimentation in exercises helps build a culture of innovation as researchers and practitioners of the art and science of warfare collaborate in understanding and solving difficult military problems. The principal challenges of conducting experimentation during exercises are that the exercise environment may not be suitable for the deliberate control and manipulation of variables, data collection may conflict with training and exercise timings and priorities, and experiment objectives may be inconsistent with exercise and training objectives. Ultimately, the exercise will have primacy over experiment objectives; consequently, the researcher must understand the limitations of the exercise environment and plan accordingly.

Like the defence and emergency management training and exercise communities, the operational research and defence science communities also generally follow a repeatable cycle commencing with problem formulation; design and planning; conduct (hypothesis testing and data collection); and analysis and reporting. The Code of Best Practice for Experimentation (2002) introduces the concept of a Campaign of Experimentation, where a variety of scientific techniques are used in a methodical and iterative way to move from an initial idea to a demonstrated and meaningful military capability.


3.2 Experiment Types

There are three general types of defence experiment:

1. Discovery Experiment: Used to experiment with new ideas or ways of doing things, and generally to examine cause-and-effect relationships. Discovery experiments are typically employed early in the development cycle to identify insights and explore potential benefits of a new idea or innovation without a full quantitative analysis.

2. Hypothesis Testing Experiment: Used to advance knowledge by seeking to prove or disprove a specific hypothesis. This technique typically requires a defined problem statement and a controlled baseline during which new conditions or variables are systematically introduced or removed under controlled conditions to observe the outcome. Control and manipulation of dependent, independent, and control variables are integral to the experiment.

3. Demonstration Experiment: Used to recreate a known truth, typically in the form of a technology demonstration performed under well-established and specified conditions. Demonstration experiments are not purposefully aimed at acquiring new knowledge, but rather to allow decision-makers and other stakeholders to visualize and comprehend the value of the new concept, process, technology, or innovation.

3.3 Experiment Cycle

The experiment process follows a repeatable cycle similar to that described for the exercise cycle, starting with initiation and following an iterative process of definition and development, followed by experiment conduct, analysis and reporting. This cycle repeats itself as the conclusions, recommendations, and lessons identified from the last cycle are analysed and incorporated into the next experiment cycle as part of a larger campaign of experimentation.

Figure 3-1, from Alberts' Code of Best Practice: Campaigns of Experimentation, illustrates the iterative nature of the experimentation cycle, allowing for the incorporation of changes into the experiment over its life. The Annex to the Code includes a comprehensive checklist which could be applied to individual experiments as well as to a larger campaign of experimentation. However, the ORT should be in no doubt that its preparations for an experiment must be complementary to the military's own development cycle. To this point, it is recommended that the ORT use the exercise planning cycle described in Figure 2-2, namely: initiate, conceive, design, plan, conduct, and assess. This approach should be used whether the RCN is the lead agency or CORA is the lead agency, to ensure that CORA and the RCN share a common understanding of the process and their respective responsibilities within that process.


Figure 3-1: Iterative Cycle for Experimentation

The changing nature of the threat landscape, the speed of transformation in technology and concept development, and the growing recognition of the importance of defence experimentation led to the development of GUIDEx to complement the growing body of literature in defence experimentation. While the major focus of GUIDEx is on military experimentation based on the study of field events and virtual (human-in-the-loop) simulations, it is applicable to experiments based on analytic wargames and constructive simulations. Part II of the GUIDEx describes the following 14 Principles for designing valid experiments:

Principle 1. Defence experiments are uniquely suited to investigate the cause-and-effect relationships underlying capability development.

Principle 2. Designing effective experiments requires an understanding of the logic of experimentation.

Principle 3. Defence experiments should be designed to meet the following four validity requirements (ability to use the new capability, detect a change in effect, isolate the reason for the change, and relate the results to actual operations).

Principle 4. Defence experiments should be integrated into a coherent campaign of activities to maximize their utility.


Principle 5. An iterative process of problem formulation, analysis and experimentation is critical to accumulate knowledge and validity within a campaign.

Principle 6. Campaigns should be designed to integrate all three scientific methods of knowledge generation (studies, observations and experiments).

Principle 7. Multiple methods are necessary within a campaign in order to accumulate validity across the four requirements.

Principle 8. Human variability in defence experimentation requires additional experiment design considerations.

Principle 9. Defence experiments conducted during collective training and operational test and evaluation require additional experiment design considerations.

Principle 10. Appropriate exploitation of M&S is critical to successful experimentation.

Principle 11. An effective experimentation control regime is essential to successful experimentation.

Principle 12. A successful experiment depends upon a comprehensive data analysis and collection plan.

Principle 13. Defence experiment design must consider relevant ethical, environmental, political, multinational, and security issues.

Principle 14. Frequent communication with stakeholders is critical to successful experimentation.

Figure 3-2, from GUIDEx, illustrates the iterative nature of experiment planning and shows how the 14 Principles listed above are integrated through the stages of problem formulation, experiment design, development, execution, analysis and reporting. While the 14 Principles provide valuable context to the activity of developing a defence experiment, their treatment is beyond the level of detail intended for this experiment development framework. Paragraphs 3.4 and 3.5 describe the development of hypotheses and the selection of measurement schemes to support the evaluation of the hypotheses. This framework is primarily concerned with practical guidance on these two fundamental experiment-planning tasks.


Figure 3-2: GUIDEx Experimentation Cycle


3.4 Hypothesis Formulation

A hypothesis is a proposed explanation for some phenomenon. A hypothesis should be testable (falsifiable) and based on prior observations that have not been satisfactorily explained by existing theories. Typically, hypotheses are phrased as ‘all or nothing’ statements: if the hypothesis is not supported by the data, it must be rejected.

As illustrated in Figure 3-1 and Figure 3-2, the transition from problem definition to hypothesis formulation is an iterative process requiring collaboration between the experiment sponsor and the analyst. The importance of properly defining the sponsor's problem cannot be overstated. The experiment purposes, objectives and all subsequent design, development, execution, analysis and reporting should be mapped back to the problem statement agreed with the sponsor. Having developed a comprehensive and shared understanding of the research problem between sponsor and analyst, the following steps need to be considered in the development of a testable research hypothesis:

1. Develop a hypothesis that is both testable and falsifiable, so that the data can disprove it;

2. Identify variables (qualitative/quantitative and independent/dependent) and the ability to measure them. As described in paragraph 2.5.2.1 of the exercise framework, the selection of conditions or variables of the operational environment is a key component of the design and planning process for both exercises and experiments; and

3. Identify measures of merit. Annex A to CFCD 124, RCN Maritime Operational Test and Evaluation Guide provides an excellent overview of various Measures of Effectiveness (MOE), Measures of Merit (MOM), Measures of Performance (MOP), Measures of Suitability (MOS), and Dimensional Parameters (DP) for a wide range of naval applications.

Figure 3-3 and Figure 3-4, taken from GUIDEx, identify two approaches to hypothesis development. In Figure 3-3, an operational problem (requirement) has been identified and a solution is required. The large "2" in the upper right-hand corner of the figure refers to GUIDEx Principle 2 (designing effective experiments requires an understanding of the logic of experimentation).


Figure 3-3: GUIDEx Hypothesis Formulation

Figure 3-3 implies some interplay between the first and second steps above: successful hypothesis generation will depend upon a good understanding of the independent and dependent variables available in the experimental design. This further speaks to the importance of the shared understanding between the analyst and the sponsor, since the sponsor is likely to be the source of much of the analyst's understanding of the problem space.

At the root of a hypothesis, however, the analyst should be able to summarize the investigation in terms of an "…if…then…" statement. The "if" should refer to the solution, the cause, or the independent variable (the terms can be considered interchangeable) and the "then" should refer to the problem to be overcome, the effect, the outcome, or the dependent variable (again, the terms can be considered interchangeable).
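As an illustration only (not drawn from GUIDEx or CF doctrine), the following minimal Python sketch shows one way an analyst might record an "…if…then…" statement with its independent and dependent variables made explicit; the class and the example values, borrowed loosely from the UAV scenario in paragraph 3.6.1, are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """A single '...if...then...' statement with its variables made explicit."""
    independent: str  # the 'if': the solution / cause / independent variable
    dependent: str    # the 'then': the effect / outcome / dependent variable
    direction: str    # expected direction: 'increase', 'decrease', or 'two-sided'

    def statement(self) -> str:
        return f"If {self.independent}, then {self.dependent} will {self.direction}."

# Hypothetical example loosely based on the UAV scenario in paragraph 3.6.1
h = Hypothesis(independent="a UAV is used for targeting",
               dependent="targeting data accuracy",
               direction="increase")
print(h.statement())
```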

Hypotheses should also avoid confounding the outcome by considering several dependent or independent variables in a single hypothesis. Confounding variables are likely to make it impossible to state whether a hypothesis is accepted or rejected, because it will be unclear to which independent variable the effect is due, if it is even possible to identify which dependent variable is affected.

The simplicity of the "…if…then…" statement belies the supporting material; it is only the tip of the iceberg. Nevertheless, reducing the hypothesis to an "…if…then…" statement is a good test of its adequacy.

In the second example, Figure 3-4, a new innovation is available and experiments are conducted to determine the utility of the solution (i.e., a solution looking for a problem).

Figure 3-4: GUIDEx Hypothesis Formulation (Alternate)

Figure 3-4 speaks very specifically to defence experimentation since it refers to capabilities supporting tasks. This model of a hypothesis can be further expanded to include any two-part relationship: a capability to a task, a capability to an MOE, or a component piece to a capability. If leveraging an exercise, an experimental hypothesis is likely to take the form represented in Figure 3-4. However, the reader should note that the same "…if…then…" format applies, and that the same requirements for relevant data collection and falsification also apply in order to confidently accept or reject the hypothesis.

There are two types of hypothesis: the null hypothesis and the alternative hypothesis. The null hypothesis states that there is no relation between the cause and the effect, and the alternative hypothesis states that there is a relation between the cause and the effect. It is expected that most hypotheses created for defence experimentation will be phrased as alternative hypotheses in the interests of securing interest and support from experiment sponsors.

The fact that the alternative hypothesis is not supported does not make the null hypothesis true; the data may be insufficient to support either hypothesis. Even if a hypothesis is supported, the analyst must draw further conclusions concerning the nature of the effect (i.e., whether it is two-sided, having some effect but in an unknown direction, or one-sided, having a positive or negative effect in accordance with the stated relation).
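The distinction between null and alternative hypotheses, and between one-sided and two-sided effects, can be illustrated with a short Python sketch using SciPy (version 1.6 or later for the alternative argument). The targeting-error samples below are fabricated solely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Fabricated targeting-error samples (metres) under two conditions
baseline = rng.normal(25.0, 5.0, size=30)   # e.g., existing sensor
candidate = rng.normal(21.0, 5.0, size=30)  # e.g., proposed capability

# Two-sided test: H0 (null) = no difference in mean error; H1 = some difference
t_two, p_two = stats.ttest_ind(candidate, baseline)
# One-sided test: H1 = the candidate's mean error is lower than the baseline's
t_one, p_one = stats.ttest_ind(candidate, baseline, alternative="less")

print(f"two-sided p = {p_two:.4f}, one-sided p = {p_one:.4f}")
# Failing to reject H0 does not prove H0: the sample may simply be too small.
```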

3.5 Measurement Planning Considerations

Breton (2011) produced a comprehensive report outlining the lessons learned from the Joint Command Decision Support for the 21st Century (JCDS21) Technology Demonstration Project (TDP), in which he defined different measurement opportunities and identified the constraints and opportunities associated with each. As part of his work, Breton developed a set of five guidelines to assist in the application of measures to the different types of experiments, demonstrations, and exercises based on the objectives of each:

Establish the objective of the event (exercise, demonstration, experiment);

Define the concept to be measured (e.g., task or process execution, or understanding the influence of some factor);

Establish the importance of measurement within the event (primary or secondary importance);

Establish the context in which the measurement takes place (field trial, simulator, laboratory); and

Establish the timing of the measurement activity (pre-event, during event, post-event).

With the exception of establishing the objective of the event, which is discussed elsewhere, Breton’s guidelines are described in more detail below.

Applying measurement, particularly during an exercise or demonstration, can be difficult because of the level of realism required and the intrusiveness of subjective measures. Although collecting data during an exercise or demonstration is difficult or may not be feasible, it is still highly desirable. The constraint becomes one of sample size, since sufficient participants are required in order to achieve statistical power. Likewise, some measurement protocols may require repeated measurements under different conditions. This may not be possible in military settings where time and resources are at a premium.
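Where sample size is the binding constraint, a standard power calculation indicates whether the event can support hypothesis testing at all. A minimal sketch using statsmodels, with purely illustrative planning values (medium effect size, 5% significance, 80% power):

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative planning values: Cohen's d = 0.5, alpha = 0.05, power = 0.80
n = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"participants required per group: {n:.1f}")  # roughly 64 per group
```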

3.5.1 Defining the Concept to be Measured

When choosing to measure the performance of a task, it is necessary to reproduce the conditions under which the task is usually performed, possibly restricting this type of measurement to simulators or field trials. However, if a specific process or outcome is to be measured, very controlled conditions may be required, possibly requiring that an experiment in a laboratory setting be pursued to permit specific, systematic, and repeated manipulation of the variables. Thus, the ORT must critically consider its requirements to determine whether an exercise will give it the reliability and validity it requires, or whether the study must be conducted in an experimental context.

3.5.2 Establish the Importance of Measurement within the Event

If the measurement is of primary importance, the composition of the group to be studied becomes the key constraint, with particular issues surrounding participants’ levels of expertise and knowledge, and the availability of desirable participants. If the measurement is of secondary importance, the measurement protocol must be compatible and compliant with the primary military objective, it must be as unobtrusive as possible, and it may not be possible to disturb participants during execution of the event.

3.5.3 Establish the Context of the Event

The difference between simulator investigations, laboratory-based investigations, and field-based investigations closely mirrors other distinctions between types of investigations. A laboratory-based investigation is conducted in a context that is isolated from the outside world, affording a great deal of control over the factors that affect performance. Experiments, demonstrations, and exercises can all take place in a laboratory setting. Laboratory-based investigations are open to criticism that they lack realism and cannot be generalized to operational environments. By contrast, a field-based investigation takes place in a context that exhibits many, if not all, of the factors that can influence performance under realistic conditions. Field-based investigations do not afford such good opportunities to control variables. However, both demonstrations and exercises can take place in a field setting. Field-based investigations are open to criticism that they are too uncontrolled. Simulator investigations can offer the best of both worlds: they afford as much control over independent variables as desired, but can provide sufficient realism to render the findings generalizable to the real world. As with laboratory-based and field-based investigations, simulator settings are amenable to experiments, demonstrations, and exercises, although they can be expensive to set up and run.

3.5.4 Establish the Timing of the Measurement Activity

Measurements before the event occurs can be criticized for providing expectations (inappropriate or otherwise) about the event and thereby modifying performance.


Measurements during the event can also be sufficiently intrusive to modify performance, so the selection of measurement instruments that do not affect performance is critical. Using technology-enabled indirect measurements (such as audio-visual recordings, data recording, inserting control signals into data streams, or Blue force tracking) might somewhat alleviate the impact of measurement on performance. These recordings could be analyzed post event; however, sufficient time for the analysis should be budgeted. Measurements after the event can also take a great deal of extra time, so care must be taken to ensure the event objectives are manageable. Further, measurement data taken after the event may be contaminated by participants' expertise and background, rather than representing their actual performance on the task.

3.6 Template Hypothesis Formulation by Activity Type

The following section contains examples of template hypotheses as well as the notional research question(s) that could precipitate the development of each hypothesis. The training objectives identified in Section 2, as well as interviews with Formation, Fleet and Sea Training staff, aided in the development of the notional scenarios and research questions. Where possible, the independent and dependent variables, any obvious MOPs or MOEs, and the nature of the hypothesis have been identified as well.

3.6.1 Comparison of Two or More Combat Systems/Naval Platforms

Scenario: Two Canadian Surface Combatants will attempt to achieve a time-sensitive, dual-axis, simultaneous time-on-top attack against a land target with land-attack Harpoon missiles. The purpose of this experiment is to compare the abilities of a CP-140 MPRA, a shipboard CH-148 Cyclone helicopter, and a shipborne UAV (three instances of the independent variable) to transmit high-quality and accurate targeting data (MOP) and perform post-engagement BDA (MOP).

Research question: Based on a comparison of the MPRA, the helicopter and the UAV, can the UAV track a target continuously, providing accurate, high-quality streaming video for targeting purposes to the Firing Units at sea, at a closer stand-off range than the MPRA and/or the helicopter?

Hypothesis: If a UAV is used, then targeting data is more accurate (i.e., capability to MOE).
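A sketch of how the three-platform comparison might be analyzed, assuming targeting error in metres is the agreed MOP and using a one-way ANOVA from SciPy; every sample below is a synthetic placeholder, not trial data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic targeting-error samples (metres) for the three platforms
mpra = rng.normal(30.0, 6.0, size=20)
helo = rng.normal(27.0, 6.0, size=20)
uav = rng.normal(22.0, 6.0, size=20)

# One-way ANOVA: H0 = the three platforms deliver equally accurate data
f, p = stats.f_oneway(mpra, helo, uav)
print(f"F = {f:.2f}, p = {p:.4f}")
# A small p suggests at least one platform differs; pairwise follow-up tests
# (with a multiple-comparison correction) would identify which one.
```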

3.6.2 Evaluation of Operational Plans

Scenario: The Maritime Component Commander (MCC) is analyzing the relative combat power of an adversary (Orange) as part of the CF OPP prior to his forces arriving in the JOA. Orange has deployed two diesel-electric submarines (SSK) which are unlocated but may be patrolling in the area where the MCC plans to deploy his surface forces. The MCC has one nuclear submarine (SSN) assigned, but is prepared to request an additional SSN from an adjacent area if it will increase the probability of sanitization to ____% within a fixed number of hours.

Research question: Given the dimensions (independent variable) of the MCC's planned Area of Operations, how much area can be sanitized in _____ hours (dependent variable and MOP)?

Hypothesis: The assignment of a second SSN will be required to sufficiently sanitize the operating area within the given timeframe (i.e., capability to an operational task).
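One simple way to frame the underlying sizing question is Koopman's classical random-search model, in which the probability that a target in area A has been detected after time t is 1 - exp(-nWvt/A) for n searchers of sweep width W and speed v. The sketch below applies that formula with entirely hypothetical numbers; an actual study would use validated ASW search models:

```python
import math

def sanitize_probability(area_nm2: float, sweep_width_nm: float,
                         speed_kts: float, hours: float, n_subs: int) -> float:
    """Koopman random-search estimate of the probability that a target
    in the area has been detected after the given search effort."""
    swept = n_subs * sweep_width_nm * speed_kts * hours  # area swept, nm^2
    return 1.0 - math.exp(-swept / area_nm2)

# Hypothetical inputs: 10,000 nm^2 area, 8 nm sweep width, 12 kts, 48 hours
for n in (1, 2):
    p = sanitize_probability(10_000, 8, 12, 48, n)
    print(f"{n} SSN: P(sanitized) = {p:.0%}")  # ~37% with one, ~60% with two
```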

3.6.3 Evaluation of Tactical Plans

Scenario: The TG Commander is conducting counter-drug operations in the Joint Interagency Task Force-South (JIATF-S) JOA. Surface forces with the requisite speed to intercept drug runners are in short supply; however, with sufficient warning time and persistent ISR, armed USCG helicopters can be vectored from ashore to apply the force necessary to stop the drug runners and allow the TG ships to perform the interdiction before the drug runners reach their destination. The TG Commander needs to follow a fixed cycle to send his/her ships in and out of Key West for refuelling and crew rest; however, the available time on station could be optimized by positioning closer to Key West, rather than further out to sea, provided that there is sufficient ISR and time available to call in the USCG helicopters.

Research question: What decrease in USCG helicopter ground alert time (independent variable) will permit the TG Commander to reduce the range to _______ from Key West and still achieve a ______% intercept rate (MOP and dependent variable)?

Hypothesis: If the ground alert of USCG helicopters is reduced from ____ minutes to ____ minutes, then this will permit the TG Commander to reduce the range from Key West at which ships will be deployed to a fixed distance of ____ and still achieve a fixed ____% intercept rate (i.e., capability to an MOE).
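A crude feasibility sketch of the trade-off, treating each intercept as a race between the drug runner's remaining transit time and the helicopter's alert-plus-flight time; all distances, speeds and alert times below are hypothetical:

```python
def intercept_feasible(detect_range_nm: float, runner_speed_kts: float,
                       helo_dist_nm: float, helo_speed_kts: float,
                       ground_alert_min: float) -> bool:
    """Can the helicopter reach the runner's track before the runner
    covers its remaining distance? (Straight-line approximation.)"""
    runner_time_h = detect_range_nm / runner_speed_kts
    helo_time_h = ground_alert_min / 60.0 + helo_dist_nm / helo_speed_kts
    return helo_time_h <= runner_time_h

# Hypothetical: runner detected 60 nm from its destination at 35 kts;
# helicopter 120 nm away with a 140 kt transit speed
for alert_min in (60, 30, 15):
    ok = intercept_feasible(60, 35, 120, 140, alert_min)
    print(f"ground alert {alert_min} min -> intercept feasible: {ok}")
```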

3.6.4 Development and Assessment of a Concept of Operations

Scenario: In planning for a potential catastrophic earthquake, JTFP has developed a CONOPS to provide alternate C4I services. The CONOPS has two separate branch plans: exercising C4I from at sea onboard an HMC Ship, or from a mobile van with limited C4I capabilities. The ship has the advantage of greater C4I connectivity with Higher HQ in Ottawa, but the van has the advantage of being collocated with Provincial authorities. The Provincial authorities will be making assessments and coordinating the overall response, but due to poor communications infrastructure, communications in and out of the Provincial Emergency Coordination Centre are extremely poor for the first 24 hours. In the critical first 24 hours following the earthquake, Commander JTFP needs to be able to coordinate the national CAF response as well as liaise with Provincial authorities regarding the response.


Research question: In the first 24 hours after a catastrophic earthquake, should Commander JTFP establish the alternate HQ onboard a ship (independent variable), where he/she can receive multiple information feeds (MOP) and satisfy CJOC CCIRs (MOP), or establish the alternate HQ with the Province, where he/she will have better "ground truth" immediately at hand (MOP) but relatively scanty information from across the entire affected area (MOP)?

Hypothesis: If Commander JTFP establishes the alternate HQ onboard a ship for the first 24 hours following a catastrophic earthquake, then he/she will be able to respond to more CJOC CCIRs than if co-located with the Provincial headquarters with limited C4I (i.e., capability to MOE).
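If each CPX run poses a fixed set of CCIRs, the two HQ options can be compared as proportions of CCIRs satisfied. A sketch using a two-proportion z-test from statsmodels, with fabricated counts for illustration:

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Fabricated CPX results: CCIRs satisfied in the first 24 hours per option
satisfied = np.array([18, 11])  # ship-based HQ, province-co-located HQ
posed = np.array([24, 24])      # CCIRs posed in each run

# One-sided test: H1 = the ship-based HQ satisfies a larger proportion
z, p = proportions_ztest(satisfied, posed, alternative="larger")
print(f"z = {z:.2f}, p = {p:.4f}")
```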

3.6.5 Development and Assessment of a Concept of Employment for a Particular Capability

Scenario: An HMC Ship has embarked a Cyclone helicopter and a UAV. Both platforms are capable of supporting ISR; however, there is a requirement to determine the relative wind envelope for the safe and emergency launch and recovery of the UAV, compared to the Cyclone, as part of the process to assess the concept of employment for the UAV.

Research question: What are the maximum relative wind envelopes for the emergency launch and recovery of the UAV (dependent variable), if the risk of loss during an emergency launch or recovery (MOP) is fixed at ____% (independent variable)?

Hypothesis: If the wind exceeds ______ kts, then the risk of loss during emergency launch will exceed ______% (i.e., cause-and-effect relationship).
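One plausible way to estimate such a wind limit from launch-and-recovery trials is a logistic regression of loss outcomes against relative wind. The sketch below uses statsmodels with fabricated trial outcomes; the fitted threshold is illustrative only:

```python
import numpy as np
import statsmodels.api as sm

# Fabricated trial data: relative wind (kts) and loss (1) / no-loss (0)
wind = np.array([10, 12, 15, 18, 20, 22, 25, 28, 30, 32, 35, 38])
loss = np.array([0,  0,  0,  0,  0,  1,  0,  1,  1,  1,  1,  1])

# Fit P(loss) = logistic(b0 + b1 * wind)
model = sm.Logit(loss, sm.add_constant(wind)).fit(disp=False)
b0, b1 = model.params

# Wind speed at which the fitted risk of loss crosses a fixed 10% threshold
risk = 0.10
limit_kts = (np.log(risk / (1 - risk)) - b0) / b1
print(f"estimated wind limit for {risk:.0%} loss risk: {limit_kts:.1f} kts")
```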

3.6.6 Impact of an Increase or Decrease in a Particular Capability

Scenario: The sponsor wants to optimize its firing doctrine for the employment of the Main Gun Armament of the Canadian Surface Combatant in a Direct Fire role during a Naval Surface Fire Support mission and increase P(k) on the second corrected round, rather than the third round.

Research question: At a fixed distance of _____ from the target (independent variable), and assuming the target can be tracked from the ship (independent variable), what is the likelihood (dependent variable) of achieving a P(k) of ____% (MOP) on the second corrected high-explosive round fired rather than the third?

Hypothesis: If the target can be tracked from the ship at a fixed range of _____ from the target during a Direct Fire mission (independent variables), then the Main Gun Armament will achieve a P(k) of ____% on the second corrected high-explosive round fired (i.e., capability to operational task).
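Assuming the firing trials yield a series of independent second-round engagements, the fixed P(k) requirement can be tested with a one-sided binomial test. A sketch using SciPy (1.7 or later for binomtest), with fabricated hit counts:

```python
from scipy.stats import binomtest

# Fabricated trial results: "kills" on the second corrected round in n shoots
hits, n = 17, 25
required_pk = 0.50  # the fixed P(k) the revised firing doctrine must achieve

# One-sided test: H0 = P(k) <= 0.50, H1 = P(k) > 0.50
result = binomtest(hits, n, p=required_pk, alternative="greater")
print(f"observed P(k) = {hits / n:.2f}, p-value = {result.pvalue:.4f}")
```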


3.6.7 Identification of Requirements Sets for a Particular Operation or a Type of Operations

Scenario: A key part of the CF OPP is to analyze and deconstruct both friendly and enemy centres of gravity and determine critical capabilities and critical requirements. By performing analysis at this level of detail, planners can determine which critical requirements are deficient or vulnerable to attack. Examples of critical requirements are the essential conditions, resources, and means for an integrated air defence system (a critical capability) that is protecting the operational centre of gravity. In this scenario, the critical requirements are the individual radars, surface-to-air missiles, communications and headquarters making up the integrated air defence system. In making his/her plan, the JFMCC would likely consider the following factors in determining his/her own requirements to defeat the enemy: Command and Control, Intelligence, Sustainment, Force Protection and Fires.

Research question: Which of the individual radars, surface-to-air missiles, communications and headquarters infrastructure that make up the enemy's integrated air defence system (independent variable) within an area defined by _____ (independent variable) require persistent ISR (independent variable) during the initial phases of conflict until air supremacy (MOP and dependent variable) is achieved?

Hypothesis: If the JFMCC participates in the CFACC’s OPP for developing the integrated air defence plan, then the resulting plan will increase the defence of the naval force (i.e., capability to MOE).

3.6.8 Identification of Gaps in a Particular Capability Set

Scenario: The new government has announced that it is undertaking a defence policy review and has opened the door to fast-tracking the replacement of the out-of-service Protecteur-class replenishment oilers. The RCN is already in the process of converting the former MV Asterix into a supply ship as an interim replacement under Project Resolve. One of the options under consideration is the immediate purchase of 2 x Amphibious Assault Ships (helicopter carriers). The RCN has initiated a project to conduct a gap analysis across the domains of doctrine, organization, training, materiel, leadership and education, personnel, facilities and interoperability (DOTMLPFI) relating to the potential acquisition of these ships. One of the principal concerns is that Esquimalt may not have sufficient water depth and facilities to berth even a single Amphibious Assault Ship. Vancouver cannot accommodate the vessel full time, but can support the loadout of equipment and personnel for mission-specific purposes. If the ships were berthed and force generated from other ports on the West Coast, such as Prince Rupert or Kitimat, this would have significant DOTMLPFI implications which are not yet fully understood.

Research question: What are the DOTMLPFI implications (MOPs) of berthing (and force generating) an Amphibious Assault Ship in the ports of Prince Rupert or Kitimat versus Esquimalt (three instances of the independent variable)? Assuming that both Kitimat and Prince Rupert can berth the ship, the aim of the study would be to determine the DOTMLPFI gaps in being able to have the ship rapidly force generated (independent variable) from either Prince Rupert or Kitimat and repositioned in Vancouver for mission loadout before proceeding on a mission (dependent variable). Note: this particular scenario would be part of a much more extensive examination of MOE and MOP, but is provided for illustration purposes only.

Hypothesis: If the Amphibious Assault Ship is berthed in Prince Rupert, then it can be force generated and pre-positioned to Vancouver for mission loadout _____ hours faster than if it was force generated from Kitimat (i.e., capability to operational task).


4 COMMON CONSIDERATIONS FOR EXERCISES AND EXPERIMENTS

The following paragraphs describe some key considerations that apply regardless of whether the event being targeted is an exercise or an experiment.

4.1 Constraints in the C2 Measurement Event Type

Each of the event types described above brings different constraints. Constraints for the most common of these events are presented in Table 4-1.

Table 4-1: Constraints Mapped to Different Event Types

Event Type / Control over Event / Space for Observers / Access to Participants / Interruptability / Access for Data Recording
Experiment / High / High / High / High / High
Demonstration / Low / Low / Negotiable / Low / Possibly High
Exercise / Negligible / Negotiable / Negotiable / Negligible / Possibly High

An experiment offers the fewest constraints to the investigator. The constraints associated with the event type, however, must be mapped to the demands of the different measurement types (paragraphs 4.4 and 4.5). The reader should note that Human Research Ethics Committee (HREC) review may be required irrespective of the event type.

Measuring the effectiveness of a military organisation is a challenging task due to the subjectivity of the definition of 'success'. In many cases, the evaluation may be a measure of one's perception of the military commander's intent, how it aligns with the stated event objectives, and how it is being achieved. Therefore, the most complete evaluation of RCN effectiveness may be a combination of quantitative and qualitative measures, where quantitative measurement is undertaken only for variables for which quantitative data are available, and qualitative measurement is undertaken for other, less defined, variables.

4.2 Independent Variable

Once the type of event has been established, the investigative team should define their objectives and identify the independent variable. The independent variable is the variable that will be manipulated by the investigating team to generate some effects on other variables. The independent variable should be declared in advance when the measurement event is being planned. The investigating researcher will provide procedures, systems, and instructions to be followed by participants. Any changes to these variables are made deliberately by the investigating team to create an impact on the dependent variables.
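In practice, declaring the independent variables in advance often amounts to enumerating the controlled conditions to be run. A minimal sketch, reusing the hypothetical platform and stand-off-range variables from paragraph 3.6.1:

```python
from itertools import product

# Two independent variables the investigating team manipulates deliberately
platforms = ["MPRA", "helicopter", "UAV"]  # hypothetical IV 1
ranges_nm = [10, 20, 40]                   # hypothetical IV 2 (stand-off range)

# Full factorial enumeration: each row is one controlled condition to run
trials = [{"trial": i, "platform": p, "range_nm": r}
          for i, (p, r) in enumerate(product(platforms, ranges_nm), start=1)]
for trial in trials:
    print(trial)
```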


The independent variable may also be represented by a probe stimulus or a control point within the experiment or exercise. Such probe stimuli are deliberate attempts to provoke the activity of interest in the individual, team, organisation or system under investigation and measure its performance. Likewise, a control point is a decision point for which all possible decisions and outcomes have been developed and identified, and can be used by the ORT to evaluate performance.

4.3 Dependent Variable

The dependent variable is the variable that is affected by a change to the independent variable. Experimentally, the effect of the independent variable is established through measurement of the dependent variable(s). As with the independent variable(s), the dependent variable(s) should be declared in advance when the measurement event is being planned. Both MOPs and MOEs are considered dependent variables, and all of the different types of measurement approaches (see Section 4.4 below) address dependent variables. For example, in an experiment considering RCN capability, any change to the system intended to improve a capability would be considered the independent variable. The downstream impact of this change on dependent variables would be the proof of a positive or negative effect on that capability.

It is often the case that there is no MOP or MOE that directly measures the quality under investigation. As a consequence, there will often be a requirement to consider the dependent variables as indicators of that quality. Each dependent variable, whether an operational performance measure or a human performance measure, needs to be considered as one part of a coherent story. There must be converging evidence (from other measures) that indicates the same effect. The identification of convergent measures will result from task and scenario analysis to identify measurement opportunities and acceptable (or optimum) performance parameters.

4.4 Measures of Performance and Effectiveness

The literature concerning evaluation in C2 (e.g., GUIDEx, C2 Measures of Effectiveness Handbook) contends that measurement is made on at least two levels: a MOP and/or a MOE. MOEs and MOPs are defined below.

4.4.1 Measures of Effectiveness

An MOE is concerned with assessing a system's ability to achieve its goal; that is, its impact on the operational environment. In this respect, an MOE tends to be more complex than a MOP and may include two or more MOPs in its determination. An MOE focuses on a holistic assessment of how well a system performs overall and can be considered an overarching measurement that comprises more than one MOP (adapted from Roedler & Jones, 2005; and the Test and Evaluation Management Guide, 2005).


4.4.2 Measures of Performance

A MOP is a quantifiable measurement that can take any number of forms, such as a simple count of something, a physical measurement, an average, a rate, a percentage, etc. A MOP may be stated as a declarative statement and indicates the system's achieved level of performance. A MOP does not provide an assessment of the overall impact of the measurement attribute on the goal of the system. A MOP focuses on the absolute measurement of unidimensional qualities (adapted from Roedler & Jones, 2005; Test and Evaluation Management Guide, 2005). MOEs and MOPs are also scalable: what may be an MOE for one hypothesis or objective may be a MOP for a different hypothesis or objective. Thus it is possible that MOPs may also be used as MOEs. This should be borne in mind when developing the measurement framework. Note that several commentators advocate a hierarchical arrangement of measures proceeding from MOPs at the lowest level, to Measures of C2 Effectiveness (MOCE), Measures of Force Effectiveness (MOFE), and Measures of Policy Effectiveness (MOPE) (see Code of Best Practice for Experimentation; C2 Measures of Effectiveness Handbook).
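As an illustration of this scalability, the sketch below rolls several MOPs up into a single MOE score by normalising each MOP against a target and combining them with weights; the MOP names, targets and weights are invented for the example, and a real scheme would be negotiated with the sponsor:

```python
# Invented MOPs: each has an observed value, a target, and a direction
mops = {
    "targeting_error_m":    {"value": 18.0, "target": 25.0, "lower_is_better": True},
    "time_to_first_fix_s":  {"value": 40.0, "target": 60.0, "lower_is_better": True},
    "track_continuity_pct": {"value": 92.0, "target": 95.0, "lower_is_better": False},
}
weights = {"targeting_error_m": 0.5, "time_to_first_fix_s": 0.2,
           "track_continuity_pct": 0.3}  # sponsor-agreed, illustrative only

def score(mop: dict) -> float:
    """Normalise a MOP to [0, 1] against its target (1.0 = target met)."""
    ratio = (mop["target"] / mop["value"] if mop["lower_is_better"]
             else mop["value"] / mop["target"])
    return min(ratio, 1.0)  # cap so one strong MOP cannot mask a weak one

moe = sum(weights[name] * score(mop) for name, mop in mops.items())
print(f"aggregate MOE score: {moe:.2f}")  # 1.0 means every target was met
```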

4.5 Measurement Approaches

Breton (2011) lists four main types of measurement approach: Interviews, Questionnaires, Observations (all subjective measures), and Data Recording (i.e., objective measures). When supporting defence exercises or experimentation the measurement approaches should be selected from these four types. It is important to note that any or all of these approaches may require the data collection plan to be reviewed by the HREC if there is any interaction with human subjects (with some limited exceptions).

4.5.1 Subjective Measures

Interviews, questionnaires and observations are collectively referred to as subjective (qualitative) measures. These are measures which are open to influence by the investigator's and the respondent's own feelings and objectives for the exercise. Interviews, questionnaires and observations also share the need for significant work a priori in order to identify the topics or phenomena about which data must be collected. This can be overcome somewhat by using qualified SME observers, although this will often introduce bias into the data (by using several observers, however, the data can be correlated). Subjective measures can also be constructed to have greater reliability and validity by using behaviourally-anchored rating scales or other readily observable indicators of levels of performance, opinion, etc.

These three approaches are also intrusive in that they need additional participant time (for interviews and questionnaires) and require the presence of one or more observers in the area in which tasks are being carried out. In light of these constraints, for any defence event the investigator should attempt to use objective measures to the maximum extent possible, since they can be constructed to avoid and overcome many of the event and implementation constraints that have been discussed to this point.
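
As an illustration of how a behaviourally-anchored rating scale tightens subjective measurement, the sketch below (Python) defines a hypothetical five-point scale for bridge-team communication. The dimension and anchor wordings are invented for illustration and are not drawn from RCN doctrine.

# Minimal sketch of a behaviourally-anchored rating scale (BARS).
# The dimension and anchors below are invented for illustration.
BRIDGE_COMMS_BARS = {
    1: "Orders frequently not acknowledged; closed-loop communication absent.",
    2: "Acknowledgements inconsistent; several orders repeated before action.",
    3: "Most orders acknowledged; occasional readback errors corrected late.",
    4: "Orders acknowledged with correct readback; rare errors fixed promptly.",
    5: "All orders acknowledged with correct readback; proactive clarification.",
}

def anchor_for(level: int) -> str:
    # Return the observable behaviour an observer compares against.
    return BRIDGE_COMMS_BARS[level]

observed_rating = 4  # the anchor that best matches what the observer saw
print(f"Rating {observed_rating}: {anchor_for(observed_rating)}")

Because each point on the scale is anchored to observable behaviour rather than opinion, independent observers should converge on similar ratings, which is the reliability gain described above.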

4.5.1.1 Interviews

Interviews are a powerful tool for experiments and exercises. Interviews can be used for any situation since there is no limit to the question set that can be developed. Interviews can be structured, semi-structured, or open, depending upon how ‘conversational’ the interview is and how much deviation from a focused subject area the interviewer will tolerate. A structured interview will closely follow a set of interview questions (e.g. script), while an open interview will follow a natural conversational flow. Open interviews are very flexible and can result in unexpected insights.

Interviews should be conducted by trained interviewers who understand how to avoid inserting their own biases into the interview. Some negotiation may be required to secure access to participants to be involved in the interviews.

Interviews require one-on-one or one-to-many access to participants and task execution that allows self-evaluation. Interviews are not intrusive if conducted before or after the event and can provide rich data. However, interviews may also provide biased data, are time- and resource-consuming (as is the analysis of interview data), and can vary according to the understanding of the participant, the quality of the interviewer, and the rapport between the participant and the interviewer. Interviews are best suited for prototyping, demonstrations in field trials and simulators (before and after the event), exercises in field trials, and exercises in simulators (in that order).

4.5.1.2 Questionnaires and Structured Worksheets

Questionnaires and structured worksheets are also a flexible method of investigating a subject. They have the advantage of being able to collect large amounts of data from a large subject sample. Questionnaires and worksheets can be presented in several formats: multiple choice, rating scales, paired associates, ranking, open-ended questions, closed (yes/no) questions, and filter questions.

Questionnaires and worksheets may require that participants self-evaluate their own performance, possibly reflecting their feelings about their performance or the concept being measured, rather than the performance or concept itself. Questionnaires must provide clear instructions; otherwise it is unlikely that the data will be valid. Questionnaires and worksheets should not be used for measurement of cognitive processes unless participants have received instruction regarding that cognitive process and their comprehension has been established. Some negotiation may be required to secure access to participants to complete questionnaires.

Questionnaires and worksheets require standardized and validated questions, access to participants, task execution that allows self-evaluation, and administration periods (before, during, or after the event). Questionnaires and worksheets are not intrusive if administered before or after the event, can accommodate several participants concurrently, and support rapid data collection and analysis. However, questionnaire and worksheet responses may be biased, and may vary according to a participant's understanding of the questions being asked. Questionnaires and worksheets are best suited to prototyping, demonstrations in field trials and simulators (before and after the event), exercise events in field trials, and exercise events in simulators (in that order).

4.5.1.3 Observations

Observations are a potentially rich source of data, but depend upon observable phenomena. Drury (1990) suggests five types of information that can be collected through observation: sequence of activities, duration of activities, frequency of activities, fraction/duration of time spent in states, and spatial movement.
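
A minimal sketch (Python) of how three of Drury's information types (duration of activities, frequency of activities, and fraction of time spent in states) might be computed from a timestamped observation log. The log format and the activity states are assumptions made for illustration.

# Minimal sketch: duration, frequency, and fraction-of-time-in-state from a
# timestamped observation log. The log format is an illustrative assumption.
from collections import defaultdict

# (start_s, end_s, state) episodes recorded by an observer over 600 seconds.
log = [(0, 120, "monitoring"), (120, 180, "communicating"),
       (180, 420, "monitoring"), (420, 600, "plotting")]

durations = defaultdict(float)  # total seconds spent in each state
frequency = defaultdict(int)    # number of episodes of each state
for start, end, state in log:
    durations[state] += end - start
    frequency[state] += 1

total = sum(durations.values())
for state in durations:
    print(f"{state}: {durations[state]:.0f} s over {frequency[state]} "
          f"episode(s), {durations[state] / total:.0%} of observed time")

The sequence of activities is simply the order of episodes in the log; spatial movement would require the observer to record position as well.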

As with interviews, trained observers should be used to collect observational data, and several observers should be used in order to obtain valid data. However, observations cannot be used to study cognitive processes that have no external manifestation. Some negotiation may be required to secure access to the event for sufficient numbers of observers to obtain valid data, and to locate the observers in the optimum positions. An optimum number of observers (based on the nature of the task being observed, for example, fast tempo or slow tempo) should be determined before this negotiation. It may also be necessary to arrange special permission for observers to remain in place during special evolutions such as a Man Overboard and the subsequent Verification Muster.

Observations require trained observers or Subject Matter Experts (SMEs), and depend upon observable (physical or verbal) behaviours that have been identified beforehand. An optimal ratio between observers and participants is required, dependent upon the nature of the task. Observations are not necessarily intrusive, although as the number of observers increases, greater intrusion may result. Observation may also cause the actions and performance of the participant being observed to change (i.e., the observer effect, often referred to as the Heisenberg Principle). Recording the experiment with audio and video recording devices may significantly reduce any intrusion perceived by the participant. Unobtrusive audio-visual recording systems exist to track and automatically record participants' physical actions and their areas of focus (i.e., eye movement tracking). Such recording is accompanied by significant post-event analysis implications, with rules of thumb indicating 10 hours of analysis for every 1 hour of recording. As with questionnaires (Section 4.5.1.2), the analysis effort can be facilitated and standardized using predefined indicators and measurement criteria. The ability to record events might be limited by operational security (OPSEC) concerns.

Observation may be an alternative if participant accessibility is low. Observations provide a direct evaluation of a given behaviour within overall task performance. Observations, however, are based on the evaluation of a third person, may require a ratio of observers to participants that is too high (and thus intrusive), and require observers to be as close as possible to participant activities in order to record the best data. Observation is best suited to exercise events in field trials, exercise events in simulators, and demonstration events in field trials or simulators (in that order).

4.5.2 Data Recording (Objective Measures)

Data recording (objective or quantitative measures) is often reserved for technical systems. In the field of human performance, there are few measures that can be considered objective (i.e., measures that are recorded automatically and represent an immutable and physically-measurable characteristic of the entity of interest). The measures of most interest are generally subjective since they gather data relating to an individual's conscious mental state (e.g., mental workload or situation awareness). Nevertheless, it may be possible to associate mental states with recordable phenomena, such as eye gaze and blinks, key strokes, mouse clicks, numbers of communications, work output, etc. Where it is a viable measurement approach, data recording can produce large amounts of data that are less susceptible to bias introduced by human participants or observers, is less obtrusive to participants, and requires less concurrent effort on the part of the observer team to collect (although significant effort is still required a priori to know what to measure, and post-collection for analysis). Objective measures may require rigorous experimental protocols in order to maintain reliability and validity and to gather sufficient data regarding the different experimental conditions. They must also be compatible with the technical environment in which the investigation is taking place.

The implementation of objective measures generally requires technological support, as does data analysis (e.g., statistics programs, spreadsheets). However, objective measures can be taken without interacting directly with participants, who may not even be aware of the data collection process. Thus, the data is unbiased by participants' own evaluations, and a very large set of data can be collected and analysed rapidly. Objective data collection generally requires a minimum number of participants for statistical purposes, and requires that its technological set-up be included in the participants' environment. Objective measures are best suited for the study of the role of cognitive processes, prototyping in simulator environments, demonstrations in simulators, and exercise events in simulators (in that order).
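
The sketch below (Python) illustrates how an unobtrusive event log might be reduced to candidate objective MOPs such as communication counts and work-output rate. The log schema and field names are assumptions for illustration only; as noted above, such counts are only proxies for mental states, and their interpretation still depends on a priori task analysis.

# Minimal sketch: reducing an automatically recorded event log to candidate
# objective MOPs. The schema and field names are illustrative assumptions.
events = [
    {"t": 10.2, "type": "comms", "channel": "net1"},
    {"t": 33.0, "type": "keystroke"},
    {"t": 41.5, "type": "comms", "channel": "net2"},
    {"t": 55.1, "type": "track_update"},
    {"t": 71.8, "type": "comms", "channel": "net1"},
]
window_s = 90.0  # length of the recorded window, in seconds

comms_count = sum(1 for e in events if e["type"] == "comms")
updates_per_min = 60.0 * sum(1 for e in events
                             if e["type"] == "track_update") / window_s

print(f"Communications in window: {comms_count}")          # 3
print(f"Track updates per minute: {updates_per_min:.2f}")  # 0.67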

4.6 Reliability and Validity

Reliability and validity are two key concepts associated with measurement that must be considered when planning an experiment or an exercise. Without reliability or validity it is not possible to generalise from the results of an experiment or exercise to the operational context.

Reliability refers to the consistency of data provided by a measure when the measure is applied repeatedly in similar conditions. There are several types of reliability, not all of which will apply to every event:

Test-Retest Reliability: assesses the degree to which test scores provided by a single participant on a single version of the test remain stable from one application to the next.

Inter-Rater Reliability: assesses the degree to which the ratings provided by different participants concerning the same quality, event, etc. agree with each other.

Internal Consistency: assesses the degree to which items within a test which purport to measure the same quality provide similar results.

Reliability is generally measured through statistical correlation. To permit statistical correlations to be calculated from an experiment or event, analysts must ensure that they account for all possible sources of variance and set up their analysis routines to consider reliability for those data points that are expected to be reliable.
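
As a concrete illustration of reliability expressed as correlation, the following sketch (Python with numpy) computes test-retest reliability as a Pearson correlation and internal consistency as Cronbach's alpha. All scores are fabricated for illustration.

# Minimal sketch: test-retest reliability (Pearson r) and internal
# consistency (Cronbach's alpha). All scores are fabricated.
import numpy as np

# Test-retest: the same participants scored on two administrations.
test1 = np.array([72.0, 85.0, 64.0, 90.0, 78.0])
test2 = np.array([70.0, 88.0, 61.0, 93.0, 75.0])
print(f"Test-retest Pearson r = {np.corrcoef(test1, test2)[0, 1]:.3f}")

# Internal consistency: rows are participants, columns are items that
# purport to measure the same quality.
items = np.array([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [4, 4, 5]],
                 dtype=float)
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                         / items.sum(axis=1).var(ddof=1))
print(f"Cronbach's alpha = {alpha:.3f}")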

Validity refers to the degree to which a measurement approach quantifies the quality that it purports to measure. In many cases, validity refers to the accuracy of the measurement in that the results must correspond, and be sensitive, to the real world. As with reliability, there are several types of validity, not all of which will apply to every event:

Test Validity: the degree to which a test accurately measures what it is supposed to measure. Test validity includes accuracy, construct validity, content (or face) validity, and criterion validity (consisting of concurrent and predictive validity). The sub-types of test validity are not described in detail in this report.

Experimental Validity: refers to the adequacy of the design of an experimental study. Without a valid experimental design, valid scientific conclusions cannot be made. Experimental validity includes statistical conclusion validity, internal validity and external (ecological) validity. Again, these sub-types of experimental validity are not described in detail in this report.

Diagnostic Validity: refers to the degree to which a condition or outcome can be predicted from measurements or observations. In many senses diagnostic validity is a type of predictive validity.

When planning an event, whether an experiment or participation in an exercise, the ORT must be conscious of the need for reliability and validity and design the study to ensure that different treatment conditions (i.e., the independent variable) are limited to changes in the expected variable. If variables change without knowledge or control, the results of the study become confounded and cannot be generalised to operations.

4.7 Direct versus Indirect Measures

When measuring performance and effectiveness, measurements can be direct or indirect. A direct measure is one that quantifies or describes the characteristic of interest without being translated, inferred, or otherwise transformed from the original raw observation or data point. An indirect measure is one that comprises several direct measures and/or is inferred from a combination of data, from missing data, or by transforming or extrapolating from the data.
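
A short sketch (Python) contrasting the two: the direct measure is read straight from the raw data, while the indirect measure is inferred by combining a direct measure with scenario ground truth and transforming the result. The quantities are illustrative assumptions.

# Minimal sketch: a direct measure versus an indirect measure derived from
# it. All values are illustrative assumptions.
contacts_held = 9          # direct: raw count taken from the operations log
contacts_in_scenario = 12  # known from scenario ground truth

# Indirect: picture completeness, inferred by combining the direct count
# with ground truth and expressing the result as a percentage.
picture_completeness_pct = 100.0 * contacts_held / contacts_in_scenario

print(f"Direct:   contacts held        = {contacts_held}")
print(f"Indirect: picture completeness = {picture_completeness_pct:.0f} %")  # 75 %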

4.8 Online vs. Offline Measures

Online measures are taken while the participant is carrying out the task of interest. These may include data recording and observation, although they can also include some interview techniques such as a “talk-aloud” protocol or periodic verbal probe questions such as asking about the presence or absence of specific entities in the task environment. Offline measures are those that are made before the participant begins the task, during breaks from the task, or after the task has been completed. Both approaches can be intrusive.

4.9 Operational Performance vs. Human Performance Measures

MOPs and MOEs, whether subjective or objective, direct or indirect, online or offline, and whether collected through interviews, questionnaires, observation, or data recording, can address two topics: Operational Performance or Human Performance. Operational performance measurement concerns parameters that have to do with the execution of the task being performed, and may focus on processes that are carried out or states that are reached. Similarly, human performance measurement may also focus on processes or states, but concerns human cognitive and physical parameters. The ORT is unlikely to assess human performance and is more likely to assess command or command team performance.

4.9.1 Operational Performance Measures

The purpose behind evaluation is to improve overall mission performance. Assessing mission performance requires the consideration of measures associated with the execution of the mission: operational performance measures.

Traditionally, assessment of operational performance has been carried out through comparison with authoritative doctrine and through judgement by observers/controllers and senior mentors. However, reliance on these approaches presupposes that a quality product results from the C2 process and organization, or that the assessor has infallible insight into the necessary C2 processes and outcomes. A better understanding of the link between a process and an outcome (state) is necessary to fully assess C2 operational performance. MOPs to address this understanding will include communication, coordination, and information sharing behaviours, as well as the quality of products, accuracy in execution of complementary tasks, and violations (of intent, boundaries, role, etc.). Within operational performance measures, most MOEs may provide indications of the adequacy of a specific aspect of performance, but will otherwise be confounded with other aspects of C2 performance. For instance, "achievement of the mission objectives" as an MOE will depend upon the adequacy of C2 integration, the adequacy of planning, the accuracy of execution, environmental conditions, and a host of other factors not necessarily related to the dependent variable of primary interest.

While subjective measures can cover almost any area of interest through direct investigation or indirect query methods, objective measures for operational performance tend to include the following:

Time-based metrics (a well-integrated C2 organization may exhibit some reductions in the time required):

Time taken to react to an event (time to notice, process and act upon new information);

Time to perform a task (time to make a decision);

Time horizon for predictive analysis and plans; and,

Rate of performing tasks (tempo).

Accuracy-based metrics (a well-integrated C2 organization may be more precise, reliable, and knowledgeable, producing better quality products with fewer errors; a computational sketch follows these lists):

Precision of the observed system(s) performance;

Reliability of the observed system(s) performance;

Completeness (known unknowns, unknown unknowns);

Errors (alpha, beta, omission, transposition, severity); and,

Quality of information produced.
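
Following from the lists above, this minimal sketch (Python) derives two time-based metrics (time taken to react and task tempo) and one accuracy-based metric (error count) from a hypothetical event log. The log, its field names, and the observation window are assumptions for illustration.

# Minimal sketch: time-based and accuracy-based operational performance
# metrics from a hypothetical event log. All data are illustrative.
events = [
    {"task": "contact_report", "onset": 100.0, "action": 112.0, "errors": 0},
    {"task": "contact_report", "onset": 300.0, "action": 324.0, "errors": 1},
    {"task": "contact_report", "onset": 500.0, "action": 509.0, "errors": 0},
]
window_s = 600.0  # length of the observed window, in seconds

# Time-based: time taken to react to each event, and rate of performing tasks.
reaction_times = [e["action"] - e["onset"] for e in events]
mean_reaction_s = sum(reaction_times) / len(reaction_times)
tempo_per_min = 60.0 * len(events) / window_s

# Accuracy-based: total errors committed across the window.
total_errors = sum(e["errors"] for e in events)

print(f"Mean time to react: {mean_reaction_s:.1f} s")  # 15.0 s
print(f"Task tempo: {tempo_per_min:.1f} tasks/min")    # 0.3 tasks/min
print(f"Errors: {total_errors}")                       # 1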

Ideally, a task analysis of the behaviour(s) of interest will identify the specific measurement points that will allow evaluation of the paradigm. This task analysis should also include consideration of the scenario to be exercised, in order that “ground truth” is known and can be used to determine the relative success of performance. It may be useful, when planning the exercise or experiment, to develop a table similar to Table 4-2 for each hypothesis and/or objective. Note that Table 4-2 is deliberately blank; MOPs and MOEs can be developed to suit any measurement approach.

Table 4-2: Operational Performance Measures

Measurement Approach               | Measures of Performance | Measures of Effectiveness
-----------------------------------|-------------------------|--------------------------
Interview                          |                         |
Questionnaire                      |                         |
Observation                        |                         |
Data Recording (Objective Measure) |                         |

If subjective measures of operational performance are used, SMEs need to be engaged to assess the outcomes, whether in advance of the event or after the event has taken place (through assessment of products).

4.9.2 Human Performance Measures

In most systems, the most critical impediment to overall system performance is the human operator. The evaluation of a defence exercise or experiment must therefore involve consideration of the human components of the system. Without understanding the effect on the human operator, changes may have the inadvertent effect of placing stress on the human, which can lead to sub-optimal performance, or of hindering the human's pursuit of task objectives. That being said, the ORT is most likely to use human performance as part of a measurement of command or command team effectiveness.

Human performance measures address process and state aspects of human cognitive and physical performance. Unlike operational performance, however, human performance measurement includes concepts that are germane to any operational context or task. For instance, workload and situation awareness (SA) are always applicable. The precise components of a human performance measure, such as what contributes to the outcome state, or what needs to be observed to evaluate the process or state, do vary with the operational context or task.

When developing a human performance measurement protocol, an analysis of the task and operational scenario should be carried out. Only by understanding the task and the task context can specific and sensitive measurement points be identified. Stanton and Salmon (2004) have conducted the most comprehensive recent review of human performance design and evaluation methods. They reviewed over 200 methods, reducing this list to 91 based upon their applicability to C2. The 91 methods comprised the following categories:

Data collection techniques;

Task analysis techniques;

Cognitive task analysis techniques;

Charting techniques;

Human error identification techniques;

Situation awareness measurement techniques;

Mental workload assessment techniques;

Team performance analysis techniques;

Interface analysis techniques;

System design techniques; and,

Performance time assessment techniques.

The work by Stanton and Salmon (2004) exhibited a great deal of overlap between the categories, and only a few of the techniques are appropriate for measurement of C2 events (as opposed to developing a basic understanding of how things are done). Nevertheless, there are a number of measurement techniques that are useful for the purpose of this work. As noted in Section 4.9.1, specific measurement points should be identified through task analysis of the activity of interest and the scenario to be exercised. These analyses will support the development of interview questions, questionnaire questions, and observer "checklists." They will also help analysts determine whether the responses to interviews and questionnaires, and the observed behaviours, are what was expected, allowing conclusions to be drawn about the system itself. Other human performance dimensions are more generally applicable and do not necessarily have to rely on task analysis.

Since Stanton and Salmon (2004) conducted their survey, other methods have been developed, and Breton, Tremblay, and Banbury (2007) have developed an extensive list of SA measurement techniques for C2 environments. Breton et al. (2007) distinguish between SA as a process (situation assessment) and SA as a state (situation awareness). Breton and his co-authors also differentiate between individual SA and team SA. The other important aspect that Breton et al. (2007) note is the calibration of the SA measurement, in their case by comparing the actual agreement of SA information reported by different team members with team members' self-reported ratings of SA quality. It is possible that a team member may think SA is good when it is in fact poor, or believe it is poor when it is in fact good. This correlation provides further understanding of how effective individual and team SA is likely to be.
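
A minimal sketch (Python with numpy) of the calibration idea just described: self-reported SA quality is correlated with an objective SA accuracy score derived from ground-truth probe questions. The rating scale, probe scoring, and data are assumptions for illustration.

# Minimal sketch: calibrating self-reported SA against measured SA accuracy.
# The ratings, probe scores, and scales are illustrative assumptions.
import numpy as np

# Self-reported SA quality per team member (1 = poor ... 7 = excellent).
self_report = np.array([6.0, 3.0, 5.0, 7.0, 4.0])
# Fraction of ground-truth probe questions each member answered correctly.
probe_accuracy = np.array([0.55, 0.80, 0.60, 0.90, 0.45])

calibration_r = np.corrcoef(self_report, probe_accuracy)[0, 1]
print(f"SA calibration (Pearson r) = {calibration_r:.2f}")

A low or negative correlation flags exactly the cases warned about above: team members who believe their SA is good when it is in fact poor, or vice versa.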

In their 2004 report, Stanton and Salmon do not deal with trust as a process or a state that is necessary for effective performance. Likewise, they do not discuss the use of time-based, activity-based, or error-based MOPs and MOEs. Similarly to operational performance measures, time-based and activity-based measures can be used to measure human performance processes, although they are less useful for evaluating human performance states. Errors can be used to measure both human performance processes and outcome states. Time-, activity-, and error-based measures must be based on prior task understanding (through task and scenario analysis) to identify what the activity involves, what defines the onset and conclusion of different activity stages, and what constitutes an error. Stanton and Salmon (2004) also do not discuss the quality of human work, human-machine work, and machine work as a measurement point for C2 outcome states.

Given the preceding discussion, the following dimensions of human performance are recommended for measurement associated with RCN exercises and experiments:

Workload;

Situation Awareness;

Human Error: based on task and scenario analysis, specific potential errors should be identified and form the basis of observer checklists, questionnaires, and interviews. Errors might include omission of procedurally-mandated communication or coordination activities, misinterpretation of orders or intent, inaccurate execution of plans handed down from higher, etc.;

Team Performance: with respect to accountability, goal-setting, communications, ownership/membership, etc.;

Usability;

Trust;

Work quality: comparisons of the quality of work products should be made, in particular between different experimental conditions (as opposed to workload, SA, and human error, which can be considered in an absolute, rather than relative, sense);

Time-based measures (dependent on tasks being performed); and,

Activity-based measures (dependent on tasks being performed).

NOTE: Some of these dimensions of human performance will be more germane at the individual level of performance and, therefore, will be collected as MOPs rather than MOEs. It could be argued that, for human performance, there are no MOEs since the individual activities do not have a direct relationship with overall success of a capability.
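
To help operationalise the recommended dimensions, the sketch below (Python) arranges them into a simple measurement-plan structure that maps each dimension to candidate measurement approaches. The specific pairings are planning assumptions offered for illustration, not prescriptions, and per the note above every dimension is designated a MOP.

# Minimal sketch: a measurement-plan structure for the recommended human
# performance dimensions. The pairings below are planning assumptions.
measurement_plan = {
    "Workload":                ["questionnaire", "data recording"],
    "Situation Awareness":     ["probe questions", "observation"],
    "Human Error":             ["observation", "data recording"],
    "Team Performance":        ["observation", "interview"],
    "Usability":               ["questionnaire", "interview"],
    "Trust":                   ["questionnaire", "interview"],
    "Work quality":            ["SME review of work products"],
    "Time-based measures":     ["data recording"],
    "Activity-based measures": ["data recording", "observation"],
}

for dimension, approaches in measurement_plan.items():
    print(f"{dimension} (MOP): via {', '.join(approaches)}")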

5 CONCLUSIONS AND RECOMMENDATIONS

The Naval Exercise and Experimentation Support Framework proposed in this report was developed to improve exercise and experiment planning efficiency and effectiveness. It provides decision-makers and planners involved in the direction, oversight, planning, delivery and analysis of exercises and experiments with an understanding of the exercise and experiment "cycles," as well as the key inputs, processes and outputs required to conceive, design, and develop combined training exercises and operational research experiments while preserving the fundamental distinctions between each. Its value results from the collaboration of the two communities (Operator and Scientist) to engender full, mutual understanding and participation.

The exercise design and measurement framework is a six-phase repeatable exercise cycle that aligns high-level training direction and guidance with doctrine, strategy and missions to produce highly trained and operationally-ready commanders, staffs and units. The cycle commences with initiation by senior leadership and follows an iterative process of design and development, followed by exercise delivery, analysis and reporting. This process repeats itself as the conclusions and recommendations of the last cycle are analysed and incorporated into the next exercise cycle as part of a larger process of continual improvement within an organization.

The work examined best practices in Canadian and Allied service-level, joint and interagency training and exercise design and planning, and found that training objectives are selected by Commanders based on an analysis of Mission Essential Tasks (METs), which are derived from universal task lists such as the Canadian Joint Task List, the NATO Task List, the U.S. Universal Joint Task List and Universal Naval Task List, and the Department of Homeland Security Universal Task List. Specific, measurable, attainable, relevant and time-bound (i.e., SMART) training objectives consist of a specific performance requirement (task), the operational environment or situation in which a unit, system, or individual is expected to operate (conditions), and the level of expected performance (standard). A process chart was developed to assist exercise planners and operational research teams in exercise design and planning.

The framework for developing research hypotheses in military experimentation examined best practices from across the defence research community and determined that defence experimentation follows a similarly repeatable cycle, starting with initiation and following an iterative process of definition and development, followed by experiment conduct, analysis and reporting. This cycle repeats as the conclusions and recommendations of the last cycle are analysed and incorporated into the next cycle as part of a larger campaign of experimentation.

The work provided a number of sample training objectives as well as template research hypotheses for both exercise and experiment scenarios, ranging from unit and Task Group-level activities through to Maritime Component Command and Joint Task Force settings involving multi-agency, multi-national, and interdisciplinary stakeholders.

Evident throughout this study is the finding that no exercise or experiment should be undertaken without sufficient preparation and senior leader/sponsor buy-in to direct exercise and experiment design. In the early stages of design, an analysis should identify the qualitative and quantitative measurement opportunities available to measure performance and experimental outcomes and outputs. The study found that while the Canadian Joint Operations Command, the US Department of Defense, and the US Department of Homeland Security have refined lists of tasks, conditions and standards from which to design, plan and deliver both exercises and experiments, the guidance for the RCN lacks the same level of detail, particularly in terms of performance standards, measures and criteria at the Task Group level.

Of particular relevance to the MARPAC ORT is the manner in which they should organise their work to complement the processes of the RCN. These processes proceed from the macro level (i.e., guidance from the Government of Canada and CJOC) to the micro level (i.e., activities concerned with planning specific exercise and experiment events). The synchronisation between the MARPAC ORT and the RCN is presented graphically in Figure 5-1. The specific activities of the MARPAC ORT and how they flow from each other are shown graphically in Figure 5-2.

NOTE: The colour coding in Figure 5-1 matches that in Figure 5-2 and represents the six phases in the CJOC exercise planning cycle.

Future work should focus on refining the RCN Task List, defining data and information requirements to measure performance, and developing data collection and analysis plans, including the development of template questionnaires and surveys. In addition to providing measurable performance and readiness data, such an effort would support the assessment of progress towards meeting the training, readiness and capability development objectives promulgated in high-level guidance and direction, including the Chief of Defence Staff Force Posture and Readiness Directive, Commander Royal Canadian Navy's Strategic Direction and Guidance, the Royal Canadian Navy's Readiness and Sustainment Policy, and the Royal Canadian Navy Future Naval Training System Strategy, as well as the RCN Ten Year Fleet Plan, the Five Year Operational Schedule, and the Fleet Schedules for the two coastal formations.

Future work should also review a comprehensive set of previous ORT exercise and experiment support activities in order to distill a framework containing the following:

Common objectives;

Common areas of focus;

Common areas of interest;

Common measurement areas; and,

Common units of measurement.

The relationships should be retained where possible so that future ORT planning work can draw from previous mappings from objectives to tasks to measurements. This work should make extensive use of examples and cross-references to the original reports where possible. Ideally, this would also be leveraged in a 'living' lessons-learned database for exercise and experiment support[2], wherein modifications to the mapping could be captured, insights into the implementation of the support plan could be fed forward, and from which future support activities could begin, in order to facilitate the planning and preparation process and ensure the continued improvement of the support provided by the MARPAC ORT to the RCN.

Future work could also focus on creating a more robust reporting mechanism that would link RCN outputs to higher-level CAF measures contained in CDS direction and guidance. This would effectively harmonize military readiness reporting and departmental program reporting through such means as the Report on Plans and Priorities, the Departmental Performance Report, and updates to the Defence Priorities. Such a robust measurement and reporting system would not only provide ground truth regarding the current state of readiness of the RCN, but could also be leveraged to illustrate the requirement to invest in rebuilding a world-class Royal Canadian Navy capable of delivering operational excellence in the complex future security environment across the spectrum of conflict.

At the core of this effort is the principle that an Exercise and Experimentation support mechanism between maritime operational research teams and the RCN (in particular MARPAC, MARLANT, and CFMWC) should be set in place to ensure that Science (CORA) remains relevant to Operations (CAF/RCN). Close, invested collaboration in Exercises and Experimentation will ensure meaningful outcomes for the RCN and CAF over the long term.

[2] The lessons-learned database is already maintained by the CAF, so this would be only a slight expansion of its use.

Figure 5-1: Synchronization between MARPAC ORT and the RCN

Figure 5-2: Flow of MARPAC ORT Activities in Support of an Exercise

6 REFERENCES

Alberts, D. S. (2002). Code of Best Practice: Experimentation. Office of the Assistant Secretary of Defense Command and Control Research Program (CCRP). Washington DC.

Alberts, D. S., & Hayes, R. E. (2005). Campaigns of Experimentation: Pathways to Innovation and Transformation. Office of the Assistant Secretary of Defense Command and Control Research Program (CCRP). Washington DC.

Alberts, D. S., Huber, R. K., & Moffat, J. (2010). NATO NEC C2 Maturity Model. Office of the Assistant Secretary of Defense Command and Control Research Program (CCRP). Washington DC.

American, British, Canadian, Australian, and New Zealand Armies Program. (2008). ABCA Coalition Operations Handbook. ABCA Publication 332.

Armed Forces Staff Branch VI 6. (2008). German Concept Development and Experimentation Sub-concept (CD&E SC). Chief of Staff, Bundeswehr.

Australian Emergency Management Institute. (2012). Australian Emergency Manual Series. Handbook 3: Managing Exercises. Commonwealth of Australia.

Birnstiel, M., Kämmerer, M., Kern, S., May, T., Noeske, A., … & Walther, M. (2004). Wargaming Guide. Bundeswehr Command and Staff College.

Bowley, D., Comeau, P., Edwards, R., Hiniker, P. J., Howes, G., Kass, R. A., ... & Villeneuve, S. (2006). Guide for Understanding and Implementing Defense Experimentation (GUIDEx) version 1.1. The Technical Cooperation Program (TTCP).

Breton, R. (2011). Joint Command Decision Support for the 21st Century Technology Demonstration Project: Lessons learned from experiment, demonstration and training events. DRDC Valcartier Technical Report TR 2010-226.

Breton, R., Tremblay, S., & Banbury, S. (2007). Measurement of individual and team situation awareness: A critical evaluation of the available metrics and tools and their applicability to command and control environments. DRDC Valcartier Technical Report TR 2003-283.

Burns, S. (post-2013). War Gamers' Handbook: A Guide for Professional War Gamers. U.S. Naval War College.

Cameron, F., & Pond, G. (2010). Military Decision Making Using Schools of Thought Analysis – A Soft Operational Research Technique, with Numbers. In Proceedings of the 27th International Symposium on Military Operational Research. http://ismor.cds.cranfield.ac.uk/27th-symposium-2010, accessed 20 February 2016.

Chief of Naval Operations. (2013). Naval Planning NWP 5-01. Office of the Chief of Naval Operations, Navy Warfare Development Command.

Chief of Naval Operations, Commandant, United States Marine Corps, Commandant, United States Coast Guard. (2007). Universal Naval Task List (UNTL). OPNAVINST 3500.38B/MCO 3500.26A/ USCG COMDTINST M3500.1B. Version 3.0.

Combined Joint Operations from the Sea Centre of Excellence (2011). Allied Interoperability Handbook: A Tool to Enhance and Measure Interoperability Among NATO Allied/Coalitions and US Navy.

Couillard, M., Arseneau, L., Eisler, C., & Taylor, B. (2015). Force Mix Analysis in the Context of the Canadian Armed Forces. DRDC-RDDC-2015-P045.

Defense Acquisition University (2005). Test and Evaluation Management Guide. Fifth Edition. Defense Acquisition University Press, Fort Belvoir, VA.

Defence Terminology Database (DTB). http://terminology.mil.ca/index-eng.asp#, accessed 23 February 2016.

Denford, J. (2011). Experimentation Guide. Revision 3. Army Experimentation Centre – Note, AEC-N 0501.

Department of Homeland Security. (2007). Department of Homeland Security Target Capabilities List: A companion to the National Preparedness Guidelines.

Department of Homeland Security. (2005). Department of Homeland Security Universal Task List. Version 2.1.

Department of Homeland Security. (Undated). Homeland Security Exercise and Evaluation Program (HSEEP) Master Scenario Events List (MSEL) Package.

Department of Homeland Security. (2013). Homeland Security Exercise and Evaluation Program (HSEEP).

Director, Operational Test and Evaluation. (2015). Test and Evaluation Master Plan (TEMP) Guidebook. Office of the Secretary of Defense.

DND. (Undated). Canadian Army Battle Task Standards (BTS). B-GL-383-002/PS-002.

DND. (Undated). MARPAC/JTFP Aide Memoire - Exercise Planning Process. LCdr Fedoruk MARPAC J7 email correspondence 22 January 2016.

DND. (Undated). MARPAC/JTFP Exercise Development and Deliverables. LCdr Fedoruk MARPAC J7 email correspondence 22 January 2016.

DND. (Undated). Command and Control for Joint Maritime Operations. CFJP 3.X (DRAFT).

DND. (Undated). Royal Canadian Navy Combat Readiness Training Requirements CFCD 102(L). B-GN-002-000/RQ-001.

DND. (2001). Leadmark: The Navy’s Strategy for 2020.

DND. (2001). Maritime Operational Analysis Guide. CFCD 101. CFMWC: 3250-1 (CO). http://halifax.mil.ca/CFMWC/pages/links.html

DND. (2005). Canadian Forces Operations. CFJP-3.0. B-GJ-005-300/FP-000.

DND. (2008). Canadian Forces Operational Planning Process. CFJP-5.0 B-GJ-005-500/FP-00.

DND. (2009). Canadian Military Doctrine. CFJP-01. B-GJ-005-000/FP-001.

DND. (2010). Commander JTFP CONPLAN PLETHORA – JTFP Response to Domestic Emergencies.

DND. (2011). Evaluation of Land Force Readiness and Training. Chief of Review Services. March 2011 1258-169 (CRS).

DND. (2013) Chief of Defence Staff Force Posture and Readiness (FP&R).

DND. (2013). Commander's Guidance and Direction to the Royal Canadian Navy: Executive Plan 2013-2017.

DND. (2013). Maritime Operational Test and Evaluation Guide. CFCD 124. CFMWC: 3250-1 (AWB/OTE). http://halifax.mil.ca/CFMWC/pages/links.html

DND. (2013). A Guide to Designing, Conducting, Assessing, and Reporting on an Exercise. CJOC Exercise Methodology (Version 1.2).

DND. (2013). Maritime Command Sea Training Guide. STG (C).

DND. (2013). NAVORD 3120-2. Scheduling Process.

DND. (2013). NAVORD 11900-2. Maritime Evaluations.

DND. (2014). NAVORD 3771-1. Maritime Science and Technology (S&T) Programme.

DND. (2014). NAVORD 3771-12. Defence Research & Development Canada (DRDC) - Directed Client Support.

DND. (2014). NAVORD 3771-14. Maritime Modelling and Simulation (M&S) Policy.

DND. (2014). NAVORD 4500-0. Individual and Collective/Operational Training Policy.

DND. (2014). The Future Security Environment; 2014-2040.

DND. (2014). Canadian Army Simulation Centre Exercise Design, Development and Delivery Guide (E3DG) Version 3.0.

DND. (2014). Commander Canadian Army Training for Land Operations. B-GL-300-008/FP-001.

DND. (2014). Commander CJOC Planning Guidance for CJOC Joint Training and Exercise Plan Fiscal Year 2015/2016. 3352-1 (J7). (provided by LCdr Fedoruk MARPAC J7 email 22 January 2016).

DND. (2014). Commander CJOC CONPLAN JUPITER – CAF Expeditionary Operations Contingency Plan.

DND. (2014). Commander CJOC Standing Operations Order for Domestic Operations (SOODO).

DND. (2014). Commander CJOC CONPLAN LENTUS - CAF Assistance to Federal/Provincial/Territorial Disaster Relief Operations.

DND. (2014). Commander JTFP CONPLAN PANORAMA – JTFP Response to a Catastrophic Earthquake in British Columbia.

DND. (2015). Commander Royal Canadian Navy Standing Direction and Guidance (CRCN SD&G) 3371-3250-1(DGNSR/RDIMS 366208).

DND. (2015). Director-General Capability and Structure Integration Operating Procedures Aide-Memoire. 2nd Edition.

DND. (2015). Director General Naval Force Development Maritime Concept Development Guide (MCDG).

DND. (2015). DND/CAF Modelling and Simulation Roadmap. Draft Version 5.6.

DND. (2015). Lessons Learned. CFJP A2. B-GL-025-0A2/FP-001.

DND. (2015). Maritime Concept Development Guide (MCDG).

DND. (2015). NAVORD 3009-3. Statement of Operational Capability Deficiency (SOCD).

DND. (2015). NAVORD 3771-13. Maritime Concept Development and Experimentation (CD&E) Policy.

DND. (2015). Royal Canadian Navy Future Naval Training System Strategy (FNTS). RDIMS # 457941.

DND. (2015). Royal Canadian Navy Readiness and Sustainment Policy. CFCD 129. B-GN-005-RCN/RQ-001.

DND. (2016). Canadian Armed Forces Joint Task List (JTL). http://collaboration-cjoc-coic.forces.mil.ca/sites/JTL

Dobias, P. (2014). Operational Research Support to Canadian Fleet Pacific TGEX 3-14: Methodology and Data Collection Plan, DRDC-RDDC-2014-L176.

Dobias, P., Appleget, J., Cameron, F., Tahir, A., Sen, F., Unlu, S., & Gencay, M. (2015). Employment of Non-Lethal Capabilities for Visit, Board, Search, and Seizure Operations: Naval Postgraduate School Wargame, NATO STO-TM-SAS-094.

Dobias, P., & Eisler, C. (2015). Enhanced Boarding Party Concept of Operations: Data Collection Plan for Exercise TRIDENT FURY, DRDC-RDDC-2015-L086.

Dobias, P., & Eisler, C. (2015). Operational Research Support to Exercise TRIDENT FURY Data Collection Plan. DRDC-RDDC-2015-L120.

Dobias, P., Kalantzis, E., & Connell, D. (2010). CEFCOM Effects Dashboard: Assessment of Operations in Multi-Agency Environment. DRDC CORA TM 2010-001.

Dobias, P., Sprague, K., Woodill, G., Cleophas, P., & Noordkamp, W. (2008). Measures of Effectiveness and Performance in Tactical Combat Modelling. DRDC CORA TM 2008-032.

Dooley, P., & Gauthier, Y. (2013). Vignette, Task, Requirement and Option (VITRO) Analyses Approach: Application to Concept Development for Domestic Chemical, Biological, Radiological and Nuclear (CBRN) Event Response. DRDC CORA TR 2013-225.

DRDC Toronto (2002). Guidelines for Human Subject Participation in Research Projects. DRDC Report, April 2002.

Drury, C. (1990). “Methods for Direct Observation of Performance”, in J.R. Wilson & E.N. Corlett (eds), Evaluation of Human Work, 2nd edition, London: Taylor and Francis, pp.45-68.

Eaton, J., Redmayne, J., & Thordsen, M. (2007). Joint Analysis Handbook. Third Edition. NATO Joint Analysis and Lessons Learned Centre.

Eisler, C. (2015). Operational Research Support to MARPAC/JTFP during EX DETERMINED DRAGON 14. DRDC-RDDC-2014-L258.

Eisler, C., & Dobias, P. (2015). Observations and Recommendations from the Exercise PACIFIC JEOPARDY 2015. DRDC-RDDC-2015-L185.

Emergency Management Division. (2010). Exercise Design Quick Reference Guide. Justice Institute of British Columbia.

Government of Canada. (2008). Canada First Defence Strategy.

Joint Systems and Analysis Group Technical Panel 3. (Undated). Guide to Capability-Based Planning. The Technical Cooperation Program (TTCP).

Kalantzis E., Dobias, P., & Connell, D. (2007). Assessment of Effect Based Operations Based on a 3D Approach, In Proceedings from Cornwallis XII: Analysis for Multiagency Support, Pearson Peacekeeping Centre, Cornwallis Park, Nova Scotia, 2-5 April 2007, SL 2007-03.

Kass, R. (2006). Logic of Warfighting Experiments. Office of the Assistant Secretary of Defense Command and Control Research Program (CCRP). Washington DC.

Lamoureux, T. (2013). Measures of Performance Related to the Concepts of Integrated C2. DRDC Valcartier W7714-08-3663 TASK 148.

Lund, W.D.G. (2014). NBP 3.0 Project - Basic Battle Procedure for RCN Junior Officers. provided by Cdr Peschke, Sea Training Pacific. Email correspondence 22 January 2016.

National Emergency Management Training Committee (NEMTC) Exercise Design Working Group. (2007). Exercise Design 100. Alberta Emergency Management Agency.

NORAD and US Northern Command. (2008). NORAD and USNORTHCOM Exercise Program. NNCI 10-156.

North Atlantic Treaty Organization. (2003). Handbook on Long Term Defence Planning. RTO-TR-069 AC/323(SAS-025) TP/41.

North Atlantic Treaty Organization. (2004). AJP-3.1 Allied Joint Maritime Operations. NATO Standardization Agency.

North Atlantic Treaty Organization. (2008). AJP-3.9 Allied Doctrine for Joint Targeting. NATO Standardization Agency.

North Atlantic Treaty Organization. (2009). MC 0583: NATO Policy for Concept Development and Experimentation.

North Atlantic Treaty Organization. (2010). MCM-0056-2010: NATO Concept Development and Experimentation Process.

North Atlantic Treaty Organization. (2010). Comprehensive Operations Planning Directive (COPD). NATO Allied Command Operations.

North Atlantic Treaty Organization. (2010). AJP-01(D) – Allied Joint Doctrine. NATO Standardization Agency.

North Atlantic Treaty Organization. (2010). Bi-SC 75-4: Experimentation Directive.

North Atlantic Treaty Organization. (2011). NATO Alternative Analysis (Red Teaming) Concept Conceptual Framework: DRAFT Conceptual Framework for the Implementation of a Bi-Strategic Command Decision Support Tool. NATO Allied Command Transformation.

North Atlantic Treaty Organization. (2011). The NATO Lessons Learned Handbook. Second Edition. NATO Joint Analysis and Lessons Learned Centre.

North Atlantic Treaty Organization. (2011). SAS 078 Study Group: Non-Lethal Weapons Capability-Based Analysis. NATO STO-TR-SAS078 Final Report extracts provided by Dr. P. Dobias.

North Atlantic Treaty Organization. (2013). AJP-5 Allied Joint Doctrine for Operational Planning. NATO Standardization Agency.

North Atlantic Treaty Organization. (2013). Bi-SC 75-2: Education, Training, Exercise and Evaluation Directive (E&TD).

North Atlantic Treaty Organization. (2013). Bi-SC 75-3: Collective Training and Exercise Directive (CT&ED).

North Atlantic Treaty Organization. (2013). Bi-SC 75-7: Education and Individual Training Directive (E&IT).

North Atlantic Treaty Organization. (2015). AAP-06 NATO Glossary of Terms and Definitions. NATO Standardization Agency.

Operations Research/Systems Analysis Committee. (2011). Operations Research/Systems Analysis (ORSA) Fundamental Principles, Techniques, and Applications. The Army Logistics University.

Post, A. (2004). Military Experimentation Hallmark of Professionalism. Air Power Development Centre. Australia.

Roedler, G.J. and Jones, C. (2005). Technical Measurement: A Collaborative Project of PSM, INCOSE, and Industry. INCOSE-TP-2003-020-01.

Stanton, N. & Salmon, P. (2004). Human Factors Design and Evaluation Methods Review. Report by Human Factors Integration Defence Technology Centre.

Taylor, B. (2013). Analysis Support to Strategic Planning. The Technical Cooperation Program (TTCP).

Tri-Council Policy Statement. (2010). Ethical Conduct for Research Involving Humans. December 2010.

UK Ministry of Defence. (2013). Red Teaming Guide. Second Edition. Development, Concepts and Doctrine Centre.

U.S. Army Training and Doctrine Command. (1993). Command and Control Measures of Effectiveness Handbook. TRADOC Analysis Center Technical Document TRAC-TD-0393.

U.S. Army Training and Doctrine Command. (2014). The U.S. Army Operating Concept: Win in a Complex World. TRADOC Pamphlet 525-3-1.

U.S. Joint Chiefs of Staff. (2010). Department of Defense Dictionary of Military and Associated Terms. Joint Publication 1-02 (as amended through 15 January 2016).

U.S. Joint Chiefs of Staff. (2011). Universal Joint Task Manual. CJCSM 3500.04F.

U.S. Joint Chiefs of Staff. (2014). Universal Joint Task List Policy and Guidance. CJCSI 3500.02B.

U.S. Joint Chiefs of Staff. (2015). Joint Training Manual for the Armed Forces of the United States. CJCSM 3500.03.

U.S. Joint Chiefs of Staff. (2016). Universal Joint Task List (UJTL).

U.S. Navy Warfare Development Command. (2011). Maritime Commander’s Red Team Handbook.

Wesolkowski, S., & Eisler, C. (2015). Capability-Based Models for Force Structure Computation and Evaluation. DRDC-RDDC-2015-N008.

APPENDIX A GLOSSARY

Act: The operational function that integrates manoeuvre, firepower and information operations to achieve the desired effects. (DTB)

After Action Report (AAR): A formal report generated following a training exercise or operational event that focuses on identifying what happened, why it happened, and how it can be improved. Note: An after-action report is an example of a post-activity report and may be the output of an after-action review. (DTB)

Air Defence (AD): All measures designed to nullify or reduce the effectiveness of hostile air action. (AAP-6)

Airspace Control Plan (ACP): The document approved by the joint force commander that provides specific planning guidance and procedures for the airspace control system for the joint force operational area. (JP 1-02)

Airspace Coordinating Measures (ACM): Measures employed to facilitate the efficient use of airspace to accomplish missions and simultaneously provide safeguards for friendly forces. (JP 1-02)

Analysis and Collection Plan (ACP): In lessons learned, a plan developed to gather data and information related to the assigned objective, and identify techniques to facilitate analysis. (DTB)

Anti-Air Warfare (AAW): Measures taken to defend a maritime force against attacks by airborne weapons launched from aircraft, ships, submarines and land-based sites. (AAP-6)

Anti-Submarine Warfare (ASW): Operations conducted with the intention of denying the enemy the effective use of their submarines. (AAP-6)

Anti-Surface Warfare (ASUW): That portion of maritime warfare in which operations are conducted to destroy or neutralize enemy naval surface forces and merchant vessels. Alternatively referred to as Surface Warfare. (JP 1-02)

Area of Interest (AI): The area of concern to a commander relative to the objectives of current or planned operations, including his areas of influence, operations and/or responsibility, and areas adjacent thereto. (AAP-6)

Area of Operations (AO): That portion of an area of war necessary for military operations and for the administration of such operations. (AAP-6)

Area of Responsibility (AOR): A defined area in which responsibility is specifically assigned to the commander of the area for the development and maintenance of installations, control of movement and the conduct of tactical operations involving military forces under his control, along with parallel authority to exercise these functions. In naval usage, a predefined area of enemy terrain for which supporting ships are responsible for covering by fire on known targets or targets of opportunity and by observation. (AAP-6)

Battle Damage Assessment (BDA): The assessment of effects resulting from the application of military action, either lethal or non-lethal, against a military objective. (AAP-6)

Battle Task Standard (BTS): The standards specifying the minimum degree of competence required to carry out combat operations at a given level. (DTB)

Best Practice: An effective method that is promoted to effect change and to ensure its continued use. (DTB)

Buy and Try (BTVAL): The evaluation of readily available commercial equipment in an operational environment. (NAVORD 11900-2)

Capability: The ability to execute a specified course of action to achieve a certain effect. Within the transformational area, the definition includes one or more elements of the Doctrine, Organization, Training, Materiel, Leadership, Personnel, Facilities and Interoperability (DOTMLPFI) spectrum. (NATO MC 0583). The ability to carry out a military operation to create an effect. (DTB)

Capability Based Planning (CBP): In force development, a systematic approach to determine the most appropriate force options to meet national aims. (DTB)

Chemical, Biological, Radiological and Nuclear (CBRN): Measures taken to minimize or negate the vulnerabilities to, and/or effects of, a chemical, biological, radiological or nuclear hazard or incident. (JP 1-02)

Collective Training: Training, other than Individual Training (IT), designed to prepare teams, units and other elements to perform military tasks in accordance with defined standards. This includes procedural drill and the practical application of doctrines, plans and procedures to acquire and maintain tactical, operational and strategic capabilities. (NAVORD 4500-0)

Command: The operational function that integrates all the operational functions into a single comprehensive strategic, operational or tactical level concept. (DTB)

Commander’s Critical Information Requirement (CCIR): An information requirement identified by the commander as being critical to facilitating timely decision making. (JP 1-02)

Command Post Exercise (CPX): An exercise involving the commander, his staff and communications within and between headquarters, but in which the forces are simulated. This type of exercise is known as a "functional exercise" in the civilian context. These exercises are designed to validate the capability of an individual function, or a complex activity within a function. They are designed to be very realistic, with real response-time constraints, but with all activity outside the training audience fully simulated. This type of exercise is quite sophisticated, normally involves simulation and requires great skill to design and conduct properly. A CPX driven either partially or fully by simulation is often referred to as a computer assisted exercise (CAX). (CASC E3DG)

Computer Assisted Exercise (CAX): An exercise using modelling and simulation technology to create an artificial environment, identical to the real world, that will stimulate decision-making and follow-on command and control actions. (NATO Bi-SC 075-003)

Concept: In force development, a notion or statement of an idea to address a capability gap. (DTB)

Concept of Employment (CONEMP): A CONEMP outlines the employment of a specific capability within a range of operations or scenarios. (MCDG)

Concept of Operations (CONOPS): The CONOPS expresses the military commander’s intentions on the use of forces, time and space to achieve his mission, objectives, and end state. (CFJP 5.0) A clear and concise statement of the line of action chosen by a commander in order to accomplish his [or her] mission. (DTB)

Conditions: Those variables of an operational environment or situation in which a unit, system, or individual is expected to operate and that may affect performance. Also, variables of the operational environment, including the scenario, that affect task performance. (CJCSM 3500.04F)

Concept Development and Experimentation (CD&E): The NATO CD&E process is a scientifically supported methodology aimed at developing innovative and novel solutions to capability shortfalls or gaps, through an iterative approach of discovery and refinement. (NATO MCM-0056-2010)

Constructive Simulation: In the taxonomy of simulation, a type of simulation involving simulated people operating simulated systems. (DTB)

Contingency Plan (CONPLAN): An operation plan for contingencies that can reasonably be anticipated in a specific geographical area. A mechanism to address a potential future event or circumstance based on known or assumed planning factors. (CFJP 5.0)

Criterion: The minimum acceptable level of performance associated with a particular measure of task performance. It is often expressed as hours, days, percent, occurrences, minutes, miles, or some other command-stated measure. (CJCSM 3500.04F)

Data Collection Plan: A plan that explains how the requisite data will be collected and validated prior to analysis. The plan will identify what data are being collected, the collection techniques and the method of validation. (GUIDEx)
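As an illustration of the three elements this definition names (what data, how collected, how validated), a minimal plan entry might be sketched as follows; the field names and values are hypothetical and are not drawn from GUIDEx:

    # Minimal illustrative data collection plan entry (hypothetical values).
    plan_entry = {
        "data_item": "time from first detection to engagement order (seconds)",
        "collection_technique": "observer log cross-checked against combat system recording",
        "validation_method": "compare observer timestamps with system logs before analysis",
    }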

Defense Experiment/Experimentation: The application of the experimental method to the solution of complex defense capability development problems, potentially across the full spectrum of conflict types, such as warfighting, peace-enforcement, humanitarian relief and peace-keeping. (GUIDEx)

Developmental Evaluation (DEVAL): An evaluation of concepts or equipment in the pre-prototype stage, or of prototype equipment, as applicable, for potential application in HMC ships and/or submarines. (NAVORD 11900-2)

Demonstration Experiment: Demonstration experiments are designed experiments in which known truth is recreated, analogous to those in high school in which students follow instructions to show that the laws of chemistry and physics operate as the underlying theories predict. For NATO these activities are cooperative demonstrations of technology to show that an innovation can, under carefully orchestrated conditions, improve the efficiency, effectiveness or speed of a military activity. The technologies employed are well established and the setting (e.g., scenario, participants) is orchestrated to show that these technologies can be employed efficiently and effectively. (BI-SCD 075-003)

Direct Action (DA): A short-duration strike or other small scale offensive action by special operations forces to seize, destroy, capture, recover or inflict damage to achieve specific, well-defined and often time-sensitive results. (AAP-6)

Discovery Experiment: Discovery experiments are designed to create recommendations of concepts that are most likely to produce successful future military and/or political capabilities for the Alliance. Their outcomes are expected to be insights rather than optimality or rigorous quantitative analyses; they do not produce final answers. Discovery-type capabilities experiments produce actionable recommendations that address desired operational capabilities and potential investment streams. (BI-SCD 075-003)

Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, Facilities and Interoperability (DOTMLPFI): DOTMLPFI is an acronym used to describe inputs to the capability development process. (DTB)

Evaluation: The process of determining, by whatever means, the quality of a concept or system of interest by comparing it against appropriate criteria or requirements. When done practically or empirically, this is enacted by testing. (GUIDEx)

Event: A generic term which may be used to describe a demonstration, test, experiment or observational study prior to the designation of those terms. (GUIDEx)

Exercise: A military manoeuvre or simulated wartime operation involving planning, preparation, and execution. It is carried out for the purpose of training and evaluation. It may be a combined, joint, or single service exercise, depending on participating organizations. (DTB)

Exercise Control (EXCON): EXCON is the term used to describe all of the participants during the operational conduct of an exercise who are not in the training audience and who are thus under the control of the Exercise Director. (BI-SCD 075-003)

Exercise Directive (EXDIR): Refines the requirements for the exercise by providing further guidance to the planners of the exercise. Signed by the Officer Conducting the Exercise (OCE). (CJOC Exercise Methodology)

Exercise Director (EXDIR): The Exercise Director, proposed by the OCE and approved by the OSE, is the senior officer responsible for the overall direction and control in support of the exercise aim and objectives as well as the approved training objectives. The EXDIR will be designated during the Exercise Concept and Specification Development Stage and will engage in the remainder of the exercise process in support of the OCE. The EXDIR will head the Exercise Control (EXCON) organisation and direct all aspects of the execution of an exercise on behalf of the OCE. (BI-SCD 075-003)

Exercise Plan (EXPLAN): Provides details on how the exercise will be conducted, who the key participants are, how the evaluation will be undertaken, and all the necessary administrative instructions. Signed by the Exercise Director (EXDIR). (CJOC Exercise Methodology)

Exercise Specification (EXSPEC): Sets out the fundamental requirements for the exercise including an outline of the concept, form, scope, setting, aim, objectives, force requirements, political implications, analysis arrangements and costs. Signed by the Officer Scheduling the Exercise (OSE). (CJOC Exercise Methodology)

Experiment: An empirical means of establishing cause-and-effect relationships through the manipulation of independent variables and measurement of dependent variables in a controlled environment. Experimentation is enacted by the testing of hypotheses. (GUIDEx)

Experimentation: The process of controlled research, testing and evaluation to discover information or test a hypothesis to establish cause and effect. (DTB)

Field Training Exercise (FTX): An FTX is a training event executed in a realistic tactical environment, designed to enable tactical groupings to practice or confirm the execution of tactical tasks, either in a force-on-force or live-fire context. It should include maximum integration of combat, combat support and combat service support (CSS) functions. This type of exercise is known as a "full scale exercise" in the civilian context. Full scale exercises are intended to evaluate the operational capability of systems in an interactive manner over a substantial period of time. They involve the validation of a major portion of the elements existing within these systems and organizations in a high-stress environment. This type of exercise includes the actual mobilization of personnel and resources to demonstrate coordination and response capability in real time. Full scale exercises are designed to be as close to the "real thing" as possible, but even so some simulation is usually required. (CASC E3DG)

Finding: In lessons learned, a concise statement based upon the analysis of observations, data and/or information. (DTB)

Force Development: A system of integrated and interdependent processes used to identify, conceptualize and implement necessary changes to existing capabilities or to develop new capabilities. (DTB) The RCN Executive Plan defines it as directed towards activities and processes related to identifying and gaining approval for new or amended naval capabilities, to planning and managing the acquisition, production, and/or staffing of those capabilities, and to bringing those capabilities to a sufficient level of operational maturity to allow them to be integrated into the normal Force Support, Force Employment, and Force Generation domains. (CRCN SD&G)

Force Employment: At the strategic level, the application of military means in support of strategic objectives. At the operational level, the command, control and sustainment of allocated forces. (DTB) The RCN Executive Plan defines Force Employment as directed towards activities and processes related to the command, control, and operational employment of naval forces; this functional area falls under CJOC. (CRCN SD&G)

Force Generation: The process of organizing, training and equipping forces for force employment. (DTB) The RCN Executive Plan defines it as directed towards activities and processes that support the three key elements required to generate maritime forces: materiel readiness (technical readiness), personnel readiness (e.g., individual training) and combat readiness (e.g., collective and mission-specific training). (CRCN SD&G)

Force Management (FM): Defined in the RCN Executive Plan as directed towards activities and processes related to planning, directing, monitoring and coordinating Force Development, Force Generation, Force Support and Force Employment activities across the span of the RCN. (CRCN SD&G)

Force Posture and Readiness Directive (FP&R): The CDS Directive on CAF Force Posture and Readiness. (CRCN SD&G)

Force Protection (FP): All measures and means to minimize the vulnerability of personnel, facilities, equipment and operations to any threat and in all situations, to preserve freedom of action and the operational effectiveness of the force. (AAP-6)

Harbour Defence (HD): The safeguarding of vessels, harbours, ports, waterfront facilities and cargo from internal threats such as: destruction, loss, or injury from sabotage or other subversive acts; accidents; thefts; or other causes of similar nature. May be referred to as Port Security. (AAP-6)

Higher Control (HICON): In manoeuvres or tactical exercises with or without troops, military members at a higher level responsible for creating incidents, simulating conditions and presenting tactical and logistical problems to make military students under their command react according to the aims of the exercises. (DTB)

Hot Wash-Up: Military jargon for an informal after action review. (DTB)

Hypothesis: An assertion, proposition or statement about relations or constraints whose truth-value is as yet unknown, but in principle is determinable by tests involving generally empirical but also logical evidence. (GUIDEx)

Hypothesis-testing Experiment: Hypothesis-testing experiments are the classic type used to advance knowledge by seeking to falsify specific hypotheses (if…then statements) or discover their limiting conditions. They are also used to test whole theories or observable hypotheses derived from such theories. To conduct a hypothesis-testing experiment, the experimenter creates a situation in which one or more dependent variables can be systematically observed under conditions with varying independent variables, while other potentially relevant factors (i.e., control variables) are held constant, either empirically or through statistical manipulation. Hence, results from hypothesis-testing experiments are always caveated with "all other things being equal". (BI-SCD 075-003)
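To make this structure concrete, the following sketch shows a two-condition comparison in Python. The scenario, data and variable names are hypothetical, and a two-sample t-test stands in for whatever analysis an actual experiment design would specify:

    # Illustrative hypothesis-testing analysis (hypothetical data).
    # Independent variable: procedure (baseline vs. candidate).
    # Dependent variable: detection time in seconds; control variables held constant.
    from scipy import stats

    baseline_times = [41.2, 38.5, 44.1, 39.8, 42.6]   # trials under the baseline procedure
    candidate_times = [35.9, 33.4, 37.2, 34.8, 36.1]  # trials under the candidate procedure

    # Null hypothesis: both procedures yield the same mean detection time.
    t_stat, p_value = stats.ttest_ind(baseline_times, candidate_times)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    # A small p-value would falsify the null hypothesis, subject to the
    # "all other things being equal" caveat noted above.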

Initiating Directive: A written directive issued by the Chief of Defence Staff to formalize the initiation of strategic and operational level planning, to communicate the initial deductions of a strategic assessment, and to authorize the execution of preparatory actions such as making changes to readiness, conducting reconnaissance and conducting pre-deployment training. Note: While an initiating directive initiates planning and preparation, it does not confer the authority to execute an operation. (DTB)

Intelligence: The product resulting from the processing of information concerning foreign, hostile or potentially hostile forces or elements, or areas of actual or potential operations. (DTB)

Interoperability: The ability to operate in synergy in the execution of assigned tasks. (DTB)

Joint: Adjective that connotes activities, operations, organizations in which elements of at least two services participate. (DTB)

Joint Concept: Links strategic guidance to the development and employment of future joint force capabilities and serves as an "engine for transformation" that may ultimately lead to doctrine, organization, training, materiel, leadership and education, personnel and facilities (DOTMLPF) and policy changes. (JP 1-02)

Joint Experimentation: An iterative process of collecting, developing, and exploring concepts to identify and recommend the best value-added solutions for changes to doctrine, organization, training, equipment, leadership, personnel, and facilities required to achieve significant advances in future joint operational capabilities. (DTB)

Joint Fires: Fires applied during the employment of forces from two or more components, in coordinated action toward a common objective. (AAP-6)

Joint Force Air Component Commander (JFACC): The commander within a unified command, subordinate unified command, or joint task force responsible to the establishing commander for recommending the proper employment of assigned, attached, and/or made available for tasking air forces; planning and coordinating air operations; or accomplishing such operational missions as may be assigned. (JP 1-02)

Live Simulation: Simulation of military operations in a live environment with actual military units and with real military equipment and operational prototypes, with only weapon effects being simulated. For example, Air Combat Manoeuvring Instrumentation (ACMI) ranges and field environments using laser-based weapon effects simulators. (GUIDEx)

Lower Control (LOCON): In manoeuvres or tactical exercises with or without troops, subordinate military members who obey, wrongly obey or make believe they obey orders given them by military students at a higher level to make them react according to the aim of the exercises. (DTB)

Main Events List/Main Incidents List (MEL/MIL): The MEL/MIL is the main tool (normally a database) used by EXCON to control the exercise. It is maintained by EXCON and structured on the main events developed to support achievement of the exercise objectives. Each main event will have one or more incidents that are presented to the training audiences by means of injections. The MEL/MIL should encompass the complete timeline of the exercise and, at ENDEX, be updated to include all dynamic and unscripted events, incidents and injections utilized during the exercise conduct. (BI-SCD 075-003)
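Because the MEL/MIL is described above as a database of main events, incidents and injections, one possible record structure can be sketched as follows; the field names are hypothetical illustrations, not a NATO schema:

    # Illustrative MEL/MIL record structure (hypothetical, not a NATO schema).
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Injection:
        when: datetime        # when the stimulus reaches the training audience
        means: str            # e.g., radio call, e-mail, role-player
        content: str          # the stimulus itself

    @dataclass
    class Incident:
        description: str
        injections: list[Injection] = field(default_factory=list)

    @dataclass
    class MainEvent:
        supported_objective: str                        # exercise objective this event serves
        incidents: list[Incident] = field(default_factory=list)
        scripted: bool = True                           # dynamic/unscripted events added at ENDEX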

Master Events Scenario List (MESL): The list of key events in an exercise scenario. (DTB)

Measure: A parameter that provides the basis for describing varying levels of task accomplishment. (CJCSM 3500.04F)

Maritime Interdiction Operation (MIO): An operation conducted to enforce prohibition on the maritime movement of specified persons or material within a defined geographic area. (AAP-6)

Mission: A clear, concise statement of the task of the command and its purpose. (DTB)

Mission Specific Training (MST): Training for individuals, teams or units in order to produce a specific capability for a specific mission. Normally conducted in support of mission-specific capability insertions or equipment fits. (NAVORD 4500-0)

Mine Warfare (MW): The strategic and tactical use of mines and their counter-measures. (AAP-6)

Naval Cooperation and Guidance of Shipping (NCAGS): The provision of NATO military cooperation, guidance, advice, assistance and supervision to merchant shipping to enhance the safety of participating merchant ships and to support military operations. (AAP-6)

Naval Task Group (NTG): A Naval Force Package comprised of up to four combatants (destroyers, frigates or submarines) and a support ship, with appropriate Naval Task Group Command Staff and maritime air support. (CRCN SD&G)

Objective: A clearly defined and attainable goal for a military operation, for example seizing a terrain feature, neutralizing an adversary's force or capability or achieving some other desired outcome that is essential to a commander's plan and towards which the operation is directed. (DTB)

Officer Conducting the Exercise (OCE): The officer responsible for the conduct of an allocated part of the exercise from the Blue, Orange and Purple aspects. He will issue necessary supplementary instructions. In addition, he may be an exercise commander. (DTB)

Officer Scheduling the Exercise (OSE): The officer who originates the exercise and orders it to take place. He will issue basic instructions which will include the designation of exercise areas, the allocation of forces, and the necessary coordinating instructions. He will also designate the officers conducting the exercise. (DTB)

Operational Evaluation (OPVAL): OPVALs are used in OT&E to ensure that newly introduced or modified equipment or systems meet the validated requirements of a user in a realistic scenario. (CFCD 124)

Operational General Matters (OPGEN): The Officer in Tactical Command (OTC) issues OPGEN on policy, instructions, and aspects common to all warfare areas. This message is the primary means for the OTC to issue mission orders to their forces for an operation or phase of an operation. (AJP 3.1)

Operational Task (OPTASK): OPTASKs are messages used to promulgate detailed tasking and instructions specific to an individual warfare area. They are issued by the responsible commander and are the primary method for issuing tailored standing orders to maritime forces for an operation or phase of an operation. (AJP 3.1)

Operational Tests (OPTEST): OPTEST is a special category of OT&E used to test existing systems against the changing operational environment. In addition to hardware, existing systems include software, operational organizations, tactics, drills and procedures, as well as any other testable matter that can affect operational effectiveness or suitability for use in an operational environment. (CFCD 124)

Operational Training (OT): Training that develops, maintains or improves the operational readiness of units, such as Work-Ups (WUPs) and Harbour Readiness Training (HRT). (NAVORD 4500-0)

Operations Order (Op O): An Op O is a directive, usually formal, issued by a commander to subordinate commanders for the purpose of effecting the coordinated execution of an operation. (CFJP 5.0)

Operational Planning Process (OPP): A coordinated staff process used by a commander to determine the best method of accomplishing assigned tasks and to direct the action necessary to accomplish the mission. (CFJP 5.0)

Operations Plan (OPLAN): An OPLAN is a mechanism that a commander uses to plan/prepare well in advance for a known upcoming operation for which the Government has specifically tasked the CF to prepare and execute. (CFJP 5.0)

Operational Readiness: The ability of a unit/formation, ship, weapon system or equipment to perform the missions or functions for which it is organized or designed. Note: May be used in a general sense or to express a level or degree of readiness. (DTB) The cumulative readiness assessment involving Personnel, Materiel and Collective Training (OR = PITR + MR + CTR). (CRCN SD&G)

Probability of Detection (P(d)): The probability that sensor functions required to search, detect and track objects are adequate to support effective control and engagement. (MOTEG)

Probability of Engagement (P(h)): The probability that the effector system hits the threat.

Probability of Kill (P(k)): The probability that the effector system destroys the threat, given a hit.
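Taken together, these three entries describe successive stages of a single engagement. On the usual assumption of conditional independence (an expository assumption, not part of the cited definitions), they compose multiplicatively into a single-shot engagement chain:

    P(kill) = P(d) \times P(h \mid d) \times P(k \mid h)

For example, with P(d) = 0.9, P(h) = 0.8 and P(k) = 0.7, the overall single-shot probability of kill would be 0.9 × 0.8 × 0.7 = 0.504.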

Replenishment at Sea (RAS): Those operations required to make a transfer of personnel and/or supplies when at sea. (AAP-6)

Rules of Engagement (ROE): Directives issued by competent military authority which specify the circumstances and limitations under which forces will initiate and/or continue combat engagement with other forces encountered. (DTB)

Scenario: The background story that describes the historical, political, military, economic, cultural, humanitarian and legal events and circumstances that have led to the current exercise crisis or conflict. The scenario is designed to support exercise and training objectives and, like the setting, can be real, fictionalized or synthetic as is appropriate. A scenario will be composed of specific modules essential to the accomplishment of the exercise objectives or of the seminar/academic/experiment objectives. (BI-SCD 075-003)

Sense: The operational function that provides the commander with knowledge. Note: This function incorporates all capabilities that collect and process data. (DTB)

Shield: The operational function that protects a force, its capabilities and its freedom of action. (DTB)

Situational Awareness (SA): The combined knowledge of friendly forces, hostile forces, the environment and other aspects of the battlespace. (DTB)

Special Operations (SOF): Military activities conducted by specially designated, organized, selected, trained and equipped forces using unconventional techniques and modes of employment. (AAP-6)

Standard: Quantitative or qualitative measures for specifying the levels of performance of a task. (CJCSM 3500.04F)

Surveillance: The systematic observation of aerospace, surface or subsurface areas, places, persons or things by visual, aural, electronic, photographic or other means. (DTB)

Sustain: The operational function that regenerates and maintains capabilities in support of operations. (DTB)

Table-Top Exercise (TTX): Also known as a seminar, symposium or facilitated discussion. TTXs are used to provide an opportunity for participants to discuss issues, events or vignettes in a low-stress environment with very few time constraints. They are designed to elicit constructive discussion by the participants as they attempt to examine and then resolve problems based on existing plans. Generally, they involve key senior personnel discussing hypothetical scenarios and are aimed at facilitating understanding of concepts and processes while identifying strengths and shortfalls. (CASC E3DG)

Task: An activity that contributes to the achievement of a mission. (DTB)

Technical Evaluation (TECHVAL): A test of the technical suitability of prototype or production equipment. (NAVORD 11900-2)

Training: An activity that aims to impart the skills, knowledge and attitudes required to perform assigned duties. Note: This is a generic term for all types of training such as professional development, collective and individual training. (DTB)

Transfer of Authority (TOA): Within NATO, an action by which a member nation or NATO Command gives operational command or control of designated forces to a NATO Command. (AAP-6)

Transfer of Command Authority (TOCA): The formal transfer of a specified degree of authority over forces assigned to an operation between commanders of supporting commands and the supported commander. (CFJP 5.0)

Validation: The confirmation of the capabilities and performance of organizations, individuals, materiel or systems to meet defined standards or criteria, through the provision of objective evidence. (DTB)

Wargaming: Wargaming is a disciplined process, with rules and steps, that attempts to visualize the flow of the operation given the friendly and the adversary's capabilities, strengths, weaknesses and force dispositions, as well as other situational and environmental considerations. (NWP 5-01)

Whole-of-Government Approach (WoG Approach): An integrated approach to a situation that incorporates diplomatic, military, and economic instruments of national power as required. (DTB)