
Learning Design Standard
Digital Performance Analysis

Version: 1.1

Table of contents

Revision history
Using the Learning Design Standards
Intellectual property and moral rights
Guidance for providers
Guidance for agencies
Setting the context
Jobs, roles and skills
Overview of Digital performance analysis
Target audience
Pathways to Digital performance analysis
Qualifications and certifications
Capabilities needed for Digital performance analysis
Relevant SFIA Skills
References
Key content areas
Unit 1. The role of performance analysis in the Australian Government digital service design context
Unit 2. Defining the scope of work and preparing the strategy
Unit 3. Optimisation and iteration of the digital service using performance analysis

29 November 2018 Building Digital Capability Program – Learning Design Standard – Digital Performance Analysis 2

Revision history

Date              Version  Contact          Content
9 April 2018      0.1      Grant Nicholson  Initial draft for PCB consideration
30 April 2018     0.2      Grant Nicholson  Incorporation of minor changes recommended by PCB
3 May 2018        0.3      Grant Nicholson  First exposure draft
20 June 2018      0.4      Ross McGuire     Added SFIA licensing explanation
21 June 2018      0.4      Ross McGuire     Added feedback from ACS
21 June 2018      1.0      Ross McGuire     Finalised document version for the DTA
30 November 2018  1.1      Grant Nicholson  Upgraded SFIA references to SFIA 7

Using the Learning Design Standards

The Australian Public Service Commission (APSC) has developed Learning Design Standards (LDS) to describe a capability needed by the Australian Public Service (APS) to help with the digital transformation of government services.

The LDS describes the context, business need, target audience, underpinning capabilities and curriculum for these capabilities. It does not prescribe or mandate a specific learning solution or format to build the capability described. That is left open for providers and sellers to design solutions that meet the specific needs of individual agencies.

This document is for:

• Providers and sellers seeking to work with APS agencies, to understand the needs of the APS when developing and marketing products.

• APS agencies seeking to build capability, to inform their learning & development planning, program development and approaches to market for learning solutions.

All queries relating to this standard should be directed to [email protected].

Intellectual property and moral rights

Intellectual property in parts of these materials may be owned by the Skills Framework for the Information Age (SFIA) Foundation.

The Australian Public Service Commission (APSC) holds an extended public sector licence on behalf of all Australian Public Service (APS) agencies covered by the Public Service Act 1999 (PS Act) for the use of SFIA materials. This licence permits certain uses of SFIA materials, including the creation and internal distribution of products and services derived from or using significant extracts of SFIA materials. The licence does not extend to commercial use of the materials and does not cover Commonwealth bodies other than agencies under the PS Act.

These materials may only be used by APS agencies in accordance with the terms of the extended public sector licence granted to the APSC.  No other uses of these materials are permitted.  For more information on the APSC SFIA licence visit the APSC SFIA webpage.

The opportunity

The Australian Government is modernising the way it delivers services to citizens. ‘Digital by default’ is the guiding principle. This means many APS agencies will need to engage multidisciplinary teams in the design, development and implementation of digital services as defined in the Digital Service Standard. Digital performance analysis has been identified as a key skill that will be in high demand for the APS workforce to transform service delivery.


Guidance for providers

Good learning design

When proposing or developing a solution, it is important to be consistent with contemporary instructional design practices. Adult learning is a continuous process that is not limited to the classroom or formal training activities. Good learning design leverages the ways adults learn all the time through a range of experiences.

The diagram below shows some elements that you could include in a learning program.


Figure 1 - Pathways to learning

Learning environment

The APS is made up of many different departments and agencies. Each may have their own:

• culture
• business needs
• technical platforms
• geographic dispersion
• existing level of digital capability and maturity

If your learning solution is intended for broad use across the APS, you need to consider how it would apply in different contexts. Any digital solutions you develop must be deployable on a wide range of platforms.

Standards of compliance

The APS will require all digital learning solutions to be compatible with the following standards:

• Digital Transformation Agency (DTA) Digital Service Standard
• Web Content Accessibility Guidelines (WCAG) 2.0, conformance level AA
• Australian Signals Directorate (ASD) Information Security Manual standards
• Learning Design Specification standard

Learning outcome assessment

Agency requirements for assessment may vary. Formative and/or summative assessment may be offered by the provider and should be specified by the agency when engaging providers.

Formative assessment - monitors learning and gives ongoing feedback. It is used by facilitators to improve their teaching, and by learners to improve their learning. The purpose is assessment FOR learning. Examples of formative assessments are:

• observations, conferences, questioning
• drawing concept maps, reflections
• self-evaluations and self-assessments

Summative assessment - evaluates the level of success or capability at the end of a learning activity, comparing it against some standard or benchmark. The purpose is assessment OF learning. Examples of summative assessments are:

• a midterm assessment or end-of-course test
• a final project
• a presentation or report

Guidance for agencies

Customising content

Agencies may extend, reduce or change the content of this LDS.

Agencies should highlight these changes so that providers can readily adapt their learning solutions to meet the agency's needs.

Setting the context

Building the digital capability of the Australian Public Service

The Australian Federal Government is progressing a digital transformation agenda to revolutionise the way it delivers services. Australians are more mobile, more connected and more reliant on technology than ever before. For this reason the Digital Transformation Agency (DTA) is leading this transformation in order to improve how the Australian Government delivers services online.

As part of the digital transformation agenda, the APSC and the DTA are jointly delivering the Building Digital Capability Program. One of the main activities of this program is the identification of digital capability shortfalls and the definition of learning programs to build capability in those areas.


The Digital Service Standard

The Digital Transformation Agency guides government service modernisation through the Digital Service Standard. The Standard helps digital teams to build services that are simple, clear and fast.


The multidisciplinary Digital Delivery Team

The Digital Service Standard suggests the ideal multidisciplinary team to design, build, operate and iterate a digital service. This team includes core (permanent) roles as well as extended roles that you can bring into the team when needed. People may perform one or many roles, depending on their capability and the workload.

The capabilities defined by the Learning Design Standards relate to the roles in a digital delivery team. An agency will be able to use the LDS to define an effective team that meets their specific agency requirements for digital transformation.

Figure 2 - The Digital Delivery Team

Jobs, roles and skills

Members of multidisciplinary teams may perform many roles in their jobs. Each role has expectations of skill, behaviours and knowledge. You can verify these through relevant qualifications and certifications.

Figure 3 - Role Composition

This Learning Design Standard only addresses learning outcomes for professional skills and knowledge. A person who has done training also needs to put it into practice. This allows them to gain experience and become effective. Individual agencies will determine how they manage experience.

Providers may wish to provide certifications that verify the learning outcomes specified in this LDS, but these are not mandated. It is up to individual agencies to decide if they want certification.

Individual agencies will define jobs according to their needs. Jobs may involve one role only, though it is becoming more common for multidisciplinary teams to have job fluidity. Members may perform many roles according to their capabilities and the needs of the team.


Overview of Digital performance analysis

Digital performance analysis takes a user-centered approach to improving government services, encompassing a variety of investigative methods. The objective is to improve digital services so that interactions are increasingly seamless, quick and user-friendly.

The role of a Performance analyst is to maintain, manage and review service performance data, identifying key trends and then clearly articulating recommendations to stakeholders to improve the service performance. Performance analysts drive a continuous improvement culture, supporting effective business decision-making in order to deliver simple, easy-to-use services and products for Australian citizens.

The Performance analyst will determine the relevant data pertaining to the services in question from a variety of key analytics, data sources and stakeholders. They will conduct analysis and interpret the results, combining business acumen with contemporary software, and report on relevant trends and performance in order to recommend improvements to product and service performance.

Performance analysts provide data reporting that is open and accessible to the Australian public through the DTA’s Performance Dashboard. This public sharing of performance data underpins the Australian Federal Government’s Open Data Policy.

The evolution of the Digital performance analysis discipline

The landscape of technology available to analyse digital services has changed markedly in recent times. Software applications available for data modelling have become faster, more intuitive and more user-friendly, and with the increase in user-centered software development and the advent of internet-based business intelligence software, the typical Performance analyst is no longer always a purely technical resource.

Historically, a typical data analyst would have a background in mathematics, statistics or computer science disciplines and would apply the modelling skills to the business context. While in many instances data modelling is still performed in this manner today, we are experiencing a fundamental shift in the way data is accessed and analysed, and made available to both decision-makers and the public, thanks to new digital technologies.

Applications such as Tableau, Google Analytics and Klipfolio do not require sophisticated data handling skills, and so we are seeing a shift in the capabilities required for Performance analysis from purely technical skills towards business-orientated skills such as business case preparation, data visualisation, stakeholder management, entrepreneurial skills, communication and business strategy.


Target audience

Primary

The primary audience for this capability is APS employees with academic qualifications in relevant data-related disciplines who are seeking to apply and further extend their skills in a digital service delivery team in the Australian Government context. They are likely to have a background in Economics, Finance, Mathematics, Business, Statistics or Computer science.

Secondary

Employees within a multidisciplinary product or service delivery team performing related activities to develop and improve the user experience by applying data analysis. They are likely to have a background in Business, Marketing, Strategy or related disciplines.

Pathways to Digital performance analysis

Everybody has a different work history and career path. The following are some of the more common roles people may have held before coming to the current role:

• Business analyst
• Data analyst
• APS front line roles
• Customer service
• Corporate reporting
• Systems design
• Digital media
• Governance
• Internal audit


Qualifications and certifications

The following qualifications are relevant to the capability described in this LDS:

• Mathematics
• Communications
• Statistics
• Computer science
• Engineering
• Finance
• Business
• Government
• Marketing
• Economics
• Multimedia

Completion of the online Google Analytics Academy Courses is common for Performance Analysts and highly recommended as foundational to any web-based analytics work.


Capabilities needed for Digital performance analysis

The skills, knowledge and attributes listed below are the minimum needed for someone to be effective in this role. A person undertaking the learning defined by this LDS should have the knowledge and skills described below after finishing the learning. They may need workplace experience to embed the learning and become effective.

Knowledge:

Organisational context
• The Australian Government Digital Service Standard
• Future data trends (eg Big Data, real-time data management)
• Policies and procedures relating to the business being analysed

Methodologies, procedures and standards
• Business analysis techniques
• Business reporting techniques
• Segmentation techniques
• User journeys
• Web optimisation techniques

Tools
• Analytics tools and software applications - such as Cognos, SAS VA, IBM Watson
• Web analytics
• Data mashing
• Data assembly

Technology
• Multi-dimensional and relational database modelling, analysis, concepts and structures

Theory and theoretical
• Observational research (eg usability)

Principles
• Process flows
• User centered design principles

Concepts
• Analytics techniques

Skills:

Technical
• Advanced spreadsheet and database application skills
• Building reports
• Manipulating and visualising data
• Interpreting data
• Modelling data

Analysis, synthesis & evaluation
• Analysing data trends
• Defining measures and metrics
• Concept linking and mapping
• Converting data to actions
• Researching

Communication
• Relationship and stakeholder management
• Communication skills, written and spoken
• Running workshops
• Storytelling, drawing together a multi-lens view of a problem

Digital
• Analytical

Relationships and interpersonal
• Advocacy

Attributes:

Professional
• Accuracy
• Entrepreneurial
• Lateral thinking
• Problem solving

Personal
• An inquiring mind
• Attention to detail
• Creative thinking
• Curiosity
• Empathy with the user
• Explorer
• Persistence

Relevant SFIA Skills

The Skills Framework for the Information Age (SFIA) is a global standard that defines digital and other ICT-related skills. A person possessing the following SFIA skills at the levels indicated would be capable of performing the role described by this standard.

Code  Skill          Applicable Levels  Caveats*
INAN  Analytics      5
DTAN  Data analysis  3                  Focused on the analysis of performance metrics & metadata rather than business data structures

*Caveats are identified components of a SFIA skill that are not explicitly required for the current role. For the purpose of this Learning Design Standard the SFIA description should be read as though the caveated components were not included in the SFIA skill description.


References DTA Guide to meeting the Digital Service Standard

UK GDS Service Manual

Office of the Australian Information Commissioner - About information policy

DTA Digital Service Standard

University of North Carolina: What is information policy

Dept. of Finance publication: Getting on with Government 2.0

Department of Finance Declaration of Open Government

UC Institute for Governance and Policy Analysis - Recent Reports

Smashing Magazine - Data-Driven Design In The Real World

Google Analytics Academy

Performance Analysis

Blog: How to choose the right UX metrics for your product

Blog: How Google sets goals OKRs


Key content areas

The following table outlines the content areas that need to be addressed.

Unit = area of learning. Topic = component of area of learning.

Unit 1. The role of performance analysis in the Australian Government digital service design context

Learning objective: Describe the context and purpose of performance analysis in the measurement of government services

1.1 Transforming government digital service delivery

Topic learning objectives:
• Define the Australian Government context for digital service delivery

Critical content:
1. The Australian Government’s Digital Transformation Agenda
   • review the Digital Service Standard
2. Performance analysis in the digital transformation of government services in the APS

1.2 Information handling in government services

Topic learning objectives:
• Describe the concept of information policy
• Explain why performance analysis is relevant to an open and transparent government
• Describe how performance measurement is integral to meeting the Digital Service Standard criteria

Critical content:
1. The handling of public sector information (PSI)
   • principles of privacy, confidentiality, security and classification
2. The importance of a participatory and informed public
   • how to efficiently collect, use and manage public sector information
   • make public sector information more readily and freely available to the public
3. Give citizens new opportunities to engage in, and develop, Australian Government processes and policies that are more effective and achieve better outcomes
4. Government 2.0 in the UK, NZ and USA
   • explore international approaches to data handling in other government systems
5. The role of the Office of the Australian Information Commissioner (OAIC) relating to privacy and freedom of information
   • research and list the key factors relating to the handling of data in:
     o the Privacy Act 1988
     o the Freedom of Information Act 1982
     o the Australian Information Commissioner Act 2010
6. Measure performance against KPIs set out in the Digital Service Standard (the Standard) guides
   • identify how the Standard uses mandatory data requirements to achieve its objectives
7. Reporting on the public dashboard
   • locate and define key services reported on the Australian Performance Dashboard website

1.3 Performance analysis overview

Topic learning objectives:
• Describe the various applications of Performance analysis in the broader business context

Critical content:
1. Other applications of performance analysis
   • sport psychology
   • human resources
   • financial analysis
   • business analytics
   • marketing analytics
   • business intelligence
2. Other forms of performance analysis in government:
   • benefits analysis and reporting
   • governance structures

1.4 The purpose of Performance analysis in government services

Topic learning objectives:
• Define the underlying reasons why Performance analysis is critical to digital service success in government

Critical content:
1. How transparency generates a greater sense of public accountability
2. How reporting generates better outcomes for users


Unit 2. Defining the scope of work and preparing the strategy

Learning objective: Determine the scope and approach for identifying the appropriate analysis activities.

2.1 A detailed look at performance measurement in the Australian Government’s Digital Service Standard

Topic learning objectives:
• Identify the key performance indicators (KPIs) for a digital service
• Describe the mandatory measurement criteria for the Digital Service Standard

Critical content:
1. The importance of key performance indicators in performance analysis
   • what is a KPI?
   • writing a KPI
   • when KPIs are not helpful to solving the business problem
2. The mandatory key performance indicators of success for digital service delivery as defined in the Digital Service Standard:
   • user satisfaction - to help continually improve the user experience of the service
   • digital take-up - to show how many people are using the service and to help encourage users to choose the digital service
   • completion rate - to show which parts of the service experience ‘drop-off’ and to identify areas for improvement
   • cost per transaction - to make the service more cost efficient
3. The difference between KPIs in procurement or program delivery vs KPIs in performance analysis
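
The four mandatory KPIs above are simple ratios over a reporting period's transaction data. The sketch below is a minimal illustration only; the record fields and figures are hypothetical and not prescribed by the Standard:

```python
# Sketch: computing the four mandatory Digital Service Standard KPIs
# from hypothetical transaction records. Field names are illustrative.

def mandatory_kpis(transactions, operating_cost):
    """Return the four mandatory KPIs for a reporting period.

    transactions: list of dicts with keys
        'channel'   - 'digital' or 'other'
        'completed' - True if the user finished the transaction
        'rating'    - satisfaction score 1-5, or None if not surveyed
    operating_cost: total cost of running the service for the period
    """
    total = len(transactions)
    digital = [t for t in transactions if t["channel"] == "digital"]
    completed = [t for t in digital if t["completed"]]
    rated = [t["rating"] for t in transactions if t["rating"] is not None]

    return {
        # proportion of surveyed users who rate the service 4 or 5 out of 5
        "user_satisfaction": sum(1 for r in rated if r >= 4) / len(rated),
        # share of all transactions done through the digital channel
        "digital_take_up": len(digital) / total,
        # share of started digital transactions that were finished
        "completion_rate": len(completed) / len(digital),
        # average cost of one completed transaction
        "cost_per_transaction": operating_cost / len(completed),
    }

sample = [
    {"channel": "digital", "completed": True, "rating": 5},
    {"channel": "digital", "completed": True, "rating": 4},
    {"channel": "digital", "completed": False, "rating": 2},
    {"channel": "other", "completed": True, "rating": None},
]
kpis = mandatory_kpis(sample, operating_cost=120.0)
```

In practice each KPI has agreed collection rules (for example, how satisfaction is surveyed), so the exact denominators would follow the Standard's guidance rather than this sketch.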

2.2 Understanding the data landscape and context before you begin

Topic learning objectives:
• Benchmark existing data sources for the service

Critical content:
1. Define the user problem the service is solving, or the opportunity
   • understanding the brief
   • understanding expectations of the business
   • working with user research
   • understanding user-centered design processes
2. Identify stakeholders
   • know your project sponsor
   • locate stakeholders in IT and business
   • negotiate access to data sources with your stakeholders
3. Access the project charter and other governance documents to clarify project objectives and goals
4. Craft additional KPIs that influence business impact
5. Explore the data that is already available for an existing service, where it is kept and how you might access and use it, and also share your own insights
6. Collect baseline data for the service operation in all of its channels
7. Avoid duplicating effort by identifying data that already exists
8. Evaluate data frequency to determine data quality and usefulness
9. Estimate the number of people you expect to use the service

2.3 The importance of preparing the performance framework approach for project success

Topic learning objectives:
• Create a performance framework
• Hypothesise a set of variables to measure that will help drive and improve the service, in order to help identify existing data sources

Critical content:
1. Best practice performance framework approaches for digital services
   • waterfall, agile or hybrid documentation approach
   • five pillars of strategy execution
2. Benefits for service delivery to be measured, including:
   • track costs from the start of the project
   • articulate the benefits the service is expected to provide as early as possible
   • establish baseline data for each of the service’s touchpoints
   • identify the goals, signals and metrics that will need to be tracked and monitored to confirm the articulated benefits are being realised
   • cost the time it takes for your users to complete the process before and after your redesign, if you are replacing an existing service
   • benefits roadmap or value stream: what might be realised and when
   • confirm that the benefits you have articulated can be realised with your various prototypes
3. Create a performance framework outlining your objectives, hypotheses and what metrics the team will use to demonstrate success. Include:
   • goals
   • audience segmentation
   • metrics - lagging metrics and leading metrics
   • experience
   • actions you want to influence/improve
   • stakeholders
   • barriers (e.g. culture, existing or conflicting KPIs, data quality)
   • Google’s ‘OKR’ framework
4. Creating excellence with ambitious and quantifiable objectives and key results in your work
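
A performance framework of this kind can be kept as structured data so the team's goals, hypotheses and metrics are easy to share and review. The sketch below is one hypothetical way to record it; the class and field names mirror the checklist above and are illustrative only:

```python
# Sketch: a performance framework recorded as structured data.
# Names and example values are illustrative, not a prescribed format.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    kind: str      # "leading" or "lagging"
    target: float  # value that would demonstrate success

@dataclass
class PerformanceFramework:
    goal: str        # what the service should achieve
    hypothesis: str  # what we believe will move the goal
    audience_segments: list = field(default_factory=list)
    metrics: list = field(default_factory=list)
    barriers: list = field(default_factory=list)  # e.g. data quality

    def leading_metrics(self):
        """Metrics the team can move week to week."""
        return [m for m in self.metrics if m.kind == "leading"]

framework = PerformanceFramework(
    goal="Users complete the claim form online without phone support",
    hypothesis="Clearer error messages will raise the completion rate",
    audience_segments=["first-time users", "returning users"],
    metrics=[
        Metric("error rate per page", "leading", 0.02),
        Metric("completion rate", "lagging", 0.85),
    ],
    barriers=["inconsistent event tracking across pages"],
)
```

Keeping the framework in code (or any structured form such as a spreadsheet) makes it straightforward to check each reporting cycle against the stated hypotheses and targets.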

2.4 Conceive the dashboard

Topic learning objectives:
• Prototype a performance analysis dashboard

Critical content:
1. Methods for data display for each element in the performance framework
2. Envisaging the dashboard(s) with the elements as defined, including the four mandatory KPIs from the Standard
3. Designing dashboards that inspire action
4. Paper prototyping the dashboard

2.5 Engage stakeholders in the dashboard design

Topic learning objectives:
• Socialise the performance framework and dashboard design

Critical content:
1. Share and test with stakeholders
2. The importance of continually seeking feedback from senior stakeholders on the dashboard design and its effectiveness for informing decisions

2.6 Data sources, types and structures for performance analysis

Topic learning objectives:
• Explore database structures and systems
• Identify alternative types of data and the source of each
• Segment the data for logical and detailed user-centric enquiry

Critical content:
1. Using qualitative and quantitative data
   • understand that empirical data in performance analysis is both quantitative and qualitative (and qualitative data can still be numerical or non-numerical)
   • working with user data
2. Data types and database structures
3. System generated data
   • compare multi-dimensional databases with relational databases
   • the benefits of pre-calculated data
4. Online analytical processing (OLAP)
   • working with developers to define cubes
   • creating pivot tables and pivot reports
   • reviewing key concepts:
     o measures (such as sales, revenue)
     o dimensions (such as time, geography)
     o calculated members (where measures are calculated, for example, profit)
     o expressions and queries
     o hierarchies and levels
5. In-site search data
6. Site analytics
7. Social platform analytics and comments
8. Click stream data
9. Usability testing data
10. User research data
11. Site comments and feedback on pages (contextual and non-contextual)
12. Data-driven design methods:
   • understand the Google HEART framework
   • define the Goals-Signals-Metrics framework
13. The importance of isolating variables
   • identify segments for cluster analysis
   • determine personas
   • methods of acquisition:
     o anticipate how visitors will arrive at your service
     o create clusters for owned, earned and paid sources
     o determine desired and actual behaviour with user journeys
     o determine site architecture and how it relates to user journeys
     o identify the outcomes you want the user clusters to achieve
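
The OLAP concepts in item 4 (measures, dimensions and calculated members) map directly onto pivot tables. A small sketch using pandas with made-up service data; the figures and column names are illustrative, and the sketch assumes pandas is available:

```python
# Sketch: OLAP-style measures and dimensions as a pandas pivot table.
# The data is hypothetical; in practice it would come from site analytics.
import pandas as pd

hits = pd.DataFrame({
    "month":       ["Jan", "Jan", "Feb", "Feb"],  # dimension: time
    "state":       ["NSW", "VIC", "NSW", "VIC"],  # dimension: geography
    "visits":      [1200, 900, 1500, 950],        # measure
    "completions": [840, 630, 1200, 760],         # measure
})

# Pivot the measures across the two dimensions, like a cube slice.
cube = pd.pivot_table(
    hits,
    values=["visits", "completions"],
    index="month",
    columns="state",
    aggfunc="sum",
)

# A "calculated member": completion rate derived from the base measures.
hits["completion_rate"] = hits["completions"] / hits["visits"]
```

The same idea scales up: a BI tool's cube is essentially this pivot over many more dimensions, with the calculated members defined once and reused across reports.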

2.7 Performance analysis methods

Topic learning objectives:
• Select performance analysis methods for a working beta service

Critical content:
1. Data inputs and format
   • establish a common language for data inputs
2. Tools and software selection
   • determine the analytics tool to be used
   • select business intelligence (BI) software
   • determine the database management systems (DBMS) to be used

2.8 Google Analytics

Topic learning objectives:
• Implement the Google Analytics (GA) account for site user data

Note: this is broadly the online GA training program freely available at analytics.google.com

Critical content:
1. Overview of Google Analytics
   • becoming GA certified
   • create your GA account
   • orientate yourself to key features and navigation
   • set your site up in GA
2. Data collection and processing
   • categorising into users and sessions
   • applying configuration settings
   • storing data and generating reports
   • creating a measurement plan
3. Setting up data collection and configuration
   • organise your analytics account
   • set up advanced filters on views
   • create your own custom dimensions
   • create your own custom metrics
   • understand user behaviour with event tracking
4. Advanced analysis tools and techniques
   • segment data for insight
   • analyse data by channel
   • analyse data by audience
   • analyse data with custom reports
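
The "categorising into users and sessions" step is worth seeing concretely: analytics tools conventionally close a session after a period of inactivity (30 minutes is the common GA default). A minimal sketch of that rule, with hypothetical timestamps:

```python
# Sketch: grouping one user's page hits into sessions using the common
# 30-minutes-of-inactivity rule. Timestamps are illustrative, in seconds.

SESSION_GAP = 30 * 60  # 30 minutes of inactivity ends a session

def sessionise(hit_times):
    """Split a list of hit timestamps into sessions.

    A new session starts whenever the gap since the previous hit
    exceeds SESSION_GAP. Input need not be sorted.
    """
    sessions = []
    current = []
    for t in sorted(hit_times):
        if current and t - current[-1] > SESSION_GAP:
            sessions.append(current)
            current = []
        current.append(t)
    if current:
        sessions.append(current)
    return sessions

# Three hits close together, then one 40 minutes later: two sessions.
hits = [0, 120, 600, 600 + 40 * 60]
sessions = sessionise(hits)
```

Real analytics platforms add further rules (for example, ending sessions at midnight or on campaign change), but the inactivity cut-off above is the core of the idea.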

2.9 Data maintenance planning

Topic learning objectives:
• Consider how data will be kept clean and secure throughout the program

Critical content:
1. Data integrity considerations
   • identify tools for testing integrity
   • prepare a testing plan
2. Data security considerations
   • identify the key principles affecting security as outlined in the principles of the Australian Government Information Security Manual
   • prepare a security plan
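
An integrity testing plan can start very simply: automated checks for completeness, uniqueness and validity, run before each reporting cycle. A hypothetical sketch; the rules and field names are illustrative only:

```python
# Sketch: simple automated data-integrity checks of the kind a testing
# plan might run before reporting. Rules and fields are illustrative.

def check_integrity(records):
    """Return a list of human-readable integrity problems found."""
    problems = []
    seen_ids = set()
    for i, r in enumerate(records):
        # completeness: every record needs an id and a timestamp
        if r.get("id") is None or r.get("timestamp") is None:
            problems.append(f"record {i}: missing id or timestamp")
            continue
        # uniqueness: duplicate ids suggest double-counted transactions
        if r["id"] in seen_ids:
            problems.append(f"record {i}: duplicate id {r['id']}")
        seen_ids.add(r["id"])
        # validity: satisfaction ratings must be in the 1-5 range
        rating = r.get("rating")
        if rating is not None and not 1 <= rating <= 5:
            problems.append(f"record {i}: rating {rating} out of range")
    return problems

good = {"id": 1, "timestamp": 1700000000, "rating": 4}
dupe = {"id": 1, "timestamp": 1700000100, "rating": 9}
issues = check_integrity([good, dupe])
```

The value is less in any single rule than in running the same checks every cycle, so a drop in data quality is caught before it reaches the dashboard.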

2.10 The scoping document

Topic learning objectives:
• Bring your findings and approach together in a scope of work to articulate your strategy

Critical content:
1. Prepare your scope of work
2. Share your hypotheses
3. Validate your work with stakeholders before you begin

Unit 3. Optimisation and iteration of the digital service using performance analysis

Learning objective: Execute performance analysis to leverage data insights in order to improve a digital product or service

3.1 The Australian Government Digital Dashboard

Topic learning objectives:
• Collaborate with the Digital Transformation Agency to share your data to the dashboard

Critical content:
1. Identify stakeholders for dashboard collaboration and creation
2. Prepare the data for sharing
3. Deliver mandatory and additional data
4. Benchmark results and movement

3.2 Data interpretation and reporting

Topic learning objectives:
• Define the current state and set the status

Critical content:
1. Review data output at regular intervals
   • map the findings
2. Identify trends and movement in metrics in the current state
3. Report regularly on key data
   • population of key performance and asset-based data into weekly, monthly and quarterly reporting templates
   • performance attribution versus benchmarks
   • validate your findings against your hypothesis
4. Prepare a gap analysis on key data related to goals
5. Service optimisation
6. Develop small user groups for site optimisation
7. Work with the service delivery team to influence the backlog
8. Test your recommendations
9. Alternative testing methods
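
Identifying movement in a metric and preparing a gap analysis against a goal can be illustrated with a smoothed series compared to a target. A minimal sketch with made-up weekly completion-rate figures:

```python
# Sketch: spotting movement in a weekly metric and its gap to target.
# The series and target values are illustrative.

def moving_average(series, window=4):
    """Trailing moving average; uses a shorter window at the start."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i + 1 - window):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def gap_to_target(series, target):
    """Gap between the latest smoothed value and the goal's target."""
    return target - moving_average(series)[-1]

# Hypothetical weekly completion rates for a service in beta.
completion_rate = [0.71, 0.73, 0.74, 0.78, 0.80, 0.81]
gap = gap_to_target(completion_rate, target=0.85)
```

Smoothing first keeps one noisy week from dominating the status report; the remaining gap is what feeds the gap analysis and the recommendations that follow.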

3.3 Develop recommendations and stabilise performance workflow

Topic learning objectives:
• Shape performance analysis information into strategic and tactical plans that deliver the agreed business objectives
• Present your findings to stakeholders at regular intervals

Critical content:
1. Explore techniques to automate and enhance performance reporting, including predictive analytics and leveraging better practice
2. Prepare a strategic plan within an executive overview for service improvement
   • set the status: is there a problem? what is the opportunity?
   • develop your business case
   • establish a case for buy-in or investment
   • getting the red light or the green light to go ahead
   • share and present your recommendations
   • peer reviewing your work
3. Socialising your plan
4. Drive and influence the service improvement program
5. Identify data “blind spots”
   • developing an entrepreneurial mindset to identify opportunities for innovation
6. Address legacy systems
   • explore retiring legacy systems and replacing manual processes
7. Address ongoing improvement
   • commissioning additional research or accessing additional data
   • work in collaboration with the team to keep performance policy and procedures continually updated

3.4 Business as usual

Topic learning objectives:
• Return your work to the business owners

Critical content:
1. Define findings and recommendations
   • are these achievable for the business?
   • is the data ready?
2. The post-implementation review
3. Prepare for handover
   • who is the decision maker?
   • how will this data be implemented and measured going forward?
4. Hand the project over to the business
