
Quality Manual Chapters 1 and 2 - ABS Internal Quality Manual, hosted by the United Nations Statistics Division (unstats.un.org/unsd/dnss/QAF_comments/ABS_Internal_Quality_Man…)



Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Document Version: 2005/01 (Final)
Last Updated: 09 Jun 2009
Chapter No. & Title: 01. Introduction
Status: *** Final ***
Comments: Not yet available
Area Resp. for Updating: Methodology Division
Contact: Melissa Gare, (02) 6252 7147, Methodology and Data Management Division

To be notified when a change to this document is published, register yourself, your workgroup or your workgroup database email address in the table below. View: (LookupStakeholders). Click in the left-hand margin to select an existing entry; press F9 if your registration doesn't appear automatically.

Stakeholders

The documents linked here: offer guidance on related matters; when changed, will demand a review and possible updating of this document; and/or will need to be reviewed, and perhaps updated, as a consequence of changes to these documents.

Links: Policy | Standards | Definitions | Procedures | Other links

Last modified by Narrisa Gilbert on 09/06/2009 05:57 PM.

Quality in the ABS

1 The ABS has a reputation for, and a culture of, providing a high-quality national statistical service. This reputation relies on the use of well-defined, transparent and appropriate quality assurance measures and processes. Quality plays a significant role in the ABS. Central to our existence is the ABS mission of "leading a high quality, objective and responsive national statistical service".

2 This manual on quality has been developed to assist ABS staff in advancing the mission. It is an evolving project and will be updated as further information and tools are developed.

3 We seek to improve quality and quality assurance measures, and to use information about data quality to manage and improve our statistical processes. By doing so we can increase the visibility of the quality of data, both internally and externally, and publish and promote guidelines and frameworks about quality.

Purpose of the Manual

4 This manual has been developed to expand ABS capability for improving the quality assurance of ABS products and to improve the accessibility of information about quality for the use of ABS staff.

5 Methodology Division is also working to expand the ABS's capability for quality assuring statistical collections by developing tools, processes and guidelines for managing quality. These will be added to the manual as they become available.

Structure of the Manual

6 The Quality Manual provides general information about quality in the ABS as well as providing specific information relating to statistical collections, sourcing statistics from administrative data and analytical products. Links are provided where relationships exist between information.

8 Chapter 2, Quality Guidelines, contains information that is relevant regardless of whether you are dealing with a statistical collection, an administrative data source or an analytical product. The information on quality is of a more general nature, including the ABS Quality Framework and information on continuous improvement. It introduces the concept of quality gates, as well as being a repository for relevant ABS papers on quality.

9 Chapter 3, Statistical Collections provides a range of information on quality


relevant to staff in statistical collection areas. Statistical risk mitigation is discussed, and a template is provided to assist in planning for mitigating this type of risk throughout the change process. Further risk mitigation is provided through 'Tips and Traps' to assist staff in identifying, monitoring and mitigating statistical risk for various components of the end-to-end (e2e) statistical process. A range of quality measures to assist in monitoring quality, including the Quality Infrastructure System (QIS), is presented. Quality gates as they apply to statistical collections are discussed. This chapter also includes a Quality Incident Response Plan, which is a plan for resolving serious doubts about key statistical results.

10 Quality of statistics sourced from administrative data is discussed in Chapter 4. The chapter covers all elements of sourcing data from administrative collections, from relationship building to quality gates within this process.

11 Chapter 5, Analytical Products, looks at specific products from Economic Accounts, Price Indices, small area estimation (models), data linking and demographic estimation. Quality gates will be looked at under each of these sub-sections.

12 The appendices include various templates and tools to assist in improving and monitoring quality.


Quality Manual : Chapter 02

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Document Version: 2005/02 (Final)
Last Updated: 19 Jul 2007
Chapter No. & Title: 02. Quality Guidelines
Status: *** Final ***
Area Resp. for Updating: Methodology Division
Contact: Melissa Gare, (02) 6252 7147, Methodology and Data Management Division


Links: Policy | Standards | Definitions | Procedures | Other links

Last modified by Narrisa Gilbert on 19/07/2007 10:21 AM (Comments field changed: 17 characters deleted).

Quality Guidelines

1 This chapter contains information about quality that is relevant regardless of whether you are dealing with a statistical collection, an administrative data source or an analytical product. The information on quality is of a more general nature, including the ABS Quality Framework and information on continuous improvement. It introduces the concept of quality gates, as well as being a repository for ABS papers on quality. As each section is loaded to the manual, links will be provided in this document.

2 Section 01 discusses the ABS Quality Framework, defining its elements and providing some examples of current and possible uses.

3 The ABS approach to Continuous Quality Management is discussed in Section 02.

4 The concept of Quality Gates is introduced in Section 03. These are discussed in general terms, and links will be provided to relevant sub-sections and product uses, e.g. the use of Quality Gates in Statistical Collections. Passing a 'quality gate' enables the product (in terms of the ABS statistical process this could be a frame, sample, benchmark file or set of estimates) to move on to the next stage of the process.
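The gate idea can be sketched in code. This is an illustrative example only: the check names, fields and thresholds below are invented for the sketch and are not ABS systems or rules.

```python
# Minimal sketch of a quality gate: a named set of checks that a statistical
# product (frame, sample, benchmark file, set of estimates) must pass before
# moving to the next stage. All names and thresholds here are illustrative.

def passes_gate(product, checks):
    """Return True only if every check accepts the product."""
    return all(check(product) for check in checks)

# Example: a gate on a set of estimates before release (figures invented).
estimates = {"employed_total": 11_503_000, "relative_standard_error": 1.8}

gate_checks = [
    lambda p: p["employed_total"] > 0,              # sanity: positive level
    lambda p: p["relative_standard_error"] < 25.0,  # RSE below a cutoff
]

print(passes_gate(estimates, gate_checks))  # True: product may move on
```

A product failing any single check is held at the gate, which matches the intent described above: nothing moves to the next stage of the process until the gate is satisfied.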

5 Section 04 contains a range of ABS papers and research on quality. These papers are brought together in one place to provide a complete picture of quality in the ABS. Where it is more appropriate for papers to be placed elsewhere, links will be provided.

6 The contents of Section 04 are:

Allen, B., Qualifying Quality - A Framework for Supporting Quality Informed Decisions, 2002.

Brackstone, G., Managing Data Quality in a Statistical Agency, Survey Methodology, Vol. 25, No. 2, Statistics Canada, 1999. Accessible from http://www.nso.go.kr/sqs2000/canada-attachment.pdf

Carson, C., Toward a Framework for Assessing Data Quality, IMF Working Paper WP/01/25, International Monetary Fund, Washington, 2001. Accessible from http://www.imf.org/external/pubs/ft/wp/2001/wp0125.pdf


Quality Manual : Section 02-01

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Document Version: 07/12/2007 (Final)
Last Updated: 11 May 2009
Chapter No. & Title: 02. Quality Guidelines
Section No. & Title: 01. ABS Data Quality Framework
Status: *** Final ***
Contact: Melissa Gare, (02) 6252 7147, Methodology and Data Management Division


Links: Policy | Standards | Definitions | Procedures | Other links

Last modified by Narrisa Gilbert on 11/05/2009 02:55 PM (Body field changed: 497 characters added).

02-01 ABS Data Quality Framework

1 The ABS Data Quality Framework (DQF) is based on two existing frameworks: one developed by Statistics Canada, and the European Statistics Code of Practice. The Statistics Canada framework identified six dimensions of data quality (Relevance, Timeliness, Accuracy, Coherence, Interpretability and Accessibility), and the European Statistics Code of Practice identified another essential aspect of data quality (Institutional Environment). The ABS DQF serves multiple purposes.

2 The ABS DQF can help in the framing of a data need using the seven dimensions of data quality. This enables the objectives and requirements of collections to be set at the initial planning stages, and is a valuable tool when designing a new survey to ensure that requirements are met.

3 The ABS DQF can also aid in the evaluation of the fitness for purpose of survey and administrative data statistical releases.

4 The seven dimensions are: Institutional Environment, Relevance, Timeliness, Accuracy, Coherence, Interpretability, and Accessibility.

Institutional Environment refers to the institutional and organisational factors which may have a significant influence on the effectiveness and credibility of the agency producing statistics.

Examples of the information which may be contained within this dimension are: professional independence; mandate for data collection; adequacy of resources; quality commitment; statistical confidentiality; and impartiality and objectivity.

Relevance refers to how well the statistical product or release meets the needs of users in terms of the concept(s) measured and the population(s) represented.

Examples of the information which may be contained within this dimension are: scope and coverage; geography; and the main outputs produced, such as number of employed persons, cause of death, population count, etc.

Timeliness refers to the delay between the reference period (to which the data pertain) and the date on which the data become available, as well as the delay between the advertised date and the date at which the data become available (i.e., the actual release date). That is: when the data were collected; when they were released; and when they will be collected again.

Accuracy refers to the degree to which the data correctly describe the phenomena they were designed to measure.

For example: relative standard errors; impact of imputation; impact of confidentiality protections.

Coherence refers to the internal consistency of a statistical collection, product or release, as well as its comparability with other sources of information, within a broad analytical framework and over time.

This is compatibility with: previous releases; other products available; changes in definitions of data items over time; history of the collection; etc.

Interpretability refers to the availability of information to help provide insight into the data.

For example, the use of: seasonally adjusted data; trend data; and other metadata and information such as Concepts, Sources and Methods, Explanatory Notes, Information Papers, etc.

Accessibility refers to the ease of access to data by users, including the ease with which the existence of information can be ascertained, as well as the suitability of the form or medium through which the information can be accessed.

This includes the formats in which data are available, such as spreadsheets, data cubes, Confidentialised Unit Record Files (CURFs) and the Remote Access Data Laboratory (RADL), as well as requests for unpublished data.
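The seven dimensions above can be treated as a simple checklist when drafting quality documentation. The following is an illustrative sketch only, not an ABS system; the structure, function and draft content are invented for the example.

```python
# Illustrative sketch: the seven dimensions of the ABS Data Quality Framework
# as a checklist structure that a draft quality description could be
# compared against. Descriptions paraphrase the definitions above.

ABS_DQF_DIMENSIONS = {
    "Institutional Environment": "factors influencing the effectiveness and "
                                 "credibility of the producing agency",
    "Relevance": "how well the data meet the needs of users",
    "Timeliness": "delay between the reference period and data availability",
    "Accuracy": "degree to which data correctly describe the phenomena measured",
    "Coherence": "internal consistency and comparability with other sources",
    "Interpretability": "availability of information giving insight into the data",
    "Accessibility": "ease of access to the data by users",
}

def missing_dimensions(declaration: dict) -> list:
    """Dimensions of the DQF not yet addressed by a draft declaration."""
    return [d for d in ABS_DQF_DIMENSIONS if d not in declaration]

# A hypothetical partial draft covering only two dimensions.
draft = {"Relevance": "Covers all employed persons aged 15 and over.",
         "Timeliness": "Released about five weeks after the reference month."}

print(missing_dimensions(draft))
# -> ['Institutional Environment', 'Accuracy', 'Coherence',
#     'Interpretability', 'Accessibility']
```

Representing the framework as data rather than prose makes the completeness check mechanical: a draft is only complete when `missing_dimensions` returns an empty list.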

Quality Manual : Section 02-02

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Document Version: 12/2007 (Final)
Last Updated: 24 Dec 2008
Chapter No. & Title: 02. Quality Guidelines
Section No. & Title: 02. Quality Declaration User Guide
Status: *** Final ***
Area Resp. for Updating: Methodology Division
Contact: Melissa Gare, (02) 6252 7147, Methodology and Data Management Division


Links: Policy | Standards | Definitions | Procedures | Other links

Last modified by Narrisa Gilbert on 24/12/2008 11:20 AM (Body field changed: 82 characters added).


QUALITY DECLARATION USER GUIDE

CONTENTS

01 Overview Information
02 Example of how a Quality Declaration will look on the website
03 Writing standards for the website and Quality Declarations
04 Creating Quality Declarations using the Collection Management System (CMS) and Website Contents Approvals (WCA)
05 Seven Dimensions of the ABS Data Quality Framework
06 Institutional Environment: suggested content topics; example for Institutional Environment
07 Relevance: suggested content topics; questions to ask yourself; examples for Relevance
08 Timeliness: suggested content topics; questions to ask yourself; examples for Timeliness
09 Accuracy: suggested content topics; questions to ask yourself; examples for Accuracy
10 Coherence: suggested content topics; questions to ask yourself; examples for Coherence
11 Interpretability: suggested content topics; questions to ask yourself; examples for Interpretability
12 Accessibility: suggested content topics; questions to ask yourself; examples for Accessibility
Examples of Quality Declarations
13 Attachment 1: Census 2006 Quality Declaration
14 Attachment 2: National Accounts Quality Declaration
15 Attachment 3: Criminal Courts Australia Quality Declaration
16 Example of a Compendium Quality Declaration
17 CURF QDs - Guidelines and Example
20 User consultation feedback
21 Doclinks to useful information

01 Overview Information

Quality Declarations (QDs) contain information about the quality of a statistical release that helps users make an informed decision about whether the data are fit for their purpose. The information will be presented electronically on the ABS website to help users become better informed about the data they are using. QDs will be a 'click' away from the information that the user is accessing.

The QD will be versatile enough to cater for every author area's needs, in that if information is already available on the website, the author area can provide a link to this existing information from within the QD. If the information is not currently on the website, the QD allows the creation of not only a top layer of information containing the seven dimensions of the ABS data quality framework, but also enables linking to a second layer for each of the dimensions if the author area wishes to go into more detail on a particular topic. That way, if


there is something that you think is really important for users to know about, and it can't be expressed completely within the top layer of the QD, you can write about it in the second layer. This allows author areas to further build on the top layer of information under each dimension should they wish to do so.

Another option, if more information is required than the top layer can accommodate due to its size limitation, is to link from the top layer of the QD to information stored elsewhere on the website. You can also link to information on the website from within the second layer.

When writing a QD you need to consider whether the information presented is beneficial for the user. Will the information help the user make an informed decision about whether the data are suitable for the use that they wish to put it to?

QDs will differ amongst collections in terms of what author areas will write about. A population sample survey QD will contain different types of information compared to an administrative data collection QD, a business sample survey QD, or a census QD. Due to the varying nature of the collections within the ABS, the creation of the content of QDs is left to the discretion of author areas. That is, apart from having the seven dimensions of the ABS Data Quality Framework, along with recommended content ideas, QDs are left to author areas to determine what is the essential information to have within them.

QDs will take some time to construct initially. However, for collections that are released more frequently, e.g. quarterly or monthly releases, it would be advisable not to make the QD too specific to the statistics from a single release, as the QD would become out of date very quickly. The trick is to find the aspects of quality of that collection that are consistent across releases and highlight them. However, if there is a particular quality aspect of a release for a particular cycle of a collection that does need to be highlighted, this should be done. The QD would then need to be amended for the next release, as the 'particular aspect' would not apply to subsequent releases.

The QD should be around two pages in length for the top layer and for any information written in the subsequent second layers. Hence, the information needs to be short and to the point. It should highlight the main pieces of information and issues that users should know when using statistics from that statistical collection. The information also needs to be useful for decision making. When writing a QD highlight some of the limitations of the statistics as well as some of the strengths. Try and find a balance between the two in terms of what you think is most valuable for a user to know. Remember to ask yourself when writing the QD "Is this going to help a user make a decision about whether to use the data or not?"

Quality Declarations can be designed to cater for a group of products which come from the same collection (CMS Collection), or they can be designed for a specific release such as a Confidentialised Unit Record File (CURF). For those QDs which cater for multiple products, the QD will be written once, ensuring that it covers all of the products that link to it. QDs that are product specific, such as for a CURF, will contain information that is only related to that particular product.

e.g. 1: The Labour Force Survey may want one QD that covers all of its statistical releases: 6202.0, 6202.0.55.001, 6291.0.55.001 and 6224.0.55.001.

e.g. 2: The Labour Force Survey may want two QDs: one covering the statistical releases 6202.0, 6202.0.55.001 and 6291.0.55.001; and one covering the annual family data, 6224.0.55.001, alone.

Examples of where to find some information to develop content for QDs are: the Collection Management System (CMS), Concepts Sources and Methods, Clearance documentation, End of Survey reports, and Explanatory Notes.

If your QD is effective at answering the key questions users have about fitness-for-purpose of your statistics, then users will be more knowledgeable about the statistics, and better able to use the statistics in an appropriate way to secure better outcomes.

The below flow diagrams represent the Quality Declaration life cycle. Author areas can use


the diagrams to determine whether they need to create a new Quality Declaration in the CMS, or whether the existing approved Quality Declaration in the CMS is still relevant (in which case there is no need to create a new Quality Declaration or make changes to the existing CMS version).

Monthly Cycle for Quality Declarations


Quarterly Cycle for Quality Declarations


Annual/Irregular Cycle for Quality Declarations
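The decision the life-cycle diagrams capture, whichever cycle applies, reduces to a small rule: reuse the approved QD when it is still relevant, otherwise amend or create one. The sketch below is illustrative only; the function and its argument names are invented and do not correspond to CMS fields.

```python
# Illustrative sketch of the Quality Declaration life-cycle decision: reuse
# the approved QD in the CMS when it is still relevant, otherwise amend it
# or create a new one. Names are invented, not CMS terminology.

def qd_action(existing_approved: bool, still_relevant: bool) -> str:
    if existing_approved and still_relevant:
        return "reuse existing approved QD (no changes needed)"
    if existing_approved:
        return "amend the CMS version and resubmit for approval"
    return "create a new QD in the CMS and submit for approval"

print(qd_action(existing_approved=True, still_relevant=True))
print(qd_action(existing_approved=True, still_relevant=False))
print(qd_action(existing_approved=False, still_relevant=False))
```

The same rule applies at every release; only how often it is asked differs between the monthly, quarterly and annual/irregular cycles.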


(Note sub-sections 2-4 excluded)

Quality Manual : Subsection 02-02-05


05 Seven Dimensions of the ABS Data Quality Framework (overview)

Definitions

Institutional Environment: The institutional and organisational factors that have a significant influence on the effectiveness and credibility of the agency producing statistics.

Relevance: How well the data meet the needs of users.

Timeliness: The delay between the reference period (to which the data pertains) and the date on which the data becomes available.

Accuracy: The degree to which the data correctly describe the phenomena they were designed to measure.

Coherence: Comparability with other sources of information.

Interpretability: Availability of information to help provide insight into data.

Accessibility: Ease of access to data by users.
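Of the definitions above, Timeliness is directly computable as a date difference. The following is an illustrative sketch; all dates are invented for the example.

```python
from datetime import date

# Timeliness as defined above: the delay between the reference period (to
# which the data pertain) and the date on which the data become available.
# The framework also considers the gap between the advertised and actual
# release dates. All dates here are invented for illustration.

reference_period_end = date(2009, 3, 31)   # end of the March quarter
actual_release_date = date(2009, 6, 4)     # date the data became available
advertised_release = date(2009, 6, 4)      # release date advertised in advance

timeliness_delay = (actual_release_date - reference_period_end).days
release_slippage = (actual_release_date - advertised_release).days

print(timeliness_delay)   # 65: days elapsed after the reference period
print(release_slippage)   # 0: released on the advertised date
```

Expressing the dimension as a number makes releases comparable: a shorter delay means timelier data, and a non-zero slippage flags a missed advertised date.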

Suggested Content Topics

The topics mentioned here are suggestions of what could be written about in the quality declaration. The topic headings came from existing metadata fields. The ones listed were generally mentioned by users during consultation as the most important.

Each topic is linked to the dimension within which it is contained. Hence, if writing about 'scope and coverage', the information regarding this topic fits under the 'Relevance' dimension. It is not possible to write about each of the suggested topics in the top layer of a quality declaration, so it is important to determine what content is most relevant in terms of the quality of your collection.

For examples of more topics which can be written about please see the following document which is also listed in Doclinks to useful information (Subject: Revised - Proposed content for Quality Declarations (Sept 2006); Database: MD Projects Vol. 3 - Other Clients WDB; Author: Bruce Fraser; Created: 01/09/2006; Doc Ref: BFRR-6T87PT).

Questions to Ask

Once again, these questions are examples of what you might want to consider when writing a quality declaration. Not all of the answers to the questions need to be included; just those that are the most important in regard to the quality of your collection.

Examples

The examples give an idea of the types of information that may be included in a quality declaration. Once again, they are intended to provide ideas of what could be included and to build on the information presented above in the section.


06 INSTITUTIONAL ENVIRONMENT

The institutional environment of statistical information refers to the institutional and organisational factors that have a significant influence on the effectiveness and credibility of the agency producing statistics.

Suggested Content Topics: professional independence; mandate for data collection; adequacy of resources; quality commitment; statistical confidentiality; and impartiality and objectivity.

Definitions of Content Topics:

Professional independence: Independence from other policy, regulatory or administrative departments and bodies, as well as from private-sector operators.

Mandate for data collection: Administrative organisations, businesses, households and the public at large may be compelled by law to allow access to, or provide data to, the agency.

Adequacy of resources: The resources available must be sufficient to meet the needs of the agency in terms of the production (collection) of data.

Quality commitment: The staff of the agency are committed to upholding best work practices in line with the corporate objectives of the agency.

Statistical confidentiality: The privacy of data providers (households, enterprises, administrations and other respondents), and the confidentiality of the information they provide, must be guaranteed.

Impartiality and objectivity: The production and dissemination of data is done in an objective, professional and transparent manner.

Institutional Environment in a Quality Declaration

For products that are substantially or solely produced from data collected under the Census and Statistics Act, 1905, the Quality Declaration will link to a standard description of Institutional Environment elsewhere on the ABS website.

For products primarily or solely produced from administrative data, information will need to be written about the Institutional Environment in which the data were collected. This information only needs to be brief. However, if the author area decides they wish to go into more detail then a second layer for 'Institutional Environment' is available for use.

Institutional Environment for 'products' produced substantially or solely from data collected under the Census and Statistics Act, 1905. That is, use the sentence and link below in the Quality Declaration if your 'product' is produced substantially or solely from data collected under the Census and Statistics Act, 1905.

For information on the institutional environment of the Australian Bureau of Statistics (ABS), including the legislative obligations of the ABS, financing and governance arrangements, and mechanisms for scrutiny of ABS operations, please see ABS Institutional Environment.

NB: the "ABS Institutional Environment" hyperlink is: <http://www.abs.gov.au/websitedbs/d3310114.nsf/4a256353001af3ed4b2562bb00121564/10ca14cb967e5b83ca2573ae00197b65!OpenDocument>

Examples of Institutional Environment for 'products' produced primarily or solely from administrative data

Criminal Courts Australia:

Criminal matters are brought to the courts by a government prosecuting agency, which is generally the Director of Public Prosecutions, but can also be the Attorney-General, the police, regulatory agencies, local councils and traffic camera branches. Information on defendants brought before the courts is recorded by the court administration authorities in each state and territory for operational and case management purposes in the adjudication and sentencing process. Criminal Courts statistics are based on data extracted from these administrative records. Data are supplied to the ABS by the courts administering agency for all states and territories except for Queensland (where they are supplied via the Office of Economic and Statistical Research), and New South Wales (where they are supplied via the Bureau of Crime Statistics and Research).

Criminal Courts statistics are produced by the National Criminal Courts Statistics Unit (NCCSU) of the ABS. The NCCSU functions under an intergovernmental agreement between the ABS and the Attorneys-General/Ministers of Justice of the Commonwealth and each of the states and territories. One of the major functions of the Unit is:

to compile, analyse, publish and disseminate uniform national criminal courts statistics, subject to the provisions of the Census and Statistics Act 1905.


07 RELEVANCE

The relevance of statistical information reflects the degree to which it meets the real needs of end users. Understanding who the respondents are and how the information is collected is important because this assists in better understanding the limitations of the resulting data.

Suggested Content Topics: Scope and coverage; Reference period; Geographic detail; Main outputs/data items, especially those of interest (e.g. Indigenous status); Questionnaire; Classifications.

Questions to ask yourself when writing a Quality Declaration for Relevance in terms of User needs

Scope and coverage
- Who do the data represent? Who's excluded? Why?
- Any impacts/biases caused by the exclusions?
- What ages are covered in the collection? (part of scope)

Reference period
- Over what period were the data collected?
- Were there exceptions to the collection period, e.g. data delayed in registration for administrative data purposes?

Geographic detail
- What levels of detail are available for the data?
- Can a user focus on a specific area for research?

Main outputs/data items
- Can a specific data item be targeted by a user? (topical, e.g. Indigenous)
- What are the issues out in the Australian economy/society?
- Can a user focus on a specific population (sub-population) of this survey?
- What are the main levels (estimates) for which the survey was designed? What are the other important ones for which estimates are or are not reasonably reliable?

Questionnaire
- It is great to link to the questionnaire if possible; this allows users to see the flow of the questions.
- What was the data collection method?

Classifications
- What classifications are used?

Examples for Relevance:

The industry classification used is the Australian and New Zealand Standard Industrial Classification (ANZSIC06).
The occupation classification used is the Australian Standard Classification of Occupations (ASCO).
The geography classification used is the ...

The lowest level of geographical data that is possible is Statistical Local Area (SLA). For information on the coverage of SLAs please see the geographic classification .........

The Labour Force Survey includes all persons aged 15 and over except:
- members of the permanent defence forces;
- certain diplomatic personnel of overseas governments;
- overseas residents in Australia; and
- members of non-Australian defence forces (and their dependants) stationed in Australia.

For more information on the participants of the Labour Force Survey please see the Explanatory Notes section of Labour Force Australia, 6202.0.


The quality of the data item "external causes of death" is subject to various collection issues, which are explained in detail in the information paper External Causes of Death, Data Quality, 2005.


08 TIMELINESS

The timeliness of statistical information usually refers to the delay between the reference point (or the end of the reference period) to which the information pertains, and the date on which the information becomes available. It can also refer to the time taken to develop a survey and commence data collection.

Suggested Content Topics: Timing (time lapse between collection and publication, or publication of 1st, 2nd release etc. for the Census); Frequency of survey.

Questions to ask yourself when writing a Quality Declaration on Timeliness in terms of User needs

Timing
- When is it released?
- Are there several releases of these data?
- When was it collected? i.e. over what period, and is there a reference period that determines scope (e.g. a reference week)?

Frequency
- When is the next collection?

Examples for Timeliness:

The release of 2006 Census data is staggered over several months due to the volume of information available. For more information on the products available from the 2006 Census and their release dates please see the 2006 Census: Proposed Product Release Dates web page.


09 ACCURACY

The accuracy of statistical information is the degree to which the information correctly describes the phenomena it was designed to measure. It covers both sampling and non-sampling error issues.

The Quality Declaration should inform the user whether the accuracy of the output will be sufficient to meet their needs. If not, they then need to consider the risk of using the data. If the statistical information is misleading, the results of any analysis may also be misleading, and result in poor decisions. Pointing out the differences in accuracy depending on the level of the data used may be beneficial (see examples below).

Suggested Content Topics:
- Sample size;
- Response rates: overall response rates by state, and for particular data items of note; the impact of internet questionnaire response on response rates;
- Changes in quality of data (e.g. Indigenous ERP measures), with links to information papers;
- Standard errors; use of confidence intervals for key estimates at state level;
- Non-response adjustments (e.g. imputation);
- Confidentiality methods;
- Weighting methods (including reference to benchmarks);
- Revisions to data;
- Calculation of averages. Users are often a little confused over how averages have been constructed. It may be useful to let them know whether the average was done on a quarterly, calendar-year, financial-year etc. basis. Advice on how to construct averages would also be useful.
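The point about averages can be made concrete with a small sketch. This is a hypothetical illustration only (not ABS code, and the figures are invented): it shows how an "annual average" computed on a calendar-year basis differs from one computed on an Australian financial-year (July to June) basis for the same quarterly series, which is exactly the ambiguity a Quality Declaration should resolve for users.

```python
# Invented quarterly estimates keyed by (year, quarter).
quarterly = {
    (2007, 3): 100.0, (2007, 4): 104.0,
    (2008, 1): 108.0, (2008, 2): 112.0,
    (2008, 3): 116.0, (2008, 4): 120.0,
}

def calendar_year_average(data, year):
    """Average of the four quarters of a calendar year."""
    values = [v for (y, q), v in data.items() if y == year]
    return sum(values) / len(values)

def financial_year_average(data, end_year):
    """Average over July (end_year - 1) to June (end_year): Q3, Q4, Q1, Q2."""
    values = [v for (y, q), v in data.items()
              if (y == end_year - 1 and q >= 3) or (y == end_year and q <= 2)]
    return sum(values) / len(values)

print(calendar_year_average(quarterly, 2008))   # 2008 Q1-Q4 -> 114.0
print(financial_year_average(quarterly, 2008))  # 2007 Q3 to 2008 Q2 -> 106.0
```

The same series yields two different "2008 averages" depending on the basis, so stating the basis used is worthwhile.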

Questions to ask yourself when writing a Quality Declaration for Accuracy in terms of User needs

Sample size
- Has the sample increased?
- Are there deficiencies in the sample?

Response/enumeration rates
- Was there a similarity among the non-responses? Possible biases?

Standard errors
- What is the range the estimate could fall between, for use in the estimation of costings?

Non-response adjustments
- What imputation methods were used, if any?
- How are non-responses coded?
- Was there a "Don't know" option in the question, or was the question "refused"? Were refusals incorporated with "don't knows"?
- How many non-responses were there to a particular question?

Confidentiality methods
- How do these impact on the level of detail that can be provided?

Weighting methods
- What methods were used?
- How does this impact on the use of the data?

Revisions to data
- What revisions have been done?
- Are they consistent for all series or only some?
- Are there breaks in series of note?

Examples for Accuracy:

While these data are of good quality at the higher levels of geography (national/state level), the RSEs should be viewed carefully when cross-classifying the data.
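As a rough illustration of why RSEs matter when cross-classifying, the sketch below (hypothetical code, not an ABS tool) applies a common rule of thumb: estimates with a relative standard error above 25% should be used with caution, and above 50% are generally considered too unreliable for most purposes. The thresholds and figures here are illustrative assumptions.

```python
def rse(estimate, standard_error):
    """Relative standard error as a percentage of the estimate."""
    return (standard_error / estimate) * 100.0

def reliability_flag(estimate, standard_error):
    """Classify an estimate using an assumed 25% / 50% RSE rule of thumb."""
    r = rse(estimate, standard_error)
    if r > 50:
        return "too unreliable for most uses"
    if r > 25:
        return "use with caution"
    return "acceptable"

# A national-level estimate: large sample, small RSE.
print(reliability_flag(1_000_000, 20_000))  # RSE 2% -> "acceptable"
# A finely cross-classified cell: small sample, large RSE.
print(reliability_flag(1_200, 420))         # RSE 35% -> "use with caution"
```

The same collection can thus support firm conclusions at national level while fine cross-classifications of it warrant caution.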


Because of the limitations of a sample survey, a dash within a table indicates that no one with those characteristics was selected in the sample. It does not mean that there is no one in Australia with those particular characteristics.

Very detailed export data are not as accurate as aggregate-level data, due to the limited statistical editing procedures that have been undertaken on the fine-level data.


10 COHERENCE

The coherence of statistical information reflects the degree to which it can be successfully brought together with other statistical information within a broad analytic framework and over time. This is captured in the Quality Declaration by focusing on changes over time to the data collection, as any such changes will affect the interpretation of how things may have changed over that period. For example, a perceived change in results between two time periods might simply reflect a change in definition. Thus, it is important to know when definitions have changed and by how much, and to consider the potential impact of those definitional changes on the data.

Suggested Content Topics:
- Changes to data items, including future changes (e.g. citizenship question; addition of possible new questions or more defining questions);
- Comparison with previous releases;
- Comparison with other products available;
- History of collection. For time series collections it is important to list breaks in series; this can be linked to, e.g., the Concepts, Sources and Methods.

Questions to ask yourself when writing a Quality Declaration for Coherence, in terms of User needs

Changes to data items
- What changes have occurred to the collection of this data item?
- Are there additional questions to derive this item?
- Was it self-reported and is it now derived, or vice versa?
- Have any changes impacted on the quality of the data item?
- Is there a time series of this data item? Does the time series reflect growth or decline in a particular item?

Comparison with previous releases
- Any changes in collection methodology, and if so, what impact? e.g. has the age scope of the collection changed? What is the impact of this change on the estimates, or on a particular series?
- Have there been changes in the supply of the data?
- Is the same questionnaire used for the entire collection? If not, what are the differences, and the impacts of those differences?
- Has there been a statistical impact on this series caused by changes in methodology or framework? How has this impact been managed (e.g. bridged, backcast)?
- Has there been an impact on the movement estimates?

Comparison with other products available
- Are there external series that a particular series can be compared with?
- Is the series showing a similar pattern to already published comparison series, and if not, what are the major differences?
- Are there other ABS collections, or important non-ABS collections, that a particular series can be compared with?
- What data cannot be compared together?

History of collection
- Has the collection always been collected in its current form?
- What are some of the major changes to the collection over its lifetime?

Changes in quality of data
- Have there been any 'real world' influences on the statistics collected? What are the impacts of such a change on the data?
- Have questions or wording changed in a way that has influenced the collection?
- Have changes in survey design or question placement influenced quality?
- Has an administrative collection undergone changes in processing or collection?

Examples for Coherence:

Comparing the data item "Occupation" between Victoria and Queensland is not technically possible, as the question used to collect this information differs between the two states. The estimate for Victoria will be higher than Queensland's due to the differing scope of the question definition...

These data are comparable with <another collection's data/ another agency's data > with the following limitations...


11 INTERPRETABILITY

The interpretability of statistical information reflects the availability of the supplementary information and metadata necessary to interpret and utilise it appropriately. Interpretability can be discussed in the Quality Declaration by asking the questions:

- What metadata / explanatory information is available to users?
- How easy is it to obtain more information about the data and data collection if required?

Suggested Content Topics:
- Definition of the seasonal adjustment process;
- Definition of the trend adjustment process;
- Common misunderstandings of terms (e.g. youth unemployment, underemployment, unemployment, wealth, earnings, GDP, standard of living);
- Other metadata sources for further clarification of the collection (e.g. Concepts, Sources and Methods; information papers).

Questions to ask yourself when writing a Quality Declaration for Interpretability in terms of User needs

Seasonal adjustment
- What method was used?
- When are seasonally adjusted estimates appropriate to use?

Trend adjustment
- What method was used?
- When are trend adjusted estimates appropriate to use?

Common misunderstandings of terms
- What does 'youth' mean?
- What is the difference between a 'response' and an 'enumeration' rate?

Other sources of information regarding the data
- Where can more information be found if someone wants to do more research on this collection?
- Where are the details on data definitions etc.?

Examples for Interpretability:

The major series estimates from this collection are available as original, seasonally adjusted and trend series. To find out more about seasonal adjustment and trend estimation, please see Timeseries Analysis Frequently Asked Questions.
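To give a flavour of what a trend estimate does, the sketch below computes a crude centred moving average over an invented monthly series. This is an illustrative simplification only: ABS trend estimates are produced with more sophisticated methods (such as Henderson moving averages applied to seasonally adjusted series), not this naive filter.

```python
def centred_moving_average(series, window=5):
    """Crude trend sketch: centred moving average (window must be odd).

    The ends of the series are dropped, which is why published trend
    estimates near the end of a series are provisional and get revised.
    """
    assert window % 2 == 1, "centred average needs an odd window"
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

# Invented noisy monthly series: the smoothed values vary far less
# than the original observations do.
original = [10, 14, 9, 15, 11, 16, 10, 17]
print(centred_moving_average(original))
```

Even this crude filter shows why month-to-month comparisons of trend estimates are steadier than comparisons of original estimates: the smoothing averages out much of the irregular movement.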

When to use seasonally adjusted vs trend data?

Comparing: month-to-month changes

- Original estimates (Do not use): May be misleading due to seasonal patterns, residual noise and irregular influences.
- Seasonally adjusted estimates (Use with caution): May be misleading for volatile series containing high levels of residual noise. Provides useful information on the volatility of the series.
- Trend estimates (Preferred option): The most appropriate indicator for comparing month-to-month or quarter-to-quarter changes. Recent estimates, usually the last 3 or 4, may be revised.

Comparing: year-apart changes

- Original estimates (Do not use): A crude form of seasonal adjustment that assumes seasonal patterns do not change. May be misleading as it ignores evolving seasonal patterns, trading day and moving holiday effects. May contain a high contribution from residual noise.
- Seasonally adjusted estimates (Use with caution): May be misleading because year-apart percentage changes in the seasonally adjusted estimates usually contain a high contribution from residual noise.
- Trend estimates (Preferred option): A stable measure indicating the average trend movement over the year. May not reflect the current direction of the trend if there has been a change in direction during the year.


12 ACCESSIBILITY

The accessibility of statistical information refers to the ease with which it can be accessed by users. This is addressed in the Quality Declaration by discussing:

- ease of accessing the data; and
- knowledge that the data exist.

Suggested Content Topics:
- Special data services (provide a statement about possible combinations of data that may be requested, or even the ability to request information, with contact information);
- Data products available (provide links to the products, as well as their format; e.g. spreadsheets, data cubes, Confidentialised Unit Record Files (CURFs), Remote Access Data Laboratories (RADL), Census Products page).

Questions to ask yourself when writing a Quality Declaration for Accessibility in terms of User needs

Special data requests
- What data mixes are available?
- Can regional x Indigenous be obtained by request?
- Can individual earnings be obtained rather than the current groupings?
- Are other groupings of data items available?
- Is it possible to get regional data at a lower level than SLA?
- How long do data requests take?
- Who is the contact for data requests?
- Is the attainment of mean/average data possible at finer levels, to alleviate possible disclosure issues?
- At what level are seasonally adjusted and trend estimates available?

Data products available
- What data are available in .xls? What other formats are data available in?
- Data items available: link to the data item list (if available), or have a data item list in the second dimension.

Examples for Accessibility:

e.g. 1) Labour Force may state that if users examine the data cubes, with all the data items available within each cube, the addition of a new variable may be allowed if one or more of the current variables in the cube is sacrificed. A statement about whether information is more likely to be supplied at a detailed level if averages are used, instead of a single point-in-time estimate, would also be useful here.

e.g. 2) Indigenous estimates are available for the following data combinations upon request ......

Quality Manual : Subsection 02-02-16

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Final, v 30/6/2008
Last Updated: 24 Dec 2008
Chapter No. & Title: 02. Quality Guidelines
Section No. & Title: 02. Quality Declaration User Guide
Subsection No. & Title: 16. Example of a compendium Quality Declaration
Document Version: 30/6/2008

Header

Status: *** Final ***
Comments: Final



Last modified by Narrisa Gilbert on 24/12/2008 11:20 AM - Body field changed: 3 characters added

History


16 Mock Quality Declaration for Measures of Australia's Progress

QUALITY DECLARATION SUMMARY

INSTITUTIONAL ENVIRONMENT

Measures of Australia's Progress (MAP) is produced by the Social Analysis and Reporting Branch of the Australian Bureau of Statistics (ABS). The staff producing the publication have considerable expertise in the compilation and use of statistics to analyse and report on trends, and to inform and stimulate public debate.

The design of MAP was guided by past and current ABS consultations. The ABS has a systematic program of consulting users of statistics about our statistical frameworks, surveys, products and analyses. Through this program, thousands of government agencies, academic researchers, businesses and business councils, community organisations and individual Australians have told the ABS what they think it is important that we measure. Our initial choices were tested through several further rounds of consultation undertaken specifically for MAP.

RELEVANCE

Measures of Australia's Progress (MAP) considers some of the key aspects of progress side by side and discusses how they are linked with one another. It does not attempt to measure every aspect of progress that is important. Nor does it consider all of the many different ways in which parts of Australia or groups of Australians are progressing. But it does provide a national summary of many of the most important areas of progress, presenting them in a way which can be quickly understood by all Australians. It informs and stimulates public debate and encourages all Australians to assess the bigger picture when contemplating progress.

Indicators and commentary are presented in the following areas: health; education and training; work; national income; economic hardship; national wealth; housing; productivity; the natural landscape; the air and atmosphere; oceans and estuaries; family, community and social cohesion; crime; democracy, governance and citizenship. In addition to these headline areas, there is commentary on the supplementary areas of: culture and leisure; inflation; communication; competitiveness and openness; transport.

Second layer

When MAP was first developed, the ABS undertook an extensive process to determine what measures of progress to include. Broadly, the indicators presented in MAP were chosen in four key steps:

1. We defined three broad domains of progress (society, economy and environment).
2. We made a list of potential progress dimensions within each of the domains.
3. We chose a subset of dimensions for which indicators would be sought.
4. We chose an indicator (or indicators) for each of those dimensions. In particular, potential 'headline' indicators were identified which have the capacity to encapsulate major features of change in the given aspect of Australian life.

The eventual selection of indicators in MAP was guided by expert advice and by the criteria listed below. The decision on how many indicators to present was based on statistical grounds; for example, is it possible to find one or a few indicators that would encapsulate the changes in the given aspect of life? Is it possible to sum or otherwise combine indicators? And is the indicator supported by quality data?

Once the ABS had drafted its initial list of candidate headline indicators, extensive consultation was undertaken to test whether the list accorded with users' views. Whether a reader agrees with the ABS choice of headline indicators or not, he or she is able to look at the whole suite of indicators in each full edition of MAP and assign a weight to each, according to his or her own values, to make an assessment of whether life is getting better.

It was also decided that the indicators should focus on the outcome rather than the inputs or underlying causes of change (such as other influences that generated the outcome, or government and other social responses to the outcome). For example, an outcome indicator in the health dimension should if possible reflect people's actual health status and not, say, public and private expenditure on health treatment and education. Input and response variables are important to understanding why health outcomes change, but the outcome itself should be examined when assessing progress.

One criterion was regarded as essential to headline indicators: that most Australians would agree that each headline indicator had a 'good' direction of movement (signalling progress, when that indicator is viewed alone) and a 'bad' direction of movement (signalling regress, when that indicator is viewed alone). For instance, the number of divorces could be considered as an indicator for family life. But an increase in that number is ambiguous; it might reflect, say, a greater prevalence of unhappy marriages, or greater acceptance of dissolving unhappy marriages. This good-direction / bad-direction distinction unavoidably raises the question of values and preferences.

TIMELINESS

Summary indicators are released in April of each year, and present the most recent available indicators in each of the fourteen headline dimensions. Many of these indicators relate to the previous financial or calendar year.

Full editions of MAP were released in 2002, 2004 and 2006. The full edition includes several articles that explain in detail the framework used for the indicators, along with analysis and interpretation of the indicators.

Second layer

Table of reference period for each headline indicator.

ACCURACY

The headline indicators used in Measures of Australia's Progress are broad level indicators that are generally presented at national level and in respect of the full Australian population. As such the level of accuracy is high. Data cubes are also available that present indicators for States and Territories. These indicators will be less accurate than national indicators. For more detail follow the link below.

Second layer

Table of RSEs or alternate accuracy measure for each headline indicator (if known/available) by State/Territory.

COHERENCE

Frameworks are a tool that provide coherence in statistical measurement, data analysis and analytical commentary. The framework used by the ABS in developing MAP is described in detail in Measures of Australia's Progress, 2006 (ABS Cat. no. 1377.0). This framework is built around three fundamental questions: "What do we mean by progress overall?"; "How can we describe progress across society, the economy and the environment, and what dimensions of progress should be included?"; and "What indicators best encapsulate progress in each dimension?".

Measures of Australia's Progress reflects on issues of importance to Australia and Australians, and no systematic or comprehensive attempt has been made to compare Australia's progress with that in other countries. Considering Australian progress side-by-side with progress in other countries can be informative. However, if we were confined to presenting indicators for which comparable overseas data are available, the coverage here would be narrower and its focus would probably be less relevant to Australian concerns. Where possible we draw some international comparisons of headline indicators for those dimensions of progress for which comparable international data are available.

INTERPRETABILITY

Measures of Australia's Progress are released as a suite of indicators across four broad areas of progress - individuals; the economy and economic resources; the environment; and living together - so that changes and trends in any one area can be compared and contrasted with changes and trends in other areas. In addition, the full edition of MAP contains articles and commentary to aid interpretation of the indicators.

One of the criteria used for choosing indicators for MAP is that the indicator should be intelligible and easily interpreted by the general reader.

ACCESSIBILITY

MAP indicators, data cubes, analysis and commentary are accessible through the ABS website. General inquiries about the content and interpretation of the statistics in this publication should be addressed to Linda Fardell on 02 6252 7187. For all other inquiries please contact the National Information Referral Service on 1300 375 070.


Quality Manual : Subsection 02-02-17

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Final, v December 08
Last Updated: 29 Jan 2009
Chapter No. & Title: 02. Quality Guidelines
Section No. & Title: 02. Quality Declaration User Guide
Subsection No. & Title: 17. CURF QDs - Guidelines and Example
Document Version: December 08

Header

Status: *** Final ***
Comments:


Last modified by Narrisa Gilbert on 29/01/2009 12:29 PM - Body field changed: 476 characters added

History

Audit Trail

17 QUALITY DECLARATIONS FOR CURFs

Quality Declarations for CURFs were endorsed at the May 2008 Publishing and Dissemination Board to be released from January 2009 onwards. This is in response to users requesting more information when dealing with CURFs.

For more information on the content of what was presented to this board, please see: (Subject: Agenda Item 7: Quality Declarations review; Database: ICDSD WDB; Author: Christine Harmey; Created: 27/02/2008; Doc Ref: CHAY-7C82J3)

Constructing a Quality Declaration for a CURF

When creating a QD for a CURF, make the information contained within each dimension specific to that CURF. For example, if the CURF is only available as an Expanded CURF, do not mention Basic CURFs.

Below are some guidelines on the types of information to include within each dimension for a CURF. Where applicable, "standard" paragraphs have been created for some dimensions, and these must be used. As with any QD, it is possible to link to other information on the website through hyperlinks. If information for a dimension that you think is important for users is contained in the publication QD, the Explanatory Notes, or other metadata available on the website, please provide a link to this information from within the CURF QD under the relevant dimension.

As for all QDs, the limit for content in the top layer is approximately 2 pages. Second layers are available for more detailed information that doesn't fit in the top layer of the QD.

For more information on writing standards for the website, how to create a QD using the Collection Management System and the Website Contents Approvals database, along with general QD information, please see sections 02-02-01 to 02-02-04 of the Quality Declaration User Guide.

CURF notes for Institutional Environment

The following is the 'standard' text for Institutional Environment for CURFs. Where the product is not substantially or solely produced from data collected under the Census and Statistics Act 1905, a link to the Quality Declaration for the publication should also be provided.

Standard paragraphs that must be used:

Confidentialised Unit Record Files (CURFs) are released in accordance with the conditions specified in the Statistics Determination section of the Census and Statistics Act 1905 (CSA).


This ensures that confidentiality is maintained whilst enabling micro level data to be released. More information on the confidentiality practices associated with CURFs can be found at the "How is CURF data confidentialised?" page.

For information on the institutional environment of the Australian Bureau of Statistics (ABS), including the legislative obligations of the ABS, financing and governance arrangements, and mechanisms for scrutiny of ABS operations, please see ABS Institutional Environment.

CURF notes for Relevance

Examples of things to mention:
- Level of detail the CURF is available at, e.g. Basic and/or Expanded
- Tailor the information to be specific to what is available in the CURF, e.g. examples of levels of detail that are available, such as individual income amounts as opposed to ranged income amounts
- Link to technical manuals/guides
- Some of the key data items available

NB: If "Special Conditions" apply to or govern the use of the CURF's data, this must be mentioned.

CURF notes for Timeliness

Examples of things to mention:
- Previous CURF releases
- Future CURF releases, including the intended time between data collection and CURF release

CURF notes for Accuracy

Examples of things to mention:

- Mention the CURF Technical Manual (link to it)
- Link to the CURF data item list (in regard to the level of detail available for particular variables)
- Sample size
- Response rate

Standard paragraph that must be included in Accuracy:

Steps to confidentialise the data made available on the CURF are taken in such a way as to maximise the usefulness of the content while maintaining the confidentiality of respondents to ABS statistical collections. As a result, it may not be possible to exactly reconcile all the statistics produced from the CURF with published statistics.

CURF notes for Coherence

Examples of things to mention:

- Differences of note between CURF releases
- General differences for the collection over time

CURF notes for Interpretability

Examples of things to mention:
- Other supporting metadata, such as Technical Notes, Explanatory Notes, Concepts, Sources and Methods, etc.

Standard paragraph that must be included in Interpretability:

The <Insert name of technical manual/guide and product number as a link> is a key source for consultation when using a CURF. It includes survey objectives, methods and design; survey content; data quality and interpretation; output data items; and information about the availability of results, comparability with previous surveys, and the content of the CURF file.

CURF notes for Accessibility

Standard paragraphs that must be used:

CURF microdata are not available to the public without special access being granted. All CURF users are required to read and abide by the 'Responsible Access to ABS Confidentialised Unit Record Files (CURFs) Training Manual'. An application to access a particular CURF can be completed and submitted for approval by following the steps listed in the 'How do I apply for CURFs' Frequently Asked Questions. A full list of available CURFs can be viewed via the 'List of Available CURFs'.

The Basic CURF can be accessed on CD-ROM, in addition to being accessed through the Remote Access Data Laboratory (RADL) and the ABS Data Laboratory (ABSDL). The Expanded CURF can only be accessed through RADL and ABSDL. More detail regarding types and modes of access to CURFs can be found on the CURF Access Modes and Levels of Detail web page.

If you have any questions regarding access to CURF microdata, please contact the Microdata Access Strategies Section at <[email protected]> or call (02) 6252 7714.


EXAMPLE Quality Declaration for the EEBTUM 2008 CURF (release date ~ May 2009).

The Quality Declaration below is an example for the 2008 release. However, because the 2008 release is not yet available, links have, for the purpose of demonstration, been created to the 2006 release information (where applicable) rather than the 2008 information.

QUALITY DECLARATION - SUMMARY

INSTITUTIONAL ENVIRONMENT

Confidentialised Unit Record Files (CURFs) are released in accordance with the conditions specified in the Statistics Determination section of the Census and Statistics Act 1905 (CSA). This ensures that confidentiality is maintained whilst enabling micro level data to be released. More information on the confidentiality practices associated with CURFs can be found at the "How is CURF data confidentialised?" page.

For information on the institutional environment of the Australian Bureau of Statistics (ABS), including the legislative obligations of the ABS, financing and governance arrangements, and mechanisms for scrutiny of ABS operations, please see ABS Institutional Environment.

RELEVANCE

This CURF is available as both a Basic and an Expanded CURF. The differences between the 2006 Basic and Expanded CURFs are detailed in Appendix 3 of Labour Force Survey and Employee Earnings, Benefits and Trade Union Membership - Basic and Expanded CURF, Technical Manual (cat. no. 6202.0.30.002).

The level of detail provided for selected key data items is:

Data item    Basic CURF                          Expanded CURF
Earnings     Deciles                             Deciles; single dollar amounts
Age          Single years: 15-24 and 65-79;      Single years: 15-84;
             5 year groups elsewhere;            top coded: 85+
             top coded: 85+
Geography    NT and ACT combined                 All States separate

TIMELINESS

The EEBTUM survey is enumerated every August. The first EEBTUM and LFS CURF was released in 2005, based on August 2004 data (Basic CURF only). Subsequent CURF releases have been both Basic and Expanded CURFs (2006 and 2008 data). The next CURF release is expected to occur in May 2011 based on August 2010 data.

ACCURACY

The LFS and EEBTUM CURF contains individual person level data (unit records), whilst the EEBTUM publication contains aggregated level data. Along with unit record data, the CURF contains finer levels of detail for data items than is otherwise published. For more information on the level of detail provided in the CURF, please see the data item list.

The sample for the EEBTUM survey was reduced in 2008 by approximately one third compared with previous years. For more detailed information about this sample reduction, please see the Explanatory Notes.

Steps to confidentialise the data made available on the CURF are taken in such a way as to maximise the usefulness of the content while maintaining the confidentiality of respondents to ABS statistical collections. As a result, it may not be possible to exactly reconcile all the statistics produced from the CURF with published statistics.

For more information on the Survey Methodology, see Labour Force Survey and Employee Earnings, Benefits and Trade Union Membership - Basic and Expanded CURF, Technical Manual (cat. no. 6202.0.30.002).

COHERENCE

The ABS has been conducting the Employee Earnings, Benefits and Trade Union Membership Survey since 1999. Prior to 1999 this publication was titled Weekly Earnings of Employees (Distribution), Australia (cat. no. 6310.0). Key changes made to the Employee Earnings, Benefits and Trade Union Membership Survey can be found in Chapter 21.2 of Labour Statistics: Concepts, Sources and Methods (cat. no. 6102.0.55.001).

The differences between the 2006 and 2008 CURFs are detailed in Appendix 2 of the Labour Force Survey and Employee Earnings, Benefits and Trade Union Membership - Basic and Expanded CURF, Technical Manual (cat. no. 6202.0.30.002).

INTERPRETABILITY

The Labour Force Survey and Employee Earnings, Benefits and Trade Union Membership Technical Manual (cat. no. 6102.0.55.002) is a key source for consultation when using a CURF. It includes survey objectives, methods and design; survey content; data quality and interpretation; output data items; and information about the availability of results; comparability with previous surveys; and the content of the CURF file.

ACCESSIBILITY

CURF microdata are not available to the public without special access being granted. All CURF users are required to read and abide by the 'Responsible Access to ABS Confidentialised Unit Record Files (CURFs) Training Manual'. An application to access a particular CURF can be made by following the steps listed in the 'How do I apply for CURFs' Frequently Asked Questions. A full list of available CURFs can be viewed via the 'List of Available CURFs'.

The Basic CURF can be accessed on CD-ROM, in addition to being accessed through the Remote Access Data Laboratory (RADL) and the ABS Data Laboratory (ABSDL). The Expanded CURF can only be accessed through RADL and ABSDL. More detail regarding types and modes of access to CURFs can be found on the CURF Access Modes and Levels of Detail web page.

If you have any questions regarding access to CURF microdata, please contact the Microdata Access Strategies Section at <[email protected]> or call (02) 6252 7714.

Quality Manual : Subsection 02-02-20

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Status: Final, v 2007/01
Last Updated: 08 Nov 2007
Chapter No. & Title: 02. Quality Guidelines
Section No. & Title: 02. Quality Declaration User Guide
Subsection No. & Title: 20. Feedback from User Consultation
Document Version: 2007/01



Last modified by Narrisa Gilbert on 08/11/2007 12:34 PM


Attachment X: Feedback from User Consultation

General information
- Quality declarations were a welcomed initiative
- Not too many layers
- Do not have greyed out headings in the QD
- Provide a definition of each dimension of quality
- The order of the 6 dimensions of data quality is: Relevance, Timeliness, Accuracy, Coherence, Interpretability and Accessibility

Information most requested for QDs
- scope and coverage
- standard errors
- reference period
- sample size
- response rates
- geographic information
- classifications
- history of collection
- changes in data item definitions
- methodology differences and breaks in series
- explanations of seasonal adjustment and trend adjustments, and when they should be used for assessment of data
- advice on how to construct averages
- benchmarks and weighting methodologies
- survey instrument/questionnaire
- products and formats of products
- link State/Territory publications to the Australia level publications
- confidentiality methods used
- data items/main outputs - be sure to include information on topic items, e.g. indigenous status, regional level
- data requests - types of splits of information that are available
- common misunderstandings of data (e.g. youth unemployment)
- collection methodology
- derived measures that are used, e.g. data is collected monthly but an average of 3 months is produced quarterly

Link to a more detailed summary: (Subject: Summary of all feedback from user consultation of Quality Declarations; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Narrisa Gilbert; Created: 23/03/2007; Doc Ref: NGIT-6ZK2AQ)


Quality Manual : Chapter 03

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Status: Final, v 2005/01
Last Updated: 13 Apr 2006
Chapter No. & Title: 03. Statistical Collections
Document Version: 2005/01


Last modified by Bruce Fraser on 13/04/2006 10:11 AM - Comments field changed : 17 characters deleted


Statistical Collections

1 Chapter 3, Statistical Collections, provides a range of information on quality relevant to staff in statistical collection areas. As each section is loaded into the manual, links will be provided in this document.

2 Statistical risk mitigation is discussed in Section 01 and includes a template for managing statistical risk in the change process. Statistical risk mitigation is undertaken to plan for and monitor statistical risk.

3 The concept of Quality Gates in the E2E Process is discussed here, with specific reference to statistical collections, and includes clearance documentation and release management information and links.

4 Further mitigation strategies are discussed in Section 03 which provides 'Tips and Traps' to assist staff in identifying, monitoring and mitigating statistical risk. It is a quality assurance guide that discusses the risks and assurance strategy for various components of the e2e statistical process.

5 Section 04 introduces a range of tools for monitoring quality, including the Quality Infrastructure System (QIS). The QIS is an integrated infrastructure that will allow the capture, storage and use of quality measures over the end to end business processes across all of the ABS's data collections. It allows quality measures to be readily accessible for use in quality assurance processes, as well as for reporting on quality and making quality declarations about data. The system uses an input data warehouse approach to quality measures and interacts with other corporate metadata repositories. It has a seamless interface with existing and forthcoming IT tools which generate quality measures, and with systems which use those tools.

6 This chapter includes the Quality Incident Response Plan manual, which is a plan for resolving serious doubts about key statistical results. It is designed to provide a quick and rigorous high level response to an identified serious statistical problem and is based upon the formation of a dedicated ad hoc multidisciplinary team to investigate and resolve the issue.


Quality Manual : Section 03-02

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Status: Final, v 2005/01
Last Updated: 13 Apr 2006
Chapter No. & Title: 03. Statistical Collections
Section No. & Title: 02. Quality Gates in the End to End Process
Document Version: 2005/01


Last modified by Bruce Fraser on 13/04/2006 10:12 AM - Comments field changed : 17 characters deleted


Quality Gates in the End to End Process

1 Quite simply, a quality gate can be seen as a validation point in the end to end statistical process. Throughout the statistical process there will be a number of validation points which must be 'passed' before moving on to the next stage. Quality gates can be used to improve the visibility of quality in statistical processes, as well as to measure and monitor quality in real time at various points in the end to end statistical process.

2 In determining which part of the process should be monitored by using quality gates, the survey manager should base the decision on a top down assessment of the statistical risks associated with the process. Processes which are subject to higher risk of quality degradation should be monitored more intensely than lower risk ones.

3 The placement of a quality gate may be different for each survey or collection. There will also be different gates at different levels.

4 For example, the diagrams below illustrate the top level vision layer of the statistical process for ESG (figure 1) and PSG (figure 2). If passing a 'quality gate' enables the product (in terms of the ABS statistical process, this could be a frame, sample, benchmark file or set of estimates) to move on to the next phase, then there may be a quality gate between each of the individual phases.


5 At the same time, each of the phases illustrated on both ESG and PSG process maps may have quality gates within the lower level activities or tasks. Within the process of acquiring data, for example, there is frame preparation and sample preparation. It may be that there will be a quality gate signifying that frame preparation is completed and signed-off and sample selection can begin. For some surveys, the quality gate may only be after the sample preparation.


Developing quality gates

6 The concept of quality gates is akin to acceptance criteria imposed at predetermined points in the end to end statistical process. A quality gate consists of a set of indicators to sign off on. These indicators may be qualitative or quantitative and need to be agreed on by all stakeholders; that is, those who are signing off on the quality and those who are receiving the quality product.
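The manual does not prescribe an implementation, but a quality gate as described (a set of agreed indicators, qualitative and quantitative, that must all pass before the process moves to the next phase) can be sketched as follows. The indicator names and thresholds are hypothetical, not actual ABS criteria.

```python
# Hypothetical sketch of a quality gate: a named set of indicator checks
# that must all pass before the process moves to the next phase.

def make_gate(name, checks):
    """A gate is a name plus a dict of indicator -> pass/fail predicate."""
    def run(measures):
        results = {ind: check(measures.get(ind)) for ind, check in checks.items()}
        return {"gate": name, "passed": all(results.values()), "results": results}
    return run

# Indicators agreed by stakeholders (illustrative values only).
frame_gate = make_gate("Frame preparation complete", {
    "frame_coverage_pct": lambda v: v is not None and v >= 95.0,  # quantitative
    "frame_signed_off":   lambda v: v is True,                    # qualitative sign-off
})

outcome = frame_gate({"frame_coverage_pct": 97.2, "frame_signed_off": True})
print(outcome["passed"])  # True: sample selection may begin
```

A failed indicator leaves `passed` as False, so the per-indicator results show the stakeholders exactly which acceptance criterion blocked sign-off.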

7 Agreement will also need to be reached on the positioning of the quality gates. They need to be positioned to take into account critical issues or decisions. The focus should be on a small number of key measures for each process.

8 When positioning the quality gates, the survey manager needs to remain cognisant of the time lines around the survey balanced with good quality management. The introduction of too many gates, or gates at inappropriate junctures will only serve to slow the process down and may ultimately devalue all quality gates.

9 Business analysis and process mapping are useful tools for mapping out the processes at the various phases of the statistical process. Mapping processes allows you to understand how a system works and how it interacts with other systems and processes. It provides a simple conceptual framework for identifying logical and key areas where quality gates can be used to monitor quality.

10 In any statistical process, there are known quality hotspots. Business process maps can be overlaid with a 'quality map' highlighting these hotspots; these, too, will be logical places for a quality gate to improve the level of monitoring. For further information about Business Process Mapping, refer to the BPM Documentation database.

11 Quality gates are made up of a range of quality measures. Quality measures are used to quality assure surveys both within the collection cycle and at the end-of-cycle. These are useful tools in monitoring the quality and progress of the survey to sound the alert should a quality incident be looming.

12 Quality measure categories include frame quality, response rates, adjustments to data, number of revisions, estimates, and data capture. A core set of quality measures to assist in the real time monitoring of the quality of the collection is under development; there will also be a more comprehensive list to enable tailoring for a particular survey. This core set of quality measures will be developed in negotiation with a range of stakeholders to ensure relevance, and includes both within cycle and end of cycle measures.

13 The identification of the core quality measures was based on the six key performance areas for business surveys outlined in Performance Indicators for Business Surveys (Linacre, 1996). This focused on end of cycle measures and has been expanded to incorporate within cycle quality measures.

14 The judgement and experience of the survey manager is required in deciding which quality measures to monitor. Deciding on the suite of quality measures for a particular survey should be done in consultation with MD and the senior managers who will be signing off on the quality of the collection as fit for purpose. It also comes into play when deciding whether the data is signalling a quality incident or is moving within an acceptable control limit.

Sourcing Quality Measures

15 For a quality gate to be effective, the information must be readily available to the survey manager. This information needs to be accessible within the statistical cycle and at the end of the cycle as well as being flexible enough to allow the survey manager to manipulate the data and to set appropriate targets or limits. One such system, currently under development to deliver quality measures to the desktop, is the Quality Infrastructure System.

The Quality Infrastructure System

16 The Quality Infrastructure System (QIS) is an integrated infrastructure that will allow the capture, storage and use of quality measures over the end to end business processes of both PSG and ESG. QIS will allow quality measures to be readily accessible so users can understand and improve processes, report on quality and make quality declarations about data.

17 Centralised storage of authoritatively defined and accessible quality measures allows for investigations of quality within collections, between collection cycles, and between collections. It also provides for the efficient and timely identification and resolution of quality related issues.

18 It will be up to the survey manager, in consultation with MD and key stakeholders, to decide what quality measures to monitor and have accessible from the desktop. A minimum will be the core set of quality measures as well as any other measures as identified by the BSC with a focus on a small number of key measures.

19 Survey areas identifying new quality measures will need to register these with Methodology Division who will act as the registration authority for QIS quality measures. The core quality measures list will be updated as changes are made.

20 QIS will deliver information to the desktop. The information will allow the survey manager to make informed decisions about the quality of the collection at a point in time, at end of cycle and between cycles. The survey manager will be able to set parameters around the quality indicating limits or targets.

Using Quality Gates: An example

21 Graph 1 shows this month's response rates against the target response rate. A survey manager monitoring these results on a weekly basis should become concerned around week 3 and investigate why the response rate is tracking below target, as well as looking at other measures to corroborate the results. Remedial actions, as per the risk mitigation strategy, should be put in place.


1 RESPONSE RATES, THIS MONTH AND TARGET

[Graph: response rate (%) by week (1-6), plotting 'This mth' against 'Target'.]
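The weekly monitoring described in paragraph 21 amounts to comparing the cumulative response rate against the target profile and flagging the first shortfall. A minimal sketch follows; the weekly figures are illustrative, not the survey's actual rates.

```python
# Illustrative weekly monitoring: flag the first week in which the
# cumulative response rate falls below the target profile.

def first_below_target(actual, target):
    """Return the 1-based week number of the first shortfall, or None."""
    for week, (a, t) in enumerate(zip(actual, target), start=1):
        if a < t:
            return week
    return None

target   = [20, 40, 60, 75, 85, 92]   # hypothetical target profile (%)
this_mth = [22, 41, 55, 68, 80, 88]   # hypothetical observed rates (%)

print(first_below_target(this_mth, target))  # 3 -> investigate around week 3
```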

22 Another way of monitoring quality is to predict the outcome from early in the process and put in place mitigation strategies.

23 For example, using the data from the response rate graph above, we may be able to predict the final response rate at day 6 by taking into consideration a range of variables including sample loss rate, non-contact etc. Graph 2 shows the target response rate for the month, the actual response rate up to week 2, and a predicted final response rate.

24 In the example above, the prediction (red square) is clearly below the target, so the survey manager should immediately investigate the problem and put in place strategies to lift response rates. Note that the prediction is below target even though the 'this month' line is above target at the time the prediction is made.
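The prediction idea in paragraphs 23-24 could be sketched, under the crude assumption that weekly gains decay geometrically, as a simple extrapolation. This is an illustrative toy model only; the manual's actual predictions would draw on sample loss rates, non-contact and historical patterns, and the decay factor here is invented.

```python
# Toy sketch: project the final response rate from the weeks observed so
# far, assuming each remaining week's gain shrinks by a fixed decay factor
# (a hypothetical simplification of the manual's richer prediction inputs).

def predict_final(observed, total_weeks, decay=0.6):
    """Project remaining weekly gains as a geometrically decaying series."""
    rate = observed[-1]
    gain = observed[-1] - observed[-2] if len(observed) > 1 else observed[-1]
    for _ in range(total_weeks - len(observed)):
        gain *= decay
        rate += gain
    return min(100.0, rate)

observed = [22.0, 41.0]                 # hypothetical rates at weeks 1 and 2 (%)
prediction = predict_final(observed, total_weeks=6)
target_final = 92.0
print(prediction < target_final)  # True -> predicted to miss target: act now
```

As in the manual's example, the series is above target at week 2 yet the projection falls short, which is exactly the situation where early mitigation pays off.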

25 Graph 3 shows an example of monitoring quality within control limits. By tracking within a range, the survey manager is seeking appropriate quality mapped against effort. Graph 4, by contrast, indicates to the survey manager that they may have a significant problem on their hands and need to start applying mitigation strategies to rectify underlying problems.

3 FORM RECEIVAL RATE, THIS MONTH, LIMITS

[Graph: form receival rate (%) by week (1-5), plotting 'This mth' tracking between the upper and lower control limits.]

4 FORM RECEIVAL RATE, THIS MONTH, LIMITS

[Graph: form receival rate (%) by week (1-5), plotting 'This mth' moving outside the upper and lower control limits.]
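Monitoring against control limits, as in Graphs 3 and 4, amounts to checking each week's value against an agreed band and reporting any breaches. A minimal sketch follows; the limits and rates are hypothetical.

```python
# Minimal sketch of control-limit monitoring: report any weeks in which
# the form receival rate falls outside the agreed band (illustrative data).

def breaches(values, lower, upper):
    """Return [(week, value)] for weekly values outside [lower, upper]."""
    return [(wk, v) for wk, v in enumerate(values, start=1)
            if not lower[wk - 1] <= v <= upper[wk - 1]]

lower = [10, 30, 50, 65, 80]   # hypothetical lower limits (%)
upper = [30, 50, 70, 85, 95]   # hypothetical upper limits (%)

print(breaches([20, 40, 60, 75, 90], lower, upper))  # [] -> within range (as in Graph 3)
print(breaches([20, 28, 35, 45, 55], lower, upper))  # breaches from week 2 on (as in Graph 4)
```

An empty result corresponds to Graph 3 (quality tracked against effort); a run of breaches corresponds to Graph 4 and signals that mitigation strategies are needed.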

Page 50: Quality Manual Chapters 1 and 2 (1) - United Nationsunstats.un.org/unsd/dnss/QAF_comments/ABS_Internal_Quality_Man… · Quality Manual : Section 02-01 Manual Category Ü C. Methodology

26 A key element of monitoring against limits or ranges is that the targets, benchmark, control limits or ranges are decided upon by the survey manager and will be different for different surveys. A response rate of 85% would not be acceptable for the labour force survey, but may be acceptable for another survey. Remember, quality is about being fit for purpose.

27 Quality monitoring allows the survey manager to set, up front, quality measures and parameters within which quality is acceptable. The survey manager will specify limits or targets, based on historical data (including variations and range) as well as on desired outcomes. The quality measures can then be monitored to ensure the survey process remains on track; if problems are emerging, remedial actions and risk mitigation strategies should be employed.
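Paragraph 27 notes that limits and targets are based on historical data, including its variation and range. One common way to derive such a band (offered here only as an illustration; the manual does not mandate a method) is the historical mean plus or minus a multiple of the standard deviation.

```python
import statistics

# Illustrative derivation of control limits from historical data:
# mean +/- k standard deviations of the same week's value in past cycles.

def control_limits(history, k=2.0):
    """Return (lower, upper) limits from historical values for a given week."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)   # sample standard deviation
    return mean - k * sd, mean + k * sd

# Hypothetical week-3 response rates from previous collection cycles (%).
history = [58, 61, 60, 63, 59, 62]
low, high = control_limits(history)
print(round(low, 1), round(high, 1))  # 56.8 64.2
```

Because the limits come from each survey's own history, they will differ between surveys, consistent with the manual's point that an 85% response rate may be acceptable for one collection and not another.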

28 Further, the QIS will have the capability for the tailored quality measures to be output in a number of ways, for example feeding into the Collection Management System (CMS), assisting in clearance documentation or on a survey specific 'dashboard'. There are a range of uses and outputs from QIS. These include a 'dashboard', reports, and generalised interrogation.

Clearance documentation

29 Other summary reports include clearance documentation and clearance meetings. Clearance documentation is an important tool, not just for ensuring the release of a quality product but for detailing quality measures which may alert you to a potential quality incident. Good clearance documentation should include a range of quality measures for the collection; these measures will be obtainable from the QIS.

30 Statistical Services Branch has produced Guidelines for Clearance Documentation to assist in the production of coherent and relevant clearance documentation. These have been developed for use by the survey manager to demonstrate that appropriate quality assurance processes have been applied in the collection, and to enable the collection owner to assess "fitness for publication" based on information on quality assessment and analysis of the statistical data.

Examples of current quality gates

31 In the ABS statistical processes, there are already quite a number of quality gates in use. Within the 'Analyse and disseminate' (PSG) and the 'Assemble and disseminate' (ESG) phases of the statistical process, the Release Management System (RMS) acts as a quality gate, with sign-off required at a number of junctures prior to the publication of any product. In fact, the RMS, the Release Approvals System (RAS) and the Website Contents Approvals (WCA) processes together comprise a number of quality gates within the dissemination phase of the statistical process.

32 Release management is the process of ensuring that all the mandatory information required to disseminate a product has been registered for all phases of a product's release. This includes product registration (catalogue information), identifying appropriate search terms and approvals by authorised or delegated managers. The RMS is the new system that is being rolled out from August 2005. It is currently being used in conjunction with RAS and the WCA.

Page 51: Quality Manual Chapters 1 and 2 (1) - United Nationsunstats.un.org/unsd/dnss/QAF_comments/ABS_Internal_Quality_Man… · Quality Manual : Section 02-01 Manual Category Ü C. Methodology

Quality Manual : Subsection 03-02-02

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Status: *** Final *** v 2005/01
Last Updated: 13 Apr 2006
Chapter No. & Title: 03. Statistical Collections
Section No. & Title: 02. Quality Gates in the End to End Process
Subsection No. & Title: 02. Guidelines for Clearance Documentation
Document Version: 2005/01


Last modified by Bruce Fraser on 13/04/2006 10:12 AM - Comments field changed : 17 characters deleted


Guidelines for Clearance Documentation

1 The following is a broad framework for the creation of clearance documentation for the purpose of approving a publication for release.

2 The aim of these guidelines is to focus on the quality of the data for the current cycle and any changes to that quality over time. Where possible, information should be presented in a time series to highlight the quality of the collection over time. The clearance documentation should not be a copy of what will appear in the publication; it should draw the reader's attention to important issues in the publication or with the data.

3 Existing clearance documentation can be used as supplementary information to the document created using these guidelines if more detail is required.

4 Not all sections will be applicable to all collections. They will need to be adapted accordingly, and a balance found between the effort undertaken and the value added from the process. The guidelines currently used in the Industry Statistics Branch have had a significant input into the development of these guidelines.

5 Where tables and graphs have been presented, comments should be made on those tables and graphs, even if only to state that there is nothing to discuss.

6 Remember that, in most cases, the clearance document will be printed off, so try to make the document as printer friendly as possible: ensure that all tables and graphs are easily read in black and white and fit within the margins of the page; don't include tab pages or doclinks; and if sections are used, make sure they are open when the final document is saved and sent.

7 Clearance documentation

0. Summary Table

The table below will assist the members of the clearance team in identifying the issues which need to be focused on during the clearance process. By placing a tick in the appropriate box for each topic in the summary table, the approving officer and stakeholders will be able to concentrate on the appropriate issues. Each topic in the table below relates to a section in the clearance document.


Name of Collection:                  Reference Period:
Publication Title and Cat. No.:
Clearance Date:                      Release Date:

Indication of Issues for Topics

Topic                                        Significant   Issues For      No
                                             Issues        Consideration   Issues
1. Main Points
2. Major Influences
3. Frame, Scope and Coverage
4. Sample Design and Estimation Methodology
5. Response and Imputation
6. Adjustments
7. Data Confrontation
8. Revisions
9. Systems and Processes
10. Confidentiality
11. Emerging Issues
12. Other Issues


1. Main Points

The Main Points section should summarise the key points of the collection, the main data items to be published and draw out any interesting or concerning aspects of analysis e.g. larger or smaller than expected movements. For collections with a large number of data items the prominent comments should relate to the main statistics of interest. Data should be expressed in seasonally adjusted (and trend) terms where applicable.

Where analysis has shown that a particular cause can be found for unusual estimates, it should be briefly mentioned in this section with a pointer to the appropriate section where more detail will be presented, e.g. a lower response rate (hence higher imputation rates) was the cause of the change in the estimate (explained further in Major Influences) and is presented in the Response and Imputation section.

For the main data items, graphical representation of estimates or movements highlights unusual values more effectively. For less important data items or when there are a large number of data items, tables are more effective and less time consuming to produce. It is important to note, when producing graphs, that they are set up for black and white printing. Also, for the less important data items, consider including them in an attachment rather than in the main clearance document, if there are no significant issues to discuss.

For irregular collections, a brief statement of the survey objectives, the data items produced and their intended audience and use should be made.

BSC Director Views
Another process which has been found useful in clearance documentation is to present the EL2's views on the data and publication: were there any issues that particularly concerned the EL2 of the survey during the survey cycle, what is the expected response from the major stakeholders, etc.

2. Major Influences (excluding methodological changes)

Is there anything that happened that has/will have a significant impact on the data in this or future cycles? For example:

- Major tourism event: Rugby World Cup;
- Collapse or entry of a significant company;
- Environmental influences: droughts, floods, storms;
- Health issue: outbreak of illness (such as SARS);
- Changes in Government policy: introduction of or change to current taxes;
- Differences in responses between large, medium and small businesses, e.g. large businesses have reported an increase but smaller businesses have reported a decrease.

If any influences are identified, explain what was done about them, i.e. how they were treated in the compilation of the data, and whether a technical note or data caution will be included in the publication to notify users.

3. Frame, Scope and Coverage

What is the current scope and coverage of the collection? Are there any changes to any aspect of the scope or coverage from the previous collection cycle? If yes, what impact did this have on the data? Were any changes to other processes required?

Page 56: Quality Manual Chapters 1 and 2 (1) - United Nationsunstats.un.org/unsd/dnss/QAF_comments/ABS_Internal_Quality_Man… · Quality Manual : Section 02-01 Manual Category Ü C. Methodology

What is the source of the frame for this collection - ABS Business Register, external frame source? Are there any issues in regards to the frame creation or updating processes that have impacted on the data in this cycle e.g. the external source has changed the timing of providing the frame or changed its updating procedures? What impact did this have on the frame quality? Were adjustments required to the data because of this change?

Have any issues arisen in the current collection cycle that indicate possible changes are required to the scope and/or coverage definition of the collection?

If any issues have been identified, what action has been undertaken to correct the problem? What other areas of the ABS, such as Methodology Division or the Business Register Unit, have been consulted in the decision making process?

Present a time series graph of the number of deaths and out-of-scope (OOS) units identified through the collection of data. USI codes for deaths and OOS can be used to gather the information quickly. Have these numbers suddenly increased, been increasing steadily, remained constant, decreased?
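The tally described above can be sketched in a few lines. The status codes and field names here are assumptions for illustration only, not the actual USI code set:

```python
# Hypothetical sketch: tallying deaths and out-of-scope (OOS) units per cycle
# from unit records. The "DEATH"/"OOS" codes and record fields are assumed.
from collections import Counter

def tally_status(records, codes=("DEATH", "OOS")):
    """Count records per (cycle, status) for the given status codes."""
    counts = Counter(
        (r["cycle"], r["status"]) for r in records if r["status"] in codes
    )
    cycles = sorted({r["cycle"] for r in records})
    # One row per cycle: (cycle, deaths, oos) - ready for a time series table
    return [(c, counts[(c, "DEATH")], counts[(c, "OOS")]) for c in cycles]

records = [
    {"cycle": "2004Q3", "status": "DEATH"},
    {"cycle": "2004Q3", "status": "LIVE"},
    {"cycle": "2004Q4", "status": "OOS"},
    {"cycle": "2004Q4", "status": "DEATH"},
]
print(tally_status(records))  # → [('2004Q3', 1, 0), ('2004Q4', 1, 1)]
```

The per-cycle rows can then be graphed to show whether the counts have suddenly increased, been increasing steadily, remained constant, or decreased.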

Note: If there have been any changes to the Frame, Scope or Coverage, ensure that these changes are reflected in CMS.

4. Sample Design and Estimation Methodology

Give a brief description of the sample design and estimation methodology. For repeating collections (sub-annuals) where the sample design and estimation methodology has not changed from the last cycle, the information presented here does not need to be very detailed. Rather an appendix could be included where the detailed information is copied over from cycle to cycle.

For annual and irregular collections, more detailed information should be in this section.

Issues to cover when describing the sample design and estimation methodology:
- Sampling process, including stratification, rotation and overlap control with other collections.
- Estimation methodology, i.e. number raised, ratio (with what benchmark variable), GREG estimation (using what benchmarks).
- What are the main data items and at what level are estimates produced?
- What is the level of accuracy (RSEs) that has been used in the designing of the collection?
- Have these design RSEs been achieved? Present a time series graph of the RSEs for the main data items.
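As an illustration of these estimation concepts, the following sketch computes a stratified number-raised estimate and its RSE, assuming simple random sampling without replacement within each stratum. It is a textbook sketch, not the ABS production estimation system:

```python
# Illustrative only (not the ABS production method): stratified number-raised
# estimation with an RSE, assuming SRSWOR within each stratum.
import math

def number_raised(strata):
    """strata: list of (N_h, sample_values) pairs; returns (estimate, rse%)."""
    total, var = 0.0, 0.0
    for N, y in strata:
        n = len(y)
        mean = sum(y) / n
        total += N * mean                       # number-raised stratum estimate
        s2 = sum((v - mean) ** 2 for v in y) / (n - 1)
        var += N * N * (1 - n / N) * s2 / n     # SRSWOR variance contribution
    return total, 100 * math.sqrt(var) / total  # RSE as a percentage

est, rse = number_raised([(100, [10, 12, 14]), (50, [40, 44])])
print(round(est), round(rse, 1))  # → 3300 4.5
```

Tracking the second return value each cycle gives the RSE time series asked for above.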

Note: If there have been any changes to the sample design, estimation methodology or other methodologies such as outliering, ensure that these changes are reflected in CMS.

5. Response and Imputation

What is the achieved response rate for the cycle? This information is best presented in graphical form, including the response rates for previous periods to show the response rates over time. Present this information at various levels, such as state, industry, size. Include commentary to highlight the strata or classifications where response rates are low and whether the expected response rates were achieved.
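A minimal sketch of calculating response rates by cycle and stratum is shown below; the field names are assumptions for illustration:

```python
# Hedged sketch: response rates (responding units / live units) per cycle and
# stratum, ready for tabling or graphing over time. Field names are assumed.
def response_rates(units):
    """units: dicts with 'cycle', 'stratum', 'responded' (bool)."""
    agg = {}
    for u in units:
        key = (u["cycle"], u["stratum"])
        resp, live = agg.get(key, (0, 0))
        agg[key] = (resp + u["responded"], live + 1)
    return {k: round(100 * r / n, 1) for k, (r, n) in sorted(agg.items())}

units = [
    {"cycle": "2005Q1", "stratum": "NSW", "responded": True},
    {"cycle": "2005Q1", "stratum": "NSW", "responded": False},
    {"cycle": "2005Q1", "stratum": "VIC", "responded": True},
]
print(response_rates(units))
```

Grouping by other classifications (industry, size) is the same calculation with a different key.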

Page 57: Quality Manual Chapters 1 and 2 (1) - United Nationsunstats.un.org/unsd/dnss/QAF_comments/ABS_Internal_Quality_Man… · Quality Manual : Section 02-01 Manual Category Ü C. Methodology

What imputation methodology was used? Historical ratio, donor, LRM, weight adjustment, etc. Is the same imputation methodology used for all units? Are there different methodologies used for complete and partial non-response?

What is the contribution of imputed units to key output data items? For higher levels of aggregates, graphs can be used to give information about the total level of imputation and its associated contribution to estimates. For more detailed presentation of the contribution of imputation, tabular form may be more convenient. Absolute and percentage contributions should be presented.

Where possible, total imputation and partial imputation data should be presented. Presenting the information on total imputation gives a good indication of the variability introduced into the estimates by the imputation methodology, whereas presenting the information on partial imputation gives an indication of particular data items that respondents have difficulty in reporting. By examining the partial imputation rates, modifications or changes to the questions or data items being collected may be recommended.

Has the contribution of imputed units to the estimates changed over time? Present the total contribution of imputation over time.
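The absolute and percentage contributions of imputed units can be computed along the following lines; the unit structure (weight, value, imputed flag) is an assumption for illustration:

```python
# Sketch only: contribution of imputed units to a weighted estimate,
# in absolute terms and as a percentage of the estimate.
def imputation_contribution(units):
    total = sum(u["weight"] * u["value"] for u in units)
    imputed = sum(u["weight"] * u["value"] for u in units if u["imputed"])
    return imputed, round(100 * imputed / total, 1)  # (absolute, % of estimate)

units = [
    {"weight": 10, "value": 5.0, "imputed": False},
    {"weight": 10, "value": 3.0, "imputed": True},
    {"weight": 5, "value": 4.0, "imputed": False},
]
print(imputation_contribution(units))  # → (30.0, 30.0)
```

Running this per data item and per cycle yields the time series of imputation contribution discussed above.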

Where imputation has had a significant influence on the estimates and RSEs, commentary should be added to inform the reader of the reason for the higher level of imputation (if known) and whether the level of imputation is acceptable. Recommendations on how to correct the problem should also be presented in this section, e.g. changes to IFU procedures.

6. Adjustments

This section should discuss all adjustments applied to the data, such as Business Provisions, Outliering, Editing, etc. Below is a short list of possible adjustments that can be applied to the data.

- Are Business Provisions applied? How are the Business Provision classes defined?
- What is the contribution of the Business Provisions to the estimates (produce a graph if possible showing the number of Business Provisions and their contribution to estimates over time)? Also include a doclink to the Business Provisions Clearance report.
- Outlier methodology: surprise outliering or winsorisation.
- How many outliers are there and what impact do they have on the estimates? Present a time series of the number of outliers and their contribution to estimates. Presenting this information may provide an indication that a review of the sample design, i.e. a stratification review, may be required.
- Was an editing methodology applied to the data?
- What impact did this editing methodology have on the estimates?
- Any special treatments to data, e.g. is administrative data (BAS, BIT, etc.) used to edit or confirm survey responses?
- What impact, if any, have these treatments had on the data?
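As a rough illustration of one of the outlier treatments mentioned above, the sketch below applies a simple one-sided winsorisation with a fixed cutoff and counts the affected units; production winsorisation methodology is considerably more sophisticated than this:

```python
# Not the ABS production algorithm - a simple one-sided winsorisation sketch:
# values above a cutoff are pulled back to it, and the affected units are
# counted so their number and impact on the estimate can be reported over time.
def winsorise(values, cutoff):
    adjusted = [min(v, cutoff) for v in values]
    n_outliers = sum(v > cutoff for v in values)
    impact = sum(values) - sum(adjusted)  # reduction in the (unweighted) total
    return adjusted, n_outliers, impact

adj, n, impact = winsorise([5, 7, 120, 6], cutoff=50)
print(adj, n, impact)  # → [5, 7, 50, 6] 1 70
```

A rising outlier count or impact over successive cycles is the kind of signal that would prompt a stratification review.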

When issues have arisen during the collection cycle which have impacted on the estimates, explain the process which has been undertaken to address these issues. Have other areas of the ABS, such as Methodology Division or Collection Management Unit, been involved in the decision about any treatments or adjustments made to the estimates?


7. Data Confrontation

Comment on the credibility of the data. Is it consistent with 'real world' expectations? Have comparisons with other data sources, both internal and external, been made? Have major stakeholders, such as National Accounts, been consulted and provided comments on the data or any significant issues regarding the credibility of the data? What checks have been undertaken and are there any justifications for the difference to other sources?

Presenting the data in a time series fashion is good for identifying changes to cycles. There are some good diagnostics available from the seasonal adjustment process that can help with checking credibility. Are there any ratios that can be derived from internal or external data sources that can be used to identify changes or problems that can't be identified by examining the data by itself?
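A ratio against an internal or external benchmark series, as suggested above, can be derived simply; the series names and figures below are invented for illustration:

```python
# Sketch only: ratio of survey estimates to an external benchmark series per
# period. A stable ratio supports credibility; a sudden shift flags a
# potential problem. All figures here are invented.
def confrontation_ratios(survey, benchmark):
    return {p: round(survey[p] / benchmark[p], 3)
            for p in sorted(survey) if p in benchmark}

survey = {"2005Q1": 210.0, "2005Q2": 220.0, "2005Q3": 300.0}
benchmark = {"2005Q1": 200.0, "2005Q2": 208.0, "2005Q3": 205.0}
print(confrontation_ratios(survey, benchmark))
```

In this made-up example the jump in the final period's ratio is exactly the kind of movement that would prompt further investigation before clearance.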

8. Revisions

Were there any revisions to historical data during this cycle that are not considered normal revisions, such as those due to changes in methodology? If so, what were they, why did they occur and what impact did they have on past estimates? Provide tables and/or graphs showing absolute and percentage changes to key data items. Were other areas of the ABS, such as Methodology Division or National Accounts, involved in the decision to revise these historical estimates? Is a note required in the publication to explain to the users of the data the reason for the revisions and highlight the impact?

9. Systems and Processes

Were any systems or processing changes implemented during this collection cycle? For example: was the editing process (input or output) modified and implemented during this cycle? If so, what impact did this have on the data, if any? Is it expected that these changes will affect future cycles?

Also discuss any resourcing issues that occurred. Due to other priorities, did the normal allocated resources change for this cycle which could have impacted on the quality of the data? For example: did the introduction of new systems or processes free up resources which were allocated to other processes? Did the change to systems or processes require additional resources to ensure that the expected data quality was achieved? Were resources required from other areas of the ABS?

10. Confidentiality

Are there any confidentiality issues that need to be raised for this cycle? Were any other areas of the ABS consulted to discuss any confidentiality issues? If so, what was the outcome of these discussions? For example: were any estimates suppressed due to confidentiality? If estimates weren't suppressed and there are confidentiality issues, do we have permission to release?
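As a hypothetical illustration of suppression (actual ABS confidentiality rules are more involved than a simple count threshold), cells with too few contributors can be withheld along these lines:

```python
# Illustrative only - thresholds and rules vary: suppress cells whose
# contributing-unit count falls below a minimum, replacing the estimate with
# None so it is not released.
def suppress(cells, min_count=3):
    """cells: {label: (estimate, n_contributors)} -> released estimates."""
    return {label: (est if n >= min_count else None)
            for label, (est, n) in cells.items()}

cells = {"Mining": (1200.0, 12), "Arts": (85.0, 2)}
print(suppress(cells))  # → {'Mining': 1200.0, 'Arts': None}
```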

11. Emerging Issues

Based on the information presented above, are there any emerging issues that will have an impact on future cycles? For example: has a change in the population or the response patterns indicated that a sample design review is required? Are changes to the questionnaire required to improve the quality of the data received from respondents?

Are there any changes planned for future cycles that will have an impact on the data or the resources required, both in terms of staffing and computing resources?

12. Other Issues

Are there any other issues pertaining to the clearance of this cycle which have not been covered in sections 1 to 11? As this is a Clearance Document, only include issues which will assist the members of the clearance team in the clearance process for this cycle.


Quality Manual : Chapter 04

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Status: *** Final *** v 22/08/2007
Last Updated: 09 Jun 2009
Chapter No. & Title: 04. Statistics Sourced From Administrative Data
Document Version: 22/08/2007


Last modified by Narrisa Gilbert on 09/06/2009 05:55 PM


Statistics Sourced From Administrative Data

1. This chapter is for staff who work with or are looking to acquire administrative data for statistical purposes. This could be for ad hoc statistical analysis, substitution of data currently directly collected for producing statistical outputs, or the ongoing production and dissemination of statistics from the source. Some principles and best practices to deal with issues relating to quality will be provided, along with hints and issues to think about when acquiring or using administrative data.

2. It will provide a reference for those who are about to acquire administrative data, and assist those who have existing relationships with data providers through the ongoing statistical use of their administrative data.

3. Some examples of published ABS statistical data sourced from administrative systems include:

- Register of Births, Deaths and Marriages provides the ABS with valuable data which is used in population estimates and projections. Data from this source can also be used to measure likely demand for services associated with population growth, such as housing, child-care and schools (Births, Australia - ABS Cat. No. 3301.0; Deaths, Australia - ABS Cat. No. 3302.0).
- Medicare's change of address advice is used as an indicator of internal migration used in population estimates and projections.
- Motor Vehicle Registries provide snapshots of their registers to be used as a population frame for the Survey of Motor Vehicle Use (the data from the snapshot is published in Motor Vehicle Census - ABS Cat. No. 9309.0).
- Department of Education collects information on school enrolments (Schools, Australia - ABS Cat. No. 4220.0).
- Customs documentation completed for goods sent or received at international sea or air ports provides the basis for overseas trade data and export and import statistics (International Merchandise Trade, Australia - ABS Cat. No. 5422.0).
- Local government authorities' collection of information on building approvals is used to construct an indicator of economic activity (Building Approvals, Australia - ABS Cat. No. 8731.0).
- The Australian Taxation Office (ATO) collects information on personal and business taxation, used in Regional Wage and Salary Earner Statistics, Australia (cat. no. 5673.0.55.001 and 5673.0.55.003) and the National Regional Profile (cat. no. 1379.0.55.001 and 1379.0.55.002).
- Experimental Estimates of Personal Income for Small Areas, Taxation and Income Support Data, 1995-1996 to 2000-2001 (cat. no. 6524.0.55.001) has been produced from data provided by both the ATO and Centrelink.
- The National Centre for Education and Training Statistics (NCETS) unit has responsibility for the National Schools Statistical Collection. This contains aggregated data on schools, students and staff collected by the state and territory education agencies for government schools and the Department of Education, Science and Technology (DEST) for non-government schools.
- Other non-ABS data that is used:
  - Medicare (Australia): number of Medicare enrolments, aggregated by postcode/age/sex level
  - Drivers licences (South Australia): number of drivers licence holders, aggregated by postcode
  - Electricity connections (ACT only): number of residential electricity connections, aggregated by suburb
  - Australian Electoral Commission data (Australia): a snapshot of enrolments at the end of each quarter - unit record data but only a few data items - no names or addresses

4. The focus of the manual is on the management of the quality of statistical outputs produced from administrative data. This chapter is geared towards electronic data files and spreadsheets of extracted data from administrative data providers. However, other types of administrative data include pen and paper write-ups of the data from existing files and hard copy data supplied (e.g. extracts). It is not a "how to produce outputs" manual, and staff new to using administrative data or producing statistical outputs generally will find that it will need to be used along with other "how to" manuals such as Appendix 7 Quality Declaration and Assessment of the NSS Handbook (National Statistical Services Handbook). Another source that may be of use is the ACT Office checklist for Administrative Data.


5. The chapter starts with the principles behind managing the collection and production of statistics to achieve the quality required for key uses of the outputs. It will then look at characteristics of administrative data sources likely to influence the quality of any statistical output produced from such a source. This material should be relevant for all staff working with administrative data, whether on an existing ongoing basis or seeking to develop a new source. The chapter then takes you through various quality issues likely to surface in the design of each of the operational phases of a system using administrative data for statistical purposes. It will then specifically look at aspects of managing for quality, in particular risk management, quality gates, continuous improvement and relationship management.

6. The various sections within the chapter are:

01 Special features of administrative data which can impact on quality
02 Managing to achieve quality required
03 Understand the quality requirements for key uses of outputs
04 Understand and document the source
05 High level design to achieve quality required
    a) acquire data
    b) process inputs
    c) transform inputs
    d) analysis & explain
    e) assemble & dissemination
06 Risk management
07 Quality assurance/quality gates
08 Continuous improvement
09 Relationships
10 Contacts for more information and support


Quality Manual : Section 04-01

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual
Status: *** Final *** v 23/08/2007
Last Updated: 09 Jun 2009
Chapter No. & Title: 04. Statistics Sourced From Administrative Data
Section No. & Title: 01. Special features of administrative data which can impact on quality
Document Version: 23/08/2007


Last modified by Narrisa Gilbert on 09/06/2009 06:01 PM


04 01 Special features of administrative data which can impact on quality

1 Administrative data are collected for regulatory purposes (e.g. birth and death registration) or for the administration of various government programs (such as social security benefits, education, and health). As such, the records created are collected with a specific decision making purpose in mind, and the identities on a record are crucial. The records can also be used for statistical purposes, and in doing so can offer strategic and some statistically important advantages over direct collection of data from the population concerned.

2 As well as enabling the production of statistics derived directly from the source data, administrative data can be attractive for substituting for some variables or subpopulations in a direct collection to reduce response load or improve efficiency in estimation. It is also being increasingly used by linking with other administrative data sets or even surveys/censuses to produce richer datasets for statistical utilisation. This manual is primarily concerned with use of administrative data to produce statistical outputs, although the other applications are discussed from time to time.

3 Utilisation of an existing information source avoids adding to the response load and can provide full coverage of a population, but will usually mean trade-offs in terms of what statistical information can be produced, and the quality features will differ from those of direct collection. Cost will often be less, particularly as full coverage can be readily obtained (although not necessarily of exactly the population required for statistical users). But cost may not be less if there is a lot of transformation and processing to adapt the data to better meet user requirements, and timeliness can be an issue. Given the importance of avoiding duplication of collection and reducing overall load on respondents, it is an Australian Bureau of Statistics (ABS) corporate strategy to make use of administrative data sources wherever possible as a cost effective means of obtaining data which reduces respondent load, improves scope and coverage and increases the availability of longitudinal and small area data. Please see the following doclink for the official ABS policy regarding dissemination of non-ABS statistics. (Subject: Dissemination of non-ABS Data; Database: ABS Corporate Manuals; Author: Lisa Snushall; Created: 31/03/1999)

4 In many other ways the use of administrative data for the production of statistics or analysis is just like the production of statistics from data collected directly from respondents (as done with a census or survey). Many of the principles and best practice methods for the various processes involved from end to end apply regardless of the source of the data, and the approaches for managing quality are the same.

5 There are issues associated with administrative data that users need to consider. Some of these include:

Page 65: Quality Manual Chapters 1 and 2 (1) - United Nationsunstats.un.org/unsd/dnss/QAF_comments/ABS_Internal_Quality_Man… · Quality Manual : Section 02-01 Manual Category Ü C. Methodology

- the limited opportunity to influence the business processes underlying the creation of the administrative data (such as the population for which the administrative data exists, and how and what information is collected about them);
- the importance of managing the relationship with the managers of the administrative system to understand and influence the business processes used to produce the administrative records and ensure effective supply;
- system changes and how providers extract their data;
- data may not be current;
- question and interviewer biases may occur depending on who supplies the information, and how forms are designed and tested (if at all);
- may suffer from partial non-response due to missing fields;
- different forms being used across different jurisdictions or even within the same jurisdiction;
- the tension between accuracy and timeliness - generally the more timely the data the less accurate it is, e.g. a jurisdiction providing figures to their state parliament that differ from the figures provided to the ABS a few months previously;
- political issues that may arise, e.g. in the supply of data to the National Centre for Crime and Justice Statistics (NCCJS) a jurisdiction refused to supply data for 8 months as the data was not flattering to that jurisdiction compared to other jurisdictions;
- the separate laws that apply to each state - registration laws and privacy acts;
- understanding the impact on quality and the cost involved when setting up internal ABS systems for processing administrative data;
- the history of stability in the data and the drivers of instability (e.g. the frequency with which the regulatory environment around the system is subject to change);
- the maturity of the system (e.g. whilst Electronic Funds Transfer at the Point Of Sale (EFTPOS) data has reached a stable level of market penetration, this was not the case over the first 10 years when progressive uptake would have masked any change in real patterns of retail spending);
- the large volume of records (in the millions), which impacts on the ABS' ability to 'clean up' all of the records; and
- the cost-benefit trade-off of spending real money and effort to influence and change the up-stream process, e.g. in the case of building approvals the Administrative Data Acquisition Unit (ADAU) and the BSC worked hard to convert two thirds of councils to provide electronic data files.

6 The first of these aspects can have a significant impact on the quality of the resulting statistics produced, particularly their relevance, accuracy and coherence. The decision to use administrative data should rest on assessing whether the noise generated in the data by:

- the business processes that drive it; and
- the processes implemented at source or during statistical processing

is acceptable in relation to the statistical signal about the underlying economic (or social) phenomena being measured.

7 An example of the cost of not making this assessment well is the UK attempt to use health records to measure change in real outputs of services. Continuing changes in the underlying coverage and definitions of the administrative processes generated statistics that had little credibility and eroded public confidence. For more information, see the Atkinson Review: Final Report, Measurement of Government Output and Productivity for the National Accounts, 2005, Crown Copyright 2005, which can be found on the Office for National Statistics website <www.statistics.gov.uk>.

8 Customs records are another example. Changes from month to month in the value of exports and imports can reflect changes in customs business processes as well as the real-world changes in behaviour that are assumed by users of the data. Furthermore, post September 11, customs authorities have had to tighten up on export documentation, which has improved the accuracy and coverage of data in export records. If this were not explained and documented by statistical agencies, it could well be misinterpreted as a genuine increase in exports. Quality management in this context means, at the very least, being aware of the impact, attempting to estimate its magnitude and reporting the fact to users.


9 The starting point for quality management in the use of administrative data is a well documented model of the business process that generates the data, and the factors that drive it, so that quality dynamics can be understood and quality changes anticipated. It is also important to understand the agendas of the providers when managing statistical quality.

10 The relationship aspect can influence both the quality of the resulting statistics (mostly with respect to timeliness, but also accuracy and quality improvement) and the ongoing availability and continuity of any statistical series produced. Good relationships influence how well the business processes for the source are understood and able to be influenced or managed. For example, changes to the collection or coding of the source data may be able to be negotiated. In the case of the NCCJS, edit rules are provided to suppliers and run prior to delivery of the data. These edits aid the provision of 'clean data' that meet ABS standards, which helps the data to be processed in a more timely manner.
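Supplier-run edit rules of this kind can be sketched as a small validation pass. The field names, code set and rules below are invented for illustration; they are not the actual NCCJS edits.

```python
# Hypothetical pre-delivery edit rules; the field names and code set
# are illustrative only, not the actual NCCJS edits.
VALID_OFFENCE_CODES = {"0111", "0211", "0311"}  # invented subset

def run_edits(record):
    """Return a list of edit failures for one administrative record."""
    failures = []
    # Mandatory-field edit: key fields must be present and non-blank.
    for field in ("offence_code", "court_date", "jurisdiction"):
        if not record.get(field):
            failures.append(f"missing {field}")
    # Validity edit: the offence code must belong to the agreed code set.
    if record.get("offence_code") and record["offence_code"] not in VALID_OFFENCE_CODES:
        failures.append("invalid offence_code")
    return failures
```

A supplier would run edits like these before delivery and follow up any failures, reducing the cleaning burden on the receiving agency.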

11 In view of their importance to managing quality there are sections in this chapter on the data source (04 04) and relationship management (04 09).

12 Just as information collected by the ABS is collected in accordance with the Census and Statistics Act (CSA), with its requirements on how the data is used, administrative data is usually covered by specific legislation (for example, taxation legislation) as well as more general legislation applying to information, such as Federal and State Privacy Acts. Restrictions placed on access to and use of the source data by this other legislation must be understood and followed in addition to the restrictions imposed by the CSA.

13 The legal situation with ABS use of administrative data can get complicated when data obtained by the ABS is edited and amended. Because the data has been 'collected' by the ABS from the source, and is therefore subject to the obligations and restrictions of the CSA, there are difficulties in deciding what information can be returned to the source for their use. To overcome this problem, direct collection of information from respondents to the administrative system for editing purposes is sometimes done by the ABS acting on behalf of the source owners and with their explicit authority.

14 A quick checklist for determining the usefulness of administrative data is:


04 02 Managing to achieve quality required

1 Quality management is founded on the simple premise that 'to manage something you need to know what standard you are aiming at and what you are achieving'. Management is then focused on aligning the achieved quality with the required quality.

2 There is no single approach to managing quality, but the following basic principles lay the foundations for best practice management of the quality of statistical outputs:

1. understand the quality requirements for key uses of the statistics (in terms of the ABS Data Quality Framework (DQF));
2. use the quality requirements to guide the evaluation of potential data sources and to understand those characteristics which will impinge upon any use made of the source data and the outputs produced from it;
3. working within the constraints of the chosen data source, and using the quality requirements of key uses, develop a high level design for acquiring and using the source data to produce statistical outputs;
4. use project management methodology for implementation of the collection design, along with a system for registering and managing action taken, plus a system for change management during operation;
5. use risk assessment and management techniques during implementation and operation of the design to minimise the likelihood of unplanned events stopping the achievement of quality goals; and
6. use quality assurance and quality improvement techniques to align the quality of what is produced with what is required.

3 Each of these principles, except principle 4, is discussed in the following sections, along with the tools and methods which can be used to manage quality according to these principles. For information on Principle 4 the reader is referred to the Project Management Framework, Striving for Project Management Excellence manual.


04 03 Understand the quality requirements for key uses

1 Quality as perceived by users is quite broad, and so the quality of Australian Bureau of Statistics (ABS) statistical products should be described in terms of the 6 dimensions of quality in the ABS Data Quality Framework (DQF) (i.e. relevance, accuracy, timeliness, accessibility, interpretability and coherence). Please see the doclink below for more detailed information, and the attachment for an example of how the 6 dimensions of the DQF could be used to help assess quality.

(Subject: Report - Qualifying Quality - A Framework for Supporting Quality-Informed Decisions; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Bill Allen; Created: 02/05/2002; Doc Ref: BALN-59R7JY)

2 All of the quality dimensions can be influenced to varying degrees by characteristics of the source of the information used to produce the statistics. The dimensions of accessibility and interpretability will be very much determined by how the statistical output is made available. The constraints of administrative data, in terms of what has been collected and when it is available, will primarily influence how well user needs are able to be met with regard to the relevance, accuracy and coherence of the resulting statistics and their timeliness. Hence it is very important to know the key user requirements so that processes can be implemented to better align the resulting statistics to user needs. It is also very important to make judgments from time to time about the overall value obtained compared with the costs of producing the statistics.
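As a concrete illustration of working with the six DQF dimensions, an assessment can be recorded as a simple structure that cannot be built until every dimension has been addressed. This is only a sketch: the rating values and collection name are invented, and the DQF itself prescribes no particular data structure.

```python
# The six DQF dimensions named in this manual.
DQF_DIMENSIONS = ("relevance", "accuracy", "timeliness",
                  "accessibility", "interpretability", "coherence")

def quality_statement(collection, ratings):
    """Build a quality statement, insisting every DQF dimension is assessed."""
    missing = [d for d in DQF_DIMENSIONS if d not in ratings]
    if missing:
        raise ValueError(f"unassessed dimensions: {missing}")
    return {"collection": collection,
            **{d: ratings[d] for d in DQF_DIMENSIONS}}
```

Forcing all six dimensions to be considered, even if only to record "not assessed", mirrors the manual's point that quality statements should cover the whole framework rather than accuracy alone.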

3 There are issues that may impact on the ABS's ability to meet user requirements. These include political issues such as the national standard for recording crime statistics: if all states were to move to this standard, the incidence reported in various crime and justice statistics would increase, and hence for political reasons some states have been resistant to moving to the national definition.

4 The ABS tries to better meet user needs by expanding existing collections to cover new and emerging issues. However, in the case of administrative data this can be quite difficult, as the ABS does not own the collection and the provider is not as concerned as the ABS with meeting user requirements. For example, stalking and phone stalking are now considered to be crimes, whereas 30 years ago they were not, and information now needs to be collected on these incidents.

5 Another issue the ABS needs to be mindful of is changes to publication tables, which may impact on a provider's willingness to supply data. For example, a publication may move from rates to indexes for its output. Because of the change in how the output is displayed, people may interpret the data differently, and the provider's fear of looking bad may increase (or decrease) depending on how the data are presented.

6 User requirements can be obtained from several existing sources as well as from specific information gathered from key users. Many user reviews have been conducted over the years by the ABS for most subject fields. In particular, Information Development Plans should point to key unmet user needs. As well as the managers of the administrative system, the policy arms of the government departments responsible for the system are important sources of user requirements, particularly if they have a keen interest in obtaining statistics from the system about its effectiveness in meeting policy objectives.

7 If it is considered, based on a general understanding of user requirements, that there could be significant use made of the data source if its existence and features were more widely known, but there is a lack of clarity on specific needs and uses, then an initial strategy may be simply to make the source visible by describing its key elements and listing it in directories, on websites, etc. Even so, this must be done with an eye on likely user requirements.

8 In practice, the fact that the source data already exists and statistics can be readily produced from it, and the greater limitations on what can be collected from the target population of the administrative system, mean that outputs can be more input driven than user driven. While this is not necessarily bad, it does mean that regular assessments should be made of the extent to which important uses are being made of the statistics produced that justify the costs.


04 04 Understand and document the Source

1 Quality management with respect to administrative record data rests on understanding the business processes that generate the data, and if possible designing the statistical processes that absorb the data as input in such a way as to buffer and filter the undesirable properties arising from the operation of the upstream part of the process. These statistical interventions are rarely 100% foolproof, so at the very least the statistician needs to be able to understand and report administrative process noise impacts when publishing statistics.

2 While it is necessary to understand the source used for all statistics, most statistics produced by the Australian Bureau of Statistics (ABS) are from surveys of businesses and households, for which there is extensive knowledge about the populations from which information is collected. With statistics sourced from administrative data, however, each source is different, and there are particular characteristics of administrative data sources which can impact on the quality of any statistics produced from them. In particular, non-standard questions and classifications are often used, and changes are made relatively frequently, making it difficult to maintain a consistent time series. Also, the units for which information is collected and maintained will often need to be grouped or transformed into statistical units. Other examples are continuity of the data, legislation requirements, and coverage of the population.

3 In some cases statistical outputs are produced at the national level from information obtained from each of the State and Territory administrative systems. Not only are the characteristics of each source of vital importance but so are the differences between sources and how these differences can flow through to the end statistics.

4 It is important therefore to understand and document those aspects of the source information which are critical to the design of the system and which assist sound use of the data by helping users understand quality and limitations. This includes not just the information likely to be extracted for statistical analysis but also those features of the environment in which the administrative system exists which can influence quality e.g. the way the information is collected, public sensitivities about the collector or the system. When doing so, the Data Quality Framework (DQF) should be used as the framework to ensure all aspects which can impinge upon quality are covered.

5 A word of caution is needed here. Sometimes there can be extensive information relating to a source (particularly if it has been in existence for a long time with many changes), and so it is critical that judgments are made about what information is relevant to facilitating use and understanding the quality of the end statistics in order to keep the task manageable and cost-effective.

6 The main characteristics of an administrative data source likely to influence each of the DQF dimensions for statistics produced from the source are as follows:

Characteristics of administrative data sources likely to influence the quality of statistics produced, by DQF dimension:

Relevance: how well the data meets the needs of users.

- The definitions for the population of the administrative program, and for the data items collected about each entity, are critical influences on the relevance of any statistics able to be produced from the source.
- The primary (and sometimes sole) objective of the source owner is meeting legislative requirements or administrative program goals. Hence, the population of the program, the time reference period for some data, and the data items available in the system may not meet the "relevance" dimension for effective statistical use.
- There can be differences between the defined scope of the population covered by the administrative system and the actual coverage achieved. This can be influenced by the existence of penalties or benefits obtained, and the steps taken by the owners of the system to ensure all eligible entities are covered.
- A limited range of demographic information is usually available, as only those variables needed for administration purposes are collected. Also, categories used to collect and/or code responses can be restrictive because of form design or processing issues.
- Geography is often coded to postcode, as this is the most cost-effective strategy for the administrative system.
- Some administrative records are flow records, i.e. information about an entity is built up over time depending on the needs of the administrative program and the entity's interactions with the system (e.g. social security clients). Other systems are stock records, i.e. they refer to events during a period or with respect to a reference date (e.g. births, imports).

Accuracy: the degree to which the data correctly describes the phenomena it was designed to measure.

- Quality standards used to achieve administrative goals can be different to the quality standards needed to produce statistics, although the application of sound quality assurance principles by the administrative program will usually result in quality suitable for statistical use. Areas to look at include the forms used, method of collection, consistency checks, and method of data coding and capture.
- The purpose of the program, and the public perception of the organisation managing it, can influence the accuracy of information obtained.
- Non-sample errors are usually the main influence on accuracy. Forms are rarely subjected to the same best practice design principles as statistical questionnaires, and respondent and/or interviewer understanding is rarely tested.
- Main sources of non-sample error are:
  - 'up-to-dateness' of records (deaths, outstanding returns) and duplicates;
  - question and 'interviewer' biases (e.g. some information, such as with death records, is supplied about the entity by someone else; 'interview' data for, e.g., Indigenous origin can differ from self-reported responses on a form);
  - question wording may also differ for the same item, e.g. the question regarding occupation for death statistics varies across each State;
  - lack of consistency in the application of questions or forms across 'data gatherers' (e.g. sometimes old forms are used up before being replaced with new forms, and so there could be a period of overlap when a mixture of questions is used);
  - detailed examination of records in extremely large datasets is not possible;
  - missing data (particularly for variables not essential for administration purposes); and
  - extent of coverage of the target population.
- Sample errors will be an issue if sampling is used to cut down processing costs, although the cost structure for sampling is such that adequate sample errors are usually able to be achieved.

Timeliness: the delay between the reference period (to which the data pertains) and the date on which the data becomes available.

- How up-to-date records are in an administrative system will depend on the reasons for the system, timeliness requirements imposed by legislation or the system, and any incentives or penalties which exist (e.g. in Tasmania, registration of births occurs at the hospital, whereas other States wait for the parents to register the birth).
- It will also depend on how the information is collected and the degree of automation for capturing any paper-based collection.
- It can be difficult to process administrative data because of duplicate records, i.e. the same record can be updated on the dataset several times before the dataset is delivered, with replicates of the record occurring each time a change is made.
- To meet user requirements for timely data it is often necessary to obtain information from the administrative source before all returns for the reference period have been obtained (e.g. tax, births). This means:
  - trade-offs need to be made between accuracy (completeness and estimation) and timeliness; and
  - some method is needed for handling or estimating missing returns.

Coherence: comparability with other sources of information.

- Statistical coherence within the source can be an issue, as it will not usually be an objective of the managers of the system.
- Changes to questions, scope, etc. over time and across jurisdictions can impact on the consistency of time series data. These changes can be the result of legislative or program objective changes.
- The completeness or quality of older vs newer data can also impact on time series and domain comparisons.
- Statistical concepts for questions are not always suited to the administration purpose or the means of collection (e.g. occupation on arrival and departure cards, geography).
- The use of standard classifications can be of value to the source managers, particularly if coding tools are available, as they can provide a cost-effective means for coding and using the information within the administrative system.
- The use of standards will also help achieve coherence with other sources of statistics.
- Good, accessible documentation on metadata for the administrative system is critical to assessing coherence.

Accessibility: ease of access to data.

- Sometimes performance reporting requirements on a program can lead to the public availability of statistics extracted by the system owners. These may be limited and focused more on the efficiency of the program's service delivery or its effectiveness.
- Often an administrative source provides statistical information of a nature and focus different to the prime focus of the administrative program, and hence there will be no interest on the part of the administrative managers in directly supporting these other uses (e.g. births, deaths and marriage records).
- Confidentiality and privacy acts impact on the supply of administrative data. Sometimes administrative data is not provided due to political funding, privacy and personality issues. There can also be provisions on the supply of data to the ABS which impact on access.
- System changes can impact on administrative data, e.g. the ABS updated its systems to meet provider requirements for locality, and then the provider changed its mind in regards to the reporting of locality.
- Source owners will often be happy for the ABS to take responsibility for making statistical information accessible, although they may not be keen on providing access to supplementary information or unit record information for further analysis. (Sometimes there can even be a lack of cooperation when the statistics extracted are used to shed light on politically sensitive issues.)
- Addition of new data items to an administrative dataset can be quite difficult, as concurrence needs to be obtained from all of the suppliers (e.g. the States).
- 'Corporatisation' of a government program or agency can result in what were previously considered public good databases becoming a commercial asset, and access can be restricted or hindered by issues of cost recovery, copyright or profit making.
- Information extracted from an administrative source by the ABS will be made more accessible, both for multiple use within the ABS and to the public.
- The issue of who the data belongs to, e.g. for prisoner health data, the information is obtained from private prisons, government prisons and state prisons, as well as other government agencies.

Interpretability: the availability of information to help provide insight into the data.

- The key here is getting information from the source managers so that the source can be adequately documented and made available. Information on some aspects of quality may be particularly hard to get, as there may not be enough information kept over time, or the managers may not be keen on having service provision disrupted by attempts to obtain information. Some providers have no idea what the input is, e.g. IT contractors may not know how the data was collected, hence there are gaps in knowledge.
- Lack of consistent application of business rules over time and/or across offices can be a major influence on consistency, and hence on the interpretability of statistical output.
- Different definitions may occur in each state. This is a huge threat if the differences in the definitions are not known. The best advice is: don't assume everything is the same - ask!
- Care needs to be taken when interpreting the data, as users may not be happy with the way the data has been presented.
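The duplicate-record problem noted under Timeliness (the same record updated several times before the dataset is delivered) is commonly handled by keeping only the latest version of each record. A minimal sketch, assuming each record carries an identifier and an update timestamp (both field names are illustrative):

```python
def latest_versions(records):
    """Return one record per id: the version with the latest update timestamp."""
    best = {}
    for rec in records:
        key = rec["record_id"]
        # Keep this version only if it is newer than what we have seen so far.
        if key not in best or rec["updated"] > best[key]["updated"]:
            best[key] = rec
    return list(best.values())

# Illustrative extract: record A1 appears twice, amended between deliveries.
extract = [
    {"record_id": "A1", "updated": "2009-01-05", "value": 10},
    {"record_id": "A1", "updated": "2009-03-02", "value": 12},  # later amendment
    {"record_id": "B7", "updated": "2009-02-11", "value": 4},
]
```

ISO-format date strings compare correctly as plain strings, which keeps the sketch dependency-free; a production system would parse them into proper dates.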

7 Sometimes a source is already well documented by the owners and the information about it available through a website. This can be the case when the source has been widely used for statistical and research purposes to support policy development and evaluation for the owners. Usually, however, dialogue with the owners/managers of the system is needed to obtain the information. This can be challenging at times when extra effort is required on the part of the managers to provide the detailed information required, particularly for historical data with changes over time not being well documented. Maintaining good working relationships with the owners of the source data is so important for achieving desired quality outcomes that the subject is covered at the end of this chapter of the manual as Section 04 09.

8 As with surveys, a skirmish could be conducted to get information on exactly what is available, or to understand how it is collected and how this might impact on quality.

9 A checklist is a useful way of understanding the data source and evaluating its suitability for use in statistical analysis. Please see the examples below of an administrative data checklist and an explanatory note incorporating the checklist. The HMC_Data quality checklist was used for the first time in 2006; it was sent to providers to complete and send back with the dataset. The checklist will be updated after each cycle.

(Subject: Acrobat of Section 4 and appendices 2 and 3 of NSW Admin Data Course; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Narrisa Gilbert; Created: 22/11/2006; Doc Ref: NGIT-6VS2J2)

(Subject: Acrobat of an example of an explanatory note incorporating the Admin data checklist; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Narrisa Gilbert; Created: 22/11/2006; Doc Ref: NGIT-6VS5DG)

10 Providing feedback to the source at the end of each cycle is a useful process. This will hopefully encourage the providers to use the information given to improve upon their own internal process and hence improve the data provided for future cycles. Regular reviews of on-going collections are important.

(Subject: Example Quality Report for Criminal Courts collection; Database: VIC NCCJS WDB; Author: Christina Feild; Created: 16/02/2007; Doc Ref: CFED-6YG6PT)

11 Documentation about the source should be entered into the Collection Management System so that it is readily available for all staff working on the production or use of the statistical outputs and for providing material to be made available in various forms to users of the outputs.


04 05 High level design to achieve quality required

1 The prime objective of a high level design for obtaining and using the source data should be the production of statistical outputs which meet key user requirements specified in accordance with the 6 dimensions of the Data Quality Framework (DQF). The design should be constrained by budget, time and resources as well as the constraints of the source identified in the previous section. It should include specification of quality indicators and measures for assessing and assuring the quality of each phase in operation, and the use of quality gates as approval mechanisms for moving between the phases and for assessing any strain in quality (see Section 04 07 on quality assurance).

2 When designing the system, note that the statistical system (data collection and production of outputs) should be separate from the system in which the administrative records and transactions are stored. This enables the data to be manipulated, aggregates formed and statistical tests applied in the statistical system without interfering with the administrative collection. It also helps to maintain the integrity and consistency of data, which could keep being changed if it were kept within the administrative system and extracted each time it was used. If multiple uses of the administrative data are expected within the Australian Bureau of Statistics (ABS), then consideration should be given to maintaining one input data base to support the multiple uses, as this will minimise duplication of effort, achieve consistency across uses and avoid multiple demands being made on the owners of the source data.

3 The first step is to determine the statistical methodology to be used for utilising the data (e.g. regression estimation using supplementary information as benchmarks, tabulations from a census of all records within scope). This will then allow decisions to be made about what data is required, and how it will be used as the basis of design across each of the operational phases of the end to end (e2e) model.
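A benchmark-based estimator of the kind mentioned can be illustrated with the simplest case, a ratio estimator. Real collections would typically use fuller regression estimation, and the figures below are invented:

```python
def ratio_estimate(admin_total, admin_benchmark, full_benchmark):
    """Estimate a full-population total, assuming the administrative
    source's coverage ratio on the benchmark variable carries over to
    the variable of interest."""
    return admin_total * (full_benchmark / admin_benchmark)

# e.g. the administrative records cover 80 of a benchmark 100 units and
# report a total of 400 for the variable of interest.
estimate = ratio_estimate(admin_total=400.0, admin_benchmark=80.0,
                          full_benchmark=100.0)  # 500.0
```

The key assumption, that units missing from the administrative source behave like the covered units on the ratio of interest, is exactly the kind of coverage issue the design phase needs to assess.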

4 What follows are the key elements of a high level design, and the key aspects relating to quality, to be considered for the operational phases of the e2e model. The main operational phases of the e2e model for which use of administrative data can have processes and quality issues different from those for direct collection are those relating to acquisition, processing and transformation of inputs into statistical outputs. Furthermore, if an input data base is to be developed and maintained to support multiple uses within the ABS, then some of the design aspects in the front end phases will be the responsibility of the area maintaining the input data base.

Acquire data

5 For the data acquisition phase the basic design steps are:

a. based on the methodology to be used to produce statistical outputs, determine what information (including metadata) is to be extracted or collected from the source;
b. determine what other information is required for using the information, e.g. benchmarks, supplementary information for estimating for coverage deficiencies;
c. decide how all of the information is to be collected from sources, and develop processes and systems for getting and receiving the information;
d. develop quality indicators and measures for assuring correct receipt of information; and
e. decide on critical quality measures for use as a quality gate and for assessing any strain in quality.
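The receipt-assurance step (d) often amounts to comparing the delivered file against control figures supplied separately by the source. The control-total field names below are assumptions for illustration:

```python
def check_receipt(records, control):
    """Compare a delivered extract against the source's control figures;
    return a list of discrepancies (empty means the receipt check passes)."""
    problems = []
    # Record-count check against the control count supplied by the source.
    if len(records) != control["record_count"]:
        problems.append(f"record count {len(records)} != {control['record_count']}")
    # Control-total check on a key value field.
    total = sum(r["value"] for r in records)
    if abs(total - control["value_total"]) > 1e-9:
        problems.append(f"value total {total} != {control['value_total']}")
    return problems
```

A non-empty result would hold the delivery at the quality gate pending follow-up with the source.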

6 Some quality issues with data acquisition which arise particularly with administrative data are:

What is to be collected?

a. Population - does it match user requirements? Is it close enough? Is it consistent over time? Does it need 'transformation' to match the required population, e.g. aggregation to different units, or adjustments or supplementary collection to cover the missing population? i.e. decide on scope and coverage and determine which records are in and which are out.
b. Do you take all records (census) or will a sample be adequate?
c. What cut-off dates should apply for obtaining records, and how do you deal with late or missing records, and amendments made over time (will revisions to the series be made)?
d. One-off or ongoing use?
e. Stock or flow view? Will longitudinal analysis be of value for meeting user needs? (One strong advantage of administrative records can be their relatively easy ability to support longitudinal analysis, and hence to enable the production of flow and change-of-state statistics.) If so, some form of unique identifier will need to be used for each record to allow information to be built up over time.
f. Reference dates for data items - are they suitable? Consistent across records?
g. What explanatory variables are available from the source? Are they in standard form, or can they be transformed to standard form?
h. Are there hierarchical records? Are all levels of the hierarchy required, or is some transformation into a flat record required? Is some grouping or consolidation of reporting units into a statistical unit required?

How is the information to be collected?
a. How are the data obtained from the source: as aggregated figures, as a unit record file according to specifications, or as a dump of all records with extraction first done by the statistics producer?
b. Physical form: medium for transfer, record structure/file format. At the beginning the ABS needs to decide what file structure is required to best meet system requirements and then advise the provider of this format.
c. Validation checks.
d. Security arrangements for transfer.
e. When? Ongoing, weekly, at a point in time for each reference period? Timeliness may be an issue with late availability of some records (e.g. outstanding tax returns or birth registrations) and will be a critical factor determining when results of the desired quality can be published. Rules will be needed for dealing with late records.
f. Arrangements for any follow-up with the source, or by the source with the population, to obtain critical missing data or fix erroneous records.
g. Issues of cost recovery are likely to be raised by source owners.
h. Storage and archiving, including any return to source of records or coded records.

7 A Service Level Agreement (SLA) or Memorandum of Understanding (MOU) covering what is collected, in what form, etc. is good practice for both parties involved. The SLA should specify what, when and how information is to be provided, the confidentiality requirements of the source owner, etc. (see Section 04 09 on Relationships). It should specify quality requirements and the obligations on the parties should these requirements not be met.

8 The National Centre for Crime and Justice Statistics (NCCJS) in Victoria for the first time in 2007 provided a checklist for data providers to complete about each data extract that they provided to the ABS. This checklist will be reviewed and updated each year. The NCCJS notes that where a state completed the checklist properly and provided all the information requested, the result was much cleaner data being received, which in turn made it quicker to process and understand.

Quality indicators

9 Common quality indicators and measures for this phase which particularly apply to using administrative data are:

! population counts (which can be monitored over time by themselves and/or against predicted likely counts);
! frequency distribution for key variables;
! number of missing values for key variables;
! births since last period;
! deaths since last period;
! changes to units;
! outstanding returns;
! coverage of population; and
! date of receipt of records.

Note that similar indicators and measures should be developed for all sources of inputs, not just the administrative source.
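Indicators of this kind can be computed automatically whenever an extract is received. The following is a minimal sketch only, not a prescribed ABS implementation; the record layout, the `id` field and the `industry` variable are hypothetical illustrations.

```python
from collections import Counter

def acquisition_indicators(current, previous, key_var):
    """Compute basic receipt indicators for an administrative extract.

    current/previous: lists of dicts, one per unit record, each carrying
    a unique 'id' field. key_var names a key variable to profile.
    (Field names are illustrative only.)
    """
    cur_ids = {r["id"] for r in current}
    prev_ids = {r["id"] for r in previous}
    return {
        "population_count": len(current),
        # units present now but not last period
        "births_since_last_period": len(cur_ids - prev_ids),
        # units present last period but missing now
        "deaths_since_last_period": len(prev_ids - cur_ids),
        "missing_key_values": sum(1 for r in current if r.get(key_var) in (None, "")),
        "frequency_distribution": Counter(r.get(key_var) for r in current),
    }

prev = [{"id": 1, "industry": "A"}, {"id": 2, "industry": "B"}]
curr = [{"id": 2, "industry": "B"}, {"id": 3, "industry": None}]
ind = acquisition_indicators(curr, prev, "industry")
print(ind["births_since_last_period"], ind["deaths_since_last_period"])  # 1 1
```

In practice each indicator would be stored and monitored over time so that unusual movements trigger investigation before the extract proceeds to processing.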

Quality gate

10 Common quality gates for this phase which particularly apply to using administrative data are:

! scope and coverage consistent over time;
! data still exists;
! changes to the data have been examined and taken into account with respect to the use;
! the reported structure of the file hasn't changed;
! check that the loading of the data is the same, i.e. that no information has been lost in the transfer from one system to another;
! classificatory items: check that they exist on the ABSIW (information warehouse) and that the classifications have been loaded; and
! check that the correct formats and variables have been loaded along with the correct number of records.
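A quality gate of this kind can be expressed as a set of pass/fail checks that must all succeed before an extract proceeds. The sketch below is an assumed illustration: the field names, the expected count and the 5% tolerance are hypothetical, not ABS standards.

```python
def acquisition_gate(records, expected_fields, expected_count, tolerance=0.05):
    """Return (passed, failures) for a received extract.

    expected_count is the predicted record count from previous cycles;
    tolerance is the acceptable relative deviation (assumed value).
    """
    failures = []
    if not records:
        failures.append("no data received")
    # record count within tolerance of the predicted count
    if expected_count and abs(len(records) - expected_count) / expected_count > tolerance:
        failures.append(f"record count {len(records)} outside tolerance of {expected_count}")
    # file structure unchanged: every record carries the agreed fields
    for i, rec in enumerate(records):
        missing = expected_fields - rec.keys()
        if missing:
            failures.append(f"record {i} missing fields: {sorted(missing)}")
            break
    return (not failures, failures)

recs = [{"id": 1, "state": "VIC"}, {"id": 2, "state": "NSW"}]
ok, problems = acquisition_gate(recs, {"id", "state"}, expected_count=2)
print(ok)  # True
```

The useful design point is that the gate returns the list of failures, so the reasons for holding back an extract are recorded rather than just a pass/fail flag.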

Process inputs

11 For the process inputs phase the basic design steps are:
a. determine what needs to be done to get the acquired information into a form suitable for statistical use as per the chosen statistical method (e.g. coding, edit and imputation, handling of missing data (partial or complete), handling of duplicates, deaths and missing records);
b. identify classification systems needed;
c. specify and develop processes and systems for processing the data into a form for statistical use;
d. develop quality indicators and measures for assuring quality of processes applied; and
e. decide on critical quality measures for use as a quality gate and for assessing any strain in quality.

12 Some quality issues with processing inputs which arise particularly with administrative data are:

! administrative data can be very large in volume, and processing and validation techniques need to be designed with this in mind;
! dealing with non-standard or out-of-date standards used for data items, and the quality of any coding done by the owners of the data;
! whether follow-up with the source or directly with the respondent is necessary (and possible) for obtaining missing data, or whether imputation can be used;
! handling of duplicates;
! dealing with changes over time; and
! if records are to be linked, rules for dealing with inconsistencies across records for the same entity need to be developed, as well as rules for dealing with records not exactly matched on the basis of unique identifiers.

13 Some of the effort required at this stage can be greatly reduced by encouraging and helping the source owners to use standard data items and/or apply standard classifications and coding to the information for their own purposes so that coded information is acquired and the coding done to a satisfactory standard. Influencing providers is a very difficult task, and often organisations may want compensation for any changes to systems that are designed to extract data for statistical purposes.

14 If it is considered necessary for quality reasons to obtain missing or correct information, then acceptable arrangements need to be made with the source owners which should be covered in any SLA. In particular, if any additional information is obtained directly by the ABS (e.g. clarification with a doctor of a cause of death), then legal complications can arise with any return of coded information back to the administrative source for their own use.

Quality indicators

15 Common quality indicators and measures for this phase which particularly apply to using administrative data are:

! levels and value of imputation;
! coding errors;
! expected number of records;
! edit failures; and
! extent and result of auto-correction processes.

Quality gate

16 Common quality gates for this phase which particularly apply to using administrative data are:

! check imputes have been completed and that there is no missing data;
! check there are no duplicates;
! check all classification codes exist;
! check that all data meet the scope and coverage criteria required;
! check initial inputs, number of edits triggered, and derivations (it is important to remember that other areas may not have as intimate a knowledge of the data as you, and hence may not have run all the checks that are required);
! check any changes in the frequency of the collection; and
! weekly meetings may be needed to discuss impacts.
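The first three gates above (no missing data, no duplicates, valid classification codes) lend themselves to direct automation. A minimal sketch, assuming a hypothetical unit record layout with an `id` field and an industry-code variable:

```python
def processing_gate(records, valid_codes, code_var="industry_code"):
    """Check a processed file for duplicates, missing data and invalid
    classification codes before it moves to transformation.
    Field names are illustrative only."""
    failures = []
    ids = [r["id"] for r in records]
    dupes = sorted({i for i in ids if ids.count(i) > 1})
    if dupes:
        failures.append(f"duplicate ids: {dupes}")
    incomplete = [r["id"] for r in records if any(v is None for v in r.values())]
    if incomplete:
        failures.append(f"missing data in records: {incomplete}")
    bad_codes = [r["id"] for r in records if r.get(code_var) not in valid_codes]
    if bad_codes:
        failures.append(f"unknown {code_var} values in records: {bad_codes}")
    return (not failures, failures)

recs = [{"id": 1, "industry_code": "0111"}, {"id": 2, "industry_code": "9999"}]
ok, problems = processing_gate(recs, valid_codes={"0111", "0112"})
print(ok)  # False: record 2 carries an unknown code
```

Validating codes against the loaded classification (rather than free-text matching) is what makes the "all classification codes exist" gate checkable at all.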

17 A query process may need to be set up over individual data items. For example, if there is a query about data item "a", then you may need to go to the source for more information; however, if the query is over data item "b", you may just impute the information.

18 A collection management structure that indicates who is responsible for maintaining the documentation for the collection, the processing of that collection and the sign-off procedures may be of use to ensure quality in the process.

Transform inputs into statistics

19 Once data has been acquired, processed and coded it is ready for statistical use by transformation into statistics. For this phase the basic design steps are:

a. determine the processes for summarizing input data into statistics (e.g. estimation and tabulations, statistical regressions);
b. develop systems for transforming the input data into statistics;
c. develop quality indicators and measures for assuring the quality of transformation processes applied to the input data; and
d. decide on critical quality measures for the transformed data for use as a quality gate and for assessing any strain in quality.

20 Some design and quality issues with transforming inputs which arise particularly with administrative data are:

! strategies need to consider the volume of data to be handled;
! estimating for scope or coverage deficiencies as determined by user requirements: there could be population deficiencies due to the administrative system not covering all of the target population, or scope deficiencies due to, for example, late submission of forms after the date of acquisition of the data from the system. Solutions can take the form of inflation factors and statistical models; and
! output editing can be more important than with direct collection, as there is usually no follow-up with respondents to get correct information to work with at the unit record level.
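As a simple illustration of the inflation-factor approach mentioned above, a total based on records received by the cut-off date can be scaled up using the late-submission pattern observed in earlier cycles. The figures below are invented for illustration only.

```python
def inflation_adjusted_total(received_total, historical_received, historical_final):
    """Scale a cut-off total up for records expected to arrive late.

    The inflation factor is estimated from earlier cycles as
    (final total) / (total received by the same cut-off).
    All figures here are hypothetical.
    """
    factor = historical_final / historical_received
    return received_total * factor

# e.g. earlier cycles showed 9,500 records by cut-off grew to 10,000 finally
est = inflation_adjusted_total(9_600, historical_received=9_500, historical_final=10_000)
print(round(est))  # 10105
```

In a real collection the factor would be estimated from several cycles (and possibly modelled by stratum), and the size of the adjustment would itself be reported as a quality indicator, as paragraph 21 suggests.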

For an example of documentation on the adjustments made to administrative data (including modelling), see the examples below:

(Subject: Documentation from revision of and further work on improving ABSMP BAS benchmarks.; Database: MD Proj Vol 1b - ESG Client WDB; Author: Elise Pierson; Created: 26/05/2006; Doc Ref: EPIN-6Q68G8)

Quality indicators

21 Common quality indicators and measures for this phase which particularly apply to using administrative data are:

! size of scope and coverage adjustments;
! standard errors if any sampling is done; and
! size of amendments due to output editing.

Quality gate

22 Common quality gates for this phase which particularly apply to using administrative data are:

! check any manipulation of the data is still appropriate;
! check data matches requirements;
! keep in contact with the provider as a quality gate; and
! check the data structure is still the same after manipulations.

Analyse & explain

23 For the 'analyse & explain' phase the basic design steps are:
a. decide on how results will be validated and reconciled with other sources;
b. decide on quality measures to be produced so that any results are evaluated for reliability, robustness and consistency with other sources; and
c. determine what information will be presented and how, particularly what accompanying commentary and supporting graphs etc.

24 Generally the same techniques and quality issues associated with using data directly collected from the target population are relevant. The main differences in the aspects likely to be encountered when using administrative data, as opposed to directly collected data, for statistical analyses are:

! analysis is usually done to establish important and/or statistically significant relationships for reporting and to evaluate quality. Such analysis done with administrative data will often be limited by the restricted range of 'explanatory variables' available in the source data; and
! given the issues mentioned above with respect to population coverage or lack of standard variables, consistency and coherence with other sources can be an issue with statistics produced from administrative data, particularly compared with statistics produced directly from the one frame covering the target population (e.g. a business register).

Quality indicators

25 Common quality indicators and measures for this phase which particularly apply to using administrative data are:

! consistency checks with other sources;
! decomposition of estimates; and
! historical levels for comparison.

Quality gate

26 Common quality gates for this phase which particularly apply to using administrative data are:

! consistency checks completed:
  ! with other data; and
  ! with historical data.
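A check against historical data can be as simple as flagging a movement in the current estimate that falls outside the range seen in recent periods. The sketch below uses an assumed 10% movement threshold; real collections would set the tolerance from the observed variability of the series.

```python
def consistent_with_history(current, history, max_movement=0.10):
    """Flag the current estimate if its movement from the latest
    historical value exceeds max_movement (assumed threshold).

    Returns (consistent, relative_movement)."""
    latest = history[-1]
    movement = (current - latest) / latest
    return (abs(movement) <= max_movement, movement)

# current estimate 1,050 against the three most recent published values
ok, move = consistent_with_history(1_050, [980, 1_000, 1_020])
print(ok, round(move, 3))  # True 0.029
```

A failed check would not by itself mean the estimate is wrong; it is a trigger for the reconciliation with other sources described in step a of paragraph 23.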

Assemble & Disseminate

27 For the 'assemble and disseminate' phase the basic design steps are:
a. determine what media to use for disseminating the statistical outputs, including information on quality (metadata);
b. develop processes for assembling the various forms of outputs to be released;
c. develop an awareness and marketing plan to ensure wide knowledge of key results and of the availability of the various outputs;
d. develop clearance documentation for explaining outputs and summarizing quality for approval to release (this clearance document becomes the paramount quality gate for assessing the quality of statistics to be published and approving their release);
e. determine a release approval process as the quality gate for release of outputs; and
f. update the CMS to reflect any changes that may have occurred between this cycle and the previous cycle.

28 Generally the dissemination of outputs produced from administrative data sources involves the same processes and the same quality issues as disseminating outputs produced from a direct collection. That is, the outputs should be designed and made available to users in accordance with the quality requirements specified across the 6 dimensions of the DQF. What is different, and can be a significant aspect, is that there should be consultation with the source owners so that they are comfortable with what is being done. This consultation should be done at the development stage to agree on principles for release, not once the system has been implemented and outputs are first produced. Source owners will often want to ensure explanatory material is accurate and reflects their understanding and perspective on the source material.

29 Experience shows that this can be a sensitive and time-consuming matter:
! if there is political or commercial sensitivity about the data source or the results;
! if the results point to quality deficiencies of the administrative system;
! if there are legislative or policy restrictions on who can access microdata;
! ownership issues can surface, which can result in tedious discussions about the owners wanting to sign off any publication, revenue sharing to cover their costs incurred, or joint publications; and
! lack of interest on the part of the owners in providing support for answering queries requiring more information about the source.

30 Access to microdata to support further research and analysis should be considered. Consultation should occur with the source owners, as there are likely to be some issues to be addressed: these could include legislative or policy barriers, sensitivities about confidentiality and privacy, and lack of interest in supporting queries from researchers.

31 Metadata relating to what information is collected, and how it is transformed and processed to produce statistical output, is critical for users to understand the quality of the outputs produced. Best practice is for such metadata to be created as a by-product of the various processes and stored in the CMS. Critical information on concepts, sources and methods and quality issues should be made available along with the various output products, with directions provided to more detailed information for those users needing it. While this applies to all statistical outputs regardless of their source data, there are usually some particular characteristics of outputs produced from administrative data which need to be drawn to the attention of users of the outputs. These are the features referred to in Section 04 07, as well as any adjustments made to the data as a result of these features during the transformation, analyse and assemble phases.

Quality indicators

32 Common quality indicators and measures for this phase which particularly apply to using administrative data are:

! check totals; and
! examine clearance documentation information, i.e. the information that is filled out in the clearance documentation at various stages of the processing.

Quality gate

33 Common quality gates for this phase which particularly apply to using administrative data are:

! check major totals are recorded in the clearance documentation at various steps in the processing cycle; and
! internal consistency checks of the publication.
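Internal consistency checks of a publication typically verify that component cells add to their published totals. A minimal sketch over a hypothetical state-by-state table (figures invented for illustration):

```python
def components_sum_to_total(table, total_key="Total"):
    """Check that the non-total cells of a publication table add
    exactly to the published total."""
    total = table[total_key]
    component_sum = sum(v for k, v in table.items() if k != total_key)
    return component_sum == total

table = {"NSW": 120, "Vic": 95, "Qld": 80, "Total": 295}
print(components_sum_to_total(table))  # True
```

The same check, run across every table in a publication, catches the common failure where one cell is revised late but its marginal total is not.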

Clearance documentation

34 Clearance documents are prepared and presented to approving officers so they can make informed decisions about data quality and the fitness of the statistics for release. Clearance documentation aims to demonstrate that appropriate quality assurance processes have been applied in the collection. It is also used to enable the approving officer to assess "fitness for publication" based on information from the quality assessment and analysis of the statistical data.

35 A broad framework for the creation of clearance documentation for the purpose of approving publications for release is set out in the document "Guidelines for Clearance Documentation" prepared by Business Survey Centres. These guidelines have the following sections for their clearance documentation:

0. Summary Table
1. Main Points
2. Major Issues that Influence the Data
3. Frame, Scope and Coverage
4. Sample Design and Estimation Methodology
5. Response and Imputation
6. Adjustments
7. Data Confrontation and Coherence
8. Revisions
9. Systems and Processes
10. Confidentiality
11. Emerging Issues
12. Other Issues

36 While the main aim of the documentation is to focus on the quality of the data for the current cycle, information should be presented as a time series to show the quality of the collection over time. Basics to include are: emphasise what's new, explain changes, put statistics in context by relating them to external events, and report on the major influences. The Quality Information Checklist, in Appendix 1 of the Quality Incident Response Plan (QIRP), gives a list of measures that could be included as part of the clearance documentation.

37 Some examples of clearance documentation used for administrative data collections (as well as survey collections) are:

(Subject: June 2005 SISC Paper : Economic Collections Clearance Documentation Guidelines; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Alan Herning; Created: 01/04/2005; Doc Ref: AHEG-6B27W5) Please note that areas are advised to use this as a guide and then add any collection specifics to this documentation.

(Subject: Example Quality Report for Criminal Courts collection; Database: VIC NCCJS WDB; Author: Christina Feild; Created: 16/02/2007; Doc Ref: CFED-6YG6PT)

04 06 Risk management

1 The high level collection design of the previous section sets out the strategies and processes to be followed for extracting data from the source, and processing and analysing it to produce statistical output for dissemination with the aim of providing users with the quality of statistics they require specified in accordance with the data quality framework (DQF).

2 Ensuring that the quality needs of users are met, however, requires more than a design based on a sound statistical methodology. Any design rests on underlying assumptions which are met to varying extents in operation, and unexpected events always occur. Hence achieving the required quality requires the application of sound quality management principles to the implementation and operation of the design. The key aspects of implementing these principles with respect to the use of administrative data are outlined in this and the remaining sections.

3 One important strategy for implementation is the use of risk management, which has been identified as an important factor for success in project management. Basically, an assessment of risks to the successful development, implementation or operation of a project is undertaken, the likelihood and potential impact gauged for each risk event, and strategies identified for amelioration or prevention. Usually risks and their management are monitored by project boards. Further details on risk management can be found in the ABS Project Management Framework manual, Striving for Project Management Excellence.

4 The project management framework (PMF) tends to focus on project budget and delivery risks. These aspects of project management are critical and can impact on quality. However, to ensure the quality of output is as specified in accordance with the DQF, there also needs to be identification and management of issues which can impact on statistical soundness. These risks can arise during development and implementation of a collection as well as during ongoing operation.

5 When a risk assessment is done for the production of statistics from an administrative data source there are 3 broad areas for risk assessment (in no particular order):

1. within the source organisation(s), e.g. there may be many organisations supplying data to the ABS; there is also the risk of outsourced information technology (IT) people who don't know about the data being supplied to the ABS;
2. within the Australian Bureau of Statistics (ABS); and
3. the wider environment: stakeholders, users, the public.

6 A risk assessment should be done for each of these areas, the likelihood and potential impact assessed, and a risk management strategy developed for each risk. The aim should be to identify and act upon factors which can influence quality and therefore cause a failure to meet user needs.

As a guide, the main areas where risk is likely to be identified (but not necessarily the only ones) are as follows:

1 Within the source

! Policy and legislative changes impacting on the future of the source or resulting in significant changes.
! Lack of interest or support on the part of the source organisation in having statistics produced from the source, because ABS output isn't the core business of the provider. This can impact in many ways:

  ! no support for obtaining the data needed to produce the statistics or supplying the data in a timely manner;
  ! lack of support for attending to form or system problems which impact on statistical quality.

! Key personnel changes can be very significant here, along with outsourced IT people.
! Negative events arising from incidents in the source organisation, or more directly with the source program, which threaten public confidence; these can impact on the accuracy of information provided or raise concerns about wider use of the data.
! Changes to forms or the method of data collection from respondents, which can reduce data quality and impact on time series. Implementation of changes can also be a problem, with a long changeover period when a mixture of old and new forms is being used. Changing of data item names is also an issue that needs to be monitored.
! System problems in the source agency, and contracting out of part of the system (particularly relating to data management), can impact on supply.
! Cost recovery or commercial expectations.
! Compliance load or respondent load policies of government and/or the source agency.

Key strategies for managing these risks are likely to include:
! For most or all of these, the key strategy will be identifying and maintaining key relationships within the source agency (particularly at senior management levels). This strategy is so important for achieving quality objectives that it is discussed further as a separate section (Section 04 09);
! getting the source agency to see the value for themselves in having quality statistics produced, and hence investing in the quality of the system;
! providing expertise and tools to assist the source agency (staff, technical expertise in various areas, e.g. form design); and
! returning coded data to the source for their use in managing the program (provided it is legal to do so).

2 Within the ABS

! Understanding the quality features of administrative data which differ from directly collected data.
! Lack of interest in maintaining the relationship with the source or in understanding the source provider's interests and primary objectives.
! Changes to international and ABS standards.
! Underestimating costs and quality impacts due to the volume of data and its growth.
! User funding of ABS collections: the user may reduce or cut funds if they can't get the data they need.

Key strategies are likely to include:
! relationships (again, especially with Boards and Advisory Groups); and
! provision of tools and expertise to implement changes.

3 Wider environment

! Privacy concerns and the interests of relevant public interest organisations.
! Political sensitivity of some data relating to program performance.
! Changing user expectations, particularly about new output forms or access to data such as microdata.

Key strategies are likely to include:
! maintaining good relationships with privacy custodians and keeping them informed of what is being done;
! maintaining dialogue with users; and
! stressing the differences between administrative use and statistical use of data in administrative systems, and how identified information flows one way.

7 Throughout the above it is apparent that the establishment and maintenance of a sound working relationship with the source owners/managers is a key ingredient for success. The principles of good relationship management along with some case studies are set out in the last section (Section 04 09).

8 One area where risk tends to be heightened is when changes are being made to a system. Changes to systems, management, etc will occur from time to time within both the source agency and the ABS, and any risk assessment should look hard at risks likely to arise due to changes.

9 The Motor Vehicle Census project is tightly managed because there are regular staff changes; hence documentation and communication are very important. Annual reviews are conducted to find out what went well and what didn't. These reviews include IT people in the feedback so that the ABS can go to the provider to ask for improvements in the supply of the data.

04 07 Quality assurance/ quality gates

1 Implementing a methodology through the various end to end (e2e) phases as outlined in Section 04 05, particularly with the use of project and risk management tools as outlined in Section 04 06, is an essential step for meeting the quality requirements of the key uses of the outputs. It will not guarantee, however, that the planned quality is achieved unless there is an ongoing program of quality assurance which regularly checks that processes are working as designed and overall are in fact producing the quality required.

2 Quality assurance can take many forms but should be based on the following principles:

1. You need to know what standard of quality is required and what is being achieved. Chapter 04 03 covers the setting of quality requirements of users based on the ABS DQF, and Chapter 04 05 covers the development of a high level design aimed at meeting these requirements. Because the design has had to work within budget and other constraints, the target quality may in fact be different from the user requirements.
2. Manage the quality achieved in each phase of the end-to-end process and have sign-off of the quality achieved. Tools commonly used to manage process quality, particularly when processes are managed by different areas of an organisation, include Service Level Agreements (SLAs), Memorandums of Understanding (MOUs) and quality gates (see p35 of the Quality Incident Response Plan (QIRP)). Quality gates are tools designed to improve early detection of errors or flaws in statistical processes. A template for identifying risk areas in a process, and hence any quality gates that might arise from the acknowledgement of these risk areas, is:

This template is also stored on the Quality Assistant database.
3. Measure quality regularly. "If you do not know what quality is being achieved, how can you manage it?" Quality measures can take many forms and may be indicators relating to critical processes or phases, or be some aspect of the quality of the end product (e.g. sample error). Regular measures should be supplemented with a program of quality evaluations or studies (e.g. random record auditing to check source data, observation studies at the point of data collection into the administrative system, record reconciliation with other sources, etc.).
4. Maintain a system for registering problems and managing action taken. This not only keeps a record of issues and ensures that nothing is missed, it also helps maintain knowledge management: other areas having similar issues with their administrative data can find out what steps another area has taken to fix the issue.
5. Use a change management system when making changes.
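The problem register described in point 4 can be a simple structured log that records each issue, its status and the actions taken, and is searchable by other areas. The sketch below is an illustrative design only; the class name, fields and workflow are assumptions, not an ABS system.

```python
import datetime

class ProblemRegister:
    """A minimal problem register: log issues, record actions taken,
    and search past entries (illustrative design only)."""

    def __init__(self):
        self.entries = []

    def log(self, collection, description):
        """Register a new open issue and return its id."""
        entry = {
            "id": len(self.entries) + 1,
            "collection": collection,
            "description": description,
            "raised": datetime.date.today().isoformat(),
            "status": "open",
            "actions": [],
        }
        self.entries.append(entry)
        return entry["id"]

    def act(self, issue_id, action, close=False):
        """Record an action against an issue, optionally closing it."""
        entry = self.entries[issue_id - 1]
        entry["actions"].append(action)
        if close:
            entry["status"] = "closed"

    def search(self, term):
        """Find past issues by keyword, supporting knowledge sharing."""
        return [e for e in self.entries if term.lower() in e["description"].lower()]

reg = ProblemRegister()
pid = reg.log("MVC", "duplicate registrations in March extract")
reg.act(pid, "asked provider to dedupe at source", close=True)
print(len(reg.search("duplicate")), reg.entries[0]["status"])  # 1 closed
```

The search method is what turns the register from a checklist into knowledge management: an area facing a familiar problem can see what action resolved it elsewhere.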

3 Quality assurance should culminate in sign off of the outputs by an approving officer using clearance documentation which incorporates the results of any quality measures produced and the results of any evaluations. Clearance documentation is a critical quality gate for the Assemble and Disseminate phase and more information about it can be found in Section 04 05.

4 Quality issues to be followed up should be identified and actions for their attention planned; a problem register and a change management system are useful tools for managing this. It is important, however, to distinguish between quality issues which mean critical quality requirements of users are not being met and hence need to be fixed as soon as possible (i.e. quality assurance), and improvements which can be made but are not critical (i.e. quality improvements). Any improvement which can be made must be based on a cost-benefit decision, and this is the focus of the next section (Section 04 08).

5 When a collection is regularly producing results below the quality standard, or the standard changes in response to key user needs or expectations (e.g. improved timeliness), the system design and its processes need to be reviewed to identify areas to fix or improve their performance.


04 08 Continuous Improvement

1 A program of quality assurance as outlined in the previous section will show areas where improvements can be made to processes and source data, to improve the quality of the statistical outputs produced or the efficiency of their production. As mentioned in that section, improvements must be based on cost-benefit decisions; otherwise the overall cost of the outputs will be increased for little or no gain in utility.

2 In addition, regular reviews of established statistical outputs should be conducted to see if they are still meeting key user needs across the six dimensions of the data quality framework (DQF). These reviews should be the main driver for changes, as they will be based on user needs, and are likely to be in the following areas:

• identifying any new information that might be available and the potential for adding questions to improve the relevance of the data set; the ABS can try to influence source providers to add extra questions, but this is a very large challenge because providers do not collect the data for ABS statistical purposes;
• adoption of standard questions and/or classifications; again, this is very difficult for a number of reasons, e.g. different jurisdictions, or the provider not collecting data for ABS statistical purposes;
• implementation of the latest versions of standards;
• quality issues with the data impacting on its use;
• timeliness;
• PMF: a long time frame should be planned and reviewed to see if the project is still on track, and whether there are any changes of direction due to changes in user requirements; and
• annual reviews (audits) of user-funded statistics, as the ABS needs to justify (explain) the amount of money contributed to achieve the statistics released, and to maintain the level of funding received. This also enables any requests from users for more information to be funded by those users, as all current monies can be accounted for.

3 If changes are required to the source data and are not supported by the source managers, then statistical methods could be considered instead. Some examples are: imputing for missing data; estimating for missing coverage; and using link matrices (concordances) to map old standards to current ones.
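The link-matrix approach can be sketched as follows. This is an illustrative example only: the categories and allocation proportions are hypothetical, and in practice the proportions would come from a published concordance between the old and new classifications.

```python
# Illustrative sketch of a concordance (link) matrix mapping counts classified
# under an old standard to a new one. Categories and shares are hypothetical.

old_counts = {"A": 1000, "B": 500}   # counts on the old classification
# concordance[old][new] = proportion of the old category allocated to the new one
concordance = {
    "A": {"X": 0.8, "Y": 0.2},
    "B": {"Y": 1.0},
}

new_counts = {}
for old_cat, total in old_counts.items():
    for new_cat, share in concordance[old_cat].items():
        new_counts[new_cat] = new_counts.get(new_cat, 0) + total * share

print(new_counts)   # → {'X': 800.0, 'Y': 700.0}
```

Each old category's shares should sum to 1 so that the total count is preserved across the mapping.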

4 A quality assurance program run by the source managers on the source data should also be encouraged, and the source managers invited to participate in quality reviews. The National Centre for Crime and Justice Statistics (NCCJS) recently started sending a data quality checklist for its providers to complete and return to the ABS when the administrative data is supplied; the goal of this checklist is to improve the quality of the data provided. Providing the source agencies with a review of the entire processing cycle is also useful, as they may be able to implement changes that improve the data supplied to the ABS.

5 Opportunities are provided from time to time by changes to systems for the source data or the processing environment within the Australian Bureau of Statistics (ABS). These changes are not usually driven by the need for improvements to the quality of the end statistics but provide an ideal time to consider any changes which may be needed to improve quality (e.g. to implement the latest version of a standard).

6 The ABS has a specific policy regarding administrative data. Type 1 data is data brought into the ABS's own databanks, processed there, and released as statistics (e.g. births, deaths, international trade). The policy states that the ABS will not pay for administrative data; however, the ABS may meet the costs incurred in the extraction of statistics from administrative data. The ABS must uphold the policy and legislation governing the use of non-ABS data when dealing with providers and any issues that may arise during the provision of administrative data. For more information on the ABS policy and legislation in regard to non-ABS data please see:


04 09 Relationships

1 The risk assessment and management section (04 06) concluded that the most important strategy for managing a range of risks to quality is the maintenance of good ongoing relationships with the owners and managers of the source data. These relationships should be established to varying degrees at all levels from the Chief Executive down to operational staff.

2 Experience over the years, across many uses of administrative data, reinforces this assessment. Unfortunately, many of the lessons were learnt in hindsight, after problems had occurred. This section outlines the key aspects of relationship management, provides a case study to illustrate problems which can arise, and outlines some tools for maintaining good relationships so that the risk of problems occurring is minimised.

3 It is always a sound principle to maintain good relationships with the providers of information to be used for statistical purposes. With population censuses and a range of surveys this is done through a variety of policies and means including for example an extensive public awareness campaign run in association with the census. Unlike the direct collection of information, however, the use of administrative data as a source introduces a third party into the supply chain (or even a fourth party if the owners of the source data contract out the management of its day to day operation). This third party has other obligations and interests and for them the production of statistics from their source data will always be of secondary or lower importance.

4 The Census and Statistics Act (CSA) provides authority and powers for the Australian Statistician to obtain administrative data for statistical purposes. Similarly, administrative data systems are usually set up by, and operate under, other legislation with specific objectives to be delivered by the managing authority. In addition, more general legislation relating to privacy also applies. The best way forward is for the various agencies involved to work together to achieve their respective objectives within their various legislations without resorting to unproductive legal disputes about higher authority.

5 In any successful relationship, the foremost principle is to understand the objectives and constraints of the other party. With respect to the use of administrative data, this means understanding the needs of the owners and managers of the administrative data. It also means making sure the owners understand the value of the statistical information which can be produced from their data, and the critical requirements of the data for producing statistics of the required quality. This is especially important if the contact person changes, as relationships will need to be re-established, and priorities and perspectives re-discovered or at least checked for consistency with the previous contact. Because many providers have contracted out their information technology (IT) work, developing a good relationship with these contractors is also important, as they may not be aware of the quality of the data provided. Helping them understand the statistical aspects of the data and how the ABS uses it will therefore be valuable.

6 The following have been shown from experience as being useful tools in establishing and maintaining a sound relationship.

• If a new administrative source is being established which will be of value statistically, the ideal situation is for the Australian Bureau of Statistics (ABS) to be involved in the project right from the start, so that statistical aspects can be built into the system being implemented. This is rarely the case, however, as most systems have been in place for a long time. The next best option is to be involved when the administrative systems are being redesigned to meet changed objectives or to operate under new information technology.
• Establish a champion for statistical use within the source agency. This could be a senior executive or a key user of the statistics (e.g. a policy evaluator associated with the administrative system). Sometimes the information is so critical that agreements need to be reached between the respective chief executives.
• Ensure the owners understand the value and importance of the key statistics produced from their source. This includes ensuring they understand the key quality requirements, such as accuracy and timeliness of supply, and of course the impact of a disruption to supply.
• Understand what expertise the ABS can bring to bear on a project or to support ongoing operation. This expertise includes statistical coding software, information management skills, outposted officers, and statistical standards for collecting and using certain information (e.g. demographic information about a respondent, geographic classification).
• Map not only the provider to the ABS, but also other users of the statistics who may be able to influence the provider. This is beneficial when changes need to be made to the existing administrative data set (e.g. the addition of extra questions, wording changes). The ABS can then approach these influential users and draw on their authority to get changes implemented in the administrative collections.
• Maintain continuous contact with the provider through email, telephone, in person and so on. This contact should not be limited to occasions when the ABS requires something from the provider. The Administrative Data Acquisition Unit (ADAU) sends a thank-you letter to its providers; sending a corporate Christmas card is another suggestion.
• Internal relations also need to be maintained.
• Use some form of written agreement between the parties. These agreements have taken a variety of forms over the years, but essentially set out objectives and obligations for both sides. Memorandums of Understanding (MOUs), Data Service Agreements and Service Level Agreements (SLAs) are examples. The doclink below provides two examples of MOUs from Statistics New Zealand: one is a detailed MOU between Statistics New Zealand and the New Zealand Agriculture and Forestry department, and the other is a template for constructing more general MOUs. An MOU between the ABS and the Queensland Local Government Grants Commission is also included for further reference.

(Subject: Examples of MOUs from Stats NZ; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Narrisa Gilbert; Created: 20/12/2006; Doc Ref: NGIT-6WMT3D)

• Establish some form of governance arrangements involving senior managers, at least for critical stages of projects such as the implementation of a new or revised system. Consider widening the relationship by involving other stakeholders with an interest in ensuring successful utilisation of the source data. The arrangements should provide for the management of incidents which can impact on supply and quality.

7 Complexities can arise when more than one source is responsible for the administrative data. An example is the Overseas Arrivals and Departures (OAD) issue, commonly referred to as the Passenger Cards issue, which occurred in the early 2000s. The resolution of the issues in this particular case was achieved through the development of stronger relationships between the sources (the owners and the third party) and the ABS, and it is a clear demonstration of the need for communication between all parties involved in administrative data.

(Subject: OAD case study on using admin data; Database: MD Projects Vol. 3 - Other Clients WDB; Author: Geoff de Baux; Created: 20/02/2006; Doc Ref: JMQR-6M6TYE)

8 The ABS currently has a "many ABS areas to one provider" burden issue. It would be beneficial if the ABS had one central contact to deal with each source in the supply of administrative data; this central person would then handle all data requests going from the ABS to that provider. The ABS currently has an outposted officer at the Australian Taxation Office (ATO), an arrangement which has worked well for the coordination of tax data.


04 10 Contacts for more information and support

1 An area in the Australian Bureau of Statistics (ABS), the Administrative Data Acquisition Unit (ADAU), has responsibility for acquiring, transforming and loading administrative data on behalf of the economic collections of the ABS, and some PSG collections on request. Areas contemplating or investigating use of administrative data to substitute or supplement data directly collected from providers are advised to contact the ADAU. The ADAU will be able to provide information and support on a range of issues applicable to negotiating and receiving administrative data.

2 In particular the ADAU:
• negotiates and manages the supply of input data from providers, including negotiating memorandums of understanding (MOUs) for data supply;
• receives data in a wide range of formats using a variety of communication channels;
• formats, edits and transforms data as required by Business Statistics Centre (BSC) processes, then loads it to BSC processing systems;
• administers and develops the Secure Deposit Box (SDB), which provides for secure and authenticated lodgement of electronic data from providers over the Internet;
• provides technical facilitation for electronic data capture across the Economic Statistics Data Centre (ESDC);
• provides technical advocacy in terms of liaison with Technical Applications in Technical Services Division (TSD);
• provides business analysis in relation to input data process re-engineering;
• seeks efficiency gains and reduced times for data input processes;
• performs basic quality assurance checks on data after each load and reports these to BSCs; and
• project manages, facilitates and acts as technical advisor for the Multimodal Data Capture project.

The ADAU may be contacted by emailing to the Administrative Data Acquisition Unit Wdb.

3 Statistical Services Branch (SSB) provides general support in the area of quality measurement and quality assurance. In particular, SSB support staff can assist areas with the design of quality measures or quality gates. Contact your local SSB support staff, or the Assistant Statistician, SSB, for cross-cutting quality issues.

4 The Transport, Tourism and Local Government section (TTLG) runs the Motor Vehicle Census. This collection is extremely large, in the vicinity of 15 million records; if you require help or information on the processing of extremely large data sets, please contact the Director of TTLG.

5 The Health and Vitals Statistical Unit (HVSU) has written data quality statements for each data item as part of the Census Data Enhancement project. Templates for the data quality statements are currently being trialled, with refinements to occur over time. For more information, or for examples of these templates, please contact the Director of HVSU.

6 The HVSU has also had some major successes in negotiating changes to administrative data collections with providers. For information on how and what was achieved, or for advice on how to approach providers regarding possible changes to administrative data sets, please contact the Director of HVSU.


Quality Manual : Section 07-01

Manual Category: C. Methodology
Manual ID - No & Title: Quality - 08. Quality Manual

Final v 2007/01

Last Updated: 25 May 2007

Chapter No. & Title: 07. Appendices

Section No. & Title: 01. Template for quality assurance for change processes

Subsection No. & Title:

Document Version: 2007/01

Header

Status: *** Final ***

Comments: Not yet available

Status


Created by Bruce Fraser on 25/05/2007 12:28:36 PM

Audit Trail

0. Project name

<insert project title here>

1. Brief description of change (or doclink/s to it)

<write a couple of paragraphs outlining the project, or simply insert relevant doclink/s - but try to keep it brief!>

2. Summary of processes

Identify all the processes that will involve a change - these may or may not lead to a statistical risk. The process may be in the areas of

• people skills and training
• automated processes, e.g. new software or tools
• manual processes

(include a BPM of the process if one exists - it's a good idea to have one, as it will be helpful in identifying all the processes, changes and issues, and in showing the possible flow-on effects if there is an error "upstream")

Number | Process title
1      |
2      |
...    |

3. QA plans

For each of the processes identified in the previous section, fill out the following table (copy the table as many times as necessary). The key purpose is to consider the risks (particularly, statistical risks i.e. effects on statistical outputs if something goes wrong), identify a Quality Assurance (QA) strategy to manage the risk and develop key Quality Measures (QMs) to be used throughout the process to make sure things are on track. It's also important to identify who's doing the QA, and who is signing off on the outcome.

what: <copy the "Process title" from the previous section>

when: <when will the process (as opposed to the QA) be carried out? Can be a range of dates, e.g. Sep - Dec 2005>

risks: <identify the major risks, particularly statistical risks, i.e. potential effects on statistical outputs if something goes wrong>


qa strategy: <a brief description of the Quality Assurance strategy that will be followed>

key quality measures (QMs):

<dot point list of key quality measures that will be monitored to ensure things are on track, i.e. how will you know everything has gone correctly? These could be quantitative (e.g. an accuracy rate) or qualitative (e.g. a subjective assessment of whether the outputs are fit-for-purpose)>

who to implement QA:

<who's responsible for carrying out the QA?>

who to sign off: <who will sign off, i.e. decide that the results are of sufficient quality as to minimise any statistical risk?>

4. Sign-off of QA plan

<If desired, the QA plan could be signed off separately for each stage of the process. If this is the case, simply add an extra column to the table below for the title of each stage and a row for each stage>

Who to sign off QA plan | Date signed off

Related Documents: (Subject: QA plan for ANZSIC 2006 preparatory phase; Database: MD Proj Vol 1b - ESG Client WDB; Created: 08/11/2005; Doc Ref: PSCT-6HWTWP)