
Manual of Petroleum Measurement Standards
Chapter 4—Proving Systems

Section 7—Field Standard Test Measures

SECOND EDITION, DECEMBER 1998


Measurement Coordination Department


SPECIAL NOTES

API publications necessarily address problems of a general nature. With respect to particular circumstances, local, state, and federal laws and regulations should be reviewed.

API is not undertaking to meet the duties of employers, manufacturers, or suppliers to warn and properly train and equip their employees, and others exposed, concerning health and safety risks and precautions, nor undertaking their obligations under local, state, or federal laws.

Information concerning safety and health risks and proper precautions with respect to particular materials and conditions should be obtained from the employer, the manufacturer or supplier of that material, or the material safety data sheet.

Nothing contained in any API publication is to be construed as granting any right, by implication or otherwise, for the manufacture, sale, or use of any method, apparatus, or product covered by letters patent. Neither should anything contained in the publication be construed as insuring anyone against liability for infringement of letters patent.

Generally, API standards are reviewed and revised, reaffirmed, or withdrawn at least every five years. Sometimes a one-time extension of up to two years will be added to this review cycle. This publication will no longer be in effect five years after its publication date as an operative API standard or, where an extension has been granted, upon republication. Status of the publication can be ascertained from the API Measurement Coordination Department [telephone (202) 682-8000]. A catalog of API publications and materials is published annually and updated quarterly by API, 1220 L Street, N.W., Washington, D.C. 20005.

This document was produced under API standardization procedures that ensure appropriate notification and participation in the developmental process and is designated as an API standard. Questions concerning the interpretation of the content of this standard or comments and questions concerning the procedures under which this standard was developed should be directed in writing to the director of the Authoring Department (shown on the title page of this document), American Petroleum Institute, 1220 L Street, N.W., Washington, D.C. 20005. Requests for permission to reproduce or translate all or any part of the material published herein should also be addressed to the director.

API standards are published to facilitate the broad availability of proven, sound engineering and operating practices. These standards are not intended to obviate the need for applying sound engineering judgment regarding when and where these standards should be utilized. The formulation and publication of API standards is not intended in any way to inhibit anyone from using any other practices.

Any manufacturer marking equipment or materials in conformance with the marking requirements of an API standard is solely responsible for complying with all the applicable requirements of that standard. API does not represent, warrant, or guarantee that such products do in fact conform to the applicable API standard.

All rights reserved. No part of this work may be reproduced, stored in a retrieval system, or transmitted by any means, electronic, mechanical, photocopying, recording, or otherwise,

without prior written permission from the publisher. Contact the Publisher, API Publishing Services, 1220 L Street, N.W., Washington, D.C. 20005.

Copyright © 1998 American Petroleum Institute


FOREWORD

This publication consolidates the latest changes and improved technology in Field Standard Test Measures since it was first published in October 1988. Units of measurement in this publication are in United States Customary (USC) units and the International System (SI), consistent with North American industry practices.

This standard has been developed through the cooperative efforts of many individuals from industry under the sponsorship of the American Petroleum Institute (API) and with the assistance of the National Institute of Standards and Technology (NIST). Joint participation between API and NIST has enabled a new gravimetric calibration facility for test measures to be designed and built in Gaithersburg, MD. This new calibration laboratory will provide higher accuracies, improved calibration service, and reduced uncertainties in field standard test measure volumes determined at this facility.

API publications may be used by anyone desiring to do so. Every effort has been made by the Institute to assure the accuracy and reliability of the data contained in them; however, the Institute makes no representation, warranty, or guarantee in connection with this publication and hereby expressly disclaims any liability or responsibility for loss or damage resulting from its use or for the violation of any federal, state, or municipal regulation with which this publication may conflict.

Suggested revisions are invited and should be submitted to the director of the Measurement Coordination Department, American Petroleum Institute, 1220 L Street, N.W., Washington, D.C. 20005.


CONTENTS

0 INTRODUCTION
1 SCOPE
2 REFERENCED PUBLICATIONS
3 DEFINITIONS
4 EQUIPMENT
   4.1 Materials and Fabrication
   4.2 Gauge Glass
   4.3 Drain Lines
   4.4 Levels and Leveling
   4.5 Neck
   4.6 Scale Plate and Graduations
   4.7 Case
   4.8 Outlet Valve
   4.9 Nameplate
   4.10 Assembly
5 CALIBRATION
   5.1 General
   5.2 Calibrated Volume
   5.3 Calibration Frequency
   5.4 Number of Calibration Runs
   5.5 Test Measure Control Charts
   5.6 Seals
   5.7 Disputes
6 CALIBRATION METHODS
   6.1 Gravimetric Calibration of Neck Scale Test Measures
   6.2 Gravimetric Calibration of Slicker-Plate Test Measures
   6.3 Volumetric Calibration of Neck Scale and Slicker-Plate Test Measures
7 INSPECTION
   7.1 General
   7.2 Documentation
   7.3 Visual Inspection
   7.4 Cleaning Procedures
   7.5 Integrity
8 OPERATION AND USE
   8.1 Field Use of Volumetric Test Measures
   8.2 Normal Test Measure Operation
   8.3 Irregular Test Measure Operations


APPENDIX A ACCURACY REQUIREMENTS FOR VOLUMETRIC TEST MEASURES
APPENDIX B CALCULATION OF UNCERTAINTY OF A FIELD STANDARD TEST MEASURE
APPENDIX C TEST MEASURE CONTROL CHARTS
APPENDIX D LABORATORY WEIGHTS AND MASS STANDARDS
APPENDIX E NIST CERTIFICATES OF CALIBRATION FOR FIELD STANDARD TEST MEASURES (EXAMPLES)
APPENDIX F WATER DENSITY EQUATIONS

Figures
1 Various Types of Standard Test Measures (I)
2 Various Types of Standard Test Measures (II)
3 Field Standard Test Measure—Invertible Type
4 Field Standard Test Measure—Bottom Drain Type
5 Gauge Glass and Scale Assembly
6 Field Standard Test Measure Control Chart
7 Levels of the Petroleum Measurement Hierarchy

Tables
1 Capillary Rise in Gauge Glass Tubes with Varying Water Quality
2 Effect of Out-Of-Level of Normal Sensitivity Test Measures on Gauge Glass Level
3 Effect of Out-Of-Level of Normal Sensitivity Test Measures on Measure Volumes
4 Scale Graduations for Normal-Sensitivity Test Measures
5 Scale Graduations for High-Sensitivity Test Measures
6 Scale Meniscus Errors Due to Contaminated Water
7 Range Limits versus Number of Calibration Runs
8 Volume Errors Due to Test Measure Internal Deposits
A1 Hierarchy of Measurements for the Metered Volumes of Petroleum
A2 Uncertainty Requirements for a Petroleum Measurement Hierarchy Using a Single Custody Transfer Meter and the Waterdraw Calibration of a Displacement Prover
A3 Uncertainty Limits for Calibration of Normal Sensitivity Test Measures as Used in the Calibration of Conventional Displacement Provers
A4 Uncertainty Limits for the Calibration of High Sensitivity Test Measures as Used in the Calibration of Small Volume Provers
A5 Uncertainty Limits for the Calibration of Test Measures Used for Displacement Provers
A6 Hierarchy of Petroleum Metered Volume Measurements Including Calibration of Meter Provers by a Master Meter
A7 Uncertainty Requirements for a Petroleum Measurement Hierarchy Including a Single Custody Transfer Meter With Master Meter Prover Calibration


A8 Total Hierarchy of Measurements of Metered Volumes of Petroleum Showing International Standards, Calibration of Test Measures Other Than by NIST, and Meter Prover Calibration by a Master Meter
B1 Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 100-gallon (0.379 m³) Neck Scale Test Measure
B2 Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 50-gallon (0.189 m³) Neck Scale Test Measure
B3 Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 5-gallon (0.0189 m³) Neck Scale Test Measure
B4 Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 5-gallon (0.0189 m³) Slicker Plate Test Measure
C1 Statistical Analysis Table of the Calibration History of Test Measure #123456
D1 Maximum and Minimum Limits for Density in g/cm³
D2 Application of Standard Mass Weights


Manual of Petroleum Measurement Standards—Proving Systems
Field Standard Test Measures

0 Introduction

A field standard test measure is a vessel fabricated to meet specific design criteria and calibrated by an official agency such as the National Institute of Standards and Technology (NIST). Its primary purpose is to provide a standardized volume for the calibration of displacement and tank provers by the waterdraw method.

1 Scope

This chapter details all the essential elements of field standard test measures by providing descriptions, construction details, calibration procedures, operation, inspection, and field use requirements. Accuracy and uncertainty requirements are also discussed. The normal volume range for test measures is 1 to 1,000 gallons, approximately 4 to 4,000 liters. The information contained in this chapter is limited to atmospheric, graduated neck type, metal, volumetric field standard test measures. The volume is determined between an upper neck scale zero and either a solid bottom or a tapered bottom with a drain line and shut-off valve, depending on the size of the test measure. Bottom-neck scale test measures and prover tanks are not addressed in this document.

Test measures are classified as:
a. "To deliver" volumetric test measures.
b. "To contain" volumetric test measures.

Guidelines are provided in this document for the design, manufacture, calibration, and use of new and/or replacement test measures and are not intended to make any existing test measures obsolete.

2 Referenced Publications

The following standards, codes, and specifications contributed to this standard and have been used as an appropriate resource for reference material:

API
   Manual of Petroleum Measurement Standards
      Chapter 1—Vocabulary
      Chapter 4.2—Conventional Pipe Provers
      Chapter 4.3—Small Volume Provers
      Chapter 4.4—Tank Provers
      Chapter 4.8—Guide to the Operation of Proving Systems
      Chapter 7—Temperature Determination
      Chapter 11—Physical Properties Data
      Chapter 12—Calculation of Petroleum Quantities
      Chapter 13—Statistical Aspects of Measuring and Sampling

NIST (National Institute of Standards and Technology, Gaithersburg, Maryland 20899)
   Handbook 105, Specifications and Tolerances for Reference Standards and Field Standard Weights and Measures, Part 3—Specifications and Tolerances for Graduated Neck-Type Volumetric Field Standards

3 Definitions

3.1 borosilicate glass: A glass with a low coefficient of thermal expansion.

3.2 brim test measure: A vessel similar to a field standard test measure, except it has no sight glass and neck scale. It is filled until the liquid just extends to the top of the neck due to surface tension. Any liquid overflow is caught in the reservoir below and drained away and/or measured.

3.3 calibrated volume: Also defined as the Base Measure Volume (BMV); the delivered volume of a field standard test measure, at its reference temperature, between its defined "full" and "empty" levels.

3.4 calibration: A set of operations which establish, under specified conditions, the relationship between the values indicated by a measuring device and the corresponding known values indicated when using a suitable measuring standard.

3.5 cessation of main flow: During the draining of a field standard test measure, the moment when the full discharging water stream "breaks" and becomes a small trickle.

3.6 clingage: The film of liquid that adheres to the inside surface of a field standard test measure after it has been drained and is considered empty.

3.7 field standard test measure: Defined as a volumetric, nonpressurized, round, metal container with a cylindrical neck containing a gauge glass and scale. Designed to stringent specifications, it "contains" or "delivers" an exact volume between a fixed bottom or a bottom shut-off valve and an upper neck scale reading.

3.8 high-resolution type: A field standard test measure designed with a smaller diameter neck, which is used to achieve greater neck volume resolution in reading the water level meniscus. This type of test measure is also characterized by the terms "high-sensitivity" or "high-accuracy."



3.9 measurement: A procedure to determine the value of a physical variable. Statements that best describe the quality of these measurements as used in this standard are defined below:

a. Accuracy. Describes the degree of closeness between measured values and the true value.
b. Error. Describes the difference between a measured value and the true value.
c. Precision. Describes the degree to which all the data within the set of measurement variables is clustered together.
d. Range. Describes the interval between a high value and a low value.
e. Repeatability. Describes the measure of agreement between the results of successive measurements of the same variable, by the same method, with the same instrument and operator, at the same location, over a short time period.
f. Tolerance. Describes the maximum permissible error in a series of measurements.
g. True value. Describes the only correct value of a variable.
h. Uncertainty. Describes the range of deviation between a measured value and the true value, expressed as a percentage. For example, a device with an accuracy of 2% would have an uncertainty of ±2%.

3.10 reference temperature: The temperature at which the test measure is intended "to contain" or "to deliver" its nominal capacity.

3.11 slicker-plate test measure: Described as a vessel similar to a field standard test measure, except it has no sight glass and neck scale. It is filled so that the liquid just extends above the top of the neck, due to surface tension, where the excess is sheared off by sliding a transparent plate across the top of the neck.

3.12 "to contain" volume: A method of characterization of a field standard test measure that determines its contained volume, at reference temperature, when it is filled from a clean, dry, empty condition.

3.13 "to deliver" volume: A method of characterization of a field standard test measure that determines its delivered volume, at reference temperature, when it is emptied from its full condition and drained in accordance with the prescribed draining time.

3.14 traceability: The property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international reference standards, through an unbroken chain of comparisons, all controlled and having stated uncertainties. It should be noted that traceability only exists when scientifically rigorous evidence is collected, on a continuing basis, showing that the measurement is producing documented results for which the total measurement uncertainty is quantified.

4 Equipment

4.1 MATERIALS AND FABRICATION

A vessel used as a field standard test measure shall be constructed of corrosion-resistant stainless steel. All parts of the main body of the test measure shall be made of the same material.

All interior seams shall be filled and ground smooth enough to prevent any entrapment of air, liquid, or foreign material.

Fabrication shall ensure that no pockets, dents, or crevices are present that may entrap air or liquid or impair the proper filling or draining of the standard.

The materials of construction of the field standard test measure shall be thermally stable and not have an unduly high coefficient of thermal expansion, which might render the test measure unsuitable for field use. All applicable physical property data must be reliably documented.

Any horizontal cross-section shall be circular, and the shape of the vessel shall permit complete emptying and draining. Dimensional requirements for U.S. Customary and Metric test measures are shown in Tables 1a and 1b of NIST Handbook 105-3, July 1997, pages 18-19.

Where appropriate, reinforcing bands shall be used to prevent distortion of the measure or field standard when it is full of liquid.

The opening at the top of the neck shall be structurally durable because of the thickness of the metal, or it shall be reinforced.

The top of the neck shall be finished so that the level position of the field standard test measure can be determined by placing a machinist's precision spirit level across it. Proper adjustment of all replacement levels shall be achieved in this manner.

The bottom of the field standard test measure shall be designed to prevent distortion when it is filled with liquid and prevent damage during use.

A field standard test measure in use must be leveled and stand solidly on a surface with its vertical axis perpendicular to that surface.

4.2 GAUGE GLASS

A field standard test measure shall be equipped with a gauge glass mounted on the side of the neck.

This gauge glass shall be made of borosilicate type glass or equivalent, and shall be free from any irregularities or defects that will distort the appearance of the liquid surface. Any gauge glass made of a substitute material must be impervious to petroleum products.

The gauge glass shall be installed to facilitate cleaning and removal. Replacement glass must conform to the original size and bore specified by the manufacturer. The bottom mounting of the gauge glass shall be made leak proof, without the use of cement, by using compressible gaskets or O-rings. Removal


and replacement of the glass shall be accomplished without difficulty and without affecting the calibration of the measure.

Table 1 shows the effect of surface tension capillary rise within the glass tube, when using different size gauge glass tubes, due to variability of water quality. Dirty water is shown for comparison only and is not intended to endorse or recommend the use of dirty or contaminated water in any test application.

Table 1—Capillary Rise in Gauge Glass Tubes with Varying Water Quality

Gauge Glass Tube        Capillary Rise (in.)
Diameter (in.)          Pure Water    Aerated Water    Dirty Water
5/8                     0.025         0.0060           0.016
3/4                     0.015         0.0035           0.012
1                       0.005         0.0011           0.005

4.3 DRAIN LINES

If a drain line extends from the bottom center of a field standard test measure, the downward slope of the line must provide complete and proper drainage. This drain line shall be sized to provide the maximum drainage rate possible consistent with a smooth and controlled drainage, while emptying the test measure with the drain valve fully open and adhering to the prescribed draining time. Minimum drain sizes are described in NIST Handbook 105-3, July 1997, Tables 1a and 1b, pages 18-19.

A test measure drain is described as a gravity discharge line between the bottom of the bottom cone and the shut-off valve. It shall have a downward slope of at least 5° from the horizontal plane. All pipes connected to the test measure drain line, downstream of the shut-off valve, shall be positioned at an elevation to ensure the complete emptying of the test measure.

A test measure that is not in a permanent installation, equipped with a bottom drain, shall have a minimum of three adjustable legs that will enable the test measure to be leveled.

4.4 LEVELS AND LEVELING

The level position of the field standard test measure can be determined by placing a precision machinist's spirit level across the top of the neck of the test measure. This level verification must be determined in two directions, 90° apart, and shall be used to verify the permanently mounted spirit levels.

All field standard test measures shall be equipped with a minimum of two adjustable spirit levels, mounted at right angles to each other, or with equivalent leveling indicators on the upper cone in a protected position on the sides of the vessel. The adjusting screws for these levels shall be provided with a means of wire sealing, and the level shall be equipped with a protective cover. Once set, the levels should be sealed and covered for protection. All seals should be affixed with corrosion-resistant Series 316 stainless steel or equivalent wire.

If a trailer is used for mounting and/or transporting a field standard test measure, then it shall have provision for leveling. Any truck on which a field standard test measure is mounted shall be equipped with leveling jacks, one near each corner of the vehicle.

To achieve a higher precision in the calibration of field standard test measures and in their use, spirit-level precision requirements shall be specified. During each calibration run, operating instructions should require test-measure level verification prior to scale reading after filling. A minimum discrimination of two minutes (2') per division is recommended for all precision test measure levels. Liquid spirit levels vary in sensitivity from 30 minutes (30') per division to 5 seconds per division. Magnification must be used with high-precision spirit levels. The potential error due to an out-of-level test measure can be estimated as follows:

e = x · sin(θ)

where
e = Scale reading error (in.),
x = Distance from the centerline of the test measure to the outside of the gauge tube (in.),
θ = Out-of-level angle uncertainty.

Errors for various test measure sizes because of out-of-level conditions are shown in Tables 2 and 3. The uncertainties in the tables do not include potential errors as a result of level instability of the test measures, or possible structural instability. For the purposes of measurement, errors are also caused by the failure to re-level the test measures prior to reading the scale each time.

Table 2—Effect of Out-Of-Level of Normal Sensitivity Test Measures on Gauge Glass Level

Uncertainty in Scale Reading, ± in.³, versus Level Discrimination (a)

Size of Test Measure (gal.)    ±30 minutes per division    ±10 minutes per division    ±2 minutes per division
1                              0.011                       0.004                       0.0007
5                              0.011                       0.004                       0.0007
10                             0.011                       0.004                       0.0007
25                             0.015                       0.005                       0.0010
50                             0.015                       0.005                       0.0010
100                            0.020                       0.007                       0.0013
200                            0.026                       0.009                       0.0017
500                            0.041                       0.014                       0.0028

(a) Level discrimination is estimated as 1/2 the minimum discrimination or resolution of the spirit level.

Table 3—Effect of Out-Of-Level of Normal Sensitivity Test Measures on Measure Volumes

Uncertainty in Scale Reading, ± % Volume, versus Level Discrimination (a)

Size of Test Measure (gal.)    ±30 minutes per division    ±10 minutes per division    ±2 minutes per division
1                              0.056                       0.020                       0.0036
5                              0.011                       0.004                       0.0007
10                             0.006                       0.002                       0.0004
25                             0.005                       0.002                       0.0003
50                             0.002                       0.001                       0.0002
100                            0.003                       0.001                       0.0002
200                            0.004                       0.001                       0.0003
500                            0.009                       0.003                       0.0006

(a) Level discrimination is estimated as twice the minimum discrimination or resolution of the spirit level.
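To make the scale-reading error relationship concrete, the short sketch below evaluates e = x · sin(θ) for an assumed geometry; the 12 in. offset and the listed level discriminations are illustrative values only, not requirements of this standard.

```python
import math

def scale_reading_error(offset_in: float, angle_arc_minutes: float) -> float:
    """Out-of-level scale reading error e = x * sin(theta).

    offset_in is the distance x from the test measure centerline to the
    outside of the gauge tube (in.); angle_arc_minutes is the out-of-level
    angle uncertainty theta, expressed in minutes of arc.
    """
    theta_rad = math.radians(angle_arc_minutes / 60.0)
    return offset_in * math.sin(theta_rad)

# Assumed example: gauge tube 12 in. from the vertical centerline of the measure.
for minutes in (30.0, 10.0, 2.0):
    print(f"theta = {minutes:>4} min  ->  e = {scale_reading_error(12.0, minutes):.4f} in.")
```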

4.5 NECK

The neck diameter of field standard test measures shall be consistent with their intended applications and required scale resolution. The neck assembly, or the top cone area of the test measure and neck assembly, may be designed to open (swivel), provided that the assembly has a metal-to-metal joint, sufficient clamping points to prevent distortion of the joint, and a means of ensuring that the device is properly assembled, leveled, and leak-free.




4.6 SCALE PLATE AND GRADUATIONS

The scale plate shall be made of corrosion-resistant metal and shall be mounted tangent to the front of or directly behind the gauge glass. In either case, it shall not be more than 1/4 in. (6 mm) from the glass.

If the scale is mounted behind the gauge glass, a shield should be provided to protect the glass and allow for replacement of the gauge glass without difficulty.

If the scale adjustment provides for movement by increments only, the maximum increment shall be 1/4 of the smallest scale division.

Scale numbering on all field standard test measures shall be specified on the scale in milliliters (ml), cubic inches (in.³), or other volume units. The units of measurement should be clearly marked on the scale. To avoid confusion and possible errors in reading, dual numbering on any one scale (e.g., cubic inches and decimal fractions of a gallon) is not permitted. Dual numbering is permitted only if two scale plates are used; in this case, the U.S. customary units scale (in.³) is preferred and located on the left when the test measure is viewed from the front. The metric scale (m³) is recommended to be situated on the right side of the gauge glass. Provisions shall be made allowing either scale to be adjusted individually so that the two zero lines will lie in parallel planes.

The distance between scale graduations shall not be less than 1/16 in. (1.5 mm).

Scales shall be graduated both above and below the zero line. For neck sizes smaller than 17 in. (43 cm) in diameter, every fifth line on the scale shall be considered a major division and shall be longer than the intermediate or subdivision lines. Every major line shall be numbered with the volume to that mark. For neck diameters of 17 in. (43 cm) or larger, every tenth line may be designated a major division line. For small-diameter, high-sensitivity measures that have diameters of 2, 3, or 4 in. (5, 7 1/2, or 10 cm), every fourth or fifth division is a major numbered division depending upon the scale increments.

A sufficient number of scale brackets (a minimum of two) shall hold the scale plate rigid. The brackets shall be mounted on adjusting rods.

An adjusting rod shall be provided with a means for sealing that will prevent movement. The scale plate shall be securely attached to the brackets and provided with a means for sealing. Movement of the adjusting mechanism or scale plate shall not be possible without breaking the seal. All seals should be affixed with corrosion-resistant Series 316 stainless steel or equivalent wire.



The graduation lines, numbers, and other information on the scale plate shall be permanent. Graduation lines shall be of uniform width. The width of the lines shall be no more than 0.025 in. (0.6 mm) or less than 0.015 in. (0.4 mm).

On scale plates mounted to the front of the gauge glass, the major (numbered) lines shall be at least 1/4 in. (6 mm) in length. The intermediate lines shall be at least 1/8 in. (3 mm) in length. The major and intermediate lines shall extend to the edge of the scale plate nearest the gauge glass. The zero line shall extend completely across the plate.

On a scale plate mounted behind the gauge glass, the major (numbered) lines shall be at least 3/4 in. (19 mm) in length. Intermediate lines shall be at least 1/2 in. (13 mm) in length. The zero line shall extend completely across the plate.

Two commercially available classes of test measures are designated normal sensitivity and high sensitivity. Uncertainty in the calibration is different for each class of the same size test measure. This is due to the improved scale sensitivity and the repeatability of the smaller diameter neck in the high-sensitivity test measures.

Scale graduations for normal-sensitivity test measures are listed in Table 4. Scale graduations for high-sensitivity test measures are listed in Table 5. Scale reading errors due to using contaminated water in test measures are shown in Table 6.

Table 4—Scale Graduations for Normal-Sensitivity Test Measures

Nominal        Neck Diameter    Minimum Number of in.³    Minimum Discrimination         Maximum Scale
Gallon Size    (in.)            Above & Below 0           Between Graduations (in.³)     Spacing (in.)
1              3 7/8            15                        1                              0.085
5              3 7/8            15                        1                              0.085
10             3 7/8            30                        1                              0.085
25             5                60                        2                              0.102
50             5                120                       2                              0.102
100            7                250                       5                              0.130
200            10               500                       10                             0.127
500            17               1250                      25                             0.110

Table 5—Scale Graduations for High-Sensitivity Test Measures

Nominal        Neck Diameter    Nominal Volume of in.³    Minimum Discrimination         Maximum Scale
Gallon Size    (in.)            Above & Below 0           Between Graduations (in.³)     Spacing (in.)
1              2                20                        1/4                            0.080
5              2                20                        1/4                            0.080
10             2                20                        1/4                            0.080
25             3                35                        1/2                            0.071
50             3                50                        1/2                            0.071
50             4                50                        1                              0.080
100            4                50                        1                              0.080

Table 6—Scale Meniscus Errors Due to Contaminated Water

                        Error, % Volume
Nominal Gallon Size     Air in Water    Dirty Water
1                       -0.170          -0.108
5                       -0.034          -0.022
10                      -0.017          -0.011
25                      -0.006          -0.003
50                      -0.003          -0.0015
100                     -0.003          -0.0015
200                     -0.003          -0.0015
500                     -0.004          -0.002

4.7 CASE

A field standard test measure is a precision instrument and must be handled with great care to avoid damage that may alter its volume. Adequate protection, such as a strong padded case, shall be provided for storing and transporting the test measure.

4.8 OUTLET VALVE

It must be noted that the outlet (drain) valve of a field standard test measure is a critical component of the calibrated volume, which if modified or replaced will necessitate the recalibration of the test measure. Test measure drain valves should be sealed, painted, or by some other method indicate whether the valve has been changed, modified, replaced, or repaired in any way since the last calibration. If seals are used, they should be affixed with corrosion-resistant Series 316 stainless steel or equivalent wire and provide positive integrity that the drain valve has not been replaced.


Many test measures have drain valves that are simply screwed into the drain line. The volume is altered if this valve is turned to a different number of threads, either more or less. The importance of this valve to the calibrated volume cannot be overemphasized, and properly sealing screwed drain valves should be an absolute priority for volumetric test measures. Alternatively, a metal-to-metal flange arrangement between the drain line and the drain valve is the preferred method of drain valve connection.

This drain valve shall be a quick-acting, full-opening valve, open-ended for visual inspection, or shall have a visual-inspection device immediately downstream of the valve to detect valve failure. This valve must be leak-free at all times.

4.9 NAMEPLATE

Each field standard test measure shall bear in a conspicuous place the name of the manufacturer, the nominal volume, and a serial or identification number. The material from which the standard is constructed shall be shown, together with its cubical coefficient of thermal expansion per °F (or °C) for that material.

4.10 ASSEMBLY

All parts of the assembly of a field standard test measure, and all piping and valves that affect the volume of the prover, shall be fully assembled by the manufacturer or supplier.

5 Calibration

5.1 GENERAL

The National Institute of Standards and Technology (NIST) is the calibrating agency of choice in the United States for field standard test measures used to calibrate meter provers.

Specifications for test measures shall be in accordance with this standard and the latest edition of NIST Standard Handbook 105-3. Calibrations shall be made using clean potable water as the calibrating liquid. Certified laboratory graduates may be used for measuring partial volumes of the field standard test measure. The actual capacity of the test measure, as shown on the laboratory report or calibration certificate, shall be used as the official capacity rather than the nominal capacity of the measure. Whenever possible, calibrations of the test measure shall be maintained at an uncertainty of ±0.01 percent or better.

Water-density data provided in API MPMS Chapter 11 should be compatible with the density used by the calibrating agency. See Appendix F, Water Density Equations.

5.2 CALIBRATED VOLUME

Test measures may be calibrated either "to contain" or "to deliver." A field standard test measure can be designed and built "to contain" a precise liquid volume when filled from a clean, empty, and dry condition. Test measures calibrated "to contain" are not used in prover calibrations, because the "empty" condition means empty, clean, and dry before every use, usually an impractical field operations requirement.

Normally, test measure volumes are used in the "to deliver" mode. Preparing the test measure for calibration use requires that it be wetted prior to use. It should be filled with water to its zero mark, leveled, and then drained exactly for the prescribed draining time as given on the certificate of calibration. The test measure is then returned to an upright position or its drain valve is closed, leaving it with a controlled and repeatable amount of clingage (water) inside prior to use. To ensure that this clingage quantity is repeatable, the test measure must be completely filled, and then emptied, in strict compliance with the operating procedures and draining times that are specified on its calibration certificate. Just wetting the interior of the test measure prior to use is completely unacceptable.

This standard dictates that water shall be used as the calibrating liquid because of its stability, low thermal coefficient of expansion, and high heat capacity.

Instructions sent to NIST for the calibration of a field standard test measure should clearly state that it is to be calibrated as a "to deliver" volume, and indicate whether this calibration shall be performed by the gravimetric or volumetric method. If the volumetric method of calibration is chosen, then the calibrated volume of the test measure, and all measurements subsequently made using it, will have a larger level of uncertainty than if calibrated by the gravimetric method. This must be acceptable to all parties using this test measure.

The prescribed draining time for all test measures calibrated by NIST and drained by inverting is currently 10 seconds after the cessation of main flow. Similarly, the draining time for all NIST-calibrated test measures having bottom drains is currently 30 seconds after the cessation of main flow. The actual prescribed draining time used, however, shall be in accordance with the Certificate of Calibration for the particular test measure in use.

5.3 CALIBRATION FREQUENCY

Field standard test measures calibrated by NIST shall be recalibrated at the following regularly prescribed intervals of use:

a. Maximum certification intervals shall be as follows:
   1. All new test measures shall be calibrated prior to use (initial certification).
   2. All existing test measures, in regular use, shall be recalibrated every 3 years.


   3. Test measures stored and infrequently used may have their recalibration suspended. Infrequent use has allowed some flexibility between successive recalibrations. In this situation, use of this test measure shall be allowed if it has a valid certificate of calibration current within the past 5 years.

Control charts shall be maintained on every test measure (see Appendix C).

b. Test measures shall be calibrated any time there is evidence of damage, distortion, repairs, alterations, maintenance to the test measure, or replacement of the drain valve that could affect its volume or its use.
c. If appropriate, it may be necessary to calibrate test measures after change of ownership. This calibration may or may not be necessary. It shall depend on the transfer circumstances, i.e., the condition of the test measure at the time of transfer and its previously known calibration history. All these factors shall be considered in deciding whether a calibration is necessary and should be determined after a careful check of all these parameters and in consultation with all interested parties.

The calibrated volume of all test measures shall be reported to five or more significant digits.

All field standard test measures for prover calibrations in the United States shall have a Certificate of Calibration, issued by NIST, that is current within the last 3 years. The Certificate of Calibration provided by NIST for the test measure shall be supplied to the owner, together with all the calibration data obtained during the calibration procedure.

On some occasions, the same test measure may be calibrated by two or more different national calibration agencies; e.g., NIST and the Canadian Standards Agency (CSA) may both calibrate the same test measure for use in different countries. Similarly, on occasion the same test measure can be calibrated in different units of measurement. For example, the test measure may be calibrated in both in.³ and ml. This will require that two different certificates of calibration, developed either by the same calibrating agency or by two different agencies, be issued. Units of measurement, test measure volumes, and required drain times must all be followed exactly for the specific certificate of calibration being used. No interchanging of data or requirements between two different certificates of calibration for the same test measure is permitted.

5.4 NUMBER OF CALIBRATION RUNS

When calibrating a test measure, a sufficient number of calibration runs shall be conducted to ensure that the random uncertainty of the calibration measurements is no greater than 1/2, and preferably 1/4, of the overall uncertainty specified for the calibration.

For example, if the uncertainty limit for the calibration of a test measure is ±0.01%, the random uncertainty due to variations of the calibration runs should not exceed ±0.0025% to ±0.005%. The range limit for a specific number of calibration runs to meet a prescribed uncertainty limit of the average can be calculated as follows, using Equation 20 from API MPMS Chapter 13.2, November 1994:

W(X) = a(X) / Z(%, n)

where
W(X) = Range limit for a set of calibration runs (%),
a(X) = Prescribed uncertainty limit for the set of calibration runs (±%),
Z(%, n) = Range to estimated uncertainty conversion factor for averages (see Table II in API MPMS Chapter 13, Section 2).

If a 95% confidence level is required for data analyses, the range limits versus number of calibration runs for an uncertainty limit of the average of ±0.005% are shown below:

Table 7—Range Limits versus Number of Calibration Runs

Number of Calibration Runs    Range Limits, W(X), % Average
2                             0.00063
3                             0.0034
4                             0.0064
5                             0.0093

For procedural uncertainty limits other than ±0.005%, the range limits can be modified from the above values as follows:

modified range limit = W(X) × a(X) / 0.005%

where
W(X) = the value from Table 7,
a(X) = the prescribed uncertainty limit for the calibration (±%).
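A minimal sketch of the scaling rule above, assuming the Table 7 values (which apply to a ±0.005% uncertainty limit of the average at a 95% confidence level); the ±0.01% target in the example is an assumed value only.

```python
# Range limits W(X) from Table 7, valid for a prescribed uncertainty limit
# of the average of +/-0.005% at a 95% confidence level.
TABLE_7_RANGE_LIMITS = {2: 0.00063, 3: 0.0034, 4: 0.0064, 5: 0.0093}

def range_limit(number_of_runs: int, prescribed_limit_pct: float = 0.005) -> float:
    """Scale the tabulated range limit to another prescribed uncertainty limit,
    per: modified range limit = W(X) * a(X) / 0.005%."""
    return TABLE_7_RANGE_LIMITS[number_of_runs] * (prescribed_limit_pct / 0.005)

# Assumed example: five calibration runs against a +/-0.01% uncertainty limit.
print(f"Allowable range of the runs: {range_limit(5, 0.01):.4f}% of the average")
```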

5.5 TEST MEASURE CONTROL CHARTS

Every test measure shall have a data record prepared and maintained by NIST, containing all the relevant information pertaining to the history of the test measure.



The relevant information should contain:

a. The identity of the test measure by owner and the owner's number.
b. The maker's name, identification number, nominal volume, scale divisions, coefficient of expansion of the material of construction, and dates and details of all types of cleaning performed.
c. Calibration dates and corresponding seal numbers, together with "to contain" and "to deliver" volumes.
d. Uncertainty history.

The calibration history of all test measures shall be documented and recorded, and a control chart developed for each test measure. See Figure 6 and the discussion in Appendix C.

5.6 SEALS

Calibration and subsequent certification must include the affixing of a tamper-proof seal on the adjustable scale and drain valve by the calibrating authority.

5.7 DISPUTES

In the case of a dispute between interested parties over the accuracy of a test measure, the disputed test measure shall be submitted to NIST for final judgment of its accuracy.

6 Calibration Methods

Volumetric and gravimetric calibration methods for test measures are described in the following sections. Uncertainties associated with the volumes determined by both methods are discussed in Appendix B.

6.1 GRAVIMETRIC CALIBRATION OF NECK SCALE TEST MEASURES

a. Inspect the test measure for damage, rust, and internal cleanliness.
b. Exercise the balance (mass comparator) several times to ensure smooth operation.
c. Test the balance with the appropriate test weights to assure the balance response is within tolerance. If out of tolerance, recalibrate the balance in mass units prior to proceeding with the calibration of the test measure.
d. Measure and record the room air temperature, the atmospheric pressure, and the relative humidity.

e. Calculate the air density, ρ(Tair), in kg/m³, using the formula:

ρ(Tair) = [464.56·P - U·(4.47 - 1.8504·Tair + 0.085594·Tair²)] / [(Tair + 273.16) × 10³]

where, from step d:
P = The atmospheric pressure (mm of Hg),
U = The relative humidity (%),
Tair = The air temperature (°C).

f. Level the test measure on the platform of the mass comparator.
g. Weigh and record the empty mass of the clean and dry test measure to be calibrated.
h. Fill the test measure with distilled water up to the region of the zero reference mark on the neck scale. Inspect the test measure for leaks.
i. Measure and record the temperature of the distilled water (Tw) in the test measure with a calibrated temperature probe.
j. Read the water meniscus in the glass tube of the neck scale, and record the deviation of the reading from the zero reference mark in volume units.
k. Weigh and record the mass of the filled test measure.
l. Drain the test measure for the prescribed draining period. At the end of the prescribed draining time, close the drain valve.
m. Weigh and record the mass of the drained test measure.
n. Calibrate the neck scale in volume units:
   1. Refill the test measure to a point approximately 50% below the zero reference mark on the test measure's neck scale.
   2. Read and record the indicated value of the neck scale (1).
   3. Add a known volume of distilled water, Vn, using an appropriate device.
   4. Read and record the indicated value of the neck scale (2).

Calculate the neck-scale correction factor, k:

k = Vn / (h1 - h2)

where
Vn = The volume of distilled water added,
h1 = The indicated value of the scale reading (2),
h2 = The indicated value of the scale reading (1).

o. Calculate the distilled water density, ρ(Tw), from the Kell equation (see Appendix F):

ρ(Tw) = (a + b·Tw + c·Tw² + d·Tw³ + e·Tw⁴ + f·Tw⁵) / (1 + g·Tw)   kg/m³

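As a compact illustration of steps e and n above, the sketch below evaluates the air-density formula and the neck-scale correction factor k; the pressure, humidity, temperature, and neck-scale readings are assumed example values, and the water density itself would come from the Kell equation of Appendix F (not reproduced here).

```python
def air_density(p_mm_hg: float, rh_percent: float, t_air_c: float) -> float:
    """Air density in kg/m^3 from step e:
    rho(Tair) = [464.56*P - U*(4.47 - 1.8504*T + 0.085594*T^2)] / [(T + 273.16) * 10^3]."""
    humidity_term = rh_percent * (4.47 - 1.8504 * t_air_c + 0.085594 * t_air_c ** 2)
    return (464.56 * p_mm_hg - humidity_term) / ((t_air_c + 273.16) * 1.0e3)

def neck_scale_factor(v_added: float, h_after: float, h_before: float) -> float:
    """Neck-scale correction factor k = Vn / (h1 - h2) from step n."""
    return v_added / (h_after - h_before)

# Assumed example readings: 752 mm Hg, 45% relative humidity, 22.5 deg C air;
# adding 100 ml of distilled water raises the neck scale from -55 ml to +45 ml.
print(f"rho_air = {air_density(752.0, 45.0, 22.5):.4f} kg/m^3")
print(f"k       = {neck_scale_factor(100.0, 45.0, -55.0):.3f}")
```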


p. Calculate the contained volume of the test measure at the temperature of the distilled water using the equation:

Vcontain(Tw) = (mf - me) / [ρ(Tw) - ρ(Tair)] + h   (m³)

where
mf = The mass of the test measure filled with distilled water,
me = The mass of the clean, empty, and dry test measure,
ρ(Tw) = The density of the distilled water,
ρ(Tair) = The density of the air,
h = The deviation of the location of the meniscus from zero with the appropriate algebraic sign applied.

q. Calculate the delivered volume of the test measure at the temperature of the water using the equation:

Vdelivered(Tw) = (mf - md) / [ρ(Tw) - ρ(Tair)] + h   (m³)

where
md = The mass of the drained test measure.

r. Calculate the volume of the test measure at the desired reference temperature using the equation:

Vcal(Tb) = Vcal(T) · [1 - βt·(T - Tb)]   (m³)

where
Vcal(T) = The volume of the test measure calibrated either to contain or to deliver,
T = The temperature of the distilled water,
Tb = The standard reference temperature (60°F or 15°C),
βt = The volume coefficient of thermal expansion for the test measure.
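The sketch below strings steps p through r together for a hypothetical weighing; all masses, densities, temperatures, and the expansion coefficient are assumed example values (the expansion coefficient in particular stands in for the value given on the test measure's nameplate).

```python
def contained_volume(m_full: float, m_empty: float, rho_water: float,
                     rho_air: float, h_offset: float) -> float:
    """Step p: V_contain(Tw) = (mf - me) / (rho_w - rho_air) + h, in m^3."""
    return (m_full - m_empty) / (rho_water - rho_air) + h_offset

def delivered_volume(m_full: float, m_drained: float, rho_water: float,
                     rho_air: float, h_offset: float) -> float:
    """Step q: V_delivered(Tw) = (mf - md) / (rho_w - rho_air) + h, in m^3."""
    return (m_full - m_drained) / (rho_water - rho_air) + h_offset

def volume_at_reference(v_cal: float, t_water: float, t_base: float,
                        beta_t: float) -> float:
    """Step r: V_cal(Tb) = V_cal(T) * [1 - beta_t * (T - Tb)]."""
    return v_cal * (1.0 - beta_t * (t_water - t_base))

# Assumed example: masses in kg for a nominal 5-gallon measure, water at 22 deg C
# (997.77 kg/m^3), air at 1.19 kg/m^3, reference temperature 15 deg C, and a
# cubical expansion coefficient of roughly 4.8e-5 per deg C for stainless steel.
v_con = contained_volume(m_full=29.021, m_empty=10.138,
                         rho_water=997.77, rho_air=1.19, h_offset=0.0)
v_del = delivered_volume(m_full=29.021, m_drained=10.150,
                         rho_water=997.77, rho_air=1.19, h_offset=0.0)
print(f"V_contained at 15 deg C = {volume_at_reference(v_con, 22.0, 15.0, 4.8e-5):.6f} m^3")
print(f"V_delivered at 15 deg C = {volume_at_reference(v_del, 22.0, 15.0, 4.8e-5):.6f} m^3")
```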

6.2 GRAVIMETRIC CALIBRATION OF SLICKER-PLATE TEST MEASURES

The calibration of a slicker-plate test measure by the gravimetric method differs from that of the neck-scale type only in the way the measure is filled with distilled water, step h above. The slicker-plate test measure is filled such that the water extends above the top of the neck of the measure due to surface tension, and the excess is sheared off by sliding a transparent plate across the top of the neck, making

h ≡ δh ≡ 0   (≡ is defined as "absolutely equivalent to")

where
h = The deviation of the location of the meniscus from zero with the appropriate sign applied,
δh = One tenth of the least division of the neck scale, read with a magnifier fitted with an anti-parallax device.

Transparency of the slicker plate enables visual inspection for air voids under the plate, which must not be allowed to occur.

The equations for the calculation of uncertainty in the calibration of neck-scale test measures (Appendix A) are also applicable to slicker-plate test measures with h ≡ δh ≡ 0.

Table B4 in Appendix B lists the uncertainty for a typical 5-gallon slicker-plate test measure. The uncertainty determined for a 5-gallon neck-scale test measure is normally more than 4 times that of a 5-gallon slicker-plate test measure. This is due to the uncertainty in the neck scale reading and is shown in detail in Tables B3 and B4 in Appendix B.

6.3 VOLUMETRIC CALIBRATION OF NECK SCALE AND SLICKER-PLATE TEST MEASURES

Calibration is achieved by the volumetric transfer of a known volume of water from a standard test measure into the test measure of unknown volume (the test measure being calibrated), and calculating either the contained or delivered volume of the unknown test measure.

Ideally, it is preferred to have a platform elevator and a large storage tank for distilled water. The calibration can then be performed by the following method:

a. Select a standard test measure, designated (U), of known delivery volume at a specified reference temperature, scale graduation, and known cubical coefficient of thermal expansion, α. The standard test measure can be placed on an elevator, leveled, and raised to an appropriate height.
b. Place the test measure to be calibrated, designated (Z), below the standard test measure and level.
c. If the test measure is to be calibrated "to contain," then its interior must be dry. If the test measure is to be calibrated "to deliver," then both the standard test measure and the measure to be calibrated must be wet. They are wetted by filling the standard test measure and then draining it into the measure to be calibrated with a 30-second draining time of the standard test measure. Then drain the test measure under calibration in the same way. This process is only to wet the surfaces and define the amount of clingage.



d. Fill the standard test measure, U, to any point on the neck scale, LA, with distilled water from the storage. Record the water temperature, TA. Read the meniscus scale and record.
e. Drain the standard test measure into the test measure to be calibrated, ensuring no loss of water. Close the drain on the standard test measure after draining for the prescribed time.
f. Read the meniscus scale reading in the test measure being calibrated. Record the neck scale reading, Lq, and the water temperature, Tq.
g. In situations where the standard test measure is smaller than the test measure being calibrated, steps c through f will have to be repeated the appropriate number of times, each time draining the standard test measure for the specified time.
h. After one complete calibration or "run" is completed, additional runs may be performed.
i. A neck calibration as described in "Gravimetric Calibration" is performed. On test measures having dual scale plates, denote whether one scale plate or both scale plates have been calibrated.

Field standard test measure volumes are always referenced to standard temperature, i.e., either 60°F or 15°C.

For one transfer of the standard test measure, U, to calibrate a test measure, Z:

M/ρA = U60 [1 + α (TA - Tb)]

M/ρq = Z60 [1 + β (Tq - Tb)]

where

ρA = Density of the water in the standard test measure,

ρq = Density of the water in the test measure to be calibrated,

M = The same mass of water transferred from the standard test measure U60 into the test measure to be calibrated Z60,

α = Cubical coefficient of expansion/°F of the standard test measure, U,

β = Cubical coefficient of expansion/°F of the test measure to be calibrated, Z,

Tb = Reference temperature, either 60°F or 15°C.

Since the two masses of water are the same:

ρA U60 [1 + α (TA - Tb)] = ρq Z60 [1 + β (Tq - Tb)]

This expression can be solved for Z60, which is the volume required:

Z60 = ρA U60 [1 + α (TA - Tb)] / (ρq [1 + β (Tq - Tb)])

This can be either a "to contain" or a "to deliver" volume.

The standard test measure, U60, has a known volume at the zero graduation on the neck scale, Lo. The difference in volume, ΔA, between Lo and LA may be calculated by knowledge of the neck calibration constant. The numerical value of ΔA is included in the calculation of Z60:

Z60 = ρA {U60 [1 + α (TA - Tb)] + ΔA} / (ρq [1 + β (Tq - Tb)])
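A minimal sketch, not part of the standard, of this transfer calculation is shown below. All input values are hypothetical; the 23099.24 in.3 base volume and 0.0000265/°F expansion coefficient are taken from the example certificate in Appendix E, and only the ratio of the two water densities matters.

```python
# Minimal sketch, not part of the standard: one volumetric transfer from the
# standard test measure U into the measure Z, solved for Z60 as in the
# equation above. Input values are hypothetical; only the density ratio
# rho_a / rho_q matters, so any consistent density units may be used.

def z60_from_transfer(u60, alpha, t_a, rho_a, delta_a, beta, t_q, rho_q, t_b=60.0):
    """Z60 = rho_A * {U60 * [1 + alpha*(TA - Tb)] + delta_A}
             / (rho_q * [1 + beta*(Tq - Tb)])"""
    numerator = rho_a * (u60 * (1.0 + alpha * (t_a - t_b)) + delta_a)
    denominator = rho_q * (1.0 + beta * (t_q - t_b))
    return numerator / denominator

# Hypothetical transfer of a 100-gallon standard (23099.24 in.3 at 60 F,
# alpha = 0.0000265 per deg F) filled 12 in.3 above its zero mark:
z60 = z60_from_transfer(u60=23099.24, alpha=0.0000265, t_a=72.0, rho_a=0.99780,
                        delta_a=12.0, beta=0.0000265, t_q=72.5, rho_q=0.99775)
print(f"Z60 = {z60:.2f} in.3")
```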

7 Inspection

7.1 GENERAL

Measurement activities often include the inspection or examination of test measures. Inspections or examinations are most often required when a new test measure is received from the manufacturer.

7.2 DOCUMENTATION

Documentation shall be available to all persons who have custody transfer interests in the test measure. This paperwork shall include the manufacturer's identification of the construction material, together with a statement of the primary calibration. All subsequent records of calibrations, along with the control charts, shall be maintained. Detail drawings, interior-coating identification (if applicable), gauge-glass sizes, and all other relevant documents shall be retained on file for examination when required.

7.3 VISUAL INSPECTION

A visual inspection of any field standard test measure shall be made before each use to ascertain that the capacity has not been altered by dents, internal corrosion, or surface deposits. In addition, inspect the test measure carefully to make sure it is free of rust, broken seals, broken gauge glasses or scales, broken or missing levels, leaks, or defective drain valves. The effects of internal deposits on the calibrated volume of a test measure are shown in Table 8.

Visual inspection shall include the following:

a. An examination of all surfaces that affect volume to ensure that they are free from dents. Any minor dents that are considered acceptable must be identified on the certificate of calibration.



b. A verification that noninvertible test measures have a rigid, sloping drain line. The drain line must be equipped with a suitable drain valve and provide the capability to detect leaks.

c. A verification that noninvertible test measures are equipped with two adjustable spirit levels mounted at right angles to each other on the upper cone, or equivalent equipment. The levels must be of a shielded style and of high quality, or provided with a protective cover, and have adjusting screws to permit leveling. Adjusting screws should allow a lead-and-wire seal to be installed by the calibrating authority.

d. An examination of the surface finish to determine that it is clean and free from mill scale, grease, or other contaminants. If the exterior surface is painted, the paint coating must be reasonably free from scratches and corrosion damage.

e. An examination of welding quality to ensure that smooth, full-penetration welds are provided.

f. If NIST Handbook 105-3 is the specification used, verification must indicate that each test measure bears in a conspicuous place the name or trademark of the manufacturer, the nominal volume (in gallons, cubic inches, liters, or other units), and a serial or other identification number. The material from which the standard is constructed shall be identified, together with the cubical coefficient of thermal expansion per °F or °C for that material.

Interior joints and seams shall be smooth and uniform. Surfaces, including joints and seams, must be clean and free from grease, dirt, or oil film. Surfaces must be smooth and free from rust and corrosion. Potential air or water traps, resulting from either design or damage, are not permitted.

Prolonged, and sometimes even isolated, use of contaminated water in a test measure will cause corrosive damage, pitting, and/or deposits that adhere to the interior surfaces. The interior of test measures should never be exposed to salt water or any type of water with high dissolved solids.

If coating material is used inside test measures, then it must be uniformly applied to the surfaces and be free from any voids or bubbles. Any coating material used, such as epoxy or phenolic, must be resistant to the effects of alcohol, acetone, benzene, petroleum products, and water.

Noninvertible test measures must be equipped with a fixed anti-swirl plate to eliminate the formation of a vortex during draining.

The neck on a test measure shall be uniformly cylindrical in length and regular in diameter.

The scale must be of corrosion-resistant metal, firm, secure, and easily adjustable. A provision must exist for affixing a lead-and-wire seal to provide a means for detecting unauthorized adjustment. The scale divisions or graduations must be linear. The scale length must be appropriate to the test measure. Applicable volume units (cubic inches, milliliters, gallons, liters, or others) must be clearly indicated, and scale markings must be legible.

The gauge glass must be clean and clear after wetting (that is, no droplets should be present), and it must be capable of being removed, cleaned, and replaced.

The top surface of the neck of a test measure must be ground, machined, or smoothly formed to serve as a level benchmark. This surface shall be the primary level indication in two axial directions.

Any test measure that is normally transported to various locations shall be protected. A case, shipping container, or other means that sufficiently protects the test measure against dents and/or scale damage during storage or transportation shall be used. Test measures mounted and moved on trailers shall be secured to prevent movement or damage during travel.

7.4 CLEANING PROCEDURES

All test measures shall be inspected and cleaned prior to calibration, unless clean water was used as the calibration fluid and the test measure was cleaned, dried, sealed, and put into storage at the completion of the previous calibration.

Prior to any cleaning of a test measure, it should be examined for any signs of damage as described in the inspection section. Normal cleaning of the interior of the test measure involves scrubbing with a biodegradable detergent and water. However, if the interior of the test measure contains oil residue, then it may be necessary to use solvents for cleaning prior to cleaning with detergents.

In the event a test measure is heavily contaminated with oil residues, the additional step of steam cleaning the interior of the test measure prior to any solvent cleaning or detergent washing will be required.

After the above cleaning, each test measure should be filled with water and allowed to stand for several minutes. During this time, a careful examination for leaks can be made. Once this inspection is made, the test measure should be rinsed with distilled water, then rinsed with alcohol, and allowed to dry.

Smaller test measures, without leveling feet, whose level is determined by the plane of their base, should sit on a level bedplate and be checked for firm seating. A vessel that rocks will not give a true meniscus reading during its calibration.

Table 8—Volume Errors Due to Test Measure Internal Deposits

Test Measure Size in Gallons    Error Due to a 0.0001-inch Internal Deposit, Minus (-) Percentage of Total Volume
1                               0.0094
5                               0.0053
10                              0.0042
25                              0.0032
50                              0.0023
100                             0.0018
200                             0.0013
500                             0.0012


7.5 INTEGRITY

In addition to the initial testing a test measure undergoes when it is new, it should also be checked periodically to verify its leak integrity. The following systematic procedure is recommended:

a. The test measure should be filled with warm-to-hot water, 100–140°F (38–60°C), and allowed to stand for 30–60 minutes, making certain that any thermometer installed will not be damaged by exceeding its range. All seams, joints, pipe fittings, and the gauge glass, along with the overall test measure, should be checked for leaks. The drain valve should be checked for leaks and air entrapment. The valve should be opened and closed several times to verify positive sealing of the valve. An optional procedure involves taking and recording a level reading prior to cracking the drain valve open, drawing off approximately 1 gallon of water into a wetted test measure, and returning it to the prover. If the level in the test measure is lower, either the piping ahead of the valve contained air or some part of the valve body cavity that was previously filled with air is now filled with water.
b. The test measure should be drained to observe the effectiveness of the anti-swirl plate while the fluid is draining.
c. The test measure should be filled with the calibrating medium, and the checks listed above should be repeated.
d. The thermometers used with the test measure should be routinely subjected to a calibration or a comparison with a standard reference thermometer to verify accuracy.

8 Operation and Use

The primary use of field standard test measures is to determine the volume of a meter prover when using the waterdraw method of calibration. Test measures are constructed of stainless steel in a multitude of sizes, ranging from 1 gallon all the way up to 1,500 gallons, with many different intermediate sizes. In fact, a test measure may be constructed in any convenient size, and frequently high-sensitivity test measures are built to accommodate the exact volume of a small volume prover. Some examples of unusual test measure sizes in regular use are 15 gallon, 21 gallon, 42 gallon, 225 gallon, and 227 gallon. Equivalent volume or different volume test measure sizes are also available in metric units. Only test measures calibrated with a "to deliver" volume shall be used in the calibration of meter provers.

8.1 FIELD USE OF VOLUMETRIC TEST MEASURES

Field standard test measures are used in the waterdraw calibration of pipe provers, small volume provers, and tank provers. Full descriptions of the field use of these test measures will be found in API MPMS, Chapter 4.9.1, "Determination of the Volume of Displacement and Tank Provers by the Waterdraw Method of Calibration." The procedure for calculating the Base Volume of a Displacement or Tank Prover, using field standard test measures, is described in API MPMS, Chapter 12.2.4, "Calculation of Base Prover Volumes by the Waterdraw Method."

Different techniques may be used to calibrate meter provers using field standard test measures. Any of the following methods can be used; however, normal operating practice currently uses Method 3.

1. Fill each test measure to its exact certified (zero) capacity, and allow the final water level to be read on completion in the last test measure filled.
2. Slightly overfill each test measure and then bring the level back to the exact capacity (zero) by withdrawing some of the liquid. The liquid that is withdrawn is then released into the next test measure to be filled.
3. With graduated neck test measures it is not necessary to operate at the zero level. Therefore, the test measure may be filled to any location on its scale, the liquid level read, and its certified capacity then adjusted mathematically using a plus or minus scale reading (SR). A minus signifies that the water level is below the certified capacity (zero mark) on the test measure scale, and a plus indicates that the water level is above the zero mark on the scale.

8.2 NORMAL TEST MEASURE OPERATION

Setting up a test measure for calibration use in the field normally involves the following preparatory steps. Inspect all test measures to be used in the calibration for cleanliness, freedom from dents, and unbroken sight glasses, scales, and seals. Following the inspection, all the test measures shall be leveled, while they are in a water-filled condition, across at least two axes 90° apart. After this filling and leveling, the test measure shall be leak tested. If free of leaks, draining of the test measures should now take place, until cessation of the main flow allows the draining time (as specified on the calibration certificate) to be followed and a repeatable clingage condition established. After draining for the prescribed draining time, the bottom drain valve is closed, or the test measure is returned to its upright and level position, to prepare the test measure for calibration use.

During a calibration run, the test measure is filled with water to an upper scale reading, which is then recorded as a (+) or (-) volume, depending on whether the reading is above or below the zero line on the scale. Typically, most volumes are read as either cubic inches (in.3) or milliliters (ml), depending on the calibration units of the test measure. Once the test measure is filled, the leveling verified, and the scale reading recorded, the temperature of the water in the test measure is then required. This temperature can be taken by different methods and in different locations. The temperature can be taken, while the test measure still contains the water following the scale reading, by immersing a thermometer inside the test measure, or by holding a thermometer in the flowing stream while the water is draining from the test measure after the scale reading. At the cessation of the main flow, it is essential that the draining time is carefully and exactly followed, to allow only the correct amount of water to escape. This will restore the test measure to its ready-to-fill-again condition, ensuring that the water clingage remaining inside the test measure is maintained each time as a repeatable quantity and replicates the draining time used by the calibrating agency.

8.3 IRREGULAR TEST MEASURE OPERATIONS

In field use, irregular operating conditions can sometimes arise while using test measures in prover calibrations. Two of these conditions are identified as requiring that a test measure be either overfilled or underfilled during the calibration cycle. If either one of these two conditions should arise during a prover calibration, the techniques described below can be used to resolve the deficiency.

8.3.1 Test Measure Pre-Fill

During a calibration, it is sometimes necessary to accurately determine the volume of a partially filled test measure. This determination can be made by a method known as pre-filling. Pre-filling is accomplished by first filling a larger test measure with an amount of water taken from a smaller volume test measure. This will bring the water being drawn from the prover into the test measure to a level that can be read on the gauge glass scale.

As an example, consider a prover calibration with an approximate one-way volume of 145 gallons. To accomplish the volume determination, the only test measures available are of 100-, 50-, and 5-gallon capacity. One method of calibrating this prover would be to fill the 100-gallon test measure one time and the 5-gallon test measure nine times. This would not be recommended, due to the large number of test measure fillings required using a 5-gallon test measure, which normally has a large uncertainty value in its calibration. A more practical solution would be to pre-fill either the 100- or the 50-gallon test measure with 5 gallons of water from the 5-gallon test measure and then fill the remainder of this test measure from the prover.

The 100-gallon test measure for Fill No. 1 has a base volume of 23100 in.3. The 50-gallon test measure for Fill No. 2 has a base volume of 11550 in.3. The pre-fill 5-gallon test measure has a base volume of 1155 in.3.

If only the 100-gallon and the 50-gallon test measures are used, at the end of the pass the water level in the last test measure filled will not reach the neck and, therefore, will not have a scale reading. In this instance a pre-fill is used to pre-load the first measure filled with enough water to enable readings in the sight glasses of all the test measures filled to be achieved.

Treat the pre-fill volume as a separate field standard test measure filling, but with both the Base Volume of the Test Measure (BMV) and the Scale Reading (SR) handled in a negative manner.

Fill the 5-gallon test measure with waste water and take the Scale Reading (SR). The temperature is taken by immersion, after reading and recording the SR. Then drain the 5-gallon measure into the 100-gallon measure using the prescribed drain time. In this example, the SR on the 5-gallon test measure is (-10).

Fill No. 1, of the 100-gallon test measure, has an SR of (+20) in.3. Temperature is taken, while draining, after reading and recording the scale.

Fill No. 2, of the 50-gallon test measure, has an SR of (-6) in.3. Temperature is taken, while draining, after reading and recording the scale (see below).

Pre-fill BMV = 1155 …treated as a negative: BMV = -1155
Pre-fill SR = -10 …treated as a negative: SR = +10

To determine the adjusted Base Volume of the Test Measure (BMVa):

Pre-fill:    BMV + SR = BMVa   Example: -1155 + (+10) = -1145
Fill No. 1:  BMV + SR = BMVa   Example: 23100 + (+20) = +23120
Fill No. 2:  BMV + SR = BMVa   Example: 11550 + (-6) = +11544

TOTAL VOLUME (145.10 gals) 33519 in.3

Calculate the Pre-fill volume the same way as any other test measure is calculated:

Calculate Fill No. 1: 23120 x CTDW x CCTS = WD …for Fill No. 1
Calculate Fill No. 2: 11544 x CTDW x CCTS = WD …for Fill No. 2
Calculate Pre-fill:   -1145 x CTDW x CCTS = WD …for Pre-fill

where

CTDW = The correction for the difference in density between the water in the prover and the water in the test measure,

CCTS = The combined correction factor for the effect of temperature on the steel of the prover and the test measure,


WD = The Base Volume of the Test Measure adjusted for the scale reading, SR, and corrected for the effects of CTDW and CCTS.
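A minimal sketch, not part of the standard, of this bookkeeping is shown below. The fill data are the example figures above; the CTDW and CCTS values are hypothetical placeholders set to unity.

```python
# Minimal sketch, not part of the standard: the pre-fill bookkeeping shown
# above. The pre-fill BMV and SR carry reversed signs, each adjusted base
# volume (BMVa) is corrected by CTDW and CCTS, and the corrected waterdraw
# volumes (WD) are summed. CTDW and CCTS are placeholders set to unity.

fills = [
    ("Pre-fill (5 gal)",     -1155.0, +10.0),   # signs already reversed
    ("Fill No. 1 (100 gal)", 23100.0, +20.0),
    ("Fill No. 2 (50 gal)",  11550.0,  -6.0),
]

CTDW = 1.00000   # water-density correction (placeholder)
CCTS = 1.00000   # combined steel-temperature correction (placeholder)

total_wd = 0.0
for name, bmv, sr in fills:
    bmv_a = bmv + sr              # adjusted base volume, BMVa
    wd = bmv_a * CTDW * CCTS      # corrected waterdraw volume, WD
    total_wd += wd
    print(f"{name}: BMVa = {bmv_a:+.0f} in.3, WD = {wd:+.1f} in.3")

print(f"Total volume = {total_wd:.0f} in.3 ({total_wd / 231.0:.2f} gal)")
```

With the placeholder corrections of unity, this reproduces the 33519 in.3 (145.10 gallon) total of the example.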

8.3.2 Test Measure Neck Draw

Another unusual operating condition that can arise with test measure use is the required, controlled overfilling of test measures during a prover calibration run. In this case, the problem is that additional water needs to be contained in the already-full test measure. This is a situation where a neck drawdown (neck-fill) procedure would be used, and examples of this situation are described in the text below.

As an example, consider a prover calibration with an approximate one-way volume of 153 gallons. To accomplish the volume determination, the only test measures available are of 100- and 50-gallon capacity. The total capacity of the combined filling of these two test measures would range from a minimum volume of 148.5 gallons to a maximum volume of 151.5 gallons.

The 100-gallon test measure for Fill No. 1 has a base volume of 23100 in.3. The 50-gallon test measure for Fill No. 2 has a base volume of 11550 in.3.

It has been determined beforehand that, after filling both measures to their maximum readable scale levels, there will still be approximately 1.5 gallons of water remaining in the prover. This volume of water will somehow need to be accommodated in a test measure, so that it can be measured.

Fill No. 1, of the 100-gallon test measure, has an SR of (+230) in.3. Temperature is taken, by immersion, after reading and recording the scale.

Fill No. 2, of the 50-gallon test measure, has an SR of (+100) in.3. Temperature is taken, while draining, after reading and recording the scale.

Neck Draw: Lower the water level in the neck of the 100-gallon test measure, to waste, by careful manipulation of the drain valve, until the water level reaches the lowest readable value on the scale. Close the drain valve and take the scale reading. This value is recorded as the opening scale reading.

SR (opening) = -220 in.3

Draw water from the prover, refilling the neck of the 100-gallon test measure, until the detector switch activates and shuts down the prover calibration run. Take the closing scale reading:

SR (closing) = +215 in.3

Treat this neck fill as if it were a separate field standard test measure with a BMV = zero (0).

The temperature is taken by immersion, of the neck portion only, after reading and recording the closing SR.

To rationalize the Scale Reading:

SR = [SR (closing) - SR (opening)]
   = +215 - (-220)
   = +215 + 220
   = +435 in.3

To determine BMVa:

Fill No. 1:  BMV + SR = BMVa   Example: 23100 + (+230) = +23330
Fill No. 2:  BMV + SR = BMVa   Example: 11550 + (+100) = +11650
Neck Draw:   BMV + SR = BMVa   Example: zero + (+435) = +435

TOTAL VOLUME (153.31 gals) 35,415 in.3

Calculate the Neck Fill as any other measure fill is calculated:

Calculate Fill No. 1: 23330 x CTDW x CCTS = WD …for Fill No. 1
Calculate Fill No. 2: 11650 x CTDW x CCTS = WD …for Fill No. 2
Calculate Neck Fill:    435 x CTDW x CCTS = WD …for Neck Draw

Similar situations can often occur in the calibration of new provers, or when provers have been modified or worked on, such that the volume is not exactly known. Therefore, the first calibration run is very often a guess as to where the actual final volume will finish, relative to the test measures being used to determine the volume. In this situation, the operator may have to perform several neck draws to complete the calibration run, simply because the prover volume is not precisely known. Use of this method will allow a precise volume to be determined and will enable the operator to refine the test measure fillings for the additional calibration runs required.

8.3.3 Test Measure Pre-Fill and Neck Draw

It is also possible to have a situation where a test measure pre-fill and a test measure neck draw are both necessary during the same prover calibration run. Although this situation is uncommon, operators should be prepared for it, should this condition arise in a calibration.

Following the previous examples, consider a prover calibration with an approximate one-way volume of 147.5 gallons. To accomplish the volume determination, the only test measures that are available are of 100-, 50-, and 5-gallon capacity. Therefore, to perform the prover calibration with



these test measures would require the following sequence of events:

Fill the 100-gallon test measure with a 5-gallon water pre-fill from the 5-gallon test measure, then fill the test measure up to the top of the readable scale with water drawn from the prover. Next, fill the 50-gallon test measure to the top of its readable scale with waterdraw water. At this point, the following situation occurs:

The volume in the 100-gallon test measure is 101 gallons, from which 5 gallons need to be subtracted. The volume in the 50-gallon test measure is 50.5 gallons. The total volume equals (101 + 50.5 - 5) = 146.5 gallons.

Therefore, with both test measures filled, and a pre-fill of 5 gallons, a total amount of 146.5 gallons of water has been held. However, 147.5 gallons need to be contained, and room in a test measure is needed for an additional one gallon of water.

At this point, it is necessary to draw down the neck of the 100-gallon test measure. Drawing the water level down to the lowest readable scale line will make total volume space available in the 100-gallon test measure for approximately an additional 2 gallons and allow the prover calibration run to be completed.


Figure 1—Various Types of Standard Test Measures (I)

(Figure not reproduced. Labeled features include the sight tube, gauge scale, gauge glass, and glass plate.)


Figure 2—Various Types of Standard Test Measures (II)

(Figure not reproduced. Labeled features include the protector, glass tube with scale, window, 35° cone, brim measure, and slicker-plate test measure.)


Figure 3—Field Standard Test Measure—Invertible Type

(Figure not reproduced. The gauge scale is graduated in cubic inches from 25 below zero to 25 above zero, and the measure is marked "DELIVERS 5 US GAL.")

1. Reinforcing bar
2. Handle (raised)
3. Rolled bead
4. Gauge assembly
5. Gauge mounting
6. Concave bottom


Figure 4—Field Standard Test Measure—Bottom Drain Type

(Figure not reproduced.)

1. Drain valve
2. Drain slope 5°
3. Levels
4. Level cover
5. Gauge mounting
6. Rolled bead
7. Top cone pitch 25°
8. Reinforcing bands
9. Bottom cone pitch 20°
10. Adjustable legs


Figure 5—Gauge Glass and Scale Assembly

(Figure not reproduced. The gauge scale is graduated in cubic inches from 25 below zero to 25 above zero, and the assembly is marked "DELIVERS 5 US GALLONS.")


Figure 6—Field Standard Test Measure Control Chart

(Figure not reproduced. Control chart for a 5-gallon slicker-plate test measure, Serial No. 123456, owner ABC Oil Co.; calibrated volume in gallons is plotted against year from 1940 to 2000, with a center line and warning, action, and tolerance limits.)


APPENDIX A—ACCURACY REQUIREMENTS FOR VOLUMETRIC TEST MEASURES

In determining the accuracy requirements in the calibration of test measures, it is necessary to consider the hierarchy of petroleum custody transfer measurements so that the precision levels of test measures are consistent with the other precision levels of measurement. The conventional hierarchy in petroleum measurement is shown in Table A1.

If, however, master meter calibration techniques are introduced, then the number of hierarchy levels will be increased to seven (see Table A6). Similarly, should a state agency or calibration source other than NIST be used in the test measure calibration, the number of hierarchy levels will be increased (see Table A8).

Potential errors or uncertainties of each position in the hierarchy, from the lowest to the highest position, must be progressively lower so that the overall uncertainty of the hierarchy can be controlled. Uncertainties from the highest to the lowest levels progressively add to each successive level. Each step in the hierarchy should normally have an uncertainty that is 25 to 50% of the uncertainty at the next lowest level in the hierarchy. For measurements that are somewhat repetitious in nature, the margin between adjacent steps in the hierarchy can be less, because repeated measurements propagate the uncertainty of the overall average to a lower value.

Shown below are the uncertainty requirements for volume measurements in the normal petroleum measurement hierarchy (see Table A2). The uncertainties versus test measure size shown in the table are appropriate to the petroleum industry convention for meter provers.

No recommendations are made for uncertainty limits for small volume provers calibrated by master meter/prover procedures. For small volume provers, lower uncertainty limits are needed for the test measures used in their calibration, since smaller prover and test measure volumes are used at the same metering rates as conventional provers. Recommendations on uncertainty limits for test measures used for calibrating small volume provers are given in Table A4.

The use of small volume provers and master meter/prover calibration of conventional provers drives the uncertainty limit capabilities needed for test measure calibration. The uncertainty limits shown in Table A3 are the recommended levels that can be achieved by NIST.

If master meter calibration of meter provers is introduced into the petroleum hierarchy, as in the case of proving a master meter with a master prover and then using the master meter to calibrate a field prover, the number of levels in the hierarchy is increased to seven as follows (see Table A7):

Table A1—Hierarchy of Measurements for the Metered Volumes of Petroleum

Hierarchy Position    Measurement Description
1. Highest            Calibration of NIST Standards
2.                    Calibration of Test Measures
3.                    Calibration of Meter Prover
4.                    Proving of Custody Transfer Meter
5. Lowest             Custody Transfer Meter Registration

Table A2—Uncertainty Requirements for a Petroleum Measurement Hierarchy Using a Single Custody Transfer Meter and the Waterdraw Calibration of a Displacement Prover

                                                            Overall Uncertainty Limits %
Hierarchy Position    Measurement Description               Low Accuracy    Medium Accuracy    High Accuracy
1. Highest            Calibration of NIST Stds              0.050           0.020              0.010
2.                    Calibration of Test Measures          0.100           0.050              0.020
3.                    Calibration of Meter Prover           0.200           0.100              0.050
4.                    Proving of Custody Transfer Meter     0.350           0.150              0.075
5. Lowest             Custody Transfer Meter Registration   0.500           0.250              0.100


Table A3—Uncertainty Limits for Calibration of Normal Sensitivity Test Measures as Used in the Calibration of Conventional Displacement Provers

Size of Test Measure, Gallons    Uncertainty Limit %
1                                0.030
5                                0.020
10                               0.010
25                               0.010
50                               0.005
100                              0.005
200                              0.005
500                              0.005

Table A4—Uncertainty Limits for the Calibration of High Sensitivity Test Measures as Used in the Calibration of Small Volume Provers

Size of Test Measure, Gallons    Uncertainty Limit %
1                                0.030
5                                0.020
10                               0.010
25                               0.010
50                               0.010
100                              0.010

Table A5—Uncertainty Limits for the Calibration of Test Measures Used for Displacement Provers

                                 Overall Uncertainty Limits %
Size of Test Measure, Gallons    With Master Meter Prover Calibration    Without Master Meter Prover Calibration
1                                0.025                                   0.050
5                                0.015                                   0.035
10                               0.010                                   0.025
25                               0.010                                   0.020
50                               0.008                                   0.015
100                              0.005                                   0.010
200                              0.005                                   0.010
500                              0.005                                   0.010

Table A6—Hierarchy of Petroleum Metered Volume Measurements Including Calibration of Meter Provers by a Master Meter

Hierarchy Position    Measurement Description
1. Highest            Calibration of NIST Standards
2.                    Calibration of Test Measures
3.                    Calibration of Master Prover
4.                    Calibration of Master Meter
5.                    Calibration of Meter Prover
6.                    Proving of Custody Transfer Meter
7. Lowest             Custody Transfer Meter Registration


Table A7—Uncertainty Requirements for a Petroleum Measurement Hierarchy Including a Single Custody Transfer Meter With Master Meter Prover Calibration

Hierarchy Position    Measurement Description               Low Accuracy    Medium Accuracy    High Accuracy
1. Highest            Calibration of NIST Standards         0.010           0.005              0.0025
2.                    Calibration of Test Measures          0.025           0.010              0.005
3.                    Calibration of Master Prover          0.050           0.020              0.010
4.                    Calibration of Master Meter           0.100           0.050              0.020
5.                    Calibration of Meter Prover           0.200           0.100              0.050
6.                    Proving of Custody Transfer Meter     0.350           0.150              0.075
7. Lowest             Custody Transfer Meter Registration   0.500           0.250              0.100

The extreme case of total expansion of the Petroleum Measurement Hierarchy is shown below:

Table A8—Total Hierarchy of Measurements of Metered Volumes of Petroleum Showing International Standards, Calibration of Test Measures other than by NIST and Meter Prover Calibration by a Master Meter

Hierarchy Position    Measurement Description
1. Highest            International Base Standard Units
2.                    Calibration of NIST Standards
3.                    Calibration of Other Body Standards
4.                    Calibration of Test Measures
5.                    Calibration of Master Prover
6.                    Calibration of Master Meter
7.                    Calibration of Meter Prover
8.                    Proving of Custody Transfer Meter
9. Lowest             Custody Transfer Meter Registration


Figure 7—Levels of the Petroleum Measurement Hierarchy

(Figure not reproduced. The diagram shows parallel calibration chains running from International Base Standard Units, through calibration of NIST standards and other calibration agency standards, calibration of field standard test measures, calibration of master provers and master meters, calibration of field provers by the waterdraw or master meter method, proving of the custody transfer meter, and finally custody transfer meter registration.)


APPENDIX B—CALCULATION OF UNCERTAINTY OF A FIELD STANDARD TEST MEASURE

While the contained volume at the water temperature, Vcontain(Tw), is the actual scientific value determined by NIST, test values of the delivered volume and calculated values of the volume at the reference temperature, V(Tb), required by industry, are also provided by NIST. Accordingly, the equation for the contained volume (below) must be subject to uncertainty analysis:

Vcontain(Tw) = (mf - me) / [ρ(Tw) - ρ(Tair)] + h    (m3)

The above equation contains the expression ρ(Tw), which is derived from the equation:

ρ(Tw) = (a + bTw + cTw² + dTw³ + eTw⁴ + fTw⁵) / (1 + gTw)    (kg/m3)

This equation is very unwieldy for the purposes of uncertainty analysis. To make the analysis simpler, the first-order approximation is used:

ρ(Tw) = ρo [1 - βw (Tw - Tb)]    (kg/m3)

where

ρo = The density of the distilled water at the reference temperature, Tb,

βw = The volumetric thermal expansion coefficient for water appropriate for Tb,

Tb = The reference temperature, either 60°F or 15°C.

While the accuracy of this equation is inferior to that of the full Kell density equation, it is extremely satisfactory for the purposes of uncertainty analysis. Substituting the approximation into the contained-volume equation gives:

Vcontain(Tw) = (mf - me) / {ρo [1 - βw (Tw - Tb)] - ρ(Tair)} + h    (m3)

Air density is a function of air temperature, atmospheric pressure, and relative humidity. At a given altitude, fluctuations about the mean air density are known to rarely exceed 3%. Analysis indicates that modest accuracy in the required measurements produces a satisfactory uncertainty in the air density. For example, temperature measured within an accuracy of ±0.4°C, atmospheric pressure within ±1.1 mm Hg, and relative humidity within ±16% will result in a contribution to the uncertainty of the above equation on the order of one part per million (1 ppm).

It is convenient to make a change in notation at this point: let V = Vcontain(Tw).

The relative uncertainty in V may be expressed as:

dV/V = k { Σ [ (1/V)(∂V/∂xi) dxi ]² }^(1/2)

where

xi = The measured variables in V,

k = The coverage factor, normally taken to be 2, consistent with NIST policy.

Tables B1–B3 list typical calculated uncertainties for contained volumes at Tw for 100-, 50-, and 5-gallon field standard test measures, respectively. Values are listed in SI units.
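A minimal sketch, not part of the standard, of how the expression above is evaluated is shown below, using the per-variable contributions listed in Table B1 for the 100-gallon neck-scale measure.

```python
# Minimal sketch, not part of the standard: evaluating the relative-uncertainty
# expression above with the per-variable contributions of Table B1 (100-gallon
# neck-scale measure). Each entry is (1/V)(dV/dxi)*dxi in parts per million.

from math import sqrt

contributions_ppm = [8.6, 6.0, -6.0, -4.0, 7.1, 3.0, 1.0]  # h, mf, me, rho_o, beta_w, Tw, rho_air
k = 2.0                                                     # coverage factor

expanded_ppm = k * sqrt(sum(c * c for c in contributions_ppm))
print(f"Expanded relative uncertainty = {expanded_ppm * 1e-4:.4f} %")  # about 0.0030 %
```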

"Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results," NIST Technical Note 1297, suggests classifying the uncertainties for the xi as:

Type A: Uncertainties which are evaluated by statistical methods.

Type B: Uncertainties which are evaluated by other means.

Once a value for the uncertainty of an xi is determined, it receives the identical mathematical treatment in the equation whether it is Type A or Type B. The sources of the uncertainties for the xi's, and their types, are:

dh is one-tenth of the least division of the neck scale, read with a magnifier fitted with an anti-parallax device; Type B.

dmf and dme are calculated from:

dm = (sm / b) [ 1 + 1/p + (m - mmidrange)² / Σ(mi - mmidrange)² ]^(1/2)

where

sm = The standard deviation of the residuals of the comparator calibration equation,



b = The slope of the calibration equation,

p = The number of calibration measurements,

m = The mass of the test measure,

mmidrange = The midpoint of the calibrated range of the comparator,

mi = The mass value for which the calibrations were done.

One would expect the uncertainties of the masses to be included in dm, but they are negligible compared to the standard deviation of the residuals; Type A.

dρo and dβw are derived from the Kell Density Formula; Type B.

dTw is based on the standard deviation from NIST calibration data; Type B.

dρair is derived from A Primer for Mass Metrology, Jaeger & Davis, NBS Publication 700-1.

Volume calibrations are normally done in a climate-controlled laboratory environment at a temperature around 23°C. The normal requirement from the petroleum industry is to convert volumes measured at normal room or ambient temperatures to a reference temperature, usually 60°F or 15°C, via the formula:

V(Tb) = V(Tw) [1 - βt (Tw - Tb)]

where

βt = The volumetric coefficient of thermal expansion for the test measure,

Tb = The reference temperature of the test measure, typically 60°F or 15°C.

Since the volumetric coefficient of thermal expansion is not directly measured, common practice has been to use the thermal expansion coefficient for Series 304 stainless steel. The value used is 0.000048/°C. Using this value for contained volumes, the uncertainty due to the calculation using the above volume equation is (Tw - Tb)·dβt, where dβt is 7 × 10⁻⁷/°C, which is to be added in quadrature to the appropriate uncertainty from the tables.

The uncertainty in applying the above volume equation to delivered volumes is compounded by the fact that when the temperature decreases from 23°C to 15.56°C, the viscosity of water increases by 21% and the surface tension increases by 1.6%. These increases will have an effect on the way the water drains from the test measure. If less water drains from a given measure at 15.56°C in 30 seconds than from the same measure at 23°C in 30 seconds, then there is an uncorrected residual effect that will make it difficult to calculate the delivered volume at the lower temperature with any level of confidence.


Table B1—Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 100-Gallon (0.379 m3) Neck Scale Test Measure

xi      Value            (1/V)(∂V/∂xi)    Value               dxi                 [(1/V)(∂V/∂xi)]dxi, ×10⁻⁶
h       —                1/V              2.6/m3              3.3 × 10⁻⁶ m3       8.6
mf      492 kg           1/(mf - me)      2.7 × 10⁻³/kg       2.2 × 10⁻³ kg       6.0
me      115 kg           -1/(mf - me)     -2.7 × 10⁻³/kg      2.2 × 10⁻³ kg       -6.0
ρo      1000 kg/m3       -1/ρo            -1 × 10⁻³ m3/kg     4 × 10⁻³ kg/m3      -4.0
βw      2 × 10⁻⁴/°C      Tw - Tref,w      21.5°C              3.3 × 10⁻⁷/°C       7.1
Tw      21.5°C           βw               1.5 × 10⁻⁴/°C       2 × 10⁻² °C         3.0
ρair    1.18 kg/m3       1/ρo             1 × 10⁻³ m3/kg      1 × 10⁻³ kg/m3      1.0

dV/V = 2 { Σ [ (1/V)(∂V/∂xi) dxi ]² }^(1/2) = 0.0030%


Table B2—Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 50-Gallon (0.189 m3) Neck Scale Test Measure

xi      Value            (1/V)(∂V/∂xi)    Value               dxi                 [(1/V)(∂V/∂xi)]dxi, ×10⁻⁶
h       —                1/V              5.3/m3              1.6 × 10⁻⁶ m3       8.5
mf      254 kg           1/(mf - me)      5.3 × 10⁻³/kg       2.2 × 10⁻³ kg       11.7
me      66 kg            -1/(mf - me)     -5.3 × 10⁻³/kg      2.2 × 10⁻³ kg       -11.7
ρo      1000 kg/m3       -1/ρo            -1 × 10⁻³ m3/kg     4 × 10⁻³ kg/m3      -4.0
βw      2 × 10⁻⁴/°C      Tw - Tref,w      23.5°C              3.3 × 10⁻⁷/°C       7.8
Tw      23.5°C           βw               1.5 × 10⁻⁴/°C       2 × 10⁻² °C         3.0
ρair    1.18 kg/m3       1/ρo             1 × 10⁻³ m3/kg      1 × 10⁻³ kg/m3      1.0

dV/V = 2 { Σ [ (1/V)(∂V/∂xi) dxi ]² }^(1/2) = 0.0042%

Table B3—Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 5-Gallon (0.0189 m3) Neck Scale Test Measure

xi      Value            (1/V)(∂V/∂xi)    Value               dxi                 [(1/V)(∂V/∂xi)]dxi, ×10⁻⁶
h       —                1/V              53/m3               1.6 × 10⁻⁶ m3       84.8
mf      23.6 kg          1/(mf - me)      5.3 × 10⁻²/kg       2.3 × 10⁻⁴ kg       12.2
me      4.8 kg           -1/(mf - me)     -5.3 × 10⁻²/kg      2.3 × 10⁻⁴ kg       -12.2
ρo      1000 kg/m3       -1/ρo            -1 × 10⁻³ m3/kg     4 × 10⁻³ kg/m3      -4.0
βw      2 × 10⁻⁴/°C      Tw - Tref,w      18°C                3.3 × 10⁻⁷/°C       6.0
Tw      18°C             βw               1.5 × 10⁻⁴/°C       2 × 10⁻² °C         3.0
ρair    1.18 kg/m3       1/ρo             1 × 10⁻³ m3/kg      1 × 10⁻³ kg/m3      1.0

dV/V = 2 { Σ [ (1/V)(∂V/∂xi) dxi ]² }^(1/2) = 0.0174%



Table B4—Uncertainty for the Gravimetric Determination of the Contained Volume at Room Temperature of NIST's 5-Gallon (0.0189 m3) Slicker-Plate Test Measure

xi      Value            (1/V)(∂V/∂xi)    Value               dxi                 [(1/V)(∂V/∂xi)]dxi, ×10⁻⁶
h       —                1/V              53/m3               0                   0
mf      28.1 kg          1/(mf - me)      5.3 × 10⁻²/kg       2.3 × 10⁻⁴ kg       12.2
me      9.3 kg           -1/(mf - me)     -5.3 × 10⁻²/kg      2.3 × 10⁻⁴ kg       -12.2
ρo      1000 kg/m3       -1/ρo            -1 × 10⁻³ m3/kg     4 × 10⁻³ kg/m3      -4.0
βw      2 × 10⁻⁴/°C      Tw - Tref,w      21.7°C              3.3 × 10⁻⁷/°C       7.2
Tw      21.7°C           βw               1.5 × 10⁻⁴/°C       2 × 10⁻² °C         3.0
ρair    1.18 kg/m3       1/ρo             1 × 10⁻³ m3/kg      1 × 10⁻³ kg/m3      1.0

dV/V = 2 { Σ [ (1/V)(∂V/∂xi) dxi ]² }^(1/2) = 0.0039%


APPENDIX C—TEST MEASURE CONTROL CHARTS

Every test measure shall have a data record prepared and maintained by the national calibrating agency, containing all the relevant information pertaining to the history of the test measure.

The relevant information should contain:

a. The identity of the test measure by owner and the owner's number.
b. The maker's name, identification number, nominal volume, scale divisions, coefficient of expansion of the material of construction, and dates and details of all types of cleaning performed.
c. Calibration dates and corresponding seal numbers, together with "to contain" and "to deliver" volumes.
d. Uncertainty history.

The calibration history of all test measures shall be documented and recorded, and a control chart developed for each test measure.

An example of a Test Measure Control Chart is shown in Figure 6, which plots the calibrated volume against time for each calibration performed, and shows the total documented calibration history of this particular test measure. Analysis of the control chart shown in Figure 6 yields the data shown in Table C1.

More frequent calibrations should have been performed on the test measure in the early years, because sufficient data was unavailable until 1972 or 1973 to construct a control chart with calculated limits. Initially, the calibration facility's uncertainty estimates could have been used until sufficient data was available to calculate control levels. For measurements as critical as test measure calibrated volumes, conservative confidence levels should be used to set warning, action, and tolerance limits. The following statistical confidence levels are recommended:

Control Level       Confidence Level %
Warning Limits      90
Action Limits       95
Tolerance Limits    99

From the control chart in Figure 6 and the statistical analysis in Table C1, the following control limits are calculated after four calibrations, by the year 1973:

Center Line = 4.99788
Warning Limits = 4.99761 and 4.99815
Action Limits = 4.99751 and 4.99825
Tolerance Limits = 4.99720 and 4.99856

From the control chart in Figure 6, three of the past calibrations are outside of the warning limits, but no results fall outside of the action limits.

This history of each test measure is recorded and kept on file by NIST for every test measure calibrated. It is also advisable for the owners of field standard test measures to keep their own current records on file, including detailed information on all maintenance and modifications performed.

Table C1—Statistical Analysis Table of the Calibration History of Test Measure #123456 (5-Gallon Slicker-Plate Test Measure)

Year    Volume in Gallons    Moving Average    Moving Range
1960    4.99790              —                 —
1965    4.99802              4.99796           0.00012
1972    4.99780              4.99791           0.00022
1973    4.99778              4.99788           0.00024
1982    4.99780              4.99786           0.00024
1982    4.99806              4.99789           0.00028
1982    4.99817              4.99793           0.00039
1982    4.99809              4.99795           0.00039
1992    4.99820              4.99798           0.00042
1994    4.99800              4.99798           0.00042
1996    4.99775              4.99796           0.00045
1996    4.99817              4.99798           0.00045
1996    4.99772              4.99796           0.00048
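A minimal sketch, not part of the standard, is shown below. It recomputes the moving average and moving range columns of Table C1, interpreting them cumulatively over all calibrations to date (an interpretation that reproduces the tabulated values), and flags results outside the warning limits quoted above.

```python
# Minimal sketch, not part of the standard: recomputing the moving average and
# moving range columns of Table C1 (cumulative over all calibrations to date)
# and flagging calibrations outside the warning limits quoted in the text.

history = [  # (year, calibrated volume in gallons) from Table C1
    (1960, 4.99790), (1965, 4.99802), (1972, 4.99780), (1973, 4.99778),
    (1982, 4.99780), (1982, 4.99806), (1982, 4.99817), (1982, 4.99809),
    (1992, 4.99820), (1994, 4.99800), (1996, 4.99775), (1996, 4.99817),
    (1996, 4.99772),
]

WARN_LOW, WARN_HIGH = 4.99761, 4.99815   # warning limits from the text above

volumes = []
for year, vol in history:
    volumes.append(vol)
    moving_avg = sum(volumes) / len(volumes)
    moving_range = max(volumes) - min(volumes)
    flag = "  <-- outside warning limits" if not (WARN_LOW <= vol <= WARN_HIGH) else ""
    print(f"{year}: {vol:.5f}  avg={moving_avg:.5f}  range={moving_range:.5f}{flag}")
```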


APPENDIX D—LABORATORY WEIGHTS AND MASS STANDARDS

Various classes of weights and mass standards are used for laboratory procedures and in gravimetric calibrations. Their specifications in OIML R 111 describe weights in classes E1, E2, F1, F2, M1, M2, and M3.

For complete information, see OIML R 111.

This appendix briefly describes the principal physical characteristics and metrological requirements for the classes of weights that are used:

a. For the verification of weighing instruments.
b. For the verification of weights of a lower class of accuracy.
c. With weighing instruments.

For verification of weighing instruments, it is recommended that the tolerance of the weight be < 1/4 of the precision needed from the instrument.

D.1 Terminology

a. Weight: A material measure of mass, regulated in regard to its physical and metrological characteristics: shape, material, surface quality, nominal value, and maximum permissible error.
b. Accuracy class of weights: A class of weights which meets certain metrological requirements intended to keep errors within specified limits.
c. Traceability: The "property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international standards, through an unbroken chain of comparisons all having stated uncertainties."

D.2 Physical Characteristics

D.2.1 CONSTRUCTION

Weights are divided into two types based on design:

a. Type I: These weights are of one-piece construction and contain no added adjusting material. They should be specified when weights are to be used as standards of the highest order and where maximum stability is required. A precise measurement of density can be made only for one-piece weights.
b. Type II: Weights of this type can be of any appropriate design, such as screw knob, ring, or sealed plug. Adjusting material can be used provided it is of a material at least as stable as the base material and is contained in such a way that it will not become separated from the weights.

Class E1 and E2 weights shall be Type I, which is solid and shall have no cavity open to the atmosphere. They shall have an integral construction, i.e., consist of a single piece of material.

Class F1 and F2 weights from 1 g to 50 kg may be Type I or Type II, i.e., one or more pieces of the same material. They may contain an adjusting cavity; however, the volume of this cavity shall not exceed one-fifth of the total volume of the weight, and the cavity shall be closed by means of a lifting knob or other suitable device.

Class M1 weights from 100 g to 50 kg shall have an adjusting cavity. For weights from 10 g to 50 g, the cavity is optional, and weights from 1 g to 10 g shall be manufactured without an adjusting cavity.

Class M2 and M3 weights from 100 g to 50 kg shall have an adjusting cavity. For weights of 20 g and 50 g, the adjusting cavity is optional, and weights of 10 g and less shall be solid without any adjusting cavity.

D.3 Design

A weight may have any shape that does not introduce features that reduce its reliability. All weights shall be free of ragged or sharp edges or ends.

Weights less than 1 g shall be flat polygonal sheets or wires, with appropriate shapes, which permit easy handling. The shapes shall be indicative of the nominal value of the weights.

Weights of 5 g to 50 kg may have different shapes suitable for their method of handling. Instead of a lifting knob they may have rigid handling devices embodied with the weights, such as axles, handles, etc.

D.4 Material

All weights shall be corrosion-resistant. The quality of the materials used in construction shall be such that the change in mass of the weights shall be negligible, relative to the maximum errors permitted in their accuracy class under normal conditions of use.

Class E1 and E2 weights shall be constructed of a metal or alloy that is practically nonmagnetic. The hardness of this material and its resistance to wear shall be similar to or better than that of austenitic stainless steel.

Class F1 and F2 weights shall be constructed of material that has brittleness and hardness equal to drawn brass. The material used for weights of this class shall be practically nonmagnetic.

Class M1 cylindrical weights of 10 kg and below shall be made of brass or other material whose quality is similar to or better than that of brass. Weights of 1 g and less shall be made of a material that is sufficiently resistant to corrosion and oxidation. The surface shall not be coated, and the material in the weights shall be practically nonmagnetic.

Class M2 and M3 cylindrical weights of 10 kg and below shall be made of material which has a hardness and corrosion resistance equal to that of cast brass and a brittleness not exceeding that of gray cast iron. Gray cast iron shall not be used in weights less than 100 g. Weights shall be practically nonmagnetic.


D.5 Density

Because of the effects of the buoyant force of air on a weight, precision measurements of mass require that the volume of the weight be known, as well as the density of the air volume in which it is being measured, so that the appropriate corrections can be made. For weights of high precision, the range of density is limited to values at or near the density of well-established standards, such as are used by NEL, NIST, etc. As lower precision of measurement is required, so the range of density is broadened.

D.6 Finish

The surface of the weights shall be smooth and the edges shall be rounded. The surface of class E1, E2, F1, and F2 weights up to and including 50 kg shall not appear to be porous and shall present a glossy appearance when visually examined. The surface of class M1, M2, and M3 cylindrical weights from 1 g to 50 kg shall be smooth and shall not appear to be porous when visually examined.

D.7 Adjustment

Type I weights, including classes E1 and E2, shall be adjusted by abrasion, grinding, or any appropriate method. The surface requirements shall be met at the end of the process.

Type II weights, including classes F1, F2, M1, M2, and M3 solid weights, shall be adjusted by abrasion, grinding, or any appropriate method that does not alter the surface. Weights with adjusting cavities shall be adjusted with the same material from which they are made, with tin, or with tungsten. For weights that have sealing caps, the cap may be made of aluminum.

Table D1—Maximum and Minimum Limits for Density (ρmin, ρmax) in g/cm3

Nominal Value     Class E1         Classes E2, F1    Class F2     Classes M1, M2, M3
> 5 kg            7.934–8.067      7.81–8.1          7.0–8.1      > 4.4
5 kg–50 mg        7.934–8.067      7.7–8.1           7.7–8.1      > 2.2
< 50 mg           > 2.2            > 2.2             > 2.2        > 2.2

Table D2—Application of Standard Mass Weights

Application                                                                            Type       Class
Primary Reference Standards                                                            I          E1
Reference Standards for Calibrating E2 Weights                                         I          E1
Reference Standards for Calibrating F1 Weights                                         I          E2
Standards Used for Calibrating F2 Weights                                              II         F1
Standards Used for Calibrating M1 Weights                                              II         F2
Standards Used for Calibrating M2 Weights                                              II         M1
Standards Used for Calibrating M3 Weights                                              II         M2
Calibrating Balances of Accuracy Class I                                               II         E2
Calibrating Balances of Accuracy Class II                                              II         F1
Laboratory Weights for Routine Analytical Work; Calibrating Balances of Accuracy
  Class III, IIIL, and IIII; Dial Scales, Trip Balances, and Platform Scales           II         M1, M2, M3
Built-In Weights for High Quality Analytical Balances                                  I or II    E2
Student Laboratory Use                                                                 II         M2



APPENDIX E—NIST CERTIFICATES OF CALIBRATION FOR FIELD STANDARD TEST MEASURES

(EXAMPLES)


U.S. DEPARTMENT OF COMMERCE

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
Gaithersburg, Maryland 20899

REPORT OF CALIBRATION

OF A ONE HUNDRED (100) GALLON VESSEL (Graduated Neck Type)

Mfg.: Exact Fabricators          NIST Seal No.: 2345
Canton, OH.                      Material: Stainless Steel
                                 Makers No.: 8865

Submitted by:

ABC Pipeline Company
Mytown, TX.

(Purchase Order Number XYZ 222-345678 dated August 22, 1997)

Assumed Volumetric Coefficient of Expansion, 0.0000265 per degree Fahrenheit.

The internal volume of the vessel described above has been determined by the volumetric test method using a NIST Standard Test Measure. This method utilizes a volumetric test measure that has a volume that has been compared against NIST reference mass standards. With the vessel in a standing position and the reference attitude established by leveling the attached levels, and drained for 30 seconds after cessation of the main flow, the results for the volume of water delivered are as follows, using the units in parentheses.

*The scale reading is determined by the intersection of the horizontal plane tangent to the bottom of the meniscus with the scale in the gauge tube. A scale division, between -125 and +125, is equivalent to 5 in.³.

The NIST Standard Volumetric Test Measure used above has an expanded uncertainty of ±0.02% according to Reference [2]. This reference gives the definitions of terms used below and details on this method for producing expanded uncertainty assessments. The expanded uncertainty quoted above is obtained by computing the standard deviation of the mean for two

[2] Taylor, B.N., and Kuyatt, C.E., "Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results," NIST Technical Note 1297, National Institute of Standards and Technology, January 1993.

NIST Test No.: 444/123456-97-8
Calibration Date: August 29, 1997                              Page 1 of 2

Scale Reading*     Volume Delivered (U.S. gal at 60°F)     Volume Delivered (in.³ at 60°F)
0                  99.9967                                 23099.24


REPORT OF CALIBRATION
ABC Pipeline Company, Volume Standards

independent measurements; this is defined as Type A Evaluation of Standard Uncertainty, uA, as a percentage of the mean value. This Type A Standard Uncertainty is ±0.004%; it is composed of the imprecision of all the components of the NIST measurement process; it includes gauge scale readings, interior wall drainage characteristics, etc. The Type B Evaluation of Standard Uncertainty, uB, for the volume transfer standard is estimated to be ±0.01%; this is based on volume measurement experience. These Type A and B Standard Uncertainties are combined using the root-sum-of-squares (RSS) method to produce the Combined Standard Uncertainty, uc. This Combined Standard Uncertainty is multiplied by a coverage factor, k, that is taken to be 2 to give the expanded uncertainty, U. In equation form, this procedure is:

U = k · uc = k (uA² + uB²)^1/2

The test results shown above are the arithmetic means of two independent observations. The stated volume is corrected to a reference temperature of 15.56°C (60°F). The reported volume has an estimated uncertainty that is computed by first taking the standard deviation of the mean of the two measurement results; this is referred to as the Type A Evaluation of the Standard Uncertainty of the test measure. This Type A Standard Uncertainty is squared and added to the square of the Type B Standard Uncertainty, which is taken to be that for the NIST capability, ±0.015. The square root of this sum gives the Combined Standard Uncertainty for the test measure. Multiplying this Combined Standard Uncertainty by a coverage factor of 2 produces the expanded uncertainty. The expanded uncertainty for this test measure is ±0.045.
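As an arithmetic check on the procedure described above, the short Python sketch below combines the Type A and Type B standard uncertainties quoted for the NIST volumetric transfer standard (±0.004% and ±0.01%) by root-sum-of-squares and applies a coverage factor of 2; the function name is illustrative only.

```python
import math

def expanded_uncertainty(u_a, u_b, k=2.0):
    """Combine Type A and Type B standard uncertainties by RSS and
    apply the coverage factor k to obtain the expanded uncertainty."""
    u_c = math.sqrt(u_a**2 + u_b**2)   # combined standard uncertainty
    return k * u_c

u_a = 0.004   # Type A standard uncertainty, percent
u_b = 0.010   # Type B standard uncertainty, percent
print(round(expanded_uncertainty(u_a, u_b), 3))   # 0.022, i.e., about +/-0.02%
```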

For the Director,
National Institute of Standards and Technology

Dr. George E. Mattingly
Leader, Fluid Flow Group
Process Measurements Division
Chemical Science and Technology Laboratory

NIST Seal No.: 2345
NIST Test No.: 444/123456-97-8
Calibration Date: August 29, 1997                              Page 2 of 2



U.S. DEPARTMENT OF COMMERCE

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
Gaithersburg, Maryland 20899

REPORT OF CALIBRATION

OF A ONE HUNDRED (100) GALLON VESSEL (Graduated Neck Type)

Mfg.: Exact Fabricators          NIST Seal No.: 2345
Canton, OH.                      Material: Stainless Steel
                                 Makers No.: 8865

Submitted by:

ABC Pipeline Company
Mytown, TX.

(Purchase Order Number XYZ 123 dated August 22, 1997)

Assumed Volumetric Coefficient of Expansion, 0.0000265 per degree Fahrenheit.

The internal volume of the vessel described above has been determined by the gravimetric test method. The gravimetric method requires weighing the empty vessel and re-weighing when filled with a fluid of known density. The fluid used was water. With the vessel in a standing position, the reference attitude was established by leveling the attached levels. The internal volume determined in this way is termed the contained volume, and the value is given below using the units requested in parentheses:

Per request, the delivered volume of this vessel was determined by the following procedure. To establish the delivered volume, the contained volume is drained through the bottom valve. When the main flow of liquid ceases, a 30-second drain time is observed before closing the valve. Subsequent re-weighing completes the gravimetric procedure and enables determination of the delivered volume. The results for the delivered volume of water are as follows, using the units requested in parentheses:
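As a rough illustration of the gravimetric principle described above, and not a procedure taken from this report, the Python sketch below converts the net mass of water to a volume using the water density at the test temperature and then corrects the vessel shell to the 60°F reference temperature with the assumed volumetric coefficient of expansion; the names, units, and the simple linear shell correction are assumptions for illustration, and buoyancy corrections to the weighings are omitted.

```python
def gravimetric_volume_at_60F(mass_full_lb, mass_empty_lb,
                              water_density_lb_per_gal,
                              vessel_temp_F,
                              alpha_per_F=0.0000265):
    """Illustrative gravimetric volume determination.

    The net mass of water divided by the water density at the test
    temperature gives the volume contained at that temperature; the
    vessel shell is then corrected to 60 F with a linear volumetric
    expansion term (an approximation valid for small temperature
    differences).
    """
    net_mass_lb = mass_full_lb - mass_empty_lb
    volume_at_t_gal = net_mass_lb / water_density_lb_per_gal
    # Correct the steel shell from the test temperature back to 60 F.
    return volume_at_t_gal * (1.0 - alpha_per_F * (vessel_temp_F - 60.0))


# Illustrative numbers only: 832 lb of water at about 8.32 lb/gal with the
# vessel at 70 F gives roughly 99.97 gal referred to 60 F.
print(round(gravimetric_volume_at_60F(1832.0, 1000.0, 8.32, 70.0), 4))
```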

NIST Test No.: 123/12345
Calibration Date: August 29, 1997                              Page 1 of 2

Scale Reading*     Volume Contained (U.S. gal at 60°F)     Volume Contained (in.³ at 60°F)
0                  99.9800                                 23095.4


REPORT OF CALIBRATION
ABC Pipeline Company, Volume Standards

*The scale reading is determined by the intersection of the horizontal plane tangent to the bottom of the meniscus with the scale in the gauge tube. A scale division, between -200 and +200, is equivalent to 2 in.³.

The expanded uncertainty in the measured volume is ±0.003%. It was calculated according to Reference [2] with a coverage factor of 2 and is traceable to NIST reference mass standards.

For the Director,
National Institute of Standards and Technology

Dr. George E. Mattingly
Leader, Fluid Flow Group
Process Measurements Division
Chemical Science and Technology Laboratory

[2] Taylor, B.N., and Kuyatt, C.E., "Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results," NIST Technical Note 1297, National Institute of Standards and Technology, January 1993.

NIST Seal No.: 2345
NIST Test No.: 234/12345
Calibration Date: August 29, 1997                              Page 2 of 2

Scale Reading*     Volume Delivered (U.S. gal at 60°F)     Volume Delivered (in.³ at 60°F)
0                  99.9575                                 23090.2


APPENDIX F—WATER DENSITY EQUATIONS

In 1981, a working group of the API Committee on Static Petroleum Measurement was set up to review standards and advise on the water density equation to be used by API. They recommended the use of the internationally accepted water density versus temperature equation of Wagenbreth and Blanke.

The relationship is shown below:

ρ(Tw) = a + b·Tw − c·Tw² + d·Tw³ − e·Tw⁴ + f·Tw⁵   kg/m³

Note: Wagenbreth, H., and Blanke, H., "The Density of Water in the International System of Units and in the International Practical Temperature Scale of 1968," Mitteilungen der Physikalisch-Technischen Bundesanstalt (PTB-Mitt.), pp. 412–415, June 1971.

where (in compatible units)

ρ(Tw) = True water density in kg/m³,

Tw = Water temperature, °C,

a = 999.8395639 kg·m⁻³,

b = 0.06798299989 kg·m⁻³·°C⁻¹,

c = 9.106025564 × 10⁻³ kg·m⁻³·°C⁻²,

d = 10.05272999 × 10⁻⁵ kg·m⁻³·°C⁻³,

e = 112.6713526 × 10⁻⁸ kg·m⁻³·°C⁻⁴,

f = 659.1795606 × 10⁻¹¹ kg·m⁻³·°C⁻⁵.

NIST used this standard until recently; however, their water density calculations have now been converted to the equation developed by G.S. Kell. This relationship is shown below:

ρ(Tw) = (a + b·Tw + c·Tw² + d·Tw³ + e·Tw⁴ + f·Tw⁵) / (1 + g·Tw)   kg/m³

Note: Kell, G.S., "Density, Thermal Expansion, and Compressibility of Liquid Water from 0° to 150°C: Correlations and Tables for Atmospheric Pressure and Saturation Reviewed and Expressed in 1968 Temperature Scale," Journal of Chemical and Engineering Data, Volume 20, pp. 97–105, 1975.

where (in compatible units)

ρ(Tw) = True water density in kg/m³,

Tw = Water temperature, °C,

a = 999.83952 kg·m⁻³,

b = 16.945176 kg·m⁻³·°C⁻¹,

c = −7.9870401 × 10⁻³ kg·m⁻³·°C⁻²,

d = −46.170461 × 10⁻⁶ kg·m⁻³·°C⁻³,

e = 105.56302 × 10⁻⁹ kg·m⁻³·°C⁻⁴,

f = −280.54253 × 10⁻¹² kg·m⁻³·°C⁻⁵,

g = 16.879850 × 10⁻³ °C⁻¹.

Evaluation of these two equations showed that the calculated water densities could differ by two parts in a million; e.g., 999.012 versus 999.014 kg/m³, respectively, for the density of water at 60°F. However, the volumetric correction factors (density ratios) developed by either equation are essentially constant and unaffected by the mathematical treatments under consideration.
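Both equations can be evaluated directly. The Python sketch below (function names are illustrative) implements the two polynomials with the coefficients listed in this appendix and reproduces the roughly two-parts-per-million difference quoted above for water at 60°F (15.56°C).

```python
def rho_wagenbreth(t_c):
    """Wagenbreth and Blanke water density, kg/m3, for t_c in deg C."""
    return (999.8395639
            + 0.06798299989 * t_c
            - 9.106025564e-3 * t_c**2
            + 10.05272999e-5 * t_c**3
            - 112.6713526e-8 * t_c**4
            + 659.1795606e-11 * t_c**5)


def rho_kell(t_c):
    """Kell water density, kg/m3, for t_c in deg C."""
    num = (999.83952
           + 16.945176 * t_c
           - 7.9870401e-3 * t_c**2
           - 46.170461e-6 * t_c**3
           + 105.56302e-9 * t_c**4
           - 280.54253e-12 * t_c**5)
    return num / (1.0 + 16.879850e-3 * t_c)


t60 = (60.0 - 32.0) / 1.8              # 60 deg F expressed in deg C
print(round(rho_wagenbreth(t60), 3))   # ~999.012 kg/m3
print(round(rho_kell(t60), 3))         # ~999.014 kg/m3
```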

The Wagenbreth and Blanke equation is used to generate water densities at various temperatures. The ratio of the water density at the test measure temperature, ρ(TM), to the water density at the prover temperature, ρ(TP), is equivalent to the correction factor CTDW:

CTDW = ρ(TM) / ρ(TP)

where

ρ = the water density, kg/m³,

T = temperature, °C [(temperature in °F − 32)/1.8],

TM = the test measure temperature,

TP = the prover temperature.

The limits in prover temperature are 35 to 105°F and in test measure temperature 32.1 to 105°F. This is the temperature range used for the Wagenbreth equation, and therefore any extrapolation outside this temperature range is not recommended. The Kell equation, however, can be used for calculating CTDW values when the temperature is outside the range recommended for the Wagenbreth equation.
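A minimal sketch of the CTDW calculation described above is given below; the function names and the range checks are illustrative only. It evaluates the Wagenbreth and Blanke polynomial at the test measure and prover temperatures (entered in °F and converted to °C) and rejects temperatures outside the limits stated in the preceding paragraph.

```python
def rho_wagenbreth_F(t_f):
    """Wagenbreth and Blanke water density, kg/m3, for a temperature in deg F."""
    t = (t_f - 32.0) / 1.8
    return (999.8395639 + 0.06798299989 * t - 9.106025564e-3 * t**2
            + 10.05272999e-5 * t**3 - 112.6713526e-8 * t**4
            + 659.1795606e-11 * t**5)


def ctdw(test_measure_temp_F, prover_temp_F):
    """CTDW = rho(TM) / rho(TP), restricted to the stated temperature limits."""
    if not 32.1 <= test_measure_temp_F <= 105.0:
        raise ValueError("test measure temperature outside 32.1 to 105 deg F")
    if not 35.0 <= prover_temp_F <= 105.0:
        raise ValueError("prover temperature outside 35 to 105 deg F")
    return rho_wagenbreth_F(test_measure_temp_F) / rho_wagenbreth_F(prover_temp_F)


# Example: test measure at 70 deg F, prover at 75 deg F.
print(f"{ctdw(70.0, 75.0):.6f}")
```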

Using the Wagenbreth equation yields a maximum uncertainty in any CTDW value of ±0.000007. As discussed previously, use of the Wagenbreth equation versus the Kell equation yields essentially no difference in CTDW values. That is, CTDW values from the two equations will differ by less than 0.0000005 within the temperature range of the standard. On rounding to six decimal places, fewer than approximately 9% of the equivalent values could differ by ±0.000001. For this reason, it is not recommended that the Kell equation be used to duplicate the Wagenbreth standard.

Operators should be aware, however, that all test measures calibrated by NIST will have the volumes calculated using the density of water derived from the Kell equation.



12/98—3C


Additional copies available from API Publications and Distribution: (202) 682-8375

Information about API Publications, Programs and Services is available on the World Wide Web at: http://www.api.org

Order No. H04072
