
The American Association for Laboratory Accreditation

© 2011 by A2LA

All rights reserved. No part of this document may be reproduced in any form or by any means without the prior written permission of A2LA.


A2LA Electro-Mechanical Advisory Committee (EMAC) Meeting

The Sheraton Columbia Hotel, Columbia, MD

MEETING MINUTES

Saturday, April 2, 2011

8:00 am – 5:00 pm

1 Introductions.

At 8:00 am, A. Gouker, EMAC Recording Secretary, began the 2011 EMAC meeting. A moment of silence was observed for Kurt Fisher, who recently passed away. Robert's Rules of Order were set as the basis for discussions. Introductions of those present were made.

2 Review and approval of agenda.

Motion by A. McCall to approve agenda. Motion properly seconded. Vote by voice – all votes affirmative, none dissenting. Agenda was approved.

3 Membership.

In accordance with EMAC bylaws, anyone present may be voted to become an EMAC member. D. Zakharia, B. Quinlan, C. Maytrott, Y. Tanuma, and J. Mason were nominated to be added to the EMAC membership list. Motion by W. Elliott to approve the addition of the persons named above. Motion seconded by A. McCall. D. Kramer asked the Recording Secretary and Committee Chair to ensure that this action is in line with the recent draft revisions to the EMAC bylaws (and that those revisions are in line with other TAC bylaws). A. Gouker confirmed that this action was still in compliance with the EMAC bylaws. No other discussion. Vote by voice – all votes affirmative, none dissenting.

Action Item #1 – The EMAC membership list will be updated by A. Gouker. A. Gouker will also contact those individuals who have missed two consecutive meetings to ask if they wish to remain members of the EMAC. (Due by June 30, 2011)


4 Approval of Last Meeting minutes. (ATT. 1)

A brief review of the 2010 EMAC meeting minutes was conducted, and the opportunity for discussion was offered. No points of discussion were raised. Motion by A. McCall to approve the 2010 meeting minutes. Motion was properly seconded. Vote by voice – all votes affirmative, none dissenting. The 2010 EMAC meeting minutes were approved with no changes.

5 Review of Action Items from 2010 EMAC Meeting (ATT. 2)

Action items from the 2010 meeting were reviewed:

1 – Closed as indicated in ATT. 2.

2 – Closed as indicated in ATT. 2.

3 – Discussion on this subject: A2LA stated that it treats laboratories as customers, but encourages laboratories to fill out the C101 checklist in Word for their applications. A2LA does not want to restrict/require CABs to use only MS Word for their submittals. See ATT. 2.

4 – Discussion on this subject: FCC/NIST personnel were invited to the 2011 Forum meeting; however, the APEC-TEL meetings were scheduled for the same time period and the persons invited were attending those meetings instead. FCC and NIST will be invited again for the 2012 Forum. See ATT. 2.

5 – Work on this action item was still in process: the EMAC bylaws were revised by A. Gouker with the assistance of M. Buzard. Those draft revisions were discussed under agenda item 7 below. See ATT. 2.

6 – Discussion on this subject: The committee briefly discussed a reiteration of the 2010 decision asking A2LA staff to notify all EMC testing laboratories about the removal of the requirement to have a documented PT plan. A2LA management decided that a bulk information mailing to all EMC testing laboratories would raise unnecessary questions and indicated that A2LA staff would handle such questions from its CABs on a case-by-case basis. See ATT. 2.

A question was raised regarding accredited PT providers versus regular commercial (unaccredited) PT providers – there are currently two (2) commercially available PT programs (C63, ACIL) which relate to EMC testing. This point was further questioned: what is meant by "commercial" in the A2LA PT Policy, and would ACIL participation in its own PT program invalidate that program from being considered by A2LA? A2LA's PT Policy states "available and relevant" – does A2LA require a PT provider to be accredited, or simply "recognized" by A2LA? A discussion of the reliability of a program versus its accreditation, as it relates to A2LA recognizing a PT program for EMC laboratories, took place. R. Miller clarified that A2LA may at some point add a requirement for EMC labs to use accredited PT programs. A point was made that the EMAC may suggest to the Criteria Council that other programs be added to the Annex to the A2LA PT Policy. A suggestion was made to update the PT Policy Annex to clearly state that if a PT program is not accredited, it is not necessary for a lab to participate in it to meet PT Policy requirements (to clarify the A2LA standpoint described above). Multiple members questioned whether PT for electrical labs is even "worthwhile."

Motion by L. Gradin for the Criteria Council (CC) to review ISO 17025, clause 5.9, and state that PT is a method that CAN be used, rather than MUST be used if a PT program exists, for determining CAB compliance with this clause. (Motion defeated during discussion.)

A recommendation was made to remove the term "PT" and replace it with "QC" when discussing clause 5.9 with EMC and Electrical CABs. Motion by A. McCall to leave this action item open and discuss it later in the agenda. Motion was properly seconded. See agenda item 18 below.

7 – Discussion on this subject: A reminder was given that if a CAB does not have PT available (typically, only AEMCLRP is available for Electrical labs), the assessor is still required to submit a PT Matrix in their deliverables, but not a C106 PT Policy checklist.

8 – Discussion on this subject: B. Nadeau is expecting discussion with ANSI in the coming months regarding bore sighting an antenna. An outside (ANSI C63) committee meeting is scheduled for the first week of May 2011. W. Schaefer is on the ANSI workgroup for this question but could not comment on the group's discussions at this point in time. Action Item #2 – B. Nadeau to report to the EMAC the results of the ANSI committee decision. (Due July 30, 2011)

9 – Closed as indicated in ATT. 2.

10 – Closed as indicated in ATT. 2.

11 – Closed as indicated in ATT. 2.

12 – Discussion on this subject: A draft update of the A2LA Traceability Policy, which incorporates definitions of calibration versus verification, has been passed through the CC multiple times since 2010 without final resolution to date. Action Item #3 – CC to finalize the update of the A2LA Traceability Policy to include the definition of calibration versus verification. A. Gouker to report to all interested parties the results of the CC vote and any updates to the Traceability Policy. (Due June 30, 2011)

13 – Closed as indicated in ATT. 2.

14 – Discussion on this subject: Laboratory staff must be able to reproduce certain NSA results during an assessment. A reminder was made that the assessment is only a spot check, NOT the full set of NSA measurements required by the FCC. A ±4 dB deviation from the theoretical value is the basic requirement for meeting NSA spot checks if a subcontractor performed the initial NSA measurement and the staff is witnessed performing the spot checks during the assessment (see the sketch after this list). Motion by L. Gradin to have A2LA staff forward the FCC checklist to FCC testing labs ahead of time so that the lab is aware of this guidance. R. Miller and A. Gouker clarified that this information is currently in the A2LA "Acknowledge Application" letter, which is sent to laboratories upon an assessor being assigned for the assessment. No further action was taken on the above motion.

Side action item from EMC assessor training: It was clarified by the ANSI C63 standards organization that horn antennas are required for radiated emissions testing above 1 GHz – hybrid or other antenna types are not permitted. Table 1 in clause 4.5 of ANSI C63.4:2009 details the list of allowable antennas. This action item was closed.
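For reference, a minimal sketch of the spot-check arithmetic, assuming the conventional normalized site attenuation formulation (the symbols below – the direct and site voltage readings and the transmit/receive antenna factors – follow common usage in ANSI C63.4 and are not quoted from these minutes):

\[ \mathrm{NSA}_{\mathrm{meas}} = V_{\mathrm{DIRECT}} - V_{\mathrm{SITE}} - AF_{T} - AF_{R} \quad \text{(all quantities in dB)} \]
\[ \left| \mathrm{NSA}_{\mathrm{meas}} - \mathrm{NSA}_{\mathrm{theoretical}} \right| \le 4\ \mathrm{dB} \]

As an illustration, if the theoretical NSA at a given frequency and geometry were 15.0 dB and the witnessed spot check yielded 17.2 dB, the 2.2 dB deviation would fall within the ±4 dB criterion discussed in item 14 above.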


6 New/Emerging areas of interest in the laboratory accreditation community, or within the Electrical/EMC community? (standing agenda item)

Smart Grid Interoperability – Labs and CBs will have to meet ISO 17025 and Guide 65 accreditation requirements, respectively. An ITCA (Interoperability Testing and Certification Authority) will be set up in the future and will impose accreditation requirements as part of its certification programs. The electric utility industry will drive this program to ensure that smart grid products are interoperable. No other new items were discussed.

7 Review EMAC bylaws and potential revisions. (ATT. 3)

Motion by G. Gogates to add "Protocol Testing" and "Interoperability" to the scope in the EMAC bylaws, to be added after "Wireless". Motion was properly seconded. Vote by voice – all votes affirmative, none dissenting. A. Gouker added these two items to the draft of the bylaws.

A. McCall pointed out a difference in language in section 2.3g regarding the use of "national or international" when describing organizations. A. Gouker and M. Buzard acknowledged overlooking this section during their revision process, and A. Gouker changed section 2.3g to use wording conforming to the rest of the EMAC Bylaws document. R. Miller clarified that the terminology "Certification Bodies" in section 2.1 means those CBs related to activities within the EMAC scope.

Motion by G. Gogates to approve the EMAC Bylaws draft revisions and send them to the CC. Motion seconded by A. McCall. Vote by voice – all votes affirmative, none dissenting.

Action Item #4 – A. Gouker will finalize all revisions to the EMAC bylaws and forward the revised Bylaws document to the CC (by April 30, 2011).

8 Review P103c – MU annex for Electro-Mechanical Labs – per CC request at November ’10 teleconference. Confirm MU Categories are correct. (ATT. 4)

The P103c document was reviewed in 2010, with some methods (ETSI, RSS, BETS) having their measurement uncertainty categories updated.

- Recommendation made to add "Protocol" and "Interoperability" testing as type I/II MU testing to the list. As protocol testing is pass/fail, this appears to be a type I MU at best.

Recent revisions to the IEC 61000 series that place MU in an annex to those methods were discussed, but that MU relates to the calibration of devices used for those methods – P103c will not be changed for these tests.

B. Nadeau led discussion on needed clarifications to some methods currently listed in P103c:

- CISPR 22 – "Ed 5" should be changed to "Ed 5.2 or later", and should remain as category V measurement uncertainty.
- CISPR 11 – Radiated and conducted emissions testing for CISPR 11 should be changed to category V measurement uncertainty for "Ed 4.2 and above".
- A request was made to combine the two types of emissions into a single "Emissions" category in order to simplify the P103c document.

Motion by D. Kramer to capture the requested changes in these minutes and clarify them at a later time, and to have the Recording Secretary send a revised draft of P103c for EMAC voting prior to sending the revision to the CC for approval. Motion seconded by W. Schaefer. Vote by voice – all votes affirmative, none dissenting.

Action Item #5 – A. Gouker to update P103c with the above listed clarifications and changes, and forward it to EMAC members for voting. Pending approval of the EMAC members, A. Gouker will forward the revised P103c to the CC for approval (by June 30, 2011).

9 Discussion of EMAC Consensus Document, developed by the EMAC Working Group. The EMAC Recording Secretary is to provide this document for the Criteria Council's approval following the meeting. (ATT. 5)

All discussion items were reviewed individually and voted on before being sent forward. For those decision items not listed in the minutes below, no discussion was undertaken, and those items are considered as having unanimous support from the EMAC membership in attendance for inclusion in the Consensus Document.

Item I.1 – "Labs can perform pre-compliance testing, as long as it is not represented as compliance testing, and does not violate their own management system."

A recommendation was made to clarify pre-scan/pre-compliance testing in regard to the A2LA Advertising Policy and accredited testing requirements. Suggestions were made to further clarify what "pre-compliance" means, in order to clarify how these other types of tests relate to actual compliance testing. The EMAC wants to avoid considering these tests as compliance testing. The issue is laboratories determining EUT compliance to a regulatory standard based on non-accredited test procedures. Further discussion on this point is addressed in agenda item 11 below. A suggestion was made to note when any consensus item is superseded by another policy. A second suggestion was made to move superseded consensus items to a separate section at the end of the consensus document when those decisions are no longer applicable, in order to keep the document tidy.

Item I.2 – "Regarding ESD testing, a lab does not need to perform a full calibration, but they need to have a checking function of some manner in place to verify the operation of the system."

Some members disagreed with this consensus item as a whole. Others disagreed with the wording of the decision. After discussion, it was determined that the intent was to have the decision written as: "Regarding ESD testing, a lab needs to have a checking function of some manner in place to verify the operation of the system." Motion by G. Gogates to revise consensus item I.2 to reflect the above rewording. (Motion defeated during discussion.) Continued disagreement with the rewording was discussed, noting that responsibility for QC checks rests with the testing laboratory, and that a laboratory could set an interval as long as its calibration period, which could remove the value of the QC checks. Final EMAC decision – The Committee agreed that this topic appeared to have been taken out of context from the 2002 EMAC minutes. All were in agreement that it should be removed from the current list of consensus decisions.

Item I.3 – "If support/auxiliary equipment is provided by the client to the test laboratory, the laboratory is NOT responsible for the verification and/or the calibration of this support equipment."

L. Gradin wanted to add the requirement that the laboratory report clearly indicate those items used in testing which were brought into the lab by the customer. The requested wording to be added was: "Customer supplied operating equipment which the testing laboratory is not responsible for shall be listed in the test report." Discussion over the use of SHOULD versus SHALL for reporting the equipment took place. Final EMAC decision – Use SHOULD rather than SHALL for the test report inclusion guidance point, and add the suggested wording (using SHOULD) to this consensus item.

Item I.4 – "For a test chamber with a limited search height, requirements would not be met to allow for accreditation without a limitation being noted on the scope of accreditation."

Clarification was given that NSA does not require T9 assessment. No changes were made to the wording of this consensus document item.

Item I.8 – Power Meters and Bandwidths – When a laboratory is being assessed for RF power measurements, the lab needs to know the bandwidth of the power head and power meter display unit as a system. Depending on the application, the bandwidth of the power measuring system needs to be calibrated (traceable). Final EMAC decision – Change "power head" to "power sensor."

Item II.1.a – No deficiencies are to be referenced directly against the FCC checklist. Deficiencies are to be referenced to a specific requirement in the test method (i.e., a C63.4 requirement, not "question X" on the FCC checklist). The new FCC checklist now has a cross-reference to applicable standard sections; this should help assessors with these citations. No changes were made to the wording of this consensus document item.

Item III.1.b – Assessors will not cite a deficiency against ANSI C63.4-2009 clause 8.3.2.2 in reference to "keeping the EUT in the cone of radiation" if the antenna is not bore-sighted during the measurement.

An FCC KDB interpretation on the use of planar scans (such as those specified in CISPR 22) for maximizing measured radiated emissions (KDB number 746324) was presented, which indicates that the FCC wants antennas to be pointed at EUTs when performing Part 15 RE testing. The full KDB wording was read to the committee members for further discussion. A planar height scan is not acceptable to the FCC for radiated emissions testing above 1 GHz per the KDB. However, it was pointed out that a KDB is not a normative requirement. Discussion took place, resulting in the decision below. Final EMAC decision – Assessors (still) cannot write a deficiency against the KDB interpretation requiring bore sighting the antenna, unless the KDB is on the lab's scope, or until/unless ANSI accepts this interpretation for C63.4 radiated emissions testing.

Item III.2.a – Laboratories must provide objective evidence that their LISN meets the requirements of CISPR 16-1-1 for isolation, and the requirements of CISPR 16-1-4 for voltage drops. Data sheets are acceptable for this purpose.

A question was raised by a member from a laboratory as to whether assessors are verifying LISN impedance calibration with and without extension cords during an assessment. Assessors in attendance answered yes. Most EMC laboratories are not giving clear calibration requests to their calibration providers. A. Gouker stated that an ANSI C63 working group is developing a guidance document for EMC laboratories on requesting proper calibrations. ANSI C63.4 allows extension cords; CISPR does not – this disparity causes confusion in testing labs.

Motion by L. Gradin to form a work group to create a guidance document for EMC labs and calibration providers. Motion tabled until later in the agenda. W. Elliott indicated that he will attend the C63 meeting in May, will determine how sufficient the C63 guidance document is for A2LA purposes, and will offer feedback to the EMAC regarding his findings.

It was pointed out that "CISPR 16-1-1" should be re-written as "CISPR 16-1-2, clause 4.7.1, table 6" for LISN isolation. Since CISPR 22 references CISPR 16-1-2 as a normative document, isolation is required. The reference to "CISPR 16-1-4" should be changed to "CISPR 16-1-2, clause 8."

Another question was asked from a laboratory as to whether assessors are checking for phase calibration on LISNs – assessors in attendance answered yes, if the lab has the proper version of CISPR 22 on its scope.

Motion by G. Gogates – Add an EMAC consensus decision to use a blanket 5% tolerance allowance for any standards which do not have a listed tolerance. It was suggested to clarify this standpoint to apply only to dimensional measurements, as the original decision was only intended to address measuring antenna-to-EUT distance for radiated emissions testing. Motion tabled until agenda item 23 for further discussion on the consensus document.

Motion by G. Gogates to adopt the EMAC consensus document. Motion seconded by K. Williams. Vote by voice – all votes affirmative, none dissenting.

Action Item #6 – A. Gouker to incorporate the above changes into the EMAC consensus document and present the revised document to the CC for approval (by June 30, 2011).

10 Does the EMAC see issues with listing CISPR 16-1-4 on testing laboratory scopes of accreditation? Is this a correct practice for A2LA to continue?

The CISPR 16-2-x series contains the test methods and should be what is listed on the scope for testing laboratories. The EMAC was in general agreement that CISPR 16-1-4 could be on a testing lab's scope if the lab is accredited for performing site validation measurements. An example scope provided to the EMAC used "Generic EMC Measurements" for 16-1-4, which the committee agreed was incorrect (unless the testing lab is performing site validation measurements). If a lab wants to have CISPR 16-1-4 on its scope of accreditation, the listing should have wording similar to "In Support of NSA Measurement (or other activity) at Client Sites" – in which case the lab must be assessed against A2LA's field testing requirements (R104).

Action Item #7 – A. Gouker to contact accredited testing laboratories with CISPR 16-1-4 on their scopes of accreditation in order to clarify the use of CISPR 16-1-4, and make modifications to the scopes as necessary (by May 31, 2011).

11 Reporting pre-scan data per C63.4… can these measurements be reported in an endorsed “accredited” test report if the chamber did not meet NSA requirements? Should all pre-scan measurements be classified as “un-accredited” if they appear on an endorsed report?


Pursuant to A2LA's Advertising Policy, if testing is performed in a chamber not meeting NSA requirements, these activities must at least be flagged as non-accredited testing in the laboratory's report. These results may be included in an endorsed report so long as they are identified as non-accredited test results. Pre-scans in a non-compliant chamber, followed by a final check against only the suspect frequencies identified in the pre-scan, do not constitute a valid, accredited test procedure. The full scan must be done to be considered a valid, accredited report for ANSI C63.4. Pre-scan data still must be declared as having been performed in a non-compliant chamber. ANSI C63.4, sec. 8.3.2.1, requires the full frequency span to be checked if the EUT is moved ("relocated to a final test site").

Motion by D. Kramer – Add to the consensus document: "ANSI C63.4 pre-scan data, if taken in a chamber which does not meet NSA requirements, must be clearly noted as not complying with C63.4 if used in an endorsed test report (per A2LA Advertising Policy). ANSI C63.4, sec 8.3.2.1, does allow pre-scan testing to be performed, but if the EUT is relocated to a final testing site (which meets NSA compliance requirements), the full frequency span must be re-checked." Motion seconded by N. Belsher. Vote by voice – all votes affirmative, none dissenting.

12 Review/Re-clarification of the “spot check” requirements for on-site assessments. (ATT. 6)

A. Gouker clarified that A2LA is currently including wording in its formal letters during the assessor assignment process (to both testing laboratories and assessors going into laboratories testing to FCC rule parts 15 and 18) stating that subcontractors are not allowed to perform the NSA spot checks during the assessment. The laboratory's technicians are expected to reproduce the results during the on-site assessment.

Action Item #8 – R. Miller to send out notification to all FCC testing labs about the subcontractor requirement for NSA spot checks (rather than waiting for application packages to send out this notification) (by May 31, 2011).

A question was raised for clarification – what does the FCC mean by "reproduce" the data during the NSA spot checks? One laboratory was asked to completely verify the full NSA data set, not just a small subset. W. Elliott suggested changing "reproduce" to "demonstrate." The FCC has confirmed that spot checks are its major concern for ensuring competence in a testing laboratory, and assessors are not required to check NSA spot checks at multiple sites during a single assessment.

Action Item #9 – R. Miller to re-verify with the FCC whether NSA spot checks are necessary on multiple sites (when multiple test sites are available), or if only one NSA spot check is sufficient during an assessment (by May 31, 2011).

A reminder was given to assessors to clearly identify on the FCC checklist which test site was witnessed and spot checked to meet the NSA requirements. The next assessor should check a different site (if multiple sites exist for a testing laboratory).

13 Discuss the possibility of mandating frequency range capabilities on scopes of accreditation for radiated emissions testing. Possibly list chamber designations as well (i.e. like AEMCLAP scopes); or at least test distances and types of chambers available. This relates to the submitted question: “if a lab has 63.4 on scope and the assessor saw only a 3m chamber during the assessment, can lab add a 10m chamber and perform accredited tests without A2LA knowing?”

A question was raised related to ANSI C63.4 – What does the FCC want A2LA to know and qualify for those tests? AEMCLAP and VCCI require clarifications on frequency ranges, chamber identifications, and set-ups. The EMAC was in general agreement that A2LA should not issue this type of scope inclusion/clarification as a blanket requirement. A recommendation was made to update A2LA policy (specifically R101) so that laboratories report additions of new chambers or testing sites, as these would qualify as a change in capabilities for the performance of testing.

Action Item #10 – R. Miller to consult with A2LA management regarding proper insertion of this requirement into R101 or another A2LA policy document, if appropriate (by May 31, 2011).

14 Discuss the requirements, and A2LA interpretation, to list customer address on the test report as identified by clause 5.10.2.d.

A. Gouker provided background on the reasoning behind this agenda item being brought to the EMAC's attention. An assessor wrote a deficiency at a laboratory because its test reports included only the city and state of the customer, not the full street address; however, A2LA staff ruled the deficiency invalid and indicated that the lab did not need to put the street address on the test report. The assessor asked that this issue be discussed during the EMAC meeting and, if necessary, that an interpretation of clause 5.10.2.d be forwarded to the A2LA Criteria Council. (Internal customers were determined to be covered by the leeway in clause 5.10.3 for inclusion of addresses on test reports.)

Motion by D. Kramer that the full address of the testing site be included in test reports for external customers (unless the lab has valid reasons for not including the address, which must be evidenced in contract review records). Motion seconded by P. Boers. Vote by voice – all votes affirmative, none dissenting.

Action Item #11 – A. Gouker to write an A2LA "17025 Explanation" of clause 5.10.2.d and forward it to the Criteria Council for a vote (by June 2, 2011).

15 Discussion item – if product family standards are listed on a scope (i.e. EN 55024), is it expected that the lab be accredited for all referenced standards?

A2LA staff clarified the current A2LA policy that, if a laboratory wants to perform an old version of a test method as well as the most current version, both must be clearly identified on the scope. For EMAC discussion purposes, "EN 55024" was used as an example of a product family standard that frequently appears on laboratories' scopes of accreditation. An assessor pointed out that the current version of EN 55024 requires the use of an outdated ESD test method. So the question was raised: if EN 55024 is on the scope, does this imply that the lab is accredited to perform the outdated version of the ESD method, or does the outdated ESD method also have to appear on the scope of accreditation? If the subsections (such as EN 61000-4-2:1995, -4-3:1996, and -4-4:1995) are not included in the laboratory's capabilities, all non-included subsections must be clearly listed as exclusions on the laboratory's scope. Most assessors in attendance took the position that if EN 55024 is listed on the scope, this would imply that the lab is able to perform testing to the outdated version of the test method called out within the document. However, A2LA staff referred to the long-standing A2LA policy that if a lab is testing to an outdated version of a method, it must clearly identify the outdated version on its scope. Since the two parties could not reach an agreement on the issue, A2LA staff said they would be happy to take an action item to discuss the matter further with upper management and report back to the committee with guidance.

EMAC position to be discussed by A2LA staff and A2LA management – If a lab has a product specific/generic standard on the scope (e.g., "EN 55024"), the lab must be able to demonstrate competence at performing all associated methods under that product specific standard if the lab does not want to specifically list included/excluded sub-methods (e.g., "EN 61000-4-2:1995 is specified in EN 55024 while the latest edition is EN 61000-4-2:2008").

Action Item #12 – A. Gouker to discuss with A2LA management the requirement to list any outdated revisions on the scope of accreditation if the outdated revisions are called out in a current edition of a product family standard, and then report back to the EMAC with guidance (specifically referencing EN 55024 clause 2, Annex ZE) (by May 31, 2011).

16 Presentation - Mac Elliott - Task Group Lead for ANSI C63.26 STM 5 - Field Strength of Radiated Emissions – Presentation / Discussion on task group’s proposal for new standard.

Postponed until end of meeting, time permitting. (shifted to item 24 of agenda)

17 Discussion of KDB issues - discuss a mechanism for labs and manufacturers to request extensions/relief/etc…. Discuss ideas for improvement to the current KDB process for notifying affected parties.

Postponed until end of meeting, time permitting (combined with agenda item 16 above as part of new agenda item 24)

18 Review G105 document to unify QC expectations (per 10/12/10 email from T. McInturff) (ATT. 7)

L. Gradin offered to revise G105 to update it as needed. R. Miller stated that G105 will go before the CC upon completion of the update.

Action Item #13 – A. Gouker to provide L. Gradin a copy of the current G105 document on file in order for him to update it, and then forward it to the A2LA Criteria Council for a vote (by June 30, 2011).

19 Discussion of 5.5.4 requirements for software. (requested as topic of discussion from accreditation council for the EMAC).

Unique identification of software, as required by clause 5.5.4, was discussed. An assessor deficiency cited testing software which was not uniquely identified in the laboratory's equipment list/records. An AC member disputed the finding during the AC review of the assessment package.

Motion by L. Gradin to cite G. Gogates' guidance document ("Software Validation in Accredited Laboratories – A Practical Guide," Rev. 06/07/2010) in an A2LA interpretation of 5.5.4 for determining when unique identification of software is practical. Motion seconded by D. Kramer. Motion dropped in favor of creating an A2LA application of clause 5.5.4.

Action Item #14 – M. Buzard to write an A2LA "17025 Explanation" of clause 5.5.4, taking G. Gogates' guidance document into account. The 17025 Explanation is to be forwarded to the EMAC for approval prior to forwarding to the Criteria Council for a vote (by May 31, 2011).

20 Clarification on Advertising Policy clauses 13.1 and 13.2 (request from assessor for EMAC to review in order to ensure everyone is aware of the differences in these clauses) (ATT. 8)

A2LA staff made a few clarifications on section 13 of the A2LA Advertising Policy (P102). If a laboratory performs work falling under clause 13.2 of the Ad Policy, it should be able to demonstrate that the client specifically requested testing to be performed in a non-accredited manner.

21 Consensus decision – discuss site validation above 1GHz for testing labs using ANSI C63.4:2009 vs. CISPR 22 (using CISPR 16-1-4:2007 as the site validation method).

If a lab is claiming CISPR 22 testing capability up to 6 GHz, either limit ANSI C63.4:2009 to 6 GHz, or require the lab to meet the floor absorber requirements of ANSI C63.4:2009 (2.4 m x 2.4 m absorber) in order to allow testing up to 18 GHz. For measurements in the 18 GHz to 40 GHz frequency range, only sites showing conformance to CISPR 16-1-4:2007 are valid for testing.

KDB 704992 – C63.4-2009 provides two options for test facilities used to make radiated emission measurements above 1 GHz, and clarifies that the use of RF absorbers on top of the ground plane is permitted. Facilities suitable for measurements in the frequency range 30 MHz to 1000 MHz are considered suitable for the frequency range 1 GHz to 40 GHz with RF absorbing material covering the ground plane such that either:

a) the site validation criterion called out in CISPR 16-1-4:2007 is met; or
b) a minimum area of the ground plane, i.e., 2.4 m by 2.4 m (for a 3 m test distance), is covered between the antenna and the Equipment Under Test (EUT) using RF absorbing material with a minimum-rated attenuation of 20 dB (for normal incidence) up to 18 GHz.

C63.4-2003 does not have site validation requirements for test facilities used to make radiated emission measurements above 1 GHz. However, it does state that facilities that are suitable for measurements in the frequency range 30 MHz to 1000 MHz are considered suitable for the frequency range 1 GHz to 40 GHz, including the presence of the reference ground plane.

22 Consensus Decision – ESD Testing (IEC 61000-4-2) – discuss impact of barometric pressure on ESD tests… the need to record the pressure during the test… and the need to have the pressure recording device calibrated to meet A2LA's traceability policy.

An assessor pointed out that IEEE Standard 4 includes a formula for determining the effect of temperature and barometric pressure on discharges (not just electrostatic). The formula shows that voltages must reach the hundreds of thousands of volts range to be meaningfully affected by barometric pressure (a sketch of the underlying correction factor follows this item).

Final EMAC decision – A laboratory is not required to have any calibration on its barometric pressure meter for ESD testing, as the EMAC has decided that barometric pressure does not have a significant impact on the result of the testing (reference IEEE Standard 4). For example, the laboratory may use barometric pressure reporting from an off-site source, such as a local airport weather station.

Motion by G. Gogates to approve the above decision statement. Motion seconded by K. Williams. Vote by voice – one abstention, all others approving.
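For context, a minimal sketch of the atmospheric correction used in high-voltage testing, assuming the conventional relative air-density factor (the reference conditions t0 = 20 °C and p0 = 101.3 kPa and the symbols are the commonly used ones, not quoted from IEEE Standard 4 or from these minutes; the full correction in the standard also includes humidity terms and empirical exponents):

\[ \delta = \frac{p}{p_{0}} \cdot \frac{273 + t_{0}}{273 + t}, \qquad U \approx \delta^{\,m}\, U_{0} \]

Since \( \delta \) remains very close to 1 for ordinary indoor barometric variations, the resulting correction to a discharge voltage is only a small percentage, which is consistent with the committee's view that it matters only at the very high voltages noted above.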

23 Return to agenda item 9 for final review of EMAC Consensus Document. Any new Consensus Decisions arising during 2011 EMAC meeting to be incorporated into final draft.

The above decisions and clarifications are to be added to the EMAC Consensus Document prior to forwarding a final draft to the CC for review and approval (see Action Item #6 above).

Discussion on the 5% tolerance for dimensions only: The 5% tolerance rule of thumb started when considering the measurement of distances in radiated emissions test setups. For assessors, the rule of thumb will be to allow laboratories a 5% leeway on all measurements where no tolerance range is provided by the testing standard being used (a worked example of this rule of thumb follows the approved tolerance table below).

Motion by G. Gogates to use the following wording from Table 3 of the AEMCLAP ver. 4.0 requirements (copied below) to define tolerances when a test method does not explicitly give tolerance ranges for testing:


Note: Where tolerances on parameters are not defined in the reference documents, the tolerances in Table 3 shall apply.

Table 3. Default Tolerances
  Supply voltage and current: ±5%
  Time interval, length: ±10%
  Resistance, capacitance, inductance, impedance: ±10%
  Test parameters for RF field strength, electrical or magnetic field strength, injected current, power, energy, transient voltage amplitude (if adjustable): +10% / -0%

Any commercial measurement devices (ruler, tape measure, etc) can be used for the distance measurement. No calibration is required for these devices.

Motion seconded by D. Kramer. Discussion – P103c states that the conditions given may not apply in all situations. Amended motion by G. Gogates to change the tolerances (as listed in the AEMCLAP document referenced) to 5% as a rule of thumb for not citing deficiencies. Seconded by P. Boers. Vote by voice – all approve use of the modified tolerances in the AEMCLAP table as the basis for the above motion. Vote by voice – all approve the modified motion (approved tolerance chart below).

Note: Where tolerances on parameters are not defined in the reference documents, the tolerances in Table 3 shall apply.

Table 3. Default Tolerances (as modified by the EMAC motion above)
  Supply voltage and current: ±5%
  Time interval, length: ±5%
  Resistance, capacitance, inductance, impedance: ±5%
  Test parameters for RF field strength, electrical or magnetic field strength, injected current, power, energy, transient voltage amplitude (if adjustable): +5% / -0%

Any commercial measurement devices (ruler, tape measure, etc) can be used for the distance measurement. No calibration is required for these devices.
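As a worked illustration of the dimensional rule of thumb (the 3 m figure is simply the test distance already cited elsewhere in these minutes, not an additional requirement): a nominal antenna-to-EUT distance of 3 m with a 5% allowance gives

\[ 3\ \mathrm{m} \times 0.05 = 0.15\ \mathrm{m}, \qquad \text{acceptable range} \approx 2.85\ \mathrm{m}\ \text{to}\ 3.15\ \mathrm{m}, \]

so a distance measured with an ordinary tape measure anywhere inside that band would not, by itself, be cited as a deficiency.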

Motion by G. Gogates – Interoperability and Protocol testing shall be considered under the Information Technology field of testing. Motion tabled pending the action item below.

Action Item #15 – M. Buzard to forward the IT Program Requirements document to EMAC committee members. Allow 4 weeks for EMAC review, then perform an electronic vote on including Interoperability and Protocol testing under the IT program, with S. Medellin's input (by June 30, 2011).

Motion by G. Gogates – Protocol test units are not required to be calibrated (RF testing still requires calibration). If the above action item results in a positive EMAC vote, this motion is included and addressed by the inclusion of those two types of testing under the IT field. If the above action item results in a negative vote, EMAC membership will be balloted in accordance with the bylaws for a decision on this motion.


Motion tabled pending the outcome of Action Item #15.

Motion by L. Gradin to form a work group to create a "Good Practices" document for EMC calibration requests. Motion seconded by P. Boers. Volunteers offered their services to the working group as follows: Chairman – P. Boers; members – A. Gouker, L. Gradin, B. Nadeau, D. Kramer, W. Elliott.

Action Item #16 – A. Gouker to follow up with working group chair P. Boers to ensure progress is being made on the "Good Practices" guidance document (by May 31, 2011).

24 Presentation – William “Mac” Elliott - Task Group Lead for ANSI C63.26 STM 5 - Field Strength of Radiated Emissions – Presentation / Discussion on task group’s proposal for new standard. Discussion of KDB issues - discuss a mechanism for labs and manufacturers to request extensions/relief/etc…. Discuss ideas for improvement to the current KDB process for notifying affected parties.

W. Elliott presented information on the C63.26 task group. See Attachment #9 for a copy of the presentation. The EMAC discussed implementation of the KDB process; the general consensus was that the KDBs could be as much of a hindrance as a help to laboratories in the industry. Further discussion will follow upon additional development.

25 Old/New Business

An assessors-only meeting was to be held after the EMAC meeting adjourned. No other new EMAC business was discussed.

26 Next Meeting

Possibly at the IEEE symposium in Long Beach, CA in August; otherwise, the next EMAC meeting will be held at next year's A2LA Annual Meeting and Technical Forum (2012).

27 Adjournment

Motion by P. Boers to adjourn the 2011 EMAC meeting. Motion seconded by A. McCall. Vote by voice – all votes affirmative, none dissenting. The meeting was adjourned at approximately 1500.

Summary prepared by Mike Buzard, A2LA Accreditation Officer.


ATTENDEES OF THE 2011 EMAC MEETING:

Name – Affiliation
Benoit Nadeau – Assessor / EMAC Chairman
Bill Peverill – Assessor
Greg Gogates – Assessor
Larry Gradin – Assessor
Peter Boers – Assessor
Werner Schaefer – Assessor
Nathan Belsher – Assessor
Thomas Dickten – Assessor
David Waitt – Assessor
Mike Bosley – Assessor
Doug Kramer – Assessor / NCEE
Nee Salam – Assessor
Andreas Eberhard – Assessor
Barry Quinlan – Assessor
Yukio Tanuma – Assessor
Craig Maytrott – Assessor
Allan McCall – Fischer Custom Communications, Inc.
David Fischer – Fischer Custom Communications, Inc.
Kimball Williams – Denso
John Fuhrman – MET Labs
John Mason – MET Labs
William (Mac) Elliott – Motorola / ANSI / Assessor
Deanna Zakharia – Motorola Global
Adam Gouker – A2LA Staff / EMAC Recording Secretary
Rob Miller – A2LA Staff
Diana Gavin – A2LA Staff
Michelle Bradac – A2LA Staff
Mike Buzard – A2LA Staff
Trace McInturff – A2LA Staff
Peter Unger – A2LA President

Total attendees for 2011: 30

ATTACHMENT 1

The American Association for Laboratory Accreditation

© 2010 by A2LA. All rights reserved. No part of this document may be reproduced in any form or by any means without the prior written permission of A2LA.

A2LA Electro-Mechanical Advisory Committee (EMAC) Meeting
The Sheraton Columbia Hotel, Columbia, MD

MEETING MINUTES

Saturday, April 24, 2010

8:00 am – 2:00 pm

1 - Introductions.

At 8:06 am, B. Nadeau, EMAC Chairman, began the 2010 EMAC meeting. Robert's Rules of Order were stated as the basis to be followed for discussions. The agenda was briefly reviewed. Introductions were made of all present EMAC members, A2LA staff, and other invitees.

2 - Review and approval of agenda.

A. Gouker requested the addition of agenda items as follows: (A) Notice from NIST on CISPR 22 vs. KN22 requirements. (B) FCC draft Lab Division publications and FCC draft checklist review. (C) Collection of A502 forms.

Motion by B. Nadeau: Approve the agenda, with the additions above. Motion properly seconded. Vote by voice – all votes affirmative – the agenda was approved. (No dissenting votes.)

3 - Nomination/Election of EMAC Chairperson.

Motion by K. Williams: Nominate B. Nadeau to remain Chairman. Nomination properly seconded. Vote by voice – all votes affirmative – Benoit remains Chairman. (No dissenting votes)

4 - Membership.

ACTION ITEM 1 – A. Gouker – Contact Phil Fanson, Greg Bartell, Hoosamuddin Bandukwala to ascertain their interest in remaining EMAC members since they have missed two consecutive meetings. (By May 31, 2010)



Motion by K. Williams: Add Kurt Fisher, Joel Lachance, Mac Elliott, and Allan McCall to the EMAC membership list. Motion properly seconded. Vote by voice – all votes affirmative – A. Gouker to add the persons to the membership list and contact non-present persons. (No dissenting votes.)

ACTION ITEM 2 – A. Gouker – Add Kurt Fisher, Joel Lachance, Mac Elliott, and Allan McCall to the EMAC members list. (By May 31, 2010)

5 - Approval of Last Meeting minutes.

Motion by K. Williams: Approve last meeting minutes. Motion was seconded by T. Dickten. Vote by voice – all votes affirmative – 2009 EMAC Minutes were approved. (No dissenting votes.)

6 - Review of Action Items from 2009 EMAC Meeting

#1 – Closed.
#2 – Closed.
#3 – Closed.

#4 – Open. The assigned Working Group made little progress in 2009. It was suggested that A2LA should be responsible for continuing the working group's efforts, but not all agreed with this idea. R. Miller clarified that the Working Group should be used to determine the most pertinent technical points from previous EMAC meeting minutes. A. Gouker suggested adding one A2LA staff member to the working group as a liaison to the main office and to assist with the work load. It was agreed that W. Schaefer and A. Gouker will join D. Kramer in the Working Group.

Motion by B. Nadeau: Continue work in this area throughout 2010, and have the Working Group complete the document detailing major decisions/interpretations of the EMAC.
Motion by B. Nadeau: Establish July 1, 2010 as a deadline for a first draft release, and September 1, 2010 as a deadline for closing this action item.
Both motions were properly seconded. Vote by voice – all votes affirmative – the Working Group will continue to process previous EMAC meeting minutes to analyze and document pertinent EMAC decisions and interpretations. (No dissenting votes.)

#5 – Bore sighting was discussed during the EMC training on 4/23/2010, and a summary of the discussion was produced. It was agreed that assessors will not cite a deficiency against ANSI C63.4-2009 clause 8.3.2.2 in reference to "keeping the EUT in the cone of radiation" if the antenna is not bore-sighted during the measurement. This item remained open until discussed later in the agenda.

#6 – This item is on the agenda for discussion (item 8).

#7 – Ramona Saar (NIST) informs A2LA about KN standards updates as they happen. A2LA notifies assessors as KN standards updates are sent to the A2LA office. This is an ongoing item in light of the frequency of updates. The action item is considered closed.

#8 – A2LA will allow assessors to use their Assessor Master Code in lieu of their name for FCC checklist identification purposes. The checklist appears on the FCC website as a public document, and some assessors were concerned about potential legal issues.

#9 – Closed.

#10 – Concern was expressed regarding the quality of the contents of assessor packages (an example given was one lab's scanned C101 being of poor quality, and – after successive use and reprinting – the quality worsened). ACTION ITEM 3 – A. Gouker – Propose to A2LA upper management requirements for submission of C101 checklists received with applications, restricting the format to non-hand-written PDF scans or .doc versions. (By 6/1/2010)

#11 – FCC members were invited to this year's Conclave. George Tannahill and Ramona Saar were not present for 2010. A. Gouker will re-invite them in 2011. ACTION ITEM 4 – A. Gouker – Re-invite FCC representatives to the 2011 Conclave. (By 4/1/2011)

7 - New/Emerging areas of interest in the laboratory accreditation community, or within the Electrical/EMC community (i.e. Toyota EMI issues and Smart Grid development).

Smart Grid – G. Gogates has assessed one lab for BACnet (protocol testing), which covers the building-automation side of Smart Grid. ISA 99 (a standard for power companies and suppliers) was also mentioned as a potential area to examine. Another emerging area noted was protocol testing. It was suggested that A2LA further clarify Inspection Body vs. ISO/IEC 17025 accreditation among staff. The idea of promoting inspection body requirements/activities to specifiers and/or installers was also mentioned.

The topic of Energy Star was mentioned. M. Buzard and R. Miller gave a brief discussion on future requirements from EPA/DoE for accredited testing labs, and the hope that EPA will also consider using Guide 65 accredited organizations for running verification testing programs. R. Miller asked the assessors present to evaluate their own backgrounds and capabilities and inform T. McInturff if they are able to assess the Energy Star program requirement tests.

8 - Review A106 - EMAC bylaws. Revisions received from recent request for review/approval of the current document.

Comments were requested previously but were submitted by only one EMAC member. A request was made to provide the draft (including the comments already submitted) to EMAC members for further review before initiating any formal changes to the draft. If changes are made, the bylaws document will need to be sent to the A2LA Criteria Council (CC). It was suggested that IT activities be incorporated into the scope of the EMAC Bylaws document, since gaming labs are becoming a bigger source of protocol testing.

Motion by D. Kramer: A2LA staff to compare the EMAC bylaws with other Technical Advisory Committee (TAC) bylaws and clarify any bridging/overlap issues, especially with regard to the scope of each TAC. A2LA is then to provide feedback to the EMAC with recommendations for bylaw revisions (including provisions for involvement of other advisory committees when overlapping scopes require changes to more than one document). Comments are to be incorporated into the next draft of the bylaws. A deadline of May 31, 2010 was set for B. Nadeau to review comments on proposed bylaw revisions. This motion was seconded by D. Sigouin. Vote by voice – all votes affirmative – A2LA staff to undertake the EMAC bylaw review and draft revision. (No dissenting votes.)

ACTION ITEM 5 – A. Gouker to compare the EMAC bylaws with other Technical Advisory Committee bylaws and clarify any bridging/overlap issues, especially with regard to the scope of each TAC. Provide feedback to the EMAC with recommendations for bylaw revisions (including provisions for involvement of other advisory committees when overlapping scopes require changes to more than one document). Incorporate all comments into a draft of the bylaws, which is to be sent to B. Nadeau. (By May 31, 2010)

9 - Discussion of Proficiency Testing (PT) related to labs within the jurisdiction of the EMAC. Namely R103, R103a, and expectations from both laboratories and assessors.

A. Gouker explained that the previous section PT7 of R103 has been removed, so that labs no longer need a plan to explain what they are doing in lieu of commercially available proficiency testing. Since this is no longer a requirement for the labs, assessors should not be citing deficiencies against the lack of a PT plan in these cases. Section 5.9.1 of ISO 17025 requires a quality control procedure for monitoring the validity of every test on the scope of accreditation, but a PT plan is not required for labs where commercial PT is not available. In addition, submission of the PT Matrix (A2LA document A310) is no longer required. Assessors indicated that they felt labs were still unclear on A2LA's expectations regarding this issue and asked that staff notify EMC labs of these policy changes.

ACTION ITEM 6 – A. Gouker to notify appropriate EMC labs that a documented PT plan is no longer required if there are no commercially available proficiency testing programs for the tests on their scope. (By May 31, 2010)

A suggestion was made to point labs to the A2LA guidance document G105 – Quality Control and Proficiency Test Plan – on this subject.

ACTION ITEM 7 – A. Gouker to speak with upper management and clarify to assessors what deliverables shall be returned for EMC labs. (By May 31, 2010)

10 - C63.4 and C63.10 bore sighting.

During the April 23, 2010 EMC training, "bore sighting" was discussed during a review of ANSI standards C63.4:2009 and C63.10:2009. The standards are not completely clear with regard to antenna positioning when testing above 1 GHz. Referring to the diagram on slide 11 of W. Schaefer's presentation on ANSI C63.4:2009 from the EMC training (please see Attachment 1), discussion took place regarding the positioning of the antenna in order to maintain it within the "cone of radiation." ANSI and FCC guidance documents are not clear enough on this point. A discussion took place on whether or not to consider bore sighting an accepted practice. It was determined that the EMAC cannot make a final interpretation on this issue.

Motion by B. Nadeau: A2LA assessors shall accept both bore sighting and planar scan testing methods. Motion properly seconded. Vote by raised hands – motion defeated by majority vote.

Motion by D. Waitt: A2LA assessors shall not write a deficiency if the lab does not bore sight their antenna. [ANSI C63.4:2009, 8.3.2.2] Motion seconded by J. Lachance. Vote by raised hands – motion passed, with one dissenting vote. Dissenting vote by K. Fisher.

Decision – A2LA assessors shall not write a deficiency if a laboratory does not bore sight their antenna. [EMAC interpretation of ANSI C63.4:2009, 8.3.2.2]

A suggestion was made to solicit an interpretation from IEEE on the bore sighting issue. B. Nadeau accepted an action item, with assistance from W. Schaefer, to contact IEEE about this issue.

ACTION ITEM 8 – B. Nadeau will solicit IEEE for an interpretation on the bore sighting issue, with assistance from W. Schaefer. (By 10/1/2010)

11 - Working Group Reports from 2009 Conclave.

This agenda item was covered earlier in the EMAC discussion. The working group remains open, and two new members were added. See agenda item 6, item #4 above.

12 - Discussion regarding EN 55022 test setup interpretation, surrounding a question from an A2LA-accredited EMC laboratory.

An A2LA-accredited lab posed the following question: CISPR 22 wording on cabling was claimed to be inconsistent with ANSI standards. For testing above 1 GHz, the CISPR setup was not accepted by the FCC. The lab wants to use one single setup for both <1 GHz and >1 GHz tests. The lab believes that a documented deviation from the standards in the test report would be required, and desires confirmation of its understanding. A decision was made that the lab must follow the requirements of ISO 17025, 5.4.1 ("Deviation from test and calibration methods shall occur only if the deviation has been documented, technically justified, authorized, and accepted by the customer").

ACTION ITEM 9 – A. Gouker to notify the subject laboratory of the EMAC's interpretation regarding their question about EN 55022 test setup requirements. (By May 31, 2010)

13 - P103c – possible MU Annex revisions (specifically Canadian radio specs portion)


Motion by B. Nadeau: Revise P103c for the RSS and BETS (Canadian radio tests) test methods to remove individual test listings and use a URL to point readers to the appropriate listing websites. Also, modify the ETSI test listing to say something to the effect of "All ETSI 300 series tests – MU Category V" and point to the ETSI list, as those standards are also free to view. Motion properly seconded. Vote by voice – all votes affirmative. Motion was passed.

ACTION ITEM 10 – A. Gouker to make revisions to P103c for RSS, BETS, and ETSI 300-series tests, adding URLs instead of individual test methods, and send the draft out to the EMAC for comment. (By May 31, 2010)

ACTION ITEM 11 – After the EMAC provides comments on the P103c revisions, A. Gouker to present them to the A2LA Criteria Council for final approval and update. (By June 30, 2010)

14 - SAR Testing: Is calibration required for the reference dipole?

The argument was made that SAR dipoles have zero effect on the testing results, and so calibration is not needed. Dissenting opinions were presented by some EMAC members, stating that compliance measurements require accredited calibrations. IEEE 1528:2003, 8.2.1 (System Checks) was referenced: the system check is not a verification of the system with respect to external factors; dipoles are used to check the list of possible short-term drift errors in the system. The FCC KDB makes the following note in the Dipole Requirements: “…. regular calibrations are necessary to reconfirm the electrical [specifications] and SAR targets of the dipoles.” As such, it was agreed that the dipole must have an accredited calibration to remain in compliance with ISO/IEC 17025.

Decision – Calibration of SAR dipoles is required (ISO 17025, sec 5.5.2/5.6.1). Calibrations not meeting A2LA’s Traceability Policy Requirements shall have a deficiency cited.

15 - Investigate ways of documenting laboratory compliance with the 6 dB requirement for SVSWR (VCCI V-3), similar to current practice in C216 for FCC.

It was discussed that VCCI seems to want SVSWR results submitted with VCCI V-3 6 GHz notifications, based on previous denials of applications missing the SVSWR results. Based on discussions during the EMC training on 4/23/2010, a question was raised on whether or not A2LA should require labs to have SVSWR reports for verifying testing capabilities up to 6 GHz. In addition, there was some question as to whether A2LA should require an additional deliverable from assessors with regard to VCCI, or whether it should only be clarified on the lab’s scope of accreditation (1 or 6 GHz limit).

Decision – Assessors shall clarify the frequency range for VCCI V-3 on scopes, and assessors shall verify SVSWR reports during the on-site assessment for labs testing over 1 GHz. Assessors shall notify laboratories prior to coming on-site that the SVSWR reports (> 1 GHz) must be prepared for review and that submitting the reports prior to coming on-site would be beneficial.


16 - Discussion regarding an interpretation of C63.4, section 10.1.5, correlating to FCC checklist #82… requirement for conversion factors to be in test reports.

A working group was established to review and comment on possible revisions to the FCC Checklist (A2LA Doc C216). A draft version of the new FCC Checklist is on the FCC website for comment. It was reiterated that last year’s decision of the EMAC was not to cite deficiencies against FCC Checklist items.

17 - Notification regarding NAVAIR Instruction 2400.1 update (page 4, listing MIL-STD-461 testing requirements).

NAVAIR issued a revised document, allowing ILAC MRA signatory accreditation bodies to accredit labs for this testing.

18 – Request from A2LA-Accredited EMC Lab: Can the EMAC develop clear definitions of calibration, verification, and standardization with respect to EMC test equipment? From these definitions, is it possible to develop equipment lists for EMC tests which will identify the need for calibration/verification/standardization?

It was explained that, in a recent assessment conducted for SCC, an assessor explained the SCC approach to this situation (not endorsed by any document, however) to the assessed lab: namely, that clause 5.4.1 of ISO/IEC 17025 should be applied to require identification of major contributors, and a criterion of more than 4:1 on the total measurement uncertainty (MU) should be used. Unfortunately, this approach is difficult to apply in all cases (especially if MU budgets do not exist). It was felt that the question was raised to help the lab determine what equipment could be removed from their yearly calibration list in order to save money. It was proposed that an extension of the calibration cycle for certain pieces of equipment is an allowable approach if the laboratory has good quality checks according to 5.9 of ISO/IEC 17025.

Decision – The EMAC cannot define the terms calibration, verification, and standardization/characterization. It was agreed that B. Nadeau would work with the CC to define these terms. It was also recommended that labs refer to “ILAC G24 - Guidelines for the Determination of Calibration Intervals of Measuring Instruments.”

ACTION ITEM 12 – A. Gouker to discuss with upper management the possibility of incorporating definitions for calibration, verification, etc. from the VIM version 3.0 into A2LA’s Traceability Policy. If agreed to by management, then make the proper revisions and ballot the A2LA Criteria Council. (By June 30, 2010)


19 - Further to agenda item 14, distribute for discussion a white paper that was received by the EMAC recording secretary from an unidentified source regarding calibration and verification differences. No further discussion was undertaken on this topic – see agenda item 18.

20 - Highlights from Friday’s EMC Assessor Training class.

ANSI C63.4:2009, 4.7.5 – Every temperature variation of more than 15 degrees centigrade requires a laboratory to apply different insertion loss values for their cables exposed to these variations (i.e. on top of the ground plane on an OATS).

Bi-conical antennas must have an antenna balance of <= 1 dB. It was recommended that antennas be marked so they are used the same way (orientation) every time. Calibration certificates must also include a statement of what orientation was used to obtain the correction factor.

A discussion on protocol analyzers (and protocol testing in general) took place. Time functions should not need to be calibrated. Unless otherwise noted, protocol analyzers should not need to be calibrated (see specific protocols).

Protocol testing should be added as a competency item in assessor descriptions, as most Bluetooth/CTIA assessors are competent in this area, but not all EMC assessors are. Assessors should contact T. McInturff if they feel they should be coded for protocol testing.

CISPR 16-1-2 – A question was raised regarding whether LISN isolation calibrations are required and whether calibration providers truly calibrate to manufacturer specifications. If the lab is not specifically requiring certain specifications (i.e., “meets CISPR 16-1-2 requirements”) on the cal purchase order, some felt this should be considered a non-conformance.

Motion by L. Gradin: Taking into consideration isolation requirements, labs must provide objective evidence that their LISNs meet these requirements. Motion properly seconded. Votes taken by voice – 1 abstention, all others affirmative.

CISPR 16-1-2 – It was agreed that voltage drops shall not be more than 95%.

Motion by B. Nadeau: A lab shall have objective evidence that their LISN has no more than 95% voltage drop. Motion properly seconded. Votes taken by voice – 1 abstention, 1 against (anonymous), all others affirmative. Decision – Labs must provide objective evidence that their LISN meets the requirements of CISPR 16-1-2 for isolation and voltage drops.

21 - Notice from NIST regarding CISPR 22 vs. KN 22.

KN 22 requires a 10m test distance and will not allow 3m test data to be considered for a proper submission as CISPR 22 allows, unless high ambient noise signals are present.


22 - FCC Draft Laboratory Division Publications

A new FCC checklist is in draft format on the FCC’s website. ANSI C63.4-2003 and C63.4-2009 are both allowed per the current draft document. NSA verification appeared to not have been included in this draft. R. Miller asked for comments to be submitted by May 10, 2010 (to him or A. Gouker at A2LA), and A2LA will submit a total commentary on behalf of the EMAC. Everyone was also told that this is a public document and anyone is able to post comments to the FCC. A2LA plans to add NSA check requirements to the current iteration of the FCC draft checklist when it becomes a final published version.

Motion by L. Gradin: If a laboratory does not have an NSA report submitted to the appropriate regulatory body, then they must prepare an equivalent report for the assessor to review. This motion was withdrawn during discussion.

Motion by D. Sigouin: A2LA staff to contact the FCC and obtain clarification on the requirement for reviewing the test firm’s NSA data on-site and clarification on why the NSA demonstration was removed. Motion was seconded by D. Kramer. Vote by voice – all votes affirmative (no dissenting votes). R. Miller noted that this item is included with the current list of A2LA questions going to the FCC regarding the draft document.

ACTION ITEM 13 – A. Gouker to provide comments to the FCC in an effort to clarify why the NSA spot check requirement was removed from the current FCC draft checklist. (By May 16, 2010)

A discussion took place regarding use of a subcontractor to perform the NSA measurements. On the current FCC checklist, “No” must be checked if that subcontractor is not present to perform the NSA measurements during the assessment.

ACTION ITEM 14 – A. Gouker to gain clarification from the FCC on the use of subcontractor/service providers who perform the NSA measurements for test labs, and provide feedback to the EMAC notifying them of the clarification for these providers to be on-site to perform spot checks during the assessment. Due by June 30, 2010.

23 - Collection of A502 (Conflict of Interest) Forms.

A502 forms were passed in by attendees and collected by A. Gouker.

24 - Old/New Business

A question was raised regarding A2LA’s current application of ISO 17025, Clause 5.7. The application currently states that, if a lab is not involved in the collection of samples, then sampling is not applicable. However, if the lab uses a statement on the testing of individual items to apply to a population, then sampling does apply.


An example was given of a manufacturer’s lab being required to use a certain report template after testing a selection of test items, such that the individual results would then be related to a larger population. In this case, it was suggested that the clause of ISO/IEC 17025 related to interpretation of opinions/results (Section 5.10.5) be referenced. Also, it may be possible for the lab to qualify the final report as non-accredited work. Finally, the suggestion was made for the assessor to investigate the lab’s contract with their customer to verify that the test report format is as required by the customer contract (Section 4.4).

25 - Next Meeting

The next EMAC meeting is scheduled to occur at the 2011 Conclave. A. Gouker will issue notices to all applicable EMAC members in advance and in accordance with the EMAC Bylaws.

The meeting was adjourned at approximately 2:00 pm. Summary prepared by Mike Buzard, A2LA Accreditation Officer.


Attendees of the 2010 EMAC Meeting:

Adam Gouker – A2LA Staff
Mike Buzard – A2LA Staff
Rob Miller – A2LA Staff
Diana Gavin – A2LA Staff
Michelle Bradac – A2LA Staff
Benoit Nadeau – A2LA Assessor / EMAC Chairman
Nathan Belsher – A2LA Assessor
Andreas Eberhard – A2LA Assessor
Doug Kramer – A2LA Assessor
Nee Salam – A2LA Assessor
Werner Schaefer – A2LA Assessor
David Waitt – A2LA Assessor
Thomas Dickten – A2LA Assessor
Dan Sigouin – A2LA Assessor
Larry Gradin – A2LA Assessor
Greg Gogates – A2LA Assessor
Kurt Fischer – A2LA Accreditation Council
Joel Lachance – Assessor in Training
Kimball Williams – Assessor in Training / Denso
David Fischer – Fischer Custom Comm.
Allan McCall – Fischer Custom Comm.
William “Mac” Elliott – Motorola

Total Attendees for 2010: 22

2010 EMAC Meeting Action Items

ACTION ITEM 1 – A. Gouker – Contact Phil Fanson, Greg Bartell, and Hoosamuddin Bandukwala to ascertain their interest in remaining EMAC members since they have missed two consecutive meetings. – Complete (no replies from the three individuals)

ACTION ITEM 2 – A. Gouker – Add Kurt Fisher, Joel Lachance, Mac Elliot, and Allan McCall to the EMAC members list. – Complete (EMAC list has been updated)

ACTION ITEM 3 – A. Gouker – Propose to A2LA upper management requirements for submission of C101 checklists received with applications to restrict the format to non-hand-written PDF scans or .doc versions. – Complete (moving to an on-line renewal process, where C101 will be available for download in .doc format; but submission in Word format will not be mandated since labs are our customers)

ACTION ITEM 4 – A. Gouker – Re-invite FCC representatives to the 2011 Conclave. – Complete (FCC and NIST indicated they were unable to attend the 2011 Forum due to prior engagements)

ACTION ITEM 5 – A. Gouker – Compare EMAC bylaws with other Technical Advisory Committee bylaws and clarify any bridging/overlap issues, especially with regard to the scope of each TAC. Provide feedback to EMAC with recommendations for bylaw revisions (including provisions for involvement of other advisory committees when overlapping scopes require changes to more than one document). Incorporate all comments into a draft of the bylaws, which is to be sent to B. Nadeau. – Open, in process (draft sent to EMAC for decision during the 2011 Forum meeting)

ACTION ITEM 6 – A. Gouker – Notify appropriate EMC labs that a documented PT plan is no longer required if there are no commercially available proficiency testing programs for the tests on their scope. – Closed (discussed with management; it was decided that notifying labs to clarify the requirement could cause further confusion. Explanations will be provided on a case-by-case basis.)

ACTION ITEM 7 – A. Gouker – Speak with upper management and clarify to assessors what deliverables shall be returned for the EMC labs. – Closed (this action item was meant to clarify to assessors what was required for PT deliverables; assessors were emailed and notified that the PT matrix was required at a minimum, even if no commercial PT was available, with a note on the matrix)

ACTION ITEM 8 – B. Nadeau will solicit IEEE for interpretation on the bore sighting issue, w/ assistance from W. Schaefer. – Open, in process (IEEE was sent a request for interpretation in early 2011; no feedback or publication at this time. A decision is expected during the May ’11 C63 meeting.)

ACTION ITEM 9 – A. Gouker to notify the subject laboratory of the EMAC’s interpretation on their question regarding EN 55022 test setup requirements. – Closed (laboratory notified)

ACTION ITEM 10 – A. Gouker to make revisions to P103c for RSS, BETS, and ETSI 300-Series tests, adding URLs instead of individual test methods, and send out to EMAC for comment. – Closed (P103c updated and approved by the CC)

ACTION ITEM 11 – After EMAC provides comments on P103c revisions, A. Gouker to present to the A2LA Criteria Council for final approval and update. – Closed (P103c updated and approved by the CC)

ATTACHMENT 2

ACTION ITEM 12 – A. Gouker to discuss with upper management the possibility of incorporating definitions for calibration, verification, etc. from the VIM version 3.0 into A2LA’s Traceability Policy. If agreed to by management, then make the proper revisions and ballot the A2LA Criteria Council. – Closed (upper management is currently in the process of revising P102, which includes a section on “definition of terms”)

ACTION ITEM 13 – A. Gouker to provide comments to the FCC in an effort to clarify why the NSA spot check requirement was removed from the current FCC draft checklist. – Closed (comments were submitted to the FCC during the comment period on the draft “Test Firm Roles/Resp” document. No feedback was received; however, the new FCC checklist was published in July 2010 with the spot check requirement still included)

ACTION ITEM 14 – A. Gouker to gain clarification from the FCC on the use of subcontractors/service providers who perform the NSA measurements for test labs, and provide feedback to the EMAC notifying them of the clarification for these providers to be on-site to perform spot checks during the assessment. – Closed (the FCC clarified that contractors are acceptable for performing full NSA measurements for the lab, but during the on-site A2LA assessments the lab personnel must be able to perform the spot check under accredited conditions)

INFORMAL ACTION ITEM FROM 2010 EMC ASSESSOR TRAINING CLASS – W. Schaefer to submit a request for interpretation to IEEE regarding the acceptability of using hybrid or other broadband antennas for radiated emissions measurements above 1 GHz (for ANSI C63.4-2009 tests). – Closed (W. Schaefer submitted the request and received the following feedback, which was also forwarded to those in attendance at the EMC assessor training class):

“Hybrid or other broadband antennas cannot be used for radiated emission measurements above 1 GHz. See table 1 in clause 4.5 and clause 4.5.4 of C63.4-2009 for the allowed list of antennas. While C63.5-2005 provides a method to calibrate broadband antennas above 1 GHz, the current revision of C63.4 does not allow for their use.”


draft

A106 – BYLAWS OF THE ELECTROMECHANICAL ADVISORY COMMITTEE (EMAC)

Approved by the A2LA Criteria Council on mm/dd/yy

ARTICLE 1 - ASSOCIATION BYLAWS

1.1 These Bylaws are in accordance with the Bylaws of the American Association for Laboratory Accreditation (A2LA). The Electromechanical Advisory Committee (EMAC) will hereinafter be referred to as the Committee. The Criteria Council will hereinafter be referred to as the Council. A2LA will hereinafter be referred to as the Association. On issues not specifically addressed by the Association Bylaws these Bylaws shall govern.

ARTICLE 2 - SCOPE

2.1 The scope of the Committee shall be the development of accreditation guides, positions, and recommendations to the Association for Electrical, Electromagnetic Compatibility (EMC), Product Safety, Telecom, Environmental Simulation [e.g. vibration, acceleration, rain, salt and moisture exposure, dust, thermal and humidity cycling, thermal and vibration shock, high/low temperature, low pressure (altitude), contamination by fluids, solar radiation (sunshine), fungus, explosive atmosphere, immersion, acoustic noise, life cycle/durability etc.], Software, Wireless, Bluetooth, Specific Absorption Rate (SAR), Electro-mechanical functional and characteristic (e.g. dielectric, strength, resistance, etc.), and stress (e.g. highly accelerated life test and stress simulation, stress screening, etc.) testing. The Committee shall also address issues/concerns related to any accredited Telecommunication Certification Bodies (TCBs) and Proficiency Testing Providers relevant to its scope. Project selection for this Committee will be coordinated with other Association committees, as appropriate, and with other organizations and individuals in the subject areas, so that unnecessary duplication of effort will be minimized. The Committee, as coordinated with or requested by the Association, shall represent the Association to various groups with a stake in the standards development process (e.g. FCC, NVCASE, AEMCLRP, ISO, Bluetooth SIG, CCF, CTIA, IEC, IEEE, EIA, ISA, IPC, SAE, NAVAIR, ELAWG, ACIL, ANSI C63, CISPR, EPA, DOE) within the areas enveloped by the Committee.

2.2 The Committee shall report to the Council, and communicate its findings with the Council Chairman and the Staff Advisor. The Committee has no right to bind the Association without direct authorization of the Board.

2.3 To achieve the scope (and goals) of Article 2.1, the Committee will undertake, among others, the following activities:

a. To Contribute to the production and promotion of knowledge of Committee members relating to the scope of the Committee;

b. To Promote, encourage and maintain the highest possible professional level of practice and ethics in the scope of EMAC related accreditations

c. To Encourage research and exchange of ideas, experiences and projects in the scope of the Committee;

ATTACHMENT 3


d. To Establish, with the help of the Association, preferred contacts with laboratories, businesses and other organizations, both public and private, nationally or internationally, and with germane associations nationally or internationally to promote worldwide consistent accreditation approaches;

e. To Promote and support activities leading towards the development of laboratory competence within the Committee scope;

f. To Promote activities such as courses, conferences, seminars, meetings and exhibitions for the Committee scope in support of the Association;

g. To Promote and sponsor publications that conform to the goals of the Association intending to inform Laboratories, Businesses and other organizations, both public and private, and with germane associations in the country or abroad;

h. To Maintain a dialog with laboratories, businesses and other organizations for improvement of the accreditation process within the Committee scope;

i. To Promote the use of non-proprietary, readily available and consensus-based standards, interfaces and formats for cost-effectiveness and fairness;

j. To Oversee working group and task group operation to see that it is within the scope of the Committee, and the charter of the working groups and task groups;

k. To Facilitate, as appropriate, interlaboratory comparisons and cooperation to minimize variation in test results.

l. To strive to become a trusted advisor to the Association. A trusted advisor is an individual or group willing to take risks by being open, frank, and clear for the benefit of those advised and also focused on the long term benefits for others not just self-interest.

m. To Assist the Association in establishing clarifications and interpretations within the Committee scope to support such requests from industry.

2.4 Draft Documents - Draft position or guideline documents must bear a notice on the cover that the document is an unapproved draft of a proposed Association document and does not reflect an authorized Association position. The Association logo shall not be applied to draft documents. As such, the document is subject to change. Permission is granted for EMAC participants to reproduce draft documents for purposes of Association activities only.

ARTICLE 3 - MEMBERSHIP

3.1 Participation - Participatory membership is open to all who meet the established membership rules within these Bylaws or as established by the Committee.

3.2 Voting - Voting membership on the Committee shall be as individual Association members, Association assessors, or official representatives of organizations, or individuals in a lead responsibility (e.g. Technical Manager, Quality Manager, Supervisor, Approved Signatory, or deputies thereto) for an accredited Conformity Assessment Body (CAB) that has mutual recognition with the Association (e.g. through APLAC, EA, or other such agreements) in a field within the scope of the Committee. All voting members shall have expertise in one or more areas of the Committee scope and be approved by ballot of the existing membership.

3.3 New Member Applications - Individuals requesting membership on the Committee shall do so through the Association office or through the Chairman of the Committee. The Committee members will seek new members to enhance the knowledge/experience base of the Committee. The levels of membership are defined in Article 3.6.


3.4 Voting Privileges - All Committee members with voting privileges (see 3.6) are entitled to vote on administrative matters such as election of officers and Committee bylaws. Furthermore, all members of the Committee with voting privileges (see 3.6) may vote on Committee ballot actions of a technical nature (e.g. technical positions). All negative votes and comments received from all ballot returns shall be considered. No more than one vote per organization, individual, program, or accredited CAB will be accepted. The number of votes from accredited laboratories shall not exceed 50% of the votes on a ballot for a ballot to be considered valid.

3.5 Voting Actions - Voting members present will be determined at the outset of any Committee meeting by review against the roster of voting members. Ballots will be reviewed to eliminate conflict with paragraphs 3.2 and 3.4.

3.6 Maintaining Membership - To maintain membership on the Committee, the member shall attend at least 50 percent of the meetings within a two-year period. Failing this, the person shall cease to be a member of the Committee if so recommended by members present at a Committee meeting. An exception to this process would be granted if the member submits in writing a request for exception supplying adequate justification, pending approval by the Committee Chair and Recording Secretary. Notice of Committee membership termination shall be sent to the member by the Recording Secretary in writing.

ARTICLE 4 - OFFICERS AND THEIR ELECTIONS

4.1 Officers - The Officers of the Committee shall be a Chairman and a Recording Secretary. The Recording Secretary shall be an Association-appointed staff member. Consistent with Association rules, the Recording Secretary is a non-voting member. As the workload of the Committee increases, a Vice Chairman may be established.

Parliamentarian - A Parliamentarian shall be appointed by the Chairman to serve until either party terminates the service. The Parliamentarian shall give advice to the Chairman on matters of parliamentary procedure. The Chairman may consult with the Parliamentarian prior to meetings to anticipate problems and determine proper procedure. During meetings the Parliamentarian shall give advice when called upon by the Chairman. The Chairman retains the right to make a final ruling and may accept or reject the advice of the Parliamentarian.

4.2 Duties of the Officers:

4.2.1 Chairman - The Chairman or, in his/her absence, an alternate if available, presides at all meetings of the Committee. Alternate selection will be in the following order: (1) Vice Chairman, if available, (2) Work Group Chairman in the order of the Work Group size, (3) Parliamentarian, (4) individual selected by majority vote at the meeting.

4.2.2 Recording Secretary - The Recording Secretary shall keep the minutes and perform such other duties as may be assigned by the Chairman. The Recording Secretary will be responsible for coordinating Committee correspondence and maintaining decision records. The Recording Secretary is responsible for maintaining a copy of the most current bylaws, the minutes, the roster of membership, the roster of officers and Parliamentarian, the attendance record of members, and the biographical sketches of all members (including interim members). The Recording Secretary may ask for and receive assistance in the generation of meeting minutes or other tasks. A member assisting in this regard is responsible to prepare the materials to the same level of detail as expected of the Recording Secretary. The duties of the Recording Secretary established by these Bylaws shall not knowingly be in conflict with Association Standard Operating Procedures. The Recording Secretary shall bring known conflicts to the attention of the Committee Chairman for resolution with the Association.

4.3 Term of Office - The term of office of the officers shall normally be for two years. The office holders, especially work group chairs, may be re-elected for a second continuous term. Elections shall be held, as necessary, during the Association’s Annual Conclave, subject to the provision of clause 4.5. Nominations must be made to the Chairman six weeks before the Annual Conclave. The Chairman is responsible for balloting the nominations to the Voting Members four weeks before the Annual Conclave. This may be assigned to the Recording Secretary. Officers will be elected at the Annual Conclave, by a simple majority vote, in a secret ballot. Voting Members that are unable to attend the Annual Conclave may submit their written selection directly to the Recording Secretary at least two weeks prior to the meeting.

4.4 Absence of Officers – If the office of Chairman becomes vacated prior to the normal election, the Vice-Chairman (if available) shall immediately become the Chairman. In the absence of both a Chairman and Vice Chairman, the Recording Secretary will ballot the membership, proposing as candidates all other officers (after asking them if they would accept the position), via e-mail or other expeditious means, with a majority vote leading to election of a new Chairman.

4.5 Approval of Committee Officers by Association Board of Directors – All elected committee officers must be ultimately approved and appointed by the Chairman of the Association Board of Directors (BOD) (see Section 8.7 of the A2LA Bylaws). Once the outcome of an election of the Committee is known, the staff representative must inform the VP of A2LA so that it can be added as a topic of discussion at the next scheduled meeting of the BOD. Because of this, it is extremely important to schedule elections carefully so that sufficient time is allotted for the BOD appointment process before the elected officers are to begin their term. Consequently, some adjustment to the provisions of clause 4.3 of these Committee Bylaws may be necessary. Once the BOD Chairman has formally appointed elected officers for an advisory committee, the Recording Secretary to that committee must officially notify the elected officers of their appointment, their responsibilities and their exact term of office.

ARTICLE 5 – WORK GROUPS

5.1 Working Groups – The Chairman of the Committee may establish working groups with working group chairmen. Working groups are established to work on specific areas of testing that are expected to continue on a long term basis.

ARTICLE 6 - TASK GROUPS

6.1 Task Groups - Task groups of one or more persons may be appointed by the Chairman of the Committee or the work group chairman(s) as appropriate to their responsibilities for specific assignments. Task groups of one or more persons may be assigned to prepare initial drafts of Association documents or provide initial review of technical issues. A task group is formed when an activity requires special attention, an involvement of more than one individual, and is not an expected long term or continuing activity meriting a work group.


6.2 Reporting - A task group will report directly to the responsible Chairman. That Chairman will be responsible for correspondence to all other members at the Committee level. Work group Chairmen shall be responsible for working level correspondence with copies of such correspondence being provided to the Committee Chairman and Recording Secretary. Correspondence to those outside the Committee and Association representing Committee position or formally inviting new members to participate or providing other actions will be by the Committee Chairman with Association member concurrence.

ARTICLE 7 - MEETINGS

7.1 Number of meetings - Regular meetings of the Committee shall be held at least once a year (usually in concurrence with the Association Assessor Conclave) and as often as necessary to carry out the business of the Committee. Special meetings of the Committee may be held at the call (or e-mail notification) of the Chairman or at the written request (or e-mail request) of at least 4 members of the Committee.

7.2 Time and place - The time and place of all meetings of the Committee shall be the responsibility of the Chairman in concurrence with the Association Recording Secretary or staff member. Notices of all meetings shall be transmitted to the members of the Committee no less than four weeks in advance of the meeting by the Association Recording Secretary.

7.3 Proxies - A Committee member may delegate in writing a qualified individual as proxy for a single meeting. This written proxy shall be recognized if presented to the Committee Recording Secretary prior to the meeting with concurrence of the Chairman, and the review of the biographical sketch of the proxy if requested. No individual shall hold or exercise proxies for more than one member.

7.4 Quorum – Twenty-five (25) % of voting Committee members (including proxies) constitute a quorum at a Committee meeting. Likewise, twenty-five (25) % of voting work group members (including proxies) constitute a quorum at a work group meeting.

7.5 Meeting notices and agenda - The Chairman, Recording Secretary, and Association headquarters shall be notified in advance of all meetings of work groups and task groups, and shall receive meeting notices, agendas, and minutes of these meetings.

7.6 Meeting rules - Robert's Rules of Order (most current version) shall guide the Committee meetings except where these rules are in conflict with the Committee or Association Bylaws. See Article 4 for Parliamentarian duties.

7.7 Virtual meetings - Decisions may be made using the Internet. Four voting members may initiate the decision-making process by submitting a proposal to the Chairman with accompanying arguments. The proposal and arguments will be corresponded to the voting membership by the Chairman in conjunction with the Recording Secretary. Responses should be corresponded to the voting membership directly. Two weeks after issuance of the proposal and arguments, the originators may request that the Chairman take a vote. If so requested, the Chairman in conjunction with the Recording Secretary will issue a ballot. The voting membership must return their vote within two weeks. See Article 3 for voting rules.


7.8 Executive Sessions – Executive sessions are utilized to discuss sensitive issues that are inappropriate to share with normal or potential general meetings. Such sessions are not routine, but are convened to discuss items or issues that may cause embarrassment, such as discipline or personnel matters. Participation is normally the Chairman, Recording Secretary, and working group chairmen. Members or invited guests may be invited to participate in these sessions when approved by the normal executive session membership by a majority of those present. Executive sessions will be on every normal agenda (see 7.9), but approval of the session will be recorded when necessary or noted as not necessary by the majority vote of designated normal Executive session members.

7.9 An appropriate order of business at regular meetings may be:

1. Roll call
2. Approval of Agenda
3. Reading of and correction of minutes of previous meeting
4. Review of previously assigned Action Items (briefly if covered in 7 or 8 below)
5. Reading of report on business transacted other than at meeting
6. Report of communications
7. Reports of Officers
8. Reports of Committees
9. Unfinished business
10. New business
11. Elections if not otherwise provided for
12. Special Executive Session, if necessary
13. Adjournment

7.10 Observers - Committee and work group meetings are open to observers. Executive Sessions are not. Requests to participate by Observers in a particular discussion are at the discretion of the Chairman.

7.11 Invited Guests – Guests may be invited to Committee or work group meetings to participate in discussions relating to the subject for which they were invited. Guests may be invited to Executive Sessions as described in 7.8. Requests to participate by Invited Guests in a particular discussion are at the discretion of the Chairman.

ARTICLE 8 - BALLOTS

8.1 Ballots - Recommendations for all actions shall be approved in accordance with Committee procedures and Robert’s rules.

8.2 Reporting procedures - The number of affirmative, negative, and abstaining ballots shall be properly reported to the Committee. The Recording Secretary shall record the actual votes on a roll call form (or equivalent media by spread sheet or equivalent by the Recording Secretary or Chairman for electronic or posted ballots) retained for a minimum of 2 years; the reporting in the minutes will only indicate summary numbers. All negative votes should be accompanied by reasons based on either technical or improper procedure considerations and should include suggested revisions. At the specific request of a meeting participant who voted on a subject, his/her name is to be recorded in the meeting minutes to indicate the vote.


ARTICLE 9 - REPORTS

9.1 Committee reports - The Recording Secretary (or Chairman as appropriate) shall submit minutes/reports to the Association's Criteria Council for each Committee meeting. Submittals for work groups will be by agreement of the Committee Chairman and Recording Secretary. Disputes are to be resolved by Committee majority vote with 50% quorum.

ARTICLE 10 - AMENDMENTS

10.1 Amendments to these Bylaws are normally proposed by a member at a regular meeting of the Committee. If practical, these Bylaws may be amended by a two-thirds vote of all members present and voting at any regularly scheduled Committee meeting, or by two-thirds of the members voting electronically or by mail in accordance with the balloting process outlined in Article 3. Where this is not practical, as determined by the majority of officers at a meeting, a letter (e-mail) ballot may be authorized by the Chairman with coordination of the Recording Secretary. Amendments shall be authorized by approval of two-thirds of the members in attendance at the Committee meeting or two-thirds of the members voting electronically or by mail with a minimum quorum of 50%.

10.2 Grammatical, typographical and spelling errors may be corrected by the majority of officers so long as no correction alters the intent of the affected Bylaw.

10.3 Amendments to these Bylaws shall be adopted utilizing the balloting process outlined in Article 3. Amendments must not conflict with the Association Bylaws. Amendments are subject to the approval of the Board of Directors.

10.4 Final drafts of the Committee Bylaws are sent to the Association Board for final approval through the Criteria Council for concurrence.

10.5 Upon final approval of the Board, the assigned staff representative (i.e. Recording Secretary) is responsible for distributing the revised Bylaws to the voting membership, non-voting members, and interim members on the Recording Secretary Roster at time of approval and new entries on the Committee Roster.

ARTICLE 11 - INDEMNIFICATION

11.1 The Association shall indemnify any member of the Committee who was or is a party or is threatened to be made a party to any proceeding (which shall include for the purposes of this article any threatened, pending, or completed action, or other proceeding whether civil, criminal, administrative, or investigative (other than an action by or in the right of the Association)) by reason of the fact that such person was or is an authorized member of the Committee against expenses (which shall include for purposes of this Article attorney's fees), judgments, fines, and amounts paid in settlement actually and reasonably incurred by such person in connection with such action or proceeding if such person acted in good faith and in a manner such person reasonably believed to be in, or not opposed to, the best interests of the Committee and, with respect to any criminal proceeding, had no reasonable cause to believe such person's conduct was unlawful.


ARTICLE 12 - COMPENSATION

12.1 Membership - Members of the Committee shall not receive any compensation for time spent for their voluntary services, but as coordinated through the Recording Secretary (and staff member to the Association), may be reimbursed for expenses associated with the Committee activities.

The American Association for Laboratory Accreditation Document Revised: November 9, 2010

P103c – Annex: Policy on Estimating Measurement Uncertainty for Electro-Mechanical Testing Labs


A2LA has compiled information regarding the classification of common test methods according to the A2LA Policy on Measurement Uncertainty for Testing Laboratories. Following is a list of example classifications, which has been reviewed by the A2LA Electro-Mechanical Advisory Committee. The list below is intended to provide examples of how the listed methods were typically categorized. These classifications are dependent on the particular circumstances in a laboratory, and may not apply in all cases. The fact that a method is listed below in Category I or II does not absolve a laboratory from compliance with 5.4.6.2 of ISO/IEC 17025 or the need to address measurement uncertainty. The list is organized by test area. Within each area, the methods are ordered first by Category (as defined in the Measurement Uncertainty Policy), then by method designation. In some disciplines, specific method numbers are not listed. This list will be updated periodically as more information is made available. Such updates may include changes in categories.

Following is a list of the test areas currently covered:
Electrical Tests and Electric Product Tests
Environmental Stress or Exposure (Military & Generic)
Electromagnetic Compatibility (EMC) & Telecom
Environmental Stress or Exposure (Automotive/Vehicle)
Automotive/Vehicle Vibration Testing
ETSI-based Radio Tests
Bluetooth RF Tests
SAR Testing
Product Safety Testing

Electrical Tests and Electric Product Tests
Test Method | Standard(s) | Category (I through V)
Capacitance | MIL-STD-202 Method 305 | I/II
Dielectric Withstanding Voltage | MIL-STD-202 Method 301, IPC-6012, IPC-6013, IPC-TM-650, GR-1217-CORE, MIL-STD 1344 Method 3001, Boeing D6-44588, Boeing D6-16050-4, UL746A, UL796, IEC 60243-1, JIS K6911, JIS C2110 | I/II
Dielectric Breakdown Voltage | ASTM 149 | I/II
Contact Resistance | MIL-STD-202 Method 307 | I/II
Contact Resistance | ASTM B539-80 | I/II

ATTACHMENT 4


Electrical Tests and Electric Product Tests (cont.)
Test Method | Standard(s) | Category (I through V)
DC Resistance | MIL-STD-202 Method 303 | I/II
DC Resistance | ASTM D 257 | I/II
Insulation Resistance | MIL-STD-202 Method 302, MIL-PRF-55110, MIL-STD 1344 Method 3003 | I/II
Insulation Resistance | MIL-STD-883 Method 1003 | I/II
Resistivity of Electrical Conductor Materials | ASTM B193 | I/II
Quality Factor | MIL-STD-202 Method 306 | I/II
Electromigration Resistance | GR-78-CORE | I/II
Contact Resistance | SAE AS13441; MIL-STD-202, Method 307, ASTM B539, GR-1217-CORE | I/II
Bonding & Grounding | Boeing D6-44588, Boeing D6-16050-4 R | I/II
Dry Arc Resistance Test | ASTM D495, JIS K6911 | I/II
Volume / Surface Resistivity | UL746A, ASTM D257, JIS K6911, JIS C6481, IEC 93 | I/II
Bare and Assembled Printed Wiring Boards | IPC-TM-650 | I/II

Environmental Stress or Exposure (Military & Generic)
Test Method | Standard(s) | Category (I through V)
Acceleration | MIL-STD-202 Method 212 | I/II
Acceleration | MIL-STD-810 Method 513.3 | I/II
Drop Test | ISTA | I/II
Drop Test | FED-STD-101 | I/II
Explosion | MIL-STD-202 Method 109 | I/II
Explosion | MIL-STD-810, Method 511.2 | I/II
High Impact Shock | MIL-STD-202 Method 207 | I/II
High Impact Shock | MIL-S-901 | I/II
High Temperature | MIL-STD-810 Method 5001.2, MIL-STD-1344 Method 1003, 1005, RTCA/DO160, GR-63-CORE, GR-1221-CORE, IEC 68-2-2, Test Ba | I/II
Humidity | MIL-STD-202 Method 103, RTCA/DO160 | I/II


Environmental Stress or Exposure (Military & Generic) (cont.)
Test Method | Standard(s) | Category (I through V)
Low Pressure (Altitude) | MIL-STD-810 Method 500.2 | I/II
Low Temperature | MIL-STD-810 Method 502.2 | I/II
Immersion | MIL-STD-202 Method 104, MIL-STD-810 Method 512; MIL-STD-883 Method 1002; RTCA-DO160 Section 10 | I/II
Moisture Resistance | MIL-STD-202 Method 106 | I/II
Mechanical Shock | MIL-STD-202 Method 213 | I/II
Mechanical Shock | MIL-STD-883 Method 2002, IEC 68-2-27, Test Ea, IEC 68-2-29, Test Eb, GR-1221-CORE, MIL-B-49430, MIL-B-49458, MIL-PRF-49471 | I/II
Mechanical Shock | MIL-STD-883 Method 1004.7 | I/II
Telecommunications Core Environmental Exposure | GR-63-CORE, ETS 300 019 | I/II
Particle Impact Noise Detection | MIL-STD-202 Method 217 | I/II
Sand and Dust | MIL-STD-202 Method 110, MIL-STD-810 Method 510, RTCA/DO160 Section 12 | I/II
Rain Test | MIL-STD-810 Method 506.3 | I/II
Resistance to Soldering Heat | MIL-STD-202 Method 210 | I/II
Resistance to Solvents | MIL-STD-202 Method 215 | I/II
Thermal Shock | MIL-STD-202 Method 107 | I/II
Vibration | MIL-STD-202 Method 201, Method 204, Method 214 | I/II
Vibration | MIL-STD-810 | I/II
Vibration | MIL-STD-883 Method 2026, Method 2007.2 | I/II
Vibration | MIL-STD-167, RTCA/DO160, ISTA Project 1 | I/II

Electromagnetic Compatibility (EMC) & Telecom
Test Method | Standard(s) | Category (I through V)
Radio disturbance characteristics | CISPR 12, CISPR 25, SAE J1113-41 | I/II
Radiated, RF field immunity | EN 61000-4-3, IEC 1000-4-3 | I/II


Electromagnetic Compatibility (EMC) & Telecom (cont.)
Test Method | Standard(s) | Category (I through V)
Radiated Emissions and Conducted Emissions | CISPR 11 (Edition 4 Amd 1 or earlier), 13, 14, 15, 20, 22 (Edition 5 or earlier), EN 55011, EN 55013, EN 55014, EN 55022, CNS: 13438, 13439, 13803, 13783, 14115, AS/NZS 1044, AS/NZS CISPR14.1, AS/NZS 3548, AS/NZS CISPR 22, AS/NZS 2064, AS/NZS CISPR11 (Edition 4.1 or earlier), SABS CISPR 22, SABS CISPR-11, CSA C108.8, CISPR 13, AS/NZS CISPR 13, CISPR 14, EN 55014, MIL-STD 461/462, CFR 47, Part 15 (using ANSI C63.4), CFR 47, Part 18 (using MP5); Boeing D6-16050-4, TIA/EIA-603 | I/II
Radiated Emissions | CISPR 22 (Edition 5.2) | V
Conducted Emissions | CISPR 11 (Edition 4 Amd 1 or earlier), 13, 14, 15, 20, 22 (Edition 5 or earlier), EN 55011, EN 55013, EN 55014, EN 55022, CNS: 13438, 13439, 13803, 13783, 14115, AS/NZS 1044, AS/NZS CISPR14.1, AS/NZS 3548, AS/NZS CISPR 22, AS/NZS 2064, AS/NZS CISPR11, SABS CISPR 22, SABS CISPR-11, CSA C108.8, CISPR 13, AS/NZS CISPR 13, CISPR 14, EN 55014, MIL-STD 461/462, CFR 47, Part 15 (using ANSI C63.4), CFR 47, Part 18 (using MP5); Boeing D6-16050-4, Boeing D6-4588, EN 60555 Part 2, EN 60555 Part 3 | I/II

Conducted Emissions | CISPR 22 (Edition 5.2) | V
EMC Product Family Standards | EN 50081-1, EN 50081-2, EN 50082-1, EN 50082-2, EN 61000-6-1, EN 61000-6-2, EN 50091-2, CISPR 24, EN 55024, EN 55103-1, EN 55103-2, EN 61326, EN 61547, EN 50130-4, EN 55104, EN 50083-2, 60601-1-2, IEC 1800-3, EN 61000-3-2, AS/NZS 61000.3.2, EN 61000-3-3, AS/NZS 61000.3.3, ETS 300 386-1, EN 60669-2-1, GR-1089-CORE, IEC 60728-1, EN 50083-2, ANSI/RESNA WC/Vol. 2 | I/II
Telecom Standards | FCC 47 CFR Part 68, CS-03, TIA/EIA TSB31, TIA/EIA-IS-968, TIA/EIA-IS-883, T1.TRQ.6-2001, AS/ACIF S002, AS/ACIF S016, AS/ACIF S031, AS/ACIF S038, AS/ACIF S043, TBR 1, TBR 2, ITU-T G.703, TBR 012, TBR 013, TBR 24, TBR 20, TS 002, TS 016, TS 030, TS 038, ETS 300 132-2, ETS 300 132-1, FTZ 1TR 09 | I/II

Electrical fast transient/burst immunity | EN 61000-4-4, IEC 1000-4-4, IEC 801-4 | I/II
Surge immunity | EN 61000-4-5, IEC 1000-4-5, IEEE C62.41, IEC 801-5 | I/II
Immunity to conducted disturbances | EN 61000-4-6, IEC 1000-4-6 | I/II
Canadian Radio Tests | All RSS and BETS standards available at the following URLs: RSS: http://www.ic.gc.ca/eic/site/smt-gst.nsf/eng/h_sf06129.html; BETS: http://www.ic.gc.ca/eic/site/smt-gst.nsf/eng/h_sf06125.html | I/II
Electrostatic discharge immunity | EN 61000-4-2, IEC 61000-4-2, RTCA/DO160; Boeing D6-16050-4, IEC 801-2 | I/II
47 CFR (FCC) | FCC Rules parts 2, 11, 15, 18, 21, 22, 24, 25, 27, 74, 80, 87, 90, 95, 97, 101 (using test methods of either ANSI C63.4 or EIA/TIA 603) | I/II
Overvoltage, Overcurrent, Surge | ITU-T K17, ITU-T K20, ITU-T K21, ITU-T K40, ITU-T K45 | I/II


Electromagnetic Compatibility (EMC) & Telecom (cont.)
Test Method | Standard(s) | Category (I through V)
Magnetic Field Immunity | EN 61000-4-8, Boeing D6-16050-4, RTCA/DO160, IEC 801-8 | I/II
Voltage dips, short interruptions and voltage variations | EN 61000-4-11, IEC 801-11 | I/II
Voltage Spike | RTCA/DO160 | I/II
Audio Frequency Conducted | RTCA/DO160 | I/II
Conduction and Coupling - Part 1 | ISO 7637-1 | I/II
Lightning Induced Transient Susceptibility | RTCA/DO160, Boeing D6-16050-4, Boeing D6-44588 | I/II
Conduction and Coupling – Part 2 | ISO 7637-2 | I/II
Vehicles - electrostatic discharge | ISO TR 10605 | I/II
Earth-moving machinery | ISO 13766 | I/II
Narrowband Radiated EM | ISO 11452-1 | I/II
Absorber-lined Chamber | ISO 11452-2 | I/II
Transverse EM mode cell | ISO 11452-3 | I/II
Bulk current injection (BCI) | ISO 11452-4 | I/II
Stripline | ISO 11452-5 | I/II
Parallel Plate | ISO 11452-6 | I/II
Direct Radio Frequency Power Injection | ISO 11452-7 | I/II
Control of EM Interference | MIL STD 461E | I/II
Vehicle EMC | SAE J551-1 | I/II
Vehicle component EMC | SAE J1113-1 | I/II
Conducted Immunity | SAE J1113-2 | I/II
Direct Injection of RF Power | SAE J1113-3 | I/II
Bulk Current Injection (BCI) | SAE J1113-4 | I/II
Conducted Transients on Power Leads | SAE J1113-11 | I/II
Conduction and Coupling Clamp | SAE J1113-12 | I/II
Immunity to Electrostatic Discharge | SAE J1113-13 | I/II
Immunity - Absorber-Lined Chamber | SAE J1113-21 | I/II
Magnetic Fields from Power Lines | SAE J1113-22 | I/II
Immunity - Stripline | SAE J1113-23 | I/II
Immunity - Wideband TEM Cell | SAE J1113-24 | I/II
Immunity – Triplate | SAE J1113-25 | I/II
AC Power Line Electric Fields | SAE J1113-26 | I/II


Electromagnetic Compatibility (EMC) & Telecom (cont.)
Test Method | Standard(s) | Category (I through V)
Immunity – Reverberation chamber | SAE J1113-27 | I/II
Conducted Transient Emissions | SAE J1113-42 | I/II
BMW Electromagnetic Compatibility | GS 95002 | I/II
Immunity - Reverberation Mode Tune | SAE J1113-28 | I/II
Caterpillar Electromagnetic Compatibility | EC-1 | I/II
Cummins Electromagnetic Compatibility | 14270 | I/II
DAF Conducted Requirements | BSL 0003-103 | I/II
DAF Radiated Requirements | BSL 0003-103 | I/II
DAF ESD | BSL 0003-103 | I/II
Daimler Chrysler EMC | PF9326 | I/II
Daimler Chrysler EMC | PF10540 | I/II
Daimler Chrysler EMC | DC 10614 | I/II
Fiat EMC | 990110 | I/II
Ford EMC | ES-XW7T-1A278-AB | I/II
Freightliner Performance Requirements | 49-00085 | I/II
Susceptibility to Conducted Transients | GM9105P | I/II
Susceptibility to ESD During Operation | GM9109P | I/II
Susceptibility - Bulk current injection | GM9112P | I/II
Susceptibility to Magnetic Fields | GM9113P | I/II
Radiated Emissions – Reverb | GM9114P | I/II
Conducted Transient Emissions | GM9115P | I/II
Conducted Sinusoidal Bursts | GM9116P | I/II
Jump Start / Reverse Battery | GM9117P | I/II
ESD Sensitivity During Handling | GM9119P | I/II
Radiated Electric Fields - Reverberation | GM9120P | I/II
EMC Component/Subsystem Requirements | GMW3097 | I/II
EMC Component/Subsystem Verification | GMW3100 | I/II
EMC Vehicle Requirement Specifications | GMW3091 | I/II
Electronic Component Environmental Tests | GMW3172GS | I/II
Harley Davidson EMC Testing | 22603 | I/II


Electromagnetic Compatibility (EMC) & Telecom (cont.)
Test Method | Standard(s) | Category (I through V)
Hyundai EMC Testing | ES 96200-00 | I/II
Mercedes EMC | EMV AV 7199 | I/II
Mercedes Vehicle EMC Testing | MBN10284 1 | I/II
Mercedes Component EMC Testing | MBN10284 2 | I/II
Peugeot South America | B21 7090 | I/II
Renault EMC | NC 2000 00 96 | I/II
Renault EMC | 3600808 | I/II
Toyota EMC Bench Test Method | TSC 7001G | I/II
Toyota EMC interference susceptibility | TSC 7006G | I/II
Toyota EMC test method | TSC 7026G | I/II
Toyota EMC radio noise interference | TSC 7508G | I/II
VW Short-Distance Interference Suppression | TL965 | I/II
VW Conducted Interferences | TL82066 | I/II
VW Radiated Interferences | TL82166 | I/II
VW Coupled Interference on Sensors | TL82366 | I/II
VW ESD | TL82466 | I/II

Environmental Stress or Exposure (Automotive/Vehicle)
Test Method | Standard(s) | Category (I through V)
Daimler Chrysler Environmental | PF9688 | I/II
Ford | Ford 00.00EA-D11 | I/II
Interior mounted modules | GM9123 | I/II
Gravelometer | SAE J400 | I/II
Radar object detection | MIL-STD285 | I/II
Testing methods for automotive electronic | JASO D001-87 | I/II
Heavy duty trucks | SAEJ1455 | I/II
Electronic Equipment | SAE J1211 | I/II
Electrical/Electronic Component | GMW3172 | I/II
Harley Davidson | GES 22601 | I/II
Toyota - Compact Disk Players | TSC 7513G | I/II


Automotive/Vehicle Vibration Testing
Standard(s) | Category (I through V)
GMI 12558 | I/II
GMW7293 | I/II
PF9688 | I/II
GM9123 | I/II
GMW3172 | I/II
GES 22601 | I/II
TSC 7513G | I/II
SAE J1211 | I/II
Ford 00.00EA-D11 | I/II

ETSI-based Radio Tests
All ETSI 300 series tests are considered Category V, and as such, measurement uncertainty calculations are expected.

Bluetooth RF Tests
Test Specification | Date | Section requiring MU
RF Test specifications | February 1, 2003 | Section 6.9
RF Provisional Test specifications | February 1, 2003 | Section 6.9

CTIA Tests
Test Specification | Date | Section requiring MU
Test Plan for Mobile Station Over the Air Performance | January 15, 2008 | Section 7

SAR Testing
Similar Measurement Uncertainty is expected to apply to IEEE 1528, EN 50360, EN 50361, AS/NZS 2772.1, Australian RF Exposure, and Australian Radio Communication (Electromagnetic Radiation – Human Exposure) Standard.

Product Safety Testing
Using the international industry guideline IECEE-CTL/153/INF, product safety test standards/specifications are considered Type I/II.

2010 by A2LA. All rights reserved. No part of this document may be reproduced in any form or by any means without the prior written permission of A2LA.

ATTACHMENT 5

The American Association for Laboratory Accreditation
P10x – Technical Consensus Decisions from the Electro-Mechanical Advisory Committee (EMAC)
Document Issued: MM/DD/YY

EMAC - A Summary of Critical Decisions

This document has been created and reviewed by the A2LA Electro-Mechanical Advisory Committee (EMAC). It provides a summary of consensus decisions voted on and approved by the EMAC and the A2LA Criteria Council for use by laboratories and assessors.

I. A2LA Requirements

1.) Labs can perform pre-compliance testing, as long as it is not represented as compliance testing and does not violate their own management system. (2001 EMAC Meeting)

2.) Regarding ESD testing, a lab does not need to perform a full calibration, but it does need to have some manner of checking function in place to verify the operation of the system. (2002 EMAC Meeting)

3.) If support/auxiliary equipment is provided by the client to the test laboratory, the laboratory is NOT responsible for the verification and/or the calibration of this support equipment. (2003 EMAC Meeting)

4.) For a test chamber with a limited search height, accreditation requirements would not be met unless a limitation is noted on the scope of accreditation. (2007 EMAC Meeting)

5.) NSA is not an in-house calibration and the requirements of Section T9 (of A2LA’s Traceability Policy) do not apply. (2008 EMAC Meeting)

6.) Within IEC 61000-4-3, it is agreed that uniform field measurement is not considered a calibration, and T9 is not required when utilizing properly calibrated equipment. (2008 EMAC Meeting)

7.) Within IEC 61000-4-6 it is agreed that the test signal level measurement is not considered a calibration, and T9 is not required when utilizing properly calibrated equipment. (2008 EMAC Meeting)

8.) Power Meters and Bandwidths - When a laboratory is being assessed for RF power measurements, the lab needs to know the bandwidth of the power head and power meter display unit as a system. Depending on the application, the bandwidth of the power measuring system needs to be calibrated (traceable). (2008 EMAC Meeting)

9.) Calibration of SAR reference dipoles is required (ISO 17025, sec 5.5.2/5.6.1). Calibrations not meeting A2LA’s Traceability Policy Requirements shall have a deficiency cited. (2010 EMAC Meeting)

II. External Organizational Matters (FCC, NIST, VCCI, AEMCLRP, etc.)

1.) FCC:

a. No deficiencies are to be referenced directly against the FCC checklist. Deficiencies are to be referenced to a specific requirement in the test method (i.e. C63.4 requirement, not “question X” on FCC checklist). (2009 EMAC Meeting)

b. If a deficiency that is cited in conjunction with the FCC checklist cannot be traced to a specific test method, "N" is to be marked on the checklist, and an explanation of why a deficiency was not cited is to be provided in the comments section of the checklist. (2009 EMAC Meeting)

c. A2LA will allow assessors to use their Assessor Master Code in lieu of their name for FCC checklist identification purposes. (2010 EMAC Meeting)




2.) VCCI:

a. Assessors shall clarify the frequency range for VCCI V-3 on scopes, and assessors shall verify SVSWR reports during the on-site assessment for labs testing above 1 GHz. Assessors shall notify laboratories prior to coming on site that the SVSWR reports (> 1 GHz) must be prepared for review; submitting the reports prior to the on-site visit would be beneficial. (2010 EMAC Meeting)

III. Specific Test Methods

1.) ANSI C63.4:

a. ANSI C63.4 requires verification of the turntable position and verification of the antenna height (at 1 and 4 m). The azimuth must be verified to less than 22.5 degrees when used in a non-continuous process. If the test report contains specific height and angle measurements, the lab must have adequate verification of those numbers. (2003 EMAC Meeting)

b. Assessors will not cite a deficiency against ANSI C63.4-2009 clause 8.3.2.2 in reference to “keeping the EUT in the cone of radiation” if the antenna is not bore-sighted during the measurement. (2010 EMAC Meeting)

2.) CISPR 16:

a. Laboratories must provide objective evidence that their LISN meets the requirements of CISPR 16-1-1/2 for isolation, and the requirements of CISPR 16-1-4 for voltage drops. Data sheets are acceptable for this purpose. (2010 EMAC Meeting)

3.) CISPR 22:

a. Test Site Validation Above 1 GHz Using SVSWR Measurement - CISPR 22, Version 5.2 clearly states that test site validation above 1 GHz is required. The committee was in agreement; as such, laboratories should be validating the test site and would be expected to provide this information for review during on-site assessments. (2008 EMAC Meeting)

4.) MIL-STD-461:

a. An EMC bond requires low DC resistance (2.5 milliohms) and low RF impedance (length to width ratio of 5:1) according to MIL-STD-461. (2003 EMAC Meeting)

5.) IEC 61000-4-2:

a. It was agreed that no deficiencies would be cited against a laboratory that does not use a 1 GHz instrument to verify the ESD equipment (per EN 61000-4-2). (2001 EMAC Meeting)


ATTACHMENT 6

FCC "Spot Check" Clarification

FOR FCC Authorized Test Firms located in the USA or in countries holding an MRA with the USA:

FOR LABORATORIES: Please note that if your lab seeks to be recognized by the FCC as an Accredited Test Firm for Declaration of Conformity (DOC) and Certification testing, you must undergo an A2LA assessment which incorporates the use of the C216 – FCC Technical Evaluation Checklist. As part of this assessment, the assessor will expect that a member of your laboratory's staff will be on-site in order to reproduce your site attenuation measurements under accredited conditions. If a sub-contractor was used to perform previous NSA measurements at your facility, it is insufficient to have the sub-contractor on-site to act on your laboratory's behalf (pursuant to guidance from the FCC). A member of the laboratory must be able to reproduce the results. In addition, if you elect to forego an assessment which utilizes the C216 checklist, A2LA will be unable to register your facility with the FCC, and you will then need to register directly through a "2.948 listing" (whereby DOC testing is excluded).

FOR ASSESSORS: Please note that if the lab seeks to be recognized by the FCC as an Accredited Test Firm for Declaration of Conformity (DOC) and Certification testing, they must undergo an A2LA assessment which incorporates the use of the C216 – FCC Technical Evaluation Checklist. As such, please be sure to advise the lab prior to going on-site that a member of their staff must be on-site in order to reproduce the site attenuation measurements under accredited conditions. If a sub-contractor was used to perform previous NSA measurements at the facility, it is insufficient to have the sub-contractor on-site to act on the laboratory's behalf (pursuant to guidance from the FCC). A member of the laboratory must be able to reproduce the results. In addition, please also advise the lab that if they elect to forego an assessment which utilizes the C216 checklist, A2LA will be unable to register their facility with the FCC, and they will then need to register directly through a "2.948 listing" (whereby DOC testing is excluded).


ATTACHMENT 7

Quality Control and Proficiency Test Plan for Accredited Test Laboratories

User and/or Laboratory: [Please replace with user data here] | Division/Location:
User Identification: [Please replace with user data here]
Original Author/Source, Revision, Date: Larry Gradin/Integrity Solutions Group, Inc., ISG-RD1007-R07, 2006/12/11, modified and accepted by the lab indicated above
User Doc. #: [Please replace with user data here]

Specific Modifier/Preparer | Checked and Approved | Date (dd/mm/yyyy) | Rev
XXXXX YYYYY | XXXXX YYYYY | dd/mm/yyyy |

Integrity Solutions -- "Where integrity, attention to detail, cost-effective action, and commitment assure quality performance"

[Guideline for (see note below)]

G105 - Quality Control and Proficiency Test Plan
for
ISO/IEC 17025-2005 Accredited Electrical, Product Safety, EMC, Environmental Exposure, General Mechanical, and Similar Test Laboratories

By Integrity Solutions Group, Inc.
6419 Bridgewood Terrace, Boca Raton, FL 33433 USA
Phone: 561-289-9137  Efax: 978-285-6589
Web Page: http://www.integrity-solutions.org  Email: [email protected]

[Note: When this Plan is adopted, please replace with appropriate user data above, remove the "Guideline for" indication, and replace the guidance expressed by the term "should" with "shall" to indicate actual organization review and adoption of the guidance as an organization commitment, if made. See footnote 1 for guidance on converting this document from a guidance document to an adopted requirements document.]

Note: This document is meant to be reasonably complete as a working document, but as a generic document it cannot reflect the unique requirements of the User or policies that may be mandatory from a procuring organization or third party. It is expected and prudent that some tailoring of this document to reflect the specific User conditions and needs will occur, including deletion of this instructive note. This document is available for use (as a foundation document) based on clear attribution/credit in user documentation to Integrity Solutions Group, Inc. and the author (e.g., a statement such as, "this current specification is based on a modification of the Integrity Solutions Group, Inc. Technical & Quality Specification to Assure Competent Calibration Services by Larry Gradin"). Other guidance documents are available from Integrity Solutions Group for competence in test, calibration, uncertainty estimates, and service acceptance. Refer to http://www.integrity-solutions.org.

© Integrity Solutions Group, Inc. 2006

TABLE OF CONTENTS

Cover (Approval) Sheet
Table of Contents
1.0 Purpose and Introduction
2.0 Scope
3.0 References
4.0 Definitions and Terminology
5.0 Quality Control and Proficiency Testing Plan
6.0 Required Tests And Documentation
7.0 Reporting
8.0 Change Record


1.0 Purpose and Introduction

This (guidance document¹ for a) documented Quality Control and Proficiency Test Program plan is designed to satisfy the requirements of ISO/IEC 17025 (Reference 3.1), paragraph 5.9, "Assuring the Quality of Test and Calibration Results," and the A2LA Proficiency Testing (PT) Requirements (Reference 3.2) by way of a Quality Control (QC) and PT Plan. A combination of participation in a formal PT program [the program (should be) evaluated as acceptable and the provider (should) be an accredited PT provider, as a minimum], internal performance-based data collection, calibration records, preventative maintenance plans, and personnel training and competence verification (should) be used to satisfy this requirement. Great care (should) be taken to utilize existing recorded data from standard laboratory practices to ensure a cost-effective and value-added procedure. Also (to be) considered are the potential needs for intermediate checks to maintain confidence in calibration status per ISO/IEC 17025 clause 5.5.10.

According to ISO/IEC 17025 (Reference 3.1) and ILAC (Reference 3.9), a laboratory shall have quality control procedures for monitoring the validity of tests and calibrations undertaken. This monitoring may include participation in interlaboratory comparisons or proficiency testing programs, but also other means (e.g., the regular use of certified reference materials, or replicate tests or calibrations using the same or different methods). By these means a laboratory can provide evidence of its competence to its clients and the accreditation body. Guidance from ILAC (Reference 3.9) includes the following:

Note 1: It is recognized that there are particular areas where proficiency testing is just not practical or does not exist. Proficiency testing activities are considered to be a powerful and effective tool to determine the performance of individual laboratories for specific tests or measurements and to monitor laboratories' continuing performance.

Note 2: Appropriate PT activities include any ILC or measurement audit which monitors the laboratory's performance, for example those conducted by national or regional accreditation bodies or co-operations, government, industry, or commercial providers of formal PT schemes.

1 The term "should" is used to indicate those provisions which, although they constitute guidance for the application of the requirements, are expected to be adopted by an organization. The term "shall" or equivalent is expected to be used throughout this document upon review and organization adoption, reflecting the requirements for QC/PT, and those provisions are then expected to become mandatory. The words that are provided as a recommended direction and that should be removed are shown in parentheses surrounding blue bold italic text.


2.0 Scope

This plan (should) include the tests or test methods (which may be grouped on a common sub-discipline basis as described in clause 4.3 of this plan) that are included in the scope of accreditation. This controlled document (should) include the current Accreditation Body certification numbers for the accredited laboratory(ies), a list of applicable documents, applicability to lab and other organization personnel, scheduling of testing, quality control and proficiency test methods, and appendices that provide additional information for conducting the defined program.

3.0 References

3.1 ISO/IEC 17025-2005, "General requirements for the competence of testing and calibration laboratories"

3.2 A2LA Proficiency Testing Requirements for Accredited Testing and Calibration Laboratories, September 2005

3.3 ISO/IEC 17011:2004(E), "Conformity assessment – General requirements for accreditation bodies accrediting conformity assessment bodies"

3.4 ILAC P1:2003, "ILAC Mutual Recognition Arrangement: Requirements for Evaluation of Accreditation Bodies by ILAC Recognized Regional Co-operations"

3.5 Automotive EMC Laboratory Recognition Program, Revision 4, January 27, 2006

3.6 ISO/IEC Guide 43-1:1997, Proficiency testing by interlaboratory comparisons - Part 1: Development and operation of proficiency testing schemes

3.7 ISO/IEC Guide 43-2:1997, Proficiency testing by interlaboratory comparisons - Part 2: Selection and use of proficiency testing schemes by laboratory accreditation bodies

3.8 ASTM E1301-95, Standard Guide for Proficiency Testing by Interlaboratory Comparisons

3.9 ILAC P9:2005, "ILAC Policy for Participation in National and International Proficiency Testing Activities"


3.10 A2LA "General Requirements for Accreditation of Laboratories", dated August 2005

3.11 A2LA "Conditions For Accreditation (based on ISO/IEC 17011)"

4.0 Definitions and Terminology

4.1 Quality Control and Proficiency Testing: Within the context of this Quality Control (QC) and Proficiency Test Plan, this is the process to determine or confirm reasonable proficiency or competence in testing or measurement by various means. Such means include, but are not limited to:

- Inter-laboratory Comparison² or Proficiency Testing³
- Competence Audits
- Alternate Measurement Means to Confirm a Measurement or Test
- Performance-Based Evaluations, Checks, Performance History⁴, or Data Collection
- Technical Activity Audits
- Regular Use of Certified Reference Materials
- Internal Quality Control
- Replicate Tests or Calibrations Using the Same or Different Methods
- Retesting or Re-Calibration of Retained Items
- Correlation of Results for Different Characteristics of an Item

4.2 Performance Based Activities

Performance-Based Activities focus on activities that are significant to the achievement of a requirement or objective, including the cost-effectiveness of a desired performance. Performance-Based Activities can take the form of evaluations, checks, additional test monitoring, technical assessments, results correlation, data collection, or performance history. The terminology is usually associated with activities that are not formal inter-laboratory comparisons or proficiency testing.

2 ISO/IEC Guide 43-1 (Reference 3.6) and ILAC P9:2005 (Reference 3.9) define interlaboratory comparisons as the "organization, performance and evaluation of tests on the same or similar test items by two or more laboratories in accordance with predetermined conditions".

3 ILAC P9:2005 (Reference 3.9) defines proficiency testing (PT) as "the determination of the calibration or testing performance of a laboratory or the testing performance of an inspection body by means of interlaboratory comparison".

4 Performance History is specifically allowed and is part of the AEMCLRP method of demonstrating proficiency in EMC testing (Reference 3.5, clause 5.4.1).


4.3 Major Sub-Disciplines Of Test Technology Or Methods or Processes

Test standards or processes are very often very similar whether they appear in commercial standards, national methods, military standards, automotive standards, etc. The Accreditation Body uses the term sub-discipline, and a sub-discipline (should) be conservatively considered in this plan to be each separate categorization of a test type in the scope of accreditation.

5.0 Quality Control and Proficiency Testing Plan

5.1 Introduction and Background

The American Association for Laboratory Accreditation (A2LA), which is our Accreditation Body, consistent with the International Laboratory Accreditation Cooperation publications ILAC P1:2003, "ILAC Mutual Recognition Arrangement: Requirements for Evaluation of Accreditation Bodies by ILAC Recognized Regional Co-operations" (Reference 3.4), and ILAC P9:2005, "ILAC Policy for Participation in National and International Proficiency Testing Activities" (Reference 3.9), imposes on its accredited laboratories a Proficiency Testing and Performance Based Activities program. Proficiency testing, consistent with the A2LA Proficiency Testing Requirements for Accredited Testing and Calibration Laboratories (Reference 3.2), may include the various activities included in the definitions in clause 4 above.

It must be noted that ILAC P1:2003 clause 5.3.1 (Reference 3.4) clearly states that "It is recognized that there are particular areas where proficiency is just not practical". Such areas (should generally) include most EMC, Mechanical, Electrical, and Environmental Exposure test methods on a scope of accreditation, which (should) be supported by the practical approaches of this plan.

The international standard for Accreditation Bodies (Reference 3.3), ISO/IEC 17011:2004, "Conformity assessment – General requirements for accreditation bodies accrediting conformity assessment bodies", first edition, clause 7.15.3, states:

"The accreditation body shall ensure that its accredited laboratories participate in proficiency testing or other comparison programs, where available and appropriate, and that corrective actions are carried out when necessary. The minimum amount of proficiency testing and the frequency of participation shall be specified in cooperation with interested parties and shall be appropriate in relation to other surveillance activities.


NOTE 1: It is recognized that there are particular areas where proficiency testing is impractical”

Allowance for various means to accomplish the purpose of proficiency testing has also been in ISO/IEC Guide 43-2 (Reference 3.7), Clause 1, Scope, since 1997, as follows:

It should be recognized, however, that laboratory accreditation bodies and their assessors may take into account the suitability of test data produced from other activities apart from proficiency testing schemes. This includes results of laboratories' own internal quality control procedures with control samples, comparison with split-sample data from other laboratories, performance of audit tests with certified reference materials, etc.

A2LA requires appropriate PT efforts in its General Requirements (Reference 3.10, Part C, VII). Further detail in its PT Requirements (Reference 3.2) mandates that laboratories suitably implement the requirements and have a documented plan of how to cover the applicable program requirements or the major sub-disciplines and materials/matrices/product types on their scope of accreditation over a four-year period. This plan covers any commercially available participation and any intra- and/or inter-laboratory organized studies, as applicable. The laboratory must also be able to explain when proficiency testing is not possible for certain testing, and this explanation must be included in the plan.

When inter-laboratory comparisons (ILCs) or internal/external round robin programs are not available or relevant to the scope of accreditation, internal performance-based data in accordance with clause 5.9 of ISO/IEC 17025 are acceptable. Internal performance-based checks (or data or evaluations) include (but are not limited to) the following types of activities: regular use of certified reference materials and/or internal quality control using secondary reference materials; replicate tests or calibrations using the same or different methods; retesting or re-calibration of retained items; and correlation of results for different characteristics of an item. Such internal performance-based data activities are included in clause 6 of this document.

5.2 Timing and PT/QC Scheduling

Consistent with AB requirements (Reference 3.2), unless otherwise specified within this document and accepted by the AB, at a minimum, proficiency testing participation (should) be required for at least two proficiency-testing activities per year, every year. Laboratories with fewer than 4 sub-disciplines on their scope are required to participate in at least one proficiency testing activity per year, every year. Several programs (e.g., Environmental, Construction Materials, Food, Automotive EMC) specify the type and frequency of proficiency tests required. For example, Automotive EMC (Reference 3.5) has PT and Performance History on a two (2) year cycle.
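As an illustration of these scheduling minimums, the short Python sketch below checks a draft multi-year PT plan against the per-year minimums quoted above and the four-year sub-discipline coverage expectation of Reference 3.2. The plan structure, the sub-discipline names, and the function name are hypothetical examples, not part of the A2LA requirements or this template.

    # Hypothetical sketch: check a draft PT schedule against the minimums described in
    # clause 5.2 (two PT activities per year, or one per year if the scope has fewer than
    # four sub-disciplines) and the four-year coverage expectation of Reference 3.2.
    def check_pt_plan(plan, scope_subdisciplines, window_years=4):
        """plan maps year -> list of (sub_discipline, activity) tuples."""
        findings = []
        minimum_per_year = 2 if len(scope_subdisciplines) >= 4 else 1

        for year, activities in sorted(plan.items()):
            if len(activities) < minimum_per_year:
                findings.append(f"{year}: only {len(activities)} PT activities planned "
                                f"(minimum {minimum_per_year})")

        covered = {sub for activities in plan.values() for sub, _ in activities}
        missing = set(scope_subdisciplines) - covered
        if missing and len(plan) >= window_years:
            findings.append(f"not covered within {window_years} years: {sorted(missing)}")
        return findings

    # Example draft plan (illustrative sub-disciplines and activity types only)
    draft = {
        2011: [("Radiated Immunity", "AEMCLRP artifact"), ("Vibration", "internal ILC")],
        2012: [("ESD", "waveform submission"), ("Temperature Cycling", "internal QC")],
        2013: [("Conducted Emissions", "commercial PT")],
        2014: [("Salt Fog", "split-sample comparison"), ("Radiated Emissions", "reference radiator")],
    }
    scope = ["Radiated Immunity", "ESD", "Vibration", "Temperature Cycling",
             "Conducted Emissions", "Radiated Emissions", "Salt Fog"]
    for finding in check_pt_plan(draft, scope):
        print(finding)

Running the sketch on the example plan flags 2013, which schedules only one activity against a two-per-year minimum; the coverage check passes because every example sub-discipline appears at least once in the four-year window.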

In fields without specific requirements, all accredited laboratories must participate in relevant and available proficiency testing at a frequency sufficient to ensure that all major sub-disciplines and materials/matrices/product types (as defined in each section) on the scope of accreditation are covered over a four-year period (Reference 3.2). The laboratory submittal to A2LA is dependent on lab status (e.g., new applicant for accreditation or a currently accredited lab) and other data such as frequency of submittal per the A2LA requirements (Reference 3.2, clauses E, F, and H). (In addition, the laboratory should include specific PT programs for certain disciplines [Biological, Chemical, Electrical, Construction Materials, Geotechnical] consistent with the A2LA Policy [Reference 3.2] and the actual scope of accreditation.)

Note that this plan will be reviewed by the A2LA assessor during on-site assessments and submitted to A2LA. Per the PT Requirements (Reference 3.2) and the A2LA Conditions for Accreditation (Reference 3.11), laboratories are obliged to inform A2LA of any changes to this plan. The reporting of results to A2LA is described in Clause 7, including corrective action responses for any unacceptable results, with the corrective action record submitted to A2LA.

5.3 Generic Quality Control Activities to Assure Proficiency

The following activities reflect the good quality practices that (should) be used by the lab on a generic basis to enhance quality and proficiency. The competence through internal performance-based data program includes:

- Adequacy of test setups, and alignment of test setups consistent with test plan requirements;
- Assurance that all changes are fully documented with justification in the appropriate test documentation and lab reports;
- Conducting proficiency tests of lab test personnel;


- Review of capability: that the laboratory possesses the necessary physical, personnel, and information resources, and that the laboratory's technical personnel have the skills and expertise necessary for the performance of the tests and/or calibrations in question. The review may also encompass results of earlier participation in inter-laboratory comparisons or proficiency testing and/or the running of trial tests using samples or items of known physical characteristics in order to determine uncertainties of measurement, limits of detection, confidence intervals, etc.

- Personnel performing tasks requiring special skills are qualified prior to performing the work. Such personnel are qualified/certified based on the demonstrated proficiency of each candidate and periodically thereafter to maintain skills to meet required practice.

- The technical manager or supervisor audits the performance of the qualified personnel at least every two years. As a result of this audit, an employee can be disqualified from doing accredited work, or be required to be retrained and re-qualified, if the number of errors and/or problems with quality warrants such action.

- The selected methods are appropriate for the type and volume of the work undertaken.

- Proficiency testing sample efforts or measurement audits for the routine staff qualified by the accredited laboratory (i.e., ISO/IEC 17025 clause 5.2). Laboratories shall conduct tests and efforts for PT and QC in accordance with the lab's normal accredited testing/calibration and reporting procedures, unless otherwise specified in the instructions from the proficiency test provider. Laboratories shall also reasonably ensure that proficiency testing samples are equally distributed among personnel trained and qualified for the relevant tests.


6.0 Required Tests And Documentation

(Guidance: These are possible guidance activities for various sub-disciplines. The lab's actual organization should review them for appropriateness, adopt the guidance as appropriate, and commit to a documented plan as it deems appropriate. The data herein may be supplemented with data tables and exhibits to make this procedure more useful, as determined by the user. See footnote 1 for guidance on converting this document from a guidance document to an adopted requirements document. Note: this list may appear large, but it does not cover all sub-disciplines; the lab is responsible for completeness. Recommendations for enhancement to assure consistency are welcome.)

PROFICIENCY TESTING REQUIREMENTS FOR INDIVIDUAL AEMCLAP/AEMCLRP⁵ TESTS (if applicable)

6.1 Radiated Immunity competence is demonstrated through the AEMCLRP Program (Reference 3.2), appropriate Appendix. This program may require Artifact testing every 2 years, with the data submitted to A2LA. System verification data may also be submitted when periodic checks are made in the time period between the 2-year intervals. Equipment calibration records and employee training records shall also be available for inspection.

6.2 Bulk Current Injection competence is demonstrated through the AEMCLRP Program (Reference 3.2), appropriate Appendix. This program may require Artifact testing every 2 years, with the data submitted to A2LA. System verification data may also be submitted when periodic checks are made in the time period between the 2-year intervals. Equipment calibration records and employee training records shall also be available for inspection.

6.3 Direct Injection competence is demonstrated through the AEMCLRP Program (Reference 3.2), appropriate Appendix. This program may require Artifact testing every 2 years, with the data submitted to A2LA. System verification data may also be submitted when periodic checks are made in the time period between the 2-year intervals. Equipment calibration records and employee training records shall also be available for inspection.

5 The Accreditation Body performs the Accreditation effort for the Automotive EMC Program which is designated as the Automotive EMC Laboratory Accreditation Program or AEMCLAP. DaimlerChrysler (DC), Ford Motor Company (Ford) and General Motors Corporation (GM) provide overall recognition as part of the Automotive EMC Laboratory Recognition Program or AEMCLRP. Consequently, both the terms AEMCLAP and AEMCLRP are used in different phases of the overall program.


6.4 Radiated Emissions (CISPR 25) competence is demonstrated through the AEMCLRP Program (Reference 3.2), appropriate Appendix. This program may require Artifact testing every 2 years, with the data submitted to A2LA. System verification data (taken with a Reference Radiator) and chamber quiet sweeps may also be submitted when periodic checks are made in the time period between the 2-year intervals. Equipment calibration records and employee training records shall also be available for inspection.

6.5 Electrostatic Discharge (ESD) competence is demonstrated through the AEMCLRP Program (Reference 3.2), appropriate Appendix. This program requires the submission of waveforms to verify the ESD events every 2 years, with the data submitted to A2LA. ESD verification is done prior to each ESD test per xxx [Please replace with user data here]. This data is included with each test report and is available for inspection. Equipment calibration records and employee training records shall also be available for inspection.


OTHER QUALITY CONTROL AND PROFICIENCY TESTING (PERFORMANCE BASED) REQUIREMENTS

6.6 Fluids Susceptibility competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.7 Humidity Exposure competence is demonstrated through internal performance-based data. The test chambers are of high quality and capable of maintaining an environment of approximately +/- 5% RH. An independent chart recorder verifies the chamber temperature throughout the test; in addition, an independent temperature controller is used to verify temperature. This chart is included with each test report. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.8 Mechanical Shock competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.9 Rain competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.10 Temperature Cycling/Exposure competence is demonstrated through internal performance-based data. The test chambers are of high quality and capable of maintaining an environment of approximately +/- 2 °C. An independent chart recorder verifies the chamber temperature throughout the test; in addition, an independent temperature controller is used to verify temperature. This chart is included with each test report. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.


6.11 Thermal Shock competence is demonstrated through internal performance-based data. The test chambers are of high quality and capable of maintaining an environment of approximately +/- 2 °C. An independent chart recorder verifies the chamber temperature throughout the test; in addition, an independent temperature controller is used to verify temperature. This chart is included with each test report. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.12 Dewing competence is demonstrated through internal performance-based data. The test chambers are of high quality and capable of maintaining an environment of approximately +/- 1.0 °C and +/- 5.0% RH. An independent chart recorder verifies the chamber temperature throughout the test. The test specifications only require the temperature and humidity to be within +/- 2.0 °C and +/- 5.0% RH, respectively. This chart can be found in the corresponding test folder. Reports, equipment calibration records, preventative maintenance records, and management test report approval shall be available for inspection.

6.13 Salt Fog competence is demonstrated through internal performance-based data, such as the quality checks (including pH verification, temperature verification, and condensing rates) that are inherent to the basic industry test method ASTM B-117. Reports, equipment calibration records, preventative maintenance records, and management test report approval support this plan. Equipment calibration records and management test report approval shall also be available for inspection.

6.14 Dust Exposure competence is demonstrated through internal performance-based data, such as the quality checks (including compliance with dust flow or accumulation, air flow, and concentration measurements) inherent in the method described in SAE J1211. Reports, equipment calibration records, preventative maintenance records, and management test report approval shall be available for inspection.

6.15 Vibration competence is demonstrated through internal performance-based data. The vibration is controlled via an accelerometer mounted to the shaker. A second, independent accelerometer records the actual vibration profile, and this profile is included with each test report. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.16 Fungus competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.


6.17 Hosedown competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.18 Airborne Contaminants calibration is required before each test using a specified circuit board. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.19 Impact competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.20 Altitude/Pressure competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.21 Dust: the dust chamber is built to the manufacturer's specifications. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.22 Backfire Simulation competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.23 HALT competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.24 Acceleration: the "G" level is calculated using a mathematical formula based on the distance of the unit from the center of the arm. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.
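For reference, the relation commonly used for this calculation is sketched below. The symbols are supplied for illustration only, since the clause above does not state the formula: r is the distance of the unit from the center of the arm in meters, n is the arm speed in revolutions per minute, and g0 = 9.80665 m/s² is standard gravity.

    % Illustrative only: a commonly used centrifuge relation, not quoted from this plan.
    \[
      G \;=\; \frac{\omega^{2} r}{g_{0}},
      \qquad
      \omega \;=\; \frac{2\pi n}{60}
    \]

As a worked example under these assumed symbols, r = 1.5 m at n = 100 rpm gives omega of about 10.47 rad/s and a G-level of roughly (10.47² × 1.5) / 9.80665, or about 16.8 g.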


6.25 Explosion competence is demonstrated through internal performance-based data. The explosion mixture used for this test is ignited to verify that the explosive mixture is present in the chamber. Charts proving this occurred are included in each test report. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.26 Earthquake competence is demonstrated through internal performance-based data. The vibration is controlled via an accelerometer mounted to the shaker. A second, independent accelerometer records the actual vibration profile, and this profile is included with each test report. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.27 Firearms Resistance competence is demonstrated through internal performance-based data. Standard ammunition is used at a specified distance. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.28 Fire Test: this test requires proof of calibration of the flame not only before exposure of the unit but also immediately after, without extinguishing the burner. This data is included in each test report. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection. Charts proving this occurred are included in each report.

6.29 Flammability competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.30 Fire Resistance competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.30 NEBS Fire Spread: calibration of the flame is required prior to each test, and calibration of the oxygen sensor is required prior to each test. This data is included in each report. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.


6.31 Icing/Freezing Rain competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.32 Acoustic Noise competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.33 Conducted Emissions, Audio Frequency: In addition to regular calibration of the equipment, competence is demonstrated through "System Checks" that are performed and documented prior to every MIL-STD-461 test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.34 Conducted Emissions, Radio Frequency: In addition to regular calibration of the equipment, competence is demonstrated through "System Checks" that are performed and documented prior to every MIL-STD-461 test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.35 Conducted Emissions, Transient: In addition to regular calibration of the equipment, competence is demonstrated through "System Checks" that are performed and documented prior to every MIL-STD-461 test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.36 Conducted Susceptibility, Audio Frequency: Competence is demonstrated through performance-based data. Calibration of the test setup is performed and documented prior to every MIL-STD-461 test. Applied test levels in terms of power and voltage are recorded and documented for every test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.


6.37 Conducted Susceptibility, Radio Frequency: Competence is demonstrated through performance-based data. In addition to the calibrations that are performed and documented prior to every MIL-STD-461 test, a "System Check" is also performed to verify the accuracy of the calibration. This is done by running a simulated test, using the just-collected calibration data and the actual test software, with the injection and monitoring probes installed in calibration fixtures. This "System Check" data is also documented. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.38 Conducted Susceptibility, Transient: Competence is demonstrated through performance-based data. Calibration of the test setup is performed and documented prior to every MIL-STD-461 test. Applied test levels in terms of current and voltage are recorded and documented for every test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.39 Radiated Emissions, Magnetic Field: In addition to regular calibration of the equipment, competence is demonstrated through "System Checks" that are performed and documented prior to every MIL-STD-461E test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.40 Radiated Emissions, Electric Field: In addition to regular calibration of the equipment, competence is demonstrated through "System Checks" that are performed and documented prior to every MIL-STD-461E test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.41 Radiated Susceptibility, Audio Frequency: Competence is demonstrated through performance-based data. Calibration of the test setup is performed and documented prior to every MIL-STD-461 test. Applied test levels in terms of magnetic field strength are recorded and documented for every test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.42 Radiated Susceptibility, Radio Frequency: Competence is demonstrated through performance-based data. Calibration of the test setup is performed and documented prior to every RTCA/DO-160 test. Applied test levels in terms of electric field strength, along with applied RF forward power, are recorded and documented for every test. MIL-STD-461 requires "real-time" monitoring and leveling of the applied electric field strength, and this data is recorded and documented, along with applied RF forward power, for every test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.
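As an illustration of what such real-time leveling involves, the Python sketch below adjusts RF forward power until a field-probe reading matches the target level. The function and instrument-interface names are hypothetical and are not taken from MIL-STD-461, RTCA/DO-160, or this plan.

    # Illustrative sketch only: software leveling of an applied E-field during a radiated
    # susceptibility dwell. The instrument interface (read_field_probe, set_forward_power,
    # read_forward_power) is hypothetical.
    import math

    def level_field(target_v_per_m, read_field_probe, set_forward_power,
                    read_forward_power, start_power_w=1.0, tolerance_db=0.5,
                    max_steps=20):
        """Adjust forward power until the measured field is within tolerance_db of the
        target; return a (step, forward_power_w, field_v_per_m) log for the test record."""
        power = start_power_w
        log = []
        for step in range(max_steps):
            set_forward_power(power)
            field = max(read_field_probe(), 1e-9)  # guard against a zero probe reading
            log.append((step, read_forward_power(), field))
            error_db = 20.0 * math.log10(field / target_v_per_m)
            if abs(error_db) <= tolerance_db:
                break
            # Field strength scales roughly with the square root of forward power,
            # so scale power by the square of the remaining field ratio.
            power *= (target_v_per_m / field) ** 2
        return log

In practice, a log of this kind (dwell step, forward power, and measured field) is the sort of data that is recorded and documented for every test as described above.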

6.43 Radiated Susceptibility, Transient. Competence is demonstrated through performance-based data. Calibration of the test setup is performed and documented prior to every MIL-STD-461 test. Applied test levels in terms of current and/or voltage are recorded and documented for every test. In addition, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.44 Electrostatic Discharge (ESD). In addition to regular calibration of the equipment, competence is demonstrated through documented training of test personnel and by a documented supervisor review of test setups prior to testing. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.45 Knob Rotation, Knob Abuse Test. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.46 Pushbutton Durability, Switch Mounting Durability. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.47 Visual Inspection. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.49 Solar Test Exposure. Competence is demonstrated through an inter-laboratory Total Normalized Radiation (TNR) comparison between the GM Desert Proving Ground Solar Exposure Laboratory in AZ (or equivalent) and xxx every 2 years (the xxx result must not be more than 12% below the AZ result); calibration records and equipment verification data shall be available for inspection.
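A minimal sketch of the 12% acceptance check described above is given below; it is illustrative only, and the function name and the example TNR values are assumptions rather than part of the plan.

    # Illustrative sketch of the Section 6.49 acceptance check: the participating
    # lab's Total Normalized Radiation (TNR) must not fall more than 12% below the
    # reference laboratory's result. Values below are hypothetical.
    def tnr_comparison_ok(lab_tnr, reference_tnr, max_low_fraction=0.12):
        """True if lab_tnr is within 12% (low side) of reference_tnr."""
        return lab_tnr >= reference_tnr * (1.0 - max_low_fraction)

    if __name__ == "__main__":
        reference_tnr = 250.0  # e.g., AZ reference laboratory result (MJ/m^2)
        lab_tnr = 228.0        # participating laboratory result (MJ/m^2)
        print("Comparison acceptable:", tnr_comparison_ok(lab_tnr, reference_tnr))  # True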

6.50 Color Evaluation. Competence is demonstrated through internal reports, equipment calibration records, and employee specialized qualifications and training records.

6.51 Gloss Evaluation. Competence is demonstrated through internal reports, equipment calibration records, and employee specialized qualifications and training records.

6.52 Squeak and Rattle / Audible Noise. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.53 Accelerated Environmental Aging. Competence is demonstrated through internal performance-based data. The test chambers are of high quality and are capable of maintaining the test environment within approximately +/- 2 degrees Celsius. A monitoring thermocouple verifies the chamber temperature throughout the test. This data is stored on the chamber PC or on a chart recorder associated with each chamber. Reports, equipment calibration records, preventative maintenance records, and management test report approval shall be available for inspection.
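The sketch below shows one way a post-test review of the logged thermocouple data against the approximate +/- 2 degree Celsius tolerance might look; it is a hedged illustration only, and the setpoint, default tolerance, and sample log values are assumptions, not requirements of this plan.

    # Illustrative sketch: flag logged chamber temperatures that fall outside the
    # approximate setpoint +/- 2 deg C tolerance noted in Section 6.53.
    def out_of_tolerance(readings_c, setpoint_c, tol_c=2.0):
        """Return the logged readings (deg C) outside setpoint +/- tol."""
        return [t for t in readings_c if abs(t - setpoint_c) > tol_c]

    if __name__ == "__main__":
        logged = [84.1, 85.0, 86.3, 87.2, 85.5]  # hypothetical thermocouple log (deg C)
        print("Out-of-tolerance readings:", out_of_tolerance(logged, setpoint_c=85.0))  # [87.2]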

6.54 Crocking/Mar. Competence is demonstrated through internal performance-based data. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.55 Fade. Competence is demonstrated through internal performance-based data. SAE J1885 defines the verification methods, frequency, and tolerances allowed for exposure intensity. These verifications are performed and stored at the tester. Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.

6.56 Flex. Competence is demonstrated through internal performance-based data and through commercial interlaboratory comparisons (see Appendix C). Reports, equipment calibration records, preventative maintenance records, and employee training records shall be available for inspection.


6.57 Compression/Tensile. Competence is demonstrated through internal performance-based data, such as quality checks that include keying in the load cell sensitivity (otherwise known as a correction factor). Reports, equipment calibration records, preventative maintenance records, and management test report approval shall be available for inspection.
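For illustration only, the sketch below applies a keyed-in load cell sensitivity (correction factor) to raw readings in the way Section 6.57 alludes to; the factor and readings are hypothetical and the function is not part of the plan.

    # Illustrative sketch: apply a keyed-in load cell correction factor to raw
    # force readings before reporting. Values are hypothetical.
    def apply_correction(raw_readings_n, correction_factor):
        """Scale raw load cell readings (N) by the keyed-in correction factor."""
        return [r * correction_factor for r in raw_readings_n]

    if __name__ == "__main__":
        raw = [100.0, 250.0, 400.0]  # hypothetical raw readings (N)
        print(apply_correction(raw, correction_factor=1.015))  # [101.5, 253.75, 406.0]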

7.0 REPORTING

7.1 Formal Proficiency Testing Results

The laboratory (should) document the analysis of all results and submit the results, and the subsequent analysis, of all relevant proficiency testing participation to A2LA promptly upon receipt, consistent with A2LA directions (Reference 3.2, clause H). Detailed corrective action responses for any outlying or unacceptable results related to testing/calibration on the laboratory Scope of Accreditation must also be submitted per the A2LA directions. To facilitate A2LA review, laboratories must complete the A2LA Proficiency Testing Data Submission form along with the data and corrective actions.

7.2 Performance Based Activity Results

PT/QC reports that are Performance Based Activities (should) be submitted to A2LA to provide a representative sample of yearly ‘checks’ along with the annual data submittal to A2LA. However, any deficiencies determined (should) be documented under the lab corrective action process, including root cause, corrective action, and its implementation, along with the representative summary data.


8.0 CHANGE RECORD

Change Record

Rev Date Responsible Person Description of Change

(0) (2006/05/14) (Lab Tech Manager) (Initial Release)

(1) (2006/11/19) (Lab Tech Manager) (Rewrite to come in line with January 27, 2006 revision of AEMCLRP and A2LA Current PT Policy. Add XX and YY)

ATTACHMENT 8

The American Association for Laboratory Accreditation

Document Revised: November 22, 2010

P101 – Reference to A2LA Accredited Status – A2LA Advertising Policy

2010 by A2LA. All rights reserved. No part of this document may be reproduced in any form or by any means without the prior written permission of A2LA.

(Changes effective as of October 12, 2010 unless otherwise noted.)

The following requirements pertaining to the use of the “A2LA Accredited” symbol and to any other reference to A2LA accreditation shall be met by an accredited organization in order to become and remain accredited by A2LA. Failure to comply with these requirements may result in suspension or revocation of the organization's accreditation.

Introduction

A2LA-accredited organizations are strongly encouraged to promote their A2LA accreditation by using the “A2LA Accredited” symbol.

It is the ethical responsibility of accredited and applicant organizations to describe their accredited status in a manner that does not imply accreditation in areas that are outside their actual scope of accreditation or for other facilities not covered under A2LA accreditation. Accredited organizations or others are encouraged to advise A2LA if a violation of this policy is discovered by actions of other parties. Appendix A of this document may also be consulted for examples of appropriate vs. inappropriate references to A2LA accredited status.

While inclusion of the “A2LA Accredited” symbol on reports is not mandatory, only reports bearing the “A2LA Accredited” symbol (or that otherwise make reference to accredited status by a specific, recognized accreditation body) can benefit from the acceptance established through mutual recognition agreements/arrangements among accreditation bodies, and only those calibration and reference material certificates bearing the “A2LA Accredited” symbol (or that otherwise make reference to accredited status by a specific, recognized accreditation body) can be confirmed to meet the A2LA Traceability Policy. (See P102 – A2LA Policy on Measurement Traceability for additional clarification and requirements that must be met.)

Unless otherwise specified, all requirements related to the use of the “A2LA Accredited” symbol specified in this document also apply when making any other claims of A2LA accreditation. The term “certificates and reports” includes calibration certificates, test reports, inspection certificates or reports and/or any other certificate or report generated under the organization’s scope of accreditation. Failure to comply with these requirements may lead to denial, suspension or revocation of accreditation and/or legal remedies.

NOTE: The “A2LA” logo is to be used by A2LA only. Accredited organizations may use the “A2LA Accredited” symbol and/or may make a narrative reference to their A2LA accreditation, but may not use the “A2LA” logo in such references.

The A2LA Logo: (to be used by A2LA only). The “A2LA Accredited” Symbol: (to be used by A2LA-accredited organizations).



1.0 General Requirements

1.1 The accredited organization shall have a policy and procedure for controlling the use of the term “A2LA” and the “A2LA Accredited” symbol.

1.2 The “A2LA Accredited” symbol shall not be used by an accredited organization that is not A2LA accredited.

1.3 The “A2LA Accredited” symbol shall not be used by applicants for A2LA accreditation.

1.4 The “A2LA Accredited” symbol shall be used by an A2LA accredited organization only under the name in which it holds A2LA accreditation.

1.5 When promoting or providing proof of accreditation, accredited organizations shall use the scope(s) of accreditation, as this document details the specific activities which are accredited. The certificate shall be used for display purposes and may also accompany the scope.

1.6 It is the responsibility of the accredited organization to communicate this Advertising Policy and its requirements to the necessary corporate/marketing representatives to ensure that all requirements are met.

2 Symbol Reproduction

2.1 The “A2LA Accredited” symbol is available as an electronic version upon request.

2.2 Where the A2LA name (not to be confused with the “A2LA” logo) is used by accredited organizations in a narrative reference to accredited status, it shall always be accompanied by at least the word “accredited”.

2.3 While there are no restrictions on the size and color of the “A2LA Accredited” symbol reproduction, the symbol must maintain its form.

2.4 The “A2LA Accredited” symbol may be generated electronically provided that the prescribed formats and forms are retained.

3 Use of the “A2LA Accredited” Symbol on Reports and Certificates

3.1 Where the “A2LA Accredited” symbol is used to endorse results on reports or certificates, it shall always be accompanied by the A2LA certificate number(s) and an indication of the type of organization accredited (e.g., testing/calibration laboratory, proficiency testing provider, reference material producer, inspection body, product certification body, etc.). An example for each accreditation program is given below.


TESTING CERT #9999.99

CALIBRATION CERT #9999.99

PROFICIENCY TESTING PROVIDER CERT #9999.99

REFERENCE MATERIAL PRODUCER CERT #9999.99

INSPECTION BODY CERT #9999.99

PRODUCT CERTIFICATION BODY CERT #9999.99

3.2 The “A2LA Accredited” symbol may be displayed on all certificates and reports that contain results from activities that have been carried out within the accredited organization’s official A2LA Scope of Accreditation.

3.3 The “A2LA Accredited” symbol shall not be used on certificates and reports if none of the results presented are from activities included on the A2LA Scope(s) of Accreditation.1

3.4 Where both accredited and non-accredited activities are included on an endorsed report or certificate, non-accredited results shall be clearly and unambiguously identified as such. This can be done by placing an asterisk after each such result along with a footnote stating, for example: “The test/calibration/inspection results are not covered by our current A2LA accreditation.”

3.5 On reports where results are reported within the field where accreditation exists but in a technology that is not included in the scope, they must be so indicated. (For example, if an organization is accredited in the Environmental Field for only wet chemistry and metals, any gas chromatographic data reported would need to be identified as non-accredited.)

1 To provide clients with assurance that the quality system under which the contracted work was done meets the accreditation requirements, an appropriate reference may be included in a prominent place on the report or certificate when none of the work is covered under the accreditation. For example, as related to tests or calibrations: “This accredited organization maintains A2LA accreditation to ISO/IEC 17025 for the specific tests/calibrations listed in A2LA Certificate #______. The tests/calibration results included in this report/calibration, however, are not covered by this accreditation.” Note that inclusion of the “A2LA accredited” symbol in this case is prohibited.

3.6 If the intent is to ensure that the client meets the requirements of the A2LA Traceability Policy, the calibrations performed by an A2LA accredited calibration laboratory must be included on the calibration laboratory’s A2LA Scope of Accreditation, and the calibration certificate issued must contain the “A2LA Accredited” symbol (or other reference to accredited status by a specific, recognized accreditation body), the A2LA certificate number, and an indication of the type of entity accredited (See Section 3.1).3

3.7 There shall be nothing in the reports, certificates or in any attachments or other materials which implies or may lead any user of the results or any interested party to believe that the work is accredited when it is not.

4. Subcontracted Activities4

4.1 If the final report or certificate contains only results of the subcontracted activity, an A2LA accredited organization may include the results in its endorsed reports or certificates (i.e., containing the “A2LA Accredited” symbol) and may portray the results as being “accredited” only if:

4.1.1 The accredited organization has informed the client in writing of the proposed subcontracting and has obtained prior approval (e.g., ISO/IEC 17025:2005, Section 4.5.2); and

4.1.2 The subcontracted work appears on the organization’s own Scope of Accreditation; and

4.1.3 The subcontractor is accredited for the work in question by A2LA or an A2LA recognized MRA partner and submitted their results on an endorsed report or certificate to the contracting organization; and

4.1.4 The subcontracted activity results are clearly and unambiguously identified as such on the final certificate or report issued to the customer (e.g., ISO/IEC 17025:2005, Section 5.10.6).

4.2 If the final report or certificate contains only results of the subcontracted work and this activity is not covered by the A2LA accredited organization’s Scope of Accreditation or the subcontractor is not accredited by A2LA (or an A2LA recognized MRA partner) for the work performed, the final report shall not include the “A2LA Accredited” symbol and shall not imply A2LA accreditation for the work done. (However, a statement as contained in Footnote #1 of P101 may be included.)

3 Many accredited calibration laboratories offer “accredited” and “non-accredited” services. The “non-accredited” services do not meet the A2LA Traceability Policy.
4 For reference material producers and proficiency testing providers, A2LA does not consider collaborator arrangements to be subcontracting.

4.3 If the final report contains both subcontracted work and work performed by the A2LA accredited organization itself, then the results may be included in the A2LA accredited organization’s endorsed report or certificate (i.e., containing the “A2LA Accredited” symbol) and may be portrayed as being “accredited” only if:

4.3.1 The accredited organization has informed the client in writing of the proposed subcontracting done and has obtained prior approval (e.g., ISO/IEC 17025:2005, Section 4.5.2); and

4.3.2 The subcontracted activity results are clearly and unambiguously identified as such on the final certificate or report issued to the customer (e.g., ISO/IEC 17025:2005, Section 5.10.6); and

4.3.3 Any work performed outside of the A2LA accredited organization’s own Scope of Accreditation and/or performed by a subcontractor not accredited by A2LA (or by an A2LA recognized MRA partner) is clearly identified as “non-accredited”; and

4.3.4 Any subcontracted work that falls within the A2LA accredited organization’s Scope of Accreditation is performed by a subcontractor that is itself accredited by A2LA (or an A2LA recognized MRA partner) for the specific activities concerned and that submitted results on an endorsed report or certificate to the contracting organization.

5 Opinions and Interpretations

5.1 Where statements of opinions and interpretations are outside the Scope of Accreditation, the accredited organization is required to include a disclaimer such as the following in the certificate or report: “The opinions/interpretations expressed in this report are outside the scope of this accredited organization's A2LA accreditation.”5

5.2 It is preferable, however, to express opinions and interpretations that are outside the Scope of Accreditation in a separate letter which is not part of the endorsed certificate or report and that does not carry the “A2LA Accredited” symbol or other reference to A2LA accreditation.

5 Statements of compliance with a specification are not considered “opinions and interpretations.”

6 Calibration Labels

6.1 Calibration labels containing the “A2LA Accredited” symbol may be affixed only to equipment that has been calibrated by the accredited calibration laboratory under their Scope of Accreditation. If only a portion of the calibration (e.g., one parameter out of several associated with the equipment) falls within the laboratory’s Scope of Accreditation, then the “A2LA Accredited” symbol may not be affixed to the equipment calibrated unless the label clearly delineates the accredited parameters vs. non-accredited parameters calibrated.

6.2 Calibration labels containing the “A2LA Accredited” symbol shall include at least the following information:


6.2.1 The name of the accredited calibration laboratory or its A2LA Certificate number.
6.2.2 The instrument identification.
6.2.3 The date of the current calibration.
6.2.4 Cross-reference to the accredited calibration certificate issued with respect to this calibration.

7 Inspection Labels (ISO/IEC 17020 Inspection Body Accreditation only)

7.1 Inspection labels containing the “A2LA Accredited” symbol may be affixed only to products that have been inspected by the accredited inspection body under their Scope of Accreditation.

7.2 Inspection labels containing the “A2LA Accredited” symbol shall include at least the following information:

7.2.1 The name of the accredited inspection body and/or its A2LA Certificate number.
7.2.2 The product identification.
7.2.3 The date of the current inspection.
7.2.4 Cross-reference to the accredited inspection certificate issued with respect to this inspection.

8 Reference Material Producer Labels and Certificates (ISO Guide 34 only)

8.1 A2LA does not permit the use of the “A2LA Accredited” symbol on reference material labels because of the potential for misrepresentation as product certification.

8.2 The reference material producer labels shall meet the requirements stipulated in ISO Guide 34:2000, clause 5.6.5.

8.3 The content of the label or mark on the material should serve only to identify the reference material and should be confined to the name of the producer, the name of the material, the product’s code for the material, the batch number, and the relevant health and safety warnings as prescribed in ISO Guide 31.

8.4 Only those reference material certificates bearing the “A2LA Accredited” symbol (or that otherwise make reference to accredited status by a specific, recognized accreditation body) can be confirmed to meet the A2LA Traceability Policy. (See P102 – A2LA Policy on Measurement Traceability for additional clarification and requirements that must be met.)

9 Proficiency Testing Providers Labels

9.1 A2LA does not permit the use of the “A2LA Accredited” symbol on proficiency testing labels because of the potential for misrepresentation as product certification.


10 Advertising, Publicity, and Business Solicitation

10.1 Accredited organizations may incorporate statements concerning their accreditation in publicity and/or advertising materials, including brochures and organization publications, technical literature, business reports, web sites and quotations or proposals for work. When statements concerning accredited status or the “A2LA Accredited” symbol are to be used on promotional materials that are not easily recalled or corrected in the event of error (i.e., mass distributed catalogs & brochures, advertisements in periodicals & other literature, etc.), these materials must be approved in advance in writing by A2LA staff, prior to distribution/publication.

10.2 The use of the “A2LA Accredited” symbol or other reference to A2LA used to promote accreditation enhances the reputation and value of accreditation for all stakeholders. It is the responsibility of the accredited organizations to ensure that there is no misrepresentation of the accreditation status and that the accreditation process is not brought into disrepute.

10.3 The accreditation claim shall be related only to the activity that is covered under the A2LA Scope of Accreditation, and not to any other activities in which the accredited organization or its parent organization is involved.

10.4 A2LA accreditation is site specific. The accreditation claim shall be related only to the specific accredited location that is covered under the A2LA Scope of Accreditation, and not to any other non-accredited locations.

10.5 In proposals or quotations, the accredited organization shall distinguish activities that are covered under the A2LA Scope of Accreditation from those that are not covered.

10.6 Where the “A2LA Accredited” symbol is printed on letterhead or other corporate stationery, such stationery shall not be used for work proposals, quotes, reporting of results exclusively outside the A2LA Scope of Accreditation, or certifying a product or other item.

10.7 The “A2LA Accredited” symbol or accreditation claim shall not be affixed to a material, item or product (or related part, including packaging), or used to imply that an item or product has been certified. (See Section 7 of this document for rules related to the use of the logo for accredited inspection bodies.)

10.8 If the “A2LA Accredited” symbol is included in literature relating to a product, the symbol must appear directly adjacent to the reference to the accredited organization, and it must be clearly stated that inclusion of the symbol does not imply certification/approval of the products.

10.9 The “A2LA Accredited” symbol shall not be displayed on business cards in a manner that might imply personnel certification. (See Section 12.4 below for additional requirements regarding business cards.)

11 Misuse of the “A2LA Accredited” symbol or accreditation status

11.1 Every circumstance where the principle of accurate representation applies cannot be anticipated and dealt with in this document. Therefore, it is the responsibility of the accredited and applicant organization representatives not to misrepresent their accredited status under any circumstances.

11.2 If there are questions, the accredited organization should submit intended uses of the symbol, draft advertisements, and/or any other accreditation claims to A2LA Headquarters for advance review.

11.3 Upon suspension or termination of accreditation, an organization must immediately cease to issue reports and certificates displaying the symbol and shall cease publishing documents (including advertisements, websites, etc.) containing the symbol or reference to A2LA accreditation.

12 Use of the combined ILAC-A2LA Accredited Symbol (For use exclusively in conjunction with activities covered under A2LA’s ILAC Scope of Recognition – testing and calibration laboratories only as of September 2005.)

12.1 Accredited laboratories may use the combined “ILAC MRA – A2LA Accredited” symbol in order to demonstrate accreditation by a signatory of the ILAC arrangement.

12.2 The combined symbol may be used only in combination with the accredited laboratory’s A2LA certificate number and an indication of whether it is a testing or calibration laboratory that is accredited. For example, the symbol below is appropriate for use by a testing laboratory.

12.3 The combined symbol shall be used only in the same proportions as indicated above. It may be displayed in black-and-white or in an approved blue color according to the following color breakdowns:

PROCESS (CMYK) Color Breakdown: C100 M56 Y0 K0
PANTONE (PMS) Color Breakdown: PANTONE 293C (blue)
WEBSITE (RGB) Color Breakdown: R0 G0 B229
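For illustration only (this is not an A2LA requirement), a laboratory’s document-control or marketing tooling could hold the approved breakdowns above as constants when checking artwork; the constant names and the check below are hypothetical.

    # Illustrative sketch: the approved color breakdowns for the combined
    # "ILAC MRA - A2LA Accredited" symbol held as constants, with a trivial
    # check that artwork uses the approved website RGB value.
    APPROVED_CMYK = (100, 56, 0, 0)      # C, M, Y, K
    APPROVED_PANTONE = "PANTONE 293C"    # blue
    APPROVED_RGB = (0, 0, 229)           # R, G, B (website use)

    def rgb_is_approved(rgb):
        """True if the artwork color matches the approved website RGB breakdown."""
        return tuple(rgb) == APPROVED_RGB

    if __name__ == "__main__":
        print(rgb_is_approved((0, 0, 229)))  # True
        print(rgb_is_approved((0, 0, 255)))  # False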

12.4 The combined symbol shall not be used on business cards.

12.5 Laboratories wishing to use the combined symbol must present their proposed usage to A2LA for review and shall not begin actual use of the combined symbol until they have received written approval from A2LA.

12.6 Laboratories wishing to use the combined symbol must also sign a formal sub-license agreement with A2LA that will be provided when the laboratory’s proposed usage is submitted to A2LA. They may not begin actual use of the combined symbol until this sub-license agreement has been signed.

12.7 All requirements of sections 1.0-7.0, 10.0 & 11.0 of this Advertising Policy are also applicable to use of the combined symbol (e.g., policy and procedure, reproduction, use on marketing materials, etc.).


13 “Accredited Work” vs. “Non-Accredited Work”

The issue of what defines “accredited” vs. “non-accredited” work has been debated for many years. In order to establish a consistent approach to be followed by all accredited laboratories, the following requirements have been developed and approved:

13.1 When a client requests the performance of a test or calibration that appears on a laboratory’s A2LA Scope of Accreditation, the test or calibration must be performed in accordance with all of the A2LA requirements for accreditation, whether or not the “A2LA Accredited” symbol is used on the resulting test report or calibration certificate, unless the conditions of 13.2 are met. It is important to note, however, that the “A2LA Accredited” symbol must be included on any certificate or report intended to demonstrate measurement traceability in accordance with the A2LA Policy on Measurement Traceability, Section T2. (See the “Introduction” section above.)

13.2 If a client requests performance of a test or calibration that appears on a laboratory’s A2LA Scope of Accreditation but does not want or need the test or calibration to be performed under accredited conditions, these requests and the exceptions to the accreditation requirements must be clearly documented in the accredited laboratory’s contract review records (reference ISO/IEC 17025, Section 4.4.1a). When these tests or calibrations are not performed in accordance with all of the A2LA requirements for accreditation, the resulting test report or calibration certificate cannot be endorsed with the “A2LA Accredited” symbol.

14 Dimensional Testing Certificates/Test Reports

14.1.1 For all dimensional testing parameters for which the unit under test does serve as a link in the traceability chain, where an accredited, endorsed test report is issued, the organization shall identify on the test report that the test(s) conducted is performed in accordance with R205 – Specific Requirements: Calibration Laboratory Accreditation Program and is deemed equivalent to a calibration certificate.

14.1.2 For all dimensional testing parameters for which the unit under test does not serve as a link in the traceability chain, where an accredited, endorsed test report is issued, the organization shall not indicate in any way that the test report is deemed equivalent to a calibration certificate.

14.1.3 For dimensional testing parameters for which the unit under test does serve as a link in the traceability chain for some parameters but does not serve as a link in the traceability chain for other parameters, where an accredited, endorsed calibration certificate or test report is issued, the organization must distinguish those results that are performed in accordance with R205 and deemed equivalent to a calibration from those that are not deemed equivalent to a calibration. Use of an asterisk with language to this effect would be acceptable.


APPENDIX A

Examples of Appropriate vs. Inappropriate References to A2LA Accredited Status

Inappropriate Reference:

Accredited Test Lab Cert 0000.00

WHY? Because only A2LA may use the “A2LA” logo.

Appropriate Reference:

Test Lab Cert 0000.00

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Inappropriate Reference on a Calibration Certificate:

“Accredited to ISO/IEC 17025”

WHY? Because it does not mention the specific accreditation body, certificate number and type of entity. This calibration certificate would not meet the A2LA Traceability Policy.

Appropriate Reference on a Calibration Certificate:

Calibration Lab Cert 0000.00


- OR –

“Calibration Laboratory Accredited by A2LA,

Certificate Number 0000.00”

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Inappropriate Business Card:

ABC Labs – Your Calibration Source!
John Q. Doe, Technical Manager
123 1st Street, Anywhere, USA
Phone: 555 555 5555  Fax: 444 444 4444
Email: [email protected]  www.ABC.com

WHY? Placement of the “A2LA Accredited” symbol implies personnel certification of Mr. Doe. A2LA does not certify or accredit personnel.

Appropriate Business Card:

ABC Labs – Your Calibration Source!
John Q. Doe, Technical Manager
123 1st Street, Anywhere, USA
Phone: 555 555 5555  Fax: 444 444 4444
Email: [email protected]  www.ABC.com


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Inappropriate Advertisement:

ABC Labs – Your Calibration Source!
Locations in: Anywhere, USA; Nowhere, USA; Elsewhere, USA
Call for our list of services, including: equipment calibration, service, maintenance and repair!

WHY? Accreditation is site specific and ABC Labs is accredited at the “Anywhere, USA” location only. They are also accredited for calibration of equipment only – not servicing, maintenance and repair. Inclusion of the symbol implies accreditation for all 3 laboratories/locations and for all services listed.

Appropriate Advertisement:

ABC Labs – Your Calibration Source!
Locations in: Anywhere, USA; Nowhere, USA*; Elsewhere, USA*
Call for our list of services, including: equipment calibration, service*, maintenance* and repair*!

*Locations and services not within the A2LA Scope of Accreditation


Document Revision History

January 2003 – Revised 3.6 to clarify that calibration certificates including reference to A2LA accreditation are acceptable under the A2LA Traceability Policy. Revised 8.3 to include examples of documents and to specify that references to A2LA accreditation must cease upon suspension or termination.

November 2004 – Added Section 9.0 to address the combined ILAC-MRA and A2LA Accredited symbol. Clarified distinction between “A2LA” logo and “A2LA Accredited” symbol.

September 2005 – Added Section 7.0 to address issues specific to accredited Inspection Bodies. Made overall policy generic for use by any accredited organization.

December 2006 – Sections 3.1 and 10.2 revised to require inclusion of an indication of the type of organization accredited when the symbol is used on certificates or reports. Section 10.5 added to require a formal sub-license agreement for use of the combined “ILAC MRA – A2LA Accredited” symbol. Section 11 added to define “accredited” vs. “non-accredited” work.

July 2008 – Introduction revised to include reference material certificates related to the Traceability Policy. Sections added regarding reference material producer and proficiency testing provider labels. Sections 12.2 and 13.1 clarified.

February 2009 – Clarified what is acceptable to meet the A2LA Traceability Policy; expanded upon Section 6.1; included cross-references to ISO/IEC 17025 clauses; included Appendix A.

August 2009 – Clarified intent of Section 4 with regard to presentation of subcontracted results.

June 2010 – Removal of transition period from Section 3.1; clarification of requirements for subcontracted activities in Section 4; addition of requirement for pre-approval of mass-distributed items in Section 10.1; addition of requirements for color breakdown of combined symbol in Section 12.3; addition of Section 12.4 prohibiting use of combined symbol on business cards.

November 2010 – Addition of Section 14 for Dimensional Test Reports and Calibration Certificates.

ATTACHMENT 9

ANSI C63.26 Development Update

Mac Elliott

04/02/2011

2011 A2LA Technical Forum and Annual Meeting


Background

ANSI C63 very active in updating key EMC standards

ANSI C63.10 [Unlicensed Transmitters] / ANSI C63.4:2009 [Part 15 tests / etc.] released

These were accepted as valid alternatives by the FCC for testing to ANSI C63.4:2003 [FCC P/N DA 09-2478]

These tests are performed for:
1) Rx Verification for Part 15 compliance
2) FCC DoC [unintentional emissions when connected to IT]
3) Part 15 / other Unlicensed Intentional Radiators [Bluetooth / MotoTalk / WLAN / etc.]

Background [Cont.]

Some significant differences exist between existing test methods and new requirements

Examples:
– Absorber required on ground plane for testing above 1 GHz
– Antenna calibration requirements [ANSI C63.5:2006 ONLY]

While not required now, anticipate that in the future the FCC will transition to accepting tests to the new standard solely.

Petitioned to make transition in 2012 [heard through grapevine]

Motivation for Standard

International standards are changing rapidly [such as site VSWR requirements above 1 GHz].

Changes from the standards referenced in TIA 603 / etc…

The prime motivation seems to be to harmonize the test methods as much as possible, to alleviate multiple test requirements for compliance with similar regulatory requirements.

Another motivation is that there are considerations that are not covered in current accepted methods such as LTE / WiMax broadband emissions.

Radiated emissions not high priority for TR8.1 since method tried and true. Not harmonized with later standards.

Proposed Approach

Develop a generic Field Strength of Spurious Emissions method with “modifiers” based on the Rule Part / Technology in Part-specific considerations

Generic / or Basic Method is based on TIA 603 Substitution Measurements

Proposed Approach

Similar to R&TTE EMC Standard 301 489:
– ETS 301 489-1 specifies general requirements
– ETS 301 489-“X” modifies the general requirements for radio service specifics.

Approach would be similar here.

Considerations for Part Specific TX would be only Part specific instead of having to apply to all Licensed radio services. [as TIA 603 does now]

Proposed Approach

Example – Part 90 / 95 Considerations (with exceptions)

Keep as close as possible to the current standard for radiated / refer to the TIA Standard [already mature / etc.]

Site considerations in ANSI C63.10 taken into account

For sites doing only Part 15 RX Verification – may waive absorber on ground plane if it can be shown that the worst case is achieved without it

Proposed Approach

Example – Part 90 / 95 Considerations (with exceptions)

ETS 300 086 is being considered as an alternative standard to demonstrate compliance with certain Part 90 products – true harmonization

Individuals from OET and IC have informally stated that they would be open to considering this

EXCEPTIONS – Part 90Y Public Safety 4.9 GHz – the FCC rules call out the test methods for 5 GHz U-NII (C63.10 power measurements, PSD, peak excursion).

Part 90Z would not be covered by ETSI either

Proposed Approach

Each radio part will have its own TG to address considerations specific to its own testing

Considerations Above 1 GHz

Significant issues identified by related Task Group leading to changes in testing above 1 GHz.

May lead to some changes in above approach.

Update from Werner Schaffer

FCC KDB 449343

Published Dec 2010 with a Jan 1st effective date

Precludes use of the “pre-calibrated” field method

“Since the pre-calibrated field methodology has not been recognized as part of an industry standard or formally submitted to the FCC for consideration, data collected using this method cannot be deemed acceptable for demonstrating compliance to FCC rules.”

FCC KDB 449343

“It is our understanding that the Accredited Standards Committee C63® — Electromagnetic Compatibility is considering related measurement procedures. If C63®, or another recognized standards organization, develops alternatives to the measurement methods described in TIA-603-C, then the FCC is prepared to consider the acceptability of those alternative measurement methods.”

FCC KDB 449343

We are working on this alternate procedure and have developed a mechanism for requesting extensions on effective dates if a case of undue burden on labs can be made [ACIL + TCB Council]

Bigger question for accredited labs? Validated alternate methods not acceptable for demonstrating compliance?

FCC KDB Process

EME / SAR labs have been living with KDB decisions and changes for a while

There needs to be a mechanism for labs / ABs / FCC to communicate regarding these decisions

FCC KDB Process

The plan is to define and propose an ANSI Committee to do just that.

My task is to make sure this gets on the agenda for discussion at the ANSI meetings next month.

FCC KDB Process

Of course – we could always use participants in any of these areas and all are invited to get involved!

Thank you