
Page 1: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

WECC Compliance Committee

September 18, 2013
Salt Lake City, UT

Page 2: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

2

I. Welcome and Introductions Lee Beyer, Chair

Agenda Review
Approval of Minutes of June 26, 2013 Meeting

II. Compliance Update and Metrics Constance White

III. Board Oversight of Compliance Post-Bifurcation Ruben Arredondo

IV. Compliance Outreach Laura Scholl

V. WICF Update Matt Jastram

VI. Update and Return Disclosure Form

VII. Adjourn

AGENDA

Page 3: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

Constance B. White
Vice President, Compliance

WECC Compliance Update
Wednesday, September 18, 2013

Page 4: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

4

1 Index of Slides

2 Audit and Spot Check Reports

3 Violation Review and Validation-Data

4 Violation Review and Validation-Line Chart

5 Mitigation Plan Review-Data

6 Mitigation Plan Review-Line Chart

7 Completed Mitigation Plan Review-Data

8 Completed Mitigation Plan Review-Line Chart

9 Violations Received-CIP (Reporting Method)

10 Violations Received-O&P (Reporting Method)

11 Violations Received (Shown by Percent)

12 Violation Aging

13 Regional Entity Inventory

Appendix (attached): Detailed explanation for each measure

WECC Compliance Report

Page 5: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

5

Measure: Average days to produce audit report and file with NERC (“non-public” = violations are not yet processed to finality)

Spot Check data includes only those since 1/1/2011 when reports were required to be sent to NERC.

“Days Outstanding” measures average days that the incomplete reports are pending

Goal: Per CMEP, “normally” 60 days for audit reports; 90 days for spot check reports (if there are no violations)

* Year-to-date

Non-public Audit & Spot Check Reports

Audits:

Audit Year | Complete Reports | Average Days to Completion | Incomplete Reports | Days Outstanding
2010 | 101 | 80 | 0 | 0
2011 | 92 | 89 | 0 | 0
2012 | 137 | 60 | 0 | 0
2013* | 77 | 45 | 9 | 40

Spot Checks:

Spot Check Year | Completed Reports | Average Days to Completion | Incomplete Reports | Days Outstanding
2011 | 16 | 78 | 0 | 0
2012 | - | - | - | -
2013* | - | - | - | -

Page 6: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

6

Measure: Average days to complete reviews for validated (enforceable) violations reviewed during the referenced period. Includes technical review. Upon completion of review, validated violations are sent to Enforcement staff for further processing and final disposition. This data does not include dismissed violations.

Goal: 60 days (Internal; no CMEP requirements)

# CIP pending review as of 7/31/2013: 33

# O&P pending review as of 7/31/2013: 22

* Quarter-to-date

Violation Review and Validation

Violation Review Report
As of 7/31/2013

Quarter/Year | Reviewed & Validated (CIP) | Average Days to Review (CIP) | Pending Review > 90 Days (CIP) | Reviewed & Validated (O&P) | Average Days to Review (O&P) | Pending Review > 90 Days (O&P)
2Q 2012 | 59 | 58 | 2 | 19 | 76 | 1
3Q 2012 | 63 | 32 | 2 | 32 | 46 | 0
4Q 2012 | 27 | 47 | 2 | 30 | 34 | 1
1Q 2013 | 44 | 27 | 5 | 39 | 46 | 4
2Q 2013 | 83 | 35 | 14 | 44 | 32 | 9
3Q 2013* | 14 | 53 | 15 | 11 | 49 | 10

Page 7: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

7

Shows information from the preceding slide (slide 3)

Goal: 60 days (Internal; no CMEP requirements)

# pending review as of 7/31/2013: 55

# pending review > 90 days as of 7/31/2013: 25

* Quarter-to-date

Violation Review and Validation

[Line chart: Average Days to Complete Violation Review, as of 7/31/2013; series: CIP, O&P, Goal; y-axis: Number of Days]

Page 8: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

8

Measure: Average days to complete reviews for accepted Mitigation Plans reviewed during the referenced period

Does not include rejected MPs or MPs with dismissed violations

Goal: 60 days (Internal. CMEP: the Region has 30 days to accept, reject, or extend the review. If the MP cannot be reviewed within 30 days, WECC extends the review within that 30-day window.)

# CIP pending review as of 7/31/2013: 29

# O&P pending review as of 7/31/2013: 13

* Quarter-to-date

Mitigation Plan Review

Mitigation Plan Review Report
As of 7/31/2013

Quarter/Year | Reviewed & Accepted (CIP) | Average Days to Review (CIP) | Pending Review > 90 Days (CIP) | Reviewed & Accepted (O&P) | Average Days to Review (O&P) | Pending Review > 90 Days (O&P)
2Q 2012 | 85 | 44 | 0 | 34 | 41 | 0
3Q 2012 | 49 | 20 | 2 | 67 | 13 | 0
4Q 2012 | 46 | 28 | 2 | 31 | 27 | 0
1Q 2013 | 44 | 21 | 0 | 37 | 24 | 0
2Q 2013 | 65 | 31 | 2 | 32 | 16 | 3
3Q 2013* | 22 | 28 | 3 | 9 | 52 | 4

Page 9: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

9

Measure: Shows information from the preceding slide (slide 5)

Goal: 60 days (Internal. CMEP: Region has 30 days to accept, reject, or extend review. WECC always extends the review within 30 days, if necessary.)

# pending review as of 7/31/2013: 42

# pending review > 90 days as of 7/31/2013: 7

* Quarter-to-date

Mitigation Plan Review

[Line chart: Average Days to Complete Mitigation Plan Review, as of 7/31/2013; series: CIP, O&P, Goal; y-axis: Number of Days]

Page 10: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

10

Measure: Average days to complete reviews for accepted Completed Mitigation Plans reviewed during the referenced period

Does not include rejected CMPs or CMPs with dismissed violations

Goal: 60 days (Internal; no CMEP requirements)

# CIP pending review as of 7/31/2013: 86

# O&P pending review as of 7/31/2013: 6

* Quarter-to-date

Completed Mitigation Plan Review

Completed Mitigation Plan Review Report
As of 7/31/2013

Quarter/Year | Reviewed & Accepted (CIP) | Average Days to Review (CIP) | Pending Review > 90 Days (CIP) | Reviewed & Accepted (O&P) | Average Days to Review (O&P) | Pending Review > 90 Days (O&P)
2Q 2012 | 114 | 90 | 21 | 55 | 60 | 1
3Q 2012 | 76 | 61 | 24 | 41 | 12 | 1
4Q 2012 | 45 | 98 | 24 | 44 | 18 | 1
1Q 2013 | 55 | 80 | 24 | 32 | 25 | 3
2Q 2013 | 48 | 85 | 27 | 45 | 35 | 2
3Q 2013* | 24 | 95 | 32 | 8 | 16 | 2

Page 11: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

11

Measure: Shows information from the preceding slide (slide 7)

Goal: 60 days (Internal; no CMEP requirements)

# pending review as of 7/31/2013: 92

# pending review > 90 days as of 7/31/2013: 34

* Quarter-to-date

Completed Mitigation Plan Review

[Line chart: Average Days to Complete Completed Mitigation Plan Review, as of 7/31/2013; series: CIP, O&P, Goal; y-axis: Number of Days]

Page 12: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

12

Measure: Enforceable violations (Reviewed violations that are not dismissed) shown by source (CMEP monitoring method)

* "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation and Complaint

** Year-to-date

Violations Reporting Method

Reporting Method | 2010 | 2011 | 2012 | 2013**
Self-Report | 66 | 127 | 86 | 46
Self-Certification | 153 | 42 | 38 | 40
Audit | 30 | 9 | 51 | 44
Other * | 25 | 5 | 0 | 0
Total | 274 | 183 | 175 | 130

[Column chart: CIP Enforceable Violations by reporting method, as of 7/31/2013; y-axis: Number of Violations]

Page 13: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

13

Measure: Enforceable violations (Reviewed violations that are not dismissed) shown by source (CMEP monitoring method)

* "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation and Complaint

** Year-to-date

Violations Reporting Method

Reporting Method | 2010 | 2011 | 2012 | 2013**
Self-Report | 74 | 95 | 98 | 50
Self-Certification | 21 | 35 | 47 | 7
Audit | 38 | 14 | 17 | 13
Other * | 7 | 4 | 21 | 0
Total | 140 | 148 | 183 | 70

[Column chart: O&P Enforceable Violations by reporting method, as of 7/31/2013; y-axis: Number of Violations]

Page 14: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

14

Measure: Enforceable violations (Reviewed violations that are not dismissed), with source expressed as a percentage of enforceable violations.

* "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation and Complaint

** Year-to-date

Violations Reporting Method

Percentage of Enforceable Violations
As of 7/31/2013

Reporting Method | CIP 2010 | O&P 2010 | CIP 2011 | O&P 2011 | CIP 2012 | O&P 2012 | CIP 2013** | O&P 2013**
Self-Report | 24% | 53% | 69% | 64% | 49% | 54% | 35% | 71%
Self-Certification | 56% | 15% | 23% | 24% | 22% | 26% | 31% | 10%
Audit | 11% | 27% | 5% | 9% | 29% | 9% | 34% | 19%
Other * | 9% | 5% | 3% | 3% | 0% | 11% | 0% | 0%

Page 15: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

15

Excludes Federal Entity violations on hold.

Violation Aging

[Bar chart: Aging of Violations in Caseload, as of 7/31/2013; y-axis: Number of Violations; bar values: 31, 30, 101, 48, 86, 247]

Page 16: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

16

Source: NERC Enforcement Metrics as of 6/30/2013

These new metrics are developed based on violations processed in the first 6 months of 2013. Violations that are held by appeal, a regulator, or a court are excluded from computation of these metrics.

Regional Entity Inventory

Region | Regional Inventory | Average Months in Regional Inventory | Regional Caseload Index
FRCC | 44 | 9.8 | 5.9
MRO | 137 | 10.8 | 11.0
NPCC | 99 | 10.3 | 8.1
RFC | 358 | 10.9 | 9.9
SERC | 500 | 16.6 | 16.1
SPP | 185 | 11.7 | 15.2
TRE | 159 | 9.7 | 5.6
WECC | 304 | 7.5 | 8.4

Page 17: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

17

Detailed Explanation of Measures

Appendix to Compliance Report

Page 18: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

18

Slide 2 Explanation: Audit and Spot Check Reports

Measure: This data chart measures the average number of days it takes for the WECC audit team to produce an audit report following an audit and file it with NERC. It also measures the number of completed reports, the number of incomplete reports and the total number of days incomplete reports are outstanding.

Measurement Period: One year. (4 years’ worth of audit reports and 3 years’ worth of spot check reports are shown)

Purpose: WECC measures this to gauge compliance with the timeframes suggested by the CMEP and to gauge the efficiency of the audit and audit support staff and of their tools.

Terms: “Non-public” refers to final audit reports sent to NERC. Audit reports are not posted publicly until all due process relating to the disposition of any violations is complete, i.e., a Notice of Penalty is approved by FERC.

“Days outstanding” measures the average number of days that the incomplete reports have been pending.

Goal: 60 days for audit reports; 90 days for spot check reports. Per the Compliance Monitoring and Enforcement Program (CMEP), “normally” audit reports would be filed with NERC 60 days following conclusion of the audit. The CMEP indicates that spot check reports should be filed within 90 days if there are no violations.

Notes: Spot Check data includes only those since 1/1/2011 when reports were required to be sent to NERC.

Appendix - Explanation of Measures

Page 19: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

19

Slides 3-4 Explanation: Violation Review and Validation Report (Data and Line Charts)

Measure: These charts measure the average number of days staff (generally Subject Matter Experts, SMEs) take to complete reviews of all NPVs (New Possible Violations) that are validated and transferred to WECC Enforcement staff for processing and disposition. They also measure the number of violations reviewed and validated and the number of violations pending review > 90 days at the end of the measurement period. Also noted are the total numbers of CIP and O&P violations currently pending review.

Measurement Period: Quarter (six most recent quarters are shown)

Purpose: WECC measures this as a gauge of staffing sufficiency given workload, and efficiency of both staff and processes.

Terms: “NPV” is New Possible Violation. NPVs are reported and entered into the tracking system within five days after initial indication of a violation. “CMEP” is the FERC-approved Compliance Monitoring and Enforcement Program.

Goal: 60 days from entry of the NPV to the date it is either dismissed or referred for further processing by WECC Enforcement staff. This is an internal goal; the CMEP specifies no goals or requirements.

Notes: NPVs are reviewed by an appropriate WECC SME, who may review the available record, request further information from the entity, or issue data requests. At that point, the NPV is either:

(1) Dismissed (meaning that upon review the SME determined that no violation in fact exists) or

(2) “Validated,” becoming an “Alleged Violation” that is forwarded to WECC Enforcement staff for appropriate disposition (for example, through FFT (Find, Fix and Track), an Expedited Settlement Agreement, a Notice of Penalty, or a Spreadsheet NOP).

This measure currently does not include violations that are dismissed. The dismissal data in WECC’s new webCDMS system includes a number of out-of-process outliers, which can skew the average days calculation. WECC is working with the webCDMS vendor to add the ability to flag these outliers for exclusion.

Appendix - Explanation of Measures

Page 20: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

20

Slides 5-6 Explanation: Mitigation Plan Review (Data and Line Charts)

Measure: These charts measure the average number of days staff (generally Subject Matter Experts, SMEs) take to complete reviews of Mitigation Plans (MPs) that were accepted during the referenced period. They also measure the number of MPs reviewed and accepted and the number of MPs pending review > 90 days at the end of the measurement period. Also noted are the total numbers of CIP and O&P MPs currently pending review. These measures do not include MPs that were rejected after review or that pertain to violations that ultimately were dismissed.

Measurement Period: Quarter (six most recent quarters are shown)

Purpose: WECC measures this as a gauge of staffing sufficiency given workload, and efficiency of both staff and processes.

Terms: “MP” is Mitigation Plan. Entities are required to file these at the requirement (rather than Standard) level. Thus, for example, violations of two different requirements of a single standard would result in two MPs.

“CMEP” is the FERC-approved Compliance Monitoring and Enforcement Program.

Goal: 60 days from submittal of MP to the date it is either accepted or rejected. This is an internal goal; the CMEP specifies the region has 30 days to accept, reject, or extend the review period. WECC always extends the review period within 30 days, if necessary.

Notes: WECC believes that 30 days for review is not a realistic goal. Under the CMEP, entities are not required to file MPs until after issuance of the NOAV (Notice of Alleged Violation), if not contested. Yet entities are encouraged to file MPs as quickly as possible once they believe they are in violation. It is not unusual for an entity to file a Self-Report, or certify non-compliance, simultaneously (or nearly so) with a corresponding MP. The dilemma is that WECC cannot assess the sufficiency of the MP until it understands the violation by performing its review (initial validation of the New Possible Violation, then Enforcement staff assessment of the scope and risk of the Alleged Violation). The CMEP contains no measures for these violation review activities, but it is only after performing them that staff can assess whether to accept the MP. All these activities easily can take longer than 30 days. Figures also include the federal cases on hold (pending legal resolution); in these cases MPs are not required, and entities have been submitting them voluntarily.

Appendix - Explanation of Measures

Page 21: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

21

Slides 7-8 Explanation: Completed Mitigation Plan Review (Data and Line Charts)

Measure: These charts measure the average number of days staff (generally Subject Matter Experts, SMEs) take to complete reviews of Completed Mitigation Plans (CMPs) that were accepted during the referenced period. They also measure the number of CMPs reviewed and accepted and the number of CMPs pending review > 90 days at the end of the measurement period. Also noted are the total numbers of CIP and O&P CMPs currently pending review. These measures do not include CMPs that were rejected after review or that pertain to violations that ultimately were dismissed.

Measurement Period: Quarter (six most recent quarters are shown)

Purpose: WECC measures this as a gauge of staffing sufficiency given workload, and efficiency of both staff and processes.

Terms: “CMP” is Completed Mitigation Plan, which includes a Certification of Mitigation Plan Completion and supporting evidence.

“CMEP” is the FERC-approved Compliance Monitoring and Enforcement Program.

Goal: 60 days from submittal of CMP to the date it is either accepted or rejected. This is an internal goal; the CMEP specifies no goals or requirements.

Notes: CMP reviews are generally more involved and time consuming than MP reviews. In addition to a Certification of Mitigation Plan Completion document, an entity must submit evidence demonstrating that the mitigating activities outlined in a particular MP are complete. These CMP reviews require WECC SMEs not only to review the evidence provided by entities but often also to issue data requests in order to get the information necessary to verify MP completion. Figures also include the federal cases on hold (pending legal resolution); in these cases MPs are not required, and entities have been submitting them voluntarily.

Appendix - Explanation of Measures

Page 22: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

22

Slides 9-11 Explanation: Enforceable Violations and Enforceable Violations by Percent

Measure: These charts measure the number of CIP and O&P enforceable violations by reporting method (Self-Report, Self-Certification, Audit, and Other) and show the breakdown by percent as well.

Measurement Period: One year (4 most recent years are shown)

Purpose: This does not measure performance or efficiency. It is used to track the number of discovered violations relative to the various reporting methods, primarily to assess trends. This information occasionally is requested by stakeholders.

Terms: "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation and Complaint.

“Enforceable violations” are violations that are reviewed and validated.

Goal: N/A

Notes: Dismissed violations are not enforceable.

Appendix - Explanation of Measures

Page 23: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

23

Slide 12 Explanation: Aging of Violations in Caseload

Measure: This chart measures the number and age of violations that have not been filed with NERC.

Measurement Period: Snapshot as of specified date.

Purpose: Identifies and tracks older violations. This information is used to help manage violation processing priority and resources.

Terms: “Caseload” refers to all discovered violations in WECC’s inventory that have not been filed with NERC.

Goal: N/A

Notes: Excludes Federal Entity violations on hold.

Appendix - Explanation of Measures

Page 24: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

24

Slide 13 Explanation: Regional Entity Inventory

Measure: This is a chart created by NERC, which shows three separate NERC Enforcement Metrics for the eight Regional Entities:

Regional Inventory – Number of active violations that have not been submitted to NERC for processing and review.

Average Months in Regional Inventory – Average number of months violations have been in the Region’s inventory from discovery to present (the day the metric is computed).

Regional Caseload Index – Violations in Regional Inventory divided by the total number of violations filed with NERC over the previous 12 months (NOCVs, SNOPs, FFTs, SAs, Dismissals), multiplied by 12.

Measurement Period: Quarterly (Most recent 12 months are shown)*

Purpose: Regional Inventory – Used in the calculation of the Regional Caseload Index.

Average Months in Regional Inventory – Tracks how long violations have been in the Regional Inventory.

Regional Caseload Index – Computes the number of months it would take to clear the violations in the Region’s inventory, based upon the average monthly processing rate over the preceding 12-month period (illustrated below).
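To make the computation concrete, here is a minimal worked example of the Regional Caseload Index as defined above. The inventory of 304 and index of 8.4 are WECC’s figures from the Regional Entity Inventory table (slide 13); the 12-month filed-violation count of 434 is a hypothetical value chosen only so the arithmetic reproduces that index.

\[
\text{Regional Caseload Index} = \frac{\text{Regional Inventory}}{\text{violations filed with NERC over the prior 12 months}} \times 12
\]

\[
\frac{304}{434} \times 12 \approx 8.4 \ \text{months} \quad \text{(hypothetical filed count of 434)}
\]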

Terms: “NOCV” refers to Notice of Confirmed Violation. “SNOP” refers to Spreadsheet Notice of Penalty. “FFT” refers to Find, Fix, and Track. “SA” = Settlement Agreement.

Goal: N/A

Notes: * These new metrics are developed based on violations processed in the first 6 months of 2013. Each subsequent quarter the metrics will be computed with additional months of processing data reflected until a 12-month average is obtained; the 12-month period will roll forward thereafter. NERC plans to develop metrics for the BOTCC with 6, 9 and 12 months of processing data for the August 2013, November 2013 and February 2014 meetings, respectively. Violations that are held by appeal, a regulator, or a court are excluded from computation of these metrics.

Appendix - Explanation of Measures

Page 25: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

Constance B. White

Vice President, Compliance

WECC

[email protected]

Page 26: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

Ruben Arredondo
Senior Legal Counsel

Board Oversight of Compliance Post-Bifurcation
WECC Compliance Committee Update

September 18, 2013

Page 27: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

Laura Scholl
Managing Director – Stakeholder Outreach

WECC Compliance Committee Update
September 18, 2013

Page 28: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

28

Participation in CUG and CIPUG

[Column chart: CUG, CIPUG, and Webinar attendance per meeting; meetings: W 2011 Marina Del Rey, S 2011 Portland, F 2011 Tempe, W 2012 Anaheim, S 2012 SLC, F 2012 San Diego, W 2013 Mesa, S 2013 Portland]

Page 29: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

29

• CUG/CIPUG combined attendance up 14 percent for 2012 over 2011.
• 2013* (Winter and Spring data only) is trending above 2012.

2013 Attendance

[Column chart: annual CUG/CIPUG attendance; years: 2011, 2012, YTD 2013*]

Page 30: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

30

• CUG: 98 percent positive
• CIPUG: 98 percent positive

Survey Feedback from Portland

Page 31: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

31

CIP 101 – Two Day Seminar

January 2012 session: Over 100 Participants, maxed capacity

December 2012 session: Sold-out 100+

Next Session scheduled September 24-25, 2013

SOLD OUT!

* Will double capacity for 2014
Off-site facility

Page 32: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

32

• Continue to attract 160-200 ports per call
• Integrating webcam this week.

Open Webinars

Page 33: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

33

• Offered three times a year, just before CUG/CIPUG meetings as introduction, overview, refresher (90 minutes)
  o January 17 – 280+ Ports
  o May 23 – 104 Ports
  o October 17 – TBD

Compliance 101 – Webinar

Page 34: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

34

Collaboration with WICF

Participate in monthly Steering Committee calls

Attend Strategic Planning sessions

Coordinate agendas for WICF and CUG/CIPUG meetings

Provide “heads-up” information to WICF to share via its website

Page 35: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

35

• CUG/CIPUG Jan. 28-30, 2014
• CIP v.5 Roadshow Feb. 5-6, 2014
• CIP v.5 Roadshow March 19-20, 2014
• CUG/CIPUG June 3-5, 2014
• CIP 101 Sept. 24-25, 2013
• CUG/CIPUG Oct. 14-16, 2013
• Open Webinars Third Thursdays
• Compliance 101 Prior to each CUG
• BES Definitional Change Spring/Summer 2014

2014 Schedule of Outreach Events

Page 36: WECC Compliance Committee September 18, 2013 Salt Lake City, UT

Laura Scholl

Managing Director-Stakeholder Outreach

[email protected]

801-819-7619

Questions?