Security Management Metrics
Vicente Aceituno, 2008
FIST Conference, Madrid, March 2008
About me
Vice president of the ISSA Spain chapter. www.issa-spain.org
Vice president of the FIST Conferences association. www.fistconference.org
Author of a number of articles (search: vaceituno wikipedia).
Director of the ISM3 Consortium. The consortium promotes ISM3, an ISMS standard. ISM3 is the main source for this presentation. www.ism3.com
The world without Metrics
Management vs Engineering
Security Engineering: Design and build systems that can be used securely.
Security Management: Employ people and systems (that can be well or badly engineered) safely.
Targets vs Outcomes
Activity and Targets are weakly linked.
Targets: +Security, -Risk, Trust.
Activity: keep systems updated; assign user accounts; inform users of their rights.
Definition
Metrics are quantitative measurements that can be interpreted in the context of a series of previous or equivalent measurements.
Metrics make management possible:
1. Measurement – some call this "metrics" too.
2. Interpretation – some call this "indicator".
3. Investigation – when appropriate, logs are key here; common cause or special cause.
4. Rationalization
5. Informed Decision
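As a minimal sketch of the first two steps, a new measurement can be interpreted against the series of previous ones; all figures and the 3-sigma rule below are illustrative assumptions, not part of the ISM3 standard:

```python
from statistics import mean, stdev

# Hypothetical weekly measurements of one metric (e.g. user accounts assigned)
history = [120, 130, 125, 118, 122, 127, 131, 124]

def interpret(history, new_value, k=3.0):
    """Step 2 (Interpretation): compare a new measurement against the
    previous series; flag it when it falls more than k standard
    deviations from the historical mean."""
    m, s = mean(history), stdev(history)
    return "normal" if abs(new_value - m) <= k * s else "investigate"

print(interpret(history, 126))  # close to the historical mean -> "normal"
print(interpret(history, 400))  # far outside -> "investigate" (step 3)
```

An "investigate" result is what triggers the Investigation step, where logs help distinguish common causes from special causes.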
Qualitative vs Quantitative Measurement
William Thomson (Lord Kelvin): "I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science, whatever the matter may be."
Meaning: "What can't be measured, can't be managed."
Interpretation
It doesn’t make sense to set thresholds beforehand. You have to learn what is normal to find out what is abnormal.
Thresholds can be fuzzy, producing false positives and false negatives. Example: 1000 students tested for HIV; 10 have it.

Test positive for HIV: 9 have HIV, 99 don't.
Test negative for HIV: 1 has HIV, 891 don't.
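The point of the example can be checked with a short calculation: of the 108 students who test positive, only 9 actually have HIV, so a positive result is wrong more than nine times out of ten.

```python
# Figures from the example: 1000 students, 10 infected
true_pos, false_pos = 9, 99    # students who test positive
false_neg, true_neg = 1, 891   # students who test negative

# Probability of infection given a positive test (positive predictive value)
ppv = true_pos / (true_pos + false_pos)

# False positive rate: healthy students who nevertheless test positive
fp_rate = false_pos / (false_pos + true_neg)

print(round(ppv, 3))      # 0.083
print(round(fp_rate, 2))  # 0.1
```

This is why interpreting a metric requires knowing its false positive and false negative rates, not just the raw reading.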
Interpretation
Is it successful? Is it normal? How does it compare against peers?
Interpretation
Are outcomes better fit to their purpose? Are outcomes getting closer to or further from the target? Are we getting fewer false positives and false negatives? Are we using resources more efficiently?
Rationalization
Is the correction/change working? Is it cost effective? Can we meet our targets with the resources we have? Are we getting the same outputs with fewer resources?
Decisions
Good Metrics are SMARTIED
S.M.A.R.T.:
Specific: the metric is relevant to the process being measured.
Measurable: metric measurement is feasible at reasonable cost.
Actionable: it is possible to act on the process to improve the metric.
Relevant: improvements in the metric meaningfully enhance the contribution of the process towards the goals of the management system.
Timely: the metric measurement is fast enough to be used effectively.
+Interpretable: interpretation is feasible (there is comparable data) at reasonable cost (false positive and false negative rates are low enough).
+Enquirable: investigation is feasible at reasonable cost.
+Dynamic: the metric values change over time.
Fashion vs Results
Real Time vs Continuous Improvement: management is far more than incident response.
Risk Assessment as a Metric: only as useful as the investigation results.
Certification / Audit: compliant / not compliant is NOT a metric.
What are good Metrics?
Activity: the number of outcomes produced in a time period.
Scope: the proportion of the environment or system that is protected by the process.
Update: the time since the last update or refresh of process outcomes. (Are outcomes recent enough to be valid?)
Availability: the time since a process has performed as expected upon demand (uptime), the frequency and duration of interruptions, and the time interval between interruptions.
Efficiency / ROSI: ratio of outcomes to the cost of the investment in the process. (Are we getting the same outcomes with fewer resources? Are we getting more/better outcomes with the same resources?)
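Assuming process outcomes are logged with timestamps (every figure below is invented for illustration), the first few of these metrics reduce to simple arithmetic:

```python
from datetime import datetime, timedelta

now = datetime(2008, 3, 1)
# Hypothetical log: timestamps of outcomes produced by the process this quarter
outcomes = [now - timedelta(days=d) for d in (1, 3, 7, 10, 20)]

protected, total = 180, 200   # systems covered by the process vs systems in scope
cost = 5000.0                 # euros invested in the process this quarter

activity = len(outcomes)                 # outcomes produced in the period
scope = protected / total                # proportion of environment protected
update = (now - max(outcomes)).days      # days since the last outcome
efficiency = activity / cost             # outcomes per euro (ROSI-style ratio)

print(activity, scope, update)  # 5 0.9 1
```

Availability would be computed the same way from interruption timestamps (uptime, interruption frequency and duration).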
What are good Metrics?
Efficacy / Benchmark: ratio of outcomes produced in comparison to the theoretical maximum. Measuring the efficacy of a process implies comparison against a baseline. (Are outputs better fit to their purpose? Compare against industry/peers to show relative position.)
Load: ratio of available resources in actual use to produce the outcomes, such as CPU load, repository capacity, bandwidth, licenses, and overtime hours per employee.
Accuracy: rate of false positives and false negatives.
Examples
Activity: number of successful access attempts.
Scope: % of resources protected with Access Control.
Update: time elapsed since the last successful access attempt.
Availability: % of time Access Control is available.
Efficiency / ROSI: successful access attempts per euro.
Efficacy / Benchmark: malicious access attempts failed vs malicious access attempts successful; legitimate access attempts failed vs legitimate access attempts successful.
Load: mean and peak % of GB, Mb/s, CPU and licenses in use.
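Taking the access-control example, efficacy and accuracy can be expressed as ratios; the counts below are made up for the sketch:

```python
# Hypothetical counts for an access-control process over one period
malicious_failed, malicious_ok = 97, 3     # attacks blocked vs not blocked
legit_ok, legit_failed = 9500, 120         # users served vs wrongly denied

# Efficacy: outcomes relative to the theoretical maximum (block every attack)
efficacy = malicious_failed / (malicious_failed + malicious_ok)

# Accuracy on the legitimate side: rate of false rejections
false_rejection_rate = legit_failed / (legit_failed + legit_ok)

print(round(efficacy, 2))              # 0.97
print(round(false_rejection_rate, 3))  # 0.012
```

Benchmarking means comparing these ratios against industry or peer figures, not reading them in isolation.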
Metrics and Capability
Undefined: the process might be used, but it is not defined.
Defined: the process is documented and used.
Managed: the process is Defined and the results of the process are used to fix and improve the process.
Controlled: the process is Managed and milestones and the need for resources are accurately predicted.
Optimized: the process is Controlled and improvement leads to a saving in resources.
Capability: Undefined
Measurement: none. Interpretation: none.
Capability: Defined
Measurement: none. Interpretation: none.
Investigation (when appropriate, logs are key here):
Common cause (changes in the environment, results of management decisions)
Special cause (incidents)
Rationalization for use of time, budget, people and other resources: not possible.
Informed Decision: not possible.
Capability: Managed
Measurement: Scope, Activity, Availability.
Interpretation: Normal? Successful? Trends? Benchmarking: how does it compare? Efficacy.
Investigation (common cause, special cause): find faults before they produce incidents.
Rationalization: possible. Informed Decision: possible.
Capability: Controlled
Measurement: Load, Update.
Interpretation: Can we meet our targets in time with the resources we have? What resources and time are necessary to meet our targets?
Investigation: find bottlenecks.
Rationalization: possible. Informed Decision, Planning: possible.
Capability: Optimized
Measurement: Efficiency (ROSI).
Interpretation, Investigation, Rationalization, Informed Decision, Planning, Tradeoffs (point of diminishing returns): possible.
Metric Specification
Name of the metric;
Description of what is measured;
How the metric is measured;
How often the measurement is taken;
How the thresholds are calculated;
Range of values considered normal for the metric;
Best possible value of the metric;
Units of measurement.
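A specification along these lines could be captured as a simple record; the class and every field value below are illustrative, not a format defined by ISM3:

```python
from dataclasses import dataclass

@dataclass
class MetricSpec:
    name: str            # name of the metric
    description: str     # what is measured
    method: str          # how the metric is measured
    frequency: str       # how often the measurement is taken
    threshold_rule: str  # how thresholds are calculated
    normal_range: tuple  # range of values considered normal
    best_value: float    # best possible value
    units: str           # units of measurement

spec = MetricSpec(
    name="Access Rights Granted",
    description="New access rights granted to users",
    method="Count of grant events in the access-control log",
    frequency="weekly",
    threshold_rule="mean +/- 3 standard deviations over the last 26 weeks",
    normal_range=(100.0, 1600.0),
    best_value=0.0,
    units="rights/week",
)
print(spec.frequency)  # weekly
```

Writing the specification down this way forces every field on the slide to be filled in before the metric is put into use.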
Metrics Representation
[Charts: "Access Rights Granted" plotted per week, Week 10 through Week 49, shown at several vertical scales]
Using Metrics
[Chart: "Accumulated Recommendations per Owner (sum of days)", January through December, one series per owner: Mr Blue, Mr Pink, Mr Yellow, Mr Purple, Mr Soft Blue, Mr Red, Mr Green, Mr Orange]
Using security management metrics
Key Goal Indicators
Key Performance Indicators
Service Level Agreements / Underpinning Contracts
Balanced Scorecard (Customer, Internal, Stakeholder, Innovation: goals and measures)
Learn to implement High Performance Security Management Processes http://cli.gs/ism3
Web: www.inovement.es
Video Blog: youtube.com/user/vaceituno
Blog: ism3.com
Twitter: twitter.com/vaceituno
Presentations: slideshare.net/vaceituno/presentations
Articles: slideshare.net/vaceituno/documents
THANK YOU
With the sponsorship of:
www.fistconference.org