Copyright 2010 Trend Micro Inc.
Measuring the Actual Security that Vendors Provide to Customers
The Need for AV Product Testing Reform
An executive session with Trend Micro CTO Raimund Genes and industry guests
Four ways to stop malware
1. Block malware from arriving at the endpoint.
   e.g., web filtering; web reputation services
2. Stop malware files from executing on the endpoint.
   e.g., signature-based scanning of files
3. Interrupt malware doing bad things on execution.
   e.g., behavior monitoring
4. Protect vulnerabilities from being exploited.
   e.g., disable access to known vulnerabilities until patched
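The four layers above can be sketched as an ordered pipeline. This is a minimal illustration, not any vendor's implementation; all names, rules, and data are hypothetical.

```python
# A minimal sketch (all names and rules hypothetical) of the four
# protection layers applied in order to an arriving file.

BAD_URLS = {"http://bad.example/payload.exe"}   # layer 1: web reputation
BAD_SIGNATURES = {b"KNOWN-MALWARE"}             # layer 2: file signatures

def behaves_badly(events):                      # layer 3: behavior monitoring
    return "modify_autorun" in events

def exploits_vulnerability(events):             # layer 4: vulnerability shield
    return "known_exploit_pattern" in events

def protect(url, file_bytes, runtime_events):
    """Return the first layer that stops the threat, or None if it gets through."""
    if url in BAD_URLS:
        return "1: blocked at arrival"
    if any(sig in file_bytes for sig in BAD_SIGNATURES):
        return "2: blocked before execution"
    if behaves_badly(runtime_events):
        return "3: stopped on execution"
    if exploits_vulnerability(runtime_events):
        return "4: exploit shielded"
    return None
```

A test that only exercises layer 2 (signature scanning) never observes the other three layers, which is the crux of the argument that follows.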
But traditional testing only counts one
1. Block malware from arriving at the endpoint.
   e.g., web filtering; web reputation services
2. Stop malware files from executing on the endpoint.
   e.g., signature-based scanning of files
3. Interrupt malware doing bad things on execution.
   e.g., behavior monitoring
4. Protect vulnerabilities from being exploited.
   e.g., disable access to known vulnerabilities until patched
Traditional AV product testing only measures detection.
As a result …
“There is a desperate need for new standards for today’s anti-virus products. The dominant paradigm, scanning directories of files, is focused on old and known threats, and reveals little about product efficacy in the wild.”
Williamson & Gorelik (2007)
Test Labs are responding
• Independent testing labs have introduced new testing methods
• Many new metrics attempt to better measure actual security
• But the labs have trouble keeping up with changes:
  – New cybercriminal techniques
  – Anti-malware solution innovations
• As a result, there is now chaos in AV testing metrics & results

Names of testing metrics in use:
• Anti-malware detection
• Caught initially on download
• Caught on first exposure
• Caught subsequently on execution
• Caught with repeated exposure
• Drive-by-download protection
• Dynamic detection
• End-to-end web threat protection
• Exposure layer web threat protection
• Infection layer web threat protection
• Internet-connected detection
• Malware blocking
• Malware detection
• Overall web threat protection
• Proactive detection
• Web security blocking
• Web threat blocking effectiveness
• Whole product dynamic test
• Zero-day protection
AV Testing Metrics Chaos
[Chart: test scores (0%–100%) for AV products W, X, Y, and Z across test labs A–F, showing widely inconsistent results for the same product from lab to lab.]
AV Testing Metrics Chaos
• No consistency of testing method
• No consistency of applied threat stimuli
• No consistency of metrics definition
• No consistency of results
Consequently:
• Little value to buyers of security products
Blocking in the cloud before arrival
• 92% of malware arrives over the Internet.
• The source is often easier to identify than the malware files.
• Blocking files from a bad source does not require file detection.
• Traditional test methods do not credit blocking by source URL.
[Diagram: a malware file arriving over the Internet carries the source address of the file, e.g., http://abc.com/xyz.exe. The malware can be blocked either because the source is known bad (URL reputation) or because the file is known bad (signature).]
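The point that source blocking requires no file detection can be made concrete: a URL-reputation check runs before the file is ever downloaded, so a brand-new binary with no known signature is still blocked when its source URL is known bad. A sketch, with illustrative data only:

```python
# Sketch (illustrative data) of URL-reputation blocking: the URL check runs
# before the file is fetched, so an unknown file from a known-bad source
# is blocked without any signature ever existing for it.

BAD_URLS = {"http://abc.com/xyz.exe"}    # cloud reputation list (from the slide)
KNOWN_SIGNATURES = set()                 # empty: the file itself is unknown

def check_download(url, fetch):
    """Return the verdict for a download request; `fetch` retrieves the bytes."""
    if url in BAD_URLS:
        return "blocked by source URL (file never fetched)"
    file_bytes = fetch(url)
    if file_bytes in KNOWN_SIGNATURES:
        return "blocked by file signature"
    return "allowed"
```

A directory-scanning test never exercises the first branch, which is why it under-credits cloud-based blocking.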
A new threat every 1.5 sec.
• Thousands of new threats per day overwhelm test methods.
• Stored threats become irrelevant before a test is completed.
• Speed of response to new threats is more important than detection of old threats.
• Many threats are “old” in hours to days – not weeks.
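The headline rate is easy to verify: one new threat every 1.5 seconds works out to tens of thousands per day, far beyond what a stored-sample test set can track.

```python
# Checking the headline figure: one new threat every 1.5 seconds.
seconds_per_day = 24 * 60 * 60          # 86,400 s
threats_per_day = seconds_per_day / 1.5
print(int(threats_per_day))             # 57600 new threats per day
```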
How long to respond to a new threat?
Time-to-Protect (average time after first exposure to a new threat)

AV Vendor:         #1   #2   #3   #4   #5   #6   #7   #8   #9   #10
Time-to-Protect:   5 h  7 h  16 h 20 h 30 h 32 h 38 h 40 h 45 h 46 h
… a metric that shows real differences among vendors.
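The metric itself is simple to compute: for each new threat, take the delay between first exposure and protection becoming available, then average. A minimal sketch with made-up timestamps:

```python
# A minimal sketch (hypothetical timestamps) of computing "time-to-protect":
# the average delay between first exposure to a new threat and the vendor
# delivering protection for it.
from datetime import datetime

# (first_exposure, protection_available) pairs for three sample threats
events = [
    (datetime(2010, 5, 1, 0, 0), datetime(2010, 5, 1, 4, 0)),   # 4 h
    (datetime(2010, 5, 2, 6, 0), datetime(2010, 5, 2, 12, 0)),  # 6 h
    (datetime(2010, 5, 3, 9, 0), datetime(2010, 5, 3, 17, 0)),  # 8 h
]

hours = [(p - e).total_seconds() / 3600 for e, p in events]
avg_time_to_protect = sum(hours) / len(hours)
print(f"{avg_time_to_protect:.1f} h")   # 6.0 h
```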
Key principles for AV product testing
• Credit for “protection” instead of “detection”
• “Real-time” or “dynamic” testing
• Reproducibility: Statistical not deterministic
• Broad and diverse relevant threat samples
• Measuring the vendor response e.g., “time-to-protect”
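The "statistical, not deterministic" principle can be illustrated with a sketch (run scores are made up): repeated dynamic test runs over live threats yield different scores, so a product's protection rate is reported as a mean with an uncertainty band rather than a single percentage.

```python
# Sketch (made-up run data) of statistical reproducibility: with live threat
# feeds, repeated runs give different scores, so the result is a mean plus
# an uncertainty bound, not one number.
import statistics

runs = [0.91, 0.88, 0.93, 0.90, 0.89]   # protection rate per test run
mean = statistics.mean(runs)
stdev = statistics.stdev(runs)
# rough 95% interval assuming run scores are approximately normal
low, high = mean - 2 * stdev, mean + 2 * stdev
print(f"protection rate: {mean:.2f} (95% ~ {low:.2f}-{high:.2f})")
```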
Comments from Industry Guests
• Gerhard Eschelbeck, CTO & SVP Engineering at Webroot
• Vik Phatak, Chairman & CTO at NSS Labs
• Andreas Marx, CEO at AV-Test
• Anil Somayaji, Director, Computer Security Lab, Carleton University