
Page 1:

UCDavis Computer Security Lab

Collaborative End-host Worm Defense Experiment

Senthil Cheetanceri, Denys Ma, Allen Ting, Jeff Rowe, Karl Levitt (UC Davis)

Phil Porras, Linda Briesemeister, Ashish Tiwari (SRI)

John Mark Agosta, Denver Dash, Eve Schooler (Intel Corp.)

Page 2:

Overview

• Introduction

• End-host Based Defense Approach

• Our DETER Experiment

• General Testing Tools

• A Worm Defense Testing Framework

• Simulations and Analysis

Page 3:

Cyber Defense Testing

• Validation Using Simulations and Analysis (L. Briesemeister - SRI)
– Quickly validate proposed cyber defense strategies
– Test a large variety of conditions and configurations

• Live Deployment
– Validation using real operating conditions
– Reluctance to deploy systems without serious testing
– Testing response to live attacks is impossible

• DETER Testbed (S. Cheetancheri – UC Davis)

– Tests defense systems using real code on real systems
– Attacks can be safely launched
– Bridges the testing gap between simulation and live deployment

Page 4:

The Specific Problem

• Oftentimes, centralized network worm defenses are unavailable:
– Mobile Users
– Home Offices
– Small Businesses
– Network defenses have been bypassed or penetrated

• Local end-host detector/responders can form a last line of defense against large-scale distributed attacks.

• End-host detectors are “weak”:
– Without specific attack signatures, false positives are high.
– Local information isn’t sufficient to decide whether a global attack is occurring.

• Can “weak” end-host detectors be combined to produce a “strong” global detector that triggers a response?

• How can a federation of local end-host detectors be used to detect worm attacks?

Page 5:

Our Approach…

• Motivated by:

– Sequential Hypothesis Testing
Jung, J., Paxson, V., Berger, A., Balakrishnan, H., “Fast Portscan Detection Using Sequential Hypothesis Testing”, Proceedings of the IEEE Symposium on Security and Privacy, 2004

– Collaborative Intrusion Detection and Inference
Agosta, J.M., Dash, D., Schooler, E., Intel Research

• Probabilistic inference by a federation of end-host local detection points.

• Protocol for distributing alert information within the federation.

Page 6:

Distributed Decision Chains

[Figure: an n-by-n matrix of likelihood ratios of Bernoulli trials, where element { i, j } represents j local alerts seen after i steps.]

• Worm threshold determines elements needed for an attack decision

• False alarm threshold determines elements for a false alarm decision

[Figure: example decision chains through the matrix. One chain of observations accumulates enough local alerts to cross the worm threshold and ends in a WORM! decision; another falls to the false-alarm threshold and ends as a False Alarm.]
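One way to read the matrix, using the detector error rates and decision thresholds defined on the following slides (Fp, Fn, dD, dF): under the Bernoulli model, element { i, j } would hold the likelihood ratio

L(i, j) = ((1 - Fn) / Fp)^j * (Fn / (1 - Fp))^(i - j)

so a chain that reaches a cell where L(i, j) exceeds the worm threshold T1 = dD / dF ends in a WORM! decision, while a chain that falls below the false-alarm threshold T0 = (1 - dD) / (1 - dF) ends as a False Alarm.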

Page 7:

Sequential Hypothesis Testing

H0 – Hypothesis that there is no worm

H1 – Hypothesis that there is a worm

Y = { 0 – No Alert raised

{ 1 – Alert raised

P[Y=1 | H0] = Fp        P[Y=0 | H0] = (1 - Fp)
P[Y=0 | H1] = Fn        P[Y=1 | H1] = (1 - Fn)

Page 8:

TRW Parameters

Given:

Fp - False +ve rate of individual detectors.

Fn – False –ve rate of individual detectors.

Desired:
dD – desired rate of Detection

dF – desired rate of False positive

Page 9:

Decision Making

Likelihood Ratio, L:

L = ( P[Y1|H1] · P[Y2|H1] · ... · P[Yn|H1] ) / ( P[Y1|H0] · P[Y2|H0] · ... · P[Yn|H0] )

If L < T0, decide No Worm, where T0 = (1 - dD) / (1 - dF)
If L > T1, decide Worm, where T1 = dD / dF
(If T0 ≤ L ≤ T1, no decision is made yet and observation continues.)
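A minimal sketch of this decision rule in Python (the Fp, Fn, dD, and dF values below are illustrative assumptions, not the experiment's actual settings):

def sprt_decision(observations, Fp=0.05, Fn=0.1, dD=0.99, dF=0.01):
    """observations: sequence of Y values (1 = local alert raised, 0 = no alert).
    Returns 'worm', 'no worm', or 'undecided' (keep collecting observations)."""
    T0 = (1 - dD) / (1 - dF)   # lower threshold: declare no worm (false alarm)
    T1 = dD / dF               # upper threshold: declare worm
    L = 1.0
    for y in observations:
        if y == 1:
            L *= (1 - Fn) / Fp       # P[Y=1|H1] / P[Y=1|H0]
        else:
            L *= Fn / (1 - Fp)       # P[Y=0|H1] / P[Y=0|H0]
        if L > T1:
            return "worm"
        if L < T0:
            return "no worm"
    return "undecided"

print(sprt_decision([1, 1, 1]))   # a short run of alerts pushes L past T1
print(sprt_decision([0, 0, 0]))   # a run of quiet hosts pushes L below T0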

Page 10:

Experiment Components

• Local Detectors

• Defense Agents

• “Vulnerable” Service

• Safe Worm Generator

• Background Traffic Generator

Page 11:

End-host Detector and Defense Agents

• Implement a “weak” end-host local detector
– An alert is generated for every connection to an un-serviced port

– False positive rate for local detection is high (one alert per hour per machine at UCDavis)

• Defense agents send local detector alerts to the defense agents on other end-hosts
– Recipients are chosen at random for each alert

• Local alerts are aggregated into a global alert message. Agents use probabilistic inference to decide whether this is likely to be a worm or a false alarm, or propagate the global alert message if no decision has been reached (see the sketch below).
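A hedged sketch of that aggregation step (the message fields, peer list, and parameter values here are hypothetical illustrations, not the experiment's actual protocol format):

import random

PEERS = ["node-%d" % n for n in range(200)]   # the federation of end-hosts

def handle_global_alert(msg, local_alert_seen, Fp=0.05, Fn=0.1, dD=0.99, dF=0.01):
    """msg carries the chain state {'steps': i, 'alerts': j}: j local alerts
    seen after i hosts. This host folds in its own observation, then decides
    or forwards."""
    i = msg["steps"] + 1
    j = msg["alerts"] + (1 if local_alert_seen else 0)
    # Likelihood ratio for j alerts in i Bernoulli trials (same test as above).
    L = ((1 - Fn) / Fp) ** j * (Fn / (1 - Fp)) ** (i - j)
    if L > dD / dF:
        return ("WORM", None)                    # trigger the global response
    if L < (1 - dD) / (1 - dF):
        return ("FALSE ALARM", None)             # chain terminates quietly
    # No decision yet: forward the aggregated alert to a randomly chosen peer.
    return ("FORWARD", {"steps": i, "alerts": j, "to": random.choice(PEERS)})

print(handle_global_alert({"steps": 1, "alerts": 1}, local_alert_seen=False))
# -> ('FORWARD', {'steps': 2, 'alerts': 1, 'to': ...}) : undecided, so the chain grows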

Page 12:

Experimental Setup

• 200 Virtual Nodes on 40 Physical nodes.

• All nodes are on a single DETER LAN.

• 50 nodes are vulnerable
– Alarms aren’t generated for worm connections to these nodes

• All nodes have a local detector and defense agent

• Single node serves as the external infection source. Internal infected hosts also generate worm traffic

Page 13:

Detection Time

Random Scanning Worm @ 2 scans/sec

Full Saturation: 12 minutes after launch

Worm Detection: 4 minutes after launch

Infected Nodes: 5 (10%)

Page 14:

Results

• For the random scanning worm:
– Full saturation of infections occurs at 15 minutes post launch

– Worm detection trigger at 4 minutes after launch with 10% of vulnerable machines already infected.

– A global worm alert broadcast could protect the remaining 90% of vulnerable machines

• False alarms
– At 4 false alarms per minute over all 200 machines (from the UC Davis laboratory network), no worm alert is triggered

– Live testing is needed to evaluate false alarm performance over a longer time period

Page 15:

Summary

• Simulations by Intel Research show that a distributed TRW algorithm can be useful to detect worms using only “weak” end-host detectors.

• Emulated testing confirms that the algorithm and protocol work on live machines in the presence of real traffic

• Code tested and working on real Unix machines in the DETER testbed will be deployed in the UCD and Intel networks for further testing and evaluation.

Page 16:

Testing Tools

• NTGC - A tool for Network Traffic Generation Control and Coordination

• WormGen – Safe worm generation for cyber defense testing

• A framework for worm defense evaluation

Page 17:

NTGC: A tool for Network Traffic Generation Control and Coordination

• To develop a background traffic generation tool which can:
– Build a traffic model by extracting important traffic parameters from a real traffic trace (tcpdump format)

– Automatically configure the testbed nodes to generate traffic based on the traffic model extracted from real traffic

– Utilize existing traffic generators (e.g., TG, D-ITG, Harpoon) as low-level packet generation components

– Generate real TCP connections

Page 18:

Architecture

NTGC consists of the following components:

• Traffic Analyzer
The traffic analyzer takes the trace data as input and reconstructs complete TCP connections.

• Traffic Filter
The traffic filter can manipulate the traffic parameter data generated by the traffic analyzer.

• Network Address Mapping Tool
This module maps the IP addresses of the packet trace into the DETER experimental network IP addresses (see the sketch after this list).

• Configuration File Generator
– This module takes the output from the traffic analyzer (or traffic filter) and compiles it into a TG- or TTCP-compatible configuration file
– Parses the flow data generated by the traffic analyzer and traffic filter, and then sends the flow information to the corresponding remote hosts

• Command and Flow Data Dispatcher
We use this tool to send commands to control the NTGC agents running on each DETER node.

• Low-level packet generators (e.g. TG, TTCP, D-ITG, Harpoon)
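For instance, the address mapping step might look like this minimal sketch (the rule table and its format are assumptions for illustration, not NTGC's actual interface):

import ipaddress

REMAP_RULES = {
    "192.168.0.0/16": "10.1.0.0/16",   # trace prefix -> experiment prefix (example values)
    "131.243.0.0/16": "10.2.0.0/16",
}

def remap(addr):
    """Rewrite one trace IP into the DETER experiment's address space,
    preserving the host portion of the address."""
    ip = ipaddress.ip_address(addr)
    for src, dst in REMAP_RULES.items():
        src_net = ipaddress.ip_network(src)
        if ip in src_net:
            dst_net = ipaddress.ip_network(dst)
            offset = int(ip) - int(src_net.network_address)
            return str(ipaddress.ip_address(int(dst_net.network_address) + offset))
    return addr   # addresses outside the rules pass through unchanged

print(remap("192.168.5.77"))   # -> 10.1.5.77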

Page 19:

Modular Diagram

[Diagram: raw traces 1..n feed the Traffic Analyzer (merge traces, timestamp normalization, reconstruct TCP connections, generate flow data), which produces Connection Data and Flow Data; the Traffic Filter applies filtering, address remapping, scale up/down, duplication, and removal, guided by address remapping rules and a topology file; its output goes to the Configuration File Generator.]

Page 20:

NTGC Summary

• The traffic analyzer is able to generate the flow-level data in XML format.

• We are able to manipulate the traffic parameters within the XML format flow data.

• We tested the configuration file generation and dispatching features on the DETER testbed with a 40-node topology. The configuration file generator produced TG-compatible configuration files for all 40 nodes and dispatched them to all the nodes.

• We observed traffic being sent and received between all the experimental nodes, based on the traffic model derived from the WIDE trace.

Page 21:

WormGen – Safe Worm Generation

• On demand distributed attack behavior is needed for the evaluation of defenses.

• Nobody wants to implement and deploy attacks that spread automatically using real vulnerabilities.

• How to produce realistic attack behavior without actually launching an attack?

• WormGen generates a propagating worm on the test network without using malcode.

Page 22:

The worm simulation network consists of several networked agents and a single controller.

Page 23:

The controller assigns each agent a role for a given worm:
1) Vulnerable (denoted by a red x)
2) Vulnerable and initially infected (red x with XML code)
3) Not vulnerable (denoted by a green check)

Page 24:

The controller sends a start command to the initially infected agent(s). The agent processes the XML instructions, or “worm”, for information about how to spread.

Page 25:

The agent consults the information in the PortScanRate element to determine the speed at which it "scans".

Page 26:

Based on the "probability" values of the RandomScan and LocalSubnetScan elements, the agent chooses which address range to target.
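For illustration, a hypothetical worm description using these elements and one way an agent might read it (only the element names come from the slides; the attribute names and overall structure are guesses):

import random
import xml.etree.ElementTree as ET

WORM_XML = """
<Worm name="demo-worm">
  <PortScanRate scansPerSecond="2"/>
  <RandomScan probability="0.8"/>
  <LocalSubnetScan probability="0.2"/>
</Worm>
"""

worm = ET.fromstring(WORM_XML)
scan_rate = float(worm.find("PortScanRate").get("scansPerSecond"))
p_random = float(worm.find("RandomScan").get("probability"))

# Pick the next target range according to the probability values.
target_range = "random address" if random.random() < p_random else "local subnet"
print(f"scan at {scan_rate} scans/sec, next target range: {target_range}")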

Page 27:

For each infected agent, a new address range is chosen based on the probability values, and the attack cycle continues.

Page 28:

or, once again, simply sends the worm. Only the vulnerable agents are infected.

Page 29:

When the worm is stopped, the controller gathers information from each agent and processes it into a report.

Page 30:

Towards a Framework for Worm Defense Evaluation

Motivation: Provide a framework for easy evaluation of worm defenses in the DETER test-bed environment.

[Diagram: Worm, Topology, and Defense definitions feed an Evaluation through the Test-bed API.]

Page 31:

The Framework itself

Page 32:

Features:

• Test-bed programming is transparent to the experimenter.

• Hooks for users’ defense, worms and background traffic replay.

• Event Control System for executing series of experiments in batch mode.

• Standardized vulnerable servers

• Worm library

Page 33:

Advantages

                               Current Approach    Our Approach
Approach                       Custom tools        Standardized tools
Time to first experiment       Hours to weeks      Hours
Setup time : Expt time ratio   10:1                1:100
Testbed details knowledge      Required            Not required

Page 34:

Example: Hierarchical Defense

Page 35:

Analysis

[Plots: No Defense (only 5 iterations) vs. Defense Turned On (10 iterations)]

Page 36:

Future Work

• Traffic analysis on network components

• Provide default topologies
– business networks, academic networks, defense networks, etc.

• Counter the effect of scale-down

• Provide a formal language to describe the API for this framework

Page 37:

Next Steps

• Implement and test other cooperative protocols
– Multicast
– Channel Biasing
– Hierarchical Aggregation

• Include a variety of local end-host detectors with differing performance
– More sophisticated Bayesian Network model developed by Intel Corp.

• Optimize local detector placement in the cooperative network