May 4, 1999 Application Intrusion Detection 1
Application Intrusion Detection
Robert S. Sielken
In Fulfillment Of
Master of Computer Science Degree
School of Engineering and Applied Science
University of Virginia
Outline
• Introduction
• State of Practice - OS IDS
• Case Studies
• Application Intrusion Detection
• Construction of an Application Intrusion Detection System (AppIDS)
• Conclusion
Introduction

• Intrusion Detection
  – determining whether some entity, the intruder, has attempted to gain, or worse, has gained unauthorized access to the system
• Intruders
  – Internal
  – External
• Objectives
  – Confidentiality
  – Integrity
  – Availability
  – Accountability
• Current State
  – done at the OS level, but with diminishing returns
  – what are the opportunities and limits of utilizing application semantics?
State of Practice - OS IDS

• Audit records
  – operating-system-generated collections of the events that have happened in the system over a period of time
• Events
  – results of actions taken by users, processes, or devices that may be related to a potential intrusion
• Threat Categories
  – Denial of Service
  – Disclosure
  – Manipulation
  – Masqueraders
  – Replay
  – Repudiation
  – Physical Impossibilities
  – Device Malfunctions
OS IDS - Approaches

• Anomaly Detection
  – Static
    • Tripwire, Self-Nonself
  – Dynamic
    • NIDES, Pattern Matching (UNM)
• Misuse Detection
  – NIDES, MIDAS, STAT
• Extensions - Networks
  – Centralized
    • DIDS, NADIR, NSTAT
  – Decentralized
    • GrIDS, EMERALD
OS IDS - Generic Characteristics

• Relation - an expression of how two or more values are associated
  – Statistical
  – Rule-Based
• Observable Entities - any object (user, system device, etc.) that has or produces a value in the monitored system that can be used in defining a relation
• Thresholds - determine how the result of evaluating the relation will be interpreted
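These three characteristics (a relation over values from observable entities, interpreted through a threshold) can be sketched in a few lines of code. This is only an illustration, not code from the thesis; the function name, the 3-sigma cutoff, and the example values are all assumptions.

```python
# Minimal sketch of a statistical relation with a threshold (illustrative
# only; the 3-sigma cutoff is an assumption, not from the thesis).
# The observable entity supplies the numeric values being related.

def statistical_relation(history, current, threshold_sigma=3.0):
    """Return True (anomalous) when `current` deviates from the
    historical mean by more than `threshold_sigma` standard deviations."""
    n = len(history)
    mean = sum(history) / n
    std = (sum((x - mean) ** 2 for x in history) / n) ** 0.5
    if std == 0:
        return current != mean  # degenerate history: any change is anomalous
    return abs(current - mean) > threshold_sigma * std
```

Tightening or loosening `threshold_sigma` is exactly the threshold fine-tuning that effectiveness depends on: a tight threshold catches more intrusions but also raises more false alarms.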
OS IDS - Generic Characteristics (cont'd)

• Effectiveness
  – fine-tuning of thresholds
  – frequency of relation evaluation
  – number of correlated values
  – hierarchy
AppIDS

• Guiding Questions
  – Opportunity – what types of intrusions can be detected by an AppIDS?
  – Effectiveness – how well can those intrusions be detected by an AppIDS?
  – Cooperation – how can an AppIDS cooperate with the OS IDS to be more effective than either alone?
Case Studies

• Electronic Toll Collection
  – numerous distributed devices
  – complementary device values
  – hierarchical
  – gathers data about monitored external behavior
  – accounting component
• Health Record Management
  – non-hierarchical
  – no devices beyond the controlling computer
  – no financial component
  – limited access
  – contains physical realities
  – data collection and scheduling components
Electronic Toll Collection (ETC)

• Devices
  – Toll Lane
    • Tag Sensor
    • Automated Coin Basket
    • Toll Booth Attendant
    • Loop Sensor
    • Axle Reader
    • Weigh-In-Motion Scale
    • Traffic Signal
    • Video Camera
  – Vehicle
    • Tag (Active/Passive)
ETC - Hierarchy

[Figure: hierarchy diagram - multiple Toll Lanes feed each Toll Plaza; the Toll Plazas, along with other devices, feed the Toll Management Center.]
ETC - Application Specific Intrusions

• Annoyance (3 methods)
• Steal Electronic Money (10 methods)
• Steal Vehicle (4 methods)
• Device Failure (1 method)
• Surveillance (2 methods)

Threat Categories → Specific Intrusions → Methods → Relations
ETC - Steal Service

Steal Service methods: (a) no tag and cover plate; (b) copy tag; (c) packet filter that discards all of a tag's packets.

Rel#  Relation                           Execution Location   (a)  (b)  (c)
 1    Tag vs. Historical (Time) (stat)   TBP/TMC                    X
 4    Tag vs. Historical (Sites) (stat)  TMC                        X
 5    Tag vs. Time (rule)                TMC                        X
 9    Tag vs. Axles (rule)               TBL                   X    X    X
25    Unreadable Tags (stat)             TBP/TMC                         X
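Relation 9, "Tag vs. Axles", is a simple rule-based relation. A minimal sketch of how it could be evaluated follows; the thesis specifies the rule, not this code, so the function and parameter names are assumptions.

```python
# Sketch of rule-based relation 9, "Tag vs. Axles" (illustrative; names
# are assumptions). The axle count recorded on the vehicle's tag should
# match the count the axle reader measures; a mismatch at the toll booth
# lane (TBL) would raise an anomaly alarm.

def tag_vs_axles(tag_axles, measured_axles):
    """Return True (anomalous) when the tag's claimed axle count
    disagrees with the axle reader."""
    return tag_axles != measured_axles
```

For example, a copied two-axle car tag presented by a five-axle truck trips the rule, which is why this one relation flags all three theft methods.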
Application Intrusion Detection

• Similarities
  – detect intrusions by evaluating relations to differentiate between anomalous and normal behavior
  – centralized or decentralized (hierarchical)
  – same threat categories
• Differences
  – anomaly detection using statistical and rule-based relations
  – internal intruders
  – event-causing entity
  – resolution
  – tightness of thresholds
  – event records
    • periodic
    • code triggers
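The two event-record styles in the last bullet differ in where the record is produced: periodic records are snapshots taken on a schedule, while code triggers are calls embedded in the application at points where it knows the semantics of the action. A sketch of the code-trigger style, with all names hypothetical:

```python
import time

# Sketch of the "code triggers" event-record style (all names are
# hypothetical, not from the thesis): the application emits a record
# inline, at the moment a semantically meaningful action occurs,
# rather than waiting for a periodic snapshot.

event_log = []

def emit(event_type, **fields):
    """Code trigger: called by the application itself."""
    event_log.append({"time": time.time(), "type": event_type, **fields})

def charge_toll(tag_id, amount):
    emit("toll_charged", tag=tag_id, amount=amount)  # trigger fires here
    # ... the application's actual charging logic would follow ...
```

Because the trigger sits inside the application, the record can carry application-level fields (the tag identity, the amount) that an OS-level audit record would not see.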
AppIDS (cont'd)

• Dependencies
  – OS IDS on AppIDS
    • None
  – AppIDS on OS IDS
    • basic security services
    • prevention of bypassing the application to access application components
• Cooperation
  – audit/event record correlation
  – communication
    • bi-directional
    • request-response bundles
  – complications
    • terms of communication
    • resource usage - lowest common denominator
Construction of an AppIDS

[Figure: AppIDS construction. Tools - Relation Specifier (produces the Relations), Event Record Specifier (produces the Event Record Structure and Timings), and Relation-Code Connector (produces the Observable Entity Locations in the Application). Generic components - Event Record Manager, Relation Evaluator, and Anomaly Alarm Handler.]
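One way the three generic components could fit together is sketched below. This is an assumed decomposition for illustration; the class names, method names, and the predicate-per-record interface are mine, not the thesis's.

```python
# Hypothetical sketch of the three generic components (interfaces are
# assumptions): the Event Record Manager stores records, the Relation
# Evaluator runs registered relations over them, and any anomaly is
# passed to the Anomaly Alarm Handler.

class EventRecordManager:
    def __init__(self):
        self.records = []

    def add(self, record):
        self.records.append(record)

class AnomalyAlarmHandler:
    def __init__(self):
        self.alarms = []

    def raise_alarm(self, relation_name, record):
        self.alarms.append((relation_name, record))

class RelationEvaluator:
    def __init__(self, manager, handler):
        self.manager, self.handler = manager, handler
        self.relations = {}  # name -> predicate over one event record

    def register(self, name, predicate):
        self.relations[name] = predicate

    def evaluate_all(self):
        for record in self.manager.records:
            for name, pred in self.relations.items():
                if pred(record):  # True means anomalous
                    self.handler.raise_alarm(name, record)
```

The tools would then be front ends that feed this machinery: the Relation Specifier produces the predicates to register, and the Event Record Specifier defines the record structure and timings the manager collects.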
Conclusion

• Opportunity
  – internal intruders (abusers)
  – anomaly detection with statistical and rule-based relations
  – same threat categories
• Effectiveness
  – resolution
  – tightness of thresholds
• Cooperation
  – detection
• Construction
  – tools
  – generic components
Health Record Management (HRM)

• Components
  – Patient Records
  – Orders – lists of all requests for drugs, tests, or procedures
  – Schedule – schedule for rooms for patient occupancy, laboratory tests, or surgical procedures (does not include personnel)
• Users
  – doctors, laboratory technicians, and nurses
HRM - Application Specific Intrusions

• Annoyance (4 methods)
• Steal Drugs (1 method)
• Patient Harm (6 methods)
• Surveillance (2 methods)

Threat Categories → Specific Intrusions → Methods → Relations
HRM - Patient Harm

Patient Harm methods: (1) administer wrong drug; (2) administer too much of a drug; (3) administer an allergic drug; (4) administer improper diet; (5) order needless drugs; (6) perform needless procedure.

Rel#  Relation                                 (1)  (2)  (3)  (4)  (5)  (6)
 2    Drug vs. Allergy (rule)                   X         X
 5    Drug vs. Diet (rule)                      X              X
 8    Drug vs. Historical (dosage) (stat)            X              X
24    Patient Test Results vs. Test Results     X    X    X    X
      (Historical) (stat)
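Relation 8, "Drug vs. Historical (dosage)", is a statistical relation: a new order's dosage should be fairly similar to historical prescriptions of the same drug. A hedged sketch follows; the 50% tolerance and the function name are assumptions, not values from the thesis.

```python
# Sketch of statistical relation 8, "Drug vs. Historical (dosage)"
# (illustrative; the tolerance value is an assumption). A new order's
# dosage is compared against the mean of past prescriptions of the drug.

def dosage_anomalous(historical_doses, new_dose, tolerance=0.5):
    """Return True when `new_dose` differs from the historical mean
    by more than `tolerance` (here 50%) of that mean."""
    mean = sum(historical_doses) / len(historical_doses)
    return abs(new_dose - mean) > tolerance * mean
```

A quadrupled dose trips the check, while normal prescription-to-prescription variation does not; that separation is what the tightness-of-thresholds discussion is about.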
ETC - Steal Service

Steal Service methods: (a) no tag and cover plate; (b) copy tag; (c) packet filter that discards all of a tag's packets.

 1. Tag vs. Historical (Time) (stat, TBP/TMC): Tag (Time of Day) should match Historical Time of Day. Detects: (b)
 2. Tag vs. Historical (Day) (stat, TBP/TMC): Tag (Day of Week) should match Historical Day of Week. Detects: (b)
 3. Tag vs. Historical (Frequency) (stat, TBP/TMC): Tag frequency (per day) should match Historical frequency (per day). Detects: (b)
 4. Tag vs. Historical (Sites) (stat, TMC): Tag (Sites) should match Historical sites. Detects: (b)
 5. Tag vs. Time (rule, TMC): Tag should not be reread within x minutes at any other toll booth. Detects: (b)
 6. Tag vs. Parking (rule, TMC): Tag (Identity) should not be listed as being in a parking facility (Parking). Detects: (b)
 7. Tag vs. Report of Stolen Tag (rule, TMC): Tag should not match that of a reported lost/stolen vehicle. Detects: (b)
 9. Tag vs. Axles (rule, TBL): Tag (Axles) should match Axles. Detects: (a), (b), (c)
10. Tag vs. Scale (rule, TBL): Tag (Weight) should match Scale. Detects: (a), (b), (c)
11. Tag + Toll + Coin-Toll vs. Traffic Signal (rule, TBL): number of tolls paid (tag/toll/coin-toll) equals number of green signals given. Detects: (a)
12. Tag + Toll + Coin-Toll vs. Video (rule, TBL): number of tolls paid (tag/toll/coin-toll) equals number of vehicles seen by the camera. Detects: (a), (c)
13. Tag + Toll + Coin-Toll vs. Loops (rule, TMC): number of tolls paid (tag/toll/coin-toll) equals number of vehicles seen by the loops. Detects: (a), (c)
Steal Service (cont'd)

Steal Service methods: (a) no tag and cover plate; (b) copy tag; (c) packet filter that discards all of a tag's packets.

15. Axles vs. Scale (rule, TBL): number of axles should match the scale reading. Detects: (a)
16. Axles vs. Toll (rule, TBL): Axles (cost) should match the toll collected. Detects: (a), (b), (c)
17. Axles vs. Coin-Toll (rule, TBL): Axles (cost) should match the coin toll paid. Detects: (a), (b), (c)
18. Toll vs. Scale (rule, TBL): toll collected should match the scale-based fare. Detects: (a), (b), (c)
19. Toll vs. Video (rule, TBL): toll collected should match the video vehicle determination. Detects: (a), (b), (c)
20. Coin-Toll vs. Scale (rule, TBL): coin toll paid should match the scale-based fare. Detects: (a), (b), (c)
21. Coin-Toll vs. Video (rule, TBL): coin toll paid should match the video vehicle determination. Detects: (a), (b), (c)
22. Traffic Signal vs. Video (rule, TBL): number of green signals equals number of vehicles seen by the video camera. Detects: (a)
23. Traffic Signal vs. Loops (rule, TMC): number of green signals equals number of vehicles seen by the loops. Detects: (a)
24. Video vs. Loops (rule, TMC): number of vehicles seen by the video camera equals number of vehicles seen by the loops. Detects: (a)
25. Unreadable Tags (stat, TBP/TMC): the number of unreadable tags should be fairly evenly distributed between lanes and toll booths. Detects: (c)
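Relation 25 expects unreadable tags to be spread fairly evenly across lanes; a lane whose count far exceeds the others suggests a packet filter discarding a tag's packets there. A sketch of one way to test this; the max-to-mean ratio test is an illustrative choice, since the thesis only states "fairly evenly distributed".

```python
# Sketch of statistical relation 25, "Unreadable Tags" (the ratio test
# and threshold are assumptions). Input is the unreadable-tag count
# observed in each lane over some period.

def unreadable_tags_skewed(counts_per_lane, ratio_threshold=3.0):
    """Return True when one lane's unreadable-tag count is far above
    the mean across all lanes."""
    mean = sum(counts_per_lane) / len(counts_per_lane)
    if mean == 0:
        return False  # no unreadable tags anywhere
    return max(counts_per_lane) > ratio_threshold * mean
```

Because it aggregates across lanes, this relation naturally executes at the toll plaza or toll management center (TBP/TMC) rather than in a single lane.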
HRM - Patient Harm

Patient Harm methods: (1) administer wrong drug; (2) administer too much of a drug; (3) administer an allergic drug; (4) administer improper diet; (5) order needless drugs; (6) perform needless procedure.

 1. Drug vs. Drug (rule): certain drugs cannot be taken in conjunction with other drugs. Methods: (1)
 2. Drug vs. Allergy (rule): certain drugs cannot be taken when a person has certain allergies. Methods: (1), (3)
 3. Drug vs. Sex (rule): certain drugs cannot be taken by one sex or the other. Methods: (1)
 4. Drug vs. Weight (rule): certain drug prescriptions are based on the patient's weight. Methods: (1), (2)
 5. Drug vs. Diet (rule): certain drugs cannot be taken while consuming certain foods. Methods: (1), (4)
 6. Drug vs. Lethal Dosage (rule): drug dosage should not exceed the lethal dosage. Methods: (1), (2)
 7. Drug vs. Time (rule): drugs have a minimum time between doses (such as 4 hours). Methods: (1), (2)
 8. Drug vs. Historical (dosage) (stat): drug dosage should be fairly similar to other prescriptions of the drug in either dosage amount or frequency. Methods: (2), (5)
12. Procedure vs. Diet (rule): some procedures may have a special dietary preparation requirement. Methods: (4)
18. Language vs. Language (rule): anything outside of the restricted language is not allowed. Methods: (5)
24. Patient Test Results vs. Test Results (Historical) (stat): test results should be related to previous test results for that patient. Methods: (1), (2), (3), (4)
25. Test Results vs. Test Results (Historical) (stat): test results should be related to previous test results across all patients. Methods: (1), (2), (3), (4)