Understanding Risk – Performance Trade-off
at Point of Entry Systems
Bojan Cukic, West Virginia University
DHS University Summit, March 2009 ©
Risk function

Systems Approach: Port of Entry

[Diagram: traveler flow from Border Access through the Traveler Queues to the Inspection Stations (with biometrics), with optional movement to Secondary Inspection / Detainment. Required and optional signals connect the stations to the Watch Lists / Identity DB and the Public Key Directory. Design questions annotated on the diagram:
– Public Key Directory: local, distributed, or central?
– Watch Lists / Identity DB: modality, quality, scalability, update, access?
– Traveler: acceptance, modality, quality?
– Inspection stations: modality, FMR, vulnerability, exceptions, throughput?
– False Non-Match Rate: inconvenience acceptance?
– False Match Rate.]
Traveler Arrivals
• Arrival information from December 2007 from one of the terminals at Dulles International Airport.

[Chart: hourly average arrival rate (average passengers/second/booth, 0-0.016) over the course of the day.]
An Airport Inspection System

[Diagram: the traveler waits in a traveler queue before the inspection facility and presents an MRTD card to the POE officer. The station comprises an MRTD reader, a fingerprint reader, and a digital camera, and queries the TNS (name lookup), the PKD, and the TBS (biometric lookup).]
Layered Queueing Network Model
Performance Analysis: An Example
• Complex system requirements and design trade-offs: point-of-entry applications, digital passports.
• How to optimally organize access to national public keys.
• Acronyms
  – ICAO: International Civil Aviation Organization
  – MRTD: Machine Readable Travel Document
  – PKD: Public Key Directory; CA: Certificate Authority
• Goals
  – Identify possible architectural designs for implementation of the PKI subsystem at points of entry.
  – Suggest the "best" solution based on performance and security modeling early in the development lifecycle.
Point of Entry App

[Diagram: message flow (steps 1-11) among the MRTD holder/traveler, the MRTD, the MRTD reader, the biometric device, the hosting PC, the border inspection officer, the border inspection system, the network, and the ICAO MRTD PKD.]
Architectural Differences
• One Key Distribution Access Point
  – The simplest distribution scheme: a single centralized copy of the PKD.
  – Network delay is a function of the networking infrastructure and the CA PKD request response time.
• Localized PKDs
  – A "middle-ground" architecture: a local copy of the PKD at each port of entry (POE).
  – Network delay is greatly reduced, but decisions must be made on when and how to update from the CA PKD.
• Border Inspection Site Replicated PKD
  – The most involved PKD distribution scheme for participating countries.
  – Complex design decisions regarding update/synchronization schemes, times, and frequencies.
  – In theory, this scheme eliminates network traffic delays (except for the updates).
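The delay trade-off among the three schemes can be sketched with a simple M/M/1 approximation. All rates and delays below are illustrative assumptions (the study itself uses a layered queueing network model, not these numbers):

```python
# Sketch: mean PKD lookup response time under the three distribution schemes,
# modeled as M/M/1 queues. Every parameter value here is an assumption.

def mm1_response_time(arrival_rate, service_rate):
    """Mean response time W = 1 / (mu - lambda) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        return float("inf")  # unstable: the queue grows without bound
    return 1.0 / (service_rate - arrival_rate)

per_booth_arrivals = 0.014   # travelers/second/booth (peak from the Dulles data)
booths_per_airport = 10      # assumed
airports = 80                # airports sharing one central PKD (assumed)
pkd_service_rate = 50.0      # assumed PKD lookups served per second
network_delay = {"central": 0.20, "local": 0.01, "replicated": 0.0}  # s, assumed

# Option 1: one central PKD serves every booth at every airport.
lam_central = per_booth_arrivals * booths_per_airport * airports
t_central = network_delay["central"] + mm1_response_time(lam_central, pkd_service_rate)

# Option 2: one local PKD copy per airport.
lam_local = per_booth_arrivals * booths_per_airport
t_local = network_delay["local"] + mm1_response_time(lam_local, pkd_service_rate)

# Option 3: PKD replicated at each inspection station: no shared contention.
t_replicated = network_delay["replicated"] + mm1_response_time(per_booth_arrivals, pkd_service_rate)

for name, t in [("central", t_central), ("local", t_local), ("replicated", t_replicated)]:
    print(f"{name:10s} mean PKD lookup response: {t * 1000:.1f} ms")
```

Under these assumptions the ordering matches the slide's qualitative argument: the centralized scheme pays both network delay and contention, the local copies mostly remove the network delay, and full replication removes the shared queue entirely.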
Performance Results: Primary Inspection Time

[Bar chart: inspection time (0-70 s) for each architectural option (MRTD only, Dedicated PKD, Shared PKD with 1, 40, 80, and 160 airports), showing both automated and complete inspection times.]
Performance Results: Response Time and Resource Utilization
Validation
• Against a discrete-event simulation model of the Los Angeles International Airport.*
• The model includes approximately 400 modules.
• Simulation results for baseline models have been validated.
• To collect performance measures, the simulation model is run for a 24-hour period.
• To estimate the mean and variance of the wait time, 10 simulation runs are made.

* T. Edmunds, P. Sholl, Y. Yao, J. Gansemer, E. G. Norton. Simulation Analysis of Inspections of International Travelers at Los Angeles International Airport for US-VISIT. Lawrence Livermore National Laboratory, Livermore, CA, 2004.
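The mean-and-variance estimate from the 10 replications can be sketched as follows. The per-run wait times are hypothetical; only the estimation procedure mirrors the validation setup:

```python
# Sketch: estimating the mean wait time and a 95% confidence interval
# from n independent simulation runs, as done with the 10 LAX replications.
# The per-run values below are hypothetical, not the study's data.
import math

runs = [41.2, 39.8, 44.1, 40.5, 42.9, 38.7, 43.3, 41.0, 39.9, 42.2]  # min, hypothetical

n = len(runs)
mean = sum(runs) / n
var = sum((x - mean) ** 2 for x in runs) / (n - 1)  # sample variance
half_width = 2.262 * math.sqrt(var / n)             # t(0.975, df=9) = 2.262

print(f"mean wait: {mean:.2f} min +/- {half_width:.2f} (95% CI)")
```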
Performance Results: Validation (2)

[Chart: average total waiting time (0-120 min) vs. number of travelers (100-1900) for Option 1, Option 2, and Option 3 with 1, 40, 80, and 160 airports.]
Performance experiments: Watch list size

[Charts: (a) average secondary inspection time (0-60 min) for watch lists of 1,000 to 100,000,000 suspects; (b) total waiting time (0-100 min) vs. number of travelers (100-1700) for the same watch-list sizes.]
Performance experiments: Biometric system match rates
• Biometric False Match Rates create an increased workload at the secondary inspection point.
[Charts: (a) total waiting time (0-100 min) vs. number of travelers (0-2000) for FMR = 0.01, 0.001, and 0.0001; (b) total average processing time (waiting time plus inspection time, 0-100 min) across impostor prior probabilities 0-0.1 and FMR values, at an impostor prior of 0.01; labeled values of 556 min and unbounded time mark saturated configurations.]
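The saturation effect of a high FMR can be illustrated with a minimal M/M/1 sketch of the secondary inspection queue. All arrival and service rates below are assumed for illustration; the deck's actual results come from the full layered queueing network model:

```python
# Sketch: how the biometric False Match Rate (FMR) inflates the workload at
# secondary inspection. Every parameter value here is an assumption.

def mm1_wait(arrival_rate, service_rate):
    """Mean waiting time in queue for M/M/1: Wq = rho / (mu - lambda)."""
    if arrival_rate >= service_rate:
        return float("inf")
    rho = arrival_rate / service_rate
    return rho / (service_rate - arrival_rate)

traveler_rate = 0.5          # travelers/second arriving at primary (assumed)
impostor_prior = 0.0001      # fraction of travelers who are true hits (assumed)
secondary_rate = 1.0 / 300   # secondary inspections served/second, 5 min each (assumed)

for fmr in (0.0001, 0.001, 0.01):
    # Genuine travelers falsely matched, plus true hits, are sent to secondary.
    lam2 = traveler_rate * ((1 - impostor_prior) * fmr + impostor_prior)
    wait = mm1_wait(lam2, secondary_rate)
    label = f"{wait / 60:.1f} min" if wait != float("inf") else "unbounded"
    print(f"FMR={fmr:<7} secondary arrivals={lam2:.5f}/s  mean wait: {label}")
```

Under these assumed rates, FMR = 0.01 already pushes the secondary queue past its service capacity, mirroring the unbounded waiting times in the experiment.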
Performance experiments: Match rates & watch lists

[3-D chart: total service time (0-100 min) as a joint function of FMR (0.0001, 0.001, 0.01) and watch-list size (1,000 to 100,000,000); labeled values include 137, 180, and 1343 min.]
Risk function

Systems Approach: Port of Entry (after Cukic et al.)

[Diagram: the same port-of-entry system view as before (border access, traveler queues, inspection stations with biometrics, secondary inspection / detainment, watch lists / identity DB, and the public key directory), annotated with the same design questions: PKD placement; database modality, quality, scalability, update, and access; traveler acceptance; station modality, FMR, vulnerability, exceptions, and throughput; False Match Rate and False Non-Match Rate.]
Cost Curve Modeling for Biometric PoE Inspection
• A methodology for adapting the biometric system set-up based on the expected cost of misclassification.
• C(+|-) denotes the cost of incorrectly classifying a genuine user as an impostor: secondary inspection.
• C(-|+) denotes the cost of misclassifying an impostor as a genuine user: security breach.
• p(+): probability of a user being an impostor; p(-): probability of a user being genuine.
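With these definitions, the expected misclassification cost and a cost-minimizing operating point can be sketched as follows. The ROC operating points are hypothetical; the impostor prior and cost ratio values are taken from the experiments in this deck:

```python
# Sketch of the expected misclassification cost behind the cost-curve analysis:
#   E[Cost] = C(+|-) * p(-) * FNMR + C(-|+) * p(+) * FMR
# A falsely rejected genuine user goes to secondary inspection; a falsely
# accepted impostor is a security breach. Operating points are hypothetical.

def expected_cost(fmr, fnmr, p_impostor, cost_miss_impostor, cost_reject_genuine):
    p_genuine = 1.0 - p_impostor
    return (cost_reject_genuine * p_genuine * fnmr
            + cost_miss_impostor * p_impostor * fmr)

# Hypothetical (FMR, FNMR) pairs along one matcher's ROC curve.
operating_points = [(0.1, 0.001), (0.01, 0.01), (0.001, 0.05), (0.0001, 0.2)]

p_plus = 0.0001                    # impostor prior (as in the experiments)
mu = 1 / 100                       # cost ratio C(+|-):C(-|+)
c_breach, c_secondary = 1.0, mu    # normalize C(-|+) = 1

best = min(operating_points,
           key=lambda op: expected_cost(op[0], op[1], p_plus, c_breach, c_secondary))
print("cost-minimizing operating point (FMR, FNMR):", best)
```

With impostors this rare and secondary inspection only 1/100 the cost of a breach, the minimum lands on the operating point with the lowest FNMR, which is the threshold-adjustment behavior the cost curves capture.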
Face Recognition in Border Inspections
• Face Recognition Vendor Test (FRVT) 2006.
• Test which algorithm is better when:
  – the impostor arrival rate varies between 0.01 and 0.0001;
  – the misclassification cost ratio μ = C(+|-) : C(-|+) varies between 0.1 and 0.0001, i.e., misclassifying an impostor is 10 to 10,000 times more "expensive" than misclassifying a genuine user.

Face recognition cost curves

[Charts: cost curves for P(+) = 0.01 / P(-) = 0.99, P(+) = 0.001 / P(-) = 0.999, and P(+) = 0.0001 / P(-) = 0.9999, for μ = 1E-1, 1E-2, 1E-3, and 1E-4.]
Fingerprint matching algorithms (FpVTE 2003)
Fingerprint – Cost curve
FMR, Risks, Performance (P(+) = 0.0001, µ = 1/100)

Face recognition:
PC(+)   Norm(E[Cost])   FMR          FNMR     Total Waiting (min)
0.001   0.3227          0.00152      0.322    infinite
0.1     0.0314          0.00152      0.322    infinite
0.5     0.1235          0.175        0.073    205.5807

Fingerprint recognition:
PC(+)   Norm(E[Cost])   FMR          FNMR     Total Waiting (min)
0.001   0.00689         0.00005376   0.0059   25.5008
0.1     0.0004          0.0001834    0.0031   25.06776
0.5     0.0013          0.001276     0.0013   24.79358
Summary
• "Rapid" screening cannot be considered a goal by itself.
  – It is related to security risk, system design, data set size, etc.
• Points of entry need to adapt to the operational environment.
  – Cost curves demonstrate a strategy for threshold adjustment in deployed biometric systems.
  – Very few parameters are needed: the "arrival rate" of impostors and the misclassification cost ratio.
  – Such a design minimizes the overall risk.
• Current work
  – Incorporating multimodal biometrics.
  – Deriving system design rules in light of privacy parameters.