
CISC 370 - Class Today
R. Smith - University of St Thomas - Minnesota

• A few things back
  – Ch 12-13 problems back Wednesday

• A look at network security
  – Security process
  – Authentication
  – Network cryptography
  – Firewalls


Making security decisions

• Do you always lock:
  – A car door
  – A room door
  – A house door

• If not always, what decides otherwise?


Decision Making Strategies

• Rule based
  – I'm told that's what we do, and I follow that rule (passwords)

• Relativistic
  – My friend does it, so I do, too.
  – My neighbor has a fence and locks his front door. Me, too.
  – We all use super-strong Kryptonite bike locks
• "Security Theater", the hunter's dilemma
• MAD - deterrence

• Rational
  – We look at the risks and choose security measures accordingly
  – If an incident occurs, it should prove cheaper than the long-term cost of protecting against it
  – Reassess risks as part of the "life cycle" of the asset


Decision making in a life cycle

• Identify your practical goals
  – What "real" things do you want to accomplish?
  – What risks interfere with them?

• Choose the security that fits
  – What weaknesses exist?
  – What security measures might work?
  – What are the trade-offs against goals?

• Measure success
  – Monitor for attacks or other failures
  – Recover from problems
  – Reassess goals and trade-offs


The Security Process

1. Identify your assets
   • What assets and capabilities do you require?

2. Analyze the risks of attack
   • What can happen to damage your assets?
   • What is the likelihood of damage?

3. Establish your security policy
   • Trade off the risks, the cost of damage, and the cost of protection
   • Identify the protections you intend to use

4. Implement your defenses

5. Monitor your defenses

6. Recover from attacks


Threats & Vulnerabilities

[Diagram: a threat exploits a vulnerability to reach the asset; a defense, safeguard, or "countermeasure" stands in the way.]

An attempt to steal or harm the asset is an attack.


Simple risk analysis: your PC

• Threats?
  – Who, why?

• Vulnerabilities?
  – What bad things can happen?
  – What allows the badness to happen?

• Can we just lock it up?
  – Put it in a room
  – Put a lock on the door
  – Don't share the key

• Does this work?


Extreme Workstation Security

Does this achieve our goals?

The Network Security Problem

• Protection is usually local
• Network data travels to remote locations


Risk: Eavesdropping

• An established social tradition ("party lines")


Risk: Forgery

• Who really sent the message?


Risk: Replay

• If a message worked once, why not again, and again?


How do we fix this?

• Again, it depends on policy
  – What are we really trying to achieve ("the big picture")?
  – What are the real risks to that big picture?

• Practical networking choices
  – Should/must the users control the defenses?
    • Can/should they choose what gets protected?
  – Can we isolate the users in a safe but restrictive "bubble"?
    • If not, what access do they need to the 'outside'?
  – What external, secure connections do we need?
    • Are they ad-hoc, or can we anticipate them?

• Risk assessment
  – Which threats matter: eavesdropping, forgery, replay?


Deciding on Protection

• Policy: what protections we need
  – If possible, identify defensive perimeters
  – Identify other defenses to reduce the impact of risks
  – Balance against how we use the asset
  – Balance against the cost of protection

Security Technologies

• Authentication
• Encryption – applied to protocols
• Firewalls (did those)


Authentication

• Associates some computing behavior with a particular user name

• Not Identification – a separate problem
  – We don't figure out who the user "really" is
  – We don't find other names for a user

• Not Access Control – a separate problem
  – We don't figure out what the user should be able to do
  – Some systems grant broad access permissions to authenticated users, but that's not an authentication issue


Just bits on a wire…

Cover art from Authentication: From Passwords to Public Keys by Richard E. Smith © 2002, Addison Wesley.

Illustration by Peter Steiner, The Cartoon Bank. Used by permission.


Authentication "Factors"

• Something you know: a password or PIN ("My PIN is ...")
• Something you have: a key or token
• Something you are: a personal trait

Traditional parallel terms: something you know, are, have


The Password Tradition

• 1963: passwords implemented as a computer-oriented substitute for student lockers at MIT
  – Easy to implement and operate
  – Originators never thought of it as a strong mechanism

Passwords capture an essential truth:

The best way to authenticate someone is by proving ownership of a personalized secret

(We'll call it the base secret here)


The Password Tradition

• Passwords: the essence of computer authentication:

Verifies the ownership of a personal secret (the base secret)

• Attack: steal the password file from the hard drive

From Authentication © 2002. Used by permission


Password Hashing

• Can't retrieve passwords by stealing the file
  – Can only replace a forgotten password

• Use a cryptographic one-way hash function

From Authentication © 2002. Used by permission
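A minimal sketch of the hashing idea on this slide, using only the Python standard library; the salt, iteration count, and function names are illustrative choices, not the exact scheme from the book.

```python
import hashlib, hmac, os

def hash_password(password, salt=None):
    """Return (salt, hash); only these are stored, never the password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, stored_hash):
    # Recompute the one-way hash and compare; the stored file never reveals the password.
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored_hash)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```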


Password Ping-Pong

Each defense invites a new attack, and each attack is answered by the next defense:

Defense                   Attack that answers it
Passwords                 Steal the password file
Password hashing          Guessing
Guess detection           Social engineering
Help desk restrictions    Keystroke sniffing
Memory protection         Password sharing
Password tokens           Network sniffing
One-time passwords        ??


Guessing Attacks

• Interactive attacks
  – Classic examples: searches for written passwords, PIN guessing
  – PIN guessing is limited to trial-and-error attempts to use a server
    • Limited to the server's speed, and failures can be detected

• Off-line attacks
  – Classic example: the "dictionary attack"
  – Make a list of likely passwords, intercept information about users' passwords, and match the list against the intercepted information
  – Most powerful attack - fast and hard to detect


Attack: Off-line with Dictionary

From Authentication © 2002. Used by permission
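To make the off-line attack concrete, here is a toy dictionary attack against a stolen file of salted hashes, continuing the hypothetical hash_password scheme sketched earlier; the file format and word list are invented for illustration.

```python
import hashlib

def hash_password(word, salt, iterations=200_000):
    return hashlib.pbkdf2_hmac("sha256", word.encode(), salt, iterations)

def dictionary_attack(stolen_entries, word_list):
    """stolen_entries: iterable of (username, salt, stored_hash) tuples."""
    cracked = {}
    for username, salt, stored_hash in stolen_entries:
        for word in word_list:                      # try every "likely" password
            if hash_password(word, salt) == stored_hash:
                cracked[username] = word            # fast, silent, off-line
                break
    return cracked
```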


Computing the Average Attack Space

V = S / (2L)

• S = number of likely permutations of the base secret
• L = fraction of the population that uses one of the likely base secrets
• V = number of trial-and-error attempts needed to achieve a 50% chance of selecting the actual base secret
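The formula translates directly into code; the numbers in the example are hypothetical, chosen only to show how the two inputs interact.

```python
def average_attack_space(likely_secrets, fraction_using_them):
    """V = S / (2L): trials needed for a 50% chance of success."""
    return likely_secrets / (2 * fraction_using_them)

# Hypothetical: 100,000 "likely" dictionary words, chosen by 25% of users.
print(average_attack_space(100_000, 0.25))   # 200000.0 trial-and-error attempts
```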


Password Strength

• Resisting a brute-force guessing attack
  – Estimate the average number of attempts required to succeed
  – The attacker makes a list of possible passwords and tries them all
    • # of passwords in the list
    • % chance that someone chooses from that list
    • How many attempts to achieve a 50-50 chance of success
  – Call this the Average Attack Space

• An ideal example
  – People choose completely random strings of characters
  – 96 characters to choose from, 8 characters long
  – 96^8 presents an average attack space of about 2^45


Guessable Passwords

Incident                         Year    % Guessed
Internet Worm                    1988    ~50%
Klein's study                    1990    24.2%
Spafford's study                 1992    20%
CERT Incident IN-1998-03         1998    25.6%
Cambridge study by Yan, et al.   2000    35%


Passwords in Practice

• The Rule for Strong Passwords:

The password must be impossible to memorize and never written down

• The result of this rule
  – Look under some mouse pads and find ---

From Authentication © 2002. Used by permission


Strength in Practice

Example                        Type of Attack    Average Attack Space
Random 8-character password    Interactive       2^45
Dictionary attack              Off-line          2^15 to 2^23
Mouse pad search               Interactive       2^1 to 2^4
Practical off-line attacks     Off-line          2^40 to 2^63


Sharing Trumps Password Strength

"A secret is something you tell people one at a time."  - Anonymous

• Passwords are easy for people to share
• There is no way to tell if a password has been shared (unless something bad happens)
• The only solution is to use something else
  – Biometrics or hardware authentication tokens


Sniffing Trumps Password Strength

From Authentication © 2002. Used by permission


Sniffing and Passwords

• Passwords are safest when only used internally
  – E.g., passwords never leave the physical premises
  – Potential attackers already have physical access to the computers

• Passwords are too often unprotected on networks
  – Internet protocols: HTTP, FTP, Telnet, POP, IMAP
  – Even when protection is possible, ISPs don't always support it

• Solution: use encryption techniques
  – Challenge-response passwords
  – Encryption
    • Uses strong, separate encryption keys to prevent sniffing
  – Token-based one-time passwords (discussed later)


Password Summary

• Passwords make poor base secrets
  – Too hard to control, too easy to share or lose

• A password is an office cabinet lock
  – Best in internal environments with low threat
  – OK when the external environment is restricted
    • Network traffic protected with separately keyed encryption
    • Trial-and-error guessing can be detected on the server

• Passwords are tricky and unreliable
  – If you make them hard to handle, you open yourself to social engineering attacks


Authentication Tokens

• There are also "soft" tokens
  – Most tokens are also provided in software implementations
  – "Public key" products often handle the private key with software-only mechanisms

From Authentication © 2002. Used by permission


Tokens: Things you have

Benefits
• Hard to attack - uses a stronger secret than you get in a typical password
• Hard to forge - must often hack the hardware
• Hard to share - usually stored in hardware

Problems
• Expensive - must buy hardware and/or special authentication software
• Can be lost or stolen
• Risk of hardware failure


Hardware Tokens

• Resist copying and other attacks by storing the base secret in a tamper-resistant package

From Authentication © 2002. Used by permission


One-Time Password Tokens

The attacker can't reuse the sniffed password

From Authentication © 2002. Used by permission
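A sketch of how a counter-based one-time password could be generated, in the spirit of HOTP: token and server share a base secret and a counter, so each sniffed value is worthless afterwards. The truncation step is simplified and the secret is illustrative.

```python
import hashlib, hmac

def one_time_password(base_secret, counter, digits=6):
    mac = hmac.new(base_secret, counter.to_bytes(8, "big"), hashlib.sha1).digest()
    return int.from_bytes(mac[-4:], "big") % (10 ** digits)   # simplified truncation

secret = b"base secret stored in the token and on the server"
print(one_time_password(secret, 1))   # password for login #1
print(one_time_password(secret, 2))   # different password for login #2;
                                      # replaying #1 no longer works
```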


Challenge Response

• Earliest product (SafeWord) adapted from a copy protection scheme in a 1980s video game

• Interactive challenge response
  – First implemented in calculator-sized tokens
    • SafeWord, WatchWord, SNK, ActivCard, ANSI X9.9
  – Software implementation: S/Key for Unix

• Embedded challenge response
  – The client performs the challenge-response processing automatically
  – Microsoft LANMAN, NTLM, MS-CHAP in Windows
  – Kerberos provides a similar capability


Interactive Challenge

• Requires a calculator (hardware or software)
• The base secret is embedded in the calculator
• Authenticates the owner of the base secret

From Authentication © 2002. Used by permission
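A sketch of the interactive exchange: the server issues a fresh random challenge, the token (the "calculator") returns a keyed hash of it, and the base secret itself never crosses the wire. The names and the HMAC choice are assumptions for illustration.

```python
import hashlib, hmac, secrets

BASE_SECRET = b"embedded in the token and known to the server"

def server_issue_challenge():
    return secrets.token_hex(8)                 # random nonce, never reused

def token_response(challenge):
    return hmac.new(BASE_SECRET, challenge.encode(), hashlib.sha256).hexdigest()

def server_verify(challenge, response):
    expected = hmac.new(BASE_SECRET, challenge.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = server_issue_challenge()
response = token_response(challenge)            # the user transcribes this
print(server_verify(challenge, response))       # True; a replayed response fails
                                                # because the next challenge differs
```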


Embedded Challenge

• The login client handles the challenge automatically (DEC, Novell)
• The password is the base secret

From Authentication © 2002. Used by permission


Blocking the Sniffers

• Interactive challenges: strong but hard to use
  – Can store a large base secret in the token
  – But the user must transcribe the challenge and the response
  – Rarely used today

• Embedded challenges: popular but weaker
  – Handle the challenge-response computation automatically
  – But… attackers can perform off-line dictionary attacks

• Encryption protocols are even better
  – SSL for Web, FTP, e-mail; IPSEC for everything else
  – Uses a separate, strong secret to block password sniffing
  – Prevents off-line attacks, forces detectable interactive attacks


Tokens Resist Attacks

Example                       Type of Attack    Average Attack Space
Password                      Off-line          2^15 to 2^23
One-time password token       Interactive       2^19 to 2^23
One-time password token       Off-line          2^54 to 2^63
Smart card with public key    Off-line          2^63 to 2^116


Biometrics: Things you are

Also hand, voice, face, eyes

From Authentication © 2002. Used by permission

Biometrics don't work on networks

• Once the bits leave the biometric reader, we can't tell if they're legitimate or not
  – Maybe they were replayed from an earlier visit
  – Maybe they were constructed from a stolen image

• The biometric pattern acts like a base secret
  – But Cathy's biometrics are not base secrets
    • Cathy leaves artifacts of her voice, fingerprints, and appearance wherever she goes
    • Cathy can't change them if someone makes a copy
  – Also, Cathy's privacy is jeopardized if the biometric signatures and patterns must be handled by many systems and devices


Network Encryption

• We get different results by putting cryptography in different places in the protocol architecture


[Diagram: the protocol stack - Application, Device Driver, TCP/UDP layer, IP layer, Link layer.]


The Encryption Process

• Convert plaintext to ciphertext with a key
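A minimal illustration of "plaintext + key → ciphertext" using authenticated symmetric encryption from the third-party pyca/cryptography package (Fernet); any symmetric cipher would make the same point.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()                      # the shared secret key
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"attack at dawn")   # unreadable without the key
print(cipher.decrypt(ciphertext))                # b'attack at dawn'
```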


Cryptanalysis

• Known ciphertext attack
  – a.k.a. ciphertext-only attack - the classic attack
  – Newspaper cryptograms
  – You have ciphertext, no plaintext

• Known plaintext attack
  – You have some plaintext for some intercepted ciphertext
  – The attack used against ENIGMA to reduce the problem

Security and the Protocol Stack

Classic layer-oriented examples of crypto protocols:

• Application: PGP
  – encrypts application data

• Transport->Application: SSL
  – encrypts the connection

• IP layer: IPSEC
  – encrypts routable packets

• Link layer: WEP/WPA
  – encrypts LAN packets


[Diagram: the protocol stack with crypto at each layer - PGP at the application, SSL above the TCP/UDP layer, IPSEC at the IP layer, WEP/WPA at the link layer.]

How Crypto works in the stack

• "Above" a crypto layer
  – Data is assumed to be in plaintext form

• "At" a crypto layer
  – We convert between plaintext and ciphertext
  – We have access to some keys
  – We generate some plaintext headers
  – Some header info may be encrypted or otherwise protected

• "Below" the crypto layer
  – New network headers are added in plaintext


How it works Geographically

• Application layer encryption
  – "End-to-end security" - routable, and inaccessible to others
  – Defeats intermediate virus scans and intrusion detection
  – Applied at the discretion of the end user (usually)

• Socket layer encryption
  – Application-to-application security - similar to the application layer
  – Often applied automatically under control of the server
  – Sometimes it is a user-level option

• IPSEC - IP Security Protocols
  – Internet layer security - protects routable packets, per packet
  – Protects all Internet application traffic equally
  – Often a substitute for inter-site leased lines
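Socket-layer encryption as described above, sketched with Python's standard ssl module: the application writes ordinary bytes and the TLS layer encrypts the connection. Here example.com is just a placeholder host.

```python
import socket, ssl

context = ssl.create_default_context()      # verifies the server's certificate

with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls_sock.recv(200))           # the response arrives already decrypted
```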


Diagramming the Crypto

• Elements
  – Protocol stack elements
  – Where the crypto goes
  – What is encrypted
  – What is plaintext

IP Security Protocol – IPSEC

• Security protection that's IP routable
• We authenticate the IP addresses
• We encrypt everything inside the IP packet (everything after the IP header)


Separate Headers

• AH - Authentication Header
  – Keeps the packet intact

• ESP - Encapsulating Security Payload
  – A 'generic' security format, originally just for encryption
  – Now does both encryption and authentication


Authentication Header - 'AH'

• Protects the unchanging bits of the IP header
• "SPI" - Security Parameter Index
  – Identifies the keying and hash algorithm to use


Encapsulating Security Payload - ESP

ESP packet format (8-bit bytes):
  SPI
  Sequence Number
  Payload Data (variable)
  Padding (variable)
  Pad Length | Next Header
  Integrity Check (variable)


• Modern style, including integrity protection
  – The internal format still depends on the crypto used
  – The SPI picks the crypto format; the format determines the variables

• Main problem: how long is the integrity check?
• May be length = 0, especially if the crypto does it already
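A toy construction of the fixed ESP fields listed above (the RFC 4303 layout); for clarity the payload is left unencrypted and the integrity check is just passed in, so this shows only the packet framing, not real IPSEC.

```python
import struct

def build_esp(spi, seq, payload, next_header, icv=b""):
    # Payload + padding + the 2 trailer bytes must end on a 4-byte boundary.
    pad_len = (4 - (len(payload) + 2) % 4) % 4
    padding = bytes(range(1, pad_len + 1))            # default 1,2,3,... padding
    header = struct.pack("!II", spi, seq)             # SPI, sequence number
    trailer = struct.pack("!BB", pad_len, next_header)
    return header + payload + padding + trailer + icv

packet = build_esp(spi=0x1001, seq=1, payload=b"inner IP datagram", next_header=4)
print(packet.hex())
```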


Secret Key Management

• Two elements
  – How do you assign individual keys?
  – How do you update keys?

• Assignment - how many keys do we need?
  – "One big cryptonet"
  – Pairwise user-user
  – Pairwise user-server ("key distribution center")

• Updating - given the assignment strategies
  – Manual
  – Automatic
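Simple counting (not from the slides) shows why the assignment choice matters so much as the cryptonet grows:

```python
def key_counts(n_users):
    return {
        "one big cryptonet": 1,                       # everyone shares one key
        "pairwise user-user": n_users * (n_users - 1) // 2,
        "pairwise user-server (KDC)": n_users,        # one personal key per user
    }

print(key_counts(100))   # pairwise needs 4950 keys; a KDC needs only 100
```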


Automatic key updating

• How do we get the new key?
  – Internal update
    • Use a 'pseudo-random number generator'
    • "Forward secrecy" problem
  – Random update
    • Use a new, randomly generated key
    • Share it with the cryptonet

• How do we transmit random keys?
  – Chained update
    • Send it using the existing crypto key
    • "Forward secrecy" problem
  – KEK-based update
    • Use a separate "key encrypting key"
    • Data is only sent with "data keys" or "session keys"
    • Only use the KEK to send newly generated session keys


Key Distribution Center (KDC)

• Each user has a unique personal key
  – Contacts the KDC to get a session key
  – The KDC sends keys encrypted with the users' personal keys

• Example
  – Bob wants to talk to Alice
  – Bob contacts the KDC, says "I want to talk to Alice"
  – The KDC sends two copies of the session key
    • One encrypted with Bob's personal key
    • One encrypted with Alice's personal key

• This is the basis of Kerberos
  – Encrypted keys are called "tickets"
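A toy version of the exchange, with Fernet (from the third-party pyca/cryptography package) standing in for the real crypto; the names and message format are invented for illustration.

```python
from cryptography.fernet import Fernet

personal_keys = {"bob": Fernet.generate_key(), "alice": Fernet.generate_key()}

def kdc_issue_session_key(requester, peer):
    session_key = Fernet.generate_key()
    for_requester = Fernet(personal_keys[requester]).encrypt(session_key)
    for_peer = Fernet(personal_keys[peer]).encrypt(session_key)   # the "ticket"
    return for_requester, for_peer

# Bob asks to talk to Alice; each side recovers the same session key with its own
# personal key, and the session key never travels in the clear.
for_bob, for_alice = kdc_issue_session_key("bob", "alice")
assert (Fernet(personal_keys["bob"]).decrypt(for_bob)
        == Fernet(personal_keys["alice"]).decrypt(for_alice))
```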


Public Key Encryption

• Uses a pair of keys: the Private Key and the Public Key
• Usually, one key of the pair decrypts what the other key encrypts, and vice versa
• "Asymmetric Encryption"

[Diagram: clear text → encryption procedure (public key) → cipher text → decryption procedure (private key) → clear text.]

Public Key Protocols/Applications

• IPSEC: used for key exchange
  – "Diffie-Hellman" public key technique
  – Produces temporary public/private keys
  – Uses the security to set up IPSEC security associations (SPIs)

• SSL: protects Web, FTP, e-mail, shell (SSH), ...
  – Usually the RSA public key technique
  – Uses a web server's public key to set up a shared secret
  – Uses regular crypto to protect the actual data transfers

• PGP, PEM, S/MIME: protect files and e-mail
  – Usually the RSA public key technique
  – Encrypt a file with regular (symmetric) crypto
  – Encrypt the key with the recipients' public keys
  – "Sign" the message with the author's private key
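The PGP/S-MIME pattern in the last bullets, sketched with the third-party pyca/cryptography package: a fresh symmetric key for the data, the recipient's RSA public key for that key, and the sender's private key for the signature. Key sizes and padding choices here are typical defaults, not prescriptive.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
message = b"the actual file or e-mail body"

# 1. Encrypt the data with a fresh symmetric key.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(message)

# 2. Encrypt the symmetric key with the recipient's public key.
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient_key.public_key().encrypt(data_key, oaep)

# 3. "Sign" the message with the sender's private key.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = sender_key.sign(message, pss, hashes.SHA256())

# Recipient: unwrap the key, decrypt the data, verify the signature.
recovered_key = recipient_key.decrypt(wrapped_key, oaep)
assert Fernet(recovered_key).decrypt(ciphertext) == message
sender_key.public_key().verify(signature, message, pss, hashes.SHA256())  # raises if forged
```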


Firewall objectives

• Provide outbound Internet access
• Restrict/forbid inbound connections
• Detect and block malicious traffic


Types of firewall traffic control

• Service control (allow specific protocols)
  – Block unauthorized protocols
  – Permit authorized ones
  – Actually very hard to do

• Direction control (in/out)
  – Allow outbound browsing
  – Restrict access to internal servers

• User control (source/destination)
  – User authorization, or perhaps subnet filtering

• Behavior control
  – Bandwidth, application-specific cases
  – Look in e-mail for malware
  – Filter access to Web sites (China, Saudi, …)
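A toy rule check in the spirit of service and direction control; the rule format and fields are invented for illustration, and real firewalls of course work on parsed packet headers rather than tuples.

```python
RULES = [
    # (direction, protocol, dest_port, action)
    ("outbound", "tcp", 80,   "allow"),   # outbound web browsing
    ("outbound", "tcp", 443,  "allow"),
    ("inbound",  "tcp", 25,   "allow"),   # inbound mail to our SMTP server
    ("inbound",  "any", None, "deny"),    # default: forbid inbound connections
    ("outbound", "any", None, "deny"),
]

def filter_packet(direction, protocol, dest_port):
    for rule_dir, rule_proto, rule_port, action in RULES:
        if (rule_dir == direction
                and rule_proto in (protocol, "any")
                and rule_port in (dest_port, None)):
            return action                  # first matching rule wins
    return "deny"                          # nothing matched: fail closed

print(filter_packet("outbound", "tcp", 443))   # allow
print(filter_packet("inbound",  "tcp", 22))    # deny
```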


Types of Firewall Filtering

• Packet Filtering: based on the packet header (unsophisticated) - examines the IP header of each packet
• Circuit Filtering: restricts connections (common) - examines the TCP header plus connection state
• Application Proxy: restricts based on general policy (refined) - examines the application header and user data, plus connection state and application state


Firewalls in Different Strengths

• Packet Filter: control based on source/destination Internet addresses (operates at the IP and link layers)
• Circuit Gateway: hides internal network details (operates up through the TCP/UDP layer)
• Application Gateway: control based on application type and content (operates up through the application layer)

[Diagram: each firewall type shown as a protocol stack sitting between the internal network and the INTERNET.]


Proxies . . . for the Application Gateway

Proxies are small (less than 2000 lines of code), "minimal and modular"


Proxies . . . for the Application Gateway

[Diagram: the client's requests go to the M. A. Proxy, which forwards them to the server; the server's application output comes back through the proxy to the client, and the proxy maintains logs of the exchange.]


Internet Firewall: Application Level Gateway

[Diagram: a gateway host with two Ethernet cards sits between the private network and the public network, behind a router; it runs per-application proxies (nntp, http, smtp, telnet, ftp, X11, snmp, rlogin) and keeps audit logs.]


Issues with using Firewalls

• All firewalls are NOT created equal
  – Type and rigor of controls
  – OS security

• Correct configuration is critical for any firewall
  – Many attacks exploit insecure default configurations

• Firewalls, even when functioning correctly, open BIG holes in the security perimeter
  – World-Wide Web (HTTP)
  – Active content (Java, JavaScript, ActiveX)

Summary

• Lots of technology
• An ongoing arms race between attackers and defenders
• Always something new


Creative Commons License

This work is licensed under the Creative Commons Attribution-Share Alike 3.0 United States License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/us/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.