
Page 1: Trust Evidence for IoT - EEMA

Trust Evidence for IoT: Trust Establishment from Servers to Sensors

David E. Ott, PhD

Research Director, University Research Office

Intel Labs

Co-authors: Claire Vishik, David Grawrock, Anand Rajan

Page 2:

Cross-domain Trust

Tomorrow’s internet:
• Billions of devices, users, connections
• 2015: 15B devices, 3.4B users*
• Complex web of client devices and services
• Massive forum for data exchange between systems

Need for cross-domain trust:
• Can you trust the system or device you are connected to?
• What threat posture should you assume for a given context?

*Source: Cisco VNI: Forecast and Methodology, 2011-2016.

Page 3:

Authentication

• Verify the identity of the user and/or system (e.g., id/pw)
• Much-studied, increasingly robust methods
• Encryption-based, certificates, TPM-based, biometrics, multifactor, etc.

Yet authentication most often fails to capture the dynamic operating condition of the system:
• integrity of system and application software
• correct function of hardware components
• rogue processes subverting the runtime environment
• user misuse of system or application software

Page 4:

Can we augment authentication with additional information?

Trust Evidence

Trust Evidence (def): Canonical set of information and parameters for demonstrating trustworthiness during interactions with other components or systems in the computing environment.

Might include:
• Various aspects of the system or device as it operates
• Software and system configuration
• Integrity of software running on the system
• Characteristics of software execution during interactions

Page 5:

Trust Evidence Seedling Research

Goal: Define an approach to cross-domain trust evidence that is broadly applicable and can be implemented across ecosystem devices or services.

Research Summary

1. Trust Evidence for Data
PI: Dawn Song, UC Berkeley
HI-CFG: Hybrid Information- and Control-Flow Graph. Integrates representations of code and the data it works on. Data includes information on how it was computed.

2. Remotely Detecting Compromises in Distributed Applications
PI: Mike Reiter, UNC Chapel Hill
Techniques allowing a server to verify the behavior of a client. The server uses symbolic execution to identify when client behavior cannot be verified.

3. Trust Evidence: Closing the Loop Between Organizations, Humans, Computation, and Hardware
PIs: Sean Smith and Sergey Bratus, Dartmouth College
Explores the gap between human security goals and the technical implementation of trust (a.k.a. the language of trust).

4. Trust Evidence for Connected Devices in Consumer Spaces
PI: Yoshi Kohno, University of Washington
Trustworthiness in various classes of consumer devices. Techniques for evaluating trust without hardware or software modification.

Page 6:

References


• Dan Caselden, Alex Bazhanyuk, Mathias Payer, Stephen McCamant, and Dawn Song. Hi-CFG: Construction by binary analysis and application to attack polymorphism. Proceedings of ESORICS, page(s): 164-181. September 2013.

• T. Denning, T. Kohno, and H. Levy. Computer Security in the Modern Home. Communications of the ACM, 56(1), January 2013.

• L. Bauer, Y. Liang, M. K. Reiter and C. Spensky. Discovering access-control misconfigurations: New approaches and evaluation methodologies. Proceedings of the 2nd ACM Conference on Data and Application Security and Privacy. February 2012.

• S. Bratus, M. Locasto, B. Otto, R. Shapiro, S. W. Smith, G. Weaver. Beyond SELinux: the Case for Behavior-Based Policy and Trust Languages. Computer Science Technical Report TR2011-701. Dartmouth College. August 2011.

Page 7:

TE from Software Predictability

Baseline (def) : Reference information on expected control-flow behavior for software executing on a system or device.

Intuition:
• Despite software complexity, critical execution sequences often follow predictable patterns
• E.g., key user functions, standard protocol sequences

Can we capture cases where software is predictable and use them to build evidence of trustworthy behavior in software at runtime?

Can program instrumentation or other approaches to capturing programmer intent help?

Page 8:

Baselining Compared

Anomaly Detection vs. Baselining

Anomaly Detection:
• E.g., intrusion detection
• Build statistical model of expected network behavior
• Use model to identify suspicious flows or endpoints

Baselining:
• Build statistical model of expected software behavior
• Measure software’s degree of conformance (e.g., in the software runtime environment)
• Provide execution profile to a remote system, which evaluates it against well-known baselines

Control-Flow Integrity vs. Baselining

Control-Flow Integrity:
• Build control-flow graph (CFG) through source-code analysis, binary analysis, or execution profiling
• Model allowable execution paths
• Enforce at runtime

Baselining:
• Coarse-grained approach appears more practical
• Emphasis on degree of conformance as indicator of trustworthiness (“guard rails”)
• Interested in generating trust evidence for use in cross-domain communication sessions
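As a toy illustration of the coarse-grained baselining idea above (not any project's actual implementation), the sketch below treats a baseline as the set of call-sequence bigrams observed during trusted profiling runs and scores a runtime trace by its share of matching bigrams. All event names are hypothetical.

```python
# Hypothetical sketch: baselining via call-sequence bigrams.
# A baseline is built from trusted profiling runs; conformance is the
# fraction of a runtime trace's bigrams that appear in the baseline.

def bigrams(trace):
    """Adjacent pairs of events in an execution trace."""
    return set(zip(trace, trace[1:]))

def build_baseline(profiling_runs):
    """Union of bigrams observed across trusted profiling runs."""
    baseline = set()
    for run in profiling_runs:
        baseline |= bigrams(run)
    return baseline

def conformance(trace, baseline):
    """Degree of conformance in [0, 1]: share of bigrams found in baseline."""
    observed = bigrams(trace)
    if not observed:
        return 1.0
    return len(observed & baseline) / len(observed)

profiling = [["open", "read", "parse", "close"],
             ["open", "read", "read", "parse", "close"]]
baseline = build_baseline(profiling)

print(conformance(["open", "read", "parse", "close"], baseline))    # -> 1.0
print(conformance(["open", "exec", "connect", "close"], baseline))  # -> 0.0
```

A conformance score below some threshold would then feed into the trust evidence reported to the remote evaluator, rather than triggering hard enforcement.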

Page 9:

Trust Evidence Research (Part 2)

Goal: Explore innovative formulations of baselining and demonstrate their effectiveness. Must include:
• a methodology for generating baseline information,
• a runtime framework for measuring program behavior with respect to baselines, and
• a proposal for defining and generating trust evidence using the framework.

Research Summary

1. A Two-tier Architecture To Create Trust Evidence
PIs: Frank Piessens and Ingrid Verbauwhede, KU Leuven

2. Trust Evidence from Programmer’s Intent: Its Generation and Enforcement Via Policy-based Risk Management
PI: Michael Huth, Imperial College London

3. Runtime Program Behavior Monitoring Combining Control and Data Flow Tracking
PIs: Angelos D. Keromytis and Michalis Polychronakis, Columbia

4. Capturing and Enforcing Intent Semantics for Trust Evidence
PIs: Sergey Bratus and Sean Smith, Dartmouth

Page 10:

Two-tiered Architecture to Create Trust Evidence

Research Outcomes:

Protected Software Module Architecture ("Tier 1"), explored with three TCB strategies:
• Fides: hypervisor (VMM) implementation
• Sancus: HW-level implementation using TI MSP430
• Salus: kernel-level implementation

Secure Compiler
Goal: Compile to the protected module architecture

Trust Assessment Modules ("Tier 2")
Goal: Trust assessment modules run in a protected way and perform trust measurements on the system:
• Measure memory corruption
• Malware detection
• Measure sanity of the call stack
• For protected modules: secure interrupts, state continuity, developer support

TCB = Trusted Computing Base

PIs: Frank Piessens, Ingrid Verbauwhede

Page 11:

Sancus
Isolation through PC-based access control within hardware.

Page 12:

Sancus Key Management
• Keys are only computed after enabling isolation
• Keys are only usable through special HW instructions

Page 13:

Sancus Key Management
Strictly symmetric keys, for performance reasons.

3 Types of Keys:
– Node master keys K_N: shared between IP and N
– Infrastructure Provider keys K_N,SP: shared between IP, SP, and N
– Software Module keys K_N,SP,SM: shared between IP, SP, and SM on N

Nodes are initialized with their master key at production.

All other keys are derived by means of key derivation functions:
– K_N,SP = kdf(K_N, SP)
– K_N,SP,SM = kdf(K_N,SP, SM)

System Model:

Software providers (SPs) deploy software modules (SMs) on a network of low-end nodes (Ns) managed by an infrastructure provider (IP).
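The derivation chain can be sketched as follows. HMAC-SHA256 stands in for the KDF here purely for illustration (the actual Sancus hardware uses a lightweight MAC primitive), and the key and identifier values are made up.

```python
# Sketch of the Sancus key-derivation chain, with HMAC-SHA256 standing in
# for the symmetric KDF (assumption: the real design uses a lightweight HW MAC).
import hmac
import hashlib

def kdf(key: bytes, data: bytes) -> bytes:
    """Key derivation: derive a child key from a parent key and an identity."""
    return hmac.new(key, data, hashlib.sha256).digest()

K_N = b"node-master-key-set-at-production"   # K_N: shared between IP and node N

sp_id = b"software-provider-1"               # identity of a software provider SP
K_N_SP = kdf(K_N, sp_id)                     # K_N,SP = kdf(K_N, SP)

sm_identity = b"hash-of-software-module"     # identity/measurement of module SM
K_N_SP_SM = kdf(K_N_SP, sm_identity)         # K_N,SP,SM = kdf(K_N,SP, SM)

# The IP (holding K_N) can re-derive every key; an SP only needs K_N,SP to
# derive its module keys; the node computes K_N,SP,SM internally and exposes
# it only through the special HW instructions.
print(K_N_SP_SM.hex())
```

Because derivation is deterministic, no per-module keys ever need to be transmitted: each party that legitimately holds a parent key can recompute the child keys it is entitled to.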

Page 14:

Reflections on the Approach

Explores the theme: What if software could run in a hardware-protected memory space?

Strong Features:
• Zero-software TCB, i.e., works without trusting any infrastructure software on the device
• Supports two ways of collecting trust evidence:
  • High-assurance remote attestation of protected modules
  • Lower-assurance but more pragmatic collection of trust evidence by means of trust assessment modules

Further Work:
• Programmer tools and methodologies for using protected modules within software architectures
• How could this scheme be translated to Intel Software Guard Extensions (SGX)?

TCB = Trusted Computing Base

Page 15:

Implications for IoT

IoT context represents a continuum of devices:
• Ensembles of low-powered sensors and compute devices
• Gateway nodes that aggregate data and communicate
• Server systems residing in the cloud

Flexible application of PMA:
• Hardware implementation, e.g., lightweight devices (easier to customize, low-power requirements)
• Kernel-based OS implementation
• Hypervisor implementation, e.g., cloud server (hard to customize, virtualization features)

PMA as a unifying framework across the device spectrum?
• Standard set of trust assessment modules for IoT devices communicating with a gateway.
• Gateway:
  – gathers Trust Evidence about device configuration, state, and operational characteristics
  – performs risk assessment before including device data in its aggregation processes.

Page 16:

KU Leuven References


• Raoul Strackx, Frank Piessens. Fides: Selectively hardening software application components against kernel-level or process-level malware. ACM CCS 2012.

• Pieter Agten, Raoul Strackx, Bart Jacobs, Frank Piessens, Secure compilation to modern processors. IEEE CSF 2012.

• Job Noorman, Pieter Agten, Wilfried Daniels, Raoul Strackx, Anthony Van Herrewege, Christophe Huygens, Bart Preneel, Ingrid Verbauwhede, Frank Piessens. Sancus: Low-cost trustworthy extensible networked devices with a zero-software Trusted Computing Base. Usenix Security 2013.

• Niels Avonds, Raoul Strackx, Pieter Agten and Frank Piessens. Salus: Non-Hierarchical Memory Access Rights to Enforce the Principle of Least Privilege, SECURECOMM 2013.

• Marco Patrignani, Dave Clarke, Frank Piessens, Secure compilation of Object-Oriented components to protected module architectures, APLAS 2013.

• Ruan de Clercq, Frank Piessens, Dries Schellekens, and Ingrid Verbauwhede, Secure Interrupts on Low-End Microcontrollers, IEEE ASAP 2014.

• Raoul Strackx, Bart Jacobs, Frank Piessens. ICE: A Passive, High-Speed, State-Continuity Scheme, ACSAC 2014.

• P. Agten, B. Jacobs, F. Piessens. Sound modular verification of C code executing in an unverified context. Proceedings of the 42nd ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL 2015). January 2015

• M. Patrignani, P. Agten, R. Strackx, B. Jacobs, D. Clarke, F. Piessens. Secure compilation to protected module architectures. ACM Transactions on Programming Languages and Systems, volume 37, issue 2, pages 6:1-6:50, April 2015.

Page 17:

Trust Evidence from Programmer Intent

Research Outcomes:

• Program annotation language: a scheme that enables programmers to include numerical trust assertions in their code (@expect, @policy, @switch). Goal: leverage programmer intent.

• PEAL: Pluggable Evidence Aggregation Language, for aggregating and evaluating numerical TE inputs based on thresholds and policies.

• PEALT: a PEAL tool that takes TE input (policies, policy sets, conditions, domain-specifics, and analyses) and generates input for the Z3 SMT solver, which decides satisfiability.

PI: Michael Huth

Page 18:

@Expect Blocks

@expect[ex1,max] default 0.1 {   // declare optimistic trust evidence
  (calledBy foo0) hasTE 0.9;
  (calledBy foo1) hasTE 0.3;
  (calledBy foo2) hasTE 0.8;
}

@expect[ex2,+] default 0 {       // declare accumulative trust evidence
  (isSanitized(x)) hasTE 0.7;
  (calledBy foo2) hasTE 0.2;
}

localTrust = min(@ex1, @ex2);

Layer 1: generate numerical trust evidence

Aggregation examples:
• addition, min, max, avg, weighted avg
• App-specific formulas

Placement examples:
• Program state checks
• Input parsing
• Self-validation routines
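A rough Python model of the @expect semantics, as one plausible reading rather than the project's implementation: each block maps conditions to TE values, aggregates the matching values with its operator, and falls back to its default when nothing matches.

```python
# Hypothetical model of @expect blocks: conditions are plain strings here,
# and "facts" is the set of conditions that hold at the check point.

def expect_block(rules, default, op):
    """Build an evaluator for one @expect block (assumed semantics)."""
    def evaluate(facts):
        matched = [te for cond, te in rules if cond in facts]
        if not matched:
            return default            # no rule fired: use the block default
        if op == "max":               # optimistic aggregation
            return max(matched)
        if op == "+":                 # accumulative aggregation
            return sum(matched)
        raise ValueError(f"unknown aggregation operator: {op}")
    return evaluate

ex1 = expect_block([("calledBy foo0", 0.9),
                    ("calledBy foo1", 0.3),
                    ("calledBy foo2", 0.8)], default=0.1, op="max")
ex2 = expect_block([("isSanitized(x)", 0.7),
                    ("calledBy foo2", 0.2)], default=0.0, op="+")

facts = {"calledBy foo2", "isSanitized(x)"}
local_trust = min(ex1(facts), ex2(facts))   # localTrust = min(@ex1, @ex2)
print(local_trust)                          # -> 0.8
```

With these facts, ex1 yields 0.8 (max of the single match) and ex2 yields 0.7 + 0.2 = 0.9, so localTrust is 0.8.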

Page 19:

@Policy Blocks

@Policy[p1,join] {                  // p1 policy name, composition is ‘join’
  deny if (localTrust < 0.2);       // localTrust as computed before
  grant if certifiedTrustPath();
}                                   // declared rules composed with ‘join’

@Policy[p2,priority] {
  deny if !safeProvenance(@caller);
  grant if (sameDomain(@caller));
}                                   // composition is “first rule with definite decision wins”

decision = @p1 priority @p2;        // compose decisions of policy blocks

Layer 2: policy-based decisions
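The composition operators can be approximated in Python as below. This is a hedged reading of the semantics ('priority' takes the first definite decision; 'join' is modeled conservatively as deny-overrides), not the formal PEAL definition, and the context fields are invented.

```python
# Hypothetical model of @Policy composition. Rules yield 'grant', 'deny',
# or None (no definite decision).

def priority(decisions):
    """First definite decision wins."""
    for d in decisions:
        if d is not None:
            return d
    return None

def join(decisions):
    """Conservative conflict resolution (assumption): any deny wins."""
    definite = [d for d in decisions if d is not None]
    if "deny" in definite:
        return "deny"
    if "grant" in definite:
        return "grant"
    return None

def p1(ctx):  # @Policy[p1,join]
    rules = ["deny" if ctx["localTrust"] < 0.2 else None,
             "grant" if ctx["certifiedTrustPath"] else None]
    return join(rules)

def p2(ctx):  # @Policy[p2,priority]
    rules = ["deny" if not ctx["safeProvenance"] else None,
             "grant" if ctx["sameDomain"] else None]
    return priority(rules)

ctx = {"localTrust": 0.8, "certifiedTrustPath": True,
       "safeProvenance": True, "sameDomain": False}
decision = priority([p1(ctx), p2(ctx)])  # decision = @p1 priority @p2
print(decision)                          # -> grant
```

Here p1 grants (trust is above threshold and a certified trust path exists), so the priority composition never needs to consult p2.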

Page 20:

@Switch Blocks

@switch {
  deny         : (-1, deny);
  inconsistent : (-1, incon);
  _            : ((eval <body>), _);
}

Usage:

let (v,p) = foo(4,13,myPolicy) in {
  if ((p != deny) && (p != incon)) {
    commit(v);
  } else {
    ....
  }
}

Layer 3: program-level policy enforcement

Page 21:

Reflections on the Approach

Explores the theme: What if software could have numerical TE data embedded in it?

Strong Features:
• Innovative approach to generating trust evidence
• Quantitative approach supports interesting aggregation functions
• Policy-based approach explores “guard railing” notion of execution

Further Work:
• Methodology and tools for generating trust evidence values
• Runtime support for numerical trust evidence values
• Practical case studies demonstrating semantics
• Compiler support

Page 22:

Implications for IoT

Illustrative scenario: dynamic discovery of trustworthy devices
• Gateway requires a new IoT device to present Trust Evidence report scores based on a standardized application built into the operating system or installed at the factory.
• The device application has been instrumented with various system checks and runs as a protected module.
• The results of the assessment are obtained and signed by the underlying runtime system in a way that protects them from software tampering.
• The gateway uses the assessment, along with other authentication mechanisms, to decide whether the device is trustworthy.

Illustrative scenario: dynamic risk evaluation
• An instrumented application running on your mobile phone provides periodic updates in the form of numeric values assessing the outcome of various application or system configuration checks.
• The lack of such Trust Evidence on a public terminal while traveling results in a very different threat posture for the same user.

Page 23:

ICL References


• M. Huth, J.H.P. Kuo, A. Sasse, et al. Towards usable generation and enforcement of Trust Evidence from programmers' intent. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol: 8030 LNCS, Pages: 246-255. 2013.

• M. Huth, J.H.P. Kuo. Towards verifiable trust management for software execution (extended abstract). Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol: 7904 LNCS, Pages: 275-276. 2013.

• M. Huth, J.H.P. Kuo. PEALT: An Automated Reasoning Tool for Numerical Aggregation of Trust Evidence. Tools and Algorithms for the Construction and Analysis of Systems (TACAS). 2014.

• M. Huth, J.H.P. Kuo. On designing usable policy languages for declarative trust aggregation. Lecture Notes in Computer Science. Pages: 45-56, ISSN: 0302-9743. 2014.

Page 24:

Data Flow Tracking

Data (Information) Flow Tracking: Track data as it is propagated during program execution; understand vulnerabilities and leaks.

ShadowReplica: An efficient technique for accelerating DFT to make its performance cost feasible.

• Approach:
  – Extract code blocks and control-flow information
  – Construct partial control-flow graph (CFG), including performance profile
  – Generate optimized code stubs that are injected into the application binary
  – During execution, a DFT thread receives data from the code stubs and analyzes it

• Application to Trust Evidence for IoT:
  – DFT-based certification or profiling application, focused on trustworthiness of data
  – PMA could be applied
  – DFT analysis could be done on the IoT device or on the gateway

K. Jee, V. P. Kemerlis, A. D. Keromytis, G. Portokalidis. ShadowReplica: Efficient Parallelization of Dynamic Data Flow Tracking. ACM Conference on Computer and Communications Security (CCS). November 2013.

PI: Angelos Keromytis
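To make the underlying technique concrete, here is a minimal, self-contained illustration of data-flow (taint) tracking. It is unrelated to ShadowReplica's binary-level machinery, and all names are hypothetical: a shadow tag follows each value, propagates through computation, and lets a sink check the provenance of the data reaching it.

```python
# Toy data-flow (taint) tracking: tags travel with values so that sinks
# can reason about where data came from.

class Tainted:
    def __init__(self, value, tags=frozenset()):
        self.value = value
        self.tags = frozenset(tags)

    def __add__(self, other):
        # Taint propagation: the result carries the union of input tags.
        o_val = other.value if isinstance(other, Tainted) else other
        o_tags = other.tags if isinstance(other, Tainted) else frozenset()
        return Tainted(self.value + o_val, self.tags | o_tags)

def source(value, label):
    """Mark untrusted input with a taint label."""
    return Tainted(value, {label})

def sink(x, forbidden):
    """Refuse to act on data derived from a forbidden source."""
    if isinstance(x, Tainted) and x.tags & forbidden:
        raise ValueError(f"tainted data reached sink: {sorted(x.tags)}")
    return x.value if isinstance(x, Tainted) else x

net_input = source(10, "network")
result = net_input + 5             # taint propagates through the addition
print(result.value, result.tags)   # 15, still tagged with 'network'

try:
    sink(result, forbidden={"network"})
except ValueError as e:
    print("blocked:", e)           # provenance check fails
```

In the IoT setting sketched above, the tags would feed a trust-evidence report about data provenance rather than blocking execution outright.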

Page 25:

Summary

Trust Evidence is needed to augment authentication in cross-domain interaction between systems and devices.

IoT will create an even greater need for Trust Evidence:
• Broad spectrum of new platform and service types
• Distributed architectures for data exchange

Candidate approaches to Trust Evidence:
• Protected module architecture
• Programming-language extensions
• Data-flow tracking

Future Work / Challenges:
• Communication protocols describing how TE is exchanged
• Standards to describe the content and format of TE
• Software and hardware support for collecting TE

Page 26:

Thank you!

Page 27:

Page 28:

Legal Notices and Disclaimers

Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Learn more at intel.com, or from the OEM or retailer.

No computer system can be absolutely secure.

Tests document performance of components on a particular test, in specific systems. Differences in hardware, software, or configuration will affect actual performance. Consult other sources of information to evaluate performance as you consider your purchase. For more complete information about performance and benchmark results, visit http://www.intel.com/performance.

Cost reduction scenarios described are intended as examples of how a given Intel-based product, in the specified circumstances and configurations, may affect future costs and provide cost savings. Circumstances will vary. Intel does not guarantee any costs or cost reduction.

This document contains information on products, services and/or processes in development. All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest forecast, schedule, specifications and roadmaps.

Statements in this document that refer to Intel’s plans and expectations for the quarter, the year, and the future, are forward-looking statements that involve a number of risks and uncertainties. A detailed discussion of the factors that could affect Intel’s results and plans is included in Intel’s SEC filings, including the annual report on Form 10-K.

The products described may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.

No license (express or implied, by estoppel or otherwise) to any intellectual property rights is granted by this document.

Intel does not control or audit third-party benchmark data or the web sites referenced in this document. You should visit the referenced web site and confirm whether referenced data are accurate.

Intel and the Intel logo are trademarks of Intel Corporation in the United States and other countries.

*Other names and brands may be claimed as the property of others.

© 2015 Intel Corporation.