
1

Utilizing a Forensic Methodology For Data Integrity

IVT’s Data Integrity Validation Conference
August 17, 2017

2

Agenda

• Introductions & About Us
• Intro to Data Forensics
• Forensic Methodology for DI
• Interactive Break-Out Session

3

Armin Torres

About Armin

Mr. Torres is a Principal at Qualified Data Systems Inc. QDS is a global information technology consultancy focused on the needs of the Life Science industry.

Mr. Torres has over twenty-five years of international experience in Engineering, Quality Management, and Software Quality, with an emphasis on Information Systems within regulated environments. Armin is an Electrical and Computer Engineer with extensive experience in software development, software engineering, and software quality/reliability.

Prior to joining QDS, Mr. Torres was a Manager at BearingPoint (formerly KPMG Consulting), where he led consulting teams on Life Science projects at Fortune 500 companies. Mr. Torres has worked extensively with international clients in the Asian, Pacific Rim, European Union, and Latin American markets.

4

Javier Dominguez

About Javier

Javier Dominguez is a Sr. Consultant with Qualified Data Systems. QDS is a global information technology consultancy focused on the needs of the Life Science industry.

Mr. Dominguez has been in the FDA-regulated industry for over 10 years, working with Medical Device and Pharmaceutical companies, where he has led major quality and compliance efforts through project engagements. Primary responsibilities have included implementing cybersecurity programs for Medical Device manufacturers and developing digital data integrity forensics projects for Life Science customers. Additional experience includes software engineering, systems engineering and integration, network security, system design, requirements and risk analysis, and software/process validation projects.

5

What is Data Forensics? And what does it have to do with Data Integrity?

6

Data Forensics

Data forensics is the application of science to the identification, collection, examination, and analysis of data while preserving the integrity of the information and maintaining a strict chain of custody for the data.

Traditionally used when solving a crime, it also has varied utility in Computer Science, Information Technology, Data Engineering, and Quality Management within corporate enterprise environments.

7

Data Forensics

• The reality is that it’s really hard to do!

• Often requires varied backgrounds and skill sets to perform

• Requires an array of both HW/SW tools for data collection, examination, and analysis

• Requires lots of patience, strict adherence to protocols, and objectivity when reporting results

• May be costly and time-consuming, depending upon the objectives and outcomes desired

• In the end, it does not guarantee success

8

What Can We Use Forensics For?

• Internal Policy Violations (i.e., Compliance Assessment)

• Reconstructing Computer Security Incidents (Cybersecurity)

• Troubleshooting Operational Problems (e.g., Equipment, Process, and/or Data Integrity Issues)

Practically every organization needs to have the capability to perform digital forensics at some level.

9

How Does Forensics Apply to Us? How can we apply these techniques to Data Integrity?

10

Use Case Scenario

• Pharmaceutical Company

• FDA is conducting an audit and decides to focus on the laboratories

• While reviewing the audit trail, the auditor discovers the deletion of data

• Product was released, but accompanying records cannot be found

• Business does not have answers

• A 483 is written

• Business must respond within 15 days

11

Forensic Methodology for Data Integrity

Collection – Data Collection of Digital Records

Examination – Assessing and Extracting Information

Analysis – Analytics to Support Conclusions

Reporting – Preparing and Presenting Results

Based upon NIST SP 800-86 (Guide to Integrating Forensic Techniques into Incident Response)

12

Collection – Identify Possible Data Sources

• Application
  • Audit Trail
  • Event Logs
  • Directory Output

• Operating System
  • Event Viewer Logs
  • Directory Audit Logs
  • Temporary Files

• Network Logs

• Disaster Recovery
  • Backup Logs
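Collecting file-level sources often begins with a simple inventory of what exists. The following Python sketch (an illustration, not a validated forensic tool — the throwaway directory and file name stand in for a real evidence source) records the path, size, modification time, and SHA-256 digest of every file under a directory:

```python
import hashlib
import os
import tempfile

def build_manifest(root):
    """Walk a directory tree and record path, size, mtime, and SHA-256
    for every file: a minimal collection inventory."""
    manifest = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            stat = os.stat(path)
            manifest.append({
                "path": path,
                "size": stat.st_size,
                "mtime": stat.st_mtime,
                "sha256": digest,
            })
    return manifest

# Demo against a throwaway directory (hypothetical content)
with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "audit.log"), "wb") as f:
        f.write(b"2017-08-17 10:00:00 sample entry\n")
    entries = build_manifest(tmp)
    print(len(entries), entries[0]["sha256"][:8])
```

Recording the digests at collection time makes later tampering or copy errors detectable.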

13

Collection – Example of a Forensic Checklist

Item: Data Files
Description: Files of the following data types (media) shall be collected, examined, and analyzed as required.
• Documents
• Images
• Videos
• Application (HPLC and GC software)
• Backup Tapes
• Log Files (Event, Error, Audit Trail records)
• Metadata

Item: OS Files (Non-Volatile)
Description:
• Deleted Data
• File Slack Space
• Free Space
• NTFS Data Streams
• Log Files (System Events, Audit, Application Events, Command History, Recently Accessed Files)
• Configuration
• Security
• Jobs
• Application (Configuration, Options, Scripts/Automation, Authentication)
• Swap
• Dump
• Temporary

14

Collection – Example of a Forensic Checklist

Item: Computerized Systems
• Change Records
• Validation Records
• Backup/Restore Services
• Patches/Updates

Item: Network
• Configuration
• Connections
• Running Processes
• Open Files
• Login Sessions
• OS Time
• Local User & Group Security Policies
• Network Shares
• Log Files
• TCP/IP Traffic (if applicable) for the Application, Transport, Network, and Data Link Layers

Item: QC Lab Instrumentation
• Commissioning Records
• Service Records
• Validation Records

Item: Site Policies/Procedures
• QC Analytical Lab
• Information Technology
• Supply Chain

15

Collection – Acquiring the Data

• Develop a data integrity plan to acquire the data
  • Likely value
  • Volatility
  • Amount of effort required

• How will you acquire the data?
  • What tools?
  • What process?

• Verify the integrity of the data
  • Legal case
  • Internal disciplinary proceedings
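Verifying the integrity of acquired data is commonly done by hashing the original before acquisition and the copy afterward, and confirming the digests match. A minimal Python sketch, with a local file copy standing in for the actual imaging step:

```python
import hashlib
import os
import shutil
import tempfile

def sha256_file(path, chunk_size=1 << 20):
    """Hash a file in chunks so arbitrarily large images can be processed."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

tmp = tempfile.mkdtemp()
source = os.path.join(tmp, "original.img")   # hypothetical source media
copy = os.path.join(tmp, "acquired.img")     # hypothetical acquired image
with open(source, "wb") as f:
    f.write(b"raw instrument data" * 1000)

before = sha256_file(source)   # digest taken before acquisition
shutil.copyfile(source, copy)  # stands in for the imaging/backup step
after = sha256_file(copy)      # digest of the acquired copy

print("match:", before == after)
```

A mismatch between the two digests means the acquisition altered the data and the copy cannot be relied upon as evidence.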

16

Data File Integrity

Write-Blockers

• A write-blocker is a hardware- or software-based tool that prevents a computer from writing to computer storage media connected to it

• Write-blockers shall be used to ensure that the backup or imaging process does not alter data on the original media

ALWAYS MAINTAIN THE INTEGRITY OF THE MEDIA/DATA

17

Collection – Incident Response Considerations

• System Isolation
  • Prevent further damage to the system
  • Preserve evidence
  • Limit access

• Environment Impact
  • Is this system critical in a production environment?
  • Is this the only system which can perform its job?
  • Scheduled downtime

18

Examination

Assessing and extracting the relevant pieces of information from the collected data

• Tools, tools, tools… they are your friend!
  • Text and pattern searches, log parsers, SIEM

• Look for other related incidents

• Filter relevant data; exclude the erroneous
  • Hard drives may contain hundreds of thousands of data files

• Recover deleted files
  • Slack space
  • Free space
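To make the "text and pattern searches" point concrete, here is a small Python sketch that filters a log down to the destructive events and extracts the fields needed for the Analysis phase. The log lines, field names, and record IDs are invented for illustration:

```python
import re

# Hypothetical audit-trail extract
log_lines = [
    "2017-08-17 09:58:12 INFO  user=jdoe action=login",
    "2017-08-17 10:02:45 WARN  user=jdoe action=modify record=LOT-1041",
    "2017-08-17 10:03:01 ERROR user=jdoe action=delete record=LOT-1041",
    "2017-08-17 10:15:33 INFO  user=asmith action=login",
]

# Pattern search: keep only destructive events (modify/delete) and
# capture the fields we care about.
pattern = re.compile(
    r"(?P<ts>\S+ \S+) \w+\s+user=(?P<user>\S+) action=(?P<action>modify|delete)"
    r" record=(?P<record>\S+)"
)

hits = [m.groupdict() for line in log_lines if (m := pattern.search(line))]
for hit in hits:
    print(hit["ts"], hit["user"], hit["action"], hit["record"])
```

On real cases the same approach scales up through dedicated log parsers or a SIEM; the principle — filter down to the relevant entries, discard the noise — is the same.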

19

Analysis

Study and analyze the data to draw conclusions

• Correlate data among multiple sources

• The foundation of forensics is using a methodical approach to reach appropriate conclusions based on the available data, or to determine that no conclusion can yet be drawn.
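One common correlation technique is merging events from independent sources into a single timeline and flagging cross-source events that occur close together. A toy Python sketch (the two event lists are hypothetical):

```python
from datetime import datetime

# Hypothetical extracts from two independent sources
audit_trail = [
    ("2017-08-17 10:03:01", "audit", "record LOT-1041 deleted by jdoe"),
]
os_events = [
    ("2017-08-17 10:02:58", "os", "USB storage device attached"),
    ("2017-08-17 11:30:00", "os", "scheduled backup completed"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

# Merge both sources into one chronological timeline.
timeline = sorted(audit_trail + os_events, key=lambda e: parse(e[0]))

# Flag pairs of events from different sources within a 60-second window.
correlated = [
    (a, b)
    for a in timeline
    for b in timeline
    if a[1] != b[1] and 0 < (parse(b[0]) - parse(a[0])).total_seconds() <= 60
]
for a, b in correlated:
    print(a[2], "->", b[2])
```

A correlation like this does not prove causation; it only surfaces candidate relationships that the investigator must then confirm or rule out, consistent with the caution above about conclusions.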

20

Reporting

Prepare and present the information resulting from the Analysis phase

• Alternative Explanations
  • When the information regarding an event is incomplete, it may not be possible to arrive at a definitive explanation of what happened. Don’t draw conclusions too early.

• Audience Consideration
  • System administrators, senior management, and the FDA are each looking for different data. Tailor your report to your audience.

• Actionable Information
  • Address not only the incident at hand, but keep in mind actions to prevent future incidents.

21

Reporting

• A report should seek to extend its value by providing useful recommendations gathered during the investigation.

• Organizations should be proactive in collecting useful data
  • Configuring auditing on OSs
  • Implementing centralized logging
  • Performing regular system backups
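As one hypothetical illustration of "collecting useful data proactively," an application can emit audit events in a structured, machine-parseable form so a central log collector can index them. The sketch below uses Python's standard logging module, with an in-memory buffer standing in for the central sink:

```python
import io
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render every audit record as JSON: who, what, when, at what level."""
    def format(self, record):
        return json.dumps({
            "time": self.formatTime(record),
            "level": record.levelname,
            "user": getattr(record, "user", "unknown"),
            "event": record.getMessage(),
        })

buffer = io.StringIO()              # stands in for a central log collector
handler = logging.StreamHandler(buffer)
handler.setFormatter(JsonFormatter())
audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)
audit.addHandler(handler)

# Hypothetical lab events
audit.info("sample injected", extra={"user": "jdoe"})
audit.info("result approved", extra={"user": "asmith"})

records = [json.loads(line) for line in buffer.getvalue().splitlines()]
print(records[0]["user"], records[0]["event"])
```

In production the StreamHandler would be replaced by a handler that ships records off-host (e.g. syslog), so the evidence survives even if the originating system is compromised.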

22

Data Integrity Maturity Level Characterization

ISPE GAMP Records and Data Integrity

23

Questions?

24

Breakout Session

Interactive

25

Audit Trail Review

During the Examination phase of your project, you are analyzing the audit trail of a system to try to determine the sequence of events which took place. You realize the audit trail is littered with Data Integrity anomalies.

• Break out into teams

• Identify all possible Data Integrity anomalies
  • 6 possible types of anomalies, multiple cases of each
  • Must identify why each event is an anomaly

• Winning team will receive a prize
  • BINGO style… first team to yell “YAHTZEE” wins
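For a flavor of what automated anomaly screening can look like (this is a toy illustration with invented data, not the exercise's answer key), a script can scan audit-trail rows for two simple red flags — deletions and out-of-sequence timestamps:

```python
from datetime import datetime

# Hypothetical audit-trail rows: (timestamp, user, action, record_id)
rows = [
    ("2017-08-17 09:00:00", "jdoe",  "create", "LOT-1041"),
    ("2017-08-17 09:05:00", "jdoe",  "modify", "LOT-1041"),
    ("2017-08-17 08:59:00", "jdoe",  "modify", "LOT-1041"),  # out of sequence
    ("2017-08-17 09:10:00", "admin", "delete", "LOT-1041"),  # deletion
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

anomalies = []
prev = None
for ts, user, action, rec in rows:
    if prev is not None and parse(ts) < prev:
        anomalies.append((ts, "timestamp earlier than previous entry"))
    if action == "delete":
        anomalies.append((ts, "record deleted: " + rec))
    prev = parse(ts)

for ts, reason in anomalies:
    print(ts, reason)
```

Each flag still needs a human-stated reason why it is an anomaly, exactly as the exercise requires.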

26

+1 305 444 1212

2100 Ponce de Leon, Suite 1070
Coral Gables, FL 33134

www.qualifiedsystems.com

[email protected]

Qualified Data Systems