
A Principal Component Analysis (PCA) Decomposition Based Validation Metric for Use With Full Field Measurement Situations

Randall Allemang, PhD
Director, Structural Dynamics Research Lab; Senior Collaborator, USAF AFRL/SSC
College of Engineering and Applied Science, University of Cincinnati, Cincinnati, OH 45221-0072

Michael Spottswood, PhD and Thomas Eason, PhD
Senior Aerospace Structures Engineers, USAF AFRL/RQHF
Structural Sciences Center, Air Vehicles Directorate, Air Force Research Laboratory, Wright-Patterson AFB, OH 45385

ABSTRACT

A validation metric that involves principal component analysis (PCA) decomposition of simulation and test data is developed for potential use in the quantification of margin and uncertainty (QMU) for an upcoming USAF design exercise (OEM Aero-Thermo-Structures Design Study - USAF Task Order 0015: “Predictive Capability for Hypersonic Structural Response and Life Prediction”). This validation metric allows for the use of nearly full-field simulation and test data over a wide range of spatial realizations (3-D responses over multiple input conditions) and temporal (time or frequency) information, as needed. A demonstration example utilizing two datasets explains how the validation metric is formed and how it can be used to quantify the margin between the simulation and the test data as well as the associated uncertainty. The primary advantage of the proposed PCA validation metric is that it preserves the engineering units (EU) of the original data so that the quantifications of margin and uncertainty can be made in EU. A second advantage of the PCA validation metric is that it can be used over a wide range of temporal information. While the proposed method shows considerable promise, future work is identified in terms of exploring other decomposition methods commonly used in fingerprint, iris and facial biometric pattern recognition.

KEYWORDS: Validation, Quantification of Margin and Uncertainty, Principal Component Decomposition, Singular Value Decomposition, Full Field Measurements

1. Introduction

Sustained air-breathing, hypersonic flight employing reusable, hot-structure vehicles has been a goal of the U.S. Department of Defense, and specifically of the U.S. Air Force (USAF), for the last 60 years. The difficulty of the scientific challenges inherent in realizing this goal cannot be overstated. Hypersonic vehicles, and particularly their structures, must necessarily operate at the intersection of multiple technical disciplines (structural, fluid, thermal, acoustic and aerodynamic). It is anticipated that these vehicle structures will experience high-intensity, transient loading resulting in nonlinear, coupled structural response. Flight test and experimentation alone cannot account for all of the possible loading combinations. What is clearly required is a validated simulation capability (for both structural response and life prediction) to ensure the successful analysis, design, and operation of such a vehicle. The overall concept of a validated simulation capability is shown in Figure 1, which represents the development of a simulation model that is referred to as the digital twin of the physical hypersonic vehicle.


The Aerospace Systems Directorate at the Air Force Research Laboratory (AFRL/RQ) is dedicated to integrating the structural, control, and aeronautical sciences necessary to field a reusable hypersonic vehicle capable of sustained high-Mach flight. The Aerospace Systems Directorate’s Structural Sciences Center (SSC) was established in 2005 specifically to address the scientific deficiencies in our understanding of extreme-environment structures, deficiencies that have repeatedly stymied USAF, NASA and industry engineers in pursuit of this revolutionary, game-changing mission capability. One critical need of the Program involves defining appropriate validation metrics and determining how these metrics will be useful for the quantification of margin and uncertainty (QMU) for the specific case study example that is part of an OEM Aero-Thermo-Structures Design Study (USAF Task Order 0015: “Predictive Capability for Hypersonic Structural Response and Life Prediction”) [1,2].

The 2014 testing activity specifically involves testing of a representative panel structure, built by an OEM USAF provider, based upon a proposed preliminary analysis of a hot-structure, reusable hypersonic vehicle design in a combined set of extreme-environment regimes. Figure 2 represents the overall roadmap for the hypersonic vehicle program, showing Phase 3 of the OEM Aero-Thermo-Structures Design Study noted in the timeline.

As planning for Phase 3 of this program is initiated, agreement between all parties is needed on the specific calibration and validation metrics that will be utilized to establish the adequacy or success of any modeling. This is made particularly difficult by the detailed, multi-scale modeling requirements, which result in detailed response simulation, together with the emerging capability to make full-field measurements, or very high degree of freedom (DOF) measurements, of the structure’s response as a function of an independent temporal variable such as time or frequency. Response simulation and test data include structural deformation under multiple, extreme inputs such as dynamic load, temperature, aerodynamic flow, acoustic pressure, etc. The testing envisioned for Phase 3 of the OEM Aero-Thermo-Structures Design Study can be considered an example of a benchmark or subsystem case in Figure 3 and represents a unique opportunity for the application and evaluation of calibration and validation metrics that will become the baseline for future work in the hypersonic vehicle program.

Figure 1: Digital Twin Validation Concept

Figure 2: Roadmap of Reusable, Hot-Structure Hypersonic Vehicle Program

2. Background and Relevance

According to the AFRL, “The primary mission of the AFRL-SSC is to address critical shortfalls in the capability of today’s USAF to exploit state-of-the-art, commercial and non-commercial, modeling and simulation tools for predicting the response, performance, applicable failure modes, and potential failure states for aero-structures experiencing the intense, coupled, combined, thermal-acoustic-mechanical loads associated with high-Mach flight, and do so within a risk-informed/risk-based/risk-quantified structural design, structural assessment, and structural evaluation paradigm.”

Developing robust validation strategies for the advanced, extreme-environment structures required for this future USAF mission capability is critically needed. Preliminary designs for the development of such vehicle platforms have been uniformly unsuccessful in the past, in large part because significant gaps exist in the science and technology knowledge base required to meet the mission requirements. The financial and public-relations impact of a vehicle failure demands that a comprehensive validation procedure be part of any hypersonic development program to quantify the uncertainty and mitigate the risk to within any reasonable standard. New design and fabrication strategies that do not require physical testing of the full vehicle will increasingly depend on a validation strategy that relies heavily on material, component and sub-assembly tests in order to digitally (numerically) validate the full vehicle with very limited physical testing of some of the sub-assemblies and of the full vehicle.

The validation problem is particularly sensitive to these information/knowledge gaps when physical test facilities and budgets for experimental testing are limited. Some of the general knowledge gaps that are the focus of current work by the AFRL-SSC include: (1) modeling multi-scale physics problems at multiple time scales and across multiple structural scales, (2) modeling problems across time and structural scales that include joint or boundary connections, (3) capturing the relevant physics at structural scales, particularly the nonlinear effects, (4) creating a simulation environment that includes the multiple structural and time characteristics, and (5) designing, conducting risk assessment, and quantifying uncertainty in a life prediction framework to allow a future platform to be realized, first numerically and then physically. Past work by the author [3] reviewed the verification and validation (V&V) guidelines, identified major challenges/issues and discussed the particular problems of applying V&V to the Hypersonic Vehicle Program.

Figure 3: Hierarchical V&V Concept [5]

Rigorous validation is part of an emerging field that is generally referred to as verification and validation (V&V), with increasing emphasis on the use of V&V methodology for the purposes of quantification of margin and uncertainty (QMU) [4-12]. Specifically, verification refers to numerical experiments designed to detect errors in software (code) or numerical limitations in software (Are we solving the equations correctly?), and validation refers to the physical experiments designed to evaluate the results of the simulations (Are we solving the correct equations?). While this framework has become increasingly structured since 1990, the field has evolved considerably even in the last ten years. The V&V framework has now evolved to include outcomes of uncertainty quantification (UQ) and, more recently, quantification of margins and uncertainties (QMU) [6,7,9,10]. Within this nomenclature, margin refers to the difference (distance) between the design requirement and the mean of the expected or measured result (accounting for unknowns in terms of knock-down factors and/or factors of safety), and uncertainty refers to the variation about the mean of the expected or measured result, often expressed as plus/minus two-sigma information.
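To make these two terms concrete, one commonly used way of writing them can be sketched as follows; the notation here is illustrative and is not taken verbatim from the cited references.

```latex
% Illustrative QMU quantities (assumed notation):
M  = \lvert R - \mu \rvert   % margin: distance between the design requirement R and the mean result \mu
U  = 2\sigma                 % uncertainty: plus/minus two-sigma variation about the mean
CF = M / U                   % margin-to-uncertainty (confidence) ratio commonly reported in QMU
```

A confidence ratio greater than one simply indicates that the available margin exceeds the estimated uncertainty.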

3. General Methodology

The focus of this research has been to identify potential validation metrics and how these metrics support the quantification of margin and uncertainty (QMU) for the specific structure and testing involved in Phase 3 of the OEM Study. This includes defining an approach to a series of validation metrics for both structural response prediction models and life prediction models that encompass the multiple physics aspects of the testing. At the present time, based upon limitations of the test facilities, this is likely to be combinations of acoustic, thermal and structural loading. This focus addresses some of the previously identified challenges/issues as well as some of the knowledge gaps identified during Phase 1 of the OEM Study.

In particular, this research work partially addresses one of the key knowledge gaps identified in the OEM Study [2]:

“While these phenomena can be represented in isolation with sufficiently detailed special purpose models, there are difficulties in addressing multiple phenomena simultaneously at the global design level with confidence.”

This problem of identifying the effects of multiple phenomena simultaneously is the nature of the validation problem in multiple extreme environments. Definition of validation metrics for the proposed experiment involves more than one validation metric and may involve hundreds of metrics when both structural response and life are included. Validation limits or targets (an acceptable match) between the model simulation and the physical experiment, which constitute the validation metric, should be defined in terms of measured engineering units (EU) prior to the physical testing [6,7]. An important aspect of any validation test is the definition, and measurement, of loads and boundary conditions so that the test environment is completely known. Ideally, this becomes part of the set of validation metrics as well. Validation metrics are often computed using parameters that are scalar (1-D), area (2-D), volume (3-D) or generalized space (N-D) concepts characterized by the difference between the simulation and physical experiment [6,7]. Often, the metric is defined in terms of differences of maxima or means (time or frequency domain) rather than differences in the complex (magnitude and phase) quantities (frequency domain) when the importance of the phasing of the data is unknown. A major concern in the development of validation metrics is that using maxima, or absolute values, generally gives a “worst case” scenario with the probability of additional design margin above what is desirable.

While not generally preferred, validation metrics can be normalized (to account for different units and scaling) and combined in terms of vector quantities to provide an overall validation metric as a measure of success or failure of validation. This type of methodology utilizes vector norm and subspace (normalized singular/principal value) concepts to fully understand comparisons of results from multiple physical tests involving multiple measurement quantities (units). In all cases, some traceability to engineering units (EU) is desirable to make sure details related to the scale of the units are not lost.

Regardless of whether specific validation targets are met as measured by the a priori validation metrics, one of the important goals of a validation test is the refinement of the quantification of margin and uncertainties. For this reason, it is desirable that estimates of margin and uncertainties are included in pre-test planning for each metric. Refinement and further understanding of the quantification of margin and uncertainty must be part of the post-test analysis.


It is expected that facility configuration and the limits of available instrumentation will also be a part of any validation test. During pre-test planning and the establishment of the validation metrics, these limitations must be part of the discussion between the modeling and test personnel. These limitations will dictate the limits of what can be measured and compared with respect to validation and may change the test protocol. It is important to recognize that validation testing is not an attempt to measure realistic mission profiles or application domain environments; rather, it is an attempt to simulate, measure and quantify metrics that are representative of successful modeling of a portion of the extreme environments that the test object will encounter.

Quantification of margin and uncertainties (QMU) assessment is the principal driver for the choice of V&V metrics and uncertainty quantification (UQ) strategies [10]. Even when validation metrics are not achieved within desirable goals, the lack of agreement provides important information for the QMU part of the risk analysis. The validation metrics chosen should provide some measures that allow the quantification of the means and uncertainties in terms of engineering units. One of the goals of the risk assessment for the hypersonic vehicle program is to move the risk away from structural response prediction toward life prediction, so that the majority of the uncertainty will be on predictions of decreased/increased life, with little or no risk of complete loss of the platform due to uncertain structural response prediction. This means that additional data from the first flights of the full hypersonic vehicle will be utilized for revised estimates of the quantification of margins and uncertainty prior to additional flights. In order for the ultimate design of the hypersonic vehicle to be realizable, the design methodology must focus on a realistically small, but acceptable, margin within a minimal uncertainty climate to meet weight, structure and mission requirements.

Understanding the relationship between margin and uncertainty across the design space is the ultimate goal of V&V and, specifically, of validation. When the validation strategy is understood as part of the quantification of margins and uncertainties, this gives an appropriate perspective to the process. The significance of this aspect of the validation strategy is that it will extend, and give better definition to, the current validation thought process, which takes a largely sequential material, component and sub-assembly approach, and move it toward an integrated, system-level, top-down, rather than bottom-up, process.

Explanation of New/Unusual Techniques:

The proposed validation method involves a decomposition of large amounts of simulation and test data from the physical space to a virtual space of much smaller dimension while maintaining the engineering units of the physical space. The proposed technique uses singular value decomposition (SVD) as a principal component analysis (PCA) method to estimate the validation metric. This is a significant change from, and a potential improvement over, historical calibration and/or validation metrics that involve the comparison of very limited degrees of freedom (DOFs) in only the physical space. The primary advantage of the proposed PCA validation metric is that it preserves the engineering units (EU) of the original data so that the quantifications of margin and uncertainty can be made in EU. A second advantage of the PCA validation metric is that it can be used over a wide range of temporal information. A third advantage is that some aerospace industry partners are already using SVD-based methodology when comparing experimental data sets. While the proposed method shows considerable promise, future work is identified in terms of exploring other decomposition methods commonly used in fingerprint, iris and facial biometric pattern recognition.


4. Specific Methodology - Principal Component Analysis

The focus of the current validation metric is the use of principal component analysis as the basis for the formulation of validation metrics for sets of modeling simulation results as well as experimentally measured results of the same quantities. Principal Component Analysis (PCA) represents a coordinate transformation or decomposition from physical space to some virtual space that is a ranked decomposition of the physical space (ranked in the sense that the largest contributions are the most significant). The application of PCA as a validation metric permits a large amount of simulation or experimental data to be reduced to a more manageable amount of data where comparisons (metrics) can be more easily formed on a macro scale instead of the historical DOF-by-DOF comparison. The nature of the PCA decomposition captures the engineering units (EU) of the data so as to avoid normalization, which is undesirable in V&V considerations. The application of PCA to experimental structural dynamics data to determine underlying characteristics is widely used [13-22].

The fundamental PCA decomposition currently used in experimental structural dynamics is often formed using the eigenvalue decomposition (ED) of a real-valued, square matrix. This approach could be useful for the comparison of time domain simulation and experiment results and is represented in the equation below. One concern is the need for square matrices, which will not generally be applicable to simulation and experimental situations.
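One standard form of this decomposition, assuming the real, symmetric case typically used for PCA, can be sketched as:

```latex
[B] = [V]\,[\Lambda]\,[V]^{T}
% [B]: real-valued, square (symmetric) data matrix
% [V]: matrix of orthonormal eigenvectors
% [\Lambda]: diagonal matrix of real eigenvalues (the principal values)
```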

The more general form of the PCA decomposition used in experimental structural dynamics is the singular value decomposition (SVD) of a complex-valued, rectangular matrix. This approach is useful for the comparison of frequency domain simulation and experiment results (complex-valued) and for cases where the simulation and experiment matrix of data is not square. This is represented in the following equation, which contains the previous equation as a subset.
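A standard way of writing this singular value decomposition, consistent with the matrix definitions discussed below, is sketched as:

```latex
[B]_{N_L \times N_S} = [U]\,[S]\,[V]^{H}
% [U]: unitary matrix of left singular vectors
% [S]: diagonal, square, real-valued singular value matrix
% [V]: unitary matrix of right singular vectors; H denotes the Hermitian (conjugate) transpose
```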

The above equation will be considered the base equation for all of the following V&V discussion and examples. In this equation, the matrix [B] represents the data matrix that will be evaluated. In general, this matrix will be complex-valued and rectangular of size NL by NS (the sizes of the long and short dimensions of the data). The [U] matrix contains the left singular vectors and the [V] matrix contains the right singular vectors; both are complex-valued and unitary. The superscript T represents the transpose of a matrix and the superscript H represents the Hermitian (conjugate) transpose of a matrix. The remaining [S] matrix is the singular value matrix, which is diagonal, square and real-valued. Because the [U] and [V] matrices are unitary, the magnitude characteristics of the data matrix [B] are captured in the singular values, which, with proper scaling, have the same engineering units (EU) as the data matrix [B].
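As an illustration of the decomposition just described, the minimal Python/NumPy sketch below decomposes a complex-valued NL-by-NS data matrix and recovers EU-scaled singular values; the array names, random data and dimensions are illustrative assumptions, not part of the original study.

```python
import numpy as np

# Illustrative dimensions (assumptions): NL response DOFs, NS loading/reference cases.
NL, NS = 146, 4

# B stands for either the simulation or the test data matrix at one temporal point
# (complex-valued frequency-domain responses, in engineering units such as g/N).
B = np.random.randn(NL, NS) + 1j * np.random.randn(NL, NS)

# Economy-size SVD: B = U @ diag(s) @ Vh, with U (NL x NS) having orthonormal columns
# and Vh (NS x NS) unitary.
U, s, Vh = np.linalg.svd(B, full_matrices=False)

# Because U and V are unitary (orthonormal), the singular values s carry the magnitude
# and the engineering units of B; s[0] describes the dominant response pattern.
print("singular values (EU):", s)

# Consistency check: the factors reproduce B to machine precision.
assert np.allclose(B, U @ np.diag(s) @ Vh)
```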

In order to understand how this decomposition will be applied to simulated or measured data, the principal component analysis is first applied at one temporal point (time or frequency). There will be two corresponding [B] data matrices, one for the simulation data and one for the experimental data. For either the simulation data or the experimental data, the data matrix [B] will have the same row-column structure. The matrix [B] will have row dimension NL, which represents the degrees of freedom (DOF) of the corresponding simulation and experimental data. These would typically be the nodal locations in 3-space with multiple dimensions per node (x, y, z, etc.). Requirements for traditional validation would limit the multiple dimensions to like engineering units (mixing of motion and temperature, for example, would not be permitted). The column dimension NS of the data matrix [B] represents multiple simulations, loading conditions or reference measurements associated with each of the DOFs. This allows for the use of multiple datasets with different input parameters for each dataset. Since the first singular value found is the largest singular value, it will represent, together with the associated unitary vectors from [U] and [V], the most dominant response characteristics in all of the datasets. The remaining singular values and singular vectors represent the lesser response characteristics.

The above principal component analysis is then repeated at each temporal point in the datasets so that a comparison can be made over a complete simulation and test involving time or frequency. Since the singular values retain the engineering units (EU) of the data, the singular values can be compared directly to determine overall scale relationships (calibration or quantification of margin) between the simulation and experiment as well as a measure of scatter (quantification of uncertainty).
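A minimal sketch of this temporal-point-by-temporal-point evaluation is shown below, assuming the simulation and test responses are stored as complex arrays of shape (NL, NS, Nf); the function name, array names and random placeholder data are assumptions for illustration only.

```python
import numpy as np

def primary_singular_values(data):
    """data: complex array of shape (NL, NS, Nf) -- DOFs x reference cases x temporal points.
    Returns the largest singular value at each temporal point, in the data's engineering units."""
    NL, NS, Nf = data.shape
    s1 = np.empty(Nf)
    for k in range(Nf):
        # PCA-SVD of the spatial / reference-case slice at temporal point k
        s = np.linalg.svd(data[:, :, k], compute_uv=False)
        s1[k] = s[0]  # singular values are returned in descending order
    return s1

# Placeholder arrays standing in for the two datasets of the example
# (146 DOFs, 4 reference conditions, roughly 416 frequency lines from 4 to 30 Hz).
rng = np.random.default_rng(0)
dataset1 = rng.standard_normal((146, 4, 416)) + 1j * rng.standard_normal((146, 4, 416))
dataset2 = rng.standard_normal((146, 4, 416)) + 1j * rng.standard_normal((146, 4, 416))

s1_set1 = primary_singular_values(dataset1)
s1_set2 = primary_singular_values(dataset2)
```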

Application Example - Principal Component Analysis:

The following example demonstrates the principal component analysis procedure used in a validation exercise for a structural dynamics example. The procedure is general and can be applied to any two datasets, whether the data are simulation and experiment, simulation and simulation, or experiment and experiment. The example uses singular value decomposition to implement the principal component analysis. This particular application uses two experimental datasets from an experimental structural dynamics test on an automotive vehicle that contain realistic measurement errors and small amounts of nonlinear response over the input loading levels. Each dataset utilizes a data matrix that has a row dimension of 146 (NL) and a column dimension of 4 (NS), and the analysis is completed over a frequency range from around 4 Hertz to 30 Hertz using 0.0625 Hertz resolution (more than 400 frequency steps).


Figure 4: Dataset #1 – Color Spectrograms of Four Reference Conditions

Figure 5: Dataset #2 – Color Spectrograms of Four Reference Conditions


Figures 4 and 5 represent the two datasets that will be represented by principal component analysis using singular value decomposition. These color spectrograms look a little different from traditional color plots that represent the motion of a 2-D space in that the 3-D space is captured in the rows of this format. Therefore, 2-D locations that have higher amplitudes of response will appear as striations in this type of plot, depending on the DOF naming convention. The columns of this format capture the temporal aspect of the data, frequency in this case. The color captures the magnitude of the response (although both magnitude and phase are present in this set of data). Finally, the four color spectrograms for each dataset capture the four different cases or sets of data that represent different simulation/test conditions.

Figure 6: Principal Component Analysis of Dataset #1 and #2

Figure 6 represents the results of comparing the two datasets using PCA-SVD, where the singular values are plotted as a function of the temporal independent variable, frequency. This data represents a comparison of the magnitude characteristics of the two sets of four color spectrograms shown in Figures 4 and 5. The phase information in this case is not lost but is contained in the associated singular vectors. Since the singular values have been scaled to the engineering units of the responses involved in this comparison, the magnitudes are in terms of the engineering units of the simulation/test data. Recognize that, while the units of the singular values are the same as the original data, the new information is no longer in the physical space of the original data. Each set of singular value curves captures the magnitude of the dominant response that spans the original 3-D geometry and contributing test conditions.

In this presentation, the dotted lines of the same color represent the singular values of the respective datasets. For further discussion, only the analysis of the first curves (blue dotted lines) will be discussed, as they represent the highest magnitudes of response. However, the other three singular value sets of curves also provide information that may be of value, particularly when extreme environments are considered and the responses may be nonlinear. These plots are included in the Appendix.

The primary principal components are replotted in Figure 7. The two shapes are similar, representing the similarity in the primary response characteristics of the two contributing datasets. The correlation in shape as a function of frequency (ignoring scale) between these two curves is 0.987, where 1.0 would represent the same shape as a function of frequency.
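The paper does not state the exact formula used for this shape correlation; one plausible implementation is a scale-independent normalized inner product of the two primary singular value curves, sketched below.

```python
import numpy as np

def shape_correlation(curve_a, curve_b):
    """Scale-independent correlation between two real-valued curves, such as the primary
    singular values of two datasets plotted against frequency; 1.0 means identical shape."""
    a = np.asarray(curve_a, dtype=float)
    b = np.asarray(curve_b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Example usage with the curves from the previous sketch:
# shape_correlation(s1_set1, s1_set2)  -> a value near 1.0 for well-matched shapes
```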


Figure 7: Primary Principal Component of Dataset #1 and #2

The analysis of Figure 7 is made more complete by also viewing the correlation of the singular vectors as a function of frequency [22] in order to determine that the spatial information throughout the 3-D space of the simulation and test is similar. In this case, the correlation of nearly 1.0 in Figure 8 indicates that the singular vectors are spanning the same 3-D space as a function of frequency.

Figure 8: Primary Principal Vector Correlation of Dataset #1 and #2
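A correlation of this kind can be formed with a MAC-style coherence [22] between the primary left singular vectors of the two datasets at each frequency; the sketch below assumes that convention and the (NL, NS, Nf) array layout used earlier.

```python
import numpy as np

def primary_vector_correlation(data_a, data_b):
    """MAC-style correlation between the primary left singular vectors of two datasets
    at each temporal point. Inputs are complex arrays of shape (NL, NS, Nf)."""
    Nf = data_a.shape[2]
    corr = np.empty(Nf)
    for k in range(Nf):
        ua = np.linalg.svd(data_a[:, :, k], full_matrices=False)[0][:, 0]
        ub = np.linalg.svd(data_b[:, :, k], full_matrices=False)[0][:, 0]
        # Coherence is 1.0 when the two vectors span the same spatial direction.
        corr[k] = np.abs(np.vdot(ua, ub)) ** 2 / (np.vdot(ua, ua).real * np.vdot(ub, ub).real)
    return corr

# Example usage: primary_vector_correlation(dataset1, dataset2) returns a curve that
# should stay near 1.0 when the dominant spatial patterns of the two datasets agree.
```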

This analysis becomes more useful for calibration or validation when the two curves are plotted against each other, as in Figures 9 and 10 [23-24]. Figure 9 gives the general idea and the relationship between the ratio of the primary singular values and the quantification of margin and uncertainty.


Figure 9: Validation Metric – Relationship to QMU

In the two figures that are part of Figure 10, the scale (quantification of margin) between the two datasets is given by the slope of the curve, where a slope of one (1.0) would mean that the simulation data and test data match at all frequencies. Likewise, in these figures, the scatter of the singular values around the sloped line gives a quantification of the uncertainty. At this time, the slopes of the lin-lin plot and the log-log plot both seem to give useful information, and the use of one over the other is not clearly advantageous.

Figure 10: Primary Principal Response Comparisons

In both of the plots in Figure 10, the red line represents a slope of 1.0 in the lin-lin plot and in the log-log plot. For this example, the slope of the ratio of singular values is less than 1.0, which means that the second dataset (y axis) has an overall magnitude that is less than that represented by the first dataset (x axis). If the first dataset is the result of a series of model simulations and the second dataset is the corresponding experimental data, this would mean that the model is overestimating the response that was experimentally measured. The scatter about the sloped line is an indication of the uncertainty from temporal point to temporal point in the analysis.
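As a sketch of how the slope and scatter in Figure 10 could be quantified, the following least-squares fit through the origin treats the slope as the margin-like scale factor and the residual standard deviation as the EU-valued uncertainty; the function name and fitting choice are assumptions for illustration, not a prescription from the paper.

```python
import numpy as np

def margin_and_uncertainty(s1_reference, s1_comparison):
    """Fit s1_comparison ~ slope * s1_reference (least squares through the origin).
    The slope is a dimensionless scale (margin-like) factor between the two datasets;
    the scatter of the residuals about the fitted line stays in engineering units."""
    x = np.asarray(s1_reference, dtype=float)
    y = np.asarray(s1_comparison, dtype=float)
    slope = np.dot(x, y) / np.dot(x, x)
    residuals = y - slope * x
    scatter = residuals.std()  # one-sigma scatter about the sloped line, in EU
    return slope, scatter

# Example usage with the curves computed earlier:
# slope, scatter = margin_and_uncertainty(s1_set1, s1_set2)
# A slope below 1.0 indicates the second dataset has lower overall magnitude.
```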

Issues, Concerns:

The demonstration example does not include the extreme environments anticipated in the OEM Aero-Thermo-Structures Design Study. This may necessitate further examination of the remaining principal components included in the Appendix. In this particular demonstration study, the data matrix [B] was somewhat modest compared to what might be encountered in a realistic simulation and test data validation. From a numerical point of view, this does not constitute a computational problem, as the memory and computational speed requirements are modest even for very large data matrix comparisons. The PCA method does require that the DOF and temporal dimensions of both data matrices be the same. The temporal dimension should never be a problem, as the simulation can be chosen to match the experimental test. However, in some cases with large differences in DOF spacing, a geometry-based correlation may be needed to match the simulation and experimental DOFs to a common subset.

Results/Significance:

The results appear to be easily estimated using PCA and SVD procedures that are commonly used in structural dynamics and meet the goal of providing an approach to validation metrics that yield engineering units (EU) and an easily interpreted quantification of margin and uncertainty. Further demonstration examples need to be evaluated to determine overall usefulness beyond the historical structural dynamics example involving acceleration response used in this case. These PCA validation metrics, or similar validation metric approaches involving alternate decomposition procedures, are expected to be significant as the basis for validation planning and development of validation metrics for the rest of the hypersonic vehicle program. Specific results, as well as the process, are expected to be applied to many other simulations and tests that are part of the OEM Aero-Thermo-Structures Design Study as well as future hypersonic vehicle program simulation and testing.

5. Summary and Future Work:

The proposed PCA-SVD validation metric appears to meet the general requirements of the OEM Aero-Thermo-Structures Design Study and of the experimental phase that will be conducted in 2014. Further refinement as well as alternative methods will be evaluated once a complete set of simulations and matching experimental tests is available. Future work involves evaluating the PCA-SVD validation metric on datasets of increasing dimension and complexity (in terms of extreme environment simulation and test data) [25-26]. Likewise, comparing the present metric to other decomposition methods commonly used in fingerprint, iris and facial biometric pattern recognition will be necessary to determine whether alternate methods may be more robust. The primary concern regarding pattern recognition methods is the lack of traceability to engineering units and the possible focus on portions of the data that may not be part of the dominant response.

References/Literature Citations:

1. Tzong, G., Jacobs, R., Liguore, S., “Task Order 0015: Predictive Capability for Hypersonic Structural Response and Life Prediction: Phase 1 - Identification of Knowledge Gaps, Volume 1 - Nonproprietary Version”, AFRL-RB-WP-TR-2010-3068, V1, 181 pp., 2010.

2. Zuchowski, B., “Task Order 0015: Predictive Capability for Hypersonic Structural Response and Life Prediction, Phase 1 - Identification of Knowledge Gaps”, AFRL-RB-WP-TR-2010-3069, 100 pp., 2010.


3. Allemang, R.J., “Validation of the “Digital Twin” of a Reusable, Hot Structure, Hypersonic Vehicle”, Final Report – US Air Force Summer Faculty Fellowship Program, 14 pp., Sept. 2011.

4. AIAA, Guide for the Verification and Validation of Computational Fluid Dynamics Simulations, American Institute of Aeronautics and Astronautics, AIAA-G-077-1998, 19 pp., 1998.

5. ASME, Guide for Verification and Validation in Computational Solid Mechanics, American Society of Mechanical Engineers, ASME V&V 10-2006. 29 pp., 2006.

6. Oberkampf, W.L., Roy, C.J., Verification and Validation in Scientific Computing, Cambridge University Press, 767 pp., 2010.

7. Roache, P. J., Fundamentals of Verification and Validation, Hermosa Publishers, 476 pp., 2009.

8. NASA, Standards for Models and Simulations, NASA-STD-7009, 58 pp., 2008.

9. Helton, J.C., “Conceptual and Computational Basis for the Quantification of Margins and Uncertainty”, Sandia Technical Report, SAND2009-3055, 400 pp., June 2009.

10. Alvin, K., "Perspectives on V&V and Computational Simulation-Based QMU at Sandia", Create Developers Review Meeting, Presentation Slides, 22 pp., 2011.

11. NASA, “Procedure for Failure Mode, Effects and Criticality Analysis (FMECA)”, Apollo Reliability and Quality Assurance Office, RA-006-013-1A (N70-76197), 37 pp., 1966.

12. Blevins, R.D., Bofilios, D., Holehouse, I., Hwa, V., Tratt, M., Laganelli, A.L., Pozefsky, P., Pierucci, M., “Thermo-Vibro-Acoustic Loads and Fatigue of Hypersonic Flight Vehicle Structure”, AFRL-RB-WP-TR-2009-3139, 478 pp., 2009.

13. Wilkinson, J.H., The Algebraic Eigenvalue Problem, Oxford University Press, Oxford, U.K., 1965, pp. 12-13.

14. Strang, G., Linear Algebra and Its Applications, Third Edition, Harcourt Brace Jovanovich Publishers, San Diego, CA, 1988, 505 pp.

15. Lawson, C.L., Hanson, R.J., Solving Least Squares Problems, Prentice-Hall, Inc., Englewood Cliffs, NJ, 1974, 340 pp.

16. Jolliffe, I.T., Principal Component Analysis, Springer-Verlag New York, Inc., New York, NY, 1986, 271 pp.

17. Ljung, Lennart, System Identification: Theory for the User, Prentice-Hall, Inc., Englewood Cliffs, NJ, 1987, 519 pp.

18. Deprettere, E.F.A., SVD and Signal Processing: Algorithms, Applications and Architectures, Elsevier Science Publishing Company, Inc., New York, NY, 1988, 477 pp.

19. Hyvarinen, A., Karhunen, J., Oja, E., Independent Component Analysis, John Wiley & Sons, Inc., New York, NY, 2001, 481 pp.

20. Jackson, J. Edward, User's Guide to Principal Components, John Wiley & Sons, Inc., Hoboken, NJ, 2003, 569 pp.

21. Allemang, R.J., Phillips, A.W., Allemang, M.R., “Application of Principal Component Analysis Methods to Experimental Structural Dynamics”, Proceedings, International Modal Analysis Conference, 22 pp., 2010.

22. Allemang, R.J., “The Modal Assurance Criterion (MAC): Twenty Years of Use and Abuse", Proceedings, International Modal Analysis Conference, pp. 397-405, 2002. Sound and Vibration Magazine, Vol. 37, No. 8, pp. 14-23, August, 2003.

23. Wang, W., Mottershead, J.E., Patki, A., Patterson, E.A., “Construction of Shape Features for the Representation of Full-field Displacement/Strain Data”, Applied Mechanics and Materials, Vol. 24-25, pp. 365-370, 2010.

24. Sebastian, C.M., Patterson, E.A., Ostberg, D., “Comparison of Numerical and Experimental Strain Measurements of a Composite Panel Using Image Decomposition”, Applied Mechanics and Materials, Vol. 70, pp. 63-68, 2011.

25. Allemang, R.J., "Validation Metrics and Quantification of Margin and Uncertainty (QMU) in the Extreme Environments of the Hypersonic Vehicle Program", Final Report- US Air Force Summer Faculty Fellowship Program, 16 pp., 2012.

26. Allemang, R.J., "Refinement of Metrics Used to Quantify Margin and Uncertainty (QMU) in the Extreme Environments of the Hypersonic Vehicle Program", Final Report- US Air Force Summer Faculty Fellowship Program, 16 pp., 2013.


APPENDIX:
