Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

Jeffrey R. Edwards
University of North Carolina

Page 1: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

1

Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

Jeffrey R. Edwards
University of North Carolina

Page 2: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

2

Outline

I. Types of Difference Scores
II. Questions Difference Scores Are Intended To Address
III. Problems With Difference Scores
IV. An Alternative Procedure
V. Analyzing Quadratic Regression Equations Using Response Surface Methodology
VI. Moderated Polynomial Regression
VII. Mediated Polynomial Regression
VIII. Difference Scores As Dependent Variables
IX. Answers to Frequently Asked Questions

Page 3: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

3

Types of Difference Scores

Univariate:
  Algebraic difference: (X – Y)
  Absolute difference: |X – Y|
  Squared difference: (X – Y)2

Multivariate:
  Sum of algebraic differences: Σ(Xi – Yi) = D1
  Sum of absolute differences: Σ|Xi – Yi| = |D|
  Sum of squared differences: Σ(Xi – Yi)2 = D2
  Euclidean distance: (Σ(Xi – Yi)2)½ = D
  Profile correlation: C(Xi,Yi)/S(X)S(Y) = rXi,Yi = Q

Page 4: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

4

Questions Difference Scores Are Intended to Address

How well do characteristics of the job fit the needs or desires of the employee?
To what extent do job demands exceed or fall short of the abilities of the person?
Are prior expectations of the employee met by actual job experiences?
What is the degree of similarity between perceptions or beliefs of supervisors and subordinates?
Do the values of the person match the culture of the organization?
Can novices provide performance evaluations that agree with expert ratings?

Page 5: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

5

Problems with Difference Scores: Reliability

When component measures are positively correlated, difference scores are often less reliable than either component.

The reliability of an algebraic difference is:

r(X–Y) = (σX² rXX + σY² rYY – 2 rXY σX σY) / (σX² + σY² – 2 rXY σX σY)

To illustrate, if X and Y have unit variances, have reliabilities of .75, and are correlated .50, the reliability of X – Y equals .50.
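As an illustrative check (not part of the original slides), the following Python sketch computes the reliability of an algebraic difference from the component variances, reliabilities, and correlation, reproducing the .50 value above.

def diff_score_reliability(var_x, var_y, rel_x, rel_y, r_xy):
    """Reliability of (X - Y) from component variances, reliabilities, and correlation."""
    sd_x, sd_y = var_x ** 0.5, var_y ** 0.5
    num = var_x * rel_x + var_y * rel_y - 2 * r_xy * sd_x * sd_y
    den = var_x + var_y - 2 * r_xy * sd_x * sd_y
    return num / den

# Slide example: unit variances, reliabilities of .75, correlation of .50 -> .50
print(diff_score_reliability(1.0, 1.0, 0.75, 0.75, 0.50))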

Page 6: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

6

Problems with Difference Scores: Conceptual Ambiguity

It might seem that component variables are reflected equally in a difference score, given that the components are implicitly assigned the same weight when the difference score is constructed.

However, the variance of a difference score depends on the variances and covariances of the component measures, which are sample dependent.

When one component is a constant, the variance of a difference score is solely due to the other component, i.e., the one that varies. For instance, when P-O fit is assessed in a single organization, the P-O difference solely represents variation in the person scores.

Page 7: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

7

Problems with Difference Scores: Confounded Effects

Difference scores confound the effects of the components of the difference.

For example, an equation using an algebraic difference as a predictor can be written as:

Z = b0 + b1(X – Y) + e

In this equation, b1 can reflect a positive relationship for X, a negative relationship for Y, or some combination thereof.

Page 8: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

8

Problems with Difference Scores: Untested Constraints

Difference scores constrain the coefficients relating X and Y to Z without testing these constraints.

The constraints imposed by an algebraic difference can be seen with the following equations:

Z = b0 + b1(X – Y) + e

Expansion yields:

Z = b0 + b1X – b1Y + e

Page 9: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

9

Problems with Difference Scores: Untested Constraints

Now, consider an equation that uses X and Y as separate predictors:

Z = b0 + b1X + b2Y + e

Using (X – Y) as a predictor constrains the coefficients on X and Y to be equal in magnitude but opposite in sign (i.e., b1 = –b2).

This constraint should not be simply imposed on the data but instead should be treated as a hypothesis to be tested.

Page 10: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

10

Problems with Difference Scores: Untested Constraints

The constraints imposed by a squared difference can be seen with the following equations:

Z = b0 + b1(X – Y)2 + e

Expansion yields:

Z = b0 + b1X2 – 2b1XY + b1Y2 + e

Thus, a squared difference implicitly treats Z as a function of X2, XY, and Y2.

Page 11: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

11

Problems with Difference Scores: Untested Constraints

Now, consider a quadratic equation using X and Y:

Z = b0 + b1X + b2Y + b3X2 + b4XY + b5Y2 + e

Comparing this equation to the previous equation shows that (X – Y)2 imposes four constraints:

b1 = 0
b2 = 0
b3 = b5, or b3 – b5 = 0
b3 + b4 + b5 = 0

Again, these constraints should be treated as hypotheses to be tested empirically, not simply imposed on the data.

Page 12: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

12

Problems with Difference Scores: Dimensional Reduction

Difference scores reduce the three-dimensional relationship of X and Y with Z to two dimensions.

The linear algebraic difference function represents a symmetric plane with equal but opposite slopes with respect to the X-axis and Y-axis.

The V-shaped absolute difference function represents a symmetric V-shaped surface with its minimum (or maximum) running along the X = Y line.

The U-shaped squared difference function represents a symmetric U-shaped surface with its minimum (or maximum) running along the X = Y line.

Page 13: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

13

Two-Dimensional Algebraic Difference Function

[Figure: two-dimensional plot of Z (1 to 7) against X – Y (–6 to 6).]

Page 14: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

14

Three-Dimensional Algebraic Difference Function

Page 15: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

15

Two-Dimensional Absolute Difference Function

[Figure: two-dimensional plot of Z (1 to 7) against X – Y (–6 to 6).]

Page 16: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

16

Three-Dimensional Absolute Difference Function

Page 17: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

17

Two-Dimensional Squared Difference Function

[Figure: two-dimensional plot of Z (1 to 7) against X – Y (–6 to 6).]

Page 18: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

18

Three-Dimensional Squared Difference Function

Page 19: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

19

Problems with Difference Scores: Dimensional Reduction

These surfaces represent only three of the many possible surfaces depicting how X and Y may be related to Z.

This problem is compounded by the use of profile similarity indices, which collapse a series of three-dimensional surfaces into a single two-dimensional function.

Page 20: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

20

An Alternative Procedure

The relationship of X and Y with Z should be viewed in three dimensions, with X and Y constituting the two horizontal axes and Z constituting the vertical axis.

Analyses should focus not on two‑dimensional functions relating the difference between X and Y to Z, but instead on three‑dimensional surfaces depicting the joint relationship of X and Y with Z.

Constraints should not be simply imposed on the data, but instead should be viewed as hypotheses that, if confirmed, lend support to the conceptual model upon which the difference score is based.

Page 21: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

21

Data Used for Illustration

Data were collected from 373 MBA students who were engaged in the recruiting process.

Respondents rated the actual and desired amounts of various job attributes and the anticipated satisfaction concerning a job for which they had recently interviewed.

The actual and desired measures had three items and used 7-point response scales ranging from “none at all” to “a very great amount.” The satisfaction measure had three items and used a 7-point response scale ranging from “strongly disagree” to “strongly agree.”

The job attributes used for illustration are autonomy, prestige, span of control, and travel.

Page 22: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

22

Confirmatory Approach

When a difference score represents a hypothesis that is predicted a priori, the alternative procedure should be applied using the confirmatory approach, which involves four conditions (a sketch of this workflow follows the list):

The R2 for the unconstrained equation should be significant.
The coefficients in the unconstrained equation should follow the pattern indicated by the difference score.
The constraints implied by the difference score should not be rejected.
The set of terms one order higher than those in the unconstrained equation should not be significant.
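Below is a minimal sketch of this workflow in Python with statsmodels, assuming a hypothetical data set with outcome Z and components X and Y; the slides themselves use SYSTAT, and simulated data stand in here for the real measures.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with hypothetical names standing in for the real measures.
rng = np.random.default_rng(0)
n = 360
d = pd.DataFrame({"X": rng.normal(size=n), "Y": rng.normal(size=n)})
d["Z"] = 0.4*d.X - 0.3*d.Y + rng.normal(size=n)

unconstrained = smf.ols("Z ~ X + Y", data=d).fit()                           # conditions 1 and 2
constrained = smf.ols("Z ~ I(X - Y)", data=d).fit()                          # algebraic difference
quadratic = smf.ols("Z ~ X + Y + I(X**2) + I(X*Y) + I(Y**2)", data=d).fit()  # one order higher

print(unconstrained.summary())                    # R2 and coefficient pattern
print(unconstrained.compare_f_test(constrained))  # condition 3: constraint b1 = -b2
print(quadratic.compare_f_test(unconstrained))    # condition 4: quadratic terms as a set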

Page 23: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

23

Confirmatory Approach Applied to the Algebraic Difference

The unconstrained equation is:

Z = b0 + b1X + b2Y + e

The constrained equation used to evaluate the third condition is:

Z = b0 + b1(X – Y) + e

The equation that adds higher-order terms used to evaluate the fourth condition is:

Z = b0 + b1X + b2Y + b3X2 + b4XY + b5Y2 + e

Page 24: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

24

Example: Confirmatory Test of Algebraic Difference for Autonomy

Unconstrained equation:

Dep Var: SAT   N: 360   Multiple R: 0.356   Squared multiple R: 0.127
Adjusted squared multiple R: 0.122   Standard error of estimate: 1.077

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.835      0.077     0.000          .  75.874      0.000
AUTCA            0.445      0.062     0.413      0.737   7.172      0.000
AUTCD           -0.301      0.071    -0.244      0.737  -4.235      0.000

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          60.133    2       30.067   25.930  0.000
Residual           413.953  357        1.160

Page 25: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

25

Example: Confirmatory Test of Algebraic Difference for Autonomy

Unconstrained surface:

Page 26: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

26

Example: Confirmatory Test of Algebraic Difference for Autonomy

The first condition is met, because the R2 from the unconstrained equation is significant.

The second condition is met, because the coefficients on X and Y are significant and in the expected direction.

For the third condition, testing the constraints imposed by the algebraic difference is the same as testing the difference in R2 between the constrained and unconstrained equations.

Page 27: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

27

Example: Confirmatory Test of Algebraic Difference for Autonomy

Constrained equation:

Dep Var: SAT   N: 360   Multiple R: 0.339   Squared multiple R: 0.115
Adjusted squared multiple R: 0.113   Standard error of estimate: 1.082

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.937      0.061       0.0          .  97.007      0.000
AUTALD           0.393      0.058     0.339      1.000   6.825      0.000

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          54.589    1       54.589   46.586  0.000
Residual           419.498  358        1.172

Page 28: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

28

Example: Confirmatory Test of Algebraic Difference for Autonomy

Constrained surface:

Page 29: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

29

Example: Confirmatory Test of Algebraic Difference for Autonomy The general formula for the difference in R2 between

two regression equations is:

F = [(R²U – R²C) / (dfC – dfU)] / [(1 – R²U) / dfU]

where R²U and dfU come from the unconstrained equation and R²C and dfC come from the constrained equation.

The test of the constraint imposed by the algebraic difference for autonomy is:

F = [(.127 – .115) / (358 – 357)] / [(1 – .127) / 357] = 4.91, p < .05

The constraint is rejected, so the third condition is not satisfied.
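The same arithmetic can be scripted; the short sketch below (Python with scipy; only the R² values and degrees of freedom reported above are assumed) reproduces the F value of 4.91.

from scipy.stats import f

def r2_difference_test(r2_u, r2_c, df_u, df_c):
    """F test for the R^2 difference between nested unconstrained (U) and constrained (C) equations."""
    F = ((r2_u - r2_c) / (df_c - df_u)) / ((1 - r2_u) / df_u)
    return F, f.sf(F, df_c - df_u, df_u)

# Constraint imposed by the algebraic difference for autonomy (values from the slides):
print(r2_difference_test(0.127, 0.115, 357, 358))   # F = 4.91, p < .05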

Page 30: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

30

Example: Confirmatory Test of Algebraic Difference for Autonomy

For the fourth condition, the unconstrained equation for the algebraic difference is linear, so the higher-order terms are the three quadratic terms X2, XY, and Y2.

Testing the three quadratic terms as a set is the same as testing the difference in R2 between the linear and quadratic equations.

Page 31: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

31

Example: Confirmatory Test of Algebraic Difference for Autonomy

Quadratic equation:

Dep Var: SAT   N: 360   Multiple R: 0.411   Squared multiple R: 0.169
Adjusted squared multiple R: 0.157   Standard error of estimate: 1.055

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.825      0.083     0.000          .  70.161      0.000
AUTCA            0.197      0.100     0.182      0.273   1.966      0.050
AUTCD           -0.293      0.106    -0.238      0.315  -2.754      0.006
AUTCA2          -0.056      0.047    -0.086      0.444  -1.177      0.240
AUTCAD           0.276      0.080     0.396      0.178   3.453      0.001
AUTCD2          -0.035      0.063    -0.054      0.242  -0.553      0.581

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          79.951    5       15.990   14.362  0.000
Residual           394.135  354        1.113

Page 32: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

32

Example: Confirmatory Test of Algebraic Difference for Autonomy

The test of the higher-order terms associated with the algebraic difference for autonomy is:

F = [(.169 – .127) / (357 – 354)] / [(1 – .169) / 354] = 5.96, p < .05

The higher-order terms are significant, so the fourth condition is not satisfied.

Page 33: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

33

Confirmatory Approach Applied to the Absolute Difference

The unconstrained equation is:

Z = b0 + b1X + b2Y + b3W + b4WX + b5WY + e

The constrained equation used to evaluate the third condition is:

Z = b0 + b1|X – Y| + e

The equation that adds higher-order terms used to evaluate the fourth condition is:

Z = b0 + b1X + b2Y + b3W + b4WX + b5WY + b6X2 + b7XY + b8Y2 + b9WX2 + b10WXY + b11WY2 + e

Page 34: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

34

Example: Confirmatory Test of Absolute Difference for Autonomy

Unconstrained equation:

Dep Var: SAT   N: 360   Multiple R: 0.399   Squared multiple R: 0.159
Adjusted squared multiple R: 0.147   Standard error of estimate: 1.061

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         6.233      0.152     0.000          .  41.136      0.000
AUTCA           -0.150      0.184    -0.139      0.082  -0.818      0.414
AUTCD            0.183      0.188     0.148      0.102   0.970      0.333
AUTW            -0.349      0.201    -0.148      0.329  -1.737      0.083
AUTCAW           0.752      0.209     0.490      0.129   3.605      0.000
AUTCDW          -0.554      0.219    -0.406      0.093  -2.537      0.012

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          75.381    5       15.076   13.386  0.000
Residual           398.705  354        1.126

Page 35: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

35

Example: Confirmatory Test of Absolute Difference for Autonomy

Unconstrained surface:

Page 36: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

36

Example: Confirmatory Test of Absolute Difference for Autonomy

The first condition is met, because the R2 from the unconstrained equation is significant.

The second condition is not met, because the coefficients on X and Y are not both significant and in the expected direction.

For the third condition, testing the constraints imposed by the absolute difference is the same as testing the difference in R2 between the constrained and unconstrained equations.

Page 37: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

37

Example: Confirmatory Test of Absolute Difference for Autonomy

Constrained equation:

Dep Var: SAT   N: 360   Multiple R: 0.323   Squared multiple R: 0.105
Adjusted squared multiple R: 0.102   Standard error of estimate: 1.089

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         6.212      0.087     0.000          .  71.122      0.000
AUTABD          -0.531      0.082    -0.323      1.000  -6.464      0.000

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          49.555    1       49.555   41.788  0.000
Residual           424.532  358        1.186

Page 38: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

38

Example: Confirmatory Test of Absolute Difference for Autonomy

Constrained surface:

Page 39: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

39

Example: Confirmatory Test of Absolute Difference for Autonomy

The test of the constraints imposed by the absolute difference for autonomy is:

F = [(.159 – .105) / (358 – 354)] / [(1 – .159) / 354] = 5.68, p < .05

The constraints are rejected, so the third condition is not satisfied.

Page 40: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

40

Example: Confirmatory Test of Absolute Difference for Autonomy

For the fourth condition, the unconstrained equation for the absolute difference is piecewise linear, so the higher-order terms are the six quadratic terms X2, XY, Y2, WX2, WXY, and WY2.

Testing the six quadratic terms as a set is the same as testing the difference in R2 between the piecewise linear and piecewise quadratic equations.

Page 41: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

41

Example: Confirmatory Test of Absolute Difference for Autonomy

Piecewise quadratic equation:

Dep Var: SAT   N: 360   Multiple R: 0.431   Squared multiple R: 0.185
Adjusted squared multiple R: 0.160   Standard error of estimate: 1.053

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         6.193      0.206     0.000          .  30.124      0.000
AUTCA           -0.438      0.548    -0.407      0.009  -0.799      0.425
AUTCD            0.256      0.505     0.207      0.014   0.506      0.613
AUTW            -0.534      0.276    -0.225      0.172  -1.931      0.054
AUTCAW           0.672      0.608     0.438      0.015   1.105      0.270
AUTCDW          -0.373      0.592    -0.273      0.013  -0.631      0.529
AUTCA2           0.146      0.312     0.225      0.010   0.468      0.640
AUTCAD          -0.092      0.618    -0.133      0.003  -0.150      0.881
AUTCD2           0.107      0.350     0.169      0.008   0.307      0.759
AUTCA2W         -0.088      0.325    -0.082      0.026  -0.272      0.786
AUTCADW          0.325      0.641     0.368      0.004   0.507      0.613
AUTCD2W         -0.219      0.371    -0.342      0.007  -0.589      0.556

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          87.940   11        7.995    7.205  0.000
Residual           386.146  348        1.110

Page 42: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

42

Example: Confirmatory Test of Absolute Difference for Autonomy

The test of the higher-order terms associated with the absolute difference for autonomy is:

F = [(.185 – .159) / (354 – 348)] / [(1 – .185) / 348] = 1.85, p > .05

The higher-order terms are not significant, so the fourth condition is satisfied.

Page 43: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

43

Confirmatory Approach Applied to the Squared Difference

The unconstrained equation is:

Z = b0 + b1X + b2Y + b3X2 + b4XY + b5Y2 + e

The constrained equation used to evaluate the third condition is:

Z = b0 + b1(X – Y)2 + e

The equation that adds higher-order terms used to evaluate the fourth condition is:

Z = b0 + b1X + b2Y + b3X2 + b4XY + b5Y2 +

b6X3 + b7X2Y + b8XY2 + b9Y3 + e

Page 44: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

44

Example: Confirmatory Test of Squared Difference for Autonomy

Unconstrained equation:

Dep Var: SAT   N: 360   Multiple R: 0.411   Squared multiple R: 0.169
Adjusted squared multiple R: 0.157   Standard error of estimate: 1.055

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.825      0.083     0.000          .  70.161      0.000
AUTCA            0.197      0.100     0.182      0.273   1.966      0.050
AUTCD           -0.293      0.106    -0.238      0.315  -2.754      0.006
AUTCA2          -0.056      0.047    -0.086      0.444  -1.177      0.240
AUTCAD           0.276      0.080     0.396      0.178   3.453      0.001
AUTCD2          -0.035      0.063    -0.054      0.242  -0.553      0.581

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          79.951    5       15.990   14.362  0.000
Residual           394.135  354        1.113

Page 45: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

45

Example: Confirmatory Test of Squared Difference for Autonomy

Unconstrained surface:

Page 46: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

46

Example: Confirmatory Test of Squared Difference for Autonomy

The first condition is met, because the R2 from the unconstrained equation is significant.

The second condition is not met, because the coefficients on X and Y are significant, and the coefficients on X2 and Y2 are not significant.

For the third condition, testing the constraints imposed by the squared difference is the same as testing the difference in R2 between the constrained and unconstrained equations.

Page 47: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

47

Example: Confirmatory Test of Squared Difference for Autonomy

Constrained equation:

Dep Var: SAT   N: 360   Multiple R: 0.310   Squared multiple R: 0.096
Adjusted squared multiple R: 0.093   Standard error of estimate: 1.094

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.993      0.067     0.000          .  89.830      0.000
AUTSQD          -0.183      0.030    -0.310      1.000  -6.162      0.000

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          45.463    1       45.463   37.972  0.000
Residual           428.623  358        1.197

Page 48: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

48

Example: Confirmatory Test of Squared Difference for Autonomy

Constrained surface:

Page 49: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

49

Example: Confirmatory Test of Squared Difference for Autonomy

The test of the constraint imposed by the squared difference for autonomy is:

F = [(.169 – .096) / (358 – 354)] / [(1 – .169) / 354] = 7.77, p < .05

The constraint is rejected, so the third condition is not satisfied.

Page 50: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

50

Example: Confirmatory Test of Squared Difference for Autonomy

For the fourth condition, the unconstrained equation for the squared difference is quadratic, so the higher-order terms are the four cubic terms X3, X2Y, XY2, and Y3.

Testing the four cubic terms as a set is the same as testing the difference in R2 between the quadratic and cubic equations.

Page 51: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

51

Example: Confirmatory Test of Squared Difference for Autonomy

Cubic equation:

Dep Var: SAT   N: 360   Multiple R: 0.436   Squared multiple R: 0.190
Adjusted squared multiple R: 0.170   Standard error of estimate: 1.047

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.757      0.109     0.000          .  52.736      0.000
AUTCA            0.364      0.119     0.337      0.190   3.055      0.002
AUTCD           -0.312      0.120    -0.253      0.245  -2.609      0.009
AUTCA2           0.043      0.095     0.066      0.109   0.456      0.649
AUTCAD           0.356      0.175     0.511      0.037   2.033      0.043
AUTCD2          -0.075      0.126    -0.117      0.060  -0.594      0.553
AUTCA3          -0.104      0.037    -0.442      0.094  -2.817      0.005
AUTCA2D          0.052      0.066     0.167      0.052   0.794      0.428
AUTCAD2         -0.030      0.089    -0.098      0.028  -0.338      0.736
AUTCD3           0.003      0.053     0.011      0.046   0.047      0.962

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          90.233    9       10.026    9.142  0.000
Residual           383.853  350        1.097

Page 52: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

52

Example: Confirmatory Test of Squared Difference for Autonomy

The test of the higher-order terms associated with the squared difference for autonomy is:

F = [(.190 – .169) / (354 – 350)] / [(1 – .190) / 350] = 2.27, p > .05

The higher-order terms are not significant, so the fourth condition is satisfied.

Page 53: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

53

Analyzing Quadratic Regression Equations Using Response Surface Methodology

Response surface methodology can be used to analyze features of surfaces corresponding to quadratic regression equations. These analyses are useful for three reasons:

Constraints imposed by difference scores are usually rejected, which makes it necessary to interpret unconstrained equations.
Many conceptually meaningful hypotheses cannot be expressed using difference scores.
Response surfaces can themselves serve as the basis for developing and testing hypotheses.

Page 54: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

54

Key Features of Response Surfaces: Stationary Point

The stationary point is the point at which the slope of the surface relating X and Y to Z is zero in all directions.

For convex (i.e., bowl-shaped) surfaces, the stationary point is the overall minimum of the surface with respect to the Z axis.

For concave (i.e., dome-shaped) surfaces, the stationary point is the overall maximum of the surface with respect to the Z axis.

For saddle-shaped surfaces, the stationary point is where the surface is flat with respect to the Z axis.

Page 55: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

55

Key Features of Response Surfaces: Stationary Point

The coordinates of the stationary point can be computed using the following formulas:

X0 = (b2b4 – 2b1b5) / (4b3b5 – b4²)

Y0 = (b1b4 – 2b2b3) / (4b3b5 – b4²)

X0 and Y0 are the coordinates of the stationary point in the X,Y plane.

Page 56: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

56

Example: Stationary Point for Autonomy

Applying these formulas to the equation for autonomy yields:

X0 = [(–0.293)(0.276) – 2(0.197)(–0.035)] / [4(–0.056)(–0.035) – 0.276²] = 0.982

Y0 = [(0.197)(0.276) – 2(–0.293)(–0.056)] / [4(–0.056)(–0.035) – 0.276²] = –0.315
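A small helper function (Python; an illustrative sketch, not from the slides) applies these formulas to any set of quadratic coefficients and reproduces the autonomy values.

def stationary_point(b1, b2, b3, b4, b5):
    """Coordinates of the stationary point of Z = b0 + b1X + b2Y + b3X^2 + b4XY + b5Y^2."""
    den = 4 * b3 * b5 - b4 ** 2
    return (b2 * b4 - 2 * b1 * b5) / den, (b1 * b4 - 2 * b2 * b3) / den

# Autonomy coefficients from the quadratic equation above:
print(stationary_point(0.197, -0.293, -0.056, 0.276, -0.035))   # approximately (0.982, -0.315)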

Page 57: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

57

Example: Stationary Point for Autonomy

Stationary Point

Page 58: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

58

Key Features of Response Surfaces: Principal Axes

The principal axes describe the orientation of the surface with respect to the X,Y plane. The axes are perpendicular and intersect at the stationary point.

For convex surfaces, the upward curvature is greatest along the first principal axis and least along the second principal axis.

For concave surfaces, the downward curvature is greatest along the second principal axis and least along the first principal axis.

For saddle-shaped surfaces, upward curvature is greatest along the first principal axis, and the downward curvature is greatest along the second principal axis.

Page 59: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

59

Key Features of Response Surfaces: First Principal Axis

An equation for the first principal axis is:

Y = p10 + p11X

The formula for the slope of the first principal axis (i.e., p11) is:

p11 = [b5 – b3 + sqrt((b3 – b5)² + b4²)] / b4

Using X0, Y0, and p11, the intercept of the first principal axis (i.e., p10) can be calculated as follows:

p10 = Y0 – p11X0

Page 60: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

60

Example: First Principal Axis for Autonomy

Applying these formulas to the equation for autonomy yields:

p11 = {–0.035 – (–0.056) + sqrt[(–0.056 – (–0.035))² + 0.276²]} / 0.276 = 1.079

p10 = –0.315 – (1.079)(0.982) = –1.375

Page 61: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

61

Example: First Principal Axis for Autonomy

First Principal Axis

Page 62: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

62

Key Features of Response Surfaces: Second Principal Axis

An equation for the second principal axis is:

Y = p20 + p21X

The formula for the slope of the second principal axis (i.e., p21) is:

p21 = [b5 – b3 – sqrt((b3 – b5)² + b4²)] / b4

X0, Y0, and p21 can be used to obtain the intercept of the second principal axis (i.e., p20) as follows:

p20 = Y0 – p21X0

Page 63: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

63

Example: Second Principal Axis for Autonomy

Applying these formulas to the equation for autonomy yields:

p21 = {–0.035 – (–0.056) – sqrt[(–0.056 – (–0.035))² + 0.276²]} / 0.276 = –0.927

p20 = –0.315 – (–0.927)(0.982) = 0.594
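The principal-axis formulas can be packaged the same way; the sketch below (Python; illustrative only) returns p10, p11, p20, and p21 and reproduces the autonomy values.

from math import sqrt

def principal_axes(b1, b2, b3, b4, b5):
    """Intercepts and slopes (p10, p11, p20, p21) of the principal axes of a quadratic surface."""
    den = 4 * b3 * b5 - b4 ** 2
    x0 = (b2 * b4 - 2 * b1 * b5) / den
    y0 = (b1 * b4 - 2 * b2 * b3) / den
    root = sqrt((b3 - b5) ** 2 + b4 ** 2)
    p11 = (b5 - b3 + root) / b4    # slope of the first principal axis
    p21 = (b5 - b3 - root) / b4    # slope of the second principal axis
    return y0 - p11 * x0, p11, y0 - p21 * x0, p21

# Autonomy: approximately (-1.375, 1.079, 0.594, -0.927)
print(principal_axes(0.197, -0.293, -0.056, 0.276, -0.035))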

Page 64: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

64

Example: Second Principal Axis for Autonomy

Second Principal Axis

Page 65: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

65

Key Features of Response Surfaces: Shape Along the Y = X Line

The shape of the surface along a line in the X,Y plane can be estimated by substituting the expression for the line into the quadratic regression equation.

To estimate the slope along the Y = X line, X is substituted for Y in the quadratic regression equation, which yields:

Z = b0 + b1X + b2X + b3X2 + b4X2 + b5X2 + e
  = b0 + (b1 + b2)X + (b3 + b4 + b5)X2 + e

The term (b3 + b4 + b5) represents the curvature of the surface along the Y = X line, and (b1 + b2) is the slope of the surface at the point X = 0.

Page 66: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

66

Example: Shape Along Y = X Line for Autonomy

For autonomy, the shape of the surface along the Y = X line is:

Z = 5.825 + [0.197 + (–0.293)]X + [–0.056 + 0.276 + (–0.035)]X2 + e

Simplifying this expression yields:

Z = 5.825 – 0.096X + 0.185X2 + e

The surface is curved upward along the Y = X line and is negatively sloped at the point X = 0 (the curvature is significant at p < .05).

Page 67: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

67

Example: Shape Along Y = X Line for Autonomy

Contours Show Shape Along Y = X Line

Page 68: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

68

Key Features of Response Surfaces: Shape Along the Y = –X Line

To estimate the slope along the Y = –X line, –X is substituted for Y in the quadratic regression equation, which yields:

Z = b0 + b1X – b2X + b3X2 – b4X2 + b5X2 + e
  = b0 + (b1 – b2)X + (b3 – b4 + b5)X2 + e

The term (b3 – b4 + b5) represents the curvature of the surface along the Y = –X line, and (b1 – b2) is the slope of the surface at the point X = 0.

Page 69: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

69

Example: Shape Along Y = –X Line for Autonomy

For autonomy, the shape of the surface along the Y = –X line is:

Z = 5.825 + [0.197 – (–0.293)]X + [–0.056 – 0.276 + (–0.035)]X2 + e

Simplifying this expression yields:

Z = 5.825 + 0.490X – 0.367X2 + e

The surface is curved downward along the Y = –X line and is positively sloped at the point X = 0 (both are significant at p < .05).
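Both sets of composite terms are simple sums of the estimated coefficients. The following lines (Python; coefficients taken from the autonomy equation above) reproduce them.

# Composite slope (at X = 0) and curvature along the Y = X and Y = -X lines.
b1, b2, b3, b4, b5 = 0.197, -0.293, -0.056, 0.276, -0.035
print(b1 + b2, b3 + b4 + b5)   # Y = X line:  -0.096, 0.185
print(b1 - b2, b3 - b4 + b5)   # Y = -X line:  0.490, -0.367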

Page 70: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

70

Contours Show Shape Along Y = –X Line

Example: Shape Along Y = –X Line for Autonomy

Page 71: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

71

Key Features of Response Surfaces: Shape Along First Principal Axis

To estimate the slope along the first principal axis, p10 + p11X is substituted for Y:

Z = b0 + b1X + b2(p10 + p11X) + b3X² + b4X(p10 + p11X) + b5(p10 + p11X)² + e

Collecting terms yields:

Z = (b0 + b2p10 + b5p10²) + (b1 + b2p11 + b4p10 + 2b5p10p11)X + (b3 + b4p11 + b5p11²)X² + e

The composite terms preceding X2 and X are the curvature of the surface along the first principal axis and the slope of the surface at the point X = 0.

Page 72: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

72

Example: Shape Along First Principal Axis for Autonomy

For autonomy, the shape of the surface along the first principal axis is:

Z = 5.825 + (–0.293)(–1.375) + (–0.035)(–1.375)² + [0.197 + (–0.293)(1.079) + (0.276)(–1.375) + 2(–0.035)(–1.375)(1.079)]X + [–0.056 + (0.276)(1.079) + (–0.035)(1.079)²]X² + e

  = 6.162 – 0.395X + 0.201X² + e

Page 73: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

73

Example: Shape Along First Principal Axis for Autonomy

Contours Show Shape Along First Principal Axis

Page 74: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

74

Key Features of Response Surfaces: Shape Along Second Principal Axis

To estimate the slope along the second principal axis, p20 + p21X is substituted for Y:

Z = b0 + b1X + b2(p20 + p21X) + b3X² + b4X(p20 + p21X) + b5(p20 + p21X)² + e

Collecting terms yields:

Z = (b0 + b2p20 + b5p20²) + (b1 + b2p21 + b4p20 + 2b5p20p21)X + (b3 + b4p21 + b5p21²)X² + e

The composite terms preceding X2 and X are the curvature of the surface along the second principal axis and the slope of the surface at the point X = 0.

Page 75: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

75

Example: Shape Along Second Principal Axis for Autonomy

For autonomy, the shape of the surface along the second principal axis is:

Z = 5.825 + (–0.293)(0.594) + (–0.035)(0.594)² + [0.197 + (–0.293)(–0.927) + (0.276)(0.594) + 2(–0.035)(0.594)(–0.927)]X + [–0.056 + (0.276)(–0.927) + (–0.035)(–0.927)²]X² + e

  = 5.639 + 0.671X – 0.342X² + e
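The same substitution works for any line Y = p0 + p1X. The sketch below (Python; illustrative) returns the intercept, the slope at X = 0, and the curvature along a given line, and reproduces the values for both principal axes.

def shape_along_line(b0, b1, b2, b3, b4, b5, p0, p1):
    """Intercept, slope at X = 0, and curvature of Z along the line Y = p0 + p1*X."""
    intercept = b0 + b2 * p0 + b5 * p0 ** 2
    slope = b1 + b2 * p1 + b4 * p0 + 2 * b5 * p0 * p1
    curvature = b3 + b4 * p1 + b5 * p1 ** 2
    return intercept, slope, curvature

coefs = (5.825, 0.197, -0.293, -0.056, 0.276, -0.035)
print(shape_along_line(*coefs, -1.375, 1.079))   # first principal axis:  about (6.162, -0.395, 0.201)
print(shape_along_line(*coefs, 0.594, -0.927))   # second principal axis: about (5.639, 0.671, -0.342)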

The surface is curved downward along the second principal axis and is positively sloped at the point X = 0 (both are significant at p < .05).

Page 76: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

76

Example: Shape Along Second Principal Axis for Autonomy

Contours Show Shape Along Second Principal Axis

Page 77: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

77

Key Features of Response Surfaces: Tests of Significance

The formulas for shapes along predetermined lines such as Y = X and Y = –X can be tested using procedures for testing weighted linear combinations of regression coefficients.

For example, a t-test for b1 + b2 is obtained by dividing b1 + b2 by its standard error, or the square root of the variance of b1 + b2:

S(b1 + b2) = [V(b1) + V(b2) + 2C(b1, b2)]½

The variances of b1 and b2 are the squares of their standard errors, and the covariance of b1 and b2 is their correlation times their standard errors.
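A hedged sketch of this computation in Python with statsmodels follows; the data are simulated and the variable names hypothetical, but the weight vector and covariance arithmetic are as described above, and statsmodels' t_test method performs the same test directly.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with hypothetical names; the point is the test of b1 + b2.
rng = np.random.default_rng(2)
n = 300
d = pd.DataFrame({"X": rng.normal(size=n), "Y": rng.normal(size=n)})
d["Z"] = 0.2*d.X - 0.3*d.Y + 0.3*d.X*d.Y + rng.normal(size=n)
fit = smf.ols("Z ~ X + Y + I(X**2) + I(X*Y) + I(Y**2)", data=d).fit()

b = np.asarray(fit.params)               # order: const, X, Y, X^2, XY, Y^2
V = np.asarray(fit.cov_params())
w = np.array([0, 1, 1, 0, 0, 0], float)  # weights selecting b1 + b2
est = w @ b
se = np.sqrt(w @ V @ w)                  # sqrt of V(b1) + V(b2) + 2C(b1, b2)
print(est, est / se)                     # estimate and its t statistic
print(fit.t_test(w))                     # statsmodels performs the same test directly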

Page 78: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

78

Key Features of Response Surfaces: Tests of Significance

Weighted linear combinations of regression coefficients can also be tested using routines available in many statistical packages.

Another approach is to test the reduction in R2 produced by the constraint represented by the weighted linear combination of coefficients.

For instance, to jointly test (b1 + b2) and (b3 + b4 + b5), we set both quantities equal to zero and impose the resulting constraints.

Page 79: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

79

Key Features of Response Surfaces: Tests of Significance

The expression b1 + b2 = 0 implies b2 = –b1. Likewise, the expression b3 + b4 + b5 = 0 implies b5 = –b3 – b4. Imposing these constraints on the quadratic regression equation yields:

Z = b0 + b1X – b1Y + b3X2 + b4XY + (–b3 – b4)Y2 + e

The expression simplifies to:

Z = b0 + b1(X – Y) + b3(X2 – Y2) + b4(XY – Y2) + e

The reduction in R2 from this equation relative to the R2 from the quadratic equation is a joint test of b1 + b2 = 0 and b3 + b4 + b5 = 0.

Page 80: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

80

Key Features of Response Surfaces: Tests of Significance

X0, Y0, p10, p11, p20, p21, and slopes along the principal axes are nonlinear combinations of regression coefficients. For these quantities, significance tests can be conducted using the bootstrap, as follows (a sketch appears below):

A large number (e.g., 10,000) of samples of size N are randomly drawn with replacement.
Each sample is used to estimate the quadratic regression equation.
The coefficients from each sample are used to compute X0, Y0, p10, p11, p20, and p21.
The distributions of X0, Y0, p10, p11, p20, and p21 are used to construct confidence intervals.
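A minimal bootstrap sketch in Python follows; it assumes a data frame with columns Z, X, and Y (hypothetical names) and uses statsmodels rather than SYSTAT, but it follows the four steps listed above.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

QUAD = "Z ~ X + Y + I(X**2) + I(X*Y) + I(Y**2)"

def surface_features(b1, b2, b3, b4, b5):
    """X0, Y0, p10, p11, p20, p21 from the quadratic coefficients."""
    den = 4*b3*b5 - b4**2
    x0 = (b2*b4 - 2*b1*b5) / den
    y0 = (b1*b4 - 2*b2*b3) / den
    root = np.sqrt((b3 - b5)**2 + b4**2)
    p11, p21 = (b5 - b3 + root) / b4, (b5 - b3 - root) / b4
    return x0, y0, y0 - p11*x0, p11, y0 - p21*x0, p21

def bootstrap_ci(data, n_boot=10_000, seed=0):
    """Percentile bootstrap confidence intervals for the six surface features."""
    rng = np.random.default_rng(seed)
    feats = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(data), len(data))   # draw N cases with replacement
        b = np.asarray(smf.ols(QUAD, data=data.iloc[idx]).fit().params)
        feats.append(surface_features(*b[1:6]))
    return np.percentile(np.asarray(feats), [2.5, 97.5], axis=0)

# Usage with simulated data standing in for the real measures:
rng = np.random.default_rng(1)
df = pd.DataFrame({"X": rng.normal(size=200), "Y": rng.normal(size=200)})
df["Z"] = df.X - df.Y + 0.3*df.X*df.Y + rng.normal(size=200)
print(bootstrap_ci(df, n_boot=1000))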

Page 81: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

81

Example: Testing Response Surface Features for Autonomy

A joint test of (b1 + b2) and (b3 + b4 + b5), which represent the slope at the point X = 0 and the curvature along the Y = X line, is yielded by the following commands:

MGLH
MOD SAT = CONSTANT + AUTCA + AUTCD + AUTCA2 + AUTCAD + AUTCD2
EST
HYP
AMA [0 1 1 0 0 0;,
     0 0 0 1 1 1]
TEST

Page 82: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

82

For autonomy, this test yields the following result:

Hypothesis.
A Matrix
        1      2      3      4      5      6
 1    0.0  1.000  1.000    0.0    0.0    0.0
 2    0.0    0.0    0.0  1.000  1.000  1.000

Test of Hypothesis
Source            SS   df      MS       F      P
Hypothesis    16.878    2   8.439   7.580  0.001
Error        394.135  354   1.113

Example: Testing Response Surface Features for Autonomy

Page 83: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

83

Example: Testing Response Surface Features for Autonomy

Separate tests of (b1 + b2) and (b3 + b4 + b5) are yielded by the following commands:

MGLH
MOD SAT = CONSTANT + AUTCA + AUTCD + AUTCA2 + AUTCAD + AUTCD2
EST
HYP
AMA [0 1 1 0 0 0]
TEST
HYP
AMA [0 0 0 1 1 1]
TEST

Page 84: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

84

For autonomy, the results are:

A Matrix
        1      2      3      4      5      6
 1    0.0  1.000  1.000    0.0    0.0    0.0

Test of Hypothesis
Source            SS   df      MS       F      P
Hypothesis     1.068    1   1.068   0.959  0.328
Error        394.135  354   1.113

A Matrix
        1      2      3      4      5      6
 1    0.0    0.0    0.0  1.000  1.000  1.000

Test of Hypothesis
Source            SS   df      MS       F      P
Hypothesis    11.740    1  11.740  10.545  0.001
Error        394.135  354   1.113

Example: Testing Response Surface Features for Autonomy

Page 85: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

85

Example: Testing Response Surface Features for Autonomy

Likewise, a joint test of (b1 – b2) and (b3 – b4 + b5), which represent the slope at the point X = 0 and the curvature along the Y = –X line, is yielded by the following commands:

MGLH
MOD SAT = CONSTANT + AUTCA + AUTCD + AUTCA2 + AUTCAD + AUTCD2
EST
HYP
AMA [0 1 -1 0 0 0;,
     0 0 0 1 -1 1]
TEST

Page 86: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

86

For autonomy, this test yields the following result:

Hypothesis.
A Matrix
        1      2       3      4       5      6
 1    0.0  1.000  -1.000    0.0     0.0    0.0
 2    0.0    0.0     0.0  1.000  -1.000  1.000

Test of Hypothesis
Source            SS   df      MS       F      P
Hypothesis    39.512    2  19.756  17.744  0.000
Error        394.135  354   1.113

Example: Testing Response Surface Features for Autonomy

Page 87: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

87

Example: Testing Response Surface Features for Autonomy

Separate tests of (b1 – b2) and (b3 – b4 + b5) are yielded by the following commands:

MGLH
MOD SAT = CONSTANT + AUTCA + AUTCD + AUTCA2 + AUTCAD + AUTCD2
EST
HYP
AMA [0 1 -1 0 0 0]
TEST
HYP
AMA [0 0 0 1 -1 1]
TEST

Page 88: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

88

For autonomy, the results are:

A Matrix
        1      2       3      4       5      6
 1    0.0  1.000  -1.000    0.0     0.0    0.0

Test of Hypothesis
Source            SS   df      MS      F      P
Hypothesis     8.105    1   8.105  7.279  0.007
Error        394.135  354   1.113

A Matrix
        1      2      3      4       5      6
 1    0.0    0.0    0.0  1.000  -1.000  1.000

Test of Hypothesis
Source            SS   df      MS      F      P
Hypothesis     6.588    1   6.588  5.917  0.015
Error        394.135  354   1.113

Example: Testing Response Surface Features for Autonomy

Page 89: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

89

In SYSTAT, the bootstrap is implemented with the following commands:

MGLH
MOD SAT = CONSTANT + AUTCA + AUTCD + AUTCA2 + AUTCAD + AUTCD2
SAVE AUTBOOT.SYD / COEF
EST / SAMPLE=BOOT(10000)

These commands will produce a large output file with the results of all 10,000 regressions and a system file containing 10,000 sets of coefficients.

The coefficients are used to construct confidence intervals (Mooney & Duval, 1993; Stine, 1989).

Example: Testing Response Surface Features for Autonomy

Page 90: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

90

For autonomy, the 95% confidence intervals for X0, Y0, p10, p11, p20, and p21 are:

         Value      CIL      CIU
X0       0.982    0.199    5.142
Y0      –0.315   –3.480    0.239
p10     –1.375  –11.423   –0.359
p11      1.079    0.688    2.123
p20      0.594   –1.167    1.120
p21     –0.927   –1.449   –0.466

Example: Testing Response Surface Features for Autonomy

Page 91: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

91

The surface was saddle-shaped.

The slope of the first principal axis did not differ from 1, and the intercept of the first principal axis was negative, meaning that the axis ran parallel to the Y = X line but was shifted to the right.

The slope and intercept of the second principal axis did not differ from –1 and 0, respectively. Thus, the axis did not differ from the Y = –X line.

The location of the first principal axis combined with the slope along the second principal axis indicate that satisfaction increased as actual autonomy increased toward desired autonomy, continued to increase as actual autonomy exceeded desired autonomy, and began to decrease when actual autonomy exceeded desired autonomy by about one unit.

Within the range of the data, satisfaction increased at an increasing rate as actual and desired autonomy both increased along the first principal axis.

Interpretation of Results for Autonomy

Page 92: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

92

Moderated Polynomial Regression

In some cases, the effect represented by a quadratic regression equation is believed to be moderated by another variable.

Incorporating the moderator variable V into a quadratic regression equation yields:

Z = b0 + b1X + b2Y + b3X2 + b4XY + b5Y2 + b6V + b7XV + b8YV + b9X2V + b10XYV + b11Y2V + e

Moderation is tested by assessing the increment in R2 yielded by the terms XV, YV, X2V, XYV, and Y2V.
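A sketch of this test in Python with statsmodels is given below; the data are simulated and the names hypothetical, but the model comparison mirrors the increment-in-R² test just described.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with hypothetical names; V is the moderator.
rng = np.random.default_rng(3)
n = 350
d = pd.DataFrame({"X": rng.normal(size=n), "Y": rng.normal(size=n), "V": rng.normal(size=n)})
d["Z"] = 0.3*d.X - 0.3*d.Y + 0.2*d.X*d.Y + rng.normal(size=n)

quad = "X + Y + I(X**2) + I(X*Y) + I(Y**2)"
base = smf.ols(f"Z ~ {quad} + V", data=d).fit()
moderated = smf.ols(f"Z ~ ({quad}) * V", data=d).fit()   # adds XV, YV, X^2V, XYV, Y^2V

# Moderation: increment in R^2 from the five product-with-V terms.
print(moderated.compare_f_test(base))   # (F, p, df difference)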

Page 93: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

93

Moderated Polynomial Regression

The moderated quadratic regression equation can be rewritten to show simple surfaces at selected levels of the moderator variable, as follows:

Z = (b0 + b6V) + (b1 + b7V)X + (b2 + b8V)Y + (b3 + b9V)X2 + (b4 + b10V)XY + (b5 + b11V)Y2 + e

The compound coefficients on the terms X, Y, X2, XY, and Y2 can be tested using procedures for testing weighted linear combinations of regression coefficients.

Page 94: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

94

Example: Moderated Polynomial Regression for Autonomy

Quadratic equation with importance as a moderator:

Dep Var: SAT   N: 357   Multiple R: 0.431   Squared multiple R: 0.186
Adjusted squared multiple R: 0.160   Standard error of estimate: 1.057

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.514      0.481     0.000          .  11.455      0.000
AUTCA            0.409      0.487     0.379      0.012   0.841      0.401
AUTCD           -0.740      0.518    -0.595      0.014  -1.429      0.154
AUTCA2           0.181      0.292     0.278      0.012   0.620      0.536
AUTCAD           0.595      0.489     0.855      0.005   1.218      0.224
AUTCD2          -0.225      0.306    -0.353      0.010  -0.736      0.462
AUTI             0.062      0.101     0.051      0.343   0.614      0.540
AUTCAI          -0.050      0.103    -0.242      0.009  -0.479      0.632
AUTCDI           0.103      0.115     0.454      0.009   0.890      0.374
AUTCA2I         -0.046      0.054    -0.408      0.011  -0.862      0.389
AUTCADI         -0.047      0.088    -0.392      0.004  -0.533      0.594
AUTCD2I          0.021      0.059     0.200      0.008   0.360      0.719

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          88.023   11        8.002    7.158  0.000
Residual           385.670  345        1.118

Page 95: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

95

Example: Moderated Polynomial Regression for Autonomy

The test of the increment in R2 yielded by the five moderator terms is:

F = [(.186 – .169) / (350 – 345)] / [(1 – .186) / 345] = 1.44, p > .05

The increment in R2 is not significant, so moderation is not supported.

Page 96: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

96

Simple quadratic equations at low, medium, and high levels of importance:

            X        Y       X2      XY       Y2
Low       0.21   -0.33**   -0.00   0.41*   -0.14
Medium    0.16   -0.23     -0.05   0.36**  -0.12
High      0.11   -0.13     -0.09   0.32**  -0.10

Example: Moderated Polynomial Regression for Autonomy

Page 97: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

97

Example: Moderated Polynomial Regression for Autonomy

Simple surface for low importance:

Page 98: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

98

Example: Moderated Polynomial Regression for Autonomy

Simple surface for medium importance:

Page 99: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

99

Example: Moderated Polynomial Regression for Autonomy

Simple surface for high importance:

Page 100: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

100

Mediated Polynomial Regression

On occasion, the effect represented by a quadratic regression equation is believed to be mediated by (i.e., transmitted through) another variable.

Mediation can be analyzed using two regression equations, one that regresses the mediator on the five quadratic terms, and another that regresses the outcome on the five quadratic terms and the mediator:

M = a0 + a1X + a2Y + a3X2 + a4XY + a5Y2 + eM

Z = b0 + b1M + b2X + b3Y + b4X2 + b5XY + b6Y2 + eZ

Page 101: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

101

Mediated Polynomial Regression

The mediated effect represented by these two equations can be derived by substituting the equation for M into the equation for Z to obtain a reduced-form equation:

Z = b0 + b1(a0 + a1X + a2Y + a3X2 + a4XY + a5Y2 + eM) + b2X + b3Y + b4X2 + b5XY + b6Y2 + eZ

Distribution yields:

Z = b0 + a0b1 + a1b1X + a2b1Y + a3b1X2 + a4b1XY + a5b1Y2 + b1eM + b2X + b3Y + b4X2 + b5XY + b6Y2 + eZ

Page 102: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

102

Mediated Polynomial Regression

Collecting like terms yields:

Z = (b0 + a0b1) + (b2 + a1b1)X + (b3 + a2b1)Y + (b4 + a3b1)X2 + (b5 + a4b1)XY + (b6 + a5b1)Y2 + (eZ + b1eM)

The compound coefficients on X, Y, X2, XY, and Y2 capture the portion of the quadratic effect mediated by M as the products a1b1, a2b1, a3b1, a4b1, and a5b1.

The portion of the quadratic effect that bypasses M is captured by b2, b3, b4, b5, and b6.

These coefficients can be analyzed separately and jointly to examine the mediated quadratic effect.
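The two-equation setup can be sketched as follows (Python with statsmodels; simulated data and hypothetical names). The products a1b1 through a5b1 and the total effects are formed exactly as in the expressions above; in practice their significance would be assessed with the bootstrap.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with hypothetical names; M mediates the quadratic effect of X and Y on Z.
rng = np.random.default_rng(4)
n = 360
d = pd.DataFrame({"X": rng.normal(size=n), "Y": rng.normal(size=n)})
d["M"] = 0.3*d.X - 0.3*d.Y + 0.3*d.X*d.Y + rng.normal(size=n)
d["Z"] = 0.8*d.M + rng.normal(size=n)

quad = "X + Y + I(X**2) + I(X*Y) + I(Y**2)"
first_stage = smf.ols(f"M ~ {quad}", data=d).fit()        # a1 ... a5
second_stage = smf.ols(f"Z ~ M + {quad}", data=d).fit()   # b1 on M, direct effects b2 ... b6

a = np.asarray(first_stage.params)[1:]        # a1 ... a5
b1 = second_stage.params["M"]
direct = np.asarray(second_stage.params)[2:]  # b2 ... b6 on X, Y, X^2, XY, Y^2
print("indirect:", a * b1)                    # a1*b1 ... a5*b1 (test with the bootstrap)
print("total:   ", direct + a * b1)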

Page 103: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

103

Example: Mediated Polynomial Regression for Autonomy

Quadratic equation with intent to take the focal job as the outcome variable:

Dep Var: INT   N: 360   Multiple R: 0.276   Squared multiple R: 0.076
Adjusted squared multiple R: 0.063   Standard error of estimate: 1.174

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.851      0.092     0.000          .  63.319      0.000
AUTCA            0.161      0.111     0.142      0.273   1.449      0.148
AUTCD           -0.244      0.119    -0.187      0.315  -2.056      0.041
AUTCA2          -0.076      0.052    -0.110      0.444  -1.438      0.151
AUTCAD           0.197      0.089     0.267      0.178   2.211      0.028
AUTCD2           0.008      0.070     0.013      0.242   0.121      0.904

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          40.397    5        8.079    5.858  0.000
Residual           488.231  354        1.379

Page 104: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

104

Example: Mediated Polynomial Regression for Autonomy

Quadratic equation with satisfaction as the mediator variable:

Dep Var: SAT   N: 360   Multiple R: 0.411   Squared multiple R: 0.169
Adjusted squared multiple R: 0.157   Standard error of estimate: 1.055

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         5.825      0.083     0.000          .  70.161      0.000
AUTCA            0.197      0.100     0.182      0.273   1.966      0.050
AUTCD           -0.293      0.106    -0.238      0.315  -2.754      0.006
AUTCA2          -0.056      0.047    -0.086      0.444  -1.177      0.240
AUTCAD           0.276      0.080     0.396      0.178   3.453      0.001
AUTCD2          -0.035      0.063    -0.054      0.242  -0.553      0.581

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression          79.951    5       15.990   14.362  0.000
Residual           394.135  354        1.113

Page 105: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

105

Example: Mediated Polynomial Regression for Autonomy

Quadratic equation with intent to take the focal job as the outcome variable and satisfaction as the mediating variable:

Dep Var: INT   N: 360   Multiple R: 0.760   Squared multiple R: 0.578
Adjusted squared multiple R: 0.571   Standard error of estimate: 0.795

Effect     Coefficient  Std Error  Std Coef  Tolerance       t  P(2 Tail)
CONSTANT         1.074      0.242     0.000          .   4.445      0.000
SAT              0.820      0.040     0.777      0.831  20.480      0.000
AUTCA            0.000      0.076     0.000      0.270   0.001      0.999
AUTCD           -0.003      0.081    -0.002      0.308  -0.038      0.969
AUTCA2          -0.030      0.036    -0.044      0.443  -0.842      0.401
AUTCAD          -0.030      0.061    -0.040      0.173  -0.484      0.629
AUTCD2           0.037      0.047     0.055      0.242   0.780      0.436

Analysis of Variance
Source      Sum-of-Squares   df  Mean-Square  F-ratio      P
Regression         305.506    6       50.918   80.556  0.000
Residual           223.122  353        0.632

Page 106: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

106

Example: Mediated Polynomial Regression for Autonomy

The compound coefficients are:

b0 + a0b1 = 1.07 + 5.83 × 0.82 = 1.07 + 4.78 = 5.85
b2 + a1b1 = 0.00 + 0.20 × 0.82 = 0.00 + 0.16 = 0.16
b3 + a2b1 = –0.00 – 0.29 × 0.82 = –0.00 – 0.24 = –0.24
b4 + a3b1 = –0.03 – 0.06 × 0.82 = –0.03 – 0.05 = –0.08
b5 + a4b1 = –0.03 + 0.28 × 0.82 = –0.03 + 0.23 = 0.20
b6 + a5b1 = 0.04 – 0.04 × 0.82 = 0.04 – 0.03 = 0.01

The individual coefficients can be tested using the reported standard errors, and the products of coefficients can be tested using the bootstrap.

Page 107: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

107

Tests of individual and compound coefficients:

            Direct1   First    Second   Indirect   Total
            Effect    Stage    Stage    Effect     Effect
Intercept   1.07**    5.83**   0.82**   4.78**     5.85**
X           0.00      0.20*    0.82**   0.16*      0.16
Y          –0.00     –0.29**   0.82**  –0.24**    –0.24*
X2         –0.03     –0.06     0.82**  –0.05      –0.08
XY         –0.03      0.28**   0.82**   0.23**     0.20
Y2          0.04     –0.04     0.82**  –0.03       0.01

1The direct effect of the five quadratic terms was not significant.

Example: Mediated Polynomial Regression for Autonomy

Page 108: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

108

Example: Mediated Polynomial Regression for Autonomy

Surface for unmediated effect:

Page 109: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

109

Example: Mediated Polynomial Regression for Autonomy

Surface for direct effect:

Page 110: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

110

Example: Mediated Polynomial Regression for Autonomy

Surface for first stage of indirect effect:

Page 111: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

111

Example: Mediated Polynomial Regression for Autonomy

Surface for indirect effect:

Page 112: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

112

Example: Mediated Polynomial Regression for Autonomy

Surface for total effect:

Page 113: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

113

Many of the problems that occur when difference scores are used as independent variables also occur when they are used as dependent variables.

Alternative procedures for difference scores as dependent variables are fundamentally different from those for difference scores as independent variables.

We will briefly consider procedures when the dependent variable is an algebraic difference and both components are endogenous, meaning they are caused by the independent variables.

Difference Scores as Dependent Variables

Page 114: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

114

Difference Scores as Dependent Variables

An equation that uses an algebraic difference as a dependent variable is:

(Y1 – Y2) = b0 + b1X + e

Y1 and Y2 may be recast as separate dependent variables in a multivariate regression analysis:

Y1 = b10 + b11X + e1
Y2 = b20 + b21X + e2
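A brief sketch (Python with statsmodels; simulated data and hypothetical names) of fitting the component equations separately and comparing them with the difference-score equation:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with hypothetical names: X predicts both components Y1 and Y2.
rng = np.random.default_rng(5)
n = 300
d = pd.DataFrame({"X": rng.normal(size=n)})
d["Y1"] = 0.5*d.X + rng.normal(size=n)
d["Y2"] = -0.2*d.X + rng.normal(size=n)

fit1 = smf.ols("Y1 ~ X", data=d).fit()            # b11
fit2 = smf.ols("Y2 ~ X", data=d).fit()            # b21
diff = smf.ols("I(Y1 - Y2) ~ X", data=d).fit()    # b1 = b11 - b21

# b1 equals the difference between the component slopes but hides their separate magnitudes.
print(fit1.params["X"], fit2.params["X"], diff.params["X"])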

Page 115: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

115

The correspondence between these equations can be seen by subtracting the Y2 equation from the Y1 equation, which yields:

(Y1 – Y2) = (b10 – b20) + (b11 – b21)X + (e1 – e2)

This subtraction shows the following:

b0 = b10 – b20

b1 = b11 – b21

These expressions reveal a fundamental ambiguity, in that b0 and b1 indicate the differences between the intercepts and slopes, respectively, from the Y1 and Y2 equations, but they provide no information regarding the absolute magnitudes of these intercepts and slopes.

Difference Scores as Dependent Variables

Page 116: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

116

This ambiguity is illustrated by the following examples, all of which yield the same value for b1.

This pattern indicates that the effects of X on Y1 and Y2 are equal in magnitude but opposite in sign:

b11 = b1/2, b21 = –b1/2

Here, X is positively related to Y1 and unrelated to Y2:

b11 = b1, b21 = 0

Here, X is negatively related to Y2 and unrelated to Y1:

b11 = 0, b21 = –b1

These examples show that b1 is essentially useless for determining the effect of X on Y1 and Y2.

Difference Scores as Dependent Variables

Page 117: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

117

The alternative procedure uses Y1 and Y2 jointly as dependent variables in multivariate regression equations.

The multivariate equations reveal the separate effects of X on Y1 and Y2 and can be used to test whether these effects correspond to hypotheses implied when (Y1 – Y2) is used as a dependent variable.

The procedure provides multivariate tests of the effects of X on Y1 and Y2 and differences between these effects.

Multivariate piecewise regression equations can be used as an alternative when |Y1 – Y2| is used as a dependent variable.

Difference Scores as Dependent Variables

Page 118: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

118

Q: Which higher-order terms should I use? Are squared and product terms sufficient, or should I also use cubed terms, the products of squared and first-order terms, etc.?

A: The higher-order terms to be included in the equation depend entirely on one’s hypotheses regarding the joint relationships of X and Y with Z. In most cases, I have found that the three quadratic terms (i.e., X2, XY, and Y2) are sufficient to capture most theoretically meaningful effects. In exploratory analyses, I have found significant effects for cubic and quartic terms, but these rarely survive cross-validation and are often symptoms of a few outliers or influential cases in the data.

Answers to Frequently Asked Questions

Page 119: Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology

119

Q: How do I interpret the coefficients on X2, XY, and Y2? I understand what they each mean separately, but thinking about them all together is confusing.

A: The coefficients on X2, XY, and Y2 should be interpreted along with the coefficients on X and Y as a set, because these coefficients collectively describe the shape of the surface relating X and Y to Z. Trying to interpret any one of these coefficients in the absence of the others will often yield erroneous conclusions. Instead, surfaces indicated by quadratic regression equations should be treated as whole entities, and features of the surfaces can be tested using response surface methodology. A major motivation for applying response surface methodology was my frustration when trying to make sense of coefficients from quadratic equations. Response surface methodology makes the task much easier.


Q: Given that the coefficients on X and Y are scale dependent when X2, XY, and Y2 are in the equation, how can I meaningfully interpret these coefficients?

A: The coefficients on X and Y (i.e., b1 and b2) are indeed scale dependent. However, this simply reflects the fact that b1 and b2 indicate the slope of the surface where X and Y are zero (i.e., the origin of the X,Y plane). One could add or subtract arbitrary constants to X and Y and change the values of b1 and b2, but doing so may shift the origins of X and Y beyond the bounds of the data, where it doesn’t make sense to estimate b1 and b2 in the first place. A more reasonable strategy is to scale X and Y such that their origins represent a meaningful point in the distribution of the data in the X,Y plane, such as a point midway between their means or the midpoint of their common scale.
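A small sketch of this scaling strategy, assuming for illustration that X and Y share a 1-to-7 response scale (the scale and its midpoint are assumptions, not from the text):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
df = pd.DataFrame({"X": rng.integers(1, 8, size=300).astype(float),
                   "Y": rng.integers(1, 8, size=300).astype(float)})

midpoint = 4.0          # midpoint of the assumed common 1-7 scale
df["Xc"] = df["X"] - midpoint
df["Yc"] = df["Y"] - midpoint
# The quadratic equation is then estimated from the centered scores, e.g.:
#   smf.ols("Z ~ Xc + Yc + I(Xc**2) + I(Xc*Yc) + I(Yc**2)", data=df).fit()
```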


Q: How large should my sample be?

A: The sample should be large enough to provide the statistical power needed to test the constraints and combinations of regression coefficients required to test hypotheses. Power is important because showing support for constraints requires support for the null hypothesis (i.e., that the R2 values for the constrained and unconstrained equations do not differ). A related concern is that the sample should provide adequate dispersion of cases in the X,Y plane. For example, if cases are skewed in the direction of X > Y or X < Y, it will be very difficult to detect changes in the slope of the surface along the Y = –X line, which are usually of interest in congruence research. Keep in mind that skewness on either side of the Y = –X line cannot be detected by examining the distributions of X and Y separately.


Q: I have seen measures that ask the respondent to directly compare the degree to which X deviates from Y. Doesn’t this approach avoid the problems with difference scores?

A: Not really. Although it removes the need for the researcher to calculate the difference, it does not guarantee that the respondent will not implicitly or explicitly calculate the difference between X and Y when providing a response (many response scales for such items prompt the respondent to do just that). If this occurs, then items that solicit direct comparisons are subject to the same problems as difference scores, because these problems do not depend on who calculates the difference. Moreover, direct comparison items hopelessly confound X and Y (analogous to any “double-barreled” item) and force the researcher to take a two-dimensional view of the relationship of X and Y with Z, even when a three-dimensional view may be more informative.


Q: The unconstrained equations for profile similarity indices contain so many items. How do I interpret all those coefficients, and what do I do about degrees of freedom?

A: Testing the full set of constraints imposed by D1, |D|, and D2 does indeed require using items for all of the dimensions as predictors. However, the items constituting profiles can often be grouped into conceptually homogeneous subsets. Scales corresponding to these subsets can then be constructed, which can drastically reduce the effective number of dimensions to be analyzed. This not only makes interpretation easier, but also reduces sample size requirements. Moreover, higher-order terms for each dimension can be tested as sets, and those that are not significant may be dropped (for an illustration of this, see Edwards, 1993). Of course, models derived in this manner should be considered exploratory, pending cross-validation.


Q: By not using difference scores, aren’t we ignoring “fit”?

A: Models using difference scores are simply special cases of general models containing the components of the difference. Hence, these general models subsume those that use difference scores. The general models also permit tests of the constraints imposed by difference scores, which remain unverified when difference scores are used. Moreover, fit hypotheses can usually be restated in terms of relationships involving the variables that constitute the fit construct. By stating hypotheses in these terms, one can verify that relationships for these variables conform to patterns depicted by fit hypotheses. Thus, the use of component variables, supplemented by higher-order terms and response surface analyses, permits tests of most fit hypotheses as well as hypotheses that difference scores cannot depict. This approach lets the researcher gain much and lose little, if anything at all.


Q: How can I apply the quadratic approach to structural equations modeling?

A: Drawing from the literature on moderated structural equation modeling, I have developed procedures for specifying and estimating quadratic structural equation models and applying response surface methodology. These procedures require squares and products of the indicators of first-order latent variables, involve complex nonlinear constraints on parameters, and use estimation methods for nonnormal data. I hope to finish a manuscript describing this procedure in the near future.


Q: How do you generate those fancy graphs?

A: I have traditionally used SYSTAT, which is great for plotting three-dimensional surfaces and adding contour lines, principal axes, and so forth. Surfaces can also be plotted using Microsoft Excel, and I have developed a file that allows the user to enter coefficient estimates from a quadratic equation and the minimum and maximum values of X and Y to produce a surface. This file can be downloaded from my website at: http://public.kenan-flagler.unc.edu/faculty/edwardsj/downloads.htm
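For readers working in Python rather than SYSTAT or Excel, a rough matplotlib analogue (not the downloadable file itself) that plots a surface from coefficient estimates and the observed ranges of X and Y might look like this; the coefficient values are placeholders:

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder coefficient estimates and observed ranges of X and Y
b0, b1, b2, b3, b4, b5 = 3.0, 0.4, 0.35, -0.25, 0.5, -0.2
x_min, x_max, y_min, y_max = 1.0, 7.0, 1.0, 7.0

X, Y = np.meshgrid(np.linspace(x_min, x_max, 50), np.linspace(y_min, y_max, 50))
Z = b0 + b1 * X + b2 * Y + b3 * X**2 + b4 * X * Y + b5 * Y**2

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(X, Y, Z, cmap="viridis")
ax.set_xlabel("X")
ax.set_ylabel("Y")
ax.set_zlabel("Z")
plt.show()
```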


Q: Can you recommend empirical examples of polynomial regression in the organizational behavior literature?

A: The use of polynomial regression has grown since its introduction. Examples published through 2000 are cited in the Edwards (2001) article on difference score myths, and more recent examples are cited in the meta-analysis conducted by Kristof-Brown et al. (2005).


Q: Your approach looks like a real pain. Can I just pretend it doesn’t exist? Or, can I just cite your work to make it look like I'm doing what you recommend?

A: Some researchers tenaciously cling to difference scores. Old habits die hard. As a case in point, in a 1992 Psychological Bulletin article, Lee Cronbach lamented that researchers continue to use profile similarity indices he once advocated (Cronbach, 1955; Cronbach & Gleser, 1953) but subsequently disavowed (Cronbach, 1958). Researchers have also developed clever ways of citing articles that criticize difference scores without following the advice in the articles. Here are some of my favorites, quoted from studies that cite Edwards (1994):


“Computing a correlation across dimensions for each individual to predict outcomes of fit or congruence represents a flawed measure of fit (Edwards, 1994). However, for our purposes here, correlations across individuals within a dimension provide an appropriate measure of the relationship between person and environment.”

“The reliabilities of the difference scores created to assess similarity were relatively high, so it seemed simpler and more understandable to keep the analysis as it was rather than to apply more complicated alternatives (e.g., Edwards, 1994).”

“Unmet expectations were assessed by subtracting scores on each item for the early expectations from scores on each item from the current situation . . . Problems in measuring and analyzing discrepancy scores, and unmet expectations in particular, have been reported recently (Edwards, 1994) . . . these problems have not been entirely overcome here.”


Key References

Bohrnstedt, G. W., & Goldberger, A. S. (1969). On the exact covariance of products of random variables. Journal of the American Statistical Association, 64, 1439-1442.

Bohrnstedt, G. W., & Marwell, G. (1978). The reliability of products of two random variables. In K. F. Schuessler (Ed.), Sociological methodology 1978 (pp. 254-273). San Francisco: Jossey-Bass.

Edwards, J. R. (1994). The study of congruence in organizational behavior research: Critique and a proposed alternative. Organizational Behavior and Human Decision Processes, 58, 51-100 (erratum, 58, 323-325).

Edwards, J. R., & Parry, M. E. (1993). On the use of polynomial regression equations as an alternative to difference scores in organizational research. Academy of Management Journal, 36, 1577-1613.

Edwards, J. R. (1995). Alternatives to difference scores as dependent variables in the study of congruence in organizational research. Organizational Behavior and Human Decision Processes, 64, 307-324.

Edwards, J. R. (2001). Ten difference score myths. Organizational Research Methods, 4, 264-286.

Edwards, J. R. (2002). Alternatives to difference scores: Polynomial regression analysis and response surface methodology. In F. Drasgow & N. W. Schmitt (Eds.), Advances in measurement and data analysis (pp. 350-400). San Francisco: Jossey-Bass.

Kristof-Brown, A. L., Zimmerman, R. D., & Johnson, E. C. (2005). Consequences of individuals' fit at work: A meta-analysis of person-job, person-organization, person-group, and person-supervisor fit. Personnel Psychology, 58, 281-342.

Mooney, C. Z., & Duval, R. D. (1993). Bootstrapping: A nonparametric approach to statistical inference. Newbury Park, CA: Sage.

Stine, R. (1989). An introduction to bootstrap methods. Sociological Methods & Research, 18, 243-291.