
EVALUATING YOUR RESEARCH DESIGN

EDRS 5305

EDUCATIONAL RESEARCH & STATISTICS


Campbell and Stanley (1966)

Two general criteria for evaluating research designs:

Internal validity

External validity


INTERNAL VALIDITY

Definition: the extent to which the changes observed in the dependent variable (DV) are caused by the independent variable (IV).


Internal Validity

Questions of internal validity cannot be answered positively unless the design provides adequate control of extraneous variables.

Internal validity is essentially a problem of control: anything that contributes to the control of a research design contributes to its internal validity.


Internal Validity

1. History: specific events or conditions, other than the treatment, that occur between the first and second measurements of the participants may produce changes in the DV.

2. Maturation: processes that operate within the participants simply as a function of the passage of time (e.g., growing older, more tired, or more experienced).


Internal Validity

3. Pretesting: exposure to a pretest may affect participants’ performance on a second test, regardless of the IV.

4. Measuring instruments: changes in the measuring instruments, in the scorers, or in the observers used may produce changes in the obtained measures.


Internal Validity

5. Statistical regression: if groups are selected on the basis of extreme scores, statistical regression (regression toward the mean) may produce an effect that could be mistaken for an experimental effect.
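The regression threat can be made concrete with a short simulation (all numbers hypothetical): a group selected for extreme scores on one test drifts back toward the mean on a second, equivalent test even though no treatment was applied.

```python
import random

random.seed(42)

# Each participant has a stable "true" ability; each test score adds
# independent measurement error. Selecting on an extreme test-1 score
# therefore selects, in part, on lucky error that will not recur.
N = 10_000
true_ability = [random.gauss(50, 10) for _ in range(N)]
test1 = [t + random.gauss(0, 10) for t in true_ability]
test2 = [t + random.gauss(0, 10) for t in true_ability]

# Select the "extreme" group: roughly the top 10% on the pretest.
cutoff = sorted(test1)[int(0.9 * N)]
extreme = [i for i in range(N) if test1[i] >= cutoff]

mean1 = sum(test1[i] for i in extreme) / len(extreme)
mean2 = sum(test2[i] for i in extreme) / len(extreme)

print(f"extreme group, test 1: {mean1:.1f}")
print(f"extreme group, test 2: {mean2:.1f}")  # lower, with no treatment at all
```

A researcher who gave this extreme group a treatment between the two tests could easily mistake the drop (or mask a real gain) for a treatment effect.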


Internal Validity

6. Differential selection of participants: important differences may exist between the groups before the IV is applied.

7. Experimental mortality (attrition): occurs when there is a differential loss of respondents from the comparison groups.


Internal Validity

8. Selection-maturation interaction: some of these internal validity threats may interact with one another; this frequently arises when volunteers are compared with nonvolunteers.


Internal Validity

9. Implementation: sometimes the way the IV is implemented threatens internal validity (e.g., the experimenter bias effect).

10. Participants’ attitudes: the Hawthorne effect (participants respond to the positive attention they receive) and the John Henry effect (control-group participants exert extra effort to compete with the treatment group).


Controlling for Threats to Internal Validity

1. Random assignment

2. Randomized matching: match participants on as many variables as possible, then randomly assign one member of each pair to the treatment condition; the other goes to the control group.
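The two strategies above can be sketched in plain Python (the participant pool, group sizes, and the IQ matching variable are hypothetical):

```python
import random

random.seed(1)

# Hypothetical participant pool: (id, IQ) pairs; IQ is the matching variable.
pool = [(i, random.randint(85, 130)) for i in range(20)]

# 1. Simple random assignment: shuffle the pool, then split it in half.
shuffled = pool[:]
random.shuffle(shuffled)
treatment = shuffled[:10]
control = shuffled[10:]

# 2. Randomized matching: rank on the matching variable, pair adjacent
#    participants, then randomly send one member of each pair to treatment.
matched_treatment, matched_control = [], []
ranked = sorted(pool, key=lambda p: p[1])
for a, b in zip(ranked[0::2], ranked[1::2]):
    if random.random() < 0.5:  # coin flip decides which member gets the IV
        a, b = b, a
    matched_treatment.append(a)
    matched_control.append(b)
```

Matching guarantees the two groups are balanced on IQ; the coin flip within each pair preserves random assignment to conditions.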


3. Homogeneous selection: select samples that are as similar as possible on some extraneous variable (e.g., IQ or age).

4. Building variables into the design: include the extraneous variable as one of the IVs examined (e.g., gender)

5. Analysis of covariance: statistically removing the portion of performance that is systematically related to an extraneous variable.

6. Using participants as their own controls: each participant experiences every experimental condition, one at a time.
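The idea behind strategy 5 can be illustrated with a minimal sketch (all data hypothetical): regress the DV on the extraneous covariate, then compare groups on the residuals, i.e., on the part of performance that is not predictable from the covariate. A real analysis of covariance fits group and covariate together (e.g., with a statistical package); this only shows the "removal" step.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return slope, my - slope * mx

# Hypothetical data: pretest (covariate) and posttest (DV) for two groups.
pre   = [52, 60, 71, 80, 55, 63, 68, 77]
post  = [58, 66, 74, 85, 60, 70, 73, 82]
group = ["T", "T", "T", "T", "C", "C", "C", "C"]

# Regress the DV on the covariate using all participants, keep the residuals:
# the part of each posttest score not explained by the pretest.
slope, intercept = fit_line(pre, post)
residuals = [yi - (slope * xi + intercept) for xi, yi in zip(pre, post)]

# Compare the groups on the covariate-adjusted scores.
n_t = group.count("T")
adj_t = sum(r for r, g in zip(residuals, group) if g == "T") / n_t
adj_c = sum(r for r, g in zip(residuals, group) if g == "C") / (len(group) - n_t)
```

Comparing `adj_t` with `adj_c` asks whether the groups differ after the pretest-related portion of performance has been removed.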


External Validity of Research Designs

Refers to generalizability or representativeness of the findings.

The question addressed here is: to what groups, settings, experimental variables, and measurement variables can these findings be generalized?


Types of External Validity

1. Population external validity: identifying the population to which the results can be generalized.

2. Ecological external validity: concerned with generalizing experimental effects to other environmental conditions (i.e., settings).


Types of External Validity

3. External validity of operations: concerned with how well the operational definitions and the experimental procedures represent the constructs of interest. Would the same relationships be found if a different researcher used different operations (i.e., measures) in investigating the same question?