
Session W1G

978-1-4244-4714-5/09/$25.00 ©2009 IEEE October 18 - 21, 2009, San Antonio, TX 39th ASEE/IEEE Frontiers in Education Conference W1G-1

Steps toward a Sound Measure of Engineering Student Attitudes: Pittsburgh Engineering Attitudes Scale - Revised

Jonathan Hilpert, Glenda Stump, Jenefer Husman, Wonsik Kim, Wen-Ting Chung, Jieun Lee
[email protected]; [email protected]; [email protected]; [email protected]; [email protected]; [email protected]

Abstract - Engineering education is currently undergoing a transition toward increased constructivist practices and a new philosophy of education that recognizes the active role of the learner in constructing knowledge. To meet the demands of the changing landscape, our research team has been reworking the Pittsburgh Engineering Attitudes Scale so that valid and reliable inferences can be made about engineering student attitudes. The current study follows up on our analysis of the scale at FIE 2008 and presents evidence of structural validity, internal reliability, and external validity of student responses to a revised version of the scale. The results of the current study suggest that the Pittsburgh Engineering Attitudes Scale - Revised overcomes previous problems with the original scale reported in the literature and that we are taking positive steps toward a sound measure of student attitudes.

Index Terms – Scale Development, Constructivism, Motivation, Attitudes

INTRODUCTION

During FIE 2008, our research team presented an analysis of the Pittsburgh Engineering Attitudes Scale [1], a measurement instrument used for engineering program evaluation at over seventeen academic institutions in the last two decades [2] [3]. Although our analysis of the scale suggested the constructs were significantly and positively related to the other constructivist motivation and strategic learning variables we examined, we uncovered a variety of problems with student responses to the items (skewed data, unsatisfactory internal reliability, and cross loading), some of which had been echoed in the literature on the instrument [3]. Since then, we have revised the items and administered the new scale to a large sample of post-secondary engineering students. We wanted to determine if our revised set of items produced responses that maintained structural fidelity to the original scale and provided evidence of valid inferences about an engineering student population, especially in regard to constructivist learning variables.

CONSTRUCTIVISM AND ENGINEERING EDUCATION

Engineering education is in the process of a transformation toward constructivist educational practices [4]-[7]. Updated measurement instruments are required to examine how student attitudes within the changing landscape of engineering education relate to students’ academic achievement, strategic learning, and persistence in the field. In recent years, inquiry into the adequacy of engineering educational philosophy [5] [8] [9], engineering educational practices [10] [11], and the professional role of the engineer in today’s world [6] has led to widespread dissatisfaction with traditional approaches to engineering education. So-called “chilly” climates [12] that emphasize basic sciences and technical subjects with little application or collaborative learning have been widely criticized [6]. Sheppard and colleagues [6] suggest, “Although engineering schools aim to prepare students for the profession, they are heavily influenced by academic traditions that do not always support the profession’s needs.” These types of criticisms have led to the advancement of an integrated approach to engineering education that includes the development of analytical reasoning, practical skills, and professional judgment with increased sensitivity to gender and diversity.

Additionally, the transformation of engineering education has included a deep reevaluation of engineering educational philosophy. Recent FIE conferences have included sessions devoted to exploring new philosophies of engineering education. For example, Heywood and colleagues [5] have led insightful discussions regarding the changing role of philosophy in engineering and engineering education. Although these discussions have run the gamut of philosophical approaches to science, technology, and engineering, considerable attention has been given to imbuing engineering education with a philosophy of education. Heywood and colleagues [5] write, “Many publications have something to say about either the curriculum or instruction but they do not draw on the philosophy of education. There is a need for the two philosophies to come together to create a more informed critique of engineering education.”

Within these discussions, the advantages of a constructivist philosophy of education have been addressed. Constructivist views of education, championed by thinkers such as Piaget [13] and Vygotsky [14], have dominated educational psychological study for the past few decades and have surfaced in a wide variety of approaches to understanding human learning, including social-cognitive theory [15][16], information processing theory [17]-[19], and the study of higher order cognitive processes [20][21]. Commonly grouped under the label constructivism [22], these views treat learning as the active construction of knowledge representations through social negotiation and exploration, with an emphasis on the role of attitudes, affect, motivation, and prior knowledge in the development of conceptual understanding.

THE PITTSBURGH ENGINEERING ATTITUDES SCALE

The Pittsburgh Engineering Attitudes Scale (PEAS) [2][3] was written to reflect the changing and expanded role of professional engineers in today’s world and was developed to assess the impact of student attitudes on their advancement toward engineering degrees. Inherently, the study of how attitudes impact learning and decision making is a constructivist pursuit, and the subscales of the original instrument reflect the authors’ insight into the changing role of the engineer in the latter decades of the 20th century. For example, the original instrument evaluated not only student reasons for entering the profession (i.e., financial influences, parental pressures, enjoyment of math and science, and general impressions of the field) but also their implicit beliefs about science and the professional role of engineers in society (i.e., perception of engineering as an exact science, the social prestige of being an engineer, and the potential for engineers to make societal contributions). In a general sense, the attempt to measure student attitudes about the profession [2], the subsequent popularity of the scale [23], and salient findings regarding gender and minority attitudinal differences [3] highlight both the inadequacy of traditional approaches to engineering education and the need for survey instruments that can be used to examine student attitudes within a constructivist philosophy of engineering education.

In our previous administration of the scale [1], we found extensive evidence that engineering student responses to the PEAS worked well in relation to other common constructivist measures of student strategic learning and motivation. Our external validity evidence included significant and positive relationships between students’ responses to the subscales of the PEAS and responses to measures of student knowledge building, collaborative learning, perceived instrumentality of coursework, and self-efficacy for learning engineering content (see our 2008 FIE proceeding for details). Although these results were positive, we uncovered a number of problems with the measurement instrument in our exploratory factor analysis of the scale. Most notably, students’ responses to the items were skewed, items cross loaded onto inappropriate factors, and many of the subscales did not have enough corresponding items to adequately assess internal reliability. In our analysis, we suggested that these problems could potentially interfere with researchers’ ability to reliably and validly assess student attitudes with the instrument, and that revisions to the scale were necessary. We hoped that these revisions would help to make positive steps toward a viable version of the PEAS that could be used in conjunction with other constructivist measures of student learning and motivation.

MEASUREMENT OF ENGINEERING STUDENT ATTITUDES

In a recent review of qualitative and quantitative methods of evaluating engineering student learning, motivation, and attitudes [24], statistical techniques such as factor analysis, item analysis, and other common methods of producing evidence for internal reliability and structural validity of measurement instruments were not mentioned. In response to the authors’ call to investigate additional methods, we argue that sound measurement techniques are an integral component of engineering education research and are central to producing reliable and valid interpretations of results when conducting other types of statistical tests such as t-tests, correlational analyses, multiple regression, ANOVA, and MANOVA. It is important for quantitative researchers to examine whether valid and reliable inferences can be made from their measures within their target populations, before moving on to other statistical tests, to ensure that their results are trustworthy [25]. Researchers need to examine the appropriateness of data for their intended uses before conducting higher level analyses [26]. The current study is our second administration of the PEAS in a population of post-secondary engineering students, with revisions based on the results of our first analysis.

Based on contemporary scale construction methods [27], we combined items from the original PEAS that elicited valid responses with newly written items that reflected the content and purpose of the original instrument. In our revision, we developed eight items for each of the seven original subscales, creating the Pittsburgh Engineering Attitudes Scale - Revised (PEAS-R). We then administered the scale to a large sample of engineering students at a large public university in the southwest via an online survey, along with established measures of student motivation and strategic learning. Our online administration matched the intent of the original authors, who paved the way for online assessment of student attitudes in engineering education research [2]. The purpose of our testing is to develop four or five items for each subscale that will work well in populations of post-secondary engineering students.

Our research questions were: do post-secondary student responses to the PEAS-R produce evidence of 1) structural validity, 2) internal reliability, and 3) external validity? For the third research question, we calculated correlation coefficients to examine whether there were significant positive correlations between the PEAS-R subscales and students’ knowledge building, collaborative learning, perceived instrumentality for coursework, and self-efficacy for learning engineering content. We hypothesized that student responses to our revised items would be relatively normally distributed, would maintain structural fidelity to the original scale, could be reliably interpreted as representative of the measured attitudinal constructs, and would be significantly and positively correlated with our measures of constructivist learning and motivation.

METHOD

I. Participants

Participants consisted of 212 students from 19 mechanical and aerospace engineering courses in a large public university in the southwestern United States. The courses were representative of all levels of the curriculum, with 9.4% of the students in a 100 level course, 36.4% in 200 level courses, 30% in 300 level courses, and 24% in 400 level courses. Females comprised 15.6% of the sample.

II. Measures

• Pittsburgh Freshman Engineering Attitudes Scale - Revised (PEAS-R): A 56-item revision of the original PEAS, based on previous assessment [1], was used for this study. Students were asked to indicate their agreement or disagreement with statements that reflected attitudes about engineering in the following seven areas: general impressions, financial influences, societal contributions, social prestige, enjoyment of math and science, engineering as an exact science, and parental pressure.

• Student Perceptions of Classroom Knowledge-building (SPOCK): The SPOCK subscales [28] were administered to assess students' perceptions of their own knowledge building and intentional learning behavior. Two subscales were utilized. The first subscale focused on classroom knowledge-building. This 8-item subscale assessed students’ tendencies to make meaning from and construct their own understanding of classroom material. An example of a knowledge-building item was “As I study the topics in this class, I try to think about how they relate to the topics I am studying in other classes.” The second SPOCK subscale assessed collaborative learning at the classroom level. An example of one of the five items in the subscale was, “In this class, my classmates and I actively work together to complete assignments.” The items on the SPOCK were classroom-specific, and participants were asked to focus on one class when completing the SPOCK.

• Perceptions of Instrumentality (PI): The PI scale [29] was administered to determine perceived instrumentality of a particular college course selected by the participant. Two types of items comprised the PI scale. One type of item evaluated the perceived instrumentality of learning from the course, such as, “I will use the information I learn in the (course selected) in other classes I will take in the future.” The second type of item evaluated the perceived instrumentality of the grade earned in the class, such as, “The grade I get in the (course selected) will not affect my ability to continue on with my education.”

• MSLQ Self-efficacy (MSLQSE): The MSLQ self-efficacy subscale [30] was administered to assess students’ perceptions of their own abilities. This subscale assessed students’ confidence in their ability to learn class information. An example item was, “I am confident I can understand the basic concepts taught in this course.”

III. Procedure

The revised 56-item scale was administered as part of a larger online survey that examined students’ beliefs about the future, self-efficacy for learning course material, and study strategies utilized in their coursework. Students received a monetary incentive of $10.00 for completing the survey.

IV. Analysis

Item level descriptive statistics were calculated to determine if student responses to the items were relatively normally distributed. Skew values greater than 1 or less than -1 were considered problematic [25]. Exploratory factor analysis was conducted to determine the number and identity of factors measured by the revised items and to determine their fidelity to the factors measured by the original PEAS. Principal axis factoring was chosen to estimate the model because this method makes no assumptions regarding multivariate normality [31]. Four criteria were used to determine the number of factors to rotate: 1) our a priori hypothesis that the original seven factors were present, 2) the scree test [32], 3) a parallel analysis [33], and 4) the interpretability of the factor solution. For the parallel analysis, the mean random data eigenvalues were used as the basis for comparison.
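To make the skew screen and the factor-retention criteria concrete, the following is a minimal sketch of how these two steps could be implemented; it is not the code used in the study, and the DataFrame name `items` (one column per PEAS-R item, one row per respondent) is a hypothetical placeholder.

```python
import numpy as np
import pandas as pd

def flag_skewed_items(items: pd.DataFrame, cutoff: float = 1.0) -> pd.Series:
    """Return items whose skew falls outside +/- cutoff (here, |skew| > 1)."""
    skew = items.skew()
    return skew[skew.abs() > cutoff]

def parallel_analysis(items: pd.DataFrame, n_iter: int = 100, seed: int = 0) -> int:
    """Suggest the number of factors to retain: count observed eigenvalues that
    exceed the mean eigenvalues of correlation matrices from random data."""
    rng = np.random.default_rng(seed)
    n_obs, n_items = items.shape
    observed = np.linalg.eigvalsh(items.corr().to_numpy())[::-1]  # descending
    random_eigs = np.empty((n_iter, n_items))
    for i in range(n_iter):
        fake = rng.standard_normal((n_obs, n_items))
        random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(fake, rowvar=False))[::-1]
    mean_random = random_eigs.mean(axis=0)  # mean random-data eigenvalues as baseline
    return int(np.sum(observed > mean_random))
```

The comparison against `mean_random` corresponds to the study’s choice of the mean random-data eigenvalues as the basis for comparison; the scree test and interpretability judgments remain visual and substantive decisions.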

Initially, nine factors were examined using a direct oblimin rotation procedure (used when factors are hypothesized to be correlated). Based on those results, a seven-factor solution was then attempted. Communalities were inspected to determine if variables were well defined by the factors. Adequacy of rotation was evaluated by viewing factor plots for clustering of variables near the axes, viewing factor loadings for simple structure, and interpreting the meaning of the resultant factors. A cutoff of .40 was utilized as the basis for acceptable factor loadings [25]. After the optimal solution was chosen, factors were compared with those from the original instrument to determine if the factor structure was maintained in the revised scale. To provide internal reliability evidence, Cronbach’s alphas were computed for the groups of items associated with their respective factors. Alphas greater than .6 were considered acceptable. To provide external validity evidence, items that loaded strongly on their hypothesized factor were parceled, and correlation coefficients were calculated among the study variables.
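As a rough illustration of the reliability and external validity steps, the sketch below shows how Cronbach’s alpha and the item parcels used in the correlation analysis could be computed. The subscale names, item labels, and the `factor_items` mapping are hypothetical placeholders, not the study’s actual item assignments.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def parcel_and_correlate(items: pd.DataFrame, factor_items: dict) -> pd.DataFrame:
    """Average the retained items for each factor (parceling), then correlate parcels."""
    parcels = pd.DataFrame({factor: items[cols].mean(axis=1)
                            for factor, cols in factor_items.items()})
    return parcels.corr()

# Hypothetical mapping of retained items to subscales (placeholder column names).
factor_items = {
    "societal_contributions": ["sc1", "sc2", "sc3", "sc4", "sc5", "sc6", "sc7"],
    "financial_influences":   ["fi1", "fi2", "fi3", "fi4", "fi5", "fi6"],
    # ... remaining PEAS-R subscales and the motivation/strategic learning measures
}
```

In this sketch, `cronbach_alpha` would be applied to each subscale’s retained items, and `parcel_and_correlate` would produce the kind of correlation matrix reported among the study variables.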


RESULTS

Overall, participants marked items favorably, with item means ranging from 1.72 to 4.39 and standard deviations ranging from .58 to 1.04. Further exploration revealed that student responses to the items were relatively normally distributed, with skew values slightly less than -1 for three items and slightly greater than 1 for two items. These results were favorable.

A review of the scree plot [32] indicated that our initial hypothesis of seven factors was correct; however, the comparison between eigenvalues observed in the study data and those estimated from random data in the parallel analysis [33] suggested that a nine-factor solution may have been viable. The two additional factors were not interpretable, however, and were determined to be the result of poorly worded revised items. After removing the items associated with the two problematic factors, a seven-factor rotated solution was attempted with the remaining 46 items. Eight additional items cross loaded above .4 on multiple factors and were removed. Seven interpretable factors, representative of our hypothesized solution, remained: societal contributions, financial influences, parental pressure, enjoyment of math and science, engineering as exact science, general impressions, and social prestige (see Table 1 for the final solution).
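The cross-loading screen described above can be expressed compactly; this is an illustrative sketch only, assuming a hypothetical items-by-factors pattern matrix `loadings` (e.g., a pandas DataFrame built from the rotated solution), not the study’s actual output.

```python
import pandas as pd

def flag_cross_loading(loadings: pd.DataFrame, cutoff: float = 0.40) -> pd.Index:
    """Return items whose absolute loadings exceed the cutoff on more than one factor."""
    n_strong = (loadings.abs() > cutoff).sum(axis=1)
    return loadings.index[n_strong > 1]

# Example: drop the flagged items, then refit the seven-factor solution on the remainder.
# retained_items = items.drop(columns=flag_cross_loading(loadings))
```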

The oblique rotation method was supported by mild to moderate correlations between factors, ranging from .009 to .470 in the seven-factor correlation matrix. Communalities of the items ranged from .24 to .74. Using the recommended cutoff value of .40, all 38 retained items loaded onto factors with no excessive cross loading. Examination of the factor plot revealed three distinct clusters of items, one located near the axis and the remaining two located some distance from the axis. The remaining four sets of items were clustered closely to each other as well as to the axis. The seven-factor solution explained 52.23% of the total variance, with societal contributions accounting for the greatest proportion of variance.

Cronbach’s alphas for the seven factors were calculated. The results suggested reliable interpretation of student responses: societal contributions, α = .89; financial influence, α = .85; parental pressure, α = .79; enjoyment of math and science, α = .87; engineering as an exact science, α = .71; general impressions, α = .73; and social prestige, α = .83.

Descriptive statistics for the PEAS-R factors and constructivist learning variables indicated that all of the study variables were relatively normally distributed and met the assumptions for a correlation analysis (see Table 2). Correlations among the study variables showed that positive general impressions about engineering study and recognition of engineers’ positive impact on society were associated with students’ perceived value of course work, effective study strategies, and increased self-efficacy for learning (see Table 3).

DISCUSSION

The seven-factor rotated solution yielded optimal results in terms of factor loadings, factor interpretability, and variance explained. Factors extracted from the revised PEAS were the same as those from the original instrument.

TABLE I
ITEMS AND FACTOR LOADINGS FOR THE SEVEN-FACTOR SOLUTION

Factor 1: Societal Contributions
  Engineers in my field…           .78
  Engineers in my field play…      .76
  In my field, engineering is…     .71
  Engineers in my field work…      .70
  Engineers in my field have…      .63
  Engineers in my field do…        .60
  In my field, engineering is…     .58

Factor 2: Financial Influences
  Financial success is one…        .84
  I am studying engineering…       .79
  I am interested in…              .77
  Money is not one of the…         .66
  I am studying engineering…       .60
  I do not study engineering…      .48

Factor 3: Parental Pressures
  My parent(s) are intent…         .71
  My parents were very…            .65
  I am studying engineering…       .62
  My parent(s) want me…            .59
  My parent(s) were not…           .58
  My parent(s) want me…            .51
  My parent(s) are not intent…     .45

Factor 4: Enjoyment of Math and Science
  I like to take courses…          .81
  I find mathematics…              .76
  I find studying the…             .76
  Studying mathematics…            .71
  I really enjoy…                  .63
  Studying mathematics…            .58

Factor 5: Engineering as Exact Science
  Engineering is an exact…         .80
  Engineering problems are…        .75
  There is always a right…         .47

Factor 6: General Impressions
  Engineering is the most…         .65
  There is no other major…         .57
  My interest in engineering…      .41

Factor 7: Social Prestige
  Engineers in my field are not…   .70
  Engineers in my field are…       .68
  Engineers in my field are not…   .66
  Engineers in my field are held…  .48
  Engineering is an occupation…    .46
  Engineers in my field are…       .46

Note. Listwise N = 212. Principal axis analysis with oblimin rotation; each loading is shown on the item’s own factor; loadings below .40 suppressed.


After parceling the subsets of items that loaded well onto their latent factors, we found that the latent attitudinal constructs were significantly correlated with our motivation and strategic learning variables, producing evidence that valid inferences about this population can be made from student responses to the PEAS-R.

Students who reported favorable general impressions of engineering and recognized the societal contribution of engineers tended to report more effective study strategies and high self-efficacy for learning engineering content. The same can be said for students who reported enjoying math and science. It is also important to note that financial influences and parental pressure were significantly positively correlated, and were significantly negatively correlated with general impressions of the field and social prestige, respectively. Perceptions of engineering as an exact science were also significantly positively correlated with parental pressure. Taken together, these results seem to suggest that students who recognize the professional role of engineers as making positive social impacts are more likely to enjoy their study and make strong cognitive connections across their coursework. In contrast, students who reported being pressured into engineering by parents reported being focused on monetary gain and perceiving engineering as an exact science. Self-determination theorists [34] argue that people are motivated through satisfaction of psychological needs such as the desire to feel competent in a discipline or connected to others in a meaningful way. Our results align nicely with this avenue of study, and there may be important overlap between student responses to the PEAS-R and their need for self-determination.

Based on these promising results, we have revised the items once more and are currently testing them again in similar samples of post-secondary engineering majors to finalize the PEAS-R. In a third and final follow-up paper, we plan to present four or five reliable and valid items for each subscale. We hope that our efforts will eventually produce a psychometrically sound measure of engineering student attitudes that can be used in conjunction with common constructivist measures of strategic learning and motivation.

ACKNOWLEDGMENT

Funding for the current study was provided by National Science Foundation Grant, REC-0546856, CAREER: Connecting with the Future: Supporting Career and Identity Development in Post-Secondary Science and Engineering.

AUTHOR INFORMATION

Jonathan Hilpert, Assistant Professor, Indiana University Purdue University Fort Wayne, [email protected]
Glenda Stump, Doctoral Student, Arizona State University, [email protected]
Jenefer Husman, Associate Professor, Arizona State University, [email protected]
Wonsik Kim, Doctoral Student, Arizona State University, [email protected]
Wen-Ting Chung, Doctoral Student, Arizona State University, [email protected]
JiEun Lee, Doctoral Student, Arizona State University, [email protected]

TABLE 2
DESCRIPTIVE STATISTICS FOR THE SUBSCALES

             Min    Max    M      SD     Skew
SPKKB        1.00   5.00   3.42   0.75   -0.42
SPKCL        1.00   5.00   3.24   1.09   -0.17
PIEN         1.00   5.00   4.08   0.73   -0.96
PIEX         2.00   5.00   3.88   0.71   -0.36
MSLQSE       1.83   6.67   5.23   0.92   -0.47
PEASRGI      1.00   5.00   3.49   0.77   -0.35
PEASRFI      1.50   5.00   3.28   0.71   -0.04
PEASRSC      2.00   5.00   4.05   0.53   -0.49
PEASRSP      2.50   5.00   4.10   0.47   -0.14
PEASREMS     2.00   5.00   4.02   0.57   -0.34
PEASREES     1.00   5.00   2.71   0.78    0.48
PEASRPP      1.00   4.29   2.64   0.60    0.02

Note. Listwise N = 212. SPKKB = knowledge building; SPKCL = collaboration; PIEN = endogenous PI; PIEX = exogenous PI; MSLQSE = self-efficacy; PEASRGI = general impressions; PEASRFI = financial influences; PEASRSC = societal contributions; PEASRSP = social prestige; PEASREMS = enjoyment of math and science; PEASREES = engineering as exact science; PEASRPP = parental pressure.

TABLE 3
CORRELATIONS AMONG ALL STUDY VARIABLES

               1.     2.     3.     4.     5.     6.     7.     8.     9.     10.    11.    12.
1. SPKKB       -
2. SPKCL      .45**   -
3. PIEN       .60**  .14     -
4. PIEX       .17    .08    .34**   -
5. MSLQSE     .47**  .20**  .46**  .10     -
6. PEASRGI    .28**  .18**  .16*   .08    .22**   -
7. PEASRFI   -.10    .00    .00    .04    .07   -.18*    -
8. PEASRSC    .22**  .06    .22**  .08    .20**  .30**   .01     -
9. PEASRSP    .11    .00    .14*   .14*   .16*   .21**   .13    .61**   -
10. PEASREMS  .18**  .08    .23**  .11    .33**  .45**  -.08    .41**  .34**   -
11. PEASREES  .02   -.12    .06    .10    .07   -.04     .07    .00   -.10   -.03    -
12. PEASRPP   .00   -.01   -.05    .02   -.09   -.07     .14*  -.09   -.14*  -.12   .14*    -

Note. **p < .01, *p < .05; Listwise N = 212. Abbreviations as in Table 2.

REFERENCES

[1] Hilpert, J. et al., “An Exploratory Factor Analysis of the Pittsburgh Freshman Engineering Attitudes Survey,” Frontiers in Education Conference, Oct. 2008

[2] Besterfield-Sacre, M.E. and C.J. Atman, “Survey Design Methodology: Measuring Freshman Attitudes About Engineering,” American Society for Engineering Education Conference Proceedings, June 1994, pp. 236-242.

[3] Besterfield-Sacre, M.E., M. Moreno, L.J. Shuman, and C.J. Atman, “Gender and Ethnicity Differences in Freshman Engineering Student Attitudes: A Cross Institutional Study,” Journal of Engineering Education, November 2001.

[4] Duderstadt, J.J., “Engineering for a Changing World: A Roadmap to the Future of Engineering Practice, Research, and Education,” The Millennium Project, The University of Michigan, 2008.

[5] Heywood, J., R. McGrann, & K. Smith, “Special Session – Continuing the FIE 2007 Conversation On: Can Philosophy of Engineering Education Improve the Practice of Engineering Education?” Frontiers in Education Conference, Oct. 2008.

[6] Sheppard, S. D., K. Macatangay, A. Colby, & W. M. Sullivan, “Educating Engineers: Designing for the Future of the Field.” The Carnegie Foundation for the Advancement of Teaching, 2008.

[7] Taraban, R. et al., “A Paradigm for Assessing Students’ Conceptual and Procedural Knowledge,” Journal of Engineering Education, no. 4, 2007, pp. 335-245.

[8] Heywood, J., “Philosophy and Engineering Education: A Review of Certain Developments in the Field,” Frontiers in Education Conference, Oct. 2008.

[9] Harding, T. S., “The Psychology of ‘Ought’,” Frontiers in Education Conference, Oct. 2008.

[10] Seymour, E., and N.M. Hewitt, “Talking About Leaving: Why Undergraduates Leave the Sciences,” Westview Press, 2000.

[11] Vogt, M.V, D. Hocevar, and L.S. Hagedorn, “A Social Cognitive Construct Validation: Determining Women’s and Men’s Success in Engineering Programs.” The Journal of Higher Education, vol. 78, no. 3, June 2007, pp. 337-364.

[12] Seymour, E. “The loss of women from science, mathematics, and engineering undergraduate majors: An explanatory account.” Science Education, 1995, 79(4), 437-473.

[13] Piaget, J., The Origins of Intelligence in Children (M. Cook Trans.). New York: Harcourt Brace, 1952.

[14] Vygotsky L., “Mind and Society: The Development of Higher Psychological Processes.” Cambridge, MA: Harvard University Press, 1978.

[15] Bandura, A., “Self-Efficacy: The Exercise of Control,” New York, Freeman, 1997.

[16] Zimmerman B. J., & Schunk, D. H., “Self-Regulated Learning and Academic Achievement: Theoretical Perspectives” (2nd ed). Lawrence Erlbaum Associates, 2001.

[17] Atkinson, R. C., & R. M. Shiffrin, “Human Memory: A Proposed System and its Control Processes,” in K. W. Spence & J. T. Spence (Eds.), The Psychology of Learning and Motivation: Advances in Research and Theory (Vol. 2). San Diego, CA: Academic Press, 1968.

[18] Baddeley, A. D., “Working Memory,” Science, vol. 255, pp. 556-559, 1992.

[19] Mayer, R. E., “Learners as Information Processors: Legacies and Limitations of Educational Psychology’s Second Metaphor,” Educational Psychologist, vol. 31, pp. 151-161, 1996.

[20] Schraw, G., “Knowledge: Structures and Processes.” In P.A. Alexander & P. H. Winne (Eds.), Handbook of Educational Psychology (pp. 369-390). Mahwah, NJ, Lawrence Erlbaum Associates, 2006.

[21] Bransford et al., “Learning Theories and Education: Toward a Decade of Synergy.” In P. A. Alexander & P. H. Winne (Eds.), Handbook of Educational Psychology (2nd ed., pp. 209-244). Mahwah, NJ: Erlbaum, 2006

[22] Ormrod, J., Educational Psychology: Developing Learners. Pearson Inc., 2008

[23] Besterfield-Sacre, M.E. et al., "Comparing entering Freshman Engineers: Institutional Differences in Student Attitudes," 1999 American Society for Engineering Education Conference Proceedings, Charlotte, NC, June 1999.

[24] Borrego, M., E. P. Douglas, & C. T. Amelink, “Quantitative and Qualitative Research Methods in Engineering Education,” Journal of Engineering Education, no 1, pp. 53-66, 2009

[25] Tabachnick, B. G., & L. S. Fidell, “Using Multivariate Statistics” (5th ed), Pearson Inc., 2007.

[26] Thorndike, R. M., “Measurement and Evaluation in Psychology and Education” (7th ed), Pearson Inc., 2005.

[27] DeVellis, R.F., “Scale Development: Theory and Applications,” Sage Publications, 2007.

[28] Shell, D., et al., “The Impact of Computer Supported Collaborative Learning Communities on High School Students’ Knowledge Building, Strategic Learning, and Perceptions of the Classroom,” Journal of Educational Computing Research, vol. 33, no. 3, pp. 327-349, 2005.

[29] Husman, J. & J. Hilpert, “The Intersection of Students’ Perceptions of Instrumentality, Self-Efficacy, and Goal Orientations in an Online Mathematics Course,” Zeitschrift für Pädagogische Psychologie, vol. 21, no. 3/4, pp. 229-239, 2007.

[30] Pintrich, P. R., D.A. Smith, T. Garcia, & W.J. McKeachie, “Reliability and Predictive Validity of the Motivated Strategies for Learning Questionnaire (MSLQ),” Educational and Psychological Measurement, vol. 53, no. 3, pp. 801-813, 1993.

[31] Fabrigar, L. R., D.T. Wegener, R. C. MacCallum, & E. J. Strahan, “Evaluating the Use of Exploratory Factor Analysis in Psychological Research,” Psychological Methods, vol. 4, no. 3, pp. 272-299, 1999.

[32] Cattell, R. B., “The Scree Test for the Number of Factors,” Multivariate Behavioral Research, vol. 1, pp. 245-276, 1966.

[33] O’Connor, B.P. “SPSS and SAS programs for determining the number of components using parallel analysis and Velicer’s MAP test,” Behavior Research Methods, Instruments, & Computers, 32 (3), pp. 396-402, 2000

[34] Ryan, R. M., E. L. Deci “Self Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being,” American Psychologist, 55(1), pp. 68-78, 2000