
Assessment Practices
in Undergraduate Mathematics


© 1999 by The Mathematical Association of America

ISBN 0-88385-161-X
Library of Congress Catalog Number 99-63047

Printed in the United States of America

Current Printing
10 9 8 7 6 5 4 3 2


Assessment Practices
in Undergraduate Mathematics

BONNIE GOLD

SANDRA Z. KEITH

WILLIAM A. MARION

EDITORS

Published by The Mathematical Association of America

MAA NOTES NUMBER 49


The MAA Notes Series, started in 1982, addresses a broad range of topics and themes of interest to all who are involved with undergraduate mathematics. The volumes in this series are readable, informative, and useful, and help the mathematical community keep up with developments of importance to mathematics.

Editorial Board
Tina Straley, Editor

Nancy Baxter-Hastings, Sylvia T. Bozeman, Donald W. Bushaw, Sheldon P. Gordon, Stephen B. Maurer, Roger B. Nelsen, Mary R. Parker, Barbara E. Reynolds (Sr), Donald G. Saari, Anita E. Solow, Philip D. Straffin

MAA Notes

11. Keys to Improved Instruction by Teaching Assistants and Part-Time Instructors, Committee on Teaching Assistants and Part-Time Instructors, Bettye Anne Case, Editor.

13. Reshaping College Mathematics, Committee on the Undergraduate Program in Mathematics, Lynn A. Steen, Editor.

14. Mathematical Writing, by Donald E. Knuth, Tracy Larrabee, and Paul M. Roberts.

16. Using Writing to Teach Mathematics, Andrew Sterrett, Editor.

17. Priming the Calculus Pump: Innovations and Resources, Committee on Calculus Reform and the First Two Years, a subcommittee of the Committee on the Undergraduate Program in Mathematics, Thomas W. Tucker, Editor.

18. Models for Undergraduate Research in Mathematics, Lester Senechal, Editor.

19. Visualization in Teaching and Learning Mathematics, Committee on Computers in Mathematics Education, Steve Cunningham and Walter S. Zimmermann, Editors.

20. The Laboratory Approach to Teaching Calculus, L. Carl Leinbach et al., Editors.

21. Perspectives on Contemporary Statistics, David C. Hoaglin and David S. Moore, Editors.

22. Heeding the Call for Change: Suggestions for Curricular Action, Lynn A. Steen, Editor.

24. Symbolic Computation in Undergraduate Mathematics Education, Zaven A. Karian, Editor.

25. The Concept of Function: Aspects of Epistemology and Pedagogy, Guershon Harel and Ed Dubinsky, Editors.

26. Statistics for the Twenty-First Century, Florence and Sheldon Gordon, Editors.

27. Resources for Calculus Collection, Volume 1: Learning by Discovery: A Lab Manual for Calculus, Anita E. Solow, Editor.

28. Resources for Calculus Collection, Volume 2: Calculus Problems for a New Century, Robert Fraga, Editor.

29. Resources for Calculus Collection, Volume 3: Applications of Calculus, Philip Straffin, Editor.

30. Resources for Calculus Collection, Volume 4: Problems for Student Investigation, Michael B. Jackson and John R. Ramsay, Editors.

31. Resources for Calculus Collection, Volume 5: Readings for Calculus, Underwood Dudley, Editor.

32. Essays in Humanistic Mathematics, Alvin White, Editor.

33. Research Issues in Undergraduate Mathematics Learning: Preliminary Analyses and Results, James J. Kaput and Ed Dubinsky, Editors.

34. In Eves’ Circles, Joby Milo Anthony, Editor.

35. You’re the Professor, What Next? Ideas and Resources for Preparing College Teachers, The Committee on Preparation for College Teaching, Bettye Anne Case, Editor.

36. Preparing for a New Calculus: Conference Proceedings, Anita E. Solow, Editor.

37. A Practical Guide to Cooperative Learning in Collegiate Mathematics, Nancy L. Hagelgans, Barbara E. Reynolds, SDS, Keith Schwingendorf, Draga Vidakovic, Ed Dubinsky, Mazen Shahin, G. Joseph Wimbish, Jr.

38. Models That Work: Case Studies in Effective Undergraduate Mathematics Programs, Alan C. Tucker, Editor.

39. Calculus: The Dynamics of Change, CUPM Subcommittee on Calculus Reform and the First Two Years, A. Wayne Roberts, Editor.

40. Vita Mathematica: Historical Research and Integration with Teaching, Ronald Calinger, Editor.

41. Geometry Turned On: Dynamic Software in Learning, Teaching, and Research, James R. King and Doris Schattschneider, Editors.

42. Resources for Teaching Linear Algebra, David Carlson, Charles R. Johnson, David C. Lay, A. Duane Porter, Ann E. Watkins, William Watkins, Editors.

43. Student Assessment in Calculus: A Report of the NSF Working Group on Assessment in Calculus, Alan Schoenfeld, Editor.

44. Readings in Cooperative Learning for Undergraduate Mathematics, Ed Dubinsky, David Mathews, and Barbara E. Reynolds, Editors.

45. Confronting the Core Curriculum: Considering Change in the Undergraduate Mathematics Major, John A. Dossey, Editor.

46. Women in Mathematics: Scaling the Heights, Deborah Nolan, Editor.

47. Exemplary Programs in Introductory College Mathematics: Innovative Programs Using Technology, Susan Lenker, Editor.

48. Writing in the Teaching and Learning of Mathematics, John Meier and Thomas Rishel.

49. Assessment Practices in Undergraduate Mathematics, Bonnie Gold, Editor.

50. Revolutions in Differential Equations: Exploring ODEs with Modern Technology, Michael J. Kallaher, Editor.

51. Using History to Teach Mathematics: An International Perspective, Victor Katz, Editor.

52. Teaching Statistics: Resources for Undergraduate Instructors, Thomas J. Moore, Editor.

53. Geometry at Work: Papers in Applied Geometry, Catherine A. Gorini, Editor.

54. Teaching First: A Guide for New Mathematicians, Thomas W. Rishel.

55. Cooperative Learning in Undergraduate Mathematics: Issues That Matter and Strategies That Work, Elizabeth C. Rogers, Barbara E. Reynolds, Neil A. Davidson, and Anthony D. Thomas, Editors.

56. Changing Calculus: A Report on Evaluation Efforts and National Impact from 1988 to 1998, Susan L. Ganter.

These volumes can be ordered from:

MAA Service Center
P.O. Box 91112
Washington, DC 20090-1112
800-331-1MAA
FAX: 301-206-9789


This book is dedicated to the memory of James R. C. Leitzel, who chaired both the Committee on the Teaching of Undergraduate Mathematics (CTUM) (on which Bonnie Gold served) and the Committee on the Undergraduate Program in Mathematics (on which Bill Marion served), and who also co-authored an article with Sandy Keith. Jim dedicated his life to improving the teaching of undergraduate mathematics, and worked on it tirelessly, giving generously of his time to those he worked with, until his untimely death. We hope this book will continue his work.


Preface

This book grew out of a report by a subcommittee of the Committee on the Undergraduate Program in Mathematics, chaired by Bernie Madison, and out of a concern by Sandy Keith that mathematicians take assessment seriously. The introductions by Lynn Steen and Bernie Madison set the context more fully.

Acknowledgements

We would like to acknowledge the support of the institutions at which this book was edited, and of the staff who helped us with the details. In particular, at Wabash College (Bonnie Gold’s former institution), Marcia Labbe wrote all the initial letters inviting contributions, and the computer center staff, particularly Kelly Pflederer and Carl Muma, helped with translating electronic versions of articles in many different formats into usable form. At Valparaiso University, Paula Scott, secretary of the Department of Mathematics and Computer Science, provided Bill Marion with significant word processing help in corresponding with the authors of the articles in the section on assessing the major, especially during the revision stage. We would also like to acknowledge the very helpful suggestions of the Notes subcommittee which worked with us: Nancy Baxter Hastings, Mary Parker, and Anita Solow.


Table of Contents

Introduction: Assessing Assessment, Lynn Arthur Steen
Assessment and the MAA, Bernard L. Madison
How to Use this Book
Articles Arranged by Topic
Teaching Goals Inventory

Part I: Assessing the Major
Introduction, William A. Marion

Portfolios, Capstone Courses, and Focus Groups

Assessing a Major in Mathematics, Laurie Hopkins
Portfolio Assessment of the Major, Linda R. Sons
An Assessment Program Built Around a Capstone Course, Charles Peltier
Using a Capstone Course to Assess a Variety of Skills, Deborah A. Frantz
The Use of Focus Groups Within a Cyclic Assessment Program, Marie P. Sheckels

Comprehensive Examinations

Assessing the Major Via a Comprehensive Senior Examination, Bonnie Gold
A Joint Written Comprehensive Examination to Assess Mathematical Processes and Lifetime Metaskills, G. Daniel Callon
Undergraduate Core Assessment in the Mathematical Sciences, John W. Emert and Charles R. Parish

Overall Programs

Outcomes Assessment in the B.S. Statistics Program at Iowa State University, Richard A. Groeneveld and W. Robert Stephenson
Assessment of the Mathematics Major at Xavier: The First Steps, Janice B. Walker
Assessing Essential Academic Skills from the Perspective of the Mathematics Major, Mark Michael
Department Goals and Assessment, Elias Toubassi

Special Programs Within the Major

Analyzing the Value of a Transitional Mathematics Course, Judith A. Palagallo and William A. Blue
Assessing the Major: Serving the Needs of Students Considering Graduate School, Deborah Bergstrand

Part II: Assessment in the Individual Classroom
Introduction, Bonnie Gold

Testing and Grading

What Happened to Tests? Sharon Cutler Ross
Interactive E-Mail Assessment, Michael D. Fried
Flexible Grade Weightings, William E. Bonnice

Classroom Assessment Techniques (A La Angelo & Cross)

The One-Minute Paper, David M. Bressoud
Concept Maps, Dwight Atkins
If You Want to Know What Students Understand, Ask Them! John Koker
Improving Classes with Quick Assessment Techniques, John W. Emert
Creating a Professional Environment in the Classroom, Sandra Z. Keith

Reviewing Before Examinations

True or False? Explain! Janet Heine Barnett
Define, Compare, Contrast, Explain..., Joann Bossenbroek
Student-Created Problems Demonstrate Knowledge and Understanding, Agnes M. Rash

What Do Students Really Understand?

In-Depth Interviews to Understand Student Understanding, M. Kathleen Heid

Projects and Writing to Learn Mathematics

Assessing Expository Mathematics, Annalisa Crannell
Assessing Modeling Projects in Calculus and Precalculus: Two Approaches, Charles E. Emenaker
Assessment During Complex, Problem-Solving, Group Work in Class, Brian J. Winkel
Student Assessment Through Portfolios, Alan P. Knoerr and Michael A. McDonald
Using Writing to Assess Understanding of Calculus Concepts, Dorothee Jane Blum
Journals: Assessment Without Anxiety, Alvin White
Assessing General Education Mathematics Through Writing and Questions, Patricia Clark Kenschaft

Cooperative Groups and Problem-Centered Methods

Combining Individual and Group Evaluations, Nancy L. Hagelgans
Group Testing, Catherine A. Roberts
Continuous Evaluation Using Cooperative Learning, Carolyn W. Rouviere
Collaborative Oral Take-Home Exams, Annalisa Crannell
Assessment in a Problem-Centered College Mathematics Course, Sandra Davis Trowell and Grayson H. Wheatley

Special Need Students

Assessing the Learning of Female Students, Regina Brunner
Strategies to Assess the Adult Learner, Jacqueline Brannon Giles

Assessing the Course as a Whole

The Class Mission Statement, Ellen Clay
Early In-Course Assessment of Faculty by Students, William E. Bonnice
Student Feedback Teams in a Mathematics Classroom, David Lomen
Early Student Feedback, Patricia Shure
Friendly Course Evaluations, Janet Heine Barnett
Course Portfolio in Mathematics, Steven R. Dunbar

Part III: Departmental Assessment Initiatives
Introduction, Sandra Z. Keith

Placement and Advising

Administering a Placement Test: St. Olaf College, Judith N. Cederberg
A Mathematics Placement and Advising Program, Donna Krawczyk and Elias Toubassi
A Comprehensive Advising Program, Stephen A. Doblin and Wallace C. Pye

General Education and Quantitative Literacy Programs

A Quantitative Literacy Program, Linda R. Sons
Creating a General Education Course: A Cognitive Approach, Albert D. Otto, Cheryl A. Lubinski, and Carol T. Benson
Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning, Mark Michael
Coming to Terms with Quantitative Literacy in General Education: or, the Uses of Fuzzy Assessment, Philip Keith
Does Developmental Mathematics Work? Eileen L. Poiani

Mathematics in a Service Role

Let Them Know What You’re Up to, Listen to What They Say, J. Curtis Chipman
Have Our Students with Other Majors Learned the Skills They Need? William O. Martin and Steven F. Bauman
A TEAM Teaching Experience in Mathematics/Economics, Marilyn L. Repsher and J. Rody Borg
Factors Affecting the Completion of Undergraduate Degrees in Science, Engineering, and Mathematics for Underrepresented Minority Students: The Senior Bulge Study, Martin Vern Bonsangue
Evaluating the Effects of Reform, Richard West
A Comprehensive, Pro-Active Assessment Program, Robert Olin and Lin Scruggs

Assessing Reform Courses

Assessment in One Learning Theory Based Approach to Teaching, Ed Dubinsky
An Evaluation of Calculus Reform: A Preliminary Report of a National Study, Sue Ganter
Increasing the Dialogue About Calculus with a Questionnaire, A. Darien Lauten, Karen Graham, and Joan Ferrini-Mundy
Workshop Calculus: Assessing Student Attitudes and Learning Gains, Nancy Baxter Hastings
Does Calculus Reform Work? Joel Silverberg
Assessing the Effectiveness of Innovative Educational Reform Efforts, Keith E. Schwingendorf
The Evaluation of Project CALC at Duke University, 1989–1994, Jack Bookman and Charles P. Friedman

Part IV: Assessing Teaching
Introduction, William A. Marion

Departmental Assistance in Formative Assessment of Teaching, Alan P. Knoerr, Michael A. McDonald, and Rae McCormick
Assessing the Teaching of College Mathematics Faculty, C. Patrick Collier
Using Video and Peer Feedback to Improve Teaching, Joel David Hamkins
Exchanging Class Visits: Improving Teaching for Both Junior and Senior Faculty, Deborah Bergstrand
Peer Review of Teaching, Pao-sheng Hsu

APPENDIX: Reprint of “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics”

Suggestions for Further Reading
Bibliography
Addresses of Authors

ASSESSING ASSESSMENT

Lynn Arthur Steen
St. Olaf College

We open letters from the city assessor with trepidation since we expect to learn that our taxes are about to go up. Mathematicians typically view academic assessment with similar emotion. Some react with indifference and apathy, others with suspicion and hostility. Virtually no one greets a request for assessment with enthusiasm. Assessment, it often seems, is the academic equivalent of death and taxes: an unavoidable waste.

In ancient times, an assessor (from ad + sedere) was one who sat beside the sovereign to provide technical advice on the value of things that were to be taxed. Only tax collectors welcomed assessors. Tradition, self-interest, and common sense compel faculty to resist assessment for many of the same reasons that citizens resist taxes.

Yet academic sovereigns (read: administrators) insist on assessment. For many reasons, both wise and foolish, administrators feel compelled to determine the value of things. Are students learning what they should? Do they receive the education they have been promised? Do our institutions serve well the needs of all students? Are parents and the public receiving value for their investment in education? Are educational programs well suited to the needs of students? Do program benefits justify costs? Academic sovereigns ask these questions not to impose taxes but to determine institutional priorities and allocate future resources.

What we assess defines what we value [22]. Students’ irreverent questions (“Will it be on the test?”) signal their understanding of this basic truth. They know, for example, that faculty who assess only calculation do not really value understanding. In this respect, mathematics faculty are not unlike their students: while giving lip service to higher goals, both faculty and students are generally satisfied with evidence of routine performance. Mathematics departments commonly claim to want their majors to be capable of solving real-world problems and communicating mathematically. Yet these goals ring hollow unless students are evaluated by their ability to identify and analyze problems in real-world settings and communicate their conclusions to a variety of audiences. Assessment not only places value on things, but also identifies the things we value.

In this era of accountability, the constituencies of educational assessment are not just students, faculty, and administrators, but also parents, legislators, journalists, and the public. For these broader audiences, simple numerical indicators of student performance take on totemic significance. Test acronyms (SAT, TIMSS, NAEP, AP, ACT, GRE) compete with academic subjects (mathematics, science, history) as the public vocabulary of educational discourse. Never mind that GPA is more a measure of student compliance than of useful knowledge, or that SAT scores reflect relatively narrow test-taking abilities. These national assessments have become, in the public eye, surrogate definitions of education. In today’s assessment-saturated environment, mathematics is the mathematics that is tested.

College Mathematics

In most colleges and universities, mathematics is arguably the most critical academic program. Since students in a large majority of degree programs and majors are required to take (or test out of) courses in the mathematical sciences, on most campuses mathematics enrollments are among the highest of any subject. Yet for many reasons, the withdrawal and failure rates in mathematics courses are higher than in most other courses. The combination of large enrollments and high failure rates makes mathematics departments responsible for more student frustration — and dropout — than any other single department.



What’s more, in most colleges and universities mathematics is the most elementary academic program. Despite mathematics’ reputation as an advanced and esoteric subject, the average mathematics course offered by most postsecondary institutions is at the high-school level. Traditional postsecondary level mathematics — calculus and above — accounts for less than 30% of the 3.3 million mathematical science enrollments in American higher education [8].

Finally, in most colleges and universities, mathematics is the program that serves the most diverse student needs. In addition to satisfying ordinary obligations of providing courses for general education and for mathematics majors, departments of mathematical sciences are also responsible for developmental courses for students with weak mathematics backgrounds; for service courses for programs ranging from agriculture to engineering and from business to biochemistry; for the mathematical preparation of prospective teachers in elementary, middle, and secondary schools; for research experiences to prepare interested students for graduate programs in the mathematical sciences; and, in smaller institutions, for courses and majors in statistics, computer science, and operations research.

Thus the spotlight of educational improvement often falls first and brightest on mathematics. In the last ten years alone, new expectations have been advanced for school mathematics [13], for college mathematics below calculus [1], for calculus [3, 14, 16], for statistics [6], for undergraduate mathematics [17], for departmental goals [10], and for faculty rewards [7]. Collectively, these reports convey new values for mathematics education that focus departments more on student learning than on course coverage; more on student engagement than on faculty presentation; more on broad scholarship than on narrow research; more on context than on techniques; more on communication than on calculation. In short, these reports stress mathematics for all rather than mathematics for the few, or (to adopt the slogan of calculus reform) mathematics as “a pump, not a filter.”

Principles of Assessment

Assessment serves many purposes. It is used, among other things, to diagnose student needs, to monitor student progress, to give students grades, to judge teaching effectiveness, to determine raises and promotions, to evaluate curricula and programs, and to decide on allocation of resources. During planning (of courses, programs, curricula, majors) assessment addresses the basic questions of why, who, what, how, and when. In the thick of things (in mid-course or mid-project) so-called formative assessment monitors implementation (is the plan going as expected?) and progress (are students advancing adequately?). At the summative stage — which may be at the end of a class period, or of a course, or of a special project — assessment seeks to record impact (both intended and unintended), to compare outcomes with goals, to rank students, and to stimulate action either to modify, extend, or replicate.

Several years ago a committee of the Mathematical Association of America undertook one of the very first efforts in higher education to comprehend the role of assessment in a single academic discipline [2, 9]. Although this committee focused on assessing the mathematics major, its findings and analyses apply to most forms of assessment. The committee’s key finding is that assessment, broadly defined, must be a cyclic process of setting goals, selecting methods, gathering evidence, drawing inferences, taking action, and then reexamining goals and methods. Assessment is the feedback loop of education. As the system of thermostat, furnace, and radiators can heat a house, so a similar assessment system of planning, instruction, and evaluation can help faculty develop and provide effective instructional programs. Thus the first principle: Assessment is not a single event, but a continuous cycle.

The assessment cycle begins with goals. If you want heat, then you must measure temperature. On the other hand, if it is humidity that is needed, then a thermostat won’t be of much use. Thus one of the benefits of an assessment program is that it fosters — indeed, necessitates — reflection on program and course goals. In his influential study of scholarship for the Carnegie Foundation, Ernest Boyer identified reflective critique as one of the key principles underlying assessment practices of students, faculty, programs, and higher education [5]. Indeed, unless linked to an effective process of reflection, assessment can easily become what many faculty fear: a waste of time and effort.

But what if the faculty want more heat and the students need more humidity? How do we find that out if we only measure the temperature? It is not uncommon for mathematics faculty to measure success in terms of the number of majors or the number of graduates who go to graduate school, while students, parents, and administrators may look more to the support mathematics provides for other subjects such as business and engineering. To ensure that goals are appropriate and that faculty expectations match those of others with stakes in the outcome, the assessment cycle must from the beginning involve many constituencies in helping set goals. Principle two: Assessment must be an open process.

Almost certainly, a goal-setting process that involves diverse constituencies will yield different and sometimes incompatible goals. It is important to recognize the value of this variety and not expect (much less force) too much uniformity. The individual backgrounds and needs of students make it clear that uniform objectives are not an important goal of mathematics assessment programs. Indeed, consensus does not necessarily yield strength if it masks important diversity of goals.

The purpose of assessment is to gather evidence in order to make improvements. If the temperature is too low, the thermostat turns on the heat. The attribution of cause (lack of heat) from evidence (low temperature) is one of the most important and most vexing aspects of assessment. Perhaps the cause of the drop in temperature is an open window or door, not lack of heat from the furnace. Perhaps the cause of students’ inability to apply calculus in their economics courses is that they don’t recognize it when the setting has changed, not that they have forgotten the repertoire of algorithms. The effectiveness of actions taken in response to evidence depends on the validity of inferences drawn about causes of observed effects. Yet in assessment, as in other events, the more distant the effect, the more difficult the attribution. Thus principle three: Assessment must promote valid inferences.

Compared to assessing the quality of education, taking the temperature of a home is trivial. Even though temperature does vary slightly from floor to ceiling and feels lower in moving air, it is fundamentally easy to measure. Temperature is one-dimensional, it changes slowly, and common measuring instruments are relatively accurate. None of this is true of mathematics. Mathematical performance is highly multidimensional and varies enormously from one context to another. Known means of measuring mathematical performance are relatively crude — either simple but misleading, or insightful but forbiddingly complex.

Objective tests, the favorite of politicians and parents, atomize knowledge and ignore the interrelatedness of concepts. Few questions on such tests address higher level thinking and contextual problem solving — the ostensible goals of education. Although authentic assessments that replicate real challenges are widely used to assess performance in music, athletics, and drama, they are rarely used to assess mathematics performance. To be sure, performance testing is expensive. But the deeper reason such tests are used less for formal assessment in mathematics is that they are perceived to be less objective and more subject to manipulation.

The quality of evidence in an assessment process is of fundamental importance to its value and credibility. The integrity of assessment data must be commensurate with the possible consequences of their use. For example, informal comments from students at the end of each class may help an instructor refine the next class, but such comments have no place in an evaluation process for tenure or promotion. Similarly, standardized diagnostic tests are helpful to advise students about appropriate courses, but are inappropriate if used to block access to career programs. There are very few generalizations about assessment that hold up under virtually all conditions, but this fourth principle is one of them: Assessment that matters should always employ multiple measures of performance.

Mathematics assessment is of no value if it does not measure appropriate goals — the mathematics that is important for today and tomorrow [11, 12]. It needs to penetrate the common facade of thoughtless mastery and inert ideas. Rhetorical skill with borrowed ideas is not evidence of understanding, nor is facility with symbolic manipulation evidence of useful performance [21]. Assessment instruments in mathematics need to measure all standards, including those that call for higher order skills and contextual problem solving. Thus the content principle: Assessment should measure what is worth learning, not just what is easy to measure.

The goal of mathematics education is not to equip all students with identical mathematical tool kits but to amplify the multiplicity of student interests and forms of mathematical talent. As mathematical ability is diverse, so must be mathematics instruction and assessment. Any assessment must pass muster in terms of its impact on various subpopulations — not only for ethnic groups, women, and social classes, but also for students of different ages, aspirations (science, education, business), and educational backgrounds (recent or remote, weak or strong).

As the continuing national debate about the role of the SAT exam illustrates, the impact of high stakes assessments is a continuing source of deep anxiety and anger over issues of fairness and appropriate use. Exams whose items are psychometrically unbiased can nevertheless result in unbalanced impact because of the context in which they are given (e.g., to students of uneven preparation) or the way they are used (e.g., to award admissions or scholarships). Inappropriate use can and does amplify bias arising from other sources. Thus a final principle, perhaps the most important of all, echoing recommendations put forward by both the Mathematical Sciences Education Board [11] and the National Council of Teachers of Mathematics [12]: Assessment should support every student’s opportunity to learn important mathematics.

Implementations of Assessment

In earlier times, mathematics assessment meant mostly written examinations — often just multiple choice tests. It still means just that for high-stakes school mathematics assessment (e.g., NAEP, SAT), although the public focus on standardized exams is much less visible (but not entirely absent) in higher education. A plethora of other methods, well illustrated in this volume, enhance the options for assessment of students and programs at the postsecondary level:

• Capstone courses that tie together different parts of mathematics;

• Comprehensive exams that examine advanced parts of a student’s major;

• Core exams that cover what all mathematics majors have in common;

• Diagnostic exams that help identify students’ strengths and weaknesses;

• External examiners who offer independent assessments of student work;

• Employer advisors to ensure compatibility of courses with business needs;

• Feedback from graduates concerning the benefits of their major program;

• Focus groups that help faculty identify patterns in student reactions;

• Group projects that engage student teams in complex tasks;

• Individual projects which lead to written papers or oral presentations;

• Interviews with students to elicit their beliefs, understandings, and concerns;

• Journals that reveal students’ reactions to their mathematics studies;

• Oral examinations in which faculty can probe students’ understanding;

• Performance tasks that require students to use mathematics in context;

• Portfolios in which students present examples of their best work;

• Research projects in which students employ methods from different courses;

• Samples of student work performed as part of regular course assignments;

• Senior seminars in which students take turns presenting advanced topics;

• Senior theses in which students prepare a substantial written paper in their major;

• Surveys of seniors to reveal how they feel about their studies;

• Visiting committees to periodically assess program strengths and weaknesses.

This multitude of assessment methods provides options for many purposes — from student placement and grading to course revisions and program review. Tests and evaluations are central to instruction and inevitably shine a spotlight (or cast a shadow) on students’ work. Broader assessments provide summative judgments about a student’s major and about departmental (or institutional) effectiveness. Since assessments are often preludes to decisions, they not only monitor standards, but also set them.

Yet for many reasons, assessment systems often distort the reality they claim to reflect. Institutional policies and funding patterns often reward delaying tactics (e.g., by supporting late summative evaluation in preference to timely formative evaluation) or encourage a facade of accountability (e.g., by delegating assessment to individuals who bear no responsibility for instruction). Moreover, instructors or project directors often unwittingly disguise advocacy as assessment by slanting the selection of evaluation criteria. Even external evaluators often succumb to promotional pressure to produce overly favorable evaluations.

Other traps arise when the means of assessment do not reflect the intended ends. Follow-up ratings (e.g., course evaluations) measure primarily student satisfaction, not course effectiveness; statements of needs (from employers or client departments) measure primarily what people think they need, not what they really need; written examinations reveal primarily what students can do with well-posed problems, not whether they can use mathematics in external contexts. More than almost anything else a mathematician engages in, assessment provides virtually unlimited opportunities for meaningless numbers, self-delusion, and unsubstantiated inferences. Several reports [e.g., 15, 18, 19] offer informative maps for navigating these uncharted waters.

Assessment is sometimes said to be a search for footprints, for identifying marks that remain visible for some time [4]. Like detectives seeking evidence, assessors attempt to determine where evidence can be found, what marks were made, who made them, and how they were made. Impressions can be of varying depths, more or less visible, more or less lasting. They depend greatly on qualities of the surfaces on which they fall. Do these surfaces accept and preserve footprints? Few surfaces are as pristine as fresh sand at the beach; most real surfaces are scuffed and trammeled. Real programs rarely leave marks as distinguishing or as lasting as a fossil footprint.

Nevertheless, the metaphor of footprints is helpful in understanding the complexity of assessing program impact. What are the footprints left by calculus? They include cognitive and attitudinal changes in students enrolled in the class, but also impressions and reputations passed on to roommates, friends, and parents. They also include changes in faculty attitudes about student learning and in the attitudes of client disciplines towards mathematics requirements [20]. But how much of the calculus footprint is still visible two or three years later when a student enrolls in an economics or business course? How much, if any, of a student’s analytic ability on the law boards can be traced to his or her calculus experience? How do students’ experiences in calculus affect the interests or enthusiasm of younger students who are a year or two behind? The search for calculus footprints can range far and wide, and need not be limited to course grades or final exams.

In education as in industry, assessment is an essential tool for improving quality. The lesson learned by assessment pioneers and reflected in the activities described in this volume is that assessment must be broad, flexible, diverse, and suited to the task. Those responsible for assessment (faculty, department chairs, deans, and provosts) need to constantly keep several questions in the forefront of their analysis:

• Are the goals clear and is the assessment focused on these goals?

• Who has a voice in setting goals and in determining the nature of the assessment?

• Do the faculty ground assessment in relevant research from the professional literature?

• Have all outcomes been identified — including those that are indirect?

• Are the means of assessment likely to identify unintended outcomes?

• Is the mathematics assessed important for the students in the program?

• In what contexts and for which students is the program particularly effective?

• Does the assessment program support development of faculty leadership?

• How are the results of the assessment used for improving education?

Readers of this volume will find within its pages dozens of examples of assessment activities that work for particular purposes and in particular contexts. These examples can enrich the process of thoughtful, goal-oriented planning that is so important for effective assessment. No single system can fit all circumstances; each must be constructed to fit the unique goals and needs of particular programs. But all can be judged by the same criteria: an open process, beginning with goals, that measures and enhances students’ mathematical performance; that draws valid inferences from multiple instruments; and that is used to improve instruction for all students.

References

[1] American Mathematical Association of Two-Year Colleges. Crossroads in Mathematics: Standards for Introductory College Mathematics Before Calculus, American Mathematical Association of Two-Year Colleges, Memphis, TN, 1995.

[2] Committee on the Undergraduate Program in Mathematics (CUPM). “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics,” Focus: The Newsletter of the Mathematical Association of America, 15 (3), June 1995, pp. 24–28; reprinted in this volume, pp. 279–284.

[3] Douglas, R.G., ed. Toward a Lean and Lively Calculus, Mathematical Association of America, Washington, DC, 1986.

[4] Frechtling, J.A. Footprints: Strategies for Non-Traditional Program Evaluation, National Science Foundation, Washington, DC, 1995.

[5] Glassick, C.E., et al. Scholarship Assessed: Evaluation of the Professoriate, Carnegie Foundation for the Advancement of Teaching, Jossey-Bass, San Francisco, CA, 1997.

[6] Hoaglin, D.C. and Moore, D.S., eds. Perspectives on Contemporary Statistics, Mathematical Association of America, Washington, DC, 1992.

[7] Joint Policy Board for Mathematics. Recognition and Rewards in the Mathematical Sciences, American Mathematical Society, Providence, RI, 1994.

[8] Loftsgaarden, D.O., Rung, D.C., and Watkins, A.E. Statistical Abstract of Undergraduate Programs in the Mathematical Sciences in the United States: Fall 1995 CBMS Survey, Mathematical Association of America, Washington, DC, 1997.

[9] Madison, B. “Assessment of Undergraduate Mathematics,” in L.A. Steen, ed., Heeding the Call for Change: Suggestions for Curricular Action, Mathematical Association of America, Washington, DC, 1992, pp. 137–149.

[10] Mathematical Sciences Education Board. Moving Beyond Myths: Revitalizing Undergraduate Mathematics, National Research Council, Washington, DC, 1991.

[11] Mathematical Sciences Education Board. Measuring What Counts: A Conceptual Guide for Mathematics Assessment, National Research Council, Washington, DC, 1993.

[12] National Council of Teachers of Mathematics. Assessment Standards for School Mathematics, National Council of Teachers of Mathematics, Reston, VA, 1995.

[13] National Council of Teachers of Mathematics. Curriculum and Evaluation Standards for School Mathematics, National Council of Teachers of Mathematics, Reston, VA, 1989.

[14] Roberts, A.W., ed. Calculus: The Dynamics of Change, Mathematical Association of America, Washington, DC, 1996.

[15] Schoenfeld, A. Student Assessment in Calculus, Mathematical Association of America, Washington, DC, 1997.

[16] Steen, L.A., ed. Calculus for a New Century: A Pump, Not a Filter, Mathematical Association of America, Washington, DC, 1988.

[17] Steen, L.A., ed. Reshaping College Mathematics, Mathematical Association of America, Washington, DC, 1989.

[18] Stenmark, J.K., ed. Mathematics Assessment: Myths, Models, Good Questions, and Practical Suggestions, National Council of Teachers of Mathematics, Reston, VA, 1991.

[19] Stevens, F., et al. User-Friendly Handbook for Project Evaluation, National Science Foundation, Washington, DC, 1993.

[20] Tucker, A.C. and Leitzel, J.R.C. Assessing Calculus Reform Efforts, Mathematical Association of America, Washington, DC, 1995.

[21] Wiggins, G. “A True Test: Toward More Authentic and Equitable Assessment,” Phi Delta Kappan, May 1989, pp. 703–713.

[22] Wiggins, G. “The Truth May Make You Free, but the Test May Keep You Imprisoned: Toward Assessment Worthy of the Liberal Arts,” The AAHE Assessment Forum, 1990, pp. 17–31. (Reprinted in Steen, L.A., ed., Heeding the Call for Change: Suggestions for Curricular Action, Mathematical Association of America, Washington, DC, 1992, pp. 150–162.)

ASSESSMENT AND THE MAA

Bernard L. Madison
University of Arkansas, Fayetteville

In August 1990 at the Joint Mathematics Meetings in Columbus, Ohio, the Subcommittee on Assessment of the Mathematical Association of America’s (MAA) Committee on the Undergraduate Program in Mathematics (CUPM) held its organizational meeting. I was subcommittee chair and, like my colleagues who were members, knew little about the topic we were to address. The impetus for assessment of student learning was from outside our discipline, from accrediting agencies and governing boards, and even the vocabulary was alien to most of our mathematics community.

We considered ourselves challenging as teachers and as student evaluators. We used high standards and rigorous tests in our courses. What else could assessment be? We were suspicious of evaluating student learning through group work, student-compiled portfolios, and opinion surveys. And we were uncertain about evaluating programs and curricula using data about student learning gathered in these unfamiliar ways. Although many of us believed that testing stimulated learning, we were not prepared to integrate more complex assessment schemes into our courses, curricula, and other departmental activities.

We began learning about assessment. In its narrowest sense, our charge was to advise the MAA membership on the assessment of student learning in the mathematics major for the purpose of improving programs. Sorting out distinctions in the meaning of testing, student evaluations, program evaluations, assessment, and other recurring words and phrases was challenging, though easily agreed to by committee members accustomed to precision in meaning. We were to discover that assessment of student learning in a multi-course program was familiar to us, but not part of most of our departments’ practices in undergraduate programs. In fact, departments and faculties had confronted a similar scenario in implementing a placement scheme for entering freshman students. The most comprehensive placement schemes used a variety of data about students’ capabilities in pre-college mathematics to place them in the appropriate college course. However, many placement schemes were influenced by the practices of traditional testing in undergraduate courses and relied on a single measurement tool, a test, often multiple-choice.

Another place where we had used a multi-faceted scheme for assessment of student learning was in our graduate programs, particularly the doctoral programs. Individual course grades are much less critical and meaningful in a doctoral program, where assessment of learning relies heavily on comprehensive examinations, interviews, presentations, and an unarticulated portfolio of interaction between the student and the graduate faculty. And, finally, there is the major capstone experience, the dissertation.

The subcommittee decided to draft a document that would outline what a program of assessment of student learning should be, namely a cycle of setting learning goals, designing instructional strategies, determining assessment methods, gathering data, and using the results to improve the major. Because of a lack of research-based information about how students learn mathematics, we decided not to try to address what specific tools measure what aspects of learning.

By 1993 we had a draft document circulating in the mathematics community asking for feedback. The draft presented a rather simple cyclical process with lists of options for learning goals, instructional strategies, and assessment tools. Some were disappointed that the draft did not address the more complex learning issues, while others failed to find the off-the-shelf scheme they thought they wanted. This search for an off-the-shelf product was similar to the circumstances in the 70s and 80s with placement schemes and, as then, reflected the lack of interest in ownership of these processes by many mathematics faculty members.



In spite of the suspicions and disinterest of many in our mathematics community, the discipline of mathematics was further along than most collegiate disciplines in addressing assessment of student learning. Through attendance at and participation in national conferences on assessment, we soon learned that we were not alone in our confusion and that the rhetoric about assessment was fuzzy, unusually redundant, and overly complicated.

The draft document, in general, received positive reviews, while eliciting minimal suggestions for improvement. Departmental faculties faced with a mandate of implementing an assessment program were generally pleased to have a simple skeletal blueprint. Feedback did improve the document, which was published by the MAA in the June 1995 issue of Focus [1].

After publication of the report to the MAA membership in 1995, the subcommittee had one unfinished piece of business, compiling and distributing descriptions of specific assessment program experiences by departmental faculties. We had held one contributed paper session at the 1994 Joint Mathematics Meetings and another was scheduled in January 1996, to be organized by subcommittee members Barbara Faires and Bill Marion.

As a result of the contributed paper sessions, we concluded that the experience within departments was limited, but by 1996, there were encouraging reports. At the 1996 meetings Bill Marion, Bonnie Gold, and Sandra Keith agreed to undertake the compilation the subcommittee had planned and to address a broader range of assessment issues in a volume aimed for publication by the MAA. This volume is the result.

The articles on assessing the major in the volume make at least three points. First, the experiences related here show that attitudes have changed over the past seven years about various instructional and assessment strategies. Group work, comprehensive exams, focus groups, capstone courses, surveys, and portfolios are now widely discussed and used. This change has not been due to the assessment movement alone. Reform in teaching and learning, most notably in calculus over the past decade, has promoted these non-traditional teaching and learning methods, and, as a result, has called for new assessment strategies. The confluence of pressures for change has made these experiences more common and less alien.

Second, the articles show that our experience is still too limited to allow documentation of significant changes in learning. As the articles indicate, the symptoms are encouraging, but not yet conclusive.

Third, the articles make it clear that developing an assessment program requires intense faculty involvement at the local level and a strong commitment by individual faculty leaders. The variety of experiences described in the section on assessing the major spans several non-traditional methodologies: portfolios, capstone courses, comprehensive examinations, focus groups, and surveys of graduates and employers. The variety enriches the volume considerably.

Some who read this volume may be disappointed by again not being led to a recipe or an off-the-shelf program of assessment. Only after one investigates the complexity of establishing a promising assessment program will one fully appreciate the work of those who are relating their experiences here and those who have compiled and edited this volume. This volume contributes significantly to an ongoing process of improving teaching and learning in collegiate mathematics. No doubt, someday it will be valuable only as a historical document as our experiences grow and we understand better how students learn and how to measure that learning more effectively.

But, for now, this is a valuable volume recounting well-thought-out experiences involving teachers and students engaged in learning together. The volume is probably just in time as well: too late for the experiences to be mistaken for off-the-shelf recipes, but in time to share ideas to make assessment programs better.

Reference

[1] Committee on the Undergraduate Program in Mathematics (CUPM). “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics,” Focus: The Newsletter of the Mathematical Association of America, 15 (3), June 1995, pp. 24–28. Reprinted on p. 313 of this volume.

HOW TO USE THIS BOOK

Lynn Steen has set the stage for this book by describing, in the preface, the pressures that bring many mathematicians to seek help with assessment, as well as some of the resistance to assessment that many of us in mathematics may feel. “Assessment” to many is mere jargon; it’s a code word for more administrative red tape. Some fear that it will be used to push us to lower our expectations of our students, to lower our “standards.” We hope our book will help allay some of these fears, and will go beyond that to encourage you, the reader, to view assessment as a natural part of the process of creating a positive and exciting learning environment, rather than as a duty inflicted from outside.

Many mathematicians are already taking seriously the challenge of finding better methods of assessment. What we offer here is essentially an album of techniques, from a wide assortment of faculty across the nation. We, as editors of this collection, are amazed at the energy of the individuals and schools represented here, all of whom care deeply about learning. We delight in their successes, insights, inventiveness, and the sheer combined diversity of their efforts, and we appreciate the frankness with which they have been willing to share the down-side of what they’ve tried. Their schools are in the midst of experiencing change. These are contributors who are courageous in experimenting with assessment and generous in their willingness to share with you ideas which are still in the process of development. Yet this book is not premature: good assessment is a cyclic process, and is never “finished.”

Take a look at the table of contents. This is a book for browsing, not to be approached as a novel; nor is it merely an encyclopedia for reference. Skim over the topics until you find one that appeals to you — and we guarantee you will find something here that intrigues. Perhaps it will be something, such as capstone courses, that you have wondered about, or an in-class writing assignment that you have tried yourself. Who you are will affect how you approach this book. The chair of a department in a liberal arts college, faced with developing a departmental assessment plan, may turn first to the section on assessing the major, and be curious about the assessment of general education courses or placement programs. The chair of a department at a comprehensive university might view the assessment of a department’s role on campus as most important. An individual faculty member, on the other hand, might find immediately useful the section on classroom assessment. We hope, above all, to draw you into seeing that assessment is conducted in many different ways, and that it pertains to almost everything we do as teachers.

To make the book easy to use, each article follows the same basic format. At the top of each article, there is a brief description, to help you decide whether it is the article you were expecting from its title. The article begins with a “Background and Purpose” section, which gives the context (type of institution, etc.) of the assessment method being described, and some history of the author’s use of the technique. “Method” describes the technique in sufficient detail, we hope, for you to begin picturing how you might adapt it to your situation. “Findings” and “Use of Findings” describe the results the author has experienced, and how these can be used to improve student learning. At the end of each article, “Success Factors” cautions the readers on potential pitfalls, or explains what makes the method tick.

Each author writes from his or her personal perspective. What our authors present, then, may not translate instantly from one institution to another. However, our hope is that with the sheer diversity of institutions represented (ranging from 2-year colleges to research universities) you will find something that attracts your interest or inspires you. Make it a personal book. Because of the wide variety of perspectives and philosophies of our authors, some of the ideas presented here you will take delight in, and others you will dislike. As editors, we found ideas which we wanted to try immediately in our own institutions, as did the MAA Notes Committee as they reviewed the manuscript. However, after reading an article which interests you, ask yourself: “Since the author’s students are stronger/weaker, harder working/lazier, more advanced/less sophisticated than mine: how can I make this work for my class? Exactly what questions should I ask? How will I collect and analyze the data? How will I use the results?” You may even decide to contact the author for help as questions arise; authors’ addresses are at the end of the book. Then try the activity on yourself. What is your response? You may also want to try it, if appropriate, on a colleague, to get feedback on how the questions are heard by someone else. Be prepared, however, for very different responses from the students, as they bring their own level of sophistication to the process. And be sure to let students know that the purpose of the activity is to improve what they get out of the program, and that their input is essential to this process. Then, after using the technique once, revise it. Or look for another technique which better focuses on the question you’re concerned with.

Keep in mind that assessment must be cyclic. The cycle includes deciding what the goals of the activity are, figuring out what methods will allow you to reach those goals, deciding on a method of assessment which will let you know how well you have met the goals, administering the assessment instrument, compiling the data, and reporting the results. Then you return to the beginning: deciding whether the goals need revision given your new understanding of what is happening, what revision of methods will be needed, etc. Throughout the process, students should be kept informed of your reasons for doing the assessment, and of the findings and how they will be used. Of course, unless it’s a completely new activity being undertaken, you will probably start somewhere in mid-cycle—you’re already teaching this course, or offering this major, and had some goals in mind when you started, though these may have receded into the mists of the past. However, keeping the assessment cycle in mind will improve progress toward the ultimate goal, improving student learning.

We have tried both to offer as wide a variety of assessment methods as possible, and to show their interrelation, so that it is easy to move from one technique to another. We’ve done this in several ways. In this introductory material, there is a chart of which topics occur in which articles, so that if, for example, you’re looking for assessment via portfolios, you can look down the chart to see which articles describe their use. In the introduction to each section of the book, we’ve tried to show how the various articles in the section are related to each other. Finally, at the end of the book is a list of other books you may want to look at, which aren’t specific to mathematics but which offer ideas that can be adapted to mathematics programs.

We must warn you what our book cannot achieve for you. Although some individual pieces may give you the flavor of research in several directions, this is not a collection of research articles in assessment in undergraduate mathematics. Nor is this book a collection of recipes ready-made to use. There is no one overarching theoretical view driving these articles, although most do assume that students learn better when active rather than passive. But all authors are concerned with improving student learning.

As the mathematical community as a whole has only recently begun to think seriously about assessment, we recognize that this book is only a beginning. We hope that others will take up ideas offered here, and look into them in greater depth. We offer you not only a compendium of ideas, but a source of individuals with whom you might correspond to learn more or develop collaborations. We hope that in several years, there will be a greater variety of methods, many of which will have gone through several full assessment cycles. Once this has happened, a more definitive volume can be written. But the book we have brought together here could not afford to come any later; at this stage, information and inspiration are needed above all.

Bonnie Gold, Monmouth University
Sandra Z. Keith, St. Cloud State University
William A. Marion, Valparaiso University

ARTICLES ARRANGED BY TOPIC

Advising, Placement

Administering a Placement Test: St. Olaf College ....... 178
A Comprehensive Advising Program ....... 184
Creating a Professional Environment in the Classroom ....... 98
Factors Affecting the Completion of Undergraduate Degrees in Science, Engineering, and Mathematics for Underrepresented Minority Students: The Senior Bulge Study ....... 216
Let Them Know What You’re Up to, Listen to What They Say ....... 205
A Mathematics Placement and Advising Program ....... 181

Assessing Reform Courses

Assessing the Effectiveness of Innovative Educational Reform Efforts ....... 249
Assessing Student Attitudes and Learning Gains ....... 241
Assessment in One Learning Theory Based Approach to Teaching ....... 229
Does Calculus Reform Work? ....... 245
Evaluating the Effects of Reform ....... 219
An Evaluation of Calculus Reform: A Preliminary Report of a National Study ....... 233
The Evaluation of Project CALC at Duke University, 1989–1994 ....... 253
Increasing the Dialogue About Calculus with a Questionnaire ....... 237
Let Them Know What You’re Up To, Listen To What They Say ....... 205

Assigning Grades

Assessing Expository Mathematics ....... 113
Assessing Modeling Projects in Calculus and Precalculus: Two Approaches ....... 116
Combining Individual and Group Evaluations ....... 134
Flexible Grade Weightings ....... 84

Concept Development

Assessing General Education Mathematics Through Writing and Questions ....... 131
Assessing Learning of Female Students ....... 149
Concept Maps ....... 89
Continuous Evaluation Using Cooperative Learning ....... 140
Define, Compare, Contrast, Explain ... ....... 104
Group Testing ....... 137
If You Want to Know What Students Understand, Ask Them ....... 91
In-Depth Interviews to Understand Student Understanding ....... 109
Interactive E-Mail Assessment ....... 80
Student-Created Problems Demonstrate Knowledge and Understanding ....... 106
The One-Minute Paper ....... 87
True or False? Explain! ....... 101
Using Writing to Assess Understanding of Calculus Concepts ....... 126

Cooperative Learning

Assessing Learning of Female Students ....... 149
Assessing Student Attitudes and Learning Gains ....... 241
Assessment During Complex Problem-Solving Group Work in Class ....... 120
Assessment in One Learning Theory Based Approach to Teaching ....... 229
Collaborative Oral Take-Home Exams ....... 143
Combining Individual and Group Evaluations ....... 134
Continuous Evaluation Using Cooperative Learning ....... 140
Does Calculus Reform Work? ....... 245
The Evaluation of Project CALC at Duke University, 1989–1994 ....... 253
Group Testing ....... 137
Improving Classes with Quick Assessment Techniques ....... 94
A Joint Written Comprehensive Examination to Assess Mathematical Processes and Lifetime Metaskills ....... 42
Strategies to Assess the Adult Learner ....... 152
Student-Created Problems Demonstrate Knowledge and Understanding ....... 106
A TEAM Teaching Experience in Mathematics/Economics ....... 213
Using a Capstone Course to Assess a Variety of Skills ....... 31

Course Evaluation

The Course Portfolio in Mathematics ....... 170
Early In-Course Assessment of Faculty by Students ....... 158
Early Student Feedback ....... 164
Friendly Course Evaluations ....... 167
Improving Classes with Quick Assessment Techniques ....... 94
The One-Minute Paper ....... 87
Student Feedback Teams in a Mathematics Classroom ....... 161

Developmental Courses

A Comprehensive, Pro-Active Assessment Program ....... 224
Concept Maps ....... 89
Does Developmental Mathematics Work? ....... 202
Strategies to Assess the Adult Learner ....... 152

Exams

Assessing the Major Via a Comprehensive Senior Exam ....... 39
Assessment of the Mathematics Major at Xavier: First Steps ....... 54
Collaborative Oral Take-Home Exams ....... 143
Group Testing ....... 137
Interactive E-Mail Assessment ....... 80
A Joint Written Comprehensive Examination to Assess Mathematics Processes and Lifetime Metaskills ....... 42
Outcomes Assessment in the B.S. Statistics Program at Iowa State University ....... 49
True or False? Explain! ....... 101
Undergraduate Core Assessment in the Mathematical Sciences ....... 46
Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning ....... 195
What Happened to Tests? ....... 77

Exit Interviews, Focus Groups, Surveys

Analyzing the Value of a Transitional Mathematics Course ....... 64
Assessing the Major: Serving the Needs of Students Considering Graduate School ....... 68
Assessing the Teaching of College Mathematics Faculty ....... 265
Assessment of the Mathematics Major at Xavier: First Steps ....... 54
Coming to Terms with Quantitative Literacy in General Education: or, the Uses of Fuzzy Assessment ....... 198
A Comprehensive, Pro-Active Assessment Program ....... 224
Department Goals and Assessment ....... 61
An Evaluation of Calculus Reform: A Preliminary Report of a National Study ....... 233
Exchanging Class Visits: Improving Teaching for Both Junior and Senior Faculty ....... 272
Increasing the Dialogue About Calculus with a Questionnaire ....... 237
Outcomes Assessment in the B.S. Statistics Program at Iowa State University ....... 49
The Use of Focus Groups within a Cyclic Assessment Program ....... 35

General Studies Courses

Assessing General Education Mathematics Through Writing and Questions ....... 131
Coming to Terms with Quantitative Literacy in General Education: or, the Uses of Fuzzy Assessment ....... 198
Creating a General Education Course: A Cognitive Approach ....... 191
Creating a Professional Environment in the Classroom ....... 98
A Quantitative Literacy Program ....... 187
Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning ....... 195

Institution-Wide Changes

Coming to Terms with Quantitative Literacy in General Education: or, the Uses of Fuzzy Assessment ....... 198
A Comprehensive, Pro-Active Assessment Program ....... 224
Department Goals and Assessment ....... 61
Evaluating the Effects of Reform ....... 219
Have Our Students with Other Majors Learned the Skills They Need? ....... 209
Let Them Know What You’re Up to, Listen to What They Say ....... 205

Large-Class Methods

The Course Portfolio in Mathematics ....... 170
Early In-Course Assessment of Faculty by Students ....... 158
Early Student Feedback ....... 164
Flexible Grade Weightings ....... 84
Interactive E-Mail Assessment ....... 80
A Mathematics Placement and Advising Program ....... 181
The One-Minute Paper ....... 87

Peer Visitation

Exchanging Class Visits: Improving Teaching for Both Junior and Senior Faculty ....... 272
Peer Review of Teaching ....... 275
Using Video and Peer Feedback to Improve Teaching ....... 269

Portfolios

Assessing a Major in Mathematics ....... 21
Assessment of the Mathematics Major at Xavier: First Steps ....... 54
The Course Portfolio in Mathematics ....... 170
Departmental Assistance in Formative Assessment of Teaching ....... 261
Portfolio Assessment of the Major ....... 24
Student Assessment through Portfolios ....... 123
Using a Capstone Course to Assess a Variety of Skills ....... 31
Using Writing to Assess Understanding of Calculus Concepts ....... 126

Problem-Centered Work

Assessing Modeling Projects in Calculus and Precalculus: Two Approaches ....... 116
Assessment During Complex Problem-Solving Group Work in Class ....... 120
Assessment in a Problem-Centered College Mathematics Course ....... 146
Creating a General Education Course: A Cognitive Approach ....... 191
The Evaluation of Project CALC at Duke University, 1989–1994 ....... 253
Student-Created Problems Demonstrate Knowledge and Understanding ....... 106

Projects, Capstone Courses

Assessing Essential Academic Skills from the Perspective of the Mathematics Major ....... 58
Assessing Learning of Female Students ....... 149
Assessing the Major: Serving the Needs of Students Considering Graduate School ....... 68
An Assessment Program Built around a Capstone Course ....... 27
The Evaluation of Project CALC at Duke University, 1989–1994 ....... 253
Portfolio Assessment of the Major ....... 24
Strategies to Assess the Adult Learner ....... 152
Student Assessment through Portfolios ....... 123
A TEAM Teaching Experience in Mathematics/Economics ....... 213
Using a Capstone Course to Assess a Variety of Skills ....... 31
Using Writing to Assess Understanding of Calculus Concepts ....... 126

Reading Skills

Assessing General Education Mathematics Through Writing and Questions ....... 131
An Assessment Program Built Around a Capstone Course ....... 27
Creating a Professional Environment in the Classroom ....... 98
Using a Capstone Course to Assess a Variety of Skills ....... 31

Research in Undergraduate Mathematics Education

Assessment in One Learning Theory Based Approach to Teaching ....... 229
Assessment in a Problem-Centered College Mathematics Course ....... 146
Creating a General Education Course: A Cognitive Approach ....... 191
In-Depth Interviews to Understand Student Understanding ....... 109

Setting Course Goals

The Class Mission Statement ....... 155
Creating a Professional Environment in the Classroom ....... 98
Improving Classes with Quick Assessment Techniques ....... 94
Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning ....... 195

Special-Needs Students

Assessing Learning of Female Students ....... 149
Does Calculus Reform Work? ....... 245
Factors Affecting the Completion of Undergraduate Degrees in Science, Engineering, and Mathematics for Underrepresented Minority Students: The Senior Bulge Study ....... 216
Strategies to Assess the Adult Learner ....... 152

Students Taking Responsibility for Their Learning

Assessment in a Problem-Centered College Mathematics Course ....... 146
The Class Mission Statement ....... 155
Creating a Professional Environment in the Classroom ....... 98
Early In-Course Assessment of Faculty by Students ....... 158
Flexible Grade Weightings ....... 84
Improving Courses with Quick Assessment Techniques ....... 94
Journals: Assessment without Anxiety ....... 129
Student Assessment through Portfolios ....... 123
Student Feedback Teams in a Mathematics Classroom ....... 161

Teaching Effectiveness

Assessing the Teaching of College Mathematics Faculty ....... 265
Departmental Assistance in Formative Assessment of Teaching ....... 261
Early In-Course Assessment of Faculty by Students ....... 158
Early Student Feedback ....... 164
Using Video and Peer Feedback to Improve Teaching ....... 269

Technology

Assessment in One Learning Theory Based Approach to Teaching ....... 229
Does Calculus Reform Work? ....... 245
Interactive E-Mail Assessment ....... 80
A TEAM Teaching Experience in Mathematics/Economics ....... 213
Using Video and Peer Feedback to Improve Teaching ....... 269

Writing to Learn Mathematics

Assessing Essential Academic Skills from the Perspective of the Mathematics Major ....... 58
Assessing Expository Mathematics ....... 113
Assessing General Education Mathematics Through Writing and Questions ....... 131
Assessing a Major in Mathematics ....... 21
Assessing the Major via a Comprehensive Senior Examination ....... 39
Assessing Modeling Projects in Calculus and Precalculus: Two Approaches ....... 116
Assessing Student Attitudes and Learning Gains ....... 241
Creating a Professional Environment in the Classroom ....... 98
Define, Compare, Contrast, Explain ... ....... 104
Friendly Course Evaluations ....... 167
If You Want to Know What Students Understand, Ask Them ....... 91
Journals: Assessment Without Anxiety ....... 129
The One-Minute Paper ....... 87
Portfolio Assessment of the Major ....... 24
A TEAM Teaching Experience in Mathematics/Economics ....... 213
True or False? Explain! ....... 101
Using a Capstone Course to Assess a Variety of Skills ....... 31
Using Writing to Assess Understanding of Calculus Concepts ....... 126

TEACHING GOALS INVENTORY, SELF-SCORABLE VERSION

The Teaching Goals Inventory below is reprinted from Classroom Assessment Techniques, by Angelo and Cross. We suggest that you, the reader, take this inventory (with a specific course in mind, as the authors suggest) in part for your own information, and also because it can help you locate your own teaching styles and priorities in the context of this book.

Similar inventories can be found at a variety of web sites; some colleges seem to have goals inventories such as this for students as part of their guidance policies. In Angelo and Cross, compiled results from this inventory from community and four-year colleges are offered as well; these can be useful for study and comparison. This inventory can also be effective in committee meetings (as, for example, when faculty are restructuring a course or curriculum), or to initiate a workshop session. It can also be used at the beginning and end of a general education course, as a method for observing changes in the values and goals of students. Even better than taking the inventory, perhaps, would be to design a comparable inventory of your own.

Purpose: The Teaching Goals Inventory (TGI) is a self-assessment of instructional goals. Its purpose is threefold: (1) to help college teachers become more aware of what they want to accomplish in individual courses; (2) to help faculty locate Classroom Assessment Techniques they can adapt and use to assess how well they are achieving their teaching and learning goals; and (3) to provide a starting point for discussion of teaching and learning goals among colleagues.

Directions: Please select ONE course you are currently teaching. Respond to each item on the inventory in relation to that particular course. (Your response might be quite different if you were asked about your overall teaching and learning goals, for example, or the appropriate instructional goals for your discipline.)

Please print the title of the specific course you are focusing on: ___________________________________________________

Please rate the importance of each of the fifty-two goals listed below to the specific course you have selected. Assess each goal’s importance to what you deliberately aim to have your students accomplish, rather than the goal’s general worthiness or overall importance to your institution’s mission. There are no “right” or “wrong” answers; only personally more or less accurate ones.

For each goal, choose only one response on the 1-to-5 rating scale. You may want to read quickly through all fifty-two goals before rating their relative importance.

In relation to the course you are focusing on, indicate whether each goal you rate is:

(5) Essential: a goal you always/nearly always try to achieve
(4) Very Important: a goal you often try to achieve
(3) Important: a goal you sometimes try to achieve
(2) Unimportant: a goal you rarely try to achieve
(1) Not applicable: a goal you never try to achieve

Rate the importance of each goal to what you aim to have students accomplish in your course.

1. Develop ability to apply principles and generalizations already learned to new problems and situations 5 4 3 2 1
2. Develop analytic skills 5 4 3 2 1
3. Develop problem-solving skills 5 4 3 2 1
4. Develop ability to draw reasonable inferences from observations 5 4 3 2 1
5. Develop ability to synthesize and integrate information and ideas 5 4 3 2 1
6. Develop ability to think holistically: to see the whole as well as the parts 5 4 3 2 1
7. Develop ability to think creatively 5 4 3 2 1
8. Develop ability to distinguish between fact and opinion 5 4 3 2 1


9. Improve skill at paying attention 5 4 3 2 1
10. Develop ability to concentrate 5 4 3 2 1
11. Improve memory skills 5 4 3 2 1
12. Improve listening skills 5 4 3 2 1
13. Improve speaking skills 5 4 3 2 1
14. Improve reading skills 5 4 3 2 1
15. Improve writing skills 5 4 3 2 1
16. Develop appropriate study skills, strategies, and habits 5 4 3 2 1
17. Improve mathematical skills 5 4 3 2 1
18. Learn terms and facts of this subject 5 4 3 2 1
19. Learn concepts and theories in this subject 5 4 3 2 1
20. Develop skill in using materials, tools, and/or technology central to this subject 5 4 3 2 1
21. Learn to understand perspectives and values of this subject 5 4 3 2 1
22. Prepare for transfer or graduate study 5 4 3 2 1
23. Learn techniques and methods used to gain new knowledge in this subject 5 4 3 2 1
24. Learn to evaluate methods and materials in this subject 5 4 3 2 1
25. Learn to appreciate important contributions to this subject 5 4 3 2 1
26. Develop an appreciation of the liberal arts and sciences 5 4 3 2 1
27. Develop an openness to new ideas 5 4 3 2 1
28. Develop an informed concern about contemporary social issues 5 4 3 2 1
29. Develop a commitment to exercise the rights and responsibilities of citizenship 5 4 3 2 1
30. Develop a lifelong love of learning 5 4 3 2 1
31. Develop aesthetic appreciation 5 4 3 2 1
32. Develop an informed historical perspective 5 4 3 2 1
33. Develop an informed understanding of the role of science and technology 5 4 3 2 1
34. Develop an informed appreciation of other cultures 5 4 3 2 1
35. Develop capacity to make informed ethical choices 5 4 3 2 1
36. Develop ability to work productively with others 5 4 3 2 1
37. Develop management skills 5 4 3 2 1
38. Develop leadership skills 5 4 3 2 1
39. Develop a commitment to accurate work 5 4 3 2 1
40. Improve ability to follow directions, instructions, and plans 5 4 3 2 1
41. Improve ability to organize and use time effectively 5 4 3 2 1
42. Develop a commitment to personal achievement 5 4 3 2 1
43. Develop ability to perform skillfully 5 4 3 2 1
44. Cultivate a sense of responsibility for one’s own behavior 5 4 3 2 1
45. Improve self-esteem/self-confidence 5 4 3 2 1
46. Develop a commitment to one’s own values 5 4 3 2 1
47. Develop respect for one’s own values 5 4 3 2 1
48. Cultivate emotional health and well-being 5 4 3 2 1
49. Cultivate an active commitment to honesty 5 4 3 2 1
50. Develop capacity to think for one’s self 5 4 3 2 1
51. Develop capacity to make wise decisions 5 4 3 2 1
52. In general, how do you see your primary role as a teacher? (Although more than one statement may apply, please choose only one.)
   1. Teaching students facts and principles of the subject matter
   2. Providing a role model for students
   3. Helping students develop higher-order thinking skills
   4. Preparing students for jobs/careers
   5. Fostering student development and personal growth
   6. Helping students develop basic learning skills

Source: Classroom Assessment Techniques, by Thomas A. Angelo and K. Patricia Cross. Copyright © 1993. Used by permission. Publisher: Jossey-Bass, San Francisco, California.


Part I
Assessing the Major

Introduction

William A. Marion
Valparaiso University

In his introduction to this volume, Bernie Madison has described the process by which the MAA in the early 1990s became involved in the issue of assessing the mathematics major. Out of that effort came the document “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics.” [1] (You will find it reprinted on pages 279–284.) At about the time this document was published by the MAA (1995), a relatively small number of undergraduate mathematics programs across the country were faced with developing an “assessment plan,” and the hope was that some of them would find the MAA’s work helpful as a map to guide them in these uncharted waters. By this time (1999), the reality is that almost all undergraduate programs will have to have some type of assessment plan in place within the next couple of years. The hope is that what you will read in the next few pages will be helpful to you as you face this brave new world.

We, the editors, have chosen a representative sample of assessment programs from among those college and university mathematical sciences departments which have gotten into the game early. The sample contains articles from mathematics faculty at small schools, medium-size schools and large schools; at liberal arts colleges, regional comprehensive universities and major research institutions.

These departments are using a variety of methods to assess their programs, e.g., capstone courses, comprehensive exams, diagnostic projects, focus groups, student portfolios and surveys of various types. However, these techniques are not described in isolation; they are to be understood within the context of the assessment cycle. As you read these articles, you will see that each of the authors has provided that context. What follows is a brief description of each author’s paper.

In the first two articles student portfolios are the principal assessment methods. Laurie Hopkins describes the assessment process at Columbia College in South Carolina. In this program certain courses are designated as portfolio development courses, and in these courses the student is asked to reflect in writing on the connection between the material included in the portfolio and the departmental goals. Linda Sons describes assessment portfolios at Northern Illinois University, which contain students’ work from various points in their mathematics program and are examined by a department committee after the students have graduated.

Capstone courses as assessment techniques are discussed in two articles, one by Charles Peltier at St. Mary’s College in Indiana and the other by Deborah Frantz of Kutztown University in Pennsylvania. Peltier describes a full-year senior seminar in which independent study projects (written and oral presentations) are developed as part of the seminar. Frantz provides us with the details of a one-semester Senior Seminar in Mathematics in which oral and written presentations are just two among a number of assessment techniques used.

Bonnie Gold and Dan Callon present us with different models for using comprehensive exams as assessment instruments. Gold describes a two-part exam consisting of a written component in the mathematics major and an oral component over the liberal arts. Callon discusses a distinctive approach to giving a comprehensive: a one-week, joint, written exam given to seniors during their fall semester. John Emert and Charles Parish discuss a “less” comprehensive exam developed to assess the core courses taken by all undergraduate mathematics majors.

Next we are introduced to an entirely different approach to assessing the mathematics major: the use of focus groups. Marie Sheckels gives us some insight as to how focus groups with graduating seniors can be incorporated into an assessment process.


The next two articles describe overall departmental assessment plans involving a variety of techniques to assess the major. Dick Groeneveld and Robert Stephenson describe the assessment measures used in their statistics program, particularly the use of grades of graduates, surveys of graduates and surveys of employers of graduates. Janice Walker also discusses all of her mathematics department’s assessment techniques including the use of exit interviews and the Educational Testing Service’s Major Field Test in Mathematics.

Three of the articles deal with assessing the mathematics program either at the midway point of a student’s four-year career or at the freshman level. Mark Michael discusses a semester-long Diagnostic Project which is part of a required Discrete Mathematics course, usually taken by mathematics majors in their sophomore or junior year. Judith Palagallo and William Blue of Akron University in Ohio describe a sophomore-level Fundamentals of Advanced Mathematics course and what information it reveals about the major early on. Elias Toubassi introduces us to a major effort at a large university to reform the entry-level mathematics courses through assessment and its subsequent effect on the mathematics major.

Finally, Deborah Bergstrand discusses assessing the major from the point of view of students who are potential graduate students in mathematics. Three measures — an Honors Project, a Senior Colloquium and a Summer Undergraduate Research project — are used.

This section has been organized so that you can look at every article or pick and choose those which are of most interest to you. In some sense each of our undergraduate mathematics programs is unique, and yet there is enough commonality so that you might be able to adapt at least one of the assessment processes described here to fit your own situation.

Reference

[1] Committee on the Undergraduate Program in Mathematics (CUPM). “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics,” Focus: The Newsletter of the Mathematical Association of America, 15 (3), June 1995, pp. 24–28. Reprinted on pp. 279–284 of this volume.


Assessing a Major in Mathematics

Laurie Hopkins
Columbia College

At a small, private women’s liberal arts college in the South, student portfolios have become the principal means for assessing the major. Unique to this program, certain courses are designated as portfolio development courses, and in these courses the student is asked to reflect in writing on the connection between the material included in the portfolio and the department goals.

Background and Purpose

Columbia College is a small private women’s college in Columbia, South Carolina, founded by the United Methodist Church. Although the college has a small graduate program in education, by far the major focus of the institution is on excellence in teaching undergraduates. The college is a liberal arts institution, but we do offer business and education majors. The number of students who major in elementary education, early childhood education, and special education forms a significant percentage of the total student body. Also, many of the more traditional liberal arts majors offer secondary certifications in addition to major requirements. In mathematics, the majority of our majors also choose to certify to teach. The large commitment to pre-service teachers reinforces the institutional commitment to excellence in the classroom.

Our mathematics department has been experimenting with assessment of the major for improvement of student learning for about five years. Initially, the primary impetus for assessment was pleasing our accrediting body. As the college has embraced the strategic planning process, more and more departments are using planning and assessment for improvement purposes. Our first attempts centered around efforts to locate a standardized instrument. This search was unsuccessful for several reasons. First, the particular set of courses which our majors take was not represented on any test we examined. Moreover, without the expense and inconvenience of a pre-test and post-test administration, we felt that the information from such a measure would tell us more about ability than learning. Finally, if we did the dual testing we would only obtain answers to questions about content knowledge in various mathematical areas. Although there would clearly be value in this data, many unanswered questions would remain. In fact, many of the kinds of classroom experiences which we think are most valuable are not easily evaluated. Consequently, we chose student portfolios as the best type of assessment for our major. It seemed to us that this particular technique had value in appraising outcomes in mathematical communication, technological competence, and collaborative skills that are not measurable through standardized tests.

To assess the major, we first split our departmental goals into two sections, a general section for the department and a specific section for majors. We articulated one goal for math majors, with nine objectives. For each of the objectives we looked for indicators that these objectives had been met by asking the question “How would we recognize success in this objective?” Then we used these indicators as the assessment criteria. Some of our objectives are quite generic and would be appropriate for many mathematics departments, while others are unique to our program.

The goal, objectives, and assessment criteria follow.

Goal: The Mathematics Department of Columbia College is committed to providing mathematics majors the coursework and experiences which will enable them to attend graduate school, teach at the secondary level or enter a mathematics-related career in business, industry or government. It is expected that each mathematics major will be able to:

1. Demonstrate the ability to write mathematical proofs.
   Assessment: Each student portfolio will contain at least four examples of proofs she has written in different classes.

2. Demonstrate the ability to read and evaluate a statistical study.
   Assessment: Each student portfolio will contain an evaluation of a statistical study that she has completed.

3. Demonstrate knowledge of women in mathematics.
   Assessment: Each student portfolio will contain evidence of knowledge of the contributions of women to mathematics.

4. Demonstrate topics in calculus from various perspectives — numerical, graphical, analytical and verbal.
   Assessment: Each student portfolio will contain at least one topic from calculus demonstrated from the numerical, graphical, analytical and verbal perspectives.

5. Use mathematics as a problem solving tool in real life situations.
   Assessment: Each student portfolio will contain at least four examples of her use of mathematics in solving a real life problem.

6. Work collaboratively to solve problems.
   Assessment: (i) Each student portfolio will contain at least four examples of problems that the student has solved collaboratively and a reflective statement about the collaborative process. (ii) 75% of course syllabi in upper level courses will indicate learning activities requiring collaborative problem-solving and/or presentation of in-depth problem solving to the class. (iii) A team of students selected and trained by the mathematics faculty will participate in the nationwide COMAP contest.

7. Organize, connect and communicate mathematical ideas effectively.
   Assessment: Each student portfolio will contain four examples with evidence of mathematical communication, either in written or spoken form.

8. Use technology appropriately to explore mathematics.
   Assessment: Each student portfolio will contain at least four examples of mathematical exploration using technology.

9. Feel adequately prepared for their post-graduation experience.
   Assessment: 90% of math major graduates who respond to a graduate survey will indicate that they felt mathematically prepared for the job they hold.

Method

Our initial effort to require the portfolio for each student was an informal one. We relied on a thorough orientation to the process of creating a portfolio, complete with checklists, required entries and cover sheets for each entry on which the student would write reflectively about the choice of the entry. We included this orientation in the advisement process of each student and stressed the value of the final collection to the student and to the department. Faculty agreed to emphasize the importance of collecting products for the portfolio in classes and to alert students to products which might make good portfolio entries.

We now designate a series of courses required for all majors as portfolio development courses. In these courses products are recommended as potential candidates for inclusion in the collection. Partially completed portfolios are submitted and graded as part of the courses. Perhaps most importantly, some class time in these courses is devoted to the reflective writing about ways in which specific products reflect the achievement of specific goals. Completing this writing in the context of classroom experience has had a very positive impact on the depth of the reflective work.

Findings

Our early efforts were a complete failure. No portfolios were produced. We probably should have realized that the portfolios would not actually be created under the informal guidelines, but in our enthusiasm for the process we failed to acknowledge two basic realities of student life: if it is not required, it will not be done and, equally important, if it is not graded, it will not be done well. After two years during which no student portfolios were ever actually collected, we made submission of the completed portfolio a requirement for passing the mandatory senior capstone course. The first semester this rule was in place, we did indeed collect complete portfolios from all seniors. However, it was obvious that these portfolios were the result of last-minute scrounging rather than the culmination of four years of collecting and documenting mathematical learning.

The second major problem was related to the first. Although the department was unanimous in its endorsement of the portfolio in concept, the faculty found it difficult to remember to include discussions related to the portfolio in advisement sessions or in classes. Incorporating the portfolio into specific mathematics classes was an attempt to remedy both situations.

The most striking problem surfaced when we finally collected thoughtfully completed portfolios and began to try to read them to judge the success of our program. We had never articulated our goals for math majors. The department had spent several months and many meetings formulating departmental goals, but the references to student behaviors were too general to be useful. The two relevant goals were “Continue to provide a challenging and thorough curriculum for math majors” and “Encourage the development of student portfolios to document mathematical development.” In our inexperience we assumed that weaknesses in the major would be apparent as we perused each student’s collection of “best work.” Unfortunately, the reality was quite different. In the absence of stated expectations for this work, no conclusions could be reached. A careful revision of the departmental goals for a mathematics major was crafted in response to this finding. The resulting document is included in the first section. Examination of the portfolios pointed to a more difficult potential problem. In our small department, each student is so well known that it is difficult to inject any objectivity into program evaluation based on particular student work. We were forced to acknowledge that an objective instrument of some sort would be necessary to balance the subjective findings in the portfolios.

Use of Findings

One of the most important measures of the usefulness of a departmental strategy for assessment is that information gained can cause changes in the major, the department or the process. The changes made in the process include a revision of the goals and objectives, the addition of portfolio development in several required classes and the use of a post-graduation survey to determine graduates’ perspective on the adequacy of their mathematical preparation for their post-graduation experiences. However, with the modified process now in place we have been able to recognize other changes that need to be made.

Examination of statistical entries in the portfolio indicated that there was not a sufficient depth of student understanding in this area. As a result we have split the single course which our students took in probability and statistics into a required series of two courses in an effort to cover the related topics in greater depth and to spend more time on the applications.

On the questionnaire, one graduate expressed her lack of confidence about entering graduate school in mathematics. We have consequently added a course in Advanced Calculus to our offerings in an effort to help the students in our program who are considering graduate school as an option. Comments by several graduates that they did not understand their career options well have caused us to add a special section of the freshman orientation course strictly for mathematics majors. This course incorporates new majors into the department as well as the school and has several sessions designed to help them understand the options that mathematics majors have at our college and in their careers.

Success Factors

The assessment process is not a static one. Not only should change come about as a result of specific assessment outcomes; the process itself is always subject to change. The first series of changes made in our plan were the result of deficiencies in the plan. The next changes will be attempts to gain more information and to continue to improve our practices. We have recently added objective instruments to the data we are collecting. The instruments are a test to measure each student’s ability in formal operations and abstract logical thought and a test to determine the stage of intellectual development of each student. We have given these measures to each freshman math major for the past two years and plan to administer them again in the capstone course. After an initial evaluation of the first complete data set, we will probably add an objective about intellectual development to our plan with these instruments as the potential means for assessment. We have recently begun another search for a content test to add to the objective data gathered at the beginning and the end of a student’s college experience. There are many more of these types of measures now available and we hope to be able to identify one that closely matches our curriculum. With the consideration of a standardized content exam as an assessment tool, there is a sense in which we have come full circle. However, the spiral imagery is a better analogy. As we continue to refine and improve the process, standardized tests are being examined from a different perspective. Rather than abdicating our assessment obligations to a standardized test, we have now reached the point where the information that can be provided by that kind of tool has a useful position in our assessment strategy.

Portfolio Assessment of the Major

Linda R. Sons
Northern Illinois University

A Midwestern, comprehensive university which has five different programs in the mathematical sciences — General, Applied Mathematics, Computational Mathematics, Probability and Statistics, and Mathematics Education — requires students to maintain assessment portfolios in courses which are common to all five of the emphases. The portfolios of those who have graduated during the year are examined by a department assessment committee shortly after the close of the spring semester.

Background and Purpose

Evaluating its degree programs is not a new activity for the Department of Mathematical Sciences at Northern Illinois University. The Illinois Board of Higher Education mandates periodic program reviews, while the Illinois Board of Education, NCATE, and the North Central Association also require periodic reviews. The University regularly surveys graduates concerning their employment, their satisfaction with the major program completed, and their view of the impact the University’s general education program had on their learning. The Department also polls graduates concerning aspects of the undergraduate degree program. However, in 1992, in response to an upcoming accreditation review by the North Central Association, the Department developed a new assessment scheme for its undergraduate major program—one which is longitudinal in character and focuses on how the program’s learning goals are being met.

The B.S. in Mathematical Sciences at Northern Illinois University is one of 61 bachelor’s degree programs the University offers. As a comprehensive public institution, the University serves nearly 24,000 students, of which about 17,000 are undergraduates who come geographically mostly from the top third of the State of Illinois. Located 65 miles straight west of Chicago’s loop, Northern Illinois University accepts many transfer students from area community colleges. For admission to NIU as a native student an applicant is expected to have three years of college preparatory high school mathematics/computer science. Annually about 50 students obtain the B.S. in Mathematical Sciences, of which about half are native students. Each major completes at least one of five possible emphases: General, Applied Mathematics, Computational Mathematics, Probability and Statistics, and Mathematics Education. Ordinarily, about one-half of the majors are in Mathematics Education, about one-quarter are in Probability and Statistics, and the remaining quarter is divided pretty evenly among the other three emphases.

Regardless of the track a student major may choose, the Department’s baccalaureate program seeks to develop at least five capabilities in the student. During the Spring of 1992 our weekly meeting of the College Teaching Seminar (a brown-bag luncheon discussion group) hammered out a formulation of these capabilities, which were subsequently adopted by the Department’s Undergraduate Studies Committee as the basis for the new assessment program. These capabilities are:

1. To engage effectively and efficiently in problem solving;
2. To reason rigorously in mathematical arguments;
3. To understand and appreciate connections among different areas of mathematics and with other disciplines;
4. To communicate mathematics clearly in ways appropriate to career goals;
5. To think creatively at a level commensurate with career goals.


A discussion of a mechanism which could be used to gauge student progress towards the acquisition of these goals led to the requirement of an Assessment Portfolio for each student. The Portfolio would contain work of the student from various points in the program and would be examined at the time of the student’s graduation.

Method

Common to all of the emphases for the major are the lower division courses in the calculus sequence, a linear algebra course, and a computer programming course. At the junior level, each emphasis requires a course in probability and statistics and one in model building in applied mathematics, while at the senior level each student must take an advanced calculus course (which is an introduction to real analysis). Each emphasis requires additional courses involving the construction of rigorous mathematical proofs and at least one sequence of courses at the upper division intended to accomplish depth.

The choices made for material for the assessment portfolio were:

a) the student’s final examination from the required lower division linear algebra course (for information related to goals 1 and 2);

b) the best two projects completed by the student in the model building course — the course requires 5 projects, for each of which a report must be written up like a technical report (for information related to goals 1, 3, and 4);

c) two homework assignments (graded by the instructor), one from about midsemester and one from late in the semester, from the advanced calculus course (for information related to goals 1, 2, 4, and 5);

d) the best two homework assignments from two other advanced courses determined according to the student’s emphasis; for example, those in the probability and statistics emphasis are to present assignments from a theoretical statistical inference course and a methods course, while those in the general emphasis present work from the advanced calculus II course and from a second senior-level algebra course (for information related to all five goals);

e) a 250–300 word typed essay discussing the student’s experience in the major, emphasizing the connections of mathematics with other disciplines (for information related to goals 3 and 4).

Items a)–d) are all collected by instructors in the individual courses involved and placed in files held in a Department Office by office personnel. The student is responsible for the essay in e) and will not be cleared for graduation until the essay is written and collected by the Department’s Director of Undergraduate Studies. Thus, all items to be in the portfolio are determined by the faculty, but the student’s essay may offer explanation for the portfolio items. The student’s graduation is contingent upon having the portfolio essay submitted, but otherwise the portfolio has no “control” in the graduation process. Students are told that the collection of the items for the portfolio is for the purpose of evaluating the degree program.

After the close of the spring semester each year, a committee of senior faculty examines the assessment portfolios of the students who have graduated in the just-completed year and is expected to rate each capability as seen in the portfolio in one of three categories—strong, acceptable, and weak. After all portfolios have been examined, the committee may determine strengths and weaknesses as seen across the portfolios and make recommendations for future monitoring of aspects of the program, or for changes in the program, or for changes in the assessment procedures. These are carried to the Department’s Undergraduate Studies Committee for consideration and action.

Findings

The implementation of this new assessment scheme has resulted in the third review of portfolios being conducted in May–June 1997. Given the time lag inherent in the collection of the portfolios, the initial review committees have not been able to work with complete portfolios. However, these year-end committees made some useful observations about the program and the process. They discovered:

1. that the linear algebra course appeared to be uneven as it was taught across the department — on the department-wide final examinations, student papers showed concentrated attention in inconsistent ways to certain aspects of the examination, rather than a broad response to the complete examination;

2. that some students who showed good capability in mathematical reasoning on the linear algebra final examination experienced a decline in performance on homework assignments completed in the upper division courses emphasizing rigorous mathematical reasoning;

3. that evaluating capabilities was difficult without having a description of those capabilities in terms of characteristics of student work;

4. that student essays were well written (a pleasant surprise!), but the understanding expressed concerning connections between mathematics and other disciplines was meager;

5. that there appeared to be a positive influence on coursework performance when courses were sequenced in certain orders (e.g., senior-level abstract algebra taken a semester before the advanced calculus).

Use of Findings

In response to the discoveries made by the year-end committees, the Department took several actions.



First, to address the concerns related to the linear algebra course, the coordinators for the course were asked to clarify for instructors that the course was intended not only to enable students to acquire some computational facility with concepts in elementary linear algebra, but also to provide students with a means to proceed with the gradual development of their capacity in rigorous mathematical reasoning. This meant that some faculty needed to teach the course with less concentration on the numerical linear algebra aspects of the course and greater concentration on students’ use of precise language and construction of “baby” proofs. Further, a lower division course on mathematical reasoning was introduced as an elective. This course, which enables students to study logical statements, logical arguments, and strategies of proof in mathematical settings, can be recommended to those students who show a weak performance in the elementary linear algebra course or who have difficulty with the first upper division course they take which emphasizes proof construction. It can, of course, also be taken by others who simply wish to take such a course.

A second response by the Department was to ease the execution of the assessment process by introducing a set of checklists. For each capability, the Department defined a checklist of characteristics of student work associated with that capability. For instance, under problem solving is listed: student understands the problem (with minor errors? gross errors?), approach is reasonable or appropriate, approach is successful, varied strategies and decision making are evident, informal or intuitive knowledge is applied.

Regarding the need for more connections to disciplines outside mathematics, as yet no satisfactory plan has emerged. While students in the applied emphasis must choose an approved minor area of study in the University, there is reluctance to require all majors to have such a minor, and a lack of appropriate “single” courses offered in the University which easily convey the mathematical connections. For now the agreed-upon strategy is to rely on connections expressed in the current set of required courses (including the model building course) and in programs offered by the Math Club/MAA Student Chapter.

The observation concerning the performance of students when courses were taken in particular sequential patterns was not new to those who were seasoned advisers in the Department. But it did trigger the reaction that new advisers should be especially alerted to this fact.

Success Factors

While the work of the first review committee of faculty was lengthy and cautious because of the newness of the process, the next review committees seemed to move right along with greater knowledge of what to do and how to do it. The portfolio material enables the growth of a student to be traced through the program, and the decision of what to include makes it possible to have complete portfolios for most transfer students as well as for the other students. Further, even incomplete portfolios help in the discernment of patterns of student performance. However, it does take considerable time to read through the complete folder for an individual student.

So far the University has been willing to provide extra monetary compensation for the additional faculty time involved in evaluating the portfolios. Should this no longer be the case, an existing Department committee could examine a subset of portfolios each year, or reduce consideration of all the objectives to tracing one objective in one year and a different objective in the next.

The program has not been difficult to administer since most of the collection of materials is done through the individual course instructors. The Department’s Director of Undergraduate Studies needs to be sure these faculty are aware of what needs to be gathered in the courses and to see that the appropriate duplication of materials is done for inclusion in the folders. In addition, he/she must collect the student essays.

The process has the added benefit that involvement of many faculty in collecting the materials makes more faculty think about the degree program in its entirety, rather than merely having a focus on the individual course taught. Finally, the mechanism of the portfolios being collected when they are, and evaluated AFTER the student graduates, along with the qualitative nature of the evaluation process, ensures that the assessment IS of the program rather than of individual students, faculty, or courses.


An Assessment Program Built Around a Capstone Course

Charles Peltier
Saint Mary’s College

A full-year senior capstone course has evolved at a small, private women’s liberal arts college in the Midwest to become the principal tool for assessing the major. Within this two-semester seminar each student has to develop an independent study project, known as the comprehensive project. Preliminary work on the project begins in the first semester, and oral and written presentations of the completed project are given in the second semester.

Background and Purpose

Saint Mary’s College is a Catholic, liberal arts college for women with about 1500 students. We typically have ten to a dozen mathematics majors graduating each year; all of the mathematics and computer science courses are taught within the department, which consists of eight full-time faculty and two or three adjuncts teaching specialized courses. Since the 1930s the college has required that each student pass a comprehensive examination in her major. The requirement was rewritten in the early 1970s to give each department the freedom to determine the most appropriate form for examining its students. At that time the mathematics department replaced the “test” format with a requirement for an independent study project, which has come to be known as the comprehensive project.

A capstone seminar was developed to provide a framework for student work on the comprehensive project and to foster student independence in learning and skill in discussing mathematics. The spur for developing a formal program for overall assessment of student learning was more bureaucratic. In 1996 the college had to present to North Central, as part of the reaccreditation process, a plan for assessing student learning. The mathematics department had responsibility for developing the plan for assessment in the major and built its plan around the existing senior capstone course and the project.

In developing the assessment program, we had to articulate our goals for the major in terms of student achievement rather than department action.

Goals for a Mathematics Major

As a result of study in mathematics, a graduate of Saint Mary’s College with a major in Mathematics will have met the following goals.

1. The graduate will have developed learning skills and acquired a firm foundation of knowledge of fundamental mathematical concepts, methods, reasoning and language sufficient to support further academic work or a career in an area that requires mathematical understanding.

2. The graduate will be able to apply her mathematical learning skills and knowledge and also to utilize appropriate technology to develop models for solving problems and analyzing new situations, both in mathematics and in areas that use mathematics.

3. The graduate will be able to communicate her ideas and the results of her work, both orally and in writing, with clarity and precision.

4. The graduate will be prepared to use her knowledge and learning skills to undertake independent learning in areas beyond her formal study.

5. The graduate will be prepared to use her critical thinking skills and mathematical knowledge as a contributing member of a problem solving team.


6. The graduate will have examined and formed ethical principles which will guide her in making professional decisions.

The senior seminar and senior comprehensive project, as assessment tools, are intended to provide both development and information on goals 1, 3, and 4. Some of the work in the first semester of the seminar touches on goal 5, and individual projects may get into actual problem-solving of the sort envisioned in goal 2. In practice, the senior seminar, like all of our courses, was designed to foster development toward these goals, as well as to assess their achievement.

In addition to the assessment of student learning described here, the department evaluates its programs through a biennial program in which recent graduates are invited back to campus for a day-long discussion with current students, through contact with alumnae, through contacts with interviewers who come to campus, and by using the information in the college’s surveys of recent graduates.

The assessment method described here consists of (1) a full-year course meeting twice a week, called the Senior Seminar, and (2) an independent study project culminating in a formal oral and written presentation and response to faculty questions, known as the Senior Comprehensive Project. Roughly speaking, the first semester seminar begins the experience of extended independent learning and presentation and provides a vehicle for organizing the students’ search for a topic and advisor. The second semester seminar is built around students’ presentation of preliminary findings while they are working on their individual projects.

Method

During the first semester of the Senior Seminar, all students work on a common topic and from a common text. They are required to learn and present material on the level of a junior-senior mathematics course. The way this usually works is that the seminar director selects a text involving material not already covered by the students in other courses and determines a course schedule based on one-week sections of the material. Each section of material is assigned to a team of two students which is responsible, with advice from the seminar director, for learning the material, presenting the material to the class, and assigning and correcting written problems based on the material. One semester generally allows time for two rounds of presentations, with the teams changed between first and second rounds, and for several problem presentation days in which individual students are assigned to present solutions of some of the more difficult problems. In weeks in which they do not present, students are responsible for reading the material, asking questions of the presenters, critiquing the presentations, and working the problems. Each student is required to have an advisor and topic for her senior comprehensive project by mid-October.

The writing assignment for the fall seminar often requires an introduction to the topic of the comprehensive project. The grade in the first semester of the seminar is based on demonstrated knowledge of the material, the presentations, participation (including asking questions, completing the critiques, etc.), and the required paper.

During the second semester, each student is working on her own senior comprehensive project. She works with the advisor in learning the material and preparing a paper and a formal public presentation based on her work. In the seminar, she presents two fifty-minute preliminary talks on her topic and is expected to follow other student talks in enough detail to ask questions and to critique the presentations. The seminar talks are expected to contain sufficient information for her classmates to understand her formal presentation. The grade in the second semester of the seminar is based on demonstrated learning of the topic, presentation of the topic, participation, and work with the comprehensive advisor.

The comprehensive project must develop material that is new to the student. This may be an extension of an area previously studied or a completely new area for the student, and may be in any area of mathematics, including applications and computer science. The comprehensive advisor and the student work out a project that is of sufficient depth and breadth and that can be completed in the time available. The progress is monitored by both the advisor, who meets regularly with the student, and the seminar director, who observes the preliminary talks. The student provides a formal written presentation of her work, with a summary of the material covered in the seminar talks and a more detailed treatment of the final material. Once the paper is written, it is submitted to a committee of three faculty members — the advisor, the seminar director, and one other — who read the paper and prepare questions related to the material. The formal presentation is made before an audience consisting of this committee, the seniors in the seminar, and any others who may wish to attend. It consists of a forty-minute presentation on the final material in her work and twenty minutes of response to questions from the faculty committee. The formal paper, the oral presentation, and the response to questions constitute the senior comprehensive examination required by the college. The paper for the senior comprehensive project also serves as the final submission for the student’s advanced writing portfolio.

Grading of the senior comprehensive projects occurs in two stages. The committee determines whether the student passes or fails; there is usually some reworking of the paper needed, but except in extreme cases this is not an impediment to success. After all the presentations have been completed, the department faculty meets to discuss all of the comprehensives and to determine which, if any, are to be awarded honors. The discussion is based on five criteria:

1) mastery of the subject,
2) the quality of the written paper,
3) the quality of the oral presentation,
4) the response to questions,
5) the independence and reliability shown in the work with the advisor.

This discussion also serves as an overview of the achievement of the students in the senior class, and the seminar director writes a report to the department based on the discussion.

Findings

Since the seminar and comprehensive project have been in place longer than our formal assessment program, most of the information has been gained and used in planning in an informal and implicit fashion. There are, however, a few results that can be stated directly.

The first is that taking courses does not prepare students to explain what they know to people who do not already know it — that is, to anyone other than teachers or other students who have seen the material. There are really two different but related issues here. The first is the identification of assumptions and background for a particular piece of work — understanding where it fits into a larger picture. The second is the need to move from the “show that I know it” mindset of passing a test to the “giving you an understanding of something I know and you don’t” mindset. There is a stage of development here that does not occur automatically, and it seems to be very closely related to learning to work independently and to critique the results. We do see growth here during the senior year.

A related fact is that students find it difficult to work with, and especially to combine, different presentations of the same idea. Searching for examples and dealing with varying notation in different sources is a major difficulty for many. Seeing old ideas in a new form and a new setting is often a challenge, especially to a student who is working at the limit of her experience. The failure to recognize ideas from previous courses, and a consequent inability to use them in new settings, is often another form of this problem. The seminar also provides a sometimes discouraging reminder that there are always topics from previous study that have been forgotten or were never really understood.

Discussions with alumnae and interviewers have strongly supported our belief that several aspects of the seminar and comprehensive project are very important for those graduates — a majority, in our case — who do not go into teaching or into explicitly mathematics-related fields. The important aspects seem to be the independent work on a long-term project and the organization and presentation of results. The practice in independent learning and interpretation has been equally important to those who have gone on to graduate study in mathematics or mathematics-related fields.

Use of Findings

The department’s experience with the seminar and the comprehensive project has affected all planning and discussion for the last twenty years. One major result occurred at the level of the college curriculum when the department strongly supported the introduction of the “advanced writing” requirement, to be fulfilled in the student’s major. Our experience in asking students to explain mathematics made it clear that we needed to develop this skill, and we had already begun introducing writing requirements within courses when this college-wide framework came under discussion. Currently the second year writing requirement in mathematics focuses on exposition and the third year adds technical writing, but we may find that some other approach works better.

The fall semester of the seminar has changed often, usually in small ways, from experience with the second semester and the comprehensive project. In order to make students more conscious of the decisions made in preparing a presentation, we introduced a student critique form, which must be filled out by each student for each speaker. We are considering having each student fill out such a form for herself, as well, to foster more reflection on the process. When the seminar was put in place, students worked on separate topics in both semesters — shorter topics in the fall, longer in the spring. We have found that the process of explanation takes more effort and learning, and that working on a common topic in the fall allows for more mutual support and cooperation. By encouraging more student questions of the presenters, it also improves the feedback on the presentations and student awareness of the process of presentation. The use of teams of students arose for the same reason. The use of written exercises puts more pressure on the “audience” to really work at learning the material, and somewhat reduces the tendency to let the presenters get by with “good enough.”

Our experience with the seminar and comprehensive has also contributed to an increase in assignments requiring longer explanations in earlier mathematics courses and an increase in requirements for in-class presentations by students in earlier years.

Success Factors

Alumnae who have gone to graduate programs and those who have gone to work report that the experience has been valuable in preparing them to deal with the work they have to do. A part of this is certainly the confidence they develop from having already worked on extended projects and reported on their work. We have also gained information on some things that work and some that do not work in developing the skills.



A major advantage to this program as an assessment tool is that it is a required part of the educational program of the department, not an added testing requirement. The seminar involves graded academic credit and the whole program is approached as a learning tool, so that problems of student participation are minimized. It is a great help that seniors in other majors are also involved in meeting the comprehensive requirement in ways tailored to these majors, even though the mathematics comprehensive is generally recognized as one of the most demanding.

There are two major drawbacks to this program — use of student time and use of faculty time. Since the seminar involves two semesters, the number of other courses taken by a student is reduced, though the fall seminar does, in fact, deal with standard content in a fairly standard way. The comprehensive project involves a great deal of time, and a student who has thought in terms of “a two-hour course” for the spring semester may find herself overextended. The program also uses considerably more faculty time than would appear on official records. The seminar director is credited with two semester hours of teaching in each semester, but the faculty who serve as project advisors receive no official teaching load credit. During the first semester, the seminar director spends a great deal of time in one-on-one or one-on-two discussion with the students as they first learn the material and then begin thinking about how to present it; this takes far more time than preparing lectures and exercises on the material. In addition, the seminar director provides written critiques to the students, incorporating the information from the student critiques. In the senior comprehensive phase, each faculty member is advising one or two students on projects which extend over several months and require planning of presentations and writing. Naturally, weaker students require a great deal of direction and advice, but faculty find that they are more likely to plan more ambitious projects with stronger students, so that these also require a lot of time. There is a positive side to this effort in the opportunity for faculty to explore new areas in their fields or in related fields. The seminar director has the unique opportunity of reading all the projects and seeing new areas in many fields of mathematics.

In working with a program such as this, it is essential to have requirements explicit and clear from the beginning. There is always time pressure on students, and for many the long-term nature of the project is a new experience. Without deadlines and dates the weaker students, especially, may get so far behind that they can catch up only with heroic effort — and at the cost of work in other courses. Grading criteria need to be as explicit as possible, because there are many opportunities for misunderstandings and incorrect expectations in a mathematics course that is very different from students’ previous experience.

The involvement of all faculty in the senior comprehensive projects means that the program cannot work without full faculty support. Not only is there a serious time commitment, but it is necessary that all faculty members keep track of deadlines, formats for papers, etc., and be willing to adjust schedules to serve on the committees for other students. Working out the inevitable differences of opinion about interpretation of requirements before they become critical falls mainly to the seminar director, but would not be possible if other faculty were not interested in the program as a whole.


Using a Capstone Course to Assess a Variety of Skills

Deborah A. Frantz
Kutztown University

At a mid-sized, regional university in the East each student must complete a one-semester senior seminar in which a variety of assessment methods are used to assess student learning in the major. These methods include a traditional final exam, a course project, an expository paper, reading and writing assignments, a journal, and a portfolio — all of which are designed to assess a variety of skills acquired throughout a student’s four years.

Background and Purpose

Kutztown University is one of fourteen universities in the State System of Higher Education in Pennsylvania. Of the approximately 7,000 students, four to six mathematics majors graduate each year. (This excludes those who plan teaching as a career.) As in many undergraduate institutions, a student with a mathematics major at Kutztown University is required to take a wide variety of courses that include a calculus sequence, several proof-based courses like abstract algebra and advanced calculus, and several application courses like linear algebra and probability and statistics. It is typically the case, too, that a student’s grade for each of these courses depends heavily (and sometimes entirely) on objective, computational exams. Because of options given to the students, rarely (if ever) do two students who finish our cafeteria menu of mathematics courses experience identical mathematics programs. How, then, does one measure the quality of the major? How can we ensure that each major is equipped to face a future in graduate school or in the work force?

One requirement for all of our mathematics majors is the successful completion of the Senior Seminar in Mathematics. It is designed as a culminating experience. It used to be taught using the lecture method on a mathematical topic of the instructor’s choice, and was assessed using only traditional examinations. I was first assigned to teach the course in 1992.

Having already begun to incorporate oral presentations in some of my other classes, I decided to experiment with nontraditional teaching methods and alternative assessment techniques. The course has evolved to the point that no lectures are given. Students now study a variety of mathematical topics, keep a journal, participate in class discussions, write summaries of readings, write a resume, do group problem solving, give oral presentations, and complete a project. (The project consists of writing an expository paper and of delivering an oral presentation, both of which are results of an exploration of a mathematical topic of each student’s choice.) Such activities enable students to more successfully achieve these course objectives:

1. increase his/her competence in the independent reading of mathematical materials
2. review, structure and apply mathematical knowledge
3. develop skills needed in presenting a rigorous argument
4. organize and deliver a mathematical presentation
5. pursue topics in mathematics not met in previous mathematics courses
6. explore further several topics that have been studied earlier
7. develop group problem solving skills
8. improve skills in utilizing resources
9. develop a global perspective of the role of mathematics in society.


Method

Due to the small class size (about 10), considerable individual attention can be paid to each student. Moreover, assignments can be adapted to meet the needs of a particular group of students. The course grade is determined by a collective set of assessment techniques that include holistic grading, traditionally graded problems, instructor evaluations based on preselected criteria, faculty-student consultations, and portfolios. In the paragraphs that follow, assignments are described along with the assessment techniques used for them. Objectives that are addressed by the assignments are identified by number at the end of each paragraph.

Early in the course, each student is given a different mathematical article to read and is asked to organize and deliver a five-minute talk about it. The content is elementary in nature so that the student focuses on the mechanics of communication and is not intimidated by the content. While each classmate identifies (in writing) at least one positive and one negative attribute, the instructor uses a set of criteria known to the students. Both forms of evaluation provide immediate feedback to the speaker. Using the same mathematical article, students give a second talk, implementing the assessment input from the first talk. This pair of talks serves as preparation for the 25-minute oral presentation given later as part of the course project. This longer talk is assessed in two ways: the instructor uses written evaluations based on preset criteria, and a team of three members of the department faculty uses a pass/fail evaluation based on the student’s apparent level of understanding and on the cohesion of the talk. (Objectives 1, 2, 3, 4, 5, 6)

To indicate how the major program is complemented by the general education curriculum, several class discussions of preassigned readings are held. Topics for discussion include historical, ethical, philosophical, and diversity concerns and how they are related to the mathematics profession. Assessing performance in a class discussion is based on two criteria: Did the student understand the main idea of the assigned readings? Did the student convey individual interpretations of the issues at hand? (Objectives 1, 3, 9)

There are a variety of writing assignments. These include journal writings, summaries of readings from books and professional journals, the writing of a resume, and the composition of a 20-page expository paper. Students keep a journal that reflects their thoughts on mathematics-related topics. Three to five entries are made each week. The journals are holistically graded three times during the semester. In addition, each student uses the journal to provide a partial self-assessment on the final examination for the course. (Objective 9)

Students read articles from books and journals and then write a two-page summary of each one. These are graded holistically, as they are used primarily for diagnostic purposes. Students are also required to write their resume for this course. They are urged to obtain assistance from the Office of Career Services. (Objectives 1, 3, 8, 9)

As part of the course project, each student writes a 20-page expository paper on a mathematical topic of his/her choice. The student is guided through several steps of the writing process throughout the semester. A draft of the paper is highly scrutinized based on preselected criteria (knowledge of the subject, breadth of research, clarity of ideas, overall flow of the paper, etc.). No grade is assigned to the draft. However, the same criteria are used in the final assessment of the paper. (Objectives 1, 2, 3, 4, 5, 6, 8)

Every college graduate should possess resource skills: library skills, internet skills, professional networking skills, and skills needed to find career-related information. To develop these skills, students are given sets of specific questions, often in the form of a scavenger hunt. For example, they may be asked to find the number of articles published by Doris Schattschneider in a given year and then conjecture about her field of expertise, or they may be asked to find out as much as possible about Roger Penrose from the World Wide Web. These assignments are graded on correctness of answer. (Objectives 1, 2, 5, 6, 7, 8, 9)

The content portion of the course includes a wide variety of mathematical topics that are typically not covered in other courses required of our majors. These include graph theory, complex variables, continued fractions, fractals, and non-Euclidean geometry, as well as topics in more classical areas of mathematics. (I have used books by William Dunham, The Mathematical Universe and Journey Through Genius: Great Theorems of Mathematics, as stepping stones into a vast range of topics.) This portion of the course is designed so that students both learn and tackle mathematics problems in small groups. The grouping of students is done by the instructor and is changed with each new topic and set of problems. Students earn both individual grades for correct solutions and class participation points for team interaction. Groups are changed frequently in an effort to develop some degree of leadership skill in each student, and to allow students to learn of each others’ strengths and knowledge base. (Objectives 1, 2, 3, 5, 6, 7, 8)

The final examination for the course consists of two parts: a set of questions and a portfolio. The questions are principally content-based problems, to measure the student’s mathematical knowledge based on the problems that were worked in small groups. There are a few open-ended questions that ask students to compare and contrast perspectives found in the assigned readings, written summaries, and class discussions. Students also provide an assessment of their attitudes and study habits that is based on their own journal entries. (Objectives 2, 3, 5, 6, 9)

A portfolio is prepared by each student and is focused on the individual’s perceived mathematical growth. Students are asked to describe and document the growth that they made during the past three months. Content of the portfolio need not be restricted to materials obtained from the Senior Seminar. They write an essay unifying the various items that they choose to include. Assessment of the portfolio is based upon the breadth of perceived learning and how it is unified. Preselected criteria are used in the assessment. (Objectives 3, 8, 9)

Findings

Results of assessments of the oral components of the course indicate that examinations are not always a true indicator of a student’s level of understanding. Mathematical knowledge can often be conveyed more effectively by oral discourse than by written computations or expositions.

Assessment results from writing assignments indicate that our mathematics majors need more than basic composition skills prior to enrolling in the Senior Seminar in Mathematics.

Prior to the reading assignments, few (if any) connections were recognized between the knowledge gained through general education courses and the discipline of mathematics.

Although students are more technologically prepared as they enter college than those who entered four years ago, results from the resource skills assignments indicate that little demand is made on them to develop further their information-retrieval skills during their undergraduate years.

A significant amount of bonding occurs as a result of working in small groups. Consequently, classmates are often used as resource people while the course project is being prepared. Students are forced to work with a variety of personality combinations, as each of them will have to do in the future.

Use of Findings

The oral presentations illustrate that students often can better demonstrate understanding by presenting a topic orally. Other mathematics courses should (and have begun to) incorporate oral assignments to facilitate learning.

Our major programs already require that students complete a scientific writing course, and it is strongly recommended that a student complete this course prior to enrolling in the Senior Seminar. More writing needs to be incorporated in other mathematics courses as well as remain in the capstone course. Ideally, every mathematics course should incorporate some form of nontechnical writing component; realistically, post-calculus applications-based courses such as differential equations, numerical analysis, probability and statistics, and operations research appear to be those most suited for such assignments.

Students with severe writing difficulties (grammar, punctuation, run-on sentences, etc.) or with reading deficiencies are identified early in the semester and advised to get extra help. Such assistance can be obtained without cost to the student from the Writing Lab and from the Department of Developmental Studies.

Information gleaned from the journals is used to identify an individual’s strengths and weaknesses and to identify questions and concerns that confront the student. The instructor is able to provide relevant, timely feedback and is better able to serve as a facilitator of learning.

As a result of producing a resume, students become aware of the skills that may be required for a career that uses mathematics. The academic advising of our majors should be improved so that such information is realized by students earlier in the program. Then appropriate coursework could be recommended to help develop these skills. We also need to demand that more technological and resource skills be used in lower-level mathematics courses.

Class discussions indicate the need to guide the students in making connections: those among the many aspects of the discipline of mathematics, and those between mathematics and the general education component of the university degree. This type of assignment should remain part of the Senior Seminar course.

Results from the content-based portion of the course indicate that the assessment of previous coursework is reliable inasmuch as it measures computational skills and elementary critical thinking skills. Existing performance standards in required mathematics courses appear to be consistent.

Information gleaned from student portfolios is sometimes enlightening. For example, one student wrote that she felt that the amount of writing done in this course far surpassed the total amount of writing in all of her other classes combined! This reinforces our conclusion that more writing needs to be incorporated in other courses, whether mathematics courses or not. The portfolio also provides us with an early warning system for potential problems. For example, one student complained that our mathematics program did not prepare him for entrance into the actuarial profession. (He took and did not pass the first actuarial exam three times.) This has alerted us to try to better explain the difference between an undergraduate degree in mathematics and an individual’s ability to score well on standardized tests.

Success Factors

By using a variety of assignments and assessment techniques, the Senior Seminar in Mathematics has become a place in which many skills can be developed. All of the assignments are related to the successful pursuit of a career in mathematics, and most require skills that lie outside the traditional scope of a mathematics content course. The course works because of its small enrollment, the instructor’s knowledge of a wide variety of writing techniques, mathematical fields, and assessment techniques, and the willingness of colleagues in the department to maintain quality programs. Faculty colleagues who have assessed an oral presentation very quickly gain a perspective of the strengths and weaknesses of our major programs. Continued support by them is an invaluable asset when changes in the programs are recommended.

However, it takes time and patience to build quality assignments and develop meaningful assessment devices, and this should be done gradually. The strengthening of the major that results is worth the effort.


The Use of Focus Groups Within a Cyclic Assessment Program

Marie P. Sheckels
Mary Washington College

An entirely different approach to assessing the mathematics major has been developed at a state-supported, coeducational, liberal arts college in the Midsouth. Graduating seniors participate in focus group sessions which are held two days prior to graduation. These are informal sessions with a serious intent: to assess student learning in the major.

Background and Purpose

Mary Washington College is a state-supported, coeducational, predominantly undergraduate residential college of the liberal arts and sciences. It is located in Fredericksburg, Virginia, a historic city which is about halfway between Washington, D.C. and Richmond, Virginia. The College is rated as “highly selective” in its admission status, and enrolls approximately 3000 undergraduates. There is also a small graduate program. There are ten full-time faculty members in the Department of Mathematics. On the average, 20 students graduate each year with degrees in mathematics.

Mary Washington College began to develop a program of outcomes assessment in 1989. The College wisely decided that each department, or major program within a department, should be responsible for developing its own plan to assess how well it was preparing its majors. By 1991, both faculty and administrators had learned more about outcomes assessment, and decided that assessment should be conducted according to a four-year cycle. Each major program has one faculty member, the “Outcomes Assessment Coordinator,” who is responsible for the assessment.

Method

During the first semester of Assessment Year 1, faculty examine and revise, if necessary, their list of “Outcomes Expected.” This list of goals and objectives details the essential knowledge, skills, and abilities that students who complete their major program should possess. Following this, faculty decide how they will determine the extent to which these outcomes are achieved by their major students. Faculty determine what methodologies and instruments will yield relevant data, and decide upon a timetable for these evaluation procedures. Over a four-year assessment cycle, each major program must collect data using at least one of each of the following: a direct measure (e.g., tests, capstone courses, portfolios), an indirect measure (e.g., focus groups, exit interviews), and a survey of program alumni. During the second semester of Assessment Year 1, and during Assessment Years 2 and 3, outcome assessment coordinators collect and analyze the data gathered through the various forms of assessment. Generally, it is expected that data will be collected during one semester and then analyzed and interpreted during the next semester.

During Assessment Year 4, assessment coordinators, along with other faculty, compile, analyze and interpret all of the findings of the past three years regarding their major program. Although changes in the major program which are based on assessment results can be proposed at any time during the assessment cycle, it is often helpful to wait until Year 4, when changes can be based on the cumulative results of various forms of assessment.

The 1996–97 academic year was Year 1 in the Mary Washington College assessment cycle. Departments have adjusted their schedules so that all are now on the same four-year cycle. This was the start of the mathematics department’s third cycle (although one cycle was three years rather than four). During the fall 1996 semester the Department of Mathematics extended and refined the outcomes expected of its majors. They are listed below.

Outcomes Expected

Interpretation of Mathematical Ideas
• Students will read and interpret mathematical literature.
• Students will read and interpret graphical and numerical data.

Expression of Mathematical Ideas
• Students will use mathematical symbols correctly and precisely in expressing mathematical information.
• Students will represent quantitative information by means of appropriate graphing techniques.
• Students will present well-structured and valid mathematical arguments.

Critical Thinking
• Students will employ critical thinking skills in their comprehension and application of mathematics.
• Students will analyze and construct logical arguments.

Discovery
• Students will discover mathematical patterns and formulate conjectures by exploration and experimentation.

Applications
• Students will express problems in mathematical terms.
• Students will identify areas of mathematics that are most useful in solving practical problems.
• Students will use technology appropriately in solving problems.

Appreciation
• Students will give examples of the beauty and power of mathematics.

During the past eight years the mathematics department has employed the following measures to assess its major program: mathematics tests, both alumni and faculty surveys, and focus groups with graduating seniors. During 1989, 1990 and 1995 we administered in-house assessment tests which asked the students to read, write and interpret mathematics. On each occasion, students enrolled in Calculus II (beginning-level “serious” mathematics students) and students enrolled in Real Analysis (senior level mathematics majors) took the test. Faculty compared and analyzed the answers given by the two groups of students.

In 1993, we conducted an alumni survey of recent mathematics graduates. The surveys included multiple-choice questions in which the alumni rated different aspects of their mathematics education as well as free response questions where students wrote suggestions and advice for improving the program. In 1994 we circulated a survey asking mathematics faculty for their comments on various aspects of the program. The outcomes assessment coordinator held individual comments in confidence and reported a summary to the department.

In 1992 and 1996, we conducted focus group meetings with graduating seniors. These focus groups were designed to provide information on how our majors perceived different aspects of the mathematics program and to elicit their opinions on ways the program might be improved. While all of the data have been useful, the focus groups of graduating seniors have been the most enlightening. On both occasions, we conducted the focus groups on the two days preceding graduation, after final grades had been submitted. Seniors knew that they could give their honest impressions of the program at this time without fear of negative repercussions. We invited all of our graduating seniors to attend one of two one-hour sessions. Nearly all of the seniors attended. We served drinks and snacks and there was a relaxed, congenial atmosphere; it was one last time for majors to get together. Two faculty members conducted the sessions. They took turns, one asking the questions and the other recording. Neither faculty member had taught these seniors in their upper-level courses, so the students were free to make comments about the instruction in these classes. The faculty informed the students that the purpose of the session was to record the students’ perceptions of their education in the mathematics program and asked that they be open and honest in their responses. Faculty distributed the list of questions to the students, and then one of the faculty read aloud each question in turn and encouraged discussion. Examples of questions we asked included:

• What are some of your initial feelings, opinions or comments about your experience as a mathematics major?
• Do you have any strong feelings about any specific courses? What seemed to you to be the main concern or characteristic of the math department? Why do you say this? Can you give examples?
• In what areas do you feel most prepared?
• If you could make only one change in the mathematics department, what would that be? Why?

The faculty encouraged students to extend, agree or disagree with each other’s remarks by asking questions such as “Do you all feel the same way?” or “Would anyone like to elaborate on that?” Everyone was encouraged to express their opinion. The students were very forthright, open and honest in their responses.

Findings

It was interesting to note that the data we obtained through the various types of assessment were remarkably consistent. The results of our assessments were very positive. However, we did find a few areas where our program could be strengthened. We were very gratified to hear our majors say that one of the strongest aspects of the program was that it had taught them to think and that they were confident in their abilities to solve problems. These, of course, are major goals of the program. The students were thankful for the small classes and personal attention. Seniors in the focus groups and alumni who were surveyed were very pleased with their preparation in mathematics in general. However, both groups stated that they would like to see more emphasis placed on the applications of mathematics.

Another recommendation of the focus groups was to further integrate technology into the program and encourage students to become familiar with computers. These students also recommended that we improve the department’s career advising, and suggested that we integrate some group projects in our courses, since prospective employers seemed to value this kind of experience.

In addition to learning about our major program, we have also learned about the process of assessment. The cyclic approach to outcomes assessment seems to work very well. Within certain guidelines, major programs can choose what forms of assessment best meet their needs. Using a variety of assessment techniques helps departments see the broad picture and determine which findings are consistent across measures. Since assessment activities are spread over several years, they are not overwhelming.

The focus groups, in particular, seem to yield a large quantity of valuable information for the time and effort spent. Conducting the focus groups is quite enjoyable and rewarding for the faculty — assuming that you have a good program. The students enjoyed this chance to share their opinions with the faculty and in so doing to help maintain a strong mathematics program at their soon-to-be alma mater. Moreover, they were gratified that the faculty respected them enough to care about their opinions.

Use of Findings

The results from the focus group, along with the results from our other assessment activities, helped the mathematics department determine how well we were meeting our goals and plan actions to maintain a strong major program. Partly in response to the recommendation to place more emphasis on mathematical applications, we hired two new faculty whose areas are in applied mathematics, and we strengthened the program offerings in these areas. These faculty have allowed us to offer more sections of our “applied” courses, and they have also developed and taught new upper-level courses in Chaos and Linear Models. They also have developed a new sophomore-level course entitled “Mathematical Modeling,” an interdisciplinary course in basic mathematical modeling investigating various scientific models with an environmental theme. These faculty have both worked with students on independent studies in various applied fields, and have sponsored several students in internships. In addition to hiring these new faculty, we urged all faculty members to integrate more applications of mathematics into their courses wherever possible. These modifications should help students meet the outcomes expected in the application area.

We have further integrated technology into our program by using graphing calculators in all of the Precalculus and Calculus courses and by using computers in the Statistics, Differential Equations and Linear Algebra courses, as well as in some Special Topics courses. Also, we now strongly encourage students to take computer science courses to complement their program in mathematics.

In response to the recommendation to improve career advising, we have hosted professionals from the area to speak to students, and we now have a “Careers Information Link” on the mathematics department’s home page where students can find general information on careers for mathematics majors, internship possibilities, and specific information about organizations which are currently hiring.

Also in response to focus group recommendations, we have improved placement in freshman courses, fought to keep class sizes as small as possible to ensure quality instruction, and have integrated group work and projects into existing coursework.

The results from the senior focus groups were so informative that the mathematics department is currently planning to conduct alumni focus groups. We will invite recent alumni in this area to meet with us to seek their perceptions of how well our mathematics program prepared them. We trust that we will get the same in-depth, thoughtful responses that we did from our senior majors. We then plan to use this information to help us write an alumni survey for those mathematics graduates who could not attend the focus groups.

Success Factors

We have learned much from the results of our various forms of outcomes assessment. We understand, however, that we must be cautious in interpreting the results of assessment. For example, the responses in both the focus groups and on alumni surveys were their perceptions of the program and we viewed them as such. A few of these perceptions were factually incorrect, such as on issues related to course scheduling. Student and faculty opinions did not always agree, such as on the value of certain courses and major requirements. However, this does not negate the importance of students' perceptions and opinions.

We believe that our focus groups have been successful due, in part, to precautions that we took in planning and conducting the focus groups. We scheduled the group sessions for a convenient time after grades had been turned in. Students signed up during their classes for one of the sessions. The day before their session, the department secretary called with a reminder. We selected faculty with whom the students could be open and honest and chose questions which were open-ended and invited discussion. While conducting the sessions, faculty waited patiently for responses and accepted the students' opinions without being defensive toward negative comments. Faculty encouraged the students to respond to and extend each other's responses. This was the students' turn to talk and the faculty's turn to listen.

Assessing the Major Via a Comprehensive Senior Examination

Bonnie Gold
Monmouth University (formerly at Wabash College)

At a small, private men's liberal arts college in the Midwest, a comprehensive examination, known as comps, has been a tradition for seventy years. It has evolved into the assessment technique which the department uses to assess student learning. Comps are taken by seniors over a two-day period just prior to the start of the spring semester. The exam consists of two parts: a written component in the mathematics major and an oral component over the liberal arts.

Background and Purpose

Wabash College is a small (850 student), non-religious private college for men, founded in 1832 in a small town in Indiana. The department of mathematics and computer science, with 7 full-time faculty members, offers a major and a minor in mathematics, and a minor in computer science. We graduate between 2 and 18 mathematics majors per year, many of them double majors with the other major in a field which uses mathematics heavily. In order to graduate, every student must pass a comprehensive examination (often called "comps"). This requirement has been part of the college curriculum since 1928. This examination consists of two parts: (1) a written component in the major, two days, three hours per day, and (2) an oral component over the liberal arts. The purposes of the written part of the examination include (1) having students review their courses in their major so that they will become aware of connections which may have been missed in individual courses and (2) satisfying the faculty that the graduating seniors have an acceptable level of competence in their major. The content of the written component is left to each department to put together as it pleases. The purpose of the oral component is to cause students to reflect on their whole liberal arts experience: how they have examined, and perhaps changed, their values and beliefs during their time here; what were the positive and negative aspects of their Wabash education; how they may begin to fit into the world beyond Wabash; what directions they still need to grow in and how they can continue to grow after they leave. The oral is a 50-minute exam with each student examined by three faculty members, one from the major, one from the minor, and one at-large. Thus, the major is also examined in this component. It provides for the department very different information from that provided by the written part, as will be discussed below. Because we have been giving these examinations for many years, there has been a lot of opportunity to reflect on and change the curriculum in response to weaknesses we find in our seniors on comps.

Method

The written part of the comprehensive examination is given two days before the start of the spring semester. The mathematics exam is always given in the afternoons, while most other disciplines give theirs in the mornings. Thus, mathematics majors who are double majors (many are) can take two sets of exams during the two days. In November, the department discusses exactly what format that year's comps will take. There is always an essay or two (recently over post-calculus core courses, e.g., abstract algebra and real variables), as well as problems over the calculus sequence, on the first day, and advanced topics on the second day. However, the number of essays, how the topics are selected, exactly where the dividing line between the calculus sequence and advanced topics will be, and whether students are allowed aids such as calculators or textbooks, changes from year to year based on changes we have been making in the courses students take. For example, recently we decided that we were interested in how well students understand calculus concepts, not in how well they remembered formulas they had learned three years earlier. So we allowed them to bring texts or a sheet of formulas to the first day's exam. We recently changed from requiring all students to take both real variables and abstract algebra for the major to requiring only one of these courses. Therefore this year's exam only had one essay question, a choice between a topic in algebra and a topic in analysis. However, beginning with next year's class, all students will have taken a full semester of linear algebra as sophomores, and so we will then be able to require everyone to write an essay on a topic from linear algebra and one from either real variables or abstract algebra.

The essay topics are rather broad; from algebra: algebraic substructures or homomorphisms of algebraic structures; from real variables: differentiability in R and R^n, or normed vector spaces. The instructions say: write an essay which includes definitions of appropriate terms, relevant examples and counterexamples, statements of several important theorems, and the proof in detail (including proofs of lemmas used) of at least one of these theorems. Students are given a short list of potential topics in advance, and are told that the topics they will have to write on will be chosen from among these. They are encouraged to prepare these essays in advance, and even speak with members of the department about what they're planning to include. However, they write the essay without notes on the day of the exam. We're looking for their ability to choose appropriate examples and theorems, to explain them, and to put this all together into a coherent whole.

The non-essay portions of the exam are typical course test problems. Whoever has taught a given course within the last two years writes three problems over that course, and gives them to the exam committee (which consists of two people). The committee then takes all the submitted questions and chooses which to use. They try to have a good balance of easy and difficult problems. Since over the course of their four years the students have had a wide choice of elective courses, the exam committee must make sure that the second day's examination on advanced topics has a sufficient balance of problems to enable each student to choose problems from courses he has studied. The problems are not supposed to require memorization of too much information or otherwise be too obscure; rather, they should cover the main points of the courses. A few sample questions:

From calculus: Find a cubic polynomial g(x) = ax^3 + bx^2 + cx + d that has a local maximum at (0,2) and a local minimum at (5,0).

From number theory: Find all solutions in positive integers for 123x + 360y = 99.
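
To indicate the expected level, here is one possible solution sketch (ours, not taken from any grading key). For the calculus question, g(0) = 2 and g'(0) = 0 force d = 2 and c = 0; then

    g'(5) = 75a + 10b = 0 and g(5) = 125a + 25b + 2 = 0

give a = 4/125 and b = -6/25, and the sign of g'(x) = (12/125)x(x - 5) confirms the maximum at x = 0 and the minimum at x = 5. For the number theory question, gcd(123, 360) = 3 divides 99, so integer solutions exist; but dividing by 3 leaves 41x + 120y = 33, and 41x + 120y ≥ 41 + 120 = 161 > 33 whenever x and y are positive, so there are no solutions in positive integers.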

Students are given identification numbers (the correspondence between names and ID numbers is sealed until the written exams have been graded and grades decided on), and all questions are double-graded, with the department sharing equally the job of grading. When there is a substantial disagreement between the two graders of a given question, they look over all the papers which answer that question and regrade that question together. The college requires that the oral component comprise 1/4 of the grade, but that, to pass the comprehensive examination, students must pass both the oral and written parts separately.

Each student has a different oral examination committee. All examiners are to examine the student over the liberal arts generally, but usually each one has about 15 minutes during which (s)he has principal responsibility for the questions (but the others are encouraged to pursue the student for more details if they feel the response is inadequate). Generally, the questions mathematicians ask of students who are mathematics majors are questions which would not have been asked in the written portion. They involve philosophical questions ("is mathematics a science or an art"), or test the student's ability to speak about mathematics to non-mathematicians ("please explain to Professor Day, who teaches classics, the idea of the derivative"). While we have several other methods of assessing the major, the oral exam is the only assessment tool for the minor (other than examinations in individual courses), and students with mathematics minors are often asked questions about the relation between mathematics and their chosen major (usually a physical or social science).

Findings

When I first came to Wabash, the comprehensive exams were often a source of considerable discomfort to the department. When asked to explain what the derivative was, mathematics majors would often start (and end!) by explaining that the derivative of x^n was nx^(n-1). Students' grades, even those of good students, on the written comprehensives were often poor. Many majors have advisors outside of mathematics, and in the past, some accumulated a number of D's in mathematics courses. They then came to the comprehensive examination unable to pass. It's too late, in the second semester of the senior year, to tell a student he'd better get serious or find another major. We have made numerous changes in the program due to these results, and have also started failing students when they do sufficiently poorly (below 60% overall). When this happens, the department must write a new examination and the student must take the comps again in April in order to be able to graduate with his class. If he fails again, then he has to wait until the following year's exams. This has made most students take the exams fairly seriously. (About 10 years ago, 4 students failed on the first try, none of whom had done any serious studying. All passed on the second attempt. In the last 5 years, we've only had to fail one student.) At the moment, when students have studied for the exams, the results are largely consistent with the department's view of the student: good students (usually 1 or 2 each year) — heading to graduate school or promising careers — write exams which merit distinctions (90% or better, approximately), mediocre students (roughly 1/3) barely pass, and middle level students — the remainder — either do a good pass or a "high pass" (80% to 90%). Mathematics majors are now graduating with a better understanding of modern mathematics than in the past. The performance of minors and other students on the oral portion, however, is still causing some changes in the lower level curriculum.

Use of Findings

Over the years, having obtained unsatisfactory results in comprehensive exams, we have made many changes in departmental offerings and requirements. We started requiring abstract algebra and real variables when it became clear (from oral exam answers like the one mentioned above) that students were not getting a sufficiently rigorous theoretical background in mathematics. We started trying calculus reform when we saw how little students remembered from their calculus experience, and the minimal understanding demonstrated on the oral exam by minors and others who had not gone further. These changes in turn led to other changes, such as the introduction of a required, theorem-based linear algebra course in the sophomore year. On the other hand, because we want to allow a variety of electives, having added the linear algebra course, we decided to allow students to choose between real variables and abstract algebra, while encouraging them to take both. We have also moved to more active learning in the classroom, to help students become better able to handle the material. Students are learning more: we are graduating fewer students that we are embarrassed to acknowledge as mathematics majors than we had in the past.

To deal with the problem of students having advisors outside mathematics and surprising us by taking and failing comps, we adapted St. Olaf College's contract major. Each student signs a contract with the department describing what direction he plans his major to take (towards actuarial work, applied mathematics, pure mathematics, etc.). The contract is signed before registration for the second semester of the sophomore year, when students have their first choice of mathematics courses (between multivariable calculus and differential equations, though both are recommended). Each contract has some flexibility within it, and by petitioning the department, students can switch contracts if they find that their interests change. But because they sign a contract as sophomores, students begin working with the department much earlier in their student careers, when it is still possible to prod the student to the required level of effort.

Success Factors

Our capstone course, which involves readings in history of mathematics and preparation of a senior expository paper, is taken in the semester just before the comprehensive examination. The combination of a course spent reflecting on the history of mathematics together with the comprehensive examination for which the students review the courses they have taken helps students place the individual courses in perspective. Toward the end of the capstone course students are given a copy of the previous year's examination and there is some opportunity to discuss the upcoming comps.

The faculty meets weekly over lunch to discuss departmental business. Comps help focus our discussions, both before writing the test in the fall and after finishing grading them in the spring. In the fall we think about the progress our seniors have made, and what their focuses are. In the spring, we think about what changes we need to make in the program due to weaknesses which show up on the students' examinations.

The timing of the written comps has changed over the years. A long time ago, they were in April. However, if a student failed the examination he would have to wait until the following year to retake it and graduate. So they were moved to late January, during classes. Then, because seniors not only missed two days of classes, but often a whole week (a few days before to study, and a day or so after to celebrate), the written part of the comps was moved to the first two days of the spring semester. This way, grades are turned in by spring break, and any students who fail can retake them once in April and thus have a chance to graduate with their class.

We have had to fail students every several years on the exams; otherwise, some students do not take them seriously. However, since we switched to the contract major, students who have failed the written part have admitted that they didn't put much effort into studying for the exam. Nonetheless, both because of the additional trouble it causes the department, and because sometimes we're not convinced that a given student is capable of performing better on a second attempt, there is a contingent of the department which opposes failing anyone. It is always painful to fail our students, both because it's not clear whether we failed them or they failed us, and because of the additional time and suffering it causes us all. However, having established a history of doing this, our students in the last few years have taken the examinations fairly seriously and the results seem to reflect well what they have learned.

Alumni remember their comprehensive exams, especially the oral portion, vividly, particularly the composition of the committee and the questions they struggled with. They view the process as an important rite of passage, and one of the hallowed traditions of the college.

A Joint Written Comprehensive Examination to Assess Mathematical Processes and Lifetime Metaskills

G. Daniel Callon
Franklin College

A rather unique approach to giving a comprehensive exam to seniors is described in this article by a faculty member at a small, private co-ed college in the Midwest. The exam is taken by seniors in their fall semester and lasts one week. It is a written group exam which is taken by teams of three to five students. Currently, the exam is written and graded by a faculty colleague from outside the college.

Background and Purpose

Franklin College is a private baccalaureate liberal arts college with approximately 900 students. It is located in Franklin, Indiana, a town of 20,000 situated 20 miles south of Indianapolis. About 40% of our students are first-generation college students, and about 90% come from Indiana, many from small towns.

The Department of Mathematical Sciences offers majors in applied mathematics, "pure" mathematics, mathematics education, computer science, and information systems, and graduates between 8 and 15 students each year. The department has actively pursued innovative teaching strategies to improve student learning and has achieved regional and national recognition for its efforts, including multiple grants from the Lilly Endowment, Inc., and the National Science Foundation, and was named in 1991 by EDUCOM as one of the "101 Success Stories of Information Technology in Higher Education." The department seeks to promote active learning in the classroom through the implementation of cooperative learning and discovery learning techniques and the incorporation of technology. Members of the department have been unanimous in their support of the department's initiatives, and have participated on their own in a variety of programs at the national, regional, and local levels to improve teaching and learning. The department also maintains close ties with local public and private school systems, and has worked with some of those schools to aid in their faculty development efforts.

As part of a college-wide assessment program, the Department of Mathematical Sciences developed a departmental student learning plan, detailing the goals and objectives which students majoring in mathematics or computing should achieve by the time of graduation. For mathematics, there were three major goals. The first goal related to an understanding of fundamental concepts and algorithms and their relationships, applications, and historical development. The second centered on the process of development of new mathematical knowledge through experimentation, conjecture, and proof. The third focused on those skills which are necessary to adapt to new challenges and situations and to continue to learn throughout a lifetime. These skills, vital to mathematicians and non-mathematicians alike, include oral and written communication skills, the ability to work collaboratively, and facility with the use of technology and information resources.

The focus of efforts to assess students' attainment of these goals on a department-wide basis is a college-mandated senior comprehensive practicum, the format of which is left to the discretion of individual departments. In mathematics, the senior comprehensive is a component of the two-credit-hour senior seminar course, which is taught in the fall (to accommodate mathematics education majors who student teach in the spring). The seminar is designed as a capstone course, with its content slightly flavored by the interests of the faculty member who teaches it, but generally focusing on mathematics history and a revisiting of key concepts from a variety of courses and relying largely on student presentations. The latter provide some information for departmental assessment of student achievement in some of the areas of the three major goals. Other assessment methods, including departmental exit interviews and assessment efforts in individual classes, provide useful information but lack either the quantitative components or the global emphasis needed for a complete picture of student achievement.

Previously our senior comprehensive had consisted of a one-hour individual oral examination, with three professors asking questions designed to assess the student's mastery of concepts from his/her entire mathematics program. The emphasis of the oral exam was to draw together threads which wound through several different courses. Those professors evaluated the quality of the responses and each assigned a letter grade; those grades were then averaged to produce the student's grade on the examination. Students were given some practice questions beforehand, and the exam usually started with one or more of those questions and then branched into other, often related topics.

When additional money was budgeted for assessment efforts, a decision was made to add the Major Field Achievement Test (MFAT) from Educational Testing Service as an additional component of the senior comprehensive to complement the oral examination. However, that still left two of the three goals almost untouched, and we could find no assessment instruments available that came close to meeting our needs.

So we decided to come up with our own instrument. We wanted to address essentially the second and third goals from above, consisting of the experimentation-conjecture-proof process and the lifetime metaskills. We also wanted to include the modeling process, which is part of the first goal dealing with concepts and their application and development but is not covered in either the oral examination or the MFAT. We did not see any need to address the oral communication aspect of the lifetime metaskills in the third goal, which we felt was sufficiently covered in the oral examination. The result was the joint written comprehensive examination, which we have used since 1993.

Method

The joint written comprehensive examination is given around the fifth or sixth week of the senior seminar class. Students are informed at the beginning of the semester about the purpose and format of each of the components of the senior comprehensive practicum, but no specific preparation is provided for the joint written exam other than the general review included in the senior seminar course. Students in the class are divided into teams of 3–5. The class determines how the teams are to be selected if there are more than five students in the class. They are given a week to complete the test, and the class does not meet that week. Tests consist of four to five open-ended questions, and involve modeling, developing conjectures and writing proofs, and use of library and electronic information resources. Any available mathematical software is allowed. The team distributes the workload in any manner it deems appropriate, and is responsible for submitting one answer to each question.

Since the senior oral examination is evaluated by departmental faculty, we have a colleague from another college or university who is familiar with our environment write and grade the joint written comprehensive exam for a small stipend. The first two years we were fortunate to have a colleague who formerly taught at Franklin and now teaches at a slightly larger university in Indianapolis develop the exam, and her efforts helped smooth out many potential rough spots. The last two years we have employed colleagues at other similar institutions in Indiana, which has provided some variety in the questions and therefore allowed us to obtain more useful assessment information.

The following are a few excerpts from questions on the tests. Each of the four tests given thus far is represented by one question. (If taken in their entirety, the four questions together are a little more than the length of a typical exam.)

1. To facilitate a presentation for the annual Franklin College Math Day, it is necessary to construct a temporary computer communication link from the Computer Center to the Chapel. This link is to be strung by hanging cable from the tops of a series of poles. Given that poles can be no more than z feet apart, the heights of the poles are h feet, cable costs $c per foot and poles cost $p each, determine the minimum cost of the materials needed to complete the project. (Be sure to include a diagram and state all the assumptions that you make.)
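
One way a team might begin (a sketch under deliberately crude assumptions; the distance D between the two buildings is not given in the excerpt): if the cable is treated as taut, so that sag is ignored, then ⌈D/z⌉ spans and ⌈D/z⌉ + 1 poles are needed, and the materials cost is approximately cD + p(⌈D/z⌉ + 1). A more faithful model lets the cable hang between pole tops, so that each span uses more than its horizontal length of cable and the spacing of the poles becomes a genuine optimization variable.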

2. Find and prove a simple formula for the sum

    1/(1^4 + 4) + 3/(3^4 + 4) + 5/(5^4 + 4) + ... + (2n - 1)/((2n - 1)^4 + 4).
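
A closed form can be found by partial fractions (a sketch, assuming the sum is as reconstructed above): the factorization k^4 + 4 = (k^2 - 2k + 2)(k^2 + 2k + 2) = ((k - 1)^2 + 1)((k + 1)^2 + 1) gives

    k/(k^4 + 4) = (1/4)[1/((k - 1)^2 + 1) - 1/((k + 1)^2 + 1)],

so summing over the odd integers k = 1, 3, ..., 2n - 1 telescopes to

    (1/4)[1 - 1/((2n)^2 + 1)] = n^2/(4n^2 + 1).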

3. Geographers and navigators use the latitude-longitude system to measure locations on the globe. Suppose that the earth is a sphere with center at the origin, that the positive z-axis passes through the North Pole, and that the positive x-axis passes through the prime meridian. By convention, longitudes are specified in degrees east or west of the prime meridian and latitudes in degrees north or south of the equator. Assume that the earth has a radius of 3960 miles. Here are some locations:

St. Paul, Minnesota (longitude 93.1° W, latitude 45° N)
Turin, Italy (longitude 7.4° E, latitude 45° N)
Cape Town, South Africa (longitude 18.4° E, latitude 33.9° S)
Franklin College (longitude 86.1° W, latitude 39.4° N)
Calcutta, India (longitude 88.2° E, latitude 22.3° N)
...


(b) Find the great-circle distance to the nearest mile from St. Paul, Minnesota, to Turin, Italy.
(c) What is the distance along the 45° parallel between St. Paul and Turin?

...

(f) Research the "traveling salesman problem." Write a well-prepared summary of this problem. In your writing indicate how such a problem might be modeled in order to find a solution.
(g) Starting and ending at Franklin College indicate a solution to the traveling salesman problem which minimizes the distance traveled. (Assume there is a spaceship which can fly directly between cities.)
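
Part (b) invites a computation that teams could carry out with whatever software was available; one standard approach (sketched here in Python with illustrative names, not a tool the teams were required to use) converts the coordinates to radians and applies the spherical law of cosines:

    import math

    def great_circle_miles(lat1, lon1, lat2, lon2, radius=3960.0):
        # Latitudes/longitudes in degrees; west longitudes entered as negative.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        # Spherical law of cosines for the central angle between the points.
        central = math.acos(math.sin(p1) * math.sin(p2)
                            + math.cos(p1) * math.cos(p2) * math.cos(dlon))
        return radius * central

    # St. Paul (45 N, 93.1 W) to Turin (45 N, 7.4 E): about 4550 miles,
    # noticeably less than the roughly 4910 miles along the 45th parallel in (c).
    print(round(great_circle_miles(45.0, -93.1, 45.0, 7.4)))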

4. One of the major ideas of all mathematics study is the concept of an Abstract Mathematical System (AMS for short). Consider the AMS which is defined by: a nonempty set S = {a, b, c, ...} with a binary operation * on the elements of S satisfying the following three assumptions:

A1: If a is in S and b is in S then (a*b) is in S.
A2: If a*b = c then b*c = a.
A3: There exists a special element e in S such that a*e = a for each a in S.

Which of the following are theorems in the above abstract mathematical system? Prove (justify) your answers.

(a) e is a left identity, that is, e*a = a for all a in S.
(b) a*a = e for all a in S.
...
(d) * is a commutative operation.
...
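
To indicate the style of reasoning the question invites (a sketch, not taken from any grading key): applying A2 to the equation a*e = a supplied by A3 (with b = e and c = a) yields e*a = a, which is part (a); applying A2 once more, now to e*a = a, yields a*a = e, which is part (b).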

Findings

For individual students, results from each of the components of the senior comprehensive practicum (the oral examination, the joint written examination, and the MFAT) are converted into a grade. These grades are assigned a numerical value and averaged on a weighted basis for a grade on the entire practicum, which appears on the transcript (with no impact on the GPA) and also is part of the grade for the senior seminar course. The oral examination is weighted slightly more than the other two due to its breadth and depth, and has remained unchanged since the tests complement each other so well. For the department, the results of all three components are evaluated to determine whether any modifications to departmental requirements or individual courses are indicated.
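
As a small illustration of this bookkeeping (a sketch only: the article does not state the exact weights, so the 40/30/30 split below is hypothetical), the practicum grade is simply a weighted average of the three component grades after conversion to numbers:

    # Hypothetical weights; the text says only that the oral exam
    # counts slightly more than the joint written exam and the MFAT.
    weights = {"oral": 0.40, "joint_written": 0.30, "mfat": 0.30}

    # Component grades already converted to a numerical (4-point) scale.
    grades = {"oral": 3.7, "joint_written": 3.3, "mfat": 4.0}

    # Weighted average for the practicum grade.
    practicum = sum(weights[c] * grades[c] for c in weights)
    print(round(practicum, 2))  # 3.67 for this hypothetical student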

In analyzing the results of the joint written comprehensive exam, what has struck us first and most strongly has been the need for more focus on modeling. Students have had difficulty in approaching the problems as well as in communicating their assumptions and solutions. We have also noted room for improvement in the use of information resources. Real strengths have been the use of technology, particularly in the experimentation-conjecture-proof process, and working collaboratively, although the latter seems stronger in the development of solutions than in putting results in writing.

The remaining components of the senior comprehensive, the oral exam and the MFAT, have confirmed the impressions of our students which faculty members have developed by observing and interacting with them over four years, albeit with an occasional surprise. The difference in the tests' formats has allowed almost all students to showcase their abilities and accomplishments. In addition, our students have compiled a strong record of success both globally and individually on the MFAT, with consistently high departmental averages and very few students falling below the national median.

Use of Findings

Two major changes have resulted from these findings. First, a modeling requirement (either a course in simulation and modeling or a course in operations research, both part of our computing curriculum) has been added to the related field requirement for mathematics education and "pure" mathematics majors, solely based on the outcomes of the joint written comprehensive exam. We have also tried to be more conscious of the process of developing and analyzing models in the applied mathematics curriculum. Second, as a result of trying to answer the question of how to develop the competencies we are testing in the senior comprehensive exams, our department has also moved beyond curriculum reform of single courses to programmatic reform, in which we develop goals and objectives for individual courses and course sequences under the framework of the departmental goals and objectives. This has led to the identification of developmental strands, in which each objective (such as the development of oral communication skills) is specifically addressed in three or four courses with an emphasis on building on previous accomplishments rather than each course standing alone. These strands begin with the freshman calculus courses and continue through the entire four-year program. Sophomore-level courses in multivariable calculus and linear algebra have been particularly focused as a result, whereas in the past we would fluctuate in which goals were emphasized and to what degree.

A more subtle effect has been in faculty's awareness of what goes on in other courses. Not only are we more in touch with what content students have seen before they arrive in our classes, we also know (or can determine) how much exposure they have had to applications, written and oral communication skills, and other components of our goals and objectives. Although our department has had a long-standing tradition of effective working relationships and good communications, we have been surprised at how helpful this process has been.

The influences of the other two components of the senior comprehensive have been less pronounced. The oral exam has emphasized to faculty the importance of tying concepts together from course to course. The success of our students on the MFAT has been a helpful tool in recruiting and in the college's publicity efforts as well as in demonstrating the quality of the department's educational efforts to the college administration.

Success Factors

The joint written comprehensive examination fits well into Franklin's liberal arts philosophy since students are accustomed to being asked to draw together threads from a variety of courses and topics, both within and across disciplines. The fact that our department unanimously endorses the importance of all three components of the senior comprehensive results in an emphasis on similar themes in a variety of courses and encourages students to take the exams seriously. The willingness of our test authors to take on such a unique challenge has played a major role in the progress we've achieved. We have talked about the possibility of arranging for a consortium of similar colleges and universities to work together on a common joint written examination, with the result that the workload could be spread around and additional data and ideas generated.

Probably the biggest practical drawbacks to the implementation of a joint written comprehensive examination in the manner we have are logistical, including funding to pay the external reviewer and identifying willing colleagues. It would also be difficult for those departments which do not have a specific course or time frame in which such an instrument can be administered, since the investment of student time is quite large. It is also vital to get departmental consensus about what students should be able to do and how to use the information acquired, although that will probably be less of an issue as accrediting agencies move strongly toward requiring assessment plans to achieve accreditation.

Undergraduate Core Assessment in the Mathematical Sciences

John W. Emert and Charles R. Parish
Ball State University

In this article a "less" comprehensive exam to assess student learning in the core courses taken by all undergraduate mathematics majors at a regional, comprehensive university in the Midwest is discussed. We are guided through the process involved in developing the assessment instrument which is used in all four tracks of the mathematical sciences program: Actuarial Science, Mathematics, Mathematics Education and Statistics.

Background and Purpose

Ball State University is a comprehensive university with approximately 17,000 undergraduates. The University offers a strong undergraduate liberal and professional education and several graduate programs. The Department of Mathematical Sciences includes about 40 faculty, and graduates about 50 undergraduate majors each year.

Assessment activities for mathematics majors began at Ball State University in 1988 with data collection using the Educational Testing Service (ETS) Major Field Test in Mathematics. Coincidentally, our department initiated in the same year a new core curriculum to be completed by all of our undergraduate majors. These students pursue one of four majors: Actuarial Science, Mathematics, Mathematics Education, or Statistics. The core of courses common to these programs has grown to now include: Single- and Multivariable Calculus, Linear Algebra, Discrete Systems, Statistics, Algebraic Structures (except Actuarial Science) and a capstone Senior Seminar.

Experience with the ETS examinations and knowledge of the mathematical abilities needed by successful graduates suggested that a specifically focused evaluation of the new core curriculum was both appropriate and desirable. It became apparent that there were several expectations relative to the core courses: the ability to link and integrate concepts, the ability to compare and contrast within a conceptual framework, the ability to analyze situations, the ability to conjecture and test hypotheses, formation of reasoning patterns, and the gaining of insights useful in problem solving. That is, the core courses (and thus the assessment) should maximize focus on student cognitive activity and growth attributable to concept exposure, independent of the course instructor.

Method

Development and construction of a pilot instrument (i.e., a set of questions to assess the core curriculum) were initiated during 1991 and 1992. We independently formulated a combined pool of 114 items using various formats: true/false, multiple choice, and free response. Several items were refined and then we selected what to keep. Selection of items was based on two criteria: there should be a mix of problem formats, and approximately half of the questions should be nonstandard (not typically used on in-class examinations). Part I, containing 21 items covering functions and calculus concepts, and Part II, containing 18 items covering linear algebra, statistics, discrete mathematics, and introductory algebraic structures, were constructed. Each part was designed to take 90 minutes for administration.

The nonstandard questions asked students to find errors in sample work (e.g., to locate an incorrect integral substitution), extend beyond typical cases (e.g., to maximize the area of a field using flexible fencing), or tie together related ideas (e.g., to give the function analogue of matrix inverse). These questions often asked for justified examples or paragraph discussions.

Departmental faculty members other than the investigators were assigned to select subjects and administer the instruments in Fall 1991 (Part I) and Fall 1992 (Part II). Subjects were students who had completed appropriate core components, and were encouraged by the department to participate on a voluntary basis during an evening session. The students were not instructed to study or prepare for the activity, and there was no evidence to indicate that special preparation had occurred. While the student population for each part was relatively small (between 10 and 15), each population represented a good cross-section of the various departmental majors. Subjects were encouraged to give as much written information as possible as part of their responses, including the true/false and multiple choice items. Responses proved to be quite authentic and devoid of flippant, inappropriate comments. The subjects gave serious consideration to the instruments and gave fairly complete responses.

The responses were evaluated by the investigators independently, using a scoring rubric developed for this purpose. This rubric is presented in Table 1. Differences in interpretation were resolved jointly through discussion and additional consideration of the responses. However, we did not formally address the issues of inter-rater reliability and statistical item analysis.

Table 1: Scoring Rubric

Score  Criteria

3      Conceptual understanding apparent; consistent notation, with only an occasional error; logical formulation; complete or near-complete solution/response.

2      Conceptual understanding only adequate; careless mathematical errors present (algebra, arithmetic, for example); some logical steps lacking; incomplete solution/response.

1      Conceptual understanding not adequate; procedural errors; logical or relational steps missing; poor response or no response to the question posed.

0      Does not attempt problem or conceptual understanding totally lacking.

Findings

Data collected suggested that subjects appear to be most comfortable with standard items from the first portion of the Calculus sequence—functions, limits, and derivatives. Subjects showed marginally better aptitude through nonstandard items than standard items in the subsequent topics of Calculus—integration, sequences, and series. In fact, by considering the frequencies associated with the Calculus items, a clearer picture of subject responses emerged. Of the items attempted by the subjects, slightly more than half of the responses were minimally acceptable. The remainder of the responses did not reflect adequate conceptual understanding. About two-thirds of those giving acceptable responses presented complete or near-complete responses. The data suggested that most inaccurate responses were not due to carelessness, but rather incomplete or inaccurate conceptualization.

For the non-Calculus items, subjects demonstrated great variability in response patterns. A few subjects returned blank or nearly-blank forms, while others attempted a few scattered items. When only a few items were attempted, subjects tended to respond to nonstandard items more often than to standard items. It appeared that subjects were more willing to attempt freshly-posed, nonstandard items than familiar but inaccessible standard items. A few subjects attempted most or all of the items. Of the non-Calculus items attempted by the subjects, about half of the responses were minimally acceptable.

Subsequently, revised (shortened) instruments were administered to a second set of approximately 15 subjects in Spring 1996. The resulting data suggested that the same basic information could be obtained using instruments with fewer items (12 to 15 questions, balanced between standard and nonstandard items) and a 60 minute examination period. It was observed from response patterns to the original and revised instruments that significant numbers of subjects do not explain their answers when specifically asked to do so. By the students' responses, it appears that attitudinal factors may have interfered with the assessment process. It was perplexing to the investigators that departmental majors appear to possess some of the same undesirable attitudes toward mathematics that cause learning interference in non-majors. We are in the midst of a pre/post administration of a Likert-type instrument developed to assess our majors' attitudes toward mathematics.

It should be noted that all core courses during the period of assessment were taught by standard methods. Because course syllabi did not require a technology component at the time, the degree of use of available technology such as graphing calculators or computer software varied widely among instructors and courses.

Use of Findings

Our students need experience pushing beyond rote skills toward a more mature perspective of the subject. In order to help our students to develop better discussion and analysis skills, restructuring of the core courses was carried out beginning in 1994. We expect these changes to introduce students to more nonstandard material and to nonstandard approaches to problem solving. These changes should lead to better problem-solving abilities in our students than is currently apparent.

Restructured versions of our Calculus, Discrete Mathematics, and Algebraic Structures courses came online in Fall 1996. Our Calculus sequence has been restructured and a new text selected so that the organization of the course topics and their interface with other courses such as Linear Algebra is more efficient and timely. More discretionary time has been set aside for the individual instructors to schedule collaborative laboratory investigations, principally using Matlab and Mathematica. The extant Discrete Mathematics course has been restructured as a Discrete Systems course and expanded to include logic, set theory, combinatorics, graph theory, and number systems development. The Algebraic Structures course now builds on themes introduced in the Discrete Systems course. A restructured Linear Algebra course with integrated applications using graphing calculators or computer software is in process. We anticipate that these curricular refinements will be evaluated during the 1998–99 academic year.

Success Factors

This project forced us to grapple with two questions: "What do we really expect of our graduates?" and "Are these expectations reasonable in light of current curricula and texts?" This concrete approach to these questions helped us to focus on our programmatic goals, present these expectations to our students, and gauge our effectiveness to this end. The project has guided us to redefine our curriculum in a way that will better serve our graduates' foreseeable needs. As their professional needs change, our curriculum will need continued evaluation and refinement.

Developing a custom instrument can be time intensive, and needs university support and commitment. Proper sampling procedures and a larger number of student subjects should be used if valid statistical analyses are desired.

Reference

[1] Emert, J.W. and Parish, C.R. "Assessing Concept Attainment in Undergraduate Core Courses in Mathematics," in Banta, T.W., Lund, J.P., Black, K.E., and Oblander, F.W., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass Publishers, San Francisco, 1996, pp. 104–107.

Outcomes Assessment in the B.S. Statistics Program at Iowa State University

Richard A. Groeneveld and W. Robert Stephenson
Iowa State University

Although the statistics program at this large Ph.D.-granting university in the Midwest is not housed within a mathematical sciences department, the wide variety of measures used to assess the statistics program can serve as one model for assessing the mathematics major. All of the assessment measures used are described with particular emphasis on surveys of graduates and surveys of employers of graduates.

Background and Purpose

Iowa State University (ISU) is a large land-grant institution with a strong emphasis in the areas of science and technology. It was, in fact, designated the nation's first land-grant college when Iowa accepted the terms of the Morrill Act in 1864. It is located in Ames, Iowa, near the center of the state about 30 miles north of Des Moines. The current enrollment is about 24,000 students and about 20,000 of these are undergraduates, eighty-five percent from Iowa. The B.S. program in statistics began with the establishment of the Department of Statistics at ISU in 1947. It was one of the first universities in the U.S. or elsewhere to offer a curriculum leading to the B.S. degree in statistics. The first B.S. degree in statistics was awarded in 1949, and through 1997 over 425 individuals have received this degree. The enrollment in the major has remained reasonably stable over the period 1975–1997 in the range of 30–40 students. The current curriculum for the B.S. statistics degree includes:

1. An introductory course in statistics.

2. A three semester sequence in calculus and a course in matrix algebra or linear algebra.

3. A two semester sequence in probability and mathematical statistics (with the prerequisites of (2) above).

4. A two semester sequence in statistical methods including the analysis of variance, design of experiments, and regression analysis.

5. A two semester sequence in statistical computing.

6. A course in survey sampling design.

7. Two or more elective courses in statistics at the senior level or above.

In addition, the College of Liberal Arts and Sciences (LAS) has a variety of distribution requirements in the arts and humanities, foreign language, communication, the natural sciences and the social sciences.

The goals of the B.S. program in statistics had been implicitly considered as early as 1947 when the curriculum for the B.S. degree was established. However, the aims of the program had not been written in explicit form. A memorandum in September 1991, from the Office of the Provost, required each department and college to develop a plan for assessing the outcomes of undergraduate instructional programs. In response, in spring 1992 the Department submitted a document entitled "Student Outcomes Assessment for the B.S. Program in Statistics" to the Dean of the LAS College, which summarized the intended outcomes for our undergraduate major as follows:

Students completing the undergraduate degree in statistics should have a broad understanding of the discipline of statistics. They should have a clear comprehension of the theoretical basis of statistical reasoning and should be proficient in the use of modern statistical methods and computing. Such graduates should have an ability to apply and convey statistical concepts and knowledge in oral and written form. They should have the technical background and preparation to assume an entry level statistics position in commerce, government or industry. Academically talented and strongly motivated B.S. level graduates should have adequate background to pursue study towards an advanced degree in statistics.

With the idea of finding suitable measures of program performance, the 1992 plan also included the following abilities, knowledge and/or skills expected of B.S. graduates in statistics.

1. A knowledge of the mathematical and theoretical basis of statistical inference and reasoning. At a minimum this includes a knowledge of calculus, through multivariable calculus, of matrix theory and of probability and mathematical statistics.

2. A knowledge of statistical methods commonly used in practice. These include the analysis of variance, the design of experiments, the design of statistical surveys, and multiple regression.

3. Competency in the use of modern statistical computing. This includes facility with one or more statistical packages such as the Statistical Analysis System (SAS) or the Statistical Package for the Social Sciences (SPSS). Graduates should be capable of using modern graphical and display methods with real data.

4. An ability to summarize and present the results of a statistical study, orally or in writing to an educated, but not necessarily statistically expert, audience.

5. Proficiency in the use of statistics in a particular area of application or proficiency in the statistical analysis of a particular type of data.

Method

The following measures and procedures have been used over the period 1992 to date to assess the success in achieving the goals or knowledge/skills for students in the B.S. program in statistics at ISU. A brief description of these methods is presented here together with an indication of which goal(s) or knowledge/skill(s) mentioned in Section 1 they are designed to measure or reflect.

1. A distribution of B.S. level statistics graduates' grades (without names) has been maintained for the following categories of courses:

a) Mathematical and theoretical courses
b) Statistical methods courses
c) Computer science and statistical computing courses
d) Statistics courses
e) All courses.

Grade data is summarized annually by the Director of Undergraduate Studies and is included in an annual Outcomes Assessment report to the LAS Dean. This information is also available to students interested in majoring in statistics.

These grade records provide information about the goals of B.S. graduates having a clear comprehension of the theoretical basis of statistics and being proficient in the use of modern statistical methods and computing. They refer directly to assessment of program and student success mentioned in the first three knowledge/skills categories in the first section.

2. A record of the performance on Actuarial Examinations 100 (General Mathematics) and 110 (Probability and Statistics) has been kept.

While taken by only a small number of our graduates, these standardized examinations assess a B.S. student's knowledge of the mathematical and theoretical basis of statistics.

3. The Department keeps a record of summer undergraduate internships involving statistics in the Statistical Laboratory Annual Report, which receives wide distribution.

4. A record of the first positions or activities of our graduates is maintained. The names of the corporations, governmental bodies, or other institutions employing B.S. graduates together with the positions our students have obtained are listed. These positions or activities are grouped as commercial, governmental, manufacturing, graduate school and other.

The success of our graduates in obtaining internships and first positions in statistics reflects the goal of preparing our graduates to obtain entry level positions in commerce, government and industry. Additionally, the later hiring of statistical interns in permanent positions and multiple hiring by well established corporations or other institutions indicates the success of these graduates in meeting the general goals of our program.

5. A record of graduate degrees obtained by B.S. graduates (and the institution from which these degrees have been received) has been kept since 1974.

Most of our graduates who continue to graduate school do so in statistics. A record of the success of our graduates in obtaining an advanced degree (usually the M.S. in statistics) indicates the success of the program in providing an adequate background for academically talented students to pursue graduate study in statistics.

6. During the period July–October 1992 a survey of employers of two or more of our B.S. graduates in statistics since 1980 was conducted to obtain information about the opinions of supervisors of our graduates about their educational background. Respondents were encouraged to give their views on the strongest and weakest abilities of our graduates and make recommendations for improvement of our B.S. program.

This survey was valuable in providing the viewpoint of the employers of our graduates on all aspects of the B.S. program. Questions were asked relating to all the goals and knowledge/skills mentioned in Section 1.

7. A questionnaire was sent in the academic year 1992–93 to 111 B.S. graduates of the Department of Statistics receiving their undergraduate degrees in the period 1981–1991. Of these surveys 55 (50 percent) were returned. The survey was developed by Professor W. Robert Stephenson with assistance from the Survey Section of the Statistical Laboratory at ISU. The survey was aimed at determining the current employment and graduate educational experience of our B.S. graduates. Questions were also included to determine these graduates' evaluation of their educational experience at ISU. Additional questions were included to determine how well graduates of our program thought individual courses offered in the undergraduate statistics program prepared them for further education and employment. Open-ended questions about strong and weak points of the undergraduate program and an opportunity to make suggestions for improving the undergraduate statistics program were also included. This survey was also valuable in providing responses from our B.S. graduates concerning all aspects of the B.S. program in statistics at ISU.

Findings

1. Distribution of B.S. level grade point averages by categories.

For the 62 graduates of the program over the period from fall 1990 to spring 1997 the average GPAs were calculated for the categories defined in Section 2 and are presented below.

Category                                                 Mean   Standard Deviation

(a) Mathematical and theoretical courses                 2.84   0.69
(b) Statistical methods courses                          3.30   0.56
(c) Computer science and statistical computing courses   3.03   0.77
(d) Statistics courses                                   3.25   0.56
(e) All courses                                          3.06   0.56

The information concerning grades reinforced our general perception that statistics majors have most difficulty with theory courses and least with statistical methods courses. No trends over this relatively short period of time in these GPAs have been observed.

2. Actuarial Examinations 100 (General Mathematics) and 110 (Probability and Statistics).

During the 1991–97 period three students took the 100 Examination, all during the current (1996–97) academic year. Two passed and one did not.

3. Summer Internships.

As an example, in the summer of 1996 four students held such positions. Companies employing these students were John Deere Health Care in Davenport, IA, the Mayo Clinic in Rochester, MN, the Motorola Company in Mount Pleasant, IA, and the Survey Sampling Section of the Statistical Laboratory at ISU. The first of these students was hired permanently by his internship company upon his graduation in spring 1997. The second was rehired by the Mayo Clinic this summer before he begins graduate school in statistics in fall 1997 at ISU. The third has an internship with the 3M Corporation in Minneapolis, MN this summer before she begins graduate school in statistics at the University of Minnesota in the fall. The last is working for the Statistical Laboratory again, prior to his senior year at ISU. It is generally true that it is possible to place only our better students in these rewarding positions.

4. First Positions or Activities.
Of the 62 graduates from fall 1990 to spring 1997, 15 have gone to graduate school, 27 to positions in commerce (in banking, health services, insurance, research and similar organizations), 2 to positions in manufacturing, and 7 to governmental positions; 11 are in other situations.

5. Graduate School in Statistics.
Eight B.S. graduates since 1990 have received M.S. degrees in statistics, four are currently in M.S. degree programs in statistics, and one (in a Ph.D. program in statistics) has completed the M.S. degree requirements.

6. Survey of Employers of B.S. Graduates.
The purpose of this survey was to obtain an evaluation of the strengths and weaknesses of graduates of our program as seen by their employers, and to elicit suggestions for improving our program. The average overall response (on a scale of Poor = 1 to Excellent = 5) to the questions concerning the overall quality of the education of our B.S. graduates was 4.21 out of 5.00, a result received with some gratitude by the current authors. The strongest abilities of our graduates noted by their supervisors were in general statistical background and in an area best described as "having a good work ethic."

Employers expressed concern about the following areas of the background of our graduates:
a) Knowledge of "real world" applications of statistics.
b) Ability to communicate statistical ideas well, orally and in writing.
c) Substantive knowledge in a specific field of application.

7. Survey of Graduates of the B.S. Program in Statistics at ISU.

Of the 55 individuals returning a survey, 30 (54.5 percent) had continued to graduate school, with 26 of the 30 (86.7 percent) either having completed a graduate degree or continuing in graduate school. This probably indicates that this group was more likely to respond to the survey, as other information indicates that 35–40 percent of our B.S. graduates continue to graduate education. Almost all of the respondents have been employed since receiving the B.S. degree, with 87 percent of the respondents having taken a position in statistics or a related field.

B.S. graduates did indicate some areas of concern with our program. They indicated, as did employers, that there is a clear need for strong oral and written communication skills. This was particularly evident in the responses of individuals who went directly to positions in commerce, government or industry. Secondly, B.S. graduates reported a need for improved background in two topics in statistical computing — computer simulation and graphical display of data. Finally, the course on survey sampling was the lowest rated of the core (required) courses in the student survey.

Readers who wish to obtain copies of the survey instruments and/or a more complete summary of the results of the survey of B.S. graduates may contact W. Robert Stephenson via email at [email protected].

Use of Findings

Explicit action has been taken regarding the findings under headings 1, 6 and 7 in the previous section. First, noting that the courses in probability and mathematical statistics are the most difficult core courses for our students, additional efforts have been made to provide regular problem sessions and review lectures prior to examinations in these courses.

The Department has recognized the need for better communications skills and has for many years required a Junior/Senior level writing course (Business Communications or Technical Writing) in addition to the one-year Freshman English sequence. We have also required a speech course, typically Fundamentals of Speech Communication. Both of these courses were ranked highly by students in response to the question: "How valuable has this course been in relation to your current employment?" As a result of the continued emphasis placed on communications by our graduates and their employers, we now incorporate writing projects in the statistical methods sequence required of all majors. These cover a wide range of topics and styles. At the beginning of the sequence students are asked to write short summaries of statistical results they obtain on weekly assignments. Later, they are asked to write a short newspaper-like article describing a scientific study that explains statistical analyses and concepts. The students also participate in group data collection and analysis projects that result in five- to ten-page technical reports. The Department now has a requirement that a sample of the technical writing of each graduate be evaluated and placed in the student's departmental file prior to graduation.

Concerning computing, the Department received a National Science Foundation (NSF) supported Instructional Scientific Equipment Grant (1992–94) which permitted the acquisition of 22 Digital Equipment Corporation (DEC 5000/25) workstations in a laboratory in Snedecor Hall, where the Department of Statistics is located. This has enabled substantial improvement in the teaching of computer graphics and simulation in many of the required courses in the B.S. program. For example, in the multivariate analysis course, students can now look at several dimensions simultaneously. It is now possible to rotate the axes of these plots to visually explore interesting patterns in multidimensional data. In the statistical computing courses, students are not only able to simulate sampling from populations, but they are able to visualize sampling distributions and statistical concepts such as the central limit theorem and confidence intervals.

Recognizing that computing is becoming extremely important for practicing statisticians, the Department has received funding from NSF (1997–99) to upgrade the equipment available to our undergraduates. All of the workstations described above will be replaced by high-performance DEC Alpha workstations. This equipment will also allow faculty members to develop instructional modules that go beyond traditional statistical methods. The modules will be developed around real-world problems and will implement new statistical methods (often computationally intensive) to solve those problems. It is our intent to expose our students to these new methods so that they will be better prepared as statisticians.

Responding to the suggestions of our B.S. graduates, the survey sampling course has been redesigned by two new faculty members. They have introduced hands-on experience in designing surveys and analyzing their results utilizing a simulation package on the workstations. The recent (S95 and S96) student evaluations have shown substantial improvement in student reception of the redesigned survey sampling course.

The documentation of summer internships achieved, first positions obtained, and graduate degrees received by B.S. graduates under headings 3, 4, and 5 in the previous section gives a method of monitoring our success in achieving the goals stated in Section 1: preparing graduates for entry-level positions in commerce, government and industry, and preparing them for graduate school.

Success Factors

To implement a program of outcomes assessment of this type requires substantial effort on the part of all faculty associated with the B.S. program. It is important that a program of student assessment have support at a high administrative level (i.e., the presidential level) and the support of the Faculty Senate (or equivalent group). There must also be departmental support to provide adequate financial and secretarial aid to maintain records and carry out surveys. In our opinion this will take time equivalent to at least one course per year for a faculty member. Such an individual should be assigned to maintain these records and to carry out and analyze appropriate surveys, and we feel that this duty should be assigned over a period of years (about four). Such a program needs administrative support and an understanding that improvements in undergraduate education require effort, resources and time. We believe that improvements in our B.S. program have justified these expenditures.

Assessment of the Mathematics Major at Xavier: the First Steps

Janice B. Walker
Xavier University

In an evolving assessment program at a private, medium-sized, comprehensive university in the Midwest, a variety of assessment techniques are being developed to assess student learning. This article describes how two of them — exit interviews and the Educational Testing Service's Major Field Test in Mathematics — are part of the fabric of the mathematics program's assessment cycle.

Background and Purpose

Xavier University is a Catholic, Jesuit institution having approximately 2,800 full-time undergraduates and about 1,100 part-time undergraduates. The Department of Mathematics and Computer Science is housed in the College of Arts and Sciences and has fourteen full-time members and six part-time members. All but three teach mathematics exclusively. There are about 45 mathematics majors, with an average of ten graduates each year. The mathematics major at Xavier must complete 42 semester hours (thirteen courses) of mathematics, which include three semesters of calculus, differential equations, two semesters of linear algebra, abstract algebra, real analysis, and four upper-division electives.

In the fall of 1994, each department received a request from the office of the academic vice president to devise plans to assess programs for its majors by March of 1995. Each plan was to cover the following topics:

• goals and objectives
• the relationship of department goals to university goals
• strategies to accomplish the goals for student achievement
• assessment techniques
• feedback mechanisms (for students, faculty, and the university)

Although we have a bachelor's program for computer science in the Department, this paper will only address assessing the program for mathematics majors at Xavier University.

According to the mission statement of the University, the primary mission of the University is to educate, and the essential activity at the University is the interaction of students and faculty in an educational experience characterized by critical thinking and articulate expression, with special attention given to ethical issues and values. Moreover, Jesuit education is committed to providing students with a supportive learning environment, addressing personal needs, and developing career goals along with the academic curriculum.

Method

Our assessment plan records students' growth and maturity in mathematics from the time they enter the program until they graduate. The plan was submitted in the spring of 1995 and put into place during the school year 1995–96. We chose the following methods for assessing the major:

• Portfolio
• Course Grades
• Course Evaluations
• MFT (Major Field Test)
• Senior Presentation
• Exit Questionnaire and Exit Interview
• Alumni Survey and Alumni Questionnaire



During the first few weeks of the fall semester, freshman mathematics majors receive letters stating the expectations for majors for the coming years. In particular, they are informed that they must maintain a portfolio, make an oral presentation during their senior year, and meet our standards for satisfactory performance on the MFT (Major Field Test) in Mathematics.

Portfolio: Portfolios are maintained by faculty advisors. Each semester the student must submit final examinations and samples of their best work from each mathematics class. A copy of any research or expository paper in an upper-level class should also be included. Hopefully, these works will provide evidence of how well the students are learning to synthesize and integrate knowledge from different courses in the program. Moreover, the samples should indicate that a student can use various approaches to solve problems. Samples of work will be collected for the portfolio by advisors during formal advising periods.

A number of additional documents are standard. These include an entry profile, an entry questionnaire, counseling forms, and advisor notes. The entry profile contains information on a student prior to enrolling at the University, such as the mathematics courses taken in high school, grade point average and rank in graduating class, SAT and/or ACT scores, and advanced placement (AP) credits. It also contains the placement scores from the examinations in mathematics and foreign language which are administered by the University. The entry questionnaire, a short questionnaire given in the fall term, reveals the student's perception of mathematics and his or her expectations of the program and faculty. (See [1] in this volume for further information on student portfolios.)

MFT: The Department has given some form of comprehensive exam for many years. The Advanced Test in Mathematics of the GRE served as the last measure until the 1994–1995 academic year. It was replaced by the Major Field Test (MFT) in Mathematics, which is distributed and scored by the Educational Testing Service. The profiles of the students taking the MFT are a far better match with the profiles of our students than with those taking the GRE. Moreover, not only are individual and mean scores reported, but also subscores which help point out the strengths and weaknesses of the examinees in broad areas within mathematics.

The Department has set as successful performance a score at the 60% mark or higher. However, this mark is not rigidly enforced. The Department undertakes an extensive study of the individual case whenever this goal is not met. Then a recommendation is made whether to pass the student or ask the student to retake the exam. A student may request a departmental comprehensive examination after two unsuccessful performances on the MFT. If a student's score on this exam is also unsatisfactory, the student may be advised to seek a bachelor's degree in liberal arts. A degree in liberal arts would necessitate no additional requirements to be fulfilled for graduation.

We expect very few students, if any, to find themselves in the position of having failed the MFT twice. Thus far, no one has been unsuccessful on the second attempt at making a passing score. During the fall semester the Department provides review sessions for seniors preparing for the MFT. The whole experience of the review sessions and exam-taking serves as a mechanism for students to rethink and synthesize mathematical concepts as well as to perfect skills that were covered in their mathematics courses.

Senior Presentation: Prior to our assessment plan, students rarely participated in departmental colloquia, but they will now assume an integral and vital role. Beginning in the 1998–99 academic year each senior will be required to make an oral presentation. The topic for the senior presentation will be selected by the student with the approval of the advisor for senior presentations. The advisor will provide each senior with a list of steps that are helpful in preparing a talk. After a student chooses a topic, she must submit a written draft for a thirty-minute lecture. The advisor will provide feedback on the draft and set up a practice session for the student.

Questionnaires: Questionnaires and surveys are critical elements of the assessment process. Exit questionnaires are sent to graduating seniors about three weeks prior to the end of the spring semester, and exit interviews are scheduled. An alumni survey will be administered every five years. This will help us track the placement of students in various jobs and professional programs. Between the years scheduled for alumni surveys, alumni questionnaires are mailed to graduates one year after graduation. These forms are much shorter and do not cover topics in the breadth and depth of the alumni survey.

Findings

We have completed two years with the new assessment plan. The first steps proceeded fairly well. The freshman advisors created portfolios for the freshmen, the chair passed out entry profiles, and advisors collected samples of work for the portfolios throughout the year. There were very few freshmen in the second year, and thus the job of creating portfolios was easier, but that of recognizing a class profile was more difficult.

Although our experience with portfolios is very limited, portfolios have been useful. Not only is there an evolving picture of mathematical growth and progress, but also evidence of the content, level of difficulty, and teaching philosophy in the mathematics courses for majors. They provide information on the manner in which a course is taught which may not be obtained from course evaluations. Thus, a more thorough understanding of the overall curriculum is likely as more students and courses are tracked.


In 1996, seniors took the MFT exam and each met the departmental goal. In 1997 two of ten seniors did not meet the goal. They subsequently retook the exam and passed.

Although seniors are not yet required to make formal presentations, there have been presentations by the juniors and seniors in the departmental colloquium series for the last two years. Several students who participated in the 1996–97 Mathematics Modeling Competition gave talks which were very well attended by mathematics and computer science majors. Freshmen have been strongly encouraged and sometimes required to attend the colloquia.

Almost all majors in the classes of 1996 and 1997 returned their exit questionnaires and had an exit interview. The data from both 1996 and 1997 were similar and consistent with the information gathered in our alumni survey, which was carried out in the fall of 1995. Although these students and alumni have overall favorable impressions of the program and faculty, they raised issues that must be addressed. In particular, some felt that there should be more electives and fewer required courses, while others suggested that specific courses be added to or removed from the curriculum. A few thought that there should be a capstone course. Comments about the discrete mathematics course were generally negative, and the merits of particular electives repeatedly surfaced.

There was substantial feedback on instructors, the use of technology in courses, and the availability of resources (particularly computers) in the exit questionnaires. In particular, it was remarkably clear whom students classified as the very good and the not-so-good instructors, and the reasons for their choices. Experiences with MAPLE, the computer algebra system employed in the calculus and linear algebra sequences during the first two years, drew mixed reviews. Students in Calculus III liked using MAPLE; a substantial number in Calculus I and Linear Algebra did not. Moreover, with the increasing use of computer labs on campus, accessibility to computer equipment is an issue that must be continually addressed.

The questionnaires and alumni survey also indicated a need for more counseling about career options. Prior to these results, most advisors believed that students were aware of the career opportunities available to them. This has proved to be false. Some students professed little knowledge about career opportunities, and some had little idea of what they would do after graduation.

Use of Findings

The Department is responding to the issues that have surfaced from assessing the mathematics major. There are three main areas of focus. The one that has prompted the most scrutiny is the curriculum for the mathematics major itself. Some alumni who are currently teaching secondary mathematics remarked on the value of the elective Survey of Geometry that had been offered, but was dropped several years ago when the list of electives was streamlined. Some alumni and seniors suggested that statistics should be a requirement for all mathematics majors. In addition, many students questioned the intent of discrete mathematics and its contribution to their overall development. As a result of these findings, the Department will first thoroughly review the content in the discrete mathematics and the computer science courses in the freshman year. Then we will examine the cycle of electives. We are open to revising any aspect of the curriculum.

The use of MAPLE is the second area of concern. When there was a very heavy emphasis on MAPLE in a course, there was a considerable number of negative comments. In such cases, seniors clearly expressed the need for more balance in the use of technology. Many felt that the intense use of MAPLE left too little time for a deep understanding of basic concepts. This message was communicated to all departmental members, especially those named in the exit interviews and questionnaires. Moreover, the expansion of the use of MAPLE outside the calculus sequence has created a few problems of accessibility of computers.

The area of counseling is the third area of concern. To help students make better decisions about courses and career options, we are giving students more information early. In particular, during preregistration in the summer, the "fact sheet" on the mathematics program is given to all incoming majors. This brochure is put together jointly by the Department and the Office of Admissions (and revised annually). It contains the goals of the program, a description of the mathematics major, a recommended sequence of courses for the mathematics major, and information about careers in mathematics. In the fall semester, a packet containing additional information about careers from other sources will be given to freshmen majors.

It is quite apparent from the exit questionnaires and interviews which faculty members students perceive to be the best professors in the Department, and the reasons for the acclaim. It is just as apparent which courses are the most difficult. Such feedback provides the chair with valuable information when composing a schedule of classes. The chair will speak to individual faculty members whenever there is feedback which merits special attention.

The exit questionnaires are available to all members of the Department. Information in summary form is also made available to the Department and discussed in a department meeting. Thus, the Department is kept apprised of students' reactions.

Success Factors

Trends, problems, and successes have been fairly evident using data acquired from groups of students with similar experiences at approximately the same time. Unfortunately, we have no means of requiring students to participate in the interviews and questionnaires, which are key elements of assessing the program for majors. Of course, students must take the MFT in Mathematics and make a senior presentation; these requirements are now printed in the current University catalog. We hope to impress upon students the importance of their participation in the entire assessment process.

Assessing the program for majors is an ongoing project that will demand a commitment of time and energy from many department members. It is not difficult to foresee how the enthusiasm of some may wane and how those upon whom most of the work falls may become disgruntled. Because there are many aspects of the plan to monitor, it would be easy to let one or more measures slide from time to time. How the next chair will respond to the tasks is also an unknown. Hopefully, the Department will continue to assess the mathematics major beyond the "first steps" with the interest and energy that currently exist.

References
[1] Sons, L. "Portfolio Assessment of the Major," in this volume, p. 24.

Assessing Essential Academic Skills from the Perspective of the Mathematics Major

Mark Michael
King's College

At a private, church-related liberal arts college in the East, a crucial point for assessing student learning occurs midway through a student's four-year program. A Sophomore-Junior Diagnostic Project, part of the Discrete Mathematics course taken by all majors, is the vehicle for the assessment. Each student in the course must complete a substantial expository paper, spanning the entire semester, on a subject related to the course.

Background and Purpose

King's College, a church-related, liberal arts college with about 1,800 full-time undergraduate students, has had an active assessment program for more than a decade. The program is founded on two key principles: (1) Assessment should be embedded in courses. (2) Assessment should measure and foster academic growth throughout a student's four years in college. The entire program is described in [1] and [2], while [4] details one component as it relates to a liberal arts mathematics course. The remaining components are aimed at meeting goal (2).

Of these, the key is the Four-Year Competency Growth Plans, blueprints designed by each department to direct the total academic maturation of its majors. These plans outline expectations of students, criteria for judging student competence, and assessment strategies with regard to several basic academic skills, e.g., effective writing. Expectations of students increase from the freshman year to the senior year. More importantly, the continued growth and assessment of competencies are the responsibilities of a student's major department. In the freshman year, all departments have similar expectations of their majors. In subsequent years, competencies are nurtured and evaluated in discipline-specific ways; mathematics majors are expected to know the conventions for using mathematical symbols in prose, while computer science majors must be able to give a presentation using a computer.

In addition to continual assessment charted by the growth plans, there are two comprehensive, one-time assessment events which take a wide-angle view of a student's development from the perspective of the student's major: the Sophomore-Junior Diagnostic Project and the Senior Integrated Assessment. The former is the subject of this report. It is a graded component of a required major course taken in the sophomore or junior year. Coming roughly midway in a student's college career, the Sophomore-Junior Diagnostic Project is a "checkpoint" in that it identifies anyone whose command of fundamental academic skills indicates a need for remediation. However, the Project is not a "filter"; it is, in fact, a "pump" which impels all students toward greater facility in gathering and communicating information. The variety of forms the Project takes in various departments can be glimpsed in [5].

Method

For Mathematics majors and Computer Science majors, the Project's format does not appear particularly radical. It resides in the Discrete Mathematics course since that course is required in both programs, is primarily taken by sophomores, and is a plentiful source of interesting, accessible topics. The Project revolves around a substantial expository paper on an assigned subject related to the course. It counts as one-fourth of the course grade. But how the Project is evaluated is a radical departure from the traditional term papers our faculty have long assigned in Geometry and Differential Equations courses.


Previously, content was the overriding concern. Now the emphasis is on giving the student detailed feedback on all facets of his/her development as an educated person.

Another departure from past departmental practice is that the Project spans the entire semester. It begins with a memo from the instructor to the student indicating the topic chosen for him/her. After the student has begun researching the topic, he/she describes his/her progress in a memo to the instructor. (This sometimes provides advance warning that a student has difficulty writing and needs to be referred to the King's Writing Center for help.) The memo is accompanied by a proposed outline of the report and a bibliography which is annotated to indicate the role each reference will play in the report. This assures that the student has gathered resources and examined them sufficiently to begin assimilating information. It also serves to check that the student has not bitten off too much or too little for a report that should be about 10 double-spaced pages.

Requiring that work be revised and resubmitted is common in composition courses, less so in mathematics courses. It is crucial in this exercise. The "initial formal draft" is supposed to meet the same specifications as the final draft, and students are expected to aim for a "finished" product. Nonetheless, a few students have misunderstandings about their subject matter, and all have much to learn about communicating the subject. To supplement the instructor's annotations on the first draft, each student is given detailed, personalized guidance in a conference at which the draft is discussed. This conference is the most powerful learning experience to which I have ever been a party; it is where "assessment as learning" truly comes alive. Recognizing that all students need a second chance, I count the final draft twice as much toward the project grade as the first draft.

Also contributing to the project grade is a formal oral presentation using an overhead projector. As with the written report, students make a first attempt, receive feedback to help them grow, and then give their final presentation. Unlike the first draft of the report, the practice presentation has no impact on a student's grade. In fact, each student chooses whether or not the instructor will be present at the trial run. Most students prefer to speak initially in front of only their classmates, who then make suggestions on how the presentation could be improved.

While we mathematics faculty are increasingly recognizing the importance of fostering and evaluating our students' nonmathematical skills, there is still the problem of our having to make professional judgments about writing and speaking skills without ever having been trained to do so. The solution in our department was to create "crutches": an evaluation form for the written report and another for the oral presentation. Each form has sections on content, organization, and mechanics. Each section has a list of questions, intended not as a true-false checklist but as a way to help the reader or listener focus on certain aspects and then write comments in the space provided. Some of the questions are designed for this particular project ("Are the overhead projector and any additional audio-visual materials used effectively?") while others are not ("Does the speaker articulate clearly?").

Findings

All students — even the best — find writing the Project report to be more challenging than any writing they have done previously. Student Projects have revealed weaknesses not seen in students' proofs or lab reports, which tend to be of a more restricted, predefined structure. In explaining an entire topic, students have many more options and typically much difficulty in organizing material, both globally and at the paragraph level. They also have trouble saying what they really mean when sophisticated concepts are involved.

Peer evaluations of the oral presentations have proved to be surprisingly valuable in several ways. First, by being involved in the evaluation process, students are more conscious of what is expected of them. Second, their perspectives provide a wealth of second opinions; often students catch flaws in presentation mechanics that the instructor might overlook while busy taking notes on content and technical accuracy. Third, students' informal responses to their peers' practice presentations have improved the overall quality of final presentations — partly by putting students more at ease!

Perhaps the most surprising finding, however, is how widely sets of peer evaluations vary in their homogeneity; some presentations elicit similar responses from the whole audience, while in other cases it is hard to believe that everyone witnessed the same presentation. Peer evaluations, therefore, present several challenges to the instructor. What do you tell a student when there is no consensus among the audience? How does one judge the "beauty" of a presentation when different beholders see it differently? What are the implications for one's own style of presentation?

Use of Findings

Since the institution of the Sophomore-Junior Diagnostic Project, there has been an increased mathematics faculty awareness of how well each student communicates in the major. As the Projects are the most demanding tests of those skills, they have alerted instructors in other classes to scrutinize student written work. Before, instructors had discussed with each other only their students' mathematical performance. Moreover, there has been an increased faculty appreciation for writing and the special nature of mathematical writing. While national movements have elevated the prominence of writing as a tool for learning mathematics, this activity promotes good mathematical writing, something my generation learned by osmosis, if at all.

Furthermore, there is an increased realization that a student's ability to obtain information from a variety of sources — some of which did not exist when I was a student — is an essential skill in a world where the curve of human knowledge is concave up. Previously, some, believing that isolated contemplation of a set of axioms was the only route to mathematical maturity, had said, "Our students shouldn't be going to the library!"

Collectively, student performances on Projects have motivated changes in the administration of the Project. One such change was the addition of the trial run for oral presentations. A more significant change was suggested by students: replace the generic evaluation form used in the general education speech class (a form I tried to use the first time through) with a form specifically designed for these presentations.

While students were always provided detailed guidelines for the projects, I have continually endeavored to refine those handouts in response to student performances. Other forms of aid have also evolved with time. Exemplary models of exposition relevant to the course can be found in MAA journals and publications. For abundant good advice on mathematical writing and numerous miniature case studies, [3] is an outstanding reference. For advice on the use of an overhead projector, the MAA's pamphlet for presenters is useful. In addition, I now make it a point to use an overhead projector in several lectures expressly to demonstrate "How (not) to."

Success Factors

The primary key to the success of the Sophomore-Junior Diagnostic Project is in giving feedback to students rather than a mere grade. When penalties are attached to various errors, two students may earn very similar scores for very different reasons; a traditional grade or score might not distinguish between them, and it will help neither of them.

Another contributor to the Project's success as a learning exercise is the guidance students are given throughout the semester. Students need to know what is expected of them, how to meet our expectations, and how they will be evaluated. The instructor needs to provide specifications, suggestions, examples, and demonstrations, as well as the evaluation forms to be used.

The third factor in the success of the Projects is the follow-up. Best efforts — both ours and theirs — notwithstanding, some students will not be able to remedy their weaknesses within the semester of the Project. In light of this (and contrary to its standard practice), King's allows a course grade of "Incomplete" to be given to a passing student whose Project performance reveals an inadequate mastery of writing skills; the instructor converts the grade to a letter grade the following semester when satisfied that the student, having worked with the College's Writing Center, has remedied his/her deficiencies. This use of the "Incomplete" grade provides the leverage needed to ensure that remediation beyond the duration of the course actually occurs.

The many benefits the Sophomore-Junior Diagnostic Projects bring to students, faculty, and the mathematics program come at a cost, however. The course in which the Project is administered is distinguished from other courses in the amount of time and energy both instructor and student must consume to bring each Project to a successful conclusion. This fact is acknowledged by the College: the course earns four credit-hours while meeting only three hours per week for lectures. (Extra meetings are scheduled for oral presentations.)

For the instructor, this translates into a modicum of overload pay which is not commensurate with the duties in excess of teaching an ordinary Discrete Mathematics course. The primary reward for the extra effort comes from being part of a uniquely intensive, important learning experience.

References
[1] Farmer, D.W. Enhancing Student Learning: Emphasizing Essential Competencies in Academic Programs, King's College Press, Wilkes-Barre, PA, 1988.
[2] Farmer, D.W. "Course-Embedded Assessment: A Teaching Strategy to Improve Student Learning," Assessment Update, 5 (1), 1993, pp. 8, 10–11.
[3] Gillman, L. Writing Mathematics Well, Mathematical Association of America, Washington, DC, 1987.
[4] Michael, M. "Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning," in this volume, p. 195.
[5] O'Brien, J.P., Bressler, S.L., Ennis, J.F., and Michael, M. "The Sophomore-Junior Diagnostic Project," in Banta, T.W., et al., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass, San Francisco, 1996, pp. 89–99.

Department Goals and Assessment

Elias Toubassi
University of Arizona

At a large research university in the West, a major effort has been underway over the past 10 to 15 years to reform the entry-level mathematics courses which the department offers. Assessment has been at the heart of this process. Focusing on all first-year courses through assessment has had a positive effect on the undergraduate mathematics major and the courses in that major. This article describes the process and the ongoing assessment of student learning.

Background and Purpose

During the 1970s the University of Arizona, like many other colleges and universities, experienced a large growth in enrollment in beginning mathematics courses. This growth was coupled with a loss of faculty positions over the same period. The resulting strain on resources forced the department to make some difficult decisions in order to meet its teaching obligations. Notable among these were the creation of a self-study algebra program and the teaching of finite mathematics and business calculus in lectures of size 300 to 600. The situation reached its lowest point in the early 1980s, when the annual enrollment in the self-study algebra program exceeded 5000 students and the attrition rates (failures and withdrawals) in some beginning courses reached 50%. Before we describe the turnaround that began in 1985 we give some data about the department.

Currently the Mathematics Department has 59 regular faculty, 8 to 12 visiting faculty from other universities, 20 to 25 adjunct faculty, 6 to 8 visiting faculty from high schools and community colleges, 2 research post-doctorate faculty, and 2 teaching post-doctorate faculty. Each semester the Department offers between 250 and 300 courses serving about 10,000 undergraduate and graduate students. (The enrollment of the University is around 35,000.) Sixty to seventy percent of mathematics students take freshman-level courses in classes of size 35. There are approximately 350 majors in mathematics, mathematics education and engineering mathematics. In the past five years the Department has secured external funding at an average of $2.5 million per year.

The overarching goal of the Mathematics Department at the University of Arizona is to provide intellectual leadership in the mathematical sciences. This goal spans the areas of research, undergraduate and graduate teaching, outreach, and collaboration with other University units. The Department's goals include:

1. To provide flexible yet solid undergraduate and graduate programs which challenge the intellect of students.
2. To prepare an ethnically diverse spectrum of entering students.
3. To embrace the notion that change such as is manifested in computer technology and educational reform can enhance learning and enrich the intellectual environment.
4. To be a resource in the mathematical sciences for other disciplines whose own activities have an ever-increasing need for the power of mathematics.
5. To work closely with colleagues from the local schools and community colleges who share with us the responsibility of ensuring the flow of a mathematically literate generation of students.

Method

The assessment of the freshman mathematics program began in 1983 with a Provost-appointed University committee consisting mostly of faculty from client departments with three representatives from mathematics.


This was followed by an intensive departmental review in 1984. The review included an examination of the department by an Internal Review Committee and an External Review Committee of distinguished mathematicians from other universities. The committees reviewed reports on enrollment and pass/failure rates in lower-division courses; interviewed faculty members individually or in small groups; and met with graduate teaching assistants, mathematics majors, representatives of client departments, and the University Administration.

In the 1992–93 academic year the Department went through its second review to check on its progress toward meeting its goals. This included departmental committees on Faculty and Research, Undergraduate Education, Graduate Program, Academic Outreach, Collaboration with other University Units, Space, the Computer Environment, Library Facilities, and the Departmental Administration. In addition, there was a University Review Committee consisting of faculty from hydrology, chemistry, philosophy and biology, and an External Review Committee with mathematics faculty from MIT, Rutgers, Wisconsin, Nebraska, and Texas.

Other reviews of the program are conducted regularly. One in particular that is worthy of mention is done by the Mathematics Center in the form of exit surveys of mathematics majors. The surveys cover three main areas. The background section covers such topics as why the student became a mathematics major and whether the student worked while in college. The mathematics experience section covers such areas as the courses and instructors the student appreciated and whether the overall educational experience met the student's expectations. The future plans section inquires into the student's immediate plans such as graduate school, employment or other options. Exit interview data are reviewed by the Associate Head for Undergraduate Programs and the Department Head. Comments and suggestions by students are considered in making changes in the undergraduate program and in continuing to find ways to support our undergraduates.

Findings

These reports documented several important findings. The following are three quotations from the 1983 and 1984 reports indicating the dismal nature of things at the time. Provost Committee: "Although other factors need to be considered, it is difficult to avoid the conclusion that many of the problems in freshman math can be traced to inadequate allocation of resources by the University Administration." Internal Review Committee: "The most important course in the lower division is calculus and this course is not a success ... the success rate is less than 50% ..." External Review Committee: "One of the best ways to help the research effort ... would be to tame this monstrous beast (precalculus teaching) ... it seems essential for everyone's morale that the pass/fail ratio be increased, section size reduced, ..."

The situation was different in 1992. The findings of the review committees showed that substantial improvements had been accomplished since 1984. The following are some quotations from these reports. University Review Committee: "The Department of Mathematics has made tremendous strides since its last review in 1984. Its faculty has grown and established a firm scholarly reputation in key areas of pure and applied mathematics. Undergraduate education has improved greatly due to a reduction in entry level class sizes, the introduction of desktop computers in the classroom, and the vibrant activities of the mathematics education group that has evolved into a national leader ..." External Review Committee: "Since the last external review (1984), the Department has addressed the problems of entry-level instruction responsibly, effectively, and efficiently ..."

Use of Findings

These reports led to the appointment of two significant committees: an Ad Hoc Intercollegiate Committee on Calculus and a Departmental Entry Level Committee with the charge to draft a plan to address the problems in beginning mathematics courses. The result was a five-year implementation plan together with an estimate of the resources needed in each year. The plan was based on five fundamental premises.

1. Students must be placed in courses commensurate with their abilities and mathematical background.
2. Students must be provided with a supportive learning environment and a caring instructor committed to undergraduate education.
3. Entry level courses must be structured to meet current student needs.
4. The future success of the program relies on an effective outreach program to local schools.
5. Students need to be exposed to technology to enhance the learning experience.

The following are the key recommendations of the Entry Level Committee's action plan. First, reduce class size. Second, institute a mandatory math placement and advising program. Third, institute a new precalculus course. Fourth, introduce a calculus I course with five credit hours. Fifth, replace the self-study intermediate and college algebra courses with a two-semester sequence in college algebra to be taught primarily in small classes. (Note: This sequence is currently being upgraded to a one-semester, four-credit algebra course.)

The findings of the various review committees, coupled with the action plan of the Entry Level Committee, began a process of change in 1985 which resulted in a dramatic turnaround in the educational environment. Over the past twelve years the University has provided mathematics with a dozen new faculty positions and a supplementary budget to improve undergraduate education. (This is currently at $900,000 per year.) These funds, augmented by grant monies from NSF, have allowed the Department to dramatically improve the quality of the undergraduate mathematics experience, particularly for beginning students. This was accomplished under the leadership of the Head, a dozen or so faculty, and a dedicated cadre of adjunct lecturers. In sharp contrast to the situation pre-1985, currently all freshman-level courses are taught in classes of size 35. Student performance has improved greatly, and the passing rate in these courses has increased 30 to 40 percent. We have excellent relations with other departments in the University, with the high schools, and with the local community college.

The number of mathematics majors has increased steadily over the past eight years. In the 1988–89 school year the University awarded 12 BS or BA degrees, while in 1996–97 we expect to award 52 degrees. In this period the number of degrees in mathematics education has been steady at around 12, while the number of engineering mathematics degrees has declined from 12 to 4 due to the establishment of undergraduate degree programs in computer science and computer engineering.

The increase in the number of mathematics majors is due to several factors, but in particular to the increased time and effort put in to improve their lot. We have created a new Mathematics Center — an office with library materials, a full-time advisor, lockers, and computer facilities. The Center provides special services to mathematics majors, including access to an academic advisor during working hours, upper-division tutoring, an undergraduate mathematics colloquium, and information about graduate schools, careers, and summer programs. The Center is under the direction of a new position of Associate Head for Undergraduate Programs.

In the last six years the Department has been involved in education reform. It is becoming a recognized national leader in calculus and differential equations reform and the use of technology in the classroom. Its leadership role is evident from the amount of external funding it has received for educational endeavors, the number of visiting scholars who come to Arizona to learn about our classroom environment, and the participation of faculty members in national meetings and service on national committees.

Success Factors

The Mathematics Department has tackled the challenge of improving the educational experience for students in what we feel is a unique manner. The additional resources allowed the Department to augment its faculty with adjunct lecturers who are dedicated to teaching. This meant several things. First, the number of students who had to take the self-study algebra program decreased in stages until its discontinuation in Fall 1997. Second, our faculty noticed an improvement in the teaching environment. Because of the mandatory mathematics placement test, their classes contained students with better mathematical preparation. Furthermore, the addition of adjunct faculty gave the regular faculty opportunities to teach the classes that they preferred, generally in smaller sections. The result is an improvement in faculty morale in the Department. A number of faculty have been involved in curriculum projects to improve undergraduate education. Notable among these are the Consortium Calculus Project, the development of material for two differential equations courses, and a geometry and algebra course for prospective teachers.

We close with a list of some of the ingredients that go into a successful program.

1. Broad faculty involvement.
2. General departmental awareness and support.
3. University encouragement and support for faculty.
4. Regular communication with interested university departments.
5. A mandatory mathematics placement test.
6. Careful teaching assignments.
7. Augmentation of regular faculty with instructors dedicated to teaching.
8. Open dialogue and development of joint programs with local school districts and community colleges.
9. Assessment of the curriculum and definition of the goals for each course.
10. Development of an effective training program for graduate teaching assistants.
11. Reward for faculty who make significant contributions to the undergraduate program.
12. Adequate staff support.

Analyzing the Value of a Transitional Mathematics Course

Judith A. Palagallo* and William A. Blue
The University of Akron

At a large, regional university in the Midwest, a specific course (Fundamentals of Advanced Mathematics) at the sophomore level provides a transition for student majors from the more computationally based aspects of the first-year courses to the more abstract upper-division courses. Surveys have been developed to measure the effects of this course on upper-level courses in abstract algebra and advanced calculus. In addition, these surveys provide information about student learning in the major.

——————
* Research supported in part by OBR Research Challenge Grant, Educational Research and Development Grant #1992-15, and Faculty Research Grant #1263, The University of Akron.

Background and Purpose

The University of Akron is the third largest public university in the state of Ohio. The departmental goals for the mathematics major state that each student should be able to read and write mathematics, to communicate mathematical ideas verbally, and to think critically. In addition, each student should acquire a body of fundamental mathematical ideas and the skill of constructing a rigorous argument; each student should achieve an understanding of general theory and gain experience in applying theory to solving problems. Several years ago the mathematics faculty became concerned about the difficulty experienced by some students in making the transition from the traditional calculus courses, where few written proofs are required, to more advanced work in mathematics. The abrupt transition to rigorous mathematics often affected even those who had done superior work in the calculus courses and was targeted as an important issue in a student's decision to leave the study of mathematics. To address the problem we designed the course Fundamentals of Advanced Mathematics (FOAM) to help bridge the gap in students' understanding and preparation for continued mathematical training.

The prerequisite for FOAM is the second semester of calculus, so students can take the course as early as their sophomore year. FOAM is required for mathematics majors and for those preparing to be secondary mathematics teachers. Other students with diverse majors such as physics, engineering, computer science, and chemistry also enroll in the course (especially those who desire a minor in mathematics). The topics included in FOAM were designed to introduce the students to the goals we have for our majors. Elementary symbolic logic is the first topic of the course and sets the foundation for later work. A study of standard proof techniques, including the Principle of Mathematical Induction, requires the application of symbolic logic to the proof process. The proof techniques are then applied to topics such as sets (including power sets and arbitrary families of sets), relations, equivalence relations, and functions. Discussion of these major topics shows students the interconnections of ideas that explain and give meaning to mathematical procedures. Students can also see how mathematicians search for, recognize, and exploit relationships between seemingly unrelated topics. Throughout the semester students write many short proofs. For nearly all, this is a first experience at writing a paragraph of mathematical argument. The proofs are graded so that each student has plenty of feedback. The students also read proofs written by others and evaluate them for approach, technique and completeness. Students are required to make periodic oral presentations of their own proofs at the board.


Once FOAM had been taught for a couple of years, an assessment was necessary to see if it achieved the broad goal of "bridging the gap." We needed to know if the mathematical topics covered were suitable preparation for more advanced courses. In addition, we wanted the course to alleviate some of the difficulty in making the transition to proof-oriented courses.

Method

We decided to assess the goals for the course by developing a survey in two parts. In one part we would assess the perceived value to the student of the course content in preparation for more advanced work. The second part would evaluate the confidence level of the students in doing mathematics in the later courses. Before constructing a survey instrument, we interviewed students who were currently enrolled in FOAM to hear their reactions to the course. We also reviewed the literature for other attitude surveys, especially those done in mathematics. (See, for example, [1] and [2].) From the student responses and the literature, we selected statements to include in an attitude survey.

The content of FOAM was analyzed as viewed from the attitudes of students who took the course and then continued on to take advanced calculus or abstract algebra. Over the past five years, surveys were distributed to students near the end of the semester in the advanced course. Because the FOAM course is required, we do not have a control group of students who did not take FOAM before these advanced courses. Students were asked to rate each major topic in FOAM with respect to its value in their current courses. The topics and a frequency count of their responses are listed in Table 1. (Not all students responded to every question.) The arbitrary coding — extremely helpful = 3, moderately helpful = 2, slightly helpful = 1, not helpful = 0 — is used, and a mean is calculated.
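As an illustration (a check the reader can reproduce from the counts in Table 1), the mean reported for proof techniques in Advanced Calculus is the coded average of the 55 responses in that row:
\[
\frac{24(3) + 22(2) + 8(1) + 1(0)}{24 + 22 + 8 + 1} = \frac{124}{55} \approx 2.25.
\]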

The second phase of the evaluation process was to determine if this initial experience in conceptual topics increased the confidence of students in advanced mathematics courses. Confidence in learning mathematics was found to be related to student achievement in mathematics and to students’ decisions to continue or not continue taking mathematics courses. (See, for example, the attitude studies in [1,3].) To gauge this, the same students responded to a sequence of statements on an attitude survey. The statements, and the frequencies and means of the responses are shown in Table 2.

Findings

A quick inspection of the frequency counts in Table 1 suggests that students feel the studies of proof techniques, logic, induction and set theory were helpful. They were divided on the study of functions, perhaps because this is a final topic and may not have been covered completely in a given term.

Table 1

Perceived Value of Course Content

Evaluate how helpful the following topics from FOAM have been in Advanced Calculus and/or Abstract Algebra. Use a scale of

EH=extremely helpful   MH=moderately helpful   SH=slightly helpful   NH=not helpful

                        EH   MH   SH   NH   Mean
A. Advanced Calculus
   Proof techniques     24   22    8    1   2.25
   Set Theory           18   18   10    9   1.82
   Functions             9   24   14    8   1.62
   Induction            24   21    6    4   2.18
   Logic                20   21   13    1   2.09

B. Abstract Algebra
   Proof techniques     24   34   10    5   2.05
   Set Theory           20   33   14    7   1.89
   Functions            16   27   20    7   1.74
   Induction            23   25   11   12   1.83
   Logic                22   29   11   10   1.88

Note: EH=3, MH=2, SH=1, NH=0.


From Table 2 we note that the students reported an increase in confidence and in general reasoning skills. Most noticeably, they feel they are better able to read and write mathematical proofs after taking FOAM. They were divided on whether their mathematical interest had increased. It is encouraging to see that most are still interested in continuing a mathematical career!

Use of Findings

We can conclude that we are teaching correct and relevant topics to help prepare the students for later work. Students reported that their confidence and skills in mathematics increased with their experience in FOAM. The boost in confidence should help retain students in the field.

The surveys conducted also indicate that in certain areas the attitudes of mathematics education majors (prospective secondary mathematics teachers) differ from mathematics majors. The graphs in Table 3 report the percentages of strongly agree/agree responses on selected questions by the two groups of students. (The question numbers refer to the questions on the attitude survey in Table 2.) Both groups feel they are better able to read and write mathematical proofs after taking FOAM. However, only about half (49%) of the mathematics education majors reported an increase in confidence for later courses and fewer than a third reported an increase in interest (29%). From conversation with these students we infer that they do not recognize where they will use this type of material in their future classrooms and, at times, resent the requirement of the courses in their degree program. We are trying to address these issues now in FOAM, and also in the advanced courses, by introducing examples of the actual material in the secondary school setting.

This project has led to additional questions and studies. Using multiple linear regression, we built a model to predict a student’s grade in FOAM based on grades in the first two semesters of calculus. The model predicts that a student’s grade will drop about 0.5 (on a 4-point scale) in FOAM. However, for a certain collection of students, grades drop dramatically when going from the calculus sequence to FOAM. These students are the subjects of a separate survey to determine where they attribute the cause of this decline in their performance. Finally, the original study concentrated on students who complete FOAM and continue into later courses. We have collected data on the students who do not complete the course, and thus decide to quit the study of mathematics at this point. Analyzing the perceptions of these students will be important to the extended project.
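
A minimal sketch of this kind of prediction model, in Python with made-up grades (the actual data and fitted coefficients are not given above):

    import numpy as np

    # Each row: (Calculus I grade, Calculus II grade) on a 4-point scale;
    # y holds the corresponding FOAM grades.  These numbers are invented.
    X = np.array([[3.7, 3.3], [2.7, 2.3], [4.0, 3.7], [2.0, 2.3], [3.3, 3.0]])
    y = np.array([3.0, 1.7, 3.7, 1.3, 2.7])

    # Fit FOAM = b0 + b1*CalcI + b2*CalcII by least squares.
    A = np.column_stack([np.ones(len(X)), X])
    (b0, b1, b2), *_ = np.linalg.lstsq(A, y, rcond=None)

    # Predicted FOAM grade for a student with a 3.3/3.0 calculus record.
    print(b0 + b1 * 3.3 + b2 * 3.0)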

Table 2

Attitude Survey Results

Respond to these statements using the given code

SA=strongly agree   A=agree   N=neutral/no opinion   D=disagree   SD=strongly disagree

                                                 SA    A    N    D   SD   Mean
1. The experience of taking FOAM increased
   my confidence to take later mathematics
   courses.                                      21   37   20   15    8   2.48
2. The experience of taking FOAM increased
   my interest in mathematics.                   13   31   27   24    6   2.21
3. I understood how to write mathematical
   proofs before taking FOAM.                     0   16   13   50   22   1.23
4. After taking FOAM I am better at
   reading proofs.                               37   46    8    9    1   3.08
5. After taking FOAM I can write better
   proofs.                                       36   43   12    9    1   3.03
6. I am still confused about writing
   proofs.                                        4   27   20   43    7   1.60
7. I like proving mathematical statements.        6   35   24   27    9   2.02
8. Taking FOAM has increased my general
   reasoning and logical skills.                 21   52   12   13    2   2.77
9. If I had known about the abstract nature
   of mathematics, I would have chosen a
   different major.                               2    8   21   46   24   1.19

Note: SA=4, A=3, N=2, D=1, SD=0.



Success Factors

A reliable assessment study is a slow process. The sample is small since there are not many mathematics/mathematics education majors, even at a large university. Furthermore, individual classes may vary in size (in our case, from 12 to 30). Also, student attitudes and reactions to a course are undoubtedly dependent on the instructor. Thus the data is best collected and evaluated over an extended time period.

Note: In [4], Moore reports on a study of cognitive difficulties experienced by students in a course similar to FOAM. He discusses experiences with individual students in learning proof techniques and includes a related bibliography.

Table 3
Strongly Agree/Agree Responses

[Bar graph omitted: percentage of strongly agree/agree responses (scale 0–100) to survey questions 1, 2, 4, 5, 7, and 8, plotted separately for Math majors (n=66) and Math Ed majors (n=35).]

References

[1] Crosswhite, F.J. “Correlates of Attitudes toward Mathematics,” National Longitudinal Study of Mathematical Abilities, Report No. 20, Stanford University Press, 1972.

[2] Fennema, E. and Sherman, J. “Fennema-Sherman mathematics attitudes scales: Instruments designed to measure attitudes toward the learning of mathematics by females and males,” JSAS Catalog of Selected Documents in Psychology 6 (Ms. No. 1225), 1976, p. 31.

[3] Kloosterman, P. “Self-Confidence and Motivation in Mathematics,” Journal of Educational Psychology 80, 1988, pp. 345–351.

[4] Moore, R.C. “Making the transition to formal proof,” Educational Studies in Mathematics 27, 1994, pp. 249–266.

Assessing the Major: Serving the Needs of Students Considering Graduate School

Deborah Bergstrand
Williams College

The focus of this article is on assessing student learning for a segment of the undergraduate mathematics majors: those talented students who are prospective mathematics graduate students. At this private, liberal arts college in the Northeast there are three aspects of the undergraduate mathematics experience outside the standard curriculum which are described in this paper: a senior seminar (required of all majors), an Honors thesis and a summer undergraduate research experience. All three combined are used to assess how well the department is preparing students for graduate school.

Background and Purpose

Williams College is a small liberal arts college in rural Massachusetts with an enrollment of 2000 students. The Mathematics Department has 13 faculty members covering about 9.5 FTE’s. Williams students are bright and very well prepared: the median mathematics SAT score is 700. For the past 7 years, annual mathematics enrollments have averaged about 1050, about 30 students have majored in mathematics from each class of 500, an average of 3 or 4 students have gone on to graduate school in mathematics or statistics, and about half to two-thirds of those have completed or are making progress toward the PhD.

Three aspects of the mathematics experience outside the standard curriculum at Williams are particularly relevant to students considering graduate school: the Senior Colloquium, the honors thesis, and the SMALL Undergraduate Summer Research Project. The colloquium and honors thesis lie within the formal structure of the major; both have existed for decades. The SMALL project is a separate program which began in 1988. These three experiences help cultivate in students some of the qualities necessary for success in graduate school such as independent learning, creative thinking on open problems, and the ability to present mathematics well orally and in writing. How well our students respond to these undertakings in part measures how well the rest of the major prepares them for graduate work. Which students choose to do an honors thesis or apply to SMALL is another measure of how appropriately we inspire our students to consider graduate school.

The Senior Colloquium and Honors Program have existed at Williams for many years. The former is required of all majors, a shared experience with many challenges and benefits. The latter is a special opportunity appropriate only for the most able and ambitious students. Each is a barometer in some obvious ways of how well we prepare students for graduate school. The ability of students to prepare clear and engaging talks on demanding topics they research with little faculty assistance is a sign of good preparation, not only for graduate school but for many other undertakings. Students who produce interesting results in their theses, written and presented orally in mature mathematical style, further demonstrate that we are helping their development as mathematicians. Likewise the performance of students in the SMALL project is an obvious preparation indicator.

Method

The Senior Colloquium is a colloquium series that meets twice a week throughout the school year. It is advertised and open to the entire community. The talks are given mainly by seniors, with occasional outside speakers, to an audience of mostly mathematics majors and faculty. Presentations last 30–40 minutes, followed by soda, cookies, and informal discussion. Senior majors are required to give one talk each, prepared under the guidance of a faculty member on a subject they have not studied in a class. After a few minutes of mingling with students, faculty members evaluate the talk together on a pass/fail basis. This private discussion lasts only a few minutes. The advisor verbally conveys the result to the student immediately, along with comments on the presentation. In an average year at most one or two failures occur. Seniors who fail the talk are usually required to give a second talk on a new topic. Seniors must also meet an attendance requirement to pass the Colloquium. Students do not receive formal credit for the Senior Colloquium, but it is a requirement for the major.

Evaluation of the Senior Colloquium from the student point of view takes place in two ways: the Mathematics Student Advisory Board and the senior exit interview. The Mathematics Student Advisory Board (MSAB) is a set of six students (three seniors, two juniors, and one sophomore) who meet with job candidates, help plan department social events, and serve as a sounding board for student concerns throughout the year. The senior and junior members are mathematics majors elected by their peers. The sophomore is appointed by the department. As with many such matters, the MSAB is routinely consulted about the Senior Colloquium, especially when a particular concern arises. Though seniors, both through the MSAB and through their exit interviews, are almost always the source of specific concerns or suggestions about the colloquium, the resulting issues are nearly always discussed with the MSAB as a whole. The input of underclass students is often very helpful.

Very rarely seniors will approach their colloquium so irresponsibly that their talks need to be canceled (the student, the advisor, or both may come to this decision). In such cases the student is usually required to give an extra colloquium talk. In general, we tend to deal with recalcitrant seniors on an ad hoc basis. Fortunately they are rare, and their circumstances often convoluted enough, that this approach is most effective and appropriate.

The Senior Colloquium can be a particularly valuable experience for students considering graduate school. The independent work required is good practice for more demanding graduate-level courses. The effort required to prepare a good talk alerts students to the demands of teaching.

The Senior Honors Thesis is an option in all majors, and is typical of many schools’ honors programs. Because it usually involves original research, the thesis is of tremendous benefit to students going on to graduate school. The thesis is one of a senior’s four regular courses in the fall semester and the sole obligation during the 4-week January term. More ambitious seniors may extend their theses into the spring term as well, if their advisor approves. Successful completion of an honors thesis along with some other accomplishments gives the student the degree with Honors in Mathematics.

The honors thesis culminates in a formal paper written by the student and a thesis defense given orally. The paper gives the student’s results along with appropriate background and bibliography. Students present their results in the talks, during and after which faculty members may ask questions or make suggestions. Revisions to the thesis may result though they are usually minor. The student receives a regular course grade determined by the faculty advisor. The department as a whole determines whether the thesis warrants Highest Honors, Honors, or, in rare cases, neither. Some students’ work is exceptional enough to publish. Clearly the senior thesis is a good foretaste of research in graduate school. The Honors Program is evaluated in basically the same way as the Senior Colloquium: through student interviews, discussions with MSAB, and faculty discussion at department meetings.

The SMALL Undergraduate Research Project is a 9-week summer program in which students investigate open research problems in mathematics and statistics. The largest program of its kind in the country, SMALL is supported in part by a National Science Foundation grant for Research Experiences for Undergraduates and by the Bronfman Science Center of Williams College. Over 140 students have participated in the project since its inception in 1988. Students work in small groups directed by individual faculty members. Many participants have published papers and presented talks at conferences based on work done in SMALL. Some have gone on to complete Ph.D.s. Due to funding constraints, about 60% of SMALL students come from Williams; the rest come from colleges and universities across the country. (To answer the obvious question, “SMALL” is an acronym for the faculty names on the original grant.)

SMALL was initiated in order to give talented students an opportunity to experience research in mathematics. We hoped those with particularly strong potential who were inspired by the experience would then choose to attend graduate school and be that much better prepared for it. (I should point out that during the early years of SMALL there were many other efforts undertaken in the department to increase interest in mathematics in general. The best evidence of our success is the increase in majors from about 10 per year to an average of over 30, and the increase in enrollments in mathematics courses from around 600 to over 1000 per year.)

Evaluation of SMALL has two in-house components: one by student participants and one by faculty. Each summer the students conduct their own evaluation in the last two weeks of the project. Two or three students oversee the process of designing the questionnaire, collecting responses, and assembling a final report in which comments are presented as anonymously as possible. Each faculty supervisor gets a copy of the evaluation. (The students themselves each receive a written evaluation from their faculty supervisor.) Student comments cover many aspects of SMALL, including the structure of research groups, amount of direction from faculty, number and type of mathematics activities, length of the program, and various extracurricular matters such as housing and social life. Some of the changes motivated by these comments will be presented in the next section.

Faculty evaluate SMALL more informally through discussions during the summer and school year. We have debated many issues, including how to attract and select participants, expectations and work structure for research groups, and the value of attending conferences and other activities.

Because participants work on open problems, SMALL is an excellent means for students to test their graduate school mettle. Though the circumstances are somewhat idealized (summer in the Berkshires, no classes or grades), how one responds to doing mathematics all day every day is a very basic factor in the decision to go to graduate school. More important is one’s response to the challenge, frustration, and, hopefully, exhilaration that comes from tackling a real research problem.

Both the SMALL Project and the Honors Program also provide feedback on graduate school preparation in a more indirect but still important way. The number of our most talented and dedicated students who choose to apply to SMALL and go on to do an honors thesis reflects how well we prepare, advise, and encourage students regarding graduate study. Of course, there will always be variations from year to year in the number of honors students and SMALL applicants, and Williams mathematics majors often have strong interests in other fields, but if we notice particularly talented students choosing not to pursue these options, we are forced to examine our role in such a decision. Did we miss the opportunity to create a department mentor for the student? Did the student have some negative experience we could have avoided? Is the student well-informed about the benefits of both SMALL and the Honors Program even if they are unsure about their commitment to mathematics? There may be many reasonable explanations; not all our strong students will choose graduate school or our programs in undergraduate research. In fact, given the current job market, we must be cautious and honest with our students about graduate study. Nonetheless, it is extremely helpful for us and for our students to have programs in which participation is instructive in so many ways.

Findings

Near the end of the spring term, seniors are interviewed by a senior faculty member about their experience as a mathematics major. A question like “What do you think of the Senior Colloquium?” is always included. Most seniors consider the colloquium a highly valuable experience; some feel it is one of the best parts of the major, even though it lies outside the formal curriculum.

Evaluation of the Senior Colloquium from the faculty point of view occurs mainly through discussions at department meetings. Over the past several years, we have discussed balancing advising work among faculty, responding to recalcitrant students (ill-prepared, canceling talks, etc.), evaluating talks (in a different sense than that above), and the process by which we select the best colloquium (for which we have a prize each year).

We feel our rather informal evaluation procedures are more than adequate in large part because the overall quality of colloquium talks has increased. Failures have always been rare, but there used to be several (3–4) borderline cases each year. Now it’s more likely to have only one or two such cases, and they are almost always not poor enough to fail but rather just weak or disappointing.

As mentioned previously, the department has grown a lot in the last ten or more years. In particular, the number of students involved in research and the level to which we encourage such activity has increased dramatically. Where most honors theses used to be expository, now most involve original research. Not every student produces interesting results, but the experience of doing mathematics rather than just learning about mathematics is still significant. We believe this change is due in large part to the existence of the SMALL Project and the department’s conviction that undergraduates can indeed do research in mathematics.

Several external agencies provide a more indirect evaluation of SMALL. The NSF and other granting agencies expect annual reports. More significantly, by renewing the REU grant three times over the last 9 years, the NSF has signaled its approval of our project. Otherwise, the NSF has provided no specific findings resulting in changes to SMALL. Further indirect evaluation comes from those journals to which research results are submitted for publication. (Journals publishing SMALL papers include The Pacific Journal of Mathematics, The American Mathematical Monthly, and the Journal of Undergraduate Research.)

Use of Findings

In discussion with students about the Senior Colloquium, concerns raised over the years include attendance, how talks are evaluated, and the procedures for preparing a talk. On the first matter, keeping track of attendance has been awkward and unreliable in the past. We now have a system that works very well. At the front of the room we hang a poster-sized attendance sheet for the entire semester with the heading “Colloquium Attendance Prize.” (We do give a small prize at the end of the year to the student who attends the most talks.) The prize is a reasonable cover for the attendance sheet which, being posted publicly, encourages students to be responsible about coming to talks. We have also relaxed the requirement to specify a certain number of talks each semester, rather than one each week as we used to require.


Faculty used to evaluate a talk immediately afterwards, leaving students to chat over tea. Though usually we would disappear into another room for only a few minutes, by the time we returned most students would be gone, leaving only the speaker and a few loyal friends. Now we delay our meeting a bit, talking with students about the colloquium or other matters. Though the speaker has to wait a little longer for our verdict, we all benefit from the relaxed time to interact over cookies and soda. On the preparation of talks, wide variations in faculty approaches to advising a talk used to confuse many students. We now have, to some extent, standardized the timetable under which talks are prepared, so students know, for example, that they have to pick an advisor at least four weeks before the talk and give at least one practice talk.

Some of the issues that continue to be revisited include balancing colloquium advising work among faculty and responding to recalcitrant students (ill-prepared, canceling talks, etc.). In some years, by their own choice students seem to distribute themselves pretty well amongst different faculty. In other years, however, a few faculty members have found themselves deluged with requests to advise colloquium talks. It’s hard to say no, especially for junior faculty. Rather than set specific limits on the number of talks one can advise, we prefer to remind ourselves annually that an imbalance may occur, and then keep an eye on each other and especially on nontenured colleagues, encouraging them to say no if they feel overburdened advising talks.

Our internal evaluations of the SMALL Project have resulted in some significant changes over the years. The project now lasts nine weeks instead of ten. In response to requests from both students and faculty for a more flexible schedule, there is now only one morning “convocation” per week for announcements and talks rather than daily gatherings. (Weekly afternoon colloquia and Friday afternoon teas are still standard.) A more significant change affected the project format. The original format had students work in two groups of four to five on problems supervised by two different faculty members. Under this structure students and faculty often felt that student efforts were spread too thin. Furthermore, fostering efficient and comfortable group dynamics among five or six students could be quite a challenge. Participants now work with only one faculty member in one group of three students. Though there is some risk that students might not find their one group engaging, for the most part students now seem more focused, productive, and, ultimately, satisfied.

The Senior Colloquium has affected some aspects of our teaching in a way not solely connected to graduate school preparation. Seeing the benefits students gain by giving a colloquium talk, some faculty have added a presentation component to their more advanced courses. This not only enhances students’ experiences in that course, but can also in return help prepare them for the more demanding Senior Colloquium which may lie ahead.

Success Factors

Our Honors Program is typical of that found at many other schools. Many of the benefits of such a program have already been mentioned: it gives talented students an experience in independent work and original research, offers students and faculty a chance for extended one-on-one work, fosters interest in graduate study, and helps prepare students for some of the rigors of graduate study. The costs tend to be what one would suppose: the challenge of finding appropriate projects, the extra workload for faculty, the potential for meandering dabbling rather than the structured learning the student would find in a regular course. There is a bit of an ironic downside as well. We have learned through student interviews that sometimes majors who are not doing honors theses feel they are less interesting to the faculty or less important to the department. Such a circumstance reflects the broader challenge any department faces in trying to meet the needs of all its students.

A full-fledged REU program like the SMALL Project is rarer than an Honors Program. It requires a strong commitment by the department as well as the privilege of an NSF grant or other outside funding and, in our case, substantial institutional resources. The costs and benefits of such a program are extensive. On the plus side are many of the same benefits as an Honors Program as mentioned above. In terms of cost, a summer research program requires faculty who are both willing to commit most of their summer and able to engage students in research. While not all faculty at Williams do SMALL every summer, and those who do also spend time on other projects, the fact remains that working in SMALL can take away substantially from other work faculty may wish or need to do. The project also requires a large commitment from the College. Students get subsidized housing and over half of all participants have their stipends paid by the College. (Williams has a strong tradition of supporting student summer research, especially in the sciences. Many faculty receive outside funding and the College returns a portion of the resulting overhead to the sciences to help support student research.) Further information about SMALL may be found in [1].

Though the department is fully committed to it, the Senior Colloquium is an expensive undertaking. Advising a talk is time consuming. Typically majors require some assistance selecting a topic, two or more meetings to digest the reading(s) they have found or which the advisor has given them, further discussion on the design of the talk, and then at least one practice talk. Over the course of three or four weeks one can spend eight or ten hours advising a talk. It’s not unusual for some faculty to advise five talks in a year, so the time commitment is nontrivial. Then there is the colloquium series itself in which visitors speak as well as seniors. All faculty try to attend at least one of the two talks each week, often both. Since we also have a weekly faculty research seminar, the time spent attending talks is also nontrivial. All of this effort would probably not be well-spent if its only benefit were to students preparing for graduate school. The overall benefit to all our majors, however, makes the effort well worthwhile. Seniors gain independence, poise, and confidence even as they learn a healthy respect for the effort required to prepare a good presentation. We think the Senior Colloquium is one of the best aspects of our major.

Reference

[1] Adams, C., Bergstrand, D., and Morgan, F. “The Williams SMALL Undergraduate Research Project,” UME Trends, 1991.


Part II

Assessment in the Individual Classroom

Introduction

Bonnie Gold
Monmouth University

When Sandy Keith first invited me to join her in putting together this book, my motivation was the section on assessing the major. In our last accreditation review, Wabash College, where I began this work, had been told that it was only being provisionally accredited, pending adoption of an acceptable assessment plan. The administration then told each department to come up with a plan for assessing its activities. With little guidance, we each put together a collection of activities, hoping they would be sufficient to satisfy the accrediting association. Having wasted a lot of time trying to figure out what assessment meant and what options we might consider, I felt that by writing a book such as this one, we could save many other colleagues a similar expenditure of time. Thus, I began work on this book more from a sense of duty to the profession than from any excitement over assessment.

My attitude toward assessment reversed dramatically when, on Sandy’s suggestion, I read Angelo and Cross’s Classroom Assessment Techniques. This book contained ideas which could help me in the classroom, and immediately. I began putting some of these techniques to use in my classes and have found them very worthwhile.

Most of us have had the experience of lecturing on a topic, asking if there are any questions, answering the few that the better students have the courage to ask, assigning and grading homework over the topic, and yet when a question on that topic is asked on an examination, students are unable to answer it. Classroom Assessment Techniques (CATs) are ways to find out whether students understand a concept, and what difficulties they are having with it, before they fail the test on it. They also include ways to find out how well students are reading, organizing the material, and so on. Traditional assessment is summative assessment: give students a test to find out what they’ve learned, so they can be assigned a grade. CATs are formative assessment: find out where the problems are so students can learn more and better.

Not all of the assessment techniques in this section of our book are CATs; many of them can be used for both formative and summative assessment, and some are mainly summative. However, what they all have in common is that in the process of assessment, students are learning by participation in the assessment technique itself. We’ve always given lip service to the assertion that on the best tests, students learn something taking the test; the techniques in this section make this homily a reality. This section doesn’t try to duplicate Angelo & Cross, and we still recommend that you look at that book, since many of the techniques which they don’t specifically recommend for mathematics can be adapted to use in our classrooms. However, we present here a good variety of assessment techniques actually being used in mathematics classrooms around the country to improve student learning.

This section begins with variations on the traditional tests and grades. Sharon Ross discusses ways in which traditional tests can be modified to examine aspects of student understanding, beyond the routine computations which form the core of many traditional examinations. One variation on tests is Michael Fried’s Interactive Questionnaires, a form of quiz taken electronically and sorted automatically to give large classes semi-individualized detailed responses to their work. Then William Bonnice discusses allowing students to focus on their strengths by choosing what percent (within a range selected by the instructor) of their grade comes from each activity.

Next comes a collection of CATs, ways of finding out how well students understand a concept or section which has been taught. David Bressoud discusses adapting Angelo & Cross’ One-Minute Paper to mathematics classes, large or small. It takes a very small amount of classroom time to find out how productive a class period has been, and what topics need further clarification. Dwight Atkins also adapts an Angelo & Cross technique, Concept Maps, to mathematics classes to learn how well students understand connections between related concepts. John Koker has his students discuss the evolution of their ideas as they work on homework sets, which gives him insight into how much they actually understand and what they have learned by working the problems. John Emert offers three quick CATs: a brief version of the class mission statement (which Ellen Clay discusses, later in the section, in more detail), a way to check how well the course is going before the end of the semester, and a quick way to assess how groups are working. Sandra Keith describes a number of “professional assessment techniques,” ways to encourage students to take a more professional attitude toward their studies, going beyond the individual course.

Several articles discuss helping students review and clarify concepts. Janet Barnett uses true/false questions, but has students justify their answers. These justifications bring out confusions about the concepts, but are also the beginning, for calculus students, of writing mathematical proofs. Joann Bossenbroek helps developmental and precalculus students become comfortable with mathematical language and learn how to use it correctly. Agnes Rash encourages students to look for their own applications of the methods they’re studying while reviewing the material for a test.

At the opposite extreme from most CATs, which are fairly short and give helpful information on individual details, are methods used in research on teaching and learning. We have not tried to cover that field in this section, but do have one article, by Kathleen Heid, on using videotaped in-depth interviews to assess student understanding. While very time-consuming, it is a method of finding out just what students have understood, and where confusions lie.

There has been considerable attention in the last several years to having students engage in larger projects and write about mathematics as a route to learning it better. Several other Notes volumes (especially #16, Using Writing to Teach Mathematics, and #30, Problems for Student Investigation) discuss these processes in more detail; here we present additional guidance in this direction. Annalisa Crannell discusses assigning and assessing writing projects. Charles Emenaker gives two kinds of scales which can make the process of assessing project reports less tedious. Brian Winkel explains how having students work on large projects during class time gives multiple opportunities to redirect students’ formation of concepts before they are set in stone in the students’ minds. Alan Knoerr and Michael McDonald use two kinds of portfolios to help students aggregate and reflect on concepts studied. Dorothee Blum uses student papers in an honors, non-science majors’ calculus course to integrate ideas studied. Alvin White discusses using student journal writing to expand students’ vistas of mathematics.

Patricia Kenschaft uses short papers and students’ questions to have general education students think carefully about the readings.

As many of us move away from complete dependence on the lecture method, the issue of assessing these alternative teaching techniques arises. Nancy Hagelgans responds to the concern, when using cooperative learning, of how to ensure that individuals take responsibility for their own learning. Catherine Roberts discusses effective ways of giving tests in groups. Carolyn Rouviere discusses using cooperative groups for both learning and assessment. Annalisa Crannell explains how to give and grade collaborative oral take-home examinations. (See also the article by Emert earlier in this section for a quick CAT on discerning how well groups are working.) Sandra Trowell and Grayson Wheatley discuss assessment in a problem-centered course.

In recent years we have become more aware that not all students are 18–22-year-old white males. Some of these other students don’t “test” well, but can demonstrate what they’ve learned when alternative assessment methods are used. Two articles discuss these special-needs students. Regina Brunner discusses what techniques she’s found effective in her all-female classes, and Jacqueline Giles-Giron writes of strategies to assess the adult learner.

Turning from assessment of individual parts of a course to assessment of how a course as a whole is going, Ellen Clay helps set the tone of her courses at the beginning by developing a “class mission statement,” which can be revisited as the course progresses to assess progress toward meeting course goals. William Bonnice, David Lomen, and Patricia Shure each discuss methods of finding out, during a semester, how the course is going. Bonnice uses questionnaires and whole-class discussion. Lomen uses student feedback teams. Shure has an outside observer visit the class, who then holds a discussion with the class in the absence of the instructor, and provides feedback to the instructor. Janet Barnett and Steven Dunbar discuss alternatives to the standard course questionnaire for summing up a course once it’s finished. Their methods provide more useful information than do student evaluation questionnaires for improving the next incarnation of the course. Barnett has students write letters to friends describing the course. Dunbar collects data throughout the semester into a course portfolio, which can then be used by that instructor or passed on to others.

Keep in mind that as you consider using one of these techniques, even if your course and institution are very similar to that of the author, to make the technique work well you will need to modify the technique to fit your personality and the tone and level of your class. But for most of your assessment problems, there should be at least one technique from this assortment which can help.

What Happened to Tests?

Sharon Cutler Ross
DeKalb College

For students to believe that we expect them to understand the ideas, not just be able to do computations, our tests must reflect this expectation. This article discusses how this can be done.

Background and Purpose

Tests — timed, in-class, individual efforts — are still the primary assessment tool used by mathematics faculty at DeKalb College, but comparing current tests to tests of ten, or even five, years ago shows that perhaps the characteristics given above are the only things that are still the same. And even these three descriptors are often wide of the mark. A variety of other types of assessment are used on a regular basis at DeKalb College, but this essay is primarily a report on how tests have changed.

DeKalb College is a large, multi-campus, public, commuter, two-year college in the metropolitan Atlanta area. We have approximately 16,000 students, many of whom are non-traditional students. The curriculum is liberal arts based and designed for transfer to four-year colleges. There are only a handful of career programs offered by the college. The mathematics courses range from several low-level courses for the career programs through a full calculus sequence and include discrete mathematics, statistics, linear algebra, and differential equations courses. In each course we follow a detailed common course outline, supplemented by a course-specific teaching guide, and use the same textbook in each class. At regular intervals we give common final examinations to assess the attainment of course goals, but otherwise individual instructors are free to design their own tests and other assignments. A comprehensive final examination is required by the college in all courses. In practice, mathematics faculty often work together in writing tests, assignments, and final exams.

Mathematics teachers at all levels joke about responses to the dreaded question, “Will this be on the test?” But students are reminding us with this question that in their minds, tests are the bottom line in their valuation of material. Most students value ideas and course material only to the extent that they perceive the instructor values them, and that is most clearly demonstrated to them by inclusion of a topic on a test. Our students come to us with very strong ideas about what tests are and what they are not. Changing tests means changing our messages about what we value in our students’ mathematical experiences. Acknowledging this means we have to think about what those messages should be. At DeKalb College, we see some of our entry-level courses primarily as skill-building courses, others as concept-development courses. Even in courses that focus on skill building, students are asked to apply skills, often in novel settings. We want all students to be able to use their mathematics. Also, fewer skills seem critical to us now; in time it will be hard to classify a DeKalb College mathematics course as skill building rather than concept building.

Technology was one of the original spurs for rethinking how tests are constructed; writing common finals was the other spur. One of the first issues we tackled in using technology was how to ask about things that faculty valued, such as students’ knowing exact values of trig functions. Those discussions and their continuations have served us well as we use more and more powerful technology. Some of the questions that recur are: Is this something for the technology to handle? at this stage in the student’s learning? ever? Is this topic or technique as important as before? worth keeping with reduced emphasis? Common finals mean we have to discuss what each course is about, what is needed for subsequent courses, and what can really be measured.


The driving force now is our evolving philosophy about our courses and their goals. Changes in course goals have resulted in changes in instruction. We use lots of collaborative work, lab-type experiences, extended time out-of-class assignments, and writing in a variety of settings. All of these require appropriate types of assessment, and this affects what our traditional tests must assess.

Method

Assessing Skills

When we do want to evaluate mathematical skills, we ask directly for those things we think are best done mentally. The experience of choosing appropriate technology including mental computations and paper-and-pencil work is a necessary part of testing in a technology-rich environment. We try to be careful about time requirements as we want to assess skill not speed. As skill assessments, the next two examples differ very little from textbook exercises or test items from ten years ago, but now items of this type play a very minor role in testing.

Example: Give the exact value of sin(π/3).

Example: (a) Give the exact value of ∫_0^1 e^x dx.
(b) Approximate this value to three decimal places.

Another type of question for skill assessment that gives the student some responsibility is one that directly asks the student to show that he or she has acquired the skill.

Example: Use a 3 × 3 system to demonstrate that you know and can apply Cramer’s rule.

Example: Describe a simple situation where the Banzhaf power index would be appropriate. Give the Banzhaf index for each player in this situation.

Assessing Conceptual Understanding

Good ways to measure conceptual understanding can be more of a challenge to create, but there are some question styles we have found to be useful. Here are examples of (1) creation items, (2) reverse questions (both positive and negative versions), (3) transition between representations, (4) understanding in the presence of technology, and (5) interpretation.

(1) Creation tasks are one way to open a window into student understanding. Often a student can identify or classify mathematical objects of a certain type, but can not create an object with specified characteristics.

Example: Create a polynomial of fifth degree, with coefficients 3, –5, 17, 6, that uses three variables.

Example: Create a function that is continuous on [–5,5], whose derivative is positive on [–5,3) and negative on (3,5], and whose graph is concave up on [0,2] and concave down elsewhere.

(2) One thing we have learned is that students are always learning. Whether they are drawing the conclusions the instructor is describing is an open question, however. Reversing the type of question on which the student has been practicing is a good way to investigate this question. These two examples come from situations where exercises are mostly of the form: Sketch the graph of …. (The graphs have been omitted in the examples.)

Example: Give a function whose graph could be the one given.

Example: Why can the graph given not be the graph of a fourth degree polynomial function?

(3) Being able to move easily between representations of ideas can be a powerful aid in developing conceptual understanding as well as in problem solving, but students do not make these transitions automatically. Test items that require connecting representations or moving from one to another can strengthen these skills as well as reveal gaps in understanding.

Example: What happens to an ellipse when its foci are “pushed” together and the fixed distance is held constant? Confirm this using a symbolic representation of the ellipse.

Example: The columns of the following table represent values for f, f′, and f″ at regularly spaced values of x. Each line of the table gives function values for the same value of x. Identify which column gives the values for f, for f′, and for f″. (Table omitted here.)

(4) When technology is used regularly to handle routine computations, there is a possibility that intuition and a “feel” for the concept is being slighted. If this is a concern, one strategy is to ask questions in a form that the available technology can not handle. Another is to set a task where technology can be used to confirm, but not replace intuition.

Example: Let f(x) = ax^2 and g(x) = bx^3 where a > b. Which function has the larger average rate of change on the interval (0, a/b)? Support your conclusion mathematically.
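
One supporting computation, assuming the reconstructed interval (0, a/b) with a > b > 0, shows why this item rewards algebra over guessing: the two average rates coincide,

    [f(a/b) – f(0)] / (a/b) = a(a/b)^2 / (a/b) = a^2/b,
    [g(a/b) – g(0)] / (a/b) = b(a/b)^3 / (a/b) = a^2/b.

A grapher can confirm the equality, but only the algebra explains it.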

Example: Change the given data set slightly so that the mean is larger than the median.

weight (g)     2    3    4    5    6
frequency     10    9   17    9   10

What is another result of the change you made?
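
One of many possible answers (this worked solution is illustrative, not from the text): the given distribution is symmetric, with mean = median = 4 for its 55 observations. Moving a single observation from weight 2 to weight 6 (frequencies 9, 9, 17, 9, 11) gives

    mean = (9·2 + 9·3 + 17·4 + 9·5 + 11·6)/55 = 224/55 ≈ 4.07 > 4 = median,

and, as a second effect, the distribution is no longer symmetric.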

(5) Another source of rich test items is situations that ask students to interpret a representation, such as a graph, or results produced by technology or another person.


Example: The following work has been handed in on a test. Is the work correct? Explain your reasoning.

    ∫_-1^3 dx/(x – 2) = ln|x – 2| ]_-1^3 = ln(1) – ln(3) = –ln(3)
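
(The intended trap, assuming the reconstruction above: the integrand 1/(x – 2) is unbounded at x = 2, which lies inside (–1, 3), so the Fundamental Theorem of Calculus does not apply; the integral is improper and diverges, since ∫_2^3 dx/(x – 2) = lim_{t→2+} [ln(1) – ln(t – 2)] = +∞.)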

Example: The following graph was obtained by dropping a ball and recording the height in feet at time measured in seconds. Why do the dots appear to be further apart as the ball falls? (The graph is omitted here.)

Assessing Problem Solving

Because developing problem-solving skills is also a goal in all of our courses, we include test items that help us measure progress to this goal. Questions like this also tell our students that they should reflect on effective and efficient ways to solve problems.

Example: Our company produces closed rectangular boxes and wishes to minimize the cost of material used. This week we are producing boxes that have square bottoms and that must have a volume of 300 cubic inches. The material for the top costs $0.34 per square inch, for the sides $0.22 per square inch, and for the bottom $0.52 per square inch.
(a) Write a function that you could use to determine the minimum cost per box.
(b) Describe the method you would use to approximate this cost.

Example: (a) Use synthetic division to find f(2.7) for the function f(x) = 2x^5 – 3x^3 + 7x – 11. (b) Give two other problems for which you would have done the same calculations as you did in part (a).
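
Synthetic division amounts to Horner’s method, so the same calculation also evaluates the polynomial and produces the remainder on dividing by (x – 2.7). A small Python sketch of that shared calculation (the helper name is ours, not from the text):

    def horner(coeffs, x):
        # Evaluate a polynomial by synthetic division / Horner's method.
        # coeffs run from the highest-degree term down; the successive
        # values of acc are the entries in the synthetic-division row.
        acc = 0.0
        for c in coeffs:
            acc = acc * x + c
        return acc

    # f(x) = 2x^5 - 3x^3 + 7x - 11; note the zero coefficients for x^4, x^2.
    print(horner([2, 0, -3, 0, 7, -11], 2.7))   # f(2.7); also the remainder
                                                # when f(x) is divided by x - 2.7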

Findings

In addition to the broad categories described thus far, questions that require students to describe or explain, estimate answers, defend the reasonableness of results, or check the consistency of representations are used more often now. All these changes in our tests are part of a rethinking of the role of assessment in teaching and learning mathematics. We understand better now the implicit messages students receive from assessment tasks. For example, tests may be the best way to make it clear to students which concepts and techniques are the important ones. Assessing, teaching, and determining curricular goals form an interconnected process, each shaping and re-shaping the others. Assessment, in particular, is not something that happens after teaching, but is an integral part of instruction.

Use of Findings

As our tests have changed, we have learned to talk with students about these changes. Before a test students should know that our questions may not look like those in the text and understand why they should be able to apply concepts and do more than mimic examples. Students should expect to see questions that reflect the full range of classroom and homework activities — computations, explorations, applications — and to encounter novel situations. We suggest that students prepare by practicing skills (with and without technology, as appropriate), by analyzing exercises and examples for keys to appropriate problem-solving techniques, and by reflecting on the “big” ideas covered since the last test. After a test, students should have opportunities to discuss what was learned from the test, thought processes and problem-solving strategies, and any misunderstandings that were revealed. These discussions, in fact, may be where most of a student’s learning takes place.

Success Factors

The world provides many wonderful and clever problems, and newer texts often contain creative exercises and test questions. But our purpose is to learn what the student knows or does not know about the material, not to show how smart we are. So we examine course goals first, then think about questions and tasks that will support those goals. We suggest reconsidering as well the grading rubric for tests. Using an analytic or holistic grading rubric can change the messages sent by even a traditional test. Consider the proper role of tests in a mix of assignments that help students accomplish course goals. And ask one final question: do you really need to see timed, in-class, individual efforts?

Interactive E-Mail Assessment*

Michael D. Fried
University of California at Irvine

This article discusses how internet technology can be harnessed to give students semi-automated individualized help. The intended reader appreciates how little problem solving guidance students get in class, and how they are on their own to handle giving meaning to learning an overstuffed curriculum.

——————
* Partial support: Sloan Foundation 1994–96, NSF #DMS 9622928.

Background and Purpose

Vector calculus is difficult for all students. It combines algebra and geometry in ways that go beyond high school training. Students need frequent and consistent responses. Without assurances they are learning a model supporting the course’s ideas, students revert to memorizing details and superficial imitation of the instructor’s blackboard topics. Many bog down, lose interest, and drop the course.

I needed more resources to give a class of 65 students the help they were clamoring for. So, I set out to develop a system of e-mail programs that would give more effective interaction with students. My goals were to:

1. Retain minority students
2. Enhance interaction between professors and students through project and portfolio creation
3. Find out why — despite good evaluations — fundamental topics didn’t work
4. Continue to live a reasonable life while accomplishing goals 1 and 2.

I chose technology because it is the only resource we have more of than five years ago. We have less money and less time. Further, a Unix e-mail account is cheap and powerful, once you learn how to use it. Here are basic elements of the program, which was funded by the Sloan Foundation:

• Establish quality student/instructor e-mail communication
• Gather student data through e-mail Interactive Questionnaires (IQs)
• Develop a system of automated portfolios
• Create a process for developing individual and team projects
• Reinforce student initiative toward completion of projects
• Collect weekly comment files for students and teaching assistants

These elements required designing and programming an office system of seven modules. Complete documentation will appear in [3]. [2] has more discussion of what to expect from on-screen use of the programs. Each module incrementally installs into existing e-mail Unix accounts. These are now available to others at my institution. For example, having teaching fellows use parts of the system improved their performance and communication with me. Using the system requires training on basics, like how to move to a directory in a typical e-mail account. In this article I concentrate on IQs and automated portfolios.

This e-mail system allowed efficient fielding of 30–40 quality interchanges a day. Further, it kept formatted electronic portfolios tracking each student’s interactions with me, in several forms.

Method

Interactive Questionnaires (IQs). An IQ is an enhanced interactive quiz (controlled by a computer program) offering various aids and guidance to students who receive it by e-mail. Students take IQs at computer terminals — any terminal with access to their mail accounts. When the student finishes, the IQ automatically returns the student-entered data to the instructor by e-mail for (automatic) placement in the student’s portfolio. The IQ reader software sorts and manipulates evaluative data entered by the student and returned by the IQ.

Teaching the modularity of mathematics is exceptionally hard. IQs’ highly modular structure assists this process. They help students focus on analyzing one step at a time. Data from an IQ is in pieces. You view pieces extracted from a student’s (or many students’) response(s) to any IQ with the IQ reader. Therefore, IQs help an instructor focus on small conceptual problem elements in a class.

Here is a sample IQ from vector calculus, divided into parts, to help students analyze lines in 3-space. (Numbers after each part show the point value that part contributes to the 45 points.)

Comparing lines. (45 points) Consider two lines L1 = {(2,1,3) + t(vx,vy,vz) | t ∈ R} and L2 = {(x0,y0,z0) + s(2,1,0) | s ∈ R}. You choose the vector (vx,vy,vz) and the point (x0,y0,z0) to construct lines with certain properties.

a) 7: For what vectors (vx,vy,vz) is L1 parallel to L2?

b) 13: Suppose (x0,y0,z0) = (2,1,3). Let V be the set of nonzero vectors (vx,vy,vz) which make L1 ⊥ L2. Find vectors v1, v2 ∈ V such that v1 is perpendicular to v2.

c) 10: Suppose (vx,vy,vz) = (1,3,2), and L1 and L2 meet at a point. Find a vector w and a point (x1,y1,z1) so the plane P = {(x,y,z) | w · ((x,y,z) – (x1,y1,z1)) = 0} contains all points of both L1 and L2.

d) 15: Suppose (vx,vy,vz) = (1,3,2). For what points (x0,y0,z0) is there a plane P that contains both L1 and L2?

An IQ presents each problem part, with help screens, separately. We (now) use simple TeX notation (see discussion of this in Findings). Numbers after help screen menu items inform students of how many points they will lose for using that help option. The IQ tracks student use of help screens, returning that data to the instructor via e-mail. Here is the help screen from part (b) of the problem above.

Help for Part b: Choose from among these if you feel you need help.

[0.0]: No help needed; I’m ready to answer the question.
[1.2]: Choosing an example vector in V.
[2.2]: Checking if two vectors in 3-space are perpendicular.
[3.3]: Describing V as a set given by equations.
[4.3]: Solving equations for v1 and v2.

The student enters a menu choice. Choosing 0 lets the student type written answers into the IQ. These answers are automatically sent to the instructor and put in the instructor’s portfolio for that student. Any use of other help items before the student sends in the answer is also included in the information the instructor gets. Students can retry menus any time. After returning the IQ to the instructor, a student can go through it again, this time using the help screens to check the work. As a developer of IQs, I work on enhancing the quality of evaluation and interaction.
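To make the mechanics concrete, here is a minimal sketch of how such a help menu might be driven. This is hypothetical: the names and structure are assumptions for illustration, not the documented modules of [3].

```python
# Hypothetical sketch of an IQ help menu for part (b).  The data structure and
# function names are illustrative assumptions, not Fried's documented system.

# menu key -> (help text, point cost), matching the [key.cost] labels above
HELP_MENU = {
    0: ("No help needed; I'm ready to answer the question.", 0),
    1: ("Choosing an example vector in V.", 2),
    2: ("Checking if two vectors in 3-space are perpendicular.", 2),
    3: ("Describing V as a set given by equations.", 3),
    4: ("Solving equations for v1 and v2.", 3),
}

def run_help_menu(part_points):
    """Loop over menu choices, track which help screens the student uses,
    and return (screens_used, points_remaining) for the e-mail report."""
    used, remaining = [], part_points
    while True:
        for key, (text, cost) in sorted(HELP_MENU.items()):
            print(f"[{key}.{cost}]: {text}")
        choice = int(input("Menu choice: "))
        if choice == 0:
            return used, remaining              # student is ready to answer
        if choice in HELP_MENU and choice not in used:
            used.append(choice)
            remaining -= HELP_MENU[choice][1]   # deduct the advertised cost once
        # (the real system would now display the chosen help screen)
```

With part (b) worth 13 points, a student who consults help screens 1 and 3 loses 2 + 3 = 5 points, the accounting reported below for the student Akoines.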

Polling Portfolios. To poll a portfolio is to collect IQ and other interaction data from the student portfolios. Polling portfolios gives an instructor an instantaneous screen or file report on how a collection of 65 students answered a specific question. Unless I choose otherwise, only I have access to the portfolio.

The phrase automated portfolios refers to creating (additions to) portfolios semiautomatically from interactions. Each piece of each problem in an IQ has a tag. This simplifies moving material from a student’s IQ responses (e-mail messages) to the student’s portfolio, and gathering the responses of all students to a given question so the instructor can evaluate them. It is as if the instructor had ripped the bluebook apart to allow grading all students’ responses to a given item together. Placing items correctly into the portfolio is what allows easy polling (creation of reports about the portfolio contents). IQs are much easier to grade than similarly enhanced paper-and-pencil exams. The viewer program for that IQ can create a report across the student portfolio collections using responses to that tagged piece. IQ technology allows the following activities, under the rubric we call polling portfolios (a sketch of the tagging idea follows the list).

• Automatic placing/formatting of IQ responses into student portfolios

• Batched evaluating and commenting on specific pieces from an IQ

• Automatic mailings of commented IQ portions to portfolio owners [2]
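A minimal sketch of the tagging idea, again with hypothetical names (the real modules are Unix e-mail programs documented in [3]):

```python
# Hypothetical sketch of automated portfolios and polling by tag.
# Each IQ response arrives by e-mail already labeled with a tag such as
# "lines.b" for part (b) of the "Comparing lines" problem.
from collections import defaultdict

portfolios = defaultdict(dict)      # mail-name -> {tag: text}

def file_response(student, tag, text):
    """Automatic placement of an IQ response into the student's portfolio."""
    portfolios[student][tag] = text

def poll(tag):
    """Report every student's response to one tagged problem piece,
    juxtaposed so the instructor can grade them together."""
    return {s: p[tag] for s, p in portfolios.items() if tag in p}

def batch_comment(tag, comment, students):
    """Attach one instructor response to many portfolios at once."""
    for s in students:
        portfolios[s][tag + ".reply"] = comment

file_response("Akoines", "lines.b", "As L1 and L2 both go through (2,1,3) ...")
file_response("Gmoore", "lines.b", "Since L1 and L2 go through (2,1,3) ...")
print(poll("lines.b"))              # all answers to part (b), side by side
batch_comment("lines.b", "List vectors (vx,vy,vz) with 2vx + vy = 0 ...",
              ["Akoines", "Gmoore"])
```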

Below are the answers the IQ reader collected from students with mail-names Akoines and Gmoore (two of my 60+ students) to part b of the problem above, followed by my response. Student responses to problem b showed two types of difficulty among all 60+ answers. Other students tried to start the problem without using the help screens. The response to them showed what they would have gained by doing so.

The problem asked:

Suppose (x0,y0,z0) = (2,1,3). Let V be the set of nonzero vectors (vx,vy,vz) which make L1 ⊥ L2. Find vectors v1, v2 ∈ V such that v1 is perpendicular to v2.

Akoines responded: “(x0,y0,z0) = (2,1,3), so L2 = {(2,1,3) + s(2,1,0) | s ∈ R}.” Akoines then paraphrased what Help told him, and proceeded with the problem: “As L1 and L2 both go through (2,1,3), they are perpendicular. This means their directions are perpendicular. So, (vx,vy,vz) is perpendicular to (2,1,0). Dot product gives 2vx + vy = 0. I didn’t get what general solution means.”

Akoines admitted not understanding an idea in the last sentence. Though he didn’t use it, Help had a phrase on this point: Let vz be anything; solve the relation between vy and vx. Gmoore tackled this point without needing Help.

Gmoore responded, “As (x0,y0,z0) = (2,1,3), L2 = {(2,1,3) + s(2,1,0)}. Since L1 and L2 go through (2,1,3), they are perpendicular if their vectors are perpendicular. The product (vx,vy,vz) · (2,1,0) = 2vx + vy is 0. So, vz is anything, and vx and vy satisfy this equation.”

The IQ reader also told me that Akoines had used help screens 1 and 3, at a total cost of 5 points out of the 13 for that part of the problem, and Gmoore had used different help screens, with a different resulting score.

My response to Akoines was “List vectors (vx,vy,vz) with 2vx + vy = 0. Take (vx,vy,vz) from V = {(u, –2u, v) | u, v ∈ R} as in b.3 Help: u = 0, v = 1, and v = 0, u = 1 (as in b.4 Help) to give v1 = (0,0,1) and v2 = (1,–2,0).”

Gmoore and every other student with difficulties similar to Akoines’ got the same response I sent to Akoines. I only had to write this response once, and then tag which students it should be sent to. Solving part b required using the solution of one linear equation in two unknowns. Even with help screens, this was a hard analysis for students. So, the same grader response appeared in many IQs. The viewer lets the grader — seeing many answer parts juxtaposed on the screen — simultaneously add this one response to many IQs. This typifies how a batch response to problem pieces cuts grading time. My upcoming book [3] features many applications of the system for batch and personalized merge mail, including returning IQs to students.

Findings

Enhanced e-mail interactions brought more contact with students in one course than I had in my previous 20 years of teaching. Without this system, these interactions and the associated evaluations would have overwhelmed me.

A ten-week course covers a few succinct ideas. Instructors know where the road goes. Portfolio data shows few students know there is a road. Day to day, students lose their way, and we march on without them. Many faculty are aware students get too little feedback, and that giving more without some aid would overwhelm instructors. I now routinely retain close to 100% of my students. Often students report to me from the next class on how some particular idea improved their ability to use my course in the next.

Further, these portfolios started the process of documenting the value added by the instructor. Side messages from students — also put into their portfolios — show specific changes of attitude. Epiphany occurs when students realize it isn’t luck when an exam suits their studying. Rather than being lucky to have memorized a problem rubric, they see they need more concentration to maintain an analyzing mode. My web site [2] quotes a student likening her negative approach to analyzing problems to her difficulties using a modem.

Using TeX helped my whole class, for the first time ever, use precision in their mathematical writing. Computer shorthand for mathematics formulas (via TeX) aids effective use of mathematics e-mail. The instructor can teach this incrementally. Further, running e-mail exchanges through a TeX compiler translates symbols to magnificent output, a reward for students.

Use of Findings

Suppose in a particular class I find students show they have forgotten material from two weeks before. I use portfolios to show us all what is happening. I ask the polling program to list questions from previous IQs. When I find one (say from an IQ two weeks ago) that illustrates the same questions students couldn’t answer in today’s class, I create a report on the old responses to that question. That takes no time: I just choose “create a report” from the menu in the program. The report of students’ responses several weeks ago astounds them. Most would have asserted we never had that material before. It shows them they must consider anew the effectiveness of their study habits, and other aspects of how they respond to classroom data.

Student portfolios contain interactions, including with classmates, which lay out their intellectual problems. Until I used IQs and portfolios, I didn’t realize how often students lose what they have learned. Despite years of excellent interactions with students, by the early 90s I felt students were deceiving me. They appeared to be masters at convincing me they understood material that, in truth, they did not. Using IQs showed what was happening.

[2] gives detailed screen shots from IQ sessions. A graphic with it illustrates polling portfolios. This shows the dynamics of students learning and losing core material. Students had a tenuous grasp on material soon after a preliminary introduction. Then, weeks later, they lost it. The viewpoint from high school kept reclaiming territory from university ideas. This happened with characteristic material repeatedly (example: falling back on point/slope lines in 3-space). Polling student portfolios simplified catching this and interceding.

My Sloan Technology activities have enjoyed some campus support (for trying to handle 18,000 students). Steve Franklin and Leonard Meglioli [1] at UCI’s Office of Academic Computing have replicated simple IQs in HTML forms under the name “ZOT dispatch.” OAC offers training in it to induce faculty to get students’ comments on their Web offerings. It is a valid, though limited, entrance to the whole portfolio system.

Success Factors

Any use of paper, like students responding to IQs on paper rather than on-line, bogged down recording the effort and responding to it, for the teaching fellow or me. My upcoming book [3] discusses at length why office hours can’t come close to the effectiveness of IQs. Especially: the Sloan system gives an electronic record — put to many uses — and it has all students participate.

References

[1] Franklin, S. and Meglioli, L., III. Documentation for Zot Dispatch, comment system for EEE-class resources at UCI, http://www.oac.uci.edu/X/W6/forms/zot-dispatch/, 1996.

[2] Fried, M.D. “Electronic Portfolios: Enhancing Interactions Between Students and Teachers,” essay and graphic at http://www.oac.uci.edu/indiv/franklin/doc/mfried, 1994.

[3] Fried, M.D. Interactive E-mail: Enhanced Student Assessment, in preparation; 6 of 10 chapters complete as of August 1997.

Flexible Grade Weightings

William E. Bonnice
University of New Hampshire

Using a computer spreadsheet to compute grades, it’s easy to let students (even in a large class) focus on their strengths by choosing what percent (within a range selected by the instructor) of their grade comes from each required activity.

Background and Purpose

I facilitate cooperative classrooms where students are supportive of one another and where much of the learning takes place in class discussions and with students working in groups. To deepen student involvement and to inculcate a sense of their responsibility, I work with the class in making decisions about course operation and learning processes. One day, when the class was arguing about how much weight to place on the various factors that contribute to their grade, it came to me: why not let each student decide for him- or herself from a given range of choices? This way students can weight heavily factors which they like or in which they are strong, and they can minimize the weight of areas in which they are weaker or which they dislike. For example, some very mature and conscientious students claim that they study more efficiently if they make their own choices as to what homework to do. One could respond constructively to such students by allowing zero as the lower bound of the homework range. These students then could choose not to turn in any homework and weight other factors correspondingly higher.

Assessment methods for ascertaining grades should be more broadly based, yet the number of different tasks we can expect a student to undertake is limited. Ideally each student should be able to choose his or her own method of assessment to demonstrate competence and learning. The flexible weighting method, to be described here, is a simple, versatile extension of standard grading methods. It accommodates the diverse strengths, interests, and desires of the students.

Method

Early in the course, a form which lists the ranges for each factor that contributes toward the grade is handed out to the students. They are given some time to experience the course and gauge the teacher before they must make their decisions. Usually shortly after the first exam and the first project are returned, the students are required to make their choices of specific weights, sign the form, and turn it in. At the end of the semester, I record the computation of the students’ grades right on this form, in case they want to check my computations and the weights that they chose.

A typical grading scheme I now use is:

Self-Evaluation: 5%
Teacher-Evaluation: 5%
Board Presentations: 5 to 10%
Journal: 5 to 20%
Projects: 10 to 30%
Homework: 5 to 20%
Hour Exams: 30 to 45%
Final Exam: 15 to 25%

The following simplified example illustrates the possibilities. The first column of figures gives the ranges offered, and the subsequent columns show possible selections of percentages by different students.

              Ranges    Exam    Exam    Homework   Portfolio   Balanced
              Offered   Hater   Lover   Hater      Hater       Student
  Homework     0–25      25      10       0          25          25
  Quizzes     25–40      25      40      40          40          25
  Final Exam  20–40      20      40      30          35          25
  Portfolio    0–30      30      10      30           0          25



We see that the “Exam Hater” may choose to put maximum weight on homework and the portfolio, whereas the “Exam Lover” may choose to put maximum weight on the exams. The “Homework Hater” may choose that the homework not count at all, and the “Portfolio Hater” may choose not to keep a portfolio. We also see that the quizzes and final exam combined must account for a minimum of forty-five percent of a student’s grade; thus homework and the portfolio combined can account for at most fifty-five percent of a student’s grade. If one did not want to make homework optional, one would simply change the lower bound on the homework range to be greater than zero.

Findings

Since I have been using flexible weighting, students have always reacted to it favorably. Often students ask, “Why don’t other teachers use this method?” In view of current technology, that is a legitimate question.

Because I want to foster cooperation, not competition, in my classes, I set grade categories at the beginning of the course: 90% or above receives an A, 80% to 89% receives a B, 70% to 79% gets a C, 60% to 69% gets a D, and below 60% receives an F. When students are able to focus on their strengths and interests, their learning seems to improve.

Although originally I made some factors optional by giving them a lower bound of zero, lately I have been using five percentage points as the smallest lower bound. This gives the students the message that all methods of assessment that we use are important, but that if they really don’t want to use one of them, the loss of the five points might possibly be made up by utilizing the freed-up time to work more on a different, preferred area.

Use of Findings

Using this flexible weighting scheme has influenced me to include more factors in my grading than I had in the past. For example, 5% of a student’s grade comes from a “self-evaluation.” What part of this 5% students receive depends on the quality of their self-evaluation. Students could say that they deserve an A in the course and receive a zero or close to 0 (out of a possible 5%) because they did a poor job of justifying the A. Similarly, students could receive the full 5% of the self-evaluation factor for doing an excellent job of justifying a poor grade in the course. These self-evaluations have enabled me to know and understand my students much better. They make very interesting reading. My present thought is that they ought to be done earlier in the course, around mid-semester, so that students would have time to make behavioral changes based on my feedback.

Note that five percentage points may come from “teacher-evaluation.” On the same day that the students turn in their self-evaluations, I give them my teacher-evaluation of their work and learning. Writing the teacher-evaluation is quite demanding and requires getting to know each student. For each student I keep a separate sheet of paper in my notebook on which I record significant observations as they occur during the semester. I record examples of effort and accomplishment such as class participation, willingness to help classmates, intelligent comments and contributions, well-done homework or projects, etc. On the flip side, I record lapses and lack of effort and participation, especially instances of times when students were unprepared or did not do their assigned work. If students have no negative comments on their record, I give them the full five points and tell them some of the good things I recorded about their work during the semester.

Note that in the typical scheme, board presentations have been given a minimum of five percentage points. Some students are terrified of presenting in front of the class. They may choose not to present, and the loss of five points won’t kill them. However, my experience has been that, in order not to lose those five points, students overcome their fear and make a presentation. It is surprising how many of the fearful students go on to “flower” and become regular presenters at the board.

Note also that a minimum of five percentage points has been given to keeping a journal. This motivates most students to perform the useful process of writing about the mathematics that we are learning. Yet the recalcitrant student may still choose, without undue penalty, not to keep a journal.

This flexible grade method has led me to give more respect to students as individuals with their own strengths, weaknesses, preferences, and interests. It has stimulated me to move away from being an authority in the classroom and toward finding other ways to turn responsibility and authority for learning over to the students.


Success Factors

After the students have read the first day’s handout describing the method, they are usually enthusiastic about it, and the following in-class discussion takes less than twenty minutes. It must be made clear that, after they have chosen their percentages, their choices may not be changed later in the semester.

I have written a program on my calculator which I use to compute each student’s final numerical grade. If one keeps the class grades on a computer spreadsheet, a formula can be entered which will immediately calculate each student’s final numerical grade once her/his percents (weights) have been entered into the spreadsheet.
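For instance, here is one possible version of that computation. This is a sketch only; Bonnice describes a calculator program and a spreadsheet formula, not this code, and the ranges below come from the simplified example table, not his full scheme.

```python
# Sketch of a flexible-weighting grade computation (illustrative, not Bonnice's).
RANGES = {"Homework": (0, 25), "Quizzes": (25, 40),
          "Final Exam": (20, 40), "Portfolio": (0, 30)}

def final_grade(weights, scores):
    """weights: the percent each student chose per factor (must total 100 and
    respect the offered ranges); scores: the student's 0-100 average per factor."""
    for factor, w in weights.items():
        lo, hi = RANGES[factor]
        assert lo <= w <= hi, f"{factor} weight {w} outside {lo}-{hi}"
    assert sum(weights.values()) == 100, "weights must total 100"
    return sum(weights[f] * scores[f] for f in weights) / 100

# The "Exam Hater" column of the table above, with hypothetical scores:
weights = {"Homework": 25, "Quizzes": 25, "Final Exam": 20, "Portfolio": 30}
scores = {"Homework": 95, "Quizzes": 70, "Final Exam": 65, "Portfolio": 90}
print(final_grade(weights, scores))  # 81.25 -- a B on the fixed 80-89 scale
```

The equivalent spreadsheet formula is simply a sum of products of each weight column with each score column, divided by 100.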

Because the method of flexible weightings makes grading more agreeable for both teachers and students, it is worth the extra time and effort that it entails.

The One-Minute Paper

David M. Bressoud
Macalester College

This quick technique helps the instructor find out what students have gotten out of a given day’s class, and works well with both large and small classes.

Background and Purpose

No matter how beautifully prepared our classroom presentation may be, what the student hears is not always what we think we have said. The one-minute paper (described in Angelo and Cross, Classroom Assessment Techniques) is a quick and easy assessment tool that helps alert us when this disjuncture occurs, while it also gives the timid student an opportunity to ask questions and seek clarification. I have used the one-minute paper in large lecture classes at Penn State, where I would choose one of the ten recitation sections to respond and where it helped keep me informed of what the students were getting from the class. I have also used it in the relatively small classes at Macalester College, where it is still helpful, alerting me to the fact that my students are never quite as fully with me as I would like to think they are.

Method

In its basic format, the instructor takes the last minute (or, realistically, three minutes) of class and asks students to write down short answers to two questions:

• What was the most important point made in class today?
• What unanswered question do you still have?

Responses can be put on 3×5 cards that I hand out, or on the student’s own paper. Students can be allowed to respond anonymously, to encourage them to admit points of confusion they might hesitate to put their name to, or they can be asked to write their names so that the instructor can write a brief, personal response to each question or encourage thoughtful answers by giving extra credit.

The questions can be modified in various ways, but they should remain open-ended. In one variation described by Angelo and Cross, the instructor asked each student to name five significant points that had been made in that session. This can be especially useful in identifying the range of perceptions of what has been happening in class. By spending some time early in the semester discussing these perceptions and how they relate to what the instructor hopes that the students will see as the central ideas of the class, students can learn how to identify the central themes in each lecture.

In the large lecture classes at Penn State, students were required to write their names on their papers. After class, it would take me less than 30 minutes to go through the thirty or so papers that would be turned in, check the names of those who had turned them in (a bonus amounting to 1% of the total grade was given to those who turned them in regularly), and then write a one-sentence response to each question. These were returned to the students by their recitation instructors.

Findings

Many of the students in large lecture classes viewed the one-minute paper as simply a means of checking on whether or not they attended class, and, in fact, it did help keep class attendance up. It kept me abreast of what students were getting out of the lectures and helped establish some personal contact in the very impersonal environment of the amphitheater lecture hall. But even in the small classes at Macalester College where I now teach, where students are less hesitant to speak up, and where it seems that I can sense how well the class is following me, the one-minute paper often alerts me to problems of which otherwise I would not be aware until much later in the semester.

Use of Findings

Since the purpose of the one-minute paper is to identify and clarify points of confusion, I start the next class with a few minutes spent discussing student answers to the first question and explaining the misunderstandings that seemed to be shared by more than one student.

For example, after a class in which I introduced exponential functions of the form b^x and explained how to find their derivatives, I discovered that there was still a lot of confusion about how the derivative of 3^x was obtained. For many students, using (3^(x+.01) – 3^(x–.01))/.02 as an approximation to the derivative of 3^x was more confusing than useful, because they thought that an approximation to the derivative should look like (3^(x+h) – 3^x)/h. Some students were thrown by my use of the phrase “in terms of x” when I spoke of “the derivative in terms of x.” They knew about derivatives, but had never heard of derivatives in terms of x. This provided an opening the next day to come back to some of their continuing misunderstandings about functions. More than one student thought that the most important point of this class was how to find the derivative of e. This forewarned me that some of my students considered e to be the name of a function. Only four students picked out what I thought I had been emphasizing: that the significance of e is that it provides a base for an exponential function that is its own derivative. Almost as many thought that the identification of ln with log_e was the most important thing said all period. Knowing the principal points of confusion about derivatives of exponentials, I was able to start the next class by clearing up some of them and using selected questions to motivate the new material I wanted to introduce.
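For reference, the point being emphasized can be put in one line (a standard derivation, added here; it is not quoted from the article):

\[
\frac{d}{dx}\,b^x \;=\; \lim_{h\to 0}\frac{b^{x+h}-b^x}{h} \;=\; b^x \lim_{h\to 0}\frac{b^h-1}{h} \;=\; b^x \ln b,
\]

so every exponential function is proportional to its own derivative, and the proportionality constant \(\ln b\) equals 1 exactly when \(b = e\).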

To use the one-minute paper as a learning tool, it is essential that you be consistent and regular and spend time early in the course clarifying what you want. It can also be employed simply as a periodic check on how accurate your perceptions are of what students are learning and what unanswered questions remain at the end of each class. The beauty of this tool lies in its simplicity and flexibility.

Success Factors

It is not easy to get students to identify important points or to ask good questions. They will tend to retreat into generalities. Following the class on derivatives of exponential functions, most students wrote that the most important idea was how to take derivatives of exponential functions. A few simply identified “taking derivatives” as the most important idea. Similarly, many students will either write that they have no questions or tell you only that they are confused. You can correct this by refining the question (“What was the most surprising or enlightening moment in class today?”) or by spending time early in the course discussing examples of the kinds of specific observations that you would like them to be able to make. You may also want to talk about how to formulate questions during a lecture or presentation as a means of sharpening attention, illustrate this with examples — taken from students in the class — of questions that demonstrate this kind of attention, and then give some small bonus credit to those students who can consistently end each class with a question.

Reference

[1] Angelo, T.A., and Cross, K.P. Classroom Assessment Techniques, 2nd ed., Jossey-Bass, San Francisco, 1993, pp. 148–153.

Concept Maps

Dwight Atkins
Surry Community College

By drawing concept maps, students strengthen their understanding of how a new concept is related to others they already know.

Background and Purpose

Concept maps are drawings or diagrams showing connections between major concepts in a course or section of a course. I have used concept maps in precalculus and calculus classes at Surry Community College, a medium-sized member of the North Carolina Community College System. About one-third of our students transfer to senior institutions. This includes most of the students who enroll in precalculus and nearly all of those who take calculus.

Concept maps can be a useful tool for formative assessment. They can also provide an added benefit of helping students organize their knowledge in a way that facilitates further learning. I first learned about concept maps from Novak [3]. Further insight about concept maps and their uses in the classroom can be found in [1] and [2].

Method

Concept maps are essentially drawings or diagrams showing mental connections that students make between a major concept the instructor focuses on and other concepts that they have learned [1]. First, the teacher puts a schematic before the students (on a transparency or individual worksheet) along with a collection of terms related to the root concept. Students are to fill in the empty ovals with appropriate terms from the given collection, maintaining the relationships given along branches. From the kinds of questions and errors that emerge, I determine the nature and extent of review needed, especially as related to the individual concept names that should be familiar to each student. I also give some prompts that enable students to proceed, while being careful not to give too much information. This is followed by allowing the students to use their text and small-group discussions as an aid. The idea at this stage is to let the students complete the construction of the map. Eventually they all produce a correct concept map. Once this occurs, I discuss the concept map and enrich it with other examples and extensions in the form of other related concepts.

This strategy seems to work particularly well when dealing with several related concepts, especially when some concepts are actually subclasses of others. I have used this procedure in precalculus in order to introduce the concept of a transcendental number. Students chose from the list — integers, real, 3/5, rational, irrational, e, algebraic, π, transcendental, √2, and natural — to complete the following concept map.

[Concept map schematic: a diagram rooted at “reals,” with branches labeled “may be” and “such as” leading to empty ovals for students to fill in.]

As students proceeded individually, I walked around the classroom and monitored their work, noting different areas of difficulty. Some students posed questions that revealed various levels of understanding. Among the questions were “Don’t these belong together?” “Does every card have to be used?” and “Haven’t some groups of numbers been left out?”

I have also used this procedure in calculus to introduce the gamma function. I wanted to introduce the gamma function by first approaching the idea of a non-elementary function. This prompted me to question what students understood about the classifications of other kinds of functions. I presented a schematic and gave each student a set of index cards. On each card was one of the following terms: polynomial, radical, non-elementary, trigonometric, inverse trigonometric, hyperbolic, and gamma. The students were told to arrange the cards in a way that would fit the schematic.

Use of Findings

The primary value of using concept maps this way is that it helps me learn what needs review. They provide a means for detecting students’ misconceptions and lack of knowledge of the prerequisite concepts necessary for learning new mathematics.

Success Factors

The primary caveats in this approach are that the teacher should be careful not to let the students work in groups too quickly and not to give away too much information in the way of prompts or hints. One alternative approach is to have the students, early in their study of a topic, do a concept map of the ideas involved without giving them a preset schematic. Then you gather the concept maps drawn and discuss the relations found: what’s good about each, or what’s missing. You can also have students draw a concept map by brainstorming to find related concepts. Then students draw lines between concepts, noting for each line what the relationship is.

References

[1] Angelo, T.A. and Cross, K.P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed., Jossey-Bass, San Francisco, 1993, p. 197.

[2] Jonassen, D.H., Beissner, K., and Yacci, M.A. Structural Knowledge: Techniques for Conveying, Assessing, and Acquiring Structural Knowledge. Lawrence Erlbaum Associates, Hillsdale, NJ, 1993.

[3] Novak, J.D. “Clarify with Concept Maps: A tool for students and teachers alike,” The Science Teacher, 58 (7), 1991, pp. 45–49.


If You Want To Know What Students Understand, Ask Them!

John Koker
University of Wisconsin Oshkosh

To gain insight into how much students actually understand, and what they have learned by working the problems, have them discuss the evolution of their ideas as they work on homework sets.

Background and Purpose

The University of Wisconsin Oshkosh is both a major undergraduate and regional graduate campus in the statewide University of Wisconsin System, with a student population of about 10,000. Many of the students I work with are pre-service elementary and middle school teachers. I teach an Abstract Algebra class for prospective middle school mathematics teachers. This course is specifically designed as part of a mathematics minor for those majoring in elementary education.

In my classes I assign non-routine, challenging problems that are to be handed in and evaluated. These problems usually go a step beyond the material discussed in class and are designed to set up situations where students can discover some mathematics on their own. Most students are successful in working exercises similar to those done in class and examples in the text, while few hand in “good” solutions to the challenging problems. I found myself giving low scores to what looked like poor attempts and meager results.

Anyone familiar with the history of mathematics can think of several examples of mathematicians who have worked on a given problem for many years. Perhaps the original problem was never solved, but the amount of mathematics learned and developed in working on the problem was astounding. I applied this situation to my students. What is more important: that they solve a given problem, or that they learn some new mathematics while struggling with a problem? I was looking for papers which were organized, logical, thoughtful, and demonstrated students had engaged themselves with the course material.

Method

In many of the problems I assign in my Abstract Algebra course, I expect students to compare properties of abstract mathematical structures with more familiar structures (integers and rationals). However, these students are not yet ready to respond to this kind of assignment with a clear, textbook-like essay. To help them understand what I do expect of them, I hand out “Guidelines for Written Homework,” which explains:

Your papers can be as expository as you want to make them. Tell how you arrived at your particular solution. You can discuss how the problem relates to material in class and possible applications of the problem. Do you understand the class material better now by working on this problem? Explain. If you could not arrive at a solution, tell what you tried and why it didn’t work, or describe what you think may work. Vent frustrations! Identify obstacles you encountered and what you think you need to overcome these. The most important thing is to demonstrate to me that some thought went into your work and that you learned something!

I add that an expository paragraph describing their experience with the problem must be included with the solution.


For example, the following homework assignment was given while we were discussing groups, before they were aware that a group satisfies the cancellation property.

HOMEWORK #3
When working with numbers, we “cancel” without even thinking. For example, if 3 + x = 3 + y, then x = y, and if 3x = 3y, then x = y. The purpose of this homework problem is to get you thinking about this “cancellation process.”

a. Let a, b and c be elements of Z_6 (integers modulo 6). Then if a ⊕ b = a ⊕ c, is it necessarily true that b = c? Justify your answer.

b. Let a, b and c be elements of Z_6. Then if a ⊗ b = a ⊗ c, is it necessarily true that b = c? Justify your answer.

c. Let (G, *) be a mathematical structure and suppose that a, b and c are elements of G. What properties must (G, *) satisfy so that a * b = a * c implies that b = c?
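For readers who want to check the arithmetic, here is one way parts (a) and (b) play out (a sketch, not Koker’s answer key). Additive cancellation always holds in \(\mathbb{Z}_6\) because every element has an additive inverse:

\[
a \oplus b = a \oplus c \;\Longrightarrow\; (-a) \oplus (a \oplus b) = (-a) \oplus (a \oplus c) \;\Longrightarrow\; b = c.
\]

Multiplicative cancellation fails: \(3 \otimes 2 = 6 \equiv 0\) and \(3 \otimes 4 = 12 \equiv 0 \pmod 6\), yet \(2 \neq 4\).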

Findings

The expository portions of the first two assignments were quite unrevealing. Comments like “the problem was hard,” “I really didn’t understand what you wanted,” “I worked 2 hours on this,” “this one wasn’t too bad,” etc., were common. I used the “power of the grade” to obtain more thoughtful responses. If, in my opinion, the expository section was too shallow, I would deduct a few points. I would also include a question or two which they could consider for the next assignment. Those who did not include any expository section lost several points. After a few assignments, I began to get some interesting comments.

Assignment #3 described above was the first one on which I received good comments. Comments I received included:

I first thought in part (a) you just subtract a from each side and in part (b) and (c) you just divide by a on each side. When I mentioned this to you, you asked me if subtraction and division are defined in these structures. This helped me in part (b), because we did a problem in class about division in Z_n. But I am stuck with part (c). How can I define division in (G, *)? I am sure that when a, b and c are in Z_n, a ⊗ b = a ⊗ c implies that b = c if a is not a zero divisor, because I can then define division. You said I am on the right track, but I still don’t see it.

I didn’t really have any trouble understanding parts a and b. I simply made the tables and checked to see if each element of Z_6 appeared at most one time in each row of the table. If this happens the element can be canceled. For example in Z_6, 3 ⊗ 0 = 3 ⊗ 4, but 0 ≠ 4. In the “3” row, the number 0 appears more than once. However, part c was a different story. I now see what you mean when you say that it is different to show something which is an example and something is true in general. I could just make a table for Z_6, but how in the world do you make one for (G, *)? When I came and talked to you, you told me to think of the algebraic properties involved. I didn’t know what you meant by this. You then asked me what are the “mechanics involved in going from 3x = 3y to x = y.” I then started to see the picture. Is this what abstract algebra is like? Trying to decide what properties will allow certain results?

Neither of the students who wrote the paragraphs above solved part c. But because of what they wrote, I know that the first student has a better understanding of what it means for division to be defined, and the second appreciates the difference between working examples and justifying statements in general. Both of these tell me the student has learned something.

I realized that this was the first assignment where the students were given an open-ended problem in which they needed to make a conjecture. I didn’t require proof. They needed to convince themselves that their guess was accurate. I also got something that I didn’t expect. Here is what one student wrote about part c:

I feel that the examples we did in class on Friday helped serve as a guide for this problem. It wasn’t really too difficult. I didn’t spend very much time on it.

By itself this seems like a pretty insignificant comment. However, the solution the student handed in was completely incorrect! Some students submit incorrect solutions knowing they are incorrect. That, in itself, shows that the student knows something. It is quite different, however, when a student believes an incorrect solution is correct. I simply wrote a note to this student and we discussed this problem in my office.

Later in the semester I gave an assignment which was designed to have the students conjecture when a linear equation ax + b = 0 (with a and b elements of a ring R) has a unique solution. In the first part, students were asked to find examples of a linear equation which has more than one solution, a linear equation which has a unique solution, and a linear equation which has no solution in each of the rings Z, Q, Z_4, Z_5, and Z_6 (if possible). Then they were asked to give a general condition which will imply that a linear equation ax + b = 0 over a ring R has a unique solution.
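As a concrete illustration of what the students were hunting for (my example, not one from the assignment): in \(\mathbb{Z}_6\) the equation \(2x + 4 = 0\) has the two solutions \(x = 1\) and \(x = 4\), since 2 is a zero divisor mod 6; in \(\mathbb{Z}_5\) the same equation has the unique solution \(x = 3\), because 2 is invertible mod 5 \((2 \cdot 3 \equiv 1)\), so one can multiply through by \(2^{-1}\).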

Most students could handle the integer and rational cases completely, and only a few struggled with the linear equations over Z_4, Z_5, and Z_6. Discovering the algebraic properties involved was the heart of the problem and the difficult part. One interesting comment I received indicated that the student had never thought about the number of solutions an equation has, or that this is somehow connected to the elements and properties of the structure; she had just solved equations without much thought. She also wondered if the same problem could be solved for quadratic equations.


A second student thought that a linear equation always had at most one solution. This proved to be true in Z and Q. She described trying to use “regular” algebra in Z_4 by subtracting b from both sides and then dividing by a. This caused problems when she found that division is not always possible in Z_4. So she chose numbers and substituted them for a and b in the equation ax + b = 0. She solved the problem in Z_4 by using the operation tables. She then went on to explain that this is where she figured out that if an element doesn’t have an inverse, there can be more than one solution. She concluded that this is why in Z and Q there are unique solutions.

I feel the first student’s interest was sparked by this problem. The second has some misconceptions. She says that there are multiple solutions when an element is not invertible and comments that this is why equations over the integers have unique solutions (when solutions exist). Yet only two integers are invertible.

Several students told me they learned a lot by completing this assignment. That is insufficient. I want to know what they have learned.

For the most part, the student reaction to writing about their experience with a problem was quite positive. At the end of the semester I asked students to comment on this process. Students said that the expository section of the homework really helped them to think about what they did and why it did or did not work. In general they felt they had an alternate route to demonstrate learning was taking place. They also appreciated getting credit even if their solution was not entirely correct.

Use of Findings

I have been learning about how students approach and think about problems. Before, I could tell whether or not a student could solve a problem, but that is all. I couldn’t even tell if they knew whether they were right or wrong. Students see this technique as a nonthreatening way to express their understanding. I use their paragraphs to construct new problems, to prepare for class, and to instigate classroom discussions. For example, based on the student comment above, I wrote a problem asking when a quadratic equation has a unique solution in Z_4, Z_5, and Z_6. After grading a problem with poor results, I go over it in class. These paragraphs helped me to focus on what the students did and didn’t understand. I could use their ideas to change a condition in the problem and ask them how the results change. We found ourselves asking “What if …?” a lot.

Success Factors

It is important to realize that this expository writing will not work for all kinds of problems, e.g., routine symbol-manipulating problems. I had my greatest success with open-ended problems. Initially, I didn’t get any really good responses, and some students never really cooperated. In the future I may include specific questions I want answered with a given problem. Some examples of questions on my list are:

• Do you understand more or less about this topic after this problem?

• Explain how you arrived at your solution. Discuss trials that didn’t work.

• Discuss possible alternate solutions.

• Discuss possible applications of this problem.

• Discuss the connections between this problem and our work on ____ in class.

• Describe what you were thinking and feeling while working on this problem.

• Can you think of a better problem to assign on this topic? If so, what is it?

These questions may be helpful at the beginning of the semester. Students will be encouraged to add additional comments beyond responding to the specified question. Then, depending on the results, I can allow more freedom as the class moves on and the students’ confidence increases.

Improving Classes with Quick Assessment Techniques

John W. Emert
Ball State University

This article discusses three quick techniques which can alert the instructor to potential problems. The first helps students understand course goals, the second evaluates the effectiveness of group work, and the third is a general way of finding out how things are going in time to make changes.

As I continue to experiment with various informal classroom assessment techniques, I have come to favor assessment tools which use no more than ten minutes of class time and require no more than an hour after class to tabulate and form a response. The three techniques included here provide quick ways to find out what is going well and what is not, and allow me to address any problems in a timely manner.

Ball State University is a comprehensive university with approximately 17,000 undergraduates. The University offers a strong undergraduate liberal and professional education and some graduate programs. Most University courses are taught in small classes of 35 or fewer students by full-time faculty.

My classes tend to sort into three categories: content courses for departmental majors, colloquia courses for our Honors College, and a mathematics topics course that fulfills a University general studies requirement. While each class develops its own unique personality, each of these categories brings its own challenges.

Technique 1: Setting Course Goals

Background and Purpose

A section of our general studies topics course generally includes 35 students, mostly first-year, with widely varying mathematics abilities and declared majors in the creative arts, communications, humanities, social sciences, and architecture. Students in this course often enter annoyed that a mathematics class is required by the University. I want my students to examine the role of this course in their college curriculum.

Method

At the first meeting I follow a brief course introduction by distributing copies of the General Studies Program Goals, as found in the Undergraduate Catalogue.

General Studies Program Goals. The General Studies program is designed to help you develop knowledge, skills, and values that all graduates of the university are expected to share. The program of General Studies has the following goals:

1. An ability to engage in lifelong education by learning to acquire knowledge and to use it for intelligent ends.

2. An ability to communicate at a level acceptable for college graduates.

3. An ability to clarify one’s personal values and to be sensitive to those held by others.

4. An ability to recognize and seek solutions for the common problems of living by drawing on a knowledge of historical and contemporary events and the elements of the cultural heritage related to those events.

5. An ability to work with others to solve the common problems of living.

6. An ability to assess one’s unique interests, talents, and goals and to choose specialized learning experiences that will foster their fulfillment.

This program is made up of core requirements and distribution requirements, which are groups of courses from which you choose. All students graduating with baccalaureate degrees must complete the 41-hour requirement in General Studies.

After reviewing these six goals, I ask students to select the goals which they believe are important to all graduates, and of these, to choose one which they think the course will help them to achieve. I then ask students to write a paragraph identifying the goal and why they think it is important to all graduates.

Findings

Through this activity, I am able to focus the class on the general studies mission of the course. While this method effectively forces a positive reaction from each student, it lets me solicit student reactions and learn what they might expect from this class. This activity also creates an expectation for additional classroom assessment activities in the course.

Use of Findings

I tabulate the responses and share them with the class the next day. All six goals are generally included among the responses. As the course progresses through the term, when examples or illustrations support these goals, I can refer back to this list and the supporting student paragraphs. This activity helps me effectively combat the nagging question of relevance which can often plague a general studies course.

Success Factors

Courses for majors can also benefit from an initial overview of the course’s intent, purpose, and breadth. You can also solicit the students’ perceived course objectives and compare them to yours and those of the university.

Technique 2: Evaluating How Groups Work

Background and Purpose

Our departmental majors, including undergraduates and master’s candidates in actuarial science, mathematics, mathematics education, and statistics, range broadly in ability and motivation. These students take our Discrete Systems course after one term of Calculus. Group laboratory projects have developed in recent years to become part of the Calculus sequence, but the experience can vary greatly from instructor to instructor. Since Discrete Systems includes several topics that can naturally support group exploration, I elected to develop similar projects for this course.

Method

After the second project was completed, I wanted to learn how my students’ perception of this activity compared to my own. Therefore, I developed an evaluation form and solicited responses from each student.

Group Projects Evaluation Form

1. Overall, how effectively did your groups work together on the project? (Poorly, Adequately, Well, Extremely Well)

2. In an effective working group, each person should be an active participant. How well did your groups meet this goal? (Poorly, Adequately, Well, Extremely Well)

3. Give one specific example of something you learned from a group that you probably would not have learned alone.

4. Give one specific example of something another group member learned from you that he or she probably would not have learned alone.

5. What is the biggest challenge to group projects? How could this challenge be overcome?

Findings

After tabulating the responses, I found that over 75% of the class rated their ability to work together in groups “Well” or “Extremely Well” in the first two questions. In fact, about 45% of the class responded “Extremely Well” to both questions. In response to questions 3 and 4, positive aspects of group work included: different points of view (45%), better understanding of class topics (40%), improved library skills (35%), improved computer skills (15%), and group commitment (10%). The comments to the final question included several versions of “Finding time when we both can work on the project was the biggest challenge.”

Use of Findings

I returned copies of these tabulated responses to the class the following day, along with the following paragraph:

Based on your responses, most of you view the group projects as a valuable part of the course. In addition to focusing on specifically assigned “group questions,” the group structure promotes discussion of class topics, improves study skills, and helps to develop class rapport. Because I agree that group work is a valuable part of this class, and since the scheduled class time offers a guaranteed opportunity for groups to meet, I intend to devote parts of future class times to group discussion. Though the projects will not be completed during such short meetings, these times may allow groups to share ideas, report progress, and plan strategy. I intend to use a similar evaluation instrument following the final report. —JWE

Success Factors

I have found that the third and fourth questions can be adapted as an effective evaluation tool, as well as a prompt for classroom discussion. For example, when another Discrete Systems class recently visited our Annual Student Symposium, a poster session for student research projects, I asked the following three questions:

1. What is something you learned from a presenter at the symposium that you might not have otherwise learned?

2. What is something that someone else at the symposium (another student or a presenter) learned because of you that he or she might not have otherwise learned?

3. Should I encourage other mathematics classes to attend similar symposia in the future? Please elaborate.

Technique 3: Mid-Term Adjustments

Background and Purpose

My Honors College colloquia tend to attract a broad spectrum of curious upper-class students, including majors in architecture, art, computer science, mathematics, and music. Each colloquium course develops in a different way. Even when a topic list is repeated, class dynamics can change drastically from term to term.

I recently taught a colloquium course on “Common themes,” using Douglas Hofstadter’s book Gödel, Escher, Bach [2] to examine traits common to Bach’s music, Gödel’s mathematical logic, and Escher’s graphic art. Having taught this course several times, I have learned the necessity of providing sufficient foundational knowledge in each of these areas so that effective discussions can occur in class. This time, I found that after the first few weeks the class discussions and activities had drifted precipitously from my initially announced goals, and I knew that a few students were somewhat frustrated with this shift. While I had used other informal assessment tools in this course (such as the minute paper), the appropriate assessment activity for this occasion was much more focused.

Method

I asked the students to anonymously answer the following three questions:

1. What’s one specific way that this class has addressed a goal? That is, “what worked?”

2. What’s something that shouldn’t have happened? That is, “what didn’t work?”

3. What’s something missing or unnoticed that should happen? That is, “what’s missing, so far?”

Findings

These are difficult questions, both for the student and for the instructor. However, my class had a healthy atmosphere, I asked sincerely for their feedback, and I received specific and helpful responses from each student. Some students observed that the videotapes I had used at times were “boring.” A few students thought that discussions had drifted too far into mathematics without sufficient foundation. Students also observed that some alternative activities, such as creating examples of self-reference in literature and the arts, were very effective. Most importantly, the third item provided a valuable resource for additional ideas: another guest speaker, an experiment using video feedback, and a culminating group project.

Use of Findings

This activity gave me the chance to solicit and immediately react to my students’ concerns and desires, rather than waiting until “the next term.” I summarized the results and reported them to the class the following week, along with my intentions: to arrange a tour of the Art Museum led by the Curator of Education, to greatly reduce the use of videos, and to continue a significant amount of class discussion. I also solicited volunteers to set up the video feedback experiment for the class. The class collected their favorite examples and illustrations and compiled a “Proceedings” which was distributed to each participant at the conclusion of the course.

Success Factors

This should not be the first assessment tool you use in a course. Your students must know by your example that you value their sincere responses and that you will react appropriately to their comments.

Final Words

As Angelo and Cross [1] often say, “Don’t ask if you don’t want to know.” An effective classroom assessment should attempt to confirm or refute your suspicions. Before you solicit student responses, identify what you want to learn, what you think you will find, and what you intend to do with the results. Without feedback of some sort, these activities lose their assessment quality and become quizzes and study tools. For me, techniques that are quick to develop and administer tend to be used most often. One must remember, however, to allot the time to evaluate and respond to the students.

References

[1] Angelo, T.A., and Cross, K.P. Classroom Assessment Techniques, 2nd ed., Jossey-Bass, San Francisco, 1993.

[2] Hofstadter, D. Gödel, Escher, Bach: an eternal golden braid. Basic Books, New York, 1979.

Creating a Professional Environment in the Classroom

Sandra Z. Keith
St. Cloud State University

This article describes four activities which can help students develop a more professional attitude toward their coursework: looking directly at their expectations, revising goals after a test, finding applications of the course in their chosen field, and preparing a résumé.

Background and Purpose

St. Cloud State University serves approximately 13,000 students. Since most students are self-supporting, the job ethic is very strong. The state pays 2/3 of tuition costs, expecting in return at least an eight-hour day of classes and studying from students, but most students maintain part-time and even full-time jobs (often at night). Worse than the effect of this on studying habits is the fact that our students often assign greater value to their jobs, and this can translate into excuses for missing class or turning in assignments late. Over the years I have found that some of these attitudes can actually be turned to a learning advantage — especially when one explains education situations in terms of jobs. For example, the student who complains that a take-home exam is too difficult will understand my point when I ask how a boss would react to the sentiment, “the job is too difficult.”

The assignments described here (which I might call PATs, for “Professional Assessment Techniques”) are designed to take advantage of the situation, and to encourage students to think of themselves as pre-professionals in the educational environment. My work with PATs preceded my discovery of Angelo and Cross’s Classroom Assessment Techniques (CATs, [1]) and the assessment movement, tracing back to a Writing-Across-the-Curriculum faculty development workshop in the early 80’s, during which I became interested in using exploratory writing assignments to enhance learning [3]. Angelo & Cross, however, helped me to think more systematically about using student writing for classroom research and for shaping learning. Some of the techniques I use are explained in [2].

Method

I use some combination of the following assignments in a variety of courses. I tell the class in advance of an assignment how many points it will count.

1. Examining Teacher/Student Roles.

Assignment: List (overnight) ten characteristics of a good teacher and ten characteristics of a good student.

Findings. The combined list for the teacher tends to be highly diverse, mostly featuring friendly qualities such as “smiles a lot, understands when we don’t get something, is available, doesn’t assign too much homework.” In fact, this list inevitably sounds like the preferred qualities of an instructor on faculty evaluation forms, raising the question of exactly what comes first. The list of features of the good student, however, is by comparison very short and incomplete. A good student “smiles a lot, asks questions, is helpful to others, tries to come to class as often as possible, keeps up as much as possible.” I have never received the response, “reads the book.” After compiling and reviewing these two lists, I draw up my own expectations, and in the next class we spend a half-hour comparing our lists on an overhead projector. I highlight comments that strike me as interesting.

I wish I could say that this exercise is enjoyable. Rather, coming at the beginning of the term, with our expectations on a collision course, it’s more likely to cause students to feel that they are being “set up.” But I don’t expect this exercise to enhance my popularity. It is instead intended to let students reflect on their own learning objectives, and in particular, to make some expectations extremely clear; e.g., “to come to class as often as possible” is not acceptable. (“How would that sound to your boss?”) More congenially, throughout the course, I can remind them, for example, that “good students ask questions.”

2. Revising Goals.

Assignment: (a) Resubmit your exam with revisions and corrections to anything “in red.” (b) Complete the Post-Test Survey. (c) Review your initial goals and create five new ones, based on your experiences in this class to date.

Some questions I use for the Post-Test Survey:

(1) How prepared did you feel for this test?
(2) Did you do as well as you expected?
(3) Was the test fair?
(4) On what questions did you make your mistakes and why? What did you feel were your weak areas? Be explicit.
(5) How many hours per day do you actually study?

Findings: Many industries and academic institutions, including my own, ask for goal reports. While many teachers set goals for their classes and expect the same from students, there is value in having students ASSESS and REVISE their goals. Students write five learning goals at the outset of the course; then after each exam they submit a folder consisting of corrections to their tests, the Post-Test Survey, and a revision of their goals, based on the test performance. I consider the revising process an important part of student self-assessment. It is gratifying to see goals become more specific over time, with instructor feedback. For example, a typical goal from students at the outset is, “to get an A.” I may write that this is my goal for the student as well, but HOW shall we achieve it? For example, how will we know when the work will produce A’s on tests? Subsequent goals become more sophisticated — one student amended this goal to “During this class I wish to obtain better study habits by organizing myself and the notes I take. I see from this test that I will probably have to study longer on certain things that I do not understand in class and go to the tutoring sessions if it is necessary for me to do.” Perhaps this student is learning to write longer sentences, but he is also becoming more aware of his share of responsibility in learning.

I also collate responses to selected questions on the Post-Test Survey and show them on the overhead projector. If 24 students felt the test was fair, while four did not, reasons can be discussed. Occasionally I compare study patterns of some anonymous A-students with those of low performers. Test performance is generally a private thing, and when tests are opened up to discussion, students become very curious about what their peers are saying. Any tendencies of students to regard themselves as victims of the test, the teaching, or the book dissolve in the more serious discussion about what we can all do productively to improve learning.

The folders draw me into the students’ lives, and I find myself making extensive comments. Learning flows from engagement. Red-flagging a rationalization exactly when the student is engaged with it provides the best opportunity for catching the student’s attention. Many students will claim to have made “careless” mistakes, which allows me to discuss why these are nevertheless a problem. Or I might flag an excellent student, demoralized by an exam, whose goal is now “to settle for a C.” The interactive format allows me to respond to issues and questions the students themselves have raised, and thus provides coaching in a more personalized framework.

3. Getting into the Real World.

Assignment: Go to the library and photocopy a page from an ADVANCED text or journal in the field of your major interest that includes some calculus. Document your source. Write a 1-page explanation about what the mathematics is saying, in language that will be understandable to another student in this class. (Anyone doubting calculus will be used in his or her field should look at an advanced statistics or economics text.)

Findings. A universally expressed goal among students is “to understand where this material will apply!” Nevertheless, the examples from applied fields in a calculus book never seem to have an impact. Possibly these examples go by too fast, or seem rigged, or they are not the type to be on tests. In this writing assignment, students explore the section of the library devoted to books in their chosen field; usually they have not been to this section of the library. They must choose a page that includes some calculus (perhaps use of the integral) and explain, as best they can, the background of the problem being discussed, as well as the steps of the mathematics. Many texts will be too advanced for the students at their current level of understanding, and they may simply have to admit as much, but chances are, with some searching, they will find a page they can minimally interpret. Since this assignment takes students out of the normally sequenced learning of both mathematics and their major fields, it is probably unlike anything they have ever done before.

To help students understand what I expect on this project, I show the work of former students (good result, poor result). Most students react positively; they are proud of the level of difficulty of the fields they plan to enter. And while students may not always provide the best of explanations, they at least can witness first-hand that the mathematics in their chosen field is probably much tougher and even more theoretical than what is currently being expected of them. This assignment is also a wonderful opportunity to observe how students perceive the meanings of the mathematics they have learned. For instance, students may accept some equation in physics as a given rule or law, without any sense that the rule could be derived with the mathematics we have just studied. For example, it is a rule, to them, that total charge is the integral of current over time; it does not seem to be something about which they should question, “why?” or “how?” On the overhead projector, I show all the photocopies of the text pages, briefly explaining in what context we are seeing the integral, for example. This gives a very streamlined overview of the ubiquity of calculus in applied fields.

Since the grading for this assignment is somewhat problematic, generally I grade on effort. A more in-depth grading is not easy if one must make comments on the mathematics. The following system for giving fast but personal responses helps: since I type fast, I create a form letter on the word processor, citing the directions; below I add my personal responses. This way it somehow becomes easier to explain where the student has succeeded or fallen short of expectations.

4. Looking Good on a Résumé.

Assignment: Write a 2-page self-introduction to a potential employer presenting your activities as a student (including mathematics classes) as evidence for your potential as an employee.

Findings. At some point during the term, I spend a half-hour discussing with the class how they can build their résumés. Even at the sophomore level, some students are still coasting on their high school athletic achievements. We discuss joining clubs, doing student research, becoming a tutor, going to conferences, and setting up a file in the placement office. Many students at our institution (which is looking into its general advising policies) have never heard these sorts of suggestions before.

This writing assignment may directly impact their jobs: our institution, for example, when hiring, first asks interested individuals to write a letter introducing themselves (with a list of publications), and on this basis, determines who should receive an application form. Since many students feel embarrassed about presenting themselves in a positive or forceful way, I usually show what former students have written as well as what I have written as letters of recommendation for students (names erased, of course). In this way, students can see that the work we do on the board, in groups, and even the very writing we do on this assignment allow me to add more information to my recommendation letters for them.

When I give this assignment, I explain that I like to tack onto my letters of recommendation some exceptional work of the student (this might be the interpretive library assignment above or a take-home exam), conjoining once again the work we do in class with the students’ future jobs.

Findings and Use of Findings

The credit given for these assignments constitutes a small percentage of the grade. Students seem to appreciate any take-home work that provides balance to test grades, although they may initially be puzzled by these assignments, which tend to be unlike any they have seen in mathematics classes before. A major benefit of these projects is a heightened sense of collegiality among the students in class as they find interesting examples of applications of calculus in different fields to share, and as the goals and autobiographical material provide an opportunity for more personal interaction than math classes typically offer. Most importantly, these projects heighten student awareness of how to shape their prospects for jobs and careers, and this is a value they recognize with no difficulty at all.

For my own portfolios, I copy selected strong/medium/weak responses to the assignments to represent the range of class responses. I keep these portfolios to read for reflection and to mine for evidence of effective teaching. Whether these teaching portfolios are seriously studied by university evaluators or not, I do not know, but the massive binders I have collected over the years allow me to study how students learn, and refresh me with ideas for future classes.

Success Factors

Some students respond trivially to these assignments early in the term, and I may request reworkings. Generally students come to appreciate the assignments when they see the serious responses of other students. I show transparencies of much of the students’ output on an overhead projector; this provides students with a response to their input — important in any assessment activity and invaluable in discussing “student professionalism.”

References

[1] Angelo, T.A., and Cross, K.P. Classroom Assessment Techniques, second edition. Jossey-Bass, San Francisco, 1993.

[2] Keith, S.Z. “Self-Assessment Materials for Use in Portfolios,” PRIMUS, 6 (2), 1996, pp. 178–192.

[3] Keith, S.Z. “Explorative Writing and Learning Mathematics,” Mathematics Magazine, 81 (9), 1988, pp. 714–719.

True or False? Explain!

Janet Heine Barnett
University of Southern Colorado

One way to find out what students understand is to ask them true/false questions, but have them justify their answers. These justifications bring out confusions about the concepts, but are also the beginning, for calculus students, of writing mathematical proofs.

Background and Purpose

My initial experimentation with modified true/false questions was prompted by frustration with my students’ performance in the first-year calculus courses which I was responsible for teaching as a graduate assistant at the University of Colorado, Boulder. Although my students’ ability to perform the various algorithms of the course was largely acceptable, the proofs which they were producing (especially on tests) left much to be desired. As a replacement for these “proofs,” I began to use true/false questions addressing course concepts on exams; to avoid mere guessing, I also required students to write a short explanation of their response. These written explanations proved to be extremely valuable as a method of identifying student misconceptions and learning difficulties, thereby allowing me to modify my instruction to address those problems.

After joining the faculty at the University of Southern Colorado, I began to use modified true/false questions both for on-going assessment and on exams in a variety of lower division courses, including calculus, college algebra, liberal arts mathematics courses, and content courses for pre-service elementary teachers. I have also used them as one means to assess conceptual understanding on upper division course exams. A regional state institution with moderately open enrollment, USC limits class size in mathematics to 45 students, and in many classes the size is closer to 30. This allows me to use modified true/false and other writing assignments as often as once a week, so that students become familiar with both the format and my expectations of it. Regular use of these questions provides me with instructional feedback, and serves as a mechanism to encourage student reflection on concepts.

Method

The statements themselves are designed to test students’ understanding of specific theoretical points, including definitions, properties, theorems, and relations among concepts. (See below for examples.) By casting these statements in the form of implications, it is possible to draw attention to the logical form of mathematical theorems, such as the distinction between the hypothesis and the conclusion, and the relation between a statement, its contrapositive, and its converse. The form of the question is thus conducive to a wide variety of concepts, making it readily adaptable to different topics. The critical step in designing a good question is to identify those aspects of the concept that are (i) most important and (ii) most likely to be misunderstood. The purpose should not be to trip students up on subtle technical points, but to call their attention (and reflection) to critical features of the concepts. Some examples:

Determine if each of the following is true or false, and give a complete written argument for your response. Your explanation should be addressed to a fellow student whom you are trying to convince of your reply. Use diagrams, examples, or counterexamples as appropriate to supplement your written response.


If 6 | n and 4 | n, then 24 | n.

If a is a real number, then √(a²) = a.

If f is differentiable at a, then lim_{x→a} f(x) exists.

If f(c) is a local maximum of f, then f′(c) = 0.

If the series Σ aₙ converges, then aₙ → 0.
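To give a sense of the justification such statements are meant to elicit, here is a sample written argument for the first statement above (the sample is offered as an illustration, not drawn from actual student work). The statement is false: 6 | 12 and 4 | 12, but 24 ∤ 12, so n = 12 is a counterexample. The hypothesis guarantees only that lcm(6, 4) = 12 divides n, not that 6 · 4 = 24 does.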

A typical true/false writing assignment consists of two to four questions addressing the same basic concept or relation; sample instructions for the students are shown above. The stipulation that students write their explanation to a fellow student is made to encourage students to provide a convincing, logical argument consistent with their current level of formal understanding. (In upper division courses, more formalism and rigor are expected.) The goal is to have students communicate their understanding as precisely and clearly as possible, but to articulate this understanding in their own terms, rather than using formal terms or conventions which they don’t understand. When employing writing assignments as a part of the course grade, the role of this audience is explained and emphasized to the students at the beginning of and throughout the semester. Students are allowed three to five days to complete the assignment, and are encouraged to discuss the questions with each other and with me during this time. Students who wish to discuss the assignments with me must show that they’ve already made a significant attempt, and they find that I am not likely to give them “the answer.”

In evaluating individual responses, credit is given primarily for the explanations, rather than the true/false response. In fact, a weak explanation for a correct response may earn less credit than a “good” explanation for a wrong response. Both composition and content are considered, with the emphasis placed on content. Typically, I read each response twice: the first time to make comments and place papers into rough piles based on either a three or five point scale, the second time to verify consistency within the piles and assign the score. When using true/false assignments as part of the course grade, a generic version of the scoring guide is shared with students early in the semester. This generic guide is tailored to specific assignments by setting specific content benchmarks for each score (for two types of such scales, see the article by Emenaker in this volume, p. 116). The scores from these assignments can then be incorporated into the final course grade in a variety of ways depending on overall course assessment structure and objectives. Methods that I have used include computing an overall writing assignment grade, incorporating the writing assignment scores into the homework grade, and incorporating the writing assignment scores into a course portfolio grade.

Findings

Students tend to find these assignments challenging, but useful in building their understanding. Their value in building this understanding has led me to use writing increasingly as an instructional device, especially with difficult concepts. The primary disadvantage of this assignment is the fact that any type of writing requires more time for both the instructor (mostly for grading) and the student (both for the physical act of writing and the intellectual challenge of conceptual questions). I have found that modified true/false questions require less time for me to evaluate than most writing assignments (as little as one hour for three questions in a class of 25). Because these questions focus on a specific aspect of the concept in question, most students are also able to respond fairly briefly to them, so that including them on exams is not unreasonable provided some extra time is allowed.

Use of Findings

The instructional feedback I’ve obtained from student responses to these questions has led me to modify my teaching of particular concepts in subsequent courses (see [1]). For instance, the responses I initially received to questions addressing the role of derivatives in locating extrema suggested that students were confused, rather than enlightened, by class presentations of “exceptional” cases (e.g., an inflection point that is also an extremum). Accordingly, I now use true/false assignments (e.g., “If (c, f(c)) is an inflection point, then f(c) is not a local maximum of f.”) as a means to prompt the students to discover such examples themselves, rather than including them in class presentations. This worked well in calculus, where students have enough mathematical maturity for such explorations, but was less successful in college algebra. In response, I began to modify the question format itself in order to force the students to confront conceptual difficulties. For example, rather than ask whether √(a²) = a for all reals a, I may tell the students this statement is false, and ask them to explain why. (A single negative value settles the matter: for a = –3, √(a²) = √9 = 3 ≠ –3.)

A more global instructional change I made as a result of a true/false assignment occurred when one of the best students in a first semester calculus course repeatedly gave excellent counterexamples in response to false statements, while insisting that the statements themselves were not false since he knew of other examples for which the conditions of the statement did hold. I now regularly explore the distinction between universally and existentially quantified statements in introductory courses, whereas I had previously not considered it an issue.

Success Factors

Early in a semester, one should expect the quality of responses to be rather poor, both in terms of content and composition. Quality improves dramatically after the first few assignments as students become familiar with its requirements. To facilitate this process, it is important to provide each student with specific written comments on their work, and to encourage revisions of unacceptable work. I have also found that sharing samples of high quality student responses helps students learn to distinguish poor, good, and outstanding work. For some students (non-native English speakers, learning disabled, etc.), the writing requirement (and the fact that composition does count) has posed problems not solved by these measures. Since the number of these students is small at USC, I have always been able to arrange an alternative for them, such as an oral interview or a transcriber.

References

[1] Barnett, J. “Assessing Student Understanding Through Writing,” PRIMUS, 6 (1), 1996, pp. 77–86.

Define, Compare, Contrast, Explain ...

Joann Bossenbroek
Columbus State Community College

These short writing assignments help students clarify concepts, and show the instructor where more work is needed.

Background and Purpose

I teach mostly at the precalculus level at Columbus State Community College and have used the following technique in my Intermediate Algebra, College Algebra, and Precalculus classes. Our students come to us very often with extremely weak mathematical backgrounds, with a history of failure at mathematics, with a significant amount of math anxiety, and frequently after being out of school for several years. I feel that my students need a variety of assessment techniques in order to learn the material and also to communicate to me what they have learned. Often my students have only a surface understanding of a particular concept. The assessment technique described in this paper helps students learn how to use mathematical language precisely. It helps me check whether they have absorbed the distinctions that language is trying to establish and the details of the principal concepts.

Method

One week before each test I hand out ten or twelve questions which require students to explain, describe, compare, contrast, or define certain mathematical concepts or terms. The goal is to have my students discuss these concepts using correct terminology. I expect the students to answer the questions using correct sentence structure, punctuation, grammar, and spelling. This strategy also serves as a review of the material for the students, as it covers concepts that are going to be on my traditional test.

Examples:

From an Intermediate Algebra class:

• In your own words define “function.” Give an example of a relationship that is a function and one that is not, and explain the difference.

• Explain, without using interval notation, what (–∞, 7] means. What is the smallest number in the interval? The largest?

From a College Algebra class:

• If x = 2 is a vertical asymptote of the graph of the function y = f(x), describe what happens to the x- and y-coordinates of a point moving along the graph, as x approaches 2.

• Compare/contrast the difference between a vertical asymptote and a horizontal asymptote.

From a Precalculus class:

• Explain, referring to appropriate geometric transformations, and illustrate, using graphs, why sin 2x ≠ 2 sin x.

Emphasizing why sin 2x ≠ 2 sin x by referring to geometric transformations and graphs helps students when they must learn the double angle formulas. It is so tempting for students to assume sin 2x = 2 sin x.

In order to receive full credit for this example, students must explain that the function y = sin 2x has a horizontal compression by a factor of 2, which makes the period of the function π and leaves the range at [–1, 1], whereas the function y = 2 sin x has a vertical stretch by a factor of 2, which makes the range [–2, 2] but leaves the period at 2π. They must also sketch the graphs of each function. Requiring students to both verbalize and visualize a concept is an excellent way to strengthen understanding.
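A single well-chosen evaluation also makes the inequality concrete (this quick check is offered here as an illustration, not as part of the original assignment): at x = π/2,

sin(2 · π/2) = sin π = 0, while 2 sin(π/2) = 2,

so the two functions already disagree at one point.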

• Define a logarithm. Explain how a logarithm relates to an exponential function both algebraically and graphically.

Findings

I have been using this strategy for more than six years. Many students have told me that these assignments help them make connections and gain a deeper understanding of the concepts that are being taught. One student recently told me that I forced him to think; his other mathematics instructors only made him work problems. Students who go on to take higher level mathematics courses frequently come back to thank me for the rigor that this type of strategy demands. They claim that the higher level courses are easier for them because of this rigor. At Columbus State half of the final exam in College Algebra and Precalculus is a department component which all instructors must use. I compared the results on the department component from my students for the last eight quarters with the results from the entire department and found that, on average, my students scored 4% higher than the students from the department as a whole. I believe that this assessment strategy is a contributing factor for this difference. A few examples will illustrate the type of misconceptions that I have found:

From Intermediate Algebra: When I ask the students to explain the meaning of (–∞, 7], I am reminded that students frequently confuse the use of ( and [. I am also reminded how fuzzy students’ ideas of the meaning of ∞ and –∞ are. One student told me that “the smallest number in the interval was the smallest number that anyone could think of.”

From College Algebra: When asked to respond to “If x = 2 is a vertical asymptote of the graph of the function y = f(x), describe what happens to the x- and y-coordinates of a point moving along the graph, as x approaches 2,” one student responded, “as x gets closer, nothing specific happens to the y-value but the graph approaches –∞.” This kind of answer gives me insights that I would never get by asking a traditional test question.

Use of Findings

When evaluating the results, I make extensive comments to my students trying to help clarify the concepts that have confused them and to deepen their understanding. I rarely just mark the result incorrect. These assignments have helped me improve my teaching. They have given me a better understanding of what it is about a specific concept that is confusing. I then carefully tailor my examples in class to address the ambiguities. So, for example, when I discuss the vertical asymptotes of the graph of a rational function, I very carefully find the x- and y-values for a number of points on the graph for which the x-value is approaching the discontinuity. I have found certain concepts that I thought were obvious or easy to grasp needed more emphasis, such as when to use “(” and when to use “[.” I give more examples in class to illustrate the difference between using “(” and “[.” When I return the papers, I only discuss the responses with the class if a number of students have made the same error or if a student requests more clarification.

Success Factors

• Students need to be cautioned to take these questions seriously. They are not accustomed to writing in a mathematics class. These questions only require short written statements for the most part, but students sometimes forget to use correct sentence structure, grammar, spelling, and punctuation.

• Students need to be reminded that these questions may not be easy to answer. Students often find it easier to work a problem than to describe a concept. They should be cautioned not to try to finish the assignment in one evening.

• Students need to be encouraged to be very complete and thorough. They have had very little experience in explaining mathematical concepts, and so, at first, they tend to be very superficial with their explanations. I find that students do improve their ability to answer these types of questions as the quarter progresses.

• At first I found these assignments to be quite time consuming to correct, but I have become more efficient. Because I use them frequently, I have had practice making decisions as to how to grade them.

• When I grade these assignments, I do point out grammar and spelling errors, but I do not count off heavily for these mistakes. I am primarily interested in the students’ understanding of mathematics.

Student-Created Problems Demonstrate Knowledge and Understanding

Agnes M. Rash
Saint Joseph’s University

Having students write problems shows the instructor where the gaps in their understanding are at the same time that it has students review for an upcoming test. Further, it can help students find the relevance of the course to their own interests.

Background and Purpose

Saint Joseph’s University is a small, comprehensive institution with majors in the humanities, business, social sciences, and physical and life sciences. Every student is required to complete two semesters of mathematics. The courses vary from one division to another.

In any course, we are challenged to construct tests that measure what students have learned. Introspectively, we often acquire a deeper understanding of a concept by teaching it, by explaining it to another, or by trying to construct problems for a test. The inadequacy of in-class tests for assessing the learning and understanding of my students led me to explore new methods of appraising their knowledge and reasoning. I wanted to ascertain student perception of which concepts are important and how each individual thinks about problems. Using student-created problems (and solutions) as an assessment procedure reveals both the thinking processes and interests of the students.

Student-created problems allow the instructor to determine the level of each student’s comprehension of topics and her/his ability to apply the concepts to new problems. In the process, students hone communication skills, since problems and solutions are submitted in writing and often explained orally in class as well. With several days to create a problem, speed is not an issue; each student works at her/his own rate. This process fosters a commitment to personal achievement, and encourages the student to become more self-directed. Although some students submit only routine problems, each knows the grading scale in advance and so shares control in the teaching-learning environment. The difficulty level of the problems is entirely in the hands of the students, and hence they set their own personal goals for achievement.

The method of using student-created problems can be implemented in any course. I have used the technique successfully in freshman level courses (with a maximum enrollment of 30) and in upper division courses with smaller enrollments. In this article, we demonstrate a use of the technique in a course for social science majors.

Method

In the courses for social science majors, the textbook For All Practical Purposes [1] is used. Broadly stated, the concepts include Euler and Hamiltonian circuits, scheduling, linear programming, voting strategies, population growth, bin packing, optimization, and statistics. The topics are current and relevant to today’s world. During the course, each student is required to design a word problem and its solution for each of the major concepts. The instructor collects the problems, organizes them, and checks for accuracy. The corrected problems are used as a study guide and review aid for tests and the final exam.

• For each of the five to seven units discussed in a semester, each student constructs a word problem and its solution. Within a unit, students are asked to construct problems for different concepts. For example, in a unit on graph theory, some students construct a problem using Euler circuits; others prepare a problem on Hamiltonian circuits or minimum-cost spanning trees, etc.

• Students are divided into small groups of four to six members. All members of a group work on the same topic, but each student designs her/his own problem on the concept. After completing a topic, students are given 15 minutes at the end of a period for discussion of ideas. Questions can be asked of the instructor at this time. Students are given several days (for example Monday through Friday, or a long weekend) to construct and submit a correct word problem and its solution. (Working individually may be as effective in some classes.)

• Problems and solutions are collected, read, graded, and marked with comments by the instructor. Grammatical as well as mathematical errors are indicated when the problems are returned to the students.

• When a problem needs correction or additional work, the student is given the opportunity to correct and resubmit a revised version, usually within three days. (Problems not relevant to the current topic are returned also. A suitable problem may be resubmitted within the time limit.)

As one would expect, the student-created problems run the gamut from mundane to exciting. Interesting problems are presented in class. A set of correct problems (without solutions) is given to the class as a review. Questions can be asked in a later class. Each student then becomes an authority on her/his problem and its solution, and can be called upon to explain or solve the problem in class. In practice, members of the class actively engage in questioning each other about the solution to a particular problem. Lively discussions take place. The grade on a problem becomes one part of the evaluation of a student’s knowledge in the course. The instructor acts as an advisor in the development of problems and solutions of high quality.

In my classes, the collection of problems is given the weight of one test, which is 25% of the final grade. (There are two in-class tests, a collection of problems, and a final examination. I include student-constructed problems on the tests.) Problems are graded on the basis of correctness, relevance to the topic, and originality. If a student elects to correct a problem, the numerical value it receives is the same as if it had been submitted correctly initially.

Findings

Analysis of a problem allows me to determine if a student understands a concept. Possible errors include misusing the definition, confusion among the concepts, applying a concept to a problem for which the method is inappropriate, or in extreme cases, inability to apply the concept to a new situation. The interests of the students appear in the problems constructed, and I use this information in designing examples for other topics in the course. Since my tests consist of routine exercises and problems drawn from the students’ work, students’ complaints about the test questions, when the test is returned, are virtually eliminated.

Since resubmission is encouraged for poorly posed problems or incorrect solutions, students tend to be more careful in designing the initial problem. Complimenting students on innovative problems, even if they are not completely accurate on the initial submission, improves students’ self-esteem and self-confidence. They take more responsibility for the learning process, and develop pride in their work. Because students are actively engaged in the creative process, listening, writing, seeing, and thinking are all involved. Furthermore, students learn to read the textbook carefully and to reread examples in order to develop problems.

Students are actively engaged in a cooperative activity and view me as a resource in developing problems rather than as the controller of their academic destiny. Each person has an opportunity to show the depth of her/his understanding without a time constraint. Since students may discuss ideas with each other and with the instructor before submitting a problem, more student-faculty contact, greater cooperation among students, and proactive learning result. As students internalize a concept, they can apply the concept to situations of personal interest. The instructor shows respect for the diverse interests, talents, and modes of learning among the students.

The outcomes of this process are well worth the effort. It is gratifying to observe the students as they find creative ways to use the concepts. There is a great variety in the problems, some of which are quite ingenious and truly enjoyable to read. An unanticipated benefit of the technique is that the instructor obtains a broader range of applications to discuss in the class. Below are a few examples of students’ work.

A woman is campaigning for mayor. She needs to go door to door to gather votes. If she parks her car at 61st Street and Lancaster Avenue, design a path for her to use that minimizes deadheading. (Included with the problem is a map of a 3 block by 4 block section of the city.)

A chocolate chip cookie company ships cookies to a convenience store. The bags of cookies come in four different sizes: 1/2 lb., 1 lb., 2 lb., and 4 lb. The company packs these bags into cartons. Each carton holds at most ten pounds. The quantities of cookies that must be shipped are: ten 1/2-lb. bags, fifteen 1-lb. bags, ten 2-lb. bags, and two 4-lb. bags. Using the best-fit decreasing heuristic, place the 37 bags into as few cartons as possible.
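An instructor checking a stack of solutions to a problem like this one can simulate the heuristic in a few lines. The Python sketch below is offered only as an illustration (the function name and output format are inventions of this example; the weights and the ten-pound capacity come from the problem itself). It sorts the bags by decreasing weight and places each one into the fullest carton that still has room, opening a new carton when none does:

    def best_fit_decreasing(weights, capacity=10.0):
        """Pack weights into cartons by the best-fit decreasing heuristic."""
        cartons = []  # each carton is a list of item weights
        for w in sorted(weights, reverse=True):
            # Cartons that can still accept this item.
            candidates = [c for c in cartons if sum(c) + w <= capacity]
            if candidates:
                # "Best fit": the feasible carton with the least room left.
                max(candidates, key=sum).append(w)
            else:
                cartons.append([w])  # open a new carton
        return cartons

    # The 37 bags from the student's problem.
    bags = [0.5] * 10 + [1.0] * 15 + [2.0] * 10 + [4.0] * 2
    cartons = best_fit_decreasing(bags)
    print(len(cartons), "cartons:", cartons)

Run on these data, the sketch packs the 48 pounds of cookies into five cartons, which matches the lower bound of ⌈48/10⌉ = 5.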


In making a decision about which college to attend, the following tasks must be completed: (a) make a list of schools to consider (2 days), (b) send for information and applications (14 days), (c) research the schools (60 days), (d) apply to schools (30 days), (e) visit the schools (5 days), (f) receive acceptances/rejections (90 days), (g) make a final decision about the school to attend (30 days). Construct an order requirement digraph and find the critical path.
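Checking the critical path is just as mechanical once a digraph is fixed. Since the problem leaves the order-requirement digraph for the solver to construct, the Python sketch below assumes one plausible set of precedences (a before b, b before c, c before both d and e, d before f, and e and f before g); a different digraph would of course yield a different critical path:

    from functools import lru_cache

    # Durations in days, from the problem statement.
    duration = {"a": 2, "b": 14, "c": 60, "d": 30, "e": 5, "f": 90, "g": 30}

    # Assumed precedences: task -> tasks that must finish first.
    preds = {"a": [], "b": ["a"], "c": ["b"], "d": ["c"],
             "e": ["c"], "f": ["d"], "g": ["e", "f"]}

    @lru_cache(maxsize=None)
    def earliest_finish(task):
        # Longest duration-weighted path from the start through this task.
        start = max((earliest_finish(p) for p in preds[task]), default=0)
        return start + duration[task]

    print(max(earliest_finish(t) for t in duration))  # 226 days here

Under these assumed precedences the critical path is a → b → c → d → f → g, for a project length of 226 days.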

Course evaluation forms from students indicate that the assignments are worthwhile, increase understanding, and (sometimes) are fun to do. A student’s time commitment in designing and solving a problem exceeds that which a student usually spends solving exercises in the textbook. Course evaluations indicate that students spend more time and effort in this course than they usually devote to studying mathematics taught in a traditional style.

Use of Findings

This alternative to in-class testing is enlightening to me as well as beneficial to the student. I am more open to student ideas, and use a formative approach to the development of concepts as a result of this practice. Sharing control in the teaching-learning environment means that the topics discussed can be modified to reflect the interests of the students. When submitted problems have common errors or exhibit similar misconceptions about the application of a topic, I know I have not made the topic clear and delineated its parameters adequately. I need to find another approach or to give more complete examples. Usually, I respond by more carefully explaining a new example and examining how this problem fits the assumptions of the technique. In the following year, I know to be more thorough when explaining this topic. I also can tailor my examples to the interests of the students.

Success Factors

Using this strategy to determine a student’s knowledge involves a considerable investment of time and effort on the part of the instructor, particularly in reading and commenting on the work submitted by the students. I estimate that it probably takes three times as long to read and return student-created problems as test problems. Grading the problems and checking the accuracy of the solutions can be a slow process. Solving each problem and checking for accuracy takes longer than checking routine problems. On an ordinary test, all students are solving the same problems. While grading a test, one becomes familiar with the common errors and easily sees them. With student-created problems, each one is different, so an instructor solves 20–30 problems rather than 8–12 problems. On the other hand, time is saved by not having to construct good test questions. Furthermore, class time is saved by eliminating at least one test, and perhaps another class used to return and review the test.

Depending on the availability of readers and/or typists, you may be performing the task of organizing and collating the problems alone. Compiling and editing can be completed more simply when the problems are composed on a computer and submitted electronically than when they are submitted typed on paper.

Some helpful hints:

• Adopt a policy that no problems are accepted late, lest you find yourself cornered at the end of the semester with more evaluation to do than you can handle.

• Prepare in advance the topics you want the students to explore. Determine the number of problems you will collect. Explain to students the purpose of the problems and how the questions will be used.

• Demonstrate what you want students to do. Give examples of problems in each of the grading categories for the students to use as guidelines. Involve the class in a discussion of the grade each example should receive.

• Give students ample time to create a problem.

• Provide feedback in a timely fashion. Grade and return the problems so that they can be corrected and used. Even if you have insufficient time to organize the problems completely, return the papers with comments.

• Make use of the problems, as a study guide for an upcoming test, as test questions, as examples in class, or as problems on the final examination.

• Compliment the students on their work.

• If possible, have students submit problems electronically, on disk or by e-mail. (This may not be possible with all problems.) Those not in this form can be photocopied for distribution.

For other assistance in getting started and further references, see Rash [2].

References

[1] COMAP. For All Practical Purposes, W. H. Freeman and Company, New York, 1991.

[2] Rash, A.M. “An Alternate Method of Assessment, Using Student-Created Problems,” PRIMUS, March 1997, pp. 89–96.

In-Depth Interviews to Understand Student Understanding

M. Kathleen Heid
The Pennsylvania State University

Students’ answers on tests don’t always show their true level of understanding. Sometimes they understand more than their answers indicate, and sometimes, despite their regurgitating the correct words, they don’t understand what they write. This article discusses a method to probe what they actually understand.

Background and Purpose

Drawing on and expanding techniques developed by Jean Piaget in his studies of the intellectual development of children, researchers in mathematics education now commonly use content-based interviews to understand students’ mathematical understandings. What started as techniques for researchers to assess student understanding has more recently been used by mathematics instructors who are interested in understanding more deeply the nature of their students’ mathematical understandings. Although it is possible to use interviews for the purpose of grading, that is not the intention of this article. If the instructor approaches the interview by informing the student that the goal is to understand how students think instead of to assign a grade, students are generally quite ready to share their thinking. Many seem fairly pleased that the instructor is taking this kind of interest in them. The primary purpose of the interviews described in this article is the enhancement of the mathematics instructor’s understanding of students’ understandings, with its subsequent improvement of mathematics teaching.

My first experiences with using interviews to understand student understanding came when I interviewed high school students to learn about what kinds of activities made them think deeply in their high school classes. A few years later, I wanted to explore differences between good and poor students’ understandings of calculus, so I began by interviewing an “A” student and a “D” student (as reported by their instructor). One of the questions I asked each of them was “What is a derivative?” The “A” student answered in the following way: “A derivative? I’m not sure I can tell you what one is, but give me one and I’ll do it. Like, for x², it’s 2x.” Further probing revealed no additional insight on the part of the student. The “D” student gave an explanation of the derivative as the limit of the slope of a sequence of tangent lines, complete with illustrative drawing. This experience first suggested to me the potential for interviews as vehicles to reveal students’ mathematical understandings in a way that grades do not.

Since my first experiences with interviews, I have used interviews in a variety of circumstances. The goals of interviews like the ones I have conducted are to find out more about each student’s personal understanding of selected mathematical concepts and ideas and to assess the depth and breadth of students’ mathematical understandings. Many assessment techniques are designed to inventory students’ abilities to execute routine skills or their abilities to demonstrate a pre-identified set of understandings. This type of interview, however, is designed to uncover a network of understandings that may or may not be part of a pre-identified list. The interview is not just an oral quiz or test but rather a way to dig more deeply into the complexities of students’ mathematical understandings.

Method

Using interviews to examine student understanding usually involves several steps.


Identifying the goal(s) of the interview. Before beginning to construct an interview schedule, the interviewer needs to clarify his or her goals for the interview. Interviews may be directed toward goals like describing the interviewee’s concept image of derivative, determining the extent to which the student sees connections among the concepts of limit, derivative, and integral, or determining the depth of a student’s knowledge of a particular set of concepts.

Designing an interview schedule of questions to be asked. An interview schedule is a set of directions for the interview, including questions that the interviewer plans to ask, directions for how to follow up, and tasks to be posed during the course of an interview. The schedule should include a core set of questions or tasks that will be posed to every interviewee and a set of potential follow-up questions or tasks — items whose use would depend on the interviewee’s initial set of responses. The schedule should also include a plan for what the interviewer will do under different circumstances. For example, the interviewer will want to plan ahead of time what to do in case the interviewee gets stuck on particular facets of the task, or in case the interviewee asks for assistance, or in case the interviewee brings up an interesting related point. Finally, attached to each copy of the interview schedule should be copies of the core set of questions or problem tasks in large enough type to be visible by interviewer, interviewee, and the video camera.

Piloting and revising the interview schedule. Just as for any assessment instrument, interview schedules need to be piloted and revised based on the success of those pilots in achieving the goals of the interview. I recommend taping a pilot interview or two to enable more accurate analysis. After conducting a pilot interview, ask yourself questions such as:

• Were the questions understood as intended?

• Were the questions adequate catalysts for finding out about the student’s mathematical understandings?

• Were the planned follow-up questions useful?

• Are there additional follow-up questions that should be included?

• Was the sequence of questions appropriate for the purpose of the interview?

• For this interview, was it more appropriate to focus the camera on students’ work or on the computer or calculator images (which can be projected behind the students) or on the students?

Preparing for the interviews. I have found it useful to prepare, before each round of interviews, an “interview box” which contains a copy of the interview schedule, tapes, batteries, an accordion pocket folder for each interviewee, a pen, blank paper, graph paper, a straightedge, and an appropriate calculator. Knowing that needed supplies are ready allows me to focus my attention on the interview.

Conducting the interviews. I will illustrate my method of conducting an interview by describing a particular set of interviews. (See the sample interview schedule at the end of the article.) In a yearlong curriculum development, teaching, and research project, I examined the effects of focusing an applied calculus course on concepts and applications and using a computer algebra system for routine skills. Central to my understanding of the effects of this curricular approach was a series of content-based interviews I conducted both with students from the experimental classes and with students from traditionally taught sections. These interviews were loosely structured around a common set of questions. As catalysts with which to start students talking about their understanding of calculus concepts, these interviews used questions like: “What is a derivative?”, “Why is it true that ‘If f(x) = x², then f′(x) = 2x’?”, “How could you find the slope of a tangent line like the one shown in the sketch?”, “Could you explain in your own words what a second derivative is?”, and “How can you estimate the area under a given curve?” In these and other interviews I have conducted, I continued to probe as long as the probing seemed to produce additional information about the interviewee’s mathematical understandings. I asked the interviewees to share their rationales for each answer, regardless of the “correctness” of the response. At every perceived opportunity as interviewees responded to these initial questions, I encouraged them to talk openly and freely about their understandings of course concepts. When their answers contained phrasing that appeared to be uniquely personal, I probed so that I could understand its meaning. When they made statements that resembled the language of their textbook or of my classroom explanations, I asked them to explain their ideas in another way. When they made generalizations, I asked them to give instances and to explain them. Perhaps my most useful guideline is this: when I think I understand what the student is saying, I probe further. Many times in these cases, the student’s response turns my initial interpretation on end. Interviews like these were designed to allow each student’s personal understandings to emerge.

Analyzing the results of the interviews. When possible and reasonable, I have found it useful to watch interview tapes with a colleague who is willing and able to engage in an in-depth discussion of what the tapes seem to indicate about the interviewee’s mathematical understandings. Because we have a videotape, we can see what the student is writing and entering into the calculator or computer while making a particular statement. The question we continually ask ourselves is “How does this student make sense of the mathematics he or she is using?” rather than “Is the student’s answer right or wrong?”

Findings

Content-based interviews can reveal a variety of aspects of students’ mathematical understandings that are not visible through many other methods of assessment. The interrelatedness of one student’s understandings of various mathematical topics was revealed in an interview I designed to tap understanding of limit, derivative, and integral. Early in the interview, I asked the interviewee to use the word limit to describe a few graphical representations of functions which I had provided. In each case, the interviewee’s response suggested a view of the limit as the process of approaching and never reaching. When asked to explain what was meant by a derivative, the interviewee said that the derivative approached, but never reached, the slope of a tangent line. And finally, the interviewee spoke of the definite integral as approaching the area under a curve, never exactly equal to that area. Further probing in each of these cases uncovered a strongly held and consistent belief based on the notion of limit as the process of approaching rather than as the number that is being approached. The interview allowed me to discover this aberrant belief and to test its strength and consistency in a way that other assessment techniques could not.

Use of Findings

As mathematics instructors learn more about their students’ mathematical understandings through interviews, several benefits arise. First, the experience of designing and conducting content-based interviews can help instructors to listen with a new attention and ability to focus on the student’s personal interpretations and ways of thinking. Second, an instructor who is aware of possible pitfalls in reasoning can construct examples that are likely to pose cognitive conflicts for students as they struggle with refining the ways they are thinking about particular aspects of mathematics. These cognitive conflicts are helpful in inducing a more useful and robust way to think about the concept in question. For example, after recognizing that students were viewing the limit as a process of approaching rather than as the number being approached, I have centered class discussion on the difference between thinking of limit as a process and thinking of limit as a number. I have been more attentive to the language I use in describing functions, and I have identified examples that would engage students in thinking about limit as a process and as a number. For example, a discussion of whether 0.999… = 1 is likely to bring this issue to the fore.
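The reason this example works so well is that it puts the limit-as-number view on display: by definition, the repeating decimal is the value of a geometric series,

0.999… = 9/10 + 9/100 + 9/1000 + ⋯ = (9/10)/(1 − 1/10) = 1,

a number, not a process creeping toward 1.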

However, even the best discussion is not enough to move a student who is not ready beyond a procedural understanding of limit.

Success Factors

Key to success in using this method is a deep-seated belief on the part of the interviewer that each student’s understanding is unique and that this understanding is best revealed through open-ended questions and related probes. Also key to the success of interviewing as a way to assess student understanding is the belief that each student constructs his or her own mathematical knowledge and that this knowledge cannot be delivered intact to students no matter how well the concepts under consideration are explained.

Perhaps just as important as understanding what interviews are is understanding what they are not. Interviews are not tutoring sessions because the goal is not for the interviewer to teach but to understand what the interviewee is thinking and understanding. A successful interview may reveal not only what the interviewee is thinking about a piece of mathematics but also why what he or she is thinking is reasonable to him or her. Interviews are not oral examinations because the goal is not to see how well the interviewee can perform on some fixed pre-identified set of tasks but rather to be able to characterize the interviewee’s thinking. And this sort of interview is not usually a teaching experiment because its goal is not to observe students learning but to access what they think in the absence of additional purposeful instruction.

For further examples of what can be learned about student understanding from content-based interviews, the reader can consult recent issues of the journals Educational Studies in Mathematics and the Journal of Mathematical Behavior, and David Tall’s edited volume Advanced Mathematical Thinking [1].

The interview schedule below was developed by Pete Johnson (Colby-Sawyer College) and Jean Werner (Mansfield University), under the direction of the author.

Reference

[1] Tall, D. (Ed.). Advanced Mathematical Thinking. Kluwer Academic Publishers, Dordrecht, 1991.


INTERVIEW ON REPRESENTATIONAL UNDERSTANDING OF SLOPES, TANGENTS, AND ASYMPTOTES*

Goal for this interview: To assess interviewee’s numerical, graphical, and symbolic understanding of slopes, tangents, and asymptotes and the connections they see between them.

The interview schedule consisted of four parts: one on slope, one on tangent, one on asymptote, and one on the connections among them. Below are sample questions used in the “tangent” and “connections” parts.

Part II - TANGENT

1. DESCRIBE OR DEFINE

In your own words, could you tell me what the word “tangent” means?

[If applicable, ask] You said earlier that the slope of the function is the slope of the tangent to the function. Now I want you to explain to me what you mean by “tangent.” [If they say something like slope of a function at a point, ask them to sketch a picture.]

I have a graph here [Show figure.], with several lines which seem to touch the curve here at just one point. Which of these are tangents?

What does the tangent tell you about a function?

2. ROLE OF TANGENT IN CALCULUS

Suppose it was decided to drop the concept of tangent from Calculus. How would the study of Calculus be affected?

3. SYMBOLIC REPRESENTATION

Suppose I gave you a function rule. How would you find the tangent related to that function?

Can you find the tangent in another way?

Discuss the tangent of the function f(x) = |x|. [Then ask,] What is the tangent at the point x = 0? [Follow up with questions about the limit to the right and left of 0.]

4. GRAPHICAL REPRESENTATION

Can you explain the word “tangent” to me by using a graph? Explain. [Show a sketch of f(x) = 1/x.] Discuss the tangents of this function.

Part IV - CONNECTIONS

1. Some people say that the ideas of asymptote and slope are related. Do you agree or disagree? Explain.

2. Some people say that the ideas of asymptote and tangent are related. Do you agree or disagree? Explain.

————
* This interview schedule was developed by Pete Johnson (Colby-Sawyer College) and Jean Werner (Mansfield University), under the direction of the author.

Assessing Expository Mathematics: Grading Journals, Essays, and Other Vagaries

Annalisa Crannell
Franklin & Marshall College

This article gives many helpful hints to the instructor who wants to assign writing projects, both on what to think about when making the assignment, and what to do with the projects once they’re turned in.

Background and Purpose

This article will attempt to illuminate the issues of how, when and why to assign writing in mathematics classes, but it will focus primarily on the first of these three questions: how to make and grade an expository assignment.

I have been writing and using writing projects for just over four years at Franklin & Marshall College, a highly selective liberal arts college in bucolic Lancaster, Pennsylvania. The majority of our students hail from surrounding counties in Pennsylvania, Delaware, and New Jersey, but we also draw some 15% of our students from overseas. We have no college mathematics requirement, but there is a writing requirement. Those of our students who choose to take mathematics usually begin with differential or integral calculus; class size is generally limited to 25 students. Our students tend to be more career-oriented than is usual at liberal arts colleges (an implication of which is that post-class wrangling about the distinction between a B and B+ is common).

I have worked with instructors who come from a variety of geographic regions and from a variety of types of institutions: liberal arts colleges, community colleges, branch campuses of state universities. I have found that their experiences with having students write mathematical essays are remarkably similar to my own. Therefore, although I will use my own projects as specific examples, what I would like to discuss in this paper are general guidelines that have helped me and my colleagues. I will present some of the key ingredients to first assigning and then grading a writing experience which are both beneficial to the students and non-overwhelming to the instructor.

Method

The first question to address when creating an assignment is: why assign writing in the first place? What is the goal of the assignment, and how does it fit into the larger course? Other articles in this volume will address this question in greater detail, so I will risk oversimplifying the treatment of this question here. Let me then posit several possible reasons for assigning writing: (1) to improve students’ mathematical exposition; (2) to introduce new mathematics; (3) to strengthen understanding of previously encountered mathematics; or (4) to provide feedback from the student to the instructor. Closely related to this question of “why do it?” is the question of audience: to whom is the student writing? Students are most used to writing to an omniscient being: the instructor. Other possibilities include a potential boss, the public, a fellow student, or the student him/herself.

Deciding for yourself the answers to these two questions, and then explaining these to the students, is a very important part of creating an assignment. It focuses the student on your intentions, it reduces the student’s level of confusion and angst, and it results in better papers.

Knowing the answers to the above questions, you can design your assignments to match your goals. For example, I want my Calculus students to explain mathematics to those who know less mathematics than they do, and I want their writing experience to reinforce concepts being taught in the course. The projects I assign take the form of three letters from fictional characters who know little mathematics but who need mathematical help. My students are expected to translate the problems from English prose into mathematics, to solve the problems, and to explain the solutions in prose which the fictional characters could understand. I require that these solutions be written up on a word processor. (An example of one such problem can be found in [2].)

Once you have decided the over-arching purpose behind your assignment, you should try to pin down the specifics. The “details” that matter little to you now matter a great deal to the students—and will haunt you later. Do you want the papers to be word processed or handwritten? What is a reasonable length for the final product (one paragraph/one page/several pages)? What is the quality of writing that you expect: how much do the mechanics of English writing matter? How much mathematical detail do you expect: should the students show every step of their work, or instead provide an overview of the process they followed? Will students be allowed or required to work in groups? How much time will they be given to complete the paper? What will the policy on late assignments be? What percentage of class grade will this assignment be?

Especially for shorter papers, students find it very helpful if you can provide an example of the type of writing you expect. I have made it a habit to ask permission to copy one or two students’ papers each semester for this purpose, and I keep my collection of “Excellent Student Papers” on reserve in the library for the use of future students.

Findings

After you assign and collect the papers, you face the issue of providing feedback and grades. There is a considerable literature documenting that we instructors provide poor feedback to our students on writing. What students see when they get their papers back is a plethora of very specific comments about spelling and grammar, and only a few (confusing) comments on organization and structure. As a result, students do not in general rewrite papers by thinking about the process of writing. Instead, when rewriting is permitted they go from comment to comment on their draft, and change those things (and only those things) that their instructor pointed out. For this reason, I avoid assignments that require rewriting whenever I can: the instructor does a lot of work as an editor, and the student learns a minimal amount from the process.

One of my favorite assessment tools is the rubric, “a direction or rule of conduct.” My own rubrics take the form of a checklist with specific yes or no questions, which I list here:

Does this paper:

1. clearly (re)state the problem to be solved?
2. state the answer in a complete sentence which stands on its own?
3. clearly state the assumptions which underlie the formulas?
4. provide a paragraph which explains how the problem will be approached?
5. clearly label diagrams, tables, graphs, or other visual representations of the math (if these are indeed used)?
6. define all variables used?
7. explain how each formula is derived, or where it can be found?
8. give acknowledgment where it is due?

In this paper,

9. are the spelling, grammar, and punctuation correct?
10. is the mathematics correct?
11. did the writer solve the question that was originally asked?

Certainly I am not advocating that this is the only form a cover sheet could take. Other rubrics might be more general in nature: for example, having one section for comments on mathematical accuracy and another for comments on exposition, and then a grade for each section. At the other extreme, there are rubrics which go into greater detail about what each subdivision of the rubric entails, with a sliding scale of grades on each. There is a growing literature on the use of rubrics—see Houston et al. [3], or Emenaker’s paper in this volume.

Use of Findings

Students get a copy of my checklist even before they get their first assignment; they are required to attach this sheet to the front page of their completed papers. I make almost all of my comments directly on the checklist, with a “yes” or “no” next to each question. The total number of “yes’s” is their grade on the paper.

The main reason I like the rubric as much as I do is that it serves as both a teaching and an assessment tool. Students really appreciate having the guidance that a checklist provides, especially on their first paper. Moreover, the fact that the checklist remains consistent from one paper to the next (I assign three) helps the student to focus on the overall process of writing, rather than on each specific mistake the instructor circled on the last paper.

I said that my foremost reason for using the checklists is pedagogical idealism, but running a close second is practicality. Using this checklist has saved me a considerable amount of time grading. For one thing, it makes providing feedback easier: I can write “X?” next to question (6) above instead of the longer “What does X stand for?” Moreover, the checklist makes it less likely that the student will neglect to describe “X.” Using the checklist gives the appearance of objectivity in grading; I have had very little quibbling over scores that I give—this is unusual at my college. I describe this checklist in greater detail in [2], and I refer the interested reader there.

Success Factors

If an instructor does decide to have students rewrite their work, one way to provide valuable commentary but still leave responsibility for revision in the students’ hands is to ask students to turn in a cassette tape along with their draft. A simple tape player with a microphone allows for much more detailed comments of a significant kind: “In this paragraph, I don’t understand whether you’re saying this or that … you jumped between three topics here without any transitions … oh, and by the way, you really need to spell-check your work before you turn it in again.” I’ve found that students much prefer this kind of feedback. It means less writing for me; and it results in significantly changed (and better) final drafts.

If you assign a long-term project, it is extremely helpful to mandate mini-deadlines which correspond to relevant subtasks. These help to reduce student anxiety and to avoid procrastination. For example, when I assign a semester-long research paper, I have a deadline every week or two, in which students are expected to turn in:

• a thesis topic;

• a partial bibliography;

• a list of mathematicians with the same last names as they have (this teaches them to use Science Citation Index or Math Reviews, from which they can expand their bibliography);

• a vocabulary list relevant to their paper, including words that they don’t understand;

• a rough draft of an introductory paragraph;

• an outline with a revised bibliography;

• a rough draft of the paper; and finally

• the final paper.

One last piece of advice: As you prepare your assignments, make sure that you have resources available on campus to help the anxious or weak student. One of these resources will necessarily be you, but others might include math tutors, a writing center, and books or articles on writing mathematics (such as [1] or [4]).

References

[1] Crannell, A., A Guide to Writing in Mathematics Classes, 1993. Available upon request from the author, or from http://www.fandm.edu/Departments/Mathematics/Writing.html

[2] Crannell, A., “How to grade 300 math essays and survive to tell the tale,” PRIMUS 4 (3), 1994.

[3] Houston, S.K., Haines, C.R., Kitchen, A., et al., Developing Rating Scales for Undergraduate Mathematics Projects, University of Ulster, 1994.

[4] Maurer, S., “Advice for Undergraduates on Special Aspects of Writing Mathematics,” PRIMUS, 1 (1), 1990.

Assessing Modeling Projects In Calculus and Precalculus: Two Approaches

Charles E. Emenaker
University of Cincinnati
Raymond Walters College

Once you’ve assigned a writing project and collected the papers, how are you going to grade it without spending the rest of the semester on that one set? This article outlines two grading scales which can make this more efficient.

Background and Purpose

Projects that require students to model a real-life situation, solve the resulting mathematical problem and interpret the results, are important both in enabling students to use the mathematics they learn and in making the mathematics relevant to their lives. However, assessing modeling projects can be challenging and time-consuming. The holistic and analytic scales discussed here make the task easier. They provide the students with a set of guidelines to use when writing the final reports. They also provide me with a structured, consistent way to assess student reports in a relatively short period of time.

Currently I use these scales to assess projects given in precalculus and calculus classes taught at a two-year branch of a university. I have used these scales over the last seven years to assess projects given in undergraduate as well as graduate-level mathematics courses. The scales have worked well at both levels.

Method

I provide each student with a copy of the assessment scales the first class meeting. The scales are briefly discussed with the students and care is taken to point out that a numeric solution without the other components will result in a failing grade. I review the scale when the first group project is assigned. This helps the students to understand what is expected of them.

The analytic scale (given below, adapted from [1]) lends itself well to situations where a somewhat detailed assessment of student solutions is desired. When the analytic scale is used, a separate score is recorded for each section: understanding, plan, solution, and presentation. This allows the students to see the specific strengths and weaknesses of the final report and provides guidance for improvement on future reports.

The holistic scale (also given below, adapted from [2]) seems better suited to situations where a less detailed assessment is required. It often requires less time to apply to each student report and can be used as is or adjusted to meet your individual needs and preferences. For some problems it might be useful to develop a scale that has a total of five or six points possible. This will require rewriting the criteria for each level of score.

Findings

One of the easiest ways to demonstrate how the scales work is to actually apply them. Figure 1 contains a proposed solution to the problem given below. The solution has been evaluated using the analytic scale.

The Problem: You want to surprise your little brother with a water balloon when he comes home, but want to make it look “accidental.” To make it look accidental you will call, “Watch out below!” as you release the balloon, but you really don’t want him to have time to move. You time the warning call and find that it takes 1 second. You have noticed that it takes your brother about 0.75 seconds to respond to warning calls and you know sound travels about 1100 feet per second. If your brother is 5 feet tall, what is the greatest height you could drop the balloon from and still be certain of dousing him?
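One way to set up the boundary case (a sketch added here, not the solution shown in Figure 1; it assumes the balloon falls from rest with g = 32 ft/s² and that both the drop and the sound’s travel are measured over the distance h − 5 from the release point to the top of the brother’s head): the greatest height h from which the balloon still arrives before he can move satisfies

\[
\underbrace{\sqrt{\frac{2(h-5)}{32}}}_{\text{fall time}} \;=\; \underbrace{1}_{\text{call}} \;+\; \underbrace{\frac{h-5}{1100}}_{\text{sound travel}} \;+\; \underbrace{0.75}_{\text{reaction}},
\]

which a numerical solve puts at roughly h ≈ 56.7 feet under these assumptions; from any greater height, the brother has time to move.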

Using the holistic scale (given below) to assess this proposed solution results in a score of “3a.” The “a” is included to provide the students with guidance regarding the shortcoming(s) of the report.

Use of these scales, especially the analytic scale, has led to reports that are much more consistent in organization and usually of a higher quality. Students have repeatedly made two comments regarding the use of the scales. First, they appreciate the structure: they know exactly what is expected of them. Second, they are much more comfortable with the grade they receive. On more than one occasion students have expressed concern regarding the consistency of assessment in other classes and confidence in the assessment based on these scales.

Use of Findings

One issue I focus on is how well the students are interpreting what is being asked of them. After reading many reports where the students are first asked to restate the problem, it has become clear that I need to spend time in class teaching the students how to read and interpret problems. They may be literate, but they often lack the necessary experience to understand what is being described physically.

I also focus on the students’ mathematical reasoning. In the problem presented above, a number of students run into difficulty developing an equation for the brother’s reaction time. They will add the times for the warning call and the reaction while completely ignoring the time for sound to travel. Although it does not significantly alter the solution in this problem, it is important for the students to realize that sound requires time to travel. Prior to giving this assignment, we now spend time in class looking at situations where the time needed for sound to travel affects the solution.

Success Factors

When using the holistic scale, I include a letter (the “a” of “3a” above) to indicate which criterion was lacking. Students appreciate this information.

When new scales are developed it is well worth remembering that as the number of points increases so does the difficulty in developing and applying the scale. In the case of the holistic scale, it is often best to start with four points. As you gain experience, you may decide to develop more detailed scales. I have always found five or at most six points sufficient for holistic scales.

To avoid problems with one or two students in a group doing all of the work, I have them log the hours spent on the project. I warn the class that if a student in a group spends 20 minutes on the project while each of the other members averages four hours, that person will receive one-twelfth of the total grade. There has only been one case in seven years where this type of action was necessary.

A final note about using these scales. It often seems a huge task to evaluate the reports, but they go rather quickly once started. First, there is only one report per group so the number of papers is reduced to one-third of the normal load. Second, having clearly described qualities for each score reduces the time spent determining the score for most situations.

An analytic scale for assessing project reports

(Portions of this are adapted from [1]. See Figure 1 for a sample.)

Understanding
3 Pts The student(s) demonstrates a complete understanding of the problem in the problem statement section as well as in the development of the plan and interpretation of the solution.
2 Pts The student(s) demonstrates a good understanding of the problem in the problem statement section. Some minor point(s) of the problem may be overlooked in the problem statement, the development of the plan, or the interpretation of the solution.
1 Pt The student(s) demonstrates minimal understanding of the problem. The problem statement may be unclear to the reader. The plan and/or interpretation of the solution overlooks significant parts of the problem.
0 Pt The student(s) demonstrates no understanding of the problem. The problem statement section does not address the problem or may even be missing. The plan and discussion of the solution have nothing to do with the problem.

Plan
3 Pts The plan is clearly articulated AND will lead to a correct solution.
2 Pts The plan is articulated reasonably well and correct OR may contain a minor flaw based on a correct interpretation of the problem.
1 Pt The plan is not clearly presented OR only partially correct based on a correct/partially correct understanding of the problem.
0 Pt There is no plan OR the plan is completely incorrect.

Solution
3 Pts The solution is correct AND clearly labeled OR though the solution is incorrect it is the expected outcome of a slightly flawed plan that is correctly implemented.


2 Pts Solution is incorrect due to a minor error in implementation of either a correct or incorrect plan OR solution is not clearly labeled.
1 Pt Solution is incorrect due to a significant error in implementation of either a correct or incorrect plan.
0 Pt No solution is given.

Presentation
1 Pt Overall appearance of the paper is neat and easy to read. All pertinent information can be readily found.
0 Pt Paper is hard to read OR pertinent information is hard to find.

Figure 1. A proposed solution evaluated using the analytic scale.

A Holistic Scoring Scale

(This is an adaptation of a scale from [2].)

4 Points: Exemplary Response
All of the following characteristics must be present.
• The answer is correct.
• The explanation is clear and complete.
• The explanation includes complete implementation of a mathematically correct plan.

3 Points: Good Response
Exactly one of the following characteristics is present.
a The answer is incorrect due to a minor flaw in plan or an algebraic error.
b The explanation lacks clarity.
c The explanation is incomplete.

2 Points: Inadequate Response
Exactly two of the characteristics in the 3-point section are present OR one or more of the following characteristics are present.
a The answer is incorrect due to a major flaw in the plan.
b The explanation lacks clarity or is incomplete but does indicate some correct and relevant reasoning.
c A plan is partially implemented and no solution is provided.

1 Point: Poor Response
All of the following characteristics must be present.
• The answer is incorrect.
• The explanation, if any, uses irrelevant arguments.
• No plan for solution is attempted beyond just copying data given in the problem statement.

0 Points: No Response
• The student’s paper is blank or contains only work that appears to have no relevance to the problem.

References

[1] Charles, R., Lester, F., and O’Daffer, P. How to Evaluate Progress in Problem Solving, National Council of Teachers of Mathematics, Reston, VA, 1987.

[2] Lester, F. and Kroll, D. “Evaluation: A New Vision,” Mathematics Teacher 84, 1991, pp. 276–283.

Formative Assessment During Complex, Problem-Solving, Group Work in Class

Brian J. Winkel
United States Military Academy*

During student efforts to attack and solve complex, technology-based problems there is rich opportunity for assessment. The teacher can assess student initiative, creativity, and discovery; flexibility and tolerance; communication, team, and group self-assessment skills; mathematical knowledge; implementation of established and newly discovered mathematical concepts; and translation from physical descriptions to mathematical models.

————
* This work was done at the author’s previous institution: Rose-Hulman Institute of Technology, Terre Haute, IN 47803 USA.

Background and Purpose

Complex, technology-based problems worked in groups during class over a period of several days form a suitable environment for assessment of student growth and understanding in mathematics. The teacher can assess student initiative, creativity, and discovery; flexibility and tolerance; communication, team, and group self-assessment skills; mathematical knowledge; implementation of established and newly discovered mathematical concepts; and translation from physical descriptions to mathematical models. Viewing students’ progress, digression, discovery, frustration, formulation, learning, and self-assessing is valuable and instant feedback for the teacher.

During student efforts to attack and solve complex, technology-based problems there is rich opportunity for assessment. Active and collaborative learning settings provide the best opportunity for assessment as students interact with fellow students and faculty. The teacher sees what students know and use and the students receive immediate feedback offered by the coaching faculty.

Student initiative is very easy to measure, for the students voice their opinions on how they might proceed. Creativity, discovery, and critiquing of group members’ offerings come right out at the teacher as students blurt out ideas, build upon good ones, and react to ideas offered by others. Listening to groups process individual contributions will show the teacher how flexible and tolerant members are of new ideas. The write-up of a journal recording process and the final report permits assessment of qualities such as follow-through, rigor, and communication.

I illustrate success using this assessment strategy with one problem and direct the reader to other problems. I used the problem described below at Rose-Hulman Institute of Technology several times [1] as part of a team of faculty teaching in an Integrated First-Year Curriculum in Science, Engineering, and Mathematics (IFYCSEM). In IFYCSEM the technical courses (calculus, mechanics, electricity and magnetism, chemistry, statics, computer programming, graphics, and design) are all put together in three 12-credit quarter courses [2]. This particular problem links programming, visualization, and mathematics.

This activity fits into a course in which we try to discover as much as possible and reinforce by use in context ideas covered elsewhere in this course or the students’ backgrounds. Further, it gives the teacher opportunities to see the students in action, to “head off at the pass” some bad habits, to bring to the attention of the class good ideas of a given group through impromptu group presentations of their ideas, and to have students “doing” mathematics rather than “listening” to professorial exposition.

The activity is graded and given a respectable portion of the course grade (say, something equal to or more than an hourly exam) and sometimes there is a related project quiz given in class after the project has been completed.

In IFYCSEM we used Mathematica early in the course as a tool for mathematics and as the introductory programming environment. Thus we were able to build on this expertise in a number of projects and problems throughout the course. I have also used such projects in “stand-alone” calculus courses at West Point and they work just as well.

Method

A suitable class size for “coaching” or “studio modeling” such a problem is about 24 students — eight groups of three. It is possible to visit each group at least once during an hour, to engage the class in feedback on what is going on overall, and to have several groups report in-progress results. I have used this problem with two class periods devoted to it at the start, one class period a week later to revisit and share progress reports, and one more week for groups to meet and work up their results.

One has to keep on the move from group to group, and one has to read quickly where the students’ mindsets are, to try to understand what they are doing, to help them build upon their ideas or to provide an alternative they can put to use, but not to destroy them with a comment like, “Yes, that is all well and good, but have you looked at it this way?” (Translated into adolescent jargon — “You are wrong again, bag your way, do it my way!”)

An Example

The problem is to describe what you can see on one mountain while sitting on an adjacent mountain. See Figure 1. In a separate paper [1] I discuss how students, working in groups, attack the problem and the issues surrounding the solution strategies. This problem develops visualization skills, verbalization of concepts, possibly programming, and the mathematical topics of gradients, projection, optimization, integration, and surface area.

The problem stated

For the function

f(x, y) = (3x⁴ − 4x³ + 20) / (x⁴ + y⁴ + 5)

suppose your eye is precisely on the surface z = f(x, y) (see Figure 1) at the point (2.8, 0.5, f(2.8, 0.5)). You look to the left, i.e., in the direction (roughly) (–1, 0, 0). You see a mountain before you.

(a) Determine the point on the mountain which you can see which is nearest to you.

(b) Describe as best you can the points on the mountain which you can see from the point (2.8, 0.5, f(2.8, 0.5)).

(c) Determine the amount of surface area on the mountain which you can see from the point (2.8, 0.5, f(2.8, 0.5)).

I introduce the above problem in a room without computers so the students will visualize without turning to computers to “crunch.” I give the students the statement of the problem with the figure (see Figure 1) and say, “Go to it!” Working in groups, they turn to their neighbor and start buzzing. Circulating around the class, I listen in on group discussion.

Findings

Typically students offer conjectures which can be questioned or supported by peers, e.g., the highest point visible on the opposite mountain can be found as the intersection of a vertical plane and the opposite mountain. Students with hiking experience will say you cannot see the top of the mountain over the intervening ridges. A recurring notion is that when placing a “stick” at the point of the viewing eye and letting it fall on the opposite mountain this stick is tangent to the opposite mountain. Other students can build upon this vision, some even inventing the gradient as a vector which will necessarily be perpendicular to the stick at the tangent point on the opposite mountain. Others can use this latter tangency notion to refute the vertical plane notion of the first conjecture.
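A short script in this spirit (our sketch, not part of Winkel’s materials; it assumes the surface formula as reconstructed above and restricts attention to the profile y = 0.5) lets students test the “stick” conjecture numerically, by checking whether the straight sight line from the eye to a terrain point ever dips below the terrain between them:

import numpy as np

def f(x, y):
    # Two-mountain surface (as reconstructed above; an assumption).
    return (3*x**4 - 4*x**3 + 20) / (x**4 + y**4 + 5)

# The eye sits on the surface at (2.8, 0.5).
x0, y0 = 2.8, 0.5
z0 = f(x0, y0)

def visible(xt, y=y0, n=500):
    """Check whether the terrain point (xt, y, f(xt, y)) is visible
    from the eye: sample the straight segment between them and verify
    it never dips below the terrain."""
    t = np.linspace(0.0, 1.0, n)[1:-1]   # interior points of the segment
    xs = x0 + t * (xt - x0)              # x-coordinates along the sight line
    zs = z0 + t * (f(xt, y) - z0)        # heights of the sight line
    return bool(np.all(zs >= f(xs, y) - 1e-9))

# Scan toward the opposite mountain (the -x direction).
for xt in np.linspace(-4.0, 0.0, 9):
    print(f"x = {xt:5.2f}   visible: {visible(xt)}")

The tangency the students discover shows up here as the boundary between targets for which visible returns True and those for which it returns False.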

I also assign process reporting (summative evaluation) at the end of the project in which students describe and reflect upon their process. This is part of their grade. They keep notes on out-of-class meetings, report the technical advances and wrong alleys, and reflect on the workings of the group. Assigned roles (Convenor, Recorder, and Reactor) are changed at each meeting.

Use of Findings

The formative assessment takes place in the interaction among students and between students and teacher. Basically, the students “expose” their unshaped ideas and strategies, get feedback from classmates on their ideas, hone their articulation, and reject false notions. In so doing they clarify and move to a higher level of development. Observing and interacting with students who are going through this problem-solving process is an excellent way for the teacher to assess what students really understand.

Reading the journal accounts of progress gives the teacher insight into the learning process as well as group dynamics. The former helps in understanding the nature of student difficulty. The latter can help the teacher see why some groups are smoother than others and thus be aware of potential problems in the next group activity assigned.

Success Factors

The only real difficulty in this entire approach is how to address the many conjectures and convictions the students bring forth in the discussion. One has to be thinking and assessing all the time in order to give meaningful, constructive feedback. The teacher has to watch that one idea (even a good one) does not dominate from the start and that all ideas are given fair consideration. The entire process never fails if the problem is interesting enough.

A Mathematica notebook, ASCII, and HTML version of this problem, with solution and comments, is available (under title “OverView”) at web site http://www.rose-hulman.edu/Class/CalculusProbs. This is a part of a larger National Science Foundation project effort, “Development Site for Complex, Technology-Based Problems in Calculus,” NSF Grant DUE-9352849.

References

[1] Winkel, B.J. “In Plane View: An Exercise in Visualization,” International Journal of Mathematical Education in Science and Technology, 28(4), 1997, pp. 599–607.

[2] Winkel, B.J. and Rogers, G. “Integrated First-Year Curriculum in Science, Engineering, and Mathematics at Rose-Hulman Institute of Technology: Nature, Evolution, and Evaluation,” Proceedings of the 1993 ASEE Conference, June 1993, pp. 186–191.

Figure 1

Student Assessment Through Portfolios

Alan P. Knoerr and Michael A. McDonald
Occidental College

Reflective portfolios help students assess their own growth; project portfolios help them identify their interests and tackle more ambitious assignments.

Background and Purpose

We discuss our use of portfolio assessment in undergraduate mathematics at Occidental College, a small, residential liberal arts college with strong sciences and a diverse student body. Portfolios are concrete and somewhat personal expressions of growth and development. The following elements, abstracted from artist portfolios, are common to all portfolios and, taken together, distinguish them from other assessment tools:

• work of high quality,

• accumulated over time,

• chosen (in part) by the student.

These defining features also indicate assessment goals for which portfolios are appropriate.

Our use of portfolios is an outgrowth of more than six years of curricular and pedagogical reform in our department. We focus here on two types of portfolios — reflective portfolios and project portfolios — which we use in some second-year and upper-division courses. We also describe how we integrate portfolios with other types of assessment.

Reflective Portfolios. In a reflective portfolio, students choose from a wide range of completed work using carefully specified criteria. They are asked to explicitly consider their progress over the length of the course, so early work which is flawed may nonetheless be included to illustrate how far the student has progressed. A reflective portfolio helps students assess their own growth.

The collection of portfolios can also help a teacher reflect on the strengths and weaknesses of the course. It can point out strong links made by the students, and indicate struggles and successes students had with different topics in the course. It can further open the window to student attitudes and feelings.

Project Portfolios. Project portfolios are one component of a “professional evaluation” model of assessment [1, 2]. The other components are “licensing exams” for basic skills, small but open-ended “exploration” projects, and targeted “reflective writing” assignments. These different components, modeled on activities engaged in by professionals who use and create mathematics, have been chosen to help students develop a more mature approach to their study of mathematics.

Students choose projects from lists provided for each unit of the course. Completed projects are included in their project portfolio. This method of assessment helps students identify their interests, produce work of high quality, and tackle more ambitious assignments which may take several weeks to complete. Collecting work from the entire course encourages students to look beyond the next test or quiz.

Method I. Reflective Portfolios

To increase students’ self-awareness of how their understanding has developed requires that information about this understanding be collected by the students as the course progresses. Keeping a journal for the course is a useful supplement to the usual homework, classwork, quizzes, and tests. For example, in a real analysis course, students are asked to make a journal entry two or three times each week as they, among other possibilities: reflect on their readings or problem sets, discuss difficulties or successes in the course, clarify or connect concepts within this course or with those in other courses, or reflect on their feelings toward this course or mathematics in general. The teacher collects and responds to the journals on a regular basis.

Near the end of the semester, students are asked to prepare a reflective portfolio. The assignment says:

The purpose of this portfolio assignment is to allow you to highlight your own selections of your work and give an analysis of them in your own words. I will focus on two specific things in evaluating your portfolio: (1) an understanding of some key concepts in real analysis and (2) a self-awareness of your journey (where you started from, where you went, and where you are now).

Select three pieces of work from this semester to include in your portfolio. These pieces can include journal writing, homework assignments, tests, class worksheets, class notes, computer experiments, or any other pieces of work you have produced in this class. Your analysis should explain your reasons for picking these pieces. As examples, you might consider a selection which shows the development of your understanding of one key concept, or a selection which shows your growing appreciation of and proficiency with formal proofs, or a selection which shows your connection of two or more key concepts.

The most important part of the portfolio is your reflection on why you chose the pieces you did, how they show your understanding of some key concept(s), and how they show a self-awareness of your journey through this class. You should definitely write more than one paragraph but no more than five pages. The portfolio is 5% of your final grade.

Method II. Project Portfolios

The project portfolio corresponds to papers and other completed projects a professional would include in his or her curriculum vitae. Two sorts of themes are used for projects in a multivariable calculus course with a linear algebra prerequisite. For example, “Vector Spaces of Polynomials” concerns fundamental ideas of the course, while “Optimization in Physics” is a special topic for students with particular interests. All students are required to complete certain projects, while in other cases they choose from several topics. Here is an example of a portfolio project assignment.

Vector Spaces of Polynomials. What properties define a vector space over R? Let P3 denote the space of polynomials over R of degree less than or equal to three. Show that P3 is a vector space over R with natural definitions for “vector addition” and “scalar multiplication.” Find a basis for P3 and show how to find the coordinate representation of a polynomial in P3 relative to this basis. Show that “differentiation” is a linear transformation from P3 to P2, where P2 is the space of polynomials over R of degree less than or equal to 2. Find a matrix representation for this transformation relative to bases of your choice for the input space P3 and the output space P2. To check your work, first multiply the coordinate representation of a general polynomial p(x) in P3 by your matrix representation. Then compare this result with the coordinate representation of its derivative, p′(x). Discuss what you learned from this project. In particular, has this project changed how you think about vector spaces, and if so, in what way?
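One concrete instance of the check the assignment describes (added here for illustration; the assignment itself leaves the choice of bases to the student): with the monomial bases {1, x, x², x³} for P3 and {1, x, x²} for P2, differentiation is represented by

\[
D = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \end{pmatrix},
\qquad
D\begin{pmatrix} a_0 \\ a_1 \\ a_2 \\ a_3 \end{pmatrix} = \begin{pmatrix} a_1 \\ 2a_2 \\ 3a_3 \end{pmatrix},
\]

and the right-hand side is exactly the coordinate vector of p′(x) = a₁ + 2a₂x + 3a₃x² when p(x) = a₀ + a₁x + a₂x² + a₃x³.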

Project reports must include: a cover page with a title, author’s name, and an abstract; clear statements of problems solved, along with their solutions; a discussion of what was learned and its relevance to the course; acknowledgement of any assistance; and a list of references. Reports are usually between five and ten pages long. They are evaluated on both mathematical content and quality of presentation. First and final drafts are used, especially early in the course when students are learning what these projects entail.

For the typical project, students will have one week to produce a first draft and another week to complete the final draft. Students may be working on two different projects in different stages at the same time. Allowing for test breaks and longer projects, a completed course portfolio will comprise seven to ten projects. Students draw on this portfolio as part of a self-assessment exercise at the end of the course.

Findings

The collection of reflective portfolios can serve as feedback on student mathematical understanding and growth through the course. It can sometimes also give us a glimpse of the joy of learning, and thus the joy of teaching. One student wrote:

Without hesitation I knew the selections I was going to analyze. I think one of the reasons why this worksheet and these two journal entries came to mind so quickly is because they reflect a major revelation I had in the course. Not often in my math classes have I felt so accomplished .... These selections represent something I figured out ON MY OWN! That’s why they so prominently came to mind.

The first drafts of projects for the project portfolio are a rich source of information on how students are thinking about important topics being covered in the course. The final drafts reveal more clearly their depth of understanding and degree of mastery of mathematical language. We find that these projects help students achieve a better conceptual understanding of important aspects of the course and become more mature in presenting mathematical arguments. The discussion sections of their reports offer students some chance for reflection. For example, one student wrote the following in his report on vector spaces of polynomials:

I am fascinated to see how much we can deduce from an abstract space without being able to visualize it. While the concept of a polynomial vector space still amazes me it is very interesting to see how we can make this abstraction very tangible.

Use of Findings

Reflective portfolios document the development of mathematical knowledge and feelings about mathematics for each student in a particular course and class. This information has been used to realign class materials, spending greater time on or developing different materials for particularly difficult concepts. In addition, the evaluation of student journals throughout the semester allows for clarifying concepts which were not clearly understood. As an example, one student discussed an in-class worksheet on uniform continuity of a function f(x) and did not understand the role of x in the definition. Seeing this, class time was set aside to clarify this concept before more formal evaluation.
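For reference (spelled out here for the reader; the authors do not restate it), the standard definition at issue is: f is uniformly continuous on an interval I when

\[
\forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x, y \in I: \; |x - y| < \delta \;\Longrightarrow\; |f(x) - f(y)| < \varepsilon .
\]

The “role of x” that confused the student is that a single δ must work for every x in I at once, whereas in ordinary pointwise continuity δ may depend on x.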

First drafts of project portfolios may highlight difficulties shared by many students and thus influence teaching while a course is in progress. Reflecting on completed portfolios can also lead to changing how we teach a course in the future. The projects themselves may be improved or replaced by new ones. Special topics which were previously presented to the entire class may be treated in optional projects, leaving more class time for fundamental ideas. These projects can also lead to more radical reorganization of a course. For example, one year a project on Fubini’s Theorem revealed that students had trouble understanding elementary regions for iterated integrals. The next year we first developed line integrals, then introduced elementary regions through Green’s Theorem before treating iterated integrals. Teaching these topics in this unusual order worked quite well.

Success Factors

The reflective portfolio, journal writing, and the project portfolio are all writing-intensive forms of assessment. Clarity, time and patience are required of both teachers and students in working with assignments like these which are unusual in mathematics courses.

The reflective portfolio is little more than a short paper at the end of the semester. The time is put in during the semester as the teacher reads and responds to the journals. Cycling through all the journals every few weeks is probably the most efficient and least burdensome way of handling them.

Some students enjoy reflective writing while others may feel awkward about it. In the real analysis class, for example, the female students used the journal more consistently and produced deeper reflections than did the males. Teachers must also allow for reflections which may not be consistent with the outcomes they desire.

Several techniques can make correcting first drafts of project portfolios more efficient and effective. Comments which apply to many reports for a given project can be compiled on a feedback sheet to which more specific comments may be added for individual students. Peer review will often improve the quality of written work. A conference with a student before his or her report is submitted may make a second draft unnecessary. Students often need to complete several projects before they fully appreciate the degree of thoroughness and clarity required in their reports. Sharing examples of good student work from previous years can help communicate these expectations.

References

[1] Knoerr, A.P. “A professional evaluation model for assessment.” Joint Mathematics Meetings, San Diego, California, January 8–11, 1997.

[2] Knoerr, A.P. “Authentic assessment in mathematics: a professional evaluation model.” Ninth Annual Lilly Conference on College Teaching — West, Lake Arrowhead, California, March 7–9, 1997.

Using Writing to Assess Understanding of Calculus Concepts

Dorothee Jane Blum
Millersville University of Pennsylvania

Students write expository papers in an honors, non-science majors’ calculus course to integrate the major ideas they’re studying.

Background and Purpose

Millersville University is one of fourteen institutions in the State System of Higher Education in Pennsylvania. It has an enrollment of approximately 6,300 undergraduate students, about 280 of whom participate in the University Honors Program. Each student who plans to graduate in the Honors Program must complete a stringent set of requirements, including a course in calculus. For honors students seeking degrees in business or one of the sciences, this particular requirement is automatically met because calculus is required for these majors. However, most honors students major in the humanities, fine arts, social sciences, or education. To accommodate these students, the mathematics department designed a six-credit, two-semester sequence, Honors Applied Calculus. There are usually between 25 and 30 students in each course of the sequence.

At Millersville University, courses are not approved as honors courses unless they include a well-defined writing component. At its inception, this requirement was not enforced for Honors Applied Calculus because no one at any level in the university associated writing assignments with undergraduate mathematics courses. The lack of an honors writing component troubled me, so I attended workshops such as “Writing Across the Curriculum” in order to obtain ideas about incorporating writing into the two courses. As a result, not only was I able to find ways to add writing assignments to Honors Applied Calculus, but I also decided to replace the usual in-class test on applications of calculus with a portfolio of applications. Writing would be an integral part of the portfolio because the students would be required to preface each type of application with a paragraph explaining how calculus was used to solve the problems in that section.

Method

I use, on a regular basis, four topics for writing assignments in these courses. They are 1) a discussion of the three basic types of discontinuity; 2) a discussion of how the concept of limit is used to obtain instantaneous velocity from average velocity; 3) an analysis of a graph representing a real world situation; and 4) a discussion of how Riemann sums are used to approximate area under a curve. For the Riemann sum paper, I specify the audience that the students should address; usually it is someone who has had minimal exposure to mathematics such as a younger sibling. In the first semester, I assign one of the first two topics during the first half of the semester and the graph analysis paper near the end of the semester. In the second semester, the portfolio of calculus applications is due by spring break, whereas the Riemann sum paper is assigned near the end of the semester. I always allow at least three weeks for students to complete each writing assignment.
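For reference (a gloss added here, not part of the original article), the mathematical core of topics 2) and 4) is, respectively, the limit that turns average velocity into instantaneous velocity and the Riemann sum that approximates area:

\[
v(t) = \lim_{h\to 0}\frac{s(t+h)-s(t)}{h},
\qquad
\int_a^b f(x)\,dx \;\approx\; \sum_{i=1}^{n} f\left(x_i^{*}\right)\,\Delta x, \quad \Delta x = \frac{b-a}{n}.
\]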

Typical instructions for the graph analysis paper are:

Find, in a publication, a graph of a function that represents a real world situation, or make up your own graph of a function that could represent a real world situation. The graph you use for this assignment must exhibit at least six aspects from the list of graph aspects that we compiled in class. You are to make a chart indicating each aspect of the graph, where it occurs, and what it indicates about the function or its derivatives. Then you are to write a narrative in which you explain the connections between each graph aspect and the real world situation it represents. The final paper that you submit must include the graph, the chart, and the narrative.

I believe that this assignment is the definitive assessment tool for the conceptual material that we cover in the first semester. By transferring what they have learned in the course to a situation outside of the course, the students clarify for themselves the significance of such things as extrema, discontinuities, and inflection points. It is important to realize that in this assignment, I have no interest in evaluating computational ability. Rather, I am assessing students’ critical thinking, analytical thinking, and interpretative skills.

When I grade writing assignments, my primary concern is accuracy and my secondary concern is organization. No one is required to be creative or original, but most students voluntarily incorporate both attributes in the work they submit. The portfolio project, on the other hand, involves problem solving and computation in addition to writing. Each student includes two or three problems, chosen from a list provided by me, for each type of application (Newton’s Method, optimization, differentials, and related rates). Each section of the portfolio corresponds to one of the types of applications and includes the brief written explanation mentioned earlier, followed by the statement of each problem to be solved in the section and its detailed solution, including relevant diagrams. I evaluate the portfolio primarily on correctness of solutions and secondarily on writing and organization.

Findings

Students have reacted favorably to the writing assignments. On the Honors Program course evaluation questionnaire, many students claim that writing about mathematical ideas is a greater intellectual challenge and accomplishment than doing calculations or solving problems. They also have claimed that writing clarifies concepts for them. So far, no student has indicated that he or she would prefer to have tests in place of the writing assignments. However, students have indicated that putting together the portfolio is very time consuming and several have remarked that they would have preferred a test in its place. There is also universal agreement that the writing assignments and portfolio do comprise a more than adequate honors component for the sequence.

It is interesting to note that on the graph analysis assignment, approximately the same number of students choose to make up their graphs as choose to use graphs from publications. In the latter case, many students use graphs related to their majors. Some examples of situations for which students have created their own graphs are:

1) a graph of Joe Function’s trip through the land of Graphia where a removable discontinuity is represented by a trap;

2) a graph of Anakin Skywalker/Darth Vader’s relationship with the force where the x-intercepts represent his changes from the “good side” to the “dark side” and back again;

3) a graph of profits for Computer Totes, Inc., where a fire in the factory created a jump discontinuity. The narrative written by the student in this paper was in the form of newspaper articles.

Some examples of graphs students found in publications are:

1) a graph of power levels in thermal megawatts of the Chernobyl nuclear power plant accident where there is a vertical asymptote at 1:23 a.m. on April 26, 1986;

2) a graph of tourism in Israel from 1989 to 1994 with an absolute minimum occurring during the Persian Gulf War;

3) a graph of the unemployment percentage where it is increasing at an increasing rate between 1989 and 1992.

The assignment on Riemann sums has also produced some very innovative work. One student prepared a little book made of construction paper and yarn, entitled “Riemann Sums for Kids.” A French major, writing in French, explained how one could amuse oneself on a rainy day by using Riemann sums to approximate the area of an oval rug. Another student wrote a guide for parents to use when helping their children with homework involving Riemann sums. An elementary education major prepared a lesson plan on the subject that could be used in a fifth grade class. Several students submitted poems in which they explained the Riemann sum process.

The portfolio produces fairly standard results because it consists of more “traditional” problems. However, there have been some creative presentations, such as prefacing each section of the portfolio by a pertinent cartoon.

Use of Findings

When I first started using writing assignments in Honors Applied Calculus, I only counted them collectively as 10% of the final course grade. At that time, I did not value the writing as much as I did the in-class computational tests. However, I soon realized that students devoted more thought, time, and effort to the writing assignments than they did to the tests. I also realized that I learned more, through their writing, about how my students thought about and related to mathematics than I did from their tests. As a result, I now give equal weight to writing assignments, the portfolio, tests, and homework when determining the final grades in the course.


Furthermore, when I began to use writing assignments, I did not feel comfortable evaluating them. This situation resolved itself once I realized that the students, writing about mathematics for the first time, were just as unsure of what they were doing as I was. Now, I encourage the students to turn in rough drafts of their work the weekend before the Friday due date. This provides me with the opportunity to make suggestions and corrections as needed without the pressure of assigning final grades right away. Sometimes I will assign provisional grades so students will have an idea of the quality of their preliminary work.

I also have learned about curriculum from these projects. For example, based on results from the writing assignment on types of discontinuity, I am convinced that the most effective way to introduce limits is within the context of continuity and discontinuity. This is the approach I now use in all my calculus courses.

Success Factors

It would be difficult to imagine better conditions under which to use writing in a mathematics course than those that exist in the honors calculus sequence. The students are among the most academically motivated in the university. They are majoring in areas where good writing is highly valued. The pace of the sequence is determined by the needs of the students and not by a rigid syllabus that must be covered in a specific time frame. As a result, there is ample class time to discuss topics after they have been introduced and developed. For example, in the first semester, we spend twice as much time on the relationship between functions and their derivatives as is typically scheduled in other calculus courses.

Even though I initially introduced the writing assignments to fill a void, I now view them as an indispensable part of learning calculus. Consequently, I am looking for ways to incorporate writing into all my calculus courses.

Journals: Assessment Without Anxiety

Alvin White
Harvey Mudd College

Using journals for formative assessment encourages students to explore topics they might have been intimidated by if they were being graded.

Background and Purpose

A journal is a personal record of occurrences, experiences and reflections kept on a regular basis. In my mathematics classes students keep a journal of their mathematical experiences inside and outside of class. The purpose of journals is not to assign a grade for each entry but to help students find their own voices and to be reflective about the subject. Allowing more informal, tentative writing into the classroom encourages students to think for themselves as opposed to only knowing secondhand what others have thought before them.

Mathematics is sometimes perceived as stark and unbending. This may be caused by presentations which are strictly definition-theorem-proof, or which lack a sense of historical evolution and excitement. “Mathematics has a public image of an elegant, polished, finished product that obscures its human roots. It has a private life of human joy, challenge, reflection, puzzlement, intuition, struggle and excitement. The humanistic dimension is often limited to this private world. Students see this elegant mask, but rarely see this private world, though they may have a notion that it exists.” (D. Buerk [1], page 151)

Some professors insist that all answers to homework and exam problems be in full sentences, with severe penalties for violations. The students are given a sheet with instructions and illustrative examples and are left to sink or swim. There is little effort to convince students of the merits and advantages of this demand. The perception is that the professor is not “student friendly.”

If the object is to help students learn to express themselves in writing, journals offer a more natural approach, and the perceived relationship with the professor is not so confrontational. When the task is to “solve a problem,” students who are already writing in their journals may approach the solution in a more expansive and discursive fashion. Mathematics may thus be elevated above memorization of facts and formulas.

When I asked my 20 to 30 students in Calculus and in Linear Algebra to keep journals that would be read and commented upon by me, they remarked that they were also keeping journals in their humanities, social science and biology courses. Harvey Mudd College is a small (650 students) college of engineering and science with a demanding workload. One-third of the curriculum is in the humanities and social sciences. A Writing Center exists to help students improve their abilities. All first-year students take a writing-intensive course that is limited to eighteen students per classroom. This course is taught by writing specialists who are part of the Humanities and Social Science Department.

Journals are a form of self-assessment, an opportunity for students to think about their knowledge of the subject and to strengthen their confidence. The journals are not graded; they afford an opportunity for dialogue between each student and the teacher. Grades are based on homework, class participation, quizzes and exams. The journal dialogues allow the students and the teacher to know each other beyond the anonymity of the classroom activities.

Method

I ask my students to keep journals about their encounters with mathematics in or out of class. These may include insights or puzzlements. Some students may be interested in pursuing the history or the implications of some aspect of the subject. Entries should be made at least once a week. The journals are collected every two or three weeks and returned within a week. No length or other guidelines are offered. Students are encouraged to express themselves in writing. The exercise of writing may encourage further discussion among students and between students and teacher. Some entries have nothing to do with mathematics, but come from other concerns of young people.

An early entry for my students is “A Mathematical Autobiography.” Students seem to enjoy writing about their experiences with good or not so good teachers, their awards and successes, sometimes starting in the third grade. Other entries may be responses to questions or topics that arise. There may not be enough time to reflect on those ideas in class. Students may also express their puzzlements in their journals. The journal becomes part of the conversation between students and teacher.

A colleague periodically asks his students to turn in a looseleaf binder that contains returned, corrected homework, personal responses to readings in the text, and classroom notes. These portfolios may or may not be graded, but will be commented upon in the margins.

A successful alternative or addition to journals that I have tried is to ask students to write a short (3–5 pages) essay on anything related to mathematics:

Is there some subject that you are curious about? Learn something about the subject and write about it.

Findings

Many students seize the opportunity to learn independently and to write about their knowledge. Grades are not mentioned. (The activity might be part of class participation.) All students, with rare exception, participate enthusiastically. Some of the responses have been inspired. An Iranian student could not think of an appropriate topic. I suggested that he learn and write about the Persian poet-mathematician Omar Khayyam. He was so excited by what he learned that he insisted on writing a second paper about his countryman. The activity offered an independent, self-directed task that contributed to knowledge and self-confidence. A follow-up for that last activity might be duplicating all of those papers and distributing to the class (and beyond) a stapled or bound set.

Use of Findings

Grading journal entries might inhibit the free exchange of ideas and the viewing of the journals as part of an extended conversation. One might assume that not formally grading them would lead many students to neglect keeping journals. Paradoxically, the absence of grades seems to liberate students to be more adventurous and willing to explore ideas in writing and in the classroom. The usual subjects of the journals are the ideas that are studied and discussed in the classroom or from supplementary reading done by all or part of the class. The students flourish from the intrinsic rewards of expressing themselves about things that matter to them, and having a conversation with the teacher.

Although I do not grade the students’ journals, I do correct grammar and spelling, and comment on the content with questions, references and appropriate praise. Students respond to the conversations via journals by speaking in class with more confidence and a greater variety of ideas. Entries made before discussions are rehearsals for class participation. Entries made after class are opportunities for reflection.

Success Factors

How do students acquire knowledge of mathematics? Memorization and solving problems are two routes that may be followed. Constructing personal meaning by reflection and conversation is another route. These routes are not mutually exclusive. Asking students to discuss in their journals some of the questions that were touched on or discussed in class extends the intellectual agenda in a meaningful way. The journal is an opportunity for each student to be personally involved in the agenda of the class.

Students will not be anonymous members of the class if they participate in writing and reflecting as well as speaking.

Reference

[1] Buerk, D. “Getting Beneath the Mask, Moving out of Silence,” in White, A., ed., Essays in Humanistic Mathematics, MAA Notes Number 32, The Mathematical Association of America, Washington, DC, 1993.

Assessing General Education Mathematics Through Writing and Questions

Patricia Clark Kenschaft
Montclair State University

General education students can learn to read mathematics more thoughtfully and critically by writing questions on the reading and short response papers—without enormous investment of faculty grading time.

Background and Purpose

Montclair State University was founded as a public two-year normal school in 1908 and has evolved into a teaching university with a Master’s Degree program and plans for a doctorate. Almost 13,000 students are enrolled. Most have a mid-range of collegiate abilities, but there are a few superstars. Most are first-generation college students; many are working their way through college.

MSU requires all students to take mathematics, and specially designed mathematics courses support many majors. Students in other majors have a choice of courses to satisfy their mathematics general education requirement. One, “Development of Mathematics,” has as its aim “to examine mathematics as a method of inquiry, a creative activity, a language, and a body of knowledge that has significantly influenced our culture and society through its impact on religion, philosophy, the arts, and the social and natural sciences.” It attracts majors in religion, philosophy, psychology, the arts, foreign languages, English, journalism, elementary education, pre-law, political science, and history.

I was first assigned this course in 1993. How could one grade students in such a course? Computation doesn’t have much of a role in achieving the stated aims, but downplaying computation essentially eliminates traditional methods of mathematics assessment. A friend who teaches collegiate English and gives workshops nationwide for teachers of introductory college writing courses urged me to consider giving short writing assignments and just checking them off, as is common in introductory writing courses. This flies against our custom in teaching mathematics. When I balked, she admitted she gives a “check plus” to those who show special insight in their writing.

Wanting more variety, I decided to include weekly questions on the reading and a term paper as major contributors to the semester grade. A smaller weight is given to a final paper, attendance, and class participation, although (to avoid intimidating the students) I de-emphasize the oral component.

The two aspects that are most unusual in mathematics courses are discussed here: short papers and weekly questions on the reading.

Method

Short Papers. Typically, the weekly 250-word essay counts three points toward the final grade. The syllabus that I hand out on the first day of classes says, “You may explore ideas further (if you want an ‘A’ in the course) or merely summarize the author’s comments (if you are content with a ‘B–’).” Many students choose to “explore ideas further,” and when they do this creatively, they obtain a “check plus,” worth four points instead of the usual three.

Some papers, but relatively few, are skimpy and deserve only two points. The level of most papers is more than adequate. For lateness I subtract points, but I do not follow a fixed algorithm, since some papers are more justified in being late than others.


Weekly Question. A more explicit device for discovering what students want and need to learn is to require students to hand in questions at the beginning of each week on that week’s assigned reading. Each question counts one point toward their final grade, and ensures that most students have read the assignment before the week begins. These questions form the basis of much of our in-class discussion (see Use of Findings).

Findings

One of the surprises is the variety in the students’ papers as they try to write something that is more than routine. I inevitably learn a great deal, not only about the students, but about new ways of viewing and teaching mathematics, and about other subjects related to mathematics that they know better than I do.

Another surprise has been how easy it is to decide for grading purposes which papers demonstrate originality. Excerpts from one of the first papers I received indicate how one student explored ideas further. (Rudy Rucker is the author of the text.)

....Suddenly, I’m drowning in base 27’s and numbers I’ve never heard of before. I cry out for help as I notice a googol chasing me across the room.

My senses eventually come back and I realize that [the author] has made one major blunder... [He claims] that if we were to count two numbers a second for the better part of six days, one would reach a million.

....One could count two numbers a second very easily if it were up to one hundred. [but]...the numbers in the one hundred thousand to nine hundred ninety-nine thousand, nine hundred ninety-nine range are going to take more than a half a second. They are actually going to take the better part of a second. This does not even take into account that a person will probably get winded somewhere along the line and have to slow down. Fatigue may also factor into this because a person is bound to be tired after not sleeping for six days.

Nevertheless, I did want to make sure that my point was legitimate, so I took different sequences of numbers in a clump of ten and said them, checking the second hand of my watch before and after the set of numbers in the one hundred thousand range. My average was considerably greater than a half second per number. For example, try counting the sequence 789,884 through 789,894.

Mr. Rucker, your question is good. Its answer isn’t. I’d tell whoever decided to attempt this to set forth a couple extra days. Six days is not enough.

Such a paper clearly merits an extra point beyond those that merely regurgitate the reading.
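
The arithmetic behind the disputed claim is easy to verify (this check is mine, not part of the student’s paper): at a steady two numbers per second,

\[ \frac{10^6 \text{ numbers}}{2 \text{ numbers/sec}} = 5 \times 10^5 \text{ sec} \approx 139 \text{ hours} \approx 5.8 \text{ days}, \]

which is indeed “the better part of six days,” provided each number really takes only half a second to say; that is precisely the assumption the student’s stopwatch experiment challenges.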

Another discovery is the extent to which students reread each week’s assignment for their papers. Most read it reasonably carefully before submitting their question at the beginning of the week. Many refer to a second, third, or even fourth reading in their paper written at the end of the week. They are obviously seeking a deeper understanding based on the class discussions. Some students actually chronicle for me how much they comprehended at each reading, a useful account for my future teaching.

Use of Findings

Grading such papers is far more interesting than checking computations; it is worth the time it takes. I feel more in touch with the students than in any other course. Since I try to write at least one comment about the content of each paper, the papers become, in effect, a personal correspondence with each student.

I often read excerpts from the best papers aloud, and it is clear that the students also are impressed. Having one’s paper read aloud (but anonymously) is an intrinsic reward that students work hard to achieve. Reading interesting passages to the class also sometimes generates discussions that probe the ideas further.

The papers that merely summarize the reading are less exciting to read, but they too inform my teaching. Sometimes I catch misunderstandings. More commonly, students mention confusions that need clarification. The written papers give me an opportunity to see how and to what extent they are absorbing the material.

Students’ written questions are the backbone of my lesson planning. Some of their questions lend themselves to traditional teaching:

Explain what a logarithm is. I’m still a little confused.

or

Please explain the significance of 7 = 111 (base two) = 1 + 2 + 4. I’ve never seen this before.

I usually begin the week’s exposition by answering such questions. One advantage of basing lessons on the students’ questions is that at least one student is curious about the answer.

After I have taught the strictly mathematical topics, I turn to other questions where I believe I know the answer better than the student. For example:

When I think of a computer, I imagine an overpriced typewriter, and not a math-machine. I especially don’t understand the system that a computer uses to translate the math information into word information. In other words, what is a bite (sic) and how does it work?

This is not a traditional mathematics question, but it does guide me in helping the students understand how mathematics “has significantly influenced our culture and society” (the course’s aim). I answer such questions in the lecture format of a liberal arts or social science course, keeping the lecture short, of course.

For the final group of questions, I have the students arrange their chairs in a circle. I read a question and wait. My first class let hardly a moment pass before the ideas were flying. The second class taxed my ability to extend my “wait time.” Sitting silent for long gaps was not one of my fortes before teaching this course, but developing it paid off. Student discussion would eventually happen, and it was almost always worth waiting for. Subsequent classes have been between these two extremes; none have been disappointing.

Sample discussion questions students contributed from the same lesson as above included:

If we agree that “computerized polling procedures” give citizens a “new power over the government,” is the reverse not also true?

Page 30, bottom: “I construct in my own brain a pattern that has the same feel as your original thought.” I disagree with this. Information obtained doesn’t necessarily have the same “feel” to it as the transmitter’s thought intended, for we decipher it according to our perception and, therefore, can and often do distort the intended meaning or end product of information.

This last question is related to a fascinating student story. A clerical worker from Newark, she wrote at the beginning of the semester in her introductory comments to me that she was scheduled to graduate; this was her last course. Then she added, “I have enormous math anxiety. I hope to be able to ease this somewhat and sweat in class only from the high temperature!” The student spoke not a word in class during its first month, but she was always there on time and made constant eye contact. Class reaction to her questions (such as the one above) was strong, however, and I remember her response when another student exclaimed to one of her questions, “What a genius must have asked that one!” I saw her sit a bit taller. After a few weeks, she began to participate, very cautiously at first. By early December, her term paper, “From Black Holes to the Big Bang,” about Stephen Hawking and his theories, was so fascinating that I asked her if she would be willing to present it to the class. To my amazement, she hardly blinked, and a few minutes later gave a fine oral summary that mesmerized the class and stimulated an exciting discussion. She received a well-earned “A” at the end of the semester.

The best part of using essays and questions as the major assessment devices is that students take control over their learning. The essays reveal what they do know, while traditional tests find holes in what they don’t know. Students rise to the challenge and put real effort into genuine learning.

Success Factors

The sad fact is that the general public perceives traditional math tests as not objective, because people can so easily “draw a blank” even though they knew the subject matter last evening — or even an hour ago. My student ratings on “grades objectively” have risen since I began using this method.

Students like to demonstrate what they know, instead of being caught at what they don’t know. As the semester progresses, their efforts accelerate. Taking the pressure of competitive grades off results in remarkable leaps of understanding.

Combining Individual and Group Evaluations

Nancy L. Hagelgans
Ursinus College

One frequent concern of faculty members who have not yet tried cooperative learning is that giving the same grade to a whole group will be unfair both to the hard workers and the laggards. This article addresses this issue.

Background and Purpose

Assessment of student learning in a mathematics course with cooperative learning groups includes assessment of group work as well as assessment of individual achievement. The inclusion of a significant group component in the course grade shows the students that group work is to be taken seriously. On the other hand, assessment of individual work is consistent with the goal of individual achievement. Such an assessment plan promotes learning through group interaction and also holds students accountable for their individual learning.

Ursinus College is an independent, coeducational, liberal arts college located in suburban Philadelphia. Over half the graduates earn degrees in science or mathematics, and biology is one of the largest majors. About 75% of the graduates enter graduate or professional schools. Mathematics classes are small, with a maximum of 20 students in calculus sections and even fewer students in upper-level courses. In recent years, a majority of the mathematics majors have earned certification for secondary school teaching.

My first experiences with extensive group work occurred in the fall semester of 1989, when I took both my Calculus I sections to the microcomputer laboratory in order to introduce a computer algebra system. Throughout the semester, the students worked in pairs on weekly assignments, which counted about 20% of their course grade. I continued this arrangement for the next three semesters in my sections of Calculus I and Calculus II. Then, after the summer of 1991, when I attended a Purdue Calculus workshop, I started using the cooperative learning aspects of that program in many of my courses. Subsequently I was a co-author of the book A Practical Guide to Cooperative Learning in Collegiate Mathematics [2], which includes a full chapter on assessment. I now use cooperative learning and assess the group work in all the mathematics classes that I teach. The assessment method is applicable to any mathematics course, including first-year classes and majors’ courses, in which cooperative learning groups are assigned for the semester. I have used similar methods in Calculus I, Calculus II, sophomore Discrete Mathematics, Topology, and Abstract Algebra.

Method

The assessment plan for classes with cooperative learning includes evaluation of most of the students’ activities. I assess quizzes, tests, a final examination, group assignments and projects, and “participation.” The participation grade reflects each student’s involvement with all aspects of the course, including submission of journal entries each week, preparation for and contributions in class, and group interaction during class and outside class. A typical distribution of total points for the various activities assessed is shown in Table 1.

Activity          Points             Percent of Course Grade
3 tests           100 points each    30%
4 quizzes         25 points each     10%
final exam        200 points         20%
group projects    300 points         30%
participation     100 points         10%

Table 1

With this scheme about 45% of a student’s course grade is earned in the group, since the first of the three hour exams given during the semester is a group test and about half the participation grade is related to group work. The quizzes, two tests, and the final examination all are individual tests. At times I give a few extra points on an individual test or quiz to all members of a group that has excelled or greatly improved. I give each student a midsemester report in which I allocate up to half the points for group projects and for participation. In this midsemester report, with one group test and two quizzes given, 275 out of a total of 350 points (about 79%) are related to group work. The final examination is a challenging comprehensive examination on the entire course.
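
To see where these figures come from (the breakdown below is my reconstruction from Table 1; the article does not spell it out): at midsemester the 350 available points consist of the group test (100), two quizzes (2 × 25 = 50), half the group-project points (150), and half the participation points (50). Counting the group test, the group projects, and half of the participation as group work gives

\[ 100 + 150 + \tfrac{1}{2}(50) = 275, \qquad \frac{275}{350} \approx 79\%. \]

The same accounting over the full 1000 points recovers the 45% quoted above: 100 + 300 + 50 = 450 points.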

The group test is given during a class hour around the fourth week of the semester. The goals of the group test include promoting group interaction early in the semester, reducing tension for the first test of the course, avoiding early withdrawals from the course (especially in Calculus I), and learning mathematics during a test. A group test is somewhat longer and harder than an individual test would be, and it certainly is too long for an individual to complete during the hour. A typical problem on an abstract algebra group test is: Prove: If x = x⁻¹ for every element x of a group G, then G is abelian. Seven of ten problems on one discrete mathematics group test asked students to determine whether a statement is true or false and to provide either a proof or a counterexample with a full explanation; one such statement was: The square of an integer has the form 4k or 4k + 1 for some integer k. The group test in any course is designed with completely independent problems so that students can start anywhere on the test. Each student receives a copy of the test questions and blank paper for the solutions. Each group may submit only one solution to each problem.
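
For readers who want to check the two sample problems, here are sketch solutions (mine, not the article’s). For the algebra problem: if x = x⁻¹ for every x in G, then for any a, b in G,

\[ ab = (ab)^{-1} = b^{-1}a^{-1} = ba, \]

so G is abelian. The discrete mathematics statement is true: every integer is 2m or 2m + 1 for some integer m, and

\[ (2m)^2 = 4m^2, \qquad (2m+1)^2 = 4(m^2 + m) + 1, \]

so every square has the form 4k or 4k + 1.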

Preparation for a group test includes the usual individual studying. In addition, I frequently hold a review problem session in the class hour just before the test, and the groups continue to work on these problems outside class as they study for the test. Also, I instruct the groups to plan their test-taking strategies before the day of the test, and we discuss the possibilities in class. For example, a group of four students could work in pairs on different questions, one pair starting with the first question and the other with the last question; after all problems have been solved, students can check each other’s work and discuss the solutions. I mention that individuals need time to think quietly before talking with other group members about a problem.

The group assignments, which vary with each course, comprise challenging problems that are submitted and graded almost every week during the semester. The calculus classes solve problems using a computer algebra system; the laboratory book is chosen by the Department. The discrete mathematics students implement the fundamental concepts with some problems that I have written for the same computer algebra system [1]. The abstract algebra and topology students solve traditional problems without the use of technology, but with an emphasis on writing clearly.

The journal is a diary in which the students write about their own experiences in the class. On some weeks I ask them to concentrate on certain topics, such as their group meetings, the textbook, the department’s open tutoring service, or how they learn mathematics. I respond to the journals without determining a grade. Each week I give any reasonable submission a check to indicate credit toward the participation grade.

Findings

With the assessment described above, the students actively participate in their learning both in class and outside class every week of the semester. They interact with other students as they work together in long problem-solving sessions, where they learn to discuss mathematics. They encourage each other to persist until the problems are solved. During the frequent group visits to my office for help, the students talk fluently about the mathematics, and they show that they have thought deeply about the problems. The students are motivated to complete the group assignments by the assignments’ part in determining the course grades as well as by the responses I write on the work.

The grades on the group test are quite high, and no group has ever failed a group test. The lowest grade ever was 68 in one Calculus I class, and one year all four groups in Discrete Mathematics earned grades above 90. This test forestalls a disastrous first testing experience for first-year students in Calculus I and for majors enrolled in a first abstract mathematics course. Few students have complained that the test grade lowered their individual grades since, as a matter of fact, most students have a higher grade on the group test than on subsequent hour exams. On the other hand, some weaker students have written in their journals about their concern that they cannot contribute enough during the test, and no student has depended entirely on other members of the group. Since it is to the advantage of the group that all members understand the material, the group members usually are very willing to help each other study for the exam. The group exam does reduce tension and foster team spirit. Many students claim that they learned some mathematics during the test, and they frequently request that the other tests be group tests. I see evidence that students have edited each other’s work, with additions and corrections written in a different handwriting from the main work on a problem.



Students in all the courses have time to become acclimated to the course before the individual work counts heavily, and no students have withdrawn from my Calculus I classes in recent years. This assessment plan is much more successful than a traditional plan that involves only periodic assessment of performance on timed tests. The plan promotes student behavior that supports the learning of mathematics, and it allows me to have a much fuller understanding of the current mathematical knowledge and experiences of each student.

Use of Findings

By reading the groups’ work each week, I am able to follow closely the progress of the students throughout the semester and to address immediately any widespread weaknesses that become evident. For example, in one proof assigned to the abstract algebra class, the student groups had twice applied a theorem that stated the existence of an integer, and they had assumed that the same integer worked for both cases. During the next class I discussed this error; later in the class I had the groups work on a problem where it would be possible to make a similar mistake, and then we related the solution to the earlier problem.

I use the information in the weekly journals in much the same way to address developing problems, but here the problems usually are related to the cooperative learning groups. I offer options for solutions either in a group conference or in my response written in the journal. In addition, when students write about their illnesses or other personal problems that affect their participation in the course, we usually can find a way to overcome the difficulty.

Success Factors

I discuss the assessment plan and cooperative learning at the beginning of the course to motivate the students to participate fully and to avoid later misunderstandings. The course syllabus includes the dates of tests and the relative weights of the assessed activities. The midterm report further illustrates how the grades are computed. At that time I point out that the individual work counts much more heavily in the remainder of the semester.

References

[1] Hagelgans, N.L. “Constructing the Concepts of Discrete Mathematics with DERIVE,” The International DERIVE Journal, 2 (1), January 1995, pp. 115–136.

[2] Hagelgans, N.L., Reynolds, B.E., Schwingendorf, K.E., Vidakovic, D., Dubinsky, E., Shahin, M., and Wimbish, G.J., Jr. A Practical Guide to Cooperative Learning in Collegiate Mathematics, MAA Notes Number 37, The Mathematical Association of America, Washington, DC, 1995.

Group Activities to Evaluate Students in Mathematics

Catherine A. Roberts
Northern Arizona University

From reviewing before a test to various ways for students to take tests collaboratively, this article looks at ways that groups can be used to evaluate student learning while increasing that learning.

Background and Purpose

As a student, I found the opportunity to collaborate with other students instrumental to my success in learning mathematics. As an instructor, I take these approaches one step further. If the structure of a class is largely group-based, why not use group activities during testing as well? Exams then become genuine learning experiences.

I have been teaching at large state universities for five years. I use these approaches in classes ranging from College Algebra to Advanced Calculus. The average class size is about thirty, but the techniques can be used in larger classes as well.

Below is a list of group activities that can be used to assess learning in the mathematics classroom. They are arranged from modest to radical, so if you are just starting group activities, you might find it helpful to move through the list sequentially. I discuss findings and use of findings with each activity, and make brief remarks that apply to all the methods in the sections following the Method section.

Method

Group Exam Review

This approach allows you to cover a broad spectrum of problems in a limited amount of time. Hand out a worksheet with all the problems you would like students to review for an upcoming exam. Divide the class into groups, and have each group begin by solving only one problem and putting the solution on the board. Then, instruct groups to work on additional problems throughout the class period, always entering their solutions on the board. When it is clear that several groups agree on a solution, mark that problem as “solved.” When groups disagree, have the entire class work on the problem until agreement is reached. At the end of the class period, everyone has a completed solution set for the entire worksheet, even though as individuals they only wrestled with a few problems.

Findings. Students really like this approach for review. They spend the class period actively solving problems and feel that their time has been used wisely. They make contacts for forming study groups.

Use of Findings. This approach excites students to begin studying: they’ve gotten started and have a guide (the remaining problems on the worksheet) for continuing their studies. They have some sense of what they do and do not know, and thus how much studying will be necessary.

Think, Pair, Share

Write a problem on the board and ask students to think about how to solve this problem for a few minutes. Then, have them pair up with a neighbor to share solutions, and require each pair to agree on one solution. Next, ask each group to report its result to the class.

Findings. Since some students are more prepared than others, this technique prevents the most prepared students from dominating an open class discussion on a new topic. It forces each student to spend some time thinking alone. Students can then refine their ideas and build confidence by pairing before sharing their solutions with the entire class.

Use of Findings. Student confidence increases, and students are more willing to participate in the class discussion.

Quiz for Two and Quiz Consulting

In Quiz for Two, students pair up on a quiz. They can catch each other’s mistakes. In Quiz Consulting, students are allowed to go out into the hall empty-handed to discuss the quiz, or to talk to each other for one minute during a quiz. By leaving their work at their desks, students are forced to communicate ideas to each other verbally.

Findings. Students learn during the assessment itself. I’ve been surprised to see that a wide cross-section of students (not just the weakest ones) choose to take advantage of opportunities to discuss quizzes with their peers. Performance on these quizzes rises while student anxiety declines.

Use of Findings. I have found that I can make the problems a bit more challenging. One must take care, however, not to overdo this, as doing so may cause the class’s anxiety level to rise, defeating the purpose of the technique.

Collaborative Exams

Option 1: I give an extremely challenging multiple-choice exam to the class. I allow the class to discuss the exam, but require that they each turn in their own solution at the end of the class period. I’ve seen this technique used in a sociology course with hundreds of students at the University of Rhode Island. Option 2: Give a shorter exam, but have students work on it twice. For the first half of the testing period, students work alone and turn in their individual answer sheets. Then, allow them to work in small groups and turn in a group answer sheet at the end of the period.

Findings. One danger with collaborative exams is that students frequently defer to the person who is perceived as the “smartest”—they don’t want to waste precious time learning the material but rather hope to get the best score possible by relying on the class genius. Option 2 gives students an opportunity to discuss the exam more deeply, since they have already spent a good deal of time thinking about the problems. Discussion can deteriorate, however, if the students are more interested in trying to figure out what their grade is going to be than in discussing the problems themselves.

Use of Findings. With Option 2, the final score can be an average of their two scores.

Findings

While group work is a slower mechanism than a lecture for transmitting information, a well-designed group activity can lead to deeper learning. Moreover, activities can combine several topics, thereby saving time. My main finding is that student anxiety about testing is reduced. Students think of quizzes as learning exercises instead of regurgitation exercises. Students are thinking more during testing and are learning more as a result.

I use Group Exam Review several times each semester. Think, Pair, Share is most useful when I am introducing a new concept. There is a definite penalty to pay in class time when using this technique, so I try to use it sparingly. I use Quiz Consulting frequently; it is one of my favorite techniques.

Use of Findings

I primarily use these methods as part of students’ overall grade for the course. As with all teaching techniques, it is important to vary assessment methods so that any weaknesses are minimized.

Success Factors

I solicit regular feedback from my students and rearrange groups based on this feedback. I ask students individually, for example, how well they felt the group worked together, and whether they would enjoy working with the same group again or would prefer a new arrangement next time. Students who report that they do not want to work in groups at all are grouped together! This way, they can work individually (perhaps checking their answers with each other at the end) and don’t spoil the experience for other students who are more invested in this collaborative approach.

A concern with allowing collaboration for assessment is that strong students will resent “giving” answers away and carrying weaker students, or that weaker students will receive an unfair boost since they “scam” answers from others. This is a real worry, but there are ways to address it. It is helpful to rearrange the groupings frequently. I’ve found that students who do not contribute adequately feel embarrassed about their performance and consequently work harder to be well prepared the next time. If there are enough opportunities for strong students to demonstrate their individual understanding, there is not much grumbling.

Although many people use groups of four to five students, I’ve had more success with groups of three to four. Experiment with different grouping approaches — try putting students with similar scores together or pairing dominant and talkative people. For my initial group arrangements, I have students fill out information cards and then pair them according to similarities — commuters, parents, students who like animals, etc. Some instructors like to assign roles to each group member — one person is the record keeper, one person is the doubter/questioner, one person is the leader. I prefer to allow the leadership to emerge naturally. I am constantly rearranging groups until members unanimously report that they like their group. This can be a lot of work; however, I feel it is worth the effort.

It helps to give specific objectives and timelines in order to keep the group on task. Group conversation can wander, and it helps to forewarn groups that you will be expecting a certain outcome in a certain amount of time.

Give your students guidelines, such as those found in [1]. These guidelines can, for example, encourage students to listen to each other’s opinions openly, to support solutions that seem objective and logical, and to avoid conflict-reducing tricks such as majority vote or tossing a coin.

References

[1] Johnson, D.W. and Johnson, F.P. Joining Together: Group Theory and Group Skills, 3rd ed., Prentice Hall, Englewood Cliffs, NJ, 1987.

[2] Peters, C.B. (sociology), Datta, D. (mathematics), and LaSeure-Erickson, B. (instructional development), The University of Rhode Island, Kingston, RI 02881, private communications.

[3] Roberts, C.A. “How to Get Started with Group Activities,” Creative Math Teaching, 1 (1), April 1994.

[4] Toppins, A.D. “Teaching by Testing: A Group Consensus Approach,” College Teaching, 37 (3), pp. 96–99.

Continuous Evaluation Using Cooperative Learning

Carolyn W. Rouviere
Lebanon Valley College

This article discusses several cooperative learning techniques, principally for formative assessment. These include ways to help students learn the importance of clear definitions and several review techniques (Comment-Along, Teams-Games-Tournaments, and Jigsaw).

Background and Purpose

From 1993 to 1995, I taught at Shippensburg University, PA, a conservative, rural state university with about 6000 students. I used cooperative learning in a geometry course for students, most of whom were intending to become teachers, but this method could be used with most students. The text was Greenberg [1].

The NCTM Standards [4] urge teachers to involve their students in doing and communicating mathematics. The responsibility for learning is shifted to the students as active participants, and the instructor’s role is changed from dispenser of facts to facilitator of learning.

I chose to use the cooperative learning format for two reasons: I felt my students would learn the content better in a more interactive atmosphere, and they might need to know how to teach using these methods. I hoped experiencing these new methods would help them in their future teaching careers. Since most teachers teach the way they were taught, this would be one step in the right direction.

Method

I use heterogeneous groups (based on sex, ability, age, and personality) of four students, or five when necessary. I use a variety of cooperative activities, including team writing of definitions, Comment-Along, Teams-Games-Tournaments, and Jigsaw. I’ll describe each briefly.

One of the first activities the students work on is writing definitions. Each group receives a geometric figure to define, and then exchanges definitions (but not the original figure) with another group. The new group tries to determine the shape from the definition given to them. In whole-class follow-up discussion, we try to verify whether more than one shape can be created using the definition, and in some cases it can. Students are thereby able to understand the difficulty and necessity of writing clear definitions.

To be successful in mathematics, students must be able to decide whether their judgments are valid. Comment-Along gives them practice with this. This activity begins as homework, with each student writing two questions and answers. In class, the teams decide on the correct answers and select two questions to pass to another team. The new group considers the questions, adds its comments to the answers, and then passes the questions to a third team, which also evaluates the questions and answers. Then the questions are returned to the original teams, which consider all responses and prepare and present their conclusions to the whole class. This activity is an assessment that students perform on their own reasoning.

Teams-Games-Tournaments (TGT) is a Johnson & Johnson [3] cooperative learning activity which consists of teaching, team study, and tournament games. We use this at the conclusion of each chapter. The usual heterogeneous groups are split up temporarily. Students are put into homogeneous ability groups of three or four students for a competition, using the list of questions at the end of the chapter. Students randomly select a numbered card corresponding to the question they are to answer. Their answers can be challenged by the other students, and the winner keeps the card. Students earn points (one point for each card won) to bring back to their regular teams; a team average is taken, and the teams’ averages are announced and all are congratulated. On occasion, I follow up with a quiz for a grade. The quiz takes a random selection of three or four of the questions just reviewed; students write the quiz individually.

Jigsaw is a cooperative learning activity [3] where each member of a team becomes an “expert” on a topic. After the teacher introduces the material, each team separates, with the members joining different groups, each of which studies one particular aspect of the topic. In effect, they become “experts” on that topic. The teacher’s role is to move among the student groups, listening, advising, probing, and assuring that the groups make progress and correctly understand the concepts. When asked, the teacher should not try to “teach” the material, but rather pose questions which lead the students to form their own correct conclusions. When the students return to their original teams, they teach the other members what they have learned. The teacher is responsible for choosing topics and for monitoring the groups to assist and verify that what is being learned is accurate. Ultimately, all students are responsible for knowing all the information. I use this method to reinforce the introductory discussion of non-Euclidean geometry. Each expert team is responsible for the material on a subtopic, such as Bolyai, Gauss, Lobachevski, and similar triangles. When the students are divided into “expert” groups and each group is assigned a section to learn, I listen to, and comment on, what they are saying.

Findings

Because each team gives an oral analysis of the definitions it wrote, the whole class is able to share in the decisions and reasoning. Since students’ understanding of the need for clear definitions is essential to their understanding and using an axiomatic system, this activity provides them with needed practice. This assessment provided me with immediate feedback verifying that this foundation was indeed in place.

One of the goals of TGT is to give weaker students an opportunity to shine and carry back the most points for their team. After one tournament, a student was ecstatic at her top score! She had never before been best at mathematics.

Once I tried using team TGT average grades in an attempt to promote accountability. Students howled, “It’s not fair.” So I stopped counting these grades, and yet they still worked diligently at the game. (I then replaced this group quiz with an individual one.) Several students commented that they liked the game format for review.

During Jigsaw, the students seemed highly competent discussing their topics. During such activities, the classroom was vibrantly alive, as groups questioned and answered each other about the meaning of what they were reading. When the students finished teaching their teammates, one group assured me that the exercise was a great way to learn the material, that it had really helped their team. Students teach the content and assess their knowledge through questioning and answering each other.

One day, before class, a student told me that his teacher-wife was pleased to hear I was teaching using cooperative learning; it is where the schools are heading.

Use of Findings

Teacher self-assessment of the effectiveness of the classroom assignment was done informally through the questions I heard while moving through the room. It let me know how the students were learning long before test time, while I could still effect improvements. Depending on what was learned, the lesson was revisited, augmented, or considered completed. For instance, students had a very difficult time “letting go” of the Euclidean parallel postulate. When they would incorrectly use a statement that was a consequence of the Euclidean parallel postulate in a proof, I would remind them of our first lesson on the three parallel postulates.

Some group assessment for student grades was done (one instance was a team project to write proofs of some theorems in Euclidean geometry), but most of the formal assessment of students for grades was done by conventional individual examination.

Daily and semester reflection on the results of my evaluations allowed me to refine my expectations, activities, and performance for the improvement of my teaching. The students were learning, and so was I.

Success Factors

I would offer some advice and caution about making the change to a cooperative learning format in your classroom. The cooperative classroom “feels” different. Some college students may feel anxious because the familiar framework is missing. The “visible” work of teaching is not done in the classroom. Good activities require time and considerable thought to develop. Consider working with a colleague for inspiration and support. Don’t feel that it is an all-or-nothing situation; start slowly if that feels better. But do start.

I continued because of my desire to help all students learn and enjoy mathematics. But more, it was a joy to observe students deeply involved in their mathematics during class. There was an excitement, an energy, that was missing during lecture. It was addictive. Try it; you may like it, too.

References

[1] Greenberg, M.J. Euclidean and Non-Euclidean Geometries (3rd ed.), W.H. Freeman, New York, 1993.


[2] Hagelgans, N., et al. A Practical Guide to Cooperative Learning in Collegiate Mathematics, MAA Notes Number 37, Mathematical Association of America, Washington, DC, 1995.

[3] Johnson, D. and Johnson, R. Cooperative Learning Series Facilitators Manual, ASCD.

[4] National Council of Teachers of Mathematics. Professional Standards for Teaching Mathematics, National Council of Teachers of Mathematics, Reston, VA, 1991.

Collaborative Oral Take-Home Exams

Annalisa Crannell
Franklin & Marshall College

Giving collaborative oral take-home examinations allows the instructor to assess how well students handle the kind of non-routine problems we would all like our students to be able to solve.

Background and Purpose

I began assigning Collaborative Oral Take-Home Exams (hereafter called COTHEs) because of two incidents which happened during my first year of teaching at Franklin & Marshall College. My classes have always had a large in-class collaborative component, but until these two incidents occurred, I gave very traditional exams and midterms.

The first incident occurred as I was expounding to my students in an integral calculus course upon the virtues of working collaboratively on homework and writing projects. One of the better students asked: if working together is so desirable, why didn’t I let students work together on exams? What shook me up so much was the fact that I had no answer. I could think of no good reason other than tradition, and I had already parted from tradition in many other aspects of my teaching.

The second incident followed closely on the heels of the first. I attended the Fifth International Conference on Technology in Collegiate Mathematics, where I heard Mercedes McGowen and Sharon Ross [3] share a talk on alternative assessment techniques. McGowen presented her own experiences with using COTHEs, and this—combined with my student’s unanswered question still ringing in my ears—convinced me to change my ways. Although the idea behind COTHEs originated with McGowen, I will discuss in this article my own experiences—good and bad—with this form of assessment.

Method

I usually assign four problems on each COTHE, each of which is much more difficult than anything I could ask during a 50-minute period. I hand out the exam itself a week before it is due. Because of this, I answer no questions about “what will be on the midterm,” and I hold no review sessions—the students do their review while they have the exam.

These are the instructions I provide to my students:

Directions: This is an open-note, open-book, open-group exam. Not only are you allowed to use other reference books and graphing calculators, but in fact you are encouraged to do so, as long as you properly cite your resources. However, please do not talk to anybody outside your group, with the possible exception of me; any such discussions will, for the purposes of this midterm, be considered plagiarism and will be grounds for failure for the midterm or the course.

You should work on this midterm in groups no larger than 3 people, unless you ask for permission beforehand. Everybody in the group will receive the same grade, so it is up to the group to make sure that all members understand the material.

You and your group will sign up for a half-hour time slot together. On the day of your exam, please bring your written work to hand in — the group may submit one joint copy or several individual ones. You may bring solutions, books, and calculators to use as reference. You should be prepared to show your graphs to me, and you may of course use your written work to aid you with the oral questions. However, you will not be graded directly on your written work; rather, you will be graded on your understanding of the material as presented to me.

I begin each appointment by explaining the ground rules for the interview. I will choose students one at a time to answer the questions. While that student is answering, all other group members must be silent. Once that student has finished, the other students will be permitted to add, amend, or concur with the answer. I accept no answer until the whole group has agreed. Then I move to a new student and a new problem. The questions I ask tend to go backwards: What did you find for an answer to this problem? How did you go about finding this answer? Did you try any other methods? What made this problem hard? Does it look like other problems that you have seen?

Although each exam appointment is 1/2 hour, I leave 15 minutes in between appointments for spill-over time, which I invariably end up using. The interviews themselves usually take a whole day for a class of 30 students — when I teach 2 calculus classes, I have interviews generously spread over a 3-day period, with lots of time left free. While this certainly seems like a lot of time, I spend no time in class on this exam: no review sessions, no questions about what will be covered, no in-class test, and no post-exam follow-up. Better yet, I spend no time grading papers!

Since I assign the grades as the students take their exams, I have had to learn how to present the grades face-to-face. I've never had a problem telling students that they earned an 'A.' To students who earn 'B's, I explain that they are doing "solid" work, but that there is still room for improvement — and I explain just where (perhaps they need to be able to relate visual understandings back to the algebraic definitions). I tell students who do 'C' work that they're really not doing quite as well as I would like, for reasons X, Y, and Z. Notice that the emphasis here is not that I think that they are stupid or lazy, but rather that I have high expectations for them, and that I hope that this situation of weak work is temporary. For students who earn 'D's or worse, I explain that I am very worried about them (again, for reasons as broad as that their algebra skills are very weak and they seem to be confused about fundamental concepts in the course) and that they will need substantial effort to do better in the course.

Findings

The idea of an oral exam in mathematics strikes terror in the heart of the student, but knowing the questions beforehand and having their whole group present at the interview go a long way toward mitigating that fear. Despite the newness and scariness of taking oral exams, students wind up being fond of this form of exam. Indeed, by the end of the course what they are most worried about is taking the final exam as individuals—a turnabout I hadn't expected!

Use of Findings

From the instructor's perspective, the exam interviews become a mine of insight into the minds of the students. I gain a real appreciation of what was hard and what was easy about each problem. I get a chance to challenge misconceptions directly, and also to encourage flashes of insight. Moreover, because every student not only has to come to my office, but also talk directly to me, it becomes much easier to carry conversations about mathematics or study habits into out-of-exam time. The benefits of these brief obligatory encounters spill over into the rest of the semester.

Several of my students told me that for the first time they learned from their exam mistakes; that when they get traditional exams back covered with comments, they cannot bear to read any of it, so painful is their remembrance of the midterm. Whereas in a COTHE, they get the feedback even as they present their results; and since they have debated the results with their teammates, they care about the answer. All of them appreciate the time they get to do the problems—a frequent comment I get is that they'd never be able to do these problems on a timed exam. (Of course, I would never assign such difficult problems on a timed exam, although that does not seem to occur to them.)

I now design my exams to be both forward and backward looking—in this way I can use the interviews as a way of preparing students for material to come, and myself for their reaction to it. (For example, Problem D at the end of this paper requires students to do something that we encourage them to do in the regular course of the class: to read a textbook and learn new mathematics. The COTHE allows me to test this skill and to build on this material.)

What my students tell me during their exam tells me a lot about their "trouble spots." One year several groups of students were concerned about whether a function could be concave up at a point of discontinuity (see Problem A at the end of this paper). Although this would not have been my own choice of an important topic to cover, I then spent a half day in class working with the students on this issue and its relative importance to the subject of Calculus, and the students seemed to appreciate the class.

Often I discover—and can immediately correct—difficulties with interpreting the displays on graphing calculators. Sometimes I discover, to my delight, that the students really understand a concept (such as horizontal asymptotes) and that I can safely use that concept to introduce new topics (such as limits of sequences).

I enjoy using "stupid mistakes" as an opportunity to see what students know about the problem. If a student makes an arithmetical or algebraic error at the beginning of a problem on a written test, it can change the nature of the rest of the problem. On an oral exam, though, I might say, "At this step, you decided that the absolute value of –x is x. Can you tell me why?" At that point, the students bang themselves on the head and then huddle together to work the problem through again. This opportunity to see the students in action is illuminating (plus it provides them the opportunity to get full credit on the problem). We discuss the correct solution before the students leave.

Page 158: Assessment Practices in Undergraduate Mathematics - Northern

Part II: Assessment in the Individual Classroom 145

Success Factors

The most hard-won advice I have learned about assigning COTHEs has to do with the grades:

Do not assign grades by points. Providing so many points for getting so far, or taking off so many points for missing thus-and-such a fact, tends to be at odds with what the students really know. Also, because the students do so well on this kind of exam, the grades bunch up incredibly. I once was in the awkward position of explaining that '83–85 was an A' and '80–82 was a B.' Neither the students nor I appreciated this.

Do decide on larger criteria beforehand, and do share these criteria with the students. Pay attention to the larger, more important areas that make up mathematical reasoning: mechanics (how sound are their algebra skills? can they use the graphing calculator in an intelligent way?); concepts (do they understand the principles that underlie the notation? can they make links between one concept and another?); problem solving (can they read mathematics, attempt various solutions, etc.?). I assign grades based on the strength of these areas, and try to provide at least cursory feedback on each.

There are occasional concerns from the students about getting group grades, and occasional concerns I have about losing track of how individuals are doing (particularly, I worry about not catching the weak students early enough), and for that reason I'm moving in the direction of a two-part exam (similar to the "pyramid" exam described by Cohen and Henle in [1]). The first part of my two-part exam would be a COTHE, and the second would be a series of follow-up questions taken individually in class.

References

[1] Cohen, D. and Henle, J. "The Pyramid Exam," UME Trends, July 1995, pp. 2 and 15.

[2] Levine, A. and Rosenstein, G. Discovering Calculus, McGraw-Hill, 1994.

[3] McGowen, M. and Ross, S. Contributed Talk, Fifth International Conference on Technology in Collegiate Mathematics, 1993.

[4] Ostebee, A. and Zorn, P. Calculus from Graphical, Numerical, and Symbolic Points of View, Volume 1, Saunders College Publishing, 1997.

Selected COTHE Questions

Problem A.
(a) Create a formula for a function that has all the following characteristics:
• exactly 1 root;
• at least 1 horizontal asymptote; and
• at least 3 vertical asymptotes.
(b) Using any method you like, draw a nice graph of the function. Your graph should clearly illustrate all the characteristics listed in part (a).
(c) You should be prepared to talk about direction and concavity of your function.
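(One function satisfying part (a), offered here as an illustration rather than as part of the original exam, is f(x) = x/((x – 1)(x – 2)(x – 3)): its only root is x = 0; it has the horizontal asymptote y = 0, since the denominator's degree exceeds the numerator's; and it has vertical asymptotes at x = 1, 2, and 3.)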

Problem B. Let h′(x) = (x – 3/2)²(x + 1)/5 – 1, defined on the domain [–2, 3] — that is, –2 ≤ x ≤ 3. Please notice that this is h′(x), not h(x)!
(a) Sketch a graph of h(x) on the interval. (Note: there is more than one such sketch.)

(b) Locate and classify the critical points of h. Which values of x between –2 and 3 maximize or minimize h?
(c) For which value(s) of x does h have an inflection point?

Problem C. Let f(x) = 3x⁵ – Lx + 1 (where L is the average of the number of letters in your group's last names). The tangent line to the function g(x) at x = 2 is y = 7x – 4F, where F is the number of foreign languages being taken by your group. For which of the following functions can we determine the tangent line from the above information? If it is possible to determine it, do so. If there is not enough information to determine it, explain why not.
(a) A(x) = f(x) + g(x) at x = 2.
(b) B(x) = f(x) · g(x) at x = 2.
(c) C(x) = f(x)/g(x) at x = 2.
(d) D(x) = f(g(x)) at x = 2.
(e) E(x) = g(f(x)) at x = 2.
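(A sketch of the reasoning involved, ours rather than the author's: the given tangent line tells us exactly two things about g, namely g(2) = 14 – 4F and g′(2) = 7, while the formula for f lets us compute f and f′ at any point whatsoever. So in part (a), A′(2) = f′(2) + g′(2) is determined, and the product, quotient, and chain rules similarly settle (b), (c), and (d). Part (e) is the odd one out: E′(2) = g′(f(2)) · f′(2), and nothing in the problem reveals the slope of g anywhere other than at x = 2.)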

Problem D. Attached is an excerpt from Calculus from Graphical, Numerical, and Symbolic Points of View [4]. Read this excerpt and then use Newton's Method to find an approximate root for the function f(x) = x³ – Lx + 1, where L is the average of the number of letters in your group's last names. You should start with x₀ = 0, and should make a table which includes

x₀, f(x₀), f′(x₀), x₁, f(x₁), f′(x₁), …, x₅, f(x₅), f′(x₅).

You might also refer to Project 3.5 in your own Discovering Calculus [2].
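(For readers who want to see the computation the table asks for, the following short Python script generates it. This is our sketch, not part of the article, and L = 6 is an arbitrary stand-in for the group-dependent constant.)

    # Newton's method table for f(x) = x^3 - L*x + 1, starting at x0 = 0.
    L = 6  # hypothetical stand-in; in the assignment L depends on the group

    def f(x):
        return x**3 - L*x + 1

    def df(x):
        return 3*x**2 - L  # f'(x)

    x = 0.0
    print(" n         x_n      f(x_n)     f'(x_n)")
    for n in range(6):  # rows for x0 through x5
        print(f"{n:2d}  {x:10.6f}  {f(x):10.6f}  {df(x):10.6f}")
        x = x - f(x) / df(x)  # Newton update: x_{n+1} = x_n - f(x_n)/f'(x_n)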

Assessment in a Problem-Centered College Mathematics Course

Sandra Davis Trowell and Grayson H. Wheatley
Florida State University

When using a problem-centered teaching approach, the instructor needs new methods of assessment. This article explores one such approach.

Background and Purpose

In this article, we describe an assessment procedure the second author used at Florida State University in an upper-level mathematics course on problem solving. Analysis of this course formed part of the doctoral dissertation of the first author [3]. Florida State University is a comprehensive institution with a research mission. It is part of the state university system and has 30,000 students. The 16 colleges and schools offer 91 baccalaureate degrees spanning 190 fields. There is an extensive graduate program in each of the schools and colleges. Florida State University is located in Tallahassee and has strong research programs in both the sciences and humanities. The main goal of the course, taken by both undergraduate and graduate students, was to engender mathematics problem solving through the development of heuristics. A second goal was to provide opportunities for students to reconstruct their mathematics in an integrated way so that it could be utilized. The second author has taught this course several times using the methods described in this paper. We view mathematics as a personal activity — one in which each individual constructs his/her own mathematics. It is our belief and experience that a problem-centered instructional strategy is more effective than explain-practice, which tends to emphasize procedures [4]. For assessment to be consistent with this view of mathematics, it should occur as students are engaged in problem solving/learning mathematics rather than as a separate activity.

Method

In order to promote reflection on their mathematical activity, it is important to negotiate a classroom environment which focuses on the students' problem solving and explanations of their methods rather than prescribed procedures. In this problem solving course, nonroutine mathematical problems such as the following constituted the curriculum.

Fraction of Singles: The fraction of men in a population who are married is 2/3. The fraction of women in the population who are married is 3/5. What fraction of the population is single?

Angle Bisector Problem: Write the equation of the bisector of one of the angles formed by the lines 3x + 4y = –1 and 5x + 12y = 2.
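(Solution sketches, ours rather than the authors', to indicate the flavor of these problems. For the first: each marriage pairs one married man with one married woman, so (2/3)M = (3/5)W, giving M : W = 9 : 10; with M = 9k and W = 10k there are 3k + 4k = 7k single people out of 19k, so 7/19 of the population is single. For the second: a point (x, y) lies on an angle bisector exactly when its distances to the two lines agree, so |3x + 4y + 1|/5 = |5x + 12y – 2|/13; taking 13(3x + 4y + 1) = 5(5x + 12y – 2) yields the bisector 14x – 8y + 23 = 0, and the opposite sign choice gives the other bisector.)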

The assessment plan involves considering each student's problem solutions and providing written feedback in the form of comments, without points or grades. Overall, assessment is based on classwork, homework, a midterm examination, and a final examination. However, it differs from traditional policies in that the grade is not simply a weighted average of numbers for these components but reflects the students' progress in becoming competent mathematical problem solvers. The assessment plan and the goals of the course are discussed with the students on the first day of class and elaborated in subsequent sessions.

Reading the homework on a regular basis is the richest source of assessment data, as the student's mathematical competence and reasoning become clear. For example, one student usually wrote very little on each problem, unlike other students who used numerous pages explaining their methods. Yet he frequently developed quite elegant solutions. Elegance of solutions came to be valued by all students as the course progressed. In contrast, another student, while getting correct answers, would often use less sophisticated, nongeneralizable methods such as trial and error or an exhaustive search. On another occasion, a graduate student who thought he knew mathematics turned in simplistic solutions which did not withstand scrutiny. He was sure his answers were correct and was surprised to see the methods that were explained by other students. He initially underestimated the course demands but adjusted and became a strong problem solver. A few students were not successful in the course because their mathematical knowledge was inadequate. They had been successful in procedural courses but had difficulty when required to construct solutions without prescribed methods. By carefully tracking homework, the instructor is able to chart progress and assess quality. Some students who have little success on problems during the first few weeks of the course come alive and become successful problem solvers by the end of the course. In previous years, the second author tried assigning letter or number grades on homework, but found that this practice encouraged students to do homework for the grade rather than focusing on learning and doing mathematics. The less students are thinking about their grade, the more they can be doing mathematics. An average of test scores will not show this progression as clearly. Thus homework papers are an essential source of assessment data.

While generalizable and elegant solutions are valued, students are free to use any solution methods they wish. They are encouraged to explain their solution process, and good ideas are valued. Comments made on papers include "very nice," "how did you get this?" "I do not follow," "this does not follow from what you did before," or "how did you think to do this?" Quality work is expected, and the instructor will write "unacceptable" on some of the first homework papers if necessary. Students are not expected to be expert problem solvers when they enter the course, so weaknesses on early assignments are not unexpected. Furthermore, a student is not penalized for poor work initially.

In class, students present their solutions to the class for validation. There is no one method expected by the instructor. Some students are frequently eager to present their solutions while others are reluctant. Notes made on class activities after each session can be a significant component of the assessment plan. On some days the students work in groups of three or four solving assigned problems. During this time, the instructor is actively engaged in thinking about the group dynamics: who initiated ideas, who is just listening; and assessing each student's involvement. In an early group session, the instructor mentioned that during the small group sessions he had seen several different approaches. He followed this by saying that "we are always interested in other ways." He was pointing out that there was not necessarily one correct way to solve a problem and stressing the belief that problem solving is a personal activity.

The midterm and final examination questions are similar to the nonroutine problems assigned on homework and classwork. Each examination consists of a take-home part and an in-class part. Early in the course it is announced that an A on the final examination will result in an A in the course, since it would be evidence that students had become excellent problem solvers. The comprehensive final examination consists of approximately 10 challenging problems. The goal is a high level of competence, and many students reach this by the end of the course. In determining the final grades for students in this course, all the information about the student is considered, not just numerical entries in a gradebook. Particular attention is given to the level of competence at the end of the course rather than using an average of grades throughout the semester. Early in the course, some students are not able to solve many of the problems assigned, but as they come to make sense of mathematics previously studied and develop effective heuristics, their level of competence rises dramatically. The students are also encouraged to assess their own work, and near the end of the course they turn in to the instructor the grade they think they should receive. This information can serve as a basis for discussion, but in the end the instructor determines the course grade. The final grade reflects the knowledge at the end of the course without penalizing students for lack of success early in the course. Thus, students may receive a higher course grade than would result from an average of recorded numbers.

Findings

Students became more intellectually autonomous, more responsible for their own learning. This was particularly evident in the whole-class discussions. Students began to initiate sharing solutions, suggestions, and ideas. By the third class session, some students would walk to the board without waiting for the instructor's prompt in order to elaborate upon another idea during the discussion of a problem. Their additions to the class discussion added to the richness of their mathematics. In addition, students would suggest problems that were challenging and also voluntarily hand in solutions to nonassigned challenging problems.

This assessment plan, which encourages self-assessment, reflects student competence at the end of the course rather than being an average of grades taken at various points throughout the course. This practice encourages students who might otherwise get discouraged and drop the course even though they actually have the ability to succeed. Students tend to like this grading method because there is always hope. Four students of varying levels of mathematical sophistication were interviewed throughout this Problem Solving course by the first author. The students generally liked the assessment plan. One student said that this plan was very appropriate, as mathematical problem solving should be evolving, and that the comments on their problem solving were helpful. When each of these four students was asked during the last week of the course what grade they believed they would receive for this course, all four reported grades which were identical to those later given by the instructor: one A, two Bs, and one C. Even without numerical scores, these students were able to assess their own work, and this assessment was consistent with the instructor's final assessment.

Use of Findings

The results of this study suggest that the holistic plan described in this paper has merit and serves education better than traditional systems [1, 2]. Furthermore, by a careful reading of the homework, the instructor is better able to choose and design mathematics tasks to challenge and enrich the students' mathematical problem solving. For example, if students are seen as being weak in a particular area, e.g., geometric problems or proportional reasoning, the instructor can focus on these areas rather than spending time on less challenging problems.

Success Factors

Students enter a course expecting practices similar to those they have experienced in other mathematics courses. Thus, in implementing an assessment method which encourages students to focus upon doing mathematics rather than getting points, it is essential to help students understand and appreciate this unconventional assessment plan.

References

[1] Mathematical Sciences Education Board. Measuring What Counts, National Academy Press, Washington, DC, 1993.

[2] National Council of Teachers of Mathematics. Assessment Standards for School Mathematics, NCTM, Reston, VA, 1995.

[3] Trowell, S. The Negotiation of Social Norms in a University Mathematics Problem Solving Class, unpublished doctoral dissertation, Florida State University, Tallahassee, FL, 1994.

[4] Wheatley, G.H. "Constructivist Perspectives on Science and Mathematics Learning," Science Education 75 (1), 1991, pp. 9–21.

Assessing Learning of Female Students

Regina Brunner
Cedar Crest College

The author discusses assessment techniques which she has found to be particularly effective with female students.

Background and Purpose

Some women students, including most of us who have acquired Ph.D.s in mathematics, excel at demonstrating what they've learned on traditional tests. However, most women students are less successful than men with comparable levels of understanding when only these instruments are used. It is thus important to look for assessment methods that allow women to demonstrate as well as men what they have learned. Cedar Crest College is a four-year liberal arts women's college in Allentown, Pennsylvania. As a professor in a women's college, I have experimented with a variety of assessment techniques over the past 14 years. These include journal writing, group work, individual and group tests, and creative projects. My findings are that journals help form positive attitudes in females through encouragement and weekly guidance. Formative evaluation improves the teaching/learning process.

A research study of students enrolled in college remedial mathematics courses [4, p. 306] supported the assertion that beliefs such as self-confidence were more important influences on mathematics achievement and success for females than males. Research indicates that sex differences in mathematics self-efficacy expectations are correlated with sex differences in mathematical performance [2, p. 270]. Mathematics self-efficacy expectations refer to a person's beliefs concerning ability to succeed in mathematics. Such beliefs determine whether or not a person will attempt to do mathematics and how much effort and persistence will be applied to the task at hand and to obstacles encountered in the process. Suggestions for improving learning experiences of females include changing students' perceptions of the nature of mathematics by emphasizing creativity in mathematics, making connections to real-life situations, permitting students to engage in problem posing, and encouraging debates, discussions, and critiques of mathematical works [1, pp. 89–90].

I find that our students learn mathematics better with a hands-on teaching approach and with daily assessment activities to boost their self-esteem and to build their confidence as successful problem solvers. Making models helps our students visualize mathematical concepts. Since many women lack experiences with the objects of physics which motivate much of mathematics, they require experience with such objects to continue studying mathematics successfully.

Creative projects help students visualize a mathematical concept. Also, projects require students to reflect on how all the concepts they learned fit together as a whole.

Women as nurturers thrive on cooperation and collaboration rather than competition. The first attribute of women leaders is collaboration [3, p. 26]. Women leaders elicit and offer support to group members while creating a synergistic environment for all and solving problems in a creative style.

Classes used in the following assessments over the past fourteen years include Precalculus, Finite Math, Calculus I, II, III, and IV, Probability and Statistics, and Exploring Mathematics for Preservice K–8 Teachers.

Method

1. Group Work. Group work in class gives instantaneous assessment of each individual's daily progress. Students thrive as active learners receiving constant feedback and positive reinforcement. Problems can be solved by groups working with desks drawn together or working in groups on the blackboard. When groups work on the blackboard, I can watch all the groups at the same time. It is easy for me to travel around the room from group to group, giving encouragement to those who are on the right track and assisting those who are stuck. This works successfully with classes as large as thirty-two.

A natural outgrowth of working in teams during class is to have team members work together on an exam. All team members receive the same grade for their efforts if all contribute equally to the end result. Teams are given 7–10 days to complete the test and hand in one joint solution. Each problem is to be signed by all team members who contributed to its solution. Teams are allowed to consult only each other and their texts.

A good group teaching technique is to play "Pass the Chalk." When the teacher says to pass the chalk, the person with the chalk passes it to another group member, who continues solving the problem. This technique brings everyone into the problem solving process. No spectators are allowed: all are doers.

Group work leads to working together outside of class. The class's mathematics achievement, as measured by hourly tests, improves. Students feel comfortable as a successful working unit and take tests together as a team. For a more detailed discussion of group work, see the articles by Crannell (p. 143), Hagelgans (p. 134), and Roberts (p. 137) in this volume.

2. Journal. Daily journal writing provides a one-on-one conversation with the teacher about progress and stumbling blocks. I answer student inquiries about learning or about homework, and dispel negative feelings with encouragement. Students require about 5 minutes to complete a daily journal entry. Weekly reading of journal entries takes about 3–5 minutes per student. Scoring consists of a point for each entry. Total points are used as an additional test grade.

SAMPLE JOURNAL WRITING TEMPLATE

CLASS
1. In class, I felt...
2. In class, I learned...
3. The most positive result of class was...
4. The least positive result of class was...
5. Some additional comments related to class are...

I use the same questions (replacing "In class," by "While doing homework") for responding to homework. Questions 3 and 4 are both formed in terms of a positive response and not with the words "most negative result," to emphasize a positive frame of mind. Question 5 allows individuality in response. Students freely share their progress and problems in these entries.

3. Creative Projects. In Calculus II, for instance, I assigned students the task of making a model of an inverse function. One ingenious model was made from hospital-sized Q-tips. This student made a three-dimensional cube using these over-sized Q-tips. Inside the cube she placed three axes. With wire, she constructed a physical representation of a function and its inverse. Another, movable model was made from a pipe cleaner and ponytail beads. The function (a pipe cleaner attached to a piece of cardboard by the ponytail beads) was attached to a two-dimensional axes model. Then, by lifting and rotating the pipe cleaner about the line y = x, a student could view a function and its inverse function. One student brought in her lamp as an illustration, since the contours of the lamp's shadow were representative of a function and its inverse function.

A possible assignment in precalculus is to draw a concept map, a scaffold, or a flow chart to classify conic sections. For additional information on concept maps, see the article by Dwight Atkins in this volume, p. 89.

I am always amazed at how creative projects put a spark of life into mathematics class, and that spark sets off a chain reaction leading to a desire to succeed in the day-to-day mathematical activities in the rest of the course. These creative projects range from bringing in or constructing a model of a calculus concept, to writing a poem or a song, to developing a numeration system and calendar for a planet in the solar system. In the latter case, students are required to explain their numeration system and why it developed as it did on this planet, using relevant research from the Internet. Enthusiasm for mathematics increases as students research mathematical ideas and concepts, brainstorm, and engage in critical thinking and reasoning.

Findings

In the fall of 1996, I used journals, group work, a group test, and creative projects in Calculus I and Finite Math.

Final grades included two individual tests, one group test, and an individual comprehensive three-hour final. In comparing student grades on the first test of the semester to their final grades, I found that students performed substantially better on the final tests.

Written student final evaluations note that journal writing gives students time to reflect on how they learn best, a focus on class work, and a link between the professor and the student. In addition, journals provide the teacher an opportunity to view student thought processes and use this knowledge to teach more effectively. Group work increased student confidence in mathematics, replaced competition with cooperation, and emphasized hands-on problem solving and individual contact with the teacher. Projects enabled students to review and reflect on major concepts from the course.


Use of Findings

I find journal writing improves my students' attitudes towards mathematics, and thus their success in doing mathematics. As the course progresses, journal entries provide the stimulus for making changes in the learning environment. If many student responses indicate that a definite change is required, then the teacher may decide to reteach a concept or proceed at a slower pace. If just a handful require additional help, then the teacher may schedule an additional study session with these students.

Students realize that I care about their learning because I require and grade their journals each week. In turn, they work harder and harder to understand mathematics. I will solve original problems that they pose in the journal. So the journals become written, one-on-one, semester-long conversations and dialogues.

Group work helps the teacher view the interactions within a group, notice which group members need additional help in graphing or algebra skills, and provide help while the group is problem solving. I find group work invaluable. I do not enjoy giving long lectures anymore. I want to present a concept briefly and then solve problems in groups at the blackboard for the rest of the class period. I teach individual groups and assess learning effectiveness as they learn in small groups. My students always write in their journals about the value of the board work. It prepares them for the homework for that night. They are successful on the homework assignments because of the struggles in class that day.

Creative projects provide a needed respite to talk about concepts and to bring them to life in the classroom. I am always intrigued by student creativity and originality. A group in Finite Math shared a counting technique they found on the Internet for making sand patterns in India. The class went to the blackboard to draw these patterns because they were intrigued with this concept and wanted to try it also. Wanting to learn more than is required by the coursework and bringing mathematics into their lives, to me, makes our students mathematically literate and aware of the power, beauty, and mystique of mathematics.

Success Factors

1. Start on a small scale.
2. Be willing to fail and to try again.
3. Do not abandon lectures. Students cannot succeed unless you give them the background needed.
4. Answer everything asked in the journal. Try to be positive.
5. Shuffle group members during the semester. Working with a variety of partners encourages students to explore various ways to solve a problem, requires that our students be active learners, and increases communication within the class.

These techniques are especially helpful with female students, yet they can also be beneficial for all students.

References

[1] Barnes, M. "Gender and Mathematics: Shifting the Focus," FOCUS on Learning Problems in Mathematics, Vol. 18, Nos. 1–3, 1996, pp. 88–96.

[2] Hackett, G. and Betz, N.E. "An Exploration of the Mathematics Self-Efficacy/Mathematics Performance Correspondence," Journal for Research in Mathematics Education, Vol. 20, No. 3, 1989, pp. 261–273.

[3] Regan, H.B. and Brooks, G.H. Out of Women's Experience: Creating Relational Experiences, Corwin Press, Thousand Oaks, CA, 1996.

[4] Stage, F. and Kloosterman, P. "Gender, Beliefs, and Achievement in Remedial College-Level Mathematics," Journal of Higher Education, Vol. 66, No. 3, 1995, pp. 294–311.

Strategies to Assess the Adult Learner

Jacqueline Brannon Giles
Houston Community College System – Central College

Adult students are often better motivated than traditional-age students, but many have not taken an examination for many years. Thus, finding appropriate methods of assessment poses a challenge.

Background and Purpose

The Houston Community College System's Central College has about 27,000 full-time equivalent students who have various goals, including changing careers, acquiring associate degrees, and transferring to programs which will result in 4-year degrees. There are no residential facilities at the college, which is located in downtown Houston, Texas. Some students are homeless, physically challenged, or in rehabilitation programs; others are traditional students. The average age of our students is 27.

Older students present the faculty member with unusual challenges, since they don't necessarily respond well to traditional methods of teaching and assessment. However, they are capable of succeeding in learning mathematics if given appropriate support and opportunities to show what they have learned. Many students have personal difficulties at the beginning of the semester. They are challenged with housing arrangements, parking, and transportation problems. Day care is also a problem for many single or divorced women. Although the students have social problems and economic constraints, they are motivated to strive toward the academic goals of my class. They honestly communicate their difficulties and they share their successes. I design activities and assessment in my class to address these special needs of my students. Flexibility and compassion create an environment rich with opportunities to learn, while the expectations of excellence, mastery, and skills acquisition are maintained.

These non-traditional students usually begin my course in college algebra with little confidence in their abilities. They are reluctant to go to the board to present problems. Older students tend to do homework in isolation. Many of them do not linger on campus to benefit from interaction with their classmates. College Algebra students at Central College have no experience with mathematics software or computer algebra systems such as DERIVE, and are uncomfortable with computer technology generally. While older students may be uncomfortable with reform in college algebra, with time they do adjust and benefit from a pedagogical style which differs from the one they expected. In this article, I discuss some assessment techniques which I have found help these students grow, and allow them to show what they have learned.

Method

I usually start my college algebra classes with a 10- or 15-minute mini-lecture. I pose questions which encourage discussion among the students. These questions are sequenced and timed to inspire students to discover new ways of approaching a problem and to encourage them to persist toward solutions. Students are invited to make presentations of problems on the board.

Cooperative Learning

Cooperative Learning offers opportunities for sharing information and peer tutoring. Students are invited to form groups of four or five. I assign the work: for example, the section of Lial's College Algebra text on word problems.


Each group selects its group leader. Participants keep a journal of their work, and each member must turn in a copy of all solved problems in their own handwriting. Students are warned that each group member must be an active problem solver and no one is to be a "sponge." One class period is dedicated to forming the groups and establishing the rules for completion of the assignment, and a second to work on the assignment. If the word problems are not completed in the designated class period, the assignment becomes a homework assignment to be completed by each group. Approximately 30 word problems requiring the use of equations are solved using this method.

As a result of this activity, students become aware of their strengths and weaknesses and become better at pacing themselves and at self-assessment. Although I give a group grade for projects, I do not give a group grade for classwork or participation in discussion groups: each individual receives a grade for work completed.

Projects

Projects provide an opportunity for students to become familiar with the Mathematics Laboratory. The goal of the project is graphing and designing at least three images: a cat's face, a spirograph, and a flower. Most designs are accomplished using lines and conics such as circles, ellipses, and parabolas.

Some students do library or Internet research for their projects: they look up trigonometric equations and discover the appropriate coefficients to produce the best (and prettiest) flowers, or use polar coordinates or parametric equations. An objective of the research is to identify role models and people of diverse backgrounds who persisted and succeeded in mathematics courses or professions that are mathematics dependent. A second objective is to inspire students to write about mathematics and to record their attitude toward mathematics and technology. I had the students in summer school find a web page on Mathematics and Nature, and write a short report. They also did research on biomimetics and composites to become aware of the usual connections in science, mathematics, and nature. Connecting mathematics to the world around them helps motivate them to work harder learning the mathematics. The Mathematics and Nature web site that the students found is the one that I have linked to my page. I encouraged the students to use e-mail and to provide feedback to me by e-mail.

Examinations

Examinations usually contain 14–20 questions requiring the students to "show all work." This gives me an opportunity to see exactly what their needs are. Four examinations and a comprehensive final are administered. The student is given the option to drop the first examination if the grade is below 70. This option provides a degree of flexibility and often relaxes tension.

Findings

Our most successful endeavor has been the establishment of my web page, using student research and feedback to improve the design. The web page can be accessed at http://198.64.21.135. I have an instructional page (with information for my students on syllabus, expectations, etc.) and a personal page that students viewed and provided corrective feedback on. This activity inspired many students to do more research using the World Wide Web. Encouraging students to use this technology helps some of those who feel bypassed by technology overcome their fears. One student sent me an e-mail message: "This project to research your webpage was not only educational to students it gave them an opportunity to use current fast moving technology." Another commented, "I have been meaning to start on my own for quite some time, and now you have inspired me to do so." Attitudinal learning took place, and students seemed more excited about the mathematics class and the use of technology.

The college algebra classes have developed into a stimulating and broad intellectual experience for my students. They gained algebra skills and a new attitude about mathematics and careers in mathematics. The drop-out rate in my college algebra classes is lower than in most classes in my department. Approximately 89 percent of the students passed the course with a grade of C or above, while others realized that they did not have the time or dedication to do a good job and simply dropped the course. I believe students made wise judgments about their own ability and that they will probably re-enroll at a later date to complete the coursework.

With persistence on the instructor's part, older students become more confident learners. For example, toward the beginning of the semester, a 62-year-old businessman would often ask me to do more problems on the board for him. He would say, "Darling, would you work this problem for me?... I had a hell of a time with it last night." I didn't mind his style as long as he continued doing the work. Eventually, he would put forth more effort and go to the board to show the class how much progress he had made. He would point to the area where he was challenged, and then a discussion would ensue. As I observed his change in behavior I noted that he and other students were taking more and more responsibility for their learning.

Participants in my classes seem more appreciative of the beauty of mathematics and mathematics in nature. Students who successfully completed their designs using DERIVE discovered the beautiful graphs resulting from various types of mathematics statements. They learned how to use scaling and shifting to design images that were symmetric with respect to a vertical axis. While designing, for example, the cat's face, they learned how to change the radius and to translate small circles to represent the eyes of the cat. Some students used a simple reflection with respect to the x-axis to obtain a portion of their design. As they pursued their work, they had conferences with me to gain more insight.
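(To give a concrete sense of the kind of design work described above, here is an illustration of ours, written in Python with matplotlib rather than in DERIVE: a few translated circles and a shifted parabola are already enough for a simple cat's face.)

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 2 * np.pi, 200)

    def circle(cx, cy, r):
        # Parametric circle of radius r, scaled and translated to center (cx, cy)
        return cx + r * np.cos(t), cy + r * np.sin(t)

    plt.plot(*circle(0, 0, 4))       # head
    plt.plot(*circle(-1.5, 1, 0.5))  # left eye: a small translated circle
    plt.plot(*circle(1.5, 1, 0.5))   # right eye: its mirror image across the vertical axis
    x = np.linspace(-1, 1, 100)
    plt.plot(x, -0.5 * x**2 - 1)     # mouth: a downward parabola, shifted down
    plt.gca().set_aspect("equal")
    plt.show()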

Many students needed to learn basic computer skills: how to read the main menu of a computer, search for and find DERIVE, and author a statement. Tutors were available in the Mathematics Laboratory, and I assisted students when questions arose. It was the first time many of my students had ever made an attempt to integrate the use of technology into college algebra.

More students are enthusiastic about the use of technology in mathematics. Three years ago only about 5 percent of the Central College students owned graphing calculators or used a computer algebra system. In 1997, approximately 35 percent of them are active participants in the Mathematics Laboratory or own graphing calculators.

Use of Findings

As a result of student input, via informal discussions and e-mail, I will lengthen my mini-lectures to 20–25 minutes. Furthermore, at the end of each class I will review concepts and provide closure for the entire group. More demonstrations using the TI-92 will help prepare the students for their project assignments, and a bibliography on "Diversity in Mathematics" will assist the student research component. I usually document class attendance on the first and last day of class so that I have a photographic record of retention. The inspiration to design a more suitable attitudinal study, using a pre-test and post-test to measure change in attitudes, is my greatest gain as a result of these experiences.

Success Factors

The adult learner can be an exceptional student; adults learn very rapidly, in part because they don't need to spend time becoming adults, as traditional-aged students do. However, in my experience, adults may be more terrified of tests, they may feel a great sense of failure at a low grade, they may be slower on tests, and they may have lost a lot of background mathematical knowledge. Also, they have more logistical problems, which make working outside of class in groups harder, so they may not benefit from explaining and learning mathematics with others. The methods I have used are specifically designed to bring the adult learners together, to relieve anxiety, to encourage group work, and to establish the "OK-ness" of being wrong.

The Class Mission Statement

Ellen Clay
The Richard Stockton College of New Jersey

To set the tone of a course at the beginning, develop with the class a "class mission statement," which can be revisited as the course progresses to assess progress toward meeting course goals.

Background and Purpose

Several semesters of teaching expression-manipulation-type courses tired me out. In order to re-establish my enthusiasm for mathematics in the classroom, I remembered that mathematics is a science in which we look for patterns, generalize these patterns, and then use them to better understand the world around us. Yet when I introduced these ideas into a traditional atmosphere, the students revolted because it wasn't what they expected or were accustomed to. They found it difficult to acquire the new skills being asked of them: attacking a problem they had never seen before, experimenting with a few examples, choosing from the assortment of mathematical tools previously learned, sticking with a problem that had more than one answer, communicating their answers, and even asking the next question. During the transition period, frustration levels were high for everyone. I was disappointed with their work, which showed little or no improvement throughout the semester. The students were angry and I found myself on the defensive. From their point of view, it was a desperate semester-long struggle to bring their grades up with very little idea of how to accomplish that. This led to the introduction of the Mission Statement into my classes during the first week of each semester. When I began to use Mission Statements I was at a small liberal arts college in upstate New York. I am currently at a state college in New Jersey. Although the atmospheres of the colleges are quite different, mathematics education at the K–12 level is fairly consistent. Students come out of the secondary system in the expression-manipulation mode. For this reason I believe that Mission Statements can be useful at any institution and in every classroom. To date, I have used them successfully in service courses, general education or core distribution courses, and program courses. While the process takes a full class period, from the first day of class students begin to take an active role in their own learning. Learning becomes a process over which students have control, something they can improve, not something to simply submit to. Once they have some control, once they see a purpose, their motivation to learn carries the class, rather than the more typical scenario where the professor carries the course, dragging the students with her.

Method

I begin the first day with a preliminary process to determine students' expectations for the class. I read each of the following statements aloud and have the students complete them.

Each day I come to class I expect to (be doing) ...
Each night for homework I expect to (be doing) ...
Each day I come to class I expect the professor to (be doing) ...
I expect to do ____ hours of homework between each class (even on weekends??).
I expect my grades to be based on ...
If I could get anything out of this class I wanted to, I would make sure that I (got) ...


After each statement is completed, I pick up the responses, usually written on index cards, and react to them. This allows me to express my expectations for the class. It brings us a little closer together on what the course is all about, and it opens up a dialogue between the students and me. Once this preliminary exercise is complete, I facilitate the creation of individual Mission Statements. Each student is handed 20 to 30 post-it notes or index cards and a large sheet of paper. Then I read through the following process.

1. To establish a PURPOSE for the course, ask yourself why you are taking this course. What's your purpose? What do you hope to achieve? You must write until I tell you to stop. Brainstorm everything that comes to your mind. Write each thought on a different card. (Have students write for five minutes.) Now organize your thoughts by prioritizing them, grouping them, or whatever works best for you. (Allow up to 2 minutes.) Then set these ideas aside for now.

2. To establish a PROCESS for the course, ask yourself how you expect to fulfill those purposes. What means and methods will you use? What skills and behaviors will you use, both inside and outside of the classroom? (Have students write for five minutes.) Organize in the same way. Then set these aside.

3. To establish your ACCOMPLISHMENTS, ask yourself, what results do I plan to achieve? What specific outcomes will be evidence that I've succeeded? What topics do I want to have learned? (Have students write for five minutes.) Organize in the same way.

4. Arrange your individual thoughts from the cards onto the large sheet of paper using the following format:

I AM TAKING "COLLEGE ALGEBRA"
(purpose statements)
BY
(process statements)
SO THAT
(accomplishment statements)

At this point you may switch any statements between categories if you find they fit better elsewhere. The grammar necessary to make each statement follow the category headings will help you decide which statements go under which categories.

5. Take these home and type them up. (This ensures that they at least read them one more time and, in the best case, ponder over them again.) Also write a paragraph or so reacting to the entire process.

During the semester, we reflect on our Mission Statements several times, to make sure that those goals, processes, and accomplishments which we chose to the best of our ability at that time still work for us. For example, after returning a project I ask them to decide if the assignment pertained to any of their goals, and if they actually used the processes on their Mission Statement. If so, did the processes work? If not, are they willing to try them now? If the processes didn't work, I ask them either to think of other processes that might work, or to come to see me, because I have experiences which might be beneficial to them.

Findings

By the time the process is complete, I have assessed students' goals for the course, students' expectations of themselves, students' expectations of me, and students' definitions of success. Maybe even more importantly, students have taken the time to ponder these questions for themselves, many for the very first time.

Use of Findings

If the Mission Statements appear to be too diverse to work with, you can combine common elements from the individual Mission Statements, plus anything that was in at least one statement that you feel is essential, into a common Mission Statement for the course. Presenting this gives you another opportunity to get those students who chose to avoid the process, by writing shallow statements, to buy into it. If, during this process, you have had to eliminate goals that would lead to learning, you can have students create individual projects, related to the course material, which help them realize their individual goals. Keep in mind that each discussion about goals and how to achieve them offers opportunities for students to learn about themselves, their learning styles, and their defense mechanisms.

This method of assessment has enabled me to make many changes in my classroom. One major change that I have incorporated into several of my general education courses is that I no longer use the textbook as the course syllabus. Our goal is no longer to complete sections 2.1–6.5 skipping as few sections as possible. Now we set our goals based on my own and my students' Mission Statements and use the textbook when and as needed. As an example, in my College Algebra class, students wanted to see the usefulness of mathematics. Therefore, we began the semester with a linear programming problem. As we solved it we found ourselves graphing lines and linear inequalities, finding points of intersection, and invoking major theorems. We turned to the textbook in order to remember how to accomplish some of these tasks. We began in the index and the table of contents, and often found ourselves in several different chapters in a single day. Many students are finally seeing that much of the mathematics they have previously learned is useful and that they can return to a textbook as a reference when necessary.

Another substantial change that I have incorporated into my classes is that of allowing revision on projects. Many of my students are being asked to make decisions based on quantitative data, and to communicate these decisions, for the first time in their academic careers. This can be an onerous task for them. It is my experience that once a grade is assigned, learning stops. So instead of reading, commenting, and assigning grades to projects, I glance through them to find what is lacking and/or most misunderstood. Then I either make anonymous copies of the misconceptions or lackings on transparencies for the next class period, or I create a peer review form in which students are asked to look for these particular misconceptions/lackings in their peers' work. Using transparencies on an overhead projector takes less time: the students look through their own paper for each suggestion I present. The peer review takes more time but better prepares them for critically looking at their own papers in the future. Peer review works better if each student is working with different data. In either case I offer the class the opportunity to work on projects one more time before I put a grade on them. Students are very grateful. This change took place as a result of our discussion of whether we were meeting our goals or not. Again the usefulness of the subject came up. We realized that in order for work to be useful, we need to be able to communicate it to others, and the students had absolutely no training in this area.

On a personal note, revision has been a life-saving device as far as time goes. Rather than write the same comments over and over, assignment after assignment, I can glance through a set of projects and decide if I should grade them or build in a revision. The second time around I have the same choice: request a second revision, or grade. When I do decide to make individual comments and assign a grade, it takes very little time because I am reading the assignment for the second or third time, and the comments are more often about how to extend the project, include deeper mathematics, or simply suggest that next time the student take advantage of the revision process.

A more impromptu change in the course took place when several of my students made statements about integrating mathematical ideas into their own lives. It developed rather like a challenge: I apparently claimed that mathematics is everywhere, and as a result their Mission Statements challenged me to integrate any subject of their choice into the class. From that conversation the following project developed during a class session. Each of my thirty-two students had to think of a situation that could be modeled linearly. That night's assignment was to call two companies which provided the service they suggested in class. These ranged from mechanics' shops to pizza parlors to long-distance phone carriers. Students had to write a narrative description comparing and contrasting the two services to decide how one would choose between the two. The narrative had to be supported by appendices which included graphical, numerical, and analytical representations of the data. Each representation had to point out all pertinent information used in their narrative argument.

Both the creation of and reflection on our Mission Statements have us continually rethinking our intention for the course. For the first time I am treating general education students differently from mathematics majors. We spend significant time and effort thinking about how to communicate the mathematics we learn (as otherwise it is useless to us) and how to create assignments and processes that enhance this learning. One of the most exciting aspects of the new journey we are on is that for the first time we have student input at a time when students have a stake in the course, rather than relying on student evaluations, when often they are commenting on their success in the course, not on improving or increasing the learning that takes place.

Success Factors

The more responsibility I give to the students, the more success we all have. However, you must be willing to relinquish the feeling of control! This has been difficult for me. Originally I asked the students to write their Mission Statements during week three or four, so that they would have an idea of what the course was all about. Now I ask them to create them on the first day or two, before they feel that they are stuck with my choices. Then after three or four weeks we review the Mission Statements to decide whether they are working for us or not.

A second transition I have had to make to ensure success is accepting that this process takes time. You can’t rush it! It takes an entire class period at the beginning of the course, and a few minutes of class time on occasion for reflection and discussion. I now know that the time is well worth it, because when students and professor are working towards the same goal, progress is hastened dramatically. The time spent on the Mission Statement is made up in the first few weeks of class. I have had several students, reflecting on their Mission Statements, admit that they are not doing anything they claimed they would, and that this is affecting their grades. That is taking responsibility for one’s own actions. If you feel as though your job is to educate students beyond covering content, you will have given them a useful, lifelong tool.

Early In-Course Assessment of Faculty by Students

William E. Bonnice
University of New Hampshire

Students learn to take responsibility for their learning by giving input on how the class is going and what needs to be changed. This works in classes of all sizes.

Background and Purpose

The University of New Hampshire (UNH) is predominantly an undergraduate public institution with an enrollment of approximately 14,000. The Department of Mathematics is in the College of Engineering and Physical Sciences. This encourages meaningful interaction between mathematicians and users of mathematics. The fact that the Mathematics Education Group (those at UNH who educate mathematics teachers for elementary and secondary schools) resides within the Department of Mathematics has fostered special concern about educational matters. Although I am a regular member of the Department of Mathematics and not part of the Mathematics Education Group, I am continually experimenting with new teaching methods. For example, I use student-centered methods in my standard-sized classes of around thirty students and am experimenting with extending these methods to classes of up to 250 students.

In order to promote a student-centered approach in the classroom, several times during the course I formally gather student input. This input is then the basis for a class discussion about possible changes. Getting early feedback from students and implementing changes improves the course and my teaching. Students are also gratified to be able to give input and see their suggestions implemented. Much of what I do in the classroom has been inspired by the work of psychologist Carl Rogers. The three core conditions for successful psychotherapy that Rogers [1] formulated, in what is now called the Person-Centered Approach (PCA), apply to all human relationships, but, in particular, to student-teacher and student-student relationships. These core conditions, as I understand them, are:

(1) Unconditional positive regard: The therapist is willing and able to value the client and trusts that the client has the ability to solve her/his own problems.

(2) Empathy: The therapist is willing and able to put her/his experience aside and really understand the experience and feelings of the client.

(3) Congruence: The expressions and communications of the therapist genuinely represent the true thoughts and feelings of the therapist.

In the classroom, Rogers believed that “students can be trusted to learn and to enjoy learning, when a facilitative person can set up an attitudinal and concrete environment which encourages responsible participation in selection of goals and ways of reaching them.” (from the flyleaf of [2])

Method

Immediately after the first and second exams, at approximately the fourth and eighth weeks of the course respectively, I pass out questionnaires soliciting anonymous student feedback. Some questions seek open responses and others seek specific responses to matters of concern to me or my class. After collecting the questionnaires at the next class, I pass out a written summary of the results and we discuss possible changes. When there is general agreement, we adopt such changes. In cases of disagreement, we discuss possible actions and try to arrive at consensus. When someone has formulated what seems to be a good decision, we take a thumbs-up, thumbs-down, thumbs-to-the-side (“Will support but not my first choice”) vote. We ask everyone who voted thumbs-down or thumbs-to-the-side to tell what would have to be done in order for them to change their vote. We try to incorporate these suggestions and come up with a better decision to try out.

Examples of issues and topics of discussion which have come up include: use of calculators on exams; use of “cheat sheets” on exams; the number of times cooperative groups should be changed during the semester and the method of choosing the groups; grading criteria; methods of ensuring individual responsibility in group work; use of too much time discussing class processes; etc.

Sometimes the class arrives at a decision that I don’t like. Using Rogers’ third condition, “congruence,” I explain why I disagree with their decision, but if the students are not swayed to my point of view, if possible I go along with them. This is an application of the first condition, “unconditional positive regard.”

The matters addressed on the questionnaire generally fall into three categories:

(1) Attitudinal. E.g.:

• I like working in cooperative groups as we have been doing in this class.
  Strongly Agree 10 9 8 7 6 5 4 3 2 1 Strongly Disagree

• I would suggest the following improvements:

(2) Specific questions address problematic areas. E.g.:

• What would better foster cooperation among team members; what would help teams function more effectively?

• When a team produces a single write-up for a project, what grading scheme will ensure individual accountability?

• What would discourage talking and noise in the large lecture room? How can we move from active participation in team work to quiet attention to the lecture?

(3) Open response items seek general input. E.g.:

• Things I like/dislike about this class:

• Suggestions for improvement:

• The topic I found most difficult to understand:

• It would have been easier for me to understand this topic if:

Findings and Use of Findings

Attitudinal questions are especially useful after I have tried something innovative, and I want to know how the class feels about it.

Often students come up with better ways to deal with difficulties that arise in the classroom than methods I have heard at professional conferences and workshops. Thus it seems that giving students “unconditional positive regard” is warranted. Examples of things implemented as a result of student feedback and discussion are:

• I let the students know the first day that the class will be unconventional, that probably most learning will take place in cooperative groups, that I will do little lecturing, and that decisions about how the class will be run will be based on class discussions. I request that students be open to new ways of learning and that they be willing to work cooperatively to help one another learn. This early notice gives students the opportunity to withdraw if they prefer a conventional class.

• A grading scheme which factored a small percentage of the team average into each individual team member’s grade was discarded. The intention had been to encourage group members to help one another. Instead it stirred up resentment in the better students, whose grades were pulled down.

• Instead of keeping the same groups all semester, the composition of the groups was changed after the first exam and after the second exam. We made changing optional after the third exam.

• A flexible-weight grading system was initiated, as described in my other article in this volume (p. 84).

• Several graphing calculators were put on reserve in the library.

• Each team developed a list of its own guidelines for group process.

• A grading scheme was initiated for projects which fosters individual accountability and virtually eliminates complaints about shirkers.

• Bonus questions were added to exams.

• New topics are motivated by illustrating their importance or usefulness.

• Students with identical calculators sit together whenever the class is working on programming an algorithm.

Success Factors

Of course students want their input taken seriously. Therefore, presenting a summary in writing at the class following the survey and using it as a basis for discussion of possible changes is very effective. If the discussion leads to a decision to make a change, it is important to implement it immediately.

As good as this all seems, not everything is rosy. When there is considerable disagreement on a matter, some students think that the resulting discussion is “wasting valuable class time.” In fact, some students would like the teacher to make all the decisions, and they object to using any class time to discuss the surveys. Other students resist change and object to trying different approaches in mid-semester. When a change is implemented and doesn’t work, these students are particularly vehement. And certainly not everything works the way we think it will. For example, I have tried students’ suggestions about how to quiet a lecture hall after the students have been working in cooperative groups, but none of them has worked. However, if the entire class is invested in a decision, it has the greatest chance of success.

It takes extra time to make up survey questionnaires, and even more time to summarize the results while also grading exams. (Work-study students may help with the summarization.) But the extra effort pays off because it makes the course more alive and interesting for both the students and the teacher. Students and teacher alike learn from carrying out early in-course assessments.

References

[1] Rogers, C.R. “The Necessary and Sufficient Conditions of Therapeutic Personality Change.” Journal of Consulting Psychology 21(2), 1957, pp. 95–103.

[2] Rogers, C.R. Freedom to Learn. Merrill Publishing, 1969.

Student Feedback Teams in a Mathematics Classroom

David Lomen
University of Arizona

A student feedback team is a subset of the class which gives the instructor feedback on how the class is doing with new material.

Background and Purpose

The University of Arizona is a public institution with about 35,000 students. Each fall its Mathematics Department offers 15 sections of second-semester calculus (Calc II), with about 35 students per section. Recently we noticed that students entering with credit on the AB Advanced Placement Calculus Examination do not seem to be well served by our Calc II. These incoming students are definitely not challenged by the standard integration material at the start of the semester, so instead of fully mastering the familiar material they relax. By the time new material is introduced they are well behind their classmates, and many never catch up. Another possible reason is that in Calc II we emphasize written explanations along with numerical and graphical reasoning. Most incoming students are not very proficient with this. As a result, many of these students find their first mathematics course in college frustrating and unrewarding, even though they are often the most intelligent students in the class.

During the academic year 1996–1997, we made an effort to remedy this problem. We created a special year-long course that uses differential equations to motivate topics from second-semester calculus which are not covered in the AB exam. Enrollment was limited to incoming students who received a 4 or 5 on the AB Advanced Placement Examination. To determine the course content, I consulted several high school teachers, the Advanced Placement syllabus, and past AP examinations.

The result was a course that starts with a very quick review of functions, limits, continuity, differentiation, and antidifferentiation, often within the context of some mathematical model. The emphasis here is on numerical and graphical interpretation. We then analyze simple ordinary differential equations using techniques of differential calculus. Standard integration topics are covered as they occur in finding explicit solutions of differential equations. Taylor series are introduced as a technique which allows solution of difficult nonlinear differential equations. Here the ratio test for convergence of series is motivated by such series solutions. The textbooks for the course are Hughes-Hallett [1] and Lomen & Lovelock [2].
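As a minimal illustration of how a series solution can motivate the ratio test (the course’s own examples ran to harder, nonlinear equations, but the idea is already visible in the simplest linear case): substituting $y = \sum_{n=0}^{\infty} a_n x^n$ into $y' = y$ with $y(0) = 1$ forces $a_{n+1} = a_n/(n+1)$, hence $a_n = 1/n!$, and the ratio test then shows that the solution series converges for every $x$:

\[
\lim_{n\to\infty}\left|\frac{a_{n+1}x^{n+1}}{a_n x^n}\right| = \lim_{n\to\infty}\frac{|x|}{n+1} = 0 < 1.
\]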

The course was limited to 30 students, as that was the number of chairs (and personal computers) in our classroom. With this agenda, a new type of mathematics classroom, and the nonuniform background of the students, it was evident that I needed some type of continuous feedback from the class.

Method

My first assessment effort solicited their response to the following two statements:

“One thing I understand clearly after today’s class is ___.”
“One thing I wish I had a better understanding of after today’s class is ___.”

The students were to complete their responses during the last two minutes of the class period and give them to me as they left. (This is a particular version of the One-Minute Paper; see David Bressoud’s article in this volume, p. 87.)


It had seemed to me that this was a very appropriate way to have them evaluate the material I was covering, especially during the review phase at the start of the semester.

The second method entailed my meeting with a committee of individuals once a week to assess their response to our classroom activities, their learning from the class, and their homework assignments. From the 20 volunteers I chose four who had different majors and had used different calculus books in high school. (At our first class meeting I had all students fill out a one-page fact sheet about themselves, which greatly aided in this selection.) To facilitate communication among the class, I formed a listserv, where I promised to answer all questions before going to bed each night. I also handed out a seating chart which contained each student’s telephone number, e-mail address, and calculator type. Comments, complaints, suggestions, etc., were to be directed either to me or to this committee, either in person or via e-mail. I emphasized that this was an experimental class, and I really needed their help in determining the course content and the rate at which we would proceed.

During the second semester of the course, these meetings of the four students were disbanded, and then resurrected (see below) and made open to any four students who wished to attend, with preference given to those who had not participated previously. Also, during the second semester, I had the students work in groups on some of the more challenging homework exercises. Included with their write-ups was an assessment of how the group approached the exercises, how they interacted, and what they learned.

Findings

This class was the most responsive class I have ever experienced as far as classroom interactions were concerned. However, there were so many unusual things about this class that it is impossible to attribute this solely to the use of feedback teams. For example, almost all of the students used e-mail to ask questions that arose while they were doing their homework, especially group homework. In one assignment they ended up with the need to solve a transcendental equation; a reminder that there were graphical and numerical ways to solve such equations allowed the assignment to be completed on time. Students could send messages either to me or to the listserv. When the latter was used, other students several times responded to the question, usually in a correct manner.
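(The transcendental equation from the assignment mentioned above is not recorded here; a hypothetical stand-in shows the kind of reasoning the reminder invited. An equation such as $e^{-t} = t/2$ has no algebraic solution, yet graphing $y = e^{-t}$ and $y = t/2$ reveals a single crossing, and since $f(t) = e^{-t} - t/2$ satisfies $f(0.8) > 0$ and $f(0.9) < 0$, a few bisection steps on a calculator locate the root near $t \approx 0.85$.)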

Use of Findings

It turned out that the students required no urging to express their opinions. At the inaugural meeting first semester, the committee of four very clearly stated that they KNEW my first means of assessment was not going to be effective. They reasoned that the Calc I review was being presented and discussed in such an unusual manner that they had no idea what they understood poorly or well until they had some time to reflect on the material and had worked some of the assigned homework. After much discussion, they agreed to a modification of this procedure whereby these two questions were answered AFTER they completed the written homework assignment. This was done, and it provided very valuable feedback, including identifying their struggle to know how to read a mathematics textbook. To help with this reading problem, I had the students answer a series of True/False questions (available from http://www.calculus.net/CCH/) that were specific to each section of the calculus book. I would collect these as students came to class and use their answers as a guide to the day’s discussion. While not all the students enjoyed these T/F questions, many said they were a big help in understanding the material.

One other item this committee brought to my attention was the enormous amount of time it took them to work some of the word problems. To rectify this situation, I distributed the following seven-point scheme, suggested by a colleague of mine, Stephanie Singer: 1) write the problem in English, 2) construct an “English-mathematics dictionary,” 3) translate into equations, 4) include any hidden information, and information from pictures, 5) do the calculations, 6) translate the answer into English, 7) check your answer against the initial information and common sense. This, together with some detailed examples, provided a remedy for that concern.
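A hypothetical micro-example, not from the course, may show how the seven steps fit together. 1) English: a 50-liter tank drains at 2 liters per minute; when is it half full? 2) Dictionary: $V$ = volume in liters, $t$ = time in minutes. 3) Equation: $V(t) = 50 - 2t$. 4) Hidden information: “half full” means $V = 25$. 5) Calculation: $25 = 50 - 2t$ gives $t = 12.5$. 6) Translation: the tank is half full after 12.5 minutes. 7) Check: draining 2 liters per minute for 12.5 minutes removes 25 liters, half of the original 50.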

As the class continued into the spring semester, I thought that the need for this committee had disappeared. However, after three weeks, three students (not on the original committee) wondered why the committee had been disbanded. They had some specific suggestions they wanted discussed concerning the operation of the class. My reaction was to choose various times on Friday for a meeting with students. Whichever four students were interested and available at the prescribed time would meet with me. The first meeting under this format was spent discussing how to improve the group homework assignments which I had instituted spring semester. Because several homework groups had trouble arranging meeting times, they suggested these groups consist of three, rather than four, students, and that no group mix students who lived on campus with those living off campus. This was easily accomplished.

A meeting with a second group was spent discussing the “group homework” assignments, which were always word problems, some of them challenging or open-ended. These students noted that even though these assignments were very time-consuming, they learned so much that they wanted them to continue. However, they requested that the assignments be more uniform: some took them ten hours, some only two. The assignments given following this meeting were more uniform in difficulty.

At another meeting the students’ concern shifted to the number of exercises I had been assigning for inclusion in their notebooks (checked only twice a semester, to assess their effort). These exercises were routine, and for every section I had been assigning all that were included in the student solutions manual. They said there was not enough time to do all of these exercises, and they did not know which they could safely skip. In response, I then selected a minimal set which covered all the possible situations, and let those students who like the “drill and practice” routine do the rest.

Success Factors

For larger classes, the student feedback teams could solicit comments from students before or after class, or have them respond to specific questionnaires. Occasionally they could hold a short discussion session with the entire class without the instructor present. In an upper-division class for majors, this team could work with the class to determine what prerequisites needed to be reviewed for rapid progress in that class; see [3]. Another aid for this process is computer programs (available from http://math.arizona.edu/software/uasft.html) which help in identifying any background weaknesses.

Other instructors have used “student feedback teams” in different ways. Some meet with a different set of students each time; others meet after each class. A more formal process, usually used once during a term, involves “student focus groups” (see the article by Patricia Shure in this volume, p. 164). The essence of this process is for a focus group leader (neither the instructor nor one of the students) to take half a class period and have students, in small groups, respond to a set of questions about the course and instructor. The groups then report back and the leader strives for consensus.

Other suggestions for obtaining feedback from large classes may be found in [4].

References

[1] Hughes-Hallett, D. et al. Calculus, John Wiley & Sons Inc., 1994.

[2] Lomen, D. and Lovelock, D. Exploring Differential Equations via Graphics and Data, John Wiley & Sons Inc., 1996.

[3] Schwartz, R. “Improving Course Quality with Student Management Teams,” ASEE Prism, January 1996, pp. 19–23.

[4] Silva, E.M. and Hom, C.L. “Personalized Teaching in Large Classes,” Primus 6, 1996, pp. 325–336.

Early Student Feedback

Patricia Shure
University of Michigan

In introductory courses in mathematics at the University of Michigan, an instructional consultant visits the class one-third of the way into the semester. This observer holds a discussion with the class, in the absence of the instructor, about how the course is going, and provides feedback to the instructor.

Background and Purpose

Each fall at the University of Michigan, most of the incoming students who take mathematics enroll in one of the Department’s three large introductory courses. These three courses, Data, Functions, and Graphs; Calculus I; and Calculus II, enroll a total of 3600 students each fall. Not only must the department design and run these courses effectively, but it is clearly crucial that we find out whether the courses are succeeding. Each course is systematically monitored by all the conventional methods, such as uniform examinations, classroom visits, and student evaluations, but one assessment technique has proved especially helpful. This procedure, which we call Early Student Feedback, has the advantage of simultaneously giving both students and instructors an opportunity to review the goals of the course and providing a simple mechanism for improving instruction.

The courses have been planned around a set of specific educational objectives which extend beyond the topics we list in our syllabi. We are interested in encouraging the transfer of knowledge across the boundaries between disciplines, so our course goals reflect the needs of the many other disciplines whose students we teach. For example:

Course Goals for Introductory Calculus

1) Establish constructive student attitudes about the value of mathematics by highlighting its link to the real world.

2) Persuade more students to continue in subsequent mathematics and science courses.

3) Increase faculty commitment to the course by increasing the amount of student-faculty contact during each class period and having faculty grade students’ homework themselves.

4) Develop a wide base of calculus knowledge including: basic skills, understanding of concepts, geometric visualization, and the thought processes of problem-solving, predicting, and generalizing.

5) Strengthen students’ general academic skills such as: critical thinking, writing, giving clear verbal explanations, understanding and using technology, and working collaboratively.

6) Improve students’ ability to give reasonable descriptions and to form valid judgments based on quantitative information.

All three of our large first-year courses are run in the same multi-section framework, a framework which makes achieving these objectives possible. Classes of 28–30 are taught by a mix of senior faculty, junior faculty who come to Michigan for a 3-year period, mathematics graduate students, and a few visitors to the department. Here is the general format.

Key Features of Introductory Courses

1) Course content: The course emphasizes the underlying concepts and incorporates challenging real-world problems.

2) Textbook: The textbook emphasizes the need to understand problems numerically, graphically, and through English descriptions, as well as by the traditional algebraic approach.

3) Classroom atmosphere: The classroom environment uses cooperative learning and promotes experimentation by students.


4) Team homework assignments: A significant portion of each student’s grade is based on solutions to interesting problems submitted jointly with a team of three other students and graded by the instructors.

5) Technology: Graphing calculators are used throughout the introductory courses.

6) Student responsibility: Students are required to read the textbook, discuss the problems with other students, and write full essay answers to most exercises.

Early Student Feedback was originally introduced to the Mathematics Department by Beverly Black of the Center for Research on Learning and Teaching (CRLT) at the University of Michigan. The procedure we currently use evolved from the Small Group Instructional Diagnosis (SGID), described by Redmond and Clark [1].

Method

A) Overview: Approximately one-third of the way into the term, each class is visited by an instructional consultant. The consultant observes the class until there are about 20 minutes left in the period. Then the instructor leaves the room and the consultant takes over to run a feedback session with the class. Soon afterward, the consultant and the instructor have a follow-up meeting to discuss the results and plan teaching adjustments. The results of each Early Student Feedback procedure are confidential.

B) Consultants: The consultants we use are the instructors who run our professional development program. They are upper-level mathematics graduate students and faculty members. Consultants are trained according to CRLT guidelines for observing classes. These guidelines stress the importance of having instructors reflect on their plans and goals for each class session in light of the overall course goals.

C) Observing the class: During the observation period, the consultant objectively records everything as it happens: the classroom setup, what the instructor says and does, student interactions with the instructor and with each other, etc.

D) Running the feedback session: First, the consultant explains to the class that the Mathematics Department routinely conducts these sessions at this point in the term and briefly explains the procedure. Then the class is divided into groups (of four or five students each) and the groups are given 7 minutes to discuss, reach a consensus, and record their responses on the Early Feedback Form:

EARLY FEEDBACK FORM

In your small group, please discuss the following categories and come to a consensus on what should be recorded for the instructor. Using detailed examples and specific suggestions will make your comments more useful to the instructor. Please have a recorder write down the comments that you all agree on.

List the major strengths in this course. What is helping you learn in the course? (Space is provided for five responses.)

List changes that could be made in the course to assist your learning. (Space is provided for five responses.)

While the students are talking, the consultant writes two headings on the board:

Strengths Changes

After getting a volunteer to copy what will be written on the board, the consultant calls on the groups in turn to read aloud one “strength.” One by one, the consultant records the comments on the board in essentially the students’ own words while continually checking for clarity and consensus. When all the “strengths” have been read, the consultant uses the same procedure to generate a list of “changes.”

E) Debriefing the instructor: Before the meeting to debrief the instructor, the consultant makes a two-page list of the comments from the board (strengths on one page and changes on the other). The beginning of this meeting is an opportunity for the consultant to get instructors to talk about how the class is going, what goals they had for the class, and what they see as their own possible strengths and weaknesses. The consultant then gives the instructor the list of students’ comments (strengths first) and goes over it one point at a time. The consultant’s own observation of the beginning of the class period provides valuable detail and backup for the students’ interpretation of their experience. The final step is to plan with the instructor how to respond to the students and what adjustments should be made. For instance, if students suggest that the instructor spend more time going over homework problems, the answer may well be “OK.” But if they want to do less writing (“after all, this is a mathematics class, not an English class”), the instructor will need to spend more time explaining the connection between writing clear explanations and understanding the ideas.

Findings

During the feedback process, we find that everyone involved in the course, from the individual student to the instructor to the course director, benefits from reflecting on the goals of the course. For example, students often complain (during the feedback session) that their instructor is not teaching them, that they have to teach themselves. They think that a “good” teacher should lead them through each problem step by step. In response, the consultant gets a chance to talk with the instructor about students’ perception of mathematics as simply performing procedures, whereas what we want them to learn is how to become problem solvers. Commonly, both the students and their instructors respond well to the Early Student Feedback process. The students are pleased that the Mathematics Department cares what they think. We find that they will discuss the course freely in this setting. Similarly, there has been a uniformly positive response from instructors at all levels, from first-time graduate students to senior faculty. All instructors feel more confident about their teaching after hearing such comments as “he’s always helpful and available” or “she knows when we’re having trouble.” In fact, since these feedback sessions have proved so useful in the introductory courses, the Department has begun to use them occasionally in more advanced courses.

Use of Findings

Each Early Student Feedback session is confidential, and the results are never recorded on an individual instructor’s record. However, the consultants talk over the results in general, looking for patterns of responses. We ask questions about how the courses are going. Do students see the value of the homework teams? Is there lots of interaction in the classrooms? Do the students seem to be involved with the material? Are the instructors spending too much time on their teaching? The answers give a clear profile of each course. The timing of the procedure allows us to reinforce the course’s educational objectives while the course is still in progress. Furthermore, it gives us the opportunity to improve the instruction in each individual classroom and make any necessary adjustments to the courses themselves.

Success Factors

Early Student Feedback has been used successfully in many courses throughout the University at the request of individual instructors. But in the Mathematics Department, we require it every term in every section. Making the process mandatory had the effect of making it seem like a routine (and hence less threatening) part of each term. Since almost everyone has had a consultation, there are always many experienced instructors who spontaneously tell newcomers that the sessions are very beneficial. It is easy to calculate the “cost” of using Early Student Feedback as an assessment technique:

Students: 20 minutes of class time
Instructors: 1/2 to 1 hour for debriefing
Consultants: (1 hour class visit) + (1 hour preparation time) + (1/2 to 1 hour debriefing)

The method is certainly cheap in comparison to the amount of information it generates about teaching and learning.

Reference

[1] Redmond, M.V. and Clark, D.J. “A practical approach to improving teaching,” AAHE Bulletin 1(9–10), 1982.

Friendly Course Evaluations

Janet Heine Barnett
University of Southern Colorado

By having students write a letter to a friend about your course, you can get useful information both on what the students have learned and on what they thought of the course.

Dear Professor Math,

Thank you for your interest in my use of topic letters in lower-division mathematics courses. I am sending you more information on these assignments in letter format so that you will have some sense of their style. Let me begin by outlining for you their Background and Purpose.

For a number of years, I have been using student writing both to develop and to evaluate conceptual understanding in my lower-division courses. The topic letters are one such assignment. In these letters, students are asked to write about mathematics to someone they know; for instance, I might ask them to describe the relation between various concepts (e.g., what is the relation between exponential and logarithmic functions?), the relation of a concept to previously studied concepts (e.g., how does knowledge of the logarithmic function extend our calculus repertoire?), or the relation of some concept to its motivating problem (e.g., how does statistics address the problem of prediction?). Because these types of questions are conceptual in nature, I have used topic letters primarily in my calculus and liberal arts courses, although there are ideas in more skills-related courses like college algebra for which a letter could also be used.

I am fortunate that class sizes at USC average 25–45, so that evaluating the letters is manageable. Initially, my objective with the topic letters was solely to evaluate student conceptual understanding. I have found, however, that they also provide an evaluation of how well I’ve done at designing course experiences to develop conceptual insight and understanding. Unlike traditional anonymous student questionnaires, which focus almost exclusively on teaching mechanics like “punctuality,” the topic letters tell me something about the quality of the learning environment I tried to create, my success in creating it, and its success in building conceptual understanding. An end-of-semester “Letter to a Friend” addressing the course as a whole was therefore a natural extension of the topic letter assignment.

My Method is illustrated in these partial instructions for an End-of-Semester letter in first-semester calculus:

You have a friend who foolishly decided to attend another university in the east. (Like Harvard has anything on USC.) This friend has heard that calculus is a requirement for many disciplines, and is considering a course in calculus next semester to keep all major options open. Knowing that you are just completing this class, your friend has written to you asking for your insights.

Having never heard much about what calculus really is, this person would like to know what kinds of problems calculus can solve, what its most important ideas are, and what, if anything, is interesting or exciting about the subject.

Write a letter to your friend giving your answers and insights into these questions...

The audience for the assignment is set very deliberately. I emphasize to students that they must write to someone who has never studied the mathematics in question, but who has the same working knowledge that they had at the start of the topic, or the start of the course. Without such an audience, students are less inclined to describe their personal understanding of the mathematics, and more inclined to try to impress me with the use of formal terms that they may or may not understand...and I will probably let them.

I do assign a grade to the letters, based primarily on the insight and mathematical correctness displayed. I have found that it is important to stress that quality of insight (and not quantity of facts) is the key feature of a strong letter. Students are instructed to write no more or less than is needed to convey the ideas clearly. To allow them as much freedom as possible in how they do this, I do not set page limits or other formatting requirements. (Most letters fall into the 3–5 typed page range.)

Students who submit weak topic letters are encouraged to revise them, and I provide each student with specific written suggestions on areas that could be improved. Revisions of the end-of-semester letters are not permitted, since they are collected during the last week of class. Since my main purpose with the end-of-semester letter is to get a more meaningful course evaluation, I simply give all students full credit as long as they seem to have made a serious effort with it.

Findings of note include the fact that the grading itself can take some time: 10–15 minutes per letter, since I read each one twice (first for comments and a preliminary grade, then again to ensure consistency and to assign the actual grade). Most of this time goes to writing comments; in larger classes, this led me to require fewer letters than I might otherwise, with four being the maximum I would consider for any class.

The time is worth it. First, the students receive an excellent opportunity to synthesize the material. Most students find this useful (although a few question the place of writing in a mathematics class), and quite a few students go out of their way to be creative. I’ve gotten interesting letters to a variety of real and imagined friends, family members, pets, plus the occasional politician and movie star, which makes the letters fun for me to read. I also enjoy the letters that take a more straightforward approach to the assignment, since even those students use their own voices in the composition, something which rarely happens in other assignments.

The value of the topic letters for course evaluation became especially clear to me when I first began using topic letters in my liberal arts mathematics course about two years ago. I had been using them for some time before that in other courses, and knew that the assignment sheet needed to warn students (loudly) that working examples was not the point, and encourage them (strongly) to convey the “big picture” in a way meaningful to their “friend.” Still, most students included numerous examples in their letters, and not much else. After reflecting on this, I realized that much of our class time focused on just that: working examples, and not much else. In fact, several of the letters came straight from those class examples. Naturally, we looked at conceptual explanations in class (as did the book), but these were not included in the letters. Was this just because we spent more time on the examples? And why was I spending so much more time on examples? With further reflection, I realized that virtually all my homework, quiz, and test questions were focusing on mechanics; the letters, I thought, would address the conceptual side of the course. But with such a heavy emphasis on mechanics in the course grade, I had (somewhat unwittingly) emphasized examples in class in order to prepare students to earn those points, and students responded to this in their letters.

My own Use of Findings came through various modifications to the course. After some initial experimentation, I replaced in-class quizzes based on mechanics with take-home quizzes that require students to discuss and apply concepts. I’ve also developed a set of “course questions” which I use on the final exam, after giving them (verbatim) to the students at the beginning of the semester and returning to them during the semester as we cover the relevant material. Along with these changes in the course assessment structure, I modified instruction through the use of more discussions centering on the course questions and concepts, and less emphasis on examples of mechanics in lectures. These modifications are now having the desired effect: the topic letters are much better, as are students’ responses to the conceptual questions on various tests and quizzes, and their mastery of mechanics has not suffered.

The information I gain from the end-of-semester letters also helps me to fine-tune course instruction and the overall syllabus, as well as instruction in specific topic areas. To get course evaluation information from these letters, I watch for comments that indicate the value of the various activities we did during the semester (such as advice to their friend concerning important things to do), an indication of the emphasis placed on concepts versus skills (such as a list of rules with no concepts mentioned), or an indication of a concept that may have been under- (or over-) emphasized (such as no mention of the Fundamental Theorem in a calculus class).

Some Success Factors to consider have already been mentioned; the time concern is one which I would reiterate. With the topic letters, there is also the difficulty of sorting course evaluation information from student assessment information; for example, when I get a letter emphasizing examples and algorithms instead of concepts (as I still do), it may not be due to the class experiences provided. The student may not have understood my expectations or the concepts, or may simply have taken the easy way out. This means that I must look at the collection of letters, as well as the individual letters, to get a “course” reading. All these factors make it essential to set the objectives of the letters carefully, so that both you and the students gain from the time investment.

Another concern I have is the fact that the letters are not anonymous, which may limit their validity as a course evaluation tool. On the other hand, some of the things students put in their letters suggest they are not overly concerned with anonymity. In some of these, I suspect the student is going for shock value (with tales of drinking parties and other such escapades). As far as possible, I sort out the mathematical and course information in these without reacting to their provocative side. A more distressing type of “no-holds-barred” letter I have received comes from students who did not do well in the course, or whose confidence was shaken along the way, and who tell their friend very clearly how this has affected them. Although I’ve never read any anger in these, they are very difficult to read and comment on. Despite that (or perhaps because of that), this type of letter has given me good information about the course, and about what kinds of experiences are discouraging to students.

I hope you too will find the letters valuable, and that you’ll share your experiences with me. Best wishes... J.B.

The Course Portfolio in Mathematics: Capturing the Scholarship in Teaching

Steven R. Dunbar
University of Nebraska-Lincoln

The course portfolio is an alternative to the standard course questionnaire for summing up a course. The instructor collects data throughout the semester into a course portfolio, which can then be used by that instructor or passed on to others teaching the course.

Background and Purpose

The course portfolio documents course and classroom activity in a format open to reflection, peer review, standards evaluation, and discussion. It can demonstrate students’ intellectual development of the concepts in the course, and it documents that a transformation of students occurred because of the intellectual efforts of the instructor.

Bill Cerbin, at the University of Wisconsin–La Crosse, was one of the first to form the idea of the course portfolio. The American Association for Higher Education (AAHE) Teaching Initiative Project “From Idea to Prototype: The Peer Review of Teaching” promoted the portfolio in a menu of strategies [2, 3].

According to a report from the Joint Policy Board for Mathematics, a major obstacle to including evaluation of teaching in the reward system is the absence of evaluation methods. The course portfolio is one response to the need for better tools to document and evaluate teaching. It puts faculty in charge of monitoring, documenting, and improving the quality of teaching and learning. It provides a vehicle for the faculty member to reflect on what’s working and what’s not, and for communicating that reflective wisdom and practice to colleagues.

Peer feedback on the course portfolio provides professional accountability and improves the process of teaching. Both the presenter and the reader learn new pedagogical methods to improve their teaching.

I have created a course portfolio for a course on Principles of Operations Research. Our department teaches this course yearly, in one or two sections, both to senior mathematics majors and, as a service course, to senior students in actuarial science. From a prerequisite of linear algebra and probability, the course covers linear optimization, queuing theory, and decision analysis.

Method

The word “portfolio” conjures up a vision of a collection of papers, or a binder of whatever was available, perhaps assembled in haste. But this is not what I mean by the “course portfolio.” The course portfolio starts with the course syllabus, with explicit mathematical goals for students in the course. The instructor measures progress toward the goals through a pre-course student background knowledge probe; informal course assessment by students; homework exercises, solutions, and scores; in-class active-learning exercises; tests; labs or projects; formal student course evaluations; and a post-course evaluation by the instructor.

A “course portfolio” is not a “teaching portfolio.” A teaching portfolio is a larger, longer document that establishes, over a period of time and a range of courses, that you are an effective teacher.

Here are some questions to address in a portfolio:

• What were you trying to accomplish?


• Why were these goals selected?

• Did the course meet the goals? How do you know?

• What problems were encountered in meeting the goals?

• How did you conduct the course?

• How did you challenge students?

• What changes in topics, materials, and assignments are necessary?

For the course portfolio on Principles of Operations Research, I began with 5 prerequisite skills for the course, including the ability to solve systems of equations by row reduction and to compute conditional probabilities. I listed 5 specific goals of the course, including formulating and solving optimization models, and formulating and analyzing queuing models. All prerequisite skills and goals of the course appeared on the course information sheet distributed to students on the first day of class. I selected mathematical modeling as a major focus of the course, since I believed that most senior mathematics majors were already adept at solving formulated problems but needed practice in turning operations research situations into a mathematical model. The course information sheet became the first item in the course portfolio.

For the first day of class, I created a “background knowledge probe” testing student knowledge of both the prerequisite skills and the goals of the course. Problems on the background knowledge probe were of the sort that would typically be on the final exam for the course or its prerequisites. A copy of the background knowledge probe, together with an item analysis of student scores, went into the course portfolio. Comparing this with the scores on the final exam gives a measurement of student learning and progress.

I included homework assignments, copies of exams, and course projects in the course portfolio as evidence of the direction of the course toward the goals. Occasional minute papers and in-class assessments measured student progress through the term. I collated and summarized information from the minute papers and included this in the course portfolio as the “student voice.” I also did an item analysis of the scores from tests as a further refinement in measuring student gains in knowledge. Angelo and Cross [1] have samples and ideas for constructing the background knowledge probe, lecture assessments, and other assessment instruments that become evidence in the course portfolio.

The final element of the course portfolio was a “course reflection memo.” In this memo, created at the end of the course, I reflected on the appropriateness of the goals, the student progress on the goals, and recommendations for the course. I included the course reflection as the second element in the course portfolio, directly after the course information sheet. There is no recipe, algorithm, or checklist for preparing a course portfolio. In practice, course portfolios will be as varied as the faculty preparing them, or the courses taught.

Preparing a course portfolio requires time, which is always in short supply. Nevertheless, a course portfolio is not such a daunting task, since we already keep course notes from the last time we taught a course. My experience is that the course portfolio requires on average about one hour per week through the term of the course. The time is not evenly distributed, however, since careful preparation of the goals and organization of the course takes place at the beginning of the term. Assessing student learning as a whole is a large part of the portfolio that occurs best at the end of the term. Thoughtful reflection on the course and its conduct and results naturally occurs at the end of the term. Fortunately, these are the parts of the term when more time is available. During the term, collection of course materials and evidence of student learning is easy and automatic.

Findings

In the course on Operations Research that I documented with a course portfolio, I found that students generally came to the course with good prerequisite skills. This factual evidence was an encouraging counterpoint to “hallway talk” about student learning. I was able to document that students improved their homework solution writing through the course of the semester. The classroom assessments showed that the students generally understood the important points in my lectures, with some exceptions. The assessments also showed that the students generally were not reading the material before coming to class, so I will need to figure out some way to enforce text reading in the future. Analysis of the scores on the background knowledge probe, the exams, and the final exam documented that students learned the modeling and analysis of linear programming problems acceptably well, but had slightly more trouble with queuing models and dynamic programming. The course portfolio was generally able to document that student learning had taken place in the course.

Use of Findings

One immediate benefit of the portfolio is a course record. A course portfolio would be especially useful to new faculty, or to faculty teaching a course for the first time. The portfolio would give that faculty member a sense of coverage, the level of sophistication in presenting the material, and standards of achievement. The course portfolio also gives a platform for successive development of a course by several faculty. This year, two other instructors taught the course on Operations Research. At the beginning of the term, I turned over my portfolio to them as a guide for constructing their course. The course portfolio informed their thinking about the structure and pace of the course, and helped them make adjustments in the syllabus.


Having a course portfolio presents the possibility of using the material for external evaluation. Peers are essential for feedback on teaching because they are the most familiar with the content on a substantive level.

Success Factors

The AAHE Peer Review Project [2] has some advice about course portfolios:

• Seek agreement at the outset about the purposes of portfolios, who will use the information, who owns it, and what’s at stake.

• Be selective. The power of the portfolio comes from sampling performance, not amassing every possible scrap of information.

• Think of the portfolio as an argument, a case, a thesis with relevant evidence and examples cited.

• Organize the portfolio around goals, either your own goals or departmental goals.

• Include a variety of kinds of evidence from a variety of sources.

• Provide reflective commentary on evidence. Reflective commentary is useful in revealing the pedagogical thinking behind various kinds of evidence and helping readers know what to look for.

• Experiment with various formats and structures; be deliberate about whether portfolios are working and how you can refine the structure.

It may be useful to include the actual course syllabus, to show what the instructor actually covered in the course in comparison with what the instructor intended to cover.

A variant is the “focused portfolio,” which concentrates on one specific aspect or goal of a mathematics course and the progress made toward that goal. For example, a course portfolio might focus on the integration of graphing calculators or on the transition of students from “problem-solving” to “proof-creating.” In any case, there should be a cover letter or preface to the reader, briefly explaining the purpose and methods of the portfolio.

An important addition to the portfolio is the “student voice,” so that the portfolio clearly represents not only what the instructor saw in the course, but also what the students got out of it. Informal class assessments, formal student evaluations, and samples of homework and student projects are ways of including the student voice.

References

[1] Angelo, T.A., and Cross, K.P. Classroom Assessment Techniques, second edition. Jossey-Bass, San Francisco, 1993.

[2] Hutchings, P. From Idea to Prototype: The Peer Review of Teaching. American Association for Higher Education, 1995.

[3] Hutchings, P. Making Teaching Community Property: A Menu for Peer Collaboration and Peer Review. American Association for Higher Education, 1996.


Part III
Departmental Assessment Initiatives

Introduction

Sandra Z. Keith
St. Cloud State University

Mathematics departments are increasingly being asked, both internally by other departments and externally by accreditors, to reflect on their role on campus. At one time, a mathematics department was acting within the expectations of a college by providing research and teaching in a way that weeded out the poorer performers and challenged and pushed forward the elite. With the diversity of today’s student clientele, changing expectations about the nature of education, the jobs crisis, and the downsizing of mathematics departments, we are having to examine the broader issue of how we can provide a positive, well-rounded learning environment for all students.

As more and more disciplines require mathematics, outside expectations of mathematics departments have risen; other departments are now competing for ownership of some of our courses. The issue of effective teaching in mathematics, for example, is a frequently discussed problem for students and faculty on campus. So is the content we are teaching, say, in the calculus sequence. And we shake our heads at what to do about general education courses in physics and chemistry in which students can’t use mathematics because they can’t do the simple, prerequisite algebra. However, assessment does not function at its best when aspects are pulled apart and examined under a microscope. The evaluation of mathematics teaching, or the curriculum, or the quantitative capabilities of our students cannot be separated from the evaluation of broader issues for which the department as a whole is responsible. Some of these issues involve finding ways to properly place and advise students, networking with client disciplines, examining new curricula and methods of teaching, and demanding and maintaining basic quantitative literacy requirements that will assure that our students graduate at least minimally competent.

All of these major departmental initiatives must operate within an assessment framework. Until recently, however, mathematics departments have not generally been at ease with the idea of going public. Even if we move beyond the idea that the word “assessment” is mere jargon, perhaps we feel we are being expected to compromise our high standards, which are naturally opaque to others, or that our freedom to teach in ways that intuitively feel best is being stripped from us. Many of us, overworked as we are, may be content to leave assessment issues to administrators. However, it has become clear through a variety of publicized situations that if we neglect to deal with new expectations, then decisions about us will be made by others, with the result that we will be marginalized.

Some mathematics departments do not change; others may adopt changes (such as a change in a text or curriculum, a change in the use of technology, or a change in a policy of placement or admittance of transfer students) without ever paying explicit attention to the results of these changes. This unexamined approach to teaching will probably not be acceptable in the future.

In this section we have searched for efforts by departments which have created changes that feature assessment as an integral component. The method of assessment-as-process helps create an educational environment that is open and flexible in ways which benefit faculty, students, and the institution as a whole. For example, everyone benefits when placement is running efficiently. We are not as concerned that the final results of the projects should be awesomely successful (although some are) as we are by the processes by which departments have examined and made public the results of their efforts. Good assessment is not a finished product but the input for more changes. In fact, most of the reform efforts of departments presented here are not at a stage of completion. Rather, the purpose of this section is to acknowledge that on campuses now there is some ferment on the issue of assessment, and that questions are being raised and brooded about for which we hope to have provided some plausible starting solutions.



Our contributions are highly diverse; we will see the efforts of small and large schools, research institutions and small inner city schools. Retention of majors is the issue in some cases, while elsewhere the focus is on developmental mathematics or general education mathematics. Some programs are personally focused on listening to students; others emphasize dealing collaboratively with faculty from other departments, and still others (not necessarily exclusively) have conducted massive analytical studies of student performance over the years for purposes of placement or for charting the success of a new course. We will learn about the efforts of one school with a bias against assessment to lift itself up by the bootstraps, and about some other schools’ large-scale, pro-active assessment programs that operate with an assessment director/guide and encouraging, assessment-conscious administrators. We will also hear from some experts on quantitative literacy across campus, learning theory, and calculus reform so we may view concrete assessment practices against this theoretical backdrop.

In the diversity of methods exhibited here, it is interesting to observe how the rhetorical dimensions of assessment function: how the use of language in the interpretation of results has the power to influence opinion and persuade. Rhetorical interpretations of assessment procedures may indeed be intended to sell a program, but if this strikes us as an unnatural use of assessment, we would be equally unaccepting of the language of politics, law, business, and journalism. We would never recommend blanket adoption of these programs — this would not even be feasible. Rather we hope that the spectrum of efforts presented here, achieved through the obvious commitment and energy of the mathematics departments represented, will suggest ideas about what is possible and will serve to inspire the examined way.

Here is an overview of this section.

Placement and Advising

Judith Cederberg discusses a small college, nurturing a large calculus clientele in a flexible, varied calculus program, which recognizes the need for a careful placement procedure and studies the effectiveness of this effort with statistics that help predict success. Placement is also the issue at a large comprehensive school, in an essay by Donna Krawczyk and Elias Toubassi explaining, among other initiatives the department took to improve its accessibility, their Mathematics Readiness Testing program. Statistics about the reliability of the testing are studied, and the authors observe that their results have been found useful by other departments on their campus. At many institutions, advising operates at a minimal level — it may be the freshmen, or possibly even the seniors, who are forgotten. Steve Doblin and Wallace Pye, from a large budding university, have concentrated on the importance of advising students at all levels; from providing mentors for freshmen to surveying graduating students and alumni, this department operates with continuing feedback. Materials which carefully describe the program are distributed to students.

Quantitative Literacy

When our students graduate from four or more years of college, are they prepared for the quantitative understanding the world will require of them? This issue, which could be one of the most important facing mathematics departments today, has suffered some neglect from the mathematics community, in part because of the varied ways in which liberal arts courses are perceived and the general confusion about how to establish uniform standards and make them stick. Guidelines from the MAA for quantitative literacy have been spelled out in a document authored by Linda Sons, who offers an assessment of how this document was actually used in a quantitative literacy program at her large university. For a variety of liberal arts mathematics courses and calculus, this school tested students and graded the results by a uniform standard, then compared results with grades in the courses. Results led to making curricular changes in some of the courses. Some of us may have tried student-sensitive teaching approaches in higher level courses, but when it comes to teaching basic skills, there seems to be a general taboo about deviating from a fixed syllabus. Cheryl Lubinski and Albert Otto ask, what is the point of teaching students something if they’re not learning? They describe a general education course that operates in a learning-theoretic mold. For them, the content is less important than the process by which the students construct their learning. Elsewhere, Mark Michael describes a pre- and post-testing system in a liberal arts mathematics course that, among other things, raises some interesting questions about testing in general, such as how students may be reluctant to take such testing seriously, and why some students may actually appear to be going backwards in their learning. With an even broader perspective, how does one assure that all students graduate quantitatively literate at a large collective bargaining school with little coordination in its general education program and a tradition of diversity and disagreement among faculty? Is the mathematics requirement continually being watered down in other general education courses, in effect bowing to students’ inability? What do students perceive, and how can one interpret the broad findings? A non-mathematician, Philip Keith, discusses the ramifications of accommodating various departments’ views of quantitative literacy and reports on using a simple survey to get an institution on track. How realistically can such broad data be interpreted? The value of this survey lay in its power to prod the faculty to begin thinking and talking about the subject.


Developmental Mathematics

Developmental (remedial) mathematics is the subject of Eileen Poiani’s report from an inner city school with a diverse, multi-cultural clientele. Deeply committed to raising students’ ability in mathematics, this school has a rich and expansive support program. They ask: does developmental mathematics help, or is it a hopeless cause? As a recipient of grants, the school needed an assessment plan, and three major studies were completed over the course of ten years, with observations about retention and graduation data and performance in developmental and subsequent mathematics courses. As this study continues, it looks more in depth at student characteristics.

Mathematics in a Service Role

How is a mathematics department supposed to please its client disciplines? From a comprehensive university, Curt Chipman discusses a variety of ways in which an enterprising mathematics department created a web of responsible persons. Goals for students and instructors are up-front and across the board, course leaders are designated, and committees and faculty liaisons are used to the fullest for discussions on placement and course content. In another paper, William Martin and Steven Bauman discuss how a university which serves thousands of students begins by asking teachers in other disciplines — not for a “wish list” — but for a practical set of specific quantitative knowledge that will be essential in science courses. As determined by the mathematics department and client disciplines, pre-tests are designed, with the questions on the tests asking about students’ conceptual understanding of the determined skills. Results are used for discussions with students, the mathematics faculty, and faculty in client disciplines, creating a clear view for everyone of what needs to be learned and what has been missed. From a small university in Florida, Marilyn Repsher and J. Rody Borg, professors of mathematics and economics respectively, discuss their experiences in team teaching a course in business mathematics in which mathematical topics are encountered in the context of economic reasoning. Opening up a course to both mathematics and business faculty stimulates and makes public dialogue about the issues of a course that straddles two departments. It is hard not to notice that seniors graduating in mathematics and our client disciplines often take longer to graduate, and minority students, even longer. Martin Bonsangue explores this issue in his “Senior Bulge” study, commissioned by the Chancellor’s Office of the California State University. The results point directly to the problems of transfer students, and Bonsangue suggests key areas for reform. At West Point, when changes were felt to be in order, Richard West conducted an in-depth assessment study of the mathematics program. With careful attention to the needs of client disciplines (creating a program more helpful to students), the department created a brand new, successful curriculum — cohorts are studied and analyzed using a model by Fullan. Robert Olin and Lin Scruggs describe how a large technical institute has taken a “pro-active” stance on assessment: hiring Scruggs as an assessment coordinator, this institute has been able to create an assessment program that looks at the full spectrum of responsibilities of the department. Among other things, it includes statistical studies to improve curriculum, teaching, and relations with the rest of the university. These authors point out the benefits of using assessment to respond carefully to questions that are being asked and that will be asked more and more in the future. An unexpected benefit of this program proved to be increased student morale.

New Curriculum Approaches

In a discussion piece, Ed Dubinsky explains an approach to teaching based on learning theory. He asks how one can really measure learning in a calculus course. This question stimulates an approach in which continual, on-going assessment feeds back into the learning environment. The efforts of Dubinsky and RUMEC have often focused on the assessment of learning; one can observe in this volume how the methods he suggests here have influenced many of our authors.

Reform efforts in teaching (particularly in calculus) have been the subject of much discussion, and we could not begin to accommodate all the research currently conducted as to the success of such programs. Here we touch on a few that represent different schools and different approaches. Sue Ganter offers a preliminary report of a study at the National Science Foundation discussing what NSF has looked for and what it has found, with directions for future study. While it’s true that the emphasis has shifted from teaching to student learning, she reports, we are not simply looking at what students learn, but how they learn, and what sort of environment promotes learning. In some departments, even where change is desirable, inertia takes over. Darien Lauten, Karen Graham, and Joan Ferrini-Mundy have created a practical survey that departments might use to initiate discussion about goals and expectations about what a calculus course should achieve. Nancy Baxter Hastings discusses a program of assessment that stresses breadth, from pre- and post-testing, to journals, comparative test questions, interviews of students, end-of-semester attitudinal questionnaires, and more. Good assessment is an interaction between the formative and the summative. Larger programs demand more quantified responses. In their separate papers, Joel Silverberg and Keith Schwingendorf (both using C4L) and Jack Bookman (Project CALC) describe their efforts with calculus programs that have looked seriously at the data over time. These longitudinal studies are massive and broad, and not necessarily replicable. However, they indicate what is important to study, and suggest the ways in which gathering data can refine the questions we ask.


Administering a Placement Test: St. Olaf College¹

Judith N. Cederberg
St. Olaf College

A small college nurturing a large calculus clientele in a flexible calculus program recognizes the need for careful placement, and studies the effectiveness of its efforts with statistics to derive a formula for placement.

Background and Purpose

St. Olaf is a selective, church-related, liberal arts college located in Northfield, Minnesota. The majority of the 2900 students are from the North Central states, and nearly all of them come to college immediately after high school. Over the past 5 years, the median PSAT mathematics score of incoming St. Olaf students has ranged from 58 to 62, and approximately 60% were in the top 20% of their high school graduating class. In recent years, 8–10% of the St. Olaf students have graduated with a major in mathematics.

The St. Olaf Mathematics Department has 24 faculty members filling 18.9 F.T.E. positions. In addition to the major, the department also offers concentrations (minors) in statistics and computer science (there are no majors in these areas). The entry level mathematics courses include a two-semester calculus with algebra sequence (covering one semester of calculus), calculus (honors and nonhonors sections using the same text), gateways to mathematics (a non-calculus topics course designed for calculus-ready students), finite mathematics, elementary statistics, and principles of mathematics (intended for non-science majors). Each of these courses (with the exception of the first semester of the calculus with algebra course) can be used to fulfill the one-course mathematics requirement. This requirement, one of a new set of general education requirements instituted after approval of the college faculty, took effect with the incoming students in 1994. Prior to this time there was no formal mathematics course required, but most St. Olaf students took at least one of these courses to fulfill a two-course science/mathematics requirement. In the last three years, the average total of fall semester calculus enrollments of first-year students has been 498. In addition, approximately 10 first-year students enrolled in sophomore level courses each fall.

Fall Semester Calculus Enrollments

Calculus with Algebra                      28
Calculus I (1st semester)                 266
Math Analysis I (Honors Calculus I)       102
Calculus II (2nd semester)                  9
Math Analysis II (Honors Calculus II)      93

The St. Olaf mathematics placement program began in the late 1960s in response to the observation that students needed guidance in order to place themselves properly. One of the major realizations in the development of the program was the inadequacy of a single test for accurate placement. So in addition to creating a test (later replaced by MAA tests²), regression equations were developed to use admissions information and answers to subjective questions along with placement test scores to predict grade points as the one quantifiable measure of successful placement. Over the years, the one placement test evolved into three separate tests, the regression equations have been refined, and more sophisticated computer technology has been employed.

———————
¹ An earlier version of this article was published in the Fall 1994 issue of the Placement Test Newsletter, a publication of the MAA Committee on Testing.
² For detailed information on the MAA Placement Test program, contact the MAA headquarters in Washington, DC (phone: 800-741-9415).

Method

All new students are required to take one of the three mathematics placement exams, with exceptions made for non-degree international students and for students who have received a score of 4 or 5 on the College Board Calculus BC exam. The placement examinations are administered early during Week One — a week of orientation, department information sessions, placement testing and registration (there is no early registration for new students) which immediately precedes the beginning of fall semester. All three exams contain an initial list of subjective questions asking students for information about (1) their motivation for taking mathematics; (2) the number of terms of mathematics they plan to take; (3) the area in which they expect to major; (4) what their last mathematics course was, and their grade in this course; and (5) how extensively they have used calculators. Questions on the regular and advanced exams also ask for (6) how much trigonometry and calculus they have had; and (7) the mathematics course in which they think they should enroll. This latter question provides helpful information, but also indicates the need for providing guidance to the student, since many would not place themselves into the proper course.

All three are timed exams with a ninety-minute limit. Students taking the Advanced and Regular Exams are allowed to use calculators without a QWERTY keyboard. Currently, students are not allowed to use calculators on the Basic Exam. The three exams and their audiences are listed below.

1. Advanced Exam: Designed for students who have had at least one semester of calculus and want to be considered for placement beyond first semester calculus at St. Olaf. This test consists of a locally written exam covering topics in first semester calculus (25 questions) together with a modified version of an MAA trigonometry and elementary functions exam (15 questions each). Approximately 220 students take this exam.

2. Regular Exam: Designed for students with standard mathematics backgrounds who intend to take calculus sometime during their college career. This test consists of a trigonometry and functions section (identical to that on the Advanced Exam) together with an MAA calculator-based algebra exam (32 questions). Approximately 425 students take this exam.

3. Basic Exam: Designed for students with weaker mathematics backgrounds who have no plans to take calculus and who are hesitant about taking any mathematics. This test consists of an MAA exam over arithmetic and basic skills (32 questions) and an MAA algebra exam (32 questions). Approximately 120 students take this exam.

Following the exams, the questionnaire and test data are scanned into the computer, which grades the tests, then merges the test scores with the admissions data and other relevant information to predict a grade. Finally, the computer assigns a recommendation to each student using a cutoff program that is refined from year to year. Borderline and special cases (e.g., low class rank from a selective high school) are considered separately. Individual student recommendations are printed on labels which are then pasted on letters and distributed to student mail boxes the next morning. In addition, each academic advisor is sent an electronic message containing the results for their advisees.

The current St. Olaf mathematics placement program is administered by a member of the mathematics faculty who is given a half-course teaching credit (out of a six-course standard load) for serving as director. As the present Director of Mathematics Placement, I send each new student a letter early in the summer describing the placement process and handle questions and concerns that arise. I also manage the details of giving the exams and reporting the results. Following the exams, I counsel students who have questions about their placement recommendation; notify instructors of students in their classes who did not take an examination or who did not follow the placement recommendation; and assist students with changes among calculus sections.

Findings

The placement recommendations are computed using a large number of regression equations. Dr. Richard Kleber, a mathematics faculty member and our senior statistician, has spent numerous years developing and refining these equations. Each summer he has run a series of regression studies in order to “fine tune” the regression equations. There are regression equations for each of 24 cases, depending on the information available for a particular student. These equations are used in five different sets as follows:

• One set of equations is used to predict calculus grades for all students based only on information available from the admissions office, in case the student never takes the exam.

• Two sets are used to distinguish between students who need to take Calculus with Algebra (or who need to do preliminary work before enrolling in this course) and those who are ready for Calculus I. One of these sets is used for students who took the Basic Exam and the other set for those who took the Regular Exam.

• Two more sets are used to recommend honors calculus versus regular calculus. One of these sets is used for students who took the Regular Exam and the other set for those who took the Advanced Exam. Students who take the Advanced Exam and who achieve a score of 12 or higher on the calculus portion are placed into regular or honors Calculus II. Those with especially good scores on the calculus portion who indicate that they have had at least a year of high school calculus are asked to talk to the Director of Placement about the possibility of placing beyond first-year calculus.

Use of Findings

As an example of these regression equations, the equation below predicts a grade (gr) on a four-point scale using a normalized high school rank (based on a 20 to 80 scale), PSAT and ACT math scores, a self-reported math grade from the ACT exam, and the algebra, trigonometry and function scores from the placement exam.

gr = –2.662 + 0.0385 nrank + 0.00792 psatm + 0.0470 actm + 0.218 mgrade
     + 0.0237 ascore + 0.0277 tscore + 0.0244 fscore

This particular equation, modified by information from the subjective questionnaire, is one of those used to make decisions about honors versus regular calculus placement for students who take the Regular Placement Exam. Typically, students with gr scores of 3.5 or above are given a recommendation for honors Calculus I; those with scores between 3.3 and 3.5 are told they may make their own choice between the honors and regular versions of Calculus I; and those with scores below 3.3 are given a recommendation for regular Calculus I.
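To make the mechanics concrete, the following sketch (in Python; the article does not describe the department's actual software) shows how such an equation and its cut-offs could be applied. The example student's numbers are invented.

    # Illustrative sketch only: applies the published regression equation and
    # the honors/regular cut-offs described above.  Variable names follow the
    # equation in the text; the example inputs are invented.

    def predicted_grade(nrank, psatm, actm, mgrade, ascore, tscore, fscore):
        """Predicted calculus grade (gr) on a four-point scale."""
        return (-2.662 + 0.0385 * nrank + 0.00792 * psatm + 0.0470 * actm
                + 0.218 * mgrade + 0.0237 * ascore + 0.0277 * tscore
                + 0.0244 * fscore)

    def recommendation(gr):
        """Cut-offs for students taking the Regular Exam, as given above."""
        if gr >= 3.5:
            return "honors Calculus I"
        if gr >= 3.3:
            return "student's choice of honors or regular Calculus I"
        return "regular Calculus I"

    # A hypothetical student: normalized rank 70, PSAT-M 62, ACT-M 29,
    # self-reported math grade 3.8, placement subscores 25, 10, and 11.
    gr = predicted_grade(70, 62, 29, 3.8, 25, 10, 11)
    print(f"predicted grade {gr:.2f}: {recommendation(gr)}")

In practice the article notes that such a prediction is further modified by questionnaire answers and that borderline cases are handled individually, so a sketch like this would only produce a first-pass recommendation.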

Success Factors

The success and efficiency of this placement process are due to much “behind-the-scenes” effort by several people in the mathematics department, and the process involves everyone in the department on the actual day of testing. This extensive effort is rewarded by the number of students who successfully complete their initial courses in mathematics. In the first semesters of the 1991–92 and 1992–93 academic years, 92% of the students who initially enrolled in a calculus course completed a semester of calculus; and of these, 92% received a grade of C– or above.


A Mathematics Placement and Advising Program

Donna Krawczyk and Elias Toubassi
University of Arizona

Placement was the issue at this large, comprehensive school. This article explains, among other initiatives the department took to improve its accessibility to students, a Mathematics Readiness Testing Program. Statistics measuring the reliability of the testing are included.

Background and Purpose

The University of Arizona is a large research institution with approximately 35,000 students. The mathematics department consists of 59 regular faculty, 8–12 visiting faculty, 20–25 adjunct lecturers, and 60–70 teaching and research assistants. Each semester the department offers between 250 and 300 courses serving about 10,000 undergraduates and graduate students. There are approximately 350 majors in mathematics, including mathematics engineering and education students.

During the 1970s the University of Arizona experienced a large growth in enrollment in entry-level mathematics courses. An advisory placement program was initiated in 1978. It did not work very well, since anywhere from 25% to 50% of the students did not take the test and the majority of those who took it ignored the results. The placement test was given in a single session prior to the first day of classes. This, together with an inadequate allocation of resources, resulted in average attrition rates (failures and withdrawals) of almost 50%. In the mid-1980s the mathematics department, with support from the administration, took a number of steps to reverse the trend. They included the implementation of a mandatory Mathematics Readiness Testing Program (MRT), a restructuring of beginning courses, providing students a supportive learning environment, and an outreach program to schools and community colleges. The mathematics department appointed a full-time faculty member as an MRT program coordinator with a charge to work with the student research office to analyze MRT scores. By 1988, the MRT program was fully implemented for all new students enrolling in courses from intermediate algebra to Calculus I.

Method

Over 5,000 students participate in the MRT program each year. Although placement tests are given throughout the year, most are taken during summer orientation sessions, which are mandatory for all new students. These two-day sessions accommodate 350–400 students at a time, and provide testing, advising, and registration for the following semester. During lunch on the first day, the MRT coordinator gives an explanation of the placement process and testing procedures. Students may ask questions during this hour before breaking into groups of 50 to be escorted to various testing sites. Administration of the tests, electronic scoring/coding, printing of results, and data analysis are handled by the University Testing Office.

Students choose between two timed multiple-choice tests adopted from the 1993 California Mathematics Diagnostic Testing project. The version currently being used allows a variety of calculators (including graphing calculators). For test security reasons, those with QWERTY keyboards are not allowed at this time. Test A is a 50-minute, 45-question test covering intermediate algebra skills. This test is used to place students into one of three levels of algebra or a liberal arts mathematics course. Test B is a 90-minute, 60-question test covering college algebra and trigonometry skills. It is used to determine placement in finite mathematics, pre-calculus, or calculus. The students’ choice of test depends primarily on their mathematical background and, to some extent, on their major.

The tests are scored electronically, and results are printed that evening. Students pick up their results (in mathematics, English, and languages) early the next morning prior to their academic advising sessions. Each student receives a profile sheet indicating their mathematics placement and a breakdown of their total score by topic. (Questions are grouped into 7–8 topic areas, such as linear functions, simplifying expressions, and solving equations.) Students may place into one of three levels of courses through Test A, the lowest level being a non-credit beginning algebra course offered by the local junior college (the community college course is offered on our campus), while the highest level is college algebra and trigonometry. Although Test B can also place a student into college algebra, it is primarily designed to place students into one of 5 levels above college algebra. The lowest of these includes finite mathematics and brief calculus, while the highest level is an honors version of Calculus I. Students may register for courses at their level or lower. The computerized registration process blocks students from registering for courses at a higher level.
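As a sketch only (the article does not describe the registration software's internals), the level check implied above might look like the following in Python; the course names and level numbers are invented for illustration, not the university's actual tables.

    # Hypothetical illustration of "students may register for courses at
    # their level or lower": each course carries a level, and registration
    # above the student's placement level is blocked.

    COURSE_LEVEL = {
        "Beginning Algebra (non-credit)": 1,
        "Intermediate Algebra": 2,
        "College Algebra and Trigonometry": 3,
        "Finite Mathematics": 4,
        "Calculus I": 6,
        "Honors Calculus I": 7,
    }

    def may_register(placement_level, course):
        """True if the course is at or below the student's placed level."""
        return COURSE_LEVEL[course] <= placement_level

    print(may_register(4, "Calculus I"))                        # False: blocked
    print(may_register(4, "College Algebra and Trigonometry"))  # True: allowed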

Due to the complexity of the levels of courses and the wide variety of mathematics requirements by major, experienced mathematics advisors are available to answer student questions during the 90-minute period after profiles are distributed. Although placement is initially based on a test score, other factors are considered. One of these is the high school GPA, which is automatically considered for freshmen whose scores fall near a cut-off. Appropriate GPA cut-off values have been determined by measuring success rates for courses at each level. A pilot placement program began this year in which freshmen can place into calculus based on high RSAT¹ or ACT mathematics scores without taking Test B. Transfer students are allowed to use prerequisite coursework in addition to their test results. The coordinator makes final decisions when overriding any initial placement based on test results.

———————
¹ RSAT scores are recentered SAT scores. The College Board recentered the SAT a few years ago; “RSAT” distinguishes “new” SAT scores from “old” ones.

Occasionally, students take the wrong test. In such cases, they may take the correct test on the second day of the session. These tests are hand-scored. As a general rule, the same test cannot be taken more than once within any three-month period. This has lowered the number of appeals from students claiming, “I just forgot the material, but I know it’ll come back to me once I’m in class.”

Throughout the year, the coordinator monitors the number of students placing at each level, assists other academic advisors with placement issues, and distributes information to outside groups, such as high school counselors and teachers. At the end of each year, an analysis is made of the success of the various placement procedures. Adjustments may be made and monitored during the following semester.

Findings

Although our tests assess skills, we do not base placement on the correct performance of each of these skills; for example, we do not designate questions which a student must answer entirely correctly. The minimum or cut-off scores used initially were based on data from students in the early 1980s, when the test was optional. These cut-off scores are monitored periodically and adjusted if needed. We do not have a particular goal set for the percentage of students passing courses at each level. Our goal is to set a cut-off score that minimizes the errors on either side of the cut-off; i.e., we do not want to set an unreasonably high cut-off score that would deny registration to a student with a reasonable chance for success. The tables below give some information on our students.

Percentages of new students placing at various levels in Fall 1996:

    Placing above the college algebra/trigonometry level     33.9%
    Placing at the college algebra/trigonometry level        46.4%
    Placing at the non-credit, beginning algebra level       19.7%

Failure/withdrawal percentages for first-time freshmen in Fall 1996:

    Above college algebra/trigonometry level    Failure 12%,    Withdrawal 9%
    College algebra/trigonometry level          Failure 14.4%,  Withdrawal 4.8%
    Calculus I*                                 Failure 5%,     Withdrawal 5%

(*Approximately half of the students participating in our pilot program for placement into calculus based on RSAT and ACT scores also took the MRT. Of those, 91% placed into calculus.)

Use of Findings

The data we gather, as well as the periodic studies provided us by the Office of Research on Undergraduate Curriculum, allow us to monitor the effectiveness of our program.


Currently, we are generally pleased with the success rates above. However, we are about to embark on a significant change, due to higher university requirements and the consolidation of our two-semester algebra sequence into one. We are presently trying to determine the “best” cut-off for placement into our new college algebra course. During Fall 1996, we experimented with lower cut-off scores for entry into the new algebra course. The preliminary analysis of the data shows that these borderline students were in fact at significantly higher risk than students who met our regular cut-offs. More analysis and experimentation will be done before we settle on a cut-off score.
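One plausible way to search for such a cut-off, sketched below, is to tabulate, for each candidate score, both kinds of error described earlier (capable students denied registration, and admitted students who fail) and pick the score minimizing their sum. This illustrates the idea only; it is not the department's actual analysis, the records are invented, and weighting the two errors equally is an assumption.

    # Sketch: choose the cut-off minimizing total misclassifications in
    # historical records of (test score, succeeded-in-course) pairs.

    def misclassifications(cutoff, records):
        denied_but_capable = sum(1 for s, ok in records if s < cutoff and ok)
        admitted_but_failed = sum(1 for s, ok in records if s >= cutoff and not ok)
        return denied_but_capable + admitted_but_failed

    def best_cutoff(records, candidates):
        return min(candidates, key=lambda c: misclassifications(c, records))

    history = [(38, True), (31, True), (29, True),
               (27, False), (25, False), (22, False)]
    print(best_cutoff(history, range(20, 46)))  # 28 for this invented data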

Another use of the data is to support the chemistry and biology departments as they determine the mathematics prerequisites for a few of their entry level courses. Both departments independently looked at their pass/fail rates for entry level courses. Chemistry studied their two-semester sequence for all science majors and minors. They found that students testing above the college algebra level or completing college algebra did significantly better. In fact, the students who placed lower, or only completed intermediate algebra, received only D’s and failing E’s. Biology studied their two-semester sequence for all science majors, minors, and health-related professions. The students who did not place above intermediate algebra had a 90% chance of receiving a D or E.

One important outcome of the data analysis is that there is no room for flexibility with regard to mandatory testing and forced placement. An optional test would be completely ineffective with the 5000 students in our entry-level courses. We have found that the majority of students taking the test are freshmen, who have difficulty determining their own readiness for course work at a university.

Success Factors

Several features of the MRT program are essential to its success. Strong support by the department and university is crucial. The move from an optional program to the current one required the cooperation of administrators, faculty, and even students. Continuing communication is also important. Misinformation or lack of information can undermine the credibility of the program. It is imperative that this communication extend to the high schools and local colleges.

Funding and manpower are major obstacles. Until two years ago, the test was free. Now a $5.00 charge is automatically billed to the student’s account. This offsets the costs of administering the test through the Testing Office, proctoring, reproduction, and scoring. The mathematics advisors and coordinator are funded through the department. The coordinator also receives a one-course reduction in teaching load during each semester. The commitment of the advisors and coordinator is essential.

Another significant factor for success is the ability to analyze the program and make adjustments. Here are some examples of the types of questions that come up: When the department phases out intermediate algebra next year, will we be able to place some of these students at the next level? Is it necessary to test all transfer students? When the new university entrance requirements of 4 years of mathematics begin in 1998, will Test A be necessary? Flexibility also extends to the appeals process. Although there are guidelines to follow, such as test scores, prerequisite courses, and GPA, a more subjective set of guidelines may be necessary to handle individual cases. In any case, continuity is important.


A Comprehensive Advisement Program

Stephen A. Doblin and Wallace C. Pye
University of Southern Mississippi

A large, budding university concentrates on the importance of advising students at all levels. From providing mentors for freshmen to surveying graduating students and alumni, this department operates with continuing feedback.

Background and Purpose

The University of Southern Mississippi (USM) is a relatively young, comprehensive research university with a student body of approximately 14,000 located on two campuses. The main campus in Hattiesburg typically enrolls 90% of these students, among them a large number of community college transfer students. In fact, in fall 1996, the main campus enrollment of first-time students consisted of 1231 freshmen and 1569 transfers. While no precise data are available, a large number of students in both groups are the first members of their families to attend an institution of higher learning. Mathematics is one of nine departments housed in USM’s College of Science & Technology. In the fall of 1996, the Department of Mathematics had 123 prospective majors and taught approximately 3100 students in 80 course sections. Preservice secondary teachers are a major source of enrollment, as are students pursuing preprofessional degrees (e.g., premedicine, predentistry, preveterinary medicine). The university core mathematics course is College Algebra, and the department offers an intermediate algebra course for the most mathematically underprepared university students. Mathematics faculty have kept the degree requirements at 128 semester hours, and the number of degrees in mathematics has remained relatively constant over the last few years (52 per year since 1991). Most serious students finish in the traditional four years.

Because of the enrollments cited above, our interest in attracting mathematics majors, and the fact that many of our students are transfers or naïve about college, advising for academic success and retention became a critical issue that aroused discussion in the 80s. In response to these concerns, the University Office of Admissions determined to involve faculty from the appropriate academic unit as early in the recruitment process as is feasible, often establishing the advisor-advisee relationship even before the student arrives on campus. The University instituted General Studies 101, a fairly typical “introduction to university life” course, as well as a summer orientation and counseling session for all new students. Even more significantly, the computerized degree audit and advisement tool PACE (Programmed Academic Curriculum Evaluation) became available to the department and its college starting fall 1992.

Recognizing that retention and success in mathematics courses are nationwide concerns, at this time the mathematics department, in concert with our college and university, forged an advisement partnership with the college, which fostered a variety of advising, tracking, and support strategies.

Method

The Advisement Program for mathematics majors is regarded as one of the primary responsibilities of the department (working in partnership with the college). In order to meet the needs of a student body with a mean ACT of 21.3 (19.9 in mathematics), reasonable high school mathematics preparation (at least two years of algebra and a year of geometry), and fuzzy expectations of college, our mathematics faculty believed that advising must be comprehensive, flexible, and afford a personal flavor.

Prospective majors are advised by a department faculty member in a summer orientation and counseling session. At this time, students are given the department’s Guide for the Mathematics Major¹. This guide has proven popular among both parents and potential majors, and serves as a resource for students after enrolling. It includes a summary of degree requirements, employment opportunities within the department, career opportunities in mathematics in general, job titles of recent graduates, course descriptions, faculty biosketches, and a five-year plan for the student regarding advanced course offerings. The students also select their fall classes in this counseling session. (For placement, the department relies on a combination of ACT scores, high school courses and grades, and self-selection based upon course catalog descriptions and discussions with the advisor.) The students are also encouraged at this time to participate with the department in its emphasis on undergraduate research.

———————
¹ Copies of the various survey forms and brochures, the college’s Academic Advisor’s Handbook, and the Guide for the Mathematics Major are available from the authors.

The college also offers a Freshman Mentor Program; this program is discussed in another brochure, which offers biosketches of the volunteering faculty. Mentors are volunteers drawn from the College of Science and Technology who provide freshmen with support and advice. Students who do not apply for a mentor are contacted by phone so they can be personally encouraged to understand the benefits of the program. Although the mentor-mentee relationship is largely up to the individuals, students who have opted to participate in the program are matched up with their mentors in two planned group activities — usually the program reception and a Halloween pumpkin carving/pizza social. The reception is welcoming and the atmosphere cheerful — it includes a brief panel discussion led by a former program participant and a popular mentor, who gives some cautions about the pitfalls of freshman year.

As students progress through the years, it is important to keep information current. For this, we use the student’s PACE form, which contains personal data and specific program requirements. The PACE form is automatically updated to include the results of courses taken, grade point averages in the various categories, credit completed at the various levels, remaining degree requirements, etc.

In the fall of the student’s first year the student is assigneda permanent advisor and a mentor (if he/she opts for thelatter); the advisor of a mathematics major comes from themathematics department. During the first semester the studentmeets with the advisor to preregister for spring classes,discusses summer employment /undergraduate researchopportunities, the department’s undergraduate researchprogram; and the student academic goals. Since the PACE

program frees the advisor from bookkeeping responsibilities,important issues about the students’ immediate and long-range prospects can be discussed during these sessions.

During the year, students are kept on track by graduate students, undergraduate majors, and some faculty who provide tutorial assistance in a departmentally sponsored Mathematics Learning Center and new Calculus Lab.

Complementing these department initiatives, the College of Science & Technology Scholars Program recognizes the college’s most outstanding sophomores and juniors by hosting a banquet at which attendees are provided certificates of achievement and information about local and national scholarships and awards, summer research opportunities, and requirements for admission to professional and graduate schools.

During the early fall of each year, the college also sponsors a Senior Workshop, primarily for students who plan to enter the world of work upon graduation. Conducted by USM’s Director of Career Planning and Placement, the workshop provides tips on the job search — identifying career options, applying, interviewing, etc.

Advising of the Faculty: All department advisors are given a copy of the college’s Academic Advisor’s Handbook ([1], also available on our World Wide Web home page), and each fall personnel from the Dean’s Office conduct an advisement workshop for new faculty.

Ongoing Assessment of the Program: Trying to incorporate all factors which impact students’ ability to achieve, we have opted for an approach which also assesses the effectiveness of our advisement initiatives. Hence, we:

• mail survey forms to mathematics majors 1, 5, 10, 15, and 20 years after graduation, soliciting each student’s views on department faculty’s teaching and advisement effectiveness, perceptions of the quality of preparation for further education and/or employment, and opinions on ways in which the department could improve our educational offerings;

• require that each graduating senior in the college complete a questionnaire on items such as those mentioned above. This takes place at the time the application for the degree is filed, normally one semester before degree requirements have been satisfied;

• query faculty during the annual performance interview each spring concerning the advisement and support program (i.e., academic advisement, student mentoring, encouragement and direction of undergraduate research projects, etc.), their role in it, their perception of the effectiveness of their contributions, and opportunities for improvement;

• obtain numerous student comments — many unsolicited — relative to these initiatives throughout the year;

• survey all mentor program participants — both students and faculty — to determine their perceptions of program effectiveness and value as well as suggestions for improvement; and

• conduct an exit interview with each student in the college who withdraws from the university during a semester, in an effort to identify the most common reasons for this action.

Findings

The University has seen increased success in earning national recognition for scholarship and service, improvement in academic records, and higher graduation rates for College of Science and Technology majors, and we feel that this is in part due to the success of our comprehensive approach to advisement and support and our continuing assessment of our efforts. For example, the information gleaned from these various mechanisms is discussed in appropriate forums (e.g., faculty meetings, the chair’s yearly performance reviews with faculty, the dean’s annual evaluation of departmental performance with the chair), and has been used in modifying program components as well as in changing the recognition and reward structure for faculty at the department level and for departments at the college level.

We believe that the advisement and support program which is currently in place effectively assists many of our students to persevere toward a degree and motivates our best students to establish enviable undergraduate records. The success of our program, we feel, is founded on some basic points:

• It is crucial to provide recognition and reward for those faculty who participate willingly and productively in such initiatives. The one-on-one interaction which we view as so important to the effectiveness of the advisement process is possible only if a substantial number of faculty believe in its value and cooperate with the process.

• It is important to remove most of the record-keeping tasks from the advisor so that he/she can devote interactions with the student to substantive issues (e.g., course performance, undergraduate research and employment opportunities, academic and career goals).

• To be effective, an advisement and support program must provide an assortment of initiatives that address our diverse student body and the variety of their needs and concerns.

• Surveys of current students and graduates can be quite valuable if one wishes to assess effectiveness and make improvements, but only if there is in place a formal mechanism for evaluating responses and developing and implementing new initiatives.

Use of Findings

With these observations in mind, here are some of the issues which we plan to address in the near future:

1. We are pleased that the vast majority of mentor program participants earn freshman-year grade point averages exceeding 2.0 and enroll for the sophomore year. However, many of the students who opt not to participate in the advice and support programs we offer are precisely those who need these programs most. We are seeking better ways of attracting the attention of these students.

2. Transfer students, who make up the largest portion of our new students each year, require extra effort in advising. While one could make the case that freshmen need the support far more than junior transfers, our observation is that there is little difference between these two groups, especially when the juniors transferred from a small, home-town community college.

3. Every senior who files an application for degree is required to complete the college’s exit questionnaire. We are pleased that even though this form is submitted anonymously, the vast majority of students take the time to provide thoughtful and often provocative responses. On the other hand, the response rate to the department’s mail-out survey to graduates is quite low (20–25%), probably a common characteristic of mail surveys, but below our expectations. A departmental advisory committee, consisting of current students and/or relatively recent graduates, could possibly provide perspective on this that the department now lacks.

Success Factors

In our experience, advisement is an important factor in student success and retention. University policy must be explicit; it is essential that faculty members from the academic unit be involved in an advisor-advisee relationship with their unit’s majors. The university must be willing to pay for programs like PACE and for the increased staff who aid the faculty in their advisement and support role. The dean must actively cooperate in the student advisement and support process with broad-based imaginative initiatives that will complement and support the departments within the college. Most importantly, faculty must believe that their advisement and student support role does make a difference, and that it is duly valued by their department, college, and institution.


A Quantitative Literacy Program

Linda R. Sons
Northern Illinois University

The author helped create MAA Guidelines for Quantitative Literacy and here spells out how this document was used at her large university. This school tested students and graded results in a variety of courses. Results led to curricular changes.

Background and Purpose

In the mid-eighties the faculty at Northern Illinois University (DeKalb, Illinois) reviewed the requirements for their baccalaureate graduates and determined that each should be expected to be at least minimally competent in mathematical reasoning. Setting up an appropriate means for students to accomplish this expectation required the faculty to take a careful look at the diversity of student majors and intents for the undergraduate degree. The result of this examination was to establish multiple ways by which the competency requirement could be met, but that program carried with it the burden of showing that the various routes led to the desired level of competency in each instance. In this article the quantitative literacy program at NIU will be described, and some aspects of its assessment will be discussed.

The author’s involvement in this work stems from herservice on the University’s baccalaureate review committee,the subsequent establishment of a course at NIU related tocompetency, and the chairing of a national committee of theMathematical Association of America (MAA) onquantitative literacy requirements for all college students.The latter resulted in the development of a report publishedby the MAA [1] which has also been available at the Website: “MAA ONLINE” — http://www.maa.org.

Northern Illinois University is a comprehensive public university with roughly 17,000 undergraduate students who come largely from the top third of the State of Illinois. The University is located 65 miles straight west of Chicago’s Loop and accepts many transfer students from area community colleges. For admission to the University, students are to have three years of college preparatory high school mathematics/computer science (which means that most students come with at least two years of high school mathematics). Special programs are offered for disadvantaged students and for honors students. The University consists of seven colleges and has 38 departments within those colleges, offering 61 distinct bachelor’s degree programs within those departments. Designing a quantitative literacy program to fit a university of this type represents both a challenge and a commitment on the part of the university’s faculty and administrators.

The term “quantitative literacy program” is defined in the 1995 report of the Mathematical Association of America’s Committee on the Undergraduate Program in Mathematics (CUPM) regarding quantitative literacy requirements for all college students (see [1], Part III). The recommendations in that report are based on the view that a quantitatively literate college graduate should be able to:

1. Interpret mathematical models such as formulas, graphs, tables, and schematics, and draw inferences from them.

2. Represent mathematical information symbolically, visually, numerically, and verbally.

3. Use arithmetical, algebraic, geometric and statistical methods to solve problems.

4. Estimate and check answers to mathematical problems in order to determine reasonableness, identify alternatives, and select optimal results.

5. Recognize that mathematical and statistical methods have limits.



Such capabilities should be held at a level commensurate with the maturity of a college student’s mind.

CUPM notes in Part III of the report that to establish literacy for most students a single college course in reasoning is not sufficient, because habits of thinking are established by means of an introduction to a habit followed by reinforcement of that habit over time as the habit is placed in perspective and sufficiently practiced. Thus, CUPM recommends colleges and universities establish quantitative literacy programs consisting of a foundations experience and continuation experiences. The foundations experience introduces the desired patterns of thought and the mathematical resources to support them, while the continuation experiences give the student the growing time essential to making new attitudes and habits an integral part of their intellectual machinery.

Method

At Northern Illinois University the foundations experience for the quantitative literacy program consists of taking a specific course as part of the general education program set for all students. However, the specific course may vary from student to student depending on placement data and the probable major of the student.

Many major programs at NIU require students to study specific mathematical content. For students in those major programs which do not require the taking of specific mathematics, the Department of Mathematical Sciences devised a foundations course called Core Competency. This course focuses on computational facility and facts, the interpretation of quantitative information, elementary mathematical reasoning, and problem solving. Topics in the course are taught in applications settings which may be part of the real life experience of a college graduate and include some probability and statistics, geometry, logical arguments, survey analysis, equations and inequalities, systems of equations and inequalities, and personal business applications. Elementary mathematics is reviewed indirectly in problem situations.

In contrast, the courses offered to meet the needs of students whose programs require specific mathematical content often demand a background of computational facility greater than that which is needed merely for admission to NIU, and have objectives which relate to their service role. Thus while a student taking one of these latter courses may have attained the computational facility which the core competency necessitates, the student may not have acquired the broad mathematical skills the core competency entails. So in examining how the various service courses the Department taught met the competency objectives, the Department believed that the six courses (1) Elementary Functions; (2) Foundations of Elementary School Mathematics (a course for future elementary school teachers and taken almost exclusively by students intending that career outlet); (3) Introductory Discrete Mathematics; (4) Finite Mathematics; (5) Calculus for Business and Social Science; and (6) Calculus I would satisfy the competency objectives provided a student completed such a course with a grade of at least a C.

The Calculus I course is always offered in small sections of about 25 students, and, in a given semester, some of the other courses are offered in small sections as well. However, some are offered as auditorium classes which have associated recitation sections of 25–30 students. Since some students take more than one of the seven courses listed, it is not easy to know precisely which students are taking a given course to meet the competency requirement, but approximately 13% meet the competency by taking the Core Competency course; 16% meet it by taking Elementary Functions; 8% by taking Foundations of Elementary School Mathematics; 1% by taking Introductory Discrete Mathematics; 25% by taking Finite Mathematics; 25% by taking Calculus for Business and Social Science; and 12% by taking Calculus I.

In executing the competency offerings, two questions emerged which subsequently became the framework around which an assessment plan was built.

1. What hard evidence is there that these seven routes (including the Core Competency route) each lead to at least a minimal competency?

2. Can we obtain information from a plan for measurement of minimal competency which can be used to devise better placement tests for entering students? (The Department has historically devised its own placement tests to mesh with the multiple entry points into the mathematics offerings which entering students may face.) Could this same information be used to construct an exit examination for degree recipients regarding quantitative literacy?

Tests were prepared for use in the Fall of 1993 which could be administered to a sample consisting of nearly all students in each of the courses taught that semester which were accepted as meeting the competency requirement. These tests were comparable in form and administered near the end of the semester so as to determine the extent to which students had the desired skills. Because these tests had to be given on different dates in various classes, multiple forms had to be prepared. Each form included questions testing computational facility, questions involving problem solving, and questions requiring interpretation and mathematical reasoning. Each form consisted of ten multiple-choice questions, three open-ended interpretive questions, and three problems to solve. To ensure that students took their performance on these examinations seriously, each course attached some value to performance on the examination which related to the student's grade in the course. For example, in the Core Competency course the test score could be substituted for poor performance on a previous hour examination, while in the Calculus for Business and Social Science course points were given towards the total points used to compute the final grade in the course.

A uniform grading standard was drawn up for the examinations, and a team of five graders evaluated all of the examinations except for the Calculus I papers, which were graded by the course instructors according to the set guidelines. Examinations were scored for 2358 students. The course Introductory Discrete Mathematics was not offered in Fall 1993, so data from it needed to be gathered another semester. Clearly many hours were devoted to grading! The examinations were designed so that "passing" for the Core Competency students was defined as earning at least 50% of the possible points on the test. Test scores were divided according to the computational facility section (the multiple-choice part) and the remainder, which was viewed as the problem-solving component.
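To make the scoring scheme concrete, here is a minimal sketch of how an exam might be split into the two components and checked against the 50% threshold. The point values per item are assumptions for illustration, not the Department's actual rubric.

    # Hypothetical scoring sketch for the competency examinations described
    # above: ten multiple-choice questions (computational facility) plus
    # three interpretive questions and three problems (problem solving).
    # Point values are assumed, not the Department's actual rubric.

    MC_POINTS = 2         # points per multiple-choice question (assumed)
    OPEN_POINTS = 10      # points per open-ended item (assumed)
    TOTAL = 10 * MC_POINTS + 6 * OPEN_POINTS   # 80 possible points

    def score_exam(mc_correct, open_scores):
        """Split one student's exam into the two components described above."""
        computational = mc_correct * MC_POINTS       # multiple-choice part
        problem_solving = sum(open_scores)           # interpretive items + problems
        total = computational + problem_solving
        return {
            "computational": computational,
            "problem_solving": problem_solving,
            "passed": total >= 0.5 * TOTAL,          # the 50% "passing" threshold
        }

    # Example: 8 of 10 multiple-choice correct, mixed open-ended scores.
    print(score_exam(8, [7, 5, 9, 4, 6, 8]))   # 16 + 39 = 55/80 -> passed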

Findings

To interpret the scores on the graded examinations, a comparison had to be made with the grades the students received in the course they took. These showed the pass rates in the table below for the foundations experience requirement on courses which are used to satisfy that requirement.

Course                                          Students taking test   Pass %
Core Competency                                 278                    84
Elementary Functions                            358                    44
Foundations of Elementary School Mathematics    122                    64
Finite Mathematics                              924                    69
Calculus for Business and Social Science        398                    80
Calculus I                                      278                    84

(Pass % = percentage of those taking the test who received competency credit.)

A comparison of the means of the various test forms suggested two of the seven were harder than the others (these forms were given in sections of more than one course and compared). Both of these forms were taken by students in the Elementary Functions course. For those students in the Elementary Functions course who did not take the harder forms, the pass rate was 71%. Looking at the total performance of all students in the Elementary Functions course who took an assessment test, it was seen that they scored high on the computational facility portion of the examination, but relatively low on the problem-solving component. In contrast, the students in the Foundations of Elementary School Mathematics course showed weak scores on the computational facility portion of the test and relatively high scores on the problem-solving component. Students in the remaining courses averaged a respectable passing percentage on each examination component.

Use of Findings

Overall analysis of the assessment data led to the conclusions that:

• We should bolster the problem-solving component of the Elementary Functions course and once again measure student performance on an assessment test;

• We should bolster the computational facility component of the Foundations of Elementary School Mathematics course and once again measure student performance on an assessment test;

• We should make no change in the requirement of a grade of C or better in the six courses cited earlier in order for a student to receive credit for meeting the competency requirement;

• We should use the item analysis of test responses to help construct an improved placement test which encompasses some measure of quantitative literacy;

• We should use the item analysis made of the test responses to construct a senior follow-up test intended to provide some measure of a degree recipient's literacy as the recipient leaves the University.

Other conclusions about the study were that we should periodically do an analysis across all the courses mentioned, but it need not be done in all courses during the same semester. Also, devising many comparable tests is likely to result in some unevenness across test forms — a situation which needs to be carefully monitored. (Using test forms in more than one course and using statistics to compare the test results among the different forms should "flag" a bad form.) And although it was not desirable, multiple-choice questions were used on the computational facility part of the test merely to save time in grading. (When fewer students are being examined in a short period of time, the use of multiple-choice questions can be eliminated.)
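The statistical "flagging" of a bad form can be as simple as a two-sample comparison of scores on different forms given within the same course. The sketch below is one way to do it, using Welch's t-test; the score lists are hypothetical, and this is an illustration rather than the Department's actual analysis.

    # Compare mean scores on two test forms administered within the same
    # course; a significant gap suggests one form is harder than the other.
    # Score data are hypothetical.
    from scipy import stats

    form_a = [62, 71, 55, 80, 67, 74, 59, 70, 66, 73]  # percent scores, form A
    form_b = [48, 52, 60, 45, 57, 50, 63, 41, 55, 49]  # percent scores, form B

    # Welch's t-test does not assume the two forms have equal score variance.
    t, p = stats.ttest_ind(form_a, form_b, equal_var=False)
    if p < 0.05:
        print(f"Forms differ significantly (t = {t:.2f}, p = {p:.3f}); "
              "review the harder form before comparing pass rates.")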



Success Factors

This assessment process took considerable effort on the part of the author (a seasoned faculty member) and two supported graduate assistants. Two others were brought in to help with grading. However, the assessment process was organized so as to be minimally intrusive for the individual course and instructor.

This assessment process grew out of the Department of Mathematical Sciences' ongoing concern for the success of its programs and was furthered by the University's General Education Committee, which then recommended the plan the Department proposed regarding assessment of the core competency requirement. Consequently, the University's Assessment Office partially supported the two graduate assistants who assisted in the construction, administration, and grading of the examinations.

Now the Department needs to turn its attention to the follow-up uses noted above and to an assessment of the role of the continuation experiences. At present these continuation experiences also vary from student to student, in accordance with the student's major. But there has been discussion within the University of having a culminating general education experience for all students which might involve a project having a quantitative component. In either case, a current intent is to analyze student achievement based on the follow-up test noted above, where possible, and on faculty judgment of student attainment of the five capabilities listed in the CUPM report (and listed above). The appendix of that CUPM report has some scoring guides to try.

Reference

[1] Committee on the Undergraduate Program in Mathematics. Quantitative Reasoning for College Graduates: A Complement to the Standards, MAA Reports 1 (New Series), Mathematical Association of America, Washington, DC, 1996.


Creating a General Education Course: A Cognitive Approach

Albert D. Otto, Cheryl A. Lubinski, and Carol T. Benson
Illinois State University

What is the point of teaching students if they're not learning? Here a general education course operates from a learning-theoretic mold; mathematics education instructors become involved to help students construct their own learning.

Background and Purpose

Creating a general education course that is successful in developing quantitative reasoning is a recurring issue for most mathematics departments. A variety of approaches currently flourish, ranging from offering practical problems to viewing the course as linked to sociological, historical, or "awareness" issues. For many of us, such a course is a major source of a department's credit hours, yet the mathematics community has not been as aggressive about looking into new models for these courses as it has been, for example, about calculus reform. Moreover, general education courses in mathematics are usually not found to be enjoyable courses, either by students or by instructors. Students are often thrust into these courses because they need a single mathematics course and often have failed to place into a higher-level course. During a typical course of this nature, the instructor quickly proceeds through various skills, like Bartholomew shedding his 500 hats, providing exposure to a range of "interesting" topics rather than helping students develop in-depth understanding.

Illinois State is a midsize multipurpose university of 20,000 students, drawing primarily from the state. As members of the mathematics department who are doing research on undergraduate learning and who are interacting with RUMEC [1], we began to implement learning-theory ideas to design our general education course, "Dimensions of Mathematical Problem Solving." In order to do this, we asked ourselves: What does it really mean for students to "learn"? What do we most want them to achieve from such a course? And how can this course offer students an enriching, positive experience in mathematics? The professional literature [3, 5, 8] has stressed the need to promote active learning and teacher-student interaction. For example, the report Reshaping College Mathematics [8] speaks to "the necessity of doing mathematics to learn mathematics," "the verbalization and reasoning necessary to understand symbolism," and "a Socratic approach, in which the instructor works carefully to let the students develop their own reasoning" (pp. 111–112). This mode of instruction has been used by us and by other mathematics education instructors who have worked to develop the reasoning and problem-solving skills of students in our courses. From this shared knowledge base about teaching undergraduates, the pedagogy for this new course developed.

Even though our students have had three or four years of high school mathematics, they still believe that doing mathematics means memorizing and identifying the appropriate equation or formula and then applying it to the situation. We wanted a course that would generate substantial improvement in their reasoning and problem-solving skills rather than exposing them to more topics to memorize without understanding. The philosophy guiding the "Dimensions" course can best be summarized by the following paragraph from the syllabus of the course.

It is important that you realize that you cannot understand mathematics by observing others doing mathematics. You must participate mentally in the learning process. This participation includes studying the material; working with others; struggling with non-routine problems; using calculators for solving and exploring problems; conjecturing, justifying, and presenting conclusions; writing mathematics; listening to others; as well as the more typical tasks of solving problems and taking examinations. The emphasis in this course will be on ideas and on understanding and reasoning rather than memorizing and using equations or algorithms.

Mathematical topics are selected from number theory, discrete mathematics, algebra, and geometry. Special attention is given to topics from the 6–10 curriculum for which students have demonstrated a lack of 1) conceptual understanding and 2) reasoning and problem-solving skills. By focusing on material that students have presumably covered, such as proportional reasoning and algebraic generalizations, we can emphasize the development of reasoning and problem-solving skills as well as conceptual understanding. Guiding us are the following course principles.

• The course has no textbook because students tend to mimic the examples in textbooks rather than to reason.

• The course is problem driven; i.e., we do not use the lecture format followed by assignments. Rather, problems are given first to drive the development of reasoning and problem solving and the understanding of mathematical concepts.

• The pedagogy of the class consists of students presenting their reasoning and strategies for solving problems. This is supplemented with questions from the instructor and other students to clarify the explanations of the students.

• The course focuses on quantitative reasoning [9, 10] and multiple representations.

• Class notes are used on examinations to reinforce that reasoning and problem solving are the focus and not the memorization of procedures and formulas.

• Homework used for assessment is completed in small group settings [7] outside of class. Students must submit their work with complete explanations of their reasoning.

• Students submit an individual portfolio at midterm and at the end of the semester in which they provide examples of how their problem solving and reasoning have improved.

• The faculty presently teaching as well as those who will be teaching this course meet weekly to share and discuss students' intellectual and mathematical progress.

• We use students' reasoning and explanations as we make instructional decisions for the next class period. However, because of the focus on students explaining their reasoning and problem solving, we are aware that students initially react negatively to this form of pedagogy.

Method

Many believe that proper assessment means collecting data for grade evaluations. Such data present only a snapshot, or limited view, of in-depth understanding. In "Dimensions," we have designed a course in which assessment is an ongoing, formative process. A guiding principle for our pedagogy is that what and how our students feel about learning mathematics affects what and how they learn mathematics.

To measure attitudes, at the beginning of the course we administer the "Learning Context Questionnaire" [4]. This questionnaire is based on the works of William G. Perry, Jr. [2, 6] and requires students to select a response from a six-point Likert scale (strongly agree to strongly disagree) on 50 items, some of which are "I like classes in which students have to do much of the research to answer the questions that come up in discussions," and "I am often not sure I have answered a question correctly until the teacher assures me it is correct."

We also administer a second survey which we developed, the "Survey on Mathematics," that asks students more about their beliefs of what it means to learn mathematics, such as:

1. How would you respond to someone in this class who asks, "What is mathematics?"

2. What does it mean to understand mathematics?

3. How do you best learn mathematics?

4. Explain the role of reasoning when doing a problem that uses mathematics.

5. Describe the role of a mathematics teacher AND describe the role of a mathematics student.

6. Describe a memorable mathematical experience you have had.

Findings

On the Learning Context Questionnaire, a low score (received by about 65% of our students) describes students who look to authority figures for correct answers or for making decisions. These students do not believe they have the ability to work out solutions by themselves, suggesting immature cognitive development. About 25% of our students fall into the second category. These students realize there may be different perspectives on situations; however, they are not able to differentiate or to evaluate among perspectives. They still tend to look to authority figures for help in finding solutions and have little confidence in their own ability to find solutions. Only 10% of the students in these classes are at the upper end of the cognitive development scale. These students take a more critical look at situations and rely on their own ability to find solutions to problems.


The results of the Survey on Mathematics are analyzed both by noting clusters of similar responses for each question and by making connections among individual students' responses over all questions in order to better understand students' perspectives on learning mathematics. For example, responses to the first question typically indicate that students often think of mathematics as "the study of how to find an answer to a problem by using numbers and various theories." Some note that in mathematics there is only one correct solution. About 20% indicate that mathematics and understanding are related, such as "mathematics deals with understanding and computing numbers." However, question two provides additional insights into what students mean by understanding. Students feel that being able to explain their answer or being able to solve equations and perform procedures means to understand. For question #5, most students state that teachers need to go through procedures step by step, allow students time to practice, and show specific examples: "I learn the best when a teacher explains step by step the answer to a question." Few students state, as one did, "...to really learn you have to actually sit down and try to figure it out until you get it." Thus, students perceive that the role of the teacher is to show, tell, explain, and answer questions. Their perceived role of a mathematics student is to listen or to pay attention, ask questions, and some even state "to take notes." There is probably no better place to bring these fundamental misconceptions of our mutual roles to student attention than in a general education course.

Use of Findings

What we find from both the Learning Context Questionnaire (LCQ) and our own survey about how students anticipate learning mathematics forms our general approach as well as day-to-day decisions about our instruction. We do not lecture students on how to think about solving problems, but guide them to develop their own mathematical processes. Because the majority of our students are at the lower end of the scale on the LCQ, we recognize the need to help them realize that they are ultimately responsible for their own learning and "sense-making."

Class periods are spent analyzing varieties of solutions by having students use the board or transparencies to present their solutions. In addition, students must provide complete explanations about their reasoning and learn to include appropriate representations (pictorial, algebraic, graphical, etc.). Grading on written assessments is based on correctness of reasoning, understanding of the situation, and completeness and quality of explanations. However, we also informally assess by comparing students' mathematical performance with that suggested by their beliefs to evaluate how the students' problem-solving and reasoning skills are developing.

During a class session, we first determine if a strategy used by the student is comparable to a strategy used in class, looking for whether the student has merely mimicked the class example or was able to generalize. An important component of our assessment is the degree to which students' explanations match the students' use of numbers and symbols: often the answer is correct, but the explanation does not match the symbols. In other words, we look for cognitive growth on the part of the student. Our ongoing assessment (both verbal and written) assists us in identifying misconceptions that may have arisen as a result of classroom discourse.

Success Factors

The course as now developed and taught is labor-intensive. We continually search for content-rich problems. In addition, the instructor contributes a lot of effort to encourage students to volunteer and to respectfully respond to other students' reasonings and explanations. It is not surprising that with such a format we initially meet with resistance and frustration from students. It becomes important to resist the temptation to tell students how to do a problem. Likewise, we must reassure students that we are aware of their frustration but that making sense out of mathematics for themselves is a worthwhile and rewarding experience. Unfortunately, we find we cannot have all students understand every topic, so we focus our attention on those students who are willing to spend 6–10 hours each week struggling to understand and make sense of the mathematics; it is these students who influence our instructional pace.

As part of our ongoing course development, the weekly planning sessions for instructors provide an opportunity for instructors to share. These sessions become mini staff-development sessions, with all instructors learning from each other.

In spite of the many obstacles and demands this course creates, it is rewarding that we are able to collaborate with students in their learning. For many of our students, this course experience has allowed them for the first time to make sense out of mathematics. We see our course as a process course, and, as such, we feel we are able to offer students the best possible opportunity for improvement of their lifelong mathematical reasoning and understanding processes. Simultaneously, we put ourselves in an environment in which the drudgery of low-level teaching is replaced by the excitement of being able to dynamically reshape the course as our students develop their mathematical reasoning, understanding, and problem-solving skills.

References

[1] Asiala, M., Brown, A., DeVries, D.J., Dubinsky, E., Mathews, D., and Thomas, K. "A Framework for Research and Curriculum Development in Undergraduate Mathematics Education," Research in Collegiate Mathematics Education II, American Mathematical Society, Providence, RI, 1996.

[2] Battaglini, D.J. and Schenkat, R.J. Fostering Cognitive Development in College Students: The Perry and Toulmin Models, ERIC Document Reproduction Service No. ED 284 272, 1987.

[3] Cohen, D., ed. Crossroads in Mathematics: Standards for Introductory College Mathematics Before Calculus, American Mathematical Association of Two-Year Colleges, 1995.

[4] Griffith, J.V. and Chapman, D.W. LCQ: Learning Context Questionnaire, Davidson College, Davidson, North Carolina, 1982.

[5] Mathematical Association of America. Guidelines for Programs and Departments in Undergraduate Mathematical Sciences, Mathematical Association of America, Washington, DC, 1993.

[6] Perry, W.G., Jr. Forms of Intellectual and Ethical Development in the College Years, Holt, Rinehart and Winston, NY, 1970.

[7] Reynolds, B.E., Hagelgans, N.L., Schwingendorf, K.E., Vidakovic, D., Dubinsky, E., Shahin, M., and Wimbish, G.J., Jr. A Practical Guide to Cooperative Learning in Collegiate Mathematics, MAA Notes Number 37, The Mathematical Association of America, Washington, DC, 1995.

[8] Steen, L.A., ed. Reshaping College Mathematics, MAA Notes Number 13, Mathematical Association of America, Washington, DC, 1989.

[9] Thompson, A.G. and Thompson, P.W. "A Cognitive Perspective on the Mathematical Preparation of Teachers: The Case of Algebra," in Lacampagne, C.B., Blair, W., and Kaput, J., eds., The Algebra Learning Initiative Colloquium, U.S. Department of Education, Washington, DC, 1995, pp. 95–116.

[10] Thompson, A.G., Philipp, R.A., Thompson, P.W., and Boyd, B.A. "Calculational and Conceptual Orientations in Teaching Mathematics," in Aichele, D.B. and Coxford, A.F., eds., Professional Development for Teachers of Mathematics: 1994 Yearbook, NCTM, Reston, VA, 1994, pp. 79–92.


Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning

Mark Michael
King's College

This pre- and post-testing system in a liberal arts mathematics course raises interesting questions about testing in general, and asks why students may sometimes appear to go backwards in their learning.

Background and Purpose

King's College is a liberal arts college with about 1800 full-time undergraduate students. Its assessment program has several components. The program as a whole is described in [2] and [3], and [4] outlines the program briefly and details a component relating to mathematics majors.

The component of interest here is assessment in core (general education) courses. The assessment program was initially spearheaded by an academic dean, who built upon ideas from various faculty members. Assessment in each core course was to consist of administering and comparing results from a pretest and post-test. By the time the program had matured sufficiently for the College's Curriculum and Teaching Committee and Faculty Council to endorse a college-wide assessment policy, it became clear that not all disciplines would be well served by the pre/post-testing approach; the college policy was written in a more liberal fashion, and the use of a pretest became optional.

The use of both a pre- and post-test, however, proves to be very useful in the quantitative reasoning course, which is required of all students who do not take calculus, i.e., humanities and social science majors. It has no prerequisite, nor is there a lower, remedial course. The course covers a selection of standard topics — problem solving, set theory and logic, probability and statistics, and consumer math. Recent texts have been [1] and [5].

Method

As with every core course, quantitative reasoning is loosely defined by a set of learning goals and objectives for the student. These are initially formulated by the Project Team for the course. (The team typically consists of the instructors teaching the course; it may also have members from other departments to provide a broader perspective, though that is not currently the case.) The set of goals and objectives must ultimately be approved by the College's Curriculum and Teaching Committee, which represents the faculty's interest in shaping the liberal education our students take with them into society.

Learning objectives are distinguished from learning goals in that the former are more concrete and more easily assessed. For example, one of the eight objectives for quantitative reasoning is: "to be able to compute measures of central tendency and dispersion for data." By contrast, the six goals for the course are more abstract, as in: "to become more alert to the misuses of statistics and of graphical representations of data." All goals and objectives are phrased in terms of student (not teacher) performance.

The learning goals and objectives guide what a Project Team decides should be taught and how an assessment instrument should be designed. Teams differ in their approaches on the various sections of a course; for example, different instructors may be using different textbooks to teach the same core course. The Quantitative Reasoning Project Team has chosen to use a single textbook and a single assessment instrument. Atypical of most core courses, the pre- and post-tests for quantitative reasoning are identical, both being handed back at the same time. This approach limits how the pretest can be used as a learning tool during the course, but it provides the instructor the cleanest before-and-after comparisons. By not returning the pretest early, he/she does not have to worry about whether the second test is sufficiently similar to the pretest on one hand and whether students are unduly "prepped" for the post-test on the other.

While the quantitative reasoning course strives to provide the kind of sophistication we want each of our graduates to possess, the pre/post-test focuses even more intently on skills we might hope an alum would retain years after graduation — or an incoming freshman would already possess! While the test is sufficiently comprehensive to span the full range of areas covered by the course, it does not evaluate the entire collection of skills taught in the course. It intentionally does not test for knowledge of the more complicated formulas (e.g., standard deviation). The pre/post-test also deliberately avoids the use of "jargon" that might be appropriate in the course, but which is unlikely to be heard elsewhere. For example, in the course we would discuss the "negation" of an English sentence; this term would be used in homework, quizzes, and tests. On the pre/post-test, however, a sentence would be given, followed by the query: "If this sentence is false, which of the following sentences must be true?"

The pre/post-test also makes a more concerted attempt to detect understanding of concepts than do traditional textbook homework problems. For example, given that the probability of rain is 50% on each of Saturday and Sunday, students are asked whether the probability of rain during the weekend is 50%, 100%, less than 50%, or in between 50% and 100%; a formula for conditional probability is not needed, but understanding is.
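The intended reasoning, assuming the two days are independent, is that the weekend stays dry only if both days stay dry:

\[
P(\text{rain during the weekend}) = 1 - P(\text{dry Saturday})\,P(\text{dry Sunday}) = 1 - (0.5)(0.5) = 0.75,
\]

which is between 50% and 100%.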

The pre/post-test differs from the course's quizzes and tests in other ways. It consists of 25 short questions, about one-third of which are multiple-choice. No partial credit is awarded. The problems are planned so that computations do not require the use of a calculator. This latter feature is imposed because the pretest is given on the first day of class, when some students come sans calculator. The pretest does not contribute to the course grade; even so, our students seem adequately self-motivated to do the best they can on the test. Each student should be able to answer several pretest questions correctly. The pretest thus serves as an early-warning system, since any student who gets only a few answers right invariably will need special help. Time pressure is not an issue on the pretest because the only other item of business on the first day of class is a discussion of the course syllabus. The post-test likewise is untimed, since it is given on the same day as the last quiz, and nothing else is done that day. The post-test counts slightly more than a quiz. It becomes an immediate learning tool, as it is used in reviewing for the last test and the final exam.

Findings

Since the pre- and post-tests are identical, and questions are graded either right or wrong, it is easy to collect question-by-question before-and-after data. During our six years of pre/post-testing, the precise balance of questions has changed slightly each year as we continually strive to create the most appropriate test. Still, consistent patterns emerge with regard to the learning exhibited in different subject areas. For instance, our students tend to be more receptive to probability and statistics than to logic.
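Because both tests are identical and every question is marked simply right or wrong, the question-by-question comparison reduces to a pair of correct-response rates per question. A minimal sketch of that tabulation, with hypothetical data:

    # Question-by-question before-and-after comparison for identical
    # pre- and post-tests graded right/wrong. Rows are students, columns
    # are questions (three shown for brevity; the actual test has 25).
    # All data are hypothetical.
    pre  = [[True, False, True], [False, False, True], [True, False, False]]
    post = [[True, True,  True], [False, True,  True], [True, True,  False]]

    n_students = len(pre)
    for q in range(len(pre[0])):
        before = sum(row[q] for row in pre) / n_students
        after  = sum(row[q] for row in post) / n_students
        print(f"Q{q + 1}: {before:.0%} -> {after:.0%} ({after - before:+.0%})")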

Another finding is that the pretest is, in fact, not a reliable predictor of success in the course except at the high and low ends of the scale; viz., students who score high on the pretest (better than 15/25) do not experience difficulties with the course, while students who score very low (worse than 7/25) do. Similarly, post-test scores do not closely correspond to final exam scores. This is partly because some students make better use of the post-test in preparing for the final exam. Also contributing to the discrepancy is the difference in focus between the post-test and the final exam. This raises the legitimate question of whether the post-test or the more traditional final exam provides the "better" way of producing a grade for each of our students.

The most startling pattern to emerge is a good-news-bad-news story that is somewhat humbling. Most of our students arrive already knowing the multiplicative counting principle of combinatorics (for example, 3 pairs of pants and 4 shirts can be combined into 3 × 4 = 12 outfits). By the end of the course, they have made good progress mastering permutations and combinations — but at a cost: consistently fewer students correctly solved a simple pants-and-shirts problem on the post-test than on the pretest!

Use of Findings

Even though students do not see the results of their pretests until late in the course, those results can be useful to the instructor. The instructor is forewarned as to the areas in which a class as a whole is strong or weak. Where a class is weak, additional activities can be prepared in advance. Where a class is strong, the instructor can encourage students by pointing out, in general terms, those skills which most of them already have. At the individual level, if a student has a very low pretest score, he/she may be advised to sign up for a tutor, enroll in a special section, etc. An obvious way to use before-and-after comparative data is for teachers and learners to see where they might have done their respective jobs better. For teachers, who typically get to do it all over again, comparing results from semester to semester can indicate whether a change in pedagogy has had the desired effect. Again, the news may be humbling.


A case in point: knowing that students were likely to "go backwards" in regard to the multiplicative counting principle, I attempted to forewarn students to be on guard against this combinatorial backsliding; pre/post-test comparisons revealed that my preaching had no impact! What was needed to overcome this tendency was a new type of in-class exercise designed to shake students' all-abiding faith in factorials.

Another potential use for pre/post-test data is to make comparisons among different sections of a course. Doing this to evaluate the effectiveness of faculty could be dangerous, as we all know how different sections of students can vary in talent, background, attitude, etc. But an important application of the data has been to compare two sections taught by the same professor. A special section of the quantitative reasoning course was set up for students self-identified as weak in math. The section required an extra class meeting per week and a variety of additional activities (e.g., journals). Pretest scores confirmed that, with one exception, the students who chose to be in the special section did so for good reason. Pre/post-test comparisons indicated that greater progress was made by the special section — not surprising, since students who start with lower scores have more room to move upward. But, in addition, the post-test scores of the special section were nearly as high as in the regular section taught by the same instructor.

Success Factors

Comparing results from two tests is the key to this assessment method. Some particulars of our approach — using the same test twice, using simple questions, giving no partial credit — simplify a process which, even so, tells us much we would not have known otherwise about the learning that is or is not taking place; they are not essential to the success of testing students twice. However, pre/post-test comparative data only reveals what progress students have made. It does not reveal what brought about that progress or what should be done to bring about greater progress. In the case of the special section for weak students, the pre/post-test could not tell us to what extent the journal, the extra work at the board, and the extra homework assignments each contributed to the success of that section. Likewise, merely detecting negative progress in one area (combinatorics) was not enough to improve the teaching/learning process; a new pedagogical approach was needed.

Pre/post-testing does not provide the formula for improvement. It must be accompanied by a teacher's creativity and flexibility in devising new techniques. As with any powerful tool, it is only as good as its user! Ultimately the most important factor in the success of this assessment method is not how it is administered, but how it is used.

References

[1] Angel, A.R., and Porter, S.R. Survey of Mathematics with Applications (Fifth Edition), Addison-Wesley Publishing Company, 1997.

[2] Farmer, D.W. Enhancing Student Learning: Emphasizing Essential Competencies in Academic Programs, King's College Press, Wilkes-Barre, PA, 1988.

[3] Farmer, D.W. "Course-Embedded Assessment: A Teaching Strategy to Improve Student Learning," Assessment Update, 5 (1), 1993, pp. 8, 10–11.

[4] Michael, M. "Assessing Essential Academic Skills from the Perspective of the Mathematics Major," in this volume, p. 58.

[5] Miller, C.D., Heeren, V.E., and Hornsby, E.J., Jr. Mathematical Ideas (Sixth Edition), HarperCollins Publishers, 1990.


Coming to Terms with Quantitative Literacy in General Education, or, The Uses of Fuzzy Assessment

Philip Keith, General Education Assessment Coordinator
St. Cloud State University

This article presents an administrative look at the ramifications of accommodating various departments' views of how quantitative literacy is to be defined. The issue is: what are the students telling us — how do we interpret the answers they provide to the questions we've asked? The value of "fuzzy" assessment is discussed in the interpretation of a simple survey which helps move a collective bargaining institution on track.

Background and Purpose

St. Cloud State University is a medium-sized university of about 15,000 students, the largest of the state universities in Minnesota. As a public university within a statewide public university system, it operates with strained resources and constraints on its independence, and serves at the public whim to an extent that is at times unnerving. In 1992 a decision was made to merge the four-year universities with the community college and technical college systems, and the state legislature has been demanding maximization of transferability across the public higher education system. This process has been developing with minimal attention to the issue of assessment: assessment functions have been basically assigned to individual universities, so that institutions assess their own transfer curriculum for students moving to other institutions.

St. Cloud State has math-intensive major programs in science, engineering, education and business, but it also has many largely or totally "mathasceptic" major programs in arts, humanities, social science and education. The general education program has until now had no math requirement, but an optional mathematics "track" within the science distribution — students may count one of three non-major mathematics courses as part of the science distribution or (under charitable dispensation!) use two advanced mathematics courses at the calculus level or higher to substitute for the math alternative. Until recently, admission standards permitted high school students with little in and less beyond elementary algebra to enroll. Looking at the national literature on the importance of mathematical "literacy" for career success and citizenship awareness, we felt this state of affairs was problematic.

The challenge of assessing quantitative literacy in a total general education curriculum lies in the amorphousness of the problem. Traditional testing trivializes the problem because restricted testable objectives in mathematics fail to take account of the variety of mathematical needs of the students in terms of their goals and expectations. Our purpose in doing assessment here is thus not to validate academic achievement, but to provide a rough overview of what is happening in the curriculum in order to identify general needs for future academic planning.

Rather than use expensive testing methods, we decided to settle for a cheap and simple way of getting some sense of what was going on with regard to general education learning objectives. General education courses could only be approved if they met three of the following five criteria: basic academic skills (including mathematical skills), awareness of interrelation among disciplines, critical thinking, values awareness, and multicultural/gender awareness. A 1987 accreditation review report had expressed concern that we had not been monitoring the student learning experience, but had settled merely for "supply side" guarantees that weren't guarantees of learning at all. To provide more of a guarantee, we established a five-year course revalidation cycle under which all general education courses needed to be reviewed in terms of demonstrations that they were each meeting at least three of the five objectives above.

Method

We developed a simple survey instrument that asked students in all general education classes whether they were aware of opportunities to develop their level of performance or knowledge in these criterion areas: 1) basic academic skills including mathematics, 2) interdisciplinary awareness, 3) critical thinking, 4) values awareness, and 5) multicultural and gender and minority class awareness. The survey asked the students to assess the opportunity for development in these areas using a 1–5 scale. The widespread surveying in conjunction with the revalidation process gave us a fuzzy snapshot of how the program worked, with which we could persuade ourselves that change was needed. Such a "rhetorical" approach to assessment may seem questionable to mathematicians, but I would argue that such techniques, when used formatively, have their value.

Institutional assessment at St. Cloud State has always been problematic. The unionized faculty have been extremely sensitive to the dangers of administration control, and to the need for faculty control over the learning process. This sensitivity has translated into a broad faculty ownership of such assessment processes. Thus, the changes in the general education program developing out of the assessment process have been very conservative, maximizing continuity. Also, the faculty has claimed the right to be involved in the design and approval of such a simple thing as the development of a student survey whose already determined purpose was to ask about whether already defined general education objectives were being emphasized in a particular class. The assessment committee and assessment coordinator have authority only to make recommendations for the faculty senate to approve. While this governance process makes development processes restrictive, it permits a fairly broad degree of faculty ownership of the process that provides a positive atmosphere for assessment operations.

We wanted to use the survey instrument in a way that would provide feedback to the instructor, to the department for course revalidation, and to the general education program to help trace the operationality of our program criteria. The surveying system has been working as follows. When we began the process in 1992, we received approval for a general survey of all general education classes being taught during the spring quarter. Because many lower-level students are enrolled in various general education classes, this meant that we would need to be able to process something like 20,000 forms. We developed a form with questions based on the program criteria, had the students fill out that form, and then coded their responses onto a computer-scannable form. These forms were then processed by our administrative computing center, and data reports were returned to the appropriate departments for their information. Thereafter, the survey was available for use as departments prepared to have their general education courses revalidated for the next five-year cycle. A requirement for revalidation was that survey data be reported and reflected upon.
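Mechanically, the reporting step amounts to averaging the 1–5 responses per criterion within each course. A sketch of that aggregation follows; the column names and records are hypothetical, not the actual scannable-form layout.

    # Aggregate 1-5 survey responses into per-course, per-criterion means,
    # in the spirit of the departmental data reports described above.
    # Column names and records are hypothetical.
    import pandas as pd

    responses = pd.DataFrame([
        {"course": "SOC 160",  "criterion": "basic_skills_math", "rating": 2},
        {"course": "SOC 160",  "criterion": "critical_thinking", "rating": 4},
        {"course": "BIOL 103", "criterion": "basic_skills_math", "rating": 4},
        {"course": "BIOL 103", "criterion": "basic_skills_math", "rating": 3},
    ])

    report = (responses
              .groupby(["course", "criterion"])["rating"]
              .agg(["mean", "count"])
              .round(2))
    print(report)   # one row per course/criterion pair, ready for departments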

At the end of the first year, a data report of the whole program was used to identify criteria that were problematic. In 1996, as part of a program review for converting from a quarter calendar to a semester calendar, we reviewed and revised the program criteria, and created a new survey that reflected the new criteria. Again, we requested and gained approval for a broad surveying of all general education courses. This new survey has proved a more varied and useful instrument for assessing general education. However, it remains a student-perception instrument of fairly weak validity and reliability, not an instrument for measuring academic achievement. Even so, it has proven useful, as the following focus on the quantitative literacy criterion illustrates.

Findings

Figure 1 is a graph of the results from the first surveying, from 1992 through 1995. It shows averages for classes in each of the program areas: communication, natural science distribution, social science distribution, arts and humanities distribution, and the electives area (a two-course area that allows students to explore areas for possible majors in more pre-professional areas). What the picture showed was that very little mathematics was happening anywhere in the general education program. In fact, we found only 3 classes in the whole program which "scored" at a mean level over 3.5 on a scale of 1 to 5 (with 5 high). The courses in the science distribution showed the strongest mathematical content; the social science courses showed the weakest.

We found it easy, of course, to identify partial explanations — the serious mathematics courses with large enrollments were not in the general education program, the question was framed narrowly in terms of mathematical calculation rather than quantitative thinking, and so forth. But none of that got around the fact that the students were telling us that they were aware of having precious little mathematical experience in the general education program. The effect was to put quantitative literacy in a spotlight as we began monitoring the program.

Use of Findings

The information provided by the graph was widely discussed on campus. Our concern about mathematical awareness in general education became a major issue when the Minnesota legislature decided to require that all public universities and colleges convert from a quarter to a semester calendar. This required a reshaping of the general education program and, in particular, reestablishing a quantitative literacy requirement as part of a general education core. In addition, the program criteria have been revised to provide a stronger basis for program assessment. The "mathematical component," rephrased as the "quantitative literacy" component, has been redefined in terms of quantitative and formal thinking, and additional basic skills criteria have been added relating to computer experience and science laboratory experience. Results coming from the new survey that started to be used in the winter of 1996 indicate that students are seeing more mathematics content everywhere (see the Quantitative Thinking results in Figure 2), but particularly in the social science distribution block, as understanding data and numerical analyses become defining criteria.

Figure 1. Old GEd Survey (Student Perception: Objectives/student). [Graph not reproduced.]
Figure 2. New GEd Survey (1995–97 results: mean per student). [Graph not reproduced.]

We have tried a small pilot of a Quantitative Literacy assessment of upper division classes across the curriculum using a test, admission data, mathematics course grades, and a survey. We found no significant correlations, probably indicating the roughness of the testing instruments, but did get a strong indication that the general education level math courses were not effectively communicating the importance of mathematics knowledge and skills to success in the workplace.

We will continue the surveying process, anticipating that as all students have to work with the new semester quantitative thinking requirement, they will be able to deal with more sophisticated numerical analyses in other general education courses. This improvement should be visible in individual courses, in the course revalidation review process, and in the program as a whole in the results from the survey. Since our goal under the North Central Association assessment requirement is to be able to document the improvement of student learning in the program, we look forward to such results.

Success Factors

The first success factor has been the way a simple survey has generated meaningful discussion of definitions and learning expectations of general education criteria. Whereas course evaluation has tended to generate anxiety and obfuscation in reporting, the survey has provided a vehicle for productive collegial discussion within the faculty at different levels. For example, the revalidation process requires that faculty rationalize the continuation of a particular social science course using survey results. Suppose the math result for that course was 3.8 out of 5, with a large variance, and 25% of the students indicating that this goal seemed not applicable to the course. That might reflect the extent to which understanding data analyses and measuring social change is basic to that course, but it definitely leads to some thinking about just how quantitative-thinking expectations for a course are to be defined, and just where the numbers are coming from. This then would be discussed in the application for revalidation, and some decisions would probably be made to include some alternative reading and perhaps some work with a statistical analysis package. Admittedly, quantitative literacy is loosely defined by non-mathematicians, but once it is defined, however loosely, it becomes discussible at the program level. Part of the five-year program review involves reviewing overall survey results and course descriptions to see what is meant in various departments by "quantitative thinking" and how it is taught across the program curriculum.

A second success factor is the current framework of institutional and system expectations concerning assessment. Our faculty have been aware that assessment reporting will be required in the long term to justify program expenditures and that, to meet accreditation expectations, assessment reporting needs to include data on academic achievement. Student learning data is thus required in the second revalidation cycle, and survey results have been providing some student input that helps to define critical assessment questions. In the case of the hypothetical sociology course, the survey score is something of an anomaly: in spite of the emphasis in the class on the use of data to describe and evaluate social realities, half of the 75% who even thought mathematical thinking was relevant to the course scored the emphasis from moderate to none. However "soft" that result from a validity perspective, it provides something of a shock that would motivate faculty interest in emphasizing mathematical thinking and tracking student ability and awareness of mathematical reasoning in tests, papers, and other activities in the class. In this sense, the survey has served as a crude engine for moving us along a track we are already on.

Editor's Note

When asked if the numbers could be used to present more specific results, Philip Keith replies no, because, "We are dealing with a huge program that has been flying in the dark with pillows, and the point is to give questions like 'why did the students give this score to this item and that to that' some valid interpretation. For example, students in Geography XXX feel that writing is more important than other students do, in English Composition. How is this possible? Well, this Geography class serves a small band of upperclassmen who work continually on research reports which count toward their major in chemistry and meteorology. In other words, the numbers don't determine or judge anything, they point! When they are counterintuitive, then you look for explanations. And that's a useful way to get into assessment in murky areas."


Does Developmental Mathematics Work?

Eileen L. Poiani
Saint Peter's College

An inner city school with a diverse, multicultural clientele is deeply committed to raising students' mathematical abilities. The school has operated with grants that are now drying up, but that helped author some assessment studies over a ten-year period. They ask: does developmental mathematics help, or is it a hopeless cause?

Background and Purpose

The June 1997 issue of Network News [1] states:

Remediation at the postsecondary level has long been a controversial topic. Those in favor argue that postsecondary remediation provides a second chance for underprepared students, while those opposed maintain that it is duplicative and costly, and may not be effective.

Saint Peter's College has tried to address both aspects of this issue. As a medium-sized (about 3800 students) Catholic, Jesuit liberal arts college in an urban setting with a richly diverse student body, Saint Peter's has long applied resources to assist underprepared students in mathematics. The roots of the developmental mathematics courses go back to the Educational Opportunity Fund's Summer Mathematics Program in 1968. The department then initiated College Algebra (now MA 021, 3 credits, for the day session) in 1975, and added Introductory Algebra (now MA 001, no credit) in 1980. All students must fulfill a six-credit core mathematics requirement, consisting of a Calculus or Finite Mathematics sequence. (Students normally progress from MA 001 to the Finite Mathematics sequence or from MA 001 to MA 021 and then to the Calculus sequence.) College advisors use placement test results and students' past academic performance to assign students to appropriate developmental courses.

How well students are being "mainstreamed" from the developmental into regular college courses was the subject of three major studies. The first Mainstreaming Study was prompted by a request from the Middle States Association of Colleges and Schools, our regional accrediting agency, for a follow-up report on developmental programs.

Method

Over the course of years, we have conducted several investigations1:

I. The 1990 Mainstreaming Study, completed in May 1990, analyzed freshmen who entered in Fall 1984 by considering three entrance categories of students: (1) regular admits, (2) students in the Educational Opportunity Fund (EOF) program — a state-supported program for selected students with income and academic disadvantage — and (3) students in the College's Entering Student Support Program (ESSP), who were identified at admission as academically underprepared in one or more subjects. (ESSP began under a Title III grant in 1981.)

This study, conducted by Dr. Thomas H. Donnelly, had two components:

(1) retention/graduation data for students in these categories. Here we looked at entrance categories, racial/ethnic background, gender, verbal and mathematics SAT scores, high school quintile ranking, college GPA, status and credits earned by non-graduates, and consistency in entering and final major for graduates and non-graduates.

(2) student success in developmental mathematics courses and subsequent mainstream college courses. We studied enrollment and grades among developmental (ESSP and EOF) students and non-developmental students in selected developmental courses and subsequent core courses, and enrollment and grades for these groups in other core courses.

II. A 1992 study conducted by Brother James Dixon produced performance data on freshmen entering from Fall 1984 to Fall 1991. These data examined outcomes in a core mathematics sequence (Finite or Calculus) relative to whether or not a developmental course had preceded it.

III. A 1996 study conducted by Dr. Kathy Russavage replicated and expanded the earlier Mainstreaming Study, examining outcomes of freshmen (day session) who entered college from Fall 1990 to Fall 1995. These data were considered in preparing the Five-Year Review of the Mathematics Department, and the results continue to be used to shape our progress. In this study, the 1990 study was expanded to examine enrollment patterns in developmental and core courses, to identify repeaters of developmental courses, and to examine retention and graduation relative to various student characteristics.

Findings

Later studies echoed trends of the original 1990 study. The results in 1990 showed that students participating in EOF and ESSP developmental programs did move, in sizeable proportions, into mainstream courses and did persist to complete bachelor’s degrees. Since 1990, the performance and persistence of ESSP students has declined somewhat, due to diminished services possible with available funding. In the 1997–1998 academic year, we have worked to restore some of the positive aspects of the ESSP program based on the results of our studies. As expected, developmental mathematics works less successfully for students in the Calculus sequence than in the Finite Mathematics sequence. Findings that Dr. Russavage identified in the most recent (1996) study are as follows:

• An average of approximately 27% of incoming freshmen from Fall 1990–1995 were required to take a developmental mathematics course.

• An average of about 23% of the freshmen enrolled in MA 001 needed to repeat it to achieve successful completion (16% twice, 7% more than twice). Meanwhile, 6% of MA 021 students took it twice and 1% more than twice.

• Performance in developmental courses strongly correlates with the successful completion of core curriculum mathematics courses, as the following table shows (a short calculation sketch follows this list):

Developmental Course Grade     % Freshmen Completing Core

4.0 (highest)                  53%
3.0 or 3.5                     44%
2.0 or 2.5                     36%
1.0 or 1.5                     26%
0.0                             3%
WA/WD/IC*                       2%
No developmental course        60%

(* Withdrawal for Absence / Withdrawal / Incomplete)

• Approximately 55% of entering freshmen completed the core mathematics requirement within two years of entrance.

• For the cohorts studied, 86% of those who completed the core mathematics requirement are continuing or have graduated by the end of the Fall 1995 term.

• EOF students, a small group of 50-55 students, who receive remedial mathematics in the summer before college and continue to receive special tutoring thereafter, normally have better retention and graduation rates than other special admits.
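
As promised above, the grade/completion association in the table can be made precise with a rank correlation. The Python sketch below is a hypothetical illustration only: it uses the published aggregate rates, assumed midpoints for the grade bands, and a helper function of our own; the study itself reports no such statistic, and a real analysis would need per-student records.

```python
# Hypothetical illustration (not from the study): rank correlation between
# developmental-course grade band and the aggregate core-completion rate
# from the table above. Grade-band midpoints are assumed; the WA/WD/IC and
# "no developmental course" rows are omitted because they are not grades.

def spearman_rho(xs, ys):
    """Spearman rank correlation, assuming no tied values."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0] * len(vs)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

grade_band = [4.0, 3.25, 2.25, 1.25, 0.0]    # assumed band midpoints
completion = [0.53, 0.44, 0.36, 0.26, 0.03]  # fraction completing core

print(f"Spearman rho = {spearman_rho(grade_band, completion):.2f}")
```

On these aggregate figures the correlation is perfect by construction, since the rates fall strictly as the grade band falls; only a per-student analysis could measure the true strength of the relationship.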

Use of Findings

The Mathematics Department has long been attentive to the need to continually evaluate its program outcomes, and results such as these have helped us pinpoint our weaknesses and address them. We have introduced, over time, computer-supported learning modules, personalized instruction, and greater use of graphing calculators in developmental work. The CALL program (Center for Advancement of Language and Learning) also provides student tutors free of charge. Furthermore, several core mathematics sections are taught by the “Writing to Learn” method, whereby students maintain journals and enhance problem-solving skills through writing and frequent communication with the instructor. Faculty members undergo intensive training to become part of the “Writing to Learn” faculty. Saint Peter’s has also used the results of these studies in its Institute for the Advancement of Urban Education, which reaches out to promising high school juniors and seniors in need of mathematics and other remediation by offering special after-school and Saturday programs.


A concerted effort is currently underway to study the factors contributing to the repetition of developmental courses. We are considering running special sections for repeaters. Whether or not learning disabilities play a role also needs to be explored. The most important feature of these studies is our goal to identify “at risk” students early so we may impress on them the need for regular attendance, good study habits, and persistence. To this end, a three-week Summer Academy was created for August 1997, seeking to replicate, within limited internal resources, some of the features of the EOF program. Open to any incoming freshman (except EOF students, who have their own six-week program), the Academy focuses on a successful transition to college, emphasizing academic and life skills, effective communication, and expectations for achieving success in college.

Success Factors

Saint Peter’s College is aware that every walk of life requires mathematical literacy. But helping developmental students understand its importance is no mean feat. Student anxieties and frustration need to be replaced by confidence and persistence. Students need to cultivate good study habits and understand the importance of regular attendance. To this end, personal attention and support of students is important. Peer tutoring, cooperative learning, and encouragement from faculty and family, where possible, are ingredients in a successful developmental program.

Overall, we are encouraged by our assessment data to say that our developmental program has succeeded in assisting many underprepared and overanxious students to achieve satisfactory performance in developmental courses and related core courses, and ultimately to reach their career goals. However, in the Saint Peter’s tradition, we are continuing to examine the data to find areas for improvement.

References

[1] National Center for Education Statistics (Project of the State Higher Education Executive Officers). Network News, Bulletin of the SHEEO/NCES Communication Network, 16 (2), June 1997.

———————
1 To produce these reports, we have had the able assistance of Institutional Research Directors Thomas H. Donnelly, Brother James Dixon, and Kathy A. Russavage; Developmental Mathematics Director and current department chair Gerard P. Protomastro; former Mathematics chair Larry E. Thomas; and Director of the Institute for the Advancement of Urban Education David S. Surrey; as well as the entire mathematics department.

“Let them know what you’re up to, listen to what they say.” Effective Relations Between a Department of Mathematical Sciences and the Rest of the Institution

J. Curtis Chipman
Oakland University

How can a mathematics department please its client disciplines? This department finds a solution in establishing a web of responsible persons: course leaders, committees, and faculty liaisons who set goals for students and instructors and take responsibility for the placement of students into courses and for the content of those courses.

Background and Purpose

It’s one thing to say that effective departmental relations with the rest of the institution depend upon good communications; it’s quite another to determine who should be talking to whom and what they should be discussing. This article illustrates how common assessment issues form a natural structure for such a dialogue — one that can actually produce results.

Oakland University is a comprehensive state university located on the northern edge of the Detroit metropolitan area. Its total enrollment is approximately 13,500, 20% of which is graduate and primarily at the master’s level. Most students commute, many are financing their education through part-time employment, and pre-professional programs are the most popular. The average ACT score of matriculating students is in the low 20s.

The Department of Mathematical Sciences has 26 tenure-track faculty positions. A mixture of part-time instructors and graduate teaching assistants brings the total instructional staff to 35 full-time equivalents. The department accounts for over 10% of the institution’s total credit delivery. The bulk of these credits are at the freshman and sophomore level, in courses required for major standing in the various professional schools or elected to satisfy the university’s general education requirements. Average course success rates (measured in terms of the percentage of students enrolled in a course who complete the course with a grade of 2.0 or higher) have varied widely over the last 15 years, from a high of 70% down to 33%. By the end of the 1980s, these low rates had led to very negative perceptions of the department’s instructional efforts. There was general departmental consensus that these perceptions threatened institutional support for the department’s priority aspirations in the areas of research and emerging industrial collaborations. An effective response was clearly necessary. The one developed had two phases: the first took place solely within the department; the second, which continues to the present, involved the department and major units of the university.

Method

To improve its instructional image (and hopefully its instructional effectiveness), the department developed a unified policy for the delivery of courses at the freshman and sophomore levels. This policy included a general statement of departmental goals and objectives for all of the courses, detailed policies specific to each course, and a process for continuing course development with responsibilities allocated between teaching faculty and the department’s curriculum committee.

For example, the general policy set the goal of “an academically sound curriculum in which most conscientious students could expect to be successful.” It committed to “provide the skills and understandings necessary for later courses,” to insure “consistent course policies in all sections of a course during a given semester,” and, “as consistent with other course goals, to adopt changes likely to increase the number of students being successful in each class.”

Specific course policies would be described in Student and Instructor Information Sheets distributed at the beginning of each course. Issues such as prerequisites, grading, calculator usage, syllabus, and suggestions for successful study habits were addressed in the Student Information Sheets. The Instructor Information Sheets addressed issues such as typical student clientele, template processes for common test construction, current course issues, and student success rates in the course for the preceding eight semesters. The initial approval of these sheets would be made by the department. Future revisions in these sheets would be at the initiative of the designated faculty Course Leader, with approval by the department’s committee on undergraduate programs, which would seek departmental approval for major course changes.

Further details are given in Flashman’s panel article [1], but for this account, the key fact is that with the adoption of this policy, typical major assessment elements were in place. The department had fully considered, debated, and decided what it was trying to do, how it would try to do it, how it would measure how well it was doing, and how it would make changes that could assist in doing it better.

Findings

Since the unified policy was approved, there has been and continues to be a series of interactions with the rest of the university in the context of this general policy. The department found these to be a natural consequence of the implementation demands and communication needs of the new policy.

Three interaction examples are discussed in the next section. What is illustrative about these examples is not so much the items under consideration or the actual participants, but the manner in which the implementation of a specific assessment mechanism naturally leads to a process for effective interaction with the rest of the institution.

Use of Findings

Interaction 1. Calculus Reform and Relations with Other Science Departments

The first example concerns the issue of calculus reform; the external units were the School of Computer Science and Engineering along with natural science departments in the College of Arts and Sciences. The policy impetus for this interaction was the initial departmental approval of the information sheets for the mainstream calculus course. Here’s what happened.

The department’s undergraduate committee decided this was the time to grasp the nettle of calculus reform and determine a departmental reaction to the various national efforts underway. The Chair wrote to the Dean of Computer Science and Engineering and to his counterparts in the natural sciences, informing them of this effort and requesting faculty in their units to be identified as liaisons for consultative purposes. These colleagues were initially interviewed by the department committee concerning the state of the current course and later invited to review texts under consideration and drafts of the materials to be submitted for departmental approval. The process resulted in the departmental approval of new materials for the calculus sequence. A university forum was held for the science liaisons and other interested colleagues to describe the coming course changes. All of this was widely publicized across the university and covered in the student newspaper.

The department’s currency with, and willingness to actively consider, issues of national curricular change was demonstrated. Its concern for the views of its major clients in the sciences was emphasized. A positive precedent for external consultation was established (which was later to be reinforced, as described in the third example that follows). Subsequently, there was little surprise and general support at the most recent departmental meeting when the current Chair announced his intention to ask that 1) a permanent engineering faculty liaison be appointed and 2) a rotating (nonvoting) seat be created on the undergraduate committee for the chief academic advisors from the various professional schools.

Interaction 2. Student Support and Relations with the Division of Student Affairs

The second example concerns the issue of support for student work outside of class; the external unit was the Academic Skills Center in the Division of Student Affairs. While there had been a number of positive faculty interactions with the Center prior to the policy, the policy context for this interaction was the initial departmental approval of the student information sheets, which addressed issues such as homework, office hours, and other help outside of class. Here’s what happened.

Some of the support services offered by the Academic Skills Center include free peer tutoring, supplemental instruction, and luncheon seminars on various study skills. Many of these are routinely promoted in the student information sheets. In addition, departmental faculty participate in the Center’s training sessions for supplemental instruction, hold review briefings for tutors, and lead study skills seminars. It has also become a common practice for many faculty to allocate some of their office hours directly to the Center, meeting with walk-in students at the Center itself. The thank-you notes for this departmental commitment to student success reached all the way up to the university’s president.

There has also been an economic component to this active departmental support of another unit’s efforts. The vast majority of the tutors hired and Supplemental Instruction sessions offered by the Center are for the direct support of courses in the mathematical sciences. Indeed, the dollars spent far exceed those which the department could possibly allocate from its College budget.

Interaction 3. Placement, Course Content, and Relations with the School of Business Administration

The third and final example concerns the issues of student placement and course content; the external unit was the School of Business Administration. The impetus for this effort was a task force appointed by the dean of that school to review and make recommendations concerning the two-course sequence in mathematics, Linear Programming/Elementary Functions and Calculus for the Social Sciences, required in the pre-major program in business. Since in its policy governing these courses the department had already committed itself to the goal of increasing the number of students successful in such courses, and had also assigned responsibilities and processes for considering and instituting course improvements, it was well positioned to respond positively to this external initiative. Here’s (some of) what happened.

The task force included mathematical sciences faculty among its membership, met for a full academic year, commissioned a number of studies, and issued its recommendations. During the academic year which followed the filing of its final report, the department developed formal responses to all of the task force’s recommendations, assisted by a number of formal studies and pilot projects supported by joint funding from the deans of Business Administration and the College of Arts and Sciences and the Vice President of Student Affairs. Faculty liaisons from the department and the school were appointed to oversee this process. For the purposes of this article, the focus will be upon two specific items which illustrate well the assessment issues of careful information gathering, data-based decision making, and resulting change. As described at the beginning of this example, the two issues are student placement and course content.

Course Placement

The task force had recommended that the department review its method of student placement, given the low success rates in the two courses required of its pre-majors (which typically were in the low 50% range). To assist in the formulation of its response, the department’s undergraduate committee accepted the invitation of the business school’s liaison to conduct, with another colleague trained in industrial/organizational psychology, a formal validation study of the department’s existing placement test. The committee also undertook a survey of other departments’ placement practices throughout the state. As a result of this external study and its own survey, the department determined to change its placement process. The existing test had been based upon a version distributed by the MAA more than ten years earlier, which the department had never updated. The statistical results from the validation study revealed low correlation between test scores and later results in some courses. It also identified many test questions as invalid, since their unit scores deviated strongly from total test scores or otherwise failed to differentiate among students taking the test. The implementation of a placement system based upon ACT scores for all beginning courses outside the mainstream calculus sequence begins in this current academic year. For the mainstream calculus course, a process of revising the current test is underway with the continuing advice and consultation of the business faculty who conducted the original study.
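
The flagging rule the validation study used, items whose scores deviate from total test scores or fail to differentiate among students, is in the spirit of classical item analysis. Here is a minimal Python sketch of that idea; the response matrix, the 0.2 threshold, and the helper function are invented for illustration and are not the Oakland instrument or procedure.

```python
# Hypothetical sketch of classical item analysis for a placement test.
# An item is suspect when its score barely correlates with total scores
# (poor discrimination) or when nearly everyone answers it the same way.
from statistics import mean, pstdev

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    sx, sy = pstdev(xs), pstdev(ys)
    return cov / (sx * sy) if sx and sy else 0.0

# responses[s][i] = 1 if student s answered item i correctly, else 0
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]
totals = [sum(row) for row in responses]

for i in range(len(responses[0])):
    item = [row[i] for row in responses]
    p = mean(item)             # difficulty: proportion answering correctly
    r = pearson(item, totals)  # item-total correlation (uncorrected: the
                               # item itself is still inside the total)
    flag = "SUSPECT" if p in (0.0, 1.0) or r < 0.2 else "ok"
    print(f"item {i}: p = {p:.2f}, item-total r = {r:.2f} -> {flag}")
```

A production analysis would use the corrected item-total correlation (removing the item from the total) and far more students, but the logic of the screen is the same.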

Course Content

In the area of course content, the task force had conducted a careful survey of course topics used in advanced business courses, surveyed the content of corresponding courses across the state, interviewed instructors of the courses, and developed a statement, written by a working group of business faculty, describing their goals for the course in a business major’s curriculum. In formulating their response, the department’s undergraduate committee asked the course leaders of these two courses to draft new syllabi and commissioned a question-by-question analysis of the departmental final examinations in these courses. Through a series of revisions and consultations between the committee and the working group, new syllabi for both courses were finally approved and implemented. The final result was a 20% reduction in the topic coverage for each course.

In the years since these studies were completed, both the department and the school have continued to appoint faculty liaisons who meet monthly to discuss the implementation of these and other changes that resulted from the process, as well as developing other means for potential course improvements. Their current efforts were recently the subject of a major article in the university’s newspaper.

Success Factors

The actual adoption of a specific assessment policy by a real department and its subsequent implementation by real people in a real university set in motion an entire chain of interactions whose ramifications would have been difficult to predict. In addition to the constructive context which the department’s policy provided for its relations with the rest of the university, there were four lessons learned from the initial round of interactions. They appear to be particularly relevant for any department contemplating such a process and can be summarized as follows.


• Faculty and administrative professional colleagues outside the department are much more willing and able to support your efforts if they know what you are trying to do and how you are trying to do it.

• Inviting and receiving recommendations from others does not obligate you to accept them, only to seriously (and actually) respond to them. Indeed, many faculty outside the department have professional skills and interests much more suited to the development of relevant data than you do. Even though your own colleagues may be initially quite leery of such external involvement, positive precedents can allow (mutual) trust and confidence to develop quickly.

• Just as in a class, important messages have to be continually repeated, not just to the participants but to interested observers as well. Publicity is not a dirty word; it is an important way of letting people know what you’re up to.

• Effective university relations require a great deal of time and energy. This means someone’s real time and someone’s real energy. If a department wishes to make this investment, it needs to carefully consider who will be involved and how their efforts will be assessed for the purposes of both salary and promotion.

Reference

[1] Roberts, A.W., ed. Calculus: The Dynamics of Change, MAA Notes Number 39, The Mathematical Association of America, Washington, DC, 1996.

Have Our Students with Other Majors Learned the Skills They Need?

William O. Martin and Steven F. Bauman
University of Wisconsin-Madison and North Dakota State University

A large university begins by asking teachers in other disciplines not for a “wish list” but for a practical analysis of the mathematical knowledge required in their courses. Pretests for students reflect these expectations, and discussion of results encourages networking.

Background and Purpose

Quantitative assessment at Madison began for a most familiar reason: it, along with verbal assessment, was externally mandated by the Governor of Wisconsin and the Board of Regents. Amid increasing pressure for accountability in higher education [3], all UW system institutions were directed to develop programs to assess the quantitative and verbal capabilities of emerging juniors by 1991. Although the impetus and some support for the process of assessment were external, the implementation was left up to the individual institutions.

The University of Wisconsin at Madison has been using a novel assessment process since 1990 to find whether emerging juniors have the quantitative skills needed for success in their chosen upper-division courses; a similar program began at North Dakota State University in 1995. A unique characteristic of both programs is a focus on faculty expectations and student capabilities across the campuses, rather than on specific mathematics or statistics courses.

The important undergraduate service role of most mathematics departments is illustrated by some enrollment data for the UW-Madison Department of Mathematics: in Fall 1994, the department had about 200 undergraduate majors and enrollments of about 6500 in courses at the level of linear algebra, differential equations, and below. Some of these students go on to major in a mathematical science; most are studying mathematics for majors in other departments. Mathematics faculty must perform a delicate balancing act as they design lower-division course work that must meet diverse expectations of “client faculties” across the campus.

Method

In a program of sampling from departments across the campus, we have gathered information about quantitative skills used in specific courses and the extent to which students can show these important skills at the start of the semester. Instructors play a key role in helping to design free-response tests reflecting capabilities expected of incoming students and essential for success in the course. Two important characteristics of this form of assessment are direct faculty involvement and close ties to student goals and backgrounds. We have found that the reflection, contacts, and dialogues promoted by this form of assessment are at least as important as the test results.

The purpose of assessment is to determine whether instructional goals, or expectations for student learning, are being met. We have found that explicit goals statements, such as in course descriptions, focus on subject content rather than on the capabilities that students will develop. Such statements either are closely tied to individual courses or are too broad and content-focused to guide assessment of student learning. Complicating the situation, we encounter diverse goals among both students and faculty. In response to these difficulties, we sample in junior-level courses from a wide range of departments across the campus (e.g., Principles of Advertising, Biophysical Chemistry, and Circuit Analysis). Instructors are asked to identify the quantitative capabilities students will need to succeed in their course. With their help, we design a test of those skills that are essential for success in their course. We emphasize that the test should not reflect a “wish list,” but the skills and knowledge that instructors realistically expect students to bring to their course.

By design, our tests reflect only material that faculty articulate students will use during the semester — content that the instructor does not plan to teach and assumes students already know. This task of “picking the instructor’s brain” is not easy, but the attempt to identify specific, necessary capabilities, as opposed to a more general “wish list,” is one of the most valuable parts of the assessment exercise.

A significant problem with assessment outside the context of a specific course is getting students (and faculty!) to participate seriously. We emphasize to participating faculty members the importance of the way they portray the test to students, asking them to inform students that:

• the test does not count toward their grade, but

• test results will inform students about their quantitative readiness for the course,

• the instructor is very interested in how they do, so it is crucial that students try their best, and

• results of the test may lead to course modifications to better match content to student capabilities.

On scantron sheets, each problem is graded on a five-point scale (from “completely correct” to “blank/irrelevant”); information is also coded about the steps students take toward a solution (for example, by responding yes or no to statements such as “differentiated correctly” or “devised an appropriate representation”). Within a week (early in the term) the corrected test papers are returned to students along with solutions and references to textbooks that could be used for review.

Although we compute scores individually, our main focus is on the proportion of the class that could do each problem. Across a series of courses there are patterns in the results that are useful for departments and the institution. Over time, test results provide insight into the service roles of the calculus sequence. We also use university records to find the mathematics and statistics courses that students have taken. Without identifying individuals, we report this information, along with the assessment test score, to course instructors.
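
To make the per-problem summary concrete, here is a small Python sketch of the reduction just described: coded five-point scores collapsed into the proportion of the class that could do each problem. The score matrix, the cutoff for “could do,” and the reporting format are all invented for illustration; the project itself worked from scantron records and its own coding categories.

```python
# Hypothetical sketch: per-problem success proportions from coded scores.
from collections import Counter

# scores[s][p] = coded score (4 = completely correct ... 0 = blank/irrelevant)
scores = [
    [4, 2, 0],
    [4, 4, 1],
    [3, 1, 0],
    [4, 0, 0],
]

n_students = len(scores)
for p in range(len(scores[0])):
    counts = Counter(row[p] for row in scores)
    # Treat a score of 3 or 4 as "could do the problem" -- an assumed cutoff.
    could_do = (counts[3] + counts[4]) / n_students
    print(f"problem {p}: {could_do:.0%} of the class could do it "
          f"(score counts: {dict(sorted(counts.items()))})")
```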

Findings

During the first five years of operation at UW-Madison, nearly 3700 students enrolled in 48 courses took assessment project tests of quantitative skills. We have found that instructors often want students to be able to reason independently, to make interpretations, and to draw on basic quantitative concepts in their courses; they seem less concerned about student recall of specific techniques. Students, on the other hand, are more successful with routine, standard computational tasks and often show less ability to use conceptual knowledge or insight to solve less standard problems ([1]), such as:

Here are the graphs of a function, f, and its first and second derivatives, f ´ and f ´´. (Graph omitted.) Label each curve as the function or its first or second derivative. Explain your answers.

(In one NDSU engineering class, 74% of the students correctly labeled the graphs; 52% of them gave correct support. In another engineering class, which also required the three-semester calculus sequence, 43% of the students supported a correct labeling of the graphs.)

Here is the graph of a function y = f(x). Use the graph to answer these questions:

(a) Estimate f ´(4). (On the graph, 4 is a local minimum. 84% correct)

(b) Estimate f ´(2). (On the graph, 2 is an inflection point with a negative slope. 44% correct)

(c) On which interval(s), if any, does it appear that f ´(x) < 0? (65% correct)

(Percentages are the proportion of students in a UW engineering course who answered the question correctly — a course prerequisite was three semesters of calculus.)

To illustrate common expectations, these two problems have been chosen by instructors for use in many courses. Our experience suggests that many instructors want students to understand what a derivative represents; they have less interest in student recall of special differentiation or integration techniques. Few students with only one semester of calculus have answered either problem correctly. Even in classes where students have completed the regular three-semester calculus sequence, success rates are surprisingly low. Most students had reasonable mathematics backgrounds; indeed, more than half of the 87 students mentioned here had a B or better in their previous mathematics course, which was either third-semester calculus or linear algebra. Problem success rates often are higher if we just ask students to differentiate or integrate a function. For example, over three-quarters of the students in the same class correctly evaluated the definite integral ∫₀² te^(−t) dt. (See [5] for a discussion of student retention of learned materials.)

Indicative of the complex service role played by the lower-division mathematics sequence, we noted the differing balance of content required by faculty in the three main subject areas: (a) Mathematics (four distinct courses); (b) Physical Sciences (five courses); and (c) Engineering (six courses); and we structured our problems to lie in four main groups: (A) non-calculus, (B) differential calculus, (C) integral calculus, and (D) differential equations. In mathematics courses, for example, 60% of the problems used were non-calculus; physical science drew heavily from differential calculus (56% of the problems), while engineering courses had a comparatively even balance of problems from the four main groups.

Use of Findings

Important advantages of this assessment method include:

• Faculty members must focus on specific course expectations to prepare an appropriate test.

• Student needs and backgrounds are reflected in this process because the test is tied to a course the student has chosen, usually at the start of their studies in the major.

• Faculty from mathematics, statistics, and client departments talk about faculty expectations, student needs, and student performance in relation to specific courses and programs.

• The conversations are tightly focused on the reality of existing course content and written evidence from students about their quantitative capabilities.

• Everyone is involved; students and faculty gain useful information that has immediate significance apart from its broader, long-term institutional meaning.

Instructors have mostly reacted very favorably to the assessment process. Some report no need to make changes, while others, recognizing difficulties, have modified their courses, sometimes through curriculum changes, or by omitting reviews or including additional work. Students report less influence, partly because many mistakenly see it as a pretest of material that they will study in the course. Some fail to see the connection between a mathematical problem on the test and the way the idea is used in the course. In technical courses, typically around half the class may report studying both before and after the assessment test and claim that the review is useful. Most students, when questioned at the end of the semester, recognize that the skills were important in their course but still chose not to use assessment information to help prepare.

We report annually to the entire mathematics faculty, but we have probably had greater curricular influence by targeting our findings at individuals and committees responsible for specific levels or groups of courses, particularly precalculus and calculus. Findings from many assessed courses have shown, for instance, that faculty members want students to interpret graphical representations. This had not always been emphasized in mathematics courses.

• After finding that many students in an introductory course were unable to handle calculus material, one department increased their prerequisite from first-semester business calculus to two semesters of regular calculus.

• In another department, many students with poor records in mathematics apparently did not realize that material from a prerequisite calculus course would be expected in later work. This illustrated the importance of advising, especially regarding the purpose of general education requirements.

• Faculty in other departments typically welcome the interest of our committee. One nontechnical department restructured their undergraduate program to incorporate more quantitative reasoning in their own lower-level courses.

• In another department, following a planning session, the coordinator for a large introductory science course remarked that he “couldn’t remember having spent even five minutes discussing these issues with mathematics faculty.”

An early, striking finding was that some students were avoiding any courses with quantitative expectations. These students were unable to use percentages or to extract information from tables and bar graphs. A university curriculum committee at UW-Madison, viewing these results, recommended that all baccalaureate degree programs include a six-credit quantitative requirement. The Faculty Senate adopted the recommendation, a clear indication that our focus on individual courses can produce information useful at the broadest institutional levels. Results of the assessment not only led to the policy, but also aided in designing new courses to meet these requirements. We are now refining our assessment model on the Madison campus to help assess this new general education part of our baccalaureate program.

How do faculty respond when many students do not have necessary skills, quantitative or otherwise? Sometimes, we have found a “watering down” of expectations. This is a disturbing finding, and one that individuals cannot easily address, since students can “opt out” of courses. Our assessment can help to stem this trend by exposing the institutional impact of such individual decisions to faculty members and departments.

Success Factors

Angelo and Cross, in their practical classroom assessment guide for college faculty [1], suggest that assessment is a cyclic process with three main stages: (a) planning, (b) implementing, and (c) responding (p. 34). Although we have cited several positive responses to our assessment work, there have also been instances where assessment revealed problems but no action was taken, breaking our assessment cycle after the second stage. We expect this to be an enduring problem for several reasons. First, our approach operates on a voluntary basis. Interpretation of and response to our findings is left to those affected. And the problems do not have simple solutions; some of them rest with mathematics departments, but others carry institutional responsibility.

Some of our projects’ findings are reported elsewhere ([2], [4]). While they may not generalize beyond specific courses or perhaps our own institutions, the significance of this work lies in our methodology. Because each assessment is closely tied to a specific course, the assessment’s impact can vary from offering particular focus on the mathematics department (actually, a major strength) to having a campus-wide effect on the undergraduate curriculum.

Assessment has always had a prominent, if narrow, role in the study of mathematics in colleges and universities. Except for graduate qualifying examinations, most of this attention has been at the level of individual courses, with assessment used to monitor student learning during and at the end of a particular class. The natural focus of a mathematics faculty is on their majors and graduate students. Still, their role in a college or university is much larger because of the service they provide by training students for the quantitative demands of other client departments. It is important that mathematicians monitor the impact of this service role along with their programs for majors.

References

[1] Angelo, T.A., and Cross, K.P. Classroom Assessment Techniques (second edition), Jossey-Bass, San Francisco, 1993.

[2] Bauman, S.F., and Martin, W.O. “Assessing the Quantitative Skills of College Juniors,” The College Mathematics Journal, 26 (3), 1995, pp. 214–220.

[3] Ewell, P.T. “To capture the ineffable: New forms of assessment in higher education,” Review of Research in Education, 17, 1991, pp. 75–125.

[4] Martin, W.O. “Assessment of students’ quantitative needs and proficiencies,” in Banta, T.W., Lund, J.P., Black, K.E., and Oblander, F.W., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass, San Francisco, 1996.

[5] Selden, A., and Selden, J. “Collegiate mathematics education research: What would that be like?” College Mathematics Journal, 24, 1993, pp. 431–445.

Copies of a more detailed version of this paper are available from the first author at North Dakota State University, Department of Mathematics, 300 Minard, PO Box 5075, Fargo, ND 58105-5075 (email: [email protected]).

A TEAM Teaching Experience in Mathematics/Economics

Marilyn L. Repsher, Professor of Mathematics
J. Rody Borg, Professor of Economics
Jacksonville University

Opening a course to both mathematics and business faculty teaching as a team creates public dialogue about problems that straddle two departments.

Background and Purpose

Networking with client disciplines is a role that mathematics departments must be prepared to take seriously. Business disciplines have large enrollments of students in mathematics, and many mathematics departments offer one or two courses for business majors, generally including such topics as word problems in the mathematics of finance, functions and graphing, solution of systems of equations, some matrix methods, basic linear programming, introduction to calculus, and elementary statistics. This is a dazzling array of ideas, even for the mathematically mature. As a result, the required courses often have high drop and failure rates, much to the frustration of students and faculty. Nevertheless, at joint conferences, business faculty consistently urge the mathematics department to keep the plethora of topics, assuring the mathematicians that business has a strong interest in their students’ development of a working knowledge of mathematics.

This article discusses one attempt to incorporate economic concepts with mathematics. What if students encountered mathematical topics in the context of economic reasoning? Would understanding of both disciplines be increased? Would student satisfaction improve? Would students stop asking, “What is this good for?” An integrated mathematics and economics course ensued: TEAM — Technology, Economics, Active learning, and Mathematics. The plan was to obtain a dual perspective on the problems of first-year instruction in both economics and mathematics, and to use these assessments to feed back to both disciplines for further course development and more cooperation among the departments.

Jacksonville University is a small, private, liberal arts college which draws students from the Northeast and from Florida. Students tend to be career-oriented, fairly well-prepared for college, but lacking self-motivation. They view education as an accumulation of facts, and synthesis of these facts is largely a foreign notion. But if instructors make no synthesis of course content, it is unrealistic to expect freshmen to do so. The following is an account of success and failure of a totally integrated program* in which mathematical concepts are developed as needed, within the framework of a two-semester course in the principles of economics.

Method

We decided to develop mathematical concepts as needed in the study of the principles of economics. For example, slope would be couched within the topic of demand curves, and derivatives would emerge in an investigation of marginal cost and marginal revenue. The year-long syllabus included the standard topics of business calculus and elementary statistics, both required by the College of Business. The classes were taught in a two-hour block with little distinction between topics in economics and mathematics. Both the economics and the mathematics instructors were present for all class meetings, and each pair of students had a computer equipped with appropriate software. The subject matter involved a series of problem-solving exercises which enabled the students to construct their knowledge via active, cooperative learning techniques. The instructors served as facilitators and coaches, and lecturing was kept to a minimum. Some of the exercises were done by the student pairs, but most were completed by two pairs working together so that four students could have the experience of setting priorities and assigning tasks. Reports were group efforts.

———————
* Supported in part by DUE grant 9551340.

Apart from a desire to increase conceptual understanding in both disciplines, the instructors’ intentions, we found later, were initially too vaguely formulated. But as the course evolved over the two-year period, more definite goals emerged. Ultimately, we assessed a number of fundamentally important criteria: (1) student achievement, (2) student retention, (3) development of reasoning skills, (4) attitude of students, and (5) attitude of other faculty toward the required courses.

Findings

Student achievement. The Test of Understanding in College Economics (TUCE) was administered at the beginning and the end of each semester, both to TEAM members and to students in the standard two-semester Principles of Economics course. Slightly greater increases were indicated among TEAM members, but the difference was not statistically significant. Student achievement in mathematics was measured by professor-devised examinations. Similar questions were included in the TEAM final examination and in examinations in the separate, traditional mathematics courses. No significant differences were noted, but students in the TEAM course performed on the traditional items at least as well as the others. Students in the TEAM course performed better on non-traditional questions; for example, open-ended items on marginal analysis within the context of a test on derivatives. Since confidence and a sense of overview are ingredients of successful achievement, this anecdotal evidence supports the hypothesis that integrated understanding of the subjects took place.
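
For readers who want to see the shape of such a pre/post comparison, here is a hypothetical Python sketch using a Welch t statistic on invented gain scores. Nothing here reproduces the study’s actual TUCE data or its statistical procedure; the numbers are fabricated solely to illustrate the kind of computation behind a “not statistically significant” conclusion.

```python
# Invented illustration of a pre/post comparison: Welch t statistic on
# hypothetical TUCE gain scores (post minus pre) for the two groups.
from statistics import mean, variance

team_gains     = [6, 4, 7, 5, 3, 6, 5, 4]  # hypothetical TEAM section
standard_gains = [5, 3, 4, 6, 2, 5, 4, 3]  # hypothetical standard sections

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se2 = variance(a) / len(a) + variance(b) / len(b)
    return (mean(a) - mean(b)) / se2 ** 0.5

t = welch_t(team_gains, standard_gains)
print(f"mean gains: TEAM {mean(team_gains):.2f}, standard {mean(standard_gains):.2f}")
print(f"Welch t = {t:.2f}")  # roughly, |t| below 2 is not significant at 5%
```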

Student retention. In the TEAM course, the dropout rate was substantially reduced. In comparable mathematics courses it is not unusual for ten percent to fail to finish the course, but nearly every one of the TEAM members completed the courses. Numbers were small, just twenty each semester for four semesters, so it is possible that this improvement in retention would not be replicated in a larger setting. The students evidently enjoyed their work and expressed pride in their accomplishments.

Reasoning skills. TEAM students were more proficient than others in using mathematics to reason about economics. For example, class discussions about maximizing profit showed TEAM students had an easy familiarity with derivatives. Students on their own seldom make such connections, and this may be the most important contribution of the TEAM approach. When students link mathematics to applications, they become more sure about the mathematics.

Student attitudes. TEAM members demonstrated increased confidence in attacking “what-if” questions, and they freely used a computer to test their conjectures. For example, in a project about changes in the prices of coffee, tea, sugar, and lemons, the students first reasoned that increasing the price of lemons would have no effect on the demand for coffee, but after some arguing and much plotting of graphs the teams concluded that the price of lemons could affect the demand for coffee, since coffee is what economists call “a substitute good” for tea.

TEAM students uniformly reported satisfaction with their improved computer skills. On the other hand, they showed some resentment, especially at the beginning of the course, that the material was not spoon-fed as they had come to expect. Even though some economic topics are highly mathematical, the resentment was more pronounced in what the students perceived to be “just math.” Indeed, mathematical concepts often became more acceptable when discussed by the economist.

The emphasis on writing added some frustration. One student asked, “How can I do all the computations and get the right answers but still get a C?” Not every student was entirely mollified by the explanation that a future employer will want not only the correct answers but also a clear report of the results.

Faculty attitudes. The authors were surprised that they did not meet the kind of opposition that has sometimes attended calculus reform efforts, but there was some reluctance among colleagues to consider expanding the program to a larger audience. Even people who are favorably inclined to participate in an integrated course are made nervous by the fact that some topics must of necessity be curtailed or eliminated. In hindsight, it would have been well to involve more faculty from both departments at the early stages.

Use of Findings

Although the TEAM course is no longer offered, this does not mean that the experiment failed, for much of what was learned is being incorporated into the existing courses. The lab assignments have been revised and expanded for use with a larger audience, both in principles of economics classes and in the business calculus and elementary statistics sessions. Teachers are finding, often based on discussions with the two experimenters, the many advantages of cooperative learning together with an interdisciplinary approach to mathematical content. More importantly, plans are being laid to offer a new integrated course in the 1999-2000 academic year.

Success Factors

A small institution with computerized classrooms will find the experiment worthwhile, but the labor-intensive delivery system is probably too expensive for widespread application. Desirable as it is to have both economics and mathematics instructors present for all class meetings, it may not be feasible on a routine basis. A revision of the economics curriculum is under discussion. It has become desirable to offer an introduction to economics aimed at students with a strong mathematics background, with a syllabus which would easily incorporate the experimental materials. Finally, and even more importantly, the dialogue between mathematicians and members of their client disciplines will continue, not just with business. Our work has set a model for networking and assessing across the curriculum, with physics, engineering, and others. We must decide what concepts students should carry with them and we must work for the development of those concepts in all related disciplines. A team approach teaches some lessons about how this can be done.

Factors Affecting the Completion of Undergraduate Degrees in Science, Engineering, and Mathematics for Underrepresented Minority Students: The Senior Bulge Study

Martin Vern Bonsangue
California State University, Fullerton

Commissioned by the California State University’s Chancellor’s Office, this study looks at transfer students and suggests key areas for reform.

Background and Purpose

The Alliance for Minority Participation program is a nationally-based effort designed to support underrepresented minority students enrolled in science, engineering, and mathematics programs at four-year colleges and universities. The primary goal of the Alliance for Minority Participation program is to increase the number of minority students graduating in a science, engineering, or mathematics (SEM) major. While an increasing number of minority students have enrolled in SEM programs in this decade, not all of these students are completing their degrees in a timely way (see, e.g., [6]). Indeed, anecdotal comments from mathematics, science, and engineering departments indicate that a large number of students either seem to “hang around” a long time or are behind schedule in their programs (e.g., seniors enrolled in lower-division mathematics courses). Concern about what appears to be a bottleneck, or “bulge,” for many minority seniors enrolled in SEM programs prompted the Chancellor’s Office of the California State University (CSU) to commission a study relative to this issue. Thus, the purpose of this study was to identify possible factors affecting the completion of degrees in mathematics-based disciplines and how departments and institutions might help to streamline the path to graduation for their students.

Method

The research was limited to students with senior status, and included two components: transcript analyses and student interviews. Transcript analyses of student records were done at six participating CSU campuses by local administrative offices and academic departments, gathering information on three criteria:

• fulfillment of university general education requirements

• fulfillment of upper-division major requirements

• fulfillment of mathematics requirements

Follow-up interviews, by phone, in writing, or in person, were used to help understand student perceptions of their own experiences.

The sample for transcript analysis comprised 813 students currently enrolled as seniors at one of six of the 22 campuses of the California State University. About three-fourths (74%) of these were transfer students from a community college coming in as third-year students. More than half of the students were majoring in engineering (55%) or the natural or physical sciences (41%), with about 4% majoring in mathematics.


Findings

Only one-half (52%) of the students had completed their general education requirements, while fewer than that (40%) had completed the upper-division requirements in their major. Moreover, nearly half (45%) of these SEM students had not yet completed their mathematics requirement (first- and/or second-year calculus) even though the students were seniors. While many students owned more than one of these three deficits, relatively few seniors were qualified to take senior-level courses in their major, that is, were deficit-free. Typically, a student was at least two semesters from having completed all prerequisites for senior-level work, and in some cases was essentially a freshman in the major.
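
A transcript audit of this kind reduces each record to the three completion criteria listed under Method and counts deficits per student. The Python sketch below shows such a tally; the field names and records are hypothetical, since the study does not describe its data handling at this level.

```python
# Invented illustration of the transcript tally: each senior's record is
# reduced to the three completion criteria, and deficits are counted.
records = [
    {"gen_ed": True,  "upper_div_major": False, "math": False},
    {"gen_ed": True,  "upper_div_major": True,  "math": True},
    {"gen_ed": False, "upper_div_major": False, "math": True},
    {"gen_ed": True,  "upper_div_major": False, "math": True},
]

def deficits(rec):
    """Number of unmet requirements in one student record."""
    return sum(1 for met in rec.values() if not met)

deficit_free = sum(1 for rec in records if deficits(rec) == 0)
print(f"{deficit_free} of {len(records)} seniors are deficit-free")

for name in ("gen_ed", "upper_div_major", "math"):
    done = sum(rec[name] for rec in records) / len(records)
    print(f"{name}: {done:.0%} completed")
```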

There were notable differences in the trajectories towards successful completion of graduation requirements between transfer and non-transfer students. Virtually all of the transfer students had at least one of the three deficits listed above. By comparison, non-transfer students who had attended the CSU as freshmen were much more timely in their completion of courses, with more than half of them having completed all three requirements by the end of the junior year. Thus, students in mathematics-based majors who began their careers as freshmen in the CSU had a reasonable expectation of graduating in a timely way (for this study, within 6 years for engineering majors, and within 5 years for math and science majors). By comparison, SEM students who had transferred seemed to have no chance to finish on time. Graduation checks showed that while non-transfer students comprised only one-fourth of the sample, these students comprised more than 90% of the minority graduates in SEM majors. Moreover, this trend was true for all disciplines, including mathematics.

While the data here suggest that transferability (or the lack of it) is the real culprit, problems associated with changing schools are more severely felt by the minority community. Recall that three-fourths of the SEM minority students were transfer students, compared with typical ratios of around 30-40% for non-minority students. Since non-transfer students are more timely to graduation than are transfer students, this becomes a minority issue.

Follow-up interviews seemed to confirm that transferring creates problems, both obvious and subtle. Three hundred forty students were interviewed by telephone, in person, or in writing. The interviews were not sympathetic, but informational, in nature. While the interview format and questions asked varied somewhat by campus, student responses centered on the following issues:

• course availability

• course repeating

• inaccurate or unhelpful advising

• problems associated with isolation

• financial and personal issues

While financial and personal issues were mentioned by virtually all students as a factor affecting their academic progress, transfer students raised the other four issues as being stumbling blocks much more frequently (more than 3:1) than did non-transfer students. Typically, transfer students had not yet completed their required mathematics courses, and so had to accommodate these courses in their schedules. Once in the courses, they failed at a rate more than double that of their non-transfer counterparts. Some students reported that they had been advised to take unnecessary courses, but had not been advised to take courses that were really needed. Transfer students found that the academic level and expectations were much higher than they had experienced at the community college, and had often felt “on the outside” compared to students who had been in the department for all four years. Specifically, transfer students were much less likely to be involved with formative undergraduate activities such as conducting student-faculty research, attending departmental functions, or participating in social gatherings.

Use of Findings

This study showed the presence of a significant bottleneck for many California State University minority senior students currently enrolled in mathematics-based programs. The problems seemed to be triggered by issues relating to transferring from another institution, typically a community college. To what extent can the university take responsibility for these problems, or create changes that are genuinely effective? In California, the “Senior Bulge” study did help convince the Chancellor’s office to initiate voluntary programs for interested campuses. Each campus was invited to devise a plan to address the issues associated with untimely graduation, with funding available (between $20K and $45K) to help implement the plan. While each participating campus (12 of the 22 CSU campuses are now involved) customized its plan, there were at least three common elements shared by all:

1. “Catch” transfer students early. It is easy to assume that since transfer students have already attended college, they do not need guidance from the university. This study found that transfer students are an at-risk group in terms of adjusting to the academic rigors of a university, enrolling in the right classes, and forming early connections to their academic department. Academic departments identifying, contacting, and meeting with transfer students early in their university careers may eliminate some of the problems they face later as seniors. (As Uri Treisman once remarked, “Care for your own wounded.”)

2. Provide accurate academic advising within the department. Interview data showed that most students, especially transfer students, felt varying degrees of isolation in their quest to gather accurate information about specific requirements and prerequisites, scheduling, and academic support programs and services. Having the academic department individually advise students throughout their enrollment at the university may be extremely helpful in streamlining their paths to graduation.

3. Provide effective academic support for key courses. Providing (and perhaps requiring participation in) academic support, such as Treisman [8] workshop-style groups, together with scheduling key classes to accommodate student needs, may be a significant way for the department not only to facilitate the success of its students but also to increase student involvement.

Success Factors

Studies by Treisman [8], Bonsangue [2, 3], and Bonsangue and Drew [4] have suggested that the academic department is the key element in facilitating changes that will make a real difference for students. Academic departments may be the key link in addressing each of the specific needs identified above, especially in providing academic advising and in creative scheduling to accommodate “off-semester” transfer students. Institutions whose mathematics courses are supported by academic programs such as the Academic Excellence Workshop Program have reported significantly higher rates of on-time course completion and subsequent graduation than they observed before instituting such programs [4, 5, 8]. While the majority of students in this study were not necessarily mathematics majors, the successful and timely completion of mathematics courses seemed to play a crucial role both in the students’ time to graduation and in their attitudes about school [1, 7].

The programs described to address the Senior Bulge phenomenon are works in progress, with most still in their first or second semester at the time of this writing. Each program has its own character, with most programs run by an SEM faculty member and a person working in a student support department. Interested persons should feel free to contact me to discuss gains (as well as mistakes) that we have made.

References

[1] Academic Excellence Workshops. A handbook for Academic Excellence Workshops, Minority Engineering Program and Science Educational Enhancement Services, Pomona, CA, 1992.

[2] Bonsangue, M. The effects of calculus workshop groups on minority achievement and persistence in mathematics, science, and engineering, unpublished doctoral dissertation, Claremont, CA, 1992.

[3] Bonsangue, M. “An efficacy study of the calculus workshop model,” CBMS Issues in Collegiate Mathematics Education, 4, American Mathematical Society, Providence, RI, 1994, pp. 117–137.

[4] Bonsangue, M., and Drew, D. “Mathematics: Opening the gates—Increasing minority students’ success in calculus,” in Gainen, J. and Willemsen, E., eds., Fostering Student Success in Quantitative Gateway Courses, Jossey-Bass, New Directions for Teaching and Learning, Number 61, San Francisco, 1995, pp. 23–33.

[5] Fullilove, R.E., and Treisman, P.U. “Mathematics achievement among African American undergraduates at the University of California, Berkeley: An evaluation of the mathematics workshop program,” Journal of Negro Education, 59 (3), 1990, pp. 463–478.

[6] “Minorities in science: The pipeline problem,” Science (entire issue), 258, November 13, 1992.

[7] Selvin, P. “Math education: Multiplying the meager numbers,” Science, 258, 1992, pp. 1200–1201.

[8] Treisman, P.U. A study of the mathematics performance of black students at the University of California, Berkeley, unpublished doctoral dissertation, Berkeley, CA, 1985.

Evaluating the Effects of Reform

Richard West
United States Military Academy at West Point

West Point turned an entire department around. Using an in-depth assessment study with careful attention to the needs of client disciplines, the department created a brand new curriculum, and continues to study it with the “Fullan model” which the author investigated in his dissertation.

Background and Purpose

In 1990, the US Military Academy at West Point changed to a bold new core mathematics curriculum that addressed seven topics in four semester-long courses. The needs for change were both internal and national in scope. Internally, math, science, and engineering faculty were disappointed in the math abilities of the junior and senior students. Externally, the national reform movement was providing support in the form of interest, initiative, and discussion. Initially, my research evaluated this curriculum change from three perspectives: how the new curriculum fit the national recommendations for reform, how the change was implemented, and what the effects were on student achievement and attitudes toward mathematics. This paper will report on the resulting longitudinal comparison of two cohorts of about 1000 students each, on the “steady-state” assessment of subsequent cohorts, and on the changes made as a result of these assessments. This study informs the undergraduate mathematics and mathematics education community about the effects of mathematics reform on student performance, about the implementation and value of department-level reform and evaluation, and about the implications and prospects of research of this type.

a) Description of Students: Admission to West Point is extremely competitive, with current levels of admission greater than ten applicants for each acceptance. The goal of the admissions process is to accept students who are as “well-rounded” as possible, including both physical and leadership aspects. Average Math SAT scores are around 650. Education at West Point is tuition-free. In general, West Point cadets are very good students from across the nation with diverse cultural backgrounds.

The comparison cohort entered West Point in July 1989 and began the old core mathematics curriculum in August 1989. Approximately 1000 of these students finished the four core courses together, and most graduated in May 1993. The reform cohort entered West Point in July 1990 and was the first group to take the new curriculum, starting in August 1990. Again, approximately 1000 students finished the four core courses together, and most graduated in May 1994.

b) Descriptions of Old and New Curricula: West Point’s core curriculum comprises 31 of the 40 courses required for graduation. Throughout the first two years, all students follow the same curriculum of five academic courses each semester in a two-semester year; approximately 85% of any freshman or sophomore class are studying the same syllabus on the same day. Over the first two years, every student must take four math courses as well as year-long courses in chemistry and physics. Approximately 85% choose a major toward the end of the third semester. During their last two years, all students take one of seven five-course engineering sequences. Thus, the core mathematics program provides the basis for much of the student’s education, whether he or she becomes an English, philosophy, or math, science, or engineering major.

The old core math curriculum was traditional in content and had no multivariable calculus, linear algebra, or discrete math. The courses were Calculus I, Calculus II, Differential Equations, and Probability and Statistics. For those who majored in most engineering fields there was another required course called Engineering Mathematics that covered some multivariable calculus, some linear algebra, and some systems of differential equations.

The current core curriculum (initiated in August 1990) covers seven topics in four semesters over the first two years. The topics are discrete mathematics, linear algebra, differential, integral, and multivariable calculus, differential equations, and probability and statistics. A discrete mathematics course focused on dynamical systems (or difference equations) and on the transition to calculus (or continuous mathematics) starts the two-year sequence. The mathematics needed for this course is new to the majority of high school graduates but is also intuitive and practical. Linear algebra is embedded in a significant way, as systems of difference equations are covered in depth. So, two of the seven-into-four topics are addressed in this first course. Further, this course by design provides a means to facilitate the accomplishment of many other reform goals and the goals of the curriculum change (see appendix), such as integrating technology, transitioning from high school to collegiate mathematics, and modeling “lively” application problems.

The current Calculus I course finishes differential calculus and covers integral calculus and differential equations through systems, thus addressing three of the seven-into-four topics. Calculus II is a multivariable calculus course. In addition, these new calculus courses differ from their predecessors by integrating more technology, utilizing more interactive instruction, and including more group projects that require mathematical modeling, writing for synthesis, and peer-group interaction. The probability and statistics course, the last of the four core courses, has been taught for over thirty years to all second-year students and is gaining in importance to our engineering curricula.

The use of technology and the integration of the content of this curriculum into one program provide the opportunity to fit the topics from seven courses into the four semesters. In short, the gains are coverage of linear algebra, discrete math, and multivariable calculus, while totally integrating modeling and technology use. The losses are relatively minor: reduced emphasis on analytic geometry, series, and integration techniques, and movement of Laplace transforms to an Engineering Mathematics elective specifically for those majoring in engineering.

c) Framework for Evaluation: Utilizing the three perspectives of (1) reform, (2) implementation process, and (3) comparison of two student cohorts, I conducted three different analyses of the curriculum change. Although this change appears to have come from needs internal to West Point, during the time period much was being said nationally about mathematics education at all levels from kindergarten through college. While the NCTM Standards [3] showed the way for K–12, the colleges have had their own voices for reform, particularly the Committee on the Undergraduate Program in Mathematics (CUPM) Recommendations [5] in 1981 and the whole Calculus Reform movement, which appears to have had its beginnings around 1986. In my studies the recommendations of this national reform movement were best synthesized in Reshaping College Mathematics [4]. I used these recommendations and others to analyze whether West Point’s curriculum reform had core characteristics similar to those of the national reform movement.

Large-scale educational change is difficult to initiate, to implement, and to maintain. There are many obstacles to overcome in starting up and maintaining a new curriculum, not the least of which is resistance to change itself. Consequently, the documentation of a major curriculum reform is extremely valuable to all those who wish to attempt such an innovation. One model whose designer addresses the procedures and factors that make up a successful educational change is posed by Michael Fullan in [1]. Fullan says that educational change has two main aspects: what to change and how to change. The national reform movement has provided a consensus on what to change, and Fullan provides a theoretical model against which to compare the West Point innovation. According to Fullan, educational change has three phases: initiation, implementation, and continuation, all leading to outcomes. Each of these phases interacts with its sequential neighbor. In this study, Fullan’s factors for each of these phases were analyzed for relevance and impact on the change process. In short, the Fullan model provides a cogent framework for evaluating the change process.

The outcomes for this evaluation were student achievement and attitudes. Mathematics reform at the college level, as with most other educational reforms, seeks to improve student learning and attitudes in the hope that this improvement will in turn motivate students to further study and application of the mathematics they have learned. Further, these outcomes are addressed to determine whether the goals of the curricular reform are being accomplished.

Method

My focus was to evaluate the impact of reform on student performance and attitudes toward mathematics by comparing the achievement and attitudes of the two student cohorts described above. To conduct the evaluation, I formulated twelve guiding questions, nine of which describe the context of my study and provide input for the reform and implementation perspectives. The remaining three questions focus on the comparison of the two cohorts.

Data for the two contextual perspectives were obtained through an extensive literature search of recent reform and educational change references, review of historical documents, and interviews of students and faculty. Data for the comparison of the two cohorts were obtained from quizzes, exams, questionnaires, interviews, and grades. Except for the interviews, most data were collected and stored for later analysis. There was almost no a priori experimental design.

Findings

The details of this analysis are contained in my dissertation [6], which was completed while I was an associate professor in mathematics at West Point. In addressing the three perspectives I have outlined above, I found, for the reform perspective, that the revised core mathematics curriculum at West Point was consistent in most ways with the national call for reform in mathematics curriculum at the college level. For the implementation perspective, I reported that the processes for implementation of the curriculum change involve many factors, but that the change studied was successful in accomplishing its articulated goals (see appendix). The informed and empowering leadership of the department head and the involvement and consensus-building style of the Department of Mathematical Sciences senior faculty in the change process were the key factors that motivated the implementation of the revised curriculum. Over the two years prior to the August 1990 implementation, the senior faculty planned and built institution-wide consensus for the initiation of the revised core mathematics curriculum. The planning and implementation continued through January 1992, when the last of the four courses began. Improvements to this original revised curriculum have continued through the present. The articulated goals for this curriculum for the most part appear to have been accomplished, and having continued with the curriculum for seven years, the change appears to be institutionalized.

Finally, for the comparison perspective I evaluated the effects of the change in curriculum in terms of student mathematics achievement and attitudes toward mathematics. Students in the reform cohort under the current core curriculum were compared with students under the traditional curriculum. The comparison of the two cohorts in terms of student achievement and attitudes was difficult. My planned data collection included results of math tests, common quizzes, questionnaires, and interviews. I found these data very informative about a given cohort; yet a direct comparison was like comparing apples and oranges, or the results were inconclusive. The lesson learned is that before-the-fact experimental design is needed for these instruments to be compared; at the same time, this before-the-fact planning may not be feasible.

In contrast, my analysis of grades was intended purely for informational purposes, but produced the most compelling results. The comparison of grades in follow-on courses such as physics and engineering science, which pride themselves on standardization from one year to the next, showed significant improvements between cohorts. The tables below, from my dissertation, show the results of the comparison of grades for the two-semester physics sequence. Similarly, I looked at eight engineering science courses taken by a total of 85% of each of the cohorts. Four of these eight courses showed significant results (p-value < 0.05) and a shift in grades similar to that in the physics courses. The reform group performed better in these courses.


Table 1. Percentage of Grade Category and Median for PH201 Physics I
(Comparison N = 1000; Reform N = 1030)

PH201        F     D     C     B     A    Median
Comparison   2.6  16.3  43.5  27.0  10.6    C
Reform       1.8  12.3  39.7  33.7  12.5    C+

Note. χ² = 17.60 with p < 0.002; t = 3.82 with p < 0.001.

Figure 1. Percentage of Grade Category in PH201 (bar chart of the distributions above).

Table 2. Percentage of Grade Category and Median for PH202 Physics II
(Comparison N = 970; Reform N = 1003)

PH202        F     D     C     B     A    Median
Comparison   2.9  18.9  48.4  22.1   7.8    C
Reform       0.3   5.9  46.4  38.2   9.3    C+

Note. χ² = 132.75 with p < 0.00001; t = 9.90 with p < 0.001.

Figure 2. Percentage of Grade Category in PH202 (bar chart of the distributions above).
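To illustrate the kind of test reported in the notes above, here is a minimal sketch in Python (not from the dissertation; the counts are reconstructed approximately from the published percentages and cohort sizes, so the statistics will only roughly match the reported values):

# Sketch: chi-square comparison of two grade distributions (F, D, C, B, A).
# Counts are approximate reconstructions from the percentages in Table 1.
from scipy.stats import chi2_contingency

comparison = [26, 163, 435, 270, 106]   # N = 1000
reform     = [18, 127, 409, 347, 129]   # N = 1030 (rounded)

chi2, p, dof, expected = chi2_contingency([comparison, reform])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")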

I was able to respond to all my guiding questions except the comparison of attitudes of the two groups. Attitude data for the comparison group were either not available, of too small a scale, or not of similar enough form to make a reasonable comparison feasible. I found myself comparing different questions and having to draw conclusions from retrospective interviews of students and faculty. If questionnaires are to be used, some prior planning is needed to standardize questions. However, the data from the student and faculty interviews indicate some improvement by the reform group in the areas the revised curriculum was intended to affect. Attitudes had not been measured until 1992, after the revised curriculum had been implemented.

Use of Findings

While I concluded that the reform curriculum was successfully implemented, the change process in the Mathematical Sciences Department at West Point is still ongoing. This current year saw the adoption of a new calculus text. Further, for varying reasons of dissatisfaction, availability, cost, and adjustment to other changes, four different differential equations texts have been used over the six years of the new curriculum. In addition, the text for the discrete dynamical systems course will change this fall. All courses are using interdisciplinary small group projects designed with other departments and disciplines. The student growth model (more below), developed in 1991 and used to shape the four-course program, is being updated continuously. At the same time, the faculty development program that supports improvements to the curriculum and the way we teach has been significantly enhanced in the last two years. Finally, assessment instruments are currently under great scrutiny to ensure that they mirror the goals of student growth throughout the four-course program, and attitude questionnaires have been administered each semester since spring 1992.

The most significant framework for change over the past eight years has been the department’s focus on improving student growth over time. The senior faculty started by establishing program goals in the spring of 1990 (see appendix). Basically, these goals are difficult, as they are geared to developing aggressive and confident problem solvers. Their intent is that students be required to build mathematical models to solve the unstructured problems that they will face in the real world. The senior faculty operationalized these goals by establishing five educational threads that were integrated throughout the four courses. Still in effect, these five threads are scientific computing, history of mathematics, communications, mathematical reasoning, and mathematical modeling. Each of the courses wrote objectives to address each of these specific ideas toward accomplishing over time the goals we established for the program. These threads and the course objectives are called the “student growth model.” As a result, all assessment instruments are designed to address these objectives and thereby measure student growth.

The department turns over about one third of its faculty each year. As a result, over time the student growth model becomes unclear and open to interpretation; as students and faculty become less familiar with the student growth objectives, the objectives become unfocused. Therefore, approximately two years ago the senior faculty found it necessary to articulate the student growth model in terms of content threads. The result was nine: vectors, limits, approximation, visualization, models (discrete and continuous, linear and nonlinear, single- and multi-variable), functions, rates of change, accumulation, and representations of solutions (numerical, graphical, symbolic, descriptive). The intent was that both students and faculty could more readily identify growth if it was articulated in mathematical terms. Further, these content threads provide avenues for better streamlining of the curriculum to enhance depth on those topics essential to growth in the program. Finally, they facilitate the design of assessment instruments by having objectives that are content-specific. Further articulation of the goals and objectives for these content threads is forthcoming, as well as a requisite assessment scheme.

Evaluation of the program continues. As a result of this initial study, since 1992 I have created a database for each cohort of grades in all mathematics-based core courses. Further, I have used common attitude questions on entry and at the end of each mathematics course. Study of these data is ongoing and is used to inform senior faculty about specific mid-program and mid-course corrections. These instruments actually tell more about the growth of a cohort over time than they serve to compare one group to another. However, looking at the same course over time does inform about trends in that course.

A recent additional evaluation tool uses portfolios to measure conceptual growth over time. We have been using student portfolios since 1993, mostly as self-evaluation instruments. This past year we instituted five common questions, following the themes of our five educational threads, that all students must respond to in each of their core mathematics courses. Each is to be answered with a paragraph of up to a half-page typewritten and included in the portfolio for each of the four courses. A couple of examples from this year are: (1) Define a function. Give an example of a function from this course and explain its use. (2) Discuss how a math modeling process is used in this course. Describe the impact of “assumptions” and how one can “validate” the model. Further, each subsequent portfolio must contain the responses to the questions from the previous course(s). This gives the student and faculty an example of an individual student’s growth over the span of the four courses. Since we just started this year, we do not know how this will work. But we have hopes that this snapshot will be valuable to both the student and the faculty.

Success Factors

The references below proved excellent in shaping a consensus interpretation of the national reform movement. Further, the West Point Math Sciences Department was very good about articulating goals for the new curriculum (see appendix). The fact that these were written down allowed me to understand very quickly the stated focus of what I was trying to evaluate, and it further allowed me to conclude that the stated goals had been accomplished. At the same time, the Fullan model provided a cogent framework for evaluating educational change at the undergraduate level.

I understand that West Point is not the typical college, and for my dissertation [6] I devoted an entire appendix to the issue of generalizability. In this appendix I enclosed letters from prominent faculty familiar with our curricula, from large research universities to small liberal arts colleges, that supported the generalizability of the results. I believe that most of what I have related here is generalizable to other schools and other programs. The evaluation I have conducted is an example of the use of the model in [2]. The bottom line is that the MAA model works, and the assessment process can be very useful in informing senior faculty who must make curricular decisions.

In closing, assessment at the department level is a process that can involve the entire faculty, build consensus, inform decisions about improving curricular programs, and evaluate student learning over time. My experience with the evaluation of the curriculum reform at West Point is an example of this. I hope that the ideas posed here will encourage others to proactively design assessment programs with the goal of improving student learning.

References

[1] Fullan, M.G. The New Meaning of Educational Change (2nd ed.), Teachers College Press, New York, 1991.

[2] Mathematical Association of America. “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics,” Focus, 15 (3), 1995, pp. 24–28.

[3] National Council of Teachers of Mathematics. Curriculum and Evaluation Standards for School Mathematics, NCTM, Reston, VA, 1989.

[4] Steen, L.A., ed. Reshaping College Mathematics, MAA Notes Number 13, Mathematical Association of America, Washington, DC, 1989.

[5] Tucker, A., ed. Recommendations for a General Mathematical Sciences Program: A Report of the Committee on the Undergraduate Program in Mathematics, Mathematical Association of America, Washington, DC, 1981.

[6] West, R.D. Evaluating the Effects of Changing an Undergraduate Mathematics Core Curriculum which Supports Mathematics-Based Programs, UMI, Ann Arbor, MI, 1996.

Appendix

West Point Goals for Student Learning in the Revised Curriculum

1. Learn to use mathematics as a medium of communication that integrates numeric, graphic, and symbolic representations, structures ideas, and facilitates synthesis.

2. Understand the deductive character of mathematics, where a few principles are internalized and most notions are deduced therewith.

3. Learn that curiosity and an experimental disposition are essential, and that universal truths are established through proof.

4. Understand that learning mathematics is an individual responsibility, and that texts and instructors facilitate the process, but that concepts are stable and skills are transient and pertain only to particular applications.

5. Learn that mathematics is useful.

6. Encourage aggressive problem solving skills by providing ample opportunities throughout the core curriculum to solve meaningful practical problems requiring the integration of fundamental ideas encompassing one or more blocks of lessons.

7. Develop the ability to think mathematically through the introduction of the fundamental thought processes of discrete, continuous, and probabilistic mathematics.

8. Develop good scholarly habits promoting student independence and life-long learning ability.

9. Provide an orderly transition from the environment of the high school curriculum to the environment of an upper divisional college classroom.

10. Integrate computer technology throughout the four-semester curriculum.

11. Integrate mathematical modeling throughout the curriculum to access the rich application problems.

A Comprehensive, Proactive Assessment Program

Robert Olin, Lin Scruggs
Virginia Polytechnic Institute

A large technical institute, using one of the authors as assessment coordinator, creates a broad new assessment program, looking at all aspects of the department’s role. Statistical studies guide improvements in curriculum, teaching, and relations with the rest of the university.

Background and Purpose

Virginia Polytechnic Institute (“Virginia Tech”) is a land grant, state, Type I Research University, with an overall enrollment of 26,000 students. Since 1993, the semester enrollment of the mathematics department has averaged 8,000 to 12,000 students. Of this group, approximately 90% of the students are registered for introductory and service courses (1000 and 2000 level). These first- and second-year mathematics courses not only meet university core course requirements, but are also pre- or co-requisites for a number of engineering, science, and business curricula.

A comprehensive assessment program, encompassing the activities of the mathematics department, began in the spring of 1995 at Virginia Tech. In the preceding fall of 1994, a new department head had been named. Within days of assuming the position, he received opinions, concerns, and questions from various institutional constituencies and alumni regarding the success and future direction of calculus reform. The department head entered the position proactively: finding mechanisms which could provide faculty with information regarding student performance and learning; developing a faculty consensus regarding core course content and measurable objectives; recognizing and identifying the differences in faculty teaching styles and in learning styles for students; and synthesizing and using this information to help improve student learning and academic outcomes. Concurrently, the mathematics department was in the midst of university restructuring, and “selective and differential budget adjustments” were the menu of the day.

From the departmental perspective, the university administration required quantitative data and explanations of a variety of course, student, and faculty outcomes, often with time frames. The resources in Institutional Research and Assessment were shrinking, and more often data had to be collected at the departmental level. The department decided that data and analyses available within the department were much preferable to obtaining data via the university administrative route. The selection of a person to analyze and interpret the data had particular implications, since learning outcomes are sensitive in nature and are used within the department; another problem was that educational statistics and measurement design skills, distinctly different from mathematical expertise, were needed for the analysis and interpretation of data. Networking with the university assessment office on campus provided an acceptable option — an educational research graduate student who could work part-time with the department in the planning and implementation of the assessment program (Scruggs).

Method

Data Gathering and Organization: The departmental assessment effort is rooted in obtaining several forms of data, organized by semester, with students coded for anonymity:

High school data are obtained from admissions for entering freshmen and include SAT verbal and math scores, high school GPAs, high school attended, and initial choice of major.


Background survey data from the Cooperative Institutional Research Program (CIRP) are acquired during freshman summer orientation.

Course data include overall course enrollment, sectional enrollment, student identification numbers by section, instructors of each section, and times and locations of courses.

Achievement data include grades from common finals for about eight core courses, ranging from college algebra to engineering calculus and differential equations. (Initially the purpose of these examinations was to provide a mechanism to evaluate Mathematica as a component of engineering calculus.) These examinations are multiple choice and include three cognitive question types: skills, concepts, and applications. They help determine the extent to which students have mastered mathematical skills and concepts, and they allow comparisons between and among sections in light of instructional modalities, methods, and innovations. After common exams are administered and scored, each instructor receives a printout detailing the scores for their class, as well as the mean and standard deviation for all students taking the common examination. The assessment coordinator receives all scoring and test information electronically, including the individual item responses for all students. Test reliability, validity, and item analyses are performed for each course common exam. These data are then made available to mathematics faculty to aid in the interpretation of the current test results, as well as in the construction and refinement of future test questions.
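As a sketch of what such post-test analyses involve (the department’s own analyses were done in SPSS; the data below are placeholders), reliability and item statistics for a scored multiple-choice final can be computed from a students-by-items matrix of 0/1 scores:

# Sketch: reliability and item analysis for a scored multiple-choice exam.
# `responses` is a hypothetical (students x items) matrix of 0/1 scores.
import numpy as np

rng = np.random.default_rng(0)
responses = (rng.random((500, 25)) < 0.6).astype(float)  # placeholder data

k = responses.shape[1]
item_var = responses.var(axis=0, ddof=1)
total_var = responses.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_var.sum() / total_var)  # Cronbach's alpha

difficulty = responses.mean(axis=0)          # proportion correct per item
totals = responses.sum(axis=1)
discrimination = np.array([                  # corrected item-total correlation
    np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
    for j in range(k)])
print(f"alpha = {alpha:.2f}")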

Survey Data: Virginia Tech has administered the Cooperative Institutional Research Program (CIRP) Survey every year since 1966. Student data specific to our institution, as well as for students across the United States, are available to the mathematics department for research, questions, and analyses. The mathematics department assessment program is actively involved in the process of identifying variables from the CIRP surveys which are associated with student success. Additionally, several in-house survey instruments have been designed to augment these general data and gauge specific instructional goals and objectives. The departmental surveys use Likert rating scales to accommodate student opinion. With this procedure, affective student variables can be merged with more quantitative data.

Methodology Overview: Data are stored in electronic files on the mathematics department server, with limited access because of student privacy concerns. SPSS for Windows is used for statistical analyses. Specific data sets can be created and merged, using variables singly or in combination, from academic, student, grade, and survey files. More information on this process is available from the authors.
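As a sketch of the kind of merging described (the department used SPSS; the file and column names here are hypothetical), records from the separate files can be joined on the anonymized student code:

# Sketch: merging departmental data files on an anonymized student code.
# File and column names are hypothetical.
import pandas as pd

academic = pd.read_csv("academic.csv")  # student_code, satm, hs_gpa, major
grades   = pd.read_csv("grades.csv")    # student_code, course, section, grade
survey   = pd.read_csv("cirp.csv")      # student_code, Likert-scale items

data = (grades.merge(academic, on="student_code", how="left")
              .merge(survey, on="student_code", how="left"))
calculus = data[data["course"] == "ENGR_CALC_I"]  # one course's records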

Findings and Use of Findings

The department’s role on campus ranges from teaching students and providing information for individual instructors to furnishing information to external constituencies, including other departments, colleges, university administration, and state agencies. A comprehensive program of mathematical assessment must be responsive to this diverse spectrum of purposes and groups. Departmental assessment then refers to the far-reaching accounting of student and departmental functioning within the department and throughout the university. Assessment has different purposes for different groups, and given this range of applications, the following discussion incorporates selected examples of data analyses, outcomes, and decisions.

Academic Measurement: While tests are important, their usefulness is contingent on the quality of the instrument, course goals, and the intended purposes. In the common final examinations, close attention is given to the construction and evaluation of the tests themselves. Common finals are constructed by faculty committees using questions submitted by individual faculty members who have taught the course. The tests are then evaluated for appropriateness of content, individual question format and style, and item difficulty. Post-test analyses performed by the assessment coordinator include test reliability coefficients, item analyses, overall and sectional means and standard deviations, and score distributions. With each administration, faculty and student feedback regarding the finals has become increasingly positive, indicating that the tests are more representative of course content and that the questions have greater clarity. During this iterative process, faculty knowledge of and involvement in assessment has grown, with increasing dialogue among faculty a welcome outcome.

Departmental assessment practices have provided a mechanism for monitoring and analyzing student outcomes as innovative and different teaching methods have been introduced and technology added to existing courses, such as engineering calculus and college algebra. Did these changes have a positive effect on student learning? What effects, if any, did the changes have on long-term learning and performance in other courses? These questions were posed from inside the department and from other departments and university administration. The departmental data base allows rapid access to grade and common final data for mathematics courses, and to grade outcomes for engineering and science courses.

For the freshmen enrolled in the fall semesters of 1993 and 1994, student academic background data in conjunction with course grades were used to examine longitudinal outcomes for traditional and “Mathematica” calculus students who subsequently enrolled in advanced mathematics and engineering courses, such as differential equations, statics, and dynamics. Mean comparison studies with t-tests were performed, comparing grade outcomes of the traditional and Mathematica students; no statistically significant differences were noted except for the dynamics course [Table 1].

Table 1
Technology and Traditional Teaching
General Engineering Majors, Engineering Calculus Sequence, Fall 1994

                          SATM   HSGPA  CALC I  CALC II  DIFF EQU  MULTI VAR  STAT  DYNAM
traditional (n=324)       629.9  3.48   2.77    2.19     2.58      2.55       2.28  1.83*
with technology (n=165)   624.0  3.50   2.72    2.43     2.48      2.43       2.38  2.06*

* indicates statistically significant difference between groups (t-test)
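A minimal sketch of one such mean comparison, with short hypothetical grade-point vectors standing in for the actual course records:

# Sketch: two-sample t-test comparing dynamics grades of the traditional
# and Mathematica groups. Grade vectors are hypothetical.
from scipy.stats import ttest_ind

traditional = [2.0, 1.7, 2.3, 1.3, 2.0, 1.7, 2.3, 1.0]  # 0-4 grade points
mathematica = [2.3, 2.0, 2.7, 1.7, 2.3, 2.0, 2.3, 1.7]

t, p = ttest_ind(mathematica, traditional, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")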

A designated section of differential equations was taught in the fall of 1996 in response to a request from civil engineering, and was nicknamed the “Green” version. Targeted toward civil engineers, the course utilized environmental and pollution examples to support differential equation concepts and theory. Table 2 summarizes “Green” course outcomes as compared to sections taught in a traditional format with a variety of majors in each section.

Table 2
Green Differential Equation Approach, Fall 1996

                               SATM mean   Mean common    Mean course
                                           final score    grade
Green section (n=15)           640         63             3.1
Composite section* (n=22)      643         47             2.3
All sections except Green      640         44             2.0
(n=685)

* 4% of the students not participating in the Green section were randomly selected and descriptive statistics calculated.

Integrating teaching methods with theory application and use as they apply to specific major areas offers intriguing opportunities. During the fall of 1997, faculty in the college algebra sequence worked collaboratively with other departments and with individual faculty outside of the mathematics department to design problem sets tied to students’ major areas. So designed, the sets help students recognize that mathematics is a valuable aspect of all that they do, not just a core university requirement.

Student Placement: To keep up with a changing student population and the changing expectations of the university, students, parents, state legislatures, and the media (as evidenced in the charge for academic and fiscal accountability), the departmental “menu” of courses, teaching methods, and student support options has been expanded, as has the requirement for assessing and justifying the changes. Appropriate placement of students under these circumstances becomes both an educational and an accountability issue.

Since 1988, Mathematics Readiness Scores have been calculated for entering freshmen. Institutional Research devised the initial formula using multiple regression analysis. The formula for the calculation has gone through several iterations, with scores currently calculated from student background variables available through university admissions: high school mathematics GPA; College Board Mathematics scores; and a variable which indicates whether or not the student had taken calculus in high school. A decision score was determined above which students are placed in the engineering calculus course, and below which they are placed in pre-calculus courses. Each semester, using the course grades, the scores are validated and the formula modified to maximize its predictive capability.
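The published formula and its coefficients are recalibrated each semester and are not reproduced here; the following sketch, with hypothetical data and a hypothetical cutoff, illustrates the idea:

# Sketch: a readiness-score formula fit by multiple regression.
# Data, fitted coefficients, and the decision cutoff are hypothetical.
import numpy as np

# predictors: HS math GPA, SAT math, took-calculus indicator
X = np.array([[3.8, 680, 1], [3.1, 560, 0], [3.5, 610, 1], [2.9, 540, 0],
              [3.9, 700, 1], [3.2, 590, 0], [3.6, 640, 1], [3.0, 570, 0]])
y = np.array([3.7, 2.0, 3.0, 1.7, 4.0, 2.3, 3.3, 2.0])  # first-course grade

A = np.column_stack([np.ones(len(X)), X])     # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares fit

def readiness(hs_gpa, satm, took_calc):
    return coef @ np.array([1.0, hs_gpa, satm, took_calc])

CUTOFF = 2.5  # hypothetical decision score, revalidated each semester
print("engineering calculus" if readiness(3.4, 620, 1) >= CUTOFF
      else "pre-calculus")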

Special Calculus Sections: Even though student success in engineering calculus improved after the math readiness scores were utilized, academic achievement remained elusive for many capable students who had enrolled in the pre-calculus course. Grade averages were low in this course, and longitudinal studies indicated that many students who had earned a grade of C or above failed to complete the second course in the engineering calculus sequence with a comparable grade. In the fall of 1996, based on a model developed by Uri Treisman at the University of California, Berkeley, a pre-calculus alternative was piloted within the mathematics department. An augmented version of the engineering calculus sequence was begun — Emerging Scholars Program (ESP) calculus, now operating for the first and second semesters of calculus. The traditional three-hour lecture course was accompanied by two two-hour required problem-solving sessions, supervised by faculty and using undergraduate teaching assistants as tutors. Due to the academic success of the students, as well as faculty, tutor, and student enthusiasm for the approach, 6 sections of ESP calculus were incorporated into the spring course schedule. Students enrolled in the spring ESP sections had previously been enrolled in traditional calculus or pre-calculus in the fall; the average course grade for the traditional calculus students had been 0.8 (out of a possible 4.0). Comparisons of the student outcomes for the traditional and ESP versions from the spring of 1997 are shown in Table 3. The fall of 1997 has 17 sections of ESP calculus on the schedule, along with a number of sections of traditional engineering calculus. Previous assessment efforts, both quantitative and qualitative, supported the departmental decision to proceed toward the ESP approach and away from pre-calculus.

Table 3
Traditional and ESP Calculus Student Outcomes, Spring 1997

                              %A    %B    %C    %C- or   mean common   mean course
                                                below    final°        grade
ESP calculus (n=128)          14.9  32.3  24.7  29.9     9.58*         2.32*
Traditional calculus (n=155)  10.3  25.0  23.1  41.0     8.58*         1.86*

° average number of correct items
* t-test indicates that the differences between the scores and grades for the ESP and traditional groups were statistically significant (p < .01)

Developmental Courses: The college algebra/trigonometry course enrolls approximately 1300–1400 students each fall semester. This non-major service course, serving primarily freshmen, requires significant departmental resources. In the fall of 1995, a computer-assisted, self-paced approach was pilot tested, involving a cohort of 75 students from the 1300 total enrollment. At the beginning of the semester, all students were given a departmental survey designed to ascertain student perceptions of their learning skills and styles, motivation, and mathematical ability. At the conclusion of the semester, these non-cognitive items, determined from factor analyses of the survey data, were analyzed with student grades using regression analysis. The goal was to identify predictors of success in the computer-assisted version of the course. Significantly related to success were the self-reported attributes of being good to very good in math, and organized and factual in learning new material.
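The study itself used factor analysis and regression in SPSS; as a rough sketch of the prediction step (variable names and data are hypothetical), one might regress a success indicator on the survey attributes:

# Sketch: predicting success in the self-paced course from self-reported
# survey attributes. Variable names and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: self-rated math ability, organized, factual (Likert 1-5)
X = np.array([[5, 4, 4], [2, 2, 3], [4, 5, 4], [3, 2, 2],
              [5, 5, 5], [2, 3, 2], [4, 4, 5], [3, 3, 3]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = earned C or better

model = LogisticRegression().fit(X, y)
print(model.coef_)  # larger coefficients suggest stronger predictors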

Two very different means of placement have been described above. One utilized cognitive achievement data, while the second made use of non-cognitive student-reported information. Both approaches have provided valuable information, for student placement and for course evaluation and modification. Since the introduction of technology as the primary instructional modality in 1995, the college algebra course has undergone several iterations in response to quantitative and qualitative departmental data analyses. At the present time, this course maintains its technology-driven, self-paced instructional core. In response to student survey responses which indicated a need for more personal and interactive experiences, a variety of instructional alternatives, such as CD lectures, have been incorporated into the course.

Technology: A variety of student outcomes and background variables were used as a means of assessing the incorporation of computer technology into the college algebra and engineering calculus courses. Assessment results and outcomes are generally positive, with some concerns. One finding indicated that the use of technology allowed students to pace themselves within a time frame beneficial to their schedules. Also, the downstream results for engineering calculus indicated that students receiving technological instruction during the regular course time did as well, if not better, in the more advanced course work. Negative findings were related to computer and network functioning, certain aspects of the software, and the lack of congruity between lecture and computer assignments. Using the outcomes as a guide for modifying courses each semester, technology use within the department has increased. In the fall of 1997, the mathematics department opened a Mathematics Emporium, with 200 computers and work stations, soon to be expanded to 500. Assessment has played and will continue to play a role in ideas, decisions, and educational innovation regarding technology.

Learning/Teaching: Our data base enables our department to respond effectively to issues raised from within the department and externally from other departments and university administration. For example, the common final examinations in many departmental service courses have given additional information regarding student, sectional, and course outcomes. Scores, in conjunction with course grades, have been used to examine the connection between grading practices and student learning. In the fall of 1995, there were 31 sections of engineering calculus, all relying on the same course goals and textbook. After common finals were taken and grades assigned, Pearson correlations were used to ascertain the association between sectional final scores and grades. For all sections taken together, the correlation was calculated to be 0.45. Though statistically significant, the magnitude of the result was lower than expected, prompting further study, as sectional mean scores and grades were examined individually. The following table affords examples of the variety of sectional outcomes. Actual data are used, though the sections are identified only by number and in no particular order [Table 4].

Table 4
Selected Examples of Sectional Outcomes, Engineering Calculus, Fall 1995

Section    mean SATM    mean common final    mean course grade
1          620          6.6                  2.1
2          648          6.6                  1.7
3          618          5.2                  2.7
4          632          6.3                  2.1
5          640          6.7                  2.4
overall    630          6.8                  2.2
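A sketch of the overall and sectional computations (the study used SPSS; the data frame below is hypothetical):

# Sketch: correlation between common-final scores and course grades,
# overall and section by section. Data are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "section":     [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3],
    "final_score": [7, 6, 8, 5, 7, 6, 8, 5, 5, 4, 6, 3],
    "grade":       [2.3, 2.0, 2.7, 1.7, 1.7, 1.3, 2.0, 1.0,
                    2.7, 2.3, 3.0, 2.0],
})

r_all, p_all = pearsonr(df["final_score"], df["grade"])
print(f"overall r = {r_all:.2f}, p = {p_all:.3f}")
print(df.groupby("section").apply(
    lambda g: g["final_score"].corr(g["grade"])))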

One can note that Sections 2 and 3 are disturbing in the incongruence they demonstrate between course grades and common final scores.

Making these data available anonymously to instructors offers them the opportunity to compare and analyze for themselves. The department head promoted the use of assessment data to generate an informed and potentially collaborative approach to the improvement of teaching.

Focus Groups: In seeking to evaluate the ESP calculus program, a focus group component was included to obtain students’ views and feelings regarding the course format, philosophy, and expectations. Student responses were uniformly positive. About this calculus approach, freshmen suggested an unanticipated aspect of its value — the sense of community they experienced within the mathematics department, and by extension, the university as a whole. Student comments show that they feel the value of esprit de corps in a school that uses their input. As one student remarked, “Learning math takes time and resources. ESP is what makes Tech a good school.”

Success Factors

Assessment within the mathematics department is the reflection of a variety of factors, many planned, others serendipitous. But how can success be gauged? What is the evidence of the value added to students, the department, and the institution? Who has gained? The answers relate to the department, the individual faculty, and the students. The ongoing assessment of the department allows the department to be public, to share concerns and answer questions, and to better identify, compete for, and manage available resources within the department, the university, and beyond. The faculty are better able to monitor their students’ outcomes, as well as those of curriculum and instructional techniques. Ensuing program planning provides faculty the opportunity for increased ownership and distinctly defined roles in instructional development. Students have certainly received the benefits of assessment through feelings of enhanced involvement in and contribution to their educational process. Though probably unaware of the scope and extent of the quantitative information which impacts their educational experiences, students interact with assessment and the department through opinion surveys regarding their courses and occasionally through participation in focus groups. Through the realization that their opinions matter, there is the opportunity for a strengthened sense of affiliation with mathematics, the department, and the university.

A mathematics department faculty member recently asked the question, “Whatever happened to the ivory tower?” The answer, of course, is that it no longer exists, or that it has been remodeled. Departments are no longer concerned primarily with their discipline. In today’s educational climate, valid, thoughtful information must be readily available regarding student learning and success and program development and improvement. Stewardship of faculty, financial, and space resources must be demonstrated to a variety of constituents beyond the department. As a matter of performance and outcomes, everyone gains from the assessment process at the departmental level.


Assessment in One Learning Theory Based Approach to Teaching: A Discussion

Ed Dubinsky
Georgia State University

In this discussion piece, the author explains an approach to teaching based on learning theory, particularly examining a calculus course to ask how assessment can best feed back into the learning environment.

I am engaged in a number of curriculum development projects (see [2], [4], [6]) based on theoretical and empirical research in how mathematics can be learned. The research is done in connection with a loosely organized group of mathematicians and mathematics educators known as the Research in Undergraduate Mathematics Education Community, or RUMEC. (For more about RUMEC, visit our web site at http://rumec.cs.gsu.edu/.) The educational strategy which arises out of this research involves a number of innovations, including cooperative learning, students constructing mathematical concepts on the computer, and de-emphasizing lectures in favor of problem solving and discussions designed to stimulate student constructions of mathematical concepts.

Implementing these innovations raises a number of assessment questions. How do we estimate what individual students have learned if most of their work is done in a group? If students construct mathematical concepts on the computer, how can we tell if they have made similar constructions in their minds? If our theoretical perspective implies that a student may know something quite well but not necessarily display that knowledge in every instance, what is the meaning of answers to specific questions on a timed test?

I will describe how the curriculum development projects relate to these issues, beginning with a very brief sketch of the theoretical framework in which the research takes place and the overall pedagogical strategies it leads to. Then I will describe some ways in which research has influenced the assessment component of the curriculum development. Finally, I will outline our approach to assessment.

A theoretical framework

Our theory begins with an hypothesis on the nature of mathematical knowledge and how it develops: An individual’s mathematical knowledge is her or his tendency to respond to perceived mathematical problem situations by reflecting on them in a social context and constructing or reconstructing mathematical actions, processes and objects and organizing these in schemas to use in dealing with the situations. [1]

There are a number of important issues raised by this statement, many relating to assessment. For example, the fact that one has only a “tendency,” rather than a certainty, to respond in various ways brings into question the meaning of written answers on a timed exam. Another issue is that the student often perceives a very different problem from the one the test-maker intended, and it is unclear how we should evaluate a thoughtful solution to a different problem. The position that learning occurs in response to situations leaves very much open the sequence of topics which a student will learn. In fact, different students learn different pieces of the material at different times, so the timing of specific assessments becomes important. Finally, the position that learning takes place in a social context raises questions about how to assess individual knowledge.

The last part of our hypothesis relates directly to how the learning might actually take place. It is the role of our research to try to develop theoretical and operational understandings of the complex constructions we call actions, processes, objects and schemas (these technical terms are fully described in our publications) and then to relate those understandings to specific mathematical topics. (See [1] and some of our research reports which are beginning to appear in the literature, and visit our web site.)

Given our understandings of the mental constructions involved in learning mathematics, it is the role of pedagogy to develop strategies for getting students to make them and apply them to the problem situations. Following is a list of the major strategies used in courses that we develop. For more information see [1], [3], [7].

• Students construct mathematical concepts on the computer to foster direct mental constructions and provide an experiential base for reflection.

• Students work in cooperative groups that are not changed for the entire course.

• Lectures are de-emphasized in favor of small-group problem solving to help students reflect on their computer constructions and convert them to mental constructions of mathematical concepts.

• Students are repeatedly confronted with the entire panorama of the material of the course and have various experiences that help different students learn different portions of this material at different times. We refer to this arrangement as an holistic spray.

Some inputs to assessment from research

The position on assessment which follows from our theoretical framework is that assessment should ask two kinds of questions: Has the student made the mental constructions (specific actions, processes, objects and schemas) which the research calls for? And has the student learned the mathematics in the course? Positive answers to the first kind of question allow the assertion that the mathematics based on these mental constructions has been learned. This permits us to test, albeit indirectly, for knowledge that the second kind of question may not get to.

Unfortunately, it is not practical in a course setting to test students for mental constructions. In our research, we use interviews, teaching experiments and other methods, all of which require enormous amounts of time and energy, to get at such questions. So we must introduce another indirect component to our assessment. This involves two stages: design and implementation of a very specific pedagogical approach, referred to as the ACE teaching cycle, designed to get students to make certain mental constructions and use them to construct mathematical knowledge; and application, to a particular group of students, of certain assertions, based on research, about the effect of this pedagogical strategy on students' making mental constructions.

The ACE teaching cycle is a course structure in which there is a weekly repetition of a cycle of (A) activities in a computer lab, (C) classroom discussion based on those activities, and (E) exercises. The computer activities are intended to directly foster the specific mental constructions which, according to our research, can lead to understanding the mathematics we are concerned with; the classroom discussions are intended to get students to reflect on these constructions and use them to develop understandings of mathematical concepts; and the exercises, which are fairly traditional, are expected to help the students reinforce and extend their developing mathematical knowledge. (For more details, see [1].)

The second stage of this component is an application of our ongoing research. Our investigations use laborious methods combining both quantitative and qualitative data to determine what mental constructions students appear to be making, and which mental constructions appear to lead to development of mathematical understanding. One outcome of these studies is to permit us to assert, not with certainty, but with some support, that if the pedagogy operated as we intended, that is, the student participated in all of the course activities, cooperated in her or his group, completed the assignments, did reasonably well on exams, etc., then the mental constructions were made.

Because this last point is somewhat different from the kinds of assessments most of us have been used to, perhaps an example will help communicate what we have in mind. Consider the chain rule. We would like students to be able to use this to compute the derivative of a "function of a function" in standard examples, but we would also like the student to understand the rule well enough so that later it can be used to understand (and perhaps even derive, from the Fundamental Theorem of Calculus) Leibnitz' formula for the derivative of a function defined by an integral whose endpoints are functions.
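The formula itself is not written out in the article; one standard statement of it, which follows from the Fundamental Theorem of Calculus together with the chain rule, is

\[
\frac{d}{dx}\int_{a(x)}^{b(x)} f(t)\,dt \;=\; f\bigl(b(x)\bigr)\,b'(x) \;-\; f\bigl(a(x)\bigr)\,a'(x).
\]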

Our research suggests that a key to understanding the chain rule might be an understanding that certain definitions of functions amount to describing them as the composition of two functions, which itself is understood as the sequential coordination of two processes. Our research also suggests that if students successfully perform certain computer tasks and participate in certain discussions, then they are likely to construct such an understanding of the chain rule and also will be reasonably competent in applying this rule in traditional examples.
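To illustrate the kind of computer task involved, here is a minimal Python sketch of our own (the courses actually use ISETL activities, which are not reproduced here; all names below are illustrative): composition as the sequential coordination of two processes, together with a numerical check of the chain rule.

    def compose(g, f):
        """Coordinate two processes sequentially: apply f first, then g."""
        return lambda x: g(f(x))

    def numeric_derivative(h, x, dx=1e-6):
        """Symmetric difference quotient approximating h'(x)."""
        return (h(x + dx) - h(x - dx)) / (2 * dx)

    f = lambda x: x**2 + 1        # inner process
    g = lambda x: x**3            # outer process
    h = compose(g, f)             # h(x) = g(f(x)) = (x**2 + 1)**3

    x = 1.5
    # Chain rule: h'(x) = g'(f(x)) * f'(x); the two sides agree numerically.
    print(numeric_derivative(h, x))
    print(numeric_derivative(g, f(x)) * numeric_derivative(f, x))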

In principle we could simply perform the same research on the students in our classes and get the assessment directly. But this would be vastly impractical, since the research involves interviews and transcribing and analyses that could take years. Instead we ask if the students did perform the computer tasks, did participate in the discussions, and did cooperate in their groups (we determine this by keeping records of their written work, classroom participation, and meetings with groups). We also ask (by testing) if they can use the chain rule to compute various derivatives. If the answer to these questions is yes, then, given the research, we have reason to hope that the students not only learned to use the chain rule, but also developed an understanding that could help them understand Leibnitz' formula in a subsequent course.

There is a second consequence of our theoretical position, which moves us away from thinking of the course as a set of material which the students must learn, so that assessment must measure how much of it they did learn. Rather, we think of the students as beginning with a certain knowledge, and the goal of the course is to increase that knowledge as much as possible. Thus, in making up an examination, for example, we don't think so much of questions that cover as large a portion of the material as possible, but we try to ask the hardest possible questions about the material we believe the students have learned. The expectation is that the students will do well on such tests, and a part of our assessment of how well the course went in terms of what was intended (in the sense of the previous paragraphs) consists of assessing how hard the tests were and how much material they covered.

It could be argued that in this second consequence we are throwing out the requirement that, for example, everyone must learn a certain amount of material in order to get an A. We would respond that, in fact, such a requirement cannot be, and is not, implemented. It is simply impossible, given the realities in which we work, to take a course such as Calculus I, list a set of material and then determine with any degree of accuracy that a given student has learned this or that portion (i.e., numerical percentage) of it. We accept this reality, for example, when we give an exam limited to one, or even two, hours and, of necessity, select only a portion of the material to test. We are making an assumption that students who score x on such a test understand x amount of the selected material and also x amount of the material not tested! We don't see this as a more compelling conclusion about how much of the material was learned than the conclusions we draw using our research.

We also accept this reality when we curve our results, basing our grades not on a given amount of material which we judge to warrant an A, but on how well the brightest students in the class perform on the exam. Again, assumptions are being made that are no more certain than the ones being made in our approach to assessment. As an aside, I would like to forestall the argument that curving grades is a practice not used very often today. I think it may be used more than we think, perhaps implicitly. For example, consider a large engineering-oriented school with thousands of students each year taking calculus to satisfy engineering requirements. The grades in such a course generally fall along a certain bell-shaped distribution. Imagine, for example, what the reaction would be if student performance were significantly lower (three-quarters of the class failed) or higher (more than half the class got an A). Are we prepared to deny that there is (perhaps implicit) curving here? Do we think that this situation represents a reasonable standard of a given amount of material for an A? If so, what would a list of that material — as determined by what is on the tests — look like?

Finally, let me mention one other input, this time from general research in cooperative learning. The results regarding this pedagogical strategy are mixed. There are reports showing large gains as well as others that do not show much advantage from it, and there do not appear to be many results in which cooperative learning was harmful. Studies that have taken a closer look report that there are conditions under which cooperative learning is more likely to be beneficial. One of the most important conditions, according to Slavin [8], is that students are rewarded individually for the performance of their group. (There are some opposing views in the literature ([5]), but they are more about the general question of using rewards, such as tests, to motivate students.) As will be seen in the next section, we make heavy use of this principle.

An approach to assessment

In our courses, students are assigned to permanent groups (of 3 or 4) very early in the course and they do most of their work in these groups, including some of the tests. We use the following assessment items. Because the first of these, computer assignments, is designed to stimulate mental constructions and often asks students to do things that are new and different for them, the grading is relatively lenient and tries to measure mental effort as much as correctness. All of the other instruments are graded in standard ways. Each of the first two exams listed below is held throughout an entire day so that students do not have time limits. They are allowed to leave and return during the day, on the honor system that nothing related to the course will be done during that day except when they are in the exam room.

1. Weekly computer assignments. Students have lab time to work on these in their groups, but not enough for the whole assignment, and they must spend large amounts of time on their own, either individually or in collaboration with their group. The assignment is submitted as a group.

2. Weekly exercises. These are almost entirely apart from the computer and are fairly traditional. They are done entirely on the students' own time, and again the submission is by group.

3. First exam. This is a group exam. It comes about 40% of the way through the course and the students take it as a group, turning in only one exam for the entire group. Every student in a group receives the same grade.

4. Second exam. This comes halfway between the first exam and the end of the course. It is taken individually, but each student receives two grades: her or his score on the exam, and the average of the scores of all of the members of the student's group.

5. Final exam. This exam is given in the standard way during the standard time period. It is taken individually and students receive only their individual score.

6. Classroom participation. Much of the class time is taken up with small-group problem solving and discussion of the problems and their solutions. Both individual and group participation are recorded.

For the final grade, all but the last item are given equal weight, and the last is used to resolve borderline cases. Thus an individual student's grade is determined, essentially, by six scores, four of which are group scores and two of which are individual. This imbalance between group and individual rewards is moderated by one other consideration. If the student's individual scores differ sharply from her or his group scores, then as much as a single-letter upgrade or downgrade in the direction of the individual scores will be given.
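To make the arithmetic concrete, here is a minimal Python sketch of such a scheme. It is our illustration only: the 10-point letter bands, the definition of "borderline," and the 15-point threshold for a sharp individual/group divergence are assumptions, not values given in the article.

    def final_grade(group_scores, individual_scores, participation_ok):
        """Six equally weighted scores (four group, two individual) on a
        0-100 scale; participation breaks borderline cases; a sharp gap
        between individual and group averages shifts the grade up to one
        letter toward the individual scores."""
        scores = group_scores + individual_scores
        average = sum(scores) / len(scores)

        letters = ["F", "D", "C", "B", "A"]
        index = max(min(int(average // 10) - 5, 4), 0)   # 90+ -> A, 80s -> B, ...

        # Borderline case (within 1 point of the next band): use participation.
        if participation_ok and index < 4 and average % 10 >= 9:
            index += 1

        # Up to one letter in the direction of the individual scores.
        gap = (sum(individual_scores) / len(individual_scores)
               - sum(group_scores) / len(group_scores))
        if gap > 15 and index < 4:
            index += 1
        elif gap < -15 and index > 0:
            index -= 1

        return letters[index]

    print(final_grade([85, 88, 90, 86], [91, 89], participation_ok=True))  # "B"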

Uses of assessments

We can summarize the uses we make of the various assessment activities as follows: assigning grades, reconsidering specific parts of the course, and reconsidering our entire approach. We described in the previous section how assignment scores, exam scores and classroom participation of individuals and groups are combined in determining the grade of an individual student. As always, such measures leave some students on the borderline between two possible grades. In these cases we apply our feeling, based on our research, that when all of the components of our course work as we intended them to, then learning takes place. Thus, if the course seemed to go well overall, in its own terms, and students appeared to buy into our approach, then we will tend to choose the higher grade. Otherwise, we will not give students much "benefit of the doubt" in determining the final grade.

The combination of course information about each student and research on students in general provides a sort of triangulation that can be used in a formative way in some cases. If the research tells us that students who experience our approach are likely to learn a particular concept, but they do not do well on this point in the actual course, then we look hard at the specific implementation as it regards that concept. If, on the other hand, students perform well on a particular concept but research suggests that their understanding leaves much to be desired, then we worry that the performance might be due to memorization or other superficial strategies. Finally, if both student performance in courses and subsequent research suggest that they "are not getting it," then we think about local revisions to our approach.

This latter occurs from time to time. For example, in the C4L calculus reform project, we have a set of computer activities designed to help students develop an understanding of the limit concept by looking at the problem of adjusting the horizontal dimension of a graphics window so as to keep a particular curve within a specified vertical dimension. Students don't like these problems very much and don't do exceptionally well on related exam questions. Moreover, there is nothing in the research to suggest anything striking in their understanding of the limit concept. Therefore we have reconsidered and adjusted our approach to limits.
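The underlying task mirrors the epsilon-delta formulation of the limit: given a vertical band L plus or minus epsilon, find a horizontal half-width delta about x = a that keeps the curve inside the band. The following Python sketch is our reconstruction of that idea, not the C4L activities themselves; the function name and the sampling scheme are illustrative.

    import math

    def window_halfwidth(f, a, L, epsilon, start=1.0, samples=200):
        """Halve the horizontal half-width delta until the sampled curve
        stays within the vertical band (L - epsilon, L + epsilon)."""
        delta = start
        while delta > 1e-12:
            xs = [a + delta * (2 * k / samples - 1) for k in range(samples + 1)]
            if all(abs(f(x) - L) < epsilon for x in xs if x != a):
                return delta
            delta /= 2
        return None  # no suitable window found at this resolution

    # Example: f(x) = sin(x)/x near a = 0, where the limit is L = 1.
    print(window_halfwidth(lambda x: math.sin(x) / x, a=0.0, L=1.0, epsilon=0.01))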

Finally, there is the possibility that, over a period of time, performance of students in courses and research could lead us to a more general feeling of malaise with respect to our overall methods. In this case, more systemic changes, including the possibility of rejecting the entire approach, would be considered. So far, this has not happened.

Conclusion

There are two comments to make in evaluating our approach. One is that it is clear that this approach to assessment reflects the principles we espouse and what we think research tells us. In particular, we have addressed the questions raised at the beginning of this article. The second is that we cannot say with certainty how effective our assessment is. We do have research reports that encourage us but, in the end, teaching, like parenting, is an activity in which we can never really know how effective our efforts were. We can only try as hard as we can to determine and implement what seem to us to be the most effective approaches, and then hope for the best.

References

[1] Asiala, M., Brown, N., DeVries, D., Dubinsky, E., Mathews, D. and Thomas, K. "A Framework for Research and Development in Undergraduate Mathematics Education," Research in Collegiate Mathematics Education II, CBMS Issues in Mathematics Education, 6, 1996, pp. 1–32.

[2] Dubinsky, E. "A Learning Theory Approach to Calculus," in Karian, Z., ed., Symbolic Computation in Undergraduate Mathematics Education, MAA Notes Number 24, The Mathematical Association of America, Washington, DC, 1992, pp. 48–55.

[3] Dubinsky, E. "ISETL: A Programming Language for Learning Mathematics," Comm. in Pure and Applied Mathematics, 48, 1995, pp. 1–25.

[4] Fenton, W.E. and Dubinsky, E. Introduction to Discrete Mathematics with ISETL, Springer, 1996.

[5] Kohn, A. "Effects of rewards on prosocial behavior," Cooperative Learning, 10 (3), 1990, pp. 23–24.

[6] Leron, U. and Dubinsky, E. "An Abstract Algebra Story," American Mathematical Monthly, 102 (3), 1995, pp. 227–242.

[7] Reynolds, B.E., Hagelgans, N.L., Schwingendorf, K.E., Vidakovic, D., Dubinsky, E., Shahin, M., and Wimbish, G.J., Jr. A Practical Guide to Cooperative Learning in Collegiate Mathematics, MAA Notes Number 37, The Mathematical Association of America, Washington, DC, 1995.

[8] Slavin, R.E. "When does cooperative learning increase student achievement?" Psychological Bulletin, 94, 1983, pp. 429–445.

An Evaluation of Calculus Reform: A Preliminary Report of a National Study

Susan L. Ganter
American Association for Higher Education

This is a preliminary report of a study at NSF dealing with what NSF has looked for, what it has found, and directions for future study. One such direction will be to shift from "teaching" to "student learning" and the learning environment.

Background and Purpose

In 1986, the National Science Board (NSB) published a report from their Task Committee on Undergraduate Science and Engineering Education, whose charge was to consider the role of the National Science Foundation (NSF) in undergraduate education. The Neal Report, as it is commonly known, outlined the problems in undergraduate mathematics, engineering, and science education that had developed in the decade prior to 1986. Specifically, the Task Committee discussed three major areas of the undergraduate environment that required the highest level of attention: (1) laboratory instruction; (2) faculty development; and (3) courses and curricula. Their primary recommendation to the NSB was the development of a plan for "new and innovative program approaches that will elicit creative proposals from universities and colleges...as part of the National Science Foundation" ([5], p. v). As a response to this and other reports (e.g., [2], [10]), NSF published their first program announcement [6] for calculus in 1987, with the first awards implemented in 1988. This program has served as a driving force for the national effort in the mathematics community known as the calculus reform movement.

Institutions nationwide have implemented programs as part of the calculus reform movement, many of which represent fundamental changes in the content and presentation of the course. For example, more than half of the projects funded by NSF use computer laboratory experiences, discovery learning, or technical writing as a major component of the calculus course, ideas rarely used prior to 1986 [3]. The content of many reform courses focuses on applications of calculus and conceptual understanding as important complements to the computational skills that were the primary element of calculus in the past. It is believed by many that such change is necessary for students who will live and work in an increasingly technical and competitive society.

A critical part of this process of change is the evaluation of these programs and their impact on the learning environment. As early as 1991, NSF began receiving pressure from the academic community and Congress to place more emphasis on evaluating the impact of these developments on student learning and the environment in undergraduate institutions (see [7]). Although this pressure has resulted in a heightened awareness of the need for evaluation and financial support for a few such studies, the area of evaluation research in undergraduate reform is still in its infant stages, with much of the work done by graduate students in the form of unpublished doctoral dissertations and master's theses. Only through additional studies can the mathematics community continue to develop the calculus course in ways that are most conducive to the needs of their students, the profession, and society.

A number of reports that present programmatic information and indicators of success in the efforts to incorporate technology and sound pedagogical methods in calculus courses have indeed been written (e.g., [8], [9], [12]). Reform has received mixed reviews, with students seemingly faring better on some measures, while lagging behind students in traditional courses on others. However, these reports present only limited information on student learning in reform courses, primarily because the collection of reliable data is an enormous and complicated task and concrete guidelines on how to implement meaningful evaluations of reform efforts simply do not exist ([12]). The need for studies that determine the impact of these efforts, in combination with the increase in workload brought on by reform, is creating an environment of uncertainty. Funding agencies, institutions, and faculty require the results of such studies to make informed decisions about whether to support or withdraw from reform activities.

This study is being conducted as a part of a larger effort by NSF to evaluate the impact of reform in science, mathematics, engineering, and technology (SMET) education at the undergraduate level. The study has been designed to investigate what is currently known about the effect of calculus reform on (1) student learning, attitudes, and retention; (2) use of mechanisms that historically have been shown to improve the learning environment, i.e., faculty development activities, student-centered learning, and alternative methods of delivery and assessment of knowledge; and (3) the general educational environment. Preliminary results from this project will be reported here, including information from NSF projects, insights from the mathematics community, and anticipated implications for future efforts in calculus.

Method

The calculus reform initiative that NSF encouraged through awards from 1988 to 1994 set the direction for much of the undergraduate reform that has followed. Therefore, it is especially important that a thorough study of this pioneering effort be conducted in order to make informed decisions not only about the future of calculus, but also regarding all reform efforts in undergraduate SMET education. To this end, the project was designed to synthesize what is currently known about the impact of calculus reform on the learning environment, including student learning, attitudes, and retention. The information to be presented is not intended to be a definitive end to the evaluation process, but rather a progress report of what is known to date. It is expected that this information will generate discussion within the academic community, resulting in additional evaluation studies.

For the purpose of this evaluation study, information was gathered using the following methods:

1. A search of literature was conducted, including journal articles, conference proceedings, dissertations, and other relevant publications, to identify evaluations of calculus reform that are available. The information was compiled and synthesized to determine what is currently known about calculus reform from these evaluations.

2. Folders containing all information that has been submitted to NSF for each NSF-funded calculus project were searched for any proposed evaluation of the project and corresponding findings, as well as dissemination information. This evaluation information has been summarized in a qualitative database and cross-analyzed between projects. A framework for the information to be documented in this database was developed in consultation with NSF program officers prior to the development of the database. Precise definitions to be used in determining the existence of various evaluation, dissemination, and reform activities for each project were also developed and used to guide the data entry. (A sketch of what one such database record might look like follows this list.)

3. A letter was developed and sent to approximately 600 individuals who have participated in the reform efforts. The letter requested assistance in the compilation of existing evaluation studies in calculus reform. The mailing list included not only principal investigators from NSF calculus projects, but also individuals from non-funded efforts and others who have been involved in the evaluation of calculus projects. The names of those who have not been affiliated with an NSF project were obtained through the literature search. The letter was sent via email to a select group of mathematicians and mathematics educators for feedback and comments prior to the mailing.
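For illustration only (the actual framework and field definitions developed with the NSF program officers are not reproduced in this report, so every field name and value below is hypothetical), a record in such a qualitative database might be structured as follows:

    # Hypothetical structure of one project record in the qualitative database.
    project_record = {
        "institution": "Example State University",   # placeholder name
        "annual_funding_usd": 150000,
        "evaluation_proposed": True,
        "evaluation_findings": ["improved conceptual understanding"],
        "dissemination": ["conference talks", "journal articles", "workshops"],
        "reform_activities": ["computer labs", "cooperative learning"],
    }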

Preliminary Findings

1. Analysis of NSF Projects

NSF folders have been obtained for the 127 projects awarded to 110 institutions as part of the calculus initiative (1988–94). Each folder was reviewed and analyzed as discussed above, yielding the following information:

• NSF funding for individual projects ranged from $1,500 per year to $570,283 per year, with a mean annual award of $186,458; the duration of the awards was usually two years or less;

• computer use/laboratory experience, applications, and conceptual understanding are the objectives of reform deemed most important in the projects;

• the pedagogical techniques most often cited as part of the projects are technical writing, discovery learning, use of multiple representations of one concept, and cooperative learning;

• the most popular methods for distributing results to the community were conference presentations, journal articles, organization of workshops, and invited presentations at college colloquia;

• evaluations conducted as part of the curriculum development projects mostly concluded that students in reform courses had better conceptual understanding, higher retention rates, higher confidence levels, and greater levels of continued involvement in mathematics than those in traditional courses; however, scores on common traditional exams yielded mixed results, making it unclear whether there was any significant loss of traditional skills in reform students.

2. Information Obtained from the Community

Almost 200 evaluation documents have been obtained from the academic community, including published papers and curricular materials, dissertations, printed conference presentations and proceedings, letters describing results from projects, and internal reports submitted within various colleges and universities. All information is being summarized in qualitative data files that will be discussed in conjunction with the data on the NSF projects in the full report. A preliminary review of the information submitted has revealed the following trends:

• the reform effort has motivated many (often controversial) conversations among faculty about the way in which calculus is taught; these conversations are widespread and continuous, and they have resulted in a renewed sense of importance about undergraduate mathematics education;

• in general, regardless of the reform method used, the attitudes of students and faculty seem to be negative in the first year of implementation, with steady improvement in subsequent years as continuous revisions are made based on feedback;

• most faculty believe that the "old" way of teaching calculus was ineffective; however, there is much debate about whether reform is moving in the right direction;

• there does not seem to be any consistency in the characteristics of faculty who are for/against reform; however, the students who seem to respond most positively to reform are those with little or no experience in calculus prior to entering the course, with the students who excel in the traditional environment having the most negative reactions to reform methods;

• the requirements of standardized tests seem to have the greatest effect on the adoption of reform practices by secondary AP calculus teachers; i.e., the recently implemented requirement of graphing calculators on the AP calculus exam has ensured their use in virtually every secondary calculus classroom, even though many AP teachers are opposed to such a requirement;

• faculty universally agree upon the importance of evaluating the effect of the reform efforts on student learning, faculty and student attitudes, and curriculum development; they believe that this information is needed to justify their work within academic departments, to understand the impact of various methods, and to give them motivation to continue in the struggle.

Implications of Findings for the Reform Environment

The existence of common elements in many of the calculus reform projects with varying levels of success implies that the impact of reform is perhaps not so dependent upon what is implemented, but rather upon the educational environment that is created in which to implement it. This environment is determined in part by the level of departmental commitment and the amount of faculty involvement in the reform efforts. Specifically, departments in which the reform courses are developed and supported by only a small portion of the faculty inevitably confront difficulties, whether those difficulties be the inconsistency between their courses or simply the exhaustion of those involved as they work to keep the program going, often at the expense of a positive relationship with their colleagues.

However, even departments that are very committed to reform will confront problems. It is the anticipation of these problems, as well as the construction of methods for handling them, that seems most critical to the continuing success of a program. For example, the expectations of a reform calculus course are often ones with which students (and their instructors) have had very little experience. How can students be engaged in productive group learning situations? What are the most effective and appropriate uses of computers? These and the multitude of other questions that reform implies about the educational experience in calculus have made apparent the need for additional support from a variety of sources as both students and faculty adjust their styles in the classroom. These necessary resources include, for example, technical and educational support for faculty and special study sessions and computer assistance for students.

Directions for Further Study

A detailed report of the results from this evaluation project will be published and distributed as appropriate to the broader community. This report will provide information that can be used not only by NSF, but also by the mathematics community at large to inform educational improvements at the undergraduate level. In addition, the report will recommend a plan for the continuing evaluation of calculus reform. The need for studies that employ more in-depth, rigorous data collection, including studies of the long-term impact of the reform efforts, is clearly an area for further research.

Specifically, projects that contribute to the sparse literature addressing the effects of undergraduate reform on student learning, attitudes, and retention are much needed. Although there has been considerable work in the area of student learning in calculus (e.g., [1], [4], and [11]), most of these investigations have looked at how students learn. It is imperative to understand not only how students learn, but also the actual impact of the different environments created by the calculus reform movement on their ability to learn. This information can be used to inform the allocation of resources and to help the faculty involved in reform to continue developing their ideas in ways that will be most productive for them and their students.

In addition, this study has made evident the need to investigate further the impact of reform on faculty, departments, and institutions, including methods for developing the teaching styles of current and future instructors in ways that are conducive to reform. Such work, and the implementation of the resulting findings, could help faculty to better understand their students and the pedagogical methods that best suit their needs at various times in their undergraduate experience. It will also help departments and institutions to provide the most effective educational experience for their students.

This research was supported by a grant from the American Educational Research Association, which receives funds for its "AERA Grants Program" from the National Science Foundation and the National Center for Education Statistics (U.S. Department of Education) under NSF Grant #RED-9452861. Opinions reflect those of the author and do not necessarily reflect those of the granting agencies. The author would like to thank John Luczak, Patty Monzon, Natalie Poon, and Joan Ruskus of SRI International for their assistance in compiling the information for this report.

Please send all correspondence regarding this article to:

Dr. Susan L. Ganter
Director, Program for the Promotion of Institutional Change
American Association for Higher Education
One Dupont Circle, Suite 360
Washington, DC
[email protected], ext. 32

References

[1] Davidson, N.A. Cooperative Learning in Mathematics: A Handbook for Teachers, Addison-Wesley Publishing Company, Inc., Menlo Park, CA, 1990.

[2] Douglas, R. Toward a Lean and Lively Calculus, Report of the Tulane Conference on Calculus, Mathematical Association of America, Washington, DC, 1987.

[3] Ganter, S.L. "Ten Years of Calculus Reform and its Impact on Student Learning and Attitudes," Association for Women in Science Magazine, 26 (6), Association for Women in Science, Washington, DC, 1997.

[4] Heid, M.K. "Resequencing Skills and Concepts in Applied Calculus using the Computer as a Tool," Journal for Research in Mathematics Education, 19 (1), National Council of Teachers of Mathematics, Reston, VA, 1988, pp. 3–25.

[5] National Science Board. Undergraduate Science, Mathematics and Engineering Education: Role for the National Science Foundation and Recommendations for Action by Other Sectors to Strengthen Collegiate Education and Pursue Excellence in the Next Generation of U.S. Leadership in Science and Technology, Report of the Task Committee on Undergraduate Science and Engineering Education, Neal, H., Chair, Washington, DC, 1986.

[6] National Science Foundation. Undergraduate Curriculum Development in Mathematics: Calculus, Program announcement, Division of Mathematical Sciences, Washington, DC, 1987.

[7] National Science Foundation. Undergraduate Curriculum Development: Calculus, Report of the Committee of Visitors, Treisman, P., Chair, Washington, DC, 1991.

[8] Roberts, A.W., ed. Calculus: The Dynamics of Change, Mathematical Association of America, Washington, DC, 1996.

[9] Solow, A., ed. Preparing for a New Calculus, Conference Proceedings, Mathematical Association of America, Washington, DC, 1994.

[10] Steen, L.A., ed. Calculus for a New Century, Mathematical Association of America, Washington, DC, 1987.

[11] Tall, D. "Inconsistencies in the Learning of Calculus and Analysis," Focus on Learning Problems in Mathematics, 12 (3 and 4), Center for Teaching/Learning of Mathematics, Framingham, MA, 1990, pp. 49–63.

[12] Tucker, A.C. and Leitzel, J.R.C., eds. Assessing Calculus Reform Efforts: A Report to the Community, Mathematical Association of America, Washington, DC, 1995.

Increasing the Dialogue About Calculus with a Questionnaire

Darien Lauten, Karen Graham, and Joan Ferrini-Mundy
Rivier College in Nashua and University of New Hampshire at Durham

A practical survey is provided here which might be found useful for departments looking to initiate discussion about goals and expectations in a Calculus course.

Background and Purpose

Many mathematics departments across the country are considering change in the teaching of calculus. In our work on the Calculus Consortium Based at Harvard Evaluation and Documentation Project (CCH EDP) [2], we developed a set of questionnaires. Items from these questionnaires can provide mathematics departments with a vehicle for beginning discussions about possible change in the teaching of calculus or for assessing changes already made. Throughout the project we were constantly reminded of the wide variation in participants' perspectives and local situations. The implementation of reform-based calculus should be different at each institution because implementation patterns reflect local student needs, faculty attitudes and beliefs, and characteristics unique to institutions.

Method

The CCH EDP began in the fall of 1994. In this project, we sought (a) to investigate faculty perceptions of student learning and faculty attitudes and beliefs towards calculus reform and (b) to examine and describe the evolution of efforts to reform the teaching of calculus in the context of CCH Curriculum Project materials ([3]). As part of the CCH EDP, we surveyed 406 faculty members, who were using or had used CCH Curriculum Project materials, at 112 institutions of higher education. In the surveys, we asked about types of courses, types of students using the materials, faculty interpretation and use of the materials, pedagogical approaches, the extent of technology use, institution and department reaction to the use of the materials, factors influencing the initiation of reform efforts, and faculty attitudes and beliefs.

Findings

The questionnaire (see Appendix) contained in this article represents a subset of the original CCH EDP survey items. The items were chosen because we found, based on responses to the CCH EDP survey, that they revealed significant information about how faculty interpret and implement reform-based calculus. Many of the questionnaire items were based on goals for reform in calculus instruction established at the Tulane Conference in 1986 [1].

Use of Findings

Our experience indicates that casual and formal discussions within a department are critical in assessing a calculus program. There are several ways you might use the questionnaire. You could use the questions as the basis for calculus-instructor or full-department discussions without the faculty completing the survey beforehand. You may decide to modify the items to develop a questionnaire more suited to your own situation or to include fewer or more comment-type questions. On the other hand, you might ask faculty members to anonymously complete and return the questionnaire prior to a meeting. You could collate and analyze the responses and distribute an analysis of the quantitative responses and a listing of the comments.

The responses to each set of items could serve as the foundation for more formal calculus-instructor or department discussions. To get a discussion rolling, you might ask the following discussion questions:

• How would you characterize your department's approach to calculus instruction? What are the more important features, the less important?

• Is using technology the basis for your efforts? Why or why not?

• Do your efforts encourage student engagement with calculus? How do you know?

• What role do the following pedagogical methods play in your approach: cooperative groups, projects, student presentations, using concrete materials to teach calculus? What solid evidence do you have that these techniques work or that the lecture method works?

• What does "rigor" mean to you, and what importance do you give "rigor" in the teaching of calculus?

Success Factors

Continued use of this questionnaire will refine the issues that are important to you as a faculty, and the process could bring about changes in the questions you ask. The questionnaire provides a format for meetings in which the faculty, and possibly student representatives, can come together periodically to discuss the results. In this way, your department can create a beginning for reforming the learning environment. Above all, your faculty and students will become more sensitized to the goals of the department and the issues surrounding the teaching of calculus.

References

[1] Douglas, R., ed. Toward a Lean and Lively Calculus: Report of the conference/workshop to develop curriculum and teaching methods for calculus at the college level (MAA Notes Series No. 6), Mathematical Association of America, Washington, DC, 1986.

[2] Ferrini-Mundy, J. CCH Evaluation and Documentation Project, University of New Hampshire, Durham, NH, 1994.

[3] Hughes-Hallett, D., Gleason, A.M., Flath, D.E., Gordon, S.P., Lomen, D.O., Lovelock, D., McCallum, W.G., Osgood, B.G., Pasquale, A., Tecosky-Feldman, J., Thrash, J.B., Thrash, K.R., and Tucker, T.W. Calculus, John Wiley & Sons, Inc., New York, 1992.

Appendix

CALCULUS QUESTIONNAIRE

Please answer all questions and comment as often as you wish.

1. CONTENT:

1A. Please indicate the level of emphasis that you feel should be placed on the following topics in a first-year calculus course.

Amount of emphasis: 1 = little or none, 5 = heavy

a. Preliminaries (functions, absolute value, etc.)  1 2 3 4 5
b. Limits (lengthy treatment, rate "heavy")  1 2 3 4 5
c. Derivative as a rate, slope of tangent line, etc.  1 2 3 4 5
d. Using definition to find derivative  1 2 3 4 5
e. Techniques of differentiation  1 2 3 4 5
f. Applications of the derivative (max/min, related rates, etc.)  1 2 3 4 5
g. Fundamental Theorem of Calculus (lengthy treatment, rate "heavy")  1 2 3 4 5
h. The definite integral as area (lengthy treatment, rate "heavy")  1 2 3 4 5
i. Techniques of integration  1 2 3 4 5
j. Applications of integration (arc length, volumes of solids, surface area, etc.)  1 2 3 4 5
k. Applications of exponential/logarithmic functions (growth, decay, etc.)  1 2 3 4 5
l. Solving differential equations  1 2 3 4 5
m. Applications of differential equations  1 2 3 4 5
n. Series (lengthy treatment, rate "heavy")  1 2 3 4 5
o. Series — techniques to determine convergence  1 2 3 4 5
p. Taylor series (lengthy treatment, rate "heavy")  1 2 3 4 5
q. Applications of Taylor series  1 2 3 4 5
r. Parametrizations, vectors  1 2 3 4 5

1B. Please add any other topics that you feel should be included, and any other remarks concerning content. Technology will be addressed separately.

2. THEORETICAL METHODOLOGY

2A. Please indicate the level of emphasis that you feel should be placed on the following methods in a first-year calculus course. It might be helpful to think about how much emphasis you place on these items in your assessment of students. Technology will be addressed separately.

Amount of emphasis: 1 = little or none, 5 = heavy

a. Formal definitions  1 2 3 4 5
b. Statements of theorems, counterexamples, etc.  1 2 3 4 5
c. Proofs of significant theorems  1 2 3 4 5
d. Historical themes in mathematics  1 2 3 4 5
e. Writing assignments  1 2 3 4 5
f. Student practice of routine procedures  1 2 3 4 5
g. Applications of real world problems  1 2 3 4 5
h. The analysis and solution of non-routine problems  1 2 3 4 5

2B. Please state your own definition of the phrase “mathematical rigor.”

2C. Based on your definition, please describe what level is appropriate in first-year calculus and how you would assess (grade, for example) at that level.

3. TECHNOLOGY FOR LEARNING:

3A. Please state your feelings about the role of technology in the classroom, in promoting or hindering learning in a first-year calculus course.

3B. To what extent do you feel technology should be integrated throughout the course versus used only for special projects, if at all?

3C. Please select the response that best represents your views about the ideal use of calculators or computers in the classroom.

Amount of emphasis: 1 = little or none, 5 = heavy

a. Calculators for numerical purposes  1 2 3 4 5
b. Calculators for graphing purposes  1 2 3 4 5
c. Calculators for symbolic manipulation  1 2 3 4 5
d. Computer courseware (Maple, Mathematica, etc.)  1 2 3 4 5
e. Modifying existing programs/Programming  1 2 3 4 5
f. Spreadsheets or tables  1 2 3 4 5

4. CLASSROOM TEACHING APPROACHES

4A. Please respond: In an ideal calculus course, how frequently would your students use the following instructional systems?

Frequency of use: 1 = not frequent, 5 = very frequent

a. Use lecture notes as basis for learning  1 2 3 4 5
b. Participate in a specially designed calculus laboratory  1 2 3 4 5
c. Use concrete materials/equipment to explore calculus ideas  1 2 3 4 5
d. Work in small groups on mathematics problems  1 2 3 4 5
e. Work in small groups on projects that take several class meetings to complete  1 2 3 4 5
f. Practice calculus procedures in the classroom  1 2 3 4 5
g. Make conjectures, explore more than one possible method to solve a calculus problem  1 2 3 4 5

4B. Please add any additional comments. Perhaps you would like to address the phrase "ideal classroom." What types of support might your department, in an ideal world, provide you with to help you accomplish your teaching goals?

5. STUDENT ASSESSMENT/EVALUATION

5A. In your calculus course, what importance to the course grade do you assign to each of the following items? (1) Please rate on the scale. (2) Please circle the methods of assessment that you would like to discuss.

Importance: 1 = unimportant, 5 = very important

a. Quizzes, tests, or examinations that measure individual mastery of content material  1 2 3 4 5
b. A final examination that measures individual mastery of content material  1 2 3 4 5
c. Individual tests of mastery of content material  1 2 3 4 5
d. Small group tests of mastery of content material  1 2 3 4 5
e. Lab reports — individual grades  1 2 3 4 5
f. Lab reports — group grades  1 2 3 4 5
g. Quizzes, tests, or examinations of material learned in labs  1 2 3 4 5
h. Homework exercises — individual grades  1 2 3 4 5
i. Homework exercises — group grades  1 2 3 4 5
j. Projects — individual grades  1 2 3 4 5
k. Projects — group grades  1 2 3 4 5
l. Journals  1 2 3 4 5
m. Class participation  1 2 3 4 5
n. Portfolios  1 2 3 4 5
o. Other (please describe):  1 2 3 4 5

6. PERSPECTIVES ON CALCULUS REFORM

6A. How aware are you of "calculus reform" issues and efforts? Please explain.

6B. What do you find encouraging about the directions of calculus reform?

6C. What are your concerns about the directions of calculus reform?

7. PERSPECTIVES ON THE CURRENT TEXT: Please write your comments about the text in use in your department. Do you want to change the current text? Please explain.

8. PERSPECTIVES ON THE CURRENT STUDENTS: It is important to share perspectives about the students we teach. Please give your impressions of the latest Calculus class you have taught, and share any comments from the students that you would like to pass on to members of the department.

Workshop Calculus: Assessing Student Attitudes and Learning Gains

Nancy Baxter Hastings
Dickinson College

This assessment program stresses breadth — pre- and post-testing, journals, comparative test questions, student interviews and questionnaires, and more.

Background and Purpose

Dickinson College is a four-year, residential, liberal arts institution serving approximately 1,800 students. During the past decade, the introductory science and mathematics courses at the college have been redesigned to emphasize questioning and exploration rather than passive learning and memorization. The Workshop Calculus project1, now in its sixth year of development, is part of this college-wide effort.

————————
1 The Workshop Calculus project has received support from the Knight Foundation, the US Department of Education Fund for Improvement of Postsecondary Education (FIPSE) and the National Science Foundation (NSF).

Workshop Calculus is a two-semester sequence that integrates a review of pre-calculus concepts with the study of fundamental ideas encountered in Calculus I: functions, limits, derivatives, integrals and an introduction to integration techniques. The course provides students who have had three to four years of high school mathematics, but who are not prepared to enter Calculus I, with an alternate entry point into the study of higher-level mathematics. It seeks to help these students, who might otherwise fall through the cracks, develop the confidence, understanding and skills necessary to use calculus in the natural and social sciences and to continue their study of mathematics. After completing both semesters of Workshop Calculus, workshop students join their peers who have completed a one-semester Calculus I course, in Calculus II.

All entering Dickinson students who plan to take calculus are required to take the MAA Calculus Readiness Exam. Students who score below 50% on this exam are placed in Workshop Calculus, while the others enter Calculus I. The two strands serve the same clientele: students who plan to major in mathematics, physics, economics or other calculus-based disciplines. While both courses meet 5 hours a week, Calculus I has 3 hours of lecture and 2 hours of laboratory sessions, while there is no distinction between classroom and laboratory activities in Workshop Calculus, which is primarily hands-on.

A forerunner to Workshop Calculus, consisting of a review of pre-calculus followed by a slower-paced version of calculus, was developed in response to the fact that 30–40% of the students enrolled in Calculus I were having difficulty with the material, even though they appeared to have had adequate high school preparation. Students' reactions to the course, which was lecture-based with an emphasis on problem solving, were not positive. They did not enjoy the course, their pre-calculus skills did not improve and only a few ventured on to Calculus II. In addition, colleagues in client departments, especially economics and physics, continued to grumble about the difficulty students had using calculus in their courses. Workshop Calculus was the department's response.

With the Workshop approach, students learn by doing and by reflecting on what they have done. Lectures are replaced by interactive teaching, where instructors try not to discuss an idea until after students have had an opportunity to think about it. In a typical Workshop class, the instructor introduces students to new ideas in a brief, intuitive way, without giving any formal definitions or specific examples. Students then work collaboratively on tasks in their Workshop Calculus Student Activity Guide [1]. The tasks in this manual are learner-centered, including computer tasks, written exercises and classroom activities designed to help students think like mathematicians — to make observations and connections, ask questions, explore, guess, learn from their errors, share ideas, and read, write and talk mathematics. As they work on assigned tasks, the instructor mingles with them, guiding their discussions, posing additional questions, and responding to their queries. If a student is having difficulty, the instructor asks the student to explain what he or she is trying to do and then responds using the student's approach, trying not to fall into let-me-show-you-how-to-do-this mode. The instructor lets — even encourages — students to struggle, giving only enough guidance to help them overcome their immediate difficulty. After completing the assigned activities, students participate in class discussions, where they reflect on their own experiences. At this point, the instructor can summarize what has been happening, present other theoretical material or give a mini-lecture.

Method

Assessment activities are a fundamental part of the Workshop Calculus project. With the help of external collaborators2, we have analyzed student attitudes and learning gains, observed gender differences, collected retention data and examined performance in subsequent classes. This information was used to help make clearer the tasks expected of students laid out in our Study Guide. More significantly, it has provided the program with documented credibility, which has helped the Workshop program gain the support of colleagues, administrators, outside funding agencies, and even the students themselves. The following describes some of the tools we have used. For a more in-depth description of these assessment tools and a summary of some of the results, see [2].

————————
2 Ed Dubinsky, from Georgia State University, and Jack Bookman, from Duke University, helped assess student learning gains and attitudes in Workshop Calculus. They were funded by FIPSE.

a. Collecting Baseline Data. During the year prior to introducing Workshop Calculus, baseline information was collected concerning students' understanding of basic calculus concepts. Workshop Calculus students were asked similar questions after the new course was implemented. For example, students in the course prior to the Workshop version were asked:

• What is a derivative?

• If the derivative of a function is positive in a given interval, then the function is increasing. Explain why this is true.

In response to the first question, 25% of the students stated that the derivative represented the slope of the tangent line, and half of these students used this fact to give a reasonable explanation to the second question. The remaining students answered the first question by giving an example and either left the second question blank or wrote a statement that had little relationship to the question; they could manipulate symbols, but they didn't understand the concepts.

This data showed the need to emphasize conceptual understanding of fundamental concepts and to have the students write about these ideas. Consequently, Workshop students learn what an idea is — for example, what a derivative is and when it is useful — before they learn how to do it — in this case, the rules of differentiation — and they routinely write about their observations.

b. Administering Pre- and Post-tests. Workshop Calculus students answer questions prior to undertaking particular activities and then are re-asked the questions later. For example, on the first day of class, students are asked to write a short paragraph describing what a function is, without giving an example. Although they all claim to have studied functions in high school, many write gibberish, some leave the question blank, and only a few students each year describe a function as a process. After completing the activities in the first unit of their activity guide (where they do tasks designed to help them understand what a function is, without being given a formal definition of "function"), nearly 80% give correct, insightful descriptions of the concept of function.

c. Analyzing Journal Entries. At the end of each of the ten units in the Workshop Calculus activity guide, students are asked to write a journal entry, addressing the following questions:

• Reflect on what you have learned in this unit. Describe in your own words the concepts that you studied and what you learned about them. How do they fit together? What concepts were easy? Hard? What were the main, important ideas? Give some examples of these ideas.

• Reflect on the learning environment for the course. Describe the aspects of this unit and the learning environment that helped you understand the concepts you studied.

• If you were to revise this unit, describe the changes you would make. What activities did you like? Dislike?

The students' responses provide the instructor with valuable insight into their level of understanding; their candid replies also provide important feedback about what works and doesn't work and about changes that might need to be made.

d. Asking Comparative Test Questions. Student performance in Workshop Calculus has been compared to that of students at other institutions. For instance, on the final exam for Workshop Calculus in the spring of 1994, students were asked four questions from an examination developed by Howard Penn at the US Naval Academy to assess the effectiveness of using the Harvard materials at the USNA versus using a traditional lecture-based approach [3]. (See the Appendix for these questions.) We are pleased to report that the Workshop Calculus students did about as well as the USNA students who were using the Harvard materials, even though the Academy is certainly more selective than Dickinson College.

e. Conducting Critical Interviews. A representative group of Workshop students have been interviewed in structured ways, to help determine what they are thinking as they work on a given problem and to determine their level of understanding of a particular concept. For this, a questionnaire is administered to students, and we categorize the responses and approaches used, even taping one-on-one "critical interviews" with a representative group of students, transcribing the interview sessions, and analyzing the results. After using the first version of the Workshop Calculus materials in the spring of 1992, this approach was used, for instance, to analyze students' understanding of "definite integral." After analyzing students' responses on the questionnaire and during the interview sessions, we realized that some students could only think about a definite integral in terms of finding the area under a curve and had difficulty generalizing.3 Based on this observation, the tasks pertaining to the development of the definite integral in the Student Activity Guide were revised.

————————
3 Ed Dubinsky helped design the questionnaire and analyze the results.

f. Scrutinizing the End-of-Semester Attitudinal Questionnaire. At the end of each semester, students are asked to rate the effectiveness of various activities, such as completing tasks in their activity guide, participating in discussions with peers, using a computer algebra system. They are also asked to rate their gains in understanding of fundamental ideas — such as their understanding of a function, or the relationship between derivatives and antiderivatives — to compare how they felt about mathematics before the course with how they feel after completing the course, and to describe the most and least useful activities in helping them learn mathematics.

Student responses are analyzed for gender differences. For instance, both male and female students who took Workshop Calculus in 1993–1994 claimed, on the average, that they felt better about using computers after completing the course than before. Men showed a greater increase in confidence, however, and even after the course, women were not as comfortable using computers as the men were initially.
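
A minimal sketch of how such a gender comparison might be computed, assuming ratings are coded on a numeric scale (the records below are invented; only the Python standard library is used):

    from statistics import mean

    # Hypothetical 5-point comfort-with-computers ratings (before, after).
    records = [("F", 2, 3), ("F", 3, 4), ("F", 2, 4),
               ("M", 3, 5), ("M", 2, 5), ("M", 4, 5)]

    for gender in ("F", "M"):
        gains = [post - pre for g, pre, post in records if g == gender]
        finals = [post for g, pre, post in records if g == gender]
        print(gender, "mean gain: %.2f" % mean(gains),
              "mean ending comfort: %.2f" % mean(finals))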

g. Gathering Follow-up Data. Information about student attitudes is also collected from students who took either the Workshop Calculus sequence or the regular Calculus I course, one or two years after they completed the course (irrespective of whether they have gone ahead with mathematics or not).4 Their responses are used to determine the impact of the course on their attitudes towards mathematics and their feelings about the usefulness and applicability of calculus in any follow-up courses. For instance, students are asked whether they strongly agree, somewhat agree, somewhat disagree, or strongly disagree with 46 statements, including:

• I am more confident with mathematics now, than I was before my calculus course.

• I feel that I can apply what I learned in calculus to real world problems.

• On the whole, I’d say that my calculus class was pretty interesting.

In addition, we gather data about Workshop Calculus students who continue their study of mathematics and/or take a course outside the Mathematics Department that has calculus as a pre-requisite, asking questions such as:

• How many Workshop students continued their study of mathematics by enrolling in Calculus II?

• How do the Workshop students perform in Calculus II in comparison with those who entered via the regular Calculus I route?

• How many Workshop students became mathematics majors?

• How do Workshop students do in courses outside the Department that have calculus as a pre-requisite?

Findings and Use of Findings

With the exception of the critical interviews, the assessment tools used for Workshop Calculus are easy to administer and the data can be easily analyzed. Moreover, these tools can be used in a variety of courses at a variety of institutions. In general, our assessment process requires developing a clear statement of intended outcomes, designing and utilizing supportive assessment tools, and analyzing the data and summarizing the results in ways that can be easily and readily understood (for instance, bar charts or graphs are helpful).

———————
3 Ed Dubinsky helped design the questionnaire and analyze the results.
4 Items on the follow-up questionnaire were developed by Jack Bookman, Project Calc, Duke University.



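
As an illustration of the kind of easily read summary mentioned above, here is a minimal Python sketch (not from the original study; the activity labels and percentages are invented) that draws a simple bar chart of questionnaire results:

    import matplotlib.pyplot as plt

    # Hypothetical percentages of students rating each activity as effective.
    activities = ["Activity guide", "Peer discussion", "Computer algebra"]
    percent = [72, 64, 58]

    plt.bar(activities, percent)                    # one bar per activity
    plt.ylabel("Percent rating activity effective")
    plt.title("End-of-semester attitudinal questionnaire")
    plt.savefig("questionnaire_summary.png")        # write the chart to a file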

Success Factors

Something should be said here for the advantages of using a variety of assessment measures. We feel that our program benefits from the breadth of our methods for collecting information about performance, teaching, learning, and the learning environment. In particular our focus is on understanding how students are thinking, learning, and processing the courses. Often assessment measures raise more questions than they answer, and when students are asked questions, they may be reading and responding on different wavelengths from those we are broadcasting on. We have found that an advantage to using a “prismatic” assessment lens is that we obtain a variety of ways of exploring issues with students, and therefore we come closer to understanding students and are more inspired to make the changes that will be beneficial to all of us. Good assessment offers students feedback on what they themselves report. We feel that our methods are appreciated by students, and that the department is perceived as caring as reforms are instituted. Good assessment also promotes ongoing discussion. And our measures have certainly helped to stimulate ongoing faculty dialogue, while unifying the department on the need for further assessment. It has been six years since the Workshop Calculus project began and we are still learning!

References

[1] Hastings, N.B. and Laws, P. Workshop Calculus: Guided Exploration with Review, Springer-Verlag, New York, vol. 1, 1996; vol. 2, 1998.

[2] Hastings, N.B. “The Workshop Mathematics Program: Abandoning Lectures,” in D’Avanzo, C. and McNeal, A., Student-Active Science: Models of Innovation in College Science Teaching, Saunders Publishing Co., Philadelphia, 1997.

[3] Penn, H., “Comparisons of Test Scores in Calculus I at the Naval Academy,” in Focus on Calculus, A Newsletter for the Calculus Consortium Based at Harvard University, 6, Spring 1994, John Wiley & Sons, Inc., p. 6.

Appendix

Q1 showed five graphs and asked which graph had f′(x) < 0 and f″(x) < 0.

Q2 showed the graph of a function h and asked at which of five labeled points on the graph of h is h′(x) the greatest.

Q3 showed the graph of a function which was essentially composed of two line segments — from P(–1,0) to Q(0,–1) and from Q(0,–1) to R(3,2) — and asked to approximate the integral of the function from –1 to 3.

Q4 showed the graph of the derivative of a function, where f′ was strictly positive, and asked at which of five labeled points on the graph of f′ is f(x) the greatest.



Does Calculus Reform Work?

Joel Silverberg
Roger Williams University

This large study examines data over a long period of time regarding a calculus “reform” course. Grades and attitude are studied, with advice for novices at assessment.

Background and Purpose

For many years now the mathematical community has struggled to reform and rejuvenate the teaching of calculus. My experience with one of these approaches is in the context of a small comprehensive college or university of 2400 students. Our university attracts students of modest academic preparation and achievement. Most freshmen come from between the 30th and 60th percentiles of their high school class. The central half of our students have mathematics SAT scores between 380 and 530 (prior to the re-norming of the exams). We offer support courses in elementary algebra, intermediate algebra, trigonometry, and precalculus for any student whose placement examinations show they lack the requisite skills for the first mathematics course required by their major.

In view of the characteristics of our student body, the nature of the improvements seen in calculus performance is extremely important. One or two sections of the six normally offered for first semester calculus were taught using a reform calculus approach and text developed by Dubinsky and Schwingendorf at Purdue University. The remaining sections were taught using a traditional approach and text supplemented by a weekly computer lab.

The goals we established for this course were to improve the level of understanding of fundamental concepts in calculus such as function, limit, continuity, and rate of change, and to improve student abilities in applying these concepts to the analysis of problems in the natural sciences and engineering. We hoped to reduce the number of students who withdrew from the course, who failed the course, or who completed the course without mastering the material at a level necessary to succeed in further mathematics courses or to successfully apply what they had learned in courses in science and engineering. Performance in these courses has frequently been sharply bimodal, and it was hoped that we could find an approach which would narrow the gap between our better students and our weaker students. Our past experience had been that between 20 and 45% of beginning calculus students failed to perform at a level of C– or better. Approximately half of that number withdrew from the course before completion, the remainder receiving grades of D or F. Furthermore, students passing with a C– or D in Calculus I seldom achieved a passing grade in Calculus II.

In this reform approach, the way in which material is presented is determined by extensive research into how mathematics is learned. The very order in which the topics appear is a reflection not of the finished mathematical structure, but of the steps students go through in developing their understanding from where they start to where we wish them to end. Mathematics is treated as a human and social activity, and mathematical notation is treated as a vehicle which people use for expressing and discussing mathematical ideas. A cooperative group learning environment is created and nurtured throughout the course. Students work in teams of 3–4 students in and out of class and stay with the same group for the entire semester. Group pride and peer pressure seem to play a significant motivational role in encouraging students to engage with the material in a nontrivial way.

The computer is used not for practice or for drill, but rather for creating mathematical objects and processes. The objects and processes which the students create are carefully chosen to guide the students through a series of experiences and investigations which will enable them to create their own mental models of the various phenomena under investigation. Basic concepts are developed in depth and at leisure, looking at them from many viewpoints and in many representations. ISETL and Maple are used to describe actions in mathematical language and to generate algorithmic processes which gradually come to be viewed as objects in their own right upon which actions can be performed. Objects under study are dissected to reveal underlying processes and combined with other processes to form new objects to study. The time devoted to what the student views as “real calculus” — the mechanics of finding limits, determining continuity, taking derivatives, solving applications problems, etc. — is perhaps only one-half that in a traditional course. We hoped that the learning of these skills would go far more easily, accurately, and rapidly, because the underlying ideas and concepts were better understood, and in fact this did seem to be the case.

Method

Two assessment models were used, each contributing a different viewpoint and each useful in a different way. One was a quantitative assessment of student performance and understanding based upon performance on the final examination for the course. A parallel qualitative assessment was obtained through careful observation and recording of student activities, discussions, insights, misconceptions, thought patterns, and problem solving strategies. Some data came from student interviews during and after the classes. Other data came from the frequent instructor/student dialogues generated during classroom time, laboratory and office hours.

The quantitative assessment provided a somewhat objective measure, unclouded by the instructor’s hopes and aspirations, of the general success or failure of the effort. Although rich in numerical and statistical detail, and valuable in outlining the effectiveness (or lack thereof) for various sub-populations of students, it offered few clues as to directions for change, modification, and improvement.

The goal of the qualitative assessment was to gain a more complete idea of what the student is actually thinking when he or she struggles with learning the material in the course. Without a clear understanding of the student’s mental processes, the instructor will instinctively fill in gaps, resolve ambiguities, and make inferences from context that may yield a completely inappropriate impression of what the student understands. If we assume that the student reads the same text, sees the same graph, ponders the same expression as we do, we will be unable to effectively communicate with that student, and are likely to fail to lead them to a higher level of understanding. The qualitative assessment, although somewhat subjective, proved a rich and fertile source for ideas for improvement, which were eventually reflected through both assessment vehicles.

The two methods, used in tandem, were quite effective, over time, in fine-tuning the course delivery to provide an enhanced learning environment for the vast majority of our students. By clarifying what it is that was essential for a student to succeed in the course, and by constantly assessing the degree to which the essential understandings were being developed (or not) and at what rate, and in response to which activities, the delivery of the course was gradually adjusted.

Findings

The instructors of the traditional sections prepared a final examination to be taken by all calculus sections. The exam was designed to cover the skills emphasized in the traditional sections rather than the types of problems emphasized in the reform sections. In an attempt to minimize any variance in the ways individual instructors graded their examinations, each faculty member involved in teaching calculus graded certain questions for all students in all sections.

At the start of this experiment, the performance of students in the reform calculus sections, as measured by this exam, lagged significantly behind that of students in traditionally taught sections. During the third semester of the experiment the students in the reform sections were scoring higher than their cohorts in the traditional sections. Furthermore, the segment of our population who most needed support, those in the lower half or lower two-thirds of their calculus section (as measured by performance on the final examination), proved to be those most helped by these changes in pedagogy. Although every class section by definition will have a lower half, it is quite a different story if the average grade of those in the lower half is a B than it is if it is a D. By the third semester of the experiment, the bottom quartile score of reform sections was higher than the median score of the traditional sections (see the summary tables at the end of this article).
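
The quartile comparisons reported here can be reproduced from raw scores with a few lines of Python; this sketch uses numpy and invented score lists in place of the actual exam data:

    import numpy as np

    # Hypothetical final-exam scores (out of 160 points) for one term.
    reform = np.array([55, 74, 98, 100.5, 105, 112, 123, 140, 154])
    trad = np.array([9, 40, 74, 80, 93, 101, 118.5, 130, 158])

    for name, scores in (("Reform", reform), ("Traditional", trad)):
        q1, med, q3 = np.percentile(scores, [25, 50, 75])
        print("%-12s min %5.1f  Q1 %5.1f  median %5.1f  Q3 %5.1f  max %5.1f"
              % (name, scores.min(), q1, med, q3, scores.max()))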

Student interviews, discussions, and dialog quickly revealed that what the student sees when looking at a graph is not what the teacher sees. What students hear is not what the instructor thinks they hear. Almost nothing can be taken for granted. Students must be taught to read and interpret the text, a graph, an expression, a function definition, a function application. They must be taught to be sensitive to context, to the order of operations, to implicit parentheses, to ambiguities in mathematical notation, and to differences between mathematical vocabulary and English vocabulary when the same words are used in both. Interviews revealed that the frequent use of pronouns often masks an ignorance of, or even an indifference to, the nouns to which they refer. The weaker student has learned from his past experience that an instructor will figure out what “it” refers to and assume he means the same thing.



Use of Findings

When first implemented, students resisted the changes, and their final exam grades lagged behind those of the other sections. Some of the students found the method confusing. On the other hand, others found this the first satisfying experience with mathematics since fourth grade, and reported great increases in confidence. Withdrawals in these courses dropped to zero. Analyzing the program, we found that among the students with a negative reaction, the responsibility and independence required of them was a factor. As a result, in future offerings I hired lab assistants; morale improved and student reactions became highly positive; scores meanwhile rose modestly. The challenge of the following semester was to convert student enthusiasm within the classroom setting to longer periods of concentration required outside the classroom. Here I myself became more active in guiding students. Student explorations were replaced by teacher-guided explorations, but simultaneously, I pushed students harder to develop a mastery of skills and techniques which the problems explored. Each group was required to demonstrate their problem solving skills in front of the other groups. This led to a drop in morale to more normal levels but also to sharp increases in student performance as demonstrated throughout the semester and also on the common final examination.

The dramatic changes in student performance were not observed the first or even the second time the course was offered. It required an ongoing process of evaluation, modification, and revision before achieving a form which appears to work for our students with their particular backgrounds. A break-in period of 1-1/2 to two years may be necessary for experimental approaches to be adapted to local conditions before the full effectiveness of the changes made can be determined and measured. The “curricular material” was essentially the same through all deliveries of the course. What was refined and addressed over time was the pedagogy and classroom approach, as well as student attitudes and morale issues.

Success Factors

The approach to teaching and assessment outlined in this paper requires that the professor take the time to figure out what is going on inside his or her students’ minds. The professor should make no assumptions and let the students explain in their own words what it is that they see, read, hear, and do. The professor must then take the time to design activities that will help students replace their naive models with more appropriate ones. The syllabus and use of class time must be adapted to allow this to happen. Some students may be resistant to thinking about the consequences of their mathematical actions and will be resistant to reasoning about what the mathematical symbols communicate. Some faculty may have difficulty understanding what you are doing, and prefer that you “train” or “drill” students to perform the required manipulations. But if approached with sensitivity, perseverance, and helpfulness, most students can be encouraged to expand their horizons and indulge in some critical thinking. Some advice:

• Be patient: changes take time; changes in long-established habits take a long time.

• Advise your students to be patient: learning is challenging; success in this course will require a lot of time on the part of the teacher and on the part of the student.

• Persevere: it may take you several tries to get it right.

• Maintain your sense of humor: some people (and they can be students, faculty, or administrators) don’t like change.

• You will not please everybody, but you may be a welcome breath of fresh air and a new chance for success to many.

• Make no assumptions: be open to many revelations about the student’s world view and his or her view of the material.

• Be creative in what you do to help develop the student’s view of the material.

• Enjoy what you are doing and share that pleasure with your students.

Summary of Final Examination Scores (out of 160 points)

                Term 1           Term 2           Term 3
                Reform   Trad    Reform   Trad    Reform   Trad
Maximum         146      159     103      114     154      158
3rd Quartile    84       115     83.5     91      123      118.5
Median          86       91.5    66.5     75      105      93
1st Quartile    43.5     71      44.5     54.5    100.5    74
Minimum         16       43      31       2       55       9

Final Exam Grades / Traditional Sections

Percentile Range:   0–24    25–49   50–74   75–100
semester 1          C       C       B       A
semester 2          F       C       C+      A–
semester 3          D       C       B       B

Final Exam Grades / Reform Sections

Percentile Range:   0–24    25–49   50–74   75–100
semester 1          F       D       C       B–
semester 2          F       C–      C       B
semester 3          C       B       B       B+






Assessing the Effectiveness of Innovative Educational Reform Efforts

Keith E. Schwingendorf
Purdue University North Central

This study explains the creation of a calculus reform program, its objectives, and philosophy and provides an in-depth comparison of reform-trained students with traditional students.

Background and Purpose

The “Calculus, Concepts, Computers and Cooperative Learning,” or C4L Calculus Reform Program is part of the National Calculus Reform Movement. The initial design of the C4L program began in 1987 under the leadership of Ed Dubinsky and Keith Schwingendorf on the West Lafayette campus of Purdue University, a large Midwestern Land Grant University. The C4L program has received NSF funding for its design and development during eight of the past nine years, most recently from Fall 1994 through Summer 1997 (grant #DUE-9450750) to continue and expand the program’s assessment and evaluation efforts on two fronts: (1) qualitative research into how students learn calculus (mathematics) concepts, and (2) quantitative research and the development of a (national) analytical model for assessment and evaluation of the effectiveness of innovative educational reform efforts as compared to other pedagogical treatments. The C4L program is co-directed by Dubinsky (Georgia State University), Schwingendorf (Purdue University North Central), and David Mathews (Southwestern Michigan College, Dowagiac, MI).

This paper will describe the twofold assessment and evaluation process* developed as part of the C4L Calculus Reform Program. The major focus of this paper will be the analytical model designed to address differences within the study population of students and deal effectively with the limitation of self-selection — a key problem encountered by virtually all control versus alternative treatment studies in education and similar studies related to (among others) health and industrial issues. The analytical method allows researchers to make more meaningful comparisons and conclusions regarding the effectiveness of innovative curriculum reform treatments and other pedagogical treatments. (A detailed description of the C4L Study together with references on qualitative research studies regarding the C4L Program can be found in [3].) The qualitative research phase and its impact on curriculum development will be briefly addressed. (A detailed description of the research framework used in the qualitative phase of assessment of the C4L Calculus Reform Program can be found in [1].)

The C4L Program is based on a constructivist theoretical perspective of how mathematics is learned (see [1]). According to the emerging learning theory on which the C4L Program courses are based, students need to construct their own understanding of each mathematical concept. The framework and step-by-step procedure of the qualitative research phase is described in detail in [1].

———————
* The C4L Study and subsequent research paper could not have been completed without the expert advice and generous time provided by our statistical consultants George and Linda McCabe, and Jonathan Kuhn (Purdue University North Central). The author wishes to thank Professor Kuhn for his insightful comments and suggestions for the completion of this paper.



C4L Calculus courses differ radically from traditionally taught courses and from courses in most other Calculus reform programs in many fundamental ways. Traditional courses, delivered primarily via the lecture/recitation system, in general attempt to “transfer” knowledge, emphasizing rote skill, drill, and memorization of paper-and-pencil skills. In contrast, the primary emphasis of the C4L program is to minimize lecturing, explaining, or otherwise attempting to “transfer” mathematical knowledge, and rather to create situations which foster students in making the necessary mental constructions to learn mathematics concepts. The emphasis of the C4L program is to help students gain a deeper understanding of concepts than can be obtained via traditional means, together with the acquisition of necessary basic skills through the Activity-Class-Exercise, or ACE, learning cycle developed for the C4L Program [1].

Each unit of the ACE learning cycle, which generally lasts about a week, begins with students performing computer investigations in a laboratory setting in an effort to help students construct their own meaning of mathematical concepts and reflect on their experiences with their peers in a cooperative learning environment. Lab periods are followed by class meetings in which a modified Socratic approach is used in conjunction with cooperative problem solving in small groups to help the students to build upon their mathematical experiences from the computer laboratory. Finally, relatively traditional exercises are assigned to reinforce the knowledge students are expected to have constructed during the first two phases of the learning cycle.

Method

(1) Qualitative Research Phase of the C4L Program. A critical aspect of the qualitative aspect of the C4L program is a “genetic decomposition” of each basic mathematical concept into developmental steps following a Piagetian theory of knowledge. Genetic decompositions, initially hypothesized by the researchers based on the underlying learning theory, are modified based on in-depth student interviews together with observations of students as they attempt to learn mathematical concepts. Qualitative interviews have been designed and constructed by the program’s research team, resulting to date in over 125 completed interviews on the concepts of limit, derivative, integral, and sequences and series. A complete description of the C4L Program’s qualitative research procedure is carefully and completely described in [1].

(2) Quantitative Research Phase of the C4L Program. A longitudinal study (which will be referred to as the “C4L Study”) of the C4L Calculus Reform Program was designed in an effort to make meaningful comparisons of C4L (reform) calculus students with traditionally taught students (TRAD) via lecture and recitation classes.

The students in the C4L Study were those enrolled on the West Lafayette campus of Purdue University from Fall 1988 to Spring 1991, in either the C4L (reformed) Program three semester calculus sequence, or in the traditionally taught lecture/recitation three semester calculus sequence.

The student population consisted primarily of engineering, mathematics and science students. The average Math SAT score for first semester calculus students in both three semester calculus sequences was about 600. Comparisons of the 205 students who completed the C4L calculus sequence were made with the control group of 4431 students from the traditionally taught courses. Only data on 4636 students enrolled for the first time in: (i) first semester calculus during the Fall semesters of 1988, 1989 and 1990; (ii) second semester calculus during the Spring semesters of 1989, 1990, and 1991; and (iii) third semester calculus during the Fall semesters of 1989, 1990, and 1991 were included in the longitudinal study. This was done in order to make meaningful comparisons between C4L and TRAD students having as similar as possible backgrounds and experiences. The caveat of self-selection is a limitation of the C4L Study design. However, there are no practical alternatives to the C4L Study design that would provide a sufficient amount of data on which to draw meaningful conclusions. In our situation, random assignment of students to the C4L and TRAD courses would have been preferable to self-selection, since by doing so any possible confounding factors which may have influenced outcomes of the C4L Study design would have been eliminated. For example, the possibility that academically better prepared students would enroll in the C4L courses would be offset by random assignment, thus preventing bias in favor of the C4L program. However, a study involving random assignment of students is not pragmatic, since students dropping out of either program would adversely affect the study results. Moreover, the C4L and TRAD programs are so different that students would quickly become aware of these differences, resulting in possible resentment, which would again detract from the C4L Study results.

The C4L Study, under the supervision of Professor George P. McCabe (Professor of Statistics and Head of Statistical Consulting at Purdue University), featured response and explanatory variables: the former measuring outcomes, and the latter attempting to explain these outcomes. For example, in a comparison of two Hospitals A and B, we might find a better success rate for all surgeries in Hospital A, whereas, on a closer analysis, Hospital B has a better rate for persons entering surgery in poor condition. Without explanatory variables, any statistical study becomes dubious. Our basic idea was to compare C4L and TRAD students using explanatory variables as controls, a method similar to that found in epidemiological studies. Confounding factors in the C4L Study, indeed in any observational study of this kind, are often accounted for by a “matching” procedure.



However, a “traditional matching procedure” would have been problematic for the C4L Study, since the number of students enrolled in the C4L program was so small as compared to the number of TRAD students. In other words, a traditional matching procedure would have used only a very small portion of the available study data. So, in order to use all the available data, the C4L Study used linear models which were designed to determine whether or not each of the explanatory variables had a statistically significant effect on each of the response variables, and the linear models themselves performed a matching of C4L and TRAD students with characteristics as “identical” as possible for comparison. The explanatory variables used in the linear models represented various confounding and interaction variables in addition to the one explanatory variable of interest, namely, whether the C4L or TRAD teaching method was used. A complete discussion of the analytical method can be found in [3].

The set of response variables used in the C4L Study linear models was as follows:

• The number of calculus courses taken from first, second and third semester calculus

• The number of mathematics courses taken beyond calculus

• The average grade in calculus courses

• The average grade in mathematics courses beyond calculus

• The last available (overall) grade point average

The set of explanatory variables used in the C4L Study linear models was as follows:

• Predicted grade point average (PGPA) — a statistic computed by the registrar’s office for entering freshmen at Purdue to predict student success (at the C grade level or higher). Multiple linear regression analysis was performed to determine which combination of available predictor variables was most strongly associated with a student’s first semester grade index. The best predictors were found to be SAT-Verbal, SAT-Math, average high school grades, and the high school percentile rank cubed (cubed rank was used since it spreads the distribution and reduces the skew). Prediction equations involving these four variables were developed for each of the various schools within Purdue, and in some cases, specific programs within schools. The equations provided the weights needed to combine a given student’s scores on the four variables to arrive at a student’s PGPA.

• An indicator variable for a missing PGPA was included as a categorical variable.

• The number of semesters since the first calculus course was included as a categorical variable.

• Major was included as a categorical variable with values corresponding to engineering, math, science and other.

• The interaction between semester and major was included as a quantitative variable to account for the fact that the effect of the number of semesters since the first calculus course was taken on the response variables can depend on major, and conversely.

• Gender was included as a categorical variable.

• An indicator variable to distinguish C4L students from TRAD students was used as a categorical variable.

Each response variable was modeled as a linear function of the explanatory variables. Statistical tests were performed for the general linear model to determine whether or not the explanatory variables had statistically significant effects on each response variable. The focus of the C4L Study was to draw meaningful conclusions and comparisons by answering questions like the following:

• Are C4L students equally likely to take one or more courses beyond calculus?

• Do C4L students take more calculus courses than TRAD students?

• Do C4L students take more math courses after calculus than do TRAD students?

• How do Calculus course grades of C4L students compare to those of TRAD students?

• How do grades in courses beyond Calculus compare for C4L and TRAD students?

• How does the last grade point average compare for C4L and TRAD students?
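
To illustrate the kind of general linear model described above, here is a minimal sketch using pandas and statsmodels (tools that postdate the original analysis); the column names and data are invented, and the model shown regresses one response variable, average calculus grade, on the treatment indicator and several of the explanatory variables:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical student records: C4L indicator, PGPA, major, gender,
    # semesters since first calculus course, and average calculus grade.
    df = pd.DataFrame({
        "c4l": [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
        "pgpa": [2.8, 3.1, 2.5, 3.2, 2.9, 3.0, 2.9, 3.3, 2.7, 3.0, 2.6, 3.1],
        "major": ["engr", "math", "sci", "other", "engr", "sci",
                  "engr", "math", "sci", "other", "engr", "math"],
        "gender": ["F", "M", "M", "F", "M", "F", "M", "F", "M", "F", "F", "M"],
        "semesters": [2, 3, 2, 4, 3, 2, 4, 3, 2, 3, 4, 2],
        "calc_gpa": [3.0, 3.4, 2.6, 3.1, 2.9, 2.8, 2.5, 3.6, 2.4, 2.9, 2.2, 3.3],
    })

    # One response variable as a linear function of explanatory variables;
    # C(...) marks categorical terms, and the c4l coefficient estimates the
    # treatment effect after adjusting for the other variables.
    model = smf.ols("calc_gpa ~ c4l + pgpa + C(major) + C(gender) + semesters",
                    data=df).fit()
    print(model.summary())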

Findings

The following summarizes the comparisons and conclusions made in the C4L Study:

• C4L students are as likely as TRAD students to take at least one more math course beyond calculus.

• C4L students take more calculus courses than TRAD students.

• C4L students take more math courses beyond calculus than do TRAD students.

• C4L students’ average grades in courses beyond calculus show no statistically significant differences from those of TRAD students.

• C4L students’ grades in calculus are better than those of TRAD students.

• C4L students’ last (overall) grade point averages show no statistically significant differences from those of TRAD students.

These results indicate that not only did C4L students do just as well as their TRAD counterparts in mathematics courses beyond calculus, but a larger number of C4L students went on to do just as well as traditional students in higher mathematics courses.

A recently published study [2] suggests that C4L calculus students appear to be spending more time studying calculus than do their traditionally taught counterparts. But, as the results of the C4L longitudinal study suggest, C4L students may indeed reap the rewards for studying more than traditional students in that they often receive higher grades in calculus. Moreover, C4L students appear not to be adversely affected in their other courses by the increased study time spent on calculus, since no statistically significant differences in C4L students’ grade point averages as compared to traditionally taught students were found in the C4L Study.

Use of Findings

No significant changes in the C4L Program three semester calculus sequence were made based on the findings of the C4L Study. However, the findings of the study do provide potential implementers of the C4L program and the program directors with useful information regarding the effects of the use of such a radical approach to calculus reform. Concerns regarding the increased study time required of students and the possible detrimental effects on their overall performance in other classes seem to have been effectively addressed. Most faculty might agree that students across the nation do not spend enough time studying calculus, which may be one of the critical factors contributing to attrition and poor performance in calculus.

The results of the qualitative research phase of the C4L assessment/evaluation process have been used to make revisions in curriculum design, the text, and other C4L Program course materials. A critical aspect of the C4L program is a decomposition of each mathematical concept into developmental steps (which is often referred to as a “genetic decomposition” of a concept) following a Piagetian theory of knowledge, based on observations of students and in-depth interviews with them as they attempt to learn a concept [1]. The results of the qualitative research phase of the C4L assessment program are used to modify and adjust the C4L pedagogical treatment. In particular, the developmental steps proposed by researchers may, or may not, be modified. For example, the C4L text treatment and genetic decomposition of the concept of the limit of a function at a point was modified and an alternative treatment was proposed based on analyses of 25 qualitative student interviews on the concept of limit. When possible an attempt is made to make meaningful comparisons of the analyses of interviews of both C4L and TRAD students. This was possible with a set of 40 interviews on derivatives and an in-depth analysis of two interview questions on the students’ understanding of the derivative as slope. However, whether such comparisons can be made or not, sometimes the research results confirm that the C4L treatment of a particular concept appears to be doing what is expected regarding the outcomes of student understandings. In this case no change in the C4L treatment is made, as was the case with the results of analyses of the interview questions on derivative as slope.

Success Factors

To plan and design a longitudinal study like the C4L Study and then carry out the analysis of the results requires numerous hours of planning, brainstorming and consultation with statistical consultants. Such a major study cannot be taken lightly and does require expert advice and counsel.

Regarding qualitative interviews, we will only say here that designing and constructing each interview, the questions, and the guidelines for what interviewers should probe for during an interview requires detailed and careful planning. Pilot interviews must be carried out and analyzed prior to the completion of the entire set of interviews. A particular theoretical perspective on how students learn mathematical concepts, such as that used in the C4L program, provides a solid foundation on which the interview process can be based. Students are paid $10–$15 for each qualitative interview, each of which lasts between one and two hours. We note that this need not always be done, as we have conducted interviews on courses other than calculus where student volunteers for interviews were obtained. Transcribing each interview usually requires at least six hours. In the C4L Program, in addition to the researchers doing transcriptions, undergraduate and graduate students are often paid to do transcriptions. The analysis phase of a set of interviews requires many hours from dedicated researchers, not to mention the writing of research papers. The process becomes more time consuming if video taping is involved. However, the whole interview process from design to the writing of a research paper can be a very rewarding experience which contributes to pedagogical design (not to mention the contribution to professional development, and possible tenure and promotion). Once again, we caution that a qualitative research program should not be taken lightly. Such a program requires that the researchers become knowledgeable about qualitative research procedures through the necessary training in order to do a competent job.

References

[1] Asiala, M., Brown, N., DeVries, D., Dubinsky, E., Mathews, D. and Thomas, K. “A Framework for Research and Development in Undergraduate Mathematics Education,” Research in Collegiate Mathematics Education II, CBMS Issues in Mathematics Education, 6, 1996, pp. 1–32.

[2] Mathews, D.M. “Time to Study: The C4L Experience,” UME Trends, 7 (4), 1995.

[3] Schwingendorf, K.E., McCabe, G.P., and Kuhn, J. “A Longitudinal Study of the Purdue C4L Calculus Reform Program: Comparisons of C4L and Traditional Students,” Research in Collegiate Mathematics Education, CBMS Issues in Mathematics Education, to appear.



The Evaluation of Project CALC at Duke University, 1989–1994

Jack Bookman, Duke University
Charles P. Friedman, University of Pittsburgh

This in-depth study analyzes a Calculus reform program. It looks not only at analytical gains in student understanding but affective gains as well.

Background and Purpose

As part of the National Science Foundation’s Calculus Initiative, Lawrence Moore and David Smith developed a new calculus course at Duke University. The course, called Project CALC, differs from the traditional Calculus course in several fundamental ways. The traditional course emphasizes acquisition of computational skills whereas the key features of Project CALC are real-world problems, activities and explorations, writing, teamwork, and use of technology. The explicitly stated goals for Project CALC are that students should

• be able to use mathematics to structure their understanding of and to investigate questions in the world around them;

• be able to use calculus to formulate problems, to solve problems, and to communicate the solution of problems to others;

• be able to use technology as an integral part of this process of formulation, solution, and communication;

• learn to work cooperatively [11, 12].

Project CALC classes meet for three 50-minute periods with an additional two-hour lab each week. Classroom projects are worked in teams. For approximately five of the labs or classroom projects each semester, teams of two students submit a written report. In the traditional Calculus class taught at Duke, classes (maximum size of 35) meet three times per week for 50 minutes. The lectures usually closely follow a textbook like Thomas and Finney’s Calculus and Analytic Geometry. Though there is, naturally, some overlap, the two courses do not have the same mathematical content.

Method

Paralleling Moore and Smith’s development of the course, the authors of this paper designed and implemented an evaluation of Project CALC. The evaluation had two phases. During the first years of the project, the emphasis was on formative evaluation, during which the evaluator

• taught calculus to both Project CALC (PC) and traditional (TR) students;

• observed, once a week, a TR class and a PC class;

• conducted a dinner discussion, on alternate weeks, with a regular group of students from the TR class that was being observed and with a regular group from the PC class being observed;

• read students’ comments about both the TR and PC calculus courses;

• held extensive conversations with the faculty who taught PC.

During the later years, the emphasis changed to comparing traditionally taught students with experimentally taught students on a set of outcomes. The outcome-based phase had three main components:

• a problem solving test given to both PC and TR students while they were enrolled in Calculus II (see [4]);

• a “retention” study of sophomores and juniors, both PC and TR;

• a follow-up study, conducted during the fourth and fifth years of the project, which focused on the question, “Do PC students do better in and/or take more courses that require calculus?”




Findings

Formative Evaluation. During the 90–91 and 91–92 academic years, PC was taught to about one-third of the students enrolled in beginning calculus. The primary focus of the evaluation during the second year of the project’s development was on formative evaluation and on observing and describing qualitatively the differences between PC and TR. By the end of the first semester certain strengths and problems became apparent [1, 5, 6]. Students were upset and unhappy about Lab Calculus I. They complained about the unfamiliar course content, pedagogy and style. They felt that the course required an unreasonable amount of time outside of class and they had difficulty learning how to read mathematics. The student response to traditional Calculus I was better but not glowing. The TR students felt that the course was too hard and went too fast. They also felt that the course was not useful and presented “too much theory.”

The attitudes of the students in Project CALC improved remarkably from the fall to the spring. In the spring semester, though they still complained that the material was too vague and complicated, they also stated that they understood the material rather than just having memorized it and that it was interesting to see the connections to the real world. In the second semester, the responses of the TR students were much harsher. Their most positive comments were that the course “forces students to learn lots of material” and the “basics of calculus were taught.” One difference was remarkably clear from the classroom observations: The level of attention and concentration of the PC students was much higher. In every TR class observed, at least some of the students fell asleep and most of them started packing their books before the lecture was finished. These behaviors were not observed in PC classes. In fact, often the class ran five minutes overtime without the students even noticing. In summary, PC students were engaged and relatively interested in their work. The TR students were largely passive in class and alienated. On the other hand, TR students knew what to expect and what was expected of them, whereas the PC students probably expended a lot of energy figuring out what the course was about.

Problem Solving Test. In year two of the study, a five-question test of problem solving was administered to Calculus II students (both PC and TR). An effort was made to make the testing and grading conditions for the students as uniform as possible. The test consisted of five items novel to both groups of students that reflected topics taught in both courses. The items were selected with contexts from various fields: biology, chemistry, economics as well as mathematics. On both versions of the test, PC students outperformed the TR students, with highly statistically significant differences, indicating that Project CALC made some progress towards meeting its goals. A more detailed discussion of both the method and the findings can be found in [4].
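
As a sketch of the kind of significance test such a comparison might use (the scores below are invented, and scipy is assumed to be available):

    from scipy import stats

    # Hypothetical problem-solving test scores for the two groups.
    pc_scores = [18, 22, 25, 19, 24, 27, 21, 23]
    tr_scores = [14, 17, 20, 15, 18, 16, 19, 13]

    # Welch's two-sample t-test (does not assume equal variances).
    t, p = stats.ttest_ind(pc_scores, tr_scores, equal_var=False)
    print("t = %.2f, p = %.4f" % (t, p))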

Retention Study. Also during the third year of the project, in the spring of 1992, the five-part test developed during the first year was administered to a group of sophomores and juniors, half of whom had a year of PC in their freshman year and half of whom had a year of TR in their freshman year. Approximately one third of randomly selected students agreed to participate in the study. The five sections of the test addressed writing, basic skills, problem-solving, concepts, and attitudes. The test items are briefly described below.

Attitudes. Each student was asked to indicate the extent to which he or she agreed or disagreed with 40 items such as “I’ve applied what I’ve learned in calculus to my work in non-mathematics courses” and “When I was in calculus, I could do some of the problems, but I had no clue as to why I was doing them.”

Skills. Ten items such as: (1) Compute ∫ x²√(x³ + 1) dx and (2) Find the maximum and minimum values of y = sin x cos x on [0, π].

Writing. A short essay addressing the question, “Illustrate the concept of derivative by describing the difference between the average velocity and instantaneous velocity.”

Problem solving. Two non-routine word problems.

Conceptual understanding. Ten items such as, “The graphs of three functions appear in the figure below [omitted here]. Identify which is f, which is f′, and which is f″.”

The results showed a highly statistically significant difference in attitude favoring PC students. This was somewhat surprising in light of the fact that the same students who had often complained, sometimes bitterly, about PC exhibited, a year and two years later, much more positive attitudes about calculus and its usefulness than their traditionally taught peers. A more complete discussion of attitudes is given in [5]. Not surprisingly, the regular students outperformed PC students on the skills calculations, though not statistically significantly. This finding that TR students were better at these kinds of skills was verified in much of the interview data collected during the five year study. PC students outperformed TR students on the writing test, but not statistically significantly. The results of the problem solving were somewhat consistent with the problem solving test described above (see [3]). The scores on the conceptual understanding test were poor for both groups. It is highly possible that fatigue could have been a factor in their poor performance, since, during the previous two hours, the subjects had taken four other subtests. In retrospect, the subjects were probably asked to do too much in one sitting. The most useful information was obtained from the attitude and skills subtests. The results of the attitude test were powerful and dramatic. The results of the skills test, though not quite statistically significant, were indicative of a weakness in the experimental course already perceived by students and faculty.

Follow-up Study. The evaluation in the fourth and fifth years of the study focused on the question, do experimental students do better in and take more subsequent courses that require calculus? The grades of two groups of students were examined in a set of math-related courses. The detailed results of this study are available elsewhere ([3, 6]). To better understand these results, seven pairs of these students — matched by major and SAT scores, with one subject from the experimental course and one from the traditional course — were interviewed.

It was found that on average, the TR students did better by 0.2 out of a 4-point GPA. This is a small but statistically significant difference. It seems that though there were few differences between the grades of the two groups in the most mathematical courses, the significant differences overall may be explained partially by differences in the performance in other classes. In particular, there were significant differences in the grades of TR and PC students in Introductory Biology, with TR students outperforming the others. On average, PC students took about one more math, physics, engineering, and computer science course than the TR students, and PC students took significantly more math courses. It is not clear why we got these results. One possible explanation is that during the first two weeks of classes, when students were allowed to add and drop courses, the most grade-oriented and risk-averse students, who are often pre-medical students, may have switched from the PC to the TR course. Though the students who were the subjects of this study were randomly assigned to PC and TR classes, it was not possible to force them to remain in their assigned sections.

In the interview data, most of the students felt that they were adequately prepared for future courses. In general, the PC students were less confident in their pencil and paper computational skills, but more confident that they understood how calculus and mathematics were used to solve real world problems [5].

Project CALC has made some modest headway in keeping more students in the mathematical “pipeline,” that is, increasing continuation rates in mathematics and science courses. As mentioned above, PC students took more of the most mathematically related courses. In addition, retention rates from Calculus I to Calculus II have improved slightly.

Use of Findings

The evidence, from both the qualitative and quantitative studies, indicates several strengths and weaknesses of the project. The observations and interviews, as well as the problem solving tests and attitude instrument, indicate that the students in PC are learning more about how math is used than their counterparts in the TR course. On the other hand, as reported in interviews, students and faculty in the early version of the PC course did not feel that enough emphasis was placed on computational skill, whereas this has been the main emphasis of the TR course. This view is supported by the results of the skills test. Although there is a danger in overemphasizing computational skills at the expense of other things, PC students should leave the course with better abilities in this area. The course has been revised to address these problems, and in more recent versions of the course, there is considerably more practice with basic skills such as differentiation, integration and solving differential equations. (See [2].)

Students in PC became better problem solvers in the sense that they were better able to formulate mathematical interpretations of verbal problems and solve and interpret the results of some verbal problems that required calculus in their solution. Although PC appears to violate students’ deeply held beliefs about what mathematics is and asks them to give up or adapt their coping strategies for dealing with mathematics courses, their attitudes gradually change. When surveyed one and two years after the course, PC students felt, significantly more so than the TR students, that they better understood how math was used and that they had been required to understand math rather than memorize formulas. Observations of students in class showed that PC students are much more actively engaged than were TR students. The evidence gathered indicates some improvements in continuation rates from Calculus I to Calculus II and from there into more advanced mathematics classes.

Success Factors

The heart of program evaluation is assessment of student learning, but they are not the same thing. The purpose of program evaluation is to judge the worth of a program and, in education, that requires that assessment of student learning take place. But program evaluation may include other things as well, for example, problems associated with implementation, costs of the program, and communication with interested parties. Assessment of student learning is a means to achieve program evaluation, but assessment of student learning is also a means of understanding how students learn and a means of evaluating the performance of individual students [8].

Educational research in a controlled laboratory setting is critical for the development of our understanding of how students learn, which in turn is critical for learning how to assess student learning. Program evaluation must rely on this research base and may use some of the same methods, but it must take place in a messier environment. In this evaluation, for example, while it was possible to randomly assign students to classes, it was not possible to keep them all there. The calculus program at a university is a much more complex setting than would be ideal for conducting basic educational research. On the other hand, program evaluation can contribute to an understanding of what actually happens in real, complex learning situations.

This study used both qualitative and quantitative methods. For example, the interview data collected during the formative evaluation pointed out concerns about students’ level of computational skills. The quantitative data collected the next year corroborated this. In addition, the interviews conducted with students two years after they took the course helped provide insight into the quantitative attitude instrument given in the previous year. Traditionally, these two methodologies were seen as diametrically opposed, but this attitude has changed as more evaluators are combining these approaches [7].

There is not always a clear distinction between formative and summative evaluation, though Robert Stake summed up the difference nicely when he stated: “When the cook tastes the soup, that’s formative; when the guests taste the soup, that’s summative.” [9] The summative aspects of this evaluation were used to inform changes in the course and became, therefore, a formative evaluation as well [2].

Inside evaluators (evaluators who work in the program being evaluated) are subject to bias and influence. On the other hand, outside evaluators are less familiar with the complexities of the situation and can rarely spend as much time on an evaluation as an insider can. To overcome these problems, this evaluation used both an inside and an outside evaluator. The insider conducted most of the evaluation; the outsider served as both a consultant and auditor, whose job, like that of certified public accountants, was to periodically inspect the “books” to certify that the evaluation was being carefully, fairly and accurately executed.

Finally, we offer some advice for those readers who are planning on conducting program evaluations at their institutions. It is both necessary and difficult to communicate the value and difficulties of program evaluation. Beware of administrators who say: “We’re going to apply for a continuation of the grant, so we need to do an evaluation.” Be prepared for results that are unexpected or that you don’t like. Always look for sources of bias and alternative explanations. Develop uniform, replicable grading schemes and where possible use multiple graders. Be clear and honest about limitations of the study, without being paralyzed by the limitations. Get an outsider to review your plans and results.

References

[1] Bookman, J. Report for the Math Department Project CALC Advisory Committee — A Description of Project CALC 1990–91, unpublished manuscript, 1991.

[2] Bookman, J. and Blake, L.D. “Seven Years of Project CALC at Duke University — Approaching a Steady State?” PRIMUS, September 1996, pp. 221–234.

[3] Bookman, J. and Friedman, C.P. Final Report: Evaluation of Project CALC 1989–1993, unpublished manuscript, 1994.

[4] Bookman, J. and Friedman, C.P. “A comparison of the problem solving performance of students in lab based and traditional calculus,” in Dubinsky, E., Schoenfeld, A.H., and Kaput, J., eds., Research in Collegiate Mathematics Education I, American Mathematical Society, Providence, RI, 1994, pp. 101–116.

[5] Bookman, J. and Friedman, C.P. Student Attitudes and Calculus Reform, submitted for publication to School Science and Mathematics.

[6] Bookman, J. and Friedman, C.P. The Evaluation of Project CALC at Duke University — Summary of Results, unpublished manuscript, 1997.

[7] Herman, J.L., Morris, L.L., and Fitz-Gibbon, C.T. Evaluator’s Handbook, Sage Publications, Newbury Park, CA, 1987.

[8] Schoenfeld, A.H., et al. Student Assessment in Calculus, MAA Notes Number 43, Mathematical Association of America, Washington, DC, 1997.

[9] Stevens, F., et al. User-Friendly Handbook for Project Evaluation, National Science Foundation, Washington, DC, 1996.

[10] Smith, D.A. and Moore, L.C. “Project CALC,” in Tucker, T.W., ed., Priming the Calculus Pump: Innovations and Resources, MAA Notes Number 17, Mathematical Association of America, Washington, DC, 1990, pp. 51–74.

[11] Smith, D.A. and Moore, L.C. “Project CALC: An integrated laboratory course,” in Leinbach, L.C., et al., eds., The Laboratory Approach to Teaching Calculus, MAA Notes Number 20, Mathematical Association of America, Washington, DC, 1991, pp. 81–92.

[12] Smith, D.A. and Moore, L.C. “Project CALC: Calculus as a laboratory course,” in Computer Assisted Learning, Lecture Notes in Computer Science 602, 1992, pp. 16–20.


Part IV: Assessing Teaching


Introduction

Bill Marion
Valparaiso University

For many of us the primary means for assessing how well we teach have been student evaluation instruments and, on occasion, peer visits to our classes. Yes, we faculty still make the final judgments about teaching effectiveness, since in most cases the student evaluations and peer visit observations are summarized and the results are put into some kind of context. But in the end, all too often judgments about the quality of our teaching come down to the observations students make about what goes on in our classrooms. We feel uneasy about the over-reliance on student evaluations in measuring teaching effectiveness. We even wonder whether something as qualitative as teaching can be measured, yet we know that somehow our teaching must be assessed.

But what is it that we really want to assess? Reform of the mathematics curriculum, changes in our style of teaching — no longer does the lecture method reign supreme — and variations in our own methods for assessing student learning have led us to address this question more urgently than we have done in the past. Some tentative answers are now beginning to emerge, and along with them a variety of methods with which to assess teaching effectiveness is being explored. What follows in this section are descriptions of some of those experiments.

In the first article, Alan Knoerr, Michael McDonald and Rae McCormick describe a department-wide effort to assess teaching, both while a course is in progress and as an ongoing practice. This effort grew out of curriculum reform, specifically the reform of the calculus sequence at the college.

Next, Pat Collier tells us of an approach his department is taking to broaden the definition of what it means to be an effective teacher and then, how to assess such effectiveness. This effort grew out of dissatisfaction with the use of student evaluations as the primary measure of the quality of one’s teaching.

Then Joel David Hamkins discusses using video and peer feedback to improve teaching. While his experience with these techniques involved graduate students, the methods can be modified for use with faculty members, both junior and senior.

Lastly, we have two articles which give us some guidance on how we might use peer visitation more effectively in the assessment of teaching. Deborah Bergstrand describes her department’s class visitation program, which is designed to benefit both junior and senior faculty. Pao-sheng Hsu talks about her experience with a peer review team consisting of faculty from within and outside the mathematics department.

Departmental Assistance in Formative Assessment of Teaching

Alan P. Knoerr, Michael A. McDonald, and Rae McCormick
Occidental College

At a small liberal arts college in the West, a department-wide program has been developed to help faculty assess and improve their teaching while courses are in progress. Descriptions of what led to this effort, the steps already taken, the resources involved, and plans for the future are presented.

College teaching is usually formally assessed at the end of a course by students and occasionally through observation of classes by another (more senior) faculty member. These assessments, along with other documentation of teaching, are evaluated as part of annual or multiyear reviews. This approach to assessment and evaluation does not, however, provide a teacher with timely feedback on his or her performance and effectiveness.

This article discusses what a department can do to help its members improve their teaching while courses are in progress. We draw on over six years of experience with extensive curricular and pedagogical reform in mathematics at Occidental College, a small, residential liberal arts college with strong sciences and a diverse student body. Our specific concern is assessment — ways in which teachers can get information about their teaching and their students’ learning.

From Curriculum Reform to Formative Assessment

Our department’s interest in formative assessment is a natural outgrowth of our work in curriculum reform. As the calculus reform movement of the past decade began to develop, we examined our own program and decided to make some changes. We wanted to accomplish the following:

• improve persistence rates through the first- and second-year courses;
• integrate technology into the teaching and learning of mathematics;
• strengthen ties to other fields which use mathematics;
• engage all students in mathematics as a living science rather than as an archive of procedures.

Within two years, all our calculus courses were using reform materials and had weekly computer lab sections. Dissatisfaction with published reform texts led us to create a great deal of our own supplementary material. New courses were introduced at the second-year level and others were modified. Subsequent conversion from a term to a semester academic calendar gave us an opportunity to reformulate the major and to make improvements in our upper-division courses as well.

Although we began with a focus on curricular reform, we soon also began to change how we taught mathematics. From the outset we formed teaching teams for our calculus courses. This was initially done to facilitate development of our own materials and to enable us to staff computer lab sections. Extensive conversations about pedagogy as well as content were a happy, if unexpected, result.

Modifying what we were teaching was not enough to meet the goals of our reform. By also changing how we teach, we have made substantial and encouraging progress towards these goals. Many of us have made some effort to become acquainted with the literature on pedagogy and educational psychology, both in general and as it applies to mathematics. Our growing interest in assessment parallels increased attention to student outcomes in educational theory and administrative policy.

The Role of the Department

It is certainly possible for individual teachers to learn and adopt new methods of teaching on their own. However, it is unusual for many to make significant changes in their professional practice without some institutional support. The department can provide leadership and support for improving teaching. The expectations of one’s colleagues and the resources the department can offer are especially important.

Expectations

Our department believes it is possible and important to improve teaching, and shares a strong expectation that its members will do so. This expectation is communicated in daily conversations, through policy decisions such as adopting calculus reform and team teaching, by directing resources to improving teaching, and through the exemplary practice of senior members of the department. It has also been an important criterion in hiring new members. The climate created by these expectations is especially welcomed by junior faculty who care deeply about teaching and still have much to learn. This climate enables them to improve their teaching while also advancing in other areas of professional development.

Resources

There are many ways a department can help its members obtain the resources needed to improve their teaching. Many of these involve investments of time rather than money. Here are some of the things our department does internally:

• senior faculty mentor junior faculty, particularly in team-teaching reformed calculus;
• the department maintains a common computer network drive for sharing teaching materials we create;
• we keep faculty informed of conferences, workshops, and funding opportunities related to improving teaching;
• we encourage publishing scholarship related to teaching mathematics;
• senior and junior faculty collaborate to secure course development grants;
• and we hold departmental retreats for sharing teaching innovations, planning curricula, and discussing policy related to teaching. These retreats have been inexpensive, one-day affairs held either on campus or at the home of a faculty member.

Interaction between the department and the rest of the college has been important in developing and maintaining college-wide resources for improving pedagogy. The Dean of Faculty provides support for faculty attending conferences and workshops related to teaching and provides matching funds for grants related to teaching. We also benefit from the resources of Occidental’s Center for Teaching and Learning. This center offers faculty opportunities to learn more about college teaching, including:

• monthly interdisciplinary lunch discussions;
• individual and small group diagnosis and consultation;
• videotaping and debriefing services;
• a library of resources on teaching and learning;
• publicity about conferences and workshops on college teaching;
• and Teaching Institutes for faculty at Occidental and other area colleges. Themes are Cooperative Learning, Women’s Ways of Knowing, and Teaching With Technology.

Several Mathematics faculty were also instrumental in developing LACTE, the Los Angeles Collaborative for Teacher Excellence. This National Science Foundation supported collaborative of ten area colleges and community colleges sponsors many activities aimed at improving the training and continuing education of elementary and secondary teachers. These include faculty development workshops and course development release time for college professors who are educating future teachers.

A Departmental Retreat and Assessment Workshop

Several of us in the Mathematics Department began learning more about formative assessment through meetings, workshops, and reading. Our program of curricular and pedagogical reform had reached a point where we felt assessment could be very helpful. Together with the Center for Teaching and Learning, we planned a one-day departmental retreat and workshop at the start of this academic year, focussed on assessment.

Prior to the workshop, the Center for Teaching and Learning provided each member of the department with a personal Teaching Goals Inventory [3] and readings on classroom assessment. In the morning, the department discussed a number of issues related to teaching and curriculum development. Having devoted a lot of energy in previous years to lower-division courses, we focussed this time on students making the transition to upper-division courses. A shared concern was students’ abilities to engage more abstract mathematics.

In the afternoon, the director of the Center for Teaching and Learning led a workshop on formative assessment. We began by discussing individual and departmental goals, as revealed by the Teaching Goals Inventory. This was an interesting follow-up to the morning’s discussion and modeled the value of assessment. A number of points were raised by the Teaching Goals Inventory which had not emerged in the earlier conversation. In the second part of the workshop, we learned about specific assessment techniques and set specific individual goals for ourselves on using assessment.


Some techniques concerned assessing teaching per se. The course portfolio was described as a way to assess and document one’s work for subsequent evaluation and to support scholarship on teaching. Several of us decided to develop a portfolio for one or more courses we were teaching this year. (See [4] in this volume for further information on course portfolios.)

We also learned about several techniques for classroom assessment of student learning, drawn from the excellent collection by Angelo and Cross [1]. An “expert-groups strategy” was used during the workshop to teach these techniques. Individuals set goals to use some of these techniques, to learn more about other classroom assessment techniques, and to improve assessment in cooperative learning. Other goals included using assessment to help answer certain questions: What difficulties do students have in writing about mathematics? What difficulties do first-year students have adjusting to academic demands in college? What factors affect student confidence? How can we help students develop a more mature approach to mathematics?

Findings and Use of Findings

We frequently discuss our teaching with each other. A midsemester follow-up meeting with the director of the Center for Teaching and Learning gave the department a specific opportunity to discuss its experience with assessment. A number of us have gone beyond the goals we set for ourselves at the start of the year.

Some of us did develop course portfolios. These portfolios document materials prepared for the course, such as class notes and exercises, projects, and computer labs. They may also include copies of representative student work. A journal, reflecting on teaching and student learning as the course progresses, is another important component. Other efforts in assessing teaching include the following:

• Several of us are asking students to complete midcourse evaluations, and many of us use supplementary evaluations at the end of a course.

• Members of some teaching teams are helping each other make specific improvements in their teaching by observing and providing specific feedback.

Student learning is the most important outcome of teaching, and more of us have focussed on this. Reflection on the results of this sort of assessment helps us improve teaching. Here are some of the things we are doing.

• The “muddiest point” and variations on this technique are widely used.

• Weekly self-evaluated take-home quizzes are given with solutions on the back. Students are required to report how they did by e-mail, and to ask questions about difficulties. Participation is recorded but grades are not.

• Concept maps are being used both for pre-unit assessment and for review.

• Several of us have students keep journals about their experiences in a course.

• A number of us have created e-mail servers for particular courses. This provides a forum outside of class hours for discussing course topics. It can also be used to create a class journal in which the instructor requests responses to particular prompts.

• A “professional evaluation” model of assessment is being used in a second-year multivariable calculus course. Its multiple components are designed to foster mathematical maturity while ensuring that fundamental skills are mastered [4].

• Students in some courses are being asked to write on the question, “What is abstraction and what does it have to do with mathematics?” We hope to gain some insight into how student perspectives on abstraction change as they gain experience with mathematics.

While departmental activities have helped us each learn more about assessment, the process of evaluating the information we acquire through assessment has been largely informal and private. With the assistance of the Center for Teaching and Learning, we are hoping to learn how to make better use of this information. Both teaching portfolios and colleague partnerships seem especially promising and well suited to the culture of our department.

Volunteer faculty pairs would agree to help each other work towards specific goals concerning teaching and student learning. Among other ways, this assistance could be provided by observing teaching, reviewing course materials and student work, and meeting regularly to discuss teaching and learning. These partnerships would be an extension of the work of our teaching teams.

Success Factors

Our experience with formative assessment is one facet of a serious and long-term departmental commitment to improving the teaching and learning of mathematics. We believe that the depth of this commitment has been essential to the progress we have made. Critically reviewing the undergraduate mathematics curriculum to decide what to teach, and training ourselves to teach mathematics in ways which best promote student learning, are challenging and ongoing processes. The cooperation of department members helps us take risks and learn from each other’s experience and strengths.

In reflecting on how this departmental commitment has been achieved and maintained, we find the elements present which have been noted in studies of successful institutional change in secondary education [2]. We began with a history of commitment to high-quality traditional instruction at an institution which allowed individuals and departments to take some risks. The decision to become involved in calculus reform was supported by a majority of senior faculty in the department and enthusiastically adopted by the junior faculty and new hires. Teams for teaching and curriculum development soon led to a culture in which mathematics, pedagogy, and the interaction of these two are our constant focus. The college and the National Science Foundation provided adequate financial support for equipment, release time, conferences, and workshops. Through the Center for Teaching and Learning, the college also provided a critical source of expertise. Finally, we have been given time to learn and create new ways to work as teachers. For us, a cooperative departmental approach to learning about assessment has been natural and efficient, and has further strengthened our program.

References

[1] Angelo, T.A. and Cross, K.P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd edition, Jossey-Bass Publishers, San Francisco, 1993.

[2] Fullan, M.G. The New Meaning of Educational Change, 2nd edition, Teachers College Press, New York, 1991.

[3] Grasha, A.F. Teaching with Style: A Practical Guide to Enhancing Learning by Understanding Teaching and Learning Styles, Alliance Publishers, Pittsburgh, 1996.

[4] Knoerr, A.P. and McDonald, M.A. “Student Assessment Through Portfolios,” this volume, p. 123.

Assessing the Teaching of College Mathematics Faculty

C. Patrick Collier
University of Wisconsin Oshkosh

At a regional, comprehensive university in the Midwest, the mathematics faculty have been grappling with what it means to be an effective teacher and how to evaluate such effectiveness. Their conclusion is that student evaluations should not be the primary means of evaluating teaching. Hence, the department is in the process of articulating a statement of expectations for teaching from which appropriate assessment instruments will be developed.

Background and Purpose

The University of Wisconsin Oshkosh is one of thirteen four-year campuses in the University of Wisconsin System. UW Oshkosh typically has an undergraduate student body of 9,000–10,000, plus another 1,000–1,300 graduate students. It offers master’s degrees in selected areas, including an MS Mathematics Education degree administered by the Mathematics Department.

The mathematics department is primarily a service department, providing mathematics courses for general education, for business administration, and for prospective and inservice teachers. Over 80% of the credits generated by the department are earned by students who are not majors or minors in mathematics. The largest number of credits is generated in intermediate and college algebra. About one-third of all students enrolled in service courses earn less than a C grade or withdraw, with permission, after the free drop period.

The system for evaluating teaching effectiveness, particularly evaluation for the purpose of determining merit pay increments, has not had wide support for many years. Minor changes have been proposed with the intention of making the policy more widely acceptable. More recently, changes that are more substantive have been considered. These will be described after providing some background on personnel decisions, on the role of teaching effectiveness in those decisions, and on the role of student input into those decisions.

Faculty are involved in four kinds of personnel decisions: renewal, promotion, merit review, and post tenure review. Each of these decisions involves an evaluation of teaching, professional growth, and service. Our experience has probably been similar to that of many departments of our size. There has been dissatisfaction for many years with the manner in which teaching has been evaluated, but there has not been any consensus on how to change it. We have generally required probationary faculty to demonstrate they were effective teachers without giving them a working definition of effective teaching. Instead we have generally relied on how their student “evaluations” compared with those of other teachers.

The UW System Board of Regents has, since 1975, mandated the use of “student evaluations” to assess teaching effectiveness. The mathematics department responded to the Regents’ mandate by constructing a nine-item form. Students were given nine prompts (e.g., “prepared for class,” “relates well to students,” “grading”) and were asked to rate each instructor on a five-point scale ranging from “very poor” to “outstanding.” The ninth item was “considering all aspects of teaching that you feel are important, assign an overall rating to your instructor.” Almost all of the items were designed to elicit subjective responses. Department policy called for deriving a single number for each class by taking a weighted average in which the ninth “overall” item was weighted at one-half and the remaining items, as a composite group, were weighted the other half. The numbers assigned to an instructor by course were then averaged over courses to determine a single numerical rating for each instructor. For some number of years the single numerical score on the student evaluation survey represented the preponderance of evidence of teaching effectiveness used in retention decisions and merit allocation. Peer evaluation was a factor in some instances, but those instances were most often situations in which the peer evaluation was to “balance” or mitigate the effects of the student evaluation. Over a period of time, the department identified several faults in this process. Some of those were faults with the survey instrument; some were faults in interpreting and using the results. A partial listing follows.

1. Most of the items in the survey were written from the perspective that an effective teacher is essentially a presenter and an authority figure. This is not the image of an effective teacher that is described in the current standards documents.

2. The student opinions were not entirely consistent with other measures of teaching effectiveness. For example, some faculty found they could improve their student evaluation ratings by doing things they did not consider to be effective practice or by ceasing to do things that they believed were good instructional practices.

3. Faculty objected to calling the data collected from students “evaluations” but did not object to it being called “opinions,” since the data were to be input to an evaluation performed by faculty members.

4. The Regents had mandated the use of student evaluation data but had not mandated that it represent the preponderance of evidence of effective teaching.

5. We had been led to evaluate teaching effectiveness by comparing single numerical scores. Eventually each score was translated into a decile. So half of the students could rate you as “average” and the other half as “good” for a 3.5 on the five-point scale, but that may have placed you in the 20th percentile in the departmental ranking. What started out as a fairly good evaluation (between average and good) was ultimately translated into a rating which suggested a poor showing (since you were in the bottom fifth of all teachers).
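To make the arithmetic in fault 5 concrete, here is one plausible formalization of the scoring scheme described above; it assumes the eight non-overall items were weighted equally within their composite group, a detail the policy did not spell out. If $r_1, \dots, r_8$ are a class’s mean responses to the first eight items and $r_9$ is its mean response to the overall item, the class score was
$$ s = \frac{1}{2}\,r_9 + \frac{1}{2}\cdot\frac{1}{8}\sum_{i=1}^{8} r_i, $$
and an instructor’s single numerical rating was the average of $s$ over all classes taught. Under this formula an instructor whose students all split evenly between “average” and “good” on every item would score $s = 3.5$, yet the decile conversion could still place that instructor in the bottom fifth of the department.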

Probationary faculty are subject to renewal every two years. The decision to renew is made by a committee of tenured faculty who rate each candidate on a five-point scale with “meets expectations” in the center. The decision to promote in rank is made by a committee of faculty in the upper ranks who rate each candidate on a five-point scale with “meets expectations” in the center. Merit reviews have been conducted every two years and have been independent of other reviews. Each faculty member has been rated on a nine-point scale by separate committees which evaluate, respectively, teaching, professional growth, and service. The post tenure review, conducted by a committee of tenured faculty, is based on merit reviews over four years and results in a designation of “meets expectations” or “needs a plan of improvement.”

In renewal and promotion decisions, the appropriate personnel committee evaluates teaching after examining evidence submitted by the candidate. Candidates are required to submit student evaluations from each course taught, including free response comments of students. There will also be about two reports of class visitations each year from peers. Candidates also write a summary of their teaching accomplishments and may supplement it with copies of syllabi and sample exams.

In merit review, a rating for teaching has been determined by taking 0.4 times a weighted average of student evaluations plus 0.6 times a rating determined by a committee. The committee has not had access to student evaluations, but bases its ratings on two-page statements from the faculty. The faculty are directed to include in their statements: curriculum activity (course and instructional materials development), classroom activities different from the traditional lecture, assessment (exams, quizzes, assignments), grading standards, and accommodations made (new preparations, more than two preparations, night classes, adapting to schedule changes). The committee rates each faculty member on a scale of 1–9.
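Expressed as a formula, this is a reconstruction from the description above; the text does not say how the five-point student average and the nine-point committee rating were put on a common scale, so assume each is first rescaled appropriately:
$$ m = 0.4\,\bar{s} + 0.6\,c, $$
where $\bar{s}$ is the instructor’s weighted average of student evaluations, $c$ is the committee’s 1–9 rating based on the two-page statement, and $m$ is the resulting merit rating for teaching.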

Method

The department has not been satisfied with the merit policy and has taken some steps to reform it. The reforms deal with the following items.

1. Rather than have the committee rate each individual on a scale of 1–9, the department would develop a statement of what is expected, and the committee would rate faculty on a five-point scale with “meets expectations” in the middle. The expectations would be stated in observable terms. This would make the form of the decision of the teaching evaluation committee in the merit process consistent with the form of the decision for renewal and promotion. It would also make the decision process less arbitrary.

2. The old policy did not promote improvement in teaching. Faculty were given a rating (which they usually perceived as being too low relative to colleagues), and that rating did not provide a direction for improvement. But with the “expectations” approach we can ask for evidence that the instructor has thought about teaching, has done some honest reflection on teaching, and has a realistic plan for improvement. For a department it may be more important over time that each member have some identifiable program for improving teaching than it is to compare “performance” and determine whose is in the top quartile and whose is in the bottom quartile.

3. The old policy did not promote improvement in programs offered by the department. For example, we are required to have an assessment plan. That plan is supposed to ensure that data are collected and used to make decisions to improve programs. But none of our evaluation of teaching is connected to promoting the assessment plan. A revised merit policy should recognize individual efforts that help the department meet its collective responsibilities.

In summary, our experience with “student evaluation” has led to general acceptance of the following principles.

1. The opinions of students should be collected systematically as input to an evaluation of teaching to be made by faculty. The input should properly be referred to as “student opinion” or “student satisfaction,” and not as “student evaluation.”

2. The survey instrument should, as much as possible, ask students to record observations of what they perceive in the classroom; it should not ask for direct evaluative judgment on those observations.

3. Faculty should have “ownership” of the items in the survey instrument, by selecting them from a larger bank of items or by constructing them to match teaching standards they have adopted.

4. Student opinions most often reflect classroom performance. Classroom performance is one aspect of teaching effectiveness; other very important aspects are curriculum planning, materials development, and assessment. Generally, students are not reliable sources of input for evaluating those components of effective teaching.

Having articulated these principles, we began to develop a new survey instrument. The general strategy was to create a consensus on a statement of teaching standards and construct a new survey instrument which would ask students whether they observed behavior that reflected those standards. After about a year’s work had produced a rough draft of a statement of standards, the task was temporarily abandoned because the University developed a new instrument. The new SOS, as it was called, contained 30 items. The department policy committee recommended we choose a subset of the new SOS. It asked department members to examine each of the SOS items and to choose a subset of ten or more items on which they would prefer to be judged. After several rounds of surveying the faculty, there was general consensus on ten items. These ten items became the new Mathematics Department SOS. [See Appendix.]

The new survey is probably not what would have resulted if we had developed teaching standards and then developed a survey to match them. However, the new survey has several properties that we sought. It does have a separate scale for each item, and most items ask students to make an observation rather than an evaluation. For example, consider this item: “During most class periods, this teacher raises thought-provoking ideas and asks challenging questions _______.” The choices for the blank are: “several times, more than a few times, a few times, one or more times, zero times.” Of course, we have an implied value system in which “several times” is preferable to “zero times.” But students may not share that value system. Indeed, some may respond (accurately for their class) “more than a few times” and believe that is not the mark of an effective teacher. As another example, consider the prompt: “The teacher is ______ attentive and considerate when listening to students’ questions and ideas.” The choices for the blank are “never or almost never, often not, usually, almost always, always.” The implied value is that teachers are to encourage dialogue and student participation.

Findings

The new survey instrument, consisting of the selected subset of the university instrument’s items, took about a year to get into place. Meanwhile, work on developing a set of department standards for teaching has not progressed. Rather, we will be redirecting that effort to developing a statement of expectations for teaching. The expectations are different from standards in the sense that standards relate exclusively to classroom management and performance, while the expectations will include aspects of teaching (e.g., planning, assessing, developing curriculum, experimenting with new strategies) that occur outside the classroom, aspects which students are not generally able to observe. The expectations will be outcomes that will be judged by peers, not by students. Some sample proposed statements of expectations include the following:

(1) Every faculty member should be involved, perhaps with a group of others in the department, in a program of improvement. That program can and, perhaps, should include making visits to classes of colleagues and having colleagues visit their classes.

(2) Every faculty member should be involved in the assessment of students, courses, and programs and in the formulation of reasonable strategies for improving those students, courses, and programs.

(3) Every faculty member should be involved in curriculum building by designing new courses, revising old courses, and constructing meaningful activities for students.

(4) Every faculty member should collect student opinions of his/her teaching and should consider those opinions as part of a program for improving his/her teaching.

After work on the expectations has been completed, we expect to resume the quest to construct a statement of teaching standards that can be translated into prompts for a department SOS instrument. That is likely to take at least two more years.

Use of Findings

The merit evaluation for the last two decades had been independent of all other evaluations. The department decided that the merit policy should not continue as an independent process but should take as input the results of the review for renewal of probationary faculty and the post tenure review of tenured faculty. Expectations for probationary faculty had already been articulated in the Renewal/Tenure Policy. It remains to complete a list of expectations for tenured faculty in the Post Tenure Review Policy. We have broadened the definition of effective teaching to include performance on tasks that usually take place outside of the classroom. We have placed more emphasis on encouraging improvement than on making comparisons between one faculty member and another.

We have taken steps to reduce the amount of weight given to student opinions. In fact, student opinions will not factor directly into merit points in the future. Rather, student opinion is one of several factors that are considered by the Renewal Committee or the Post Tenure Review Committee as they make an evaluation. At the same time, we have tried to gather student opinions on items that we believe they can observe and report with some objectivity. We have also taken steps to describe aspects of teaching that students do not usually observe. As a result, we hope to have a policy that focuses more on professional development than on forcing faculty to compete in a zero-sum game.

Success Factors

As a result of this experience, we recommend the following to all departments confronted with the very difficult task of evaluating teaching.

1. Focus as much as possible on improving teaching and as little as possible on rewards and punishments. Faculty can be motivated to participate in a program of professional development much more readily than they can be persuaded to participate in a system designed to reward or punish.

2. Focus as much as possible on developing and building consensus for your own statement of standards of teaching as a foundation for evaluating teaching. Evaluation should be based on an explicit standard, a standard with which all faculty can identify.

3. Focus on a definition of teaching that extends to tasks that are performed outside the classroom. Teaching effectiveness is much more than a “performance” in the classroom.

4. Find a realistic role for student input. Determine the role that you want student data to play in the evaluation process.

5. Do not expect anything more than temporary, partial solutions, but continue to seek more permanent and complete solutions.

APPENDIX
Student Opinion of Instruction Items

1. This teacher makes _____ use of class time.

5 - very good; 4 - good; 3 - satisfactory; 2 - poor; 1 - very poor

2. The syllabus (course outline) did _____ job in explaining course requirements.

5 - a very good; 4 - a good; 3 - an adequate; 2 - not do an adequate; 1 - (there is no syllabus/course outline)

3. This teacher raises thought-provoking ideas and asks challenging questions _____ during most class periods.

5 - several times; 4 - more than a few times; 3 - a few times; 2 - one or two times; 1 - zero times

4. This teacher provides _____ feedback on my progress in the course.

5 - a great deal of; 4 - more than an average amount of; 3 - an average amount of; 2 - little; 1 - very little or no

5. The assignments (papers, performances, projects, exams, etc.) in this class are _____ used as learning tools.

5 - always; 4 - almost always; 3 - sometimes; 2 - seldom; 1 - never

6. Tests _____ assess knowledge of facts and understanding of concepts instead of memorization of trivial details.

5 - always; 4 - almost always; 3 - usually; 2 - often do not; 1 - almost never

7. The quality of teaching in this course is giving me _____ opportunity to gain factual knowledge and important principles.

5 - a very good; 4 - a good; 3 - an average; 2 - a poor; 1 - a very poor

8. This teacher is _____ attentive and considerate when listening to students’ questions and ideas.

5 - always; 4 - almost always; 3 - usually; 2 - often not; 1 - never or almost never

9. The work assigned contributes _____ to my understanding of the subject.

5 - a very significant amount; 4 - a significant amount; 3 - a good amount; 2 - little; 1 - very little

10. The difficulty level of this course is _____ challenging to me.

5 - constantly; 4 - often; 3 - sometimes; 2 - almost never; 1 - never

Using Video and Peer Feedback to Improve Teaching

Joel David Hamkins
CUNY-College of Staten Island

This article discusses a program at Berkeley of using videotaping of actual classes, and peer feedback, to improve teaching. While the program was aimed at graduate students, it can be adapted for use with faculty members.

Background and Purpose

Diverse and informed criticism can improve teaching, especially when coupled with video feedback. In the Graduate Student Instructor (GSI) training program which I directed in the UC Berkeley Mathematics Department, such criticism helped all the GSIs improve their teaching. At UC Berkeley, the introductory mathematics courses are typically divided into large lectures which meet with the professor, and then are further subdivided into recitation sections of about 25 students, each of which meets twice weekly with a GSI. All first-time GSIs are required, concurrently with their first teaching appointment, to enroll in the training program (as a course), and in the Fall of 1994 we had about 35 such GSIs. Though the course was intended specifically to improve the teaching of novice graduate student instructors, teachers and professors at all levels could benefit from the critical feedback at the heart of our program.

Method

After an initial university-wide training workshop, the course met for the first half of the semester every Monday afternoon for about 2-1/2 hours. Each meeting consisted largely of an extended discussion of various topics on teaching effectiveness, such as the importance of knowing whether the students understand your explanations, the best ways to encourage student involvement in the classroom, and how to manage groups, as well as more mathematical matters, such as how best to explain continuity, the chain rule, or inverse functions. Many of the topics we discussed are found in the text [1], which I highly recommend. Outside the weekly discussion session, the GSIs were required to visit each other’s sections and evaluate the teaching they observed, meeting individually with the observed GSI to discuss their criticisms directly. Also, the GSIs were required to experiment in their own sections with some of the dozens of more unusual or innovative teaching techniques we had discussed (e.g., using name-tags on the first few days of class, group exams, games such as Math Jeopardy, or having students contribute exam questions), and report back to the rest of us on the success or failure of the technique, or how the technique might be improved. These analyses were posted publicly in the mathematics department for the benefit of all GSIs and professors. Finally, twice during the semester, GSIs were videotaped by the Teaching Consultant (a talented, experienced GSI) while teaching their classes. This video was the basis of an intensive evaluation and consultation session with the Teaching Consultant. First the novice GSI and the Consultant watched the video alone, looking for strengths and weaknesses while recalling the actual class performance. Then, while watching the video together, the Teaching Consultant led the GSI through a sort of guided self-analysis of his or her performance in the classroom. In sum, the training program relied essentially on three methods: the program’s classroom discussions, peer observation and criticism, and a video consultation session with the Teaching Consultant.


Findings

The goal of the program was to improve the educational experience of undergraduates at UC Berkeley by improving GSI teaching. We therefore discussed and analyzed a wide variety of teaching techniques and topics, many of which grew out of situations that the GSIs had encountered in their own classrooms. Let me list here a few of the most important topics that arose:

• How to generate enthusiasm in the students by using interesting or unusual examples.
• How to get more useful feedback from students by phrasing questions in a positive manner (e.g., “Raise your hand if you understand this part” rather than “Does anyone still not get this?”).
• How to use a chalkboard effectively.
• How to engage the students with leading questions to actively involve them in the classroom (“Who can suggest what we should do at this step?”).
• How to encourage collaboration among students.
• How to give useful comments on homework.
• How to deal with disruptive students.
• How to simultaneously challenge the students to think and also obtain critical feedback about what the students know by asking the right kinds of questions.
• How best to explain specific mathematical topics such as partial differentiation or the ratio test.
• How to manage various unusual teaching techniques such as group quizzes, special “all-theory” office hours, or student presentations.
• How to deal with cheating on exams.
• How to run a review session.
• How to teach simultaneously to students of varying ability.

All the GSIs benefitted from these suggestions and the chance to discuss their ideas about teaching in an open, supportive forum.

Use of Findings

Most of the GSIs were able to improve their teaching immediately, using the ideas and techniques they had encountered in the program’s class meetings. Indeed, after experimenting with some of the more unusual techniques, many of the GSIs reported them to be so successful that they were now using them regularly. By attending each other’s sections and giving critical feedback to each other, the GSIs became aware of certain shortcomings, such as their need to lead the discussion more, to write more on the chalkboard, or to engage more of the students. After the videotaped observations, the experienced Teaching Consultant was able to point out specific strengths and weaknesses in each GSI’s teaching. One GSI, for example, was having trouble getting students to respond; the Teaching Consultant pointed out simply that he was rushing and not allowing them enough time to answer. Afterwards, this GSI simply waited a bit longer after asking a question and found that his students were perfectly willing to contribute. Other GSIs were helped by the Consultant’s suggestions concerning boardwork or how to recognize cues that the students have not followed an explanation.

Success Factors

Overall, the GSIs and I were very pleased with the structure and effectiveness of the program. The use of an experienced Teaching Consultant, I believe, was an essential aspect of it. I have, however, several suggestions on how to implement an even better program.

• Let each GSI observe and evaluate many other GSIs. This was one of the most effective and easy-to-organize means of giving a lot of diverse criticism to each GSI while simultaneously exposing each GSI to various teaching styles, which they viewed with a critical eye. Thus it benefitted both the evaluator and the GSI who was evaluated.

• Use videos only to augment actual observations, and have the Teaching Consultant make the video while observing the class. Because a videotape only imperfectly records what went on in the classroom — sounds are distorted, student reactions are lost — one should never evaluate a teacher using a videotape of a class one didn’t attend. Rather, use videos to augment actual observations. We found it very convenient for the Teaching Consultant to simply arrive a few minutes early to set up equipment, and then occasionally check the camera during the class while taking notes and recording points to be discussed later in the consultation session. In past years, each GSI was videotaped by a technician who knew nothing about mathematics, but who only delivered the tape to the mathematics department to be viewed by the GSI and perhaps the program’s class, none of whom actually observed the class in person. Worse, in some years the tapes were made by film students, who attempted to make the tapes more interesting with close-up shots of students and whatnot, when it would have been more desirable to record the GSIs’ board-work. These suggestions have the following corollary.

• Do not show the videos as part of the discussion session. In previous years, a large part of the program consisted of viewing videos. But this is a waste of time. It is far better to simply send the GSIs into each other’s classrooms and then discuss the issues that arise in the GSI discussion section.

• Avoid mini-teaching sessions or other forms of simulated teaching. Other programs rely on this technique, but I find it to have little value. In past years, for example, the GSIs would prepare calculus lessons to be presented to the other GSIs in the course, who were encouraged to pretend not to understand the material and ask questions as though they were in an actual calculus class. But this is both absurd and boring for everyone. Important parts of teaching well — such as generating enthusiasm in the students, getting feedback from the students by asking questions, sometimes easy, sometimes difficult, and paying attention to nonverbal cues — simply cannot be simulated; there is no substitute for actual students.

• Insist on a rigorous schedule for the consultation sessions. We had problems scheduling all the sessions and would have benefitted from a rigorous initial policy.

• Videotape the GSIs twice. Once is not enough. The first time one sees oneself on videotape is a bit disconcerting. Time is spent merely getting over how one’s hair looks on camera, and so forth; it is difficult to pay attention to the teaching. With a second tape, it is much easier to focus on the teaching. Also, taping twice may show improvements in the instructor’s teaching effectiveness.

• Encourage experienced GSIs and professors to participate in the video consultation program. Experienced faculty can benefit from the critical feedback which lies at the heart of our program. I therefore suggest that all teaching faculty be encouraged to participate in the peer observation and videotape consultation sessions; they could simply be included in the schedule. A long-term continuing program would perhaps focus on the novice GSIs every fall semester, when most are starting their first teaching assignment, and in the spring semester, when there are fewer novice GSIs, focus on more experienced teachers and professors.

Reference

[1] Davis, B.G. Tools for Teaching, Jossey-Bass, 1993.

Exchanging Class Visits: Improving Teaching for Both Junior and Senior Faculty

Deborah Bergstrand
Williams College

A peer visitation program for both junior and senior faculty at a small, liberal arts college in the East has been put into place to help improve the quality of teaching. Every junior faculty member is paired with a senior colleague to exchange class visits. This program is designed to foster discussion of teaching and the sharing of ideas, and to provide constructive criticism about the teaching effectiveness of each member of the pair.

Background and Purpose

Williams College is a small liberal arts college in rural Massachusetts with an enrollment of 2000 students. The Mathematics Department has 13 faculty members covering about 9.5 FTEs. Williams students are bright and very well prepared: the median Math SAT score is 700. The academic calendar is 4-1-4: two twelve-week semesters are separated by a four-week January term called Winter Study. The standard teaching load is five courses per year; every other year one of the five is a Winter Study course.

We put a lot of effort into our teaching, as well as into evaluating teaching and supporting the improvement of teaching. Many of the more formal aspects of teacher evaluation exist in large part to inform decisions on reappointment, promotion, and tenure. Some of our evaluative procedures also serve as good mechanisms to foster discussion among colleagues about teaching. In particular, we find exchanging class visits between junior and senior faculty a highly collegial way to learn about and from each other.

To put the evaluative aspects into context, I will briefly describe the general procedures Williams uses to evaluate teaching. As at many other institutions, Williams has a structured, college-wide protocol for evaluating teaching. The Student Course Survey (SCS) is required in all courses. Students give numerical ratings anonymously on various aspects of the course and instructor. All data are gathered and analyzed by the Vice Provost, who produces detailed comparisons of individual results with departmental, divisional, peer group, and college-wide results. Teachers receive the analysis of their own results, with results for nontenured faculty also going to the department chair and the Committee on Appointments and Promotion (CAP). The SCS also includes a page for descriptive comments from each student; these are passed directly to the teacher after grades are submitted.

Following specific guidelines, departments themselves must also gather student opinion on all nontenured faculty members annually, using interviews, letters, or departmental questionnaires. Though not required, the college encourages class visits, in accordance with specified guidelines. All information on a nontenured faculty member’s teaching gathered through the above means is discussed with the faculty member and summarized in an annual report to the CAP.

Method

In the Mathematics Department, senior faculty have been observing classes of junior faculty for many years. The goals were to evaluate nontenured faculty and to offer comments and constructive criticism. Even if being observed was somewhat awkward, the system worked well and was seen by all as an important and useful complement to student evaluations of teaching. During the 1980’s, as anxieties about tenure decisions rose along with the complexity of procedures for evaluating teaching and scholarship, junior faculty in many departments at Williams felt more and more “like bugs under a microscope.” In an effort to alleviate some of the anxiety, and in turn expand the benefits of class visits, our department decided to have pairs of junior and senior faculty exchange class visits each semester. Thus junior faculty are now observers as well as the observed. They appreciate the opportunity to see their senior colleagues in the classroom, and we all benefit from increased exposure to different teaching styles.

Every semester, each junior faculty member is paired with a senior member to exchange class visits. The department chair arranges pairs so that over the course of a few years, every senior colleague observes each junior colleague. The chair also ensures that someone observes every course taught by a junior colleague at least once. (So if the junior faculty member teaches Calculus III several times, at least one of those offerings will be observed.) Special requests are also considered, such as a senior faculty member’s curiosity about a particular course offered by a junior colleague.

Because visits to classes are part of the department’s evaluation of junior faculty, they follow a formal structure under college guidelines. Early in the semester the two faculty members meet and go over the course syllabus. The junior colleague may suggest particular classes to visit or might leave the choice open. Two or sometimes three consecutive classes are visited. In the discussion following a visit, it is important to put observations into context. What looks to the observer like an inadequate answer to a question might make sense once the visitor learns that the same question had been addressed at length the previous day. What looks like an awkward exchange between a student and the teacher might be the result of some previous incident in which the student was disrespectful. The two come to an understanding of the visitor’s evaluation of the class, which the senior colleague then conveys to the junior colleague in writing, with a copy to the department chair. This letter becomes part of the junior faculty member’s file. The write-up includes a general description of the class, specific comments both positive and negative, and the suggestions or ideas discussed after the visit.

Visits by junior faculty to senior faculty classes are less structured. As with senior visits to junior classes, they are intended to foster discussion of teaching, the sharing of ideas, and constructive criticism. Unlike senior visits to junior classes, they are not evaluative. Because these visits are not a formal part of departmental evaluation procedures, sometimes they don’t even take place. (We encourage but do not require junior faculty to visit classes.) Those visits that do occur are followed by an informal discussion of the visitor’s impressions, comments, and criticisms. We recognize that not all junior faculty are going to be comfortable criticizing their senior colleagues, no matter how valid those criticisms may be.

Findings

Some outcomes of these class visits are predictable. The visitor might have suggestions about organizing the lecture, improving blackboard technique, or responding to and encouraging student questions. Sometimes more subtle observations can be made. A visitor in the back of the room can watch student reactions to various aspects of the class in a way the teacher running the class may not be able to do. On the positive side, I have seen classes where the students were so engaged and excited that while the teacher was facing the blackboard some students would look at each other and smile, clearly enjoying both the material and the teacher’s style. On the negative side, I have seen classes where some students were engaged but many others were not, and some were even doing other work. In each example it was helpful to share my observations with my colleague.

Another subtle effect of a class visit might be hearing a colleague’s impression of how one comes across to the class. In a basically positive report of a colleague’s visit to one of my classes (both of us tenured), he described my style as somewhat businesslike. Though it was not intended as a criticism, I was quite surprised: I had thought of my teaching style as quite warm, friendly, and encouraging. Not that “businesslike” was bad; it just wasn’t how I thought I came across.

Use of Findings

Every faculty member reacts in their own way to comments about their teaching. Following the “businesslike” comment on my own teaching style referred to above, I tried to pay more attention to the tone I set in the classroom. For example, I used to take a fairly strict approach to collecting quizzes precisely at the end of the designated time period, in an effort both to be fair and to keep control of class time. Thinking about the atmosphere such a policy created, however, I realized that it was more strict and formal than I really wanted or needed to be. As a result, I’m now more relaxed about quiz time and about some other things as well. The students still respect me, and I still have control of my class, but I feel we’re all more relaxed and hence able to learn more.

One junior colleague’s classroom style was quite formal, even though outside of class he was very friendly and engaging. While visiting classes of two senior colleagues he admired and respected, he noticed they were closer and friendlier with their students in class than he was. He has since incorporated some of that spirit in his own classes. For example, he now arrives in class a few minutes early so he can chat with his students, not only about mathematics, but about other things going on in their lives.

Not all faculty take the advice given. After being told his pace was slow, one junior colleague decided not to speed up the class. He decided his own pace was the appropriate one. Thus even in a department with lots of visiting, discussing, and evaluating, faculty members retain their teaching autonomy. Structuring class visits as a two-way street really helps. One junior colleague commented that while he knows he’s being evaluated, he also knows that comments he makes to senior colleagues about their teaching will be taken seriously.

Success Factors

Classroom visits do have their limits. One sees only a few classes, not the entire course. In smaller classes, the very presence of a visitor can affect student behavior and class dynamics. It is our practice to have the teacher being observed decide whether to introduce visitors, explain their presence, invite them to participate, etc. In some cases the visitor remains unobtrusive and unacknowledged. The latter approach has the advantage of perhaps producing a “purer” observation. The former has the advantage of informing students about, and including them more directly in, the process of teacher evaluation and improvement.

Over the last few years, the tenure balance in the department has shifted. We now have only two nontenured members and eleven tenured. Being familiar with one-to-one functions, we recognize that a true “exchange” of class visits between junior and senior faculty will either exclude many senior faculty each year or will impose an unreasonable burden on the junior faculty. We now also encourage class visits between senior faculty. Such visits are both relaxed and stimulating. We have all taken ideas from our colleagues to use in our own classes; we have all benefitted from even the simplest of observations from another teacher about our own teaching. The result, we hope, is a set of junior and senior colleagues, all aware of each other’s teaching efforts and challenges, and all ready to support and learn from each other’s creative energy.

Peer Review of Teaching

Pao-sheng Hsu
University of Maine

At a land-grant university in the Northeast one mathematics faculty member has begun experimenting with a peer visitation program in which a team of faculty visits her class at least twice a semester. What’s unique about this process is that the team, usually three in number, consists of faculty both within and without the department, and all visit the same class at the same time. This technique provides the instructor with a diversity of views on her teaching effectiveness.

Background and Purpose

The University of Maine, the land-grant university and the sea-grant college of the State of Maine, enrolls approximately 8000 undergraduate and 2000 graduate students. In 1996–97 there were 24 full-time faculty and several part-time faculty in the Department of Mathematics and Statistics. The department had 42 majors and awarded seven degrees that year. Also, there were seven Master’s degree students. The department uses “classroom observations” by department members on a one-time basis as part of an evaluation process for tenure considerations; it has no regular “peer review” practice.

Nationally, the pros and cons of having other faculty review a classroom as part of the evaluation of teaching have been widely debated. Proponents argue that peer reviews can provide the teacher with insights into the classroom learning environment unattainable in other ways, and that these reviews also strengthen the faculty’s voice in personnel decisions. Opponents maintain that political and personal factors sometimes enter the evaluative process and that the opportunity for misuse and abuse is real. While the debate continues, it is not uncommon that, by default, the burden of evidence for arriving at a judgment of a faculty member’s teaching effectiveness may fall entirely on student evaluations, since these are often mandated by university administrations. In an effort to generate discussion and broaden the perspective on evaluation of teaching, I initiated the experiment of peer review of my classes. I have experimented with inviting faculty members who have experience in ethnographic research¹ and are from outside of my own discipline, as well as colleagues from my own department.

Method

Observers in the classroom can document what actually goes on there. Having several observers in the class at the same time allows multiple perspectives on the class. Further, observers from other disciplines bring different perspectives and different expectations of teaching strategies; these enrich the ensuing discussions. A team of three observers is selected, either by the faculty member or (if part of an official departmental process) by the department: one from within the department, the other two from outside the department. All observers should have an interest or previous experience in evaluating teaching. Observers visit the class at least twice during the term. Each time, all three observers are present, and the faculty member is informed in advance of the visit.


¹ In their book Ethnography and Qualitative Design in Educational Research [1], Goetz and LeCompte described this kind of research as the “holistic depiction of uncontrived group interaction over a period of time, faithfully representing participant views and meanings” (p. 51).


Before the first visit, the faculty member meets with the observers to inform them of what has occurred so far in the course and provide them with written materials such as syllabi, assignments, and samples of student writing.

During the visit, some observers may arrive early to chat with some students or observe student interactions. Observers may sit in different parts of the room, so that some can watch students’ level of engagement from the back, while others can hear the muttered comments of shy students in the front. Observers may also stay after class to talk with some students. The faculty member should meet with the observers immediately after the visit to give context to the observations. The instructor can provide background history about the class which explains some aspects of classroom dynamics, the direction of the discussion, or what was not done in the class. Without that knowledge, an observer might come to a different conclusion about what took place in the class due to the instructor’s actions or inactions. The results of these conversations can be incorporated into the observation reports.

After the visit, each observer writes a report of what was observed (either in a format that the observer chooses, to give the widest possible range of records, especially if the review is done at the faculty member’s request, or on a departmentally constructed format if uniformity is required for an official process). The observers may also meet with the faculty member to discuss their observations, either as a group or individually.

Between visits, the faculty member provides the observers with written information such as the course tests, test results, and correspondence with some students.

For example, during the Fall of 1994, I invited a trio of observers to a Calculus II class of about 30–35 students. The observers consisted of (1) a colleague from the Speech/Communication Department who had participated in system-wide Women-in-the-Curriculum activities, (2) a sociologist who had attended Writing-Across-the-Curriculum workshops, and (3) a mathematician who oversaw graduate teaching assistants. They visited twice during the term and wrote a total of four reports. In the Fall of 1996, the observers were (1) an English Department colleague who had coordinated the campus Writing-Across-the-Curriculum Program and who has the responsibility of evaluating part-time/fixed-length (i.e., nontenure-track) faculty in her department, (2) a sociologist who had won a university teaching award, and (3) a mathematician with whom I had discussed evaluation of teaching. This group observed my Precalculus class (30–35 students) three times and wrote five sets of observation notes.

Findings

In the classes observed, I was using a very collaborative lecturing style, inviting students to participate in what was being discussed and to interact with me as well as with each other. Sometimes the discussions were led in a certain direction because of comments or questions by one or more students; sometimes I cut short a discussion because I wanted students to gather all the information they had after class and think more before we discussed a problem further. Most observers found that learning environment positive. They also had suggestions for improvement. For example, one observer suggested that I write on transparencies so that I could face the students more; another suggested that I leave more time for discussions of new topics.

Students did not seem to behave much differently with observers in the room. However, the presence of observers did have quite a psychological impact on me. For example, it was difficult to avoid eye contact with the observers.

Use of Findings

Observers’ suggestions can lead to changes in teaching style or strategies, both immediately and after some reflection on how to incorporate the suggestions in a manner consistent with the instructor’s personality. Sometimes suggestions do not work with a particular class. However, when this happens, the instructor may think of other ways to remedy an identified problem.

I have found both the encouragement and the critical suggestions of the observers useful in helping me work out my practices. Thus, I felt encouraged to continue working with students to help them think through the mathematics they are learning and doing, while I became more careful in weighing how much time I could spend drawing students into a discussion against the time I would need for new topics. However, I found the use of transparencies very restricting and asked the students for their opinion; they suggested that I use the board for writing.

Together with some student work and students’ evaluations, the peer review reports can provide a fuller view of an instructor’s work in the classroom for a teaching portfolio — a piece of much-needed documentation of a faculty member’s teaching. Above all, the peer review reports help inform the instructor, from outside perspectives, about changes that can be made. The process also gives the instructor more colleagues with whom to discuss teaching.

Success Factors

This kind of peer review is very labor intensive and time-consuming: scheduling multiple visits for three people while avoiding test dates for the class is a logistical challenge, and the writing of reports takes tremendous time and energy. In an institution where peer review is not an established practice, it will take persuasion to convince colleagues to participate.

Ideally, the purpose of observing a class is to assess how students are learning; nevertheless, while a great deal of learning should be taking place outside the classroom, most observers will be assessing teaching performance or “teaching effectiveness” based on what is observed in the classroom. As views of what is “effective” may vary widely, the faculty member being reviewed should exercise judgment about the sensitivity of these colleagues to his or her teaching goals and make these goals explicit to the observers in advance. Providing the observers with material such as the syllabus, assignments, discussion topics given to students, and student writings will afford the observers a context in which to view the sessions.

To avoid the misuse and abuse that opponents of peer reviews worry about, there needs to be some consensus in a department so that the practice is seen, especially by the students, as a departmental effort to help students learn. Faculty development workshops may provide training for faculty members in observing and commenting sensitively on teaching.

An alternative form of peer evaluation is having a faculty member visit the class, with the instructor absent, and hold focus group discussions with the class. For further information on this, see Patricia Shure’s article in this volume, p. 187.

Reference

[1] Goetz and LeCompte, Ethnography and Qualitative Design in Educational Research, 1984.

Appendix

Reprint of “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics”

Prepared by The Mathematical Association of America, Subcommittee on Assessment, Committee on the Undergraduate Program in Mathematics
Approved by CUPM at the San Francisco meeting, January 4, 1995

Preface

Recently there has been a series of reports and recommendations about all aspects of the undergraduate mathematics program. In response, both curriculum and instruction are changing amidst increasing dialogue among faculty about what those changes should be. Many of the changes suggested are abrupt breaks with traditional practice; others are variations of what has gone on for many decades. Mathematics faculty need to determine the effectiveness of any change and institutionalize those that show the most promise for improving the quality of the program available to mathematics majors. In deciding which changes hold the greatest promise, student learning assessment provides invaluable information. That assessment can also help departments formulate responses for program review or other assessments mandated by external groups.

The Committee on the Undergraduate Program in Mathematics established the Subcommittee on Assessment in 1990. This document, approved by CUPM in January 1995, arises from requests from departments across the country struggling to find answers to the important new questions in undergraduate mathematics education. This report to the community is suggestive rather than prescriptive. It provides samples of various principles, goals, areas of assessment, and measurement methods and techniques. These samples are intended to seed thoughtful discussions and should not be considered as recommended for adoption in a particular program, certainly not in totality and not exclusively.

Departments anticipating program review or preparing to launch the assessment cycle described in this report should pay careful attention to the MAA Guidelines for Programs and Departments in Undergraduate Mathematical Sciences [1]. In particular, Section B.2 of that report and step 1 of the assessment cycle described in this document emphasize the need for departments to have

a. A clearly defined statement of program mission; and
b. A delineation of the educational goals of the program.

The Committee on the Undergraduate Program in Mathematics urges departments to consider carefully the issues raised in this report. After all, our programs should have clear guidelines about what we expect students to learn, and we should have a mechanism for knowing whether in fact that learning is taking place.

James R. C. Leitzel, Chair, Committee on the Undergraduate Program in Mathematics, 1995

Membership of the Subcommittee on Assessment, 1995

Larry A. Cammack, Central Missouri State University, Warrensburg, MO
James Donaldson, Howard University, Washington, DC
Barbara T. Faires, Westminster College, New Wilmington, PA
Henry Frandsen, University of Tennessee, Knoxville, TN
Robert T. Fray, Furman University, Greenville, SC
Rose C. Hamm, College of Charleston, Charleston, SC
Gloria C. Hewitt, University of Montana, Missoula, MT
Bernard L. Madison (Chair), University of Arkansas, Fayetteville, AR



William A. Marion, Jr., Valparaiso University, Valparaiso, IN
Michael Murphy, Southern College of Technology, Marietta, GA
Charles F. Peltier, St. Mary’s College, South Bend, IN
James W. Stepp, University of Houston, Houston, TX
Richard D. West, United States Military Academy, West Point, NY

I. Introduction

The most important indicators of effectiveness of mathematics degree programs are what students learn and how well they are able to use that learning. To gauge these indicators, assessment — the process of gathering and interpreting information about student learning — must be implemented. This report seeks to engage faculty directly in the use of assessment of student learning, with the goal of improving undergraduate mathematics programs.

Assessment determines whether what students have learned in a degree program is in accord with program objectives. Mathematics departments must design and implement a cycle of assessment activity that answers the following three questions:

• What should our students learn?
• How well are they learning?
• What should we change so that future students will learn more and understand it better?

Each step of an ongoing assessment cycle broadens the knowledge of the department in judging the effectiveness of its programs and in preparing mathematics majors. This knowledge can also be used for other purposes. For example, information gleaned from an assessment cycle can be used to respond to demands for greater accountability from state governments, accrediting agencies, and university administrations. It can also be the basis for creating a shared vision of educational goals in mathematics, thereby helping to justify requests for funds and other resources.

This report provides samples of various principles, goals, areas of assessment, and measurement methods and techniques. Many of the items in these lists are extracted from actual assessment documents at various institutions or from reports of professional organizations. These samples are intended to stimulate thoughtful discussion and should not be considered as recommended for adoption in a particular program, certainly not in totality and not exclusively. Local considerations should guide selection from these samples as well as from others not listed.

II. Guiding Principles

An essential prerequisite to constructing an assessment cycle is agreement on a set of basic principles that will guide the process, both operationally and ethically. These principles should anticipate possible problems as well as ensure sound and effective educational practices. Principles and standards from several sources (see references 2, 3, 4, 5, and 6) were considered in the preparation of this document, yielding the following for consideration:

a. Objectives should be realistically matched to institutional goals as well as to student backgrounds, abilities, aspirations, and professional needs.

b. The major focus of assessment (by mathematics departments) should be the mathematics curriculum.

c. Assessment should be an integral part of the academic program and of program review.

d. Assessment should be used to improve teaching and learning for all students, not to filter students out of educational opportunities.

e. Students and faculty should be involved in and informed about the assessment process, from the planning stages throughout implementation.

f. Data should be collected for specific purposes determined in advance, and the results should be reported promptly.

III. The Assessment Cycle

Once the guiding principles are formulated and understood, an assessment cycle can be developed:

1. Articulate the learning goals of the mathematics curriculum and a set of objectives that should lead to the accomplishment of those goals.

2. Design strategies (e.g., curriculum and instructional methods) that will accomplish the objectives, taking into account student learning experiences and diverse learning styles, as well as research results on how students learn.

3. Determine the areas of student activities and accomplishments in which quality will be judged. Select assessment methods designed to measure student progress toward completion of objectives and goals.

4. Gather assessment data; summarize and interpret the results.

5. Use the results of the assessment to improve the mathematics major.

Steps 1 and 2 answer the first question in the introduction — what should the students learn? Steps 3 and 4, which answer the second question about how well they are learning, constitute the assessment. Step 5 answers the third question on what improvements are possible.

Step 1. Set the Learning Goals and Objectives

There are four factors to consider in setting the learning goals of the mathematics major: institutional mission, background of students and faculty, facilities, and degree program goals. Once these are well understood, then the goals and objectives of the major can be established. These goals and objectives of the major must be aligned with the institutional mission and general education goals and take into account the information obtained about students, faculty, and facilities.

Institutional Mission and Goals. The starting point for establishing goals and objectives is the mission statement of the institution. Appropriate learning requirements from a mission statement should be incorporated in the department’s goals. For example, if graduates are expected to write with precision, clarity, and organization within their major, this objective will need to be incorporated in the major’s goals. Or, if students are expected to gain skills appropriate for jobs, then that must be a goal of the academic program for mathematics majors.

Information on Faculty, Students, and Facilities. Each institution is unique, so each mathematics department should reflect the special features of its institutional environment. Consequently, the nature of the faculty, students, courses, and facilities should be studied in order to understand special opportunities or constraints on the goals of the mathematics major. Questions to be considered include the following:

• What are the expectations and special needs of our students?
• Why and how do our students learn?
• Why and how do the faculty teach?
• What are the special talents of the faculty?
• What facilities and materials are available?
• Are mathematics majors representative of the general student population, and if not, why not?

Goals and Objectives of Mathematics Degree Program. A degree program in mathematics includes general education courses as well as courses in mathematics. General education goals should be articulated and well understood before the goals and objectives of the mathematics curriculum are formulated. Of course, the general education goals and the mathematics learning goals must be complementary and consistent [6, pages 183–223]. Some examples of general education goals that will affect the goals of the degree program and what learning is assessed include the following:

Graduates are expected to speak and write with precision, clarity, and organization; to acquire basic scientific and technological literacy; and to be able to apply their knowledge.

Degree programs should prepare students for immediate employment, graduate schools, professional schools, or meaningful and enjoyable lives.

Degree programs should be designed for all students with an interest in the major subject and encourage women and minorities, support the study of science, build student self-esteem, ensure a common core of learning, and encourage life-long learning.

Deciding what students should know and be able to do as mathematics majors is ideally approached by setting the learning goals and then designing a curriculum that will achieve those goals. However, since most curricula are already structured and in place, assessment provides an opportunity to review curricula, discern the goals intended, and rethink them. Curricula and goals should be constructed or reviewed in light of recommendations on the mathematics major as contained in the 1991 CUPM report on the Undergraduate Major in the Mathematical Sciences [6, pages 225–247].

Goal setting should move from general to specific, from program goals to course goals to assessment goals. Goals for student learning can be statements of knowledge students should gain, skills they should possess, attitudes they should develop, or requirements of careers for which they are preparing. The logical starting place for discerning goals for an existing curriculum is to examine course syllabi, final examinations, and other student work.

Some samples of learning goals are:

Mathematical Reasoning. Students should be able to perform complex tasks; explore subtlety; discern patterns, coherence, and significance; undertake intellectually demanding mathematical reasoning; and reason rigorously in mathematical arguments.

Personal Potential. Students should be able to undertake independent work, develop new ideas, and discover new mathematics. Students should possess an advanced level of critical sophistication; knowledge and skills needed for further study; personal motivation and enthusiasm for studying and applying mathematics; and attitudes of mind and analytical skills required for efficient use, appreciation, and understanding of mathematics.

Nature of Mathematics. Students should possess an understanding of the breadth of the mathematical sciences and their deep interconnecting principles; substantial knowledge of a discipline that makes significant use of mathematics; understanding of interplay among applications, problem-solving, and theory; understanding and appreciation of connections between different areas of mathematics and with other disciplines; awareness of the abstract nature of theoretical mathematics and the ability to write proofs; awareness of historical and contemporary contexts in which mathematics is practiced; understanding of the fundamental dichotomy of mathematics as an object of study and a tool for application; and critical perspectives on inherent limitations of the discipline.

Mathematical Modeling. Students should be able to apply mathematics to a broad spectrum of complex problems and issues; formulate and solve problems; undertake some real-world mathematical modeling project; solve multi-step problems; recognize and express mathematical ideas imbedded in other contexts; use the computer for simulation and visualization of mathematical ideas and processes; and use the process by which mathematical and scientific facts and principles are applied to serve society.


Communication and Resourcefulness. Students should be able to read, write, listen, and speak mathematically; read and understand technically-based materials; contribute effectively to group efforts; communicate mathematics clearly in ways appropriate to career goals; conduct research and make oral and written presentations on various topics; locate, analyze, synthesize, and evaluate information; create and document algorithms; think creatively at a level commensurate with career goals; and make effective use of the library. Students should possess skill in expository mathematical writing, have a disposition for questioning, and be aware of the ethical issues in mathematics.

Content Specific Goals. Students should understand theory and applications of calculus and the basic techniques of discrete mathematics and abstract algebra. Students should be able to write computer programs in a high-level language using appropriate data structures (or to use appropriate software) to solve mathematical problems.
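To make the programming goal above concrete, here is one illustration of the kind of exercise it envisions — our sketch, not part of the CUPM report. The program uses a simple data structure (a list of Boolean flags) to solve a mathematical problem, listing all primes up to n by the sieve of Eratosthenes; Python is used here, but any high-level language would serve.

    def primes_up_to(n):
        """Return a list of all primes <= n, via the sieve of Eratosthenes."""
        is_prime = [True] * (n + 1)      # one flag for each integer 0..n
        is_prime[0:2] = [False, False]   # 0 and 1 are not prime
        p = 2
        while p * p <= n:
            if is_prime[p]:
                # cross off every multiple of p, starting at p*p
                for multiple in range(p * p, n + 1, p):
                    is_prime[multiple] = False
            p += 1
        return [i for i, flag in enumerate(is_prime) if flag]

    print(primes_up_to(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

An assessment rubric for such an exercise might weigh the choice of data structure and the documentation as well as the correctness of the output.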

Topic or thematic threads through the curriculum are valuable in articulating measurable objectives for achieving goals. Threads also give the curriculum direction and unity, with courses having common purposes and reinforcing one another. Each course or activity can be assessed in relation to the progress achieved along the threads. Possible threads or themes are numerous and varied, even for the mathematics major. Examples include mathematical reasoning, communication, scientific computing, mathematical modeling, and the nature of mathematics. The example of a learning goal and instructional strategy in the next section gives an idea of how the thread of mathematical reasoning could wind through the undergraduate curriculum.

Step 2. Design Strategies to Accomplish Objectives

Whether constructing a curriculum for predetermined learning goals or discerning goals from an existing curriculum, strategies for accomplishing each learning goal should be designed and identified in the curricular and co-curricular activities. Strategies should respect diverse learning styles while maintaining uniform expectations for all students.

Strategies should allow for measuring progress over time. For each goal, questions such as the following should be considered.

• Which parts of courses are specifically aimed at helping the student reach the goal?
• What student assignments help reach the goal?
• What should students do outside their courses to enable them to reach the goal?
• What should the faculty do to help the students reach the goal?
• What additional facilities are needed?
• What does learning research tell us?

The following example of a goal and strategy can be made more specific by referencing specific courses and activities in a degree program.

Learning goal. Students who have completed a mathematics major should be able to read and understand mathematical statements, make and test conjectures, and be able to construct and write proofs for mathematical assertions using a variety of methods, including direct and indirect deductive proofs, construction of counterexamples, and proofs by mathematical induction. Students should also be able to read arguments as complex as those found in the standard mathematical literature and judge their validity.
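(As a concrete instance of the kind of assertion such a goal envisions, an illustration of ours rather than the report's: consider the identity 1 + 2 + ··· + n = n(n + 1)/2. It holds for n = 1; and if it holds for n = k, then adding k + 1 to both sides gives 1 + 2 + ··· + (k + 1) = k(k + 1)/2 + (k + 1) = (k + 1)(k + 2)/2, which is the statement for n = k + 1. A student meeting the goal should be able to produce, write up, and critique such an induction argument.)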

Strategy. Students in first year mathematics courses will encounter statements identified as theorems which have logical justifications provided by the instructors. Students will verify the need for some of the hypotheses by finding counterexamples for the alternative statements. Students will use the mathematical vocabulary found in their courses in writing about the mathematics they are learning. In the second and third years, students will learn the fundamental logic needed for deductive reasoning and will construct proofs of some elementary theorems using quantifiers, indirect and direct proofs, or mathematical induction as part of the standard homework and examination work in courses. Students will construct proofs for elementary statements, present them in both written and oral form, and have them critiqued by a mathematician. During the third and fourth years, students will formulate conjectures of their own, state them in clear mathematical form, find methods which will prove or disprove the conjectures, and present those arguments in both written and oral form to audiences of their peers and teachers. Students will make rational critiques of the mathematical work of others, including teachers and peers. Students will read some mathematical literature and be able to rewrite, expand upon, and explain the proofs.

Step 3. Determine Areas and Methods of Assessment

Learning goals and strategies should determine the areas of student accomplishments and departmental effectiveness that will be documented in the assessment cycle. These areas should be as broad as can be managed, and may include curriculum (core and major), instructional process, co-curricular activities, retention within the major or within the institution, and success after graduation. Other areas, such as advising and campus environment, may also yield data on student learning.

Responsibility for each chosen area of assessment should be clearly assigned. For example, the mathematics faculty should have responsibility for assessing learning in the mathematics major, and the college may have responsibility for assessment in the core curriculum.

Assessment methods should reflect the type of learning to be measured. For example, the Graduate Record Examination (GRE) may be appropriate for measuring preparation for graduate school. On the other hand, an attitude survey is an appropriate tool for measuring an aptitude for life-long learning. An objective paper-and-pencil examination may be selected for gauging specific content knowledge.

Eight types of assessment methods are listed below, with indications of how they can be used. Departments will typically use a combination of methods, selected in view of local program needs.

1. Tests. Tests can be objective or subjective, multiple-choice or free-response. They can be written or oral. They can be national and standardized, such as the GRE and Educational Testing Service Major Field Achievement Test, or they can be locally generated. Tests are most effective in measuring specific knowledge and its basic meaning and use.

2. Surveys. These can be written or they can be compiled through interviews. Groups that can be surveyed are students, faculty, employers, and alumni. Students can be surveyed in courses (about the courses), as they graduate (about the major), or as they change majors (about their reasons for changing).

3. Evaluation reports. These are reports in which an individual or group is evaluated through a checklist of skills and abilities. These can be completed by faculty members, peers, or employers of recent graduates. In some cases, self-evaluations may be used, but these tend to be of less value than more objective evaluations. Grades in courses are, of course, fundamental evaluation reports.

4. Portfolios. Portfolios are collections of student work, usually compiled for individual students under faculty supervision following a standard departmental protocol. The contents may be sorted into categories, e.g., freshman or sophomore, and by type, such as homework, formal written papers, or examinations. The work collected in a student’s portfolio should reflect the student’s progress through the major. Examples of work for portfolios include homework, examination papers, writing samples, independent project reports, and background information on the student. In order to determine what should go in a portfolio, one should review what aspects of the curriculum were intended to contribute to the objectives and what work shows progress along the threads of the curriculum. Students may be given the option of choosing what samples of particular types of work are included in the portfolio.

5. Essays. Essays can reveal writing skills in mathematics as well as knowledge of the subject matter. For example, a student might write an essay on problem-solving techniques. Essays should contribute to learning. For example, students might be required to read four selected articles on mathematics and, following the models of faculty-written summaries of two of them, write summaries of the other two. Essays can be a part of courses and should be candidates for inclusion in portfolios.

6. Summary courses. Such courses are designed to cover and connect ideas from across the mathematics major. These may be specifically designed as summary courses and as such are usually called capstone courses, or they may be less specific, such as senior seminars or research seminars. Assessment of students’ performances in these courses provides good summary information about learning in the major.

7. Oral presentations. Oral presentations demonstrate speaking ability, confidence, and knowledge of subject matter. Students might be asked to prepare an oral presentation on a mathematics article. If these presentations are made in a summary course setting, then the discussion by the other students can serve both learning and assessment.

8. Dialogue with students. Student attitudes, expectations, and opinions can be sampled in a variety of ways and can be valuable in assessing learning. Some of the ways are student evaluations of courses, interviews by faculty members or administrators, advising interactions, seminars, student journals, and informal interactions. Also, in-depth interviews of individual students who have participated in academic projects as part of a group can provide insights into learning from the activities.

Student cooperation and involvement are essential to most assessment methods. When selecting methods appropriate to measuring student learning, faculty should exercise care so that all students are provided varied opportunities to show what they know and are able to do. The methods used should allow for alternative ways of presentation and response so that the diverse needs of all students are taken into account, while ensuring that uniform standards are supported. Students need to be aware of the goals and methods of the departmental assessment plan, the goals and objectives of the mathematics major and of each course in which they enroll, and the reason for each assessment measurement. In particular, if a portfolio of student work is collected, students should know what is going to go into those portfolios and why. Ideally, students should be able to articulate their progress toward meeting goals — in each course and in an exit essay at the end of the major.

Since some assessment measures may not affect the progress of individual students, motivation may be a problem. Some non-evaluative rewards may be necessary.

Step 4. Gather Assessment Data

After the assessment areas and methods are determined, the assessment is carried out and data documenting student learning are gathered. These data should provide answers to the second question in the introduction — how well are the students learning?

Careful record keeping is absolutely essential and should be well planned, anticipating the future needs of assessment. Additional record storage space may be needed, as well as use of a dedicated computer database. The data need to be evaluated relative to the learning goals and objectives. Evaluation of diverse data, such as that in a student portfolio, may not be easy and will require some inventiveness. Standards and criteria for evaluating data should be set and modified as better information becomes available, including longitudinal data gathered through tracking of majors through the degree program and after graduation. Furthermore, tracking records can provide a base for longitudinal comparison of information gathered in each pass through the assessment cycle.
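As one illustration of the kind of dedicated record keeping described above (a hypothetical sketch of ours, not a design prescribed by the report), a departmental tracking database might store one longitudinal record per major; every field name below is invented purely for illustration.

    from dataclasses import dataclass, field
    from typing import Dict, List, Optional

    @dataclass
    class MajorRecord:
        """One longitudinal tracking record for a mathematics major (illustrative only)."""
        student_id: str
        entry_year: int
        expectations_at_entry: Optional[str] = None   # e.g., stated career goals
        course_grades: Dict[str, str] = field(default_factory=dict)
        portfolio_items: List[str] = field(default_factory=list)
        post_graduation: Optional[str] = None         # employment or further study

    # Each pass through the assessment cycle appends to the records:
    records: Dict[str, MajorRecord] = {}
    records["s001"] = MajorRecord("s001", 1995, expectations_at_entry="actuarial career")
    records["s001"].course_grades["Calculus III"] = "B+"
    records["s001"].portfolio_items.append("sophomore proof portfolio")

Because the records persist across cycles, the longitudinal comparisons mentioned above reduce to queries over such a store, e.g., grade distributions by entry year.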

Consistency in interpreting data, especially over periods of time, may be facilitated by assigning responsibility to a core group of departmental faculty members.

Ways to evaluate data include comparisons with goals and objectives and with preset benchmarks; comparisons over time; comparisons to national or regional norms; comparisons to faculty, student, and employer expectations; comparisons to data at similar institutions; and comparisons to data from other majors within the same institution.
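In practice, the benchmark comparisons and comparisons over time mentioned here can be quite simple. The following sketch (with invented scores and an invented benchmark, purely for illustration) compares mean exit-exam results against a preset departmental benchmark, cohort by cohort:

    # Illustrative only: invented exit-exam scores grouped by entering cohort.
    scores_by_cohort = {1993: [71, 65, 80], 1994: [75, 82, 68, 90], 1995: [88, 79]}
    benchmark = 75  # a hypothetical preset departmental benchmark

    for year in sorted(scores_by_cohort):
        scores = scores_by_cohort[year]
        mean = sum(scores) / len(scores)
        status = "meets" if mean >= benchmark else "below"
        print("%d: mean %.1f (%s benchmark %d)" % (year, mean, status, benchmark))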

If possible, students should be tracked from the time they apply for admission to long after graduation. Their interests at the time of application, their high school records, their personal expectations of the college years, their curricular and extracurricular records while in college, their advanced degrees, their employment, and their attitudes toward the institution and major should all be recorded. Only with such tracking can the long-term effectiveness of degree programs be documented. Comparisons with national data can be made with information from such sources as Cooperative Institutional Research Program’s freshman survey data [7] and American College Testing’s College Outcomes Measures project [8].

Step 5. Use the Assessment Results to Improve the Mathematics Major

The payoff of the assessment cycle comes when documentation of student learning and how it was achieved points the way to improvements for future students. Assessment should help guide education, so this final step in the cycle is to use the results of assessment to improve the next cycle. This answers the third assessment question — what should be changed to improve learning? However, this important step should not be viewed solely as a periodic event. Ways to improve learning may become apparent at any point in the assessment cycle, and improvements should be implemented whenever the need is identified.

The central issue at this point is to determine valid inferences about student performances based on evidence gathered by the assessment. The evidence should show not only what the students have learned but what processes contributed to the learning. The faculty should become better informed because the data should reveal student learning in a multidimensional fashion.

When determining how to use the results of the assessment, faculty should consider a series of questions about the first four steps — setting goals and objectives, identifying learning and instructional strategies, selecting assessment methods, and documenting the results. The most critical questions are those about the learning strategies:

• Are the current strategies effective?
• What should be added to or subtracted from the strategies?
• What changes in curriculum and instruction are needed?

Secondly, questions should be raised about the assessment methods:

• Are the assessment methods effectively measuring the important learning of all students?

• Are more or different methods needed?

Finally, before beginning the assessment cycle again, the assessment process itself should be reviewed:

• Are the goals and objectives realistic, focused, and well-formulated?

• Are the results documented so that the valid inferences are clear?

• What changes in record-keeping will enhance the longitudinal aspects of the data?

IV. Conclusion

During an effective assessment cycle, students become more actively engaged in learning, faculty engage in serious dialogue about student learning, interaction between students and faculty increases and becomes more open, and faculty build a stronger sense of responsibility for student learning. All members of the academic community become more conscious of and involved in the way the institution works and meets its mission.

References

1. Guidelines for Programs and Departments in Undergraduate Mathematical Sciences, Mathematical Association of America, Washington, DC, 1993.

2. Measuring What Counts, Mathematical Sciences Education Board, National Research Council, National Academy Press, Washington, DC, 1993.

3. Assessment Standards for School Mathematics, National Council of Teachers of Mathematics, Reston, VA, circulating draft, 1993.

4. Principles of Good Practice for Assessing Student Learning, American Association for Higher Education, Washington, DC, 1992.

5. “Mandated Assessment of Educational Outcomes,” Academe, November–December 1990, pp. 34–40.

6. Steen, L.A., ed. Heeding the Call for Change, The Mathematical Association of America, Washington, DC, 1992.

7. Astin, A.W., Green, K.C., and Korn, W.S. The American Freshman: Twenty Year Trends, Cooperative Institutional Research Program, American Council on Education, University of California, Los Angeles, 1987. (Also annual reports on the national norms of the college freshman class.)

8. College Level Assessment and Survey Services, The American College Testing Program, Iowa City, 1990.

Suggestions for Further Reading

The following are several books on assessment which are not primarily concerned with assessment in mathematics. However, they are worth reading, for they contain a good number of assessment methods not discussed here, many of which can be adapted to the mathematics classroom.

Alverno College Faculty, Student Assessment-as-Learning at Alverno College, 3rd ed., Alverno College, Milwaukee, WI, 1994.

Angelo, T.A. and Cross, K.P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed., Jossey-Bass, San Francisco, 1993.

Astin, A.W. Assessment for Excellence: the Philosophy and Practice of Assessment and Evaluation in Higher Education, Oryx Press, Phoenix, AZ, 1993.

Banta, T.W. Implementing Outcomes Assessment: Promise and Perils, Jossey-Bass, San Francisco, 1988.

Banta, T.W. and associates. Making a Difference: Outcomes of a Decade of Assessment in Higher Education, Jossey-Bass, San Francisco, 1993.


Banta, T.W., Lund, J.P., Black, K.E., and Oblander, F.W., eds. Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass Publishers, San Francisco, 1996.

Braskamp, L.A., Brandenburg, D.C., and Ory, J.C. Evaluating Teaching Effectiveness: A Practical Guide, Sage Publications, Beverly Hills, CA, 1984.

Braskamp, L.A. and Ory, J.C. Assessing Faculty Work: Enhancing Individual and Institutional Performance, Jossey-Bass, San Francisco, 1994.

Kulm, G. Mathematics Assessment: What Works in the Classroom, Jossey-Bass, San Francisco, 1994.

Light, R., Harvard Assessment Seminars (first and second), Harvard University, Cambridge, MA, 1990, 1992.

Wiggins, G. Assessing Student Performance: Exploring the Purpose and Limits of Testing, Jossey-Bass, San Francisco, 1993.

See also two journals: AAHE Assessment, published by the American Association for Higher Education, and Assessment Update, edited by Trudy Banta, published by Jossey-Bass.


Bibliography

Academic Excellence Workshops. A Handbook for Academic Excellence Workshops, Minority Engineering Program and Science Educational Enhancement Services, Pomona, CA, 1992.

Adams, C., Bergstrand, D., and Morgan, F. “The Williams SMALL Undergraduate Research Project,” UME Trends, 1991.

Angel, A.R., and Porter, S.R. Survey of Mathematics with Applications (Fifth Edition), Addison-Wesley Publishing Company, 1997.

Angelo, T.A. and Cross, K.P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed., Jossey-Bass, San Francisco, 1993.

Asiala, M., Brown, N., DeVries, D., Dubinsky, E., Mathews, D., and Thomas, K. “A Framework for Research and Development in Undergraduate Mathematics Education,” Research in Collegiate Mathematics Education II, CBMS Issues in Mathematics Education, 6, 1996, pp. 1–32.

Barnes, M. “Gender and Mathematics: Shifting the Focus,” FOCUS on Learning Problems in Mathematics, 18 (numbers 1, 2, and 3), 1996, pp. 88–96.

Barnett, J. “Assessing Student Understanding Through Writing,” PRIMUS, 6 (1), 1996, pp. 77–86.

Battaglini, D.J. and Schenkat, R.J. Fostering Cognitive Development in College Students: The Perry and Toulmin Models, ERIC Document Reproduction Service No. ED 284 272, 1987.

Bauman, S.F., and Martin, W.O. “Assessing the Quantitative Skills of College Juniors,” The College Mathematics Journal, 26 (3), 1995, pp. 214–220.

Bonsangue, M. The effects of calculus workshop groups on minority achievement and persistence in mathematics, science, and engineering, unpublished doctoral dissertation, Claremont, CA, 1992.

Bonsangue, M. “An efficacy study of the calculus workshop model,” CBMS Issues in Collegiate Mathematics Education, 4, American Mathematical Society, Providence, RI, 1994, pp. 117–137.

Bonsangue, M., and Drew, D. “Mathematics: Opening the gates — Increasing minority students’ success in calculus,” in Gainen, J. and Willemsen, E., eds., Fostering Student Success in Quantitative Gateway Courses, Jossey-Bass, New Directions for Teaching and Learning, Number 61, San Francisco, 1995, pp. 23–33.

Bookman, J. and Blake, L.D. “Seven Years of Project CALC at Duke University — Approaching a Steady State?” PRIMUS, September 1996, pp. 221–234.

Bookman, J. and Friedman, C.P. Final report: Evaluation of Project CALC 1989–1993, unpublished manuscript, 1994.

Bookman, J. and Friedman, C.P. “A comparison of the problem solving performance of students in lab based and traditional calculus,” in Dubinsky, E., Schoenfeld, A.H., and Kaput, J., eds., Research in Collegiate Mathematics Education I, American Mathematical Society, Providence, RI, 1994, pp. 101–116.

Bookman, J. and Friedman, C.P. Student Attitudes and Calculus Reform, submitted for publication to School Science and Mathematics.

Buerk, D. “Getting Beneath the Mask, Moving out of Silence,” in White, A., ed., Essays in Humanistic Mathematics, MAA Notes Number 32, The Mathematical Association of America, Washington, DC, 1993.

Charles, R., Lester, F., and O’Daffer, P. How to Evaluate Progress in Problem Solving, National Council of Teachers of Mathematics, Reston, VA, 1987.

Cohen, D., ed. Crossroads in Mathematics: Standards for Introductory College Mathematics Before Calculus, American Mathematical Association of Two-Year Colleges, Memphis, TN, 1995.

Cohen, D. and Henle, J. “The Pyramid Exam,” UME Trends, July 1995, pp. 2 and 15.

COMAP. For All Practical Purposes, W. H. Freeman and Company, New York, 1991.

Committee on the Undergraduate Program in Mathematics (CUPM). “Assessment of Student Learning for Improving the Undergraduate Major in Mathematics,” Focus: The Newsletter of the Mathematical Association of America, 15 (3), June 1995, pp. 24–28.

Committee on the Undergraduate Program in Mathematics. Quantitative Literacy for College Literacy, MAA Reports 1 (New Series), Mathematical Association of America, Washington, DC, 1996.

Crannell, A. A Guide to Writing in Mathematics Classes (1993). Available upon request from the author, or from http://www.fandm.edu/Departments/Mathematics/Writing.html.

Crannell, A. “How to grade 300 math essays and survive to tell the tale,” PRIMUS, 4 (3), 1994.

Crosswhite, F.J. “Correlates of Attitudes toward Mathematics,” National Longitudinal Study of Mathematical Abilities, Report No. 20, Stanford University Press, 1972.

Davidson, N.A. Cooperative Learning in Mathematics: A Handbook for Teachers, Addison-Wesley Publishing Company, Inc., Menlo Park, CA, 1990.



Davis, B.G. Tools for Teaching, Jossey-Bass Publishers, San Francisco, 1993.

Douglas, R., ed. Toward a Lean and Lively Calculus: Report of the conference/workshop to develop curriculum and teaching methods for calculus at the college level (MAA Notes Series No. 6), Mathematical Association of America, Washington, DC, 1986.

Dubinsky, E. “A Learning Theory Approach to Calculus,” in Karian, Z., ed., Symbolic Computation in Undergraduate Mathematics Education, MAA Notes Number 24, The Mathematical Association of America, Washington, DC, 1992, pp. 48–55.

Dubinsky, E. “ISETL: A Programming Language for Learning Mathematics,” Comm. in Pure and Applied Mathematics, 48, 1995, pp. 1–25.

Emert, J.W. and Parish, C.R. “Assessing Concept Attainment in Undergraduate Core Courses in Mathematics,” in Banta, T.W., Lund, J.P., Black, K.E., and Oblander, F.W., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass Publishers, San Francisco, 1996, pp. 104–107.

Ewell, P.T. “To capture the ineffable: New forms of assessment in higher education,” Review of Research in Education, 17, 1991, pp. 75–125.

Farmer, D.W. Enhancing Student Learning: Emphasizing Essential Competencies in Academic Programs, King’s College Press, Wilkes-Barre, PA, 1988.

Farmer, D.W. “Course-Embedded Assessment: A Teaching Strategy to Improve Student Learning,” Assessment Update, 5 (1), 1993, pp. 8, 10–11.

Fennema, E. and Sherman, J. “Fennema-Sherman mathematics attitudes scales: Instruments designed to measure attitudes toward the learning of mathematics by females and males,” JSAS Catalog of Selected Documents in Psychology, 6 (Ms. No. 1225), 1976, p. 31.

Fenton, W.E. and Dubinsky, E. Introduction to Discrete Mathematics with ISETL, Springer, 1996.

Ferrini-Mundy, J., CCH Evaluation and Documentation Project, University of New Hampshire, Durham, NH, 1994.

Frechtling, J.A. Footprints: Strategies for Non-Traditional Program Evaluation, National Science Foundation, Washington, DC, 1995.

Fullan, M.G. The New Meaning of Educational Change, 2nd edition, Teachers College Press, New York, 1991.

Fullilove, R.E., and Treisman, P.U. “Mathematics achievement among African American undergraduates at the University of California, Berkeley: An evaluation of the mathematics workshop program,” Journal of Negro Education, 59 (3), 1990, pp. 463–478.

Ganter, S.L. “Ten Years of Calculus Reform and its Impact on Student Learning and Attitudes,” Association for Women in Science Magazine, 26 (6), Association for Women in Science, Washington, DC, 1997.

Gillman, L. Writing Mathematics Well, Mathematical Association of America, Washington, DC, 1987.

Glassick, C.E., et al. Scholarship Assessed: Evaluation of the Professoriate, Carnegie Foundation for the Advancement of Teaching, Jossey-Bass, San Francisco, CA, 1997.

Goetz and LeCompte, Ethnography and Qualitative Design in Educational Research, 1984.

Grasha, A.F. Teaching with Style: A practical guide to enhancing learning by understanding teaching and learning styles, Alliance Publishers, Pittsburgh, 1996.

Greenberg, M.J. Euclidean and Non-Euclidean Geometries (3rd ed.), W. H. Freeman, New York, 1993.

Griffith, J.V. and Chapman, D.W. LCQ: Learning Context Questionnaire, Davidson College, Davidson, North Carolina, 1982.

Hackett, G. and Betz, N.E. “An Exploration of the Mathematics Self-Efficacy/Mathematics Performance Correspondence,” Journal for Research in Mathematics Education, 20 (3), 1989, pp. 261–273.

Hagelgans, N.L. “Constructing the Concepts of Discrete Mathematics with DERIVE,” The International DERIVE Journal, 2 (1), January 1995, pp. 115–136.

Hastings, N.B. and Laws, P. Workshop Calculus: Guided Exploration with Review, Springer-Verlag, New York, vol. 1, 1996; vol. 2, 1998.

Hastings, N.B. “The Workshop Mathematics Program: Abandoning Lectures,” in D’Avanzo, C. and McNeal, A., Student-Active Science: Models of Innovation in College Science Teaching, Saunders Publishing Co., Philadelphia, 1997.

Heid, M.K. “Resequencing Skills and Concepts in Applied Calculus using the Computer as a Tool,” Journal for Research in Mathematics Education, 19 (1), NCTM, Reston, VA, 1988, pp. 3–25.

Herman, J.L., Morris, L.L., and Fitz-Gibbon, C.T. Evaluator’s Handbook, Sage Publications, Newbury Park, CA, 1987.

Hoaglin, D.C. and Moore, D.S., eds. Perspectives on Contemporary Statistics, Mathematical Association of America, Washington, DC, 1992.

Hofstadter, D. Gödel, Escher, Bach: an eternal golden braid, Basic Books, New York, 1979.

Houston, S.K., Haines, C.R., Kitchen, A., et al. Developing Rating Scales for Undergraduate Mathematics Projects, University of Ulster, 1994.

Hughes Hallett, D., Gleason, A.M., Flath, D.E., Gordon, S.P., Lomen, D.O., Lovelock, D., McCallum, W.G., Osgood, B.G., Pasquale, A., Tecosky-Feldman, J., Thrash, J.B., Thrash, K.R., and Tucker, T.W. Calculus, John Wiley & Sons, Inc., New York, 1992.

Hutchings, P. From Idea to Prototype: The Peer Review ofTeaching. American Association of Higher Education, 1995.

Hutchings, P. Making Teaching Community Property: A Menu forPeer collaboration and Peer Review. American Associationfor Higher Education, 1996.

Johnson, D. and R. Cooperative Learning Series FacilitatorsManual. ASCD.

Johnson D.W. and Johnson, F.P. Joining Together; Group Theoryand Group Skills, 3rd ed., Prentice Hall, Englewood Cliffs,N.J., 1987

Joint Policy Board for Mathematics. Recognition and Rewards inthe Mathematical Sciences, American Mathematical Society,Providence, RI, 1994.

Jonassen, D.H., Beissneer K., and Yacci, M.A. StructuralKnowledge: Techniques for Conveying, Assessing, andAcquiring Structural Knowledge. Lawrence ErlbaumAssociates, Hillsdale, NJ, 1993.

Keith, S.Z. “Explorative Writing and Learning Mathematics,”Mathematics Magazine, 81 (9), 1988, pp. 714–719.

Keith, S.Z. “Self-Assessment Materials for Use in Portfolios,” PRIMUS, 6 (2), 1996, pp. 178–192.

Kloosterman, P. “Self-Confidence and Motivation in Mathematics,” Journal of Educational Psychology, 80, 1988, pp. 345–351.

Kohn, A. “Effects of rewards on prosocial behavior,” Cooperative Learning, 10 (3), 1990, pp. 23–24.

Leron, U. and Dubinsky, E. “An Abstract Algebra Story,” American Mathematical Monthly, 102 (3), 1995, pp. 227–242.

Lester, F. and Kroll, D. “Evaluation: A New Vision,” Mathematics Teacher, 84, 1991, pp. 276–283.

Levine, A. and Rosenstein, G. Discovering Calculus, McGraw-Hill, 1994.

Loftsgaarden, D.O., Rung, D.C., and Watkins, A.E. Statistical Abstract of Undergraduate Programs in the Mathematical Sciences in the United States: Fall 1995 CBMS Survey, The Mathematical Association of America, Washington, DC, 1997.

Lomen, D. and Lovelock, D. Exploring Differential Equations via Graphics and Data, John Wiley & Sons, Inc., 1996.

Madison, B. “Assessment of Undergraduate Mathematics,” in Steen, L.A., ed., Heeding the Call for Change: Suggestions for Curricular Action, Mathematical Association of America, Washington, DC, 1992, pp. 137–149.

Martin, W.O. “Assessment of students’ quantitative needs and proficiencies,” in Banta, T.W., Lund, J.P., Black, K.E., and Oblander, F.W., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass, San Francisco, 1996.

Mathematical Association of America. Guidelines for Programs and Departments in Undergraduate Mathematical Sciences, Mathematical Association of America, Washington, DC, 1993.

Mathematical Sciences Education Board. Moving Beyond Myths: Revitalizing Undergraduate Mathematics, National Research Council, Washington, DC, 1991.

Mathematical Sciences Education Board. Measuring What Counts: A Conceptual Guide for Mathematics Assessment, National Research Council, Washington, DC, 1993.

Mathews, D.M. “Time to Study: The C4L Experience,” UME Trends, 7 (4), 1995.

Maurer, S. “Advice for Undergraduates on Special Aspects of Writing Mathematics,” PRIMUS, 1 (1), 1990.

McGowen, M. and Ross, S. Contributed Talk, Fifth International Conference on Technology in Collegiate Mathematics, 1993.

Miller, C.D., Heeren, V.E., and Hornsby, E.J., Jr. Mathematical Ideas (Sixth Edition), HarperCollins Publishers, 1990.

Moore, R.C. “Making the transition to formal proof,” Educational Studies in Mathematics, 27, 1994, pp. 249–266.

National Center for Education Statistics (Project of the State Higher Education Executive Officers). Network News, Bulletin of the SHEEO/NCES Communication Network, 16 (2), June 1997.

National Council of Teachers of Mathematics. Curriculum and Evaluation Standards for School Mathematics, National Council of Teachers of Mathematics, Reston, VA, 1989.

National Council of Teachers of Mathematics. Professional Standards for Teaching Mathematics, National Council of Teachers of Mathematics, Reston, VA, 1991.

National Council of Teachers of Mathematics. Assessment Standards for School Mathematics, NCTM, Reston, VA, 1995.

Novak, J.D. “Clarify with Concept Maps: A tool for students and teachers alike,” The Science Teacher, 58 (7), 1991, pp. 45–49.

National Science Board. Undergraduate Science, Mathematics and Engineering Education: Role for the National Science Foundation and Recommendations for Action by Other Sectors to Strengthen Collegiate Education and Pursue Excellence in the Next Generation of U.S. Leadership in Science and Technology, Report of the Task Committee on Undergraduate Science and Engineering Education, Neal, H., Chair, Washington, DC, 1986.

National Science Foundation. Undergraduate Curriculum Development in Mathematics: Calculus, Program announcement, Division of Mathematical Sciences, Washington, DC, 1987.

National Science Foundation. Undergraduate Curriculum Development: Calculus, Report of the Committee of Visitors, Treisman, P., Chair, Washington, DC, 1991.

O’Brien, J.P., Bressler, S.L., Ennis, J.F., and Michael, M. “The Sophomore-Junior Diagnostic Project,” in Banta, T.W., et al., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass, San Francisco, 1996, pp. 89–99.

Ostebee, A. and Zorn, P. Calculus from Graphical, Numerical, and Symbolic Points of View, Volume 1, Saunders College Publishing, 1997.

Penn, H. “Comparisons of Test Scores in Calculus I at the Naval Academy,” in Focus on Calculus, A Newsletter for the Calculus Consortium Based at Harvard University, 6, Spring 1994, John Wiley & Sons, Inc., p. 6.

Perry, W.G., Jr. Forms of Intellectual and Ethical Development in the College Years, Holt, Rinehart and Winston, New York, 1970.

Rash, A.M. “An Alternate Method of Assessment, Using Student-Created Problems,” PRIMUS, March 1997, pp. 89–96.

Redmond, M.V. and Clark, D.J. “A practical approach to improving teaching,” AAHE Bulletin, 1 (9–10), 1982.

Regan, H.B. and Brooks, G.H. Out of Women’s Experience: Creating Relational Experiences, Corwin Press, Inc., Thousand Oaks, CA, 1996.

Reynolds, B.E., Hagelgans, N.L., Schwingendorf, K.E., Vidakovic, D., Dubinsky, E., Shahin, M., and Wimbish, G.J., Jr. A Practical Guide to Cooperative Learning in Collegiate Mathematics, MAA Notes Number 37, The Mathematical Association of America, Washington, DC, 1995.

Roberts, A.W., ed. Calculus: The Dynamics of Change, MAA Notes Number 39, MAA, Washington, DC, 1996.

Roberts, C.A. “How to Get Started with Group Activities,” Creative Math Teaching, 1 (1), April 1994.

Rogers, C.R. “The Necessary and Sufficient Conditions of Therapeutic Personality Change,” Journal of Consulting Psychology, 21 (2), 1957, pp. 95–103.

Rogers, C.R. Freedom to Learn, Merrill Publishing, 1969.

Schoenfeld, A. Student Assessment in Calculus, Mathematical Association of America, Washington, DC, 1997.

Schwartz, R. “Improving Course Quality with Student Management Teams,” ASEE Prism, January 1996, pp. 19–23.

Schwingendorf, K.E., McCabe, G.P., and Kuhn, J. “A Longitudinal Study of the Purdue C4L Calculus Reform Program: Comparisons of C4L and Traditional Students,” Research in Collegiate Mathematics Education, CBMS Issues in Mathematics Education, to appear.

Science (entire issue). “Minorities in science: The pipeline problem,” 258, November 13, 1992.

Selden, A. and Selden, J. “Collegiate mathematics education research: What would that be like?” College Mathematics Journal, 24, 1993, pp. 431–445.

Selvin, P. “Math education: Multiplying the meager numbers,” Science, 258, 1992, pp. 1200–1201.

Silva, E.M. and Hom, C.L. “Personalized Teaching in Large Classes,” PRIMUS, 6, 1996, pp. 325–336.

Slavin, R.E. “When does cooperative learning increase student achievement?” Psychological Bulletin, 94, 1983, pp. 429–445.

Smith, D.A. and Moore, L.C. “Project CALC: An integrated laboratory course,” in Leinbach, L.C., et al., eds., The Laboratory Approach to Teaching Calculus, MAA Notes Number 20, Mathematical Association of America, Washington, DC, 1991, pp. 81–92.

Solow, A., ed. Preparing for a New Calculus, Conference Proceedings, Mathematical Association of America, Washington, DC, 1994.

Stage, F. and Kloosterman, P. “Gender, Beliefs, and Achievement in Remedial College Level Mathematics,” Journal of Higher Education, 66 (3), 1995, pp. 294–311.

Steen, L.A., ed. Calculus for a New Century: A Pump, Not a Filter, Mathematical Association of America, Washington, DC, 1988.

Steen, L.A., ed. Reshaping College Mathematics, MAA Notes Number 13, Mathematical Association of America, Washington, DC, 1989.

Stenmark, J.K., ed. Mathematics Assessment: Myths, Models, Good Questions, and Practical Suggestions, National Council of Teachers of Mathematics, Reston, VA, 1991.

Stevens, F., et al. User-Friendly Handbook for Project Evaluation, National Science Foundation, Washington, DC, 1993.

Tall, D. “Inconsistencies in the Learning of Calculus and Analysis,” Focus on Learning Problems in Mathematics, 12 (3 and 4), Center for Teaching/Learning of Mathematics, Framingham, MA, 1990, pp. 49–63.

Tall, D., ed. Advanced Mathematical Thinking, Kluwer Academic Publishers, Dordrecht, 1991.

Thompson, A.G. and Thompson, P.W. “A Cognitive Perspective on the Mathematical Preparation of Teachers: The Case of Algebra,” in Lacampagne, C.B., Blair, W., and Kaput, J., eds., The Algebra Learning Initiative Colloquium, U.S. Department of Education, Washington, DC, 1995, pp. 95–116.

Thompson, A.G., Philipp, R.A., Thompson, P.W., and Boyd, B.A. “Calculational and Conceptual Orientations in Teaching Mathematics,” in Aichele, D.B. and Coxford, A.F., eds., Professional Development for Teachers of Mathematics: 1994 Yearbook, The National Council of Teachers of Mathematics, Reston, VA, 1994, pp. 79–92.

Toppins, A.D. “Teaching by Testing: A Group Consensus Approach,” College Teaching, 37 (3), pp. 96–99.

Treisman, P.U. A study of the mathematics performance of black students at the University of California, Berkeley, unpublished doctoral dissertation, Berkeley, CA, 1985.

Trowell, S. The negotiation of social norms in a university mathematics problem solving class, unpublished doctoral dissertation, Florida State University, Tallahassee, FL, 1994.

Tucker, A., ed. Recommendations for a General Mathematical Sciences Program: A Report of the Committee on the Undergraduate Program in Mathematics, Mathematical Association of America, Washington, DC, 1981.

Tucker, A.C. and Leitzel, J.R.C., eds. Assessing Calculus Reform Efforts: A Report to the Community, Mathematical Association of America, Washington, DC, 1995.

West, R.D. Evaluating the Effects of Changing an Undergraduate Mathematics Core Curriculum which Supports Mathematics-Based Programs, UMI, Ann Arbor, MI, 1996.

Wheatley, G.H. “Constructivist Perspectives on Science and Mathematics Learning,” Science Education, 75 (1), 1991, pp. 9–21.

Wiggins, G. “A True Test: Toward More Authentic and Equitable Assessment,” Phi Delta Kappan, May 1989, pp. 703–713.

Wiggins, G. “The Truth May Make You Free, but the Test May Keep You Imprisoned: Toward Assessment Worthy of the Liberal Arts,” The AAHE Assessment Forum, 1990, pp. 17–31. (Reprinted in Steen, L.A., ed., Heeding the Call for Change: Suggestions for Curricular Action, Mathematical Association of America, Washington, DC, 1992, pp. 150–162.)

Winkel, B.J. “In Plane View: An Exercise in Visualization,” International Journal of Mathematical Education in Science and Technology, 28 (4), 1997, pp. 599–607.

Winkel, B.J. and Rogers, G. “Integrated First-Year Curriculum in Science, Engineering, and Mathematics at Rose-Hulman Institute of Technology: Nature, Evolution, and Evaluation,” Proceedings of the 1993 ASEE Conference, June 1993, pp. 186–191.

Addresses of Authors

Dwight Atkins, Mathematics Department, Surry Community College, Dobson, NC 27017; [email protected]

Janet Heine Barnett, Department of Mathematics, University of Southern Colorado, 2200 Bonforte Boulevard, Pueblo, CO 81001-4901; [email protected]

Steven F. Bauman, William O. Martin, Department of Mathematics, North Dakota State University, PO Box 5075, Fargo, ND 58105-5075; [email protected]

Deborah Bergstrand, Department of Mathematics, Williams College, Williamstown, MA 01267; [email protected]

Dorothee Jane Blum, Mathematics Department, Millersville University, P.O. Box 1002, Millersville, PA 17551-0302; [email protected]

William E. Bonnice, Department of Mathematics, Kingsbury Hall, University of New Hampshire, Durham, NH 03824-3591; [email protected]

Martin Vern Bonsangue, Department of Mathematics, California State University, Fullerton, Fullerton, CA 92634; [email protected]

Jack Bookman, Mathematics Department, Box 90320, Duke University, Durham, NC 27708-0320; [email protected]; Charles P. Friedman, Mathematics Department, University of Pittsburgh, Pittsburgh, PA

Joann Bossenbroek, Mathematics Department, Columbus State Community College, P.O. Box 1609, Columbus, OH 43216-1609; [email protected]

David M. Bressoud, Mathematics and Computer Science Department, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105; [email protected]

Regina Brunner, Department of Mathematics and Computer Science, Cedar Crest College, 100 College Drive, Allentown, PA 18104; [email protected]

G. Daniel Callon, Department of Mathematical Sciences, Franklin College, Franklin, IN 46131; [email protected]

Judith N. Cederberg, Department of Mathematics, St. Olaf College, Northfield, MN 55057; [email protected]

John C. Chipman, Department of Mathematical Sciences, Oakland University, Rochester, MI 48309-4401; [email protected]

Ellen Clay, Stockton State College, Jim Leeds Road, Pomona, NJ 08240; [email protected]

C. Patrick Collier, Department of Mathematics, University of Wisconsin, Oshkosh, Oshkosh, WI 54901; [email protected]

Annalisa Crannell, Department of Mathematics, Franklin and Marshall College, Lancaster, PA 17604-3003; [email protected]

Steven A. Doblin, Wallace C. Pye, Department of Mathematics, University of Southern Mississippi, Box 5165, Hattiesburg, MS 39406-5165; [email protected]

Ed Dubinsky, Mathematics and Computer Science Department, Georgia State University, Atlanta, GA 30303-3083; [email protected]

Steven R. Dunbar, Department of Mathematics and Statistics, University of Nebraska-Lincoln, P.O. Box 880323, Lincoln, NE 68588-0323; [email protected]

Charles E. Emenaker, Mathematics, Physics and Computer Science Department, Raymond Walters College, University of Cincinnati, 9555 Plainfield Road, Blue Ash, OH 45236-1096; [email protected]

John W. Emert, Charles R. Parish, Department of Mathematical Sciences, Ball State University, Muncie, IN 47306; [email protected]

Deborah A. Frantz, Department of Mathematics and Computer Science, Kutztown University, Kutztown, PA 19530; [email protected]

Michael D. Fried, Department of Mathematics, University of California at Irvine, Irvine, CA 92717; [email protected]

Susan L. Ganter, Director, Program for the Promotion of Institutional Change, American Association for Higher Education, One Dupont Circle, Suite 360, Washington, DC 20036; [email protected]

Jacqueline Brannon Giles, 13103 Balarama Drive, Houston, TX 77099; [email protected]

Bonnie Gold, Department of Mathematics, Monmouth University, West Long Branch, NJ; [email protected]

Richard A. Groeneveld, W. Robert Stephenson, Department of Statistics, Iowa State University, Ames, IA 50011; [email protected], [email protected]

Nancy L. Hagelgans, Department of Mathematics, Ursinus College, P.O. Box 1000, Collegeville, PA 19426-0627; [email protected]

Joel David Hamkins, Mathematics Department 1S-215, CUNY-CSI, 2800 Victory Boulevard, Staten Island, NY 10314; [email protected]

Nancy Baxter Hastings, Department of Mathematics and Computer Science, Dickinson College, P.O. Box 1773, Carlisle, PA 17013-2896; [email protected]

M. Kathleen Heid, 271 Chambers Building, University Park, PA 16802; [email protected]

Laurie Hopkins, Department of Mathematics, Columbia College, Columbia, SC 29203; [email protected]

Pao-sheng Hsu, Department of Mathematics and Statistics, University of Maine, Orono, Orono, ME 04469; [email protected]

Philip Keith, English Department, St. Cloud State University, 720 Fourth Street, St. Cloud, MN 56301-4498; [email protected]

Sandra Z. Keith, Mathematics Department, St. Cloud State University, 720 Fourth Street, St. Cloud, MN 56301; [email protected]

Patricia Clark Kenschaft, Department of Mathematics and Computer Science, Montclair State University, Upper Montclair, NJ 07043; [email protected]

Alan P. Knoerr, Michael A. McDonald, Department of Mathematics, Rae McCormick, Department of Education, Occidental College, Los Angeles, CA 90041; [email protected], [email protected], [email protected]

John Koker, Mathematics Department, University of Wisconsin-Oshkosh, Oshkosh, WI 54901; [email protected]

A. Darien Lauten, Karen Graham, Joan Ferrini-Mundy, Department of Mathematics, Kingsbury Hall, University of New Hampshire, Durham, NH 03824-3591; [email protected]

David Lomen, Mathematics Department, University of Arizona, Tucson, AZ 85721; [email protected]

William A. Marion, Department of Mathematics and Computer Science, Valparaiso University, Valparaiso, IN 46383; [email protected]

Mark Michael, Department of Mathematics, King’s College, Wilkes-Barre, PA 18711; [email protected]

Robert Olin, Lin Scruggs, Department of Mathematics, Virginia Polytechnic Institute & State University, Blacksburg, VA 24061-0123; [email protected]

Albert D. Otto, Cheryl A. Lubinski, Carol T. Benson, Department of Mathematics, Illinois State University, Campus Box 4520, Normal, IL 61790-4520; [email protected]

Judith A. Palagallo, William A. Blue, Department of Mathematical Sciences, The University of Akron, Akron, OH 44325; [email protected]

Charles Peltier, Department of Mathematics, Saint Mary’s College, Notre Dame, IN 46556; [email protected]

Eileen L. Poiani, Department of Mathematics, St. Peter’s College, Jersey City, NJ 07306-5997; [email protected]

Agnes M. Rash, Department of Mathematics and Computer Science, Saint Joseph’s University, 5600 City Avenue, Philadelphia, PA 19131-1395; [email protected]

Marilyn L. Repsher, Department of Mathematics; J. Rody Borg, Department of Economics, Jacksonville University, Jacksonville, FL 32211; [email protected]

Catherine A. Roberts, Mathematics Department, Northern Arizona University, Flagstaff, AZ 86011-5717; [email protected]

Sharon Cutler Ross, Department of Mathematics, DeKalb College, 555 N. Indian Creek Drive, Clarkston, GA 30021-2395; [email protected]

Carolyn W. Rouviere, 377 Coreopsis Dr., Lancaster, PA 17606

Keith E. Schwingendorf, Purdue University North Central, SWRZ 109, 1401 South US 421, Westville, IN 46391-9528; [email protected]

Marie P. Sheckels, Department of Mathematics, Mary Washington College, Fredericksburg, VA 22401; [email protected]

Patricia Shure, Department of Mathematics, University of Michigan, Ann Arbor, MI 48109-1003; [email protected]

Joel Silverberg, Mathematics Department, Roger Williams University, Bristol, RI 02809; [email protected]

Linda R. Sons, Department of Mathematical Sciences, Northern Illinois University, DeKalb, IL 60115; [email protected]

Elias Toubassi, Donna Krawczyk, Department of Mathematics, Building #89, University of Arizona, Tucson, AZ 85721; [email protected]

Sandra Davis Trowell, 2202 Glynndale Drive, Valdosta, GA 31602; [email protected]

Janice B. Walker, Department of Mathematics and Computer Science, Xavier University, Cincinnati, OH 45207; [email protected]

Richard West, Department of Mathematical Sciences, US Military Academy, West Point, NY 10996-1786; [email protected]

Alvin White, Mathematics Department, Harvey Mudd College, Claremont, CA 91711; [email protected]

Brian J. Winkel, Department of Mathematical Sciences, United States Military Academy, West Point, NY 10996; [email protected]