Prospectus Version 3.0


    How Students Perceive Their Learning:

    An exploration of student reflections on assessment

    Amanda L. Wilson

    Appalachian State University


    Abstract

This prospectus presents the proposal of a thesis for completion of the requirements of an

    EdS in Higher Education at Appalachian State University. It presents the case for an action

    research study based in Classroom Research on student perceptions of assessment in a beginning

    Spanish 1 course. This paper presents a review of literature emphasizing a culture shift in

    education from teaching to student learning and the way assessment and accreditation procedures

are playing a part in this shift. Student perception is noted as being less of a focus in published

literature, which is one of the reasons for this proposed study. The purpose of the proposed study

    would be to explore the reflections of students in a beginning level Spanish class towards various

    forms of traditional and authentic assessment tools. The context, participants, research plan, and

plan for evaluation are explicated in detail. A timeline is provided for the study that would

    have it begin in January of 2011 and end in April of the same year. Possible methods of showing

    validity are discussed, along with issues of the ethics and subjectivity of the study. A lengthy

    bibliography is provided followed by appendices that concisely provide the research tools and an

    informed consent form for participants.

Keywords: student perception, traditional assessment, authentic assessment, accreditation,

    action research, Classroom Research


    Statement of the Problem

    There has been a shift in higher education from teaching objectives to student learning

outcomes. Much of this drive has been precipitated by a shift in focus by regional accreditors of

schools from how many resources a school has to how educationally effective it is. The focus on

    student learning outcomes facilitates a culture of continual improvement of the educational

    institution as a whole that strives to engage the entire academic community. Authentic

assessment measures, those methods of assessment that not only measure achievement as an end

product but are learning experiences as well, are becoming more common as supplements to

more traditional assessment measures, providing the valuable data needed to inform this culture of
change and improvement. The outcomes assessment movement in foreign language classes has

    embraced these authentic assessment measures, mostly in the form of portfolio-style projects for

    students. Unfortunately, there is less research on how students are receiving these various forms

    of assessment and what they perceive as the benefits or drawbacks of each. The purpose of this

    study is to explore the reflections of students in a beginning level Spanish class towards various

    forms of traditional and authentic assessment tools.

    Significance of the Problem

    An Action Research Study Based in Classroom Research

    The focus of my study will be at the classroom-level, specifically focused on students

    from my beginning Spanish 1 classes. This will be an action research study that will build out of

    Classroom Research, as developed through the work of Dr. Patricia Cross. Cross defines

Classroom Research as "ongoing and cumulative intellectual inquiry by classroom teachers into

the nature of teaching and learning in their own classrooms" (1996, p. 2). In her text on action

research, Hendricks states that the purpose of action research is "for practitioners to investigate


and improve their practice" (2009, p. 3). These two perspectives on research as investigation of

teachers into their practice for the purpose of improved student learning allow me to approach

    the issue of student perspectives on assessment in a very localized but very powerful way. By

    seeking out the opinions of my own students on the specific assessment tools I have used in their

    classes, we will have a concrete context in which we can explore their insights on how they learn

    and the way assessment affects their learning. These reflections can then directly impact my

    future assessment planning.

    Cross lists the characteristics of Classroom Research as follows: Learner-Centered,

Teacher-Directed, Collaborative, Context-Specific, Scholarly, Practical and Relevant, and
Continual (1996, p. 2). My study will focus on my students' perspectives of learning. It will of

    course be facilitated and directed by me, as their former teacher. It is collaborative in the sense

    that my students will help direct my research, conclusions, and future behaviors based on their

    feedback. This study is context-specific because it focuses on how these assessments are applied

    in, not just a foreign language class, but specifically a beginning Spanish class. The project is

scholarly because it builds on ideas and insights researched and presented in the literature review

    below. My study is practical and relevant because I will use the data I gather to form conclusions

    that will inform my future practice. Finally, my study is continual in the sense that it will serve

    as a foundation for future work on student reflections and working with different types of

    assessment, which will undoubtedly play a part in my doctoral research pursuits.

Cross comments that a Classroom Research project "is not a one-shot effort that is

completed, published, and assumed to contribute one more brick to building the wall of truth"

(1996, p. 12). This study does not aim to predict how all students will view and appreciate

    various assessment techniques. Nor should the conclusions drawn be assumed to show any one


    truth. This will be an exploration of a select few students taking a specific course taught a certain

    way by me at one given time. The results of this study would most assuredly be different if the

    study were duplicated. The purpose of this study is to explore the reflections of specific students

    in a specific beginning level Spanish class towards various forms of traditional and authentic

    assessment tools. I hope that by gathering and reflecting on their various opinions, I can make an

    informed decision about how to make the tools I use more effective for the learning experiences

    of my future students.

    The process is one of trial and error and any study will give specific information that may

be difficult to generalize to other populations. Cross says that Classroom Research "is based on
the premise that generalizations across classrooms are, at best, tentative hypotheses, to be tested

within the specific context of a given classroom" (1996, p. 12). This study aims to create another

    piece of the puzzle. By collecting student reflections on assessment, it aims to give a tiny insight

    into how some students perceive these tools. The hope is that those insights, while very context-

    specific, might inspire me, and hopefully others, to continue asking the questions that must be

    asked in the classroom: How do students learn? How can teachers facilitate student learning?

    What tools can educators employ to create the most effective learning environment possible?

At the very foundation of what it is to be an educator must be the hope, desire, and

    aspiration to help make sure it just keeps getting better.

    Fundamental Assumptions

    I hold a few fundamental assumptions going into this study. The first assumption that has

    always rung true for me is that students are experts at being students. I teach university-level

beginning Spanish 1, and my students are primarily freshmen and sophomores, the very youngest

    being 17 years old. Said another way, every student in my class has been going to school for at


    least 12 years. While few have any knowledge of pedagogy, they have some awareness of how

    they learn. I believe that by asking students to reflect on the way they learn and what assessment

    tools have benefitted their learning experience in my beginning Spanish 1 courses, I can learn to

    see my classroom from their perspective and improve the way I approach and plan future classes.

    Students are not teachers and may not always see what a teacher does or understand why they do

    it, but by gathering data from these learning experts, I can better understand how to facilitate

    their learning.

    The second assumption that I hold going into this study is that good assessment is not

simply a measure of student accomplishment but a method to engage and promote learning. I
chose to focus on assessment tools in this study because I worry that the goals of my course may

not align well with the ways I assess learners. I see this as a major worry across various institutions

    of learning and from various colleagues and students. It is too common to hear negative

    anecdotes about the business of education. These pessimistic views do not truly represent the

educational system, but there is a grain of truth in every story. I chose to pursue this study

    because I want to find the places where my assessment practices are aligning with my course

    goals and continue to pursue those avenues while discovering weaknesses in my practices that I

    can improve upon. I believe that by focusing on studying my current assessment methods, I can

    determine where students are simply going through the motions of assessment to demonstrate

    accomplishment and where they are really being engaged by these tools. That is not to say that I

    believe accomplishment and engagement are mutually exclusive, simply that I want to make sure

    I am working to promote learning. If student learning becomes more effective through the use of

    the tools, I hope that demonstration of accomplishment and levels of engagement will also

    benefit.


    It Just Keeps Getting Better

By conducting this action research study, founded in Cross's theories on Classroom

    Research, and focused on my specific students and assessment tools, I believe I will gain

    valuable insight into student perspectives on various forms of traditional and authentic

    assessment tools and lay a foundation for me to continue improving my teaching practice. The

    most important objective is to learn to be the most effective facilitator of student learning that I

can be through continual reflection and improvement.

    Literature Review

As I reviewed published literature in relation to the topics of assessment, accreditation, and
student perceptions, a shifting picture began to emerge. The educational system, due to many

    internal and external factors, is shifting its focus from teaching to learning, from teacher to

    student, and from product to process. Instead of starting educational planning with the discipline

    being taught, planning is beginning with thinking about what educators want students to learn.

    The idea of continual improvement of educational practices by means of a cycle of assessment,

reflection, and improvement is being disseminated and accepted widely, much of which is

    happening via the regional accrediting agencies. This is sparking collaboration amongst the

    entire academic community. Alternative methods of assessing teaching and learning are being

used, and the focus is settling on student learning and continual assessment and improvement.

    But what do students think? Is this shifting focus getting to them and making a difference? There

    is much less written about this. The review below will highlight this shift and a bit of the force

behind it in the form of the accreditors. It will touch on the purpose of all this assessment and the

    way it is helping to bring some of the various campus populations together to collaborate. It will

    explore some of the ways alternative assessments are offering helpful options and the way the


    shift to student learning outcomes, specifically in foreign languages, is playing out. To round it

out, I will bring in the little that I was able to find concerning students' perspectives on this

    process to complete the picture.

    A Focus Shift: Student Learning Outcomes

    A culture change is occurring in higher education that shifts the primary focus of

    educators from teaching the subject matter of specific disciplines to a perspective of student

learning (Allen, 2004, p. 1). As departmental, organizational, and institutional cultures undergo

    change, and as the focus of that change is less on teaching and more on learning, a commitment

to sustainable outcomes assessment becomes essential (Hernon et al., 2006, p. 1). According to
Allen, this type of assessment occurs when "empirical data on student learning is used to refine

programs and improve student learning" (2004, p. 2). The focus here shifts to how

    effective programs are at facilitating student learning instead of perhaps a more traditional

    perspective on assessment that would focus solely on whether or not the student is showing

    mastery of the topic being taught.

At the classroom level, Palmer encourages viewing student assessment as "a strategic

tool for enhancing teaching and learning" (2004, p. 194). He goes on, a few pages later, to add

that "continuous assessment starting early in the semester has the benefit of quickly identifying

those students falling behind and perhaps at risk of dropping out, so remedial action can be

taken" (p. 198). With this focus, assessment becomes more formative than

    summative, though it can certainly be both.

Allen comments that while classroom assessment examines learning in the day-to-day

classroom, "program assessment systematically examines student attainment in the entire

curriculum" (2004, p. 1). In their 2001 work, Ratcliff et al. point out that the continuous


improvement cycle should begin with clear departmental goals that identify "what a student can

expect to gain as a result of studying a particular field or discipline" (p. 25). Hernon et al. add

that "programs and institutions need to develop a strong and sustainable commitment to

    assessment as a process and as a means to improve learning based on explicit student learning

outcomes" (2006, p. 11). Ratcliff et al. further point out that while a college's or university's

general goals for student achievement can be measured at the university level, "the accreditation

self-study must address student academic achievement in the discipline...departments and

programs must contribute to the self-study by assessing their students' learning" (2001, p. 32).

This process is not, at least at its core, one of displaying big numbers with no real meaning or
impact. Departments must assess student learning so that not only are they able to show

achievement but they are also able to gather the information necessary for improving student learning.

Classroom, program or department, and institutional assessment together form a foundation for

    accreditation. As this culture shift continues to push the focus towards student learning,

    accreditation standards continue to link student outcomes assessment to continued accreditation

of programs and schools (Ratcliff et al., 2001, p. 13). The regional accreditors then become one

of the primary external forces helping to drive this shift of focus to student learning outcomes.

    Regional Accreditors on Assessment

Allen tells us on page 18 of her 2004 text that accrediting organizations "generally

focus on two major issues: capacity and effectiveness." She goes on to explain that capacity is

the bean counting of the process: tallies are taken of the resources an institution has to

support its students, such as libraries, technology, physical space, and student support services.

    The focus, however, has really turned more towards a long-term commitment to improving

student learning (Hernon et al., 2006, p. 1). Allen also says that accrediting organizations expect


campuses to document their impact on student learning (2004, p. 18) and that "when accrediting

bodies require assessment, campuses pay attention" (2004, p. 2). She cautions, however, that

"assessment should be implemented because it promotes student learning, not because an

external agency requires it" (Allen, 2004, p. 2). The entire purpose behind this drive toward

    student learning is that assessment must be followed up by reflection and improvement.

    Otherwise, assessment becomes an exercise in futility instead of a powerful agent of change.

    Closing the Loop

On pages 163 and 164, Allen imparts some friendly suggestions, one of which is to

"close the loop," stating that good assessment has impact (2004). The foundation of all this
assessment is that it will drive change towards the continual improvement of the quality of the

educational system. As Ratcliff et al. put it, "Assessment and accreditation are both premised on

the importance of quality assurance" (2001, p. 17). If the data collected through the assessment

    procedures is not analyzed for methods of improvement and if those methods are not

    implemented, the whole process is invalidated. Assessment is only the beginning of the cycle.

    The academic community must work together to be sure that the data collected is used

    effectively.

    A Community of Assessors: Collaboration is Key

To move toward these lofty goals, faculty, institutional research offices, and "everyone in

the educational enterprise...has [the] responsibility for maintaining and improving the quality of

services and programs" (Ratcliff et al., 2001, p. 17). Assessment of student learning outcomes

"includes all members of the [educational] community as they strive to contribute to and enhance

the educational enterprise" (Ratcliff et al., 2001, p. 17). The various smaller communities within

    colleges and universities have to coordinate their efforts and pool their resources to make sure


    the assessment process is as productive and effective as possible. One type of tool that is

becoming more popular amongst these communities is authentic assessment. More

formative in nature than traditional assessment tools, authentic assessments are adding more and richer data to the

    assessment process.

Authentic Assessment as a Means to Focus on Student Learning Outcomes

    To understand what tools will work best for these assessments, it is best to start with an

idea of what assessment should aim to do. Brown et al. expound the functions of assessment on

page 47 of their 1999 work as six points:

1. Capturing student time and attention.
2. Generating appropriate student learning activity.
3. Providing timely feedback which students pay attention to.
4. Helping students to internalize the discipline's standards and notions of quality.
5. Marking: generating marks or grades which distinguish between students or which enable pass/fail decisions to be made.
6. Quality assurance: providing evidence for others outside the course (such as external examiners) to enable them to judge the appropriateness of standards on the course.

    With these purposes in mind, assessment methods can be examined to determine their validity.

    On pages 62 and 63, the authors discuss traditional unseen written exams and how they function

as assessments: "In particular, this assessment format seems to be at odds with the most

important factors underpinning successful learning...there is cause for concern that traditional

unseen written exams do not really measure the learning outcomes which are the intended

purposes of higher education" (Brown et al., 1999). Palmer echoes these concerns on page 194 of

his 2004 paper on authenticity in assessment, stating that traditional forms of assessment can


"encourage surface learning rather than deep learning." Banta goes a bit further, with her colorful

simile to discourage purchasing more traditional assessment measures for the purposes of

improving student learning: "Just as weighing a pig will not make it fatter, spending millions to

test college students is not likely to help them learn more" (2007, p. 2). Seeing some deficiencies

in traditional forms of assessment, experts have begun describing what might improve

attempts at assessment.

Watson points out a need for "more authentic, learner-friendly methods to encourage

[student] engagement" (2008, p. 1), which seems to align with Brown et al.'s first and second

functions of assessment listed above: capturing student time and attention and generating
appropriate student learning activity (1999, p. 47). Watson goes on to point out

that "the assessment of authentic performance...has the potential to address a number of

contemporary criticisms of assessment" (2008, p. 1), such as testing not necessarily being a

method of improving learning, testing promoting surface rather than deeper learning, and

testing not being aligned with learning objectives, as noted above by Banta, Palmer, and Brown

et al., respectively. Banta also notes that "authentic and valid assessment approaches must be

developed and promoted as viable alternatives to scores on single-sitting, snapshot measures of

learning that do not capture the difficult and demanding intellectual skills that are the true aim of

a college education" (2009, p. 3). She continues that "the point is knowledge creation,

not knowledge reproduction" (Banta, 2009, p. 4). Her concerns about assessment draw a picture

of a system that needs improvement and, as Brown et al. also suggest, alignment with learning,

    not just an attempt to demonstrate that learning has or has not occurred.

Brown et al. bring it together nicely when they state that ultimately, assessment should

"be for students...[as] a formative part of their learning experience" and that "students who


develop their test-taking skills also tend to succeed in assessment most," regardless of whether

or not they are the most qualified in their field (1999, p. 58). In other words, their first two

    functions of assessment, capturing student time and attention and generating appropriate

    student learning activity, along with the fourth and sixth, helping students to internalize the

discipline's standards and notions of quality and providing for quality assurance, are just as

important as the fifth, marking (Brown et al., 1999, p. 47). It is marking which tends to get all

    the attention but most often seems more prone to be partially invalid in the case of many

    traditional assessment measures, whereas authentic assessment puts the focus on student learning

outcomes. Because the focus becomes the experience of learning, rather than grading, "the onus
is on lecturers to be able to demonstrate that assessment is measuring well what it is intended to

measure," which thereby facilitates an increase in the validity of the assessments (Brown et al.,

1999, p. 59).

    It is important to note that the point is not to throw away traditional assessment measures.

They can still serve some of the functions of assessment well. As Ratcliff et al. state on page 28

of their 2001 text, formative and summative assessment methodologies provide the department

or program with evidence of their students' learning. While traditional summative assessments

    can, and should, support the functions of assessment processes, their results cannot stand alone to

inform the process of continual improvement of learning (Ratcliff et al., 2001, p. 28). To really

get at that sixth purpose of quality assurance, a balance is needed (Brown et al., 1999, p. 47).

    Authentic assessment measures, matched with more traditional assessment measures, add the

    necessary formative piece to the assessment puzzle and provide the necessary data for improved

    student learning.

    Outcomes Assessment in Foreign Languages


This imbalance and lack of alignment between assessment and learning can be seen in the

field of foreign languages as well. "Trends over the last couple of decades in foreign language

methodologies have espoused communicative goals of instruction...yet, examinations in foreign

language courses typically are pen and paper exercises that single out discrete points of grammar

or vocabulary" (Higgs, 1987, p. 1). Higgs raised the warning almost twenty-five years ago that if

foreign language educators really want to set communication as a goal for their students, then

assessment procedures must test for communicative function (1987, p. 1). While these pen and

paper exams are still commonplace, language classes have also seen an influx of authentic

assessment measures (Sullivan, 2006, p. 590).

Most of these assessments have come in the form of portfolio-style projects. Banta

    commented in her 2007 article on assessment that portfolio assessments would be the most

    authentic because students develop the content themselves (p. 4). Studies of English as a Foreign

    Language (EFL) learners have found that portfolio-style assessments have contributed to student

    learning, especially when combined with other assessment measures, and that portfolio

assessments help students take ownership of their learning (Barootchi et al., 2002; Caner,

2010). Additionally, one study found that some EFL students in writing courses preferred the

portfolio assessments over more traditional assessments (Caner, 2010, p. 1), though research into

    student preferences on assessment is rare.

    There are many styles of portfolio assessments depending upon the specific needs of the

assessment, but seemingly the most popular version in language learning is the self-assessment

    portfolio. The European Language Portfolio (ELP) was the model for the American adaptations:

LinguaFolio and the Global Language Portfolio (Cummings et al., 2009, p. 1). These portfolios

present "a learner-empowering alternative to traditional assessment" (Cummings et al., 2009, p.


    1). Moeller points toward these self-assessment models as more valid forms of assessment than

more traditional assessment models (Moeller, 2010, "Self-assessment in the foreign language

classroom"). She describes LinguaFolio, in particular, as "a portfolio that focuses on student self-

assessment, goal setting and collection of evidence of language achievement" (Moeller, 2010,

"LinguaFolio"). The students set their language goals and determine when their goals are met

based on the evidence that they collect of their own work (Fasciano, 2010, slide 9). Moeller

    points out that if language educators are using LinguaFolio effectively, it will necessitate moving

    away from teacher-centered methodologies and toward learning-centered outcomes because it is,

by its very definition, a learner-centered self-assessment tool that facilitates the processes of goal
setting and self-reflection and establishes intrinsic motivation in students (Moeller, 2010,

"LinguaFolio"). Brown et al. also mention that a major advantage of these types of assessments is

that they promote intrinsic motivation through personal involvement because the student is

taking charge of their learning (1999, p. 75).

    Self-assessment portfolios are not the ultimate answer to the issues that traditional

    assessment has raised. Student self-assessment comes with its own set of new issues. Moeller

points out three main disadvantages to self-assessment in her paper "Self-assessment in the

foreign language classroom": it can be unreliable because students are not experts on assessment,

students can cheat, and few students engage in it (2010, p. 3). Still, these types of portfolios can

become a part of the solution if used carefully and conscientiously.

It is essential to remember that there is no one ultimate assessment, but by using various

    methods and blending authentic and traditional assessment measures, the necessary data can be

    collected to inform change and promote student learning. The last piece of the puzzle is to

involve the most important part of the academic community in the discussion: the students.


Student Perceptions of Assessment

    While the focus has apparently shifted from teachers to students, the assessment process

is still very much a top-down one. "At present, students often feel that they are excluded from

the assessment culture, and that they have to use trial and error to make successive

approximations towards the performances that are being sought in assessed work" (Brown et al.,

1999, p. 58). Because of the reflections gathered from their students, Brown et al. encourage

innovation in assessment (1999, p. 81). They also stated that "to some students, conventional

forms of assessment appear to have no relevance...rather than involving them in genuine

learning" (Brown et al., 1999, p. 81). Instead they found that "students appreciate assessment
tasks which help them to develop knowledge, skills and abilities which they can use in other

contexts," and they encourage assessment which incorporates elements of choice because it

can give students a greater sense of ownership and personal involvement in the work and avoid

the demotivating perception that they are simply going through routine tasks (Brown et al.,

1999, p. 81). From these findings by Brown et al. (1999) and Caner's finding that some EFL

students preferred portfolio assessments to traditional ones (2010, p. 1), it seems that students

prefer having some form of authentic assessment involved in the process. Unfortunately,

    due to the lack of research in this area, anything more would be merely idle conjecture.

    The students are the untapped resource here. Their reflections could provide valuable

    information on the assessment process. To that end, the purpose of this study is to gather some of

    those reflections towards various forms of traditional and authentic assessment tools.

    Research Questions

    As previously stated, the purpose of this action research study is to explore the reflections

    of students in a beginning level Spanish class towards various forms of traditional and authentic


    assessment tools. To this end, I plan to focus my research on the overarching question: What are

    the perceptions of undergraduate students related to traditional and authentic assessments used in

    an introductory Spanish course? Both the interviews and surveys, each of which will make up the

    methods of data collection in this study and will be described in detail in the methodology

    section below, will attempt to solicit information related to this question. To support this

    overarching question, I will explore the following five sets of sub-questions:

    1. What do students think are the benefits or limitations of each type of assessment on

    their learning? Do students think these assessments reflect their learning?

2. How do students feel that these assessments can enhance or detract from their
learning experience?

    3. What factors do students feel affect the impact of each type of assessment on their

    learning?

    4. What preferences do students express toward each type of assessment? What are

    their reasons for these preferences?

    5. What recommendations do students have for enhancing their perceived

    effectiveness of each type of assessment?

The first set of sub-questions, "What do students think are the benefits or limitations of each type

of assessment on their learning?" and "Do students think these assessments reflect their

learning?", aims to organize the reflections of students on what is working and what is not in

    relation to the effect, if any, these assessments have on their learning, as well as whether or not

    they believe each type of assessment is able to capture and demonstrate what they are learning.

    There are both interview and survey questions that will attempt to get students to reflect on these

    aspects of the assessments.


The second sub-question, "How do students feel that these assessments can enhance or

detract from their learning experience?", is meant to get at how, and whether, students perceive

these assessments as having any effect on their entire process of learning. Since Brown et al.

listed one of the functions of assessment as "generating appropriate student learning activity"

    (1999, p. 47), it seemed important to ask students to reflect on whether they consider these

assessments part of their learning. Since this question asks only for open-ended responses, it

    will only be addressed in the interview questions, not those of the survey.

The third sub-question, "What factors do students feel affect the impact of each type of

assessment on their learning?", will focus the data gathered from student reflections on how the
specific elements of each type of assessment changed or influenced their learning. Like the

    previous one, this sub-question calls for open-ended feedback that will most likely come almost

    exclusively from the interview questions.

The fourth sub-question set, "What preferences do students express toward each type of

assessment?" and "What are their reasons for these preferences?", will explore where student

    preference falls among the types of assessments and their reasons for those preferences. This will

    help to show if it is the type of assessment that is preferred by the students or only specific

    aspects of the specific assessments the students have been exposed to that they prefer. Some of

    this data will come from the survey but the majority of it will be collected via the interviews.

The final sub-question, "What recommendations do students have for enhancing their

perceived effectiveness of each type of assessment?", aims to have students hypothesize about

    ways to make each type of assessment more useful to them. This question will be almost

    exclusively answered during the interviews but there is also the possibility that some of this

    information may be volunteered at the end of the survey when students are asked for any


    additional comments.

    Figure 1.1 below summarizes which data collection methods, described in the

    methodology section of this work, will most likely address each research question listed above.

Figure 1.1: Research Questions and Related Data Collection Methods Matrix

Overarching question: What are the perceptions of undergraduate students related to traditional
and authentic assessments used in an introductory Spanish course?
Data collection methods: interview questions; survey statements

Sub-Q 1: What do students think are the benefits or limitations of each type of assessment on
their learning? Do students think these assessments reflect their learning?
Data collection methods: interview questions; survey statements

Sub-Q 2: How do students feel that these assessments can enhance or detract from their learning
experience?
Data collection methods: interview questions

Sub-Q 3: What factors do students feel affect the impact of each type of assessment on their
learning?
Data collection methods: interview questions

Sub-Q 4: What preferences do students express toward each type of assessment? What are their
reasons for these preferences?
Data collection methods: interview questions; survey statements

Sub-Q 5: What recommendations do students have for enhancing their perceived effectiveness
of each type of assessment?
Data collection methods: interview questions

    Methodology

    Context/Setting

    I will conduct my research by gathering data from former students of my fall 2010

    beginning Spanish college courses. I teach at a public, state-funded institution, and although we

    are moving towards being a more research-focused institution, the current focus is aligned more

    with teaching. I am fortunate to have a great deal of freedom and control in my classroom. While

    it is true that my general curriculum and my textbook are mandated by the tenured faculty of my

    department, I am free to choose whatever path I believe will best help my students achieve the

    course goals. That is to say that while I am not free to choose what I teach, I am free to

    determine how to facilitate student learning in my classes. While I also receive feedback from

    peers once a year, I feel no other demands from any supervisors on my teaching methods. This


    allows me to constantly experiment with ways to improve how I teach my classes. I am currently

    in my sixth semester teaching these courses, and I can say with certainty that no two semesters

    have held very much in common outside of my general teaching philosophy. I am constantly

    trying to improve my methods based on what I have perceived as being effective. This hands-off

    situation created by the administration allows me to be fluid in my methods. While the freedom

    to teach the way I feel is best has many advantages, it also carries heavy responsibility; I have to

rely on my perceptions of my students' learning with little feedback from anyone else.

    The classes I teach are capped at twenty-eight students. This is a moderate number of

students for a beginning foreign language class. While it would be ideal to have a smaller
number because it would allow for more individualized attention, there are advantages to this

    class size as well. With this many students it is easier to employ group learning strategies,

allowing students to facilitate their own, as well as each other's, learning processes. These large

classes make it easy to use traditional assessment measures because they are simple to administer

and assess, even in so large a group. Authentic assessments, like portfolio-style projects, are

    more challenging to administer and evaluate because they take longer to facilitate, collect, and

    assess but can provide richer, more detailed feedback to students. During the semester that my

students' reflections will be based on, they were exposed to various assessment measures, both

    traditional and authentic; therefore, these students will have a concrete context on which to base

    their reflections.

    Participants

    For this project, there will be two primary participant groups. First, I will invite the

    seventy-nine students who are taking my beginning Spanish 1 courses in the fall semester of

2010 to participate in a broad attitude survey (see Appendix B). These students run the gamut in


class rank, from freshmen to seniors, and in age, the youngest being seventeen and the oldest

    being over thirty, as well as in educational experiences and majors. I hope to see at least forty

    percent of these students respond to the survey to ensure a valid sample.

    The second group of students I plan to solicit for this study will ideally be a group of

    nine. I would like to ask three students from each of the three sections to participate in individual

    interviews. I will choose these participants based on their performance levels in class. Ideally, I

    will find one high-, one mid-, and one low-performing student to ensure a broader range of

    perspectives on the assessment measures. I will use a semi-structured interview guide (see

Appendix A) and record the interviews digitally.

Research Plan

    I will complete my research by two methods: individual interviews and a broad attitude

    survey. First, I will employ a semi-structured interview method to ask the nine students,

    described above, to explore their reflections on various forms of traditional and authentic

    assessment tools, which they have encountered in their previous semester of beginning level

    Spanish 1. I will use the semi-structured interview guide (see Appendix A) to solicit their

    opinions on these assessments. I will record the interviews digitally, after having each student

    sign an informed consent form (see Appendix C). When I begin analyzing their reflections, I

will protect my students' anonymity by giving each student a pseudonym, known only to me,

    before categorizing each one as either high-, mid-, or low-performing.
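To illustrate this anonymization step, the following is a minimal sketch in Python; the identifiers, pseudonyms, and performance levels are invented placeholders, not real participants, and in practice the mapping would be known only to me.

```python
# Hypothetical pseudonym table kept only by the researcher; the names and
# performance levels below are invented placeholders, not real participants.
PSEUDONYMS = {
    "participant-01": ("Alma", "high"),
    "participant-02": ("Beto", "mid"),
    "participant-03": ("Cruz", "low"),
}

def label(participant_id: str) -> str:
    """Replace an identifier with its pseudonym and performance category."""
    pseudonym, level = PSEUDONYMS[participant_id]
    return f"{pseudonym} ({level}-performing)"

print(label("participant-02"))  # -> "Beto (mid-performing)"
```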

    Additionally, I will email an invitation to all seventy-nine students to participate in the

    broad attitude survey (see Appendix B), which will be housed online. This survey will ask these

former students to comment on their engagement and motivation levels as well as the perceived

effectiveness, for their learning experiences, of the various assessment tools used throughout the


course. I hope to have at least thirty-two of the students surveyed respond, which would be

    an approximate response rate of 40%. Based on previous experience with polling students, this

    seems to be a realistic goal. I plan to send out an invitation to these students in early January,

    after grades have posted for the semester, asking them to complete the survey by February 1,

    2011. On January 30, 2011, I plan to send another email reminder asking students to complete

    the survey. If my results are still under 50% participation, I will send a final email request during

    the second week of February 2011. This survey will be completely anonymous because it will

    not ask for any identifying information (see Appendix B).
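To make these numbers concrete, the following is a minimal sketch, in Python with hypothetical helper names, of the response-rate arithmetic behind the reminder schedule described above.

```python
# Response-rate arithmetic for the survey plan described above:
# 79 invited students, a goal of 32 responses (about 40%), and a
# final email request only if participation stays under 50%.

INVITED = 79

def response_rate(responses: int, invited: int = INVITED) -> float:
    """Return the response rate as a percentage of invited students."""
    return 100.0 * responses / invited

def needs_final_reminder(responses: int) -> bool:
    """The plan calls for a final email request if participation is under 50%."""
    return response_rate(responses) < 50.0

print(f"{response_rate(32):.1f}%")  # 32 of 79 -> 40.5%, the stated goal
print(needs_final_reminder(35))     # 35 of 79 -> 44.3%, so True: send the request
```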

Plan for Evaluation of Data

The data collected from the individual interviews and online survey will be coded and

    evaluated from the foundation of the research questions they hope to answer. I will use the

    following color scheme, as illustrated in the figures below, to code information related to each of

the respective research questions: the Overarching question will be yellow, Sub-Q 1 will be

green, Sub-Q 2 will be teal, Sub-Q 3 will be pink, Sub-Q 4 will be blue, and Sub-

Q 5 will be red. This color coding will allow me to find and separate out comments and

    information that will enable me to reflect on each of my research questions. Figure 1.2

    demonstrates the relationship between each set of questions posed in the semi-structured

    interview guide (see Appendix A) and the related research questions.
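Before turning to Figure 1.2, the following is a minimal sketch of how the color-coding scheme above could be applied once transcripts are reduced to excerpts that have already been matched, by hand, to a research question. It is written in Python, and the helper names and sample excerpts are hypothetical placeholders, not part of the study instruments.

```python
# The color scheme described above, mapping each research question to its
# highlight color for sorting coded excerpts during analysis.
HIGHLIGHT = {
    "Overarching": "yellow",
    "Sub-Q 1": "green",
    "Sub-Q 2": "teal",
    "Sub-Q 3": "pink",
    "Sub-Q 4": "blue",
    "Sub-Q 5": "red",
}

def group_by_question(excerpts):
    """Group (text, question_id) pairs so all excerpts for one question sit together."""
    grouped = {question: [] for question in HIGHLIGHT}
    for text, question_id in excerpts:
        grouped[question_id].append(text)
    return grouped

# Illustrative placeholder excerpts, not real student data.
sample = [
    ("I studied harder for the chapter tests than for the blog.", "Sub-Q 3"),
    ("I would add a practice speaking section before each test.", "Sub-Q 5"),
]
for question, comments in group_by_question(sample).items():
    for comment in comments:
        print(f"[{HIGHLIGHT[question]}] {question}: {comment}")
```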

Figure 1.2: Interview to Research Questions Alignment Matrix

The following questions are a general guide for the interviews. The final project may contain
slightly different data depending on the responses of individual participants and the open-ended
nature of the interview.

1. Tell me a little bit about your first experience with Spanish. How old were you? What
happened? Why did you decide to take Spanish? What are your goals in regards to Spanish?
What do you want to do with the language?
Related research questions: (Warm-Up)

2. Tell me about how you learn. What situations and tools help you learn? What tools and
strategies do you seek out and employ?
Related research questions: (Warm-Up)

3. Tell me a bit about your experience in beginning Spanish I this past fall of 2010.
Related research questions: Overarching question; Sub-Q 1; Sub-Q 3; Sub-Q 4

4. There were four chapter tests that included sections on grammar, vocabulary, listening,
writing, and speaking. Tell me about your experience taking these tests. What did you think
the point of taking the tests was? Do you think they measured your ability to understand and
use Spanish? Did you feel you could better understand and use Spanish as a result of
preparing for and completing them?
Related research questions: Overarching question; Sub-Q 1; Sub-Q 3; Sub-Q 4

5. The final exam was similar in structure to the chapter tests except it was cumulative,
covering all five chapters. Tell me about your experience taking this exam. What did you
think the point of taking the exam was? Do you think it measured your ability to understand
and use Spanish? Did you feel you could better understand and use Spanish as a result of
preparing for and completing it?
Related research questions: Overarching question; Sub-Q 1; Sub-Q 3; Sub-Q 4

6. You were required to create a culture blog this semester that asked you to reflect on cultural
artifacts of your choosing and how they related to you personally and what you were learning
in the class. Tell me about your experience in creating this blog. What did you think the point
of creating the blog was? Do you think it measured your ability to understand and use
Spanish? Did you feel you could better understand and use Spanish as a result of researching
and completing it?
Related research questions: Overarching question; Sub-Q 1; Sub-Q 3; Sub-Q 4

7. Throughout the semester you were required to keep up with the eLinguaFolio self-assessment
project that asked you to reflect on your own learning and provide samples that demonstrated
your best efforts. Tell me about your experience working on this project. What did you think
the point of working on the eLinguaFolio project was? Do you think it measured your ability
to understand and use Spanish? Did you feel you could better understand and use Spanish as
a result of working on it?
Related research questions: Overarching question; Sub-Q 1; Sub-Q 3; Sub-Q 4

8. Considering these four ways in which you were tested during the semester (give the student
four index cards, each with the name of one of the above assessments, to help them
concentrate on each one individually and in relation to the others as they talk about them),
tell me how you feel they compared to each other. Do you have a favorite?
Related research questions: Overarching question; Sub-Q 1; Sub-Q 3; Sub-Q 4

9. How did completing these assignments affect the way you learned in the course?
Related research questions: Sub-Q 2

10. What suggestions for improvement would you make about any or all of these assignments?
Any other comments on testing in this course?
Related research questions: Sub-Q 5

    Interview question sets one and two above are meant primarily as warm-ups to help participants

    become comfortable with the conversation and begin to reflect on their learning. To begin, I will

ask students to tell me about their first language learning experiences, why they want to study

Spanish, and what their goals are or what they want to do with the language. This will provide a

    context to their comments to allow for a deeper understanding of where they are coming from as

    students and what they want from learning before delving into how they learn. The second set of

    interview questions will then ask students to reflect on the way they learn and what tools and

    methods they use. This will help to set up an understanding of how they view learning and what

    they believe works well for them.

    The third interview question brings the focus of the conversation from learning in general

    to the specific context of my beginning Spanish 1 course that they will have just completed. This

    question may include reflections related to the overarching question of my research, depending

    on what the student decides to focus on in answering. I do not want to lead them into any

    specifics with this question, just get a general sense of their learning experience in that course to

    help determine how to proceed with the next few, more specific questions about the particular


    assessments.

    Interview question sets four and five will ask students to reflect specifically on the two

    particular examples of traditional assessments used in the course: the four chapter tests and the

    cumulative final exam. To help students reflect on these two tools, I will at this point provide

    them with two index cards to hold, point to, or just look at as they think. One card will read

"chapter test" and the other will say "final exam." These cards will hopefully help them focus

    their reflections on traditional assessment measures but I will at points also ask them about

    similar assessment measures in other contexts for comparisons and further depth of reflections. It

is my hope that data collected from these two sets of questions will help to answer my
overarching research question as students reflect on traditional assessment measures, as well as

    providing insight on sub-questions one, three, and four, related to benefits and limitations of

    these assessments, factors they feel affect the impact of these assessments on their learning, and

    their preferences for these assessments, respectively. During data analysis, comments specific to

    each question will be color coded as indicated above and in Figure 1.2 to help sort this data for

    reflection.

    Question sets six and seven, in contrast to four and five described above, will ask

    students to reflect specifically on the two particular examples of authentic assessments used in

    the course: the culture blog and the eLinguaFolio self-assessment tool. Just as I did with the

    previous question sets, I will at this point provide students with two additional index cards to

hold, point to, or just look at as they think. One card will read "culture blog" and the other will

say "eLinguaFolio self-assessment." These cards will offer students tangible focal points for

    their reflections on authentic assessment measures but, just as previously, I will at points also ask

    them about similar assessment measures in other contexts for comparisons. The data collected


    from these two sets of questions will help to answer my overarching research question as

    students reflect on authentic assessment measures, as well as providing insight on sub-questions

    one, three, and four, related to benefits and limitations of these assessments, factors they feel

    affect the impact of these assessments on their learning, and their preferences for these

    assessments, respectively. During data analysis, comments specific to each question will be color

    coded as indicated previously and in Figure 1.2 to help sort this data for reflection.

    To sum up this part of the interview, if students have not already sufficiently covered this

    topic, I will ask questions from sub-set eight, giving the student all four of the assessment name

cards: "chapter tests," "final exam," "culture blog," and "eLinguaFolio self-assessment," to hold and
look at while sorting their thoughts on comparing each method to the others. This question set will

provide more detail on the same research questions mentioned above: the overarching question,

    plus sub-questions one, three, and four, and the data will be coded in the same manner.

    Question set nine from the interview guide asks students to reflect on how completing

    these various assessments affected their learning experience. This set aims to get reflections on

    research sub-question two related to how students feel these assessments enhance or detract from

    their learning experience. Information and reflections that pertain to this research question will

    be highlighted in teal to enable ease of sorting this information for analysis.

Finally, interview question set ten will directly ask for student recommendations for

    improvement on any of these assessments or any additional comments on testing in this course.

    This set of questions aims to answer research sub-question five, which deals directly with student

    recommendations for enhancing the effectiveness of each type of assessment. The information

    related to this research question will be color coded in red for marking and analysis.

    Figure 1.3 demonstrates the relationship between each set of questions or statements


    posed in the broad attitude survey (see Appendix B) and the related research questions. The same

    color coding scheme will be employed to facilitate the organization of this data: the Overarching

question will be yellow, Sub-Q 1 will be green, Sub-Q 2 will be teal, Sub-Q 3 will be

pink, Sub-Q 4 will be blue, and Sub-Q 5 will be red.

Figure 1.3: Survey to Research Questions Alignment Matrix

(All survey statements except 12 will have responses on a scale from strongly disagree to strongly agree with the statement.)

Motivation (aligns with Sub-Q 4: What preferences do students express toward each type of assessment?)

1. I felt motivated to study and learn to prepare for the four chapter tests.
3. I felt motivated to study and learn to prepare for the final exam.
5. I felt motivated to work on the culture blog/portfolio.
7. I felt motivated to work on the eLinguaFolio project.

Effectiveness (aligns with Sub-Q 4: What preferences do students express toward each type of assessment?)

2. I felt like the four chapter tests helped to demonstrate what I learned.
4. I felt like the final exam helped to demonstrate what I learned.
6. I felt like the culture blog/portfolio helped to demonstrate what I learned.
8. I felt like the eLinguaFolio project helped to demonstrate what I learned.

Fairness (aligns with Sub-Q 1, part 2: Do students think these assessments reflect their learning?)

9. Overall, I felt like the individual grades I received in this course were a fair assessment of my learning.
10. Overall, I felt like my final grade in this course was a fair assessment of my learning.
11. Overall, I felt like I understood the point behind the tests and projects in this course.

Additional comments (aligns with the Overarching Question: What are the perceptions of undergraduate students related to traditional and authentic assessments used in an introductory Spanish course?)

12. Please leave any additional comments here. Recommendations for improvement are welcome and appreciated.

Informed consent (the participant must agree to this statement in order to submit the survey)

13. I understand that by completing this anonymous survey I am consenting to allow all information provided to be used as part of an action research study for a thesis.

The survey statements above are intended to create a broader overview of student opinion by asking the entire population of the three sections of this course from the fall semester of 2010 to choose the best description of their opinions on assessment from an attitude scale ranging from strongly disagree to strongly agree. One final area will also be provided for any comments students wish to volunteer.

The first set of statements is organized around the students' perceived motivation to complete the assessments. The second set of statements is organized around the students' perceived effectiveness of the assessment tools. These statements will be sequenced as the first four odd- and the first four even-numbered questions, respectively. The reason behind sequencing one motivation-directed item followed by one effectiveness-directed item is to allow students to reflect on the perceived motivation and effectiveness of each assessment tool discretely. Therefore, statement one is directed toward perceived motivation and statement two toward perceived effectiveness, but both are concerned with the first traditional assessment tool, the chapter tests. This is the pattern for the first eight statements. This means that statements one, three, five, and seven will focus on the perceived motivation for the four assessment tools, while two, four, six, and eight will focus on the perceived effectiveness of the four assessment tools. All eight of these statements are aimed at providing reflections on research sub-question four: What preferences do students express toward each type of assessment?
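To make the interleaving concrete, the short Python sketch below generates an ordering with the same alternating pattern. It is hypothetical and not part of the survey itself; the templated wording is simplified for illustration, and the exact statement wording is given in Figure 1.3.

# For each of the four assessment tools, one motivation statement is
# immediately followed by one effectiveness statement, so odd-numbered
# items probe motivation and even-numbered items probe effectiveness.

tools = ["chapter tests", "final exam",
         "culture blog/portfolio", "eLinguaFolio project"]

statements = []
for tool in tools:
    statements.append(f"I felt motivated to work on the {tool}.")      # odd item
    statements.append(f"I felt like the {tool} helped to demonstrate "
                      f"what I learned.")                              # even item

for number, text in enumerate(statements, start=1):
    focus = "motivation" if number % 2 else "effectiveness"
    print(f"{number}. [{focus}] {text}")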

    Statements nine, ten, and eleven aim to gather data on perceived fairness to partially


    answer research sub-question one concerning whether or not students feel that these assessments

    reflect their learning.

Students will be asked in statement twelve to provide any additional comments or recommendations, and they will be given an open text box in which to write as much or as little as they wish. This is the only statement on the survey that participants may elect not to answer at all; the first eleven statements, however, do include a neutral/no opinion option they may choose. This statement could potentially provide data for any of the research questions but will most likely provide general feedback, if any, that will contribute to the overarching research question concerning more general perceptions of these types of assessments in this context.
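Once the survey closes, responses to the closed-ended items can be tallied option by option. The Python sketch below is a hypothetical illustration of that tally, assuming a five-point scale with the neutral/no opinion midpoint described above; the response data is invented purely for illustration.

from collections import Counter

SCALE = ["strongly disagree", "disagree", "neutral/no opinion",
         "agree", "strongly agree"]

# Invented responses to survey statement 1 (motivation for chapter tests).
statement_1_responses = ["agree", "agree", "neutral/no opinion",
                         "strongly agree", "disagree", "agree"]

tally = Counter(statement_1_responses)
total = len(statement_1_responses)
for option in SCALE:
    count = tally.get(option, 0)
    print(f"{option:>20}: {count} ({count / total:.0%})")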

The last statement on the survey will cover informed consent. Students will be required to

    check a box to consent that the research be used as part of this study before the survey can be

    submitted.

    Time Line

Figure 2.1: Summary of Time Line

Date            Task due
Jan 10          Initial email of survey
Jan 17          Email solicitation of interviewees
Jan 17-20       To UWC with Chapters 4, 5, & 6
Jan 24          Chapters 4, 5, & 6 due to committee
Jan 24-28       To UWC with bibliography & appendices
Jan 30          First survey reminder via email
Feb 1           First due date for survey
Feb 4-8         To UWC with Chapter 2
Feb 7           Additional reminder of survey via email
Feb 12          Chapter 2 due to committee
Feb 18          Interviews will be complete
Feb 27          Transcriptions and data analysis due
Feb 28-Mar 4    To UWC with Chapters 3 & 7
Mar 7           Chapters 3 & 7 due to committee
Mar 7-11        To UWC with Chapters 1 & 8
Mar 14          Chapters 1 & 8 due to committee
Mar 15-20       Final trips to UWC and final revisions
Mar 21          Full final draft due to committee
Apr 4-7         Defense
Apr 7-17        Final revision
Apr 18          Final thesis submitted to graduate school
Apr 20          Graduate school's deadline for final thesis

  • 8/8/2019 Prospectus Version 3.0

    30/47

    HOW STUDENTS PERCEIVE THEIR LEARNING 30

On Monday, January 10, 2011, I will email the initial invitation to the seventy-nine students I am asking to participate in the survey, requesting they complete it by Tuesday, February 1, 2011. Since this is the first day of classes for spring semester and I know students will be very busy, I plan to wait one week before sending out solicitations via email, on Monday, January 17, 2011, to my nine ideal interview candidates to start setting up interview slots. I will ask students to pick a time slot for their interview. These time slots will fall between Tuesday, January 18, 2011, and Friday, February 18, 2011. If any of the nine ideal interview candidates have not responded by Friday, January 21, 2011, I will email as many alternate candidates as necessary to fill those spaces.

The week of January 17-20, 2011, I will make three appointments at the University Writing Center on three separate days to have a consultant work through one of the following three prospective chapters with me: Chapter 4 (Methodology), Chapter 5 (Validity), and Chapter 6 (Ethics). I will make corrections after each session and send Chapters 4, 5, and 6 to the committee by Monday, January 24, 2011. The week of January 24-28, 2011, I plan to make an appointment with the University Writing Center to revise my bibliography and any appendices that are complete at this point.

On Sunday, January 30, 2011, I will send an email reminder to the survey participants to remind them to complete the survey by Tuesday, February 1, 2011, if they have not already done so. This reminder will have to go out to all participants because the survey is anonymous and I will have no way to identify which participants have already completed the survey. If on Monday, February 7, 2011, I have a response rate of less than 50% on the survey, I will send out



    one more email reminder, asking students to please complete the survey. I believe this will be

    more than sufficient to get the 40% response rate I am hoping for.
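As a concrete illustration of this decision rule, the Python sketch below computes a running response rate and flags whether the second reminder would be triggered. The response count used is invented; only the population of 79, the 50% reminder threshold, and the 40% target come from the plan above.

# The survey is anonymous, so any reminder must go to the whole population;
# a second reminder is sent only if the running response rate is below the
# 50% threshold on the February 7 check date.

POPULATION = 79            # students invited to take the survey
REMINDER_THRESHOLD = 0.50  # second reminder if below this on the check date
TARGET_RATE = 0.40         # response rate hoped for by the close date

def needs_second_reminder(responses_received: int) -> bool:
    """Return True if a second all-participant reminder should be sent."""
    return responses_received / POPULATION < REMINDER_THRESHOLD

# Hypothetical example: 30 of 79 responses received by the check date (~38%).
received = 30
print(f"Response rate: {received / POPULATION:.0%}")
print("Send second reminder:", needs_second_reminder(received))
print("Target met:", received / POPULATION >= TARGET_RATE)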

On either February 4, 7, or 8, 2011, I plan to take Chapter 2 (Literature Review) to the University Writing Center for a consultation. Ideally, I would like to make the appointment on the 4th in case there is not sufficient time to get through the entire chapter in one session. I will send Chapter 2 to the committee by Saturday, February 12, 2011.

All interviews should be complete by Friday, February 18, 2011. Therefore, I plan to complete the transcription and coding of all data, including that of the survey, by Sunday, February 27, 2011. The week of February 28-March 4, 2011, I plan to make at least two appointments at the University Writing Center to review Chapter 3 (Research Details) and Chapter 7 (Data Representation). I will submit Chapters 3 and 7 to the committee by Monday, March 7, 2011.

The week of March 7-11, 2011, I plan to make at least two, but most likely three, appointments with the University Writing Center to work on Chapter 1 (Introduction) and Chapter 8 (Conclusion). I plan to submit Chapters 1 and 8 to the committee by Monday, March 14, 2011.

During the week of March 15-20, 2011, I will make any necessary final visits to the University Writing Center and finalize any revisions the committee has previously advised me about throughout the course of the semester. By Monday, March 21, 2011, I will submit a complete final draft to the committee for review.

I would like to schedule the defense of my thesis for the week of April 4-7, 2011. This will allow me one and a half to two weeks to make any necessary adjustments before I submit my final thesis to the graduate school on Monday, April 18, 2011, two days ahead of the graduate school's required receipt date of Wednesday, April 20, 2011, to allow for any additional issues that may arise.

    Validity

I chose to focus on four types of qualitative validity, described by Hendricks in her text on action research, to validate this study: democratic, outcome, process, and catalytic (2009, p. 112). Hendricks, paraphrasing Anderson et al., defines democratic validity as "the extent to which stakeholders have collaborated in the research process and/or the extent to which the researcher has taken into account their various viewpoints" (2009, p. 112). I chose democratic validity as the first way to validate my study because I will choose particular students to voice their opinions. I will choose students to interview whom I assign to one of the following three categories: high-, mid-, or low-performing, based on their performance in my classes the previous semester. Since my research questions seek to solicit the opinions of students on assessment, I feel it is crucial to solicit the opinions of students who fall across the full range of assessed performance. This will hopefully allow me to understand both the strengths and weaknesses of these assessment tools from multiple perspectives.

The second measure I chose is outcome validity because it speaks to how I will use the results for continued planning, ongoing reflection, and deepening my personal understanding of the topics I am exploring. Hendricks, paraphrasing Anderson et al., defines outcome validity as "the degree to which there has been a successful resolution to the research problem," and she emphasizes that successful resolution may not mean an end to the research but a place from which to start the cycle again (2009, p. 112). Through this research I hope to learn which of my assessment tools are working well and which need improvement. I also hope to learn about new ideas, while conducting the interviews and from the open-ended portion of the survey, that will


    help shape how I move forward in my practice. I will take what I learn, reflect on what it means

    to me, and begin to brainstorm ways to apply my understanding to the way I approach my

    classroom. I also plan to continue to solicit feedback from my students in future research to

    continue the cycle and work toward continual improvement of my practice. I hope that by taking

    their opinions into consideration, I will continue to improve the way I teach and thereby help my

    students engage and participate in class and generally get more out of their learning experience

    with me. There is so much more to learn and try as I continue my path to become the best teacher

    I can be.

Hendricks, paraphrasing Anderson et al., defines process validity as the use of an appropriate process for studying the research questions, and she notes that it relies on the researcher's commitment to carry out the study in a way that allows the researcher to engage in a manner that develops a depth of understanding or change (2009, p. 112). I chose process validity because I need to ensure I look deeply at the problem so I can understand the ways context and processes have influenced my results and how this information can carry me forward. To ensure I look deeply and critically at these issues, I will rely on two main methods: asking open-ended questions during the interviews that leave plenty of room for the students to tell me what they really think, and writing out my reflections on the interviews and the broad attitude survey so I remain conscious of the conclusions I have drawn. All of these notes and reflections will be included in the appendices of my final report.

Hendricks, paraphrasing Lather, defines catalytic validity as "the extent to which the research transforms or changes the researcher's views and/or practices" (2009, p. 112). I chose catalytic validity because it will allow me to be aware of the ways my processes and outcomes will change my practices. This is the most important part of my study. If the results do not


    change the way I approach my classroom, I will need to revise my research and try again. While

    I believe that I am a proficient instructor, my primary goal is to improve. Any insight I can gain

    from this study will help reshape my perspective and improve my practice.

    Ethics and Subjectivity

    This study will gather observations from students who have taken beginning Spanish 1

    with me in the fall semester of 2010. During this semester these students were exposed to various

    types of traditional and authentic assessment measures. While some of these assessment

    techniques are new, they were not introduced into the course for the purpose of studying their

specific function, but only to serve the purposes of assessment in my classroom. My students are unaware, at this juncture, that this research project is being proposed. They are only being

    presented with these assessment measures in the context of their use in the classroom. While

    these measures will receive the benefit of any insight gained during the course of this study, it is

    important to note that they are not the primary focus of the study. The purpose of this study is to

    explore the reflections of students in a beginning level Spanish class towards various forms of

    traditional and authentic assessment tools in general. The specific tools used in the course, which

    these participants will have completed, will serve as specific examples of the larger context of

    traditional and authentic assessment. Participants in the interviews will be given plenty of

    latitude to include other examples of these assessment tools to which they have been exposed in

    other contexts. To clarify, the tests and portfolios these students will have completed for me by

    the time this research begins, while benefiting from this study, are not the focus. The focus will

    be on the reflections my students have on traditional and authentic assessment measures in a

    more general sense. This distinction is important to make because I want it to be clear that

    administering these particular assessments for the purposes of trying to validate them is not an


    aim of this study.

    Participants in the anonymous survey will be asked to check a box indicating that they

    are aware that by completing it they are consenting to allow all information provided to be used

    as part of an action research study for a thesis before the survey can be submitted. No identifying

    information will be gathered during the survey. The survey will be emailed out to 79 students but

    there will be no way to tell which will choose to participate or which submission belongs to any

    particular student. This does make unintentional bias possible. There is every likelihood that only

    particularly motivated students, whether they are satisfied with the course or not, will complete

the survey. This makes it possible that the survey results could be polarized; the results might then reflect only either end, or perhaps only one end, of the spectrum of opinions. In my previous experience with similar student polling, responses have been more or less balanced, but this will be something to watch for and will affect any conclusions drawn from this data.

    The participants in the interviews will be selected by me based on their performance

    levels throughout the course. By choosing students at different achievement levels in the course,

    I hope to provide a more balanced perspective of student opinions across the board. There are

two concerns here, however. First, I am sure that I would select students from each of the three achievement levels whom I perceive as being the most capable of deep reflection on and interest in these issues. Because I am intentionally choosing these students, there is the chance that my selections will be faulty in some way. The students I choose may or may not be representative of the others. I hope this will not be the case, but there is always this risk when sampling from a larger population, and this will be something to address as the data are collected and analyzed. The second concern is that I may have difficulty finding mid- and low-achieving

    students who are willing to participate. These students may be upset or embarrassed because of


    ask for any identifying information on the survey itself, thereby making it anonymous to allow

    for more freedom and honesty. I am open to getting some negative or even potentially hurtful

    comments in exchange for the chance of obtaining some useful feedback to help me be a more

    effective teacher.

    Conclusion

    The purpose of this study is to explore the reflections of students in a beginning level

    Spanish class towards various forms of traditional and authentic assessment tools. Those

reflections, once collected, will be evaluated according to the aforementioned methods. Conclusions will then be drawn to inform my future practice and to add one small, and very focused, piece of the puzzle to the literature surrounding assessment practices in higher education. Additionally, this study will serve to inspire me, and hopefully others, to continue asking the questions that must be asked in the classroom: How do students learn? How can teachers facilitate student learning? What tools can educators employ to create the most effective learning environment possible? Asking these questions will facilitate the continual improvement of the educational system and help make sure it just keeps getting better.


    Bibliography

Ainsworth, L., & Viegut, D. (2006). Common formative assessments: How to connect standards-based instruction and assessment. Thousand Oaks, CA: Corwin.

Allen, M. J. (2004). Assessing academic programs in higher education. Boston, MA: Anker Publishing Company, Inc.

Angelo, T. A. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco, CA: Jossey-Bass.

Banta, T. W. (2002). Building a scholarship of assessment. San Francisco, CA: Jossey-Bass.

Banta, T. W. (2007). Can assessment for accountability complement assessment for improvement? Peer Review, 9(2), 9-12. Retrieved from Academic Search Complete database.

Banta, T. W., Griffin, M., Flateby, T. L., & Kahn, S. (2009, December). Three promising alternatives for assessing college students' knowledge and skills (NILOA Occasional Paper No. 2). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Barootchi, N., & Keshavarz, M. (2002). Assessment of achievement through portfolios and teacher-made tests. Educational Research, 44(3), 279-288. doi:10.1080/00131880210135313

Bers, T. (2004). Assessment at the program level. New Directions for Community Colleges, 2004(126), 43-52. Retrieved from Academic Search Complete database.

Bers, T. (2008). The role of institutional assessment in assessing student learning outcomes. New Directions for Higher Education, (141), 31-39. doi:10.1002/he.291


Bers, T., & Smith, K. (1990). Assessing assessment programs: The theory and practice of examining reliability and validity of a.. Community College Review, 18(3), 17. Retrieved from Academic Search Complete database.

Bers, T., Davis, B., & Taylor, B. (2000). The use of syllabi in assessments: Unobtrusive indicators and tools for faculty development. Assessment Update, 12(3), 4. Retrieved from Academic Search Complete database.

Blackburn, B., Dewalt, M., & Vare, J. (2003). A case of authentic redesign: Collaborating with National Board Certified Teachers to revise an advanced middle level program. Research in Middle Level Education Online, 26(2), 45-56. Retrieved from Education Research Complete database.

Brimi, H. (2010). Darkening the ovals of education. Clearing House, 83(5), 153-157. doi:10.1080/00098650903505472

Brint, S., Proctor, K., Murphy, S., Turk-Bicakci, L., & Hanneman, R. (2009). General education models: Continuity and change in the U.S. undergraduate curriculum, 1975-2000. Journal of Higher Education, 80(6), 605-642. Retrieved from Education Research Complete database.

Brown, S. A., & Glasner, A. (1999). Assessment matters in higher education: Choosing and using diverse approaches. Buckingham, England: Society for Research into Higher Education & Open University Press.

Caner, M. (2010). Students' views on using portfolio assessment in EFL writing courses. Anadolu University Journal of Social Sciences, 10(1), 223-235. Retrieved from Academic Search Complete database.


Carr, J. F., & Harris, D. E. (2001). Succeeding with standards: Linking curriculum, assessment, and action planning. Alexandria, VA: Association for Supervision and Curriculum Development.

Cauley, K., & McMillan, J. (2009). Formative assessment techniques to support student motivation and achievement. Clearing House, 83(1), 1-6. Retrieved from Education Research Complete database.

Choate, J. S. (1995). Curriculum-based assessment and programming. Boston, MA: Allyn and Bacon.

Cross, K. P. (1996). Classroom research: Implementing the scholarship of teaching. San Francisco, CA: Jossey-Bass.

Cummins, P., & Davesne, C. (2009). Using electronic portfolios for second language assessment. Modern Language Journal, 93, 848-867. doi:10.1111/j.1540-4781.2009.00977.x

Davis, N., Kumtepe, E., & Aydeniz, M. (2007). Fostering continuous improvement and learning through peer assessment: Part of an integral model of assessment. Educational Assessment, 12(2), 113-135. doi:10.1080/10627190701232720

Earl, L., & Torrance, N. (2000). Embedding accountability and improvement into large-scale assessment: What difference does it make? Peabody Journal of Education, 75(4), 114-141. Retrieved from Academic Search Complete database.

Ellis, A. K. (2001). Teaching, learning, & assessment together: The reflective classroom. Larchmont, NY: Eye On Education.

Eubanks, D. (2008). Assessing the general education elephant. Assessment Update, 20(4), 4-16. Retrieved from Academic Search Complete database.


Ewell, P. (2008). Assessment and accountability in America today: Background and context. New Directions for Institutional Research, 2008, 7-17. doi:10.1002/ir.258

Ewell, P. (2009). Assessment, accountability, and improvement: Revisiting the tension (NILOA Occasional Paper No. 1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Fasciano, H. (2010, August 2). Formative assessment in a balanced assessment system. Presentation at the LinguaFolio Institute, Charlotte, NC. Retrieved from https://sites.google.com/site/nclfpilot/home/presentations/FormativeAssessmentandaBalancedAssessmentSystem.pptx?attredirects=0&d=1

Felner, R., Bolton, N., Seitsinger, A., Brand, S., & Burns, A. (2008). Creating a statewide educational data system for accountability and improvement: A comprehensive information and assessment system for making evidence-based change at school, district, and policy levels. Psychology in the Schools, 45(3), 235-256. Retrieved from Academic Search Complete database.

Gulikers, J., Bastiaens, T., Kirschner, P., & Kester, L. (2008). Authenticity is in the eye of the beholder: Student and teacher perceptions of assessment authenticity. Journal of Vocational Education and Training, 60(4), 401-412. Retrieved from ERIC database.

Hendricks, C. (2009). Improving schools through action research: A comprehensive guide for educators. Upper Saddle River, NJ: Pearson.

Hernon, P., Dugan, R. E., & Schwartz, C. (Eds.). (2006). Revisiting outcomes assessment in higher education. Westport, CT: Libraries Unlimited.

Higgs, T. (1987). Oral proficiency testing and its significance for practice. Theory Into Practice, 26(4), 282. Retrieved from Academic Search Complete database.

Israel, J. (2007). Authenticity and the assessment of modern language learning. Journal of Research in International Education, 6(2), 195-231. doi:10.1177/1475240907074791


Neagu, M.-I. (2009). The influence of the teacher's experience and of the institution type in foreign language teaching, learning and assessment. Petroleum - Gas University of Ploiesti Bulletin, Educational Sciences Series, 61(2), 127-132. Retrieved from Academic Search Complete database.

Oberg, C. (2009). Guiding classroom instruction through performance assessment. Journal of Case Studies in Accreditation & Assessment, 1-11. Retrieved from Education Research Complete database.

Palmer, S. (2004). Authenticity in assessment: Reflecting undergraduate study and professional practice. European Journal of Engineering Education, 29(2), 193-202. Retrieved from Education Research Complete database.

Ratcliff, J. L., Lubinescu, E. S., & Gaffney, M. A. (Eds.). (2001). How accreditation influences assessment (New Directions for Higher Education, No. 113). San Francisco, CA: Jossey-Bass.

Ross, S. J. (2005). The impact of assessment method on foreign language proficiency growth. Applied Linguistics, 26(3), 317-342. Retrieved from Academic Search Complete database.

Sandoval, P., & Wigle, S. (2006). Building a unit assessment system: Creating quality evaluation of candidate performance. Education, 126(4), 640-652. Retrieved from ERIC database.

Sehlaoui, A. (2008). Language learning in the United States of America. Language, Culture & Curriculum, 21(3), 195-200. doi:10.1080/07908310802385873

Sullivan, J. (2006). The importance of program evaluation in collegiate foreign language programs. Modern Language Journal, 90(4), 590-593. doi:10.1111/j.1540-4781.2006.00466_6.x

Tileston, D. W. (2005). 10 best teaching practices: How brain research, learning styles, and standards define teaching competencies (2nd ed.). Thousand Oaks, CA: Corwin Press.

Watson, D., & Robbins, J. (2008). Closing the chasm: Reconciling contemporary understandings of learning with the need to formally assess and accredit learners through the assessment of performance. Research Papers in Education, 23(3), 315-331. doi:10.1080/02671520701755408

Williams, J., & Kane, D. (2009). Assessment and feedback: Institutional experiences of student feedback, 1996 to 2007. Higher Education Quarterly, 63(3), 264-286. doi:10.1111/j.1468-2273.2009.00430.x


    Appendix A: Semi-structured Interview Guide

1. Tell me a little bit about your first experience with Spanish. How old were you? What happened? Why did you decide to take Spanish? What are your goals in regards to Spanish? What do you want to do with the language?

2. Tell me about how you learn. What situations and tools help you learn? What tools and strategies do you seek out and employ?

3. Tell me a bit about your experience in beginning Spanish 1 this past fall of 2010.

4. There were four chapter tests that included sections on grammar, vocabulary, listening, writing, and speaking. Tell me about your experience taking these tests. What did you think the point of taking the tests was? Do you think they measured your ability to understand and use Spanish? Did you feel you could better understand and use Spanish as a result of preparing for and completing them?

5. The final exam was similar in structure to the chapter tests except it was cumulative, covering all five chapters. Tell me a