
Project leader: Paul Davies

Project report: Paul Davies

LTAP project no: P7204


Enhancing student feedback – providing higher quality feedback in a shorter time

A UWS Learning and Teaching Action Plan (LTAP) 2006-2008 Project


TABLE OF CONTENTS

Project aims
Comments
Evaluation of the project
Scalability of the project
Sustainability of project
Project implementation issues
Dissemination of project outcomes
Other comments

SECTION 2: DETAILED REPORT, RELATED DOCUMENTS AND MEDIA

The problems
Proposed solution
Method
Using the marking tools to provide feedback on assessments
Figure 1 - Creating a marking scheme in MarkTool
Figure 2 - Creating a comment in the student assessment submission showing choice of Colour Coded Marking Criteria this comment belongs to and the option of adding the comment to the Frequently Used Comments or choosing one from the previously saved Frequently Used Comments
Figure 3 - The MarkTool user interface showing some of the annotations made in the student's submitted work
Figure 4 - Part of the PDF document produced by MarkTool as feedback to the student
Figure 5 - Summary of all comments made for this assessment submission
Figure 6 - Marks given by the marker against each criterion embedded in the feedback document produced by MarkTool
Figure 7 - Colour coded marking scheme/criteria is embedded in the feedback document produced by MarkTool
Figure 8 - General comment section in the feedback document produced by MarkTool
Problems Encountered (and some solutions)
Figure 10
Results
Response 3
Response 5
Response 6
Response 7
Response 8
Response 9
Response 10
Recommendations
Appendix A
Student Survey Questions
Appendix B
Staff Survey Questions
References


UWS Learning and Teaching Action Plan Projects

LTAP Project Completion Report

Project aims

To trial a small number of software tools which assist in the marking of open-ended questions. The results of the trial will be used to develop recommendations for wider-scale adoption and implementation within the UWS environment.

Actual outcomes achieved

• Student and staff surveys for evaluating the usefulness of electronic marking tools.
• Mechanisms to enable easy distribution of marked electronic assignments to students via vUWS.
• A list of recommendations regarding electronic marking tools for open-ended assessment questions.

Comments:

Response to this project from staff was poor. This led to a very small number of units trialling the marking software, which affected the project's ability to achieve what was intended, since the result set was not statistically significant.

Evaluation of the project

Students undertaking the unit Advanced Web Site Development evaluated the use of one of the tools to provide feedback on their major assignment work, via an online survey in vUWS at the end of Spring session 2008. Overall the feedback from students was positive (see the analysis of results in this report). Staff who used, or attempted to use, the tools also provided feedback via email. Staff feedback was more negative, due to certain perceived limitations of the tools.

Scalability of the project

The tools evaluated in this project are best suited to assessments with open-ended responses such as essays or reports. Practical limitations in downloading submitted assessments from vUWS, linking them with the tools, and uploading feedback for students reduced staff uptake of these tools. For such tools to be more successful in UWS unit assessment feedback, it is recommended that a more integrated solution be found.

Sustainability of project

Due to the lack of initial interest in, and actual uptake of, the tools by staff, the project leader and other project team members ensured that the assessment in their own units was at least in part suitable for the use of these tools.

Project implementation issues

Interest in the tools offered as part of this project was very low, which was not anticipated. Of the staff who did register interest in using the tools for their assessment feedback, only two actually used them. This resulted in a very small sample of units trialling the software tools. The team attempted to address this by offering one-on-one training to the initially interested staff, but no such training was taken up because staff perceived that the tools' limitations would increase, rather than decrease, their marking workload.

Dissemination of project outcomes

School-based demonstration of the tools.

Other comments

The project team are continually investigating new tools for online and electronic marking of assessment work. Products such as WebNotes have improved in recent times and may now be more usable. Investigations into methods of providing feedback via recorded audio are also underway.

SECTION 2: DETAILED REPORT, RELATED DOCUMENTS AND MEDIA

The problems

Feedback on assessment has a formative as well as a summative role. Feedback should “allow each student to identify and appreciate their achievement and help them remedy any deficiencies” (Willson, 2004). Unfortunately, it is easy for academics to view marking open-ended assignments as a burden, particularly when large classes are involved. This can lead to delays in returning feedback and sometimes the provision of relatively poor quality feedback.

The UWS Student Satisfaction Surveys for the College from 2004 onwards have consistently identified ‘Providing timely and constructive feedback on learning’ as highly important to students while indicating that our performance is poor.

Online learning environments, such as WebCT and Blackboard, offer some support through the provision of automatic marking of multiple-choice, matching, fill-in-the-blank and formulaic questions. However, for time-poor academics it can be tempting to over-use this facility and avoid open-ended questions and assignments, which in turn can promote surface learning.

Another common problem encountered, particularly across multiple campuses, is the difficulty in maintaining consistency in marking.

Proposed solution

Computer-based marking assistants have been shown to considerably reduce marking time, speed up turnaround time and promote the provision of more in-depth feedback (Willson, 2004). Such systems are not new. In fact, lecturers in computing departments from universities around the world have long been developing tools which make marking easier, for example see (Heo, 2003; Loddington, Pond, Wilkinson, & Willmot, 2009; Plimmer & Apperley, 2007; Plimmer & Mason, 2006). UWS is no different: there are at least two such systems in operation within the School of Computing and Mathematics at the moment. While these tend to work extremely well, they are usually highly specialised and may only be applicable to a particular unit or even a particular assessment item.

The original intent was to take one of the UWS models and make it suitable for a wide range of subjects and assessment items. However, after a review of the literature a number of suitable generic models/tools were found. These include MarkTool (Massey University), ClassMate (USQ) and Markin (Creative Technology). All of these models/tools include an onscreen marking tool which allows human markers to annotate assignments in a similar fashion to paper-based marking. In addition, they all have the ability to store frequently used comments which can be called up again with ease. However, there are differences in the way assignments are handled, whether it is possible to link with WebCT/Blackboard, how co-markers collaborate and the ability to link the feedback with the assessment criteria. Based on this, it was decided to trial MarkTool and Markin within the UWS context.

Method

The project and the chosen online marking tools were advertised at the School of Computing & Mathematics school retreat in November 2007, the UWS Emerging Technologies Forum and the College Teaching and Learning Forum, and via College-wide emails on two occasions. There was only a very limited response from College staff: only six staff indicated interest in trialling the marking tools within their units.

Originally it was anticipated that the response from staff would be much higher, and the team had budgeted time and money for developing online training modules to assist interested staff in setting up and using the tools. Given the small response, however, the team decided that these modules were not needed. Instead Dr Salter liaised with the interested staff to determine the suitability of the marking tools to the assessments within their units and to offer one-to-one training as necessary for each staff member. This occurred at the start of Spring session 2008. A number of staff indicated that the tools being trialled were not as flexible as they would like and felt the tools would be limiting.

Ultimately, only two participants used the tools for assessment marking within their units. One used Markin in the unit Human-Web Interaction, a Masters unit, for a large number of short-answer questions across several weeks of practical work. The other used MarkTool in the unit Advanced Web Site Development for two web-based assignments. Unfortunately, neither unit's assessment types completely suited the tools being trialled. The participants were nevertheless keen to trial the software in these contexts to see whether the tools could be adapted to suit their requirements and ease their marking burden.

Assessments in these two units were run and then marked using the Markin and MarkTool software. Details on using the MarkTool software are provided in the next section. The marked assessments were then returned to the students with simple instructions on retrieving and using the feedback documents. To facilitate the return of the marked assignments, the project team devised a method to work around a limitation encountered in the vUWS elearning environment. Details of this limitation and how it was resolved are given in the Problems Encountered section.

The project team developed surveys to gauge the usefulness of the tools from both the student and staff perspectives (see Appendix A & Appendix B for the survey questions). At the conclusion of Spring session 2008 the student survey was administered online via vUWS for the students in Advanced Web Site Development. Students were asked to voluntarily and anonymously answer the survey questions regarding the tools. Staff that participated in the trial were asked to provide feedback either via the staff survey or via email. A summary of the results of the administered surveys is found in the results section below.

Using the marking tools to provide feedback on assessments

This section summarises the main points of the use of the MarkTool software to mark assessments electronically on screen as carried out in the unit Advanced Web Site Development.

The MarkTool software allows the staff member to create a Marking Scheme file (Figure 1) which defines the total mark for the assessment, the individual marking criteria, and the marks and colour for each criterion.

Figure 1 - Creating a marking scheme in MarkTool

Once created, the marking scheme can be used as a template against which all assessments are marked. MarkTool then keeps track of each annotation made in the assessment according to the criterion it is assigned to (see Figures 2 & 3). As well as assigning the annotation to a criterion, the user can either save the comment as a Frequently Used Comment or retrieve a previously created comment from the saved Frequently Used Comments (Figure 2).


Figure 2 - Creating a comment in the student assessment submission showing choice of Colour Coded Marking Criteria this comment belongs to and the option of adding the comment to the Frequently Used Comments or choosing one from the previously saved Frequently Used Comments


Figure 3 - The MarkTool user interface showing some of the annotations made in the student's submitted work

The view that the student sees of these annotations is depicted in Figure 4. The annotations made by the marker are embedded in the assessment document, in the section to which each comment pertains. This is of course similar to how a paper-based report or essay would be annotated, so MarkTool presents the student with a familiar interface that makes the electronic comments easier to read.


Figure 4 - Part of the PDF document produced by MarkTool as feedback to the student

The MarkTool software also uses the Marking Scheme XML file to generate a summary comments section (Figure 5), organised by the marking criteria. This feature is useful to students who wish to see all the comments relating to a particular criterion without having to scan the entire feedback document for the inline annotations. This is not possible in paper-based marking and so is regarded as an improvement upon it. A downside of this part of the feedback document is that a comment loses its context, since it is not embedded in the section of the assessment to which it refers.

Figure 5 - Summary of all comments made for this assessment submission


The MarkTool software also uses the Marking Scheme XML file to generate a marks section which is also organised by the marking criteria, and automatically calculates the total mark (Figure 6).

Figure 6 - Marks given by the marker against each criterion embedded in the feedback document produced by MarkTool

The colour coded marking criteria are shown to the student in another section of the feedback document, as depicted in Figure 7. This enables the student to see directly what the criteria are and how many marks are associated with each.


Figure 7 - Colour coded marking scheme/criteria is embedded in the feedback document produced by MarkTool

As well as making inline annotations in the submitted assessment document, the marker is able to enter a general comment about the submitted work. An example of this is shown in Figure 8.

Figure 8 - General comment section in the feedback document produced by MarkTool

When marking of the submitted assessment is complete the feedback document can be saved. The feedback document contains the submitted work along with the annotations, criteria and marks as outlined above. The tool allows the feedback to be saved as both an XML file and a PDF file. The project team recommended that both file types be created for each submitted assessment; there is no extra effort involved, since this is a setting made when the marking scheme is created. This was suggested for two reasons. Firstly, if modifications to the marking were needed, the XML file could be opened in MarkTool and edited directly. Secondly, students could be provided with the PDF version of the feedback, which is easily read in Adobe Acrobat Reader without having to use the MarkTool software.

The feedback documents for each assessment marked were saved with a filename that could be used to easily identify the student that had produced the assessment work. For example, ########_assign1_marked.pdf was used where ######## is the student ID number. Doing so helped keep the marking organised and assisted with the method used to return the marked assessments to students (see the Problems Encountered section of this report).

Problems Encountered (and some solutions)

Whilst the MarkTool and Markin software provide a mechanism to annotate directly in an electronically submitted assessment, each package has some limitations. This section highlights some of those problems and suggests solutions where they were found.

Neither software tool that was trialled as part of this project had a direct way of integrating with the university’s e-learning environment or with student enrolment lists. This meant that the process of creating a marking feedback document for each submitted assessment was a little longer than was desirable since student details had to be manually entered rather than being automatically imported from a student list.

Associated with this limitation of the marking software is that within vUWS there is no documented way of providing a personalised document to a specific student without attaching the document to an email. This means that when marking is finished the marker has a feedback document for each student but no easy way of returning it. The project team consulted the vUWS help facility and the UWS elearning support staff but were told there was no way of doing this within vUWS. Given that the number of enrolled students in a unit may be large, this was seen by the team as a real barrier to the use of the marking tools, since returning marked work needed to be as easy as possible for both staff and students. Sending a personalised email with an attachment to each student (either from within vUWS or externally) was judged too arduous and time-consuming; the mechanism for returning the assessment feedback had to be easier than this.

The project team came up with a simple but effective approach using the Grade Tool in vUWS. Part of the functionality of the Grade Tool enables uploads of CSV files to the Gradebook; typically this is used by staff to upload marks for assessments marked outside of vUWS. The CSV upload can also include text-based items, so we were able to extend this to HTML to create customised hypertext links pointing directly to the assessment feedback documents, which had been uploaded into a folder within vUWS. By using a standardised file naming convention, as indicated in the previous section of this report, we were able to generate a customisable link based on the filename and the location of the uploaded feedback documents within vUWS. By then uploading a spreadsheet containing both the student ID and the filename details into the Gradebook, we created a link for each student within the MyGrades tool that gave direct access to the personalised feedback document, which only that student could access via their vUWS account. A sketch of this CSV generation step is given below.
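The following is a minimal, illustrative sketch of that CSV generation step, not the project's actual script. The column headings, student ID list, folder path and base URL are assumptions made for the example; in practice the column layout would need to match the format expected by the vUWS (WebCT) Grade Tool import.

```python
# Minimal illustrative sketch (not the project's actual script): build a CSV whose
# text column carries an HTML link to each student's feedback PDF, ready for upload
# through the vUWS Grade Tool. Column headings, the student ID list and BASE_URL
# are placeholder assumptions; the real import format depends on the Grade Tool.
import csv

BASE_URL = "https://vuws.example.edu.au/unit_files/assign1_feedback"  # hypothetical vUWS folder

student_ids = ["12345678", "23456789", "34567890"]  # normally taken from an enrolment export

with open("assign1_feedback_links.csv", "w", newline="") as csv_file:
    writer = csv.writer(csv_file)
    writer.writerow(["Student ID", "Assignment 1 Feedback"])  # illustrative column headings
    for sid in student_ids:
        # Matches the naming convention described earlier: ########_assign1_marked.pdf
        filename = f"{sid}_assign1_marked.pdf"
        link = f'<a href="{BASE_URL}/{filename}">Download your marked assignment</a>'
        writer.writerow([sid, link])
```

The resulting file is then imported through the Grade Tool's CSV upload, and the marked PDFs are uploaded to the matching vUWS folder so that each generated link resolves for the corresponding student.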

The MarkTool software can be used to annotate PDF documents only, so any assessment to be marked with this tool must be able to be converted to a PDF. For assessments such as reports or essays submitted as MS Word documents (or similar), the conversion is straightforward using MS Word itself or a utility program. The problem is that if the student has not submitted the report or essay in PDF format, the conversion must be done by the marker. This process, whilst simple, adds significant time to the overall marking of the assessment. The project team found a small number of free PDF conversion utilities that could batch-convert the files within a folder to PDF, eliminating the need for the marker to convert individual files manually. The utility found to be useful for this project was PDF4U Printer (a sketch of an equivalent batch conversion appears at the end of this discussion).

In the case of the unit Advanced Web Site Development, reported in the previous section, the situation was different again: the main product to be marked was not an MS Word document but a web application that runs in a browser from a web server. Generating a PDF of the entire application is near impossible and far too time consuming, since the application is made up of multiple executable web pages that generate output in the browser. The marker would have to generate a PDF for each major user interaction in the web application and then use the annotation tool to mark those interactions, which would take far too much time and produce far too many feedback documents per student assessment. Other tools such as WebNotes1 were investigated but were found wanting in available functionality and portability at the time of the project. For this unit it was therefore decided by the unit coordinator to use MarkTool to annotate the storyboard part of the assignment requirements, which did produce an MS Word document. Whilst the storyboard was not the main part of the assignment, it provided the opportunity to comment directly in a document that represented many aspects of the user interface needing feedback prior to further implementation in Assignment 2. The unit coordinator was then able to build into the marking of the storyboard feedback sections relating to the functionality of the web application.
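As a complement to the PDF4U Printer approach described above, the following is a minimal sketch of an equivalent batch conversion using LibreOffice's headless converter. This is an assumed alternative rather than the utility the project used; it requires the soffice executable to be installed and on the PATH, and assumes the submitted Word files sit in a single folder.

```python
# Minimal sketch of a batch Word-to-PDF conversion. This uses LibreOffice's headless
# converter as an assumed stand-in for the PDF4U Printer utility the project used;
# it requires the `soffice` executable to be installed and on the PATH.
import subprocess
from pathlib import Path

SUBMISSIONS = Path("submissions")      # folder of downloaded .doc/.docx submissions
OUTPUT = Path("submissions_pdf")       # folder that will receive the converted PDFs
OUTPUT.mkdir(exist_ok=True)

word_files = sorted(SUBMISSIONS.glob("*.doc")) + sorted(SUBMISSIONS.glob("*.docx"))
for doc in word_files:
    # Each call writes <same name>.pdf into OUTPUT without touching the original file.
    subprocess.run(
        ["soffice", "--headless", "--convert-to", "pdf", "--outdir", str(OUTPUT), str(doc)],
        check=True,
    )
```

Run once over the submission folder, this produces one PDF per submission, ready to be opened and annotated in MarkTool.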

The Markin tool, which was used to mark assessments in the Human-Web Interaction unit, is designed to annotate directly in RTF or plain text documents using a set of buttons which can be customised to insert either positive or negative comments into the assessment document (see Figures 9 & 10). This button interface allows the marker to very easily insert customised comments directly into the document being marked. As shown in Figure 9, each button can be categorised and a mark can be assigned to it. To place a comment in the document the marker highlights the relevant section of text and then clicks the appropriate button on the button bar. This creates a hypertext link in the document which, when clicked, displays the comment to the student. Reusing comments in this way is very easy, except that the marker has to remember what each of the button captions represents in order to choose the most appropriate comment for that part of the document. The marker for Human-Web Interaction commented that this particular aspect of Markin was the single most annoying feature of the product and led to him abandoning the use of the tool after several weeks of marking practical work.

1 WebNotes and other similarly named tools allow post-it type notes or stickies to be added to web pages on hosted web sites.


Figure 10

Figure 9 - The create/edit button interface in Markin

Results

After the tools had been used in the two units, students were asked to provide feedback on the usefulness of the tools.

Students in Human-Web Interaction did not complete the survey from Appendix A but provided verbal feedback to the unit coordinator. According to the unit coordinator the feedback was very positive: “the student feedback (verbal) was excellent. They appreciated both the improvement in quantity and quality of comments”. The unit coordinator further indicated that students improved their marks in successive assessments based upon the feedback provided on earlier work through the Markin tool. From this perspective the unit coordinator was very happy to use the tool, since it did improve student learning through the feedback given. However, as indicated in the previous section, the biggest problem he had with the Markin software was the button interface, which required him to remember what each three-letter abbreviation meant in order to use the comment buttons effectively. This led him to abandon the use of the tool in the unit. Summarising his experience of the Markin tool, he stated in an email:

“It was good to learn how to overcome the logistical barriers associated with vUWS. However, it was still a bit fiddly to load, open, annotate and store the final version. I found that my assessment items in that particular semester didn't lend themselves well to this approach as there were a larger number of shorter questions - this led to a longer preparation time. It would have been excellent to use the tool for a longer essay or research report. ... I am keen to find other ways to annotate because I am finding my feedback slipping back into old habits (less plentiful and less descriptive).”

Students in the unit Advanced Web Site Development were provided with feedback using MarkTool for both Assignment 1 and Assignment 2. The two assignments were directly related, so the feedback provided on Assignment 1 was useful in improving performance in Assignment 2. The students were asked to voluntarily and anonymously complete the student survey in vUWS after receiving their results for Assignment 2. Of the 51 enrolled students, 13 completed the survey, a return rate of only about 25%.

Whilst this set of results is small, some of the results are interesting and useful. Of the students who completed the survey, 50% indicated that the quantity of feedback provided through the tool was more than is provided in normal written assignment feedback, whilst 41.7% indicated that the amount of feedback was about the same. 66.7% of the students felt that the comments provided were better in quality than normal written feedback, with the other 33.3% indicating they were about the same. This was a positive sign given that none indicated the quality was worse. This result is probably attributable to the tool's support for reusing comments: because it is easier to do so, the marker is more likely to provide detailed comments.

A disappointing, though not unexpected, result was that only 16.7% of students felt that the feedback was returned to them faster due to the use of the marking tool, with the remaining students split equally (41.7% each) between ‘slower’ and ‘about the same’. This result was not unexpected, with the marker stating in his feedback:

“The tool was not completely what I had hoped for and so I had to make special adjustments to the way I marked. The two assignments that I marked using the MarkTool software were web based applications. As such there was not a document that I could easily annotate with my comments. To get around this issue I required the students to provide a word document that contained a storyboard of their web application. I then used the story board document to produce a pdf. This pdf was then annotated using the MarkTool software. The process of having to create the pdf for each assignment submission certainly added time to my marking. However, once the pdf was created I found the tool relatively easy to use to create annotations in the pdf and to store and re-use frequently used comments. These aspects of the MarkTool software were important to me as I have been looking for a way to improve the online marking that I already do using another custom built tool.”

It was encouraging to find that the students clearly understood that the feedback provided in the first assignment was valuable for improving their performance in the subsequent assignment, with 83.3% indicating that they would make changes to future assignments based upon feedback in the current assignment. This is contrary to the impression I often receive from students, who frequently seem not to have read the feedback and to be interested only in their mark.

As indicated previously, the project team felt it was important to have a simple mechanism for returning the marked assessments to students. With the mechanism employed to provide the feedback documents in vUWS, 83% of students indicated that it was either ‘Easy’ or ‘Very easy’ to obtain the electronic feedback.

The following unedited open-ended comments from students, relating to the best aspects of receiving feedback generated by the marking tool, are encouraging since they show that students did appreciate a number of aspects of the marking software tool:

Response 3 Easy access to feedback

Response 5 pin points exactly what needs to be improved

Response 6 If I feel the need to not attend lectures or practicals, I still receive my results in a timely manner. If an assignment has been extended into the stuvac or is due in the final weeks of semester, then results can be obtained without the need of contacting uni after semester end


Response 7 This tool allows students to read feedback that is specific to some aspects. Easy to find comments with the link functions used.

Response 8 No paper needed. Visual aspects, we knew exactly what he was talking about because it was clearly marked on the assignment.

Response 9 Instant -- can be viewed pretty much anywhere

Response 10 Quick and easy to access the areas of the assignment that the comments refer to.

The marker that used the MarkTool software made the following comment regarding the best aspects of the tool:

“The Marktool software allows you to build in the marking criteria into the feedback document by building a template for the feedback. This template is then used for each assessment to be marked and is built into the annotated pdf that is produced for the feedback. This enables the student to see the annotations and the marking criteria in the one feedback document. Another aspect that is very useful is being able to create and store comments that can then easily be re-used. Also the MarkTool software automatically generates sections in the generated report which accumulates the comments and the marks for each criteria for the student.”

The marker indicated that certain aspects of the tool needed improvement. For him, the lack of integration with vUWS and the inability to import a list of students into the tool were issues, as were limitations found in the Frequently Used Comments facility:

“It would be good to be able to import a list of students to be able to select the student rather than entering their details for each new assessment. Also, the Frequently Used Comments section is not categorised by the marking criteria even though when you create the comment in the first instance you also designate which marking criteria it belongs to. Categorising the comments in the Frequently Used Comments would make it much easier to find a previous comment that you want to reuse rather than having to search the entire list of comments.”

Another issue that this marker identified is the need to read content on screen rather than on paper:

“This has always been an issue for me as I still like to be able to read from paper and annotate directly on the paper. However, I see clear advantages to electronic submission of assignments and marking for both the students and myself and so I try as much as I can to use electronic forms of assessment submission and marking.”


From the results stated above it is clear that limitations do exist with both marking tools that were trialled. However, both tools provide functionality that improves the marker's ability to reuse comments, and this reusability may lead to improvements in the quality and quantity of feedback given to students, since it is easier to make extensive comments when they can be stored and reused. Importantly, students surveyed in both units recognised the value of using the feedback given in one assessment to improve their performance in subsequent assessments.


Recommendations

Based upon these results and the experience and observations of the team members we make the following recommendations in relation to electronic marking software at UWS:

• Neither the Markin nor the MarkTool software trialled in this project is completely suitable for all types of assessment at UWS. Rather, they are suited only to a subset of assessments which require students to produce electronic documents that can be easily converted to PDF, RTF or plain text. Many assessments at UWS could fit this criterion even if they are presently paper-based. Other limitations, such as a lack of integration with vUWS, also exist with these tools and would hinder their uptake at UWS.

• The tools trialled are not suitable for the marking of web applications or of computer programs in general and so would have limited use within the School of Computing and Mathematics.

• If the type of assessment is directly appropriate for a tool's functionality, tools of this nature can improve the quantity and quality of feedback provided to students and can improve the speed of returning feedback, provided they are not too difficult to set up. It is recommended, then, that other marking tools be investigated, not necessarily with the vision of finding one tool that can be used for all assessment types, but perhaps several tools which can be customised depending upon the assessment being marked.

• Students do find the feedback useful and do use it to improve their performance in subsequent assessments. They also find the electronic nature of such feedback easy to access. From the student perspective we see it as important that other tools be investigated.

• For a marking tool to be more useful to most academic staff there needs to be a high level of integration with the university's elearning environment. This is not to say that the tool should be part of the elearning environment, but rather that there need to be easy mechanisms to:

o integrate the submitted assessments with the tool,
o utilise student enrolment lists within the tool to minimise the administrative aspect of marking,
o enable seamless return of results/marks and feedback documents to students without adding extra burden to the academic staff member or the student.

• Staff must be convinced that marking electronically will not take more time than marking paper-based assessments. To do this, staff must see that the administrative aspect of marking is reduced by such tools rather than increased.

• Due to the distributed nature of the university, with markers for a unit residing on different campuses, the marking tool must also enable easy distribution and management of submitted assignments to and from markers.

• Staff using electronic marking tools must be relatively comfortable reading assessment content on screen rather than in printed form. Staff who are wedded to printed documents are unlikely to find the tools an aid in their marking unless they are sufficiently motivated to make the change.


Appendix A

Student Survey Questions

Survey on Assignment Feedback

During this semester we have trialled the use of an online marking tool which allows us to directly annotate on your assignments. This survey is to gauge the effectiveness of this tool in providing meaningful and useful feedback in a timely manner.

Question 1: In comparison to normal written assignment feedback, the quantity of comments was: a. More b. Less c. About the same

Question 2: In comparison to normal written assignment feedback, the quality of comments was: a. Worse b. Better c. About the same

Question 3: In comparison to normal written assignment feedback, how long did it take to get your assignment back? a. Faster b. Slower c. About the same

Question 4: How useful did you find this feedback? a. Very useful b. A little useful c. Not very useful d. Useless

Question 5: Would you make changes to future assignments based upon this feedback? (e.g. from Assignment 1 to Assignment 2) a. Yes b. No c. Maybe

Question 6: How easy was it to get your assignment back compared to paper-based assignments? a. Very easy b. Easy c. Difficult d. Very difficult

Question 7: Approximately what percentage of the comments received did you read? [open-ended response]

Question 8: Would you prefer your assignments marked with this tool? a. All assignments b. Some assignments c. None

Question 9: What were the best aspects of receiving feedback generated by this tool? [open-ended response]

Question 10: What improvements can be made in the use of this tool? [open-ended response]

Appendix B

Staff Survey Questions

Survey on Trial of Marking Tools

Dear Colleague, thank you for trialling the use of either Markin or MarkTool. Please answer the following questions to provide feedback on the usefulness and usability of the chosen software tool.

1) Was reading off the screen an issue for you?
2) Did the tool save time in your marking?
3) Do you think the students like receiving feedback in this way?
4) Would you use the tool again? Why, or why not?
5) What were the best aspects of the tool?
6) What were some aspects of the tool that need improvement?
7) How easy was it to obtain the submitted assignments?
8) How easy was it to return the marked assignments to the students?
9) What other issues did you have?
10) How did the use of this tool compare to others that you may have used?
11) How did the use of this tool compare to hand-written feedback?

References

Heo, M. (2003). A learning and assessment tool for web-based distributed education. Paper presented at the Proceedings of the 4th conference on Information technology curriculum.

Loddington, S., Pond, K., Wilkinson, N., & Willmot, P. (2009). A case study of the development of WebPA: An online peer-moderated marking tool. British Journal of Educational Technology, 40(2), 329-341.

Plimmer, B., & Apperley, M. (2007). Making paperless work. Paper presented at the Proceedings of the 7th ACM SIGCHI New Zealand chapter's international conference on Computer-human interaction: design centered HCI.


Plimmer, B., & Mason, P. (2006). A pen-based paperless environment for annotating and marking student assignments. Paper presented at the Proceedings of the 7th Australasian User interface conference - Volume 50.

Willson, J. (2004). Enhancing feedback to students: Technology can help. Learning & Teaching in Action, 3(3), 19-24.
