An E-Assessment Website to Implement Hierarchical Aggregate Assessment (http://www.eval.uqam.ca/cluster/index.php)
By:
Martin Lesage, Gilles Raîche, Martin Riopel, Frédérick Fortin, Dalila Sebkhi
Centre CDAME (Collectif pour le Développement et les Applications en Mesure et Évaluation) (http://www.cdame.uqam.ca/)
Département d'Éducation et de Pédagogie, Université du Québec à Montréal
World Academy of Science, Engineering and Technology (WASET), ICCSSE 2014 International Conference on Computer Science and Software Engineering, February 27-28, 2014, Rio de Janeiro, Brazil
Département d'Éducation et Pédagogie
Université du Québec à Montréal (UQÀM)
1205, rue Saint-Denis, Room N-4900
Montréal (Québec), Canada, H2X 3R9
Phone: (514) 987-0011
Summary of the Presentation
– Presentation of the speaker
– Professional background of the speaker
– Introduction
– Software demonstration
– Problem statement
– Research object
– Research objectives
– Hypothesis, conceptual and theoretical paradigms
– Methodology
– Preliminary results
Background of the speaker (Martin Lesage)
– Bachelor's Degree, Engineering Computer Science (Laval, 1991)
– Bachelor's Degree, Management Computer Science (Laval, 1994)
– Bachelor's Degree, Mathematical Computer Science (1994-1996, not completed)
– Master's Degree, Information Technology (IT) (UQÀM, 2006)
– Ph.D. student in Education (UQÀM, 2007; to be completed in 2013)
Professional background of the speaker
– Warrant Officer, Reserve Forces, Canadian Military Engineers (1982-today)
– Multimedia Programmer, Tactics School, G3 AITIS DT Cell, CFB Gagetown (1997-2001)
– ITC 1 distance study package
– Infantry 6B distance study package
– Started with Col Paul Kearney (DS, Tactics School) and Col Chris Hand
Col Kearney earned the Order of Military Merit for developing the first multimedia course programmed by the army itself rather than by independent firms (IBM, CGI, DMR and other multimedia companies)
College teacher in computer science (2001-2002)
– CÉGEP de Rivière-du-Loup
Course lecturer in computer science (2002-2004)
– Université du Québec à Rimouski (UQAR) - DMIG
Introduction
– Ph.D. project using a research and development (R&D) methodology
– Core software development concept: hierarchical aggregate assessment (cluster assessment)
– Goal: to develop an Internet application implementing hierarchical aggregate assessment (cluster assessment)
Introduction
– This presentation is a short introduction to the « Cluster » application Ph.D. project; only a brief introduction will be given on how to enter course material in the application
– Partnership / collaboration between CDAME and IT&E Research and Development
– Publication in the Canadian Defence Journal / Canadian Military Journal
– The application is currently used by Quebec Army Cadets aged 12 to 15
– The CDAME would like to test the application with adult students
– The courses presented by the application are « Topography: 8-figure coordinates and navigation with the map » and « Instruction techniques »
– The CDAME would like other distance courses to be placed on the « Cluster » application
Introduction (continued)
Definition of hierarchical aggregate assessment (cluster assessment):
– A proposed field (to be established by this Ph.D.) of the teamwork assessment domain, in which teams have several levels of supervision: team leaders (who may be students) are assessed by one or more team administrators (who may be students or teachers), forming an inverted-tree hierarchical structure
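This inverted-tree structure can be sketched in code. The following is a minimal, hypothetical Java sketch (class names, field names, and the equal-weight averaging rule are assumptions for illustration, not the actual « Cluster » implementation): each node represents a participant, stores the marks that participant received, and a subtree mark is aggregated recursively.

```java
// Hypothetical sketch (not the actual « Cluster » code): an inverted-tree
// team hierarchy where each node holds the marks given to that participant
// and a team's mark is aggregated from its subtree.
import java.util.ArrayList;
import java.util.List;

class TeamNode {
    final String name;      // e.g. "Cadet 1 (Tp WO)"
    final String role;      // "manager", "leader" or "member"
    final List<Double> marks = new ArrayList<>();        // assessments received
    final List<TeamNode> subordinates = new ArrayList<>();

    TeamNode(String name, String role) { this.name = name; this.role = role; }

    TeamNode add(TeamNode child) { subordinates.add(child); return child; }

    // Average over this node's own marks and each subordinate's aggregate
    // (a simple, illustrative aggregation rule).
    double aggregateMark() {
        double sum = 0; int n = 0;
        for (double m : marks) { sum += m; n++; }
        for (TeamNode s : subordinates) { sum += s.aggregateMark(); n++; }
        return n == 0 ? 0 : sum / n;
    }
}

public class ClusterSketch {
    public static void main(String[] args) {
        TeamNode manager = new TeamNode("Cadet 1 (Tp WO)", "manager");
        TeamNode lead2 = manager.add(new TeamNode("Cadet 2 (Sect Commander)", "leader"));
        TeamNode lead3 = manager.add(new TeamNode("Cadet 3 (Sect Commander)", "leader"));
        lead2.add(new TeamNode("Cadet 4 (Sect Member)", "member")).marks.add(80.0);
        lead2.add(new TeamNode("Cadet 5 (Sect Member)", "member")).marks.add(70.0);
        lead3.add(new TeamNode("Cadet 6 (Sect Member)", "member")).marks.add(90.0);
        lead3.add(new TeamNode("Cadet 7 (Sect Member)", "member")).marks.add(60.0);
        System.out.println(manager.aggregateMark());   // prints 75.0
    }
}
```

Here each section's members average to 75, so the manager's aggregate is also 75.0. The application could of course weight levels or assessors differently; this only shows how an assessment rolls up the inverted tree.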
Hierarchical aggregate assessment
A multi-level, all-round (360°) view of formative and summative assessment of group members
Summative assessment:
• Assessment by the teacher
• HTML exams
• Homework to submit
Combined process of formative and summative assessment:
Examples:
– Classroom newspaper
– Navigation with the map
– Teaching assessment
Formative assessment:
• Self-assessment
• Assessment by team members
• Assessment by team leaders
• Assessment by the team manager / administrator
Course planning
Three documents are needed to plan a course with the « Cluster » Internet application:
– Course syllabus or schedule
– Course material in electronic format:
• Course notes (PDF)
• PowerPoint presentations
• Word documents
– Team organigram / Course candidates hierarchical organigram
Navigation with the map in sections
Team manager (administrator): Cadet 1 (Tp WO)
– Team leader: Cadet 2 (Sect Commander)
– Team leader: Cadet 3 (Sect Commander)
– Team members: Cadet 4, Cadet 5, Cadet 6 and Cadet 7 (Sect Members)
Course Examples
– High school class newspaper
– Navigation with the map
– Teaching assignments of university education students
Teaching assignment
[Organigram: teaching assignment of a UQÀM education student. Actors: the school principal, the associated teacher, the education student, the assignment workplace (school) registrar, and the assignment supervisor (a university professor in education at the Université du Québec à Montréal). Assessment flows: assessing and formative assessment of 13 skills, an assignment mark (summative assessment), an assignment description portfolio, assessment of the assignment environment, group meetings, and an end-of-assignment report document.]
Software demonstration
Cluster application
– Student mode
– Administrator / assessor mode
– Methods to enter course material
– Example: navigation with the map in sections
Navigation with the map in sections
Team manager (administrator): Cadet 1 (Tp WO)
– Team leader: Cadet 2 (Sect Commander)
– Team leader: Cadet 3 (Sect Commander)
– Team members: Cadet 4, Cadet 5, Cadet 6 and Cadet 7 (Sect Members)
Problem statement
– Moodle, WebCT and Blackboard cannot implement complex assessment tasks in collaborative mode using a multi-level hierarchy
– All current research and literature only consider a single-level hierarchy (a team leader and team members), in:
• military teams
• medical resuscitation teams
Problem statement
[Diagram: positioning of the proposed Ph.D. concept. Education → Assessment / Evaluation → Teamwork assessment → Hierarchical aggregate assessment (proposed Ph.D. concept). An example multi-level hierarchy is shown: a president supervising two administrators, who supervise four team leaders, who supervise eight team members.]
Research objectives
– To define hierarchical aggregate assessment as a field of teamwork assessment
– To develop a pedagogical product: an Internet software application implementing hierarchical aggregate assessment
– To perform functional testing of the « Cluster » application with experimental subjects performing complex assessment tasks
– The complex assessment tasks will be performed in teams with a hierarchical organizational structure having multiple levels of supervision
Goals of the Ph.D. project
This Ph.D. project aims to implement an assessment mode that can evaluate the skills (competencies) of:
– the whole group
– the teams
– the team manager
– the team leaders
– the team members
Hypothesis & Theoretical and conceptual paradigms
The Ph.D. project research question is: « How can assessment be implemented on the Internet with a new approach called hierarchical aggregate assessment, or cluster assessment? »
Methodology of Research
– Pedagogical product developed according to a research and development (R&D) methodology, whose theoretical considerations bear on hierarchical aggregate assessment
– Interpretive epistemological position, which favours a qualitative analysis process for data collection (Loiselle et Harvey, 2007, p. 48)
– In this Ph.D. project, data collection will be done with observations, questionnaires and checklists (Loiselle, 2001, p. 81)
Methodology of Research
– Problem to solve: develop a software application that implements hierarchical aggregate assessment
– Deductive analysis based on theoretical considerations: no research or author in the assessment field is studying the assessment of organizations having a multi-level hierarchy
Methodology of Research
– Development of the idea: divide teamwork assessment into two fields:
• single hierarchical level
• multiple hierarchical levels (hierarchical aggregate assessment)
– Implementation of the action model: implement the management of a multi-level hierarchy in the software data structure and database
– Implementation of a software prototype: development of the « Cluster » application in Java, supported by an SQL database
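The action-model step (managing a multi-level hierarchy in the data structure and database) is commonly handled with a self-referencing parent key. The sketch below is a hypothetical Java illustration, not the actual « Cluster » code: each participant row carries the id of its supervisor, and the tree is rebuilt in memory in a single pass. Table and field names are assumptions.

```java
// Hypothetical sketch: rebuilding a multi-level team hierarchy from flat
// (id, parentId) rows, as they might come out of an SQL table such as
//   CREATE TABLE participant (id INT, parent_id INT, name VARCHAR(64));
// Names are illustrative, not the actual « Cluster » schema.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class Participant {
    final int id;
    final Integer parentId;   // null for the team manager (tree root)
    final String name;
    final List<Participant> subordinates = new ArrayList<>();

    Participant(int id, Integer parentId, String name) {
        this.id = id; this.parentId = parentId; this.name = name;
    }

    // One pass to index rows by id, one pass to attach each row to its parent.
    static Participant buildTree(List<Participant> rows) {
        Map<Integer, Participant> byId = new HashMap<>();
        for (Participant p : rows) byId.put(p.id, p);
        Participant root = null;
        for (Participant p : rows) {
            if (p.parentId == null) root = p;
            else byId.get(p.parentId).subordinates.add(p);
        }
        return root;
    }
}

public class HierarchyDemo {
    public static void main(String[] args) {
        List<Participant> rows = List.of(
            new Participant(1, null, "Cadet 1 (Tp WO)"),
            new Participant(2, 1, "Cadet 2 (Sect Commander)"),
            new Participant(3, 1, "Cadet 3 (Sect Commander)"),
            new Participant(4, 2, "Cadet 4 (Sect Member)"),
            new Participant(5, 3, "Cadet 6 (Sect Member)"));
        Participant root = Participant.buildTree(rows);
        System.out.println(root.name + " supervises " + root.subordinates.size() + " leaders");
    }
}
```

The advantage of the parent-key pattern is that it supports any number of supervision levels with a single table, which is what distinguishes hierarchical aggregate assessment from single-level teamwork assessment.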
Methodology of Research
Functional testing (beta tests):
– Software developers of the « Cluster » application
– CDAME staff and scientists
– Empirical testing: teaching of a geography class in high school
– Systematic testing: cartography course (8-figure coordinates and navigation with the map) conducted with three Army Cadet corps in Quebec
Conclusion
– To define the field of hierarchical aggregate assessment and publish the theory to obtain the Ph.D.
– The « Cluster » Internet software application is now developed to a satisfactory level
– Several trials and experiments are now in progress:
– Internet distance geography course in High School. Tutor: Ms. Dalila Sebkhi (Done)
– Internet distance course in cartography (8-figure coordinates and navigation) for Army Cadets (in progress)
– Management of teaching assignments I, II, III and IV for UQAM education students (future)
Bibliography
Calisir, F., & Calisir, F. (2003). The relation of interface usability characteristics, perceived usefulness, and perceived ease of use to end-user satisfaction with enterprise resource planning (ERP) systems. Computers in Human Behavior, 20, 505-515.

Chin, J. P., Diehl, V. A., & Norman, K. L. (1988). Development of an instrument measuring user satisfaction of the human-computer interface. In Proceedings of SIGCHI '88 (pp. 213-218). New York: ACM/SIGCHI.

Freeman, M., & McKenzie, J. (2000). Self and peer assessment of student teamwork: Designing, implementing and evaluating SPARK, a confidential, web-based system [Online]. In Flexible Learning for a Flexible Society, Proceedings of ASET-HERDSA 2000 Conference, Toowoomba, Qld, 2-5 July. ASET and HERDSA. Available: http://www.aset.org.au/confs/aset-herdsa2000/procs/freeman.html

Ghani, J. A., & Deshpande, S. P. (1994). Task characteristics and the experience of optimal flow in human-computer interaction. The Journal of Psychology, 128(4), 381-391.
Bibliography
Lewis, J. R. (1993). IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use (Tech. Report 54.786) [Online]. Boca Raton, FL: IBM Corp. Available: http://drjim.0catch.com/usabqtr.pdf

Loiselle, J. (2001). La recherche développement en éducation: sa nature et ses caractéristiques. In M. Anadón & M. L'Hostie (Eds.), Nouvelles dynamiques de recherche en éducation (pp. 77-97). Québec: Les Presses de l'Université Laval.

Loiselle, J., & Harvey, S. (2007). La recherche développement en éducation: fondements, apports et limites. Recherches qualitatives, 27(1), 40-59.

Nonnon, P. (1993). Proposition d'un modèle de recherche de développement technologique en éducation. In Regards sur la robotique pédagogique. Technologies nouvelles et éducation (pp. 147-154). Paris: Publications du service de technologie de l'éducation de l'Université de Liège et de l'Institut national de recherche pédagogique.
Bibliography
Nonnon, P. (2002). La R&D en éducation. In G.-L. Baron & É. Bruillard (Eds.), Actes du symposium international francophone sur les technologies en éducation, INRP (pp. 53-59). Paris, France.

Perlman, G. (2012). Questionnaire for User Interface Satisfaction (QUIS) [Online]. Available: http://hcibib.org/perlman/question.cgi?form=QUIS

Ritchie, P. D., & Cameron, P. A. (1999). An evaluation of trauma team leader performance by video recording. Australian and New Zealand Journal of Surgery, 69, 183-186.

Shneiderman, B. (1992). Designing the User Interface: Strategies for Effective Human-Computer Interaction (2nd ed.). New York: Addison-Wesley.
Bibliography
Sittig, D. F., Kuperman, G. J., & Fiskio, J. (1999). Evaluating physician satisfaction regarding user interactions with an electronic medical record system. In Proceedings of the American Medical Informatics Association (AMIA) Annual Symposium, 400 (4).

Sugrue, M., Seger, M., Kerridge, R., Sloane, D., & Deane, S. (1995). A prospective study of the performance of the trauma team leader. The Journal of Trauma: Injury, Infection and Critical Care, 38(1), 79-82.

Walker, J. H., Sproull, L., & Subramani, R. (1994). Using a human face in an interface. In Proceedings of the 1994 ACM Conference on Human Factors in Computing Systems (CHI '94) (pp. 85-91). New York: ACM Press.

Wikipedia. (2012). Questionnaire for User Interaction Satisfaction (QUIS) [Online]. Available: http://en.wikipedia.org/wiki/Questionnaire_for_User_Interaction_Satisfaction_(QUIS)