
Student Perceptions of Chemistry Experiments with Different Technological Interfaces: A Comparative Study



Samuel J. Priest, Simon M. Pyke,* and Natalie M. Williamson

School of Chemistry and Physics, University of Adelaide, Adelaide, South Australia 5005, Australia


ABSTRACT: Microcomputer based laboratory activities have been suggested to have a number of benefits in science education. However, their implementation produces mixed results, and student perception data have in the past yielded responses that are negative regarding the technology. This work presents a case study of three first year undergraduate chemistry experiments in which negative views of hand-held graphic data logger devices were significantly alleviated by replacing them with notebook computers equipped with equivalent software. Widespread improvements in student perceptions of their laboratory experience were observed, prominently including reduced negativity concerning the equipment, an increased perception of understanding, an increased perception of simplicity, and more positive perceptions of overall laboratory experience. These improvements, which were not limited to perceptions directly concerning the technology specifically, suggest a broad range of substantial advantages is present in providing a suitable user interface.

KEYWORDS: Computer-Based Learning, Laboratory Instruction, Laboratory Computing/Interfacing, First-Year Undergraduate/General, Chemical Education Research

FEATURE: Chemical Education Research

■ INTRODUCTION

Laboratory work provides a wide range of benefits for learning in chemistry.1 Practical laboratory work has historically been claimed to be beneficial for multiple reasons,2,3 including exposing students to concrete experiences with objects and concepts mentioned in the classroom.4,5 Connecting concrete, macroscopic observations to the abstract representations and symbolizations used in science is well understood to be a key hurdle in the understanding of chemistry concepts6−9 and is a task hailed as being one of the most difficult challenges facing science teachers, as well as one of the most important.10 In part to assist in the teaching of abstract concepts, and to engage students with technology they may use as working scientists, technology plays an increasingly large role in science education, including data collection and display in the laboratory setting.11

A large body of research exists concerning activities in which computing devices are used in conjunction with measurement devices (probeware) to gather and display data in laboratory teaching exercises. Students may be required to use hand-held graphing data logger devices12−14 specifically designed to display and analyze data collected from associated probeware, or alternately the probeware may be connected to a notebook or desktop computer equipped with the necessary software.15−18 These activities have been termed microcomputer based laboratory (MBL) activities.

Because of their ability to pair events with their graphical representation in real time, MBL activities have been suggested to assist in making the connection between the concrete and the abstract,19−21 notably in the form of improving students' graph interpretation skills. More active student engagement in constructing understanding has been suggested as an additional benefit, arising from increased focus on data interpretation instead of data collection and increased student collaboration.22,23 These MBL activities have been said to have the ability to "transform" laboratory activities due to these advantages.24

While a number of studies have supported these claims,14,25−31 particularly in the context of inquiry based learning,12,32,33 overall the results of implementing MBL activities appear mixed,34 and successful implementation of MBL activities appears to be a complex issue.35,36 Studies exist which counter the suggested benefits concerning student understanding of graphs37 and the connection between the macroscopic and more abstract.38 Other prominent issues appear to be the possibility that students may watch uncritically as the computer "does all the work for them", as well as students encountering difficulty using the computing devices themselves.35,39

Studies documenting student perspectives of this technology reveal that student views of MBL vary to a great extent.40 Negative issues raised again include students feeling disengaged as the data logger does all the work, as well as a lack of technical familiarity.41 Students are reported to claim the technology is complex and difficult to use42 and that they do not have the time to "play" with the technology and undergo trial and error processes of learning like they would do with home computers.41 These issues, including that students have difficulty manipulating and using MBL technology, notably in "older" forms, have also been recognized in teacher views.39

The disadvantage of having to learn how to use the technology on top of the exercise's learning objectives has been observed to outweigh the advantages of MBL in the past,43 and it has been suggested that MBL activities may be better suited to those who have a better grasp of both the content and the handling of sensor technology associated with the microcomputer devices.44 This is reminiscent of classroom based studies suggesting students benefit more from computer based exercises if they are comfortable with their use,45 and it is not unreasonable to expect that the same is true of computing devices in the laboratory setting. A recent review of studies concerning MBL activities in secondary school chemistry suggests more research needs to be conducted to discover what can be done to assist students in overcoming the issues they express.40

Purpose and Context of This Study

This study reports differences in student perception data from specific first year undergraduate chemistry experiments conducted by two different cohorts of students: one cohort using a hand-held data logger device to collect and display data in the relevant experiments, and the other performing identical tasks using a notebook computer instead. This change in technological interface was made in response to negative views expressed by students regarding the data logger devices, gathered using a survey instrument distributed with original intentions other than the research presented here. The same survey instrument was used to monitor student perceptions the following year, after the change had been made, and the results are presented in the following discussion. The observations made are suggested to be of use in moving toward overcoming the reported student difficulties associated with microcomputer based laboratories.

■ METHODOLOGY

Survey Instrument Use and Administration

Student perception data reported in this research were obtained using a survey instrument commonly used as part of the Advancing Science by Enhancing Learning in the Laboratory (ASELL) project.46 The ASELL project was initially conceived as a means of combatting falling enrolments and interest in physical chemistry by establishing a collaborative effort to disseminate information and develop a database of educationally sound experiments transferrable across institutions.47,48 In part using data concerning students' perceived learning experience to evaluate experiments for their positive attributes, the project expanded from its beginnings in physical chemistry (APCELL)49,50 to chemistry in general (ACELL).51−54 After calls for further expansion,55,56 the project evolved to encompass a broader range of science disciplines (ASELL).57−59 The survey used as part of the ASELL project was initially utilized for purposes other than the research reported here. However, after yielding negative responses concerning the graphing data logger devices used in some experiments in 2011, despite no survey items directly targeting this facet of the laboratory experience, the same surveys were also used to monitor improvements following the replacement of these devices in 2012.

The survey instrument commonly used as part of the ASELL project, specifically the ASELL Student Learning Evaluation (ASLE) survey, includes 14 Likert-style response format items, as well as five open response items, designed to gain a profile of student perceptions of the experiments (as shown in Table 1).

ASLE surveys were distributed to students at the completion of each experiment for the purposes of evaluating their perceptions specific to each separate experiment. Completion of the survey was presented to the students as optional and anonymous. Though these surveys were distributed for each experiment conducted over the course of 2011 and 2012 (12 experiments in each year), only a small number of these experiments involved the use of electronic data collection (3 in each year), and for that reason only the results collected from those experiments using electronic data collection are specifically discussed herein.

Table 1. Survey Items Included on the ASLE Instrument

Items 1−14 (Likert-style response format):
1. This experiment helped me to develop my data interpretation skills
2. This experiment helped me to develop my laboratory skills
3. I found this to be an interesting experiment
4. It was clear to me how this laboratory exercise would be assessed
5. It was clear to me what I was expected to learn from completing this experiment
6. Completing this experiment has increased my understanding of chemistry
7. Sufficient background information, of an appropriate standard, is provided in the introduction
8. The demonstrators offered effective supervision and guidance
9. The experimental procedure was clearly explained in the lab manual or notes
10. I can see the relevance of this experiment to my chemistry studies
11. Working in a team to complete this experiment was beneficial
12. The experiment provided me with the opportunity to take responsibility for my own learning
13. I found the time available to complete this experiment was
14. Overall, as a learning experience, I would rate this experiment as

Items 15−19 (open response format):
15. Did you enjoy doing the experiment? Why or why not?
16. What did you think was the main lesson to be learnt from the experiment?
17. What aspects of the experiment did you find most enjoyable and interesting?
18. What aspects of the experiment need improvement and what changes would you suggest?
19. Please provide any additional comments on this experiment here.

Experiments Conducted

During the year 2011 at the University of Adelaide, the chemistry experiments studied within this research involved student use of the PASCO Xplorer GLX hand-held graphing data logger device.13,14 Student feedback data, collected for other research purposes using the ASLE instrument, revealed negative perceptions of these devices, which had been utilized in the teaching laboratory for a number of years prior. In response to this feedback, the devices were replaced with notebook computers equipped with software replicating the capabilities of the data loggers: PASCO DataStudio.15 The only differences between the two years, aside from this technological interface change, were the laboratory demonstrators and the portion of the instruction manual devoted to use of the technology implemented (in the form of an appendix, isolated from the rest of the manual). All other features of the relevant experiments remained identical, including the tasks performed using either the data logger or the notebook computer. A total of three experiments were studied, with the perception of each contrasted between the two years. The data measurement tools and functions of the data loggers or notebooks utilized were different in each of the experiments surveyed.

In "Vapour Pressure", students measure the vapor pressures of various mixtures of cyclohexane and ethanol by manipulating apparatus and reading equilibrium pressure values from graphs generated in real time on the computer or data logger display. The data are used to investigate the applicability (or lack thereof) of Raoult's law, thereby making inferences about the similarity or difference in the intermolecular interactions of the two solvents.
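For reference (a standard result, not quoted from the lab manual): Raoult's law predicts the total vapor pressure of an ideal binary mixture from the mole fractions x and the pure-component vapor pressures P*, so measured deviations from this prediction indicate that the cyclohexane and ethanol interactions differ from those in the pure solvents.

```latex
% Raoult's law for an ideal binary mixture of A (cyclohexane) and B (ethanol)
P_{\mathrm{total}} = x_A P_A^{*} + x_B P_B^{*}
```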

"Biological Buffers" is an experiment in which students set up and perform automated titrations, where pH values are plotted against the volume of liquid added in real time on the computer or data logger display. Students are required to analyze these graphical data to determine the effective pH range of buffer solutions, equivalence points, and pKa values.
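The pKa analysis rests on the standard Henderson-Hasselbalch relation (a textbook result; the manual's exact treatment is not reproduced here), which implies that the pH read from the titration curve at the half-equivalence point equals the pKa:

```latex
% Henderson-Hasselbalch equation for a weak acid HA
\mathrm{pH} = \mathrm{p}K_a + \log_{10}\!\frac{[\mathrm{A}^-]}{[\mathrm{HA}]}
% at half-equivalence, [A^-] = [HA], so pH = pKa
```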

In "Determination of Copper(II) Ion Concentration", students use a colorimeter connected to the computer or data logger to measure the absorbance values of a number of standard solutions of different concentration. Students then construct a calibration plot using the measured absorbance values on either the data logger or computer and use the resulting linear relationship to determine the concentration of an unknown solution by applying Beer's law.
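As a sketch of that final analysis step (illustrative only: the values below are invented, and in the experiment the fit was produced on the data logger or in DataStudio rather than in a script):

```python
# Minimal sketch of the Beer's law calibration described above.
# The concentration/absorbance values are invented for illustration;
# in the experiment they come from the colorimeter via the data
# logger or DataStudio, not from this script.
import numpy as np

conc = np.array([0.02, 0.04, 0.06, 0.08, 0.10])        # Cu(II) standards, mol/L (hypothetical)
absorbance = np.array([0.11, 0.22, 0.34, 0.45, 0.55])  # measured A for each standard

# Beer's law: A = epsilon * l * c, i.e. a straight line through the data.
slope, intercept = np.polyfit(conc, absorbance, 1)

# Concentration of the unknown from its measured absorbance.
a_unknown = 0.30
c_unknown = (a_unknown - intercept) / slope
print(f"unknown [Cu(II)] = {c_unknown:.3f} mol/L")
```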

Data Treatment

Qualitative comments received in response to the open response items on the ASLE instrument (items 15−19) were assigned codes based on their content and also on whether the comment was of a positive, negative, or neutral nature. The 13 content-specific codes used for all open response survey items except item 16 were preestablished, devised for the purposes of separate research conducted in previous years (unpublished data). Codes used for item 16, in the case in which this item was used in this research, were devised as appropriate for the specific experiment's learning objectives. The codes used are presented in the Supporting Information alongside the associated data. Frequencies of comments which were and were not assigned each of these codes were enumerated for each survey item. Fisher's exact test was used to compare these frequencies between the two forms of each experiment individually.

Responses to Likert-style items were assigned scores corresponding to their position on the five point scale. Responses to items 1−12 of the ASLE instrument were assigned successive integer scores from +2 to −2 ("strongly agree" to "strongly disagree") with zero ("neutral") as the midpoint, +2 being the optimal response. Item 13 responses, concerning time availability, were also scored from +2 to −2 ("way too much" to "nowhere near enough") with zero ("about right") as the midpoint and optimal response. The final Likert-style item, concerning overall learning experience, was similarly scored from +2 to −2 ("excellent" to "very poor") with zero ("average") as the midpoint and +2 as the optimal response.

Mean values of response scores received from each student cohort, from each year, for each experiment, for each Likert-type response format item on the surveys were calculated for the purposes of comparison (labeled as m2011 and m2012 for mean values from 2011 and 2012 data, respectively). This is in line with both the standard methodology of the ASELL project54,58,60,61 and recommendations from the statistics literature.62 Mean scored responses to the Likert-type response format items were compared between years for each of the three experiments using the t test for unequal variances,63,64 using this test in preference to the equal variances test in all cases, as recommended in the statistics literature for data sets of unequal sample size.65,66 The value of alpha (α) was selected to be 0.05 for the purposes of statistical testing, and two tailed probability values were obtained for all comparisons made. In order to account for the issue of multiple comparisons and control the family-wise error rate, the unweighted Bonferroni method67 was applied. All hypothesis tests conducted to compare the same experiment between the two years of study were taken to be of the same family of hypotheses, thereby yielding one family of hypothesis tests for each of the three experiments in the study. Consequently, statistically significant difference was inferred at p < α/n, where n is the number of hypothesis tests conducted to compare the relevant experiment's two different forms.
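A minimal sketch of this scoring and comparison procedure, assuming responses are already transcribed as category labels (the response lists below are invented; the authors used SPSS, and scipy's ttest_ind with equal_var=False is an equivalent unequal-variances Welch test):

```python
# Sketch of the Likert scoring and Welch's t test with a Bonferroni
# threshold, as described above. The response lists are invented.
from scipy import stats

score = {"strongly agree": 2, "agree": 1, "neutral": 0,
         "disagree": -1, "strongly disagree": -2}

responses_2011 = ["agree", "neutral", "disagree", "agree", "neutral"]
responses_2012 = ["strongly agree", "agree", "agree", "strongly agree"]

x = [score[r] for r in responses_2011]
y = [score[r] for r in responses_2012]

# Welch's t test (unequal variances), two tailed.
t, p = stats.ttest_ind(x, y, equal_var=False)

# Unweighted Bonferroni: significant only if p < alpha / n_tests,
# where n_tests is the number of comparisons in that experiment's family.
alpha, n_tests = 0.05, 20
print(f"t = {t:.2f}, p = {p:.3g}, significant: {p < alpha / n_tests}")
```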

Student Cohorts Sampled

Data featured in this study were obtained from students enrolled in the first year undergraduate courses Chemistry IA and Chemistry IB, both of which have prerequisite prior chemistry study requirements. Cohort sizes typically range from approximately 450 to 550 students. The format of the laboratory program for both of these courses in both years was similar, with no alignment with lecture material and a randomized sequence of experiments during the semester. Students enrolled in these courses were randomly assigned practical groups, with each group conducting experiments in a different sequence during fortnightly laboratory sessions. Each experiment was conducted over a 3 h time frame and involved the completion of report book questions. Students were also required to complete a set of preliminary computer exercises the week prior to each laboratory session.

The number of students sampled varied between experiments and between years, with students presented with the noncompulsory ASLE instrument at the end of their laboratory sessions. This sampling technique, as well as the relatively low number of responses compared to the size of the student population, presents the possibility of introducing bias into the sample, deviating from a true representation of the student population. Evidence of these effects was investigated, and they were found not to be present to any degree that would influence the conclusions drawn.

Investigating Evidence of Sampling Bias

Testing the equality of student cohorts between years and analyzing the samples for evidence of bias was conducted using Rasch analysis. The effects of sampling from different weeks of the semester were also examined where possible. However, given that the week sampled was only available for the 2012 data set, investigation of this topic is ongoing.


To gain a measure of each student's bias toward answering positively, a rating scale model68 was generated for each cohort, for each experiment, using the Winsteps Rasch measurement software (version 3.80.1, John M. Linacre). These Rasch models use the entire set of responses collected to estimate a measure of each individual student's propensity to respond positively, as well as an item response structure equal for all students.

Student propensity to respond positively was assumed to be normally distributed for a representative sample, an approximation also made when determining initial parameter estimates in the Rasch model optimization process.69,70 Deviations from normality in these distributions were therefore used as indicators of potential bias in the sample. Distributions were also tested for equality in order to test for equivalent propensity for students to respond positively between different subsets of the data.

Measures defining the item-response construct are reasonably informative for polychotomous survey items (such as is the case in this study) for sample sizes above approximately 50,71 and these parameters are independent of the students sampled.72 Differential item functioning (DIF) analysis, which compares these constructs between groups, is therefore robust against large differences in student propensity to respond positively when coupled with Rasch analysis.73 This technique was therefore used to identify differences in survey item response, independent of any potential large differences in overall bias (whether an artifact of sampling or otherwise) between the two groups compared. However, the specific items which differ between compared groups were not necessarily inferred, as the items showing significant DIF are not always the ones which genuinely differ between groups.74

Detailed presentation of these investigations is available as Supporting Information. Results indicate comparability between the 2011 and 2012 cohorts and that sampling bias is not evident to any extent that would influence the results of this main study. While the presence of bias cannot be ruled out in the samples taken from the copper(II) ion concentration experiment, this would not impact the broader conclusions of this study as they rest almost exclusively on data from the other experiments. Data for the copper(II) ion concentration experiment are included here for the sake of completeness, as this experiment also relied on electronic data collection. Statistical tests associated with DIF analysis were conducted within the Winsteps software, whereas other statistical tests were conducted using IBM SPSS Statistics (version 20).
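A sketch of the normality screening step described above, assuming the person measures have been exported from Winsteps to CSV files (the file names, the MEASURE column, and the choice of Kolmogorov-Smirnov as the equality test are assumptions for illustration; the paper does not name the specific tests):

```python
# Sketch of the sampling-bias screen: test whether Rasch person
# measures look normally distributed, and whether two subsets share
# a distribution. Assumes measures were exported from Winsteps to
# CSV; file names and column name are hypothetical.
import csv
from scipy import stats

def load_measures(path):
    with open(path, newline="") as f:
        return [float(row["MEASURE"]) for row in csv.DictReader(f)]

m2011 = load_measures("person_measures_2011.csv")
m2012 = load_measures("person_measures_2012.csv")

# Shapiro-Wilk: a small p suggests departure from normality,
# used here as an indicator of potential sampling bias.
for year, m in (("2011", m2011), ("2012", m2012)):
    w, p = stats.shapiro(m)
    print(f"{year}: Shapiro-Wilk W = {w:.3f}, p = {p:.3g}")

# Equality of the two distributions (two-sample Kolmogorov-Smirnov,
# one possible choice of test).
d, p = stats.ks_2samp(m2011, m2012)
print(f"KS D = {d:.3f}, p = {p:.3g}")
```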

■ RESULTS

Improved Perception of Overall Learning Experience

The effects of replacing the data logger interface with the laptop interface appear to have yielded noticeable improvements in student perception of overall learning experience. When responding to the Likert-type item "Overall, as a learning experience, I would rate this experiment as", students conducting the vapor pressure experiment responded significantly more positively (m2011 = −0.02, m2012 = 0.69, t(149.0) = −5.06, and p = 1.22 × 10⁻⁶). Once the data logger interface was replaced, this experiment in particular showed apparent improvement in a large number of respects judging by Likert-type item responses. These improvements, displayed in Figure 1, will be elaborated upon throughout the discussion.

Change in overall perception of the vapor pressure experiment was also clearly evident in the open response comments. When asked if they enjoyed the vapor pressure experiment (item 15), a significantly greater number of students gave positive responses (18 of 55 in 2011 and 54 of 79 in 2012; p = 5.34 × 10⁻⁵) and fewer gave negative responses (39 of 55 in 2011 and 31 of 79 in 2012; p = 4.09 × 10⁻⁴).

The biological buffers experiment may also have been perceived more positively overall judging by responses to the Likert-type question "Overall, as a learning experience, I would rate this experiment as" (item 14; m2011 = 0.60, m2012 = 0.88, t(185.3) = −2.52, and p = 1.26 × 10⁻²); however, this result could not be deemed statistically significant when accounting for family-wise error.
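These comment-frequency comparisons use Fisher's exact test on a 2 × 2 contingency table; for example, the item 15 counts for the vapor pressure experiment quoted above can be checked as follows (scipy stands in here for the SPSS analysis actually used):

```python
# Fisher's exact test on the item 15 comment counts quoted above for
# the vapor pressure experiment: 18 of 55 positive responses in 2011
# versus 54 of 79 in 2012.
from scipy.stats import fisher_exact

table = [[18, 55 - 18],   # 2011: positive, not positive
         [54, 79 - 54]]   # 2012: positive, not positive
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.3f}, p = {p:.2e}")
# p should land near the reported 5.34e-5
```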

Reduced Negativity about the Equipment

[Figure 1: Mean Likert-type item response scores for "Vapour Pressure" as obtained using the PASCO GLX Xplorer hand-held graphing data logger interface and the more familiar Windows interface of a laptop computer. Error bars represent the standard error in the mean value.]

Examining the qualitative comments received for the three experiments, there are a number of indicators that the laptop interface was better received than the data logger interface. During 2011, of the 24 students who stated they did not enjoy the biological buffers experiment in response to the question "Did you enjoy the experiment? Why or why not?", 21 referenced the equipment used. A number of these comments mentioned that the data logger devices were prone to error and difficult to use. Some examples from the biological buffers experiment include:

"It was ok, the GLX thing was difficult to use"; "it was annoying using the GLX"; "No. The explorer GLX is hard to use"; "I do not like the pasco"; "no glx do not work well, most people had problems"

Comments from the vapor pressure experiment for this same question further reveal issues with the use of the data loggers:

"yes however I found the xplorer GLX difficult to use and very frustrating. The video, written instructions and demonstrator could not supply sufficient information on using the xplorer GLX"; "No. I do not like using the explorer GLX"

Once the hand-held data loggers had been replaced, students conducting the biological buffers experiment made negative comments about the equipment used significantly less often in response to this same survey item, both among all comments received for this item (21 of 51 comments in 2011 and 4 of 54 comments in 2012; p = 5.96 × 10⁻⁵) and also when considering only those comments which were negative (21 of 24 comments in 2011 and 4 of 19 comments in 2012; p = 2.44 × 10⁻⁵). However, this was not clearly evident in the other two experiments, possibly because the open response comments do not require students to describe reasons for their answers; the resulting low sample sizes for specific content codes make statistical significance difficult to attain.

When the question "What aspects of the experiment need improvement, and what changes would you suggest?" was asked of students, significantly fewer equipment related negative comments were received for the biological buffers experiment in 2012 (25 of 35 comments in 2011 and 6 of 32 comments in 2012; p = 2.16 × 10⁻⁵). There was also some indication of a similar effect in the copper(II) ion concentration experiment (18 of 43 comments in 2011 and 12 of 66 in 2012; p = 4.61 × 10⁻²); however, this could be attributed to family-wise error. Among the negative comments which were received for this question about the copper ion experiment, the equipment was mentioned significantly less frequently among the improvements listed for the 2012 cohort (25 of 31 negative comments in 2011 and 6 of 25 negative comments in 2012; p = 3.38 × 10⁻⁵).

In the case of the vapor pressure experiment, comments viewing the equipment negatively could not be said to differ significantly. However, there appeared to be a positive shift in the number of students who found the equipment appealing once the data loggers had been replaced, often due to the new and unfamiliar use of technology (in this context, "unfamiliar" refers to students not having previously seen technology used in this manner in the laboratory). In response to item 15, the level of familiarity or unfamiliarity was mentioned more frequently as a reason for liking the experiment (1 of 55 total comments in 2011 and 15 of 79 comments in 2012; p = 2.20 × 10⁻³). Although this difference could be attributed to family-wise error, these comments almost exclusively made very positive reference to the enjoyable experience of the novel use of technology, and this was something not seen when using the data logger interface.

Additionally, these comments cite the ability to watch the graph change in real time in a number of cases. Some examples of the most enjoyable aspects of the experiment students described include the following:

"Working with the computer program to see live data of changes in temp (of water bath) and vapour pressure"; "watching the graph on the screen as it changed"; "recording the pressure of a system onto a real time computer graph"

These comments are reminiscent of some of the benefits suggested in the literature for the use of microcomputers in laboratories, and no comments referenced this until 2012, when the data logger devices were replaced with the laptop computers. Citing the equipment as the most enjoyable aspect of this experiment (in response to item 17) did increase to some degree (14 of 43 total comments in 2011 and 35 of 79 comments in 2012; p = 7.82 × 10⁻²) but not to a level able to be deemed statistically significant.

Greater Perception of Understanding

A broad range of indicators across all three experiments show evidence that students who used the notebooks felt a greater perception of understanding the experiments than those using the data loggers. The vapor pressure experiment in particular shows clear improvement. Significant positive changes in the mean response scores for items 5 and 6 were observed in this experiment, providing reason to suspect a potential increase in student perception of clarity of the expected learning outcomes (m2011 = 0.28, m2012 = 0.91, t(138.4) = −4.15, and p = 5.76 × 10⁻⁵) and a greater perception that the experiment increased the students' understanding of chemistry (m2011 = 0.33, m2012 = 0.98, t(142.9) = −4.21, and p = 4.58 × 10⁻⁵).

Open response comments for this same experiment corroborate this finding. In response to the question "Did you enjoy the experiment? Why or why not?", the relative occurrence of positive and negative comments relating to understanding appeared to shift in favor of more positive comments in 2012 (1 positive and 11 negative in 2011; 6 positive and 2 negative in 2012; p = 4.44 × 10⁻³). Understanding was also mentioned negatively less frequently in response to this question, both in the context of all item 15 comments received (11 of 55 comments in 2011 and 2 of 79 in 2012; p = 1.69 × 10⁻³) and considered among the negative responses only (11 of 39 in 2011 and 2 of 31 in 2012; p = 2.91 × 10⁻²). The copper(II) ion concentration experiment also shows some indication of an increased perception of understanding, with positive comments referring to understanding stated as a reason for liking the experiment (item 15) more often in 2012 (8 of 81 in 2011 and 21 of 95 in 2012; p = 4.05 × 10⁻²), though samples for this specific experiment may be biased. While these differences are not statistically significant to the extent that they cannot be attributed to family-wise error, these observations collectively reinforce the significant differences already observed in the Likert-type data. Considering the multitude of indicators that students' perceived understanding was greater, in the vapor pressure experiment at the very least, there is clear evidence that student perception of understanding increased when the laptops were used in place of the data logger devices.

Whether this improved perception of understanding is a genuine reflection of deeper learning remains unknown. A significant increase in perception of data interpretation skills development (m2011 = 0.36, m2012 = 1.06, t(131.7) = −4.90, and p = 2.76 × 10⁻⁶) was detected in responses to the first survey item for the vapor pressure experiment; however, whether this perceived improvement corresponds to genuine skill development is likewise unknown. In response to the question "What did you think was the main lesson to be learnt from the experiment?", students conducting the vapor pressure experiment in 2012 did cite at least one of the main concepts (Dalton's law or Raoult's law and when it applies, intermolecular forces, nonideal mixtures) in their responses more frequently (20 of 45 comments in 2011 and 47 of 71 in 2012; p = 3.33 × 10⁻²). Comments including Raoult's law or Dalton's law were also more frequent (14 of 45 comments in 2011 and 37 of 71 in 2012; p = 3.48 × 10⁻²). However, these results may be attributed to family-wise error and hence are inconclusive.

Increased Perceived Simplicity

In addition to the improvement in student perception of understanding, the students using the notebook computers also appear to have reported a view that the experiment was "simple" (a term commonly used by students in their open responses) more frequently than the cohort using the data logger devices. In response to being asked if they liked the experiment and why (item 15), students conducting the vapor pressure experiment using the data logger interface made no positive comments about the level of simplicity of the experiment, whereas 7 negative comments were received stating the lack of simplicity was their reason for disliking the experiment. In 2012, using the laptop interface, this perception was entirely reversed. No students mentioned a lack of simplicity, and 12 listed the experiment's simplicity as their reason for liking the experiment, a clear and significant improvement (p = 1.99 × 10⁻⁵).

As a consequence, the vapor pressure experiment exhibits a number of differences in the tests comparing the frequency of reasons for liking and disliking the experiment between the two years regarding the experiment's simplicity. In response to item 15, simplicity was more frequent among reasons given for liking the experiment (0 of 18 comments in 2011 and 12 of 54 in 2012; p = 2.99 × 10⁻²) and less frequent among reasons given for disliking the experiment (7 of 39 comments in 2011 and 0 of 31 in 2012; p = 1.50 × 10⁻²). When considering all comments received for this survey item, simplicity was mentioned positively more often (0 of 55 comments in 2011 and 12 of 79 in 2012; p = 1.45 × 10⁻³) and mentioned negatively less often (7 of 55 comments in 2011 and 0 of 79 in 2012; p = 1.55 × 10⁻³). These differences are not significant beyond attribution to family-wise error, but they are plausibly genuine, as they echo the significant difference already confirmed.

There is also some evidence that students conducting the copper(II) ion concentration experiment in 2012 using the laptop interface reported a perception of simplicity more frequently than the 2011 cohort, though it is possible that a degree of bias exists in the samples taken for this experiment. Simplicity was mentioned as a reason for liking the experiment more often among all item 15 responses (12 of 81 comments in 2011 and 32 of 95 in 2012; p = 4.98 × 10⁻³), as well as among only those reasons given for liking the experiment (12 of 66 comments in 2011 and 32 of 86 in 2012; p = 1.18 × 10⁻²). These differences are not large enough to exclude the possibility of being artifacts of family-wise error but again are reminiscent of effects already seen elsewhere.

Further Comments

Following the technological user interface change, student perception of the vapor pressure experiment in particular was subject to widespread improvement, with students reporting a greater perception of understanding of the experiment and clarity of the learning outcomes, a complete reversal of the initially perceived lack of simplicity of the experiment, and a more positive perception of overall learning experience. The novel equipment was seen in a more positive light, and students began making comments more reminiscent of the usual benefits of real time graphing technology in MBL exercises. The biological buffers experiment also received fewer negative comments about the equipment, showing the laptop interface to be the more positively received option. The copper(II) ion concentration experiment did not show any clear difference between the two forms of the experiment, but the data do show some signs of improvement in students' perceived understanding and the perceived simplicity of the experiment.

Reasons for the lack of major difference observed in the copper(II) ion concentration experiment are unclear, although it is possible that sampling bias could prevent any differences from being detectable. It is also possible that differences for this experiment were not observable because the microcomputer technology is used in a comparatively smaller portion of the task in this case. The relative portion of the task in which the microcomputer technology was used could also explain why widespread significant improvement was more evident in the vapor pressure experiment than in the biological buffers experiment.

It is conceivable that these observed improvements arose because a portion of the instructions was rewritten to accommodate the different technology in 2012, but there is little data to suggest this is the case. An exception is the vapor pressure experiment, in which the manual was reported as a reason for disliking the experiment (item 15) less often in 2012 (12 negative comments of 55 total comments in 2011 and 5 of 79 in 2012; p = 1.53 × 10⁻²). However, this difference is not significant to the degree that it could not be attributed to family-wise error. Far stronger evidence exists that changing the manual was in fact detrimental rather than beneficial in the case of the biological buffers experiment. The material provided for the experiment was negatively mentioned significantly more frequently when asking students for potential improvements to the experiment (item 18), both in the context of all comments made (3 of 35 in 2011 and 17 of 32 in 2012; p = 1.10 × 10⁻⁴) and considering only those improvements listed (3 of 31 in 2011 and 17 of 25 in 2012; p = 1.11 × 10⁻⁵). It is possible that without this detrimental change in the instructions, some of the positive effects of replacing the hand-held graphing data loggers seen in the vapor pressure experiment would also have become apparent in the biological buffers experiment. In any case, the changes in the manual do not appear to be responsible for the significant improvements observed, meaning these effects can reasonably be concluded to be genuine consequences of the technological change.

The influence of having different practical demonstrators between years is not known with certainty. However, responses to item 8 of the survey, asking students about effective supervision and guidance of the demonstrator, do not differ between years in any case, even if multiple comparisons are unaccounted for. From this it appears reasonable to conclude that demonstrators were similarly effective in the students' views and are unlikely to have differently influenced the perceptions of students in the two years compared.


■ CONCLUSION

When conducting experiments in 2011 using the PASCO Xplorer GLX hand-held graphing data logger, a number of negative student comments were received about the devices, often concerning the difficulty of their use. Replacing these with notebook computers equipped with PASCO DataStudio software in 2012 resulted in numerous improvements in student perception data as compared with the previous year's cohort, including a significant reduction in these comments among other benefits. Based on the reduction in comments about the difficulty of using the technology, and the fact that the user interface was the only facet of the procedure altered, the data suggest that a more easily used technological interface plays a key role in positive student perception of these experiments.

A recognition that the user interface of technology can influence student perceptions as strongly as observed here may significantly assist in alleviating the student issues with microcomputer based laboratory activities reported in the literature. It is suggested that a familiar user interface is a vital element of the teaching laboratory when computers are used, and that in order to allow full access to the potential benefits of microcomputer based laboratory activities from the student perspective, hand-held graphic data logging devices are far less preferable than alternatives equipped with the same capabilities that are more easily used by students, such as notebook computers.

■ ASSOCIATED CONTENT

Supporting Information

Tables listing a full summary of all data collected and the results of all comparative statistical tests conducted in the main study, text elaborating upon the methodology and results of the studies used here to investigate evidence of sample bias, and figures showing analyses of the various experiments. This material is available via the Internet at http://pubs.acs.org.

■ AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected]

The authors declare no competing financial interest.

■ ACKNOWLEDGMENTS

We acknowledge the School of Chemistry and Physics at the University of Adelaide for funding the implementation of laptop computers in the laboratory, Peter Roberts for the setup of the laboratory equipment, as well as the practical demonstrators for accommodating the distribution and collection of surveys, and the students for their valued contribution of responses. Collection of data for this project was approved by the University of Adelaide Human Research Ethics Committee, Project No. H-2012-097.

■ REFERENCES

(1) Hofstein, A. The Laboratory in Chemistry Education: Thirty Years of Experience with Developments, Implementation and Research. Chem. Educ. Res. Pract. 2004, 5 (3), 247−264.
(2) Hofstein, A.; Lunetta, V. N. The Laboratory in Science Education: Foundation for the 21st Century. Sci. Educ. 2004, 88, 28−54.
(3) Hofstein, A.; Mamlok-Naaman, R. The Laboratory in Science Education: The State of the Art. Chem. Educ. Res. Pract. 2007, 8 (2), 105−107.
(4) Hofstein, A.; Lunetta, V. N. The Role of the Laboratory in Science Teaching: Neglected Aspects of Research. Rev. Educ. Res. 1982, 52 (2), 201−217.
(5) Tsaparlis, G. Learning at the Macro Level: The Role of Practical Work. In Multiple Representations in Chemical Education; Gilbert, J. K., Treagust, D., Eds.; Springer: Dordrecht, The Netherlands, 2009; pp 109−136.
(6) Johnstone, A. H. Why is science difficult to learn? Things are seldom what they seem. J. Comput. Assisted Learn. 1991, 7 (2), 75−83.
(7) Johnstone, A. H. The development of chemistry teaching: A changing response to changing demand. J. Chem. Educ. 1993, 70 (9), 701.
(8) Gilbert, J. K.; Treagust, D. Multiple Representations in Chemical Education; Springer: Dordrecht, The Netherlands, 2009.
(9) Johnstone, A. H. You Can't Get There from Here! J. Chem. Educ. 2010, 87 (1), 22−29.
(10) Cantu, L. L.; Herron, J. D. Concrete and formal Piagetian stages and science concept attainment. J. Res. Sci. Teach. 1978, 15 (2), 135−143.
(11) Linn, M. Technology and science education: Starting points, research programs, and trends. Int. J. Sci. Educ. 2003, 25 (6), 727−758.
(12) Durick, M. A. The Study of Chemistry by Guided Inquiry Method Using Microcomputer-Based Laboratories. J. Chem. Educ. 2001, 78 (5), 574.
(13) Nyasulu, F.; Barlag, R. Thermokinetics: Iodide-Catalyzed Decomposition Kinetics of Hydrogen Peroxide. An Initial-Rate Approach. J. Chem. Educ. 2009, 86 (10), 1231.
(14) Struck, W.; Yerrick, R. The Effect of Data Acquisition-Probeware and Digital Video Analysis on Accurate Graphical Representation of Kinetics in a High School Physics Class. J. Sci. Educ. Technol. 2010, 19 (2), 199−211.
(15) Amrani, D. Determination of absolute zero using a computer-based laboratory. Phys. Educ. 2007, 42 (3), 304.
(16) Vannatta, M. W.; Richards-Babb, M.; Solomon, S. D. Personal Multifunctional Chemical Analysis Systems for Undergraduate Chemistry Laboratory Curricula. J. Chem. Educ. 2010, 87 (8), 770−772.
(17) Chebolu, V.; Storandt, B. C. Stoichiometry of the Reaction of Magnesium with Hydrochloric Acid. J. Chem. Educ. 2003, 80 (3), 305.
(18) Choi, M. M. F.; Wong, P. S.; Yiu, T. P.; Mark, C. Application of datalogger in observing photosynthesis. J. Chem. Educ. 2002, 79 (8), 980.
(19) Thornton, R. K.; Sokoloff, D. R. Learning motion concepts using real-time microcomputer-based laboratory tools. Am. J. Phys. 1990, 58 (9), 858−867.
(20) Mokros, J. R.; Tinker, R. F. The impact of microcomputer-based labs on children's ability to interpret graphs. J. Res. Sci. Teach. 1987, 24 (4), 369−383.
(21) Trumper, R.; Gelbman, M. A Microcomputer-Based Contribution to Scientific and Technological Literacy. J. Sci. Educ. Technol. 2001, 10 (3), 213−221.
(22) Linn, M. C.; Layman, J. W.; Nachmias, R. Cognitive consequences of microcomputer-based laboratories: Graphing skills development. Contemp. Educ. Psychol. 1987, 12 (3), 244−253.
(23) Rogers, L. T. Computer as an Aid for Exploring Graphs. School Sci. Rev. 1995, 76 (276), 31−39.
(24) Redish, E. F.; Saul, J. M.; Steinberg, R. N. On the effectiveness of active-engagement microcomputer-based laboratories. Am. J. Phys. 1997, 65 (1), 45−54.
(25) BouJaoude, S.; Jurdak, M. Integrating physics and math through microcomputer-based laboratories (MBL): Effects on discourse type, quality, and mathematization. Int. J. Sci. Math. Educ. 2010, 8 (6), 1019−1047.
(26) Metcalf, S.; Tinker, R. Probeware and Handhelds in Elementary and Middle School Science. J. Sci. Educ. Technol. 2004, 13 (1), 43−49.
(27) Stringfield, J. K. Using Commercially Available, Microcomputer-Based Labs in the Biology Classroom. Am. Biol. Teach. 1994, 56 (2), 106−108.
(28) Russell, D. W.; Lucas, K. B.; McRobbie, C. J. Role of the microcomputer-based laboratory display in supporting the construction of new understandings in thermal physics. J. Res. Sci. Teach. 2004, 41 (2), 165−185.
(29) Solomon, J.; Bevan, R.; Frost, A.; Reynolds, H.; Summers, M.; Zimmerman, C. Can pupils learn through their own movements? A study of the use of a motion sensor interface. Phys. Educ. 1991, 26 (6), 345.
(30) Trumper, R. Learning Kinematics with a V-Scope: A Case Study. J. Comput. Math. Sci. Teach. 1997, 16, 91−110.
(31) Thornton, R. K. Tools for scientific thinking: microcomputer-based laboratories for physics teaching. Phys. Educ. 1987, 22 (4), 230.
(32) Espinoza, F. The Use of Graphical Analysis with Microcomputer-Based Laboratories to Implement Inquiry as the Primary Mode of Learning Science. J. Educ. Technol. Syst. 2007, 35 (3), 315−335.
(33) Espinoza, F.; Quarless, D. An Inquiry-Based Contextual Approach as the Primary Mode of Learning Science with Microcomputer-Based Laboratory Technology. J. Educ. Technol. Syst. 2009, 38 (4), 407−426.
(34) Weller, H. G. Assessing the impact of computer-based learning in science. J. Res. Comput. Educ. 1996, 28 (4), 461.
(35) Rogers, L.; Wild, P. Data-logging: effects on practical science. J. Comput. Assisted Learn. 1996, 12 (3), 130−145.
(36) Newton, L. R. Data-logging in practical science: Research and reality. Int. J. Sci. Educ. 2000, 22 (12), 1247−1259.
(37) Adams, D. D.; Shrum, J. W. The effects of microcomputer-based laboratory exercises on the acquisition of line graph construction and interpretation skills by high school biology students. J. Res. Sci. Teach. 1990, 27 (8), 777−787.
(38) Nakhleh, M. B.; Krajcik, J. S. Influence of levels of information as presented by different technologies on students' understanding of acid, base, and pH concepts. J. Res. Sci. Teach. 1994, 31 (10), 1077−1096.
(39) Atar, H. Y. Examining students' and teachers' perceptions of microcomputer based laboratories (MBLs) in high school chemistry classes. Master's Thesis, Florida State University, Tallahassee, FL, USA, 2001.
(40) Tortosa, M. The use of microcomputer based laboratories in chemistry secondary education: Present state of the art and ideas for research-based practice. Chem. Educ. Res. Pract. 2012, 13 (3), 161−171.
(41) Thomas, G. P.; Man-wai, P. F.; Po-keung, E. T. Students' perceptions of early experiences with microcomputer-based laboratories (MBL). Br. J. Educ. Technol. 2004, 35 (5), 669−671.
(42) Aksela, M. Supporting meaningful chemistry learning and higher-order thinking through computer-assisted inquiry: A design research approach. Academic Dissertation, University of Helsinki, Finland, 2005.
(43) Roth, W.-M.; Woszczyna, C.; Smith, G. Affordances and constraints of computers in science education. J. Res. Sci. Teach. 1996, 33 (9), 995−1017.
(44) Atar, H. Y. Chemistry Students' Challenges in Using MBLs in Science Laboratories; ERIC Clearinghouse: Washington, DC, USA, 2002.
(45) Kay, R.; Knaack, L. Evaluating the Use of Learning Objects for Secondary School Science. J. Comput. Math. Sci. Teach. 2007, 26 (4), 261−289.
(46) ASELL: Advancing Science by Enhancing Learning in the Laboratory website. http://www.asell.org (accessed May 2014).
(47) Barrie, S.; Buntine, M.; Jamie, I.; Kable, S. Physical Chemistry in the Lab. Chem. Aust. 2001, 68 (2), 36−37.
(48) Barrie, S.; Jamie, I.; Kable, S.; Buntine, M. Teaching and Learning in the Laboratory: Can We Do It Better? Synergy 2001, 8−9.
(49) Barrie, S. C.; Buntine, M. A.; Jamie, I. M.; Kable, S. H. APCELL: The Australian Physical Chemistry Enhanced Laboratory Learning Project. Aust. J. Educ. Chem. 2001, 57, 6−12.
(50) Barrie, S. C.; Buntine, M. A.; Jamie, I. M.; Kable, S. H. APCELL: Developing better ways of teaching in the laboratory. In Proceedings of the Research and Development into University Science Teaching and Learning Workshop, Sydney, NSW, Australia; Fernandez, A., Ed.; UniServe Science: Sydney, Australia, 2001; pp 23−28.
(51) Read, J. R.; Barrie, S. C.; Bucat, R. B.; Buntine, M. A.; Crisp, G. T.; George, A. V.; Jamie, I. M.; Kable, S. H. Achievements of an ACELL workshop. Chem. Aust. 2006, 73 (9), 17−20.
(52) Read, J. R. The Australian Chemistry Enhanced Laboratory Learning Project. Chem. Aust. 2006, 73 (1), 3−5.
(53) Buntine, M. A.; Read, J. R.; Barrie, S. C.; Bucat, R. B.; Crisp, G. T.; George, A. V.; Jamie, I. M.; Kable, S. H. Advancing Chemistry by Enhancing Learning in the Laboratory (ACELL): A Model for Providing Professional and Personal Development and Facilitating Improved Student Laboratory Learning Outcomes. Chem. Educ. Res. Pract. 2007, 8 (2), 232−254.
(54) George, A. V.; Read, J. R.; Barrie, S. C.; Bucat, R. B.; Buntine, M. A.; Crisp, G. T.; Jamie, I. M.; Kable, S. H. What Makes a Good Laboratory Learning Exercise? Student Feedback from the ACELL Project. In Chemistry Education in the ICT Age; Gupta-Bhowon, M., Kam Wah, H., Jhaumeer-Laulloo, S., Ramasami, P., Eds.; Springer: Dordrecht, The Netherlands, 2009; pp 363−376.
(55) Jamie, I. M.; Read, J. R.; Barrie, S. C.; Bucat, R. B.; Buntine, M. A.; Crisp, G. T.; George, A. V.; Kable, S. H. From APCELL to ACELL and beyond: Expanding a multi-institution project for laboratory-based teaching and learning. Aust. J. Educ. Chem. 2007, No. 67, 17−23.
(56) Wilson, K.; Mills, D.; Sharma, M.; Kirkup, L.; Mendez, A.; Scott, D. ACELL for Physics? In Proceedings of the Australian Conference on Science and Mathematics Education (formerly UniServe Science Conference); Australian National University: Canberra, Australia, 2012.
(57) Pyke, S. M.; Yeung, A.; Kable, S. H.; Sharma, M. D.; Barrie, S. C.; Buntine, M. A.; Da Silva, K. B.; Lim, K. F. The Advancing Science by Enhancing Learning in the Laboratory (ASELL) Project: The Next Chapter. In Proceedings of the 16th UniServe Science Annual Conference; The University of Sydney: Sydney, Australia, 2010.
(58) Yeung, A.; Pyke, S. M.; Sharma, M. D.; Barrie, S. C.; Buntine, M. A.; Da Silva, K. B.; Kable, S. H.; Lim, K. F. The Advancing Science by Enhancing Learning in the Laboratory (ASELL) Project: The first Australian multidisciplinary workshop. Int. J. Innovation Sci. Math. Educ. 2011, 19 (2), 51−72.
(59) Kable, S.; Buntine, M.; Yeung, A.; Sharma, M.; Lim, K.; Pyke, S.; Da Silva, K. B.; Barrie, S. Advancing Science by Enhancing Learning in the Laboratory (ASELL) Final Report 2012; Australian Learning and Teaching Council: Sydney, Australia, 2012.
(60) Read, J. R.; Kable, S. H. Educational analysis of the first year chemistry experiment 'Thermodynamics Think-In': An ACELL experiment. Chem. Educ. Res. Pract. 2007, 8 (2), 255−273.
(61) Crisp, M. G.; Kable, S. H.; Read, J. R.; Buntine, M. A. A disconnect between staff and student perceptions of learning: An ACELL educational analysis of the first year undergraduate chemistry experiment 'Investigating sugar using a home made polarimeter'. Chem. Educ. Res. Pract. 2011, 12 (4), 469−477.
(62) Graubard, B. I.; Korn, E. L. Choice of Column Scores for Testing Independence in Ordered 2 × K Contingency Tables. Biometrics 1987, 43 (2), 471−476.
(63) Welch, B. L. The Generalization of 'Student's' Problem when Several Different Population Variances Are Involved. Biometrika 1947, 34 (1/2), 28−35.
(64) Welch, B. L. The significance of the difference between two means when the population variances are unequal. Biometrika 1938, 29 (3−4), 350−362.
(65) Zimmerman, D. W. A note on preliminary tests of equality of variances. Br. J. Math. Stat. Psychol. 2004, 57 (1), 173−181.
(66) Zimmerman, D. W.; Zumbo, B. D. Hazards in Choosing Between Pooled and Separate-Variances t Tests. Psicologica 2009, 30, 371−390.
(67) Shaffer, J. P. Multiple Hypothesis Testing. Annu. Rev. Psychol. 1995, 46 (1), 561−584.
(68) Andrich, D. A Rating Formulation for Ordered Response Categories. Psychometrika 1978, 43 (4), 561−573.
(69) Linacre, J. M. PROX with missing data, or known item or person measures. Rasch Meas. Trans. 1994, 8 (3), 378.
(70) Cohen, L. Approximate expressions for parameter estimates in the Rasch model. Br. J. Math. Stat. Psychol. 1979, 32 (1), 113−120.
(71) Linacre, J. M. Sample Size and Item Calibration Stability. Rasch Meas. Trans. 1994, 7 (4), 328.
(72) Wright, B.; Panchapakesan, N. A Procedure for Sample-Free Item Analysis. Educ. Psychol. Meas. 1969, 29 (1), 23−48.
(73) Riley, B. Considering Large Group Differences in Ability in DIF Analysis. Rasch Meas. Trans. 2011, 25 (2), 1326.
(74) Andrich, D.; Hagquist, C. Real and Artificial Differential Item Functioning. J. Educ. Behav. Stat. 2012, 37 (3), 387−416.
