Arrogance or Apathy?
The need for formative evaluation + current & emerging strategies
Michael M. Grant, PhD University of South Carolina
Michael M. Grant The University of South Carolina http://viral-notebook.com @michaelmgrant
Arrogance or Apathy?
“We just don’t have time to do evaluation.”
“Our HR folks won’t let us do evals.”
“There’s really no point because we’re going to deploy it anyways.”
“It’s just not going to make a difference.”
“We don’t have access to testers.”
“Our managers don’t care.”
“We’re just doing it for compliance.”
Name a reason for evaluation.
http://pollev.com/mgrant
Kirkpatrick (& Phillips) Levels
Level 5: ROI
Level 4: Organization
Level 3: Transfer
Level 2: Learning
Level 1: Reaction
Level 2 Evaluations
Level 5: ROI
Level 4: Organization
Level 3: Transfer
Level 2: Learning
Level 1: Reaction
Appeal
Effectiveness
Efficiency
Kirkpatrick Levels in practice (ASTD, 2005)
Level 1 (Reaction): 91.3%
Level 2 (Learning): 53.9%
Level 3 (Transfer): 22.9%
Level 4 (Organization): 7.6%
Level 5 (ROI): 2.1%
Kirkpatrick Levels in practice (R.A. Noe, 2005, Employee Training and Development)
Reaction: 79%
Cognitive: 38%
Behavior: 15%
Results: 9%
Kirkpatrick Levels in practice (ASTD, 2009)
Level 1 (Reaction): 92%
Level 2 (Learning): 53.9%
Level 3 (Transfer): 22.9%
Level 4 (Organization): 7.6%
Level 5 (ROI): 17.9%
Kirkpatrick Levels in practice (Training Magazine Top 125 Companies in 2013)
Level 1 (Reaction): 97%
Level 2 (Learning): 94%
Level 3 (Transfer): 94%
Level 4 (Organization): 88%
Level 5 (ROI): 71%
“68 percent of applicants utilize Return on Value; 71 percent utilize Return on Investment; 56 percent utilize balanced scorecards; and 47 percent utilize Six Sigma. The Kirkpatrick Levels of Evaluation are more widely used: Level 1 (97 percent), Levels 2 and 3 (94 percent), and Level 4 (88 percent).” (TrainingMag, 2013)
The New World Kirkpatrick 4 Levels
[Diagram: Required Drivers spanning the Learning Context and the Performance Context]
Level 4 and Level 5
Level 4: Organization. How do we measure the impact on business? Measures changes in business impact variables (productivity, incidents, compliance discrepancies, customer service, etc.).
Level 5: ROI. How do we measure the return on investment? Compares benefits to costs (a benefit-cost ratio):
ROI(%) = (Net Monetary Benefits / Program Costs) × 100
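To make the ratio concrete, here is a minimal worked example in Python; the dollar figures are hypothetical, chosen only to illustrate the arithmetic.

```python
def roi_percent(monetary_benefits: float, program_costs: float) -> float:
    """ROI(%) = net monetary benefits / program costs x 100."""
    net_benefits = monetary_benefits - program_costs
    return net_benefits / program_costs * 100

# Hypothetical program: $150,000 in measured benefits against $50,000 in costs.
print(roi_percent(150_000, 50_000))  # 200.0, i.e., a $2 net return per $1 spent
```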
Effectiveness Evaluation
Activities include field tests, observations, interviews, and performance assessments.
Purpose? Determine whether the ILE accomplishes its objectives within the immediate or short-term context of its implementation.
Formative Evaluation
What’s the purpose? A focus on improvement during development.
User Review
Observations from one-on-ones and small groups
“Vote early and often.” The sooner formative evaluation is conducted during development, the more likely that substantive improvements will be made and costly errors avoided.
— Reeves & Hedberg (2003), p. 142
Formative Evaluation Stages
• Design team review
• Expert review
• One-to-one
• Small group
• Field trials
Questions for Evaluation
1. What are the logistical requirements for implementing the ILE?
– Hardware
– Software
– Adjunct materials
– Help and support
2. What are the user reactions to the ILE?
– Appeal
– Motivation
– Usability
– Comprehension
3. What are the trainer/instructor reactions to the ILE?
– Appeal
– Utility
4. What are the expert reactions to the ILE?
– Content
– Instructional design
– Human-computer interface
– Aesthetics
5. What corrections must be made to the ILE?
6. What enhancements can be made to the ILE?
Data Collection Matrix
Evaluation questions:
1. What are the logistical requirements?
2. What are user reactions?
3. What are trainer reactions?
4. What are expert reactions?
5. What corrections must be made?
6. What enhancements can be made?
Methods, marked in the matrix against the questions each informs:
• Anecdotal records (5 of the 6 questions)
• User questionnaires (4)
• User interviews (4)
• User focus groups (3)
• Usability observations (4)
• Online data collection (2)
• Expert reviews (3)
Evaluation informs development (from Reeves & Hedberg, 2003)
• Project Conceptualization: Review
• Design: Needs Assessment
• Development: Formative Evaluation
• Implementation: Effectiveness Evaluation
• Institutionalization: Impact Evaluation
• Project Re-conceptualization: Maintenance Evaluation
Contemporary Development Models
Michael Allen/Allen Interactions’ Successive Approximation Model (SAM)
Contemporary Development Models
Image from http://www.intechopen.com/source/html/19453/media/image2.jpeg
Concurrent Design
from Tripp, S., & Bichelmeyer, B. (1990)
Rapid Prototyping
Contemporary Development Models
Rapid Prototyping
• Originated in manufacturing
• ID hijacked it from software development
• Focused primarily on development
• Types of prototypes
§ Look-and-feel: colors, effects, gross screen layouts
§ Media: use of sound effects, narration, 3D illustrations, video, etc.
§ Navigation: move through sections, access support (glossary, calculator, etc.)
§ Interactivity: content, activities, feedback
Contemporary Development Models
Agile Software Development
1. Active user involvement is imperative
2. The team must be empowered to make decisions
3. Requirements evolve but the timescale is fixed
4. Capture requirements at a high level; lightweight & visual
5. Develop small, incremental releases and iterate
6. Focus on frequent delivery of products
7. Complete each feature before moving on to the next
8. Apply the 80/20 rule
9. Testing is integrated throughout the project lifecycle – test early and often
10. A collaborative & cooperative approach between all stakeholders is essential
What to consider with effectiveness . . .
• An approved evaluation plan (e.g., union, stakeholders, management)
• Feasibility
• Reliability
• Validity
• Implementation logs
“Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.”
(R. Krishnaveni, 2008, p. 311)
“Vote early and often.”
The sooner formative evaluation is conducted during development, the more likely that substantive improvements will be made and costly errors avoided.
(Reeves & Hedberg, 2003, p. 142)
Formative Evaluation: 3 Methods
• Expert review during development
• User review during development
• Usability testing
“Experts are anyone with specialized knowledge that is relevant to the design of your interactive learning environment.”
(Reeves & Hedberg, 2003, p. 145)
Expert Review
• SMEs: scope, sequence, accuracy, scenarios, examples
• Instructional experts: instructional strategies, sequence, practice, mnemonics
• Graphic designers: aesthetics, metaphors, icons, navigation
• Teachers/Trainers: logistics
• Interaction designers: user experience, story/narrative, connections with other systems
• LMS Administrators: SCORM compliance/packaging, LMS integration, metadata
Interface Review Guidelines
from http://it.coe.uga.edu/~treeves/edit8350/UIRF.html
How usable is this interface?
Here’s what I wrote …
User review in development
1-on-1 Observations
• Prototype Revision 1
• Try-out impressions; obvious flaws; examples/scenarios
• 2 to 3 people
• Instruments
– Observation Notes Form
– Interview Protocol
– Attitude Survey
Small Group Trials
• Prototype Revision 2
• Identify strengths and weaknesses
• 3 to 4 people
• Instruments
– Observation Notes Form
– Attitude Survey
– Interview Protocol
– Posttest/Learner Performance
Observations from one-on-ones and small groups: in a contemporary model, users are likely involved early and through multiple iterations and multiple prototypes.
What Is Usability?
Two Major Methods to Evaluate Usability
Heuristic Evaluation
• Quick
• Expert analyses
• No user involvement
User Testing
• Finds more problems
• User involvement increases validity
• Seeing problems has a huge impact on developers
“The most common user action on a Web site is to flee.” — Edward Tufte
“At least 90% of all commercial Web sites are overly difficult to use…. The average outcome of Web usability studies is that test users fail when they try to perform a test task on the Web. Thus, when you try something new on the Web, the expected outcome is failure.” — Jakob Nielsen
Nielsen Web Usability Rules
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Help users recognize, diagnose, and recover from errors
9. Help and documentation
10. Aesthetic and minimalist design
• Ease of learning - How fast can a user who has never seen the user interface before learn it sufficiently well to accomplish basic tasks?
• Efficiency of use - Once an experienced user has learned to use the system, how fast can he or she accomplish tasks?
• Memorability - If a user has used the system before, can he or she remember enough to use it effectively the next time or does the user have to start over again learning everything?
• Error frequency and severity - How often do users make errors while using the system, how serious are these errors, and how do users recover from these errors?
• Subjective satisfaction - How much does the user like using the system?
Heuristic Evaluation Process
1. Several experts individually compare a product to a set of usability heuristics
2. Violations of the heuristics are evaluated for their severity and extent, and solutions are suggested
3. At a group meeting, violation reports are categorized and assigned
4. Reports include average severity ratings, extents, heuristics violated, and a description of the opportunity for improvement
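A minimal sketch of the bookkeeping behind steps 2 through 4, assuming severity is rated on Nielsen's common 0-4 scale; the reports shown are hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical violation reports: (heuristic violated, severity 0-4, evaluator).
reports = [
    ("Visibility of system status", 3, "Expert A"),
    ("Visibility of system status", 2, "Expert B"),
    ("Error prevention", 4, "Expert A"),
]

# Group the individual reports by heuristic, then average severity across experts.
by_heuristic = defaultdict(list)
for heuristic, severity, _evaluator in reports:
    by_heuristic[heuristic].append(severity)

for heuristic, severities in by_heuristic.items():
    print(f"{heuristic}: mean severity {mean(severities):.1f} from {len(severities)} report(s)")
```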
Heuristic Evaluation Comparisons
Advantages
• Quick: no need to find or schedule users
• Easy to review problem areas many times
• Inexpensive: no fancy equipment
Disadvantages
• Validity: no users involved
• Finds fewer problems (40-60% fewer?)
• Getting good experts
• Building consensus with experts
Heuristic Evaluation Report
User Testing
• People whose characteristics (or profiles) match those of the Web site’s target audience perform a sequence of typical tasks using the site.
• Examines:
– Ease of learning
– Speed of task performance
– Error rates
– User satisfaction
– User retention over time
Image from (nz)dave at http://www.flickr.com/photos/nzdave/491411546/
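A minimal sketch of how those observed measures might be tallied from session notes; the participants, task, and numbers below are hypothetical.

```python
from statistics import mean

# Hypothetical session records: (participant, task, seconds on task, errors, completed?).
sessions = [
    ("P1", "enroll in a course", 74.0, 1, True),
    ("P2", "enroll in a course", 121.5, 3, False),
    ("P3", "enroll in a course", 58.2, 0, True),
]

completed = [s for s in sessions if s[4]]
print(f"Completion rate: {len(completed) / len(sessions):.0%}")
print(f"Mean time on task (completers only): {mean(s[2] for s in completed):.1f}s")
print(f"Mean errors per attempt: {mean(s[3] for s in sessions):.1f}")
```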
“For most companies…it’s fine to conduct tests in a conference room or an office — as long as you can close the door to keep out distractions. What matters is that you get hold of real users and sit with them while they use the design. A notepad is the only equipment you need.” — Jakob Nielsen
http://www.nngroup.com/articles/usability-101-introduction-to-usability/
Elements of User Testing
• Define target users (often called a user profile or persona)
• Have users perform representative tasks
• Observe users
• Report results
Images from http://www.optimum-web.co.uk/services/user-needs/personas/ & http://uxsuccess.com/2009/12/01/agile-personas-and-context-scenario/
Why Multiple Evaluators? Why Only 5 Users? (Nielsen, 2000)
• A single evaluator achieves poor results, finding only about 35% of usability problems
• 5 evaluators find more than 75%
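Nielsen (2000) models the share of problems found by n evaluators as 1 - (1 - L)^n, where L is a single evaluator's discovery rate. A quick sketch of that curve, plugging in the slide's 35% figure for L:

```python
# Nielsen's (2000) problem-discovery curve: n evaluators together find
# 1 - (1 - L)**n of the problems, where L is the single-evaluator rate.
L = 0.35  # the single-evaluator discovery rate cited on the slide

for n in range(1, 6):
    found = 1 - (1 - L) ** n
    print(f"{n} evaluator(s): ~{found:.0%} of problems found")
# The curve flattens quickly, which is why about 5 users is usually enough.
```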
Reporting User Testing
• Overall goals/objectives
• Methodology
• Target profile
• Testing outline with test script
• Specific task list to perform
• Data analysis & results
• Recommendations
User Experience (UX)
from Jesse James Garrett | http://www.jjg.net/ia
User Experience (UX)
from Peter Morville | http://semanticstudios.com/user_experience_design/
User Experience (UX)
• Project Management
• User Research
• Usability Evaluation
• Information Architecture (IA)
• User Interface Design
• Interaction Design (IxD)
• Visual Design
• Content Strategy
• Accessibility
• Web Analytics
Current & Emerging Strategies for User Testing
Now defunct!
A/B Testing
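A/B testing splits learners between two versions of a design and compares their outcomes. A minimal sketch of stable variant assignment; the experiment name and learner ids below are hypothetical.

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing the user id together with the experiment name keeps each
    learner's assignment stable across sessions and independent
    between experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

# Hypothetical experiment comparing two versions of a course module.
for learner in ["learner-001", "learner-002", "learner-003"]:
    print(learner, assign_variant(learner, "module-3-redesign"))
```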
usabilla
Morae
userzoom
10 Second Usability Test
1. Disable stylesheets
2. Check for the following:
– Semantic markup
– Logical organization
– Only images related to content appear
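The semantic-markup check can be roughly automated. A sketch using only the Python standard library; the URL is a placeholder, and logical organization and image relevance still need human judgment.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

SEMANTIC_TAGS = {"header", "nav", "main", "article", "section", "aside", "footer"}

class SemanticTagCounter(HTMLParser):
    """Count semantic structural tags versus generic div/span containers."""

    def __init__(self):
        super().__init__()
        self.semantic = 0
        self.generic = 0

    def handle_starttag(self, tag, attrs):
        if tag in SEMANTIC_TAGS:
            self.semantic += 1
        elif tag in ("div", "span"):
            self.generic += 1

# Placeholder page to review.
html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
counter = SemanticTagCounter()
counter.feed(html)
print(f"semantic tags: {counter.semantic}, generic div/span: {counter.generic}")
# Many generics and few semantic tags suggests the page will collapse
# poorly once its stylesheets are disabled.
```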
References & Acknowledgements
American Society for Training & Development. (2009). The value of evaluation: Making training evaluations more effective. Author.
Follett, A. (2009, October 9). 10 qualitative tools to improve your web site. Instant Shift. Retrieved March 18, 2010 from http://www.instantshift.com/2009/10/08/10-qualitative-tools-to-improve-your-website/
Image from http://www.flickr.com/photos/mutsmuts/4695658106/sizes/z/in/photostream/
Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Jakob Nielsen’s Alertbox. Retrieved from http://www.useit.com/alertbox/20000319.html
Nielsen, J. (2012, January 4). Usability 101: Introduction to usability. Nielsen Norman Group. Retrieved from http://www.nngroup.com/articles/usability-101-introduction-to-usability/
Reeves, T.C. (2004, December 9). Design research for advancing the integration of digital technologies into teaching and learning: Developing and evaluating educational interventions. Paper presented to the Columbia Center for New Media Teaching and Learning, New York, NY. Available at http://ccnmtl.columbia.edu/seminars/reeves/CCNMTLFormative.ppt
Reeves, T.C., & Hedberg, J.C. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.
Usability.gov
Wu, S. (2015, June 22). 7 best pieces of user testing software. Creative Bloq. Retrieved from http://www.creativebloq.com/ux/best-user-testing-software-61515337
Michael M. Grant 2015