Assessment 101
Joyce Chapman, Project Manager, Triangle Research Libraries Network
Heads of Cataloging Interest Group, ALA Annual, Anaheim, CA, 25 June 2012
Slide 2
WHAT IS ASSESSMENT?
Slide 3
Assessment is a continuous and cyclical process by which we
evaluate and improve services, products, workflows, and
learning.
Slide 4
A continuous process. "Assessment is a process whose power is cumulative." (University of Washington's assessment principles) "One-shot" assessments can be useful in the right context, but ongoing assessment is what fosters ongoing improvement.
Slide 5
Slide 6
Planning phase. It's amazing how often people decide to skip this; planning is one of the most difficult phases. Determine your objectives. Define the questions that need to be answered. Design a method to answer the questions (set up a study, collect new data, extract existing data from a system).
Slide 7
Slide 8
Implementation (data gathering). NOT numbers for the sake of numbers. We frequently measure things that are easy to measure, without a good reason for doing so. This data may not help us answer meaningful questions. For data collection to foster assessment, we must first determine what it is we really care about, then initiate data collection that will inform meaningful analysis and outcomes.
Slide 9
Slide 10
Assessment is a continuous and cyclical process by which we
evaluate and improve services, products, workflows, and
learning.
Slide 11
React / refine. The piece of the assessment cycle that is most frequently ignored is the last: making change based on the findings of data analysis. It is often inaction on the part of management that causes the assessment loop to remain incomplete, ending with the reporting of data analysis findings and never resulting in action.
Slide 12
Slide 13
Summary: continuous; cyclical; evaluate AND improve; requires that action be taken in response to findings. Our environment and users are always changing; we are always reacting.
Slide 14
Evidence-based practice. A movement to encourage and give practitioners the means to incorporate research into their practice where it may have been lacking. See the Journal of Evidence Based Library and Information Practice. Assumption: it is impossible to make good evidence-based decisions when our evidence base is weak; therefore, an evidence base must be built.
Slide 15
Evidence-based practice. Stresses three aspects contributing to a practice that is evidence-based: 1. the best available evidence is used, 2. moderated by user needs and preferences, 3. applied to improve the quality of professional judgments.
Slide 16
"An approach to information science that promotes the collection, interpretation, and integration of valid, important and applicable user-reported, librarian-observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgments." (Anne McKibbon)
Slide 17
Context of assessment. Libraries often talk about assessment in the specific context of proving our institutional value to external audiences: contribution to student retention, graduation, and employment rates; student learning outcomes. Equally valuable is assessment to improve internal workflows and services, or assessment of the cost and value of workflows to contribute to knowledge in the field.
Slide 18
Why perform assessment? 1. Improve efficiency. 2. Modify workflows to funnel staff time and efforts where they provide the most benefit. 3. Prove the value of existing or proposed services/positions to higher administration. 4. Contribute to the available data/literature in the cataloging field so that you can work together to implement evidence-based practice across the nation.
Slide 19
A CULTURE OF ASSESSMENT
Slide 20
What is a culture of assessment? A culture of assessment refers
to whether the predominating attitudes and behaviors that
characterize the functioning of an institution support
assessment.
Slide 21
What signals a culture of assessment? Do staff take ownership of assessment efforts? Does administration encourage assessment? Is there a comprehensive program? Is assessment pervasive: do we see ongoing assessment efforts throughout the organization? Are there efforts to teach staff about assessment? Is there inclusion of assessment in plans and budgets? (Establishing a Culture of Assessment by Wendy Weiner, 2009)
Slide 22
What signals a culture of assessment? Is assessment mentioned
in the strategic plan? Does the organization have an assessment
plan? Does the organization financially support staff members whose
positions are dedicated in whole or in part to assessment-related
activities? Does the organization fund professional development of
staff related to assessment? Is the organization responsive to
proposals for new endeavors related to assessment? (Establishing a Culture of Assessment by Wendy Weiner, 2009)
Slide 23
Structural difficulties. Bottom-up: it is difficult for staff to gain support for conducting assessment projects or for implementing change based on findings when upper administration is not assessment-focused. Top-down: an assessment-focused upper administration can have a staff that does not support assessment, so mandates for assessment are resented (the "defiant compliance" culture).
Slide 24
CONTEXT OF ASSESSMENT IN HIGHER EDUCATION
Slide 25
Inputs, outputs, and outcomes. Input measures quantify a library's raw materials (collection size, staff size, budget); these have the longest tradition of measurement in libraries. Output measures measure the actual use of library collections and services (circulation stats, gate counts, reference transactions). Outcome measures measure the impact that using library services, collections, and space has on users (libraries' impact on student learning).
Slide 26
What drives the assessment agenda? Changing times: explosive growth in technologies; increased customer expectations for service quality and responsiveness; shrinking budgets (justifications for spending money on resources, programs, and services are now required); increased competition for resources; a fight to remain relevant and prove value. (Martha Kyrillidou, Planning for Results: Making the Data Work For You, 2008)
Slide 27
Why wasn't there a focus on assessment in libraries for so long?
Slide 28
Return on investment (ROI). A performance measure used to evaluate the efficiency of an investment: a way of considering profits in relation to capital invested. ROI provides a snapshot of profitability adjusted for the size of the investment assets tied up in the enterprise. (Sources: Wikipedia and Investopedia)
Slide 29
PROFITS! Cost = $$ Value = $$ For-profits must create their own
revenue. Otherwise, they cease to exist. Livelihood depends on
cost/value assessment.
Slide 30
Libraries. Like most of higher education, we are non-profits. Funding sources aren't directly tied to real value (more to perceived value and tradition). Cost = ?? Value = ??
Slide 31
Put simply: because higher education has historically been given a large percentage of its annual funding by external powers based on perceived value, we have not developed a culture of needing to closely prove value, track input to output, or tie investment to profit.
Slide 32
ASSESSING THE COST AND VALUE OF BIBLIOGRAPHIC CONTROL
Slide 33
2011 LRTS article by Stalberg & Cronin
Slide 34
In June 2009, the Heads of Technical Services in Large Research Libraries Interest Group of ALCTS sponsored a Task Force on the Cost/Value of Bibliographic Control. Members: Ann-Marie Breaux, John Chapman, Karen Coyle, Myung-Ja Han, Jennifer O'Brien Roper, Steven Shadle, Roberta Winjum, Chris Cronin, and Erin Stalberg.
Slide 35
The task group found that the technical services community has long struggled with making sound, evidence-based decisions about bibliographic control. If technical services is to attempt to perform cost/value assessment on bibliographic control, one of our first problems is a lack of operational definitions of value; we must create our own operational definitions of value with which to work.
Slide 36
Fundamental questions for defining value: 1. Can value be measured in ways that are non-numeric? 2. Is discussing relative value over intrinsic value helpful? 3. Does value equal use? 4. Is it possible to define a list of bibliographic elements that are high-value and others that are low-value?
Slide 37
While the charge was to develop measures for value, the Task
Force determined that doing so would not be helpful until the
community has a common vocabulary for what constitutes value and an
understanding of how value is attained, and until more user
research into which bibliographic elements result in true research
impact is conducted.
Slide 38
Operational definitions of value: 1. Discovery success. 2. Use. 3. Display understanding. 4. Ability of bibliographic data to operate on the open web and interoperate with vendors and suppliers in the bibliographic supply chain.
Slide 39
Operational definitions of value: 5. Ability to support the Functional Requirements for Bibliographic Records (FRBR) user tasks. 6. Throughput and timeliness. 7. Ability to support the library's administrative and management goals.
Slide 40
Value multipliers. The extent to which bibliographic data: are normalized; support collocation and disambiguation in discovery; use controlled terms across format and subject domains; match the level of granularity users expect; enable a formal and functional expression of relationships (links between resources) to find like items; are accurate; and allow enhancements to proliferate to derivative records.
Slide 41
Measuring cost. While the elements contributing to cost can be outlined, determining whether costs are too high is impossible without first having a clear understanding of value. Cost elements: salaries and benefits multiplied by time spent on a task; the cost of cataloging tools, such as software; time spent on database maintenance; overhead (training, policy development, documentation); opportunity costs.
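To make the "salaries and benefits multiplied by time" arithmetic concrete, here is a minimal sketch in Python. Every figure in it (the salaries, the 30% benefits multiplier, the minutes per record) is a hypothetical illustration, not data from the talk.

```python
# Minimal sketch: estimating the staff cost of a cataloging task from
# salary and time-study data. All figures are hypothetical.

HOURS_PER_YEAR = 2080        # full-time hours in a year (40 hrs x 52 wks)
BENEFITS_MULTIPLIER = 1.3    # assume benefits add ~30% to salary cost

def hourly_cost(annual_salary: float) -> float:
    """Fully loaded hourly cost of one staff member."""
    return annual_salary * BENEFITS_MULTIPLIER / HOURS_PER_YEAR

# (staff member, annual salary in dollars, minutes spent per record)
time_study = [
    ("cataloger_a", 52000, 12.5),
    ("cataloger_b", 48000, 9.0),
    ("cataloger_c", 55000, 15.0),
]

total_minutes = sum(minutes for _, _, minutes in time_study)
total_cost = sum(
    hourly_cost(salary) * minutes / 60 for _, salary, minutes in time_study
)

print(f"Average cost per record: ${total_cost / len(time_study):.2f}")
print(f"Average minutes per record: {total_minutes / len(time_study):.1f}")
```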
Slide 42
COLLECTING DATA
Slide 43
Types of data. Quantitative methods focus on numbers and frequencies and provide data that is easy to analyze statistically. Numbers: analysis of log data, systems reports, time data, web usage analytics, survey data (not free text). Qualitative methods capture descriptive data and focus on experience and meaning. Words: usability testing, focus groups, user interviews, ethnographic studies.
Slide 44
Coding qualitative data. "There's no such thing as qualitative data. Everything is either 1 or 0." (Fred Kerlinger) While qualitative data provides the important whys and hows of user behavior, it is difficult for us to digest large quantities of descriptive data. It is often useful to code qualitative data quantitatively for analysis, as sketched below.
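As an illustration, here is a minimal Python sketch of quantitative coding: hypothetical free-text comments are assigned numeric category codes so their frequencies can be counted. The codebook and comments are invented for the example; real coding schemes are developed from the data itself, often with multiple coders checking agreement.

```python
# Minimal sketch of coding qualitative data quantitatively: free-text
# comments (hypothetical) are mapped to numeric categories so that
# frequencies can be analyzed like any other quantitative data.
from collections import Counter

# Hypothetical codebook: category code -> label
CODEBOOK = {1: "discovery problem", 2: "display confusion", 3: "praise"}

# Each comment is coded by a human reader; here the codes are hard-wired.
coded_comments = [
    ("I couldn't find the journal at all", 1),
    ("What does 'uniform title' mean?", 2),
    ("The new record display is great", 3),
    ("Search gave me nothing useful", 1),
]

counts = Counter(code for _, code in coded_comments)
for code, n in counts.most_common():
    print(f"{CODEBOOK[code]}: {n} ({n / len(coded_comments):.0%})")
```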
Slide 45
Fear: assessment takes a lot of time. Reality: it depends on the methodology and data sources used. Qualitative data gathering, coding, and analysis usually take a lot of time. Systems can be set up to gather quantitative data programmatically; such data can be analyzed quickly, given the proper tools and skills. Quantitative data might also be gathered manually; data collection will be a hassle, but analysis will be quick.
Slide 46
Existing data or new data?
Slide 47
Slide 48
Collect new data. Know what questions the data needs to be able to answer. Define the data requirements and the structure of the data. Make sure you will be able to extract the data. Make sure the data format you've chosen will be interoperable with any other data you are using in an initiative.
Slide 49
Bad data planning
Slide 50
METHODOLOGIES / TECHNIQUES
Slide 51
A/B testing. Common in web-based marketing research. Involves an online performance comparison between a control group and a single-variable test group, measuring differences in web usage stats. Could test use differences based on: presence or absence of metadata, order of metadata display, metadata display labels.
Slide 52
Warning: if more than one variable is at play during A/B testing, it becomes difficult to know which variable was responsible for the performance differences achieved. As of summer 2012, Google Analytics includes a new A/B testing feature.
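To show what the A/B comparison looks like in practice, here is a minimal Python sketch of a two-proportion z-test on click-through rates for a control page (A) and a test page (B). The counts are hypothetical, and a real test should also verify sample-size assumptions before trusting the result.

```python
# Minimal sketch: comparing click-through rates between a control (A)
# and a test (B) page with a two-proportion z-test. Counts are invented.
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled proportion under the null hypothesis of no difference
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

z = two_proportion_z(clicks_a=120, views_a=4000, clicks_b=165, views_b=4100)
# |z| > 1.96 indicates significance at the 5% level (two-tailed)
print(f"z = {z:.2f}; significant at p < .05: {abs(z) > 1.96}")
```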
Slide 53
Focus groups A form of qualitative research in which a group of
people are asked about their perceptions and opinions of, or
interaction with, a product or service. Questions are asked in an
interactive group setting where participants are free to talk with
other group members. A moderator leads a small group of people who
share a common experience or characteristic through a discussion
using a pre-prepared script of open-ended questions.
Slide 54
Usability testing. Focuses on measuring a product's ability to meet its intended purpose by gathering direct input about how real users use the system. In contrast to a focus group or interview, usability testing captures users' behavior whether they are aware of it or not. Code as you go! Software such as Morae includes special features to help with this.
Slide 55
Slide 56
Slide 57
Web usage analytics data. What kinds of things can you track? Clicks on any link (Event Tracking); the pages people came from; the pages people go to next on your site; the pages people exit your site from.
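As one way to work with such data, here is a minimal Python sketch that summarizes a page-level report exported from an analytics tool as CSV. The file name and column names ('page', 'pageviews', 'exits') are assumptions for the example; adjust them to whatever your tool actually exports.

```python
# Minimal sketch: summarizing an exported web-analytics report with the
# standard library. File and column names are hypothetical.
import csv
from collections import defaultdict

views_by_page = defaultdict(int)
exits_by_page = defaultdict(int)

with open("pages_report.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects 'page', 'pageviews', 'exits'
        views_by_page[row["page"]] += int(row["pageviews"])
        exits_by_page[row["page"]] += int(row["exits"])

# Rank pages by exit rate to see where users leave the site
for page, views in sorted(
    views_by_page.items(),
    key=lambda kv: exits_by_page[kv[0]] / kv[1],
    reverse=True,
):
    print(f"{page}: {exits_by_page[page] / views:.0%} exit rate ({views} views)")
```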
Slide 58
Slide 59
Important to evaluating value: A/B testing, focus groups, usability studies, web usage analytics data. Important to evaluating cost: time studies.
Slide 60
Watch out for bias. Biased goal: to prove the value of X. Unbiased goal: to prove or disprove the value of X. It can be considered a serious ethical conflict to have the people who benefit from a certain outcome of an assessment be the same people who conduct the assessment.
Slide 61
Institutional Review Board (IRB). Charged with protecting the rights of human research subjects; mandated by federal law. Library projects are the least of their worries, but you may need to go through the process. Check with your Assessment Librarian or assessment committee; they may have guidelines for when to go through review.
Slide 62
Institutional Review Board (IRB). Guidelines provided by an IRB rep to UNC-CH: you must go through IRB if you are using human subjects and plan to publish results, make generalizable claims, or collect identifying or sensitive information (SSN, sexual orientation, names).
Slide 63
Institutional Review Board (IRB). All investigators listed on an IRB proposal must complete CITI online ethics training (~3 hours) the first time they submit one (no need to list every single person involved in the study!). Most library research is eligible for expedited approval. Tip for a quick process: list the IRB approval number of similar library studies already approved by the IRB.
Slide 64
THE ROLE OF MANAGEMENT
Slide 65
Earning buy-in. Discuss/explain assessment and why it's important. Consider bringing a speaker to a department meeting to talk about assessment practices, tools, ideas, etc. Show examples from other institutions of how assessment benefited a department. Explain that you aren't assessing staff; you're assessing workflows.
Slide 66
Dealing with fear of task timing. It's scary to be asked to time yourself as you conduct your daily tasks! There is an assumption that data collection is based on a desire to decrease the time spent on tasks, or to penalize those who take too long. Explain the big picture and what you're trying to achieve; consider whether anonymous data could be just as useful to you.
Slide 67
Prove that it makes a difference. Make sure you don't forget the fourth step of assessment: take ACTION based on findings. Inform staff of how their work informed your decision-making and helped the management of the department: show concrete changes. Praise staff participating in assessment library-wide; publicize the success of the assessment project within the library or department.
Slide 68
Provide staff with training. It's important to provide staff with the training they need to do what you're asking. Does required reporting involve querying an Access database? Crunching data in Excel (writing functions or making pivot tables)? Are you asking them to create a survey? These skills are often not listed in required job skills, but people are later tasked with data reporting.
Slide 69
Thank you! Joyce Chapman, [email protected]
Digital exhaust base matrix image credit: http://alwaysoncommunications.com/22/advanced-online-audience-buying/
Resources:
Kyrillidou, Martha. (Feb. 2008). "Planning for Results: Making the Data Work For You. Why Assess? What is assessment? What do we mean by actionable data?" FLICC meeting, The Cato Institute, Washington, D.C.
McKinsey Global Institute report. (May 2011). "Big data: The next frontier for innovation, competition, and productivity."
Stalberg, Erin & Christopher Cronin. (2011). "Assessing the Cost and Value of Bibliographic Control." Library Resources & Technical Services, 55(3), 124-137.
Weiner, Wendy. (2009). "Establishing a Culture of Assessment." Academe, 95(4).
Slide 70
SPEAKING OF ASSESSMENT Please fill out the evaluation survey
for the ALCTS events that you attended at ALA annual conference
2012 in Anaheim, CA:
http://www.surveymonkey.com/s/alctsevents2012?
Slide 71
GROUP DISCUSSION
Slide 72
Discussion questions: 1. What are the kinds of things we always want to know about ourselves (TS) but never invest the time in assessing? 2. What are your ideas about / experiences with how to create a culture of assessment? 3. What important assessment work do we need to do as a community? Can we initiate any of that collaborative work now?