A Dashboard Report: Value Added Through Drill-Downs, Peer Comparisons, and Significance Tests

Page 1

A Dashboard Report: Value Added Through Drill-Downs, Peer Comparisons, and Significance Tests

Mary M. Sapp, Ph.D.

Assistant Vice President

Office of Planning & Institutional Research

University of Miami

SAIR, October 24, 2005

Page 2

Definition of Dashboard

A Dashboard is a visual display of Key Performance Indicators (KPIs) presented in a concise, intuitive format that allows decision makers to monitor institutional performance at a glance.

Page 3

Dashboard Characteristics

Provides visual display of important Key Performance Indicators (KPIs)

Uses concise, intuitive, “at-a-glance” format (uses icons and colors)

Offers high-level summary (reduces voluminous data)

Displays “gauges” to monitor key areas

Page 4

Uses of a Dashboard

Provides quick overview of institutional performance

Monitors progress of institution over time (trends)

Alerts user to problems (colors indicate positive/negative data)

Highlights important trends and/or comparisons with peers

Allows access to supporting analytics when needed to understand KPI results (drill down)

Page 5

Predecessors & Related Approaches

Executive Information Systems, and their predecessor, Decision Support Systems

On-Line Analytical Processing (OLAP) associated with data warehouses

Balanced scorecards

Key success factors

Benchmarking

Key performance indicators

Page 6

Why Do a Dashboard?

Senior managers:
– Want to monitor institutional performance
– Are very busy, with little time to study reports
– Value reports that clearly show conclusions
– Appreciate an overview, with indicators from different areas in one place
– Use both trends and peer data

“What you measure is what you get.”

Robert S. Kaplan and David P. Norton

Page 7

Impetus for Next Generation Dashboard Report

Session at Winter 2004 HEDS conference

Representatives from 4 HEDS institutions shared dashboard reports

Presentation & discussion prompted ideas about features that might add value

Dashboard report described here developed as a result

Example of how conference session led to project that would not have been done otherwise

Page 8

Characteristics of Dashboards Presented at HEDS Conference

All used single page (though some had 2nd page for definitions & instructions)

All presented trend data (changes over 1, 5, 6, and 10 years)

All used up/down arrows, </> icons, or “Up”/“Down” to show direction of trends

Three displayed minimum and maximum values for the trend period

Three used colors to show whether trends were positive or negative

One used peer data

Page 9

Questions Generated by HEDS Dashboards and Proposed Solutions

1. Concise or detailed?

HEDS: Laments about not being able to provide more detail (“senior administrators should want to see more”)

Reaction: Sympathized with viewpoint, but have learned most senior administrators want summaries, not detail

Next Generation Dashboard: Keep concise format plus links to optional graphs & tables

Page 10

Issues that Came Up in Discussion at HEDS and UM Solutions

2. Trends or peer data?

HEDS: All four dashboards used trend data; one also used peer data

Reaction: UM values peer data to support benchmarking

Next Generation Dashboard: Use both peer and trend data

Page 11

Issues that Came Up in Discussion at HEDS and UM Solutions

3. When should icon for trend or difference from peers appear?

HEDS: Dashboards seemed to display icons for all non-zero differences

Reaction: Didn’t want small differences to be treated as real changes

Next Generation Dashboard: Use p-values from regression and t-tests to control display of icons for trends and peer comparisons
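
The report implements this logic with Excel macros inside the spreadsheet (see the Macros and Spreadsheet slides below); purely as an illustration of the idea, here is a minimal Python sketch of gating the trend icon on the p-value of the regression slope. The 0.05 cutoff, the scipy call, and all names are assumptions for the example, not taken from the actual workbook.

    # Illustrative sketch only (the actual report uses Excel macros):
    # show a trend arrow only when the six-year regression slope is
    # statistically significant; otherwise treat the trend as flat.
    from scipy import stats

    def trend_icon(years, values, alpha=0.05):
        # alpha is an assumed cutoff; the original cutoff is not stated
        result = stats.linregress(years, values)
        if result.pvalue >= alpha:
            return "flat"                       # horizontal line, no arrow
        return "up" if result.slope > 0 else "down"

    # Hypothetical six-year acceptance-rate series ending at 44%
    print(trend_icon([2000, 2001, 2002, 2003, 2004, 2005],
                     [0.52, 0.51, 0.49, 0.48, 0.46, 0.44]))   # -> 'down'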

Page 12

Issues that Came Up in Discussion at HEDS and UM Solutions

4. Include minima and maxima?

HEDS: Three displayed minima and maxima over the trend period

Reaction: UM’s senior VP decided too cluttered

Next Generation Dashboard: Shows trends of own institution and 25th and 75th percentiles of peers, with no maxima or minima
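
For the drill-down graph, the peer band is simply the 25th and 75th percentile of the peer values in each year. Below is a minimal sketch of that calculation, assuming peer values are stored as one list per year; the data layout and names are illustrative, not the template’s.

    # Illustrative sketch: the 25th/75th percentile band of peer values
    # that is plotted alongside the institution's own trend line.
    import numpy as np

    def peer_band(peer_values_by_year):
        # peer_values_by_year: {year: [one value per peer institution]}
        return {year: (np.percentile(vals, 25), np.percentile(vals, 75))
                for year, vals in sorted(peer_values_by_year.items())}

    # Hypothetical peer acceptance rates for two years
    print(peer_band({2004: [0.38, 0.42, 0.47, 0.51, 0.55],
                     2005: [0.36, 0.41, 0.45, 0.50, 0.54]}))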

Page 13

Unique Aspects of Next Generation Dashboard

Provide drill-down links to graphs and tables for more detail, if desired

Provide peer data in addition to trends

Use regression (rather than maxima and minima) to determine direction of trends

Use statistical significance of slope (rather than just difference) to generate trend icons

Use t-tests to generate peer comparison summary

Functions like adding “Global Positioning System (GPS)” to your dashboard

Page 14

Implementation

Two dashboard reports: student indicators and faculty/financial indicators

17/21 KPIs on a single page

Box for each with current value, arrows to show trends, and text to show relation to peers

Links to more detailed graphs and tables

Page 15

Indicator Display

Upper left corner

Up arrow, down arrow, or horizontal line shows direction of UM trend for last 6 years (up vs. down vs. flat), based on slope of regression & p-values

Color based on desired outcome

Link to graph with trends for UM and 25th & 75th percentiles for peers

[Example indicator box: “Accept Rate”, current value 44%, peer comparison label “Mid. of Peers”]

Page 16

Display

Upper right corner

Shows relation to 12 peers: Above Peers vs. Below Peers vs. Mid. of Peers, based on t-tests (UM vs. mean of peers)

Color based on desired outcome

Link to table with five years of data for UM and peers

[Example indicator box: “Accept Rate”, current value 44%, peer comparison label “Mid. of Peers”]
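
The workbook computes these labels internally; as one plausible reading of “t-tests (UM vs. mean of peers),” the sketch below runs a one-sample t-test of the twelve peer values against UM’s current value and falls back to “Mid. of Peers” when the difference is not significant. The 0.05 cutoff, the scipy call, and the sample data are assumptions for illustration only.

    # Illustrative sketch only: label the KPI relative to the peer group
    # using a one-sample t-test of the peer values against UM's value.
    from scipy import stats

    def peer_label(um_value, peer_values, alpha=0.05):
        t_stat, p_value = stats.ttest_1samp(peer_values, um_value)
        if p_value >= alpha:
            return "Mid. of Peers"              # not significantly different
        peer_mean = sum(peer_values) / len(peer_values)
        return "Above Peers" if um_value > peer_mean else "Below Peers"

    # UM acceptance rate of 44% vs. twelve hypothetical peer rates
    peers = [0.36, 0.39, 0.41, 0.43, 0.45, 0.47,
             0.48, 0.50, 0.52, 0.55, 0.57, 0.60]
    print(peer_label(0.44, peers))              # -> 'Mid. of Peers' here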

Page 17

Macros

Used to display:
– Data for year chosen
– Direction of arrow icons
– Color of arrow (green for positive, red for negative, black for neutral)
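
The macro itself lives in the Excel workbook; the sketch below only restates the color rule in Python, combining the trend direction with the “color based on desired outcome” idea from the indicator display. The function, its parameters, and the example KPI are invented for illustration.

    # Illustrative sketch of the color rule (the real version is an Excel
    # macro): green when the KPI moves in the desired direction, red when
    # it moves against it, black when there is no significant trend.
    def arrow_color(direction, higher_is_better):
        # direction: 'up', 'down', or 'flat' (e.g., from the regression test)
        # higher_is_better: True for a KPI where an increase is the desired outcome
        if direction == "flat":
            return "black"
        desirable = (direction == "up") == higher_is_better
        return "green" if desirable else "red"

    # A falling acceptance rate would typically be shown in green
    print(arrow_color("down", higher_is_better=False))   # -> 'green'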

Page 18

Spreadsheet

Dashboard developed using Excel spreadsheet, with one sheet for the dashboard report and one sheet for each indicator (graph, peer data, and raw data)

Macro updates year and controls display of arrow icons (direction and color)

Spreadsheet with template for the dashboard and instructions for customizing it shared upon request (leave card)
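
Nothing in the template requires code outside Excel, but readers who want to pull the same data programmatically could walk a workbook laid out this way (one dashboard sheet plus one sheet per indicator) roughly as sketched below; the file name, sheet names, and cell address are hypothetical.

    # Illustrative sketch only: read per-indicator sheets from a workbook
    # organized as described above.  All names and addresses are made up.
    from openpyxl import load_workbook

    wb = load_workbook("dashboard_template.xlsx", data_only=True)
    for name in wb.sheetnames:
        if name == "Dashboard":                 # skip the summary sheet
            continue
        ws = wb[name]
        print(name, ws["B2"].value)             # e.g., the indicator's current value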

Page 19

Indicators Used

Selected with input from the Provost, Vice President for Enrollments, Senior Vice President for Business & Finance, and Treasurer

Mandatory criterion: availability of peer data; sources:
– CDS data from U.S. News (Peterson’s/Fiske for earlier years)
– IPEDS
– National Association of College and University Business Officers
– Council for the Aid to Education
– National Science Foundation
– Moody’s (average A data used instead of individual peers)
– National Academies

See last page of handout for list of indicators used by UM and HEDS institutions

Page 20

Dashboard Complements Existing Key Success Factors (KSF) Report

Distinction between monitoring “critical” measures (tactical/operational, usually updated on daily, weekly, or monthly basis) and tracking strategic outcomes (key to long-term goals, updated less often)

Both KSF & Dashboard presented to senior administrators in Operations Planning Meeting (KSF bi-monthly and each Dashboard annually)

Page 21

KSF Monitors Changes for Critical Tactical KPIs

KPIs in KSF usually related to process (e.g., admissions, revenue sources, and expenditures in various categories)

KSF indicators limited to indicators that change on a continuous (e.g., daily, monthly) basis, as captured at the end of each month

Page 22

Dashboard Monitors Strategic KPIs

KPIs related to effectiveness and quality (student quality and success, faculty characteristics, peer evaluations)

Dashboard KPIs not included in KSF because measured on annual rather than continuous basis

Dashboard KPIs limited to indicators for which peer data available

Page 23

Future Directions And Adaptations

Adapt Dashboard format for UM’s KSF report

Include targets and significant differences from targets instead of, or in addition to, peers

Make available online

Link directly to various data sources (e.g., data warehouse)

Apply at the school or department level

Allow individuals to personalize their own dashboards to include KPIs directly relevant to them

Page 24

Implementing Next Generation Dashboard at Other Institutions

Session focus is on effective presentation rather than integration of data into report (low-tech spreadsheet, with tables of existing data copied in)

Spreadsheet itself can be used or some of the key concepts can be adapted in other situations

Author will e-mail spreadsheet template and instructions to those interested

Page 25

Choosing KPIs

Choosing which KPIs to use is critical

Small amount of space, so choose carefully

Appropriateness of KPIs is institution-specific

Critical or strategic focus?

Need to interview key stakeholders to determine what data are important for them

Use different types of KPIs (e.g., quality, process, financial, personnel) to provide a balanced perspective

Page 26

Demo of Dashboard Spreadsheet

Copies of spreadsheet available upon request—e-mail [email protected]