
IMPROVING DECISION MAKING IN SMALL SCHOOL SYSTEMS: AN EXAMINATION OF DATA LITERACY AND DATA DASHBOARD DESIGN

 

Client: Dorothy I. Height Community Academy Public Charter Schools

Project Liaison: Colin Welch, Data Specialist, Dorothy I. Height Community Academy Public Charter Schools

Prepared By: Jennifer Briones, Alison Friedman, Isabel Huston, Emily MacNeil, and Michael Gaskins

May 5, 2014

 


Table of Contents

Acknowledgements
List of Acronyms
Executive Summary
Project Rationale
    Introduction
    Data-Driven Decision Making
    Dashboard Creation
    Current Data Systems
    Research Questions
Background
    Community Academy Public Charter Schools
        Table 1: CAPCS Student Population by Campus
    Accountability and CAPCS
    Accountability and the Need for Accessible Data: The No Child Left Behind Act of 2001
    Applied Data-Driven Decision Making: Turning Data into Actionable Knowledge
        Figure 1: Framework for Describing Data-Driven Decision Making in Education
    Factors Affecting Data-Driven Decision Making
Overview of the Study
    Phase 1: Research-Informed Prototype Creation
    Phase 2: Data Collection with Semi-Structured Interviews
        Figure 2: Data Should Be Used to Improve Outcomes
        Table 2: CAPCS Stakeholders Optimistic About Data Literacy
        Table 3: Context is Crucial to a Dashboard
        Table 4: CAPCS Stakeholders Seek Trend Indicators on Dashboards
        Figure 3: CAPCS Stakeholders Reveal Most Important Data Points
    Phase 3: Final Dashboard Prototype Creation
        Figure 4: Sample Final Dashboard Prototype
    Dashboard Recommendation
Further Recommendations for Dashboard Use
Conclusion
Appendix A: Current CAPCS Dashboard
Appendix B: Board Summary Document
Appendix C: Initial Dashboard Prototype
Appendix D: Final Dashboard Prototype
Appendix E: Interview Protocol and Script
References


Acknowledgements

We would like to extend our sincere gratitude to the following individuals, without whom we would not have been able to complete this report:

Colin Welch, our Dorothy I. Height Community Academy Public Charter Schools liaison, for guiding us through the dashboard creation process and connecting us with multiple stakeholders;

The administration and management staff at the Dorothy I. Height Community Academy Public Charter Schools, for providing their time and honest feedback during interviews;

Professor Yas Nakib, for offering advice and providing us with resources and literature to write this report;

Megan Hatch, our Research Advisor, for guiding us throughout the research and report writing processes;

And Professor Elizabeth Rigby, for providing us with the necessary feedback, information, and tools to work with CAPCS and write this report.


List of Acronyms

ANet - The Achievement Network
CAPCS - Community Academy Public Charter Schools
DDDM - Data-driven decision making
ELL - English Language Learners
LEA - Local education agency
NCLB - The No Child Left Behind Act of 2001
OSSE - Office of the State Superintendent for Education
PCSB - Public Charter School Board
PMF - Performance Management Framework
SPED - Special Education


Executive Summary

The Dorothy I. Height Community Academy Public Charter Schools (CAPCS) form a charter school network in Washington, DC that serves grades pre-kindergarten through six. Like many schools, CAPCS uses data-driven decision making (DDDM) to track progress toward goals, determine effective instructional strategies, and meet accountability requirements set by local, state, and federal education agencies. CAPCS desires a data dashboard that can be utilized universally by school administrators, central office staff, and the Board of Trustees to aid in these processes. In collaboration with CAPCS and under the advisement of Professor Elizabeth Rigby and Research Advisor Megan Hatch, we developed the following research questions to guide the redesign of CAPCS' current dashboard:

1. What are the current best practices for creating dashboards?

2. How should CAPCS visualize data for use in making decisions?

3. What are essential contextual factors to foster implementation of data dashboards?

To address these questions, we conducted research to inform creation of an initial dashboard prototype, collected feedback from relevant CAPCS stakeholders, and created a finalized prototype based on that feedback. We also crafted this report, which includes analysis of stakeholder feedback and recommendations for the use of the revised dashboard.

Initial research for this project examined the concepts of DDDM and data literacy in an educational context to gain an understanding of how schools successfully implement these processes and integrate them into staff workflow. We found that developing a common culture of data literacy and buy-in for DDDM is perhaps as important as providing stakeholders with high-quality data analysis tools.

Through semi-structured interviews with a variety of stakeholders at CAPCS, we gained an understanding of what features people most wanted in a dashboard and what the context of data use and data literacy is at CAPCS. We found that CAPCS stakeholders are comfortable using data in their work, but they do not always feel that there is a strong culture of data literacy throughout the organization. For the dashboard, stakeholders were interested in a document that allowed them to find personally significant data quickly and to see performance trends over time.

In addition to creating the dashboard prototypes, we have included a detailed analysis of the feedback we received on the culture of data literacy and the use of data at CAPCS. In the final section of this report, we explain the features of the new dashboards and provide a set of further recommendations for implementing this revised dashboard. The recommendations for successful implementation are as follows:

1. Focus resources on building a strong and supportive culture of data literacy and use.

2. Individualize dashboards to meet stakeholders' diverse needs.

3. Standardize protocol for dashboard dissemination and create regular space for data analysis and collaboration.

4. Continue to improve dashboard and data systems as needs and culture at CAPCS evolve.


Project Rationale

Introduction

The Dorothy I. Height Community Academy Public Charter Schools (CAPCS) were founded in 1998 in response to the pressing need for a high-quality educational option for urban students in Washington, DC. CAPCS has five campuses and serves mostly low-income and minority students from grades pre-kindergarten through six. Like many schools, CAPCS uses data-driven decision making to track progress toward goals, determine effective instructional strategies, and meet accountability requirements set by local and state education agencies. One of the tools that CAPCS uses for data-driven decision making is a data dashboard, which uses graphs and charts to present and summarize critical school- and student-level data such as attendance, enrollment, and academic performance. For our Master of Public Policy Capstone project, Colin Welch, our CAPCS liaison, asked us to create updated prototypes for a new dashboard that could be used beginning in the 2014-2015 school year. We conducted research to inform creation of an initial dashboard prototype, collected feedback from relevant CAPCS stakeholders, and created a finalized prototype based on that feedback. In addition, we prepared an analysis of stakeholder feedback and recommendations for the use and implementation of the revised dashboard, which can be found later in this report.

Data-Driven Decision Making

The 2001 passage of the No Child Left Behind Act (NCLB) shifted the focus of school accountability toward performance. The policy aimed to improve transparency by mandating that educators and administrators meet specific data requirements in areas such as academic achievement levels, student learning, and teacher professional development. Districts that met the requirements would receive federal funding, while those that continually failed to meet them risked losing funding and having schools closed.

The policy was driven in part by the belief that the effective use of data is necessary to help leaders at all levels assess progress, make informed decisions, and ultimately improve student achievement. This process, known as data-driven decision making, has become an essential part of school management practice as federal standards-based accountability requirements have grown.

School systems like CAPCS create strategies that allow for effective DDDM through the use of tools such as data dashboards. Data dashboards are documents that use graphs and charts to present and summarize critical school- and student-level data such as enrollment, suspensions and expulsions, teacher attendance, and professional development.

Dashboard Creation

The purpose of this project was to provide an improved data dashboard that would better facilitate the decision-making processes of stakeholders at CAPCS beginning in the 2014-2015 school year. The new dashboard was created with several aims, including improving comprehension, readability, usability, interactivity, and implementation. The dashboard was to be shared internally with a range of decision makers and users such as central office staff, academy leaders (principals), instructional coaches, and the Board of Trustees. The effective use of the data in the dashboard will help these stakeholders assess programs and make informed decisions. Decisions based on data are crucial because of the high standards and performance requirements that the schools must achieve annually to retain their charter and funding.


Current Data Systems

A representative from CAPCS, Colin Welch, provided us with samples of dashboards that CAPCS has used in the past [Appendices A and B]. Mr. Welch also communicated how he intends to use the dashboards and provided suggestions for their look and feel. He requested that we review the samples provided, collect samples from other schools (or similar sources), review the literature pertaining to the topic, interview stakeholders within the organization, and create several sample dashboard designs. We finalized a template for CAPCS to use after creating an initial prototype based on focused research, promising practices, and feedback from key stakeholders.

CAPCS relies on seven data systems to manage its student and school information. CAPCS manages four of these data systems itself, while the Office of the State Superintendent for Education (OSSE) and the DC Public Charter School Board (PCSB) manage the other three. Data from this collection of systems flows into PowerSchool, the core student information system used by CAPCS. PowerSchool and other centralized information systems allow administrators and teachers to access enrollment, demographic, attendance, and discipline records using a single login and portal rather than several portals. Mr. Welch uses PowerSchool to create the existing data dashboard and a monthly summary for the Board of Trustees. By aggregating student and classroom information, Mr. Welch synthesizes key internal and accountability metrics into a single document. This document is then shared electronically and in print with school leaders, central office staff, and the Board of Trustees.
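The consolidation step described above is straightforward to express in code. The sketch below, in Python with pandas, shows how exports from separate systems might be joined on a shared student identifier and rolled up into campus-level metrics. The file names, column names, and join key are hypothetical; the report does not specify the structure of CAPCS' actual exports.

```python
import pandas as pd

# Hypothetical CSV exports from three separate data systems.
enrollment = pd.read_csv("powerschool_enrollment.csv")  # student_id, campus, grade
attendance = pd.read_csv("attendance_export.csv")       # student_id, days_present, days_enrolled
assessments = pd.read_csv("anet_assessments.csv")       # student_id, reading_pct, math_pct

# Join the exports on a shared student identifier, keeping every enrolled student.
merged = (
    enrollment
    .merge(attendance, on="student_id", how="left")
    .merge(assessments, on="student_id", how="left")
)

# Roll student-level records up to campus-level dashboard metrics.
by_campus = merged.groupby("campus")
summary = by_campus.agg(
    enrollment=("student_id", "count"),
    avg_reading=("reading_pct", "mean"),
    avg_math=("math_pct", "mean"),
)
summary["in_seat_attendance"] = (
    by_campus["days_present"].sum() / by_campus["days_enrolled"].sum()
)
print(summary)
```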


Research Questions

This project aimed to answer the following research questions:

1. What are the current best practices for creating dashboards?

2. How should CAPCS visualize data for use in making decisions?

3. What are essential contextual factors to foster implementation of data dashboards?

Background

Community Academy Public Charter Schools

The Dorothy I. Height Community Academy Public Charter Schools (CAPCS) were founded in 1998 in response to the pressing need for a high-quality educational option for urban students in Washington, DC. CAPCS serves students in pre-kindergarten through sixth grade at four traditional campuses located in Northwest and Northeast DC (Amos 1, Amos 2, Amos 3, and Butler) and an online campus (CAPCS Online). CAPCS' mission is to create a caring learning community where students acquire the knowledge, skills, and habits of mind to think critically; to read, write, speak, and listen effectively; to reason mathematically; to inquire scientifically; and to develop the social competence that ensures meeting the qualifications for acceptance to a competitive high school (Community Academy Public Charter Schools 2014). Table 1 below contains aggregated data from the District of Columbia Public Charter School Board (PCSB); as it demonstrates, the student population consists primarily of minority students from low-income families.


Table 1: CAPCS Student Population by Campus

                            Amos 1    Amos 2    Amos 3    Butler
Total Enrollment            510       280       479       308
African American            65.9%     62.5%     99.0%     61.7%
Hispanic/Latino             32.2%     35.4%     0.6%      28.2%
White                       0.0%      0.7%      0.0%      3.2%
Asian/Pacific Islander      0.2%      0.7%      0.0%      2.9%
Native American/Indian      1.4%      0.0%      0.2%      0.6%
Other                       0.4%      0.7%      0.2%      3.2%
English Language Learners   40.2%     45.7%     2.9%      31.5%
Low-Income                  87.8%     77.9%     89.4%     70.1%
Special Education           12.0%     6.4%      12.9%     10.7%

Source: DC Public Charter School Board, 2013 DC Public Charter School Performance Reports.

Accountability and CAPCS

According to its SY 2012-2013 annual report, CAPCS is committed to consistent monitoring of accountability and to increasing its responsiveness to data results. Beyond its own guiding values, CAPCS is accountable to multiple education agencies. First, its charter must be renewed every five years by the PCSB; CAPCS' charter was most recently renewed in 2013. Second, CAPCS is accountable to OSSE, the state education agency that governs all public schools in the District of Columbia. In addition, CAPCS is subject to federal achievement and attendance regulations created by the No Child Left Behind Act (NCLB). Finally, the school system is also held accountable by its own Board of Trustees.


The combined requirements of the PCSB, OSSE, and federal laws like NCLB oblige CAPCS to amass a large amount of data on its students' and staff's achievement, attendance, and other activities. As a result, CAPCS is utilizing the data it is required to collect to improve decision making on a day-to-day and year-to-year basis. These requirements, combined with the ability to access large swaths of data, led the central office at CAPCS to create internal data dashboards that can be used by the Board of Trustees, central office staff, and academy leaders to track goals and inform decision making.

Accountability and the Need for Accessible Data: The No Child Left Behind Act of 2001

The 2001 passage of NCLB mandated that educators and administrators meet specific data requirements in order to receive certain federal funding. This requirement was based on the assumption that more analysis and interpretation of data would lead to more informed decisions for school reform. The policy itself is based on the premise that accountability and accessible data will be a major mechanism for improving student achievement and schools as a whole (Linn 2002). School districts and charter management organizations are now required to report on a variety of performance measures such as achievement levels, student learning, and professional development (Park 2009). Performance-based accountability has improved transparency in education. Specifically, NCLB required that performance data be disaggregated by sub-group, such as low-income and minority students, students with disabilities, and English Language Learners (ELL). This provided data analysts with a clearer understanding of the situation at the school and district levels (Wong 2003).

The increase in available data allows teachers and administrators to evaluate existing capacities and identify weaknesses, monitor the progress and efficacy of programs, and inform future development plans and decisions (Park 2009). Together, these factors will hopefully lead to improved student performance. However, the benefits of data will not be realized until data are communicated effectively to an audience that is able to understand and interpret the information. A school needs internal motivation, structure, and capacity as well as external requirements (i.e., NCLB) in order to create an effective accountability system and a culture of DDDM (Sutherland 2004).

Although NCLB brought accountability and DDDM into the spotlight of education reform, DDDM is not a novel idea. DDDM in education originates from successful practices in industry and manufacturing, in which the assessment of input data yields successful and efficient output (Marsh 2006). Data were important in education reform for decades prior to the passage of NCLB: state requirements for data use in school improvement plans began in the 1970s, debates about measurement-driven instruction took place in the 1980s, and data use for strategic planning in school systems dates back to the 1980s and 1990s (Marsh 2006). Still, NCLB marks a greater transition to accountability because of its test-based requirements and its reporting of data in aggregated and disaggregated forms (Marsh 2006).

Schools now have a vast amount of data at their disposal and need mechanisms and tools that allow them to analyze the information and make decisions. Data dashboards that clearly and succinctly depict this information are an invaluable tool that educators and administrators can use to do their jobs more effectively.
As Sutherland (2004) discussed, both external and internal factors are necessary in order to create and maintain a culture of evaluation and data use. Assessment and data are only useful if there is the capacity to use that information effectively. A dashboard is an effective tool for this purpose. However, capacity for DDDM goes beyond having a dashboard for teachers and administrators; it also refers to the capacity of those teachers and administrators to interpret and analyze the information as it is presented to them.

Applied Data-Driven Decision Making: Turning Data into Actionable Knowledge

Many schools utilize the data made available by federal, state, and local requirements to better inform the decision making and strategy applied by various stakeholders. In the case of CAPCS, the Board of Trustees uses data to ensure that year-end goals are met. Other stakeholders such as central office staff, academy leaders, and instructional coaches use data to track their students' achievement and attendance, teacher professional development, and other important factors.

A body of literature, both theoretical and applied, examines effective and ineffective ways for a school system or school to apply DDDM to its day-to-day practices (see Figure 1). Figure 1 shows an applied framework that we created based on the literature and the research we conducted. It illustrates a path that might be taken when an actor employs DDDM. The dashed feedback line indicates that an actor might move between stages instead of following the arrows from step to step. The remainder of this section details the steps an actor might take to fully implement DDDM.

Figure 1: Framework for Describing Data-Driven Decision Making in Education

In coordination with Figure 1, the following steps are based on the literature and research and might be taken by a set of actors engaged in DDDM.

Step 1 - Gather and Organize Raw Data

First, actors gather and organize raw data in what is ideally the manner that most effectively matches their needs. There can be many types of data: input (school expenditures or demographics), process (information on financial operations or quality of instruction), outcome (dropout rate or student assessment), and satisfaction (opinions from teachers, students, parents, or members of the community) (Marsh 2004). These data can be quantitative or qualitative, simple or complex (Ikemoto 2007), and can be organized and stored in numerous ways. Some schools use student information systems like PowerSchool or data management systems created specifically for their needs. Others export data from a management system into a spreadsheet that configures the data into a tool that can inform selected stakeholders.


Step 2 - Information and Data Literacy

Once the data are gathered, they are presented to the relevant stakeholders and become information. Information might be presented in the form of a PDF, an Excel spreadsheet, or a program such as PowerSchool that is accessed via the Internet. The form data take when presented as information is extremely important. Bambrick-Santoyo (2010) notes that it is easy to gather data but hard to analyze and utilize their conclusions effectively. He also asserts that the ultimate end users must be kept in mind when creating a template that will be used for decision making.

In this step, a separate but important consideration is data literacy, a fundamental aspect of effective data use. In the modern era of DDDM, a school's vital information is no longer managed only by an exceptional principal, expert teacher, or central office member; all teachers and administrators are expected to be capable of conducting their own data analysis within their professional roles (Park 2009).

If stakeholders do not feel comfortable and regard data as overwhelming rather than as a useful tool, a dashboard will be unable to serve its intended purpose or be utilized to its maximum potential (Almy 2014). Additionally, in their study of district-wide data systems, Cho and Wayman found that it is important for district leadership to set a vision for how data will be used by all stakeholders across positions. Districts that actively cultivated a common culture of data literacy and data use were most successful at fully implementing DDDM (Cho and Wayman 2014).

Step 3 - Decisions from Data

In the third step, decisions are made when information is turned into actionable knowledge (Park 2009). Depending on what is being tracked, this knowledge might inform a decision, enable comparison of metrics, or lead the actor to take a new course of action. According to Bambrick-Santoyo (2010), decisions must be made and implemented in a timely manner. Additionally, the context of why and how the decisions are made and executed should be considered (Park 2009).

Step 4 - Implement Decisions for Impact

During the final step, the relevant actors implement the decisions that were made in the earlier steps. Like many actions in a school setting, proper implementation is vital not only for DDDM to be effective but also to ensure that the goal or metric is met or improved upon (Marsh 2006).

Factors Affecting Data-Driven Decision Making

Often, the reality of data-driven decision making is not as linear as outlined in the steps above or in the literature (Ikemoto 2007). In any system, there is a possibility that an actor will not follow the prescribed framework and will instead make a decision based on intuition, context, or some other factor. This reality makes it necessary for any group engaging in DDDM to consider the following factors: accessibility and timeliness of data; perceived validity of data; staff capacity and support; time; partnerships with external organizations; tools used; organizational culture and leadership; and policy context (Ikemoto 2007). Finally, the leaders of the school system or school should anticipate that an actor might make a decision outside the framework and in turn be influenced by the factors listed.

Overview of the Study

The study used a three-phase methodology to achieve the ultimate goal of creating a more effective and easily understood data dashboard for CAPCS. The first phase used data visualization research and CAPCS' stated needs to create a framework for the new dashboard prototype. The second phase utilized semi-structured interviews with key stakeholders to optimize the school performance dashboard. Stakeholders included members of the CAPCS community with a vested interest in data and accountability, such as academy leaders, central office leaders, instructional coaches, an English Language Learners (ELL) representative, a data associate, and a human resources representative. In the final phase, we created the new dashboard prototype that CAPCS can use to report school progress more effectively to stakeholders.

Phase 1: Research-Informed Prototype Creation

A dashboard is a visual display of the most important information needed to achieve one or more objectives. Typically, the information presented on a dashboard is consolidated and arranged on a single screen so that it can be monitored at a glance. Dashboards, which began to appear in the 1980s as a way for corporate executives to monitor key performance indicators for their entire organization, have recently become standard tools for decision makers at all levels and in all types of organizations.

The widespread use of dashboards by technology companies led to the perception that the efficacy of a dashboard results from the sophistication of the software used in its creation. While technology plays an important role in the speed and efficiency of information transfer, many dashboards fail to communicate with and add value to organizations due to poor design and implementation (Few 2006, 4).

Most recently, CAPCS relied on two data dashboards: one for CAPCS board members [Appendix B] and another designed for school leaders [Appendix A]. The board member dashboard was a two-page document that listed CAPCS' charter agreement targets, the status of each target, and notes on each target in tabular format. The school leader dashboard was a ten-page document that featured a detailed account of metrics related to literacy, math, and behavior, with over twenty graphs, seven tables, and a notes section.

Findings: Research-Informed Prototype Creation

While the dashboards provided a detailed account of the academic and behavioral performance of CAPCS students, several aspects of well-designed dashboards were absent. First, the multi-page design of the school leader dashboard made it impossible to view, understand, and interpret the information at a glance. The human brain can store only a limited amount of information in working memory, often referred to as short-term memory; research has shown that it can hold between five and nine items at any given time before they are forgotten (Miller 1956). In short, it is nearly impossible for the average person to make sense of large amounts of data spanning several pages. Second, the graphs lacked visual indicators, such as trend arrows or icons, that would alert users to improving or declining performance over time. Given the large number of metrics that schools must monitor and the limited time that staff are able to spend analyzing data, it is imperative to design dashboards that quickly highlight progress and areas of concern.

Based on the research by Few (2006) and Miller (1956), we created a dashboard prototype to address the shortcomings listed above [Appendix C]. First, our prototype shortened the dashboard from eleven pages to two by limiting the scope of the data presented to only primary indicators of academic and behavioral performance. Second, color-coded trend arrows were placed to the left of all graphs to indicate an improvement or decline in performance from the previous month. Third, all graphs featured data spanning the previous three months in order to show longer-term trends for each metric. Fourth, all graphs featured visual indicators marking CAPCS' current performance in relation to its end-of-year goals. The twofold aim of the prototype was to create graphics that help users quickly identify areas of progress and concern, and to present the key aspects of each metric without taxing the user's working memory, thereby allowing the overall picture of student performance to be understood in a short period of time.
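To make the trend-arrow logic concrete, here is a minimal sketch of how the indicator for each metric could be derived: compare the current month to the previous month to choose the arrow's direction and color, and compare the current value to the end-of-year goal to flag concerns. The metric names and values are hypothetical, and the actual prototype was assembled by hand in Microsoft Word rather than generated by code.

```python
def trend_indicator(current: float, previous: float, goal: float) -> str:
    """Return a colored arrow plus a goal flag for one dashboard metric."""
    arrow = "up (green)" if current >= previous else "down (red)"
    status = "on track" if current >= goal else "below goal"
    return f"{arrow}, {status}"

# Hypothetical campus metrics: (current month, previous month, end-of-year goal).
metrics = {
    "Reading proficiency (%)": (62.0, 58.5, 70.0),
    "In-seat attendance (%)": (93.1, 94.4, 95.0),
}

for name, (cur, prev, goal) in metrics.items():
    print(f"{name}: {cur} -> {trend_indicator(cur, prev, goal)}")
```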


Phase 2: Data Collection with Semi-Structured Interviews

In Phase 2, we conducted in-person semi-structured interviews to collect feedback from a representative set of stakeholders on the two current dashboards and our prototype. A total of 21 stakeholders from CAPCS were contacted, along with one stakeholder from another Washington, DC-based public charter school system. Twelve of the 21 CAPCS stakeholders were interviewed, for a response rate of 57 percent. All twelve interviews took place in Washington, DC at CAPCS' central office and its four physical campuses. Of the twelve stakeholders interviewed, seven were central office employees, two were academy leaders, and three were either instructional coaches or curriculum specialists. The interviews took place on various dates throughout the weeks of March 24, March 31, and April 7, 2014.

All interviews were conducted in person because displaying and explaining the multiple dashboards over the phone would likely have caused confusion and, therefore, produced less useful responses. Research shows that face-to-face is the best method for interviews that require visual aids or contain many open-ended questions (Wholey et al. 2010). We elected to interview stakeholders in a variety of roles because stakeholders tend to make sense of data systems based on their personal perceptions and the dominant data orientation of their respective workplaces (Cho 2014). Accordingly, we anticipated that each CAPCS stakeholder group would use the data dashboard in different ways.

We created an interview script, which also contained the interview protocol [Appendix E]. The purpose of this document was to maintain a standard interview process for all four interviewers. Three dashboards were used to assist the interview process and inform the creation of the final dashboard prototype. These dashboards were referred to as the "Current Tool" [Appendix A], "Dashboard A" [Appendix B], and "Dashboard B" [Appendix C]. They were chosen for use during interviews because of their differences in layout and content, which allowed the stakeholders to compare and contrast them. The "Current Tool" is a dashboard created in Microsoft Excel that Mr. Welch and the CAPCS data team use to display campus-specific information such as in-seat attendance, enrollment changes, and academic interventions. "Dashboard A" is a summary document that Mr. Welch prepares monthly in Microsoft Word; it contains campus-specific information such as charter agreement targets, attendance, re-enrollment, and community engagement. "Dashboard B" is the initial prototype we created in Microsoft Word, developed based on existing research on data visualization and conversations with Mr. Welch. "Dashboard B" contained fabricated campus-specific data such as reading and math proficiency, student absences, and parent event attendance.

We encountered some limitations while working on the interview portion of the project. First, we did not initiate contact with any CAPCS stakeholders ourselves; we had agreed that Mr. Welch would connect us with all of the stakeholders via email. Many of the stakeholders may not have responded because the interviews were being conducted during the DC CAS testing period. Additionally, central office managers determined that it would not be feasible for us to discuss the data dashboards with members of CAPCS' Board of Trustees. While these factors led to a small sample size, our results are representative of different levels of DDDM and data use at CAPCS. Additionally, out of respect for each interviewee's time, interviews were limited to 30 minutes; therefore, certain questions that we deemed unessential were omitted in some interviews. In a few cases, follow-up questions that were not on the interview script needed to be asked for clarification. Interviews with higher-level staff members, or those who were more familiar with the dashboards, tended to be much more open-ended because their greater data literacy led to more opinions and input on the prototypes and on data in general. This gave us additional information, which we were able to apply during creation of the final dashboard prototype.


Phase 2 Findings

Data Literacy Levels

During the semi-structured interviews, CAPCS staff members self-reported their personal levels of comfort using data to inform workplace decisions. They were asked: "On a scale of 1 to 5, with one being not at all comfortable and five being very comfortable, how comfortable would you say you are with using data to inform your work?" Of the twelve respondents, 75 percent scored their comfort levels at 4 or 5. In addition, the majority of surveyed CAPCS staff use data regularly in their decision making. They were asked: "In your position, how often do you use data to make decisions?" Of the twelve respondents, 67 percent said they use data to make decisions at least once a week. From these data, we can see that CAPCS has a basic culture of DDDM. For the most part, CAPCS staff fall somewhere between the second and third steps of Ikemoto's DDDM framework (2007). None of the stakeholders reported that they never use data in decision making, so we can conclude that data is viewed as a tool at CAPCS and that it may not be necessary to focus resources on developing very basic data literacy skills in staff members.

CAPCS stakeholders also share a common understanding of how data is used at CAPCS. As Figure 2 shows, central office employees, academy leaders, and instructional and curriculum staff all agree that CAPCS uses data in multiple ways. Figure 2 counts the number of stakeholders who identified one of three main buckets of data use: improving outcomes, tracking progress toward goals, and accountability. Each letter in the circles represents one respondent who identified that CAPCS uses data in a specific way. Letters are not unique across circles, so one respondent may be represented in multiple circles. This shows that many CAPCS employees have a complex understanding of how data is used within the organization. Looking only at the "Improving Outcomes" bucket, four central office employees, two academy leaders, and two instructional coaches agree that CAPCS uses data to improve outcomes. Additionally, three central office employees identified all three buckets as ways in which CAPCS uses data. No stakeholder thought that there was only one proper way to use data at CAPCS, and a majority of those responding to the question agreed that CAPCS uses data to improve student outcomes, track progress toward goals, and meet accountability requirements (internal and external).
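The percentages above are simple tallies over the twelve responses. The sketch below reproduces the arithmetic; the individual response values are hypothetical but consistent with the reported figures (9 of 12 rating their comfort at 4 or 5, and 8 of 12 using data at least weekly).

```python
# Hypothetical raw responses consistent with the reported percentages.
comfort = [5, 4, 4, 5, 3, 4, 5, 2, 4, 3, 5, 4]  # 1-5 Likert scale
uses_weekly = [True] * 8 + [False] * 4          # uses data at least weekly

pct_comfortable = 100 * sum(1 for c in comfort if c >= 4) / len(comfort)
pct_weekly = 100 * sum(uses_weekly) / len(uses_weekly)

print(f"{pct_comfortable:.0f}% rated their comfort at 4 or 5")   # 75%
print(f"{pct_weekly:.0f}% use data at least once a week")        # 67%
```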

Figure 2: Data Should Be Used to Improve Outcomes


The quote at the bottom of Figure 2 gets at the heart of the culture being cultivated among these stakeholder groups. Across all groups, data use is purposeful: these numbers are not used punitively to "catch" stakeholders doing wrong or underperforming; they are useful tools to be employed in the effort of creating the best schools possible for the students CAPCS serves. In their study of data use and sense making in school districts, Cho and Wayman (2014) found that school districts where multiple groups of stakeholders in disparate positions had a common understanding of the "why" of data use were more successful at creating a positive and productive data culture. The attitudes expressed in the interview process show that CAPCS has done a good job of setting a comprehensive and multifaceted vision of data use for its staff.

When asked whether CAPCS actively cultivates a culture of data literacy, responses were more mixed. Only one third of respondents agreed outright, but those who were neutral or disagreed gave optimistic or aspirational feedback about how CAPCS could reach a point where there is a true culture of data literacy (see Table 2 below).

Table 2: CAPCS Stakeholders Optimistic About Data Literacy

Do you agree or disagree that CAPCS cultivates a culture of data literacy?

Agreement:

"Absolutely. Definitely. Well, everybody is data-driven, from the top—from the central office—down to the campus…We understand the importance of data, I think more than we have before...and not just data as far as numbers. I mean data even as far as how many parents did you have show up at parent/teacher conferences? What do you think is attributed to them not coming? Just being able to talk teachers through certain things like that is one way to track the data."—Academy Leader

"There are many data meetings where we provide data to teachers and explain as well as show them where to find the information themselves. There is a focus on making sure everyone knows what the data means and how to use it."—Central Office Employee

Aspiration/Optimism:

"I agree, we are moving in that direction. We have someone specifically assigned to work on data and push that down into schools."—Central Office Employee

"I have worked in other cultures that are very big with numbers. We look at numbers but we don't let them drive us crazy."—Instructional Coach

"I think that certain individuals at CAPCS cultivate a culture of data literacy. I think they are really good about sharing their knowledge about data and helping other people understand data."—Central Office Employee

Those individuals who did agree that CAPCS has a culture of data literacy pointed out ways that the organization has provided more opportunities for employees to engage in analysis and discussion around data. Data conferences involving multiple stakeholder groups were a popular example, and are exactly the sort of occasion that will eventually lead to data sharing and collaboration across stakeholder groups. As one central office employee stated, "They [CAPCS stakeholders] are now seeing how data is helpful to guide instruction. Now it gives a reason to teachers...why we need them to do the things that they do." Continuing these practices will be fundamental to strengthening CAPCS' common vision of how and why data is important.

For those who did not agree that CAPCS cultivates a culture of data literacy, a recurring theme was a certain "skills silo," in which the data person has the knowledge and access to help others, but without whom analysis would not occur at all. Such a perception can be dangerous to an organization, as it causes groups without access to disengage from DDDM and to reject data as part of their own vision of CAPCS' essential properties and values (Cho and Wayman 2014). As a curriculum specialist stated, "I think that certain individuals at CAPCS cultivate a culture of data literacy...I don't think it's been infused in everybody." It will be important for CAPCS to continue to offer individuals opportunities to engage with data and to understand its role in their own responsibilities in order to cultivate a productive culture of DDDM.

Prototype Feedback

During the semi-structured interviews, CAPCS staff members were presented with three dashboards: the current tool being used by CAPCS [Appendix A], a board summary document referred to as "Dashboard A" [Appendix B], and our initial dashboard prototype, referred to as "Dashboard B" [Appendix C]. Of the respondents, 73 percent reported that the layout of the current tool was easy to read and understand. In addition, 73 percent reported that, based on the information included on the current tool and the way it is presented, the tool would help them make decisions more quickly. However, many of the stakeholders were unwilling to look at a dashboard for a long period of time in order to find the information they needed. This unwillingness became evident as they flipped through the current tool, which is over ten pages long. While looking through the current tool, one central office employee said, "There is way too much information on here." Stakeholders in all positions did like the first page of the current tool, a summary page containing information such as enrollment changes, attendance, academic interventions, and professional development; all pages following the summary page contain various charts and graphs for specific metrics.

During the interviews, stakeholders were asked what was missing from both Dashboard A and Dashboard B. Of the twelve respondents, only 42 percent stated that there were elements missing. This low rate confirmed what we had anticipated: stakeholders' ideal dashboard would combine the textual summaries and descriptions featured on Dashboard A with the visual charts and graphs featured on Dashboard B. Those who did identify missing elements offered the suggestions shown in Table 3. It became evident that context is an extremely important aspect of a data dashboard. Stakeholders suggested that perhaps there should be a dashboard containing subject-specific metrics. This emphasized that people in different positions are looking for different metrics: an instructional coach who focuses on math will want to see the students' progress in math, while a central office staffer may be more interested in attendance and enrollment.

Table 3: Context is Crucial to a Dashboard

What features are missing from both of these prototypes that you want to see? Why?

"The summative information is good, but I would need a break out per campus to really help inform decisions. It would also be helpful to see the comparison of performance to other ANet schools."—Central Office Employee

"It's just that when you say reading proficiency, I think, 'based on what?' I think that the sub-skill information would be most useful. In literacy, for example."—Curriculum Specialist

"...it would be helpful to see what's happening. Maybe a one or two-word description of what that intervention is, what that activity is."—Central Office Employee

 


Reactions to Dashboard A and Dashboard B were positive. When asked to choose which dashboard (A or B) they preferred at first glance, 90 percent of respondents chose Dashboard B, largely because of the colors, graphics, and simple layout of the prototype. After being given the chance to carefully review all three prototypes, 73 percent of respondents stated that they preferred Dashboard B. One of the main reasons for this preference was the colored trend arrows, which specified whether metrics had increased or decreased from the previous time period. As can be seen in Table 4, stakeholders had mixed reactions when asked what features of the current dashboard they preferred over the prototypes. Their responses once again depended on their position. For instance, one curriculum specialist indicated that she preferred the current tool because the information relevant to her work was not displayed on either Dashboard A or Dashboard B. One academy leader found the large amount of information displayed on the current tool useful: "It's all useful, it's all right here together…" Other responses depended on whether each individual was a visual learner who preferred graphs and charts over paragraph descriptions. The most common element that stakeholders identified as important was trend indicators.

Table 4: CAPCS Stakeholders Seek Trend Indicators on Dashboards

What features of the current dashboard do you prefer over the prototypes?

"There's more data here, for sure. It looks like...it's more complete here. Whether or not that's a plus or minus depends on the audience and what they want to see."—Central Office Employee

"I like the actual numbers versus percentages. Although, when you have the percentages on Dashboard B where it says if you increased from last month, those are very helpful. But for the actual count within each domain, I would prefer the number versus the percentage."—Central Office Employee

"The last year column for comparison is useful. I would like a full year summary, not just three months."—Instructional Coach

 

Figure  3:  CAPCS  Stakeholders  Reveal  Most  Important  Data  Points    

 


CAPCS stakeholders were also asked to identify the top three most important pieces of data on each dashboard. Figure 3 counts the stakeholders who identified attendance, enrollment, or academic interventions and strategies as important on the current tool. Each letter in the circles represents one respondent who identified that item as important; letters are not unique across circles, so one respondent may be represented in multiple circles. The figure shows that stakeholders in all positions identified attendance, enrollment, and academic interventions and strategies as important. All three items appear on the first page of the current tool, the summary page. Only a small minority took the time to flip through the document before answering the question, which underscores the importance of having both a summary page and different metrics for different stakeholders. For both Dashboards A and B, a majority of stakeholders identified the literacy and math targets as the most important elements on display.

Phase 3: Final Dashboard Prototype Creation

After conducting interviews with the CAPCS stakeholders, we transcribed and analyzed the responses. Through an analysis of stakeholder responses to questions comparing CAPCS’ current dashboards to our prototype, several themes emerged. First, stakeholders were reluctant to spend more than fifteen seconds reviewing a dashboard. Second, stakeholders favored visual indicators that specified when metrics had increased or decreased from the previous period. Third, in addition to accountability metrics, which relate to students’ overall proficiency in a subject area, stakeholders suggested that subject-specific skills metrics would provide more actionable insight. These results are expanded upon in the “Prototype Feedback” section of this report.

To address the major concerns listed above, we created two additional prototypes: a document-based dashboard using Microsoft Excel and a web-based interactive dashboard using Google Spreadsheets. Stakeholders wanted to identify problem areas in fifteen seconds or less, yet they also desired a greater level of detail for each subject area. We provided Mr. Welch with two strategies to reconcile these needs: (1) a document-based dashboard that featured conditionally formatted tables instead of charts, and (2) a web-based dashboard that allowed users to interactively explore accountability and behavioral metrics.

For the final document-based dashboard, metrics were summarized using tables instead of charts. Despite the positive feedback we received regarding the use of graphs in our prototype, it was impossible to summarize all of CAPCS’ required metrics in charts while maintaining a one-page limit. To compensate for the lack of charts, we used Microsoft Excel’s conditional formatting features to quickly highlight areas of progress and concern. Conditional formatting was configured so that metrics where CAPCS was failing to meet its yearly goals were automatically highlighted in red, while metrics where CAPCS was achieving its annual goals were highlighted in green. We also used arrow icons to indicate whether each metric had increased or decreased. Data related to each key metric were listed below it, and green and red arrow icons showed the increase or decrease of each related sub-metric [Appendix D]. Using tables allowed us to increase the number of metrics listed per page from a maximum of six to a maximum of 44, so all accountability metrics and subject-specific skills fit comfortably on a single page.

The final web-based dashboard featured interactive graphs created using Google Spreadsheets. It separated reading, math, and non-academic metrics into three tabs. Each tab featured an interface that allowed users to select metrics on an x-y axis and see how metrics changed in relation to one another over time. Users could also choose between two additional interactive viewing modes, an interactive bar chart and an interactive line chart [Appendix D]. Both charts gave users the ability to view animations of metrics as they changed over time.
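To make the rule logic of the document-based dashboard concrete, the following is a minimal sketch of how such a conditionally formatted table could be generated programmatically. It is illustrative only, not the workbook delivered to CAPCS: the metric names, values, goals, and file name are fabricated placeholders, and the rules simply mirror the behavior described above (green fill when a metric meets its yearly goal, red fill when it falls short, and arrow icons for period-over-period change).

```python
# Illustrative sketch (not the delivered workbook): rebuilds the red/green
# goal highlighting and trend-arrow icons described above with openpyxl.
# All metric names, values, and goals below are fabricated placeholders.
from openpyxl import Workbook
from openpyxl.styles import PatternFill
from openpyxl.formatting.rule import CellIsRule, IconSetRule

wb = Workbook()
ws = wb.active
ws.title = "Dashboard"

# One row per metric: current value, yearly goal, change vs. prior period
ws.append(["Metric", "Current", "Yearly Goal", "Change"])
ws.append(["Reading proficiency (%)", 62, 70, -3])
ws.append(["Math proficiency (%)", 74, 70, 2])
ws.append(["Attendance (%)", 93, 95, 1])

green = PatternFill(start_color="C6EFCE", end_color="C6EFCE", fill_type="solid")
red = PatternFill(start_color="FFC7CE", end_color="FFC7CE", fill_type="solid")

# Green when the current value meets or exceeds the yearly goal in column C...
ws.conditional_formatting.add(
    "B2:B4",
    CellIsRule(operator="greaterThanOrEqual", formula=["$C2"], fill=green),
)
# ...red when it falls short of the goal.
ws.conditional_formatting.add(
    "B2:B4",
    CellIsRule(operator="lessThan", formula=["$C2"], fill=red),
)

# Arrow icon set on the change column: down, flat, and up arrows around zero
ws.conditional_formatting.add(
    "D2:D4",
    IconSetRule("3Arrows", "num", [-100, 0, 1], showValue=True),
)

wb.save("dashboard_sketch.xlsx")
```

The same rules can, of course, be configured directly through Excel’s conditional formatting dialog; the code form simply documents the rule logic in a reproducible way.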


Dashboard Recommendation

Based on our review of the literature and our interactions with stakeholders, the following is a sample of our final dashboard prototype recommendation. The final dashboard can be seen in its entirety in Appendix D.

Figure  4:  Sample  Final  Dashboard  Prototype  

 

     

Further Recommendations for Dashboard Use

The new dashboard is an improved tool to assist with DDDM, but successful practice depends on successful implementation. This will take capacity building, professional development, and buy-in from all stakeholders. The following are further recommendations for implementing the dashboard that we feel will allow CAPCS to maximize the utility of this tool.

1. Focus resources on building a strong and supportive culture of data literacy and use.

Creating a whole-school culture of data use is important because educators interpret data through existing beliefs, values, assumptions, and practices (Sutherland 2004, 280). Research has found that, for such a culture to take hold, a teacher should lead the process and administrators should provide support by promoting data use. Central office staff are instrumental in making the concept of data use well known, but seeing one’s peers use data regularly is what encourages others to use it in everyday practice (Cho 2014); implementation research finds that teachers often respond to peers rather than superiors.

To ensure greater data literacy among teachers and administrators, CAPCS may wish to increase access to data and promote data skills through quality professional development and school policies (Almy 2014). This process should work through tiered supports for varying levels of data literacy. There should be an emphasis on developing the skills of those who are less data literate, but most resources should focus on integrating data into the daily practices of all stakeholders. This focus will help all staff see how they can use dashboards to go deep into interpretation to support better student outcomes and reach charter goals.

The interviews indicated that, because the previous dashboard implementation was not smooth, buy-in from implementers will need to be obtained to ensure that this rollout has a more positive outcome. Most people interviewed were not willing to spend more than fifteen seconds looking for the information they need; a pre-existing familiarity with the dashboard will therefore promote use.

2. Individualize dashboards to meet stakeholders’ diverse needs.

Individuals consistently gave feedback that they would like dashboards more specifically tailored to the needs of their particular positions. Such a structure would be useful to staff members in different positions who make disparate types of decisions. We therefore recommend creating a universal dashboard in addition to dashboards that contain subject-specific data, such as ELL, SPED, math, and reading. These specialized dashboards would contain less data irrelevant to a given stakeholder’s needs, making those stakeholders more likely to use them for decision making. This can be facilitated through the Google dashboard prototype, which is the easiest and least time-consuming way to customize data and give all stakeholders independent access to the specific information they need; a rough sketch of this approach follows.
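As a rough sketch of how such role-specific views might be generated, the example below filters a single master metrics table into one view per role. The roles, categories, and metric names are hypothetical placeholders; in the Google dashboard prototype the equivalent would be per-role tabs or filtered views in the spreadsheet rather than separate Excel sheets.

```python
# Illustrative sketch: derive role-specific dashboard views from one master
# metrics table. Roles, categories, and metric names are hypothetical.
import pandas as pd

# Master table: one row per metric (fabricated example values)
master = pd.DataFrame({
    "metric": ["Reading proficiency", "Math proficiency",
               "Attendance", "ELL progress"],
    "category": ["reading", "math", "non-academic", "ELL"],
    "value": [62, 74, 93, 58],
})

# Hypothetical mapping from stakeholder role to the categories they need
role_views = {
    "reading_coach": ["reading", "ELL"],
    "math_coach": ["math"],
    "central_office": ["reading", "math", "non-academic"],
}

# One sheet per role, so each stakeholder opens only relevant metrics
with pd.ExcelWriter("role_dashboards.xlsx") as writer:
    for role, categories in role_views.items():
        view = master[master["category"].isin(categories)]
        view.to_excel(writer, sheet_name=role, index=False)
```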

3. Standardize protocol for dashboard dissemination and create regular space for data analysis and collaboration.

A standard protocol for dashboard distribution is key to effective implementation. Stakeholder feedback indicates that dashboard delivery should occur at a consistent time every week. This would allow individuals to plan and budget time to review the data weekly and to be prepared for professional development sessions and data discussion meetings. Consistent distribution will reinforce data use as a regular part of stakeholders’ routines and help foster a culture of data use.

One of the most important factors considered during the creation of the data dashboard was ease of access to clear and actionable data. CAPCS has an extended school day, meaning there is limited time for teacher professional development during the day. This makes it even more essential to ensure that the time spent working with data dashboards is productive. Based on feedback from stakeholders, it would be beneficial to use professional development to give a basic overview of the dashboard and how to use it quickly and effectively. For instance, the data meetings and conferences that CAPCS holds could be scheduled regularly to coincide with the release of the dashboard.

4. Continue to improve dashboards and data systems as needs and culture at CAPCS evolve.

A thoughtful and well-executed implementation of the new dashboard is critical for success, but the process of improving data use does not stop once the new dashboard is in place. After the rollout of the new dashboards, Mr. Welch and the CAPCS Data Associate should continue to collect feedback from stakeholder groups. This feedback can feed an iterative process of continuous improvement: as interventions, school performance data, staff, and internal culture change, the dashboards and their delivery should change to reflect them.


Conclusion

The purpose of this project was to facilitate the decision-making process of stakeholders at Community Academy Public Charter Schools (CAPCS) by creating an updated data dashboard. To understand the needs and data literacy levels of stakeholders at different levels, we first used research on data-driven decision making (DDDM) and conversations with the CAPCS liaison to develop an initial dashboard prototype. We then conducted twelve semi-structured in-person interviews, during which we showed each stakeholder three dashboards: the current tool being used by CAPCS, a board summary document, and our initial prototype.

In terms of dashboard design and the information displayed, we found that stakeholders were reluctant to spend long periods of time reviewing a dashboard. Stakeholders favored visual indicators that specified when metrics had increased or decreased from the previous period. In addition to accountability metrics, which relate to students’ overall proficiency in a subject area, stakeholders suggested that metrics related to specific subject-area skills would provide more actionable insight. We also found that the majority of stakeholders use data to inform their work multiple times a week, which shows that CAPCS has a basic culture of DDDM and data literacy. This report provides additional recommendations and promising practices to assist CAPCS in improving decision making.


 

Appendix  A:  Current  CAPCS  Dashboard    

 


 

Appendix  B:  Board  Summary  Document    


 

Appendix  C:  Initial  Dashboard  Prototype    


 

Appendix  D:  Final  Dashboard  Prototype      

   


 

Appendix  E:  Interview  Protocol  and  Script    

 

Interview Questions

Intro: Good morning/afternoon. Thank you for taking the time to meet with me today. As you might know, I am part of a group of GW students working with CAPCS as part of our capstone project for our Master’s Degree. We are helping to redesign a data dashboard that can be used to help a variety of people in the CAPCS community get a good understanding of what is going on at the schools. Your input will help us to create the most useful tool for CAPCS. You can stop this interview or ask me to repeat a question at any time.

This interview should take about 20 minutes to complete. We are looking for really honest feedback about the current tools and the prototype that we’ve created. All of your answers will be completely confidential, and it is only through collecting this feedback that we can create the best dashboard possible. So please be as honest as you can as we go through these questions.

The prototype we created contains fabricated data and is for display purposes only. I’d like to record this interview, unless you have any objections.

Do you have any questions for me before we begin?

Great, then let’s get started.

1.   Name

2.   Title

3. Department (Mark only one oval.)

 Amos  1

 Amos  2

 Amos  3

 Butler

 Central  Office

 Board

 Other:  

4.   Date  of  Interview  

Example:  December  15,  2012


5. Location of Interview (Mark only one oval.)

 Amos  1

 Amos  2

 Amos  3

 Butler

 Central  Office

 Phone

 Other:  

6. Old Dashboard/Baseline - How familiar are you with the current dashboard being used?
(Showing only the old tool: Now, I am going to show you the data dashboard that is currently used to track student data and progress toward various end-of-year goals. I’m going to ask you some questions about your initial reactions to this dashboard and then ask you to explain your feelings about it in more detail.)
Mark only one oval.

Never seen it before
Sent to you but you don’t open it
Open it but don’t look at it often
Look at it but don’t use it in any decisions
Use it occasionally in decision-making (1-4 times in a year)
Use it frequently in decision-making (5+ times a year)

7. Old Dashboard/Baseline - At first look, what is your reaction to the layout? (Mark only one oval.)

 Confusing

 Easy  to  read  and  understand

8. Old Dashboard/Baseline - What do you find confusing? (If applicable)

 

 

 

 

 


9. Old Dashboard/Baseline - What do you find easy to read? (If applicable)

 

 

 

 

 

10. Old Dashboard/Baseline - Just looking at the tool, how many important or useful data points do you see displayed here?
Mark only one oval.

 None

 One  or  two

 Three  or  four

 Five  or  more  

 All  of  the  data  points  I  need  to  make  informed  decisions  in  my  position  are  listed  here

 Not  sure

11. Old Dashboard/Baseline - Can you identify the top three most important pieces of data you see on this dashboard?

 

 

 

 

 

12. Old Dashboard/Baseline - What are the top 2-3 items missing from this dashboard that would be most helpful to you in your position? (If applicable)

 

 

 

 

 


13. Old Dashboard/Baseline - Based on the information included on this dashboard and the way in which it is presented, do you feel this tool would help you make decisions more quickly? (If applicable)
Mark only one oval.

 Yes

 No  

 Maybe

14. How long do you feel you would be willing to look at a tool like this in order to find the information you need?

 

 

 

 

15. Old Dashboard/Baseline - Can you explain your feelings about this dashboard more in depth? Why do you feel this way?

 

 

 

 

16. New Dashboards - At first glance, which prototype do you prefer? Why?
(Based on what we know about CAPCS and decision-making processes at schools, our team has been working on an alternative to the current dashboard. We are going to show it to you now, along with the board summary document. We will ask a few questions about what you may or may not like about each item. Remember that the more honest you are, the better able we are to create the best possible product in the end.)

 

 

 

 


17. New Dashboards - Looking at both prototypes, can you tell me the top three things overall that you most like?

 

 

 

 

18. New Dashboards - Looking at both prototypes, can you list the top three things overall that you most dislike?

 

 

 

 

19. New Dashboards - Just looking at Prototype A, how many important or useful data points do you see displayed here?
Mark only one oval.

 None

 One  or  two

 Three  or  four

 Five  or  more  

 All  of  the  data  points  I  need  to  make  informed  decisions  in  my  position  are  listed  here

 Not  sure

20. New Dashboards - Can you name the top three most important pieces of data you see on Prototype A?

 

 

 

 


21. New Dashboards - What are the top 2-3 items that are missing from Prototype A that would be most helpful to you in your position? (If applicable)

 

 

 

 

 

22. New Dashboards - Just looking at Prototype B, how many important or useful data points do you see displayed here?
Mark only one oval.

 None

 One  or  two

 Three  or  four

 Five  or  more  

 All  of  the  data  points  I  need  to  make  informed  decisions  in  my  position  are  listed  here

 Not  sure

23. New Dashboards - Can you name the top three most important pieces of data you see on Prototype B?

 

 

 

 

24. New Dashboards - What are the top 2-3 items that are missing from Prototype B that would be most helpful to you in your position? (If applicable)

 

 

 

 

 


25. New Dashboards - Do you think either of these prototypes would help you make decisions more quickly? If so, which one and why? (If applicable)

 

 

 

 

26. New Dashboards - What features are missing from both of these prototypes that you want to see? Why?

 

 

 

 

 

27. New v. Old - Looking at all three documents, which one is your favorite? Why?
(Now, let’s take a look at the prototypes and the current dashboard. I am going to ask your opinions about all three. Remember to be as honest as possible.)

 

 

 

 

28. New v. Old - What features of the current dashboard do you prefer over the prototypes?

 

 

 

 

 


29. New v. Old - Now that you have all three documents in front of you, is there anything from the current dashboard that you can see is missing in the new prototypes?

 

 

 

 

 

30. Position Description and Data Literacy/Use - To start, can you tell me a little about your position? What is it you do for CAPCS in two to three sentences?
(Finally, I am going to ask a little about your position at CAPCS and your own experience with data. We want to cover the span of data literacy at CAPCS, and all answers will be confidential, so please be as honest about your skill level as possible. The more we know about the range of abilities at CAPCS, the better tailored our work can be to your needs.)

 

 

 

 

 

31. Position Description and Data Literacy/Use - In your position, how often do you use data to make decisions?
Mark only one oval.
Very seldom (1-2 times a year)
Seldom (3-4 times a year)
Often (once or twice a quarter)
Frequently (once or twice a week)
Very frequently (daily, multiple times a week)

32. Position Description and Data Literacy/Use - On a scale from 1-5, with one being not at all comfortable and five being very comfortable, how comfortable would you say you are with using data to inform your work?
Mark only one oval.
1 2 3 4 5
(1 = Not at all comfortable, 5 = Very comfortable)


33. Position Description and Data Literacy/Use - On a scale from 1-5, with one being no time at all and five being nearly all my time, how much time would you say you spend working with data in your position?
Mark only one oval.
1 2 3 4 5
(1 = No time at all, 5 = Nearly all the time)

34. Position Description and Data Literacy/Use - Can you talk a little about what you understand to be the reason why CAPCS uses data? Is it to improve student outcomes? Keep track of progress toward goals? Hold teachers and leaders accountable?

 

 

 

 

35. Position Description and Data Literacy/Use - Do you agree or disagree that CAPCS cultivates a culture of “data literacy”? Why or why not?

 

 

 

 

36. Is there anything else you would like to add or any suggestions you might have?
(Thank you so much for your time. Your honest feedback is going to go far in helping us to create the best possible tool for everyone at CAPCS. We will send you the prototypes so you can take a closer look at them. If you have any other comments or questions, please don’t hesitate to contact us.)

 

 

 

 


 

References

Almy, S., et al. (2014). Teacher data literacy: It’s about time. A brief for state policymakers. Data Quality Campaign.

Anderson, S., Leithwood, K., & Strauss, T. (2010). Leading data use in schools: Organizational conditions and practices at the school and district levels. Leadership and Policy in Schools, 9(3), 292-327.

Bambrick-Santoyo, P. (2010). Driven by data: A practical guide to improve instruction. Wiley.

Cho, V., & Wayman, J. C. (2014). Districts’ efforts for data use and computer data systems: The role of sensemaking in system use and implementation. Teachers College Record, 116(2).

Community Academy Public Charter Schools. (2014). Annual report 2012-2013 SY. Retrieved from https://www.capcs.org/about_us/annual_report.php

District of Columbia Public Charter School Board. (2013). Community Academy Public Charter Schools 2012-2013 charter renewal report. Retrieved from http://www.dcpcsb.org/data/files/capcs%20finalized%20renewal%20report[4].pdf

Ikemoto, S. G., & Marsh, J. A. (2007). Cutting through the “data-driven” mantra. RAND Corporation.

Linn, R. L., Baker, E. L., & Betebenner, D. W. (2002). Accountability systems: Implications of requirements of the No Child Left Behind Act of 2001. Educational Researcher, 31(6), 3-16.

Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education. RAND Corporation.

Park, V., & Datnow, A. (2009). Co-constructing distributed leadership: District and school connections in data-driven decision making. School Leadership and Management, 29(5), 477-494.

Shonkoff, J. P. (2000). Science, policy, and practice: Three cultures in search of a shared mission. Child Development, 71(1), 181-187.

Sutherland, S. (2004). Creating a culture of data use for continuous improvement: A case study of an Edison Project school. American Journal of Evaluation, 25(3), 277-293.

Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010). Handbook of practical program evaluation (3rd ed.). San Francisco: Jossey-Bass, 270.

Wong, K., & Sunderman, G. (2007). Education accountability as a presidential priority: No Child Left Behind and the Bush presidency. Publius: The Journal of Federalism, 37(3), 333-350.