“Alt” is German for “old”: It’s time to stop talking about metrics in terms of “alternative” metrics.

Todd Carpenter, Executive Director, NISO

February 23, 2015


•  Non-profit industry trade association accredited by ANSI

•  Mission of developing and maintaining technical standards related to information, documentation, discovery and distribution of published materials and media

•  Volunteer-driven organization: 400+ contributors spread out across the world

•  Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN

About    


1980s Alternative  

John  Taylor,  Duran  Duran  

1920s Alternative  

2010s Not-so Alternative  

Cole  Plante  

When it becomes a standard, it’s not alternative any more…

Since Jason Priem coined the term “altmetrics”:

•  There have been some 6,200 scholarly articles (per Google Scholar)
•  There have been 8 altmetrics conferences
•  Even I have presented on altmetrics 12 times!

(Just kidding)


More than just popularity

Research is pointing to a modest positive correlation between early-signal metrics (altmetrics) and later-signal metrics (citations):

•  Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective, by Rodrigo Costas, Zohreh Zahedi & Paul Wouters
•  Do Altmetrics Work? Twitter and Ten Other Social Web Services, by Mike Thelwall, Stefanie Haustein, Vincent Larivière & Cassidy R. Sugimoto
•  Earlier web usage statistics as predictors of later citation impact (2006), by Brody T, Harnad S & Carr L


Would  a  researcher  focus  on  only  one  data  source  or  methodological  approach?  

 

Duke University - Information Initiative at Duke (IID)

There aren’t metrics and “altmetrics”; there are only metrics!

 


We have been using non-citation-based metrics for decades


Traverse  Area  District  Library  (TADL)  

What has changed is our ability to collect and analyze data in ways that are meaningful to others

TRUST  

=  

What are the infrastructure elements of metrics we need standards for?

•  Granularity
•  Time Scale
•  Identification
•  Definition
•  Exchange


However, there are different metrics for different things. There are even different metrics for the same thing.


How fast are we going?

Pound-foot/seconds or kilogram-meter/seconds?

No researcher wants this to be the end of their career.

Are we measuring scholarship using “inches” or “meters”?

Image: Flickr user karindalziel

I often sound like a broken record

•  Defining what is to be counted = standards
•  How to describe what to count = standards
•  Identification of what to count = standards
•  Procedures for counting or not = standards
•  Aggregating counts from the network = standards
•  Exchange of what was counted = standards

Alternative Assessment Initiative

Phase 1 Meetings
•  October 9, 2013 - San Francisco, CA
•  December 11, 2013 - Washington, DC
•  January 23-24, 2014 - Philadelphia, PA
•  Round of 1-on-1 interviews - March/April 2014

Phase 1 report published in June 2014

 

Meeting Lightning Talks
•  Expectations of researchers
•  Exploring disciplinary differences in the use of social media in scholarly communication
•  Altmetrics as part of the services of a large university library system
•  Deriving altmetrics from annotation activity
•  Altmetrics for Institutional Repositories: Are the metadata ready?
•  Snowball Metrics: Global Standards for Institutional Benchmarking
•  International Standard Name Identifier
•  Altmetric.com, Plum Analytics, Mendeley reader survey
•  Twitter Inconsistency

“Lightning” by snowpeak is licensed under CC BY 2.0


30 One-on-One Interviews


White  Paper  Released  


Potential work themes
•  Definitions
•  Application to types of research outputs
•  Discovery implications
•  Research evaluation
•  Data quality and gaming
•  Grouping, aggregating, and granularity
•  Context
•  Adoption & Promotion



Alternative Assessment Initiative

Phase 2
•  Presentations of Phase 1 report (June 2014)
•  Prioritization effort (June - Aug 2014)
•  Project approval (Sept 2014)
•  Working group formation (Oct 2014)
•  Consensus development (Nov 2014 - Dec 2015)
•  Trial use period (Dec 2015 - Mar 2016)
•  Publication of final recommendations (Jun 2016)


[Chart: Community Feedback on Project Idea Themes (n=118). Responses ranked from Unimportant to Very important, shown as percentages from 0% to 100%.]


Top-ranked ideas (very important & important > 70%)

•  87.9% - 1. Develop specific definitions for alternative assessment metrics.
•  82.8% - 10. Promote and facilitate use of persistent identifiers in scholarly communications.
•  80.8% - 12. Develop strategies to improve data quality through normalization of source data across providers.
•  79.8% - 4. Identify research output types that are applicable to the use of metrics.
•  78.1% - 6. Define appropriate metrics and calculation methodologies for specific output types, such as software, datasets, or performances.
•  72.5% - 13. Explore creation of standardized APIs or download or exchange formats to facilitate data gathering.
•  70.7% - 11. Research issues surrounding the reproducibility of metrics across providers.


Alternative Assessments of our Assessment Initiative

•  White paper downloaded 6,400 times
•  21 substantive comments received
•  120 in-person and virtual participants at the meetings
•  These 3 meetings attracted >400 RSVPs for the live stream
•  Goal was to generate about 40 ideas; in total, more than 250 were generated
•  Project materials downloaded more than 22,000 times
•  More than 500 direct tweets using the #NISOALMI hashtag
•  Survey ranking of output by 118 people
•  Six articles in traditional news publications
•  15 blog posts about the initiative

 

For more

Project Site: www.niso.org/topics/tl/altmetrics_initiative/

White Paper: http://www.niso.org/apps/group_public/download.php/13295/niso_altmetrics_white_paper_draft_v4.pdf


Questions?

Todd Carpenter
Executive Director
[email protected]

National Information Standards Organization (NISO)
3600 Clipper Mill Road, Suite 302
Baltimore, MD 21211 USA
+1 (301) 654-2512
www.niso.org
