Disciplinary Differences in Applying E-Journal Usage Metrics

Research output: Contribution to conference › Poster

Abstract

Purpose: Determine whether the relationship between (a) journal downloads or rankings and (b) faculty authoring venues or citations to faculty work varies by discipline.

  • Does the strength of the correlations vary by discipline?
  • Do the social sciences or humanities differ from the physical or health sciences?
  • Are there differences between similar disciplines (e.g., physical vs. health sciences), or within a discipline (e.g., nursing vs. pharmacy)?
  • Do the newer ranking metrics Eigenfactor and SNIP correlate better with downloads and citations than Impact Factor?
  • Is Scopus a valid alternative to Local Journal Use Reports as a way of correlating faculty publication and citation practices with journal selections?

Methodology:

  • Usage data: 4 years (2009-2012) collected for each subject journal set: OpenURL link resolver article view requests and publishers' COUNTER article downloads.
  • Ranking data: 5-year Impact Factor, current Eigenfactor, and Source Normalized Impact per Paper (SNIP) recorded for each journal title.
  • Citation data: 2 years (2009-2010) collected from Thomson Local Journal Use Reports (LJUR); 4 years (2009-2012) from Elsevier SciVal (Scopus).
  • Journal value assessed by (1) author decisions to publish there, (2) external citations to these authors, and (3) cost effectiveness (via downloads *and* citations), using rank correlation coefficients to compare the different metrics.

Conclusions:

  • Inform selection decisions: use both LJUR and Scopus. LJUR reports more subscribed titles whose local faculty articles get cited by peers, but Scopus reports more subscribed journals that local faculty author in.
  • Obtain liaison/subject coordinator input: it is hard to centralize collection decisions if the "best fit" metrics vary by discipline.
  • Understand patterns of use: capture demographics of logins and interdisciplinary use.
  • Show value to the academy: defend the library tax on departments, and offer services that help faculty demonstrate impact, e.g., for tenure portfolios.
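The methodology's comparison step, computing rank correlation coefficients between usage, ranking, and citation metrics, can be sketched as below. This is an illustrative sketch only: the journal figures are invented for demonstration and are not data from the study, and the poster does not specify which rank correlation coefficient was used (Spearman's rho is assumed here).

```python
def ranks(values):
    """Assign 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented example: COUNTER downloads vs. citation counts for five journals
downloads = [1200, 850, 430, 2100, 300]
citations = [95, 60, 20, 150, 35]
print(round(spearman_rho(downloads, citations), 3))  # → 0.9
```

A high rho between two metrics (e.g., downloads and citations) for a subject journal set suggests they rank journals similarly, so either could inform the same selection decision; a low rho within one discipline is what drives the poster's "best fit metrics vary by discipline" conclusion.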
Original language: English (US)
State: Published - Aug 4, 2015
Event: Library Assessment Conference: Building Effective, Sustainable, Practical Assessment - University of Washington, Seattle, United States
Duration: Aug 4, 2015 - Oct 1, 2015

Conference

Conference: Library Assessment Conference
Abbreviated title: LAC2015
Country: United States
City: Seattle
Period: 8/4/15 - 10/1/15

Keywords

  • e-journals
  • metrics
  • impact factors
  • eigenfactors

Cite this

Chew, K. V., Lilyard, C. I., Stemper, J. A., & Schoenborn, M. (2015). Disciplinary Differences in Applying E-Journal Usage Metrics. Poster session presented at Library Assessment Conference, Seattle, United States.
