Measuring Impact with Altmetrics: Is There One Tool To Rule Them All?

Research output: Contribution to conference › Poster

Abstract

Purpose: Researchers and institutions are faced with a growing number of tools to help maximize and track alternative metrics, or altmetrics. Unlike bibliographic databases, the coverage, functionality, and underlying search algorithms of these tools are often opaque. This poster describes an assessment of these unique resources.

Setting/Participants/Resources: Seven tools were examined: Altmetric Explorer, F1000, ImpactStory, Kudos, Mendeley Stats, Newsflo, and PlumX.
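As a minimal illustration of how one of these tools can be queried directly (and of how tool-specific the resulting data are), the sketch below calls Altmetric's public v1 API for a single DOI. This is not part of the poster's method; the endpoint and field names are assumptions drawn from Altmetric's public documentation, and the DOI is an arbitrary example.

    # A minimal sketch, not the poster's method: querying the public
    # Altmetric API (v1) for one DOI. Field names such as "score" are
    # assumptions from Altmetric's public docs; verify before relying on them.
    import json
    import urllib.error
    import urllib.request

    def fetch_altmetric(doi: str):
        """Return Altmetric's JSON record for a DOI, or None if untracked."""
        url = f"https://api.altmetric.com/v1/doi/{doi}"
        try:
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)
        except urllib.error.HTTPError as err:
            if err.code == 404:  # Altmetric has no record for this DOI
                return None
            raise

    record = fetch_altmetric("10.1038/nature12373")  # arbitrary example DOI
    if record:
        print(record.get("score"), record.get("cited_by_posts_count"))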

Brief Description: We investigated seven altmetrics-related tools to determine their utility both for the Libraries and for our users, considering each tool's functionality, intended purpose and audience, business model, transparency, accuracy, and flexibility, both for the Libraries as a service and resource delivery unit and for the individual researcher.

Results/Outcomes: While we found that no one tool met all of the articulated and anticipated needs of our Libraries or our users, we developed an overarching rubric which allows us to clearly communicate the benefits and potential challenges of each of these diverse tools. The scope of the tools was often limited, for example focusing only on social media engagement rather than a more robust picture of impact, and they consistently lacked functionality such as the ability to download search results. Associated costs were often ambiguous, as were the search algorithms and data sources included, and tools frequently failed to simultaneously address both individual and institutional needs.

Evaluation Method: A consensus-based model was used to develop an assessment rubric for altmetric tools.
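
The rubric itself is not reproduced in this record. As a purely hypothetical sketch of how a consensus rubric over the criteria named above might be encoded and applied, consider the following; every weight and score is an invented placeholder, not the authors' assessment data.

    # Hypothetical illustration only: the authors' actual rubric is not
    # reproduced here. Weights and scores below are invented placeholders.
    CRITERIA = ("functionality", "purpose and audience", "business model",
                "transparency", "accuracy", "flexibility")

    def weighted_score(scores, weights):
        """Combine per-criterion scores (e.g., 0-3) into one weighted total."""
        return sum(weights[c] * scores[c] for c in CRITERIA)

    weights = {c: 1.0 for c in CRITERIA}     # equal weighting as a placeholder
    one_tool = {c: 2 for c in CRITERIA}      # invented scores for one tool
    print(weighted_score(one_tool, weights)) # -> 12.0 with these placeholders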
Original language: English (US)
State: Published - Oct 8, 2018
Event: Medical Library Association Midwest Chapter Annual Meeting - Cleveland, United States
Duration: Oct 5, 2018 – Oct 9, 2018

Conference

Conference: Medical Library Association Midwest Chapter Annual Meeting
Country: United States
City: Cleveland
Period: 10/5/18 – 10/9/18

Cite this

Bakker, C. J., Chew, K. V., McBurney, J. L., Reed, D., & Aho, M. (2018). Measuring Impact with Altmetrics: Is There One Tool To Rule Them All? Poster session presented at the Medical Library Association Midwest Chapter Annual Meeting, Cleveland, United States.
