
Who is using altmetrics tools? – The Altmetrics Conference – September 28-29, 2016

 Source: http://altmetricsconference.com/who-is-using-altmetrics-tools/


Who is using altmetrics tools?



This is a guest post contributed by Bianca Kramer and Jeroen Bosman, Utrecht University Library, The Netherlands.


In the six years since the publication of the Altmetrics manifesto, altmetrics have matured and expanded considerably. Applications to gather and display altmetrics data on articles and books alike are now available, both from independent companies and from large publishing houses. But to what extent are researchers actually using altmetrics tools, and how does this relate to their use of more traditional citation-based metrics tools?


Our recent global survey on research tool usage
provides some answers to these questions. For 17 different research
activities, the survey gathered information on which tools are used
across disciplines, research roles, career stages and countries. During
last year’s 2:AM altmetrics conference, we presented some preliminary findings
based on the almost 5,000 responses we had received at the time. Now,
the final results are available, from a total of 20,663 responses. All
data are available through Zenodo and can be explored using our interactive dashboard.
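For readers who prefer to explore the raw numbers rather than the dashboard, the data can be loaded with standard data-analysis tools. The sketch below is a minimal, hypothetical example using pandas; the file name, column names and values are placeholders rather than the actual fields of the Zenodo dataset, and would need to be adapted to its real layout.

```python
# Minimal sketch for exploring the survey data locally.
# NOTE: the file name, column names and values are placeholders (assumptions),
# not the actual fields of the Zenodo export.
import pandas as pd

df = pd.read_csv("survey_responses.csv")        # hypothetical local copy of the Zenodo data
researchers = df[df["role"] == "researcher"]    # hypothetical 'role' column

# Share of respondents ticking each preset option of the 'measuring impact' question,
# assuming one boolean column per preset tool.
impact_tools = ["jcr", "web_of_science", "scopus", "altmetric", "impactstory"]
usage_pct = researchers[impact_tools].mean().sort_values(ascending=False) * 100
print(usage_pct.round(1))
```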


In this blogpost, we’ll discuss some findings from our survey as they
relate to the use of altmetrics and more traditional metrics tools.
Specifically, the survey asked what tools respondents used for measuring
impact, with seven preselected options to tick, and an additional ‘(and
also) others’ option which prompted respondents to manually enter any
tools not among the preselected ones.




Figure 1 – Survey question on tools for measuring impact


Altmetrics – the librarians’ gap


Looking at the results for the preset options, it is apparent that for researchers, the impact factor (through JCR) and the large citation databases (Web of Science and Scopus) are still used most often for determining impact, with ‘pure’ altmetrics tools such as Altmetric and Impactstory scoring much lower (Figure 2). Librarians (who were asked which tools they recommend, not just which they use themselves) demonstrate a higher awareness of all tools than researchers do. This difference is particularly striking for Altmetric: librarians recommend this tool about as much as the large citation-based tools, while actual use by researchers is much lower.


Librarians may be both well aware of the criticisms of traditional metrics tools (especially citation metrics at the journal level, i.e. the impact factor) and well positioned to explore alternatives, thanks to their knowledge of the information landscape. On the other hand, the stakes for librarians are lower, as they are not the ones dependent on metrics-based criteria for funding and evaluation. Does the gap between what researchers use and what librarians recommend mean that librarians are out of touch with what researchers deem important? Or is there still a long way to go in making researchers aware of altmetrics tools?




Figure 2 – Percentage of researchers and librarians using/recommending metrics tools


The generation effect


Is there a difference in the use of traditional metrics versus altmetrics for researchers at various career stages? Taking the use of the impact factor and Altmetric as two examples, we see that while use of the impact factor increases the further researchers progress in their careers, postdocs are the group that uses Altmetric most often.




Figure 3 – Percentage of researchers in various career stages using IF and Altmetric


The increase in the use of the impact factor as a metric for impact likely reflects the growing (perceived) importance of the publication record as one’s career progresses. Indeed, for PhD students we see a sharp increase in the use of the impact factor once they have published their first paper (Figure 4). Despite extensive documentation of its caveats and drawbacks (e.g. here [WP Jeroen] and here [paper on distribution]), use of the impact factor is still so ingrained in the research process (especially in formal assessment and funding, despite widespread signing of DORA) that for now, it remains the dominant measure of impact across generations of researchers.


For altmetrics, exemplified here by the use of Altmetric, the picture is different. The use of Altmetric peaks at the postdoc level, which may reflect a combination of a willingness to explore new avenues and tools and the need to demonstrate the immediate impact of (recent) publications on the road to acquiring a faculty position. Interestingly, the use of Altmetric drops again for faculty. This could either be a ‘legacy’ effect, with older faculty using altmetrics far less than younger faculty and thus suppressing the average, or it could be that faculty overall (including recently appointed faculty) use altmetrics less than postdocs. Again, looking at the date of first publication clarifies this (Figure 4): the use of Altmetric is low across all generations of faculty. This means that even early-career faculty use Altmetric less than postdocs.


One possible explanation could be that while the use of altmetrics is perceived as useful during the transition from postdoc to faculty level, the tenure-track process focuses more heavily on traditional metrics, thus discouraging young faculty from using altmetrics. This is just a hypothesis, however, and we cannot rule out other explanations for our data. For example, we have not yet explored possible differences in the distribution of respondents across disciplines and/or countries, which could also play a role.



Figure 4 – Percentage of researchers in various career stages using IF and Altmetric, broken down by year of first publication. Only groups with over 100 respondents are included in this figure.


The disciplinary difference


We did look at the overall share of altmetrics tools across different disciplines. For this, we included all preset options as well as the top 10 ‘other’ tools mentioned by researchers, which for this question included Google Scholar and SCImago Journal Rank (both counted as traditional, citation-based tools) and ResearchGate and Academia (both counted as altmetrics tools because they include metrics beyond citations). The percentage of altmetrics tools among all metrics tools mentioned by researchers ranges from 10-11% for Physical Sciences and Engineering & Technology to 19% for Life Sciences (Figure 5).



Figure 5 – Share of altmetrics tools among all metrics tools (preset
options and top 10 ‘others’) for different disciplines (researchers
only, n=number of respondents)
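As a rough illustration of how such a share can be computed (this is not the authors’ actual analysis code), the sketch below counts tool mentions per discipline and divides altmetrics-tool mentions by all metrics-tool mentions. The tool classification follows the grouping described above, but the tool lists, file name and column names are assumptions.

```python
import pandas as pd

# Illustrative classification, following the grouping described above:
# Google Scholar and SCImago count as citation-based tools, while
# ResearchGate and Academia count as altmetrics tools. These lists are
# an assumed subset of the survey's preset and 'other' options.
ALTMETRICS_TOOLS = {"Altmetric", "Impactstory", "ResearchGate", "Academia"}
CITATION_TOOLS = {"JCR", "Web of Science", "Scopus", "Google Scholar", "SCImago"}

# 'mentions' is assumed to hold one row per (respondent, discipline, tool) mention.
mentions = pd.read_csv("impact_tool_mentions.csv")   # hypothetical file

def altmetrics_share(group: pd.DataFrame) -> float:
    """Percentage of altmetrics tools among all metrics tools mentioned."""
    total = group["tool"].isin(ALTMETRICS_TOOLS | CITATION_TOOLS).sum()
    alt = group["tool"].isin(ALTMETRICS_TOOLS).sum()
    return 100 * alt / total if total else float("nan")

share_by_discipline = mentions.groupby("discipline").apply(altmetrics_share)
print(share_by_discipline.round(1))   # e.g. roughly 10-11% vs 19% in Figure 5
```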



The high share of altmetrics tools in the Life Sciences might reflect the importance of published articles (for which altmetrics are readily available) in this field, combined perhaps with a propensity for experimentation among its researchers. It may come as a surprise that it is not Arts & Humanities that scores lowest, but rather Physical Sciences and Engineering & Technology. All three disciplines have research output (books and monographs for the former, preprints and conference papers for the latter two) that is less suited to traditional citation-based metrics. Arts & Humanities scholars seem to have embraced the possibilities of altmetrics to a larger extent than researchers in the “hard sciences”, who may view altmetrics as imprecise or irrelevant, or may not yet have encountered them much on the platforms they engage with (e.g. IEEE, arXiv). In general, the current expansion of altmetrics to other forms of scholarly output, like data and code, could increase uptake in many disciplines over the coming years.


One development to consider is the increasing incorporation of altmetrics in environments that traditionally showed only citation-based metrics (e.g. in Scopus and on journal websites). While this complicates future analyses of the use of altmetrics tools, it makes it easier for researchers of all stripes to become familiar with altmetrics – even researchers who have so far proven less inclined to use specific tools for this purpose. Increased awareness and use of altmetrics by researchers, combined with a broadening of assessment and funding criteria beyond citation-based metrics, will hopefully result in a more comprehensive and inclusive concept of ‘research impact’ across academia.


Bianca Kramer (@MsPhelps) and Jeroen Bosman (@jeroenbosman)


Utrecht University Library, The Netherlands




