Research Article

A cross-sectional analysis of Altmetric coverage of health research from Irish research organisations (2017-2023)

[version 1; peer review: 1 approved, 1 approved with reservations]
PUBLISHED 02 May 2025

Abstract

Background

Altmetric is the largest platform for tracking online attention given to research outputs such as scientific articles. Alt(ernative) metrics provide a broader and more immediate overview of research impact than traditional bibliometrics. We aimed to investigate the impact of health research outputs associated with Irish organisations, exploring the amount, type (medium), and trends of Altmetric coverage.

Methods

We used Altmetric institutional access and the Research Organisation Registry database to search 663 Irish research organisations for their health-related research outputs (1 January 2017 – 31 December 2023). We deduplicated and filtered outputs to include those related to at least one field of health research (established in our protocol). We used the OpenAlex API to gather additional bibliometric data. We used R (version 4.3.2) for descriptive analyses of bibliometrics (e.g., journal, open access status, etc.) and plotted yearly data. Zero-inflated negative binomial regression was used to test for relationships between Altmetric Attention Score (AAS) and traditional article-level citations.

Results

There were 58,056 unique health-related outputs from 303 Irish research organisations, most coming from the education (67.2%) and healthcare (23.8%) sectors. Outputs increased in 2020, peaking in 2021. Open access steadily increased over time. The most popular medium for dissemination was Twitter/X (mean: 21.98 per output), followed by news (1.38), Facebook (.31), and blogs (.18). The average AAS was 21.16 (median: 4). One in five outputs received an AAS of 0.

Conclusions

This work establishes publishing and Altmetric trends in recent years on a large dataset of research outputs associated with Irish research organisations. Attention varied across mediums, and while the average AAS was above 20 (generally considered as doing better than its contemporaries), many outputs received no attention from tracked mediums. Improved understanding of engagement with the Irish health research landscape can help researchers better navigate their locality and identify pathways for more effective public communication.

Keywords

Altmetric, science communication, media coverage, knowledge dissemination, bibliometrics, Research Performing Organisation

Introduction

The past decade has seen shifts towards more holistic assessments of research impact beyond traditional citation counts. Citation counts have been shown to be poor predictors of quality, citation-based impact metrics are largely oriented towards other academics, and citations generally take a long time to accrue due to the nature of academic publishing1,2. To provide a broader and more immediate overview of impact, alternative metrics, or altmetrics, have been proposed as a complementary measure to traditional bibliometrics like citation counts3,4.

Altmetric5,6 is the leading platform for tracking altmetrics, or the online attention given to research outputs like scientific articles, books, and chapters. Altmetric uses a research output’s unique identifier (e.g., Digital Object Identifier, DOI) to track coverage on social media networks such as Facebook, X (formerly Twitter), BlueSky, Reddit, and YouTube, as well as 4,000 global news outlets, policy documents, Wikipedia, blogs, Mendeley, and more7. Altmetric creates a weighted, amalgamated ‘Altmetric Attention Score’ (AAS)8, which has been shown to be associated with citation counts3,9 and journal impact factor10. The AAS is automatically calculated and is based on three main factors: volume (the AAS increases as more people mention an output), sources (a mention in, say, a blog post is weighted higher than a tweet), and authors (which accounts for the audience or biases of those mentioning it)8. This attention score has also been shown to be significantly higher for coronavirus disease (COVID-19)-related articles than for non-COVID-19 articles in 202011. Although there are concerns that some may confuse a high AAS with a high-quality piece of research12, it remains a common measure of attention3.

Monitoring alternative metrics can help us understand the broader social, cultural, or economic impact that a research output can have, far beyond typical academic circles. Coverage of research findings in the news and online has been shown to influence the public’s behaviour and perceptions of risk13–16. Despite the increasing globalisation of the news, countries still have their own unique landscapes, both in terms of governance and health system structures and in terms of demographics, audience engagement, and trust in news sources17,18. Understanding one’s local landscape can help researchers reflect upon the impact of their own work, identify more effective communication strategies19, and perhaps identify potential collaborators.

In recent years, in common with other countries, Ireland has emphasised the importance of aligning research dissemination and engagement more closely with implementation. To encourage this, Ireland has invested heavily in health research and healthcare reforms with its Health Service Executive Action Plan for Health Research (2019–2029)20, Sláintecare reform (initially launched in 2017)21,22, and Irish Research e-Library (IReL) open access publishing agreements (first signed in 2020)23. Within the context of healthcare and research reform and the COVID-19 pandemic, we aimed to map a piece of the complex local landscape of research in Ireland using a cross-sectional analysis of Altmetric data (2017–2023) from health research outputs associated with Irish organisations, examining how it evolved before, during, and after the COVID-19 pandemic. Our main research questions were: how did the amount of research outputs change over time, and specifically how did the presence of different topics fluctuate; how did Altmetric coverage of research outputs change during this period; and was there a relationship between the Altmetric data (as indicated by the Altmetric Attention Score) and citation data?

Methods

Study design, setting, and eligibility

We previously published this project’s protocol which provides more detail about the Altmetric dataset and planned analyses24. We used Altmetric to search for all research outputs (i.e., articles, books, and chapters) published between 1 January 2017 and 31 December 2023 that were associated with an Irish research organisation. This period was chosen to give insight into pre-, during-, and ‘post’-pandemic trends and to allow a time buffer for retrieving data (due to instability concerns). Associations were determined from an author’s affiliation being listed as attached to an Irish research organisation. Of note, order of authorship (e.g., first or senior) was not extracted; thus this is Irish-engaged research, not necessarily Irish-led. Data on attention scores from all Irish health research outputs were obtained using an institutional license for Altmetric Explorer, thus we are unable to share the datasets; however, the .rmd files and non-proprietary datasets are available on the Open Science Framework25.

We used R version 4.3.2 (https://posit.co/download/rstudio-desktop/) to analyse data for this project. We report this cross-sectional study according to the STROBE reporting guideline (Appendix 1)26. As the data are publicly available, no ethical approval or consent was needed for this study.

Datasets

Altmetric currently uses the Global Research Identifier Database (GRID) ID27 to identify organisations and maps to the current Research Organisation Registry (ROR) ID system28. ROR is a global registry which helps link researchers and their outputs to the research organisations where they work. As of 13 May 2024, there were 663 Irish research organisations listed in the ROR database, spanning multiple sectors: Archive, Company, Education, Facility, Government, Healthcare, Nonprofit, or Other. Three authors (BY, SA, MKS) accessed Altmetric Explorer to manually download comma-separated values (csv) files from 6–21 June 2024. Of the 663 listed organisations, 251 had no research outputs during our inclusion dates and 31 organisations were not, or were no longer, indexed in Altmetric, leaving an initial dataset of research outputs associated with 381 Irish research organisations.

As we were only interested in ‘health research’, we filtered our dataset to include only biomedical research outputs. To identify these outputs we used the 2020 Australian and New Zealand Standard Research Classification (ANZSRC) system’s Field of Research divisions relating to biomedical research29, which Altmetric uses to classify outputs. The following five divisions were used: Biological Sciences (31), Biomedical and Clinical Sciences (32), Chemical Sciences (34), Health Sciences (42), and Psychology (52). Excluded fields are listed in our protocol24. ANZSRC contains divisions (broad subject areas or research disciplines) which are further delineated into groups and fields. Each research output is classified to ≥1 division; where an output cannot be classified at this level, the division assignment is based on a journal-level classification. We included research outputs if they belonged to ≥1 of the five divisions listed above.
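The deduplication and division-based filter described above can be sketched as follows. This is an illustrative Python sketch (the study’s analyses were conducted in R), and the record keys (`doi`, `fields_of_research`) and the semicolon-separated field format are assumptions for illustration, not Altmetric Explorer’s actual export layout.

```python
# The five ANZSRC 2020 division codes retained in the protocol.
HEALTH_DIVISIONS = {"31", "32", "34", "42", "52"}

def is_health_related(fields: str) -> bool:
    """True if the output lists at least one included division.

    `fields` is assumed to be a semicolon-separated string such as
    '32 Biomedical and Clinical Sciences; 46 Information and Computing Sciences'
    (an illustrative format, not the real export layout).
    """
    codes = {part.strip().split()[0] for part in fields.split(";") if part.strip()}
    return bool(codes & HEALTH_DIVISIONS)

def filter_health_outputs(outputs: list[dict]) -> list[dict]:
    """Deduplicate on DOI, then keep only health-related outputs."""
    seen, kept = set(), []
    for record in outputs:
        if record["doi"] in seen:  # drop duplicates across organisations
            continue
        seen.add(record["doi"])
        if is_health_related(record["fields_of_research"]):
            kept.append(record)
    return kept
```

An output tagged with an excluded division alongside an included one (e.g., ‘42 Health Sciences; 46 Information and Computing Sciences’) is retained, matching the ≥1-division inclusion rule.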

After filtering and deduplicating our dataset, we retrieved additional citation and topic area information from the OpenAlex API30–34, which matched items based on their DOI. OpenAlex is an open-source catalogue with average reference numbers comparable to Web of Science and Scopus32. OpenAlex contains over 4,500 topics organised in a hierarchy of domain, field, and subfield. These correspond to the categories used by Scopus’s ASJC system, except that they are applied at the level of ‘works’ rather than at the journal level.
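A DOI-based OpenAlex lookup can be sketched as below (in Python, rather than the R used for the study). The `doi:` path, the `mailto` parameter, and the `cited_by_count` field are part of the public OpenAlex API; the helper names are our own.

```python
import json
from urllib.request import urlopen

OPENALEX_WORKS = "https://api.openalex.org/works"

def openalex_url(doi: str) -> str:
    # OpenAlex resolves a work directly from its DOI via the doi: namespace.
    return f"{OPENALEX_WORKS}/doi:{doi.lower()}"

def extract_citations(work: dict) -> int:
    # cited_by_count is OpenAlex's article-level citation count.
    return int(work.get("cited_by_count", 0))

def fetch_citations(doi: str, mailto: str = "you@example.org") -> int:
    # Live lookup (requires network); the mailto parameter opts into
    # OpenAlex's 'polite pool' for better rate limits.
    with urlopen(f"{openalex_url(doi)}?mailto={mailto}", timeout=30) as resp:
        return extract_citations(json.load(resp))
```

In practice, batching DOIs with the `filter=doi:` list syntax is more efficient than one request per output, but the per-DOI form above is the simplest to illustrate the matching step.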

Analysis

As established in our protocol24, we performed descriptive analyses (counts and frequencies) for the following bibliometric information: type of research output (i.e., journal article, book, chapter), open access status and type35, sector prominence, five subject divisions and their subdivisions, and 20 most frequent journals, funders, and organisations.

To investigate trends, we plotted the overall number of outputs, subjects, subdivision areas, sectors, and publishers on a yearly basis. When attempting to plot data on a daily and quarterly basis, we discovered indexing errors whereby 1 January of every year was overrepresented in the dataset. After contacting Altmetric support, we were notified that when a publication does not include a complete publication date, the Altmetric system automatically fills in the missing information with ‘01’, creating 1 January when both the month and day are unspecified. We also tried to address the date issues by pulling publication dates from the OpenAlex API, but there were even more incorrectly indexed items (e.g., showing years prior to 2017). For this reason, we deviated from our protocol and chose not to conduct any analyses on a daily or quarterly basis. We also did not use the yearly OpenAlex prespecified topics and concepts due to resource constraints, although the data are available on the Open Science Framework25.
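The imputation pattern described above can be screened for directly. A minimal sketch (Python; the function names are ours) flags possibly imputed dates and falls back to the yearly aggregation that was actually used:

```python
from collections import Counter
from datetime import date

def likely_imputed(d: date) -> bool:
    """Altmetric fills a missing month/day with '01', so 1 January is
    over-represented; treat any 1 January as possibly imputed."""
    return d.month == 1 and d.day == 1

def yearly_counts(dates: list[date]) -> Counter:
    """Aggregate by year only, the granularity unaffected by the imputation
    (a true 1 January publication still lands in the correct year)."""
    return Counter(d.year for d in dates)
```

A simple diagnostic is to compare the share of 1 January dates against the ~0.3% expected by chance; a large excess indicates imputation and argues against daily or quarterly plots.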

For Altmetric analyses, we calculated the overall and yearly averages and medians for the AAS and per medium (e.g., X, Facebook, policy documents).

We also performed several subset analyses. Firstly, we conducted a subset analysis of outputs with a score of ≥20, as Altmetric has indicated that this score can generally be considered as receiving more attention than most of its ‘colleagues’. Secondly, we removed outputs with an AAS of 0 to see how measures of central tendency would be affected. Next, we calculated the average and median AAS, overall and per medium, for a subset of items only on pre-print servers. These subgroup analyses were intended to highlight differences between ‘under-performers’, ‘over-performers’, and pre-prints, which rose in prominence during the pandemic.
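These subset analyses amount to recomputing central tendency over filtered score lists. A minimal sketch (Python rather than the study’s R; thresholds follow the text, the function names are ours):

```python
import statistics

def aas_summary(scores: list[float]) -> dict:
    """Mean and median AAS for one subset of outputs."""
    return {
        "n": len(scores),
        "mean": statistics.fmean(scores),
        "median": statistics.median(scores),
    }

def subsets(scores: list[float]) -> dict:
    """The three subsets used in the text: all outputs, zeros removed,
    and the 'over-performers' with AAS >= 20."""
    return {
        "all": aas_summary(scores),
        "nonzero": aas_summary([s for s in scores if s > 0]),
        "high_attention": aas_summary([s for s in scores if s >= 20]),
    }
```

Because AAS distributions are heavily right-skewed, removing zeros raises the mean far more than the median, which is why both measures are reported.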

Finally, as outlined in the published protocol24, we split our dataset by year and performed citation analyses on a subset of the data (2017–2020), as previous research has indicated that a 3-year citation window is relatively stable (e.g., the increasing trend generally stagnates)36,37. To investigate whether there was an association between the AAS and traditional article-level citation metrics (as measured by both Altmetric (Dimensions) and OpenAlex) and open access status, we used zero-inflated negative binomial regression models to account for the large number of AAS scores of zero in our dataset. The zero-inflated model contains two parts: 1) a binary model for the excess zeros and 2) a count model for the count process. Ignoring the zero-inflation component for these metrics could potentially lead to incorrect conclusions. A low probability of excess zeros indicates that almost all zeros in the AAS are adequately explained by the count process, rather than by a separate zero-generating mechanism.
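The two-part structure can be made concrete with the zero-inflated negative binomial probability mass function itself. This is a minimal Python sketch for illustration (the modelling in the paper was done in R), using the common NB2 parameterisation with Var = μ + αμ²:

```python
from math import exp, lgamma, log

def nb2_logpmf(y: int, mu: float, alpha: float) -> float:
    """Negative binomial (NB2) log-pmf with mean mu and variance mu + alpha*mu^2."""
    r = 1.0 / alpha  # dispersion: smaller alpha -> closer to Poisson
    return (lgamma(y + r) - lgamma(r) - lgamma(y + 1)
            + r * log(r / (r + mu)) + y * log(mu / (r + mu)))

def zinb_pmf(y: int, mu: float, alpha: float, pi: float) -> float:
    """Zero-inflated NB mixture: with probability pi the binary part emits a
    'structural' zero; otherwise the count part generates y (which may itself
    be zero, so zeros come from both components)."""
    nb = exp(nb2_logpmf(y, mu, alpha))
    return pi + (1 - pi) * nb if y == 0 else (1 - pi) * nb
```

Fitting maximises the summed log of this mixture with covariate-dependent μ and π; ready-made fitters include `pscl::zeroinfl(..., dist = "negbin")` in R and `ZeroInflatedNegativeBinomialP` in Python’s statsmodels.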

Results

Descriptive statistics

Of the 663 listed Irish research organisations, 381 (Appendix 2) had 135,855 research outputs within our time period, 100,016 of which were unique. After filtering and deduplicating research outputs to include only those belonging to at least one of our included areas of health research24, there were 58,056 unique health-related research outputs associated with 303 research organisations (Figure 1).


Figure 1. Search flow diagram.

There was an increase in the number of outputs from 2017 to 2020, with a steep incline from 2019 to 2020 (23%) and an increasing trend continuing into 2021 (8.7%). After 2021, the number of outputs declined but has not returned to pre-pandemic (2019) levels. (Figure 2) An overwhelming majority of research outputs were articles (96.6%), with far fewer chapters (3.34%) or books (<1%). A majority of outputs were available as open access (73.2%), with general increases over time, although rates in 2021 (78%) and 2022 (79%) were higher than in 2023 (76%) (Figure 3). Closed access has decreased over time, although it still represents roughly 1 in 5 outputs in the dataset. Open access (OA) is classified by Altmetric35 in several ways: 1) hybrid is freely available under an open licence in a paid-access journal, 2) green is freely available in an OA repository, 3) gold is published in a fully open access journal, and 4) bronze is freely available on the publisher’s website but without an open licence. Hybrid and gold access have increased over time, while green open access has remained relatively stable and bronze has decreased.


Figure 2. Research outputs over time (yearly).


Figure 3. Open Access (OA) status and type over time (yearly).

The sectors most represented were education (67.2%) and healthcare (23.8%), with the others far less prominent: facilities (4.6%), government (1.4%), other (1.1%), and less than 1% each for companies, non-profits, and archive organisations. Accordingly, the organisations with the most research outputs in our dataset were predominantly education (i.e., universities) and healthcare organisations. (Table 1) There was substantial cross-sectoral collaboration, with every sector collaborating at least once. The highest numbers of collaborations took place between education-healthcare, education-facility, facility-healthcare, and education-government organisations. (Appendix 3, Tables 1–2, Figure 1)

Table 1. Twenty most frequent organisations in the dataset and their number of health-related research outputs.

Number of Research Outputs   Organisation
12,992   University College Dublin
12,571   Trinity College Dublin
7,623    University College Cork
6,751    Royal College of Surgeons in Ireland [RCSI University of Medicine and Health Sciences]
6,410    University of Galway [NUI Galway]
4,695    University of Limerick
2,968    St. James's Hospital
2,577    St. Vincent's University Hospital
2,302    Beaumont Hospital
2,176    Mater Misericordiae University Hospital
2,063    Dublin City University
1,759    National University of Ireland (NUI)
1,597    NUI Maynooth
1,568    Teagasc - The Irish Agriculture and Food Development Authority
1,493    Cork University Hospital
1,414    University Hospital Galway
1,319    Tallaght (University) Hospital
1,173    Children's Health Ireland at Crumlin
1,093    Technological University Dublin
676      Health Service Executive

Of the five included ANZSRC Fields of Research (Biological Sciences, Biomedical and Clinical Sciences, Chemical Sciences, Health Sciences, and Psychology), Biomedical and Clinical Sciences was the most prominent (n = 33,060, 57%), although this category does have more subdivisions (16) than the others (6–10). Generally, there was not much fluctuation over the years. The five most frequent subdivisions were: clinical sciences (20.7%), health services and systems (9.7%), oncology and carcinogenesis (5.1%), public health (4.3%), and biochemistry and cell biology (3.5%) (Table 2). Of the clinical sciences outputs, 44% were from the generic clinical sciences subdivision, with oncology and carcinogenesis (10.8%), cardiovascular medicine and haematology (6.1%), reproductive medicine (5.7%), immunology (4.7%), and pharmacology and pharmaceutical sciences (4.7%) rounding out the top five specific clinical areas. (Appendix 3, Figures 2–4, Tables 3–4)

Table 2. Ten most prevalent Field of Research subjects over time (yearly).

Field of Research Subject (code)                 2017  2018  2019  2020  2021  2022  2023  Total     %
Clinical sciences (3202)                         1780  1807  1898  2503  2718  2462  2354  15522  20.7
Health services and systems (4203)                715   788   892  1062  1343  1259  1236   7295   9.7
Oncology and carcinogenesis (3211)                467   487   441   555   680   638   555   3823   5.1
Public health (4206)                              272   393   412   494   582   570   513   3236   4.3
Biochemistry and cell biology (3101)              323   352   348   383   404   392   416   2618   3.5
Microbiology (3107)                               300   286   335   404   418   391   410   2544   3.4
Cardiovascular medicine and haematology (3201)    217   236   281   326   389   349   341   2139   2.9
Biological psychology (5202)                      258   283   261   354   342   299   294   2091   2.8
Nursing (4205)                                    215   233   254   273   360   357   319   2011   2.7
Reproductive medicine (3215)                      231   278   266   324   339   298   266   2002   2.7

The 20 most frequent publishers in the dataset align with the largest publishers globally (e.g., Elsevier, Springer, Wiley) and showed little yearly variation (remaining in the top 20), although MDPI and Frontiers became much more prominent over time. (Appendix 3, Figures 5–6) When investigating the most prevalent journals in the dataset, it became clear that Altmetric makes no distinction nor linkage between a pre-print and a post-print, as both are classified as articles with separate DOIs. Pre-print servers (i.e., platforms that only host pre-prints and not peer-reviewed post-prints) and hybrid publishing platforms (i.e., platforms that post a pre-print, conduct open peer review, and publish a final indexed version, such as HRB Open Research and F1000) were all within the 20 most frequently reported journals in the dataset. (Appendix 3, Figure 7) Upon investigating known pre-print servers (i.e., bioRxiv, ChemRxiv, medRxiv, and Research Square) and hybrid open-access publishing platforms (i.e., Gates Open Research, F1000 Open Research, HRB Open Research, and Wellcome Open Research), there were 2,299 pre-prints from the major pre-print servers and 528 (411 unique) items on the hybrid platforms in our entire dataset (n = 58,056), representing less than 5% of the dataset. Of note, each output has a unique AAS (i.e., a pre-print and a post-print discussing the same research project could have different AASs), so we did not exclude them from analyses. Prominent Irish and European funding agencies were well represented among the top 20 of the 1,959 unique funders in our dataset (Table 3).

Table 3. Twenty most frequently referenced funders.

Affiliated Research Outputs   Funder
7,851   Science Foundation Ireland
6,941   European Commission
3,744   Health Research Board
3,090   Irish Research Council
2,236   Medical Research Council
1,925   Wellcome Trust
1,835   European Research Council
1,773   Department of Health and Social Care
1,503   National Institute for Health and Care Research (NICE), UK
889     Deutsche Forschungsgemeinschaft
834     National Natural Science Foundation of China
763     National Cancer Institute, Bethesda, United States
747     Biotechnology and Biological Sciences Research Council
674     National Health and Medical Research Council
640     Ministry of Economy, Industry, and Competitiveness
622     Canadian Institutes of Health Research
608     National Institute on Aging
605     Enterprise Ireland
582     National Heart, Lung, and Blood Institute
548     Higher Education Authority

Altmetric attention

Of the 58,056 unique health-related research outputs in our dataset, 10,917 (18.8%) had an AAS of 0, meaning that no sources covered them. (Table 4, Table 5) There were also no Mendeley readers for 1,754 of these outputs, indicating that they received neither academic nor public attention from the Altmetric-tracked sources. 9,054 outputs (15.6%) had an AAS of 20 or above (considered as doing better than most of their contemporaries)38. Outputs which were on pre-print servers had the lowest AAS, average mentions, and citations.

Table 4. Average (SD) citation metrics per dataset, Field of Research, and year.

Citation metrics: Mendeley readers | Dimensions Citations | OpenAlex Citations*; all values Average (SD)

Datasets
Fully de-duplicated (n = 58,056):          48.0 (106.0)  | 21.9 (79.1)   | 26.2 (94.8)
0 AAS removed (n = 47,139):                55.0 (115.2)  | 25.4 (87.0)   | 30.4 (104.4)
>20 only (n = 9,054):                      125.8 (221.6) | 67.3 (180.2)  | 80.7 (216.8)
Only on pre-print servers (n = 2,299):     5.4 (23.7)    | 2.6 (8.5)     | 3.1 (9.3)
Years
2017 (n = 6,646):                          79.3 (143.4)  | 41.2 (111.5)  | 46.6 (127.2)
2018 (n = 7,109):                          72.3 (140.4)  | 37.1 (134.1)  | 42.2 (151.1)
2019 (n = 7,402):                          62.3 (109.2)  | 28.9 (76.2)   | 33.7 (99.5)
2020 (n = 9,132):                          61.3 (138.5)  | 26.9 (88.7)   | 32.2 (116.3)
2021 (n = 9,929):                          40.4 (73.8)   | 17.1 (46.8)   | 20.9 (56.6)
2022 (n = 9,104):                          23.5 (49.2)   | 8.4 (24.2)    | 11.6 (33.2)
2023 (n = 8,734):                          12.5 (23.2)   | 3.5 (11.3)    | 6.2 (17.9)
Fields of Research
Biological Sciences (n = 10,792):          51.2 (132.8)  | 27.6 (116.8)  | 32.7 (135.0)
Biomedical/Clinical Sciences (n = 33,060): 46.7 (107.2)  | 22.5 (76.6)   | 26.9 (94.6)
Chemical Sciences (n = 5,131):             31.2 (60.6)   | 22.5 (52.6)   | 25.7 (58.8)
Health Sciences (n = 14,283):              55.8 (29.0)   | 16.2 (61.1)   | 20.0 (74.8)
Psychology (n = 5,733):                    57.1 (107.9)  | 21.1 (56.9)   | 25.6 (68.1)

*OpenAlex citation data was pulled on 6 December 2024.

SD: standard deviation.

Table 5. Altmetric Attention Score (AAS) and medium average(SD) per dataset, Field of Research, and year.

Sources of Attention*: AAS | Blogs | Facebook | News | Patent | Policy | Reddit | Video | Wikipedia | Twitter/X; all values avg (SD)

Datasets
Fully de-duplicated (n = 58,056): 21.16 (113.97) | .18 (1.19) | .31 (1.63) | 1.38 (11.80) | .04 (.62) | .07 (.57) | .03 (.31) | .03 (.47) | .11 (5.26) | 21.98 (169.96)
0 AAS removed (n = 47,139): 26.06 (125.97) | .22 (1.32) | .38 (1.80) | 1.70 (13.08) | .05 (.69) | .09 (.64) | .04 (.35) | .03 (.53) | .13 (5.83) | 27.07 (188.25)
>20 only (n = 9,054): 111.95 (270.92) | .92 (2.87) | 1.33 (3.80) | 8.52 (28.84) | .11 (1.04) | .26 (1.26) | .14 (.69) | .15 (1.18) | .54 (13.28) | 107.66 (419.71)
On pre-print servers (n = 2,299): 13.97 (67.53) | .13 (.67) | .06 (.40) | .35 (4.41) | .01 (.12) | .02 (.26) | 0 (0) | .02 (.34) | .08 (1.04) | 21.32 (92.77)
Years
2017 (n = 6,646): 17.54 (72.08) | .20 (.99) | .67 (3.00) | 1.20 (8.15) | .12 (1.18) | .12 (.70) | .02 (.20) | .04 (.69) | .14 (1.27) | 15.02 (58.30)
2018 (n = 7,109): 20.61 (90.95) | .21 (1.03) | .50 (2.35) | 1.31 (0) | .09 (0) | .11 (.59) | .02 (.21) | .05 (.84) | .15 (1.93) | 20.08 (86.55)
2019 (n = 7,402): 19.37 (92.45) | .15 (.82) | .39 (1.89) | 1.15 (9.34) | .05 (.53) | .08 (.61) | .03 (.27) | .03 (.37) | .12 (2.20) | 20.37 (84.08)
2020 (n = 9,132): 24.67 (128.69) | .22 (1.95) | .27 (1.13) | 1.48 (11.74) | .04 (.42) | .10 (.83) | .04 (.36) | .03 (.48) | .10 (1.63) | 27.93 (209.07)
2021 (n = 9,929): 22.44 (135.14) | .17 (1.42) | .18 (.80) | 1.43 (12.39) | .02 (.22) | .07 (.58) | .03 (.36) | .02 (.26) | .06 (.56) | 24.98 (219.34)
2022 (n = 9,104): 22.03 (118.52) | .15 (.74) | .16 (.75) | 1.59 (13.92) | .01 (.15) | .03 (.28) | .04 (.35) | .02 (.29) | .18 (12.85) | 22.27 (163.85)
2023 (n = 8,734): 19.83 (125.15) | .14 (.68) | .15 (.69) | 1.37 (14.10) | 0 (.07) | .01 (.19) | .03 (.33) | .01 (.18) | .03 (.33) | 20.26 (221.09)
Fields of Research
Biological Sciences (n = 10,792): 28.37 (142.91) | .32 (2.02) | .33 (1.40) | 1.91 (13.43) | .08 (1.04) | .04 (.40) | .05 (.42) | .03 (.33) | .38 (12.15) | 27.35 (225.16)
Biomedical/Clinical Sciences (n = 33,060): 21.65 (118.48) | .15 (.80) | .36 (1.87) | 1.48 (12.81) | .04 (.62) | .07 (.59) | .03 (.28) | .03 (.48) | .05 (.48) | 22.62 (4)
Chemical Sciences (n = 5,131): 5.85 (24.58) | .04 (.31) | .06 (.33) | .27 (.24) | .07 (.57) | .01 (.14) | .02 (.30) | .01 (.25) | .03 (.31) | 6.08 (26.66)
Health Sciences (n = 14,283): 23.30 (129.60) | .17 (1.25) | .37 (2.34) | 1.31 (12.79) | .01 (.28) | .13 (.77) | .03 (.32) | .03 (.60) | .04 (.39) | 27.15 (228.23)
Psychology (n = 5,733): 23.79 (97.03) | .24 (.93) | .32 (1.77) | 1.46 (9.59) | .01 (.13) | .06 (.44) | .05 (.36) | .02 (.19) | .07 (.63) | 22.88 (98.01)

Footnotes: LinkedIn (no access post-2016), Pinterest, syllabi, and Weibo had 0 mentions in our entire dataset, so they are not presented here. Google+, Weibo, and Pinterest are historical sources that no longer supply an open feed.

*All metrics with all averages <.03 are not shown here (F1000, Q.A., Peer Review). Additional metrics are available in supplemental files. Avg: average; SD: standard deviation.

The average AAS for the entire dataset was 21.16, with Twitter (X) being by far the largest medium represented (n = 1,276,135 mentions), at an average of 21.98 mentions per output. (Appendix 3, Figure 10) The average mentions for the other mediums were much smaller in comparison: 1.38 for news sources (n = 79,964), .31 for Facebook (n = 17,879), .18 for blogs (n = 10,245), .11 for Wikipedia (n = 6,278), .07 for policy documents (n = 4,078), and .04 for patents (n = 2,423) (Appendix 3, Figure 9). The average AAS was highest for 2020 (24.7) (Appendix 3, Figure 11) and for outputs in the Biological Sciences Field of Research (28.37). Chemical Sciences had the lowest average AAS as well as the lowest average mentions for most mediums. There were some yearly trends for certain mediums, with Facebook averages decreasing from 2019 onwards and Twitter peaking in 2020 then decreasing back towards pre-pandemic averages. Higher patent and policy averages were seen in 2017–2019 than in recent years. Other slower-to-accumulate metrics, such as citations and Mendeley readers, showed yearly variation as well, with higher averages for the oldest years in the dataset. Table 5 data obtained from Altmetric (Dimensions) and the OpenAlex API varied slightly but generally aligned. Many distributions were skewed, so additional metrics (i.e., medians and IQRs) are available in Appendix 4.

We found positive, statistically significant associations between the number of Dimensions citations, OpenAlex citations, and Mendeley readers and the AAS for the dataset (2017–2020) and yearly subsets. (Appendix 3, Table 5) As the number of citations increased, the average AAS tended to increase, although the overall effect size was small (a 1–1.5% increase per citation). There were no overall trends for the zero-inflation models. Open access was also found to have a strong positive association with the AAS. There was some yearly variation, with 2020 showing the largest effect size (1.12), meaning open access articles in 2020 had a larger relative increase in AAS compared to previous years.

Discussion

We investigated the attention obtained by research from Irish research organisations through a detailed and thorough evaluation. Aligned with global trends39,40, we found an influx of research outputs and attention during the COVID-19 pandemic and an increase in open access over time41,42. More locally, we established the top funders, publishers, and organisations involved with Irish health research outputs, which may be helpful for researchers. Furthermore, we demonstrated trends of changing attention across mediums and highlighted differences in attention, and potentially wider engagement, based on research fields and disciplines. While of obvious interest to Irish researchers and organisations, this project also provides a case example with methods and key learnings which can potentially be reproduced in other comparable regions.

Firstly, our region’s results align with international findings indicating a tsunami of research articles during the COVID-19 pandemic39,40. We found a large increase in outputs from 2019 through 2021, with the influx receding in 2022 and 2023. These increases reflect the broader publishing system, whose growth during the peak of the COVID-19 pandemic differed from ‘normal’ times: on average, publications increase 4% per year43, but previous work showed a growth of 846% from 31 May 2020 until 31 May 202141. We also observed an increase in open access outputs over time and, similar to previous research, a positive impact on AAS42 and a stronger future for gold open access over green41. This shows promising impact for Ireland’s recent investment in open access publishing agreements (IReL), first signed in 2020. Increased access could also improve the general public’s engagement with and interest in research44.

Unlike previous research, however, we did not observe that time elapsed since publication was connected with Altmetric scores, as the highest average AAS was in 2020 and 2021. However, we did observe an accumulation of Mendeley readers over time10, and some mediums or formats (e.g., policy documents and patents) seem to take some time to accrue mentions. In our dataset, the most popular medium for dissemination was Twitter (X) (x̄: 21.98 per output), followed by news (1.38), Facebook (.31), and blogs (.18), with Facebook and blogs showing decreases in recent years. Paired with the academic exodus from X45 and Altmetric’s lack of tracking for mediums such as LinkedIn, Instagram, and TikTok, there may be ramifications for the AAS as a whole46. However, as of December 2024, BlueSky is emerging as a new popular medium for academic researchers and is now being tracked by Altmetric47,48. Future analyses with data from 2024 onwards may show much different results as the media landscape changes.

In addition to the question of which medium is most appropriate for dissemination, researchers must be aware that their work may be of much less interest to certain audiences. In our dataset, one in five outputs received an AAS of 0. Whether this is due to researchers simply not publicising their work or to genuine disinterest or disengagement among platform users, it obliges researchers to go beyond the standard media platforms for dissemination and to creatively reach the “unengaged”. We did find much lower AAS for the chemical sciences, which could be considered more technical in nature, whereas the biological sciences and popular clinical ANZSRC categories like ‘oncology and carcinogenesis’ and ‘reproductive medicine’ might be more relatable.

The large prevalence of outputs from the education and healthcare sectors is logical given that much of health research is conducted by academics and healthcare professionals. Also of note, many of the healthcare organisations in our top twenty were hospitals with training partnerships with academic institutions. This also aligns with a recent bibliometric analysis49 of publications supported by Ireland’s funding agency, the Health Research Board, which found the academic sector well represented. However, output from the government and company sectors was quite low in our dataset, which may be surprising given Ireland’s status as the Silicon Valley of Europe50. This may be due in part to the nature of the Altmetric data, which, similar to other databases like the Web of Science, has coverage issues51, and to the fact that obtaining a DOI is an additional cost which government agencies may avoid, as they can simply upload reports and files to their own websites.

Limitations

The main limitations of our study concern the content and quality of our datasets and should be considered when interpreting the data. For the Altmetric data, we found stability and quality issues that future analyses should take into account. Stability issues have been documented previously by work investigating research from 2012 to 2021, which found vanishing mentions, AAS fluctuations over a year for 23.7% of publications, and high Twitter (X) volatility for nearly 30% of papers52. We downloaded the Altmetric data within a single time period to reduce instability but, due to license and access issues, we were unable to use the API to obtain the Altmetric data by ROR. Due to loss of licensing access, we also could not update the dataset to include 2024. However, we included a range of years and a time buffer, as previous research has indicated that Altmetric mentions accrue over a longer period than previously believed53. Our code can help inform future analyses investigating how stable the AAS and mediums are over time.
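Although the study’s analyses were run in R and Altmetric data could not be pulled by ROR, the OpenAlex API does document an `institutions.ror` works filter alongside publication-date filters, which future analyses could use to assemble an organisation-level corpus. A minimal Python sketch of building such a query URL follows; the ROR identifier shown is a placeholder, not one of the organisations in this study.

```python
def openalex_works_url(ror_id: str, from_date: str, to_date: str,
                       per_page: int = 200) -> str:
    """Build an OpenAlex /works query for one organisation and date range.

    Uses the documented works filters `institutions.ror`,
    `from_publication_date`, and `to_publication_date`.
    """
    base = "https://api.openalex.org/works"
    filters = ",".join([
        f"institutions.ror:{ror_id}",
        f"from_publication_date:{from_date}",
        f"to_publication_date:{to_date}",
    ])
    return f"{base}?filter={filters}&per-page={per_page}"

# Placeholder ROR ID for illustration only
url = openalex_works_url("https://ror.org/00abc123", "2017-01-01", "2023-12-31")
```

The returned JSON is paginated; a real harvest would follow OpenAlex’s cursor pagination to collect all pages before deduplicating by DOI.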

Due to indexing issues with publication dates, we were also unable to run analyses at more granular levels (quarterly and daily). When a publication does not include a complete publication date, the Altmetric system automatically fills in the missing information with “01”, producing 1 January when both the month and day are unspecified. We attempted to pull publication date data from the OpenAlex API to address this but found further issues, with years incorrectly falling outside our range. Indeed, OpenAlex notes that filtering by publication date is ‘not a reliable way to retrieve recently updated and created works, due to the way publishers assign publication dates’54.
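Because Altmetric back-fills missing month and day fields with “01”, records dated 1 January (or the first of any month) are suspect for granular analyses. A simple heuristic for flagging such records before quarterly or daily aggregation could look like the following Python sketch (the study itself used R; the function names and approach here are illustrative, not the study’s published code):

```python
from datetime import date

def possibly_defaulted_day(pub_date: date) -> bool:
    """Flag dates whose day field may have been back-filled with '01'.

    Heuristic: genuine first-of-month publications are flagged too,
    so flagged records should be excluded from daily-level analyses
    rather than corrected.
    """
    return pub_date.day == 1

def possibly_defaulted_month_and_day(pub_date: date) -> bool:
    """Stricter flag: both month and day were likely missing (1 January)."""
    return pub_date.month == 1 and pub_date.day == 1
```

Filtering flagged records out (or restricting to monthly aggregation) trades some data loss for avoiding an artificial spike of outputs on 1 January.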

We also found that pre-print servers (e.g., medRxiv) were well represented among the top ‘journals’ in our dataset, and it has previously been reported that Altmetric has issues tracking publications with multiple versions (i.e., a pre- and post-print)55. We decided to keep pre-prints in our dataset and treat them as unique outputs, as each has its own AAS, and previous research has indicated that while COVID-19-related articles were more likely to have a preprint, the presence of a preprint does not seem to affect AAS56.

In addition to debates around stability and quality, there are also many critiques of the AAS, its reproducibility issues, and how it weights different sources of attention57,58. For example, Christin et al.59 encourage researchers to observe different social media logics in healthcare, such as informing or raising awareness for well-being, creating public debate, or curating wider partnership-based engagement. The AAS is weighted to favour traditional media sources (e.g., news) over social media posts, but the prominence of Twitter (X) in our dataset raises concerns about potential overweighting and a focus on academic impact, as research has indicated that tweets mostly originate from within academia and engage other academics60. Lastly, Altmetric does not track other mediums or platforms popular in Ireland, such as TikTok and Instagram18.

Implications

This work establishes publishing and Altmetric trends in recent years on a large dataset of research outputs associated with Irish research organisations. For researchers, it details key sources of funding, popular publishers and organisations, changing trends in pre-print usage, and differences between mediums, which can inform their dissemination plans. For scientometric researchers in particular, it highlights lingering stability and accessibility concerns with alt- and biblio-metrics that should inform future work. The OpenAlex datasets of all included DOIs and the code used for this project are openly available.

Ethics and consent

Ethical approval and consent were not required.

How to cite this article:
K.Sharp M, Abuhaimed S, Yeoh B et al. A cross-sectional analysis of Altmetric coverage of health research from Irish research organisations (2017-2023) [version 1; peer review: 1 approved, 1 approved with reservations]. HRB Open Res 2025, 8:58 (https://doi.org/10.12688/hrbopenres.14121.1)
Open Peer Review

Reviewer Report 28 May 2025
Ashraf Maleki, University of Turku, Turku, Finland 
Approved with Reservations
Thank you for the opportunity to review this preprint, which addresses the trends, coverage, and impact of health-related research outputs from Irish organizations using Altmetric and citation data. The study raises interesting questions about how research attention is distributed across …
How to cite this report:
Maleki A. Reviewer Report For: A cross-sectional analysis of Altmetric coverage of health research from Irish research organisations (2017-2023) [version 1; peer review: 1 approved, 1 approved with reservations]. HRB Open Res 2025, 8:58 (https://doi.org/10.21956/hrbopenres.15519.r47123)
Reviewer Report 28 May 2025
Sibsankar Jana, University of Kalyani, Kalyani, West Bengal, India 
Approved
It is really a good research work. The authors accepted that data on attention scores from all Irish health research outputs were obtained using an institutional license for Altmetric Explorer, thus they were unable to share the datasets; however, the …
How to cite this report:
Jana S. Reviewer Report For: A cross-sectional analysis of Altmetric coverage of health research from Irish research organisations (2017-2023) [version 1; peer review: 1 approved, 1 approved with reservations]. HRB Open Res 2025, 8:58 (https://doi.org/10.21956/hrbopenres.15519.r47124)