APE 2022: The Future of the Permanent Record
The 17th Annual Academic Publishing in Europe conference was held virtually last month, focusing on the future of the permanent record in academic publishing.
The event was opened by Eric Merkel-Sobotta, Managing Director of the Berlin Institute for Scholarly Publishing (BISP), and Dr. Caroline Sutton, CEO of the International Association of Scientific, Technical and Medical Publishers (STM) and Director of Open Research at Taylor & Francis Group. They spoke at length about the role of publishers in the academic process and about the industry's collaborative systems of best practice, which help maintain the integrity of the permanent record.
A particular emphasis was placed on the changing landscape of academic publishing, specifically the transition from traditional subscription-based business models to open access (OA) funded by Article Processing Charges (APCs). Many traditional publishers face ongoing pressure to convert to OA because a new generation of scientists and researchers expects information to be easily accessible rather than locked behind a paywall.
Preprints
The rise of the preprint was also discussed. With the COVID-19 pandemic demanding the swift dissemination of research, preprints have become an extremely valuable tool for collaborative research, fast-tracking the release of findings without compromising the integrity of the peer review process.
Permanent record
This naturally led to a discussion on versioning, a common thread throughout the event. Moving away from a published permanent record corrected via corrigenda, and towards a version-based record in which papers can be updated as scientific understanding changes, makes the publishing workflow non-linear. However, verifying research so early in the process creates new challenges, and publishers will need to develop innovative peer review solutions to catch potential inaccuracies.
One such issue is that preprints are not fully peer reviewed. Incorrect conclusions may be drawn from them by news and social media outlets, fuelling the spread of “fake news” and eroding public trust in science. The session concluded with the thought-provoking question: “where do we assign accountability for the permanent scholarly record?”
Quality and equity
The first presentation, “Quality and Equity in Academic Publishing”, was delivered by Prof. Dr. Maria Leptin, President of the European Research Council. Her keynote focused on the shift towards open science and the peer review process. With the rise of preprints, one potential solution to the problem of reviewing them is the refereed preprint. Refereed preprints accelerate the dissemination of peer-reviewed research by replacing repeated cycles of journal peer review with a single review carried out when the preprint is submitted, encouraging a cultural shift towards better peer review practices. One way of implementing this would be to use a review commons in an open peer review format, allowing real-time comments from authors and reviewers as the manuscript is processed.
It is well known in the publishing industry that quality publishing is not cheap; maintaining a high standard of publishing integrity costs both time and money. The peer review process is necessary, and many scientists see it as part of their academic duty to critique the work of their peers (although a minority argue the process is exploitative). The average academic involved in editorial and peer review work spends approximately eight hours per week on these responsibilities. At MDPI, we reward reviewers who complete high-quality peer reviews in a timely manner with a voucher towards their next submission.
One flaw of the APC system used in OA publishing is that, when a paper is rejected after an average of 11 hours of work from editorial staff alone, that effort goes unrewarded; however, this cost is worth bearing to maintain the high quality of submissions published in the journals.
Converting to Open Access
For many traditional publishers, there is pressure to convert to OA models, with OA output growing by up to 200% at some publishers in recent years. A key driver of this change is the implementation of transformative agreements (TAs), which repurpose subscription expenditure to support OA. This is often supported by funders backing journals directly or by an annual voluntary donation system used to subsidise APC costs. However, TAs have drawbacks. They often lack long-term goals, and as a result the schemes don’t usually have a long lifespan. There are also inequalities: funders prefer to back larger organisations, leaving smaller publishers behind and reinforcing imbalances between publishers in different financial situations. To make TAs work, more innovation is needed to ensure mechanisms exist to support OA beyond the initial transition.
Online versions
Following Prof. Leptin’s presentation was a discussion on the transition of the permanent record from paper to online versions, led by Todd Carpenter, Executive Director of the National Information Standards Organization (NISO), the body behind the standard underpinning the DOI system we are all extremely familiar with. He discussed the importance of metadata in the new age of digital version storage, and how traditional attitudes often hold back infrastructure changes where progress could be made, favouring information that humans can read over information that computers can easily digest.
Another discussion on day one was led by Dr. Alicia Wise, Executive Director of CLOCKSS, who spoke about the need for long-term digital preservation of the scholarly record. CLOCKSS is a long-term digital archive aiming to store information for up to 500 years; it is a collaborative project governed by publishers (including MDPI) and libraries. There is considerable concern that not enough information, even from established journals, is being stored for the long term, and this must change through more active management of the version of record.
Inclusion, diversity, and equity
The first session of the second day was “Towards an IDEAL World: How to foster Inclusion, Diversity and Equity in Scholarly Communication”, moderated by Dr. Christine Smith, Chair of the Editorial IDEAL Committee at De Gruyter. Kicking off the proceedings was Dr. Nicola Nugent, Quality and Ethics, Royal Society of Chemistry (RSC). We all know that equality and equity of opportunity lead to better science, but what is the best way of going about it? Rather than making superficial, ineffective changes, a systematic shift is needed at every level to move the narrative towards a more diverse scientific industry.
Reports published by the RSC show a lack of women in leadership roles within the scientific community. To combat gender bias in publishing, the RSC has published its diversity data to increase transparency and has set representation targets based on its report. It has also provided training and resources to help editors eliminate implicit bias, and has introduced double-blind and open review for the same purpose. The RSC and MDPI are both part of a joint commitment for action on inclusion and diversity in publishing, a framework which aims to understand and reflect the diversity of the research community and set minimum standards on which to build a more diverse industry.
The importance of data for diversity
However, as explained by Nancy Roberts, CEO of Umbrella Analytics, this shift in diversity must be data-driven, not based on the good intentions of a few passionate members of staff. Opinion-based changes are often made without any goals in mind, whereas setting goals can genuinely drive the push towards a more diverse workplace. By carrying out staff surveys and collecting demographic data, real changes can be made and targets set. Communication is key, and asking people at ground level is essential to making meaningful changes.
However, this may not always go to plan. Sometimes, for accurate data to be gathered, a third party may need to carry out the survey; the data can then be used to incentivise change at all levels and embed diversity within the infrastructure of the company. At MDPI, approximately 80% of the workforce identifies as female, and many leadership roles are filled by women, including our CEO, Delia Mihaila.
Global South and inclusion
The final part of the diversity panel was presented by Ylann Schemm, Director of the Elsevier Foundation and Chair of the Research4Life Council, who discussed the inclusion of the Global South in academic research. By encouraging researchers from the Global South to enter competitions to gain funding and visibility, we can help further their careers and inspire others to go into research. Partnerships between institutions can help foster inclusive research ties and trust, providing research without borders.
Open access is also a key tool in enabling research output from the Global South. Making literature freely available to researchers, whatever their geographical or economic standing, is the future of research. The growth of OA across the publishing industry allows researchers to sidestep many of the barriers to accessing research and provides equitable access to published work.
Growth of Open Access
The final session of day two was hosted by Niamh O’Connor of PLOS, who discussed the growth of OA and the transition from traditional research to a new open science (OS) model. The key difference between open access and open science is that OS is not just about sharing an article; it is about providing the tools for researchers to replicate and collaborate, making science better. Open science’s main aims are to ensure the free availability and usability of scholarly publications, along with the data and the methodologies used to generate them. This allows researchers and non-researchers alike to retrieve, scrutinise, and continue the work of investigators across the globe, placing open inquiry at the centre of the scientific enterprise. OS refers to the entire process of conducting science, not just the publishing process.
So, what are the benefits of OS over OA? As stated in the NASEM report “Open Science by Design”, published in 2018, OS “enables researchers to address entirely new questions and work across national and disciplinary boundaries”. This is because OS is a more transparent process, with full access to data, code, and methods for all parties; this transparency is central to scientific trust and gives an overview of the entire process.
APE 2022
APE 2022 concluded with optimism on the third and final day, with hopes that next year members of the publishing industry will once again be able to say “Hallo” to one another in person at APE 2023, hosted in Berlin. We hope this article has informed you about the future of the permanent record and provided insights into the various topics discussed at APE 2022. We’ve covered plenty of conferences here on the Blog; click here for more.