
Transcript - Changing the scholarly publishing system in favour of open

A conversation with Rob Johnson

Published on Apr 17, 2023

Transcript 

Jo: A very warm welcome to Rob Johnson, today's guest on the show, Access 2 Perspectives Conversations. It's great having you, Rob. Thanks for joining us.

Rob: Thank you, Jo. It's a pleasure to be here.

Jo: So, Rob, you are the director of your company, Research Consulting, and you are basically informing research stakeholders — primarily publishers, but also researchers themselves and others — about best practices in scholarly publishing and the wider science communication landscape. Our conversation will rotate around current developments in open access: its different forms, where you see the trends going, the challenges we're experiencing these days, and the conceptions and misconceptions around the topic. But before we dive right in, can you tell us a little bit about the journey that brought you to build your consultancy? And what do you enjoy about the work that you do?

Rob: Sure. So I've actually just spent a decade as a consultant — we had a little birthday party just last week with my team, our 10th birthday for Research Consulting.

Jo: I saw that on the website. Congratulations.

Rob: Thank you. Thank you. It's nice to make it to that milestone. So yeah, I've had 10 years of working as a consultant across research and scholarly communication. The biggest part of what we do as a team is working with universities, but I personally do a lot in scholarly communication with publishers and funders. How I found my way into that was a little bit circuitous. Before I was a consultant, I worked at the University of Nottingham in the UK in a research management role, so I don't have an academic background myself. Going back further still, I originally trained as a chartered accountant, so my role at the university was particularly looking at funding and how we dealt with all the funds coming in from the UK Research Councils, the European Commission and so on — this was pre-Brexit. And at that point, research funders were just starting to implement mandates around open access. That brought me into contact with my colleagues in the library and with publishers, trying to figure out how these funding mandates get translated into practice. So I learned a lot about open access in those days, and I really just found it fascinating — this intersection between academia and publishing, the not-for-profit and the commercial publishers. I think it crystallizes, in many ways, a lot of the challenges we see in society at large: what is the role of the market, and where should the state, or not-for-profit, or community actors play a role? For the last 10 to 15 years I've been looking at some of those questions on behalf of different actors and trying to figure out how we make the system better. And I started from a position of: why isn't this stuff open? It just seems like such a no-brainer. I'd say that's still my default — what I'm looking to do is work towards making things open. But of course, the more you look at it, the more you realize the complexities and the nuances, and that open doesn't solve everything.

Jo: Yeah, I agree. The word I'm often missing in these conversations, though, is "but" — the talk around the limitations of open. To what degree should we consider research not to be open, and when can it not be open, for various reasons? What's often mentioned first, and almost only, is GDPR compliance or sensitive data, when it comes to personal data in a medical context. But there are many other reasons for keeping certain datasets and research results confidential — think of possible military applications; and on the other side, of course, the military funds a lot of research for that very purpose. But that's a conversation maybe not for today. There's a lot of complexity, and these are just two examples. What we're observing now is that almost everyone is talking about open science, and yet the adoption of open science practices is quite low. Researchers have a lot of fears, and even if they want to adopt these practices, they find themselves trapped in the publish-or-perish paradigm. Because as much as everyone is on board with changing the system, the actual change takes time, and a lot of detail needs to be processed at administrative levels and in decision-making positions. So where do you see opportunities? What's your take: where do you see how we can leapfrog? Or do we just have to sit it through and allow time to pass for the transition to be made in a way that takes everyone on board?

Rob: Yeah, that's a really good question. I think, to some degree, it does just take time. There's that classic idea of moving fast and breaking things, isn't there, from the tech companies, and we're seeing some of the adverse consequences of that in other aspects of society. Having followed the transition to open access for getting on for a decade and a half, when you take a step back, we have moved an incredibly long way. If you go back 10 years in time, many people almost wouldn't have believed how far we would come in terms of the levels of open access. Certainly it's uneven, but in many parts of Europe we're at around 80–90% of content being open access. I was doing a project with the Dutch universities a year or two ago asking: how do we get to 100%? They're already almost there — but how do we do the last bit? So I think it's important not to be too discouraged when things happen slowly; it doesn't mean change isn't happening, it can just be quite imperceptible. The other thing is that there are a lot of factors that essentially build inertia into the system, and clearly the biggest one, as you say, is that publish-or-perish culture, the incentives in academia. A lot of the focus is shifting now to how we change incentives, but that is a very, very long game. So to answer your question, I wouldn't see it just as sitting it out. These things haven't happened by chance — they are the result of a lot of lobbying and work and evidence gathering. It just takes a long time to come to fruition.

Jo: Yeah, okay. You mentioned research incentives, and I'll bring in research integrity now, because I'm repeatedly doing investigations and studies for our clients on research integrity and how it can be measured and incentivized. In business lingo: what are the KPIs, the key performance indicators, for research integrity? To me, research integrity means working according to open science practices at large. Of course, this can be discussed in detail, but for me open science doesn't necessarily mean being fully open, as in fully transparent, but as open as feasible — as open as possible at a given time, given the circumstances, the state and progress of the research project, and always considering the contextual setting. For those who are familiar with the principles of open science, it is really about collaboration, interoperability, making systems work together, and transparency — values that we as humans around the world can easily subscribe to. And if you look at the policies at research institutions, when they talk about research integrity, that's exactly what they describe. So there's almost a 100% match between the definition, or interpretation, of open science and that of research integrity.

But where are the potential differences? I don't know — I'm answering my own question here.

Rob: Research integrity is an area that has obviously always been important. But as consultants, what people come and ask us to look at is an indicator of the Zeitgeist, to some degree, and we've seen integrity really move up the agenda. Interestingly, that goes for all of the groups we work with — funders, institutions, publishers — all of them are getting a little bit more concerned about the integrity of the scholarly record and the integrity of research practices. So to what extent is this synonymous with open science? We did a project asking: are there meaningful indicators of integrity? And of course, the answer is, it depends. There are some things you can track, and open science practice is an element which absolutely does, I think, support and underpin research integrity — reproducible working, transparency, sharing of data. That's probably the element of integrity most amenable to tracking, where you can get a sense of how things are going. But of course, integrity is a much bigger set of issues. There are all sorts of things around care for the work that you're doing and respect for research participants that are not amenable to tracking. Have you cited all of the sources that you've drawn on in your work? There's no way of tracking that in any meaningful sense — in many cases it relies on the individual's integrity. So what we found is that there's a publisher perspective on integrity, centered on the scholarly record and on how you support the reproducibility of the published article. But there's a whole lot of other practices, more in the domain of individual researchers, and by extension institutions and funders, around ethics and those elements of integrity that go well beyond open research practices. So I'd see it as a Venn diagram: there's quite a big bit that does overlap, but there are also elements that are distinct.

Jo: Yeah. And that's probably also due to the fact that open science is defined differently by different research communities, or research stakeholder communities. Where I'm from — I did my PhD about a decade ago, more than that now — I grew up into my professional research career with the advocacy for open science, almost like a call for revolution: the grassroots approach, based fully on ethics and values, which very much speak to integrity. But then, of course, there's also a business approach to open science nowadays, with publishers having adopted not only open access but also — and this is without judgment — trying to find business and revenue streams in compliance with open science practices. That brought us to the situation where we currently are, with high APCs — and whatever "high" means depends on whoever has to pay. And then there are the challenges that come with having to allocate research funding budgets more towards publishing, taking away from other budget items in the research project. So, where are we, and where do we go from here — just trying to build a coherent conversation to the next chapter. Looking into the APC approach: it's understandable that publishers need to secure their businesses and keep operations running. But what have you seen as possible for a publisher, be it for-profit or non-profit? Maybe you could also shed some light on the differences in how they operate and establish their revenue streams — paying the staff and the services, but also serving the scientific community at the same time and not harming it by overcharging, which apparently is happening in some places.

Rob: Yeah, I think the APC is enormously valuable in the sense that it allows you to grow your revenue in line with your activity — that's the beauty of the APC model for publishers. And that is one of the reasons that open access has advanced as far as it has: we talk about incentives for researchers, but we also need to think about incentives for publishers. If you're a commercial publisher, and to some degree a society publisher, a revenue stream you can access incentivizes you to move in that direction. We've seen that with APCs — with the established publishers, the Elseviers and Springer Natures and Wileys, moving in that direction, but also with new startups, as they were some years ago, starting out as born-open-access publishers. PLOS is one of the best examples of a not-for-profit, but there have also been many for-profit publishers. So I think the APC has served a purpose in that it's allowed the transition to open access to happen faster than it could have done otherwise, because it matches the revenues to where the publishing is happening. What we're seeing now is, I guess, the inevitable but unintended consequence: as you move to an APC-based publishing world, you are erecting a barrier to those who don't have the money to publish. In fact, there are two issues. One is that researchers without the funds can't publish. The other is that you incentivize the publisher: the more you publish, the more money you make. And that's not necessarily a helpful incentive, particularly for commercial publishers — we are seeing some publishers that have shown very rapid growth. Coming back to our conversation about research integrity, Clarivate recently delisted a number of journals from Web of Science because of concerns about, they say, content relevance particularly, though I think there may be some quality concerns there as well. So what we're seeing now is a lot of debate about what comes beyond the APC. "Can we reform the APC?" is one school of thought. What if APCs were scaled according to purchasing power? It might be $4,000 if you're in the US, but maybe it's only $1,000 if you're in Indonesia — and maybe $1,000 is still too much. So "can we reform the APC but keep the basic model?" is one set of questions. The other is: we just need something completely different, this doesn't work at all — we need something more like a collective funding mechanism, perhaps something more like subscriptions, where the author doesn't have to be involved and the institutions and libraries spread the cost. And there are various models looking to see if that is scalable.

Jo: Yeah, and also keep in mind that the APC is not the only way to compensate for the efforts a publisher makes — there are also subscription models with institutions. It's also a question of what is being charged for, and what the client — the submitting researcher — is actually paying for when they pay 2,000 or 5,000 USD to have one research article processed. My assumption is that the big publishers — there are only a few of them, with hundreds or thousands of journals each — do all kinds of other work. Nowadays they very proudly announce that they're doing business with data analytics, or research data analytics, through the data they receive via submissions. And of course it costs a lot of money to pay staff and run the computing systems. So the question is: are the submitting authors the only paying clients? Maybe they're not, but probably the bulk of the revenue comes from there. And is that even fair? Is it a healthy approach to treat your clients this way — to let them pay for services they didn't even ask for, or aren't actually asking the publisher to do?

Rob: So I was looking at this recently — Elsevier is the prime example. If you look at where Elsevier's revenues come from — and this is RELX, the group of which Elsevier is effectively a part — only about 40% of the organization's revenues come from academia and science, and of that 40%, less than half, I think, comes from publishing. The rest is from workflow products, from data, from analytics — Scopus and SciVal. They do charge for those, so it's not that the APCs are subsidizing the data products. I think the bigger concern is this: there's a set of issues around APCs and the price of publishing, but almost the bigger issue for me is the enclosure of infrastructures, what's sometimes called platform capitalism. You end up with big companies like Elsevier — and they're by no means the only ones — that own the infrastructure on which science operates. There's a risk that we spend so much time looking at the APC and the publishing world that a big problem is happening over here, where so much of the infrastructure is starting to be controlled by corporations. And it's the same set of issues: once you end up with an effective monopoly position, you can almost charge what you like. You may have a monopoly on a very high-impact journal that everyone wants to publish in and subscribe to, but you can also have a near-monopoly on data — with Web of Science and Scopus there's almost a duopoly: two key providers of information that influences the thinking of almost any institution, funder or researcher. Is this journal in Web of Science? Does it have an impact factor? So you've already got commercial players controlling a lot of crucial data and crucial infrastructure.

Jo: And then you said the term I've tried to avoid so far: the Journal Impact Factor. Okay, let's take that for just a minute, because it's being talked about a lot. What pains me is that it still plays such a huge role in so many professional circles — that it's still acknowledged as important. Everyone knows how flawed it is, and that it's not even designed to be used for what it's being used for: as an indicator of quality or of research integrity. It's misinterpreted as such — "oh, you're publishing in that journal with that impact factor, so your research must be good." It doesn't work like that. Coming back to the recent news you mentioned, Clarivate Analytics delisting a range of journals — another question would be: are these just scapegoats? Because I'm sure the same issues are present with other publishers. And not because the publishers and editorial boards are doing a bad job, but because we have so many journal articles to process these days that it's humanly impossible to keep up. So, the delisting: they punish these journals by taking the impact factor away from them. But as I always argue, it's not the journal that's doing good research — it's the researchers who publish and describe the work in the manuscripts and articles. A journal is primarily there for the curation aspect, and of course for making sure that peer review is conducted in the best way possible. And there we have the issue that so many researchers, whether voluntarily or not, are part of that work — already overworked researchers just can't keep pace with the throughput of research articles. So that's the issue. The question, basically, is: how can we get away from the obsession with impact factors? Again, this might take time, but this is really a topic where I'm getting impatient, because it's still an incentive, even though everybody knows it cannot be the measure to uphold research integrity.

Rob: Yeah, it is a really thorny problem. And I guess there are two reasons, for me, why it's so persistent. One is simply that publication output is growing so rapidly that it's really hard to keep up — the supply is so great. The second — let me give you an example. In the UK we have something called the Research Excellence Framework, which every seven years sets out to assess the quality of research within different research units, essentially disciplinary units. If you ever talk to the people who've sat on those panels, the volume of work they are asked to assess makes it physically impossible to read all of the articles or outputs. So in a sense, there has to be some kind of heuristic they can use to make a quick judgment. There's a lot of pressure on people to make quick judgments simply because they don't have enough time. And of course, human beings as we are, anything that helps us make those judgments more quickly — and maybe even outsources some of it, so we don't have to form our own judgment — is incredibly appealing, because there just isn't the time to process it all. So I think we have to recognize why, from a reader and assessment perspective, the impact factor persists: if it wasn't there, people would have to invent some other mechanism for those quick judgments. Then, how to reform it? There are quite significant moves afoot, particularly in Europe, to reform research assessment — many hundreds of institutions and funders have signed up to a sort of roadmap for reform of research assessment, and I think we are starting to see some of this coming through. This is partly about not just valuing the high-impact journals, but also about looking beyond the journal to say: if you produce data, if you produce code, if you maintain infrastructure, if you write software — all of these things are part of the research enterprise and need to be valued. So I think there is a social change happening, but again, it is very slow. And I don't think it's ever going to completely do away with that need for a quick assessment and something that helps me make it. So it's a really thorny issue; I don't see an immediate solution on the horizon.

Jo: Yeah, and the same goes for decision makers and research managers — they need a measure that allows a quick assessment, because they're also overworked, with too much throughput to deal with. On research quality: we had Karen Stroobants and Noemie Aubert-Bonne from CoARA here on the show a couple of weeks ago, and we discussed how difficult it is to come up with quality measures, because every research project is so specific in its nature, its setup, and the context in which the research is being conducted. But there are things we can measure. It's not that I have the solution — everybody is just trying to find one. What if a researcher also puts things in their CV like "I've shared my datasets for this or that project in this or that repository" — again following open science practices, and highlighting this rather than a list of journal names, focusing instead on the topic and how it has been disseminated? I have nothing against the word "impact" per se, because we need research to have an impact on society — this is why we do research in the first place. Well, I do meta-research nowadays, not applied or basic research anymore. But that's the whole point of publishing: to get information out there, to allow others to reuse it, and to allow industry stakeholders to build products from the knowledge. If it's locked away through the issues we've discussed, what's the point of doing research in the first place? So, maybe coming towards the close of the conversation — and this is an open topic, obviously; it's going to continue until we stop doing research as a species. I feel that research integrity and open science are always on the move, and I don't think there's a need for 100% — like the Dutch universities you mentioned were asking about — because if we are beyond 50%, that's already a great achievement. But to get there, to allow research to unfold its beauty and to allow the knowledge to be transferred to other sectors of society — how can we achieve that? And what is the meaning of publishing? And before I give the word back to you, I also want to throw in other publishing pathways, like post-publication peer review. Without going into too much detail: there's a whole new ecosystem — or the ecosystem is still the same, but with new actors and new systems that give variety to the processes. That might be confusing in the beginning, but people like us — trainers, facilitators, consultants — can help researchers and research stakeholders make sense of it all and find the best fit for a given project. And then there are things like diamond open access, where the publishing process is subsidized by other means, not by APCs — although it's not always clear how exactly; there are varying models, and how to uphold that and make it sustainable is a bit of a challenge, from what I've seen. But back to you — I gave you a whole lot of talking points.

Rob: You touched on some really interesting developments there. I think there are a couple of ways to look at this. There's often the question: can something transform the system, can it replace what we currently have, which clearly doesn't work that well? Peer review is being overstretched — so what if we go to post-publication peer review and only review the things that seem to be attracting attention? That's one idea. Or: the APC doesn't work, so why don't we have diamond open access instead? What I've often seen happen in practice is that things don't necessarily replace what's already there, but they can supplement it and enhance it. One reason that can happen is that research is perpetually growing — there are more publications each year, but there are also many more research outputs being shared. If you think about the volume of data, code and other artifacts being shared, it's a constantly growing environment. So one question is what you do with what's already there, but another is how you deal with the growth and the proliferation that is putting the system under strain. Take something like diamond open access — I've been looking at this for quite a few years, and one of the things we often look at is where content is being published and who is publishing it. There are all sorts of challenges about which database you use and how comprehensive it is — the data tends to be biased towards the West and towards the English language. But even so, diamond open access has been pretty small for quite a long time, and, coming back to my point about APCs, the growth has tended to be in commercial open access. That doesn't necessarily mean it isn't a way forward, because there will be certain disciplines and contexts where the APC, or the APC with waivers, works okay. There was a blog post from the Open Access Scholarly Publishers Association a few weeks ago — they've been running some workshops on equity — and they suggest the idea of "rainbow open access": the idea that there isn't just one flavor of open access; we're going to need a rainbow of different solutions. I think diamond open access is absolutely going to be one of those. There are quite exciting developments, I think, being led by the European Commission and national funders in Europe to support something called Open Research Europe, which already exists. It's currently outsourced to F1000, a subsidiary of Taylor and Francis, one of the big commercial publishers, but the intention is to transition it to an open-source platform run by a not-for-profit entity. So it is both diamond — no author fees and no charges to read — and also completely not-for-profit, without a large commercial player controlling the infrastructure. That's a really quite ambitious activity. Do I think it's going to put Taylor and Francis and Elsevier and Wiley out of business? No, I don't think that's realistically going to happen. But could it become a really important venue for researchers who might struggle to get into those journals or don't have the funds? I think it could. So it is about seeing that we need a diversity of solutions — I really liked that term, rainbow open access — and recognizing that new approaches won't necessarily replace the existing system, but there are things that can usefully supplement it. Preprints are another one: we've seen rapid growth in preprints. Does posting a preprint mean you don't also submit the paper to a journal? Sometimes it might, but in most cases the incentives are such that authors still want to get it published in a journal. The preprint is additive — it adds value, and it gets the work out there sooner.

Jo: Yeah. As a biologist, I feel like everything is an ecosystem — every industry sector has one too. If you look at the African savanna, it has all kinds of actors in it, from flies to mosquitoes — maybe not rabbits, which aren't naturally indigenous there — and then you have elephants, hyenas, lionesses, and all kinds of critters, and everybody has a role to play. So how about we look again at the publishing ecosystem, where I think there has been a lot of focus on — you can call them hyenas or elephants — the big elephant in the room. Some of them, maybe unintentionally, have developed into something exploitative — that's a harsh word, but they have done harm to the system by extracting too much money in just one direction. And now, as evolution goes, we're shifting back towards rebalancing in favor of other actors, also for publishing services. Some of my colleagues want to see the big publishers collapse, and there's a time for every empire until it collapses — but I believe there is room for everyone. I feel the big publishers also provide a lot of value in product development, services, workflows, and automation, helping us as a scholarly community get on top of the wave of high output that research produces. Everybody can learn from that, right? We can all learn from each other and build an ecosystem that's more balanced, fair, and globally inclusive.

Rob: It's worth saying that, for all their flaws, the publishers are very international in a way that many of the other actors in science aren't. Institutions are rooted in a particular national context, and virtually all funders are rooted in national or regional contexts. Publishers do actually have an ability to move research around the world in a way that is perhaps more difficult for others. And it's not as simple as just putting it on the web — obviously it's about discoverability and so on, but it is also about marketing and communication in those different markets. So yes, they are very flawed, and the profit margins are ridiculously high. There are reasons for that — it's about the dysfunction of the system, the sort of monopoly you can have on journals. But I think we also need to recognize there is some value that having those global corporations involved can bring, and just the scale of what they're doing. Say Elsevier disappears and the system has to pick up an extra million articles a year, or whatever it is — they are operating at a phenomenal scale that is very difficult to replicate with, say, a diamond open access model. So I think we have to keep recognizing the flaws and the limitations, and there's a really important place, I think, for some forms of regulation to stop private actors extracting too much money from the system — there's a set of challenges there. But we also have to not throw out the baby with the bathwater and say that commercial actors, big publishers, have nothing to offer. It's a mix of good and bad, as most things are.

Jo: Sure, yeah. And it's also important to remember the power that the customer — the research institution and the researcher — has. There are always options, and there's always negotiating power that can be exercised; this has happened in the past and will continue to happen. There are also examples of research institutions coming together in coalitions to strengthen their negotiating power. So there's always that. We're living in interesting times, and there's a lot of work to do for all of us — it's going to be interesting to see the future developments, near and far. It's been a great pleasure talking to you. What's next on your agenda? Or, what is your best-case scenario for how the next three to five years will unfold, and what direction do you think publishing will take?

Rob: I think the best-case scenario, in terms of integrity, is actually seeing a reassertion of the importance of integrity and quality. In some respects, we've gone too far in terms of growth, and I think integrity has been compromised. So I'd hope for a reassertion of the value of peer review and of the importance of reproducibility — and I think we're just starting to see some signs of that. I hope that will gather momentum, and that in itself could drive people back to some of the more trusted, established names, particularly societies. So a reassertion of integrity is one thing I'd hope to see. I also really hope that some of the diamond open access initiatives will gain support, both from funders and institutions, but also, crucially, from the researcher community. It relies on researchers being willing to publish in places that don't have an impact factor — Open Research Europe has stated it is not seeking an impact factor; that's not what it's about. And of course that is a risk, because researchers are incentivized to publish in places with impact factors. So it does rely on the research community — not necessarily with every paper, but with some of their papers — being willing to say: I choose to support this kind of initiative, something that's not-for-profit, that's reproducible, that's being run on behalf of the community, rather than channeling all my papers to the commercial publishers. So yes: a reassertion of integrity, and then, back to that diversity, not seeing the big publishers squeeze out everyone else to the detriment of the system as a whole.

Jo: Thank you so much, and all the best for the next projects. Hopefully we'll speak again, here or elsewhere at a conference.

Rob: Thanks very much.


