
Transcript - Hype and flaws in the tech industry and the scholarly publishing system

A conversation with Jeffrey Lee Funk about startup and technology hype, and how both affect research practices

Published on Apr 05, 2023


Jo: Welcome, everyone, to another episode of Access 2 Perspectives Conversations. Today we have Jeffrey Funk in the house. He's a tech consultant who also publishes actively within academia — his bio is in the show notes and the affiliated blog post. You have decades of experience working in tech, and also at the interface between tech and research.

First of all, thanks for making time for this. What is your view on how research and the tech industry may or may not work well with each other — how mutual learning between them happens, or can be improved?

Jeffrey: Okay. 

Jo: How else to phrase this? What are the benefits that research brings to the tech industry, and vice versa? And where can we do better?

Jeffrey: I see fewer technologies based on science coming out now than 50 years ago. In the glory days of the 1940s, 1950s, and 1960s, you had integrated circuits and transistors and magnetic storage and nuclear power and lasers.

There was a very long list of technologies that came out, were commercialized, and grew successfully over the late 20th century — growing very big, providing lots of employment and lots of value for many kinds of people. But I see that slowing down. I see less coming out in the last 20 years.

I see a few exceptions, like OLEDs, but we don't see things like nanotechnology, or superconductors for energy usage, or quantum computers, or even bioelectronics. So we see a lot less coming out, and I think that's a big issue, because the government has been spending a lot of money on this and wants to spend more.

And I have a feeling that that money is only going to give us more academic papers, and not more technologies, which is what we want.

Jo: So do you see reasons for that? Why is knowledge transfer not happening as efficiently as it used to? 

Jeffrey: Well, I think that the whole way we do basic and applied research has changed dramatically in the last 70 years.

The first big change is that corporations started doing less and less basic and applied research. I don't have the numbers right in front of me, but there's a big change. People like Ashish Arora at Duke University have documented these changes.

Big reductions in basic and applied research by corporations. Part of that is because universities started to do more of it. In the 1950s and 1960s, the government started to ramp up spending, and continued after that, and that provided less incentive for those companies to do it.

There are probably other reasons too — they probably just weren't as successful at profiting from that basic and applied research as they once were. Now, a second big change is that universities have become obsessed with publishing papers. The number of papers published has really increased over the last 40 years, and we think that's great, but the problem is that people don't read the papers.

You know, people joke that it's only the author and a reviewer who read the paper, maybe the editor. And if you look at the data — because there are ways to measure this — you find that it's mostly postdocs and doctoral students who are reading these papers.

So the corporations that you want to be reading these papers — because you want them to commercialize the technology — really aren't reading them. That's the second big change. The third big change is that, along with the increase in the number of papers, we've had a huge increase in the number of journals.

If you want more papers, you have to publish more journals, so we published more journals. We have so many journals. For example, Nature published only one journal in 1970; now it's something like 150. You see similar increases in the number of journals from the mechanical engineering societies, the IEEE, the American Medical Association.

All these societies have ramped up the number of journals they publish. From their standpoint, this is really successful: we have all these journals, we're publishing, we're doing lots of work. The problem is there are so many that they're very hyper-specialized.

It's so specialized that you begin to think, well, maybe these professors can't really come up with an idea that's going to be useful to people, because they're so hyper-specialized — so focused on publishing something very, very narrow. What they know is this narrow, and what you want them to know, what you want them to teach, what you want them to develop, is out there beyond it.

So that's the third reason I think we're getting less out. The fourth is that we've created a huge bureaucracy to handle all of this. If you're going to publish lots of papers, you're going to hire lots of PhD students and lots of postdocs.

You require a big bureaucracy to handle all this. Just for example, peer review really did not exist until 1970; before 1970, it was mostly editors and a few members of the society. Now we have this huge bureaucracy for peer review, and it doesn't work well, because a lot of people don't want to do it — they have to review too many papers.

They don't want to review papers, so a lot of it is done by postdocs and doctoral students who really have better things to do — they're more interested in publishing papers than reviewing them. And this whole bureaucracy includes mundane things: somebody has to write recommendation letters for all these PhD students and postdocs, and the people who run these big labs become administrators. Back in the days when corporations did all this basic and applied research, there was none of this bureaucracy — Bell Labs and GE and IBM, where a lot of people did work that led to Nobel Prizes.

So all of this bureaucracy takes these scientists away from being scientists and turns them into administrators. It's a very big waste of talent. So there are a number of reasons involved here.

Jo: You just described the bureaucracy — the system we currently find ourselves in due to publication pressure. As I hear it from you, and as academics are now very well aware, it has led to scientists no longer having the capacity to focus enough of their time on actually doing research, what they signed up for. Instead they have to deal with all kinds of paperwork, including peer review, rather than doing their own research — doing slow science with less output but a more coherent presentation of the results, more applicable to other sectors as well.

Jeffrey: Yeah.  

Jo: So now we have this push for open science on all stakeholder levels, from the funders down, which researchers themselves find difficult to implement because, as you just pointed out so well, they are overwhelmed by the administrative overload, want to be researchers first and foremost, and are struggling to do just that.

And now they're being tasked with: open up your research, be transparent, but still be cautious with sensitive data. There's a whole lot to navigate. Most researchers — I'd put my hand in the fire for this — want to comply and do the best they possibly can, but it's just too much pressure.

So do you see any ways out of this misery? How can we pull back — or not pull back, but twist the system to make it work again for society, including researchers?

Jeffrey: Well, a lot of the problems stem from the funding agencies. You need money, and it isn't just for your time.

Modern science requires a lot of money because equipment is expensive. This is another thing that's changed in the last 70 years: modern labs require very expensive equipment, and you have to get research grants to pay for it. And a lot of times university researchers aren't very good at sharing their equipment.

So universities may have tons of equipment that isn't utilized very much, but a lot of professors don't want to share, because they're under pressure to publish. So they need money from funding agencies, and funding agencies want to show that good research is coming out of their funding.

So what do they do? They say: you've got to publish papers, you've got to get patents. The more papers you publish and the more patents you get, the more money we'll give you — because that makes not only the researchers look good, it makes the funding agencies look good too. So there are a lot of people here who are looking at everything superficially.

They want an easy metric, and they want to measure people in a simple way. So they come up with: you've got to publish papers and you've got to get patents. But the problem is, that's not what we really want. We really want science that leads to commercialization, and that's a harder thing to measure.

Jo: And that's what some researchers would argue they want to prevent, because they feel publicly funded research shouldn't be commercialized. It's a political-culture thing I'm not buying into — the fear of having the research output commercialized — because, as you say, that's exactly what we want: we want the knowledge to be applied for societal benefit eventually. So, thinking of the global challenges we have, like climate change — and having observed for five years, if not a decade, how the political will seems to be there but doesn't get implemented — do you think technology could help us not only mitigate but really solve the issues we are facing, based on research? Well, I think I'm answering that already. So how can we shift gears?

Jeffrey:  How can we switch gears?

Jo: How can we shift gears to make the system work again, to be able to solve all these issues  and to convince politics that we have everything.

Jeffrey: I think it will be very hard, because there isn't the political will. The right and the left both have their talking points — things they like to talk about and think are the key to everything.

The right believes in the market, so to the right, technology must be coming out because the free market is great. The free market works well, so there must be all these technologies coming out — and anybody who says otherwise, well, you must be a socialist.

On the left, they want to say: we've got these great universities, these hardworking researchers doing great work, so something must be coming out. And don't talk about changing our universities — it's those people over there in those companies who are doing the bad things.

 And I think that both groups are wrong. Both groups are focused on the wrong issues. They're not thinking about what it takes to do science and commercialize the science. What does it take to do those things? And do we have a system set up that makes that easy to do? And I wish the left and the right would focus on those things.

But it's very hard. It's very hard. 

Jo: Isn't there a group in between that could rise up and facilitate such conversations in the near future? Maybe the startup ecosystem, where the entrepreneurs are still young, fresh from university, and starting to understand the corporate world — maybe they can bridge and mediate these conversations.

Jeffrey: Maybe. There are some examples — for instance, Elon Musk doing things that none of the incumbent companies could do. None of the other companies were able to make these lithium-ion batteries work. So there is an example of a startup, an outsider, doing what none of the incumbent auto companies could do, whether they were from America, Japan, Germany, or France. But that's one exception. I think it's going to be hard, because if you look at startups in general, most are losing money.

90% of the publicly traded unicorn startups — unicorns being startups that were valued at a billion dollars or more before they went public — are losing money. Very few succeed; I think the biggest success stories are Moderna and Airbnb. So you might look at Moderna and say, okay, they commercialized these mRNA vaccines.

So that's a good example. And one of the key people behind that — I'm not going to pronounce her name properly — Katalin Karikó, was a very important part of this commercialization. She has been very critical of academia, because she worked in academia for years and wasn't successful.

They took away her lab, so she had to go do corporate research, and there she managed to help commercialize these mRNA vaccines. So she's very negative on academia. And if you look at Nature — Nature had three articles on this problem in January.

The first article showed that there is less disruptive science being done than in the past. The second found that there are 9,000 researchers in the world publishing a paper every five days. The authors weren't critical of those people, but it obviously raised a lot of eyebrows: why is this going on? How could somebody publish so many papers?

And the third article was on the large number of academic scientists who were leaving for corporate research because they said they just couldn't get anything done in academia. They had become scientists because they wanted to do something great for the world — to develop science that ended up as useful products, processes, and services — and it wasn't happening.

So they said: I'm going to quit academia. That's actually the article I was thinking about when you asked the question, but I remember the other two because all three came out in January, very close to each other.

Jo: Right. So what if we rethink academia as an incubator to breed ideas? This is probably already happening sometimes, often out of pain points, because hardly any PhD student realizes that they will eventually leave academia. First of all, there are not enough positions and too much competition for tenure — but then there are so many opportunities outside academia. So maybe we have an overrated understanding, too much hype about academia. Yes, we need the academic system for what it is good at, but we shouldn't see it as a final destination where researchers spend a lifetime, because that's not what it can possibly be.

Jeffrey: Yeah. The problem is that corporations are doing less basic and applied research — that's been documented; I mentioned it earlier. They're mostly moving to the D of R&D, the development, and doing less of the basic and applied research. Without that basic and applied research, nothing new will come out — nothing similar to integrated circuits or transistors or magnetic storage or lasers or glass fiber, all these things that were really important. They're not going to come out. And even things like superconductors for energy transmission or nuclear fusion still involve a lot of applied research.

And so if corporations don't do these things, then you've got to have academics do them, or you find a way to get corporations to do them. There are ways to do this. You can give bigger R&D tax credits to corporations than we do now. You can take some of the money that's going to academia and give it to some type of laboratory alliance: if five firms create a laboratory that does basic and applied research cooperatively, then we will put in as much money as they put in. They can't develop products there, but they can do the basic and applied research, share it among the firms, and then commercialize it. You can do these various things, but no one's tried that yet, and there will be a lot of resistance, because there are a lot of academics who, as you mentioned earlier, hate the idea of corporations commercializing research. They see it as: those corporations are big and bad, and we shouldn't let them do this. And right now they're getting a lot of evidence for that, from things like AI and the way social media is being run — people adding AI to that upsets a lot of people.

You know, maybe there's some truth to the idea that corporations do research that doesn't lead to good social outcomes. So there's going to be a lot of resistance from universities to giving more money to corporations for basic and applied research.

Jo: Yeah. Or there are a lot of good things, because I think we all appreciate that on Facebook we can track where our friends are based and how much global reach each of us has — and that was only possible through data mining and data scraping. So we take the good stuff for granted, are quick to criticize, not always rightfully so, and lose sight of the actual benefits the same technologies also bring to society. I've come to that conclusion also with ChatGPT and the discussions around it.

And now with — I forget, I'm not as firm with the names — the ones also coming along from Google and others. But these are just tools, and as I'm collaborating with colleagues mostly in Africa and other parts of the world, I think it's important to have these conversations on a global scale, to make them globally inclusive, to hear voices and applications from around the world.

And I think we can always have more of that, but it is also happening. So what would you say researchers can do? What I try to foster in my courses, when I talk to early career researchers, is to think again — and for some it might be a no-brainer — and really look into a stakeholder analysis.

Who is your research good for? Whom might it benefit? Whom could it possibly harm? Have you done any risk analysis, and to what degree? And again, a stakeholder analysis: who do you need to communicate your research to? How can you package the information to make it applicable? All this at the early stage, when you're already generating the data and now being asked by the funders to share it in a FAIR manner, but also possibly openly, in an open access repository.

I personally hope researchers get a chance to direct those conversations, if they're only sensitized to the power and opportunity they have to engage stakeholders directly — corporate ones, policy makers, and, directly or indirectly, science journalists who can facilitate these conversations — so as to get information across sectors and foster cross-sectoral conversations.

Jeffrey: I think it's a struggle for researchers, because it depends on the place you're at. You have to get money, you have to keep your job, and you've got personal responsibilities — to yourself, your own health, your children, your spouse. Everybody has this, and a lot of times people just have to do what's necessary to keep their job. So a lot of what you can do depends on where you are. Some places are going to be more cooperative; some countries are more cooperative.

I think there's a lot more pressure in the US to publish lots of papers and to make it look like our university is number one in the world, whereas in some countries there's more involvement by companies and closer cooperation between companies and universities.

So it depends — on the university, the country, the way your work is being measured. I don't think there's a simple rule. For a lot of people, though, you already have a lot of stress trying to publish, and now you're also trying to make your research work out well in the world.

Some people can do that. You see someone like Katalin Karikó, and you see Nobel Prize winners — I forget who developed CRISPR, but in her book she talks a lot about commercializing CRISPR. It's a challenge, and there are some people who do this, but these are the stars, right?

And many of us aren't stars. Many of us are smart people — almost everybody in the universities is smart. But even if you're in the top 0.1%, that doesn't mean you can do these things. You have to be in the 0.0001% to do great things.

Jo: Yeah — and have the capacity, to start with, as you mentioned, to even think that far. Because for all of us, we have the same amount of time available, and there's only so much capacity it's possible to leverage.

What do you think about open source, and the commons and related concepts?

Jeffrey: Open science?

Jo: Well, open source in particular. When it comes to technology, do you believe that open source products are commercializable? I believe they are, but from a corporate angle there seems to be a lot of fear of embracing open source — a fear that by providing the code and the details, you lose the unique selling point. How can a company protect its unique selling point but also make the benefits the product brings more widely applicable? We've had these discussions around the vaccines, prominently, and in Germany as well.

Jeffrey: There are a number of issues here. I couldn't hear you properly, so I don't know exactly which issues you were referring to, but there are a number of important ones. Journals are very expensive, so they're not accessible to everyone, which is a problem: if you're not working for a big company or a big university, you may not have access to this research, even though your tax dollars paid for it.

So access to these journals is important — hence this movement to open science, which for the most part is good. There's also the issue of how many patents there are. There are so many patents; America has increased the number of patents by more than six times in the past 40 years.

It means it's increasingly hard to do research if you don't have a lot of money. So these are all issues, and I agree that open science and things like it are good ideas. Now, the problem is that we just have too many papers. What we really need to do is reduce the number of papers we publish.

We need to reduce the number of journals, reduce the requirements, and focus more on quality and not quantity — publish less and better. But I don't know how to do that. It has to start with the funding agencies, who should be more concerned with making sure that new technologies are coming out and that science is being done that can lead to those new technologies.

So a lot of the responsibility, I think, is on the funding agencies, because they have a lot of power. But I'm not sure many of them want to think this broadly, because if they focus less on papers, the US Congress may call them up and say: hey, what are you guys doing?

You've funded all this research and you're hardly getting any papers out. There are a lot of people in the system who have power, but they're very superficial in the way they deal with the system. Somehow we have to overcome those kinds of problems, and it's very hard to do.

And of course, the journals all want to make money. They love more journals — more money — and they want more papers. It's all great for them.

Jo: What could be a quality measure in the applied sciences world? Do you think the patent system as we have it, with the 20-year expiration date, still holds up in the current place where we are as a society?

Jeffrey: Well, I also think there are too many patents. It was mostly the US government that began lowering the bar for novelty in the 1980s, partly because they were very concerned about competition from Japan and Asia.

They said: we're going to lower this bar for novelty so we can patent everything and keep those foreigners from copying us. But we not only made it hard for the foreigners to copy — we also made it hard for the locals. We just made all this business for lawyers and for patent battles, and again, that doesn't help us. We don't need more patent battles; we need more products and services that make our lives better.

Jo: Okay. So basically there's a redundancy of patents now — too many, and of low quality, because the bar for novelty is too low.

Jeffrey: Redundant patents, redundant papers. 

Jo: Okay. But if not patents, what could be a good measure? How many startups come out of academia — or what could be a quality measure for applied research?

Jeffrey: Well, universities use this measure of how many startups they created, and they create lots of them.

The problem is they all lose money. We want successful startups — startups that not only get created, but release products, whose products work well and make money, and that can survive for a long time. There's a whole set of things.

The problem is, there is no good short-term measure. All the good metrics are long term.

Jo: Right.

Jeffrey: People always talk about how we've got to think long term. But they're not thinking long term — it's "oh my gosh, they ended my project." In reality, somebody came up with these short-term metrics, and they don't work well.

And they don't want to get rid of them, but they don't work.

Jo: Right. So we need to look at a decade's worth of time, and set up an ecosystem and support system that helps individuals leverage that time — to move through academia, then a startup incubator, then product development, and survive five years on the market.

Jeffrey: Well, there are no good short-term, simple metrics. In the short term, evaluating science and technology and progress requires very good judgment, and very few people have that judgment. We put these short-term metrics in place, mostly promoted, monitored, and run by bureaucrats who don't have the ability to judge complicated things. It's very sad.

Jo: So if I take us back to your suggestions: you think we need consortiums — at every academic institution, a consortium of experts from across the sectors of society — to look at what model we can set up and suggest for implementation at the institutional level?

Jeffrey: I don't think there are any simple answers.

I think that trying to get corporations to do more basic and applied research is a move in the right direction, because corporations will be more concerned about funding and doing research that works well. Just look at the replication crisis, for instance. Academics don't care very much about it; companies do, because companies want to create products, and they don't like it when they try to replicate research results and they don't work. They say: well, what can we believe? So to them it's a much bigger issue than it is to academics.

Because academics are like: well, we're not measured by this, we're measured by how many papers we produce, so we're going to do more papers. So we need to change the dynamic, and one way is to give more support to companies that are doing basic and applied research, through what I call these cooperative alliances.

Jo: Okay. So what does a cooperative alliance look like?

Jeffrey: Well, they don't really exist right now, because very little money is given to corporations except for contract research. So I just gave that as an example. There may be better ideas, but the problem is that academics aren't looking at this.

Economists and business school professors don't look at this. All they do is call patents "innovation" and call papers "science," and then analyze the relationship. They don't really look at whether new products, processes, and services are actually coming out. So there aren't a lot of people thinking about this.

I mentioned one person, Ashish Arora at Duke University, who's done some of this research, but he's in the minority. The problem is we need a lot more academics to start thinking about this, but they're under the same publication pressures that the physicists, chemists, and biologists are under.

They have to publish papers, and the papers published in top economics and management journals use very sophisticated statistics to analyze patents and papers. So you see it's very hard to solve the problem when everybody has decided that all we're going to do is whatever gets published.

Jo: Yeah. I mean, the major funders in Europe, and I think also in the US, have agreed that we need qualitative measures and need to get away from the impact factor — the quantitative one. But then how? Research is so specific that it needs to be assessed not only at an institutional level but at a project level, I would argue.

It cannot be cross-cutting across disciplines either, because every research project is so specific and complex in its nature, setup, and applicability that only at that level can you assess the potential benefits. But maybe there are some criteria that we can make measurable and assign points to, which add up to a decision-making factor: yes, this is worth funding, and that's not. I don't know.

Jeffrey: Well, every technology is different. Every technology has a different set of performance measures. For batteries, we're mostly concerned about energy storage density, maybe power density, maybe the cost per unit of that storage density.

For solar cells we're concerned about efficiency, but we may also be concerned about durability and longevity. For fusion, there's something called the triple product. Every one of these technologies has some set of measures, and I think funding agencies need to think about those measures and how we can move technologies forward along them, so that we bring the technologies closer to commercialization.

And I think that smart people can do that. The problem is that a lot of these funding agencies and a lot of managerial positions are filled with people who believe in metrics. There's a great book called The Tyranny of Metrics, about how organizations moved to implement all these metrics, many of which really don't help.

They don't help organizations because they're so narrow — they're measuring people by such narrow things. And it's not just researchers but everybody in organizations who suffers from this tyranny of metrics. So we need research bureaucrats, research administrators, and research funding agencies to think more deeply about the technology and how to measure progress in it, and then look at whether the research is moving us down that path of progress.

Jo: Right. Okay. So we are in the midst of reform, on many fronts, and hopefully it's coming together — all the people discussing how we can reform the system and what academia should move towards.

And that's the whole idea of open science: to open up to society, to make research societally beneficial again, or more so. It was mind-blowing what you said at the beginning — how researchers are currently incapable of doing the research they signed up for — which, in a way, we are aware of.

But I think we touched on a few aspects that are issues currently but should and can be addressed, and are probably already being addressed. So let's keep the conversation going and push for a more coherent conversation and long-term planning.

Jeffrey: Okay. 

Jo: To facilitate the reform. Is there something else you would like to add, or to conclude?

Jeffrey: No, I think I've said probably enough. 

Jo: There's always more to be said, but thank you so much for your time. I appreciate your efforts and accomplishments.

Jeffrey: Well, thank you. I hope some of the things I said will be of use.

Jo: I think so. Yeah. I'm sure. 

Jeffrey: Okay. 

Jo: So I'll let you know about the feedback and where the conversation continues from here.
