Hello and welcome to one of a short series of podcasts which will investigate aspects of impact assessment from four different perspectives, or four different lenses. Please keep the conversation going by tweeting any thoughts with the #impactframeworks. Thank you for listening.
Hi, my name is Sana Zakaria and I work for NIHR. Today I’m talking to Catriona Firth, who’s the Associate Director for Research Environment at Research England, about her views on the use of impact assessment for allocation, from her personal and professional perspective. We’re going to be talking about allocation, like I said, the allocation of funding and resources. Quite a complex subject with multiple facets to take into account, no doubt. So I guess to kick us off, let’s talk about the REF a little bit. When it comes to the allocation of research funding to universities, how do you think an exercise like the REF has influenced the allocation agenda?
I think we know that the REF has a huge influence in terms of developing the impact agenda. In terms of allocation, I think it might be useful right at the start to outline how Research England allocates its funding and how that’s related to the REF, because I think it’s not always clear. Before I joined Research England I wasn’t that clear on how it worked. I think it’s quite opaque to those who aren’t directly involved in dealing with institutional funding. So unlike the research councils, which provide project-based funding, Research England provides what we call block grants of quality-related funding, which we always say is unhypothecated, which again is a word I didn’t know until I joined Research England. Another bit of nice jargon. But it means basically that universities have the autonomy to allocate it how they choose; it’s not linked to any specific projects. So this gives them stability through the recurring grants, but also allows them to invest it flexibly. They very often use it to make up overheads, to support staff and student capacity building, and quite often to fund research projects directly where there aren’t any other funding sources available. So it quite often goes to individual researchers; it just goes into a big pot of university funding. In terms of how it links with the REF, we allocate the funding on the basis of quality, volume, and the relative cost of research in different subject areas. It’s calculated using a formula based on the outcomes of the REF, so the four star to one star quality profiles. It takes in the volume, so the size of the university’s submission to the REF, and the subject weighting, to recognise, as I say, that different subjects have different cost implications. So it allows us to fund excellence wherever it’s found and gives us a robust, transparent allocation method.
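The formula described here, a REF quality profile combined with a volume measure and a subject cost weighting, can be sketched roughly as follows. All the rates and weights below are invented purely for illustration; they are not Research England’s actual figures, and the function name and parameters are hypothetical.

```python
# Illustrative sketch of a QR-style block-grant formula, as described in the
# conversation: funding follows quality (the REF star profile), volume, and a
# subject cost weighting. Every number here is made up for illustration.

def qr_allocation(profile, volume, cost_weight, quality_rates):
    """profile: fraction of the submission at each star level, e.g. {4: 0.5, ...}
    volume: size of the submission (e.g. FTE staff submitted)
    cost_weight: subject cost multiplier (lab-based subjects cost more)
    quality_rates: funding weight per star level (lower stars may attract none)
    """
    quality_score = sum(profile[star] * quality_rates.get(star, 0.0)
                        for star in profile)
    return quality_score * volume * cost_weight

# Hypothetical unit: 50% four-star and 30% three-star work, 20 FTE, lab-based.
example = qr_allocation(
    profile={4: 0.5, 3: 0.3, 2: 0.15, 1: 0.05},
    volume=20,
    cost_weight=1.6,
    quality_rates={4: 4.0, 3: 1.0},  # illustratively, only 4* and 3* attract funding
)
# example = (0.5*4.0 + 0.3*1.0) * 20 * 1.6 = 73.6
```

The point of the sketch is the structure, not the numbers: quality, volume, and subject cost each multiply the grant, which is why differences in submission volume can dominate regional funding differences, as discussed later in the conversation.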
I was coming onto impact, though. REF 2014 was obviously the first time that impact was included in the assessment exercise; it hadn’t been in previous ones. And it was subsequently the first time, in England, that the wider societal and economic benefits of research were taken into account when allocating QR. It sends, I think, a very strong message to the sector and has really helped to embed impact in everyday academic activities in a way that it might not have been beforehand. It validated those who were already doing a lot of impact activities, opened it up for discussion, and prompted others to start engaging beyond the academic community. But in terms of allocation, again, I think it’s important to note that unlike project funding, we can’t actually draw a direct link between the scores that impact case studies got in the REF and the allocation of funding at an individual level. The scores of case studies are combined to create an overall sub-profile for a unit of assessment, which is then also combined with the volume measure, as I say. And universities, at the other end, can invest it however they want. So if a department had phenomenally successful impact case studies, high scores, there’s nothing to say, first of all, that the university would have to invest the money back into that department, or that it would have to invest it in impact. So it’s quite difficult actually to draw that sort of line between allocation and impact and back again, within the flexibility of the QR funding.
From what you’re saying, it sounds like with the combination of the REF assessment, the volume assessment and the various measures you take into account, it’s perhaps then easier to allocate from Research England’s perspective, but when it goes into universities it’s kind of out of your control and quite subjective perhaps, in terms of where that investment goes further on within departments?
Yeah, I think universities obviously are able to determine their own priorities with that, and you might find that some universities look at those areas where they performed well in the REF and think those are areas we’d like to invest in further, while others may decide, actually, no, they want to pump money into those that didn’t do so well in the REF and bring them up to a higher standard. And we think it’s important to give universities the flexibility to do that. They know their institution, they know their institutional context and regional context, better than anyone else. And I think it’s important that this is the only pot of money floating about that gives institutions that flexibility to work with it. It also means that they then have more, potentially, to invest in impact. One of the things we see quite often funded through it, and that was personally the case for me when I worked in a university research office, is that QR funding can be used to fund posts within research, obviously, such as impact officers. So it provides that kind of infrastructure and resources within institutions.
So I guess moving on from the REF and talking about university league tables. In terms of allocation, we do see a lot of practices by funders and in assessment panels where there might be (unclear 07:23) put in university league tables. What is your sense of using these types of rankings when it comes to allocation?
I would say that Research England and the REF panels do not use league tables or rankings to allocate funding. I think we’re very clear about that: in the way we present the REF results, we don’t present them in ranked order, and funding is based on the formula. It’s a criteria-based approach rather than a league table or ranking. As a sector, I think we have to be really careful and think about how we use league tables. Many of them are based on proxy measures of quality, things like journal impact factors, and they often really lack transparency in their methods, both in terms of the data that they use and the weightings that they apply to it. It’s not very clear how the universities are ranked. We know that they’re very attractive if you want to get a very quick idea of where an institution sits, and of course they’re very attractive for institutions because they act as advertising if they move further up the rankings, particularly internationally. I think that’s obviously something that a lot of vice-chancellors are interested in. But I think we are very aware of the risks around using league tables irresponsibly. If they place a value on a set of nebulous metrics, that’s not a robust way of measuring the quality of research. And I would like to think that you wouldn’t find funders in the UK placing great value on league tables to assess or to allocate funding. And I think I can categorically say that the REF panels do not take them into account in their assessment; that’s something we’re quite clear about in the criteria.
Moving on now to something quite different, around the levelling up agenda. There’s been reasonable momentum around this agenda, and around the challenge and tension that lies between funding excellence versus funding where the need might be. And then of course (unclear 09:46). So what kind of impact assessments are needed here, in this context, to drive appropriate allocation of resources?
I think it’s really important, what you say, to acknowledge that there isn’t any kind of inherent conflict between funding excellence and any kind of funding based on need. And what we’ve found, going back to the unique way that Research England allocates funding, is that there are actually quite a lot of challenges to allocating it for place-based outcomes. We reward universities based on their past performance through the REF, and we don’t stipulate how it should be spent. It’s interesting that when you look at the original distribution, differences in QR funding levels between regions are mostly actually due to differences in research capacity. And this comes down to the fact that the REF takes volume into account within the formula. We know that there’s excellent research going on across the UK, which then means that we know there’s scope to use targeted funding, for example, to increase capacity where there is excellent research. And that’s something we’ve been trialling through a programme called the Expanding Excellence in England fund, E3, which has been offering funding to sort of small but excellent units, to help them expand ahead of the next REF. It’s obviously going to be a long time before we see the kind of spill-over effects, because it’s only been running for a year and it obviously takes a while for units to build up this capacity, but it’s a way that we see as being able to help contribute to the levelling up agenda. I should probably say at this point that I’m not actually an expert on funding and allocations; that doesn’t actually fall within my portfolio, so I’ve been asking colleagues about this. And I spoke to one of my colleagues in knowledge exchange, who was telling me about the UKRI Strength in Places Fund, the SIPF, which awards funding still based on excellence but is one of the only funds around that takes location into account when assessing an application.
And what you see, when you think of allocation, is that one of the things they really consider is a project’s significance relative to local economic growth. So really not having a one-size-fits-all approach to it; really thinking about the relative impact that a project could have, and then also, as I say, still taking excellence into account. She didn’t say this, but I think we can kind of assume that there’s no point funding bad research just because it’s in an area of need; I think that’s not going to help anyone. I think excellence is still incredibly important when thinking about that, but so is finding that tailored approach where you understand what’s needed and what impact a project could have relative to the need.
Sure, so I guess that’s more about spotting where there is excellence in small pockets and growing that, building that capacity really in a more equitable fashion in the long term. And it really throws up an interesting … so for example the pandemic has thrown a really interesting lens on it, where we know that most of the investment made around excellence does go into the golden triangle. And, of course, that’s where a lot of the vaccine development work has been focused, and so it again kind of brings a very different perspective on the allocation space.
Yes, and I know the government has been doing a lot of work on this. The R&D roadmap sort of set out the commitment to making sure that the research and development system delivers economic and societal benefits right across the UK. Then there’s the R&D places strategy, and it’s important to note that it’s a places rather than a place strategy. I think they’re currently working on that, and I think we’re feeding into it in some ways. But yeah, what the minister said, I think, is quite important: it’s not just about how much money we spend in each place. You’re not going to level up just by moving all the funding from London to elsewhere. It’s more about thinking about the outcomes and thinking about the impact that the R&D system can have in different places across the country, coming back again to this idea of relative benefits and relative significance. It’s really thinking about how we can develop the different approaches that are needed in different places, approaches that take account of the different characteristics and different challenges facing regions, but also their different opportunities to develop. I think it needs to be thought of holistically, but in a very specific way sometimes for each region, rather than trying to find a one-size-fits-all approach. I don’t think that would help anyone really.
Are there any last words you want to say to our listeners today, Catriona?
Good luck to everyone who is working on their REF submissions. We appreciate that these are challenging times and are really phenomenally impressed with the effort that’s gone in across the sector to make that happen, alongside all the amazing work that’s been done on Covid and in just keeping universities running so successfully.
Thank you for listening to this podcast. It’s one of four in a series exploring different impact lenses. Please return to the website to discover the others, and don’t forget to tweet us your comments and questions with #impactframeworks. And once again, thank you for listening.