(Bonus) Exploring Grimpact: The Other Side of Research Impact with Gemma Derrick
Bonus Episode • 20th March 2024 • Research Culture Uncovered • Research Culturosity, University of Leeds


Shownotes

In our Research Culture Uncovered conversations we are asking what is Research Culture and why does it matter? In this bonus episode, Ged Hall, Academic Development Consultant for Research Impact, chats to Dr Gemma Derrick about her current research project "At What Cost Societal Impact? How Research Culture and Governance Inspires Grimpact." 

Gemma is an Associate Professor at the University of Bristol. Her research interests include researcher behaviour, academic practice, research evaluation, societal impact, the research workforce and its governance (e.g. peer review systems), and the politics and dynamics of knowledge production and translation. She has investigated the effects of national audit frameworks, such as the Research Excellence Framework (REF), to demonstrate their strengths and weaknesses in relation to their stated aims. Gemma has also previously been on the podcast talking about the Hidden REF.

Here are three key takeaways from the discussion of grimpact: 

1.   We need to shift our research culture away from a blind focus on reward to ensure that we aren’t blind to failures. When we are blind to failures we fail to learn from them.

2.   To do that we need much greater reflexivity throughout the research process at both the individual and organisational level.

3.   This also means that we need to monitor for grimpact, rather than building our monitoring and evaluation systems around a blind focus on positive impact.

If you would like to find out more about Gemma’s project, head over to her website: https://www.grimpact.org/. You can also get in touch with Gemma via @GemmaDerrick and connect with her on LinkedIn or email (details on her University of Bristol profile page).

All of our episodes can be accessed via the following playlists: 

Follow us on Twitter: @ResDevLeeds (new episodes are announced here), @OpenResLeeds, @ResCultureLeeds  

Connect to us on LinkedIn: @ResearchUncoveredPodcast (new episodes are announced here) 

Leeds Research Culture links: 

If you would like to contribute to a podcast episode get in touch: researcherdevelopment@leeds.ac.uk

Transcripts

Intro:

Welcome to the Research Culture Uncovered podcast, where in every episode, we explore what is research culture and what should it be. You'll hear thoughts and opinions from a range of contributors to help you change research culture into what you want it to be.

Ged:

Hi, this is Ged Hall. And for those of you who don't know me yet, I'm an Academic Development Consultant at the University of Leeds, where my specialism is research impact.

And all of the episodes I produce for the podcast relate to that. Um, if you're interested, there's a playlist in the show notes, so you can have a dive into all the other episodes. Now, when we think of research impact, we sometimes have a positivist, or maybe even, colloquially, a rose-tinted view that this is going to make the world a better place.

But is that always true? And does everyone who is part of the system we're trying to change have the same view of what is positive and what is negative? Today I'm going to be exploring that with Dr. Gemma Derrick. Now, regular listeners will remember that I spoke to Gemma about her involvement in the Hidden REF.

But if you haven't listened to that episode, Gemma is an Associate Professor at the University of Bristol and works in the Centre of Higher Education Transformations. That's quite a posh centre name, I like that. She researches, and this is a long list so keep with me listeners: academic practice, researcher behaviour,

research evaluation, societal impact, the research workforce and its governance, in particular peer review systems, and the politics and dynamics of knowledge production and translation. And essentially the reason I've invited Gemma back to the podcast is to find out more about her latest project,

which has an intriguing title: "At What Cost Societal Impact? How Research Culture and Governance Inspires Grimpact."

Gemma, it's lovely to have you back and you are the first returning guest. So, go you for that.

Gemma:

Woohoo! So happy!

Ged:

Yeah! Uh, and I promise this time I won't ask you anything about your musical theatre career, unless there's been a breakthrough there and you're kind of looking to give up your academic career.

Gemma:

Well, I did think that the theme song from Fame, I wanna live forever, was perhaps more appropriate to your topic on Impact and Grimpact here. So let's just start with that, shall we?

Ged:

That's lovely. And you're the first person who's ever sung on the podcast. Yeah, I love that. Indeed. Um, so before we dive into the details of your project, maybe for our listeners you can tell them what you mean by Grimpact.

Gemma:

So Grimpact is a really interesting concept. When I first came up with the concept with my, uh, dearly departed colleague, Paul Benneworth, we were concentrating on Grimpact just being associated with negative outcomes. So those pieces of research that have been done that literally created something that is to the detriment of society, in whatever form.

And I think we were focusing more on outputs there, and the case studies that we were using, which were the MMR debate, Cambridge Analytica, as well as the financial crisis, were focused on those negative outputs. So, those things that we can measure, that we know in hindsight are negative or had some sort of negative effect on society, etc.

And I think that really captured a lot of imagination, because people were like, yeah, you know, not all research creates benefits. And Paul and I brought this back to the concept of how research is rewarded. When we looked at the definitions that are used to guide impact within the evaluative framework, not just here in the UK, but in other countries that have adopted impact in its various forms in either formal or informal assessments, but also how it's seen more broadly across academia, impact is associated primarily with things like benefits or societal changes that are positive.

And the reason for that is because the societal benefits and the societal change that are positive are linked to reward. And it's easy conceptually to think about reward as being associated with something positive. But what we developed, and what I've continued to develop in Paul's absence, is this concept that Grimpact is not necessarily just about outputs.

It's about how the conceptualization of impact, how it's operationalized as an evaluative tool, both at the funding level as well as national and international levels, but also how organisations approach impact, is blind to the nature of impact that is not accessible or not necessarily amenable to these definitions of impact as something that benefits society.

So I say that Grimpact is what happens between impact in its very broad form and reward, which really asks us to focus down the nature of impact in its broad form to something that is amenable to these definitions and associated with positivity. This has been called the pro-impact bias, and it has potentially unintended and perhaps negative effects on how we visualize the science and society relationship.

The reason for that being, of course, that if society is always thinking about researchers providing them with some sort of benefit to improve their lives or to change something for the better, then we actually set ourselves up with unrealistic expectations about what society can expect from publicly funded research.

Because publicly funded research deals with risk and uncertainty, all the time. And in fact, even though it's not necessarily rewarded, Grimpact is a product of a very healthy science and society relationship. So it's not just about impact, not just about negative impacts, but how those negative impacts are basically framed and blinded by the way we reward science and how researchers orientate their behaviours towards monitoring, collecting, and only achieving those impacts that are associated with those personal and organisational rewards.

Ged:

Thank you for that. I mean, there's so much to unpick there. Um, but I think you've kind of hinted at the next question, which was around the culture and governance aspects of this. So can you tell us a bit more about your concerns there, and essentially how, in the project, you're investigating those concerns?

Gemma:

Thanks.

Okay, so the first thing is the way impact is rewarded. There is a danger for researchers, individually as well as personally, but also for organisations, in admitting that harm has happened, that something has gone wrong. We don't like failure in academia, as much as I've had previous research that looks at failure as a learning exercise, and we have to see failure as a learning exercise.

And as the impact agenda grows, we need to learn not only from positive examples, but from negative examples, about how we can orientate and reward all the benefits of science, in a way that impacts society, regardless of what that is. So there's a big, big personal danger in admitting that something went wrong.

And there's no incentive currently to do it. So if there's currently no incentive for organisations or individuals to focus on or to admit when something goes wrong, then we are blindly focused on benefits, on things that went according to plan, that we can put into this linear idea about how impact happens, and we know that that is false and constructed within the narratives that we put forward to be assessed.

But if we're completely blind to the variety of ways in which science, or research, can influence society, then we're underestimating, if not purposefully ignoring, the negative conceptualization of Grimpact and its effect on society.

And again, Grimpact is a product of a healthy science and society relationship. There are things that go wrong with Grimpact, but we learn from them. Not only do we learn from them so research can build on and negate the Grimpact as we go, but researchers who operationalise impact in their organisations because they're seeking reward can also learn from their mistakes.

What is it about that relationship that went wrong? Where did I lose control of the conversation? Is it my role to be somehow accountable for when research goes wrong, or is misused, or used for purposes that are not necessarily associated with how I visualized the impact in that, you know, ex ante original funding application that I had 10 years ago?

And again, society thinks, well, research should be giving us benefits. And then when that doesn't happen, they say, well, there's something wrong with the science, there's something wrong with how the research is managed. And that's only partially true. Because science actually deals with uncertainty and risk all the time.

We never know what the output can be. We can't promise what the output of these will be. And that's the beauty of science. Science is also self correcting. So if I make a mistake, somebody else might be able to build on that mistake to make something better. But if I don't acknowledge that mistake, then that pathway to making something better is completely closed off.

And this is not only associated with academic outcomes and societal outcomes, but also with how we reason about and mediate our relationship with society and create reasonable expectations about what society can expect from research, and how that might benefit them more in the long run than in the short run.

Ged:

Yeah,

absolutely. Um, I mean, there's so much, again, so much there in terms of, you know, the positivity of the research outputs that go into the press that we read, and that we don't share the kind of experiments that failed, the hypotheses that didn't hold, that didn't stand up to investigation.

Um, you know, it really is quite, would you say, endemic?

Gemma:

It's very difficult to estimate how much Grimpact is happening, because we're not tailored to it. Organisations are not set up to monitor things that go wrong; they're purposely set up to monitor those things that go right.

And to somehow only collect evidence when things go right, rather than when things go wrong. I know of no organisation that has a Grimpact division, but all of them have an impact division. So, I mean, the way that we're framing how we collect information is part of it, and that's why it's really essential to look at how research is governed more openly, more broadly, when we're talking about the risk of Grimpact, and when we look at the incidence of Grimpact, because we might be overestimating it, but we might be underestimating it too.

As my, you know, singing at the beginning said, impact might live forever, but Grimpact is fleeting, and it's only fleeting if we acknowledge it, if we're open to it and open to learning from it in the future as well.

Ged:

Yeah. And interestingly, when I first read your blog posts that started talking about Grimpact, back a few years ago, was it 2017?

Gemma:

2018.

Ged:

Yeah. I guess I probably had my rose-tinted glasses on, and I was kind of hoping that there wouldn't be many examples, that internally we'd almost be conducting our research in really responsible ways and really thinking about those unintended negatives, the things that come through the responsible research and innovation agenda, I guess, might be the right word for that.

Um, so, you know, you're collecting examples. So maybe that kind of utopian view, and those rose-tinted glasses, are something I need to take off. From the examples you're seeing, have any themes started to emerge from your analysis so far?

Gemma:

Well, there's a couple of things there.

So we set up the Grimpact repository as a place where people could submit, uh, examples of Grimpact. That had various levels of success. Obviously, as I said before, people don't want to admit when something goes wrong. Moreover, and perhaps this is something about research culture in itself, no one wants to dob anyone in.

And I think people are very much socialized to think that Grimpact is something to be ashamed of, something that should be swept under the carpet, so we don't want to talk about it. But when we talk about Grimpact more broadly, and start thinking about how research impact is envisaged, managed, and monitored, and not just about Grimpact outcomes per se, we get a wider variety of cases about how researchers have either successfully or unsuccessfully mitigated the risk of Grimpact at various stages of their research.

This can cover everything, such as the ethics, and whether they correctly envisaged at that early stage the ethical considerations that they would be facing in the future.

And in some cases they didn't envisage it at first, but there are some successful cases where they're like, look, we're dealing with vulnerable groups here. We're dealing with vulnerable people, and we need to take very, very careful steps, very, very slow and careful steps, to make sure that we don't harm them.

Now, the idea of going slowly and carefully is not necessarily commensurate with the idea of research being fast and providing outcomes, and that they need to be positive, etc. And it's not necessarily commensurate with how the majority of researchers like to work. The fear of getting scooped, the pressure to publish, the pressure to get outcomes out there quickly, the pressure to have an impact within a seven year, you know, REF timetable is very real.

And it's very much felt at the individual level. So sometimes people are driven to finish research at any cost, and to publish it at any cost. And this goes against the idea of what we should be doing: a step-by-step mitigation and consideration of how Grimpact might emerge.

Because Grimpact is not just about the outcome, and it's something that changes. The risk needs to be mitigated at every step of the research process. What movements such as the Responsible Research and Innovation movement perhaps don't see is that there's no template for how to conduct research responsibly.

Research needs to be a set of norms and values that we adhere to at every step of the way. It's just that the norms and values that are guiding research culture at this point in time, being fast, being output driven, being reward focused, are not necessarily in line with the types of values that we need researchers to operationalize at every stage of the research.

So most of the cases that I have seen are not actually focused on research outputs, but on the successful and unsuccessful stages of monitoring and adjusting the research design to accommodate these vulnerabilities, to accommodate and visualize potential Grimpacts as the research goes on, rather than just focusing on outputs.

I'd just like to say also that there's a lot here about blind Grimpact and unconscious Grimpact. There is Grimpact that is purposeful, and there is Grimpact that is just an accident. And I think we need to say, look, these accidents are things we can learn from. But what's also very interesting is this purposeful Grimpact, this blind Grimpact, where a lot of researchers say research for research's sake needs to be done.

And there are a lot of cases that have said, look, it needs to be done, because if it wasn't done by me, it would have been done by someone else. And when you say, well, if it wasn't done by me it would have been done by someone else, how do I know that you're more trustworthy to manage the Grimpact every step of the way?

And the answer is, we don't know. But if researchers are in competition with each other, there is a lack of trust associated with how you are going to mitigate it. Because we're not opening that conversation to the potential of Grimpact, and to the necessity for us to mediate every single step of the research process with the possibility that Grimpact might exist, researchers are not communicating with each other.

And so there's a central mistrust about how I will handle the data versus how my competitor will handle the data. And the way that research is set up is that I always need to say, in a funding application or in a REF submission, that I am somehow more trustworthy or more impactful than my competitor. So this is the way it's linked to research culture as well.

And this is what we're finding out through the data.

Ged:

Yeah, that's absolutely fascinating. Um, I'm going to try and zone in a little bit: you're in the system, you're the principal investigator of this project, and this project has timelines. So you're feeling those time pressures that you mentioned in the previous answer too.

So I just wondered, from your perspective of managing this research right at this moment, what defences do you put up to help guard against potential grimpact from the work you're doing? Do you have any insights you can explore with us?

Gemma:

I really wish I was that reflective, but I'll try my best. Obviously, when I'm dealing with academics who are telling me of cases where they know research has gone wrong, or they're admitting that research goes wrong, there is a sense that I need to provide a safe environment for people to tell me what's going on.

In a way, um, I hope that I'm not seen as in any way difficult to talk to, but obviously there are certain de-identification and anonymization aspects that I need to take into account. So the normal ethical things that you need to deal with when you're doing interviews, when you're dealing with people, but perhaps a little bit more, because you're dealing with admissions that people might have done something wrong.

In many situations, there are admissions that things went wrong, but there's a lack of responsibility, or of admission that it has actually gone wrong, which is very interesting as well. We would like to make the data available, obviously, but there are risks associated with making these data available.

And, you know, as you said in your introduction of me, I do a lot of research into researcher behaviour, researchers' choices, all of these sorts of things. And so I am uniquely attuned to the necessity to protect the researchers in our environment. So I can't give you the outcome here, because I'm not at that stage yet, but I am very wary of those risks and I need to make informed decisions for the benefit and the protection of my participants as well.

Um, and I will do that at every step of the way. It's not like I have an idea about how that will be done now because things might change in the future and I need to make sure that regardless of what type of AI is out there or what type of data risks there are out there that I protect my participants at every single stage of the research.

So at every single stage I think, what is the risk here? How do I take care of this? And more importantly, how do I take care of the emotional and professional wellbeing of my participants, who are incredibly vulnerable for giving me this information, for which I am incredibly grateful as well.

But these people are very brave too, because by admitting it, and sharing their experiences as well, they're also providing the entire sector with an opportunity to learn, and I think that's really important for a broader impact, or grimpact, of the research.

Ged:

Yeah, absolutely. I mean, you mentioned a lot of reflective questions there, in terms of thinking about every single interaction with every single person or team who's submitted an example and that you're talking to.

Um, so, is it that key thing that the sector is just not reflective enough? And, moving on to the kind of governance angle in terms of this question, that as institutions we don't provide that way of being reflective, in terms of our culture?

Gemma:

No, and the governance, the way that research is governed globally, is not orientating itself towards collecting information when things go wrong. It is completely orientated towards reward-seeking behaviour, and blind reward-seeking behaviour at that, where we look for evidence of reward regardless of what the rest of the evidence is telling us about the impact of this research.

Again, there is a huge amount of literature out there about how the way research is rewarded dictates how research is performed. What we're saying here is that it dictates how research is performed, but in parallel it dictates how research is managed and monitored as well. And this is what we're talking about with the research governance link to Grimpact.

I don't necessarily have the evidence to say that the way research governance is operationalized at the moment somehow increases the risk of Grimpact. No, there's no evidence to suggest that. But the fact that there's no evidence to suggest that means that the counterfactual needs to be considered as well.

If we're not looking for Grimpact, we're not going to find Grimpact, and if we find Grimpact and it's, you know, not within our worldview or what we want, then we will dismiss it. There's a sort of institutionalized cognitive bias there, perhaps, towards the eventuality and the risk of Grimpact, that I really hope we can open the conversations about, so that we can learn not just how to have good impact, but how to avoid bad impact as well.

Ged:

Yeah. And I guess, you know, in the previous question I asked what you are doing to try to guard against this. What should I be doing, that maybe I'm not, in the development I offer for people to engage with research impact?

Gemma:

I think we need to embed a level of reflexivity on our own research practice that is perhaps not rewarded or incentivized in any way, shape or form.

At the moment, we are so blindly and quickly moving towards reward, with researchers in precarious positions, undervalued and underpaid and overworked. So I understand the institutional pressures towards that kind of reward-seeking behaviour, but we need to slow down and we need to make these reflective steps.

It might be that some people are incapable of making the reflective steps because it's not part of their practice, I understand, but there are very good examples of researchers who do this at every stage of the research, especially in arts and humanities research, where reflexivity is an essential tool for looking at how research is happening and developing and constructing research as well.

I would really like to see all disciplines, the STEM subjects as well, embed this level of reflexivity. And not just reflexivity, but blatantly honest reflexivity too, because sometimes we don't ask ourselves the questions that we know will paint us badly. That's a human response. Researchers are human too, but we need to find ways in which institutions can embed or incentivize this level of reflexivity, and also perhaps embed a series of ways of thinking about Grimpact and looking at the potential of Grimpact.

Let's just start monitoring Grimpact, otherwise the rest of the sector is robbed of a very, very important learning process here.

Ged:

Yeah, I guess the previous two questions have been kind of skirting around what you think your impact from this project might be. So I'm just going to be blunt: what is your hope, at the individual level and at the system level, for what you want to see being different?

Okay. So we have that greater reflexivity, but, you know, what would that governance system look like for you, in terms of what it might lead to?

Gemma:

It's still too early to say, and I'm not going to anticipate any impact at this point in time, especially when I'm trying to embed that level of reflexivity into my own process at the moment.

It could be that my impact has a grimpact, and I'm very sensitive to that, but that grimpact can be bad for one person and good for others. For example, if we re-envisage the science society relationship and society's like, wow, science is not going to provide all the answers for us, that might be good from one angle, because it's a more natural, honest reflection about what science can provide society.

But that might not necessarily be in the best interests of some scientists who really like capitalizing on this blind expectation society has of science to provide answers.

I mean, we saw this during the COVID pandemic. All people wanted was a vaccine and a way out of this. They just wanted to know whether they needed a mask or not. And when they were bombarded with different types of information saying yes, but, and no, but, it didn't give them the certainty that they really wanted to hear.

So there was this difficulty in the relationship. However, on the other hand, when science came out with, we haven't just done one, we've done three different vaccines, they were like, oh, science, we love you, et cetera. So there may be disadvantages for the sector as a whole, in the short run, in having a more responsible relationship with society about what we can and cannot do. But I do believe, and I think this is what all people who go into impact believe, that in the long run the impact will outweigh the grimpact.

Ged:

Yeah, and there are echoes of what is happening with politicians, isn't there? People are starting to look at politicians and go, it doesn't really matter what flavour I vote for, because nothing changes.

Gemma:

In general, I can tell you that politicians confuse the Grimpact story a lot. Yeah, that's another, that's another case.

Ged:

Yeah, but we don't want to end up there, with science and arts researchers also falling into that kind of bucket with the politicians. You know, we do need to have that conversation with society and hopefully arrive at a place where there's a better understanding, I guess, on both sides.

Gemma:

Yeah. And in general, what I'm seeing through the cases is that the level of reflexivity needed to mitigate the possibility of Grimpact at every stage of the research seems to be, and the evidence is still out, already embedded in the way a lot of arts and humanities research is done.

It's translating those learnings, that approach and that orientation to research, to the high-impact, high-speed fields in STEM, such as, you know, some types of medical science and high energy physics, for example. That's not necessarily culturally possible, but that's what we're trying to work towards.

Ged:

Yeah, brilliant. I think that's a great way to end. So Gemma, thank you for a really interesting conversation about your contribution to a topic that I know is really high up the agenda in the research impact world. I know a lot of colleagues are thinking about those ethical dimensions of impact and trying to find ways to make the system more robust, more reflexive, and hopefully better at it as well.

Yeah. Now, if any of our listeners want to find out more about the project and engage, what's the best way to do that, Gemma?

Gemma:

Well, the best way is, I'm quite approachable, in fact, so you can drop me an email at the University of Bristol, or else you can look at the project website, which also has links to all of our previous research on the topic and some amazing advisory board members as well. And you can find that at www.grimpact.org.

Ged:

Simple as that, grimpact.org. Well, it just leaves me to say thank you on behalf of all the listeners for coming on the Research Culture Uncovered podcast again. It's been a pleasure once again to talk to you, and, you know, maybe we should do it as a series.

Gemma:

That'd be great.

Ged:

Gemma number three.

Gemma:

As long as we can associate it with some sort of musical theatre, I'm all in.

Ged:

Excellent, that's good to hear.

Intro:

Thanks for listening to the Research Culture Uncovered podcast. Please subscribe so you never miss out on our brand new episodes. And if you're enjoying the discussions, give us some love by dropping a 5 star rating and written review, as it helps other research culturists find us.

And please share with a friend and show them how to subscribe. Thanks for listening, and here's to you and your research culture.

