“The omertà was broken, at last”

Interview | by Sicco de Knecht
12 March 2019 | On the occasion of his farewell as dean of medicine, Science in Transition frontrunner Frank Miedema (UMC Utrecht) speaks about dispelling myths about academia. Looking to the future, he predicts a rocky road to open science. "At one point we'll inevitably come to a choice whether or not to abandon university rankings."
Frank Miedema – Image: UMC Utrecht

A little over five years ago, Science in Transition took Dutch academia by storm. A small band of academics, among them the dean of a medical school, conveyed the message that, as a result of a collective effort, science had strayed far from what it is supposed to be doing. Academia has become a self-referential and failing system that focuses primarily on rankings and publishing. Instead, academia should re-evaluate itself and aim for fundamental reform, in which societal relevance and the added value of research to the body of knowledge take a more prominent position in the value system.

Frank Miedema would probably not call himself a typical biomedical scientist, as his deep passion is the history and sociology of science – thinking about the science business. His biomedical colleagues do not have time for that, he jokes. When the young PhD student Miedema was pipetting in the lab, he started reading the work of Bruno Latour and Jerome Ravetz and has never stopped since. At the beginning of this century he read the first critical articles showing the lack of reliable results in medical and biomedical science. It was around that time that it dawned on him that he himself was also part of a culture that emphasized the importance of publishing over the actual reliability and relevance of research.

In 2013 these insights culminated in Science in Transition’s first position paper, entitled “Why science does not work as it should and what can be done about it”, which he wrote with his three Science in Transition peers. At the time, the publication felt to him like the expression of a peripheral and exotic view on the workings of academia, but as it turned out the message resonated strongly within Dutch and international academia.

After ten years of service, Frank Miedema is stepping down as dean of the Faculty of Medicine and vice-president of the University Medical Center Utrecht. Soon he will start in his new position as Utrecht University Vice-Rector of Research and professor of Open Science. This occasion marks the end of his biomedical career. In his new role he will continue on the path he started on a few years ago: a path of transition. On this occasion we speak with him about the reception, the embrace and the future of Science in Transition.

Reading national and international science policy documents could leave one with the impression that the philosophy behind the Science in Transition movement is now commonplace. This raises the question of how the position paper was initially received.

“To be honest, we completely underestimated the power of our message. After publishing the position paper, we planned to top it off with a fancy symposium, utterly convinced that that would be the end of it. We were expecting an audience of up to a hundred kindred spirits and acquaintances, and that everyone in that small circle would totally agree with us, with no effect on the outside world.

To our great surprise, the subject suddenly received a lot of attention, not only from within, but also from outside the academy. On the day of the symposium I was invited on national television to convey our message. From the moment the ‘on air’ sign went on I fired away and didn’t stop until I had said what I wanted to say.

In short, I argued there that we were dealing with an academic system, floating on myths, in which quality was equated with impact factors. A system in which academics had to fight to build up credit with an ever-increasing list of high-impact publications to keep their operation running. I argued that it was time to dispel the myth of science as a noble and innocent craft. Basically, we don’t deliver on our promise to society. Many academics are silently unhappy about that fact, and the work we publish is by no means always valuable. In short: we have a huge problem.”

How was this message received by your peers?

“Fellow academics, university board members and members of the board of the major national funding agency decried that we had bitten the hand that fed us. We had defamed the academy. ‘This is very bad for science, it will cost us,’ people cried out to us. It was felt that we had taken an enormous risk by denouncing science this publicly.

And certainly the implications of our message were extremely tough. It meant that the unimaginable was possible: funders were spending money on shoddy and irrelevant research. This, of course, could never be the case.

As one might expect, the line of reasoning offered was that independent committees judged the merits and quality of proposals objectively, and that this was to be seen as a guarantee of good science. To their disadvantage, we had been members of those committees ourselves and knew about their practices. In reality, Journal Impact Factors were driving the decisions on who to grant funds to, in the full knowledge that these impact factors themselves were fatally flawed.

‘Everything is geared to rapid production of data and publications’, we argued. Quantity over quality. It has gotten seriously out of hand. We’re now on the other side of the spectrum. Everyone is carrying out fancy research to join the hype: novelty and news value, topicality and short-term impact.

In the nineties the Netherlands Cancer Institute, as well as other institutes, would not let you publish in a journal with an impact factor of, say, less than 4. This is how publication bias has crept in, because negative findings do not usually end up in the ‘leading journals’.”

One could say that this simply is the way science works…

“I don’t agree with that attitude. This is not intrinsic to the system and much less to science! Science does not necessarily have to work like this. People, especially young researchers, are in fact exploited; it’s just a derailed capitalist production system.

And we were not alone in our position. Suddenly there were a lot more people who were happy that the omertà was broken at last. Those academics could finally breathe a sigh of relief. Now they could say that the system wasn’t right, without immediately being stigmatised as a ‘loser’.”

A popular counter-argument to your call for change was the image of the lazy academic of the past. Is this image a reflection of reality, or is the lazy academic a myth as well?

“There were indeed ‘lazy’ academics in the 1960s. I was a student in the 1970s, and in those days academics with a permanent appointment typically took seven to ten years to complete their PhD. And some of them did not write anything at all. An explanation for this work ethic, though, is the fact that every baby boomer who wanted a permanent job got a permanent job. It simply took a lot of people to expand the universities to fit our ideal of total accessibility. So when you looked around the academy as a student in the 1970s, you just saw a lot of people we would now call ‘lazy types’.

In response to this culture, policy makers set out to transform the academy. This was in the 1980s. They were intent on creating a more dynamic research environment. That process was a necessary step forward and partly a good thing. But now everything is dynamic, competitive and short-cycled, and we use the first flow of funds to match projects from other funding streams. And because the first flow of money itself has become such a scarce commodity, the quality requirement has become unfeasible.”

But isn’t that what we wanted, to make science more dynamic and increase output?

“That became science policy, but in hindsight, you have to ask yourself who actually had an interest in it? Now everyone is in those four-year cycles, and that’s how science has come to depend on careers and vice versa. It’s like a computer game where you continually try to reach a higher level by adding publications to your résumé.

As members of different boards of directors, we felt the obligation to ask the tough questions. First of all because the system of incentives and rewards and the granting process really adds too little to science. There must be a much more consistent basis for coordinating, and hence setting, the research agenda with society.”

But the system you criticised works very well for some people.

“That system functioned, and of course there was a group of people who were ‘happy’ with it, not least because their research scored well in the right journals and with sponsors. There is, however, rightly a great deal of criticism of the use of the classic ‘metrics’, especially numbers of papers and the Journal Impact Factor. As a result, papers in journals with a high Journal Impact Factor are seriously overrated at the expense of good work in good journals.

This is a societal and political problem, not purely a problem for the academy. It has a major impact on the research we do and also on what we don’t do: ‘metrics shape science’. It is now well understood that biomedical research on the societally and clinically most relevant problems is left behind, poorly supported because it didn’t score.

Most researchers were indeed not ‘happy’ with this at all. As dean, I was told many times by very good researchers: ‘I don’t know about you, but I quit.’ Such a researcher would be in their early forties, having just successfully completed their first major project, and would say: ‘Heck, I’m no longer going to bend over backwards to get to do something new and fashionable again.’ And that is a bad sign.

The cleverest people thought: this is not for me. This is a game for flashy researchers who know how to game the system, and I don’t want to be part of that. There are also those who do not arrive at that insight. Somehow that may be because they feel that if they allowed themselves to realise it, their motivation would be gone. The fact that we are all socialised in that system means that we don’t see it anymore.”

In the end, the philosophy of Science in Transition has been incorporated and it plays a prominent role in current science policy. What was the tipping point?

“It is true that the analysis is now much more accepted. The fact that former Minister of Education, Culture and Science Bussemaker, among others, publicly stated that we had a point at the time was great support. The Rectors’ Board, led by University of Amsterdam rector Dymph van den Boom, also changed track at a certain point, although their criticism was that we had been overly critical. Then came the 2015 science vision and the National Science Agenda as the new plans from the Ministry of Education, Culture and Science.

To our own surprise, Science in Transition was becoming an institution. We were suddenly very seriously involved in all kinds of developments, also at the EU level. Science in Transition was picked up not only nationally but also in Europe. You could see it, for example, in the Science 2.0 plan, which literally got that subtitle. We were cited as an exemplary movement in the Netherlands.”

Yet Science in Transition as a movement itself has not ‘grown’ further. It is still mainly the original four authors of the manifesto. Why is that?

“We’d been given that advice by Peter Blom of Triodos Bank, who was a speaker at that first symposium in November 2013. He advised us not to let ourselves be hedged in. And indeed, we were approached by all kinds of people who sympathised with us, but we deliberately never turned it into a movement. We didn’t want to join any movement and we didn’t need anyone to join us. Four people with a clear story are much more dangerous than an interest group.”

In line with this, it is striking that as Science in Transition you have never made an open appeal to young scientists. Why not, aren’t they the future of science?

“I get that question very often. The question of what PhD candidates and postdocs can do. Let me be very honest about that. I think it is they who have the least power in the system. So I’m not going to try and mobilise them if there’s no plan. We do not have strong unions, we cannot go on strike. Of course the young scientists should support Science in Transition, but I don’t want to ask too much of them. When you look at it as a social system, young researchers are the bottom layer of the academic pyramid and have very little power in that system. When it comes to changing the system, I don’t think you should pass the buck to them.

I have to admit, though, that I was to some extent mistaken about PhD candidates. The Dutch PhD Network and local PhD councils have developed a clear, strong voice, and through the works councils at universities and UMCs you can see that young researchers and students can indeed exert influence.

Science in Transition also came from the institutional level. It is therefore not so odd that many people, including in my own organisation, regard us with some suspicion. So what do you do about it yourself? It means that you establish an in-house committee that focuses on exactly these questions. How do we deal with rewards and incentives as an organisation? By which indicators do researchers want to be assessed? How do we deal with authorships? One such outcome was that we told department heads that they are not supposed to put their names on every paper that comes out of their department.”

Rewards and incentives have since been addressed at the national level. Does that have any chance of success?

“Yes, but make no mistake, this will inevitably evoke resistance. Which is understandable, because metrics such as the Journal Impact Factor and academic career management are closely related. There is also a direct relationship between the way we assess researchers and how research money is distributed, within academia and by committees at the funders. Right now, as a consequence, we actually leave personnel policy, i.e. career management and promotions, to the funders, and they have a great deal of influence. They are aware of this but also say: ‘Don’t blame us.’ And indeed, blaming them would be too easy.

But what this all boils down to is the simple fact that there are faceless committees, meeting at some indistinct conference center, who decide who will become an associate professor or professor at our institutes. Although this happens indirectly, through the granting of research proposals, it is still a very strange situation. Shouldn’t universities be able to decide this for themselves?”

Right now, in the Netherlands, universities and funders are moving towards a different system of rewards and incentives. Is that a good sign?

“I think it is, and indeed, I see a shift taking place. If we let content and social impact prevail over impact factors, the money will at least be divided differently, in a more balanced way. Besides the funders, charities in the Netherlands, Europe and elsewhere have already gone down this road. At least quite a number of stakeholders are declaring they want to move away from a system based on metrics.

This inevitably means that at one point we’ll have to have a discussion about international ranking systems such as the Shanghai Ranking. The question then arises whether we still want to take part in that. You’ll have to practice what you preach. So if you declare, to your Dutch peers and within your institute, that you will not assign prominent value to a ranking, then what do you want to communicate internationally?”

You are now taking up a position as professor of open science. Is that a new subject or is it an extension of Science in Transition?

“For me personally, open science has only really been on the radar since 2015. But I see it as the inevitable consequence of a different approach to rewards and incentives. If we stop misusing metrics, then open science is a logical next step. The open science movement as such is a different beast altogether, though. It originated with the researchers themselves. It is a very strong grassroots movement of people who take the ideal of science very seriously.”

“Is it the answer to all of our problems? To a great extent I think it is. If you start practicing open science as it is laid out – which means opening up the agenda, more data sharing, more replication, no more metrics but evaluation based on content and impact – you’ll come a long way. The only question is how to get there and who you meet along the way.”

You foresee a rocky path for open science ahead?

“It certainly won’t be easy. The discussions will constantly center on what determines quality, and there will always be interests – hidden or exposed. I’ve learned as much by now. People will stand up for their research, for their positions and even for their livelihoods. But we are going to open up these discussions. It’s very exciting and it will take at least ten years before we have an idea of what the long-term effects of open science might be. There are always unintended effects that we need to study and mitigate.

Right now, for many people, open science is still very much about publishing and open access. But open science is really about a much broader picture of how we practice science, with whom and for whom. And most importantly, it must cross the divide with society. There is always a fear factor in providing full disclosure. Do people really want to know how the sausage gets made?”

Won’t we lose public trust in science once it gets out what a messy business it is?

“Are we afraid of social discourse? No. We should not be afraid of losing credibility by providing full disclosure. On the contrary, I think it will make us more credible, even in cases where openness leads to the detection of fraud or bias. Open science means that you anticipate these situations much more readily, as a by-product so to say, because the main goal is that others can verify and use your data and take them further.

When you are more open, there is more to criticise. But that is a reality you have to deal with and act upon, and it also means, for example, that you look closely at how you deal with your own people. In our case, it has strongly forced us to look at our own leadership.”

So open science keeps academia on the right track?

“That’s my starting point, yes, and it forces you to ask the right questions. What was the university for to begin with? It’s not there to confirm and rehash what we were already doing. On the contrary: to ask how and why we do things in our daily lives, personally and in society – that’s the essence, I think. So yes, I support open science, even if it leads, or precisely because it leads, to discussions and sometimes resistance in the academy.”

