Harnessing community resources for research assessment reform
By Sarah de Rijcke and Alex Rushforth (both Leiden University)

A discussion on the reform of research assessment and how resources from the international community can be harnessed. There is a growing need for communities of practice, common knowledge resources and shared learning. TARA (a DORA project) aims to provide tools for this, as well as an online exchange of practices.

The first discussion round was about the narrative CV. Much of the debate was caused by misinformation: is it about storytelling or about providing evidence? Explanation and framing are important. A big misconception is that the narrative CV is no longer about quality. One participant said: ‘The problem with narrative CVs is that people who are not used to writing do not know what it means to build an argument and provide evidence through language. Our university is leaning on humanities scholars to help others build narrative CVs. First you have to make people understand what a narrative is. Scientists don’t have to be good writers, but they do need training.’ A ‘substantiated CV’ was introduced in the discussion as an alternative to the ‘narrative CV’: ‘Think of a portfolio of activities for academics that can be translated into a narrative CV, which can in turn be translated to an institutional level.’ We often talk about what we cannot do; it is important to give examples of success. There is no one-size-fits-all, yet people often assume they have to do everything. It is important to stress that you can be specific and choose what is relevant for you. Another participant preferred ‘portfolio’ to ‘narrative’: ‘Anyone complaining about narrative CVs who also writes grant proposals is objecting for argument’s sake. A portfolio should prove why someone is the best candidate for a specific job, grant, etc. Make it limited and make it count.’ Asked how a Dean can address researchers’ concerns and opposition, Sarah answered: ‘Set up examples, provide proper infrastructure, be clear about what is required, and develop training for the substantiated CV.’

The second discussion round was about diversifying criteria. We need to ask: how do you measure impact, and how do you measure the impact of new criteria? One suggestion was qualitative impact measurement with one or two questions, such as: did this change your view on academic work in general, and if so, how? We need to use qualitative approaches. Someone else said: ‘Involve and invite associate professors into the conversation. What to evaluate is tied to monitoring the outcomes of an intervention. There is a strategic component to determining new evaluation criteria; it depends on the profile of the researcher you need.’ Someone else again: ‘We need to determine what we want to achieve with an intervention before we evaluate its outcomes. Ultimately, we want good-quality research and good-quality education.’ Another participant said: ‘Portfolios make it difficult to assess promotion. Why did my colleague get promoted and I did not? Transparency about decision-making is important: is it teaching? If so, we need to drill down deeper. Is it the number of classes? The number of international students?’ Asked, finally, how a leadership group should go about evaluating an intervention, Sarah answered: ‘Is the strategic component of why an intervention was set up clear in the first place? Were the new criteria co-created with the academic community? Narrative vs. metrics is not black and white.’