News
Feb 1, 2017

New Research Initiative Criticised After Concerns Over Ranking Academics

The initiative has already drawn criticism from IFUT over concerns that it might lead to competition between academics.

Róisín Power and Sinéad Baker
Róisín Power for The University Times

A new initiative to measure the research output of Trinity’s staff, currently in its pilot phase, has raised concerns that it could rank and compare individual academics, and has already drawn criticism from a trade union.

The new initiative, Principal Investigator Quantitative Analytics, is proposing to rank academics based on their research output. The initiative has faced criticism in College, including from the Irish Federation of University Teachers (IFUT).

Speaking to The University Times in his capacity as Trinity Branch Chair of IFUT, Dr John Walsh of the School of Education said: “I think there are concerns among many staff members, and certainly IFUT members, throughout College. We think that it’s really counter to the whole spirit of the academic community.”


“Much of PIQA [Principal Investigator Quantitative Analytics] is quite reasonable and consists mainly of information gathering about, say, research income and other aspects of research, which we would have no objection to”, Walsh added.

However, describing the proposed idea of ranking academics as “problematic”, Walsh commented that it is “effectively individual benchmarking or individual league tables” and that it would “potentially involve comparing individual academics based on a small number of criteria such as grant applications, total journal articles or web of science citations”.

A response issued by College spokesperson Caoimhe Ní Lochlainn, on behalf of the Dean of Research, Prof John Boland, confirmed that the project was in its pilot stage, but declined to comment until the initiative had been fully rolled out.

In the UK, the Research Excellence Framework is used to assess the quality of academic research, with results used to determine how much research funding universities and academics are granted, as well as an institution’s ranking in league tables. Under the framework, 20 per cent of the overall score is for research outside of academia. The framework has been criticised for discouraging long-term research projects and for adding layers of bureaucracy and expense, with Times Higher Education reporting that some universities appeared to be “gaming” the system.

Walsh said that IFUT would “object to anything that involves an approach” similar to that of the Research Excellence Framework: “The English system of measuring research outputs and funding based on research outputs is toxic and has been incredibly damaging to English universities and is under a lot of criticism in England.”

“While it’s not a fully fledged product of the English system”, Walsh continued, “it does bear some of the hallmarks of the English system, and certainly IFUT is entirely opposed to anything that would resemble [it]”.

“In fairness, it’s not clear that that’s the intention of the authors of this, but there are major problems. This would be the first step down the road towards an [English] style system, if adopted.”

At a Research Committee meeting in November 2016, Prof Trevor Spratt, Director of Research for the School of Social Work and Social Policy, agreed that “the tool was valuable” but warned against its use in the future, commenting that “such metrics were used in the UK in a very hard line way when assessing staff”, and said that he “would like to see reassurances on the future use of such a tool”.

Prof Peter Stone, who was filling in for the School of Social Science and Philosophy’s Director of Research, raised a similar concern at the meeting over the potential future use of the metric and the dangers of following a system similar to the UK’s Research Excellence Framework.

Outlining aspects “that are very worrying”, Walsh spoke of two elements, the first being that such ranking of the academic community might be done in a way “that encourages internal competition between academics within a school or within a discipline”.

At the same meeting of the Research Committee, Prof Micheál Ó Siochrú, then-Director of Research for the School of Histories and Humanities, who has since been appointed head of the school, stated that his main concern was also the motivation behind the generation of a tool that may divide the College and “work against the spirit of collegiality”.

Walsh did, however, acknowledge that the Dean of Research, John Boland, who is heading up the project, “has been open to discussing it at various fora”.

The second element raised by Walsh is that the College, and more recently the arts and humanities faculty, has over a number of years “developed really inclusive research metrics that recognise everything from a standard monograph or a peer-reviewed journal article to a play or editing a manuscript as valid forms of research output”.

Though extremely supportive of the Research Office’s work in improving Trinity’s research profile, Walsh said that “there will be no point in undermining the faculty’s own, and the College’s own, inclusive research metrics, by introducing something that was much more fine grained, was much more limited and which would encourage competition among academics that is really counterproductive”.

“I think a key issue here is that there is a tendency in academic institutions to introduce flawed systems of metrics”, Walsh added.

On the question of the anonymity of the data, Walsh said: “I think whether it’s anonymised or not, it promotes a kind of internal competition and benchmarking that’s very damaging to collegiality and it’s another set of metrics on top of research metrics that we’ve already recently agreed.”

“I think anonymising it may be designed to offer assurance, but I think it misses the point”, Walsh added. “The concerns that people have are much more fundamental than whether it’s anonymised or not. I don’t think it would be possible to relieve people’s concerns if you could find a way to anonymise it perfectly.”

“I don’t want to be competing with my colleagues, I want to be working with them for the benefit of the institution. I think that’s what we should all be aiming for”, Walsh added.

When asked about the fact that the current metrics of the initiative do not account for those who take maternity leave, Walsh said that that was a “key concern”, describing the proposed metrics as “a blunt instrument that would be crude and unfair, both to colleagues carrying extra administrative burdens, but also to those coming back say from maternity leave or other forms of leave”.

Walsh explained that the metrics also do not take into account that an individual may be a College Tutor, a director of undergraduate teaching and learning, a programme coordinator, or hold other such roles, which may mean their research output is “over a couple of years less than it would otherwise be”.

Echoing this concern at the November meeting of the Research Committee, Prof Helen Sheridan, Director of Research in the School of Pharmacy and Pharmaceutical Sciences, raised the point that the “majority of academics were balancing curriculum delivery and development, administration and research, and there are no balances, measures or tools to measure productivity across the spectrum”.

When asked about the motivations behind the initiative, Walsh said: “The motivations to me are very obscure. I can see the motivations behind PIQA [Principal Investigator Quantitative Analytics], the overall project, and that’s to get as much information as possible and to enhance the College’s research profile.”

“In fact, you could say that all but one of the elements of PIQA [Principal Investigator Quantitative Analytics] are broadly quite reasonable, but one element, which is individual benchmarking”, Walsh stated.

“We’re benchmarked through the promotions process, and that’s an extremely rigorous process”, he added. “We’re more generally benchmarked through the research metrics so it just seems pointless to invent another system of metrics.”

Walsh emphasised that the initiative is still in early stages and would need to be brought to University Council through the Research Committee. “I think it’s very important that before something as controversial and as flawed as this policy is brought forward for consideration, there needs to be very full consultation within the college community”, he stated, adding that “there are signs that that has been understood”.
