“Results show that universities with frequent QS-related contracts experienced much greater upward mobility in both overall rankings and in faculty-student ratio scores over five years in the QS World Rankings,” noted CSHE in an email.
It added that the report used data on the positions of 28 Russian universities in the QS World University Rankings from 2016 to 2021, along with their contracts for services from QS, and then compared these with data from Times Higher Education rankings and national statistics.
“Positions of Russian universities that had frequent QS-related contracts increased [by] 0.75 standard deviations (approximately 140 positions) more than they would have increased without such contracts.
“In a similar way, QS faculty-student ratio scores of Russian universities that had frequent QS-related contracts increased [by] 0.9 standard deviations more than they would have increased without frequent QS-related contracts. Taken together, these findings suggest that conflicts of interest may produce significant distortions in global university rankings.”
The senior researcher responsible for the study, Igor Chirikov, said this was the “first empirical evidence to date that rankers’ conflicts of interest may negatively impact outcomes in university rankings”.
“The results are novel for higher education but they are consistent with the numerous studies from other sectors of the economy suggesting that conflicts of interest lead to biased evaluations,” he continued.
“Most likely, rankers that depend on universities for resources are vulnerable to an unconscious self-serving bias also reported by studies of conflicts of interest among auditors. Self-serving bias may lead to a more favourable consideration of the data coming from universities that are also frequent clients as compared to other types of institutions.”
However, according to Bahram Bekhradnia, president of the Higher Education Policy Institute, the findings should come as little surprise.
“QS is a commercial organisation. They’re there to make money and their rankings are not objective. They sell services to universities to help them to improve their position in the ranking, not to help them improve their quality,” he told The PIE.
“Of course, rankings are not measuring quality, rankings are just measuring whatever they measure. And so we shouldn’t be surprised.”
For Tia Loukkola, European University Association director of institutional development, the study is also significant because it reveals a link many in the industry have long suspected but found difficult to prove due to the lack of transparency around which universities use such services.
“It’s an interesting finding. In general this is something that has been discussed, when one part of a company does rankings and the other part brings in money through consulting,” she said.
“There are quite a few universities who use these services. QS has different packages. The data on who uses the services is not publicly available. I think this is part of the solution.
“But the other part is that we need to understand that the rankings and the measures of the rankings are compiled by commercial providers. We need to understand how they operate [and] put it into context.”
QS chief marketing officer Tim Edwards emphasised that the company is transparent in its ranking approach.
“We ensure that in-depth detail of our methodologies and data definitions for all ranking exercises are publicly available,” he noted.
“We are confident in the robustness of our approach in producing one of the world’s most popular higher education rankings, which has served as a useful tool to students and institutions for 18 years. We consistently welcome feedback and scrutiny from our community as part of this approach.”
Currently serving more than 1,300 universities in 50+ countries, with more than 30 years’ experience as a key higher education sector partner, QS “support[s] institutions to address their key challenges including student recruitment, internationalisation, research collaborations and performance insight”, he continued.
“Most recently, these insights have played an important role in helping support the long-term resilience of institutions throughout the Covid-19 pandemic.
“We are mission driven with integrity at the core of both our values and of our business model. This is underpinned by well-established governance and quality assurance processes, which ensure that our rankings operations and team are functionally independent, with a clear set of conflict-of-interest policies to which all our employees and stakeholders are subject. Our role as a trusted partner to the sector for over 30 years is testament to the strength of our integrity in all our activities which serve the sector.”
Rankings have long been criticised from many vantage points, others warned — from the influence of consulting services to the reliance on reputation surveys to the use of self-reported data.
“I was at the conference in Dubai a couple of years ago and somebody came up to me and told me that they had [self-reported] one measure on the amount of external income. They said that their predecessor had wrongly been reporting the number in the United Arab Emirates dirhams instead of US dollars. So it was inflated by five,” said Bekhradnia.
“The data are not comparable between universities and between countries anyway, let’s face it. And then we got all the shenanigans about selling services. It’s a money-making operation and in my view does damage to the university system.”
UPDATE: 16:26 GMT April 28, comment from QS added.