On Sunday, Quacquarelli Symonds (QS) announced that Trinity could face suspension from the university ranking survey on account of a questionable letter sent out to academics and others affiliated with College.
According to the news site Inside Higher Ed, which received a copy of the letter from anonymous recipients, the contents of the document appear to be an attempt to encourage peers familiar with the College to register for the voluntary vote in the survey.
An excerpt from the emailed letter, written by Dean and Vice-President for Research John Boland, reads: “University rankings are also important to Trinity, and all other universities around the world, and we were pleased that the latest rankings reaffirmed Trinity College Dublin’s position as Ireland’s top university and one of the best in Europe. We know that our research successes are augmented by our collaborations with world-class academics such as you.”
However, the contention surrounding the letter stems from a link embedded in the email, directing recipients to the QS site where they may register to vote in the rankings. The link was preceded by an encouragement to fill out the surveys and an indication that both the QS and the Times Higher Education (THE) surveys were about to begin.
Simona Bizzozero, a spokeswoman for QS, clearly outlined their position on the letter to Inside Higher Ed: “Any attempt to overtly influence a potential survey respondent to vote in favour of a given institution is unacceptable and where identified, such respondents will [be] excluded from analysis,” she said. “Further consequences for the TCD, up to and including temporary suspension from the ranking altogether, will be considered.”
In response to the statement issued by QS, Boland said that he was “surprised by the response to our letters.” According to Boland, Trinity will be in contact with QS in the coming days to discuss the situation.
Both QS and THE use surveys for calculating portions of their rankings, but only the former allows individuals to nominate themselves for the survey process.
In 2013, attempts by UCC’s president, Dr Michael Murphy, to encourage staff to recruit university-affiliated survey participants prompted QS to remove universities’ ability to recruit participants for the survey.
However, QS reinstated the sign-up facility last year on the basis that it was a “crucial source of respondents.” To guard against “noticeable attempts to manipulate participation in QS surveys and therefore results,” they have now included a question within the survey asking respondents to specify from which institution they heard about the self-nomination facility.
QS have defended the integrity of their survey system in the face of criticism: “QS runs sophisticated screening analysis to detect anomalous patterns in response and routinely discards invalid responses,” remarked Bizzozero, later adding that rival rankings compiler THE is no less vulnerable to error in their survey practices: “A lower overall sample size, and the practice of drawing outcomes from a single year of response, will render their exercise more statistically sensitive.”
However, beyond their statistical accuracy and safeguards against bias, the role that university rankings play in university improvement remains an even more contentious issue. Many argue that the predominant ranking systems do not give enough consideration to the different contexts in which universities operate across the globe and are thus an inaccurate measure of reputation.
Speaking to University World News, Gerald Wangenge-Ouma, director of institutional planning at the University of Pretoria in South Africa, argues that current systems encourage universities to chase after improvements in fields of research and study most valued by the survey itself, rather than advancing their own areas of expertise: “What we’ve seen is universities running away from their areas of strength, and trying to do things that are privileged by the ranking systems, even though they don’t have the capacity to do them.”
Wangenge-Ouma will be a member of the panel on university rankings at the British Council’s Going Global 2016 conference being held from 3-5 May in Cape Town, South Africa. He added that more needs to be done in utilising rankings as an incentive to improve university performance, rather than viewing them as ends in themselves: “I’m for a system that supports stability so that universities don’t think they have to keep moving here and there to clinch a top position in the ranking.”