According to the QS World University Rankings 2019, the National University of Singapore is now the 11th best university in the world. Not only did NUS rank first in Asia, it also managed to defeat prestigious brands like Princeton (13th), Tsinghua (17th) and Tokyo (23rd).
Its arch-rival NTU is not behind at all: tied with NUS at 11th place, the two are close enough to high-five, and why wouldn’t they? If the rankings can be believed, Singapore must be—pound-for-pound—the greatest hub for higher learning since the Library of Alexandria caught fire.
Take a closer look, however, and doubts might emerge. While QS rates NUS 11th in the world, its opinion is not shared by rival rankers like Times Higher Education, U.S. News, or the Academic Ranking of World Universities.
In fact, the final score reads like a string of TOTO numbers. NUS is ranked 11th (QS), 38th (U.S. News), 23rd (THE), 85th (ARWU) and 103rd (CWUR), while the winning combination for NTU is 11, 49, 51, 96 and 173.
If we compare year-on-year, the whole exercise becomes even more absurd.
Ten years ago, NTU was ranked 73rd in the world, wedged between the University of Geneva and Washington University in St. Louis.
A decade later, NTU has apparently made such great strides in research and/or teaching that it leapfrogged more than 60 competitors to clinch the 11th spot.
With all due respect to NTU graduates—including my boss—I find this rags-to-repute miracle a little hard to believe. If a listed company posted such gains on the SGX, we would be auditing its every fart for signs of wrongdoing.
Hence, I seriously doubt that the changing numbers reflect any sort of reality, whether for students or faculty. This is not Slumdog Millionaire.
So, depending on whom you choose to believe, you either went to one of the best schools in the world, a middling college, or a degree mill selling glorified toilet paper. Perhaps we need a RICE Ranking of University Rankings to make things easier.
To borrow an analogy from British author Hilary Mantel, asking whether university rankings are accurate predictors of education quality is a bit like asking if horoscopes are accurate.
Of course, they’re not. Every student knows that university life is a mixed bag. Given the sheer variety of courses, modules and instructors on offer, it is nearly impossible to predict the quality of the ‘education’ you eventually receive.
But then again, like horoscopes, aren’t they nice to look at? Isn’t it fun to learn that your alma mater has climbed two places to defeat Yale? Times Higher Education cannot predict your love life in the Age of Aquarius, but its tables are comforting for lost high-school students, who look to them for advice, and entertaining for jaded graduates, who get a good chuckle whilst scrolling past.
If that’s the case, we must ask, why do the rankings continue to exist? Why does the Straits Times—and other newspapers—continue the vaguely Pravda-like practice of publishing ‘good news’ about university rankings, as if NUS were a dog we had launched into space?
In other words, why are they taken so seriously?
I am naturally inclined to blame it on Singaporean kiasu-ness, but in this case, that is not the culprit. The rise of university rankings has more to do with the emergence of higher education as a lucrative export industry.
Over the past 20 years, the number of highly mobile international students has increased drastically. According to UNESCO’s data, the international student population more than doubled to 4.8 million between 2000 and 2016. By certain estimates, fee-paying international students inject billions into their host economies: $30 billion for Australia, £20 billion for the UK and $39 billion for the US. That’s a lot of money distributed among university coffers, housing providers and local drinking establishments.
Singaporean universities do not have the same thirst for foreign money, but our government loves them in its own way. According to ‘Emerging Education Hubs: the case of Singapore’, our government wants to develop the country into ‘the Boston of the East’. Our ambition to attract foreign talent naturally translates into a quest to attract foreign students.
This is where the rankings come in. Freshly graduated high school students—whether they’re based in Beijing, Bangalore, or Bukit Timah—usually do not have a fucking clue where they want to go or how to get there. Oxbridge and the Ivy League shine out like lights at the end of a dark tunnel, but there’s not enough time to prepare and few students have a good idea of what else is out there. In the opaque higher education market, there is much anxiety, little time, and a desperate paucity of ‘reliable’ information.
Into this void step companies like THE and QS, promising nervous matriculants ‘reliable’ information in easy-to-process integer form. NUS: 11! Duke: 24! SUTD: 500! Never mind that a university’s research excellence varies even within a single department. Never mind that the methodology is subject to all sorts of criticism. For a desperate student, there is nothing more comforting than a table telling you A is better than B.
Thus, it should come as no surprise that these ranking companies also offer ‘consulting services’ to help universities ‘improve through performance analysis’ or ‘advance their reputation’.
I don’t want to face a libel suit, but I am suddenly reminded of paying for power-ups in Bejeweled Blitz.
Of course, none of the above is really news to those who follow the higher-education debate. The presidents and provosts of highly-ranked universities have been rubbishing these rankings for as long as consultancies have been publishing them.
Everyone from Malcolm Gladwell to The New York Times has made a case for why university rankings are dubious at best and bollocks at worst.
Many of the discrepancies can be explained by differences in methodology. While some companies like QS survey nebulous qualities like ‘reputation’, others like the ARWU devote themselves almost exclusively to research parameters—research output, Nobel prizes won, and so on. While the former favour a highly international faculty, the latter don’t give a rat’s arse how many of their professors came from Malta.
But since there is really no agreement on what makes a ‘good university’, there is even less consensus on which university is ‘better’. Hence, the Fibonacci-like sequences found in university rankings.
In short, we give these rankings too much credit. If you doubt this, just check out SMU’s utterly dismal performance at 477th place. By right, its faculty members should not even possess opposable thumbs, much less teaching skills. Yet most employers love SMU graduates just fine, and most are happily employed beside their more prestigious compatriots from the schools tied at 11th.
Now, isn’t that bloody confusing?
For extra credit, solve for X: (33, 39, 38, 56, 68). Write to us at community@ricemedia.co