Are global rankings any use?

May 2022 | Opinion Article
By Michael Stopford, Founder and Managing Partner at ANCORED

“Google Search is fundamentally an online rankings system.” That’s what the Editor of US News Best Colleges suggested to me recently. And indeed, Google says much the same itself: it processes “hundreds of billions of webpages in our Search index to find the most relevant, useful results in a fraction of a second, and present them in a way that helps you find what you’re looking for.”

When we surf the web, we are so often looking for the best university, the least expensive flight, the most fuel-efficient car, the tastiest recipe…

Perhaps it’s a function of our competitive, free market economy and society that we need to know what’s considered the best in every category. As consumers, we may seek advice in wading through the market’s endless options and choices, beset by advertising of every kind. As citizens, we are intrigued by some apparently credible third party assessment of the best political, social or economic models — and influenced accordingly in making our own judgements.

Of course our interest in rankings and lists far predates the Internet: US News Best Colleges, for example, was first published in 1983; Google was founded in 1998. But the online world is supremely well adapted to producing selective lists for every sector and preference, with the availability of vast sets of user data and feedback so beloved of marketers and so essential to the existence of the big data companies. We consumers and citizens seem ever more addicted to the packaged and processed information they bring us.

Online data-driven analysis at least brings the advantage of a certain quantitative basis that can be harder to manipulate. Yet a major commercial industry has grown up around the production of rankings, lists and indices that are often opaque in their criteria, subject to manipulation and spurious in their objectivity. They have been created to meet our demand for comparison, our social quest for “the best,” our economic hunger for success. For the universities, companies or even countries included or excluded, promoted or demoted in the rankings, their respective positions have come to be of existential importance.

Take universities as a first example. In the US, every high school student thinking about college will look at his or her prospective choice’s position in US News Best Colleges. Chancellors and university presidents profess to detest this instrument, but they have increasingly adapted their academic, financial and strategic objectives to meet the ranking’s criteria and rise in the standings. In 2014, Boston Magazine ran as its cover story “How Northeastern University Gamed the College Rankings,” a description of how Northeastern, then a little-known commuter college in Boston, managed to jump one hundred places in the US rankings in just a couple of years to feature among the country’s top 50 by focusing solely on key rankings data. And in March this year the prestigious Columbia University was accused by one of its own professors of misreporting its data to get itself listed as #2 in the US.

Outside the US, the two most popular “World University Rankings” are those run by Times Higher Education (THE) and Quacquarelli Symonds (QS). These rankings are widely used by international students deciding where to apply for overseas study, as well as by faculty, grant makers and sponsors, and even government departments and multilateral institutions selecting potential partners. The two companies share some criteria but differ sharply on others, for example on how academic citations are counted. THE also publishes an “Impact Rankings” that exclusively tracks universities’ apparent commitment to the UN SDGs. It has become essential for universities to understand precisely how the rankings are calculated (in fact my company has been helping some universities in this respect), since they may find themselves featured whether they actively participate or not. Amazingly, the same companies that run these supposedly objective rankings then offer their services, for a fee, advising colleges and universities on how to improve their respective positioning. In fact, this is their core business model.

A similar model applies to many business indices which purport to rank Best Companies, especially in the allegedly socially responsible sphere of ESG (Environmental, Social and Governance) performance. Some of the best-known such indices are the Dow Jones Sustainability Indices (DJSI) and the FTSE4Good Index, a “series of ethical investment stock market indices launched in 2001 by the FTSE Group.” ESG investing is all the rage today, with the added impetus of climate change, and enormous volumes of investment are flowing into “green” funds managed on ESG principles. However, there are gaping flaws in these lists and funds: as a recent Harvard Business School article pointed out, most of them suffer from self-reporting or “greenwashing.” There is a total lack of transparency on the metrics used, and a tendency simply to accept the companies’ reports of their own performance in this sphere. As a counterweight, at my company we have proposed an index based entirely on critics’ and activists’ analyses and assertions…

Again, the rankings or indexing companies often offer their services to the participants ranked or indexed. When I ran public affairs for Syngenta (a Basel-based agribusiness) a few years back, I approached the DJSI group to ask for advice on how to submit data for inclusion in their index – and was astounded to hear that they offered various levels of support on an entirely fee-paying basis.

Universities, companies, hospitals, cars — rankings can be found on all of them. And what of countries? Here, one of the best-known examples is the so-called Global Competitiveness Report, published by the World Economic Forum since 2004. It claims to measure “the ability of countries to provide high levels of prosperity to their citizens.” The top three places habitually go to Singapore, the US, and Switzerland, but the report has been criticized for giving insufficient weight to environmental and ecological criteria. Then we have the Prosperity Index from the Legatum Institute (the top four spots in 2021 all went to Nordic countries); the Democracy Index compiled by the Economist Intelligence Unit; even a World Happiness Report (the Nordics doing well again) — and countless others.

So what are we to make of this plethora of rankings and indices? The answer would seem to be that they may be a useful adjunct source of information, but should never be taken as the sole basis for choice, decision or selection. We should always look carefully at their criteria and their transparency, at what data they purport to measure; examine if their business model allows for those ranked to buy services from the ranking companies, which must surely affect their credibility; and compare different indices and rankings to arrive at as balanced an assessment as possible.

Surrounded as we are today by fake news, self-reinforcing social media, and a general mistrust of authoritative sources, when we next search for “The Best” or the “Most Responsible” we should never suspend our source awareness or our personal judgement.

About the author

Michael Stopford is the Founder and Managing Partner of ANCORED. He has led strategic communications and managed reputations for some of the world’s most famous names, from Coca-Cola and ExxonMobil to the United Nations, NATO and the World Bank.

Who owns the ANCORED publications?

The articles published on the ANCORED website or distributed in the ANCORED newsletter are owned by either ANCORED or our contributing researchers.

Can I reprint or distribute ANCORED publications?

Any attempt to re-publish our articles requires prior approval from ANCORED. Please email us with any re-publishing enquiries.

ANCORED articles are available as PDF copies for personal, business, educational, and informational purposes.

We do not authorize the use of our content for promotional, sales, or marketing purposes. Please note that we do not supply customized reprints.

Does ANCORED accept article submissions?

While the majority of our articles are written by ANCORED consultants and researchers, we do accept submissions from external thought leaders and practitioners.

The bar is the same for all authors: we look for thinking that is novel, useful, and rigorously substantiated. For external contributors, we also attach weight to work that sheds light on topics that are a priority for our firm and to submissions from recognized leaders in their field.

To explore whether ANCORED might be interested in publishing your work, please email us. We review both drafts and proposals. If your submission holds promise, we will be in touch to discuss the content and clarify our editorial process.
