Do the rankings really help you choose an MBA?

Picking a business school is not made any easier by the fact that the rankings vary widely, with each of the big-name lists using its own particular criteria and weighting.

by Jeremy Hazlehurst
Last Updated: 09 Oct 2013

With the economy still in the doldrums, this could be a great time to take a full-time MBA. Why slog your guts out working thanklessly in an understaffed department when you could take a year or two out to study and improve your career prospects, then, when people are hiring again, jump back into the job market with a spring in your step and a solid business qualification under your belt?

Anybody who is thinking along these lines is sure to be salivating after looking at the Financial Times' latest MBA rankings, which came out last month. At the top end, salaries of $150,000-plus and increases in pay of 130% for successful MBA graduates are not uncommon. But just how accurate are such MBA rankings, and how useful are they for those trying to choose an MBA?

The FT's ranking is not the only one, of course. The other three main lists are compiled by the Economist, Business Week and Forbes. The last two are big in America - and concentrate on US schools - while most would-be students in Europe and beyond take the FT's and the Economist's more seriously.

However, anybody hoping to take a casual glance at the rankings and get a clear idea of how the world's MBAs compare is likely to end up confused. In its new ranking, the FT says that London Business School and Wharton are the joint best schools. For the Economist, these two come in 19th and eighth respectively. It reckons that the three best schools are Chicago Booth, Tuck at Dartmouth College and Haas at the University of California - which are 11th, 18th and 25th in the FT. French school Insead is fourth in the FT and 23rd in the Economist, while further down, Chinese school CEIBS is 100th in the Economist, but 17th in the FT.

The lists can also be volatile, especially for smaller schools: in the Economist, Belgian school Vlerick fell from 10th one year to 47th the next, while in the new FT ranking the same school went up 32 places (to 55th). Then there is the Indian Institute of Management in Ahmedabad, which has debuted in the FT ranking at number 11. Its graduates report a salary increase of 152%, making it third on that metric, while its alumni's average salary is $174,440, putting it second globally behind Stanford. Why does this happen, and what does it all mean?

The anomalies and oddities are the result of the methodologies used to compile the lists (see 'How the rankings work', below). Talk to academics, students and alumni and you hear all sorts of criticisms of the ways they are compiled. Some say that the number of female faculty members, measured by the FT, or the number with PhDs, measured by the Economist, is irrelevant. Others think that comparing small specialist schools to giants such as Harvard, where 1,800 people are studying for an MBA, is meaningless.

But leaving this aside, there are two main criticisms. First, the rankings concentrate too much on money. In the FT list, salary and salary increase make up 40% of the ranking - by far the biggest chunks. The result is that schools whose graduates go straight into highly paid City or Wall Street jobs rank especially highly.

The FT does try to weight for this, but if you know that 40% of the top-ranked London Business School alumni went to work in financial services in the boom years of 2005-7, then you can imagine how the figure gets boosted. And, in its bid to be global, the FT translates local salaries into US dollars on a purchasing power parity basis. This could be responsible for the big weighted salary results from Indian schools - it's easier to have massive purchasing power in a country where most people are incredibly poor.

Second, both the Economist and the FT get their information from alumni and the schools themselves - clearly a potential source of bias, whether conscious or not. The schools' data is audited, but naturally enough, alumni have a vested interest in their school being ranked highly - an MBA from a top 10 school impresses recruiters.

Most business schools express a lofty but careful scepticism about rankings - while of course making full use of good scores in their marketing literature. In 2004, Wharton and Harvard boycotted rankings on the grounds that methodologies were not 'objective', but few go that far.

Cass is typical in saying that rankings are 'one important indicator of quality', but warns that 'they are composed of an extremely large amount of disparate information and should be read carefully by prospective students'. Even the FT says that rankings are 'one tool' for deciding on an MBA, and that 'each student will have personal criteria that cannot be represented in global rankings'.

So do would-be MBAs find the rankings useful? A recent survey by GMAC, the organisation that administers the GMAT admissions test used by almost all business schools, found that prospective students rated schools' websites as the most useful source of information, followed by the rankings, then by meeting and talking to current students and alumni.

The question for would-be MBAers is how to use the rankings. The sheer quantity of data in the Economist and FT lists can be used to compare schools in many different ways, depending on an individual's perspective and aims. As Julia Tyler, executive vice president of GMAC, says, choosing an MBA is 'a bit like falling in love - we all wish there was a magic potion or silver bullet that helps us to make the right decision, but there isn't'.

HOW THE RANKINGS WORK

FT

Forty per cent of the final ranking in the FT's list comes from two numbers - the percentage increase between pre- and post-MBA salaries, and the total post-MBA salary, weighted on a purchasing power parity basis. The other 60% is made up of 20 much smaller slices, including the number of faculty with PhDs, and the answers to rather vague questions such as whether graduates have achieved their intended goals. It is also worth bearing in mind that the 2011 list was compiled using data from graduates of the classes of 2005-7, whose experiences may or may not be relevant to the post-crash world of 2011. Data is provided by alumni and the schools themselves.

The Economist

Alumni ratings make up 20% of the total ranking, while 80% of the data comes from schools. The scores are weighted 35% for 'career opportunities', 35% for 'personal development/educational experience', 20% for 'earning potential' and 10% for 'networking potential'. Some of the schools that have appeared in the top places have surprised business education professionals.

Forbes

The Forbes list uses information from alumni only, and simply looks at the increase in their salaries post-MBA. This has the virtue of simplicity, but is less useful for a prospective MBA scouring the rankings for broader information. It also concentrates heavily on US schools.

Business Week

Published every two years, this survey simply asks alumni how satisfied they are with their MBA school, and asks recruiters which schools they think produce the best MBAs. Again, it is simple and easily understood. But it covers only 18 non-US schools, which makes it feel somewhat parochial compared with the FT's and the Economist's global lists.
