> we both formally and experimentally show in multiple ways that this ranking scheme is not reliable and cannot be trusted as authoritative because it is too sensitive to weight changes and can easily be gamed.
That's a feature, not a bug, as far as USNWR (the magazine doing the ranking) is concerned. A more reliable ranking would be more stable and therefore less interesting for the universities. Why pay attention to it if Harvard, Stanford, MIT are always 1, 2, 3? It's much more "interesting" and certainly sells more copies/gets more clicks if the rankings have a certain amount of volatility. The fact that it's easily game-able also makes it more attractive to universities because they can do something proactive if they're dissatisfied with their ranking. All these factors combine to make the ranking itself seem more important and prestigious.
^ ;-)
Not to be confused with Stamford.

Are you saying that there is nothing lower-ranked universities could do to improve their ranking in a better, more objective ranking of universities? Because it smacks of elitism.
The tops teams don't vary all that much from year to year. They might change positions a bit, but it's rare for a new team to show up who hasn't shown up before.
They essentially have to get lucky and sign someone super talented to a long-term contract, who then attracts additional talent.
Or they have to invent a whole new way of doing things, like how the Oakland A's invented a whole new way of recruiting using math (which incidentally no longer helps them because all the top teams adopted it and are back on top again).
It's all about network effects.
Here's what they did: they took the top research universities and the top liberal arts colleges. They then observed that the rankings can't distinguish between the top institutions within each class, and that you can also flip which of those two classes is preferred.
But, in what sense is it even meaningful to compare Amherst College and Harvard [1]? Those are just enormously different types of institutions. It'd be like creating a ranking of "best cars" that includes the top 5 sedans and the top 5 trucks, and then observing that you can jiggle the rankings to get any one of them on top. Is a Toyota Corolla better than an F-150? IDK. Stupid question. US News and World Report, for all its problems, does at least get this much right: they break down institutions by "type" and then rank within type.
Additionally, just because rankings are noisy and easy to game locally doesn't mean they are inaccurate or easy to game globally. Two institutions within 10-20 slots of one another are probably pretty similar and rankings aren't particularly helpful / are easy to game. But the #100 liberal arts college is probably not as good as Amherst, and the #78 National University is probably not as good as Harvard, and no amount of gaming is going to change that.
Rankings are indeed noisy and inaccurate and easy to game. But this particular article is not a compelling demonstration of that fact.
[1] For non-US readers: Amherst College (not to be confused with UMass Amherst) is in a class of peculiar institutions that are fairly unique to the USA, as described here: https://en.wikipedia.org/wiki/Liberal_arts_college TL;DR: they're basically the diametric opposite of super-charged research universities like Harvard.
I don't think the paper gives any hints on us dividing universities into classes and comparing. Would love to know how you reached that conclusion.
At the same time, yes, we could have divided universities into different classes: research vs. liberal arts, this vs. that state, big vs. small size, etc. These are all trivial groupings but none of these would change the conclusions of this paper. For all practical purposes, we could easily have replaced the university names with labels like U1, U2, ... and still the conclusions would not change. What matters is how a weight-based composite index can be gamed, and the paper does show that in multiple ways. Please review the ILP formulations yourself and run them on the dataset of your choice.
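The weight-sensitivity point is easy to illustrate with a toy example. This is a minimal sketch with made-up, hypothetical scores (not the paper's dataset or its ILP formulation): a small shift in the weights of a composite index flips the ordering of two institutions.

```python
# Toy illustration of weight sensitivity in a weighted composite index.
# Scores are hypothetical, normalized to [0, 1]; they are NOT real data.
metrics = ["graduation_rate", "faculty_resources", "reputation"]
scores = {
    "U1": [0.90, 0.60, 0.80],
    "U2": [0.70, 0.95, 0.75],
}

def rank(weights):
    """Rank institutions by weighted composite score, highest first."""
    composite = {u: sum(w * s for w, s in zip(weights, vals))
                 for u, vals in scores.items()}
    return sorted(composite, key=composite.get, reverse=True)

# A modest shift in weights flips the ordering.
print(rank([0.5, 0.2, 0.3]))  # favors graduation rate -> ['U1', 'U2']
print(rank([0.3, 0.4, 0.3]))  # favors faculty resources -> ['U2', 'U1']
```

The paper's ILP-based analysis is the rigorous version of this: it searches over feasible weight vectors to show how freely the top spots can be reshuffled.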
I can see how it would look weird from a German, or possibly a Dutch or Nordic, perspective, but a deliberate effort to bring up the bottom combined with only limited attempts to raise the top is extremely different from the US system, where there are large and growing returns to excellence.
Also, the US elite just have smaller enrollments as a share of population. Oxbridge, the grandes écoles, or Korea's top three enrol 1-2% of each year's students. The top ten US universities enrol maybe a tenth of that.
(not disagreeing with you, just adding some nuance about the direction of things)
It's great that students can make an informed choice about where to start, particularly if they do not come from an academic household. It's not a given that people have heard of places like MIT, and these lists are a somewhat neutral way of seeing which universities are the most "legit". You have to remember that a lot of low-quality institutions are marketing themselves pretty hard to students, and if it weren't for rankings it would be easy to make bad choices when choosing a place to study.
People who care about Ivies and the such don't care because the Ivies have a high ranking. Someone who went to Harvard who wants to use that fact to their advantage isn't going to say that they went to a "top 3" school.
I've sat in department-wide meetings where deans of engineering compared their schools to 'the competition,' both in terms of US News rankings and their own preferred metrics. Usually to bemoan how unfair it is that Texas is a petrostate with 10x the population that can fund top-tier engineering schools.
But by and large, the tiers are static, and a uni in the #47 slot isn't making it to the top 10 any time soon. Might not even make top 40. For undergraduate education I think you're right, nobody really cares. There are really three categories of schools: private schools / Ivies, flagship state schools, and the rest. I'm guessing if you asked employers to rank schools, they would largely match selectivity -- how hard it is to get into the school. (or more depressingly, how well the athletics programs are doing lately)
Most high school students/parents/guidance counselors/etc check the rankings, reputation, average salary of graduates, etc. It even extends to majors and rankings within majors - which might matter more.
> Even then it's not like it's common for people to pick which school to go to out of the ones that accepted them based solely on rankings.
Who said solely? Of course there are many other factors (location, tuition, scholarships, etc.). But rankings/reputation/etc. are a big part of a high schooler's college decision.
When you're making one of the biggest decisions of your life, rankings can help you know how good your choices are.
Not saying rankings are correct...
But for those applying, they likely appreciate having a ranking mechanism.
It's one of the few objective measures of universities available.
How do you know, otherwise, if this university that wants to charge you $40,000 per year is really going to deliver on their promise?
You can't just go back to school again and start over.
It's a life decision, so there is a desperate desire for the applying student to have an objective way of stack ranking the choices.
(Disclosure: I attended MIT. And I basically just applied to the top 5 or 10 computer science universities -- again, using rankings. And hoped to get into one of those.)
I've observed that, among people who care about university rankings in hiring, there are exactly two categories: the top five, and everything else. If you went to #200, you're the "same" as somebody who went to #25.
Rankings drive enrolment, especially international enrolment, which is a major source of revenue. Also prestige helps to attract the best researchers, who are being wooed by many top schools.
In some specific programs, you'd want to aim for the schools known to have the more robust curriculum, but outside of that, it's mostly ego/prestige. "I got into a top N school, I went to Harvard", etc.
That badge signals _something_ to your peers, parents, and future employers. Think of all those companies that explicitly filter for "target schools". It works pretty well too, you'll get a call-back to an interview because you have an Ivy on your resume even if you barely passed your classes.
That’s interesting. I’ve consistently heard the same sentiment from ex-students and alumni of certain highly ranked Canadian universities (undergrad) as well. They’ll generally complain about a lack of student support and restrictive policies at these institutions, making it needlessly difficult for students to succeed.
I suspect the higher-ranked universities might be under pressure to artificially increase the difficulty of their programs, in order to distinguish their alumni from the alumni of other schools, whose programs cover more or less the same subject matter.
And I disagree with your statement. Many other western countries have their "ivy league" equivalent. Not all schools are equal in the rest of the world either. But I do however agree that the "marketing" aspect is definitely more pronounced in the US.
Also, in terms of learning, it's really about the access you can get to the ten or so percent of the faculty that are actually worth anything. Figure out who they are and go after them.
Also, network, network, network. Smart or not, everyone you meet in school might give you a reference/job/funding someday.
(source: hard experience)
Yes, this is seldom considered. For example, if you plan to graduate with a BS in physics, would you rather be in the top quartile at Cal Poly, or the bottom quartile at Caltech?
I.e. getting a ticket to pick any pond you want after graduation.
That's just a guess, but I've seen a lot of others suffer from shooting too high.
1. They are significantly biased toward English-language universities/research output and bigger countries in general.

2. They are totally irrelevant from the student perspective, as they mostly focus on research output, which means that you'll get the richest universities on top and the best professors will be writing papers rather than teaching bachelor-level courses.
There are some attempts to fix these issues (see eg the EU "Multirank" which allows you to choose your priorities), but it's still hit and miss.
https://www.insidehighered.com/news/2013/05/29/methodology-q...
For instance, QS sells a 'star-ranking' system for schools. Coincidentally, the schools who paid for a '5-star' ranking, such as the University of Bristol or the Universiti Malaya, are also placed higher on the actual QS ranking than those that did not, including Georgia Tech, the University of Washington, Ecole Polytechnique and UIUC.
"And there are several perverse incentives in the marketplace that make it hard for colleges to cut costs. The most basic one is that the U.S. News algorithm rewards them for spending a lot of money: Higher faculty salaries and more spending on student services lead directly to better rankings. If you reduce your expenses, your ranking will fall, which means that next year your applicant pool will probably shrink."
You'd be surprised at the burgeoning expenses of universities, and their often meager profitability, despite record-high tuition rates. It seems that, much like for startups, a high burn rate signals potential growth!
https://www.nytimes.com/interactive/2019/09/10/magazine/coll...
But if the increased ranking is done based on data from, say, half a year later, then that just seems like a normal audit effect? That it might just make the client better at gaming the rules is true for a normal audit too.
> Freeland swept into Northeastern with a brand-new mantra: recalibrate the school to climb up the ranks. “There’s no question that the system invites gaming,” Freeland tells me. “We made a systematic effort to influence [the outcome].” He directed university researchers to break the U.S. News code and replicate its formulas.
Two things I like about the Northeastern story are that the official demonstrated even more ambitious gaming, and it was publicized.
No criticism of Northeastern; they have many great people, doing great work. And I've heard many faculty objected to the rankings-climbing emphasis as it was happening.
Faculty were generally pleasant to interact with; the education was probably average. Job prospects were mediocre: a lot of competition in Boston, and NU is in a lower tier, comparatively.

As a measure of quality, based on my experience, that ranking is worthless.
I've also heard some good things about NEU cybersecurity.
I'm not familiar with the other departments.
I can't find it now though because all the search terms I can think of just return a bunch of other stuff related to universities and students.
> In this paper, we take a fresh look at this ranking scheme using the public College dataset; we both formally and experimentally show in multiple ways that this ranking scheme is not reliable and cannot be trusted as authoritative because it is too sensitive to weight changes and can easily be gamed.
No pressure on the professor or university, though. All on the students. There's a rant letter, from the prof, about effort where effort wasn't a factor. My kid survived solely because he has a very good memory. No accountability. I AM PISSED.
I can't imagine this happening in a college class where the instructor gets to keep his class next year.
For example, I did a stint in policy, and I'm reminded of how everyone insisted for a long time that the "top" law schools are the "top 14", or T14 for short. This has historically been the measure used since Georgetown is in DC, and the children of the elite must be "top tier" regardless of whether their school ranked outside the top ten on every official metric such as incoming LSATs, publications, etc. :D
> There exists an informal category known as the "Top Fourteen" or "T14," which refers to the fourteen institutions that regularly claim the top spots in the yearly U.S. News & World Report ranking of American law schools.[8] Furthermore, only these fourteen schools have ever placed within the top ten spots in those rankings.[9] Although "T14" is not a designation used by U.S News itself, the term is "widely known in the legal community."[10] While these schools have seen their position within the top fourteen spots shift frequently, they have generally not placed outside of the top fourteen spots since the inception of the rankings.[11] There have been rare exceptions to this, however, such as UCLA School of Law appearing in the top fourteen instead of Cornell and Northwestern in 1987 and University of Texas School of Law displacing Georgetown in 2018, although the significance of these changes has been debated.
https://en.wikipedia.org/wiki/Law_school_rankings_in_the_Uni...
>There have been rare exceptions to this, however, such as UCLA School of Law appearing in the top fourteen instead of Cornell and Northwestern in 1987 and University of Texas School of Law displacing Georgetown in 2018, although the significance of these changes has been debated.
People select measures for a number of reasons. It's perfectly possible the single and only reason T14 is used is for the reasons you lay out.
OTOH I have personally had conversations with people who argued that the T14 is a measure of merit when Georgetown was in the T14 then "debated the significance" of the measure when Georgetown dropped out of the top 14 in 2018.
It was not clever or endearing.
Median starting salary / median cost to attend.
Sure, there are other considerations, such as whether it will get you into a better grad school or something, but for most it's really this simple.