Are you wondering who the biggest winners and losers are in the U.S. News & World Report ranking’s shakeup?
If you haven’t heard yet, U.S. News & World Report has overhauled its methodology for listing out the top national colleges and universities in the United States. They’ve called it the most significant change in methodology in the ranking’s history.
So, why is this happening?
We live in a world of cancel culture, and U.S. News & World Report has been getting canceled lately. Not on the undergraduate college list, but at the graduate level. Some of you might be aware that in the last several years, something like a dozen top colleges, such as Harvard and Stanford, have pulled out of the U.S. News & World Report medical school rankings. Same story with law schools, with places like Yale, Harvard, NYU, and UVA. So, they've redone all of the metrics, hopefully addressing some of the concerns about metrics that were unfair or just plain stupid.
I’m going to talk through the things that they’ve eliminated and then talk about which schools are winning and losing given these shifts and what it means.
The first change is a shift in focus from income to outcome. The previous list rewarded income, such as alumni donations, as a measure of a school's culture of investing in its own. That metric, worth 3%, has been completely nixed. The other income-style metric, how much a university spends per student, used to be worth 10% and has dropped to 8%. Together, those two have lost 5%, and that weight has instead gone to outcome: how much money you make when you graduate.
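To make the arithmetic concrete, here's a minimal sketch of how moving that 5% of weight from income metrics to an outcomes metric shifts a composite score. The metric names, scores, and the school itself are entirely hypothetical, not actual U.S. News data; only the weight changes (3% eliminated, 10% cut to 8%, 5% added to earnings) come from the article above.

```python
# A made-up example: a school that shines on alumni giving and per-student
# spending but has middling graduate earnings loses ground when weight
# shifts from income metrics to an outcome metric.

def composite(scores: dict, weights: dict) -> float:
    """Weighted sum of normalized metric scores (each on a 0-100 scale)."""
    return sum(scores[metric] * weight for metric, weight in weights.items())

# Hypothetical normalized scores for one school.
scores = {"alumni_giving": 90, "spend_per_student": 80, "grad_earnings": 60}

# Only the weights that changed; the rest of the formula is held fixed.
old_weights = {"alumni_giving": 0.03, "spend_per_student": 0.10, "grad_earnings": 0.00}
new_weights = {"alumni_giving": 0.00, "spend_per_student": 0.08, "grad_earnings": 0.05}

print(round(composite(scores, old_weights), 2))  # 10.7
print(round(composite(scores, new_weights), 2))  # 9.4
```

Same school, same performance, a lower score: that's the whole mechanism behind the reshuffle, repeated across every metric whose weight moved.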
Other lists, like Forbes and the Wall Street Journal, pride themselves on looking at outcomes, not resources and income. They want to know what the bottom line is and if the education is worth it at that college. So, it looks like U.S. News & World Report is doing the same.
The second change introduces the idea of equity. In particular, two metrics that add up to 5% essentially look at the graduation rates of first-generation students. Now, I know first-generation students don't capture all the diversity in the world, but they're one piece of it. The question is whether we support these students in two ways. One, financially: do we give them aid packages so they don't have to drop out sophomore year? Or do we set the bar on their merit scholarships so high that if they don't get a 3.7, they lose the scholarship and have to drop out? Two, academically: if they struggle because they came from some craptastic high school, do we give them the academic support they need to succeed at this level of school and help them get through? Schools that aren't going to do well in this category tend to be private schools that don't give enough aid. Places like NYU are notorious for gapping students on aid, meaning they say they're covering your need when they might not be. Then you're really stretched. And guess what happens if you're stretched year after year and you use up your piggy bank? Maybe you don't come back for junior year. Maybe you drop out for financial reasons, which is one of the top reasons students leave college. Looking at first-generation graduation rates is a way to slap the wrist of schools that aren't supporting their lower-income kids and helping them get through.
The third change is how graduate borrowing is measured. The old approach got at loans differently: it basically measured the proportion of students who came out with debt versus the proportion who didn't. Obviously, if most of your students don't have debt, that's good. The problem is that you can game that metric by letting in lots of rich kids. So places like WashU, Tufts, and Vanderbilt, which are known for enrolling wealthier, one-percenter student bodies and not as many kids with Pell Grants, were doing really well on that metric. They're not doing as well with the replacement metric: the average loan amount a graduating student carries, if they have loans at all. It's a little bit different, and it basically closes that loophole. The rankings are no longer rewarding schools for having lots of rich kids, which makes sense, and that's probably a good response to the critique.
Number four: research is now emphasized over faculty salaries and degrees. By degrees, I mean whether faculty have what's called a terminal degree, the highest degree available in their field, usually a PhD and sometimes an MFA. That used to be a metric, and it's the one that actually got Columbia in trouble; it was the number Columbia was fudging. Well, now it's gone, so Columbia went up because its fudged numbers don't matter anymore. Faculty salaries have also gone down in importance just a little bit. Instead, what we have are hard research metrics: what kinds of publications the research appears in, how many mentions a particular college gets, how many professors are putting out research papers, and what the impact factor of those papers is. We're still getting at the idea of faculty and academic prestige, just in a different way.
What has this done? Well, it's rewarded public institutions that do boatloads of research. We're really seeing it with a lot of the UCs, like UC Davis, which has popped ten places. Purdue, Rutgers, and Texas A&M all have a lot of research, so they're probably crushing it on this metric. But it's hurting colleges with well-respected disciplines that don't publish much research. NYU comes to mind because of the Tisch School of the Arts; a lot of the professors at Tisch aren't publishing research, they're performers, artists, and screenwriters. Same with USC, which has several arts schools, though USC also has a lot of engineering and a lot of other stuff. So, it's a bit of an unfair metric, depending on what kind of school you're looking at. Obviously, schools strong in the sciences, the social sciences, and engineering are going to publish more papers. If you look at the public colleges that are doing really well, they're places with great research and great engineering programs.
Fifth, the class-size metric has been eliminated. Big public universities with giant classes are no longer penalized relative to places with small ones. Some schools have naturally small class sizes; the College of William & Mary is a public institution, but it prides itself on small classes. Well, it didn't win in this game. It's also the number one college for alumni giving, but that metric is completely gone. So, William & Mary moved down the ranks because it looks like a private school in a lot of ways, even though it's public. It's not just about being public; it's about having these particular metrics. Replacing class size, there's a slight bump, a 2% increase, in what's called the faculty-to-student ratio. But again, that favors research-heavy schools, because at a research-heavy school you might have a lot of faculty who do research and don't teach a single class. That doesn't influence class size, but it does influence the faculty-to-student ratio. Is that something that's important? Yeah, sometimes. Even faculty who aren't teaching classes are available as mentors, guiding students through honors theses or research projects, and they have labs where students can volunteer and learn. So, the ratio does potentially reflect a resource that colleges have, and maybe that's something you can legitimately look at.
Finally, U.S. News & World Report has ditched what's called high school class standing: the percentage of an incoming class that was in the top 25% or top 10% of their high school class. The big problem with this metric is that savvy schools have refused to disclose the information, especially savvy, well-resourced private schools and schools where the rich kids go, because they know that by not disclosing it, they can admit more of their kids who rank lower. Super-competitive high schools don't want to release it either, because their kids in the top 25% are going to be fine at these colleges academically, even if they're not in the top 10%. So, I think getting rid of this metric eliminates some of the games private schools were playing to get their kids into these places, and it may level the playing field a bit. It also wasn't a super reliable metric, because the data wasn't complete; we weren't getting it from every high school. So many high schools don't rank anymore, and that's been an increasing trend. Fifty years ago, everybody ranked, and the metric was probably worth something. But now that everybody's kind of cheating the system, and U.S. News & World Report can't tell high schools that ranking is required, it doesn't really make sense. There's no way to police it, so it's probably good that it's gone. It's sad, because I wish class standing still meant something, but there you go. Test scores have stayed in; class standing is gone.
Okay, so let's look at the winners and losers. One of the biggest winners is Texas A&M, which is up 20 spots and now ranked in the top 50, at 47th. Virginia Tech is one I always thought was underrated, so I'm glad to see it rise; it's now also ranked 47th. Rutgers is now ranked 40th. Georgia Tech is 33rd; I love Georgia Tech, and it's a great school for engineers. UC Davis is now ranked 28th, and that's one I'm still scratching my head at. University of Maryland is 46th, up nine. Purdue is 43rd, up eight. UNC Chapel Hill is ranked 22nd, climbing seven places. University of Illinois is ranked 35th, up from 41st. UT Austin is ranked 32nd, up six spots. UC San Diego is also up six spots, at 28th. Columbia rose six spots to 12th. UCLA and UC Berkeley both rose five spots to 15th. And Cornell rose five spots to 12th. I basically just took the top 50 schools for this, and 15 of them rose. Only two of those are private schools. Most of the beneficiaries, again, are public schools.
The biggest losers are, for the most part, private schools. William & Mary is the one outlier because it's public, but again, its metrics make it look private. Tulane was the biggest loser, down 29 spots. Wake Forest is down 18. Brandeis is down 16. William & Mary is down 12. University of Rochester is down 11. NYU is down 10. WashU in St. Louis is down 9. Case Western is down 9. Northeastern is down 9, to 53rd, and Tufts is down 8, to 40th.
So, what are the takeaways? Takeaway number one: Harvard, Yale, Princeton, Stanford, and MIT are still the queen bees. They have capable students, they do everything right, and they're still at the top. Public schools have financial benefits that many people probably overlooked before, and U.S. News & World Report's new ranking system is promoting those benefits. But I will say a lot of things haven't changed, even though they pushed different buttons in the data. Are we seeing some schools I thought were underrated rising? Yes. Are we also seeing some schools that are still really awesome get bumped down because they didn't hit a particular metric? Also yes. Are class sizes important? Yes; for some students, small classes could be really important. These shifts should tell you something: don't let U.S. News & World Report decide which data matters to you. You can look up which college has the strongest CS department if that's what you want to major in. You can look up class sizes; if you want small classes, look up the average class size at a university. Maybe Purdue's not for you because you don't want to be in ginormous classes. Maybe you'd be better off at Rose-Hulman, another Indiana school with a great technology program. It isn't a national university, so it doesn't get ranked on this list, and it isn't a liberal arts college because it's engineering-based, so it doesn't get ranked there either; it kind of gets lost in the ranking shuffle. But it's still a great school. Babson only graduates people in business, so it doesn't make any of these rankings except the business ranking, where it's number one in entrepreneurship.
It’s a good place to get started, but just remember, you have a lot more data at your fingertips than just U.S. News & World Report, and the data that matters to you isn’t the data that matters to everyone. There’s no one size fits all, except for the U.S. News & World Report rankings list, which is now a new size to fit everyone. Sort of. I hope you guys enjoyed this blog and that it helped you understand the new rankings a little better!
US News College Rankings Changes
| School | 2023–2024 Rank | 2022–2023 Rank | Change |
| --- | --- | --- | --- |
| Rutgers–New Brunswick | 40 | 55 | +15 |
| University of Maryland | 46 | 55 | +9 |
| UNC Chapel Hill | 22 | 29 | +7 |
| University of Illinois Urbana-Champaign | 35 | 41 | +6 |
| UC San Diego | 28 | 34 | +6 |
| School | 2023–2024 Rank | 2022–2023 Rank | Change |
| --- | --- | --- | --- |
| Wake Forest University | 47 | 29 | -18 |
| William & Mary | 53 | 41 | -12 |
| University of Rochester | 47 | 36 | -11 |
| Washington University in St. Louis | 24 | 15 | -9 |
| Case Western Reserve University | 53 | 44 | -9 |
The whole list
| Institution | Rank (2024) | Rank (2023) | Change |
| --- | --- | --- | --- |
| University of Chicago | 12 | 6 | -6 |
| University of Pennsylvania | 6 | 7 | +1 |
| Washington University in St. Louis | 24 | 15 | -9 |
| University of Michigan | 21 | 25 | +4 |
| University of Virginia | 24 | 25 | +1 |
| UNC Chapel Hill | 22 | 29 | +7 |
| University of Florida | 28 | 29 | +1 |
| UC Santa Barbara | 35 | 32 | -3 |
| UC San Diego | 28 | 34 | +6 |
| University of Rochester | 47 | 36 | -11 |
| University of Wisconsin–Madison | 35 | 38 | +3 |
| University of Illinois Urbana-Champaign | 35 | 41 | +6 |
| William & Mary | 53 | 41 | -12 |
| University of Georgia | 47 | 49 | +2 |
| University of Maryland | 46 | 55 | +9 |